Right and Wrong
Acquired, instinctive, or social restraint: who decides what is right and wrong? With less restraint upon the mind, there is more development of thought, or an interaction with our own subconscious. Conflict arises when moral judgment, that which is taught at an early age, interacts with a wider social sense of right and wrong. Added to the diversity of our own moral code, we have the combinational key of interpretation. Interpretation unequal to judgment proves only that there are many views through one window. As I look at the bigger scheme of things within the world, I see that choices and decisions are based on whom they will benefit most, at the risk of hurting others. Laws and rules were enacted to enforce right and wrong and to create harmony within our own human structure. On a personal note, I see that in many cases right and wrong are like a dark gray sky: no clarity to the view.
Many years ago, before the enforced regulation of safety flotation devices aboard small watercraft, we met up with a water patrolman. On his routine patrol of the waterways, he pulled our boat aside and asked, "No safety vests?" We responded that we had a few cushion floats in the boat. He went on to tell us of all the possibilities that could cause a boat to upset, and then he asked one question: "With four of you on board and the boat tips over, whom do you save?" Instinctively, the answer was the one closest to you. But the question always remained with me, as I realized that everything we do in life is for the greater well-being. Every decision we make, no matter the rules and laws society has placed upon us, is based on our own personal thought process; we are given a choice.
The more I thought about what is right and wrong, the more complicated those two words became. Why is it that we can't find easy answers to simple questions? Ahh, empathy: our ability to feel the pain of someone else leads us to make decisions based on the greater need. The question will always remain the same, but the answers will morph with our own moral and social values. I questioned again why the answers at times seem so marbled. My religious upbringing surfaced, and I responded emotionally to the Ten Commandments, a guide of moral teaching instilled in me at an early age. I again thought of the boating incident, and it took on a wider view. I spoke out loud: "Three are on a boat and it becomes unsteady in the waves; one falls overboard and is drowning. Do we risk three to save one, or do we allow the one to drown?" Are we biased in our decision making, based on human behavior? Since cognitive bias is a pattern of deviation in judgment that occurs in particular situations, the outcome of judgment will be altered, as will the outcome of any particular situation. The reality that many are practicing is torn between cognitive biases and our own human behavior. Whether stepping outside the box, or maybe stepping inside it, hopefully we will gain a wider perspective on our alternate choices in life and each individual outcome. It so reminds me of the holiday movie "It's a Wonderful Life": just imagine the changing of events, and how that, too, would change history.
Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behaviour.
Base rate fallacy — ignoring available statistical data in favor of particulars.
Bias blind spot — the tendency not to compensate for one's own cognitive biases.
Choice-supportive bias — the tendency to remember one's choices as better than they actually were.
Confirmation bias — the tendency to search for or interpret information in a way that confirms one's preconceptions.
Congruence bias — the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
Conservatism bias — the tendency to ignore the consequence of new evidence. (Related to base rate fallacy.)[1]
Contrast effect — the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.
Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
Denial — the tendency to disbelieve or discount an unpleasant fact.
Distinction bias — the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[2]
Endowment effect — "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[3]
Expectation bias — the tendency for experimenters to believe, certify, and publish data that agrees with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appears to conflict with those expectations.[4]
Extreme aversion — the tendency to avoid extremes, being more likely to choose an option if it is the intermediate choice.
Focusing effect — prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
Framing — using a too-narrow approach or description of the situation or issue. Also framing effect — drawing different conclusions based on how data are presented.
Hostility
Hyperbolic discounting — the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, where the tendency increases the closer to the present both payoffs are.
Illusion of control — the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
Impact bias — the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
Information bias — the tendency to seek information even when it cannot affect action.
Irrational escalation — the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
Loss aversion — "the disutility of giving up an object is greater than the utility associated with acquiring it".[5] (see also sunk cost effects and Endowment effect).
Mere exposure effect — the tendency for people to express undue liking for things merely because they are familiar with them.
Moral credential effect — the tendency of a track record of non-prejudice to increase subsequent prejudice.
Need for closure — the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. The personal context (time or social pressure) might increase this bias.[6]
Neglect of probability — the tendency to completely disregard probability when making a decision under uncertainty.
Not Invented Here — the tendency to ignore that a product or solution already exists, because its source is seen as an "enemy" or as "inferior".
Omission bias — the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
Outcome bias — the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
Planning fallacy — the tendency to underestimate task-completion times.
Post-purchase rationalization — the tendency to persuade oneself through rational argument that a purchase was a good value.
Pseudocertainty effect — the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
Reactance — the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
Selective perception — the tendency for expectations to affect perception.
Status quo bias — the tendency for people to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[7]
Von Restorff effect — the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
Wishful thinking — the formation of beliefs and the making of decisions according to what is pleasing to imagine instead of by appeal to evidence or rationality.
Zero-risk bias — preference for reducing a small risk to zero over a greater reduction in a larger risk.
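The base rate fallacy above has a classic quantitative illustration: a fairly accurate medical test applied to a rare condition. Here is a minimal sketch in Python, with hypothetical numbers chosen only for illustration, showing how the low base rate drags a "90% accurate" positive result down to a modest probability:

```python
# Hypothetical numbers, chosen only for illustration:
# a condition with 1% prevalence, a test with 90% sensitivity
# and a 5% false-positive rate.
prevalence = 0.01
sensitivity = 0.90       # P(positive | condition)
false_positive = 0.05    # P(positive | no condition)

# Total probability of testing positive.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Bayes' theorem: P(condition | positive).
p_condition_given_positive = sensitivity * prevalence / p_positive

print(round(p_condition_given_positive, 3))  # prints 0.154
```

Despite the test's high accuracy, a positive result implies only about a 15% chance of actually having the condition, because the 1% base rate dominates the calculation; ignoring that base rate is exactly the fallacy the list entry describes.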
Embrace my soul, my heart in need,
to feel your love so tenderly.
~
1 comment:
You bring to light a great question as to what is "right" and what is "wrong". Life is not as black and white as we would like to think it is, and rules are not always good or bad. There are many rules which are born out of love and truth, and many which are born out of fear and ignorance. It is for each individual to determine which align with our authentic self and which are out of alignment and cause greater damage than good. Very thought-provoking, Rachel, thank you.