Ability to be rational is contingent on inherent biases
Noah Baum
December 8, 2017
Opinion

Most people think that, with a certain amount of effort, they can approach a problem with a logical mind. Countless studies, however, show that we may have overestimated our ability to be rational.

Neuroscientist Read Montague of Baylor College of Medicine conducted a study in 2003 which found that, when subjects did not know which soda they were drinking (either Coke or Pepsi), around half preferred Pepsi and half preferred Coke. However, when the subjects knew which drink was which, there was "a dramatic influence on expressed behavioral preferences and on the measured brain responses": now, three-fourths of the subjects preferred the Coke. This likely results from the extensive advertising run by the Coca-Cola Company, which has established Coke as the champion of the cola industry in many people's minds. You would think a taste test would produce opinions based solely on taste, but this study reveals that people cannot always reason independently of their biases. As the Montague study makes evident, we may not have as much control over our decisions, or over whether they are rational, as we have thought.

Irrational behaviors appear in many aspects of a person's life, but economics in particular has established itself as a magnet for human irrationality. Psychologists and researchers Daniel Kahneman and Amos Tversky first introduced "loss aversion," another example of irrational human behavior, in 1979, defining it as the observed pattern in which "losses loom(ed) larger than corresponding gains." More simply, the thought of losing something affects one's choices more than the thought of gaining the exact same thing. Kahneman has said that we feel a loss roughly twice as strongly as we feel an equivalent gain. That is why most people find it so difficult to take a gamble, no matter the likelihood of gain it promises.
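A toy calculation makes the point concrete. This is only a sketch based on Kahneman's rough "losses hurt twice as much as gains" factor, not a formal model: a coin flip that is mathematically favorable can still feel like a bad bet once losses are double-weighted.

```python
# Toy illustration of loss aversion (a sketch, not Kahneman and Tversky's
# full model): a gamble with positive expected value can still feel
# unattractive when losses weigh about twice as heavily as gains.

LOSS_WEIGHT = 2.0  # rough "losses hurt twice as much" factor (assumption)

def felt_value(p_win, gain, loss):
    """Subjective value of a gamble when losses are double-weighted."""
    return p_win * gain - (1 - p_win) * LOSS_WEIGHT * loss

# A coin flip: win $120 or lose $100.
expected_value = 0.5 * 120 - 0.5 * 100   # mathematically favorable (+10)
subjective = felt_value(0.5, 120, 100)   # feels like a losing bet (-40)

print(expected_value, subjective)
```

Even though the flip pays off on average, the double weight on the $100 loss makes it feel like a losing proposition, which is why people decline gambles they "should" take.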
When we consider how irrational we really are, it raises the question of whether logic is something our minds are naturally equipped with, or a concept we have adopted over time. Moreover, the errors we make in our reasoning do not seem to be random; rather, we make the same errors in the same patterns, over and over.

Loss aversion is a logical error we all share when it comes to risk and reward, but what about reasoning in contexts that involve other people? We still get it wrong, at least some of the time. A well-known example is the halo effect: judging someone's ability to do a certain task based on their demonstrated proficiency at unrelated tasks. The assumption that someone is a kind person because they are attractive or intelligent is one instance. This phenomenon plays heavily into the idolization of celebrities in Hollywood, whose prowess in acting and entertaining leads us to assume they are decent people (and recent allegations in the movie industry have revealed this is not always the case).

According to the Harvard Business Review, one of the biggest causes of inaccurate analysis of problems and their solutions is "our deep-seated need to see patterns," which can lead us to see patterns where there are none in order to make sense of a problem. This phenomenon, named apophenia, was first described by German scientist Klaus Conrad in 1958 while he observed the false connections mentally ill patients made in the early stages of schizophrenia. However, milder apophenia can be observed in plenty of otherwise healthy human behaviors.

Irrational choices are often viewed as errors in otherwise sound logic, but science tells us a different story. People are inclined to make the same errors because of the same biases, and by understanding this, we can begin to move forward.
It is likely these systematic errors and biases once worked in our favor: to avoid a gamble when our lives were at risk, and to make quick judgments on a person's character, because that person might be the very one who would leave us for dead. Now, however, we have moved past a world with such scenarios. It is in our best interest to educate ourselves about the areas in which our rationality is lacking and to use more caution in those situations. By knowing when our "gut instinct" is only an artifact of the past, we can learn when we need to assess a problem more carefully.

Kahneman, the psychologist behind much of the research on decision-making, told the McKinsey Quarterly that he thought people should not "take [their] intuitions at face value." Gary Klein, a cognitive psychologist interviewed in the McKinsey Quarterly alongside Kahneman, said that unless we are in situations with a "certain predictability" and have "the chance to get feedback on [our] judgment," our intuitions "aren't going to be trustworthy."

We will never be entirely certain of our rationality when solving a problem or approaching a situation. However, by leaving decisions to what we know, and not to how we feel, we can be more assured that our results will reflect our knowledge and not our emotions. Humans operate largely on emotion, which is why it can be so hard for us to leave it by the door. But if we recognize our inherent irrationality as we approach decisions, we begin to reason instead of feel, and know instead of guess.