notes-science-psych-weAreAllConfidentIdiots

http://www.psmag.com/navigation/health-and-behavior/confident-idiots-92793/

"We Are All Confident Idiots"

If you go to the South by Southwest music festival and ask people their opinion of imaginary bands, some of them will offer an opinion as if they know the band. Similarly, if you ask people on the street about imaginary news stories or historical events, some will weigh in as if the events were real.

" we ask survey respondents if they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like centripetal force and photon. But interestingly, they also claim some familiarity with concepts that are entirely made up, such as the plates of parallax, ultra-lipid, and cholarine. In one study, roughly 90 percent claimed some knowledge of at least one of the nine fictitious concepts we asked them about. In fact, the more well versed respondents considered themselves in a general topic, the more familiarity they claimed with the meaningless terms associated with it in the survey. ... The American author and aphorist William Feather once wrote that being educated means “being able to differentiate between what you know and what you don’t.” As it turns out, this simple ideal is extremely hard to achieve. ... In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. ... What’s curious is that, in many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

This isn’t just an armchair theory. A whole battery of studies conducted by myself and others have confirmed that people who don’t know much about a given set of cognitive, technical, or social skills tend to grossly overestimate their prowess and performance, whether it’s grammar, emotional intelligence, logical reasoning, firearm care and safety, debating, or financial knowledge. College students who hand in exams that will earn them Ds and Fs tend to think their efforts will be worthy of far higher grades; low-performing chess players, bridge players, and medical students, and elderly people applying for a renewed driver’s license, similarly overestimate their competence by a long shot. ... And recent research suggests that many Americans’ financial ignorance is of the inappropriately confident variety. In 2012, the National Financial Capability Study, conducted by the Financial Industry Regulatory Authority (with the U.S. Treasury), asked roughly 25,000 respondents to rate their own financial knowledge, and then went on to measure their actual financial literacy.

The roughly 800 respondents who said they had filed bankruptcy within the previous two years performed fairly dismally on the test—in the 37th percentile, on average. But they rated their overall financial knowledge more, not less, positively than other respondents did. The difference was slight, but it was beyond a statistical doubt: 23 percent of the recently bankrupted respondents gave themselves the highest possible self-rating; among the rest, only 13 percent did so. Why the self-confidence? Like Jimmy Kimmel’s victims, bankrupted respondents were particularly allergic to saying “I don’t know.” Pointedly, when getting a question wrong, they were 67 percent more likely to endorse a falsehood than their peers were. ... Because it’s so easy to judge the idiocy of others, it may be sorely tempting to think this doesn’t apply to you. But the problem of unrecognized ignorance is one that visits us all. And over the years, I’ve become convinced of one key, overarching fact about the ignorant mind. One should not think of it as uninformed. Rather, one should think of it as misinformed.

An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge. This clutter is an unfortunate by-product of one of our greatest strengths as a species. We are unbridled pattern recognizers and profligate theorizers. Often ... As the humorist Josh Billings once put it, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” ... Because of the way we are built, and because of the way we learn from our environment, we are all engines of misbelief. ... Some of our deepest intuitions about the world go all the way back to our cradles. Before their second birthday, babies know that two solid objects cannot co-exist in the same space. They know that objects continue to exist when out of sight, and fall if left unsupported. They know that people can get up and move around as autonomous beings, but that the computer sitting on the desk cannot. But not all of our earliest intuitions are so sound.

Very young children also carry misbeliefs that they will harbor, to some degree, for the rest of their lives. Their thinking, for example, is marked by a strong tendency to falsely ascribe intentions, functions, and purposes to organisms. In a child’s mind, the most important biological aspect of a living thing is the role it plays in the realm of all life. Asked why tigers exist, children will emphasize that they were “made for being in a zoo.” Asked why trees produce oxygen, children say they do so to allow animals to breathe. ...

when rushed, even professional scientists start making purpose-driven mistakes. The Boston University psychologist Deborah Kelemen and some colleagues demonstrated this in a study that involved asking 80 scientists—people with university jobs in geoscience, chemistry, and physics—to evaluate 100 different statements about “why things happen” in the natural world as true or false. Sprinkled among the explanations were false purpose-driven ones, such as “Moss forms around rocks in order to stop soil erosion” and “The Earth has an ozone layer in order to protect it from UV light.” Study participants were allowed either to work through the task at their own speed, or given only 3.2 seconds to respond to each item. Rushing the scientists caused them to double their endorsements of false purpose-driven explanations, from 15 to 29 percent. ... This purpose-driven misconception wreaks particular havoc on attempts to teach one of the most important concepts in modern science: evolutionary theory. Even laypeople who endorse the theory often believe a false version of it. They ascribe a level of agency and organization to evolution that is just not there. If you ask many laypeople their understanding of why, say, cheetahs can run so fast, they will explain it’s because the cats surmised, almost as a group, that they could catch more prey if they could just run faster, and so they acquired the attribute and passed it along to their cubs. Evolution, in this view, is essentially a game of species-level strategy. ... While educating people about evolution can indeed lead them from being uninformed to being well informed, in some stubborn instances it also moves them into the confidently misinformed category. In 2014, Tony Yates and Edmund Marek published a study that tracked the effect of high school biology classes on 536 Oklahoma high school students’ understanding of evolutionary theory. The students were rigorously quizzed on their knowledge of evolution before taking introductory biology, and then again just afterward. Not surprisingly, the students’ confidence in their knowledge of evolutionary theory shot up after instruction, and they endorsed a greater number of accurate statements. So far, so good.

The trouble is that the number of misconceptions the group endorsed also shot up. For example, instruction caused the percentage of students strongly agreeing with the true statement “Evolution cannot cause an organism’s traits to change during its lifetime” to rise from 17 to 20 percent—but it also caused those strongly disagreeing to rise from 16 to 19 percent. In response to the likewise true statement “Variation among individuals is important for evolution to occur,” exposure to instruction produced an increase in strong agreement from 11 to 22 percent, but strong disagreement also rose from nine to 12 percent. Tellingly, the only response that uniformly went down after instruction was “I don’t know.”

And it’s not just evolution that bedevils students. Again and again, research has found that conventional educational practices largely fail to eradicate a number of our cradle-born misbeliefs. Education fails to correct people who believe that vision is made possible only because the eye emits some energy or substance into the environment. It fails to correct common intuitions about the trajectory of falling objects. And it fails to disabuse students of the idea that light and heat act under the same laws as material substances. What education often does appear to do, however, is imbue us with confidence in the errors we retain. ... Imagine that the illustration below represents a curved tube lying horizontally on a table:

http://d1435t697bgi2o.cloudfront.net/wp-content/uploads/2014/10/confident-idiot-chart.jpg

In a study of intuitive physics in 2013, Elanor Williams, Justin Kruger, and I presented people with several variations on this curved-tube image and asked them to identify the trajectory a ball would take (marked A, B, or C in the illustration) after it had traveled through each. ... But something curious started happening as we began to look at the people who did extremely badly on our little quiz. By now, you may be able to predict it: These people expressed more, not less, confidence in their performance. In fact, people who got none of the items right often expressed confidence that matched that of the top performers. Indeed, this study produced the most dramatic example of the Dunning-Kruger effect we had ever seen: When looking only at the confidence of people getting 100 percent versus zero percent right, it was often impossible to tell who was in which group. ... Why? Because both groups “knew something.” They knew there was a rigorous, consistent rule that a person should follow to predict the balls’ trajectories. ... People who got every item wrong typically answered that the ball would follow Path A. Essentially, their rule was that the tube would impart some curving impetus to the trajectory of the ball, which it would continue to follow upon its exit. This answer is demonstrably incorrect—but a plurality of people endorse it.

These people are in good company. In 1500 A.D., Path A would have been the accepted answer among sophisticates with an interest in physics. ... What this study illustrates is another general way—in addition to our cradle-born errors—in which humans frequently generate misbeliefs: We import knowledge from appropriate settings into ones where it is inappropriate.

Here’s another example: According to Pauline Kim, a professor at Washington University Law School, people tend to make inferences about the law based on what they know about more informal social norms. This frequently leads them to misunderstand their rights—and in areas like employment law, to wildly overestimate them. ... Eighty to 90 percent of the Buffalonians incorrectly identified each of these distasteful scenarios as illegal, revealing how little they understood about how much freedom employers actually enjoy to fire employees. (Why does this matter? Legal scholars had long defended “at-will” employment rules on the grounds that employees consent to them in droves without seeking better terms of employment. What Kim showed was that employees seldom understand what they’re consenting to.) ... Elderly patients, for example, frequently refuse to follow a doctor’s advice to exercise to alleviate pain—one of the most effective strategies available—because the physical soreness and discomfort they feel when they exercise is something they associate with injury and deterioration. Research by the behavioral economist Sendhil Mullainathan has found that mothers in India often withhold water from infants with diarrhea because they mistakenly conceive of their children as leaky buckets—rather than as increasingly dehydrated creatures in desperate need of water. ... Some of our most stubborn misbeliefs arise not from primitive childlike intuitions or careless category errors, but from the very values and philosophies that define who we are as individuals. Each of us possesses certain foundational beliefs—narratives about the self, ideas about the social order—that essentially cannot be violated: To contradict them would call into question our very self-worth. As such, these views demand fealty from other opinions. And any information that we glean from the world is amended, distorted, diminished, or forgotten in order to make sure that these sacrosanct beliefs remain whole and unharmed. ... One very commonly held sacrosanct belief, for example, goes something like this: I am a capable, good, and caring person. Any information that contradicts this premise is liable to meet serious mental resistance. ... The anthropological theory of cultural cognition suggests that people everywhere tend to sort ideologically into cultural worldviews diverging along a couple of axes: They are either individualist (favoring autonomy, freedom, and self-reliance) or communitarian (giving more weight to benefits and costs borne by the entire community); and they are either hierarchist (favoring the distribution of social duties and resources along a fixed ranking of status) or egalitarian (dismissing the very idea of ranking people according to status). According to the theory of cultural cognition, humans process information in a way that not only reflects these organizing principles, but also reinforces them. These ideological anchor points can have a profound and wide-ranging impact on what people believe, and even on what they “know” to be true. ... In ongoing work with the political scientist Peter Enns, my lab has found that a person’s politics can warp other sets of logical or factual beliefs so much that they come into direct contradiction with one another. 
In a survey of roughly 500 Americans conducted in late 2010, we found that over a quarter of liberals (but only six percent of conservatives) endorsed both the statement “President Obama’s policies have already created a strong revival in the economy” and “Statutes and regulations enacted by the previous Republican presidential administration have made a strong economic recovery impossible.” ... Among conservatives, 27 percent (relative to just 10 percent of liberals) agreed both that “President Obama’s rhetorical skills are elegant but are insufficient to influence major international issues” and that “President Obama has not done enough to use his rhetorical skills to effect regime change in Iraq.” ... Sacrosanct ideological commitments can also drive us to develop quick, intense opinions on topics we know virtually nothing about—topics that, on their face, have nothing to do with ideology. ... In 2006, Daniel Kahan, a professor at Yale Law School, performed a study together with some colleagues on public perceptions of nanotechnology. They found, as other surveys had before, that most people knew little to nothing about the field. They also found that ignorance didn’t stop people from opining about whether nanotechnology’s risks outweighed its benefits.

When Kahan surveyed uninformed respondents, their opinions were all over the map. But when he gave another group of respondents a very brief, meticulously balanced description of the promises and perils of nanotech, the remarkable gravitational pull of deeply held sacrosanct beliefs became apparent. With just two paragraphs of scant (though accurate) information to go on, people’s views of nanotechnology split markedly—and aligned with their overall worldviews. Hierarchics/individualists found themselves viewing nanotechnology more favorably. Egalitarians/collectivists took the opposite stance, insisting that nanotechnology has more potential for harm than good.

Why would this be so? Because of underlying beliefs. Hierarchists, who are favorably disposed to people in authority, may respect industry and scientific leaders who trumpet the unproven promise of nanotechnology. Egalitarians, on the other hand, may fear that the new technology could present an advantage that conveys to only a few people. And collectivists might worry that nanotechnology firms will pay insufficient heed to their industry’s effects on the environment and public health. Kahan’s conclusion: If two paragraphs of text are enough to send people on a glide path to polarization, simply giving members of the public more information probably won’t help them arrive at a shared, neutral understanding of the facts; it will just reinforce their biased views. ... One might think that opinions about an esoteric technology would be hard to come by. Surely, to know whether nanotech is a boon to humankind or a step toward doomsday would require some sort of knowledge about materials science, engineering, industry structure, regulatory issues, organic chemistry, surface science, semiconductor physics, microfabrication, and molecular biology. Every day, however, people rely on the cognitive clutter in their minds—whether it’s an ideological reflex, a misapplied theory, or a cradle-born intuition—to answer technical, political, and social questions they have little or no direct expertise in. We are never all that far from Tonya and the Hardings. ... The way we traditionally conceive of ignorance—as an absence of knowledge—leads us to think of education as its natural antidote. But education, even when done skillfully, can produce illusory confidence. Here’s a particularly frightful example: Driver’s education courses, particularly those aimed at handling emergency maneuvers, tend to increase, rather than decrease, accident rates. They do so because training people to handle, say, snow and ice leaves them with the lasting impression that they’re permanent experts on the subject. In fact, their skills usually erode rapidly after they leave the course. And so, months or even decades later, they have confidence but little leftover competence when their wheels begin to spin.

In cases like this, the most enlightened approach, as proposed by Swedish researcher Nils Petter Gregersen, may be to avoid teaching such skills at all. Instead of training drivers how to negotiate icy conditions, Gregersen suggests, perhaps classes should just convey their inherent danger—they should scare inexperienced students away from driving in winter conditions in the first place, and leave it at that.

But, of course, guarding people from their own ignorance by sheltering them from the risks of life is seldom an option. Actually getting people to part with their misbeliefs is a far trickier, far more important task. Luckily, a science is emerging, led by such scholars as Stephan Lewandowsky at the University of Bristol and Ullrich Ecker of the University of Western Australia, that could help.

In the classroom, some of the best techniques for disarming misconceptions are essentially variations on the Socratic method. To eliminate the most common misbeliefs, the instructor can open a lesson with them—and then show students the explanatory gaps those misbeliefs leave yawning or the implausible conclusions they lead to. For example, an instructor might start a discussion of evolution by laying out the purpose-driven evolutionary fallacy, prompting the class to question it. (How do species just magically know what advantages they should develop to confer to their offspring? How do they manage to decide to work as a group?) Such an approach can make the correct theory more memorable when it’s unveiled, and can prompt general improvements in analytical skills.

Then, of course, there is the problem of rampant misinformation in places that, unlike classrooms, are hard to control—like the Internet and news media. In these Wild West settings, it’s best not to repeat common misbeliefs at all. Telling people that Barack Obama is not a Muslim fails to change many people’s minds, because they frequently remember everything that was said—except for the crucial qualifier “not.” Rather, to successfully eradicate a misbelief requires not only removing the misbelief, but filling the void left behind (“Obama was baptized in 1988 as a member of the United Church of Christ”). If repeating the misbelief is absolutely necessary, researchers have found it helps to provide clear and repeated warnings that the misbelief is false. I repeat, false.

The most difficult misconceptions to dispel, of course, are those that reflect sacrosanct beliefs. And the truth is that often these notions can’t be changed. Calling a sacrosanct belief into question calls the entire self into question, and people will actively defend views they hold dear. This kind of threat to a core belief, however, can sometimes be alleviated by giving people the chance to shore up their identity elsewhere. Researchers have found that asking people to describe aspects of themselves that make them proud, or report on values they hold dear, can make any incoming threat seem, well, less threatening.

For example, in a study conducted by Geoffrey Cohen, David Sherman, and other colleagues, self-described American patriots were more receptive to the claims of a report critical of U.S. foreign policy if, beforehand, they wrote an essay about an important aspect of themselves, such as their creativity, sense of humor, or family, and explained why this aspect was particularly meaningful to them. In a second study, in which pro-choice college students negotiated over what federal abortion policy should look like, participants made more concessions to restrictions on abortion after writing similar self-affirmative essays.

Sometimes, too, researchers have found that sacrosanct beliefs themselves can be harnessed to persuade a subject to reconsider a set of facts with less prejudice. For example, conservatives tend not to endorse policies that preserve the environment as much as liberals do. But conservatives do care about issues that involve “purity” in thought, deed, and reality. Casting environmental protection as a chance to preserve the purity of the Earth causes conservatives to favor those policies much more, as research by Matthew Feinberg and Robb Willer of Stanford University suggests. In a similar vein, liberals can be persuaded to raise military spending if such a policy is linked to progressive values like fairness and equity beforehand—by, for instance, noting that the military offers recruits a way out of poverty, or that military promotion standards apply equally to all.

But here is the real challenge: How can we learn to recognize our own ignorance and misbeliefs? To begin with, imagine that you are part of a small group that needs to make a decision about some matter of importance. Behavioral scientists often recommend that small groups appoint someone to serve as a devil’s advocate—a person whose job is to question and criticize the group’s logic. While this approach can prolong group discussions, irritate the group, and be uncomfortable, the decisions that groups ultimately reach are usually more accurate and more solidly grounded than they otherwise would be.

For individuals, the trick is to be your own devil’s advocate: to think through how your favored conclusions might be misguided; to ask yourself how you might be wrong, or how things might turn out differently from what you expect. It helps to try practicing what the psychologist Charles Lord calls “considering the opposite.” To do this, I often imagine myself in a future in which I have turned out to be wrong in a decision, and then consider what the likeliest path was that led to my failure. And lastly: Seek advice. Other people may have their own misbeliefs, but a discussion can often be sufficient to rid a serious person of his or her most egregious misconceptions. ... In 2008, the Intercollegiate Studies Institute surveyed 2,508 Americans and found that 20 percent of them think the electoral college “trains those aspiring for higher political office” or “was established to supervise the first televised presidential debates.” Alarms were again raised about the decline of civic literacy. Ironically, as Stanford historian Sam Wineburg has written, people who lament America’s worsening ignorance of its own history are themselves often blind to how many before them have made the exact same lament; a look back suggests not a falling off from some baseline of American greatness, but a fairly constant level of clumsiness with the facts. "

some comments of mine

Failure to choose the "I don't know" option could be misleading

The author seems to imply that study participants who were offered an 'I don't know' option and who did not choose it are confident.

People not saying "I don't know" on surveys does not imply that they are confident in their answers. First, in everyday life you often have to make decisions without enough information to really know the right answer, so you often have to adopt an opinion about something even if you do not know much about it. Where an action is at stake, that opinion may rationally be held very strongly; for example, if you tell someone "X is a form of scientific research or industrial process that a large minority of scientists consider to be very dangerous", it may be rational for them to strongly support the action "X should be regulated" even if they don't really know what X is, but this requires adopting the belief "X has a significant probability of being dangerous".
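
To make the "rational strong support under ignorance" point concrete, here is a toy expected-cost sketch of my own; the probabilities and costs are made up for illustration and are not from the article:

    # Hypothetical agent who only believes "X has a significant probability
    # of being dangerous" (here p = 0.2) and knows nothing else about X.
    # All numbers are invented for illustration.
    p_dangerous = 0.2          # assumed probability that X is dangerous
    harm_if_unregulated = 100  # assumed harm if X is dangerous and unregulated
    cost_of_regulation = 5     # assumed cost of regulating X in any case

    expected_cost_unregulated = p_dangerous * harm_if_unregulated  # 20.0
    expected_cost_regulated = cost_of_regulation                   # 5.0

    if expected_cost_regulated < expected_cost_unregulated:
        print("Strongly supporting 'X should be regulated' minimizes expected cost.")

Under numbers like these, the action-level opinion ("regulate X") can be held firmly even though the factual belief behind it ("X is dangerous") is weak and vague.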

Second, in many situations the pragmatic convention is that, if the speaker and the listener share the contextual knowledge that the speaker lacks expertise in the topic, statements come with the implicit qualifier "I am not a (lawyer/doctor/scientist/etc.), so I don't really know, but my guess is...". If you give someone a questionnaire about things they are obviously not well informed about, they may presume that the context is that you want guesses, rather than "I don't know" checked for every question.

Third, this convention is less common within academia, which makes it easy for psychologists to overlook it.

Fourth, there are many degrees of confidence, but "I don't know" is binary. People may think that "I think X but I'm not sure" is better represented by answering "X" than by answering "I don't know".

Fifth, when presented with multiple-choice answers, people may see their task as choosing the answer that comes closest to representing their views, rather than as an opportunity to make an assertion, or to decline to make one by saying "I don't know". One might imagine each answer corresponding to a prototypical person holding an idealized belief state, and calculate the similarity between your own belief state and each of those idealized ones. In that picture, "I don't know" would be the belief state of a prototypical person who shrugs their shoulders, smiles, and has no thoughts whatsoever regarding the question. A study participant may decide that, although none of the choices accurately represents them, they are still closer to one of the primary choices than to "I don't know".
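
Here is a minimal sketch of that nearest-prototype reading of the task (a toy model of my own; the belief vectors and distances are invented for illustration, not anything the surveys actually measure):

    import math

    # Each answer option is modeled as an idealized belief state:
    # (strength of leaning "true", strength of leaning "false").
    prototypes = {
        "agree":        (1.0, 0.0),  # firmly believes the statement
        "disagree":     (0.0, 1.0),  # firmly disbelieves it
        "I don't know": (0.0, 0.0),  # shrugs; no thoughts on the question at all
    }

    respondent = (0.6, 0.1)  # leans toward "true" but is far from certain

    # Pick the option whose idealized belief state is nearest to the respondent's.
    choice = min(prototypes, key=lambda k: math.dist(respondent, prototypes[k]))
    print(choice)  # "agree" -- closer to that caricature than to "I don't know"

On this reading, ticking "agree" reports similarity to the nearest caricature, not confidence.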

People holding contradictory or incorrect beliefs

The author mentions people agreeing with statements that formally contradict one another.

First, when a person agrees with a prewritten statement, they might be thinking not "I agree with this exactly" but rather "As written, this is technically not what I believe, but it's very close, so I'll say I agree". This makes sense in the context of group consensus-building: if, whenever someone proposes something, every other person points out tiny flaws in the wording, coming to agreement takes a long time (and may cause animosity). In some situations that cost is worth it because it is important to get the wording exactly right, but in most situations precision is unnecessary and the group just wants to "approximately agree" relatively quickly with a minimum of bad feelings; in those situations, someone who refuses to say "I agree" unless the statement exactly matches their beliefs would be considered obnoxious. (As before, the need for precision is more common in academia, so the convention there is different.) This is similar to the fifth objection under "Failure to choose the "I don't know" option could be misleading": it is possible for a person to hold a consistent set of beliefs that does not exactly match any of the multiple-choice answers or prewritten statements, to choose the closest answer for each question, and to end up with a set of answers that is inconsistent even though their actual beliefs are consistent.
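
A toy sketch of that last possibility (my own illustration; the statements loosely paraphrase the Obama-economy pair quoted above, and the ratings are made up): one consistent graded belief, scored against each prewritten statement separately, gets rounded to "agree" both times, and the two full endorsements then look contradictory.

    # One consistent underlying belief: "there has been some improvement,
    # but big obstacles remain." Rated per statement on a 0..1 "how true" scale.
    # Statements loosely paraphrase the article's example pair; ratings are invented.
    belief = {
        "The recovery is already strong":             0.55,
        "A strong recovery has been made impossible": 0.55,
    }

    options = {"agree": 1.0, "disagree": 0.0}  # no graded option is offered

    answers = {
        statement: min(options, key=lambda o: abs(options[o] - rating))
        for statement, rating in belief.items()
    }
    print(answers)
    # {'The recovery is already strong': 'agree',
    #  'A strong recovery has been made impossible': 'agree'}
    # Taken as full endorsements, the two answers formally contradict each other,
    # even though the underlying graded belief is consistent.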

Second, even if a person wrote their own assertions, one might find contradictions; but it is hard to know if this is because the person does not possess a coherent set of beliefs, or if they do but they are having difficulty representing those beliefs in language.

Suggestion: instead of looking for inconsistencies in assertions, give people decision-making scenarios and ask what decision they would make. I'm guessing you may find less inconsistency.