notes-group-rationality

one guy's proposed enumeration of the actions available during a debate:

" The defender then responds to the challenger's argument, for each proposition either (a) conceding it; (b) dismissing it contemptuously, as unworthy of serious consideration; (c) equating it to some other proposition stated by the defender, or negation of some proposition stated by the challenger (ie, putting a symbolic link in the argument tree); or (d) responding with a counterargument. " -- http://unqualified-reservations.blogspot.com/2007/10/duelnode-another-free-startup-idea.html

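a minimal sketch of that scheme as a data structure, in Python (the names ResponseKind and Node are mine, not from the post):

    from __future__ import annotations
    from dataclasses import dataclass, field
    from enum import Enum, auto
    from typing import Optional

    class ResponseKind(Enum):
        CONCEDE = auto()   # (a) accept the challenger's proposition
        DISMISS = auto()   # (b) decline to engage, on the record
        EQUATE = auto()    # (c) symbolic link to another proposition in the tree
        COUNTER = auto()   # (d) respond with a counterargument

    @dataclass
    class Node:
        proposition: str
        response: Optional[ResponseKind] = None  # how the defender answered
        link: Optional[Node] = None              # target node, used for EQUATE
        children: list[Node] = field(default_factory=list)  # counterarguments
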
--

another one, similar to conceding: "I think you've misconstrued my point as I'm generally in agreement with you."

--

the theorem (the "discursive dilemma", a.k.a. the doctrinal paradox) that a group which decides which propositions it believes by majority vote can be irrational (in the sense of holding inconsistent beliefs), even if every individual member is rational, e.g.:

all members believe (A and B) implies C
1/3 of members believe A and not-B and not-C
1/3 of members believe B and not-A and not-C
1/3 of members believe A and B and C

by majority vote, the group believes A (a 2/3 majority), believes B (2/3), and rejects C (2/3); so the group holds A and B but not-C, even though every member accepts (A and B) implies C, i.e. the group's beliefs are inconsistent (checked in the sketch below).

(cite [1])
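
a quick brute-force check of the example in Python (a sketch; the dict encoding and the name premise_ok are mine):

    # each member's belief set fully assigns truth values to A, B, C
    members = [
        dict(A=True,  B=False, C=False),   # 1/3 of the group
        dict(A=False, B=True,  C=False),   # 1/3 of the group
        dict(A=True,  B=True,  C=True),    # 1/3 of the group
    ]

    def premise_ok(b):
        # the shared premise: (A and B) implies C
        return (not (b["A"] and b["B"])) or b["C"]

    assert all(premise_ok(m) for m in members)  # every member is individually consistent

    # the group votes on each proposition separately
    group = {p: sum(m[p] for m in members) > len(members) / 2 for p in "ABC"}
    print(group)                    # {'A': True, 'B': True, 'C': False}
    assert not premise_ok(group)    # the group's beliefs violate the shared premise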

--

http://www.huffingtonpost.com/marty-kaplan/most-depressing-brain-fin_b_3932273.html

"

Yale law school professor Dan Kahan's new research paper is called "Motivated Numeracy and Enlightened Self-Government," but for me a better title is the headline on science writer Chris Mooney's piece about it in Grist: "Science Confirms: Politics Wrecks Your Ability to Do Math."

Kahan conducted some ingenious experiments about the impact of political passion on people's ability to think clearly. His conclusion, in Mooney's words: partisanship "can even undermine our very basic reasoning skills.... [People] who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs." "

"

Here's some of what Nyhan found:

    People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.
    People who thought George W. Bush banned all stem cell research kept thinking he did that even after they were shown an article saying that only some federally funded stem cell work was stopped.
    People who said the economy was the most important issue to them, and who disapproved of Obama's economic record, were shown a graph of nonfarm employment over the prior year - a rising line, adding about a million jobs. They were asked whether the number of people with jobs had gone up, down or stayed about the same. Many, looking straight at the graph, said down.
    But if, before they were shown the graph, they were asked to write a few sentences about an experience that made them feel good about themselves, a significant number of them changed their minds about the economy. If you spend a few minutes affirming your self-worth, you're more likely to say that the number of jobs increased.

In Kahan's experiment, some people were asked to interpret a table of numbers about whether a skin cream reduced rashes, and some people were asked to interpret a different table -- containing the same numbers -- about whether a law banning private citizens from carrying concealed handguns reduced crime. Kahan found that when the numbers in the table conflicted with people's positions on gun control, they couldn't do the math right, though they could when the subject was skin cream. The bleakest finding was that the more advanced that people's math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem. "
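
the table here was a 2x2 covariance problem: the salient comparison (raw counts) points the opposite way from the correct one (proportions). a sketch in Python, with the figures as I recall them from Kahan's paper (treat them as illustrative):

    # 2x2 table: did the rash improve for patients who used the cream?
    #              improved  got worse
    used_cream   = (223,     75)
    no_cream     = (107,     21)

    def improve_rate(improved, worse):
        return improved / (improved + worse)

    print(f"cream:    {improve_rate(*used_cream):.1%}")   # ~74.8%
    print(f"no cream: {improve_rate(*no_cream):.1%}")     # ~83.6%
    # the raw counts (223 > 107) make the cream look helpful;
    # the rates show the no-cream group actually did better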

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2319992

http://www.dartmouth.edu/~nyhan/nyhan-reifler.pdf

http://www.dartmouth.edu/~nyhan/opening-political-mind.pdf

---

notes on http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds

" In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently. "

" graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. ... Sloman and Fernbach see this effect, which they call the illusion of explanatory depth, just about everywhere. "

" Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two. ... participants were shown one of the same problems, along with their answer and the answer of another participant, who'd come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else's were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they'd earlier been satisfied with. "

" Consider what's become known as confirmation bias, the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it's the subject of entire textbooks' worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies (you guessed it) were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who'd originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who'd started out pro-capital punishment were now even more in favor of it; those who'd opposed it were even more hostile. "

" The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank's bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men's responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who'd been put on report by his supervisors several times. Once again, midway through the study, the students were informed that they'd been misled, and that the information they'd received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who'd received the first packet thought that he would avoid it. The students in the second group thought he'd embrace it. "

" As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine (they'd been obtained from the Los Angeles County coroner's office), the scores were fictitious. The students who'd been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well, significantly better than the average student, even though, as they'd just been told, they had zero grounds for believing this. Conversely, those who'd been assigned to the low-score group said that they thought they had done significantly worse than the average student, a conclusion that was equally unfounded. "

" Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It's one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I'm talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine's location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.) "

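a quick sanity check of the "eighteen hundred miles" figure in Python (great-circle distance with approximate coordinates for Kiev and Madrid):

    from math import radians, sin, cos, asin, sqrt

    def haversine_miles(lat1, lon1, lat2, lon2):
        # great-circle distance on a sphere of Earth's mean radius
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * asin(sqrt(a)) * 3959  # mean Earth radius in miles

    print(haversine_miles(50.45, 30.52, 40.42, -3.70))  # ~1780 miles
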
---

See also [2], [Self-ideas-groupDecisionMaking-sourcesOfCollectiveInsanity].

---

todo: combine the pages [3], [4], [5], [6], [7].