Quote:
Originally Posted by Captain Snooze
Why Facts Don’t Change Our Minds
"In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”"
New Yorker
Doubling down on the bias. This is a fascinating experiment with some very interesting observations about what makes us human.
EDIT: The study's conclusion, "Do not get people to describe their beliefs publicly if you want to change them," runs counter to the supposed usefulness of virtue signalling. Virtue signalling would actually be harmful to drawing people to your cause.