Sometime during the 1950s, Solomon Asch performed a psychological experiment in which participants were brought into a room and told that their vision was going to be tested. One group of participants was simply shown an image of a line and asked to compare its length to three other lines. Specifically, they were asked, “Which of these three lines is the same length as the first?” Only 1 in 35 participants answered incorrectly.
Another group of participants sat with a number of other “participants,” and each person in the room was asked the same question, in turn. The other participants in the room were all collaborators with the experimenter, and they each gave the same incorrect answer to the question. The real participants were always the last to answer, and 75% of them conformed at least once, giving the same incorrect answer as the collaborators. These results are interesting enough that this experiment has been repeated by psychology undergraduate students hundreds of times, often with the same results. Even I’ve participated in this experiment as an undergrad, because, let’s face it, it’s fun to watch obvious social conformity at work.
Essentially, when everybody else seemed to be giving an obviously false answer, most participants conformed to the group’s consensus. That’s startling. Here’s what’s even more startling: few, if any, of the participants recognized that they were being pressured into giving the wrong answer. None of them said, “I knew what the right answer was, but I caved in the face of social pressure.” Rather, they said things like, “My eyesight’s been bothering me today,” or, “Line A looked ‘farther away’ than line B, and therefore the same length.” In effect, each person invented a narrative that made their choice seem rational and correct. This is significant: when providing an obviously incorrect answer, participants interpreted reality in a way that allowed them to feel like they were providing the right answer. In other words, nobody feels like they are just conforming. Conformists always feel as if they have an ideological reason to do what they do, or that they have evidence that what they’re doing is correct, or that their judgment was compromised by something other than social pressure (bad eyesight, for example).
This leads me to wonder how often this happens in day-to-day life. For example, nearly everybody supports public schools, but everybody maintains an ideological or practical reason to support them (even in the face of direct evidence that they are radically failing our children, both socially and academically). Nobody admits to supporting them because it’s unpopular not to. Even staunch conservatives who vehemently resist the government takeover of the medical industry claim that education is somehow “different,” and that there are good reasons for the government to subsidize education that don’t apply to the medical industry. I have a hard time believing that this isn’t at least partly the same phenomenon we observe in Asch’s experiment.
I think this can happen in population subgroups as well. It is very unpopular for conservatives to oppose strict immigration laws, even though such measures clearly and obviously resemble the exact kinds of government action they claim to oppose. And yet, collectively and individually, we’ve invented a myriad of ways to rationalize the discrepancy, because nobody admits to simply conforming to the social consensus. Nobody experiences it that way.
I imagine that this has direct implications in our understanding of peer pressure in school. It could even happen in church. Is it possible that because everybody interprets a specific scripture a certain way, alternative (and perhaps more obvious) interpretations are invisible to us? Is it possible that we simply ignore or explain away the discrepancies between what we read on paper and how others live and interpret it, in the same way that Asch’s participants ignored and explained away the differences between what they saw and what others were saying?
I don’t know all the answers. Many reading this will probably question my examples, and perhaps even be offended by them. But you don’t have to agree with me, and that’s my point. You don’t have to believe that public education and immigration are examples of this phenomenon. However, I think it’s important to stop and think, and consider whether some of our ideologies or beliefs are simply attempts to rationalize a discrepancy between what we see and what everybody else claims to see.
For example, the majority of undocumented Mexican immigrants I meet are good, hard-working people. But everyone else seems to believe that they’re a burden to the public. There’s a discrepancy there, and so there must be some sort of conspiracy that I can’t see that explains why we must deport them. And by accepting that premise, I can align myself with the majority (at least, the majority within my political subgroup) and not feel like I’m simply following the consensus. I don’t know for sure if this example is truly the way things happen. It’s just a hypothetical example that I’m using to illustrate my point, which is that none of us think we’re guilty of this. And all I know is that this leads me to pause and reconsider many of the assumptions I hold.
Fortunately, there’s a silver lining in Asch’s experiment. If just one of the collaborators gave the right answer, even while all the others gave the wrong answer, the participant in the study was once again far more likely to give the right answer. In fact, if just one of the collaborators gave a different wrong answer than the rest, the participant felt freer to dissent as well, and give the right answer. The lesson here is clear: if you have a dissenting opinion, express it. Don’t rationalize it away. Just say what’s on your mind, even if it contradicts the consensus. If you see that the emperor is naked, point it out, even if everybody else sees clothes. Your one dissenting voice will crack the tyranny of the majority. Just one dissenting voice makes a difference.