26 December 2010

One Pill, Two Pill, Red Pill, Blue Pill

Some people (check Google) would have you believe that the red pill/blue pill question is Important and Insightful. They are wrong. On the off chance you are lucky enough never to have seen The Matrix, here is The Offer:
Morpheus : This is your last chance. After this, there is no turning back. You take the blue pill, the story ends. You wake up and believe... whatever you want to believe. You take the red pill... you stay in wonderland... and I show you just how deep the rabbit hole goes.
At first glance this is a simple offer: do you want to know unpleasant truths, or continue living a lie? However, if you think about it for longer than a second, it’s much more complex. There is a crucial aspect of The Matrix that is of paramount importance and often glossed over: the illusion is perfect. That is to say, you cannot know your world is not real. Thus, for the time between the offer and your decision, two realities simultaneously exist: your epistemic one and the objective one. In this way, Morpheus' offer is the choice between remaining in your epistemic reality (and forgetting entirely the existence of the objective one) and accepting the objective one while knowing that your heretofore epistemic reality was a lie.

Which do you choose? You’re probably still thinking that you’ll choose the red pill. "After all," you tell yourself, "I want to know what is real." Let me reiterate: realness has no place in this argument. In The Matrix the veil of ignorance is perfect; aside from Morpheus, there is nothing to pierce it.

Change the paradigm: if Morpheus never showed up, would you have to engage in any sort of cognitive trickery to keep yourself from questioning the nature of your reality? The answer is 'no', because scriptwriters are lazy and people don’t like thinking this way.

The Matrix offer is instructive precisely because it is impossible in the real world. However, it's useful only insofar as it can show us the difference between our own daily red pill/blue pill choices and what we think those choices look like.

When you choose to believe a lie (or to take any assertion on faith) your brain, probably unbeknownst to you, is making a calculated decision between the palatability of the lie and that of the truth. When my girlfriend asks me whether I care that she doesn’t wear makeup all the time, she wants me to say that I do not care. When I oblige her she happily swallows the pill, because it makes her feel good. Read that again and note what’s missing: any mention of my truthfulness. Was I lying? Does it matter?

Ask yourself whether fat people ought to be happier than thin people. Your answer and your BMI are, incidentally, not of any concern here. How did your answer make you feel about yourself? If it made you feel bad, you have body image problems; if it made you feel good, you don’t. Again, the empirical accuracy of the answer does not matter.

Okay, neat game: we can diagnose all sorts of stuff this way, but it’s basically a parlor trick, right? Wrong. Shit gets real when your epistemic reality and objective reality do not play nice together. To wit:
Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration's prewar claims that Iraq had weapons of mass destruction. One group was given a refutation -- the comprehensive 2004 Duelfer report that concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003. Thirty-four percent of conservatives told only about the Bush administration's claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of conservatives who heard both claim and refutation thought that Iraq really did have the weapons. The refutation, in other words, made the misinformation worse.
You still sure you want to take that red pill?

One of the less acknowledged aspects of the human condition is that we think we want things that we do not. The Matrix offer isn't real; in the real world, to maintain your epistemic reality in the face of objective truths you have to either change your beliefs (hard) or change what you will believe (easy).

Don't take my word for it:
Marion Keech was a housewife in suburban Minneapolis who was convinced that the apocalypse was coming. Keech had started getting messages from aliens a few years before, but now the messages were getting eerily specific. According to Sananda, an extra-terrestrial from the planet Clarion who was in regular contact with Keech, human civilization would be destroyed by a massive flood at midnight on December 20, 1954.

Keech's sci-fi prophecy soon gained a small band of followers. They trusted her divinations, and marked the date of Armageddon on their calendars. Many of them quit their jobs and sold their homes. The cultists didn't bother buying Christmas presents or making arrangements for New Year's Eve, since nothing would exist by then.

On the night of December 20, Keech's followers gathered in her home and waited for instructions from the aliens. Midnight inexorably approached. When the clock read 12:01 and there were still no aliens, the cultists began to worry. A few began to cry. The aliens had let them down. But then Keech received a new telegram from outer space, which she quickly transcribed on her notepad. "This little group sitting all night long had spread so much light," the aliens told her, "that god saved the world from destruction. Not since the beginning of time upon this Earth has there been such a force of Good and light as now floods this room." It was their stubborn faith that had prevented the apocalypse. Although Keech's predictions had been falsified, the group was now more convinced than ever that the aliens were real. They began proselytizing to others, sending out press releases and recruiting new believers. This is how they reacted to the dissonance of being wrong: by being more sure than ever that they were right.
Or a more modern example:
The Princeton political scientist Larry Bartels analyzed survey data from the 1990s to prove the same point. During the first term of Bill Clinton's presidency, the budget deficit declined by more than 90 percent. However, when Republican voters were asked in 1996 what happened to the deficit under Clinton, more than 55 percent said that it had increased. What's interesting about this data is that so-called "high-information" voters - these are the Republicans who read the newspaper, watch cable news and can identify their representatives in Congress - weren't better informed than "low-information" voters. (The sole exception was Republicans who are ranked in the top 10 percent in terms of political information. As Bartels notes, it's only among these people that "the pull of objective reality begins to become apparent.")

According to Bartels, the reason knowing more about politics doesn't erase partisan bias is that voters tend to assimilate only those facts that confirm what they already believe. If a piece of information doesn't follow Republican talking points - and Clinton's deficit reduction didn't fit the "tax and spend liberal" stereotype - then the information is conveniently ignored. "Voters think that they're thinking," Bartels says, "but what they're really doing is inventing facts or ignoring facts so that they can rationalize decisions they've already made." Once we identify with a political party, the world is edited so that it fits with our ideology.
To put this in the context of The Matrix, we're all choosing Morpheus' blue pill, but remembering that we chose the red one. Morpheus even makes this an explicitly allowed outcome by telling Neo that he can "wake up and believe... whatever you want to believe." Except that in the real world the illusion isn't perfect, as it would be in the Matrix. Look at those top 10 percent of Republicans: they had so much information that they had to acknowledge the reality of Clinton's deficit reduction. It's taken for granted now, in 2010, that Iraq had no WMD; those who used to fervently believe otherwise are either among the conspiracy theorists or have discarded (and likely, quite literally, forgotten) their old belief.

Each time you swallow the blue pill and tell yourself it was red, you draw the veil of ignorance that much tighter around yourself and close the distance between comforting lies and true delusions. In the words of psychiatrist Noel Gardner, testifying to Ron Lafferty's sanity after the brutal murder of his brother's wife and toddler:
Ron had constructed his own idiosyncratic theology in a very non-psychotic way... He created it by whatever feels good to him. [Ron] says, ‘It just gives me a sense of peace, and I know it's true,' and it becomes a part of his own unique article of faith.
Sound familiar?
