Do iPads Cause Religious Experiences?

In a recent BBC documentary, Secrets of the Superbrands, presenter Alex Riley attended the opening of a London Apple store and noted the “evangelical frenzy” of the fans lining the block. At first Riley stayed mostly at the periphery of the crowd, shooting sarcastic looks to the camera and lobbing one-liners about the seemingly religious ecstasy of the “glassy-eyed” consumers being ushered in by a team of Apple’s blue-shirted “preachers.” Entertaining as these loose comparisons might be, it was what Riley did next that caused a brief media flurry and garnered coverage from MSNBC, Business Insider, and over two hundred blogs, news sites, and forums within days of the documentary’s release.

Riley tracked down an Apple super-fan, Alex Brooks (editor of World of Apple), and had a team of neuroscientists study Brooks’ brain with an MRI scanner as he was shown pictures of various mp3 players, computers, and gadgets. The goal of the experiment was to determine whether Apple products uniquely activated specific parts of Brooks’ brain, and whether this might say something about his psychological devotion to the brand. The result: when viewing Apple products, Brooks’ brain showed patterns of activity similar to those of religious practitioners shown religious imagery. According to Dr. Gemma Calvert, the expert who led the experiment, it seems that technology brands “exploit the brain areas that have evolved to process religion.”

This is, at first glance, a familiar kind of story. Over the last decade an entire genre of scholarship has emerged that correlates neuroscientific data with religious experiences and practices. Books with titles like Zen and the Brain, Neurotheology, and Religion Explained have become ubiquitous, and can generally be stacked into two piles. The first, what we might call “neurotheologies,” comprises books and articles that attempt to objectify and validate religious experiences with data about the brain. These studies were especially common in the late 1990s and early 2000s, when medical researchers and scholars of religion started training neuroimaging technologies on the brains of meditators, shamans, and mystics with the hope of isolating neural circuitry that was uniquely religious. Research in this vein continues to this day, often with the goal of offering empirical support for otherwise subjective accounts of religious states.

The second, often referred to as the “cognitive science of religion,” or CSR, includes books and articles that generally try to explain salient features of religious experiences, beliefs, and practices by appealing to the functions of evolved cognitive mechanisms. A common method within CSR is to isolate a part of the brain, A, that is designed by evolution for function B, and then show how some religious thing, C, exploits this part of the brain.

One of the most popular examples is that humans evolved A) a highly active agency detection system to B) quickly identify predators in our environment, a system that now gets exploited by C) beliefs in supernatural agents, which are bolstered every time a natural event is confused with an intentional action. CSR is quite popular today due in large part to its promise to explain certain “universals” of religion (e.g., supernatural agents, burial practices) by describing brain features that all humans share.

These two approaches represent different responses to a recent explosion of information about the brain. Generally speaking, stack one is a place to go if you are interested in finding the neurological basis for an authentic religious experience; stack two is for those seeking to demystify religious experiences with science.

Whether you prefer neurotheology or CSR, any attempt to pin down religious parts of the brain faces the same obstacle: the human brain is inordinately, and deceptively, complex. This is, in part, because it was not designed by an engineer with a master plan but shaped by hundreds of thousands of years of evolutionary tinkering.

Over the last few decades, our picture of the human brain has morphed considerably. During much of the twentieth century the brain was thought to act like an efficiently managed corporate skyscraper, where information is brought to the CEO on the top floor by a hierarchy of workers, divided into departments with specific tasks, functions, and assigned locations. More recent studies describe the brain as something more akin to an old city, with circuitous roads, overlapping districts, and complex daily rhythms. Whereas a corporate skyscraper might include specific floors for various types of work, in an old city one can find butchers or doctors distributed quite haphazardly, depending on the history of the city.

Burkhard Bilger, in a recent New Yorker profile of neuroscientist David Eagleman, describes this transition in our understanding of how the brain keeps time. During the mid-nineteenth century, the prevailing theory was that there was a single, integrated time-keeper somewhere in the brain—the equivalent of a neurological stopwatch. More recent studies, however, suggest a hodgepodge of overlapping systems: the hypothalamus, the cerebellum, the basal ganglia, and other brain regions have each been proposed to handle specific timing functions.

Imagine, for example, working intently on some task until you notice that your stomach is rumbling and that the daylight has shifted. You think back to when you last ate, gauge how long you’ve been working, and give yourself five minutes to finish up before having a light snack, so as not to ruin your dinner. An MRI scan of this thought process wouldn’t show a single cluster of time-neurons hard at work, but rather different groups of neurons governing circadian rhythms, short-term timekeeping, longer-term tracking, and a host of other processes. Telling time also involves parts of our brain that likely weren’t used by our ancestors: we read clock faces and digital watches, listen for egg timers in the kitchen, and calculate distances in the minutes it takes to drive them. In short, “time” is too open and ambiguous a category to involve a single, dedicated set of neurons. The same is true of religion, and we need only return to our Apple enthusiast to see why.

Which “religious” parts of Alex Brooks’ brain lit up when he was shown Apple products? According to Dr. Calvert, Riley’s expert, it was the visual cortex, the part of the brain that processes visual information. When Brooks was shown Apple products, she states, the MRI indicated “much more activity in the visual cortex, [indicating] enhanced sort of visual attention.” In other words, Brooks focused more on pictures of Apple products than he did on other gizmos and gadgets. This shouldn’t be a surprise—he runs an Apple-news website, and makes a living tracking the product releases, updates, and rumors associated with the brand. “Enhanced visual attention” should be expected in the brains of experts of all stripes, from fashion designers to cell biologists to shepherds shown pictures of sheep among other animals.

At most, Calvert’s experiments seem to show a correlation between visual attention and past experience: if you have spent a good deal of time studying something, you devote relatively more mental energy to looking at it than at other objects. How, then, do she and Riley conclude that Apple is exploiting parts of the brain that evolved to process religion?

In a separate study, they argue, a similar pattern was found in “very religious” persons shown religious and nonreligious images. The simplest explanation for this similarity is that humans devote more visual attention to images they find interesting or see quite often, whether they are Apple products or religious images. Instead, Calvert and Riley posit a “stack two” argument: that the visual cortex is a uniquely religious part of the brain, which Apple exploits in order to sell more mp3 players. The glaring problem here is that the visual cortex is not uniquely religious, nor is religion essentially visual. Just like time, “religion” is too multifaceted to be found in just one room of a mental skyscraper.

Under even casual scrutiny, the objective aura of Alex Riley’s neuroscientific comparisons begins to fade. Apple isn’t exploiting the religious part of our brains because the brain likely doesn’t have any essentially religious real estate, and even if it did, it certainly wouldn’t be the visual cortex. Perhaps objective and careful research was never what Riley was after; this would explain why he consulted Dr. Gemma Calvert, who isn’t a neuroscientist of religion but the managing director of Neurosense, a “next generation consumer research enterprise” that applies neuroscience to marketing. Riley’s documentary wasn’t the first to make such claims, and given the popularity of neuroscience it likely won’t be the last. However, as this new picture of the “old city” brain continues to replace the “skyscraper” brain of the twentieth century, something else has become increasingly clear: unlike the well-labeled floors of an office building, the hodgepodge of an old city can easily disorient the casual tourist.