Study: For Some Time Now, People Have Been Getting Measurably Dumber

We spend too much time consuming "babble."

The other day, during one of my too-frequent tumbles down the rabbit hole of cyberspace, I came across an article discussing a paper published in the journal Trends in Genetics that posits an interesting, if vaguely disheartening, theory about the nature of human intelligence.

Dr. Gerald Crabtree, a Stanford University biochemist, analyzed the mutation rates of the 2,000-5,000 genes required for optimal human intellectual and emotional fitness and determined that, for some time now, people have been getting measurably dumber:

A hunter–gatherer who did not correctly conceive a solution to providing food or shelter probably died, along with his/her progeny, whereas a modern Wall Street executive that made a similar conceptual mistake would receive a substantial bonus and be a more attractive mate. Clearly, extreme selection is a thing of the past.

In other words, a person who was once—in a Darwinian sense—literally too stupid to live, now has it pretty easy.

Not everyone is convinced by Crabtree’s findings, but they got me thinking about how the digital age is having a detrimental effect on our ability to think.

In 2012, we find ourselves awash in more potentially actionable intelligence than a person from our grandparents’ generation would ever have thought possible. According to one oft-cited statistic, a Sunday edition of the New York Times carries more information than the average 19th-century citizen accessed in his entire life. And that’s just the print version I’m talking about.

As citizens of the digital age, we have been granted access to the most powerful repository of knowledge ever conceived of—the Internet—and we’ve taken to it like kids to cake. There’s just one little problem: We haven’t really learned how to digest it all yet.

See, having access to more information does not necessarily translate into greater intellect. Quite the contrary: the more information we have at our fingertips, the less of it actually gets processed, internalized, and stored for later use. Knowledge isn’t measured by your ability to regurgitate facts at cocktail parties, but by your ability to recall them, organize them into coherent concepts, recognize historical patterns, and apply them in a critical way to solve problems or develop new theories.

To do that successfully requires learning, and learning involves a lot more than just consuming information. While we are consuming infinitely more information these days, research suggests we are spending very little time ruminating on it. According to Nielsen, the average person views more than 2,500 individual webpages a month, but spends an average of less than a minute on each. How much learning do you think is happening there?

Meanwhile, even when we think we’re getting a beneficial piece of information, chances are we are not. Nearly a quarter of the information we consume now comes from social media. How much actionable information does the average Facebook user get for his seven hours a month spent on the site? In 2009, Pear Analytics studied 2,000 Twitter messages and broke their contents into three categories: those that are conversational, those with pass-along value, and those that are “pointless babble.” The winner? You guessed it, babble—accounting for more than 40 percent of total tweets.

Higher societies are societies of reflection—a vital faculty that has become a rare commodity these days. Wisdom for the ancients—the wisdom Crabtree says we are losing—was an active endeavor, and information was the fuel that propelled this activity. Today we are passive, eagerly sucking up everything in our path. Yet with every webpage we surf, every 30-second video clip we stream on YouTube, every tweet we digest, we lose our capacity for critical thought, active thought, creative thought. We have become like swollen sponges.

There may be hope. In his book The Overflowing Brain: Information Overload and the Limits of Working Memory, cognitive scientist Torkel Klingberg suggests that while it’s important to find a balance between access to information and our capacity to process it, given time and practice our brains will begin to adapt to the increased demands we are putting on our working memory.

How do I know this? Well, I learned it on the Internet of course.