Two autumns ago, in my son Jake’s junior year of high school, he took an AP English course. Junior year was bad for him and me — we never seemed to have anything nice to say to one another. But Jake did like to read, and it occurred to me at some point that perhaps I could use his AP English course to connect with him. Surely I’d read the same books he was reading, since the high-school reading list was carved in stone sometime in the early 1950s. So I asked him: What are you reading in AP English?
“The Great Gatsby,” he said.
“Do you … like it?” I asked delicately, thrilled to be having what was almost a conversation with my teenage son.
“I don’t really like the actor who plays Gatsby,” he said. “He’s got these weird bumps on his face that keep distracting me.”
The actor? It took me a moment to understand. “We’re not actually reading the book,” Jake informed me. “We haven’t read a book all semester. We watch the movies instead.”
It sort of made sense, once I calmed down and thought about it. It was hard to get kids to read back when I was in high school; what must it be like now, when there are iPods and iPhones and the Internet and cable TV? Better to have seen Robert Redford pretend to be Gatsby than never to have known Gatsby at all.
Just the same, I was glad when, for his senior year, Jake proposed taking an English course at the local community college. Come September, he and a buddy drove to the college every Monday night and sat for three hours in English 101 — where they never once read a book. They watched movies instead.
Jake got an A- in the course.
We live in interesting times. In the past decade, the number of college grads who can interpret a food label has fallen from 40 percent to 30 percent. An American child is six times more likely to know who won American Idol than the name of the Speaker of the House. (For more bad news, see the sidebar on page 59.) Reading and writing scores both fell on the 2008 SATs. Not long ago, a high-school teacher in California handed out an assignment that required students to use a ruler — and discovered not a single one of them knew how to use one.
What in the world is going on with our kids?
Bring the subject up in any group of parents around Philadelphia, and you’ll hear the same thing: Children today seem, well, dumber than they used to. They don’t know the most basic stuff: who fought against whom in World War II, how many pints are in a quart, and in Jake’s case, the days of the week. (He’s shaky on the months, too.) They may be taking every AP and Honors course their schools offer, but they can’t tell you who invented pasteurization. (They do know who invented Facebook, because they saw the movie The Social Network.) They spend an average of eight and a half hours a day in front of screens — computer screens, TV screens, iPhone screens. Add in eight hours of sleep and seven of school, and that leaves half an hour when their senses aren’t under siege — just enough time for a shower.
Besides, these educators say, we don’t have solid data to tell us what kids really did know 30 or 40 years ago, not to mention that the American education system is struggling to decide exactly what should be taught now, given the ever-increasing possibilities. The Penn profs see the world changing, not our kids.
So I challenge them: What is Jake learning when he spends six hours a day on his computer, playing World of Warcraft? Everyone turns to Yasmin Kafai, who, it turns out, has devoted extensive research to computer games. “Over so many hours,” she says, “he’s learned how to master an incredibly complex system. These multi-person games that involve intra-functional teams — ‘guilds,’ they call them — organize their entrants the way some workplaces do. These are skills that corporate employers are very interested in.” And, she says, he’s learning perseverance: “Kids invest hundreds of hours in gameplay.”
IN SEPTEMBER, the New York Times published an article on a young couple, Taylor Bemis and Andrea Lieberg, who serve as caretakers for Ralph Waldo Emerson’s former home in Concord, Massachusetts. In part, it read:
To connect with their long-gone host and his philosophy of individualism, freedom and self-reliance, the couple tried to read his essays and to listen to his work on audiotape, but it was only after watching a DVD about Emerson that they began to understand him.
“I felt like he was the first person, or one of the first people, to start thinking outside the box with his whole Transcendentalism and, like, God and nature and all that,” Ms. Lieberg said. “So we were like, okay, he’s cool, nonconformist. And we like that.”
The Times is clearly poking fun at Bemis and Lieberg: Ha-ha, the caretakers who, like, couldn’t even read Emerson’s work! But there’s another way to interpret the couple’s experience, and it goes to what’s happening with kids today. It’s not that Bemis and Lieberg were too stupid to read Emerson. It’s that their brains no longer function that way. They quite literally couldn’t understand Emerson’s philosophy until it was presented to them in a form that engaged them as words on a page could not.
I know what you’re thinking: it’s like reading The Great Gatsby vs. watching the movie. The movie has to be an inferior intellectual enterprise. But is that true, or has our culture just taught us to think that way?
Marshall McLuhan wasn’t the first to observe that how we garner information, or share it, inevitably affects the content. In an Atlantic article called “Is Google Making Us Stupid?” Nicholas Carr relates that in 1882, Friedrich Nietzsche invested in a typewriter after problems with his vision made using a pen difficult. Now he could write with his eyes closed! What he didn’t anticipate was how the substance of his thoughts, as transmitted to the page, would change, moving “from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.” His experience wasn’t unique.