The Dumbest Generation? Don’t Be Dumb

Really, don't we all know by now that finding examples of teens' and twentysomethings' ignorance is like shooting fish in a barrel? If you want to exercise your eye-rolling or hand-wringing muscles, take your pick. Two thirds of high-school seniors in 2006 couldn't explain an old photo of a sign over a theater door reading COLORED ENTRANCE. In 2001, 52 percent identified Germany, Japan or Italy, not the Soviet Union, as America's World War II ally. One quarter of 18- to 24-year-olds in a 2004 survey drew a blank on Dick Cheney, and 28 percent didn't know William Rehnquist. The world's most heavily defended border? Mexico's with the United States, according to 30 percent of the same age group. We doubt that the 30 percent were boastful or delusional Minutemen.

Like professors shocked to encounter students who respond with a blank-eyed "huh?" to casual mentions of fireside chats or Antietam or even Pearl Harbor, and like parents appalled that their AP-amassing darling doesn't know Chaucer from Chopin, Mark Bauerlein sees in such ignorance an intellectual, economic and civic disaster in the making. In his provocative new book "The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (Or, Don't Trust Anyone Under 30)," the Emory University professor of English offers the usual indicators, grand and slight. From evidence such as a decline in adult literacy (40 percent of high-school grads counted as literate in 1992; only 31 percent did in 2003) and a rise in geographic cluelessness (47 percent of the grads in 1950 could name the largest lake in North America, compared with 38 percent in 2002), for instance, Bauerlein concludes that "no cohort in human history has opened such a fissure between its material conditions and its intellectual attainments."

He is a little late to this party, of course. The old have been wringing their hands about the young's cultural wastelands and ignorance of history at least since admirers of Sophocles and Aeschylus bemoaned the popularity of Aristophanes ("The Frogs," for Zeussakes?!) as leading to the end of (Greek) civilization as they knew it. The Civil War generation was aghast at the lurid dime novels of the late 1800s. Victorian scholars considered Dickens, that plot-loving, sentimental ("A Christmas Carol") favorite, a lightweight compared with other authors of the time. Civilization, and culture high and low, survived it all. Can it survive a generation's ignorance of history? For those born from 1980 to 1997, Bauerlein lamented to us, "there is no memory of the past, just like when the Khmer Rouge said 'this is day zero.' Historical memory is essential to a free people. If you don't know which rights are protected in the First Amendment, how can you think critically about rights in the U.S.?" Fair enough, but we suspect that if young people don't know the Bill of Rights or the import of old COLORED ENTRANCE signs—and they absolutely should—it reflects not stupidity but a failure of the school system and of society (which is run by grown-ups) to require them to know it. Drawing on our own historical memory also compels us to note that philosopher George Santayana, too, despaired of a generation's historical ignorance, warning that "those who cannot remember the past are condemned to repeat it." That was in 1905.

A more fundamental problem is what Bauerlein has in mind by "dumbest." If it means "holding the least knowledge," then he has a case. Gen Y cares less about knowing information than about knowing where to find it. (If you are reading this online, a few keystrokes would turn up the answers to the questions so far: vice president, former chief justice of the Supreme Court, the border between North and South Korea, Lake Superior.) And it is a travesty that employers are spending $1.3 billion a year to teach basic writing skills, as a 2003 survey of managers found. But if dumb means lacking such fundamental cognitive capacities as the ability to think critically and logically, to analyze an argument, to learn and remember, to see analogies, to distinguish fact from opinion … well, here Bauerlein is on shakier ground.

First, IQ scores in every country that measures them, including the United States, have been rising since the 1930s. Since the tests measure not knowledge but pure thinking capacity—what cognitive scientists call fluid intelligence, because it can be applied to problems in any domain—Gen Y's ignorance of facts (or of facts that older people think are important) reflects not dumbness but choice. And who's to say they are dumb because fewer of them than of their grandparents' generation care who wrote the oratorio "Messiah" (which 35 percent of college seniors knew in 2002, compared with 56 percent in 1955)? Similarly, we suspect that the decline in the percentage of college freshmen who say it's important to keep up with political affairs, from 60 percent in 1966 to 36 percent in 2005, reflects at least in part the fact that in 1966 politics determined whether you were going to get drafted and shipped to Vietnam. The apathy of 2005 is more a reflection of the world outside Gen-Yers' heads than of anything inside them, and one that we bet has shifted with the historic candidacy of Barack Obama. Alienation is not dumbness.

Bauerlein is not the first scholar to pin the blame for a younger generation's intellectual shortcomings on new technology (television, anyone?), in this case indicting "the digital age." But there is no empirical evidence that being immersed in instant messaging, texting, iPods, videogames and all things online impairs thinking ability. "The jury is still out on whether these technologies are positive or negative" for cognition, says Ken Kosik of the University of California, Santa Barbara, codirector of the Neuroscience Research Institute there. "But they're definitely changing how people's brains process information." In fact, basic principles of neuroscience offer reasons to be optimistic. "We are gradually changing from a nation of callused hands to a nation of agile brains," says cognitive scientist Marcel Just of Carnegie Mellon University. "Insofar as new information technology exercises our minds and provides more information, it has to be improving thinking ability."

We think that even English professors should respect the difference between correlation and causation: just because ignorance of big lakes and oratorios got worse when the digital age dawned doesn't mean that the latter caused the former. To establish that, you need data. Alas, there isn't much. The ideal experiment is hard to pull off: to study the effect of digital technology on cognitive processing in a rigorous way, you must randomly assign groups of young people to use it a lot, a little or not at all, then follow them for years. As one 19-year-old of our acquaintance said about the chances of getting teens to volunteer for the "not at all" group, "Are you out of your [deleted] mind?"

What we do know about is multitasking: it impairs performance in the moment. If, say, you talk on a cell phone while driving, you have more trouble keeping your car within its lane and reacting to threats, Just reported earlier this year. "Multitasking forces the brain to share processing resources," he says, "so even if the tasks don't use the same regions [talking and driving do not], there is some shared infrastructure that gets overloaded." Chronic multitasking—texting and listening to your iPod and updating your Facebook page while studying for your exam on the Italian Renaissance—might also impair learning, as a 2006 study suggested. Scientists at UCLA led by Russell Poldrack scanned the brains of adults ages 18 to 45 while they learned to interpret symbols on flashcards either in silence or while also counting high-pitched beeps they heard. The volunteers learned to interpret the cards even with the distracting beeps, but when they were asked about the cards afterward, the multitaskers did worse. "Multitasking adversely affects how you learn," Poldrack said at the time. "Even if you learn while multitasking, that learning is less flexible and more specialized, so you cannot retrieve the information as easily." Difficult tasks, such as learning calculus or reading "War and Peace," suffer especially from multitasking, says psychologist David Meyer of the University of Michigan: "When the tasks are at all challenging, there is a big drop in performance with multitasking. What kids are doing is learning to be skillful at a superficial level."

A lab experiment with cards and beeps is not real life, however. Some scientists suspect that the brain can be trained to multitask, just as it can learn to hit a fastball or memorize the "Aeneid." In an unpublished study, Clifford Nass of Stanford and his student Eyal Ophir find that multitaskers do let in a great deal more information, the kind that would ordinarily be distracting and attention-depleting. But avid multitaskers "seem able to hold more information in short-term memory, and keep it neatly separated into what they need and what they don't," says Nass. "The high multitaskers don't ignore [all the incoming signals], but are able to immediately throw out the irrelevant stuff." They seem to have some kind of compensatory mechanism that overrides the distractions and processes the relevant information effectively.

Even videogames might have cognitive benefits, beyond the hand-eye coordination and spatial skills some foster. In his 2005 book "Everything Bad Is Good for You," Steven Johnson argued that fantasy role-playing games such as Dungeons & Dragons are cognitively demanding, requiring players to build "elaborate fantasy narratives—all by rolling twenty-sided dice and consulting bewildering charts that accounted for a staggering number of variables." Players must calculate the effect of various combinations of weapon, opponent and allies "that would leave most kids weeping if you put the same charts on a math quiz," Johnson wrote. They must use deductive reasoning to infer rules as they go: how various implements work, what it takes to level up, which goals are intermediate steps, who's friend and who's foe. The games challenge you to identify cause and effect—Johnson describes how SimCity taught his 7-year-old nephew that high tax rates in a city's industrial zone can deter manufacturers from relocating there—and to figure out nested goals, such as the need to find the tool to get the weapon to beat the enemy to cross the moat to reach the castle to (phew) save the princess. This is nothing if not hypothesis testing and problem solving, and games such as Final Fantasy exercise those skills no less than does figuring out where two cars, starting 450 miles apart and driving toward each other at 50 mph and 60 mph, will meet.
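For readers who want that word problem spelled out, here is a quick worked version, under the usual textbook assumptions that both cars start at the same moment and hold constant speeds: the gap closes at a combined 50 + 60 = 110 mph, so the cars meet after

\[
t = \frac{450\ \text{miles}}{(50 + 60)\ \text{mph}} = \frac{450}{110} \approx 4.1\ \text{hours}.
\]

That puts the meeting point roughly 50 × 4.1 ≈ 205 miles from the slower car's starting point and about 245 miles from the faster car's.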

No one knows what kids will do with the cognitive skills they hone rescuing the princess. If they just save more princesses, Bauerlein will be proved right: Gen Y will turn out to be not just the dumbest but also the most self-absorbed and selfish. (It really aggravates him that many Gen-Yers are unapologetic about their ignorance, dismissing the idea that they should have more facts in their heads as a pre-Google and pre-wiki anachronism.) But maybe they'll deploy their minds to engineer an affordable 100-mpg car, to discover the difference in the genetic fingerprints of cancers that spread and those that do not, to identify the causes and cures of intolerance and hate. Oddly, Bauerlein acknowledges that "kids these days are just as smart and motivated as ever." If they're also "the dumbest" because they have "more diversions" and because "screen activity trumps old-fashioned reading materials"—well, choices can change, with maturity, with different reward structures, with changes in the world their elders make. Writing off any generation before it's 30 is what's dumb.