
The “Digital Native,” a Profitable Myth

[Image: baby typing]

Technology buzzwords, although annoying, often seem innocuous enough. They’re just catchy and trite enough to bleed into common usage, but just misleading and obfuscatory enough to discourage deep analysis. Two of the most widespread buzzwords to escape the tech world and infiltrate our general lexicon are the couplet “digital native” and “digital immigrant.”

Unlike many buzzwords that buttress the latest, supposedly world-saving innovations, their origins—and the definitions that have stuck with them—can be clearly traced to one point: a 2001 article written by the education consultant Marc Prensky. The article exists only to coin these dichotomous labels. And based on its astronomically high number of citations—8,748 and growing, as indexed by Google upon my latest check—it’s safe to say Prensky succeeded.

According to Prensky, the arrival of digital technologies marked a “singularity” that changes everything so “fundamentally” that there’s “absolutely” no turning back. This singularity has caused a schism in the population, especially in Prensky’s area of concern, the realm of education reform.

“Students today,” Prensky wrote in 2001, “are all ‘native speakers’ of the digital language of computers, video games and the Internet.” By contrast, “Those of us who were not born into the digital world but have, at some later point in our lives, become fascinated by and adopted many or most aspects of the new technology are, and always will be compared to them, Digital Immigrants.” (The birth year typically cited as the cutoff for “digital native” status is 1980.) Of course, Prensky admitted, some immigrants will adapt to their digital environment better than others; but compared to the natives they will always, to some degree, retain an “accent.” Prensky wrote as if “digital” were the name of a country you’re either born into with citizenship or try desperately to enter on a virtual visa.

For any buzzword, we should ask what assumptions and generalizations using it obscure and who benefits from its propagation. These two particular labels are prime subjects for inquiry. In brief, they overlook socio-economic differences, which exist within the younger generations, and do so in a way that creates lucrative business opportunities for education gurus.

In his article, and in subsequent writing and consulting, Prensky treats the sweeping differences between “digital natives” and “immigrants” as an obvious piece of wisdom. And the commentators and consultants who use the labels today often maintain the same air of unassailable truth. Proponents like Prensky largely support their claims with little more than rhetorical flourishes, anecdotes, and appeals to “common sense.” The labels elide the question, “How digitally adept is this generation, actually?” Sweeping statements about technological aptitude overshadow the actual differences in how—and how well—people use digital technologies. They assume a natural, generational baseline where one doesn’t necessarily exist.

Such assumptions, which creep into how we think about the world via the language we use, help form a mental image of what it means to be a “digital native.” Consider a 2013 article from the New York Times entitled “Technology and the College Generation,” brimming with stories about how surprised professors are at things like their students’ disdain for email. “Some of [the junior-level college students] didn’t even seem to know they had a college e-mail account,” one professor said. Odd behavior for a group of “digital natives” who supposedly know the Net like a spider knows its own web.

When we take a look at the data and research, however, it becomes clear that the great divide between “digital natives” and “digital immigrants” is a puff of smoke—one that obscures the actual differences that other factors (like socio-economic status, gender, education, and technological immersion) play in digital proficiency.

An academic article from The British Journal of Educational Technology last year, which critically examined the discourse around “digital natives,” found that “rather than being empirically and theoretically informed, the debate can be likened to an academic form of a ‘moral panic.’” The authors found that the commonly made claims are largely under-researched or just plain wrong when compared to the data. For instance, computer and web access, as well as activities like creating content on the Internet (keeping a blog or making videos), are in no way universal, and in some cases apply only to a usually affluent minority of the so-called “digital generation.”

The problem here is not just that we turn people into caricatures when we paint them as a monolithic group of “digital natives” who are more comfortable floating in cyberspace than anywhere else. The larger issue is that, when we insist on generalizing people into a wide category based on birth year alone, we effectively erase stark discrepancies in access and privilege, and in experience and preference. By glossing over these social differences, and simply boosting new technologies instead, it becomes easy to prioritize gadgets over what actually benefits a diverse contingent of people. And those skewed priorities will be to the detriment of, say, less well-off groups who still lack the educational resources necessary to learn basic literacy skills.

Even though claims about “digital natives” and “immigrants” don’t rely on much, if any, empirical data or robust theory—and only gain legitimacy by stoking a sense of “moral panic”—there’s plenty at stake for the cottage industry of education technology (sometimes called ed-tech) consultants. After all, if those dumb “digital immigrant” teachers don’t do something now, then how can they possibly educate the savvy “digital natives”?

According to Prensky and his ilk, teaching “digital natives” requires totally new styles, techniques, and devices—because they think and process information “fundamentally differently from their predecessors.” But when held up to the light, these arguments appear thin. For instance, a commonly cited characteristic is that, unlike others, “digital natives” are natural multi-taskers, which means the single-task focus of traditional classrooms is detrimental to their advanced brains. Aside from select anecdotal observations, as the academic article above reports, there is simply no evidence that multi-tasking is a new phenomenon exclusive to the younger generations. One leading cognitive psychologist and researcher on the science of multi-tasking told Boing Boing, “Humans don’t really multitask—we task switch.”

Similarly, ed-tech proprietors are ready to jump at the chance to consult and sell schools new learning techniques and technologies—such as video games, supposedly the medium most familiar and engaging to “digital natives.” That’s all fine, except that video games are not so universally played in the first place—there’s a large gender gap, for instance—and there’s inadequate understanding among educators about how to actually use video games to foster learning rather than mere entertainment.

But of course the originators and biggest perpetuators of the “digital native” discourse—the ed-tech gurus and hucksters—aren’t trying to make well-researched, academic claims. What matters most is that educators, school administrators, and parents believe there’s a drastic divide in need of a bridge. And that bridge is usually built with expensive seminars, consulting fees, and technologies.

Audrey Watters, an education writer who is often critical of the ed-tech gospel, told me that it has been fascinating “to watch the fallout from the Snowden revelations and see that no one who’s an ed-tech guru (Prensky et al.) has raised some of the hard questions about what NSA surveillance of technology might mean for those growing up using these services.” Assuming that students have a natural digital fluency leads schools to ignore issues like data security, an area where students and educators alike are in dire need of learning important privacy skills. “No one is talking about things like ‘who owns your data?’” added Watters. “No one is helping students (pick the age: middle school, high school, college) recognize that they are producing massive amounts of data through their digital productivity.”

One reason that so many people are so ready to fall for the rhetorical devices that give “digital native” and “digital immigrant” sticking power is that we’re all already primed to grant sanctified status to “digital”—to separate this current phase of development out from the broad history of technology. I wonder, did people talk about “industrial natives” and “industrial immigrants”? Were kids who grew up working in factories just “industrial natives” who were at home amongst the machines? (I actually posed these questions to a few technology historians, to no avail.)

Watters is right when she says that these terms “do little to help us understand how power and agency work in computing culture.” Instead, they obscure parts of society by focusing on a proportionally small group of technically adept young people, help line the pockets of ed-tech consultants, and, in the process, make education about data privacy and security seem unnecessary.

The “digital native” and “digital immigrant” buzzwords can describe the world only insofar as they describe a world that would benefit the ed-tech profiteers. As with any other fetishized innovation, it’s worth keeping in mind that our initial introduction to (and understanding of) new technologies tends to come directly from the very people who stand to reap the profits from them. That alone is reason enough to be skeptical.