One arctic February morning in 2015, Christopher Federico and Karen Wolf stood in front of a classroom of teachers at the University of Toronto’s Rotman School of Management. Federico and Wolf are both full-time teachers themselves: he teaches problem-based learning at the gifted high school run by the University of Toronto, and she teaches English at North Toronto, a public high school.
The twenty-odd teachers before them came from a variety of backgrounds, ranging from kindergarten educators to community college professors, and were here for a two-day course the university offered in Integrative Thinking for educators. Integrative thinking is a methodology for complex problem solving used by management consultants, which the Rotman School taught to its MBA students. A few years back, Rotman began offering short courses to educators in integrative thinking, so they could teach these methods to their own students and build problem-solving skills into the curriculum.
Federico drew a line down the center of a whiteboard. “What is the future vision of what school looks like?” he asked the class.
This was not a rhetorical question, but the problem these teachers would tackle today, first by comparing and evaluating two apparently contrasting models of education and later using the data to create a new approach for schools. One model was the brick-and-mortar school, the analog bedrock of teaching that exists the world over, and the place where all these teachers worked. The other model was the online-only, virtual school, a digital alternative that Federico said seemed to be the way of the future.
Wolf then asked the teachers to list only positive attributes of each model, as they would be making something called a pro-pro chart. “Nothing negative,” she said, “only pro here.” Teachers called out ideas for each: Online-only schools could connect students to teachers anywhere and anytime. They could be more cost-effective, and nearly every aspect of the experience was customizable to the individual needs of students. Teachers could even work from home, in their pajamas—a comment that elicited whoops of approval.
In terms of advantages, brick-and-mortar schools were situated in a particular community, and students could form deep social bonds with teachers and peers there, what Federico called the “hidden curriculum” of socialization. Educators in traditional schools got a job, a sense of belonging and purpose, and the reward of seeing students learn in front of their eyes.
Presented here as brightly as possible, these two models for education showed a harmonious, positive future for schooling, whether in person or online. But out in the real world, the future of school and the role of digital technology in it have become some of the most hotly debated issues in the public interest. Education, especially in the United States, is often described as “broken” and “failing.” On global assessments and test scores, American students perform poorly, far worse than their peers in other wealthy nations, and sometimes worse than students in some developing nations. Education reform has become the great cause in America, and various stakeholders champion a host of solutions to save it.
Few industries have embraced the desire for radical, transformative change in education with the zeal, enthusiasm, and commitment of the digital technology industry. This makes sense for two key reasons. First, education is a prized pig, ready to be roasted and devoured by digital disruption. Today, total spending on education technology remains low, around 5 percent of total education budgets in the United States, and less than 2 percent globally, but worldwide spending on K–12 classroom hardware technology alone is expected to reach $19 billion by 2019, and 2014 saw more than a 50 percent increase in venture capital funding for education technology companies. That’s a lucrative market to tap into.
Second, the high-tech world is fueled by education. Its businesses are created by highly educated individuals, often at universities, and many of the products and services it sells appeal to an educated population. Education has become the pet cause of digital’s business leaders. Bill and Melinda Gates, Mark Zuckerberg, and the venture capitalist Jim Breyer are among the top supporters of education philanthropy, funding everything from university scholarships and research grants to experiments in school reform stretching from inner-city Newark to remote African villages.
Underlying this is the belief that digital technology can transform education in the same way it transformed business, media, and communications. What emerges is a vibrant, multibillion-dollar market in education technology (ed tech, as it’s commonly known) that promises nothing less than a radical rethinking of education. Here is where the utopianism and manifest destiny of Silicon Valley meet your child’s elementary school, and where pedagogy and philosophy intersect with politics and business. Attend a presentation of an ed tech company, watch a TED talk about education, or listen to a school superintendent talk breathlessly about the new virtual-reality goggles she just bought for your kid’s school, and the future is bright indeed.
It is a future where every child has the ability to learn at their own pace, in the most stimulating way possible, from wherever and whenever suits them best, at a lower cost but with greater accountability and results. It is a future where school will be dynamic, where teachers will truly be able to unleash their creative potential, where inner-city teens will have the same advantages as those in wealthy suburbs, and the greatest university in the world will not be some ivy-covered campus, but anywhere your device gets a signal. The old, ineffective system of sitting in rows of desks, listening to a teacher regurgitate information from the pages of books, will be turned on its head. We won’t need their education. We won’t need their thought control. The walls will come down, and a bright new future will emerge.
That’s the promise, at least.
The reality of digital education technology, which has been attempting to realize this future for much of the past thirty years, is that of a troubled student who shows tremendous promise but consistently gets D’s on his report card. It is a cautionary tale of what happens when schools, communities, and educators place blind faith in digital innovation while ignoring the proven evidence and research around the benefits of analog education.
This story is not unique to the digital era. The inventors, manufacturers, and evangelists for radios, mail-order correspondence courses, television, VCRs, and even the printing press all made grandiose predictions that their technology would either transform traditional schools or eliminate them entirely. Thomas Edison himself proclaimed that books and teachers would soon disappear from classrooms, because students would learn through the motion pictures he helped invent. The birth of the digital computer just added more claims to this long history. The latest educational software or device is always unveiled with the same breathless belief in technology’s potential to disrupt school.
“This pattern goes back well over a century,” said Larry Cuban, a professor of education at Stanford University. Cuban, who lives and works in the heart of Silicon Valley, began as a hopeful evangelist for education technology, but slowly turned into one of ed tech’s most prominent skeptics after witnessing, time and again, the failure of ed tech to deliver on its promises. He calls it the hype cycle. “There is this pattern of extreme claims for transformation, and then a kind of bumpy landing and disappointment.”
Why does this happen, over and over again, without the technology industry, the educational institutions, and other stakeholders learning from their mistakes? It is not as though the evidence is lacking, or that these stakeholders are incapable of learning from it. Rather, Cuban attributes the persistence of ed tech’s hype cycle to deeply held values around technology and innovation. “In this culture, like other developed cultures, technology is seen as an unadulterated good,” he said. “The presumption is that the technology will improve one’s life, in whatever it is.” Education’s stakeholders are often blinded by this presumption of technological progress as the ultimate good, and cannot look critically at its actual performance.
“The skepticism that one would ordinarily raise about inflated claims comes pretty late in the process when it comes to anything technologically innovative,” Cuban said. “Decisions to buy and distribute electronic devices [for schools] tend to be rushed into very quickly. And a lot of money gets spent. Why? It doesn’t matter what the research studies say, or what any doubters in this field say. Anyone who doubts is called a Luddite. When it comes to technology, it’s very important for school boards and trustees to say ‘We’re at the cutting edge. We bought these iPads for kindergartners!’ Teachers are rarely involved in those decisions, and these devices show up at the classroom door.”
Cuban cites three reasons that policymakers typically use to justify the purchase of new technology for schools. First, the technology will improve student achievement and marks. Second, the technology will change traditional teaching to nontraditional teaching. Third, the technology will better prepare students for the modern workplace. At best, Cuban says, there is contradictory evidence for the third reason, little for the second, and none for the first.
To understand why education technology fails so frequently, it’s important to start at the beginning of our learning life, a period known in the field as early childhood education (ECE), which covers daycare, preschool, and kindergarten. While much of this time may look like aimless playing, naps, colds, and diaper changes, it is actually the most crucial educational experience of our lives, because it provides the foundation for all the learning that follows. Young children learn about the world through their physical senses: grabbing and touching, smelling and hearing, seeing, licking. The widely held recommendation by pediatricians the world over to avoid exposing children under age two to screens is made not out of concern that the content on those screens will damage their brains, but for fear that screens will displace more valuable sensory activities, such as putting their hands through a box of sand or eating a tub of Play-Doh.
“The big organizing ideas around our formations of relationships are that physical experience,” said Diane Levin, professor of early childhood education at Wheelock College in Massachusetts. “ECE theorists say that’s the foundation for both learning [and] social, emotional, and cognitive development.” Levin used my own daughter’s experience at daycare that day as an example. At the time, she was one and a half, and was finger painting in her class. That activity not only involved her ability to create an image on paper, Levin said, but the sensory feeling of the wet paint running down her arm, the visual learning of the colors mixing as she moved her fingers around the paint, the spatial learning when she moved her arm off the paper and the paint dripped onto the floor, and the social learning when she flung paint at another kid and they cried, and the teacher told her why that wasn’t cool and why she had to apologize.
Finger painting was a full-body, full-mind experience. Compare that to numerous finger painting apps available for a tablet, and the sensory learning experience is reduced down to the tips of her fingers dragging across a small glass surface, without texture, smell, taste, or other physical and social consequences. “When you’re pushing buttons, it’s an abbreviation of all of that,” Levin said. “You’re just not getting it.”
Even the best educational computer programs and games, devised with the help of the best educators, produce a tiny fraction of the outcomes of a single child equipped with a crayon and paper. On a computer, a child’s limitless imagination can do only what the software allows, and no more. The best toys, by contrast, are really 10 percent toy and 90 percent child: paint, cardboard, sand. The kid’s brain does the heavy lifting, and in the process it learns.
All of this is necessary, even as children inevitably grow up to use computers in their later schooling or work. Education is a lifelong building process that starts with a foundation of very basic skills and increases, year after year, in its complexity and abstractness. When I am typing these words on my laptop, I am using spatial and social reasoning skills that I learned as a three-year-old playing with LEGO bricks. “With parents, there’s the belief that we live in a digital age, and it’s a good idea to give them the technology early,” said Jeff Johnson, an early-childhood-learning author and partner in the business Ooey Gooey, which makes play sand and other learning toys for preschoolers. “But just because they’ll use a piece of technology when they grow up, doesn’t mean we have to give it to them now.”
A quick caveat: I am not condemning the use of digital technology in education wholesale. Digital technology can make education more effective when used appropriately. Schools run more efficiently thanks to the use of computer systems, which manage everything from report cards to budgets. Teachers and students can use computers to research, write, create, evaluate, correct, and manage their own educational environment. Academics from around the world can coauthor studies, evaluating far more data, far more quickly, while kids with special needs (autism, ADHD, dyslexia) have been shown to respond effectively to digital learning tools and environments in many cases. The criticism of educational technology also does not apply to the teaching of computer technology itself. Computer programming, coding camps, maker clubs, and robotics competitions are all valuable and necessary for teaching the knowledge and skills of those who wish to learn about digital technology. These are growing and increasingly important fields.
But including a mandatory course in computer programming for students is a very different thing from what the majority of ed tech evangelists hope to achieve, which is the integration of digital technology across all schools and subjects. It is rooted in the idea Cuban spoke to—that technology equals progress—and the more it can be woven throughout the school experience, the better off students will be.
At its most optimistic and dangerous, education technology arrives as the transformative panacea that will fix education and leaves a trail of disappointment and failure in its wake. The evidence for this just keeps on piling up. Study after study seems to confirm how the implementation of educational technology produces little net benefit to student performance, and in many cases, actually makes things worse. The examples cited here, which represent just a fraction of the existing and ongoing research into this, show the various ways educational technology falls short.
One of the big beliefs in the ed tech movement is the need to bridge the so-called digital divide between those who have access to computers and those who don’t. Increase access to computers and the Internet, in schools and at home, the thinking goes, and watch inequality fall. This is a project politicians, parents, school administrators, philanthropists, and the media have taken to with great gusto.
A 2010 study by Duke University tested this theory out by looking at North Carolina public school students who were given free laptops, and what it found was the diametric opposite. “The introduction of home computer technology is associated with modest but statistically significant and persistent negative impacts on student math and reading test scores,” the study’s authors wrote. “For school administrators interested in maximizing achievement test scores, or reducing racial and socioeconomic disparities in test scores, all evidence suggests that a program of broadening home computer access would be counterproductive.”
The same logic of bridging the digital divide was behind the wildly ambitious One Laptop per Child (OLPC) nonprofit, spearheaded by MIT Media Lab founder Nicholas Negroponte and set up in 2005 with the backing of a vast coalition of philanthropists and technology companies. OLPC’s goal was to produce and distribute rugged, inexpensive, Internet-enabled laptops to the world’s poor with innovative features such as solar panels and hand cranks. OLPC successfully created several devices that met this goal, but in every other respect, OLPC was a colossal failure that typifies the hubris of tech-centric educational utopianism.
From the outset, education ministers and development professionals pointed out that what children in rural Pakistan or Rwanda needed most were safe schools, clean drinking water, and trained teachers—not computers. OLPC nevertheless pressed ahead, and sold nearly 3 million of its custom laptops to schools around the world. Negroponte loved telling the story about OLPC distributing tablet computers to remote villages in Ethiopia with no schools so children could teach themselves.
Then the evidence emerged. Across continents and countries, from Peru and Uruguay to Nepal, well-funded academic studies demonstrated no gain in academic achievement for OLPC students when compared with those who didn’t participate in the program. The evidence mirrored other laptop and computer handout programs in such countries as Israel and Romania, where the introduction of computers also did nothing to improve learning. Last year, a report from the Organisation for Economic Co-operation and Development concluded that “students who use computers very frequently at school do a lot worse in most learning outcomes,” and that technology did nothing to improve scores across subjects, and little to bridge gaps between rich and poor students. In 2014 One Laptop per Child closed its Boston headquarters and drastically cut back on staff and new programs.
OLPC’s great mistake was presuming the universal importance of a shiny imported technology in spite of the recommendations of people closer to the problem at hand. This problem is not confined to international development. In 2001, the Los Angeles Unified School District spent $50 million on a computer system called the Waterford Early Reading Program, created by the education publisher Pearson to improve language instruction in kindergarten and first grade. Shortly after, research by the district discovered zero or negative reading improvement for students who used the program. When Waterford was abandoned in 2005, the school board’s president at the time, Jose Huizar, told the Los Angeles Times, “How could anyone continue to argue that it’s working when it’s not? It’s underutilized and ineffective.”
Nine years later, the very same LA school board announced a plan to put an iPad into the hands of all its 650,000 students. The iPads were loaded with Pearson educational software and coupled with a big push to improve Internet access at LA schools, all at a total cost of $1.3 billion, one of the largest single ed tech investments worldwide. Shortly after the first batch of iPads was distributed, the rushed, ill-conceived folly of the entire enterprise became apparent.
The iPads had no keyboards, which made them useless for students to do homework on, and the software that was supposed to prevent students from using the iPads for games and social media was easily hacked. The iPads frequently malfunctioned, were lost or stolen, and the software was inadequate for learning and assessment. To top it all off, the FBI launched an investigation into whether Apple and Pearson had received preferential status as vendors over other potentially less expensive competitors. Barely one year after launch, LA’s iPad program was canceled and the city’s school superintendent resigned in disgrace.
From failed laptop implementations in Hoboken, New Jersey, to the tales of cracked screens, melted chargers, and tremendous financial losses for News Corp’s Amplify tablet program, time and again the mass “airdrop” of new ed tech devices into schools has fallen flat. But the attraction of politicians and policymakers to ed tech’s charms remains irresistible for a number of reasons.
One is political. Announcing you are going to give out iPads to every child in a community appears as a bold, clear signal that you are investing in the future and aligning your schools with the biggest, most innovative company in the world. It steers clear of any sticky issues with powerful teachers’ unions, and provides for great photo opportunities and news stories.
Another big reason behind the eager embrace of technology, especially among public school boards, is the promise of savings. With the help of digital technology such as computer-assessed standardized materials and tests, a school board can theoretically achieve economies of scale. And the hope is that once learning can be delivered effectively through devices, a school board will need fewer highly paid teachers and professors, who can be replaced with facilitators and teaching assistants hired to aid in the digital learning and exams, while the computer does the heavy lifting.
The temptation of eventual savings is powerful, but it obscures the fact that the implementation of technology in schools carries its own financial burden: not just the initial capital cost of acquiring the technology, but the continual expense of maintaining, repairing, replacing, and updating it. A school gymnasium can last decades, a good textbook sometimes fifteen years or more. Some of the desks at my university were damn near a hundred years old. But any digital technology, no matter how well designed, becomes obsolete in just a few years, and inevitably stops working. My only memory of school computers is of dusty relics in the corner that didn’t even turn on.
Dollars spent on digital education technology are dollars that cannot be spent on teachers, building maintenance, or textbooks. It is money that has been pulled from programs in art, sports, music, and drama. Even though the research shows one of the greatest factors in reading improvements in students is the presence of school libraries, the number of libraries across school boards in the United States has declined dramatically. The logic behind this is often that libraries are pointless in the age of Google and eBooks, and that money would be better spent buying tablets or drones.
In his riveting book The Flickering Mind, Todd Oppenheimer chronicled the failure of various education technology initiatives in America, and the real cost they imposed on the schools that adopted them: “In debates about the importance of classroom basics, the technologists often argue that they aren’t trying to displace solid fundamentals. Tech isn’t meant to be a replacement, they say, it’s a supplement,” Oppenheimer wrote. “The line is hollow . . . an ‘e-lusion.’ Trying to fully support technology initiatives is extremely costly. Beyond the financial expense, there are the demands that computers make on a school’s time and energy . . . these are not flexible resources; every community can only offer a fixed amount of each one, and any amount devoted to technology leaves less available for other practices. So when technologists argue that tech is only meant as a supplement, they’re either fools or liars.”
Excerpted from The Revenge of Analog: Real Things and Why They Matter by David Sax. Copyright © 2016. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.