Learning the Hard Way

Technology has a place in classrooms, but it shouldn't be a crutch used by lazy professors

Photograph by Jeremy Wilburn / CC BY-NC-ND 2.0

My article “Pass, Fail” seems to have triggered a massive response from readers, most of it approbative but some of it highly critical, in several instances verging on the ad hominem. Happily, this is not the case with Darryl Whetter’s “The Kids Are Alright.” He does not agree with my argument, but neither does he condemn it. Rather, he attempts to think about it critically. I offer the following brief remarks in response to his concerns.

Darryl Whetter’s criticism of my article consists of two substantial assertions and two corollaries. First the assertions: (1) Technology has not had the deleterious effect on education I claim it has; (2) neither student ability nor university education per se has declined over the past several decades in the way I suggest. As to the corollaries: (1) Professors who endorse either or both of these assertions may be nostalgic, narcissistic, and perhaps even ill-inclined toward contemporary students; (2) my article indicates that I am such a professor.

I’ll address Whetter’s substantial assertions directly; I’ll leave the matter of the corollaries to the readers’ judgement.

First, nowhere in the article do I offer a wholesale critique of technology or of its role in university education, as Whetter suggests. Nor am I against technology in the manner the comparison of me with Bernard of Clairvaux implies. (The prostitutes aren’t “student e-distraction.” The students are the mill workers. The prostitutes are those who profit from their misfortune—the lower ranks of the administrative caste, the student services cabal, and the e-cheerleaders.) Insofar as I have a critique of technology, I’d be inclined to agree with Jaron Lanier—technology is merely a tool, and should be thought of and used as such. To think of it as being more than this—as salvation, as the most important thing, as the future—is to misunderstand it and ourselves. A cellphone allows me to call someone in Chile as I walk to my car in Charlottetown. That may be a good, though perhaps not an unqualified one. But to use that same device to text or shop or polish one’s image on Facebook while attending a class on quantum mechanics is not good, if we’re to measure goodness by the extent to which the goal of our activity is achieved.

When a tool ceases to aid in the accomplishment of the task for which it was created, it is no longer a tool but an impediment. It is highly unlikely that cellphones and iPads were created primarily to educate people. Were that true, Steve Jobs would have given his children free access to them instead of restricting their usage to one or two hours a week. (Don’t get high on your own supply.) And Apple’s $18-billion profit last quarter—the largest quarterly profit ever recorded by a public company—may suggest motives other than philanthropy. But even were it true—even were the tech industry’s primary motive to educate the masses—the evidence points in the opposite direction: people who text and shop and otherwise distract themselves with electronic devices during class learn and understand far less than those who do not, as Nicholas Carr has demonstrated compellingly in his book The Shallows.

Distraction is a complex problem, within and without the university. As Mark Edmundson argued recently during a panel discussion at the New York Institute for the Humanities at NYU, perhaps we are distracted because there simply isn’t much of worth happening out there to engage our attention. Rather than pursuing real greatness and ideals like courage, truth, and compassion, Edmundson argues, we busy ourselves with our security and our comfort and our self-interest—our selves, in short. But because people cannot in fact live by bread alone, as it were, we’ve had to create simulacra of these ideals that we can enjoy on the cheap, without actually having to risk our necks to achieve them. Thus does each day and each class become just one more scripted televisual event—titillating but ultimately empty. If you think I exaggerate, just try to find a public space in which there is no screen. Dentists’ offices and, stranger still, emergency rooms capitulated long ago. Movie theatres have held out so far, but there are signs they are weakening. I haven’t been to church lately. But perhaps the divinity still warrants unmediated attention.

You could argue, as Whetter does, that the real problem in education is the insistence that students must learn in a particular way—what he assumes to be my way, for instance—“verbal lectures with heavy reading.” He’s wrong about this, at least regarding lectures, but he’s also wrong about the question of method as it concerns technology, I think.

Reading is, and always has been, essential to education, though what we read is an open and ongoing question, as Whetter’s example of the introduction of English literature into the academy indicates. Yet to use this change to criticize my claim that students are not reading seems a little odd and disingenuous. First we had Sophocles and Thucydides, then Cicero and Horace, then Dickens and Eliot. Now we have what? Nothing? YouTube?

As for lectures, I find them increasingly problematic—problematic to listen to, and problematic to give, though there are exceptions. My own preference (We all have them, eh? Let’s be honest that far.) would be for something more akin to discovery-based learning and the free conversation described by Socrates in the Republic as an antidote to Glaucon and Adeimantus’ brain-training.

Unfortunately, my students are even less capable of those activities than they are of listening to an hour-long lecture, their ability to think and talk in public having been eroded so precipitously, perhaps by the image-conscious passivity of their electronic media. (And just to be clear about the allocation of blame, this is not the kids’ fault. We did this to them.) But what’s to be done about it? Contemporary students are “wired, wired, wired,” Whetter tells us. So perhaps the best way to educate them is to trick them. Since they’re wired, we’ll be wired too, but at the far end of our digital universe there will be not Etsy or Reddit or Snapchat, but some other, more worthy object of interest.

I think there are at least two problems with this argument. Objects of genuine academic worth require real effort and concentration, same as it ever was. Whether the object is special relativity, ancient Greek, Shakespeare, contemporary cinema, the Islamic State, or American politics, mastery does not come cheap. Enlisting devices that have hitherto distracted students into the service of educating them will work only if they cease to distract. Which means that sooner or later someone will have to turn off the movie clip, put the cellphone away, cut the music soundtrack and get down to the business of understanding what these things mean.

This effort to understand meaning is not a technical one. It is a matter of insight and judgement. Attaining it does, as Whetter is right to say, require that we distract ourselves occasionally from the task. But my guess is that Feynman didn’t pull out the bongos and start pounding in seminars with Einstein, Pauli, and von Neumann at Princeton. And even if he did, I imagine it would have been something they could have worked with. Why? Because it would have been a real response, however unusual, to what was being discussed, unlike the blank, absent stare of the average cellphone addict.

Rather than the university curriculum elevating technology, technology tends to degrade the university curriculum until it fits technology’s customary (and intended?) usage patterns—short and fast. Thus do university classroom “activities” come to resemble Twitter and Reddit far more than Reddit and Twitter come to resemble Marx and Hegel. With every diminution of someone’s attention span, the world gets smaller too. This brings me to Whetter’s second assertion.

It is true that nostalgia can be used as a weapon by the old against the young who wish, as they must, simply to do something different. However, it’s also true that calm assurances that things are continuing much as they always have can be naive and even irresponsible in the event that fundamental changes are underway. If this were 1938, Whetter would think I was Churchill, and I’d think he was Chamberlain.

Serious declines in student abilities are no longer a matter of argument but of fact. So too is the metastatic burgeoning of the university administrative caste and the “fall of the faculty,” as Benjamin Ginsberg describes it. The real question is why these things are happening. I offer two answers.

The first is that they are, in part, just the normal corruption that accompanies the decline of an institution or regime. As the decline sets in, people cease to take themselves and their mandate seriously, wish to enjoy their wealth and reputation more than to earn them, and tend to serve their own interests first and foremost rather than those of the people in their charge. However, because such practices are too ugly for most people to admit to publicly, ways must be found to simulate real achievement in order to hide the truth of their decline from themselves and others. In the case of universities, the quality of that simulation will depend on the particular market.

The second explanation is that the dwindling of genuine content from the university curriculum is not a by-product of institutional decline but an ambition. Technological civilizations (not civilizations with technology, which is to say, all of them) understand technique and efficiency to be ends in themselves, not means. A good act is one that is done efficiently and produces more. In the liberal arts and sciences, technique and efficiency are merely means to substantial ends (understanding, happiness, meaning) to which they are subordinate and which cannot be quantified. The liberal arts and sciences (they are both “humanities” in the fullest sense of the word) aim to educate students who are human and who wish to live in a world guided by substantial human ends that both surpass and take precedence over those of technological civilization. Technological civilization requires technological citizens who favour technique over substantial ends. For such a civilization to secure its ascendancy, liberal arts and sciences must be either refashioned to support efficiency training while maintaining the appearance of substantial education, or eliminated.

I argue that both efforts are currently underway in modern universities, and that the stakes are so high that it is imperative we understand them fully, assess them carefully, and then decide, in all seriousness, what we will do about them.

Whetter’s assurances are not up to the task; nor perhaps is my critique. Fortunately they’re not all we have to go on. George Grant, Stefan Collini, Marilynne Robinson, Martha Nussbaum, Richard Arum and Josipa Roksa, Benjamin Ginsberg, and James Côté have already done much of the analysis. It’s about time all of us invested in the future of Canadian universities—parents, students, faculty, administration, and government—started taking them seriously and at their word in our own efforts to understand our predicament.

Ron Srigley
Ron Srigley teaches philosophy and religious studies at Laurentian University.