Why Technology Hasn't Boosted Learning, and How It Could
Technology has been hailed as the key to improving education. But that won't happen as long as our use of technology is based on mistaken beliefs about how people learn.
Especially over the last ten months of remote schooling, technology has become deeply intertwined with education. Even in the before times, classrooms often featured not just a bank of computers but a digital device for every student and an interactive whiteboard that left the old overhead projectors in the dust. Education philanthropists, especially big tech moguls like Bill Gates and Mark Zuckerberg, have invested heavily in "ed tech." There's a major annual conference devoted largely to the field. Estimates of spending on K-12 technology in the U.S. range from $13 billion to $35.8 billion a year.
But there's been little to show for all this enthusiasm. Test scores have continued to stagnate or decline despite predictions that technology would revolutionize education. And most would agree that the heavy dose of technology many students have gotten over the last ten months hasn't boosted learning.
Some say the problem with ed tech is that while schools may have bought it, teachers aren't using it (although lately many have had no choice). But even if they do use it, it's unlikely to help most students. Technology only works when you know how to use it. And as a recent book by British education writer Daisy Christodoulou makes clear, the education establishment (in the U.S., the U.K., and no doubt other places) doesn't have a clue. Ed tech is mostly just replicating existing ineffectual approaches to teaching, and sometimes making them worse.
The book's title, Teachers vs. Tech?: The Case for an Ed Tech Revolution, might lead you to believe it's a call for more technology in the classroom. It's not. At the same time, Christodoulou doesn't dismiss technology out of hand. If you understand the science of learning, she explains, you can use technology to do things like help students move information into long-term memory; in other words, to memorize it. That's important, because the evidence shows that the more information you have in long-term memory, the more cognitive capacity you have for comprehension, analysis, and additional learning.
First, complex content or skills need to be broken down into manageable chunks. Then, using algorithms, technology can provide students with practice in recalling important information. Ideally, computer programs or apps will adapt this practice to students' individual needs in ways that maximize the chances they'll retain the content. Technology can also make what might otherwise be routine drilling into an engaging game (although it's possible to do that without technology).
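To make that concrete, here is a minimal sketch of the kind of scheduling logic such a program might use. It is not an algorithm Christodoulou presents, and all the names in it are illustrative; it's a bare-bones Leitner-style spaced-repetition system written in Python. Items a student answers correctly wait longer before their next review; items a student misses come back the next day.

    from dataclasses import dataclass, field
    from datetime import date, timedelta

    # Review intervals (in days) for each Leitner "box": missed items
    # return quickly, while well-known items wait longer between reviews.
    INTERVALS = [1, 2, 4, 8, 16]

    @dataclass
    class Card:
        prompt: str
        answer: str
        box: int = 0                            # 0 = newest / least known
        due: date = field(default_factory=date.today)

    def grade(card: Card, correct: bool, today: date) -> None:
        """Move the card up a box on success, back to box 0 on failure,
        then schedule its next review at that box's interval."""
        card.box = min(card.box + 1, len(INTERVALS) - 1) if correct else 0
        card.due = today + timedelta(days=INTERVALS[card.box])

    def due_cards(deck: list[Card], today: date) -> list[Card]:
        """Today's practice session: only the cards whose review is due."""
        return [c for c in deck if c.due <= today]

    # Hypothetical usage: two facts a student is memorizing.
    deck = [Card("7 x 8", "56"), Card("capital of France", "Paris")]
    for card in due_cards(deck, date.today()):
        grade(card, correct=True, today=date.today())

The adaptivity here is crude, but the principle is the one Christodoulou describes: practice gets concentrated where a student's recall is weakest, which is what moves information into long-term memory.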
These approaches have worked well with subjects like math and language-learning (think Duolingo). But a basic problem is that there's a longstanding belief among most educators, in both the U.K. and the U.S., that getting factual information into students' long-term memory just isn't important. In recent decades, the often-heard justification has been technology itself: why teach facts if you can just Google them?
It's far better, most educators believe, to teach students skills like critical thinking. That sounds good, but as research has found, and as Christodoulou lays out, it doesn't work. To think critically about a topic, you need to have information about it; the more information you have, the better you can analyze it. As for "just Googling it," there are multiple problems, not the least of which is that every time you stop reading to look something up, you risk losing the thread of what you're reading.
Here are some of the reasons ed tech isn't working, as explained by Christodoulou:
Personalization: This is a big buzzword in ed tech. If it's used to mean adapting content and practice to an individual student's needs so that information can be transferred to long-term memory, it can work. But if it means, as it often does, tailoring instruction to students' individual "learning styles" or allowing students to choose what content to study, forget it. People may have different learning preferences, but there's no evidence for "learning styles." And students who aren't already competent in a field are ill-equipped to direct their own learning.
Flipped classrooms: This is the idea that students will acquire knowledge before class, often through a recorded lecture, and then (as described by Christodoulou) practice skills during class through projects. But studies show many students aren't engaged by or don't understand the pre-class video. Christodoulou explains the problem as a false dichotomy between knowledge and skills: first students are to acquire the knowledge, then (in class) the skills. In fact, she argues, knowledge should be seen as the pathway to skills, and instruction needs to integrate the two rather than "balance" them.
Distraction: When students have access to devices in class, they use them for their own purposes. I've seen first-graders use iPads to play games instead of doing math problems. Christodoulou cites a study showing that during a college lecture, 94% of students used email and 61% used instant messaging. Some argue that "digital natives," those born after around 1980, are better at this kind of multitasking, but Christodoulou cites evidence showing that no one is good at multitasking.
Search skills as a substitute for knowledge: Part of the "just Google it" argument is that students don't need to know facts; they just need to know how to evaluate the reliability of a website. But Christodoulou describes an experiment in which 7th-graders, the school's "most proficient online readers," were shown a hoax website describing the "Pacific Northwest Tree Octopus." The website included fake markers of reliability: an author's name and an affiliated organization, in this case the "Kelvinic University branch of the Wild Haggis Conservation Society." Of the 25 students, 24 found the website "very credible." They would have been in a far better position if they'd had some basic information about octopuses.
One common misuse of technology that Christodoulou doesn't mention occurs in connection with reading instruction, at least in this country. Technology might be an effective tool for one part of reading: teaching kids to connect sounds to the letters that represent them. But it's often used for practice in reading comprehension "skills" like "finding the main idea." That kind of instruction doesn't work, whether it's delivered by a computer or a human being. As with critical thinking, it's knowledge that's crucial to comprehension, not "skill."
Teachers vs. Tech? is written in a clear, accessible style, and it's packed with useful information for anyone interested in learning, especially teachers and prospective teachers. The unfortunate truth is that educators will gain far more useful information from the book than they're likely to acquire through their professional training.
I recently sat in on an education school class on using technology to teach science. Mostly what I heard were platitudes and misinformation. Students gave presentations in which they asserted that technology would make it easier to tailor instruction to "learning styles." The well-meaning professor cautioned them against quizzing students on things like "dates and important people," because "those are things that kids can now access at the touch of a phone." And the professor encouraged them to sign up for a Google certification course that, as I discovered from reading Christodoulou's book, tells teachers that "it is no longer relevant to focus on teaching facts."
Christodoulou ends her book on an optimistic note, urging that "if we can disrupt the influence of bad ideas, there are enormous opportunities for technology." But that's a big "if." Before we can harness the power of technology to improve learning, we have to first understand how learning works, and that means reversing mistaken assumptions that have been entrenched for decades.