How to teach kids who flip between book and screen

Linus Merryman spends about an hour a day on his laptop at his elementary school in Nashville, Tennessee, mostly working on foundational reading skills like phonics and spelling. He opens the reading app Lexia with ease, clicking straight through to lessons chosen specifically to address his reading needs. This week Linus, who’s in second grade, is working on “chunking,” finding the places where words are broken into syllables. The word chimpanzee appears on the screen in large letters, and Linus uses his mouse pad to grab cartoon Roman columns and slip them into the spaces between letters, like little dividers, where he thinks the syllable breaks should be. The app reads his guesses back to him—“chim-pan-zee.” He gets it right. 

After practicing these foundational skills on the computer, he and his classmates close their laptops and head to the rug, each with a print copy of their class reader, I Have a Dream, a picture book featuring the text of Martin Luther King Jr.’s speech. Students follow along in their books as the teacher reads aloud, occasionally stopping so they can ask questions and point out things they notice, like how the speech is written in the first person. 

Linus’s mom, Erin Merryman, an early reading interventionist at another Nashville school, initially worried about how well her son would learn to read in a classroom that made so much use of computers. He has been diagnosed with the learning disability dyslexia, and Merryman knows from her training that dyslexic students often need sensory input to learn how sounds are connected to letters. Close oversight from a teacher helps them as well. But since his reading has vastly improved this year, she’s adjusted her view. 

“I think a lot of what the app is doing is very good, very thorough,” Merryman says. “I’m surprised by how effective it is.”

Like Merryman, a growing group of experts and educators is trying to figure out what the relationship should be between digital technology and reading instruction. Both reading and digital tech are world-expanding human inventions, and laptops and smartphones have arguably given humans unending opportunities to read more; you can access pretty much anything in print within a few seconds. In terms of “raw words,” the cognitive scientist Daniel T. Willingham has said, kids read more now than they did a decade ago. But many reading experts suspect that the technology may also be changing how kids read—that reading on a screen is fundamentally different from reading on the page.

Researchers who study young readers’ brains and behaviors are eager to understand exactly where tech serves kids’ progress in reading and where it may stand in the way. The questions are still so new that the answers are often unclear. Since the covid-19 pandemic closed schools in 2020, nearly all students have been organizing their learning around a school-issued laptop or tablet. But educators who are more dependent than ever on digital tech to aid learning in general often have little or no guidance on how to balance screens and paper books for beginning readers accustomed to toggling between the two. In a lot of ways, each teacher is winging it. 

Figuring out how best to serve these young “biliterate brains” is crucial, cognitive scientists say—not just to the future of reading instruction, but to the future of thought itself. Digital technology has transformed how we get knowledge in ways that will advance and forever alter our species. But at the individual level, the same technology threatens to disrupt, even diminish, the kind of slow, careful learning acquired from reading books and other forms of print. 

Those seemingly contradictory truths underline the question of how we should go about teaching children to read in the 21st century, says neuroscientist Maryanne Wolf, author of Reader, Come Home: The Reading Brain in a Digital World. Wolf, the first to use the term “biliterate brain,” is busy researching the relative merits of screen- and page-based approaches, adopting in the meantime a stance of what she calls “learned ignorance”: deeply investigating both positions and then stepping outside them to evaluate all the evidence and shake out the findings. 

“Knowledge has not progressed to the point where we have the kind of evidence I feel we need,” Wolf says. “What do the affordances of each medium—screens vs. print—do to the reading brain’s use of its full circuitry? The answers are not all in.” 

But, she continues, “our understanding is that print advantages slower, deeper processes in the reading brain. You can use a screen to complement, to teach certain skills, but you don’t want a child to learn to read through a screen.” 

Which is better for comprehension, screens or books?

Once children have learned to decode words, research on how they comprehend texts encountered on screens and paper gets a little more decisive. Experts say that young readers need to be reading alongside adults—getting feedback, asking questions, and looking at pictures together. All this helps them build the vocabulary and knowledge to understand what they’re reading. Screens often do a poor job of replicating this human-to-human interaction, and scientists like Wolf say that the “reading circuits” in children’s brains develop differently when the young learners are glued to a screen. 

Studies on the inner workings of the brain confirm the idea that human interaction helps develop beginning readers’ capacity for understanding. They also suggest that reading paper books is associated with that progress. In one study, researchers found that three- and four-year-old children had more activation in language regions of the brain when they read a book with an adult, like a parent, than when they listened to an audiobook or read from a digital app. When they read on an iPad, activation was lowest of all. In another study, MRI scans of eight- to 12-year-olds showed stronger reading circuits in those who spent more time reading paper books than in those who spent their time on screens.

For older students, significant research shows that comprehension suffers when they read from a screen. A large 2019 meta-analysis of 33 different studies showed that students understood more informational text when they read on paper. A study by the Reboot Foundation, evaluating thousands of students across 90 countries including the US, found that fourth graders who used tablets in nearly all their classes scored 14 points lower on a reading test than students who never used them. Researchers called the score gap “equivalent to a full grade level” of learning. Students who used technology “every day for several hours during the school day” underperformed the most, while the gap shrank or even disappeared when students spent less than half an hour a day on a laptop or tablet.   

Why do students understand more of what they read when it’s in a book? Researchers aren’t entirely sure. Part of the issue is distraction, says Julie Coiro, a researcher at the University of Rhode Island. Kid-friendly reading apps like Epic! offer thousands of books that often contain images, links, and videos within the body of the text. These are meant to enhance the reading experience, but they often drag children away from concentrating on the meaning of the text. Even in reading experiments where students weren’t allowed to browse the web or click on embedded links, though, they still performed worse. 

Virginia Clinton-Lisell, the author of the 2019 meta-analysis, hypothesized that overconfidence could be another aspect of the problem. In many of the studies, students who read from a laptop seemed to overestimate their comprehension skills compared with those reading the paper books, perhaps causing them to put in less effort while reading.

Students self-report learning more and having a better reading experience when they read paper books. Linguist Naomi Baron, author of How We Read Now: Strategic Choices for Print, Screen, and Audio, says that when she interviews students about their perceptions, they often say reading from a book is “real reading.” They like the feel of the book in their hands, and they find it easier to go back to things they’ve already read than when they are reading from a screen. While they might prefer digital formats for reasons of convenience or cost, they sense they have greater concentration while reading print. 

But Baron says school districts and educators often aren’t aware of the strong research connecting books to better comprehension or confirming student preferences for print. Baron’s research dealt with college students, but last year a study by the Organization for Economic Cooperation and Development (OECD) of 15-year-olds in 30 countries showed that students who preferred reading on paper scored 49 points higher, on average, on the Program for International Student Assessment (PISA)—and the study hinted at an association between reading paper books and liking to read.

Baron also thinks there should be more practical attention paid to developing pedagogical approaches that explicitly teach the slower, more focused habits of print reading, and then help students transfer those skills to the screen. Reinforcing those habits would be helpful even for people who usually read books, because someone reading a book can get distracted too—especially if a phone is nearby. 

The use of digital books and textbooks exploded during the pandemic, and it may be only a matter of time before all educational publishing moves online. So it’s all the more important to keep making digital reading better for students, says literacy educator Tim Shanahan. Instead of trying to make the digital technology more like a book, Shanahan has written, “[engineers] need to think about how to produce better digital tools. Tech environments can alter reading behavior, so technological scaffolding could be used to slow us down or to move around a text more productively.” In the future, students might read about history or science from something like a “tap essay,” where words, sentences, and images are revealed only when a reader is ready and taps the screen to move on to the next piece of text. Or maybe their reading material will look more like a New York Times digital article, in which text, images, video, and sound clips are spaced out and blended together in different ways.

Hooked on computer phonics 

About two-thirds of American schoolchildren can’t read at grade level. At least partly to blame is a widespread method of reading instruction that dominated classrooms for 40 years but was not based on scientific evidence about how the brain learns to read: “balanced literacy.” That approach, along with its close cousin “whole language,” deemphasized explicit instruction in reading’s foundational skills, leaving many children struggling. But over the last several years, a new method strongly focused on these foundational skills, often referred to as the “science of reading,” has brought sweeping changes to the US education system. Based on decades of scientific evidence, the “science of reading” approach is organized into five areas: phonemic awareness (learning all the sounds of the English language), phonics (learning how those sounds are attached to letters), vocabulary, comprehension, and fluency.

Learn-to-read apps and digital platforms have the potential to teach some of these foundational skills efficiently. They’re especially well suited to phonemic awareness and phonics, making learning letters and sound combinations a game and reinforcing the skills with practice. Lexia, arguably the most widespread digital platform devoted to the science of reading, teaches basic and complex foundational reading skills, like letter-sound blends and spelling rules, using responsive technology. When learning a specific skill, such as figuring out how to read words like meal and seam with the “ea” vowel combination in the middle, students can’t move on until they’ve mastered it. 

A new wave of predictive reading platforms goes one step further. Companies like Microsoft and SoapBoxLabs are envisioning a world where students can learn to read entirely via computer. Using AI speech recognition technology, the companies claim, these digital platforms can listen closely to a student reading. Then they can identify trouble spots and offer help accordingly. 

As digital tech for learning to read spreads into schools—Lexia alone serves more than 3,000 school districts—some reading experts are wary. Research on its efficacy is limited. While some see technology playing a useful role in reading-related functions like assessing students and even training teachers, many say that when it comes to actually doing the teaching, humans are superior. 

Digital platforms can reinforce certain specific reading skills, explains Heidi Beverine-Curry, chief academic officer of the teacher training and research organization The Reading League, but it’s the teacher who is constantly monitoring the student’s progress and adjusting the instruction as needed. 

Faith Borkowsky, founder of High Five Literacy, a tutoring and consultancy service in Plainview, New York, is not bothered by reading instruction apps per se. “If it happens to be a computer program where a few kids could go on and practice a certain skill, I’d be all for it, if it aligns with what we are doing,” she says. But often that’s not how it plays out in classrooms. 

In the Long Island schools Borkowsky works with, students often do more of their reading work on laptops simply because schools purchased expensive technology and feel pressured to use it—even if it’s not always the best way to teach reading skills. “What I’ve seen in schools is they have a program, and they say, ‘Well, we bought it—now we have to use it.’ Districts find it hard to turn back after purchasing expensive programs and materials,” she says.

Some platforms are working to bridge the gap between online and in-person instruction. Ignite! Reading, an intensive tutoring program launched after the pandemic closed schools, teaches foundational reading skills like phonemic awareness and phonics through a videoconferencing platform, where reading tutors and students can see and hear one another. 

Ignite’s instruction attempts to blend the benefits of digital tech and human interaction. In one tutoring session, a first grader named Brittany in Indianapolis, Indiana, sounded out simple words, prompted by her reading tutor, whom she could see through her laptop’s camera. Brittany read “map” and “cup,” tapping the whiteboard in her hand each time she made a sound: three sounds in a word, three taps. At the same time, a digital whiteboard on her laptop screen also tapped out the sounds: one, two, three. As Brittany sounded out each word, the tutor watched the child’s mouth through the computer’s camera, giving adjustments along the way. 

Ignite cofounder and CEO Jessica Sliwerski says she’s building an army of remote reading tutors to assist teachers in helping kids catch up after the pandemic years. Students get 15-minute sessions during the school day, and when sessions are over, tutors get coaching on how to make the short bursts more effective.

Sliwerski believes technology can be incredibly useful for giving more students one-on-one attention. “We are taking a different approach to the technology,” she says. “We are centering the child on a human who is highly trained and accountable. That’s the core of it, and there’s not really anything tech about that.”

Preserving deep reading 

Once students can decode words and comprehend their meaning, the real work of reading begins. This is what Wolf calls “deep reading,” a specific set of cognitive and affective processes in which readers are able to take in whole chunks of text at a time, make predictions about what comes next, and develop lightning-fast perception. These interactive processes feed each other in the brain, accelerating understanding. 

But since the vast majority of the reading that today’s young people do—let’s face it, the majority that we all do—is skimming an online article, a Facebook post, or a text from a friend while hopping from one tab to another, deep reading as a cognitive process is at risk. If today’s kids read only from screens, Wolf says, they may never learn deep reading in the first place—that elaboration of the brain’s reading circuit may never be built. Screen reading may “disrupt and diminish the very powers that it is supposed to advance.” 

“We are amassing data that indicates there are changes in the reading brain that diminish its ability to use its most important, sophisticated processes over time when the screen dominates,” Wolf says. Deep reading is something that came naturally to many readers before digital tech and personal computers, when they had lots of time to spend doing nothing but reading a book; but it can’t be assumed that today’s young readers, with their biliterate brains, will automatically learn the process. 

Some educators are paying more attention to how to help students begin to learn deep reading. Doug Lemov, a charter school founder who now trains teachers full time with his “Teach Like a Champion” books and courses, is acutely concerned that many middle and high school students no longer appear to have the attention span to concentrate on a text for long periods of time. So he encourages the teachers he trains to adopt “low-tech, high-text environments” inside their classrooms, with paper books, pencils, and paper. In such a setting, students slowly build up their attention spans by doing nothing but reading a book or scratching out a piece of writing, even if that means beginning with just a few minutes at a time. 

“Build on that until they can go for 20 minutes, either in a group or individually—just reading the text, sustaining their attention and maintaining focus,” Lemov says. “Writing does the same thing: it improves the focus and attention that students will need to do deep reading.” 

It’s possible, of course, that kids’ attention spans haven’t actually changed that much with the advent of digital technology. Instead, argues Willingham, the cognitive scientist, in his book The Reading Mind: A Cognitive Approach to Understanding How the Mind Reads, it’s their expectations for entertainment that have changed. “The consequence of long-term experience with digital technologies is not an inability to sustain attention. It’s impatience with boredom,” he writes. “It’s an expectation that I should always have something interesting to listen to, watch, or read, and that creating an interesting experience should require little effort.” Deep reading, on the other hand, requires “cognitive patience”—an entirely different set of skills in which kids often have to put in great effort for a payoff that is sometimes many pages down the road.

Yet in Wolf’s view, getting rid of all reading tech would be as ill-advised as relying on it exclusively. Instead, she’s hoping to spur a conversation about balance, gathering evidence about which ways of using digital technology work best for diverse learners and for different age groups—information that could help districts and teachers guide the decisions they make about teaching reading. A five- to 10-year-old child who is learning to read has different needs from a 12-year-old, or from a high schooler whose smartphone is loaded with five social media apps. Young children just beginning to build their reading circuit benefit most from books and human interaction. Older kids can cultivate the “digital wisdom” to make smarter choices while working on developing the ability to toggle effortlessly between print and digital worlds. 

Some kids, though, may be tired of all that toggling. Matt Ryan, a high school English teacher in Attleboro, Massachusetts, doesn’t allow any e-books in his class—when he assigns a novel, it’s paper only. Not only does he not get any pushback, he says, but he senses students are somewhat relieved. 

“Distractions are a very real issue, so reading on a device will not be effective for most of them,” Ryan says. “My sense is that so much of what they do is on a device—they welcome something off of it.” 

Holly Korbey is an education and parenting journalist and author of Building Better Citizens: A New Civics Education for All.