Children are not what they used to be. They tweet and blog and text without batting an eyelash. Whenever they need the answer to a question, they simply pull out their phones and Google it. They live in a state of perpetual distraction, and for many parents and educators it's a source of real concern. Will future generations be able to finish a whole book? Will they be able to sit through an entire movie without checking their phones? Are we raising a generation of impatient brats?
According to Cathy N. Davidson, a professor of interdisciplinary studies at Duke University and the author of the new book "Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn," much of the panic about children's shortened attention spans isn't just misguided, it's harmful. Younger generations, she argues, don't just think about technology more casually; they're actually wired to respond to it in a different manner than we are, and it's up to us -- and our education system -- to catch up to them.
Davidson is personally invested in finding a solution to the problem. As vice provost at Duke, she spearheaded a project to hand out a free iPod to every member of the incoming class, and began using wikis and blogs as part of her teaching. In a move that garnered national media attention, she crowd-sourced the grading in her course. In her book, she explains how everything from video gaming to redesigned schools can enhance our children's education -- and ultimately, our future.
Salon spoke to Davidson over the phone about the structure of our brains, the danger of multiple-choice testing, and what the workplace of the future will actually look like.
Over the last few years, there's been a lot of concern that new forms of technology, like smartphones, video games and the Internet, are ruining the next generation of kids -- that they can't concentrate on anything, that they're always distracted. You don't think that's the case?
Back in the '80s and '90s, there were findings that suggested this new technology would be great as an education and learning tool. Then Columbine happened, and you could see all the research money go from "Wow, we're in this new digital age and it's going to be great for all of us" to "How is it that this digital era is destroying our kids?" It's email, it's the Internet, it's video games, then when texting comes along, it's texting, and when social networking comes along, it's social networking. So whatever the flavor of the month is in terms of new technology, there's research that comes out very quickly showing how it causes our children to be asocial, distracted, bad in school, to have learning disorders -- a whole litany of things.
And then the Pew Foundation and MacArthur Foundation started saying, about three or four years ago: "Wait, wait, wait, let's not assume these things are hurting our kids. Let's just look at how our kids are using media and stop with testing that's set up from a pejorative or harmful point of view. Let's actually look at what's happening." So we've wasted time -- but we can make it up. I think the moralistic research really, really colored over a decade of work, especially on kids.
So tell me, why isn't all this distraction bad for our kids' brains?
The phenomenon of attention blindness is real -- when we pay attention to one thing, it means we’re not paying attention to something else. When we’re multitasking, what we’re actually really doing is what Linda Stone calls "continuous partial attention." We’re not actually simultaneously paying equal attention to two things: One of the things that we’re doing is probably being done automatically, and we’re sort of cruising through that, and we’re paying more attention to the other thing. Or we’re moving back and forth between them. But any moment when there is a major new form of technology, people think it’s going to overwhelm the brain. In the 1930s there was legislation introduced to prevent Motorola from putting radios in dashboards, because it was thought that people couldn't possibly cope with driving and listening to the radio.
As you point out in the book, the reason why certain things distract us more than others has to do with the way our brains develop when we're young children.
We used to think that as we get older we develop more neural pathways, but the opposite is actually the case. You and I have about 40 percent fewer neurons than a newborn infant does. A baby pays attention to everything. You've probably witnessed this -- if there are shadows on the ceiling or fan blades making peculiar patterns, we adults don't notice that, but it can be utterly mesmerizing to a child. Over and over again, children learn what not to pay attention to, and learn what to pay attention to, and that makes for neural pathways that are very efficient. They're what we tend to call reflexes or automatic behaviors, because we've done them so many times we don't pay attention to them anymore. As an adult, you feel distracted when you learn something new and you can't depend on those automatic responses or automatic reflexes that have been streamlined neurally over a lifetime of use.
Younger generations are being exposed to all these new stimuli -- texting, Facebooking, Googling -- from an early age. Does that mean their brains, their neural pathways, are built differently than ours?
When my students go to the Web and they're searching and they're leaving comments and they're social networking and they're Facebooking and they're texting at the same time -- those are their reflexes. They are learning to process that kind of information faster. That which we experience shapes our pathways, so they're going to be far less stressed by a certain kind of multitasking than you or I are, or than people who didn't grow up with it.
Our tools are substitutes for those things that society has taught us aren't worth paying attention to and aren't really valuable -- and our neural pathways have followed right along. Back in the days when the slide rule was invented, people thought it was terrible and you would lose math abilities. Well, they were right, we did lose certain math abilities, and no one cared. So kids today, I think, because of the way they learn, are used to lots of different media, and they are learning in a different way than kids who were trained by television, for example, in a previous generation. It's not that one is better or worse than another; it's just that they're absolutely different. There's always something that is easy for a kid not because they're superior, but because that's exactly the thing that shaped their neural pathways.
In the book, you have this fascinating statistic that 65 percent of kids born today will have careers that don't exist yet. Right now, under No Child Left Behind, the school system puts tremendous emphasis on standardized multiple-choice tests, which, as you point out, don't exactly train kids to think creatively about the technological future.
Standardized testing was invented in 1914 and modeled explicitly as a way to process all the immigrants who were flooding into America at the same time as we were requiring two years of high school, men were off at war, and women were working in factories. The multiple-choice test is based on the assembly line -- what's fast, what's machine-readable, what can be graded very, very rapidly. It's also based on the idea of objectivity, and that there's a kind of knowledge that has a right answer. If you chose a right answer, you're done. It's really only in the last 100 years that we've thought of learning in that very quantifiable way.
We're now in an era where anybody can find out anything just by Googling. So the real issue is not how fast I can choose fact A, B, C or D. Now if I Google an answer I've got thousands of possibilities to choose from. How do you teach a kid to make a sound judgment about what is and what isn't reliable information? How do you synthesize that into a coherent position that allows you to make informed decisions about your life? In other words, all of those things we think of as school were shaped for a vision of work and productivity and adulthood that was very much an industrial-age vision of work, productivity and adulthood. We now have a pretty different idea of work, productivity and adulthood, but we're still teaching people using the same institutionalized forms of education.
So what do we do to change that?
First I'd get rid of end-of-grade tests. They demotivate learning, in boys especially. Establish more challenge-based, problem-solving kinds of education. This is hardly revolutionary. Montessori schools do this. I would like to see more attention paid to how you go from thinking something to making something. If I'm learning about numbers, how will that help me understand the financial situation that no one in the world seems to understand right now? You're lucky that you're 27; imagine being 15 right now, hearing every pundit say that your generation is the first generation to be poorer than your parents, you're not going to have jobs, we're going to go into a worldwide depression, and the Internet has made you dumb, shallow, stupid, lonely -- that's a lot to deal with.
One of the things you advocate is getting rid of the traditional grading system in favor of something more group-sourced. How would that work? I find it a little shocking.
There are all these really stunning computer scientists who are just frustrated as heck about how badly we're training scientists. And many of them feel that A, B, C, D and numeric grades are disincentives to exactly the kind of inductive thinking, creative thinking, that is the scientific method. TopCoder is the world's most important certification system for people who are doing open Web development around the world, and they've come up with an incredibly complex badging system, where if I'm working with you on code and I see you're doing a great job, it's part of my job as a member of the TopCoder community to give you points. So if I think you're doing a great job solving some problem in C++ that I can't see a solution to, I might give you 20 points. If I'm a third developer, and I say I really need somebody who can help me with some really complicated stuff in C++, and I see you have a badge with 1,000 points on it on your website, I can click on your badge and it will show me, in minute and excruciating detail, how you earned every one of those points.
There is now a group of computer scientists who are working together to see if we can't come up with ways that textbooks -- particularly online and interactive textbooks; there've been some wonderful ones for algebra, for example -- could be based on testing that works in a similar way, where a teacher would give you points for succeeding at a problem, or where you would automatically get points for getting the correct answer. You wouldn't even worry about giving negative points, because it doesn't matter; all you do is get points when you do something well. Even saying that is a conceptual breakthrough. When I told my students that we don't have to worry about trolls and criticism, all we have to do is make really sound, conscientious, articulate judgments about positive things, it was as if a cloud opened.
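The points-only badge system Davidson describes -- peers award positive points with a reason, negative points simply don't exist, and anyone can inspect exactly how a badge was earned -- can be sketched in a few lines. This is a hypothetical illustration, not TopCoder's actual implementation; every name here is invented:

```python
from dataclasses import dataclass, field

@dataclass
class Award:
    """One peer-to-peer award: who gave it, how much, and why."""
    from_peer: str
    points: int
    reason: str

@dataclass
class Badge:
    """A developer's badge: a running ledger of positive-only awards."""
    owner: str
    awards: list = field(default_factory=list)

    def award(self, from_peer: str, points: int, reason: str) -> None:
        # The system never subtracts points, so negative awards are rejected.
        if points <= 0:
            raise ValueError("only positive points can be awarded")
        self.awards.append(Award(from_peer, points, reason))

    def total(self) -> int:
        return sum(a.points for a in self.awards)

    def detail(self):
        # "Click on the badge": every award, with who gave it and why.
        return [(a.from_peer, a.points, a.reason) for a in self.awards]

badge = Badge("dev_a")
badge.award("dev_b", 20, "elegant C++ solution to a hard problem")
badge.award("dev_c", 15, "helped debug a tricky issue")
print(badge.total())  # prints 35
```

The design choice worth noting is that the ledger keeps the reasons, not just the sum: the number on the badge is only trustworthy because the full history behind it stays inspectable.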
Work life is also changing. It's weird, I go to an office every day, and sit at a desk, but as soon as I open my laptop my non-work life floods back in. I'm answering emails from my parents, or checking my Facebook page, or getting texts from friends. Our work life has become very porous.
Right. It's bizarre -- we go to work, and then we live in a workplace that is our desktop, which brings the whole world to our workplace with us. Why are we going physically to work to then be subjected to everything else in the world that isn't work, along with everything that is work? It's all jumbled up together. We're in a very transitional moment. But that's going to get fixed too. I think grading is going to get more sophisticated, and I think we're just going to have more and more complex ways of working.
Do you think that we’ll just all be telecommuting, eventually?
You know, I love talking about IBM, because what stodgier company is there on earth than IBM? There, people are reshaping the workplace to the task before them. Sometimes they physically come together, sometimes they're in a 15-person phone call, sometimes they're on their laptops [IM'ing]. I think the dexterity of work is just something we're beginning to explore. I think you need some really solid infrastructure so it's not exploitative. It would be great to have universal healthcare, because if you had universal healthcare you actually could employ people in varied ways. Like I might want to work 10 hours a week at Salon, and 20 hours a week at NYU, and 15 hours a week at the New York Times. I think flexibility and variability will be more important in the future.
I think basically everybody in the white-collar field has noticed that this change goes both ways -- our personal life intrudes into our workplace, but our work is increasingly intruding into our personal life. We're answering emails on weekends or checking our BlackBerrys at night. Is this something we're just going to get used to?
We could either get used to it, or we could say no. Computers could have software that said how much time I was logged in [and then paid me accordingly]. Why couldn’t that be the new work life? We have to think very carefully about what we want from work now that those new conditions and possibilities exist. That’s why I teach my students about judgment.