This is the fourth installment of The Influencers, a six-part interview series that Lynn Parramore, the editor of New Deal 2.0 and a media fellow at the Roosevelt Institute, is conducting for Salon. She recently talked to Internet activist and guru Eli Pariser, board president of MoveOn.org, who is currently writing a book exploring an invisible feedback loop he calls "the filter bubble." They discussed the dangers of this trend, along with net neutrality and the future of the Web.
What is the filter bubble and why is it significant?
Increasingly on the Internet, websites are personalizing themselves to suit our interests. We all see this happening at Amazon, where if you buy a book, Amazon recommends the next one. We see it happening on Netflix, but it's also happening in a bunch of places where it's much less visible.
For example, on Google, most people assume that if you search for BP, you'll get one set of results that are the consensus set of results in Google. Actually, that isn't true anymore. Since Dec. 4, 2009, Google has been personalized for everyone. So when I had two friends Google "BP" this spring, one of them got a set of links about investment opportunities in BP. The other one got information about the oil spill. Presumably that was based on the kinds of searches they had done in the past. If you have Google doing that, and you have Yahoo doing that, and you have Facebook doing that, and you have all of the top sites on the Web customizing themselves to you, then your information environment starts to look very different from anyone else's. And that's what I'm calling the "filter bubble": that personal ecosystem of information that's been curated by these algorithms based on who they think you are.
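To make the mechanism concrete, here is a minimal sketch of history-based re-ranking. It is purely illustrative: the `rerank` function, the interest profile, and the result fields are invented for this example, and Google's actual personalization signals and ranking are proprietary.

```python
# A toy re-ranker: results on topics the user has engaged with before
# float to the top. Everything here is hypothetical.
from collections import Counter

def rerank(results, history):
    """Order results by how often the user's history touches each topic."""
    interest = Counter(history)  # crude interest profile built from past queries
    return sorted(results, key=lambda r: interest[r["topic"]], reverse=True)

results = [
    {"title": "BP oil spill reaches the Gulf coast", "topic": "news"},
    {"title": "Is BP stock a buying opportunity?",   "topic": "investing"},
]

# Two users, one identical query ("BP"), two different histories:
print(rerank(results, ["investing", "investing", "news"])[0]["title"])
# -> the investment story
print(rerank(results, ["news", "news", "investing"])[0]["title"])
# -> the oil-spill story
```

The same query produces two different orderings, and neither user ever sees that the other ordering exists. That invisibility is the bubble.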
Why is this dangerous?
We thought that the Internet was going to connect us all together. As a young geek in rural Maine, I got excited about the Internet because it seemed that I could be connected to the world. What it looks like increasingly is that the Web is connecting us back to ourselves. There's a loop at work: if you have an interest, you're going to learn a lot about that interest, but you're not going to learn about the very next thing over, and you certainly won't learn about the opposite view. If you have a political position, you're not going to learn about the other one. If you Google the link between vaccines and autism, you can very quickly find that Google is repeating back to you your own view about whether that link exists, not what scientists know, which is that there isn't one. It's a feedback loop that's invisible. You can't witness it happening because it's baked into the fabric of the information environment.
Is this the new narcissism?
Yeah, and you know, these filters are coming out for totally legitimate reasons. There's a mass of information that we all have to deal with every day, way more than we can grapple with, and we need help. Google's CEO, Eric Schmidt, likes to tell people this statistic: If you took all of human intellectual output from the beginning of civilization to 2003, every single conversation that ever happened, it comes to about two exabytes of data (an exabyte is a billion gigabytes). And now two exabytes of data are created every five days.
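For a sense of scale, a quick back-of-the-envelope check (the figures are Schmidt's as quoted; the unit conversion is the standard one):

```python
# Schmidt's figures, with the standard conversion 1 exabyte = 10**9 gigabytes.
GB_PER_EB = 10**9

archive_eb = 2          # all human output through 2003, per the quoted statistic
rate_eb_per_5_days = 2  # new data created now, per the quoted statistic

print(archive_eb * GB_PER_EB)              # 2,000,000,000 GB in the whole archive
print(rate_eb_per_5_days / 5 * GB_PER_EB)  # 400,000,000 GB of new data per day
```

In other words, by this statistic we now reproduce the entire recorded output of civilization roughly every five days.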
So there's this enormous flood of bits, and we need help trying to sort through it. We turn to these personalization agents to sift through it for us automatically and try to pick out the useful bits. And that's fine as far as it goes. But the technology is invisible. We don't know who it thinks we are, what it thinks we're actually interested in. In the end, it's code, not a person, and it locks us into pixelated versions of ourselves -- into a set of checked boxes of interests rather than the full range of human experience. I don't think that with this information explosion you can go back to an unfiltered, unpersonalized world. But I think you can bake into the code a sense of civic importance. You can have a sense that there are some things that we all need to be paying attention to, that we all need to be worried about, where you do want to see the top link on BP for everyone, not just investment information if you're interested in investments.
How can we ensure that certain kinds of public information will be readily available to all users?
I think the change happens on a bunch of levels, and the first is on an individual level. You can make sure that you're constantly seeking out new and interesting and provocative sources of information. Think of this as your information diet. The narcissistic stuff that makes you feel like you have all the right ideas and all the right opinions -- our brains are calibrated to love that stuff because in nature, in normal life, it's very rare. Now we have this thing that's feeding us lots of calories of that stuff. It takes some discipline to forgo the information junk food and seek out stuff that's a little more challenging.
So that's one piece of it. I think the second piece is that we've had institutions mediating what we get to know for a long time. For most of the last century, those institutions were newspapers, which produced something like 85 percent of the news. They were always commercial entities. But because they were making so much money, they were able to afford a sense of civics -- a sense that the New York Times was going to put Afghanistan on the front page even if it didn't get the most clicks.
So newspapers found this kind of happy medium that didn't always work perfectly, but it worked better than the alternative. I think now the baton is passing to Google, to Facebook, to the new filters, to develop the same kind of sense of ethics about what they do. If you talk to the engineers, they're very resistant, because they feel like this is just code, it doesn't have values, it's not a human thing. But of course they're the ones writing the code, and every human-made system embodies the values of its makers. And so I think calling them to their civic responsibility is an important part of the puzzle.
There are also some places where we need regulation. There's the idea that people should be able to control, in a clearer and more powerful way, how the information they give to websites is used and monetized. That's something that will probably need government action. It could be something as simple as this: Instead of requiring sites to have privacy policies that customers agree to, you could pass a law that says, "Here's a standard format by which customers can set their own policy for how they want their data used," and sites essentially have to read that code and abide by it. That would change a lot of things, as opposed to the hundred-page Apple terms-of-service agreement that I just got on my iPhone, which probably nobody has ever read in its entirety.
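As a sketch of what such a user-side, machine-readable policy might look like (the format, field names, and checker below are all invented for illustration; no such standard exists in this form):

```python
# Hypothetical machine-readable privacy policy, published by the user
# rather than by the site. The schema is invented for illustration.
user_policy = {
    "retain_days": 90,               # delete my data after 90 days
    "sell_to_third_parties": False,  # never resell my data
    "allowed_uses": {"personalization", "fraud_prevention"},
}

def use_permitted(policy: dict, purpose: str) -> bool:
    """A compliant site would run a check like this before using visitor data."""
    return purpose in policy["allowed_uses"]

print(use_permitted(user_policy, "personalization"))  # True
print(use_permitted(user_policy, "ad_targeting"))     # False
```

The point is the reversal of defaults: instead of every user agreeing to every site's unreadable policy, every site programmatically honors each user's one policy.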
Let's turn to net neutrality. You've been at the forefront of harnessing the Internet in a way that allows ordinary people to participate in the political arena. Why is net neutrality important to a democracy?
It's extremely important because the Internet was built on the principle that it would carry all different types of data. And it didn't really care what kind of data it was carrying. It was going to make sure that it got from Point A to Point B. That's the Internet: There's kind of a social contract between all the machines on the Internet that says, "I'll carry your data if you carry my data, and we'll leave it to the people on the edges of the network -- to your home PC or the PC that you're sending something to -- to figure out what the data means." That's the net neutrality principle. It's really at the core of the founding idea of the Internet.
Now, big companies like Verizon and Comcast are looking at how the Internet is eroding their profit margins, and they're asking themselves what they can do to get a piece of this growing pie. They want a tiered Internet, where you can pay them to move your data to the front of the line. That would really erode the amazing thing we all know the Internet facilitates: that anyone with an idea can reach the world. You talk to venture capitalists and they're scared. They say a new start-up is just never going to be able to buy the kind of speed that a Google or a Microsoft can. Incumbent industries will be able to get their data to you quickly and new start-ups won't have a chance. As a result, you'll see a drying up of the entrepreneurialism that's happened on the Internet, and a drying up of the Wikipedias, the nonprofit projects. Wikipedia works because it's just as fast as Google. When Wikipedia starts to slow way down relative to Google, you're more likely to just go to Google. So that's a problem.
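A toy model makes the tiering visible (the carrier, the company names, and the priority scheme are all hypothetical):

```python
import heapq

# Toy model of a tiered network: packets from senders who paid the carrier
# jump the queue. All names and numbers here are hypothetical.
PAID_FAST_LANE = {"BigSearchCo": 0}  # priority 0 beats the default of 1

def drain(queue):
    """Deliver queued packets in priority order, then arrival order."""
    order = []
    while queue:
        _, _, sender = heapq.heappop(queue)
        order.append(sender)
    return order

packets = []
arrivals = ["ScrappyStartup", "BigSearchCo", "ScrappyStartup"]
for i, sender in enumerate(arrivals):
    heapq.heappush(packets, (PAID_FAST_LANE.get(sender, 1), i, sender))

# On a neutral network, delivery order would equal arrival order.
print(drain(packets))  # ['BigSearchCo', 'ScrappyStartup', 'ScrappyStartup']
```

The start-up's packets arrive first but are delivered last, and no engineering on the start-up's side can fix that.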
If you have the filter bubble and a non-neutral Internet, doesn't it become even harder to find alternative points of view and for the dissident voice to be heard?
Yeah, and you know, I think this isn't pessimism; this is realism. Tim Wu wrote a new book called "The Master Switch," which shows that every new medium in telecommunications history has gone through a period just like what we've seen with the Internet. There was democratization and openness, and people dreamed these really big dreams -- about the telegraph, about the telephone, about the fax machine. They dreamed it was going to connect us all together and transform everything, and that it could never be owned. Everybody has been saying the exact same thing about the Internet. This is like the eighth time we've been through this. The likelihood is that the Internet will end up owned by a few large media companies. The question is, can we rally around this brief moment of incredible efflorescence of creativity and innovation and say, "No, thank you very much, we want to keep it like this"? That's the project of the next couple of years.
Have we learned any lessons along the way about what remedies and antidotes are actually effective?
We're not learning. It's like every time it happens people sort of think it's totally different. The important thing to remember with the Internet is that there are large companies that have an interest in controlling how information flows in it. They're very effective at lobbying Congress, and that pattern has locked down other communication media in the past. And it will happen again unless we do something about it.
- - - - - - - - - - -
Previous interviews in this series:
Eliot Spitzer on the crisis of accountability (Oct. 1)
Lewis Lapham on the "end of capitalism" (Sept. 23)
Elizabeth Warren in her own words (Sept. 15)