What do you do when you're a hacker specializing in secure communications protocols, and you get a request to help the Kingdom of Saudi Arabia spy on its own people? For San Francisco's Moxie Marlinspike, a respected computer security expert, the experience provoked a thoughtful examination of the current state of hacker culture.
Not so long ago, hackers often perceived themselves as standing in opposition to authority and governments. Moreover, the subcategory of hackers who specialized in discovering and publicizing security vulnerabilities -- and the code that takes advantage of them, known as "exploits" in the security trade -- did so out of a belief that the best way to improve the integrity of our communication systems was to publicize dangerous security holes.
Times have changed. As Joseph Menn documented in a breakthrough special report for Reuters last week, today's security-minded hackers often end up working directly for defense contractors, hand in hand with the U.S. government. Identifying exploits and selling them off to the highest bidder has become a lucrative business. Worst of all, the buyers of these exploits aren't interested in improving security; instead, they often plan to deploy these vulnerabilities for their own purposes.
Marlinspike spoke with Salon on Tuesday morning to explain how his Saudi Arabian encounter encouraged him to challenge the hacker community to rethink its values.
A week ago you were approached by a Saudi Arabian telecom company. What did they want and why did they come to you?
The company Mobily is actually from the United Arab Emirates, but they are one of the three major telecoms that operate in Saudi Arabia. They'd gotten a requirement from the regulator in Saudi Arabia to be able to both monitor and block mobile application data -- data transmitted from apps on phones. They were trying to meet that requirement and were looking for help on the surveillance.
You said they came to you because you had written software tools that targeted security holes in communications software. Can you explain what that means?
A lot of these apps use a secure protocol called SSL for communicating with their servers. I have spent some time doing security research in that area, and I've published a number of vulnerabilities concerning SSL over the years. I think they saw that and assumed that I would be able to help them intercept SSL communications.
Why had you chosen to focus on exposing such vulnerabilities?
For a bunch of reasons. I'm just interested in security protocols, for whatever weird reason. And SSL is probably the most popular secure protocol on the Internet, so focusing work in that area just makes a lot of sense, you know, bang for the buck. I'm also interested in doing research in secure protocols and specifically SSL because more and more that's what we depend on for the security of our communications, and more and more there are people who are interested in intercepting that communication, and I think we have to look at it really critically to make sure that it is as secure as we want it to be.
Ultimately, you turned Mobily down. Why?
Well, I'm not interested in helping them surveil the private communications of millions of people.
That led to the Mobily guy saying to you: "If you are not interested then maybe you are indirectly helping those who curb the freedom with their brutal activities." Kind of an "if you're not with us, you're against us" moment. How did that make you feel?
Obviously concerned. But I do think it was a really great example of the same logic we are going to be confronted with over and over again. There's sort of an ongoing debate in the security community about what our role is in this new dynamic where governments are weaponizing the insecurities that are out there. Over and over again we hear it's us or them, you're with us or against us, your choice is either bombs or exploits. That is something we in the security community need to be talking about and be aware of.
Joseph Menn's Reuters article on how the U.S. government is one of the biggest purchasers of these exploits was a real eye-opener. It's weird to see security hackers co-opted by the military-industrial complex -- selling exploits to the highest bidder. How did that happen?
I don't know. Slowly. But it is shocking how far it has come. For instance, the most popular security conference in the United States is called DefCon. In the early days of DefCon, there was a game that was sort of collectively played by everyone there called "Spot the Fed." The idea was that you would get points for every government-employed agent you could identify at the conference. Now, some of the major conference organizers actually work for the Department of Homeland Security. So, there's been a major transition in terms of that culture.
Isn't that a betrayal of hacker culture?
I'm wary of trying to define who is a hacker and who is not a real hacker. Betraying our true nature, or whatever. But I am interested in trying to talk more about what it is that we value and prioritize and who it is that we want to reward. To think intentionally about that. A lot of it just has to do with money. When you go back to the origins of the hacker community, our skills weren't valued by these players. And now they really are. Money has certainly changed this for a lot of people; this is where their bread is buttered. But I still think that as a community we can think about culture and try to influence that.
Do you think there is potential here for a counter-reaction? A return to the idea that exposing these vulnerabilities ultimately makes us safer? A sense that the market for exploit sales has gotten completely out of hand?
I think there is a growing narrative along those lines. So far the discussion has focused a lot on legality, whether it should be legal to sell exploits or not, whether it should be regulated, whether Congress should step in. But I think simultaneously, it would be good to have a conversation about exploit sales in the context of culture. What does this community value and prioritize?
Are you talking morality? Ethics?
I wouldn't use the word "morality" -- I'm talking about cultural norms. I think that it is getting easier to talk about because there is more information about what is going on. All this stuff was so opaque for a while that it was hard to have a real conversation about what was happening. This Saudi Arabia stuff is a great example. If someone is selling exploits to U.S. defense contractors, those same exploits could very easily end up in the hands of the Kingdom of Saudi Arabia through the corporate partnerships that the U.S. has established with that government.
How has this affected what you focus on in your own work? Has it changed your research interests?
It has. Specifically, I spend a lot more time now working on developing tools for secure communications, and working on proposals to strengthen the secure protocols that we already have, whereas before I probably spent more time doing research, looking for holes that could be exploited.