The tech sector is heavily run by men, sometimes men who seem blind to the potential dangers of their innovations. Now, in a move that seems to confirm this stereotype, Facebook has come up with a solution to one of its many PR crises: the rise of revenge porn on the platform. To combat this problem, Facebook is proposing that women preemptively upload nude selfies to the platform itself, so that its photo-matching systems can block their exes from posting the same images.
Perhaps sensing the eyebrows the announcement would raise, Facebook and its Australian government partner sent women out in front of the press to defend the anti-revenge porn system.
“It would be like sending yourself your image in email, but obviously this is a much safer, secure end-to-end way of sending the image without sending it through the ether,” Australia’s eSafety Commissioner, Julie Inman Grant, told ABC. Antigone Davis, Facebook’s head of global safety, further assured users that “the safety and well-being of the Facebook community is our top priority.” Uh huh. Sure.
The idea is that women would snap nude selfies and upload them to Facebook Messenger, which would then use AI technology to create a digital fingerprint, a hash, of each image. That fingerprint would automatically block any other user from posting a copy of the same photo. It’s the same concept as fingerprinting at birth . . . except way more invasive. It sounds like a poor plan, especially at a time when users have less trust than ever in technology’s ability to protect their privacy. Leaked nude photos afflict celebrities and normal folks alike, and hackers who access supposedly secure information are a regular feature in our news cycle, so why on earth should Facebook users be expected to happily hand over even more of their private lives?
Facebook is piloting the technology in Australia, where Inman Grant assured the Australian Broadcasting Corporation that users shouldn’t worry about their nude selfies winding up in Facebook’s cloud. “They’re not storing the image. They’re storing the link and using artificial intelligence and other photo-matching technologies. So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
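For the curious, the mechanism Inman Grant describes boils down to hash matching: keep a fingerprint of the image rather than the image itself, and refuse any upload whose fingerprint has been seen before. Below is a loose sketch of that idea in Python. The function names are made up for illustration, and the SHA-256 digest is a stand-in; this is not Facebook’s actual system, which reportedly relies on more sophisticated photo-matching.

```python
import hashlib

# Hypothetical blocklist of previously registered images. Only the digest is
# kept; the image bytes themselves are never stored.
blocked_hashes: set[str] = set()


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that stands in for the image, not the image itself."""
    return hashlib.sha256(image_bytes).hexdigest()


def register_private_image(image_bytes: bytes) -> None:
    """Store only the hash, mirroring the 'store the link, not the image' claim."""
    blocked_hashes.add(fingerprint(image_bytes))


def is_upload_allowed(image_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a registered one."""
    return fingerprint(image_bytes) not in blocked_hashes


# Usage: the owner registers an image, then a later identical upload is blocked.
if __name__ == "__main__":
    original = b"...raw bytes of the private photo..."
    register_private_image(original)
    print(is_upload_allowed(original))         # False: an exact copy is blocked
    print(is_upload_allowed(original + b"x"))  # True: a one-byte change slips past
```

The obvious catch: a cryptographic hash like SHA-256 only matches bit-identical files, so a cropped, resized, or re-encoded copy would sail right through. That is why the “photo-matching technologies” Inman Grant mentions would need to be perceptual fingerprints designed to survive exactly those edits.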
Facebook has come under particular pressure in recent months to tackle the problem of users posting explicit images of women online without their consent. The most publicized case was the all-male Facebook group Marines United, which was found in March 2017 to have harvested hundreds of photos of female Marines and veterans. The men would regularly leave lewd messages in the comments, like the Marine who commented on one woman’s photo that he’d like to “take her out back and pound her out.”
It’s a major problem for civilians, too—10% of American internet users under age 30 have been victims of revenge porn or online harassment. In recent years, multiple women have sued Facebook for failing to promptly take down nude images that were posted without their consent.
As Slate’s April Glaser writes, it’s pretty tone-deaf of Facebook to ask victims of revenge porn to upload more nude photos of themselves. “When a naked photo of a person is circulated without her consent, it can be ruinous emotionally and professionally. Requesting that women relive that trauma and trust Facebook, of all companies, to hold that photo in safekeeping is a big ask.”
Surely there must be other ways to prevent revenge porn on social media — heavier community monitoring by humans, perhaps, or a system to automatically flag users whenever they upload photos that contain nudity? Just some thoughts.
Whether Facebook's anti-revenge porn initiative will succeed depends on an ethical question for the 21st century: do women today trust anonymous AI technology more than they trust the shady men to whom they send nude photos in the first place?