Facebook wants to prevent your ex from sharing intimate pictures as revenge porn, but first it needs you to send it the photos to review. Image: TheUSBPort

On Wednesday, Facebook announced a wildly controversial measure to prevent the unauthorized spread of private nude photos across its platforms: a system that prompts users to upload their own intimate pictures for Facebook to review, so it can block their later sharing as “revenge porn.”

The idea was met with instant criticism from the online community, proving once again that the road to hell is paved with good intentions. Facebook reassured people that it would not store the images themselves, only a digital fingerprint that its machine learning systems can match across its platforms.

The tech giant has recently orchestrated similarly insensitive policies. A few months ago, journalists uncovered that Facebook’s ad platform allowed racial targeting and discrimination. Instagram was also at the center of controversy for allowing influencers to advertise products without disclosing it.

How does Facebook plan to combat revenge porn?

On paper, the social media giant’s idea to fight the regrettable phenomenon of revenge porn is a good one. However, the privacy trade-offs required to accomplish it are troublesome, to put it mildly, for the millions of users who might want or need this sort of tool.

For starters, Facebook would require you to send nude pictures of yourself to yourself via Messenger. From those, it creates a “digital fingerprint” of each image that its machine learning models can use to detect that photo, or similar ones, if anyone tries to upload it later.
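Facebook has not disclosed how its fingerprinting works, but perceptual hashing is a common way to fingerprint images so that near-duplicates (resized, slightly brightened, recompressed copies) still match without storing the image itself. The sketch below is purely illustrative: a toy “difference hash” (dHash) over plain 2D grayscale arrays standing in for images, not Facebook’s actual method.

```python
# Illustrative sketch only: a perceptual "difference hash" (dHash).
# Each bit records whether a pixel is brighter than its right-hand
# neighbor, so small uniform changes leave the hash largely intact.
import random

def dhash(pixels):
    """Fingerprint a 2D grayscale array of size N x (N+1) as a list
    of bits comparing each pixel to its right-hand neighbor."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two hashes; a small distance
    suggests the images are near-duplicates."""
    return sum(x != y for x, y in zip(a, b))

# Toy 8x9 "images": the second is the first, slightly brightened.
random.seed(0)
original = [[random.randint(0, 255) for _ in range(9)] for _ in range(8)]
tweaked = [[min(255, p + 2) for p in row] for row in original]

h1, h2 = dhash(original), dhash(tweaked)
# A uniform brightness shift preserves almost every left > right
# comparison, so h1 and h2 stay close, while an unrelated image
# would typically differ in roughly half of its 64 bits.
```

In a real deployment only the hash would leave the device or reviewer, and uploads whose hash falls within a small Hamming distance of a registered fingerprint would be flagged; the threshold is a tuning choice, not something Facebook has published.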

If that did not sound controversial enough, the hypothetical scenario where your nudes do get shared might seem worse: if someone uploads revenge porn of you to Facebook, Instagram, or Messenger, the system alerts the Community Operations team, and a staffer has to review whether it is actually you in the picture.

Facebook’s execution has to be as good as their intentions

Of course, the immediate backlash that followed the announcement should be enough for Facebook to realize that its strategy is not as sound as it thought. The program is already deployed in Australia as a preliminary trial that could expand to the U.S., Canada, and the U.K. if it proves successful.

However, it is unlikely to reach that point without some core modifications. First, the platform must be transparent about its methods if it truly does not store the image files; second, it must reinforce the review process, since it relies on humans; and third, it must train AI to handle this task as soon as possible.

For a company that boasts so much about its machine learning capabilities, Facebook sure tends to rely on humans for sensitive tasks. People appreciate the initiative, but they outright reject the idea of a revenge porn catalog to which they willingly contribute their own photos.

Source: Facebook