
Facebook Was A Revenge Porn Dumping Ground, But Now It’s Trying To Stop It

Meta assures users that when they submit sensitive videos and photos, the actual files never leave their devices and only the hashed data is shared.




Meta has launched a new tool that lets Facebook and Instagram users stop the spread of intimate images shared without their consent on the platforms. Facebook and its sister social media platform are no strangers to the problem that has now gained infamy under the ‘revenge porn’ umbrella term. Since 2019, the company has had a system in place that uses machine learning and artificial intelligence to detect nude images and videos that might have been shared non-consensually on Instagram and Facebook.

But so far, the system has relied on human intervention from a “specially-trained member of our Community Operations team” to determine whether a piece of media violates the community guidelines and needs to be removed before much damage is done. However, the two platforms have been struggling against invasions of sexual privacy and online harassment, and the failings can be blamed on both internal and external factors. Leaked internal research recently revealed that Facebook dished out disturbing content depicting graphic violence or sexual imagery to users who aren’t skilled at navigating social media.

Taking yet another go at the issue, the company has launched a free tool for users who are victims of Non-Consensual Intimate Image (NCII) abuse. The tool, developed in collaboration with the UK Revenge Porn Helpline, has the backing of over 50 NGOs across the globe and will be available to Instagram and Facebook users worldwide. Now going by the name Meta, the company says its safety tool is not only for users who have already been victims of “revenge porn,” but can also help those who fear it might happen to them in the near future. Users are asked to create hashes of their intimate images, building a coded warning system that stops the publishing of non-consensual intimate imagery and videos and lowers the risks proactively.


A Big Step With One Huge Drawback

However, the whole procedure might make users feel uneasy, because the system asks them to submit intimate photos and videos for the sake of matching. Regarding the privacy aspect, Meta says that hashing allows it to catalog the identifiable data in an image while keeping the media itself private to the individual who submitted it. When users submit an image or video, it is instantly converted into a unique digital code using an algorithm, and that code serves as a digital fingerprint for matching. For example, a duplicate of an image has the same digital fingerprint as the original, and that is how the identification and removal system kicks into action. The actual image never leaves the user’s device, and StopNCII as well as participating companies only receive this hashed data as a safety measure.
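The hash-and-match idea can be illustrated with a short sketch. The snippet below is a simplified stand-in, not the algorithm StopNCII actually uses: it fingerprints a file locally with SHA-256 and shares only the hex digest, so a byte-for-byte duplicate produces the same code while the image itself never leaves the device.

```python
# Simplified illustration of the hash-and-match idea (an assumption, not
# StopNCII's real algorithm): only the hex digest would ever be shared,
# never the image bytes themselves.
import hashlib

def fingerprint(path: str) -> str:
    """Hash the file locally and return a shareable digital fingerprint."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_duplicate(submitted_hash: str, candidate_path: str) -> bool:
    """Platform-side check: does a newly uploaded file match a submitted hash?"""
    return fingerprint(candidate_path) == submitted_hash

# The victim's device submits fingerprint("private_photo.jpg"); participating
# platforms then compare that string against hashes of newly uploaded media.
```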

Apple also relies on a similar hash-generation process to detect Child Sexual Abuse Material (CSAM) as part of its iCloud photo-matching system. Meta is using open-sourced, industry-standard algorithms for hashing photos and videos that can be implemented easily across platforms for matching. But the system can also be fooled easily by minor edits to an NCII image or video. “If an image that has been hashed is edited through cropping, filters added or a video clipped, the original hash may not recognize the image. The new image will need to be hashed separately,” says the FAQ section of the anti-revenge porn tool.
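That limitation is easier to see with perceptual hashing, the family of techniques Meta’s open-sourced image-matching algorithms belong to. The sketch below is an illustration only, using the third-party `imagehash` library as a stand-in rather than StopNCII’s actual pipeline: perceptually similar images produce hashes a small Hamming distance apart, but heavier edits push the distance past any reasonable matching threshold, which is why an edited image “will need to be hashed separately.”

```python
# Illustrative sketch (not StopNCII's actual pipeline): compare a perceptual
# hash of an original image against an edited copy. Light edits keep the
# Hamming distance low; heavier edits (crops, filters) can break the match.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

MATCH_THRESHOLD = 10  # illustrative cutoff, in differing bits of a 64-bit hash

def perceptual_fingerprint(path: str) -> imagehash.ImageHash:
    """Return a 64-bit perceptual hash of the image at `path`."""
    return imagehash.phash(Image.open(path))

def likely_same_image(path_a: str, path_b: str) -> bool:
    """True if the two images are perceptually close enough to count as a match."""
    distance = perceptual_fingerprint(path_a) - perceptual_fingerprint(path_b)
    return distance <= MATCH_THRESHOLD

# Example: a lightly compressed copy may still match; a heavily cropped or
# filtered copy may not, and would need to be hashed and submitted again.
# print(likely_same_image("original.jpg", "edited_copy.jpg"))
```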

Source: https://screenrant.com/facebook-instagram-revenge-porn-tool-hash/


