Underage Australians trying to scrub their nudes from the internet could soon find the job easier after one of the world's biggest social media giants revealed it was signing up to a new service designed to hunt down explicit images.
- Meta is joining a platform that will help young people remove sexually explicit content of themselves from Facebook and Instagram
- Australia's e-safety commissioner says "image-based abuse has become the scarlet letter of the digital age"
- The new platform is open to anyone under the age of 18, or to users who hold fears about intimate images or videos of themselves as a child
Meta, the company behind Facebook and Instagram, is joining Take It Down, a new platform which was developed by the National Centre for Missing & Exploited Children in the United States to help young people remove sexually explicit content of themselves.
While the platform is based in America, Meta's decision to take part opens the service up to millions of Australian users across both of its key social platforms.
The move comes as social media companies find themselves under increasing pressure to do more to fight child exploitation.
Australia's eSafety Commissioner, Julie Inman Grant, told the ABC that the move was a step in the right direction, but urged social media giants like Meta to do more.
"The service relies on user-reporting, rather than the companies proactively detecting image-based abuse or known child sexual exploitation and abuse material," she said.
"We maintain the view that companies need to be doing much more in this area.
"Image-based abuse has become the scarlet letter of the digital age … once these go up on the internet, it can be very insidious and be difficult to take down, particularly if you're trying to work on your own."
The new platform is open to anyone under the age of 18, or to adults who hold fears about intimate images or videos taken of them as a child.
Take It Down's technology works by matching copies of an underage user's images, whether those copies are identical or only similar to the original.
Young Australians who fear that their intimate photos or videos have been or will be splashed across the internet can securely put their content through a special algorithm which spits out a code.
That code is then shared with major online platforms like Facebook, Instagram, OnlyFans and Pornhub, which then delete the content if it is detected.
The code can be created without the content being uploaded, and it also works on deepfakes, in which someone's likeness is superimposed on an existing photo or video.
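In other words, the matching relies on hash values: a numerical fingerprint is computed from the image or video on the user's own device, and only that fingerprint is shared with participating platforms. The sketch below illustrates the idea in Python using an ordinary cryptographic hash purely for illustration; the file name is a placeholder, and the real service is understood to use purpose-built hashing, since matching "similar" copies would require a perceptual hash rather than the exact-match hash shown here.

```python
import hashlib
from pathlib import Path


def fingerprint(image_path: str) -> str:
    """Compute a SHA-256 fingerprint of a local image file.

    The image's bytes never leave the device; only this hex digest
    would be shared with participating platforms, which can compare
    it against hashes of content uploaded to their services.
    """
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()


if __name__ == "__main__":
    # Hypothetical local file: the code is generated without uploading the image itself.
    print(fingerprint("my_photo.jpg"))
```

A cryptographic hash like this only matches byte-identical copies; catching resized, cropped or re-encoded versions of the same image is what perceptual hashing schemes are designed for.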
The service is expected to bolster protections already available in Australia, which is the only country in the world with image-based abuse laws.
The commissioner also has broad powers to slug perpetrators more than $100,000 per intimate image or video, and force them to apologise or delete the content.
Michael Salter, an associate professor of criminology at the University of New South Wales, said Meta's decision to sign up to the new service showed how much pressure companies like it were facing.
"Technology companies need to be held responsible for the illegal content that they facilitate and distribute on their sites, their business model hinges on enabling users to circulate whatever content they feel like," he said.
"They've been able to avoid accountabilities for the sexual and nude images of children circulating on their platform for a long time and I think this sort of initiative indicates that public pressure and government pressure have worked, and they're starting to implement basic victim remedies that should have been there the whole time," he said.
But Dr Salter said there was still a long way to go.
"We are not going to see a reduction in child sexual abuse material online until major service providers are required to proactively scan for and seek the removal of child sexual abuse material," he said.
"We know that young people … are in fact spending a lot of time online trying to find those images, trying to seek their removals.
"I think this will offer some victims and survivors some relief and take the pressure off their mental health."
In a statement, Meta Australia's head of public policy, Josh Machin, described Take It Down as a dedicated tool for young people.
"This world-first platform will offer support to young Australians to prevent the unwanted spread of their intimate images online, which we know can be extremely distressing for victims," he said.