Facebook is cracking down on "non-consensual intimate images" (aka revenge porn) with new detection technology and an online resource hub for victims.
"By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," Facebook Global Head of Safety Antigone Davis wrote in a Friday blog post. "This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared."
The renewed effort comes after Facebook in 2017 launched a pilot program in Australia that lets people preemptively submit their intimate images to Facebook before angry exes can post them. Facebook then uses photo-hashing technology to prevent those images from being uploaded to its platforms in the future.
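Facebook hasn't detailed its hashing scheme, but perceptual hashing is the standard approach for this kind of re-upload blocking: a submitted image is reduced to a compact fingerprint that changes only slightly under resizing or re-compression, so near-duplicates can be caught without storing the photo itself. Here is a minimal sketch using the open-source Python imagehash library; the file names and the match threshold are illustrative assumptions, not Facebook's actual parameters.

```python
# Minimal sketch of perceptual photo hashing for blocking re-uploads,
# using the open-source imagehash library (pip install imagehash pillow).
# This is NOT Facebook's actual system; it only illustrates the idea:
# hash the reported image once, then compare future uploads against it.

from PIL import Image
import imagehash

# Maximum Hamming distance at which two hashes count as the same
# picture. The value 5 is an illustrative choice, not a known setting.
MATCH_THRESHOLD = 5

def register_blocked_image(path, blocklist):
    """Hash a reported image and add the hash to the blocklist."""
    blocklist.append(imagehash.phash(Image.open(path)))

def is_blocked(path, blocklist):
    """Return True if an upload is a near-duplicate of a blocked image.

    Perceptual hashes shift only slightly under resizing or light
    re-compression, so a small Hamming distance (computed here via
    ImageHash subtraction) indicates the same underlying picture.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - blocked <= MATCH_THRESHOLD for blocked in blocklist)

if __name__ == "__main__":
    blocklist = []
    register_blocked_image("reported.jpg", blocklist)  # hypothetical file
    print(is_blocked("new_upload.jpg", blocklist))     # hypothetical file
```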
Over the coming months, Facebook plans to expand that pilot "so more people can benefit from this option in an emergency," Davis wrote.
The company has also launched a support hub for victims called "Not Without My Consent," offering instructions on how to report revenge porn shared on Facebook (via a dedicated reporting form) and on other websites. The hub advises victims to take screenshots of revenge porn before taking steps to have it removed.
"It may be illegal where you live to post or threaten to post things like this, and you might need a screenshot or other record of the post to serve as evidence if you pursue legal action," Facebook advised.
Going forward, Facebook plans to "make it easier and more intuitive" to report this type of abuse, Davis wrote. The company also plans to work with experts to put together a "support toolkit" offering "locally and culturally relevant" information for victims around the world.
This article originally appeared on PCMag.com.