A bot has used the photos of more than 680,000 women to generate deepfake nudes without their knowledge, according to a report by cybersecurity researchers.
The findings, released on Tuesday, revealed that the women – some of them underage – had images of themselves uploaded to the messaging app Telegram, where an ecosystem of bots generated deepfake nude images that could then be distributed on other Telegram channels. Approximately 104,080 of the deepfake nude images were publicly shared on the app, with the rest believed to have been distributed privately.
Sensity, an intelligence firm specialising in deepfakes, says that users were able to have the nudes generated on request, and were primarily uploading images taken, without the subjects' knowledge or permission, from the social media accounts of women they knew.
The channels used to generate and distribute the deepfakes were made up of 101,080 members, around 70% of whom were based in Russia and other eastern European countries. Access to the channels was also reportedly advertised on the popular Russian social media platform, VK.
"Only a single image is needed to operate this technology, and simply by uploading to a chat room […] if you have ever shared publicly one photo on social media, you may be under threat." – Sensity CEO Giorgio Patrini, speaking to Forbes
Researchers say this scheme differs from other deepfake nude generators in that the majority of the victims were private individuals. In the past, non-consensual deepfake pornography has mainly been used to target celebrities, so it is a disturbing development that bots now allow users to 'request' nudes of people they actually know.
Original publication 23 October, 2020
Posted on NatCorn 11th November 2020
Reference to an article does not imply endorsement of any views expressed.