Microsoft has announced a partnership with StopNCII to remove non-consensual intimate images, including deepfakes, from its Bing search engine. StopNCII creates a digital fingerprint (hash) of an intimate image or video, which is then shared with participating industry partners so they can identify and remove matching content that violates their policies. Several tech companies, including Meta, TikTok, and Reddit, have agreed to work with StopNCII to scrub intimate images shared without permission. Google is not participating in the effort and instead offers its own tools for reporting non-consensual images.

The US government has also taken steps to address the issue: the US Copyright Office has called for new legislation, and a group of senators introduced the NO FAKES Act in July.

Victims can open a case with StopNCII or Google to address non-consensual image-sharing. Those under 18 can file a report with NCMEC (the National Center for Missing & Exploited Children). The process applies to AI-generated deepfakes as well as real images and videos.
www.engadget.com