Taylor Swift recently topped the trending lists on social media platforms because of nude-photo deepfakes. A dangerous phenomenon, and not just for #Popstars: in theory, anyone can become the victim of this kind of manipulation.
Sophie Maddocks: There’s a historian, Jessica Lake, and she’s done some really interesting research tracing the potential origins of the creation of fake nude images. She talks a lot about the rise of photography in the late 19th century, and writes about an example of face-swapping in late-19th-century photography where images of the faces of high-society women were pasted onto nude bodies and then circulated. And not only is that one possible starting point when thinking about the history of fake nudes, it’s also an interesting starting point for how we see the creation of A.I.–generated fake nudes. Fake nudes first went viral in the online sense in 2017, when deepfake videos emerged in which the faces of individuals were digitally pasted onto the bodies of adult film actors, almost exactly mimicking what had been done in the late 19th century with photography.
So there is a long history to this harm, but I think there is that long-standing desire to produce fake nude images—almost exclusively of women. With the rise of the internet, we’ve seen ways of creating and sharing ever more photorealistic images—until we get to the last year with the rise of video- and image-generation models that create extremely realistic imagery and A.I. tools trained on millions of images of girls and women scraped from the internet without their consent. You can either use a text prompt or an existing image to produce a very realistic fake nude.
So A.I. has increased the volume and severity of this problem on the internet.
Absolutely. In 2017, when activists and the first people affected by A.I.–assisted deepfakes, like famous actors and singers, started to raise the alarm about this issue, they really gave us a roadmap for what would happen.
Whenever I hear the phrase 'move on', what I hear is 'this is inconvenient for those in authority'. Those little shits should have been expelled and prosecuted.
"...One 14-year-old girl told the NSPCC’s ChildLine service last year that a group of boys made fake explicit sexual images of her and other girls and sent them to group chats. The boys were excluded... but returned, and the girls were told to move on, which they struggled to do..."
#AI #GenerativeAI #GeneratedImages #DeepFakes: "Microsoft has introduced more protections to Designer, an AI text-to-image generation tool that people were using to make nonconsensual sexual images of celebrities. Microsoft made the changes after 404 Media reported that the AI-generated nude images of Taylor Swift that went viral on Twitter last week came from 4chan and a Telegram channel where people were using Designer to make AI-generated images of celebrities.
"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson told us in an email on Friday. "Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring and abuse detection to mitigate misuse of the system and help create a safer environment for users.”"
Nonconsensual sexually explicit deepfakes of global pop star Taylor Swift went viral this week on the platform X. The images were viewed over 27 million times after they were shared on Wednesday. Fortunately for the singer-songwriter, her many fans came to the rescue by mass-reporting the images as "Protect Taylor Swift" began to trend on X. But what about the everyday victims of these online attacks? NBC News reports:
We should not be alarmist about AI and elections. But we should also not be naive and pretend the techniques are not there and that bad actors would not deploy them. Exhibit #...
How #AI has been helping criminals who use #deepfakes and voice cloning for financial scams, forcing banks and #fintechs to invest in AI to counter fraud.
A New Jersey high schooler and victim of nonconsensual sexually explicit deepfakes spoke out this week and said, "I’m here, standing up and shouting for change, fighting for laws so no one else has to feel as lost and powerless as I did." Currently, there are no federal laws banning the creation or distribution of nonconsensual sexually explicit deepfakes. However, a bill that could one day criminalize their creation is stalled in the House.
When called down to the principal's office last October, high school student Francesca Mani learned that someone had taken online pictures of her and used artificial intelligence to generate fake nudes that were then shared on social media.