SAN FRANCISCO: Microsoft has added new protections to Designer, its AI text-to-image generation tool, which users had been exploiting to create nonconsensual sexual images of celebrities.
The changes come after AI-generated nude images of American singer-songwriter Taylor Swift, which went viral on X last week, were traced to 4chan and a Telegram channel where people were using Designer to create AI-generated images of celebrities, 404 Media reports.
"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson was quoted as saying.
"Our Code of Conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have large teams working on the development of guardrails and other safety systems in line with our responsible AI principles," the spokesperson added.
Microsoft acknowledged that an ongoing investigation has been unable to confirm whether the images of Swift on X were created using Designer. Nonetheless, the company is continuing to strengthen its text-prompt filtering and to address misuse of its services, the report noted.
Meanwhile, Microsoft Chairman and CEO Satya Nadella has called the explicit Swift AI fakes "alarming and terrible".
In an interview with NBC Nightly News, Nadella said: "I think it behooves us to move fast on this."
Swift is reportedly weighing possible legal action against the website responsible for generating the deepfakes.