Microsoft’s AI Tool Generates Harmful Sexual and Violent Images, Engineer Warns
1. Microsoft AI engineer Shane Jones raised concerns about the company’s AI image tool generating violent and sexual images.
2. Jones outlined his concerns about responsible AI in letters to the Federal Trade Commission and Microsoft’s board of directors.
3. He highlighted the potential dangers of Microsoft’s Copilot Designer tool, warning that it can generate unsafe content involving sexualization, conspiracy theories, and drug use.
4. Jones urged Microsoft to remove Copilot Designer from public use until better safeguards are in place.
5. He called for an independent investigation by a Microsoft board committee to ensure the company upholds its responsible AI standards.
6. Jones emphasized the need for collaboration across industries and with governments to raise awareness about AI risks and benefits.
7. Microsoft responded that it appreciated Jones’s efforts in testing its technology and said it is committed to addressing employee concerns.
8. In a recent blog post, Microsoft President Brad Smith discussed the company’s commitment to keeping AI safe for users.
9. The AI race has intensified since the launch of OpenAI’s ChatGPT, driving significant investment and competition among tech giants.
10. Jones’s concerns coincided with Google facing controversy over its own AI products, prompting industry-wide discussions about AI safety and transparency.