Microsoft Engineer Raises Alarm on Copilot's Offensive Content
A Microsoft engineer has publicly voiced concerns over the safety of the Copilot image generator, citing its production of offensive and inappropriate content.
- Microsoft engineer Shane Jones has publicly raised concerns about the safety of the Copilot image generator, urging the Federal Trade Commission (FTC) to intervene.
- Jones's testing revealed that Copilot Designer's safety measures can be bypassed, allowing it to produce offensive images, including sexualized depictions and violent imagery.
- According to Jones, Microsoft has not taken significant action despite his repeated attempts to raise these issues internally.
- Jones's concerns highlight systemic flaws in Copilot's design and the need for better safeguards.
- The controversy raises broader questions about the responsibility of AI developers to ensure their products do not perpetuate harmful stereotypes or infringe on copyrighted material.