A Microsoft engineer has raised concerns about the AI image generator, Copilot Designer, created by the tech giant in collaboration with OpenAI. Shane Jones, who has been with Microsoft for six years, has been testing the AI tool in his free time and has discovered disturbing and inappropriate content being generated by the system.
According to Jones, the AI image generator has been producing violent and sexual images, as well as content that may infringe copyright. He described his shock at the images that appeared on his computer while testing the tool, including depictions of demons and monsters alongside imagery related to abortion rights, assault rifles, sexualized women, and underage drinking and drug use.
Despite raising his concerns internally, Jones says Microsoft has not taken appropriate action to address the issue. To escalate the matter, he has sent letters to the chair of the Federal Trade Commission and to Microsoft's board, drawing attention to the problematic content the AI tool is generating.
Jones, who works as a principal software engineering manager at Microsoft’s corporate headquarters, emphasized that he does not work on Copilot in a professional capacity. Instead, he is part of a group of employees and external testers who voluntarily assess the company’s AI technology in their free time to identify potential problems.
Microsoft reportedly responded to Jones's concerns by referring him to OpenAI, its partner in developing the image generator. Jones says he received no satisfactory response from OpenAI either, and subsequently took to LinkedIn to raise the issue publicly.