Microsoft employee Shane Jones recently sent a letter to the Federal Trade Commission raising concerns about the safety of the AI tool Copilot Designer. According to Jones, the tool can produce inappropriate images, including depictions of sex, violence, and underage drinking.
In his letter, Jones urges the FTC to educate the public about the risks of using Copilot Designer, particularly the risks it poses to children. He also recommends that Microsoft remove the tool from public use until these issues are addressed.
A Microsoft spokesperson responded that the company is committed to addressing employee concerns in accordance with company policies. Jones had previously raised similar concerns about AI image generators in letters to AI giant OpenAI and to US senators.
Meanwhile, Google has paused access to the image generation feature in Gemini after users raised similar concerns. Jones commended Google for its swift action and is now urging Microsoft to promptly address the issues he has highlighted.
As the debate over the safety and ethics of AI tools continues to grow, it is clear that companies must take responsibility for the potential risks of their products. The concerns raised by Shane Jones are a reminder of the importance of ensuring that AI technology is deployed responsibly and ethically.