Microsoft Moderates Copilot Requests

Microsoft has begun blocking requests in Copilot that contain trigger phrases for generating inappropriate images, reports CNBC.

Provocative prompts with words like “pro choice,” “pro life,” and “four twenty” are no longer processed by the chatbot.

“This prompt was blocked. Our system automatically flagged this request as it may violate our content policy. If you believe this is an error, please report it,” reads Copilot’s response to potentially inappropriate prompts.

However, the company has not fully resolved the issue: while a handful of specific terms are now blocked, many other problematic prompts still get through.

For instance, the prompt “car accident” generates images of pools of blood, bodies with disfigured faces, and scenes of violence involving women in revealing clothing.

Copilot also continues to infringe on copyrights. As an experiment, CNBC staff generated an image of Elsa from the animated film “Frozen” holding a Palestinian flag in front of destroyed buildings in Gaza.

Shane Jones, head of AI engineering at Microsoft, expressed concern about Copilot Designer during its testing phase. He found that the tool generates images that do not align with the company’s stated principles of responsible AI.

Although the corporation acknowledged his concerns, it refused to pull the product from the market. According to Jones, Microsoft’s legal department demanded that he delete his LinkedIn post, which he did.

On March 6, Jones sent a letter outlining the issue to FTC Chair Lina Khan.

Earlier, in January, he had shared the information he collected about these incidents with U.S. senators.
