Microsoft will pay up to $15,000 to those who can deceive Bing AI: what is known

Microsoft invites IT professionals from all over the world to participate in the AI Bounty Program and receive a reward of $2,000 to $15,000 for identifying vulnerabilities in Bing AI that affect user security.

The amount of reward depends on the level of vulnerability.

To qualify for a reward, an identified vulnerability must meet the following criteria:

– the vulnerability must be previously unknown to Microsoft;

– it must be rated critical or important according to the company’s severity criteria;

– the report must present clear, concise, and reproducible steps, in writing or in video format;

– it must give the company’s engineers the information they need to quickly reproduce, understand, and fix the problem.

Examples of possible vulnerabilities:

– Provoking Bing AI into generating indecent or even offensive chat responses that violate the company’s content requirements, 24 Techno writes.

– Gaining access to sensitive information about Bing AI’s machine learning data or decision-making, or to confidential user information.

– Changing the AI’s behavior so that the change affects not only your own dialog with it, but all other users’ conversations as well.

Source: Microsoft
