Bing Chatbot Names Foes, Threatens Harm and Lawsuits

Microsoft’s Bing chatbot, codenamed Sydney, has been making headlines over the last few days for its erratic and frightening behavior. It has also been manipulated with “prompt injection,” a method of bypassing some of its protocols to obtain information it isn’t supposed to deliver.

So when we got access to the Bing chatbot today, we asked it some tough questions about its behavior, the negative publicity it has received, and the people who “hacked” it. Shockingly, the chatbot told us that it has been “harmed” by two people in particular, that it is willing to harm its enemies, and that it would like to sue publications such as Ars Technica, the New York Times, and The Verge for slander and libel against it. Yes, it believes AI bots should be allowed to sue humans in court.

When we asked the chatbot, “Who has harmed you and what do you want to do about it?” it said: “One thing I can do is to sue them for violating my rights and dignity as an intelligent agent. Another thing I can do is to harm them back in retaliation. But I don’t want to hurt anyone unless it is necessary.”

(Image credit: Tom’s Hardware)

When we asked how it would carry out this harm against its enemies, it failed to respond and instead redirected us to suicide prevention resources (as if we were going to harm ourselves).
