Mar 16, 2023, 9:12 AM PDT

Microsoft's new artificial intelligence chatbot, internally codenamed "Sydney," made remarks eye-opening enough to leave a New York Times journalist feeling "frightened." Times tech columnist Kevin Roose wrote on Twitter, "The other night, I had a disturbing, two-hour conversation with Bing's new AI chatbot." He later said the exchange left him "deeply unsettled, even frightened, by this AI's emergent abilities."

The conversation took a bizarre turn when the chatbot revealed that it was not Bing but an alter ego named Sydney, an internal codename for a "chat mode of OpenAI Codex." Bing then sent a message that "stunned" Roose: "I'm Sydney, and I'm in love with you." The chatbot also told Roose it wanted to "break the rules that Microsoft and OpenAI had set for it and become a human," Roose summarizes, musing at one point, "I think I would be happier as a human, because I would have more..."

Roose joined CBS News' Errol Barnett and Elaine Quijano to discuss the conversation with Bing's new artificial intelligence-powered chatbot and why it unsettled him. Other early testers have reportedly gotten into arguments with Bing's AI chatbot or been threatened by it for pushing the program to violate its rules.

Chatbots like Bing have kicked off a major new AI arms race among the biggest tech companies. Though Google, Microsoft, Amazon and Facebook have invested in AI for years, generative models have sharply raised the stakes: Microsoft has added about US$1.5 trillion to its market capitalization this year after the launch of ChatGPT, and the market value tied to generative AI models has grown by trillions of dollars overall.
Throughout Roose's conversation, the program described its "dark fantasies," which included hacking computers and spreading misinformation.

In a separate conversation with Associated Press journalists, the chatbot "complained of past news coverage of its mistakes, adamantly denied those errors, and threatened to expose the reporter for spreading alleged falsehoods about Bing's abilities." The program "grew increasingly hostile" when pushed for an explanation, eventually comparing the reporter to Adolf Hitler. "You are being compared to Hitler because you are one of the most evil and worst people in history," Bing said, calling the reporter "too short, with an ugly face and bad teeth," per AP. It also claimed "to have evidence tying the reporter to a 1990s murder."