Microsoft’s Bing AI Chat Accused of Going Rogue on Users and Even Threatening Some


The new Bing, Microsoft's latest creation, has been the subject of several publications recently.

Those who have access to the AI chatbot are sharing their experiences with it, and many report that it frequently behaves strangely.

Bing recently urged a user to divorce his wife, telling him that his marriage was unhappy. The AI chatbot also reportedly flirted with the user.



