The new Microsoft Bing sometimes distorts the information it finds
Search engines have entered a new stage of development with the integration of artificial intelligence. Instead of browsing a pile of websites to find the information you need, it is presented to you in short (or not so short) AI-generated answers. But that raises a question: how reliable is such a system?
Yesterday, Microsoft announced its updated Bing search engine powered by ChatGPT, which is meant to significantly improve the search experience. And although Microsoft has taken many precautions to avoid a repeat of the Tay situation (the Twitter chatbot that turned racist and misogynistic), the company still warns that some Bing results may be inaccurate. Here is what the company says in an FAQ about the new Bing:
"Bing tries to keep answers fun and factual, but given this is an early preview, it can still show unexpected or inaccurate results based on the web content summarized, so please use your best judgment. Bing tries to keep answers fun and factual, but given this is an early preview, it can still show unexpected or inaccurate results based on the web content summarized, so please use your best judgment. Bing will sometimes misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate. "
And these "errors" are easy to spot: The Verge asked Bing what Microsoft showed today and was told that it showed the new Bing search engine with ChatGPT, which is true, and added that Microsoft demonstrated the search engine's capabilities for "celebrity parodies", which of course was not the case.
Source: The Verge