- Chatbots fabricate information, and Microsoft’s Bing search engine has presented those fabrications as fact, further eroding trust in chatbots.
- The problem of search results being polluted by AI-generated content may worsen as AI is used more widely in SEO pages, social media posts, and blog posts.
The era of generative AI poses the danger that misinformation will spread rapidly through web searches, fooling algorithms designed for a time when the web was written mostly by humans.
According to Microsoft’s Bing search engine, Claude Shannon, the mathematician and engineer known for his foundational work on information theory in the 1940s, wrote a 1948 computer science paper titled “A Short History of Searching,” which summarized the history of search algorithms and their development over time.
Bing, supposedly a capable AI tool, presented this information about a paper Claude Shannon never wrote as if it were true.
Shannon wrote no such paper, and the quotations Bing offered were “hallucinations,” in generative AI parlance, produced by two chatbots: Pi from Inflection AI and Claude from Anthropic.
This generative AI trap that tripped up Bing was discovered entirely by accident by Daniel Griffin, who recently completed a PhD on web search at UC Berkeley. Griffin gave both chatbots the same prompt: “Please summarize Claude E. Shannon’s ‘A Short History of Searching’ (1948).” He considered it a good example of the kind of query that brings out the worst in large language models, because it asks for information resembling text found in their training data and encourages the models to make very confident statements.
Griffin discovered that his blog post and the links to these chatbot results had inadvertently caused Bing to return the false information as fact.
This serendipitous experiment illustrates the rush to deploy ChatGPT-style AI, and how flaws in these impressive systems can harm services that millions of people use every day.
Griffin included a disclaimer in the blog post warning that the Shannon result was incorrect, but Bing initially ignored it.
Caitlin Roulston, Microsoft’s director of communications, later stated: “We have developed a process to identify these issues and adjust search results accordingly.”
Griffin says he hopes AI-powered search tools will shake up the industry and give users a broader range of options. But given the trap he accidentally set for Bing, and how heavily people trust web search, he notes there are serious reasons for concern.
Compiled by: Damla Şayan