AI-Powered Bing Makes Errors in Demo Just Like Rival Bard

Written by: Ivana Shteriova
Bing’s first demo was filled with factual errors, just like Google’s now-infamous Bard ad. Independent AI researcher Dmitri Brereton was among the first to fact-check Bing’s AI answers and found that they contained more inaccurate information than accurate information.

“Bing AI can’t be trusted,” he concluded in his blog post, adding that “Microsoft knowingly released a broken product for short-term hype.”

When asked about pet vacuums, Bing summarized the pros and cons of the Bissell Pet Hair Eraser Handheld Vacuum, listing its short 16-foot cord as a drawback. “It doesn’t have a cord,” Brereton pointed out.

In a later update, he shared that there is a corded version of this product with the same name. “Presumably Bing intended to describe the best-selling version, which is the cordless version that it cited. But instead it got confused and described the corded version.”

It’s notable that even Brereton, an AI researcher himself, had trouble finding the missing pieces of the puzzle. This highlights how easily AI can mislead the general public, even when someone takes the time to fact-check the results. Bing’s answer also claimed things that didn’t appear in the referenced sources.

When it came to Mexico City’s nightlife, ChatGPT-powered Bing provided a 5-day trip itinerary with recommendations for bars and clubs. The descriptions of these places were either false or missing important information. For example, Bing claimed users could reserve a table online at a bar that didn’t even have a website.

Microsoft’s AI search engine was also asked for a summary of Gap Inc.’s 2022 Q3 fiscal report. A group of student researchers fact-checked the numbers and discovered that “all the key figures in the generated summary are inaccurate.” Bing also stated facts that were nowhere to be found in the cited source. When asked to compare Gap’s financial report with Lululemon’s in a table, the academic researchers found that Bing’s answer was “half wrong.”

Both Microsoft and Google admit their AI chatbots can often be wrong and should be fact-checked. Experts argue that both chatbots were not ready for public release and that the companies must take the time to address issues rather than put the responsibility on the users.
