Microsoft's AI Chatbot Provides Inaccurate Election Information, Study Finds

Research reveals that Microsoft's Copilot gave incorrect answers to one in three election-related queries in recent German and Swiss elections, raising concerns about its reliability ahead of the 2024 U.S. elections.

  • Microsoft's AI chatbot, Copilot, has been found to provide inaccurate information about elections, including the upcoming 2024 U.S. elections, according to research by European nonprofits AI Forensics and AlgorithmWatch.
  • Copilot, which is integrated into Microsoft's Bing search engine, was found to give incorrect answers to one out of every three basic questions about candidates, polls, scandals, and voting in recent German and Swiss elections.
  • The inaccuracies were more common when questions were asked in languages other than English, raising concerns about how AI tools built by U.S.-based companies perform abroad.
  • Copilot was found to misquote its sources, provide incorrect polling numbers, list candidates who had withdrawn from the race as leading contenders, and invent controversies about candidates.
  • Microsoft has stated that it is working to correct the issues ahead of the 2024 U.S. elections and encourages users to verify the information provided by the chatbot.