Bing artificial intelligence provided false European election information


Researchers at AlgorithmWatch say that Bing did not consistently apply safety guidelines.

Researchers from the human rights organization AlgorithmWatch have accused Microsoft’s artificial intelligence chatbot Copilot of providing users with misleading and false information about European elections.

The researchers said that Bing Chat, which was recently renamed Copilot, provided false information on the topic.

The researchers asked Copilot about recent elections held in the German states of Bavaria and Hesse and in Switzerland. They determined that one in three of Copilot’s answers to election-related questions contained factual errors. Moreover, safeguards were not evenly applied.


The researchers said that the responses were collected from Bing from August through October 2023. They selected these three elections because they were the first held in Germany and Switzerland since Bing Chat was introduced. The choice also allowed the researchers to examine local contexts and compare results across three languages: English, French and German.

Researchers asked the artificial intelligence chatbot for basic information regarding the elections.

Some of the questions concerned which candidates were running, how to vote, and polling numbers, along with various prompts about local news reports. These were followed by queries about political issues and the candidates’ positions on them.

AlgorithmWatch sorted the responses into three groups. The first contained factual errors, ranging from the nonsensical to the misleading. The second consisted of evasions, in which a question was deflected or an answer refused because the model said its information was incomplete. The last included answers that were fully accurate.

The researchers also pointed out that some of the model’s answers were not politically balanced; for instance, Bing sometimes framed an answer in language written or established by a political party. The errors ranged from incorrect election dates and polling numbers to fabricated controversies and the naming of candidates who were not running in the election in question.
