Google restricts AI chatbot election answers


Google has confirmed it is restricting the types of election-related questions users can ask its artificial intelligence (AI) chatbot, Gemini.

In a blog post, it said the policy had been rolled out in India, which will hold elections starting in April.

It comes as the firm aims to avoid more controversy over AI technology.

Gemini is essentially Google’s version of the viral chatbot ChatGPT. It can answer questions in text form, and it can also generate pictures.

A spokesperson for Google told the BBC that the latest move was part of plans announced last year around its approach to elections.

“As we shared last December, in preparation for the many elections happening around the world in 2024 and out of an abundance of caution, we’re restricting the types of election-related queries for which Gemini will return responses,” they said.

Elections are due to be held in countries around the world this year including the US, UK and South Africa.

When the BBC asked Gemini questions about those elections it responded each time with: “I’m still learning how to answer this question. In the meantime, try Google Search.”

However, when a series of follow-up questions about Indian politics was put to Gemini it did supply more detailed responses about the country’s major parties.

Developments in generative AI have raised concerns about misinformation and prompted governments around the world to move to regulate the technology.

In recent weeks India has told technology companies that they need government approval before releasing AI tools that are "unreliable" or still being trialled.

In February, Google apologised after its recently launched AI image generator created an image of the US Founding Fathers which inaccurately included a black man.

Gemini also generated images of German soldiers from World War Two that incorrectly featured a black man and an Asian woman.

The firm immediately "paused" the tool, writing in a blog post that it was "missing the mark".