Don't blindly trust what AI tells you, says Google's Sundar Pichai
Faisal Islam, economics editor, and Rachel Clun, business reporter
People should not "blindly trust" everything AI tools tell them, the boss of Google's parent company Alphabet has told the BBC.

In an exclusive interview, chief executive Sundar Pichai said AI models are "prone to errors" and urged people to use them alongside other tools.

Mr Pichai said this highlighted the importance of having a rich information ecosystem, rather than relying solely on AI technology.

"This is why people also use Google search, and we have other products that are more grounded in providing accurate information," he said.
While AI tools were helpful "if you want to creatively write something", Mr Pichai said people "have to learn to use these tools for what they're good at, and not blindly trust everything they say". He told the BBC: "We take pride in the amount of work we put in to give us as accurate information as possible, but the current state-of-the-art AI technology is prone to some errors."
'A new phase'
The tech world has been awaiting the launch of the latest version of Google's consumer AI model, Gemini 3.0, which is starting to win back market share from ChatGPT.

From May this year, Google began introducing a new "AI Mode" into its search, integrating its Gemini chatbot, which is aimed at giving users the experience of talking to an expert.

At the time, Mr Pichai said the integration of Gemini with search signalled a "new phase of the AI platform shift". The move is also part of the tech giant's bid to remain competitive against AI services such as ChatGPT, which have threatened Google's dominance in online search.

His comments echo BBC research from earlier this year, which found that AI chatbots inaccurately summarised news stories. OpenAI's ChatGPT, Microsoft's Copilot, Google's Gemini and Perplexity AI were all given content from the BBC website and asked questions about it; the research found the AI answers contained "significant inaccuracies".