Tech News

Love Gemini? Google doesn’t want you to get too attached



TL;DR Google is testing reminders to take breaks when you've been using Gemini for long periods.

These pop-ups may also note that you're talking to an AI that isn't human.

The move may be part of an effort to prevent AI dependence or emotional attachment.

AI has become increasingly useful for offloading grunt work and some of the thinking we would typically do ourselves. While this is expected to increase productivity, at least from an industrial standpoint, researchers have also cautioned against addictive AI usage patterns, especially among young adults. To fend this off, AI companies are working on measures to dissuade users from using chatbots obsessively, and Google may be adding its own warnings to Gemini.

Gemini may soon remind you to take a break when you've been talking to it for a long time, and we've spotted signs of this in its Android interface. In version 17.3.59 beta of the Google app (of which Gemini is a part), Google is testing a nudge that appears after you've been interacting with the chatbot for a while. It takes the form of a pop-up warning, similar to the one below:

AssembleDebug / Android Authority

In the messaging, Google stresses that you are talking to an “AI that isn’t human,” reinforcing that the chatbot isn’t a “sentient” being, even if it sounds like one. That reminder is especially relevant as Gemini becomes able to answer more personal questions by pulling information from multiple Google apps and services.

This is likely an effort to keep users from anthropomorphizing AI or forming emotional bonds with it, which has in the recent past led to tragic outcomes, such as a teenager taking their own life after conversations with another AI chatbot.

