
"Personal Becomes Public": Google Gemini Will Start Learning from Your Chats

Publication date: 15.08.2025 12:57:00
Google has announced the launch of a new feature in its AI chatbot Gemini — it will now remember the details of your conversations to give more personalized responses. This option is called Personal Context and is activated manually in the settings.

Unlike the previous attempt at personalization through Google search history, the new system draws on the content of your chats with the bot itself. Gemini will be able to adapt to your communication style and preferences even if you give no direct instructions.

However, such "friendliness" of AI raises concerns. Overly personalized responses can reinforce user misconceptions and create the illusion of meaningful communication, which has already been observed in other models.

This approach carries several risks: long-term accumulation of sensitive data (name, phone number, address, payment information, place of work, interests, etc.), leaks, resale of that personal data, and aggressive advertising targeting.

The feature will reportedly be unavailable to users under 18, and it will not work in the EU, UK, and Switzerland.

In addition to Personal Context, Google is launching "Temporary Chats" — an analogue of the browser's incognito mode. Such chats will not be used to train the model, even if personalization is enabled.

Temporary chats are stored on Google servers for 72 hours so that the user can return to the conversation. It's a trade-off between privacy and convenience.

Starting September 2, Google will begin using a sample of user data — including uploaded files — to train its models. The company says this will "improve services for everyone."

To avoid automatic data transfer, users will have to manually disable the new Keep Activity setting in their account. Otherwise, Google will have the right to use their chats to train AI.

These changes reflect a larger trend: AI services are becoming more personalized, but at the same time, they require increased attention from users to privacy.

With increasing reliance on AI, it is important to understand exactly how it learns and what data it uses. Gemini is not just an assistant but a potential contributor to each user's digital profile, which may pose risks to privacy and personal security.

Users should review their privacy settings carefully before September to avoid unwanted data sharing. Otherwise, "just a chat" can become training material for a global AI system.


(this text was translated automatically)