As AI tools see widespread adoption, AI browser extensions have become a go-to productivity boost for many users, but the privacy risks behind them are quietly escalating. A recent research report from Incogni, a data deletion service, reveals that over 50% of the AI Chrome extensions it sampled collect user data, and nearly one-third access personally identifiable information (PII).
The study analyzed 442 extensions labeled "AI," with combined downloads of 115.5 million. The report indicates that programming assistants, math helpers, meeting assistants, and voice-to-text extensions pose the highest risk. These extensions often request the "scripting" permission, which lets them read users' input in real time or alter how web pages are displayed, affecting about 92 million users.
Some well-known tools also appear on the risk list. The study notes that Grammarly and the AI content detection tool QuillBot are flagged as potentially damaging to privacy, partly because of their very large user bases. The survey also listed the ten extensions ranking highest for combined risk likelihood and potential impact, including Google Translate and ChatGPT Search.
AIbase reminds users that the key to spotting risk is checking whether the permissions an extension requests match its core functionality. For example, a writing assistant that requests precise location data is highly suspicious. The rule of thumb is simple: once personal data leaves the local device without being anonymized, the extension should be treated as an "unacceptable risk." While enjoying the convenience, users need to carefully assess the permissions they grant to avoid unintentionally leaking personal information.
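As a concrete way to apply that check, here is a minimal sketch in Python. It assumes a Linux Chrome profile path and an illustrative (not exhaustive) list of sensitive permissions; neither comes from the Incogni report. The script reads the manifest.json of each locally installed extension and flags declarations that commonly enable data collection:

```python
import json
from pathlib import Path

# Illustrative set of permissions that can expose page content,
# browsing behavior, or PII. Not an official or exhaustive list.
SENSITIVE_PERMISSIONS = {
    "scripting",      # inject scripts into pages (can read what you type)
    "tabs",           # read URLs and titles of open tabs
    "webRequest",     # observe network traffic
    "history",        # read browsing history
    "cookies",        # read site cookies
    "geolocation",    # precise location
    "clipboardRead",  # read the clipboard
}
BROAD_HOSTS = {"<all_urls>", "*://*/*", "http://*/*", "https://*/*"}

def audit_manifest(manifest_path: Path) -> None:
    """Print sensitive permissions declared in one extension manifest."""
    manifest = json.loads(manifest_path.read_text(encoding="utf-8-sig"))
    name = manifest.get("name", manifest_path.parent.name)
    flagged = set(manifest.get("permissions", [])) & SENSITIVE_PERMISSIONS
    broad = set(manifest.get("host_permissions", [])) & BROAD_HOSTS
    if flagged or broad:
        print(f"{name}:")
        for p in sorted(flagged):
            print(f"  sensitive permission: {p}")
        for h in sorted(broad):
            print(f"  broad host access:    {h}")

# Assumed default Chrome profile location on Linux; adjust for your OS/profile.
profile_dir = Path.home() / ".config/google-chrome/Default/Extensions"
for manifest in profile_dir.glob("*/*/manifest.json"):
    audit_manifest(manifest)
```

Any flagged permission should then be weighed against what the extension actually does: a grammar checker plausibly needs scripting access to the pages you write on, but has no obvious need for geolocation or clipboard access.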
Key points:
🕵️ Data Collection is Common: Over 50% of AI browser extensions collect user data, and 42% of them use the scripting permission to monitor users' input and browsing behavior.
⚠️ High-Risk Categories Identified: Programming assistants, meeting transcription, and writing assistance tools have the highest privacy risks; well-known tools like Grammarly and Quillbot are also under scrutiny due to the breadth of their data processing.
🛡️ Permission Review Advice: Users should be wary of extensions that request excessive permissions. The basic principle is: if personal data unnecessarily leaves the host device, it crosses a security red line.
