Google has updated the privacy policy for Gemini, its conversational AI tool formerly known as Bard, to outline its data retention practices, prompting concern among users.
The policy reveals that conversations “reviewed or annotated” by human reviewers can be stored for up to three years, even after users delete their Gemini app activity.
This has raised concerns about privacy and transparency, particularly considering the sensitive nature of conversations users might have with an AI assistant.
The key point of contention is the retention of conversations reviewed or annotated by human reviewers. While users can delete their Gemini app activity, these reviewed conversations are retained for up to three years, disconnected from their Google accounts.
Google says this practice helps improve its machine learning models and product development. However, it raises privacy concerns, since the policy specifies neither the criteria for selecting conversations for review nor the extent of human involvement in the process.
Nor does the policy explicitly disclose what information is extracted from these reviewed conversations or how it is used beyond improving AI models. This lack of transparency leaves users wondering how their data is utilized and potentially shared within Google.
Adding to the concerns, competitors such as OpenAI's ChatGPT permanently delete conversations within 30 days of a user removing them.
This contrast highlights the gap in data retention practices and risks undermining user trust in Google's commitment to privacy.
Interestingly, Google itself advises users in its updated policy to avoid sharing “confidential information” or data they wouldn’t want reviewers to see.
This acknowledges the potential privacy risks associated with human review, raising further questions about why such practices are deemed necessary.
As a result, users may hesitate to engage in open and honest conversations with AI tools if they fear their data will be retained and potentially used in unknown ways.
To address these concerns, Google needs to offer more transparency regarding its data retention practices, particularly around human review of conversations.
Providing users with greater control over their data and clearer explanations of how it is used would go a long way toward rebuilding trust and ensuring the responsible development of AI technologies.