Monday, July 29, 2024

X allows users to turn off Grok AI training due to data concerns

X, the social media platform formerly known as Twitter, is introducing a new feature that lets users opt out of having their data used to train Grok. Grok is an AI assistant developed by xAI, Elon Musk's AI company, and it learns from user posts and interactions on the platform.

Previously, users were automatically included in this training program, meaning their posts and interactions were used to help Grok improve its responses. However, due to growing concerns about data privacy, X has decided to make participation in this program optional.

Grok was designed to improve its capabilities by analyzing millions of user posts and interactions. This data-driven approach helps the AI identify patterns and refine its responses. However, users raised concerns when it became clear that their content was being used as training data without explicit consent.

In March 2024, privacy advocates in Europe, particularly in Ireland, objected to the use of social media data for training generative AI models. The Irish Data Protection Commission intervened, stating that the practice violated the GDPR, and threatened X with a €20 million fine, prompting the platform to introduce the opt-out feature.

To opt out, users can navigate to the “Privacy and Safety” tab, select “Data Sharing and Personalization,” choose “Grok,” and uncheck the box that permits data usage. Users also have the option to delete their chat history, ensuring that their information is removed from Grok’s training dataset.

Similar opt-out options are available on other major tech platforms such as Facebook and Google, reflecting a broader trend toward respecting user consent in AI training. The move comes as companies seek to reduce media scrutiny and avoid regulatory penalties related to how they use data.


