
Technology Law | 2 minute read

Data Scraping Allegations Raised in Recent OpenAI Privacy Class Action Lawsuit

On June 28, 2023, a group of anonymous plaintiffs brought a class action lawsuit against OpenAI, alleging the company’s artificial intelligence tools violate privacy protection laws. Specifically, the plaintiffs claim that these AI products use “stolen private information” taken from “hundreds of millions of internet users, including children of all ages, without their informed consent or knowledge.” As a result, plaintiffs allege, OpenAI has violated a number of state and federal laws, ranging from Illinois’ Biometric Information Privacy Act (BIPA) to the federal Electronic Communications Privacy Act (ECPA).

This action marks one of the first AI-related lawsuits not focused primarily on intellectual property infringement. On the same day this lawsuit was filed, two U.S. authors sued OpenAI in the same court (the Northern District of California) in another class action asserting copyright infringement claims, alleging OpenAI had mined data from a number of books without the writers’ consent.

In the 157-page complaint, plaintiffs assert that tech companies “are onboarding society into a plane that over half of the surveyed AI experts believe has at least a 10% chance of crashing and killing everyone on board.” Although this is one of the first privacy-focused AI lawsuits, it is not the first lawsuit concerning data mining and scraping.

In November 2022, the Ninth Circuit ruled in a case involving HiQ Labs, a data science company accused of violating the Computer Fraud and Abuse Act (CFAA) by scraping public user profiles and related information from LinkedIn. This ruling came after the case had been remanded by the U.S. Supreme Court. In short, the six-year litigation ended in a stipulation between the parties prohibiting HiQ Labs from scraping any more data from users’ LinkedIn profiles and requiring it to destroy the algorithms created from the scraped user data.

Among a number of other claims, plaintiffs allege that OpenAI has violated the California Consumer Privacy Act (CCPA) by failing to provide sufficient notice and to honor consumer requests. In particular, the complaint notes the technical impossibility consumers face when attempting to exercise their right to deletion. The complaint reads: “OpenAI fails to disclose that once its AI Products have been trained on an individual’s information, that information has been included into the product and cannot reasonably be extracted.”

While the lawsuit contains some perhaps overbroad warnings about artificial intelligence, this class action poses an interesting legal challenge to OpenAI, particularly in light of the FTC-Edmodo order. As a quick recap, the FTC sued Edmodo, an ed-tech provider, for allegedly violating children’s privacy. Notably, the FTC required the deletion of all “Affected Work Product,” which includes models or algorithms that were developed using personal information collected unlawfully from children.

In light of the LinkedIn and Edmodo outcomes, the real risk to OpenAI here may be an eventual requirement to destroy its trained artificial intelligence models, at least to the extent those models can be shown to have incorporated unlawfully scraped personal information. Further, the outcome of this lawsuit will undoubtedly alter the data practices and training methods employed by AI companies across the internet.

Tags

artificial intelligence, class action, privacy, data scraping, data mining