
Technology Law


AI Recording/Notetaking Tools Trigger Wave of Lawsuits -- Could Your Business Be At Risk?

The last two years have seen a surge in generative AI-related litigation, including well-known lawsuits against AI companies alleging claims of copyright infringement, trademark infringement, and breach of licensing agreements.  However, there is another use of AI that is at the center of a handful of recent lawsuits—the use of AI-powered recording and transcription tools to record calls or meetings.  There are a number of legal risks associated with using AI-powered tools to monitor, record, or make transcriptions of calls or meetings.  Yet, many businesses have implemented these tools without taking the necessary steps to minimize their risk and protect themselves.  Here are two recent class-action lawsuits involving this technology, and tips for what you can do to protect your business.

Recent Litigation

Lisota v. Heartland Dental, No. 25-cv-7518 (N.D. Ill. July 3, 2025)

On July 3, 2025, a class action lawsuit was filed against Heartland Dental, Inc. (“Heartland Dental”) and RingCentral, Inc. (“RingCentral”) in the Northern District of Illinois alleging wiretap violations stemming from Heartland’s use of RingCentral’s AI product.  Specifically, plaintiff Megan Lisota alleged that dental support organization Heartland Dental used a third-party AI service provided by RingCentral to listen to, analyze, and transcribe calls with patients, in violation of the Federal Wiretap Act (18 U.S.C. § 2511). 

Heartland Dental provides non-medical support services (e.g., administrative, IT, accounting, etc.) to more than 1,700 dental practices nationwide.  RingCentral is an Internet-based telephone provider that developed AI software that allows for (i) real time voice transcription, (ii) call highlights, (iii) automated call summaries, and (iv) sentiment voice analysis to determine a caller’s tone.  As part of its services, Heartland Dental upgraded its dental partners’ phones to RingCentral’s AI-powered platform. 

The plaintiff alleges that Heartland Dental allowed RingCentral to eavesdrop on calls and analyze them using artificial intelligence in real-time in violation of the Federal Wiretap Act.  According to the plaintiff, patients calling their local dental offices have no idea that (and do not consent to) a third party—in this case, RingCentral—listening in on and analyzing their calls. 

Plaintiff’s complaint also notes that the conduct at issue here was particularly troublesome for two reasons: first, because of the nature of the calls intercepted (calls from patients that involve health information), and second, because RingCentral used the calls to train its AI model (which RingCentral specifically allows for in its privacy policy).  Plaintiff claims statutory damages under the Federal Wiretap Act (the greater of $100 for each day of violation or $10,000, per class member) on behalf of herself and all U.S. residents who made or received a phone call to/from Heartland Dental and/or a Heartland Dental-managed clinic that was processed by RingCentral—a number that could be in the many thousands. 
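To make the scale of that potential exposure concrete, here is a rough back-of-the-envelope sketch of the Wiretap Act’s statutory-damages formula as the complaint describes it. The class size and days of violation below are purely hypothetical numbers chosen for illustration, not figures from the case:

```python
# Rough, hypothetical illustration of the Federal Wiretap Act's
# statutory-damages formula as described in the complaint: the
# greater of $100 per day of violation or $10,000, per class member.

def wiretap_statutory_damages(days_of_violation: int) -> int:
    """Statutory damages for a single class member."""
    return max(100 * days_of_violation, 10_000)

# Hypothetical example: a class of 5,000 patients, each with
# 30 days of alleged violations (both numbers are invented).
per_member = wiretap_statutory_damages(30)  # max($3,000, $10,000) = $10,000
class_size = 5_000
total = per_member * class_size
print(f"Per member: ${per_member:,}; hypothetical class total: ${total:,}")
```

Even with modest assumptions, the per-member floor of $10,000 means aggregate exposure grows linearly with class size, which is why class certification is such a high-stakes issue in these cases.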

Galanter v. Cresta Intelligence, No. 3:25-cv-05007 (N.D. Cal. June 13, 2025)

On June 13, 2025, Judy Galanter filed a putative class action lawsuit against Cresta Intelligence, Inc. (“Cresta”) in the Northern District of California, alleging that Cresta used its “conversation intelligence software-as-a-service” tool to monitor, record, and analyze her calls with United Airlines without her knowledge or consent, in violation of the California Invasion of Privacy Act (“CIPA”, which we have blogged about extensively).

Plaintiff Galanter is a California resident who called United Airlines for assistance with her travel arrangements.  She seeks to certify a class of all California residents who called United Airlines whose conversations were intercepted and recorded by Cresta.

Cresta is a technology company that offers a “conversation intelligence” SaaS product that transcribes businesses’ live telephone conversations with customers and prompts agents with real-time suggestions, such as objection responses, expectation setting, and troubleshooting FAQs. 

According to Plaintiff, unbeknownst to customers, when they contact and speak with businesses such as United Airlines, Cresta “eavesdrops” on and records those conversations to which it is not a party, and then performs AI analyses on those conversations. 

For example, Cresta’s Agent Assist provides customer service agents with “behavioral guidance,” which includes summaries of calls, actions taken, and required follow ups, created through its speech-transcription software.  According to the company’s website, Cresta’s Conversation Intelligence “transforms contact center management with deeply actionable insights, hyper-efficient quality management, and outcome-driven coaching.”

While United’s customers who call are told their calls “may be monitored or recorded for quality purposes”, they are not informed that their call data will be shared with a third-party—in this case Cresta—for purposes totally unrelated to quality assurance or training.

In prior CIPA cases, some defendants have raised the so-called “party exception” to this sort of CIPA claim.  Under this exception, parties cannot be held liable under CIPA § 631(a) for eavesdropping on their own conversations.  Some courts analyzing these CIPA cases have determined that the party exception applies to third-party entities that provide website tracking software—in these situations, the courts characterize the third-party technology vendor as an “extension” of the company’s website, rather than as a separate entity liable for eavesdropping.  That is, the third-party vendor is acting simply as a “tape recorder” that the website deploys.  Courts generally apply the party exception in situations where the third-party technology vendor does not or cannot access or use the data for the vendor’s own purposes. 

The complaint explains how in this instance, Cresta does not operate like a “tape recorder” that United Airlines would use to record the conversation.  That is because Cresta is not simply collecting the content of the conversation to provide to United.  Instead, Cresta is actually analyzing the data before providing it to United, thus acting as a separate entity and not just an extension of United.  In addition, the complaint details how Cresta has the capability to use the contents of the conversations it collects for its own purposes (for example, training its own AI) beyond simply sharing the information collected with United.  These statements in the complaint are important, because they show the Plaintiff arguing that Cresta is not exempt from liability under the party exception.  Galanter also asserts that Cresta knows that United’s customers are unaware of Cresta’s monitoring, and that Cresta has not obtained the required consent of conversation participants.

Galanter specifically asserts violations of CIPA §§ 631(a) and 632(a).  Galanter alleged that Cresta violated the first and second prongs of Section 631(a), which imposes liability on “any person who, by means of any machine, instrument, or contrivance, or in any other manner, intentionally taps, or makes any unauthorized connection, whether physically, electrically, acoustically, inductively, or otherwise, with any telegraph or telephone wire, line, cable, or instrument, including the wire, line, cable, or instrument of any internal telephonic communication system, or who willfully and without the consent of all parties to the communication, or in any unauthorized manner, reads, or attempts to read, or to learn the contents or meaning of any message, report, or communication while the same is in transit or passing over any wire, line, or cable, or is being sent from, or received at any place within this state”. 

Galanter also alleged that Cresta violated CIPA § 632(a), which imposes liability on any “person who intentionally and without the consent of all parties to a confidential communication, uses an electronic amplifying or recording device to eavesdrop upon or record the confidential communication, whether the communication is carried on among the parties in the presence of one another or by means of a telegraph, telephone, or other device, except a radio”. 

CIPA Sections 631(a) and 632(a) both provide for statutory damages of up to $5,000 per violation.  Depending on how many California residents called United Airlines in the year preceding the complaint filing, those statutory damages could amount to millions of dollars.
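As a rough, hypothetical sketch of how that CIPA exposure scales with class size (the caller count below is invented purely for illustration):

```python
# Hypothetical illustration of maximum CIPA statutory exposure:
# up to $5,000 per violation under §§ 631(a)/632(a), as the
# article describes.  The caller count is an invented example.
CIPA_MAX_PER_VIOLATION = 5_000

def cipa_max_exposure(num_recorded_calls: int) -> int:
    """Upper-bound statutory exposure for a given number of recorded calls."""
    return num_recorded_calls * CIPA_MAX_PER_VIOLATION

# If, hypothetically, 10,000 California residents' calls were recorded:
print(f"Maximum hypothetical exposure: ${cipa_max_exposure(10_000):,}")
```

At $5,000 per call, even a mid-sized call center’s annual volume quickly reaches the “millions of dollars” range the article describes.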

One notable difference in Cresta Intelligence, as compared to Heartland Dental, is that the plaintiff only sued the company actually providing the AI product, and not the company using the AI product (in this case, United Airlines). However, these cases show that both the technology vendors and companies implementing these AI technologies must be aware of the potential legal risks.

What Should You Do?

Regardless of what happens in the specific lawsuits discussed above, these types of lawsuits will continue to proliferate.  There are a number of things that companies considering using AI transcription or recording tools can do to minimize their legal risk.

1. Update Your Privacy Policy:  Your privacy policy should specifically reference any AI transcription or recording software that your company deploys, and what data it is collecting so there is no discrepancy between what your policy says your company is doing, and what you are actually doing.

2. Obtain Consent: If possible, obtain explicit consent from each participant before beginning the transcription or recording, and allow users who do not want to be recorded the opportunity to opt out.  You should document this consent.  If you plan to use AI-recording tools solely for internal use (e.g., employees recording meetings, creating meeting minutes, etc.), you should specifically reference the use of the tool in your employee handbook, and make sure every employee receives and reviews that notice. 

3. Develop Policies and Restrict Use:  Even if your company hasn’t officially adopted an AI-transcription tool, your employees might already be using one.  It is crucial to develop clear company-wide policies and guidelines related to the use of this technology.  In addition, you should consider prohibiting certain employees or company groups from using AI transcription or recording.  For example, recording a call among company lawyers might result in the waiver of attorney-client privilege.

4. Check Your Vendor’s Settings & Terms of Use:  Closely review your AI-transcription vendor’s privacy policy, terms of use, and settings.  In addition to confirming that your vendor is aware of and complies with applicable privacy laws and standards, you should confirm that the vendor will not use your company’s data/recordings for the vendor’s own purposes (such as training its AI).  You should also consider turning off any features that your business does not need, such as sentiment analysis, and document that configuration.

5. Stay Up-to-Date: The legal issues surrounding the use of AI, including AI-transcription/recording tools, are constantly evolving.  Staying up to date on state and federal AI legislation/regulation and recent case law is important in order to stay ahead of the legal risks.  You should be aware of state wiretapping statutes, including which states require all parties to consent to recording a conversation, such as California and Florida.  You should continue to evaluate your company’s use of AI technology, regularly review your privacy policy and employee handbook, and reach out to counsel with any specific questions or concerns.

Tags

artificial intelligence, ai, cipa, wiretap act, technology, california