Use of AI Notetakers in Business Meetings
The 6 Hidden Risks, Plus Recommended Considerations and Practices!
So, for the past thirty-plus years, when in client or other meetings, I often thought to myself . . . “Self, wouldn’t it be nice if someone could take notes for me while I was in this meeting so I could focus on the discussion and ensure nothing important was missed?”
You see, as a copious note-taker, but not often the best real-time multitasker, I either captured all of the information, not enough of it, or possibly the right amount, but not particularly well organized as I recorded it in real time.
So, of course, my Self would answer: “Why yes, that would be awfully nice.”
Fast forward to 2025, and lo and behold, my prayers have been answered with the release of multiple AI-powered notetakers — those wonderful, automated tools that transcribe and summarize meetings — like Otter, Fireflies, Granola, tl;dv, Avoma, Superpowered, Krisp, Tactiq, Notta, Gong, Rev, Fathom, Notion AI — and the list goes on and on. Even Zoom, Teams, and Google Meet provide their own integrated note-taking capabilities at various tiers of service.
Ahhhh, heaven, right? Well, maybe not so fast, “Self”.
In the fast-paced world of growth businesses, executives, including lawyers such as my “Self”, are constantly seeking ways to improve efficiency and streamline operations. And not surprisingly, AI notetakers have surged in popularity, promising to eliminate the burden of manual note-taking.
However, upon closer inspection, these tools unfortunately come with significant risks, particularly in meetings involving sensitive information, including discussions where legal counsel is involved. So, understanding these risks and recommended practices is essential for businesses (and lawyers) looking to leverage AI notetakers responsibly.
So . . . what are the concerns? Well, my “Self” and I are both glad you asked!
1. Consent: Are Meeting Participants Required to Consent to Use of an AI Notetaker?
One of the most fundamental legal considerations when using an AI notetaker is whether all meeting participants have consented to being recorded or transcribed. U.S. laws on recording conversations vary by state:
Approximately 38 states and the District of Columbia follow one-party consent laws, meaning that only one person in a conversation needs to be aware of and consent to the recording.
12 states, including California, Florida, and Illinois, require all-party consent, meaning everyone in the meeting must agree to the recording or transcription.
Failing to obtain the required consent could lead to legal liability, including claims of wiretapping violations or invasion of privacy. Businesses should implement clear policies requiring explicit consent before using AI notetakers. This is why, when someone is recording your Zoom or Teams meeting, there is a pop-up telling you that the meeting is being recorded and asking you to disconnect if you don’t consent.
But what about when an individual in the meeting is using his or her own notetaker? Rarely, if ever, is consent sought or recorded for this use case. Hmmmmm.
Considerations: Ensure that any permitted use of notetakers is disclosed at the start of each meeting, and allow participants either to veto the use of such tools or to leave the meeting if they are uncomfortable given the meeting content.
2. Security Risks: Protecting Sensitive Data
AI notetaking applications store, process, and sometimes analyze large volumes of confidential business information. Inadequate security measures can expose businesses to cyber threats, data breaches, and corporate espionage. Key security concerns include:
Encryption: Is the AI notetaker using end-to-end encryption to protect sensitive data?
Data Storage: Where is the data stored, and does it comply with relevant regulations (e.g., GDPR, CCPA)?
Access Controls: Who has access to the transcripts, and are there safeguards against unauthorized use?
Considerations: Before adopting an AI notetaker, whether at an enterprise level or in connection with an AI Usage Policy, businesses should conduct a security assessment to ensure robust protection of their data.
3. Attorney-Client Privilege: A Potential Major Legal Pitfall
For meetings involving lawyers, AI notetakers pose a particularly acute risk to the attorney-client privilege — the legal protection that keeps communications between a lawyer and their client confidential. If a third-party AI service stores or processes meeting data, including communications with attorneys, on its servers, a court may find that the privilege has been waived. You see, the information discussed, which is supposed to remain solely between the lawyer and the client, may now have been technically shared with an unaffiliated third party, potentially exposing critical legal strategies and sensitive information in litigation. Yikes!
Considerations: To preserve this critical privilege, businesses should:
Avoid using AI notetakers in legal meetings unless they have been vetted for compliance with privilege requirements.
Use self-hosted or on-premise AI solutions rather than cloud-based services with third-party access.
Mark privileged communications explicitly and maintain tight access controls.
4. Data Privacy: Navigating Regulatory Compliance . . . Yet Again!
AI notetakers process personal and business data, which may trigger compliance requirements under various data privacy laws, including, for example:
GDPR (EU): Requires user consent and mandates data minimization principles.
CCPA/CPRA (California): Grants individuals the right to know what data is collected and request its deletion.
Other state laws: Colorado, Utah, Virginia, and a growing number of other states have their own data privacy regulations, adding complexity, as all such laws need to be understood to ensure compliance (and to avoid inadvertent non-compliance, which can be costly).
Failure to comply with these regulations can result in legal penalties, reputational damage, and loss of customer trust.
Considerations: Businesses should verify that AI notetaker vendors, and their tools, comply with relevant data privacy laws before use.
5. Confidentiality Risks: Internal and External Exposure
AI notetakers can inadvertently lead to confidentiality breaches, particularly if they:
Store transcripts in unsecured locations.
Allow unauthorized employees or external parties to access meeting records.
Integrate with third-party tools that do not have strict data handling policies.
Considerations: To mitigate these risks, one approach companies can take is to classify meetings by confidentiality level and restrict AI usage in sensitive discussions, such as strategic planning, mergers and acquisitions, and personnel matters.
6. Record-Keeping and Accuracy Concerns
While AI note-taking tools promise efficiency, they are not infallible and can often mis-record or misconstrue meeting content. Some of these concerns include:
Misinterpretation of spoken words, leading to incorrect records.
Failure to capture tone, intent, or non-verbal cues, which can be critical in legal or strategic contexts.
Lack of version control, making it difficult to track changes and ensure data integrity.
Considerations: Businesses can establish a process for reviewing AI-generated notes for accuracy and completeness before relying on them for decision-making, and should ensure appropriate record-keeping and version control.
Summary: Recommended Practices for Vetting and Using AI Notetakers
To integrate AI notetakers into business operations responsibly, executives and lawyers should take reasonable steps to ensure compliant use of such tools, including considering the following practices:
Obtain Clear Consent: Develop a policy requiring explicit consent from all meeting participants before using AI notetakers.
Choose Secure and Compliant Providers: Vet vendors for strong encryption, access controls, and compliance with relevant privacy laws.
Limit Use in Privileged or Highly Confidential Meetings: Avoid using AI notetakers in legal discussions and high-stakes strategy sessions.
Set Clear Data Retention Policies and Ensure Legal Compliance: Define how long transcripts will be stored and who can access them, including procedures for disclosure, correction, and deletion of any personal information, and consider on-premise solutions over third-party hosted options (yes, I understand this may not be a practical option, but it is something worth considering).
Regularly Audit Accuracy: Assign team members to review AI-generated notes for correctness and completeness, and implement processes for approving all such content.
Monitor Legal and Regulatory Changes: Stay informed about evolving laws governing AI transcription and recording. This is critical throughout the AI landscape.
So, upon thinking through the options, my “Self”, while still very excited at the prospect of automated notetakers, has decided to take a more measured approach . . . which means I am likely not getting rid of my legal pad - at least not entirely - any time soon!
AI notetakers can be a powerful tool for efficiency, but they also introduce substantial risks that growth businesses must carefully manage. By taking a proactive approach to legal compliance, security, and data integrity, executives can harness AI while protecting their business from unintended legal and operational consequences.
Nope, this is not legal advice of any kind, but just the random musings of some lawyer guy on the Interwebs. So, the information herein is for “informational” and “entertainment” purposes only. If you are seeking actual legal advice about any of the matters discussed in this article, you should engage a trained legal professional.