The AI Meeting Assistant Trap: Why Your Organization's Newest Productivity Tool Might Be a HIPAA Breach Waiting to Happen
Health care executives are falling over themselves to sign up for AI meeting assistants. And honestly, who can blame them? These tools are genuinely impressive. They join your meetings automatically, transcribe everything in real time, identify individual speakers, generate organized summaries with action items, and some can even analyze shared screen content and video. For an administrator drowning in back-to-back meetings, a tool that saves two or three hours a day on note-taking sounds like a gift.
The problem is not the technology. The problem is that most of these platforms are structured in a way that makes HIPAA-compliant use either impossible on lower-tier plans or prohibitively expensive for the organizations that need them most. And a lot of health care administrators are signing up for free or mid-tier plans without realizing they have just handed a third-party vendor unrestricted access to conversations that almost certainly contain protected health information.
The Pricing Wall in Front of the BAA
Take a look at how the major AI meeting note-takers structure their plans. The pattern is remarkably consistent. There is a free tier. There are one or two mid-range paid tiers. And then, at the very top, there is an enterprise tier that costs significantly more per user and often requires minimum seat counts. HIPAA compliance - specifically, the willingness to sign a Business Associate Agreement - lives exclusively at that top tier for most vendors.
Here is how that breaks down across the platforms health care organizations are most likely to encounter:
| Vendor | Plan Tiers (Annual Pricing) | BAA Requires Highest Tier? | Min. Users for BAA Tier |
|---|---|---|---|
| Read AI | Free - $0<br>Pro - $15/user/mo<br>Enterprise - $22.50/user/mo<br>Enterprise+ - $29.75/user/mo (BAA) | Yes | 10 |
| Fireflies.ai | Free - $0<br>Pro - $10/user/mo<br>Business - $19/user/mo<br>Enterprise - $39/user/mo (BAA) | Yes | Contact sales |
| Otter.ai | Basic - $0<br>Pro - ~$8.33/user/mo<br>Business - $20/user/mo<br>Enterprise - Custom pricing (BAA) | Yes | Contact sales |
| Fathom | Free - $0 (blanket BAA)<br>Premium - ~$16/user/mo (blanket BAA)<br>Team - Contact sales (BAA + SSO, governance)<br>Business - ~$25/user/mo (BAA + SSO, custom retention) | Verify | None (BAA); Team+ recommended for governance |
| Fellow | Free - $0<br>Team - $7/user/mo<br>Business - $15/user/mo<br>Enterprise - $25/user/mo (BAA) | Yes | 10 |
| MeetGeek | Basic - $0<br>Pro - ~$9.99/user/mo (claims BAA)<br>Business - ~$19.99/user/mo (claims BAA)<br>Enterprise - Custom pricing (claims BAA) | Claims all paid - verify | None (paid plans) |

"(BAA)" = BAA available and the tier includes enterprise governance controls (SSO, custom retention, centralized admin).
"(blanket BAA)" or "(claims BAA)" = a BAA is available or claimed at this tier, but the plan may lack governance controls (SSO, custom retention, centralized admin) that your risk analysis requires. Verify scope and controls directly with the vendor before deploying with PHI.
Tiers with no BAA notation = no BAA available at that tier.
Pricing reflects annual billing where available and is current as of early 2026. All pricing and plan structures are subject to change. Always verify directly with the vendor before making purchasing decisions. A BAA alone does not make a platform "HIPAA compliant" for your organization - the platform must also provide the technical and administrative controls your risk analysis identifies as necessary. Organizations should request the actual BAA document, review it with legal counsel, and confirm that the specific plan tier meets the safeguard requirements of their environment before deploying where PHI may be present.
The pattern should be alarming. For the most widely adopted platforms - Read AI, Fireflies.ai, and Otter.ai - the only plan that includes a Business Associate Agreement is the most expensive tier. Read AI's Enterprise+ plan, for example, not only costs roughly $30 per user per month billed annually but requires a minimum of ten licenses and mandates that your organization configure SAML-based single sign-on and domain capture before they will even begin the BAA process. Fireflies.ai's enterprise tier runs $39 per user per month. Otter.ai does not even publish enterprise pricing publicly.
For a 25-bed Critical Access Hospital where three administrators and an IT manager want to use an AI note-taker, Read AI's Enterprise+ plan would run a minimum of roughly $3,570 per year at the ten-license floor - and most of those licenses would go unused. That is before you factor in the SSO infrastructure requirements.
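The seat-minimum math is worth making explicit. A minimal sketch of the calculation, using the per-user annual-billing prices and minimums listed in the table above (prices change; verify with the vendor before budgeting):

```python
# Back-of-envelope annual cost at each vendor's BAA-eligible tier, using the
# listed per-user monthly prices (annual billing) and seat minimums from the
# comparison table above. Illustrative only - confirm current pricing directly.

def annual_baa_cost(price_per_user_mo: float, users_needed: int, min_seats: int) -> float:
    """Annual cost, padding the seat count up to the vendor's minimum."""
    seats = max(users_needed, min_seats)
    return price_per_user_mo * seats * 12

# Four actual users at a small Critical Access Hospital:
users = 4

read_ai = annual_baa_cost(29.75, users, min_seats=10)    # Enterprise+ ten-license floor
fellow = annual_baa_cost(25.00, users, min_seats=10)     # Enterprise, ten-user minimum
fireflies = annual_baa_cost(39.00, users, min_seats=1)   # minimum not published; assume none

print(f"Read AI Enterprise+:  ${read_ai:,.0f}/yr")   # $3,570 - six of ten seats unused
print(f"Fellow Enterprise:    ${fellow:,.0f}/yr")
print(f"Fireflies Enterprise: ${fireflies:,.0f}/yr")
```

The ten-seat floor is what drives the cost: at four actual users, more than half the Read AI spend buys licenses nobody logs into.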
Why This Matters More Than You Think
Some administrators will read this and think, "I don't deal with patients directly, so HIPAA doesn't apply to my meetings." That reasoning is understandable, but it is wrong far more often than people realize.
Under 45 CFR 160.103, protected health information is defined as individually identifiable health information that is created or received by a health care provider, health plan, employer, or health care clearinghouse, and that relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual. The definition is broad by design.
Consider what a typical health care administrator's day actually looks like. A patient calls about a billing question and the conversation includes their name, account number, and the procedure being billed. A department head brings up a staffing concern related to a specific patient complaint. An HR meeting discusses a workers' compensation claim that references medical treatment. A vendor demo uses de-identified data that turns out not to be fully de-identified. A finance meeting discusses outstanding accounts receivable and references specific patient balances.
In every one of those scenarios, PHI is present. A patient's name connected to any health care service, payment, or health condition constitutes PHI. It does not have to be a medical record or a diagnosis. Under the HIPAA Privacy Rule's Safe Harbor de-identification standard at 45 CFR 164.514(b), there are 18 categories of identifiers that must be removed for information to be considered de-identified, including names, geographic data smaller than a state, dates, phone numbers, email addresses, and account numbers. If your meeting includes a patient's name in the context of anything related to their care, payment, or health plan enrollment, you are dealing with PHI.
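To make the Safe Harbor point concrete, here is a deliberately naive sketch that flags a few of the 18 identifier categories in a transcript snippet. The regexes cover only the pattern-friendly categories; names, geographic data, and most of the other identifiers cannot be caught this way, so this is an illustration of how easily identifiers show up in ordinary administrative conversation, not a de-identification tool:

```python
import re

# Illustrative only: naive patterns for a few of the 18 Safe Harbor identifier
# categories under 45 CFR 164.514(b). Real de-identification review requires
# far more than regexes - names and many other identifiers will not match.
IDENTIFIER_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone number": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "account number": re.compile(r"\b(?:acct|account)\s*#?\s*\d+\b", re.IGNORECASE),
}

def flag_identifiers(transcript: str) -> list[str]:
    """Return the identifier categories that appear in a transcript snippet."""
    return [name for name, pat in IDENTIFIER_PATTERNS.items() if pat.search(transcript)]

snippet = ("Patient called about account #44812 for the 3/14/2025 procedure; "
           "follow up at 555-867-5309.")
print(flag_identifiers(snippet))  # ['phone number', 'date', 'account number']
```

One ordinary billing-question sentence trips three identifier categories before anyone has mentioned a diagnosis.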
So when that AI meeting assistant joins your call, records the audio, transcribes it to text, processes it through large language models to generate summaries, and stores all of that on the vendor's cloud infrastructure - you have just disclosed PHI to a third party.
If you need a reminder of how seriously OCR takes PHI on devices used by non-clinical staff, look at the enforcement history around stolen laptops. In 2020, Lifespan Health System paid $1,040,000 after an unencrypted MacBook was stolen from an employee's car, exposing the ePHI of over 20,000 individuals. In 2016, the Feinstein Institute for Medical Research paid $3.9 million after a laptop containing roughly 13,000 records was stolen from a researcher's vehicle. Concentra Health Services settled for $1,725,220 over a stolen unencrypted laptop. In 2019, OCR reached a $3 million settlement with the University of Rochester Medical Center over the loss of an unencrypted flash drive and the theft of an unencrypted laptop. These were not clinical staff losing patient charts. These were laptops used by administrators, researchers, and employees whose work happened to involve data that qualified as PHI. The common thread in every case was not the theft itself - it was the failure to encrypt the device and the failure to have adequate policies in place before the loss occurred.
Now replace the stolen laptop with an AI meeting assistant's cloud-hosted transcript database containing six months of recorded conversations. The exposure model is the same, except the data does not require a car break-in to be at risk. It is sitting on a third party's servers, processed by their AI models, accessible through their infrastructure, and governed by whatever terms of service the individual user clicked through when they signed up.
The Business Associate Problem
Under 45 CFR 164.502(e), a covered entity may disclose PHI to a business associate only if the covered entity obtains satisfactory assurance that the business associate will appropriately safeguard the information. That satisfactory assurance takes the form of a Business Associate Agreement, the requirements for which are spelled out at 45 CFR 164.504(e). The BAA must establish the permitted and required uses and disclosures of PHI, require appropriate safeguards, require breach reporting, and ensure subcontractors are held to the same standards.
Without a BAA in place, disclosing PHI to a third party - even unintentionally - is a violation of the Privacy Rule. There is no gray area here. If you are using Read AI's free plan and it records a meeting where a patient's name and billing information are discussed, you have disclosed PHI to a business associate without a BAA. That is, definitionally, a HIPAA violation.
And it gets worse. These platforms do not just store your data. Read AI's security documentation notes that HIPAA compliance features are only supported on Enterprise+, and their standard terms require SAML authentication and domain capture to be configured before a BAA will be executed. Fireflies.ai's help documentation explicitly states that HIPAA compliance is only available on the Enterprise plan, with BAA availability described as "Enterprise only." Otter.ai's help center confirms that HIPAA compliance is exclusively available for Enterprise plan customers and that users on Basic, Pro, or Business plans cannot obtain a BAA.
In other words, the vendors themselves are telling you that their lower-tier plans are not designed to handle PHI. Using them in environments where PHI is present is a compliance risk that the vendor's own documentation does not support.
The Shadow IT Angle
Here is where the real danger lives. These tools are incredibly easy to sign up for. An administrator goes to read.ai, creates an account with their work email, connects their calendar, and suddenly every meeting they attend has an AI bot joining to record and transcribe. No IT involvement. No compliance review. No BAA.
This is textbook shadow IT, and it is happening at health care organizations right now. The person signing up is not being malicious. They genuinely want to be more productive. They saw a colleague use it, or they saw a LinkedIn post about it, or the tool was recommended in a webinar. They do not know what a BAA is, they do not know that HIPAA applies to their administrative meetings, and the vendor's signup flow certainly is not going to stop them and ask if they work in a regulated industry.
For IT and compliance teams, this means you cannot assume these tools are not already in your environment. You need to be proactively checking for them. Review calendar events for bot participants. Check email forwarding rules. Look at browser extensions installed on managed devices. Ask your staff directly. The conversation should not be accusatory - it should be educational. People do not know what they do not know, and the vendors are not helping them figure it out.
What You Should Actually Do About This
If you are an IT manager, compliance officer, or administrator at a health care organization, here is where to start.
Find out if these tools are already in use. Ask around. Check meeting recordings. Look for bot participants in calendar invitations. Review browser extensions on managed endpoints. If you manage devices through Intune or Configuration Manager (formerly SCCM), inventory installed applications and browser add-ons.
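The calendar sweep described above is easy to automate. A minimal sketch, with the caveat that the bot-domain list is an assumption (vendors change how their assistants join meetings) and that in a real deployment the events would come from your calendar platform's API, such as Microsoft Graph or the Google Calendar API, rather than a hardcoded list:

```python
# Sketch: scan meeting attendee addresses for known AI note-taker bot domains.
# The domain list is illustrative and not exhaustive; assistants may also join
# as in-call bots rather than calendar invitees, so this is one signal of many.

NOTETAKER_DOMAINS = {"read.ai", "fireflies.ai", "otter.ai", "fathom.video", "meetgeek.ai"}

def find_bot_attendees(events: list[dict]) -> list[tuple[str, str]]:
    """Return (meeting subject, flagged address) pairs for bot-domain attendees."""
    hits = []
    for event in events:
        for addr in event.get("attendees", []):
            domain = addr.rsplit("@", 1)[-1].lower()
            if domain in NOTETAKER_DOMAINS:
                hits.append((event["subject"], addr))
    return hits

# Hypothetical events pulled from a calendar export:
events = [
    {"subject": "Q3 Budget Review", "attendees": ["cfo@hospital.org", "notetaker@read.ai"]},
    {"subject": "Dept Heads Weekly", "attendees": ["ops@hospital.org"]},
]
print(find_bot_attendees(events))  # [('Q3 Budget Review', 'notetaker@read.ai')]
```

A weekly run of something like this against your calendar exports turns "ask around" into a repeatable control you can document.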
If they are in use without a BAA, stop and assess. This does not necessarily mean panic. Determine what meetings were recorded, whether PHI was likely discussed, and document the scope. Work with your compliance officer or legal counsel to determine whether a breach notification obligation has been triggered under 45 CFR 164.402, which defines a breach as the acquisition, access, use, or disclosure of PHI in a manner not permitted under the Privacy Rule that compromises the security or privacy of the PHI. Keep in mind that not every unauthorized disclosure automatically triggers notification - the regulation includes a risk assessment process to determine whether the disclosure compromised the security or privacy of the PHI. But that assessment needs to happen, it needs to be documented, and it should involve someone qualified to make the determination.
If you want to use an AI meeting assistant, budget for the BAA-eligible tier. For organizations that genuinely need this capability, the cost of the enterprise tier is a legitimate business expense. Factor it into your IT budget like any other tool that handles PHI. Do not try to save money by using the free tier and hoping no one discusses PHI in a meeting - because they will.
Consider platforms that offer BAAs at lower tiers - but understand the tradeoffs. Not every vendor locks compliance behind the most expensive plan. Fathom, for example, provides a blanket BAA incorporated by reference in their service agreement that covers all plan tiers, including the free plan. The BAA document (last updated September 2024) references PHI definitions from 45 CFR 160.103, requires incident reporting within five days, mandates subcontractor compliance, and includes post-termination PHI return and destruction obligations. That is a meaningfully different posture than vendors that refuse to sign a BAA below the enterprise tier. However, the free and individual Premium plans lack SSO, centralized access management, and custom data retention policies - controls that most health care organizations would need to satisfy their own risk analysis under 45 CFR 164.308(a)(1). For a health care deployment, budget for at least the Team tier (contact sales for pricing and minimums) to get the governance features that matter. MeetGeek claims HIPAA compliance across all paid plans. In both cases, request the actual BAA document, review it with counsel, and confirm that the plan tier you are purchasing includes the technical and administrative controls your risk analysis requires.
Write a policy. Even if you decide to adopt an AI meeting assistant at the appropriate tier, you need a written policy governing its use. Which meetings should it join? Which meetings should it absolutely not join (think HR discussions involving employee health information, legal consultations, or conversations with patients)? Who has access to the recordings and transcripts? How long is that data retained? A tool that records everything by default is a compliance risk even with a BAA in place if there are no controls on its use.
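One way to keep that policy enforceable rather than aspirational is to encode the join/no-join rules as data and check them before the assistant is allowed into a meeting. The categories and retention limit below are illustrative placeholders, not a recommended taxonomy:

```python
# Sketch of the written policy above as checkable rules. Category names and
# the retention ceiling are hypothetical - substitute your organization's own.

BLOCKED_CATEGORIES = {"hr_employee_health", "legal_consultation", "patient_conversation"}

def may_record(meeting_category: str, baa_in_place: bool, retention_days: int,
               max_retention_days: int = 90) -> bool:
    """Apply the policy: no BAA means no recording, ever; some meeting
    categories are off limits even with a BAA; and the configured retention
    must stay within the policy's ceiling."""
    if not baa_in_place:
        return False
    if meeting_category in BLOCKED_CATEGORIES:
        return False
    return retention_days <= max_retention_days

print(may_record("dept_operations", baa_in_place=True, retention_days=30))     # True
print(may_record("legal_consultation", baa_in_place=True, retention_days=30))  # False
print(may_record("dept_operations", baa_in_place=False, retention_days=30))    # False
```

The point of writing it this way is that the "no BAA" check comes first and cannot be overridden by any other setting.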
The Proposed HIPAA Security Rule Changes Make This Worse
It is worth noting that HHS published a Notice of Proposed Rulemaking in January 2025 that would significantly tighten the HIPAA Security Rule. Among the most consequential proposed changes is the elimination of the distinction between "required" and "addressable" implementation specifications. Under the current rule, addressable specifications allow organizations to assess whether a particular safeguard is reasonable and appropriate in their environment and document an alternative if it is not. The proposed rule would make all implementation specifications mandatory, with very limited exceptions.
The final rule is expected to be published in mid-2026, with a compliance window of approximately 240 days after publication. A coalition of industry associations led by CHIME, along with a separate letter from 57 hospitals and health systems, has pushed back on the proposed changes, arguing they impose unsustainable financial and operational costs. But the direction from HHS is clear: the regulatory bar for protecting ePHI is going up, not down. For a deeper look at the proposed rule's timeline, the industry opposition, and what it means for your organization right now, Health Tech Authority covered this in detail in The Industry Fought Back and the Rule Is Still Moving Forward - What the May 2026 Finalization Target Means for Your Organization Right Now.
One piece of that proposed rule deserves special attention here. The NPRM proposes a new requirement under 164.308(a)(1) that organizations maintain a comprehensive technology asset inventory and a network map documenting the movement of ePHI through their electronic information systems, reviewed and updated at least annually. Think about what that means in the context of AI meeting assistants being adopted without IT involvement. If your CEO signed up for Read AI last month and it has been recording meetings ever since, that platform is now a technology asset in your environment that processes ePHI. It needs to be in your inventory. It needs to be part of your risk analysis. And if you did not know it existed until someone mentioned it in passing, your inventory is already inaccurate.
For organizations currently using AI meeting tools without proper safeguards, this proposed rule should serve as additional motivation to get their house in order now, before the compliance requirements get even more prescriptive.
This Is an Application Intake Problem, Not Just an AI Problem
AI meeting assistants are the current flashpoint, but the underlying issue is much broader. Every health care organization should have a formal application intake process - a defined workflow that captures all new software being introduced into the environment before it goes live. And critically, that process should not just cover new systems. It should capture upgrades and changes to existing systems as well, because a vendor adding AI features to a platform you already use can change the risk profile overnight.
An effective application intake process does several things at once. It ensures that IT and compliance are aware of what is being brought into the environment. It triggers a risk assessment before the tool is deployed, not after a problem surfaces. It feeds directly into your technology asset inventory - the same inventory that the proposed HIPAA Security Rule changes would require organizations to maintain and update annually. And it creates a documented record that the organization evaluated the tool, assessed its compliance implications, and made a deliberate decision about whether and how to use it.
Without that process, you are relying on the honor system. You are trusting that every department head, every administrator, every physician who downloads a browser extension or connects a SaaS tool to their calendar will think to ask IT first. That has never worked reliably in any industry, and it works even less reliably in health care, where the people making these decisions are often under enormous time pressure and have no reason to suspect that a productivity tool could create a compliance problem.
The intake process does not need to be bureaucratic or slow. It can be a simple form: what is the tool, what does it do, what data will it access, does the vendor offer a BAA, who is requesting it, and what is the business justification. IT reviews it, compliance reviews it, and a decision is made. For straightforward requests, this can happen in a day or two. The point is not to create a bottleneck. The point is to create a checkpoint - one that ensures no tool enters your environment without someone asking the right questions first.
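The simple form described above can be sketched as a data structure plus one triage rule. The field names mirror the questions in the text; the routing logic is an illustrative starting point, not a complete review workflow:

```python
from dataclasses import dataclass

# The intake form from the text, as a data structure plus a simple triage rule.
# The keyword-based PHI heuristic is deliberately crude and illustrative.

@dataclass
class IntakeRequest:
    tool_name: str
    what_it_does: str
    data_accessed: str        # e.g. "calendar, meeting audio, transcripts"
    vendor_offers_baa: bool
    requested_by: str
    business_justification: str

def triage(req: IntakeRequest) -> str:
    """Route the request: a tool that may touch meeting or patient data
    without a BAA on offer is an automatic escalation to compliance."""
    touches_phi_risk = any(kw in req.data_accessed.lower()
                           for kw in ("audio", "transcript", "patient", "meeting"))
    if touches_phi_risk and not req.vendor_offers_baa:
        return "escalate: no BAA for a tool that may touch PHI"
    return "route to standard IT/compliance review"

req = IntakeRequest("ExampleNotes AI", "meeting transcription",
                    "calendar, meeting audio, transcripts", False,
                    "j.doe", "save time on minutes")
print(triage(req))  # escalate: no BAA for a tool that may touch PHI
```

Even this much gives the reviewer a consistent record to file against the asset inventory, which is the documentation trail the proposed Security Rule changes would examine.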
Leadership Has to Set the Example
Here is the part that needs to be said directly, even if it is uncomfortable.
The people most likely to adopt these tools without going through proper channels are not the front-desk staff. They are not the nurses. They are the administrators, the executives, the department directors - the same people who are supposed to be championing policy and process for the rest of the organization. When the CEO signs up for a free AI note-taker because they saw it on LinkedIn, they are doing the exact thing they would discipline a staff member for: introducing unauthorized software into a regulated environment without IT review, compliance assessment, or a BAA.
That is shadow IT. And when leadership is the source of shadow IT, it sends a message to the entire organization that the rules are optional for people who are busy enough or important enough to skip them.
Let's also step back and put this technology in perspective. AI meeting assistants are genuinely impressive. They are also, at the end of the day, a convenience. Health care organizations have been conducting meetings, documenting decisions, and managing action items for decades without an AI bot sitting in the room. Nobody is going to miss a critical patient safety issue because they did not have an AI-generated meeting summary. This is not an EHR. It is not a life safety system. It is not infrastructure. It is a productivity tool - a nice one, a time-saving one, but a productivity tool nonetheless.
That does not mean organizations should never use it. It means the urgency to adopt it should never outpace the diligence required to adopt it correctly. If your organization can afford the BAA-eligible tier, can implement the necessary policies, and can run the tool through a proper intake and risk assessment process, go for it. But if the alternative is a free account with no BAA and no oversight because someone wanted it working by Friday, the answer is no. The organization survived without it last week. It will survive without it next week too.
The Bottom Line
AI meeting assistants are not the problem. The problem is treating them like consumer apps when they are operating in a regulated environment. The problem is signing up for a free tier without reading the compliance fine print. The problem is bypassing the processes that exist to protect the organization - or worse, never building those processes in the first place.
The HIPAA Privacy Rule does not care about your job title. It does not care that the meeting was "just administrative." It does not care that the tool was free. It cares about whether PHI was disclosed to an entity without proper safeguards in place. And the proposed Security Rule changes on the horizon are going to press even harder on asset inventory accuracy, risk analysis completeness, and the documentation trail behind every technology decision.
Health care organizations that want to use these tools can and should. But do it right. Run it through intake. Get the BAA. Write the policy. Train the staff. And make sure leadership is following the same rules they expect everyone else to follow.
If your CEO just signed up for a free AI note-taker last Tuesday, schedule a conversation about it. Just make sure the note-taker is not in the room for that one.
This article is for informational purposes only and does not constitute legal or compliance advice. Covered entities and business associates should consult qualified legal counsel or compliance professionals before making decisions pertaining to HIPAA or IT infrastructure.
Sources
- 45 CFR 160.103 - Definitions (including PHI and business associate): https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-C/part-160/subpart-A/section-160.103
- 45 CFR 164.502 - Uses and disclosures of protected health information: https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-C/part-164/subpart-E/section-164.502
- 45 CFR 164.504 - Organizational requirements (business associate contracts): https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-C/part-164/subpart-E/section-164.504
- 45 CFR 164.514(b) - De-identification Safe Harbor (18 identifiers): https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-C/part-164/subpart-E/section-164.514
- 45 CFR 164.402 - Breach definition: https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-C/part-164/subpart-D/section-164.402
- HHS Business Associates guidance: https://www.hhs.gov/hipaa/for-professionals/privacy/guidance/business-associates/index.html
- HHS De-Identification guidance: https://www.hhs.gov/hipaa/for-professionals/special-topics/de-identification/index.html
- Read AI Security and Privacy Overview: https://support.read.ai/hc/en-us/articles/25702259763091-Security-Privacy-Overview
- Read AI Pricing: https://www.read.ai/plans-pricing
- Fireflies.ai Pricing: https://fireflies.ai/pricing
- Fireflies.ai Security: https://fireflies.ai/security
- Otter.ai HIPAA Help Center: https://help.otter.ai/hc/en-us/articles/33975072019991-HIPAA-Otter-ai
- Fathom HIPAA Compliance: https://help.fathom.video/en/articles/5291265
- OCR Resolution Agreements (enforcement history): https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/index.html
- Lifespan ACE Settlement ($1,040,000 - stolen laptop): https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/agreements/lifespan/index.html
- Concentra Health Services Settlement ($1,725,220 - stolen laptop): https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/examples/concentra-health-services/index.html
- HIPAA Security Rule NPRM (January 2025): https://www.govinfo.gov/content/pkg/FR-2025-01-06/pdf/2024-30983.pdf
- Health Tech Authority - The Industry Fought Back and the Rule Is Still Moving Forward: https://healthtechauthority.com/industry-fought-back-and-rule-still-moving-forward-what-may-2026-finalization-target-means-your