Clinical Consent for Recording & AI Summaries (2026): Scripts, Forms, and What If They Refuse?
Ready-to-use consent scripts, downloadable form templates, and practical workflow options when therapy patients opt out of AI recording — including manual notes, partial recording, and no-recording alternatives.
Written by
Dya Clinical Team
Clinical Documentation Experts
A therapist sits across from a new patient. The intake is done, rapport is building, and it's time to bring up the AI. "I'd like to record our session so an AI tool can help me with my notes. Is that okay?"
The patient says no.
Now what?
This is the moment most consent guides skip. They give you the script for the ask — but not the plan for the refusal. In 2026, with AI scribes becoming standard in therapy and clinical practice, the "what if they refuse?" question isn't an edge case. It's a workflow you need to design for.
This article gives you three things: consent scripts that work across clinical contexts, a written consent form template you can adapt today, and — critically — structured alternatives for every scenario where the patient says no.
Why Consent for AI Recording Is Different From General Consent
You already obtain informed consent for treatment. Adding AI recording to the mix is not an extension of that process — it's a separate consent event with distinct legal, ethical, and relational dimensions.
The Legal Reality in 2026
The regulatory landscape has shifted significantly. Several US states now require standalone consent for AI-assisted recording in clinical settings:
- Illinois passed the Wellness and Oversight for Psychological Resources Act (Public Act 104-0054), which requires therapists to use a separate written consent form if they want to record sessions for AI transcription. A general intake form is not sufficient. Violations carry penalties up to $10,000.
- Florida has pre-filed legislation for 2026 requiring written, informed consent at least 24 hours in advance of any AI system recording or transcribing a therapy session.
- Texas (effective January 1, 2026) requires written disclosure to the patient that an AI system is being used in connection with health-care services prior to or on the date of service.
- Nevada prohibits AI from representing itself as a mental health provider and maintains strict all-party consent rules for phone call recordings.
Even in jurisdictions without AI-specific legislation, standard recording consent laws apply. In two-party consent states (California, Maryland, Washington, and others), all parties must agree to recording. In one-party consent states, the clinician's activation of the tool may technically satisfy the law — but ethical standards demand patient consent regardless.
Under HIPAA, AI transcription vendors are business associates handling protected health information. You need a Business Associate Agreement and clear patient disclosure. Under GDPR and the Swiss FADP, health data requires explicit, informed consent with documented opt-out mechanisms.
The Ethical Reality
The power dynamic in a therapy session makes "consent" more complicated than a signature on a form. A patient sitting across from their therapist — someone they depend on for care — faces implicit pressure to agree. Research published in the AMA Journal of Ethics documents cases where patients who initially wanted to decline felt pressured when asked in person and agreed despite their preferences.
This means your consent process needs to:
- Give patients time to consider before the session begins
- Normalize refusal as an equally valid choice
- Provide clear alternatives so saying "no" doesn't feel like creating a problem
Ready-to-Use Consent Scripts
The verbal script is not your legal protection — that's what the written form provides. The script is how you set the emotional tone. It determines whether the patient feels informed and respected or pressured and confused.
Script 1: The Standard Therapy Opener
"Before we begin, I want to let you know about something I use in my practice. I have an AI tool that can record our conversation and generate a draft of my clinical notes. This means I can focus entirely on our conversation instead of writing during the session. Here's what's important: I review and edit every note before it becomes part of your record. The recording itself is deleted once the notes are generated — it's not stored anywhere. You can say no to this, and it changes nothing about your care. I have other ways of documenting our sessions. Would you like me to use it, or would you prefer I take notes another way?"
Why it works: It leads with the benefit (full attention), addresses the biggest concern (what happens to the recording), and closes with a genuine alternative — not just "is that okay?" but an explicit acknowledgment that other methods exist.
Script 2: For Returning Patients (Re-Consent)
"I wanted to check in about the AI note-taking tool we've been using. Are you still comfortable with that, or would you prefer we switch to a different approach? Your preferences may change over time, and that's completely fine."
Why it works: Consent is not a one-time event. Patients' comfort levels shift — especially as they disclose more sensitive material over time. A brief periodic check-in demonstrates that you take their autonomy seriously.
Script 3: The Mental Health / Sensitive Context Version
"I'd like to talk about how I handle documentation for our sessions. I have access to an AI tool that can listen to our conversation and help me draft session notes afterward. I want to be upfront: this means our conversation would be recorded during the session, and then the audio is processed and deleted. Only the written notes — which I personally review — become part of your file. I understand this might feel like a lot for a therapy session. Some of my clients are comfortable with it, and some prefer I take notes manually or dictate them after we finish. There's no wrong answer. What would you prefer?"
Why it works: It acknowledges the sensitivity of the setting. The phrase "some of my clients prefer" normalizes both options, reducing the pressure to conform. Offering dictation-after-session as an explicit alternative gives the patient a concrete picture of what "no" looks like.
Script 4: For Group Therapy or Multi-Party Sessions
"Before we start today's session, I want to mention that I use an AI note-taking tool for documentation. It would record our group conversation and help me draft my clinical notes afterward. The recording is deleted once notes are generated. For me to use it, everyone in the group needs to be comfortable. If anyone would prefer I don't use it today, I'll turn it off — no questions asked, no need to explain. Can I get a show of hands or a quick go-around?"
Why it works: Group settings add complexity because one person's refusal affects everyone. This script makes the collective nature explicit and provides a low-friction way to decline (a simple non-raise of the hand or a quiet "I'd rather not").
Written Consent Form Template
A verbal script is necessary but not sufficient. You need documented, written consent — and in several jurisdictions, it must be a standalone form, not a clause buried in your general intake paperwork.
Below is a template you can adapt to your practice. Consult your legal counsel to ensure it meets your jurisdiction's specific requirements.
CONSENT FOR AI-ASSISTED SESSION RECORDING AND CLINICAL DOCUMENTATION
Clinician: ___________________________ Patient name: ___________________________ Date: ___________________________
What this consent covers:
I use an AI-powered documentation tool to assist with clinical note-taking. With your permission, this tool will:
- Record the audio of our session
- Process the recording to generate a draft of clinical notes
- Delete the audio recording after processing (typically within minutes to hours)
The draft notes are reviewed, edited, and approved by me before they become part of your clinical record. No audio recording is stored permanently. No session data is used to train AI models.
Your rights:
- You may decline this consent without any effect on your treatment or the quality of care you receive
- You may withdraw consent at any time — including during a session — by telling me verbally
- You may request to review the AI-generated notes after I have finalized them
- If you withdraw consent, any unprocessed audio from the current session will be deleted immediately
If you decline or withdraw consent, I will document our sessions using one of the following alternative methods:
- Manual note-taking during the session
- Clinician dictation after the session (no patient audio recorded)
- Brief summary notes written after the session
Data handling:
- Audio is processed by [Vendor Name], a HIPAA-compliant / FADP-compliant service with a signed Business Associate Agreement / Data Processing Agreement
- All data is encrypted in transit and at rest
- Audio recordings are deleted after note generation
- Written notes are retained per standard medical record retention requirements
Please select one:
- I consent to AI-assisted recording and documentation of my sessions as described above
- I do not consent to AI-assisted recording. I understand my clinician will use alternative documentation methods
Signature: ___________________________ Date: ___________________________
Key Design Decisions in This Form
Why a standalone form? Illinois law explicitly requires a separate consent form. Even where not legally mandated, bundling AI consent into a general intake form risks the argument that consent wasn't truly informed or freely given.
Why list the alternatives on the form? Patients who see concrete alternatives are more likely to feel that refusal is genuinely okay. If "decline" leads to a blank space with no information about what happens next, it feels like choosing the "wrong" option.
Why include the vendor name? Transparency builds trust. The 2025 JAMA Network Open study on ambient documentation consent found that patients who were told about corporate vendor involvement were less likely to consent — but the same study noted that withholding this information created distrust when patients later found out. Front-load transparency; it pays off in the long term.
What If They Refuse? Three Workflow Options
Here's where most guides end and your actual work begins. A patient has declined AI recording. Your documentation burden just increased. How do you handle this without creating a two-tier care experience?
Option 1: Manual Notes During the Session
How it works: You take handwritten or typed notes during the session, the traditional way.
Pros:
- No technology concerns
- No additional consent complexity
- Patients who declined AI recording are often more comfortable with visible note-taking
Cons:
- Splits your attention between the patient and documentation
- Notes tend to be less comprehensive
- Higher risk of recall errors if notes are sparse
Implementation tip: Use structured templates. A pre-printed or digital template with sections for presenting concerns, interventions, patient responses, and plan helps you capture key information efficiently even without AI assistance. Structured session report templates significantly reduce the cognitive load of manual documentation.
Option 2: Clinician Dictation After the Session (No Patient Audio)
How it works: After the patient leaves, you dictate your session notes into an AI tool. The AI processes your voice only — summarizing, structuring, and formatting the notes based on your dictation. No patient audio is ever captured.
Pros:
- You remain fully present during the session
- AI still assists with note formatting and structure
- No patient recording consent required (you're only recording yourself)
- Several tools support this workflow: Quill, Mentalyc, and TherapyNotes (TherapyFuel) all offer post-session dictation modes
Cons:
- Adds time after each session for dictation
- Relies on your memory of the session — detail quality decreases with delay
- The AI can only work with what you say, not the full clinical conversation
Implementation tip: Dictate immediately after the session ends — ideally within 5 minutes. Keep a brief keyword list during the session (a few words jotted on paper) to anchor your dictation. This hybrid approach preserves most of the AI benefit while respecting the patient's preference.
Option 3: Partial Recording With Explicit Boundaries
How it works: You and the patient agree to record only specific portions of the session — for example, the treatment planning discussion but not the open-ended exploration, or the assessment portion but not personal disclosures.
Pros:
- Captures the most documentation-intensive parts of the session
- Gives the patient granular control
- Can be a middle ground for patients who are hesitant but not entirely opposed
Cons:
- Requires clear communication about when recording starts and stops
- Technically more complex (you need to reliably pause and resume)
- May fragment the note, requiring more manual editing
Implementation tip: Establish clear verbal markers: "I'm going to start the recording now for our treatment planning portion" and "I've stopped the recording." Make the transitions visible and predictable. Some patients who initially decline full recording are comfortable with this approach once they understand they control the boundaries.
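If your tool exposes pause/resume controls, it helps to log the segment boundaries so the final note shows exactly which portions were recorded. Here is a minimal sketch of that idea in Python — the class and its methods are hypothetical for illustration, not any specific vendor's API:

```python
from datetime import datetime, timezone

class PartialRecordingLog:
    """Tracks which labeled portions of a session were recorded, for the note's audit trail."""

    def __init__(self):
        self.segments = []   # completed (label, start, stop) entries
        self._start = None   # the segment currently being recorded, if any

    def start(self, label):
        """Call when you announce 'I'm starting the recording now for <label>'."""
        self._start = (label, datetime.now(timezone.utc))

    def stop(self):
        """Call when you announce 'I've stopped the recording'."""
        label, started = self._start
        self.segments.append((label, started, datetime.now(timezone.utc)))
        self._start = None

    def summary(self):
        """Lines you can paste into the note to show what was and wasn't recorded."""
        return [f"Recorded: {label}" for label, _, _ in self.segments]

log = PartialRecordingLog()
log.start("treatment planning")
log.stop()
print(log.summary())  # ['Recorded: treatment planning']
```

The point of the log is not the code itself but the habit it encodes: every recorded segment has an announced start, an announced stop, and a trace in the record.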
Decision Matrix: Choosing the Right Fallback
| Factor | Manual notes | Post-session dictation | Partial recording |
|---|---|---|---|
| Patient comfort | Highest (no tech) | High (no patient audio) | Moderate (patient controls scope) |
| Note quality | Lower (split attention) | Good (AI-assisted) | Good (AI-assisted for recorded portions) |
| Clinician time | During session | After session (+5-10 min) | During + after session |
| AI assistance | None | Yes (dictation-to-note) | Yes (for recorded portions) |
| Consent complexity | None required | None for patient audio | Requires clear agreement on scope |
| Best for | Tech-averse patients | Most opt-out cases | Hesitant but open patients |
Building Opt-Out Into Your Workflow (Not Around It)
The worst outcome is treating a patient's refusal as an exception that breaks your system. Opt-out should be a designed workflow path, not a workaround.
Pre-Session: Flag Consent Status
Your intake system should capture AI recording consent status before the session begins. This means:
- The consent form goes out with intake paperwork — not presented for the first time in the room
- The patient's preference is visible to you before you walk in
- You don't have to ask "is AI recording okay?" cold — you already know
For returning patients, set a periodic re-consent cadence: every 6 months, or whenever the treatment focus changes significantly.
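If your practice management system doesn't track re-consent natively, the status and cadence can be modeled with a small record. A minimal sketch in Python — the field names and the roughly-six-month threshold are illustrative assumptions, not taken from any particular EHR:

```python
from dataclasses import dataclass
from datetime import date, timedelta

RECONSENT_INTERVAL = timedelta(days=182)  # roughly 6 months; adjust to your policy

@dataclass
class ConsentRecord:
    patient_id: str
    consented: bool               # True = AI recording approved
    fallback: str                 # e.g. "manual", "dictation", "partial" when declined
    recorded_on: date             # when consent was last given or refreshed
    focus_changed: bool = False   # set when the treatment focus shifts significantly

def reconsent_due(record: ConsentRecord, today: date) -> bool:
    """Flag patients whose consent conversation should be revisited."""
    return record.focus_changed or (today - record.recorded_on) >= RECONSENT_INTERVAL

# Example: consent given over 6 months ago, no focus change — due for a check-in.
rec = ConsentRecord("pt-001", True, "", date(2025, 6, 1))
print(reconsent_due(rec, date(2026, 1, 15)))  # True
```

Whatever system you use, the key property is the same: re-consent is triggered automatically by elapsed time or a treatment change, not left to memory.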
During the Session: Respect the Boundary Cleanly
If a patient has declined, do not:
- Ask them to reconsider
- Explain why AI recording is "actually fine"
- Mention that "most patients" agree
- Make any comment that frames refusal as unusual
Simply proceed with your alternative documentation method. The goal is for the patient to feel zero difference in the quality of their experience.
After the Session: Documentation Parity
Patients who decline AI recording deserve the same quality of documentation. This is where post-session dictation tools become valuable — they let you produce structured, thorough notes without having recorded the patient.
Monitor your own documentation patterns. If notes for opted-out patients are consistently shorter, less detailed, or delayed, that's a signal to adjust your workflow.
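That monitoring can be as simple as comparing average note length across workflows. A hedged sketch — the note records and the 80% threshold below are assumptions for illustration, not a clinical standard:

```python
from statistics import mean

# Hypothetical note records: (workflow, word_count)
notes = [
    ("ai_recording", 420), ("ai_recording", 390), ("ai_recording", 450),
    ("manual", 310), ("manual", 280),
    ("dictation", 400), ("dictation", 370),
]

def parity_report(notes, baseline="ai_recording", threshold=0.8):
    """Flag workflows whose average note length falls below 80% of the baseline."""
    by_workflow = {}
    for workflow, words in notes:
        by_workflow.setdefault(workflow, []).append(words)
    base = mean(by_workflow[baseline])
    return {
        w: round(mean(v) / base, 2)
        for w, v in by_workflow.items()
        if w != baseline and mean(v) < threshold * base
    }

print(parity_report(notes))  # {'manual': 0.7} — manual notes lag the baseline
```

Word count is a crude proxy for quality, but a persistent gap like the one flagged above is exactly the signal that your opt-out workflow needs adjusting.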
In Your Records: Track Consent Decisions
Maintain a log of consent decisions — not just who agreed, but who declined and what alternative was used. This serves multiple purposes:
- Compliance: Demonstrates that you offered genuine choice and respected refusals
- Quality assurance: Lets you compare documentation quality across workflows
- Re-consent: Flags when periodic re-consent conversations are due
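A consent log needs little more than an append-only file with the decision, the alternative chosen, and a date. A minimal sketch in Python — the column names are illustrative, and the `StringIO` buffer stands in for a real file kept with your records:

```python
import csv
import io

# Append-only log: each row is one consent decision.
FIELDS = ["date", "patient_id", "decision", "alternative", "notes"]

log = io.StringIO()  # stand-in for a real file stored with your records
writer = csv.DictWriter(log, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({"date": "2026-01-10", "patient_id": "pt-001",
                 "decision": "consented", "alternative": "", "notes": ""})
writer.writerow({"date": "2026-01-12", "patient_id": "pt-002",
                 "decision": "declined", "alternative": "dictation",
                 "notes": "prefers no in-session audio"})

# Compliance summary: how many declined, and which alternatives were used.
log.seek(0)
rows = list(csv.DictReader(log))
declined = [r for r in rows if r["decision"] == "declined"]
print(len(declined), {r["alternative"] for r in declined})  # 1 {'dictation'}
```

An append-only structure matters here: the log should show the history of decisions, including withdrawals, not just each patient's current status.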
Common Scenarios and How to Handle Them
The patient consents initially but withdraws mid-session
"Of course. I'm turning it off now. Everything from this point forward will be documented from my own notes. And I'll review what the AI captured so far before anything goes into your record."
Then do it. Stop the recording immediately. Review the partial output carefully. Consider whether the patient would want you to discard it entirely and re-document from memory.
The patient's guardian consents but the patient (a minor or dependent) objects
Follow the objection. In most therapeutic contexts, the patient's comfort with the recording environment takes precedence for clinical reasons, even if the guardian has legal authority to consent. Document the disagreement and proceed without recording.
A couple or family disagrees (one consents, one doesn't)
The refusal wins. In multi-party sessions, all parties must consent. One "no" means the recording stays off for the entire session. Don't single out or identify the person who declined.
The patient asks: "What do you recommend?"
Be honest about what you use and why, without pressuring:
"I find the AI tool helpful because it lets me focus on our conversation instead of splitting my attention with notes. But I also have a solid workflow for taking notes without it. I'd rather you choose what makes you comfortable — either way, your care is the same."
The patient consents but seems uncomfortable
Check in proactively:
"I notice you seem a bit uneasy. We can absolutely turn it off — no explanation needed. I want you to feel completely comfortable in here."
Trust behavioral cues over verbal agreement. A patient who says "sure, fine" while shifting uncomfortably has not given meaningful consent.
State-by-State Considerations (US)
The regulatory landscape is uneven. Here's a summary of the jurisdictions with specific requirements as of early 2026:
| State | Key requirement | Effective |
|---|---|---|
| Illinois | Separate written consent form for AI transcription; AI cannot make independent clinical decisions; penalties up to $10,000 | 2025 |
| Texas | Written disclosure of AI use prior to or on date of service | Jan 2026 |
| Florida | Written consent 24 hours in advance for AI recording of therapy | Pending (2026 session) |
| Nevada | AI prohibited from representing as mental health provider; all-party consent for phone recordings | 2025 |
| California | Two-party consent for recording; CCPA applies to AI-processed data | Existing + evolving |
For clinicians in states without specific AI legislation: follow your state's recording consent laws as a baseline, add a standalone consent form, and document your process. The regulatory trend is toward more specificity, not less — building a strong consent workflow now protects you as new laws arrive.
International Considerations
European Union
The EU AI Act classifies certain AI systems in healthcare as high-risk, triggering transparency obligations, human oversight requirements, and data governance standards. AI scribes used in clinical documentation may fall under this classification depending on their role in the care pathway.
Under GDPR, explicit consent is required for processing health data. Patients must be informed of the purpose, legal basis, retention period, and their rights to access, correction, and deletion.
Switzerland
The FADP classifies health data as sensitive personal data with elevated consent thresholds. Swiss therapists must provide separate, explicit consent using active opt-in mechanisms — not pre-ticked boxes or implied consent. See our complete FADP compliance checklist for AI transcription for a detailed setup guide.
Canada
PIPEDA and provincial health privacy laws (e.g., Ontario's PHIPA) require consent for the collection, use, and disclosure of personal health information. AI transcription vendors must meet the same security and privacy standards as any other health information custodian.
Consent as Clinical Practice, Not Paperwork
The consent conversation is a clinical interaction, not an administrative checkpoint. How you handle it tells the patient something about how you'll handle everything else — their disclosures, their boundaries, their autonomy.
The practices that get this right share three characteristics:
1. They design for refusal first. The opt-out workflow is as smooth and well-supported as the opt-in workflow. No scrambling, no visible disappointment, no degraded experience.
2. They treat consent as ongoing. A signature on a form is the beginning of the conversation, not the end: regular check-ins, clear mechanisms for withdrawal, and sensitivity to changing comfort levels.
3. They separate the technology decision from the care relationship. The patient's answer to "do you want AI recording?" has zero bearing on the quality, attentiveness, or thoroughness of their care. And the patient knows it.
In 2026, AI-assisted documentation is no longer experimental. It's a standard tool in clinical practice. But standard tools still require informed, voluntary, and genuinely free consent — especially when they involve recording the most private conversations a person will ever have.
Looking for a privacy-first AI documentation tool with built-in consent workflows? Try Dya Clinical free for 7 days.
References
Lawrence, K. et al. (2025). Informed Consent for Ambient Documentation Using Generative AI in Ambulatory Care. JAMA Network Open. https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2836694
AMA Journal of Ethics (2025). How Should We Think About Ambient Listening and Transcription Technologies' Influences on EHR Documentation and Patient-Clinician Conversations? https://journalofethics.ama-assn.org/article/how-should-we-think-about-ambient-listening-and-transcription-technologies-influences-ehr/2025-11
Illinois General Assembly. Wellness and Oversight for Psychological Resources Act (Public Act 104-0054). https://www.heplerbroom.com/blog/illinois-legislation-ai-mental-health-services
Blueprint Health. Integrating AI into Your Practice: How to Navigate Informed Consent Conversations. https://www.blueprint.ai/blog/integrating-ai-into-your-practice-how-to-navigate-informed-consent-conversations
Supanote. Is Recording Therapy Sessions Legal with AI Transcription? https://www.supanote.ai/blog/is-recording-therapy-sessions-legal-with-ai-transcription
Medscape (2026). Health System Sued Over AI Scribe Technology, Patient Consent. https://www.medscape.com/viewarticle/health-system-sued-over-ai-scribe-technology-patient-consent-2026a10001k7
Related articles:
- Ambient Clinical Intelligence in 2026: How Background Listening Changes Notes, Consent, and Patient Trust
- AI Medical Transcription in Switzerland: FADP-Compliant Setup Checklist
- EU AI Act & AI Scribes: What High-Risk Classification Means for Healthcare in 2026
- Session Report Template for Therapists: Structure, Examples & Common Mistakes
- Template Governance for Multi-Practitioner Clinics