
How to Talk to Clients About AI in Your Therapy Documentation
A practical guide for therapists ready to adopt AI documentation tools but uncertain how to disclose it to clients. Covers ethics code requirements from APA, NASW, and AAMFT, what to say for recording-based vs generation-based tools, sample consent language, and how to handle pushback.
You have researched AI documentation tools. You have watched the demos, read the comparison articles, and decided you want to try one. And then you stopped.
Not because of the software. Not because of the price. Because of the conversation you would need to have with your clients.
"How do I bring this up?" is one of the most common questions therapists ask before adopting AI documentation. It makes sense. You spend your professional life navigating difficult conversations with skill and care. The idea of explaining AI to a client who came to you carrying something personal feels like it deserves the same care you give every other disclosure you make.
This guide is about that conversation. Why it is ethically required, what exactly you need to say, how the disclosure differs depending on the type of tool you use, and what to do when a client pushes back.
Why the Disclosure Is Required, Not Optional
Some therapists frame the AI consent conversation as a courtesy. It is not. Three major professional codes treat transparency about documentation tools as an ethical obligation.
The APA Ethics Code (American Psychological Association, 2017) addresses this under Standard 4.02 (Discussing the Limits of Confidentiality) and Standard 3.10 (Informed Consent). The code requires psychologists to "obtain informed consent for services" and to explain "the nature of the anticipated services." If AI is shaping how your clinical record is created, that is part of the nature of your services. The APA's 2024 Resolution on Technology in Psychological Practice further affirms that transparency about technology tools is consistent with the profession's ethical commitment to client autonomy.
The NASW Code of Ethics (National Association of Social Workers, 2021 revision) is more explicit. Standard 1.07 on Privacy and Confidentiality states that social workers must inform clients "about the purposes and nature of any use of technology" that affects the handling of client information. Standard 1.03 on Informed Consent requires informed consent for any aspect of service delivery. The NASW also explicitly addresses electronic and digital practices in Standard 1.07(m), which requires disclosure of confidentiality limits and technology risks.
The AAMFT Code of Ethics (American Association for Marriage and Family Therapy, 2015, updated principles) requires informed consent under Principle 1 (Responsibility to Clients), including disclosure of "any aspects of the therapeutic process" that may affect confidentiality. AAMFT's technology-specific guidance (published 2021) extends this to "any AI or technology tool that processes client information."
The short version: your ethics code almost certainly requires you to disclose AI use in your documentation. This is not a legal technicality. It is a foundational element of the trust that makes therapy work.
There is also a growing layer of state law. Illinois Public Act 104-0054, enacted August 2025, requires written client consent before any AI tool is used to record or transcribe a psychotherapy session. New York has proposed similar legislation. More than forty state-level AI mental health bills were active across twenty-five states as of early 2026. Even where state law does not yet require disclosure, your ethics code does.
The Recording-Based vs. Generation-Based Distinction Matters More Than You Think
Before you can write a disclosure, you need to understand what kind of tool you are using, because the consent conversation is fundamentally different for each type.
Recording-Based Tools (Ambient Scribes)
Ambient scribes work by recording the therapy session. The audio is transcribed in real time or near-real time, and an AI model generates a clinical note from the transcript. The session is the raw material.
If you use this type of tool, you are asking a client to consent to being recorded. That is a materially different conversation from disclosing that you use software to organize your post-session notes. The recording itself:
- Is subject to state recording consent laws (11 states and the District of Columbia require all-party consent before any session recording)
- May exist on a third-party server even if the vendor states it is deleted after processing
- Is a documented concern for clients with trauma histories, involvement in legal proceedings, domestic violence situations, or occupational exposure (military, government, law enforcement, law)
- Carries Illinois-specific legal requirements for written consent as of August 2025
The disclosure for a recording-based tool has to name the recording, explain what happens to the audio, identify the third-party service by name, and confirm the client's right to decline.
Generation-Based Tools (Post-Session AI)
Other tools work without any session recording. The therapist writes a brief summary of the session after it ends, and the AI structures that summary into a clinical note format. The AI sees only what the therapist chose to write. There is no audio, no transcript, and the session content is never directly processed.
This is called a generation-based AI workflow. For clients with any privacy concern, this is a meaningfully different situation. You are not telling a client that their words are being recorded or transcribed. You are telling them that the notes you write after your sessions are organized with the help of AI software.
That conversation is simpler. The most fraught element of the ambient scribe disclosure ("an AI listened to your session") is absent.
Understanding which category your tool falls into shapes everything: what you put in your written consent form, what you say in the verbal conversation, and how you respond when a client asks follow-up questions. Do not use ambient-scribe consent language for a generation-based tool, or vice versa. The difference is material.
What Exactly to Disclose
The core elements of an AI documentation disclosure are the same across tool types, though the specifics differ. A complete disclosure includes:
1. That AI is involved. The client should know that software, and specifically AI software, plays a role in creating their clinical notes.
2. How the tool works. For ambient tools: that the session is recorded. For generation-based tools: that you write a summary after the session and the AI structures it. Clients do not need a technical explanation, but they need an accurate one.
3. What information the AI processes. For ambient tools: the full audio of the session. For generation-based tools: your written clinical summary (not a recording, and not the client's direct words).
4. Who provides the AI service. You do not need to describe the company's infrastructure in detail, but naming the tool or vendor and directing the client to the privacy policy is reasonable. "I use [tool name]; their privacy policy explains how they handle data."
5. What privacy protections are in place. If your tool has a Business Associate Agreement, you can say so. If it does not, be transparent about what protections do exist and what does not. Clients who ask specific HIPAA questions deserve honest, specific answers.
6. That the client can decline. This is non-negotiable. Consent must be voluntary to be valid. Tell the client they can say no, and that saying no will not affect their care.
7. How to document their choice. The consent conversation should be noted in the clinical record, not just on a form.
When to Have the Conversation
At Intake
Intake is the right time for new clients. You are already covering confidentiality, limits to confidentiality, your billing practices, and how records may be used. AI documentation belongs in the same cluster of disclosures. Clients are in information-receiving mode. An AI disclosure at intake registers as practice policy, not a surprise.
Introduce it after a few minutes of rapport-building, not as the opening line. Wait until you have explained who you are and what the client can expect before you move into disclosures.
For Existing Clients
Introducing AI to an existing client requires more care. The relational frame is established. Clients have formed expectations of how you work, and a new disclosure, even about something practical, signals change.
The best approach is a brief verbal check-in at the beginning of a session, before you start using the tool. Do not slip the change into updated paperwork without a conversation. Have the conversation first.
Give the client a moment to respond. Do not rush to the next topic. For clients who have shared significant personal history with you, this may need a fuller conversation than the intake version.
Ongoing Check-Ins
Active consent means the conversation is not a one-time event. If you change tools, if your workflow changes materially, or if you learn something significant about how your current tool handles data, the client should know. An annual brief check-in ("I want to confirm you are still comfortable with how I use AI in my note-writing") is also considered best practice by multiple clinical ethics commentators.
A Fictional Example: Introducing AI to a Skeptical Client
Dr. Renata is an LPC in private practice. She has been treating a 52-year-old client, David, for eighteen months. David is a retired attorney, careful with information, and has previously mentioned concerns about digital privacy. When Dr. Renata decided to start using a generation-based AI tool for note-writing, she knew this would be a conversation that required more than a sentence.
At the start of their next session, before any clinical content, she said:
"Before we get into today, I want to tell you about something I am changing in my practice. I have started using an AI tool to help me write my session notes. I still write a summary of what we cover after each session, just as I always have. What is different is that I now use software to help organize that summary into the structured format my notes follow. The tool does not record our sessions. It does not have access to anything you say in this room. It only processes the written summary I create after we finish."
David paused. "So you are sending my information to an AI company?"
"What I send is the clinical summary I write, yes. My summaries describe what we worked on in clinical terms. They do not contain a transcript of what you said. The vendor's privacy policy describes how they handle that data, and I am happy to share the tool's name so you can read it. I also want to be clear that you can decline. If you prefer I not use the tool for your notes, I will continue writing them manually. That will not change anything about our work."
David thought for a moment. "What does the summary say, exactly?"
"It says things like: client discussed ongoing work-related stress and progress on sleep regulation strategies. It uses clinical language to describe themes and interventions, not your words verbatim. I control what goes into it."
"And you still review the note before it is finalized?"
"Yes. The AI produces a draft based on my summary, and I review and edit it. Nothing goes into your chart without my eyes on it."
David nodded. "I appreciate you telling me directly. I am okay with it."
Dr. Renata made a brief note in David's chart: "AI documentation disclosure reviewed verbally. Client informed of generation-based tool workflow (post-session written summary structured by AI; no session recording). Client asked clarifying questions about data handling and therapist review. Client consented. Right to decline reviewed."
That note took her forty-five seconds to write. It documents the key elements: that the disclosure occurred, what was explained, what the client asked, and what the client chose.
Handling Pushback
Client pushback is not a clinical problem. It is informed consent working correctly.
When a client hesitates: Give it space. A hesitation is usually a request for more information. Ask, "Is there a specific question I can answer?" Do not interpret hesitation as refusal or as resistance in the clinical sense.
When a client says no: Accept it without negotiation. "That is completely your choice, and it does not affect your care. I will note your preference and continue with my current method." Do not revisit the topic repeatedly. If the client changes their mind later, they will bring it up.
When a client asks about security: Be specific and honest. If your tool has a BAA, say so. If it does not, explain what protections are in place instead of deflecting. Clients who ask detailed security questions are often data-literate, and vague reassurances will not satisfy them, nor should they.
When a client asks whether AI can be wrong: Yes, it can. And your answer should acknowledge that. "That is a fair question. The AI produces a draft from my clinical summary, and I review every note before it goes into your chart. The final record reflects my clinical judgment, not the AI's output." This is an honest answer and it is also the accurate one. A good documentation workflow always places the clinician as the reviewer before any note is finalized.
When a client asks if this is required: No. The client may assume you are telling them because they are about to be recorded whether they like it or not. Make clear that their choice matters. "I use this tool in my practice, but your consent is required before I use it for your record. You can absolutely say no."
Written Disclosure Language
The verbal consent conversation should be supplemented by written disclosure in your practice paperwork. The following is a template for a generation-based AI tool. Modify it for your specific tool and circumstances:
Use of AI-Assisted Documentation
This practice uses AI software to assist with formatting clinical session notes. After each session, your therapist writes a clinical summary of the session. That summary is processed by [tool name] to generate a structured progress note. No session recording takes place. The AI tool processes the therapist's written summary only, not your direct words or any audio of our sessions.
Your consent to this practice is voluntary. You may decline at any time by notifying your therapist. Declining will not affect your care in any way. If you have questions about how your information is handled, ask your therapist or review the privacy policy at [tool URL].
Client consent:
_____ I consent to AI-assisted documentation as described above.
_____ I decline AI-assisted documentation. I understand my care will not be affected.
Signature: _________________________ Date: __________
For a recording-based ambient tool, the written disclosure must also describe the recording, name the vendor and their data retention policy, and include state-specific language if you practice in an all-party consent state.
How to Document That You Obtained Consent
Documentation of the consent conversation belongs in the clinical record, whether in the intake note or in a session note if the disclosure happened mid-treatment.
A complete note entry might read:
"AI documentation consent reviewed verbally with client at intake. Client informed that generation-based AI tool is used to structure post-session written summaries into note format. No session recording is used. Client informed of right to decline without impact to care. Client asked [note specific questions if any]. Client signed written AI disclosure form. Client consented to AI-assisted documentation."
If the client declined, note that as well:
"Client declined AI-assisted documentation. Client preference documented. Manual note-writing will continue for this client."
The documentation creates a record that the consent process occurred, what was explained, what the client chose, and that the client understood their choice was voluntary. That record is what you would produce if a licensing board, ethics committee, or legal proceeding ever asked whether the client knew AI was involved in their documentation.
A Note on Tools That Do Not Record
Some documentation tools, generation-based ones among them, involve no session recording at all. Therapists using those tools sometimes wonder whether AI involvement needs to be disclosed, since the session itself is never captured.
The answer is yes, disclosure is still required. What the client consented to when they signed their initial paperwork is a therapeutic relationship with you. The addition of any third-party software, AI or otherwise, to the creation of their clinical record is material information they are entitled to know. The fact that the tool processes your written summary and not their words makes the disclosure easier, but it does not make it optional.
Think of it this way: if you hired a transcription service to type your handwritten notes, your client would want to know. AI software that processes your written notes is analogous. The substance of the clinical record involves the client's information, and third-party involvement in that record requires disclosure.
Pre-Adoption Checklist
Before you introduce AI documentation to your practice, confirm the following:
Ethics and legal:
- Reviewed your professional ethics code for AI and technology disclosure requirements (APA Standards 3.10 and 4.02; NASW Standards 1.03 and 1.07; AAMFT Principle 1 and technology guidance)
- Checked your state's recording consent laws (all-party vs. one-party consent)
- Reviewed your state's AI-specific mental health legislation (currently enacted in Illinois; proposed in New York and others)
- Confirmed what privacy protections your tool provides and whether a Business Associate Agreement is available
Consent preparation:
- Determined whether your tool is recording-based or generation-based
- Drafted written disclosure language appropriate for your tool type
- Added AI disclosure to your intake paperwork
- Prepared verbal explanation in your own clinical voice (not a corporate script)
For existing clients:
- Identified clients who should receive a mid-treatment disclosure
- Planned a brief verbal check-in at the start of the next appropriate session
- Prepared for possible questions about data security, HIPAA, and clinical judgment
Documentation:
- Established a standard note entry format for documenting consent and refusals
- Created a system for tracking which clients have consented, declined, or not yet been asked
- Set a reminder for annual consent check-ins with existing clients
If a client declines:
- Confirmed your practice can accommodate manual documentation for individual clients
- Prepared a non-pressured response for refusals
- Noted how you will track consent preferences in the clinical record
One practical note for those evaluating tools before committing: the consent burden differs by tool type. If recording consent in your state or client population is likely to be a friction point, a generation-based tool like NotuDocs avoids that layer entirely. The consent conversation still needs to happen, but "AI helps me format the notes I write" is a much easier disclosure than "AI listens to our sessions."
The consent conversation is not the obstacle it might feel like before you have it. Most clients, when given a clear and honest explanation, move on quickly. The ones who do not will teach you something about what they need from the therapeutic relationship. Both outcomes are worth the thirty seconds it takes.
Related reading: How to Document Informed Consent in Therapy and Clinical Practice | Concurrent Documentation in Therapy | What Insurance Auditors Look For in AI-Generated Therapy Notes


