What You Need to Know About AI Compliance in PT, OT, and SLP Private Practice
AI tools are making their way into private practice fast. They save time, reduce administrative burden, and help clinicians focus on care. But before you start plugging AI into your workflows, there's a conversation that needs to happen first: compliance. This isn't about being overly cautious or anti-technology. It's about protecting your patients, your practice, and your license. Here's what every PT, OT, and SLP private practice owner needs to understand before adopting AI tools.
HIPAA Is the Starting Line, Not the Finish Line
The first question to ask about any AI tool is simple: does it handle patient data, and if so, is it HIPAA-compliant?
Any vendor that processes protected health information (PHI) on your behalf must sign a Business Associate Agreement (BAA) with your practice. This is not optional. A BAA outlines how the vendor is permitted to use patient data, what safeguards are in place, and what happens in the event of a breach.
Here's where many practice owners get caught off guard: popular general-purpose AI tools, including ChatGPT in its standard consumer form, do not offer BAAs. OpenAI does not sign Business Associate Agreements for consumer ChatGPT accounts, which means using consumer ChatGPT to process, summarize, or document anything involving identifiable patient information is a HIPAA violation, regardless of how useful the tool is.
The practical rule: use general AI tools like ChatGPT and Claude for operational and marketing tasks such as drafting emails, writing blog posts, and building SOPs, but never for anything that involves patient data.
If you want AI in your clinical documentation workflow, use a platform that is purpose-built for healthcare, explicitly HIPAA-compliant, and willing to execute a BAA. Tools like SPRY Scribe, ScribePT, WebPT, and TheraPlatform are designed with this in mind.
Beyond the BAA, HIPAA's minimum necessary standard requires that AI tools access only the patient information strictly needed for their function. This matters because AI models are often designed to learn from as much data as possible, which can conflict with HIPAA's data minimization requirements. Ask vendors directly how they handle this.
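To make the idea concrete, here's a minimal Python sketch of data minimization. The field names and the note-drafting use case are hypothetical, not taken from any specific vendor; the point is simply that the record gets stripped down before anything leaves your system.

```python
# Minimal sketch of HIPAA's "minimum necessary" idea: an AI scribe that
# drafts visit notes may need the visit transcript and discipline, but not
# the patient's SSN, address, or insurance details.
# All field names and the tool interface here are hypothetical.

FIELDS_NEEDED_FOR_NOTE_DRAFTING = {"visit_transcript", "discipline", "visit_type"}

def minimum_necessary(record: dict, allowed_fields: set) -> dict:
    """Return only the fields an AI tool actually needs for its function."""
    return {k: v for k, v in record.items() if k in allowed_fields}

full_record = {
    "patient_name": "REDACTED",
    "ssn": "REDACTED",
    "insurance_id": "REDACTED",
    "visit_transcript": "Patient reports decreased shoulder pain...",
    "discipline": "PT",
    "visit_type": "follow-up",
}

payload = minimum_necessary(full_record, FIELDS_NEEDED_FOR_NOTE_DRAFTING)
# Only now would anything be sent to a BAA-covered vendor, e.g.:
# send_to_scribe(payload)   # hypothetical vendor call
```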
Medicare, CMS, and Documentation Requirements
AI-generated notes do not get a pass on documentation standards. If your practice bills Medicare or works with insurance, every note — regardless of how it was drafted — must meet clinical documentation requirements, demonstrate medical necessity, and use the correct CPT and ICD-10 codes.
This means you are still responsible for reviewing every AI-generated note before it goes into the record or gets submitted for billing. An AI scribe that produces a plausible-sounding but clinically inaccurate note is a liability, not a time saver. Always build review into your workflow.
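One way to enforce that is sketched below in Python, with hypothetical field names: treat clinician sign-off as a hard gate that no draft can pass without.

```python
# Minimal sketch of a "no note enters the record without clinician review"
# gate. The workflow and field names are illustrative, not from any vendor.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DraftNote:
    text: str                       # AI-generated draft
    reviewed_by: str | None = None  # reviewing clinician's credentials
    reviewed_at: datetime | None = None

    def sign_off(self, clinician: str) -> None:
        """Record that a licensed clinician reviewed and approved the draft."""
        self.reviewed_by = clinician
        self.reviewed_at = datetime.now(timezone.utc)

def submit_to_record(note: DraftNote) -> None:
    if note.reviewed_by is None:
        raise PermissionError("AI-drafted note requires clinician sign-off.")
    # ... write to EMR / billing queue ...

note = DraftNote(text="Pt. tolerated therapeutic exercise well...")
note.sign_off(clinician="J. Smith, PT, DPT")
submit_to_record(note)
```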
For PT and SLP practices billing Medicare, the KX modifier threshold for combined PT and SLP services in 2025 is $2,410. Your documentation needs to reflect the medical necessity that justifies services beyond that threshold. AI tools can help you draft those notes faster, but the clinical reasoning still needs to come from you.
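As a quick worked example of the threshold arithmetic (the claim amounts are made up for illustration):

```python
# The 2025 KX modifier threshold for combined PT/SLP services is $2,410.
KX_THRESHOLD_PT_SLP_2025 = 2410.00

def needs_kx_modifier(year_to_date_allowed: float, this_claim: float) -> bool:
    """True once the year-to-date allowed amount would cross the threshold."""
    return year_to_date_allowed + this_claim > KX_THRESHOLD_PT_SLP_2025

year_to_date = 2350.00
claim = 95.00  # 2350 + 95 = 2445, which exceeds 2410
if needs_kx_modifier(year_to_date, claim):
    print("Append KX and confirm the note documents medical necessity.")
```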
What ASHA Says About AI in SLP Practice
For speech-language pathologists, the American Speech-Language-Hearing Association has been clear: AI tools must support clinical judgment, not replace it.
ASHA's Code of Ethics places a direct responsibility on certified professionals to evaluate any technology they use in their work. That includes AI. Just because a tool performs well in a controlled setting or a vendor demo does not mean it will perform the same way in your clinic, with your patient population, or across the full range of communication disorders you treat.
ASHA also flags a specific concern about validation: even if an AI system meets or exceeds expert-level performance in a lab environment, that does not mean it can be readily adopted into clinical practice, that it will perform similarly when deployed in real-world settings, or that the evaluation metrics used actually reflect meaningful clinical outcomes.
There is also currently limited legislative or regulatory oversight of AI tools in healthcare. That means the burden of due diligence falls largely on you as the clinician. If something goes wrong with an AI-generated note or recommendation, your license is on the line, not the software vendor's.
State Licensure and Scope of Practice
Compliance doesn't stop at the federal level. Your state licensure board has its own requirements around scope of practice, telehealth delivery, and supervision, and these vary significantly from state to state.
If you're using AI tools to support remote or telehealth services, you need to know your state's specific rules about how those services can be delivered and documented. Clinical fellows also have special billing and documentation requirements that AI tools need to be configured to accommodate.
Before committing to any AI platform, ask the vendor directly whether their tool is designed to support multi-state compliance and whether they can provide guidance on your specific state's requirements.
AI-Specific Risks Worth Knowing About
A few compliance risks that are specific to AI tools and easy to overlook:
De-identification is not a workaround. Some practice owners assume that removing a patient's name from data before feeding it into an AI tool makes it safe. HIPAA's de-identification standards are more rigorous than that. There are 18 specific identifiers that must be removed, and even de-identified datasets can carry re-identification risks when combined with other data sources.
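For reference, here are the 18 Safe Harbor identifier categories from 45 CFR 164.514(b), expressed as a simple Python checklist. Stripping a patient's name addresses only the first of them.

```python
# The 18 Safe Harbor identifier categories (45 CFR 164.514(b)(2)).
# This is a reference checklist, not a de-identification tool: removing
# a patient's name alone addresses only category 1.
SAFE_HARBOR_IDENTIFIERS = [
    "Names",
    "Geographic subdivisions smaller than a state",
    "Dates (except year) directly related to the individual",
    "Telephone numbers",
    "Fax numbers",
    "Email addresses",
    "Social Security numbers",
    "Medical record numbers",
    "Health plan beneficiary numbers",
    "Account numbers",
    "Certificate/license numbers",
    "Vehicle identifiers and serial numbers, including license plates",
    "Device identifiers and serial numbers",
    "Web URLs",
    "IP addresses",
    "Biometric identifiers, including finger and voice prints",
    "Full-face photographs and comparable images",
    "Any other unique identifying number, characteristic, or code",
]
```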
AI can introduce bias. HHS's Section 1557 final rule now requires covered entities to identify patient care decision support tools that use variables related to protected characteristics (such as race, ethnicity, language, or disability status) and take reasonable steps to mitigate discrimination risks. If the AI tool you're using was trained on data that doesn't reflect your patient population, its outputs may be less accurate or appropriate for certain groups.
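One practical way to watch for this in your own clinic is to track how often AI drafts need major correction, broken down by patient group. The sketch below uses hypothetical data and a hypothetical grouping variable; it illustrates the audit idea, not a validated fairness method.

```python
# Minimal sketch of a subgroup performance check: compare how often an AI
# tool's output needed clinician correction across patient groups.
from collections import defaultdict

# (patient_group, note_needed_major_correction) -- hypothetical review log
review_log = [
    ("English-primary", False), ("English-primary", True),
    ("Spanish-primary", True),  ("Spanish-primary", True),
    ("Spanish-primary", False), ("English-primary", False),
]

totals, corrections = defaultdict(int), defaultdict(int)
for group, corrected in review_log:
    totals[group] += 1
    corrections[group] += corrected  # True counts as 1

for group in totals:
    rate = corrections[group] / totals[group]
    print(f"{group}: {rate:.0%} of AI drafts needed major correction")
# A large, persistent gap between groups is a signal to raise with the vendor.
```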
Audit trails matter. HIPAA requires comprehensive logging of who accessed PHI and when. Your AI vendor should maintain automatic audit logs, and you should confirm this before signing any agreement.
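As a rough illustration of what a useful audit entry captures (all field names here are hypothetical, not any vendor's actual schema):

```python
# Minimal sketch of an audit trail entry: who touched what PHI, when,
# what they did, and why. Fields are illustrative.
import json
from datetime import datetime, timezone

def audit_entry(user_id: str, patient_id: str, action: str, purpose: str) -> str:
    """Serialize one access event as a JSON log line."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "patient_id": patient_id,  # or a pseudonymous reference
        "action": action,          # e.g. "viewed_note", "generated_draft"
        "purpose": purpose,
    })

print(audit_entry("clinician_042", "pt_1138", "generated_draft", "treatment"))
```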
A Compliance Checklist Before You Adopt Any AI Tool
Before integrating any AI tool into your practice, work through these questions:
Does this tool involve patient data in any way? If yes, does the vendor offer a BAA?

Is the platform explicitly HIPAA-compliant, with documented security standards, including encryption at rest and in transit?

Has the tool been validated for use in clinical settings with populations similar to yours?

Does it align with your state's scope of practice and telehealth regulations?

Finally, do you have a review process in place so that no AI-generated content goes into a patient record without clinician oversight?
AI tools can genuinely make your practice more efficient, but they don't operate outside the rules that govern healthcare. HIPAA, CMS documentation standards, ASHA ethics guidelines, and state licensure requirements all still apply, and in some cases, AI introduces new compliance considerations that didn't exist before.
The good news is that compliance and efficiency are not opposites. Purpose-built healthcare AI tools are designed to work within these frameworks. The key is knowing what questions to ask before you adopt anything and making sure your team understands the rules too.
If navigating all of this feels like one more thing on an already full plate, that's a sign your practice may benefit from more operational support.
Virtual Rockstar works with PT, OT, and SLP private practice owners to build the systems and team structures that let you grow without getting buried. Book a free discovery call and let's talk about what that could look like for you.