Overview
- The RISE Act would shield AI developers from civil liability if they publish detailed model cards outlining training data sources, intended uses, performance metrics and known limitations.
- Developers must update that documentation within 30 days of deploying a new version or identifying a significant failure mode in order to retain immunity.
- Licensed professionals using AI in contexts like medicine, law or finance would remain legally responsible for decisions made with these tools.
- The bill explicitly excludes immunity in cases of recklessness, willful misconduct, fraud, misrepresentation or activities beyond professional scope.
- The proposal enters a broader debate over federal versus state AI oversight and could set a national standard for AI liability and transparency.