Generative AI Reshapes Legal Industry Amid Rising Public Trust and Regulatory Oversight

New research highlights public reliance on AI legal advice despite risks of inaccuracies, as EU regulations enforce transparency in AI-generated content.

Overview

  • Generative AI tools like ChatGPT, Harvey, and Noxtua are now widely used in law firms for tasks such as document review, legal research, and contract drafting.
  • A University of Southampton study found laypeople trust AI-generated legal advice as much as or more than advice from human lawyers, even when informed of its source.
  • Concerns persist over AI ‘hallucinations,’ which can lead to inaccurate legal advice, unnecessary complications, or judicial errors.
  • The EU AI Act's Article 50.9 mandates transparency by requiring AI-generated text to be clearly labeled as such, reducing the risk that readers are misled.
  • Ongoing research aims to improve the efficiency, reliability, and cultural sensitivity of legal AI tools, and public education in AI literacy is emphasized as crucial.