U.S. Supreme Court urges caution in use of AI


John Roberts, the Chief Justice of the United States, has released a 13-page year-end report examining the effects of artificial intelligence (AI) on the legal profession over the past year.

Describing generative AI as a mixed blessing for the legal sector, Roberts urged “caution and humility” at a time when technology is reshaping how lawyers and judges work.


In the report, Chief Justice Roberts recognized that AI had the potential to increase access to justice for indigent litigants, revolutionize legal research, and assist courts in resolving cases more quickly.

However, he also stressed the need for vigilance, citing privacy concerns and the technology’s current inability to replicate human discretion.

“I predict that human judges will be around for a while… But with equal confidence, I predict that judicial work—particularly at the trial level—will be significantly affected,” Roberts said.

AI Hallucinations

The Chief Justice’s commentary is his most significant discussion to date of the influence of AI on the law.


His comments come as many lower courts contend with how best to adapt to a new technology capable of passing the bar exam yet prone to generating fictitious content, known as “hallucinations.”

He pointed to an instance in which lawyers cited non-existent cases in court papers because of AI hallucinations, observing that such hallucinations had “made headlines this year” and that relying on these tools alone in court matters is “always a bad idea.”

Fake case citation

Michael Cohen, the former fixer and lawyer for Donald Trump, recently illustrated the consequences of such problems. In court papers unsealed last week, Cohen said he had mistakenly given his attorney fake case citations generated by an AI program, which then made their way into an official court filing.

In 2018, Cohen pleaded guilty to campaign finance violations, tax evasion, and lying to Congress.

The incident underscores the potential pitfalls of relying on AI in legal matters without thorough scrutiny.

The AI-generated fictitious cases were cited in a written argument by Cohen’s attorney, David M. Schwartz. Cohen, who previously served over a year in prison, was seeking to close the chapter on his legal battles.

Other instances of lawyers including AI-hallucinated cases in legal briefs have also been documented.

A federal appeals court in New Orleans last month drew headlines by unveiling what appeared to be the first proposed rule by any of the 13 U.S. appeals courts aimed at regulating the use of generative AI tools like OpenAI’s ChatGPT by lawyers appearing before it.

The proposed rule by the 5th U.S. Circuit Court of Appeals would require lawyers to certify either that they did not rely on artificial intelligence programs to draft briefs or that humans reviewed the accuracy of any AI-generated text in their court filings.

Source: Reuters
