Nearly 60 years ago, film director Stanley Kubrick and author Arthur C. Clarke created HAL 9000, a sentient but fatally flawed artificial intelligence, in 2001: A Space Odyssey.1 Although the fictional HAL became operational in 1992, expectations that advances in computer design and construction would produce a real-life HAL have yet to materialize.
Nonetheless, the trajectory of the abilities of generative artificial intelligence (GenAI) should alert us that a HAL-like system could appear without much advance notice. Legal professionals must be ready. In anticipation, the legal community should adopt new rules that recognize the extent to which GenAI is being incorporated into legal practice and judicial decision-making.
GenAI is rapidly being incorporated into the everyday practice of law. “According to the results of a survey released by LexisNexis in August 2023, about half of all lawyers believe generative AI tools will significantly transform the practice of law, and nearly all believe it will have at least some impact (92 percent).
“77 percent believe generative AI tools will increase the efficiency of lawyers, paralegals and law clerks, and 63 percent also believe generative AI will change the way law is taught and studied.”2 Litigators already use computer programs to sift through and organize electronic discovery. Corporate lawyers use contract analytics.3 Attorneys conduct legal research with GenAI search engines. Lawyers even rely on GenAI to determine how judges have historically ruled on similar cases.4 For judges, this raises the issue of whether courts should provide any guidance or restrict the use of GenAI in cases.
Court Approaches to the Use of Generative AI
Courts in the U.S. have taken different approaches to the use of GenAI in court filings.
Standing Orders. Some courts have issued standing orders requiring attorneys to declare that their legal arguments were not drafted by GenAI. For example, Judge Brantley Starr of the U.S. District Court for the Northern District of Texas issued an order requiring counsel who appear in his court to file a certificate attesting “either that no portion of any filing will be drafted by generative artificial intelligence … or that any language drafted by generative artificial intelligence was checked for accuracy, using print reporters or traditional legal databases, by a human being.”5
Prohibitions. Other courts have outright prohibited the use of GenAI, other than legal databases or search engines such as Westlaw or LexisNexis. In Belenzon v. Paws Up Ranch LLC, the plaintiff moved for the admission of counsel pro hac vice. Judge Donald W. Molloy of the U.S. District Court for the District of Montana allowed counsel’s admission but added a provision to the standard pro hac vice admission order indicating that counsel must do their own writing and prohibiting the use of AI automated-drafting programs.6
Explanation and Notice About the Use of AI by Governmental Agencies. Some governmental agencies rely on GenAI to administer social service programs, allocate public resources, and carry out certain enforcement activities. GenAI systems have been found to embed patterns of gender, racial, and income discrimination. In late 2023, President Joe Biden issued Executive Order 14110, which sets forth measures designed to advance equity and protect civil rights in the context of the development of algorithmic systems. In litigation related to these issues, courts have ruled that when governmental actors make decisions supported by GenAI, they must clearly communicate the reasons for those decisions in terms that claimants can reasonably understand. Further, governmental entities must provide notices with information that allows affected people to identify errors and pursue corrective action. The goal is to ensure procedural fairness and due process.
Certificates of Compliance. The U.S. Court of Appeals for the Fifth Circuit recently proposed an amendment to the circuit’s rules regulating the use of GenAI by attorneys. The proposed rule would have required a new certificate of compliance on which filers would have to check one of two boxes specifying either 1) that a “generative artificial intelligence program” was not used in generating the court documents, or 2) that a “generative artificial intelligence program” was used during the drafting of the documents and text and that a human being reviewed the documents and text for accuracy. Legal professionals in the Fifth Circuit lodged significant opposition to this new requirement, leading the court eventually to reject the proposed rule. As a result, attorneys and parties remain responsible for the truthfulness and accuracy of their submissions, with no special provisions for AI-generated content.7
Compliance with Existing Ethical Obligations. Florida courts have taken the approach that affirmative disclosure of the use of GenAI in court filings is not required. Instead, the Florida Bar took a more open approach to the use of AI. A Florida Bar advisory ethics opinion seems to encourage the use of AI, stating: “Lawyers may use generative artificial intelligence (‘AI’) in the practice of law but must protect the confidentiality of client information, provide accurate and competent services, avoid improper billing practices, and comply with applicable restrictions on lawyer advertising.”8 While Florida’s approach does not require disclosure of the use of AI, it relies on attorneys’ obligations to discharge their ethical duties, including keeping clients apprised of the use of this technology and protecting clients’ interests.
The goal of each approach seems to be to require lawyers who use GenAI to be transparent about the use of this tool to ensure clients’ interests are protected.
ABA Formal Ethics Opinion 512
On July 29, 2024, the American Bar Association (ABA) released Formal Ethics Opinion 512 related to lawyers using GenAI tools. The opinion raises some important considerations for lawyers who use this technology. The opinion discusses issues of competence, confidentiality, communication, and candor toward tribunals.
ABA opinions are not binding on Wisconsin lawyers, but because of the similarities between the ABA’s Model Rules of Professional Conduct and the Wisconsin Supreme Court Rules, the opinion’s discussion of these issues is instructive for Wisconsin lawyers and judges.
ABA Formal Ethics Opinion 512 concerns and the related Wisconsin Supreme Court Rules include the following:
Competence (SCR 20:1.1). Attorneys are advisors who owe their clients competent representation. This means that lawyers using GenAI must have a reasonable understanding of not only the law but also the capabilities and limitations of GenAI tools in order to serve clients’ best interests. Because GenAI relies on complex algorithms, lawyers must take care in how they word their queries. They must also be aware that biased content can lead a GenAI tool to produce false or misleading results, or “hallucinations.” Thus, human verification of GenAI output is necessary.
Confidentiality (SCR 20:1.6). Lawyers owe clients the duty of maintaining confidentiality. Attorneys are not permitted to reveal information related to a client’s case unless the client gives informed consent. As a result, when attorneys input a client’s data into a GenAI tool, they must evaluate whether this information could later be disclosed to or accessed by others outside the firm. Many GenAI tools are self-learning: facts and circumstances related to a case that are entered into a GenAI database do not disappear. The machine stores the information and incorporates it into its algorithm. Because of this, the best practice appears to be to obtain the client’s informed consent before entering the client’s information into a GenAI tool, because doing so could result in disclosure to others, even if inadvertent.
Communication (SCR 20:1.4). Lawyers have a duty to communicate with their clients. This rule requires attorneys to consult with their clients and to make sure that clients have sufficient information to make intelligent decisions related to their case. Lawyers are not required to consult with clients in every situation. The ABA opinion recommends, however, that counsel tell clients in the engagement letter that GenAI may be used to assist in the delivery of legal services. This allows the client to determine the level of reliance they want to place on machine-driven information.
Candor Toward the Tribunal (SCR 20:3.3). Lawyers have a duty not to make false statements of fact or law to courts. GenAI programs and tools can be helpful, but they have been known to make up information (for example, by citing nonexistent opinions), provide inaccurate analysis of the law, and make misleading arguments. The comments to SCR 20:3.3 note that an advocate is responsible for the pleadings and other documents prepared for litigation. Lawyers are responsible for ensuring their legal arguments are not based on false representations of the law. These rules preserve integrity in the adjudicative process.
Conclusion
All the concerns raised by ABA Ethics Opinion 512 suggest that courts and bar associations should develop methods and guidelines to deal with the use of GenAI. The world is rapidly changing due to advancements in technology. We need to recognize the near-magical capabilities that a HAL-like entity (a sentient or nearly sentient utility) would bring to lawyers and judges at all levels. The legal community should embrace this new technology but with transparency, with consistency, and in accordance with the rules of ethics. If courtroom and ethics rules do not acknowledge and anticipate the benefits and drawbacks of incorporating GenAI into law and legal practice, lawyers, judges, and other legal professionals might always lag behind the technology.
State Bar of Wisconsin Sources on AI in Law Practice and in Courts
The State Bar of Wisconsin has published articles on the use of artificial intelligence in law practice and in the courts. Check out these recent articles and watch for more to come in 2025.
Timothy D. Edwards & Hayley Rich, Discovering & Admitting AI Data in State & Federal Courts: Part 1, 97 Wis. Law. 10 (November 2024). The authors introduce readers to artificial intelligence (AI) in the context of civil litigation. Discovery should always be conducted with a view toward admissibility of evidence for summary-judgment and trial purposes, and AI-based evidence poses special challenges that lawyers must consider.
Brent J. Hoeft, Current State of Generative AI in Legal: Benefits, Risks, and Best Practices, 97 Wis. Law. 33 (November 2024). Generative artificial intelligence (GenAI) has the potential to revolutionize many aspects of legal practice. However, using GenAI also raises ethical and professional considerations.
Matthew M. Beier, The AI Revolution in Law: There’s No Turning Back, 97 Wis. Law. 41 (November 2024). Attorneys have a professional obligation to learn what artificial intelligence is and when and how to use it responsibly to avoid associated risks.
Timothy D. Edwards & Hayley Rich, Discovering and Admitting AI Data in State and Federal Court: Part 2, 97 Wis. Law. 8 (December 2024). This is the second of a two-part article on the discoverability and admissibility of artificial intelligence (AI) data. This part addresses the evidentiary issues that accompany the admissibility of AI and provides tips for attorneys and judges.
Bonnie Shucha, Getting Started with GenAI in Legal Practice, 97 Wis. Law. 29 (December 2024). The author offers advice for approaching GenAI in legal practice, examines types of GenAI tools and key policy considerations, and provides a step-by-step approach to building competence.
Endnotes
1 Arthur C. Clarke, 2001: A Space Odyssey (1968). The science fiction novel was developed concurrently with Stanley Kubrick’s film version and published after the April 3, 1968, release of the film. Clarke and Kubrick worked on the book together, but only Clarke is credited as the official author.
2 LexisNexis, Generative AI and the Law, https://www.lexisnexis.com/html/lexisnexis-generative-ai-story (last visited Dec. 11, 2024).
3 Carole Basri, eDiscovery for Corporate Counsel § 26:16 (Thomson Reuters 2024).
4 See ABA Formal Ethics Op. 512 (July 29, 2024), https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf.
5 Judge Brantley Starr, In re: Mandatory Certification Regarding Generative Artificial Intelligence, Misc. Order No. 2 (N.D. Tex. May 30, 2023); see also Jacqueline Thomsen, US Judge Orders Lawyers to Sign AI Pledge, Warning that Chatbots ‘Make Stuff Up,’ Reuters (June 2, 2023).
6 Eugene Volokh, Federal Judge Forbids Use of ChatGPT by Out-of-State Lawyers, The Volokh Conspiracy (May 29, 2023); see also Order Granting Pro Hac Vice Admission, Belenzon v. Paws Up Ranch LLC, No. 9:23-cv-00069 (D. Mont. June 22, 2023).
7 Complex Discovery Blog, Fifth Circuit Rejects Proposed AI Regulation for Legal Filings After Widespread Opposition (June 20, 2024), https://complexdiscovery.com/fifth-circuit-rejects-proposed-ai-regulation-for-legal-filings-after-widespread-opposition/.
8 Fla. Bar Advisory Ethics Op. 24-1 (January 19, 2024).
» Cite this article: 98 Wis. Law. 45-47 (January 2025).