    Wisconsin Lawyer
    November 07, 2024

    Managing Risk
    The AI Revolution in Law: There's No Turning Back

    Regardless of one's level of comfort with artificial intelligence, AI is here to stay. Attorneys have a professional obligation to learn what AI is and when and how to use it responsibly to avoid associated risks.

    Matthew M. Beier


    When I was a young associate at a medium-sized firm in Madison, the president excitedly distributed BlackBerry cell phones to all the lawyers, a move that put us at the forefront of legal technology at the time. Back then, there was debate over whether the BlackBerry was a helpful tool or merely a “short leash.”1

    Today, there’s no debate – legal technology has advanced to the point where tools such as generative artificial intelligence (AI), “smart contracts,” data analytics, and cloud computing are so powerful that they are revolutionizing the practice of law. These tools and others were highlighted at a recent National Association of Bar Related Insurance Companies (NABRICO) conference in Calgary that other Wisconsin Lawyers Mutual Insurance Co. leaders and I attended.

    If your stance on AI is “I’ll retire before I use that,” you might want to consider an early exit, because AI is rapidly becoming integral to the legal field. At the NABRICO conference, over half the programming focused on how AI is being used by lawyers and insurance companies and how to implement it safely. One presentation, titled “Generative AI – No Slowing Down and No Going Back,” emphasized that it’s no longer a question of whether to use AI but when and how to do so responsibly to avoid risks associated with its use.

    The legal industry creates massive amounts of information. Legal tech that is used to store, manage, search, create, and communicate that information is nothing new. But new software and products are transforming the efficient delivery of legal services, and clients are demanding that lawyers use AI as a cost-saving measure. The main thrust of AI is to automate some of the routine legal work so that lawyers can focus on client contact and strategy. So, what are these tools and how do they work?

    Generative AI

    Generative AI has recently been the subject of much of the legal tech revolution discussion. ChatGPT is the large language model probably most recognized by the public. When asked to describe itself in two sentences, ChatGPT said this: “ChatGPT is an advanced AI language model designed to generate human-like text, assist with tasks, answer questions, and engage in conversations across a wide range of topics. It leverages deep learning techniques to understand context and provide meaningful, context-aware responses.” In other words, ChatGPT is a computer program that attempts to simulate human intelligence when interacting with users.

    Matthew M. Beier, U.W. 2000, is senior vice president at Wisconsin Lawyers Mutual Insurance Co., Madison.

    These large language models analyze vast amounts of data from the internet (open-ended) or proprietary (closed-ended) sources and “predict” human responses to various prompts. The responses are often very impressive. Equally impressive is the amount of human (and nonhuman) effort that goes into tailoring and improving these models for specific industries, including the practice of law.
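
    For readers curious about what interacting with one of these models looks like in practice, the sketch below shows, in Python, a single prompt being sent to a chat-completion service and predicted text coming back. It is a minimal illustration only, assuming the OpenAI Python SDK and an API key in the environment; the model name and the prompt are placeholders, and the output is predicted text, not verified legal authority.

        # Minimal sketch: send a prompt to a large language model and print the predicted text.
        # Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set (illustrative only).
        from openai import OpenAI

        client = OpenAI()  # reads the API key from the environment

        response = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": "You are a concise assistant for legal professionals."},
                {"role": "user", "content": "In two sentences, what is a smart contract?"},
            ],
        )

        # The model predicts a plausible answer; it does not verify it.
        print(response.choices[0].message.content)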

    The story of the New York lawyer who used ChatGPT to write a legal brief that included completely fictional cases and citations is now well known. The industry response was swift, with risk management programs and articles aplenty exposing ChatGPT’s drawbacks and shortcomings, reminding lawyers of their ethical obligations to clients and the profession, and cautioning against relying on ChatGPT as an authoritative source.

    At the same time, developers and programmers have raced to produce better, safer products for lawyers – products like Spellbook,2 which “uses GPT-4 to review and suggest language for your contracts and legal documents, right in Microsoft Word”; Lexis+AI (from LexisNexis); and CoCounsel (from Thomson Reuters). In addition to contract drafting and review, Lexis+AI3 and CoCounsel4 are designed to provide succinct answers to complex legal questions, complete with citations to relevant statutes and case law.

    All these programs operate in what is known as a closed-end system or library: a self-contained environment in which the model operates within a predefined set of constraints, such as a specific dataset, task, or application domain. In such a system, GPT is restricted to a customized knowledge base, so outputs are more controlled and focused. This is especially important for uses such as legal services, for which security, accuracy, and relevance are paramount. Lawyers can control “where” GPT looks for responses to questions and prompts – specific jurisdictions, firm-uploaded briefs, specific legal resources, and so on. Keeping prompts and documents within a closed-end system also reduces the risk of inadvertent disclosure of sensitive firm and client information.
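
    To make the closed-end idea concrete, the following sketch shows one common pattern (often called retrieval-augmented generation): the program first pulls passages from a firm-controlled library and then instructs the model to answer only from those passages. This is not how Spellbook, Lexis+AI, or CoCounsel are actually built; the library contents, the naive keyword search, and the model name below are hypothetical placeholders, and commercial products use far more sophisticated retrieval and security controls.

        # Conceptual sketch of a closed-end workflow: the model sees only passages
        # drawn from a firm-controlled library, never the open internet.
        # Library contents, the keyword search, and the model name are hypothetical.
        from openai import OpenAI

        FIRM_LIBRARY = {
            "engagement-letter-template.txt": "[firm-approved engagement letter language]",
            "restrictive-covenant-memo.txt": "[internal research memo on restrictive covenants]",
        }

        def retrieve(question: str) -> str:
            """Return only firm-library passages that share words with the question."""
            words = set(question.lower().split())
            hits = [text for text in FIRM_LIBRARY.values()
                    if words & set(text.lower().split())]
            return "\n\n".join(hits)

        def answer_from_library(question: str) -> str:
            client = OpenAI()
            response = client.chat.completions.create(
                model="gpt-4o",  # placeholder
                messages=[
                    {"role": "system",
                     "content": "Answer using ONLY the firm materials provided. "
                                "If they do not answer the question, say so."},
                    {"role": "user",
                     "content": f"Materials:\n{retrieve(question)}\n\nQuestion: {question}"},
                ],
            )
            return response.choices[0].message.content

        print(answer_from_library("What does our standard engagement letter say about fees?"))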

    Smart Contracts

    “Smart contracts are digital contracts stored on a blockchain that are automatically executed when predetermined terms and conditions are met.”5 For many people, the term blockchain brings to mind cryptocurrency. The main reasons to use smart contracts are efficiency, certainty, cost reduction, and mitigation of risks. Smart contracts are used in the delivery of life-saving medications and for retailer-supplier relationships, international trade, real estate, and other areas.6
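
    Real smart contracts are deployed on a blockchain, typically written in a language such as Solidity, which is what makes their execution automatic and tamper-resistant. Purely as a conceptual sketch of the idea quoted above (code that carries out a deal on its own once predefined conditions are recorded as met), the hypothetical escrow below is written in ordinary Python; the parties, conditions, and dollar amount are invented for illustration.

        # Conceptual model only; real smart contracts run on a blockchain, not in Python.
        # This sketch just models "automatically executed when predetermined terms
        # and conditions are met." All details are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class EscrowContract:
            buyer: str
            seller: str
            amount: float
            conditions: dict = field(default_factory=lambda: {
                "goods_delivered": False,
                "inspection_passed": False,
            })
            released: bool = False

            def record(self, condition: str) -> None:
                """Mark a predetermined condition as met, then check whether to execute."""
                self.conditions[condition] = True
                self._maybe_execute()

            def _maybe_execute(self) -> None:
                if all(self.conditions.values()) and not self.released:
                    self.released = True
                    print(f"Releasing ${self.amount:,.2f} from {self.buyer} to {self.seller}")

        contract = EscrowContract(buyer="Retailer LLC", seller="Supplier Inc.", amount=25_000)
        contract.record("goods_delivered")
        contract.record("inspection_passed")  # both conditions met: funds release automatically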

    Data Analytics

    Law firms use data analytics for the collection, processing, and analysis of vast amounts of legal, business, and client data to uncover patterns, trends, and insights that can improve decision-making and operational efficiency. By leveraging advanced data tools, law firms can optimize case strategies, predict litigation outcomes, streamline billing practices, improve client services, and ensure compliance with legal regulations, ultimately driving more informed, data-driven legal practices.
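
    As a small, concrete taste of what this can mean day to day, the sketch below summarizes a handful of hypothetical matters with pandas, computing an effective hourly rate and average time to close by practice area. The data, column names, and figures are invented for illustration; real analytics platforms work on far larger datasets drawn from practice management and billing systems.

        # Illustrative only: hypothetical matter data summarized with pandas.
        import pandas as pd

        matters = pd.DataFrame({
            "practice_area": ["litigation", "litigation", "real estate", "estate planning"],
            "hours_billed": [120.5, 310.0, 42.0, 18.5],
            "amount_collected": [28_000, 61_000, 9_800, 4_200],
            "days_to_close": [410, 655, 90, 30],
        })

        # Effective realized rate per matter: dollars actually collected per hour billed.
        matters["effective_rate"] = matters["amount_collected"] / matters["hours_billed"]

        # Averages by practice area, the kind of summary that can inform
        # staffing, pricing, and case-acceptance decisions.
        summary = matters.groupby("practice_area")[["effective_rate", "days_to_close"]].mean()
        print(summary.round(1))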

    Cloud Computing

    One of the most significant catalysts for cloud computing was the COVID-19 pandemic. Since the start of the pandemic in 2020, the benefits and accessibility of remote collaboration have grown immensely. According to the ABA’s 2023 Cloud Computing TechReport, “Cloud usage increased significantly from 60% to 70%.”7 The same report noted that cloud computing and AI are closely intertwined in practice management and legal research.

    The benefits are many and obvious. Lawyers can store and access their data from anywhere with an internet connection, allowing them to maintain communications with coworkers and clients. Gone are the days of having large, expensive on-site servers to protect data. The southeastern United States is recovering from two major hurricanes, Helene and Milton. There is no doubt that losses in states including Florida, Georgia, South Carolina, North Carolina, and Tennessee will be huge, but recovery of digital information will be much faster and more complete for many individuals and entities because of cloud computing.

    Ethical Considerations

    In response to lawyers using AI and ChatGPT carelessly, commentators have pointed out the several Rules of Professional Conduct that are implicated when these tools are used. In a 2023 Wisconsin Lawyer article, Aviva Kaiser explained, “Lawyers using ChatGPT must carefully manage nonlawyer assistance [SCR 20:5.3], protect confidentiality [SCR 20:1.6], provide competent representation [SCR 20:1.1], exercise independent professional judgment [SCR 20:2.1], verify the accuracy and authenticity of text and citations generated by the software [SCR 20:4.1, SCR 20:3.3, and SCR 20:8.4(c)], and perform other duties owed to clients and third parties. [SCR 20:8.4(i)].”8

    In addition, the ABA Standing Committee on Ethics and Professional Responsibility has issued its first formal opinion focusing on the use of generative AI by lawyers.9 The opinion is very consistent with Kaiser’s analysis and provides the following guidance:

    Model Rule 1.1 – Competence: “To competently use a GAI [generative artificial intelligence] tool in a client representation, lawyers need not become GAI experts. Rather, lawyers must have a reasonable understanding of the capabilities and limitations…. Because GAI tools are subject to mistakes, lawyers’ uncritical reliance on content created by a GAI tool can result in inaccurate legal advice to clients or misleading representations to courts and third parties. Therefore, a lawyer’s reliance on, or submission of, a GAI tool’s output – without an appropriate degree of independent verification or review of its output – could violate the duty to provide competent representation as required by Model Rule 1.1.”10

    Model Rules 1.6, 1.9(c), and 1.18(b) – Confidentiality: “Before lawyers input information relating to the representation of a client into a GAI tool, they must evaluate the risks that the information will be disclosed to or accessed by others outside the firm. Lawyers must also evaluate the risk that the information will be disclosed to or accessed by others inside the firm who will not adequately protect the information from improper disclosure or use…. Because GAI tools now available differ in their ability to ensure that information relating to the representation is protected from impermissible disclosure and access, this risk analysis will be fact-driven and depend on the client, the matter, the task, and the GAI tool used to perform it.”11

    Model Rule 1.4 – Communication: “Of course, lawyers must disclose their GAI practices if asked by a client how they conducted their work, or whether GAI technologies were employed in doing so, or if the client expressly requires disclosure under the terms of the engagement agreement or the client’s outside counsel guidelines. There are also situations where Model Rule 1.4 requires lawyers to discuss their use of GAI tools unprompted by the client. For example, as discussed in the previous section, clients would need to be informed in advance, and to give informed consent, if the lawyer proposes to input information relating to the representation into the GAI tool. Lawyers must also consult clients when the use of a GAI tool is relevant to the basis or reasonableness of a lawyer’s fee.”12

    Model Rules 3.1, 3.3, and 8.4(c) – Meritorious Claims and Candor: “In judicial proceedings, duties to the tribunal likewise require lawyers, before submitting materials to a court, to review these outputs, including analysis and citations to authority, and to correct errors, including misstatements of law and fact, a failure to include controlling legal authority, and misleading arguments.”

    Model Rules 5.1 and 5.3 – Supervisory Responsibilities: “Managerial lawyers must establish clear policies regarding the law firm’s permissible use of GAI, and supervisory lawyers must make reasonable efforts to ensure that the firm’s lawyers and nonlawyers comply with their professional obligations when using GAI tools. Supervisory obligations also include ensuring that subordinate lawyers and nonlawyers are trained, including in the ethical and practical use of the GAI tools relevant to their work as well as on risks associated with relevant GAI use.”13 The opinion also considers lawyers’ obligations to vet third-party providers, as discussed in prior ABA opinions.

    Model Rule 1.5 – Fees: “[B]efore charging the client for the use of the GAI tools or services, the lawyer must explain the basis for the charge, preferably in writing…. If a lawyer uses a GAI tool to draft a pleading and expends 15 minutes to input the relevant information into the GAI program, the lawyer may charge for the 15 minutes as well as for the time the lawyer expends to review the resulting draft for accuracy and completeness.”

    The lawyer should also consider whether a cost is overhead or an out-of-pocket expense. “For example, when a lawyer uses a GAI tool embedded in or added to the lawyer’s word processing software to check grammar in documents the lawyer drafts, the cost of the tool should be considered to be overhead. In contrast, when a lawyer uses a third-party provider’s GAI service to review thousands of voluminous contracts for a particular client and the provider charges the lawyer for using the tool on a per-use basis, it would ordinarily be reasonable for the lawyer to bill the client as an expense for the actual out-of-pocket expense incurred for using that tool.”14

    Legal Malpractice – Current and Future Claims

    There is limited formal guidance for attorneys on avoiding the worst outcomes from the use of AI. Currently that guidance comes from professional commentary, a few cases, a handful of ethics opinions, and some local court rules from jurisdictions that have dealt with the errors directly. Aside from the obvious mistakes in which a lawyer uses generative AI to submit briefs with fake quotations and citations, there are very few malpractice claims that involve the use of AI. When claims attorneys from 19 NABRICO insurance companies were asked how many and what types of AI-related claims their insurers had seen, only one claim was reported, and its facts closely followed the “fake citation” pattern described above. Nonetheless, the speed at which legal tech is moving and its unavoidable effects on the practice of law have many in the legal and insurance industries nervous.

    The elements of a legal malpractice claim require a claimant to establish a duty owed by the lawyer to the client, a breach of that duty, and that the breach caused, or was the proximate cause of, damages to the client. While the discussion thus far has centered on the risks, pitfalls, and ill-advised uses of generative AI, it is worth noting that the title of the NABRICO program, “No Slowing Down and No Going Back,” is apt. Because of client demand, efficiency, and sheer inevitability, the duty of care owed by lawyers to clients is likely to require that lawyers use generative AI rather than steer clear of it altogether.

    Conclusion

    The future of AI in the legal industry is not a question of “if” but “how” it will reshape the profession. As legal tech tools like generative AI, data analytics, smart contracts, and cloud computing continue to evolve, they are becoming indispensable for improving efficiency, client service, and decision-making. Lawyers must adapt to this shift, responsibly integrating these technologies while maintaining ethical standards, to remain competitive in an increasingly digital and data-driven world.

    Endnotes

    1 Carolyn Elefant, The Blackberry: A Short Leash or Liberation?, My Shingle (Dec. 28, 2005), https://myshingle.com/2005/12/articles/operations/the-blackberry-a-short-leash-or-liberation/.

    2 https://www.spellbook.legal/.

    3 https://www.lexisnexis.com/en-us/products/lexis-plus-ai.page.

    4 https://www.thomsonreuters.com/en/artificial-intelligence.html.

    5 IBM, What Are Smart Contracts on Blockchain?, https://www.ibm.com/topics/smart-contracts (last visited Oct. 13, 2024).

    6 Id.

    7 Michael D.J. Eisenberg, ABA 2023 Cloud Computing Tech Report, The Status and Future of Cloud Computing for Attorneys: Harnessing Artificial Intelligence in Practice Management and Legal Research (Jan. 29, 2024), https://www.americanbar.org/groups/law_practice/resources/tech-report/2023/2023-cloud-computing-techreport/.

    8 Aviva Kaiser, Ethical Obligations When Using ChatGPT, 96 Wis. Law. 41 (Feb. 2023), https://www.wisbar.org/NewsPublications/WisconsinLawyer/Pages/Article.aspx?Volume=96&Issue=2&ArticleID=29597#1.

    9 ABA Standing Comm. on Ethics & Pro. Resp., Formal Op. 512 (July 29, 2024), https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf.

    10 Id.

    11 Id.

    12 Id.

    13 Id.

    14 Id.

    » Cite this article: 97 Wis. Law. 41-44 (November 2024).

