Notice & Comment

The Limits of Generative AI in Administrative Law Research, by Susan Azyndar

When my students and I began experimenting with Lexis+AI in my administrative law research course this past spring, we found it ineffective for questions beyond the C.F.R. For example, asking for a recent IRS private letter ruling kept pulling up rulings from the last century, and no prompt seemed able to surface EEOC policy documents. Why did this generative artificial intelligence (GAI) law practice tool struggle so much with administrative law queries?

First, let’s consider what makes regulatory research complex. The regulatory lawyer must look beyond the C.F.R. After all, regulations must be tied to authority, often a statute, and cases interpret both regulations and their underlying authorities. As discussed on this blog, the recent Supreme Court decisions on agencies, perhaps especially Corner Post, will likely yield an uptick in regulatory litigation, adding more cases for legal researchers to sift through. Moreover, agencies produce a wide range of materials beyond the C.F.R., from legal opinions to handbooks to explanatory pamphlets. All of this material holds value for regulatory attorneys.

Second, how does Lexis+AI generate a response to a prompt? Before answering a query, Lexis+AI first retrieves primary law and whatever secondary sources are allowable given copyright constraints (right now, Matthew Bender treatises). Then it uses those sources to reply to the prompt through retrieval-augmented generation (RAG). As Lexis describes it, “The RAG model is an LLM prompt cycle that accesses information external to the model to improve its response to specific queries, rather than only relying upon data that was included in its training data.” (For a fuller explanation, see Paul D. Callister’s Generative AI Large Language Models and Researching the Law.)
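For readers curious about what that retrieve-then-generate cycle looks like mechanically, here is a minimal sketch in Python. It illustrates the general RAG pattern only, assuming a toy document set, a naive keyword-overlap retriever, and a placeholder call_llm function; it is not a description of Lexis+AI’s actual architecture.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Hypothetical example: the toy corpus, keyword-overlap scoring, and the
# call_llm placeholder are illustrative assumptions, not Lexis+AI's design.

from typing import List

CORPUS = [
    "29 C.F.R. part 1604 contains guidelines on discrimination because of sex.",
    "Treas. Reg. section 1.61-1 defines gross income broadly.",
    "An agency adjudication interpreting the agency's own regulation.",
]

def retrieve(query: str, corpus: List[str], k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap with the query and keep the top k."""
    q_terms = set(query.lower().split())
    scored = sorted(corpus, key=lambda doc: -len(q_terms & set(doc.lower().split())))
    return scored[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for a model call; a real system would send the prompt to an LLM."""
    return "[model response grounded in the prompt below]\n" + prompt

def answer(query: str) -> str:
    # Step 1: retrieve sources external to the model that appear relevant to the query.
    sources = retrieve(query, CORPUS)
    # Step 2: augment the prompt with those sources so the model is not
    # limited to whatever was in its training data.
    prompt = (
        "Answer using only these sources:\n"
        + "\n".join(sources)
        + "\n\nQuestion: " + query
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("Which regulation defines gross income?"))
```

The sketch makes the article’s point concrete: if the relevant agency material never enters the retrieval step (the retrieve call here), no amount of careful prompting at the generation step can surface it.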

One limitation of Lexis+AI my students experienced lies in the administrative materials included in the retrieval step. When the tool was released to law schools, Lexis explained that Lexis+AI included “top agency decisions,” a very generic description. I was told these decisions came from only a handful of sources: EEOC, GAO, Merit Systems Protection Board, Department of the Interior, and EPA decisions, along with trademark and patent decisions. Later in the semester, by the time my students worked with the tool, Lexis had expanded this content to include “all regulatory decisions.”

“All regulatory decisions,” however, does not mean all regulatory content. It excludes “topical agency materials such as press releases and guidance manuals produced by an agency.” Moreover, the “ask a legal question” and “generate a draft” modules do not retrieve the same agency decisions.  “Generate a draft” is still limited to “top agency decisions.” I asked why these modules differ in this way, but Lexis (at least my school’s rep) was not able to offer an explanation.

These limitations directly affect the quality of responses to administrative law prompts and therefore implicate professional responsibility for administrative lawyers. In discussing the duty of competence, the ABA’s recent opinion on GAI tools states that “lawyers’ uncritical reliance on content created by a GAI tool can result in inaccurate legal advice…” Administrative law researchers, and those who teach administrative law research, must approach these tools critically, asking whether the tool fits the task.

Of course, administrative lawyers have tools beyond Lexis+AI. Specialized GAI tools, such as BlueJ for tax, may be a better option for regulatory areas with enough demand, as they are tailored more specifically to the materials in a given practice area and to the relationships among those materials. Even so, these tools raise ethical issues similar to those raised by other GAI tools discussed in the ABA opinion. And traditional legal research strategies and tools remain essential to master and practice, as my students can attest, not only because GAI hallucinates but also because it cannot replace the creative and thoughtful work of administrative attorneys.

Susan Azyndar is the Senior Associate Director of the Kresge Law Library at Notre Dame Law School. She is happy to receive comments via email.