The American Bar Association has responded to ChatGPT. 

The increasing use of and fascination with ChatGPT are reminiscent of Wikipedia. I remember being told in school not to use Wikipedia as a source because the author could not be trusted. The legal industry views the use of ChatGPT in much the same way. You may input the question or prompt, but the answer is curated by an untrustworthy and anonymous author – an algorithm.

The ABA’s Take on AI 

The American Bar Association has responded to the legal community’s concerns about the use of AI by adopting Resolution 604 at its 2023 midyear meeting. In summary, the resolution addresses how organizations and their leaders should handle accountability, transparency, and traceability in artificial intelligence. The resolution sets out three major components: 

1. Adopt Guidelines 

Using AI as a tool to better accomplish tasks is one thing. Relying on it completely is another. The first component calls for guidelines that keep a human as the authority over any AI product. For example, copyright and trademark issues may arise if a software engineer uses AI to help create code for a page. Depending on how the AI is applied, the code may not be eligible for copyright protection, undermining the engineer’s ability to monetize the idea. 


2. Be Accountable and Take Reasonable Steps to Mitigate Harm 

The second prong of the resolution calls for entities and organizations to take accountability for misuse of AI unless they can show that steps were taken to mitigate the harm. This will sound familiar to any employment attorney who has defended a discrimination charge by pointing to the policies and procedures that were in place for the alleged victim to use. In effect, it asks companies and businesses to be more proactive about how their employees and leaders use AI tools. 

We know that algorithms can carry biases. We know that data sets and code can carry biases, too. If a company is aware that this kind of activity is occurring, it is on the company to take accountability. For example, Amazon used an AI-driven hiring tool whose algorithm perpetuated a bias against women’s applications. This resolution places responsibility squarely on the companies and entities that use such programs to help with things like their hiring processes. 

3. Document Key Decisions Regarding Use of AI

This final prong of the resolution speaks to developers’ use of AI when creating intellectual property, data sets, designs, code, and the like. The use of AI is inevitable; it should be used thoughtfully, with diligent documentation of where and how it is used. Illinois has been one of the thought leaders in implementing strict laws in this area. For example, for any biometric data collected, a series of notices must be provided to the user about where their data will be stored, how it will be protected, and when it will be deleted. The resolution asks for a similar standard of documentation for the use of AI. 

The Reason for the Resolution 

The purpose of this American Bar Association ChatGPT / AI resolution goes well beyond using the tool to write an email. The ABA is looking at self-driving cars, medical developments in surgeries and medical devices, and other autonomous systems. So how does this relate back to legal marketing? It is all about the audience. 

How Resolution 604 Relates to Legal Marketing 

Attorneys do not want the headache of being the target of a lawsuit or being the reason for one. Using ChatGPT to create your law firm’s pages means inputting your law firm’s data into a database and relying on an unreliable author to curate your media and content. 

Importantly, that goes against the first prong of the resolution and easily bleeds into the second. How is your firm mitigating the harm of AI if it lets the marketing department use AI to curate its marketing content? It raises the question: how else is AI being used in the firm? What guidelines are being implemented in one department but not the others? 

The legal industry is heavily reliant on reputation. If you are reputable, then you are referable. If your site does not reflect your reputation, how can you keep relying on those referrals?  

It’s not a new message. We have been writing about this since the initial launch of ChatGPT. The use of AI is a delicate dance. You want to use it as a tool but not rely on it like a religious text. However, as the legal industry continues to explore the uses and implications of AI, we are starting to see more and more legal barriers. 

We also have to consider how Google is reacting to the AI trend. We know that ChatGPT is limited in its scope, and certain industries are trying hard not to let details like their pricing structures fall into the ChatGPT data hole for fear of trademark issues or breach of trade secrets. Many companies are formally directing their employees not to use ChatGPT so that information like that does not end up stored in OpenAI’s database.

Like the old Wikipedia, ChatGPT is an unreliable author (and regurgitator of information). It can provide great outlines, but its content is repetitive, lacks accuracy, and can draw on material from biased sources, which ultimately leaves your reputation in question. The ABA’s resolution lays the groundwork for how the legal industry needs to frame the use of AI: as a tool, not the be-all and end-all. 

Blue Seven Content Is Attorney-Driven, Human-Curated 

At Blue Seven Content, we are attorney-driven and human-curated. Our writers are well equipped and experienced in researching and citing reliable sources that build your authority on a subject and increase your site’s credibility. When a reader who needs you views your website, that instinct to trust will help turn the viewer into a client. If you have any questions about ChatGPT or your law firm’s content, reach out to us for a chat today. We are powered by attorneys who understand American Bar Association resolutions and the trends around AI and ChatGPT. We’re here to help.

Written by Victoria Lozano – Attorney, Co-Founder & Consultant
