Samuel Edwards
May 8, 2024
Legal matters are tricky.
And unfortunately, they're not getting any simpler.
We're in the early stages of an AI explosion, with AI-powered apps and systems rivaling human capabilities on advanced tasks like natural language processing, and far exceeding them on niche tasks like data analytics.
As developers and promoters of legal AI (tools designed to automate and expedite various responsibilities of lawyers and law firms), we can't help but wonder: how is the legal and regulatory landscape going to change now that these tools are widely accessible?
In the hands of a good lawyer, legal AI is an incredibly powerful tool. It can simplify and accelerate many tasks, and relieve some of the most persistent headaches in the profession.
But we also must remember that average people can harness the power of legal AI for their own ends. Without legal expertise and professional experience, this can be risky or even downright dangerous; if someone misuses AI to create shoddy legal documents or try in vain to protect themselves from legal consequences, they could end up greatly damaging themselves or others.
One such application is contract generation. What if an AI-generated contract leaves out key protection clauses? What if it makes demands that aren't legally permitted? What if two people sign it without fully understanding its consequences? What would happen then?
Are AI-generated contracts even legally valid?
First, we have to acknowledge that the age of AI is confusing for judges, lawyers, and regulators alike. The technology is too new to have straightforward answers that lawmakers can easily codify. In fact, many judges and lawyers are reluctant to even touch AI as a subject, for fear of judging it prematurely or establishing precedent they will eventually regret.
As a result, there aren't many laws or regulations in the United States governing how or when AI can be used. It's a kind of Wild West, where most AI operations are held to the same standards as comparable human operations, but where people are rightfully exercising skepticism and caution with significant AI tasks.
With this backdrop, we must reluctantly accept that there are no easy, straightforward answers. But we can examine our existing knowledge and assumptions to practice responsible AI contract generation, safeguard against legal consequences, and speculate about how AI regulation might evolve in the near future.
Let's take a look at how AI contract generation works.
Legal AI, in this application, works like other generative AI systems. Using machine learning models trained on enormous numbers of example documents, these systems learn the patterns of a given type of content, then use those patterns to create a new piece of content within specified parameters.
For example, ChatGPT has read and processed patterns in millions of online articles, so if you ask it to write an online article about a new topic, it can do it quickly and easily. By studying millions of examples of specific types of contracts, such as employment contracts or purchase agreements, legal AI can replicate established patterns and create something new. Typically, it can do this in a matter of minutes, if not seconds.
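To make the mechanics concrete, here is a minimal sketch of how a contract draft might be requested from a general-purpose LLM API, in this case OpenAI's Python client. The model name, prompt wording, and parameter values are illustrative assumptions, not recommendations from any particular legal AI product.

```python
# Minimal sketch: drafting a contract with a general-purpose LLM API
# (OpenAI's Python client). Model name, prompt, and parameters are
# illustrative assumptions, not a recommendation.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable


def draft_contract(contract_type: str, terms: dict[str, str]) -> str:
    """Ask the model to draft a contract from a handful of specified terms."""
    term_list = "\n".join(f"- {key}: {value}" for key, value in terms.items())
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You draft clear, conventional legal contracts."},
            {"role": "user",
             "content": f"Draft a {contract_type} with these terms:\n{term_list}"},
        ],
        temperature=0.2,  # low temperature favors conventional boilerplate
    )
    return response.choices[0].message.content


draft = draft_contract("employment contract", {
    "employer": "Acme LLC",
    "employee": "Jane Doe",
    "salary": "$85,000 per year",
    "start date": "July 1, 2025",
})
print(draft)  # route the output to human legal review before any signatures
```

Notice that the caller supplies only a handful of variables; the model fills in everything else from the patterns it learned, which is exactly where both the speed and the risk come from.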
There are many advantages to this:
· Speed. Generative AI in the legal realm is incredibly fast. If you're trying to generate a simple contract as quickly as possible, AI is the best way to do it.
· Efficiency. It's also highly efficient. With minimal inputs and only a handful of modified variables, AI can execute professional-level work.
· Cost. Many people also appreciate the cost efficiency of legal AI. Lawyers are expensive, especially when you have them working for long hours. Comparatively, AI tools cost pennies on the dollar and greatly reduce the hours spent by practicing lawyers.
· Accessibility. These tools are also highly accessible. While primarily intended to help lawyers expedite tedious and routine tasks, they can also be used, if a bit riskily, by laypeople with no legal education.
· Consistency. Legal AI tools rely on the same internal logic and broad pattern recognition to produce results. Accordingly, output is stunningly consistent in terms of quality.
However, we can already identify some issues associated with AI contract generation.
· Hallucinations. AI hallucinations are more common than most people realize. A hallucination occurs when a generative AI engine presents something as factual or true when it isn't. Hallucinations affect only a small percentage of outputs, but they're significant enough to warrant specific safeguards at the human review level.
· Plagiarism and ethical concerns. Because legal AI draws upon existing examples of contracts, there's a small risk of committing accidental plagiarism. There are other ethical concerns associated with AI as well, such as drawing from established works without crediting creators and dealing with bias.
· Lack of creativity. While contracts aren't exactly creative writing exercises, we should note that generative AI tends to lack genuine creativity. What we see is more akin to sophisticated mimicry, which isn't always ideal, even in a legal setting. If you're looking for something boilerplate, this may not be a concern, but if you need something novel, you might be better off working with a human lawyer.
· Lack of human reasoning. Despite outward appearances, AI doesn't think. It can't use human reasoning or logic to form the conclusions we can. Don't expect it to fully think through all the potential legal consequences of the contract you generate – and don’t expect it to be perfectly logical 100 percent of the time.
· Lag in legal knowledge. It's also important to note that there's a lag in the acquisition of legal knowledge. Lawyers can read the news and update themselves in real time, but it typically takes AI systems weeks or months to catch up to the latest developments. If new laws or regulations affect your generated contract, your generative AI tool may not take them into account.
· Domain inexperience. Most legal AI tools are designed to be broadly applied, so they don't have the niche experience necessary to replicate the work of true domain masters. For simple, entry-level contracts, this is no major cause for concern; but if you’re working in a particularly sensitive field or in an area that demands legal expertise, it’s imperative to have a human lawyer with domain expertise working alongside you.
Legality and Enforceability of AI-Generated Contracts
So we've established that there are some pros and cons to using AI to generate contracts.
But what does the legal world think about all this?
For a contract to be legally valid in the United States, it must adhere to several requirements.
· Mutual voluntary assent. A contract is only legally valid if all parties enter into it knowingly and voluntarily; contracts are invalid if signed under duress or under false pretenses. This requirement has no direct bearing on how the contract is initially drafted.
· Reasonable exchange. Contracts must also involve some kind of reasonable exchange, known in law as consideration. For example, you can agree to pay a fixed dollar amount for a property, while the seller agrees to hand over ownership of that property.
· Contractual authority. Parties must have the legal capacity to enter the contract for it to be valid. If a person is legally barred from signing a specific type of contract, their signature doesn't make such a contract valid.
· Lawful subject matter. Contracts must also be lawful in nature. No court will enforce a contract you signed to facilitate an illegal drug deal.
Notice that there is no requirement that a specific person or entity draft the contract. In fact, how a contract is composed seems to have nothing to do with the legal validity of the signed document itself, at least for now.
This idea also seems to hold up in court, at least so far. However, there haven't been enough AI-related cases to make a strong judgment here. In one 2021 case in the United States, for example, a judge withheld an opinion on the use of AI for fear of setting a precedent prematurely.
For now, it seems AI-generated contracts are held to exactly the same standards as human-generated ones: no more, and no less.
But what does the future hold?
Given the current reluctance to make significant judgments, it's unlikely that we'll see any major legal changes to how AI-generated contracts are treated in the near future. But as AI grows more powerful and more influential, we may start to see regulations creeping in.
There are already signs of countries around the world positioning themselves to regulate AI in specific, limiting ways.
For example, China has specific rules in place for “smart” contracts, though they currently aren't much different from the rules for typical contracts.
The European Union tends to take a more aggressive regulatory stance on new technologies, and generally supports the idea that humans should remain in control of important matters. It wouldn't be surprising to see EU regulations or resolutions guaranteeing that human lawyers review any work done by AI. In practice, this wouldn't change much.
And in the United States, regulations tend to be comparatively loose. Given the current hands-off stance that many judges and lawmakers have, we anticipate that any future modifications to how AI generated contracts are treated will be minimal and non-invasive.
AI-generated contracts are perfectly legal and valid, assuming they meet the established criteria for contract validity. However, that doesn't mean they're perfect, or that they're as reliable as contracts created by competent human lawyers.
Due to the many weaknesses and limitations of current legal AI, it's a good idea to follow these best practices regarding AI-generated contracts:
· Keep a human expert in charge. Always make sure there's a human expert in charge. Feel free to generate contracts with AI, but always have them double-checked by a knowledgeable lawyer; if your lawyer uses AI to facilitate contract generation, that could be even better. A sketch of what this workflow might look like follows this list.
· Review terms of use carefully. If you decide to use legal AI to generate your own contract, review the terms of use for that particular product very carefully. AI developers sometimes include specific restrictions on how AI-generated materials can or should be used, which may preclude you from generating certain types of contracts or using contracts to facilitate certain activities.
· Document your efforts and be transparent. Whether you're a lawyer or a layperson, document the work you've done with AI and be transparent about its use. If you're going to sign the contract with another party, let them know it was generated by AI. Otherwise, there's a small chance the contract's legal validity could be questioned on the basis that certain parties were not made aware of how it was created.
· Read the contract carefully. This should go without saying, but make sure you read every line of the contract carefully. Because of the prevalence of AI hallucinations and other small mistakes, we can't fully trust AI to generate long, detailed documents completely free of error. In the legal world, even a small typo can be devastating, so it's not worth the risk to blindly trust this generated document.
· Exercise caution with respect to user (and client) privacy. Not all legal AI services have appropriately robust protections in place for user privacy or security. If you're thinking of generating a contract that contains or relates to sensitive information, scrutinize the AI tool you plan on using before submitting anything.
· Ensure legal validity independently. Finally, make sure that the AI-generated contract meets all the requirements necessary for legal validity. If it unambiguously does, it is far less likely to be successfully challenged.
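To make the first best practice concrete, here is a minimal sketch of a "human in charge" workflow: a draft object that records how it was created and isn't marked ready for signature until a named human reviewer approves it. The ContractDraft class, its fields, and the review steps are hypothetical illustrations, not features of any real legal AI product.

```python
# Minimal sketch of a human-review gate for AI-generated drafts.
# All class names, fields, and steps are hypothetical illustrations.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ContractDraft:
    text: str
    generated_by: str                      # e.g. the model name, for transparency
    reviewed_by: str | None = None
    approved: bool = False
    audit_log: list[str] = field(default_factory=list)

    def record(self, event: str) -> None:
        """Append a timestamped entry so AI involvement stays documented."""
        self.audit_log.append(f"{datetime.now(timezone.utc).isoformat()} {event}")


def approve_for_signature(draft: ContractDraft, reviewer: str) -> None:
    """Gate: no AI-generated draft reaches signature without a named reviewer."""
    draft.reviewed_by = reviewer
    draft.approved = True
    draft.record(f"approved by {reviewer}")


draft = ContractDraft(text="...", generated_by="gpt-4o")
draft.record("generated by AI; AI use disclosed to all parties")
approve_for_signature(draft, reviewer="J. Smith, Esq.")
assert draft.approved, "Do not send for signature before human review."
```

The point of the audit log is transparency: if the contract's validity is ever questioned, you can show exactly who reviewed the AI's output and when.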
It’s hard to say exactly where legal AI goes from here – or exactly how courts will treat AI generated contracts in the future.
But one thing’s for sure.
Whatever happens, Law.co is going to remain at the leading edge of legal AI development. We want to help lawyers, law firms, and even average people navigate the complexities of the legal system – as well as the emerging complexities of AI’s role in law.
If you’re ready to get started, or if you’re just curious to learn more, contact us today!
Samuel Edwards is CMO of Law.co and its associated agency. Since 2012, Sam has worked with some of the largest law firms around the globe. Today, Sam works directly with high-end law clients across all verticals to maximize operational efficiency and ROI through artificial intelligence. Connect with Sam on LinkedIn.