

ChatGPT for contract drafting: AI vs templates

Jack Shepherd explores the limitations and possible benefits of ChatGPT for the legal profession

Since ChatGPT landed at the end of 2022, it’s proven to be an impressive tool indeed. You can ask it a question and receive seemingly magic results. Ask it to write a song in the style of James Blunt and it will do just that, in a worryingly convincing fashion.

If we want to really dig into how useful ChatGPT is going to be, we need to understand how it works. If we don’t understand that, we are in danger of thinking this is magic wizardry, and we will have no idea where exactly we can use this promising technology.

Here’s my attempt at explaining it:

  • There are two types of algorithm. Rules-based algorithms produce outputs by following specific rules a human has written, eg ‘if the time is before 12pm, display “Good Morning”.’ Machine-learning algorithms produce outputs based on patterns learned from prior examples, eg ‘here are 1,000,000 pictures of cats…okay, now tell me, based on what you learned from those 1,000,000 pictures, is this next picture a cat or not?’
  • ChatGPT is a large language model, meaning it is built on a machine-learning algorithm that has had millions upon millions of sentences thrown at it as training data. It has been trained specifically to construct coherent sentences by calculating the probability of one word following another. Like a super-powered autocomplete on your phone.
  • Specifically, with ChatGPT, humans have supplied examples of questions it might be asked, answers to those questions, and ratings of potential answers. This supervised part of ChatGPT’s training is what gives it its magic powers to produce impressive responses to questions. At least, that’s what I think happened; OpenAI is vague about the nature of the human involvement here.
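The distinction drawn above can be sketched in a few lines of code. This is a deliberately toy illustration, not how ChatGPT actually works: the rules-based example is the ‘Good Morning’ rule from the list, and the machine-learning example is a crude bigram counter that ‘learns’ which word most often follows another from a handful of made-up training sentences.

```python
# Rules-based: the output follows an explicit, human-written rule.
def greeting(hour):
    return "Good Morning" if hour < 12 else "Good Afternoon"

# Machine-learning flavour (toy): count word pairs in training
# sentences, then predict the most frequent follower of a word.
def train_bigrams(sentences):
    counts = {}
    for sentence in sentences:
        words = sentence.lower().split()
        for current, following in zip(words, words[1:]):
            counts.setdefault(current, {})
            counts[current][following] = counts[current].get(following, 0) + 1
    return counts

def predict_next(counts, word):
    followers = counts.get(word.lower())
    if not followers:
        return None
    return max(followers, key=followers.get)  # most frequent follower

model = train_bigrams([
    "the parties agree to the terms",
    "the parties agree to arbitrate",
    "the parties agree to the schedule",
])
print(greeting(9))                   # rules-based: Good Morning
print(predict_next(model, "agree"))  # learned from counts: to
```

The rule never changes unless a human edits it; the bigram model’s answer changes whenever the training sentences do, which is the essential difference, in miniature, between the two types of algorithm.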

The use case

ChatGPT can draft contracts. A prompt such as ‘draft me an employment contract’ will result in a fairly convincing-looking employment contract being produced. This might save lawyers time if they are struggling to find a template or example for a contract they need to produce.

It’s worth noting that there are a vast number of potential use cases for ChatGPT, and contract drafting is just one of them. Others include drafting legal advice memos, pleadings and court documents, and summarisation.

First drafts

I have thrown a number of contract drafting tasks at ChatGPT, and the output always looks superficially impressive. However, because ChatGPT does not actually understand the legal consequences of its output, that output is often questionable. Nor does it understand the full context: for example, it never asks you what the governing law should be.

I have also seen a number of examples where the output of ChatGPT simply does not make sense. For example, I asked it to draft me a Lock-Up Agreement, an agreement used in a restructuring transaction that commits holders of shares or bonds to vote in favour of a restructuring proposal. It is key to such agreements that the securities holders cannot transfer their securities to anybody, unless the transferee also agrees to vote in favour of the proposal. 

ChatGPT produced me a good-looking Lock-Up, but one that allowed holders to transfer shares freely where the transfer was a gift or the transferee was unaffiliated with the transferor, thereby subverting the whole purpose of the agreement. Again, this is because ChatGPT is designed to produce sentences that make sense, not legal advice that makes sense.

It would be foolish to rely on ChatGPT’s output without reviewing and refining it. The general consensus is that, at least right now, the use case for ChatGPT is simply producing first drafts of contracts. I find this more palatable, but I still have two issues with using ChatGPT in this way.

Provenance and context

First, when I am reviewing a contract, I want to know why it has been drafted in the way it is. Why is there a clause about gardening leave in an employment contract? Why has it been drafted in a particular way? Do we need it in our specific circumstances? Which clauses should we definitely refuse to negotiate? As a lawyer, it would make me uncomfortable to think that I could not answer these kinds of questions if a client asked.

When I first used ChatGPT, I presumed there was a specific source for its output. Given the vast dataset it is trained on, looking for one is of course a fool’s errand. When you think about a contract as documenting a relationship between two parties, it feels strange to think people might be agreeing to things that one or both parties do not properly understand.


Second, when I am reviewing a contract based on a template, I can see what has changed. This allows me to target my review in specific places. I can skip over the boilerplate and concentrate on the clauses that I know will require interrogation. I can also do this when working from an example document. I can speak to the person who drafted it and work out which bits were heavily negotiated and which were not.

With output based on a black box, I have no idea what to focus on. This would worry me, as I would have to scrutinise every word on the page. With long legal documents, this would undoubtedly have a negative effect on my review, and I would worry about missing things.

A last resort?

I can see why ChatGPT looks appealing to people for the contract drafting use case. But it’s still a long way from autonomously drafting contracts. Personally, I would be uncomfortable using it to generate first drafts as well. Few lawyers are drafting contracts that have never been drafted before, and I would prefer to work from a firm-approved template, which has at least been vetted. Without this, I may struggle to advise clients properly or achieve the best outcomes for them.

ChatGPT reveals something interesting about cultural differences in law firms when it comes to knowledge management. As a general rule, US attorneys are used to drafting from examples, as their firms do not often have the resources to produce firm-approved templates. European lawyers, however, have more of a knowledge management tradition and often have a whole host of home-grown templates.

Of the people I have spoken to, the US attorneys seem more comfortable using ChatGPT for contract drafting because they are more used to fishing in the DMS for prior examples of contracts, and so confront questions of provenance and context in their source materials on a daily basis. This is less of an issue for European lawyers, who are generally more used to working from templates. You can probably tell where my bias lies.

I asked ChatGPT to summarise this article. This is the output it gave:

“The article is about ChatGPT, a tool that uses machine-learning algorithms to generate responses to questions. The author mentions that while the outputs of ChatGPT look impressive, they can often be conflicting, wrong, or vague, and that the tool does not understand the legal consequences of its outputs. The author suggests that ChatGPT’s current use case is for producing first drafts of contracts, but raises two concerns: the lack of provenance and context in its outputs, and the inability to trace back its sentences to a specific source. The author warns that ChatGPT’s output should not be used without consulting with an attorney.”

I’d say this is a decent summary. I then asked ChatGPT whether the author was correct:

“As an AI language model, I do not have personal opinions or biases. However, based on the article, it appears that the author has a nuanced view of ChatGPT’s ability to draft contracts, recognising its potential for generating first drafts but also highlighting its limitations such as the lack of understanding of legal consequences, context and provenance.”

And I might just end this article there. Maybe summarisation is a better use case for ChatGPT than contract drafting.

Jack Shepherd is Legal Practice Lead at iManage