How to Compare Contracts Efficiently
Every lawyer has a version of this story. You get a contract back from opposing counsel. The email says "minor comments only." You open the file, run a quick comparison, and the output is a mess of red. Hundreds of changes. Most of them are formatting noise. Somewhere in that wall of markup, there's a liability cap that quietly dropped from $5M to $500K.
The problem isn't that you don't know how to compare contracts. You do. The problem is that the typical workflow wastes enormous amounts of time on things that don't matter, while making it surprisingly easy to miss the things that do.
This guide is a practical walkthrough of how to compare contracts efficiently. Not the theoretical version. The version that works on a Tuesday afternoon when you have six revised drafts sitting in your inbox and a partner asking for a summary by end of day.
How most lawyers compare contracts today
Let's be honest about what the actual workflow looks like for most legal teams. It's usually one of three things, sometimes a combination.
The manual side-by-side
Two documents open, two windows tiled on the screen. Read paragraph by paragraph. Maybe print both copies and use a highlighter. This works for a two-page NDA where only the governing law and notice address changed. It falls apart the moment you're looking at anything longer than about five pages, because human attention just isn't built for character-level comparison across 30 pages of dense legal text.
And yet, a lot of lawyers still default to this. Sometimes because they don't trust the tools (fair). Sometimes because nobody showed them anything better (also fair).
Word's built-in Compare
Review tab, Compare, select your two files, hit OK. Word generates a new document with tracked changes showing every difference. It's free, it's already on your machine, and it works. For simple documents with minimal formatting changes, it's genuinely fine.
The problem shows up when the document has been reformatted, or when someone applied a different template, or when paragraphs got moved around. Suddenly you're looking at a document that's 80% red ink, and most of it is font changes and spacing adjustments. The substantive edits are in there somewhere. You just have to find them.
We have a detailed guide to redlining in Word if you want to get the most out of what it offers, including the settings that reduce (though don't eliminate) the noise.
Dedicated comparison tools
Litera Compare is the dominant name in BigLaw. Draftable is popular with smaller firms and solo practitioners. There are others. These produce cleaner output than Word, but most of them still use the same fundamental approach: compare characters and words, flag every difference, present it all with equal weight.
Here's the thing. Having a better comparison tool helps. But if your process for reviewing the output is "start at page one, read every markup, and hope I don't lose focus," you're still going to miss things. The tool is half the equation. The workflow is the other half.
Before you compare: the setup that saves you time later
Most comparison problems don't happen during the comparison. They happen before it, because someone skipped a setup step that takes less than two minutes.
Accept all tracked changes in both documents
This is the single most common source of garbled comparison output. If Document A still has pending tracked changes from a previous round, and you compare it against Document B, the comparison engine gets confused. It's trying to diff against a document that has two states at once (the "before accept" state and the "after accept" state). The result is a mess of old markup mixed with new differences, and you can't tell which is which.
Before running any comparison: open both files, go to Review, Accept All Changes, save. Then compare. Every time.
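If you handle incoming drafts in volume, this step can be scripted. Here's a minimal sketch using Word's COM automation through the pywin32 package (Windows only, with Word installed; the folder path is hypothetical, so point it at your own matter folder):

```python
# Minimal sketch: accept all pending tracked changes in every .docx in a
# folder and save clean copies. Requires Word and pywin32 on Windows.
# The folder path is hypothetical; adjust it to your matter folder.
import pathlib
import win32com.client

folder = pathlib.Path(r"C:\Matters\Acme\incoming")

word = win32com.client.Dispatch("Word.Application")
word.Visible = False
try:
    for path in folder.glob("*.docx"):
        doc = word.Documents.Open(str(path))
        doc.AcceptAllRevisions()  # same as Review > Accept All Changes
        doc.SaveAs2(str(path.with_name(path.stem + "_clean.docx")))
        doc.Close(False)  # already saved; close without prompting
finally:
    word.Quit()
```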
Confirm you have the right versions
This sounds so basic it feels silly to mention. But wrong baselines are one of the top causes of wasted review time. The comparison looks bizarre, you spend 20 minutes trying to make sense of it, and then you realize you compared v3 against v5 instead of v4 against v5. That's 20 minutes gone.
A quick sanity check: look at the date modified on each file, confirm the file names match what you expect, and if you're pulling from a shared folder or DMS, verify the version number. Takes 30 seconds. Prevents a 30-minute mistake.
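If you'd rather do the same check from a script, a few lines of Python cover it (the file names below are hypothetical):

```python
# Quick sanity check: print the name, size, and last-modified time of the
# two files you're about to compare. File names are hypothetical.
import pathlib
from datetime import datetime

for name in ("Acme_MSA_v4_2024-05-01.docx", "Acme_MSA_v5_2024-05-14.docx"):
    path = pathlib.Path(name)
    stat = path.stat()
    print(f"{path.name}  {stat.st_size:,} bytes  "
          f"modified {datetime.fromtimestamp(stat.st_mtime):%Y-%m-%d %H:%M}")
```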
Use a naming convention (and actually follow it)
The MSA_client_comments_final_final_v6_REVISED.docx meme exists because it's painfully real. A simple convention like [Client]_[DocType]_v[X]_[YYYY-MM-DD].docx prevents most version confusion. The exact format matters less than everyone on the team using the same one.
If you're working with outside counsel who won't follow your convention (which is most of them, honestly), rename their file when you receive it so it fits your system. That small act of discipline pays for itself the first time you need to pull the right version three weeks later.
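If you want to enforce the convention automatically, a small validation sketch helps (the pattern below matches the example convention above; adapt it to whatever format your team actually uses):

```python
# Minimal sketch: check file names against the convention
# [Client]_[DocType]_v[X]_[YYYY-MM-DD].docx described above.
import re

PATTERN = re.compile(r"^[A-Za-z0-9-]+_[A-Za-z0-9-]+_v\d+_\d{4}-\d{2}-\d{2}\.docx$")

def follows_convention(filename: str) -> bool:
    """Return True if the file name follows the team convention."""
    return bool(PATTERN.match(filename))

print(follows_convention("Acme_MSA_v4_2024-05-01.docx"))                     # True
print(follows_convention("MSA_client_comments_final_final_v6_REVISED.docx")) # False
```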
A step-by-step process for comparing contracts
Here's the actual process. It's not complicated, but following it consistently is what separates a reliable review from a "pretty sure I caught everything" review.
Step 1: Prepare both files
Accept all tracked changes in both documents. Save clean copies. Confirm file names and versions. If either document is a PDF, convert to .docx first (native PDFs only; scanned PDFs need OCR and the output should be manually verified before you trust a comparison against it).
Step 2: Run the comparison
Use whatever tool you have: Word Compare, Draftable, Litera, Clausul. It doesn't matter at this step. Feed in the "before" version (what you sent) and the "after" version (what you got back). Make sure the tool knows which is which, because reversing them flips every insertion into a deletion and vice versa.
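For what it's worth, Word's Compare engine can itself be driven programmatically if you run comparisons in volume. A minimal sketch via COM (Windows only; the paths and file names are hypothetical):

```python
# Minimal sketch: run Word's own Compare engine programmatically via COM.
# Requires Word and pywin32 on Windows; file paths are hypothetical.
import win32com.client

wdCompareDestinationNew = 2  # put the comparison result in a new document

word = win32com.client.Dispatch("Word.Application")
word.Visible = False
try:
    before = word.Documents.Open(r"C:\Matters\Acme\Acme_MSA_v4_2024-05-01.docx")
    after = word.Documents.Open(r"C:\Matters\Acme\Acme_MSA_v5_2024-05-14.docx")
    # Order matters: the first argument is the baseline ("what you sent"),
    # the second is the revision ("what you got back").
    result = word.CompareDocuments(before, after, wdCompareDestinationNew)
    result.SaveAs2(r"C:\Matters\Acme\Acme_MSA_v4_vs_v5_redline.docx")
finally:
    word.Quit(False)  # close without saving the source documents
```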
Step 3: Scan the change summary (don't start reading yet)
Before reading any individual markup, look at the big picture. How many changes are there? What's the rough split between formatting edits and content edits? Are entire sections showing as modified, or is it scattered? This ten-second scan tells you whether you're dealing with a light round of comments or a significant rework. It also helps you decide how much time to budget for the review.
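If your output came from Word Compare, you can even get this big-picture tally programmatically. A sketch, assuming the Word object model's revision type codes (type 1 is an insertion and type 2 is a deletion; most other types are formatting and property changes; the path is hypothetical):

```python
# Minimal sketch: tally the revisions in a comparison output file to get the
# big picture before reading anything. Uses Word COM via pywin32 on Windows.
from collections import Counter
import win32com.client

word = win32com.client.Dispatch("Word.Application")
word.Visible = False
try:
    doc = word.Documents.Open(r"C:\Matters\Acme\Acme_MSA_v4_vs_v5_redline.docx")
    tally = Counter(rev.Type for rev in doc.Revisions)
    print(f"total changes: {sum(tally.values())}")
    print(f"insertions:    {tally.get(1, 0)}")   # wdRevisionInsert
    print(f"deletions:     {tally.get(2, 0)}")   # wdRevisionDelete
    other = sum(v for k, v in tally.items() if k not in (1, 2))
    print(f"other (mostly formatting/property changes): {other}")
    doc.Close(False)
finally:
    word.Quit(False)
```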
Step 4: Triage (covered in detail below)
Go straight to the high-impact areas. Don't read linearly from page one. We'll cover this in the next section.
Step 5: Full review of remaining changes
After triaging the high-impact items, work through the rest. This is where most of the formatting noise lives. Skim it, but don't ignore it entirely. Occasionally, a seemingly cosmetic edit (a changed defined term, a tweaked cross-reference) turns out to have substance.
Step 6: Document what you found
Write a short summary. Not a novel. A few bullet points covering what changed, what matters, and what needs a decision. This becomes your audit trail and your communication tool. We'll cover the format in the output section.
The triage step: high-impact changes first
This is the part that makes the biggest difference to your efficiency, and it's the part most people skip. Instead of reading the comparison output linearly (page 1, page 2, page 3...), go straight to the areas where changes are most likely to cost money, shift risk, or change obligations.
What to scan for first
- Numbers and dollar amounts. A liability cap that moved from $10M to $1M is the kind of change that can get buried in a noisy redline. Scan for any changed figures: caps, baskets, deductibles, fee amounts, payment schedules. These are almost always material.
- Dates and time periods. Notice windows shortened from 30 days to 10. Cure periods cut in half. Renewal dates shifted. Termination notice deadlines changed. Any date or time period that's different deserves close attention.
- Obligation language. "Best efforts" becoming "commercially reasonable efforts." "Shall" becoming "may." "Must notify" becoming "may notify at its discretion." These are the edits that change who has to do what.
- Indemnification, liability, and termination clauses. Go directly to these sections every time. Don't wait to encounter them during a linear read. If something changed here, you need to know about it before anything else.
- Definitions. A changed definition can ripple through an entire agreement. "Services" becoming "Core Services" looks minor until you notice a new "Ancillary Services" category was added alongside it, quietly carving out half the original scope.
This triage pass takes about three to five minutes for a typical contract. And it fundamentally changes how you approach the rest of the review, because you already know where the real changes are. Everything else is confirmation.
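One way to speed up the numbers-and-dates pass: extract every dollar figure and time period from the draft before you even open the redline. A rough sketch using the python-docx library (the file name and regex patterns are illustrative, not exhaustive, and this only scans body paragraphs, not tables):

```python
# Minimal sketch: pull every dollar figure and day-count time period out of a
# draft so changed numbers are easy to eyeball. Uses the python-docx library;
# the file name and patterns are illustrative. Tables need a separate pass
# over doc.tables, which this sketch skips.
import re
from docx import Document

MONEY = re.compile(r"\$\s?[\d,]+(?:\.\d+)?\s?(?:million|M|K)?", re.IGNORECASE)
PERIOD = re.compile(r"\b\d+\s?(?:business\s)?days?\b", re.IGNORECASE)

doc = Document("Acme_MSA_v5_2024-05-14.docx")
for i, para in enumerate(doc.paragraphs, start=1):
    hits = MONEY.findall(para.text) + PERIOD.findall(para.text)
    if hits:
        print(f"para {i}: {hits}")
```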
When Word Compare is enough (and when it isn't)
There's no reason to pay for a tool you don't need. Here's an honest assessment of when Word's built-in Compare does the job.
Word Compare is fine when:
- The document is short (under about 10 pages)
- The formatting hasn't changed between versions
- You're comparing simple text edits, not structural rearrangements
- There aren't significant tables or schedules involved
- You're doing a quick gut-check, not a final review before signing
Word Compare starts failing you when:
- Someone reformatted the document (different template, different styles, different fonts) and now 70% of the markups are cosmetic
- Paragraphs were moved between sections and show up as separate delete/insert pairs with no connection between them
- The contract has detailed tables (pricing schedules, service levels, milestone deliverables) and the comparison garbles the row alignment
- A defined term was renamed throughout, creating dozens of identical markups that are really one logical change
- The document is 30+ pages and you need to trust that nothing slipped through
Honestly, for a lot of day-to-day work at smaller firms, Word Compare is enough. The problems show up when stakes are high, documents are long, or the other side has been liberal with formatting changes. That's when a dedicated comparison tool earns its keep.
Common mistakes that cause real problems
These aren't hypothetical. They come from conversations with lawyers who got burned, and from our own experience building comparison software.
Comparing against the wrong version
This is the number one mistake. Someone grabs "v3" from the shared drive, but the version they actually sent to opposing counsel was the one labeled "v3_clean" in a subfolder. The comparison output looks strange, they spend 20 minutes trying to interpret it, and then they realize the baseline was wrong. Occasionally, this mistake goes unnoticed and the review proceeds against a false comparison. That's how changes get missed.
Prevention: always pull your "before" file from wherever you sent it (your sent email, your DMS, your matter folder), not from wherever you think it might be.
Skipping the comparison on "minor" rounds
"They said they only changed the effective date." Great. Run the comparison anyway. It takes ten seconds. The number of times the other side has changed more than they disclosed is not zero. It's not even close to zero. Sometimes it's intentional. Often it's just an associate who made additional edits without telling the partner, who then told your team "just the date." Either way, you need to verify.
Trusting the redline without checking the source
Occasionally, opposing counsel sends a redline along with the clean version: "here's our markup, and here's the clean." Some lawyers review the provided redline and skip running their own comparison. Don't do this. The redline they sent might not reflect the actual differences between the clean version and your last draft. It might be based on a different baseline. It might be incomplete. Always generate your own comparison from the actual files.
Ignoring "cosmetic" changes without checking
When a comparison produces 150 formatting-only markups, the temptation is to dismiss all of them at once. Usually that's fine. But a changed defined term can look like a formatting edit if the tool doesn't distinguish content from style. A cross-reference that was updated (or broken) can appear as a minor text change. The rule is: collapse the noise, but give it a quick scan before you discard it completely.
Not comparing the final execution version
This one is less obvious. You negotiate for three rounds, agree on the final terms, and the other side sends a "final clean copy for signature." Many teams skip comparing this against the agreed-upon last draft because, well, it's supposed to be final. But "final" versions sometimes contain discrepancies. Maybe a round of edits didn't get incorporated. Maybe the version that went to the signing partner had a stale paragraph. A quick comparison of the execution copy against the last agreed draft takes two minutes and protects against a very bad outcome.
Handling the output: clean redlines and client summaries
Running the comparison is half the job. The other half is turning the output into something useful for the people who need to act on it.
Creating a clean redline for opposing counsel
When you send a marked-up version back to the other side, you want a redline that shows your changes clearly against their last draft. Here's the process:
- Start from the version they sent you (their "clean" copy)
- Make your edits with Track Changes turned on
- Before sending, inspect the document for hidden metadata (comments you meant to delete, internal notes, document properties with sensitive file paths). Use Word's Document Inspector under File, then Info, then Check for Issues, then Inspect Document.
- Send both a redline version (with tracked changes visible) and a clean version (with all your changes accepted). Label each file clearly.
If you're using a comparison tool that outputs a redlined .docx, you can use that as your markup file. But always open it and verify that the tracked changes accurately represent your edits before sending it out.
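As a supplement to the Document Inspector (not a replacement for it), you can also peek at a file's core properties programmatically. A sketch with python-docx (the file name is hypothetical):

```python
# Minimal sketch: print a .docx file's core properties before it goes out the
# door. Supplements, but does not replace, Word's Document Inspector; the
# file name is hypothetical.
from docx import Document

props = Document("Acme_MSA_v5_markup.docx").core_properties
for field in ("author", "last_modified_by", "comments", "title", "revision"):
    value = getattr(props, field)
    if value:
        print(f"{field}: {value}")
```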
Documenting what changed for the client or partner
Partners and clients don't want to read a 200-change redline. They want a summary they can act on. Keep it short. Use this format:
- Change: what text changed, stated in plain language
- Impact: how it affects risk, cost, timeline, or control
- Recommendation: accept, reject, or counter (with suggested language if applicable)
An example: "Indemnification cap reduced from $5M to $2M. Impact: significantly increases our uncovered exposure on IP claims. Recommendation: counter at $5M, fallback position at $3.5M with carve-out for willful breach."
Another: "Termination for convenience added with 15-day notice. Impact: client could exit mid-implementation with minimal notice. Recommendation: reject, or counter with 60-day notice plus wind-down payment."
Notes like these take five minutes to write and they do two things well. They give the decision-maker exactly what they need to respond quickly. And they create a paper trail that's invaluable if anyone asks "why did we agree to that?" six months later.
Keeping a comparison log across rounds
For deals with multiple negotiation rounds, maintaining a running log of what changed in each round helps you spot patterns. Is the other side gradually weakening the same provision across successive drafts? Did a term that was rejected in round two quietly reappear in round four with different wording? A simple spreadsheet with columns for round number, change description, and disposition is enough. Nothing fancy.
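If a spreadsheet feels like overkill, even a plain CSV works. A minimal sketch, with columns mirroring the ones described above (the file name and example row are hypothetical):

```python
# Minimal sketch: append one row per change to a running comparison log using
# only the standard library. Columns mirror the spreadsheet described above;
# the file name and example row are hypothetical.
import csv
from pathlib import Path

LOG = Path("Acme_MSA_comparison_log.csv")

def log_change(round_no: int, description: str, disposition: str) -> None:
    """Append one change record, writing the header row on first use."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["round", "change", "disposition"])
        writer.writerow([round_no, description, disposition])

log_change(4, "Liability cap reduced from $5M to $2M", "countered at $5M")
```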
Putting it together
Comparing contracts efficiently isn't about finding a magic tool (though a good tool helps). It's about having a process that doesn't let things slip through, and that doesn't waste your time on things that don't matter.
The short version: prepare your files properly, run the comparison, triage high-impact changes first, do a full pass second, and document what you found in a format that someone can act on. Do this consistently and you'll catch more, miss less, and spend less time staring at walls of red ink.
If you're finding that Word Compare isn't giving you reliable output on longer documents, or if you're spending too much time sorting through formatting noise, Clausul was built for exactly that problem. It separates substantive changes from cosmetic ones so you can focus on what actually matters. But regardless of what tool you use, the process above will make your reviews faster and more reliable.
Frequently asked questions
How long does it take to compare two versions of a contract?
It depends on the method. A manual side-by-side read of a 30-page agreement takes most lawyers 45 minutes to an hour. Word Compare generates output in seconds, but reviewing that output (especially if there is formatting noise) can take 20 to 40 minutes. A semantic comparison tool that filters noise and groups related changes can cut effective review time to 10 to 15 minutes for a typical negotiation round.
Should I compare contracts even if the other side says they only made minor changes?
Yes. Always. "They only changed the date" is one of the most common phrases that precedes a missed substantive edit. Running a comparison takes seconds with any tool. Skipping it because someone told you the changes were minor is a risk with no upside.
What is the best way to compare two contracts in Word?
Open Word, go to Review, then Compare, then Compare Documents. Select your original file and the revised file. Word will generate a new document showing all differences as tracked changes. Before you start, accept all existing tracked changes in both documents so old markup does not contaminate the output. For a detailed walkthrough, see our guide on how to redline in Word.
Do I need a dedicated tool or is Word Compare enough?
For short documents (under 10 pages) with minimal formatting changes, Word Compare is often enough. Once documents get longer, involve tables, or have been reformatted between versions, the noise in Word Compare output makes it unreliable for high-stakes review. A dedicated comparison tool helps by producing cleaner output and, in some cases, separating substantive changes from cosmetic ones.
How do I create a clean redline to send to opposing counsel?
Start by comparing the version you sent against the version you received back. Review the output and confirm every flagged change is real (not an artifact of formatting differences). Then accept all the changes that are accurate, reject any false positives, and save the result as your redline. If your comparison tool exports a redlined .docx with tracked changes, you can send that file directly. Always double-check that no metadata or hidden comments are included before sending.