
What Is Contract Redlining? (And Where Most Tools Get It Wrong)

12 min read

Here's something that still surprises me. Redlining has been part of legal practice for decades, and yet most of the tools lawyers use to do it haven't really changed since the early 2000s. The word processor got a "Compare" button. A few vendors wrapped nicer interfaces around the same character-diff engine. And everyone just... kept going.

But the contracts got longer. The deal cycles got faster. And the gap between what redlining tools show you and what you actually need to know kept getting wider.

This post is about that gap. We'll walk through what contract redlining really involves in daily practice, where the standard tools genuinely fall short, and what to look for if you're tired of reading through walls of red ink that don't tell you much.

What contract redlining actually is

Redlining is the practice of marking up changes between document versions so that everyone involved can see exactly what moved, what was added, and what got cut. The name comes from the old-school method of literally drawing red lines through text on paper copies. Lawyers would scribble in the margins, cross things out, write new language above the line. It was physical, it was messy, and honestly it worked pretty well for single-page amendments.

Today, "redlining a contract" means producing a marked-up document that highlights the differences between two versions. Sometimes that markup comes from editing with Track Changes turned on. Sometimes it comes from running a comparison tool against two separate files. Either way, the goal is the same: give the reviewer a clear picture of what changed so they can decide whether to accept, reject, or push back.

And here's the part that doesn't get said enough: the quality of your redline directly affects the quality of your review. A clean, well-organized redline lets you focus on substance. A noisy, cluttered one buries the important stuff under hundreds of irrelevant markups. Same contract, same changes, wildly different review experience depending on what tool produced the output.

Two flavors of redlining (and when to use each)

There are really two distinct ways lawyers create redlines, and they serve different purposes. It's worth being clear about which one you're dealing with, because the failure modes are different.

Track Changes: editing in real time

You open a document, turn on Track Changes in Word, and start making your edits. Every addition shows up underlined (usually in a color). Every deletion shows up with a strikethrough. When you send the file back to the other side, they can see precisely what you touched and accept or reject each change individually.

This works well when you're the one doing the editing. You control what gets marked up. The output is clean because it only contains your intentional changes.

The trouble starts when someone else edits without Track Changes on, or when you receive a "clean" version that's supposed to reflect only the agreed changes. How do you verify that nothing extra slipped in? You can't. Not from the document alone. That's where the second flavor comes in.

If you want the step-by-step mechanics of both methods in Word, we have a detailed walkthrough that covers the button clicks.

Comparison-generated redlines: the after-the-fact check

This is what you need when you have two files and want to know what's different between them. You feed the "before" and "after" versions into a comparison tool, and it produces a redline showing every difference.

This is the verification step. The "trust but verify" of contract negotiation. Even if the other side says they only changed three clauses, you run the comparison anyway. Because sometimes people change more than they say they did. Not always on purpose, but it happens often enough that skipping this step is genuinely risky.

The catch? Comparison tools vary wildly in how useful their output is. And that's where things start to get interesting.

What a trustworthy redline gives you

Before we talk about what goes wrong, it helps to know what "right" looks like. A redline that actually serves the reviewer (rather than just dumping data on them) does a few things:

  • It separates substance from noise. Formatting changes, spacing tweaks, and punctuation edits are clearly distinguished from changes that affect legal meaning. You can still see them if you want to. But they don't compete for your attention alongside a modified indemnification cap.
  • It handles moved text as moves. When a paragraph relocates from one section to another, it shows up as a single move, not as a deletion in one place and an unrelated insertion somewhere else.
  • It groups related changes together. If someone renamed a defined term and that change rippled through 40 occurrences, a good redline treats that as one logical change instead of 40 scattered markups.
  • It tells you where to look first. Not every change carries the same weight. A changed dollar amount or a removed termination right is more urgent than a swapped synonym. Good output helps you prioritize.

That's the bar. And honestly? Most tools don't clear it. They produce accurate diffs (every character difference gets flagged), but accuracy and usefulness are not the same thing.

Five ways redlining goes wrong in practice

These aren't edge cases. They happen in ordinary contract negotiations, on ordinary documents, with the tools most lawyers use every day. If you've ever opened a redline and thought "this can't be right," chances are you ran into one of these.

1. Formatting noise that drowns out the real changes

This is the big one. Someone reformats the document (different fonts, adjusted margins, changed line spacing, applied a template) and suddenly the redline is a sea of red. Every cosmetic tweak gets the exact same visual treatment as a substantive edit. A changed font size on a heading looks identical to a changed liability cap.

You know what happens next. The reviewer starts skimming. They get fatigued after page three of formatting markups and their attention drifts. And that's precisely when a real change slips by unnoticed.

Some tools let you toggle "ignore formatting." But that's a blunt instrument. What if someone deliberately changed the formatting on a defined term to make it blend into the boilerplate? All-or-nothing filtering doesn't work for high-stakes review.

2. Moved paragraphs that look like something got deleted

Picture this: a limitation of liability clause gets moved from the general terms section into a specific carve-out section. The legal effect might be completely different now. But your redline shows two separate, unrelated changes: "paragraph deleted from Section 5" and "new paragraph added at Section 11."

If the document is long enough, those two changes might be pages apart. A reviewer has to mentally connect them. And if they don't? They might read the deletion as a removed protection and flag it as a problem, when really it was just repositioned. Or worse, they might miss that the repositioning actually narrowed its scope.

True move detection (where the tool recognizes that text was relocated, not removed and re-added) requires understanding document structure. Most redlining tools don't do this. Word's comparison definitely doesn't.
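To make the delete-plus-insert behavior concrete, here's a minimal Python sketch using the standard library's difflib. The clause text is invented for illustration; the point is that a line-level diff reports a relocated clause twice, and that even a naive "deleted text that reappears verbatim as an insertion" check can recover it as a move:

```python
import difflib

# Minimal sketch: a line-level diff has no concept of "moved text",
# so relocating a clause shows up as a deletion plus an insertion.
old = [
    "Limitation of Liability. Liability is capped at fees paid.",
    "Governing Law. This Agreement is governed by Delaware law.",
    "Notices. All notices must be in writing.",
]
new = [
    "Governing Law. This Agreement is governed by Delaware law.",
    "Notices. All notices must be in writing.",
    "Limitation of Liability. Liability is capped at fees paid.",
]

diff = list(difflib.unified_diff(old, new, lineterm=""))
# The relocated clause appears twice: once as '-', once as '+'.
deleted = {l[1:] for l in diff if l.startswith("-") and not l.startswith("---")}
inserted = {l[1:] for l in diff if l.startswith("+") and not l.startswith("+++")}

# Naive move detection: text that was both deleted and inserted
# verbatim is really a move, not a substantive change.
moves = deleted & inserted
print(moves)  # the Limitation of Liability clause, recovered as a move
```

Real move detection has to handle clauses that moved *and* changed slightly, which is where document structure matters; this sketch only catches verbatim relocations.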

3. Tables that turn into a garbled mess

Legal documents are full of tables: payment schedules, fee structures, milestone deliverables, compliance matrices. And tables are where comparison engines tend to fall apart most dramatically.

Add one row to a pricing table and some tools will misalign every row below it, comparing row 3 of the old version against row 4 of the new one. Delete a column and the output can look like half the table was rewritten. Merge two cells? Good luck getting anything readable out of that.

For anyone who regularly reviews contracts with significant tabular content (service agreements, licensing schedules, construction contracts), this is a real problem. The table is often where the money lives, and it's exactly where the redline is least reliable.

4. Section renumbering that buries a real deletion

Someone inserts a new clause at Section 3. Everything after it renumbers. Sections 4 through 18 are now Sections 5 through 19. The content didn't change in any of those sections. But the heading numbers did, so the redline flags every single one as modified.

Somewhere in that avalanche of renumbered headings, Section 7 (now Section 8) also had a sentence quietly removed. Finding that one real change among fifteen false positives is like spotting a specific receipt in a stack of renumbered invoices. It's possible. But it takes concentration, and concentration is exactly what's running low after reading through all those numbering changes.
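The fix is conceptually simple, even if most tools don't do it: compare sections with their leading numbers stripped, so pure renumbering stops registering as a change. A minimal sketch (with invented clause text) of that idea:

```python
import re

def strip_number(line: str) -> str:
    """Remove a leading section number like '4.' or '4.2.' from a line."""
    return re.sub(r"^\s*\d+(?:\.\d+)*\.?\s*", "", line)

old = [
    "4. Confidentiality. Each party shall protect the other's data.",
    "5. Term. This Agreement runs for two years.",
]
new = [
    "5. Confidentiality. Each party shall protect the other's data.",
    "6. Term. This Agreement runs for one year.",
]

# Only changes that survive number-stripping are substantive.
real_changes = [
    (o, n) for o, n in zip(old, new) if strip_number(o) != strip_number(n)
]
print(real_changes)  # only the Term change; the renumbering is filtered out
```

This assumes the sections still line up one-to-one after the insert, which is exactly the alignment problem that makes the real version harder than the sketch.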

5. Defined term renames that flood the whole document

Change "Effective Date" to "Commencement Date" in the definitions section and suddenly every occurrence throughout the contract gets flagged. That could be 30 or 40 individual markups, all for what is (usually) a single, harmless rename.

Usually. But not always. Sometimes a term rename subtly changes scope. "Services" becoming "Core Services" might look like a stylistic preference, but if a new "Ancillary Services" definition was added alongside it, the split could exclude certain obligations from the original coverage. A good redline would group all those occurrences into one logical change and let you evaluate the rename once, clearly. Most redlines just give you 40 scattered markups and wish you the best.
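What "grouping into one logical change" could look like under the hood can be sketched in a few lines of Python. The terms and contract text below are invented, and the dictionary shape is an illustrative assumption, not any tool's actual data model:

```python
import re

# Minimal sketch: collapse the scattered markups from a defined-term
# rename into one logical change the reviewer evaluates once.
old_term, new_term = "Effective Date", "Commencement Date"
contract_old = (
    'The "Effective Date" means the date of signature. '
    "Payment is due 30 days after the Effective Date. "
    "Obligations begin on the Effective Date."
)
contract_new = contract_old.replace(old_term, new_term)

# A character diff would flag each occurrence separately; instead,
# record one change carrying the occurrence count.
count = len(re.findall(re.escape(new_term), contract_new))
logical_change = {
    "kind": "defined_term_rename",
    "from": old_term,
    "to": new_term,
    "occurrences": count,
}
print(logical_change)  # one grouped change covering 3 occurrences
```

The hard part in practice is the inverse direction: noticing that 40 scattered one-word edits are all the *same* rename, rather than 40 independent changes.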

When Word's redlining is honestly fine

Let's be fair about this. Word's redlining tools aren't terrible. They're limited, but for certain situations they get the job done.

Word's Track Changes works well when:

  • You're the one making the edits and you want a clean markup of your own changes
  • The document is short (under 10 pages) and the changes are straightforward
  • Both parties have agreed to use Track Changes throughout the negotiation

Word's Compare Documents is workable when:

  • You're doing a quick check on a short, simple document
  • The document hasn't been reformatted between versions
  • You don't need to distinguish between material and cosmetic changes (because there aren't many cosmetic ones)

But here's the honest truth. Once a document crosses about 15 pages, or once someone reformats it, or once you're reviewing contracts where a missed change could actually cost money, Word's output starts working against you rather than for you. The noise-to-signal ratio gets bad fast, and the time you spend sorting through it adds up.

We wrote a full guide to redlining in Word if you want to get the most out of what it offers. It covers both Track Changes and Compare Documents step by step, plus the settings that help reduce (though not eliminate) the noise.

What "better" actually looks like

There's a generation of comparison tools that go beyond character-level diffing. Instead of treating every difference the same, they classify changes by what they actually affect. Formatting edits get separated from content edits. Moved text gets recognized as moves. Related changes get grouped together.

The term for this is "semantic comparison," and it means the tool is looking at what the words mean, not just whether the characters match. A changed dollar amount gets treated differently from a changed font. "Best efforts" becoming "commercially reasonable efforts" gets flagged as significant, while "shall" becoming "will" in a jurisdiction where they're equivalent gets marked as stylistic.
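The rules a real semantic engine applies go far beyond anything that fits in a blog post, but the shape of the idea can be sketched in a few lines. The categories and heuristics below are illustrative assumptions, not any vendor's actual logic:

```python
import re

# Illustrative heuristics only; real semantic comparison goes much deeper.
MONEY = re.compile(r"\$[\d,]+(?:\.\d+)?")

def classify(old_text: str, new_text: str) -> str:
    """Assign a rough review priority to a single change."""
    if old_text.split() == new_text.split():
        return "cosmetic"       # only whitespace/formatting differs
    if MONEY.findall(old_text) != MONEY.findall(new_text):
        return "high-priority"  # a dollar amount changed
    return "substantive"        # wording changed; a human should look

print(classify("Fee: $10,000 per month.", "Fee: $12,000 per month."))
print(classify("The  fee is  due.", "The fee is due."))
print(classify("best efforts", "commercially reasonable efforts"))
```

Even this toy version shows the payoff: the three changes get three different priorities instead of three identical red marks.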

This doesn't replace your judgment. You still make the call on whether a change is acceptable. But the tool stops wasting your time on things that don't warrant your attention, so you can spend that attention where it actually counts.

That's what we built Clausul to do. Semantic comparison with AI classification, so you see the changes that matter first and the formatting noise doesn't steal the show. If you're curious, our full comparison guide covers how different tools stack up, including pricing and feature breakdowns.

Frequently asked questions

What does it mean to redline a contract?

Redlining a contract means marking up the differences between two versions so both sides can see exactly what changed. The term comes from lawyers who literally drew red lines through text on paper. Today it usually means using software to compare document versions and produce a markup showing additions, deletions, and modifications.

What's the difference between a redline and a blackline?

In practice, most lawyers use these interchangeably. "Redline" is more common in the US, while "blackline" shows up more in Canada and parts of the UK. Some firms draw a technical distinction (redline = tracked changes you can accept or reject; blackline = a clean summary of changes), but the meaning is effectively the same.

Can I redline a contract without Track Changes?

Yes. Word's Compare Documents feature can generate a redline from any two .docx files, even if Track Changes was never turned on. Dedicated comparison tools like Draftable, Litera Compare, and Clausul also produce redlines without requiring Track Changes.

Why does my contract redline show hundreds of changes when only a few things were edited?

Most likely formatting noise. If someone reformatted the document (changed fonts, adjusted spacing, applied styles), those all register as "changes" in a character-level comparison. Moved paragraphs also inflate the count because they show up as a deletion plus a separate insertion. A semantic comparison tool can separate these from the substantive edits.

Is redlining the same as document comparison?

They overlap but they're not identical. Redlining is the broader practice of marking up changes in a document. Document comparison is one way to produce a redline. You can also redline by editing with Track Changes turned on. The key difference: comparison generates a redline after the fact from two separate files, while Track Changes records edits as you make them.


About this post. Written by the Clausul team. We build document comparison tools for legal teams and think a lot about what makes a redline actually useful vs. just technically accurate.

Questions or corrections? Drop us a line.

Last reviewed: February 2026.