Contract Redlining Best Practices: A Step-by-Step Workflow

Redlining a contract should be straightforward. You get a draft back, you compare it against what you sent, you identify what changed, and you respond. In practice, it is rarely that clean. The comparison output is noisy. The other side made changes they didn't mention. Someone reformatted the document and now every paragraph is flagged. A defined term was renamed and the cascade is buried across thirty pages.

The result is that most contract redlining workflows rely on attention and endurance rather than structure. Lawyers start at page one, read through every markup, and hope they don't lose focus before page forty. That approach works until it doesn't. And when it doesn't, the missed change is usually the one that matters most.

This post is a step-by-step workflow for redlining contracts that catches more, wastes less time, and doesn't depend on perfect concentration. It covers preparation, comparison, structured review, common mistakes, and the specific situations where you should stop and escalate.

What contract redlining is (briefly)

Redlining is the practice of marking up changes between contract versions so that both sides can see what was added, deleted, or modified. The term comes from the physical practice of drawing red lines through text on paper. Today it means producing a marked-up document, either by editing with Track Changes on or by running a comparison tool against two separate files.

The goal is always the same: give the reviewer a clear, complete picture of what changed so they can evaluate each change and decide whether to accept, reject, or counter. The quality of that picture depends on both the tool that produces the markup and the process used to review it.

If you want a deeper look at what redlining involves and where the standard tools fall short, our post on what contract redlining is covers the fundamentals in detail.

Step 1: Preparing to redline

Most redlining problems don't happen during the review. They happen before it, because someone skipped a preparation step that takes less than two minutes. The preparation phase is where you prevent the most common and most expensive mistakes.

Identify the correct baseline

The baseline is the version you last sent to the other side. Not the version you think you sent. Not the version in your drafts folder. The actual file that left your hands. Pull it from your sent email, your document management system, or whatever record of transmission you have.

Comparing against the wrong baseline is the single most common redlining error, and it's invisible until someone catches it. The comparison output will look plausible. It will show real differences. But those differences will be measured against the wrong starting point, which means your entire review is based on incorrect inputs. You might flag changes that aren't changes, miss changes that are, or both.

Accept all existing tracked changes

If either document still has pending tracked changes from a previous round, the comparison engine will compare against a document that has two states at once: the "before accept" state and the "after accept" state. The result is garbled output that mixes old markup with new differences.

Before running any comparison: open both files, go to the Review tab, choose Accept > Accept All Changes, and save. This takes thirty seconds and prevents the most common source of unusable comparison output.

Verify the received version

When the other side sends back a revised draft, do a quick sanity check before you start the comparison. Is the file name consistent with what they said they were sending? Does the date make sense? Is it a .docx file or did they send a PDF (which would require conversion before comparison)? If they sent both a clean copy and a redline, note which is which. You will compare against the clean copy, not the redline.

Set up your file naming

A naming convention like [Client]_[DocType]_v[X]_[YYYY-MM-DD].docx prevents version confusion. The exact format matters less than consistency. When you receive a file from opposing counsel with a different naming scheme, rename it to fit your convention before you do anything else. Three weeks from now, when you need to pull the right version, you will be glad you did.
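A convention like this is also easy to enforce mechanically. The sketch below assumes the exact [Client]_[DocType]_v[X]_[YYYY-MM-DD].docx format shown above; the field names and allowed characters are illustrative assumptions you would adjust to your own scheme.

```python
import re

# Illustrative pattern for a [Client]_[DocType]_v[X]_[YYYY-MM-DD].docx
# convention; the exact fields are an assumption, not a standard.
FILENAME_RE = re.compile(
    r"^(?P<client>[A-Za-z0-9-]+)_"
    r"(?P<doctype>[A-Za-z0-9-]+)_"
    r"v(?P<version>\d+)_"
    r"(?P<date>\d{4}-\d{2}-\d{2})\.docx$"
)

def matches_convention(name: str) -> bool:
    """True if the file name follows the convention exactly."""
    return FILENAME_RE.fullmatch(name) is not None
```

A check like this can run as a pre-filing hook, so a stray "MSA final FINAL (2).docx" never lands in the matter folder unrenamed.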

Step 2: Running the comparison

With both files prepared, run the comparison. The tool you use matters, but the process around it matters more.

Always run an independent comparison

Even if the other side provided a redline, run your own comparison between their clean version and the version you sent. Their redline might be based on a different baseline. It might omit changes made after the markup was generated. It might not reflect changes made with Track Changes turned off.

An independent comparison shows what actually differs between two files, regardless of what anyone says changed. It is the verification step, and skipping it is the highest-risk shortcut in contract review.
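The core of an independent comparison — word-level differencing of two texts — can be sketched with Python's standard difflib. This is a simplified illustration on extracted plain text; real comparison tools add document parsing, formatting classification, and move detection on top of this idea.

```python
import difflib

def word_diff(old: str, new: str):
    """Return (deleted, inserted) word runs between two clause texts,
    computed independently of any markup either side provided."""
    old_words, new_words = old.split(), new.split()
    sm = difflib.SequenceMatcher(a=old_words, b=new_words, autojunk=False)
    deleted, inserted = [], []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag in ("delete", "replace"):
            deleted.append(" ".join(old_words[i1:i2]))
        if tag in ("insert", "replace"):
            inserted.append(" ".join(new_words[j1:j2]))
    return deleted, inserted
```

The point of the sketch is the input: it takes the two files themselves, not anyone's account of what changed between them.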

Check the comparison quality

Before diving into individual changes, scan the comparison output for signs of misalignment. If large blocks of text show as completely deleted and reinserted (rather than showing specific word-level differences), the comparison may have misaligned the documents. This happens when documents have significantly different formatting, when sections were moved, or when someone applied a different template.

If the output looks garbled, check two things: first, that you're comparing the right files; second, that tracked changes were accepted in both documents before comparison. If both checks pass, try a different comparison tool. Different engines handle structural changes differently, and some produce much cleaner output on reformatted documents than others.

Note the change summary

Before reading any individual markup, look at the big picture. How many changes does the comparison show? If your tool classifies changes, what's the split between content changes and formatting changes? Are the changes concentrated in a few sections or scattered throughout?

This ten-second scan sets your expectations. A comparison with 15 content changes is a focused review you can finish in 20 minutes. A comparison with 80 content changes across every section signals a heavily negotiated round that needs an hour or more. Knowing this upfront lets you budget your time and attention correctly.

Step 3: Structured review of changes

This is where most workflows break down. The default approach is to start at page one and read every markup in order. That approach gives equal attention to a changed font on page two and a removed indemnification cap on page twenty-eight. By the time you reach the cap, you've spent your best concentration on cosmetic edits.

A structured review works in priority order. You review the changes most likely to affect the deal first, while your attention is sharpest.

Start with definitions

Go to the definitions section first and review every change there before touching anything else. A changed definition is the highest-leverage edit in any contract. "Confidential Information" narrowed to exclude publicly available data. "Services" split into "Core Services" and "Ancillary Services." "Material Adverse Effect" redefined with a higher threshold. Each of these changes can affect dozens of clauses throughout the agreement.

For each changed definition:

  1. Note what changed (narrowed, expanded, new qualifier, new exception).
  2. Search the document for every occurrence of that term.
  3. For each occurrence, evaluate whether the definition change alters the meaning of that clause.
  4. If the changed term references other defined terms, check whether those also changed.

This step takes five to ten minutes on a typical contract. It catches the changes that comparison tools are worst at surfacing: the ones where the text in a clause didn't change at all, but its meaning changed completely because the definition behind a key term shifted.
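The occurrence search in step 2 can be approximated with a whole-word search over the extracted contract text. This is a hedged sketch: it assumes you have the body text as a plain string, and it returns a snippet of context around each hit so you can re-read every use of the term in place.

```python
import re

def term_occurrences(text: str, term: str, context: int = 30):
    """Find every whole-word occurrence of a defined term, returning a
    snippet of surrounding text for in-context review of each use."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b")
    return [
        text[max(0, m.start() - context): m.end() + context]
        for m in pattern.finditer(text)
    ]
```

Word's own Find does the same job interactively; the value of scripting it is getting all occurrences as a list you can work through without losing your place.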

Review high-risk clauses next

After definitions, go directly to the clauses that carry the most financial and legal risk. Don't wait to encounter them during a linear read. Go to them deliberately.

  • Indemnification. Caps, carve-outs, basket amounts, survival periods. Any change here directly affects financial exposure.
  • Limitation of liability. Cap amounts, exclusions from cap, consequential damages waivers. A changed number or a new exclusion can shift millions in exposure.
  • Termination. Triggers, cure periods, notice periods, post-termination obligations, termination for convenience. Changes here affect how either party can exit the agreement.
  • Intellectual property. Ownership, license scope, work-for-hire provisions, assignment clauses. These are often the hardest to reverse once signed.
  • Representations and warranties. Scope, knowledge qualifiers, materiality qualifiers, survival periods. A "knowledge" qualifier on a representation fundamentally changes what it guarantees.

For each change in these sections, ask: does this change the risk allocation, the financial terms, or the obligations? If yes, flag it for discussion with the partner or client. If no, note it and move on.

Check for moved clauses

Most comparison tools show moved text as a deletion in one place and an unrelated insertion somewhere else. If you see a clause deleted from one section and similar language appearing in a different section, check whether it was moved rather than removed and replaced. The distinction matters because a move can change the legal effect of a provision even when the words are identical: a limitation of liability clause in the general terms section has a different scope than the same clause nested inside a specific service schedule.

Look for this pattern: a deletion that seems surprising (why would they remove that protection?) paired with an insertion elsewhere that looks familiar. If you find a move, evaluate whether the new location changes its scope, applicability, or interaction with surrounding provisions.
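That pairing heuristic — a deletion matched with a near-identical insertion elsewhere — can be sketched with difflib's similarity ratio. The 0.9 threshold is an arbitrary illustration, not a tuned value; real comparison engines use more robust move detection.

```python
import difflib

def likely_moves(deleted_blocks, inserted_blocks, threshold=0.9):
    """Pair each deleted block with any inserted block that is nearly
    identical to it -- a likely move rather than a true deletion."""
    pairs = []
    for d in deleted_blocks:
        for i in inserted_blocks:
            ratio = difflib.SequenceMatcher(a=d, b=i, autojunk=False).ratio()
            if ratio >= threshold:
                pairs.append((d, i, round(ratio, 2)))
    return pairs
```

A flagged pair is a prompt, not an answer: the substantive question is still whether the new location changes the clause's scope.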

Review tables and schedules separately

Tables are where comparison tools are weakest and where the money often lives. Pricing schedules, SLA matrices, milestone deliverables, fee structures. Do not rely solely on the comparison output for table changes. Open both versions side by side and verify cell by cell.

Check for: changed numbers, added or removed rows, modified thresholds, changed measurement periods, and any inconsistency between the table content and what the contract body says. If the contract text says "monthly fee of $10,000" and the pricing table says "$12,000," one of them changed and the other didn't get updated.
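Once both tables are extracted (here assumed as simple lists of row lists), the cell-by-cell check itself is mechanical. This sketch also surfaces added or removed rows and columns, which show up as a None on one side.

```python
def diff_tables(old, new):
    """Cell-by-cell diff of two tables given as lists of row lists.
    Yields (row, col, old_value, new_value) for every differing cell;
    a cell missing on one side (added/removed row or column) is None."""
    changes = []
    for r in range(max(len(old), len(new))):
        old_row = old[r] if r < len(old) else []
        new_row = new[r] if r < len(new) else []
        for c in range(max(len(old_row), len(new_row))):
            a = old_row[c] if c < len(old_row) else None
            b = new_row[c] if c < len(new_row) else None
            if a != b:
                changes.append((r, c, a, b))
    return changes
```

The hard part in practice is the extraction step, not the diff: merged cells and nested tables are exactly where comparison tools lose alignment, which is why the side-by-side manual check still matters.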

Verify cross-references

If any sections were added, deleted, or renumbered, check that cross-references still point to the correct targets. A new Section 3 pushes everything after it down by one. Every cross-reference to "Section 7" now needs to say "Section 8." If the drafter updated the numbering but missed a cross-reference, the contract contains a provision that points to the wrong section.

Search the document for references to any renumbered sections. Also search for references to any deleted sections. Both create orphaned references that can cause interpretation disputes later.
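That search can be scripted against the extracted text. The sketch below makes simplifying assumptions: references always read "Section N", and you supply the renumbering map and the set of deleted sections yourself. Real contracts also use "Clause", "§", and subsection numbering, so treat this as a starting point.

```python
import re

def stale_references(text, renumbered, deleted):
    """Flag 'Section N' cross-references that point at renumbered or
    deleted sections. `renumbered` maps old number -> new number."""
    flags = []
    for m in re.finditer(r"Section\s+(\d+)", text):
        n = int(m.group(1))
        if n in renumbered:
            flags.append((m.group(0), f"should now be Section {renumbered[n]}"))
        elif n in deleted:
            flags.append((m.group(0), "target section was deleted"))
    return flags
```

Every flag is a candidate orphaned reference; each still needs a human read, since some references legitimately point at unchanged sections with reused numbers.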

Scan remaining changes

After the priority items, work through everything else. This is where most of the formatting noise lives, along with boilerplate edits, minor clarifications, and punctuation fixes. These are lower priority but should not be ignored entirely. Occasionally, a change that looks cosmetic turns out to have substance. A changed heading level can affect which provisions fall under a particular governing clause. A "clarification" can subtly shift the scope of an obligation.

The goal at this stage is not to spend twenty minutes on every semicolon change. The goal is a quick, deliberate scan that catches anything with potential substance, while efficiently confirming that the rest is genuinely cosmetic.

Step 4: Responding to the redline

Once you've reviewed all changes, you need to produce a response. That means creating your own markup, documenting what you found, and preparing a clean version.

Create your response markup

Start from the version the other side sent (their clean copy). Turn on Track Changes in Word. Make your edits: accept what you agree with, reject what you don't, and propose alternative language where needed. Add comments to explain your reasoning on significant changes. The other side's counsel should be able to understand why you rejected or modified a provision without having to call you.

Write a change summary

Partners and clients don't want to read through every tracked change. They want a concise summary of what matters. For each significant change, document:

  • What changed: state the change in plain language.
  • Impact: how it affects risk, cost, timeline, or obligations.
  • Your response: accepted, rejected, or countered (with your proposed alternative language if applicable).

Example: "Liability cap reduced from $5M to $500K (Section 8.2). Impact: significantly increases our uncovered exposure on performance failures. Recommendation: counter at $5M, fallback at $3M with carve-out for willful misconduct."

These notes take five minutes to write and serve two purposes. They give the decision-maker what they need to respond quickly. And they create an audit trail that's invaluable when someone asks "why did we agree to this?" six months later.

Compare the final clean copy

Before sending your response, accept all your tracked changes to produce a clean version. Then run one more comparison: your clean version against the version you started from. This catches any editing errors (accidental deletions, tracked changes that weren't properly accepted or rejected, formatting artifacts). It also ensures that the clean version you send actually matches your markup.

This same principle applies at the end of the negotiation. When the other side sends a "final clean copy for signature," compare it against the last agreed version. Final copies sometimes contain discrepancies: a stale paragraph, a round of edits that didn't get incorporated, a change that was supposed to be in the next draft but landed in this one by mistake.

Common redlining mistakes (and how to avoid them)

These are not hypothetical. They come from real negotiations, real missed changes, and real problems that could have been prevented with a slightly different process.

Accepting the other side's redline without independent comparison

When opposing counsel sends a redline alongside their clean version, some lawyers review the provided redline and skip running their own comparison. This is the highest-risk shortcut in contract review.

The provided redline may be incomplete. It may be based on a different baseline. It may not reflect changes made after the markup was generated. It may omit changes made with Track Changes turned off. An independent comparison between their clean version and your last draft catches all of these problems. The provided redline catches none of them.

Relying solely on Track Changes

Track Changes records what happened while tracking was enabled. If someone turns off tracking, makes an edit, and turns tracking back on, that edit is invisible in the tracked changes view. If someone accepts a tracked change and then modifies the accepted text, the modification doesn't show up as a tracked change.

Track Changes is useful for creating your own markup. It is not a reliable verification mechanism for changes made by someone else. Independent comparison fills that gap.

Ignoring formatting changes without scanning them

When a comparison produces a hundred formatting-only changes, the temptation is to dismiss them all at once. Usually that's safe. But occasionally, a structural change hides among the cosmetic ones. A heading level change can affect which subsections fall under a particular governing provision. A numbering change can break cross-references. A changed defined term style (bold, italics, capitalization) can indicate that the term itself was modified.

Don't review every formatting change in detail. But do scan them quickly before dismissing the batch. Look for heading-level changes, numbering changes, and any formatting change in the definitions section.

Reading the comparison output linearly

Starting at page one and reading every markup in sequence is the default approach, and it's the least efficient. By the time you reach the liability cap on page thirty, you've already spent your best attention on "Party A" becoming "Company" on page two and a comma splice fix on page nine.

The structured review approach described above (definitions first, then high-risk clauses, then tables, then cross-references, then everything else) ensures that the changes most likely to affect the deal get your best attention. The order matters more than the speed.

Skipping the comparison on "minor" rounds

"They said they only changed the effective date." Run the comparison anyway. It takes seconds. The frequency with which the other side makes undisclosed changes (sometimes intentionally, sometimes because an associate made edits without telling the partner) is high enough that skipping the comparison is never worth the risk.

Not comparing the execution copy

The negotiation is done. Terms are agreed. The other side sends a clean copy for signature. Many teams skip comparing this against the last agreed draft because the negotiation is over. But execution copies sometimes diverge from the agreed version. A stale paragraph. An unapplied edit. A last-minute change that wasn't discussed. A two-minute comparison catches the difference. Skipping it doesn't save time; it creates unmanaged risk.

Best practices by stage

Here is a summary of the most important practices at each stage of the redlining workflow.

Preparation

  • Pull your baseline from where you actually sent it (sent email, DMS), not from where you think it is.
  • Accept all tracked changes in both documents before comparing.
  • Rename received files to match your naming convention immediately.
  • If either document is a PDF, convert to .docx before comparing (native PDFs only; scanned PDFs require OCR and the output should be manually verified).

Comparison

  • Always run your own independent comparison, even if the other side provided a redline.
  • Verify the comparison quality before reviewing individual changes. Garbled output means wrong files, unaccepted tracked changes, or a tool that can't handle the formatting differences.
  • Note the total change count and the content-to-formatting split before starting the review.

Review

  • Review in priority order: definitions, then high-risk clauses, then tables, then cross-references, then everything else.
  • For each changed definition, trace every occurrence through the document.
  • Check for moved clauses (deletions paired with similar insertions elsewhere).
  • Verify tables cell by cell against both versions, not just the comparison output.
  • If sections were renumbered, check every cross-reference.
  • Scan formatting changes before dismissing them. Look for heading-level changes, numbering changes, and definition formatting changes.

Response

  • Start from the other side's clean copy with Track Changes on.
  • Add comments explaining your reasoning on significant rejections or modifications.
  • Write a concise change summary for the partner or client (what changed, impact, recommendation).
  • Compare your final clean version against the version you started from to catch editing errors.
  • Inspect the document for hidden metadata before sending (Document Inspector in Word).

Tools and workflow integration

The right tool makes each stage of the workflow faster. The wrong tool, or no tool, makes every stage harder. Here's what matters at each level.

Word's built-in Compare

Free, already on your machine, and adequate for short, simply formatted documents. Review tab, Compare, select your two files, hit OK. The limitations show up on longer documents, reformatted documents, and anything with tables. The output treats every difference (font change, spacing change, substantive edit) with equal weight, so you're responsible for the entire triage yourself.

Dedicated comparison tools

Tools like Clausul, Litera, and Draftable produce cleaner output than Word. The better ones separate formatting changes from content changes, detect moved text, and classify changes by importance so you can triage faster. These features directly support the structured review workflow described above: if your tool can surface the high-risk changes first, you don't have to hunt for them manually.

Document management

If your firm uses a DMS (iManage, NetDocuments, SharePoint), pull your baseline from the DMS version record rather than from local copies. The DMS version is the system of record. Local copies can be stale, modified, or the wrong version entirely. Integrate your comparison workflow with the DMS so that the comparison input files are always the authoritative versions.

Communication

Your change summary is a communication tool, not just a review artifact. Use a consistent format across your team. Keep it short. Make every item actionable: what changed, what it means, what you recommend. If you share the comparison output with the partner or client alongside the summary, highlight the key changes so they don't have to find them in a wall of red markup.

When to escalate

Not every redlining issue is something you handle on your own. Certain patterns should trigger escalation to a senior lawyer, the deal partner, or the client.

Undisclosed changes to high-risk clauses

If the other side changed an indemnification cap, added a termination trigger, modified an IP assignment provision, or altered a liability exclusion without mentioning it in their cover email, escalate. Undisclosed changes to high-risk provisions are either a mistake (their associate forgot to mention it) or intentional (they hoped you wouldn't notice). Either way, it warrants a conversation with someone senior on your team about how to respond.

New clauses or sections that weren't negotiated

If the comparison reveals entirely new provisions that were not part of the negotiation discussion, escalate. A new non-compete clause, a new audit right, a new assignment restriction. Adding provisions without discussion is a different kind of negotiation tactic than modifying existing terms, and it usually requires a strategic response rather than a clause-by-clause markup.

Definition changes with broad cascading effects

A changed definition that affects a handful of clauses is normal negotiation. A changed definition that fundamentally alters the scope of the entire agreement (for example, "Services" being redefined to exclude a major category of work) is something the deal partner needs to know about before you respond.

Execution copy discrepancies

If the final clean copy doesn't match the last agreed version, stop. Do not sign. Do not send for signature. Escalate immediately. This could be a drafting error, a miscommunication, or something worse. Either way, it needs to be resolved before anyone signs.

Patterns of obscured changes

If you notice a pattern where changes seem designed to be difficult to detect (moving a clause to a less visible section while simultaneously narrowing its scope, renaming a defined term in a way that subtly excludes coverage, splitting a provision across two sections so the deletion of one half is less obvious), escalate. These patterns change the nature of the negotiation and the deal partner should be aware.

Putting it all together

Contract redlining is not complicated in principle. It's demanding in practice because the volume of changes, the noise in comparison output, and the consequences of missing something all push against casual attention.

The workflow in this post reduces that pressure by giving you a structure that doesn't depend on reading every markup in sequence from page one. Prepare correctly (right baseline, accepted tracked changes, clean files). Run an independent comparison (always, regardless of what the other side provided). Review in priority order (definitions, high-risk clauses, tables, cross-references, then everything else). Document what you found. Compare the final clean copy before signing.

If you're spending too much time sorting through formatting noise, or if you need a comparison tool that classifies changes by importance so you can work through the high-risk items first, Clausul was built for exactly that workflow. But regardless of which tool you use, the process above will make your redlining more reliable and less dependent on sustained concentration across forty pages of markup.

Frequently asked questions

What is the most important step in redlining a contract?

Running an independent comparison against the version you actually sent, not the version you think you sent. The most common and most damaging redlining failure is reviewing against the wrong baseline. After that, the highest-leverage step is reviewing definition changes first and tracing their impact through the rest of the agreement before touching any other section.

Should I trust a redline that opposing counsel sends me?

No. A redline provided by the other side shows what they chose to mark up, which may not reflect every change they made. It may be based on a different baseline than your last draft, or it may omit changes made after the redline was generated. Always run your own independent comparison between the clean version they sent and the last version you sent. The provided redline can be a helpful reference, but it is not a substitute for your own verification.

How do I handle a redline with hundreds of formatting changes?

First, determine whether the formatting changes are masking substantive edits. Use a comparison tool that separates formatting from content, or manually scan a sample of the formatting changes to confirm they are genuinely cosmetic. Then set the formatting changes aside and focus your review on content changes. Do not dismiss all formatting changes without scanning them: occasionally, structural changes (heading levels, numbering, defined term formatting) appear as formatting edits but have substantive implications.

When should I escalate a redline issue to senior counsel?

Escalate when: (1) a high-risk clause (indemnification, liability cap, termination, IP assignment) has been materially changed without disclosure, (2) entirely new clauses or sections have been added that were not part of the negotiation, (3) defined terms have been changed in ways that cascade through the agreement and alter obligations, (4) the clean execution copy does not match the last agreed version, or (5) changes appear designed to obscure their effect (e.g., moving a clause to a different section while simultaneously narrowing its scope).

What is the difference between a redline and an independent comparison?

A redline is a marked-up document showing changes. It can be produced by editing with Track Changes on (which records what happened while tracking was active) or by running a comparison tool against two files (which shows what actually differs between them). An independent comparison specifically means using a comparison tool to detect differences between two files, regardless of whether Track Changes was used. The distinction matters because Track Changes only captures edits made while it was enabled. An independent comparison catches everything, including changes made with tracking turned off.


About this post. Written by the Clausul team. We build document comparison software for legal teams.

Last reviewed: March 2026.