Table Changes in Contracts: SLA Schedules, Pricing, and What Tools Miss

· 13 min read

Most contract comparison discussions focus on paragraphs: the indemnification clause was modified, the termination provision was expanded, the defined term was narrowed. But many of the most financially significant terms in a contract live in tables: pricing schedules, SLA matrices, milestone payments, disclosure schedules.

Tables are also where comparison tools are weakest. The same tool that reliably catches a two-word change in a liability cap may completely garble the comparison of a pricing table. This means the most important numbers in the contract get the least reliable automated review.

This post explains why table comparison is hard, where the common tools fail, what types of table changes matter most, and how to review them.

Why tables are hard to compare

Text comparison is a well-solved problem. Given two sequences of words, algorithms can efficiently find the insertions, deletions, and substitutions between them. This works because text is one-dimensional: words come in a sequence.

Tables are two-dimensional. A table is a grid of cells, and comparing two grids requires solving a harder problem: which cell in version A corresponds to which cell in version B? This correspondence is straightforward when the table structure hasn't changed (same rows, same columns, just different text in some cells). But when rows are added, deleted, or moved, the tool needs to figure out the alignment before it can compare the content.
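To see why the one-dimensional case is easy, here is a minimal sketch using Python's standard `difflib`, which finds insertions, deletions, and substitutions between two word sequences. The SLA sentence is an illustrative example, not taken from a real contract.

```python
import difflib

def word_diff(old_text: str, new_text: str):
    """Return the non-equal edit operations between two word sequences."""
    old, new = old_text.split(), new_text.split()
    matcher = difflib.SequenceMatcher(a=old, b=new)
    return [
        (tag, old[i1:i2], new[j1:j2])
        for tag, i1, i2, j1, j2 in matcher.get_opcodes()
        if tag != "equal"
    ]

changes = word_diff(
    "Provider shall maintain 99.9% uptime measured monthly",
    "Provider shall maintain 99.5% uptime measured quarterly",
)
# Two clean substitutions: the threshold and the measurement period.
```

A few lines of standard-library code solve the one-dimensional problem reliably. There is no equivalently simple standard-library answer for aligning two grids of cells.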

Three specific structural changes make table comparison especially difficult:

Row insertions and deletions

If a pricing table had 10 rows and now has 12, which 2 rows are new? If the new rows were added at the end, the answer is obvious. If they were inserted in the middle, every subsequent row shifts down. The tool needs to match rows by content similarity rather than by position. Some tools don't do this: they compare row 1 against row 1, row 2 against row 2, and so on. When a row is inserted at position 3, every row from 3 onward is compared against the wrong counterpart.
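Matching rows by content similarity rather than position can be sketched as a greedy best-match over row text. This is a simplified illustration with made-up line items and a hypothetical similarity threshold, not any tool's actual algorithm.

```python
import difflib

def align_rows(old_rows, new_rows, threshold=0.7):
    """Match each new row to its most similar old row by content.

    Rows are lists of cell strings. Unmatched new rows are insertions;
    unmatched old rows are deletions. The 0.7 threshold is an assumption.
    """
    unused = set(range(len(old_rows)))
    matches, inserted = {}, []
    for j, new_row in enumerate(new_rows):
        best_i, best_score = None, threshold
        for i in unused:
            score = difflib.SequenceMatcher(
                a=" | ".join(old_rows[i]), b=" | ".join(new_row)
            ).ratio()
            if score > best_score:
                best_i, best_score = i, score
        if best_i is None:
            inserted.append(j)      # a genuinely new row
        else:
            matches[j] = best_i     # candidate for cell-level comparison
            unused.discard(best_i)
    deleted = sorted(unused)
    return matches, inserted, deleted

old = [["Support", "100", "$150"], ["Hosting", "12", "$500"]]
new = [["Support", "100", "$175"],
       ["Training", "5", "$2,000"],
       ["Hosting", "12", "$500"]]
matches, inserted, deleted = align_rows(old, new)
```

Here the Support row is matched despite its price change, the Training row is flagged as an insertion, and Hosting still pairs up even though it shifted down a position. A position-only comparison would instead have compared Training against Hosting and reported Hosting as new.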

Cell merges and splits

A cell that spanned 2 columns in version A is split into 2 separate cells in version B. Or 3 cells in version A are merged into 1 cell in version B. The correspondence between cells is no longer one-to-one. Most comparison algorithms cannot handle this gracefully. They either report the merged/split cells as completely changed (even if the text is the same) or skip the comparison entirely.

Column additions and removals

If a table gains or loses a column, the grid structure changes. Every row has a different number of cells. The tool needs to determine which columns match between the two versions before comparing cell content. This is essentially a two-dimensional alignment problem that most text-comparison algorithms are not designed to solve.
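One common simplification is to align columns by header text before touching cell content. The sketch below assumes headers are present and unique, which real contract tables do not always guarantee.

```python
def match_columns(old_header, new_header):
    """Map each new column index to the old column with the same header.

    Returns the index mapping plus the header names that were added or
    removed. A header-text match only; real tools may also compare the
    cell content beneath each column.
    """
    old_index = {name: i for i, name in enumerate(old_header)}
    mapping = {
        j: old_index[name]
        for j, name in enumerate(new_header)
        if name in old_index
    }
    added = [name for name in new_header if name not in old_index]
    removed = [name for name in old_header if name not in set(new_header)]
    return mapping, added, removed

old_header = ["Item", "Qty", "Unit Price", "Total"]
new_header = ["Item", "Qty", "Unit Price", "Discount", "Total"]
mapping, added, removed = match_columns(old_header, new_header)
```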

Four types of contract tables and what changes matter

1. Pricing tables and fee schedules

What they contain: line items, unit prices, quantities, totals, discount tiers, payment terms. May include formulas or calculated fields.

What changes matter: any change to a number matters. A unit price change from $150 to $175 is a direct financial impact. A quantity change from 1,000 to 10,000 (or 10,000 to 1,000) changes the total by an order of magnitude. Row additions may add new line items with new costs. Row deletions may remove deliverables that were previously included in the price. Discount tier changes affect the effective rate at different volumes.

The risk: pricing changes are often made in the final round of edits, after the substantive terms have been negotiated. A small change to a number in a table is easy to miss in a comparison that shows 80 text changes across 30 pages.

2. SLA tables and performance metrics

What they contain: service levels, uptime percentages, response time targets, resolution time commitments, measurement periods, penalty calculations.

What changes matter: threshold changes (99.9% uptime to 99.5%), measurement period changes (monthly to quarterly, which allows worse months to be averaged out), penalty calculation changes (flat fee to percentage of monthly charges), exclusion additions (excluding scheduled maintenance, third-party outages, force majeure events from the uptime calculation). Each of these changes affects the practical enforceability of the SLA.

The risk: SLA tables are technical. The significance of a change from "monthly" to "quarterly" measurement may not be obvious to a legal reviewer who focuses on the contract language. But it means the provider has three months to recover from a bad month instead of one.
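The monthly-versus-quarterly effect is easy to show with arithmetic. The uptime figures below are illustrative, not from any real SLA.

```python
# Three months of measured uptime (percent); one bad month.
months = [99.9, 99.9, 98.9]
threshold = 99.5  # contractual uptime commitment

# Monthly measurement: each month is tested on its own.
monthly_breaches = [u for u in months if u < threshold]

# Quarterly measurement: the three months are averaged first,
# so the two good months absorb the bad one.
quarterly_average = sum(months) / len(months)
quarterly_breach = quarterly_average < threshold
```

Under monthly measurement the 98.9% month triggers a penalty. Under quarterly measurement the average lands above 99.5% and no penalty is owed, even though the customer lived through the same outage either way.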

3. Disclosure schedules

What they contain: exceptions to representations and warranties, organized by the section of the main agreement they reference. Each entry is a specific item that the seller is disclosing as an exception to a general representation.

What changes matter: new entries (the seller is disclosing an additional exception), removed entries (a previously disclosed exception is no longer listed, which may indicate the issue was resolved or may be an omission), modified descriptions (the scope of an existing exception is narrowed or broadened), and cross-reference changes (an exception that was under Section 3.5 is now under Section 3.7, potentially changing which representation it qualifies).

The risk: disclosure schedules are often reviewed separately from the main agreement, sometimes by different team members. Changes to the schedule may not be cross-referenced against changes to the main agreement. A narrowed representation in the main body combined with expanded disclosure exceptions can dramatically reduce the buyer's protections.

4. Milestone and payment tables

What they contain: deliverables, due dates, payment amounts tied to each milestone, acceptance criteria, penalties for late delivery.

What changes matter: date changes (extending deadlines), payment amount changes (front-loading or back-loading payments), milestone reordering (changing the sequence of deliverables), acceptance criteria changes (making it easier or harder to trigger payment), and the addition or removal of milestones (changing the total scope of work).

The risk: milestone tables are negotiated alongside the scope of work. A date change in the milestone table may not be coordinated with a related date in the contract body. If the contract says "completion by December 31" and the milestone table shows the final deliverable due on January 15, there is an inconsistency that may not be caught by comparing either document independently.

What comparison tools miss in tables

Here is a realistic scenario. You send a technology services agreement with a 10-row pricing table. The counterparty returns a version with a 12-row pricing table. They added two new line items and changed the unit price on one existing item.

Here is what different approaches show:

Word Compare

Word Compare detects that the table changed. But the output often shows the entire table as deleted and reinserted, or shows row-level changes that don't clearly indicate which cells were modified. The reviewer sees a mess of red and green in the table area and has to manually compare the tables side by side to understand what actually changed. For simple text changes within cells, Word Compare is adequate. For structural changes (added rows), the output is unreliable.

Text-based comparison tools

Tools that extract text from documents before comparing (like DiffChecker or generic diff tools) lose the table structure entirely. The pricing table becomes a sequence of text lines. Row boundaries disappear. Cell alignment is lost. A change that moved a price from one column to another shows up as a text deletion and insertion that may not be recognizable as a table change at all.
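The loss of structure can be simulated directly: flatten a table to one cell per line, the way naive text extraction does, and run a standard line diff. The two-row table below is a made-up example.

```python
import difflib

def flatten(table):
    """Mimic naive text extraction: one cell per line, grid structure lost."""
    return [cell for row in table for cell in row]

old = [["Item", "Qty", "Price"], ["Support", "100", "$150"]]
new = [["Item", "Qty", "Discount", "Price"], ["Support", "100", "0%", "$150"]]

diff = [
    line
    for line in difflib.unified_diff(flatten(old), flatten(new), lineterm="")
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))
]
```

The diff reports two unrelated text insertions, `+Discount` and `+0%`. Nothing in the output indicates that a new column was added to every row of a pricing table, which is the change the reviewer actually needs to see.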

The common failure mode

The most dangerous outcome is not a garbled comparison. It is a comparison that looks clean. The tool compares running text perfectly, shows the 25 paragraph-level changes accurately, and the reviewer signs off. The pricing table change (where the unit price increased from $150 to $175 on one line item) was either not detected or was shown as part of a larger "table changed" indicator that the reviewer didn't drill into. The contract gets executed with the higher price.

How to review table changes reliably

Until table comparison in automated tools is fully reliable, tables need a separate review process.

Step 1: Identify all tables in both versions

Before running the comparison, scroll through both versions and note every table. Confirm that both versions have the same tables. If a table was added or removed, that's a significant change that needs review regardless of what the comparison tool shows.

Step 2: Compare table structure

For each table, check: same number of rows? Same number of columns? Same cell merge pattern? If the structure changed, note exactly what changed before comparing content. A structural change (added row, removed column, merged cells) often matters more than any individual cell content change.
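The structural checklist above can be automated as a coarse first pass. This sketch only counts rows and row widths; it does not attempt to detect merges, which still need a visual check.

```python
def structure_changes(old_table, new_table):
    """Report coarse structural differences before any cell comparison.

    Tables are lists of rows; rows are lists of cell strings.
    """
    changes = []
    if len(old_table) != len(new_table):
        changes.append(f"rows: {len(old_table)} -> {len(new_table)}")
    old_widths = {len(row) for row in old_table}
    new_widths = {len(row) for row in new_table}
    if old_widths != new_widths:
        changes.append(f"columns: {sorted(old_widths)} -> {sorted(new_widths)}")
    return changes

old = [["Item", "Qty", "Price"]] * 10   # a 10-row pricing table
new = [["Item", "Qty", "Price"]] * 12   # the returned 12-row version
changes = structure_changes(old, new)
```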

Step 3: Compare cell content

Open both versions side by side. For each cell that matters (numbers, dates, obligations, thresholds), compare the content between versions. This is manual work. For a 10-row pricing table, it takes 2-3 minutes. For a 50-row disclosure schedule, it takes longer. This is time well spent: table changes carry disproportionate financial risk.
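For tables whose structure did not change, the cell-by-cell pass reduces to a simple nested loop. The tables below are illustrative; the sketch assumes rows and columns already align.

```python
def changed_cells(old_table, new_table):
    """Compare same-shaped tables cell by cell.

    Returns (row, col, old_value, new_value) for each differing cell.
    Assumes the structure check already confirmed matching dimensions.
    """
    diffs = []
    for r, (old_row, new_row) in enumerate(zip(old_table, new_table)):
        for c, (old_cell, new_cell) in enumerate(zip(old_row, new_row)):
            if old_cell.strip() != new_cell.strip():
                diffs.append((r, c, old_cell, new_cell))
    return diffs

old = [["Support", "100", "$150"], ["Hosting", "12", "$500"]]
new = [["Support", "100", "$175"], ["Hosting", "12", "$500"]]
diffs = changed_cells(old, new)
```

A single result, row 0 column 2, `$150` to `$175`: exactly the kind of change that disappears inside a "table changed" indicator.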

Step 4: Cross-reference with the contract body

Check that table content is consistent with the contract body. If the contract says "monthly fee of $10,000" and the pricing table says "$12,000," there is an inconsistency. If the contract references "the milestones set forth in Schedule B" and Schedule B was modified, the contract body may need corresponding updates. Table changes don't happen in isolation.
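Even the body-versus-table consistency check can be partly scripted. This is a deliberately simplified sketch: the regex catches plain dollar amounts like $10,000 but not written-out amounts or other currencies.

```python
import re

def dollar_amounts(text):
    """Extract dollar amounts such as $10,000 from contract text (simplified)."""
    return re.findall(r"\$[\d,]+(?:\.\d{2})?", text)

body = "The Client shall pay a monthly fee of $10,000 as set forth in Exhibit A."
table_cell = "$12,000"  # the fee shown in the pricing table

inconsistent = table_cell not in dollar_amounts(body)
```

Here the check flags the mismatch between the $10,000 in the body and the $12,000 in the table, the exact inconsistency described above.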

How tools handle table comparison

| Capability | Word Compare | Text-only tools | Clausul |
| --- | --- | --- | --- |
| Text changes within cells | Usually detected | Unreliable (table structure lost) | Detected at cell level |
| Row additions/deletions | Often garbled | Not detected as table changes | Detected with row alignment |
| Cell content diff | Basic | None | Word-level diff per cell |
| Cell merge/split | Poorly handled | Not detected | Detected |
| Side-by-side table view | Not available | Not applicable | Old/new table shown together |

Table comparison is an area where tools are improving rapidly. But even with the best current tools, manual verification on high-stakes tables (pricing, SLA penalties, disclosure schedules) is still advisable. The cost of missing a table change is typically higher than the cost of a 5-minute manual check.

The bottom line

Tables are where the money lives in a contract. Pricing, SLAs, milestones, and disclosure schedules often determine the practical economics of the deal more than any individual paragraph in the contract body. Yet tables are exactly where comparison tools are least reliable.

The fix is not to trust the comparison tool less. It is to treat tables as a separate review item: identify them, compare their structure, compare their content cell by cell, and cross-reference with the contract body. This adds a few minutes to your review. For a contract where a single-cell change in a pricing table can shift the deal economics by thousands or millions of dollars, those minutes are the most valuable part of your review.

If you want a comparison tool with cell-level table comparison that shows exactly which rows and cells changed, try Clausul. But regardless of which tool you use, check the tables.

Frequently asked questions

Why are tables in contracts hard to compare?

Tables have a two-dimensional structure (rows and columns) that standard text comparison algorithms are not designed to handle. Text comparison works line by line or word by word. Tables require cell-by-cell matching, which means the tool must first figure out which cell in version A corresponds to which cell in version B. When rows are added, deleted, or reordered, this matching becomes ambiguous. When cells are merged or split, the correspondence breaks entirely. Most comparison tools either fall back to a text-only comparison (losing the table structure) or report the entire table as changed when only one cell was modified.

What types of contract tables matter most for comparison?

Pricing tables and fee schedules carry the most financial risk because they contain exact dollar amounts, percentages, and formulas. SLA tables are critical in technology and outsourcing agreements because they define performance obligations with specific metrics and penalties. Disclosure schedules in M&A and purchase agreements list exceptions to representations and warranties and are heavily negotiated. Milestone and payment tables tie payments to deliverables and deadlines. Any table that contains numbers, dates, or specific obligations should be compared cell by cell, not just glanced at.

Can Word Compare handle table changes?

Word Compare detects some table changes, but the output is often unreliable for complex modifications. It handles simple text changes within cells reasonably well. But row insertions, row deletions, column changes, and cell merge/split operations produce garbled or misleading output. Word Compare may show an entire table as deleted and reinserted rather than showing which specific cells changed. For simple tables with minor text edits, Word Compare is adequate. For pricing schedules, SLA matrices, or any table where precise cell-by-cell tracking matters, the output should be verified manually.

How should I review table changes in a contract?

Open both versions side by side and compare tables cell by cell. Start with the row and column structure: were any rows added or removed? Were any columns added or removed? Were any cells merged or split? Then compare the content of each cell that corresponds between the two versions. For pricing tables, verify every number. For SLA tables, check metrics, thresholds, and penalties individually. For disclosure schedules, compare each exception entry. This is tedious but necessary because automated comparison of tables is less reliable than automated comparison of running text. A comparison tool with good table support can reduce this work, but manual verification on high-stakes tables is still advisable.

What is a disclosure schedule in a purchase agreement?

A disclosure schedule is an attachment to a purchase or M&A agreement that lists specific exceptions to the seller's representations and warranties. For example, if the agreement says "Seller has no pending litigation," the disclosure schedule lists any pending litigation as an exception. Disclosure schedules are often structured as tables or numbered lists referencing specific sections of the main agreement. They are heavily negotiated because each entry narrows the scope of a representation. A change to a disclosure schedule (adding an exception, removing an exception, or modifying the description of an exception) can materially affect the buyer's risk exposure.

Do any comparison tools handle tables well?

Table comparison quality varies significantly across tools. Most tools detect text changes within table cells but struggle with structural changes (row/column additions, deletions, merges, splits). Some enterprise tools like Litera Compare handle basic structural changes. Clausul compares tables at the cell level and detects row additions, deletions, and content changes within cells. No current tool handles all possible table modifications perfectly, particularly cell merge/split operations and column reordering. For high-stakes tables, automated comparison should be supplemented with manual verification.


About this post. Written by the Clausul team. We build document comparison software for legal teams. Table comparison is one of the hardest problems in document comparison, and it is where we invest heavily in our comparison engine.

Something inaccurate? Let us know.

Last reviewed: February 2026.