Level 2 · Agency Foundation

Contract Review

Setup Time: Full day
Est. API Cost: ~$25–$60/mo
Client Price: $500–$2,000

Contract review is one of the most expensive forms of highly skilled labour in professional services. A junior associate or paralegal who spends six hours extracting clauses, flagging risk provisions, and summarising contract terms consumes $600–$900 of billable time on work that is mechanical rather than analytical. Multiply that across the volume of contracts a mid-size firm processes and you have tens of thousands of dollars per month spent on intelligent people doing work that pattern recognition could do faster and at a fraction of the cost.

Follow the exact steps below to configure and deploy this automation inside your OpenClaw workspace.

  1. Create a new agent named `contract-review`. Configure your `CLAUSE_LIBRARY` section — this is the most important configuration step and requires 60–90 minutes: define your standard clause positions for each clause category (payment terms, IP assignment, limitation of liability, confidentiality, non-solicitation, governing law). The agent measures each contract's clauses against these standards to flag deviations.
  2. Set your `JURISDICTION` parameter: the clause interpretation logic and risk flagging vary by jurisdiction (English law vs Scots law vs US law). Configure this to match the majority of contracts you process — multi-jurisdiction support requires separate prompt architectures per jurisdiction.
  3. Upload 5–10 sample contracts from your own files to calibrate the extraction quality. Run the agent on these in test mode and review the extracted clauses against your own annotations. Adjust the System Prompt based on any systematic misses before going live.
  4. Define your `RISK_ESCALATION_RULES`: which clause deviations or missing provisions trigger an immediate escalation to a senior partner rather than a standard review flag. Common escalation triggers: uncapped liability provisions, mandatory arbitration in unfavourable jurisdictions, and IP assignments broader than standard.
  5. Configure the output format: the agent supports structured Word document output (memo format), JSON export for integration with document management systems (iManage, NetDocuments), or a structured email summary. Most law firm clients prefer the Word memo format for compatibility with existing review workflows.
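The exact schemas for `CLAUSE_LIBRARY` and `RISK_ESCALATION_RULES` ship with the full context payload in Book 2. Purely as an illustration — every field name below is hypothetical, not the shipped schema — a clause-library and escalation configuration might look like:

```json
{
  "CLAUSE_LIBRARY": {
    "limitation_of_liability": {
      "standard_position": "Liability capped at 12 months' fees; consequential loss excluded.",
      "flag_if": ["uncapped", "cap_above_standard"]
    },
    "governing_law": {
      "standard_position": "England and Wales",
      "flag_if": ["other_jurisdiction"]
    }
  },
  "JURISDICTION": "england_wales",
  "RISK_ESCALATION_RULES": [
    { "trigger": "uncapped_liability", "action": "escalate_senior_partner" },
    { "trigger": "ip_assignment_broader_than_standard", "action": "escalate_senior_partner" }
  ]
}
```

Whatever shape your schema takes, the principle is the same: one entry per clause category stating your standard position and the deviations that should be flagged or escalated.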

Save this file as: .openclaw/agents/contract-review/context.json

{
  "automation_id": "18",
  "title": "Contract Review",
  "level": 2,
  "tier": "Agency Foundation",
  "setup_time": "Full day",
  "estimated_api_cost": "~$25–$60/mo",
  "client_price_range": "$500–$2,000",
  "agents": [
    {
      "role": "orchestrator",
      "model": "claude-3-5-sonnet",
      "temperature": 0.2,
      "max_tokens": 4096
    }
  ],
  "memory": "session",
  "output_format": "structured_json",
  "human_review_gate": true,
  "documentation_standard": "required"
}

Run these commands from your openclaw-workshop/ directory to validate, test, and schedule this automation. Commands are taken directly from The OpenClaw Income Engine, Appendix F.

# ── STEP 1: Run calibration on historical contracts ──
$ openclaw run contract-review --calibrate --input ./calibration_contracts/
$ 
# ── STEP 2: Review calibration accuracy report ──
$ openclaw run contract-review --calibration-report
# Target: 90%+ clause extraction accuracy before going live.
$ 
# ── STEP 3: Process a single contract (test mode) ──
$ openclaw run contract-review --input ./test_contract.pdf --dry-run
$ 
# ── STEP 4: Process live and deliver memo ──
$ openclaw run contract-review --input ./client_contract.pdf
$ 
# ── STEP 5: Activate email and Drive watchers ──
$ openclaw watch contract-review --email-label contracts-incoming --start
$ openclaw watch contract-review --drive-folder <FOLDER_ID> --start
$ 
# ── Process a batch of contracts ──
$ openclaw run contract-review --input ./contracts_folder/ --batch
$ 
# ── Switch output format to JSON (for DMS integration) ──
$ openclaw run contract-review --input ./contract.pdf --context.output.format="json"
$ 
# ── Run in calibration mode (outputs + your annotations compared) ──
$ openclaw run contract-review --input ./contract.pdf --calibration-mode

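If you switch the output format to JSON for DMS integration, the agent emits one structured record per contract, with an entry per extracted clause. The actual output schema is documented in Book 2; the payload below is a hypothetical illustration only, with invented field names:

```json
{
  "contract": "client_contract.pdf",
  "jurisdiction": "england_wales",
  "clauses": [
    {
      "category": "limitation_of_liability",
      "deviation_from_standard": true,
      "risk": "high",
      "escalate": true
    }
  ],
  "human_review_required": true
}
```

Flat, per-clause records like this map cleanly onto iManage or NetDocuments metadata fields, which is why the JSON format exists alongside the Word memo.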
Automation Stats

  • Automation: #18 / 30
  • Level: 2
  • Tier: Agency Foundation
  • Setup Time: Full day
  • API Cost: ~$25–$60/mo
  • Client Price: $500–$2,000

Full Deployment Guide

Get the complete step-by-step playbook, all 30 context payloads, engineered prompt files, and the full technical deployment appendix in Book 2.

Get the Book