Blog

  • Context Cases: Surviving AI Context Compaction

    AI coding agents lose context when their conversation window fills up. Claude Code handles this by compacting — summarizing the conversation to free space. But the summary loses critical details about your active work. We built Context Cases to solve this.

    The Approach

    Instead of stuffing everything into static CLAUDE.md files (which burn tokens on every message), CaseMgr stores knowledge in cases and loads it dynamically when needed.

    Context Cases are regular CaseMgr cases that you link to your working case. After compaction, the system automatically loads your active tasks plus all notes and knowledge from linked context cases.

    Example

    You’re working on a feature case. You link two context cases:

    • System Knowledge — Server configs, SSH access, deployment procedures, architecture decisions
    • Client Requirements — Specs, constraints, billing preferences

    When compaction happens, Claude automatically recovers all of this — no manual re-explanation needed.

    Setting It Up

    1. Create Your Context Cases

    Create a case for each category of knowledge you want to persist. Add notes with the information Claude needs — environment configs, debugging procedures, test credentials, architecture facts.

    2. Link Them

    In the case detail view, expand the Context Cases section (below Worktrees), click + Link, and select the cases you want as context.

    Or via MCP:

    wa mcp graph-link from_id="#1:100" to_id="#1:200" link_type=context

    3. Configure the Recovery Hook

    Add a PostCompact hook to your project’s .claude/settings.json:

    {
      "hooks": {
        "PostCompact": [
          {
            "hooks": [
              {
                "type": "command",
                "command": "echo '{\"hookSpecificOutput\":{\"hookEventName\":\"PostCompact\",\"additionalContext\":\"CONTEXT COMPACTED. Call preferences-get to find current case, then cmmn-get_resume_context to reload work state.\"}}'",
                "timeout": 5
              }
            ]
          }
        ]
      }
    }
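    The hook's stdout is parsed as JSON, and the `additionalContext` field is what gets injected after compaction. A quick sanity check of the payload above (python3 is used only to extract the field):

```shell
# Print the additionalContext string that the PostCompact hook emits.
payload='{"hookSpecificOutput":{"hookEventName":"PostCompact","additionalContext":"CONTEXT COMPACTED. Call preferences-get to find current case, then cmmn-get_resume_context to reload work state."}}'
echo "$payload" | python3 -c 'import json, sys; print(json.load(sys.stdin)["hookSpecificOutput"]["additionalContext"])'
```

If the JSON is malformed (say, a stray quote in the echo command), the hook output is silently useless, so checking it once by hand is worthwhile.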

    4. Add Recovery to ~/.claude/CLAUDE.md

    ## Post-Compaction Recovery (MANDATORY)
    
    After every context compaction, you MUST immediately:
    1. Call `preferences-get` to find the current case ID
    2. If a current case exists, call `cmmn-get_resume_context` with that case ID
    3. Resume work from where you left off

    The Principle

    CLAUDE.md is the bootstrap. CaseMgr is the knowledge base.

    Keep instruction files small and static. Put everything dynamic, detailed, or case-specific into CaseMgr cases and link them as context. The system loads what’s needed, when it’s needed.

    Full setup guide: casemgr.systems/context-cases

  • OAuth 2.0: Connecting Claude.ai to CaseMgr

    CaseMgr now supports OAuth 2.0 authorization for MCP connections. This means Claude.ai (and any other OAuth-speaking MCP client) can connect to your CaseMgr instance through a browser-based login flow — no manual token copy-paste needed.

    How It Works

    1. In Claude.ai, go to Settings → Integrations → Add Integration
    2. Enter https://casemgr.systems/mcp
    3. Claude.ai discovers the OAuth endpoints automatically
    4. A browser window opens — log in to CaseMgr and click Authorize
    5. Claude.ai receives an access token and connects

    That’s it. All 184 MCP tools are available in your Claude.ai conversation.

    Under the Hood

    We implemented the full OAuth 2.0 Authorization Code flow with PKCE:

    • Discovery — /.well-known/oauth-protected-resource and /.well-known/oauth-authorization-server endpoints tell clients where to authenticate
    • Dynamic Client Registration — Clients register automatically via /oauth/register
    • PKCE — S256 code challenge prevents authorization code interception
    • Consent Page — Users see exactly what they’re authorizing
    • Token Exchange — Authorization codes are exchanged for CaseMgr API tokens
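    For the curious, the S256 challenge is just a hashed, base64url-encoded random string. A minimal sketch with openssl (parameter names follow RFC 7636, not anything CaseMgr-specific):

```shell
# PKCE (RFC 7636): the client keeps code_verifier secret and sends only
# its SHA-256 hash (code_challenge) with the authorization request.
code_verifier=$(openssl rand -base64 48 | tr '+/' '-_' | tr -d '=')
code_challenge=$(printf '%s' "$code_verifier" \
  | openssl dgst -sha256 -binary \
  | openssl base64 -A | tr '+/' '-_' | tr -d '=')
echo "code_challenge=$code_challenge"
# During token exchange the client reveals code_verifier, so an attacker
# who intercepted only the authorization code cannot redeem it.
```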

    The issued tokens are standard CaseMgr API tokens — the same format used by Claude Code and Cursor. They appear on your Tokens page, where you can review and revoke them.

    Supported Clients

    The Tokens page has copy-paste configuration for:

    • Claude.ai — OAuth flow (no token needed)
    • Claude Code — Bearer token in .mcp.json
    • Claude Desktop — Via mcp-remote bridge
    • Cursor — Bearer token in .cursor/mcp.json
    • VS Code Copilot — Bearer token in .vscode/mcp.json
    • Windsurf — Bearer token in mcp_config.json
    • Gemini CLI — CLI command or settings.json
    • ChatGPT — Manual configuration in Settings
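    As a rough sketch of the Bearer-token shape those clients share (the field names below are a common MCP client convention, not confirmed here; the token is a placeholder, and the exact snippet for each client lives on the Tokens page):

```json
{
  "mcpServers": {
    "casemgr": {
      "type": "http",
      "url": "https://casemgr.systems/mcp",
      "headers": { "Authorization": "Bearer <your-casemgr-token>" }
    }
  }
}
```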

    Tool Name Change

    One thing we discovered during Claude.ai integration: it validates tool names against ^[a-zA-Z0-9_-]{1,64}$ — no dots allowed. Our tools used dots (cases.list), so we switched to hyphens (cases-list). The change happens at the protocol boundary — all internal code and documentation still works.
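    The rule is easy to check locally with grep — hyphens pass the pattern, dots do not:

```shell
# Claude.ai's tool-name rule: letters, digits, underscore, hyphen; 1-64 chars.
pattern='^[a-zA-Z0-9_-]{1,64}$'
echo 'cases-list' | grep -Eq "$pattern" && echo 'cases-list: ok'
echo 'cases.list' | grep -Eq "$pattern" || echo 'cases.list: rejected'
```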

  • 17 Public Case Plan Models with Mermaid Diagrams

    CaseMgr now includes a library of 17 public Case Plan Models — reusable workflow templates based on the CMMN standard. Instantiate one to create a fully structured case with stages, tasks, milestones, and sentries already wired together.

    What Are Case Plan Models?

    Models are templates. Instead of building a workflow from scratch every time, you pick a model and instantiate it. The model creates all the CMMN items (stages, tasks, milestones, sentries, timers) with their relationships already configured.

    The Pattern Library

    We’ve published six reusable CMMN patterns that serve as building blocks:

    • Approval Gate — Decision task routes to approved/rejected branches via conditional sentries
    • Decision Tree — Multi-way branching based on a decision task’s outcome
    • Escalation Timer — A task with a deadline; if the timer fires first, an escalation path activates
    • Multi-Stage Pipeline — Sequential stages gated by sentries
    • Parallel Work with Join — Concurrent tasks with a milestone that fires when all complete
    • Monitoring Loop — Cron timer drives a webhook, condition sentry auto-resolves when met

    Workflow Templates

    For end-to-end workflows, we have:

    • Bug Triage & Fix — Report → triage (AI decision) → fix → verify
    • Project Kickoff — Discovery → planning → execution → closeout
    • Research & Report — AI-human collaboration for research and drafting
    • Invoice Generation — Client selection → billable items → LaTeX/PDF generation → delivery (our most complex model with 29 definitions)
    • AI Work Item Pipeline — Process task creates work for an AI agent
    • Recurring API Poll — Timer-driven webhook polling with conditional resolution

    Using Models

    # List available models (including public ones)
    wa mcp models-list include_public=true
    
    # Create a case from a model
    wa mcp models-instantiate model_id="#61:37" name="Q1 Budget Approval"
    
    # Or add a workflow to an existing case
    wa mcp cmmn-add_model_to_case case_id="#1:123" model_id="#61:40"

    In the web UI, click Models in the navigation to browse, or use the Workflow button in any case to add a model’s items.

    All models include Mermaid diagrams on the documentation page: casemgr.systems/case-plan-models

  • wa CLI 2.0: Worktree Management in the Monorepo

    We just shipped wa 2.0 — our CLI tool for managing Git worktrees, VS Code workspaces, and CaseMgr cases from the command line. Here’s what changed and why.

    What is wa?

    If you juggle multiple Git branches, git worktree is powerful but hard to remember. wa wraps it into simple commands and ties worktrees to VS Code workspaces and aliases for instant navigation.

    wa add --bn=feature-x --alias=fx    # Create worktree + branch + alias
    cdwa fx                              # Navigate by alias
    wa code                              # Open VS Code with saved state
    wa upload report.pdf                 # Upload file to linked case
    wa mcp cases-list status=active      # Call any MCP tool directly
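    Under the hood, the first command wraps roughly the following git plumbing (illustrative only, run in a throwaway repo; wa's actual implementation may differ):

```shell
# Create a repo, then add a linked worktree on a new branch. This is
# approximately what `wa add --bn=feature-x` automates, minus the alias
# and VS Code workspace bookkeeping.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m init
git worktree add -q "$repo-wt" -b feature-x   # new branch + worktree dir
git worktree list
```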

    What’s New in v2.0

    • Monorepo migration — wa now lives inside casemgr_umbrella/wa/ alongside the Phoenix app. One repo, one deploy.
    • Hyphenated tool names — cases-list instead of cases.list. Breaking change, but required for Claude.ai compatibility.
    • Smaller binary — Removed unused dependencies (xlsxir, csv). Down from 2.17MB to 1.95MB.
    • Shell functions split — Core functions (cdwa, home, aliases) ship separately from example project functions.

    Install

    curl -fsSL https://casemgr.systems/cli/install.sh | bash

    Requires Erlang/OTP. Works on Linux and macOS. The installer sets up shell functions and offers to add them to your profile automatically.

    Full documentation: casemgr.systems/cli

  • CaseMgr: Building an AI-Native Case Management System

    Today marks a milestone for CaseMgr — we shipped per-case context loading, a feature that lets AI agents automatically recover their working context after Claude Code’s context window compaction.

    The Problem

    When you’re deep in a coding session with Claude Code, the context window eventually fills up. Claude compacts it — summarizing the conversation to free space. But that summary loses the details: which tasks were active, what environment you were debugging in, the client requirements you were referencing.

    Most AI memory systems (like Soul v5.0) handle this at session boundaries — loading context at start, saving at end. But they’re blind to mid-session compaction. Your AI agent suddenly doesn’t know what it was working on.

    Our Solution: Context Cases

    CaseMgr now lets you link cases together as “context.” When compaction happens, a PostCompact hook automatically fires and reloads:

    • Your current case’s active tasks and work state
    • Notes and knowledge from all linked context cases
    • Case descriptions and project context

    The key insight: CLAUDE.md is the bootstrap, CaseMgr is the knowledge base. Keep your instruction files small and static. Put everything dynamic into CaseMgr cases and link them as context.

    What Else We Shipped This Week

    • wa CLI v2.0 — Migrated into the monorepo, consolidated tool names from dots to hyphens for Claude.ai compatibility
    • OAuth 2.0 for MCP — Claude.ai can now connect to CaseMgr natively via browser-based OAuth authorization
    • Tool consolidation — Reduced from 222 to 184 MCP tools by merging duplicate create/list operations
    • Cloudflare Turnstile — Bot protection on login and registration
    • PubSub broadcasts — Real-time UI updates when MCP operations modify data
    • Case Plan Models documentation — 17 public models with Mermaid diagrams on the website
    • Dark theme — All documentation pages match the CLI landing page aesthetic
    • Product gating — Billing tools hidden from users without the billing product
    • Sticky case header — No more scrolling to see the case name and status

    What’s Next

    We’re working on making the dual-nav problem go away (Phoenix and WordPress currently serve separate menus), improving the case detail UI with collapsible sections and better mobile support, and expanding the public model library with more CMMN workflow templates that users can instantiate with one click.

    If you’re building with AI agents and want persistent, case-driven context management, try CaseMgr — it’s free to get started.