v0.2: implement lcm_expand_query with LLM-driven recursion #1

@offendingcommit

Description

Current state (v0.1 stub)

lcm_expand_query is currently a stub that returns guidance text:

lcm_expand_query is not implemented in lossless-claude v1.

Workflow to use instead:
1. lcm_grep with pattern="<query>" mode="full_text"
2. For promising summary hits, lcm_describe with include_lineage=true
3. Walk parents (deeper summaries) until you find specific source messages
4. lcm_describe msg#<id> for exact message text

This works (Claude can drive the recursion manually using lcm_grep + lcm_describe), but it spends turns and tokens on a process the tool should handle inline.

What v0.2 needs to do

Reproduce the behavior of lossless-claw's native lcm_expand_query tool (see src/tools/lcm-expand-query-tool.ts and src/expansion.ts):

  1. Take a natural-language query and an optional conversation scope.
  2. FTS5-search summaries to find candidate nodes.
  3. For each candidate, walk the summary DAG (parents → deeper) and decide which descendants are worth expanding based on relevance.
  4. Issue a model call to extract the exact evidence (commands, file paths, error strings, decisions) that answers the query, drawing from raw messages where available.
  5. Return a structured answer with citations to message and summary IDs.
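The pipeline above can be sketched end-to-end as a pure function over an in-memory summary DAG. Everything here is hypothetical: the node shapes, a naive keyword scorer standing in for FTS5 rank, and line-level matching standing in for the model call in step 4. The authoritative shapes live in lossless-claw's src/expansion.ts.

```typescript
// Hypothetical sketch of the v0.2 expansion pipeline (names invented).
interface SummaryNode {
  id: string;
  text: string;
  parentId: string | null; // parent = deeper, more detailed summary
  messageIds: string[];    // raw messages this summary covers
}

interface Evidence {
  quote: string;
  messageId: string;
  summaryId: string;
}

// Naive relevance score standing in for an FTS5 rank.
function score(query: string, text: string): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const hay = text.toLowerCase();
  return terms.filter((t) => hay.includes(t)).length;
}

function expandQuery(
  query: string,
  nodes: Map<string, SummaryNode>,
  rawMessages: Map<string, string>,
  maxDepth = 3,
): Evidence[] {
  // Step 2: pick candidate summaries by relevance.
  const candidates = [...nodes.values()]
    .filter((n) => score(query, n.text) > 0)
    .sort((a, b) => score(query, b.text) - score(query, a.text));

  const evidence: Evidence[] = [];
  for (const cand of candidates) {
    // Step 3: walk parents (deeper summaries) up to the recursion budget.
    let node: SummaryNode | undefined = cand;
    for (let depth = 0; node && depth < maxDepth; depth++) {
      // Step 4: where raw messages exist, pull matching text as evidence.
      // (In the real tool this is an LLM extraction call, not a keyword match.)
      for (const msgId of node.messageIds) {
        const raw = rawMessages.get(msgId);
        if (raw && score(query, raw) > 0) {
          evidence.push({ quote: raw, messageId: msgId, summaryId: node.id });
        }
      }
      node = node.parentId ? nodes.get(node.parentId) : undefined;
    }
  }
  // Step 5: structured answer with citations to message and summary IDs.
  return evidence;
}
```

The real implementation would replace score with an FTS5 query against the summaries table and replace the keyword match in step 4 with the expansion-model call.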

Constraints / open questions

  • The MCP server is currently read-only (PRAGMA query_only = ON). v0.2 doesn't need to write to the LCM database, but it does need to issue an LLM call. That means picking how to make the model call from inside an MCP server: direct Anthropic SDK, OpenAI-compatible endpoint, or proxy through the user's existing OpenClaw model router. Decision deferred until v0.2 design.
  • Recursion budget: lossless-claw's version has a lcm-expansion-recursion-guard.ts to prevent runaway expansion. We'll need the same.
  • Expansion model: lossless-claw lets users configure expansionModel separately from the main summary model (often a faster Haiku-class model). v0.2 should expose the same env var.
  • Token caps: the expansion call is bounded by maxExpandTokens in lossless-claw config. Honor the same cap.
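The recursion-guard and budget constraints above could be combined into one small object. This is a sketch only: the class name, method names, and the LOSSLESS_CLAUDE_MAX_EXPAND_TOKENS variable are invented here (only LOSSLESS_CLAUDE_EXPANSION_MODEL appears in this issue), and the real guard lives in lossless-claw's lcm-expansion-recursion-guard.ts.

```typescript
// Hypothetical combined depth + token budget guard for expansion.
class ExpansionGuard {
  private depth = 0;
  private tokensSpent = 0;

  constructor(
    private readonly maxDepth: number,
    private readonly maxExpandTokens: number,
  ) {}

  // Returns false when either budget is exhausted, so callers stop expanding.
  tryDescend(estimatedTokens: number): boolean {
    if (this.depth >= this.maxDepth) return false;
    if (this.tokensSpent + estimatedTokens > this.maxExpandTokens) return false;
    this.depth += 1;
    this.tokensSpent += estimatedTokens;
    return true;
  }

  // Token spend is cumulative across the whole expansion; only depth unwinds.
  ascend(): void {
    if (this.depth > 0) this.depth -= 1;
  }
}

// Assumed env-var wiring; names and defaults are placeholders.
const expansionModel =
  process.env.LOSSLESS_CLAUDE_EXPANSION_MODEL ?? "haiku-class-default";
const maxExpandTokens = Number(
  process.env.LOSSLESS_CLAUDE_MAX_EXPAND_TOKENS ?? "4096",
);
```

Keeping token spend cumulative while depth unwinds means a branchy DAG cannot reset its budget by backtracking, which is the runaway-expansion failure mode the guard exists to prevent.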

Acceptance criteria

  • lcm_expand_query accepts query: string and optional conversationId / session_key parameters
  • Returns structured evidence drawn from raw messages, not summary paraphrases
  • Bounded by configurable recursion depth and token budget
  • Falls back to a clear error message (not the v1 stub guidance) when expansion fails
  • Documented model-call configuration via env vars (LOSSLESS_CLAUDE_EXPANSION_MODEL, etc.)
  • Existing v0.1 stub message removed

Related

  • The v1 stub lives in mcp-server/src/index.ts — search for "lcm_expand_query"
  • The authoritative implementation to mirror lives in lossless-claw's src/expansion.ts
  • Roadmap entry: README.md → "Roadmap" → v0.2
