Current state (v0.1 stub)
lcm_expand_query is currently a stub that returns guidance text:
lcm_expand_query is not implemented in lossless-claude v1.
Workflow to use instead:
1. lcm_grep with pattern="<query>" mode="full_text"
2. For promising summary hits, lcm_describe with include_lineage=true
3. Walk parents (deeper summaries) until you find specific source messages
4. lcm_describe msg#<id> for exact message text
This works — Claude can drive the recursion manually using lcm_grep + lcm_describe — but it spends turns and tokens on a process the tool should handle inline.
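For concreteness, here is a rough sketch of that manual loop as a client would drive it. The callTool shape and the result field names (summaries, id, parents, kind, text) are assumptions for illustration, not the server's actual response schema:

```ts
// Hypothetical client-side driver for the v0.1 workflow above.
type ToolCaller = (name: string, args: Record<string, unknown>) => Promise<any>;

async function manualExpand(callTool: ToolCaller, query: string): Promise<string[]> {
  const evidence: string[] = [];
  // 1. Full-text search over summaries.
  const hits = await callTool("lcm_grep", { pattern: query, mode: "full_text" });
  for (const hit of hits.summaries ?? []) {
    // 2. Describe the promising hit with lineage so we can walk its parents.
    let node = await callTool("lcm_describe", { id: hit.id, include_lineage: true });
    // 3. Walk parents (deeper summaries) until a specific source message turns up.
    while (node.kind !== "message" && node.parents?.length) {
      node = await callTool("lcm_describe", { id: node.parents[0], include_lineage: true });
    }
    // 4. Fetch the exact message text.
    if (node.kind === "message") {
      evidence.push((await callTool("lcm_describe", { id: node.id })).text);
    }
  }
  return evidence;
}
```

Every turn of that loop is a round trip through the model's context; v0.2 folds the whole loop into a single tool call.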
What v0.2 needs to do
Reproduce the behavior of lossless-claw's native lcm_expand_query tool (see src/tools/lcm-expand-query-tool.ts and src/expansion.ts in lossless-claw); a rough sketch follows this list:
- Take a natural-language query and an optional conversation scope.
- FTS5-search summaries to find candidate nodes.
- For each candidate, walk the summary DAG (parents → deeper) and decide which descendants are worth expanding based on relevance.
- Issue a model call to extract the exact evidence (commands, file paths, error strings, decisions) that answers the query, drawing from raw messages where available.
- Return a structured answer with citations to message and summary IDs.
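A minimal sketch of that pipeline, assuming a better-sqlite3 handle, an FTS5 table named summary_fts, a summaries table, a summary_links(child_id, parent_id) edge table, and an abstract callModel helper. None of these names are confirmed against lossless-claw's actual schema or code, and the relevance check is a placeholder:

```ts
import Database from "better-sqlite3";

interface Citation { kind: "summary" | "message"; id: number }
interface ExpandResult { answer: string; citations: Citation[] }

async function expandQuery(
  db: Database.Database,                           // read-only handle (PRAGMA query_only = ON)
  callModel: (prompt: string) => Promise<string>,  // transport decided in v0.2 design
  query: string,
  conversationId?: string,
): Promise<ExpandResult> {
  // 1. FTS5 search over summaries to find candidate nodes.
  const sql = `SELECT s.id, s.text FROM summary_fts
               JOIN summaries s ON s.id = summary_fts.rowid
               WHERE summary_fts MATCH ?
               ${conversationId ? "AND s.conversation_id = ?" : ""} LIMIT 20`;
  const params = conversationId ? [query, conversationId] : [query];
  const candidates = db.prepare(sql).all(...params) as { id: number; text: string }[];

  // Placeholder relevance check: keep descendants that share a query term.
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const looksRelevant = (text: string) => terms.some((t) => text.toLowerCase().includes(t));

  // 2. Walk the summary DAG toward parents (deeper, more specific summaries),
  //    keeping only nodes that still look relevant to the query.
  const relevant: { id: number; text: string }[] = [];
  const walk = (id: number, depth: number) => {
    if (depth > 4) return; // crude budget; v0.2 needs a real recursion guard
    const parents = db
      .prepare("SELECT p.id, p.text FROM summary_links l JOIN summaries p ON p.id = l.parent_id WHERE l.child_id = ?")
      .all(id) as { id: number; text: string }[];
    for (const parent of parents) {
      if (!looksRelevant(parent.text)) continue;
      relevant.push(parent);
      walk(parent.id, depth + 1);
    }
  };
  for (const c of candidates) {
    relevant.push(c);
    walk(c.id, 0);
  }

  // 3. One model call to extract the exact evidence (commands, file paths,
  //    error strings, decisions) that answers the query, citing node ids.
  //    Where raw messages are available they would be included in the context
  //    alongside (or instead of) the summaries.
  const prompt =
    `Query: ${query}\n\nContext:\n` +
    relevant.map((r) => `[summary ${r.id}] ${r.text}`).join("\n") +
    "\n\nQuote the exact evidence that answers the query and cite the node ids used.";
  const answer = await callModel(prompt);

  // 4. Structured answer with citations back to the nodes consulted.
  return { answer, citations: relevant.map((r) => ({ kind: "summary" as const, id: r.id })) };
}
```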
Constraints / open questions
- The MCP server is currently read-only (PRAGMA query_only = ON). v0.2 doesn't need to write to the LCM database, but it does need to issue an LLM call. That means picking how to make the model call from inside an MCP server: direct Anthropic SDK, OpenAI-compatible endpoint, or proxy through the user's existing OpenClaw model router. Decision deferred until v0.2 design.
- Recursion budget: lossless-claw's version has a lcm-expansion-recursion-guard.ts to prevent runaway expansion. We'll need the same; a configuration and guard sketch follows this list.
- Expansion model: lossless-claw lets users configure expansionModel separately from the main summary model (often a faster Haiku-class model). v0.2 should expose the same env var.
- Token caps: the expansion call is bounded by maxExpandTokens in lossless-claw config. Honor the same cap.
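A minimal sketch of how those knobs might be wired up, leaving the model-call transport abstract since that decision is deferred. LOSSLESS_CLAUDE_EXPANSION_MODEL is the name used in the acceptance criteria below; the other variable names and all defaults are placeholders, not decided yet:

```ts
// Sketch only: env var names other than LOSSLESS_CLAUDE_EXPANSION_MODEL are
// hypothetical, and the defaults are placeholders.
interface ExpansionConfig {
  expansionModel: string | undefined; // mirrors lossless-claw's expansionModel
  maxExpandTokens: number;            // mirrors lossless-claw's maxExpandTokens cap
  maxDepth: number;                   // recursion budget, like lcm-expansion-recursion-guard.ts
}

function loadExpansionConfig(env: NodeJS.ProcessEnv = process.env): ExpansionConfig {
  return {
    expansionModel: env.LOSSLESS_CLAUDE_EXPANSION_MODEL,
    maxExpandTokens: Number(env.LOSSLESS_CLAUDE_MAX_EXPAND_TOKENS ?? 4096), // hypothetical name
    maxDepth: Number(env.LOSSLESS_CLAUDE_EXPANSION_MAX_DEPTH ?? 4),         // hypothetical name
  };
}

// Minimal recursion guard: refuse to expand past the configured depth and keep
// a visited set so a cyclic or re-linked DAG cannot cause runaway walks.
class ExpansionGuard {
  private visited = new Set<number>();
  constructor(private readonly maxDepth: number) {}

  mayExpand(nodeId: number, depth: number): boolean {
    if (depth > this.maxDepth || this.visited.has(nodeId)) return false;
    this.visited.add(nodeId);
    return true;
  }
}
```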
Acceptance criteria
- lcm_expand_query accepts query: string and optional conversationId / session_key parameters (see the registration sketch below)
- The expansion model is configurable via environment variables (LOSSLESS_CLAUDE_EXPANSION_MODEL, etc.)
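A possible shape for that tool registration, assuming the server keeps using the MCP TypeScript SDK's McpServer.tool helper with zod schemas. The description string, result shape, and handler body are illustrative only:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

export function registerExpandQuery(server: McpServer) {
  server.tool(
    "lcm_expand_query",
    "Answer a natural-language query by expanding the LCM summary DAG",
    {
      query: z.string(),
      conversationId: z.string().optional(),
      session_key: z.string().optional(),
    },
    async (args) => {
      // TODO: wire up to the expansion pipeline (FTS5 search -> DAG walk -> model call),
      // scoped by args.conversationId / args.session_key when provided.
      const result = { answer: `not implemented yet: ${args.query}`, citations: [] };
      return { content: [{ type: "text" as const, text: JSON.stringify(result) }] };
    },
  );
}
```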
Related
- The v1 stub lives in mcp-server/src/index.ts — search for "lcm_expand_query"
- The authoritative implementation to mirror lives in lossless-claw's src/expansion.ts
- Roadmap entry: README.md → "Roadmap" → v0.2