Add LangChain workflow span support and refactor LLM invocation#4449
wrisa wants to merge 22 commits into open-telemetry:main from
Conversation
Pull request overview
Note
Copilot was unable to run its full agentic suite in this review.
Adds workflow-level tracing for LangChain (top-level chain runs) and refactors LLM span creation to use the newer GenAI invocation APIs, including an example and expanded test coverage.
Changes:
- Add workflow invocation (INTERNAL span) start/stop/error handling via `on_chain_*` callbacks.
- Refactor LLM spans to use `InferenceInvocation` (`start_inference()`, `stop()`, `fail()`), and introduce workflow invocation tracking.
- Add an `opentelemetry-util-genai` dependency plus a LangGraph workflow example and new tests.
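The start/stop/fail lifecycle the refactor moves to can be sketched with stub classes. This is a hedged illustration only: `InferenceInvocation`, `start_inference()`, `stop()`, and `fail()` are names taken from the PR summary, and the stub bodies below are assumptions, not the actual `opentelemetry-util-genai` API.

```python
# Illustrative stub of the invocation lifecycle the PR migrates to.
# The real types live in opentelemetry-util-genai; these only mirror
# the shape described in the PR summary.
from dataclasses import dataclass, field


@dataclass
class InferenceInvocation:
    """Tracks one LLM call from start until stop() or fail()."""

    run_id: str
    attributes: dict = field(default_factory=dict)
    state: str = "pending"

    def start_inference(self) -> "InferenceInvocation":
        self.state = "started"  # real impl would start a GenAI span here
        return self

    def stop(self) -> None:
        self.state = "stopped"  # real impl would end the span successfully

    def fail(self, error: Exception) -> None:
        self.state = "failed"  # real impl records the error and ends the span
        self.attributes["error.type"] = type(error).__name__


# Usage mirroring on_llm_start / on_llm_end / on_llm_error callbacks:
inv = InferenceInvocation(run_id="run-1").start_inference()
inv.stop()
```

The point of the pattern is that each callback pairs with exactly one lifecycle method, so span start and end always happen through the invocation object rather than ad hoc tracer calls.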
Reviewed changes
Copilot reviewed 7 out of 8 changed files in this pull request and generated 4 comments.
Show a summary per file
| File | Description |
|---|---|
| instrumentation-genai/opentelemetry-instrumentation-langchain/tests/test_workflow_chain.py | Adds unit tests validating workflow span creation, CSA propagation, and error/no-op paths. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/invocation_manager.py | Makes invocation state nullable to support runs without an associated GenAI invocation. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/src/opentelemetry/instrumentation/langchain/callback_handler.py | Implements workflow spans for chains and migrates LLM handling to InferenceInvocation. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/pyproject.toml | Updates core instrumentation dependency and adds explicit opentelemetry-util-genai dependency. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/examples/workflow/requirements.txt | Adds dependencies for the new workflow example. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/examples/workflow/main.py | Adds a LangGraph StateGraph workflow example that invokes an LLM node. |
| instrumentation-genai/opentelemetry-instrumentation-langchain/CHANGELOG.md | Notes the new workflow span support/refactor in the changelog. |
```python
invocation = self._invocation_manager.get_invocation(run_id=run_id)
if invocation is None or not isinstance(
    invocation, WorkflowInvocation
):
    # If the invocation does not exist, we cannot set attributes or end it
    return
```
Nested chains are recorded in the invocation manager with `invocation=None`, but `on_chain_end()` returns early when the invocation is missing or not a `WorkflowInvocation`. If `parent_run_id` is not present in the manager (e.g., out-of-order callbacks or partial instrumentation), this creates orphaned entries that will never be cleaned up. Consider deleting the invocation state in `on_chain_end()` / `on_chain_error()` when `get_invocation()` returns `None` (or when it is not a `WorkflowInvocation`), or avoid storing state at all when the parent is unknown.
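One way the suggested cleanup could look, as a hedged sketch: `WorkflowInvocation` appears in the diff above, but `InvocationManager` and the `set_invocation` / `delete_invocation` method names are assumptions modeled on `get_invocation(run_id=...)`, not verified against the actual module.

```python
# Sketch of an on_chain_end() that deletes stale state instead of
# returning early and leaking it. Manager method names are assumed.
class WorkflowInvocation:
    def stop(self):
        pass  # real impl would end the workflow span


class InvocationManager:
    def __init__(self):
        self._state = {}

    def set_invocation(self, run_id, invocation):
        self._state[run_id] = invocation

    def get_invocation(self, run_id):
        return self._state.get(run_id)

    def delete_invocation(self, run_id):
        self._state.pop(run_id, None)


def on_chain_end(manager, run_id):
    invocation = manager.get_invocation(run_id=run_id)
    if invocation is None or not isinstance(invocation, WorkflowInvocation):
        # Drop the placeholder so nested-chain entries are not orphaned.
        manager.delete_invocation(run_id)
        return
    invocation.stop()
    manager.delete_invocation(run_id)
```

Deleting on every terminal callback keeps the manager's lifetime bound to the run, whichever ordering the callbacks arrive in.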
Description
WIP - adding agent invocation as well.
- Adds `opentelemetry-util-genai` as an explicit dependency in `pyproject.toml`.
- Adds a LangGraph workflow example (`examples/workflow/main.py`).

See sample workflow and inference spans below.

Fixes # (issue)
Type of change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes and provide instructions so we can reproduce them. Please also list any relevant details of your test configuration.
Does This PR Require a Core Repo Change?
Checklist:
See contributing.md for styleguide, changelog guidelines, and more.