Tracing
In any complex AI agent system, understanding what’s happening under the hood is crucial for debugging, optimization, and ensuring reliability. Tracing provides a detailed, visual log of your agents’ execution paths, including tool usage, agent interactions, and performance metrics. LangCrew integrates with Langtrace to offer robust observability out of the box.
Quick Start: Tracing a Crew in 5 Steps
Get your first traced crew running in minutes.
1. Install the Langtrace SDK
First, add the Langtrace Python SDK to your project using uv:
```shell
uv add langtrace-python-sdk
```

2. Get Your Langtrace API Key
To send traces to the platform, you’ll need an API key.
- Navigate to the Langtrace website.
- Sign up for a free account.
- In your account settings, create a new project and generate an API key.
3. Configure the SDK
Set your API key as an environment variable. Create a .env file in your project root if you don’t have one:
```
LANGTRACE_API_KEY="your-langtrace-api-key-goes-here"
```

4. Initialize Langtrace in Your App
In your main application file, before you run your crew, initialize Langtrace. It only takes two lines of code.
```python
import os

from dotenv import load_dotenv
from langtrace_python_sdk import langtrace

# Load environment variables
load_dotenv()

# Initialize Langtrace
langtrace.init(api_key=os.getenv("LANGTRACE_API_KEY"))

# ... rest of your crew setup and execution
```

5. Run Your Crew
Now, simply run your crew as you normally would. Langtrace uses instrumentation to automatically capture and send trace data from your LangCrew agents and tasks.
```python
# Example of running a crew after initialization
from my_crew import MyAwesomeCrew

def main():
    # The crew execution is automatically traced
    result = MyAwesomeCrew().crew().kickoff()
    print(result)

if __name__ == "__main__":
    main()
```

Viewing Your Traces
Once your script has run, your traces are available on the Langtrace platform.
- Log in to your Langtrace account.
- Navigate to your project.
- You will see a dashboard with a list of recent traces. Click on any trace to see a detailed waterfall view of the execution, including timings, inputs, outputs, and tool calls for each step in your crew’s process.
Advanced Usage: Custom Spans
For more granular control, you can add custom spans to trace specific parts of your application using the @with_langtrace_root_span decorator. This is useful for grouping a set of operations under a single root trace.
```python
from langtrace_python_sdk import with_langtrace_root_span

from my_crew import MyAwesomeCrew

@with_langtrace_root_span("my_custom_crew_run")
def run_my_crew_with_custom_span():
    inputs = {"topic": "AI advancements"}
    result = MyAwesomeCrew().crew().kickoff(inputs=inputs)
    print(result)

# All operations inside this function will be nested under the "my_custom_crew_run" span
run_my_crew_with_custom_span()
```

By integrating Langtrace, you gain powerful insights into your LangCrew’s performance and behavior, making it easier to build, debug, and scale your AI agent systems.
Troubleshooting
If you’re having trouble seeing your traces, here are a couple of common issues and how to solve them.
Problem: No traces are uploaded after execution
If your code runs but nothing appears in the Langtrace dashboard, the SDK might not be capturing any data.
Solution:

- Enable console output for spans by adding `write_spans_to_console=True` to the `init` function:

  ```python
  langtrace.init(
      api_key=os.getenv("LANGTRACE_API_KEY"),
      write_spans_to_console=True,
  )
  ```

- Run your script again and check the console. If you do not see span data printed in your console, it almost always means the `langtrace.init()` call is happening too late.

- The Fix: The Langtrace SDK works using bytecode instrumentation. This means it must be initialized before any LLM libraries (like `langchain`, `openai`, etc.) or your crew code is imported. Ensure `langtrace.init()` is one of the very first things that runs in your application’s entry point.
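As a related defensive check, you can fail fast when the key is missing instead of initializing the SDK with empty credentials. A minimal sketch using only the standard library; the helper name is ours, not part of the Langtrace SDK:

```python
import os

def require_langtrace_api_key() -> str:
    """Return the Langtrace API key from the environment, or fail loudly.

    Illustrative helper -- the function name is ours, not a Langtrace API.
    """
    key = os.environ.get("LANGTRACE_API_KEY", "")
    if not key:
        raise RuntimeError(
            "LANGTRACE_API_KEY is not set; traces would not be uploaded."
        )
    return key
```

Calling this right before `langtrace.init()` turns a silent missing-trace problem into an immediate, readable error.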
Problem: Spans appear in the console, but not on the platform
If you see trace data in your console but it never appears in your Langtrace dashboard, the issue is likely with your credentials or endpoint configuration.
Check the following:
- API Key: Double-check that your `LANGTRACE_API_KEY` is correct and doesn’t have any typos or extra characters.
- Self-Hosted Endpoint: If you are self-hosting Langtrace, you must specify the correct API endpoint during initialization. Make sure the `api_host` parameter is pointing to your instance.
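A malformed endpoint (for example, one missing its scheme) is a common cause here, and a quick URL sanity check can catch it early. An illustrative stdlib sketch; `check_api_host` is our name, not a Langtrace API:

```python
from urllib.parse import urlparse

def check_api_host(url: str) -> str:
    """Sanity-check a self-hosted endpoint URL before passing it to init().

    Illustrative helper -- not part of the Langtrace SDK.
    """
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        raise ValueError(f"api_host does not look like a full URL: {url!r}")
    return url
```

For example, `check_api_host("my-langtrace:3000")` raises because the scheme is missing, while a full `http://...` URL passes through unchanged.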
```python
langtrace.init(
    api_key=os.getenv("LANGTRACE_API_KEY"),
    api_host="http://your-self-hosted-langtrace-instance:3000"  # Example
)
```

Integrate with LangSmith
LangCrew can also emit traces to LangSmith, LangChain’s observability platform. Use this if your team already relies on LangSmith for dashboards and evaluations.
1. Install the SDK
```shell
uv add langsmith
# or
pip install -U langsmith
```

2. Configure environment variables
Create or update your .env so tracing is enabled and authenticated:
```
LANGCHAIN_TRACING_V2="true"
LANGCHAIN_API_KEY="your-langsmith-api-key"
# Optional
LANGCHAIN_PROJECT="my-langcrew-project"
# If using a self-hosted instance or a non-default region
# LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
```

Load these before your app starts (for example, via `dotenv.load_dotenv()`), or export them in your shell.
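Because `LANGCHAIN_TRACING_V2` gates tracing for the whole process, it can be useful to log at startup whether it is actually on. A minimal sketch, assuming standard-library only; the helper name is ours, not a LangSmith API:

```python
import os

def langsmith_tracing_enabled() -> bool:
    """Report whether LangSmith tracing is switched on via the environment.

    Illustrative helper -- the name is ours, not part of the langsmith SDK.
    """
    return os.environ.get("LANGCHAIN_TRACING_V2", "").strip().lower() == "true"
```

Printing this once at startup makes "why are there no runs in my project?" much quicker to diagnose.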
3. Annotate or wrap your code (optional)
LangSmith auto-instruments many LangChain integrations when `LANGCHAIN_TRACING_V2=true`. For non-LangChain code paths, or to create clear run boundaries around your crew execution, use the decorator or context manager:
```python
from dotenv import load_dotenv

load_dotenv()

from langsmith import traceable

@traceable(name="run_langcrew_crew")
def run_crew():
    from my_crew import MyAwesomeCrew
    return MyAwesomeCrew().crew().kickoff()

result = run_crew()
print(result)
```

Or with a context manager for manual control over inputs/outputs:
```python
import langsmith as ls

with ls.trace("langcrew_run", "chain") as run:
    from my_crew import MyAwesomeCrew
    out = MyAwesomeCrew().crew().kickoff()
    run.end(outputs={"result": str(out)})
```

4. View your traces
Open your LangSmith project dashboard to verify runs are appearing with inputs, outputs, and timing. For more, see the official LangSmith docs: LangSmith Documentation.