Valiqor automatically traces all Anthropic messages.create calls — both sync and async. Every call is recorded as a span with model name, token usage, cost, and full message content.

Install

pip install valiqor[anthropic]
This installs valiqor plus anthropic>=0.18.0.
Add a single import at the top of your app — all Anthropic calls are automatically traced:
import valiqor.auto  # ← Add this line

import anthropic

client = anthropic.Anthropic()
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Explain quantum computing"}]
)
print(message.content[0].text)
Every messages.create call is now traced with full metadata.

Selective Instrumentation

If you only want Anthropic tracing (not other providers), use the provider-specific function:
from valiqor.trace import anthropic_autolog

anthropic_autolog()

# Or using the namespace-style API:
from valiqor.trace import Anthropic
Anthropic.autolog()
Both are equivalent — they enable tracing only for Anthropic.

Async Support

Async Anthropic calls are automatically traced:
import valiqor.auto
import anthropic
import asyncio

client = anthropic.AsyncAnthropic()

async def main():
    message = await client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Explain quantum computing"}]
    )
    print(message.content[0].text)

asyncio.run(main())

Content Block Handling

Anthropic responses use content blocks. Valiqor automatically extracts text from both string content and structured content block lists:
import valiqor.auto
import anthropic

client = anthropic.Anthropic()

# Multi-block content is handled automatically
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Summarize this document"},
                {"type": "text", "text": "Document content here..."}
            ]
        }
    ]
)
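To make the behavior concrete, here is a minimal sketch of how text extraction across both content shapes might work. The helper name `extract_text` is hypothetical and not part of Valiqor's public API; it only illustrates the string-vs-block-list handling described above.

```python
def extract_text(content):
    """Return the text from a message's content field.

    Handles both plain-string content and lists of content blocks,
    mirroring the two shapes the Anthropic API accepts.
    """
    # Plain-string content is returned as-is.
    if isinstance(content, str):
        return content
    # Otherwise, collect text from each "text"-type block.
    parts = []
    for block in content:
        # Blocks may be dicts (requests) or SDK objects (responses).
        block_type = block.get("type") if isinstance(block, dict) else getattr(block, "type", None)
        if block_type == "text":
            parts.append(block["text"] if isinstance(block, dict) else block.text)
    return "\n".join(parts)
```

Non-text blocks (such as tool use) are simply skipped by this sketch, which matches the "captured as raw content" note under Limitations.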

What Gets Captured

Each traced Anthropic call records:
Field           Description
model           Model name (e.g. claude-sonnet-4-20250514)
input_tokens    Input token count
output_tokens   Output token count
total_tokens    Combined token count
cost            Estimated cost in USD
stop_reason     Why the model stopped (e.g. end_turn, max_tokens)
messages        User and assistant messages
duration_ms     Call latency
status          Success or error
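The cost field is an estimate derived from the token counts. A minimal sketch of such an estimate is below; the per-million-token rates shown are hypothetical placeholders, not actual Anthropic pricing, and the function name is illustrative rather than part of Valiqor's API.

```python
def estimate_cost_usd(input_tokens, output_tokens, rate_in, rate_out):
    """Estimate call cost in USD from token counts.

    rate_in / rate_out are USD per million input/output tokens
    (hypothetical values -- real rates vary by model).
    """
    return (input_tokens / 1_000_000) * rate_in + (output_tokens / 1_000_000) * rate_out

# Example with placeholder rates of $3 / $15 per million tokens:
cost = estimate_cost_usd(1000, 500, rate_in=3.0, rate_out=15.0)
```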

With Workflows

Combine with trace_workflow to group multiple Anthropic calls into a single trace:
import valiqor.auto
from valiqor.trace import trace_workflow
import anthropic

client = anthropic.Anthropic()

with trace_workflow("claude-assistant"):
    # Step 1: Analyze
    analysis = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Analyze this data: [...]"}]
    )

    # Step 2: Summarize
    summary = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=512,
        messages=[
            {"role": "user", "content": f"Summarize: {analysis.content[0].text}"}
        ]
    )
Both calls appear as child spans under the claude-assistant trace.

Disabling

To disable Anthropic tracing:
from valiqor.trace import disable_autolog

disable_autolog("anthropic")    # Disable Anthropic only
disable_autolog()               # Disable all providers

Limitations

  • Streaming is not currently instrumented — streamed responses are not captured in traces
  • Tool use — Anthropic tool use responses are not explicitly parsed (tool_use content blocks are captured as raw content)
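For reference, a tool_use content block that Valiqor would record verbatim has the following shape. The ID, tool name, and input values here are illustrative, not real API output.

```python
# Example of a tool_use content block as it appears in an Anthropic
# response. Valiqor stores blocks like this as raw content rather
# than parsing the tool name or input.
tool_use_block = {
    "type": "tool_use",
    "id": "toolu_example123",              # illustrative ID
    "name": "get_weather",                 # illustrative tool name
    "input": {"location": "San Francisco"} # illustrative arguments
}
```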

Next Steps