All the latest updates, improvements, and fixes to Intercept. Stay up to date with everything we're building.
v3.0.0
Full Model Context Protocol support with automatic instrumentation, tool call tracing, and comprehensive resource monitoring for AI agents.
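Tool call tracing of this kind can be sketched as a wrapper around an MCP tool handler. The `traceToolCall` helper and the span shape below are illustrative assumptions, not Intercept's actual API:

```javascript
// Hypothetical sketch: wrap a tool handler so every invocation records
// the tool name, duration, and outcome, even when the handler throws.
function traceToolCall(toolName, handler, sink) {
  return async function traced(args) {
    const start = Date.now();
    const span = { tool: toolName, args, status: "ok" };
    try {
      return await handler(args);
    } catch (err) {
      span.status = "error";
      span.error = String(err);
      throw err;
    } finally {
      span.durationMs = Date.now() - start;
      sink.push(span); // sink stands in for a real trace exporter
    }
  };
}
```

Wrapping handlers at tool-registration time is what makes the instrumentation "automatic": agent code calls the tool as usual and the span is emitted as a side effect.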
Track LLM costs per user, session, or feature with granular attribution. Set budgets, receive alerts, and optimize spending with smart recommendations.
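Granular attribution boils down to pricing each request from its token counts and grouping by whichever key you care about. A minimal sketch, assuming illustrative per-token prices and helper names (not Intercept's API):

```javascript
// Example rates only; real per-1K-token prices vary by model and provider.
const PRICE_PER_1K = { "gpt-4o": { input: 0.0025, output: 0.01 } };

// Cost of a single request from its token usage.
function requestCost(model, inputTokens, outputTokens) {
  const p = PRICE_PER_1K[model];
  return (inputTokens / 1000) * p.input + (outputTokens / 1000) * p.output;
}

// Aggregate spend by any attribution key: "user", "session", or "feature".
function attribute(requests, key) {
  const totals = {};
  for (const r of requests) {
    const k = r[key];
    totals[k] = (totals[k] || 0) + requestCost(r.model, r.inputTokens, r.outputTokens);
  }
  return totals;
}
```

A budget check is then just a comparison of these totals against a configured limit, with an alert fired on breach.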
Live quality metrics with automated evaluation pipelines. Run LLM-as-judge evaluations on production traffic with zero latency impact.
50+ eval templates, <5ms overhead.
Intelligent alerts for latency spikes, quality degradation, cost anomalies, and compliance violations. Slack, PagerDuty, and webhook integrations.
intercept.alert({
  name: "High Latency",
  condition: "latency_p99 > 2000ms",
  channel: "slack:#alerts"
})

Rebuilt ingestion pipeline with edge buffering and smart batching. Now handling 1M+ traces per minute with sub-second visibility.
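Smart batching of this kind is typically a buffer that flushes on whichever comes first, a size threshold or a time window. A minimal sketch under assumed defaults (the `TraceBuffer` class is illustrative, not Intercept's implementation):

```javascript
// Hypothetical sketch: buffer traces at the edge and flush when either
// the batch fills up or the wait window expires.
class TraceBuffer {
  constructor({ maxBatch = 500, maxWaitMs = 200, flush }) {
    this.maxBatch = maxBatch;
    this.maxWaitMs = maxWaitMs;
    this.flush = flush; // called with each full batch
    this.buf = [];
    this.timer = null;
  }
  push(trace) {
    this.buf.push(trace);
    if (this.buf.length >= this.maxBatch) {
      this.drain(); // size threshold hit: flush immediately
    } else if (!this.timer) {
      this.timer = setTimeout(() => this.drain(), this.maxWaitMs);
      if (this.timer.unref) this.timer.unref(); // don't hold the process open
    }
  }
  drain() {
    if (this.timer) { clearTimeout(this.timer); this.timer = null; }
    if (this.buf.length === 0) return;
    const batch = this.buf;
    this.buf = [];
    this.flush(batch);
  }
}
```

Batching amortizes the per-request network cost while the short wait window keeps traces visible within a bounded delay.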
New SDK automatically detects and instruments OpenAI, Anthropic, Cohere, LangChain, LlamaIndex, and more. Just import and go.
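One common way an SDK instruments a detected client without code changes is a Proxy that intercepts method calls. The `instrumentClient` helper and log shape below are assumptions for illustration, not Intercept's actual mechanism:

```javascript
// Hypothetical sketch: wrap a provider client so every method call is
// logged before being forwarded unchanged to the underlying client.
function instrumentClient(client, providerName, log) {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value !== "function") return value; // pass plain fields through
      return function (...args) {
        log.push({ provider: providerName, method: String(prop) });
        return value.apply(target, args);
      };
    },
  });
}
```

Because the proxy forwards everything it doesn't trace, application code keeps calling the client exactly as before.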
Intercept is now SOC 2 Type II certified and HIPAA compliant. Enterprise-grade security with data residency options.