# autolemetry
Write once, observe everywhere.
Instrument your Node.js code a single time, keep the DX you love, and stream traces, metrics, logs, and product events to any observability stack without vendor lock-in.
Call `init()` once, wrap functions with `trace()`, and get automatic traces, metrics, and events:

```typescript
import { init, trace, track } from 'autolemetry';
import { PostHogSubscriber, SlackSubscriber } from 'autolemetry-subscribers';

// Initialize once at startup
init({
  service: 'checkout-api',
  endpoint: process.env.OTEL_EXPORTER_OTLP_ENDPOINT, // Grafana, Datadog, Tempo, etc.
  subscribers: [
    new PostHogSubscriber({ apiKey: process.env.POSTHOG_KEY! }),
    new SlackSubscriber({ webhookUrl: process.env.SLACK_WEBHOOK! }),
  ],
});

// Wrap any function - automatic spans, error tracking, and context
export const processOrder = trace(async function processOrder(
  orderId: string,
  amount: number,
) {
  const order = await db.orders.findById(orderId);
  const user = await db.users.findById(order.userId);
  const payment = await chargeCard(user.cardId, amount);

  // Product events automatically enriched with trace context
  // Sent to: OTLP + PostHog + Slack (all in one call!)
  track('order.completed', { orderId, amount, userId: user.id });

  return payment;
});
```
That's it. Every call to `processOrder()` now:

- ✅ Creates a span with automatic error handling
- ✅ Tracks metrics (duration, success rate)
- ✅ Sends events with `traceId` and `spanId` to all adapters
- ✅ Works with any OTLP-compatible backend (Grafana, Datadog, New Relic, Tempo, etc.)

→ See complete examples and API docs
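The trace-context enrichment mentioned above amounts to merging the active trace IDs into each event's payload before it reaches a subscriber. A self-contained sketch of that idea (illustrative only; this is not autolemetry's internal code, and the `TraceContext` shape is an assumption):

```typescript
// Illustrative: attach the active trace context to a product event so it
// can be correlated with the span that emitted it.
interface TraceContext {
  traceId: string;
  spanId: string;
}

function enrichEvent(
  name: string,
  properties: Record<string, unknown>,
  ctx?: TraceContext,
): Record<string, unknown> {
  // Trace IDs are spread last so event properties cannot mask them.
  return ctx
    ? { event: name, ...properties, traceId: ctx.traceId, spanId: ctx.spanId }
    : { event: name, ...properties };
}
```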
## Packages
This monorepo contains the following packages:
### autolemetry
Core library providing ergonomic OpenTelemetry instrumentation with:
- Drop-in DX with `trace()`, `span()`, and decorators
- Adaptive sampling (10% baseline, 100% for errors and slow paths)
- Production hardening (rate limiting, circuit breakers, redaction)
- Auto trace context enrichment
- LLM observability via OpenLLMetry integration
- AI workflow patterns (multi-agent, RAG, evaluation loops)
### autolemetry-subscribers
Product events subscribers for:
- PostHog
- Mixpanel
- Amplitude
- Slack webhooks
- Custom webhooks
→ View subscribers documentation
### autolemetry-edge
Edge runtime support for:
- Cloudflare Workers
- Vercel Edge Functions
- Other edge environments
## Migrating from OpenTelemetry?
Migration Guide - Migrate from vanilla OpenTelemetry to autolemetry:
- Quick start with copy-paste code examples
- Pattern-by-pattern transformations (environment variables, manual SDK setup, manual spans, logger integration, sampling)
- Side-by-side before/after comparisons
- 9-phase migration checklist
- Edge cases and when not to migrate
Typical migration: replace `NODE_OPTIONS` and 30+ lines of SDK boilerplate with a single `init()` call, and wrap functions with `trace()` instead of pairing manual `span.start()`/`span.end()` calls.
## Quick Start

```bash
npm install autolemetry

# Optional: add event subscribers (PostHog, Slack, Mixpanel, etc.)
npm install autolemetry-subscribers

# or with pnpm
pnpm add autolemetry
pnpm add autolemetry-subscribers # Optional
```
## Quick Debug Mode

See traces instantly during development, with no backend required:

```typescript
import { init, trace } from 'autolemetry';

// Start with console-only (no backend needed)
init({
  service: 'my-app',
  debug: true, // Outputs spans to console
});

// Your traced functions work as normal
const result = await trace(async () => {
  // Your code here
  return 'success';
})();
// Span printed to console automatically!
```
How it works:

- `debug: true`: prints spans to the console AND sends them to the backend (if an endpoint is configured)
- No endpoint: console-only (perfect for local development)
- With endpoint: console + backend (verify output before choosing a provider)
- No debug flag: send to backend only (default production behavior)
Or use an environment variable:

```bash
AUTOLEMETRY_DEBUG=true node server.js
```
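If you want to mirror this switch in your own startup code, the resolution logic is tiny. A hedged sketch (autolemetry reads `AUTOLEMETRY_DEBUG` itself, so this helper is purely illustrative, and treating `1` as truthy is an assumption):

```typescript
// Illustrative helper: treat AUTOLEMETRY_DEBUG=true (or 1, case-insensitively)
// as enabling debug mode.
function resolveDebug(env: Record<string, string | undefined>): boolean {
  const raw = env.AUTOLEMETRY_DEBUG?.trim().toLowerCase();
  return raw === 'true' || raw === '1';
}
```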
## Environment Variables

Configure autolemetry using standard OpenTelemetry environment variables:

```bash
export OTEL_SERVICE_NAME=my-app
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
export OTEL_EXPORTER_OTLP_HEADERS=x-honeycomb-team=YOUR_API_KEY
export OTEL_RESOURCE_ATTRIBUTES=deployment.environment=production
```

Then call `init()` without any config; it picks up the env vars automatically:

```typescript
init({ service: 'my-app' }); // Minimal config, env vars fill in the rest
```

→ See complete environment variable documentation
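Per the OpenTelemetry specification, `OTEL_EXPORTER_OTLP_HEADERS` is a comma-separated list of `key=value` pairs. A small sketch of how such a value parses (illustrative; the OTel SDK and autolemetry do this for you):

```typescript
// Parse a comma-separated key=value list, e.g. "a=1,b=2", into an object.
// Values may themselves contain '=', so split only on the first one.
function parseOtlpHeaders(raw: string): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const pair of raw.split(',')) {
    const idx = pair.indexOf('=');
    if (idx <= 0) continue; // skip malformed or empty entries
    headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  }
  return headers;
}
```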
## Development

### Prerequisites

- Node.js 18+
- pnpm 8+
### Setup

```bash
# Clone and install dependencies
git clone https://github.com/jagreehal/autolemetry.git
cd autolemetry
pnpm install

# Build all packages
pnpm build

# Run tests
pnpm test

# Run example apps
pnpm --filter @jagreehal/example-basic start
pnpm --filter @jagreehal/example-http start
```
### Project Structure

```
autolemetry/
├── packages/
│   ├── autolemetry/              # Core library
│   ├── autolemetry-subscribers/  # Event subscribers
│   └── autolemetry-edge/         # Edge runtime support
├── apps/
│   ├── example-basic/            # Basic usage example
│   ├── example-http/             # Express server example
│   └── cloudflare-example/       # Cloudflare Workers example
└── turbo.json                    # Turborepo configuration
```
### Available Scripts

```bash
# Development
pnpm dev              # Watch mode for all packages
pnpm build            # Build all packages
pnpm test             # Run all tests
pnpm test:integration # Run integration tests

# Code quality
pnpm lint             # Lint all packages
pnpm format           # Format code with Prettier
pnpm type-check       # TypeScript type checking

# Releases
pnpm changeset        # Create a changeset
pnpm version-packages # Version packages
pnpm release          # Publish to npm
```
## Running Examples

### Basic Example

```bash
pnpm --filter @jagreehal/example-basic start
```

### HTTP Server Example

```bash
pnpm --filter @jagreehal/example-http start
```

### Cloudflare Workers Example

```bash
pnpm --filter cloudflare-example dev
```
## Contributing
We welcome contributions! Please see our contributing guidelines for details.
### Development Workflow

1. Fork and clone the repository
2. Create a branch for your feature: `git checkout -b feature/my-feature`
3. Make your changes and add tests
4. Run tests: `pnpm test`
5. Create a changeset: `pnpm changeset`
6. Commit your changes: `git commit -am "Add new feature"`
7. Push to your fork: `git push origin feature/my-feature`
8. Open a pull request
### Adding a Changeset

We use changesets for version management:

```bash
pnpm changeset
```

Follow the prompts to:

- Select which packages changed
- Choose a semver bump (major/minor/patch)
- Write a summary of your changes
## Architecture
Autolemetry is built on top of OpenTelemetry and provides:
- Ergonomic API layer - Wraps verbose OpenTelemetry APIs
- Smart defaults - Production-ready configuration out of the box
- Platform agnostic - Works with any OTLP-compatible backend
- Type-safe - Full TypeScript support with strict types
- Modular design - Use only what you need
## Why Autolemetry?
| Challenge | With autolemetry |
|---|---|
| Raw OpenTelemetry is verbose | One-line trace() wrapper with automatic lifecycle |
| Vendor SDKs create lock-in | OTLP-native, works with any backend |
| Need both observability & events | Unified API for traces, metrics, logs, and events |
| Production safety concerns | Built-in sampling, rate limiting, redaction |
## Troubleshooting

Not seeing your traces? Use ConsoleSpanExporter for visual debugging or InMemorySpanExporter in tests. See the full troubleshooting guide in the detailed docs.
## Roadmap
- Core tracing API
- Metrics support
- Log correlation
- Product events subscribers
- Edge runtime support
- LLM observability (OpenLLMetry)
## Community & Support
## License
MIT - See LICENSE for details.
