Maxim AI release notes

🔭 Public API for OpenTelemetry trace ingestion

New
You can now send your OpenTelemetry GenAI traces directly to Maxim with a single-line code change, unlocking comprehensive LLM observability. Maxim supports the OpenTelemetry semantic conventions for generative AI systems, so you can get observability for your LLM workflows with minimal configuration.

from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans to Maxim's OTLP endpoint, authenticated with your
# Maxim API key and log repository ID.
tracer_provider = trace_sdk.TracerProvider()
span_processor = SimpleSpanProcessor(OTLPSpanExporter(
    endpoint="https://api.getmaxim.ai/v1/otel",
    headers={
        "x-maxim-api-key": maxim_api_key,
        "x-maxim-repo-id": maxim_repo_id,
    },
))
tracer_provider.add_span_processor(span_processor)

See our Ingesting via OTLP Endpoint documentation for details.