📊 Langtrace

Langtrace enables developers to trace, evaluate, manage prompts and datasets, and debug performance issues in LLM applications. It emits OpenTelemetry-standard traces for Chroma, which aids observability and works with any OpenTelemetry-compatible observability client.

Key features include:

  • Detailed traces and logs
  • Real-time monitoring of key metrics including accuracy, evaluations, usage, costs, and latency
  • Integrations with the most popular frameworks, vector databases, and LLMs, including LangChain, LlamaIndex, OpenAI, Anthropic, Pinecone, Chroma, and Cohere
  • Self-hosted, or managed via Langtrace Cloud

Installation

Install the SDK in your project:

  • Python: Install the Langtrace SDK using pip
pip install langtrace-python-sdk
  • TypeScript: Install the Langtrace SDK using npm
npm i @langtrase/typescript-sdk

Initialize the SDK in your project:

  • TypeScript:
// Must precede any LLM module imports
import * as Langtrace from "@langtrase/typescript-sdk";

Langtrace.init({ api_key: "<LANGTRACE_API_KEY>" });
  • Python:
from langtrace_python_sdk import langtrace

langtrace.init(api_key='<LANGTRACE_API_KEY>')

Configuration

Langtrace is adaptable and can be configured to transmit traces to any OpenTelemetry-compatible observability platform, such as Datadog, Honeycomb, Dynatrace, or New Relic. For details on setup and options, consult the Langtrace docs.
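As a minimal configuration sketch: when self-hosting Langtrace (or routing traces to another backend), the init call can be pointed at your own endpoint instead of Langtrace Cloud. The `api_host` parameter name and URL shape below are assumptions for illustration; check the Langtrace docs for the exact options supported by your SDK version.

```python
from langtrace_python_sdk import langtrace

# Point the SDK at a self-hosted Langtrace instance instead of Langtrace Cloud.
# NOTE: `api_host` and the endpoint path are assumptions for illustration;
# consult the Langtrace docs for the exact initialization options.
langtrace.init(
    api_key='<LANGTRACE_API_KEY>',
    api_host='https://<your-langtrace-host>/api/trace',
)
```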