Langfuse

What is Langfuse?

Langfuse is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. It is part of the ClickHouse ecosystem and relies on ClickHouse at its core to provide a scalable, high-performance observability backend.

By leveraging ClickHouse's columnar storage and fast analytical capabilities, Langfuse can handle billions of traces and events with low latency, making it suitable for high-throughput production workloads.

Why Langfuse?

  • Open source: Fully open source with a public API for custom integrations
  • Production optimized: Designed with minimal performance overhead
  • Best-in-class SDKs: Native SDKs for Python and JavaScript
  • Framework support: Integrated with popular frameworks like OpenAI SDK, LangChain, and LlamaIndex
  • Multi-modal: Support for tracing text, images, and other modalities
  • Full platform: Suite of tools for the complete LLM application development lifecycle

Deployment options

Langfuse offers flexible deployment options to meet different security and infrastructure needs.

Langfuse Cloud is a fully managed service powered by a managed ClickHouse cluster for optimal performance. It is SOC 2 Type II and ISO 27001 certified, GDPR compliant, and available in US (AWS us-west-2) and EU (AWS eu-west-1) data regions.

Self-hosted Langfuse is fully open-source (MIT license) and free to deploy on your own infrastructure using Docker or Kubernetes. You run your own ClickHouse instance (or use ClickHouse Cloud) to store observability data, ensuring complete control over your data.

Architecture

Langfuse depends only on open-source components and can be deployed locally, on cloud infrastructure, or on-premises:

  • ClickHouse: Stores high-volume observability data (traces, spans, generations, scores) and enables fast aggregation and analytics for dashboards (see the query sketch after this list).
  • Postgres: Stores transactional data like user accounts, project configurations, and prompt definitions.
  • Redis: Handles event queuing and caching.
  • S3/Blob Storage: Stores large payloads and raw event data.
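
To illustrate the ClickHouse side of this architecture, here is a hedged sketch that queries a self-hosted deployment's trace store using the clickhouse-connect driver. The `traces` table and `project_id` column follow Langfuse's documented self-hosting schema, but table and column names may differ across Langfuse versions, and connection details depend on your setup:

```python
import clickhouse_connect

# Connect to the ClickHouse instance backing a self-hosted Langfuse
# deployment (host, port, and credentials depend on your setup).
client = clickhouse_connect.get_client(host="localhost", port=8123)

# Aggregate trace volume per project: the kind of analytical query that
# ClickHouse's columnar storage answers quickly, even at billions of rows.
# Assumes Langfuse's documented "traces" table with a "project_id" column.
result = client.query(
    "SELECT project_id, count() AS trace_count FROM traces GROUP BY project_id"
)
for project_id, trace_count in result.result_rows:
    print(project_id, trace_count)
```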

Features

Observability

Observability is essential for understanding and debugging LLM applications. Unlike traditional software, LLM applications involve complex, non-deterministic interactions that can be challenging to monitor and debug. Langfuse provides comprehensive tracing capabilities that help you understand exactly what's happening in your application.

📹 Want to learn more? Watch an end-to-end walkthrough of Langfuse Observability and how to integrate it with your application.

Traces allow you to track every LLM call and other relevant logic in your app.
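
As a minimal sketch, tracing with the Langfuse Python SDK can be as simple as decorating a function with `@observe`. The import path varies between SDK versions (older versions export the decorator from `langfuse.decorators`, newer ones from the top-level package), and the function below is purely illustrative:

```python
# Minimal tracing sketch, assuming a v2-style Langfuse Python SDK.
# Credentials are read from the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY,
# and LANGFUSE_HOST environment variables.
from langfuse.decorators import observe

@observe()  # records this function call as a trace in Langfuse
def answer_question(question: str) -> str:
    # Nested @observe()-decorated functions and LLM calls made through
    # Langfuse's framework integrations appear as child observations
    # of this trace.
    return f"Echo: {question}"

answer_question("What does Langfuse trace?")
```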

Prompt management

Prompt management is critical to building effective LLM applications. Langfuse provides tools to help you manage, version, and optimize your prompts throughout the development lifecycle.

📹 Want to learn more? Watch an end-to-end walkthrough of Langfuse Prompt Management and how to integrate it with your application.

Create a new prompt via the UI, SDKs, or API.
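
As a hedged sketch, creating and then fetching a managed prompt through the Python SDK might look like the following. The prompt name "movie-critic" and its template variables are hypothetical, and exact parameter names may differ by SDK version:

```python
from langfuse import Langfuse

langfuse = Langfuse()  # reads LANGFUSE_* credentials from the environment

# Create a new prompt version and label it for production use.
langfuse.create_prompt(
    name="movie-critic",  # hypothetical prompt name
    prompt="As a {{critic_level}} movie critic, review {{movie}}.",
    labels=["production"],
)

# Later, fetch the production version and fill in its template variables.
prompt = langfuse.get_prompt("movie-critic")
compiled = prompt.compile(critic_level="expert", movie="Dune 2")
```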

Evaluation & datasets

Evaluation is crucial for ensuring the quality and reliability of your LLM applications. Langfuse provides flexible evaluation tools that adapt to your specific needs, whether you're testing in development or monitoring production performance.

📹 Want to learn more? Watch an end-to-end walkthrough of Langfuse Evaluation and how to use it to improve your LLM application.

Plot evaluation results in the Langfuse Dashboard.
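
For example, an evaluation score can be attached to an existing trace from code. The sketch below assumes a v2-style Python SDK (newer versions name the method `create_score`), and the trace ID and score name are illustrative:

```python
from langfuse import Langfuse

langfuse = Langfuse()

# Attach a numeric evaluation score to a previously recorded trace;
# scores are aggregated and plotted in the Langfuse dashboard.
langfuse.score(
    trace_id="some-trace-id",  # hypothetical: ID of an existing trace
    name="accuracy",
    value=0.9,
    comment="Manual review: answer matched the reference.",
)
```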

Quickstarts

Get up and running with Langfuse in minutes, choosing the path that best fits your current needs.
