Langfuse

5 / 5 · 13 reviews · Since 2023

Open Source LLM Engineering Platform

Langfuse is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. All platform features are natively integrated to accelerate the development workflow. Langfuse is open: it works with any model and any framework, supports complex nesting, and exposes open APIs for building downstream use cases.

Docs: https://langfuse.com/docs
GitHub: https://github.com/langfuse/langfuse

Categories: AI Infrastructure Tools · AI Metrics and Evaluation
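
As a quick illustration of the description above (not taken from this listing): a minimal sketch of what "any model, any framework, complex nesting" looks like in practice, assuming the v2-style Langfuse Python SDK with the observe decorator and the OpenAI drop-in wrapper described in the Langfuse docs. Credentials are read from the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables; model name and prompt are placeholders.

# Hedged sketch: nested tracing with the Langfuse Python SDK (v2-style API).
from langfuse.decorators import observe
from langfuse.openai import openai  # drop-in wrapper that records model, tokens, and cost


@observe()  # nested call appears as a child observation in the same trace
def summarize(text: str) -> str:
    response = openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return response.choices[0].message.content


@observe()  # outermost call becomes the trace root
def handle_request(document: str) -> str:
    return summarize(document)


if __name__ == "__main__":
    print(handle_request("Langfuse records every step of this pipeline."))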

Reviews for Langfuse

Hear what real users highlight about this tool.

5 / 5 · Based on 13 reviews

5 stars: 13
4 stars: 0
3 stars: 0
2 stars: 0
1 star: 0
AI summary

Reviews praise Langfuse for reliable observability, fast iteration, and an approachable open-source experience. Makers of Lingo.dev, AutonomyAI, and Touring highlight dependable tracking, easy monitoring, and debugging that shortens development and QA cycles. Users note smooth SDKs, detailed tracing, strong analytics on latency, tokens, and costs, plus flexible integrations and self-hosting. Teams say it “just works,” helps uncover edge cases, and supports prompt management and evals. The generous free options and responsive team stand out, with many adopting it as their default LLM observability layer.

This AI-generated snapshot distills top reviewer sentiments.

Yasi Rajaee · 5/5 · 2 years ago

Langfuse is a game changer! Thank you, team.

Source: Product Hunt
Punn Kam · 5/5 · 3 months ago

Detailed tracing and evals for our AI workloads.

Source: Product Hunt
Garry Tan · 5/5 · 3 months ago

Marc gave me an in-person onboarding in SF; I found an issue in our LLM provider config just 30 minutes after the onboarding thanks to Langfuse. 10/10 recommendation.

Source: Product Hunt
Ben Lang · 5/5 · 3 months ago

Langfuse is amazing and helps support our tracking of LLM usage and more.

Source: Product Hunt
Alexander Doudkin · 5/5 · 15 days ago

Used for evals & tracing — absolutely essential for getting visibility into how our AI behaves. Why we love it: Open source, deep insights, smooth integration into our stack.

Source: Product Hunt
Gift Richard · 5/5 · 5 months ago

We chose Langfuse over alternatives like Weights & Biases or custom logging solutions because of its specialized focus on LLM observability and evaluation. When users upload diverse content formats - from messy handwritten notes to complex technical PDFs - we need detailed tracing to understand how our AI pipeline processes each input and generates personalized quizzes. Langfuse lets us track the entire journey from content ingestion through quiz generation, helping us identify when our AI creates exceptional learning experiences versus when it struggles with certain content types. The evaluation capabilities are crucial for maintaining consistent quality as we scale - we can automatically detect when quiz questions become too easy, too hard, or miss key concepts from the source material.

Source: Product Hunt
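The review above describes tracing a content-to-quiz pipeline end to end and automatically scoring output quality. A hypothetical sketch of that pattern with the Langfuse Python client (v2-style low-level API); the trace, span, generation, and score names such as "quiz-pipeline" and "difficulty_in_range" are illustrative and not taken from the reviewer's product.

# Hypothetical sketch: one trace per pipeline run, plus an automated eval score.
from langfuse import Langfuse

langfuse = Langfuse()  # reads the LANGFUSE_* environment variables

trace = langfuse.trace(name="quiz-pipeline", input={"source": "handwritten-notes.pdf"})

ingestion = trace.span(name="content-ingestion")
ingestion.end(output={"chunks": 12})

trace.generation(
    name="quiz-generation",
    model="gpt-4o-mini",
    input="Generate 5 quiz questions from the extracted notes.",
    output="Q1 ... Q5",
)

# Attach an eval result so too-easy / too-hard regressions are visible per trace.
langfuse.score(trace_id=trace.id, name="difficulty_in_range", value=0.8)
langfuse.flush()  # make sure queued events are sent before the process exits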
Philip Moses · 5/5 · 9 months ago

Open source, supports self-hosting, and is super easy to integrate with Langflow.

Source: Product Hunt
Dmitriy Tkalich · 5/5 · 9 months ago

Solid observability features combined with the TS SDK made Langfuse a go-to solution for bootstrapping new AI products.

Previously used LangSmith and Helicone; where Langfuse stands out is its pricing plan, which favors early-stage products and offers more value for the price.

Pros
+ LLM observability (8)
+ SDK availability (2)
+ cost-effective (2)
Source: Product Hunt
Othmane Zoheir · 5/5 · 11 months ago

I've been using Langfuse from day 1 for my LLM package "tinyllm" and all my LLM apps. It just works and the team is fast and very helpful. Probably the best open-source tool/experience I've ever had.

Pros
+ open source (4)
Source: Product Hunt
Nils Reichardt · 5/5 · 11 months ago

12 months ago I discovered Langfuse and I've been a happy user ever since. Langfuse made it super easy for me to find & fix weird edge case responses of LLMs and build a new architecture that is much cheaper and more stable.

Pros
+ debugging tools (2)
+ cost-effective (2)
Source: Product Hunt
Gavin Seewooruttun · 5/5 · 1 year ago

One of the key challenges in developing production-ready Generative AI apps is the ability to tune LLMs and RAG pipelines. For this purpose, observability of inputs and outputs is essential. Langfuse provides a sophisticated but easy-to-use platform that has enabled us to significantly improve our AI pipelines.

Source: Product Hunt
Mehdi Zare · 5/5 · 1 year ago

It's easy to integrate into my workflow, and you get a bunch of reports and details that would be impossible to produce on your own. I love that they offer a free cloud version, at least for now.

Pros
+ easy to integrate (2)
+ free tier (2)
Source: Product Hunt
Sebastian Schüller · 5/5 · 2 years ago

Needed to make sense of any LLM effort quickly.

Source: Product Hunt