Introduction to Langtrace
Langtrace is an open-source observability tool designed specifically for applications built with Large Language Models (LLMs), vector databases, and LLM-based frameworks. It captures traces and metrics across a range of platforms and services, providing a single place to debug and analyze LLM application behavior.
Open Source and OpenTelemetry for LLM Applications
Langtrace is an open-source project that welcomes contributions and engagement from the community. It follows the OpenTelemetry (OTel) standard, ensuring interoperability and consistency in trace data collection. OpenTelemetry is a set of specifications, libraries, and tools for collecting telemetry data (traces, metrics, and logs) from applications, which makes Langtrace compatible with the wider observability ecosystem.
SDK Repositories
Langtrace services are accessible through several Software Development Kits (SDKs) tailored for different programming environments:
- TypeScript SDK: Available here
- Python SDK: Available here
- Semantic Span Attributes: Available here
These SDKs facilitate the integration of Langtrace with applications, allowing developers to seamlessly incorporate observability features into their projects.
Getting Started with Langtrace Cloud and Self-Hosting
Langtrace can be accessed as a managed service via Langtrace Cloud or can be self-hosted, offering flexibility based on user requirements.
Langtrace Cloud
To start using Langtrace Cloud:
- Sign up here.
- Create a new project to house the application's traces and metrics.
- Generate an API key within the project.
- Install and initialize the Langtrace SDK in your application to start collecting data.
Example initialization for TypeScript/JavaScript and Python applications is provided to help users quickly set up their environment.
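As a hedged sketch, a Python application's initialization might look like the following. The package name and `init` signature are assumptions based on the SDK's documented usage, and the import is guarded so the snippet degrades gracefully when the SDK is not installed:

```python
# Hedged sketch of Langtrace SDK initialization; the package name and
# init(api_key=...) signature are assumptions -- check the SDK README.
import os

try:
    from langtrace_python_sdk import langtrace  # assumed package name
    # Initialize before importing the LLM libraries you want traced.
    langtrace.init(api_key=os.environ.get("LANGTRACE_API_KEY"))
    initialized = True
except ImportError:
    # SDK not installed; tracing is simply skipped.
    initialized = False
```

Initializing the SDK early matters because automatic instrumentation typically patches LLM client libraries at import time.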
Self-Hosted Option
Users who prefer running Langtrace locally can do so by setting up the following services:
- Next.js app
- PostgreSQL database
- ClickHouse database
Docker and Docker Compose are used for deployment, simplifying the process of running these services on a local machine.
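The three services above could be wired together roughly as follows. This is an illustrative sketch only; the service names, images, and ports are assumptions, and the Langtrace repository ships its own Compose file that should be used instead:

```yaml
# Illustrative compose sketch -- use the compose file from the Langtrace repo.
services:
  app:                       # Next.js application
    build: .
    ports: ["3000:3000"]
    depends_on: [postgres, clickhouse]
  postgres:                  # metadata storage
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
  clickhouse:                # metrics and traces
    image: clickhouse/clickhouse-server:latest
```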
Telemetry and Privacy
Langtrace collects basic, non-sensitive telemetry data by default to help improve the platform. This includes project and team names, which helps the maintainers understand usage patterns and prioritize features. Users can opt out of telemetry by adjusting configuration settings.
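For a self-hosted deployment, opting out might look like the environment setting below. The variable name here is hypothetical; consult the Langtrace configuration documentation for the actual flag:

```shell
# Hypothetical variable name -- check the Langtrace docs for the real flag.
TELEMETRY_ENABLED=false
```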
Supported Integrations
Langtrace automatically captures traces from numerous vendors across LLMs, frameworks, and vector databases, including OpenAI, Azure OpenAI, Anthropic, and Cohere, ensuring broad compatibility with popular services.
System Architecture
The Langtrace architecture combines modern technologies: Next.js for the front-end services, PostgreSQL for metadata storage, and ClickHouse for efficient storage and querying of metrics and traces.
Community and Contributions
Langtrace encourages community involvement. Developers interested in contributing can fork the repository, join discussions on feature requests, or report issues. They can also join the Langtrace Slack workspace to collaborate and exchange ideas.
Security and License
Security vulnerabilities can be reported directly via email. Langtrace is licensed under the AGPL-3.0 License, reflecting its commitment to open and transparent development. The SDKs are licensed under the Apache 2.0 License, allowing flexible integration into a wide range of projects.
Frequently Asked Questions
Can I self-host Langtrace? Yes, detailed guidance is available for self-hosting Langtrace on-premises.
Is Langtrace Cloud free? Currently, Langtrace Cloud is offered free of charge while in its feedback phase, so users can evaluate the platform as development continues.
What is the technical foundation of Langtrace? Langtrace is built on modern web technologies, namely Next.js, PostgreSQL, and ClickHouse, and it benefits from community contributions spanning TypeScript, Python, and more.
Join the community, contribute to the project, and explore Langtrace's potential for enhancing observability in LLM applications.