# 🦜️🔗 LangChain Rust

## Introduction

LangChain Rust is a Rust implementation of the LangChain project, which focuses on building applications with large language models (LLMs) through composability. It brings modular, flexible LLM tooling to the Rust programming language.

## Features

LangChain Rust ships a comprehensive feature set for language processing and application development:
### LLMs

LangChain Rust supports several LLM providers:
- OpenAI: integration with OpenAI services for language model interactions.
- Azure OpenAI: access OpenAI models through Azure's platform.
- Ollama: run locally hosted open-weight models.
- Anthropic Claude: use Anthropic's Claude models for LLM operations.
### Embeddings

The project supports embeddings with:
- OpenAI: embeddings using OpenAI's models.
- Azure OpenAI: OpenAI embeddings served from Azure infrastructure.
- Ollama: embeddings through locally hosted models.
- Local FastEmbed: fast, fully local embedding generation.
- MistralAI: another robust option for embedding tasks.
### VectorStores

LangChain Rust supports multiple vector stores for storing and retrieving embedded data:
- OpenSearch
- Postgres
- Qdrant
- Sqlite
- SurrealDB
Each of these vector stores allows efficient management and retrieval of vectorized data.
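Whichever backend you choose, retrieval boils down to ranking stored embeddings by similarity to a query embedding. The toy vectors and document names below are illustrative only; a minimal, store-agnostic sketch of that ranking step in plain Rust:

```rust
// Cosine-similarity ranking: the core operation a vector store performs
// when retrieving documents for a query embedding.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Toy document embeddings (real ones come from an embedding model).
    let docs = [
        ("rust book", vec![0.9, 0.1, 0.0]),
        ("cooking blog", vec![0.0, 0.2, 0.9]),
        ("systems programming", vec![0.8, 0.3, 0.1]),
    ];
    let query = vec![1.0, 0.0, 0.0];

    // Rank documents by similarity to the query, highest first.
    let mut ranked: Vec<(&str, f32)> = docs
        .iter()
        .map(|(name, emb)| (*name, cosine_similarity(&query, emb)))
        .collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());

    println!("best match: {}", ranked[0].0); // prints "best match: rust book"
}
```

Production stores (OpenSearch, Qdrant, etc.) do the same comparison over approximate-nearest-neighbor indexes rather than a linear scan.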
### Chains

Chains are central to building complex LLM workflows. LangChain Rust offers:
- LLM Chain: basic prompt-to-model chaining.
- Conversational Chain: tailored for dialogue-based applications.
- Conversational Retriever Chain: retrieval-augmented conversation, with or without a vector store.
- Sequential Chain: runs chains in sequence, feeding outputs forward.
- Q&A Chain: question-and-answer workflows.
- SQL Chain: employs SQL functionalities in chained processes.
### Agents

Agents interact with tools and external systems:
- Chat Agent with Tools: facilitates tool-assisted conversations.
- OpenAI Compatible Tools Agent: integrates with OpenAI-style tool calling.
### Tools

LangChain Rust incorporates several tools for data interaction and processing:
- SerpApi/Google: web search integration.
- DuckDuckGo Search: privacy-conscious web searching.
- Wolfram/Math: complex mathematical computations.
- Command Line: run shell commands.
- Text2Speech: transform text into speech using OpenAI.
### Semantic Routing
- Static Routing: Predefined routing paths for data processing.
- Dynamic Routing: Adaptive paths based on semantic analysis.
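Static routing amounts to embedding a set of predefined routes and dispatching each query to the closest one. The route names and toy vectors below are hypothetical; a minimal sketch of that selection step in plain Rust:

```rust
// Nearest-route selection: the essence of static semantic routing.
// A query embedding is compared against fixed route embeddings and
// dispatched to the closest one.
fn dot(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

/// Return the route whose embedding has the highest dot-product
/// with the query embedding.
fn route<'a>(query: &[f32], routes: &[(&'a str, Vec<f32>)]) -> &'a str {
    routes
        .iter()
        .max_by(|a, b| dot(query, &a.1).partial_cmp(&dot(query, &b.1)).unwrap())
        .map(|(name, _)| *name)
        .unwrap()
}

fn main() {
    // Toy route embeddings (real ones come from an embedding model).
    let routes = vec![
        ("weather", vec![0.9, 0.1]),
        ("math", vec![0.1, 0.9]),
    ];
    // A query embedded close to the "math" route.
    let query = [0.2, 0.8];
    println!("dispatch to: {}", route(&query, &routes)); // prints "dispatch to: math"
}
```

Dynamic routing extends this by adapting the route set or thresholds at runtime instead of fixing them up front.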
### Document Loaders

LangChain Rust supports various document loaders, enabling easy import and processing of different file types:
- PDF: Load and process PDF documents.
- Pandoc: Import various formats including docx.
- HTML: Manage HTML content using specified URLs.
- CSV: Import CSV files with specified columns.
- Git commits: Load data from Git repository commits.
- Source code: Include source code files with specified suffixes.
## Installation

LangChain Rust depends on `serde_json`, which is central to its operation. Add the dependencies to your Rust project:

1. Add `serde_json`:

   ```bash
   cargo add serde_json
   ```

2. Add `langchain-rust`:

   - Simple install:

     ```bash
     cargo add langchain-rust
     ```

   - With optional features (e.g., SQLite, Postgres, Qdrant):

     ```bash
     cargo add langchain-rust --features [feature_name]
     ```

Replace `[feature_name]` with your desired feature, such as `sqlite-vss`, `sqlite-vec`, `postgres`, or `surrealdb`.
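Equivalently, the dependencies can be declared directly in `Cargo.toml`. The version numbers below are placeholders, and `tokio` is included because the async examples need a runtime; check crates.io for current releases:

```toml
[dependencies]
# Versions shown are illustrative; pin to the latest published releases.
serde_json = "1"
langchain-rust = { version = "4", features = ["postgres"] }
tokio = { version = "1", features = ["full"] }
```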
## Quick Start Example: Conversational Chain
Here's a simple example showcasing a basic setup of a conversational chain using LangChain Rust:
```rust
use langchain_rust::{
    chain::{Chain, LLMChainBuilder},
    fmt_message, fmt_placeholder, fmt_template,
    language_models::llm::LLM,
    llm::openai::{OpenAI, OpenAIModel},
    message_formatter,
    prompt::HumanMessagePromptTemplate,
    prompt_args,
    schemas::messages::Message,
    template_fstring,
};

#[tokio::main]
async fn main() {
    // Requires the OPENAI_API_KEY environment variable to be set.
    let open_ai = OpenAI::default().with_model(OpenAIModel::Gpt4oMini.to_string());

    // Direct, one-shot invocation of the model.
    let resp = open_ai.invoke("What is rust").await.unwrap();
    println!("{}", resp);

    // Build a prompt with a system message, a slot for chat history,
    // and a templated human message.
    let prompt = message_formatter![
        fmt_message!(Message::new_system_message(
            "You are a world-class technical documentation writer."
        )),
        fmt_placeholder!("history"),
        fmt_template!(HumanMessagePromptTemplate::new(template_fstring!(
            "{input}", "input"
        ))),
    ];

    let chain = LLMChainBuilder::new()
        .prompt(prompt)
        .llm(open_ai.clone())
        .build()
        .unwrap();

    match chain
        .invoke(prompt_args! {
            "input" => "Who is the writer of 20,000 Leagues Under the Sea, and what is my name?",
            "history" => vec![
                Message::new_human_message("My name is: Luis"),
                Message::new_ai_message("Hi Luis"),
            ],
        })
        .await
    {
        Ok(result) => println!("Result: {:?}", result),
        Err(e) => panic!("Error invoking LLMChain: {:?}", e),
    }
}
```
This example demonstrates how one can chain prompts with language models for advanced natural language processing workflows using LangChain Rust. The project presents a versatile platform for developing robust language-based applications.