OllamaSharp: Simplifying .NET Interactions with the Ollama API
OllamaSharp is a .NET library designed to simplify the process of interacting with the Ollama API, be it on a local machine or a remote server. This project provides users with straightforward .NET bindings, ensuring that developers can easily connect and work with all facets of the Ollama ecosystem.
Key Features
- Ease of Use: With OllamaSharp, users can start interacting with the Ollama API using minimal code. This makes the development process smooth and accessible, even for those who may not be deeply familiar with the API.
- Comprehensive API Coverage: The library supports all available endpoints of the Ollama API, covering functionality such as managing chats, handling embeddings, listing available models, and pulling and creating new models.
- Real-time Streaming: OllamaSharp streams responses directly into applications, giving developers immediate feedback and interaction.
- Progress Reporting: Developers receive real-time updates on long-running tasks, such as pulling models, so progress can be tracked effectively.
- Vision Models and Tools: The library supports advanced features such as vision models and tool (function) calling, broadening the scope of applications that can be built with OllamaSharp (see the sketch after this list).
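For example, vision-capable models can be queried by attaching images to a chat message. The following is a minimal, hypothetical sketch: the llava model name, the image path, and the SendAsync overload taking base64-encoded images are assumptions, so check the OllamaSharp documentation for the exact signature in your version.
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"), "llava"); // assumed vision-capable model
var chat = new Chat(ollama);
var imageBytes = await File.ReadAllBytesAsync("photo.jpg"); // hypothetical local image
var imageAsBase64 = Convert.ToBase64String(imageBytes);
// Assumption: SendAsync accepts a collection of base64-encoded images.
await foreach (var token in chat.SendAsync("What is shown in this image?", [imageAsBase64]))
    Console.Write(token);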
How to Use OllamaSharp
OllamaSharp wraps each Ollama API endpoint in asynchronous, streaming-capable methods, keeping applications efficient and responsive.
Getting Started
To begin using OllamaSharp, set up the client with the desired API URI and select a model for operations:
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
ollama.SelectedModel = "llama3.1:8b";
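With the client configured, a single streamed completion is a quick way to verify the setup. This sketch uses OllamaSharp's GenerateAsync streaming API; the prompt is illustrative.
// Stream a one-off completion from the selected model, token by token.
await foreach (var stream in ollama.GenerateAsync("How are you today?"))
    Console.Write(stream.Response);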
Listing Available Models
You can easily retrieve and list all models that are accessible locally with the following code:
var models = await ollama.ListLocalModelsAsync();
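The returned collection can then be inspected directly; the sketch below assumes each model entry exposes a Name property.
// Print each locally installed model by name (Name is assumed here).
foreach (var model in models)
    Console.WriteLine(model.Name);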
Model Management and Progress
Pull a model and monitor the progress using asynchronous operations for smooth updates:
await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
Console.WriteLine($"{status.Percent}% {status.Status}");
Interactive Chat Development
Build interactive chat applications with efficient, streaming responses:
var chat = new Chat(ollama);
while (true)
{
    var message = Console.ReadLine();
    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);
}
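The Chat class keeps the running message history itself, so every SendAsync call carries the prior turns along. To steer the conversation, a system prompt can be supplied; the sketch below assumes a Chat constructor overload that accepts one as a string.
// Assumption: the Chat constructor takes an optional system prompt.
var chat = new Chat(ollama, "You are a concise assistant. Answer in one sentence.");
await foreach (var answerToken in chat.SendAsync("What is OllamaSharp?"))
    Console.Write(answerToken);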
Integration with Microsoft.Extensions.AI
OllamaSharp is designed to integrate seamlessly with Microsoft's AI abstraction libraries, allowing developers to switch easily between AI providers such as ChatGPT, Claude, or Ollama's local models. It supports the IChatClient and IEmbeddingGenerator interfaces from Microsoft.Extensions.AI, ensuring compatibility across different AI services.
Example of creating a chat client:
private static IChatClient CreateChatClient(Arguments arguments)
{
    if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
        return new OllamaApiClient(arguments.Uri, arguments.Model);
    else
        return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model);
}
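Because both branches return an IChatClient, the calling code stays provider-agnostic. The usage sketch below assumes a recent Microsoft.Extensions.AI release in which IChatClient exposes a GetResponseAsync extension method (earlier previews named this CompleteAsync); the prompt and the arguments variable are illustrative.
// The caller neither knows nor cares whether Ollama or OpenAI produces the answer.
IChatClient client = CreateChatClient(arguments);
var response = await client.GetResponseAsync("Write a haiku about .NET");
Console.WriteLine(response.Text);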
Acknowledgements
The development of OllamaSharp was strongly inspired by the original Ollama project. The community and its contributors, especially individuals such as mili-tan, play a crucial role in keeping the library up to date with the latest API developments.
Overall, OllamaSharp is a powerful tool for developers looking to integrate advanced language models into their .NET applications efficiently and effectively.