Introduction to OpenLM
OpenLM is a versatile Python library designed to interface seamlessly with large language models (LLMs) from various providers, such as HuggingFace and Cohere, while remaining compatible with OpenAI's API. This lets users work with a wide array of LLMs without changing their existing OpenAI-based code.
Key Features
- OpenAI Compatibility: OpenLM accepts the same parameters used by OpenAI’s Completion API and returns responses in a similar format, ensuring a smooth transition for developers.
- Multi-Provider Support: Users can access models not just from OpenAI but also from providers like HuggingFace and Cohere, along with the possibility of adding custom implementations.
- Batch Processing: A single request can complete multiple prompts across multiple models, making side-by-side comparison and fan-out straightforward.
- Minimalistic Design: OpenLM calls each provider's inference API directly, keeping its footprint small and avoiding dependence on provider-specific software development kits (SDKs).
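To illustrate the batch-processing idea above, a multiplexed request effectively fans out into one call per (model, prompt) pair. The sketch below uses only the standard library to show that enumeration; the model and prompt strings are placeholders, not a real OpenLM call:

```python
from itertools import product

# Hypothetical inputs: models from two providers and two prompts.
models = ["ada", "huggingface.co/gpt2"]
prompts = ["The quick brown fox", "Who jumped over the lazy dog?"]

# One batched request expands into a call per (model, prompt) combination.
requests = [{"model": m, "prompt": p} for m, p in product(models, prompts)]

for r in requests:
    print(f"{r['model']!r} <- {r['prompt']!r}")
```

With two models and two prompts, the batch expands to four underlying calls, each routed to the appropriate provider.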
Installation
OpenLM can be installed with pip:
pip install openlm
Practical Examples
OpenLM provides several examples demonstrating its use as a drop-in OpenAI replacement, API-key configuration, custom-model integration, and completing multiple prompts across models in a single request. Highlighted examples include:
- Importing the library as OpenAI: as_openai.py
- Configuring API keys: api_keys.py
- Adding custom models or providers: custom_provider.py
- Completing multiple prompts across different models: multiplex.py
Notably, OpenLM supports the Completion endpoint, with plans to expand support to additional standardized endpoints in the future.
Example with Response
The usage of OpenLM is illustrated through a code example that combines multiple prompts and models:
import openlm
import json

completion = openlm.Completion.create(
    model=["ada", "huggingface.co/gpt2", "cohere.ai/command"],
    prompt=["The quick brown fox", "Who jumped over the lazy dog?"],
    max_tokens=15
)
print(json.dumps(completion, indent=4))
This outputs a structured JSON response detailing completion choices from different models, showcasing OpenLM’s capability to unify responses from diverse LLM sources.
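Because the unified response follows OpenAI's Completion format, its choices can be consumed with ordinary dictionary access. The sketch below assumes a response shaped like OpenAI's (a choices list carrying model and text fields); the sample data is illustrative, not actual model output:

```python
# Illustrative response in the OpenAI-style shape described above;
# the text values are made up, not real model completions.
response = {
    "choices": [
        {"model": "ada", "text": " jumps over the lazy dog."},
        {"model": "huggingface.co/gpt2", "text": " ran through the field."},
    ]
}

# Each choice records which model produced it, so results from
# different providers can be compared in one loop.
for choice in response["choices"]:
    print(f"{choice['model']}: {choice['text'].strip()}")
```

Tagging each choice with its originating model is what makes multi-provider results easy to compare in a single pass.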
Other Languages
For those interested in similar functionality in TypeScript, r2d4/llm.ts offers a comparable API that operates atop multiple language models.
Future Developments
OpenLM’s roadmap includes introducing a Streaming API and an Embeddings API, broadening the library's functionality and potential applications.
Contributing
The project embraces contributions from the community. Interested developers are encouraged to open issues or submit pull requests to help enhance the library.
Licensing
OpenLM is available under the MIT License, promoting open and collaborative development.
Overall, OpenLM serves as a convenient and powerful tool for developers seeking to integrate diverse language models without altering the basic structure of their existing OpenAI-centric implementations.