Azure OpenAI Proxy Project Overview
The Azure OpenAI Proxy is a tool that bridges the gap between OpenAI requests and Azure OpenAI requests. It serves as a backend component for various open-source ChatGPT web projects and functions as a straightforward OpenAI API proxy, which is especially useful where direct access to the OpenAI API is restricted by region.
Core Features
- Comprehensive API Support: The proxy is adept at handling all Azure OpenAI APIs, making it a versatile solution for developers.
- Model Versatility: It accommodates all Azure OpenAI models, including custom fine-tuned models, providing flexibility for specific needs.
- Custom Mapping Capabilities: Users can define custom mappings between Azure deployment names and OpenAI models, ensuring seamless integration.
- Dual Proxy Functionality: The tool supports both reverse proxy (for OpenAI API gateway) and forward proxy usage.
- Mocking Unsupported APIs: It can mock APIs not supported by Azure, ensuring uninterrupted service for features not native to Azure.
Supported APIs
Currently, the Azure OpenAI Proxy fully supports the following APIs:
- /v1/chat/completions
- /v1/completions
- /v1/embeddings
For other APIs not directly supported by Azure, the proxy returns a mocked response, thereby allowing continuity even in unsupported scenarios.
Recent Updates
The project remains dynamic with continual updates, such as:
- Added support for the /v1/models interface, resolving dependencies in some web projects.
- Added support for the OPTIONS interface, fixing cross-origin (CORS) preflight errors.
Usage Scenarios
Reverse Proxy Use Case:
To use the proxy as an OpenAI API gateway, configure environment variables such as the service listening address, the proxy mode, the Azure OpenAI endpoint, and custom model mappings. With these set, the proxy translates incoming OpenAI-style requests into Azure OpenAI requests.
A typical interaction is a curl request against the proxy endpoint, submitting the model name and user messages in the request body.
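As a sketch, once the proxy is running locally, such a request might look like the following (the listening address, port, and API key below are placeholders; adjust them to your deployment):

```shell
# Send an OpenAI-style chat completion request to the local proxy.
# The proxy translates it into an Azure OpenAI request behind the scenes.
# "localhost:8080" and the bearer token are placeholders.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <your-azure-openai-api-key>" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

The request body follows the standard OpenAI chat completions format, so existing OpenAI client code can point at the proxy without changes.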
Forward Proxy Application:
When used as a forward proxy, the Azure OpenAI Proxy acts as an intermediary for HTTP requests to the API. Because clients connect to the OpenAI API over HTTPS, the proxy itself must be exposed over HTTPS, for example behind Nginx. This setup matters for developers who need to keep communications secure while working around regional restrictions.
Deployment Instructions
Deployment is streamlined through Docker, enabling users to quickly launch the proxy with specified environmental configurations like the Azure OpenAI endpoint and custom model mappings. The proxy can then be accessed easily via a local URL.
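A Docker-based launch might look like the following sketch; the image name, port, and environment variable names shown here are assumptions, so verify them against the project's current README before use:

```shell
# Sketch of a Docker launch. The image name, port, and environment
# variable names below are assumptions -- check the project's README
# for the exact values used by your version.
docker run -d \
  -p 8080:8080 \
  -e AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/" \
  -e AZURE_OPENAI_MODEL_MAPPER="gpt-3.5-turbo=my-gpt35-deployment" \
  --name azure-openai-proxy \
  stulzq/azure-openai-proxy:latest

# The proxy should then be reachable at http://localhost:8080/v1/...
```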
Advanced Model Mapping
The proxy ships with pre-defined mapping rules that translate standard OpenAI model names into Azure-compatible deployment names. Custom mappings can be defined on top of these defaults to match specific deployment setups, making the proxy adaptable to bespoke solutions.
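As a minimal sketch of how such a mapping could be expressed and read, the snippet below assumes a single environment variable holding comma-separated "openai-model=azure-deployment" pairs (the variable name and format are illustrative assumptions, not a confirmed specification):

```shell
# Illustrative custom mapping: comma-separated "model=deployment" pairs.
# The variable name AZURE_OPENAI_MODEL_MAPPER and this pair format are
# assumptions for illustration; confirm them in the project's README.
AZURE_OPENAI_MODEL_MAPPER="gpt-3.5-turbo=my-gpt35-deployment,gpt-4=my-gpt4-deployment"

# Split the mapping into lines and print each pair as "model -> deployment".
echo "$AZURE_OPENAI_MODEL_MAPPER" | tr ',' '\n' | while IFS='=' read -r model deployment; do
  echo "$model -> $deployment"
done
# prints:
#   gpt-3.5-turbo -> my-gpt35-deployment
#   gpt-4 -> my-gpt4-deployment
```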
Licensing
The project is open-source under the MIT license, promoting widespread use and collaboration.
Community Engagement
The project maintains an active community interaction as evident from its star history, highlighting its growing popularity and user engagement over time.
In essence, the Azure OpenAI Proxy provides a robust intermediary service that extends the reach of OpenAI-compatible tooling into Azure's environment. With its flexible mapping features and straightforward deployment, it is a valuable tool for developers worldwide.