Amazing OpenAI API: An Overview
The Amazing OpenAI API is a versatile tool designed to simplify the integration of various model APIs into the OpenAI API format. This compact utility (roughly 10MB) lets users expose different models through the OpenAI format straight out of the box. It currently supports the following models:
- Azure OpenAI API (GPT 3.5/4), including GPT4 Vision (GPT4v)
- YI 34B API
- Google Gemini Pro
How to Get Started
Download and Installation
Users can download the executable suited to their operating system via the GitHub Release page. For those preferring to use Docker, the following command pulls the desired version of the image:
docker pull soulteary/amazing-openai-api:v0.7.0
Quick Start
AOA requires no configuration files. Instead, users adjust the application's behavior via environment variables: selecting the working model, setting required parameters, and configuring model aliases. By default, the program starts with the azure model.
To use the service, set the environment variable as shown:
AZURE_ENDPOINT=https://your-deployment-name.openai.azure.com/ ./aoa
For Docker users, the setup is similar:
docker run --rm -it -e AZURE_ENDPOINT=https://your-deployment-name.openai.azure.com/ -p 8080:8080 soulteary/amazing-openai-api:v0.7.0
Once the service is running, it exposes an OpenAI-compatible API at http://localhost:8080/v1. Users can test it with curl:
curl -v http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer 123" \
  -d '{
    "model": "gpt-4",
    "messages": [
      {
        "role": "system",
        "content": "You are a poetic assistant, skilled in explaining complex programming concepts with creative flair."
      },
      {
        "role": "user",
        "content": "Compose a poem that explains the concept of recursion in programming."
      }
    ]
  }'
The API can also be accessed using the OpenAI official SDK or any compatible open-source software. Here’s a Python example using the OpenAI client:
from openai import OpenAI

client = OpenAI(
    api_key="your-key-or-input-something-as-you-like",
    base_url="http://127.0.0.1:8080/v1",
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
)

print(chat_completion)
To keep the API key out of client applications, set an additional environment variable, AZURE_API_KEY=&lt;your-api-key&gt;. With this in place, OpenAI-compatible software does not need to include an API key in its requests.
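To see what this means on the wire, here is a minimal sketch (using Python's standard library, with the same hypothetical local AOA instance as above) that builds the request a client would send when the key is configured server-side:

```python
import json
import urllib.request

# Sketch: with AZURE_API_KEY configured on the AOA server, the client
# request needs no Authorization header. We build the request object
# (without sending it) to show exactly what would go over the wire.
body = json.dumps({
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Say this is a test"}],
}).encode("utf-8")

req = urllib.request.Request(
    "http://127.0.0.1:8080/v1/chat/completions",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.get_header("Authorization"))  # None: the key never leaves the server
```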
Model names can easily be mapped to preferred ones. For instance, the following syntax maps GPT 3.5/4 calls to yi-34b-chat:

gpt-3.5-turbo:yi-34b-chat,gpt-4:yi-34b-chat
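The alias string is a comma-separated list of requested:actual pairs. A short sketch of how such a mapping could be parsed and applied — an illustration of the format, not AOA's actual implementation:

```python
def parse_model_alias(spec: str) -> dict[str, str]:
    """Parse a comma-separated list of "requested:actual" model pairs."""
    mapping = {}
    for pair in spec.split(","):
        requested, actual = pair.split(":", 1)
        mapping[requested.strip()] = actual.strip()
    return mapping

aliases = parse_model_alias("gpt-3.5-turbo:yi-34b-chat,gpt-4:yi-34b-chat")

# A request for gpt-4 would be rewritten; unknown names pass through.
print(aliases.get("gpt-4", "gpt-4"))  # yi-34b-chat
```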
To switch between supported models, set AOA_TYPE=yi or AOA_TYPE=gemini; everything else stays the same.
Using Docker Compose for Quick Setup
The project includes docker compose example files for the supported models. Copy the appropriate example file from the example directory to docker-compose.yml, fill in the mandatory details, and run docker compose up to start the service. Example files include:
docker-compose.azure.yml
docker-compose.azure-gpt4v.yml
docker-compose.yi.yml
docker-compose.gemini.yml
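As a rough sketch of what such a file looks like after filling in your details (the field values below follow the environment variables described in this article; consult the shipped example files for the authoritative layout):

```yaml
version: "3"

services:
  amazing-openai-api:
    image: soulteary/amazing-openai-api:v0.7.0
    restart: always
    environment:
      AOA_TYPE: "azure"
      AZURE_ENDPOINT: "https://your-deployment-name.openai.azure.com/"
      AZURE_API_KEY: "<your-api-key>"
    ports:
      - "8080:8080"
```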
Detailed Usage Configuration
To change the working model, adjust AOA_TYPE; 'azure' is the default:
AOA_TYPE: "azure" # Options include "azure", "yi", "gemini"
The service port and address, which default to 8080 and 0.0.0.0, can also be configured:
AOA_PORT: 8080 # Default service port
AOA_HOST: "0.0.0.0" # Default service address
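The resolution of these variables amounts to a defaults-with-override lookup. A hedged sketch of that logic in Python (AOA itself is not written in Python, so this models the documented behavior rather than the real code):

```python
import os

# Documented defaults; each can be overridden via the environment.
DEFAULTS = {"AOA_TYPE": "azure", "AOA_PORT": "8080", "AOA_HOST": "0.0.0.0"}

def resolve(name, env=None):
    """Return the configured value, falling back to the documented default."""
    env = os.environ if env is None else env
    return env.get(name, DEFAULTS[name])

print(resolve("AOA_PORT", {}))                  # 8080 (default)
print(resolve("AOA_TYPE", {"AOA_TYPE": "yi"}))  # yi (override)
```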
Configuring Azure, YI, and Gemini Models
Azure
Expose Azure’s OpenAI service as a standard OpenAI endpoint with the following:
AZURE_ENDPOINT=https://<your-endpoint>.openai.azure.com/ AZURE_API_KEY=<your-api-key> AZURE_MODEL_ALIAS=gpt-3.5-turbo:gpt-35 ./amazing-openai-api
This setup uses AZURE_MODEL_ALIAS to substitute the model name in each request with the actual Azure deployment name, so off-the-shelf OpenAI-compatible software works without modification.
YI (01.AI)
Convert YI’s official API using:
AOA_TYPE=yi YI_API_KEY=<your-api-key> ./amazing-openai-api
Map generic software models as needed:
YI_MODEL_ALIAS=gpt-3.5-turbo:yi-34b-chat,gpt-4:yi-34b-chat
Gemini PRO
For Google Gemini API integration:
AOA_TYPE=gemini GEMINI_API_KEY=<your-api-key> ./amazing-openai-api
Utilize model mapping similarly:
GEMINI_MODEL_ALIAS=gpt-3.5-turbo:gemini-pro,gpt-4:gemini-pro
This setup keeps the API key secure by not requiring it in requests, though users can still include keys in headers if preferred. The Amazing OpenAI API thus offers an efficient way to use diverse model APIs seamlessly and securely behind an OpenAI-compatible interface, easing integration for developers.