Introduction to poe-openai-proxy
The poe-openai-proxy gives developers and enthusiasts a free, drop-in substitute for the OpenAI ChatGPT API. The project wraps `poe-api`, a reverse-engineered Python library, so that it can be used in the same way as the official OpenAI ChatGPT API. By running this proxy, users can point apps built for the OpenAI API at it and enjoy ChatGPT services without any direct cost.

The project leverages Poe.com, a free web application by Quora that provides access to GPT models. The `poe-api` library reverse-engineers access to poe.com from Python, enabling calls to these AI models. The poe-openai-proxy acts as a bridge, exposing that functionality as an HTTP API that replicates the behavior of the OpenAI API, making it compatible with applications already written against OpenAI services.
Installation Guide
To get started with poe-openai-proxy, follow these straightforward steps:

1. Clone the repository to your local environment:

   ```bash
   git clone https://github.com/juzeon/poe-openai-proxy.git
   cd poe-openai-proxy/
   ```

2. Install the necessary dependencies listed in `external/requirements.txt`:

   ```bash
   pip install -r external/requirements.txt
   ```

3. Create a configuration file named `config.toml` in the project's root directory by copying and editing the example configuration:

   ```bash
   cp config.example.toml config.toml
   vim config.toml
   ```

4. Start the Python backend required by `poe-api`:

   ```bash
   python external/api.py  # listens on port 5100
   ```

5. Finally, build and launch the Go backend, which completes the system:

   ```bash
   go build
   chmod +x poe-openai-proxy
   ./poe-openai-proxy
   ```
Docker Support
For those who prefer Docker, simply run `docker-compose up -d` after configuring `config.toml` as described above.
How It Works
The poe-openai-proxy offers the same usage as the official ChatGPT API: change the API base URL in your code from `https://api.openai.com` to `http://localhost:3700`, and your application's existing infrastructure works with the proxy unchanged.
Supported API routes include:
- `/models`
- `/chat/completions`
- `/v1/models`
- `/v1/chat/completions`
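As a sketch of what "swapping the base URL" looks like in practice, the following builds an OpenAI-style chat completion request aimed at the proxy using only the standard library. The port 3700 endpoint and the `model`/`messages`/`stream` fields come from this document; the helper function name is my own:

```python
import json
import urllib.request

# Local proxy base URL, used in place of https://api.openai.com
BASE_URL = "http://localhost:3700"

def build_chat_request(model: str, user_message: str,
                       stream: bool = False) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request targeting the proxy."""
    payload = {
        "model": model,                # mapped to a Poe bot via config.toml
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,              # streaming is supported by the proxy
    }
    return urllib.request.Request(
        url=f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-3.5-turbo", "Hello!")
print(req.full_url)  # http://localhost:3700/v1/chat/completions
```

Sending the request (for example with `urllib.request.urlopen(req)`) requires the proxy to be running locally as set up in the installation steps.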
Among the request parameters, the core attributes `model`, `messages`, and `stream` are fully supported. The example configuration file (`config.example.toml`) documents how model names are mapped to Poe bot nicknames, making integration straightforward. Any other parameters that do not align with the primary API are simply ignored.
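For illustration only, a model-name-to-bot-nickname mapping in `config.toml` might look something like the sketch below. The section name and bot nicknames here are assumptions, not taken from the project; consult `config.example.toml` for the authoritative keys and format:

```toml
# Hypothetical mapping sketch: OpenAI model names on the left,
# Poe bot nicknames on the right. Verify against config.example.toml.
[bot]
"gpt-3.5-turbo" = "chinchilla"
"gpt-4" = "beaver"
```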
Acknowledgments
This project builds upon the foundational work of `poe-api`, created by ading2210, whose reverse-engineering effort makes this kind of access to AI models possible.