Introduction to Smarty-GPT
Smarty-GPT is a tool that enhances large language models (LLMs) such as ChatGPT and GPT-4 by applying a system of prompts to user queries, handling the prompt engineering behind the scenes so that interaction with these models stays simple. By hiding this complexity, it makes advanced AI technologies more accessible and user-friendly.
Installation
Installing Smarty-GPT is simple and can be done by running the following command in your terminal:
sh install.sh
Additionally, a hosted Colab notebook is available for experimenting with Smarty-GPT without a local installation (see the Coding Examples section below).
Supported Models
Smarty-GPT integrates several powerful models, offering diverse functionalities to its users. These include:
- text-davinci-003
- Flan-T5, developed by Google
- ChatGPT and GPT-4, via paid API access
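To see which backends your installed version exposes, the short sketch below assumes that the Models object imported in the coding example further down is a standard Python Enum of supported backends; this is an assumption, and the member names it prints depend on your installed version.

from smartygpt import Models

# Assumes Models is an Enum listing the supported backends
# (an assumption for illustration, not a documented guarantee).
for model in Models:
    print(model.name)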
Contexts and Prompts
Smarty-GPT allows users to guide the output of language models through different types of prompts. These include:
- Manual Prompts: Pre-set prompts built into the system.
- Awesome Chat GPT Prompts: A wide range of prompts supported through a comprehensive dataset.
- Custom Prompts: Users can create their own prompts by adding them to a file (see the sketch after this list for the general idea).
- Awesome-GPT4 Prompts: Support for this prompt collection is currently under development.
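To make the idea of prompt wrapping concrete, the library-agnostic sketch below shows what conceptually happens when a context is prepended to a user question before it reaches a model. The prompt text and the combine_prompt function are purely illustrative and are not part of Smarty-GPT's actual prompt format or API.

# Illustrative only: the general idea of wrapping a question in a context.
DOCTOR_ADVICE_CONTEXT = (
    "You are a careful medical assistant. Answer the question below "
    "accurately and state the limits of your knowledge."
)

def combine_prompt(context: str, question: str) -> str:
    """Prepend a context to a user question before sending it to a model."""
    return f"{context}\n\nQuestion: {question}\nAnswer:"

print(combine_prompt(DOCTOR_ADVICE_CONTEXT, "Can Vitamin D cure COVID-19?"))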
Authentication
To use certain functionalities of Smarty-GPT, users need to create a configuration file named config.txt. This file should include the API key from OpenAI for authentication purposes, formatted as shown:
[auth]
api_key = xxxxxxxxxxxxxxxxxx
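As an optional convenience, the snippet below uses Python's standard configparser module to create a config.txt in this format and to confirm that the [auth] section and api_key entry are present. Smarty-GPT reads the file itself, so this is only a sanity check, not part of the library's API.

import configparser

# Write a config.txt with the [auth] section shown above.
config = configparser.ConfigParser()
config["auth"] = {"api_key": "xxxxxxxxxxxxxxxxxx"}  # replace with your real OpenAI key
with open("config.txt", "w") as f:
    config.write(f)

# Sanity check before passing the path to SmartyGPT.
check = configparser.ConfigParser()
check.read("config.txt")
assert check.has_option("auth", "api_key"), "config.txt is missing [auth] api_key"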
Coding Examples
Here's a brief example of how to use Smarty-GPT in a Python script:
from smartygpt import SmartyGPT, Models

if __name__ == "__main__":
    # Wrap the model with the "DoctorAdvice" context and point it at the config file
    s = SmartyGPT(prompt="DoctorAdvice", config_file="/home/user/config.txt")
    # Send a question through the wrapper and print the model's answer
    result = s.wrapper("Can Vitamin D cure COVID-19?")
    print(result)
For more detailed examples and additional functionality, users can visit the Colab page or consult the test folders in the repository.
Philosophy
Smarty-GPT's core mission is to bring together numerous resources such as models, prompts, and APIs related to LLMs in one unified platform. The approach eases the complexity for end-users, who typically do not have the technical background to craft intricate query contexts to adjust model outputs. The library takes on the intricacies, offering users a more direct and efficient experience.
Future Developments
The project is evolving, with more features and models set to be introduced. Contributors are welcome to submit pull requests, raise issues, or connect via email at [email protected] for collaboration.
Disclaimer
The software is offered "as is" and comes without warranties of any kind, including, but not limited to, implied warranties of merchantability and fitness for a particular purpose. Users assume all risks associated with using the software, and the developer is not liable for any damages that might occur.