Introduction to the Farfalle Project
Overview
Farfalle is an innovative, open-source, AI-powered search engine designed as a clone of Perplexity. The project is versatile: users can run local large language models (LLMs) such as Llama3, Gemma, Mistral, and Phi3 through Ollama, plug in custom LLMs via LiteLLM, or use cloud models such as Groq/Llama3 and OpenAI's GPT-4.
Features
Farfalle offers a robust set of features that enhance the search engine experience:
- Diverse Search Providers: Users can search with multiple providers, including Tavily, Searxng, Serper, and Bing, enabling comprehensive results.
- AI Interaction: It answers questions using cloud-based models like OpenAI/GPT-4o, OpenAI/GPT-3.5-turbo, and Groq/Llama3.
- Local and Custom AI Models: Users can interact with local models (Llama3, Mistral, Gemma, Phi3) and even implement custom LLMs through LiteLLM.
- Intelligent Search Agents: Farfalle's smart agent plans and executes searches for optimized results.
- Chat and Search History: Easily access past searches and interactions for a continuous user experience.
- Expert Search: Gain insights from AI-derived expertise on various topics.
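To illustrate how local models are typically addressed, the sketch below builds a request body for Ollama's /api/chat endpoint. This is a minimal sketch, not Farfalle's own code; sending the request requires a running Ollama server with the model pulled (for example via ollama pull llama3), and the model name and question are placeholders.

```python
import json

def build_chat_payload(model: str, question: str) -> dict:
    # Ollama serves a local HTTP API (default http://localhost:11434).
    # This only constructs the JSON body for POST /api/chat.
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
        "stream": False,  # request a single JSON response instead of a stream
    }

payload = build_chat_payload("llama3", "What is Farfalle?")
print(json.dumps(payload, indent=2))
```

The same message shape works for any model Ollama has pulled, which is what makes swapping between Llama3, Mistral, Gemma, and Phi3 straightforward.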
Technology Stack
The Farfalle project employs a modern technology stack to provide efficient and high-performance services:
- Frontend: Developed using Next.js, ensuring a dynamic and responsive user interface.
- Backend: Built with FastAPI, offering a robust and scalable server environment.
- Search APIs: Incorporates various search API providers like SearXNG, Tavily, Serper, and Bing for wide-ranging data.
- Logging and Control: Utilizes Logfire for logging purposes and Redis for rate limiting.
- UI Components: Shadcn/ui provides the components for Farfalle's interface.
Getting Started Locally
To launch Farfalle locally, certain prerequisites must be met:
- Install Docker: Necessary for containerization and deployment.
- Install Ollama: Required if running local models.
- Obtain API Keys: For optional providers like Tavily, Serper, OpenAI, Bing, and Groq.
Quick Start Guide
- Clone the project from GitHub.
- Set up the .env file with your API keys if required.
- Start the application using Docker; it launches at http://localhost:3000.
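A minimal .env sketch is shown below. The variable names are illustrative assumptions; check the repository's example env file for the exact keys the current release expects, and include only the providers you actually use.

```shell
# Illustrative only -- consult the project's example env file for
# the exact variable names. Set keys only for providers you use.
TAVILY_API_KEY=your-tavily-key
OPENAI_API_KEY=your-openai-key
GROQ_API_KEY=your-groq-key
```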
Deployment
Backend Deployment
- Utilize Render for quick and seamless backend deployment.
Frontend Deployment
- Deploy the frontend with Vercel, using the backend URL to connect the two components.
Using Farfalle as Your Search Engine
To make Farfalle your default search engine, add the URL pattern http://localhost:3000/?q=%s to your browser's custom search engine settings; the browser substitutes your query for %s, allowing seamless integration with your browsing experience.
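The %s placeholder is filled with your URL-encoded query. A quick sketch of what the browser does with the pattern:

```python
from urllib.parse import quote_plus

def search_url(query: str, pattern: str = "http://localhost:3000/?q=%s") -> str:
    # URL-encode the query (spaces become '+') and drop it into the pattern,
    # mirroring how a browser expands a custom search engine shortcut.
    return pattern.replace("%s", quote_plus(query))

print(search_url("open source search"))
# -> http://localhost:3000/?q=open+source+search
```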
In conclusion, Farfalle offers a comprehensive, user-friendly platform for both casual users and developers looking for customizable search solutions. Its robust feature set and modern tech stack provide a flexible and powerful search engine experience.