

Farfalle
Overview
Farfalle is an open-source AI-powered search engine that lets users run local large language models (LLMs) or use cloud models. It began as an open-source Perplexity clone and is built on a stack that includes a Next.js frontend, a FastAPI backend, and the Tavily search API. It also provides documentation and demo videos for customized setups to help users get started quickly.
Target Users
Farfalle is suited for developers and other technical users, particularly those interested in AI-driven search. It offers an open-source solution that lets individuals and organizations leverage large language models (LLMs) to enhance search capabilities, whether deployed locally or in the cloud.
Use Cases
Developers can utilize Farfalle to create personalized AI search engines.
Organizations can deploy Farfalle to provide internal search services, enhancing employee productivity.
Educational institutions can leverage Farfalle as a teaching tool to aid students in accessing information effectively.
Features
Supports local execution of large language models like llama3, gemma, and mistral.
Supports cloud models, such as Groq/Llama3 and OpenAI/GPT-4o.
Provides Docker deployment settings for user convenience.
Integrates with the Tavily search API, so no separate search infrastructure is needed.
Uses Logfire for logging.
Uses Redis for rate limiting.
Offers detailed documentation for customized settings.
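The Redis-based rate limiting mentioned above typically follows a fixed-window counter pattern (Redis INCR plus EXPIRE). The sketch below illustrates that pattern with an in-memory dict standing in for Redis; the class and parameter names are illustrative, not Farfalle's actual implementation.

```python
import time

class FixedWindowRateLimiter:
    """Fixed-window rate limiter illustrating the Redis INCR + EXPIRE
    pattern; a plain dict stands in for the Redis store here."""

    def __init__(self, limit: int, window_seconds: int):
        self.limit = limit
        self.window = window_seconds
        self.counters = {}  # key -> (window_start_time, request_count)

    def allow(self, key: str) -> bool:
        now = time.time()
        start, count = self.counters.get(key, (now, 0))
        if now - start >= self.window:
            # Window expired: reset the counter, as Redis would after EXPIRE fires.
            start, count = now, 0
        count += 1
        self.counters[key] = (start, count)
        return count <= self.limit

limiter = FixedWindowRateLimiter(limit=3, window_seconds=60)
results = [limiter.allow("client-1") for _ in range(5)]
# First three requests in the window pass; the rest are throttled.
```

In production the dict is replaced by Redis so that counters are shared across backend instances, which is the usual reason a search backend reaches for Redis here.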
How to Use
First, ensure that Docker and, if running local models, Ollama are installed.
Download one of the supported models: llama3, mistral, or gemma.
Launch the Ollama server: ollama serve.
Obtain API keys for Tavily, OpenAI (optional), and Groq (optional).
Clone the Farfalle repository to your local machine.
Add the necessary environment variables to the .env file.
Run the containers using Docker Compose.
Access http://localhost:3000 to view the application.
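The steps above can be sketched as a short shell session. The repository URL and the environment-variable names below are assumptions for illustration; check Farfalle's own README for the exact keys your version expects.

```shell
# 1. Pull a supported model and start the Ollama server (local models only).
ollama pull llama3
ollama serve

# 2. Clone the repository.
git clone https://github.com/rashadphz/farfalle.git
cd farfalle

# 3. Add API keys to the .env file (variable names are illustrative).
cat <<'EOF' > .env
TAVILY_API_KEY=your-tavily-key
OPENAI_API_KEY=your-openai-key
GROQ_API_KEY=your-groq-key
EOF

# 4. Build and start the containers, then open http://localhost:3000.
docker compose up -d
```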