llama-fs
Overview
LlamaFS is a self-organizing file manager that automatically renames and organizes files based on their content and well-known conventions (e.g., time). It supports various file types, including images processed by Moondream and audio files processed by Whisper. It operates in two modes: batch processing (batch mode) and an interactive daemon (monitoring mode). In monitoring mode, LlamaFS launches a daemon that watches directories, intercepts filesystem operations, and proactively learns and predicts your file renaming actions from the most recent editing context. It also offers a "stealth mode" switch that routes requests through a local Ollama instance instead of Groq, so file contents stay on your machine.
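As an illustration of the stealth-mode switch, the request below asks for a batch run with the privacy flag turned on so processing goes through a local Ollama instance. This is a minimal sketch: the /batch endpoint and the "incognito" field name are assumptions about the HTTP API rather than confirmed details.

    # Hypothetical batch request with stealth mode enabled (endpoint and field name assumed)
    curl -X POST http://127.0.0.1:8000/batch \
        -H "Content-Type: application/json" \
        -d '{"path": "/path/to/folder", "incognito": true}'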
Target Users
LlamaFS is ideal for users who need to manage and organize large numbers of files efficiently, particularly those who frequently handle multimedia files and care about privacy. It automates and streamlines file management, saving time and boosting productivity.
Use Cases
User A uses LlamaFS to automatically sort the files cluttering their desktop.
User B utilizes monitoring mode to have LlamaFS automatically manage new files in their Downloads folder.
User C enables stealth mode to protect their file operations from being recorded by third-party API providers.
Features
Automatic file renaming and organization
Support for multiple file types, including images and audio
Two operating modes: batch processing and monitoring mode
Intercepts file system operations in monitoring mode and predicts user actions
Stealth mode routes requests through a local Ollama instance to protect user privacy
User-friendly interface built with Electron
Fast file operation processing, with most operations taking less than 500 milliseconds in monitoring mode
How to Use
Clone the repository: git clone https://github.com/iyaja/llama-fs.git
Navigate to the project directory: cd llama-fs
Install dependencies: pip install -r requirements.txt
Start the server locally with FastAPI; by default it listens on port 8000.
Query the API with curl, passing the target directory path (for example, your Downloads folder) in the request; see the sketch after these steps.
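Putting the steps together, a minimal end-to-end sketch might look like the following. The server entry point (server.py), the /batch endpoint, and the JSON field names are assumptions based on the project's conventions rather than verified details.

    # Clone the repository and install dependencies
    git clone https://github.com/iyaja/llama-fs.git
    cd llama-fs
    pip install -r requirements.txt
    # Start the FastAPI server (listens on port 8000 by default; entry point assumed to be server.py)
    fastapi dev server.py
    # In another terminal, ask LlamaFS to organize the Downloads folder
    curl -X POST http://127.0.0.1:8000/batch \
        -H "Content-Type: application/json" \
        -d '{"path": "/path/to/Downloads", "incognito": false}'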