Self-Organising File Management Through LlamaFS

Implement LlamaFS, an AI-driven file management system, based on Llama3 and Groq.

Traditional file managers typically rely on manual categorisation using folders, tags and similar schemes, which can be time-consuming and inefficient, especially for users managing a large number of files. LlamaFS is a self-organising file manager built on the Llama 3 LLM that automates file management by intelligently sorting and organising files based on their content. It runs Llama 3 through Groq's inference API to perform the AI-driven analysis behind this organisation.

LlamaFS can significantly reduce the time and effort required to manage files. This article explores the implementation and working of LlamaFS.

Table of Contents

  1. Understanding LlamaFS and its Utility
  2. LlamaFS Modes of Operation 
  3. Running LlamaFS using FastAPI

Understanding LlamaFS and its Utility

LlamaFS is a self-organising file manager that uses Llama 3 at the backend to automate file categorisation, renaming and sorting based on file content and common naming conventions. It takes an AI-driven approach to understanding each file's nature and adjusts the organisation flexibly.

LlamaFS analyses the files and uses its built-in AI-driven intelligence to categorise and rename them. This can save significant time and effort compared to manual file organisation. The process involves content analysis and retrieving context for appropriate file organisation. 

The LlamaFS project is completely open-source, and the code is freely available for anyone to inspect, contribute to or modify at https://github.com/iyaja/llama-fs. It is a good option for users who want effortless, content-aware sorting: similar documents, photos or even code snippets can be grouped together, enabling faster access.

LlamaFS is built in Python around the Llama 3 LLM, with inference served through Groq's API for file content summarisation and structuring. Ollama can be used instead for fully local processing. The frontend is Electron, a cross-platform open-source framework, which provides a user-friendly UI.
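To make the summarise-then-organise flow concrete, here is a minimal sketch: build a prompt from a file summary and parse the model's JSON reply into a suggested destination path. The prompt wording, helper names and model identifiers are illustrative assumptions, not code taken from the repository.

```python
# Hypothetical sketch of LlamaFS's summarise-then-rename flow.
# Prompt text and helper names are illustrative, not the project's code.
import json

PROMPT_TEMPLATE = (
    "You will be given a summary of a file's contents.\n"
    "Respond with JSON: {{\"dst_path\": \"<category>/<descriptive-name>.<ext>\"}}.\n"
    "Summary: {summary}"
)

def build_prompt(summary: str) -> str:
    """Fill the categorisation prompt with a file summary."""
    return PROMPT_TEMPLATE.format(summary=summary)

def parse_suggestion(llm_response: str) -> str:
    """Extract the suggested destination path from the model's JSON reply."""
    return json.loads(llm_response)["dst_path"]

# With a real backend, build_prompt(...) would be sent to Llama 3, e.g. via
# litellm.completion(model="groq/llama3-70b-8192", messages=[...]) for Groq,
# or a local Ollama model for on-device inference.
print(parse_suggestion('{"dst_path": "invoices/2024-05-electricity.pdf"}'))
```

The structured JSON reply is what lets the file manager act on the model's suggestion programmatically rather than parsing free text.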

LlamaFS Modes of Operation

LlamaFS can be executed in two primary modes, Batch Mode and Watch Mode, along with a toggle-based Stealth (incognito) Mode.

Batch Mode

This mode lets the user target specific folders or groups of files for organisation. The user selects a directory and initiates a sorting pass, and LlamaFS analyses and categorises the files it contains.
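A batch pass can be pictured as a walk over the target directory that buckets every file under a proposed category. In the sketch below, categorise() is a hypothetical stand-in for the LLM call, grouping by file extension only; the real system derives categories from content.

```python
# Illustrative batch-mode sketch: scan a folder and propose a new layout
# from per-file categories. categorise() is a stand-in for the LLM step.
from collections import defaultdict
from pathlib import Path

def categorise(path: Path) -> str:
    """Stand-in for the LLM: bucket files by extension only."""
    return {".pdf": "documents", ".png": "images", ".py": "code"}.get(
        path.suffix, "misc")

def propose_layout(root: Path) -> dict[str, list[str]]:
    """Return {category: [file names]} for every file under root."""
    layout = defaultdict(list)
    for p in sorted(root.rglob("*")):
        if p.is_file():
            layout[categorise(p)].append(p.name)
    return dict(layout)
```

Separating the "propose a layout" step from the "apply the moves" step is what lets a user review the suggested structure before anything is touched on disk.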

Watch Mode

In watch mode, LlamaFS passively monitors the file system in the background, performing automatic analysis and categorisation without any manual intervention. This mode runs a background service (a daemon process) that intercepts file system operations and proactively learns to organise files based on context and recent edits.
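The core loop of such a daemon can be illustrated with a stdlib-only polling sketch; the actual project lists the watchdog package in its requirements for true event-driven monitoring. The new_files() helper below is a hypothetical illustration, not project code.

```python
# Watch-mode idea in miniature: detect files that were not present on the
# previous pass. A real daemon (e.g. via the watchdog package) would be
# event-driven instead of polling.
from pathlib import Path

def new_files(root: Path, seen: set[str]) -> list[str]:
    """Return paths under root not yet in `seen`, and record them as seen."""
    fresh = [str(p) for p in sorted(root.rglob("*"))
             if p.is_file() and str(p) not in seen]
    seen.update(fresh)
    return fresh

# A daemon would call new_files(...) on a schedule and hand each fresh
# path to the analyse-and-categorise step.
```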

Stealth Mode

LlamaFS can operate in stealth mode, enabling file processing without cloud uploads. This feature maintains user data privacy and prevents data leakage.
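One plausible way to wire such a toggle is to select the inference backend from the incognito flag, routing to a local Ollama model when privacy is required. The model strings below follow litellm's provider/model convention and are assumptions for illustration, not identifiers copied from the project.

```python
# Hypothetical backend selection for the incognito toggle. Model strings
# follow litellm's "provider/model" convention and are assumptions.
def pick_model(incognito: bool) -> str:
    """Return a litellm-style model string for the requested privacy level."""
    if incognito:
        return "ollama/llama3"        # runs entirely on the local machine
    return "groq/llama3-70b-8192"     # hosted inference via Groq's API

print(pick_model(True))  # → ollama/llama3
```

Keeping the choice behind a single function means the rest of the pipeline never needs to know whether inference is local or hosted.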

Running LlamaFS using FastAPI

To install and execute the LlamaFS project, we need to follow the steps listed below: 

Step 1 – Use the git clone command to create a copy of the target repository: 

git clone https://github.com/iyaja/llama-fs.git

Step 2 – Change the directory to llama-fs: 

cd llama-fs 

Step 3 – Install the required Python packages using the requirements.txt file:

pip install -r requirements.txt
The packages installed from requirements.txt and their roles:

  ollama: A platform for running LLMs locally
  chromadb: Open-source embedding database
  llama-index: Data framework for building and orchestrating LLM applications
  litellm: Unified interface for calling LLMs with consistent I/O formats
  groq: Python library providing convenient access to the Groq REST API
  docx2txt: Python utility for extracting text and images from .docx files
  colorama: Coloured terminal text and cursor positioning
  termcolor: ANSI colour formatting for terminal output
  click: Command Line Interface Creation Kit for writing CLIs
  asciitree: ASCII tree generator for folder structures and documentation
  fastapi: Web framework for building APIs with Python
  weave: Toolkit for developing GenAI applications
  agentops: Python SDK for AI agent evaluation and observability
  langchain: Framework for building LLM applications
  langchain_core: Base abstractions that power the LangChain ecosystem
  watchdog: Python API and shell utilities for monitoring file system events

Step 4 – Update the server.py and main.py files with a valid Groq API key, then use FastAPI to serve the application:

fastapi dev server.py

Query the running server using the curl command:

curl -X POST http://127.0.0.1:8000 \
  -H "Content-Type: application/json" \
  -d '{"path": "/Users/sachintripathi/Downloads/demo101", "instruction": "string", "incognito": false}'

Make sure to replace the path with the folder whose structure you want LlamaFS to understand and organise.
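The same request can also be issued from Python using only the standard library. The URL and JSON fields below mirror the curl example above; the path is the sample one from that command and should likewise be replaced with your own.

```python
# Build the same POST request as the curl example, using only the stdlib.
import json
from urllib.request import Request, urlopen

payload = {
    "path": "/Users/sachintripathi/Downloads/demo101",  # your folder here
    "instruction": "string",
    "incognito": False,
}
req = Request(
    "http://127.0.0.1:8000",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# with urlopen(req) as resp:          # requires the server to be running
#     print(resp.read().decode())
```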

Output

LlamaFS reads the files and produces a description of each based on its context; these descriptions drive the categorisation and reorganisation of the file structure. Users can review LlamaFS's suggested file structure and finalise the changes if needed.

Final Words

LlamaFS offers an intelligent, privacy-conscious approach to file management, potentially saving substantial time and effort in keeping a user's digital space organised and appropriately categorised. By combining local processing through Ollama, smart caching and Groq's fast inference API, it delivers a responsive and seamless experience.

References

  1. LlamaFS GitHub Repo


Sachin Tripathi

Sachin Tripathi is the Manager of AI Research at AIM, with over a decade of experience in AI and Machine Learning. An expert in generative AI and large language models (LLMs), Sachin excels in education, delivering effective training programs. His expertise also includes programming, big data analytics, and cybersecurity. Known for simplifying complex concepts, Sachin is a leading figure in AI education and professional development.
