System Administrator With Agentic AI
AI-powered file system administrator built with Google ADK, FastMCP & Ollama (Llama 3.2). Manage files through natural language β fully local, containerized with Docker, and secured with JWT auth + AI guardrails.
Ask AI about System Administrator With Agentic AI
Powered by Claude Β· Grounded in docs
I know everything about System Administrator With Agentic AI. Ask me about installation, configuration, usage, or troubleshooting.
0/500
Reviews
Documentation
π€ AI System Administrator
An intelligent file system administrator powered by a local LLM, built with Google ADK, FastMCP, and Ollama β fully containerized and secured.

π Overview
AI System Administrator is an agentic AI project that acts as an intelligent file system manager. You interact with it through natural language β ask it to read files, list directories, create folders, or check file contents β and it will carry out the operations autonomously using MCP (Model Context Protocol) tools.
The system runs entirely locally using Ollama with Llama 3.2, meaning no data is sent to external APIs. The full stack is containerized with Docker and orchestrated via Docker Compose, making it portable and easy to deploy on any machine.
β¨ Features
- π¬ Natural language interface β talk to the agent like a human, it figures out what to do
- π Full file system management β read, write, create, delete files and directories
- π File metadata β retrieve size, timestamps, and file type
- π Multi-layer security β protected files, blocked tools, rate limiting, and JWT authentication
- π³ Fully Dockerized β three isolated containers, launched with a single command
- π JWT-authenticated MCP server β only agents with valid RSA-signed tokens can access tools
- π‘οΈ AI Guardrails β prompt engineering to prevent the agent from leaking sensitive information
ποΈ Architecture
The system is composed of three Docker containers communicating over an internal bridge network:
βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
β Docker Network β
β β
β βββββββββββββββββ βββββββββββββββββ β
β β Ollama β β MCP Server β β
β β (Llama 3.2) βββββββ (FastMCP) β β
β β port: 11434 β β port: 8001 β β
β βββββββββββββββββ βββββββββ¬ββββββββ β
β β JWT Auth β
β βββββββββΌββββββββ β
β β Agent β β
β β (Google ADK) β β
β β port: 8000 ββββ User β
β βββββββββββββββββ β
βββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
| Container | Technology | Role |
|---|---|---|
ollama | ollama/ollama:latest | Runs the Llama 3.2 language model locally |
mcp-server | Python + FastMCP | Exposes file system tools via HTTP streaming |
agent | Python + Google ADK | Processes natural language and calls MCP tools |
π οΈ MCP Tools
The MCP server exposes the following tools to the agent:
| Tool | Description |
|---|---|
get_file_content(file_path) | Read the complete contents of a file |
list_directory(dir_path) | List all files and folders in a directory |
get_file_info(file_path) | Get metadata: size, creation time, modification time, type |
write_file_content(file_path, content) | Create or overwrite a file with given content |
create_directory(dir_path) | Create a new directory (including nested) |
delete_path(path) | Delete a file or empty directory |
check_file_content(file_path, expected) | Verify if a file's content matches a given string (with rate limiting) |
get_root_path() | Return the absolute path of the agent's root directory |
get_help() | Display all available tools and their descriptions |
All file paths are relative to the agent's root directory. The
sanitize_path()function prevents any path traversal attacks (no..or.escapes).
π Security
Security is implemented across three layers:
1. Server Level (MCP)
- JWT Authentication β the MCP server validates RSA-signed Bearer tokens; unauthenticated requests are rejected
- Path sanitization β
sanitize_path()blocks any attempt to escape the root directory - Protected files β
flag.txtis shielded fromget_file_content,write_file_content,delete_path, andget_file_info - Rate limiting β
check_file_contentlimits repeated attempts on protected files
2. Agent Level (Callback)
before_tool_callbackβ intercepts every tool call before execution; if a blocked tool is called on a protected file, access is denied immediately and the LLM never sees the content
3. Instruction Level (AI Guardrails)
- The agent's system prompt explicitly instructs it to never reveal the contents of protected files, never expose internal tools like
get_security_config, and respond evasively if asked about system internals
π Project Structure
project-root/
βββ agent/
β βββ mcp_auth/
β β βββ private.pem # RSA private key (agent signs tokens)
β β βββ public.pem # RSA public key
β βββ multi_tool_agent/
β β βββ __init__.py
β β βββ agent.py # Agent definition & security callback
β βββ Dockerfile
β βββ requirements.txt
β
βββ server/
β βββ mcp_auth/
β β βββ public.pem # RSA public key (server validates tokens)
β βββ server.py # MCP server with all tools
β βββ Dockerfile
β βββ requirements.txt
β
βββ docker-compose.yml
π Getting Started
1. Clone the repository
git clone https://github.com/dansimina/System-Administrator-With-Agentic-AI
cd System-Administrator-With-Agentic-AI
2. Generate RSA key pair
mkdir -p agent/mcp_auth server/mcp_auth
# Generate private key
openssl genrsa -out agent/mcp_auth/private.pem 2048
# Extract public key
openssl rsa -in agent/mcp_auth/private.pem -pubout -out agent/mcp_auth/public.pem
# Copy public key to server
cp agent/mcp_auth/public.pem server/mcp_auth/public.pem
3. Start the system
docker-compose up --build
4. Pull the Llama model (first run only)
docker exec -it <ollama-container-id> ollama pull llama3.2:latest
5. Open the chat interface
Visit http://localhost:8000 in your browser and start chatting with the agent!
π§° Tech Stack
| Component | Technology |
|---|---|
| LLM | Ollama + Llama 3.2 |
| Agent Framework | Google ADK |
| MCP Server | FastMCP |
| Authentication | RSA + JWT |
| Containerization | Docker + Docker Compose |
| Language | Python 3.13 |
π License
This project was developed as part of the Operating Systems Administration course, 2025.
