This adds a completely separate frontend based on React, because I found that code generation works better with React once the application gets bigger. In particular, it was getting very hard to move past adding connectors and actions. The idea is to replace the standard UI with this once it has been tested, but for now it is available at /app in addition to the original at /.
Signed-off-by: Richard Palethorpe <io@richiejp.com>
LOCAL AGENT
AI that stays where it belongs — on your machine.
TAKE BACK CONTROL
LocalAgent is an AI platform that runs 100% on your hardware. No clouds. No data sharing. No compromises.
Built for those who treat privacy as non-negotiable, LocalAgent lets you deploy intelligent agents that never phone home. Your data stays where you put it — period. Tired of agent wrappers around cloud APIs? Me too.
WHY LOCALAGENT?
- ✓ TRUE PRIVACY — Everything runs on your hardware, nothing leaves your machine
- ✓ MODEL FREEDOM — Works with local LLM formats (GGUF, GGML) you already have
- ✓ BUILD YOUR WAY — Extensible architecture for custom agents with specialized skills
- ✓ SLICK INTERFACE — Clean web UI for hassle-free agent interactions
- ✓ DEV-FRIENDLY — Comprehensive REST API for seamless integration
- ✓ PLAYS WELL WITH OTHERS — Optimized for LocalAI
- ✓ RUN ANYWHERE — Linux, macOS, Windows — we've got you covered
THE LOCAL ECOSYSTEM
LocalAgent is part of a trinity of tools designed to keep AI under your control:
- LocalAI — Run LLMs on your hardware
- LocalRAG — Local Retrieval-Augmented Generation
- LocalAgent — Deploy AI agents that respect your privacy
Features
Powerful WebUI
Connectors ready-to-go
QUICK START
One-Command Docker Setup
The fastest way to get everything running — LocalRAG, LocalAI, and LocalAgent pre-configured:
docker-compose up
No API keys. No cloud subscriptions. No external dependencies. Just AI that works.
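Once the containers are up, a quick smoke test is to hit the agents endpoint (this assumes the compose file exposes the WebUI on the same port 3000 used by the standalone binary):
curl http://localhost:3000/agents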
Manual Launch
Run the binary and you're live:
./localagent
Access your agents at http://localhost:3000
Environment Configuration
| Variable | What It Does |
|---|---|
| LOCALAGENT_MODEL | Your go-to model |
| LOCALAGENT_MULTIMODAL_MODEL | Optional model for multimodal capabilities |
| LOCALAGENT_LLM_API_URL | OpenAI-compatible API server URL |
| LOCALAGENT_API_KEY | API authentication |
| LOCALAGENT_TIMEOUT | Request timeout settings |
| LOCALAGENT_STATE_DIR | Where state gets stored |
| LOCALAGENT_LOCALRAG_URL | LocalRAG connection |
| LOCALAGENT_ENABLE_CONVERSATIONS_LOGGING | Toggle conversation logs |
| LOCALAGENT_API_KEYS | A comma-separated list of API keys used for authentication |
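Since these are ordinary environment variables, you can export them before launching the binary. The values below are illustrative placeholders, not defaults:
# Point LocalAgent at an OpenAI-compatible server (e.g. LocalAI) and pick a model
export LOCALAGENT_LLM_API_URL="http://localhost:8080"
export LOCALAGENT_MODEL="your-model-name"
export LOCALAGENT_STATE_DIR="./state"
./localagent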
INSTALLATION OPTIONS
Pre-Built Binaries
Download ready-to-run binaries from the Releases page.
Source Build
Requirements:
- Go 1.20+
- Git
- Bun 1.2+
# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent
# Build it
cd webui/react-ui && bun i && bun run build
cd ../..
go build -o localagent
# Run it
./localagent
CONNECTORS
Link your agents to the services you already use. Configuration examples below.
GitHub Issues
{
"token": "YOUR_PAT_TOKEN",
"repository": "repo-to-monitor",
"owner": "repo-owner",
"botUserName": "bot-username"
}
Discord
After creating your Discord bot:
{
"token": "Bot YOUR_DISCORD_TOKEN",
"defaultChannel": "OPTIONAL_CHANNEL_ID"
}
Don't forget to enable "Message Content Intent" in the Bot tab settings!
Slack
Use the included slack.yaml manifest to create your app, then configure:
{
"botToken": "xoxb-your-bot-token",
"appToken": "xapp-your-app-token"
}
- Create the bot OAuth token from "OAuth & Permissions" -> "OAuth Tokens for Your Workspace"
- Create an app-level token from "Basic Information" -> "App-Level Tokens" (scopes: connections:write, authorizations:read)
Telegram
Get a token from @botfather, then:
{
"token": "your-bot-father-token"
}
IRC
Connect to IRC networks:
{
"server": "irc.example.com",
"port": "6667",
"nickname": "LocalAgentBot",
"channel": "#yourchannel",
"alwaysReply": "false"
}
REST API
Agent Management
| Endpoint | Method | Description | Example |
|---|---|---|---|
| /agents | GET | List all available agents | Example |
| /status/:name | GET | View agent status history | Example |
| /create | POST | Create a new agent | Example |
| /delete/:name | DELETE | Remove an agent | Example |
| /pause/:name | PUT | Pause agent activities | Example |
| /start/:name | PUT | Resume a paused agent | Example |
| /settings/export/:name | GET | Export agent config | Example |
| /settings/import | POST | Import agent config | Example |
| /api/agent/:name/config | GET | Get agent configuration | |
| /api/agent/:name/config | PUT | Update agent configuration | |
Chat Interactions
| Endpoint | Method | Description | Example |
|---|---|---|---|
| /chat/:name | POST | Send message & get response | Example |
| /notify/:name | GET | Send notification to agent | Example |
| /sse/:name | GET | Real-time agent event stream | Example |
| /v1/responses | POST | Send message & get response | OpenAI's Responses API |
Curl Examples
Get All Agents
curl -X GET "http://localhost:3000/agents"
Get Agent Status
curl -X GET "http://localhost:3000/status/my-agent"
Create Agent
curl -X POST "http://localhost:3000/create" \
-H "Content-Type: application/json" \
-d '{
"name": "my-agent",
"model": "gpt-4",
"system_prompt": "You are an AI assistant.",
"enable_kb": true,
"enable_reasoning": true
}'
Delete Agent
curl -X DELETE "http://localhost:3000/delete/my-agent"
Pause Agent
curl -X PUT "http://localhost:3000/pause/my-agent"
Start Agent
curl -X PUT "http://localhost:3000/start/my-agent"
Export Agent
curl -X GET "http://localhost:3000/settings/export/my-agent" --output my-agent.json
Import Agent
curl -X POST "http://localhost:3000/settings/import" \
-F "file=@/path/to/my-agent.json"
Send Message
curl -X POST "http://localhost:3000/chat/my-agent" \
-H "Content-Type: application/json" \
-d '{"message": "Hello, how are you today?"}'
Notify Agent
curl -X GET "http://localhost:3000/notify/my-agent" \
-d "message=Important notification"
Agent SSE Stream
curl -N -X GET "http://localhost:3000/sse/my-agent"
Note: For proper SSE handling, you should use a client that supports SSE natively.
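OpenAI Responses Request
The /v1/responses endpoint from the Chat Interactions table accepts OpenAI Responses-style requests. The body below is a sketch that assumes the agent is addressed via the model field, as is typical for OpenAI-compatible endpoints; check the actual schema against OpenAI's Responses API.
curl -X POST "http://localhost:3000/v1/responses" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-agent",
    "input": "Hello, how are you today?"
  }'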
Agent Configuration Reference
{
"name": "my-agent",
"model": "gpt-4",
"multimodal_model": "gpt-4-vision",
"hud": true,
"standalone_job": false,
"random_identity": false,
"initiate_conversations": true,
"enable_planning": true,
"identity_guidance": "You are a helpful assistant.",
"periodic_runs": "0 * * * *",
"permanent_goal": "Help users with their questions.",
"enable_kb": true,
"enable_reasoning": true,
"kb_results": 5,
"can_stop_itself": false,
"system_prompt": "You are an AI assistant.",
"long_term_memory": true,
"summary_long_term_memory": false
}
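The same fields can be used with the /api/agent/:name/config endpoints listed under Agent Management. A sketch, assuming the PUT body follows the reference above (whether partial updates are accepted is not specified here):
# Fetch the current configuration
curl -X GET "http://localhost:3000/api/agent/my-agent/config"
# Update it (body shape assumed to match the configuration reference)
curl -X PUT "http://localhost:3000/api/agent/my-agent/config" \
  -H "Content-Type: application/json" \
  -d '{
    "system_prompt": "You are an AI assistant.",
    "enable_kb": true,
    "enable_reasoning": true
  }'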
LICENSE
MIT License — See the LICENSE file for details.
LOCAL PROCESSING. GLOBAL THINKING.
Made with ❤️ by mudler