LOCAL AGENT
AI that stays where it belongs — on your machine.
TAKE BACK CONTROL
LocalAgent is an AI platform that runs 100% on your hardware. No clouds. No data sharing. No compromises.
Built for those who treat privacy as non-negotiable, LocalAgent lets you deploy intelligent agents that never phone home. Your data stays where you put it — period. Tired of agent wrappers around cloud APIs? Me too.
WHY LOCALAGENT?
- ✓ TRUE PRIVACY — Everything runs on your hardware, nothing leaves your machine
- ✓ MODEL FREEDOM — Works with local LLM formats (GGUF, GGML) you already have
- ✓ BUILD YOUR WAY — Extensible architecture for custom agents with specialized skills
- ✓ SLICK INTERFACE — Clean web UI for hassle-free agent interactions
- ✓ DEV-FRIENDLY — Comprehensive REST API for seamless integration
- ✓ PLAYS WELL WITH OTHERS — Optimized for LocalAI
- ✓ RUN ANYWHERE — Linux, macOS, Windows — we've got you covered
THE LOCAL ECOSYSTEM
LocalAgent is part of a trinity of tools designed to keep AI under your control:
- LocalAI — Run LLMs on your hardware
- LocalRAG — Local Retrieval-Augmented Generation
- LocalAgent — Deploy AI agents that respect your privacy
Features
Powerful WebUI
Connectors ready-to-go
QUICK START
One-Command Docker Setup
The fastest way to get everything running — LocalRAG, LocalAI, and LocalAgent pre-configured:
docker-compose up
No API keys. No cloud subscriptions. No external dependencies. Just AI that works.
Manual Launch
Run the binary and you're live:
./localagent
Access your agents at http://localhost:3000
Environment Configuration
| Variable | What It Does |
|---|---|
| LOCALAGENT_MODEL | Your go-to model |
| LOCALAGENT_MULTIMODAL_MODEL | Optional model for multimodal capabilities |
| LOCALAGENT_LLM_API_URL | OpenAI-compatible API server URL |
| LOCALAGENT_LLM_API_KEY | API authentication |
| LOCALAGENT_TIMEOUT | Request timeout settings |
| LOCALAGENT_STATE_DIR | Where state gets stored |
| LOCALAGENT_LOCALRAG_URL | LocalRAG connection |
| LOCALAGENT_ENABLE_CONVERSATIONS_LOGGING | Toggle conversation logs |
| LOCALAGENT_API_KEYS | A comma-separated list of API keys used for authentication |
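For example, to point LocalAgent at a LocalAI (or any OpenAI-compatible) server and keep state in a custom directory, you can export a few of these variables before launching the binary. The variable names come from the table above; the values (model name, URL, directory) are placeholders, not defaults:
# Placeholder values; adjust to your setup
export LOCALAGENT_MODEL="qwen2.5-7b-instruct"
export LOCALAGENT_LLM_API_URL="http://localhost:8080"
export LOCALAGENT_STATE_DIR="$HOME/.localagent"
./localagent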
INSTALLATION OPTIONS
Pre-Built Binaries
Download ready-to-run binaries from the Releases page.
Source Build
Requirements:
- Go 1.20+
- Git
- Bun 1.2+
# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent
# Build it
cd webui/react-ui && bun i && bun run build
cd ../..
go build -o localagent
# Run it
./localagent
Development
The development workflow is similar to the source build, but with additional steps for hot reloading of the frontend:
# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent
# Install dependencies and start frontend development server
cd webui/react-ui && bun i && bun run dev
Then, in a separate terminal:
# Start development server
cd ../.. && go run main.go
Note: see webui/react-ui/.vite.config.js for env vars that can be used to configure the backend URL
CONNECTORS
Link your agents to the services you already use. Configuration examples below.
GitHub Issues
{
"token": "YOUR_PAT_TOKEN",
"repository": "repo-to-monitor",
"owner": "repo-owner",
"botUserName": "bot-username"
}
Discord
After creating your Discord bot:
{
"token": "Bot YOUR_DISCORD_TOKEN",
"defaultChannel": "OPTIONAL_CHANNEL_ID"
}
Don't forget to enable "Message Content Intent" in the Bot tab settings!
Slack
Use the included slack.yaml manifest to create your app, then configure:
{
"botToken": "xoxb-your-bot-token",
"appToken": "xapp-your-app-token"
}
- Create the bot token under "OAuth & Permissions" -> "OAuth Tokens for Your Workspace"
- Create an app-level token under "Basic Information" -> "App-Level Tokens" (scopes: connections:write, authorizations:read)
Telegram
Get a token from @botfather, then:
{
"token": "your-bot-father-token"
}
IRC
Connect to IRC networks:
{
"server": "irc.example.com",
"port": "6667",
"nickname": "LocalAgentBot",
"channel": "#yourchannel",
"alwaysReply": "false"
}
REST API
Agent Management
| Endpoint | Method | Description | Example |
|---|---|---|---|
| /api/agents | GET | List all available agents | Example |
| /api/agent/:name/status | GET | View agent status history | Example |
| /api/agent/create | POST | Create a new agent | Example |
| /api/agent/:name | DELETE | Remove an agent | Example |
| /api/agent/:name/pause | PUT | Pause agent activities | Example |
| /api/agent/:name/start | PUT | Resume a paused agent | Example |
| /api/agent/:name/config | GET | Get agent configuration | |
| /api/agent/:name/config | PUT | Update agent configuration | |
| /api/meta/agent/config | GET | Get agent configuration metadata | |
| /settings/export/:name | GET | Export agent config | Example |
| /settings/import | POST | Import agent config | Example |
Actions and Groups
| Endpoint | Method | Description | Example |
|---|---|---|---|
| /api/actions | GET | List available actions | |
| /api/action/:name/run | POST | Execute an action | |
| /api/agent/group/generateProfiles | POST | Generate group profiles | |
| /api/agent/group/create | POST | Create a new agent group | |
Chat Interactions
| Endpoint | Method | Description | Example |
|---|---|---|---|
| /api/chat/:name | POST | Send message & get response | Example |
| /api/notify/:name | POST | Send notification to agent | Example |
| /api/sse/:name | GET | Real-time agent event stream | Example |
| /v1/responses | POST | Send message & get response | OpenAI's Responses |
Curl Examples
Get All Agents
curl -X GET "http://localhost:3000/api/agents"
Get Agent Status
curl -X GET "http://localhost:3000/api/agent/my-agent/status"
Create Agent
curl -X POST "http://localhost:3000/api/agent/create" \
-H "Content-Type: application/json" \
-d '{
"name": "my-agent",
"model": "gpt-4",
"system_prompt": "You are an AI assistant.",
"enable_kb": true,
"enable_reasoning": true
}'
Delete Agent
curl -X DELETE "http://localhost:3000/api/agent/my-agent"
Pause Agent
curl -X PUT "http://localhost:3000/api/agent/my-agent/pause"
Start Agent
curl -X PUT "http://localhost:3000/api/agent/my-agent/start"
Get Agent Configuration
curl -X GET "http://localhost:3000/api/agent/my-agent/config"
Update Agent Configuration
curl -X PUT "http://localhost:3000/api/agent/my-agent/config" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-4",
"system_prompt": "You are an AI assistant."
}'
Export Agent
curl -X GET "http://localhost:3000/settings/export/my-agent" --output my-agent.json
Import Agent
curl -X POST "http://localhost:3000/settings/import" \
-F "file=@/path/to/my-agent.json"
Send Message
curl -X POST "http://localhost:3000/api/chat/my-agent" \
-H "Content-Type: application/json" \
-d '{"message": "Hello, how are you today?"}'
Notify Agent
curl -X POST "http://localhost:3000/api/notify/my-agent" \
-H "Content-Type: application/json" \
-d '{"message": "Important notification"}'
Agent SSE Stream
curl -N -X GET "http://localhost:3000/api/sse/my-agent"
Note: For proper SSE handling, you should use a client that supports SSE natively.
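Responses API (Sketch)
The /v1/responses endpoint listed above follows the shape of OpenAI's Responses API. Below is a minimal sketch of a request, assuming the standard "model" and "input" fields; whether "model" should name one of your agents (as in the other endpoints) or the underlying LLM is an assumption here:
curl -X POST "http://localhost:3000/v1/responses" \
-H "Content-Type: application/json" \
-d '{
"model": "my-agent",
"input": "Hello, how are you today?"
}'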
Agent Configuration Reference
The agent configuration defines how an agent behaves and what capabilities it has. You can view the available configuration options and their descriptions by using the metadata endpoint:
curl -X GET "http://localhost:3000/api/meta/agent/config"
This will return a JSON object containing all available configuration fields, their types, and descriptions.
Here's an example of the agent configuration structure:
{
"name": "my-agent",
"model": "gpt-4",
"multimodal_model": "gpt-4-vision",
"hud": true,
"standalone_job": false,
"random_identity": false,
"initiate_conversations": true,
"enable_planning": true,
"identity_guidance": "You are a helpful assistant.",
"periodic_runs": "0 * * * *",
"permanent_goal": "Help users with their questions.",
"enable_kb": true,
"enable_reasoning": true,
"kb_results": 5,
"can_stop_itself": false,
"system_prompt": "You are an AI assistant.",
"long_term_memory": true,
"summary_long_term_memory": false
}
LICENSE
MIT License — See the LICENSE file for details.
LOCAL PROCESSING. GLOBAL THINKING.
Made with ❤️ by mudler