
LocalAgent Logo

LOCAL AGENT

AI that stays where it belongs — on your machine.

Go Report Card | License: MIT | GitHub stars | GitHub issues

TAKE BACK CONTROL

LocalAgent is an AI platform that runs 100% on your hardware. No clouds. No data sharing. No compromises.

Built for those who treat privacy as non-negotiable, LocalAgent lets you deploy intelligent agents that never phone home. Your data stays where you put it — period. Tired of agent wrappers around cloud APIs? Me too.

WHY LOCALAGENT?

  • ✓ TRUE PRIVACY — Everything runs on your hardware, nothing leaves your machine
  • ✓ MODEL FREEDOM — Works with local LLM formats (GGUF, GGML) you already have
  • ✓ BUILD YOUR WAY — Extensible architecture for custom agents with specialized skills
  • ✓ SLICK INTERFACE — Clean web UI for hassle-free agent interactions
  • ✓ DEV-FRIENDLY — Comprehensive REST API for seamless integration
  • ✓ PLAYS WELL WITH OTHERS — Optimized for LocalAI
  • ✓ RUN ANYWHERE — Linux, macOS, Windows — we've got you covered

THE LOCAL ECOSYSTEM

LocalAgent is part of a trinity of tools designed to keep AI under your control:

  • LocalAI — Run LLMs on your hardware
  • LocalRAG — Local Retrieval-Augmented Generation
  • LocalAgent — Deploy AI agents that respect your privacy

Features

Powerful WebUI

(Screenshots of the LocalAgent web UI)

Connectors ready-to-go

Telegram, Discord, Slack, IRC, and GitHub

QUICK START

One-Command Docker Setup

The fastest way to get everything running — LocalRAG, LocalAI, and LocalAgent pre-configured:

docker-compose up

No API keys. No cloud subscriptions. No external dependencies. Just AI that works.
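
The stack also responds to the usual docker-compose lifecycle commands; for example, to run it in the background, follow the logs, and shut it down later:

docker-compose up -d
docker-compose logs -f
docker-compose down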

Manual Launch

Run the binary and you're live:

./localagent

Access your agents at http://localhost:3000
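
A quick smoke test is to hit the agent list endpoint from the REST API section below; a successful response means the server is up:

curl -X GET "http://localhost:3000/api/agents"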

Environment Configuration

Variable What It Does
LOCALAGENT_MODEL Your go-to model
LOCALAGENT_MULTIMODAL_MODEL Optional model for multimodal capabilities
LOCALAGENT_LLM_API_URL OpenAI-compatible API server URL
LOCALAGENT_LLM_API_KEY API authentication
LOCALAGENT_TIMEOUT Request timeout settings
LOCALAGENT_STATE_DIR Where state gets stored
LOCALAGENT_LOCALRAG_URL LocalRAG connection
LOCALAGENT_ENABLE_CONVERSATIONS_LOGGING Toggle conversation logs
LOCALAGENT_API_KEYS A comma-separated list of API keys used for authentication
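
For a non-Docker run, these are typically exported before starting the binary. A minimal sketch; the model name is a placeholder and the API URL assumes an OpenAI-compatible server such as LocalAI listening locally:

export LOCALAGENT_MODEL="your-local-model"
export LOCALAGENT_LLM_API_URL="http://localhost:8080"
export LOCALAGENT_STATE_DIR="$HOME/.localagent"
./localagent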

INSTALLATION OPTIONS

Pre-Built Binaries

Download ready-to-run binaries from the Releases page.
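
A hypothetical download flow (the release tag and asset name are placeholders; check the Releases page for the real ones):

curl -L -o localagent "https://github.com/mudler/LocalAgent/releases/download/<tag>/<asset>"
chmod +x localagent
./localagent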

Source Build

Requirements:

  • Go 1.20+
  • Git
  • Bun 1.2+

# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent

# Build it
cd webui/react-ui && bun i && bun run build
cd ../..
go build -o localagent

# Run it
./localagent

Development

The development workflow is similar to the source build, but with additional steps for hot reloading of the frontend:

# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent

# Install dependencies and start frontend development server
cd webui/react-ui && bun i && bun run dev

Then, in a separate terminal:

# Start development server
cd ../.. && go run main.go

Note: see webui/react-ui/.vite.config.js for environment variables that can be used to configure the backend URL.

CONNECTORS

Link your agents to the services you already use. Configuration examples below.

GitHub Issues

{
  "token": "YOUR_PAT_TOKEN",
  "repository": "repo-to-monitor",
  "owner": "repo-owner",
  "botUserName": "bot-username"
}

Discord

After creating your Discord bot:

{
  "token": "Bot YOUR_DISCORD_TOKEN",
  "defaultChannel": "OPTIONAL_CHANNEL_ID"
}

Don't forget to enable "Message Content Intent" in the Bot tab of your application settings!

Slack

Use the included slack.yaml manifest to create your app, then configure:

{
  "botToken": "xoxb-your-bot-token",
  "appToken": "xapp-your-app-token"
}

  • Create the bot token under "OAuth & Permissions" -> "OAuth Tokens for Your Workspace"
  • Create an app-level token under "Basic Information" -> "App-Level Tokens" with the scopes connections:write and authorizations:read

Telegram

Get a token from @botfather, then:

{ 
  "token": "your-bot-father-token" 
}

IRC

Connect to IRC networks:

{
  "server": "irc.example.com",
  "port": "6667",
  "nickname": "LocalAgentBot",
  "channel": "#yourchannel",
  "alwaysReply": "false"
}

REST API

Agent Management

Endpoint Method Description
/api/agents GET List all available agents
/api/agent/:name/status GET View agent status history
/api/agent/create POST Create a new agent
/api/agent/:name DELETE Remove an agent
/api/agent/:name/pause PUT Pause agent activities
/api/agent/:name/start PUT Resume a paused agent
/api/agent/:name/config GET Get agent configuration
/api/agent/:name/config PUT Update agent configuration
/api/meta/agent/config GET Get agent configuration metadata
/settings/export/:name GET Export agent config
/settings/import POST Import agent config

Actions and Groups

Endpoint Method Description
/api/actions GET List available actions
/api/action/:name/run POST Execute an action
/api/agent/group/generateProfiles POST Generate group profiles
/api/agent/group/create POST Create a new agent group
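
Neither action endpoint has a curl example in the section below, so here is a sketch; the action name and request body are hypothetical, since the expected payload for running an action is not documented here:

# List available actions
curl -X GET "http://localhost:3000/api/actions"

# Run an action (body shape is an assumption)
curl -X POST "http://localhost:3000/api/action/my-action/run" \
  -H "Content-Type: application/json" \
  -d '{"param": "value"}'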

Chat Interactions

Endpoint Method Description
/api/chat/:name POST Send message & get response
/api/notify/:name POST Send notification to agent
/api/sse/:name GET Real-time agent event stream
/v1/responses POST Send message & get response (OpenAI Responses-compatible)

Curl Examples

Get All Agents

curl -X GET "http://localhost:3000/api/agents"

Get Agent Status

curl -X GET "http://localhost:3000/api/agent/my-agent/status"

Create Agent

curl -X POST "http://localhost:3000/api/agent/create" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-agent",
    "model": "gpt-4",
    "system_prompt": "You are an AI assistant.",
    "enable_kb": true,
    "enable_reasoning": true
  }'

Delete Agent

curl -X DELETE "http://localhost:3000/api/agent/my-agent"

Pause Agent

curl -X PUT "http://localhost:3000/api/agent/my-agent/pause"

Start Agent

curl -X PUT "http://localhost:3000/api/agent/my-agent/start"

Get Agent Configuration

curl -X GET "http://localhost:3000/api/agent/my-agent/config"

Update Agent Configuration

curl -X PUT "http://localhost:3000/api/agent/my-agent/config" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "system_prompt": "You are an AI assistant."
  }'

Export Agent

curl -X GET "http://localhost:3000/settings/export/my-agent" --output my-agent.json

Import Agent

curl -X POST "http://localhost:3000/settings/import" \
  -F "file=@/path/to/my-agent.json"

Send Message

curl -X POST "http://localhost:3000/api/chat/my-agent" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, how are you today?"}'

Notify Agent

curl -X POST "http://localhost:3000/api/notify/my-agent" \
  -H "Content-Type: application/json" \
  -d '{"message": "Important notification"}'

Agent SSE Stream

curl -N -X GET "http://localhost:3000/api/sse/my-agent"

Note: For proper SSE handling, you should use a client that supports SSE natively.
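
Responses Endpoint (OpenAI-compatible)

The /v1/responses endpoint from the Chat Interactions table mirrors OpenAI's Responses API; the body below assumes the standard Responses request shape (model and input fields) and uses a placeholder agent name:

curl -X POST "http://localhost:3000/v1/responses" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-agent",
    "input": "Hello, how are you today?"
  }'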

Agent Configuration Reference

The agent configuration defines how an agent behaves and what capabilities it has. You can view the available configuration options and their descriptions by using the metadata endpoint:

curl -X GET "http://localhost:3000/api/meta/agent/config"

This will return a JSON object containing all available configuration fields, their types, and descriptions.

Here's an example of the agent configuration structure:

{
  "name": "my-agent",
  "model": "gpt-4",
  "multimodal_model": "gpt-4-vision",
  "hud": true,
  "standalone_job": false,
  "random_identity": false,
  "initiate_conversations": true,
  "enable_planning": true,
  "identity_guidance": "You are a helpful assistant.",
  "periodic_runs": "0 * * * *",
  "permanent_goal": "Help users with their questions.",
  "enable_kb": true,
  "enable_reasoning": true,
  "kb_results": 5,
  "can_stop_itself": false,
  "system_prompt": "You are an AI assistant.",
  "long_term_memory": true,
  "summary_long_term_memory": false
}
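
Assuming the create endpoint accepts this full structure (the earlier Create Agent example posts a subset of these fields), the configuration can be saved to a file and submitted directly; the file name is a placeholder:

curl -X POST "http://localhost:3000/api/agent/create" \
  -H "Content-Type: application/json" \
  -d @my-agent-config.json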

LICENSE

MIT License — See the LICENSE file for details.


LOCAL PROCESSING. GLOBAL THINKING.
Made with ❤️ by mudler
