
LocalAgent Logo

LOCAL AGENT

AI that stays where it belongs — on your machine.

Go Report Card License: MIT GitHub stars GitHub issues

TAKE BACK CONTROL

LocalAgent is an AI platform that runs 100% on your hardware. No clouds. No data sharing. No compromises.

Built for those who treat privacy as non-negotiable, LocalAgent lets you deploy intelligent agents that never phone home. Your data stays where you put it — period. Tired of agent wrappers around cloud APIs? So are we.

WHY LOCALAGENT?

  • ✓ TRUE PRIVACY — Everything runs on your hardware, nothing leaves your machine
  • ✓ MODEL FREEDOM — Works with local LLM formats (GGUF, GGML) you already have
  • ✓ BUILD YOUR WAY — Extensible architecture for custom agents with specialized skills
  • ✓ SLICK INTERFACE — Clean web UI for hassle-free agent interactions
  • ✓ DEV-FRIENDLY — Comprehensive REST API for seamless integration
  • ✓ PLAYS WELL WITH OTHERS — Optimized for LocalAI
  • ✓ RUN ANYWHERE — Linux, macOS, Windows — we've got you covered

THE LOCAL ECOSYSTEM

LocalAgent is part of a trinity of tools designed to keep AI under your control:

  • LocalAI — Run LLMs on your hardware
  • LocalRAG — Local Retrieval-Augmented Generation
  • LocalAgent — Deploy AI agents that respect your privacy

Features

Powerful WebUI

(WebUI screenshots)

Connectors ready-to-go

Telegram · Discord · Slack · IRC · GitHub

QUICK START

One-Command Docker Setup

The fastest way to get everything running — LocalRAG, LocalAI, and LocalAgent pre-configured:

docker-compose up

No API keys. No cloud subscriptions. No external dependencies. Just AI that works.
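
The environment variables documented under Environment Configuration below also apply here. A minimal sketch of overriding the default model for a compose run, assuming your docker-compose.yaml forwards LOCALAGENT_MODEL to the localagent service (the model name is a placeholder):

# Override the default model for this run
LOCALAGENT_MODEL=my-local-model docker-compose up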

Manual Launch

Run the binary and you're live:

./localagent

Access your agents at http://localhost:3000

Environment Configuration

Variable                                   What It Does
LOCALAGENT_MODEL                           Default model to use
LOCALAGENT_MULTIMODAL_MODEL                Optional model for multimodal capabilities
LOCALAGENT_LLM_API_URL                     OpenAI-compatible API server URL
LOCALAGENT_API_KEY                         API authentication key
LOCALAGENT_TIMEOUT                         Request timeout settings
LOCALAGENT_STATE_DIR                       Directory where agent state is stored
LOCALAGENT_LOCALRAG_URL                    LocalRAG connection URL
LOCALAGENT_ENABLE_CONVERSATIONS_LOGGING    Toggle conversation logging
LOCALAGENT_API_KEYS                        Comma-separated list of API keys used for authentication
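
For example, a manual launch that points LocalAgent at a local OpenAI-compatible server might look like this (a sketch; the model name, URL, and state directory are placeholders for your own setup):

# Pick a model, point at a local OpenAI-compatible API, and choose a state directory
LOCALAGENT_MODEL=my-local-model \
LOCALAGENT_LLM_API_URL=http://localhost:8080 \
LOCALAGENT_STATE_DIR=./state \
./localagent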

INSTALLATION OPTIONS

Pre-Built Binaries

Download ready-to-run binaries from the Releases page.
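
For example, on Linux (the asset name below is hypothetical; check the Releases page for the exact file name for your platform):

# Download a release binary and make it executable (adjust the asset name)
curl -L -o localagent https://github.com/mudler/LocalAgent/releases/latest/download/localagent-linux-amd64
chmod +x localagent
./localagent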

Source Build

Requirements:

  • Go 1.20+
  • Git
# Clone repo
git clone https://github.com/mudler/LocalAgent.git
cd LocalAgent

# Build it
go build -o localagent

# Run it
./localagent

CONNECTORS

Link your agents to the services you already use. Configuration examples below.

GitHub Issues

{
  "token": "YOUR_PAT_TOKEN",
  "repository": "repo-to-monitor",
  "owner": "repo-owner",
  "botUserName": "bot-username"
}

Discord

After creating your Discord bot:

{
  "token": "Bot YOUR_DISCORD_TOKEN",
  "defaultChannel": "OPTIONAL_CHANNEL_ID"
}

Don't forget to enable "Message Content Intent" in the Bot tab of your app settings!

Slack

Use the included slack.yaml manifest to create your app, then configure:

{
  "botToken": "xoxb-your-bot-token",
  "appToken": "xapp-your-app-token"
}
  • Create the bot OAuth token from "OAuth & Permissions" -> "OAuth Tokens for Your Workspace"
  • Create an app-level token from "Basic Information" -> "App-Level Tokens" (scopes: connections:write, authorizations:read)

Telegram

Get a token from @botfather, then:

{ 
  "token": "your-bot-father-token" 
}

IRC

Connect to IRC networks:

{
  "server": "irc.example.com",
  "port": "6667",
  "nickname": "LocalAgentBot",
  "channel": "#yourchannel",
  "alwaysReply": "false"
}

REST API

Agent Management

Endpoint                   Method   Description
/agents                    GET      List all available agents
/status/:name              GET      View agent status history
/create                    POST     Create a new agent
/delete/:name              DELETE   Remove an agent
/pause/:name               PUT      Pause agent activities
/start/:name               PUT      Resume a paused agent
/settings/export/:name     GET      Export agent config
/settings/import           POST     Import agent config
/api/agent/:name/config    GET      Get agent configuration
/api/agent/:name/config    PUT      Update agent configuration

Chat Interactions

Endpoint         Method   Description
/chat/:name      POST     Send message & get response
/notify/:name    GET      Send notification to agent
/sse/:name       GET      Real-time agent event stream
/v1/responses    POST     Send message & get response (OpenAI Responses-compatible)

Curl Examples

Get All Agents

curl -X GET "http://localhost:3000/agents"

Get Agent Status

curl -X GET "http://localhost:3000/status/my-agent"

Create Agent

curl -X POST "http://localhost:3000/create" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-agent",
    "model": "gpt-4",
    "system_prompt": "You are an AI assistant.",
    "enable_kb": true,
    "enable_reasoning": true
  }'

Delete Agent

curl -X DELETE "http://localhost:3000/delete/my-agent"

Pause Agent

curl -X PUT "http://localhost:3000/pause/my-agent"

Start Agent

curl -X PUT "http://localhost:3000/start/my-agent"

Export Agent

curl -X GET "http://localhost:3000/settings/export/my-agent" --output my-agent.json

Import Agent

curl -X POST "http://localhost:3000/settings/import" \
  -F "file=@/path/to/my-agent.json"

Send Message

curl -X POST "http://localhost:3000/chat/my-agent" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, how are you today?"}'

Notify Agent

curl -X GET "http://localhost:3000/notify/my-agent" \
  -d "message=Important notification"

Agent SSE Stream

curl -N -X GET "http://localhost:3000/sse/my-agent"

Note: For proper SSE handling, you should use a client that supports SSE natively.
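
OpenAI Responses Request

The /v1/responses endpoint accepts OpenAI Responses-style requests. A minimal sketch, assuming the standard model and input fields from the OpenAI Responses API (field names follow that spec; the values are placeholders and are not verified against this server):

curl -X POST "http://localhost:3000/v1/responses" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-agent",
    "input": "Hello, how are you today?"
  }'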

Agent Configuration Reference

{
  "name": "my-agent",
  "model": "gpt-4",
  "multimodal_model": "gpt-4-vision",
  "hud": true,
  "standalone_job": false,
  "random_identity": false,
  "initiate_conversations": true,
  "enable_planning": true,
  "identity_guidance": "You are a helpful assistant.",
  "periodic_runs": "0 * * * *",
  "permanent_goal": "Help users with their questions.",
  "enable_kb": true,
  "enable_reasoning": true,
  "kb_results": 5,
  "can_stop_itself": false,
  "system_prompt": "You are an AI assistant.",
  "long_term_memory": true,
  "summary_long_term_memory": false
}
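
To apply a configuration document like this to an existing agent, the PUT /api/agent/:name/config endpoint listed above can be used. A sketch, assuming it accepts the same JSON shape (the file name my-agent-config.json is illustrative):

# Update an existing agent's configuration from a JSON file
curl -X PUT "http://localhost:3000/api/agent/my-agent/config" \
  -H "Content-Type: application/json" \
  -d @my-agent-config.json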

LICENSE

MIT License — See the LICENSE file for details.


LOCAL PROCESSING. GLOBAL THINKING.
Made with ❤️ by mudler
