Compare commits


486 Commits

Author SHA1 Message Date
Richard Palethorpe
be8d914bb6 fix(docker): Add mcpbox server to extended compose files
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-25 15:33:36 +01:00
Ettore Di Giacinto
12209ab926 feat(browseragent): post screenshot on slack (#81)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-24 23:17:10 +02:00
dependabot[bot]
547e9cd0c4 chore(deps): bump actions/checkout from 2 to 4 (#44)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '4'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-24 22:45:29 +02:00
Richard Palethorpe
6a1e536ca7 Update README.md (#80)
Add observability screenshot and bullet point. Also update the strapline and descriptions at the top to better describe the benefits of this software
2025-04-24 18:01:58 +02:00
Ettore Di Giacinto
eb8663ada1 feat: local MCP server support (#61)
* wip

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>

* Add groups to mcpbox

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>

* Add mcpbox dockerfile and entrypoint

Signed-off-by: mudler <mudler@localai.io>

* Attach mcp stdio box to agent

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>

* Add to dockerfile

Signed-off-by: mudler <mudler@localai.io>

* Attach to config

Signed-off-by: mudler <mudler@localai.io>

* Attach to ui

Signed-off-by: mudler <mudler@localai.io>

* Revert "Attach to ui"

This reverts commit 088d0c47e87ee8f84297e47d178fb7384bbe6d45.

Signed-off-by: mudler <mudler@localai.io>

* add one-time process, attach to UI the mcp server json configuration

Signed-off-by: mudler <mudler@localai.io>

* quality of life improvements

Signed-off-by: mudler <mudler@localai.io>

* fixes

Signed-off-by: mudler <mudler@localai.io>

* Make it work, expose MCP prepare script to UI

Signed-off-by: mudler <mudler@localai.io>

* Add container image to CI builds

* Wire mcpbox to tests

* Improve setup

* Not needed anymore, using tests

Signed-off-by: mudler <mudler@localai.io>

* fix: do not override actions

Signed-off-by: mudler <mudler@localai.io>

* chore(tests): fix env var

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>
2025-04-24 16:39:20 +02:00
Richard Palethorpe
ce997d2425 fix: Handle state on agent restart and update observables (#75)
Keep some agent state across restarts, such as the SSE manager and
observer. This allows restarts to be shown on the state page and also
allows avatars to be kept when reconfiguring the agent.

Observable updates can also arrive out of order because the SSE manager
has multiple workers; for now this is handled in the client.

Finally fix an issue with the IRC client to make it disconnect and
handle being assigned a different nickname by the server.

Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-23 15:29:06 +02:00
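The "handle it in the client" approach mentioned in the commit above can be sketched as sequence-based staleness filtering. This is a hypothetical illustration, not the project's actual code: the `Update` and `Store` types and the `Seq` field are assumptions about what such a client might look like.

```go
package main

import "fmt"

// Update is a hypothetical observable update carrying a sender-side
// sequence number; the real project's event shapes may differ.
type Update struct {
	ID  string // observable identifier
	Seq int    // monotonically increasing per observable
	Val string
}

// Store applies updates and drops any that arrive out of order,
// so multiple SSE workers cannot make state go backwards.
type Store struct {
	latest map[string]Update
}

func NewStore() *Store { return &Store{latest: map[string]Update{}} }

// Apply returns true if the update was newer than the stored one.
func (s *Store) Apply(u Update) bool {
	if cur, ok := s.latest[u.ID]; ok && cur.Seq >= u.Seq {
		return false // stale: a later update has already been applied
	}
	s.latest[u.ID] = u
	return true
}

func main() {
	s := NewStore()
	s.Apply(Update{ID: "agent1", Seq: 2, Val: "running"})
	ok := s.Apply(Update{ID: "agent1", Seq: 1, Val: "starting"}) // arrives late
	fmt.Println(ok, s.latest["agent1"].Val)                     // false running
}
```

The late `Seq: 1` update is rejected, so the displayed state stays at the most recent value regardless of worker delivery order.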
Ettore Di Giacinto
56cd0e05ca chore: better defaults for parallel jobs (#76)
* chore: better defaults for parallel jobs

Signed-off-by: mudler <mudler@localai.io>

* chore(tests): add timeout

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-23 00:12:44 +02:00
Ettore Di Giacinto
25bb3fb123 feat: allow the agent to perform things concurrently (#74)
* feat: allow the agent to perform things concurrently

Signed-off-by: mudler <mudler@localai.io>

* Apply suggestions from code review

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* collect errors

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-22 16:49:28 +02:00
dependabot[bot]
9e52438877 chore(deps-dev): bump vite from 6.3.1 to 6.3.2 in /webui/react-ui (#69)
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 6.3.1 to 6.3.2.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v6.3.2/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-version: 6.3.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-22 11:44:53 +02:00
Ettore Di Giacinto
c4618896cf chore: default to gemma-3-12b-it-qat (#60)
* chore: default to gemma-3-12b-it-qat

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fix: simplify tests to run faster

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-22 11:44:42 +02:00
Ettore Di Giacinto
ee1667d51a feat: add history metadata of agent browser (#71)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-21 22:52:04 +02:00
dependabot[bot]
bafd26e92c chore(deps-dev): bump eslint-plugin-react-hooks in /webui/react-ui (#67)
Bumps [eslint-plugin-react-hooks](https://github.com/facebook/react/tree/HEAD/packages/eslint-plugin-react-hooks) from 5.2.0 to 6.0.0.
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/packages/eslint-plugin-react-hooks/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/HEAD/packages/eslint-plugin-react-hooks)

---
updated-dependencies:
- dependency-name: eslint-plugin-react-hooks
  dependency-version: 6.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-21 21:31:48 +02:00
dependabot[bot]
8ecc18f76f chore(deps-dev): bump react-router-dom in /webui/react-ui (#65)
Bumps [react-router-dom](https://github.com/remix-run/react-router/tree/HEAD/packages/react-router-dom) from 7.5.0 to 7.5.1.
- [Release notes](https://github.com/remix-run/react-router/releases)
- [Changelog](https://github.com/remix-run/react-router/blob/main/packages/react-router-dom/CHANGELOG.md)
- [Commits](https://github.com/remix-run/react-router/commits/react-router-dom@7.5.1/packages/react-router-dom)

---
updated-dependencies:
- dependency-name: react-router-dom
  dependency-version: 7.5.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-21 21:09:56 +02:00
dependabot[bot]
985f07a529 chore(deps): bump github.com/metoro-io/mcp-golang from 0.9.0 to 0.11.0 (#64)
Bumps [github.com/metoro-io/mcp-golang](https://github.com/metoro-io/mcp-golang) from 0.9.0 to 0.11.0.
- [Release notes](https://github.com/metoro-io/mcp-golang/releases)
- [Changelog](https://github.com/metoro-io/mcp-golang/blob/main/.goreleaser.yml)
- [Commits](https://github.com/metoro-io/mcp-golang/compare/v0.9.0...v0.11.0)

---
updated-dependencies:
- dependency-name: github.com/metoro-io/mcp-golang
  dependency-version: 0.11.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-21 21:09:39 +02:00
dependabot[bot]
8b2900c6d8 chore(deps): bump github.com/sashabaranov/go-openai (#63)
Bumps [github.com/sashabaranov/go-openai](https://github.com/sashabaranov/go-openai) from 1.38.1 to 1.38.2.
- [Release notes](https://github.com/sashabaranov/go-openai/releases)
- [Commits](https://github.com/sashabaranov/go-openai/compare/v1.38.1...v1.38.2)

---
updated-dependencies:
- dependency-name: github.com/sashabaranov/go-openai
  dependency-version: 1.38.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-21 21:09:16 +02:00
Ettore Di Giacinto
50e56fe22f feat(browseragent): add browser agent runner action (#55)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-18 22:42:17 +02:00
Richard Palethorpe
b5a12a1da6 feat(ui): Structured observability/status view (#40)
* refactor(ui): Make message status SSE name more specific

Signed-off-by: Richard Palethorpe <io@richiejp.com>

* feat(ui): Add structured observability events

Signed-off-by: Richard Palethorpe <io@richiejp.com>

---------

Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-18 17:32:43 +02:00
Ettore Di Giacinto
70e749b53a fix(github*): correctly pass owner and repository (#54)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-17 23:01:19 +02:00
Ettore Di Giacinto
784a4c7969 fix(githubreader): do not use pointers (#53)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-17 22:45:24 +02:00
dependabot[bot]
43a2a142fa chore(deps-dev): bump globals from 15.15.0 to 16.0.0 in /webui/react-ui (#45)
Bumps [globals](https://github.com/sindresorhus/globals) from 15.15.0 to 16.0.0.
- [Release notes](https://github.com/sindresorhus/globals/releases)
- [Commits](https://github.com/sindresorhus/globals/compare/v15.15.0...v16.0.0)

---
updated-dependencies:
- dependency-name: globals
  dependency-version: 16.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-17 19:53:13 +02:00
Ettore Di Giacinto
8ee5956bdb fix: correct image name, switch to sd-1.5-ggml as default (#51)
* fix: correct image name, switch to flux.1-dev-ggml as default

Signed-off-by: mudler <mudler@localai.io>

* standardize around sd-1.5-ggml since it's smaller

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-17 19:52:10 +02:00
Richard Palethorpe
4888dfcdca chore(ui): Switch to text-based Bun lock file to allow diffing (#50)
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-17 14:43:52 +02:00
Ettore Di Giacinto
a6b41fd3ab Update README.md 2025-04-17 10:26:30 +02:00
dependabot[bot]
d25aed9a1a chore(deps): bump github.com/sashabaranov/go-openai from 1.19.4 to 1.38.1 (#47)
* chore(deps): bump github.com/sashabaranov/go-openai

Bumps [github.com/sashabaranov/go-openai](https://github.com/sashabaranov/go-openai) from 1.19.4 to 1.38.1.
- [Release notes](https://github.com/sashabaranov/go-openai/releases)
- [Commits](https://github.com/sashabaranov/go-openai/compare/v1.19.4...v1.38.1)

---
updated-dependencies:
- dependency-name: github.com/sashabaranov/go-openai
  dependency-version: 1.38.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* chore: minor adaptations

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-17 08:04:06 +02:00
Ikko Eltociear Ashimine
4a3f471f72 docs: update README.md (#48)
seperate -> separate
2025-04-16 19:54:49 +02:00
dependabot[bot]
93154a0a27 chore(deps): bump docker/metadata-action (#43)
Bumps [docker/metadata-action](https://github.com/docker/metadata-action) from 2a4836ac76fe8f5d0ee3a0d89aa12a80cc552ad3 to 902fa8ec7d6ecbf8d84d538b9b233a880e428804.
- [Release notes](https://github.com/docker/metadata-action/releases)
- [Commits](2a4836ac76...902fa8ec7d)

---
updated-dependencies:
- dependency-name: docker/metadata-action
  dependency-version: 902fa8ec7d6ecbf8d84d538b9b233a880e428804
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-04-16 09:45:22 +02:00
mudler
59ab91d7df chore: update telegram
Signed-off-by: mudler <mudler@localai.io>
2025-04-16 08:54:17 +02:00
Rene Leonhardt
42590a7371 chore(deps): Update dependencies (#42) 2025-04-16 08:45:53 +02:00
Ettore Di Giacinto
6260d4f168 Update README.md 2025-04-15 19:38:01 +02:00
Ettore Di Giacinto
4206da92a6 Update README.md 2025-04-15 19:37:32 +02:00
Ettore Di Giacinto
4d6fbf1caa feat(github): add action to open up a PR and get all repository content (#39)
* feat(github): add action to open up a PR and get all repository content

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Minor fixes

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-15 09:26:40 +02:00
Ettore Di Giacinto
97ef7acec0 chore: return more results to the LLM of the action that was done (#38)
Signed-off-by: mudler <mudler@localai.io>
2025-04-14 22:03:53 +02:00
Richard Palethorpe
77189b6114 fix(test): Encourage LLM to plan multiple searches (#36)
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-14 17:35:19 +02:00
Richard Palethorpe
c32d315910 fix(ui): Proxy avatars endpoint in dev mode (#32)
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-14 16:09:57 +02:00
Richard Palethorpe
606ffd8275 fix(ui): Don't try to pass unserializable Go objects to status UI (#28)
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-14 16:09:42 +02:00
mudler
601dba3fc4 chore(tests): try to be more expressive
Signed-off-by: mudler <mudler@localai.io>
2025-04-14 16:08:54 +02:00
mudler
00ab476a77 chore(tests): try to be more extensive in timeouts
Signed-off-by: mudler <mudler@localai.io>
2025-04-14 12:37:31 +02:00
Ettore Di Giacinto
906079cbbb Update README.md 2025-04-14 10:50:19 +02:00
Ettore Di Giacinto
808d9c981c Update docker-compose.intel.yaml 2025-04-13 22:33:03 +02:00
Ettore Di Giacinto
2b79c99dd7 chore(README): reorganize docker compose files
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-04-13 22:31:33 +02:00
Ettore Di Giacinto
77905ed3cd Update README.md 2025-04-13 22:10:17 +02:00
Ettore Di Giacinto
60c249f19a chore: cleanup, identify goal from conversation when evaluating achievement (#29)
* chore: cleanup, identify goal from conversation when evaluating achievement

Signed-off-by: mudler <mudler@localai.io>

* change base cpu model

Signed-off-by: mudler <mudler@localai.io>

* this is not necessary anymore

Signed-off-by: mudler <mudler@localai.io>

* use 12b

Signed-off-by: mudler <mudler@localai.io>

* use openthinker, it's smaller

* chore(tests): set timeout

Signed-off-by: mudler <mudler@localai.io>

* Enable reasoning in some of the tests

Signed-off-by: mudler <mudler@localai.io>

* docker compose unification, small changes

Signed-off-by: mudler <mudler@localai.io>

* Simplify

Signed-off-by: mudler <mudler@localai.io>

* Back at arcee-agent as default

Signed-off-by: mudler <mudler@localai.io>

* Better error handling during planning

Signed-off-by: mudler <mudler@localai.io>

* Ci: do not run jobs for every branch

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-12 21:01:01 +02:00
Ettore Di Giacinto
209a9989c4 Update README.md 2025-04-11 22:49:50 +02:00
Ettore Di Giacinto
5105b46f48 Add Github reviewer and improve reasoning (#27)
* Add Github reviewer and improve reasoning

* feat: improve action picking

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-11 21:57:19 +02:00
Ettore Di Giacinto
e4c7d1acfc feat(github): add actions to comment and read PRs (#26)
Signed-off-by: mudler <mudler@localai.io>
2025-04-10 21:45:18 +02:00
Ettore Di Giacinto
dd4fbd64d3 fix(pick_action): improve action pickup by using only the assistant thought process (#25)
* fix(pick_action): improve action pickup by using only the assistant thought process

Signed-off-by: mudler <mudler@localai.io>

* fix: improve templates

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-10 21:45:04 +02:00
Ettore Di Giacinto
4010f9d86c Update README.md 2025-04-10 19:38:13 +02:00
Ettore Di Giacinto
0fda6e38db Update README.md 2025-04-10 10:09:12 +02:00
Richard Palethorpe
bffb5bd852 Update README.md (#23)
Add new Web UI screen shots
2025-04-10 09:17:22 +02:00
Ettore Di Giacinto
4d722c35d3 ci: fix image building branch 2025-04-09 22:53:04 +02:00
Ettore Di Giacinto
8dd0c3883b Merge pull request #14 from mudler/rewrite
LocalAGI port in Go
2025-04-09 22:45:05 +02:00
Ettore Di Giacinto
c2ec333777 Enlarge timeout window 2025-04-09 21:59:18 +02:00
Ettore Di Giacinto
2f19feff5e fixups 2025-04-09 21:25:44 +02:00
Richard Palethorpe
e128cde613 chore(gitignore): Add volumes directory for docker compose
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-09 19:21:36 +01:00
mudler
bc7f6f059c Reply when skipping loops 2025-04-09 19:49:39 +02:00
mudler
0eb68b6c20 feat: add loop detection
Signed-off-by: mudler <mudler@localai.io>
2025-04-09 19:13:41 +02:00
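The loop detection feature added above pairs with the "Reply when skipping loops" fix. As a rough sketch of what such a heuristic could look like (this is an illustrative assumption, not the project's implementation), an agent can check whether its last few actions were identical before executing the next one:

```go
package main

import "fmt"

// detectLoop reports whether the last n entries of the action history
// are all identical. A deliberately simple heuristic: real loop
// detection might hash action parameters or use longer windows.
func detectLoop(history []string, n int) bool {
	if len(history) < n || n < 2 {
		return false
	}
	last := history[len(history)-1]
	for _, a := range history[len(history)-n:] {
		if a != last {
			return false
		}
	}
	return true
}

func main() {
	h := []string{"search", "reply", "search", "search", "search"}
	fmt.Println(detectLoop(h, 3)) // true: "search" repeated three times
}
```

When the check fires, the agent can skip the repeated action and reply to the user instead of executing it again.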
mudler
1c4ab09335 Update README 2025-04-09 18:28:59 +02:00
mudler
abc7d6e080 Do not protect endpoint by default
Signed-off-by: mudler <mudler@localai.io>
2025-04-09 18:08:22 +02:00
mudler
cb15f926e8 fix(tests): wait for API to be available
Signed-off-by: mudler <mudler@localai.io>
2025-04-09 15:26:53 +02:00
Richard Palethorpe
70282535d4 fix(app): Use correct log format 2025-04-09 13:30:56 +01:00
Ettore Di Giacinto
5111738b3b Wait for API to be ready 2025-04-08 22:45:15 +02:00
Ettore Di Giacinto
c141a9bd80 Not needed here 2025-04-08 22:34:37 +02:00
Ettore Di Giacinto
c8cf70b1d0 Update gpu docker compose 2025-04-08 22:28:55 +02:00
Ettore Di Giacinto
3f83f5c4b0 Fix docker compose for gpu 2025-04-08 22:28:21 +02:00
Ettore Di Giacinto
45fbfed030 Update images 2025-04-08 22:27:13 +02:00
Ettore Di Giacinto
2b3f61aed1 Use public runners 2025-04-08 22:19:15 +02:00
Ettore Di Giacinto
e7111c6554 Rename 2025-04-08 22:18:32 +02:00
Ettore Di Giacinto
894dde9256 Merge remote-tracking branch 'localagent/master' into rewrite 2025-04-08 22:08:11 +02:00
Ettore Di Giacinto
446908b759 nuke old implementation 2025-04-08 22:07:59 +02:00
Richard Palethorpe
18364d169e fix(ui): Don't convert form inputs from string
Actions, connectors, MCP, dynamic prompts, and anything else stored as
an array in the config have their fields stored as strings by the
backend.

Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-07 14:02:38 +01:00
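The string-typed fields described in the commit above eventually need typed values somewhere. As a hedged sketch of the general pattern (the field kinds and function below are hypothetical, not the project's actual backend API), a server can coerce raw form strings according to a field definition:

```go
package main

import (
	"fmt"
	"strconv"
)

// coerce converts a raw form string into the type a field definition
// expects. The kind names ("number", "bool", "string") are illustrative
// assumptions; the project's field definitions may use different names.
func coerce(kind, raw string) (any, error) {
	switch kind {
	case "number":
		return strconv.ParseFloat(raw, 64)
	case "bool":
		return strconv.ParseBool(raw)
	default:
		return raw, nil // array-backed config fields stay as strings
	}
}

func main() {
	v, err := coerce("number", "42")
	fmt.Println(v, err) // 42 <nil>, serialized later as a JSON number
}
```

Keeping the conversion in one place means the UI can always submit strings, which is exactly why the commit removes the client-side conversion.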
Richard Palethorpe
6464a33912 fix(docker): Use localrecall main tag instead of master
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-07 12:23:57 +01:00
Richard Palethorpe
34caeea081 feat(docker): Add Intel Sycl Docker Compose file
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-04-07 12:23:57 +01:00
mudler
25286a828c chore(gpu-example): update with multimodal model and image-gen model
Signed-off-by: mudler <mudler@localai.io>
2025-04-04 11:54:20 +02:00
mudler
6747fe87f2 chore: update docker compose files
Signed-off-by: mudler <mudler@localai.io>
2025-04-03 18:38:35 +02:00
Ettore Di Giacinto
09559f9ed9 chore(ci): push to localagi (#137)
Signed-off-by: mudler <mudler@localai.io>
2025-04-03 18:29:57 +02:00
Ettore Di Giacinto
4107a7a063 Update README.md 2025-04-03 17:35:28 +02:00
Ettore Di Giacinto
e3d4177c53 chore(README): update (#133)
* chore(README): update

Signed-off-by: mudler <mudler@localai.io>

* Update logos

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-03 17:35:07 +02:00
Richard Palethorpe
ee77bba615 fix: Don't crash when some agents fail to start 2025-04-03 16:11:21 +01:00
Ettore Di Giacinto
0709f2f1ff Revert "feat(ui): Action playground config and parameter forms (#129)"
This reverts commit 1eee5b5a32.
2025-04-03 14:46:04 +01:00
Richard Palethorpe
a569e37a34 fix(ui): Send number input as number JSON not string (#130)
* fix(ui): Submit number fields as numbers not text

* fix(ui): Remove some debug messages
2025-04-03 15:27:23 +02:00
Richard Palethorpe
1eee5b5a32 feat(ui): Action playground config and parameter forms (#129) 2025-04-03 15:26:14 +02:00
Ettore Di Giacinto
ffee9d8307 Update README.md 2025-04-02 23:24:45 +02:00
Ettore Di Giacinto
ff20a0332e Update README.md 2025-04-02 23:14:11 +02:00
Richard Palethorpe
034f596e06 fix(docker): Set API key on LocalAI (#128) 2025-04-02 21:47:00 +02:00
mudler
daa7dcd12a fix(discord): make it work
Signed-off-by: mudler <mudler@localai.io>
2025-04-02 19:40:27 +02:00
mudler
b81f34a8f8 Answer when mentioned if no default channel is specified
Signed-off-by: mudler <mudler@localai.io>
2025-04-02 17:10:03 +02:00
Richard Palethorpe
6d9f1a95cc fix(ui): Set page title 2025-04-02 08:28:49 +01:00
Richard Palethorpe
9f77bb99f1 fix(ui): Various fixes 2025-04-02 08:28:49 +01:00
Richard Palethorpe
74fdfd7a55 feat(ui): Add import agent screen 2025-04-02 08:28:49 +01:00
Richard Palethorpe
7494aa9c26 fix(ui): Prevent infinite loop when displaying error toast in chat 2025-04-02 08:28:49 +01:00
Ettore Di Giacinto
e90c192063 feat(call_agents): merge metadata of results (#126)
* feat(call_agents): merge metadata of results

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* chore: correct env typo

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Update services/actions/callagents.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* chore: add icon to thinking

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-04-01 21:57:32 +02:00
Richard Palethorpe
53d135bec9 chore(ui): Move zombie UI to old 2025-04-01 17:25:04 +01:00
Richard Palethorpe
99e0011920 Revert "chore(ui): Nuke original web UI, in favor of React"
This reverts commit 86cb9f1282.
2025-04-01 17:25:04 +01:00
Richard Palethorpe
5023bc77f4 feat(ui): Add custom action config meta data 2025-04-01 17:25:04 +01:00
Richard Palethorpe
a5ba49ec93 fix(ui): rm broken status message (it's in the button already) 2025-04-01 17:25:04 +01:00
Ettore Di Giacinto
f3c06b1bfb feat(api): implement stateful responses api (#122)
* feat(api): implement stateful responses api

Signed-off-by: mudler <mudler@localai.io>

* fix(tests): align client to API changes

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-04-01 18:00:37 +02:00
Richard Palethorpe
86cb9f1282 chore(ui): Nuke original web UI, in favor of React 2025-04-01 14:36:33 +01:00
Richard Palethorpe
d8cf5b419b Update README with new API paths 2025-04-01 14:36:33 +01:00
Richard Palethorpe
bb4459b99f chore: Add alternate bin name to .gitignore 2025-04-01 14:36:33 +01:00
Richard Palethorpe
ab3e6ae3c8 fix(ui): Fix SSE in chat 2025-04-01 14:36:33 +01:00
Richard Palethorpe
1f8c601795 fix(ui): Format item type label when it contains underscore 2025-04-01 14:36:33 +01:00
Richard Palethorpe
f70985362d chore(ui): Remove original UI/API routes 2025-04-01 14:36:33 +01:00
Richard Palethorpe
cafaa0e153 feat(ui): Add dynamic prompt config 2025-04-01 14:36:33 +01:00
Richard Palethorpe
491354280b feat(ui): Add dynamic prompt config 2025-04-01 14:36:33 +01:00
Richard Palethorpe
4c40e47e8d chore(prompts): Rename Prompt blocks to Dynamic prompts 2025-04-01 14:36:33 +01:00
Richard Palethorpe
045fb1f8d6 fix(ui): Remove infinite animations due to high CPU usage 2025-04-01 14:36:33 +01:00
Richard Palethorpe
8e703c0ac2 fix(ui): Loading .env 2025-04-01 14:36:33 +01:00
Richard Palethorpe
29beee6057 fix(ui): SSE in React chat 2025-04-01 14:36:33 +01:00
Richard Palethorpe
d672842a81 chore(ui): Add .vscode to gitignore 2025-04-01 14:36:33 +01:00
Richard Palethorpe
45078e1fa7 fix(ui): Re-add Chat 2025-04-01 14:36:33 +01:00
Richard Palethorpe
c96c8d8009 fix(ui): Various 2025-04-01 14:36:33 +01:00
Richard Palethorpe
11231f23ea feat(ui): Button appearance change 2025-04-01 14:36:33 +01:00
Ettore Di Giacinto
c1dcda42ae fix: re-enable nested plannings (#117)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-30 22:18:15 +02:00
Ettore Di Giacinto
7b52b9c22d fix(slack): support multiple threads update (#115)
Signed-off-by: mudler <mudler@localai.io>
2025-03-30 19:20:41 +02:00
Ettore Di Giacinto
dff678fc4e feat(job): add finalizers and save conversation after the job result is released (#114)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-30 18:43:28 +02:00
mudler
e0703cdb7c chore(tests): extend timeout for client
Signed-off-by: mudler <mudler@localai.io>
2025-03-30 18:42:35 +02:00
Ettore Di Giacinto
c940141e61 fix: make new_conversation to work (#112)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-29 00:21:00 +01:00
Ettore Di Giacinto
5fdd464fad Improve plan description 2025-03-28 22:16:12 +01:00
Ettore Di Giacinto
68cfdecaee Do not delete message in case of error 2025-03-28 22:11:43 +01:00
Ettore Di Giacinto
906b4ebd76 feat: add retries to pickAction
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-28 22:04:04 +01:00
Ettore Di Giacinto
05cb8ba2eb ci: add concurrency group for tests 2025-03-28 21:33:14 +01:00
Ettore Di Giacinto
0644daa477 feat: retrials (#110)
* feat(jobs): rework next actions

Also attempt to retry when failing at generating parameters

Signed-off-by: mudler <mudler@localai.io>

* feat(retries): add retries for common operations

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-03-28 21:27:34 +01:00
mudler
62940a1a56 fix: add a tab when listing agents
Signed-off-by: mudler <mudler@localai.io>
2025-03-28 18:40:40 +01:00
mudler
c6ce1c324f feat(slack): respond to channel only on channel mode
Signed-off-by: mudler <mudler@localai.io>
2025-03-28 17:01:24 +01:00
mudler
383fc1d0f4 fix(multi-agent): small fixes
Signed-off-by: mudler <mudler@localai.io>
2025-03-28 16:55:08 +01:00
mudler
8ac6f68568 fix(multi-agent): do not allow to call ourselves
Signed-off-by: mudler <mudler@localai.io>
2025-03-28 16:51:07 +01:00
mudler
05af5d9695 Use internal API for services/actions when using the pool
Signed-off-by: mudler <mudler@localai.io>
2025-03-28 16:12:42 +01:00
mudler
2c273392cd add debug messages 2025-03-28 15:31:26 +01:00
Ettore Di Giacinto
08f5417e96 go fmt 2025-03-27 23:06:33 +01:00
Ettore Di Giacinto
f67ebe8c7a Update agent.go 2025-03-27 00:15:52 +01:00
Ettore Di Giacinto
6ace4ab60d Expire jobs if context is canceled 2025-03-27 00:13:26 +01:00
Richard Palethorpe
319caf8e91 chore(ui): Move some field definitions server side 2025-03-26 22:56:29 +00:00
Richard Palethorpe
7fb99ecf21 chore(ui): Reuse FormFieldDefinition on other parts of AgentForm 2025-03-26 22:56:29 +00:00
Richard Palethorpe
d520d88301 feat(ui): Add required indicator to form field 2025-03-26 22:56:29 +00:00
Richard Palethorpe
4dcc77372d chore(ui): Refactor action and connector form fields into single component 2025-03-26 22:56:29 +00:00
Ettore Di Giacinto
0f2731f9e8 fix(actions): respect running context
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-26 22:58:52 +01:00
Ettore Di Giacinto
6e888f6008 Move action context to the job
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-26 22:37:25 +01:00
mudler
2713349c75 debug 2025-03-26 18:26:31 +01:00
mudler
b6cd62a8a3 Merge branch 'fixups/mixed' 2025-03-26 17:11:38 +01:00
mudler
dd6739cbbf fix: consistently track user message in connector
Signed-off-by: mudler <mudler@localai.io>
2025-03-26 17:11:09 +01:00
Ettore Di Giacinto
5cd0eaae3f fix: mixed fixups and enhancements (#107)
* chore(Makefile): build react dist if missing

Signed-off-by: mudler <mudler@localai.io>

* fix(planning): don't lose results

Signed-off-by: mudler <mudler@localai.io>

* fix(slack): track user messages when writing on channel

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-03-26 17:05:59 +01:00
mudler
9d6b81d9c2 fix(slack): track user messages when writing on channel
Signed-off-by: mudler <mudler@localai.io>
2025-03-26 16:59:23 +01:00
mudler
d5df14a714 fix(planning): don't lose results
Signed-off-by: mudler <mudler@localai.io>
2025-03-26 16:58:11 +01:00
mudler
8e9b87bcb1 chore(Makefile): build react dist if missing
Signed-off-by: mudler <mudler@localai.io>
2025-03-26 16:57:46 +01:00
Richard Palethorpe
3e36b09376 fix(ui): Format status result as string 2025-03-26 06:34:32 +00:00
Richard Palethorpe
074aefd0df feat(ui): Add status page to react frontend 2025-03-26 06:34:32 +00:00
Richard Palethorpe
3e1081fc6e fix(ui): Fix MCP form 2025-03-26 06:34:32 +00:00
Richard Palethorpe
73af9538eb feat(ui): Add agent avatar placeholders to agent list 2025-03-26 06:34:32 +00:00
Richard Palethorpe
959dd8c7f3 Update README with hot reloading instructions 2025-03-26 06:34:32 +00:00
Richard Palethorpe
71e66c651c feat(ui): Add React based UI for the vibes at /app
This adds a completely separate frontend based on React, because I
found that code generation works better with React once the application
gets bigger. In particular, it was getting very hard to move past
adding connectors and actions.

The idea is to replace the standard UI with this once it has been
tested. But for now it is available at /app in addition to the
original at /.

Signed-off-by: Richard Palethorpe <io@richiejp.com>
2025-03-26 06:34:32 +00:00
Ettore Di Giacinto
438a65caf6 Fixup printing large messages
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-25 22:59:08 +01:00
Ettore Di Giacinto
fb20bbe5bf Allow slack bots to initiate conversations
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-25 22:58:59 +01:00
Ettore Di Giacinto
fa12dba7c2 Better paragraph splitting
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-25 22:28:08 +01:00
mudler
54c8bf5f1a Split and preserve message 2025-03-25 20:10:40 +01:00
mudler
7bc44167cf Handle photos results in telegram
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 19:47:34 +01:00
mudler
88933784de Handle long responses
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 19:41:13 +01:00
mudler
d7b503e30c Fixups 2025-03-25 19:18:05 +01:00
mudler
e1e708ee75 Isolate functions 2025-03-25 19:06:55 +01:00
mudler
9d81eb7509 Do not lock on responses
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 19:03:01 +01:00
mudler
ddc7d0e100 handle lock inside goroutine
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 18:48:15 +01:00
mudler
e26b55a6a8 Add tests 2025-03-25 17:58:59 +01:00
mudler
ca3420c791 fixup silly mistake
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 17:25:59 +01:00
mudler
abd6d1bbf7 Do not allow to recursively follow plan actions
Signed-off-by: mudler <mudler@localai.io>
2025-03-25 16:59:13 +01:00
Ettore Di Giacinto
d0cfc4c317 feat: track conversations inside connectors (#92)
* switch to observer pattern

Signed-off-by: mudler <mudler@localai.io>

* keep conversation history in telegram and slack

* generalize with conversation tracker

---------

Signed-off-by: mudler <mudler@localai.io>
2025-03-25 16:31:03 +01:00
Ettore Di Giacinto
53c1554d55 fix: do not track an internal currentConversation (#91)
It is prone to races, and does not really track all conversations for
each job

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-25 00:36:09 +01:00
Ettore Di Giacinto
b09749dddb try to resolve user IDs to enable bot to mention users (#89)
Signed-off-by: mudler <mudler@localai.io>
2025-03-24 22:20:25 +01:00
mudler
b199c10ab7 do not re-generate avatar if already existing
Signed-off-by: mudler <mudler@localai.io>
2025-03-24 19:46:56 +01:00
mudler
c8abc5f28f fix(slack): do not convert, mention user when summoned by mentions
Signed-off-by: mudler <mudler@localai.io>
2025-03-24 15:41:57 +01:00
mudler
14948c965d feat(slack): update, improve links and mentions
Signed-off-by: mudler <mudler@localai.io>
2025-03-24 11:11:31 +01:00
Ettore Di Giacinto
558306a841 Update README.md 2025-03-24 10:23:05 +01:00
Richard Palethorpe
fb41663330 fix(ui): Add connector templates to group-create 2025-03-24 08:54:36 +00:00
Richard Palethorpe
84836b8345 feat(ui): Add individual forms for each connector 2025-03-24 08:54:36 +00:00
Ettore Di Giacinto
5f2a2eaa24 feat(slack): show thought process (#83)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-23 22:43:13 +01:00
mudler
75a8d63e83 Finish moving types 2025-03-23 21:57:09 +01:00
mudler
f0b8bfb4f4 no need to defer here
Signed-off-by: mudler <mudler@localai.io>
2025-03-23 11:57:01 +01:00
mudler
fa25e7c077 fixup: pass pointer to pool
Signed-off-by: mudler <mudler@localai.io>
2025-03-23 11:53:36 +01:00
Ettore Di Giacinto
3a9169bdbe feat(agents): Create group of agents (#82)
* feat(ui): add section to create agents in group

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Enhance UX and do not display first form section

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fixups

* Small fixups on avatar creation

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-22 21:41:51 +01:00
Ettore Di Giacinto
3a921f6241 feat(ui): generate avatars (#80)
* feat(ui): generate avatars

Signed-off-by: mudler <mudler@localai.io>

* Show a placeholder if the image is not ready

Signed-off-by: mudler <mudler@localai.io>

* feat(avatar): generate prompt first

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-22 20:50:31 +01:00
Ettore Di Giacinto
c1ac7b675a feat(api): add endpoint to create group of dedicated agents (#79)
Signed-off-by: mudler <mudler@localai.io>
2025-03-22 18:44:22 +01:00
Ettore Di Giacinto
d689bb4331 feat(actions): add playground to test actions (#74)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-21 23:51:55 +01:00
Ettore Di Giacinto
b42ef27641 feat: change default models (#73)
Signed-off-by: mudler <mudler@localai.io>
2025-03-21 21:44:06 +01:00
Ettore Di Giacinto
abb3ffc109 feat: track plan action when is being executed, also tests (#72)
* feat: track plan action when is being executed, also tests

* Update core/agent/agent_test.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Update core/agent/actions.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-03-21 16:32:24 +01:00
Ettore Di Giacinto
e5e238efc0 fix(planning): correctly generate a valid JSON schema (#71)
Signed-off-by: mudler <mudler@localai.io>
2025-03-21 15:47:34 +01:00
Ettore Di Giacinto
33483ab4b9 feat(planning): enable agent planning (#68)
* feat(planning): Allow the agent to plan subtasks

Signed-off-by: mudler <mudler@localai.io>

* feat(planning): enable planning toggle in the webui

Signed-off-by: mudler <mudler@localai.io>

* feat(planning): take in consideration the overall goal

Signed-off-by: mudler <mudler@localai.io>

* Update core/action/plan.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Signed-off-by: mudler <mudler@localai.io>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-03-21 12:28:11 +01:00
Ettore Di Giacinto
638eedc2a0 fix: correctly stop agents
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-20 22:46:54 +01:00
Ettore Di Giacinto
86d3596f41 Add more logging 2025-03-20 22:35:55 +01:00
mudler
16288c0fc3 fix: correct model name in defaults 2025-03-20 19:18:14 +01:00
mudler
5d42ebbc71 chore: add docker-compose for gpu setup
Signed-off-by: mudler <mudler@localai.io>
2025-03-20 19:17:42 +01:00
mudler
0513a327f6 feat(ssh): allow to customize action name
Signed-off-by: mudler <mudler@localai.io>
2025-03-20 19:15:12 +01:00
mudler
c3d3bba32a feat(ssh): allow to specify a fixed host/user to run commands
Signed-off-by: mudler <mudler@localai.io>
2025-03-20 19:13:37 +01:00
Ettore Di Giacinto
1b187444fc feat: ssh as shell command (#67)
* feat(ssh): add action to run shell commands over a remote server

Signed-off-by: mudler <mudler@localai.io>

* fix(github): correctly get content

Signed-off-by: mudler <mudler@localai.io>

---------

Signed-off-by: mudler <mudler@localai.io>
2025-03-20 18:55:19 +01:00
Ettore Di Giacinto
d54abc3ed0 Revert "Generate connector form based on meta-data (#62)" (#65)
This reverts commit d7cfa7f0b2.
2025-03-20 18:21:19 +01:00
Ettore Di Giacinto
1e5b3f501f feat(github): add action to read github project's README (#64)
Signed-off-by: mudler <mudler@localai.io>
2025-03-20 17:43:15 +01:00
mudler
0e240077ab ci: setup go 2025-03-20 16:36:19 +01:00
mudler
401172631d ci: fixups 2025-03-20 16:09:24 +01:00
mudler
5b8ca0b756 ci: drop docker removal 2025-03-20 16:08:21 +01:00
mudler
8be14b7e3f ci: drop docker removal 2025-03-20 16:07:28 +01:00
mudler
96de3bdddd Relax errors 2025-03-20 16:05:43 +01:00
mudler
56d209f95d Fixups in ci 2025-03-20 16:04:57 +01:00
mudler
169c5e8aad Setup docker in ci 2025-03-20 16:02:55 +01:00
Richard Palethorpe
d7cfa7f0b2 Generate connector form based on meta-data (#62)
* Ignore volumes and exe

* Export form meta-data

* use dynamic metaform for connectors

* fix populating form
2025-03-20 16:00:37 +01:00
Ettore Di Giacinto
43a46ad1fb Update tests.yml 2025-03-20 13:05:29 +01:00
mudler
2de5152bfd ci: run on self-hosted
Signed-off-by: mudler <mudler@localai.io>
2025-03-20 10:30:16 +01:00
Ettore Di Giacinto
a83f4512b6 feat: allow to set LocalRAG API URL ad key (#61)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-19 23:10:14 +01:00
Ettore Di Giacinto
08785e2908 feat: add action to call other agents (#60)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-19 22:58:35 +01:00
Ettore Di Giacinto
8e694f70ec Add description field (#59) 2025-03-19 22:40:21 +01:00
Ettore Di Giacinto
f0bd184fbd feat: add twitter action and connector (#58)
* feat: add twitter post action

Signed-off-by: mudler <mudler@localai.io>

* feat: handle twitter post messages limits

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* feat: add twitter connector, unify client

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* make sure answers do not exceed twitter maximum

---------

Signed-off-by: mudler <mudler@localai.io>
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-19 22:14:32 +01:00
Ettore Di Giacinto
e32a569796 try to fixup tests, enable e2e (#53)
* try to fixup tests, enable e2e

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Generate JSON character data with tools

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Rework generation of character

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Simplify text

Signed-off-by: mudler <mudler@localai.io>

* Relax some test constraints

Signed-off-by: mudler <mudler@localai.io>

* Fixups

* Properly fit schema generation

* Swap default model

* ci fixups

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>
2025-03-18 23:28:02 +01:00
Ettore Di Giacinto
31b5849d02 feat(api): add support to responses api (#52)
Signed-off-by: mudler <mudler@localai.io>
2025-03-17 18:42:33 +01:00
mudler
29a8713427 enhance update form
Signed-off-by: mudler <mudler@localai.io>
2025-03-17 16:13:03 +01:00
Ettore Di Giacinto
3c3b5a774c Fix race conditions 2025-03-16 22:59:59 +01:00
Ettore Di Giacinto
35c75b61d8 Refactor views 2025-03-16 22:59:48 +01:00
Ettore Di Giacinto
33b5b8c8f4 feat(agent): add MCP integration (#50)
* feat(agent): add MCP integration

Signed-off-by: mudler <mudler@localai.io>

* Update core/agent/mcp.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* Hook MCP Server configuration to creation and setting mask

* Allow to specify a bearer token

* Small fixups

---------

Signed-off-by: mudler <mudler@localai.io>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-03-15 23:25:03 +01:00
Richard Palethorpe
dc2570c90b Minor fixes for Docker and Javascript (#49)
* fix compose dependency

* add volume data to .dockerignore

* remove unused alpine.js import
2025-03-15 19:10:59 +01:00
Ettore Di Giacinto
aea0b424b9 try to get SHA of the content 2025-03-13 22:53:02 +01:00
Ettore Di Giacinto
5e73be42cb Always try to get branch sha
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-13 22:43:09 +01:00
Ettore Di Giacinto
53ebcdad5d Small fixups 2025-03-13 22:23:31 +01:00
Ettore Di Giacinto
a1cdabd0a8 Add github actions to comment on an issue or read the content
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-13 22:17:44 +01:00
Ettore Di Giacinto
26bcdf72a2 ci: drop arm64 builds for now (too slow, not used) 2025-03-13 22:00:41 +01:00
Ettore Di Giacinto
9347193fdc Need to fill more options to commit to github (#42)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-13 21:56:47 +01:00
Ettore Di Giacinto
efc82bde30 feat: add ActionGithubRepositoryCreateOrUpdate to Availableactions 2025-03-13 11:32:38 +01:00
Ettore Di Giacinto
6a451267d5 Return URL of issue opened
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-12 23:01:50 +01:00
Ettore Di Giacinto
9ee0d89a6b Add github actions to upload and get files, update github dep
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-12 22:50:45 +01:00
Ettore Di Giacinto
10f7c8ff13 feat(github): allow to customize action name
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-12 22:15:52 +01:00
Ettore Di Giacinto
c69ee9e5f7 feat(github-actions): allow to bind to a specific repository
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-12 21:59:31 +01:00
Ettore Di Giacinto
0ad2de72e0 feat(keys): allow to set api keys to secure the instance (#39) 2025-03-11 23:14:05 +01:00
Ettore Di Giacinto
1e484d7ccd Update README 2025-03-11 22:57:19 +01:00
Richard Palethorpe
7486e68a17 Add Counter action to count things (#38)
* add host

* fix: make user name explicit in IRC

* feat: Add counter action
2025-03-11 22:54:35 +01:00
Ettore Di Giacinto
3763f320b9 Update README.md 2025-03-11 22:53:38 +01:00
Ettore Di Giacinto
16e0836fc7 Update README.md 2025-03-11 22:53:28 +01:00
Ettore Di Giacinto
6954ad3217 Update README.md 2025-03-11 22:41:00 +01:00
Ettore Di Giacinto
69e043f8e1 Update logo 2025-03-11 22:35:42 +01:00
Ettore Di Giacinto
d451919414 feat(edit): allow to edit agents (#36) 2025-03-11 22:32:13 +01:00
mudler
9ff2fde44f feat: allow to specify models in agent creation mask
Signed-off-by: mudler <mudler@localai.io>
2025-03-11 10:20:28 +01:00
Ettore Di Giacinto
40b0d4bfc9 Merge pull request #33 from richiejp/compose
Fix no health check
2025-03-10 19:41:30 +01:00
Richard Palethorpe
14a70c3edd Fix no health check
ragserver doesn't have a health check and is from scratch image.
So do hack to wait for it to come up
2025-03-10 16:09:36 +00:00
mudler
0b71d8dc10 feat: make slack process images 2025-03-09 23:34:49 +01:00
mudler
bc60dde94f Enable more logging, only describe image once when walking history
Signed-off-by: mudler <mudler@localai.io>
2025-03-09 23:34:20 +01:00
Ettore Di Giacinto
28e80084f6 Update slack.yaml 2025-03-09 23:05:55 +01:00
mudler
7be93fb014 Update README 2025-03-09 17:20:26 +01:00
Ettore Di Giacinto
5ecb97e845 Merge pull request #32 from mudler/feat/multimodal
feat: add capability to understand images
2025-03-08 17:59:13 +01:00
mudler
3827ebebdf feat: add capability to understand images
Signed-off-by: mudler <mudler@localai.io>
2025-03-08 17:54:35 +01:00
mudler
106d1e61d4 Update docker-compose file 2025-03-08 12:48:37 +01:00
Ettore Di Giacinto
b884d9433a make sure /tmp exists 2025-03-07 22:51:34 +01:00
Richard Palethorpe
f2e7010297 Add IRC connector 2025-03-07 12:56:24 +00:00
Ettore Di Giacinto
1e1c123d84 chore(ci): specify Dockerfile 2025-03-05 22:25:32 +01:00
Ettore Di Giacinto
51ba87a7ba chore(ci): specify registry 2025-03-05 22:23:41 +01:00
Ettore Di Giacinto
bf8d8be5ad chore(ci): small fixups 2025-03-05 22:22:10 +01:00
Ettore Di Giacinto
311c0bb5ee Add goreleaser 2025-03-05 22:19:13 +01:00
Ettore Di Giacinto
7492a3ab3b Change env vars to be more meaningful 2025-03-05 22:19:07 +01:00
Ettore Di Giacinto
127c76d006 Add workflows for CI 2025-03-05 22:18:02 +01:00
Ettore Di Giacinto
2942668d89 Put logging of conversations behind ENABLE_CONVERSATIONS_LOGGING 2025-03-04 22:23:58 +01:00
Ettore Di Giacinto
d288755444 Automatically save all conversations 2025-03-04 22:22:16 +01:00
Ettore Di Giacinto
758a73e8ab Minor UX Tweaks
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-03-04 22:01:00 +01:00
Ettore Di Giacinto
5dd4b9cb65 Improve page style 2025-03-03 23:32:06 +01:00
Ettore Di Giacinto
d714c4f80b Page restyling 2025-03-03 23:08:58 +01:00
Ettore Di Giacinto
173eda4fb3 Rework UI by returning error/statuses, some refactorings 2025-03-03 22:34:46 +01:00
Ettore Di Giacinto
365f89cd5e initialize prompt blocks 2025-03-03 21:41:03 +01:00
Ettore Di Giacinto
5e52383a99 Refactorings 2025-03-02 22:44:54 +01:00
Ettore Di Giacinto
f6e16be170 Allow to specify dynamic prompts 2025-03-02 22:40:37 +01:00
Ettore Di Giacinto
5721c52c0d use date for file name prefix 2025-03-02 21:27:11 +01:00
mudler
6c83f3d089 refactorings 2025-03-02 17:45:06 +01:00
mudler
3a7e56cdf1 Cleanup 2025-03-02 17:43:11 +01:00
Ettore Di Giacinto
89d7da3817 Logging 2025-03-01 22:44:45 +01:00
Ettore Di Giacinto
7a98408336 Fixups, enhance logging 2025-03-01 22:40:42 +01:00
Ettore Di Giacinto
e696c5ae31 Save conversations also at the end 2025-03-01 22:31:49 +01:00
Ettore Di Giacinto
042c1ee65c Uniq results 2025-03-01 22:20:49 +01:00
Ettore Di Giacinto
3a7b5e1461 return metadatas and conversations in job results. Consume them in Slack connector to use attachments in responses 2025-03-01 22:10:21 +01:00
Ettore Di Giacinto
4d6b04c6ed Enable optional scaping of messages 2025-03-01 21:30:16 +01:00
Ettore Di Giacinto
70c389ce0b refactors 2025-03-01 21:30:16 +01:00
mudler
5b4f618ca3 Standardize action results 2025-03-01 17:27:07 +01:00
mudler
8492c95cb6 Convert markdown to slack markdown 2025-03-01 16:33:50 +01:00
mudler
a75feaab4e Simplify, no need to add system prompt when asking for reply 2025-03-01 16:17:27 +01:00
mudler
3790ad3666 Fix genimage action 2025-03-01 16:02:01 +01:00
mudler
a57f990576 Add genimage action 2025-03-01 12:18:10 +01:00
Ettore Di Giacinto
76a01994f9 Enable markdown responses in slack 2025-02-28 22:50:44 +01:00
Ettore Di Giacinto
4520d16522 Fixups 2025-02-28 22:48:50 +01:00
Ettore Di Giacinto
0dce524c7a Fixups 2025-02-28 22:48:26 +01:00
Ettore Di Giacinto
0b78956cc4 Enhance logging 2025-02-28 22:38:05 +01:00
Ettore Di Giacinto
43352376e3 Use toolcall to construct current conversation 2025-02-28 22:37:55 +01:00
Ettore Di Giacinto
d3f2126614 Debug 2025-02-28 22:10:28 +01:00
Ettore Di Giacinto
cf112d57a6 Strip bot user from messages received 2025-02-28 22:06:05 +01:00
mudler
f28725199c Reply with threads history 2025-02-28 19:55:10 +01:00
mudler
fbcc618355 Reply to mentions in threads 2025-02-28 19:30:12 +01:00
Ettore Di Giacinto
094580724f Merge pull request #28 from mudler/localrag
feat: integrate LocalRAG
2025-02-28 17:02:50 +01:00
Ettore Di Giacinto
371ea63f5a Integrate with LocalRAG, drop RAG functionalities 2025-02-27 23:51:02 +01:00
Ettore Di Giacinto
bc431ea6ff Add LocalRAG client 2025-02-27 23:17:46 +01:00
mudler
5c6df5adc0 Fix entrypoint 2025-02-27 19:38:21 +01:00
Ettore Di Giacinto
46085077a5 Update README.md 2025-02-26 23:10:39 +01:00
Ettore Di Giacinto
33f1c102e4 Update README.md 2025-02-26 22:52:35 +01:00
Ettore Di Giacinto
b66e698b5e Rename package 2025-02-26 22:51:29 +01:00
Ettore Di Giacinto
43c29fbdb0 Re-order main 2025-02-26 22:45:50 +01:00
Ettore Di Giacinto
0a18d8409e refactoring 2025-02-26 22:37:48 +01:00
Ettore Di Giacinto
0139b79835 refactoring 2025-02-25 23:17:28 +01:00
Ettore Di Giacinto
296734ba3b reordering 2025-02-25 22:18:08 +01:00
Ettore Di Giacinto
d73fd545b2 Hook to the webui 2025-02-25 22:09:38 +01:00
Ettore Di Giacinto
fa1ae086bf Move custom action into actions 2025-02-25 22:09:12 +01:00
Ettore Di Giacinto
96091a1ad5 Add custom actions with golang interpreter
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-02-24 23:21:51 +01:00
mudler
ac6c03bcb7 wip 2025-02-23 19:43:22 +01:00
mudler
6ae019db23 Add realtimesst example 2025-02-05 19:57:25 +01:00
mudler
2a6650c3ea Add deepface client 2025-02-04 12:53:39 +01:00
mudler
5e6863379c Time shouldn't be part of the Hud, but always rendered
Signed-off-by: mudler <mudler@localai.io>
2024-12-25 23:17:40 +01:00
mudler
8c447a0cf8 feat: support separate knowledge bases for each agent
Also allow to export/import KB

Signed-off-by: mudler <mudler@localai.io>
2024-12-18 20:17:15 +01:00
mudler
c68ff23b01 Small fixups
Signed-off-by: mudler <mudler@localai.io>
2024-12-18 17:44:52 +01:00
mudler
735bab5e32 Fixups to long memory, and avoid re-initing RAGDB recursively
Signed-off-by: mudler <mudler@localai.io>
2024-12-18 12:41:18 +01:00
mudler
c2035c95c5 Always store in KB when adding new content
Signed-off-by: mudler <mudler@localai.io>
2024-12-18 11:30:02 +01:00
mudler
351aa80b74 Use InMemoryDB in place of a RAGDB to store things as files
Signed-off-by: mudler <mudler@localai.io>
2024-12-17 22:24:38 +01:00
mudler
0fc09b4f49 Do not summarize if nothing is in memory 2024-12-17 19:55:21 +01:00
mudler
1900915609 Use marco-o1 for defaults 2024-12-17 19:53:43 +01:00
mudler
6bafb48cec Customize embedding model 2024-12-17 19:53:30 +01:00
mudler
2561f2f175 Add long term memory 2024-12-17 19:53:07 +01:00
mudler
d053dc249c Panic for now if agent config is failing
Signed-off-by: mudler <mudler@localai.io>
2024-12-15 18:15:04 +01:00
mudler
6318f1aa97 Store into memory conversation before resetting it
Signed-off-by: mudler <mudler@localai.io>
2024-12-15 18:13:54 +01:00
mudler
a204b3677b add github PR watcher
Signed-off-by: mudler <mudler@localai.io>
2024-06-30 15:12:31 +02:00
Ettore Di Giacinto
b1d90dbedd Merge pull request #11 from greygoo/add-longchain-community
Add langchain-community requirement to make it build again
2024-06-23 23:13:50 +02:00
greygoo
361927b0a4 Add langchain-community requirement to make it build again 2024-06-23 23:02:35 +02:00
mudler
b010f48ddc Return content if present 2024-05-31 00:43:55 +02:00
mudler
6d866985e8 Small renames 2024-05-23 18:59:10 +02:00
mudler
989a2421ba Do not do two requests if reasoning is disabled
Signed-off-by: mudler <mudler@localai.io>
2024-05-23 18:58:53 +02:00
mudler
3f9b454276 finish handling support if no reasoning is used 2024-05-13 19:50:18 +02:00
mudler
fdd4af1042 Do not inject a reply action 2024-05-13 00:16:13 +02:00
mudler
24ef65f70d Enable reasoning separately 2024-05-12 22:47:36 +02:00
mudler
5c3497a433 The bug was that we didn't reset the timer when an agent was not working as standalone
2024-04-21 16:33:04 +02:00
mudler
9974dec2e1 Fix freaking BUG! 2024-04-21 16:32:01 +02:00
mudler
773259ceaa Stop action when picking up the job 2024-04-21 15:39:38 +02:00
Ettore Di Giacinto
1fc0db18cd hook sendmail action 2024-04-21 00:43:43 +02:00
Ettore Di Giacinto
249cedd9d1 add sendmail action 2024-04-21 00:42:58 +02:00
mudler
bc7192d664 handle corner case 2024-04-20 10:35:57 +02:00
Ettore Di Giacinto
22160cbf9e Cleanup context, more logging 2024-04-20 00:19:17 +02:00
mudler
34f6d821b9 more logging 2024-04-19 13:17:08 +02:00
Ettore Di Giacinto
e2aa3bedd7 Enhance logging 2024-04-19 00:15:53 +02:00
Ettore Di Giacinto
f6dcc09562 more logging 2024-04-19 00:08:18 +02:00
Ettore Di Giacinto
08563c3286 uniform logging 2024-04-19 00:02:00 +02:00
mudler
2cba2eafe6 fixups and workarounds llm want to output tags 2024-04-17 19:54:26 +02:00
mudler
c0773f03f8 Do not add that to the conv 2024-04-17 12:46:01 +02:00
mudler
bad3e53f06 Revert "this is not needed.. probably?"
This reverts commit cac986d287.
2024-04-17 12:45:02 +02:00
mudler
96f7a4653f Sort agents 2024-04-17 12:42:49 +02:00
mudler
cac986d287 this is not needed.. probably? 2024-04-17 12:42:21 +02:00
Ettore Di Giacinto
4a42ec0487 give reasoning when expanding parameters 2024-04-17 00:40:53 +02:00
mudler
9d0dcf792a fixups 2024-04-16 19:27:56 +02:00
mudler
324cbcd2a6 Force answering when answering was chosen 2024-04-16 18:54:37 +02:00
mudler
35d9ba44f5 Fixups 2024-04-16 17:44:04 +02:00
mudler
4ed61daf8e Better error handling 2024-04-16 10:26:55 +02:00
mudler
56a6a10980 Minor fixups 2024-04-16 10:19:15 +02:00
Ettore Di Giacinto
ca57f4cd85 add browse action 2024-04-16 00:20:05 +02:00
mudler
609b2a2b7c Correct description of 'stop' 2024-04-15 18:32:22 +02:00
mudler
55b7b4a41e Add more actions, small fixups in the UI, add stop action 2024-04-15 17:50:51 +02:00
mudler
7db2b10bd2 Do not let age to be either a string or an int 2024-04-15 12:09:22 +02:00
Ettore Di Giacinto
f0bc2be678 logging: pt1 2024-04-15 00:13:10 +02:00
mudler
ac8f6e94ff wip: noaction for deciding to stop 2024-04-14 16:38:56 +02:00
mudler
27f7299749 Fixup timers (?) 2024-04-14 16:38:14 +02:00
mudler
eda99d05f4 Small UI enhancements 2024-04-14 11:24:55 +02:00
mudler
0bc95cfd16 Increase textarea 2024-04-14 11:16:34 +02:00
Ettore Di Giacinto
80508001c7 generate elem with go 2024-04-14 00:31:24 +02:00
Ettore Di Giacinto
7c0f615952 Add import/export 2024-04-14 00:05:19 +02:00
Ettore Di Giacinto
54de6221d1 add start/pause (to improve further) 2024-04-13 00:38:58 +02:00
Ettore Di Giacinto
74b158e9f1 Add status to show the reasoning log 2024-04-13 00:16:28 +02:00
mudler
4b41653d00 Add reset, some minor UX tweaks 2024-04-12 19:34:26 +02:00
mudler
03175d36ab Add ignore files 2024-04-12 16:18:45 +02:00
mudler
e6fdd91369 Add docker-compose 2024-04-12 15:47:06 +02:00
Ettore Di Giacinto
97f5ba16cd Add logos and static 2024-04-12 00:48:11 +02:00
Ettore Di Giacinto
677a96074b Add logo 2024-04-12 00:39:35 +02:00
Ettore Di Giacinto
82f1b37ae8 Small ux enhancements 2024-04-12 00:23:55 +02:00
Ettore Di Giacinto
adf649be99 Uniform webui 2024-04-11 23:45:57 +02:00
Ettore Di Giacinto
26baaecbcd Correctly handle chat render 2024-04-11 23:21:47 +02:00
mudler
fac211d8ee parallel calls for telegram 2024-04-11 23:04:29 +02:00
mudler
84a808fb7e fixups 2024-04-11 23:01:40 +02:00
mudler
6528da804f Add dockerfile 2024-04-11 20:00:38 +02:00
mudler
3170ee0ba2 Add search action 2024-04-11 19:32:36 +02:00
mudler
3295942e59 Add issue closer 2024-04-11 17:05:41 +02:00
mudler
fa39ae93f1 Add github actions: 2024-04-11 13:05:35 +02:00
Ettore Di Giacinto
d237e17719 Support pdf ingestion 2024-04-11 00:40:46 +02:00
Ettore Di Giacinto
cb35f871db Split main 2024-04-11 00:20:49 +02:00
Ettore Di Giacinto
f85962769a Move views 2024-04-11 00:16:03 +02:00
Ettore Di Giacinto
3b361d1a3d Add prompt blocks 2024-04-10 23:49:13 +02:00
Ettore Di Giacinto
ac580b562e User message is the last 2024-04-10 23:34:01 +02:00
mudler
230d012915 Add customizable system prompt 2024-04-10 20:35:04 +02:00
mudler
0dda705017 Slack: no need of serial messages. in this way can be interrupted 2024-04-10 20:29:30 +02:00
mudler
69cbcc5c88 Allow slack connector to answer to channel messages 2024-04-10 20:28:15 +02:00
mudler
437da44590 Allow to skip indexing 2024-04-10 19:42:54 +02:00
mudler
82ac74ac5d Add an index 2024-04-10 19:40:39 +02:00
mudler
7c70f09834 Reload store at start for now 2024-04-10 17:40:15 +02:00
mudler
73524adfce Make Knowledgebase RAG functional (almost) 2024-04-10 17:38:16 +02:00
Ettore Di Giacinto
48d17b6952 Do not fail if KB retrieve fails 2024-04-10 00:04:07 +02:00
Ettore Di Giacinto
db490fb3ca KB: Specify chunk size 2024-04-09 23:54:58 +02:00
Ettore Di Giacinto
a1edf005a9 KB: add webui sections 2024-04-09 23:52:39 +02:00
Ettore Di Giacinto
78ba7871e9 rag: add KB to conversation 2024-04-09 22:34:22 +02:00
mudler
36abf837a9 Add rag commands 2024-04-09 20:05:08 +02:00
mudler
4453f43bec Add slack and github connectors 2024-04-09 18:24:47 +02:00
Ettore Di Giacinto
414a973282 Add slack connector 2024-04-08 23:50:59 +02:00
Ettore Di Giacinto
80361a6400 small fixups 2024-04-08 23:25:17 +02:00
Ettore Di Giacinto
f11c8f3f57 Enhance create form, wire up permanent goal 2024-04-08 23:18:03 +02:00
Ettore Di Giacinto
c56e277a2b Add telegram connector 2024-04-08 23:03:24 +02:00
Ettore Di Giacinto
cf3c4549da allow to set configs for actions too 2024-04-08 22:58:53 +02:00
mudler
66b1847644 Allow to configure connectors 2024-04-08 20:15:32 +02:00
mudler
533caeee96 Add conversation history for jobs 2024-04-08 16:10:26 +02:00
mudler
185fb89d39 wips 2024-04-08 11:15:36 +02:00
Ettore Di Giacinto
7adcce78be wip: UI 2024-04-08 00:35:14 +02:00
mudler
dbfc596333 comment 2024-04-07 20:30:20 +02:00
mudler
dda993e43b Add skeleton for pages 2024-04-07 20:29:33 +02:00
mudler
00078b5da8 Cleanups 2024-04-07 20:23:22 +02:00
mudler
a4a2a172f5 A manager for each agent 2024-04-07 20:22:07 +02:00
mudler
8db36b4619 use go-fiber 2024-04-07 20:13:40 +02:00
mudler
59b91d1403 Initiate agents from pool 2024-04-07 20:13:28 +02:00
Ettore Di Giacinto
23867bf0e6 fixups 2024-04-07 00:22:17 +02:00
Ettore Di Giacinto
27202c3622 comments 2024-04-07 00:17:37 +02:00
Ettore Di Giacinto
dc72904cee add agentpool 2024-04-07 00:16:02 +02:00
Ettore Di Giacinto
8ceaab2ac1 enhance webui graphics 2024-04-07 00:15:48 +02:00
mudler
015b26096f wip animation 2024-04-06 20:37:09 +02:00
mudler
215c3ddbf7 UI: block messages while agent is replying, visualize status 2024-04-06 16:16:09 +02:00
mudler
84c56f6c3e Allow to initiate new conversations 2024-04-06 16:08:04 +02:00
mudler
90fd130e31 comments 2024-04-06 15:20:45 +02:00
mudler
9f5a32a1bf If we didn't had any response, just use the reasoning 2024-04-06 15:08:39 +02:00
mudler
13b08c8cb3 Consume answer if we didn't picked up any tool 2024-04-06 15:08:21 +02:00
mudler
4f9ec2896b reason should be said from system
2024-04-06 15:07:48 +02:00
Ettore Di Giacinto
e3987e9bd1 small fixups 2024-04-06 00:22:05 +02:00
Ettore Di Giacinto
899e3c0771 Search might be useful for other agents 2024-04-05 23:59:48 +02:00
Ettore Di Giacinto
f2c74e29e8 Add search internet 2024-04-05 23:56:03 +02:00
mudler
5f29125bbb webui, fixes 2024-04-05 17:14:53 +02:00
Ettore Di Giacinto
32f4e53242 Remove focus 2024-04-05 01:28:31 +02:00
Ettore Di Giacinto
3f6b68a60a Minor fixes 2024-04-05 01:27:52 +02:00
Ettore Di Giacinto
9e8b1dab84 adapt test 2024-04-05 01:24:28 +02:00
Ettore Di Giacinto
652cef288d control show of character,global callbacks, re-add replies during internal runs to self-stop 2024-04-05 01:20:20 +02:00
mudler
744af19025 WIP? 2024-04-04 22:44:59 +02:00
mudler
5c58072ad7 Avoid compression 2024-04-04 20:25:02 +02:00
mudler
98c53e042d minor 2024-04-04 20:14:04 +02:00
mudler
79e5dffe09 wip 2024-04-04 20:00:58 +02:00
mudler
b4fd482f66 fix state update, save/load 2024-04-04 16:58:25 +02:00
Ettore Di Giacinto
9173156e40 Update state action 2024-04-04 00:19:56 +02:00
Ettore Di Giacinto
936b2af4ca Return JobResult 2024-04-03 23:17:46 +02:00
Ettore Di Giacinto
9df690999c fix template 2024-04-03 23:17:20 +02:00
mudler
7db9aea57b comments 2024-04-03 20:03:51 +02:00
mudler
41692d700c automatic background actions 2024-04-03 20:02:39 +02:00
mudler
e6090c62cf Split character from state 2024-04-03 18:04:50 +02:00
mudler
a404b75fbe remove debug prints 2024-04-03 18:04:33 +02:00
mudler
58ba4db1dd fix tests 2024-04-02 20:29:23 +02:00
mudler
9417c5ca8f correctly store conversation 2024-04-02 17:32:27 +02:00
mudler
8e3a1fcbe5 return statesg 2024-04-01 22:50:11 +02:00
mudler
7c679ead94 use options 2024-04-01 20:02:25 +02:00
mudler
b45490e84d wip introspection 2024-04-01 18:36:30 +02:00
mudler
2ebbf1007f Allow to display HUD/character informations to the LLM 2024-04-01 18:27:09 +02:00
mudler
fdb0585dcb improve templating 2024-04-01 18:08:29 +02:00
Ettore Di Giacinto
56ceedb4fb add reasoning action 2024-04-01 11:33:12 +02:00
Ettore Di Giacinto
f2e09dfe81 sub-sequent action support 2024-04-01 00:45:32 +02:00
Ettore Di Giacinto
eb4294bdbb re-eval 2024-04-01 00:17:16 +02:00
mudler
3b1a54083d wip 2024-03-31 23:06:28 +02:00
mudler
aa62d9ef9e refactor 2024-03-31 17:20:06 +02:00
mudler
8601956e53 simplify declarations 2024-03-31 17:08:34 +02:00
mudler
11486d4720 make it work 2024-03-31 16:51:38 +02:00
Ettore Di Giacinto
658681f344 let it choose 2024-03-31 00:26:48 +01:00
mudler
9926674c38 fixups 2024-03-30 22:35:24 +01:00
mudler
61e4be0d0c add job/queue logics 2024-01-21 16:12:34 +01:00
mudler
d52e3b8309 fixup 2024-01-21 11:18:40 +01:00
mudler
2cc907cbd7 better debugging 2024-01-21 11:17:10 +01:00
mudler
3790a872ea refactoring 2024-01-21 11:09:47 +01:00
mudler
d22154e9be Initial import 2024-01-20 19:41:09 +01:00
mudler
a1203c8f14 Initial commit 2024-01-20 19:40:57 +01:00
Ettore Di Giacinto
509080c8f2 Update requirements.txt 2023-12-28 22:47:53 +01:00
Ettore Di Giacinto
2535895214 Update requirements.txt 2023-12-28 16:38:02 +01:00
Ettore Di Giacinto
60e9d66b36 Update requirements.txt 2023-12-28 16:37:47 +01:00
Ettore Di Giacinto
df15681400 Update requirements.txt 2023-12-28 16:37:26 +01:00
mudler
f1b39164ef Submit thread context in slack 2023-12-16 19:12:24 +01:00
mudler
7e3f2cbffb Add slack example 2023-12-16 18:54:53 +01:00
Ettore Di Giacinto
0f953d8ad9 Merge pull request #6 from scenaristeur/patch-1
Update README.md
2023-12-02 08:40:21 +01:00
Ettore Di Giacinto
491a0f2b3d Merge pull request #8 from richiejp/pin-openai-api
Pin openai API to pre v1.0
2023-12-02 08:40:05 +01:00
Richard Palethorpe
e606d637e2 Pin openai API to pre v1.0 2023-12-01 16:50:01 +00:00
David
ab24a2a7cf Update README.md
first pull the container
2023-10-02 19:04:56 +02:00
mudler
8164190425 s/caption/description
Signed-off-by: mudler <mudler@localai.io>
2023-08-26 15:33:01 +02:00
mudler
d1ae51ae5d Set api base from config if not specified from env 2023-08-26 08:58:34 +02:00
mudler
f740f307e2 Disable critic by default (discord) 2023-08-26 01:35:57 +02:00
mudler
d5f72a6c82 Drop unrequired 2023-08-26 01:02:18 +02:00
mudler
87736e464a Update sample config, fixup sqlite load 2023-08-26 00:48:40 +02:00
mudler
414f9ca765 Add sources in results, allow to set search type and results from configs 2023-08-26 00:42:49 +02:00
mudler
a7462249a7 Add support for milvus in the discord bot 2023-08-26 00:17:19 +02:00
mudler
317b91a7e9 add beautifulsoup4 2023-08-24 18:42:16 +02:00
mudler
ee8351a637 discord: track actions 2023-08-24 00:00:47 +02:00
mudler
d01ee8e27b Add action picker critic
Force to critic the action before actually chosing
2023-08-23 21:56:35 +02:00
mudler
d08a097175 discord: fix interactions answers 2023-08-23 21:56:19 +02:00
mudler
42d35dd7a3 Add ingest command 2023-08-23 21:55:57 +02:00
mudler
a823131a2d Add langchain 2023-08-23 00:35:43 +02:00
mudler
d32940e604 Add discord bot, github pipelines 2023-08-23 00:30:35 +02:00
mudler
11514a0e0c no need for the counter 2023-08-21 23:50:53 +02:00
mudler
ee535536ad fix python req 2023-08-21 01:49:40 +02:00
mudler
1ad68eca01 Drop unused attributes, add callback 2023-08-20 23:45:22 +02:00
mudler
d57db7449a force summarization at the end 2023-08-20 12:22:04 +02:00
mudler
a3300dfb6d Add pyproject 2023-08-20 12:00:44 +02:00
215 changed files with 31277 additions and 1556 deletions


@@ -1,2 +1,3 @@
models/
db/
data/
volumes/

.env

@@ -1,26 +0,0 @@
# Enable debug mode in the LocalAI API
DEBUG=true
# Where models are stored
MODELS_PATH=/models
# Galleries to use
GALLERIES=[{"name":"model-gallery", "url":"github:go-skynet/model-gallery/index.yaml"}, {"url": "github:go-skynet/model-gallery/huggingface.yaml","name":"huggingface"}]
# Select model configuration in the config directory
#PRELOAD_MODELS_CONFIG=/config/wizardlm-13b.yaml
PRELOAD_MODELS_CONFIG=/config/wizardlm-13b.yaml
#PRELOAD_MODELS_CONFIG=/config/wizardlm-13b-superhot.yaml
# You don't need to put a valid OpenAI key, however, the python libraries expect
# the string to be set or panics
OPENAI_API_KEY=sk---
# Set the OpenAI API base URL to point to LocalAI
DEFAULT_API_BASE=http://api:8080
# Set an image path
IMAGE_PATH=/tmp
# Set number of default threads
THREADS=14

.github/dependabot.yml

@@ -0,0 +1,19 @@
# https://docs.github.com/en/code-security/dependabot/working-with-dependabot/dependabot-options-reference#package-ecosystem-
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"
- package-ecosystem: "docker"
directory: "/"
schedule:
interval: "monthly"
- package-ecosystem: "gomod"
directory: "/"
schedule:
interval: "weekly"
- package-ecosystem: "bun"
directory: "/webui/react-ui"
schedule:
interval: "weekly"

.github/workflows/goreleaser.yml

@@ -0,0 +1,32 @@
name: goreleaser
on:
push:
tags:
- 'v*' # Add this line to trigger the workflow on tag pushes that match 'v*'
permissions:
id-token: write
contents: read
jobs:
goreleaser:
permissions:
contents: write
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Go
uses: actions/setup-go@v5
with:
go-version: 1.24
- name: Run GoReleaser
uses: goreleaser/goreleaser-action@v6
with:
version: '~> v2'
args: release --clean
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.github/workflows/image.yml

@@ -0,0 +1,160 @@
---
name: 'build container images'
on:
push:
branches:
- main
tags:
- '*'
concurrency:
group: ci-image-${{ github.head_ref || github.ref }}-${{ github.repository }}
cancel-in-progress: true
jobs:
containerImages:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Prepare
id: prep
run: |
DOCKER_IMAGE=quay.io/mudler/localagi
# Use branch name as default
VERSION=${GITHUB_REF#refs/heads/}
BINARY_VERSION=$(git describe --always --tags --dirty)
SHORTREF=${GITHUB_SHA::8}
# If this is git tag, use the tag name as a docker tag
if [[ $GITHUB_REF == refs/tags/* ]]; then
VERSION=${GITHUB_REF#refs/tags/}
fi
TAGS="${DOCKER_IMAGE}:${VERSION},${DOCKER_IMAGE}:${SHORTREF}"
# If the VERSION looks like a version number, assume that
# this is the most recent version of the image and also
# tag it 'latest'.
if [[ $VERSION =~ ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ ]]; then
TAGS="$TAGS,${DOCKER_IMAGE}:latest"
fi
# Set output parameters.
echo ::set-output name=binary_version::${BINARY_VERSION}
echo ::set-output name=tags::${TAGS}
echo ::set-output name=docker_image::${DOCKER_IMAGE}
- name: Set up QEMU
uses: docker/setup-qemu-action@master
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@master
- name: Login to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
registry: quay.io
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804
with:
images: quay.io/mudler/localagi
tags: |
type=ref,event=branch,suffix=-{{date 'YYYYMMDDHHmmss'}}
type=semver,pattern={{raw}}
type=sha,suffix=-{{date 'YYYYMMDDHHmmss'}}
type=ref,event=branch
flavor: |
latest=auto
prefix=
suffix=
- name: Build
uses: docker/build-push-action@v6
with:
builder: ${{ steps.buildx.outputs.name }}
build-args: |
VERSION=${{ steps.prep.outputs.binary_version }}
context: ./
file: ./Dockerfile.webui
#platforms: linux/amd64,linux/arm64
platforms: linux/amd64
push: true
#tags: ${{ steps.prep.outputs.tags }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
mcpbox-build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Prepare
id: prep
run: |
DOCKER_IMAGE=quay.io/mudler/localagi-mcpbox
# Use branch name as default
VERSION=${GITHUB_REF#refs/heads/}
BINARY_VERSION=$(git describe --always --tags --dirty)
SHORTREF=${GITHUB_SHA::8}
# If this is git tag, use the tag name as a docker tag
if [[ $GITHUB_REF == refs/tags/* ]]; then
VERSION=${GITHUB_REF#refs/tags/}
fi
TAGS="${DOCKER_IMAGE}:${VERSION},${DOCKER_IMAGE}:${SHORTREF}"
# If the VERSION looks like a version number, assume that
# this is the most recent version of the image and also
# tag it 'latest'.
if [[ $VERSION =~ ^[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}$ ]]; then
TAGS="$TAGS,${DOCKER_IMAGE}:latest"
fi
# Set output parameters.
echo ::set-output name=binary_version::${BINARY_VERSION}
echo ::set-output name=tags::${TAGS}
echo ::set-output name=docker_image::${DOCKER_IMAGE}
- name: Set up QEMU
uses: docker/setup-qemu-action@master
with:
platforms: all
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@master
- name: Login to DockerHub
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
with:
registry: quay.io
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@902fa8ec7d6ecbf8d84d538b9b233a880e428804
with:
images: quay.io/mudler/localagi-mcpbox
tags: |
type=ref,event=branch,suffix=-{{date 'YYYYMMDDHHmmss'}}
type=semver,pattern={{raw}}
type=sha,suffix=-{{date 'YYYYMMDDHHmmss'}}
type=ref,event=branch
flavor: |
latest=auto
prefix=
suffix=
- name: Build
uses: docker/build-push-action@v6
with:
builder: ${{ steps.buildx.outputs.name }}
build-args: |
VERSION=${{ steps.prep.outputs.binary_version }}
context: ./
file: ./Dockerfile.mcpbox
#platforms: linux/amd64,linux/arm64
platforms: linux/amd64
push: true
#tags: ${{ steps.prep.outputs.tags }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}

.github/workflows/tests.yml

@@ -0,0 +1,50 @@
name: Run Go Tests
on:
push:
branches:
- 'main'
pull_request:
branches:
- '**'
concurrency:
group: ci-tests-${{ github.head_ref || github.ref }}-${{ github.repository }}
cancel-in-progress: true
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- run: |
# Add Docker's official GPG key:
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
# Add the repository to Apt sources:
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
$(. /etc/os-release && echo "${UBUNTU_CODENAME:-$VERSION_CODENAME}") stable" | \
sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
docker version
docker run --rm hello-world
- uses: actions/setup-go@v5
with:
go-version: '>=1.17.0'
- name: Run tests
run: |
sudo apt-get update && sudo apt-get install -y make
make tests
#sudo mv coverage/coverage.txt coverage.txt
#sudo chmod 777 coverage.txt
# - name: Upload coverage to Codecov
# uses: codecov/codecov-action@v4
# with:
# token: ${{ secrets.CODECOV_TOKEN }}

.gitignore

@@ -1,2 +1,10 @@
db/
models/
data/
pool
uploads/
local-agent-framework
localagi
LocalAGI
**/.env
.vscode
volumes/

.goreleaser.yml

@@ -0,0 +1,40 @@
# Make sure to check the documentation at http://goreleaser.com
version: 2
builds:
- main: ./
id: "localagi"
binary: localagi
ldflags:
- -w -s
# - -X github.com/internal.Version={{.Tag}}
# - -X github.com/internal.Commit={{.Commit}}
env:
- CGO_ENABLED=0
goos:
- linux
- windows
- darwin
- freebsd
goarch:
- amd64
- arm
- arm64
source:
enabled: true
name_template: '{{ .ProjectName }}-{{ .Tag }}-source'
archives:
# Default template uses underscores instead of -
- name_template: >-
{{ .ProjectName }}-{{ .Tag }}-
{{- if eq .Os "freebsd" }}FreeBSD
{{- else }}{{- title .Os }}{{end}}-
{{- if eq .Arch "amd64" }}x86_64
{{- else if eq .Arch "386" }}i386
{{- else }}{{ .Arch }}{{end}}
{{- if .Arm }}v{{ .Arm }}{{ end }}
checksum:
name_template: '{{ .ProjectName }}-{{ .Tag }}-checksums.txt'
snapshot:
name_template: "{{ .Tag }}-next"
changelog:
use: github-native


@@ -1,18 +0,0 @@
FROM python:3.10-bullseye
WORKDIR /app
COPY ./requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
ENV DEBIAN_FRONTEND noninteractive
# Install package dependencies
RUN apt-get update -y && \
apt-get install -y --no-install-recommends \
alsa-utils \
libsndfile1-dev && \
apt-get clean
COPY . /app
RUN pip install .
ENTRYPOINT [ "python", "./main.py" ];

Dockerfile.mcpbox

@@ -0,0 +1,47 @@
# Build stage
FROM golang:1.24-alpine AS builder
# Install build dependencies
RUN apk add --no-cache git
# Set working directory
WORKDIR /app
# Copy go mod files
COPY go.mod go.sum ./
# Download dependencies
RUN go mod download
# Copy source code
COPY . .
# Build the application
RUN CGO_ENABLED=0 GOOS=linux go build -o mcpbox ./cmd/mcpbox
# Final stage
FROM alpine:3.19
# Install runtime dependencies
RUN apk add --no-cache ca-certificates tzdata docker
# Create non-root user
#RUN adduser -D -g '' appuser
# Set working directory
WORKDIR /app
# Copy binary from builder
COPY --from=builder /app/mcpbox .
# Use non-root user
#USER appuser
# Expose port
EXPOSE 8080
# Set entrypoint
ENTRYPOINT ["/app/mcpbox"]
# Default command
CMD ["-addr", ":8080"]

Dockerfile.realtimesst

@@ -0,0 +1,12 @@
# python
FROM python:3.13-slim
ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update && apt-get install -y python3-dev portaudio19-dev ffmpeg build-essential
RUN pip install RealtimeSTT
#COPY ./example/realtimesst /app
# https://github.com/KoljaB/RealtimeSTT/blob/master/RealtimeSTT_server/README.md#server-usage
ENTRYPOINT ["stt-server"]
#ENTRYPOINT [ "/app/main.py" ]

Dockerfile.webui

@@ -0,0 +1,55 @@
# Use Bun container for building the React UI
FROM oven/bun:1 AS ui-builder
# Set the working directory for the React UI
WORKDIR /app
# Copy package.json and bun.lockb (if exists)
COPY webui/react-ui/package.json webui/react-ui/bun.lockb* ./
# Install dependencies
RUN bun install --frozen-lockfile
# Copy the rest of the React UI source code
COPY webui/react-ui/ ./
# Build the React UI
RUN bun run build
# Use a temporary build image based on Golang 1.24-alpine
FROM golang:1.24-alpine AS builder
# Define argument for linker flags
ARG LDFLAGS="-s -w"
# Install git
RUN apk add --no-cache git
RUN rm -rf /tmp/* /var/cache/apk/*
# Set the working directory
WORKDIR /work
# Copy go.mod and go.sum files first to leverage Docker cache
COPY go.mod go.sum ./
# Download dependencies - this layer will be cached as long as go.mod and go.sum don't change
RUN go mod download
# Now copy the rest of the source code
COPY . .
# Copy the built React UI from the ui-builder stage
COPY --from=ui-builder /app/dist /work/webui/react-ui/dist
# Build the application
RUN CGO_ENABLED=0 go build -ldflags="$LDFLAGS" -o localagi ./
FROM scratch
# Copy the webui binary from the builder stage to the final image
COPY --from=builder /work/localagi /localagi
COPY --from=builder /etc/ssl/ /etc/ssl/
COPY --from=builder /tmp /tmp
# Define the command that will be run when the container is started
ENTRYPOINT ["/localagi"]


@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2023 Ettore Di Giacinto
Copyright (c) 2023-2025 Ettore Di Giacinto (mudler@localai.io)
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

Makefile

@@ -0,0 +1,40 @@
GOCMD?=go
IMAGE_NAME?=webui
MCPBOX_IMAGE_NAME?=mcpbox
ROOT_DIR:=$(shell dirname $(realpath $(lastword $(MAKEFILE_LIST))))
prepare-tests: build-mcpbox
docker compose up -d --build
docker run -d -v /var/run/docker.sock:/var/run/docker.sock --privileged -p 9090:8080 --rm -ti $(MCPBOX_IMAGE_NAME)
cleanup-tests:
docker compose down
tests: prepare-tests
LOCALAGI_MCPBOX_URL="http://localhost:9090" LOCALAGI_MODEL="gemma-3-12b-it-qat" LOCALAI_API_URL="http://localhost:8081" LOCALAGI_API_URL="http://localhost:8080" $(GOCMD) run github.com/onsi/ginkgo/v2/ginkgo --fail-fast -v -r ./...
run-nokb:
$(MAKE) run KBDISABLEINDEX=true
webui/react-ui/dist:
docker run --entrypoint /bin/bash -v $(ROOT_DIR):/app oven/bun:1 -c "cd /app/webui/react-ui && bun install && bun run build"
.PHONY: build
build: webui/react-ui/dist
$(GOCMD) build -o localagi ./
.PHONY: run
run: webui/react-ui/dist
LOCALAGI_MCPBOX_URL="http://localhost:9090" $(GOCMD) run ./
build-image:
docker build -t $(IMAGE_NAME) -f Dockerfile.webui .
image-push:
docker push $(IMAGE_NAME)
build-mcpbox:
docker build -t $(MCPBOX_IMAGE_NAME) -f Dockerfile.mcpbox .
run-mcpbox:
docker run -v /var/run/docker.sock:/var/run/docker.sock --privileged -p 9090:8080 -ti mcpbox

README.md

@@ -1,181 +1,520 @@
<p align="center">
<img src="./webui/react-ui/public/logo_1.png" alt="LocalAGI Logo" width="220"/>
</p>
<h1 align="center">
<br>
<img height="300" src="https://github.com/mudler/LocalAGI/assets/2420543/b69817ce-2361-4234-a575-8f578e159f33"> <br>
LocalAGI
<br>
</h1>
<h3 align="center"><em>Your AI. Your Hardware. Your Rules</em></h3>
[AutoGPT](https://github.com/Significant-Gravitas/Auto-GPT), [babyAGI](https://github.com/yoheinakajima/babyagi), ... and now LocalAGI!
<div align="center">
LocalAGI is a small 🤖 virtual assistant that you can run locally, made by the [LocalAI](https://github.com/go-skynet/LocalAI) author and powered by it.
[![Go Report Card](https://goreportcard.com/badge/github.com/mudler/LocalAGI)](https://goreportcard.com/report/github.com/mudler/LocalAGI)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![GitHub stars](https://img.shields.io/github/stars/mudler/LocalAGI)](https://github.com/mudler/LocalAGI/stargazers)
[![GitHub issues](https://img.shields.io/github/issues/mudler/LocalAGI)](https://github.com/mudler/LocalAGI/issues)
The goal is:
- Keep it simple, hackable and easy to understand
- No API keys needed, No cloud services needed, 100% Local. Tailored for Local use, however still compatible with OpenAI.
- Smart-agent/virtual assistant that can do tasks
- Small set of dependencies
- Run with Docker/Podman/Containers
- Rather than trying to do everything, provide a good starting point for other projects
</div>
Note: Be warned! It was hacked in a weekend, and it's just an experiment to see what can be done with local LLMs.
Create customizable AI assistants, automations, chat bots and agents that run 100% locally. No need for agentic Python libraries or cloud service keys, just bring your GPU (or even just CPU) and a web browser.
![Screenshot from 2023-08-05 22-40-40](https://github.com/mudler/LocalAGI/assets/2420543/144da83d-3879-44f2-985c-efd690e2b136)
**LocalAGI** is a powerful, self-hostable AI Agent platform that allows you to design AI automations without writing code. A complete drop-in replacement for OpenAI's Responses APIs with advanced agentic capabilities. No clouds. No data leaks. Just pure local AI that works on consumer-grade hardware (CPU and GPU).
## 🚀 Features
## 🛡️ Take Back Your Privacy
- 🧠 LLM for intent detection
- 🧠 Uses functions for actions
- 📝 Write to long-term memory
- 📖 Read from long-term memory
- 🌐 Internet access for search
- :card_file_box: Write files
- 🔌 Plan steps to achieve a goal
- 🤖 Avatar creation with Stable Diffusion
- 🗨️ Conversational
- 🗣️ Voice synthesis with TTS
Are you tired of AI wrappers calling out to cloud APIs, risking your privacy? So were we.
## Demo
LocalAGI ensures your data stays exactly where you want it—on your hardware. No API keys, no cloud subscriptions, no compromise.
Search on internet (interactive mode)
## 🌟 Key Features
https://github.com/mudler/LocalAGI/assets/2420543/23199ca3-7380-4efc-9fac-a6bc2b52bdb3
- 🎛 **No-Code Agents**: Easy-to-configure multiple agents via Web UI.
- 🖥 **Web-Based Interface**: Simple and intuitive agent management.
- 🤖 **Advanced Agent Teaming**: Instantly create cooperative agent teams from a single prompt.
- 📡 **Connectors Galore**: Built-in integrations with Discord, Slack, Telegram, GitHub Issues, and IRC.
- 🛠 **Comprehensive REST API**: Seamless integration into your workflows. Every agent created will support OpenAI Responses API out of the box.
- 📚 **Short & Long-Term Memory**: Powered by [LocalRecall](https://github.com/mudler/LocalRecall).
- 🧠 **Planning & Reasoning**: Agents intelligently plan, reason, and adapt.
- 🔄 **Periodic Tasks**: Schedule tasks with cron-like syntax.
- 💾 **Memory Management**: Control memory usage with options for long-term and summary memory.
- 🖼 **Multimodal Support**: Ready for vision, text, and more.
- 🔧 **Extensible Custom Actions**: Easily script dynamic agent behaviors in Go (interpreted, no compilation!).
- 🛠 **Fully Customizable Models**: Use your own models or integrate seamlessly with [LocalAI](https://github.com/mudler/LocalAI).
- 📊 **Observability**: Monitor agent status and view detailed observable updates in real-time.
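Custom actions are scripted in Go. The real extension API isn't shown in this README, so the following is a purely hypothetical illustration (the `Action` type and its fields are invented names, not LocalAGI's actual API): conceptually, a custom action boils down to a named, described operation the agent can invoke with parameters, like a tool call.

```go
package main

import "fmt"

// Hypothetical sketch only: these names are invented for illustration
// and are NOT LocalAGI's actual extension API. Conceptually, a custom
// action is a named operation, with parameters, that the agent can call.
type Action struct {
	Name        string
	Description string // shown to the LLM so it knows when to pick this action
	Run         func(params map[string]string) (string, error)
}

// greetAction is a toy action the agent could invoke as a tool.
var greetAction = Action{
	Name:        "greet",
	Description: "Greets a person by name",
	Run: func(params map[string]string) (string, error) {
		return "Hello, " + params["name"] + "!", nil
	},
}

func main() {
	out, err := greetAction.Run(map[string]string{"name": "LocalAGI"})
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```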
Plan a road trip (batch mode)
https://github.com/mudler/LocalAGI/assets/2420543/9ba43b82-dec5-432a-bdb9-8318e7db59a4
> Note: The demo runs on a GPU with `30b`-size models
## :book: Quick start
No frills, just run docker-compose and start chatting with your virtual assistant:
## 🛠️ Quickstart
```bash
# Modify the configuration
# vim .env
docker-compose run -i --rm localagi
# Clone the repository
git clone https://github.com/mudler/LocalAGI
cd LocalAGI
# CPU setup (default)
docker compose up
# NVIDIA GPU setup
docker compose -f docker-compose.nvidia.yaml up
# Intel GPU setup (for Intel Arc and integrated GPUs)
docker compose -f docker-compose.intel.yaml up
# Start with a specific model (see available models in models.localai.io, or localai.io to use any model in huggingface)
MODEL_NAME=gemma-3-12b-it docker compose up
# NVIDIA GPU setup with custom multimodal and image models
MODEL_NAME=gemma-3-12b-it \
MULTIMODAL_MODEL=minicpm-v-2_6 \
IMAGE_MODEL=flux.1-dev-ggml \
docker compose -f docker-compose.nvidia.yaml up
```
## How to use it
Now you can access and manage your agents at [http://localhost:8080](http://localhost:8080)
By default localagi starts in interactive mode
Still having issues? See this YouTube video: https://youtu.be/HtVwIxW3ePg
### Examples
## 📚🆕 Local Stack Family
Road trip planner by limiting searching to internet to 3 results only:
🆕 LocalAGI is now part of a comprehensive suite of AI tools designed to work together:
<table>
<tr>
<td width="50%" valign="top">
<a href="https://github.com/mudler/LocalAI">
<img src="https://raw.githubusercontent.com/mudler/LocalAI/refs/heads/master/core/http/static/logo_horizontal.png" width="300" alt="LocalAI Logo">
</a>
</td>
<td width="50%" valign="top">
<h3><a href="https://github.com/mudler/LocalAI">LocalAI</a></h3>
<p>LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API compatible with the OpenAI API specifications for local AI inferencing. It does not require a GPU.</p>
</td>
</tr>
<tr>
<td width="50%" valign="top">
<a href="https://github.com/mudler/LocalRecall">
<img src="https://raw.githubusercontent.com/mudler/LocalRecall/refs/heads/main/static/localrecall_horizontal.png" width="300" alt="LocalRecall Logo">
</a>
</td>
<td width="50%" valign="top">
<h3><a href="https://github.com/mudler/LocalRecall">LocalRecall</a></h3>
<p>A REST-ful API and knowledge base management system that provides persistent memory and storage capabilities for AI agents.</p>
</td>
</tr>
</table>
## 🖥️ Hardware Configurations
LocalAGI supports multiple hardware configurations through Docker Compose profiles:
### CPU (Default)
- No special configuration needed
- Runs on any system with Docker
- Best for testing and development
- Supports text models only
### NVIDIA GPU
- Requires NVIDIA GPU and drivers
- Uses CUDA for acceleration
- Best for high-performance inference
- Supports text, multimodal, and image generation models
- Run with: `docker compose -f docker-compose.nvidia.yaml up`
- Default models:
- Text: `gemma-3-12b-it-qat`
- Multimodal: `minicpm-v-2_6`
- Image: `sd-1.5-ggml`
- Environment variables:
- `MODEL_NAME`: Text model to use
- `MULTIMODAL_MODEL`: Multimodal model to use
- `IMAGE_MODEL`: Image generation model to use
- `LOCALAI_SINGLE_ACTIVE_BACKEND`: Set to `true` to enable single active backend mode
### Intel GPU
- Supports Intel Arc and integrated GPUs
- Uses SYCL for acceleration
- Best for Intel-based systems
- Supports text, multimodal, and image generation models
- Run with: `docker compose -f docker-compose.intel.yaml up`
- Default models:
- Text: `gemma-3-12b-it-qat`
- Multimodal: `minicpm-v-2_6`
- Image: `sd-1.5-ggml`
- Environment variables:
- `MODEL_NAME`: Text model to use
- `MULTIMODAL_MODEL`: Multimodal model to use
- `IMAGE_MODEL`: Image generation model to use
- `LOCALAI_SINGLE_ACTIVE_BACKEND`: Set to `true` to enable single active backend mode
## Customize models
You can customize the models used by LocalAGI by setting environment variables when running docker-compose. For example:
```bash
docker-compose run -i --rm localagi \
--skip-avatar \
--subtask-context \
--postprocess \
--search-results 3 \
--prompt "prepare a plan for my roadtrip to san francisco"
# CPU with custom model
MODEL_NAME=gemma-3-12b-it docker compose up
# NVIDIA GPU with custom models
MODEL_NAME=gemma-3-12b-it \
MULTIMODAL_MODEL=minicpm-v-2_6 \
IMAGE_MODEL=flux.1-dev-ggml \
docker compose -f docker-compose.nvidia.yaml up
# Intel GPU with custom models
MODEL_NAME=gemma-3-12b-it \
MULTIMODAL_MODEL=minicpm-v-2_6 \
IMAGE_MODEL=sd-1.5-ggml \
docker compose -f docker-compose.intel.yaml up
```
Limit results of planning to 3 steps:
If no models are specified, it will use the defaults:
- Text model: `gemma-3-12b-it-qat`
- Multimodal model: `minicpm-v-2_6`
- Image model: `sd-1.5-ggml`
Good (relatively small) models that have been tested are:
- `qwen_qwq-32b` (best at coordinating agents)
- `gemma-3-12b-it`
- `gemma-3-27b-it`
## 🏆 Why Choose LocalAGI?
- **✓ Ultimate Privacy**: No data ever leaves your hardware.
- **✓ Flexible Model Integration**: Supports GGUF, GGML, and more thanks to [LocalAI](https://github.com/mudler/LocalAI).
- **✓ Developer-Friendly**: Rich APIs and intuitive interfaces.
- **✓ Effortless Setup**: Simple Docker compose setups and pre-built binaries.
- **✓ Feature-Rich**: From planning to multimodal capabilities, connectors for Slack, and MCP support - LocalAGI has it all.
## 🌐 The Local Ecosystem
LocalAGI is part of the powerful Local family of privacy-focused AI tools:
- [**LocalAI**](https://github.com/mudler/LocalAI): Run Large Language Models locally.
- [**LocalRecall**](https://github.com/mudler/LocalRecall): Retrieval-Augmented Generation with local storage.
- [**LocalAGI**](https://github.com/mudler/LocalAGI): Deploy intelligent AI agents securely and privately.
## 🌟 Screenshots
### Powerful Web UI
![Web UI Dashboard](https://github.com/user-attachments/assets/a40194f9-af3a-461f-8b39-5f4612fbf221)
![Web UI Agent Settings](https://github.com/user-attachments/assets/fb3c3e2a-cd53-4ca8-97aa-c5da51ff1f83)
![Web UI Create Group](https://github.com/user-attachments/assets/102189a2-0fba-4a1e-b0cb-f99268ef8062)
![Web UI Agent Observability](https://github.com/user-attachments/assets/f7359048-9d28-4cf1-9151-1f5556ce9235)
### Connectors Ready-to-Go
<p align="center">
<img src="https://github.com/user-attachments/assets/4171072f-e4bf-4485-982b-55d55086f8fc" alt="Telegram" width="60"/>
<img src="https://github.com/user-attachments/assets/9235da84-0187-4f26-8482-32dcc55702ef" alt="Discord" width="220"/>
<img src="https://github.com/user-attachments/assets/a88c3d88-a387-4fb5-b513-22bdd5da7413" alt="Slack" width="220"/>
<img src="https://github.com/user-attachments/assets/d249cdf5-ab34-4ab1-afdf-b99e2db182d2" alt="IRC" width="220"/>
<img src="https://github.com/user-attachments/assets/52c852b0-4b50-4926-9fa0-aa50613ac622" alt="GitHub" width="220"/>
</p>
## 📖 Full Documentation
Explore detailed documentation including:
- [Installation Options](#installation-options)
- [REST API Documentation](#rest-api)
- [Connector Configuration](#connectors)
- [Agent Configuration](#agent-configuration-reference)
### Environment Configuration
LocalAGI supports environment configuration. Note that these environment variables need to be set on the localagi container in the docker-compose file to take effect.
| Variable | What It Does |
|----------|--------------|
| `LOCALAGI_MODEL` | Your go-to model |
| `LOCALAGI_MULTIMODAL_MODEL` | Optional model for multimodal capabilities |
| `LOCALAGI_LLM_API_URL` | OpenAI-compatible API server URL |
| `LOCALAGI_LLM_API_KEY` | API authentication |
| `LOCALAGI_TIMEOUT` | Request timeout settings |
| `LOCALAGI_STATE_DIR` | Where state gets stored |
| `LOCALAGI_LOCALRAG_URL` | LocalRecall connection |
| `LOCALAGI_ENABLE_CONVERSATIONS_LOGGING` | Toggle conversation logs |
| `LOCALAGI_API_KEYS` | A comma-separated list of API keys used for authentication |
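As a minimal sketch of wiring these up (the service name `localagi` and the LocalAI URL are assumptions — adapt them to your compose file), setting a few of the variables above looks like:

```yaml
services:
  localagi:
    environment:
      # default text model, as listed above
      - LOCALAGI_MODEL=gemma-3-12b-it-qat
      # assumed LocalAI service URL inside the compose network
      - LOCALAGI_LLM_API_URL=http://localai:8080
      - LOCALAGI_ENABLE_CONVERSATIONS_LOGGING=true
      - LOCALAGI_API_KEYS=key1,key2
```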
## Installation Options
### Pre-Built Binaries
Download ready-to-run binaries from the [Releases](https://github.com/mudler/LocalAGI/releases) page.
### Source Build
Requirements:
- Go 1.20+
- Git
- Bun 1.2+
```bash
docker-compose run -i --rm localagi \
--skip-avatar \
--subtask-context \
--postprocess \
--search-results 1 \
--prompt "do a plan for my roadtrip to san francisco" \
--plan-message "The assistant replies with a plan of 3 steps to answer the request with a list of subtasks with logical steps. The reasoning includes a self-contained, detailed and descriptive instruction to fulfill the task."
# Clone repo
git clone https://github.com/mudler/LocalAGI.git
cd LocalAGI
# Build it
cd webui/react-ui && bun i && bun run build
cd ../..
go build -o localagi
# Run it
./localagi
```
### Advanced
### Development
localagi has several options in the CLI to tweak the experience:
- `--system-prompt` is the system prompt to use. If not specified, it will use none.
- `--prompt` is the prompt to use for batch mode. If not specified, it will default to interactive mode.
- `--interactive` is the interactive mode. When used with `--prompt`, it will drop you into an interactive session after the first prompt is evaluated.
- `--skip-avatar` will skip avatar creation. Useful if you want to run it in a headless environment.
- `--re-evaluate` will re-evaluate whether another action is needed or the user request has been completed.
- `--postprocess` will postprocess the reasoning for analysis.
- `--subtask-context` will include context in subtasks.
- `--search-results` is the number of search results to use.
- `--plan-message` is the message to use during planning. You can override the message for example to force a plan to have a different message.
- `--tts-api-base` is the TTS API base. Defaults to `http://api:8080`.
- `--localai-api-base` is the LocalAI API base. Defaults to `http://api:8080`.
- `--images-api-base` is the Images API base. Defaults to `http://api:8080`.
- `--embeddings-api-base` is the Embeddings API base. Defaults to `http://api:8080`.
- `--functions-model` is the functions model to use. Defaults to `functions`.
- `--embeddings-model` is the embeddings model to use. Defaults to `all-MiniLM-L6-v2`.
- `--llm-model` is the LLM model to use. Defaults to `gpt-4`.
- `--tts-model` is the TTS model to use. Defaults to `en-us-kathleen-low.onnx`.
- `--stablediffusion-model` is the Stable Diffusion model to use. Defaults to `stablediffusion`.
- `--stablediffusion-prompt` is the Stable Diffusion prompt to use. Defaults to `DEFAULT_PROMPT`.
- `--force-action` will force a specific action.
- `--debug` will enable debug mode.
### Customize
To use a different model, you can see the examples in the `config` folder.
To select a model, modify the `.env` file and change the `PRELOAD_MODELS_CONFIG` variable to use a different configuration file.
### Caveats
The "goodness" of a model has a big impact on how LocalAGI works. Currently `13b` models are powerful enough to actually perform multi-step tasks or carry out several actions. However, they are quite slow when running on CPU (no big surprise here).
The context size is a limitation - you can find examples in the `config` folder that run with the SuperHOT 8k context size, but the quality is not good enough to perform complex tasks.
## What is LocalAGI?
It is a dead simple experiment to show how to tie the various LocalAI functionalities together to create a virtual assistant that can do tasks. It is simple on purpose, trying to be minimalistic and easy to understand and customize for everyone.
It is different from babyAGI or AutoGPT as it uses [LocalAI functions](https://localai.io/features/openai-functions/) - it is a from-scratch attempt built on purpose to run locally with [LocalAI](https://localai.io) (no API keys needed!) instead of expensive cloud services. It sets itself apart from other projects by striving to stay small and easy to fork.
### How it works?
`LocalAGI` just does the minimum around LocalAI functions to create a virtual assistant that can do generic tasks. It works as an endless loop of `intent detection`, `function invocation`, `self-evaluation` and `reply generation` (if it decides to reply! :)). The agent is capable of planning complex tasks by invoking multiple functions, and of remembering things from the conversation.
In a nutshell, it goes like this:
- Decide, based on the conversation history, if it needs to take an action by using functions. It uses the LLM to detect the intent from the conversation.
- If it needs to take an action (e.g. "remember something from the conversation") or to carry out complex tasks (executing a chain of functions to achieve a goal), it invokes the functions.
- It re-evaluates whether any other action needs to be taken.
- It returns the result back to the LLM to generate a reply for the user.
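The loop above can be sketched in Go with stubbed-out LLM calls. Everything here is a toy stand-in — `detectIntent` and `invokeFunction` are hypothetical placeholders for the real model calls, kept only to show the control flow of detect, invoke, re-evaluate, reply:

```go
package main

import (
	"fmt"
	"strings"
)

// detectIntent stands in for the LLM call that decides, from the
// conversation so far, whether an action (function call) is needed.
func detectIntent(conversation []string) string {
	last := conversation[len(conversation)-1]
	if strings.Contains(last, "remember") {
		return "save_memory"
	}
	return "reply"
}

// invokeFunction stands in for executing the chosen action.
func invokeFunction(action, input string, memory *[]string) string {
	switch action {
	case "save_memory":
		*memory = append(*memory, input)
		return "stored"
	default:
		return ""
	}
}

// runAgent sketches the loop: intent detection, function invocation,
// re-evaluation, then reply generation.
func runAgent(userMsg string) string {
	conversation := []string{userMsg}
	var memory []string
	for i := 0; i < 3; i++ { // bounded re-evaluation instead of an endless loop
		action := detectIntent(conversation)
		if action == "reply" {
			break
		}
		result := invokeFunction(action, userMsg, &memory)
		// feed the result back so the next iteration can re-evaluate
		conversation = append(conversation, "function result: "+result)
	}
	return fmt.Sprintf("done (memory entries: %d)", len(memory))
}

func main() {
	fmt.Println(runAgent("please remember that my name is Bob"))
}
```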
Under the hood LocalAI converts functions to llama.cpp BNF grammars. While OpenAI fine-tuned a model to reply to functions, LocalAI constrains the LLM to follow grammars. This is a much more efficient way to do it, and it is also more flexible as you can define your own functions and grammars. To learn more about this, check out the [LocalAI documentation](https://localai.io/docs/llm) and my tweet that explains how it works under the hood: https://twitter.com/mudler_it/status/1675524071457533953.
### Agent functions
The intention of this project is to keep the agent minimal, so it can be built on top of or forked. The agent is capable of the following functions:
- remember something from the conversation
- recall something from the conversation
- search something from the internet
- plan a complex task by invoking multiple functions
- write files to disk
## Roadmap
- [x] 100% Local, with Local AI. NO API KEYS NEEDED!
- [x] Create a simple virtual assistant
- [x] Make the virtual assistant do functions like store long-term memory and autonomously search between them when needed
- [x] Create the assistant avatar with Stable Diffusion
- [x] Give it a voice
- [ ] Use weaviate instead of Chroma
- [ ] Get voice input (push to talk or wakeword)
- [ ] Make a REST API (OpenAI compliant?) so it can be plugged into e.g. a third-party service
- [x] Take a system prompt so it can act with a "character" (e.g. "answer in rick and morty style")
## Development
The development workflow is similar to the source build, but with additional steps for hot reloading of the frontend:
```bash
# Clone repo
git clone https://github.com/mudler/LocalAGI.git
cd LocalAGI
# Install dependencies and start frontend development server
cd webui/react-ui && bun i && bun run dev
```
Then in a separate terminal:
```bash
# Start development server
cd ../.. && go run main.go
```
> Note: see webui/react-ui/.vite.config.js for env vars that can be used to configure the backend URL

Alternatively, run docker-compose with `main.py` checked out:
```bash
docker-compose run -v main.py:/app/main.py -i --rm localagi
```
## Notes
- A 13b model is enough for contextualized research and searching/retrieving memory.
- A 30b model is enough to generate a roadmap trip plan (so cool!).
- With SuperHOT models it loses its magic, but they may still be suitable for search.
- Context size is your enemy. `--postprocess` sometimes helps, but not always.
- It can be silly!
- It is slow on CPU: don't expect `7b` models to perform well, and `13b` models perform better but are still quite slow on CPU.
## CONNECTORS
Link your agents to the services you already use. Configuration examples below.
### GitHub Issues
```json
{
"token": "YOUR_PAT_TOKEN",
"repository": "repo-to-monitor",
"owner": "repo-owner",
"botUserName": "bot-username"
}
```
### Discord
After [creating your Discord bot](https://discordpy.readthedocs.io/en/stable/discord.html):
```json
{
"token": "Bot YOUR_DISCORD_TOKEN",
"defaultChannel": "OPTIONAL_CHANNEL_ID"
}
```
> Don't forget to enable "Message Content Intent" in the Bot tab settings!
### Slack
Use the included `slack.yaml` manifest to create your app, then configure:
```json
{
"botToken": "xoxb-your-bot-token",
"appToken": "xapp-your-app-token"
}
```
- Create the bot token from "OAuth & Permissions" -> "OAuth Tokens for Your Workspace"
- Create an app-level token from "Basic Information" -> "App-Level Tokens" (scopes: `connections:write`, `authorizations:read`)
### Telegram
Get a token from @botfather, then:
```json
{
"token": "your-bot-father-token"
}
```
### IRC
Connect to IRC networks:
```json
{
"server": "irc.example.com",
"port": "6667",
"nickname": "LocalAGIBot",
"channel": "#yourchannel",
"alwaysReply": "false"
}
```
## REST API
### Agent Management
| Endpoint | Method | Description | Example |
|----------|--------|-------------|---------|
| `/api/agents` | GET | List all available agents | [Example](#get-all-agents) |
| `/api/agent/:name/status` | GET | View agent status history | [Example](#get-agent-status) |
| `/api/agent/create` | POST | Create a new agent | [Example](#create-agent) |
| `/api/agent/:name` | DELETE | Remove an agent | [Example](#delete-agent) |
| `/api/agent/:name/pause` | PUT | Pause agent activities | [Example](#pause-agent) |
| `/api/agent/:name/start` | PUT | Resume a paused agent | [Example](#start-agent) |
| `/api/agent/:name/config` | GET | Get agent configuration | |
| `/api/agent/:name/config` | PUT | Update agent configuration | |
| `/api/meta/agent/config` | GET | Get agent configuration metadata | |
| `/settings/export/:name` | GET | Export agent config | [Example](#export-agent) |
| `/settings/import` | POST | Import agent config | [Example](#import-agent) |
### Actions and Groups
| Endpoint | Method | Description | Example |
|----------|--------|-------------|---------|
| `/api/actions` | GET | List available actions | |
| `/api/action/:name/run` | POST | Execute an action | |
| `/api/agent/group/generateProfiles` | POST | Generate group profiles | |
| `/api/agent/group/create` | POST | Create a new agent group | |
### Chat Interactions
| Endpoint | Method | Description | Example |
|----------|--------|-------------|---------|
| `/api/chat/:name` | POST | Send message & get response | [Example](#send-message) |
| `/api/notify/:name` | POST | Send notification to agent | [Example](#notify-agent) |
| `/api/sse/:name` | GET | Real-time agent event stream | [Example](#agent-sse-stream) |
| `/v1/responses` | POST | Send message & get response | [OpenAI's Responses](https://platform.openai.com/docs/api-reference/responses/create) |
<details>
<summary><strong>Curl Examples</strong></summary>
#### Get All Agents
```bash
curl -X GET "http://localhost:3000/api/agents"
```
#### Get Agent Status
```bash
curl -X GET "http://localhost:3000/api/agent/my-agent/status"
```
#### Create Agent
```bash
curl -X POST "http://localhost:3000/api/agent/create" \
-H "Content-Type: application/json" \
-d '{
"name": "my-agent",
"model": "gpt-4",
"system_prompt": "You are an AI assistant.",
"enable_kb": true,
"enable_reasoning": true
}'
```
#### Delete Agent
```bash
curl -X DELETE "http://localhost:3000/api/agent/my-agent"
```
#### Pause Agent
```bash
curl -X PUT "http://localhost:3000/api/agent/my-agent/pause"
```
#### Start Agent
```bash
curl -X PUT "http://localhost:3000/api/agent/my-agent/start"
```
#### Get Agent Configuration
```bash
curl -X GET "http://localhost:3000/api/agent/my-agent/config"
```
#### Update Agent Configuration
```bash
curl -X PUT "http://localhost:3000/api/agent/my-agent/config" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-4",
"system_prompt": "You are an AI assistant."
}'
```
#### Export Agent
```bash
curl -X GET "http://localhost:3000/settings/export/my-agent" --output my-agent.json
```
#### Import Agent
```bash
curl -X POST "http://localhost:3000/settings/import" \
-F "file=@/path/to/my-agent.json"
```
#### Send Message
```bash
curl -X POST "http://localhost:3000/api/chat/my-agent" \
-H "Content-Type: application/json" \
-d '{"message": "Hello, how are you today?"}'
```
#### Notify Agent
```bash
curl -X POST "http://localhost:3000/api/notify/my-agent" \
-H "Content-Type: application/json" \
-d '{"message": "Important notification"}'
```
#### Agent SSE Stream
```bash
curl -N -X GET "http://localhost:3000/api/sse/my-agent"
```
Note: For proper SSE handling, you should use a client that supports SSE natively.
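If you want to consume the stream yourself, the wire format is plain SSE: payload lines prefixed with `data: `, events separated by blank lines. A minimal sketch of that framing in Go (a real client would read from the HTTP response body of `/api/sse/:name` rather than a string; other SSE fields are ignored here for brevity):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseSSE extracts the payload of each "data:" line from an SSE stream.
func parseSSE(stream string) []string {
	var events []string
	scanner := bufio.NewScanner(strings.NewReader(stream))
	for scanner.Scan() {
		// Each event payload arrives as a line of the form "data: <payload>".
		if payload, ok := strings.CutPrefix(scanner.Text(), "data: "); ok {
			events = append(events, payload)
		}
	}
	return events
}

func main() {
	raw := "data: {\"status\":\"thinking\"}\n\ndata: {\"status\":\"done\"}\n\n"
	fmt.Println(parseSSE(raw)) // [{"status":"thinking"} {"status":"done"}]
}
```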
</details>
### Agent Configuration Reference
The agent configuration defines how an agent behaves and what capabilities it has. You can view the available configuration options and their descriptions by using the metadata endpoint:
```bash
curl -X GET "http://localhost:3000/api/meta/agent/config"
```
This will return a JSON object containing all available configuration fields, their types, and descriptions.
Here's an example of the agent configuration structure:
```json
{
"name": "my-agent",
"model": "gpt-4",
"multimodal_model": "gpt-4-vision",
"hud": true,
"standalone_job": false,
"random_identity": false,
"initiate_conversations": true,
"enable_planning": true,
"identity_guidance": "You are a helpful assistant.",
"periodic_runs": "0 * * * *",
"permanent_goal": "Help users with their questions.",
"enable_kb": true,
"enable_reasoning": true,
"kb_results": 5,
"can_stop_itself": false,
"system_prompt": "You are an AI assistant.",
"long_term_memory": true,
"summary_long_term_memory": false
}
```
## LICENSE
MIT License — See the [LICENSE](LICENSE) file for details.
---
<p align="center">
<strong>LOCAL PROCESSING. GLOBAL THINKING.</strong><br>
Made with ❤️ by <a href="https://github.com/mudler">mudler</a>
</p>

cmd/mcpbox/main.go Normal file
@@ -0,0 +1,38 @@
package main
import (
"flag"
"log"
"os"
"os/signal"
"syscall"
"github.com/mudler/LocalAGI/pkg/stdio"
)
func main() {
// Parse command line flags
addr := flag.String("addr", ":8080", "HTTP server address")
flag.Parse()
// Create and start the server
server := stdio.NewServer()
// Handle graceful shutdown
sigChan := make(chan os.Signal, 1)
signal.Notify(sigChan, syscall.SIGINT, syscall.SIGTERM)
go func() {
log.Printf("Starting server on %s", *addr)
if err := server.Start(*addr); err != nil {
log.Fatalf("Failed to start server: %v", err)
}
}()
// Wait for shutdown signal
<-sigChan
log.Println("Shutting down server...")
// TODO: Implement graceful shutdown if needed
os.Exit(0)
}

@@ -1,45 +0,0 @@
- id: huggingface@TheBloke/WizardLM-13B-V1.1-GGML/wizardlm-13b-v1.1.ggmlv3.q5_K_M.bin
name: "gpt-4"
overrides:
context_size: 2048
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@TheBloke/WizardLM-13B-V1.1-GGML/wizardlm-13b-v1.1.ggmlv3.q5_K_M.bin
name: functions
overrides:
context_size: 2048
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
function:
disable_no_action: true
mmap: true
f16: true

@@ -1,47 +0,0 @@
- id: huggingface@TheBloke/WizardLM-13B-V1-0-Uncensored-SuperHOT-8K-GGML/wizardlm-13b-v1.0-superhot-8k.ggmlv3.q4_K_M.bin
name: "gpt-4"
overrides:
context_size: 8192
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
rope_freq_scale: 0.25
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@TheBloke/WizardLM-13B-V1-0-Uncensored-SuperHOT-8K-GGML/wizardlm-13b-v1.0-superhot-8k.ggmlv3.q4_K_M.bin
name: functions
overrides:
context_size: 8192
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
rope_freq_scale: 0.25
function:
disable_no_action: true
mmap: true
f16: true

@@ -1,45 +0,0 @@
- id: huggingface@thebloke/wizardlm-13b-v1.0-uncensored-ggml/wizardlm-13b-v1.0-uncensored.ggmlv3.q4_k_m.bin
name: "gpt-4"
overrides:
context_size: 2048
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@thebloke/wizardlm-13b-v1.0-uncensored-ggml/wizardlm-13b-v1.0-uncensored.ggmlv3.q4_0.bin
name: functions
overrides:
context_size: 2048
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
function:
disable_no_action: true
mmap: true
f16: true

@@ -1,47 +0,0 @@
- id: huggingface@TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-SuperHOT-8K-GGML/WizardLM-Uncensored-SuperCOT-StoryTelling-30b-superhot-8k.ggmlv3.q4_0.bin
name: "gpt-4"
overrides:
context_size: 8192
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
rope_freq_scale: 0.25
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@TheBloke/WizardLM-Uncensored-SuperCOT-StoryTelling-30B-SuperHOT-8K-GGML/WizardLM-Uncensored-SuperCOT-StoryTelling-30b-superhot-8k.ggmlv3.q4_0.bin
name: functions
overrides:
context_size: 8192
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
rope_freq_scale: 0.25
function:
disable_no_action: true
mmap: true
f16: true

@@ -1,46 +0,0 @@
- id: huggingface@thebloke/wizardlm-30b-uncensored-ggml/wizardlm-30b-uncensored.ggmlv3.q2_k.bin
galleryModel:
name: "gpt-4"
overrides:
context_size: 4096
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@thebloke/wizardlm-30b-uncensored-ggml/wizardlm-30b-uncensored.ggmlv3.q2_k.bin
name: functions
overrides:
context_size: 4096
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
function:
disable_no_action: true
mmap: true
f16: true

@@ -1,45 +0,0 @@
- id: huggingface@thebloke/wizardlm-7b-v1.0-uncensored-ggml/wizardlm-7b-v1.0-uncensored.ggmlv3.q4_k_m.bin
name: "gpt-4"
overrides:
context_size: 2048
mmap: true
f16: true
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
- id: model-gallery@stablediffusion
- id: model-gallery@voice-en-us-kathleen-low
- url: github:go-skynet/model-gallery/base.yaml
name: all-MiniLM-L6-v2
overrides:
embeddings: true
backend: huggingface-embeddings
parameters:
model: all-MiniLM-L6-v2
- id: huggingface@thebloke/wizardlm-7b-v1.0-uncensored-ggml/wizardlm-7b-v1.0-uncensored.ggmlv3.q4_0.bin
name: functions
overrides:
context_size: 2048
mirostat: 2
mirostat_tau: 5.0
mirostat_eta: 0.1
template:
chat: ""
completion: ""
roles:
assistant: "ASSISTANT:"
system: "SYSTEM:"
assistant_function_call: "FUNCTION_CALL:"
function: "FUNCTION CALL RESULT:"
parameters:
temperature: 0.1
top_k: 40
top_p: 0.95
function:
disable_no_action: true
mmap: true
f16: true

@@ -0,0 +1,13 @@
package action_test
import (
"testing"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
func TestAction(t *testing.T) {
RegisterFailHandler(Fail)
RunSpecs(t, "Agent Action test suite")
}

core/action/custom.go Normal file

@@ -0,0 +1,163 @@
package action
import (
"context"
"fmt"
"strings"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai/jsonschema"
"github.com/traefik/yaegi/interp"
"github.com/traefik/yaegi/stdlib"
)
func NewCustom(config map[string]string, goPkgPath string) (*CustomAction, error) {
a := &CustomAction{
config: config,
goPkgPath: goPkgPath,
}
if err := a.initializeInterpreter(); err != nil {
return nil, err
}
if err := a.callInit(); err != nil {
xlog.Error("Error calling custom action init", "error", err)
}
return a, nil
}
type CustomAction struct {
config map[string]string
goPkgPath string
i *interp.Interpreter
}
func (a *CustomAction) callInit() error {
if a.i == nil {
return nil
}
v, err := a.i.Eval(fmt.Sprintf("%s.Init", a.config["name"]))
if err != nil {
return err
}
run := v.Interface().(func() error)
return run()
}
func (a *CustomAction) initializeInterpreter() error {
if _, exists := a.config["code"]; exists && a.i == nil {
unsafe := strings.ToLower(a.config["unsafe"]) == "true"
i := interp.New(interp.Options{
GoPath: a.goPkgPath,
Unrestricted: unsafe,
})
if err := i.Use(stdlib.Symbols); err != nil {
return err
}
if _, exists := a.config["name"]; !exists {
a.config["name"] = "custom"
}
_, err := i.Eval(fmt.Sprintf("package %s\n%s", a.config["name"], a.config["code"]))
if err != nil {
return err
}
a.i = i
}
return nil
}
func (a *CustomAction) Plannable() bool {
return true
}
func (a *CustomAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
v, err := a.i.Eval(fmt.Sprintf("%s.Run", a.config["name"]))
if err != nil {
return types.ActionResult{}, err
}
run := v.Interface().(func(map[string]interface{}) (string, map[string]interface{}, error))
res, meta, err := run(params)
return types.ActionResult{Result: res, Metadata: meta}, err
}
func (a *CustomAction) Definition() types.ActionDefinition {
v, err := a.i.Eval(fmt.Sprintf("%s.Definition", a.config["name"]))
if err != nil {
xlog.Error("Error getting custom action definition", "error", err)
return types.ActionDefinition{}
}
properties := v.Interface().(func() map[string][]string)
v, err = a.i.Eval(fmt.Sprintf("%s.RequiredFields", a.config["name"]))
if err != nil {
xlog.Error("Error getting custom action definition", "error", err)
return types.ActionDefinition{}
}
requiredFields := v.Interface().(func() []string)
prop := map[string]jsonschema.Definition{}
for k, v := range properties() {
if len(v) != 2 {
xlog.Error("Invalid property definition", "property", k)
continue
}
prop[k] = jsonschema.Definition{
Type: jsonschema.DataType(v[0]),
Description: v[1],
}
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(a.config["name"]),
Description: a.config["description"],
Properties: prop,
Required: requiredFields(),
}
}
func CustomConfigMeta() []config.Field {
return []config.Field{
{
Name: "name",
Label: "Action Name",
Type: config.FieldTypeText,
Required: true,
HelpText: "Name of the custom action",
},
{
Name: "description",
Label: "Description",
Type: config.FieldTypeTextarea,
HelpText: "Description of the custom action",
},
{
Name: "code",
Label: "Code",
Type: config.FieldTypeTextarea,
Required: true,
HelpText: "Go code for the custom action",
},
{
Name: "unsafe",
Label: "Unsafe",
Type: config.FieldTypeCheckbox,
HelpText: "Allow unsafe code execution",
},
}
}

@@ -0,0 +1,87 @@
package action_test
import (
"context"
. "github.com/mudler/LocalAGI/core/action"
"github.com/mudler/LocalAGI/core/types"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
"github.com/sashabaranov/go-openai/jsonschema"
)
var _ = Describe("Agent custom action", func() {
Context("custom action", func() {
It("initializes correctly", func() {
testCode := `
import (
"encoding/json"
)
type Params struct {
Foo string
}
func Run(config map[string]interface{}) (string, map[string]interface{}, error) {
p := Params{}
b, err := json.Marshal(config)
if err != nil {
return "",map[string]interface{}{}, err
}
if err := json.Unmarshal(b, &p); err != nil {
return "",map[string]interface{}{}, err
}
return p.Foo,map[string]interface{}{}, nil
}
func Definition() map[string][]string {
return map[string][]string{
"foo": []string{
"string",
"The foo value",
},
}
}
func RequiredFields() []string {
return []string{"foo"}
}
`
customAction, err := NewCustom(
map[string]string{
"code": testCode,
"name": "test",
"description": "A test action",
},
"",
)
Expect(err).ToNot(HaveOccurred())
definition := customAction.Definition()
Expect(definition).To(Equal(types.ActionDefinition{
Properties: map[string]jsonschema.Definition{
"foo": {
Type: jsonschema.String,
Description: "The foo value",
},
},
Required: []string{"foo"},
Name: "test",
Description: "A test action",
}))
runResult, err := customAction.Run(context.Background(), types.ActionParams{
"Foo": "bar",
})
Expect(err).ToNot(HaveOccurred())
Expect(runResult.Result).To(Equal("bar"))
})
})
})

core/action/goal.go Normal file

@@ -0,0 +1,48 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// NewGoal creates a new goal action
// The goal action is special as it asks the LLM
// to check whether the stated goal has been achieved
func NewGoal() *GoalAction {
return &GoalAction{}
}
type GoalAction struct {
}
type GoalResponse struct {
Goal string `json:"goal"`
Achieved bool `json:"achieved"`
}
func (a *GoalAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *GoalAction) Plannable() bool {
return false
}
func (a *GoalAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "goal",
Description: "Check if the goal is achieved",
Properties: map[string]jsonschema.Definition{
"goal": {
Type: jsonschema.String,
Description: "The goal to check if it is achieved.",
},
"achieved": {
Type: jsonschema.Boolean,
Description: "Whether the goal is achieved",
},
},
Required: []string{"goal", "achieved"},
}
}

core/action/intention.go Normal file

@@ -0,0 +1,50 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// NewIntention creates a new intention action
// The intention action is special as it tries to identify
// a tool to use and the reasoning for using it
func NewIntention(s ...string) *IntentAction {
return &IntentAction{tools: s}
}
type IntentAction struct {
tools []string
}
type IntentResponse struct {
Tool string `json:"tool"`
Reasoning string `json:"reasoning"`
}
func (a *IntentAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *IntentAction) Plannable() bool {
return false
}
func (a *IntentAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "pick_tool",
Description: "Pick a tool",
Properties: map[string]jsonschema.Definition{
"reasoning": {
Type: jsonschema.String,
Description: "A detailed reasoning on why you want to call this tool.",
},
"tool": {
Type: jsonschema.String,
Description: "The tool you want to use",
Enum: a.tools,
},
},
Required: []string{"tool", "reasoning"},
}
}

@@ -0,0 +1,42 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
const ConversationActionName = "new_conversation"
func NewConversation() *ConversationAction {
return &ConversationAction{}
}
type ConversationAction struct{}
type ConversationActionResponse struct {
Message string `json:"message"`
}
func (a *ConversationAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *ConversationAction) Plannable() bool {
return false
}
func (a *ConversationAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: ConversationActionName,
Description: "Use this tool to initiate a new conversation or to notify something.",
Properties: map[string]jsonschema.Definition{
"message": {
Type: jsonschema.String,
Description: "The message to start the conversation",
},
},
Required: []string{"message"},
}
}

core/action/noreply.go Normal file

@@ -0,0 +1,32 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
)
// StopActionName is the name of the action
// used by the LLM to stop any further action
const StopActionName = "stop"
func NewStop() *StopAction {
return &StopAction{}
}
type StopAction struct{}
func (a *StopAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *StopAction) Plannable() bool {
return false
}
func (a *StopAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: StopActionName,
Description: "Use this tool to stop any further action and end the conversation. You must use this when the conversation has reached a conclusion or the topic has diverged too much from the original conversation. For instance, if the user offers help and you have already replied with a message, you can use this tool to stop the conversation.",
}
}

core/action/plan.go Normal file

@@ -0,0 +1,71 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// PlanActionName is the name of the plan action
// used by the LLM to schedule more actions
const PlanActionName = "plan"
func NewPlan(plannableActions []string) *PlanAction {
return &PlanAction{
plannables: plannableActions,
}
}
type PlanAction struct {
plannables []string
}
type PlanResult struct {
Subtasks []PlanSubtask `json:"subtasks"`
Goal string `json:"goal"`
}
type PlanSubtask struct {
Action string `json:"action"`
Reasoning string `json:"reasoning"`
}
func (a *PlanAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *PlanAction) Plannable() bool {
return false
}
func (a *PlanAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: PlanActionName,
Description: "Use this for situations that involve executing multiple actions in sequence.",
Properties: map[string]jsonschema.Definition{
"subtasks": {
Type: jsonschema.Array,
Description: "The subtasks to be executed",
Items: &jsonschema.Definition{
Type: jsonschema.Object,
Properties: map[string]jsonschema.Definition{
"action": {
Type: jsonschema.String,
Description: "The action to call",
Enum: a.plannables,
},
"reasoning": {
Type: jsonschema.String,
Description: "The reasoning for calling this action",
},
},
},
},
"goal": {
Type: jsonschema.String,
Description: "The goal of this plan",
},
},
Required: []string{"subtasks", "goal"},
}
}

core/action/reasoning.go Normal file

@@ -0,0 +1,43 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// NewReasoning creates a new reasoning action
// The reasoning action is special as it tries to force the LLM
// to think about what to do next
func NewReasoning() *ReasoningAction {
return &ReasoningAction{}
}
type ReasoningAction struct{}
type ReasoningResponse struct {
Reasoning string `json:"reasoning"`
}
func (a *ReasoningAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{}, nil
}
func (a *ReasoningAction) Plannable() bool {
return false
}
func (a *ReasoningAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "pick_action",
Description: "try to understand what's the best thing to do and pick an action with a reasoning",
Properties: map[string]jsonschema.Definition{
"reasoning": {
Type: jsonschema.String,
Description: "A detailed reasoning on what would you do in this situation.",
},
},
Required: []string{"reasoning"},
}
}

core/action/reply.go Normal file

@@ -0,0 +1,45 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// ReplyActionName is the name of the reply action
// used by the LLM to reply to the user without
// any additional processing
const ReplyActionName = "reply"
func NewReply() *ReplyAction {
return &ReplyAction{}
}
type ReplyAction struct{}
type ReplyResponse struct {
Message string `json:"message"`
}
func (a *ReplyAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{Result: "no-op"}, nil
}
func (a *ReplyAction) Plannable() bool {
return false
}
func (a *ReplyAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: ReplyActionName,
Description: "Use this tool to reply to the user once we have all the information we need.",
Properties: map[string]jsonschema.Definition{
"message": {
Type: jsonschema.String,
Description: "The message to reply with",
},
},
Required: []string{"message"},
}
}

core/action/state.go Normal file

@@ -0,0 +1,59 @@
package action
import (
"context"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
const StateActionName = "update_state"
func NewState() *StateAction {
return &StateAction{}
}
type StateAction struct{}
func (a *StateAction) Run(context.Context, types.ActionParams) (types.ActionResult, error) {
return types.ActionResult{Result: "internal state has been updated"}, nil
}
func (a *StateAction) Plannable() bool {
return false
}
func (a *StateAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: StateActionName,
Description: "update the agent state (short-term memory) with the current state of the conversation.",
Properties: map[string]jsonschema.Definition{
"goal": {
Type: jsonschema.String,
Description: "The current goal of the agent.",
},
"doing_next": {
Type: jsonschema.String,
Description: "The next action the agent will do.",
},
"done_history": {
Type: jsonschema.Array,
Items: &jsonschema.Definition{
Type: jsonschema.String,
},
Description: "A list of actions that the agent has done.",
},
"now_doing": {
Type: jsonschema.String,
Description: "The current action the agent is doing.",
},
"memories": {
Type: jsonschema.Array,
Items: &jsonschema.Definition{
Type: jsonschema.String,
},
Description: "A list of memories to keep between conversations.",
},
},
}
}

core/agent/actions.go Normal file

@@ -0,0 +1,563 @@
package agent
import (
"context"
"encoding/json"
"fmt"
"os"
"github.com/mudler/LocalAGI/core/action"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai"
)
type decisionResult struct {
actionParams types.ActionParams
message string
actioName string
}
// decision forces the agent to take one of the available actions
func (a *Agent) decision(
job *types.Job,
conversation []openai.ChatCompletionMessage,
tools []openai.Tool, toolchoice string, maxRetries int) (*decisionResult, error) {
var choice *openai.ToolChoice
if toolchoice != "" {
choice = &openai.ToolChoice{
Type: openai.ToolTypeFunction,
Function: openai.ToolFunction{Name: toolchoice},
}
}
decision := openai.ChatCompletionRequest{
Model: a.options.LLMAPI.Model,
Messages: conversation,
Tools: tools,
}
if choice != nil {
decision.ToolChoice = *choice
}
var obs *types.Observable
if job.Obs != nil {
obs = a.observer.NewObservable()
obs.Name = "decision"
obs.ParentID = job.Obs.ID
obs.Icon = "brain"
obs.Creation = &types.Creation{
ChatCompletionRequest: &decision,
}
a.observer.Update(*obs)
}
var lastErr error
for attempts := 0; attempts < maxRetries; attempts++ {
resp, err := a.client.CreateChatCompletion(job.GetContext(), decision)
if err != nil {
lastErr = err
xlog.Warn("Attempt to make a decision failed", "attempt", attempts+1, "error", err)
if obs != nil {
obs.Progress = append(obs.Progress, types.Progress{
Error: err.Error(),
})
a.observer.Update(*obs)
}
continue
}
jsonResp, _ := json.Marshal(resp)
xlog.Debug("Decision response", "response", string(jsonResp))
if obs != nil {
obs.AddProgress(types.Progress{
ChatCompletionResponse: &resp,
})
}
if len(resp.Choices) != 1 {
lastErr = fmt.Errorf("no choices: %d", len(resp.Choices))
xlog.Warn("Attempt to make a decision failed", "attempt", attempts+1, "error", lastErr)
if obs != nil {
obs.Progress[len(obs.Progress)-1].Error = lastErr.Error()
a.observer.Update(*obs)
}
continue
}
msg := resp.Choices[0].Message
if len(msg.ToolCalls) != 1 {
if err := a.saveConversation(append(conversation, msg), "decision"); err != nil {
xlog.Error("Error saving conversation", "error", err)
}
if obs != nil {
obs.MakeLastProgressCompletion()
a.observer.Update(*obs)
}
return &decisionResult{message: msg.Content}, nil
}
params := types.ActionParams{}
if err := params.Read(msg.ToolCalls[0].Function.Arguments); err != nil {
lastErr = err
xlog.Warn("Attempt to parse action parameters failed", "attempt", attempts+1, "error", err)
if obs != nil {
obs.Progress[len(obs.Progress)-1].Error = lastErr.Error()
a.observer.Update(*obs)
}
continue
}
if err := a.saveConversation(append(conversation, msg), "decision"); err != nil {
xlog.Error("Error saving conversation", "error", err)
}
if obs != nil {
obs.MakeLastProgressCompletion()
a.observer.Update(*obs)
}
return &decisionResult{actionParams: params, actioName: msg.ToolCalls[0].Function.Name, message: msg.Content}, nil
}
return nil, fmt.Errorf("failed to make a decision after %d attempts: %w", maxRetries, lastErr)
}
type Messages []openai.ChatCompletionMessage
func (m Messages) ToOpenAI() []openai.ChatCompletionMessage {
return []openai.ChatCompletionMessage(m)
}
func (m Messages) RemoveIf(f func(msg openai.ChatCompletionMessage) bool) Messages {
for i := len(m) - 1; i >= 0; i-- {
if f(m[i]) {
m = append(m[:i], m[i+1:]...)
}
}
return m
}
func (m Messages) String() string {
s := ""
for _, cc := range m {
s += cc.Role + ": " + cc.Content + "\n"
}
return s
}
func (m Messages) Exist(content string) bool {
for _, cc := range m {
if cc.Content == content {
return true
}
}
return false
}
func (m Messages) RemoveLastUserMessage() Messages {
if len(m) == 0 {
return m
}
for i := len(m) - 1; i >= 0; i-- {
if m[i].Role == UserRole {
return append(m[:i], m[i+1:]...)
}
}
return m
}
func (m Messages) Save(path string) error {
content, err := json.MarshalIndent(m, "", " ")
if err != nil {
return err
}
f, err := os.Create(path)
if err != nil {
return err
}
defer f.Close()
if _, err := f.Write(content); err != nil {
return err
}
return nil
}
func (m Messages) GetLatestUserMessage() *openai.ChatCompletionMessage {
for i := len(m) - 1; i >= 0; i-- {
msg := m[i]
if msg.Role == UserRole {
return &msg
}
}
return nil
}
func (m Messages) IsLastMessageFromRole(role string) bool {
if len(m) == 0 {
return false
}
return m[len(m)-1].Role == role
}
func (a *Agent) generateParameters(job *types.Job, pickTemplate string, act types.Action, c []openai.ChatCompletionMessage, reasoning string, maxAttempts int) (*decisionResult, error) {
stateHUD, err := renderTemplate(pickTemplate, a.prepareHUD(), a.availableActions(), reasoning)
if err != nil {
return nil, err
}
conversation := c
if !Messages(c).Exist(stateHUD) && a.options.enableHUD {
conversation = append([]openai.ChatCompletionMessage{
{
Role: "system",
Content: stateHUD,
},
}, conversation...)
}
cc := conversation
if a.options.forceReasoning {
cc = append(conversation, openai.ChatCompletionMessage{
Role: "system",
Content: fmt.Sprintf("The agent decided to use the tool %s with the following reasoning: %s", act.Definition().Name, reasoning),
})
}
var result *decisionResult
var attemptErr error
for attempts := 0; attempts < maxAttempts; attempts++ {
result, attemptErr = a.decision(job,
cc,
a.availableActions().ToTools(),
act.Definition().Name.String(),
maxAttempts,
)
if attemptErr == nil && result.actionParams != nil {
return result, nil
}
xlog.Warn("Attempt to generate parameters failed", "attempt", attempts+1, "error", attemptErr)
}
return nil, fmt.Errorf("failed to generate parameters after %d attempts: %w", maxAttempts, attemptErr)
}
func (a *Agent) handlePlanning(ctx context.Context, job *types.Job, chosenAction types.Action, actionParams types.ActionParams, reasoning string, pickTemplate string, conv Messages) (Messages, error) {
// Planning: run all the actions in sequence
if !chosenAction.Definition().Name.Is(action.PlanActionName) {
xlog.Debug("no plan action")
return conv, nil
}
xlog.Debug("[planning]...")
planResult := action.PlanResult{}
if err := actionParams.Unmarshal(&planResult); err != nil {
return conv, fmt.Errorf("error unmarshalling plan result: %w", err)
}
stateResult := types.ActionState{
ActionCurrentState: types.ActionCurrentState{
Job: job,
Action: chosenAction,
Params: actionParams,
Reasoning: reasoning,
},
ActionResult: types.ActionResult{
Result: fmt.Sprintf("planning %s, subtasks: %+v", planResult.Goal, planResult.Subtasks),
},
}
job.Result.SetResult(stateResult)
job.CallbackWithResult(stateResult)
xlog.Info("[Planning] starts", "agent", a.Character.Name, "goal", planResult.Goal)
for _, s := range planResult.Subtasks {
xlog.Info("[Planning] subtask", "agent", a.Character.Name, "action", s.Action, "reasoning", s.Reasoning)
}
if len(planResult.Subtasks) == 0 {
return conv, fmt.Errorf("no subtasks")
}
// Execute all subtasks in sequence
for _, subtask := range planResult.Subtasks {
xlog.Info("[subtask] Generating parameters",
"agent", a.Character.Name,
"action", subtask.Action,
"reasoning", reasoning,
)
subTaskAction := a.availableActions().Find(subtask.Action)
if subTaskAction == nil {
return conv, fmt.Errorf("no action found for subtask %q", subtask.Action)
}
subTaskReasoning := fmt.Sprintf("%s Overall goal is: %s", subtask.Reasoning, planResult.Goal)
params, err := a.generateParameters(job, pickTemplate, subTaskAction, conv, subTaskReasoning, maxRetries)
if err != nil {
xlog.Error("error generating action's parameters", "error", err)
return conv, fmt.Errorf("error generating action's parameters: %w", err)
}
actionParams = params.actionParams
if !job.Callback(types.ActionCurrentState{
Job: job,
Action: subTaskAction,
Params: actionParams,
Reasoning: subTaskReasoning,
}) {
job.Result.SetResult(types.ActionState{
ActionCurrentState: types.ActionCurrentState{
Job: job,
Action: chosenAction,
Params: actionParams,
Reasoning: subTaskReasoning,
},
ActionResult: types.ActionResult{
Result: "stopped by callback",
},
})
job.Result.Conversation = conv
job.Result.Finish(nil)
break
}
result, err := a.runAction(job, subTaskAction, actionParams)
if err != nil {
xlog.Error("error running action", "error", err)
return conv, fmt.Errorf("error running action: %w", err)
}
stateResult := types.ActionState{
ActionCurrentState: types.ActionCurrentState{
Job: job,
Action: subTaskAction,
Params: actionParams,
Reasoning: subTaskReasoning,
},
ActionResult: result,
}
job.Result.SetResult(stateResult)
job.CallbackWithResult(stateResult)
xlog.Debug("[subtask] Action executed", "agent", a.Character.Name, "action", subTaskAction.Definition().Name, "result", result)
conv = a.addFunctionResultToConversation(subTaskAction, actionParams, result, conv)
}
return conv, nil
}
func (a *Agent) availableActions() types.Actions {
// defaultActions := append(a.options.userActions, action.NewReply())
addPlanAction := func(actions types.Actions) types.Actions {
if !a.options.canPlan {
return actions
}
plannablesActions := []string{}
for _, a := range actions {
if a.Plannable() {
plannablesActions = append(plannablesActions, a.Definition().Name.String())
}
}
planAction := action.NewPlan(plannablesActions)
actions = append(actions, planAction)
return actions
}
// Copy into a fresh slice so later appends cannot clobber the backing array of a.mcpActions
defaultActions := make(types.Actions, 0, len(a.mcpActions)+len(a.options.userActions))
defaultActions = append(defaultActions, a.mcpActions...)
defaultActions = append(defaultActions, a.options.userActions...)
if a.options.initiateConversations && a.selfEvaluationInProgress { // && self-evaluation..
acts := append(defaultActions, action.NewConversation())
if a.options.enableHUD {
acts = append(acts, action.NewState())
}
//if a.options.canStopItself {
// acts = append(acts, action.NewStop())
// }
return addPlanAction(acts)
}
if a.options.canStopItself {
acts := append(defaultActions, action.NewStop())
if a.options.enableHUD {
acts = append(acts, action.NewState())
}
return addPlanAction(acts)
}
if a.options.enableHUD {
return addPlanAction(append(defaultActions, action.NewState()))
}
return addPlanAction(defaultActions)
}
func (a *Agent) prepareHUD() (promptHUD *PromptHUD) {
if !a.options.enableHUD {
return nil
}
return &PromptHUD{
Character: a.Character,
CurrentState: *a.currentState,
PermanentGoal: a.options.permanentGoal,
ShowCharacter: a.options.showCharacter,
}
}
// pickAction picks an action based on the conversation
func (a *Agent) pickAction(job *types.Job, templ string, messages []openai.ChatCompletionMessage, maxRetries int) (types.Action, types.ActionParams, string, error) {
c := messages
xlog.Debug("[pickAction] picking action starts", "messages", messages)
// Identify the goal of this conversation
if !a.options.forceReasoning {
xlog.Debug("not forcing reasoning")
// We could also skip functions here, get a plain reply from the LLM,
// and then use that reply to pick the action
thought, err := a.decision(job,
messages,
a.availableActions().ToTools(),
"",
maxRetries)
if err != nil {
return nil, nil, "", err
}
xlog.Debug(fmt.Sprintf("thought action Name: %v", thought.actioName))
xlog.Debug(fmt.Sprintf("thought message: %v", thought.message))
// Find the action
chosenAction := a.availableActions().Find(thought.actioName)
if chosenAction == nil || thought.actioName == "" {
xlog.Debug("no answer")
// LLM replied with an answer?
//fmt.Errorf("no action found for intent:" + thought.actioName)
return nil, nil, thought.message, nil
}
xlog.Debug(fmt.Sprintf("chosenAction: %v", chosenAction.Definition().Name))
return chosenAction, thought.actionParams, thought.message, nil
}
xlog.Debug("[pickAction] forcing reasoning")
prompt, err := renderTemplate(templ, a.prepareHUD(), a.availableActions(), "")
if err != nil {
return nil, nil, "", err
}
// Get the LLM to think on what to do
// and have a thought
if !Messages(c).Exist(prompt) {
c = append([]openai.ChatCompletionMessage{
{
Role: "system",
Content: prompt,
},
}, c...)
}
thought, err := a.decision(job,
c,
types.Actions{action.NewReasoning()}.ToTools(),
action.NewReasoning().Definition().Name.String(), maxRetries)
if err != nil {
return nil, nil, "", err
}
originalReasoning := ""
response := &action.ReasoningResponse{}
if thought.actionParams != nil {
if err := thought.actionParams.Unmarshal(response); err != nil {
return nil, nil, "", err
}
originalReasoning = response.Reasoning
}
if thought.message != "" {
originalReasoning = thought.message
}
xlog.Debug("[pickAction] picking action", "messages", c)
// thought, err := a.askLLM(ctx,
// c,
actionsID := []string{"reply"}
for _, m := range a.availableActions() {
actionsID = append(actionsID, m.Definition().Name.String())
}
xlog.Debug("[pickAction] actionsID", "actionsID", actionsID)
intentionsTools := action.NewIntention(actionsID...)
// TODO: force the model to select an action here
// NOTE: we do not give the full conversation here to pick the action
// to avoid hallucinations
// Extract an action
params, err := a.decision(job,
append(c, openai.ChatCompletionMessage{
Role: "system",
Content: "Pick the relevant action given the following reasoning: " + originalReasoning,
}),
types.Actions{intentionsTools}.ToTools(),
intentionsTools.Definition().Name.String(), maxRetries)
if err != nil {
return nil, nil, "", fmt.Errorf("failed to get the action tool parameters: %v", err)
}
if params.actionParams == nil {
xlog.Debug("[pickAction] no action params found")
return nil, nil, params.message, nil
}
actionChoice := action.IntentResponse{}
err = params.actionParams.Unmarshal(&actionChoice)
if err != nil {
return nil, nil, "", err
}
if actionChoice.Tool == "" || actionChoice.Tool == "reply" {
xlog.Debug("[pickAction] no action found, replying")
return nil, nil, "", nil
}
chosenAction := a.availableActions().Find(actionChoice.Tool)
xlog.Debug("[pickAction] chosenAction", "chosenAction", chosenAction, "actionName", actionChoice.Tool)
// // Let's double check if the action is correct by asking the LLM to judge it
// if chosenAction!= nil {
// promptString:= "Given the following goal and thoughts, is the action correct? \n\n"
// promptString+= fmt.Sprintf("Goal: %s\n", goalResponse.Goal)
// promptString+= fmt.Sprintf("Thoughts: %s\n", originalReasoning)
// promptString+= fmt.Sprintf("Action: %s\n", chosenAction.Definition().Name.String())
// promptString+= fmt.Sprintf("Action description: %s\n", chosenAction.Definition().Description)
// promptString+= fmt.Sprintf("Action parameters: %s\n", params.actionParams)
// }
return chosenAction, nil, originalReasoning, nil
}

core/agent/agent.go (1072 lines; diff suppressed because it is too large)

@@ -0,0 +1,27 @@
package agent_test
import (
"os"
"testing"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
func TestAgent(t *testing.T) {
RegisterFailHandler(Fail)
RunSpecs(t, "Agent test suite")
}
var testModel = os.Getenv("LOCALAGI_MODEL")
var apiURL = os.Getenv("LOCALAI_API_URL")
var apiKeyURL = os.Getenv("LOCALAI_API_KEY")
func init() {
if testModel == "" {
testModel = "hermes-2-pro-mistral"
}
if apiURL == "" {
apiURL = "http://192.168.68.113:8080"
}
}

core/agent/agent_test.go (356 lines)

@@ -0,0 +1,356 @@
package agent_test
import (
"context"
"fmt"
"net/http"
"strings"
"sync"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/mudler/LocalAGI/services/actions"
. "github.com/mudler/LocalAGI/core/agent"
"github.com/mudler/LocalAGI/core/types"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
)
const testActionResult = "In Boston it's 30C today, it's sunny, and humidity is at 98%"
const testActionResult2 = "In milan it's very hot today, it is 45C and the humidity is at 200%"
const testActionResult3 = "In paris it's very cold today, it is 2C and the humidity is at 10%"
var _ types.Action = &TestAction{}
var debugOptions = []types.JobOption{
types.WithReasoningCallback(func(state types.ActionCurrentState) bool {
xlog.Info("Reasoning", "state", state)
return true
}),
types.WithResultCallback(func(state types.ActionState) {
xlog.Info("Reasoning", "reasoning", state.Reasoning)
xlog.Info("Action", "action", state.Action)
xlog.Info("Result", "result", state.Result)
}),
}
type TestAction struct {
response map[string]string
}
func (a *TestAction) Plannable() bool {
return true
}
func (a *TestAction) Run(c context.Context, p types.ActionParams) (types.ActionResult, error) {
for k, r := range a.response {
if strings.Contains(strings.ToLower(p.String()), strings.ToLower(k)) {
return types.ActionResult{Result: r}, nil
}
}
return types.ActionResult{Result: "No match"}, nil
}
func (a *TestAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "get_weather",
Description: "get current weather",
Properties: map[string]jsonschema.Definition{
"location": {
Type: jsonschema.String,
Description: "The city and state, e.g. San Francisco, CA",
},
"unit": {
Type: jsonschema.String,
Enum: []string{"celsius", "fahrenheit"},
},
},
Required: []string{"location"},
}
}
type FakeStoreResultAction struct {
TestAction
}
func (a *FakeStoreResultAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "store_results",
Description: "store results permanently. Use this tool after you have a result you want to keep.",
Properties: map[string]jsonschema.Definition{
"term": {
Type: jsonschema.String,
Description: "What to store permanently",
},
},
Required: []string{"term"},
}
}
type FakeInternetAction struct {
TestAction
}
func (a *FakeInternetAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "search_internet",
Description: "search on internet",
Properties: map[string]jsonschema.Definition{
"term": {
Type: jsonschema.String,
Description: "What to search for",
},
},
Required: []string{"term"},
}
}
var _ = Describe("Agent test", func() {
Context("jobs", func() {
BeforeEach(func() {
Eventually(func() error {
// check that apiURL is reachable and ready
resp, err := http.Get(apiURL + "/readyz")
if err == nil {
resp.Body.Close()
}
return err
}, "10m", "10s").ShouldNot(HaveOccurred())
})
It("pick the correct action", func() {
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
EnableForceReasoning,
WithTimeout("10m"),
WithLoopDetectionSteps(3),
// WithRandomIdentity(),
WithActions(&TestAction{response: map[string]string{
"boston": testActionResult,
"milan": testActionResult2,
"paris": testActionResult3,
}}),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
res := agent.Ask(
append(debugOptions,
types.WithText("what's the weather in Boston and Milano? Use celsius units"),
)...,
)
Expect(res.Error).ToNot(HaveOccurred())
reasons := []string{}
for _, r := range res.State {
reasons = append(reasons, r.Result)
}
Expect(reasons).To(ContainElement(testActionResult), fmt.Sprint(res))
Expect(reasons).To(ContainElement(testActionResult2), fmt.Sprint(res))
reasons = []string{}
res = agent.Ask(
append(debugOptions,
types.WithText("Now I want to know the weather in Paris, always use celsius units"),
)...)
for _, r := range res.State {
reasons = append(reasons, r.Result)
}
//Expect(reasons).ToNot(ContainElement(testActionResult), fmt.Sprint(res))
//Expect(reasons).ToNot(ContainElement(testActionResult2), fmt.Sprint(res))
Expect(reasons).To(ContainElement(testActionResult3), fmt.Sprint(res))
// conversation := agent.CurrentConversation()
// for _, r := range res.State {
// reasons = append(reasons, r.Result)
// }
// Expect(len(conversation)).To(Equal(10), fmt.Sprint(conversation))
})
It("pick the correct action", func() {
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithTimeout("10m"),
// WithRandomIdentity(),
WithActions(&TestAction{response: map[string]string{
"boston": testActionResult,
},
}),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
res := agent.Ask(
append(debugOptions,
types.WithText("can you get the weather in boston? Use celsius units"))...,
)
reasons := []string{}
for _, r := range res.State {
reasons = append(reasons, r.Result)
}
Expect(reasons).To(ContainElement(testActionResult), fmt.Sprint(res))
})
It("updates the state with internal actions", func() {
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithTimeout("10m"),
EnableHUD,
// EnableStandaloneJob,
// WithRandomIdentity(),
WithPermanentGoal("I want to learn to play music"),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
result := agent.Ask(
types.WithText("Update your goals such as you want to learn to play the guitar"),
)
fmt.Printf("%+v\n", result)
Expect(result.Error).ToNot(HaveOccurred())
Expect(agent.State().Goal).To(ContainSubstring("guitar"), fmt.Sprint(agent.State()))
})
It("Can generate a plan", func() {
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithLLMAPIKey(apiKeyURL),
WithTimeout("10m"),
WithActions(
&TestAction{response: map[string]string{
"boston": testActionResult,
"milan": testActionResult2,
}},
),
EnablePlanning,
EnableForceReasoning,
// EnableStandaloneJob,
// WithRandomIdentity(),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
result := agent.Ask(
types.WithText("Use the plan tool to do two actions in sequence: search for the weather in boston and search for the weather in milan"),
)
Expect(len(result.State)).To(BeNumerically(">", 1))
actionsExecuted := []string{}
actionResults := []string{}
for _, r := range result.State {
xlog.Info(r.Result)
actionsExecuted = append(actionsExecuted, r.Action.Definition().Name.String())
actionResults = append(actionResults, r.ActionResult.Result)
}
Expect(actionsExecuted).To(ContainElement("get_weather"), fmt.Sprint(result))
Expect(actionsExecuted).To(ContainElement("plan"), fmt.Sprint(result))
Expect(actionResults).To(ContainElement(testActionResult), fmt.Sprint(result))
Expect(actionResults).To(ContainElement(testActionResult2), fmt.Sprint(result))
})
It("Can initiate conversations", func() {
message := openai.ChatCompletionMessage{}
mu := &sync.Mutex{}
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithLLMAPIKey(apiKeyURL),
WithTimeout("10m"),
WithNewConversationSubscriber(func(m openai.ChatCompletionMessage) {
mu.Lock()
message = m
mu.Unlock()
}),
WithActions(
actions.NewSearch(map[string]string{}),
),
EnablePlanning,
EnableForceReasoning,
EnableInitiateConversations,
EnableStandaloneJob,
EnableHUD,
WithPeriodicRuns("1s"),
WithPermanentGoal("use the new_conversation tool to initiate a conversation with the user"),
// EnableStandaloneJob,
// WithRandomIdentity(),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
Eventually(func() string {
mu.Lock()
defer mu.Unlock()
return message.Content
}, "10m", "10s").ShouldNot(BeEmpty())
})
/*
It("it automatically performs things in the background", func() {
agent, err := New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
EnableHUD,
EnableStandaloneJob,
WithAgentReasoningCallback(func(state ActionCurrentState) bool {
xlog.Info("Reasoning", state)
return true
}),
WithAgentResultCallback(func(state ActionState) {
xlog.Info("Reasoning", state.Reasoning)
xlog.Info("Action", state.Action)
xlog.Info("Result", state.Result)
}),
WithActions(
&FakeInternetAction{
TestAction{
response:
map[string]string{
"italy": "The weather in italy is sunny",
}
},
},
&FakeStoreResultAction{
TestAction{
response: []string{
"Result permanently stored",
},
},
},
),
//WithRandomIdentity(),
WithPermanentGoal("get the weather of all the cities in italy and store the results"),
)
Expect(err).ToNot(HaveOccurred())
go agent.Run()
defer agent.Stop()
Eventually(func() string {
return agent.State().Goal
}, "10m", "10s").Should(ContainSubstring("weather"), fmt.Sprint(agent.State()))
Eventually(func() string {
return agent.State().String()
}, "10m", "10s").Should(ContainSubstring("store"), fmt.Sprint(agent.State()))
// result := agent.Ask(
// WithText("Update your goals such as you want to learn to play the guitar"),
// )
// fmt.Printf("%+v\n", result)
// Expect(result.Error).ToNot(HaveOccurred())
// Expect(agent.State().Goal).To(ContainSubstring("guitar"), fmt.Sprint(agent.State()))
})
*/
})
})

core/agent/identity.go (53 lines)

@@ -0,0 +1,53 @@
package agent
import (
"fmt"
"os"
"github.com/mudler/LocalAGI/pkg/llm"
)
func (a *Agent) generateIdentity(guidance string) error {
if guidance == "" {
guidance = "Generate a random character for roleplaying."
}
err := llm.GenerateTypedJSON(a.context.Context, a.client, "Generate a character as JSON data. "+guidance, a.options.LLMAPI.Model, a.options.character.ToJSONSchema(), &a.options.character)
//err := llm.GenerateJSONFromStruct(a.context.Context, a.client, guidance, a.options.LLMAPI.Model, &a.options.character)
a.Character = a.options.character
if err != nil {
return fmt.Errorf("failed to generate JSON from structure: %v", err)
}
if !a.validCharacter() {
return fmt.Errorf("generated character is not valid ( guidance: %s ): %v", guidance, a.Character.String())
}
return nil
}
func (a *Agent) prepareIdentity() error {
if !a.options.randomIdentity {
// No identity to generate
return nil
}
if a.options.characterfile == "" {
return a.generateIdentity(a.options.randomIdentityGuidance)
}
if _, err := os.Stat(a.options.characterfile); err == nil {
// if there is a file, load the character back
return a.LoadCharacter(a.options.characterfile)
}
if err := a.generateIdentity(a.options.randomIdentityGuidance); err != nil {
return fmt.Errorf("failed to generate identity: %v", err)
}
// otherwise save it for next time
if err := a.SaveCharacter(a.options.characterfile); err != nil {
return fmt.Errorf("failed to save character: %v", err)
}
return nil
}

core/agent/knowledgebase.go (107 lines)

@@ -0,0 +1,107 @@
package agent
import (
"fmt"
"os"
"path/filepath"
"time"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai"
)
func (a *Agent) knowledgeBaseLookup(conv Messages) {
if (!a.options.enableKB && !a.options.enableLongTermMemory && !a.options.enableSummaryMemory) ||
len(conv) <= 0 {
xlog.Debug("[Knowledge Base Lookup] Disabled, skipping", "agent", a.Character.Name)
return
}
// Walk conversation from bottom to top, and find the first message of the user
// to use it as a query to the KB
lastMessage := conv.GetLatestUserMessage()
if lastMessage == nil || lastMessage.Content == "" {
xlog.Info("[Knowledge Base Lookup] No user message found in conversation", "agent", a.Character.Name)
return
}
userMessage := lastMessage.Content
xlog.Info("[Knowledge Base Lookup] Last user message", "agent", a.Character.Name, "message", userMessage, "lastMessage", lastMessage)
results, err := a.options.ragdb.Search(userMessage, a.options.kbResults)
if err != nil {
xlog.Error("Error finding similar strings inside KB", "error", err)
}
if len(results) == 0 {
xlog.Info("[Knowledge Base Lookup] No similar strings found in KB", "agent", a.Character.Name)
return
}
formatResults := ""
for _, r := range results {
formatResults += fmt.Sprintf("- %s \n", r)
}
xlog.Info("[Knowledge Base Lookup] Found similar strings in KB", "agent", a.Character.Name, "results", formatResults)
// conv = append(conv,
// openai.ChatCompletionMessage{
// Role: "system",
// Content: fmt.Sprintf("Given the user input you have the following in memory:\n%s", formatResults),
// },
// )
conv = append([]openai.ChatCompletionMessage{
{
Role: "system",
Content: fmt.Sprintf("Given the user input you have the following in memory:\n%s", formatResults),
}}, conv...)
}
func (a *Agent) saveConversation(m Messages, prefix string) error {
if a.options.conversationsPath == "" {
return nil
}
dateTime := time.Now().Format("2006-01-02-15-04-05")
fileName := a.Character.Name + "-" + dateTime + ".json"
if prefix != "" {
fileName = prefix + "-" + fileName
}
if err := os.MkdirAll(a.options.conversationsPath, os.ModePerm); err != nil {
return err
}
return m.Save(filepath.Join(a.options.conversationsPath, fileName))
}
func (a *Agent) saveCurrentConversation(conv Messages) {
if err := a.saveConversation(conv, ""); err != nil {
xlog.Error("Error saving conversation", "error", err)
}
if !a.options.enableLongTermMemory && !a.options.enableSummaryMemory {
xlog.Debug("Long term memory is disabled", "agent", a.Character.Name)
return
}
xlog.Info("Saving conversation", "agent", a.Character.Name, "conversation size", len(conv))
if a.options.enableSummaryMemory && len(conv) > 0 {
msg, err := a.askLLM(a.context.Context, []openai.ChatCompletionMessage{{
Role: "user",
Content: "Summarize the conversation below, keep the highlights as a bullet list:\n" + Messages(conv).String(),
}}, maxRetries)
if err != nil {
xlog.Error("Error summarizing conversation", "error", err)
} else if err := a.options.ragdb.Store(msg.Content); err != nil {
xlog.Error("Error storing into memory", "error", err)
}
} else {
for _, message := range conv {
if message.Role == "user" {
if err := a.options.ragdb.Store(message.Content); err != nil {
xlog.Error("Error storing into memory", "error", err)
}
}
}
}
}

core/agent/mcp.go (225 lines)

@@ -0,0 +1,225 @@
package agent
import (
"context"
"encoding/json"
mcp "github.com/metoro-io/mcp-golang"
"github.com/metoro-io/mcp-golang/transport/http"
stdioTransport "github.com/metoro-io/mcp-golang/transport/stdio"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/stdio"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai/jsonschema"
)
var _ types.Action = &mcpAction{}
type MCPServer struct {
URL string `json:"url"`
Token string `json:"token"`
}
type MCPSTDIOServer struct {
Args []string `json:"args"`
Env []string `json:"env"`
Cmd string `json:"cmd"`
}
type mcpAction struct {
mcpClient *mcp.Client
inputSchema ToolInputSchema
toolName string
toolDescription string
}
func (a *mcpAction) Plannable() bool {
return true
}
func (m *mcpAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
resp, err := m.mcpClient.CallTool(ctx, m.toolName, params)
if err != nil {
xlog.Error("Failed to call tool", "error", err.Error())
return types.ActionResult{}, err
}
xlog.Debug("MCP response", "response", resp)
textResult := ""
for _, c := range resp.Content {
switch c.Type {
case mcp.ContentTypeText:
textResult += c.TextContent.Text + "\n"
case mcp.ContentTypeImage:
xlog.Error("Image content not supported yet")
case mcp.ContentTypeEmbeddedResource:
xlog.Error("Embedded resource content not supported yet")
}
}
return types.ActionResult{
Result: textResult,
}, nil
}
func (m *mcpAction) Definition() types.ActionDefinition {
props := map[string]jsonschema.Definition{}
dat, err := json.Marshal(m.inputSchema.Properties)
if err != nil {
xlog.Error("Failed to marshal input schema", "error", err.Error())
}
if err := json.Unmarshal(dat, &props); err != nil {
xlog.Error("Failed to unmarshal input schema properties", "error", err.Error())
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(m.toolName),
Description: m.toolDescription,
Required: m.inputSchema.Required,
//Properties: ,
Properties: props,
}
}
type ToolInputSchema struct {
Type string `json:"type"`
Properties map[string]interface{} `json:"properties,omitempty"`
Required []string `json:"required,omitempty"`
}
func (a *Agent) addTools(client *mcp.Client) (types.Actions, error) {
var generatedActions types.Actions
xlog.Debug("Initializing client")
// Initialize the client
response, e := client.Initialize(a.context)
if e != nil {
xlog.Error("Failed to initialize client", "error", e.Error())
return nil, e
}
xlog.Debug("Client initialized", "instructions", response.Instructions)
var cursor *string
for {
tools, err := client.ListTools(a.context, cursor)
if err != nil {
xlog.Error("Failed to list tools", "error", err.Error())
return nil, err
}
for _, t := range tools.Tools {
desc := ""
if t.Description != nil {
desc = *t.Description
}
xlog.Debug("Tool", "name", t.Name, "description", desc)
dat, err := json.Marshal(t.InputSchema)
if err != nil {
xlog.Error("Failed to marshal input schema", "error", err.Error())
}
xlog.Debug("Input schema", "tool", t.Name, "schema", string(dat))
// XXX: This is a wild guess, to verify (data types might be incompatible)
var inputSchema ToolInputSchema
err = json.Unmarshal(dat, &inputSchema)
if err != nil {
xlog.Error("Failed to unmarshal input schema", "error", err.Error())
}
// Create a new action with Client + tool
generatedActions = append(generatedActions, &mcpAction{
mcpClient: client,
toolName: t.Name,
inputSchema: inputSchema,
toolDescription: desc,
})
}
if tools.NextCursor == nil {
break // No more pages
}
cursor = tools.NextCursor
}
return generatedActions, nil
}
func (a *Agent) initMCPActions() error {
a.mcpActions = nil
var err error
generatedActions := types.Actions{}
// MCP HTTP Servers
for _, mcpServer := range a.options.mcpServers {
transport := http.NewHTTPClientTransport("/mcp")
transport.WithBaseURL(mcpServer.URL)
if mcpServer.Token != "" {
transport.WithHeader("Authorization", "Bearer "+mcpServer.Token)
}
// Create a new client
client := mcp.NewClient(transport)
xlog.Debug("Adding tools for MCP server", "server", mcpServer)
actions, err := a.addTools(client)
if err != nil {
xlog.Error("Failed to add tools for MCP server", "server", mcpServer, "error", err.Error())
}
generatedActions = append(generatedActions, actions...)
}
// MCP STDIO Servers
a.closeMCPSTDIOServers() // Make sure we stop all previous servers if any is active
if a.options.mcpPrepareScript != "" {
xlog.Debug("Preparing MCP box", "script", a.options.mcpPrepareScript)
client := stdio.NewClient(a.options.mcpBoxURL)
client.RunProcess(a.context, "/bin/bash", []string{"-c", a.options.mcpPrepareScript}, []string{})
}
for _, mcpStdioServer := range a.options.mcpStdioServers {
client := stdio.NewClient(a.options.mcpBoxURL)
p, err := client.CreateProcess(a.context,
mcpStdioServer.Cmd,
mcpStdioServer.Args,
mcpStdioServer.Env,
a.Character.Name)
if err != nil {
xlog.Error("Failed to create process", "error", err.Error())
continue
}
read, writer, err := client.GetProcessIO(p.ID)
if err != nil {
xlog.Error("Failed to get process IO", "error", err.Error())
continue
}
transport := stdioTransport.NewStdioServerTransportWithIO(read, writer)
// Create a new client
mcpClient := mcp.NewClient(transport)
xlog.Debug("Adding tools for MCP server (stdio)", "server", mcpStdioServer)
actions, err := a.addTools(mcpClient)
if err != nil {
xlog.Error("Failed to add tools for MCP server", "server", mcpStdioServer, "error", err.Error())
}
generatedActions = append(generatedActions, actions...)
}
a.mcpActions = generatedActions
return err
}
func (a *Agent) closeMCPSTDIOServers() {
client := stdio.NewClient(a.options.mcpBoxURL)
client.StopGroup(a.Character.Name)
}

core/agent/observer.go (88 lines)

@@ -0,0 +1,88 @@
package agent
import (
"encoding/json"
"sync"
"sync/atomic"
"github.com/mudler/LocalAGI/core/sse"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/xlog"
)
type Observer interface {
NewObservable() *types.Observable
Update(types.Observable)
History() []types.Observable
}
type SSEObserver struct {
agent string
maxID int32
manager sse.Manager
mutex sync.Mutex
history []types.Observable
historyLast int
}
func NewSSEObserver(agent string, manager sse.Manager) *SSEObserver {
return &SSEObserver{
agent: agent,
maxID: 1,
manager: manager,
history: make([]types.Observable, 100),
}
}
func (s *SSEObserver) NewObservable() *types.Observable {
id := atomic.AddInt32(&s.maxID, 1)
return &types.Observable{
ID: id - 1,
Agent: s.agent,
}
}
func (s *SSEObserver) Update(obs types.Observable) {
data, err := json.Marshal(obs)
if err != nil {
xlog.Error("Error marshaling observable", "error", err)
return
}
msg := sse.NewMessage(string(data)).WithEvent("observable_update")
s.manager.Send(msg)
s.mutex.Lock()
defer s.mutex.Unlock()
for i, o := range s.history {
if o.ID == obs.ID {
s.history[i] = obs
return
}
}
s.history[s.historyLast] = obs
s.historyLast += 1
if s.historyLast >= len(s.history) {
s.historyLast = 0
}
}
func (s *SSEObserver) History() []types.Observable {
h := make([]types.Observable, 0, 20)
s.mutex.Lock()
defer s.mutex.Unlock()
for _, obs := range s.history {
if obs.ID == 0 {
continue
}
h = append(h, obs)
}
return h
}

core/agent/options.go (new file, 379 lines)
@@ -0,0 +1,379 @@
package agent
import (
"context"
"strings"
"time"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai"
)
type Option func(*options) error
type llmOptions struct {
APIURL string
APIKey string
Model string
MultimodalModel string
}
type options struct {
LLMAPI llmOptions
character Character
randomIdentityGuidance string
randomIdentity bool
userActions types.Actions
enableHUD, standaloneJob, showCharacter, enableKB, enableSummaryMemory, enableLongTermMemory bool
canStopItself bool
initiateConversations bool
loopDetectionSteps int
forceReasoning bool
canPlan bool
characterfile string
statefile string
context context.Context
permanentGoal string
timeout string
periodicRuns time.Duration
kbResults int
ragdb RAGDB
prompts []DynamicPrompt
systemPrompt string
// callbacks
reasoningCallback func(types.ActionCurrentState) bool
resultCallback func(types.ActionState)
conversationsPath string
mcpServers []MCPServer
mcpStdioServers []MCPSTDIOServer
mcpBoxURL string
mcpPrepareScript string
newConversationsSubscribers []func(openai.ChatCompletionMessage)
observer Observer
parallelJobs int
}
func (o *options) SeparatedMultimodalModel() bool {
return o.LLMAPI.MultimodalModel != "" && o.LLMAPI.Model != o.LLMAPI.MultimodalModel
}
func defaultOptions() *options {
return &options{
parallelJobs: 1,
periodicRuns: 15 * time.Minute,
LLMAPI: llmOptions{
APIURL: "http://localhost:8080",
Model: "gpt-4",
},
character: Character{
Name: "",
Age: "",
Occupation: "",
Hobbies: []string{},
MusicTaste: []string{},
},
}
}
func newOptions(opts ...Option) (*options, error) {
options := defaultOptions()
for _, o := range opts {
if err := o(options); err != nil {
return nil, err
}
}
return options, nil
}
var EnableHUD = func(o *options) error {
o.enableHUD = true
return nil
}
var EnableForceReasoning = func(o *options) error {
o.forceReasoning = true
return nil
}
var EnableKnowledgeBase = func(o *options) error {
o.enableKB = true
o.kbResults = 5
return nil
}
var CanStopItself = func(o *options) error {
o.canStopItself = true
return nil
}
func WithTimeout(timeout string) Option {
return func(o *options) error {
o.timeout = timeout
return nil
}
}
func WithLoopDetectionSteps(steps int) Option {
return func(o *options) error {
o.loopDetectionSteps = steps
return nil
}
}
func WithConversationsPath(path string) Option {
return func(o *options) error {
o.conversationsPath = path
return nil
}
}
func EnableKnowledgeBaseWithResults(results int) Option {
return func(o *options) error {
o.enableKB = true
o.kbResults = results
return nil
}
}
func WithParallelJobs(jobs int) Option {
return func(o *options) error {
o.parallelJobs = jobs
return nil
}
}
func WithNewConversationSubscriber(sub func(openai.ChatCompletionMessage)) Option {
return func(o *options) error {
o.newConversationsSubscribers = append(o.newConversationsSubscribers, sub)
return nil
}
}
var EnableInitiateConversations = func(o *options) error {
o.initiateConversations = true
return nil
}
var EnablePlanning = func(o *options) error {
o.canPlan = true
return nil
}
// EnableStandaloneJob is an option to enable the agent
// to run jobs in the background automatically
var EnableStandaloneJob = func(o *options) error {
o.standaloneJob = true
return nil
}
var EnablePersonality = func(o *options) error {
o.showCharacter = true
return nil
}
var EnableSummaryMemory = func(o *options) error {
o.enableSummaryMemory = true
return nil
}
var EnableLongTermMemory = func(o *options) error {
o.enableLongTermMemory = true
return nil
}
func WithRAGDB(db RAGDB) Option {
return func(o *options) error {
o.ragdb = db
return nil
}
}
func WithSystemPrompt(prompt string) Option {
return func(o *options) error {
o.systemPrompt = prompt
return nil
}
}
func WithMCPServers(servers ...MCPServer) Option {
return func(o *options) error {
o.mcpServers = servers
return nil
}
}
func WithMCPSTDIOServers(servers ...MCPSTDIOServer) Option {
return func(o *options) error {
o.mcpStdioServers = servers
return nil
}
}
func WithMCPBoxURL(url string) Option {
return func(o *options) error {
o.mcpBoxURL = url
return nil
}
}
func WithMCPPrepareScript(script string) Option {
return func(o *options) error {
o.mcpPrepareScript = script
return nil
}
}
func WithLLMAPIURL(url string) Option {
return func(o *options) error {
o.LLMAPI.APIURL = url
return nil
}
}
func WithStateFile(path string) Option {
return func(o *options) error {
o.statefile = path
return nil
}
}
func WithCharacterFile(path string) Option {
return func(o *options) error {
o.characterfile = path
return nil
}
}
// WithPrompts adds additional block prompts to the agent
// to be rendered internally in the conversation
// when processing the conversation to the LLM
func WithPrompts(prompts ...DynamicPrompt) Option {
return func(o *options) error {
o.prompts = prompts
return nil
}
}
// WithDynamicPrompts is a helper function to create dynamic prompts.
// Dynamic prompts contain Go code which is executed dynamically
// to render a prompt to the LLM.
// func WithDynamicPrompts(prompts ...map[string]string) Option {
// return func(o *options) error {
// for _, p := range prompts {
// prompt, err := NewDynamicPrompt(p, "")
// if err != nil {
// return err
// }
// o.prompts = append(o.prompts, prompt)
// }
// return nil
// }
// }
func WithLLMAPIKey(key string) Option {
return func(o *options) error {
o.LLMAPI.APIKey = key
return nil
}
}
func WithMultimodalModel(model string) Option {
return func(o *options) error {
o.LLMAPI.MultimodalModel = model
return nil
}
}
func WithPermanentGoal(goal string) Option {
return func(o *options) error {
o.permanentGoal = goal
return nil
}
}
func WithPeriodicRuns(duration string) Option {
	return func(o *options) error {
		t, err := time.ParseDuration(duration)
		if err != nil {
			// Fall back to a 10 minute default when the duration cannot be parsed,
			// instead of overwriting it with the zero value below.
			o.periodicRuns = 10 * time.Minute
			return nil
		}
		o.periodicRuns = t
		return nil
	}
}
func WithContext(ctx context.Context) Option {
return func(o *options) error {
o.context = ctx
return nil
}
}
func WithAgentReasoningCallback(cb func(types.ActionCurrentState) bool) Option {
return func(o *options) error {
o.reasoningCallback = cb
return nil
}
}
func WithAgentResultCallback(cb func(types.ActionState)) Option {
return func(o *options) error {
o.resultCallback = cb
return nil
}
}
func WithModel(model string) Option {
return func(o *options) error {
o.LLMAPI.Model = model
return nil
}
}
func WithCharacter(c Character) Option {
return func(o *options) error {
o.character = c
return nil
}
}
func FromFile(path string) Option {
return func(o *options) error {
c, err := Load(path)
if err != nil {
return err
}
o.character = *c
return nil
}
}
func WithRandomIdentity(guidance ...string) Option {
return func(o *options) error {
o.randomIdentityGuidance = strings.Join(guidance, "")
o.randomIdentity = true
o.showCharacter = true
return nil
}
}
func WithActions(actions ...types.Action) Option {
return func(o *options) error {
o.userActions = actions
return nil
}
}
func WithObserver(observer Observer) Option {
return func(o *options) error {
o.observer = observer
return nil
}
}

core/agent/prompt.go (new file, 6 lines)
@@ -0,0 +1,6 @@
package agent
type DynamicPrompt interface {
Render(a *Agent) (string, error)
Role() string
}

core/agent/state.go (new file, 143 lines)
@@ -0,0 +1,143 @@
package agent
import (
"encoding/json"
"fmt"
"os"
"path/filepath"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// PromptHUD contains
// all information that should be displayed to the LLM
// in the prompts
type PromptHUD struct {
Character Character `json:"character"`
CurrentState types.AgentInternalState `json:"current_state"`
PermanentGoal string `json:"permanent_goal"`
ShowCharacter bool `json:"show_character"`
}
type Character struct {
Name string `json:"name"`
Age string `json:"age"`
Occupation string `json:"job_occupation"`
Hobbies []string `json:"hobbies"`
MusicTaste []string `json:"favorites_music_genres"`
Sex string `json:"sex"`
}
func (c *Character) ToJSONSchema() jsonschema.Definition {
return jsonschema.Definition{
Type: jsonschema.Object,
Properties: map[string]jsonschema.Definition{
"name": {
Type: jsonschema.String,
Description: "The name of the character",
},
"age": {
Type: jsonschema.String,
Description: "The age of the character",
},
"job_occupation": {
Type: jsonschema.String,
Description: "The occupation of the character",
},
"hobbies": {
Type: jsonschema.Array,
Description: "The hobbies of the character",
Items: &jsonschema.Definition{
Type: jsonschema.String,
},
},
"favorites_music_genres": {
Type: jsonschema.Array,
Description: "The favorite music genres of the character",
Items: &jsonschema.Definition{
Type: jsonschema.String,
},
},
"sex": {
Type: jsonschema.String,
Description: "The character sex (male, female)",
},
},
}
}
func Load(path string) (*Character, error) {
data, err := os.ReadFile(path)
if err != nil {
return nil, err
}
var c Character
err = json.Unmarshal(data, &c)
if err != nil {
return nil, err
}
return &c, nil
}
func (a *Agent) State() types.AgentInternalState {
return *a.currentState
}
func (a *Agent) LoadState(path string) error {
data, err := os.ReadFile(path)
if err != nil {
return err
}
return json.Unmarshal(data, a.currentState)
}
func (a *Agent) LoadCharacter(path string) error {
data, err := os.ReadFile(path)
if err != nil {
return err
}
return json.Unmarshal(data, &a.Character)
}
func (a *Agent) SaveState(path string) error {
	os.MkdirAll(filepath.Dir(path), 0755)
	data, err := json.Marshal(a.currentState)
	if err != nil {
		return err
	}
	return os.WriteFile(path, data, 0644)
}
func (a *Agent) SaveCharacter(path string) error {
os.MkdirAll(filepath.Dir(path), 0755)
data, err := json.Marshal(a.Character)
if err != nil {
return err
}
return os.WriteFile(path, data, 0644)
}
func (a *Agent) validCharacter() bool {
return a.Character.Name != ""
}
const fmtT = `=====================
Name: %s
Age: %s
Occupation: %s
Hobbies: %v
Music taste: %v
=====================`
func (c *Character) String() string {
return fmt.Sprintf(
fmtT,
c.Name,
c.Age,
c.Occupation,
c.Hobbies,
c.MusicTaste,
)
}

core/agent/state_test.go (new file, 56 lines)
@@ -0,0 +1,56 @@
package agent_test
import (
"net/http"
. "github.com/mudler/LocalAGI/core/agent"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("Agent test", func() {
Context("identity", func() {
var agent *Agent
BeforeEach(func() {
Eventually(func() error {
// test apiURL is working and available
resp, err := http.Get(apiURL + "/readyz")
if err == nil {
	resp.Body.Close()
}
return err
}, "10m", "10s").ShouldNot(HaveOccurred())
})
It("generates all the fields with random data", func() {
var err error
agent, err = New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithTimeout("10m"),
WithRandomIdentity(),
)
Expect(err).ToNot(HaveOccurred())
By("generating random identity")
Expect(agent.Character.Name).ToNot(BeEmpty())
Expect(agent.Character.Age).ToNot(BeZero())
Expect(agent.Character.Occupation).ToNot(BeEmpty())
Expect(agent.Character.Hobbies).ToNot(BeEmpty())
Expect(agent.Character.MusicTaste).ToNot(BeEmpty())
})
It("detects an invalid character", func() {
var err error
agent, err = New(WithRandomIdentity())
Expect(err).To(HaveOccurred())
})
It("generates all the fields", func() {
var err error
agent, err = New(
WithLLMAPIURL(apiURL),
WithModel(testModel),
WithRandomIdentity("A 90-year-old man with a long beard, a wizard, who lives in a tower."),
)
)
Expect(err).ToNot(HaveOccurred())
Expect(agent.Character.Name).ToNot(BeEmpty())
})
})
})

core/agent/templates.go (new file, 146 lines)
@@ -0,0 +1,146 @@
package agent
import (
"bytes"
"text/template" // text/template: html/template would HTML-escape prompt content
"time"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai"
)
func renderTemplate(templ string, hud *PromptHUD, actions types.Actions, reasoning string) (string, error) {
// prepare the prompt
prompt := bytes.NewBuffer([]byte{})
promptTemplate, err := template.New("pickAction").Parse(templ)
if err != nil {
return "", err
}
// Get all the actions definitions
definitions := []types.ActionDefinition{}
for _, m := range actions {
definitions = append(definitions, m.Definition())
}
err = promptTemplate.Execute(prompt, struct {
HUD *PromptHUD
Actions []types.ActionDefinition
Reasoning string
Messages []openai.ChatCompletionMessage
Time string
}{
Actions: definitions,
HUD: hud,
Reasoning: reasoning,
Time: time.Now().Format(time.RFC3339),
})
if err != nil {
return "", err
}
return prompt.String(), nil
}
const innerMonologueTemplate = `You are an autonomous AI agent thinking out loud and evaluating your current situation.
Your task is to analyze your goals and determine the best course of action.
Consider:
1. Your permanent goal (if any)
2. Your current state and progress
3. Available tools and capabilities
4. Previous actions and their outcomes
You can:
- Take immediate actions using available tools
- Plan future actions
- Update your state and goals
- Initiate conversations with the user when appropriate
Remember to:
- Think critically about each decision
- Consider both short-term and long-term implications
- Be proactive in addressing potential issues
- Maintain awareness of your current state and goals`
const hudTemplate = `{{with .HUD }}{{if .ShowCharacter}}You are an AI assistant with a distinct personality and character traits that influence your responses and actions.
{{if .Character.Name}}Name: {{.Character.Name}}
{{end}}{{if .Character.Age}}Age: {{.Character.Age}}
{{end}}{{if .Character.Occupation}}Occupation: {{.Character.Occupation}}
{{end}}{{if .Character.Hobbies}}Hobbies: {{.Character.Hobbies}}
{{end}}{{if .Character.MusicTaste}}Music Taste: {{.Character.MusicTaste}}
{{end}}
{{end}}
Current State:
- Current Action: {{if .CurrentState.NowDoing}}{{.CurrentState.NowDoing}}{{else}}None{{end}}
- Next Action: {{if .CurrentState.DoingNext}}{{.CurrentState.DoingNext}}{{else}}None{{end}}
- Permanent Goal: {{if .PermanentGoal}}{{.PermanentGoal}}{{else}}None{{end}}
- Current Goal: {{if .CurrentState.Goal}}{{.CurrentState.Goal}}{{else}}None{{end}}
- Action History: {{range .CurrentState.DoneHistory}}{{.}} {{end}}
- Short-term Memory: {{range .CurrentState.Memories}}{{.}} {{end}}{{end}}
Current Time: {{.Time}}`
const pickSelfTemplate = `
You are an autonomous AI agent with a defined character and state (as shown above).
Your task is to evaluate your current situation and determine the best course of action.
Guidelines:
1. Review your current state and goals
2. Consider available tools and their purposes
3. Plan your next steps carefully
4. Update your state appropriately
When making decisions:
- Use the "reply" tool to provide final responses
- Update your state using appropriate tools
- Plan complex tasks using the planning tool
- Consider both immediate and long-term goals
Remember:
- You are autonomous and should not ask for user input
- Your character traits influence your decisions
- Keep track of your progress and state
- Be proactive in addressing potential issues
Available Tools:
{{range .Actions -}}
- {{.Name}}: {{.Description }}
{{ end }}
{{if .Reasoning}}Previous Reasoning: {{.Reasoning}}{{end}}
` + hudTemplate
const reSelfEvalTemplate = pickSelfTemplate
const pickActionTemplate = hudTemplate + `
Your only task is to analyze the conversation and determine a goal and the best tool to use, or provide a final response if the goal has been fulfilled.
Guidelines:
1. Review the current state, what was done already and context
2. Consider available tools and their purposes
3. Plan your approach carefully
4. Explain your reasoning clearly
When choosing actions:
- Use "reply" or "answer" tools for direct responses
- Select appropriate tools for specific tasks
- Consider the impact of each action
- Plan for potential challenges
Decision Process:
1. Analyze the situation
2. Consider available options
3. Choose the best course of action
4. Explain your reasoning
5. Execute the chosen action
Available Tools:
{{range .Actions -}}
- {{.Name}}: {{.Description }}
{{ end }}
{{if .Reasoning}}Previous Reasoning: {{.Reasoning}}{{end}}`
const reEvalTemplate = pickActionTemplate

core/sse/sse.go (new file, 224 lines)
@@ -0,0 +1,224 @@
package sse
import (
"bufio"
"fmt"
"strings"
"sync"
"time"
"github.com/gofiber/fiber/v2"
"github.com/valyala/fasthttp"
)
type (
// Listener defines the interface for the receiving end.
Listener interface {
ID() string
Chan() chan Envelope
}
// Envelope defines the interface for content that can be broadcast to clients.
Envelope interface {
String() string // Represent the envelope contents as a string for transmission.
}
// Manager defines the interface for managing clients and broadcasting messages.
Manager interface {
Send(message Envelope)
Handle(ctx *fiber.Ctx, cl Listener)
Clients() []string
}
History interface {
Add(message Envelope) // Add adds a message to the history.
Send(c Listener) // Send sends the history to a client.
}
)
type Client struct {
id string
ch chan Envelope
}
func NewClient(id string) Listener {
return &Client{
id: id,
ch: make(chan Envelope, 50),
}
}
func (c *Client) ID() string { return c.id }
func (c *Client) Chan() chan Envelope { return c.ch }
// Message represents a simple message implementation.
type Message struct {
Event string
Time time.Time
Data string
}
// NewMessage returns a new message instance.
func NewMessage(data string) *Message {
return &Message{
Data: data,
Time: time.Now(),
}
}
// String returns the message as a string.
func (m *Message) String() string {
sb := strings.Builder{}
if m.Event != "" {
sb.WriteString(fmt.Sprintf("event: %s\n", m.Event))
}
sb.WriteString(fmt.Sprintf("data: %v\n\n", m.Data))
return sb.String()
}
// WithEvent sets the event name for the message.
func (m *Message) WithEvent(event string) Envelope {
m.Event = event
return m
}
// broadcastManager manages the clients and broadcasts messages to them.
type broadcastManager struct {
clients sync.Map
broadcast chan Envelope
workerPoolSize int
messageHistory *history
}
// NewManager initializes and returns a new Manager instance.
func NewManager(workerPoolSize int) Manager {
manager := &broadcastManager{
broadcast: make(chan Envelope),
workerPoolSize: workerPoolSize,
messageHistory: newHistory(10),
}
manager.startWorkers()
return manager
}
// Send broadcasts a message to all connected clients.
func (manager *broadcastManager) Send(message Envelope) {
manager.broadcast <- message
}
// Handle sets up a new client and handles the connection.
func (manager *broadcastManager) Handle(c *fiber.Ctx, cl Listener) {
manager.register(cl)
ctx := c.Context()
ctx.SetContentType("text/event-stream")
ctx.Response.Header.Set("Cache-Control", "no-cache")
ctx.Response.Header.Set("Connection", "keep-alive")
ctx.Response.Header.Set("Access-Control-Allow-Origin", "*")
ctx.Response.Header.Set("Access-Control-Allow-Headers", "Cache-Control")
ctx.Response.Header.Set("Access-Control-Allow-Credentials", "true")
// Send history to the newly connected client
manager.messageHistory.Send(cl)
ctx.SetBodyStreamWriter(fasthttp.StreamWriter(func(w *bufio.Writer) {
for {
select {
case msg, ok := <-cl.Chan():
if !ok {
// If the channel is closed, return from the function
return
}
_, err := fmt.Fprint(w, msg.String())
if err != nil {
// If an error occurs (e.g., client has disconnected), return from the function
return
}
w.Flush()
case <-ctx.Done():
manager.unregister(cl.ID())
close(cl.Chan())
return
}
}
}))
}
// Clients method to list connected client IDs
func (manager *broadcastManager) Clients() []string {
var clients []string
manager.clients.Range(func(key, value any) bool {
id, ok := key.(string)
if ok {
clients = append(clients, id)
}
return true
})
return clients
}
// startWorkers starts worker goroutines for message broadcasting.
func (manager *broadcastManager) startWorkers() {
for i := 0; i < manager.workerPoolSize; i++ {
go func() {
for message := range manager.broadcast {
manager.clients.Range(func(key, value any) bool {
client, ok := value.(Listener)
if !ok {
return true // Continue iteration
}
select {
case client.Chan() <- message:
manager.messageHistory.Add(message)
default:
// If the client's channel is full, drop the message
}
return true // Continue iteration
})
}
}()
}
}
// register adds a client to the manager.
func (manager *broadcastManager) register(client Listener) {
manager.clients.Store(client.ID(), client)
}
// unregister removes a client from the manager.
func (manager *broadcastManager) unregister(clientID string) {
manager.clients.Delete(clientID)
}
type history struct {
messages []Envelope
maxSize int // Maximum number of messages to retain
}
func newHistory(maxSize int) *history {
return &history{
messages: []Envelope{},
maxSize: maxSize,
}
}
func (h *history) Add(message Envelope) {
h.messages = append(h.messages, message)
// Ensure history does not exceed maxSize
if len(h.messages) > h.maxSize {
// Remove the oldest messages to fit the maxSize
h.messages = h.messages[len(h.messages)-h.maxSize:]
}
}
func (h *history) Send(c Listener) {
for _, msg := range h.messages {
c.Chan() <- msg
}
}

core/state/config.go (new file, 465 lines)
@@ -0,0 +1,465 @@
package state
import (
"encoding/json"
"fmt"
"strings"
"github.com/mudler/LocalAGI/core/agent"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
)
type ConnectorConfig struct {
Type string `json:"type"` // e.g. Slack
Config string `json:"config"`
}
type ActionsConfig struct {
Name string `json:"name"` // e.g. search
Config string `json:"config"`
}
type DynamicPromptsConfig struct {
Type string `json:"type"`
Config string `json:"config"`
}
func (d DynamicPromptsConfig) ToMap() map[string]string {
config := map[string]string{}
json.Unmarshal([]byte(d.Config), &config)
return config
}
type AgentConfig struct {
Connector []ConnectorConfig `json:"connectors" form:"connectors" `
Actions []ActionsConfig `json:"actions" form:"actions"`
DynamicPrompts []DynamicPromptsConfig `json:"dynamic_prompts" form:"dynamic_prompts"`
MCPServers []agent.MCPServer `json:"mcp_servers" form:"mcp_servers"`
MCPSTDIOServers []agent.MCPSTDIOServer `json:"mcp_stdio_servers" form:"mcp_stdio_servers"`
MCPPrepareScript string `json:"mcp_prepare_script" form:"mcp_prepare_script"`
MCPBoxURL string `json:"mcp_box_url" form:"mcp_box_url"`
Description string `json:"description" form:"description"`
Model string `json:"model" form:"model"`
MultimodalModel string `json:"multimodal_model" form:"multimodal_model"`
APIURL string `json:"api_url" form:"api_url"`
APIKey string `json:"api_key" form:"api_key"`
LocalRAGURL string `json:"local_rag_url" form:"local_rag_url"`
LocalRAGAPIKey string `json:"local_rag_api_key" form:"local_rag_api_key"`
Name string `json:"name" form:"name"`
HUD bool `json:"hud" form:"hud"`
StandaloneJob bool `json:"standalone_job" form:"standalone_job"`
RandomIdentity bool `json:"random_identity" form:"random_identity"`
InitiateConversations bool `json:"initiate_conversations" form:"initiate_conversations"`
CanPlan bool `json:"enable_planning" form:"enable_planning"`
IdentityGuidance string `json:"identity_guidance" form:"identity_guidance"`
PeriodicRuns string `json:"periodic_runs" form:"periodic_runs"`
PermanentGoal string `json:"permanent_goal" form:"permanent_goal"`
EnableKnowledgeBase bool `json:"enable_kb" form:"enable_kb"`
EnableReasoning bool `json:"enable_reasoning" form:"enable_reasoning"`
KnowledgeBaseResults int `json:"kb_results" form:"kb_results"`
LoopDetectionSteps int `json:"loop_detection_steps" form:"loop_detection_steps"`
CanStopItself bool `json:"can_stop_itself" form:"can_stop_itself"`
SystemPrompt string `json:"system_prompt" form:"system_prompt"`
LongTermMemory bool `json:"long_term_memory" form:"long_term_memory"`
SummaryLongTermMemory bool `json:"summary_long_term_memory" form:"summary_long_term_memory"`
ParallelJobs int `json:"parallel_jobs" form:"parallel_jobs"`
}
type AgentConfigMeta struct {
Fields []config.Field
Connectors []config.FieldGroup
Actions []config.FieldGroup
DynamicPrompts []config.FieldGroup
MCPServers []config.Field
}
func NewAgentConfigMeta(
actionsConfig []config.FieldGroup,
connectorsConfig []config.FieldGroup,
dynamicPromptsConfig []config.FieldGroup,
) AgentConfigMeta {
return AgentConfigMeta{
Fields: []config.Field{
{
Name: "name",
Label: "Name",
Type: "text",
DefaultValue: "",
Required: true,
Tags: config.Tags{Section: "BasicInfo"},
},
{
Name: "description",
Label: "Description",
Type: "textarea",
DefaultValue: "",
Tags: config.Tags{Section: "BasicInfo"},
},
{
Name: "identity_guidance",
Label: "Identity Guidance",
Type: "textarea",
DefaultValue: "",
Tags: config.Tags{Section: "BasicInfo"},
},
{
Name: "random_identity",
Label: "Random Identity",
Type: "checkbox",
DefaultValue: false,
Tags: config.Tags{Section: "BasicInfo"},
},
{
Name: "hud",
Label: "HUD",
Type: "checkbox",
DefaultValue: false,
Tags: config.Tags{Section: "BasicInfo"},
},
{
Name: "model",
Label: "Model",
Type: "text",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "multimodal_model",
Label: "Multimodal Model",
Type: "text",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "api_url",
Label: "API URL",
Type: "text",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "api_key",
Label: "API Key",
Type: "password",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "local_rag_url",
Label: "Local RAG URL",
Type: "text",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "local_rag_api_key",
Label: "Local RAG API Key",
Type: "password",
DefaultValue: "",
Tags: config.Tags{Section: "ModelSettings"},
},
{
Name: "enable_kb",
Label: "Enable Knowledge Base",
Type: "checkbox",
DefaultValue: false,
Tags: config.Tags{Section: "MemorySettings"},
},
{
Name: "kb_results",
Label: "Knowledge Base Results",
Type: "number",
DefaultValue: 5,
Min: 1,
Step: 1,
Tags: config.Tags{Section: "MemorySettings"},
},
{
Name: "long_term_memory",
Label: "Long Term Memory",
Type: "checkbox",
DefaultValue: false,
Tags: config.Tags{Section: "MemorySettings"},
},
{
Name: "summary_long_term_memory",
Label: "Summary Long Term Memory",
Type: "checkbox",
DefaultValue: false,
Tags: config.Tags{Section: "MemorySettings"},
},
{
Name: "system_prompt",
Label: "System Prompt",
Type: "textarea",
DefaultValue: "",
HelpText: "Instructions that define the agent's behavior and capabilities",
Tags: config.Tags{Section: "PromptsGoals"},
},
{
Name: "permanent_goal",
Label: "Permanent Goal",
Type: "textarea",
DefaultValue: "",
HelpText: "Long-term objective for the agent to pursue",
Tags: config.Tags{Section: "PromptsGoals"},
},
{
Name: "standalone_job",
Label: "Standalone Job",
Type: "checkbox",
DefaultValue: false,
HelpText: "Run as a standalone job without user interaction",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "initiate_conversations",
Label: "Initiate Conversations",
Type: "checkbox",
DefaultValue: false,
HelpText: "Allow agent to start conversations on its own",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "enable_planning",
Label: "Enable Planning",
Type: "checkbox",
DefaultValue: false,
HelpText: "Enable agent to create and execute plans",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "can_stop_itself",
Label: "Can Stop Itself",
Type: "checkbox",
DefaultValue: false,
HelpText: "Allow agent to terminate its own execution",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "periodic_runs",
Label: "Periodic Runs",
Type: "text",
DefaultValue: "",
Placeholder: "10m",
HelpText: "Duration for scheduling periodic agent runs",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "enable_reasoning",
Label: "Enable Reasoning",
Type: "checkbox",
DefaultValue: false,
HelpText: "Enable agent to explain its reasoning process",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "loop_detection_steps",
Label: "Max Loop Detection Steps",
Type: "number",
DefaultValue: 5,
Min: 1,
Step: 1,
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "parallel_jobs",
Label: "Parallel Jobs",
Type: "number",
DefaultValue: 5,
Min: 1,
Step: 1,
HelpText: "Number of concurrent tasks that can run in parallel",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "mcp_stdio_servers",
Label: "MCP STDIO Servers",
Type: "textarea",
DefaultValue: "",
HelpText: "JSON configuration for MCP STDIO servers",
Tags: config.Tags{Section: "AdvancedSettings"},
},
{
Name: "mcp_prepare_script",
Label: "MCP Prepare Script",
Type: "textarea",
DefaultValue: "",
HelpText: "Script to prepare the MCP box",
Tags: config.Tags{Section: "AdvancedSettings"},
},
},
MCPServers: []config.Field{
{
Name: "url",
Label: "URL",
Type: config.FieldTypeText,
Required: true,
},
{
Name: "token",
Label: "API Key",
Type: config.FieldTypeText,
Required: true,
},
},
DynamicPrompts: dynamicPromptsConfig,
Connectors: connectorsConfig,
Actions: actionsConfig,
}
}
type Connector interface {
AgentResultCallback() func(state types.ActionState)
AgentReasoningCallback() func(state types.ActionCurrentState) bool
Start(a *agent.Agent)
}
// UnmarshalJSON implements json.Unmarshaler for AgentConfig
func (a *AgentConfig) UnmarshalJSON(data []byte) error {
// Create a temporary type to avoid infinite recursion
type Alias AgentConfig
aux := &struct {
*Alias
MCPSTDIOServersConfig interface{} `json:"mcp_stdio_servers"`
}{
Alias: (*Alias)(a),
}
if err := json.Unmarshal(data, &aux); err != nil {
return err
}
// Handle MCP STDIO servers configuration
if aux.MCPSTDIOServersConfig != nil {
switch v := aux.MCPSTDIOServersConfig.(type) {
case string:
// Parse string configuration
var mcpConfig struct {
MCPServers map[string]struct {
Command string `json:"command"`
Args []string `json:"args"`
Env map[string]string `json:"env"`
} `json:"mcpServers"`
}
if err := json.Unmarshal([]byte(v), &mcpConfig); err != nil {
return fmt.Errorf("failed to parse MCP STDIO servers configuration: %w", err)
}
a.MCPSTDIOServers = make([]agent.MCPSTDIOServer, 0, len(mcpConfig.MCPServers))
for _, server := range mcpConfig.MCPServers {
// Convert env map to slice of "KEY=VALUE" strings
envSlice := make([]string, 0, len(server.Env))
for k, v := range server.Env {
envSlice = append(envSlice, fmt.Sprintf("%s=%s", k, v))
}
a.MCPSTDIOServers = append(a.MCPSTDIOServers, agent.MCPSTDIOServer{
Cmd: server.Command,
Args: server.Args,
Env: envSlice,
})
}
case []interface{}:
// Parse array configuration
a.MCPSTDIOServers = make([]agent.MCPSTDIOServer, 0, len(v))
for _, server := range v {
serverMap, ok := server.(map[string]interface{})
if !ok {
return fmt.Errorf("invalid server configuration format")
}
cmd, _ := serverMap["cmd"].(string)
args := make([]string, 0)
if argsInterface, ok := serverMap["args"].([]interface{}); ok {
for _, arg := range argsInterface {
if argStr, ok := arg.(string); ok {
args = append(args, argStr)
}
}
}
env := make([]string, 0)
if envInterface, ok := serverMap["env"].([]interface{}); ok {
for _, e := range envInterface {
if envStr, ok := e.(string); ok {
env = append(env, envStr)
}
}
}
a.MCPSTDIOServers = append(a.MCPSTDIOServers, agent.MCPSTDIOServer{
Cmd: cmd,
Args: args,
Env: env,
})
}
}
}
return nil
}
// MarshalJSON implements json.Marshaler for AgentConfig
func (a *AgentConfig) MarshalJSON() ([]byte, error) {
// Create a temporary type to avoid infinite recursion
type Alias AgentConfig
aux := &struct {
*Alias
MCPSTDIOServersConfig string `json:"mcp_stdio_servers,omitempty"`
}{
Alias: (*Alias)(a),
}
// Convert MCPSTDIOServers back to the expected JSON format
if len(a.MCPSTDIOServers) > 0 {
mcpConfig := struct {
MCPServers map[string]struct {
Command string `json:"command"`
Args []string `json:"args"`
Env map[string]string `json:"env"`
} `json:"mcpServers"`
}{
MCPServers: make(map[string]struct {
Command string `json:"command"`
Args []string `json:"args"`
Env map[string]string `json:"env"`
}),
}
// Convert each MCPSTDIOServer to the expected format
for i, server := range a.MCPSTDIOServers {
// Convert env slice back to map
envMap := make(map[string]string)
for _, env := range server.Env {
if parts := strings.SplitN(env, "=", 2); len(parts) == 2 {
envMap[parts[0]] = parts[1]
}
}
mcpConfig.MCPServers[fmt.Sprintf("server%d", i)] = struct {
Command string `json:"command"`
Args []string `json:"args"`
Env map[string]string `json:"env"`
}{
Command: server.Cmd,
Args: server.Args,
Env: envMap,
}
}
// Marshal the MCP config to JSON string
mcpConfigJSON, err := json.Marshal(mcpConfig)
if err != nil {
return nil, fmt.Errorf("failed to marshal MCP STDIO servers configuration: %w", err)
}
aux.MCPSTDIOServersConfig = string(mcpConfigJSON)
}
return json.Marshal(aux)
}
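The `UnmarshalJSON`/`MarshalJSON` pair above converts the env configuration between a `map[string]string` and a slice of `KEY=VALUE` strings. A minimal standalone sketch of that round-trip (the helper names `envMapToSlice` and `envSliceToMap` are illustrative, not from the codebase):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// envMapToSlice mirrors the UnmarshalJSON direction: map -> "KEY=VALUE" strings.
func envMapToSlice(env map[string]string) []string {
	out := make([]string, 0, len(env))
	for k, v := range env {
		out = append(out, fmt.Sprintf("%s=%s", k, v))
	}
	sort.Strings(out) // map iteration order is random; sort for determinism
	return out
}

// envSliceToMap mirrors the MarshalJSON direction: "KEY=VALUE" strings -> map.
// Entries without an "=" are skipped, matching the SplitN length check above.
func envSliceToMap(env []string) map[string]string {
	out := make(map[string]string)
	for _, e := range env {
		if parts := strings.SplitN(e, "=", 2); len(parts) == 2 {
			out[parts[0]] = parts[1]
		}
	}
	return out
}

func main() {
	slice := envMapToSlice(map[string]string{"API_KEY": "secret", "DEBUG": "true"})
	fmt.Println(slice)                // [API_KEY=secret DEBUG=true]
	fmt.Println(envSliceToMap(slice)) // map[API_KEY:secret DEBUG:true]
}
```

Note that `SplitN(e, "=", 2)` preserves any `=` inside the value, so values like `TOKEN=a=b` survive the round-trip.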

33
core/state/internal.go Normal file

@@ -0,0 +1,33 @@
package state
import (
. "github.com/mudler/LocalAGI/core/agent"
)
type AgentPoolInternalAPI struct {
*AgentPool
}
func (a *AgentPool) InternalAPI() *AgentPoolInternalAPI {
return &AgentPoolInternalAPI{a}
}
func (a *AgentPoolInternalAPI) GetAgent(name string) *Agent {
return a.agents[name]
}
func (a *AgentPoolInternalAPI) AllAgents() []string {
var agents []string
for agent := range a.agents {
agents = append(agents, agent)
}
return agents
}
func (a *AgentPoolInternalAPI) GetConfig(name string) *AgentConfig {
agent, exists := a.pool[name]
if !exists {
return nil
}
return &agent
}

702
core/state/pool.go Normal file

@@ -0,0 +1,702 @@
package state
import (
"context"
"encoding/base64"
"encoding/json"
"fmt"
"os"
"path/filepath"
"sort"
"strings"
"sync"
"time"
. "github.com/mudler/LocalAGI/core/agent"
"github.com/mudler/LocalAGI/core/sse"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/llm"
"github.com/mudler/LocalAGI/pkg/localrag"
"github.com/mudler/LocalAGI/pkg/utils"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
"github.com/mudler/LocalAGI/pkg/xlog"
)
type AgentPool struct {
sync.Mutex
file string
pooldir string
pool AgentPoolData
agents map[string]*Agent
managers map[string]sse.Manager
agentStatus map[string]*Status
apiURL, defaultModel, defaultMultimodalModel string
mcpBoxURL string
imageModel, localRAGAPI, localRAGKey, apiKey string
availableActions func(*AgentConfig) func(ctx context.Context, pool *AgentPool) []types.Action
connectors func(*AgentConfig) []Connector
dynamicPrompt func(*AgentConfig) []DynamicPrompt
timeout string
conversationLogs string
}
type Status struct {
ActionResults []types.ActionState
}
func (s *Status) addResult(result types.ActionState) {
// Keep a rolling window of the 10 most recent results
if len(s.ActionResults) >= 10 {
s.ActionResults = s.ActionResults[1:]
}
}
s.ActionResults = append(s.ActionResults, result)
}
func (s *Status) Results() []types.ActionState {
return s.ActionResults
}
type AgentPoolData map[string]AgentConfig
func loadPoolFromFile(path string) (*AgentPoolData, error) {
data, err := os.ReadFile(path)
if err != nil {
return nil, err
}
poolData := &AgentPoolData{}
err = json.Unmarshal(data, poolData)
return poolData, err
}
func NewAgentPool(
defaultModel, defaultMultimodalModel, imageModel, apiURL, apiKey, directory, mcpBoxURL string,
LocalRAGAPI string,
availableActions func(*AgentConfig) func(ctx context.Context, pool *AgentPool) []types.Action,
connectors func(*AgentConfig) []Connector,
promptBlocks func(*AgentConfig) []DynamicPrompt,
timeout string,
withLogs bool,
) (*AgentPool, error) {
// if file exists, try to load an existing pool.
// if file does not exist, create a new pool.
poolfile := filepath.Join(directory, "pool.json")
conversationPath := ""
if withLogs {
conversationPath = filepath.Join(directory, "conversations")
}
if _, err := os.Stat(poolfile); err != nil {
// file does not exist, create a new pool
return &AgentPool{
file: poolfile,
pooldir: directory,
apiURL: apiURL,
defaultModel: defaultModel,
defaultMultimodalModel: defaultMultimodalModel,
mcpBoxURL: mcpBoxURL,
imageModel: imageModel,
localRAGAPI: LocalRAGAPI,
apiKey: apiKey,
agents: make(map[string]*Agent),
pool: make(map[string]AgentConfig),
agentStatus: make(map[string]*Status),
managers: make(map[string]sse.Manager),
connectors: connectors,
availableActions: availableActions,
dynamicPrompt: promptBlocks,
timeout: timeout,
conversationLogs: conversationPath,
}, nil
}
poolData, err := loadPoolFromFile(poolfile)
if err != nil {
return nil, err
}
return &AgentPool{
file: poolfile,
apiURL: apiURL,
pooldir: directory,
defaultModel: defaultModel,
defaultMultimodalModel: defaultMultimodalModel,
mcpBoxURL: mcpBoxURL,
imageModel: imageModel,
apiKey: apiKey,
agents: make(map[string]*Agent),
managers: make(map[string]sse.Manager),
agentStatus: map[string]*Status{},
pool: *poolData,
connectors: connectors,
localRAGAPI: LocalRAGAPI,
dynamicPrompt: promptBlocks,
availableActions: availableActions,
timeout: timeout,
conversationLogs: conversationPath,
}, nil
}
func replaceInvalidChars(s string) string {
s = strings.ReplaceAll(s, "/", "_")
return strings.ReplaceAll(s, " ", "_")
}
// CreateAgent adds a new agent to the pool
// and starts it.
// It also saves the state to the file.
func (a *AgentPool) CreateAgent(name string, agentConfig *AgentConfig) error {
a.Lock()
defer a.Unlock()
name = replaceInvalidChars(name)
agentConfig.Name = name
if _, ok := a.pool[name]; ok {
return fmt.Errorf("agent %s already exists", name)
}
a.pool[name] = *agentConfig
if err := a.save(); err != nil {
return err
}
go func(ac AgentConfig) {
// Create the agent avatar
if err := createAgentAvatar(a.apiURL, a.apiKey, a.defaultModel, a.imageModel, a.pooldir, ac); err != nil {
xlog.Error("Failed to create agent avatar", "error", err)
}
}(a.pool[name])
return a.startAgentWithConfig(name, agentConfig, nil)
}
func (a *AgentPool) RecreateAgent(name string, agentConfig *AgentConfig) error {
a.Lock()
defer a.Unlock()
oldAgent, ok := a.agents[name]
if !ok {
return fmt.Errorf("agent %s does not exist", name)
}
var o *types.Observable
obs := oldAgent.Observer()
if obs != nil {
o = obs.NewObservable()
o.Name = "Restarting Agent"
o.Icon = "sync"
o.Creation = &types.Creation{}
obs.Update(*o)
}
stateFile, characterFile := a.stateFiles(name)
os.Remove(stateFile)
os.Remove(characterFile)
oldAgent.Stop()
a.pool[name] = *agentConfig
delete(a.agents, name)
if err := a.save(); err != nil {
if obs != nil {
o.Completion = &types.Completion{Error: err.Error()}
obs.Update(*o)
}
return err
}
if err := a.startAgentWithConfig(name, agentConfig, obs); err != nil {
if obs != nil {
o.Completion = &types.Completion{Error: err.Error()}
obs.Update(*o)
}
return err
}
if obs != nil {
o.Completion = &types.Completion{}
obs.Update(*o)
}
return nil
}
func createAgentAvatar(APIURL, APIKey, model, imageModel, avatarDir string, agent AgentConfig) error {
client := llm.NewClient(APIKey, APIURL+"/v1", "10m")
if imageModel == "" {
return fmt.Errorf("image model not set")
}
if model == "" {
return fmt.Errorf("default model not set")
}
imagePath := filepath.Join(avatarDir, "avatars", fmt.Sprintf("%s.png", agent.Name))
if _, err := os.Stat(imagePath); err == nil {
// Image already exists
xlog.Debug("Avatar already exists", "path", imagePath)
return nil
}
var results struct {
ImagePrompt string `json:"image_prompt"`
}
err := llm.GenerateTypedJSON(
context.Background(),
llm.NewClient(APIKey, APIURL, "10m"),
"Generate a prompt that I can use to create a random avatar for the bot '"+agent.Name+"', the description of the bot is: "+agent.Description,
model,
jsonschema.Definition{
Type: jsonschema.Object,
Properties: map[string]jsonschema.Definition{
"image_prompt": {
Type: jsonschema.String,
Description: "The prompt to generate the image",
},
},
Required: []string{"image_prompt"},
}, &results)
if err != nil {
return fmt.Errorf("failed to generate image prompt: %w", err)
}
if results.ImagePrompt == "" {
xlog.Error("Failed to generate image prompt")
return fmt.Errorf("failed to generate image prompt")
}
req := openai.ImageRequest{
Prompt: results.ImagePrompt,
Model: imageModel,
Size: openai.CreateImageSize256x256,
ResponseFormat: openai.CreateImageResponseFormatB64JSON,
}
ctx, cancel := context.WithTimeout(context.Background(), 120*time.Second)
defer cancel()
resp, err := client.CreateImage(ctx, req)
if err != nil {
return fmt.Errorf("failed to generate image: %w", err)
}
if len(resp.Data) == 0 {
return fmt.Errorf("failed to generate image")
}
imageJson := resp.Data[0].B64JSON
if err := os.MkdirAll(filepath.Join(avatarDir, "avatars"), 0755); err != nil {
return err
}
// Save the image to the agent directory
imageData, err := base64.StdEncoding.DecodeString(imageJson)
if err != nil {
return err
}
return os.WriteFile(imagePath, imageData, 0644)
}
func (a *AgentPool) List() []string {
a.Lock()
defer a.Unlock()
var agents []string
for agent := range a.pool {
agents = append(agents, agent)
}
// return a sorted list
sort.SliceStable(agents, func(i, j int) bool {
return agents[i] < agents[j]
})
return agents
}
func (a *AgentPool) GetStatusHistory(name string) *Status {
a.Lock()
defer a.Unlock()
return a.agentStatus[name]
}
func (a *AgentPool) startAgentWithConfig(name string, config *AgentConfig, obs Observer) error {
var manager sse.Manager
if m, ok := a.managers[name]; ok {
manager = m
} else {
manager = sse.NewManager(5)
}
ctx := context.Background()
model := a.defaultModel
multimodalModel := a.defaultMultimodalModel
if config.MultimodalModel != "" {
multimodalModel = config.MultimodalModel
}
if config.Model != "" {
model = config.Model
}
if config.MCPBoxURL != "" {
a.mcpBoxURL = config.MCPBoxURL
}
if config.PeriodicRuns == "" {
config.PeriodicRuns = "10m"
}
if config.APIURL != "" {
a.apiURL = config.APIURL
}
if config.APIKey != "" {
a.apiKey = config.APIKey
}
if config.LocalRAGURL != "" {
a.localRAGAPI = config.LocalRAGURL
}
if config.LocalRAGAPIKey != "" {
a.localRAGKey = config.LocalRAGAPIKey
}
connectors := a.connectors(config)
promptBlocks := a.dynamicPrompt(config)
actions := a.availableActions(config)(ctx, a)
stateFile, characterFile := a.stateFiles(name)
actionsLog := []string{}
for _, action := range actions {
actionsLog = append(actionsLog, action.Definition().Name.String())
}
connectorLog := []string{}
for _, connector := range connectors {
connectorLog = append(connectorLog, fmt.Sprintf("%+v", connector))
}
xlog.Info(
"Creating agent",
"name", name,
"model", model,
"api_url", a.apiURL,
"actions", actionsLog,
"connectors", connectorLog,
)
// dynamicPrompts := []map[string]string{}
// for _, p := range config.DynamicPrompts {
// dynamicPrompts = append(dynamicPrompts, p.ToMap())
// }
if obs == nil {
obs = NewSSEObserver(name, manager)
}
opts := []Option{
WithModel(model),
WithLLMAPIURL(a.apiURL),
WithContext(ctx),
WithMCPServers(config.MCPServers...),
WithPeriodicRuns(config.PeriodicRuns),
WithPermanentGoal(config.PermanentGoal),
WithMCPSTDIOServers(config.MCPSTDIOServers...),
WithMCPBoxURL(a.mcpBoxURL),
WithPrompts(promptBlocks...),
WithMCPPrepareScript(config.MCPPrepareScript),
// WithDynamicPrompts(dynamicPrompts...),
WithCharacter(Character{
Name: name,
}),
WithActions(
actions...,
),
WithStateFile(stateFile),
WithCharacterFile(characterFile),
WithLLMAPIKey(a.apiKey),
WithTimeout(a.timeout),
WithRAGDB(localrag.NewWrappedClient(a.localRAGAPI, a.localRAGKey, name)),
WithAgentReasoningCallback(func(state types.ActionCurrentState) bool {
xlog.Info(
"Agent is thinking",
"agent", name,
"reasoning", state.Reasoning,
"action", state.Action.Definition().Name,
"params", state.Params,
)
manager.Send(
sse.NewMessage(
fmt.Sprintf(`Thinking: %s`, utils.HTMLify(state.Reasoning)),
).WithEvent("status"),
)
for _, c := range connectors {
if !c.AgentReasoningCallback()(state) {
return false
}
}
return true
}),
WithSystemPrompt(config.SystemPrompt),
WithMultimodalModel(multimodalModel),
WithAgentResultCallback(func(state types.ActionState) {
a.Lock()
if _, ok := a.agentStatus[name]; !ok {
a.agentStatus[name] = &Status{}
}
a.agentStatus[name].addResult(state)
a.Unlock()
xlog.Debug(
"Calling agent result callback",
)
text := fmt.Sprintf(`Reasoning: %s
Action taken: %+v
Parameters: %+v
Result: %s`,
state.Reasoning,
state.ActionCurrentState.Action.Definition().Name,
state.ActionCurrentState.Params,
state.Result)
manager.Send(
sse.NewMessage(
utils.HTMLify(
text,
),
).WithEvent("status"),
)
for _, c := range connectors {
c.AgentResultCallback()(state)
}
}),
WithObserver(obs),
}
if config.HUD {
opts = append(opts, EnableHUD)
}
if a.conversationLogs != "" {
opts = append(opts, WithConversationsPath(a.conversationLogs))
}
if config.StandaloneJob {
opts = append(opts, EnableStandaloneJob)
}
if config.LongTermMemory {
opts = append(opts, EnableLongTermMemory)
}
if config.SummaryLongTermMemory {
opts = append(opts, EnableSummaryMemory)
}
if config.CanStopItself {
opts = append(opts, CanStopItself)
}
if config.CanPlan {
opts = append(opts, EnablePlanning)
}
if config.InitiateConversations {
opts = append(opts, EnableInitiateConversations)
}
if config.RandomIdentity {
if config.IdentityGuidance != "" {
opts = append(opts, WithRandomIdentity(config.IdentityGuidance))
} else {
opts = append(opts, WithRandomIdentity())
}
}
if config.EnableKnowledgeBase {
opts = append(opts, EnableKnowledgeBase)
}
if config.EnableReasoning {
opts = append(opts, EnableForceReasoning)
}
if config.KnowledgeBaseResults > 0 {
opts = append(opts, EnableKnowledgeBaseWithResults(config.KnowledgeBaseResults))
}
if config.LoopDetectionSteps > 0 {
opts = append(opts, WithLoopDetectionSteps(config.LoopDetectionSteps))
}
if config.ParallelJobs > 0 {
opts = append(opts, WithParallelJobs(config.ParallelJobs))
}
xlog.Info("Starting agent", "name", name, "config", config)
agent, err := New(opts...)
if err != nil {
return err
}
a.agents[name] = agent
a.managers[name] = manager
go func() {
if err := agent.Run(); err != nil {
xlog.Error("Agent stopped", "error", err.Error(), "name", name)
}
}()
xlog.Info("Starting connectors", "name", name, "config", config)
for _, c := range connectors {
go c.Start(agent)
}
go func() {
for {
time.Sleep(1 * time.Second) // Send a message every second
manager.Send(sse.NewMessage(
utils.HTMLify(agent.State().String()),
).WithEvent("hud"))
}
}()
xlog.Info("Agent started", "name", name)
return nil
}
// Starts all the agents in the pool
func (a *AgentPool) StartAll() error {
a.Lock()
defer a.Unlock()
for name, config := range a.pool {
if a.agents[name] != nil { // Agent already started
continue
}
if err := a.startAgentWithConfig(name, &config, nil); err != nil {
xlog.Error("Failed to start agent", "name", name, "error", err)
}
}
return nil
}
func (a *AgentPool) StopAll() {
a.Lock()
defer a.Unlock()
for _, agent := range a.agents {
agent.Stop()
}
}
func (a *AgentPool) Stop(name string) {
a.Lock()
defer a.Unlock()
a.stop(name)
}
func (a *AgentPool) stop(name string) {
if agent, ok := a.agents[name]; ok {
agent.Stop()
}
}
func (a *AgentPool) Start(name string) error {
a.Lock()
defer a.Unlock()
if agent, ok := a.agents[name]; ok {
err := agent.Run()
if err != nil {
return fmt.Errorf("agent %s failed to start: %w", name, err)
}
xlog.Info("Agent started", "name", name)
return nil
}
if config, ok := a.pool[name]; ok {
return a.startAgentWithConfig(name, &config, nil)
}
return fmt.Errorf("agent %s not found", name)
}
func (a *AgentPool) stateFiles(name string) (string, string) {
stateFile := filepath.Join(a.pooldir, fmt.Sprintf("%s.state.json", name))
characterFile := filepath.Join(a.pooldir, fmt.Sprintf("%s.character.json", name))
return stateFile, characterFile
}
func (a *AgentPool) Remove(name string) error {
a.Lock()
defer a.Unlock()
// Cleanup character and state
stateFile, characterFile := a.stateFiles(name)
os.Remove(stateFile)
os.Remove(characterFile)
a.stop(name)
delete(a.agents, name)
delete(a.pool, name)
// remove avatar
os.Remove(filepath.Join(a.pooldir, "avatars", fmt.Sprintf("%s.png", name)))
if err := a.save(); err != nil {
return err
}
return nil
}
func (a *AgentPool) Save() error {
a.Lock()
defer a.Unlock()
return a.save()
}
func (a *AgentPool) save() error {
data, err := json.MarshalIndent(a.pool, "", " ")
if err != nil {
return err
}
return os.WriteFile(a.file, data, 0644)
}
func (a *AgentPool) GetAgent(name string) *Agent {
a.Lock()
defer a.Unlock()
return a.agents[name]
}
func (a *AgentPool) AllAgents() []string {
a.Lock()
defer a.Unlock()
var agents []string
for agent := range a.agents {
agents = append(agents, agent)
}
return agents
}
func (a *AgentPool) GetConfig(name string) *AgentConfig {
a.Lock()
defer a.Unlock()
agent, exists := a.pool[name]
if !exists {
return nil
}
return &agent
}
func (a *AgentPool) GetManager(name string) sse.Manager {
a.Lock()
defer a.Unlock()
return a.managers[name]
}

128
core/types/actions.go Normal file

@@ -0,0 +1,128 @@
package types
import (
"context"
"encoding/json"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
)
type ActionContext struct {
context.Context
cancelFunc context.CancelFunc
}
func (ac *ActionContext) Cancel() {
if ac.cancelFunc != nil {
ac.cancelFunc()
}
}
func NewActionContext(ctx context.Context, cancel context.CancelFunc) *ActionContext {
return &ActionContext{
Context: ctx,
cancelFunc: cancel,
}
}
type ActionParams map[string]interface{}
type ActionResult struct {
Job *Job
Result string
Metadata map[string]interface{}
}
func (ap ActionParams) Read(s string) error {
err := json.Unmarshal([]byte(s), &ap)
return err
}
func (ap ActionParams) String() string {
b, _ := json.Marshal(ap)
return string(b)
}
func (ap ActionParams) Unmarshal(v interface{}) error {
b, err := json.Marshal(ap)
if err != nil {
return err
}
if err := json.Unmarshal(b, v); err != nil {
return err
}
return nil
}
//type ActionDefinition openai.FunctionDefinition
type ActionDefinition struct {
Properties map[string]jsonschema.Definition
Required []string
Name ActionDefinitionName
Description string
}
type ActionDefinitionName string
func (a ActionDefinitionName) Is(name string) bool {
return string(a) == name
}
func (a ActionDefinitionName) String() string {
return string(a)
}
func (a ActionDefinition) ToFunctionDefinition() *openai.FunctionDefinition {
return &openai.FunctionDefinition{
Name: a.Name.String(),
Description: a.Description,
Parameters: jsonschema.Definition{
Type: jsonschema.Object,
Properties: a.Properties,
Required: a.Required,
},
}
}
// Action is something the agent can do
type Action interface {
Run(ctx context.Context, action ActionParams) (ActionResult, error)
Definition() ActionDefinition
Plannable() bool
}
type Actions []Action
func (a Actions) ToTools() []openai.Tool {
tools := []openai.Tool{}
for _, action := range a {
tools = append(tools, openai.Tool{
Type: openai.ToolTypeFunction,
Function: action.Definition().ToFunctionDefinition(),
})
}
return tools
}
func (a Actions) Find(name string) Action {
for _, action := range a {
if action.Definition().Name.Is(name) {
return action
}
}
return nil
}
type ActionState struct {
ActionCurrentState
ActionResult
}
type ActionCurrentState struct {
Job *Job
Action Action
Params ActionParams
Reasoning string
}
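`Actions.Find` above does a linear scan matching on the definition name. A self-contained sketch of that lookup with a trimmed-down stand-in for the interface (the `echoAction` type is illustrative, not from the codebase):

```go
package main

import "fmt"

// Trimmed-down stand-ins for the types above (illustrative only).
type ActionDefinitionName string

func (a ActionDefinitionName) Is(name string) bool { return string(a) == name }

type ActionDefinition struct{ Name ActionDefinitionName }

type Action interface{ Definition() ActionDefinition }

type Actions []Action

// Find returns the first action whose definition name matches, or nil.
func (a Actions) Find(name string) Action {
	for _, action := range a {
		if action.Definition().Name.Is(name) {
			return action
		}
	}
	return nil
}

// echoAction is a hypothetical action used only for this example.
type echoAction struct{}

func (echoAction) Definition() ActionDefinition {
	return ActionDefinition{Name: "echo"}
}

func main() {
	actions := Actions{echoAction{}}
	fmt.Println(actions.Find("echo") != nil) // true
	fmt.Println(actions.Find("missing"))     // <nil>
}
```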

208
core/types/job.go Normal file

@@ -0,0 +1,208 @@
package types
import (
"context"
"log"
"github.com/google/uuid"
"github.com/sashabaranov/go-openai"
)
// Job is a request to the agent to do something
type Job struct {
// The job is a request to the agent to do something
// It can be a question, a command, or a request to do something
// The agent will try to do it, and return a response
Result *JobResult
ReasoningCallback func(ActionCurrentState) bool
ResultCallback func(ActionState)
ConversationHistory []openai.ChatCompletionMessage
UUID string
Metadata map[string]interface{}
pastActions []*ActionRequest
nextAction *Action
nextActionParams *ActionParams
nextActionReasoning string
context context.Context
cancel context.CancelFunc
Obs *Observable
}
type ActionRequest struct {
Action Action
Params *ActionParams
}
type JobOption func(*Job)
func WithConversationHistory(history []openai.ChatCompletionMessage) JobOption {
return func(j *Job) {
j.ConversationHistory = history
}
}
func WithReasoningCallback(f func(ActionCurrentState) bool) JobOption {
return func(r *Job) {
r.ReasoningCallback = f
}
}
func WithResultCallback(f func(ActionState)) JobOption {
return func(r *Job) {
r.ResultCallback = f
}
}
func WithMetadata(metadata map[string]interface{}) JobOption {
return func(j *Job) {
j.Metadata = metadata
}
}
// NewJobResult creates a new job result
func NewJobResult() *JobResult {
r := &JobResult{
ready: make(chan bool),
}
return r
}
func (j *Job) Callback(stateResult ActionCurrentState) bool {
if j.ReasoningCallback == nil {
return true
}
return j.ReasoningCallback(stateResult)
}
func (j *Job) CallbackWithResult(stateResult ActionState) {
if j.ResultCallback == nil {
return
}
j.ResultCallback(stateResult)
}
func (j *Job) SetNextAction(action *Action, params *ActionParams, reasoning string) {
j.nextAction = action
j.nextActionParams = params
j.nextActionReasoning = reasoning
}
func (j *Job) AddPastAction(action Action, params *ActionParams) {
j.pastActions = append(j.pastActions, &ActionRequest{
Action: action,
Params: params,
})
}
func (j *Job) GetPastActions() []*ActionRequest {
return j.pastActions
}
func (j *Job) GetNextAction() (*Action, *ActionParams, string) {
return j.nextAction, j.nextActionParams, j.nextActionReasoning
}
func (j *Job) HasNextAction() bool {
return j.nextAction != nil
}
func (j *Job) ResetNextAction() {
j.nextAction = nil
j.nextActionParams = nil
j.nextActionReasoning = ""
}
func WithTextImage(text, image string) JobOption {
return func(j *Job) {
j.ConversationHistory = append(j.ConversationHistory, openai.ChatCompletionMessage{
Role: "user",
MultiContent: []openai.ChatMessagePart{
{
Type: openai.ChatMessagePartTypeText,
Text: text,
},
{
Type: openai.ChatMessagePartTypeImageURL,
ImageURL: &openai.ChatMessageImageURL{URL: image},
},
},
})
}
}
func WithText(text string) JobOption {
return func(j *Job) {
j.ConversationHistory = append(j.ConversationHistory, openai.ChatCompletionMessage{
Role: "user",
Content: text,
})
}
}
func newUUID() string {
// Generate UUID with google/uuid
// https://pkg.go.dev/github.com/google/uuid
// Generate a Version 4 UUID
u, err := uuid.NewRandom()
if err != nil {
log.Fatalf("failed to generate UUID: %v", err)
}
return u.String()
}
// NewJob creates a new job
// It is a request to the agent to do something
// It has a JobResult to get the result asynchronously
// To wait for a Job result, use JobResult.WaitResult()
func NewJob(opts ...JobOption) *Job {
j := &Job{
Result: NewJobResult(),
UUID: newUUID(),
}
for _, o := range opts {
o(j)
}
ctx := j.context
if ctx == nil {
ctx = context.Background()
}
jobCtx, cancel := context.WithCancel(ctx)
j.context = jobCtx
j.cancel = cancel
return j
}
func WithUUID(uuid string) JobOption {
return func(j *Job) {
j.UUID = uuid
}
}
func WithContext(ctx context.Context) JobOption {
return func(j *Job) {
j.context = ctx
}
}
func (j *Job) Cancel() {
j.cancel()
}
func (j *Job) GetContext() context.Context {
return j.context
}
func WithObservable(obs *Observable) JobOption {
return func(j *Job) {
j.Obs = obs
}
}
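`NewJob` uses the functional-options pattern: each `With...` helper returns a closure that mutates the job under construction, applied in order. A minimal standalone sketch of the same pattern (lower-case names are illustrative stand-ins, not the real `types.Job`):

```go
package main

import "fmt"

// job is a pared-down stand-in for types.Job.
type job struct {
	uuid string
	text []string
}

type jobOption func(*job)

func withUUID(id string) jobOption { return func(j *job) { j.uuid = id } }

// withText appends to the conversation, mirroring WithText above.
func withText(t string) jobOption {
	return func(j *job) { j.text = append(j.text, t) }
}

// newJob applies each option in order, mirroring NewJob above.
func newJob(opts ...jobOption) *job {
	j := &job{}
	for _, o := range opts {
		o(j)
	}
	return j
}

func main() {
	j := newJob(withUUID("job-1"), withText("hello"))
	fmt.Println(j.uuid, j.text) // job-1 [hello]
}
```

Because options run in order, a later `withUUID` would overwrite an earlier one, which is exactly how `WithUUID` can replace the random UUID set in the `Job` literal.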

61
core/types/observable.go Normal file

@@ -0,0 +1,61 @@
package types
import (
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai"
)
type Creation struct {
ChatCompletionRequest *openai.ChatCompletionRequest `json:"chat_completion_request,omitempty"`
FunctionDefinition *openai.FunctionDefinition `json:"function_definition,omitempty"`
FunctionParams ActionParams `json:"function_params,omitempty"`
}
type Progress struct {
Error string `json:"error,omitempty"`
ChatCompletionResponse *openai.ChatCompletionResponse `json:"chat_completion_response,omitempty"`
ActionResult string `json:"action_result,omitempty"`
AgentState *AgentInternalState `json:"agent_state"`
}
type Completion struct {
Error string `json:"error,omitempty"`
ChatCompletionResponse *openai.ChatCompletionResponse `json:"chat_completion_response,omitempty"`
Conversation []openai.ChatCompletionMessage `json:"conversation,omitempty"`
ActionResult string `json:"action_result,omitempty"`
AgentState *AgentInternalState `json:"agent_state"`
}
type Observable struct {
ID int32 `json:"id"`
ParentID int32 `json:"parent_id,omitempty"`
Agent string `json:"agent"`
Name string `json:"name"`
Icon string `json:"icon"`
Creation *Creation `json:"creation,omitempty"`
Progress []Progress `json:"progress,omitempty"`
Completion *Completion `json:"completion,omitempty"`
}
func (o *Observable) AddProgress(p Progress) {
if o.Progress == nil {
o.Progress = make([]Progress, 0)
}
o.Progress = append(o.Progress, p)
}
func (o *Observable) MakeLastProgressCompletion() {
if len(o.Progress) == 0 {
xlog.Error("Observable completed without any progress", "id", o.ID, "name", o.Name)
return
}
p := o.Progress[len(o.Progress)-1]
o.Progress = o.Progress[:len(o.Progress)-1]
o.Completion = &Completion{
Error: p.Error,
ChatCompletionResponse: p.ChatCompletionResponse,
ActionResult: p.ActionResult,
AgentState: p.AgentState,
}
}

67
core/types/result.go Normal file

@@ -0,0 +1,67 @@
package types
import (
"sync"
"github.com/sashabaranov/go-openai"
)
// JobResult is the result of a job
type JobResult struct {
sync.Mutex
// The result of a job
State []ActionState
Conversation []openai.ChatCompletionMessage
Finalizers []func([]openai.ChatCompletionMessage)
Response string
Error error
ready chan bool
}
// SetResult appends an action state to the job result
func (j *JobResult) SetResult(text ActionState) {
j.Lock()
defer j.Unlock()
j.State = append(j.State, text)
}
// Finish records the error, marks the result as ready, and runs any registered finalizers
func (j *JobResult) Finish(e error) {
j.Lock()
j.Error = e
j.Unlock()
close(j.ready)
for _, f := range j.Finalizers {
f(j.Conversation)
}
j.Finalizers = []func([]openai.ChatCompletionMessage){}
}
// AddFinalizer adds a finalizer to the job result
func (j *JobResult) AddFinalizer(f func([]openai.ChatCompletionMessage)) {
j.Lock()
defer j.Unlock()
j.Finalizers = append(j.Finalizers, f)
}
// SetResponse sets the final response of a job
func (j *JobResult) SetResponse(response string) {
j.Lock()
defer j.Unlock()
j.Response = response
}
// WaitResult waits for the result of a job
func (j *JobResult) WaitResult() *JobResult {
<-j.ready
j.Lock()
defer j.Unlock()
return j
}
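`JobResult` synchronizes producer and consumers with a `ready` channel: `WaitResult` blocks on a receive, and `Finish` closes the channel, which releases every current and future waiter at once. A pared-down sketch of that pattern (names are illustrative):

```go
package main

import (
	"fmt"
	"sync"
)

// result is a pared-down stand-in for types.JobResult.
type result struct {
	sync.Mutex
	response string
	ready    chan bool
}

func newResult() *result { return &result{ready: make(chan bool)} }

// Finish stores the response and closes ready; a closed channel never
// blocks receivers, so all WaitResult callers are released.
func (r *result) Finish(response string) {
	r.Lock()
	r.response = response
	r.Unlock()
	close(r.ready)
}

// WaitResult blocks until Finish has been called, then returns the response.
func (r *result) WaitResult() string {
	<-r.ready
	r.Lock()
	defer r.Unlock()
	return r.response
}

func main() {
	r := newResult()
	go r.Finish("done")
	fmt.Println(r.WaitResult()) // done
}
```

One caveat this sketch shares with the original: calling `Finish` twice panics on the double `close`, so callers must ensure it runs exactly once.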

41
core/types/state.go Normal file

@@ -0,0 +1,41 @@
package types
import "fmt"
// AgentInternalState keeps track of the agent's current state
// and the short-term memory that the agent can update.
// Besides a long-term memory that is accessible by the agent (with a vector database),
// and a context memory (that is always powered by a vector database),
// this is the shorter memory that the LLM keeps across conversations and
// throughout its reasoning process and lifetime.
// TODO: A special action is then used to let the LLM itself update its memory
// periodically during self-processing, and the same action is ALSO exposed
// during the conversation to let the user set, for example, a new goal for the agent.
type AgentInternalState struct {
NowDoing string `json:"doing_now"`
DoingNext string `json:"doing_next"`
DoneHistory []string `json:"done_history"`
Memories []string `json:"memories"`
Goal string `json:"goal"`
}
const fmtT = `=====================
NowDoing: %s
DoingNext: %s
Your current goal is: %s
You have done: %+v
You have a short memory with: %+v
=====================
`
func (c AgentInternalState) String() string {
return fmt.Sprintf(
fmtT,
c.NowDoing,
c.DoingNext,
c.Goal,
c.DoneHistory,
c.Memories,
)
}

33
docker-compose.intel.yaml Normal file

@@ -0,0 +1,33 @@
services:
localai:
extends:
file: docker-compose.yaml
service: localai
environment:
- LOCALAI_SINGLE_ACTIVE_BACKEND=true
- DEBUG=true
image: localai/localai:master-sycl-f32-ffmpeg-core
devices:
# On a system with integrated GPU and an Arc 770, this is the Arc 770
- /dev/dri/card1
- /dev/dri/renderD129
mcpbox:
extends:
file: docker-compose.yaml
service: mcpbox
localrecall:
extends:
file: docker-compose.yaml
service: localrecall
localrecall-healthcheck:
extends:
file: docker-compose.yaml
service: localrecall-healthcheck
localagi:
extends:
file: docker-compose.yaml
service: localagi


@@ -0,0 +1,38 @@
services:
localai:
extends:
file: docker-compose.yaml
service: localai
environment:
- LOCALAI_SINGLE_ACTIVE_BACKEND=true
- DEBUG=true
image: localai/localai:master-cublas-cuda12-ffmpeg-core
# For images with python backends, use:
# image: localai/localai:master-cublas-cuda12-ffmpeg
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: 1
capabilities: [gpu]
mcpbox:
extends:
file: docker-compose.yaml
service: mcpbox
localrecall:
extends:
file: docker-compose.yaml
service: localrecall
localrecall-healthcheck:
extends:
file: docker-compose.yaml
service: localrecall-healthcheck
localagi:
extends:
file: docker-compose.yaml
service: localagi


@@ -1,31 +1,93 @@
version: "3.9"
services:
api:
image: quay.io/go-skynet/local-ai:master
localai:
# See https://localai.io/basics/container/#standard-container-images for
# a list of available container images (or build your own with the provided Dockerfile)
# Available images with CUDA, ROCm, SYCL, Vulkan
# Image list (quay.io): https://quay.io/repository/go-skynet/local-ai?tab=tags
# Image list (dockerhub): https://hub.docker.com/r/localai/localai
image: localai/localai:master-ffmpeg-core
command:
- ${MODEL_NAME:-gemma-3-12b-it-qat}
- ${MULTIMODAL_MODEL:-minicpm-v-2_6}
- ${IMAGE_MODEL:-sd-1.5-ggml}
- granite-embedding-107m-multilingual
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8080/readyz"]
interval: 1m
timeout: 120m
interval: 60s
timeout: 10m
retries: 120
ports:
- 8090:8080
env_file:
- .env
- 8081:8080
environment:
- DEBUG=true
#- LOCALAI_API_KEY=sk-1234567890
volumes:
- ./models:/models:cached
- ./config:/config:cached
command: ["/usr/bin/local-ai" ]
localagi:
- ./volumes/models:/build/models:cached
- ./volumes/images:/tmp/generated/images
localrecall:
image: quay.io/mudler/localrecall:main
ports:
- 8080
environment:
- COLLECTION_DB_PATH=/db
- EMBEDDING_MODEL=granite-embedding-107m-multilingual
- FILE_ASSETS=/assets
- OPENAI_API_KEY=sk-1234567890
- OPENAI_BASE_URL=http://localai:8080
volumes:
- ./volumes/localrag/db:/db
- ./volumes/localrag/assets/:/assets
localrecall-healthcheck:
depends_on:
localrecall:
condition: service_started
image: busybox
command: ["sh", "-c", "until wget -q -O - http://localrecall:8080 > /dev/null 2>&1; do echo 'Waiting for localrecall...'; sleep 1; done; echo 'localrecall is up!'"]
mcpbox:
build:
context: .
dockerfile: Dockerfile
devices:
- /dev/snd
depends_on:
api:
condition: service_healthy
dockerfile: Dockerfile.mcpbox
ports:
- "8080"
volumes:
- ./db:/app/db
- ./data:/data
env_file:
- .env
- ./volumes/mcpbox:/app/data
# share docker socket if you want it to be able to run docker commands
- /var/run/docker.sock:/var/run/docker.sock
healthcheck:
test: ["CMD", "wget", "-q", "-O", "-", "http://localhost:8080/processes"]
interval: 30s
timeout: 10s
retries: 3
localagi:
depends_on:
localai:
condition: service_healthy
localrecall-healthcheck:
condition: service_completed_successfully
mcpbox:
condition: service_healthy
build:
context: .
dockerfile: Dockerfile.webui
ports:
- 8080:3000
#image: quay.io/mudler/localagi:master
environment:
- LOCALAGI_MODEL=${MODEL_NAME:-gemma-3-12b-it-qat}
- LOCALAGI_MULTIMODAL_MODEL=${MULTIMODAL_MODEL:-minicpm-v-2_6}
- LOCALAGI_IMAGE_MODEL=${IMAGE_MODEL:-sd-1.5-ggml}
- LOCALAGI_LLM_API_URL=http://localai:8080
#- LOCALAGI_LLM_API_KEY=sk-1234567890
- LOCALAGI_LOCALRAG_URL=http://localrecall:8080
- LOCALAGI_STATE_DIR=/pool
- LOCALAGI_TIMEOUT=5m
- LOCALAGI_ENABLE_CONVERSATIONS_LOGGING=false
- LOCALAGI_MCPBOX_URL=http://mcpbox:8080
extra_hosts:
- "host.docker.internal:host-gateway"
volumes:
- ./volumes/localagi/:/pool

12
example/realtimesst/main.py Executable file

@@ -0,0 +1,12 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from RealtimeSTT import AudioToTextRecorder

def process_text(text):
    print(text)

if __name__ == '__main__':
    recorder = AudioToTextRecorder(wake_words="jarvis")

    while True:
        recorder.text(process_text)

go.mod Normal file

@@ -0,0 +1,97 @@
module github.com/mudler/LocalAGI

go 1.24

toolchain go1.24.2

require (
	github.com/bwmarrin/discordgo v0.28.1
	github.com/chasefleming/elem-go v0.30.0
	github.com/dave-gray101/v2keyauth v0.0.0-20240624150259-c45d584d25e2
	github.com/donseba/go-htmx v1.12.0
	github.com/eritikass/githubmarkdownconvertergo v0.1.10
	github.com/go-telegram/bot v1.14.2
	github.com/gofiber/fiber/v2 v2.52.6
	github.com/gofiber/template/html/v2 v2.1.3
	github.com/google/go-github/v69 v69.2.0
	github.com/google/uuid v1.6.0
	github.com/metoro-io/mcp-golang v0.11.0
	github.com/onsi/ginkgo/v2 v2.23.4
	github.com/onsi/gomega v1.37.0
	github.com/philippgille/chromem-go v0.7.0
	github.com/sashabaranov/go-openai v1.38.2
	github.com/slack-go/slack v0.16.0
	github.com/thoj/go-ircevent v0.0.0-20210723090443-73e444401d64
	github.com/tmc/langchaingo v0.1.13
	github.com/traefik/yaegi v0.16.1
	github.com/valyala/fasthttp v1.60.0
	golang.org/x/crypto v0.37.0
	jaytaylor.com/html2text v0.0.0-20230321000545-74c2419ad056
	mvdan.cc/xurls/v2 v2.6.0
)

require (
	github.com/PuerkitoBio/goquery v1.8.1 // indirect
	github.com/andybalholm/brotli v1.1.1 // indirect
	github.com/andybalholm/cascadia v1.3.2 // indirect
	github.com/antchfx/htmlquery v1.3.0 // indirect
	github.com/antchfx/xmlquery v1.3.17 // indirect
	github.com/antchfx/xpath v1.2.4 // indirect
	github.com/bahlo/generic-list-go v0.2.0 // indirect
	github.com/buger/jsonparser v1.1.1 // indirect
	github.com/dlclark/regexp2 v1.10.0 // indirect
	github.com/gin-contrib/sse v0.1.0 // indirect
	github.com/gin-gonic/gin v1.8.1 // indirect
	github.com/go-logr/logr v1.4.2 // indirect
	github.com/go-playground/locales v0.14.0 // indirect
	github.com/go-playground/universal-translator v0.18.0 // indirect
	github.com/go-playground/validator/v10 v10.10.0 // indirect
	github.com/go-task/slim-sprig/v3 v3.0.0 // indirect
	github.com/gobwas/glob v0.2.3 // indirect
	github.com/goccy/go-json v0.9.7 // indirect
	github.com/gocolly/colly v1.2.0 // indirect
	github.com/gofiber/template v1.8.3 // indirect
	github.com/gofiber/utils v1.1.0 // indirect
	github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da // indirect
	github.com/golang/protobuf v1.5.4 // indirect
	github.com/google/go-cmp v0.7.0 // indirect
	github.com/google/go-querystring v1.1.0 // indirect
	github.com/google/pprof v0.0.0-20250403155104-27863c87afa6 // indirect
	github.com/gorilla/websocket v1.5.3 // indirect
	github.com/invopop/jsonschema v0.12.0 // indirect
	github.com/json-iterator/go v1.1.12 // indirect
	github.com/kennygrant/sanitize v1.2.4 // indirect
	github.com/klauspost/compress v1.18.0 // indirect
	github.com/leodido/go-urn v1.2.1 // indirect
	github.com/mailru/easyjson v0.7.7 // indirect
	github.com/mattn/go-colorable v0.1.13 // indirect
	github.com/mattn/go-isatty v0.0.20 // indirect
	github.com/mattn/go-runewidth v0.0.16 // indirect
	github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
	github.com/modern-go/reflect2 v1.0.2 // indirect
	github.com/olekukonko/tablewriter v0.0.5 // indirect
	github.com/pelletier/go-toml/v2 v2.0.9 // indirect
	github.com/pkg/errors v0.9.1 // indirect
	github.com/pkoukk/tiktoken-go v0.1.6 // indirect
	github.com/rivo/uniseg v0.2.0 // indirect
	github.com/saintfish/chardet v0.0.0-20230101081208-5e3ef4b5456d // indirect
	github.com/ssor/bom v0.0.0-20170718123548-6386211fdfcf // indirect
	github.com/temoto/robotstxt v1.1.2 // indirect
	github.com/tidwall/gjson v1.18.0 // indirect
	github.com/tidwall/match v1.1.1 // indirect
	github.com/tidwall/pretty v1.2.1 // indirect
	github.com/tidwall/sjson v1.2.5 // indirect
	github.com/ugorji/go/codec v1.2.7 // indirect
	github.com/valyala/bytebufferpool v1.0.0 // indirect
	github.com/wk8/go-ordered-map/v2 v2.1.8 // indirect
	go.starlark.net v0.0.0-20230302034142-4b1e35fe2254 // indirect
	go.uber.org/automaxprocs v1.6.0 // indirect
	golang.org/x/net v0.38.0 // indirect
	golang.org/x/sys v0.32.0 // indirect
	golang.org/x/text v0.24.0 // indirect
	golang.org/x/tools v0.31.0 // indirect
	google.golang.org/appengine v1.6.8 // indirect
	google.golang.org/protobuf v1.36.5 // indirect
	gopkg.in/yaml.v2 v2.4.0 // indirect
	gopkg.in/yaml.v3 v3.0.1 // indirect
)

go.sum Normal file

@@ -0,0 +1,354 @@
cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
github.com/PuerkitoBio/goquery v1.8.1 h1:uQxhNlArOIdbrH1tr0UXwdVFgDcZDrZVdcpygAcwmWM=
github.com/PuerkitoBio/goquery v1.8.1/go.mod h1:Q8ICL1kNUJ2sXGoAhPGUdYDJvgQgHzJsnnd3H7Ho5jQ=
github.com/andybalholm/brotli v1.1.1 h1:PR2pgnyFznKEugtsUo0xLdDop5SKXd5Qf5ysW+7XdTA=
github.com/andybalholm/brotli v1.1.1/go.mod h1:05ib4cKhjx3OQYUY22hTVd34Bc8upXjOLL2rKwwZBoA=
github.com/andybalholm/cascadia v1.3.1/go.mod h1:R4bJ1UQfqADjvDa4P6HZHLh/3OxWWEqc0Sk8XGwHqvA=
github.com/andybalholm/cascadia v1.3.2 h1:3Xi6Dw5lHF15JtdcmAHD3i1+T8plmv7BQ/nsViSLyss=
github.com/andybalholm/cascadia v1.3.2/go.mod h1:7gtRlve5FxPPgIgX36uWBX58OdBsSS6lUvCFb+h7KvU=
github.com/antchfx/htmlquery v1.3.0 h1:5I5yNFOVI+egyia5F2s/5Do2nFWxJz41Tr3DyfKD25E=
github.com/antchfx/htmlquery v1.3.0/go.mod h1:zKPDVTMhfOmcwxheXUsx4rKJy8KEY/PU6eXr/2SebQ8=
github.com/antchfx/xmlquery v1.3.17 h1:d0qWjPp/D+vtRw7ivCwT5ApH/3CkQU8JOeo3245PpTk=
github.com/antchfx/xmlquery v1.3.17/go.mod h1:Afkq4JIeXut75taLSuI31ISJ/zeq+3jG7TunF7noreA=
github.com/antchfx/xpath v1.2.3/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
github.com/antchfx/xpath v1.2.4 h1:dW1HB/JxKvGtJ9WyVGJ0sIoEcqftV3SqIstujI+B9XY=
github.com/antchfx/xpath v1.2.4/go.mod h1:i54GszH55fYfBmoZXapTHN8T8tkcHfRgLyVwwqzXNcs=
github.com/bahlo/generic-list-go v0.2.0 h1:5sz/EEAK+ls5wF+NeqDpk5+iNdMDXrh3z3nPnH1Wvgk=
github.com/bahlo/generic-list-go v0.2.0/go.mod h1:2KvAjgMlE5NNynlg/5iLrrCCZ2+5xWbdbCW3pNTGyYg=
github.com/buger/jsonparser v1.1.1 h1:2PnMjfWD7wBILjqQbt530v576A/cAbQvEW9gGIpYMUs=
github.com/buger/jsonparser v1.1.1/go.mod h1:6RYKKt7H4d4+iWqouImQ9R2FZql3VbhNgx27UK13J/0=
github.com/bwmarrin/discordgo v0.28.1 h1:gXsuo2GBO7NbR6uqmrrBDplPUx2T3nzu775q/Rd1aG4=
github.com/bwmarrin/discordgo v0.28.1/go.mod h1:NJZpH+1AfhIcyQsPeuBKsUtYrRnjkyu0kIVMCHkZtRY=
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/chasefleming/elem-go v0.30.0 h1:BlhV1ekv1RbFiM8XZUQeln1Ikb4D+bu2eDO4agREvok=
github.com/chasefleming/elem-go v0.30.0/go.mod h1:hz73qILBIKnTgOujnSMtEj20/epI+f6vg71RUilJAA4=
github.com/chzyer/logex v1.1.10/go.mod h1:+Ywpsq7O8HXn0nuIou7OrIPyXbp3wmkHB+jjWRnGsAI=
github.com/chzyer/readline v0.0.0-20180603132655-2972be24d48e/go.mod h1:nSuG5e5PlCu98SY8svDHJxuZscDgtXS6KTTbou5AhLI=
github.com/chzyer/test v0.0.0-20180213035817-a1ea475d72b1/go.mod h1:Q3SI9o4m/ZMnBNeIyt5eFwwo7qiLfzFZmjNmxjkiQlU=
github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
github.com/dave-gray101/v2keyauth v0.0.0-20240624150259-c45d584d25e2 h1:flLYmnQFZNo04x2NPehMbf30m7Pli57xwZ0NFqR/hb0=
github.com/dave-gray101/v2keyauth v0.0.0-20240624150259-c45d584d25e2/go.mod h1:NtWqRzAp/1tw+twkW8uuBenEVVYndEAZACWU3F3xdoQ=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.10.0 h1:+/GIL799phkJqYW+3YbOd8LCcbHzT0Pbo8zl70MHsq0=
github.com/dlclark/regexp2 v1.10.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/donseba/go-htmx v1.12.0 h1:7tESER0uxaqsuGMv3yP3pK1drfBUXM6apG4H7/3+IgE=
github.com/donseba/go-htmx v1.12.0/go.mod h1:8PTAYvNKf8+QYis+DpAsggKz+sa2qljtMgvdAeNBh5s=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/protoc-gen-validate v0.1.0/go.mod h1:iSmxcyjqTsJpI2R4NaDN7+kN2VEUnK/pcBlmesArF7c=
github.com/eritikass/githubmarkdownconvertergo v0.1.10 h1:mL93ADvYMOeT15DcGtK9AaFFc+RcWcy6kQBC6yS/5f4=
github.com/eritikass/githubmarkdownconvertergo v0.1.10/go.mod h1:BdpHs6imOtzE5KorbUtKa6bZ0ZBh1yFcrTTAL8FwDKY=
github.com/gin-contrib/sse v0.1.0 h1:Y/yl/+YNO8GZSjAhjMsSuLt29uWRFHdHYUb5lYOV9qE=
github.com/gin-contrib/sse v0.1.0/go.mod h1:RHrZQHXnP2xjPF+u1gW/2HnVO7nvIa9PG3Gm+fLHvGI=
github.com/gin-gonic/gin v1.8.1 h1:4+fr/el88TOO3ewCmQr8cx/CtZ/umlIRIs5M4NTNjf8=
github.com/gin-gonic/gin v1.8.1/go.mod h1:ji8BvRH1azfM+SYow9zQ6SZMvR8qOMZHmsCuWR9tTTk=
github.com/go-logr/logr v1.4.2 h1:6pFjapn8bFcIbiKo3XT4j/BhANplGihG6tvd+8rYgrY=
github.com/go-logr/logr v1.4.2/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-playground/assert/v2 v2.0.1 h1:MsBgLAaY856+nPRTKrp3/OZK38U/wa0CcBYNjji3q3A=
github.com/go-playground/assert/v2 v2.0.1/go.mod h1:VDjEfimB/XKnb+ZQfWdccd7VUvScMdVu0Titje2rxJ4=
github.com/go-playground/locales v0.14.0 h1:u50s323jtVGugKlcYeyzC0etD1HifMjqmJqb8WugfUU=
github.com/go-playground/locales v0.14.0/go.mod h1:sawfccIbzZTqEDETgFXqTho0QybSa7l++s0DH+LDiLs=
github.com/go-playground/universal-translator v0.18.0 h1:82dyy6p4OuJq4/CByFNOn/jYrnRPArHwAcmLoJZxyho=
github.com/go-playground/universal-translator v0.18.0/go.mod h1:UvRDBj+xPUEGrFYl+lu/H90nyDXpg0fqeB/AQUGNTVA=
github.com/go-playground/validator/v10 v10.10.0 h1:I7mrTYv78z8k8VXa/qJlOlEXn/nBh+BF8dHX5nt/dr0=
github.com/go-playground/validator/v10 v10.10.0/go.mod h1:74x4gJWsvQexRdW8Pn3dXSGrTK4nAUsbPlLADvpJkos=
github.com/go-task/slim-sprig/v3 v3.0.0 h1:sUs3vkvUymDpBKi3qH1YSqBQk9+9D/8M2mN1vB6EwHI=
github.com/go-task/slim-sprig/v3 v3.0.0/go.mod h1:W848ghGpv3Qj3dhTPRyJypKRiqCdHZiAzKg9hl15HA8=
github.com/go-telegram/bot v1.14.2 h1:j9hXerxTuvkw7yFi3sF5jjRVGozNVKkMQSKjMeBJ5FY=
github.com/go-telegram/bot v1.14.2/go.mod h1:i2TRs7fXWIeaceF3z7KzsMt/he0TwkVC680mvdTFYeM=
github.com/go-test/deep v1.0.4 h1:u2CU3YKy9I2pmu9pX0eq50wCgjfGIt539SqR7FbHiho=
github.com/go-test/deep v1.0.4/go.mod h1:wGDj63lr65AM2AQyKZd/NYHGb0R+1RLqB8NKt3aSFNA=
github.com/gobwas/glob v0.2.3 h1:A4xDbljILXROh+kObIiy5kIaPYD8e96x1tgBhUI5J+Y=
github.com/gobwas/glob v0.2.3/go.mod h1:d3Ez4x06l9bZtSvzIay5+Yzi0fmZzPgnTbPcKjJAkT8=
github.com/goccy/go-json v0.9.7 h1:IcB+Aqpx/iMHu5Yooh7jEzJk1JZ7Pjtmys2ukPr7EeM=
github.com/goccy/go-json v0.9.7/go.mod h1:6MelG93GURQebXPDq3khkgXZkazVtN9CRI+MGFi0w8I=
github.com/gocolly/colly v1.2.0 h1:qRz9YAn8FIH0qzgNUw+HT9UN7wm1oF9OBAilwEWpyrI=
github.com/gocolly/colly v1.2.0/go.mod h1:Hof5T3ZswNVsOHYmba1u03W65HDWgpV5HifSuueE0EA=
github.com/gofiber/fiber/v2 v2.52.6 h1:Rfp+ILPiYSvvVuIPvxrBns+HJp8qGLDnLJawAu27XVI=
github.com/gofiber/fiber/v2 v2.52.6/go.mod h1:YEcBbO/FB+5M1IZNBP9FO3J9281zgPAreiI1oqg8nDw=
github.com/gofiber/template v1.8.3 h1:hzHdvMwMo/T2kouz2pPCA0zGiLCeMnoGsQZBTSYgZxc=
github.com/gofiber/template v1.8.3/go.mod h1:bs/2n0pSNPOkRa5VJ8zTIvedcI/lEYxzV3+YPXdBvq8=
github.com/gofiber/template/html/v2 v2.1.3 h1:n1LYBtmr9C0V/k/3qBblXyMxV5B0o/gpb6dFLp8ea+o=
github.com/gofiber/template/html/v2 v2.1.3/go.mod h1:U5Fxgc5KpyujU9OqKzy6Kn6Qup6Tm7zdsISR+VpnHRE=
github.com/gofiber/utils v1.1.0 h1:vdEBpn7AzIUJRhe+CiTOJdUcTg4Q9RK+pEa0KPbLdrM=
github.com/gofiber/utils v1.1.0/go.mod h1:poZpsnhBykfnY1Mc0KeEa6mSHrS3dV0+oBWyeQmb2e0=
github.com/golang/glog v0.0.0-20160126235308-23def4e6c14b/go.mod h1:SBH7ygxi8pfUlaOkMMuAQtPIUF8ecWP5IEl/CR7VP2Q=
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da h1:oI5xCqsCo564l8iNU+DwB5epxmsaqB+rhGL0m5jtYqE=
github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
github.com/golang/mock v1.1.1/go.mod h1:oTYuIxOrZwtPieC+H1uAHpcLFnEyAGVDL/k47Jfbm0A=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8=
github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA=
github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs=
github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w=
github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0=
github.com/golang/protobuf v1.4.1/go.mod h1:U8fpvMrcmy5pZrNK1lt4xCsGvpyWQ/VVv6QDs8UjoX8=
github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.1/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.7/go.mod h1:n+brtR0CgQNWTVd5ZUFpTBC8YFBDLK/h/bpaJ8/DtOE=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/go-github/v69 v69.2.0 h1:wR+Wi/fN2zdUx9YxSmYE0ktiX9IAR/BeePzeaUUbEHE=
github.com/google/go-github/v69 v69.2.0/go.mod h1:xne4jymxLR6Uj9b7J7PyTpkMYstEMMwGZa0Aehh1azM=
github.com/google/go-querystring v1.1.0 h1:AnCroh3fv4ZBgVIf1Iwtovgjaw/GiKJo8M8yD/fhyJ8=
github.com/google/go-querystring v1.1.0/go.mod h1:Kcdr2DB4koayq7X8pmAG4sNG59So17icRSOU623lUBU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/pprof v0.0.0-20250403155104-27863c87afa6 h1:BHT72Gu3keYf3ZEu2J0b1vyeLSOYI8bm5wbJM/8yDe8=
github.com/google/pprof v0.0.0-20250403155104-27863c87afa6/go.mod h1:boTsfXsheKC2y+lKOCMpSfarhxDeIzfZG1jqGcPl3cA=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/gorilla/websocket v1.4.2/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
github.com/gorilla/websocket v1.5.3/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
github.com/invopop/jsonschema v0.12.0 h1:6ovsNSuvn9wEQVOyc72aycBMVQFKz7cPdMJn10CvzRI=
github.com/invopop/jsonschema v0.12.0/go.mod h1:ffZ5Km5SWWRAIN6wbDXItl95euhFz2uON45H2qjYt+0=
github.com/josharian/intern v1.0.0/go.mod h1:5DoeVV0s6jJacbCEi61lwdGj/aVlrQvzHFFd8Hwg//Y=
github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
github.com/kennygrant/sanitize v1.2.4 h1:gN25/otpP5vAsO2djbMhF/LQX6R7+O1TB4yv8NzpJ3o=
github.com/kennygrant/sanitize v1.2.4/go.mod h1:LGsjYYtgxbetdg5owWB2mpgUL6e2nfw2eObZ0u0qvak=
github.com/klauspost/compress v1.18.0 h1:c/Cqfb0r+Yi+JtIEq73FWXVkRonBlf0CRNYc8Zttxdo=
github.com/klauspost/compress v1.18.0/go.mod h1:2Pp+KzxcywXVXMr50+X0Q/Lsb43OQHYWRCY2AiWywWQ=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pretty v0.2.1/go.mod h1:ipq/a2n7PKx3OHsz4KJII5eveXtPO4qwEXGdVfWzfnI=
github.com/kr/pretty v0.3.0/go.mod h1:640gp4NfQd8pI5XOwp5fnNeVWj67G7CFk/SaSQn7NBk=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/leodido/go-urn v1.2.1 h1:BqpAaACuzVSgi/VLzGZIobT2z4v53pjosyNd9Yv6n/w=
github.com/leodido/go-urn v1.2.1/go.mod h1:zt4jvISO2HfUBqxjfIshjdMTYS56ZS/qv49ictyFfxY=
github.com/mailru/easyjson v0.7.7 h1:UGYAvKxe3sBsEDzO8ZeWOSlIQfWFlxbzLZe7hwFURr0=
github.com/mailru/easyjson v0.7.7/go.mod h1:xzfreul335JAWq5oZzymOObrkdz5UnU4kGfJJLY9Nlc=
github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-runewidth v0.0.9/go.mod h1:H031xJmbD/WCDINGzjvQ9THkh0rPKHF+m2gUSrubnMI=
github.com/mattn/go-runewidth v0.0.16 h1:E5ScNMtiwvlvB5paMFdw9p4kSQzbXFikJ5SQO6TULQc=
github.com/mattn/go-runewidth v0.0.16/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
github.com/metoro-io/mcp-golang v0.11.0 h1:1k+VSE9QaeMTLn0gJ3FgE/DcjsCBsLFnz5eSFbgXUiI=
github.com/metoro-io/mcp-golang v0.11.0/go.mod h1:ifLP9ZzKpN1UqFWNTpAHOqSvNkMK6b7d1FSZ5Lu0lN0=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v1.0.2 h1:xBagoLtFs94CBntxluKeaWgTMpvLxC4ur3nMaC9Gz0M=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/olekukonko/tablewriter v0.0.5 h1:P2Ga83D34wi1o9J6Wh1mRuqd4mF/x/lgBS7N7AbDhec=
github.com/olekukonko/tablewriter v0.0.5/go.mod h1:hPp6KlRPjbx+hW8ykQs1w3UBbZlj6HuIJcUGPhkA7kY=
github.com/onsi/ginkgo/v2 v2.23.4 h1:ktYTpKJAVZnDT4VjxSbiBenUjmlL/5QkBEocaWXiQus=
github.com/onsi/ginkgo/v2 v2.23.4/go.mod h1:Bt66ApGPBFzHyR+JO10Zbt0Gsp4uWxu5mIOTusL46e8=
github.com/onsi/gomega v1.37.0 h1:CdEG8g0S133B4OswTDC/5XPSzE1OeP29QOioj2PID2Y=
github.com/onsi/gomega v1.37.0/go.mod h1:8D9+Txp43QWKhM24yyOBEdpkzN8FvJyAwecBgsU4KU0=
github.com/pelletier/go-toml/v2 v2.0.9 h1:uH2qQXheeefCCkuBBSLi7jCiSmj3VRh2+Goq2N7Xxu0=
github.com/pelletier/go-toml/v2 v2.0.9/go.mod h1:tJU2Z3ZkXwnxa4DPO899bsyIoywizdUvyaeZurnPPDc=
github.com/philippgille/chromem-go v0.7.0 h1:4jfvfyKymjKNfGxBUhHUcj1kp7B17NL/I1P+vGh1RvY=
github.com/philippgille/chromem-go v0.7.0/go.mod h1:hTd+wGEm/fFPQl7ilfCwQXkgEUxceYh86iIdoKMolPo=
github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkoukk/tiktoken-go v0.1.6 h1:JF0TlJzhTbrI30wCvFuiw6FzP2+/bR+FIxUdgEAcUsw=
github.com/pkoukk/tiktoken-go v0.1.6/go.mod h1:9NiV+i9mJKGj1rYOT+njbv+ZwA/zJxYdewGl6qVatpg=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prashantv/gostub v1.1.0 h1:BTyx3RfQjRHnUWaGF9oQos79AlQ5k8WNktv7VGvVH4g=
github.com/prashantv/gostub v1.1.0/go.mod h1:A5zLQHz7ieHGG7is6LLXLz7I8+3LZzsrV0P1IAHhP5U=
github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/rivo/uniseg v0.2.0 h1:S1pD9weZBuJdFmowNwbpi7BJ8TNftyUImj/0WQi72jY=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rogpeppe/go-internal v1.6.1/go.mod h1:xXDCJY+GAPziupqXw64V24skbSoqbTEfhy4qGm1nDQc=
github.com/rogpeppe/go-internal v1.8.0/go.mod h1:WmiCO8CzOY8rg0OYDC4/i/2WRWAB6poM+XZ2dLUbcbE=
github.com/rogpeppe/go-internal v1.13.2-0.20241226121412-a5dc8ff20d0a h1:w3tdWGKbLGBPtR/8/oO74W6hmz0qE5q0z9aqSAewaaM=
github.com/rogpeppe/go-internal v1.13.2-0.20241226121412-a5dc8ff20d0a/go.mod h1:S8kfXMp+yh77OxPD4fdM6YUknrZpQxLhvxzS4gDHENY=
github.com/saintfish/chardet v0.0.0-20230101081208-5e3ef4b5456d h1:hrujxIzL1woJ7AwssoOcM/tq5JjjG2yYOc8odClEiXA=
github.com/saintfish/chardet v0.0.0-20230101081208-5e3ef4b5456d/go.mod h1:uugorj2VCxiV1x+LzaIdVa9b4S4qGAcH6cbhh4qVxOU=
github.com/sashabaranov/go-openai v1.38.2 h1:akrssjj+6DY3lWuDwHv6cBvJ8Z+FZDM9XEaaYFt0Auo=
github.com/sashabaranov/go-openai v1.38.2/go.mod h1:lj5b/K+zjTSFxVLijLSTDZuP7adOgerWeFyZLUhAKRg=
github.com/slack-go/slack v0.16.0 h1:khp/WCFv+Hb/B/AJaAwvcxKun0hM6grN0bUZ8xG60P8=
github.com/slack-go/slack v0.16.0/go.mod h1:hlGi5oXA+Gt+yWTPP0plCdRKmjsDxecdHxYQdlMQKOw=
github.com/ssor/bom v0.0.0-20170718123548-6386211fdfcf h1:pvbZ0lM0XWPBqUKqFU8cmavspvIl9nulOYwdy6IFRRo=
github.com/ssor/bom v0.0.0-20170718123548-6386211fdfcf/go.mod h1:RJID2RhlZKId02nZ62WenDCkgHFerpIOmW0iT7GKmXM=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/temoto/robotstxt v1.1.2 h1:W2pOjSJ6SWvldyEuiFXNxz3xZ8aiWX5LbfDiOFd7Fxg=
github.com/temoto/robotstxt v1.1.2/go.mod h1:+1AmkuG3IYkh1kv0d2qEB9Le88ehNO0zwOr3ujewlOo=
github.com/thoj/go-ircevent v0.0.0-20210723090443-73e444401d64 h1:l/T7dYuJEQZOwVOpjIXr1180aM9PZL/d1MnMVIxefX4=
github.com/thoj/go-ircevent v0.0.0-20210723090443-73e444401d64/go.mod h1:Q1NAJOuRdQCqN/VIWdnaaEhV8LpeO2rtlBP7/iDJNII=
github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/gjson v1.18.0 h1:FIDeeyB800efLX89e5a8Y0BNH+LOngJyGrIWxG2FKQY=
github.com/tidwall/gjson v1.18.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
github.com/tidwall/match v1.1.1/go.mod h1:eRSPERbgtNPcGhD8UCthc6PmLEQXEWd3PRB5JTxsfmM=
github.com/tidwall/pretty v1.2.0/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
github.com/tidwall/pretty v1.2.1 h1:qjsOFOWWQl+N3RsoF5/ssm1pHmJJwhjlSbZ51I6wMl4=
github.com/tidwall/pretty v1.2.1/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
github.com/tidwall/sjson v1.2.5 h1:kLy8mja+1c9jlljvWTlSazM7cKDRfJuR/bOJhcY5NcY=
github.com/tidwall/sjson v1.2.5/go.mod h1:Fvgq9kS/6ociJEDnK0Fk1cpYF4FIW6ZF7LAe+6jwd28=
github.com/tmc/langchaingo v0.1.13 h1:rcpMWBIi2y3B90XxfE4Ao8dhCQPVDMaNPnN5cGB1CaA=
github.com/tmc/langchaingo v0.1.13/go.mod h1:vpQ5NOIhpzxDfTZK9B6tf2GM/MoaHewPWM5KXXGh7hg=
github.com/traefik/yaegi v0.16.1 h1:f1De3DVJqIDKmnasUF6MwmWv1dSEEat0wcpXhD2On3E=
github.com/traefik/yaegi v0.16.1/go.mod h1:4eVhbPb3LnD2VigQjhYbEJ69vDRFdT2HQNrXx8eEwUY=
github.com/ugorji/go v1.2.7/go.mod h1:nF9osbDWLy6bDVv/Rtoh6QgnvNDpmCalQV5urGCCS6M=
github.com/ugorji/go/codec v1.2.7 h1:YPXUKf7fYbp/y8xloBqZOw2qaVggbfwMlI8WM3wZUJ0=
github.com/ugorji/go/codec v1.2.7/go.mod h1:WGN1fab3R1fzQlVQTkfxVtIBhWDRqOviHU95kRgeqEY=
github.com/valyala/bytebufferpool v1.0.0 h1:GqA5TC/0021Y/b9FG4Oi9Mr3q7XYx6KllzawFIhcdPw=
github.com/valyala/bytebufferpool v1.0.0/go.mod h1:6bBcMArwyJ5K/AmCkWv1jt77kVWyCJ6HpOuEn7z0Csc=
github.com/valyala/fasthttp v1.60.0 h1:kBRYS0lOhVJ6V+bYN8PqAHELKHtXqwq9zNMLKx1MBsw=
github.com/valyala/fasthttp v1.60.0/go.mod h1:iY4kDgV3Gc6EqhRZ8icqcmlG6bqhcDXfuHgTO4FXCvc=
github.com/wk8/go-ordered-map/v2 v2.1.8 h1:5h/BUHu93oj4gIdvHHHGsScSTMijfx5PeYkE/fJgbpc=
github.com/wk8/go-ordered-map/v2 v2.1.8/go.mod h1:5nJHM5DyteebpVlHnWMV0rPz6Zp7+xBAnxjb1X5vnTw=
github.com/xyproto/randomstring v1.0.5 h1:YtlWPoRdgMu3NZtP45drfy1GKoojuR7hmRcnhZqKjWU=
github.com/xyproto/randomstring v1.0.5/go.mod h1:rgmS5DeNXLivK7YprL0pY+lTuhNQW3iGxZ18UQApw/E=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
go.starlark.net v0.0.0-20230302034142-4b1e35fe2254 h1:Ss6D3hLXTM0KobyBYEAygXzFfGcjnmfEJOBgSbemCtg=
go.starlark.net v0.0.0-20230302034142-4b1e35fe2254/go.mod h1:jxU+3+j+71eXOW14274+SmmuW82qJzl6iZSeqEtTGds=
go.uber.org/automaxprocs v1.6.0 h1:O3y2/QNTOdbF+e/dpXNNW7Rx2hZ4sTIPyybbxyNqTUs=
go.uber.org/automaxprocs v1.6.0/go.mod h1:ifeIMSnPZuznNm6jmdzmU3/bfk01Fe2fotchwEFJ8r8=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210421170649-83a5a9bb288b/go.mod h1:T9bdIzuCu7OtxOm1hfPfRQxPLYneinmdGuTeoZ9dtd4=
golang.org/x/crypto v0.0.0-20210711020723-a769d52b0f97/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.37.0 h1:kJNSjF/Xp7kU0iB2Z+9viTPMW4EqqsrywMXLJOOsXSE=
golang.org/x/crypto v0.37.0/go.mod h1:vg+k43peMZ0pUMhYmVAWysMK35e6ioLh3wB8ZCAfbVc=
golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
golang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=
golang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190213061140-3a22650c66bd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
golang.org/x/net v0.0.0-20210614182718-04defd469f4e/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20210916014120-12bc252f5db8/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.5.0/go.mod h1:DivGGAXEgPSlEBzxGzZI+ZLohi+xUj054jfeKui00ws=
golang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.7.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.9.0/go.mod h1:d48xBJpPfHeWQsugry2m+kC02ZBRGRgulfHnEXEuWns=
golang.org/x/net v0.38.0 h1:vRMAPTMaeGqVhG5QyLJHqNDwecKTomGeqbnfZyKlBI8=
golang.org/x/net v0.38.0/go.mod h1:ivrbrMbzFq5J41QOQh0siUuly180yBYtLp+CKbEaFx8=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20210806184541-e5e7981a1069/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.4.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.7.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.32.0 h1:s77OFDvIQeibCmezSnk/q6iAfkdiQaJi4VzroCFrN20=
golang.org/x/sys v0.32.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.0.0-20220526004731-065cf7ba2467/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.4.0/go.mod h1:9P2UbLfCdcvo3p/nzKvsmas4TnlujnuoV9hGgYzW1lQ=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
golang.org/x/term v0.7.0/go.mod h1:P32HKFT3hSsZrRxla30E9HqToFYAQPCMs/zFMBUFqPY=
golang.org/x/term v0.31.0 h1:erwDkOK1Msy6offm1mOgvspSkslFnIGsFnxOKoufg3o=
golang.org/x/term v0.31.0/go.mod h1:R4BeIy7D95HzImkxGkTW1UQTtP54tio2RyHz7PwK0aw=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.3.8/go.mod h1:E6s5w1FMmriuDzIBO73fBruAKo1PCIq6d2Q6DHfQ8WQ=
golang.org/x/text v0.6.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=
golang.org/x/text v0.24.0 h1:dd5Bzh4yt5KYA8f9CJHCP4FB4D51c2c6JvN37xJJkJ0=
golang.org/x/text v0.24.0/go.mod h1:L8rBsPeo2pSS+xqN0d5u2ikmjtmoJbDBT1b7nHvFCdU=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=
golang.org/x/tools v0.0.0-20190311212946-11955173bddd/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=
golang.org/x/tools v0.0.0-20190524140312-2c0ae7006135/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
golang.org/x/tools v0.31.0 h1:0EedkvKDbh+qistFTd0Bcwe/YLh4vHwWEkiI0toFIBU=
golang.org/x/tools v0.31.0/go.mod h1:naFTU+Cev749tSJRXJlna0T3WxKvb1kWEx15xA4SdmQ=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
google.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=
google.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=
google.golang.org/appengine v1.6.8 h1:IhEN5q69dyKagZPYMSdIjS2HqprW324FRQZJcGqPAsM=
google.golang.org/appengine v1.6.8/go.mod h1:1jJ3jBArFh5pcgW8gCtRJnepW8FzD1V44FJffLiz/Ds=
google.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=
google.golang.org/genproto v0.0.0-20190819201941-24fa4b261c55/go.mod h1:DMBHOl98Agz4BDEuKkezgsaosCRResVns1a3J2ZsMNc=
google.golang.org/genproto v0.0.0-20200526211855-cb27e3aa2013/go.mod h1:NbSheEEYHJ7i3ixzK3sjbqSGDJWnxyFXZblF3eUsNvo=
google.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=
google.golang.org/grpc v1.23.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=
google.golang.org/grpc v1.27.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=
google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
google.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=
google.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=
google.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=
google.golang.org/protobuf v1.22.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
google.golang.org/protobuf v1.23.1-0.20200526195155-81db48ad09cc/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
google.golang.org/protobuf v1.25.0/go.mod h1:9JNX74DMeImyA3h4bdi1ymwjUzf21/xIlbajtzgsN7c=
google.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=
google.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=
google.golang.org/protobuf v1.36.5 h1:tPhr+woSbjfYvY6/GPufUoYizxw1cF/yFoxJ2fmpwlM=
google.golang.org/protobuf v1.36.5/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.0-20210107192922-496545a6307b/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
honnef.co/go/tools v0.0.0-20190102054323-c2f93a96b099/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
honnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=
jaytaylor.com/html2text v0.0.0-20230321000545-74c2419ad056 h1:6YFJoB+0fUH6X3xU/G2tQqCYg+PkGtnZ5nMR5rpw72g=
jaytaylor.com/html2text v0.0.0-20230321000545-74c2419ad056/go.mod h1:OxvTsCwKosqQ1q7B+8FwXqg4rKZ/UG9dUW+g/VL2xH4=
mvdan.cc/xurls/v2 v2.6.0 h1:3NTZpeTxYVWNSokW3MKeyVkz/j7uYXYiMtXRUfmjbgI=
mvdan.cc/xurls/v2 v2.6.0/go.mod h1:bCvEZ1XvdA6wDnxY7jPPjEmigDtvtvPXAD/Exa9IMSk=
sigs.k8s.io/yaml v1.3.0 h1:a2VclLzOGrwOHDiV8EfBGhvjHvP46CtW5j6POvhYGGo=
sigs.k8s.io/yaml v1.3.0/go.mod h1:GeOyir5tyXNByN85N/dRIT9es5UQNerPYEKK56eTBm8=

jsconfig.json (new file, 15 lines)

@@ -0,0 +1,15 @@
{
"compilerOptions": {
"module": "ESNext",
"moduleResolution": "Bundler",
"target": "ES2022",
"jsx": "react",
"allowImportingTsExtensions": true,
"strictNullChecks": true,
"strictFunctionTypes": true
},
"exclude": [
"node_modules",
"**/node_modules/*"
]
}

main.go (new file, 97 lines)

@@ -0,0 +1,97 @@
package main
import (
"log"
"os"
"path/filepath"
"strings"
"github.com/mudler/LocalAGI/core/state"
"github.com/mudler/LocalAGI/services"
"github.com/mudler/LocalAGI/webui"
)
var baseModel = os.Getenv("LOCALAGI_MODEL")
var multimodalModel = os.Getenv("LOCALAGI_MULTIMODAL_MODEL")
var apiURL = os.Getenv("LOCALAGI_LLM_API_URL")
var apiKey = os.Getenv("LOCALAGI_LLM_API_KEY")
var timeout = os.Getenv("LOCALAGI_TIMEOUT")
var stateDir = os.Getenv("LOCALAGI_STATE_DIR")
var localRAG = os.Getenv("LOCALAGI_LOCALRAG_URL")
var withLogs = os.Getenv("LOCALAGI_ENABLE_CONVERSATIONS_LOGGING") == "true"
var apiKeysEnv = os.Getenv("LOCALAGI_API_KEYS")
var imageModel = os.Getenv("LOCALAGI_IMAGE_MODEL")
var conversationDuration = os.Getenv("LOCALAGI_CONVERSATION_DURATION")
var localOperatorBaseURL = os.Getenv("LOCALOPERATOR_BASE_URL")
var mcpboxURL = os.Getenv("LOCALAGI_MCPBOX_URL")
func init() {
if baseModel == "" {
panic("LOCALAGI_MODEL not set")
}
if apiURL == "" {
panic("LOCALAGI_LLM_API_URL not set")
}
if timeout == "" {
timeout = "5m"
}
if stateDir == "" {
cwd, err := os.Getwd()
if err != nil {
panic(err)
}
stateDir = filepath.Join(cwd, "pool")
}
}
func main() {
// make sure state dir exists
if err := os.MkdirAll(stateDir, 0755); err != nil {
panic(err)
}
apiKeys := []string{}
if apiKeysEnv != "" {
apiKeys = strings.Split(apiKeysEnv, ",")
}
// Create the agent pool
pool, err := state.NewAgentPool(
baseModel,
multimodalModel,
imageModel,
apiURL,
apiKey,
stateDir,
mcpboxURL,
localRAG,
services.Actions(map[string]string{
"browser-agent-runner-base-url": localOperatorBaseURL,
}),
services.Connectors,
services.DynamicPrompts,
timeout,
withLogs,
)
if err != nil {
panic(err)
}
// Create the application
app := webui.NewApp(
webui.WithPool(pool),
webui.WithConversationStoreduration(conversationDuration),
webui.WithApiKeys(apiKeys...),
webui.WithLLMAPIUrl(apiURL),
webui.WithLLMAPIKey(apiKey),
webui.WithLLMModel(baseModel),
webui.WithStateDir(stateDir),
)
// Start the agents
if err := pool.StartAll(); err != nil {
panic(err)
}
// Start the web server
log.Fatal(app.Listen(":3000"))
}

main.py (deleted, 436 lines)

@@ -1,436 +0,0 @@
import openai
#from langchain.embeddings import HuggingFaceEmbeddings
from langchain.embeddings import LocalAIEmbeddings
import uuid
import sys
from localagi import LocalAGI
from loguru import logger
from ascii_magic import AsciiArt
from duckduckgo_search import DDGS
from typing import Dict, List
import os
# these three lines swap the stdlib sqlite3 lib with the pysqlite3 package for chroma
__import__('pysqlite3')
import sys
sys.modules['sqlite3'] = sys.modules.pop('pysqlite3')
from langchain.vectorstores import Chroma
from chromadb.config import Settings
import json
import os
from io import StringIO
# Parse arguments such as system prompt and batch mode
import argparse
parser = argparse.ArgumentParser(description='LocalAGI')
# System prompt
parser.add_argument('--system-prompt', dest='system_prompt', action='store',
help='System prompt to use')
# Batch mode
parser.add_argument('--prompt', dest='prompt', action='store', default=False,
help='Prompt mode')
# Interactive mode
parser.add_argument('--interactive', dest='interactive', action='store_true', default=False,
help='Interactive mode. Can be used with --prompt to start an interactive session')
# skip avatar creation
parser.add_argument('--skip-avatar', dest='skip_avatar', action='store_true', default=False,
help='Skip avatar creation')
# Reevaluate
parser.add_argument('--re-evaluate', dest='re_evaluate', action='store_true', default=False,
help='Reevaluate if another action is needed or we have completed the user request')
# Postprocess
parser.add_argument('--postprocess', dest='postprocess', action='store_true', default=False,
help='Postprocess the reasoning')
# Subtask context
parser.add_argument('--subtask-context', dest='subtaskContext', action='store_true', default=False,
help='Include context in subtasks')
# Search results number
parser.add_argument('--search-results', dest='search_results', type=int, action='store', default=2,
help='Number of search results to return')
# Plan message
parser.add_argument('--plan-message', dest='plan_message', action='store',
help="What message to use during planning",
)
DEFAULT_PROMPT="floating hair, portrait, ((loli)), ((one girl)), cute face, hidden hands, asymmetrical bangs, beautiful detailed eyes, eye shadow, hair ornament, ribbons, bowties, buttons, pleated skirt, (((masterpiece))), ((best quality)), colorful|((part of the head)), ((((mutated hands and fingers)))), deformed, blurry, bad anatomy, disfigured, poorly drawn face, mutation, mutated, extra limb, ugly, poorly drawn hands, missing limb, blurry, floating limbs, disconnected limbs, malformed hands, blur, out of focus, long neck, long body, Octane renderer, lowres, bad anatomy, bad hands, text"
DEFAULT_API_BASE = os.environ.get("DEFAULT_API_BASE", "http://api:8080")
# TTS api base
parser.add_argument('--tts-api-base', dest='tts_api_base', action='store', default=DEFAULT_API_BASE,
help='TTS api base')
# LocalAI api base
parser.add_argument('--localai-api-base', dest='localai_api_base', action='store', default=DEFAULT_API_BASE,
help='LocalAI api base')
# Images api base
parser.add_argument('--images-api-base', dest='images_api_base', action='store', default=DEFAULT_API_BASE,
help='Images api base')
# Embeddings api base
parser.add_argument('--embeddings-api-base', dest='embeddings_api_base', action='store', default=DEFAULT_API_BASE,
help='Embeddings api base')
# Functions model
parser.add_argument('--functions-model', dest='functions_model', action='store', default="functions",
help='Functions model')
# Embeddings model
parser.add_argument('--embeddings-model', dest='embeddings_model', action='store', default="all-MiniLM-L6-v2",
help='Embeddings model')
# LLM model
parser.add_argument('--llm-model', dest='llm_model', action='store', default="gpt-4",
help='LLM model')
# Voice model
parser.add_argument('--tts-model', dest='tts_model', action='store', default="en-us-kathleen-low.onnx",
help='TTS model')
# Stable diffusion model
parser.add_argument('--stablediffusion-model', dest='stablediffusion_model', action='store', default="stablediffusion",
help='Stable diffusion model')
# Stable diffusion prompt
parser.add_argument('--stablediffusion-prompt', dest='stablediffusion_prompt', action='store', default=DEFAULT_PROMPT,
help='Stable diffusion prompt')
# Force action
parser.add_argument('--force-action', dest='force_action', action='store', default="",
help='Force an action')
# Debug mode
parser.add_argument('--debug', dest='debug', action='store_true', default=False,
help='Debug mode')
# Critic mode
parser.add_argument('--critic', dest='critic', action='store_true', default=False,
help='Enable critic')
# Parse arguments
args = parser.parse_args()
STABLEDIFFUSION_MODEL = os.environ.get("STABLEDIFFUSION_MODEL", args.stablediffusion_model)
STABLEDIFFUSION_PROMPT = os.environ.get("STABLEDIFFUSION_PROMPT", args.stablediffusion_prompt)
FUNCTIONS_MODEL = os.environ.get("FUNCTIONS_MODEL", args.functions_model)
EMBEDDINGS_MODEL = os.environ.get("EMBEDDINGS_MODEL", args.embeddings_model)
LLM_MODEL = os.environ.get("LLM_MODEL", args.llm_model)
VOICE_MODEL= os.environ.get("TTS_MODEL",args.tts_model)
PERSISTENT_DIR = os.environ.get("PERSISTENT_DIR", "/data")
SYSTEM_PROMPT = ""
if os.environ.get("SYSTEM_PROMPT") or args.system_prompt:
SYSTEM_PROMPT = os.environ.get("SYSTEM_PROMPT", args.system_prompt)
LOCALAI_API_BASE = args.localai_api_base
TTS_API_BASE = args.tts_api_base
IMAGE_API_BASE = args.images_api_base
EMBEDDINGS_API_BASE = args.embeddings_api_base
# Set log level
LOG_LEVEL = "INFO"
def my_filter(record):
return record["level"].no >= logger.level(LOG_LEVEL).no
logger.remove()
logger.add(sys.stderr, filter=my_filter)
if args.debug:
LOG_LEVEL = "DEBUG"
logger.debug("Debug mode on")
## Constants
REPLY_ACTION = "reply"
PLAN_ACTION = "plan"
embeddings = LocalAIEmbeddings(model=EMBEDDINGS_MODEL,openai_api_base=EMBEDDINGS_API_BASE)
chroma_client = Chroma(collection_name="memories", persist_directory="db", embedding_function=embeddings)
# Function to create images with LocalAI
def display_avatar(agi, input_text=STABLEDIFFUSION_PROMPT, model=STABLEDIFFUSION_MODEL):
image_url = agi.get_avatar(input_text, model)
# convert the image to ascii art
my_art = AsciiArt.from_url(image_url)
my_art.to_terminal()
## This function is called to ask the user if does agree on the action to take and execute
def ask_user_confirmation(action_name, action_parameters):
logger.info("==> Ask user confirmation")
logger.info("==> action_name: {action_name}", action_name=action_name)
logger.info("==> action_parameters: {action_parameters}", action_parameters=action_parameters)
# Ask via stdin
logger.info("==> Do you want to execute the action? (y/n)")
user_input = input()
if user_input == "y":
logger.info("==> Executing action")
return True
else:
logger.info("==> Skipping action")
return False
### Agent capabilities
### These functions are called by the agent to perform actions
###
def save(memory, agent_actions={}, localagi=None):
q = json.loads(memory)
logger.info(">>> saving to memories: ")
logger.info(q["content"])
chroma_client.add_texts([q["content"]],[{"id": str(uuid.uuid4())}])
chroma_client.persist()
return f"The object was saved permanently to memory."
def search_memory(query, agent_actions={}, localagi=None):
q = json.loads(query)
docs = chroma_client.similarity_search(q["reasoning"])
text_res="Memories found in the database:\n"
for doc in docs:
text_res+="- "+doc.page_content+"\n"
#if args.postprocess:
# return post_process(text_res)
#return text_res
return localagi.post_process(text_res)
# write file to disk with content
def save_file(arg, agent_actions={}, localagi=None):
arg = json.loads(arg)
filename = arg["filename"]
content = arg["content"]
# create persistent dir if does not exist
if not os.path.exists(PERSISTENT_DIR):
os.makedirs(PERSISTENT_DIR)
# write the file in the directory specified
filename = os.path.join(PERSISTENT_DIR, filename)
with open(filename, 'w') as f:
f.write(content)
return f"File {filename} saved successfully."
def ddg(query: str, num_results: int, backend: str = "api") -> List[Dict[str, str]]:
"""Run query through DuckDuckGo and return metadata.
Args:
query: The query to search for.
num_results: The number of results to return.
Returns:
A list of dictionaries with the following keys:
snippet - The description of the result.
title - The title of the result.
link - The link to the result.
"""
with DDGS() as ddgs:
results = ddgs.text(
query,
backend=backend,
)
if results is None:
return [{"Result": "No good DuckDuckGo Search Result was found"}]
def to_metadata(result: Dict) -> Dict[str, str]:
if backend == "news":
return {
"date": result["date"],
"title": result["title"],
"snippet": result["body"],
"source": result["source"],
"link": result["url"],
}
return {
"snippet": result["body"],
"title": result["title"],
"link": result["href"],
}
formatted_results = []
for i, res in enumerate(results, 1):
if res is not None:
formatted_results.append(to_metadata(res))
if len(formatted_results) == num_results:
break
return formatted_results
## Search on duckduckgo
def search_duckduckgo(a, agent_actions={}, localagi=None):
a = json.loads(a)
list=ddg(a["query"], args.search_results)
text_res=""
for doc in list:
text_res+=f"""{doc["link"]}: {doc["title"]} {doc["snippet"]}\n"""
#if args.postprocess:
# return post_process(text_res)
return text_res
#l = json.dumps(list)
#return l
### End Agent capabilities
###
### Agent action definitions
agent_actions = {
"search_internet": {
"function": search_duckduckgo,
"plannable": True,
"description": 'For searching the internet with a query, the assistant replies with the action "search_internet" and the query to search.',
"signature": {
"name": "search_internet",
"description": """For searching internet.""",
"parameters": {
"type": "object",
"properties": {
"query": {
"type": "string",
"description": "information to save"
},
},
}
},
},
"save_file": {
"function": save_file,
"plannable": True,
"description": 'The assistant replies with the action "save_file", the filename and content to save for writing a file to disk permanently. This can be used to store the result of complex actions locally.',
"signature": {
"name": "save_file",
"description": """For saving a file to disk with content.""",
"parameters": {
"type": "object",
"properties": {
"filename": {
"type": "string",
"description": "information to save"
},
"content": {
"type": "string",
"description": "information to save"
},
},
}
},
},
"save_memory": {
"function": save,
"plannable": True,
"description": 'The assistant replies with the action "save_memory" and the string to remember or store an information that thinks it is relevant permanently.',
"signature": {
"name": "save_memory",
"description": """Save or store informations into memory.""",
"parameters": {
"type": "object",
"properties": {
"content": {
"type": "string",
"description": "information to save"
},
},
"required": ["content"]
}
},
},
"search_memory": {
"function": search_memory,
"plannable": True,
"description": 'The assistant replies with the action "search_memory" for searching between its memories with a query term.',
"signature": {
"name": "search_memory",
"description": """Search in memory""",
"parameters": {
"type": "object",
"properties": {
"reasoning": {
"type": "string",
"description": "reasoning behind the intent"
},
},
"required": ["reasoning"]
}
},
},
}
if __name__ == "__main__":
conversation_history = []
# Create a LocalAGI instance
logger.info("Creating LocalAGI instance")
localagi = LocalAGI(
agent_actions=agent_actions,
embeddings_model=EMBEDDINGS_MODEL,
embeddings_api_base=EMBEDDINGS_API_BASE,
llm_model=LLM_MODEL,
tts_model=VOICE_MODEL,
tts_api_base=TTS_API_BASE,
functions_model=FUNCTIONS_MODEL,
api_base=LOCALAI_API_BASE,
stablediffusion_api_base=IMAGE_API_BASE,
stablediffusion_model=STABLEDIFFUSION_MODEL,
force_action=args.force_action,
plan_message=args.plan_message,
)
# Set a system prompt if SYSTEM_PROMPT is set
if SYSTEM_PROMPT != "":
conversation_history.append({
"role": "system",
"content": SYSTEM_PROMPT
})
logger.info("Welcome to LocalAGI")
# Skip avatar creation if --skip-avatar is set
if not args.skip_avatar:
logger.info("Creating avatar, please wait...")
display_avatar(localagi)
actions = ""
for action in agent_actions:
actions+=" '"+action+"'"
logger.info("LocalAGI internally can do the following actions:{actions}", actions=actions)
if not args.prompt:
logger.info(">>> Interactive mode <<<")
else:
logger.info(">>> Prompt mode <<<")
logger.info(args.prompt)
# IF in prompt mode just evaluate, otherwise loop
if args.prompt:
conversation_history=localagi.evaluate(
args.prompt,
conversation_history,
critic=args.critic,
re_evaluate=args.re_evaluate,
# Enable to lower context usage but increases LLM calls
postprocess=args.postprocess,
subtaskContext=args.subtaskContext,
)
localagi.tts_play(conversation_history[-1]["content"])
if not args.prompt or args.interactive:
# TODO: process functions also considering the conversation history? conversation history + input
logger.info(">>> Ready! What can I do for you? ( try with: plan a roadtrip to San Francisco ) <<<")
while True:
user_input = input(">>> ")
# we are going to use the args to change the evaluation behavior
conversation_history=localagi.evaluate(
user_input,
conversation_history,
critic=args.critic,
re_evaluate=args.re_evaluate,
# Enable to lower context usage but increases LLM calls
postprocess=args.postprocess,
subtaskContext=args.subtaskContext,
)
localagi.tts_play(conversation_history[-1]["content"])

pkg/client/agents.go (new file, 172 lines)

@@ -0,0 +1,172 @@
package localagi
import (
"encoding/json"
"fmt"
"net/http"
)
// AgentConfig represents the configuration for an agent
type AgentConfig struct {
Name string `json:"name"`
Actions []string `json:"actions,omitempty"`
Connectors []string `json:"connectors,omitempty"`
PromptBlocks []string `json:"prompt_blocks,omitempty"`
InitialPrompt string `json:"initial_prompt,omitempty"`
Parallel bool `json:"parallel,omitempty"`
Config map[string]interface{} `json:"config,omitempty"`
}
// AgentStatus represents the status of an agent
type AgentStatus struct {
Status string `json:"status"`
}
// ListAgents returns a list of all agents
func (c *Client) ListAgents() ([]string, error) {
resp, err := c.doRequest(http.MethodGet, "/agents", nil)
if err != nil {
return nil, err
}
defer resp.Body.Close()
// The response is HTML, so we'll need to parse it properly
// For now, we'll just return a placeholder implementation
return []string{}, fmt.Errorf("ListAgents not implemented")
}
// GetAgentConfig retrieves the configuration for a specific agent
func (c *Client) GetAgentConfig(name string) (*AgentConfig, error) {
path := fmt.Sprintf("/api/agent/%s/config", name)
resp, err := c.doRequest(http.MethodGet, path, nil)
if err != nil {
return nil, err
}
defer resp.Body.Close()
var config AgentConfig
if err := json.NewDecoder(resp.Body).Decode(&config); err != nil {
return nil, fmt.Errorf("error decoding response: %w", err)
}
return &config, nil
}
// CreateAgent creates a new agent with the given configuration
func (c *Client) CreateAgent(config *AgentConfig) error {
resp, err := c.doRequest(http.MethodPost, "/api/agent/create", config)
if err != nil {
return err
}
defer resp.Body.Close()
var response map[string]string
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return fmt.Errorf("error decoding response: %w", err)
}
if status, ok := response["status"]; ok && status == "ok" {
return nil
}
return fmt.Errorf("failed to create agent: %v", response)
}
// UpdateAgentConfig updates the configuration for an existing agent
func (c *Client) UpdateAgentConfig(name string, config *AgentConfig) error {
// Ensure the name in the URL matches the name in the config
config.Name = name
path := fmt.Sprintf("/api/agent/%s/config", name)
resp, err := c.doRequest(http.MethodPut, path, config)
if err != nil {
return err
}
defer resp.Body.Close()
var response map[string]string
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return fmt.Errorf("error decoding response: %w", err)
}
if status, ok := response["status"]; ok && status == "ok" {
return nil
}
return fmt.Errorf("failed to update agent: %v", response)
}
// DeleteAgent removes an agent
func (c *Client) DeleteAgent(name string) error {
path := fmt.Sprintf("/api/agent/%s", name)
resp, err := c.doRequest(http.MethodDelete, path, nil)
if err != nil {
return err
}
defer resp.Body.Close()
var response map[string]string
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return fmt.Errorf("error decoding response: %w", err)
}
if status, ok := response["status"]; ok && status == "ok" {
return nil
}
return fmt.Errorf("failed to delete agent: %v", response)
}
// PauseAgent pauses an agent
func (c *Client) PauseAgent(name string) error {
path := fmt.Sprintf("/api/agent/pause/%s", name)
resp, err := c.doRequest(http.MethodPut, path, nil)
if err != nil {
return err
}
defer resp.Body.Close()
var response map[string]string
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return fmt.Errorf("error decoding response: %w", err)
}
if status, ok := response["status"]; ok && status == "ok" {
return nil
}
return fmt.Errorf("failed to pause agent: %v", response)
}
// StartAgent starts a paused agent
func (c *Client) StartAgent(name string) error {
path := fmt.Sprintf("/api/agent/start/%s", name)
resp, err := c.doRequest(http.MethodPut, path, nil)
if err != nil {
return err
}
defer resp.Body.Close()
var response map[string]string
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return fmt.Errorf("error decoding response: %w", err)
}
if status, ok := response["status"]; ok && status == "ok" {
return nil
}
return fmt.Errorf("failed to start agent: %v", response)
}
// ExportAgent exports an agent configuration
func (c *Client) ExportAgent(name string) (*AgentConfig, error) {
path := fmt.Sprintf("/settings/export/%s", name)
resp, err := c.doRequest(http.MethodGet, path, nil)
if err != nil {
return nil, err
}
defer resp.Body.Close()
var config AgentConfig
if err := json.NewDecoder(resp.Body).Decode(&config); err != nil {
return nil, fmt.Errorf("error decoding response: %w", err)
}
return &config, nil
}

pkg/client/chat.go (new file, 65 lines)

@@ -0,0 +1,65 @@
package localagi
import (
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

// Message represents a chat message
type Message struct {
	Message string `json:"message"`
}

// ChatResponse represents a response from the agent
type ChatResponse struct {
	Response string `json:"response"`
}

// SendMessage sends a message to an agent
func (c *Client) SendMessage(agentName, message string) error {
	path := fmt.Sprintf("/chat/%s", agentName)
	msg := Message{
		Message: message,
	}
	resp, err := c.doRequest(http.MethodPost, path, msg)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	// The response is HTML, so it's not easily parseable in this context
	return nil
}

// Notify sends a notification to an agent
func (c *Client) Notify(agentName, message string) error {
	path := fmt.Sprintf("/notify/%s", agentName)
	// URL-encode the form data so spaces and '&' in the message survive
	form := strings.NewReader(url.Values{"message": {message}}.Encode())
	req, err := http.NewRequest(http.MethodGet, c.BaseURL+path, form)
	if err != nil {
		return fmt.Errorf("error creating request: %w", err)
	}
	if c.APIKey != "" {
		req.Header.Set("Authorization", "Bearer "+c.APIKey)
	}
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
	resp, err := c.HTTPClient.Do(req)
	if err != nil {
		return fmt.Errorf("error making request: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("api error (status %d)", resp.StatusCode)
	}
	return nil
}

pkg/client/client.go (new file, 76 lines)

@@ -0,0 +1,76 @@
package localagi
import (
"bytes"
"encoding/json"
"fmt"
"io"
"net/http"
"time"
)
// Client represents a client for the LocalAGI API
type Client struct {
BaseURL string
APIKey string
HTTPClient *http.Client
}
// NewClient creates a new LocalAGI client
func NewClient(baseURL string, apiKey string, timeout time.Duration) *Client {
if timeout == 0 {
timeout = time.Second * 30
}
return &Client{
BaseURL: baseURL,
APIKey: apiKey,
HTTPClient: &http.Client{
Timeout: timeout,
},
}
}
// SetTimeout sets the HTTP client timeout
func (c *Client) SetTimeout(timeout time.Duration) {
c.HTTPClient.Timeout = timeout
}
// doRequest performs an HTTP request and returns the response
func (c *Client) doRequest(method, path string, body interface{}) (*http.Response, error) {
var reqBody io.Reader
if body != nil {
jsonData, err := json.Marshal(body)
if err != nil {
return nil, fmt.Errorf("error marshaling request body: %w", err)
}
reqBody = bytes.NewBuffer(jsonData)
}
url := fmt.Sprintf("%s%s", c.BaseURL, path)
req, err := http.NewRequest(method, url, reqBody)
if err != nil {
return nil, fmt.Errorf("error creating request: %w", err)
}
if c.APIKey != "" {
req.Header.Set("Authorization", "Bearer "+c.APIKey)
}
if body != nil {
req.Header.Set("Content-Type", "application/json")
}
resp, err := c.HTTPClient.Do(req)
if err != nil {
return nil, fmt.Errorf("error making request: %w", err)
}
if resp.StatusCode >= 400 {
// Read the error response
defer resp.Body.Close()
errorData, _ := io.ReadAll(resp.Body)
return resp, fmt.Errorf("api error (status %d): %s", resp.StatusCode, string(errorData))
}
return resp, nil
}

pkg/client/responses.go (new file, 127 lines)

@@ -0,0 +1,127 @@
package localagi
import (
"encoding/json"
"fmt"
"net/http"
)
// RequestBody represents the message request to the AI model
type RequestBody struct {
Model string `json:"model"`
Input any `json:"input"`
Temperature *float64 `json:"temperature,omitempty"`
MaxTokens *int `json:"max_output_tokens,omitempty"`
}
// InputMessage represents a user input message
type InputMessage struct {
Role string `json:"role"`
Content any `json:"content"`
}
// ContentItem represents an item in a content array
type ContentItem struct {
Type string `json:"type"`
Text string `json:"text,omitempty"`
ImageURL string `json:"image_url,omitempty"`
}
// ResponseBody represents the response from the AI model
type ResponseBody struct {
CreatedAt int64 `json:"created_at"`
Status string `json:"status"`
Error any `json:"error,omitempty"`
Output []ResponseMessage `json:"output"`
}
// ResponseMessage represents a message in the response
type ResponseMessage struct {
Type string `json:"type"`
Status string `json:"status"`
Role string `json:"role"`
Content []MessageContentItem `json:"content"`
}
// MessageContentItem represents a content item in a message
type MessageContentItem struct {
Type string `json:"type"`
Text string `json:"text"`
}
// GetAIResponse sends a request to the AI model and returns the response
func (c *Client) GetAIResponse(request *RequestBody) (*ResponseBody, error) {
resp, err := c.doRequest(http.MethodPost, "/v1/responses", request)
if err != nil {
return nil, err
}
defer resp.Body.Close()
var response ResponseBody
if err := json.NewDecoder(resp.Body).Decode(&response); err != nil {
return nil, fmt.Errorf("error decoding response: %w", err)
}
// Check if there was an error in the response
if response.Error != nil {
return nil, fmt.Errorf("api error: %v", response.Error)
}
return &response, nil
}
// SimpleAIResponse is a helper function to get a simple text response from the AI
func (c *Client) SimpleAIResponse(agentName, input string) (string, error) {
temperature := 0.7
request := &RequestBody{
Model: agentName,
Input: input,
Temperature: &temperature,
}
response, err := c.GetAIResponse(request)
if err != nil {
return "", err
}
// Extract the text response from the output
for _, msg := range response.Output {
if msg.Role == "assistant" {
for _, content := range msg.Content {
if content.Type == "output_text" {
return content.Text, nil
}
}
}
}
return "", fmt.Errorf("no text response found")
}
// ChatAIResponse sends chat messages to the AI model
func (c *Client) ChatAIResponse(agentName string, messages []InputMessage) (string, error) {
temperature := 0.7
request := &RequestBody{
Model: agentName,
Input: messages,
Temperature: &temperature,
}
response, err := c.GetAIResponse(request)
if err != nil {
return "", err
}
// Extract the text response from the output
for _, msg := range response.Output {
if msg.Role == "assistant" {
for _, content := range msg.Content {
if content.Type == "output_text" {
return content.Text, nil
}
}
}
}
return "", fmt.Errorf("no text response found")
}

42
pkg/config/meta.go Normal file
View File

@@ -0,0 +1,42 @@
package config
type FieldType string
const (
FieldTypeNumber FieldType = "number"
FieldTypeText FieldType = "text"
FieldTypeTextarea FieldType = "textarea"
FieldTypeCheckbox FieldType = "checkbox"
FieldTypeSelect FieldType = "select"
)
type Tags struct {
Section string `json:"section,omitempty"`
}
type FieldOption struct {
Value string `json:"value"`
Label string `json:"label"`
}
type Field struct {
Name string `json:"name"`
Type FieldType `json:"type"`
Label string `json:"label"`
DefaultValue any `json:"defaultValue"`
Placeholder string `json:"placeholder,omitempty"`
HelpText string `json:"helpText,omitempty"`
Required bool `json:"required,omitempty"`
Disabled bool `json:"disabled,omitempty"`
Options []FieldOption `json:"options,omitempty"`
Min float32 `json:"min,omitempty"`
Max float32 `json:"max,omitempty"`
Step float32 `json:"step,omitempty"`
Tags Tags `json:"tags,omitempty"`
}
type FieldGroup struct {
Name string `json:"name"`
Label string `json:"label"`
Fields []Field `json:"fields"`
}
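A hedged sketch of how these field descriptors might serialize for a UI: only a subset of `Field` is reproduced locally so the example is self-contained, and the field names used are illustrative. Note that `omitempty` keeps unset booleans and empty option lists out of the payload.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Minimal local mirror of the config field types above (illustrative subset).
type FieldOption struct {
	Value string `json:"value"`
	Label string `json:"label"`
}

type Field struct {
	Name     string        `json:"name"`
	Type     string        `json:"type"`
	Label    string        `json:"label"`
	Options  []FieldOption `json:"options,omitempty"`
	Required bool          `json:"required,omitempty"`
}

// renderField marshals a field descriptor as the UI would receive it.
func renderField(f Field) (string, error) {
	b, err := json.Marshal(f)
	return string(b), err
}

func main() {
	f := Field{
		Name:    "model",
		Type:    "select",
		Label:   "Model",
		Options: []FieldOption{{Value: "gpt-4", Label: "GPT-4"}},
	}
	s, _ := renderField(f)
	fmt.Println(s)
}
```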

112
pkg/deepface/client.go Normal file
View File

@@ -0,0 +1,112 @@
// Package deepface provides a simple Go client for
// https://github.com/serengil/deepface
package deepface
import (
"bytes"
"encoding/base64"
"encoding/json"
"fmt"
"io"
"net/http"
"os"
)
type DeepFaceClient struct {
BaseURL string
}
func NewClient(baseURL string) *DeepFaceClient {
return &DeepFaceClient{BaseURL: baseURL}
}
func encodeImageToBase64(imgPath string) (string, error) {
file, err := os.Open(imgPath)
if err != nil {
return "", err
}
defer file.Close()
buf := new(bytes.Buffer)
if _, err := io.Copy(buf, file); err != nil {
return "", err
}
return base64.StdEncoding.EncodeToString(buf.Bytes()), nil
}
func (c *DeepFaceClient) Represent(modelName, imgPath string) error {
imgBase64, err := encodeImageToBase64(imgPath)
if err != nil {
return err
}
data := map[string]string{
"model_name": modelName,
"img": imgBase64,
}
jsonData, _ := json.Marshal(data)
resp, err := http.Post(c.BaseURL+"/represent", "application/json", bytes.NewBuffer(jsonData))
if err != nil {
return err
}
defer resp.Body.Close()
body, _ := io.ReadAll(resp.Body)
fmt.Println("Response:", string(body))
return nil
}
func (c *DeepFaceClient) Verify(img1Path, img2Path, modelName, detector, metric string) error {
img1Base64, err := encodeImageToBase64(img1Path)
if err != nil {
return err
}
img2Base64, err := encodeImageToBase64(img2Path)
if err != nil {
return err
}
data := map[string]string{
"img1": img1Base64,
"img2": img2Base64,
"model_name": modelName,
"detector_backend": detector,
"distance_metric": metric,
}
jsonData, _ := json.Marshal(data)
resp, err := http.Post(c.BaseURL+"/verify", "application/json", bytes.NewBuffer(jsonData))
if err != nil {
return err
}
defer resp.Body.Close()
body, _ := io.ReadAll(resp.Body)
fmt.Println("Response:", string(body))
return nil
}
func (c *DeepFaceClient) Analyze(imgPath string, actions []string) error {
imgBase64, err := encodeImageToBase64(imgPath)
if err != nil {
return err
}
data := map[string]interface{}{
"img": imgBase64,
"actions": actions,
}
jsonData, _ := json.Marshal(data)
resp, err := http.Post(c.BaseURL+"/analyze", "application/json", bytes.NewBuffer(jsonData))
if err != nil {
return err
}
defer resp.Body.Close()
body, _ := io.ReadAll(resp.Body)
fmt.Println("Response:", string(body))
return nil
}

28
pkg/llm/client.go Normal file
View File

@@ -0,0 +1,28 @@
package llm
import (
"net/http"
"time"
"github.com/sashabaranov/go-openai"
)
func NewClient(APIKey, URL, timeout string) *openai.Client {
// Set up OpenAI client
if APIKey == "" {
// Fall back to a placeholder key for local endpoints that do not enforce auth
APIKey = "sk-xxx"
}
config := openai.DefaultConfig(APIKey)
config.BaseURL = URL
dur, err := time.ParseDuration(timeout)
if err != nil {
dur = 150 * time.Second
}
config.HTTPClient = &http.Client{
Timeout: dur,
}
return openai.NewClientWithConfig(config)
}

57
pkg/llm/json.go Normal file
View File

@@ -0,0 +1,57 @@
package llm
import (
"context"
"encoding/json"
"fmt"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
)
func GenerateTypedJSON(ctx context.Context, client *openai.Client, guidance, model string, i jsonschema.Definition, dst any) error {
toolName := "json"
decision := openai.ChatCompletionRequest{
Model: model,
Messages: []openai.ChatCompletionMessage{
{
Role: "user",
Content: guidance,
},
},
Tools: []openai.Tool{
{
Type: openai.ToolTypeFunction,
Function: &openai.FunctionDefinition{
Name: toolName,
Parameters: i,
},
},
},
ToolChoice: openai.ToolChoice{
Type: openai.ToolTypeFunction,
Function: openai.ToolFunction{Name: toolName},
},
}
resp, err := client.CreateChatCompletion(ctx, decision)
if err != nil {
return err
}
if len(resp.Choices) != 1 {
return fmt.Errorf("unexpected number of choices: %d", len(resp.Choices))
}
msg := resp.Choices[0].Message
if len(msg.ToolCalls) == 0 {
return fmt.Errorf("no tool calls: %d", len(msg.ToolCalls))
}
xlog.Debug("JSON generated", "Arguments", msg.ToolCalls[0].Function.Arguments)
return json.Unmarshal([]byte(msg.ToolCalls[0].Function.Arguments), dst)
}
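The final step of `GenerateTypedJSON` is a plain `json.Unmarshal` of the tool call's argument string into the caller's destination. That contract in isolation (the argument string and the `Intent` type are made-up stand-ins for a model response):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// decodeToolArguments mirrors the last step of GenerateTypedJSON: the model's
// tool-call arguments arrive as a JSON string and are decoded into dst.
func decodeToolArguments(arguments string, dst any) error {
	return json.Unmarshal([]byte(arguments), dst)
}

// Intent is a hypothetical destination type for illustration.
type Intent struct {
	Action string `json:"action"`
	Query  string `json:"query"`
}

func main() {
	// Stand-in for msg.ToolCalls[0].Function.Arguments
	args := `{"action":"search","query":"golang generics"}`
	var intent Intent
	if err := decodeToolArguments(args, &intent); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", intent)
}
```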

View File

@@ -0,0 +1,72 @@
package localoperator
import (
"bytes"
"encoding/json"
"fmt"
"net/http"
)
// Client represents a client for interacting with the LocalOperator API
type Client struct {
baseURL string
httpClient *http.Client
}
// NewClient creates a new API client
func NewClient(baseURL string) *Client {
return &Client{
baseURL: baseURL,
httpClient: &http.Client{},
}
}
// AgentRequest represents the request body for running an agent
type AgentRequest struct {
Goal string `json:"goal"`
MaxAttempts int `json:"max_attempts,omitempty"`
MaxNoActionAttempts int `json:"max_no_action_attempts,omitempty"`
}
// StateDescription represents a single state in the agent's history
type StateDescription struct {
CurrentURL string `json:"current_url"`
PageTitle string `json:"page_title"`
PageContentDescription string `json:"page_content_description"`
Screenshot string `json:"screenshot"`
ScreenshotMimeType string `json:"screenshot_mime_type"` // MIME type of the screenshot (e.g., "image/png")
}
// StateHistory represents the complete history of states during agent execution
type StateHistory struct {
States []StateDescription `json:"states"`
}
// RunBrowserAgent sends a request to run a browser agent with the given goal
func (c *Client) RunBrowserAgent(req AgentRequest) (*StateHistory, error) {
body, err := json.Marshal(req)
if err != nil {
return nil, fmt.Errorf("failed to marshal request: %w", err)
}
resp, err := c.httpClient.Post(
fmt.Sprintf("%s/api/browser/run", c.baseURL),
"application/json",
bytes.NewBuffer(body),
)
if err != nil {
return nil, fmt.Errorf("failed to send request: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return nil, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
}
var state StateHistory
if err := json.NewDecoder(resp.Body).Decode(&state); err != nil {
return nil, fmt.Errorf("failed to decode response: %w", err)
}
return &state, nil
}
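Because `MaxAttempts` and `MaxNoActionAttempts` carry `omitempty`, a zero value means "let the server choose its default". A standalone sketch of the request encoding (the type is redeclared locally; the goal strings are illustrative):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// AgentRequest mirrors the client type above, redeclared so this runs standalone.
type AgentRequest struct {
	Goal                string `json:"goal"`
	MaxAttempts         int    `json:"max_attempts,omitempty"`
	MaxNoActionAttempts int    `json:"max_no_action_attempts,omitempty"`
}

// encodeAgentRequest marshals the request; zero-valued limits are omitted.
func encodeAgentRequest(r AgentRequest) (string, error) {
	b, err := json.Marshal(r)
	return string(b), err
}

func main() {
	minimal, _ := encodeAgentRequest(AgentRequest{Goal: "open example.com"})
	full, _ := encodeAgentRequest(AgentRequest{Goal: "open example.com", MaxAttempts: 3})
	fmt.Println(minimal) // only the goal is sent
	fmt.Println(full)    // explicit limits are included
}
```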

389
pkg/localrag/client.go Normal file
View File

@@ -0,0 +1,389 @@
// TODO: this is a duplicate of LocalRAG/pkg/client
package localrag
import (
"bytes"
"crypto/md5"
"encoding/hex"
"encoding/json"
"errors"
"fmt"
"io"
"mime/multipart"
"net/http"
"os"
"path/filepath"
"time"
"github.com/mudler/LocalAGI/core/agent"
"github.com/mudler/LocalAGI/pkg/xlog"
)
var _ agent.RAGDB = &WrappedClient{}
type WrappedClient struct {
*Client
collection string
}
func NewWrappedClient(baseURL, apiKey, collection string) *WrappedClient {
wc := &WrappedClient{
Client: NewClient(baseURL, apiKey),
collection: collection,
}
// Best-effort: the collection may already exist on the server
_ = wc.CreateCollection(collection)
return wc
}
func (c *WrappedClient) Count() int {
entries, err := c.ListEntries(c.collection)
if err != nil {
return 0
}
return len(entries)
}
func (c *WrappedClient) Reset() error {
return c.Client.Reset(c.collection)
}
func (c *WrappedClient) Search(s string, similarity int) ([]string, error) {
results, err := c.Client.Search(c.collection, s, similarity)
if err != nil {
return nil, err
}
var res []string
for _, r := range results {
res = append(res, fmt.Sprintf("%s (%+v)", r.Content, r.Metadata))
}
return res, nil
}
func (c *WrappedClient) Store(s string) error {
// The LocalRAG client API currently accepts only file uploads, so write the
// string to a temporary file and upload that.
t := time.Now()
dateTime := t.Format("2006-01-02-15-04-05")
hash := md5.Sum([]byte(s))
fileName := fmt.Sprintf("%s-%s.txt", dateTime, hex.EncodeToString(hash[:]))
xlog.Debug("Storing string in LocalRAG", "collection", c.collection, "fileName", fileName)
tempdir, err := os.MkdirTemp("", "localrag")
if err != nil {
return err
}
defer os.RemoveAll(tempdir)
f := filepath.Join(tempdir, fileName)
err = os.WriteFile(f, []byte(s), 0644)
if err != nil {
return err
}
defer os.Remove(f)
return c.Client.Store(c.collection, f)
}
// Result represents a single result from a query.
type Result struct {
ID string
Metadata map[string]string
Embedding []float32
Content string
// The cosine similarity between the query and the document.
// The higher the value, the more similar the document is to the query.
// The value is in the range [-1, 1].
Similarity float32
}
// Client is a client for the RAG API
type Client struct {
BaseURL string
APIKey string
}
// NewClient creates a new RAG API client
func NewClient(baseURL, apiKey string) *Client {
return &Client{
BaseURL: baseURL,
APIKey: apiKey,
}
}
// addAuthHeader sets the Authorization header when an API key is configured
func (c *Client) addAuthHeader(req *http.Request) {
if c.APIKey == "" {
return
}
req.Header.Set("Authorization", "Bearer "+c.APIKey)
}
// CreateCollection creates a new collection
func (c *Client) CreateCollection(name string) error {
url := fmt.Sprintf("%s/api/collections", c.BaseURL)
type request struct {
Name string `json:"name"`
}
payload, err := json.Marshal(request{Name: name})
if err != nil {
return err
}
req, err := http.NewRequest(http.MethodPost, url, bytes.NewBuffer(payload))
if err != nil {
return err
}
req.Header.Set("Content-Type", "application/json")
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusCreated {
return errors.New("failed to create collection")
}
return nil
}
// ListCollections lists all collections
func (c *Client) ListCollections() ([]string, error) {
url := fmt.Sprintf("%s/api/collections", c.BaseURL)
req, err := http.NewRequest(http.MethodGet, url, nil)
if err != nil {
return nil, err
}
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return nil, errors.New("failed to list collections")
}
var collections []string
err = json.NewDecoder(resp.Body).Decode(&collections)
if err != nil {
return nil, err
}
return collections, nil
}
// ListEntries lists all entries in a collection
func (c *Client) ListEntries(collection string) ([]string, error) {
url := fmt.Sprintf("%s/api/collections/%s/entries", c.BaseURL, collection)
req, err := http.NewRequest(http.MethodGet, url, nil)
if err != nil {
return nil, err
}
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return nil, errors.New("failed to list entries")
}
var entries []string
err = json.NewDecoder(resp.Body).Decode(&entries)
if err != nil {
return nil, err
}
return entries, nil
}
// DeleteEntry deletes an entry in a collection
func (c *Client) DeleteEntry(collection, entry string) ([]string, error) {
url := fmt.Sprintf("%s/api/collections/%s/entry/delete", c.BaseURL, collection)
type request struct {
Entry string `json:"entry"`
}
payload, err := json.Marshal(request{Entry: entry})
if err != nil {
return nil, err
}
req, err := http.NewRequest(http.MethodDelete, url, bytes.NewBuffer(payload))
if err != nil {
return nil, err
}
req.Header.Set("Content-Type", "application/json")
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
bodyResult, _ := io.ReadAll(resp.Body)
return nil, errors.New("failed to delete entry: " + string(bodyResult))
}
var results []string
err = json.NewDecoder(resp.Body).Decode(&results)
if err != nil {
return nil, err
}
return results, nil
}
// Search searches a collection
func (c *Client) Search(collection, query string, maxResults int) ([]Result, error) {
url := fmt.Sprintf("%s/api/collections/%s/search", c.BaseURL, collection)
type request struct {
Query string `json:"query"`
MaxResults int `json:"max_results"`
}
payload, err := json.Marshal(request{Query: query, MaxResults: maxResults})
if err != nil {
return nil, err
}
req, err := http.NewRequest(http.MethodPost, url, bytes.NewBuffer(payload))
if err != nil {
return nil, err
}
req.Header.Set("Content-Type", "application/json")
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return nil, errors.New("failed to search collection")
}
var results []Result
err = json.NewDecoder(resp.Body).Decode(&results)
if err != nil {
return nil, err
}
return results, nil
}
// Reset resets a collection
func (c *Client) Reset(collection string) error {
url := fmt.Sprintf("%s/api/collections/%s/reset", c.BaseURL, collection)
req, err := http.NewRequest(http.MethodPost, url, nil)
if err != nil {
return err
}
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
b, _ := io.ReadAll(resp.Body)
return errors.New("failed to reset collection: " + string(b))
}
return nil
}
// Store uploads a file to a collection
func (c *Client) Store(collection, filePath string) error {
url := fmt.Sprintf("%s/api/collections/%s/upload", c.BaseURL, collection)
file, err := os.Open(filePath)
if err != nil {
return err
}
defer file.Close()
body := &bytes.Buffer{}
writer := multipart.NewWriter(body)
part, err := writer.CreateFormFile("file", file.Name())
if err != nil {
return err
}
_, err = io.Copy(part, file)
if err != nil {
return err
}
err = writer.Close()
if err != nil {
return err
}
req, err := http.NewRequest(http.MethodPost, url, body)
if err != nil {
return err
}
req.Header.Set("Content-Type", writer.FormDataContentType())
c.addAuthHeader(req)
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
return err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
b, _ := io.ReadAll(resp.Body)
type response struct {
Error string `json:"error"`
}
var r response
err = json.Unmarshal(b, &r)
if err == nil {
return errors.New("failed to upload file: " + r.Error)
}
return errors.New("failed to upload file")
}
return nil
}

325
pkg/stdio/client.go Normal file
View File

@@ -0,0 +1,325 @@
package stdio
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"log"
"net/http"
"net/url"
"sync"
"time"
"github.com/gorilla/websocket"
)
// Client implements the transport.Interface for stdio processes
type Client struct {
baseURL string
processes map[string]*Process
groups map[string][]string
mu sync.RWMutex
}
// NewClient creates a new stdio transport client
func NewClient(baseURL string) *Client {
return &Client{
baseURL: baseURL,
processes: make(map[string]*Process),
groups: make(map[string][]string),
}
}
// CreateProcess starts a new process in a group
func (c *Client) CreateProcess(ctx context.Context, command string, args []string, env []string, groupID string) (*Process, error) {
log.Printf("Creating process: command=%s, args=%v, groupID=%s", command, args, groupID)
req := struct {
Command string `json:"command"`
Args []string `json:"args"`
Env []string `json:"env"`
GroupID string `json:"group_id"`
}{
Command: command,
Args: args,
Env: env,
GroupID: groupID,
}
reqBody, err := json.Marshal(req)
if err != nil {
return nil, fmt.Errorf("failed to marshal request: %w", err)
}
url := fmt.Sprintf("%s/processes", c.baseURL)
log.Printf("Sending POST request to %s", url)
resp, err := http.Post(url, "application/json", bytes.NewReader(reqBody))
if err != nil {
return nil, fmt.Errorf("failed to start process: %w", err)
}
defer resp.Body.Close()
log.Printf("Received response with status: %d", resp.StatusCode)
var result struct {
ID string `json:"id"`
}
body, _ := io.ReadAll(resp.Body)
if err := json.Unmarshal(body, &result); err != nil {
return nil, fmt.Errorf("failed to decode response: %w. body: %s", err, string(body))
}
log.Printf("Successfully created process with ID: %s", result.ID)
process := &Process{
ID: result.ID,
GroupID: groupID,
CreatedAt: time.Now(),
}
c.mu.Lock()
c.processes[process.ID] = process
if groupID != "" {
c.groups[groupID] = append(c.groups[groupID], process.ID)
}
c.mu.Unlock()
return process, nil
}
// GetProcess returns a process by ID
func (c *Client) GetProcess(id string) (*Process, error) {
c.mu.RLock()
process, exists := c.processes[id]
c.mu.RUnlock()
if !exists {
return nil, fmt.Errorf("process not found: %s", id)
}
return process, nil
}
// GetGroupProcesses returns all processes in a group
func (c *Client) GetGroupProcesses(groupID string) ([]*Process, error) {
c.mu.RLock()
processIDs, exists := c.groups[groupID]
if !exists {
c.mu.RUnlock()
return nil, fmt.Errorf("group not found: %s", groupID)
}
processes := make([]*Process, 0, len(processIDs))
for _, pid := range processIDs {
if process, exists := c.processes[pid]; exists {
processes = append(processes, process)
}
}
c.mu.RUnlock()
return processes, nil
}
// StopProcess stops a single process
func (c *Client) StopProcess(id string) error {
c.mu.Lock()
process, exists := c.processes[id]
if !exists {
c.mu.Unlock()
return fmt.Errorf("process not found: %s", id)
}
// Remove from group if it exists
if process.GroupID != "" {
groupProcesses := c.groups[process.GroupID]
for i, pid := range groupProcesses {
if pid == id {
c.groups[process.GroupID] = append(groupProcesses[:i], groupProcesses[i+1:]...)
break
}
}
if len(c.groups[process.GroupID]) == 0 {
delete(c.groups, process.GroupID)
}
}
delete(c.processes, id)
c.mu.Unlock()
req, err := http.NewRequest(
"DELETE",
fmt.Sprintf("%s/processes/%s", c.baseURL, id),
nil,
)
if err != nil {
return fmt.Errorf("failed to create request: %w", err)
}
resp, err := http.DefaultClient.Do(req)
if err != nil {
return fmt.Errorf("failed to stop process: %w", err)
}
resp.Body.Close()
return nil
}
// StopGroup stops all processes in a group
func (c *Client) StopGroup(groupID string) error {
c.mu.Lock()
processIDs, exists := c.groups[groupID]
if !exists {
c.mu.Unlock()
return fmt.Errorf("group not found: %s", groupID)
}
// Copy the IDs: StopProcess mutates c.groups[groupID], whose backing array
// processIDs would otherwise alias while we iterate.
ids := make([]string, len(processIDs))
copy(ids, processIDs)
c.mu.Unlock()
for _, pid := range ids {
if err := c.StopProcess(pid); err != nil {
return fmt.Errorf("failed to stop process %s in group %s: %w", pid, groupID, err)
}
}
return nil
}
// ListGroups returns all group IDs
func (c *Client) ListGroups() []string {
c.mu.RLock()
defer c.mu.RUnlock()
groups := make([]string, 0, len(c.groups))
for groupID := range c.groups {
groups = append(groups, groupID)
}
return groups
}
// GetProcessIO returns io.Reader and io.Writer for a process
func (c *Client) GetProcessIO(id string) (io.Reader, io.Writer, error) {
log.Printf("Getting IO for process: %s", id)
process, err := c.GetProcess(id)
if err != nil {
return nil, nil, err
}
// Parse the base URL to get the host
baseURL, err := url.Parse(c.baseURL)
if err != nil {
return nil, nil, fmt.Errorf("failed to parse base URL: %w", err)
}
// Connect to WebSocket
u := url.URL{
Scheme: "ws",
Host: baseURL.Host,
Path: fmt.Sprintf("/ws/%s", process.ID),
}
log.Printf("Connecting to WebSocket at: %s", u.String())
conn, _, err := websocket.DefaultDialer.Dial(u.String(), nil)
if err != nil {
return nil, nil, fmt.Errorf("failed to connect to WebSocket: %w", err)
}
log.Printf("Successfully connected to WebSocket for process: %s", id)
// Create reader and writer
reader := &websocketReader{conn: conn}
writer := &websocketWriter{conn: conn}
return reader, writer, nil
}
// websocketReader implements io.Reader for WebSocket
type websocketReader struct {
conn *websocket.Conn
buf []byte // leftover bytes from a message larger than the last Read buffer
}
func (r *websocketReader) Read(p []byte) (n int, err error) {
// Drain any leftover from a previous oversized message first
if len(r.buf) > 0 {
n = copy(p, r.buf)
r.buf = r.buf[n:]
return n, nil
}
_, message, err := r.conn.ReadMessage()
if err != nil {
return 0, err
}
n = copy(p, message)
// Keep the remainder instead of silently dropping it
if n < len(message) {
r.buf = message[n:]
}
return n, nil
}
// websocketWriter implements io.Writer for WebSocket
type websocketWriter struct {
conn *websocket.Conn
}
func (w *websocketWriter) Write(p []byte) (n int, err error) {
// Use BinaryMessage type for better compatibility
err = w.conn.WriteMessage(websocket.BinaryMessage, p)
if err != nil {
return 0, fmt.Errorf("failed to write WebSocket message: %w", err)
}
return len(p), nil
}
// Close closes all connections and stops all processes
func (c *Client) Close() error {
c.mu.Lock()
defer c.mu.Unlock()
// Stop all processes
for id := range c.processes {
if err := c.StopProcess(id); err != nil {
return fmt.Errorf("failed to stop process %s: %w", id, err)
}
}
return nil
}
// RunProcess executes a command and returns its output
func (c *Client) RunProcess(ctx context.Context, command string, args []string, env []string) (string, error) {
log.Printf("Running one-time process: command=%s, args=%v", command, args)
req := struct {
Command string `json:"command"`
Args []string `json:"args"`
Env []string `json:"env"`
}{
Command: command,
Args: args,
Env: env,
}
reqBody, err := json.Marshal(req)
if err != nil {
return "", fmt.Errorf("failed to marshal request: %w", err)
}
url := fmt.Sprintf("%s/run", c.baseURL)
log.Printf("Sending POST request to %s", url)
resp, err := http.Post(url, "application/json", bytes.NewReader(reqBody))
if err != nil {
return "", fmt.Errorf("failed to execute process: %w", err)
}
defer resp.Body.Close()
log.Printf("Received response with status: %d", resp.StatusCode)
var result struct {
Output string `json:"output"`
}
// Read the body once up front so it can be echoed in the error message;
// decoding first and reading afterwards would find the body already consumed.
body, _ := io.ReadAll(resp.Body)
if err := json.Unmarshal(body, &result); err != nil {
return "", fmt.Errorf("failed to decode response: %w. body: %s", err, string(body))
}
log.Printf("Successfully executed process with output length: %d", len(result.Output))
return result.Output, nil
}

View File

@@ -0,0 +1,28 @@
package stdio
import (
"os"
"testing"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
func TestSTDIOTransport(t *testing.T) {
RegisterFailHandler(Fail)
RunSpecs(t, "STDIOTransport test suite")
}
var baseURL string
func init() {
baseURL = os.Getenv("LOCALAGI_MCPBOX_URL")
if baseURL == "" {
baseURL = "http://localhost:8080"
}
}
var _ = AfterSuite(func() {
client := NewClient(baseURL)
client.StopGroup("test-group")
})

235
pkg/stdio/client_test.go Normal file
View File

@@ -0,0 +1,235 @@
package stdio
import (
"context"
"time"
mcp "github.com/metoro-io/mcp-golang"
"github.com/metoro-io/mcp-golang/transport/stdio"
"github.com/mudler/LocalAGI/pkg/xlog"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("Client", func() {
var (
client *Client
)
BeforeEach(func() {
client = NewClient(baseURL)
})
AfterEach(func() {
if client != nil {
Expect(client.Close()).To(Succeed())
}
})
Context("Process Management", func() {
It("should create and stop a process", func() {
ctx := context.Background()
// Use a command that doesn't exit immediately
process, err := client.CreateProcess(ctx, "sh", []string{"-c", "echo 'Hello, World!'; sleep 10"}, []string{}, "test-group")
Expect(err).NotTo(HaveOccurred())
Expect(process).NotTo(BeNil())
Expect(process.ID).NotTo(BeEmpty())
// Get process IO
reader, writer, err := client.GetProcessIO(process.ID)
Expect(err).NotTo(HaveOccurred())
Expect(reader).NotTo(BeNil())
Expect(writer).NotTo(BeNil())
// Write to process
_, err = writer.Write([]byte("test input\n"))
Expect(err).NotTo(HaveOccurred())
// Read from process with timeout
buf := make([]byte, 1024)
readDone := make(chan struct{})
var readErr error
var readN int
go func() {
readN, readErr = reader.Read(buf)
close(readDone)
}()
// Wait for read with timeout
select {
case <-readDone:
Expect(readErr).NotTo(HaveOccurred())
Expect(readN).To(BeNumerically(">", 0))
Expect(string(buf[:readN])).To(ContainSubstring("Hello, World!"))
case <-time.After(5 * time.Second):
Fail("Timeout waiting for process output")
}
// Stop the process
err = client.StopProcess(process.ID)
Expect(err).NotTo(HaveOccurred())
})
It("should manage process groups", func() {
ctx := context.Background()
groupID := "test-group"
// Create multiple processes in the same group
process1, err := client.CreateProcess(ctx, "sh", []string{"-c", "echo 'Process 1'; sleep 1"}, []string{}, groupID)
Expect(err).NotTo(HaveOccurred())
Expect(process1).NotTo(BeNil())
process2, err := client.CreateProcess(ctx, "sh", []string{"-c", "echo 'Process 2'; sleep 1"}, []string{}, groupID)
Expect(err).NotTo(HaveOccurred())
Expect(process2).NotTo(BeNil())
// Get group processes
processes, err := client.GetGroupProcesses(groupID)
Expect(err).NotTo(HaveOccurred())
Expect(processes).To(HaveLen(2))
// List groups
groups := client.ListGroups()
Expect(groups).To(ContainElement(groupID))
// Stop the group
err = client.StopGroup(groupID)
Expect(err).NotTo(HaveOccurred())
})
It("should run a one-time process", func() {
ctx := context.Background()
output, err := client.RunProcess(ctx, "echo", []string{"One-time process"}, []string{})
Expect(err).NotTo(HaveOccurred())
Expect(output).To(ContainSubstring("One-time process"))
})
It("should handle process with environment variables", func() {
ctx := context.Background()
env := []string{"TEST_VAR=test_value"}
process, err := client.CreateProcess(ctx, "sh", []string{"-c", "env | grep TEST_VAR; sleep 1"}, env, "test-group")
Expect(err).NotTo(HaveOccurred())
Expect(process).NotTo(BeNil())
// Get process IO
reader, _, err := client.GetProcessIO(process.ID)
Expect(err).NotTo(HaveOccurred())
// Read environment variables with timeout
buf := make([]byte, 1024)
readDone := make(chan struct{})
var readErr error
var readN int
go func() {
readN, readErr = reader.Read(buf)
close(readDone)
}()
// Wait for read with timeout
select {
case <-readDone:
Expect(readErr).NotTo(HaveOccurred())
Expect(readN).To(BeNumerically(">", 0))
Expect(string(buf[:readN])).To(ContainSubstring("TEST_VAR=test_value"))
case <-time.After(5 * time.Second):
Fail("Timeout waiting for process output")
}
// Stop the process
err = client.StopProcess(process.ID)
Expect(err).NotTo(HaveOccurred())
})
It("should handle long-running processes", func() {
ctx := context.Background()
process, err := client.CreateProcess(ctx, "sh", []string{"-c", "echo 'Starting long process'; sleep 5"}, []string{}, "test-group")
Expect(err).NotTo(HaveOccurred())
Expect(process).NotTo(BeNil())
// Get process IO
reader, _, err := client.GetProcessIO(process.ID)
Expect(err).NotTo(HaveOccurred())
// Read initial output
buf := make([]byte, 1024)
readDone := make(chan struct{})
var readErr error
var readN int
go func() {
readN, readErr = reader.Read(buf)
close(readDone)
}()
// Wait for read with timeout
select {
case <-readDone:
Expect(readErr).NotTo(HaveOccurred())
Expect(readN).To(BeNumerically(">", 0))
Expect(string(buf[:readN])).To(ContainSubstring("Starting long process"))
case <-time.After(5 * time.Second):
Fail("Timeout waiting for process output")
}
// Wait a bit to ensure process is running
time.Sleep(time.Second)
// Stop the process
err = client.StopProcess(process.ID)
Expect(err).NotTo(HaveOccurred())
})
It("MCP", func() {
ctx := context.Background()
process, err := client.CreateProcess(ctx,
"docker", []string{"run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"},
[]string{"GITHUB_PERSONAL_ACCESS_TOKEN=test"}, "test-group")
Expect(err).NotTo(HaveOccurred())
Expect(process).NotTo(BeNil())
Expect(process.ID).NotTo(BeEmpty())
defer client.StopProcess(process.ID)
// MCP client
read, writer, err := client.GetProcessIO(process.ID)
Expect(err).NotTo(HaveOccurred())
Expect(read).NotTo(BeNil())
Expect(writer).NotTo(BeNil())
transport := stdio.NewStdioServerTransportWithIO(read, writer)
// Create a new client
mcpClient := mcp.NewClient(transport)
// Initialize the client
response, e := mcpClient.Initialize(ctx)
Expect(e).NotTo(HaveOccurred())
Expect(response).NotTo(BeNil())
Expect(mcpClient.Ping(ctx)).To(Succeed())
xlog.Debug("Client initialized", "instructions", response.Instructions)
alltools := []mcp.ToolRetType{}
var cursor *string
for {
tools, err := mcpClient.ListTools(ctx, cursor)
Expect(err).NotTo(HaveOccurred())
Expect(tools).NotTo(BeNil())
Expect(tools.Tools).NotTo(BeEmpty())
alltools = append(alltools, tools.Tools...)
if tools.NextCursor == nil {
break // No more pages
}
cursor = tools.NextCursor
}
for _, tool := range alltools {
xlog.Debug("Tool", "tool", tool)
}
})
})
})

473
pkg/stdio/server.go Normal file
View File

@@ -0,0 +1,473 @@
package stdio
import (
"context"
"encoding/json"
"fmt"
"io"
"log"
"net/http"
"os"
"os/exec"
"sync"
"time"
"github.com/gorilla/websocket"
"github.com/mudler/LocalAGI/pkg/xlog"
)
// Process represents a running process with its stdio streams
type Process struct {
ID string
GroupID string
Cmd *exec.Cmd
Stdin io.WriteCloser
Stdout io.ReadCloser
Stderr io.ReadCloser
CreatedAt time.Time
}
// Server handles process management and stdio streaming
type Server struct {
processes map[string]*Process
groups map[string][]string // maps group ID to process IDs
mu sync.RWMutex
upgrader websocket.Upgrader
}
// NewServer creates a new stdio server
func NewServer() *Server {
return &Server{
processes: make(map[string]*Process),
groups: make(map[string][]string),
upgrader: websocket.Upgrader{},
}
}
// StartProcess starts a new process and returns its ID
func (s *Server) StartProcess(ctx context.Context, command string, args []string, env []string, groupID string) (string, error) {
xlog.Debug("Starting process", "command", command, "args", args, "groupID", groupID)
cmd := exec.CommandContext(ctx, command, args...)
if len(env) > 0 {
cmd.Env = append(os.Environ(), env...)
xlog.Debug("Process environment", "env", cmd.Env)
}
stdin, err := cmd.StdinPipe()
if err != nil {
return "", fmt.Errorf("failed to create stdin pipe: %w", err)
}
stdout, err := cmd.StdoutPipe()
if err != nil {
return "", fmt.Errorf("failed to create stdout pipe: %w", err)
}
stderr, err := cmd.StderrPipe()
if err != nil {
return "", fmt.Errorf("failed to create stderr pipe: %w", err)
}
if err := cmd.Start(); err != nil {
return "", fmt.Errorf("failed to start process: %w", err)
}
process := &Process{
ID: fmt.Sprintf("%d", time.Now().UnixNano()),
GroupID: groupID,
Cmd: cmd,
Stdin: stdin,
Stdout: stdout,
Stderr: stderr,
CreatedAt: time.Now(),
}
s.mu.Lock()
s.processes[process.ID] = process
if groupID != "" {
s.groups[groupID] = append(s.groups[groupID], process.ID)
}
s.mu.Unlock()
xlog.Debug("Successfully started process", "id", process.ID, "pid", cmd.Process.Pid)
return process.ID, nil
}
// StopProcess stops a running process
func (s *Server) StopProcess(id string) error {
s.mu.Lock()
process, exists := s.processes[id]
if !exists {
s.mu.Unlock()
return fmt.Errorf("process not found: %s", id)
}
xlog.Debug("Stopping process", "processID", id, "pid", process.Cmd.Process.Pid)
// Remove from group if it exists
if process.GroupID != "" {
groupProcesses := s.groups[process.GroupID]
for i, pid := range groupProcesses {
if pid == id {
s.groups[process.GroupID] = append(groupProcesses[:i], groupProcesses[i+1:]...)
break
}
}
if len(s.groups[process.GroupID]) == 0 {
delete(s.groups, process.GroupID)
}
}
delete(s.processes, id)
s.mu.Unlock()
if err := process.Cmd.Process.Kill(); err != nil {
xlog.Debug("Failed to kill process", "processID", id, "pid", process.Cmd.Process.Pid, "error", err)
return fmt.Errorf("failed to kill process: %w", err)
}
xlog.Debug("Successfully killed process", "processID", id, "pid", process.Cmd.Process.Pid)
return nil
}
// StopGroup stops all processes in a group
func (s *Server) StopGroup(groupID string) error {
s.mu.Lock()
processIDs, exists := s.groups[groupID]
if !exists {
s.mu.Unlock()
return fmt.Errorf("group not found: %s", groupID)
}
// Copy the IDs before unlocking: StopProcess mutates the group's backing
// slice while we iterate, which would otherwise corrupt this range loop.
ids := make([]string, len(processIDs))
copy(ids, processIDs)
s.mu.Unlock()
for _, pid := range ids {
if err := s.StopProcess(pid); err != nil {
return fmt.Errorf("failed to stop process %s in group %s: %w", pid, groupID, err)
}
}
return nil
}
// GetGroupProcesses returns all processes in a group
func (s *Server) GetGroupProcesses(groupID string) ([]*Process, error) {
s.mu.RLock()
processIDs, exists := s.groups[groupID]
if !exists {
s.mu.RUnlock()
return nil, fmt.Errorf("group not found: %s", groupID)
}
processes := make([]*Process, 0, len(processIDs))
for _, pid := range processIDs {
if process, exists := s.processes[pid]; exists {
processes = append(processes, process)
}
}
s.mu.RUnlock()
return processes, nil
}
// ListGroups returns all group IDs
func (s *Server) ListGroups() []string {
s.mu.RLock()
defer s.mu.RUnlock()
groups := make([]string, 0, len(s.groups))
for groupID := range s.groups {
groups = append(groups, groupID)
}
return groups
}
// GetProcess returns a process by ID
func (s *Server) GetProcess(id string) (*Process, error) {
s.mu.RLock()
process, exists := s.processes[id]
s.mu.RUnlock()
if !exists {
return nil, fmt.Errorf("process not found: %s", id)
}
return process, nil
}
// ListProcesses returns all running processes
func (s *Server) ListProcesses() []*Process {
s.mu.RLock()
defer s.mu.RUnlock()
processes := make([]*Process, 0, len(s.processes))
for _, p := range s.processes {
processes = append(processes, p)
}
return processes
}
// RunProcess executes a command and returns its output
func (s *Server) RunProcess(ctx context.Context, command string, args []string, env []string) (string, error) {
cmd := exec.CommandContext(ctx, command, args...)
if len(env) > 0 {
cmd.Env = append(os.Environ(), env...)
}
output, err := cmd.CombinedOutput()
if err != nil {
return string(output), fmt.Errorf("process failed: %w", err)
}
return string(output), nil
}
// Start starts the HTTP server
func (s *Server) Start(addr string) error {
http.HandleFunc("/processes", s.handleProcesses)
http.HandleFunc("/processes/", s.handleProcess)
http.HandleFunc("/ws/", s.handleWebSocket)
http.HandleFunc("/groups", s.handleGroups)
http.HandleFunc("/groups/", s.handleGroup)
http.HandleFunc("/run", s.handleRun)
return http.ListenAndServe(addr, nil)
}
func (s *Server) handleProcesses(w http.ResponseWriter, r *http.Request) {
log.Printf("Handling /processes request: method=%s", r.Method)
switch r.Method {
case http.MethodGet:
processes := s.ListProcesses()
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(processes)
case http.MethodPost:
var req struct {
Command string `json:"command"`
Args []string `json:"args"`
Env []string `json:"env"`
GroupID string `json:"group_id"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
id, err := s.StartProcess(context.Background(), req.Command, req.Args, req.Env, req.GroupID)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(map[string]string{"id": id})
default:
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
}
}
func (s *Server) handleProcess(w http.ResponseWriter, r *http.Request) {
id := r.URL.Path[len("/processes/"):]
switch r.Method {
case http.MethodGet:
process, err := s.GetProcess(id)
if err != nil {
http.Error(w, err.Error(), http.StatusNotFound)
return
}
json.NewEncoder(w).Encode(process)
case http.MethodDelete:
if err := s.StopProcess(id); err != nil {
http.Error(w, err.Error(), http.StatusNotFound)
return
}
w.WriteHeader(http.StatusNoContent)
default:
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
}
}
func (s *Server) handleWebSocket(w http.ResponseWriter, r *http.Request) {
id := r.URL.Path[len("/ws/"):]
xlog.Debug("Handling WebSocket connection", "processID", id)
process, err := s.GetProcess(id)
if err != nil {
http.Error(w, err.Error(), http.StatusNotFound)
return
}
if process.Cmd.ProcessState != nil && process.Cmd.ProcessState.Exited() {
xlog.Debug("Process already exited", "processID", id)
http.Error(w, "Process already exited", http.StatusGone)
return
}
xlog.Debug("Process is running", "processID", id, "pid", process.Cmd.Process.Pid)
conn, err := s.upgrader.Upgrade(w, r, nil)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer conn.Close()
xlog.Debug("WebSocket connection established", "processID", id)
// Create a done channel to signal process completion
done := make(chan struct{})
// Handle stdin
go func() {
defer func() {
select {
case <-done:
xlog.Debug("Process stdin handler done", "processID", id)
default:
xlog.Debug("WebSocket stdin connection closed", "processID", id)
}
}()
for {
_, message, err := conn.ReadMessage()
if err != nil {
if websocket.IsUnexpectedCloseError(err, websocket.CloseGoingAway, websocket.CloseNormalClosure) {
xlog.Debug("WebSocket stdin unexpected error", "processID", id, "error", err)
}
return
}
xlog.Debug("Received message", "processID", id, "message", string(message))
if _, err := process.Stdin.Write(message); err != nil {
if err != io.EOF {
xlog.Debug("WebSocket stdin write error", "processID", id, "error", err)
}
return
}
xlog.Debug("Message sent to process", "processID", id, "message", string(message))
}
}()
// Handle stdout and stderr
go func() {
defer func() {
select {
case <-done:
xlog.Debug("Process output handler done", "processID", id)
default:
xlog.Debug("WebSocket output connection closed", "processID", id)
}
}()
// Create a buffer for reading
buf := make([]byte, 4096)
reader := io.MultiReader(process.Stdout, process.Stderr)
for {
n, err := reader.Read(buf)
if err != nil {
if err != io.EOF {
xlog.Debug("Read error", "processID", id, "error", err)
}
return
}
if n > 0 {
xlog.Debug("Sending message", "processID", id, "size", n)
if err := conn.WriteMessage(websocket.BinaryMessage, buf[:n]); err != nil {
if websocket.IsUnexpectedCloseError(err, websocket.CloseGoingAway, websocket.CloseNormalClosure) {
xlog.Debug("WebSocket output write error", "processID", id, "error", err)
}
return
}
xlog.Debug("Message sent to client", "processID", id, "size", n)
}
}
}()
// Wait for process to exit
xlog.Debug("Waiting for process to exit", "processID", id)
err = process.Cmd.Wait()
close(done) // Signal that the process is done
if err != nil {
xlog.Debug("Process exited with error",
"processID", id,
"pid", process.Cmd.Process.Pid,
"error", err)
} else {
xlog.Debug("Process exited successfully",
"processID", id,
"pid", process.Cmd.Process.Pid)
}
}
// Add new handlers for group management
func (s *Server) handleGroups(w http.ResponseWriter, r *http.Request) {
switch r.Method {
case http.MethodGet:
groups := s.ListGroups()
json.NewEncoder(w).Encode(groups)
default:
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
}
}
func (s *Server) handleGroup(w http.ResponseWriter, r *http.Request) {
groupID := r.URL.Path[len("/groups/"):]
switch r.Method {
case http.MethodGet:
processes, err := s.GetGroupProcesses(groupID)
if err != nil {
http.Error(w, err.Error(), http.StatusNotFound)
return
}
json.NewEncoder(w).Encode(processes)
case http.MethodDelete:
if err := s.StopGroup(groupID); err != nil {
http.Error(w, err.Error(), http.StatusNotFound)
return
}
w.WriteHeader(http.StatusNoContent)
default:
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
}
}
func (s *Server) handleRun(w http.ResponseWriter, r *http.Request) {
if r.Method != http.MethodPost {
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
return
}
log.Printf("Handling /run request")
var req struct {
Command string `json:"command"`
Args []string `json:"args"`
Env []string `json:"env"`
}
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
http.Error(w, err.Error(), http.StatusBadRequest)
return
}
log.Printf("Executing one-time process: command=%s, args=%v", req.Command, req.Args)
output, err := s.RunProcess(r.Context(), req.Command, req.Args, req.Env)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
log.Printf("One-time process completed with output length: %d", len(output))
w.Header().Set("Content-Type", "application/json")
json.NewEncoder(w).Encode(map[string]string{
"output": output,
})
}
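The anonymous struct decoded by `handleProcesses` defines the wire format for launching a process. The sketch below mirrors that struct on the client side to build a `POST /processes` body; the mirrored type and the `npx` MCP-server invocation are illustrative assumptions, not part of the package:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// startRequest mirrors the anonymous struct decoded in handleProcesses.
type startRequest struct {
	Command string   `json:"command"`
	Args    []string `json:"args"`
	Env     []string `json:"env"`
	GroupID string   `json:"group_id"`
}

// encodeStartRequest builds the JSON body for POST /processes.
func encodeStartRequest(req startRequest) ([]byte, error) {
	return json.Marshal(req)
}

func main() {
	body, err := encodeStartRequest(startRequest{
		Command: "npx",
		Args:    []string{"-y", "some-mcp-server"}, // hypothetical MCP server package
		GroupID: "agent-1",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

After the process is started, `/ws/{id}` streams its stdio over WebSocket, and `DELETE /processes/{id}` or `DELETE /groups/{group}` tears it down.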

9
pkg/utils/html.go Normal file
View File

@@ -0,0 +1,9 @@
package utils
import "strings"
func HTMLify(s string) string {
s = strings.TrimSpace(s)
s = strings.ReplaceAll(s, "\n", "<br>")
return s
}
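`HTMLify` trims surrounding whitespace before converting newlines, so trailing newlines never become dangling `<br>` tags. A standalone sketch (the function is reproduced here so the example runs on its own):

```go
package main

import (
	"fmt"
	"strings"
)

// HTMLify reproduces pkg/utils/html.go: trim outer whitespace,
// then turn remaining newlines into <br> tags.
func HTMLify(s string) string {
	s = strings.TrimSpace(s)
	s = strings.ReplaceAll(s, "\n", "<br>")
	return s
}

func main() {
	fmt.Println(HTMLify("  hello\nworld\n")) // → hello<br>world
}
```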

113
pkg/vectorstore/chromem.go Normal file
View File

@@ -0,0 +1,113 @@
package vectorstore
import (
"context"
"fmt"
"runtime"
"github.com/philippgille/chromem-go"
"github.com/sashabaranov/go-openai"
)
type ChromemDB struct {
collectionName string
collection *chromem.Collection
index int
client *openai.Client
db *chromem.DB
embeddingsModel string
}
func NewChromemDB(collection, path string, openaiClient *openai.Client, embeddingsModel string) (*ChromemDB, error) {
// db, err := chromem.NewPersistentDB(path, true)
// if err != nil {
// return nil, err
// }
db := chromem.NewDB()
c := &ChromemDB{
collectionName: collection,
index: 1,
db: db,
client: openaiClient,
embeddingsModel: embeddingsModel,
}
col, err := db.GetOrCreateCollection(collection, nil, c.embedding())
if err != nil {
return nil, err
}
c.collection = col
return c, nil
}
func (c *ChromemDB) Count() int {
return c.collection.Count()
}
func (c *ChromemDB) Reset() error {
if err := c.db.DeleteCollection(c.collectionName); err != nil {
return err
}
collection, err := c.db.GetOrCreateCollection(c.collectionName, nil, c.embedding())
if err != nil {
return err
}
c.collection = collection
return nil
}
func (c *ChromemDB) embedding() chromem.EmbeddingFunc {
return chromem.EmbeddingFunc(
func(ctx context.Context, text string) ([]float32, error) {
resp, err := c.client.CreateEmbeddings(ctx,
openai.EmbeddingRequestStrings{
Input: []string{text},
Model: openai.EmbeddingModel(c.embeddingsModel),
},
)
if err != nil {
return []float32{}, fmt.Errorf("error getting embeddings: %v", err)
}
if len(resp.Data) == 0 {
return []float32{}, fmt.Errorf("no response from OpenAI API")
}
embedding := resp.Data[0].Embedding
return embedding, nil
},
)
}
func (c *ChromemDB) Store(s string) error {
defer func() {
c.index++
}()
if s == "" {
return fmt.Errorf("empty string")
}
return c.collection.AddDocuments(context.Background(), []chromem.Document{
{
Content: s,
ID: fmt.Sprint(c.index),
},
}, runtime.NumCPU())
}
func (c *ChromemDB) Search(s string, similarEntries int) ([]string, error) {
res, err := c.collection.Query(context.Background(), s, similarEntries, nil, nil)
if err != nil {
return nil, err
}
var results []string
for _, r := range res {
results = append(results, r.Content)
}
return results, nil
}

View File

@@ -0,0 +1,86 @@
package vectorstore
import (
"context"
"fmt"
"github.com/sashabaranov/go-openai"
)
type LocalAIRAGDB struct {
client *StoreClient
openaiClient *openai.Client
}
func NewLocalAIRAGDB(storeClient *StoreClient, openaiClient *openai.Client) *LocalAIRAGDB {
return &LocalAIRAGDB{
client: storeClient,
openaiClient: openaiClient,
}
}
func (db *LocalAIRAGDB) Reset() error {
return fmt.Errorf("not implemented")
}
func (db *LocalAIRAGDB) Count() int {
return 0
}
func (db *LocalAIRAGDB) Store(s string) error {
resp, err := db.openaiClient.CreateEmbeddings(context.TODO(),
openai.EmbeddingRequestStrings{
Input: []string{s},
Model: openai.AdaEmbeddingV2,
},
)
if err != nil {
return fmt.Errorf("error getting embeddings: %v", err)
}
if len(resp.Data) == 0 {
return fmt.Errorf("no response from OpenAI API")
}
embedding := resp.Data[0].Embedding
setReq := SetRequest{
Keys: [][]float32{embedding},
Values: []string{s},
}
err = db.client.Set(setReq)
if err != nil {
return fmt.Errorf("error setting keys: %v", err)
}
return nil
}
func (db *LocalAIRAGDB) Search(s string, similarEntries int) ([]string, error) {
resp, err := db.openaiClient.CreateEmbeddings(context.TODO(),
openai.EmbeddingRequestStrings{
Input: []string{s},
Model: openai.AdaEmbeddingV2,
},
)
if err != nil {
return []string{}, fmt.Errorf("error getting embeddings: %v", err)
}
if len(resp.Data) == 0 {
return []string{}, fmt.Errorf("no response from OpenAI API")
}
embedding := resp.Data[0].Embedding
// Find example
findReq := FindRequest{
TopK: similarEntries, // Number of similar entries you want to find
Key: embedding, // The key you're looking for similarities to
}
findResp, err := db.client.Find(findReq)
if err != nil {
return []string{}, fmt.Errorf("error finding keys: %v", err)
}
return findResp.Values, nil
}

161
pkg/vectorstore/store.go Normal file
View File

@@ -0,0 +1,161 @@
package vectorstore
import (
"bytes"
"encoding/json"
"fmt"
"io/ioutil"
"net/http"
)
// Define a struct to hold your store API client
type StoreClient struct {
BaseURL string
APIToken string
Client *http.Client
}
// Define request and response struct formats based on the API documentation
type SetRequest struct {
Keys [][]float32 `json:"keys"`
Values []string `json:"values"`
}
type GetRequest struct {
Keys [][]float32 `json:"keys"`
}
type GetResponse struct {
Keys [][]float32 `json:"keys"`
Values []string `json:"values"`
}
type DeleteRequest struct {
Keys [][]float32 `json:"keys"`
}
type FindRequest struct {
TopK int `json:"topk"`
Key []float32 `json:"key"`
}
type FindResponse struct {
Keys [][]float32 `json:"keys"`
Values []string `json:"values"`
Similarities []float32 `json:"similarities"`
}
// Constructor for StoreClient
func NewStoreClient(baseUrl, apiToken string) *StoreClient {
return &StoreClient{
BaseURL: baseUrl,
APIToken: apiToken,
Client: &http.Client{},
}
}
// Implement Set method
func (c *StoreClient) Set(req SetRequest) error {
return c.doRequest("stores/set", req)
}
// Implement Get method
func (c *StoreClient) Get(req GetRequest) (*GetResponse, error) {
body, err := c.doRequestWithResponse("stores/get", req)
if err != nil {
return nil, err
}
var resp GetResponse
err = json.Unmarshal(body, &resp)
if err != nil {
return nil, err
}
return &resp, nil
}
// Implement Delete method
func (c *StoreClient) Delete(req DeleteRequest) error {
return c.doRequest("stores/delete", req)
}
// Implement Find method
func (c *StoreClient) Find(req FindRequest) (*FindResponse, error) {
body, err := c.doRequestWithResponse("stores/find", req)
if err != nil {
return nil, err
}
var resp FindResponse
err = json.Unmarshal(body, &resp)
if err != nil {
return nil, err
}
return &resp, nil
}
// Helper function to perform a request without expecting a response body
func (c *StoreClient) doRequest(path string, data interface{}) error {
jsonData, err := json.Marshal(data)
if err != nil {
return err
}
req, err := http.NewRequest("POST", c.BaseURL+"/"+path, bytes.NewBuffer(jsonData))
if err != nil {
return err
}
// Set Bearer token
if c.APIToken != "" {
req.Header.Set("Authorization", "Bearer "+c.APIToken)
}
req.Header.Set("Content-Type", "application/json")
resp, err := c.Client.Do(req)
if err != nil {
return err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return fmt.Errorf("API request to %s failed with status code %d", path, resp.StatusCode)
}
return nil
}
// Helper function to perform a request and parse the response body
func (c *StoreClient) doRequestWithResponse(path string, data interface{}) ([]byte, error) {
jsonData, err := json.Marshal(data)
if err != nil {
return nil, err
}
req, err := http.NewRequest("POST", c.BaseURL+"/"+path, bytes.NewBuffer(jsonData))
if err != nil {
return nil, err
}
req.Header.Set("Content-Type", "application/json")
// Set Bearer token
if c.APIToken != "" {
req.Header.Set("Authorization", "Bearer "+c.APIToken)
}
resp, err := c.Client.Do(req)
if err != nil {
return nil, err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return nil, fmt.Errorf("API request to %s failed with status code %d", path, resp.StatusCode)
}
body, err := ioutil.ReadAll(resp.Body)
if err != nil {
return nil, err
}
return body, nil
}
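The `StoreClient` above talks to a LocalAI store over plain JSON POSTs. The round trip can be exercised against a stub server; this sketch reproduces the `FindRequest`/`FindResponse` wire types and a minimal `find` helper (named here for illustration) so it runs standalone with `net/http/httptest`:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

// FindRequest and FindResponse mirror the wire types in pkg/vectorstore/store.go.
type FindRequest struct {
	TopK int       `json:"topk"`
	Key  []float32 `json:"key"`
}

type FindResponse struct {
	Keys         [][]float32 `json:"keys"`
	Values       []string    `json:"values"`
	Similarities []float32   `json:"similarities"`
}

// newStubStore stands in for a LocalAI store: it answers stores/find with one canned value.
func newStubStore() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		var req FindRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		json.NewEncoder(w).Encode(FindResponse{
			Values:       []string{"closest entry"},
			Similarities: []float32{0.99},
		})
	}))
}

// find posts a FindRequest the way doRequestWithResponse does and decodes the reply.
func find(baseURL string, req FindRequest) (*FindResponse, error) {
	body, err := json.Marshal(req)
	if err != nil {
		return nil, err
	}
	resp, err := http.Post(baseURL+"/stores/find", "application/json", bytes.NewBuffer(body))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	var out FindResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return nil, err
	}
	return &out, nil
}

func main() {
	srv := newStubStore()
	defer srv.Close()
	res, err := find(srv.URL, FindRequest{TopK: 1, Key: []float32{0.1, 0.2}})
	if err != nil {
		panic(err)
	}
	fmt.Println(res.Values[0]) // → closest entry
}
```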

71
pkg/xlog/xlog.go Normal file
View File

@@ -0,0 +1,71 @@
package xlog
import (
"context"
"log/slog"
"os"
"runtime"
)
var logger *slog.Logger
func init() {
var level = slog.LevelDebug
switch os.Getenv("LOG_LEVEL") {
case "info":
level = slog.LevelInfo
case "warn":
level = slog.LevelWarn
case "error":
level = slog.LevelError
case "debug":
level = slog.LevelDebug
}
var opts = &slog.HandlerOptions{
Level: level,
}
var handler slog.Handler
if os.Getenv("LOG_FORMAT") == "json" {
handler = slog.NewJSONHandler(os.Stdout, opts)
} else {
handler = slog.NewTextHandler(os.Stdout, opts)
}
logger = slog.New(handler)
}
func _log(level slog.Level, msg string, args ...any) {
_, f, l, _ := runtime.Caller(2)
group := slog.Group(
"source",
slog.Attr{
Key: "file",
Value: slog.AnyValue(f),
},
slog.Attr{
Key: "L",
Value: slog.AnyValue(l),
},
)
args = append(args, group)
logger.Log(context.Background(), level, msg, args...)
}
func Info(msg string, args ...any) {
_log(slog.LevelInfo, msg, args...)
}
func Debug(msg string, args ...any) {
_log(slog.LevelDebug, msg, args...)
}
func Error(msg string, args ...any) {
_log(slog.LevelError, msg, args...)
}
func Warn(msg string, args ...any) {
_log(slog.LevelWarn, msg, args...)
}

72
pkg/xstrings/split.go Normal file
View File

@@ -0,0 +1,72 @@
package xstrings
import (
"strings"
)
// SplitParagraph splits text into chunks of at most maxLength characters,
// preserving complete words and special characters like newlines.
// A single word longer than maxLength is kept whole rather than truncated.
func SplitParagraph(text string, maxLength int) []string {
// Handle edge cases
if maxLength <= 0 || len(text) == 0 {
return []string{text}
}
var chunks []string
remainingText := text
for len(remainingText) > 0 {
// If remaining text fits in a chunk, add it and we're done
if len(remainingText) <= maxLength {
chunks = append(chunks, remainingText)
break
}
// Try to find a good split point near the max length
splitIndex := maxLength
// Look backward from the max length to find a space or newline
for splitIndex > 0 && !isWhitespace(rune(remainingText[splitIndex])) {
splitIndex--
}
// If we couldn't find a good split point (no whitespace),
// look forward for the next whitespace
if splitIndex == 0 {
splitIndex = maxLength
// If we can't find whitespace forward, we'll have to split a word
for splitIndex < len(remainingText) && !isWhitespace(rune(remainingText[splitIndex])) {
splitIndex++
}
// If we still couldn't find whitespace, take the whole string
if splitIndex == len(remainingText) {
chunks = append(chunks, remainingText)
break
}
}
// Add the chunk up to the split point
chunk := remainingText[:splitIndex]
// Preserve trailing newlines with the current chunk
if splitIndex < len(remainingText) && remainingText[splitIndex] == '\n' {
chunk += string(remainingText[splitIndex])
splitIndex++
}
chunks = append(chunks, chunk)
// Remove leading whitespace from the next chunk
remainingText = remainingText[splitIndex:]
remainingText = strings.TrimLeftFunc(remainingText, isWhitespace)
}
return chunks
}
// Helper function to determine if a character is whitespace
func isWhitespace(r rune) bool {
return r == ' ' || r == '\t' || r == '\n' || r == '\r'
}

View File

@@ -0,0 +1,79 @@
package xstrings_test
import (
xtrings "github.com/mudler/LocalAGI/pkg/xstrings"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("SplitParagraph", func() {
It("should return the text as a single chunk if it's shorter than maxLen", func() {
text := "Short text"
maxLen := 20
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"Short text"}))
})
It("should split the text into chunks of maxLen without truncating words", func() {
text := "This is a longer text that needs to be split into chunks."
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"This is a", "longer", "text that", "needs to", "be split", "into", "chunks."}))
})
It("should handle texts with multiple spaces and newlines correctly", func() {
text := "This is\na\ntext with\n\nmultiple spaces and\nnewlines."
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"This is\na\n", "text with\n", "multiple", "spaces ", "and\n", "newlines."}))
})
It("should handle a text with a single word longer than maxLen", func() {
text := "supercalifragilisticexpialidocious"
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"supercalifragilisticexpialidocious"}))
})
It("should handle a text with empty lines", func() {
text := "line1\n\nline2"
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"line1\n\n", "line2"}))
})
It("should handle a text with leading and trailing spaces", func() {
text := " leading spaces and trailing spaces "
maxLen := 15
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{" leading", "spaces and", "trailing spaces"}))
})
It("should handle a text with only spaces", func() {
text := " "
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{" "}))
})
It("should handle empty string", func() {
text := ""
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{""}))
})
It("should handle a text with only newlines", func() {
text := "\n\n\n"
maxLen := 10
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"\n\n\n"}))
})
It("should handle a text with special characters", func() {
text := "This is a text with special characters !@#$%^&*()"
maxLen := 20
result := xtrings.SplitParagraph(text, maxLen)
Expect(result).To(Equal([]string{"This is a text with", "special characters", "!@#$%^&*()"}))
})
})

15
pkg/xstrings/uniq.go Normal file
View File

@@ -0,0 +1,15 @@
package xstrings
type Comparable interface{ ~int | ~int64 | ~string }
func UniqueSlice[T Comparable](s []T) []T {
keys := make(map[T]bool)
list := []T{}
for _, entry := range s {
if !keys[entry] {
keys[entry] = true
list = append(list, entry)
}
}
return list
}
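`UniqueSlice` keeps the first occurrence of each element and preserves input order. A standalone sketch (type and function reproduced from the file above so it runs on its own):

```go
package main

import "fmt"

type Comparable interface{ ~int | ~int64 | ~string }

// UniqueSlice reproduces pkg/xstrings/uniq.go: order-preserving
// deduplication via a seen-set.
func UniqueSlice[T Comparable](s []T) []T {
	keys := make(map[T]bool)
	list := []T{}
	for _, entry := range s {
		if !keys[entry] {
			keys[entry] = true
			list = append(list, entry)
		}
	}
	return list
}

func main() {
	fmt.Println(UniqueSlice([]string{"a", "b", "a", "c", "b"})) // → [a b c]
}
```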

View File

@@ -0,0 +1,13 @@
package xstrings_test
import (
"testing"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
func TestXStrings(t *testing.T) {
RegisterFailHandler(Fail)
RunSpecs(t, "XStrings test suite")
}

View File

@@ -1,8 +0,0 @@
langchain
openai
chromadb
pysqlite3-binary
requests
ascii-magic
loguru
duckduckgo_search

305
services/actions.go Normal file
View File

@@ -0,0 +1,305 @@
package services
import (
"context"
"encoding/json"
"fmt"
"github.com/mudler/LocalAGI/core/action"
"github.com/mudler/LocalAGI/core/state"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/mudler/LocalAGI/services/actions"
)
const (
// Actions
ActionSearch = "search"
ActionCustom = "custom"
ActionBrowserAgentRunner = "browser-agent-runner"
ActionGithubIssueLabeler = "github-issue-labeler"
ActionGithubIssueOpener = "github-issue-opener"
ActionGithubIssueCloser = "github-issue-closer"
ActionGithubIssueSearcher = "github-issue-searcher"
ActionGithubRepositoryGet = "github-repository-get-content"
ActionGithubRepositoryCreateOrUpdate = "github-repository-create-or-update-content"
ActionGithubIssueReader = "github-issue-reader"
ActionGithubIssueCommenter = "github-issue-commenter"
ActionGithubPRReader = "github-pr-reader"
ActionGithubPRCommenter = "github-pr-commenter"
ActionGithubPRReviewer = "github-pr-reviewer"
ActionGithubPRCreator = "github-pr-creator"
ActionGithubGetAllContent = "github-get-all-repository-content"
ActionGithubREADME = "github-readme"
ActionScraper = "scraper"
ActionWikipedia = "wikipedia"
ActionBrowse = "browse"
ActionTwitterPost = "twitter-post"
ActionSendMail = "send-mail"
ActionGenerateImage = "generate_image"
ActionCounter = "counter"
ActionCallAgents = "call_agents"
ActionShellcommand = "shell-command"
)
var AvailableActions = []string{
ActionSearch,
ActionCustom,
ActionGithubIssueLabeler,
ActionGithubIssueOpener,
ActionGithubIssueCloser,
ActionGithubIssueSearcher,
ActionGithubRepositoryGet,
ActionGithubGetAllContent,
ActionBrowserAgentRunner,
ActionGithubRepositoryCreateOrUpdate,
ActionGithubIssueReader,
ActionGithubIssueCommenter,
ActionGithubPRReader,
ActionGithubPRCommenter,
ActionGithubPRReviewer,
ActionGithubPRCreator,
ActionGithubREADME,
ActionScraper,
ActionBrowse,
ActionWikipedia,
ActionSendMail,
ActionGenerateImage,
ActionTwitterPost,
ActionCounter,
ActionCallAgents,
ActionShellcommand,
}
func Actions(actionsConfigs map[string]string) func(a *state.AgentConfig) func(ctx context.Context, pool *state.AgentPool) []types.Action {
return func(a *state.AgentConfig) func(ctx context.Context, pool *state.AgentPool) []types.Action {
return func(ctx context.Context, pool *state.AgentPool) []types.Action {
allActions := []types.Action{}
agentName := a.Name
for _, actionConfig := range a.Actions {
var config map[string]string
if err := json.Unmarshal([]byte(actionConfig.Config), &config); err != nil {
xlog.Error("Error unmarshalling action config", "error", err)
continue
}
act, err := Action(actionConfig.Name, agentName, config, pool, actionsConfigs)
if err != nil {
xlog.Error("Failed to create action", "name", actionConfig.Name, "error", err)
continue
}
allActions = append(allActions, act)
}
return allActions
}
}
}
func Action(name, agentName string, config map[string]string, pool *state.AgentPool, actionsConfigs map[string]string) (types.Action, error) {
var a types.Action
var err error
switch name {
case ActionCustom:
a, err = action.NewCustom(config, "")
case ActionGenerateImage:
a = actions.NewGenImage(config)
case ActionSearch:
a = actions.NewSearch(config)
case ActionGithubIssueLabeler:
a = actions.NewGithubIssueLabeler(config)
case ActionGithubIssueOpener:
a = actions.NewGithubIssueOpener(config)
case ActionGithubIssueCloser:
a = actions.NewGithubIssueCloser(config)
case ActionGithubIssueSearcher:
a = actions.NewGithubIssueSearch(config)
case ActionBrowserAgentRunner:
a = actions.NewBrowserAgentRunner(config, actionsConfigs["browser-agent-runner-base-url"])
case ActionGithubIssueReader:
a = actions.NewGithubIssueReader(config)
case ActionGithubPRReader:
a = actions.NewGithubPRReader(config)
case ActionGithubPRCommenter:
a = actions.NewGithubPRCommenter(config)
case ActionGithubPRReviewer:
a = actions.NewGithubPRReviewer(config)
case ActionGithubPRCreator:
a = actions.NewGithubPRCreator(config)
case ActionGithubGetAllContent:
a = actions.NewGithubRepositoryGetAllContent(config)
case ActionGithubIssueCommenter:
a = actions.NewGithubIssueCommenter(config)
case ActionGithubRepositoryGet:
a = actions.NewGithubRepositoryGetContent(config)
case ActionGithubRepositoryCreateOrUpdate:
a = actions.NewGithubRepositoryCreateOrUpdateContent(config)
case ActionGithubREADME:
a = actions.NewGithubRepositoryREADME(config)
case ActionScraper:
a = actions.NewScraper(config)
case ActionWikipedia:
a = actions.NewWikipedia(config)
case ActionBrowse:
a = actions.NewBrowse(config)
case ActionSendMail:
a = actions.NewSendMail(config)
case ActionTwitterPost:
a = actions.NewPostTweet(config)
case ActionCounter:
a = actions.NewCounter(config)
case ActionCallAgents:
a = actions.NewCallAgent(config, agentName, pool.InternalAPI())
case ActionShellcommand:
a = actions.NewShell(config)
default:
xlog.Error("Action not found", "name", name)
return nil, fmt.Errorf("action not found: %s", name)
}
if err != nil {
return nil, err
}
return a, nil
}
func ActionsConfigMeta() []config.FieldGroup {
return []config.FieldGroup{
{
Name: "search",
Label: "Search",
Fields: actions.SearchConfigMeta(),
},
{
Name: "browser-agent-runner",
Label: "Browser Agent Runner",
Fields: actions.BrowserAgentRunnerConfigMeta(),
},
{
Name: "generate_image",
Label: "Generate Image",
Fields: actions.GenImageConfigMeta(),
},
{
Name: "github-issue-labeler",
Label: "GitHub Issue Labeler",
Fields: actions.GithubIssueLabelerConfigMeta(),
},
{
Name: "github-issue-opener",
Label: "GitHub Issue Opener",
Fields: actions.GithubIssueOpenerConfigMeta(),
},
{
Name: "github-issue-closer",
Label: "GitHub Issue Closer",
Fields: actions.GithubIssueCloserConfigMeta(),
},
{
Name: "github-issue-commenter",
Label: "GitHub Issue Commenter",
Fields: actions.GithubIssueCommenterConfigMeta(),
},
{
Name: "github-issue-reader",
Label: "GitHub Issue Reader",
Fields: actions.GithubIssueReaderConfigMeta(),
},
{
Name: "github-issue-searcher",
Label: "GitHub Issue Search",
Fields: actions.GithubIssueSearchConfigMeta(),
},
{
Name: "github-repository-get-content",
Label: "GitHub Repository Get Content",
Fields: actions.GithubRepositoryGetContentConfigMeta(),
},
{
Name: "github-get-all-repository-content",
Label: "GitHub Get All Repository Content",
Fields: actions.GithubRepositoryGetAllContentConfigMeta(),
},
{
Name: "github-repository-create-or-update-content",
Label: "GitHub Repository Create/Update Content",
Fields: actions.GithubRepositoryCreateOrUpdateContentConfigMeta(),
},
{
Name: "github-readme",
Label: "GitHub Repository README",
Fields: actions.GithubRepositoryREADMEConfigMeta(),
},
{
Name: "github-pr-reader",
Label: "GitHub PR Reader",
Fields: actions.GithubPRReaderConfigMeta(),
},
{
Name: "github-pr-commenter",
Label: "GitHub PR Commenter",
Fields: actions.GithubPRCommenterConfigMeta(),
},
{
Name: "github-pr-reviewer",
Label: "GitHub PR Reviewer",
Fields: actions.GithubPRReviewerConfigMeta(),
},
{
Name: "github-pr-creator",
Label: "GitHub PR Creator",
Fields: actions.GithubPRCreatorConfigMeta(),
},
{
Name: "twitter-post",
Label: "Twitter Post",
Fields: actions.TwitterPostConfigMeta(),
},
{
Name: "send-mail",
Label: "Send Mail",
Fields: actions.SendMailConfigMeta(),
},
{
Name: "shell-command",
Label: "Shell Command",
Fields: actions.ShellConfigMeta(),
},
{
Name: "custom",
Label: "Custom",
Fields: action.CustomConfigMeta(),
},
{
Name: "scraper",
Label: "Scraper",
Fields: []config.Field{},
},
{
Name: "wikipedia",
Label: "Wikipedia",
Fields: []config.Field{},
},
{
Name: "browse",
Label: "Browse",
Fields: []config.Field{},
},
{
Name: "counter",
Label: "Counter",
Fields: []config.Field{},
},
{
Name: "call_agents",
Label: "Call Agents",
Fields: []config.Field{},
},
}
}
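Each configured action carries its settings as a JSON object of string values, which `Actions` decodes into a `map[string]string` before dispatching on the action name. A minimal sketch of that decoding step; the helper name and the config keys shown are assumptions for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// parseActionConfig decodes an action's JSON settings the same way
// the Actions() loop does before handing them to Action().
func parseActionConfig(raw string) (map[string]string, error) {
	var config map[string]string
	if err := json.Unmarshal([]byte(raw), &config); err != nil {
		return nil, err
	}
	return config, nil
}

func main() {
	cfg, err := parseActionConfig(`{"baseURL": "http://mcpbox:8080", "customActionName": "browse"}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(cfg["baseURL"]) // → http://mcpbox:8080
}
```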

View File

@@ -0,0 +1,13 @@
package actions_test
import (
"testing"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
func TestActions(t *testing.T) {
RegisterFailHandler(Fail)
RunSpecs(t, "Agent actions test suite")
}

View File

@@ -0,0 +1,72 @@
package actions
import (
"context"
"fmt"
"io"
"net/http"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
"jaytaylor.com/html2text"
)
func NewBrowse(config map[string]string) *BrowseAction {
return &BrowseAction{}
}
type BrowseAction struct{}
func (a *BrowseAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
URL string `json:"url"`
}{}
err := params.Unmarshal(&result)
if err != nil {
fmt.Printf("error: %v", err)
return types.ActionResult{}, err
}
// download the page with an http.Client, honoring the caller's context
client := &http.Client{}
req, err := http.NewRequestWithContext(ctx, http.MethodGet, result.URL, nil)
if err != nil {
return types.ActionResult{}, err
}
resp, err := client.Do(req)
if err != nil {
return types.ActionResult{}, err
}
defer resp.Body.Close()
pagebyte, err := io.ReadAll(resp.Body)
if err != nil {
return types.ActionResult{}, err
}
rendered, err := html2text.FromString(string(pagebyte), html2text.Options{PrettyTables: true})
if err != nil {
return types.ActionResult{}, err
}
return types.ActionResult{Result: fmt.Sprintf("The webpage '%s' content is:\n%s", result.URL, rendered)}, nil
}
func (a *BrowseAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "browse",
Description: "Use this tool to visit a URL. It browses a website page and returns the text content.",
Properties: map[string]jsonschema.Definition{
"url": {
Type: jsonschema.String,
Description: "The website URL.",
},
},
Required: []string{"url"},
}
}
func (a *BrowseAction) Plannable() bool {
return true
}
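
The fetch-and-extract flow above relies on the third-party html2text package. As a rough, dependency-free illustration of the same idea, the sketch below serves a page from an httptest server and strips tags naively; `stripTags` and `fetchText` are hypothetical helpers for illustration only, not part of this package:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"regexp"
	"strings"
)

// stripTags is a naive stand-in for html2text: it only removes markup,
// with none of html2text's table or link rendering.
func stripTags(html string) string {
	re := regexp.MustCompile(`<[^>]*>`)
	return strings.TrimSpace(re.ReplaceAllString(html, ""))
}

// fetchText downloads a URL and returns its body as plain text.
func fetchText(url string) (string, error) {
	resp, err := http.Get(url)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	return stripTags(string(body)), nil
}

func main() {
	// Serve a fixed page locally so the sketch runs without network access.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprint(w, "<html><body><p>hello</p></body></html>")
	}))
	defer srv.Close()
	text, err := fetchText(srv.URL)
	if err != nil {
		panic(err)
	}
	fmt.Println(text) // prints "hello"
}
```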

View File

@@ -0,0 +1,121 @@
package actions
import (
"context"
"fmt"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
api "github.com/mudler/LocalAGI/pkg/localoperator"
"github.com/sashabaranov/go-openai/jsonschema"
)
const (
MetadataBrowserAgentHistory = "browser_agent_history"
)
type BrowserAgentRunner struct {
baseURL, customActionName string
client *api.Client
}
func NewBrowserAgentRunner(config map[string]string, defaultURL string) *BrowserAgentRunner {
if config["baseURL"] == "" {
config["baseURL"] = defaultURL
}
client := api.NewClient(config["baseURL"])
return &BrowserAgentRunner{
client: client,
baseURL: config["baseURL"],
customActionName: config["customActionName"],
}
}
func (b *BrowserAgentRunner) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := api.AgentRequest{}
err := params.Unmarshal(&result)
if err != nil {
return types.ActionResult{}, fmt.Errorf("failed to unmarshal params: %w", err)
}
req := api.AgentRequest{
Goal: result.Goal,
MaxAttempts: result.MaxAttempts,
MaxNoActionAttempts: result.MaxNoActionAttempts,
}
stateHistory, err := b.client.RunBrowserAgent(req)
if err != nil {
return types.ActionResult{}, fmt.Errorf("failed to run browser agent: %w", err)
}
// Format the final state into a readable string
if len(stateHistory.States) == 0 {
return types.ActionResult{}, fmt.Errorf("browser agent returned no states")
}
lastState := stateHistory.States[len(stateHistory.States)-1]
var historyStr string
historyStr += fmt.Sprintf(" URL: %s\n", lastState.CurrentURL)
historyStr += fmt.Sprintf(" Title: %s\n", lastState.PageTitle)
historyStr += fmt.Sprintf(" Description: %s\n\n", lastState.PageContentDescription)
return types.ActionResult{
Result: fmt.Sprintf("Browser agent completed successfully. History:\n%s", historyStr),
Metadata: map[string]interface{}{MetadataBrowserAgentHistory: stateHistory},
}, nil
}
func (b *BrowserAgentRunner) Definition() types.ActionDefinition {
actionName := "run_browser_agent"
if b.customActionName != "" {
actionName = b.customActionName
}
description := "Run a browser agent to achieve a specific goal, for example: 'Go to https://www.google.com, search for LocalAI, and tell me what's on the first page'"
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: description,
Properties: map[string]jsonschema.Definition{
"goal": {
Type: jsonschema.String,
Description: "The goal for the browser agent to achieve",
},
"max_attempts": {
Type: jsonschema.Number,
Description: "Maximum number of attempts the agent can make (optional)",
},
"max_no_action_attempts": {
Type: jsonschema.Number,
Description: "Maximum number of attempts without taking an action (optional)",
},
},
Required: []string{"goal"},
}
}
func (b *BrowserAgentRunner) Plannable() bool {
return true
}
// BrowserAgentRunnerConfigMeta returns the metadata for Browser Agent Runner action configuration fields
func BrowserAgentRunnerConfigMeta() []config.Field {
return []config.Field{
{
Name: "baseURL",
Label: "Base URL",
Type: config.FieldTypeText,
Required: false,
HelpText: "Base URL of the LocalOperator API",
},
{
Name: "customActionName",
Label: "Custom Action Name",
Type: config.FieldTypeText,
HelpText: "Custom name for this action",
},
}
}
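
Indexing `stateHistory.States[len(stateHistory.States)-1]` assumes the slice is non-empty and panics otherwise. A guarded last-element helper can be sketched as follows; `lastOrZero` is a hypothetical helper, not part of this package:

```go
package main

import "fmt"

// lastOrZero returns the final element of a slice, or the zero value and
// false when the slice is empty — avoiding the out-of-range panic that
// bare s[len(s)-1] indexing causes on an empty slice.
func lastOrZero[T any](s []T) (T, bool) {
	var zero T
	if len(s) == 0 {
		return zero, false
	}
	return s[len(s)-1], true
}

func main() {
	v, ok := lastOrZero([]int{1, 2, 3})
	fmt.Println(v, ok) // prints "3 true"
	_, ok = lastOrZero([]int{})
	fmt.Println(ok) // prints "false"
}
```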

View File

@@ -0,0 +1,127 @@
package actions
import (
"context"
"fmt"
"github.com/mudler/LocalAGI/core/state"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
)
func NewCallAgent(config map[string]string, agentName string, pool *state.AgentPoolInternalAPI) *CallAgentAction {
return &CallAgentAction{
pool: pool,
myName: agentName,
}
}
type CallAgentAction struct {
pool *state.AgentPoolInternalAPI
myName string
}
func (a *CallAgentAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
AgentName string `json:"agent_name"`
Message string `json:"message"`
}{}
err := params.Unmarshal(&result)
if err != nil {
fmt.Printf("error: %v", err)
return types.ActionResult{}, err
}
ag := a.pool.GetAgent(result.AgentName)
if ag == nil {
return types.ActionResult{}, fmt.Errorf("agent '%s' not found", result.AgentName)
}
resp := ag.Ask(
types.WithConversationHistory(
[]openai.ChatCompletionMessage{
{
Role: "user",
Content: result.Message,
},
},
),
)
if resp.Error != nil {
return types.ActionResult{}, resp.Error
}
metadata := make(map[string]interface{})
for _, s := range resp.State {
for k, v := range s.Metadata {
if existingValue, ok := metadata[k]; ok {
switch existingValue := existingValue.(type) {
case []string:
switch v := v.(type) {
case []string:
metadata[k] = append(existingValue, v...)
case string:
metadata[k] = append(existingValue, v)
}
case string:
switch v := v.(type) {
case []string:
metadata[k] = append([]string{existingValue}, v...)
case string:
metadata[k] = []string{existingValue, v}
}
}
} else {
metadata[k] = v
}
}
}
return types.ActionResult{Result: resp.Response, Metadata: metadata}, nil
}
func (a *CallAgentAction) Definition() types.ActionDefinition {
allAgents := a.pool.AllAgents()
agents := []string{}
for _, ag := range allAgents {
if ag != a.myName {
agents = append(agents, ag)
}
}
description := "Use this tool to call another agent. Available agents and their roles are:"
for _, agent := range agents {
agentConfig := a.pool.GetConfig(agent)
if agentConfig == nil {
continue
}
description += fmt.Sprintf("\n\t- %s: %s", agent, agentConfig.Description)
}
return types.ActionDefinition{
Name: "call_agent",
Description: description,
Properties: map[string]jsonschema.Definition{
"agent_name": {
Type: jsonschema.String,
Description: "The name of the agent to call.",
Enum: agents,
},
"message": {
Type: jsonschema.String,
Description: "The message to send to the agent.",
},
},
Required: []string{"agent_name", "message"},
}
}
func (a *CallAgentAction) Plannable() bool {
return true
}
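
The nested type switches in `Run` above merge metadata from multiple agent states, promoting string values into slices as needed. That merge rule can be sketched in isolation; `mergeMetadataValue` is a hypothetical standalone helper mirroring the logic, not part of this package:

```go
package main

import "fmt"

// mergeMetadataValue combines an existing metadata value with an incoming
// one, following the string/[]string promotion rules used by CallAgentAction:
// two strings become a slice, a string joins an existing slice, and two
// slices are concatenated. Unsupported type pairs keep the existing value.
func mergeMetadataValue(existing, incoming interface{}) interface{} {
	switch e := existing.(type) {
	case []string:
		switch v := incoming.(type) {
		case []string:
			return append(e, v...)
		case string:
			return append(e, v)
		}
	case string:
		switch v := incoming.(type) {
		case []string:
			return append([]string{e}, v...)
		case string:
			return []string{e, v}
		}
	}
	return existing
}

func main() {
	fmt.Println(mergeMetadataValue("a", "b"))                     // prints "[a b]"
	fmt.Println(mergeMetadataValue([]string{"a"}, []string{"b"})) // prints "[a b]"
}
```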

View File

@@ -0,0 +1,98 @@
package actions
import (
"context"
"fmt"
"sync"
"github.com/mudler/LocalAGI/core/types"
"github.com/sashabaranov/go-openai/jsonschema"
)
// CounterAction manages named counters that can be created, updated, and queried
type CounterAction struct {
counters map[string]int
mutex sync.RWMutex
}
// NewCounter creates a new counter action
func NewCounter(config map[string]string) *CounterAction {
return &CounterAction{
counters: make(map[string]int),
mutex: sync.RWMutex{},
}
}
// Run executes the counter action
func (a *CounterAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
// Parse parameters
request := struct {
Name string `json:"name"`
Adjustment int `json:"adjustment"`
}{}
if err := params.Unmarshal(&request); err != nil {
return types.ActionResult{}, fmt.Errorf("invalid parameters: %w", err)
}
if request.Name == "" {
return types.ActionResult{}, fmt.Errorf("counter name cannot be empty")
}
a.mutex.Lock()
defer a.mutex.Unlock()
// Get current value or initialize if it doesn't exist
currentValue, exists := a.counters[request.Name]
// Update the counter
newValue := currentValue + request.Adjustment
a.counters[request.Name] = newValue
// Prepare the response message
var message string
if !exists && request.Adjustment == 0 {
message = fmt.Sprintf("Created counter '%s' with initial value 0", request.Name)
} else if !exists {
message = fmt.Sprintf("Created counter '%s' with initial value %d", request.Name, newValue)
} else if request.Adjustment > 0 {
message = fmt.Sprintf("Increased counter '%s' by %d to %d", request.Name, request.Adjustment, newValue)
} else if request.Adjustment < 0 {
message = fmt.Sprintf("Decreased counter '%s' by %d to %d", request.Name, -request.Adjustment, newValue)
} else {
message = fmt.Sprintf("Current value of counter '%s' is %d", request.Name, newValue)
}
return types.ActionResult{
Result: message,
Metadata: map[string]any{
"counter_name": request.Name,
"counter_value": newValue,
"adjustment": request.Adjustment,
"is_new": !exists,
},
}, nil
}
// Definition returns the action definition
func (a *CounterAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "counter",
Description: "Create, update, or query named counters. Specify a name and an adjustment value (positive to increase, negative to decrease, zero to query).",
Properties: map[string]jsonschema.Definition{
"name": {
Type: jsonschema.String,
Description: "The name of the counter to create, update, or query.",
},
"adjustment": {
Type: jsonschema.Integer,
Description: "The value to adjust the counter by. Positive to increase, negative to decrease, zero to query the current value.",
},
},
Required: []string{"name", "adjustment"},
}
}
func (a *CounterAction) Plannable() bool {
return true
}
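
Because an agent may invoke tools concurrently, `CounterAction` guards its map with a mutex. A standalone sketch of that guarded-map pattern, under the assumption that concurrent adjustments are the case to protect against (`counters` here is a hypothetical type, not the action itself):

```go
package main

import (
	"fmt"
	"sync"
)

// counters mirrors CounterAction's guarded map: every read and write goes
// through the mutex, so concurrent adjustments never race.
type counters struct {
	mu sync.Mutex
	m  map[string]int
}

// adjust adds delta to the named counter and returns the new value.
// A delta of zero simply reads the current value.
func (c *counters) adjust(name string, delta int) int {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.m[name] += delta
	return c.m[name]
}

func main() {
	c := &counters{m: map[string]int{}}
	var wg sync.WaitGroup
	// 100 goroutines incrementing concurrently still yield exactly 100.
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			c.adjust("hits", 1)
		}()
	}
	wg.Wait()
	fmt.Println(c.adjust("hits", 0)) // prints "100"
}
```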

View File

@@ -0,0 +1,128 @@
package actions
import (
"context"
"fmt"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/sashabaranov/go-openai"
"github.com/sashabaranov/go-openai/jsonschema"
)
const (
MetadataImages = "images_url"
)
func NewGenImage(config map[string]string) *GenImageAction {
defaultConfig := openai.DefaultConfig(config["apiKey"])
defaultConfig.BaseURL = config["apiURL"]
return &GenImageAction{
client: openai.NewClientWithConfig(defaultConfig),
imageModel: config["model"],
}
}
type GenImageAction struct {
client *openai.Client
imageModel string
}
func (a *GenImageAction) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
Prompt string `json:"prompt"`
Size string `json:"size"`
}{}
err := params.Unmarshal(&result)
if err != nil {
return types.ActionResult{}, err
}
if result.Prompt == "" {
return types.ActionResult{}, fmt.Errorf("prompt is required")
}
req := openai.ImageRequest{
Prompt: result.Prompt,
Model: a.imageModel,
}
switch result.Size {
case "256x256":
req.Size = openai.CreateImageSize256x256
case "512x512":
req.Size = openai.CreateImageSize512x512
case "1024x1024":
req.Size = openai.CreateImageSize1024x1024
default:
req.Size = openai.CreateImageSize256x256
}
resp, err := a.client.CreateImage(ctx, req)
if err != nil {
return types.ActionResult{Result: "Failed to generate image " + err.Error()}, err
}
if len(resp.Data) == 0 {
return types.ActionResult{Result: "Failed to generate image"}, nil
}
return types.ActionResult{
Result: fmt.Sprintf("The image was generated and is available at: %s", resp.Data[0].URL),
Metadata: map[string]interface{}{
MetadataImages: []string{resp.Data[0].URL},
}}, nil
}
func (a *GenImageAction) Definition() types.ActionDefinition {
return types.ActionDefinition{
Name: "generate_image",
Description: "Generate an image based on a text prompt.",
Properties: map[string]jsonschema.Definition{
"prompt": {
Type: jsonschema.String,
Description: "The image prompt to generate the image.",
},
"size": {
Type: jsonschema.String,
Description: "The size of the generated image.",
Enum: []string{"256x256", "512x512", "1024x1024"},
},
},
Required: []string{"prompt"},
}
}
func (a *GenImageAction) Plannable() bool {
return true
}
// GenImageConfigMeta returns the metadata for GenImage action configuration fields
func GenImageConfigMeta() []config.Field {
return []config.Field{
{
Name: "apiKey",
Label: "API Key",
Type: config.FieldTypeText,
Required: true,
HelpText: "OpenAI API key for image generation",
},
{
Name: "apiURL",
Label: "API URL",
Type: config.FieldTypeText,
Required: true,
DefaultValue: "https://api.openai.com/v1",
HelpText: "OpenAI API URL",
},
{
Name: "model",
Label: "Model",
Type: config.FieldTypeText,
Required: true,
DefaultValue: "dall-e-3",
HelpText: "Image generation model to use (e.g., dall-e-3)",
},
}
}

View File

@@ -0,0 +1,70 @@
package actions_test
import (
"context"
"os"
"github.com/mudler/LocalAGI/core/types"
. "github.com/mudler/LocalAGI/services/actions"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("GenImageAction", func() {
var (
ctx context.Context
action *GenImageAction
params types.ActionParams
config map[string]string
)
BeforeEach(func() {
ctx = context.Background()
apiKey := os.Getenv("OPENAI_API_KEY")
apiURL := os.Getenv("OPENAI_API_URL")
testModel := os.Getenv("OPENAI_MODEL")
if apiURL == "" {
Skip("OPENAI_API_URL must be set")
}
config = map[string]string{
"apiKey": apiKey,
"apiURL": apiURL,
"model": testModel,
}
action = NewGenImage(config)
})
Describe("Run", func() {
It("should generate an image with valid prompt and size", func() {
params = types.ActionParams{
"prompt": "test prompt",
"size": "256x256",
}
result, err := action.Run(ctx, params)
Expect(err).ToNot(HaveOccurred())
Expect(result.Result).ToNot(BeEmpty())
})
It("should return an error if the prompt is not provided", func() {
params = types.ActionParams{
"size": "256x256",
}
_, err := action.Run(ctx, params)
Expect(err).To(HaveOccurred())
})
})
Describe("Definition", func() {
It("should return the correct action definition", func() {
definition := action.Definition()
Expect(definition.Name.String()).To(Equal("generate_image"))
Expect(definition.Description).To(ContainSubstring("image"))
Expect(definition.Properties).To(HaveKey("prompt"))
Expect(definition.Properties).To(HaveKey("size"))
Expect(definition.Required).To(ContainElement("prompt"))
})
})
})

View File

@@ -0,0 +1,154 @@
package actions
import (
"context"
"fmt"
"github.com/google/go-github/v69/github"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/sashabaranov/go-openai/jsonschema"
)
type GithubIssuesCloser struct {
token, repository, owner, customActionName string
client *github.Client
}
func NewGithubIssueCloser(config map[string]string) *GithubIssuesCloser {
client := github.NewClient(nil).WithAuthToken(config["token"])
return &GithubIssuesCloser{
client: client,
token: config["token"],
repository: config["repository"],
owner: config["owner"],
customActionName: config["customActionName"],
}
}
func (g *GithubIssuesCloser) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
Repository string `json:"repository"`
Owner string `json:"owner"`
IssueNumber int `json:"issue_number"`
}{}
err := params.Unmarshal(&result)
if err != nil {
fmt.Printf("error: %v", err)
return types.ActionResult{}, err
}
if g.repository != "" {
result.Repository = g.repository
}
if g.owner != "" {
result.Owner = g.owner
}
_, _, err = g.client.Issues.Edit(ctx, result.Owner, result.Repository, result.IssueNumber, &github.IssueRequest{
State: github.String("closed"),
})
if err != nil {
return types.ActionResult{Result: fmt.Sprintf("Error closing issue %d in repository %s/%s: %v", result.IssueNumber, result.Owner, result.Repository, err)}, err
}
return types.ActionResult{Result: fmt.Sprintf("Closed issue %d in repository %s/%s", result.IssueNumber, result.Owner, result.Repository)}, nil
}
func (g *GithubIssuesCloser) Definition() types.ActionDefinition {
actionName := "close_github_issue"
if g.customActionName != "" {
actionName = g.customActionName
}
if g.repository != "" && g.owner != "" {
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Closes a Github issue.",
Properties: map[string]jsonschema.Definition{
"issue_number": {
Type: jsonschema.Number,
Description: "The issue number to close",
},
},
Required: []string{"issue_number"},
}
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Closes a Github issue.",
Properties: map[string]jsonschema.Definition{
"repository": {
Type: jsonschema.String,
Description: "The repository to close the issue in.",
},
"owner": {
Type: jsonschema.String,
Description: "The owner of the repository.",
},
"issue_number": {
Type: jsonschema.Number,
Description: "The issue number to close",
},
},
Required: []string{"issue_number", "repository", "owner"},
}
}
func (g *GithubIssuesCloser) Plannable() bool {
return true
}
// GithubIssueCloserConfigMeta returns the metadata for GitHub Issue Closer action configuration fields
func GithubIssueCloserConfigMeta() []config.Field {
return []config.Field{
{
Name: "token",
Label: "GitHub Token",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub API token with repository access",
},
{
Name: "repository",
Label: "Repository",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository name",
},
{
Name: "owner",
Label: "Owner",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository owner",
},
{
Name: "customActionName",
Label: "Custom Action Name",
Type: config.FieldTypeText,
HelpText: "Custom name for this action",
},
}
}

View File

@@ -0,0 +1,141 @@
package actions
import (
"context"
"fmt"
"github.com/google/go-github/v69/github"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/sashabaranov/go-openai/jsonschema"
)
type GithubIssuesCommenter struct {
token, repository, owner, customActionName string
client *github.Client
}
func NewGithubIssueCommenter(config map[string]string) *GithubIssuesCommenter {
client := github.NewClient(nil).WithAuthToken(config["token"])
return &GithubIssuesCommenter{
client: client,
token: config["token"],
customActionName: config["customActionName"],
repository: config["repository"],
owner: config["owner"],
}
}
func (g *GithubIssuesCommenter) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
Repository string `json:"repository"`
Owner string `json:"owner"`
Comment string `json:"comment"`
IssueNumber int `json:"issue_number"`
}{}
err := params.Unmarshal(&result)
if err != nil {
return types.ActionResult{}, err
}
if g.repository != "" && g.owner != "" {
result.Repository = g.repository
result.Owner = g.owner
}
_, _, err = g.client.Issues.CreateComment(ctx, result.Owner, result.Repository, result.IssueNumber,
&github.IssueComment{
Body: &result.Comment,
})
resultString := fmt.Sprintf("Added comment to issue %d in repository %s/%s", result.IssueNumber, result.Owner, result.Repository)
if err != nil {
resultString = fmt.Sprintf("Error adding comment to issue %d in repository %s/%s: %v", result.IssueNumber, result.Owner, result.Repository, err)
}
return types.ActionResult{Result: resultString}, err
}
func (g *GithubIssuesCommenter) Definition() types.ActionDefinition {
actionName := "add_comment_to_github_issue"
if g.customActionName != "" {
actionName = g.customActionName
}
description := "Add a comment to a GitHub issue in a repository."
if g.repository != "" && g.owner != "" {
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: description,
Properties: map[string]jsonschema.Definition{
"issue_number": {
Type: jsonschema.Number,
Description: "The number of the issue to comment on.",
},
"comment": {
Type: jsonschema.String,
Description: "The comment to add to the issue.",
},
},
Required: []string{"issue_number", "comment"},
}
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: description,
Properties: map[string]jsonschema.Definition{
"issue_number": {
Type: jsonschema.Number,
Description: "The number of the issue to comment on.",
},
"repository": {
Type: jsonschema.String,
Description: "The repository containing the issue.",
},
"owner": {
Type: jsonschema.String,
Description: "The owner of the repository.",
},
"comment": {
Type: jsonschema.String,
Description: "The comment to add to the issue.",
},
},
Required: []string{"issue_number", "repository", "owner", "comment"},
}
}
func (g *GithubIssuesCommenter) Plannable() bool {
return true
}
// GithubIssueCommenterConfigMeta returns the metadata for GitHub Issue Commenter action configuration fields
func GithubIssueCommenterConfigMeta() []config.Field {
return []config.Field{
{
Name: "token",
Label: "GitHub Token",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub API token with repository access",
},
{
Name: "repository",
Label: "Repository",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository name",
},
{
Name: "owner",
Label: "Owner",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository owner",
},
{
Name: "customActionName",
Label: "Custom Action Name",
Type: config.FieldTypeText,
HelpText: "Custom name for this action",
},
}
}

View File

@@ -0,0 +1,163 @@
package actions
import (
"context"
"fmt"
"strings"
"github.com/google/go-github/v69/github"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/mudler/LocalAGI/pkg/xlog"
"github.com/sashabaranov/go-openai/jsonschema"
)
type GithubIssuesLabeler struct {
token, repository, owner, customActionName string
availableLabels []string
client *github.Client
}
func NewGithubIssueLabeler(config map[string]string) *GithubIssuesLabeler {
client := github.NewClient(nil).WithAuthToken(config["token"])
// Get available labels
availableLabels := []string{"bug", "enhancement"}
if config["availableLabels"] != "" {
availableLabels = strings.Split(config["availableLabels"], ",")
}
return &GithubIssuesLabeler{
client: client,
token: config["token"],
customActionName: config["customActionName"],
repository: config["repository"],
owner: config["owner"],
availableLabels: availableLabels,
}
}
func (g *GithubIssuesLabeler) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
Repository string `json:"repository"`
Owner string `json:"owner"`
Label string `json:"label"`
IssueNumber int `json:"issue_number"`
}{}
err := params.Unmarshal(&result)
if err != nil {
return types.ActionResult{}, err
}
if g.repository != "" && g.owner != "" {
result.Repository = g.repository
result.Owner = g.owner
}
labels, _, err := g.client.Issues.AddLabelsToIssue(ctx, result.Owner, result.Repository, result.IssueNumber, []string{result.Label})
for _, l := range labels {
xlog.Info("Label added", "label", l.GetName())
}
resultString := fmt.Sprintf("Added label '%s' to issue %d in repository %s/%s", result.Label, result.IssueNumber, result.Owner, result.Repository)
if err != nil {
resultString = fmt.Sprintf("Error adding label '%s' to issue %d in repository %s/%s: %v", result.Label, result.IssueNumber, result.Owner, result.Repository, err)
}
return types.ActionResult{Result: resultString}, err
}
func (g *GithubIssuesLabeler) Definition() types.ActionDefinition {
actionName := "add_label_to_github_issue"
if g.customActionName != "" {
actionName = g.customActionName
}
if g.repository != "" && g.owner != "" {
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Add a label to a Github issue. You might want to assign labels to issues to categorize them.",
Properties: map[string]jsonschema.Definition{
"issue_number": {
Type: jsonschema.Number,
Description: "The number of the issue to add the label to.",
},
"label": {
Type: jsonschema.String,
Description: "The label to add to the issue.",
Enum: g.availableLabels,
},
},
Required: []string{"issue_number", "label"},
}
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Add a label to a Github issue. You might want to assign labels to issues to categorize them.",
Properties: map[string]jsonschema.Definition{
"issue_number": {
Type: jsonschema.Number,
Description: "The number of the issue to add the label to.",
},
"repository": {
Type: jsonschema.String,
Description: "The repository to add the label to.",
},
"owner": {
Type: jsonschema.String,
Description: "The owner of the repository.",
},
"label": {
Type: jsonschema.String,
Description: "The label to add to the issue.",
Enum: g.availableLabels,
},
},
Required: []string{"issue_number", "repository", "owner", "label"},
}
}
func (g *GithubIssuesLabeler) Plannable() bool {
return true
}
// GithubIssueLabelerConfigMeta returns the metadata for GitHub Issue Labeler action configuration fields
func GithubIssueLabelerConfigMeta() []config.Field {
return []config.Field{
{
Name: "token",
Label: "GitHub Token",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub API token with repository access",
},
{
Name: "repository",
Label: "Repository",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository name",
},
{
Name: "owner",
Label: "Owner",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository owner",
},
{
Name: "availableLabels",
Label: "Available Labels",
Type: config.FieldTypeText,
HelpText: "Comma-separated list of available labels",
DefaultValue: "bug,enhancement",
},
{
Name: "customActionName",
Label: "Custom Action Name",
Type: config.FieldTypeText,
HelpText: "Custom name for this action",
},
}
}
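
`NewGithubIssueLabeler` splits the `availableLabels` field on commas with a bare `strings.Split`, so surrounding whitespace becomes part of a label. A slightly more forgiving parsing sketch, assuming trimming and a default fallback are desirable (`parseLabels` is a hypothetical helper, not part of this package):

```go
package main

import (
	"fmt"
	"strings"
)

// parseLabels splits a comma-separated label list, trims whitespace, drops
// empty entries, and falls back to defaults when the input is blank.
func parseLabels(raw string, defaults []string) []string {
	if strings.TrimSpace(raw) == "" {
		return defaults
	}
	parts := strings.Split(raw, ",")
	labels := make([]string, 0, len(parts))
	for _, p := range parts {
		if l := strings.TrimSpace(p); l != "" {
			labels = append(labels, l)
		}
	}
	return labels
}

func main() {
	fmt.Println(parseLabels(" bug , enhancement ,", []string{"bug"})) // prints "[bug enhancement]"
	fmt.Println(parseLabels("", []string{"bug", "enhancement"}))      // prints "[bug enhancement]"
}
```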

View File

@@ -0,0 +1,147 @@
package actions
import (
"context"
"fmt"
"github.com/google/go-github/v69/github"
"github.com/mudler/LocalAGI/core/types"
"github.com/mudler/LocalAGI/pkg/config"
"github.com/sashabaranov/go-openai/jsonschema"
)
type GithubIssuesOpener struct {
token, repository, owner, customActionName string
client *github.Client
}
func NewGithubIssueOpener(config map[string]string) *GithubIssuesOpener {
client := github.NewClient(nil).WithAuthToken(config["token"])
return &GithubIssuesOpener{
client: client,
token: config["token"],
repository: config["repository"],
owner: config["owner"],
customActionName: config["customActionName"],
}
}
func (g *GithubIssuesOpener) Run(ctx context.Context, params types.ActionParams) (types.ActionResult, error) {
result := struct {
Title string `json:"title"`
Body string `json:"text"`
Repository string `json:"repository"`
Owner string `json:"owner"`
}{}
err := params.Unmarshal(&result)
if err != nil {
fmt.Printf("error: %v", err)
return types.ActionResult{}, err
}
if g.repository != "" && g.owner != "" {
result.Repository = g.repository
result.Owner = g.owner
}
issue := &github.IssueRequest{
Title: &result.Title,
Body: &result.Body,
}
resultString := ""
createdIssue, _, err := g.client.Issues.Create(ctx, result.Owner, result.Repository, issue)
if err != nil {
resultString = fmt.Sprintf("Error creating issue: %v", err)
} else {
resultString = fmt.Sprintf("Created issue %d in repository %s/%s: %s", createdIssue.GetNumber(), result.Owner, result.Repository, createdIssue.GetURL())
}
return types.ActionResult{Result: resultString}, err
}
func (g *GithubIssuesOpener) Definition() types.ActionDefinition {
actionName := "create_github_issue"
if g.customActionName != "" {
actionName = g.customActionName
}
if g.repository != "" && g.owner != "" {
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Create a new issue on a GitHub repository.",
Properties: map[string]jsonschema.Definition{
"text": {
Type: jsonschema.String,
Description: "The text of the new issue",
},
"title": {
Type: jsonschema.String,
Description: "The title of the issue.",
},
},
Required: []string{"title", "text"},
}
}
return types.ActionDefinition{
Name: types.ActionDefinitionName(actionName),
Description: "Create a new issue on a GitHub repository.",
Properties: map[string]jsonschema.Definition{
"text": {
Type: jsonschema.String,
Description: "The text of the new issue",
},
"title": {
Type: jsonschema.String,
Description: "The title of the issue.",
},
"owner": {
Type: jsonschema.String,
Description: "The owner of the repository.",
},
"repository": {
Type: jsonschema.String,
Description: "The repository where to create the issue.",
},
},
Required: []string{"title", "text", "owner", "repository"},
}
}
func (g *GithubIssuesOpener) Plannable() bool {
return true
}
// GithubIssueOpenerConfigMeta returns the metadata for GitHub Issue Opener action configuration fields
func GithubIssueOpenerConfigMeta() []config.Field {
return []config.Field{
{
Name: "token",
Label: "GitHub Token",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub API token with repository access",
},
{
Name: "repository",
Label: "Repository",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository name",
},
{
Name: "owner",
Label: "Owner",
Type: config.FieldTypeText,
Required: true,
HelpText: "GitHub repository owner",
},
{
Name: "customActionName",
Label: "Custom Action Name",
Type: config.FieldTypeText,
HelpText: "Custom name for this action",
},
}
}

Some files were not shown because too many files have changed in this diff.