author    Leonard Kugis <leonard@kug.is>  2026-02-01 23:00:07 +0100
committer Leonard Kugis <leonard@kug.is>  2026-02-01 23:00:07 +0100
commit    c80be920eedb13d0fdf60f2ef4d64bdb30d7d871 (patch)
tree      292f9dbb52f59201cee2b2f488680342981bb0f3
parent    812cc11f4273f099fa52d014adaabbd1cebf7ab9 (diff)
download  aichat-dev-docker-c80be920eedb13d0fdf60f2ef4d64bdb30d7d871.tar.gz
README: Fixed list indentation
-rw-r--r--  README.md | 92
1 file changed, 48 insertions(+), 44 deletions(-)
diff --git a/README.md b/README.md
index b12aea8..15ae3e2 100644
--- a/README.md
+++ b/README.md
@@ -6,60 +6,64 @@ This is a `sigoden/aichat` configuration to be used with docker.
1. Install `docker`, `docker-compose`, `docker-buildx` in your environment
2. Clone this repository and all of its submodules:
-```bash
-git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
-```
+
+ ```bash
+ git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
+ ```
+
3. Build `llm-functions-docker`. Refer to the [llm-functions-docker/README.md](llm-functions-docker/README.md) inside the project for a detailed guide.
4. `cd` into this project directory:
-```bash
-cd aichat-dev-docker
-```
+
+ ```bash
+ cd aichat-dev-docker
+ ```
+
5. Adjust `config/config.yml` and `config/roles/tools.md` to your preferences or keep the defaults. Aichat is configured to use Deepinfra as model host, and role `tools` for tool usage.
6. Copy `docker-config-example.yml` to `docker-compose.yml` and adjust it to your preferences or keep the defaults. It will use directories `work`, `projects` and `config` relative to the project directory. To integrate it into a host installation, you might want to mount `projects` to your local projects folder, and `config` to `~/.config/aichat` to use the same config on host and docker.
7. Collect required API keys and place them in `.env` files.
-
-`config/.env` example:
-
-```conf
-DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
-DEEPINFRA_API_KEY=[...]
-```
-
-`DEEPINFRA` being the client name as in `config/config.yml`.
-
-`llm-functions-docker/.env` example:
-
-```conf
-TAVILY_API_KEY=[...]
-```
-
+
+ `config/.env` example:
+
+ ```conf
+ DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
+ DEEPINFRA_API_KEY=[...]
+ ```
+
+ Here, `DEEPINFRA` is the client name as defined in `config/config.yml`.
+
+ `llm-functions-docker/.env` example:
+
+ ```conf
+ TAVILY_API_KEY=[...]
+ ```
+
Optional; required only if you want to use the `web_search` tool via Tavily.
8. Build the docker image:
-
-```bash
-docker compose build
-```
+
+ ```bash
+ docker compose build
+ ```
## Usage
1. Start the docker container:
-
-```bash
-docker compose up -d
-```
-
+
+ ```bash
+ docker compose up -d
+ ```
+
2. Run aichat inside the docker container.
-
-If you want to use tools, you have to enable the role explicitly:
-
-```bash
-docker exec -it aichat aichat --role tools
-```
-
-If you dont want to use tools at all, you can omit this role (or define a different one):
-
-
-```bash
-docker exec -it aichat aichat
-```
+
+ If you want to use tools, you have to enable the role explicitly:
+
+ ```bash
+ docker exec -it aichat aichat --role tools
+ ```
+
+ If you don't want to use tools at all, you can omit this role (or define a different one):
+
+
+ ```bash
+ docker exec -it aichat aichat
+ ```
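For context on step 6 of the patched README: a `docker-compose.yml` along the lines it describes might look roughly like the sketch below. This is an illustration only, not the repository's file — the service name, image build context, and container-side mount paths are assumptions, and `docker-config-example.yml` in the repository remains authoritative. The host-side directories (`work`, `projects`, `config`) and the `aichat` container name are taken from the README text above.

```yaml
# Sketch only: container paths and options are illustrative assumptions.
services:
  aichat:
    build: .
    container_name: aichat            # matches `docker exec -it aichat ...` above
    stdin_open: true                  # keep stdin open for interactive sessions
    tty: true
    env_file:
      - ./config/.env                 # DEEPINFRA_* keys from step 7
    volumes:
      - ./work:/root/work             # scratch space
      - ./projects:/root/projects     # or mount your local projects folder
      - ./config:/root/.config/aichat # or ~/.config/aichat to share host config
```

With a file like this in place, `docker compose build`, `docker compose up -d`, and the `docker exec` invocations shown in the diff work as written.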