author     Leonard Kugis <leonard@kug.is>  2026-02-01 23:09:12 +0100
committer  Leonard Kugis <leonard@kug.is>  2026-02-01 23:09:12 +0100
commit     8cd5330195a46433f66a5134900203e6f45cc367 (patch)
tree       45e50f218055d8c6e2b7c65880ba133699c3556a
parent     c80be920eedb13d0fdf60f2ef4d64bdb30d7d871 (diff)
download   aichat-dev-docker-8cd5330195a46433f66a5134900203e6f45cc367.tar.gz
README: Fixed indentation
-rw-r--r--  README.md  95
1 file changed, 47 insertions(+), 48 deletions(-)
diff --git a/README.md b/README.md
index 15ae3e2..b706448 100644
--- a/README.md
+++ b/README.md
@@ -6,64 +6,63 @@ This is a `sigoden/aichat` configuration to be used with docker.
1. Install `docker`, `docker-compose`, `docker-buildx` in your environment
2. Clone this repository and all of its submodules:
-
- ```bash
- git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
- ```
-
+
+ ```bash
+ git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
+ ```
+
3. Build `llm-functions-docker`. Refer to the [llm-functions-docker/README.md](llm-functions-docker/README.md) inside the project for a detailed guide.
4. `cd` into this project directory:
-
- ```bash
- cd aichat-dev-docker
- ```
-
+
+ ```bash
+ cd aichat-dev-docker
+ ```
+
5. Adjust `config/config.yml` and `config/roles/tools.md` to your preferences or keep the defaults. Aichat is configured to use Deepinfra as the model host and the `tools` role for tool usage.
6. Copy `docker-config-example.yml` to `docker-compose.yml` and adjust it to your preferences or keep the defaults. It uses the directories `work`, `projects`, and `config` relative to the project directory. To integrate it into a host installation, you might want to mount `projects` to your local projects folder and `config` to `~/.config/aichat`, so that host and docker share the same config.
7. Collect required API keys and place them in `.env` files.
-
- `config/.env` example:
-
- ```conf
- DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
- DEEPINFRA_API_KEY=[...]
- ```
-
- `DEEPINFRA` being the client name as in `config/config.yml`.
-
- `llm-functions-docker/.env` example:
-
- ```conf
- TAVILY_API_KEY=[...]
- ```
-
-Optional. Required, if you want to use `web_search` tool using Tavily.
+
+ `config/.env` example:
+
+ ```conf
+ DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
+ DEEPINFRA_API_KEY=[...]
+ ```
+
+ Here, `DEEPINFRA` is the client name as defined in `config/config.yml`.
+
+ `llm-functions-docker/.env` example:
+
+ ```conf
+ TAVILY_API_KEY=[...]
+ ```
+
+ Optional; required if you want to use the `web_search` tool via Tavily.
8. Build the docker image:
-
- ```bash
- docker compose build
- ```
+
+ ```bash
+ docker compose build
+ ```
## Usage
1. Start the docker container:
-
- ```bash
- docker compose up -d
- ```
-
+
+ ```bash
+ docker compose up -d
+ ```
+
2. Run aichat inside the docker container.
+
+ If you want to use tools, you have to enable the role explicitly:
+
+ ```bash
+ docker exec -it aichat aichat --role tools
+ ```
+
+ If you don't want to use tools at all, you can omit this role (or define a different one):
- If you want to use tools, you have to enable the role explicitly:
-
- ```bash
- docker exec -it aichat aichat --role tools
- ```
-
- If you dont want to use tools at all, you can omit this role (or define a different one):
-
-
- ```bash
- docker exec -it aichat aichat
- ```
+ ```bash
+ docker exec -it aichat aichat
+ ```
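
For reference, the mounts described in step 6 of the patched README could look roughly like the following sketch. This is an assumption, not the actual `docker-config-example.yml` (which is not part of this diff); the service name, build context, and container-side paths are hypothetical, though the container name `aichat` matches the `docker exec -it aichat` commands above.

```yaml
# Hypothetical docker-compose.yml sketch based on step 6 of the README.
# Service/image details and container-side paths are assumptions.
services:
  aichat:
    build: .
    container_name: aichat   # matches `docker exec -it aichat ...` in the Usage section
    stdin_open: true         # keep STDIN open so interactive exec sessions work
    tty: true
    env_file:
      - config/.env          # DEEPINFRA_API_BASE / DEEPINFRA_API_KEY from step 7
    volumes:
      - ./work:/work
      - ./projects:/projects            # or point at your local projects folder
      - ./config:/root/.config/aichat   # or ~/.config/aichat to share config with the host
```

Mounting `config` into the container at aichat's config path is what lets the host and the container share one `config.yml`, as the README suggests.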