
# aichat-dev-docker

This is a sigoden/aichat configuration for use with Docker.

## Installation

  1. Install Docker, Docker Compose, and Docker Buildx in your environment.
  2. Clone this repository and all of its submodules:

    ```bash
    git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
    ```

  3. Build llm-functions-docker. Refer to `llm-functions-docker/README.md` inside the project for a detailed guide.

  4. cd into this project directory:

    ```bash
    cd aichat-dev-docker
    ```

  5. Adjust `config/config.yml` and `config/roles/tools.md` to your preferences, or keep the defaults. aichat is configured to use DeepInfra as the model host and the `tools` role for tool usage.

  6. Copy `docker-config-example.yml` to `docker-compose.yml` and adjust it to your preferences, or keep the defaults. It uses the directories `work`, `projects`, and `config` relative to the project directory. To integrate it with a host installation, you might want to mount `projects` to your local projects folder and `config` to `~/.config/aichat`, so that host and container share the same configuration.
  7. Collect the required API keys and place them in `.env` files.

    `config/.env` example:

    ```conf
    DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
    DEEPINFRA_API_KEY=[...]
    ```

    `DEEPINFRA` is the client name as set in `config/config.yml`.

    `llm-functions-docker/.env` example:

    ```conf
    TAVILY_API_KEY=[...]
    ```

    Optional; required only if you want to use the `web_search` tool via Tavily.

  8. Build the Docker image:

    ```bash
    docker compose build
    ```
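
For orientation, the DeepInfra setup mentioned in step 5 might look roughly like the following in `config/config.yml`. This is a sketch, not the shipped file: the model name is a placeholder, and aichat can also pick up the API base and key from the `DEEPINFRA_*` variables set in step 7.

```yaml
# Sketch of a DeepInfra client entry for aichat (values are examples):
model: deepinfra:some-model-name   # placeholder, not a real model id
clients:
  - type: openai-compatible
    name: deepinfra
    api_base: https://api.deepinfra.com/v1/openai
```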

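The mounts described in step 6 could be sketched as follows. The service name and container-side paths here are assumptions for illustration, so adjust them to match the actual `docker-compose.yml`:

```yaml
# Hypothetical docker-compose.yml excerpt showing the volume layout:
services:
  aichat:
    build: .
    volumes:
      - ./work:/root/work                 # scratch space inside the container
      - ./projects:/root/projects         # or a host projects folder
      - ./config:/root/.config/aichat     # or ~/.config/aichat to share config
    env_file:
      - ./config/.env
```
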
## Usage

  1. Start the Docker container:

    ```bash
    docker compose up -d
    ```

  2. Run aichat inside the Docker container.

    If you want to use tools, you have to enable the role explicitly:

    ```bash
    docker exec -it aichat aichat --role tools
    ```

    If you don't want to use tools at all, you can omit the role (or specify a different one):

    ```bash
    docker exec -it aichat aichat
    ```
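
If you run these commands often, a small wrapper can save typing. The sketch below only assembles and prints the `docker exec` command line; it assumes the container is named `aichat`, as in the steps above.

```shell
# aichat_cmd: print the docker exec command used to run aichat in the
# container, forwarding any extra aichat flags (e.g. --role tools).
aichat_cmd() {
  printf 'docker exec -it aichat aichat %s\n' "$*"
}

aichat_cmd --role tools
```

To execute instead of print, change the function body to `docker exec -it aichat aichat "$@"`, or define a shell alias along the same lines.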