# aichat-dev-docker

This is a sigoden/aichat configuration to be used with Docker.
## Installation

- Install `docker`, `docker-compose` and `docker-buildx` in your environment.
- Clone this repository and all of its submodules:

  ```bash
  git clone --recurse-submodules https://git.kug.is/aichat-dev-docker.git
  ```
- Build `llm-functions-docker`. Refer to `llm-functions-docker/README.md` inside the project for a detailed guide.
- `cd` into this project directory:

  ```bash
  cd aichat-dev-docker
  ```
- Adjust `config/config.yml` and `config/roles/tools.md` to your preferences or keep the defaults. Aichat is configured to use Deepinfra as the model host and the role `tools` for tool usage.
- Copy `docker-config-example.yml` to `docker-compose.yml` and adjust it to your preferences or keep the defaults. It uses the directories `work`, `projects` and `config` relative to the project directory. To integrate it into a host installation, you might want to mount `projects` to your local projects folder and `config` to `~/.config/aichat`, so that host and Docker share the same configuration.
- Collect the required API keys and place them in `.env` files.

  `config/.env` example:

  ```conf
  DEEPINFRA_API_BASE=https://api.deepinfra.com/v1/openai
  DEEPINFRA_API_KEY=[...]
  ```

  `DEEPINFRA` is the client name as used in `config/config.yml`.

  `llm-functions-docker/.env` example:

  ```conf
  TAVILY_API_KEY=[...]
  ```

  Optional: only required if you want to use the `web_search` tool via Tavily.
- Build the docker image:

  ```bash
  docker compose build
  ```
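Putting the mount suggestions above together, a `docker-compose.yml` adapted for host integration might look like the following sketch. This is an illustration, not the shipped `docker-config-example.yml`: the image build context, the container-internal paths (`/work`, `/projects`, `/root/.config/aichat`) and the host-side paths are assumptions you should adjust to your setup.

```yaml
services:
  aichat:
    build: .
    container_name: aichat            # matches the `docker exec -it aichat ...` commands in Usage
    env_file:
      - ./config/.env                 # DEEPINFRA_* keys from the step above
    volumes:
      - ./work:/work
      - ~/projects:/projects          # or ./projects to stay inside the project directory
      - ~/.config/aichat:/root/.config/aichat   # share the same config on host and in Docker
    stdin_open: true
    tty: true
```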
## Usage

- Start the docker container:

  ```bash
  docker compose up -d
  ```
- Run aichat inside the docker container. If you want to use tools, you have to enable the role explicitly:

  ```bash
  docker exec -it aichat aichat --role tools
  ```

  If you don't want to use tools at all, you can omit this role (or define a different one):

  ```bash
  docker exec -it aichat aichat
  ```
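To avoid typing the full `docker exec` command, you could define a small wrapper function in your shell profile. This is a sketch under the assumption that the container is named `aichat` as in the commands above; the `AICHAT_ROLE` variable is made up for illustration and is not part of aichat itself.

```bash
# Hypothetical host-side wrapper: forwards `aichat` calls into the container.
# Assumes the container from this compose file is running and named "aichat".
aichat() {
    # AICHAT_ROLE is an illustrative environment variable:
    # set it to "tools" to enable the tools role, leave it unset otherwise.
    if [ -n "${AICHAT_ROLE:-}" ]; then
        docker exec -it aichat aichat --role "$AICHAT_ROLE" "$@"
    else
        docker exec -it aichat aichat "$@"
    fi
}

# Usage (requires the container to be up):
#   AICHAT_ROLE=tools aichat
```

Sourcing this from `~/.bashrc` (or similar) makes the containerized aichat feel like a local install.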