- **Prepare LLM**: Follow the instructions here to set up the LLM locally.
- **Install dependencies**: Navigate to the `support-ai` directory and run `make`.
- **Modify `config.yaml`**: Adjust the settings in the `config.yaml` file to match your environment.
- **Execute `ai-bot`**: Run the `ai-bot` script to start testing.
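Assuming the repository has already been cloned and the prerequisites above are in place, the steps can be sketched as the following shell session. The clone URL and the `./ai-bot` invocation path are assumptions based on this README; adjust them for your environment.

```shell
# Clone the repository and enter it (URL assumed from the repo name)
git clone https://github.com/canonical/support-ai.git
cd support-ai

# Install dependencies via the provided Makefile
make

# Adjust config.yaml to match your environment before running
"${EDITOR:-nano}" config.yaml

# Start the bot to begin testing (invocation path is an assumption)
./ai-bot
```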