Prerequisites

  • Basic Tracecat deployment with Docker Compose
  • Minimum 10GB of disk space

Instructions

Tracecat supports self-hosted LLMs through Ollama.

Deploying self-hosted LLMs is resource intensive, with large downloads and large model weights: the Ollama Docker image alone is over 1.5GB, and model weights vary greatly in size. Only models smaller than 5GB are currently supported.

Supported models:

1. Configure open source models

Specify the open source models you wish to use in Tracecat by setting the TRACECAT__PRELOAD_OSS_MODELS environment variable in the .env file. These models are downloaded and loaded into Ollama when the stack starts.

For example, to preload the llama3.2 model, set the following:

TRACECAT__PRELOAD_OSS_MODELS=llama3.2
2. Deploy

Deploy Tracecat with the Ollama Docker Compose extension:

docker compose -f docker-compose.yml -f docker-compose.ollama.yml up -d
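For orientation, a Compose extension file like docker-compose.ollama.yml typically adds an Ollama service alongside the core stack. The snippet below is an illustrative sketch, not Tracecat's actual file — the `ollama/ollama` image, port 11434, and the named volume for model weights are standard Ollama conventions, but the real file's service names and networking may differ:

```yaml
services:
  ollama:
    image: ollama/ollama        # official Ollama image (1.5GB+)
    ports:
      - "11434:11434"           # Ollama's default API port
    volumes:
      - ollama:/root/.ollama    # persist downloaded model weights across restarts

volumes:
  ollama:
```

Persisting `/root/.ollama` in a named volume is what keeps the multi-gigabyte model downloads from being repeated on every redeploy — hence the disk-space prerequisite above.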
3. AI Action

You can now use Tracecat’s AI action to call your preloaded open source LLMs. For example, to call the llama3.2 model, you can specify the following arguments:
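Under the hood, calls to a preloaded model go over Ollama's HTTP API. As a rough illustration of what such a request looks like at that layer — this is a direct Ollama `/api/chat` request body, not Tracecat's action argument schema; the endpoint and default port 11434 are Ollama conventions — a minimal sketch:

```python
import json

# Build the request body for Ollama's /api/chat endpoint.
# (Illustrative only -- Tracecat's AI action constructs this for you.)
payload = {
    "model": "llama3.2",  # must match a model preloaded via TRACECAT__PRELOAD_OSS_MODELS
    "messages": [
        {"role": "user", "content": "Summarize this alert in one sentence."}
    ],
    "stream": False,  # ask for a single JSON response instead of a token stream
}

body = json.dumps(payload)
print(body)

# To send it against a running deployment (not executed here):
#   curl http://localhost:11434/api/chat -d @- <<< "$body"
```

If the model named in the request was not preloaded, Ollama returns an error rather than downloading it on demand, so the model name here must match the .env configuration from step 1.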