Self-hosting Tracecat lets you keep all data on your own infrastructure and network. Choose one of the deployment options listed below to get started.

Interested in using open source LLMs (e.g. llama3.1) in Tracecat's AI actions? Check out our guide on deploying self-hosted LLMs with Tracecat's Ollama Docker Compose extension.