# Docker Deployment
Running Denkflow in Docker requires careful handling of the data directory for persistence, especially with `OneTimeLicenseSource` or TensorRT caching.
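To illustrate what the data directory is for, here is a minimal sketch of resolving a persistent directory from the environment. The variable name `DENKFLOW_DATA_DIRECTORY` is used later on this page; the helper function and the fallback path are hypothetical, not part of denkflow's API:

```python
import os
from pathlib import Path

def resolve_data_dir(env=os.environ, default="/app/persistent_data"):
    """Return the persistent data directory, preferring the env var.

    Hypothetical helper for illustration; denkflow resolves this internally.
    """
    path = Path(env.get("DENKFLOW_DATA_DIRECTORY", default))
    path.mkdir(parents=True, exist_ok=True)  # create it if missing
    return path
```

If the environment variable points inside a mounted volume, anything written under the returned path survives container restarts.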
## Sample Dockerfile (TensorRT Jetson)
```dockerfile
FROM nvcr.io/nvidia/l4t-jetpack:r36.4.0
WORKDIR /denkflow

# uv for Python management
ADD https://astral.sh/uv/install.sh /uv-installer.sh
RUN sh /uv-installer.sh && rm /uv-installer.sh
ENV PATH="/root/.local/bin/:$PATH"

# Create the virtual environment. (Activating it in its own RUN layer has no
# effect across layers; uv pip targets ./.venv automatically.)
RUN uv venv -p 3.10.12

# Install denkflow. Pass the GitLab PAT at build time instead of
# hard-coding it in the Dockerfile:
#   docker build --build-arg GITLAB_PAT=<your-pat> ...
ARG GITLAB_PAT
RUN uv pip install --upgrade "denkflow[tensorrt]" \
    --index-url "https://__token__:${GITLAB_PAT}@gitlab.com/api/v4/projects/69262737/packages/pypi/simple"

# Entrypoint activates the virtual environment and starts an interactive shell
ENTRYPOINT ["/bin/bash", "-c", "source .venv/bin/activate && exec /bin/bash"]
```
## Build the Docker image

Supply the GitLab PAT as a build argument rather than committing it to the Dockerfile:

```shell
docker build --build-arg GITLAB_PAT=<your-pat> -t denkflow:latest .
```
## Run the container

Mount a host directory into the container and set `DENKFLOW_DATA_DIRECTORY` to a path inside the mounted volume.
```shell
# Create a directory on the host for persistent data
mkdir -p ./denkflow_persistent_data

# Run the container:
#  -v mounts the host directory into the container
#  -e points DENKFLOW_DATA_DIRECTORY at the mounted path
docker run --runtime nvidia -it \
    -v "$(pwd)/denkflow_persistent_data:/app/persistent_data" \
    -e DENKFLOW_DATA_DIRECTORY=/app/persistent_data \
    denkflow:latest
```
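Before starting any workload inside the container, it can be worth verifying that the mounted directory is actually writable; a silent mount or permission problem would otherwise surface only later. This is a generic check, not a denkflow API:

```python
import uuid
from pathlib import Path

def check_writable(directory):
    """Return True if a file can be created and deleted in `directory`."""
    probe = Path(directory) / f".write-probe-{uuid.uuid4().hex}"
    try:
        probe.write_text("ok")
        probe.unlink()
        return True
    except OSError:
        return False
```

For example, `check_writable("/app/persistent_data")` should return `True` inside a correctly configured container.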
This setup ensures that license state (`OneTimeLicenseSource`) and caches (TensorRT) are saved in `./denkflow_persistent_data` on the host machine and persist across container restarts.
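The same mount and environment variable can also be expressed as a Docker Compose service. This is a sketch; the service name and file layout are illustrative:

```yaml
services:
  denkflow:
    image: denkflow:latest
    runtime: nvidia
    stdin_open: true
    tty: true
    environment:
      DENKFLOW_DATA_DIRECTORY: /app/persistent_data
    volumes:
      - ./denkflow_persistent_data:/app/persistent_data
```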