How to Migrate your Python & Django Projects to uv

I recently migrated a legacy project from requirements files to uv. I undertook the migration in hopes of streamlining the setup process and helping ensure that the versions of packages installed locally, in CI/CD, and in production are all consistent.

uv manages everything about your Python environment, so I found it's best to start with uv's approach and integrate other tools as needed.

Local development

To migrate a legacy project to uv, I followed these steps.

  1. First, I added a project definition to our project's pyproject.toml:

    [project]
    name = "my-product"
    version = "1.2.3"
    description = "Our amazing product."
    readme = "README.md"
    requires-python = "~=3.12"
    dependencies = []
    
  2. Then, I moved the requirements from our pre-existing requirements files to the project dependencies and removed the old files:

    uv add -r requirements/base.txt
    uv add -r requirements/dev.txt --group dev
    uv add -r requirements/deploy.txt --group deploy
    git rm requirements/*.txt
    

    This adds the base requirements to the dependencies list in pyproject.toml, and the dev and deploy requirements to the dev and deploy groups, respectively.
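After these commands, pyproject.toml ends up with the dependencies and groups filled in. Roughly like this (the package names and version pins here are illustrative, not our actual list):

```toml
[project]
name = "my-product"
version = "1.2.3"
description = "Our amazing product."
readme = "README.md"
requires-python = "~=3.12"
dependencies = [
    "django>=4.2",
]

# uv writes the dev and deploy groups to the PEP 735 dependency-groups table
[dependency-groups]
dev = [
    "pre-commit>=4.0",
]
deploy = [
    "gunicorn>=23.0",
]
```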

  3. Next, I installed and pinned a Python version, and synced the dependencies:

    uv python install 3.12
    uv python pin 3.12
    uv sync
    

    This installs a Python 3.12 interpreter, ensures that uv uses Python 3.12 for the current directory and all subdirectories (through the .python-version file it creates), and downloads and installs the necessary dependencies in a .venv virtual environment that uv manages for you.

    Note that uv installs your dev group by default, but not other groups (this can be customized with default-groups). You can include a group name on the command line to also install that group's dependencies. For example, if I install the deploy group, uv installs the gunicorn package alongside the already-installed packages for my project:

    ❯ uv sync --locked --group deploy
    Resolved 223 packages in 6ms
    Installed 1 package in 7ms
    + gunicorn==23.0.0
    

    It's worth noting that (as in the Dockerfile example below) you need to specify --no-dev when calling uv sync in your deployed environment (Docker-based or otherwise); if you don't, your deployed environment will include all of your local development dependencies.
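If you'd rather have uv sync install other groups by default, the default-groups setting mentioned above goes in pyproject.toml. A sketch:

```toml
[tool.uv]
# Groups that `uv sync` installs when no --group flags are given
default-groups = ["dev", "deploy"]
```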

  4. Finally, I use direnv for managing environment variables, so I updated my .envrc to remove the old layout python command and add the new venv to my PATH:

    sed -e '/layout python/ s/^#*/#/' -i .envrc
    echo 'export PATH="$(pwd)/.venv/bin:${PATH}"' >> .envrc
    direnv allow
    

    Note that if you installed uv in the same terminal session, you may need to restart your terminal first; otherwise, direnv allow might wipe out the PATH changes that the uv installation script made and prevent you from running uv.
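The two commands above comment out the old layout python line and append the PATH export, so the relevant part of .envrc ends up looking like this:

```shell
#layout python
export PATH="$(pwd)/.venv/bin:${PATH}"
```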

Updating the project's Dockerfile

uv comes with a nice uv-docker-example that shows how to integrate uv with Docker.

In our case:

  1. I changed the base image to use the uv image:

    FROM ghcr.io/astral-sh/uv:python3.12-bookworm
    
  2. I enabled bytecode compilation, set the link mode to copy from the cache, and set a custom virtual environment path:

    # Enable bytecode compilation
    ENV UV_COMPILE_BYTECODE=1
    
    # Copy from the cache instead of linking since it's a mounted volume
    ENV UV_LINK_MODE=copy
    
    # Use a custom VIRTUAL_ENV with uv to avoid conflicts with local developer's
    # .venv/ while running tests in Docker
    ENV VIRTUAL_ENV=/venv
    

    Setting a custom VIRTUAL_ENV is not recommended in the uv example, but I found it necessary in our case to avoid conflicts with the local developer's .venv/ when running tests inside the Docker container (our project supports running tests locally both with and without Docker).

  3. In our build step that installs requirements, I mounted the cache and the necessary files as recommended in the example, and adjusted the uv sync command to target my custom virtual environment path:

    ARG UV_OPTS="--no-dev --group deploy"
    RUN --mount=type=cache,target=/root/.cache/uv \
        --mount=type=bind,source=uv.lock,target=uv.lock \
        --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
        set -ex \
        && BUILD_DEPS=" \
        build-essential \
        git \
        libpq-dev \
        " \
        && apt-get update && apt-get install -y --no-install-recommends $BUILD_DEPS \
        && uv venv $VIRTUAL_ENV \
        && uv sync --active --locked --no-install-project $UV_OPTS \
        && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false $BUILD_DEPS \
        && rm -rf /var/lib/apt/lists/*
    
    # Add uv venv to PATH
    ENV PATH="$VIRTUAL_ENV/bin:$PATH"
    
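Since that sync uses --no-install-project, the dependency layer doesn't yet contain the project itself. The uv-docker-example follows up by copying the source and syncing again; adapted to this setup, it would look roughly like this (the /app path is illustrative):

```dockerfile
# Copy the project source after the dependency layer, so code changes
# don't invalidate the cached dependency install
COPY . /app
WORKDIR /app

# Install the project itself into the same virtual environment
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --active --locked $UV_OPTS
```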

I hit an odd qemu/Docker bug with bytecode compilation during this process, but fortunately it happened only on a self-hosted runner that we control, so I was able to switch from the docker-container driver to the docker driver and avoid the issue altogether.

Integrating pre-commit

Our project also uses pre-commit for managing code quality checks and formatting.

I integrated the pre-commit check for the uv.lock file by adding the following to our .pre-commit-config.yaml:

repos:
  - repo: https://github.com/astral-sh/uv-pre-commit
    # uv version.
    rev: 0.7.12
    hooks:
      - id: uv-lock

This makes sure that when the dependencies are updated, the uv.lock file is also updated.

There are a number of other pre-commit hooks available for uv which may be useful for your project.
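For example, at the time of writing the same repo also ships a uv-export hook, which keeps an exported requirements.txt in sync with uv.lock for tools that can't read the lock file directly. A sketch:

```yaml
repos:
  - repo: https://github.com/astral-sh/uv-pre-commit
    # uv version.
    rev: 0.7.12
    hooks:
      - id: uv-lock
      # Export uv.lock to requirements.txt for tools that need one
      - id: uv-export
```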

Integrating GitHub Actions

I switched to using setup-uv to install and manage Python and our project's requirements:

name: CI

# <snip>

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
        with:
          python-version: "3.12"
          enable-cache: true
          cache-dependency-glob: "uv.lock"
      - run: uv sync --locked
      - run: uv run pre-commit run --show-diff-on-failure --color=always --all-files
      - run: uv run ... # your test command here

Our CI workflow is more complex than is worth sharing here, but hopefully this snippet helps demonstrate how you can use setup-uv in place of setup-python to manage your Python environment in CI/CD.

Local setup instructions (documentation!)

Last but certainly not least, I updated our project's local setup instructions to work both for new developers creating a fresh environment, and for pre-existing local environments that need to be migrated to use uv.

Additionally, we found that the workflows for updating dependencies are still evolving, so we added some documentation to our project repo to explain these steps.

For example, if you need to update a transitive dependency that's not listed directly in pyproject.toml, you can run the following commands to update the uv.lock file and sync your local venv:

uv lock --upgrade-package <package-name>
uv sync --locked

We also found it helpful, from time to time, to regenerate the uv.lock file with all the indirect dependencies upgraded. You can do this by running:

uv lock --upgrade
uv sync --locked

My colleagues Simon and Mariatta reviewed the changes and tested the setup instructions on their own machines, and they provided valuable feedback that helped refine the documentation.

Conclusion

We're still early in the migration process (the related PR hasn't been merged yet as I write this!), but my initial impressions are that uv will help streamline the Python setup process for our project. As an added bonus, the base Docker image is hosted on GitHub Container Registry, which as far as I can tell does not (yet) enforce a rate limit on pulls.

I hope this has provided a helpful outline for testing uv in your projects. If you have any questions or suggestions, feel free to reach out to us directly!
