16 Dec 2023   software
A few years ago, I was introduced to a piece of software called pip-tools. Since then, I’ve been using it religiously in all of my Python projects. This post is a quick overview of why I think it’s great.
At a high level, pip-tools solves the dependency management problem. Here’s how the problem usually presents itself:
1. pip install the packages you need, then run pip freeze > requirements.txt to keep track of the dependencies.
2. Return to the requirements.txt file later, not quite sure which dependencies (or versions) are necessary to actually run your program.
3. Spend hours running pip uninstall, pip install --upgrade, and pip freeze in a futile attempt to clean up your environment and requirements.txt.
4. Give up and resign yourself to just installing everything everywhere.

Pip-tools solves this problem by separating direct dependencies from transitive dependencies, automatically managing the latter, and providing commands for easily upgrading package versions and syncing the specified versions into your environment.
That was a lot of words. I think a quick demo will help to illustrate.
First, create a virtual environment if necessary and activate it:
python3.10 -m venv venv
source venv/bin/activate
Then install pip-tools:
pip install pip-tools
Then create a pip-tools package configuration file, conventionally called requirements.in. Here’s what it might look like for a project that uses aiohttp, an asynchronous HTTP client/server framework:
aiohttp
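As an aside, the .in file accepts standard pip version specifiers, so if you wanted to keep aiohttp on the 3.x line, for example, you could write something like this instead (the exact range here is just an illustration):
aiohttp>=3.9,<4  # example constraint; pip-compile still chooses the exact pin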
Then, to turn the configuration file into a universally recognized requirements.txt file, simply run:
pip-compile requirements.in
Here’s what the generated requirements.txt might contain:
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile requirements.in
#
aiohttp==3.9.1
    # via -r requirements.in
aiosignal==1.3.1
    # via aiohttp
async-timeout==4.0.3
    # via aiohttp
attrs==23.1.0
    # via aiohttp
frozenlist==1.4.0
    # via
    #   aiohttp
    #   aiosignal
idna==3.6
    # via yarl
multidict==6.0.4
    # via
    #   aiohttp
    #   yarl
yarl==1.9.3
    # via aiohttp
A few things to note:
- pip-compile is reproducible; it will continue to generate the same requirements.txt file, even if new versions of your direct dependencies are available.
- Adding new dependencies never affects existing dependencies.
- You can always explicitly ask pip-tools to upgrade the pinned versions via pip-compile --upgrade.

In short, you’re always in control of your package versions.
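For instance, a hypothetical upgrade session might look like the following (pip-compile also accepts --upgrade-package to bump a single pin; the choice of aiohttp here is just an example):
pip-compile --upgrade requirements.in  # re-pin everything to the newest allowed versions
pip-compile --upgrade-package aiohttp requirements.in  # bump only aiohttp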
The coolest part, in my opinion, comes after the requirements.txt is generated. You can easily install all necessary dependencies, and uninstall unnecessary ones, by running the following:
pip-sync requirements.txt
The “uninstall unnecessary ones” is crucial to ensuring that you’re not unknowingly depending on some package that just-so-happens to be installed on your machine, but not in production. With pip-tools, you can be confident that if your dependencies are happy on your machine, they’ll be happy in other environments, too.
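In practice, the whole loop tends to reduce to a sketch like this (what you change in step one is up to you):
# 1. edit requirements.in to add, remove, or constrain a direct dependency
# 2. re-pin the full dependency set
pip-compile requirements.in
# 3. make the current environment match requirements.txt exactly
pip-sync requirements.txt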
Oh, and one more thing: you can have multiple “requirements” files, one for each environment. So if you wanted Ruff, a Python linter, to also be available at development time, you could create a requirements-dev.in file containing the following lines:
-r requirements.in
ruff
Running pip-compile on it would produce a requirements-dev.txt file that contains both your production dependencies (in this case, just aiohttp) and your development dependencies, ready for pip-sync-ing.
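Concretely, that would be something along the lines of:
pip-compile requirements-dev.in  # produces requirements-dev.txt
pip-sync requirements-dev.txt  # installs aiohttp's pins plus ruff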
I’ll be the first to admit that pip-tools has a non-zero learning curve, but pip-compile and pip-sync quickly become second nature, and the time saved from not having to worry about dependency management adds up fast. If you haven’t done so already, do yourself a favor and check it out.