Generate requirements[-dev].txt from Pipfile (using pipenv). Used in my modern Python module project cookiecutter template.
- Free software: MIT
- Documentation: https://pipenv-to-requirements.readthedocs.org/en/latest/
- Source: https://github.com/gsemet/pipenv-to-requirements
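To make the tool available inside your project's virtualenv, a minimal setup (assuming the PyPI package name matches the repository name) is:
# Install pipenv-to-requirements as a development dependency of your project
pipenv install --dev pipenv-to-requirements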
Pipfile and its sibling Pipfile.lock are clearly superior tools for defining the dependencies of a package. Pipfile is meant to be maintained by the package's developers, while Pipfile.lock represents an exact image of what is currently installed on the current system, guaranteeing full reproducibility of the setup. See more information about the Pipfile format here. Most of the time, Pipfile.lock should be ignored (i.e., not tracked in your git repository) for packages published on PyPI.
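For instance, a typical pipenv workflow that maintains both files might look like this (package names are only examples):
# Add a runtime dependency; pipenv records it under [packages] in Pipfile
pipenv install requests
# Add a development-only dependency; recorded under [dev-packages]
pipenv install --dev pytest
# Resolve and pin the whole dependency graph into Pipfile.lock
pipenv lock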
pipenv is a great tool for maintaining Pipfile, but developers might be stuck with backward-compatibility issues for tools and services that still use requirements.txt and do not know how to handle Pipfile or Pipfile.lock yet.
For example:
- Read the Docs
- Pyup (experimental support is arriving)
- Any library that uses PBR (*)
- pip install (if you install a package with pip that does not have a requirements.txt, its dependencies won't be installed, even if you use Pipfile; see the sketch after this list)
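As a hedged sketch of that last point (the package and file names below are hypothetical), the symptom looks like this:
# A PBR-based package built without a requirements.txt at its root...
pipenv run python setup.py sdist
# ...installs fine, but pulls in none of its own dependencies
pip install dist/mypackage-1.0.0.tar.gz
pip show mypackage    # the "Requires:" line comes back empty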
(*): for the moment, I recommend generating at least a requirements.txt (without version freezing) for the libraries using PBR that you publish on PyPI, and committing this file into your git history.
Remember that PBR automatically synchronizes the content of the requirements.txt found at the root of your package with the install_requires section of your package's setup.py. This allows automatic installation of all production dependencies when "pip-installing" your package. Without this file, your package would still be installed by pip, but without its own dependencies.
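A rough end-to-end sketch of that workflow for a PBR-based library (same hypothetical package name as above):
# Generate a non-frozen requirements.txt at the repository root
pipenv run pipenv_to_requirements
# PBR reads requirements.txt at build time and fills install_requires from it
pipenv run python setup.py sdist
# Installing the resulting sdist now also installs the declared dependencies
pip install dist/mypackage-1.0.0.tar.gz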
Native Pipfile support may be added to PBR in the future (see this patch).
For build reproducibility, I also recommend checking in your lock file even for libraries, so that your CI won't fail when a new package is published on PyPI.
Just before building the source/binary/wheel packages of your Python module, run one of the following commands:
To generate requirements files (i.e., dependencies possibly described by version ranges), typically for libraries:
pipenv run pipenv_to_requirements
To generate frozen requirements (i.e., all dependencies have their versions pinned), typically for applications:
pipenv run pipenv_to_requirements -f
It will generate requirements.txt and, if applicable, requirements-dev.txt, in the current directory.
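As a rough illustration of the difference between the two modes (package names and versions below are invented, and the exact header emitted by the tool may differ):
$ pipenv run pipenv_to_requirements
$ cat requirements.txt
requests
click>=6.0

$ pipenv run pipenv_to_requirements -f
$ cat requirements.txt
requests==2.18.4
click==6.7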
Also possible:
pipenv run pipenv_to_requirements -d requirements-dev-custom.txt
pipenv run pipenv_to_requirements -d requirements-dev-custom.txt -f
pipenv run pipenv_to_requirements -o requirements-custom.txt
pipenv run pipenv_to_requirements -o requirements-custom.txt -f
pipenv run pipenv_to_requirements -d requirements-dev-custom.txt -o requirements-custom.txt -f
Example using a Makefile:
dev:
    pipenv install --dev
    pipenv run pip install -e .

dists: requirements sdist bdist wheels

requirements:
    # For a library, use:
    pipenv run pipenv_to_requirements
    # For an application, use:
    # pipenv run pipenv_to_requirements -f

sdist: requirements
    pipenv run python setup.py sdist

bdist: requirements
    pipenv run python setup.py bdist

wheels: requirements
    pipenv run python setup.py bdist_wheel
Just use make requirements to refresh the requirements.txt.
Simply commit these files in your tree so that Read the Docs can use them, and ensure they stay synchronized each time you change your Pipfile. Do not forget to ask Read the Docs to use requirements-dev.txt when building the documentation.
This package has been bootstrapped with Gsemet's [Python-module-cookiecutter](https://github.com/gsemet/python-module-cookiecutter).
Create your development environment with
$ make dev
Execute unit tests:
$ make test
Code formatter:
$ make style
Code style checks:
$ make check
Build distribution packages with
$ make dists