PyScaffold comes with a lot of elaborate features and configuration defaults to make the most common tasks in developing, maintaining and distributing your own Python package as easy as possible.
Configuration, Packaging & Distribution#
All configuration can be done in
setup.cfg like changing the description,
URL, classifiers, installation requirements and so on as defined by setuptools.
That means in most cases it is not necessary to tamper with setup.py.
The syntax of
setup.cfg is pretty much self-explanatory and well commented,
check out this example or setuptools’ documentation.
Building a distribution of your package in an isolated environment is then just a matter of running:

tox -e build
Alternatively, if you are not a huge fan of isolated builds, or prefer running
the commands yourself, you can execute
python -m build --no-isolation.
Uploading to PyPI#
Uploading your package to PyPI is also just one command away:

tox -e publish
This will first upload your package using TestPyPI, so you can be a good
citizen of the Python world, check/test everything is fine, and then, when you
are absolutely sure the moment has come for your package to shine, you can go
ahead and run
tox -e publish -- --repository pypi . Just
remember that for this to work, you have to first register a PyPI account (and
also a TestPyPI one).
If you prefer to run twine yourself, the equivalent commands are:

pip install twine
twine upload --repository testpypi dist/*
Please note that PyPI does not allow uploading local versions (such as the intermediate development versions setuptools_scm generates for untagged commits) for practical reasons. Thus, you have to create a Git tag before uploading a version
of your distribution. Read more about it in the versioning section below.
If you want to work with namespace packages, you will be glad to hear that
PyScaffold supports the PEP 420 specification for implicit namespaces,
which is very useful to distribute a larger package as a collection of smaller ones.
putup can automatically set up everything you need with the --namespace option. For example, use:
putup my_project --package my_package --namespace com.my_domain
This will create the package my_package inside the namespace com.my_domain.
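Just to illustrate (using the example names from the command above, which are only placeholders), the code then becomes importable as a regular dotted path:

# hypothetical usage with the example names above; the namespace
# directories contain no __init__.py, only my_package does
import com.my_domain.my_package
from com.my_domain import my_package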
Prior to PyScaffold 4.0, namespaces were generated
explicitly with pkg_resources, instead of PEP 420. Moreover, if you
are developing “subpackages” for already existing namespaces, please check
which convention the namespaces are currently following. Different styles of
namespace packages might be incompatible. If you don’t want to update
existing namespace packages to PEP 420, you will probably need to
manually copy the
__init__.py file for the umbrella namespace folder
from an existing project. Additionally, have a look at our FAQ
about how to disable implicit namespaces.
Package and Files Data#
Additional data, e.g. images and text files, that must reside within your package, e.g.
my_project/src/my_package, and are tracked by Git will automatically be included if
include_package_data = True is set in
setup.cfg. In case data files are not packaged, run
git ls-files to check whether they are really tracked by Git.
It is not necessary to have a
MANIFEST.in file for this to work. Just make
sure that all files are added to your repository.
To read this data in your code, use:
from pkgutil import get_data
data = get_data('my_package', 'path/to/my/data.txt')
Starting from Python 3.7 an even better approach is using importlib.resources:

from importlib.resources import read_text, read_binary
data = read_text('my_package.sub_package', 'data.txt')
Note that we need a proper package structure in this case, i.e. directories need an
__init__.py file and must be named as valid Python packages (which follow
the same rules as variable names).
We only specify the file
data.txt; no path is allowed.
The library importlib_resources provides a backport of this feature.
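On Python 3.9 and later you can also use the files() API from importlib.resources, which additionally works for resources nested in sub-directories; a minimal sketch (package and file names are only placeholders):

from importlib.resources import files  # Python 3.9+

# locate and read a resource shipped inside the (sub)package
data = files("my_package.sub_package").joinpath("data.txt").read_text(encoding="utf-8")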
Please keep in mind that the
include_package_data option in
setup.cfg is only
guaranteed to be read when creating a wheel distribution. Other distribution methods might
behave unexpectedly (e.g. always including data files even when
include_package_data = False). Therefore, the best option if you want to have
data files in your repository but not as part of the pip installable package
is to add them somewhere outside the
src directory (e.g. a directory in the root of the project, or inside
tests if you use them for
checks). Additionally you can exclude them explicitly via the
[options.packages.find] exclude option in setup.cfg.
More information about data files support is available in the setuptools documentation.
Using package files to store runtime configuration or mutable data is not considered good practice. Package files should be read-only. If you need configuration files, or files that should be written at runtime, please consider doing so inside standard locations in the user’s home folder (platformdirs is a good library for that). If needed you can even create them at the first usage from a read-only template, which in turn can be a package file.
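As a rough sketch of that pattern (all names below, like my_package and default_config.toml, are made up for illustration), the configuration file can be created in the user's config directory on first use from a read-only template shipped with the package:

from pathlib import Path
from importlib.resources import read_text  # Python 3.7+
from platformdirs import user_config_dir   # pip install platformdirs

def load_config() -> str:
    """Return the user's configuration, creating it from the packaged template on first use."""
    config_file = Path(user_config_dir("my_package")) / "config.toml"
    if not config_file.exists():
        config_file.parent.mkdir(parents=True, exist_ok=True)
        # default_config.toml is a read-only data file shipped inside my_package
        config_file.write_text(read_text("my_package", "default_config.toml"))
    return config_file.read_text()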
Versioning and Git Integration#
Your project is already an initialised Git repository and setuptools uses the
information of tags to infer the version of your project with the help of
setuptools_scm. To use this feature you need to tag with the format
MAJOR.MINOR[.PATCH], e.g. 0.1 or 1.2.3.
You can run
python -m setuptools_scm to retrieve the current PEP 440-compliant version.
This version will be used when building a package and is also accessible through
my_project.__version__. If you want to upload to PyPI you have to tag the current commit
before uploading, since PyPI does not allow local versions (as generated by setuptools_scm for untagged commits)
for practical reasons.
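For reference, a __version__ attribute like this can be wired up with just a few lines reading the installed distribution's metadata; the generated __init__.py uses a similar mechanism (my_project stands for the distribution name declared in setup.cfg):

from importlib.metadata import PackageNotFoundError, version  # Python 3.8+

try:
    __version__ = version("my_project")  # distribution name as declared in setup.cfg
except PackageNotFoundError:  # package is not installed, e.g. running from a plain checkout
    __version__ = "unknown"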
Please check our docs for the best practices and common errors with version numbers.
Unleash the power of Git by using its pre-commit hooks.
This feature is available through the --pre-commit flag.
After your project’s scaffold was generated, make sure pre-commit is installed, e.g. with
pip install pre-commit, then just run pre-commit install.
It goes without saying that a default
.gitignore file is also provided, which is well
adjusted for Python projects and the most common tools.
PyScaffold will prepare a
docs directory with all you need to start writing
your documentation. Start editing the file
docs/index.rst to extend the documentation
and note that even the Numpy and Google style docstrings are activated by default.
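For example, Google-style docstrings such as the one below (a purely illustrative function) are picked up via Sphinx’s napoleon extension and rendered nicely in the API documentation:

def fib(n: int) -> int:
    """Compute the n-th Fibonacci number.

    Args:
        n: index of the Fibonacci number, must be non-negative.

    Returns:
        The n-th Fibonacci number.
    """
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a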
If you have tox in your system, simply run
tox -e docs or
tox -e doctests to build the docs or run the doctests.
Alternatively, if you have make and Sphinx installed on your computer, build the docs with
make -C docs html and run doctests with
make -C docs doctest. Just make sure Sphinx 1.3 or above is installed.
In order to generate the docs locally, you will need to install any
dependency used to build your doc files (and probably all your project dependencies) in
the same Python environment where Sphinx is installed (either the global Python
installation or a conda/virtualenv/venv environment).
For example, if you want to use the Read the Docs classic theme, the
sphinx_rtd_theme package needs to be installed.
If you are using
tox -e docs, tox will take care of generating a
virtual environment and installing all these dependencies automatically.
You will only need to list your doc dependencies (like
sphinx_rtd_theme) under the
deps property of the docs environment in tox.ini.
You can also use the
docs/requirements.txt file to store them.
This file can be used by both Read the Docs and tox
when generating the docs.
Dependency Management in a Breeze#
PyScaffold out of the box allows developers to express abstract dependencies
and take advantage of
pip to manage installation. It also can be used
together with a virtual environment (also called virtual env)
to avoid dependency hell during both development and production stages.
If you like the traditional style of dependency management using a virtual env
co-located with your package, PyScaffold can help to reduce the boilerplate.
With the --venv option, a virtual environment will be bootstrapped and waiting to be
activated. And if you are the kind of person that always installs the same
packages when creating a virtual environment, PyScaffold’s --venv-install
PACKAGE option will be the right one for you. You can even integrate pip-tools in
this workflow, by putting a
-e file:. in your requirements.in.
Alternatively, PyPA’s Pipenv can be integrated in any PyScaffold-generated
project by following standard setuptools conventions. Keeping abstract
dependencies in setup.cfg and running
pipenv install -e . is basically
what you have to do.
You can check the details on how all of that works in Dependency Management.
Experimental Feature - Pipenv and pip-tools support is experimental and might change in the future.
Automation, Tests & Coverage#
PyScaffold relies on pytest to run all automated tests defined in the subfolder
tests. Some sane default flags for pytest are already defined in the
[tool:pytest] section of
setup.cfg. The pytest plugin pytest-cov is used
to automatically generate a coverage report. It is also possible to provide
additional parameters and flags to pytest on the command line (run pytest -h to see all available options).
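Tests themselves are plain pytest modules placed in the tests folder; a tiny, purely illustrative example:

# tests/test_example.py -- a made-up test module
import pytest

def add(a: int, b: int) -> int:
    """Trivial function under test, defined inline only for illustration."""
    return a + b

@pytest.mark.parametrize("a, b, expected", [(1, 2, 3), (2, 2, 4), (-1, 1, 0)])
def test_add(a, b, expected):
    assert add(a, b) == expected

Running tox (or pytest directly) will discover and execute such test_*.py modules automatically.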
JUnit and Coverage HTML/XML#
For usage with continuous integration software, JUnit and Coverage XML output
can be activated in
setup.cfg. Use the flag
--cirrus to generate
templates of the Cirrus CI configuration file
.cirrus.yml which even features the coverage and stats system Coveralls.
Alternatively, you can also generate configuration files for
GitLab CI or GitHub Actions by running
putup with the --gitlab or --github-actions option, respectively.
Managing test environments and tasks with tox#
Projects generated with PyScaffold are configured by default to use tox to run some common tasks. Tox is a virtual environment management and test tool that allows you to define and run custom tasks that call executables from Python packages.
If you simply install tox and run it from the root folder of your project,
tox will download the dependencies you have specified, build the package, install it in a virtual environment and run the tests using pytest, so you are sure everything is properly tested. You can rely on the tox documentation for detailed configuration options (which include the possibility of running the tests for different versions of Python).
You are not limited to running your tests, with tox you can define all sorts of automation tasks. Here are a few examples for you:
tox -e build    # will bundle your package and create a distribution inside the `dist` folder
tox -e publish  # will upload your distribution to a package index server
tox -e docs     # will build your docs
but you can go ahead and check tox examples, or this tox tutorial from
Sean Hammond for more ideas, e.g. running static code analyzers (pyflakes and
pep8) with flake8. Run
tox -av to list all the available tasks.
Management of Requirements & Licenses#
Installation requirements of your project can be defined inside setup.cfg, e.g.
install_requires = numpy; scipy. To avoid package dependency problems
it is common to not pin installation requirements to any specific version,
although minimum versions, e.g.
sphinx>=1.3, and/or maximum versions, e.g.
pandas<0.12, are used frequently in accordance with semantic versioning.
For test/dev purposes, you can additionally create a requirements.txt file,
pinning packages to specific versions.
This helps to ensure reproducibility, but be sure to read our
Dependency Management Guide to understand the role of a
requirements.txt file for library and application projects
(pip-compile from pip-tools can help you to manage that file).
Packages defined in
requirements.txt can be easily installed with:
pip install -r requirements.txt
The most popular open source licenses can be easily added to your project with
the help of the
--license flag. You only need to specify the license identifier
according to the SPDX index so PyScaffold can generate the appropriate
LICENSE.txt and configure your package. For example:
putup --license MPL-2.0 my_project
will create the
my_project package under the Mozilla Public License 2.0.
The available licenses can be listed with
putup --help, and you can find
more information about each license in the SPDX index and choosealicense.com.
PyScaffold offers several extensions:
If you want a project setup for a Data Science task, just use
--dsproject after having installed pyscaffoldext-dsproject.
If you prefer a README.md based on Markdown instead of reStructuredText, just use
--markdown after having installed pyscaffoldext-markdown.
… and many more, like
--gitlab to create the necessary files for GitLab CI,
--github-actions to configure GitHub Actions,
--travis for Travis CI (see pyscaffoldext-travis), or
--cookiecutter for Cookiecutter integration (see pyscaffoldext-cookiecutter).
Find more extensions within the PyScaffold organisation and consider contributing your own;
it is very easy!
You can quickly generate a template for your extension with the
--custom-extension option after having installed pyscaffoldext-custom-extension.
Have a look in our guide on writing extensions to get started.
All extensions can easily be installed with
pip install pyscaffoldext-NAME.
Keep your project’s scaffold up-to-date by applying
putup --update my_project
when a new version of PyScaffold is released.
An update will only overwrite files that are not often altered by users, like
setup.py. To update all files, additionally pass the --force option.
An existing project that was not setup with PyScaffold can be converted with
putup --force existing_project. The force option is completely safe to use
since the git repository of the existing project is not touched!
Please check out the Updating from Previous Versions docs for more information on how to migrate
from old versions and configuration options in setup.cfg.
With the help of an experimental updating functionality it is also possible to
add additional features to your existing project scaffold. If a scaffold lacking
.cirrus.yml was created with
putup my_project it can later be added by issuing
putup my_project --update --cirrus. For this to work, PyScaffold stores all
options that were initially used to put up the scaffold under the [pyscaffold] section of
setup.cfg. Be aware that right now PyScaffold provides no way to
remove a feature which was once added.
After having used PyScaffold for some time, you probably will notice yourself
repeating the same options most of the time for every new project.
Don’t worry, PyScaffold now allows you to set default flags using the
default.cfg file.
Check out our Configuration section to get started.