diff --git a/.github/workflows/cicd.yml b/.github/workflows/cicd.yml index 64abdbe..65b0ba5 100644 --- a/.github/workflows/cicd.yml +++ b/.github/workflows/cicd.yml @@ -5,10 +5,13 @@ name: Python application on: push: + branches: ["main"] pull_request: branches: ["main"] schedule: - cron: "0 2 * * 3" + release: + types: [published] permissions: contents: read @@ -41,7 +44,7 @@ jobs: runs-on: ubuntu-latest strategy: matrix: - python-version: ["3.11", "3.12"] + python-version: ["3.10", "3.11", "3.12", "3.13"] steps: - uses: actions/checkout@v4 - name: Set up Python ${{ matrix.python-version }} diff --git a/README.md b/README.md index 0b2d238..7d6f59b 100644 --- a/README.md +++ b/README.md @@ -20,6 +20,10 @@ You will need to find and replace occurences of and [Creative Commons](https://creativecommons.org/chooser/). Replace [`LICENSE`](LICENSE) file with the license you choose. - Update the `CITATION.cff` file with your information. +- Once you change the code under `src`, please make sure to update + - `docs/tutorial/tutorial.ipynb` (add your usage example) + - `tests/test_mockup.py` (delete file and add your own tests following the + naming conventions `test_.py`) ## Development environment @@ -29,6 +33,19 @@ Install package so that new code is picked up in a restared python interpreter: pip install -e ".[dev]" ``` +Then you want to run locally the commands which are checked in the CI/CD pipeline +(using GitHub Actions). You can type + +```bash +# run unittests +pytest tests +# format code and sort imports +black . +isort . +# lint code and check for obvious errors +ruff check src +``` + ## Basic usage > works using this template diff --git a/developing.md b/developing.md index 4fce3ee..568e867 100644 --- a/developing.md +++ b/developing.md @@ -2,14 +2,14 @@ > Author: Henry Webel -[packaging.python.org](https://packaging.python.org/en/latest/tutorials/packaging-projects/) +[packaging.python.org](https://packaging.python.org/en/latest/tutorials/packaging-projects/) has an excellent tutorial on how to package a Python project. I read and used insights from that website to help create the template which is available on GitHub at [https://github.com/RasmussenLab/python_package](https://github.com/RasmussenLab/python_package) and I want to give here an overview specifically to some details regarding this template. -Some are overlapping with the +Some are overlapping with the [packaging.python.org](https://packaging.python.org/en/latest/tutorials/packaging-projects/) -tutorial, but as always we decided for a certain set of tools, conventions and complexity +tutorial, but as always we decided on a certain set of tools, conventions and complexity which needs some explanation. Here a brief overview of external resources you can also look at: @@ -21,7 +21,7 @@ Here a brief overview of external resources you can also look at: ## Project structure -First an overview of the main folder structure. See line comments for details on what +First an overview of the main folder structure. See line comments for details on what is the purpose of each folder or file: ```bash python_package @@ -40,20 +40,20 @@ python_package ## Core packaging files -We will first look at [`pyproject.toml`](pyproject.toml) and its relation to the -[`src`](src) directory. The +We will first look at [`pyproject.toml`](pyproject.toml) and its relation to the +[`src`](src) directory.
The [`pyproject.toml`](pyproject.toml) file is the main configuration file for the Python package and is used to specify the package metadata, dependencies, build tools and configurations. The [`src`](src) folder stores the actual source code of the package, where the package itself is -the subdirectories of the [`src`](src) directory. The (e.g. `src/python_package`). +the subdirectory of the [`src`](src) directory (e.g. `src/python_package`).
About setup.py and setup.cfg configuration files -The [`setup.py`](setup.py) file is an artefact for backward compatibility and should not -be changed. Everything that used to be in [`setup.py`](setup.py) or +The [`setup.py`](setup.py) file is an artefact for backward compatibility and should not +be changed. Everything that used to be in [`setup.py`](setup.py) or [`setup.cfg`](setup.cfg) is now largely in [`pyproject.toml`](pyproject.toml). -The notable exception would be the desired maximum line length in `setup.cfg` for +The notable exception would be the desired maximum line length in `setup.cfg` for the tool [`flake8`](https://flake8.pycqa.org/), which does not yet support [`pyproject.toml`](pyproject.toml) configuration. As we use `ruff` as linter, we left it empty, but in case you want to use `flake8`, you can add: @@ -71,12 +71,12 @@ aggressive = 2 ### Changes required in `pyproject.toml` You have to change entries under the `[project]` section to match your project name, -description, author, license, etc. Make sure to pick a license that works for you, e.g. +description, author, license, etc. Make sure to pick a license that works for you, e.g. using [choosealicense.com](https://choosealicense.com/). Also update the `LICENSE` file accordingly. -The `dependencies` key can -list the dependencies and is currently commented out. The dependencies could also be +The `dependencies` key can +list the dependencies and is currently commented out. The dependencies could also be specified via a `requirements.txt`, if you already have such a file. ```toml @@ -93,7 +93,7 @@ dynamic = ["version", # version is loaded from the package #"dependencies", # add if using requirements.txt ] readme = "README.md" -requires-python = ">=3.9" +requires-python = ">=3.10" # These are keywords classifiers = [ "Programming Language :: Python :: 3", @@ -114,13 +114,13 @@ The entry dynamic = ["version"] ``` -means that the version is loaded dynamically using the extension +means that the version is loaded dynamically using the extension [setuptools_scm](https://setuptools-scm.readthedocs.io/) -we list under the `[build-system]` section in [`pyproject.toml`](pyproject.toml). -This is done to avoid having to manually update the version and integrate with automatic +we list under the `[build-system]` section in [`pyproject.toml`](pyproject.toml). +This is done to avoid having to manually update the version and to integrate with automatic versioning through releases on GitHub. It also ensures that each commit has a unique version number, which is useful for attributing -errors to specific non-released versions. The dynamic version is picked up in the +errors to specific non-released versions. The dynamic version is picked up in the `__version__` variable in the `__init__.py` file of the package, which is located in the [`src/python_package`](src/python_package) directory. @@ -130,7 +130,7 @@ build-backend = "setuptools.build_meta" requires = ["setuptools>=64", "setuptools_scm>=8"] [tool.setuptools_scm] -# https://setuptools-scm.readthedocs.io/ +# https://setuptools-scm.readthedocs.io/ # used to pick up the version from the git tags or the latest commit.
``` @@ -166,12 +166,12 @@ python-package-hello -n 3 ## Source directory layout of the package -The source code of the package is located in the `src` directory, to have a project +The source code of the package is located in the `src` directory, to have a project independent folder to look for the source code recognized by most tools you would need -to build a package -(read on [packagin namespace packages](https://packaging.python.org/en/latest/guides/packaging-namespace-packages/)). -It also allows to have multiple subpackages or modules -in the same project under the `python_package` package (see example +to build a package +(read on [packaging namespace packages](https://packaging.python.org/en/latest/guides/packaging-namespace-packages/)). +It also allows you to have multiple subpackages or modules +in the same project under the `python_package` package (see example [here](https://setuptools.pypa.io/en/latest/userguide/package_discovery.html#src-layout)). ```bash @@ -181,8 +181,8 @@ in the same project under the `python_package` package (see example │ └── mockup.py # a submodule of the package (import python_package.mockup) ``` -So you will need to rename the `python_package` directory to your package name, -e.g. `my_package` and specify the package name in the [`pyproject.toml`](pyproject.toml) file +So you will need to rename the `python_package` directory to your package name, +e.g. `my_package` and specify the package name in the [`pyproject.toml`](pyproject.toml) file under the `[project]` section: ```toml @@ -191,14 +191,14 @@ name = "my_package" Strictly speaking you can give different names in both places, but this will only confuse potential users. Think of `scikit-learn` for an example of a package that uses a different -name in the [`pyproject.toml`](pyproject.toml) file and the source code directory name, +name in the [`pyproject.toml`](pyproject.toml) file and the source code directory name, leading to the `sklearn` package name when imported. ## Documentation The documentation is created using [Sphinx](https://www.sphinx-doc.org/en/master/), which is common for Python documentation. It relies additionally on several extensions -enabling the use of `markdown` and `jupyter` notebooks. +enabling the use of `markdown` and `jupyter` notebooks. The documentation is located in the [`docs`](docs) directory. Sphinx is configured via the [`conf.py`](docs/conf.py) file, where you can specify the extension you want: @@ -218,12 +218,12 @@ extensions = [ ] ``` -These are added as dependencies through the +These are added as dependencies through the `pyproject.toml` file under the `[project.optional-dependencies]` section: ```toml [project.optional-dependencies] -# Optional dependencies to locally build the documentation, also used for +# Optional dependencies to locally build the documentation, also used for # readthedocs. docs = [ "sphinx", @@ -273,7 +273,7 @@ package on the fly. See the Read The Docs section below for more details. We build the documentation based on the template [sphinx_book_theme](https://sphinx-book-theme.readthedocs.io), which is set in the -[`conf.py`](docs/conf.py) file and parts of our docs requirements in +[`conf.py`](docs/conf.py) file and parts of our docs requirements in [`pyproject.toml`](pyproject.toml): ```python @@ -284,12 +284,12 @@ html_theme = "sphinx_book_theme" > and need to be changed.
Explore other themes here: > [sphinx-themes.org](https://sphinx-themes.org/) -The API of the Python package in the `src` directory is automatically included +The API of the Python package in the `src` directory is automatically included in the documentation using the [`autodoc` extension](https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html). We use per default the `numpydoc` style for docstrings, see the format [here](https://numpydoc.readthedocs.io/en/stable/format.html). -The API documentation can be augmented with highlights from other types from projects +The API documentation can be augmented with links to types from other projects using `intersphinx`: ```python @@ -308,8 +308,8 @@ like `pandas`, `scikit-learn`, or `matplotlib` to the mapping. ### Building the documentation locally (with integration tests) To build the documentation locally, you can follow the instructions in the -[`docs/README.md`](docs/README.md), which you should also update with your name changes. -In short, you can run the following commands in the [`docs`](docs ) directory: +[`docs/README.md`](docs/README.md), which you should also update with your name changes. +In short, you can run the following commands in the [`docs`](docs) directory: ```bash # in root of the project @@ -321,10 +321,10 @@ sphinx-build -n -W --keep-going -b html ./ ./_build/ this will create a `reference` directory with the API documentation of the Python package `python_package`, a `jupyter_execute` for the tutorial in [`docs/tutorial`](docs/tutorial) - and a `_build` directory with an HTML version of the documentation. You can open the +and a `_build` directory with an HTML version of the documentation. You can open the `_build/index.html` file in your browser to view the documentation built locally. -The tutorial related configuration in `conf.py` is the following, specifying that +The tutorial-related configuration in `conf.py` is the following, specifying that errors stop the build process ensuring that examples are tested: ```python @@ -348,22 +348,22 @@ nb_merge_streams = True The tutorials are meant as a sort of integration test, where you make sure that the core functionality your project wants to support is working as expected. For easier github -diffs, we use [`jupytext`](https://jupytext.readthedocs.io), which allows to -have the tutorial in both a Jupyter Notebook format and a Python script format. +diffs, we use [`jupytext`](https://jupytext.readthedocs.io), which allows us to +have the tutorial in both a Jupyter Notebook format and a Python script format. You have to keep the files in sync using: ```bash jupytext --sync docs/tutorial/*.ipynb ``` -The [`docs/tutorial/.jupytext`](docs/tutorial/.jupytext) configuration sets the default +The [`docs/tutorial/.jupytext`](docs/tutorial/.jupytext) configuration sets the default format to `py:percent` and automatically allows syncing of new notebooks. ### Read The Docs To build the documentation on Read The Docs, you need to create a file called -[`.readthedocs.yaml`](.readthedocs.yaml), which is located in the root of the project and -specifies which dependencies are needed. The core is the following specifying where the +[`.readthedocs.yaml`](.readthedocs.yaml), which is located in the root of the project and +specifies which dependencies are needed.
The core is the following, specifying where the [`conf.py`](docs/conf.py) file is and from where to install the required dependencies: ```yaml @@ -379,20 +379,20 @@ python: - docs -You will need to manually register your project repository on +You will need to manually register your project repository on [Read The Docs](https://readthedocs.org/) in order that it can build the documentation -by the service. I recommend to activate builds for Pull Requests, so that -the documentation is built for each PR and you can see if the documentation is gradually +by the service. I recommend activating builds for Pull Requests, so that +the documentation is built for each PR and you can see if the documentation is gradually breaking, i.e. your integration test using the notebooks in [`docs/tutorial`](docs/tutorial) fails. See their documentation -[on adding a project](https://docs.readthedocs.com/platform/stable/intro/add-project.html) +[on adding a project](https://docs.readthedocs.com/platform/stable/intro/add-project.html) for instructions. ## Running tests The tests are located in the `tests` directory and can be run using `pytest`. Pytest is specified as a dependency in the `pyproject.toml` file under the -`[project.optional-dependencies]` section along with the formatter `black` and the +`[project.optional-dependencies]` section along with the formatter `black` and the linter `ruff`: ```toml @@ -414,7 +414,7 @@ read the next section to see how this is automated using `GitHub Actions`. ## GitHub Actions We run these checks also on GitHub using GitHub Actions. The configuration -for the actions is located in the [`.github/workflows`](.github/workflows) directory +for the actions is located in the [`.github/workflows`](.github/workflows) directory and is specified in the `cicd.yml` file.
See the biosustain dsp tutorial on GitHub Actions for more details (or any other resource you find): [biosustain/dsp_actions_tutorial](https://github.com/biosustain/dsp_actions_tutorial) @@ -427,10 +427,13 @@ name: Python application on: push: + branches: ["main"] pull_request: branches: ["main"] schedule: - cron: "0 2 * * 3" + release: + types: [published] permissions: contents: read @@ -462,7 +465,7 @@ jobs: runs-on: ubuntu-latest strategy: matrix: - python-version: ["3.11", "3.12"] + python-version: ["3.10", "3.11", "3.12", "3.13"] steps: - uses: actions/checkout@v4 - name: Set up Python ${{ matrix.python-version }} @@ -480,42 +483,42 @@ jobs: run: python -m pytest tests ``` -This workflow also allows to create `PyPI` releases automatically if you register your +This workflow also allows you to create `PyPI` releases automatically if you register your project on `PyPI` (or `TestPyPI` for testing first) and create a GitHub release: ```yaml - publish: - name: Publish package - if: startsWith(github.ref, 'refs/tags') - needs: - - format - - lint - - test - - build_source_dist - # - build_wheels - runs-on: ubuntu-latest - - steps: - - uses: actions/download-artifact@v4 - with: - name: artifact - path: ./dist - - - uses: pypa/gh-action-pypi-publish@release/v1 - with: - # remove repository key to set the default to pypi (not test.pypi.org) - repository-url: https://test.pypi.org/legacy/ +publish: + name: Publish package + if: startsWith(github.ref, 'refs/tags') + needs: + - format + - lint + - test + - build_source_dist + # - build_wheels + runs-on: ubuntu-latest + + steps: + - uses: actions/download-artifact@v4 + with: + name: artifact + path: ./dist + + - uses: pypa/gh-action-pypi-publish@release/v1 + with: + # remove repository key to set the default to pypi (not test.pypi.org) + repository-url: https://test.pypi.org/legacy/ ``` To set up the [`gh-action-pypi-publish`](https://github.com/pypa/gh-action-pypi-publish) action, you need to register the repository -on [PyPI](https://pypi.org/) or [`TestPyPI`](https://test.pypi.org/), which allows PyPI +on [PyPI](https://pypi.org/) or [`TestPyPI`](https://test.pypi.org/), which allows PyPI and GitHub to communicate securely. See the instructions on [packaging.python.org](https://packaging.python.org/en/latest/guides/publishing-package-distribution-releases-using-github-actions-ci-cd-workflows/). You then trigger new releases to PyPI by creating a new GitHub release, which will automatically trigger the `publish` job in the workflow as it needs you to set a tag. -Have a look at [VueGen Releases]( https://github.com/Multiomics-Analytics-Group/vuegen/releases) +Have a look at [VueGen Releases](https://github.com/Multiomics-Analytics-Group/vuegen/releases) for an example. The release notes are automatically generated using the PR titles, see GitHub's [docs](https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes). @@ -528,31 +531,30 @@ to be partly compiled, e.g. if you use `Cython`, `numpy` C extensions or Rust ex Also additionally you could use the artifact from the `build_source_dist` job to test the build of the source distribution. This is useful to ensure that a package with non-Python files (e.g. data files) is built correctly and that the package -can be installed correctly. You should probably best test this in as much isolation as +can be installed correctly. You should ideally test this in as much isolation as you can, e.g. by not pulling the repository using `actions/checkout@v4`.
```yaml - test_sdist: - name: Install built source distribution - needs: build_source_dist - runs-on: ubuntu-latest - steps: - # - uses: actions/checkout@v4 - - uses: actions/download-artifact@v4 - with: - name: artifact - path: ./dist - - uses: actions/setup-python@v5 - with: - python-version: "3.11" - - name: Install built sdist - run: | - pip install ./dist/*.tar.gz - # ... some checks +test_sdist: + name: Install built source distribution + needs: build_source_dist + runs-on: ubuntu-latest + steps: + # - uses: actions/checkout@v4 + - uses: actions/download-artifact@v4 + with: + name: artifact + path: ./dist + - uses: actions/setup-python@v5 + with: + python-version: "3.11" + - name: Install built sdist + run: | + pip install ./dist/*.tar.gz + # ... some checks ``` -
- + ## Full project structure diff --git a/pyproject.toml b/pyproject.toml index ec501b0..595531f 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -10,7 +10,7 @@ dynamic = [ #"dependencies", # add if using requirements.txt ] readme = "README.md" -requires-python = ">=3.9" # test all higher Python versions +requires-python = ">=3.10" # test all higher Python versions # These are keywords classifiers = [ "Programming Language :: Python :: 3",