diff --git a/.github/workflows/check.yaml b/.github/workflows/check.yaml
new file mode 100644
index 0000000..3762f8f
--- /dev/null
+++ b/.github/workflows/check.yaml
@@ -0,0 +1,40 @@
+on:
+  workflow_dispatch:
+  pull_request:
+    branches: [main]
+
+name: Quarto Check
+
+jobs:
+  check:
+    runs-on: ubuntu-latest
+    permissions:
+      contents: write
+    steps:
+      - name: Check out repository
+        uses: actions/checkout@v4
+
+      - name: Install uv (with caching)
+        uses: astral-sh/setup-uv@v5
+        with:
+          enable-cache: true
+
+ - name: Install the project
+ run: uv sync --all-groups
+
+ - name: Set up Quarto
+ uses: quarto-dev/quarto-actions/setup@v2
+
+ - name: Install TinyTex
+ run: quarto install tinytex
+
+ - name: Configure Git identity
+ run: |
+ git config --global user.name "github-actions[bot]"
+ git config --global user.email "github-actions[bot]@users.noreply.github.com"
+
+      - name: Check and render
+ run: |
+ source .venv/bin/activate
+ quarto check
+ quarto render
diff --git a/.gitignore b/.gitignore
index fb11e63..6e1166d 100644
--- a/.gitignore
+++ b/.gitignore
@@ -3,4 +3,5 @@ _book/
.DS_Store
/.quarto/
docs/
-.venv/
\ No newline at end of file
+.venv/
+**/*.quarto_ipynb
diff --git a/_quarto.yml b/_quarto.yml
index 9327918..d20f31f 100644
--- a/_quarto.yml
+++ b/_quarto.yml
@@ -31,7 +31,10 @@ book:
- blog/workflow/github-actions.qmd
- blog/workflow/pre-commit.qmd
- blog/workflow/publish-to-pypi.qmd
- - blog/bonus/index.qmd
+ - part: blog/bonus/index.qmd
+ chapters:
+ - blog/bonus/miscellaneous.qmd
+ - blog/bonus/makefile.qmd
- python-package-template.qmd
- contributing.qmd
diff --git a/blog/bonus/index.qmd b/blog/bonus/index.qmd
index dd990db..b95d3b3 100644
--- a/blog/bonus/index.qmd
+++ b/blog/bonus/index.qmd
@@ -1,90 +1,3 @@
---
title: Bonus
----
-
-## How to name your package
-
-Yes, creating a good Python package name is both an art and a bit of a science. There **are constraints** you should follow, and some **best practices** that can help your package stand out and be easy to use.
-
-### Constraints
-
-- **Lowercase only**: Package names should be all lowercase.
-- **No special characters or spaces**: Use only letters, numbers, and underscores or dashes (`a-z`, `0-9`, `_`, `-`).
-- **Can't conflict with standard library modules**: Avoid names like `json`, `os`, `email`, etc.
-- **Must be unique on PyPI**: Check if the name is available: [https://pypi.org/](https://pypi.org/)
-- **Max length**: There’s no strict limit, but practical limits (about 50 characters) make sense.
-- **Underscores vs Dashes**:
- - Use dashes (`-`) in the **distribution name** (`setup.py` or `pyproject.toml`).
- - Use underscores (`_`) or no separator at all in **importable module names**.
-
-### Best Practices
-
-- **Short & memorable**: Easier for users to type and remember.
-- **Descriptive but concise**: Reflect what the package does.
- * Good: `requests`, `black`, `httpx`
- * Bad: `jdhfhc`, `my_cool_package`, `python_toolkit_2023_version_final`
-- **Avoid generic terms** unless paired cleverly: `data`, `utils`, `tools`, etc.
-- **Avoid abbreviations** unless well-known.
-- **Check for conflicts**: Google the name and check on GitHub too, not just PyPI.
-- **Consider branding**: If it becomes popular, the name matters.
-
-
-## How to name files
-
-Having good filenames is mostly useful for having a clear and consistent project architecture. There are some best practices to follow:
-
-- use lowercase only
-- avoid spaces and odd characters
-- keep it short
-- use underscores "`_`"
-
-::: {.panel-tabset}
-
-### Bad file names
-
-```txt
-my file.py
-Myfile.py
-myFile.py
-my@file.py
-my-file.py
-this-file-does-this-and-that.py
-```
-
-### Good file names
-
-```txt
-my_file.py
-myfile.py
-```
-
-:::
-
-
-## What is the `__all__` variable
-
-The `__all__` variable in Python lives in the `__init__.py` file and is used to control **what gets imported** when someone uses `from my_package import *`. It should be defined at the module level as a list of strings, where each string is the name of a symbol—like a function, class, or variable—that you want to make publicly available.
-
-If `__all__` is present, only those names listed will be imported during wildcard imports. If it's not defined, Python will import all names that don’t start with an underscore by default.
-
-For example, consider a file called `my_module.py`:
-
-```{.python filename="my_package/my_module.py"}
-def cool_function():
- return "This is public"
-
-class CoolClass:
- pass
-```
-
-And our `__init__.py` file:
-
-```{.python filename="my_package/__init__.py"}
-from .my_module import cool_function, CoolClass
-
-__all__ = ["cool_function"]
-```
-
-Now if a user does `from my_package import *`, only `cool_function` will be available. Attempting to use `CoolClass` will raise a `NameError`.
-
-It's also important to have in mind that, most of the time, it's highly discouraged to use `from my_package import *` as it does not explicit what is actually imported. Interestingly, it's even [not allowed in marimo notebooks](https://www.youtube.com/watch?v=8ZPIkDInKRM&ab_channel=marimo){target="_blank"}.
+---
\ No newline at end of file
diff --git a/blog/bonus/makefile.qmd b/blog/bonus/makefile.qmd
new file mode 100644
index 0000000..9b77474
--- /dev/null
+++ b/blog/bonus/makefile.qmd
@@ -0,0 +1,3 @@
+---
+title: "Makefile"
+---
diff --git a/blog/bonus/miscellaneous.qmd b/blog/bonus/miscellaneous.qmd
new file mode 100644
index 0000000..e84ff37
--- /dev/null
+++ b/blog/bonus/miscellaneous.qmd
@@ -0,0 +1,90 @@
+---
+title: "Miscellaneous"
+---
+
+## How to name your package
+
+Creating a good Python package name is both an art and a bit of a science. There are **constraints** you should follow, and some **best practices** that can help your package stand out and be easy to use.
+
+### Constraints
+
+- **Lowercase only**: Package names should be all lowercase.
+- **No special characters or spaces**: Use only letters, numbers, and underscores or dashes (`a-z`, `0-9`, `_`, `-`).
+- **Can't conflict with standard library modules**: Avoid names like `json`, `os`, `email`, etc.
+- **Must be unique on PyPI**: Check if the name is available: [https://pypi.org/](https://pypi.org/)
+- **Max length**: There's no strict limit, but staying under about 50 characters is sensible.
+- **Underscores vs Dashes**:
+  - Use dashes (`-`) in the **distribution name** (the `name` field in `pyproject.toml` or `setup.py`).
+ - Use underscores (`_`) or no separator at all in **importable module names**.
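+
+As a quick sketch (the names here are purely illustrative), the same project can use a dash in its distribution name and an underscore in its import name:
+
+```toml
+# pyproject.toml: the distribution name, shown on PyPI, uses dashes
+[project]
+name = "my-cool-package"
+```
+
+```python
+# the importable module name uses underscores
+import my_cool_package
+```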
+
+### Best Practices
+
+- **Short & memorable**: Easier for users to type and remember.
+- **Descriptive but concise**: Reflect what the package does.
+ * Good: `requests`, `black`, `httpx`
+ * Bad: `jdhfhc`, `my_cool_package`, `python_toolkit_2023_version_final`
+- **Avoid generic terms** unless paired cleverly: `data`, `utils`, `tools`, etc.
+- **Avoid abbreviations** unless well-known.
+- **Check for conflicts**: Google the name and check on GitHub too, not just PyPI.
+- **Consider branding**: If it becomes popular, the name matters.
+
+
+## How to name files
+
+Good filenames mainly help keep your project architecture clear and consistent. Here are some best practices to follow:
+
+- use lowercase only
+- avoid spaces and odd characters
+- keep it short
+- use underscores (`_`) to separate words
+
+::: {.panel-tabset}
+
+### Bad file names
+
+```txt
+my file.py
+Myfile.py
+myFile.py
+my@file.py
+my-file.py
+this-file-does-this-and-that.py
+```
+
+### Good file names
+
+```txt
+my_file.py
+myfile.py
+```
+
+:::
+
+
+## What is the `__all__` variable
+
+The `__all__` variable in Python lives in the `__init__.py` file and is used to control **what gets imported** when someone uses `from my_package import *`. It should be defined at the module level as a list of strings, where each string is the name of a symbol—like a function, class, or variable—that you want to make publicly available.
+
+If `__all__` is present, only those names listed will be imported during wildcard imports. If it's not defined, Python will import all names that don’t start with an underscore by default.
+
+For example, consider a file called `my_module.py`:
+
+```{.python filename="my_package/my_module.py"}
+def cool_function():
+ return "This is public"
+
+class CoolClass:
+ pass
+```
+
+And our `__init__.py` file:
+
+```{.python filename="my_package/__init__.py"}
+from .my_module import cool_function, CoolClass
+
+__all__ = ["cool_function"]
+```
+
+Now if a user does `from my_package import *`, only `cool_function` will be available. Attempting to use `CoolClass` will raise a `NameError`.
+
+It's also important to keep in mind that, most of the time, using `from my_package import *` is highly discouraged, as it does not make explicit what is actually imported. Interestingly, it's even [not allowed in marimo notebooks](https://www.youtube.com/watch?v=8ZPIkDInKRM&ab_channel=marimo){target="_blank"}.
diff --git a/blog/code-quality/writing-documentation.qmd b/blog/code-quality/writing-documentation.qmd
index 3c973cc..ace8365 100644
--- a/blog/code-quality/writing-documentation.qmd
+++ b/blog/code-quality/writing-documentation.qmd
@@ -193,8 +193,59 @@ We do not have to rewrite the name of the arguments and so on thanks to this. Th
Reference documentation is a very important part of documentation, but it is not exhaustive. It is important that you also add tutorials, guides, and explanations, as suggested in the [Diátaxis framework](https://diataxis.fr/){target="_blank"}. You can place them in their own directories to allow users to easily navigate your website.
:::
+## Advanced configuration
+
+For large projects, our `mkdocs-material` configuration file could look like this:
+
+```yaml
+site_name: Name of your package
+
+nav:
+ - index.md # main page of your website
+ - contributing.md # contributing guide
+ - Tutorials: # tutorials
+ - tutorials/get_started.md
+ - tutorials/more_complex_usage.md
+ - Guides: # guides blog post
+ - guide/first_guide.md
+ - guide/second_guide.md
+ - Reference: # reference documentation
+ - reference/some_function.md
+ - reference/another_function.md
+
+plugins:
+ - mkdocstrings
+```
+
+Our package would then have the following structure:
+
+```
+my_package/
+├── my_package/
+│ ├── __init__.py
+│ ├── my_module.py
+│ └── other_module.py
+├── docs/
+│ ├── index.md
+│ ├── contributing.md
+│ ├── reference/
+│ ├── guides/
+│ └── tutorials/
+├── .git/
+├── .venv/
+├── mkdocs.yaml
+├── .gitignore
+├── README.md
+├── LICENSE
+└── pyproject.toml
+```
+
+This is, of course, just a simple example; your own structure may look quite different.
+
## Advanced customization
+So far, we've written some simple Markdown code, but you can change many other things: set a different theme, use pretty user interface elements, add code snippets, and so on.
+
`mkdocs-material` has a ton of super cool customization options, that you can often configure with just one line in your `mkdocs.yaml` file. It is recommended to browse the [official website](https://squidfunk.github.io/mkdocs-material/){target="_blank"} for this purpose.
If you are interested in a pre made configuration template, check out the [Python package template](../../python-package-template.html).
diff --git a/blog/create-a-package/handling-dependencies.qmd b/blog/create-a-package/handling-dependencies.qmd
index 2700cd5..3896996 100644
--- a/blog/create-a-package/handling-dependencies.qmd
+++ b/blog/create-a-package/handling-dependencies.qmd
@@ -251,7 +251,7 @@ def read_file(filename):
```
::: {.callout-important}
-It is **required** here to place `import` inside the function, because otherwise a `ModuleNotFoundError` error will be generated on the user's machine, even when importing a function with only optional dependencies..
+It is **required** here to place `import` inside the function, because otherwise a `ModuleNotFoundError` error will be generated on the user's machine, even when importing a function without optional dependencies.
:::
This will give your users a **clear and meaningful error message** that they can resolve very quickly. This kind of thing exist for the same reason we're talking about in this article: trying to minimize the number of dependencies (especially the unused ones!).
diff --git a/blog/create-a-package/index.qmd b/blog/create-a-package/index.qmd
index d45e663..ba5b8f3 100644
--- a/blog/create-a-package/index.qmd
+++ b/blog/create-a-package/index.qmd
@@ -19,6 +19,8 @@ Not really, especially if you take the time to read about it.
Making a Python package is mostly about organizing your project in a specific, standardized way. There are no low-level computer science concepts that you should know, but rather a more or less large set of rules to respect.
+Note that the emphasis here is on best practices for developing high-quality software under the best conditions. Much of what you learn while developing Python packages will also help you become a better programmer in general.
+
## Is it only for large projects?
Not at all! Even if your project is 200 lines of code in a single file, it might make sense to make it a package. You can find a fun example [here](https://github.com/koaning/smartfunc){target="_blank"}.
diff --git a/blog/workflow/publish-to-pypi.qmd b/blog/workflow/publish-to-pypi.qmd
index 17efa6a..0381ef9 100644
--- a/blog/workflow/publish-to-pypi.qmd
+++ b/blog/workflow/publish-to-pypi.qmd
@@ -2,48 +2,294 @@
title: "Publish to PyPI"
---
-This page is a work in progress. You can see the current state of the project [here](https://github.com/y-sunflower/python-packaging-essentials).
+## PyPI (Python Package Index)
+[PyPI](https://pypi.org) is the default online repository for Python packages. This is where packages are stored so that others can find and install them.
-
-
-
-
-
+When you run:
+
+```bash
+pip install requests
+```
-## note
+You're downloading the `requests` package from PyPI. More specifically, you're downloading the package's distribution (which might be source code or precompiled binaries) to your local machine from PyPI servers.
-- pyproject.toml to specify dependencies, python req, etc
-## PyPI (Python Package Index)
+## You don't need PyPI
-[PyPI](https://pypi.org) is the default online repository for Python packages. This is where packages are stored so that others can find and install them.
+If you're here to learn how to publish a Python package to PyPI, **you're in the right place**.
-When you run:
+But it's important that you become aware of the following facts:
+
+- you don't need PyPI for anyone in the world to be able to `pip install` your package
+- publishing to PyPI claims a namespace name forever ([related and interesting blog post](https://koaning.io/posts/uvx-pattern-for-two-tiers-of-open-work/))
+- uploading your package the clean way takes a bit of setup
+
+### `pip install` without PyPI
+
+If your package is on GitHub/GitLab (it should be) and is open source, then people can very easily install it with:
```bash
-pip install requests
+pip install git+https://github.com/github_username/repository_name.git
```
-You’re downloading the `requests` package from PyPI. More specifically, you’re downloading the package's distribution (which might be source code or precompiled binaries) to your local machine from PyPI servers.
+::: {.panel-tabset}
+
+### Pros
+
+- No extra work
+- Updates automatically when you push changes
+
+### Cons
+
+- Managing dependencies is harder (for users)
+- You don't have a clean `pip install package_name`
+
+:::
+
+The main reason for using PyPI is that it greatly simplifies package versioning (installing a specific version of a package) and you know exactly what you're getting, whereas a GitHub repo might simply disappear one day.
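+
+If you do stick with the Git approach, you can at least pin an install to a tag or commit (assuming the repository uses tags), which mitigates part of the versioning concern:
+
+```bash
+# install a specific tagged release instead of the latest commit
+pip install git+https://github.com/github_username/repository_name.git@v1.2.3
+```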
+
+If your project is just a toy project, with no clear versioning or actual maintenance, you probably shouldn't put it on PyPI.
+
+## Before distributing
+
+Before putting your package on PyPI, we need to make sure it checks all the boxes.
+
+### Dependencies
+
+:::{.callout-note}
+If you haven't already, check out the [handling dependencies blog post](../create-a-package/handling-dependencies.html), which is an important prerequisite.
+:::
+
+Everything inside your `pyproject.toml` defines what will be used for your package once on PyPI. Fields like `name` or `author` will be displayed on the PyPI page (see, for example, the [PyPI page of scikit-learn](https://pypi.org/project/scikit-learn/)).
+
+More importantly, the required dependencies and Python version are read from this file. If you're unsure whether anything is missing or invalid in your `pyproject.toml`, check out [the official guide](https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#writing-pyproject-toml).
+
+### Security
+
+When distributing your package, security is critical. Since an installed package can execute arbitrary code, everything you decide to put on PyPI should be transparent.
+
+Rules to always follow:
+
+- **Keep source in sync**: The code on PyPI should match the code on GitHub (or any other public repo). Users trust that what they download is the same as what's publicly visible. Avoid "hidden" changes that only exist in the PyPI release.
+- **Strong authentication**: Use a strong, unique password for your PyPI account and enable two-factor authentication (2FA). PyPI now requires 2FA for maintainers of critical projects, and it's a good habit for all packages. Make sure you understand the "trusted publisher" feature before relying on it; we use it later in this post.
+
+## Distributing to PyPI
+
+Distributing to PyPI involves two steps:
+
+- building your package
+- uploading that build to PyPI servers
+
+### Build
+
+When we talk about *building a package*, we mean taking your source code and metadata (from `pyproject.toml`, `README.md`, etc.) and transforming them into **distribution artifacts**.
+
+There are two main types of distributions:
+
+* **Source distribution (`sdist`)** → a tarball (`.tar.gz`) containing your source code.
+* **Built distribution (`wheel`)** → a pre-built archive (`.whl`) that can be installed faster because it doesn't require building on the user's machine.
+
+Having a proper build step matters because:
-## pip (and friends)
+* it ensures reproducibility (everyone installing your package gets the same artifact)
+* it avoids shipping incomplete or broken files
+* it decouples packaging from distribution (you can build once and upload anywhere)
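+
+For reference, building manually looks like this (run from the project root, assuming a valid `pyproject.toml`):
+
+```bash
+# install the standard build frontend, then build both artifacts
+python -m pip install build
+python -m build  # writes dist/*.tar.gz and dist/*.whl
+```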
-`pip` is the tool used to install packages from PyPI. It’s simple and widely supported.
+
+In practice, we don't want to build the package manually; instead we'll let CI build it for us and upload it via **trusted publishing**.
-Example:
+### Publishing
+
+Once you have your built artifacts, the next step is uploading them to PyPI. Historically (and still commonly today), this was done with [twine](https://pypi.org/project/twine/):
```bash
-pip install numpy
+twine upload dist/*
```
-But when working with packages in Python, you need to take into account the package version. Maybe you need `numpy` `2.1.2` instead of `2.1.1` for your project.
+This is still valid, but it requires you to handle credentials (API tokens) locally.
+
+The modern, recommended approach is to use **trusted publishing**, where PyPI integrates directly with GitHub Actions, GitLab CI, or other CI/CD providers. This way, you don't store secrets at all, and only builds coming from your repository can publish under your package name.
+
+Here we'll use GitHub Actions (check out the [dedicated blog post](../workflow/github-actions.html)), and in particular the following workflow:
+
+```yaml
+# This is taken from
+# https://packaging.python.org/en/latest/guides/publishing-package-distribution-releases-using-github-actions-ci-cd-workflows/#the-whole-ci-cd-workflow
+# but with the following differences
+# - removed the TestPyPI part
+# - instead of `on: push`, we have `tags` in there too
+
+name: Publish Python 🐍 distribution 📦 to PyPI
+
+on:
+ push:
+ tags:
+ - "v[0-9]+.[0-9]+.[0-9]+*"
+
+jobs:
+ build:
+ name: Build distribution 📦
+ runs-on: ubuntu-latest
+
+ steps:
+ - uses: actions/checkout@v4
+ with:
+ persist-credentials: false
+ - name: Set up Python
+ uses: actions/setup-python@v5
+ with:
+ python-version: "3.x"
+ - name: Install pypa/build
+ run: python3 -m pip install build --user
+ - name: Build a binary wheel and a source tarball
+ run: python3 -m build
+ - name: Store the distribution packages
+ uses: actions/upload-artifact@v4
+ with:
+ name: python-package-distributions
+ path: dist/
+
+ publish-to-pypi:
+ name: >-
+ Publish Python 🐍 distribution 📦 to PyPI
+ if: startsWith(github.ref, 'refs/tags/') # only publish to PyPI on tag pushes
+ needs:
+ - build
+ runs-on: ubuntu-latest
+ environment:
+ name: pypi
+      url: https://pypi.org/p/your-package-name # replace with your package name
+ permissions:
+ id-token: write # IMPORTANT: mandatory for trusted publishing
+
+ steps:
+ - name: Download all the dists
+ uses: actions/download-artifact@v4
+ with:
+ name: python-package-distributions
+ path: dist/
+ - name: Publish distribution 📦 to PyPI
+ uses: pypa/gh-action-pypi-publish@release/v1
+
+ github-release:
+ name: >-
+ Sign the Python 🐍 distribution 📦 with Sigstore
+ and upload them to GitHub Release
+ needs:
+ - publish-to-pypi
+ runs-on: ubuntu-latest
+ permissions:
+ contents: write # IMPORTANT: mandatory for making GitHub Releases
+ id-token: write # IMPORTANT: mandatory for sigstore
+ steps:
+ - name: Download all the dists
+ uses: actions/download-artifact@v4
+ with:
+ name: python-package-distributions
+ path: dist/
+ - name: Sign the dists with Sigstore
+ uses: sigstore/gh-action-sigstore-python@v3.0.0
+ with:
+ inputs: >-
+ ./dist/*.tar.gz
+ ./dist/*.whl
+ - name: Create GitHub Release
+ env:
+ GITHUB_TOKEN: ${{ github.token }}
+ run: >-
+ gh release create
+ "$GITHUB_REF_NAME"
+ --repo "$GITHUB_REPOSITORY"
+ --notes ""
+ - name: Upload artifact signatures to GitHub Release
+ env:
+ GITHUB_TOKEN: ${{ github.token }}
+ # Upload to GitHub Release using the `gh` CLI.
+ # `dist/` contains the built packages, and the
+ # sigstore-produced signatures and certificates.
+ run: >-
+ gh release upload
+ "$GITHUB_REF_NAME" dist/**
+ --repo "$GITHUB_REPOSITORY"
+```
+
+What happens in the workflow? Once a [tag](https://git-scm.com/book/en/v2/Git-Basics-Tagging) is pushed:
+
+* **Build job**: GitHub Actions checks out your repo, installs `build`, and creates both the wheel (`.whl`) and the source distribution (`.tar.gz`).
+* **Publish job**: Those artifacts are uploaded directly to PyPI using trusted publishing (no API tokens needed).
+* **GitHub Release job**: The distributions are signed with [Sigstore](https://www.sigstore.dev/) and attached to a GitHub Release, so users can download the same artifacts outside PyPI if they prefer.
+
+:::{.callout-important}
+For the above workflow to work, you first need to [add a publisher on PyPI](https://docs.pypi.org/trusted-publishers/adding-a-publisher/), and you must do this for each package you publish.
+:::
+
+The best way to use the GitHub Actions workflow above is to pair it with a small release script like this:
+
+```bash
+#!/bin/bash
+
+# Usage: ./release.sh 1.2.3
+# Assumes the version has already been bumped in pyproject.toml
-You can read more about this in the [handling dependencies article](./handling-dependencies.html), but in summary, it's important to control the version of the packages you use/distribute, to ensure reproducible workflows and avoid unexpected things.
+set -e
-Some newer tools are built around `pip` to offer additional features such as **dependency management** and **better performance**.
+VERSION=$1
-One of the most important things these tools do is called dependency resolution, which involves calculating which versions of each package are compatible with each other based on version constraints. For example, you might be using a version of numpy that is incompatible (for whatever reason) with matplotlib, and since matplotlib relies on numpy, there's a problem.
+if [[ -z "$VERSION" ]]; then
+ echo "❌ Error: No version number supplied."
+ echo "👉 Usage: ./release.sh 1.2.3"
+ exit 1
+fi
-Since 2024, the best tool available is called [uv](https://docs.astral.sh/uv/). It's super easy to use, super fast and does everything you need, in one place, with one tool. It's more of a Python project manager than a simple package installer.
+TAG="v$VERSION"
+
+# Confirm version bump is in code
+echo "📦 Preparing release: $TAG"
+grep -q "$VERSION" pyproject.toml || {
+ echo "❌ Version $VERSION not found in pyproject.toml. Did you forget to bump it?"
+ exit 1
+}
+
+# Commit the version bump
+git add -A
+git commit -m "Release $TAG"
+
+# Create tag
+git tag "$TAG"
+
+# Push commit and tag
+git push origin main
+git push origin "$TAG"
+
+echo "✅ Release $TAG pushed! GitHub Actions will handle the rest."
+```
+
+Save this script as `release.sh` (for example), make it executable (`chmod +x release.sh`), and keep it at the root of your repository.
+
+### How to use it
+
+1. Bump the version in `pyproject.toml`.
+
+2. Run:
+
+ ```bash
+ ./release.sh 1.2.3
+ ```
+
+3. The script will:
+
+ * check the version string exists in `pyproject.toml`
+ * commit the change
+ * tag it as `v1.2.3`
+ * push both the commit and the tag
+
+This way:
+
+* you don’t need to manually run `python -m build` or `twine upload`
+* you don’t need to manage PyPI tokens or secrets
+* releases are reproducible and cryptographically signed
+* the release process becomes a single command:
+
+```bash
+./release.sh 1.2.3
+```