# Contributing to Frequenz Common API
## Build
You can use `build` to simply build the source and binary distribution:
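For example, a minimal sketch using the standard `build` package:

```sh
# Install the build frontend and build the sdist and wheel into dist/
python -m pip install build
python -m build
```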
## Local development
You need to make sure you have the git submodules updated:
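For example:

```sh
# Fetch and check out the git submodules referenced by this repository
# (--recursive is only needed if the submodules have nested submodules)
git submodule update --init --recursive
```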
### Running protolint
To make sure some common mistakes are avoided and to ensure a consistent style,
it is recommended to run `protolint`. After you have installed `protolint`,
just run:
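For example, to lint all the files in the `proto/` directory:

```sh
# Lint all .proto files under proto/
protolint lint proto
```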
### Python setup
You can use editable installs to develop the project locally (it will install all the dependencies too):
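For example:

```sh
# Editable install: changes to the sources are picked up without reinstalling
python -m pip install -e .
```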
This will also generate the Python files from the `proto/` files and leave them
in `py/`, so you can inspect them.
Or you can install all development dependencies (`mypy`, `pylint`, `pytest`,
etc.) in one go too:
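For example, using the `dev` extra (the same extra referenced in the testing section below):

```sh
# Editable install including all development dependencies
python -m pip install -e .[dev]
```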
### Adding new proto files
When you create a new directory or add proto files in the `/proto/*` directory
structure, you must create a corresponding `__init__.py` file in the
matching `/py/*` directory path.
For example, if you add a new proto file at a path like:
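```sh
# Hypothetical example path -- adjust to your actual package structure
proto/frequenz/api/common/new_feature/new_feature.proto
```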
You need to create:
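```sh
# Matching __init__.py for the hypothetical path above
py/frequenz/api/common/new_feature/__init__.py
```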
Why is this required?
Python requires `__init__.py` files to treat directories as packages. Without
these files:
- The generated Python code from your proto files won't be importable
- The package hierarchy won't be recognized by Python
- Import statements will fail when trying to use the generated protobuf bindings
The `__init__.py` file should include:
- A license header (see existing files for the template)
- A docstring describing the module's purpose
Example `__init__.py` content:
```python
# License: MIT
# Copyright © 2025 Frequenz Energy-as-a-Service GmbH
"""Description of what this module provides."""
```
If you don't want to install all the dependencies, you can also use `nox` to
run the tests and other checks, creating its own virtual environments:
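For example (assuming a `dev-noxfile` extra exists in `pyproject.toml`; installing `nox` directly with `python -m pip install nox` works too):

```sh
# Install nox (and its pinned plugins, if a dev-noxfile extra is defined)
python -m pip install .[dev-noxfile]
# Run all the default sessions (tests, linters, etc.) in fresh virtualenvs
nox
```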
You can also use `nox -R` to reuse the current testing environment to speed up
tests, at the expense of a higher chance of ending up with a dirty test
environment.
## Upgrading dependencies
If you want to update the dependency `frequenz-api-common`, then you need to:
- Update the submodule `frequenz-api-common`
- Update the version of the `frequenz-api-common` package in `pyproject.toml`
The version of `frequenz-api-common` used in both places mentioned above should
be the same.
Here is an example of upgrading the `frequenz-api-common` dependency to version
`v0.2.0`:
ver="0.2.0"
cd submodules/frequenz-api-common
git remote update
git checkout v${ver}
cd -
sed s/"frequenz-api-common == [0-9]\.[0-9]\.[0-9]"/"frequenz-api-common == ${ver}"/g -i pyproject.toml
## Running tests / checks individually
For a better development test cycle you can install the runtime and test
dependencies and run `pytest` manually.
```sh
python -m pip install .[dev-pytest]  # included in .[dev] too
# And for example
pytest pytests/test_*.py
```
Or you can use `nox`:
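A sketch of how that might look; the available session names are defined in `noxfile.py` and can be listed with `nox --list`:

```sh
# List the sessions defined in noxfile.py
nox --list
# Run a single session, reusing its virtualenv (the session name may differ)
nox -R -s pytest
```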
The same applies to `pylint` or `mypy`, for example:
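Assuming sessions named after the tools exist in `noxfile.py`:

```sh
nox -R -s pylint
nox -R -s mypy
```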
## Building the documentation
To build the documentation, first install the dependencies (if you didn't
install all `dev` dependencies):
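For example (assuming the documentation dependencies are grouped in a `dev-mkdocs` extra in `pyproject.toml`; check the extras actually defined there):

```sh
# Install the documentation tooling (mkdocs and friends)
python -m pip install -e .[dev-mkdocs]
```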
Then you can build the documentation (it will be written in the `site/`
directory):
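```sh
# Build the static site into the site/ directory
mkdocs build
```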
Or you can just serve the documentation without building it using:
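```sh
# Serve the documentation locally with live reload
mkdocs serve
```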
Your site will be updated live when you change your files (provided that
you used `pip install -e .`; beware of the common pitfall of using `pip install`
without `-e`: in that case the API reference won't change unless you do a new
`pip install`).
To build multi-version documentation, we use `mike`. If you want to see what the
multi-version site looks like locally, you can use:
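A sketch of a local preview (the steps are explained below; `my-version` is just a placeholder name):

```sh
# Build the docs into the local gh-pages branch under the name "my-version"
mike deploy my-version
# Make that version the default one served at the site root
mike set-default my-version
# Serve the contents of the local gh-pages branch
mike serve
```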
`mike` works in mysterious ways. Some basic information:

- `mike deploy` will do a `mike build` and write the results to your local
  `gh-pages` branch.
- `my-version` is an arbitrary name for the local version you want to preview.
- `mike set-default` is needed so when you serve the documentation, it goes to
  your newly produced documentation by default.
- `mike serve` will serve the contents of your local `gh-pages` branch. Be
  aware that, unlike `mkdocs serve`, changes to the sources won't be shown
  live, as the `mike deploy` step is needed to refresh them.
Be careful not to use `--push` with `mike deploy`, otherwise it will push your
local `gh-pages` branch to the `origin` remote.
That said, if you want to test the actual website in your fork, you can
always use `mike deploy --push --remote your-fork-remote`, and then access the
GitHub pages produced for your fork.
## Releasing
These are the steps to create a new release:
- Get the latest head you want to create a release from.
- Make sure the `RELEASE_NOTES.md` file is complete and up to date, and remove
  template comments (`<!-- ... -->`) and empty sections. Submit a pull request
  if an update is needed, wait until it is merged, and update the latest head
  you want to create a release from to get the new merged pull request.
- Create a new signed tag using the release notes and a semver compatible
  version number with a `v` prefix, for example `v0.2.0` (see the sketch after
  this list).
- Push the new tag.
- A GitHub action will test the tag and, if all goes well, it will create a
  GitHub Release and upload a new package to PyPI automatically.
- Once this is done, reset the `RELEASE_NOTES.md` with the template. Commit the
  new release notes and create a PR (this step should be automated eventually
  too).
- Celebrate!
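A sketch of creating and pushing the signed tag mentioned above (the version number and the remote name are just illustrations):

```sh
# Create a signed, annotated tag, using the release notes as the tag message
git tag -s -F RELEASE_NOTES.md v0.2.0
# Push the new tag (adjust the remote name if needed)
git push origin v0.2.0
```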