Commit c92f7e38 authored by André Anjos

[pyproject] Use pixi for development and deployment; Remove all traces of dev-profile dependant behaviour
parent bc42bfc1
1 merge request: !17 (Use pixi for development and deployment)
Showing 2116 additions and 4023 deletions
@@ -7,21 +7,21 @@
 *.pyc
 *.egg-info
 .nfs*
-.coverage
+.coverage*
 *.DS_Store
 .envrc
 coverage.xml
 test_results.xml
 junit-coverage.xml
-environment.yaml
 html/
 build/
 doc/api/
 dist/
 cache/
-venv/
+.venv/
 _citools/
 _work/
 .mypy_cache/
 .pytest_cache/
+changelog.md
 .pixi/
@@ -4,4 +4,4 @@
 include:
   - project: software/dev-profile
-    file: /gitlab/python.yml
+    file: /gitlab/pixi.yml
@@ -24,13 +24,18 @@ repos:
       - id: check-added-large-files
       - id: check-toml
       - id: check-yaml
-        exclude: conda/meta.yaml
+      - id: check-json
       - id: debug-statements
       - id: check-case-conflict
       - id: trailing-whitespace
       - id: end-of-file-fixer
       - id: debug-statements
   - repo: https://github.com/fsfe/reuse-tool
-    rev: v3.0.1
+    rev: v3.0.2
     hooks:
       - id: reuse
+        exclude: |
+          (?x)(
+            ^.pixi/|
+            ^.pixi.lock|
+          )
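With these hooks in place, the quality-assurance checks can be exercised locally before committing (assuming pre-commit is installed, for instance through the "qa" pixi feature defined later in this commit):

   $ pre-commit run --all-files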
# Copyright © 2022 Idiap Research Institute <contact@idiap.ch>
#
# SPDX-License-Identifier: BSD-3-Clause
{% set data = load_file_data(RECIPE_DIR + '/../pyproject.toml') %}
package:
name: {{ data['project']['name'] }}
version: {{ environ.get('PACKAGE_VERSION', environ.get('GIT_DESCRIBE_TAG')) }}
source:
path: ..
build:
noarch: python
number: {{ environ.get('NEXT_BUILD_NUMBER', 0) }}
run_exports:
- {{ pin_subpackage(data['project']['name']) }}
script:
- "{{ PYTHON }} -m pip install {{ SRC_DIR }} -vv"
requirements:
host:
- python >=3.10
- pip
- hatchling
- versioningit
run:
- python >=3.10
- pip
# conda/mamba ecosystem dependencies
- conda
- conda-build
- mamba
- boa
# things we depend on
- click >=8
- tomli
- tomlkit
- cookiecutter
- packaging
- pyyaml
- gitpython
- python-gitlab
- python-dateutil
- pytz
- xdg
test:
source_files:
- tests
imports:
- {{ data['project']['name'].replace('-','_') }}
commands:
- pytest -sv tests
requires:
- pytest {{ pytest }}
- git
about:
home: {{ data['project']['urls']['homepage'] }}
summary: {{ data['project']['description'] }}
license: {{ data['project']['license'] }}
license_family: BSD
@@ -17,11 +17,8 @@ This section includes information for using the Python API of

    idiap_devtools
    idiap_devtools.click
-   idiap_devtools.conda
    idiap_devtools.logging
-   idiap_devtools.profile
    idiap_devtools.python
-   idiap_devtools.update_pins
    idiap_devtools.utils

 .. autosummary::
.. Copyright © 2022 Idiap Research Institute <contact@idiap.ch>
..
.. SPDX-License-Identifier: BSD-3-Clause
.. _idiap-devtools.develop:
===============================
Local development of packages
===============================
We recommend you create isolated virtual environments using mamba_ (conda_) to
develop existing or new projects, then pip_ install development requirements
over that mamba_ environment. We offer guidance for two variants of installing
dependencies: one exclusively using Python packages, and a second which
installs most dependencies as conda_ packages (being able to better handle
non-Python dependencies such as Nvidia CUDA-compiled packages). In both cases,
the top-level package (or packages) is (are) always installed on the
development environment through pip_ (with `the --editable option <pip-e_>`_).
.. note::
Pip_ may be configured with command-line options as shown below, but equally
through environment variables, or `configuration files <pip-config_>`_. We
leave to the developer's discretion the decision on how to best manage their
own use of pip_.
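   For example, the index and constraint options used below could equally be
   set through pip_'s standard environment variables (a sketch; the token and
   paths are placeholders you must adapt):

   .. code:: sh

      $ export PIP_INDEX_URL="https://token:<YOUR-GITLAB-TOKEN>@gitlab.idiap.ch/api/v4/groups/software/-/packages/pypi/simple"
      $ export PIP_EXTRA_INDEX_URL="https://pypi.org/simple"
      $ export PIP_CONSTRAINT="../profile/python/pip-constraints.txt"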
.. note::
You may develop software against different development profiles (c.f.
:ref:`idiap-devtools.install.setup.profile`). In the context of these
instructions, we assume your development profile is located at
``../profile``.
.. tab:: pip
In this variant, the latest (beta) versions of internally developed
dependencies are fetched from our local package registry if applicable.
Furthermore, external dependencies are fetched from PyPI and respect
versions used on the continuous integration (CI) server. It is useful to
reproduce bugs reported on the CI, during test builds:
.. code:: sh
$ git clone <PACKAGE-URL> # e.g. git clone git@gitlab.idiap.ch:software/clapp
$ cd <PACKAGE> # e.g. cd clapp
$ mamba create -n dev python=3.10 pip
$ conda activate dev
(dev) $ pip install --pre --index-url https://token:<YOUR-GITLAB-TOKEN>@gitlab.idiap.ch/api/v4/groups/software/-/packages/pypi/simple --extra-index-url https://pypi.org/simple --constraint ../profile/python/pip-constraints.txt --editable '.[qa,doc,test]'
(dev) $ # `dev` environment is now ready, just develop
.. note::
If you need to install *private* packages developed through GitLab, you
must `generate a personal token <gitlab-token_>`_ with at least access to
the package registry (currently implemented through the ``read_api``
privilege).
Otherwise, you may suppress ``token:<YOUR-GITLAB-TOKEN>@`` from the
index-url option above.
.. tip::
Optionally, you may create a standard Python virtual environment (instead of
a conda_ virtual environment) with either venv_ or virtualenv_, and then
apply the same instructions above to locally install dependencies and the
package itself.
The Python version in this case will match that used to create the virtual
environment.
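   A minimal sketch of that alternative (assuming a ``python3.10`` interpreter
   is available on your system):

   .. code:: sh

      $ python3.10 -m venv .venv
      $ source .venv/bin/activate
      (.venv) $ pip install --pre --constraint ../profile/python/pip-constraints.txt --editable '.[qa,doc,test]'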
.. tab:: conda
In this variant, the latest (beta) versions of internally developed
dependencies are fetched from our local conda (beta) package registry, if
applicable. Furthermore, external dependencies are fetched from conda-forge_
and respect versions used on the continuous integration (CI) server. It is
useful to reproduce bugs reported on the CI, during conda-package test
builds, or to install further non-Python dependencies required for package
development (e.g. Nvidia CUDA-enabled packages):
.. code:: sh
$ git clone <PACKAGE>
$ cd <PACKAGE>
$ mamba run -n idiap-devtools --live-stream devtool env -vv .
$ mamba env create -n dev -f environment.yaml
$ conda activate dev
(dev) $ pip install --no-build-isolation --no-dependencies --editable .
(dev) $ # `dev` environment is now ready, just develop
.. note::
The application ``devtool env`` uses the ``conda`` API to parse your
package's recipe (typically at ``conda/meta.yaml``) and
``pyproject.toml``, and then to search dependencies (including those for
quality-assurance, documentation and tests, which may not be listed on
``conda/meta.yaml``). The installation respects CI constraints
established on your chosen profile.
After that step, your package will be installed and ready for use inside the
``dev`` environment. You must activate the virtual environment every time you
want to further develop the package, or simply deactivate it when you branch
off to another activity.
With the development environment active, you can optionally check the package
installation by building its Sphinx documentation, running the doctests,
executing its test suite, or running the quality-assurance (pre-commit) checks:
.. code:: sh
$ conda activate dev
(dev) $ pre-commit run --all-files # quality assurance
(dev) $ pytest -sv tests/ # test units
(dev) $ sphinx-build doc sphinx # documentation
(dev) $ sphinx-build -b doctest doc sphinx # doctests
Developing multiple existing packages simultaneously
----------------------------------------------------
It may happen that you want to develop several packages against each other
for your project. This is the case if you are changing a high-level package
``package-a``, that in turn depends on functionality of ``package-b`` you may
also need to adapt. While you change ``package-b``, you want to verify how
these changes work on ``package-a``. The procedure to accommodate this is
similar to the above, except you will git-clone and pip-install more packages:
.. tab:: pip
In this variant, the latest (beta) versions of internally developed
dependencies are fetched from our local package registry if applicable.
Furthermore, external dependencies are fetched from PyPI and respect
versions used on the continuous integration (CI) server.
To set up your environment, you must install packages in reverse dependency
order, with the top-level package (``package-a`` in this example) being
installed **last**.
.. code:: sh
$ git clone <PACKAGE-A>
$ cd <PACKAGE-A>
$ git clone <PACKAGE-B> src/<PACKAGE-B>
$ mamba create -n dev python=3.10 pip
# get the constraints for the "target" development environment.
# this is just an example:
$ curl -o constraints.txt https://gitlab.idiap.ch/software/dev-profile/-/raw/main/python/pip-constraints.txt
$ conda activate dev
(dev) $ for pkg in "src/package-b" "."; do pip install --pre --index-url https://token:<YOUR-GITLAB-TOKEN>@gitlab.idiap.ch/api/v4/groups/software/-/packages/pypi/simple --extra-index-url https://pypi.org/simple --constraint constraints.txt --editable "${pkg}[qa,doc,test]"; done
(dev) $ # `dev` environment is now ready, just develop
.. tab:: conda
In this variant, the latest (beta) versions of internally developed
dependencies are fetched from our local conda (beta) package registry, if
applicable. Furthermore, external dependencies are fetched from conda-forge_
and respect versions used on the continuous integration (CI) server. It is
useful to reproduce bugs reported on the CI, during conda-package test
builds:
.. code:: sh
$ git clone <PACKAGE-A>
$ cd <PACKAGE-A>
$ git clone <PACKAGE-B> src/<PACKAGE-B>
$ mamba run -n idiap-devtools --live-stream devtool env -vv src/package-b .
$ mamba env create -n dev -f environment.yaml
$ conda activate dev
(dev) $ for pkg in "src/package-b" "."; do pip install --no-build-isolation --no-dependencies --editable "${pkg}"; done
(dev) $ # `dev` environment is now ready, just develop
Installing all constrained packages
-----------------------------------
If you plan to develop many packages together, it may be faster to first
pre-install all (constrained) packages, and then pip-install the individual
packages. Because the mamba_ (conda_) ecosystem is a superset of the Python
one, we only provide a conda-based variant for this option:
.. code:: sh
$ mamba run -n idiap-devtools --live-stream devtool fullenv -vv
$ mamba env create -n dev -f environment.yaml
$ conda activate dev
(dev) $ for pkg in src/*; do pip install --no-build-isolation --no-dependencies --editable "${pkg}"; done
Creating new packages
---------------------
To create a new package, use our `cookiecutter template
<cookiecutter-template_>`_ and associated instructions. Do **not** copy
another package's source code, or no Xmas gifts for you...
.. include:: links.rst
@@ -11,7 +11,8 @@

 .. todolist::

 This package contains a set of small utilities to support development of Python
-packages through GitLab. It is targetted for package development at Idiap.
+packages through GitLab. It is targeted for package development at Idiap;
+however, some of the commands provided can benefit a larger audience.

 Documentation
@@ -21,7 +22,6 @@ Documentation
    :maxdepth: 2

    install
-   develop
    release
    cli
    api
@@ -8,42 +8,45 @@
 Installation
 ==============

-First install mamba_ or conda (preferably via mambaforge_, as it is already
-setup to use conda-forge_ as its main distribution channel). Then, create a
-new environment, containing this package:
+Installation may follow one of two paths: deployment or development. Choose the
+relevant tab for details on each of those installation paths.

-.. tab:: mamba/conda (RECOMMENDED)
+.. tab:: Deployment (pixi)

-   .. code-block:: sh
+   Use pixi_ to add this package as a dependence:

-      # installs the latest release on conda-forge:
-      mamba create -n idiap-devtools idiap-devtools
+   .. code:: sh

-      # OR, installs the latest development code:
-      mamba create -n idiap-devtools -c https://www.idiap.ch/software/biosignal/conda/label/beta idiap-devtools
+      pixi add idiap-devtools

-.. tab:: pip
+.. tab:: Development

-   .. warning::
+   Checkout the repository, and then use pixi_ to setup a full development
+   environment:

-      While this is possible for testing purposes, it is **not recommended**,
-      as this package depends on conda/mamba for some of its functionality. If
-      you decide to do so, create a new conda/mamba environment, and
-      pip-install this package on it.
+   .. code:: sh

-   .. code-block:: sh
+      git clone git@gitlab.idiap.ch:software/idiap-devtools
+      pixi install --frozen

-      # creates the new environment
-      mamba create -n idiap-devtools python=3 pip conda mamba conda-build boa
-      conda activate idiap-devtools
+   .. tip::

-      # installs the latest release on PyPI:
-      pip install idiap-devtools
+      The ``--frozen`` flag will ensure that the latest lock-file available
+      with sources is used. If you'd like to update the lock-file to the
+      latest set of compatible dependencies, remove that option.

-      # OR, installs the latest development code:
-      pip install git+https://gitlab.idiap.ch/software/idiap-devtools
+      If you use `direnv to setup your pixi environment
+      <https://pixi.sh/latest/features/environment/#using-pixi-with-direnv>`_
+      when you enter the directory containing this package, you can use a
+      ``.envrc`` file similar to this:
+
+      .. code:: sh
+
+         watch_file pixi.lock
+         export PIXI_FROZEN="true"
+         eval "$(pixi shell-hook)"
 .. _idiap-devtools.install.running:

@@ -53,35 +56,11 @@ Running

 This package contains a single command-line executable named ``devtool``, which
 in turn contains subcommands with various actions. To run the main
-command-line tool, you must first activate the environment where it is
-installed in, and then call it on the command-line:
-
-.. code-block:: sh
-
-   conda activate idiap-devtools
-   devtool --help
-   conda deactivate  # to go back to the previous state
-
-It is possible to use the command ``mamba run`` (or ``conda run``, if you
-installed miniforge) to, instead, automatically prefix the execution of
-``devtool`` with an environment activation, and follow it with a deactivation.
-This allows to compact the above form into a "one-liner":
+command-line tool, use ``pixi run``:

 .. code-block:: sh

-   mamba run -n idiap-devtools --live-stream devtool --help
-
-.. warning::
-
-   The ``devtool`` application requires that ``mamba``/``conda`` are available
-   on the environment it executes. When using ``mamba``/``conda`` to create
-   new environments, ensure you are using the mamba executable **from the
-   ``base`` environment**. Creating new environments as sub-environments of
-   the ``idiap-devtools`` environment may have surprising effects. A way to do
-   this is to first activate the ``base`` environment, and then create the new
-   environment.
+   pixi run devtool --help

 .. _idiap-devtools.install.setup:

@@ -89,52 +68,6 @@ This allows to compact the above form into a "one-liner":

 Setup
 -----
.. _idiap-devtools.install.setup.profile:
Setting up Development Profiles
===============================
Development profiles contain a set of constants that are useful for developing
and interacting with projects from a particular GitLab group, or groups. They
may contain webserver addresses, and both Python and conda installation
constraints (package pinnings). Development profiles are GitLab repositories,
organized in a specific way, and potentially used by various development,
continuous integration, and administrative tools. Some examples:
* Software's group: https://gitlab.idiap.ch/software/dev-profile
* Biosignal's group: https://gitlab.idiap.ch/biosignal/software/dev-profile
* Bob's group: https://gitlab.idiap.ch/bob/dev-profile
While developing using the command-line utility ``devtool``, one or more
commands may require you to pass the base directory of a development profile.
You may set a number of development shortcuts by configuring the section
``[profiles]`` on the file ``~/.config/idiap-devtools.toml``, like so:
.. code-block:: toml
[profiles]
default = "software"
software = "~/dev-profiles/software"
biosignal = "~/dev-profiles/biosignal"
bob = "~/dev-profiles/bob"
custom = "~/dev-profiles/custom-profile"
.. note::
The location of the configuration file respects ``${XDG_CONFIG_HOME}``,
which defaults to ``~/.config`` in typical UNIX-style operating systems.
The special ``default`` entry refers to one of the other entries in this
section, and determines the default profile to use, if none is passed on the
command-line. All other entries match name to a local directory where the
profile is available.
Development profiles are typically shared via GitLab as independent
repositories. In this case, **it is your job to clone and ensure the profile
is kept up-to-date with your group's development requirements.**
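For example, to make the ``software`` entry above resolve, you could clone the
corresponding repository like this (a sketch; any local path works, as long as
it matches your configuration file):

.. code:: sh

   $ mkdir -p ~/dev-profiles
   $ git clone git@gitlab.idiap.ch:software/dev-profile ~/dev-profiles/software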
.. _idiap-devtools.install.setup.gitlab: .. _idiap-devtools.install.setup.gitlab:
Automated GitLab interaction Automated GitLab interaction
@@ -5,19 +5,14 @@
 .. place re-used URLs here, then include this file
 .. on your other RST sources.

-.. _conda: https://conda.io
 .. _idiap: http://www.idiap.ch
 .. _python: http://www.python.org
-.. _mamba: https://mamba.readthedocs.io/en/latest/index.html
-.. _mambaforge: https://github.com/conda-forge/miniforge#mambaforge
-.. _conda-forge: https://conda-forge.org
+.. _pip: https://pip.pypa.io/en/stable/
+.. _uv: https://github.com/astral-sh/uv
+.. _rye: https://github.com/astral-sh/rye
+.. _poetry: https://python-poetry.org
+.. _pixi: https://pixi.sh
 .. _venv: https://docs.python.org/3/library/venv.html
 .. _virtualenv: https://virtualenv.pypa.io/en/latest/
 .. _gitlab-token: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html
-.. _cookiecutter: https://cookiecutter.readthedocs.io
-.. _cookiecutter-template: https://gitlab.idiap.ch/software/cookiecutter-idiap-pypackage
 .. _semantic version numbers: https://semver.org
-.. _pip: https://pip.pypa.io/en/stable/
-.. _pip-e: https://pip.pypa.io/en/stable/cli/pip_install/#cmdoption-e
-.. _pip-config: https://pip.pypa.io/en/stable/topics/configuration/
@@ -29,9 +29,7 @@ Use the ``--help`` flag in each command to learn more about each phase.

 Pipelines for stable and non-stable (a.k.a. beta) packages may differ w.r.t.
 deployment locations for various artefacts (e.g. packages and
-documentation), but also w.r.t. requirements for the version of the profile
-to be used (e.g. tagged versions non-tagged). You should check the target
-development profile CI instructions for details.
+documentation).

 Create the Changelogs
@@ -24,16 +24,13 @@ classifiers = [
 ]
 dependencies = [
   "click>=8",
-  "cookiecutter",
   "gitpython",
   "packaging",
   "python-dateutil",
   "python-gitlab",
   "pytz",
-  "pyyaml",
   "tomli",
   "tomlkit",
-  "xdg",
 ]

 [project.urls]
@@ -64,24 +61,22 @@ platforms = ["linux-64", "osx-arm64"]

 [tool.pixi.dependencies]
 click = ">=8"
-cookiecutter = "*"
 gitpython = "*"
 packaging = "*"
 python-dateutil = "*"
 python-gitlab = "*"
 pytz = "*"
-pyyaml = "*"
 tomli = "*"
 tomlkit = "*"
-xdg = "*"
-
-# conda/mamba ecosystem dependencies
-conda = "*"
-conda-build = "*"
-mamba = "*"
-boa = "*"

-[tool.pixi.pypi-dependencies]
-idiap-devtools = { path = ".", editable = true, extras = ["qa", "doc", "test"] }
+[tool.pixi.feature.self.pypi-dependencies]
+idiap-devtools = { path = ".", editable = true }
+
+[tool.pixi.feature.py311.dependencies]
+python = "~=3.11.0"
+
+[tool.pixi.feature.py312.dependencies]
+python = "~=3.12.0"

 [tool.pixi.feature.qa.dependencies]
 pre-commit = "*"
@@ -91,6 +86,7 @@ reuse = "*"
 [tool.pixi.feature.qa.tasks]
 qa-install = "pre-commit install"
 qa = "pre-commit run --all-files"
+qa-ci = "pre-commit run --all-files --show-diff-on-failure --verbose"

 [tool.pixi.feature.doc.dependencies]
 sphinx = "*"
@@ -102,7 +98,9 @@ sphinx-inline-tabs = "*"
 sphinx-click = "*"

 [tool.pixi.feature.doc.tasks]
-doc = "rm -rf doc/api && rm -rf html && sphinx-build -aEW doc html"
+doc-clean = "rm -rf doc/api && rm -rf html"
+doc = "sphinx-build -aEW doc html"
+doctest = "sphinx-build -aEb doctest doc html/doctest"

 [tool.pixi.feature.test.dependencies]
 pytest = "*"
@@ -115,8 +113,25 @@ pdbpp = "*"
 test = "pytest -sv tests/"
 test-ci = "pytest -sv --cov-report 'html:html/coverage' --cov-report 'xml:coverage.xml' --junitxml 'junit-coverage.xml' --ignore '.profile' tests/"

+[tool.pixi.feature.build.dependencies]
+hatch = "*"
+versioningit = "*"
+twine = "*"
+
+[tool.pixi.feature.build.tasks]
+build = "hatch build"
+check = "twine check dist/*"
+upload = "twine upload dist/*"
+
+[tool.pixi.feature.dev.dependencies]
+pdbpp = "*"
+uv = "*"
+
 [tool.pixi.environments]
-default = { features = [ "qa", "doc", "test", "debug" ] }
+default = { features = ["qa", "build", "doc", "test", "dev", "py312", "self", "debug"] }
+qa-ci = { features = ["qa", "py312"] }
+build-ci = { features = ["build", "py312"] }
+test-ci-alternative = { features = ["test", "py311", "self"] }

 [tool.hatch.version]
 source = "versioningit"
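The tasks declared above can be launched through pixi once this commit is applied, for example (task and environment names as defined in this file):

   $ pixi run qa
   $ pixi run doc
   $ pixi run -e test-ci-alternative test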
@@ -7,8 +7,6 @@ import typing

 import click

-from .profile import get_profile_path
-

 def verbosity_option(
     logger: logging.Logger,
@@ -168,37 +166,3 @@ class PreserveIndentCommand(click.Command):
             formatter.write_paragraph()
             for line in self.description.split("\n"):
                 formatter.write_text(line)
def validate_profile(_: click.Context, __: str, value: str) -> str:
"""Call back for click doing a profile name validation.
Arguments:
_: current command context
__: The option being validated
value: The value set for the option
Returns
-------
The validated option value
"""
profile_path = get_profile_path(value)
if profile_path is None:
raise click.BadParameter(
"You have not provided a development profile path "
"(-P/--profile) option or set the `default' profile at "
"your configuration file. I need one of these to "
"load my development constants"
)
if not (profile_path / "profile.toml").exists():
raise click.BadParameter(
f"Error while attempting to load the profile `{value}' from "
f"`{str(profile_path / 'profile.toml')}' - file does not exist!"
)
return value
# Copyright © 2022 Idiap Research Institute <contact@idiap.ch>
#
# SPDX-License-Identifier: BSD-3-Clause
import contextlib
import copy
import logging
import pathlib
import typing
from .utils import uniq
logger = logging.getLogger(__name__)
@contextlib.contextmanager
def root_logger_protection():
"""Protect the root logger against spurious (conda) manipulation.
Still to verify: conda does some operations on loggers at import, so
we may need to put the import inside this context manager too.
"""
root_logger = logging.getLogger()
level = root_logger.level
handlers = copy.copy(root_logger.handlers)
yield
root_logger.setLevel(level)
root_logger.handlers = handlers
def make_conda_config(
options: dict[str, typing.Any],
) -> typing.Any:
"""Create a conda configuration for a build merging various sources.
This function will use the conda-build API to construct a configuration by
merging different sources of information.
Arguments:
options: A dictionary (typically read from a condarc YAML file) that
contains build and channel options
Returns
-------
A dictionary containing the merged configuration, as produced by
conda-build API's ``get_or_merge_config()`` function.
"""
with root_logger_protection():
import conda_build.api
from conda.utils import url_path
retval = conda_build.api.get_or_merge_config(None, **options)
retval.channel_urls = []
for url in options["channels"]:
# allow people to specify relative or absolute paths to local channels
# These channels still must follow conda rules - they must have the
# appropriate platform-specific subdir (e.g. win-64)
url_ = pathlib.Path(url)
if url_.is_dir():
if not url_.is_absolute():
url = str(url_.resolve())
with root_logger_protection():
url = url_path(url)
retval.channel_urls.append(url)
return retval
def use_mambabuild():
"""Inject mamba solver to conda build API to speed up resolves."""
# only importing this module will do the job.
with root_logger_protection():
from boa.cli.mambabuild import prepare
prepare()
def get_rendered_metadata(recipe_dir, config):
"""Render the recipe and returns the interpreted YAML file."""
with root_logger_protection():
import conda_build.api
# use mambabuild instead
use_mambabuild()
return conda_build.api.render(recipe_dir, config=config)
def get_parsed_recipe(metadata):
"""Render the recipe and returns the interpreted YAML file."""
with root_logger_protection():
return metadata[0][0].get_rendered_recipe_text()
def remove_pins(deps):
"""Return dependencies without their pinned versions."""
return [ll.split()[0] for ll in deps]
def parse_dependencies(recipe_dir, config) -> tuple[str, list[str]]:
"""Parse dependencies in a meta.yaml file."""
metadata = get_rendered_metadata(recipe_dir, config)
recipe = get_parsed_recipe(metadata)
requirements = []
for section in ("build", "host"):
requirements += remove_pins(recipe.get("requirements", {}).get(section, []))
# we don't remove pins for the rest of the recipe
requirements += recipe.get("requirements", {}).get("run", [])
requirements += recipe.get("test", {}).get("requires", [])
return recipe["package"]["name"], uniq(requirements)
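
# A minimal usage sketch for the helpers above (not part of the original
# module): build a configuration, then parse a recipe's dependencies. It
# assumes a recipe exists at ./conda/meta.yaml and conda-forge is reachable.
#
#   config = make_conda_config({"channels": ["conda-forge"]})
#   name, deps = parse_dependencies("conda", config)
#   print(name, sorted(deps))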
@@ -14,10 +14,6 @@ import packaging.requirements
 import packaging.version
 import tomlkit

-from git import Repo
-
-from idiap_devtools.profile import Profile
-
 logger = logging.getLogger(__name__)
@@ -56,10 +52,14 @@ def _update_readme(
     }

     # matches the graphical badge in the readme's text with the given version
-    doc_image_re = re.compile(r"docs\-(" + "|".join(variants) + r")\-", re.VERBOSE)
+    doc_image_re = re.compile(
+        r"docs\-(" + "|".join(variants) + r")\-", re.VERBOSE | re.IGNORECASE
+    )

     # matches all other occurrences we need to handle
-    branch_re = re.compile(r"/(" + "|".join(variants) + r")", re.VERBOSE)
+    branch_re = re.compile(
+        r"/(" + "|".join(variants) + r")", re.VERBOSE | re.IGNORECASE
+    )

     new_contents = []
     for line in contents.splitlines():
@@ -85,179 +85,30 @@ def _update_readme(
     return "\n".join(new_contents) + "\n"
def _pin_versions_of_packages_list(
packages_list: list[str],
dependencies_versions: list[packaging.requirements.Requirement],
) -> list[str]:
"""Add its version to each package according to a dictionary of versions.
Modifies ``packages_list`` in-place.
Iterates over ``packages_list`` and sets the version to be the corresponding
one in ``dependencies_versions``.
Edge cases:
**Package not in ``dependencies_versions``**: The package will not be
pinned.
**Package already has version specifier**: Raises a ``ValueError``.
Arguments:
packages_list: The packages to pin.
dependencies_versions: All the known packages with their desired version
pinning.
Raises
------
``ValueError`` if a version in ``dependencies_versions`` conflicts with
an already present pinning in ``packages_list``.
"""
# Check that there is not the same dependency twice in the pins
seen = set()
for d in dependencies_versions:
if d.name in seen:
raise NotImplementedError(
"Pinning with more than one specification per dependency not"
"supported."
)
seen.add(d.name)
# Make it easier to retrieve the dependency pin for each package.
dependencies_dict = {d.name: d for d in dependencies_versions}
results = []
# package is the dependency we want to pin
for pkg_id, package in enumerate(packages_list):
results.append(package)
# Get the dependency package version specifier if already present.
pkg_req = packaging.requirements.Requirement(package)
if pkg_req.url is not None:
logger.warning(
"Ignoring dependency '%s' as it is specified with a url (%s).",
pkg_req.name,
pkg_req.url,
)
# Retrieve this dependency's constraint Requirement object
desired_pin = dependencies_dict.get(pkg_req.name)
if desired_pin is None:
logger.warning(
"Dependency '%s' is not available in constraints. Skipping "
"pinning. Consider adding this package to your dev-profile "
"constraints file.",
pkg_req.name,
)
continue
# A Requirement is composed of:
# name[extras]@ url ; marker
# Or
# name[extras]specifier; marker
# Where extras and marker are optional
# The following handles those different fields
if desired_pin.url is not None:
logger.info(
"Pinning of %s will be done with a URL (%s).",
pkg_req.name,
desired_pin.url,
)
else:
# Build the 'specs' field
if not desired_pin.specifier:
logger.warning(
"Dependency '%s' has no version specifier in constraints "
"'%s'. Skipping pinning.",
pkg_req.name,
desired_pin,
)
continue
# If version specifiers are already present in that dependency
if pkg_req.specifier:
raise ValueError(
f"You cannot specify a version for the dependency {pkg_req}"
)
specs_str = str(desired_pin.specifier)
# Build the 'marker' field
if pkg_req.marker is not None:
raise ValueError(
f"You can not specify a marker for the dependency {pkg_req}! "
f"({pkg_req.marker})"
)
marker_str = ""
if desired_pin.marker is not None:
marker_str = f"; {desired_pin.marker}"
# Build the 'extras' field
if len(pkg_req.extras) > 0:
raise ValueError(
f"You can not specify extras for the dependency {pkg_req}! "
f"({pkg_req.extras})"
)
extras_str = ""
if len(desired_pin.extras) > 0:
extras_str = f"[{','.join(desired_pin.extras)}]"
# Assemble the dependency specification in one string
if desired_pin.url is not None:
final_str = "".join(
(
pkg_req.name,
extras_str,
"@ ",
desired_pin.url,
" ",
marker_str,
)
)
else:
final_str = "".join((pkg_req.name, extras_str, specs_str, marker_str))
# Replace the package specification with the pinned version
packages_list[pkg_id] = str(packaging.requirements.Requirement(final_str))
logger.debug("Package pinned: %s", packages_list[pkg_id])
return packages_list
 def _update_pyproject(
     contents: str,
     version: str,
     default_branch: str,
     update_urls: bool,
-    profile: Profile | None = None,
 ) -> str:
     """Update contents of pyproject.toml to make it release/latest ready.

-    - Sets the project.version field to the given version.
-    - Pins the dependencies version to the ones in the given dev-profile.
-    - Saves the dev-profile's url and commit in the pyproject.toml.
+    - Sets the project.version field to the given version, if the version is
+      not dynamic.
     - Updates the documentation URLs to point specifically to the given version.

-    Arguments:
-
-        contents: Text of the ``pyproject.toml`` file from a package
-
-        version: Format of the version string is '#.#.#'
-
-        default_branch: The name of the default project branch to use
-
-        update_urls: If set to ``True``, then also updates the relevant URL
-            links considering the version number provided at ``version``.
-
-        profile: Used to retrieve and note the current dev-profile commit.
+    Parameters
+    ----------
+    contents
+        Text of the ``pyproject.toml`` file from a package
+    version
+        Format of the version string is '#.#.#'
+    default_branch
+        The name of the default project branch to use
+    update_urls
+        If set to ``True``, then also updates the relevant URL links
+        considering the version number provided at ``version``.

     Returns
     -------
@@ -275,102 +126,46 @@ def _update_pyproject(
     }

     data = tomlkit.loads(contents)
+    project = data.setdefault("project", {})

-    if re.match(packaging.version.VERSION_PATTERN, version, re.VERBOSE) is not None:
+    if "version" in project.get("dynamic", []):
+        logger.info("Not setting project version on pyproject.toml as it is dynamic")
+    elif (
+        re.match(
+            packaging.version.VERSION_PATTERN,
+            version,
+            re.VERBOSE | re.IGNORECASE,
+        )
+        is not None
+    ):
         logger.info(
             "Updating pyproject.toml version from '%s' to '%s'",
             data.get("project", {}).get("version", "unknown version"),
             version,
         )
-        data["project"]["version"] = version
+        project["version"] = version
     else:
         logger.info(
-            "Not setting project version on pyproject.toml as it is "
+            f"Not setting project version on pyproject.toml as it is "
             f"not PEP-440 compliant (given value: `{version}')"
         )

-    # Pinning of the dependencies packages version
-    if profile is not None:
-        dependencies_pins = profile.python_constraints()
-
-        # Main dependencies
-        logger.info("Pinning versions of dependencies.")
-        pkg_deps = data.get("project", {}).get("dependencies", [])
-        (
-            _pin_versions_of_packages_list(
-                packages_list=pkg_deps,
-                dependencies_versions=dependencies_pins,
-            ),
-        )
-
-        # Optional dependencies
-        opt_pkg_deps = data.get("project", {}).get("optional-dependencies", [])
-        for pkg_group in opt_pkg_deps:
-            logger.info(
-                "Pinning versions of optional dependencies group `%s`.",
-                pkg_group,
-            )
-            _pin_versions_of_packages_list(
-                packages_list=opt_pkg_deps[pkg_group],
-                dependencies_versions=dependencies_pins,
-            )
-
-        # Registering dev-profile version
-        logger.info("Annotating pyproject with current dev-profile commit.")
-        logger.debug("Using dev-profile at '%s'", profile._basedir)  # noqa: SLF001
-        profile_repo = Repo(profile._basedir)  # noqa: SLF001
-        if profile_repo.is_dirty():
-            raise RuntimeError(
-                "dev-profile was modified and is dirty! Unable to ensure a "
-                "commit corresponds to the current state of that repository. "
-                "Please commit and push your changes."
-            )
-        logger.debug("Fetching origin of dev-profile.")
-        profile_repo.remotes.origin.fetch()
-        logger.debug("Checking that the local commits are available on origin.")
-        commits_ahead = [c for c in profile_repo.iter_commits("origin/main..HEAD")]
-        if len(commits_ahead) != 0:
-            raise RuntimeError(
-                "Local commits of dev-profile were not pushed to origin!\n"
-                f"(dev-profile HEAD is {len(commits_ahead)} commits ahead of "
-                "origin).\n "
-                "Please 'git push' your modifications or revert them.\n"
-                "We enforce this so a dev-profile version can always be "
-                "retrieved."
-            )
-        logger.debug("Checking we are up to date with origin.")
-        commits_behind = [c for c in profile_repo.iter_commits("HEAD..origin/main")]
-        if len(commits_behind) != 0:
-            logger.warning(
-                "Your local dev-profile is not up to date with the origin "
-                "remote. It is fine as long as you know what you are doing, "
-                "but you should consider 'git pull' the latest changes.\n"
-                "(dev-profile HEAD is %d commits behind origin)",
-                len(commits_behind),
-            )
-
-        # Actually add the dev-profile commit hash to pyproject.toml
-        data["profile"] = tomlkit.table()
-        data["profile"].add(
-            "repository_url",
-            tomlkit.item(profile_repo.remotes.origin.url).indent(4),
-        )
-        data["profile"].add(
-            "commit_hash",
-            tomlkit.item(profile_repo.commit("HEAD").hexsha).indent(4),
-        )
-
-    if not update_urls:
-        return tomlkit.dumps(data)
-
-    # matches all other occurrences we need to handle
-    branch_re = re.compile(r"/(" + "|".join(variants) + r")", re.VERBOSE)
-
-    # sets the various URLs
-    url = data["project"].get("urls", {}).get("documentation")
-    if (url is not None) and (branch_re.search(url) is not None):
-        replacement = "/v%s" % version if version is not None else f"/{default_branch}"
-        data["project"]["urls"]["documentation"] = branch_re.sub(replacement, url)
+    if update_urls:
+        # matches all other occurrences we need to handle
+        branch_re = re.compile(
+            r"/(" + "|".join(variants) + r")", re.VERBOSE | re.IGNORECASE
+        )
+
+        # sets the various URLs
+        urls = project.setdefault("urls", {})
+        docurl = urls.get("documentation")
+        if (docurl is not None) and (branch_re.search(docurl) is not None):
+            replacement = (
+                f"/v{version}" if version is not None else f"/{default_branch}"
+            )
+            urls["documentation"] = branch_re.sub(replacement, docurl)

     return tomlkit.dumps(data)
@@ -397,10 +192,11 @@ def get_latest_tag_name(
         return None
     # create list of tags' names but ignore the first 'v' character in each name
     # also filter out non version tags
+    version_pattern_re = re.compile(
+        packaging.version.VERSION_PATTERN, re.VERBOSE | re.IGNORECASE
+    )
     tag_names = [
-        tag.name[1:]
-        for tag in latest_tags
-        if re.match(packaging.version.VERSION_PATTERN, tag.name[1:])
+        tag.name[1:] for tag in latest_tags if version_pattern_re.match(tag.name)
     ]
     if not tag_names:  # no tags were found.
         return None
@@ -668,7 +464,6 @@ def release_package(
     tag_name: str,
     tag_comments: str,
     dry_run: bool = False,
-    profile: Profile | None = None,
 ) -> int | None:
     """Release a package.
@@ -676,20 +471,16 @@ def release_package(
     such as ``README.md`` and ``pyproject.toml`` will be updated according to
     the release procedures.

-    Arguments:
-
-        gitpkg: gitlab package object
-
-        tag_name: The name of the release tag
-
-        tag_comments_list: New annotations for this tag in a form of list
-
-        dry_run: If ``True``, nothing will be committed or pushed to GitLab
-
-        profile: An instance of :class:`idiap_devtools.profile.Profile` used to
-            retrieve the specifiers to pin the package's dependencies in
-            ``pyproject.toml``.
+    Parameters
+    ----------
+    gitpkg
+        gitlab package object
+    tag_name
+        The name of the release tag
+    tag_comments_list
+        New annotations for this tag in a form of list
+    dry_run
+        If ``True``, nothing will be committed or pushed to GitLab

     Returns
     -------
@@ -721,7 +512,6 @@ def release_package(
         version=version_number,
         default_branch=gitpkg.default_branch,
         update_urls=True,
-        profile=profile,
     )
     if dry_run:
         d = _get_differences(
@@ -733,7 +523,7 @@ def release_package(
     update_files_at_default_branch(
         gitpkg,
         {"README.md": readme_contents, "pyproject.toml": pyproject_contents},
-        "Increased stable version to %s" % version_number,
+        f"Increased stable version to {version_number}",
         dry_run,
     )
@@ -777,7 +567,7 @@ def release_package(
             "README.md": readme_contents_orig,
             "pyproject.toml": pyproject_contents_latest,
         },
-        "Increased latest version to %s [skip ci]" % next_version_number,
+        f"Increased latest version to {next_version_number} [skip ci]",
         dry_run,
     )
     if dry_run:
# Copyright © 2022 Idiap Research Institute <contact@idiap.ch>
#
# SPDX-License-Identifier: BSD-3-Clause
import io
import pathlib
import typing
import packaging.requirements
import tomli
import xdg
import yaml
from .logging import setup
logger = setup(__name__)
OLD_USER_CONFIGURATION = xdg.xdg_config_home() / "devtools.toml"
"""The previous default location for the user configuration file."""
USER_CONFIGURATION = xdg.xdg_config_home() / "idiap-devtools.toml"
"""The default location for the user configuration file."""
def load(directory: pathlib.Path) -> dict[str, typing.Any]:
"""Load a profile TOML file, returns a dictionary with contents."""
with (directory / "profile.toml").open("rb") as f:
return tomli.load(f)
def get_profile_path(name: str | pathlib.Path) -> pathlib.Path | None:
"""Return the local directory of the named profile.
If the input name corresponds to an existing directory, then that is
returned. Otherwise, we lookup the said name inside the user
configuration. If one exists, then the path pointed by that variable is
returned. Otherwise, an exception is raised.
Arguments:
name: The name of the local profile to return - can be either an
existing path, or any name from the user configuration file.
Returns
-------
Either ``None``, if the profile cannot be found, or a verified path, if
one is found.
"""
path = pathlib.Path(name)
if path.exists() and path.is_dir():
logger.debug(f"Returning path to profile {str(path)}...")
return path
# makes the user move the configuration file quickly!
if OLD_USER_CONFIGURATION.exists():
raise RuntimeError(
f"Move your configuration from "
f"{str(OLD_USER_CONFIGURATION)} to {str(USER_CONFIGURATION)}, "
f"and then re-run this application."
)
# if you get to this point, then no local directory with that name exists
# check the user configuration for a specific key
if USER_CONFIGURATION.exists():
logger.debug(f"Loading user-configuration from {str(USER_CONFIGURATION)}...")
with USER_CONFIGURATION.open("rb") as f:
usercfg = tomli.load(f)
else:
usercfg = {}
if name == "default":
value = usercfg.get("profiles", {}).get("default", None)
if value is None:
return None
name = value
value = usercfg.get("profiles", {}).get(name, None)
if value is None:
logger.warning(
f"Requested profile `{name}' is not an existing directory "
f"or an existing profile key (may be you forgot to clone "
f"the relevant repository or setup your configuration file?)"
)
return None
return pathlib.Path(value).expanduser()
class Profile:
"""A class representing the development profile.
Arguments:
path: The name of the local profile to return - can be either an
existing path, or any name from the user configuration file.
"""
data: dict[str, typing.Any] #: A readout of the ``profile.toml`` file
_basedir: pathlib.Path
def __init__(self, name: str | pathlib.Path):
basedir = get_profile_path(name)
if basedir is None:
raise FileNotFoundError(
f"Cannot find `profile.toml' in the input "
f"profile path or key: `{name}' (resolved to `{basedir}')"
)
self._basedir = basedir
logger.info(
f"Loading development profile from `{name}' "
f"(resolved to `{basedir}')..."
)
with (self._basedir / "profile.toml").open("rb") as f:
self.data = tomli.load(f)
def conda_config(
self, python: str, public: bool, stable: bool
) -> typing.Any: # Using Any as type, as either flake8, mypy, or sphinx
# will complain about conda otherwise. Will anyway be fixed when
# resolving https://gitlab.idiap.ch/software/idiap-devtools/-/issues/3
"""Build the conda-configuration to use based on the profile.
Arguments:
python: The python version in the format "X.Y" (e.g. "3.11" or
"3.12")
public: Set to ``True`` if we should use public channels/indexes to
look up dependencies. Should be ``False`` otherwise
stable: Set to ``True`` if we should only consider stable versions
of packages, as opposed to pre-release ones (beta packages). Set
to ``False`` otherwise.
return_type:
conda_build.config.Config: A dictionary containing the merged
configuration, as produced by conda-build API's
get_or_merge_config() function.
"""
baserc = self.data.get("conda", {}).get("baserc")
if baserc is None:
condarc_options: dict[str, typing.Any] = dict(show_channel_urls=True)
else:
f = io.BytesIO(self.data["conda"]["baserc"].encode())
condarc_options = yaml.load(f, Loader=yaml.FullLoader)
channel_data = self.data.get("conda", {}).get("channels")
privacy_key = "public" if public else "private"
stability_key = "stable" if stable else "beta"
if channel_data is not None:
channels = channel_data[privacy_key][stability_key]
else:
channels = ["conda-forge"]
condarc_options["channels"] = channels
# incorporate constraints, if there are any
constraints = self.data.get("conda", {}).get("constraints")
if constraints is not None:
constraints_path = self._basedir / pathlib.Path(constraints)
condarc_options["variant_config_files"] = str(constraints_path)
# detect append-file, if any
copy_files = self.data.get("conda", {}).get("build-copy")
if copy_files is not None:
append_file = [k for k in copy_files if k.endswith("recipe_append.yaml")]
if append_file:
condarc_options["append_sections_file"] = str(
self._basedir / append_file[0]
)
condarc_options["python"] = python
conda_build_copy = self.data.get("conda", {}).get("build-copy", [])
append_file = [k for k in conda_build_copy if k != constraints]
append_file = append_file[0] if append_file else None
from .conda import make_conda_config
return make_conda_config(condarc_options)
def python_indexes(self, public: bool, stable: bool) -> list[str]:
"""Return Python indexes to be used according to the current profile.
Arguments:
public: Set to ``True`` if we should use public channels/indexes to
look up dependencies. Should be ``False`` otherwise
stable: Set to ``True`` if we should only consider stable versions
of packages, as opposed to pre-release ones (beta packages). Set
to ``False`` otherwise.
"""
indexes = self.data.get("python", {}).get("indexes")
privacy_key = "public" if public else "private"
stability_key = "stable" if stable else "beta"
if indexes is not None:
return indexes[privacy_key][stability_key]
return [] if stable else ["--pre"]
def get(
self, key: str | typing.Iterable[str], default: typing.Any = None
) -> typing.Any:
"""Read the contents of a certain toml profile variable."""
if isinstance(key, str):
return self.data.get(key, default)
# key is a tuple of strings, iterate over the dictionary
d = self.data
for level in key:
d = d.get(level)
if d is None:
return default
return d
def get_path(
self,
key: str | typing.Iterable[str],
default: None | pathlib.Path = None,
) -> pathlib.Path | None:
"""Read the contents of path from the profile and resolves it.
This function will search for a given profile key, consider it points
to a path (relative or absolute) and will return that resolved path to
the caller.
Arguments:
key: The key, pointing to the variable inside ``profile.toml`` that
contains the datafile to be loaded
default: The value to return to the caller by default, if the key
does not exist within the profile.
Returns
-------
The selected profile file path, or the contents of ``default``
otherwise.
"""
path = self.get(key)
if path is None:
return default
if isinstance(path, dict):
raise KeyError(f"Key {key} does not correspond to a path")
ppath = pathlib.Path(path)
if not ppath.is_absolute():
ppath = self._basedir / ppath
return ppath
def get_file_contents(
self, key: str | typing.Iterable[str], default: None | str = None
) -> str | None:
"""Read the contents of a file from the profile.
This function will search for a given profile key, consider it points
to a filename (relative or absolute) and will read its contents,
returning them to the caller.
Arguments:
key: The key, pointing to the variable inside ``profile.toml`` that
contains the datafile to be loaded
default: The value to return to the caller by default, if the key
does not exist within the profile.
Returns
-------
The contents of the selected profile file, or the contents of
``default`` otherwise.
"""
path = self.get_path(key)
return path.open().read() if path is not None else default
def conda_constraints(self, python: str) -> dict[str, str] | None:
"""Return a list of conda constraints given the current profile.
Arguments:
python: The python version in the format "X.Y" (e.g. "3.9" or
"3.10")
"""
content = self.get_file_contents(("conda", "constraints"))
if content is None:
return None
idx1 = content.find("# AUTOMATIC PARSING START")
idx2 = content.find("# AUTOMATIC PARSING END")
content = content[idx1:idx2]
# filter out using conda-build specific markers
from conda_build.metadata import ns_cfg, select_lines
config = self.conda_config(python, public=True, stable=True)
content = select_lines(content, ns_cfg(config), variants_in_place=False)
package_pins = yaml.safe_load(content)
package_names_map = package_pins.pop("package_names_map")
return {
f"{package_names_map.get(p, p)}": f"{str(v[0]).split(' ')[0]}"
for p, v in package_pins.items()
}
def python_constraints(
self,
) -> list[packaging.requirements.Requirement] | None:
"""Return a list of Python requirements given the current profile."""
content = self.get_file_contents(("python", "constraints"))
if content is None:
return None
return [packaging.requirements.Requirement(k) for k in content.split()]
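
# A minimal usage sketch for this module (not part of the original file). It
# assumes ~/.config/idiap-devtools.toml maps the "default" entry to a cloned
# dev-profile repository:
#
#   from idiap_devtools.profile import Profile
#   p = Profile("default")
#   print(p.python_indexes(public=True, stable=True))
#   print(p.python_constraints())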
@@ -4,8 +4,8 @@

 import click

-from ...click import PreserveIndentCommand, verbosity_option
-from ...logging import setup
+from ..click import PreserveIndentCommand, verbosity_option
+from ..logging import setup

 logger = setup(__name__.split(".", 1)[0])
@@ -127,8 +127,8 @@ def badges(package, update_readme, dry_run, server, **_) -> None:

     import gitlab

-    from ...gitlab import get_gitlab_instance
-    from ...gitlab.release import update_files_at_default_branch
+    from ..gitlab import get_gitlab_instance
+    from ..gitlab.release import update_files_at_default_branch

     if dry_run:
         click.secho("!!!! DRY RUN MODE !!!!", fg="yellow", bold=True)
@@ -6,8 +6,8 @@ import sys

 import click

-from ...click import PreserveIndentCommand, verbosity_option
-from ...logging import setup
+from ..click import PreserveIndentCommand, verbosity_option
+from ..logging import setup

 logger = setup(__name__.split(".", 1)[0])
@@ -105,8 +105,8 @@ def changelog(target, output, mode, since, **_) -> None:

     import datetime
     import pathlib

-    from ...gitlab import get_gitlab_instance
-    from ...gitlab.changelog import (
+    from ..gitlab import get_gitlab_instance
+    from ..gitlab.changelog import (
         get_last_tag_date,
         parse_date,
         write_tags_with_commits,
@@ -5,11 +5,17 @@

 import click

 from ..click import AliasedGroup
-from .env import env
-from .fullenv import fullenv
-from .gitlab import gitlab
-from .pixi import pixi
-from .update_pins import update_pins
+from ..logging import setup
+from .badges import badges
+from .changelog import changelog
+from .getpath import getpath
+from .jobs import jobs
+from .lasttag import lasttag
+from .release import release
+from .runners import runners
+from .settings import settings
+
+logger = setup(__name__.split(".", 1)[0])

 @click.group(
@@ -17,13 +23,24 @@ from .update_pins import update_pins
     context_settings=dict(help_option_names=["-?", "-h", "--help"]),
 )
 @click.version_option()
-def cli():
-    """Idiap development tools - see available commands below."""
+def cli() -> None:
+    """Commands that interact directly with GitLab.
+
+    Commands defined here are supposed to interact with gitlab, and
+    add/modify/remove resources on it directly.
+
+    To avoid repetitive asking, create a configuration file as indicated
+    at :ref:`idiap-devtools.install.setup.gitlab` section of the user
+    guide.
+    """
     pass

-cli.add_command(env)
-cli.add_command(fullenv)
-cli.add_command(gitlab)
-cli.add_command(pixi)
-cli.add_command(update_pins)
+cli.add_command(changelog)
+cli.add_command(release)
+cli.add_command(badges)
+cli.add_command(runners)
+cli.add_command(jobs)
+cli.add_command(getpath)
+cli.add_command(lasttag)
+cli.add_command(settings)
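Once this group is wired into the ``devtool`` entry point, each registered command becomes a subcommand; usage would then look like this (a sketch, assuming the wiring shown elsewhere in this commit):

   $ devtool --help
   $ devtool changelog --help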
# Copyright © 2022 Idiap Research Institute <contact@idiap.ch>
#
# SPDX-License-Identifier: BSD-3-Clause
import pathlib
import sys
import typing
import click
from ..click import PreserveIndentCommand, validate_profile, verbosity_option
from ..logging import setup
from ..profile import Profile
logger = setup(__name__.split(".", 1)[0])
def _load_conda_packages(
meta: list[pathlib.Path], conda_config: typing.Any
) -> tuple[list[pathlib.Path], list[str], list[str]]:
"""Load dependence packages by scanning conda recipes.
Input entries in ``meta`` may correspond to various types of entries, of
which the following are handled by this function:
* A full path to a ``meta.yaml`` file that will be scanned
* A project directory containing a conda recipe at ``conda/meta.yaml``
* The name of a conda package to install (condition: it contains no path
separator and does not exist as a file or directory in the filesystem)
Arguments:
meta: List of directories, conda recipes and conda package names to parse.
conda_config: Profile-dependent conda configuration to use for
determining packages to install and constraints.
Returns
-------
A list of non-consumed elements from the ``meta`` list, the list of
parsed package names, and finally the list of dependencies from the
parsed recipes.
"""
import os
from .. import conda, utils
consumed = []
parsed_packages = []
conda_packages = []
for m in meta:
if m.name == "meta.yaml":
# user has passed the full path to the file
# we can consume this from the input list
recipe_dir = str(m.parent)
logger.info(f"Parsing conda recipe at {recipe_dir}...")
pkg_name, pkg_deps = conda.parse_dependencies(recipe_dir, conda_config)
logger.info(f"Added {len(pkg_deps)} packages from package '{pkg_name}'")
parsed_packages.append(pkg_name)
conda_packages += pkg_deps
consumed.append(m)
elif (m / "conda" / "meta.yaml").exists():
# it is the root of a project
# may need to parse it for python packages later on
recipe_dir = str(m / "conda")
logger.info(f"Parsing conda recipe at {recipe_dir}...")
pkg_name, pkg_deps = conda.parse_dependencies(recipe_dir, conda_config)
logger.info(f"Added {len(pkg_deps)} packages from package '{pkg_name}'")
parsed_packages.append(pkg_name)
conda_packages += pkg_deps
elif not m.exists() and os.sep not in str(m):
# it is a conda package name, add to list of packages to install
# we can consume this from the input list
logger.info(f"Adding conda package {m}...")
conda_packages.append(str(m))
consumed.append(m)
meta = [k for k in meta if k not in consumed]
# we should install all packages that have not been parsed yet
conda_packages = [k for k in conda_packages if k not in parsed_packages]
# now we sort and make it unique
conda_packages = utils.uniq(sorted(conda_packages))
logger.info(
f"Adding {len(conda_packages)} conda packages to installation plan",
)
return meta, parsed_packages, conda_packages
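# Illustration (hypothetical paths/names): given
#   meta = [Path("pkg-a/conda/meta.yaml"), Path("pkg-b"), Path("compilers")]
# the recipe file and the bare package name "compilers" are consumed, while
# the project directory "pkg-b" is parsed for its conda recipe but remains on
# the returned list, so it can still be scanned for a ``pyproject.toml``.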
def _load_python_packages(
    the_profile: Profile,
    python: str,
    meta: list[pathlib.Path],
    conda_pkgs: list[str],
) -> tuple[list[pathlib.Path], list[str], list[str]]:
    """Load dependency packages by scanning Python recipes.

    Input entries in ``meta`` may correspond to various types of entries, of
    which this function handles the following:

    * A full path to a ``pyproject.toml`` file that will be scanned
    * A project directory containing a Python project declaration at
      ``pyproject.toml``

    Arguments:

        the_profile: The current development profile, that will be looked up
          for conda-to-Python package name translations.

        python: The version of Python to load conda constraints for.

        meta: List of directories and Python project definitions to scan.

        conda_pkgs: List of conda packages that either have been parsed, or
          are dependencies of parsed recipes.  We must **not** install Python
          equivalents for those.

    Returns
    -------
        A list of non-consumed elements from the ``meta`` list, the list of
        pure-Python dependencies from the parsed recipes, that are not in
        ``conda_pkgs`` and have no conda equivalents, and finally, an
        extension to the list of conda packages that can be installed that
        way.
    """

    from .. import python as pyutils
    from .. import utils

    parsed_packages = [k.split(" ", 1)[0] for k in conda_pkgs]
    to_python = the_profile.get(("conda", "to_python"), {})
    parsed_packages = [to_python.get(k, k) for k in parsed_packages]

    consumed = []
    python_packages = []

    for m in meta:
        if m.name == "pyproject.toml":
            # user has passed the full path to the file
            # we can consume this from the input list
            logger.info(f"Parsing Python package at {str(m)}...")
            pkg_name, pkg_deps = pyutils.dependencies_from_pyproject_toml(m)
            logger.info(
                f"Added {len(pkg_deps)} Python packages from package '{pkg_name}'",
            )
            parsed_packages.append(pkg_name)
            python_packages += pkg_deps
            consumed.append(m)

        elif (m / "pyproject.toml").exists():
            # it is the root of a project
            proj = m / "pyproject.toml"
            logger.info(f"Parsing Python package at {str(proj)}...")
            pkg_name, pkg_deps = pyutils.dependencies_from_pyproject_toml(proj)
            logger.info(
                f"Added {len(pkg_deps)} Python packages from package '{pkg_name}'",
            )
            parsed_packages.append(pkg_name)
            python_packages += pkg_deps
            consumed.append(m)

    meta = [k for k in meta if k not in consumed]

    # if there are equivalent conda-pinned packages, we should prefer them
    # instead of pure-Python versions without any constraints
    conda_constraints = the_profile.conda_constraints(python)
    if conda_constraints is None:
        conda_constraints = {}

    constrained = [k for k in python_packages if k.specifier]
    unconstrained = [k for k in python_packages if not k.specifier]
    has_conda = [
        f"{k.name} {conda_constraints[k.name]}"
        for k in unconstrained
        if k.name in conda_constraints
    ]
    no_conda = [k for k in unconstrained if k.name not in conda_constraints]

    # we should install all packages that have not been parsed yet, and have
    # no conda equivalent, via Python/pip
    python_packages_str = [
        str(k) for k in constrained + no_conda if k.name not in parsed_packages
    ]

    # now we sort and make it unique
    python_packages_str = utils.uniq(sorted(python_packages_str))
    has_conda = utils.uniq(sorted(has_conda))

    logger.info(
        f"Adding {len(python_packages_str)} Python and {len(has_conda)} conda "
        f"packages to installation plan",
    )

    return meta, python_packages_str, has_conda
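# Illustration (hypothetical names/pins): an unconstrained dependency such as
# "numpy", for which the profile pins e.g. "1.26.*", is returned on the conda
# extension list as "numpy 1.26.*"; a constrained one such as "scipy>=1.10"
# stays on the pure-Python (pip) list, unless its name was already parsed.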
def _simplify_conda_plan(deps: list[str]) -> list[str]:
    """Simplify the conda package plan by removing redundant entries."""

    from .. import utils

    pins_stripped = [k.split()[0] for k in deps if len(k.split()) > 1]
    no_pins = [k for k in deps if len(k.split()) == 1]

    keep_no_pins = [k for k in no_pins if k not in pins_stripped]
    full_pins = [k for k in deps if len(k.split()) > 1]

    return utils.uniq(sorted(keep_no_pins + full_pins))
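# Worked example (hypothetical input):
#   _simplify_conda_plan(["click", "click >=8", "pip"])
#   -> ["click >=8", "pip"]
# the unpinned "click" entry duplicates the pinned one and is dropped.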
def _add_missing_conda_pins(
    the_profile: Profile, python: str, deps: list[str]
) -> list[str]:
    """Add pins to unpinned packages, to respect the profile."""

    from .. import utils

    pinned_packages = the_profile.conda_constraints(python)
    if pinned_packages is None:
        return deps

    no_pins = [k for k in deps if len(k.split()) == 1 and k in pinned_packages]
    no_change = [k for k in deps if k not in no_pins]
    new_pins = [f"{k} {v}" for k, v in pinned_packages.items() if k in no_pins]

    return utils.uniq(sorted(no_change + new_pins))
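# Worked example (hypothetical profile pinning numpy to ">=1.26,<2"):
#   _add_missing_conda_pins(p, "3.11", ["numpy", "click >=8"])
#   -> ["click >=8", "numpy >=1.26,<2"]
# entries that already carry a pin, or that the profile does not pin, pass
# through unchanged.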
@click.command(
    cls=PreserveIndentCommand,
    epilog="""
Examples:

  1. Creates a conda environment installation plan for developing the
     currently checked-out package, and the development profile in
     ``../profile``:

     .. code:: sh

        $ git clone <package>
        $ cd <package>
        $ conda activate base
        (base) $ devtool env -vv .
        (base) $ mamba env create -n dev -f environment.yaml
        (base) $ conda activate dev
        (dev) $ pip install --no-build-isolation --no-dependencies --editable .

     You may, of course, hand-edit the output file ``environment.yaml`` to
     adjust for details, or add conda or Python packages to complement your
     work environment.  An example would be adding debuggers such as ``ipdb``
     to the installation plan before calling ``mamba env create``.

  2. By default, we use the native Python version of your conda installation
     as the Python version of the newly created environment.  You may select
     a different one with ``--python=X.Y``.  You may also set the output
     filename with ``--output=name.yaml``, if the default does not please
     you:

     .. code:: sh

        $ conda activate base
        (base) $ devtool env -vv --python=3.12 --output=whatever-i-like.yaml .

  3. To develop multiple packages you checked out, just add the meta package
     files of all packages you wish to consider, then pip-install the
     packages on top of the created environment, in reverse dependency order
     (e.g. package A depends on B):

     .. code:: sh

        $ mkdir dev-dir
        $ cd dev-dir
        $ git clone <repo-of-B> src/B
        $ git clone <repo-of-A> src/A
        $ conda activate base
        (base) $ devtool env -vv src/*
        (base) $ mamba env create -n dev -f environment.yaml
        (base) $ conda activate dev
        (dev) $ pip install --no-build-isolation --no-dependencies --editable "src/B"
        (dev) $ pip install --no-build-isolation --no-dependencies --editable "src/A"

""",
)
@click.argument(
    "meta",
    nargs=-1,
    required=True,
    type=click.Path(path_type=pathlib.Path),
)
@click.option(
    "-P",
    "--profile",
    default="default",
    show_default=True,
    callback=validate_profile,
    help="Directory containing the development profile (and a file named "
    "profile.toml), or the name of a configuration key pointing to the "
    "development profile to use",
)
@click.option(
    "-p",
    "--python",
    default=("%d.%d" % sys.version_info[:2]),
    show_default=True,
    help="Version of python to build the environment for",
)
@click.option(
    "-u/-U",
    "--public/--no-public",
    default=True,
    help="Set this to **include** private channels/indexes on your plan. "
    "For conda packages in this case, you **must** execute this within the "
    "Idiap intranet.",
)
@click.option(
    "-s/-S",
    "--stable/--no-stable",
    default=False,
    help="Set this to **exclude** beta channels from your build",
)
@click.option(
    "-o",
    "--output",
    default="environment.yaml",
    show_default=True,
    help="The name of the environment plan file",
    type=click.Path(path_type=pathlib.Path),
)
@verbosity_option(logger=logger)
def env(
    meta,
    profile,
    python,
    public,
    stable,
    output,
    **_,
) -> None:
    """Create a development environment for one or more projects.

    The environment is created by scanning conda's ``meta.yaml`` and Python
    ``pyproject.toml`` files for all input projects.  All input that is not
    an existing file path is considered a supplemental conda package to be
    installed.  The environment is dumped to disk in the form of a
    conda-installable YAML environment.  The user may edit this file to add
    Python packages that may be of interest.

    To interpret ``meta.yaml`` files found on the input directories, this
    command uses the conda render API to discover all profile-constrained
    and unconstrained packages to add to the new environment.
    """

    import shutil

    import yaml

    # 1. loads profile data
    the_profile = Profile(profile)

    # 2. loads all conda package data, reset "meta" to remove consumed entries
    conda_config = the_profile.conda_config(python, public, stable)
    leftover_meta, conda_parsed, conda_packages = _load_conda_packages(
        meta, conda_config
    )

    # 3. loads all python package data, reset "meta" to remove consumed entries
    (
        leftover_meta,
        python_packages,
        extra_conda_packages,
    ) = _load_python_packages(
        the_profile, python, leftover_meta, conda_parsed + conda_packages
    )

    # At this point, there shouldn't be anything else to consume
    if leftover_meta:
        logger.error(
            f"Ended parsing with unconsumed entries from the command-line: "
            f"{', '.join(str(k) for k in leftover_meta)}"
        )

    # Adds python on the required version
    conda_packages.append(f"python {python}")
    conda_packages += extra_conda_packages

    # Always append pip, if that is not the case already;
    # we need it for our own package installation later on
    conda_packages.append("pip")

    # Performs "easy" simplification: if a package appears in two entries,
    # but one of them is a pin, keep only the pin
    conda_packages = _simplify_conda_plan(conda_packages)

    # Adds missing pins
    conda_packages = _add_missing_conda_pins(the_profile, python, conda_packages)

    # Write package installation plan, in YAML format
    data: dict[str, typing.Any] = dict(channels=conda_config.channels)

    if python_packages:
        conda_packages.append(
            dict(  # type: ignore
                pip=the_profile.python_indexes(public, stable) + python_packages
            )
        )

    data["dependencies"] = conda_packages

    # backup previous installation plan, if one exists
    if output.exists():
        backup = output.parent / (output.name + "~")
        shutil.copy(output, backup)

    with output.open("w") as f:
        import math

        yaml.dump(data, f, width=math.inf)

    click.echo(
        "Run the following commands to create and prepare your development "
        "environment:"
    )
    install_cmds = [
        f"mamba env create --force -n dev -f {output}",
        "conda activate dev",
    ]
    for k in meta:
        install_cmds.append(
            f"pip install --no-build-isolation --no-dependencies --editable {k}",
        )
    for k in install_cmds:
        click.secho(k, fg="yellow", bold=True)
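For reference, the plan written by this command is an ordinary conda environment file. A sketch of the ``data`` mapping dumped above, with hypothetical channels, packages and pins; the ``pip`` entry holds the profile's package indexes followed by the pure-Python dependencies:

# hypothetical values, for illustration only
data = {
    "channels": ["conda-forge"],  # from the profile's conda configuration
    "dependencies": [
        "click >=8",
        "pip",
        "python 3.11",
        {"pip": ["<index-urls...>", "tomlkit"]},
    ],
}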