# bob issues
https://gitlab.idiap.ch/groups/bob/-/issues (2022-11-17)

## Install ffmpeg on macOS M1
https://gitlab.idiap.ch/bob/bob.bio.video/-/issues/25 (André MAYORAZ, 2022-11-17)

Using methods from `imageio-ffmpeg` on macOS M1 leads to the following error:
```
RuntimeError: No FFmpeg exe could be found. Install FFmpeg on your system, or set the IMAGEIO_FFMPEG_EXE environment variable.
```
According to [this issue on github](https://github.com/imageio/imageio-ffmpeg/issues/71), the FFmpeg binary is currently not included in the imageio-ffmpeg wheel for macOS M1, but it works when installed through conda.
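Until then, a possible user-side workaround (a sketch; it assumes FFmpeg was installed separately, e.g. with `brew install ffmpeg`, and that `/opt/homebrew/bin` is the Homebrew prefix on Apple silicon) is to point `imageio-ffmpeg` at a system binary through the environment variable named in the error:

```python
import os
import shutil

# Find a system FFmpeg on the PATH; fall back to the usual Homebrew
# location on Apple silicon (hypothetical default, adjust as needed).
ffmpeg = shutil.which("ffmpeg") or "/opt/homebrew/bin/ffmpeg"

# imageio-ffmpeg checks this variable before looking for a bundled
# binary, so it must be set before the library is first used.
os.environ["IMAGEIO_FFMPEG_EXE"] = ffmpeg
```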
This would mean that we have to find a workaround to install ffmpeg if we don't want to use conda, or wait for the maintainers of `imageio-ffmpeg` to create a proper wheel for this OS.

## dask problem with fitting GMMMachine ...
https://gitlab.idiap.ch/bob/bob.learn.em/-/issues/49 (Pasra Rahimi, 2022-11-10)

Can you guys check this error? The original script can be found at `/idiap/temp/prahimi/latentplay/gmm/gmm.py`.
```bash
Loading latent data ...
Done loading.
(50000, 9216)
Working on GMMMachine with number of mixture models set to : 10
Traceback (most recent call last):
File "/somewhere/exps/latentplay/gmm/gmm.py", line 38, in <module>
machine = machine.fit(latents)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/bob/learn/em/gmm.py", line 792, in fit
self.initialize_gaussians(X)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/bob/learn/em/gmm.py", line 731, in initialize_gaussians
kmeans_machine = kmeans_machine.fit(data)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/bob/learn/em/kmeans.py", line 328, in fit
self.initialize(data=X)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/bob/learn/em/kmeans.py", line 312, in initialize
self.centroids_ = k_init(
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask_ml/cluster/k_means.py", line 365, in k_init
return init_scalable(X, n_clusters, random_state, max_iter, oversampling_factor)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask_ml/utils.py", line 550, in wraps
results = f(*args, **kwargs)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask_ml/cluster/k_means.py", line 423, in init_scalable
(cost,) = compute(evaluate_cost(X, centers))
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask_ml/cluster/k_means.py", line 486, in evaluate_cost
return (pairwise_distances(X, centers).min(1) ** 2).sum()
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask_ml/metrics/pairwise.py", line 59, in pairwise_distances
return X.map_blocks(
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask/array/core.py", line 2676, in map_blocks
return map_blocks(func, self, *args, **kwargs)
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask/array/core.py", line 873, in map_blocks
out = blockwise(
File "/somewhere/mambaforge/envs/latentplay/lib/python3.10/site-packages/dask/array/blockwise.py", line 269, in blockwise
raise ValueError(
ValueError: Dimension 1 has 2 blocks, adjust_chunks specified with 1 blocks
```

Assigned to: Flavio TARSETTI

## Where went `CSVDatasetZTNorm`?
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/90 (Christophe ECABERT, 2023-02-20)

With the recent `csv-dataset` refactoring, some derived classes have been left on the side. With the changes partially pushed to `bob`, a simple import like `from bob.bio.face.database import MobioDatabase` leads to:
```
ImportError: cannot import name 'CSVDatasetZTNorm' from 'bob.bio.base.database'
```
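A possible stop-gap on the user side while the refactoring is incomplete (a sketch; it assumes the calling code can degrade gracefully when ZT-norm support is absent):

```python
# Guarded import: ModuleNotFoundError is a subclass of ImportError, so
# this also works in environments where bob.bio.base is not installed.
try:
    from bob.bio.base.database import CSVDatasetZTNorm
except ImportError:
    CSVDatasetZTNorm = None  # removed by the csv-dataset refactoring
```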
Which is quite inconvenient.

## Default value of boxplot percentile should be None
https://gitlab.idiap.ch/bob/bob.bio.demographics/-/issues/3 (Yu Linghu, 2022-10-28)

In `plot.py`, `percentile` can be either `None` or a `float` between 0 and 1, but the `type` defined in the click option is `float`, so `None` is not a valid value on the command line (`--percentile None` fails). The default of the `--percentile` option should therefore be `None`, so that the whole set of scores is used unless a `float` is given explicitly. With a `float` default like `0.5`, there is no way to request the full score set for the boxplot, since passing `--percentile None` raises an error.
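The proposed declaration could look like this (a sketch with hypothetical names; the real option lives in `plot.py`). With `default=None`, omitting the flag yields `None` and the whole score set, while a `float` can still be passed explicitly, so no `--percentile None` sentinel is needed:

```python
import click

@click.command()
@click.option(
    "--percentile",
    type=float,
    default=None,
    help="Optional percentile in [0, 1]; omit to use all scores.",
)
def boxplot(percentile):
    # default=None: the option stays None unless a float is given.
    if percentile is None:
        click.echo("using all scores")
    else:
        click.echo(f"cutting at percentile {percentile}")
```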
I have this command updated in the branch `change_boxplot_percentile_default`, but it cannot pass the CI/CD because the name updates in `bob.bio.base` are not reflected here. I will take care of it if necessary.

## Token for pypi deployment
https://gitlab.idiap.ch/bob/bob.learn.em/-/issues/48 (André MAYORAZ, 2022-10-19)

There is currently a problem with the local-pypi job that is supposed to deploy the package:
```
$ test -z "${CI_COMMIT_TAG}" && citool deregister --git-url "${CI_SERVER_URL}" --token "${PYPI_PACKAGE_REGISTRY_TOKEN}" --project "${CI_PROJECT_ID}" --name "${PACKAGE_NAME}" --version "${PACKAGE_VERSION}"
Traceback (most recent call last):
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/exceptions.py", line 333, in wrapped_f
return f(*args, **kwargs)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/mixins.py", line 246, in list
obj = self.gitlab.http_list(path, **data)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/client.py", line 942, in http_list
gl_list = GitlabList(self, url, query_data, get_next=False, **kwargs)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/client.py", line 1144, in __init__
self._query(url, query_data, **self._kwargs)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/client.py", line 1154, in _query
result = self._gl.http_request("get", url, query_data=query_data, **kwargs)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/client.py", line 798, in http_request
raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 403: 403 Forbidden
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/scratch/builds/bob/bob.learn.em/venv/bin/citool", line 8, in <module>
sys.exit(main())
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/citools/script.py", line 54, in main
args.func(args)
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/citools/deregister.py", line 43, in _wrapper
for p in proj.packages.list():
File "/scratch/builds/bob/bob.learn.em/venv/lib/python3.10/site-packages/gitlab/exceptions.py", line 335, in wrapped_f
raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabListError: 403: 403 Forbidden
Cleaning up project directory and file based variables
00:01
ERROR: Job failed: exit code 1
```
The problem is in the call to `proj.packages.list()` in the `citools/deregister.py` file.
From what I investigated, the `gitlab.exceptions.GitlabListError: 403: 403 Forbidden` error could indicate that no token is provided to GitLab, and therefore the runner cannot access the packages.
For reference, `proj.packages.list()` results in a call like:
```
curl --header "PRIVATE-TOKEN: <token>" "https://gitlab.idiap.ch/api/v4/projects/1510/packages"
```
If an invalid token is given, it returns `{"message":"401 Unauthorized"}`, but if no token is given at all, it returns `{"message":"403 Forbidden"}`.
This would mean that `${PYPI_PACKAGE_REGISTRY_TOKEN}` is never set.
Where is it supposed to be set?
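If the variable is meant to come from the project or group CI/CD settings, one way to turn the silent 403 into an explicit configuration error is to guard the job script (a sketch, not the actual job definition):

```yaml
local-pypi:
  script:
    # ${VAR:?message} makes the shell abort with the message when VAR is
    # unset, instead of sending an empty token to the GitLab API.
    - test -n "${PYPI_PACKAGE_REGISTRY_TOKEN:?define it under Settings > CI/CD > Variables}"
    - test -z "${CI_COMMIT_TAG}" && citool deregister --git-url "${CI_SERVER_URL}" --token "${PYPI_PACKAGE_REGISTRY_TOKEN}" --project "${CI_PROJECT_ID}" --name "${PACKAGE_NAME}" --version "${PACKAGE_VERSION}"
```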
And why not use `${CI_JOB_TOKEN}`, given that `deregister.py` calls GitLab API methods?

## Wrong link in package documentation
https://gitlab.idiap.ch/bob/bob.extension/-/issues/89 (Manuel Günther, 2022-10-04)

On the first page of the documentation, the link for `bob development tools` points to https://www.idiap.ch/software/bob/develop, which does not seem to exist. The same issue appears on other pages:
https://www.idiap.ch/software/bob/docs/bob/bob.extension/v7.0.2/development.html
https://www.idiap.ch/software/bob/docs/bob/bob.extension/v7.0.2/pure_python.html#building-your-package
I am not sure which should be the right package to point at.

## Scale function on preprocessor/Scaler.py cannot handle variable input shapes
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/88 (Luis LUEVANO, 2022-09-30)

When running verification without annotations, the scale function of Scaler.py is used. However, it does not handle scaling for input images of different shapes within the same SampleBatch. In the scale function, check_array processes the SampleBatch and assumes the shape of the first image in the batch for all the other images; when the shapes differ, it throws an exception.

## Adding xlrd library
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/111 (Flavio TARSETTI, 2022-09-29)
Add `xlrd`.
Package request from @smichel.

Assigned to: Flavio TARSETTI

## Test for the deep_pix_bis pipeline fails on the linux CI pipeline
https://gitlab.idiap.ch/bob/bob.pad.face/-/issues/49 (Yannick DAYER, 2022-10-06)
Jobs [#283925](https://gitlab.idiap.ch/bob/bob.pad.face/-/jobs/283925) and [#283926](https://gitlab.idiap.ch/bob/bob.pad.face/-/jobs/283926) failed.
The `deep_pix_bis` pipeline fails by returning a prediction score of `0.60` or more (the value changes between runs) when expected to be below `0.04`.
The mac_intel and mac_arm CI pipelines pass.
The test also passes on a fresh local environment.

Assigned to: Yannick DAYER

## Adding Model Complexity Measurements
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/87 (Pasra Rahimi, 2022-09-22)
I think we should introduce a couple of model complexity measurements (in the sense of number of parameters, execution time, FLOPS, ...) to the pipelines.
This will be hard, especially in the case of execution time, since to the best of my understanding the infrastructure is not normalized at this point.
Let me know your comments.

## Adding PFC
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/86 (Pasra Rahimi, 2022-09-22)

I will try to add PFC (with a ViT backbone) to the repo; if possible, please assign me.

Assigned to: Pasra Rahimi

## Overhauling our CI and pipelines
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/110 (André Anjos, 2022-11-17)

I'm proposing we should re-think the whole CI procedure entirely. Our CI system was made for the time we had some C++ code to be compiled and made available for other packages (@lcolbois: this was probably another reason for the splits).
As of now and thanks to a lot of work by predecessors, most of the packages in this group are Python-only - this is certainly true for the (biometrics) nightlies. This means we can probably benefit from this fact and improve our pipelines, at least for most packages in the bob/beat namespace.
I have a proposal in mind, but I think we need to debate this a bit more thoroughly as I may not be thinking about all aspects and oversimplifying. The bulk of the proposal goes through revising our pipeline requirements and moving most of the packages into a python-only workflow (for most part). The proposed pipeline would be like this:
1. QA via `pre-commit`, if `.pre-commit-config.yaml` is present - only linux, should fail fast. Caching is enabled.
2. Test (via pytest) on various platforms and python versions - uses a Python docker image (not conda), respects python package pinning, retrieve beta versions from internal Gitlab package registry (if they exist), otherwise PyPI. Pip caching is enabled.
3. Generate sphinx documentation building and doctests (if `doc/` directory is present) - only linux, via Python docker image (not conda)
4. Package python code (via `python setup.py sdist`)
5. Package conda package using the sdist produced at the previous step, run tests (single platform - linux, just to cross-check)
6. Deploy sdist (unreleased beta) package on internal "group" package registry (GitLab: https://docs.gitlab.com/ee/user/packages/pypi_repository/) if not tagged or private/internal. Deploy sdist package on PyPI if tagged and public.
7. Deploy conda package on internal beta channel if not tagged, deploy on stable channel if tagged.
8. Deploy documentation and coverage information on DAV web server if success (master and stable, if required)
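Steps 1 and 2 above could translate into something like this (a rough sketch with hypothetical job names, images, and commands, not a finished configuration):

```yaml
stages: [qa, test]

qa:
  stage: qa
  image: python:3.10          # plain Python image, no conda
  rules:
    - exists: [.pre-commit-config.yaml]
  cache:
    paths: [.cache/pre-commit]
  script:
    - pip install pre-commit
    - pre-commit run --all-files

test:
  stage: test
  image: python:$PYTHON       # one job per version via parallel:matrix
  parallel:
    matrix:
      - PYTHON: ["3.9", "3.10"]
  cache:
    paths: [.cache/pip]
  script:
    - pip install -e ".[test]"
    - pytest
```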
For non-python packages, maintain minimal CI configuration, but allow maximum flexibility via their own personalised `conda_build_config.yaml`. Everything else should match the above pipeline.
I also have some ideas on how to improve the packaging:
* Moving to `pyproject.toml` (https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html), and
* Reflect that automatically on conda packages to deduplicate package lists (https://docs.conda.io/projects/conda-build/en/stable/resources/define-metadata.html#loading-data-from-other-files)
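A minimal sketch of such a `pyproject.toml` (hypothetical package name and dependency list):

```toml
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "bob.example"   # hypothetical
version = "1.0.0"
dependencies = [
    "numpy",
    "click",
]
```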
I expect this plan should make our pipelines run much faster, while simplifying the setup and testing. Conda environment creation continues to be supported and possible. For other simpler setups, simple Python software management (e.g. via pip or poetry) would be possible.
(This would affect issue #102 for example, #104 and maybe #103.)

Assigned to: André Anjos

## `bdt dev create` fails to create a Python 3.8 or 3.9 environment and installs Python 3.10 instead
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/109 (André Anjos, 2022-09-15)

I'm not sure where the problem is, but the following doesn't seem to work anymore:
```sh
$ cd bob.extension
$ #git checkout master; git pull # just make sure you're up-to-date
$ bdt create -vv ext --python=3.8
...
$ conda activate ext
$ python -V
Python 3.10.6
```
It seems I have the latest beta version of bdt installed (`5.3.1b0-py39_6`, arm/mac).
The package planning output shows Python 3.8 passing by, but it is later dropped in favour of Python 3.10.
@flavio.tarsetti, @ydayer: Does anybody understand what is going on?

## Formatting: output for compare_samples diagonal is not zero
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/85 (Luis LUEVANO, 2022-09-12)

For some pipelines, the output of the compare_samples command is not zero on the diagonal of the All vs All comparison.
Bad formatting example with mobilefacenet pipeline:
```
All vs All comparison
------------------ -----------------------
./me.jpg ./not_me.jpg
-0.0 -0.9227539984332366
-0.922753991574597 -3.5416114485542494e-14
------------------ -----------------------
```
However it is correct with resnet50-msceleb-arcface-2021 pipeline:
```
All vs All comparison
----------------- -----------------
./me.jpg ./not_me.jpg
-0.0 -1.03846231201703
-1.03846231201703 -0.0
----------------- -----------------
```
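The `-3.5416114485542494e-14` on the diagonal looks like floating-point round-off from the distance computation rather than an exact self-distance. A possible display-side fix (a sketch; the actual table formatting lives in the compare_samples code) is to snap near-zero scores to zero before printing:

```python
import numpy as np

# Scores from the mobilefacenet example above: the bottom-right
# self-comparison entry is round-off noise (~1e-14), not an exact zero.
scores = np.array([
    [-0.0, -0.9227539984332366],
    [-0.922753991574597, -3.5416114485542494e-14],
])

# Snap anything within a small tolerance of zero to exactly 0.0:
cleaned = np.where(np.isclose(scores, 0.0, atol=1e-10), 0.0, scores)
```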
So far I have only tested a few pipelines:
- Bad formatting: facenet_sanderberg, arcface-insightface, mobilefacenet
- Correct formatting: resnet50-msceleb-arcface-2021, resnet50-msceleb-arcface20210521

## Add sphinx-click and sphinx-autodoc-typehints
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/107 (Flavio TARSETTI, 2022-09-01)

Those 2 packages will help in the development process and are requested by @smichel.

Assigned to: Flavio TARSETTI

## The command "bob bio compare-samples" is not working
https://gitlab.idiap.ch/bob/bob.bio.base/-/issues/187 (Alain KOMATY, 2022-08-22)

I tried following the bob.bio.base documentation and started by running the first command in the docs:
`bob bio compare-samples --pipeline facenet-sanderberg me.png not_me.png`
I got the following error:
`AttributeError: 'PipelineSimple' object has no attribute 'create_biometric_reference'`

Assigned to: Yannick DAYER

## Change GitLab's runners configuration to allow build and push of docker images using the CI
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/106 (André MAYORAZ, 2022-08-19)

It is related to Issue [102](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/102).
The goal here is to be able to build and push docker images with GitLab CI, using either docker directly or podman, as follows:
```yaml
build_image:
  tags:
    - docker
    - bob
  stage: build_image
  image:
    name: quay.io/podman/stable
  before_script:
    - docker info
  script:
    - docker build --tag docker.idiap.ch/bob/bdt:latest .
```
It is currently impossible to do this because of access rights within the container.
A change in the runner's configuration has to be made to run the container in privileged mode. An example is shown in the [GitLab documentation](https://docs.gitlab.com/runner/executors/docker.html#use-podman-to-run-docker-commands-beta).

## Pinned versions packages interpreted as float numbers
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/105 (André MAYORAZ, 2022-08-11)

Pinned packages listed in `conda_build_config.yaml` have their version numbers interpreted as floats when in the format x.x, and as strings when in the format x.x.x, when loaded by the pyyaml library in the `load_packages_from_conda_build_config` method of `bob.devtools/bob/devtools/build.py`.
The problem is that, for instance, if we want the package `python-graphviz=0.20`, conda will search for `python-graphviz=0.2` and may return an error, as it does not find the package with that version.
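The behaviour can be reproduced with pyyaml directly (a minimal sketch, not the actual `conda_build_config.yaml` contents):

```python
import yaml

# An unquoted x.x version resolves as a YAML float, losing the trailing zero:
unquoted = yaml.safe_load("python-graphviz:\n- 0.20")

# Quoting the version keeps it a string, as x.x.x versions already are:
quoted = yaml.safe_load('python-graphviz:\n- "0.20"')
```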
A solution could be to write all package version numbers in quotes so that they are all interpreted as strings.

Assigned to: André MAYORAZ

## pyproject.toml not created in new packages
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/104 (Yannick DAYER, 2022-11-07)

The required `pyproject.toml` file is not in the project template and is thus not created by the `bdt dev new ...` command.

## Provide shore files and instructions on how to use them
https://gitlab.idiap.ch/bob/bob.paper.8years/-/issues/5 (Manuel Günther, 2022-08-11)

For easy reproduction of our plots, score files would be a great asset. Maybe it is possible to upload them to the Biometrics Resources page (https://www.idiap.ch/webarchives/sites/www.idiap.ch/resource/biometric/), where the score files for the old paper are also hosted.

Assigned to: Tiago de Freitas Pereira