# bob issues
Feed: https://gitlab.idiap.ch/groups/bob/-/issues (updated 2022-08-19)

## [bob.devtools#106: Change GitLab's runners configuration to allow build and push of docker images using the CI](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/106)
André MAYORAZ, updated 2022-08-19

It is related to Issue [102](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/102).
The goal here is to be able to build and push Docker images with GitLab's CI, using either Docker directly or Podman, as follows:
```yaml
build_image:
tags:
- docker
- bob
stage: build_image
image:
name: quay.io/podman/stable
before_script:
- docker info
script:
- docker build --tag docker.idiap.ch/bob/bdt:latest .
```
This is currently impossible because of access rights within the container.
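For context, the needed change boils down to enabling privileged mode in the runner's `config.toml` (a minimal sketch of the documented setting; the surrounding runner entry is illustrative, not our actual runner configuration):

```toml
[[runners]]
  executor = "docker"
  [runners.docker]
    # Required so the job container has the rights to build images
    # with docker/podman; see the GitLab Runner documentation.
    privileged = true
```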
A change has to be made in the runner's configuration so that the container runs in privileged mode. An example is shown in the [GitLab documentation](https://docs.gitlab.com/runner/executors/docker.html#use-podman-to-run-docker-commands-beta).

## [bob.devtools#105: Pinned versions packages interpreted as float numbers](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/105)
André MAYORAZ, updated 2022-08-11

Pinned packages listed in `conda_build_config.yaml` have their version numbers interpreted as floats when in the format x.x, and as strings when in the format x.x.x, when loaded by the pyyaml library in the `load_packages_from_conda_build_config` method in `bob.devtools/bob/devtools/build.py`.
The problem is that if we want, for instance, the package `python-graphviz=0.20`, conda will search for `python-graphviz=0.2` and may return an error because it does not find the package at that version.
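The coercion comes from YAML 1.1 implicit typing and can be reproduced with pyyaml directly (a standalone demonstration; the package list below is illustrative, not the actual pins file):

```python
import yaml

# Two-component versions parse as floats (the trailing zero is lost),
# while three-component versions parse as strings.
config = yaml.safe_load(
    """
python-graphviz:
  - 0.20
scikit-learn:
  - 1.1.1
"""
)
print(config["python-graphviz"])  # [0.2]
print(config["scikit-learn"])     # ['1.1.1']

# Quoting keeps the version a string:
quoted = yaml.safe_load("python-graphviz:\n  - '0.20'\n")
print(quoted["python-graphviz"])  # ['0.20']
```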
A solution could be to write all the package version numbers between quotes so they are all interpreted as strings.

Assignee: André MAYORAZ

## [bob.devtools#104: pyproject.toml not created in new packages](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/104)
Yannick DAYER, updated 2022-11-07

The required `pyproject.toml` file is not in the project template and is thus not created by the `bdt dev new ...` command.

## [bob.paper.8years#5: Provide shore files and instructions on how to use them](https://gitlab.idiap.ch/bob/bob.paper.8years/-/issues/5)
Manuel Günther (siebenkopf@googlemail.com), updated 2022-08-11

For easy reproduction of our plots, score files would be a great asset. Maybe it is possible to upload them to the Biometrics Resources (https://www.idiap.ch/webarchives/sites/www.idiap.ch/resource/biometric/), where the score files for the old paper also are.

Assignee: Tiago de Freitas Pereira

## [bob.paper.8years#4: Instructions for GBU plots are missing in the README](https://gitlab.idiap.ch/bob/bob.paper.8years/-/issues/4)
Manuel Günther (siebenkopf@googlemail.com), updated 2022-08-10

The README lists 6 datasets to be used, but provides commands only for 5 of them. As far as I can see, the GBU dataset is missing.

Assignee: Tiago de Freitas Pereira

## [bob.devtools#103: CI fails on macos-arm64 only packages](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/103)
Yannick DAYER, updated 2022-09-20

Defining the pins for the `mne` package (added by !307) fails on [`build_macos_arm_bob_devtools`](https://gitlab.idiap.ch/bob/bob.devtools/-/jobs/277736).
`mne` has different dependencies depending on the architecture, and needs `pyobjc-framework-cocoa` on mac machines.
However, it is not found when running on a mac arm CI pipeline: `package mne-1.1.0-hce30654_0 requires pyobjc-framework-cocoa, but none of the providers can be installed`
- Trying to install `mne` directly on the mac arm machine works and mamba finds `pyobjc-framework-cocoa`.
- [The other CI pipelines](https://gitlab.idiap.ch/bob/bob.devtools/-/pipelines/63615) pass.
**Assumption**: Our CI configuration for macOS arm is not correct and somehow does not search for packages on the `osx-arm64` platform on conda-forge.

## [bob.devtools#102: Improve pipelines by using custom Docker images](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/102)
Samuel GAIST, updated 2022-10-05

The current pipeline of BOB, and therefore BEAT, follows this schema:
- Grab the bootstrap script
- Bootstrap miniconda (which might be downloading the installer script)
- Setup some configuration
- Do some checks
- Do a cleanup
So every build starts from scratch: it installs Miniconda (whose version is hard-coded in `bootstrap.py`), downloads a bunch of dependencies, does the actual work, and rinses and repeats for each and every package. Even if there is some caching involved, it is still pretty inefficient in terms of bandwidth, power, and time consumption.
The script is more or less optimized for shell-runner usage; however, all these runners can run Docker, and they actually do use the Docker runner for the Linux build stage.
Since BOB pins its dependencies pretty strictly, it would make sense to create a Docker image with the environment preinstalled. That would reduce the repetitive bdt environment setup that currently happens for every job.
What I would suggest is to do that in steps:
1) Create a bdt Docker image based on the image used for the Linux build jobs
2) Use that image for the Linux-specific and deploy jobs
3) Cache the conda related downloads
4) Check if it would make sense to preinstall in the Docker image all the bob pinned dependencies
5) Check if the [Docker-OSX project](https://hub.docker.com/r/sickcodes/docker-osx) is something that could be used on the macOS runners so that we don't need a shell runner anymore
6) If the answer to the above is yes, create a macOS bdt image, keeping point 4 in mind
7) Use Docker for all stages
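Step 1 could start from a sketch along these lines (the base image, channel URL, and environment name are assumptions, not the actual setup):

```dockerfile
# Hypothetical bdt base image: preinstall the bdt environment once,
# so CI jobs skip the per-build miniconda bootstrap.
FROM quay.io/condaforge/miniforge3:latest

# Install bob.devtools from the Bob conda channel; pin versions as needed.
RUN conda create -y -n bdt \
      -c https://www.idiap.ch/software/bob/conda -c conda-forge \
      bob.devtools \
 && conda clean --all --yes

# Put the environment on PATH for CI jobs.
ENV PATH=/opt/conda/envs/bdt/bin:$PATH
```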
For point 4, even if we have a local mirror for conda, does it really make sense to re-download everything every time?

Assignee: Flavio TARSETTI

## [bob.bio.spear#43: SpeechbrainEmbeddings fails when run in dask for the first time](https://gitlab.idiap.ch/bob/bob.bio.spear/-/issues/43)
Yannick DAYER, updated 2022-07-21

When running `speechbrain-ecapa-voxceleb` for the first time, it fails if run on dask.
Multiple workers call `load_model` at the same time, and since the model files do not exist yet (first run of the pipeline), speechbrain tries to download them from huggingface, using a cache in `~/.cache/huggingface`. Errors arise because multiple workers access the same files simultaneously.
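One possible mitigation (a sketch only; `load_fn` stands in for the real speechbrain download/load call, which is not shown here) is to serialize the first download across workers with a file lock:

```python
import fcntl
import os
import tempfile
from pathlib import Path


def locked_load(cache_dir, load_fn):
    """Run load_fn() while holding an exclusive lock inside cache_dir.

    Hypothetical helper: only one process runs the download at a time;
    the other workers block on the lock until it is released, then see
    the already-populated cache.
    """
    Path(cache_dir).mkdir(parents=True, exist_ok=True)
    lock_path = os.path.join(cache_dir, ".model.lock")
    with open(lock_path, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks other workers
        try:
            return load_fn()
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        print(locked_load(d, lambda: "model ready"))  # model ready
```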
Workaround: the first time you run `speechbrain-ecapa-voxceleb`, do it without the `-l ...` option, at least until the files are downloaded and the computation starts. You can then stop the execution and re-run with the `-l` option.

Assignee: Flavio TARSETTI

## [bob.bio.vein#27: `all_samples` in `full` protocol of utfvp dataset](https://gitlab.idiap.ch/bob/bob.bio.vein/-/issues/27)
Hatef OTROSHI, updated 2022-08-10

Hi,
It seems that `all_samples` in the `full` protocol of the utfvp dataset has a problem and raises an error.
Here is a sample code:
```python
from bob.bio.vein.database.utfvp import UtfvpDatabase
database = UtfvpDatabase(protocol="full")
database.all_samples()
```
and here is the error:
```
packages/bob/bio/base/database/csv_dataset.py", line 494, in all_samples
samples = samples + self.background_model_samples()
File "..../lib/python3.9/site-packages/bob/bio/base/database/csv_dataset.py", line 381, in background_model_samples
self.csv_to_sample_loader.transform(self.train_csv)
File "..../lib/python3.9/site-packages/sklearn/pipeline.py", line 635, in transform
Xt = transform.transform(Xt)
File "..../lib/python3.9/site-packages/bob/pipelines/sample_loaders.py", line 84, in transform
X.seek(0)
AttributeError: 'NoneType' object has no attribute 'seek'
```

## [bob.bio.face#83: Leaderboard needs to point to bob.bio.face_ongoing's temporarily](https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/83)
Yannick DAYER, updated 2022-08-10

While the leaderboard is being re-built, we should point to the old existing one in bob.bio.face_ongoing, [here](https://www.idiap.ch/software/bob/docs/bob/bob.bio.face_ongoing/master/leaderboard.html),
or include it in this doc, since the command lines shown there are no longer working.

Assignee: Yannick DAYER

## [bob.bio.face#82: RFW dataset: overlapping and mis-labelling between training and testing sets](https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/82)
Yu Linghu, updated 2022-07-13

Based on the datasets we received from Wang et al., when we use z-samples or t-samples as shown in
https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/database/rfw.py#L242, two problems occurred during the experiments.
1. There are 44 subjects classified as Caucasian in the training set, but as Indian in the testing set. (e.g. m.0c96fs, m.08y5xt, etc.)
2. When we choose to obtain 2500 z-samples from each race as the cohort, we detect more than 6000 pairs of subjects (one from training and one from testing) that have very high similarity scores (-0.5~-0.1). After manually checking some of them, those samples appear to belong to the same person, i.e. they are not impostor scores. So there is overlap between the training and testing sets, which is not supposed to happen.
This bug report serves as a record of the problems. I'm not sure whether these problems only happen to us because of different versions of the datasets.
We could discuss it at a later stage, e.g. use other BUPT datasets like BUPT-Balanced as the training set, since Wang et al. stated there is no overlap between BUPT-Balanced and RFW; face detection might be necessary since no landmarks are given for this dataset.

## [bob.bio.base#186: Migrating the database interfaces to the new new CSV format](https://gitlab.idiap.ch/bob/bob.bio.base/-/issues/186)
Amir MOHAMMADI, updated 2022-09-06

A new interface is being implemented in https://gitlab.idiap.ch/bob/bob.bio.base/-/merge_requests/300, and I am trying to migrate all dbs to that.
This is a meta issue to track the migration.
- [ ] Convert all csv based ones to the new format
- [ ] Convert all custom interfaces to the new format
- [ ] Rewrite the `bob.bio.face.database` file for databases that were using a custom interface
- [ ] Test and verify the new interfaces!
- [ ] Upload csv files according to https://gitlab.idiap.ch/bob/private/-/wikis/How-to-upload-resources
- follow the `data/bob/bob.bio.modality/bio-modality-database_name.tar.gz`
- Use this format in the code:
```python
from bob.extension.download import get_file
name = "bio-modality-database_name-586b7e81.tar.gz"
dataset_protocols_path = get_file(
name,
# don't use https here, use http so the link works in the CI as well.
[f"https://www.idiap.ch/software/bob/data/bob/bob.bio.modality/{name}",
f"http://www.idiap.ch/software/bob/data/bob/bob.bio.modality/{name}"],
cache_subdir="protocols",
file_hash="586b7e81",
)
# remove .urls method.
```
Separate problems:
- [ ] asvspoof2017-spoof the interface does not load
- [ ] utfvp has some columns without header in its CSV files!
- [ ] caspeal has some columns without header in its CSV files!
- [ ] gbu, rfw, lfw, ijbc, youtube interfaces are custom
- [ ] bob.bio.spear licit and spoof protocols need to change to the new format, like replaymobile

Assignee: Laurent COLBOIS

## [bob.devtools#101: we have to use gcc 10](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/101)
Amir MOHAMMADI, updated 2022-06-27

Similar to https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml

Assignee: Flavio TARSETTI

## [bob.devtools#100: The after_script in bob.devtools CI instructions is broken](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/100)
Amir MOHAMMADI, updated 2022-06-27

This line:
https://gitlab.idiap.ch/bob/bob.devtools/-/blob/dfa4a19a4d2088e9f6e5019ab44c94097d33e78f/.gitlab-ci.yml#L181
Does not work in the CI and silently fails. You can see it in:
https://gitlab.idiap.ch/bob/bob.devtools/-/jobs/272337#L675
```
$ bdt ci clean -vv
/usr/bin/bash: line 129: bdt: command not found
```

Assignee: Flavio TARSETTI

## [bob.devtools#99: PYTEST_ADDOPTS does not need to be in recipe_append.yaml](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/99)
Amir MOHAMMADI, updated 2022-06-21

Conda-build only ignores env variables during the build phase, but does not do this at the test phase.
Hence, adding `PYTEST_ADDOPTS` and the deprecated `NOSE_EVAL_ATTR` to `recipe_append.yaml` is not needed.

Assignee: Amir MOHAMMADI

## [bob.devtools#98: The DOCSERVER env variable is no longer used](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/98)
Amir MOHAMMADI, updated 2022-06-24

I can't remember where this was used. Maybe when using `bob_dbmanage.py download all`? This env variable is not used anywhere anymore, and it can be removed from here. Please correct me if I am wrong, @andre.anjos.

Assignee: Flavio TARSETTI

## [bob.extension#88: `search_file` scans the whole underlying folder structure](https://gitlab.idiap.ch/bob/bob.extension/-/issues/88)
Christophe ECABERT, updated 2022-06-21

When looking in a directory for a file with `search_file(base_path, options)`, the whole structure of `base_path` gets scanned ([L395](https://gitlab.idiap.ch/bob/bob.extension/-/blob/master/bob/extension/download.py#L395)). The time complexity increases linearly with the number of underlying files.
This behavior is fine if the intended design was for `options` to be incomplete relative paths within `base_path`. However, if `options` are expected to be complete relative paths, one can speed up the process as follows:
```python
f = None
for o in options:
    try:
        f = open(os.path.join(base_path, o))
        break
    except OSError:
        pass  # file does not exist
return f
```
It implements the same behavior, assuming each of `options` is a full relative path, and speeds up the search.

Assignee: Amir MOHAMMADI

## [bob.bio.face#81: MTCNN is not serializable](https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/81)
Christophe ECABERT, updated 2022-06-07

With the current implementation, MTCNN is not serializable via pickle. When used locally, this is not an issue; however, when running on the cluster, we get lovely messages in the workers' logs, such as:
```
distributed.protocol.pickle - INFO - Failed to serialize CheckpointWrapper(estimator=SampleWrapper(estimator=BoundingBoxAnnotatorCrop(annotator=MTCNN(thresholds=(0.1,
0.2,
0.2)),
eyes_cropper=FaceEyesNorm(final_image_size=(112,
112),
reference_eyes_location={'bottomright': (112,
112),
'leye': (55,
72),
'reye': (55,
40),
'topleft': (0,
0)})),
fit_extra_arguments=(),
input_attribute='data',
output_attribute='data',
transform_...',
'annotations'),)),
extension='.h5',
features_dir='/idiap/temp/cecabert/experiments/bob101/results-sge/ijbc/cropper',
hash_fn=<function hash_string at 0x7fa50d891000>,
load_func=<function load at 0x7fa507f5ed40>,
model_path='/idiap/temp/cecabert/experiments/bob101/results-sge/ijbc/cropper.pkl',
sample_attribute='data',
save_func=<function save at 0x7fa507f5ef80>). Exception: cannot pickle '_thread.RLock' object
distributed.protocol.core - CRITICAL - Failed to deserialize
Traceback (most recent call last):
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/core.py", line 111, in loads
return msgpack.loads(
File "msgpack/_unpacker.pyx", line 194, in msgpack._cmsgpack.unpackb
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/core.py", line 103, in _decode_default
return merge_and_deserialize(
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 488, in merge_and_deserialize
return deserialize(header, merged_frames, deserializers=deserializers)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 417, in deserialize
return loads(header, frames)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 180, in serialization_error_loads
raise TypeError(msg)
TypeError: Could not serialize object of type CheckpointWrapper.
Traceback (most recent call last):
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 40, in dumps
result = pickle.dumps(x, **dump_kwargs)
AttributeError: Can't pickle local object 'WeakSet.__init__.<locals>._remove'
```
The issue is that, in some cases, it tries to serialize the already-loaded underlying TensorFlow graph. This can be solved with the same mechanism used in the `PyTorchModel` class, by overriding the `__getstate__` method as follows:
```python
def __getstate__(self):
    # Handling unpicklable objects
    state = {}
    for key, value in super().__getstate__().items():
        if key != '_fun':
            state[key] = value
    state['_fun'] = None
    return state
```
With this change, the serialization now works properly and can be tested with:
```python
import pickle

mtcnn = MTCNN()
mtcnn.mtcnn_fun # Force instantiation of TF graph
other = pickle.loads(pickle.dumps(mtcnn))
# No AttributeError: Can't pickle local object 'WeakSet.__init__.<locals>._remove'
# TF graph will be lazily initialized if needed
```

## [nightlies#63: Not all bob packages are taken from local channel](https://gitlab.idiap.ch/bob/nightlies/-/issues/63)
Amir MOHAMMADI, updated 2022-06-01

Job [#270279](https://gitlab.idiap.ch/bob/nightlies/-/jobs/270279) failed for 526f22f2ac40ed01c6b54ffcb992ef6eeeeefab5:
Somehow, mamba is not picking up all bob packages from the local conda channel:
```
bob.bio.base: 6.0.1b0-py38h44fe14c_25 local
bob.bio.face: 6.0.1b0-py38h4d93ce0_18 local
bob.bio.video: 5.0.1b0-py38hd5d7b21_13 local
bob.extension: 6.1.2b0-py38h460ab40_32 local
bob.io.base: 4.0.1b0-py38h2539f08_35 local
bob.learn.em: 3.0.0b0-py38h4769f04_18 http://www.idiap.ch/software/bob/conda/label/beta
bob.measure: 5.0.1b0-py38h521d9d0_31 local
bob.pipelines: 2.0.1b0-py38hbfafce2_17 http://www.idiap.ch/software/bob/conda/label/beta
```
I don't know why yet, but that is why the nightlies are broken.

Assignee: Amir MOHAMMADI

## [bob.devtools#97: Doc issues](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/97)
Laurent COLBOIS, updated 2022-05-31

### 1. `pre-commit` alias doesn't work properly
##### Run
```
bdt dev checkout bob.bio.base
```
##### Error
```
Traceback (most recent call last):
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/bin/bdt", line 11, in <module>
sys.exit(main())
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
return self.main(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1055, in main
rv = self.invoke(ctx)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 760, in invoke
return __callback(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/decorators.py", line 26, in new_func
return f(get_current_context(), *args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/development.py", line 70, in checkout
subprocess.check_call(["pre-commit", "install"], cwd=dest)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 368, in check_call
retcode = call(*popenargs, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 349, in call
with Popen(*popenargs, **kwargs) as p:
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 1821, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'pre-commit'
```
### 2. Additional information required regarding virtual packages (I think)
##### Checkout `bob.bio.face` and run:
```
bdt dev create -vv dev
```
##### Error:
```
The reported errors are:
- Encountered problems while solving:
- - nothing provides __cuda needed by tensorflow-2.7.0-cuda102py310hcf4adbc_0
-
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/bin/bdt", line 11, in <module>
sys.exit(main())
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
return self.main(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1055, in main
rv = self.invoke(ctx)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 760, in invoke
return __callback(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/bdt.py", line 43, in _decorator
value = view_func(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/create.py", line 268, in create
deps = parse_dependencies(recipe_dir, conda_config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/build.py", line 333, in parse_dependencies
metadata = get_rendered_metadata(recipe_dir, config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/build.py", line 239, in get_rendered_metadata
return conda_build.api.render(recipe_dir, config=config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/api.py", line 42, in render
metadata_tuples = render_recipe(recipe_path, bypass_env_check=bypass_env_check,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 860, in render_recipe
rendered_metadata = distribute_variants(m, variants,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 763, in distribute_variants
mv.parse_until_resolved(allow_no_other_outputs=allow_no_other_outputs,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1084, in parse_until_resolved
self.parse_again(permit_undefined_jinja=False,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1006, in parse_again
self.meta = parse(self._get_contents(permit_undefined_jinja,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1601, in _get_contents
rendered = template.render(environment=env)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/jinja2/environment.py", line 1301, in render
self.environment.handle_exception()
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/jinja2/environment.py", line 936, in handle_exception
raise rewrite_traceback_stack(source=source)
File "/remote/idiap.svm/user.active/lcolbois/bob_dev/run_ijbc/bob.bio.face/conda/meta.yaml", line 41, in top-level template code
- setuptools
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/jinja_context.py", line 228, in pin_compatible
pins, _, _ = get_env_dependencies(m, 'host', m.config.variant)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 138, in get_env_dependencies
actions = environ.get_install_actions(tmpdir, tuple(dependencies), env,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/boa/cli/mambabuild.py", line 133, in mamba_get_install_actions
raise err
conda_build.exceptions.DependencyNeedsBuildingError: Unsatisfiable dependencies for platform linux-64: {MatchSpec("tensorflow==2.7.0=cuda102py310hcf4adbc_0")}
```
##### Possible fix: first run
```
export CONDA_OVERRIDE_CUDA=11.6
```

Assignee: Amir MOHAMMADI