bob issues: https://gitlab.idiap.ch/groups/bob/-/issues (as of 2022-11-07)

## pyproject.toml not created in new packages
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/104 (Yannick DAYER, 2022-11-07)

The required `pyproject.toml` file is not in the project template and thus not created by the `bdt dev new ...` command.

## Provide score files and instructions on how to use them
https://gitlab.idiap.ch/bob/bob.paper.8years/-/issues/5 (Manuel Günther, 2022-08-11)

For easy reproduction of our plots, score files would be a great asset. Maybe it is possible to upload them to the Biometrics Resources (https://www.idiap.ch/webarchives/sites/www.idiap.ch/resource/biometric/), where the score files for the old paper also are.

Assigned to: Tiago de Freitas Pereira

## Instructions for GBU plots are missing in the README
https://gitlab.idiap.ch/bob/bob.paper.8years/-/issues/4 (Manuel Günther, 2022-08-10)

The README lists 6 datasets to be used, but provides commands only for 5 of them. As far as I can see, the GBU dataset is missing.

Assigned to: Tiago de Freitas Pereira

## Improve pipelines by using custom Docker images
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/102 (Samuel GAIST, 2022-10-05)

The current pipeline of BOB, and therefore BEAT, follows this schema:
- Grab the bootstrap script
- Bootstrap miniconda (which might be downloading the installer script)
- Setup some configuration
- Do some checks
- Do a cleanup
So every build starts from scratch, installs Miniconda, whose version is hard-coded in `bootstrap.py`, then downloads a bunch of dependencies, does the actual work, and rinses and repeats for each and every package. Even if there is some caching involved, it is still pretty inefficient in terms of bandwidth, power and time consumption.
The script is more or less optimized for shell runner usage; however, all these runners can run Docker, and the Linux build stage actually does use the Docker runner.
Since BOB pins its dependencies pretty strictly, it would make sense to create a Docker image with the environment preinstalled. That would remove the repetitive bdt environment setup currently happening for every job.
What I would suggest is to do that in steps:
1) Create a bdt Docker image based on the image used for the Linux build jobs
2) Use that image for Linux specific and deploy job
3) Cache the conda related downloads
4) Check if it would make sense to preinstall in the Docker image all the bob pinned dependencies
5) Check if the [Docker-OSX project](https://hub.docker.com/r/sickcodes/docker-osx) is something that could be used on the macOS runners so that we don't need a shell runner anymore
6) If the answer to point 5 is yes, create a macOS bdt image, keeping in mind point 4
7) Use Docker for all stages
For point 4, even if we have a local mirror for conda, does it really make sense to re-download everything every time?

Assigned to: Flavio TARSETTI

## SpeechbrainEmbeddings fails when run in dask for the first time
https://gitlab.idiap.ch/bob/bob.bio.spear/-/issues/43 (Yannick DAYER, 2022-07-21)

When running `speechbrain-ecapa-voxceleb` for the first time, it fails if run on dask.
Multiple workers call `load_model` at the same time and, as the model files do not exist yet (first run of the pipeline), speechbrain tries to download them from huggingface, using a cache in `~/.cache/huggingface`. Errors arise as multiple workers access the same files simultaneously.
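A generic way to avoid this kind of first-download race is to guard the download with an inter-process file lock. The sketch below is illustrative only; the `download_once` helper and the marker/lock file names are ours, not speechbrain's or Bob's API:

```python
import fcntl
import os

def download_once(target_dir, download_fn):
    """Run download_fn(target_dir) exactly once across concurrent workers.

    All workers block on the same lock file; whoever acquires the lock
    first performs the download, the others then find the marker file
    and skip it.
    """
    os.makedirs(target_dir, exist_ok=True)
    marker = os.path.join(target_dir, ".download-complete")
    lock_path = os.path.join(target_dir, ".download-lock")
    with open(lock_path, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)  # blocks until the lock is free
        if not os.path.exists(marker):
            download_fn(target_dir)       # only one process gets here
            open(marker, "w").close()     # mark the download as done
        fcntl.flock(lock, fcntl.LOCK_UN)
```

This uses POSIX `flock`, so it only coordinates workers sharing the same filesystem on Unix-like systems.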
Workaround: the first time you run `speechbrain-ecapa-voxceleb`, do it without the `-l ...` option, at least until the files are downloaded and the computation starts. You can then stop the execution and run again with the `-l` option.

Assigned to: Flavio TARSETTI

## `all_samples` in `full` protocol of utfvp dataset
https://gitlab.idiap.ch/bob/bob.bio.vein/-/issues/27 (Hatef OTROSHI, 2022-08-10)

Hi,
It seems that `all_samples` in the `full` protocol of the utfvp dataset has a problem and raises an exception.
Here is a sample code:
```python
from bob.bio.vein.database.utfvp import UtfvpDatabase
database = UtfvpDatabase(protocol="full")
database.all_samples()
```
and here is the error:
```
packages/bob/bio/base/database/csv_dataset.py", line 494, in all_samples
samples = samples + self.background_model_samples()
File "..../lib/python3.9/site-packages/bob/bio/base/database/csv_dataset.py", line 381, in background_model_samples
self.csv_to_sample_loader.transform(self.train_csv)
File "..../lib/python3.9/site-packages/sklearn/pipeline.py", line 635, in transform
Xt = transform.transform(Xt)
File "..../lib/python3.9/site-packages/bob/pipelines/sample_loaders.py", line 84, in transform
X.seek(0)
AttributeError: 'NoneType' object has no attribute 'seek'
```

## Leaderboard needs to point to bob.bio.face_ongoing's temporarily
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/83 (Yannick DAYER, 2022-08-10)

While the leaderboard is being re-built, we should point to the old existing one in bob.bio.face_ongoing, [here](https://www.idiap.ch/software/bob/docs/bob/bob.bio.face_ongoing/master/leaderboard.html), or include it in this doc, since the command lines shown there are no longer working.

Assigned to: Yannick DAYER

## we have to use gcc 10
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/101 (Amir MOHAMMADI, 2022-06-27)

Similar to https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml

Assigned to: Flavio TARSETTI

## The after_script in bob.devtools CI instructions is broken
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/100 (Amir MOHAMMADI, 2022-06-27)

This line:
https://gitlab.idiap.ch/bob/bob.devtools/-/blob/dfa4a19a4d2088e9f6e5019ab44c94097d33e78f/.gitlab-ci.yml#L181
Does not work in the CI and silently fails. You can see it in:
https://gitlab.idiap.ch/bob/bob.devtools/-/jobs/272337#L675
```
$ bdt ci clean -vv
/usr/bin/bash: line 129: bdt: command not found
```

Assigned to: Flavio TARSETTI

## PYTEST_ADDOPTS does not need to be in recipe_append.yaml
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/99 (Amir MOHAMMADI, 2022-06-21)

Conda-build only ignores env variables during the build phase, but does not do this at the test phase. Hence, adding `PYTEST_ADDOPTS` and the deprecated `NOSE_EVAL_ATTR` to `recipe_append.yaml` is not needed.

Assigned to: Amir MOHAMMADI

## The DOCSERVER env variable is no longer used.
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/98 (Amir MOHAMMADI, 2022-06-24)

I can't remember where this was used. Maybe when using `bob_dbmanage.py download all`? This env variable is not used anywhere anymore and it can be removed from here. Please correct me if I am wrong @andre.anjos

Assigned to: Flavio TARSETTI

## `search_file` scans the whole underlying folder structure
https://gitlab.idiap.ch/bob/bob.extension/-/issues/88 (Christophe ECABERT, 2022-06-21)

When looking into a directory for a file with `search_file(base_path, options)`, the whole structure of `base_path` gets scanned ([L395](https://gitlab.idiap.ch/bob/bob.extension/-/blob/master/bob/extension/download.py#L395)). The time complexity increases linearly with the number of underlying files.
This behavior is fine if the intended design was for `options` to be an incomplete relative path within `base_path`. However, if `options` are expected to be complete relative paths, one can speed up the process as follows:
```python
f = None
for o in options:
    try:
        f = open(os.path.join(base_path, o))
        break
    except OSError:
        pass  # File does not exist
return f
```
It implements the same behavior, assuming `options` are complete relative paths, and speeds up the search.

Assigned to: Amir MOHAMMADI

## MTCNN is not serializable
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/81 (Christophe ECABERT, 2022-06-07)

With the current implementation, MTCNN is not serializable via pickle. When used locally, it is not an issue; however, when running on the cluster, we are getting lovely messages in workers' logs such as:
```
distributed.protocol.pickle - INFO - Failed to serialize CheckpointWrapper(estimator=SampleWrapper(estimator=BoundingBoxAnnotatorCrop(annotator=MTCNN(thresholds=(0.1,
0.2,
0.2)),
eyes_cropper=FaceEyesNorm(final_image_size=(112,
112),
reference_eyes_location={'bottomright': (112,
112),
'leye': (55,
72),
'reye': (55,
40),
'topleft': (0,
0)})),
fit_extra_arguments=(),
input_attribute='data',
output_attribute='data',
transform_...',
'annotations'),)),
extension='.h5',
features_dir='/idiap/temp/cecabert/experiments/bob101/results-sge/ijbc/cropper',
hash_fn=<function hash_string at 0x7fa50d891000>,
load_func=<function load at 0x7fa507f5ed40>,
model_path='/idiap/temp/cecabert/experiments/bob101/results-sge/ijbc/cropper.pkl',
sample_attribute='data',
save_func=<function save at 0x7fa507f5ef80>). Exception: cannot pickle '_thread.RLock' object
distributed.protocol.core - CRITICAL - Failed to deserialize
Traceback (most recent call last):
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/core.py", line 111, in loads
return msgpack.loads(
File "msgpack/_unpacker.pyx", line 194, in msgpack._cmsgpack.unpackb
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/core.py", line 103, in _decode_default
return merge_and_deserialize(
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 488, in merge_and_deserialize
return deserialize(header, merged_frames, deserializers=deserializers)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 417, in deserialize
return loads(header, frames)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/serialize.py", line 180, in serialization_error_loads
raise TypeError(msg)
TypeError: Could not serialize object of type CheckpointWrapper.
Traceback (most recent call last):
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/distributed/protocol/pickle.py", line 40, in dumps
result = pickle.dumps(x, **dump_kwargs)
AttributeError: Can't pickle local object 'WeakSet.__init__.<locals>._remove'
```
The issue is that in some cases it tries to serialize the already loaded underlying Tensorflow graph. This can be solved with the same mechanism used in the `PyTorchModel` class, by overriding the `__getstate__` method as follows:
```python
def __getstate__(self):
    # Handling unpicklable objects
    state = {}
    for key, value in super().__getstate__().items():
        if key != '_fun':
            state[key] = value
    state['_fun'] = None
    return state
```
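The same pattern can be demonstrated with a minimal, self-contained class (a generic sketch, unrelated to Bob's actual classes) that drops an unpicklable member on pickling and lazily re-creates it afterwards:

```python
import pickle
import threading

class LazyResource:
    """Holds an unpicklable handle (here a thread lock) that is dropped
    on pickling and re-created after unpickling."""

    def __init__(self):
        self._lock = threading.RLock()  # RLock objects cannot be pickled

    def __getstate__(self):
        state = self.__dict__.copy()
        state["_lock"] = None  # drop the unpicklable member
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        if self._lock is None:
            self._lock = threading.RLock()  # lazily re-create the handle
```

Without the `__getstate__` override, `pickle.dumps(LazyResource())` raises `TypeError: cannot pickle '_thread.RLock' object`, the same failure class seen in the log above.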
With this `__getstate__` override in place, the serialization works properly and can be tested with:
```python
mtcnn = MTCNN()
mtcnn.mtcnn_fun # Force instantiation of TF graph
other = pickle.loads(pickle.dumps(mtcnn))
# No AttributeError: Can't pickle local object 'WeakSet.__init__.<locals>._remove'
# TF graph will be lazily initialized if needed
```

## Not all bob packages are taken from local channel
https://gitlab.idiap.ch/bob/nightlies/-/issues/63 (Amir MOHAMMADI, 2022-06-01)

Job [#270279](https://gitlab.idiap.ch/bob/nightlies/-/jobs/270279) failed for 526f22f2ac40ed01c6b54ffcb992ef6eeeeefab5:

Somehow, mamba is not picking up all bob packages from the local conda channel:
```
bob.bio.base: 6.0.1b0-py38h44fe14c_25 local
bob.bio.face: 6.0.1b0-py38h4d93ce0_18 local
bob.bio.video: 5.0.1b0-py38hd5d7b21_13 local
bob.extension: 6.1.2b0-py38h460ab40_32 local
bob.io.base: 4.0.1b0-py38h2539f08_35 local
bob.learn.em: 3.0.0b0-py38h4769f04_18 http://www.idiap.ch/software/bob/conda/label/beta
bob.measure: 5.0.1b0-py38h521d9d0_31 local
bob.pipelines: 2.0.1b0-py38hbfafce2_17 http://www.idiap.ch/software/bob/conda/label/beta
```
I don't know why yet, but that is why the nightlies are broken.

Assigned to: Amir MOHAMMADI

## Doc issues
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/97 (Laurent COLBOIS, 2022-05-31)

### 1. `pre-commit` alias doesn't work properly
##### Run
```
bdt dev checkout bob.bio.base
```
##### Error
```
Traceback (most recent call last):
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/bin/bdt", line 11, in <module>
sys.exit(main())
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
return self.main(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1055, in main
rv = self.invoke(ctx)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 760, in invoke
return __callback(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/decorators.py", line 26, in new_func
return f(get_current_context(), *args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/development.py", line 70, in checkout
subprocess.check_call(["pre-commit", "install"], cwd=dest)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 368, in check_call
retcode = call(*popenargs, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 349, in call
with Popen(*popenargs, **kwargs) as p:
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/subprocess.py", line 1821, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'pre-commit'
```
### 2. Additional information required regarding virtual packages (I think)
##### Checkout `bob.bio.face` and run:
```
bdt dev create -vv dev
```
##### Error:
```
The reported errors are:
- Encountered problems while solving:
- - nothing provides __cuda needed by tensorflow-2.7.0-cuda102py310hcf4adbc_0
-
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/bin/bdt", line 11, in <module>
sys.exit(main())
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
return self.main(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1055, in main
rv = self.invoke(ctx)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/click/core.py", line 760, in invoke
return __callback(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/bdt.py", line 43, in _decorator
value = view_func(*args, **kwargs)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/scripts/create.py", line 268, in create
deps = parse_dependencies(recipe_dir, conda_config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/build.py", line 333, in parse_dependencies
metadata = get_rendered_metadata(recipe_dir, config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/bob/devtools/build.py", line 239, in get_rendered_metadata
return conda_build.api.render(recipe_dir, config=config)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/api.py", line 42, in render
metadata_tuples = render_recipe(recipe_path, bypass_env_check=bypass_env_check,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 860, in render_recipe
rendered_metadata = distribute_variants(m, variants,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 763, in distribute_variants
mv.parse_until_resolved(allow_no_other_outputs=allow_no_other_outputs,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1084, in parse_until_resolved
self.parse_again(permit_undefined_jinja=False,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1006, in parse_again
self.meta = parse(self._get_contents(permit_undefined_jinja,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/metadata.py", line 1601, in _get_contents
rendered = template.render(environment=env)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/jinja2/environment.py", line 1301, in render
self.environment.handle_exception()
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/jinja2/environment.py", line 936, in handle_exception
raise rewrite_traceback_stack(source=source)
File "/remote/idiap.svm/user.active/lcolbois/bob_dev/run_ijbc/bob.bio.face/conda/meta.yaml", line 41, in top-level template code
- setuptools
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/jinja_context.py", line 228, in pin_compatible
pins, _, _ = get_env_dependencies(m, 'host', m.config.variant)
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/conda_build/render.py", line 138, in get_env_dependencies
actions = environ.get_install_actions(tmpdir, tuple(dependencies), env,
File "/idiap/temp/lcolbois/miniconda3/envs/bdt/lib/python3.9/site-packages/boa/cli/mambabuild.py", line 133, in mamba_get_install_actions
raise err
conda_build.exceptions.DependencyNeedsBuildingError: Unsatisfiable dependencies for platform linux-64: {MatchSpec("tensorflow==2.7.0=cuda102py310hcf4adbc_0")}
```
##### Possible fix: first run
```
export CONDA_OVERRIDE_CUDA=11.6
```Amir MOHAMMADIAmir MOHAMMADIhttps://gitlab.idiap.ch/bob/bob.bio.base/-/issues/185Probe template not accepted if not numpy array2022-05-31T08:57:06ZYannick DAYERProbe template not accepted if not numpy arrayIn `score_sample_template` from `abstract_classes.py` there is a check that the data is "valid" ([here](https://gitlab.idiap.ch/bob/bob.bio.base/-/blob/master/bob/bio/base/pipelines/abstract_classes.py#L246)).
It checks that the templat...In `score_sample_template` from `abstract_classes.py` there is a check that the data is "valid" ([here](https://gitlab.idiap.ch/bob/bob.bio.base/-/blob/master/bob/bio/base/pipelines/abstract_classes.py#L246)).
It checks that the template in the samples is a Numpy array and is not empty. But in GMM for example, the templates are objects (`GMMStats`) and all the samples are considered "invalid" and ignored silently.
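A relaxed check could look like this (a sketch; `template_is_valid` is a hypothetical helper, not the actual bob.bio.base code): only numpy arrays get the emptiness test, while any other non-None template object passes.

```python
import numpy as np

def template_is_valid(template):
    # Hypothetical replacement for the numpy-only check: apply the
    # emptiness test only to arrays, and accept any other non-None
    # object (e.g. a GMMStats instance) unconditionally.
    if isinstance(template, np.ndarray):
        return template.size > 0
    return template is not None
```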
We could either remove the check or implement a way of detecting non-Numpy objects and allow those unconditionally.

Assigned to: Amir MOHAMMADI

## ImportError: cannot import name 'SpeechbrainEmbeddings' from 'bob.bio.spear.extractor'
https://gitlab.idiap.ch/bob/bob.bio.spear/-/issues/42 (Hatef OTROSHI, 2022-05-25)

In the master branch, in [bob/bio/spear/config/pipeline/speechbrain_ecapa_voxceleb.py#L4](https://gitlab.idiap.ch/bob/bob.bio.spear/-/blob/ddf128c6da2872d7ec5a2608b4fddf03b3884537/bob/bio/spear/config/pipeline/speechbrain_ecapa_voxceleb.py#L4) there is `from bob.bio.spear.extractor import SpeechbrainEmbeddings`, but `SpeechbrainEmbeddings` cannot be imported.

## cannot import 'isinstance_nested' from 'bob.pipelines.utils'
https://gitlab.idiap.ch/bob/bob.bio.base/-/issues/184 (Hatef OTROSHI, 2022-05-24)

In the master branch, in [bob/bio/base/pipelines/pipelines.py#L18](https://gitlab.idiap.ch/bob/bob.bio.base/-/blob/1abaa5f11559c15ee185203a65f8480800524b01/bob/bio/base/pipelines/pipelines.py#L18) there is `from bob.pipelines.utils import isinstance_nested`, but `isinstance_nested` is not available in the master branch of [bob/pipelines/utils.py](https://gitlab.idiap.ch/bob/bob.pipelines/-/blob/611f8867a16711bf0394cc9af2901bd4941b6a5b/bob/pipelines/utils.py).

## VideoPadSample's output is not consistent
https://gitlab.idiap.ch/bob/bob.pad.face/-/issues/48 (Christophe ECABERT, 2022-06-09)

The samples generated by `bob.pad.face.database.VideoPadSample` are not consistent: the attributes `.data` and `.annotations` do not have the same first dimension. This behaviour can be reproduced with:
```python
database = ReplayAttackPadDatabase(protocol='smalltest')
samples = database.fit_samples()
vid0 = samples[0].data # vid0.shape == (20, 3, 240, 320)
ann0 = samples[0].annotations # len(ann0) == 375
```
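One possible direction for making the two attributes consistent is to subsample the annotations with the same frame selection as the data. The sketch below is illustrative only; it assumes annotations are stored as a dict keyed by frame-index strings, and that `frame_indices` would come from the video sampling step:

```python
def subsample_annotations(annotations, frame_indices):
    """Keep only the annotations of the frames that were actually
    sampled, so len(result) matches data.shape[0]."""
    return {
        str(i): annotations[str(i)]
        for i in frame_indices
        if str(i) in annotations  # skip frames without annotations
    }
```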
The `.data` attribute contains the sampled frames (i.e. sampled by the `VideoAsArray` container) whereas the `.annotations` attribute contains the annotations for the whole video (i.e. they are not sampled).

## Vanilla-pad baseline does not run
https://gitlab.idiap.ch/bob/bob.pad.face/-/issues/47 (Christophe ECABERT, 2022-05-31)

Following the [documentation](https://www.idiap.ch/software/bob/docs/bob/docs/master/bob/bob.pad.base/doc/vanilla_pad_intro.html#running-a-biometric-experiment-with-vanilla-pad), the vanilla-pad baseline can be invoked with:
```shell
bob pad vanilla-pad replay-attack svm-frames -o results -vv
```
This leads to the following exception:
```python
Traceback (most recent call last):
File "/remote/idiap.svm/temp.biometric03/cecabert/bob_beta/src/bob.pipelines/bob/pipelines/wrappers.py", line 808, in _fit
self.estimator = self.estimator.fit(X, y, **fit_params)
File "/remote/idiap.svm/temp.biometric03/cecabert/bob_beta/src/bob.pipelines/bob/pipelines/wrappers.py", line 337, in fit
self.estimator = self.estimator.fit(X, **kwargs)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/sklearn/svm/_base.py", line 190, in fit
X, y = self._validate_data(
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/sklearn/base.py", line 581, in _validate_data
X, y = check_X_y(X, y, **check_params)
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/sklearn/utils/validation.py", line 964, in check_X_y
X = check_array(
File "/remote/idiap.svm/temp.biometric03/cecabert/mambaforge/envs/bob_deps/lib/python3.10/site-packages/sklearn/utils/validation.py", line 794, in check_array
raise ValueError(
ValueError: Found array with dim 4. Estimator expected <= 2.
```
The `fit` method of the SVM classifier expects its input to be `(n_samples, n_features)`, which is not what the `VideoToFrames` transformer provides. One possible solution would be to add an intermediate step between the two operations to flatten the data.
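As a sketch of such an intermediate flattening step (assuming numpy data of shape `(n_samples, ...)`; this is not tied to the actual bob.pipelines wrappers):

```python
import numpy as np

def flatten_samples(X):
    """Reshape (n_samples, d1, d2, ...) data into (n_samples, n_features),
    the 2-D layout expected by scikit-learn estimators such as SVC."""
    X = np.asarray(X)
    return X.reshape(X.shape[0], -1)
```

Wrapped in, e.g., `sklearn.preprocessing.FunctionTransformer(flatten_samples)`, such a step could be inserted between `VideoToFrames` and the SVM.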