bob issues
https://gitlab.idiap.ch/groups/bob/-/issues

Missing mxnet as dependency (Yannick DAYER, 2023-03-29)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/97

When running the `arcface-insightface` baseline, an error complains that `mxnet` can not be imported.
After installing manually with `pip install mxnet`, everything works (conda did not manage to install it, though).
`mxnet` is missing from the dependencies and dev-profile.

Links to docs in readme.md are wrong (Yannick DAYER, 2023-10-26)
https://gitlab.idiap.ch/bob/docs/-/issues/12

The links to the documentation of each package in `readme.md` are wrong, as the format changed. Now the links need to point to a `sphinx` sub-directory.
The `latest` links are wrong now, but the `stable` links will become erroneous later on (after packages are released).

Face cropping based on bounding box still requires facial landmarks / an annotator (Manuel Günther, 2023-02-15)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/94
Related to #91.
Currently, there is no easy way of cropping the face purely based on bounding boxes, i.e., without alignment based on some facial landmarks. While we have an implementation for this case in `FaceCropBoundingBox`, it is buggy (see #91): https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/d6d8e20bb73cfe4b099fedee603fad6498203d7f/src/bob/bio/face/preprocessor/croppers.py#L312
Moreover, it is not directly called from within our `face_crop_solver`: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/d6d8e20bb73cfe4b099fedee603fad6498203d7f/src/bob/bio/face/utils.py#L377
Instead, it is only included indirectly in `BoundingBoxAnnotatorCrop`, which uses it solely for cutting out the face before detecting landmarks in the crop: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/d6d8e20bb73cfe4b099fedee603fad6498203d7f/src/bob/bio/face/preprocessor/FaceCrop.py#L305
While this is a useful use-case, another use-case would be to only extract the face based on the bounding box, without further landmark localization and alignment.
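For illustration, such a landmark-free crop could look like the following numpy sketch. The function name, the `[top, left, bottom, right]` convention, and the nearest-neighbour resizing are assumptions for this sketch, not the actual `bob.bio.face` API:

```python
import numpy as np

def crop_to_bounding_box(image, top, left, bottom, right, output_size):
    """Hypothetical helper: cut out the bounding box and rescale it to
    output_size (rows, cols), with no landmark-based alignment.

    Uses nearest-neighbour resampling to avoid extra dependencies; works
    for (H, W) and (C, H, W) images.
    """
    crop = image[..., max(top, 0):bottom, max(left, 0):right]
    rows = np.linspace(0, crop.shape[-2] - 1, output_size[0]).round().astype(int)
    cols = np.linspace(0, crop.shape[-1] - 1, output_size[1]).round().astype(int)
    return crop[..., rows[:, None], cols[None, :]]
```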
Actually, in the previous version of Bob, this was possible through (ab-)using the `FaceEyesNorm` class by providing `topleft` and `bottomright` coordinates instead.
In the current version, this is no longer possible.
I will add back an option for this.
Assignee: Manuel Günther

MTCNN comes without Non-Maximum-Suppression (NMS) (Manuel Günther, 2023-01-30)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/93

When running our MTCNN face detector, we get a lot of overlapping detections. Typically, these are removed with a non-maximum-suppression algorithm; see for example https://github.com/TropComplique/mtcnn-pytorch/blob/45b34462fc995e6b8dbd17545b799e8c8a30026b/src/detector.py#L120 or our TinyFaces implementation: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/de683894f9f14876293ad56390f4c34e7dd83234/src/bob/bio/face/annotator/tinyface.py#L229
However, our MTCNN implementation returns the outputs of the network unfiltered, leading to many overlapping detections: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/de683894f9f14876293ad56390f4c34e7dd83234/src/bob/bio/face/annotator/mtcnn.py#L113
When only the first annotation is used, as is often done in our pipelines, this is not a big issue, since NMS would just remove the overlapping lower-scored boxes anyway. When we need to detect more than one face in an image, on the other hand, we get a lot of repeated detections.
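A minimal greedy NMS over `[top, left, bottom, right]` boxes could be sketched as follows; this is a generic implementation, not the actual TinyFaces code:

```python
import numpy as np

def nms(boxes, scores, threshold=0.5):
    """Greedy non-maximum suppression for [top, left, bottom, right] boxes.

    Returns the indices of the boxes to keep, highest score first.
    """
    top, left, bottom, right = boxes.T
    areas = (bottom - top) * (right - left)
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the current best box with the remaining ones.
        t = np.maximum(top[i], top[order[1:]])
        l = np.maximum(left[i], left[order[1:]])
        b = np.minimum(bottom[i], bottom[order[1:]])
        r = np.minimum(right[i], right[order[1:]])
        inter = np.clip(b - t, 0, None) * np.clip(r - l, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Drop boxes overlapping too much with the kept one.
        order = order[1:][iou <= threshold]
    return keep
```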
I would recommend making the NMS function from TinyFaces accessible to other functions, and using it in MTCNN as well to filter out overlapping faces.

MTCNN models should not be with the code (Yannick DAYER, 2022-12-21)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/92

Big files should not be in a git repository.
TODO:
- Upload the model (`src/bob/bio/face/mtcnn.pb`) on the WebDav server (like https://www.idiap.ch/software/bob/data/bob.bio.face)
- Use the download utility to retrieve the file at runtime in `bob_data`.
Assignee: Yannick DAYER

Youtube database is missing protocols (Yannick DAYER, 2022-12-21)
https://gitlab.idiap.ch/bob/bob.bio.video/-/issues/26
The new Youtube CSVDatabase protocol definition files are missing protocols `fold1` through `fold9`.
They need to be generated from the old version of the database (bob=11), and tests added to follow the change.
Assignee: Yannick DAYER

Face cropping by bounding box fails with negative top/left coordinates (Manuel Günther, 2023-02-15)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/91

In case of negative annotations of the bounding box, cropping the face will result in an error.
Apparently, the range
```
X[...,top:bottom,left:right]
```
will result in a dimension of 0 when top or left is negative, and therefore the cropping via OpenCV will fail:
https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/fb8ffece2423465fdbe6325c75845817d4b53a92/bob/bio/face/preprocessor/croppers.py#L390
Please note that the cropping works well for `FaceEyesNorm`, where the corresponding dimensions are padded before extraction: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/fb8ffece2423465fdbe6325c75845817d4b53a92/bob/bio/face/preprocessor/croppers.py#L294
Maybe we can make use of the `FaceEyesNorm` class here instead of trying to do the cropping by hand.
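A padded variant, in the spirit of what `FaceEyesNorm` does, could be sketched as follows; this is a hypothetical helper for illustration, not the actual `bob.bio.face` code:

```python
import numpy as np

def crop_with_padding(image, top, left, bottom, right):
    """Hypothetical helper: crop a (H, W) or (C, H, W) image to the given
    bounding box, zero-padding any part of the box that falls outside the
    image, so negative top/left coordinates are handled gracefully."""
    h, w = image.shape[-2:]
    out = np.zeros(image.shape[:-2] + (bottom - top, right - left),
                   dtype=image.dtype)
    # Overlap between the box and the image, in image coordinates.
    src_t, src_l = max(top, 0), max(left, 0)
    src_b, src_r = min(bottom, h), min(right, w)
    if src_t < src_b and src_l < src_r:
        # Copy the overlapping region into the right place of the output.
        out[..., src_t - top:src_b - top, src_l - left:src_r - left] = \
            image[..., src_t:src_b, src_l:src_r]
    return out
```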
Additionally, in the same line of code, it is assumed that the bounding box has the same aspect ratio as the `self.final_image_size`.
If this is not the case, the facial image will be distorted.
It would be great if we could adapt top/bottom or left/right such that the aspect ratio of the target size is kept (as far as possible, despite rounding issues).

Install ffmpeg on macOS M1 (André MAYORAZ, 2022-11-17)
https://gitlab.idiap.ch/bob/bob.bio.video/-/issues/25

Using methods from `imageio-ffmpeg` on macOS M1 leads to the following error:
```
RuntimeError: No FFmpeg exe could be found. Install FFmpeg on your system, or set the IMAGEIO_FFMPEG_EXE environment variable.
```
According to [this issue on GitHub](https://github.com/imageio/imageio-ffmpeg/issues/71), FFmpeg is currently not included in the imageio-ffmpeg wheel for macOS M1, but it does work when installed through conda.
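One possible workaround, assuming a system-wide ffmpeg is installed (e.g. via Homebrew), is to point `imageio-ffmpeg` at it through the environment variable named in the error, before the library is imported; a sketch:

```python
import os
import shutil

# If a system-wide ffmpeg binary exists, tell imageio-ffmpeg to use it
# instead of its (missing) bundled executable. Must run before importing
# imageio_ffmpeg.
ffmpeg_path = shutil.which("ffmpeg")
if ffmpeg_path is not None:
    os.environ["IMAGEIO_FFMPEG_EXE"] = ffmpeg_path
```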
This means we have to find a workaround to install ffmpeg if we do not want to use conda, or wait for the maintainers of `imageio-ffmpeg` to provide a proper wheel for this OS.

Default value of boxplot percentile should be None (Yu Linghu, 2022-10-28)
https://gitlab.idiap.ch/bob/bob.bio.demographics/-/issues/3

In `plot.py`, the `percentile` can be either `None` or a `float` between 0 and 1, but the `type` defined in the click option is `float`, so `None` is not a valid entry on the command line (`--percentile None`). The default for the `--percentile` option should therefore be `None`, so that it is still possible to pass a `float` explicitly. With a `float` default such as `0.5`, there is no way to request that the whole set of scores be used for the boxplot: the entry `--percentile None` raises an error.
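A sketch of the proposed fix (a hypothetical option declaration, not the actual `plot.py` code): with `default=None` and `type=float`, omitting the flag uses all scores, while a float can still be passed explicitly.

```python
import click

@click.command()
@click.option("--percentile", type=float, default=None,
              help="Optional percentile in [0, 1]; omit to use all scores.")
def boxplot(percentile):
    # None means "no percentile filtering"; a float triggers filtering.
    if percentile is None:
        click.echo("using all scores")
    else:
        click.echo(f"filtering at percentile {percentile}")
```

Note that with `type=float` the literal `--percentile None` still fails to parse; the point is that the flag can simply be omitted to get the `None` behaviour.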
I have this command updated in branch `change_boxplot_percentile_default`, but it cannot pass the CI/CD because the renamings in `bob.bio.base` are not reflected here yet. I will take care of it if necessary.

Wrong link in package documentation (Manuel Günther, 2022-10-04)
https://gitlab.idiap.ch/bob/bob.extension/-/issues/89

On the first page of the documentation, the link for `bob development tools` points to https://www.idiap.ch/software/bob/develop, which does not seem to exist. The same issue appears on other pages:
https://www.idiap.ch/software/bob/docs/bob/bob.extension/v7.0.2/development.html
https://www.idiap.ch/software/bob/docs/bob/bob.extension/v7.0.2/pure_python.html#building-your-package
I am not sure which should be the right page to point at.

Scale function on preprocessor/Scaler.py cannot handle variable input shapes (Luis LUEVANO, 2022-09-30)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/88

When running verification without annotations, the scale function of Scaler.py is used. However, it does not handle scaling for input images of different shapes within the same SampleBatch: in the scale function, check_array processes the SampleBatch assuming that all images in the batch have the shape of the first one, and throws an exception when the shapes differ.

Adding Model Complexity Measurements (Pasra Rahimi, 2022-09-22)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/87

I think we should introduce a couple of model complexity measurements (in the sense of number of parameters, execution time, FLOPS, ...) to the pipelines ...
This will be hard, especially in the case of execution time, since the infrastructure, to the best of my understanding, is not normalized at this point.
Let me know your comments.

Adding PFC (Pasra Rahimi, 2022-09-22)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/86

I will try to add the PFC (with ViT backbone) to the repo; if possible, please assign me.
Assignee: Pasra Rahimi

Formatting: output for compare_samples diagonal is not zero (Luis LUEVANO, 2022-09-12)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/85

The output of the compare_samples command is not zero when showing the diagonal of the All vs All comparison, in all pipelines.
Bad formatting example with mobilefacenet pipeline:
```
All vs All comparison
------------------ -----------------------
./me.jpg ./not_me.jpg
-0.0 -0.9227539984332366
-0.922753991574597 -3.5416114485542494e-14
------------------ -----------------------
```
However, it is correct with the resnet50-msceleb-arcface-2021 pipeline:
```
All vs All comparison
----------------- -----------------
./me.jpg ./not_me.jpg
-0.0 -1.03846231201703
-1.03846231201703 -0.0
----------------- -----------------
```
So far I have only tested a few pipelines:
- Bad formatting: facenet_sanderberg, arcface-insightface, mobilefacenet
- Correct formatting: resnet50-msceleb-arcface-2021, resnet50-msceleb-arcface20210521

Change GitLab's runners configuration to allow build and push of docker images using the CI (André MAYORAZ, 2022-08-19)
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/106

It is related to Issue [102](https://gitlab.idiap.ch/bob/bob.devtools/-/issues/102).
The goal here is to be able to build and push docker images with GitLab's CI, using either docker directly or podman, as follows:

```yaml
build_image:
  tags:
    - docker
    - bob
  stage: build_image
  image:
    name: quay.io/podman/stable
  before_script:
    - docker info
  script:
    - docker build --tag docker.idiap.ch/bob/bdt:latest .
```
It is currently impossible to do this because of access rights within the container.
A change in the runner's configuration has to be made to run the container in privileged mode. An example is shown in the [GitLab documentation](https://docs.gitlab.com/runner/executors/docker.html#use-podman-to-run-docker-commands-beta).

CI fails on macos-arm64 only packages (Yannick DAYER, 2022-09-20)
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/103

Defining the pins for the `mne` package (added by !307) fails on [`build_macos_arm_bob_devtools`](https://gitlab.idiap.ch/bob/bob.devtools/-/jobs/277736).
`mne` has different dependencies depending on the architecture, and needs `pyobjc-framework-cocoa` on mac machines.
However, it is not found when running on a mac arm CI pipeline: `package mne-1.1.0-hce30654_0 requires pyobjc-framework-cocoa, but none of the providers can be installed`
- Trying to install `mne` directly on the mac arm machine works and mamba finds `pyobjc-framework-cocoa`.
- [The other CI pipelines](https://gitlab.idiap.ch/bob/bob.devtools/-/pipelines/63615) pass.
**Assumption**: Our CI configuration for macOS arm is not correct and somehow does not search for packages on the `osx-arm64` platform on conda-forge.

RFW dataset: overlapping and mis-labelling between training and testing sets (Yu Linghu, 2022-07-13)
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/82

Based on the datasets we received from Wang et al., when we use z-samples or t-samples as shown below
https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/database/rfw.py#L242, 2 problems occurred during the experiments.
1. There are 44 subjects classified as Caucasian in the training set, but as Indian in the testing set. (e.g. m.0c96fs, m.08y5xt, etc.)
2. When we choose to obtain 2500 z-samples from each race as the cohort, we detect more than 6000 pairs of subjects (one from training and one from testing) that have very high similarity scores (-0.5~-0.1). After manually checking some of them, those samples belong to the same person, i.e. they are not impostor scores. So there is overlap between the training and testing sets, which is not supposed to happen.
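A quick way to surface such leakage is to intersect the subject identifiers of the two splits; the IDs below are placeholders based on the examples above, and in practice the sets would be read from the protocol files:

```python
# Hypothetical subject-ID sets; in practice these would be collected from
# the training and testing protocol definitions.
train_subjects = {"m.0c96fs", "m.08y5xt", "m.0aaaaa"}
test_subjects = {"m.0c96fs", "m.08y5xt", "m.0zzzzz"}

# Subjects present in both splits indicate train/test contamination.
overlap = sorted(train_subjects & test_subjects)
print(overlap)
```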
This bug report serves as a record of the problems. I'm not sure if these problems only happen to us because of different versions of the datasets.
We could discuss it at a later stage, e.g. using other BUPT datasets like BUPT-Balanced as the training set, since Wang et al. stated there is no overlap between BUPT-Balanced and RFW; face detection might be necessary, since no landmarks are given for this dataset.

Migrating the database interfaces to the new new CSV format (Amir MOHAMMADI, 2022-09-06)
https://gitlab.idiap.ch/bob/bob.bio.base/-/issues/186

A new interface is being implemented in https://gitlab.idiap.ch/bob/bob.bio.base/-/merge_requests/300, and I am trying to migrate all dbs to that.
This is a meta issue to track the migration.
- [ ] Convert all csv based ones to the new format
- [ ] Convert all custom interfaces to the new format
- [ ] Rewrite `bob.bio.face.database` file for database that were using custom interface
- [ ] Test and verify the new interfaces!
- [ ] Upload csv files according to https://gitlab.idiap.ch/bob/private/-/wikis/How-to-upload-resources
- follow the `data/bob/bob.bio.modality/bio-modality-database_name.tar.gz`
- Use this format in the code:
```python
from bob.extension.download import get_file
name = "bio-modality-database_name-586b7e81.tar.gz"
dataset_protocols_path = get_file(
    name,
    # don't use https here, use http so the link works in the CI as well.
    [f"https://www.idiap.ch/software/bob/data/bob/bob.bio.modality/{name}",
     f"http://www.idiap.ch/software/bob/data/bob/bob.bio.modality/{name}"],
    cache_subdir="protocols",
    file_hash="586b7e81",
)
# remove .urls method.
```
Separate problems:
- [ ] asvspoof2017-spoof the interface does not load
- [ ] utfvp has some columns without header in its CSV files!
- [ ] caspeal has some columns without header in its CSV files!
- [ ] gbu, rfw, lfw, ijbc, youtube interfaces are custom
- [ ] bob.bio.spear licit and spoof protocols need to change to the new format, like replaymobile
Assignee: Laurent COLBOIS

bob.measure.plot.det() yields points out of bounds in some cases (Alex UNNERVIK, 2022-05-31)
https://gitlab.idiap.ch/bob/bob.measure/-/issues/68

It seems that `bob.measure.plot.det()` yields incorrect plots where the values are outside of the expected [0,1] range, in at least certain cases.
It becomes impossible to plot correctly as a result.
Example code to reproduce the issue:
```
import bob.measure
import matplotlib.pyplot as plt
import numpy as np
plt.figure()
l1 = bob.measure.plot.det([1, 2, 3], [2, 2.5, 4], npoints=200) # works for apparently any number of points, including 3 and 2000
print(np.max((l1[0].get_ydata()))) # yields 8.126357928110227
print(np.min((l1[0].get_ydata()))) # yields -8.126357928110227
print(np.max((l1[0].get_xdata()))) # yields 8.126357928110227
print(np.min((l1[0].get_xdata()))) # yields -8.126357928110227
```
These values are inconsistent with expected values in the range of [0,1].
The image shows the result of plotting the above code, simply adding `plt.show()`: ![bob_det_plot](/uploads/9f7eb0264fac5629eeee1d4c4c633c6a/bob_det_plot.png)

Memory issues with big dataset (Yannick DAYER, 2022-05-16)
https://gitlab.idiap.ch/bob/bob.bio.spear/-/issues/41

When running a GMM experiment with a big dataset (NIST-SRE04to16), the pipeline fails with a memory error on a worker.
For example, on the branch of !55:
```bob bio pipeline simple -vvv -d nist-sre04to16 -p gmm-voxforge -o results~/gmm_nist -l sge```
I ran with the default Dask `sge` client as well as the `sge-io-big-non-adaptive`, asking for 128 nodes (but got only ~60 while running).
The issue seems to happen before reaching the k-means initialization, maybe hinting at an issue in the Dask bags to array wrapping.
I also tried running the experiment with a lower Dask memory limit for each node, forcing the workers to spill their memory to disk early, trying to prevent the memory error if it reached the hard cap. This failed too (the workers effectively spilled to disk but still failed with a memory error).
<details><summary>Local Output and Traceback (Click to expand)</summary>
```
[...]
bob.pipelines.wrappers@2022-05-16 13:04:59,280 -- DEBUG: ToDaskBag(npartitions=128).transform
bob.pipelines.wrappers@2022-05-16 13:04:59,926 -- DEBUG: Dask|Checkpoint|Sample|Energy_.transform
bob.pipelines.wrappers@2022-05-16 13:04:59,927 -- DEBUG: Dask|Checkpoint|Sample|Cepstra.transform
bob.pipelines.wrappers@2022-05-16 13:04:59,929 -- DEBUG: Dask|Checkpoint|Sample|GMM(con.fit
bob.pipelines.wrappers@2022-05-16 13:04:59,941 -- DEBUG: Preparing data as dask arrays for fit
Traceback (most recent call last):
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/bin/bob", line 10, in <module>
sys.exit(main_cli())
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
return self.main(*args, **kwargs)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1053, in main
rv = self.invoke(ctx)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/click/core.py", line 754, in invoke
return __callback(*args, **kwargs)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.bio.base/bob/bio/base/script/pipeline_simple.py", line 276, in pipeline_simple
execute_pipeline_simple(
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.bio.base/bob/bio/base/pipelines/entry_points.py", line 225, in execute_pipeline_simple
result = pipeline(
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.bio.base/bob/bio/base/pipelines/pipelines.py", line 109, in __call__
self.transformer = self.train_background_model(background_model_samples)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.bio.base/bob/bio/base/pipelines/pipelines.py", line 144, in train_background_model
return self.transformer.fit(background_model_samples)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/sklearn/pipeline.py", line 394, in fit
self._final_estimator.fit(Xt, y, **fit_params_last_step)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 881, in fit
return self._fit_on_dask_array(X, y, **fit_params)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 835, in _fit_on_dask_array
X, fit_params = self._get_fit_params_from_sample_bags(bags)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 816, in _get_fit_params_from_sample_bags
X = _array_from_sample_bags(bags, input_attribute, ndim=2)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 693, in _array_from_sample_bags
lengths, shapes = dask.compute(lengths, shapes)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/dask/base.py", line 573, in compute
results = schedule(dsk, keys, **kwargs)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/client.py", line 3010, in get
results = self.gather(packed, asynchronous=asynchronous, direct=direct)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/client.py", line 2162, in gather
return self.sync(
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/utils.py", line 311, in sync
return sync(
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/utils.py", line 378, in sync
raise exc.with_traceback(tb)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/utils.py", line 351, in f
result = yield future
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
value = future.result()
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/distributed/client.py", line 2025, in _gather
raise exception.with_traceback(traceback)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/dask/utils.py", line 39, in apply
return func(*args, **kwargs)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 664, in _sample_attribute
return [getattr(s, attribute) for s in samples]
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/wrappers.py", line 664, in <listcomp>
return [getattr(s, attribute) for s in samples]
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/sample.py", line 170, in __getattribute__
return super().__getattribute__(name)
File "/remote/idiap.svm/temp.devel01/ydayer/spear_develop/bob.pipelines/bob/pipelines/sample.py", line 188, in data
return self._load()
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/bob/io/base/__init__.py", line 191, in load
return open_file(inputs)
File "/idiap/home/ydayer/miniconda3/envs/spear_bob10/lib/python3.8/site-packages/bob/io/base/__init__.py", line 101, in open_file
return np.array(f[key])
numpy.core._exceptions.MemoryError: Unable to allocate 7.89 MiB for an array with shape (17226, 60) and data type float64
```
</details>