bob issues — https://gitlab.idiap.ch/groups/bob/-/issues

conda-forge migration
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/72 — Amir MOHAMMADI — 2022-03-30

It might be in our interest to base our builds on conda-forge.
Here is what comes to my mind to enable the migration:
* [x] Cache recent (and needed) conda-forge packages from the linux-64, osx-64, and osx-arm64 builds
* [x] Vincent: update [libblitz-feedstock](https://github.com/conda-forge/libblitz-feedstock/blob/master/recipe/meta.yaml) to match [ours](https://gitlab.idiap.ch/bob/conda/-/blob/master/conda/libblitz/meta.yaml)
* [x] Tiago: remove vlfeat dependency from bob.ip.base ~~Fix and merge https://github.com/conda-forge/vlfeat-feedstock/pull/9~~
* [x] Laurent: Submit [zc.buildout](https://gitlab.idiap.ch/bob/conda/-/tree/master/conda/zc.buildout) to conda-forge
* [x] Laurent: Submit [zc.recipe.egg](https://gitlab.idiap.ch/bob/conda/-/tree/master/conda/zc.recipe.egg) to conda-forge
* [x] Amir: We need to use conda-forge docker image(s), their compilers, and mac requirements
* [x] Remove `deps` from bob/bob.devtools>
* [x] Start building bob/bob.devtools> based on conda-forge
* [x] Update pin versions in bob.devtools
* [x] Archive bob/conda>
* [x] Amir: We need to delist packages in our channel that are already in conda-forge using metadata patches
* [x] Delete all the beta packages
* [x] Start building bob packages based on conda-forge
* [x] Use https://github.com/conda-forge/miniforge/ instead of miniconda
* [x] We might need to support `yum-requirements.txt` to support `pyopengl`, which needs the `mesa-libGL-devel` package installed. See: https://gitlab.idiap.ch/bob/bob.devtools/-/blob/5cd514f64181dd0ee834aaf61ebb189c0257e9f8/.gitlab-ci.yml#L71 (https://gitlab.idiap.ch/bob/bob.ip.view uses pyopengl)

Cleanup face cropping helpers
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/46 — Laurent COLBOIS — 2021-10-29

The available embedding extractors expect various types of cropping (various cropped image sizes, more or less tight).
Currently those defaults are provided through some helper functions (https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/config/baseline/helpers.py) that are not well documented and confusing to follow. The aim is to clean up and better document the face cropping, mainly:
1. Baselines cleanup
* [x] Hard code specific cropped positions for available baselines directly in the associated config file
* [x] Set the correct annotator that should be used with each extractor (-> the same one as during training)
2. Helpers cleanup
* [x] Refactor to provide only a few general-purpose default cropped positions (e.g. wide & tight crops)
* [x] Document the available default crops
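To make point 2 concrete, here is a purely hypothetical sketch of what a single general-purpose helper could look like — the function name, modes, and eye-position ratios are all invented for illustration and are not the actual bob.bio.face API:

```python
# Hypothetical helper (invented name and ratios, not the real bob.bio.face
# code): one documented function providing a small set of general-purpose
# default cropped eye positions, scaled to any requested crop size.
def default_cropped_positions(crop_size, mode="wide"):
    """Return (y, x) eye positions for a 'wide' or 'tight' default crop."""
    height, width = crop_size
    ratios = {  # fraction of the crop height/width up to each eye landmark
        "wide": {"eye_y": 0.40, "reye_x": 0.35, "leye_x": 0.65},
        "tight": {"eye_y": 0.35, "reye_x": 0.30, "leye_x": 0.70},
    }[mode]
    return {
        "reye": (round(ratios["eye_y"] * height), round(ratios["reye_x"] * width)),
        "leye": (round(ratios["eye_y"] * height), round(ratios["leye_x"] * width)),
    }
```

A baseline config would then hard-code its call, e.g. `default_cropped_positions((112, 112), "tight")`, rather than duplicating position arithmetic per baseline.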
I'll be working on that.

MultiFace crop issue
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/45 — Tiago de Freitas Pereira — 2021-10-29

Hi @lcolbois
If I run
`./bin/bob bio pipeline vanilla-biometrics multipie-pose arcface-insightface -m -c -o /path/ -vvv`
I get the following issue.
```
allow_scoring_with_all_biometric_references=allow_scoring_with_all_biometric_references,
File "/remote/idiap.svm/user.active/tpereira/gitlab/bob/bob.nightlies/src/bob.bio.base/bob/bio/base/pipelines/vanilla_biometrics/pipelines.py", line 100, in __call__
f" >> Vanilla Biometrics: Training background model with pipeline {self.transformer}"
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/base.py", line 260, in __repr__
repr_ = pp.pformat(self)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/pprint.py", line 144, in pformat
self._format(object, sio, 0, 0, {}, 0)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/pprint.py", line 161, in _format
rep = self._repr(object, context, level)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/pprint.py", line 393, in _repr
self._depth, level)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 181, in format
changed_only=self._changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 437, in _safe_repr
v, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 406, in _safe_repr
o, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 406, in _safe_repr
o, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 437, in _safe_repr
v, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 437, in _safe_repr
v, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 437, in _safe_repr
v, context, maxlevels, level, changed_only=changed_only)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 425, in _safe_repr
params = _changed_params(object)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/utils/_pprint.py", line 91, in _changed_params
params = estimator.get_params(deep=False)
File "/idiap/user/tpereira/conda/envs/bob.nightlies/lib/python3.7/site-packages/sklearn/base.py", line 195, in get_params
value = getattr(self, key)
AttributeError: 'MultiFaceCrop' object has no attribute 'allow_upside_down_normalized_faces'
```
Since you've implemented this feature, do you mind having a look?
Thanks
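The traceback boils down to scikit-learn's `get_params` assuming that every `__init__` argument is stored as an attribute of the same name. This failure mode can be reproduced without sklearn at all — the class name below is a hypothetical stand-in, not the real `MultiFaceCrop`:

```python
# Minimal reproduction of the failure mode (no sklearn needed): a
# get_params-style introspection raises AttributeError when an __init__
# argument was never assigned to an attribute of the same name.
import inspect

class MultiFaceCropLike:
    def __init__(self, allow_upside_down_normalized_faces=False):
        pass  # argument intentionally not stored on self

def get_params(estimator):
    """Mimic sklearn.base.BaseEstimator.get_params(deep=False)."""
    names = inspect.signature(type(estimator).__init__).parameters
    return {k: getattr(estimator, k) for k in names if k != "self"}

try:
    get_params(MultiFaceCropLike())
except AttributeError as exc:
    print(exc)  # 'MultiFaceCropLike' object has no attribute 'allow_upside_down_normalized_faces'
```

The usual fix is simply `self.allow_upside_down_normalized_faces = allow_upside_down_normalized_faces` in the constructor.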
ping @ageorge

Filename extension of ARface database is undefined
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/44 — Manuel Günther (siebenkopf@googlemail.com) — 2021-04-08

When downloading the images of the AR face database from its original site (https://www2.ece.ohio-state.edu/~aleix/ARdatabase.html), the images are stored in `.raw` format. Depending on which format was used to convert the files, different file name extensions need to be provided for the arface database interface: https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/fdf454e829c320df1747cbbbcd050be9f6a26b34/bob/bio/face/config/database/arface.py#L10
Currently, this extension is fixed to `.png`. It would be optimal to have a resource parameter to update this extension in case people stored the images in a different format -- as is the case at Idiap, where `.ppm` was used.

Severe bug in LDA (default algorithm)
https://gitlab.idiap.ch/bob/bob.learn.linear/-/issues/14 — André Anjos — 2021-09-14

The default algorithm in this package is wrong. We cannot use the symmetric solver to evaluate the eigenvalue decomposition of $S_w^{-1} S_b$, because that product is not guaranteed to be symmetric.
We should always be using `pinv`. There should be no other option.
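To illustrate the point numerically — this is a NumPy sketch on assumed synthetic data, not the package's C++ code — the product of the inverted within-class scatter and the between-class scatter is generally non-symmetric, so a symmetric eigensolver such as `numpy.linalg.eigh` is the wrong tool and the general `numpy.linalg.eig` must be used:

```python
# NumPy illustration (synthetic data, not the bob.learn.linear implementation):
# LDA directions are eigenvectors of pinv(Sw) @ Sb, which is non-symmetric
# even though Sw and Sb are each symmetric.
import numpy as np

rng = np.random.default_rng(0)
classes = [rng.normal(loc=m, size=(50, 3)) for m in (0.0, 2.0)]  # two classes
mu = np.mean(np.vstack(classes), axis=0)
Sw = sum(np.cov(c.T, bias=True) * len(c) for c in classes)  # within-class scatter
Sb = sum(len(c) * np.outer(c.mean(0) - mu, c.mean(0) - mu) for c in classes)

M = np.linalg.pinv(Sw) @ Sb      # generally M != M.T, so eigh() is invalid here
w, V = np.linalg.eig(M)          # general eigensolver: the correct choice
lda_direction = V.real[:, np.argmax(w.real)]
```

Using a symmetric solver on `M` silently returns eigenpairs of a matrix that is not `M`, which is exactly why the "symmetric" option should not exist.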
References:
* https://gitlab.idiap.ch/bob/bob.learn.linear/-/blob/master/bob/learn/linear/include/bob.learn.linear/lda.h#L49
* https://gitlab.idiap.ch/bob/bob.learn.linear/-/blob/master/bob/learn/linear/cpp/lda.cpp#L137

pytest still runs the slow nosetests
https://gitlab.idiap.ch/bob/bob/-/issues/267 — Amir MOHAMMADI — 2022-05-10

We used to export `NOSE_EVAL_ATTR="not slow"` to have nosetests skip some tests, but in this project we now run all tests with pytest, and pytest ignores this environment variable.

ArcFace (MXNET) + IJBC lead to memory error
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/42 — Laurent COLBOIS — 2021-03-02

Hi,
For info, I tried to run the ArcFace baseline on IJBC, but I can't seem to make it work; there is a memory error like:
```
mxnet.base.MXNetError: [11:11:36] /tmp/build/80754af9/libmxnet_1564766659613/work/src/storage/./cpu_device_storage.h:75: Failed to allocate CPU Memory
Stack trace:
[bt] (0) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x38e1c4) [0x7f9ad17381c4]
[bt] (1) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x26a34f3) [0x7f9ad3a4d4f3]
[bt] (2) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(mxnet::StorageImpl::Alloc(mxnet::Storage::Handle*)+0x5d) [0x7f9ad3a52dbd]
[bt] (3) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x399cd1) [0x7f9ad1743cd1]
[bt] (4) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x493357) [0x7f9ad183d357]
[bt] (5) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x496107) [0x7f9ad1840107]
[bt] (6) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x49653e) [0x7f9ad184053e]
[bt] (7) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x49da1d) [0x7f9ad1847a1d]
[bt] (8) /idiap/temp/lcolbois/miniconda3/envs/bob_tf2/lib/python3.7/site-packages/mxnet/libmxnet.so(+0x49dadf) [0x7f9ad1847adf]
```
I tried to play a bit with the Dask partition size parameter with no success.
Note that the ArcFace model works with other, lighter databases (Mobio, Multipie), and IJBC works with the TF2 baselines; the MXNet + IJBC combination, though, seems to be too demanding. I remember we adjusted the TF2 implementation to add an option to toggle samplewise inference (vs. by batch) to limit the memory footprint; is there something similar we should do with MXNet models?
I can manage my work without running this evaluation, but I just wanted to point it out.

GeomNorm and extrapolate_mask require different mask types
https://gitlab.idiap.ch/bob/bob.ip.base/-/issues/14 — Manuel Günther (siebenkopf@googlemail.com) — 2021-02-24

When looking at the two functions `GeomNorm.process` and `extrapolate_mask` for color images, two different mask types are required:
- `GeomNorm` takes a 3D mask: https://gitlab.idiap.ch/bob/bob.ip.base/-/blob/a9109e06264f2e28ece3b131da9babb0365b6f36/bob/ip/base/include/bob.ip.base/GeomNorm.h#L99 (implementation here: https://gitlab.idiap.ch/bob/bob.ip.base/-/blob/a9109e06264f2e28ece3b131da9babb0365b6f36/bob/ip/base/include/bob.ip.base/GeomNorm.h#L166)
- `extrapolate_mask` takes a 2D mask: https://gitlab.idiap.ch/bob/bob.ip.base/-/blob/a9109e06264f2e28ece3b131da9babb0365b6f36/bob/ip/base/include/bob.ip.base/Affine.h#L445
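The 2D-mask proposal is cheap to support. A small NumPy sketch (assuming Bob's `(channels, height, width)` image layout; this is illustrative code, not the C++ API) shows that a single 2D mask carries the same information as a per-channel 3D mask and can be broadcast wherever a 3D mask is expected:

```python
# Sketch of the proposal (not the actual C++ API): one 2D visibility mask
# per image suffices for color images, since visibility is per-pixel, not
# per-channel; broadcasting recovers the 3D form when needed.
import numpy as np

image = np.zeros((3, 128, 128))                # (channels, height, width)
mask2d = np.ones(image.shape[1:], dtype=bool)  # one visibility flag per pixel
mask3d = np.broadcast_to(mask2d, image.shape)  # view for 3D-mask consumers

# Every channel sees the identical per-pixel visibility:
assert all(np.array_equal(mask3d[c], mask2d) for c in range(3))
```

Accepting both shapes (and broadcasting 2D to 3D internally) would preserve backward compatibility.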
This is a bit inconsistent. I am not sure why `GeomNorm` requires a 3D mask for color images: each pixel has the same visibility independent of the color channel. I would therefore propose to reduce the mask for `GeomNorm` to 2D. We could also accept both 2D and 3D masks, in case you require backward compatibility (which I could understand).

Documentation on loading annotations
https://gitlab.idiap.ch/bob/bob.bio.video/-/issues/17 — Anjith George (anjith.george@idiap.ch) — 2021-02-23

Can you add some documentation on how to pass annotations to the preprocessor?
https://gitlab.idiap.ch/bob/bob.bio.video/-/blob/master/bob/bio/video/transformer.py#L42

Nightlies failing because of this one
https://gitlab.idiap.ch/bob/bob.pad.base/-/issues/38 — Tiago de Freitas Pereira — 2021-02-15

Hey @ydayer,
It's a sphinx warning, check:
https://gitlab.idiap.ch/bob/nightlies/-/jobs/223557
Do you mind having a look on this?
Thanks

`sphinx_rtd_theme` not in the list of required packages
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/71 — Manuel Günther (siebenkopf@googlemail.com) — 2021-02-12

When installing `bob.devtools` and trying to generate the documentation of a certain package (`bob.db.lfw` in my case), I got the error that `sphinx_rtd_theme` is missing:
```
Configuration error:
There is a programmable error in your configuration file:
Traceback (most recent call last):
File "envs/bob/lib/python3.7/site-packages/sphinx/config.py", line 326, in eval_config_file
execfile_(filename, namespace)
File "envs/bob/lib/python3.7/site-packages/sphinx/util/pycompat.py", line 88, in execfile_
exec(code, _globals)
File "bob.db.lfw/doc/conf.py", line 137, in <module>
import sphinx_rtd_theme
ModuleNotFoundError: No module named 'sphinx_rtd_theme'
```
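For reference, one hypothetical way a package's `doc/conf.py` could degrade gracefully instead of hard-failing — the alabaster fallback is my assumption for illustration, not what `bob.db.lfw` actually does:

```python
# Hypothetical conf.py fragment (not the actual bob.db.lfw configuration):
# fall back to a theme shipped with Sphinx when sphinx_rtd_theme is missing.
try:
    import sphinx_rtd_theme  # noqa: F401
    html_theme = "sphinx_rtd_theme"
except ImportError:
    html_theme = "alabaster"  # built into Sphinx, always available
```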
When looking for that package here, it is only listed as a test requirement. Is this intended behavior?

Combined `view-2` protocol
https://gitlab.idiap.ch/bob/bob.db.lfw/-/issues/2 — Manuel Günther (siebenkopf@googlemail.com) — 2021-02-12

Currently, running an experiment on `view 2` of the dataset requires running 10 different experiments and, at the end, concatenating all score files into one in order to plot ROC curves.
Since modern deep-learning evaluations do not require the training split of the 10-fold LFW data splits, it would be easier to implement an `eval`-only (or maybe `dev`-only) protocol on `view 2` that simply concatenates all 10 folds of `view 2`.

EyesAnnotations ordering of x & y coordinates is wrong
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/41 — Laurent COLBOIS — 2021-02-09

Hi,
If I am not mistaken, the adopted convention for face landmarks in `bob.bio.face` is to provide them under the form
`lm = (y coordinate, x coordinate)`.
The [`EyesAnnotations` loader](https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/database/sample_loaders.py#L34) does the opposite, i.e. `lm = (x coordinate, y coordinate)`, same with the `MultiposeAnnotations` loader.
The issue does not show on the CSVDatabase implementation of Mobio and Multipie, because the x- and y- coordinates are *also* inverted in those CSV files.
Example from mobio :
```
| PATH | REFERENCE_ID | GENDER | DEVICE | ENVIRONMENT | SESSION_ID | SHORT_ID | SPEECH_TYPE | reye_x | reye_y | leye_x | leye_y |
|--------------------------------------|--------------|--------|--------|-------------|------------|----------|-------------|--------|--------|--------|--------|
| uman/m103/01_mobile/m103_01_p01_i0_0 | 103 | m | mobile | i | 1 | 1 | p | 186 | 243 | 187 | 329 |
```
You can see the right and left eyes have almost the same x-coordinate, which is wrong.
With the corrected coordinates, the cropped images are rotated by 90° because of the issue in `EyesAnnotations`.
Fixing the code is easy; however, when we do it we should also update the CSV files for Mobio and Multipie. What's the process to update those CSV files?
ping @tiago.pereira
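For illustration, a hypothetical loader fix (invented helper name, not the actual `sample_loaders.py` code) that maps the CSV eye columns shown above onto the (y, x) convention expected downstream:

```python
# Hypothetical helper (not the real EyesAnnotations code): build the
# annotation dictionary in the (y, x) convention from CSV eye columns.
def eyes_annotations_yx(row):
    return {
        "reye": (float(row["reye_y"]), float(row["reye_x"])),
        "leye": (float(row["leye_y"]), float(row["leye_x"])),
    }

# Mobio row from the table above (columns as stored, i.e. possibly swapped):
row = {"reye_x": 186, "reye_y": 243, "leye_x": 187, "leye_y": 329}
print(eyes_annotations_yx(row))  # {'reye': (243.0, 186.0), 'leye': (329.0, 187.0)}
```

Of course, if the CSV columns themselves are swapped as described, both the helper and the files have to be fixed together to end up with correct (y, x) values.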
Edit: I guess this is only an issue because we provide some default `cropped_positions` that follow the (y, x) convention, so we could also keep `EyesAnnotations` as it is and change the defaults to follow the (x, y) convention. But in any case, we'll need to fix the CSV files.

Convention when feeding data to an annotator
https://gitlab.idiap.ch/bob/bob.bio.face/-/issues/40 — Laurent COLBOIS — 2021-02-05

Hi, I have some trouble using face annotators, and I think the issue is caused by an inconsistency between feeding a batch of images vs. a single image when calling the annotator. More precisely:
+ The FaceCrop [sends a **single image** to the annotator](https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/preprocessor/FaceCrop.py#L349), and [expects to receive a **single dictionary of annotations**](https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/preprocessor/FaceCrop.py#L299)
+ While (from my understanding) the [Base annotator](https://gitlab.idiap.ch/bob/bob.bio.face/-/blob/master/bob/bio/face/annotator/Base.py#L5) expects a **list of images** and returns a **list of annotation dictionaries** (one dictionary per provided image).
I should be able to propose an easy fix, but which is the correct expected behavior?
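Whichever convention is chosen, adapting between the two is a one-line wrapper. A hypothetical sketch (invented functions, not the actual bob.bio.face API) of the mismatch and the adapter:

```python
# Hypothetical sketch (stand-in functions, not the real API): a batch-style
# annotator returns one annotation dict per image, so a single-image caller
# such as FaceCrop only needs to wrap/unwrap by one list level.
import numpy as np

def annotate_batch(images):
    """Batch convention: list of images in, list of annotation dicts out."""
    return [{"topleft": (0, 0), "bottomright": img.shape[-2:]} for img in images]

def annotate_one(image):
    """Single-image convention, implemented on top of the batch one."""
    return annotate_batch([image])[0]
```

The real question in this issue is which of the two signatures the callers should standardize on, not how to convert between them.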
ping @ydayer @tiago.pereira

Removing `llvm-tools` from build dependencies in `bdt create` is undesirable
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/70 — André Anjos — 2021-10-29

This was done to overcome another issue regarding `llvm-tools` and its dependency on `libllvm10`.
One of the (build) packages we depend on wants `libllvm10=10.0.0`, but `llvm-tools` requires `libllvm10=10.0.1`. This creates a conflict. Build requirements from our packages will end up bringing in the requirement for `llvm-tools`.
To test this, on a macOS system, try the following:
```sh
$ conda create --dry-run --name xxx --override-channels --channel=http://www.idiap.ch/software/bob/conda/label/beta --channel=http://www.idiap.ch/software/bob/conda --channel=defaults --dry-run 'bob-devel=2021.01.28.*' clang llvm-tools
```
The problem is, possibly, that our own bob-devel brings in dependencies that make the above build tools conflict. Removing any of the 3 packages listed above from the command line makes it work again. Once that command (or a variant with a newer version of bob-devel) works, the problem is fixed and a patch is no longer required.

Pin of `smmap<4` is undesirable
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/69 — André Anjos — 2021-04-10

This pin was put in place to fix a dependency issue with the package `gitdb`, which we don't provide, but depend on.
This can be undone when this (https://github.com/ContinuumIO/anaconda-issues/issues/12255) is fixed.
To undo this work, just revert !205.

Nightlies MacOS SegFault
https://gitlab.idiap.ch/bob/nightlies/-/issues/59 — Tiago de Freitas Pereira — 2021-01-22

With this MR (https://gitlab.idiap.ch/bob/bob.devtools/-/merge_requests/203) the segfault was solved for individual packages.
However, the problem still remains with nightly builds.
https://gitlab.idiap.ch/bob/nightlies/-/jobs/221664
Looking at the logs, there are still some references pointing to the 10.9 SDK.
```
creating build/temp.macosx-10.9-x86_64-3.7/bob/blitz
x86_64-apple-darwin13.4.0-clang -fno-strict-aliasing -Wsign-compare -Wunreachable-code -DNDEBUG -fwrapv -O3 -Wall -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O3 -pipe -fdebug-prefix-map=${SRC_DIR}=/usr/local/src/conda/${PKG_NAME}-${PKG_VERSION} -fdebug-prefix-map=$PREFIX=/usr/local/src/conda-prefix -flto -Wl,-export_dynamic -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O3 -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -isystem $PREFIX/include -fdebug-prefix-map=$SRC_DIR=/usr/local/src/conda/bob.blitz-2.0.23b0 -fdebug-prefix-map=$PREFIX=/usr/local/src/conda-prefix -D_FORTIFY_SOURCE=2 -mmacosx-version-min=10.9 -isystem $PREFIX/include -Wno-strict-aliasing -DBOB_EXT_MODULE_PREFIX="bob.blitz" -DBOB_EXT_MODULE_NAME="version" -DBOB_EXT_ENTRY_NAME=PyInit_version -DBOB_EXT_MODULE_VERSION="2.0.23b0" -DHAVE_BOOST=1 -DHAVE_BLITZ=1 -DPY_ARRAY_UNIQUE_SYMBOL=BOB_BLITZ_NUMPY_C_API -DNO_IMPORT_ARRAY=1 -DNPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION -I/Users/gitlab/builds/b6d3167a/0/bob/nightlies/src/bob/bob.blitz/bob/blitz/include -I$PREFIX/lib/python3.7/site-packages/bob/extension/include -I$PREFIX/include/python3.7m -c bob/blitz/version.cpp -o build/temp.macosx-10.9-x86_64-3.7/bob/blitz/version.o -mmacosx-version-min=10.10 -isysroot /opt/MacOSX10.10.sdk -march=core2 -mtune=haswell -mssse3 -ftree-vectorize -fPIC -fPIE -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -std=c++14 -fmessage-length=0 -fdebug-prefix-map=$SRC_DIR=/usr/local/src/conda/bob.blitz-2.0.23b0 -fdebug-prefix-map=$PREFIX=/usr/local/src/conda-prefix -Wno-#warnings -pthread -isystem $PREFIX/lib/python3.7/site-packages/numpy/core/include -isystem $PREFIX/include
```
or
```
+CPPFLAGS=-D_FORTIFY_SOURCE=2 -mmacosx-version-min=10.9 -isystem $PREFIX/include
```
I don't know if these lines have a major impact; I also don't know where the 10.9 comes from.

CI is not working
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/68 — Tiago de Freitas Pereira — 2021-01-19

From today, builds are not working.
https://gitlab.idiap.ch/bob/bob.buildout/-/jobs/221383
What happened with `conda-build`?
Thanks
```
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
TOTAL 862 649 25%
Coverage HTML written to dir /scratch/builds/bob/bob.buildout/conda/../sphinx/coverage
Coverage XML written to file /scratch/builds/bob/bob.buildout/conda/../coverage.xml
============================== 8 passed in 0.90s ===============================
+ conda inspect linkages -p /scratch/builds/bob/bob.buildout/miniconda/conda-bld/bob.buildout_1610543493742/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pla bob.buildout
Traceback (most recent call last):
File "/scratch/builds/bob/bob.buildout/miniconda/bin/conda-inspect", line 7, in <module>
from conda_build.cli.main_inspect import main
ModuleNotFoundError: No module named 'conda_build'
Tests failed for bob.buildout-2.2.6b0-py37ha12b548_43.conda - moving package to /scratch/builds/bob/bob.buildout/miniconda/conda-bld/broken
TESTS FAILED: bob.buildout-2.2.6b0-py37ha12b548_43.conda
Uploading artifacts for failed job
00:01
Uploading artifacts...
coverage.xml: found 1 matching files and directories
Uploading artifacts as "cobertura" to coordinator... ok id=221383 responseStatus=201 Created token=cHw11iVi
Cleaning up file based variables
00:02
ERROR: Job failed: exit code 1
```

Moving from macOS 10.13 to a newer version
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/67 — André Anjos — 2021-01-05

While trying to upgrade some packages on the mac ci (black tower), I got this message today:
```
Warning: You are using macOS 10.13.
We (and Apple) do not provide support for this old version.
You will encounter build failures with some formulae.
Please create pull requests instead of asking for help on Homebrew's GitHub,
Twitter or any other official channels. You are responsible for resolving
any issues you experience while you are running this
old version.
```
We need to consider an OS upgrade for both machines. This should be OK as we will still be able to compile for an older version of macOS as we currently do.
I propose we bump the OS *directly* to version 10.15 (Catalina).

Location of miniconda install script
https://gitlab.idiap.ch/bob/bob.devtools/-/issues/66 — Vincent POLLET — 2021-01-08

I am getting a 302 error when running [bootstrap.py](https://gitlab.idiap.ch/bob/bob.devtools/-/blob/master/bob/devtools/bootstrap.py), in the [request](https://gitlab.idiap.ch/bob/bob.devtools/-/blob/master/bob/devtools/bootstrap.py#L241) for the miniconda install script.
The URL of the request is `http://www.idiap.ch/miniconda/Miniconda3-py37_4.8.2-Linux-x86_64.sh`, and the URL of the redirection is the same but with https instead of http. However, I get a 404 at the https URL. Is the resource missing, or am I doing something wrong?
For context, I am trying to find out why the [build of the opencv package in bob/conda](https://gitlab.idiap.ch/bob/conda/-/jobs/220377) is failing in the CI but working fine locally, so I am trying to run the build steps of [base-build.yaml](https://gitlab.idiap.ch/bob/bob.devtools/-/blob/master/bob/devtools/data/gitlab-ci/base-build.yaml) in a Docker container with the c3i-linux-64 image.