You can now specify a minimum file size for saved files
Sometimes the saved preprocessed data, features, or models are smaller than 1000
bytes.
This can happen when using a proprietary algorithm.
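A minimal sketch of how such a check could look, assuming a hypothetical is_valid_file helper and a minimum_file_size parameter (neither name is taken from this merge request):

```python
import os

def is_valid_file(path, minimum_file_size=1000):
    """Consider a saved file valid only if it exists and is larger than
    the configured minimum size in bytes; smaller files are treated as
    broken and recreated on the next run."""
    return os.path.isfile(path) and os.path.getsize(path) > minimum_file_size
```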
added 1 commit
- 90f3840f - You can now specify a minimum file size for saved files
Indeed, this was a point that I never really liked. I just hard-coded the 1000 bytes because an empty (and therefore invalid) .hdf5 file had 800 bytes, while any reasonable .hdf5 file was larger. Hence, as long as we only use(d) .hdf5 to store data, invalid files smaller than 1000 bytes were always recreated, while (valid) files larger than 1000 bytes were skipped.
Having this size specifiable, and even different for different types of files, looks absolutely reasonable to me. Just one note: I don't think that we need to specify different file sizes for a model and an at_model. Both are created with the same function (enroll) and should have the same minimal size.
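As a sketch of that suggestion, a per-type mapping could carry a single entry for both kinds of models; the dictionary keys here are hypothetical, not the actual option names from this merge request:

```python
# Hypothetical per-type minimum sizes in bytes; the "model" entry covers
# both model and at_model files, since enroll() writes them the same way.
minimum_file_sizes = {
    "preprocessed": 1000,
    "extracted": 1000,
    "model": 1000,  # shared by model and at_model
}
```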
enabled an automatic merge when the pipeline for c3a14afc succeeds
mentioned in commit 7bd85a60