Commit 93c487b6 authored by André Anjos

Use bob_bio_vein prefix on all scripts

parent 0e297d93
Pipeline #20776 passed in 25 minutes and 23 seconds
@@ -254,7 +254,7 @@ correspond to the equal-error rate on the development set, in percentage):
 Feature Extractor        Full   B      Nom    1vsall nom
 ======================== ====== ====== ====== ====== ======
 Repeated Line Tracking   14.6   13.4   23.6   3.4    1.4
-Wide Line Detector       5.8    5.6    9.7    2.8    1.9
+Wide Line Detector       5.8    5.6    9.9    2.8    1.9
 Maximum Curvature        2.5    1.4    4.5    0.9    0.4
 ======================== ====== ====== ====== ====== ======
@@ -368,11 +368,11 @@ When used to run an experiment,
 :py:class:`bob.bio.vein.preprocessor.WatershedMask` requires you provide a
 *pre-trained* neural network model that presets the markers before
 watershedding takes place. In order to create one, you can run the program
-`bob_vein_markdet.py`:
+`bob_bio_vein_markdet.py`:

 .. code-block:: sh

-   $ bob_vein_markdet.py --hidden=20 --samples=500 fv3d central dev
+   $ bob_bio_vein_markdet.py --hidden=20 --samples=500 fv3d central dev

 You input, as arguments to this application, the database, protocol and subset
 name you wish to use for training the network. The data is loaded observing a
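For readers unfamiliar with marker-based watershed: the trained model only supplies seed labels ("markers") from which the watershed flooding then grows the finger region. A minimal, generic sketch with scikit-image (an illustration of the technique only, not this package's implementation; the image and marker positions below are made up):

.. code-block:: python

   import numpy
   from skimage.segmentation import watershed

   # hypothetical single-channel finger image, scaled to [0, 1]
   image = numpy.random.rand(100, 200)

   # markers preset by a (hypothetical) trained detector:
   # 0 = undecided, 1 = background seed, 2 = finger (foreground) seed
   markers = numpy.zeros(image.shape, dtype=numpy.int32)
   markers[:5, :] = 1            # top border assumed background
   markers[45:55, 90:110] = 2    # image centre assumed finger

   # grow the seeds over the image; pixels labelled 2 form the finger mask
   labels = watershed(image, markers)
   mask = labels == 2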
@@ -388,9 +388,9 @@ Region of Interest Goodness of Fit
 ==================================

 Automatic region of interest (RoI) finding and cropping can be evaluated using
-a couple of scripts available in this package. The program ``bob_vein_compare_rois.py``
-compares two sets of ``preprocessed`` images and masks, generated by
-*different* preprocessors (see
+a couple of scripts available in this package. The program
+``bob_bio_vein_compare_rois.py`` compares two sets of ``preprocessed`` images
+and masks, generated by *different* preprocessors (see
 :py:class:`bob.bio.base.preprocessor.Preprocessor`) and calculates a few
 metrics to help you determine how both techniques compare. Normally, the
 program is used to compare the result of automatic RoI to manually annotated
@@ -400,7 +400,7 @@ extracted ones. E.g.:

 .. code-block:: sh

-   $ bob_vein_compare_rois.py ~/verafinger/mc_annot/preprocessed ~/verafinger/mc/preprocessed
+   $ bob_bio_vein_compare_rois.py ~/verafinger/mc_annot/preprocessed ~/verafinger/mc/preprocessed
    Jaccard index: 9.60e-01 +- 5.98e-02
    Intersection ratio (m1): 9.79e-01 +- 5.81e-02
    Intersection ratio of complement (m2): 1.96e-02 +- 1.53e-02
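For reference, metrics of this kind can be computed from two boolean masks with plain NumPy. The sketch below assumes ``a`` is the manually annotated mask and ``b`` the automatically extracted one; the exact definitions used by ``bob_bio_vein_compare_rois.py``, in particular for ``m2``, are assumptions rather than taken from its source:

.. code-block:: python

   import numpy

   # a, b: boolean numpy arrays of identical shape (True = finger pixel)

   def jaccard(a, b):
       """Jaccard index: |a & b| / |a | b| (1.0 for identical masks)"""
       return numpy.logical_and(a, b).sum() / numpy.logical_or(a, b).sum()

   def intersect_ratio(a, b):
       """m1-like: fraction of the annotated mask ``a`` also covered by ``b``"""
       return numpy.logical_and(a, b).sum() / a.sum()

   def intersect_ratio_of_complement(a, b):
       """m2-like (assumed definition): part of ``b`` outside ``a``,
       relative to the size of ``a``"""
       return numpy.logical_and(numpy.logical_not(a), b).sum() / a.sum()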
@@ -420,12 +420,12 @@ metrics.
 Pipeline Display
 ================

-You can use the program ``bob_vein_view_sample.py`` to display the images after
-full processing using:
+You can use the program ``bob_bio_vein_view_sample.py`` to display the images
+after full processing using:

 .. code-block:: sh

-   $ bob_vein_view_sample.py --save=output-dir verafinger /path/to/processed/directory 030-M/030_L_1
+   $ bob_bio_vein_view_sample.py --save=output-dir verafinger /path/to/processed/directory 030-M/030_L_1
    $ # open output-dir

 And you should be able to view images like these (example taken from the Vera
@@ -436,7 +436,7 @@ feature extractor):
    :scale: 50%

    Example RoI overlaid on finger vein image of the Vera fingervein database,
-   as produced by the script ``bob_vein_view_sample.py``.
+   as produced by the script ``bob_bio_vein_view_sample.py``.

 .. figure:: img/binarized.*
...
@@ -50,11 +50,11 @@ setup(
       ],
       'console_scripts': [
-        'bob_vein_compare_rois.py = bob.bio.vein.script.compare_rois:main',
-        'bob_vein_view_sample.py = bob.bio.vein.script.view_sample:main',
-        'bob_vein_blame.py = bob.bio.vein.script.blame:main',
-        'bob_vein_markdet.py = bob.bio.vein.script.markdet:main',
-        'bob_vein_watershed_mask.py = bob.bio.vein.script.watershed_mask:main',
+        'bob_bio_vein_compare_rois.py = bob.bio.vein.script.compare_rois:main',
+        'bob_bio_vein_view_sample.py = bob.bio.vein.script.view_sample:main',
+        'bob_bio_vein_blame.py = bob.bio.vein.script.blame:main',
+        'bob_bio_vein_markdet.py = bob.bio.vein.script.markdet:main',
+        'bob_bio_vein_watershed_mask.py = bob.bio.vein.script.watershed_mask:main',
       ]
     },
...
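Each ``console_scripts`` entry maps the new command name to a Python function: on installation, setuptools generates a small wrapper that imports the target module and calls it. Roughly (an approximation of the generated wrapper, not code contained in this commit), ``bob_bio_vein_compare_rois.py`` behaves like:

.. code-block:: python

   #!/usr/bin/env python
   # approximate behaviour of the generated ``bob_bio_vein_compare_rois.py``
   import sys
   from bob.bio.vein.script.compare_rois import main

   if __name__ == '__main__':
       sys.exit(main())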