Updated the docs w.r.t. the new bob API.

parent 0693a692
@@ -147,7 +147,7 @@ performance:
.. code-block:: sh
-$ bob_eval_threshold.py <path-to>/verafinger/rlt/Nom/nonorm/scores-dev
+$ bob bio metrics <path-to>/verafinger/rlt/Nom/nonorm/scores-dev --no-evaluation
('Threshold:', 0.31835292)
FAR : 23.636% (11388/48180)
FRR : 23.636% (52/220)
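
The same figures can also be computed programmatically. Below is a minimal
sketch using ``bob.measure`` (it assumes the score files follow bob's
four-column text format and that the loader and metric functions exist under
these names in your ``bob.measure`` version):

.. code-block:: python

   # Sketch: compute the EER threshold, FAR and FRR reported above
   # directly from a four-column score file, instead of going through
   # the ``bob bio metrics`` command line.
   from bob.measure.load import split_four_column
   from bob.measure import eer_threshold, farfrr

   # <path-to> is a placeholder, as in the examples above
   negatives, positives = split_four_column(
       '<path-to>/verafinger/rlt/Nom/nonorm/scores-dev')

   threshold = eer_threshold(negatives, positives)  # EER criterion
   far, frr = farfrr(negatives, positives, threshold)
   print('Threshold:', threshold)
   print('FAR: %.3f%% (%d impostor scores)' % (100 * far, len(negatives)))
   print('FRR: %.3f%% (%d genuine scores)' % (100 * frr, len(positives)))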
@@ -180,7 +180,7 @@ we obtained:
.. code-block:: sh
-$ bob_eval_threshold.py <path-to>/verafinger/mc/Nom/nonorm/scores-dev
+$ bob bio metrics <path-to>/verafinger/mc/Nom/nonorm/scores-dev --no-evaluation
('Threshold:', 0.0737283)
FAR : 4.388% (2114/48180)
FRR : 4.545% (10/220)
@@ -213,7 +213,7 @@ we obtained:
.. code-block:: sh
-$ bob_eval_threshold.py <path-to>/verafinger/wld/NOM/nonorm/scores-dev
+$ bob bio metrics <path-to>/verafinger/wld/NOM/nonorm/scores-dev --no-evaluation
('Threshold:', 0.240269475)
FAR : 9.770% (4707/48180)
FRR : 9.545% (21/220)
@@ -347,11 +347,11 @@ When used to run an experiment,
:py:class:`bob.bio.vein.preprocessor.WatershedMask` requires that you provide
a *pre-trained* neural network model that pre-sets the markers before
watershedding takes place. To create one, you can run the program
-`markdet.py`:
+`bob_vein_markdet.py`:
.. code-block:: sh
-$ markdet.py --hidden=20 --samples=500 fv3d central dev
+$ bob_vein_markdet.py --hidden=20 --samples=500 fv3d central dev
You pass, as arguments to this application, the database, protocol, and subset
name you wish to use for training the network. The data is loaded observing a
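
For intuition about what happens once the markers are predicted, here is a
small, self-contained sketch of the watershedding step using scikit-image on
a synthetic image with hand-set markers. This is an illustration under our
own assumptions, not the preprocessor's actual code, which derives its
markers from the model trained above:

.. code-block:: python

   # Illustrative only: watershed segmentation seeded by markers, the
   # step that the pre-trained marker model feeds into.  Markers are
   # set by hand here on a synthetic "finger" image.
   import numpy
   from skimage.filters import sobel
   from skimage.segmentation import watershed  # skimage >= 0.19

   # bright horizontal band (the "finger") on a dark background
   image = numpy.zeros((60, 100))
   image[20:40, :] = 1.0
   image += 0.05 * numpy.random.RandomState(0).randn(60, 100)

   # label 1 = finger seed, label 2 = background seed
   markers = numpy.zeros(image.shape, dtype=numpy.int32)
   markers[30, 50] = 1
   markers[5, 50] = 2

   # watershed on the gradient magnitude separates the two regions
   labels = watershed(sobel(image), markers)
   mask = (labels == 1)  # boolean RoI mask, True inside the finger
   print(mask.sum(), 'pixels inside the region of interest')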
@@ -367,7 +367,7 @@ Region of Interest Goodness of Fit
==================================
Automatic region of interest (RoI) finding and cropping can be evaluated using
-a couple of scripts available in this package. The program ``compare_rois.py``
+a couple of scripts available in this package. The program ``bob_vein_compare_rois.py``
compares two sets of ``preprocessed`` images and masks, generated by
*different* preprocessors (see
:py:class:`bob.bio.base.preprocessor.Preprocessor`) and calculates a few
@@ -379,7 +379,7 @@ extracted ones. E.g.:
.. code-block:: sh
-$ compare_rois.py ~/verafinger/mc_annot/preprocessed ~/verafinger/mc/preprocessed
+$ bob_vein_compare_rois.py ~/verafinger/mc_annot/preprocessed ~/verafinger/mc/preprocessed
Jaccard index: 9.60e-01 +- 5.98e-02
Intersection ratio (m1): 9.79e-01 +- 5.81e-02
Intersection ratio of complement (m2): 1.96e-02 +- 1.53e-02
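
The Jaccard index is the standard intersection-over-union of the two masks;
the sketch below shows how such overlap figures can be computed with plain
numpy. The exact definitions of m1 and m2 used by the script are assumptions
on our part:

.. code-block:: python

   # Sketch of overlap metrics between two boolean RoI masks.  The
   # Jaccard index is standard; m1/m2 below are assumed definitions
   # (ratios relative to the first mask), not necessarily the ones
   # implemented by bob_vein_compare_rois.py.
   import numpy

   def overlap_metrics(a, b):
       """a, b: boolean numpy arrays of the same shape (RoI masks)"""
       a, b = a.astype(bool), b.astype(bool)
       inter = numpy.logical_and(a, b).sum()
       union = numpy.logical_or(a, b).sum()
       jaccard = inter / union
       m1 = inter / a.sum()                           # assumed |A.B|/|A|
       m2 = numpy.logical_and(a, ~b).sum() / a.sum()  # assumed |A.~B|/|A|
       return jaccard, m1, m2

   a = numpy.zeros((60, 100), dtype=bool); a[20:40, 10:90] = True
   b = numpy.zeros((60, 100), dtype=bool); b[21:40, 12:90] = True
   print('Jaccard index: %.2e' % overlap_metrics(a, b)[0])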
@@ -399,12 +399,12 @@ metrics.
Pipeline Display
================
-You can use the program ``view_sample.py`` to display the images after
+You can use the program ``bob_vein_view_sample.py`` to display the images after
full processing using:
.. code-block:: sh
-$ ./bin/view_sample.py --save=output-dir verafinger /path/to/processed/directory 030-M/030_L_1
+$ bob_vein_view_sample.py --save=output-dir verafinger /path/to/processed/directory 030-M/030_L_1
$ # open output-dir
And you should be able to view images like these (example taken from the Vera
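
If you only need a quick look at one mask overlaid on its image, a rough
stand-in can be put together with matplotlib alone. This is not the script's
actual code, just a sketch of the kind of overlay it saves:

.. code-block:: python

   # Rough stand-in for the overlay images the script saves: draw a
   # boolean RoI mask outline on top of a (here synthetic) gray image.
   import numpy
   import matplotlib
   matplotlib.use('Agg')  # headless, akin to the --save mode
   import matplotlib.pyplot as plt

   image = numpy.random.RandomState(0).rand(60, 100)  # stand-in image
   mask = numpy.zeros(image.shape, dtype=bool)
   mask[20:40, 10:90] = True                          # stand-in RoI

   plt.imshow(image, cmap='gray')
   plt.contour(mask.astype(float), levels=[0.5], colors='r')  # red outline
   plt.axis('off')
   plt.savefig('overlay.png', bbox_inches='tight')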
@@ -415,7 +415,7 @@ feature extractor):
:scale: 50%
Example RoI overlaid on a finger vein image of the Vera fingervein database,
-as produced by the script ``view_sample.py``.
+as produced by the script ``bob_vein_view_sample.py``.
.. figure:: img/binarized.*