Commit 8e1f7faa authored by Anjith GEORGE

small mods

parent 22479d16
1 merge request: !1 Test ci
Pipeline #29976 failed
@@ -12,8 +12,8 @@ FASNet accepts RGB images only, hence the preprocessing is done first. This can b
 .. code-block:: sh

     ./bin/spoof.py \
-    <PATH_TO_CONFIG>/wmca_grandtest_dbconfig.py \
-    <PATH_TO_CONFIG>/wmca_config_pytorch_extractor.py \
+    wmca-color \
+    fasnet \
     --execute-only preprocessing \
     --sub-directory <FASNET_PIPELINE_FOLDER> \
     --grid idiap
@@ -34,7 +34,7 @@ Once the config file is defined, training the network can be done with the follo
 .. code-block:: sh

     ./bin/train_fasnet.py \ # script used for FASNet training
-    <PATH_TO_TRAINER_CONFIG>/wmca_FASNet.py \ # configuration file defining the FASNet network, database, and training parameters
+    <PATH_TO_TRAINER_CONFIG>/wmca_fasnet.py \ # configuration file defining the FASNet network, database, and training parameters
     -vv # set verbosity level

 People at Idiap can benefit from the GPU cluster by running the training as follows:
@@ -46,7 +46,7 @@ People at Idiap can benefit from the GPU cluster by running the training as follows:
     --log-dir <FOLDER_TO_SAVE_THE_RESULTS>/logs/ \ # substitute the path to save the logs to (Idiap only)
     --environment="PYTHONUNBUFFERED=1" -- \ #
     ./bin/train_fasnet.py \ # script used for FASNet training
-    <PATH_TO_TRAINER_CONFIG>/wmca_FASNet.py \ # configuration file defining the FASNet network, database, and training parameters
+    <PATH_TO_TRAINER_CONFIG>/wmca_fasnet.py \ # configuration file defining the FASNet network, database, and training parameters
     --use-gpu \ # enable the GPU mode
     -vv # set verbosity level
@@ -57,7 +57,7 @@ For a more detailed documentation of functionality available in the training scr
     ./bin/train_fasnet.py --help # note: remove ./bin/ if buildout is not used

-Please inspect the corresponding configuration file, ``wmca_FASNet.py`` for example, for more details on how to define the database, network architecture and training parameters.
+Please inspect the corresponding configuration file, ``wmca_fasnet.py`` for example, for more details on how to define the database, network architecture and training parameters.
 The protocols and channels used in the experiments can be easily configured in the configuration file.
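For orientation only, a trainer configuration of this kind typically instantiates the network and the training parameters as plain Python objects. The sketch below assumes a torchvision VGG16 backbone in the spirit of FASNet; the variable names (``network``, ``batch_size``, ``epochs``, ``optimizer``) are illustrative placeholders rather than the exact names expected by ``train_fasnet.py``.

.. code-block:: python

    # Illustrative sketch of a FASNet-style trainer configuration.
    # The real wmca_fasnet.py in bob.paper.mccnn.tifs2018 may use different
    # names and additional objects (e.g. the database interface).
    import torch
    import torchvision.models as models

    # VGG16 pretrained on ImageNet, with the last classifier layer replaced
    # for binary bonafide-vs-attack prediction
    network = models.vgg16(pretrained=True)
    network.classifier[6] = torch.nn.Linear(4096, 1)

    # training parameters (illustrative values)
    batch_size = 32
    epochs = 25
    learning_rate = 1e-4
    optimizer = torch.optim.Adam(network.parameters(), lr=learning_rate)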
@@ -72,10 +72,10 @@ For **grandtest** protocol this can be done as follows.
 .. code-block:: sh

     ./bin/spoof.py \
-    <PATH_TO_DATABASE_CONFIG>/wmca_grandtest_dbconfig.py \
-    <PATH_TO_EXTRACTORS>/wmca_config_pytorch_extractor_v1_FASNet.py \
+    wmca-color \
+    fasnet \
     --protocol grandtest \
-    --sub-directory <FOLDER_TO_SAVE_MCCNN_RESULTS> -vv
+    --sub-directory <FOLDER_TO_SAVE_FASNET_RESULTS> -vv
@@ -87,8 +87,8 @@ To evaluate the models run the following command.
 .. code-block:: sh

     ./bin/scoring.py -df \
-    <PATH_TO_SCORES>/scores-dev -ef \
-    <PATH_TO_SCORES>/scores-eval
+    <FOLDER_TO_SAVE_FASNET_RESULTS>/grandtest/scores/scores-dev -ef \
+    <FOLDER_TO_SAVE_FASNET_RESULTS>/grandtest/scores/scores-eval
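``./bin/scoring.py`` reports the metrics used in the paper; for a quick sanity check of the development scores alone, something along the following lines can be used with ``bob.measure``. The score-file parsing is an assumption (last column holds the score, the second column contains the word ``attack`` for presentation attacks), so verify it against the files that ``spoof.py`` actually writes.

.. code-block:: python

    # Rough EER sanity check on a development score file (format assumed).
    import numpy
    from bob.measure import eer_threshold, farfrr

    negatives, positives = [], []  # attack scores vs. bona fide scores
    # e.g. <FOLDER_TO_SAVE_FASNET_RESULTS>/grandtest/scores/scores-dev
    with open('scores-dev') as score_file:
        for line in score_file:
            fields = line.split()
            score = float(fields[-1])
            (negatives if 'attack' in fields[1] else positives).append(score)

    negatives = numpy.array(negatives)
    positives = numpy.array(positives)
    threshold = eer_threshold(negatives, positives)
    far, frr = farfrr(negatives, positives, threshold)
    print('EER on dev: %.2f%%' % (100.0 * (far + frr) / 2.0))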
@@ -100,8 +100,8 @@ To evaluate the models run the following command.
 .. code-block:: sh

     ./bin/scoring.py -df \
-    <PATH_TO_SCORES>/scores-dev -ef \
-    <PATH_TO_SCORES>/scores-eval
+    <FOLDER_TO_SAVE_MCCNN_RESULTS>/grandtest/scores/scores-dev -ef \
+    <FOLDER_TO_SAVE_MCCNN_RESULTS>/grandtest/scores/scores-eval

 Using pretrained models
 =======================
@@ -109,7 +109,7 @@ setup(
       'lbp-lr-infrared = bob.paper.mccnn.tifs2018.config.lbp_lr_infrared',
       'lbp-lr-thermal= bob.paper.mccnn.tifs2018.config.lbp_lr_thermal',
       'lbp-lr-depth= bob.paper.mccnn.tifs2018.config.lbp_lr_depth',
-      'fastnet = bob.paper.mccnn.tifs2018.config.FASNet_config',
+      'fasnet = bob.paper.mccnn.tifs2018.config.FASNet_config',
       'mccnn = bob.paper.mccnn.tifs2018.config.MCCNN_config',
       'iqm-lr = bob.paper.mccnn.tifs2018.config.iqm_lr',
       'haralick-svm = bob.paper.mccnn.tifs2018.config.haralick_svm',],
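The renamed ``fasnet`` entry point is what lets the short resource name be passed to ``./bin/spoof.py`` instead of a full path to ``FASNet_config.py``. As a minimal sketch of the mechanism, setuptools entry points can be enumerated with ``pkg_resources``; the group name used below is an assumption, the real one is the key under which these strings are registered in ``setup.py``.

.. code-block:: python

    # Minimal sketch of resolving a registered configuration resource.
    # 'bob.pad.config' is an assumed entry-point group; use the group name
    # actually declared in the entry_points section of setup.py.
    import pkg_resources

    for entry_point in pkg_resources.iter_entry_points('bob.pad.config'):
        if entry_point.name == 'fasnet':
            config_module = entry_point.load()  # imports FASNet_config
            print(config_module.__name__)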