Commit fbc58899 authored by Tiago de Freitas Pereira

Merge branch 'doc_fix' into 'master'

[doc] Fix values in idiap_resnet_msceleb.rst

See merge request !4
parents 094e4c7e 9f7e002c
Pipeline #35760 passed with stages in 5 minutes and 59 seconds
@@ -7,7 +7,7 @@ Idiap - Resnet V2 - MSCeleba
 ============================
 Inspired by `FaceNet <https://github.com/davidsandberg/facenet>`_, we at Idiap trained our own CNN using the Inception ResNet v2 architecture on the MSCeleba database.
-In this `link <https://gitlab.idiap.ch/bob/bob.bio.htface/blob/277781d9c99738ff141218e1ce04103f9a427b0c/bob/bio/htface/config/tensorflow/MSCELEBA_inception_resnet_v2_center_loss.py>`_ you can find the script that trains this neural network.
+In this `link <https://gitlab.idiap.ch/bob/bob.bio.face_ongoing/blob/master/bob/bio/face_ongoing/configs/cnn/resnet_inception_v2/MSCeleba_centerloss.py>`_ you can find the script that trains this neural network.
 To trigger this training it's necessary to use the `bob.learn.tensorflow <http://gitlab.idiap.ch/bob/bob.learn.tensorflow/>`_ package and run the following command::
@@ -16,16 +16,16 @@ To trigger this training it's necessary to use the `bob.learn.tensorflow <http:/
 Some quick details about this CNN (just as a mental note):
-- The one-hot encoded layer has 99879 neurons.
+- The one-hot encoded layer has 87662 neurons (the number of identities in ``msceleba_182x_hand_prunned_44``).
 - MSCeleba has a lot of mislabeling; a very simple pruning was implemented `in this Python package <http://gitlab.idiap.ch/tiago.pereira/bob.db.msceleb>`_.
 - Faces were detected and cropped to :math:`182 \times 182` using the `MTCNN <https://gitlab.idiap.ch/bob/bob.ip.mtcnn>`_ face and landmark detector.
 - The following data augmentation strategies were implemented:
   * Random crop to :math:`160 \times 160`
   * Random flip
   * Images were normalized to zero mean and unit standard deviation
-- Learning rate of 0.01
-- Adagrad as optimizer
-- Batch size of 16
+- Learning rate of 0.1, 0.01, and 0.001
+- RMSProp as optimizer
+- Batch size of 90
 Two versions of it were trained: one using color images for training and another using grayscale images.
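
As a rough illustration of the augmentation and optimizer settings listed in the new version above, here is a minimal sketch using plain TensorFlow 1.x ops; the ``augment`` helper is hypothetical, and 0.1 is assumed here only as the first of the listed learning rates, not the linked script's actual schedule::

    import tensorflow as tf  # TensorFlow 1.x API, contemporary with this recipe

    def augment(image):
        # Hypothetical helper: random-crop the 182x182 MTCNN face crop
        # down to the 160x160 network input
        image = tf.random_crop(image, size=[160, 160, 3])
        # Random horizontal flip
        image = tf.image.random_flip_left_right(image)
        # Normalize each image to zero mean and unit standard deviation
        image = tf.image.per_image_standardization(image)
        return image

    # RMSProp as optimizer; 0.1 assumed as the starting point of the
    # listed 0.1 / 0.01 / 0.001 learning rates
    optimizer = tf.train.RMSPropOptimizer(learning_rate=0.1)

The actual training loop, the center-loss term, and the learning-rate schedule live in the linked ``MSCeleba_centerloss.py`` configuration.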