diff --git a/doc/baselines.rst b/doc/baselines.rst
index 4f8d9c2da0043f74340d937ba66169613d803863..3913fa50404e0bdc334e0923aa7f48504c9c88e4 100644
--- a/doc/baselines.rst
+++ b/doc/baselines.rst
@@ -55,20 +55,20 @@ Deep learning baselines

 * ``inception-resnetv1-casiawebface``: Inception Resnet v1 model trained using the Casia Web dataset in the context of the work published by [TFP18]_

-* ``arcface-insightface``: Arcface model from `Insightface <https://github.com/deepinsight/insightface>`_
+* ``arcface-insightface``: Arcface model (Resnet100 backbone) from `Insightface <https://github.com/deepinsight/insightface>`_

+* ``resnet50-msceleb-arcface-2021``: Resnet Arcface model trained with the MSCeleb dataset (dataset partially pruned)

-Deep Learning with different interfaces baselines
-=================================================
+* ``resnet50-msceleb-arcface-20210521``: Arcface model trained with the MSCeleb dataset (dataset pruned)

-* ``mxnet-pipe``: Arcface Resnet Model using MxNet Interfaces from `Insightface <https://github.com/deepinsight/insightface>`_
+* ``resnet50-vgg2-arcface-2021``: Arcface model trained with the VGG2 dataset

-* ``mxnet-tinyface``: Applying `tinyface annoator <https://github.com/chinakook/hr101_mxnet>`_ for the Arcface Resnet Model using MxNet Interfaces from `Insightface <https://github.com/deepinsight/insightface>`_
+* ``iresnet34``: Arcface model (Resnet 34 backbone) from `Pytorch InsightFace <https://github.com/nizhib/pytorch-insightface>`_
+
+* ``iresnet50``: Arcface model (Resnet 50 backbone) from `Pytorch InsightFace <https://github.com/nizhib/pytorch-insightface>`_
+
+* ``iresnet100``: Arcface model (Resnet 100 backbone) from `Pytorch InsightFace <https://github.com/nizhib/pytorch-insightface>`_

-* ``pytorch-pipe-v1``: Pytorch network that extracts 1000-dimensional features, trained by Manual Gunther, as described in [LGB18]_
+* ``vgg16-oxford``: VGG16 Face model from `Oxford <https://www.robots.ox.ac.uk/~vgg/publications/2015/Parkhi15/>`_

-* ``pytorch-pipe-v2``: Inception Resnet face recognition model from `facenet_pytorch <https://github.com/timesler/facenet-pytorch>`_
-
-* ``tf-pipe``: Inception Resnet v2 model trained using the MSCeleb dataset in the context of the work published by [TFP18]_
-
-* ``opencv-pipe``: VGG Face descriptor pretrained models, i.e. `Caffe model <https://www.robots.ox.ac.uk/~vgg/software/vgg_face/>`_
+* ``afffe``: Pytorch network that extracts 1000-dimensional features, trained by Manuel Günther, as described in [LGB18]_
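+
+Any of the baselines above can be run by passing its name, together with a database
+configuration, to the vanilla-biometrics pipeline. A minimal sketch, assuming the
+``mobio-male`` database is set up on your system (any other supported database name
+works the same way):
+
+.. code-block:: bash
+
+    bob bio pipeline vanilla-biometrics mobio-male iresnet100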
diff --git a/doc/leaderboard/mobio.rst b/doc/leaderboard/mobio.rst
index 708566505b5b666b590f415f0b2f2545b25048b4..0d92490d2a8e14143cebf9ebf07f852c83597b4e 100644
--- a/doc/leaderboard/mobio.rst
+++ b/doc/leaderboard/mobio.rst
@@ -7,7 +7,52 @@
 Mobio Dataset
 =============

-.. todo::
-   Benchmarks on Mobio Database
+The MOBIO dataset is a video database containing bimodal data (face/speaker).
+It comprises 152 people (male and female), mostly Europeans, recorded over 5 sessions with a few weeks between sessions.
+The database was recorded using two types of mobile devices: mobile phones (NOKIA N93i) and laptop
+computers (standard 2008 MacBook).
+
+For face recognition, still images are used instead of videos.
+One image was extracted from each video by taking the frame at the 10-second mark.
+The eye positions were manually labelled and distributed with the database.
+
+For more information, see:
+
+.. code-block:: latex
+
+    @article{McCool_IET_BMT_2013,
+        title = {Session variability modelling for face authentication},
+        author = {McCool, Chris and Wallace, Roy and McLaren, Mitchell and El Shafey, Laurent and Marcel, S{\'{e}}bastien},
+        month = sep,
+        journal = {IET Biometrics},
+        volume = {2},
+        number = {3},
+        year = {2013},
+        pages = {117-129},
+        issn = {2047-4938},
+        doi = {10.1049/iet-bmt.2012.0059},
+    }
+
+
+Benchmarks
+==========
+
+You can run the MOBIO baselines with a simple command such as:
+
+.. code-block:: bash
+
+    bob bio pipeline vanilla-biometrics mobio-male arcface-insightface
+
+
+Scores from some of our baselines can be found `here <https://www.idiap.ch/software/bob/data/bob/bob.bio.face/master/scores/mobio-male.tar.gz>`_.
+A DET curve can be generated from these scores by running the following commands:
+
+.. code-block:: bash
+
+    wget https://www.idiap.ch/software/bob/data/bob/bob.bio.face/master/scores/mobio-male.tar.gz
+    tar -xzvf mobio-male.tar.gz
+    bob bio det ./mobio-male/{arcface_insightFace_lresnet100,inception_resnet_v2_msceleb_centerloss_2018,iresnet50,iresnet100,mobilenetv2_msceleb_arcface_2021,resnet50_msceleb_arcface_20210521,vgg16_oxford_baseline,afffe_baseline}/scores-{dev,eval} --legends arcface_insightFace_lresnet100,inception_resnet_v2_msceleb_centerloss_2018,iresnet50,iresnet100,mobilenetv2_msceleb_arcface_2021,resnet50_msceleb_arcface_20210521,vgg16_oxford_baseline,afffe -S -e --figsize 16,8
+
+and obtain the following :download:`plot <./plots/det-mobio-male.pdf>`.
+

-   Probably for Manuel's students
\ No newline at end of file
diff --git a/doc/leaderboard/plots/det-mobio-male.pdf b/doc/leaderboard/plots/det-mobio-male.pdf
new file mode 100644
index 0000000000000000000000000000000000000000..f2794c3965c31ae300e02c3381e91d30e2a2f583
Binary files /dev/null and b/doc/leaderboard/plots/det-mobio-male.pdf differ