Commit c8982be5 authored by Zohreh MOSTAANI

[general][doc] fixes the images in hands on tutorial

parent 736d7072
Pipeline #24649 passed with stages in 5 minutes and 57 seconds
doc/beat/img/iris_toolchain.png: 85.1 KB → 82.3 KB (binary image replaced)
@@ -16,7 +16,7 @@ Requirements
#. A ``BEAT`` installation: Please see :ref:`beat-installation`.
#. BEAT familiarity: This guide assumes you are somewhat familiar with what BEAT is and how it works. Please see :ref:`beat-system` and refer to it if you come across an unfamiliar BEAT term or concept.
#. A BEAT ``prefix``: All the building blocks of BEAT are stored in a folder typically named ``prefix``. For the purposes of this tutorial we provide this folder; please download it from this `git repository <https://gitlab.idiap.ch/beat/beat.tutorial.prefix>`_. Run any BEAT command that touches the prefix from the repository's top-level folder, next to the ``prefix`` folder.
-#. To simplify code examples, we will be using `Bob`_, a free signal-processing and machine learning toolbox originally developed by the Biometrics group at `Idiap`_ Research Institute, Switzerland. Please add the conda channel for ``bob`` to the conda environment with BEAT packages are installed and then install the packages needed for this tutorial.
+#. To simplify code examples, we will be using `Bob`_, a free signal-processing and machine-learning toolbox originally developed by the Biometrics group at the `Idiap`_ Research Institute, Switzerland. Please add the Conda channel for ``bob`` to the Conda environment in which the BEAT packages are installed, and then install the packages needed for this tutorial.
.. code:: sh
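
   # A hedged sketch of this step: Bob's documented conda channel, plus the
   # packages this tutorial is assumed to need (bob.learn.linear for LDA,
   # bob.measure for figures of merit) -- the actual list may differ.
   conda config --add channels https://www.idiap.ch/software/bob/conda
   conda install bob.learn.linear bob.measure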
@@ -38,7 +38,7 @@ There are a few things you have to do to create BEAT objects locally:
Each of these steps is handled by a different tool:
#. ``beat.editor`` edits metadata through a web application.
-#. You use your own editor of your choice to edit the necessary codes.
+#. You use the editor of your choice to edit the necessary code.
#. ``beat.cmdline`` does "the rest", letting you run & visualize experiments, manage the cache, debug, and much more; see the short usage sketch after this list. For more information see :ref:`beat_cmdline_introduction`.
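For a taste of that third tool, here is a hedged sketch of two ``beat.cmdline`` invocations you might run from the folder containing the prefix; the experiment name is a placeholder, and the exact subcommands are best confirmed with ``beat --help``:

.. code:: sh

   # run an experiment defined by the metadata in the prefix
   beat experiments run <experiment-name>

   # visualize the analyzer's results for that experiment
   beat experiments plot <experiment-name>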
You might have a window setup like the following:
@@ -85,25 +85,22 @@ The Problem: What are we doing?
Our task will be to discriminate between the 3 types of flowers in Fisher's Iris dataset using LDA. To keep it simple, we will just be discriminating setosa & versicolor flower samples versus virginica flower samples, giving us a 2-class problem.
-Setosa flowers:
-.. image:: ./img/iris_setosa.jpg
-Versicolor flowers:
-.. image:: ./img/iris_versicolor.jpg
-Virginica flowers:
-.. image:: ./img/iris_virginica.jpg
+.. image:: ./img/iris_flowers.jpg
Each sample in Fisher's Iris dataset is 4 measurements from a flower:
.. image:: ./img/iris_versicolor_measurements.png
:width: 400px
:align: center
:height: 300px
The dataset therefore looks like the following:
.. image:: ./img/iris_db_overview.png
:width: 600px
:align: center
:height: 393px
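In code terms, the layout is easy to picture. Below is a toy ``numpy`` sketch of the dataset's shape and of our 2-class relabeling; it uses placeholder values, since in BEAT the real measurements are delivered by the database object:

.. code:: python

   import numpy

   # 150 samples, 50 per species, with 4 measurements each:
   # sepal length, sepal width, petal length, petal width (in cm)
   setosa     = numpy.zeros((50, 4))   # placeholder values only
   versicolor = numpy.zeros((50, 4))
   virginica  = numpy.zeros((50, 4))

   data = numpy.vstack([setosa, versicolor, virginica])
   species = numpy.repeat([0, 1, 2], 50)   # 0=setosa, 1=versicolor, 2=virginica

   # our 2-class problem: setosa & versicolor (label 0) versus virginica (label 1)
   labels = (species == 2).astype(int)
   print(data.shape, labels.sum())         # (150, 4) 50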
The Goal: What do we want?
--------------------------
@@ -122,37 +119,64 @@ Let's design the experiment abstractly, before showing how it is designed out of
The first thing we need is training data to train our LDA machine and testing data to evaluate the trained model:
-.. image:: ./img/iris_diagram_1.png
+.. image:: ./img/iris_diagram_1.jpg
:width: 500px
:align: center
:height: 174px
Then we need to train the LDA:
-.. image:: ./img/iris_diagram_2.png
+.. image:: ./img/iris_diagram_2.jpg
:width: 500px
:align: center
:height: 174px
Then we test the LDA:
-.. image:: ./img/iris_diagram_3.png
+.. image:: ./img/iris_diagram_3.jpg
:width: 500px
:align: center
:height: 174px
Then we analyze our scores and generate our figures of merit:
.. image:: ./img/iris_diagram_4.png
:width: 500px
:align: center
:height: 261px
And there is our conceptual experiment diagram! You've probably seen and thought of designs similar to this. Let's see how to split it into BEAT objects.
The data for BEAT experiments come from `databases`, so we need one of those:
.. image:: ./img/iris_diagram_database.png
:width: 500px
:align: center
:height: 261px
We need an `algorithm` for training the LDA machine:
.. image:: ./img/iris_diagram_training.png
:width: 500px
:align: center
:height: 261px
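To make this concrete, here is a rough sketch of what such a training algorithm could look like. It assumes BEAT's sequential algorithm interface (``process(self, inputs, outputs)``) and uses illustrative input/output names (``measurements``, ``species``, ``lda_machine``); the tutorial's actual code and data formats may differ:

.. code:: python

   import numpy
   import bob.learn.linear


   class Algorithm:
       """Sketch: accumulate training samples, then train an LDA machine."""

       def __init__(self):
           # one list of samples per class (0: setosa & versicolor, 1: virginica)
           self.data = {0: [], 1: []}

       def process(self, inputs, outputs):
           # each call delivers one synchronized sample (assumed names)
           sample = numpy.array(inputs['measurements'].data.value, 'float64')
           label = int(inputs['species'].data.value)
           self.data[label].append(sample)

           # once every sample has been seen, train and write the model once
           if not inputs['measurements'].hasMoreData():
               trainer = bob.learn.linear.FisherLDATrainer()
               machine, _ = trainer.train([numpy.vstack(self.data[0]),
                                           numpy.vstack(self.data[1])])
               # assumed: a simple float-array data format for the weights
               outputs['lda_machine'].write({'value': machine.weights[:, 0]})
           return True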
We need another algorithm for testing the model:
.. image:: ./img/iris_diagram_testing.png
:width: 500px
:align: center
:height: 261px
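Hedging in the same way, the testing algorithm only needs to project each sample onto the learned LDA direction to produce a score (same illustrative names as in the training sketch):

.. code:: python

   import numpy


   class Algorithm:
       """Sketch: score each test sample with the trained LDA weights."""

       def __init__(self):
           self.weights = None

       def process(self, inputs, outputs):
           # the trained weights arrive once, on their own input
           if self.weights is None:
               self.weights = numpy.array(inputs['lda_machine'].data.value)
           sample = numpy.array(inputs['measurements'].data.value)
           # an LDA score is just a projection onto the learned direction
           outputs['score'].write({'value': float(numpy.dot(self.weights, sample))})
           return True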
And we'll need an `analyzer` algorithm for generating our figures of merit:
.. image:: ./img/iris_diagram_analyzer.png
:width: 500px
:align: center
:height: 261px
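As for the figures of merit themselves, one reasonable choice for a 2-class problem is the equal error rate (EER). Here is a hedged, self-contained sketch of that computation using ``bob.measure``, with random stand-in scores instead of the BEAT plumbing:

.. code:: python

   import numpy
   import bob.measure

   # stand-in score distributions: negatives = setosa & versicolor samples,
   # positives = virginica samples (random values for illustration only)
   negatives = numpy.random.normal(-1.0, 1.0, 100)
   positives = numpy.random.normal(+1.0, 1.0, 50)

   # threshold where the false-accept and false-reject rates cross
   threshold = bob.measure.eer_threshold(negatives, positives)
   far, frr = bob.measure.farfrr(negatives, positives, threshold)
   print('EER ~ %.2f%%' % (50.0 * (far + frr)))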
In addition to that, we will need a `toolchain` and an `experiment`. This gives us 6 different BEAT objects:
@@ -360,10 +384,16 @@ Our preprocessing algorithm will be very simple (and not particularly useful) -
The dataset, currently:
.. image:: ./img/iris_db_before.png
:width: 600px
:align: center
:height: 393px
The dataset, after being preprocessed:
.. image:: ./img/iris_db_after.png
:width: 600px
:align: center
:height: 393px
Everything else will be the same as the Iris LDA experiment.
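For reference, a bare-bones shape such a preprocessing algorithm could take is sketched below; the actual transformation applied in the tutorial is not shown in this diff, so the body is a pass-through, and the input/output names are illustrative:

.. code:: python

   import numpy


   class Algorithm:
       """Skeleton of a per-sample preprocessing algorithm."""

       def process(self, inputs, outputs):
           measurements = numpy.array(inputs['measurements'].data.value)
           # ... the tutorial's simple (and not particularly useful)
           # transformation would go here ...
           outputs['measurements'].write({'value': measurements})
           return True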