Commit 4cb988a5 authored by André Anjos

Merge branch 'docedit' into 'master'

merge new documentation to master

See merge request !48
parents 7e987c53 835cad09
Pipeline #25469 passed with stages
in 41 minutes and 33 seconds
......@@ -34,11 +34,70 @@ with a hybrid set of algorithms that execute on different backends. Each
backend can be implemented in a different programming language and contain any
number of (pre-installed) libraries users can call on their algorithms.
The requirements for BEAT when reading/writing data are:
* Ability to manage large and complex data
* Portability to allow the use of heterogeneous environments
Based on our experience and on these requirements, we investigated
the use of HDF5. Unfortunately, HDF5 is not convenient for handling
structures such as arrays of variable-size elements, for instance,
arrays of strings.
Therefore, we decided to rely on our own binary format.
This document describes the binary formats in BEAT and the API required by BEAT to handle multiple backend implementations. The
package `beat.env.python27`_ provides the *reference* Python backend
implementation based on `Python 2.7`_.
Binary Format
-------------
Our binary format does *not* contain information about the format of the data
itself, and it is hence necessary to know this format a priori. This means that
the format cannot be inferred from the content of a file.
We rely on the following fundamental C-style formats:
* int8
* int16
* int32
* int64
* uint8
* uint16
* uint32
* uint64
* float32
* float64
* complex64 (real part first, then imaginary part)
* complex128 (real part first, then imaginary part)
* bool (written as a byte)
* string
An element of such a basic format is written in the C-style way, using
little-endian byte ordering.
Besides, data formats always consist of arrays or dictionaries of such
fundamental or compound formats.
An array of elements is saved as follows. First, the shape of the array is
saved using an *uint64* value for each dimension. Next, the elements of the
array are saved in C-style order.
A dictionary of elements is saved as follows. First, the keys are ordered
according to the lexicographic ordering. Then, the values associated with each
of these keys are saved following this ordering.
The platform is data-driven and always processes chunks of data. Therefore,
data are always written in chunks, each chunk being preceded by a
text-formatted header indicating the start and end indices, followed by the
size (in bytes) of the chunk.
Considering the Python backend of the platform, this binary format has been
successfully implemented using the ``struct`` module.
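The serialization rules above can be sketched in plain Python with the ``struct`` module. The following is an illustrative encoder restricted to ``int32`` arrays, not the actual ``beat.core`` implementation:

```python
import struct

import numpy


def write_array(buf, arr):
    """Serialize a numpy int32 array: one little-endian uint64 per
    dimension of the shape, then the elements in C-style order."""
    arr = numpy.ascontiguousarray(arr)
    for dim in arr.shape:
        buf += struct.pack("<Q", dim)  # shape, as little-endian uint64
    buf += arr.astype("<i4").tobytes()  # elements, little-endian, C order
    return buf


def write_dict(buf, d):
    """Serialize a dictionary: values written in lexicographic key order."""
    for key in sorted(d):
        buf = write_array(buf, d[key])
    return buf


data = {"b": numpy.arange(4, dtype="int32").reshape(2, 2),
        "a": numpy.array([7], dtype="int32")}
blob = write_dict(b"", data)
# "a" comes first: 1 shape value (8 bytes) + 1 int32 (4 bytes) = 12 bytes;
# "b": 2 shape values (16 bytes) + 4 int32 (16 bytes) = 32 bytes
assert len(blob) == 12 + 32
```

Note that this sketch omits the text-formatted chunk header described above and hard-codes the element type; the real implementation derives the type from the data format declaration.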
Filesystem Organization
-----------------------
......
.. vim: set fileencoding=utf-8 :
.. Copyright (c) 2016 Idiap Research Institute, http://www.idiap.ch/ ..
.. Contact: beat.support@idiap.ch ..
.. ..
.. This file is part of the beat.core module of the BEAT platform. ..
.. ..
.. Commercial License Usage ..
.. Licensees holding valid commercial BEAT licenses may use this file in ..
.. accordance with the terms contained in a written agreement between you ..
.. and Idiap. For further information contact tto@idiap.ch ..
.. ..
.. Alternatively, this file may be used under the terms of the GNU Affero ..
.. Public License version 3 as published by the Free Software and appearing ..
.. in the file LICENSE.AGPL included in the packaging of this file. ..
.. The BEAT platform is distributed in the hope that it will be useful, but ..
.. WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY ..
.. or FITNESS FOR A PARTICULAR PURPOSE. ..
.. ..
.. You should have received a copy of the GNU Affero Public License along ..
.. with the BEAT platform. If not, see http://www.gnu.org/licenses/. ..
==========
Databases
==========
A database is a collection of data files, one for each output of the database.
These data are inputs to the BEAT toolchains. It is therefore important to
define evaluation protocols, which describe how a specific system must use the
raw data of a given database.
For instance, a recognition system will typically use a subset of the data to
train a recognition `model`, while another subset of data will be used to
evaluate the performance of this model.
Structure of a database
-----------------------
A database has the following structure on disk::
database_name/
output1_name.data
output2_name.data
...
outputN_name.data
For a given database, the BEAT platform will typically store information
about the root folder containing this raw data, as well as a description of
it.
Evaluation protocols
--------------------
A BEAT evaluation protocol consists of several ``datasets``, each dataset
having several ``outputs`` with well-defined data formats. In practice,
each dataset will typically be used for a different purpose.
For instance, in the case of a simple face recognition protocol, the
database may be split into three datasets: one for training, one for enrolling
client-specific models, and one for testing these models.
The training dataset may have two outputs: grayscale images as two-dimensional
arrays of type `uint8` and client ids as `uint64` integers.
The BEAT platform is data-driven, which means that all the outputs of a given
dataset are synchronized. The way the data is generated by each dataset
is defined in a piece of code called the ``database view``. It is important
that a database view has a deterministic behavior, for reproducibility
purposes.
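Determinism can be obtained, for instance, by fixing the iteration order up front instead of relying on filesystem enumeration order. The following is a purely illustrative sketch; the class and method names are hypothetical and do not represent the real BEAT view API:

```python
import hashlib


class IllustrativeView:
    """Hypothetical stand-in for a database view (not the real BEAT API).

    Sorting the file list once makes the index order, and hence every
    output, reproducible across runs and machines.
    """

    def __init__(self, filenames):
        # sort to make the index order deterministic
        self.index = sorted(filenames)

    def get(self, position):
        # always return the same record for a given position
        path = self.index[position]
        # derive a stable fake client id from the path (illustration only)
        client_id = int(hashlib.md5(path.encode()).hexdigest(), 16) % 1000
        return {"client_id": client_id, "path": path}


view = IllustrativeView(["img2.png", "img1.png"])
assert view.get(0)["path"] == "img1.png"  # stable regardless of input order
```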
Database set templates
----------------------
In practice, different databases used for the same purpose may have the exact
same datasets with the exact same outputs (and attached data formats). In this
case, it is interesting to abstract the definition of the database sets from
a given database. BEAT defines ``database set templates`` for this purpose.
For instance, the simple face recognition evaluation protocol described above,
which consists of three datasets and a few outputs, may be abstracted in a
database set template. This template defines the datasets, their outputs,
as well as their corresponding data formats. Next, if several databases
implement such a protocol, they may rely on the same `database set template`.
Similarly, evaluation protocols testing different conditions (such as
enrolling on clean and testing on clean data vs. enrolling on clean and
testing on noisy data) may rely on the same database set template.
In practice, this reduces the amount of work needed to integrate new databases
and/or new evaluation protocols into the platform. Besides, at the experiment
level, it allows re-using a toolchain on a different database with almost no
configuration changes for the user.
.. _beat-core-dataformats:
=============
Data formats
=============
Data formats formalize the interaction between algorithms and data sets, so
they can communicate data in an orderly manner. All data formats produced or
consumed by these objects must be formally declared. Two algorithms which must
directly communicate data must produce and consume the same type of data
objects.
A data format specifies a list of typed fields. An algorithm or data set
generating a block of data (via one of its outputs) **must** fill all the
fields declared in that data format. An algorithm consuming a block of data
(via one of its inputs) **must not** expect the presence of any other field
than the ones defined by the data format.
This section contains information on the definition of data formats and their
programmatic use in the Python language bindings.
Definition
----------
A data format is declared as a `JSON`_ object with several fields. For example,
the following declaration could represent the coordinates of a rectangular
region in an image:
.. code-block:: json
{
"x": "int32",
"y": "int32",
"width": "int32",
"height": "int32"
}
.. note::
We have chosen to define objects inside the BEAT platform using JSON
declarations, as JSON files can be easily validated, transferred through
web-based APIs, and provide an easy-to-read format for local inspection.
Each field must be named according to typical programming rules for variable
names. For example, these are valid names:
* ``my_field``
* ``_my_field``
* ``number1``
These are invalid field names:
* ``1number``
* ``my field``
The following regular expression is used to validate field names:
``^[a-zA-Z_][a-zA-Z0-9_-]*$``. In short, a field name has to start with a
letter or an underscore character and can contain, after that, any number of
alphanumeric characters, underscores, or hyphens.
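This rule can be checked directly with Python's ``re`` module:

```python
import re

# the validation pattern quoted above
FIELD_NAME = re.compile(r"^[a-zA-Z_][a-zA-Z0-9_-]*$")

valid = ["my_field", "_my_field", "number1"]
invalid = ["1number", "my field"]

assert all(FIELD_NAME.match(name) for name in valid)
assert not any(FIELD_NAME.match(name) for name in invalid)
```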
By convention, fields prefixed and suffixed with a double underscore (``__``)
are reserved and should be avoided.
The special field ``#description`` can be used to store a short description of
the declared data format:
.. code-block:: json
{
"#description": "A rectangle in a pixelated image",
"x": "int32",
"y": "int32",
"width": "int32",
"height": "int32"
}
The ``#description`` field is ignored in practice and only used for
informational purposes.
Each field in a declaration has a well-defined type, which can be one of:
* a primitive, simple type (see :ref:`beat-core-dataformats-simple`)
* a directly nested object (see :ref:`beat-core-dataformats-complex`)
* another data format (see :ref:`beat-core-dataformats-aggregation`)
* an array (see :ref:`beat-core-dataformats-array`)
A data format can also extend another one, as explained further down (see
:ref:`beat-core-dataformats-extension`).
.. _beat-core-dataformats-simple:
Simple types
------------
The following primitive data types are available in the BEAT platform:
* Integers: ``int8``, ``int16``, ``int32``, ``int64``
* Unsigned integers: ``uint8``, ``uint16``, ``uint32``, ``uint64``
* Floating-point numbers: ``float32``, ``float64``
* Complex numbers: ``complex64``, ``complex128``
* ``bool``
* ``string``
.. note::
All primitive types are implemented using their :py:mod:`numpy`
counterparts.
When determining if a block of data corresponds to a data format, the platform
will check that the value of each field can safely (without loss of precision)
be converted to the type declared by the data format. An error is generated if
you fail to follow these requirements.
For example, an ``int8`` *can* be converted, without a precision loss, to an
``int16``, but a ``float32`` **cannot** be losslessly converted to an
``int32``. In case of doubt, you can manually test the `NumPy safe-casting
rules`_ yourself in order to understand the imposed restrictions. If you wish
to allow for a precision loss in your code, you must do it explicitly (`Zen of
Python`_).
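The two cases above can be verified with :py:func:`numpy.can_cast`, which implements the safe-casting rules the platform relies on:

```python
import numpy

# int8 fits losslessly into int16, so the safe cast is allowed
assert numpy.can_cast(numpy.int8, numpy.int16)

# float32 -> int32 would drop the fractional part: not a safe cast
assert not numpy.can_cast(numpy.float32, numpy.int32)

# to lose precision on purpose, be explicit about it
value = numpy.float32(3.7)
truncated = value.astype(numpy.int32)  # explicit, lossy conversion
```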
.. _beat-core-dataformats-complex:
Complex types
-------------
A data format can be composed of complex objects formed by nesting other types.
The coordinates of a rectangular region in an image can be represented like
this:
.. code-block:: json
{
"coords": {
"x": "int32",
"y": "int32"
},
"size": {
"width": "int32",
"height": "int32"
}
}
.. _beat-core-dataformats-aggregation:
Aggregation
-----------
.. note::
Data formats are named using 3 values joined by a ``/`` (slash) separator:
the username who is the author of the dataformat, an identifier and the
object version (integer starting from 1). Here are examples of data format
names:
* ``user/my_format/1``
* ``johndoe/integers/37``
* ``mary_mary/rectangle/2``
A field can use the declaration of another data format instead of specifying
its own declaration. Consider the following data formats, on their first
version, for user ``user``:
.. code-block:: json
:caption: Two dimensional coordinates (``user/coordinates/1``)
{
"x": "int32",
"y": "int32"
}
.. code-block:: json
:caption: Two dimensional size (``user/size/1``):
{
"width": "int32",
"height": "int32"
}
Now let's aggregate both previous formats in order to declare a new data format
for describing a rectangle:
.. code-block:: json
:caption: The definition of a rectangle
{
"coords": "user/coordinates/1",
"size": "user/size/1"
}
.. _beat-core-dataformats-array:
Arrays
------
A field can be a multi-dimensional array of any other type. For instance,
consider the following example:
.. code-block:: json
{
"field1": [10, "int32"],
"field2": [10, 5, "bool"]
}
Here we declare that ``field1`` is a one-dimensional array of 10 32-bit signed
integers (``int32``), and ``field2`` is a two-dimensional array with 10 rows
and 5 columns of booleans.
.. note::
In the Python language representation of data formats, multi-dimensional
arrays are implemented using :py:class:`numpy.ndarray`'s.
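Following the note above, a block of data matching this declaration would carry :py:class:`numpy.ndarray` values with the corresponding shapes and element types:

```python
import numpy

# a block of data matching the declaration above:
# "field1": [10, "int32"], "field2": [10, 5, "bool"]
block = {
    "field1": numpy.zeros((10,), dtype=numpy.int32),
    "field2": numpy.zeros((10, 5), dtype=bool),
}

assert block["field1"].shape == (10,)
assert block["field2"].shape == (10, 5)
```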
An array can have as many dimensions as you want. It can also contain objects
(either declared inline, or using another data format):
.. code-block:: json
{
"inline": [10, {
"x": "int32",
"y": "int32"
}],
"imported": [10, "beat/coordinates/1"]
}
It is also possible to declare an array without specifying the number of
elements in some of its dimensions, by using a size of 0 (zero):
.. code-block:: json
{
"field1": [0, "int32"],
"field2": [0, 0, "bool"],
"field3": [10, 0, "float32"]
}
Here, ``field1`` is a one-dimensional array of 32-bit signed integers
(``int32``), ``field2`` is a two-dimensional array of booleans, and ``field3``
is a two-dimensional array of floating-point numbers (``float32``) whose
first dimension is fixed to 10 (the number of rows).
Note that the following declaration isn't valid (you can't fix a dimension if
the preceding one isn't fixed too):
.. code-block:: json
{
"error": [0, 10, "int32"]
}
.. note::
When determining whether a block of data corresponds to a data format
containing an array, the platform automatically checks that:
* the number of dimensions is correct
* the size of each declared dimension that isn't 0 is correct
* the type of each value in the array is correct
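These three checks can be sketched as a small helper function (an illustrative function, not the platform's actual validator), with zero-sized dimensions acting as wildcards:

```python
import numpy


def matches(array, declared_shape, declared_dtype):
    """Sketch of the checks listed above; a declared size of 0 accepts
    any length for that dimension."""
    if array.ndim != len(declared_shape):  # number of dimensions
        return False
    for actual, declared in zip(array.shape, declared_shape):
        if declared != 0 and actual != declared:  # fixed sizes must match
            return False
    return array.dtype == numpy.dtype(declared_dtype)  # value type


data = numpy.ones((10, 3), dtype="float32")
assert matches(data, (10, 0), "float32")     # second dimension is free
assert not matches(data, (5, 0), "float32")  # first dimension must be 10
```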
.. _beat-core-dataformats-extension:
Extensions
----------
Besides aggregation, it is possible to extend data formats through inheritance.
In practice, inheriting from a data format is the same as pasting its
declaration right at the top of the new format.
For example, one might implement a face detector algorithm and may want to
create a data format containing all the information about a face (say, its
position, its size, and the position of each eye). This could be done by
extending the type ``user/rectangular_area/1`` defined earlier:
.. code-block:: json
{
"#extends": "user/rectangular_area/1",
"left_eye": "coordinates",
"right_eye": "coordinates"
}
.. _beat-core-dataformats-usage:
Python API
----------
Data formats are useful descriptions of data blocks that are consumed by
algorithmic code inside the platform. In BEAT, the user never instantiates data
formats directly. Instead, when a new object representing a data format needs
to be created, the user may just create a dictionary in which the keys are the
format field names, whereas the values are instances of the type defined for
such a field. If the type is a reference to another format, the user may nest
dictionaries so as to build objects of any complexity. When the dictionary
representing a data format is written to an algorithm output, the data is
properly validated.
This concept will become clearer when you read about algorithms and the way
they receive and produce data. Here is a simple illustrative example:
.. testsetup:: test-output-write
import numpy
from beat.core.dataformat import DataFormat
from beat.core.test.mocks import MockDataSink
from beat.core.outputs import Output
dataformat = DataFormat('/not/needed', {
"x": "int32",
"y": "int32",
"width": "int32",
"height": "int32"
})
assert dataformat.valid
data_sink = MockDataSink(dataformat)
output = Output('test', data_sink)
.. testcode:: test-output-write
# suppose, for this example, `output' is provided to your algorithm
output.write({
"x": numpy.int32(10),
"y": numpy.int32(20),
"width": numpy.int32(100),
"height": numpy.int32(100),
})
.. include:: links.rst
......@@ -20,6 +20,7 @@
.. You should have received a copy of the GNU Affero Public License along ..
.. with the BEAT platform. If not, see http://www.gnu.org/licenses/. ..
.. _beat-core-local-development:
===================
Local Development
......@@ -30,11 +31,11 @@ Go through the following sequence of commands:
.. code-block:: sh
$ git clone https://gitlab.idiap.ch/bob/bob.admin
$ #install miniconda (version 4.4 or above required)
$ conda activate base
$ cd beat.backend.python #cd into this package's sources
$ ../bob.admin/conda/conda-bootstrap.py --overwrite --python=3.6 beat-core-dev
$ conda activate beat-core-dev
$ #n.b.: docker must be installed on your system (see next section)
$ buildout -c develop.cfg
......@@ -67,17 +68,29 @@ execute algorithms or experiments.
We use specific docker images to run user algorithms. Download the following
base images before you try to run tests or experiments on your computer::
$ docker pull docker.idiap.ch/beat/beat.env.system.python:1.3.0
$ docker pull docker.idiap.ch/beat/beat.env.db.examples:1.4.0
$ docker pull docker.idiap.ch/beat/beat.env.client:2.0.0
$ docker pull docker.idiap.ch/beat/beat.env.cxx:2.0.0
Optionally, also download the following images to be able to re-run experiments
downloaded from the BEAT platform (not required for unit testing). These docker
images correspond to the python environments available on the platform. Keep in
mind that, at the moment, you cannot use different environments to run each
block when you are using BEAT locally (that is, not using the Docker
executor)::
$ docker pull docker.idiap.ch/beat/beat.env.python:1.1.0
$ docker pull docker.idiap.ch/beat/beat.env.python:2.0.0
$ docker pull docker.idiap.ch/beat/beat.env.db:1.4.0
Before pulling these images, you should check the registry, as there might
have been new releases (i.e., rX versions).
To run an experiment using docker, you should specify the docker image when
defining the experiment, then use the ``--docker`` flag with
``beat.cmdline``::
$ beat experiment run --docker <experiment name>
You can find more information about running experiments locally using
different executors `here <https://www.idiap.ch/software/beat/docs/beat/docs/master/beat.cmdline/doc/experiments.html#how-to-run-an-experiment>`_.
Documentation
......@@ -90,6 +103,7 @@ To build the documentation, just do:
$ ./bin/sphinx-build doc sphinx
Testing
-------
......@@ -103,18 +117,18 @@ use ``nose``:
.. note::
Some of the tests for our command-line toolkit require a running BEAT
platform web-server, with a compatible ``beat.core`` installed (preferably
the same). By default, these tests will be skipped. If you want to run
them, you must set up a development web server and set the environment
variable ``BEAT_CORE_TEST_PLATFORM`` to point to that address. For example::
$ export BEAT_CORE_TEST_PLATFORM="http://example.com/platform/"
$ ./bin/nosetests -sv
.. warning::
Do **NOT** run tests against a production web server.
If you want to skip slow tests (at least those pulling stuff from our servers)
......@@ -131,15 +145,13 @@ To measure the test coverage, do the following::
Our documentation is also interspersed with test units. You can run them using
sphinx::
$ ./bin/sphinx-build -b doctest doc sphinx
Other Bits
==========
Profiling
---------
In order to profile the test code, try the following::
......@@ -154,4 +166,4 @@ This will allow you to dump and print the profiling statistics as you may find
fit.
.. _docker: https://www.docker.com/
.. include:: links.rst