Commit 69f2bb59 authored by Yannick DAYER

[doc] skip doctests of removed C++ modules

parent c431c24a
2 merge requests: !43 "Remove C++ code, tests, dependencies, and build scripts", !40 "Transition to a pure python implementation"
Pipeline #56573 failed
@@ -100,7 +100,7 @@ This statistical model is defined in the class
 :py:class:`bob.learn.em.GMMMachine` as bellow.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> # Create a GMM with k=2 Gaussians with the dimensionality of 3
@@ -132,7 +132,7 @@ estimator.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -197,7 +197,7 @@ Follow bellow an snippet on how to train a GMM using the MAP estimator.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -275,7 +275,7 @@ prior GMM.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -340,7 +340,7 @@ Intersession variability modeling.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -414,7 +414,7 @@ The JFA statistical model is stored in this container
 Intersession variability modeling.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -489,7 +489,7 @@ The iVector statistical model is stored in this container
 a Total variability modeling.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -564,7 +564,7 @@ This scoring technique is implemented in :py:func:`bob.learn.em.linear_scoring`.
 The snippet bellow shows how to compute scores using this approximation.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> import bob.learn.em
    >>> import numpy
@@ -611,7 +611,7 @@ Let us consider a training set of two classes, each with 3 samples of
 dimensionality 3.
 
 .. doctest::
-   :options: +NORMALIZE_WHITESPACE
+   :options: +NORMALIZE_WHITESPACE +SKIP
 
    >>> data1 = numpy.array(
    ...     [[3,-3,100],
@@ -628,6 +628,7 @@ Learning a PLDA model can be performed by instantiating the class
 :py:meth:`bob.learn.em.train` method.
 
 .. doctest::
+   :options: +SKIP
 
    >>> # This creates a PLDABase container for input feature of dimensionality
    >>> # 3 and with subspaces F and G of rank 1 and 2, respectively.
@@ -645,6 +646,7 @@ obtained by calling the
 :py:meth:`bob.learn.em.PLDAMachine.compute_log_likelihood()` method.
 
 .. doctest::
+   :options: +SKIP
 
    >>> plda = bob.learn.em.PLDAMachine(pldabase)
    >>> samples = numpy.array(
@@ -658,6 +660,7 @@ a set of enrollment samples, then, several instances of
 the :py:meth:`bob.learn.em.PLDATrainer.enroll()` method as follows.
 
 .. doctest::
+   :options: +SKIP
 
    >>> plda1 = bob.learn.em.PLDAMachine(pldabase)
    >>> samples1 = numpy.array(
@@ -675,6 +678,7 @@ several test samples can be computed as previously described, and this
 separately for each model.
 
 .. doctest::
+   :options: +SKIP
 
    >>> sample = numpy.array([3.2,-3.3,58], dtype=numpy.float64)
    >>> l1 = plda1.compute_log_likelihood(sample)
@@ -691,6 +695,7 @@ computed, which is defined in more formal way by:
 :math:`s = \ln(P(x_{test},x_{enroll})) - \ln(P(x_{test})P(x_{enroll}))`
 
 .. doctest::
+   :options: +SKIP
 
    >>> s1 = plda1(sample)
    >>> s2 = plda2(sample)
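The effect of the `+SKIP` option added in every hunk above can be reproduced with Python's standard `doctest` module: a flagged example is parsed but never executed, so documentation that still references the removed C++ modules cannot fail the doc build. A minimal sketch (the `bob.learn.em` import below is purely illustrative and does not need to be installed; Sphinx applies the flag block-wide via `:options:`, while plain `doctest` takes it as a per-example comment):

```python
import doctest

# A docstring in the same style as the patched examples. The +SKIP flag
# tells doctest to parse these examples but never execute them, so a
# missing (removed) module cannot cause a failure.
docstring = """
>>> import bob.learn.em  # doctest: +SKIP
>>> machine = bob.learn.em.GMMMachine(2, 3)  # doctest: +SKIP
"""

parser = doctest.DocTestParser()
test = parser.get_doctest(docstring, globs={}, name="skip-demo",
                          filename=None, lineno=0)
runner = doctest.DocTestRunner(verbose=False)
results = runner.run(test)

# Skipped examples are not counted as attempted, and none can fail.
print(results.attempted, results.failed)  # → 0 0
```

This is why the commit is a safe stopgap during the C++ removal: the snippets stay visible in the rendered guide, but `sphinx-build -b doctest` no longer tries to import the deleted extension modules.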