Commit ea0ec084 authored by Tiago Pereira, committed by Amir MOHAMMADI

Fixed doc tests

parent 5bbde19c
Merge request !24: Re-write the user guide
@@ -171,9 +171,8 @@ Follow below a snippet on how to train a GMM using the MAP estimator.
>>> # Training
>>> bob.learn.em.train(gmm_trainer, adapted_gmm, data, max_iterations = max_iterations, convergence_threshold = convergence_threshold) # Train the GMMMachine
>>> print(adapted_gmm.means)
-[[ -4.66666667   3.53333333 -40.5       ]
- [  2.92857143  -4.07142857  76.14285714]]
+[[ -4.667   3.533 -40.5  ]
+ [  2.929  -4.071  76.143]]
Below follows an intuition of the GMM trained with the MAP estimator, using the Iris flower `dataset <https://en.wikipedia.org/wiki/Iris_flower_data_set>`_.
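For intuition, the mean update that relevance-MAP adaptation performs can be written out in a few lines of numpy. The sketch below is illustrative only: the relevance factor ``r``, the prior means, and the accumulated statistics are toy values, not the ones from the snippet above.

import numpy

# Relevance-MAP mean update per Gaussian component:
#   adapted_mean = alpha * E(x) + (1 - alpha) * prior_mean,  alpha = n / (n + r)
r = 4.0                                    # relevance factor (assumed)
prior_means = numpy.zeros((2, 3))          # prior (UBM) means (toy)
n = numpy.array([3.0, 7.0])                # zeroth-order statistics per component (toy)
sum_px = numpy.array([[-14.0, 10.6, -121.5],
                      [20.5, -28.5, 533.0]])  # first-order statistics (toy)

alpha = n / (n + r)                        # adaptation coefficient per component
adapted_means = (alpha[:, None] * (sum_px / n[:, None])
                 + (1.0 - alpha[:, None]) * prior_means)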
@@ -223,15 +222,14 @@ The snippet below shows how to accumulate these statistics given a prior GMM.
>>> # All nice and round diagonal covariance
>>> prior_gmm.variances = numpy.ones((2, 3)) * 0.5
>>> prior_gmm.weights = numpy.array([0.3, 0.7])
>>> # Creating the container
>>> gmm_stats_container = bob.learn.em.GMMStats(2, 3)
>>> for d in data:
...     prior_gmm.acc_statistics(d, gmm_stats_container)
>>>
>>> # Printing the responsibilities
>>> print(gmm_stats_container.n / gmm_stats_container.t)
-[ 0.42861627  0.57138373]
+[ 0.429  0.571]
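To make the two accumulators concrete: ``n`` holds the summed responsibilities (zeroth-order statistics) per component and ``t`` the number of accumulated samples, so ``n / t`` estimates the mixture weights. Below is a minimal numpy sketch of that computation under the assumption of diagonal covariances; the parameters and data are toy values, not the bob.learn.em internals.

import numpy

means = numpy.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])   # toy component means
variances = numpy.ones((2, 3)) * 0.5                      # diagonal covariances
weights = numpy.array([0.3, 0.7])
data = numpy.array([[0.1, -0.2, 0.3], [1.2, 0.9, 1.1],
                    [0.0, 0.1, -0.1], [0.8, 1.0, 1.2]])   # toy samples

def responsibilities(x):
    # weighted diagonal-Gaussian likelihoods, normalised over components
    log_p = -0.5 * (((x - means) ** 2) / variances
                    + numpy.log(2 * numpy.pi * variances)).sum(axis=1)
    p = weights * numpy.exp(log_p)
    return p / p.sum()

n = sum(responsibilities(x) for x in data)   # zeroth-order statistics
t = len(data)                                # number of samples
print(n / t)  # comparable to gmm_stats_container.n / gmm_stats_container.t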
Inter-Session Variability
@@ -283,11 +281,11 @@ The snippet below shows how to train an inter-session variability model.
>>> gmm_stats_per_class = []
>>> for d in data:
...     stats = []
...     for i in d:
...         gmm_stats_container = bob.learn.em.GMMStats(2, 3)
...         prior_gmm.acc_statistics(i, gmm_stats_container)
...         stats.append(gmm_stats_container)
...     gmm_stats_per_class.append(stats)
>>> # Finally doing the ISV training
>>> subspace_dimension_of_u = 2
@@ -297,12 +295,12 @@ The snippet below shows how to train an inter-session variability model.
>>> bob.learn.em.train(trainer, isvbase, gmm_stats_per_class, max_iterations=50)
>>> # Printing the session offset w.r.t. each Gaussian component
>>> print(isvbase.u)
-[[-0.01018674 -0.0266506 ]
- [-0.00160621 -0.00420217]
- [ 0.02811708  0.07356007]
- [ 0.01162401  0.0304108 ]
- [ 0.03261834  0.08533628]
- [ 0.04602195  0.1204029 ]]
+[[-0.01  -0.027]
+ [-0.002 -0.004]
+ [ 0.028  0.074]
+ [ 0.012  0.03 ]
+ [ 0.033  0.085]
+ [ 0.046  0.12 ]]
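What the trained ``U`` is used for, in a nutshell: ISV models the mean supervector of each observation as the UBM supervector plus a session offset constrained to the span of ``U``. A minimal sketch with assumed shapes matching the example (2 components x 3 dimensions = 6, rank-2 subspace); all values are illustrative.

import numpy

m_ubm = numpy.zeros(6)             # stacked UBM means (toy values)
U = 0.01 * numpy.ones((6, 2))      # stands in for isvbase.u (assumed values)
x = numpy.array([0.5, -1.0])       # latent session factor (assumed)
m_session = m_ubm + U.dot(x)       # session-shifted mean supervector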
Joint Factor Analysis
@@ -349,11 +347,11 @@ The snippet below shows how to train a joint factor analysis model.
>>> gmm_stats_per_class = []
>>> for d in data:
...     stats = []
...     for i in d:
...         gmm_stats_container = bob.learn.em.GMMStats(2, 3)
...         prior_gmm.acc_statistics(i, gmm_stats_container)
...         stats.append(gmm_stats_container)
...     gmm_stats_per_class.append(stats)
>>>
>>> # Finally doing the JFA training
>>> subspace_dimension_of_u = 2
@@ -365,12 +363,12 @@ The snippet below shows how to train a joint factor analysis model.
>>> # Printing the client identity offset w.r.t. each Gaussian component
>>> print(jfabase.v)
-[[ 0.002881   -0.00584226]
- [ 0.04143534 -0.084025  ]
- [-0.26149889  0.53028268]
- [-0.25156799  0.51014422]
- [-0.38687714  0.78453199]
- [-0.36015773  0.73034882]]
+[[ 0.003 -0.006]
+ [ 0.041 -0.084]
+ [-0.261  0.53 ]
+ [-0.252  0.51 ]
+ [-0.387  0.785]
+ [-0.36   0.73 ]]
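JFA extends ISV by splitting the offset into a session part and a client part: supervector = m_ubm + U x + V y + D z. Below is a sketch of that decomposition with assumed shapes and toy latent factors; the matrices merely stand in for ``jfabase.u``, ``jfabase.v`` and ``jfabase.d``.

import numpy

m_ubm = numpy.zeros(6)                   # stacked UBM means (toy)
U = 0.01 * numpy.ones((6, 2))            # session subspace (as jfabase.u, toy)
V = 0.10 * numpy.ones((6, 2))            # client subspace (as jfabase.v, toy)
D = 0.05 * numpy.ones(6)                 # diagonal residual (as jfabase.d, toy)
x = numpy.array([0.5, -1.0])             # session factor (assumed)
y = numpy.array([1.0, 0.2])              # client factor (assumed)
z = numpy.zeros(6)                       # residual factor (assumed)
m = m_ubm + U.dot(x) + V.dot(y) + D * z  # JFA mean supervector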
Total Variability Modelling
***************************
@@ -431,13 +429,12 @@ The snippet below shows how to train a total variability model.
>>>
>>> # Printing the total variability matrix w.r.t. each Gaussian component
>>> print(ivector_machine.t)
-[[ 0.1101072  -0.20271139]
- [-0.12426696  0.01402857]
- [ 0.29584642  0.67414389]
- [ 0.44728435  0.1744876 ]
- [ 0.42547226  0.58287138]
- [ 0.39369553  0.79358693]]
+[[ 0.11  -0.203]
+ [-0.124  0.014]
+ [ 0.296  0.674]
+ [ 0.447  0.174]
+ [ 0.425  0.583]
+ [ 0.394  0.794]]
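The total variability model drops the session/client split and writes every utterance supervector as m = m_ubm + T w; the i-vector is the posterior mean of the low-dimensional w. The sketch below shows only the simplified least-squares reading of that equation, not the exact precision-weighted posterior the trainer computes; all values are toy assumptions.

import numpy

m_ubm = numpy.zeros(6)                                       # stacked UBM means (toy)
T = numpy.array([[0.1, -0.2], [-0.1, 0.0], [0.3, 0.7],
                 [0.4, 0.2], [0.4, 0.6], [0.4, 0.8]])        # stands in for ivector_machine.t
supervector = numpy.array([0.2, -0.1, 0.9, 0.5, 1.0, 1.1])   # toy utterance statistics
w, *_ = numpy.linalg.lstsq(T, supervector - m_ubm, rcond=None)  # rough "i-vector"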
Linear Scoring
**************
@@ -470,26 +467,19 @@ The snippet below shows how to compute scores using this approximation.
>>> import bob.learn.em
>>> import numpy
>>> # Defining a fake prior
>>> prior_gmm = bob.learn.em.GMMMachine(3, 2)
>>> prior_gmm.means = numpy.array([[1, 1], [2, 2.1], [3, 3]])
>>> # Defining a fake adapted model
>>> adapted_gmm = bob.learn.em.GMMMachine(3, 2)
>>> adapted_gmm.means = numpy.array([[1.5, 1.5], [2.5, 2.5], [2, 2]])
>>> # Defining an input
>>> input = numpy.array([[1.5, 1.5], [1.6, 1.6]])
>>> # Accumulating statistics of the GMM
>>> stats = bob.learn.em.GMMStats(3, 2)
>>> prior_gmm.acc_statistics(input, stats)
->>> score = bob.learn.em.linear_scoring([adapted_gmm], prior_gmm, [stats], [],
->>> frame_length_normalisation=True)
->>> print score
-[[ 0.25354909]]
+>>> print(bob.learn.em.linear_scoring([adapted_gmm], prior_gmm, [stats], [], frame_length_normalisation=True))
+[[ 0.254]]
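For reference, the closed form behind linear scoring (after Glembek et al., 2009) is score = (m_model - m_ubm)^T Sigma^{-1} (F - N * m_ubm), divided by the number of frames when ``frame_length_normalisation`` is set. A numpy sketch with the shapes of the example above (3 components, 2 dimensions); the statistics values are illustrative assumptions, not what ``stats`` actually contains.

import numpy

m_ubm = numpy.array([[1, 1], [2, 2.1], [3, 3]], dtype=float)   # prior means
m_model = numpy.array([[1.5, 1.5], [2.5, 2.5], [2, 2]])        # adapted means
sigma = numpy.ones((3, 2))                                     # diagonal covariances (assumed)
N = numpy.array([1.2, 0.7, 0.1])                               # zeroth-order stats (toy)
F = numpy.array([[1.8, 1.8], [1.1, 1.1], [0.2, 0.2]])          # first-order stats (toy)

# per-component inner products, summed over components and dimensions
score = (((m_model - m_ubm) / sigma) * (F - N[:, None] * m_ubm)).sum()
score /= N.sum()   # the frame_length_normalisation=True division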
Probabilistic Linear Discriminant Analysis (PLDA)
...