GMM weight changes not taken into account
Created by: DavidDoukhan
Hi list, I wanted to use the class bob.learn.em.GMMMachine. I was following the documentation below, which consists of creating a GMMMachine instance and setting its means, covariances, and weights manually: https://www.idiap.ch/software/bob/docs/latest/bioidiap/bob.learn.misc/master/guide.html
I installed bob with the default options using pip. According to the logs below, it seems the GMM weights are not taken into account when computing the log likelihood of a sample.
```python
import bob.learn.em
from bob.learn.em import GMMMachine
import numpy as np

print bob.learn.em.__version__

gmm1 = GMMMachine(1, 1)
data = np.ones((1, 1))
gmm1(data)

gmm1.weights[:] = 666
gmm1(data)
```
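For reference, here is how the mixture weights enter the GMM log likelihood. This is a sketch of the textbook definition with diagonal covariance, not bob's internals; the function name and the toy parameters are my own. It shows that changing the weights must change the score, which is the behavior the session above fails to exhibit:

```python
import numpy as np

# Textbook GMM log likelihood with diagonal covariance:
#   log p(x) = logsumexp_k( log w_k + log N(x | mu_k, var_k) )
def gmm_log_likelihood(x, weights, means, variances):
    # per-component Gaussian log density (diagonal covariance)
    log_norm = -0.5 * np.sum(
        np.log(2 * np.pi * variances) + (x - means) ** 2 / variances,
        axis=1)
    # the weights enter here, so changing them must change the result
    joint = np.log(weights) + log_norm
    m = joint.max()
    return m + np.log(np.exp(joint - m).sum())

x = np.ones(1)
means = np.array([[0.0], [3.0]])
variances = np.ones((2, 1))
print(gmm_log_likelihood(x, np.array([0.5, 0.5]), means, variances))
print(gmm_log_likelihood(x, np.array([0.9, 0.1]), means, variances))
```

The two calls print different values, since only the weights differ between them.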
After some more investigation, I suspected that this behavior is related to the caching of the log weights, which is not updated when new values are assigned to the weights in place. I found a workaround: create a new instance from the ill-configured one:
```python
gmm2 = GMMMachine(gmm1)
gmm2(data)
```
This workaround seems to provide coherent results.
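The suspected mechanism can be illustrated with a tiny stand-in class. This is only a sketch of the stale-cache pattern described above; the class and method names are hypothetical and this is not the actual bob.learn.em implementation:

```python
import numpy as np

class TinyGMM:
    """Illustrative stand-in (NOT bob.learn.em code): log weights
    are cached at construction time and are not refreshed when
    `weights` is mutated in place."""

    def __init__(self, weights):
        self.weights = np.array(weights, dtype=float)
        # cached once; in-place writes to self.weights bypass this
        self._log_weights = np.log(self.weights)

    def log_weight_sum(self):
        # stand-in for any score that reads the cached log weights
        return float(self._log_weights.sum())

g = TinyGMM([0.5, 0.5])
before = g.log_weight_sum()
g.weights[:] = [0.9, 0.1]        # in-place update: cache goes stale
assert g.log_weight_sum() == before   # score unchanged, as in the bug

# rebuilding from the mutated instance recomputes the cache,
# mirroring the GMMMachine(gmm1) workaround
g2 = TinyGMM(g.weights)
assert g2.log_weight_sum() != before
```

Constructing a fresh object forces the cache to be recomputed from the current weights, which is why copying the instance gives coherent results while in-place assignment does not.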
In conclusion, I think this behavior should either be fixed or made explicit in the documentation.