The script runs out of memory if the training data is large
I am trying to run a GMM training algorithm, but it runs out of memory in the k-init step.
I saw that the code does something like `numpy.vstack(list(features))`, which I think requires roughly twice the memory of the training features: one copy for the intermediate list and a second for the resulting numpy array.
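As a minimal sketch of the pattern (the generator `feature_chunks` is a hypothetical stand-in for the real feature pipeline, not the actual code):

```python
import numpy as np

# Hypothetical stand-in for the real feature pipeline: a generator
# yielding one 2-D array of frame features per utterance.
def feature_chunks(n_utterances=1000, n_frames=500, dim=39):
    for _ in range(n_utterances):
        yield np.random.rand(n_frames, dim)

features = feature_chunks()

# The pattern in question: list(features) materializes every chunk,
# and vstack then allocates a second, full-size copy, so peak memory
# is roughly twice the size of the training features.
data = np.vstack(list(features))
```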
Looking at numpy's documentation, it seems there is only one function that builds an array directly from an iterable: https://docs.scipy.org/doc/numpy/reference/generated/numpy.fromiter.html. But working with `numpy.fromiter` is tricky, since it only produces one-dimensional arrays from a stream of scalars.
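A rough sketch of what the `fromiter` route could look like, assuming the total frame count and feature dimension are known up front (the function name and signature here are hypothetical, for illustration only):

```python
import numpy as np

def stack_via_fromiter(features, n_frames, dim, dtype=np.float64):
    """Build the (n_frames, dim) training matrix from an iterable of
    2-D chunks without keeping an intermediate list of arrays alive."""
    # fromiter only accepts a stream of scalars, so flatten each
    # chunk on the fly; count lets numpy preallocate the output once.
    flat = (v for chunk in features for v in chunk.ravel())
    out = np.fromiter(flat, dtype=dtype, count=n_frames * dim)
    # The reshape is a view onto the 1-D buffer, not another copy.
    return out.reshape(n_frames, dim)
```

The catch is speed: every value passes through Python one scalar at a time, so when the total size is known in advance, preallocating with `np.empty` and copying chunk by chunk may be the more practical option.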
I am addressing this issue in !9 (merged)