"""Computes the Intrapersonal/Extrapersonal classifier using a generic feature type and feature comparison function"""
"""Computes the Intrapersonal/Extrapersonal classifier using a generic feature type and feature comparison function.
In this generic implementation, any distance or similarity vector that results from the comparison of two feature vectors can be used.
Currently, two different versions are implemented: one with subspace projection of the features (cf. [MWP98]_) and one without (a generalization of [GW09]_).
The implementation of the BIC training is taken from :ref:`bob.learn.linear <bob.learn.linear>`.
**Parameters:**
comparison_function : function
The function to compare the features in the original feature space.
For a given pair of features, this function is supposed to compute a vector of similarity (or distance) values.
In the easiest case, it simply computes the element-wise difference of the feature vectors, but more complex functions can be applied, and the function may be specialized for the type of features you use (see the sketch after this docstring).
maximum_training_pair_count : int or None
Limit the number of training image pairs to the given value, e.g., to avoid memory issues.
subspace_dimensions : (int, int) or None
A tuple of sizes of the intrapersonal and extrapersonal subspaces.
If given, subspace projection is performed (cf. [MWP98]_) and the subspace projection matrices are truncated to the given sizes.
If omitted, no subspace projection is performed (cf. [GW09]_).
uses_dffs : bool
Only valid when ``subspace_dimensions`` are specified.
Use the *Distance From Feature Space* (DFFS) (cf. [MWP98]_) during scoring.
Use this flag with care!
read_function : function
A function to read a feature from :py:class:`bob.io.base.HDF5File`.
This function needs to be appropriate to read the type of features that you are using.
By default, :py:func:`bob.bio.base.load` is used.
write_function : function
A function to write a feature to :py:class:`bob.io.base.HDF5File`.
This function is used to write the model and needs to be appropriate for the type of features that you are using.
By default, :py:func:`bob.bio.base.save` is used.
kwargs : ``key=value`` pairs
A list of keyword arguments directly passed to the :py:class:`Algorithm` base class constructor.
"""
def __init__(
self,
...
...
@@ -59,17 +97,27 @@ class BIC (Algorithm):
self.M_E = None
def _sqr(self, x):
  return x * x
def _trainset_for(self, pairs):
"""Computes the array containing the comparison results for the given set of image pairs."""
"""Computes the IEC score for the given model and probe pair"""
"""score(model, probe) -> float
Computes the BIC score between the model and the probe.
First, the ``comparison_function`` is used to create the comparison vectors between all model features and the probe feature.
Then, a BIC score is computed for each comparison vector, and the BIC scores are fused using the :py:func:`model_fusion_function` defined in the :py:class:`Algorithm` base class.
**Parameters:**
model : [object]
The model storing all model features.
probe : object
The probe feature.
**Returns:**
score : float
A fused BIC similarity value between ``model`` and ``probe``.
"""
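# A hypothetical illustration of the scoring steps described above; ``bic_machine``
# stands in for the trained BIC machine and ``fuse`` for the model fusion function
# of the base class -- both names are assumptions, not the actual implementation.
import numpy

def fused_bic_score(model, probe, comparison_function, bic_machine, fuse=numpy.average):
  # one comparison vector per enrolled model feature
  comparisons = [comparison_function(model_feature, probe) for model_feature in model]
  # one BIC score per comparison vector, fused into a single similarity value
  return fuse([bic_machine(comparison) for comparison in comparisons])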
"""Computes a linear discriminant analysis (LDA) on the given data, possibly after computing a principal component analysis (PCA).
This algorithm computes an LDA projection (:py:class:`bob.learn.linear.FisherLDATrainer`) on the given training features, projects the features to Fisher space, and computes the distance between two projected features in Fisher space.
For example, the Fisher faces algorithm as proposed by [ZKC+98]_ can be run with this class.
Additionally, a PCA projection matrix can be computed beforehand, to reduce the dimensionality of the input vectors.
In that case, the finally stored projection matrix is the combination of the PCA and LDA projection.
**Parameters:**
lda_subspace_dimension : int or ``None``
If specified, the LDA subspace will be truncated to the given number of dimensions.
By default (``None``) it is limited to the number of classes in the training set - 1.
pca_subspace_dimension : int or float or ``None``
If specified, a combined PCA + LDA projection matrix will be computed.
If specified as ``int``, defines the number of eigenvectors used in the PCA projection matrix.
If specified as ``float`` (between 0 and 1), the number of eigenvectors is calculated such that the given percentage of variance is kept (see the sketch after this docstring).
distance_function : function
A function that takes two parameters and returns a float.
If ``uses_variances`` is set to ``True``, the function is provided with a third parameter, which is the vector of variances (aka. eigenvalues).
is_distance_function : bool
Set this flag to ``False`` if the given ``distance_function`` computes a similarity value (i.e., higher values are better).
uses_variances : bool
If set to ``True``, the ``distance_function`` is provided with a third argument, which is the vector of variances (aka. eigenvalues).
kwargs : ``key=value`` pairs
A list of keyword arguments directly passed to the :py:class:`Algorithm` base class constructor.
"""
def __init__(
  self,
  lda_subspace_dimension=None,  # if set, the LDA subspace will be truncated to the given number of dimensions; by default it is limited to the number of classes in the training set - 1
  pca_subspace_dimension=None,  # if set, a PCA subspace truncation is performed before applying LDA; might be integral or float
"""Performs a principal component analysis (PCA) on the given data.
This algorithm computes a PCA projection (:py:class:`bob.learn.linear.PCATrainer`) on the given training features, projects the features to eigenspace and computes the distance between two projected features in eigenspace.
For example, the eigenface algorithm as proposed by [TP91]_ can be run with this class (see the sketch after this docstring).
**Parameters:**
...
...
@@ -35,6 +35,8 @@ class PCA (Algorithm):
uses_variances : bool
If set to ``True``, the ``distance_function`` is provided with a third argument, which is the vector of variances (aka. eigenvalues).
kwargs : ``key=value`` pairs
A list of keyword arguments directly passed to the :py:class:`Algorithm` base class constructor.
"""
def __init__(
...
...
@@ -190,7 +192,6 @@ class PCA (Algorithm):
score : float
A similarity value between ``model`` and ``probe``.
"""
self._check_feature(probe, True)
# return the negative distance (as a similarity measure)
* ``score_for_multiple_probes(self, model, probes)``: By default, the average (or min, max, ...) of the scores for all probes is computed. **Override** this function in case you want different behavior, as sketched below.
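For instance, a hypothetical override could fuse the per-probe scores with the median instead (the import path ``bob.bio.base.algorithm.Algorithm`` is an assumption)::

  import numpy
  from bob.bio.base.algorithm import Algorithm

  class MedianFusionAlgorithm(Algorithm):
    def score_for_multiple_probes(self, model, probes):
      # fuse the scores for all probes with the median instead of the default average
      return numpy.median([self.score(model, probe) for probe in probes])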
Implemented Tools
-----------------
In this base package, only one feature extractor and some recognition algorithms are defined.
However, implementations of the base classes can be found in all of the ``bob.bio`` packages.
.. [ZKC+98] *W. Zhao, A. Krishnaswamy, R. Chellappa, D. Swets and J. Weng*. **Discriminant analysis of principal components for face recognition**, pages 73-85. Springer Verlag Berlin, 1998.
.. [Pri07] *S. J. D. Prince*. **Probabilistic linear discriminant analysis for inferences about identity**. Proceedings of the International Conference on Computer Vision. 2007.
.. [ESM+13] *L. El Shafey, Chris McCool, Roy Wallace and Sébastien Marcel*. **A scalable formulation of probabilistic linear discriminant analysis: applied to face recognition**. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(7):1788-1794, 7/2013.
.. [MWP98] *B. Moghaddam, W. Wahid and A. Pentland*. **Beyond eigenfaces: probabilistic matching for face recognition**. IEEE International Conference on Automatic Face and Gesture Recognition, pages 30-35. 1998.
.. [GW09] *M. Günther and R.P. Würtz*. **Face detection and recognition using maximum likelihood classifiers on Gabor graphs**. International Journal of Pattern Recognition and Artificial Intelligence, 23(3):433-461, 2009.
.. [TP91] *M. Turk and A. Pentland*. **Eigenfaces for recognition**. Journal of Cognitive Neuroscience, 3(1):71-86, 1991.