#!/usr/bin/env python
# vim: set fileencoding=utf-8 :

import numpy
import scipy.signal

from bob.bio.base.algorithm import Algorithm


class MiuraMatch (Algorithm):
  """Finger vein matching: match ratio via cross-correlation

  The method is based on "cross-correlation" between a model and a probe image.
  It convolves the binary image(s) representing the model with the binary image
  representing the probe (rotated by 180 degrees), and evaluates how they
  cross-correlate. If the model and probe are very similar, the output of the
  correlation reduces to a single scalar approaching a maximum. The value is
  then normalized by the sum of the pixels lit in both binary images.
  Therefore, the output of this method is a floating-point number in the range
  :math:`[0, 0.5]`. The higher the value, the better the match.

  If the model and the probe represent images of the same vein structure but
  are misaligned, the output is not guaranteed to be accurate. To mitigate
  this, Miura et al. proposed to add a *small* cropping factor to the model
  image, assuming not much information is available on the borders (``ch``, for
  the vertical direction and ``cw``, for the horizontal direction). This allows
  the convolution to search over different areas of the probe image. The
  maximum value is then taken from the resulting operation. The convolution
  result is normalized by the pixels lit in both the cropped model image and
  the region of the probe that yields the maximum of the resulting
  convolution.

  For this to work properly, input images are expected to be binary, containing
  only zeros and ones.
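
  In summary, the score computed by :py:meth:`score` below can be written as::

    score = max(Nm) / (crop_R.sum() + I[t0:t0+hc, s0:s0+wc].sum())

  in which ``Nm`` is the ("valid") cross-correlation between the probe ``I``
  and the cropped model ``crop_R``, ``(t0, s0)`` is the position of its
  maximum and ``(hc, wc)`` is the shape of the cropped model (the latter
  notation is used only in this summary).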

  Based on N. Miura, A. Nagasaka, and T. Miyatake. Feature extraction of finger
  vein patterns based on repeated line tracking and its application to personal
  identification. Machine Vision and Applications, Vol. 15, Num. 4, pp.
  194--203, 2004

  Parameters:

    ch (:py:class:`int`, optional): Maximum search displacement in y-direction.

    cw (:py:class:`int`, optional): Maximum search displacement in x-direction.

  """

  def __init__(self,
      ch = 80,       # Maximum search displacement in y-direction
      cw = 90,       # Maximum search displacement in x-direction
      ):

    # call base class constructor
    Algorithm.__init__(
        self,

        ch = ch,
        cw = cw,

        multiple_model_scoring = None,
        multiple_probe_scoring = None
    )

    self.ch = ch
    self.cw = cw


  def enroll(self, enroll_features):
    """Enrolls the model by computing an average graph for each model"""

    # return the generated model
    return numpy.array(enroll_features)


  def score(self, model, probe):
    """Computes the score between the probe and the model.

    Parameters:

      model (numpy.ndarray): The model of the user to test the probe against

      probe (numpy.ndarray): The probe to test


    Returns:

      float: Value between 0 and 0.5; a larger value means a better match

    """

    I = probe.astype(numpy.float64)

    if len(model.shape) == 2:
      model = numpy.array([model])

    scores = []

    # iterate over all models for a given individual
    for md in model:

      # crop the model borders by (ch, cw)
      R = md.astype(numpy.float64)
      h, w = R.shape  # same as I
      crop_R = R[self.ch:h-self.ch, self.cw:w-self.cw]

      # correlates using scipy - fastest option available iff the self.ch and
      # self.cw are high (>30). In this case, the number of components
      # returned by the convolution is high and using an FFT-based method
      # yields best results. Otherwise, you may try the other options below
      # -> check our test_correlation() method on the test units for more
      # details and benchmarks.
      Nm = scipy.signal.fftconvolve(I, numpy.rot90(crop_R, k=2), 'valid')
      # 2nd best: use convolve2d directly:
      # Nm = scipy.signal.convolve2d(I, numpy.rot90(crop_R, k=2), 'valid')
      # 3rd best: use correlate2d (no need to rotate the model):
      # Nm = scipy.signal.correlate2d(I, crop_R, 'valid')

      # figures out where the maximum is on the resulting matrix
      t0, s0 = numpy.unravel_index(Nm.argmax(), Nm.shape)

      # this is our output
      Nmm = Nm[t0, s0]

      # normalizes the output by the number of pixels lit on the input
      # matrices, taking into consideration the surface that produced the
      # result (i.e., the cropped model and the overlapping part of the probe)
      scores.append(Nmm/(crop_R.sum() + I[t0:t0+h-2*self.ch, s0:s0+w-2*self.cw].sum()))

    return numpy.mean(scores)
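

# The block below is a minimal, self-contained usage sketch (not part of the
# algorithm itself): it enrolls a synthetic binary pattern and scores a
# slightly shifted copy of it against the resulting model. The pattern, the
# shift and the smaller ch/cw values are illustrative choices only.
if __name__ == '__main__':

  # synthetic binary "vein" image: a sparse grid of lit lines
  image = numpy.zeros((300, 400), dtype=numpy.float64)
  image[::20, :] = 1.0
  image[:, ::30] = 1.0

  # a probe showing the same pattern, displaced by a few pixels
  probe = numpy.roll(numpy.roll(image, 3, axis=0), 5, axis=1)

  # small search displacements are enough for this toy example
  matcher = MiuraMatch(ch=10, cw=10)
  model = matcher.enroll([image])

  # prints a value in [0, 0.5]; higher means a better match
  print('score: %f' % matcher.score(model, probe))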