ROC and DET plots are calculated incorrectly sometimes
Following the discussion here: https://groups.google.com/forum/#!topic/bob-devel/EIp1nvw5-vQ
Looks like we have a corner case where the scores have a very large peak in their distribution. The scores are also available: fusion_all_200_datatset.npy. To load them:
>>> import numpy
>>> scores = numpy.load('fusion_all_200_datatset.npy')
>>> positives = scores[0]
>>> negatives = scores[1]
# The negatives are mostly 0
>>> sum(negatives < 9e-16)
51911
>>> sum(negatives < 9e-17)
51525
>>> sum(negatives < 9e-18)
51029
>>> sum(negatives < 9e-20)
49675
>>> sum(negatives < 9e-22)
47543
>>> sum(negatives < 9e-30)
27487
>>> sum(negatives < 9e-60)
0
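As a side check (sketch only, not something that was run on this data), one could visualise how concentrated that peak is with a histogram over logarithmically spaced bins, reusing the negatives array loaded above; the filtering to strictly positive values is only there because log-spaced bins need positive edges:

import numpy
from matplotlib import pyplot

# histogram of the negative scores with log-spaced bins, so that both the peak
# below ~1e-15 and the tail of larger scores remain visible
nonzero = negatives[negatives > 0]  # log-spaced bins require strictly positive edges
bins = numpy.logspace(numpy.log10(nonzero.min()), numpy.log10(nonzero.max()), 50)
pyplot.hist(nonzero, bins=bins)
pyplot.xscale('log')
pyplot.xlabel('negative score (log scale)')
pyplot.ylabel('count')
pyplot.show()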
@tiago.pereira may be able to provide a smaller set of scores to debug this.
When I calculate the EER, I get 2.7%:
>>> import numpy
>>> import bob.measure
>>> import bob.measure.plot
>>> from matplotlib import pyplot
>>> scores = numpy.load('fusion_all_200_datatset.npy')
>>> positives = scores[0]
>>> negatives = scores[1]
>>> threshold = bob.measure.eer_threshold(negatives, positives)
>>> FAR, FRR = bob.measure.farfrr(negatives, positives, threshold)
>>> FAR, FRR
(0.02762483029114497, 0.027626084163186636)
>>> negatives.mean(), negatives.std(), positives.mean(), positives.std()
(3.0290521114232657e-06, 0.00067550514259906739, 0.20795613959959214, 0.32512541269459205)
>>> 100*(FAR+FRR)/2
2.76254572271658
>>> npoints = 100  # npoints was not defined in the pasted session; 100 is just an assumed value
>>> bob.measure.plot.roc(negatives, positives, npoints)
[<matplotlib.lines.Line2D object at 0x7f472a8bed10>]
>>> bob.measure.plot.det(negatives, positives, npoints, color=(0,0,0), linestyle='-', label='test')
>>> bob.measure.plot.det_axis([0.1, 80, 0.1, 80])
[-3.090232246772911, 0.8416212348748217, -3.090232246772911, 0.8416212348748217]
# plot the EER point on the DET curve (at the EER threshold, FAR and FRR are nearly equal)
>>> pyplot.plot(bob.measure.ppndf(FAR), bob.measure.ppndf(FRR), 'ro')
[<matplotlib.lines.Line2D object at 0x7f472a4dc510>]
But when I plot the ROC and DET curves, I get curves with an EER of 6% or more than 20%:
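As an independent sanity check (sketch only, not part of the session above), the FAR/FRR trade-off can be recomputed with plain numpy by sweeping the threshold over every observed score value; if this reproduces the ~2.76% obtained from eer_threshold/farfrr, the discrepancy would seem to come from how the plotting routines sample the curve rather than from the scores themselves. The FAR/FRR convention assumed here is the usual one (accept when score >= threshold):

import numpy

scores = numpy.load('fusion_all_200_datatset.npy')
positives, negatives = scores[0], scores[1]

neg_sorted = numpy.sort(negatives)
pos_sorted = numpy.sort(positives)

# candidate thresholds: every distinct observed score value
thresholds = numpy.unique(numpy.concatenate([negatives, positives]))

# FAR: fraction of negatives at or above the threshold
# FRR: fraction of positives strictly below the threshold
far = 1.0 - numpy.searchsorted(neg_sorted, thresholds, side='left') / float(len(negatives))
frr = numpy.searchsorted(pos_sorted, thresholds, side='left') / float(len(positives))

# the point of this curve closest to FAR == FRR
i = numpy.argmin(numpy.abs(far - frr))
print('threshold = %g, FAR = %.4f, FRR = %.4f' % (thresholds[i], far[i], frr[i]))
print('EER read off this curve: %.2f%%' % (100 * (far[i] + frr[i]) / 2))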