#!/usr/bin/env python
# vim: set fileencoding=utf-8 :
# Andre Anjos
# Sun 2 Jun 16:42:52 2013

"""Test activation functions"""

import numpy
import math

from . import (
    Identity,
    Linear,
    Logistic,
    HyperbolicTangent,
    MultipliedHyperbolicTangent,
    Activation,
)
from nose.tools import assert_raises


def estimate_gradient(f, x, epsilon=1e-4, args=()):
    r"""Estimates the gradient for a given callable ``f``

    Suppose you have a function :math:`f'(x)` that purportedly computes
    :math:`\frac{\partial f(x)}{\partial x}`. You'd like to check if
    :math:`f'(x)` is outputting correct derivative values. You can then use
    this function to estimate the gradient around a point and compare it to
    the output of :math:`f'(x)`. The estimation can have a precision of up
    to a few decimal places.

    Imagine a random value for :math:`x`, called :math:`x_t` (for test). Now
    imagine you modify one of the elements in :math:`x_t` so that
    :math:`x_{t+\epsilon}` has that element added with a small (positive)
    value :math:`\epsilon` and :math:`x_{t-\epsilon}` has the same value
    subtracted. In this case, one can use a truncated Taylor expansion of
    the derivative to calculate the approximate supposed value:

    .. math::

       f'(x_t) \sim \frac{f(x_{t+\epsilon}) - f(x_{t-\epsilon})}{2\epsilon}

    The degree to which these two values should approximate each other will
    depend on the details of :math:`f(x)`. But assuming
    :math:`\epsilon = 10^{-4}`, you'll usually find that the left- and
    right-hand sides of the above will agree to at least 4 significant
    digits (and often many more).

    Keyword arguments:

    f
      The function which you'd like to have the gradient estimated for.

    x
      The input to ``f``. This must be the first parameter ``f`` receives.
      If that is not the case, you must write a wrapper around ``f`` so it
      does the parameter inversion and provide that wrapper instead. If
      ``f`` expects a multi-dimensional array, then this entry should be a
      :py:class:`numpy.ndarray` with as many dimensions as required for
      ``f``.

    epsilon
      The step used for the estimation.

    args (optional)
      Extra arguments (a tuple) to ``f``.

    This function returns the estimated value for :math:`f'(x)` given ``x``.

    .. note::

       Gradient estimation is a powerful tool for testing if a function is
       correctly computing the derivative of another function, but can be
       quite slow. It therefore is not a good replacement for writing
       specific code that can compute the derivative of ``f``.
    """

    if isinstance(x, numpy.ndarray):

        retval = numpy.ndarray(x.shape, dtype=x.dtype)
        for k in range(x.size):
            xt_plus = x.copy()
            xt_plus.flat[k] += epsilon
            xt_minus = x.copy()
            xt_minus.flat[k] -= epsilon
            retval.flat[k] = (f(xt_plus, *args) - f(xt_minus, *args)) / (2 * epsilon)
        return retval

    else:  # x is scalar
        return (f(x + epsilon, *args) - f(x - epsilon, *args)) / (2 * epsilon)


def is_close(x, y, eps=1e-10):
    return abs(x - y) < eps


def test_identity():

    op = Identity()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(op.f(k), k), "Identity does not perform identity %g != %g" % (
            op.f(k),
            k,
        )


def test_identity_derivative():

    op = Identity()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(
            op.f_prime(k), 1.0
        ), "Identity derivative is not equal to 1.: %g != 1." % (op.f_prime(k),)

    # tries to estimate the gradient and check
    for k in x:
        absdiff = abs(op.f_prime(k) - estimate_gradient(op.f, k))
        assert absdiff < 1e-4, (
            "Identity derivative and estimation do not match to 10^-4: |%g-%g| = %g"
            % (op.f_prime(k), estimate_gradient(op.f, k), absdiff)
        )


def test_linear():

    C = numpy.random.rand()
    op = Linear(C)
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(
            op.f(k), (C * k)
        ), "Linear does not match expected value: %g != %g" % (op.f(k), C * k)


def test_linear_derivative():

    C = numpy.random.rand()
    op = Linear(C)
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(op.f_prime(k), C), (
            "Linear derivative does not match expected value: %g != %g"
            % (op.f_prime(k), C)
        )

    # tries to estimate the gradient and check
    for k in x:
        absdiff = abs(op.f_prime(k) - estimate_gradient(op.f, k))
        assert absdiff < 1e-4, (
            "Linear derivative and estimation do not match to 10^-4: |%g-%g| = %g"
            % (op.f_prime(k), estimate_gradient(op.f, k), absdiff)
        )


def test_hyperbolic_tangent():

    op = HyperbolicTangent()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(op.f(k), math.tanh(k)), (
            "HyperbolicTangent does not match expected value: %g != %g"
            % (op.f(k), math.tanh(k))
        )


def test_hyperbolic_tangent_derivative():

    op = HyperbolicTangent()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        precise = 1 - op.f(k) ** 2
        assert is_close(op.f_prime(k), precise), (
            "HyperbolicTangent derivative does not match expected value: %g != %g"
            % (op.f_prime(k), precise)
        )

    # tries to estimate the gradient and check
    for k in x:
        absdiff = abs(op.f_prime(k) - estimate_gradient(op.f, k))
        assert absdiff < 1e-4, (
            "HyperbolicTangent derivative and estimation do not match to 10^-4: |%g-%g| = %g"
            % (op.f_prime(k), estimate_gradient(op.f, k), absdiff)
        )


def test_logistic():

    op = Logistic()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        precise = 1.0 / (1.0 + math.exp(-k))
        assert is_close(
            op.f(k), precise
        ), "Logistic does not match expected value: %g != %g" % (op.f(k), precise)


def test_logistic_derivative():

    op = Logistic()
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        precise = op.f(k) * (1 - op.f(k))
        assert is_close(op.f_prime(k), precise), (
            "Logistic derivative does not match expected value: %g != %g"
            % (op.f_prime(k), precise)
        )

    # tries to estimate the gradient and check
    for k in x:
        absdiff = abs(op.f_prime(k) - estimate_gradient(op.f, k))
        assert absdiff < 1e-4, (
            "Logistic derivative and estimation do not match to 10^-4: |%g-%g| = %g"
            % (op.f_prime(k), estimate_gradient(op.f, k), absdiff)
        )


def test_multiplied_tanh():

    C = numpy.random.rand()
    M = numpy.random.rand()
    op = MultipliedHyperbolicTangent(C, M)
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        assert is_close(op.f(k), C * math.tanh(M * k)), (
            "MultipliedHyperbolicTangent does not match expected value: %g != %g"
            % (op.f(k), C * math.tanh(M * k))
        )


def test_multiplied_tanh_derivative():

    C = numpy.random.rand()
    M = numpy.random.rand()
    op = MultipliedHyperbolicTangent(C, M)
    x = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    # go for an exact match
    for k in x:
        precise = C * M * (1 - math.pow(math.tanh(M * k), 2))
        assert is_close(op.f_prime(k), precise), (
            "MultipliedHyperbolicTangent derivative does not match expected value: %g != %g"
            % (op.f_prime(k), precise)
        )

    # tries to estimate the gradient and check
    for k in x:
        absdiff = abs(op.f_prime(k) - estimate_gradient(op.f, k))
        assert absdiff < 1e-4, (
            "MultipliedHyperbolicTangent derivative and estimation do not match to 10^-4: |%g-%g| = %g"
            % (op.f_prime(k), estimate_gradient(op.f, k), absdiff)
        )


def test_1d_ndarray():

    C = numpy.random.rand()
    op = Linear(C)
    X = numpy.random.rand(10)  # 10 random numbers between 0 and 1

    Y = op(X)
    assert Y.shape == X.shape
    assert Y.dtype == numpy.dtype(float)

    Y_f = op.f(X)
    assert Y_f.shape == X.shape
    assert Y_f.dtype == numpy.dtype(float)

    Y_f_prime = op.f_prime(X)
    assert Y_f_prime.shape == X.shape
    assert Y_f_prime.dtype == numpy.dtype(float)

    Y_f_prime_from_f = op.f_prime_from_f(X)
    assert Y_f_prime_from_f.shape == X.shape
    assert Y_f_prime_from_f.dtype == numpy.dtype(float)

    for k, x in enumerate(X):
        assert is_close(op(x), Y[k])
        assert is_close(op.f(x), Y_f[k])
        assert is_close(op.f_prime(x), Y_f_prime[k])
        assert is_close(op.f_prime_from_f(x), Y_f_prime_from_f[k])


def test_2d_ndarray():

    C = numpy.random.rand()
    op = Linear(C)
    X = numpy.random.rand(4, 4)
    Y = op(X)
    assert Y.shape == X.shape
    assert Y.dtype == numpy.dtype(float)

    Y_f = op.f(X)
    assert Y_f.shape == X.shape
    assert Y_f.dtype == numpy.dtype(float)

    Y_f_prime = op.f_prime(X)
    assert Y_f_prime.shape == X.shape
    assert Y_f_prime.dtype == numpy.dtype(float)

    Y_f_prime_from_f = op.f_prime_from_f(X)
    assert Y_f_prime_from_f.shape == X.shape
    assert Y_f_prime_from_f.dtype == numpy.dtype(float)

    for k, x in enumerate(X.flat):
        assert is_close(op(x), Y.flat[k])
        assert is_close(op.f(x), Y_f.flat[k])
        assert is_close(op.f_prime(x), Y_f_prime.flat[k])
        assert is_close(op.f_prime_from_f(x), Y_f_prime_from_f.flat[k])


def test_3d_ndarray():

    C = numpy.random.rand()
    op = Linear(C)
    X = numpy.random.rand(3, 3, 3)

    Y = op(X)
    assert Y.shape == X.shape
    assert Y.dtype == numpy.dtype(float)

    Y_f = op.f(X)
    assert Y_f.shape == X.shape
    assert Y_f.dtype == numpy.dtype(float)

    Y_f_prime = op.f_prime(X)
    assert Y_f_prime.shape == X.shape
    assert Y_f_prime.dtype == numpy.dtype(float)

    Y_f_prime_from_f = op.f_prime_from_f(X)
    assert Y_f_prime_from_f.shape == X.shape
    assert Y_f_prime_from_f.dtype == numpy.dtype(float)

    for k, x in enumerate(X.flat):
        assert is_close(op(x), Y.flat[k])
        assert is_close(op.f(x), Y_f.flat[k])
        assert is_close(op.f_prime(x), Y_f_prime.flat[k])
        assert is_close(op.f_prime_from_f(x), Y_f_prime_from_f.flat[k])


def test_4d_ndarray():

    C = numpy.random.rand()
    op = Linear(C)
    X = numpy.random.rand(2, 2, 2, 2)

    Y = op(X)
    assert Y.shape == X.shape
    assert Y.dtype == numpy.dtype(float)

    Y_f = op.f(X)
    assert Y_f.shape == X.shape
    assert Y_f.dtype == numpy.dtype(float)

    Y_f_prime = op.f_prime(X)
    assert Y_f_prime.shape == X.shape
    assert Y_f_prime.dtype == numpy.dtype(float)

    Y_f_prime_from_f = op.f_prime_from_f(X)
    assert Y_f_prime_from_f.shape == X.shape
    assert Y_f_prime_from_f.dtype == numpy.dtype(float)

    for k, x in enumerate(X.flat):
        assert is_close(op(x), Y.flat[k])
        assert is_close(op.f(x), Y_f.flat[k])
        assert is_close(op.f_prime(x), Y_f_prime.flat[k])
        assert is_close(op.f_prime_from_f(x), Y_f_prime_from_f.flat[k])


def test_to_dict():

    logistic = Logistic()
    assert logistic.to_dict()["id"] == "bob.learn.activation.Activation.Logistic"

    hyperbolic_tangent = HyperbolicTangent()
    assert (
        hyperbolic_tangent.to_dict()["id"]
        == "bob.learn.activation.Activation.HyperbolicTangent"
    )

    identity = Identity()
    assert identity.to_dict()["id"] == "bob.learn.activation.Activation.Identity"

    linear = Linear()
    assert linear.to_dict()["id"] == "bob.learn.activation.Activation.Linear"
    assert linear.to_dict()["C"] == 1

    multiplied_hyperbolic_tangent = MultipliedHyperbolicTangent()
    assert (
        multiplied_hyperbolic_tangent.to_dict()["id"]
        == "bob.learn.activation.Activation.MultipliedHyperbolicTangent"
    )
    assert multiplied_hyperbolic_tangent.to_dict()["C"] == 1.0
    assert multiplied_hyperbolic_tangent.to_dict()["M"] == 1.0


def test_from_dict():

    # The first 3 tests don't make much sense, but I'm testing them anyway
    input_dict = {"id": "bob.learn.activation.Activation.Logistic"}
    logistic = Activation.from_dict(input_dict)
    assert isinstance(logistic, Logistic)

    input_dict = {"id": "bob.learn.activation.Activation.HyperbolicTangent"}
    hyperbolic_tangent = Activation.from_dict(input_dict)
    assert isinstance(hyperbolic_tangent, HyperbolicTangent)

    input_dict = {"id": "bob.learn.activation.Activation.Identity"}
    identity = Activation.from_dict(input_dict)
    assert isinstance(identity, Identity)

    input_dict = {"id": "bob.learn.activation.Activation.Linear", "C": 2.0}
    linear = Activation.from_dict(input_dict)
    assert linear.C == 2

    input_dict = {
        "id": "bob.learn.activation.Activation.MultipliedHyperbolicTangent",
        "C": 2.0,
        "M": 3.0,
    }
    multiplied_hyperbolic_tangent = Activation.from_dict(input_dict)
    assert multiplied_hyperbolic_tangent.C == 2.0
    assert multiplied_hyperbolic_tangent.M == 3.0

    with assert_raises(ValueError):
        input_dict = {"id": "bob.learn.activation.Activation.Wrong"}
        Activation.from_dict(input_dict)
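

# The docstring of ``estimate_gradient`` claims that, with
# :math:`\epsilon = 10^{-4}`, the central difference
# ``(f(x + eps) - f(x - eps)) / (2 * eps)`` agrees with the true derivative
# to at least 4 significant digits.  The standalone check below is an
# addition that verifies this claim directly; it is a sketch relying only on
# ``math.tanh`` (whose exact derivative is ``1 - tanh(x)**2``), not on the
# activation classes above.
def test_central_difference_accuracy():

    eps = 1e-4
    for x in (-2.0, -0.5, 0.0, 0.7, 1.5):
        estimate = (math.tanh(x + eps) - math.tanh(x - eps)) / (2 * eps)
        exact = 1 - math.tanh(x) ** 2
        # central differences carry an O(eps**2) truncation error, so the
        # two values should agree much more tightly than the 1e-4
        # tolerance used by the derivative tests above
        assert abs(estimate - exact) < 1e-6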