Commit 2e863924 authored by Manuel Günther's avatar Manuel Günther

New README and documentation strategy.

parent 816abecf
......@@ -6,6 +6,7 @@ matrix:
- secure: gX4Yr/32b9tOVFiAxjzFsbIm1VWiB6WJ1rXOd4hCd6ncBaXw83e4B/jaP1Ci2uKRi6PYO9y7AHj8y64OcTeS2Uwr+5QECRSRgf6+AKCde4VgKoh5wUpVOp4QLaZlJLNM6L/EwGWih5K2yoM/17cRLyGQC/x6QID/ZfEjwtZni2M=
- secure: kq32QuV+vu0Qaqo1wn4UvUNAglNF4HGJv96XNwf263G+uZekCMpx6DJdiVJouxoYRcOsgkoJgbc1rxAFGUZ7PZ4gfP6srQ9i3ZT7H3zghVBr8ULwmi9+TR58W/P+KBZl/lLVSsdKeMVfT2hJ9XpF1P6G8GM/3UkTP3Rp0t8SIug=
- python: 3.2
- NUMPYSPEC===1.7.1
......@@ -15,7 +16,7 @@ matrix:
- sudo add-apt-repository -y ppa:biometrics/bob
- sudo apt-get update -qq
- sudo apt-get install -qq --force-yes libboost-all-dev libblitz1-dev libhdf5-serial-dev libatlas-dev libatlas-base-dev liblapack-dev
- sudo apt-get install -qq --force-yes libboost-all-dev libblitz1-dev libhdf5-serial-dev libatlas-dev libatlas-base-dev liblapack-dev texlive-latex-recommended texlive-latex-extra texlive-fonts-recommended
- if [ -n "${NUMPYSPEC}" ]; then sudo apt-get install -qq gfortran; fi
- if [ -n "${NUMPYSPEC}" ]; then pip install --upgrade pip setuptools; fi
- if [ -n "${NUMPYSPEC}" ]; then pip install --find-links --find-links --use-wheel numpy$NUMPYSPEC; fi
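Each guarded line above uses the same shell pattern: the step runs only when the ``NUMPYSPEC`` environment variable is non-empty. A standalone sketch of the guard (the value below is hypothetical, not taken from this build matrix):

```shell
# Run a step only when NUMPYSPEC is set. Note that in the matrix entry
# "NUMPYSPEC===1.7.1" the first "=" is the shell assignment, so the
# variable holds the pip version pin "==1.7.1".
NUMPYSPEC="==1.7.1"

install_cmd=""
if [ -n "${NUMPYSPEC}" ]; then
  install_cmd="pip install numpy${NUMPYSPEC}"
fi
echo "${install_cmd}"
```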
......@@ -2,78 +2,36 @@
.. Andre Anjos <>
.. Thu 24 Apr 17:24:10 2014 CEST
.. image::
Python bindings for Bob's Multi-Layer Perceptron and Trainers
Multi-Layer Perceptron and Trainers in Bob
This package contains a set of Pythonic bindings for Bob's MLP and Trainers.
Install it through normal means, via PyPI, or use ``zc.buildout`` to bootstrap
the package and run its unit tests.
To install this package -- alone or together with other `Packages of Bob <>`_ -- please read the `Installation Instructions <>`_.
For Bob_ to work properly, some dependent packages must be installed.
Please make sure that you have read the `Dependencies <>`_ for your operating system.
For further documentation on this package, please read the `Stable Version <>`_ or the `Latest Version <>`_ of the documentation.
For a list of tutorials on this or the other packages of Bob_, or information on submitting issues, asking questions and starting discussions, please visit its website.
The latest version of the documentation can be found `here <>`_.
Otherwise, you can generate the documentation for this package yourself, after installation, using Sphinx::
$ sphinx-build -b html doc sphinx
This will place the current version of the package documentation in the
``sphinx`` directory.
You can run a set of tests using the nose test runner::
$ nosetests -sv
.. warning::
If Bob <= 1.2.1 is installed on your python path, nose will automatically
load the old version of the insulate plugin available in Bob, which will
trigger the loading of incompatible shared libraries (from Bob itself) into
your working binary. This will cause a stack corruption. Either remove
the centrally installed version of Bob, or build your own version of Python
in which Bob <= 1.2.1 is not installed.
You can run our documentation tests using sphinx itself::
$ sphinx-build -b doctest doc sphinx
You can test overall test coverage with::
$ nosetests --with-coverage --cover-package=bob.learn.mlp
The ``coverage`` egg must be installed for this to work properly.
To develop this package, install using ``zc.buildout``, using the buildout
configuration found on the root of the package::
$ python
$ ./bin/buildout
Tweak the options in ``buildout.cfg`` to enable or disable verbosity and debugging.
.. _bob:
......@@ -33,12 +33,12 @@ class Machine:
The activation function to use for the hidden neurons of the network.
Should be one of the classes derived from
The activation function to use for the output neurons of the network.
Should be one of the classes derived from
if bias is None:
......@@ -112,7 +112,7 @@ class TrainableMachine(Machine):
This is the error back-propagated through the last neuron by any of the
available :py:class:`bob.trainer.Cost` functors. Every row in b matches
available :py:class:`bob.learn.mlp.Cost` functors. Every row in b matches
one example.
......@@ -220,7 +220,7 @@ PyDoc_STRVAR(s_cost_object_str, "cost_object");
"An object, derived from :py:class:`bob.learn.mlp.Cost` (e.g.\n\
:py:class:`bob.learn.mlp.SquareError` or \n\
:py:class:`bob.trainer.CrossEntropyLoss`), that is used to evaluate\n\
:py:class:`bob.learn.mlp.CrossEntropyLoss`), that is used to evaluate\n\
the cost (a.k.a. *loss*) and the derivatives given the input, the\n\
target and the MLP structure.");
......@@ -19,6 +19,7 @@ develop = src/bob.extension
; options for bob.buildout extension
debug = true
verbose = true
newest = false
bob.extension = git
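The options above live in the ``buildout.cfg`` at the package root. A minimal sketch of how they fit together — section names and the recipe line are assumed from the usual ``bob.buildout`` layout, not taken from this diff:

```ini
[buildout]
parts = scripts
extensions = bob.buildout
develop = src/bob.extension .

; options for bob.buildout extension
debug = true
verbose = true
newest = false

[sources]
bob.extension = git

[scripts]
recipe = bob.buildout:scripts
```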
......@@ -249,36 +249,11 @@ autoclass_content = 'both'
autodoc_member_order = 'bysource'
autodoc_default_flags = ['members', 'undoc-members', 'inherited-members', 'show-inheritance']
def smaller_than(v1, v2):
  """Compares scipy/numpy version numbers"""
  c1 = v1.split('.')
  c2 = v2.split('.')[:len(c1)] # clip to the compared version
  for i, k in enumerate(c2):
    n1 = c1[i]
    n2 = c2[i]
    try:
      n1 = int(n1)
      n2 = int(n2)
    except ValueError:
      n1 = str(n1)
      n2 = str(n2)
    if n1 > n2: return False
  return True
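For illustration, a self-contained copy of this comparator: it walks the version components left to right and returns False only when a component of ``v1`` exceeds the matching component of ``v2``.

```python
def smaller_than(v1, v2):
    """Compares dotted version strings component-wise."""
    c1 = v1.split('.')
    c2 = v2.split('.')[:len(c1)]  # clip to the compared length
    for i, k in enumerate(c2):
        n1, n2 = c1[i], c2[i]
        try:
            # numeric components are compared as ints ...
            n1, n2 = int(n1), int(n2)
        except ValueError:
            # ... and a non-numeric component such as 'z' falls back
            # to plain string comparison
            n1, n2 = str(n1), str(n2)
        if n1 > n2:
            return False
    return True

assert smaller_than('1.4.1', '1.5.z')      # no component of v1 exceeds v2's
assert not smaller_than('1.7.1', '1.5.z')  # 7 > 5 in the second component
```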
# Some name mangling to find the correct sphinx manuals for some packages
numpy_version = __import__('numpy').version.version
if smaller_than(numpy_version, '1.5.z'):
  numpy_version = '.'.join(numpy_version.split('.')[:-1]) + '.x'
else:
  numpy_version = '.'.join(numpy_version.split('.')[:-1]) + '.0'
numpy_manual = '' % numpy_version

# For inter-documentation mapping:
intersphinx_mapping = {
  '' % sys.version_info[:2]: None,
  numpy_manual: None,
  }
from bob.extension.utils import link_documentation
# the documentation also links to bob.learn.linear
intersphinx_mapping = link_documentation(['python', 'numpy', 'bob.learn.linear'])
def setup(app):
......@@ -100,7 +100,7 @@ this example:
Once the network weights and biases are set, we can feed forward an example
through this machine. This is done using the ``()`` operator, like for a
.. doctest::
......@@ -120,7 +120,7 @@ available MLP trainers in two different 2D `NumPy`_ arrays, one for the input
>>> t0 = numpy.array([[.0]]) # target
The class used to train an MLP [1]_ with backpropagation [2]_ is
:py:class:`bob.learn.MLP.BackProp`. An example is shown below.
:py:class:`bob.learn.mlp.BackProp`. An example is shown below.
.. doctest::
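The arithmetic behind one ``BackProp`` step on the example above can be mimicked in plain NumPy. This is only an illustrative sketch, not the ``bob.learn.mlp`` API: the 2-input/1-output shape and the input ``[.3, .7]`` / target ``[.0]`` come from the example, while the tanh activation, square-error cost, random initialization and 0.1 learning rate are assumptions made here.

```python
import numpy

rng = numpy.random.RandomState(0)
x0 = numpy.array([.3, .7])   # input from the example
t0 = numpy.array([.0])       # target from the example

w = rng.uniform(-1, 1, size=(2, 1))  # weights (assumed random init)
b = numpy.zeros(1)                   # bias

def forward(x):
    # single linear layer followed by a tanh activation
    return numpy.tanh(x.dot(w) + b)

y = forward(x0)
cost_before = 0.5 * numpy.sum((y - t0) ** 2)  # square error

# backpropagate: dC/dy = (y - t), and d tanh(a)/da = 1 - y**2
delta = (y - t0) * (1 - y ** 2)
w -= 0.1 * numpy.outer(x0, delta)  # gradient step, assumed rate 0.1
b -= 0.1 * delta

cost_after = 0.5 * numpy.sum((forward(x0) - t0) ** 2)
```

A single step along the negative gradient with this small learning rate lowers the square error on the training example.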
......@@ -21,7 +21,7 @@ setup(
description='Bindings for bob.machine\'s Multi-layer Perceptron and Trainers',
description='Bob\'s Multi-layer Perceptron and Trainers',
author='Andre Anjos',
......@@ -39,7 +39,7 @@ setup(
ext_modules = [
......@@ -97,13 +97,14 @@ setup(
classifiers = [
'Development Status :: 3 - Alpha',
'Framework :: Bob',
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Topic :: Software Development :: Libraries :: Python Modules',