Commit ed45d3d9 authored by André Anjos

Initial port to conda-based CI/CD

parent d4043ba1
include LICENSE.AGPL README.rst buildout.cfg bootstrap-buildout.py
include LICENSE.AGPL README.rst version.txt requirements.txt
include buildout.cfg develop.cfg
recursive-include scripts *.sh
recursive-include doc conf.py *.rst *.png *.svg *.ico *.odg *.pdf *.dot
recursive-include beat/core/schema *.json
recursive-include beat/core/prototypes *.json *.py
@@ -20,159 +20,45 @@
.. You should have received a copy of the GNU Affero Public License along ..
.. with the BEAT platform. If not, see http://www.gnu.org/licenses/. ..
.. image:: https://img.shields.io/badge/docs-stable-yellow.svg
   :target: https://www.idiap.ch/software/beat/docs/beat/beat.core/stable/index.html
.. image:: https://img.shields.io/badge/docs-latest-orange.svg
   :target: https://www.idiap.ch/software/beat/docs/beat/beat.core/master/index.html
.. image:: https://gitlab.idiap.ch/beat/beat.core/badges/master/build.svg
   :target: https://gitlab.idiap.ch/beat/beat.core/commits/master
.. image:: https://gitlab.idiap.ch/beat/beat.core/badges/master/coverage.svg
   :target: https://gitlab.idiap.ch/beat/beat.core/commits/master
.. image:: https://img.shields.io/badge/gitlab-project-0000c0.svg
   :target: https://gitlab.idiap.ch/beat/beat.core
.. image:: https://img.shields.io/pypi/v/beat.core.svg
   :target: https://pypi.python.org/pypi/beat.core
============================================
Biometrics Evaluation and Testing Platform
============================================
This package contains the source code for the core components of the BEAT
platform.
==========================
Core Components for BEAT
==========================
This package is part of BEAT_, an open-source evaluation platform for data
science algorithms and workflows. It contains the source code for its core
components.
Installation
------------
Really easy, with ``zc.buildout``::
$ python bootstrap-buildout.py
$ ./bin/buildout
These two commands should download and install all required dependencies and
give you a fully operational test and development environment.
.. note::

   The Python interpreter used in the first line of the previous command set
   determines the interpreter that will be used for all scripts developed
   inside this package.

   If you are on the Idiap filesystem, you may use
   ``/idiap/project/beat/beat.env.deploy/usr/bin/python`` to bootstrap this
   package instead. It contains the same setup deployed on the final BEAT
   machinery.
Docker
======
This package depends on Docker_ and uses it to run user algorithms in a
container with the required software stack. You must install the Docker_ engine
and make sure the user running tests has access to it.
In particular, this package controls memory and CPU utilisation of the
containers it launches. You must make sure to enable those functionalities on
your installation.
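As a quick sanity check (a sketch only; the image and limit values are
arbitrary, and ``--cpus`` requires Docker 1.13 or later), you can start a
short-lived container with explicit resource limits and verify the engine does
not complain about missing cgroup support::

   $ docker run --rm --memory=256m --cpus=1 alpine:latest echo "resource limits OK"

If Docker warns that memory or CPU limits are not supported, enable the
corresponding kernel/cgroup options before running the test suite.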
Docker Setup
============
Complete BEAT's `installation`_ instructions. Then, to install this package,
run::
Make sure you have the ``docker`` command available on your system. For certain
operating systems, it is necessary to install ``docker`` via an external
virtual machine (a.k.a. the *docker machine*). Follow the instructions at `the
docker website <https://docs.docker.com/engine/installation/>`_ before trying to
execute algorithms or experiments.
$ conda install beat.core
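If you prefer to keep things isolated, you can also install into a dedicated
conda environment (a sketch only; the environment name is arbitrary and it
assumes the channels from the installation instructions are already
configured)::

   $ conda create -n beat python=3 beat.core
   $ conda activate beat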
We use specific docker images to run user algorithms. Download the following
base images before you try to run tests or experiments on your computer::
$ docker pull docker.idiap.ch/beat/beat.env.system.python:1.1.2
$ docker pull docker.idiap.ch/beat/beat.env.db.examples:1.1.1
$ docker pull docker.idiap.ch/beat/beat.env.client:1.2.0
$ docker pull docker.idiap.ch/beat/beat.env.cxx:1.0.2
Optionally, also download the following images to be able to re-run experiments
downloaded from the BEAT platform (not required for unit testing)::
$ docker pull docker.idiap.ch/beat/beat.env.python:0.0.4
$ docker pull docker.idiap.ch/beat/beat.env.python:1.0.0
$ docker pull docker.idiap.ch/beat/beat.env.db:1.2.2
Documentation
-------------
To build the documentation, just do::
$ ./bin/sphinx-apidoc --separate -d 2 --output=doc/api beat beat/core/test beat/core/scripts
$ ./bin/sphinx-build doc sphinx
Testing
Contact
-------
After installation, it is possible to run our suite of unit tests. To do so,
use ``nose``::
$ ./bin/nosetests -sv
.. note::

   Some of the tests for our command-line toolkit require a running BEAT
   platform web server, with a compatible ``beat.core`` installed (preferably
   the same version). By default, these tests will be skipped. If you want to
   run them, you must set up a development web server and set the environment
   variable ``BEAT_CORE_TEST_PLATFORM`` to point to its address. For example::

      $ export BEAT_CORE_TEST_PLATFORM="http://example.com/platform/"
      $ ./bin/nosetests -sv
It is **not** advisable to run tests against a production web server.
If you want to skip slow tests (at least those pulling stuff from our servers)
or those executing lengthy operations, just do::
$ ./bin/nosetests -sv -a '!slow'
To measure the test coverage, do the following::
$ ./bin/nosetests -sv --with-coverage --cover-package=beat.core
To produce an HTML test coverage report in the directory ``./htmlcov``, do the
following::
$ ./bin/nosetests -sv --with-coverage --cover-package=beat.core --cover-html --cover-html-dir=htmlcov
Our documentation is also interspersed with doctest units. You can run them
using Sphinx::

   $ ./bin/sphinx-build -b doctest doc sphinx
Development
-----------
Indentation
===========
You can enforce PEP8_ compliance using the application ``autopep8``. For
example, to enforce compliance on a single file and edit it in place, do::
$ ./bin/autopep8 --indent-size=2 --in-place beat/core/utils.py
We normally use 2-space indentation. If needed, you can easily change the
indentation to 4 spaces like this::
$ ./bin/autopep8 --indent-size=4 --in-place beat/core/utils.py
Profiling
=========
In order to profile the test code, try the following::
$ ./bin/python -mcProfile -oprof.data ./bin/nosetests -sv ...
This will dump the profiling data to ``prof.data``. You can then inspect its
contents with the ``pstats`` module::
$ ./bin/python -mpstats prof.data
This starts an interactive session in which you can sort and print the
profiling statistics as you see fit.
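The same data can also be inspected programmatically; a minimal sketch,
assuming the ``prof.data`` file produced above::

   import pstats

   stats = pstats.Stats('prof.data')
   # show the ten entries with the largest cumulative time
   stats.sort_stats('cumulative').print_stats(10)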
For questions or to report issues with this software package, contact our
development `mailing list`_.
.. References go here
.. _pep8: https://www.python.org/dev/peps/pep-0008/
.. _docker: https://www.docker.com/
.. Place your references here:
.. _beat: https://www.idiap.ch/software/beat
.. _installation: https://www.idiap.ch/software/beat/install
.. _mailing list: https://www.idiap.ch/software/beat/discuss
##############################################################################
#
# Copyright (c) 2006 Zope Foundation and Contributors.
# All Rights Reserved.
#
# This software is subject to the provisions of the Zope Public License,
# Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution.
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
##############################################################################
"""Bootstrap a buildout-based project
Simply run this script in a directory containing a buildout.cfg.
The script accepts buildout command-line options, so you can
use the -c option to specify an alternate configuration file.
"""
import os
import shutil
import sys
import tempfile
from optparse import OptionParser
tmpeggs = tempfile.mkdtemp()
usage = '''\
[DESIRED PYTHON FOR BUILDOUT] bootstrap.py [options]
Bootstraps a buildout-based project.
Simply run this script in a directory containing a buildout.cfg, using the
Python that you want bin/buildout to use.
Note that by using --find-links to point to local resources, you can keep
this script from going over the network.
'''
parser = OptionParser(usage=usage)
parser.add_option("-v", "--version", help="use a specific zc.buildout version")
parser.add_option("-t", "--accept-buildout-test-releases",
dest='accept_buildout_test_releases',
action="store_true", default=False,
help=("Normally, if you do not specify a --version, the "
"bootstrap script and buildout gets the newest "
"*final* versions of zc.buildout and its recipes and "
"extensions for you. If you use this flag, "
"bootstrap and buildout will get the newest releases "
"even if they are alphas or betas."))
parser.add_option("-c", "--config-file",
help=("Specify the path to the buildout configuration "
"file to be used."))
parser.add_option("-f", "--find-links",
help=("Specify a URL to search for buildout releases"))
parser.add_option("--allow-site-packages",
action="store_true", default=False,
help=("Let bootstrap.py use existing site packages"))
parser.add_option("--setuptools-version",
help="use a specific setuptools version")
options, args = parser.parse_args()
######################################################################
# load/install setuptools
try:
    if options.allow_site_packages:
        import setuptools
        import pkg_resources
    from urllib.request import urlopen
except ImportError:
    from urllib2 import urlopen

ez = {}
exec(urlopen('https://bootstrap.pypa.io/ez_setup.py').read(), ez)

if not options.allow_site_packages:
    # ez_setup imports site, which adds site packages
    # this will remove them from the path to ensure that incompatible versions
    # of setuptools are not in the path
    import site
    # inside a virtualenv, there is no 'getsitepackages'.
    # We can't remove these reliably
    if hasattr(site, 'getsitepackages'):
        for sitepackage_path in site.getsitepackages():
            sys.path[:] = [x for x in sys.path if sitepackage_path not in x]

setup_args = dict(to_dir=tmpeggs, download_delay=0)

if options.setuptools_version is not None:
    setup_args['version'] = options.setuptools_version

ez['use_setuptools'](**setup_args)
import setuptools
import pkg_resources

# This does not (always?) update the default working set.  We will
# do it.
for path in sys.path:
    if path not in pkg_resources.working_set.entries:
        pkg_resources.working_set.add_entry(path)
######################################################################
# Install buildout

ws = pkg_resources.working_set

cmd = [sys.executable, '-c',
       'from setuptools.command.easy_install import main; main()',
       '-mZqNxd', tmpeggs]

find_links = os.environ.get(
    'bootstrap-testing-find-links',
    options.find_links or
    ('http://downloads.buildout.org/'
     if options.accept_buildout_test_releases else None)
    )
if find_links:
    cmd.extend(['-f', find_links])

setuptools_path = ws.find(
    pkg_resources.Requirement.parse('setuptools')).location
requirement = 'zc.buildout'
version = options.version
if version is None and not options.accept_buildout_test_releases:
    # Figure out the most recent final version of zc.buildout.
    import setuptools.package_index
    _final_parts = '*final-', '*final'

    def _final_version(parsed_version):
        try:
            return not parsed_version.is_prerelease
        except AttributeError:
            # Older setuptools
            for part in parsed_version:
                if (part[:1] == '*') and (part not in _final_parts):
                    return False
            return True

    index = setuptools.package_index.PackageIndex(
        search_path=[setuptools_path])
    if find_links:
        index.add_find_links((find_links,))
    req = pkg_resources.Requirement.parse(requirement)
    if index.obtain(req) is not None:
        best = []
        bestv = None
        for dist in index[req.project_name]:
            distv = dist.parsed_version
            if _final_version(distv):
                if bestv is None or distv > bestv:
                    best = [dist]
                    bestv = distv
                elif distv == bestv:
                    best.append(dist)
        if best:
            best.sort()
            version = best[-1].version

if version:
    requirement = '=='.join((requirement, version))
cmd.append(requirement)

import subprocess
if subprocess.call(cmd, env=dict(os.environ, PYTHONPATH=setuptools_path)) != 0:
    raise Exception(
        "Failed to execute command:\n%s" % repr(cmd)[1:-1])
######################################################################
# Import and run buildout

ws.add_entry(tmpeggs)
ws.require(requirement)
import zc.buildout.buildout

if not [a for a in args if '=' not in a]:
    args.append('bootstrap')

# if -c was provided, we push it back into args for buildout's main function
if options.config_file is not None:
    args[0:0] = ['-c', options.config_file]

zc.buildout.buildout.main(args)
shutil.rmtree(tmpeggs)
{% set name = 'beat.core' %}
{% set project_dir = environ.get('RECIPE_DIR') + '/..' %}

package:
  name: {{ name }}
  version: {{ environ.get('BOB_PACKAGE_VERSION', '0.0.1') }}

build:
  entry_points:
    - worker = beat.core.scripts.worker:main
  number: {{ environ.get('BOB_BUILD_NUMBER', 0) }}
  run_exports:
    - {{ pin_subpackage(name) }}
  script:
    - cd {{ project_dir }}
    {% if environ.get('BUILD_EGG') %}
    - python setup.py sdist --formats=zip
    {% endif %}
    - python setup.py install --single-version-externally-managed --record record.txt

requirements:
  host:
    - python {{ python }}
    - setuptools {{ setuptools }}
  run:
    - python
    - setuptools
    - docker
    - docopt
    - graphviz
    - jsonschema
    - numpy
    - pip
    - pyzmq
    - simplejson
    - six
    - beat.backend.python

test:
  requires:
    - beat-devel {{ beat_devel }}.*
    - bob.extension
    - nose
    - coverage
    - sphinx
    - sphinx_rtd_theme
  imports:
    - {{ name }}
  commands:
    - worker --help
    - nosetests --with-coverage --cover-package={{ name }} -sv {{ name }}
    - sphinx-build -aEW {{ project_dir }}/doc {{ project_dir }}/sphinx
    - sphinx-build -aEb doctest {{ project_dir }}/doc sphinx
    - conda inspect linkages -p $PREFIX {{ name }}  # [not win]
    - conda inspect objects -p $PREFIX {{ name }}  # [osx]

about:
  home: https://www.idiap.ch/software/beat/
  license: AGPLv3
  summary: Core modules and definitions for the BEAT platform
  license_family: AGPL
@@ -7,7 +7,6 @@ develop = .
newest = false
eggs = beat.core
beat.backend.python
ipdb
[sources]
beat.backend.python = git https://gitlab.idiap.ch/beat/beat.backend.python branch=1.5.x
@@ -18,7 +17,7 @@ recipe = bob.buildout:scripts
[docker_images]
recipe = collective.recipe.cmd
cmds = ./buildout_pull_images.sh
uninstall_cmds =
uninstall_cmds =
on_install = true
on_update = true
@@ -882,6 +882,4 @@ the data block on the output.
return True
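For illustration only, a user algorithm that writes a data block on its output
and signals success could look roughly like this (a sketch; the class layout
and field names are assumptions, not a verbatim excerpt from the platform
documentation)::

   class Algorithm:

       def process(self, inputs, outputs):
           # read the current block on the (hypothetical) 'in' input
           value = inputs['in'].data.value
           # write a data block on the (hypothetical) 'out' output
           outputs['out'].write({'value': value})
           # returning True tells the platform the block was processed successfully
           return True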
.. Place your references here
.. _json: http://en.wikipedia.org/wiki/JSON
.. include:: links.rst
@@ -154,10 +154,10 @@ Further to those files, it is prudent to include:
Message Passing
---------------
The BEAT infrastructure communicates with the ``bin/execute`` process via
`Zero Message Queue`_ or 0MQ for short. `0MQ`_ provides a portable
bidirectional communication layer between the BEAT infrastructure and the
target backend, with many `language bindings`_, including `python bindings`_.
The BEAT infrastructure communicates with the ``bin/execute`` process via `Zero
Message Queue`_ or ZMQ for short. ZMQ_ provides a portable bidirectional
communication layer between the BEAT infrastructure and the target backend,
with many `language bindings`_, including `python bindings`_.
The user process, which manages the data readout of a given algorithm, sends
commands back to the infrastructure for requesting data when needed.
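As a rough sketch of the transport layer only (the socket type, address and
payloads below are illustrative assumptions, not the exact BEAT wire
protocol), such an exchange over ZMQ could look like::

   import zmq

   context = zmq.Context()
   socket = context.socket(zmq.PAIR)         # bidirectional link to the infrastructure
   socket.connect('tcp://127.0.0.1:5555')    # hypothetical address handed to the user process

   # ask for the next data block on a named channel, then wait for the reply
   socket.send_multipart([b'nxt', b'channel'])
   reply = socket.recv_multipart()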
@@ -187,26 +187,10 @@ The next diagram represents some possible states between the BEAT
infrastructure and the ``execute`` process in case of a successful execution:
.. msc::
.. _beat-core-backend-msc:
.. figure:: ./img/execute.*
hscale = "2.0";
io [label="BEAT Infrastructure (language agnostic)"], up [label="Execute (user code)"];
up->io [ label = "nxt channel" ];
io->up [ label = "2 name1 <bin1> name2 <bin2>" ];
up->io [ label = "hmd channel name" ];
io->up [ label = "tru" ];
up->io [ label = "oic name" ];
io->up [ label = "tru" ];
up->io [ label = "wrt out sz <bin>" ];
io->up [ label = "ack" ];
...;
up->io [ label = "hmd channel name" ];
io->up [ label = "fal" ];
up->io [ label = "don" ];
...;
io->up [ label = "ack" ];
Message Sequence Chart between BEAT agents and user containers/algorithms
In the remainder of this section, we describe the various commands, which are
@@ -437,15 +421,4 @@ indicating a problem with the infrastructure was detected and that system
administrators were informed.
.. Place your references from this point onwards
.. _beat.env.python27: http://gitlab.idiap.ch/biometric/beat.env.python27
.. _beat.backend.python: http://gitlab.idiap.ch/biometric/beat.backend.python
.. _python 2.7: http://www.python.org
.. _0mq: http://zeromq.org
.. _zmq: http://zeromq.org
.. _zero message queue: http://zeromq.org
.. _language bindings: http://zeromq.org/bindings:_start
.. _python bindings: http://zeromq.org/bindings:python
.. _markdown: http://daringfireball.net/projects/markdown/
.. _restructuredtext: http://docutils.sourceforge.net/rst.html
.. include:: links.rst
@@ -25,16 +25,11 @@
# #
###############################################################################
import os
import sys
import glob
import pkg_resources
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
# -- General configuration -----------------------------------------------------
@@ -44,25 +39,38 @@ needs_sphinx = '1.3'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = [
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.ifconfig',
'sphinx.ext.autodoc',
'sphinx.ext.autosummary',
'sphinx.ext.doctest',
'sphinx.ext.intersphinx',
'sphinx.ext.graphviz',
'sphinx.ext.intersphinx',
'sphinx.ext.napoleon',
'sphinx.ext.viewcode',
'sphinxcontrib.mscgen',
'sphinx.ext.mathjax',
#'matplotlib.sphinxext.plot_directive'
]
import sphinx
if sphinx.__version__ >= "1.4.1":
    extensions.append('sphinx.ext.imgmath')
else:
    extensions.append('sphinx.ext.pngmath')
# Be picky about warnings
nitpicky = True
# Ignores stuff we can't easily resolve on other project's sphinx manuals
nitpick_ignore = []
# Allows the user to override warnings from a separate file
if os.path.exists('nitpick-exceptions.txt'):
    for line in open('nitpick-exceptions.txt'):
        if line.strip() == "" or line.startswith("#"):
            continue
        dtype, target = line.split(None, 1)
        target = target.strip()
        try:  # python 2.x
            target = unicode(target)
        except NameError:
            pass
        nitpick_ignore.append((dtype, target))
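# For reference, a (hypothetical) nitpick-exceptions.txt contains one
# "<domain:role> <target>" pair per line, matching the parsing above, e.g.:
#
#   py:class beat.core.hypothetical.PrivateClass
#   py:exc RuntimeError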
# Always includes todos
todo_include_todos = True
@@ -118,10 +126,7 @@ release = distribution.version
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = [
'api/modules.rst',
'api/beat.rst',
]
exclude_patterns = ['links.rst']
# The reST default role (used for this markup: `text`) to use for all documents.
#default_role = None
@@ -145,7 +150,7 @@ pygments_style = 'sphinx'
# Some variables which are useful for generated material
project_variable = project.replace('.', '_')
short_description = u'Biometrics Evaluation and Testing Platform (Core Modules)'
short_description = u'Core modules and definitions for the BEAT platform'
owner = [u'Idiap Research Institute']
@@ -173,12 +178,12 @@ html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = 'img/beat.svg'
html_logo = 'img/logo.png'
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
html_favicon = 'img/beat.ico'
html_favicon = 'img/favicon.ico'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
@@ -230,67 +235,15 @@ html_favicon = 'img/beat.ico'
htmlhelp_basename = project_variable + u'_doc'
# -- Options for LaTeX output --------------------------------------------------
# The paper size ('letter' or 'a4').
latex_paper_size = 'a4'
# The font size ('10pt', '11pt' or '12pt').
latex_font_size = '10pt'
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [(
'index',
project_variable + '.tex',
short_description,
owner[0],
'manual',
)]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
latex_logo = ''
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Additional stuff for the LaTeX preamble.
#latex_preamble = ''
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Post configuration --------------------------------------------------------
# Included after all input documents
rst_epilog = """
.. |project| replace:: BEAT
.. |url| replace:: https://www.beat-eu.org/platform/
.. |version| replace:: %s
.. |current-year| date:: %%Y
""" % (version,)
# -- Options for manual page output --------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [(
'index',
project_variable,
short_description,
owner,
1,
)]
# Default processing flags for sphinx
autoclass_content = 'class'
autodoc_member_order = 'bysource'
@@ -300,19 +253,40 @@ autodoc_default_flags = [
'show-inheritance',
]
python_version = '.'.join(str(k) for k in sys.version_info[0:2])
python_link = 'http://docs.python.org/' + python_version + '/'
numpy_link = 'http://docs.scipy.org/doc/numpy/'
intersphinx_mapping = dict(
python=(python_link, None),
numpy=(numpy_link, None),
)
# For inter-documentation mapping:
from bob.extension.utils import link_documentation, load_requirements
sphinx_requirements = "extra-intersphinx.txt"
if os.path.exists(sphinx_requirements):
    intersphinx_mapping = link_documentation(
        additional_packages=['python', 'numpy'] + \
        load_requirements(sphinx_requirements)
        )
else:
    intersphinx_mapping = link_documentation()
# Adds simplejson, pyzmq links
intersphinx_mapping['http://simplejson.readthedocs.io/en/stable/'] = None
intersphinx_mapping['http://pyzmq.readthedocs.io/en/stable/'] = None
intersphinx_mapping['http://six.readthedocs.io'] = None
intersphinx_mapping['http://python-jsonschema.readthedocs.io/en/stable/'] = None
intersphinx_mapping['https://docker-py.readthedocs.io/en/stable/'] = None
# We want to remove all private (i.e. _. or __.__) members
# that are not in the list of accepted functions
accepted_private_functions = ['__array__']
def member_function_test(app, what, name, obj, skip, options):
    # test if we have a private function