Commit 5de07640 authored by Tiago de Freitas Pereira's avatar Tiago de Freitas Pereira

Fixing CI

parent b7c1d437
Copyright (c) 2016 Idiap Research Institute, http://www.idiap.ch/
Written by Andre Anjos <andre.anjos@idiap.ch>
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. Neither the name of the copyright holder nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
include README.rst bootstrap-buildout.py buildout.cfg COPYING version.txt requirements.txt
recursive-include doc *.py *.rst
recursive-include bob *.wav *.hdf5
recursive-include bob *.wav *.hdf5 *.pickle *.meta *.ckp
\ No newline at end of file
# see https://docs.python.org/3/library/pkgutil.html
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
from bob.learn.tensorflow import analyzers
from bob.learn.tensorflow import datashuffler
from bob.learn.tensorflow import initialization
from bob.learn.tensorflow import layers
from bob.learn.tensorflow import loss
from bob.learn.tensorflow import network
from bob.learn.tensorflow import trainers
from bob.learn.tensorflow import utils
__path__ = extend_path(__path__, __name__)
\ No newline at end of file
# gets sphinx autodoc done right - don't remove it
__all__ = [_ for _ in dir() if not _.startswith('_')]
def get_config():
"""
Returns a string containing the configuration information.
"""
import bob.extension
return bob.extension.get_config(__name__)
\ No newline at end of file
from .BaseLoss import BaseLoss
from .ContrastiveLoss import ContrastiveLoss
from .TripletLoss import TripletLoss
......
from .SequenceNetwork import SequenceNetwork
from .Lenet import Lenet
from .Chopra import Chopra
......
# see https://docs.python.org/3/library/pkgutil.html
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
\ No newline at end of file
@@ -16,7 +16,7 @@ The example consists in training a very simple **CNN** with `MNIST` dataset in 4
1. Preparing your input data
.. doctest::
.. code-block:: python
>>> import tensorflow as tf
>>> import bob.learn.tensorflow
@@ -28,7 +28,7 @@ The example consists in training a very simple **CNN** with `MNIST` dataset in 4
2. Create an architecture
.. doctest::
.. code-block:: python
>>> architecture = bob.learn.tensorflow.network.SequenceNetwork()
>>> architecture.add(bob.learn.tensorflow.layers.Conv2D(name="conv1", kernel_size=3, filters=10, activation=tf.nn.tanh))
@@ -36,7 +36,7 @@ The example consists in training a very simple **CNN** with `MNIST` dataset in 4
3. Defining a loss and training
.. doctest::
.. code-block:: python
>>> loss = bob.learn.tensorflow.loss.BaseLoss(tf.nn.sparse_softmax_cross_entropy_with_logits, tf.reduce_mean)
>>> trainer = bob.learn.tensorflow.trainers.Trainer(architecture=architecture, loss=loss, iterations=100, temp_dir="./cnn")
@@ -45,7 +45,7 @@ The example consists in training a very simple **CNN** with `MNIST` dataset in 4
4. Predicting and computing the accuracy
.. doctest::
.. code-block:: python
>>> # Loading the model
>>> architecture = bob.learn.tensorflow.network.SequenceNetwork()
@@ -124,7 +124,7 @@ The data can be fetched either from the memory (:py:class:`bob.learn.tensorflow.
disk (:py:class:`bob.learn.tensorflow.datashuffler.Disk`).
To train networks fetched from disk, your training data must be a list of paths, as in the example below:
.. doctest::
.. code-block:: python
>>> train_data = ['./file/id1_0.jpg', './file/id1_1.jpg', './file/id2_1.jpg']
>>> train_labels = [0, 0, 1]
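A shuffler working from such a list only needs to sample indices and load the selected files per batch. A minimal, hypothetical sketch of the sampling step (`sample_batch` is not part of the library; actual image loading is omitted):

```python
import random

# Hypothetical helper (not part of bob.learn.tensorflow): sample a
# mini-batch of (path, label) pairs from parallel lists.
def sample_batch(paths, labels, batch_size, rng=random):
    indices = rng.sample(range(len(paths)), batch_size)
    return [paths[i] for i in indices], [labels[i] for i in indices]

train_data = ['./file/id1_0.jpg', './file/id1_1.jpg', './file/id2_1.jpg']
train_labels = [0, 0, 1]
batch_paths, batch_labels = sample_batch(train_data, train_labels, 2)
```

The files themselves are only opened once a batch is selected, which is what makes disk-based training feasible for datasets that do not fit in memory.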
@@ -171,7 +171,7 @@ The library has already some crafted networks implemented in `Architectures <py_
It is also possible to craft simple MLPs with this library using the class :py:class:`bob.learn.tensorflow.network.MLP`.
The example below shows how to create a simple MLP with 10 outputs and 2 hidden layers.
.. doctest::
.. code-block:: python
>>> architecture = bob.learn.tensorflow.network.MLP(10, hidden_layers=[20, 40])
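For intuition only, such an MLP chains two hidden affine layers with a nonlinearity and a final output layer (input → 20 → 40 → 10 units). The numpy sketch below illustrates the shape flow with random weights; it is not the library's implementation:

```python
import numpy as np

# Generic forward pass of a 2-hidden-layer MLP (widths 20 and 40,
# 10 outputs), mirroring MLP(10, hidden_layers=[20, 40]) in spirit.
# Weights are random; this is an illustration, not trained code.
def mlp_forward(x, widths=(20, 40), n_outputs=10, seed=0):
    rng = np.random.default_rng(seed)
    h = x
    for w in widths:
        W = rng.standard_normal((h.shape[-1], w))
        h = np.tanh(h @ W)  # hidden layer: affine map + nonlinearity
    W_out = rng.standard_normal((h.shape[-1], n_outputs))
    return h @ W_out  # output layer: affine map only

out = mlp_forward(np.ones((5, 28 * 28)))  # batch of 5 flattened MNIST-sized images
```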
@@ -221,7 +221,7 @@ Loss
Loss functions must be wrapped as :py:class:`bob.learn.tensorflow.loss.BaseLoss` objects.
For instance, to use the sparse softmax cross-entropy loss between logits and labels, do the following:
.. doctest::
.. code-block:: python
>>> loss = BaseLoss(tf.nn.sparse_softmax_cross_entropy_with_logits, tf.reduce_mean)
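Conceptually, the wrapper stores an elementwise loss operation and a reduction and applies them in sequence. A plain-Python sketch of this pattern (the `LossWrapper` class below is a hypothetical stand-in, not `BaseLoss` itself, and uses no TensorFlow):

```python
# Hypothetical stand-in for BaseLoss: store an elementwise loss
# operation and a reduction, then apply them in sequence when called.
class LossWrapper:
    def __init__(self, operation, reduction):
        self.operation = operation
        self.reduction = reduction

    def __call__(self, predictions, targets):
        per_example = [self.operation(p, t) for p, t in zip(predictions, targets)]
        return self.reduction(per_example)

# Example: squared error reduced by the mean, mirroring the shape of
# BaseLoss(tf.nn.sparse_softmax_cross_entropy_with_logits, tf.reduce_mean).
squared_error = lambda p, t: (p - t) ** 2
mean = lambda xs: sum(xs) / len(xs)
loss_fn = LossWrapper(squared_error, mean)
value = loss_fn([1.0, 2.0], [0.0, 2.0])  # ((1 - 0)**2 + (2 - 2)**2) / 2 = 0.5
```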
......
@@ -2,93 +2,66 @@
# vim: set fileencoding=utf-8 :
# Andre Anjos <andre.anjos@idiap.ch>
# Mon 16 Apr 08:18:08 2012 CEST
#
# Copyright (C) Idiap Research Institute, Martigny, Switzerland
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, version 3 of the License.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# This file contains the python (distutils/setuptools) instructions so your
# package can be installed on **any** host system. It defines some basic
# information like the package name for instance, or its homepage.
#
# It also defines which other packages this python package depends on and that
# are required for this package's operation. The python subsystem will make
# sure all dependent packages are installed or will install them for you upon
# the installation of this package.
#
# The 'buildout' system we use here will go further and wrap this package in
# such a way to create an isolated python working environment. Buildout will
# make sure that dependencies which are not yet installed do get installed, but
# **without** requiring administrative privileges on the host system. This
# allows you to test your package with new python dependencies w/o requiring
# administrative interventions.
from setuptools import setup
from setuptools import setup, dist
dist.Distribution(dict(setup_requires=['bob.extension']))
from bob.extension.utils import load_requirements, find_packages
install_requires = load_requirements()
# The only thing we do in this file is to call the setup() function with all
# parameters that define our package.
setup(
# This is the basic information about your project. Modify all this
# information before releasing code publicly.
name = 'bob.learn.tensorflow',
version = open("version.txt").read().rstrip(),
description = 'Bob bindings for tensorflow',
name='bob.learn.tensorflow',
version=open("version.txt").read().rstrip(),
description='Bob bindings for tensorflow',
url = '',
license = 'BSD',
author = 'Tiago de Freitas Pereira',
author_email = 'tiago.pereira@idiap.ch',
keywords = 'tensorflow',
url='',
license='BSD',
author='Tiago de Freitas Pereira',
author_email='tiago.pereira@idiap.ch',
keywords='tensorflow',
# If you have a better, long description of your package, place it on the
# 'doc' directory and then hook it here
long_description = open('README.rst').read(),
long_description=open('README.rst').read(),
# This line is required for any distutils based packaging.
include_package_data = True,
include_package_data=True,
# This line defines which packages should be installed when you "install"
# this package. All packages that are mentioned here, but are not installed
# on the current system will be installed locally and only visible to the
# scripts of this package. Don't worry - You won't need administrative
# privileges when using buildout.
install_requires = [
'setuptools',
'docopt',
'ipython',
'bob.db.mnist',
],
install_requires=install_requires,
packages=find_packages(),
zip_safe=False,
entry_points = {
entry_points={
# scripts should be declared using this entry:
'console_scripts': [
'compute_statistics.py = bob.learn.tensorflow.script.compute_statistics:main'
],
# scripts should be declared using this entry:
'console_scripts': [
'compute_statistics.py = bob.learn.tensorflow.script.compute_statistics:main'
],
},
# Classifiers are important if you plan to distribute this package through
# PyPI. You can find the complete list of classifiers that are valid and
# useful here (http://pypi.python.org/pypi?%3Aaction=list_classifiers).
classifiers = [
'Framework :: Tensorflow',
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Programming Language :: Python',
'Topic :: Scientific/Engineering :: Artificial Intelligence',
classifiers=[
'Framework :: Tensorflow',
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Programming Language :: Python',
'Topic :: Scientific/Engineering :: Artificial Intelligence',
],
)