bob / bob.learn.tensorflow · Merge request !35

Created a mechanism that allows us to train only parts of the graph during graph adaptation

Merged · Tiago de Freitas Pereira requested to merge shutting-down-parts-of-the-network into master · Nov 27, 2017

Hey,

Sometimes, when we want to apply a checkpoint to another dataset (or domain), we may want to choose which parts of the graph to re-train.

With this MR, our estimators (Logits..., Triplet, Siamese) have been extended with the keyword argument extra_checkpoint:

        extra_checkpoint = {
            "checkpoint_path": <YOUR_CHECKPOINT>,
            "scopes": {"<SOURCE_SCOPE>/": "<TARGET_SCOPE>/"},
            "trainable_variables": [<LIST OF VARIABLES OR SCOPES THAT YOU WANT TO RETRAIN>]
        }

The novelty here is trainable_variables, which lets us choose the parts of the graph through which back-propagation is done. If you set an empty list ("trainable_variables": []), no variables will be trainable. If this key is not set at all, everything is trainable.
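
For concreteness, here is a minimal sketch of how the new keyword could be passed to one of the estimators. The checkpoint path, the scope names, and the exact constructor arguments of Logits (as well as the dummy architecture import) are illustrative assumptions, not values taken from this MR, so check them against the package before copying.

        import tensorflow as tf
        from bob.learn.tensorflow.estimators import Logits
        from bob.learn.tensorflow.network import dummy  # architecture function (name assumed)

        # Load the variables saved under "Dummy/" in a previous run, but only allow
        # back-propagation through the fully connected layer "fc1".
        extra_checkpoint = {
            "checkpoint_path": "/path/to/previous/model_dir",  # hypothetical path
            "scopes": {"Dummy/": "Dummy/"},                    # source scope -> target scope
            "trainable_variables": ["fc1"],                    # [] would freeze the whole graph
        }

        estimator = Logits(
            architecture=dummy,
            optimizer=tf.train.GradientDescentOptimizer(1e-4),
            loss_op=tf.losses.sparse_softmax_cross_entropy,
            n_classes=10,
            model_dir="/path/to/new/model_dir",
            extra_checkpoint=extra_checkpoint,
        )

Note that the entries of trainable_variables have to match the names that the architecture function checks (see the sketch further below).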

This is strongly dependent on how the architecture function is crafted. Have a look at the examples below for how such functions should be written; a sketch of the pattern follows the list.

  • Simple example: https://gitlab.idiap.ch/bob/bob.learn.tensorflow/blob/9a1cd4eda365f1470eab4ce9d8b49bb0278a0ae4/bob/learn/tensorflow/network/Dummy.py
  • Complex one: https://gitlab.idiap.ch/bob/bob.learn.tensorflow/blob/9a1cd4eda365f1470eab4ce9d8b49bb0278a0ae4/bob/learn/tensorflow/network/InceptionResnetV2.py
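
As mentioned above, here is a sketch of the pattern (not the actual code from Dummy.py): each layer decides its own trainable flag from the trainable_variables list. The helper is_trainable and the layer and scope names below are illustrative assumptions.

        import tensorflow as tf
        import tensorflow.contrib.slim as slim

        def is_trainable(name, trainable_variables):
            # Key absent (None): everything is trainable.
            # Empty list: nothing is trainable.
            # Otherwise: only the listed layers/scopes are trainable.
            if trainable_variables is None:
                return True
            return name in trainable_variables

        def dummy(inputs, reuse=False, trainable_variables=None):
            with tf.variable_scope("Dummy", reuse=reuse):
                graph = slim.conv2d(
                    inputs, 10, [3, 3], scope="conv1",
                    trainable=is_trainable("conv1", trainable_variables))
                graph = slim.max_pool2d(graph, [4, 4], scope="pool1")
                graph = slim.flatten(graph, scope="flatten1")
                graph = slim.fully_connected(
                    graph, 50, scope="fc1",
                    trainable=is_trainable("fc1", trainable_variables))
            return graph

With this pattern, passing "trainable_variables": ["fc1"] in extra_checkpoint re-trains only the fully connected layer while keeping the convolutional filters frozen.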

Do you have some time to review this one, @amohammadi? Perhaps it can be useful for you.

Thanks
