Welcome to Blocks’ documentation!

Blocks is a framework that helps you build and manage neural network models using Theano.

Want to try it out? Start by installing Blocks and having a look at the quickstart further down this page. Once you’re hooked, try your hand at the tutorials and the examples.

Blocks is developed in parallel with Fuel, a dataset processing framework.

Warning

Blocks is a new project which is still under development. As such, certain (all) parts of the framework are subject to change. The latest stable (and thus likely outdated) version can be found in the stable branch.

Tip

That said, if you are interested in using Blocks and run into any problems, feel free to ask your question on the mailing list. Also, don’t hesitate to file bug reports and feature requests by opening a GitHub issue.

Quickstart

Construct your model.

>>> from blocks.bricks import MLP, Tanh, Softmax
>>> from blocks.initialization import IsotropicGaussian, Constant
>>> mlp = MLP(activations=[Tanh(), Softmax()], dims=[784, 100, 10],
...           weights_init=IsotropicGaussian(0.01), biases_init=Constant(0))
>>> mlp.initialize()
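
If you want to convince yourself that initialization did what you asked for, you can inspect the parameters directly. A minimal sketch, assuming the mlp defined above (the linear_transformations attribute holds the MLP’s Linear bricks):

>>> # After initialize(), each Linear brick's W and b are filled in.
>>> W, b = mlp.linear_transformations[0].parameters
>>> W.get_value().shape
(784, 100)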

Calculate your loss function.

>>> from theano import tensor
>>> from blocks.bricks.cost import CategoricalCrossEntropy, MisclassificationRate
>>> x = tensor.matrix('features')
>>> y = tensor.lmatrix('targets')
>>> y_hat = mlp.apply(x)
>>> cost = CategoricalCrossEntropy().apply(y.flatten(), y_hat)
>>> error_rate = MisclassificationRate().apply(y.flatten(), y_hat)
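
Note that cost and error_rate are still symbolic Theano expressions; nothing has been computed yet. If you want to evaluate them by hand, you can compile them into a callable, as in this quick sketch (the training loop below does its own compilation for you):

>>> import theano
>>> # Compiles a function mapping (features, targets) to the two metrics.
>>> monitor = theano.function([x, y], [cost, error_rate])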

Load your training data using Fuel.

>>> from fuel.datasets import MNIST
>>> from fuel.schemes import SequentialScheme
>>> from fuel.streams import DataStream
>>> from fuel.transformers import Flatten
>>> mnist_train = MNIST(("train",))
>>> train_stream = Flatten(
...     DataStream.default_stream(
...         dataset=mnist_train,
...         iteration_scheme=SequentialScheme(mnist_train.num_examples, 128)),
...     which_sources=('features',))
>>> mnist_test = MNIST(("test",))
>>> test_stream = Flatten(
...     DataStream.default_stream(
...         dataset=mnist_test,
...         iteration_scheme=SequentialScheme(mnist_test.num_examples, 1024)),
...     which_sources=('features',))
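
To sanity-check a stream you can pull a single batch out of it. A minimal sketch, assuming the train_stream defined above (with this scheme, each batch holds 128 flattened 784-dimensional images):

>>> # as_dict=True gives a {source_name: array} mapping per batch.
>>> batch = next(train_stream.get_epoch_iterator(as_dict=True))
>>> batch['features'].shape
(128, 784)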

And train!

>>> from blocks.algorithms import GradientDescent, Scale
>>> from blocks.extensions import FinishAfter, Printing
>>> from blocks.extensions.monitoring import DataStreamMonitoring
>>> from blocks.graph import ComputationGraph
>>> from blocks.main_loop import MainLoop
>>> from blocks.model import Model
>>> main_loop = MainLoop(
...     model=Model(cost), data_stream=train_stream,
...     algorithm=GradientDescent(
...         cost=cost, parameters=ComputationGraph(cost).parameters,
...         step_rule=Scale(learning_rate=0.1)),
...     extensions=[FinishAfter(after_n_epochs=5),
...                 DataStreamMonitoring(
...                     variables=[cost, error_rate],
...                     data_stream=test_stream,
...                     prefix="test"),
...                 Printing()])
>>> main_loop.run()

...

For a runnable version of this code, please see the MNIST demo in our examples repository.

Features

Currently Blocks supports and provides:

  • Constructing parametrized Theano operations, called “bricks”
  • Pattern matching to select variables and bricks in large models (see the sketch after this list)
  • Algorithms to optimize your model
  • Saving and resuming of training
  • Monitoring and analyzing values during training (on the training set as well as on test sets)
  • Application of graph transformations, such as dropout (limited support; see the sketch after this list)
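
As an illustration of variable selection and graph transformations, here is a minimal sketch, separate from the quickstart above, that assumes the mlp and cost variables defined there:

>>> from blocks.filter import VariableFilter
>>> from blocks.graph import ComputationGraph, apply_dropout
>>> from blocks.roles import INPUT, WEIGHT
>>> cg = ComputationGraph(cost)
>>> # Pattern matching: select variables by the role they play in the graph.
>>> weights = VariableFilter(roles=[WEIGHT])(cg.variables)
>>> len(weights)  # one weight matrix per Linear layer of the MLP
2
>>> # Graph transformation: build a copy of the graph in which the inputs
>>> # of each Linear layer are dropped with probability 0.5.
>>> layer_inputs = VariableFilter(
...     roles=[INPUT], bricks=mlp.linear_transformations)(cg.variables)
>>> dropped_cg = apply_dropout(cg, layer_inputs, 0.5)
>>> dropout_cost = dropped_cg.outputs[0]

Training on dropout_cost instead of cost then gives you dropout regularization without modifying the original bricks or graph.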

In the future we also hope to support:

  • Dimension, type and axes-checking
