nolearn.lasagne

Two introductory tutorials exist for nolearn.lasagne:

For specifics on classes and functions that come from the Lasagne package itself, such as layers, updates, and nonlinearities, refer to the Lasagne project's documentation.

nolearn.lasagne comes with a number of tests that demonstrate some of the more advanced features, such as networks with merge layers, and networks with multiple inputs.

Finally, there are a few presentations and examples from around the web. Note that some of these may require a specific version of nolearn and Lasagne to run:

API

class nolearn.lasagne.NeuralNet(layers, update=nesterov_momentum, loss=None, objective=objective, objective_loss_function=None, batch_iterator_train=BatchIterator(...), batch_iterator_test=BatchIterator(...), regression=False, max_epochs=100, train_split=TrainSplit(...), custom_score=None, X_tensor_type=None, y_tensor_type=None, use_label_encoder=False, on_epoch_finished=None, on_training_started=None, on_training_finished=None, more_params=None, verbose=0, **kwargs)

A scikit-learn estimator based on Lasagne.
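
As a quick illustration of the estimator interface, here is a minimal sketch of defining and training a small classifier. The layer sizes, learning rate, number of epochs, and toy data are arbitrary choices made for this example:

from lasagne.layers import InputLayer, DenseLayer
from lasagne.nonlinearities import softmax
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet
import numpy as np

# Toy data: 100 samples with 20 features, two classes.
X = np.random.rand(100, 20).astype(np.float32)
y = np.random.randint(0, 2, size=100).astype(np.int32)

net = NeuralNet(
    layers=[
        (InputLayer, {'shape': (None, 20)}),
        (DenseLayer, {'num_units': 32}),
        (DenseLayer, {'num_units': 2, 'nonlinearity': softmax}),
    ],
    update=nesterov_momentum,
    update_learning_rate=0.01,
    max_epochs=10,
    verbose=1,
)

net.fit(X, y)            # train like any scikit-learn estimator
y_pred = net.predict(X)  # predicted class labels

Like other scikit-learn estimators, the fitted net can also be used with predict_proba() and plugged into the usual scikit-learn tools such as pipelines and grid search.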

class nolearn.lasagne.BatchIterator(batch_size)

Standard batch iterator.

Given a reference to the X data and the y target values, it is the batch iterator's job to generate mini-batches from them in its __iter__() method.

Here’s a simple example:

>>> import numpy as np
>>> from nolearn.lasagne import BatchIterator
>>> X, y = np.arange(18).reshape((9, 2)), np.arange(9)
>>> bi = BatchIterator(batch_size=3)
>>> for Xb, yb in bi(X, y):
...     print(Xb)
...     print(yb)
[[0 1]
 [2 3]
 [4 5]]
[0 1 2]
[[ 6  7]
 [ 8  9]
 [10 11]]
[3 4 5]
[[12 13]
 [14 15]
 [16 17]]
[6 7 8]

This implementation also knows how to deal with X inputs that are dictionaries mapping input names to arrays. Note that in this example we pass only X, and no y, to the batch iterator, which also works:

>>> X1 = np.arange(18).reshape((9, 2))
>>> X2 = np.arange(18)[::-1].reshape((9, 2))
>>> bi = BatchIterator(batch_size=3)
>>> for Xb, yb in bi({'X1': X1, 'X2': X2}):
...     pass
>>> print(Xb['X1'])
[[12 13]
 [14 15]
 [16 17]]
>>> print(Xb['X2'])
[[5 4]
 [3 2]
 [1 0]]
>>> assert yb is None
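
A common way to extend the batch iterator is to subclass it and override its transform() method, which receives each (Xb, yb) batch before it is handed to the network. Here is a minimal sketch that flips a random half of each image batch horizontally; it assumes X batches shaped (batch size, channels, rows, columns):

import numpy as np
from nolearn.lasagne import BatchIterator

class FlipBatchIterator(BatchIterator):
    def transform(self, Xb, yb):
        # Let the base class do its work first.
        Xb, yb = super(FlipBatchIterator, self).transform(Xb, yb)
        # Flip a random half of the images in this batch horizontally.
        Xb = Xb.copy()
        half = np.random.choice(Xb.shape[0], Xb.shape[0] // 2, replace=False)
        Xb[half] = Xb[half, :, :, ::-1]
        return Xb, yb

Such a subclass is then passed to NeuralNet as the batch_iterator_train argument, e.g. batch_iterator_train=FlipBatchIterator(batch_size=128).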
class nolearn.lasagne.TrainSplit(eval_size, stratify=True)
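
TrainSplit is responsible for splitting the data into a training and a validation set before training begins; eval_size is the fraction of samples held out for validation, and, as the signature suggests, stratify controls whether class proportions are preserved in the split. A sketch of overriding the default split, reusing the small network from the earlier example:

from lasagne.layers import InputLayer, DenseLayer
from lasagne.nonlinearities import softmax
from nolearn.lasagne import NeuralNet, TrainSplit

net = NeuralNet(
    layers=[
        (InputLayer, {'shape': (None, 20)}),
        (DenseLayer, {'num_units': 32}),
        (DenseLayer, {'num_units': 2, 'nonlinearity': softmax}),
    ],
    update_learning_rate=0.01,
    # Hold out 25% of the training data for validation instead of the default.
    train_split=TrainSplit(eval_size=0.25),
    max_epochs=10,
)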
