Two introductory tutorials exist for nolearn.lasagne:
For specifics around classes and functions out of the lasagne package, such as layers, updates, and nonlinearities, you’ll want to look at the Lasagne project’s documentation.
nolearn.lasagne comes with a number of tests that demonstrate some of the more advanced features, such as networks with merge layers, and networks with multiple inputs.
Finally, there are a few presentations and examples from around the web. Note that some of these may require a specific version of nolearn and Lasagne to run:
A scikit-learn estimator based on Lasagne.
Standard batch iterator.
Given a reference to the X data and y target variables, it’s the batch iterator’s job to split these into mini-batches in its __iter__() method.
Here’s a simple example:
>>> import numpy as np
>>> from nolearn.lasagne import BatchIterator
>>> X, y = np.arange(18).reshape((9, 2)), np.arange(9)
>>> bi = BatchIterator(batch_size=3)
>>> for Xb, yb in bi(X, y):
...     print(Xb)
...     print(yb)
[[0 1]
 [2 3]
 [4 5]]
[0 1 2]
[[ 6  7]
 [ 8  9]
 [10 11]]
[3 4 5]
[[12 13]
 [14 15]
 [16 17]]
[6 7 8]
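To make the slicing behavior above concrete, here is a minimal, self-contained sketch of what such a batch iterator does. This is an illustration in plain NumPy, not nolearn’s actual implementation; the function name iter_batches is hypothetical.

```python
import numpy as np

def iter_batches(X, y=None, batch_size=3):
    """Yield (Xb, yb) mini-batches by slicing X (and y, when given)
    in order -- a simplified sketch of BatchIterator's __iter__(),
    not the actual nolearn code."""
    n_samples = len(X)
    for start in range(0, n_samples, batch_size):
        sl = slice(start, start + batch_size)
        Xb = X[sl]
        yb = y[sl] if y is not None else None
        yield Xb, yb

X, y = np.arange(18).reshape((9, 2)), np.arange(9)
batches = list(iter_batches(X, y, batch_size=3))
# Nine samples with batch_size=3 yield three batches of three rows each.
print(len(batches))
```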
This implementation also knows how to deal with X inputs that are dictionaries mapping input names to arrays. Note that in this example we pass only X, and no y, to the batch iterator, which also works:
>>> X1 = np.arange(18).reshape((9, 2))
>>> X2 = np.arange(18)[::-1].reshape((9, 2))
>>> bi = BatchIterator(batch_size=3)
>>> for Xb, yb in bi({'X1': X1, 'X2': X2}):
...     pass
>>> print(Xb['X1'])
[[12 13]
 [14 15]
 [16 17]]
>>> print(Xb['X2'])
[[5 4]
 [3 2]
 [1 0]]
>>> assert yb is None
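The dictionary case boils down to applying the same slice to every named array. The following sketch shows that idea in isolation; slice_X is a hypothetical helper, not nolearn’s exact code.

```python
import numpy as np

def slice_X(X, sl):
    """Apply a slice to X, which may be a single array or a dict
    mapping input names to arrays -- a sketch of the dictionary
    handling described above, not nolearn's implementation."""
    if isinstance(X, dict):
        return {name: arr[sl] for name, arr in X.items()}
    return X[sl]

X1 = np.arange(18).reshape((9, 2))
X2 = np.arange(18)[::-1].reshape((9, 2))
# Slice out the last batch (rows 6..8), as in the doctest above.
Xb = slice_X({'X1': X1, 'X2': X2}, slice(6, 9))
print(Xb['X1'])
print(Xb['X2'])
```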