This exercise explores the use of LSTM networks for the task of poetry generation.
To get it working, complete src/train.py. The LSTM cell is typically defined in terms of the quantities below.
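For reference, a standard formulation of the LSTM cell equations, written with the gate names used below, is (the exercise may use a slight variant, e.g. with or without peephole connections):

```latex
\begin{aligned}
\mathbf{z}_t &= \tanh(\mathbf{W}_z \mathbf{x}_t + \mathbf{R}_z \mathbf{h}_{t-1} + \mathbf{b}_z) && \text{block input} \\
\mathbf{i}_t &= \sigma(\mathbf{W}_i \mathbf{x}_t + \mathbf{R}_i \mathbf{h}_{t-1} + \mathbf{b}_i) && \text{input gate} \\
\mathbf{f}_t &= \sigma(\mathbf{W}_f \mathbf{x}_t + \mathbf{R}_f \mathbf{h}_{t-1} + \mathbf{b}_f) && \text{forget gate} \\
\mathbf{o}_t &= \sigma(\mathbf{W}_o \mathbf{x}_t + \mathbf{R}_o \mathbf{h}_{t-1} + \mathbf{b}_o) && \text{output gate} \\
\mathbf{c}_t &= \mathbf{z}_t \odot \mathbf{i}_t + \mathbf{c}_{t-1} \odot \mathbf{f}_t && \text{cell state} \\
\mathbf{h}_t &= \tanh(\mathbf{c}_t) \odot \mathbf{o}_t && \text{hidden state}
\end{aligned}
```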
The input at time step $t$ is denoted $\mathbf{x}_t \in \mathbb{R}^{n_i}$.
The candidate new state $\mathbf{z}_t$ is called the block input.
$\mathbf{i}$ denotes the input gate, $\mathbf{f}$ the forget gate, and $\mathbf{o}$ the output gate.
$\mathbf{W} \in \mathbb{R}^{n_h \times n_i}$ denotes the input weight matrices, and
$\mathbf{R} \in \mathbb{R}^{n_h \times n_h}$ denotes the recurrent weight matrices.
$\odot$ indicates element-wise products.
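To make the definitions concrete, here is a minimal NumPy sketch of a single LSTM step using the symbols above. The function name, the stacked weight layout, and the gate ordering (z, i, f, o) are illustrative assumptions, not the exercise's actual API:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x_t, h_prev, c_prev, W, R, b):
    """One LSTM time step.

    W: (4*n_h, n_i) stacked input weights, R: (4*n_h, n_h) stacked
    recurrent weights, b: (4*n_h,) biases; gate order z, i, f, o
    (an illustrative layout, not prescribed by the exercise).
    """
    n_h = h_prev.shape[0]
    s = W @ x_t + R @ h_prev + b       # all pre-activations at once
    z = np.tanh(s[:n_h])               # block input
    i = sigmoid(s[n_h:2 * n_h])        # input gate
    f = sigmoid(s[2 * n_h:3 * n_h])    # forget gate
    o = sigmoid(s[3 * n_h:])           # output gate
    c_t = z * i + c_prev * f           # element-wise products (the odot)
    h_t = np.tanh(c_t) * o             # new hidden state
    return h_t, c_t
```

Note that the element-wise products make each gate act as a per-component mask on the block input, the previous cell state, and the output.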
Once you have trained a model, run src/recurrent_poetry.py and enjoy!