TensorFlow implementation of a bi-directional RNN language model, based on the paper [Contextual Bidirectional Long Short-Term Memory Recurrent Neural Network Language Models: A Generative Approach to Sentiment Analysis].
- Python 3
- TensorFlow
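The bi-directional model (`--model birnn`) follows the paper's idea of predicting each word from both its left context (a forward LSTM) and its right context (a backward LSTM). A minimal TensorFlow 1.x sketch of that idea is shown below; the variable names, layer sizes, and optimizer are illustrative assumptions, not the repository's actual code.

```python
import tensorflow as tf

vocab_size, embedding_size, num_hidden = 10000, 128, 256   # illustrative sizes, not the repo's defaults

# token ids, shape [batch, time]; `lengths` holds the true length of each sequence
inputs = tf.placeholder(tf.int32, [None, None], name="inputs")
lengths = tf.placeholder(tf.int32, [None], name="lengths")

embedding = tf.get_variable("embedding", [vocab_size, embedding_size])
embedded = tf.nn.embedding_lookup(embedding, inputs)         # [batch, time, embedding_size]

cell_fw = tf.nn.rnn_cell.LSTMCell(num_hidden)
cell_bw = tf.nn.rnn_cell.LSTMCell(num_hidden)
(out_fw, out_bw), _ = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, embedded, sequence_length=lengths, dtype=tf.float32)

# Shift the forward outputs one step right and the backward outputs one step left,
# so position t only sees the words before t (forward) and after t (backward).
pad = tf.zeros_like(out_fw[:, :1, :])
left_context = tf.concat([pad, out_fw[:, :-1, :]], axis=1)
right_context = tf.concat([out_bw[:, 1:, :], pad], axis=1)
context = tf.concat([left_context, right_context], axis=2)   # [batch, time, 2 * num_hidden]

logits = tf.layers.dense(context, vocab_size)                # per-position scores over the vocabulary
losses = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=inputs, logits=logits)
mask = tf.sequence_mask(lengths, maxlen=tf.shape(inputs)[1], dtype=tf.float32)
loss = tf.reduce_sum(losses * mask) / tf.reduce_sum(mask)    # mean per-word NLL; exp(loss) is perplexity
train_op = tf.train.AdamOptimizer(0.001).minimize(loss)
```

The one-step shifts keep the target word out of its own context; without them the backward pass would let the model trivially copy the word it is supposed to predict.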
The Penn Treebank (PTB) dataset is used for training and testing. The `ptb_data` directory is a copy of the `data/` directory of the PTB dataset from Tomas Mikolov's webpage.
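For reference, a typical way to turn the PTB text files into id sequences looks like the sketch below; the file name `ptb_data/ptb.train.txt`, the `<eos>` marker, and the helper names are assumptions rather than the repository's actual preprocessing code.

```python
import collections

def load_ptb(path):
    # one sentence per line; append an explicit end-of-sentence token
    with open(path, "r") as f:
        return [line.strip().split() + ["<eos>"] for line in f]

def build_vocab(sentences):
    # most frequent word gets id 0, next gets id 1, and so on
    counter = collections.Counter(w for s in sentences for w in s)
    return {w: i for i, (w, _) in enumerate(counter.most_common())}

train_sents = load_ptb("ptb_data/ptb.train.txt")              # assumed file location
word_to_id = build_vocab(train_sents)
train_ids = [[word_to_id[w] for w in s] for s in train_sents]
print(len(word_to_id), "unique tokens")                       # PTB uses a ~10k-word vocabulary
```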
    $ python train.py

To see the available training options:

    $ python train.py -h
    usage: train.py [-h] [--model MODEL] [--embedding_size EMBEDDING_SIZE]
                    [--num_layers NUM_LAYERS] [--num_hidden NUM_HIDDEN]
                    [--keep_prob KEEP_PROB] [--learning_rate LEARNING_RATE]
                    [--batch_size BATCH_SIZE] [--num_epochs NUM_EPOCHS]

    optional arguments:
      -h, --help            show this help message and exit
      --model MODEL         rnn | birnn
      --embedding_size EMBEDDING_SIZE
                            embedding size.
      --num_layers NUM_LAYERS
                            RNN network depth.
      --num_hidden NUM_HIDDEN
                            RNN network size.
      --keep_prob KEEP_PROB
                            dropout keep prob.
      --learning_rate LEARNING_RATE
                            learning rate.
      --batch_size BATCH_SIZE
                            batch size.
      --num_epochs NUM_EPOCHS
                            number of epochs.
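For illustration, a parser that produces a help message like the one above could be defined as follows; the default values shown are placeholders, not the repository's actual defaults.

```python
import argparse

# Placeholder defaults for illustration only; the real train.py may differ.
parser = argparse.ArgumentParser()
parser.add_argument("--model", type=str, default="birnn", help="rnn | birnn")
parser.add_argument("--embedding_size", type=int, default=128, help="embedding size.")
parser.add_argument("--num_layers", type=int, default=1, help="RNN network depth.")
parser.add_argument("--num_hidden", type=int, default=256, help="RNN network size.")
parser.add_argument("--keep_prob", type=float, default=0.5, help="dropout keep prob.")
parser.add_argument("--learning_rate", type=float, default=0.001, help="learning rate.")
parser.add_argument("--batch_size", type=int, default=20, help="batch size.")
parser.add_argument("--num_epochs", type=int, default=10, help="number of epochs.")
args = parser.parse_args()
```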
Result plot legend:
- Orange line: LSTM language model
- Blue line: bi-directional LSTM language model