
Neural Turing Machines

An attempt at replicating "Neural Turing Machines" (by Alex Graves, Greg Wayne, and Ivo Danihelka) in Keras.
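
The core of the paper is a differentiable external memory addressed by content and by location. As a rough illustration of what is being replicated, the NumPy sketch below renders that addressing pipeline (content focus, interpolation, convolutional shift, sharpening). It is not code from this repository, and the function and argument names are invented for clarity.

    import numpy as np

    def address_memory(memory, key, beta, g, shift, gamma, w_prev):
        """One step of NTM addressing (illustrative names, not the repo's API)."""
        eps = 1e-8
        # 1. Content addressing: cosine similarity against each memory row,
        #    sharpened by the key strength beta and normalised with a softmax.
        sim = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + eps)
        w_c = np.exp(beta * sim)
        w_c /= w_c.sum()
        # 2. Interpolation with the previous weighting, gated by g in [0, 1].
        w_g = g * w_c + (1.0 - g) * w_prev
        # 3. Circular convolution with a shift distribution over the N slots.
        n = len(w_g)
        w_s = np.array([sum(shift[j] * w_g[(i - j) % n] for j in range(n)) for i in range(n)])
        # 4. Sharpening with gamma >= 1 to counteract blurring from the shift.
        w = w_s ** gamma
        return w / w.sum()

    # Toy usage: 8 memory slots of width 4, uniform previous weighting.
    memory = np.random.randn(8, 4)
    w = address_memory(memory, key=np.random.randn(4), beta=5.0, g=0.9,
                       shift=np.eye(8)[1], gamma=2.0, w_prev=np.full(8, 1 / 8))
    print(w.round(3), w.sum())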

Prerequisites

Results

Algorithm Learning

Repeat Copy
(Figure: NTM memory use during the copy task. A sketch of this task's data format follows below.)

Associative Recall (in progress)

Priority Sort (in progress)
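
For reference, the repeat copy task presents a random binary sequence followed by a repeat count and asks the network to reproduce the sequence that many times. The generator below is a small illustrative sketch of that input/target layout, not the data pipeline used by the scripts in this repository; the shapes and channel conventions are assumptions.

    import numpy as np

    def repeat_copy_example(seq_len=5, width=8, repeats=3):
        """Build one (input, target) pair for the repeat copy task.
        Input:  random binary sequence, then a repeat count on an extra channel.
        Target: the sequence repeated `repeats` times after the count marker."""
        seq = np.random.randint(0, 2, size=(seq_len, width))
        total = seq_len + 1 + seq_len * repeats        # sequence + count marker + output span
        x = np.zeros((total, width + 1))
        y = np.zeros((total, width))
        x[:seq_len, :width] = seq                      # present the sequence
        x[seq_len, width] = repeats                    # repeat count on the extra channel
        y[seq_len + 1:] = np.tile(seq, (repeats, 1))   # expected output after the marker
        return x, y

    x, y = repeat_copy_example()
    print(x.shape, y.shape)   # (21, 9) (21, 8) with the default arguments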

Usage

To train an LSTM on the repeat copy task:

    $ python learning_repeat_copy_lstm.py

To train an LSTM on the associative recall task:

    $ python learning_associative_recall_lstm.py

To train an LSTM on the priority sort task:

    $ python learning_priority_sort_lstm.py

To train an LSTM on all three tasks in sequence:

    $ python learning_algorithm_lstm.py
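
As a rough picture of what a script such as learning_repeat_copy_lstm.py might set up internally, the sketch below builds a minimal Keras baseline: an LSTM reads the input sequence and a time-distributed sigmoid layer predicts the target bits. The layer sizes, data shapes, and training settings are placeholders, not values taken from this repository.

    import numpy as np
    from keras.models import Sequential
    from keras.layers import LSTM, TimeDistributed, Dense

    # Placeholder dimensions; the real scripts define their own task parameters.
    time_steps, in_width, out_width = 21, 9, 8

    model = Sequential([
        LSTM(128, return_sequences=True, input_shape=(time_steps, in_width)),
        TimeDistributed(Dense(out_width, activation='sigmoid')),
    ])
    model.compile(loss='binary_crossentropy', optimizer='adam')

    # Dummy batch standing in for generated repeat-copy examples.
    x = np.random.randint(0, 2, size=(32, time_steps, in_width)).astype('float32')
    y = np.random.randint(0, 2, size=(32, time_steps, out_width)).astype('float32')
    model.fit(x, y, epochs=1, batch_size=8)

A real run would replace the dummy batch with generated task data and train for many more epochs.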

Other NTM Implementations

Future work

  • Training the NTM to learn the repeat copy task.
  • Training the NTM to learn associative recall.
  • Training the NTM to learn dynamic n-grams.
  • Training the NTM to learn priority sort.
  • Applying the NTM to other natural language processing tasks, such as neural language modeling.

Author

Zhibin Quan / @SigmaQuan
