Sequence to Sequence Model and Attention from Scratch

The goal is to build a sequence-to-sequence, LSTM-based model from scratch in TensorFlow, without using any of TensorFlow's existing contrib library implementations. About the only thing not built from scratch in this project is backpropagation, which is left to TensorFlow's automatic differentiation.
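To give an idea of the kind of low-level construction this involves, below is a minimal sketch of a single LSTM cell assembled only from matmul, sigmoid, and tanh ops. It is written against the TensorFlow 2.x eager API so it runs standalone, whereas the repository itself targets an older TensorFlow 1.x graph API; the class and variable names here are illustrative, not the repository's actual code.

```python
import tensorflow as tf


class LSTMCellFromScratch:
    """A single LSTM cell built only from low-level TensorFlow ops."""

    def __init__(self, input_size, hidden_size):
        # One fused weight matrix covering the input, forget, output and
        # candidate gates: [input_size + hidden_size] -> 4 * hidden_size.
        self.w = tf.Variable(tf.random.truncated_normal(
            [input_size + hidden_size, 4 * hidden_size], stddev=0.1))
        self.b = tf.Variable(tf.zeros([4 * hidden_size]))

    def __call__(self, x, h, c):
        # x: [batch, input_size], h and c: [batch, hidden_size]
        gates = tf.matmul(tf.concat([x, h], axis=1), self.w) + self.b
        i, f, o, g = tf.split(gates, 4, axis=1)
        c_new = tf.sigmoid(f) * c + tf.sigmoid(i) * tf.tanh(g)  # updated cell state
        h_new = tf.sigmoid(o) * tf.tanh(c_new)                  # updated hidden state
        return h_new, c_new


# Usage: unroll the cell over a short input sequence.
cell = LSTMCellFromScratch(input_size=8, hidden_size=16)
h = tf.zeros([4, 16])
c = tf.zeros([4, 16])
for t in range(5):
    x_t = tf.random.normal([4, 8])
    h, c = cell(x_t, h, c)
```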

A neural attention mechanism was also implemented and its results were compared against the plain sequence-to-sequence model. This project was done for learning purposes as part of the Deep Learning by Google course on Udacity.
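For illustration, the sketch below shows one common way to compute attention from scratch: an additive (Bahdanau-style) score over the encoder states, again using only low-level TensorFlow ops. This is a generic formulation and may differ from the exact attention variant implemented in this repository; the function name and weight shapes are assumptions for the example.

```python
import tensorflow as tf


def additive_attention(decoder_state, encoder_states, w_dec, w_enc, v):
    """Additive (Bahdanau-style) attention over encoder states.

    decoder_state:  [batch, hidden]           current decoder hidden state
    encoder_states: [batch, src_len, hidden]  all encoder hidden states
    w_dec, w_enc:   [hidden, attn_size]       learned projection matrices
    v:              [attn_size]               learned scoring vector
    """
    dec_proj = tf.expand_dims(tf.matmul(decoder_state, w_dec), axis=1)  # [batch, 1, attn]
    enc_proj = tf.einsum('bsh,ha->bsa', encoder_states, w_enc)          # [batch, src_len, attn]
    scores = tf.reduce_sum(v * tf.tanh(dec_proj + enc_proj), axis=2)    # [batch, src_len]
    weights = tf.nn.softmax(scores, axis=1)                             # attention distribution
    context = tf.reduce_sum(tf.expand_dims(weights, -1) * encoder_states, axis=1)
    return context, weights


# Usage: build a context vector for one decoder step.
enc = tf.random.normal([4, 10, 16])
dec = tf.random.normal([4, 16])
w_d = tf.Variable(tf.random.truncated_normal([16, 32], stddev=0.1))
w_e = tf.Variable(tf.random.truncated_normal([16, 32], stddev=0.1))
v = tf.Variable(tf.random.truncated_normal([32], stddev=0.1))
context, attn = additive_attention(dec, enc, w_d, w_e, v)
```

The context vector is then concatenated with the decoder input (or hidden state) at each step, which is what lets the decoder focus on different parts of the source sequence over time.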
