Last week:
Reviewed the seq2seq model and began building it.
This week:
Complete the seq2seq model and add an attention mechanism.
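
As a reference for this week's work, here is a minimal sketch of one attention step in PyTorch, assuming dot-product (Luong-style) scoring over the encoder outputs; the module name, tensor shapes, and scoring choice are assumptions for illustration, not the actual project code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DotProductAttention(nn.Module):
    """Luong-style dot-product attention over encoder outputs (illustrative sketch)."""

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden:  (batch, hidden)          current decoder state
        # encoder_outputs: (batch, src_len, hidden) all encoder states
        # Score each source position by its dot product with the decoder state.
        scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)  # (batch, src_len)
        weights = F.softmax(scores, dim=1)                                           # attention distribution
        # Context vector: attention-weighted sum of encoder states.
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)        # (batch, hidden)
        return context, weights

if __name__ == "__main__":
    batch, src_len, hidden = 2, 5, 8
    attn = DotProductAttention()
    context, weights = attn(torch.randn(batch, hidden), torch.randn(batch, src_len, hidden))
    print(context.shape, weights.shape)  # torch.Size([2, 8]) torch.Size([2, 5])
```

In the full seq2seq decoder, the returned context vector would typically be concatenated with the decoder state (or the next input embedding) before predicting the next token.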