| Date | People | Last Week | This Week |
| --- | --- | --- | --- |
| 2016/12/19 | Yang Feng | - s2smn: wrote the manual for s2s with TensorFlow [nmt-manual]<br>- wrote part of the code of mn<br>- wrote the manual for Moses [moses-manual]<br>- Huilan: fixed the problem with syntax-based translation<br>- sorted out the system and the corresponding documents | |
| | Jiyuan Zhang | - coded the tone_model, but ran into some trouble<br>- ran the four_attention_model, generated by the local_attention model | |
| | Andi Zhang | - tried to fix the incorrect softmax, but eventually abandoned the attempt<br>- added the BLEU scoring part | |
| | Shiyue Zhang | - changed the one-hot vector to (0, -inf, -inf, ...) and retried the experiments, but no improvement was observed<br>- tried a 1-dim gate, but it converged to the baseline<br>- tried training only the gate, but the best result was treating all instances as "right"<br>- trying a model similar to attention<br>- [report] | - try adding the true action info when training the gate<br>- try different scale vectors<br>- try changing cos to just the inner product |
| | Guli | - read papers about transfer learning and solving OOV<br>- conducted comparative tests<br>- writing a survey | - complete the first draft of the survey |
| | Peilun Xiao | - read a paper about document classification with GMM distributions of word vectors and tried to code it in Python<br>- used LDA to reduce the dimension of the text in r52 and r8 and compared classification performance (see the sketch after this table) | - use LDA to reduce the dimension of the text in 20news and WebKB |
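
Below is a minimal sketch of the LDA-based dimensionality reduction mentioned in Peilun Xiao's items, assuming a scikit-learn pipeline; the topic count, the choice of classifier, and the commented usage are illustrative assumptions, not the setup actually used in the experiments.

```python
# A minimal sketch: bag-of-words -> LDA topic features -> classifier.
# Assumes scikit-learn; n_topics and the classifier are placeholder choices.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def lda_classifier(n_topics=50):
    """Build a pipeline that reduces documents to LDA topic proportions
    and classifies them on that lower-dimensional representation."""
    return make_pipeline(
        CountVectorizer(stop_words="english"),
        LatentDirichletAllocation(n_components=n_topics, random_state=0),
        LogisticRegression(max_iter=1000),
    )

# Hypothetical usage with r8/r52-style data (lists of document strings and labels):
# clf = lda_classifier(n_topics=50)
# clf.fit(train_texts, train_labels)
# print(clf.score(test_texts, test_labels))
```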