Chao Xing 2015-12-7
Last Week:
Solution:
* None.
Plan to do:
1. Reproduce the RNN picture-generation model with a specified image.
2. Finish the Matrix Factorization work (a minimal sketch follows this list).
3. Start investigating word segmentation.
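For item 2, a minimal sketch of matrix factorization by SGD over observed entries. The matrix R, the rank, and the hyper-parameters below are illustrative placeholders, not taken from the actual experiment.

```python
import numpy as np

def factorize(R, rank=10, lr=0.01, reg=0.02, epochs=50):
    """Factor R (m x n) into U @ V.T by SGD on the observed (non-zero) entries."""
    m, n = R.shape
    rng = np.random.default_rng(0)
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    rows, cols = np.nonzero(R)                  # treat zeros as "unobserved"
    for _ in range(epochs):
        for i, j in zip(rows, cols):
            err = R[i, j] - U[i] @ V[j]         # residual on one observed entry
            ui = U[i].copy()
            U[i] += lr * (err * V[j] - reg * U[i])
            V[j] += lr * (err * ui   - reg * V[j])
    return U, V

R = np.array([[5, 3, 0], [4, 0, 1], [0, 2, 5]], dtype=float)
U, V = factorize(R, rank=2)
print(np.round(U @ V.T, 2))                     # reconstruction of the observed entries
```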
This Week:
What has been done:
1. Investigated DRAW with RNN-lstm and RNN-at-lstm by feeding both the same random sample (see the sketch after this list).
2. Investigated word sense vectors; trying to find a strategy that balances word frequency, word sense vectors, and neural computing.
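For item 1, a minimal sketch of the comparison protocol: draw one random latent sample and feed it to two DRAW-style decoders, so that any difference in the generated canvases comes from the model rather than from sampling noise. The decoder below is a plain additive-write LSTM stand-in (PyTorch), not the actual DRAW implementation; the "attention" variant here is only a placeholder with the same architecture.

```python
import torch
import torch.nn as nn

class LSTMDecoder(nn.Module):
    """Toy DRAW-style decoder: an LSTM that accumulates additive writes onto a canvas."""
    def __init__(self, z_dim=10, hid=256, img_dim=28 * 28, steps=10):
        super().__init__()
        self.steps, self.hid, self.img_dim = steps, hid, img_dim
        self.cell = nn.LSTMCell(z_dim, hid)
        self.write = nn.Linear(hid, img_dim)

    def forward(self, z):                        # z: (batch, steps, z_dim)
        b = z.size(0)
        h = z.new_zeros(b, self.hid)
        c = z.new_zeros(b, self.hid)
        canvas = z.new_zeros(b, self.img_dim)
        for t in range(self.steps):
            h, c = self.cell(z[:, t], (h, c))
            canvas = canvas + self.write(h)      # additive writes, as in DRAW
        return torch.sigmoid(canvas)

torch.manual_seed(0)
z = torch.randn(16, 10, 10)                      # one shared random latent sample
plain = LSTMDecoder()
attn  = LSTMDecoder()                            # stand-in for the attention variant
imgs_plain = plain(z)                            # (16, 784) canvases from model A
imgs_attn  = attn(z)                             # (16, 784) canvases from model B; compare visually
```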
Plan to do:
1. Extend last week's MNIST results to Chinese images.
2. Give a report after reading some papers.
3. Test some word sense vector models (a sanity-check sketch follows this list).
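For item 3, one quick sanity check that could be run on word sense vectors: rank the nearest neighbours of each sense vector under cosine similarity and see whether the senses pull apart. The vocabulary and vectors below are random toy placeholders, not an actual trained model.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["bank", "river", "money", "shore", "loan"]
emb = {w: rng.standard_normal(50) for w in vocab}                      # single-vector embeddings
sense_vectors = {"bank": [rng.standard_normal(50) for _ in range(2)]}  # two senses of "bank"

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

for k, sense in enumerate(sense_vectors["bank"]):
    ranked = sorted(vocab, key=lambda w: -cosine(sense, emb[w]))
    # With real vectors, the two sense lists should differ (river/shore vs money/loan);
    # with the random placeholders above the output is meaningless.
    print(f"bank/sense{k}:", ranked[:3])
```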