Chao Xing 2015-12-7

Last Week:


Solution:

  • None.

Plan to do:

   1. Reproduce the RNN picture model with a specified image.
   2. Finish the Matrix Factorization work (a minimal factorization sketch follows this list).
   3. Start to investigate word segmentation.
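
A minimal matrix-factorization sketch in Python (numpy) is given below for reference. It only illustrates the general technique, SGD on the observed entries of a matrix with L2 regularization; the function name, toy data, and hyper-parameters are assumptions for illustration and are not taken from the actual experiments.

 # Minimal matrix-factorization sketch (numpy, SGD on observed entries).
 # All names and hyper-parameters are illustrative placeholders.
 import numpy as np

 def factorize(M, rank=5, lr=0.01, reg=0.1, epochs=100, seed=0):
     """Approximate M (n x m) as U.dot(V.T); NaN entries are treated as missing."""
     rng = np.random.RandomState(seed)
     n, m = M.shape
     U = 0.1 * rng.randn(n, rank)
     V = 0.1 * rng.randn(m, rank)
     rows, cols = np.nonzero(~np.isnan(M))
     for _ in range(epochs):
         for i, j in zip(rows, cols):
             u, v = U[i].copy(), V[j].copy()
             err = M[i, j] - u.dot(v)          # reconstruction error on one entry
             U[i] += lr * (err * v - reg * u)  # gradient step with L2 regularization
             V[j] += lr * (err * u - reg * v)
     return U, V

 # Toy usage: factorize a small random matrix and report reconstruction error.
 M = np.random.rand(20, 15)
 U, V = factorize(M)
 print(np.abs(M - U.dot(V.T)).mean())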

This Week:


What has been done:

   1. Investigated DRAW with RNN-lstm and RNN-at-lstm by feeding the same random sample to both models (see the comparison sketch after this list).
   2. Investigated word sense vectors; tried to find a strategy that balances word frequency, word sense vectors, and neural computing.
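
The comparison in item 1 can be sketched as follows: one fixed sequence of latent samples is decoded by two DRAW-style decoders, so any difference in the generated canvases comes from the decoder architecture rather than from the sampling noise. The functions decode_lstm and decode_at_lstm are hypothetical stand-ins for the two trained models.

 # Sketch of the comparison: the same fixed latent samples are decoded by two
 # DRAW-style decoders, so output differences reflect only the architecture.
 # decode_lstm / decode_at_lstm are hypothetical stand-ins for trained models.
 import numpy as np

 def generate(decode_step, z_seq, img_dim=28 * 28):
     """Run a DRAW-style generative pass: each step writes onto a shared canvas."""
     canvas = np.zeros(img_dim)
     state = None
     for z in z_seq:                          # same z sequence for both decoders
         state, write_patch = decode_step(state, z)
         canvas += write_patch                # DRAW accumulates writes on the canvas
     return 1.0 / (1.0 + np.exp(-canvas))     # sigmoid gives the final image

 rng = np.random.RandomState(0)
 z_seq = [rng.randn(100) for _ in range(10)]  # fixed samples: T = 10 steps, z_dim = 100
 # img_lstm    = generate(decode_lstm, z_seq)     # hypothetical RNN-lstm decoder
 # img_at_lstm = generate(decode_at_lstm, z_seq)  # hypothetical RNN-at-lstm decoder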

Plan to do:

   1. Extend last week's MNIST results to Chinese images.
   2. Give a report after reading some papers.
   3. Test some word sense vector models (a sense-selection sketch follows this list).
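
One possible way to test sense vector models, following the frequency/sense balance mentioned under "What has been done", is sketched below: each sense vector of a word is scored by a weighted mix of its corpus frequency and its cosine similarity to the averaged context embedding. All vectors, frequencies, and the alpha weight are illustrative assumptions, not the actual models to be tested.

 # Sketch of one possible sense-selection test: score each sense vector of a word
 # by a mix of its relative corpus frequency and its cosine similarity to the
 # averaged context embedding.  All data below are hypothetical placeholders.
 import numpy as np

 def pick_sense(sense_vecs, sense_freqs, context_vecs, alpha=0.5):
     """Return the index of the best sense; alpha trades frequency against context fit."""
     ctx = np.mean(context_vecs, axis=0)
     ctx = ctx / (np.linalg.norm(ctx) + 1e-8)
     total = float(sum(sense_freqs))
     scores = []
     for vec, freq in zip(sense_vecs, sense_freqs):
         cos = vec.dot(ctx) / (np.linalg.norm(vec) + 1e-8)
         scores.append(alpha * (freq / total) + (1.0 - alpha) * cos)
     return int(np.argmax(scores))

 # Toy usage with random vectors standing in for real sense/context embeddings.
 rng = np.random.RandomState(0)
 senses = [rng.randn(50) for _ in range(3)]     # three candidate sense vectors
 freqs = [1000, 200, 5]                         # corpus frequency of each sense
 context = [rng.randn(50) for _ in range(4)]    # embeddings of surrounding words
 print(pick_sense(senses, freqs, context))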