Chao Xing 2015-12-7

Latest revision as of 01:23, 7 December 2015 (Monday)

Last Week:


Solution:

  • None.

Plan to do:

   1. Reproduce the RNN picture model with a specified image.
   2. Finish Matrix Factorization (a sketch follows this list).
   3. Start to investigate word segmentation.
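
A minimal sketch of what the matrix factorization step could look like, assuming it means factorizing a word co-occurrence matrix into low-rank word vectors (the matrix, weighting, and rank below are illustrative assumptions, not the actual setup):

    import numpy as np

    # Hypothetical co-occurrence counts; the real matrix is an assumption here.
    rng = np.random.RandomState(0)
    cooc = rng.rand(1000, 1000)
    weighted = np.log1p(cooc)                 # simple log weighting of counts

    rank = 50
    U, S, Vt = np.linalg.svd(weighted, full_matrices=False)
    word_vectors = U[:, :rank] * np.sqrt(S[:rank])   # low-rank word embeddings

    # Reconstruction error shows how much of the matrix the chosen rank keeps.
    approx = np.dot(word_vectors, np.sqrt(S[:rank])[:, None] * Vt[:rank])
    print("rank-%d reconstruction error: %.4f"
          % (rank, np.linalg.norm(weighted - approx)))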

This Week:


What has been done:

   1. Investigate DRAW with RNN-lstm and RNN-at-lstm by feeding both models the same random sample (see the sketch after this list).
   2. Investigate word sense vectors. Try to find a strategy that balances word frequency, word sense vectors, and neural computation.
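
Item 1 above rests on feeding the two models an identical random input, so that differences in the generated images come from the models rather than from sampling noise. A minimal sketch of that idea (the two decode functions are hypothetical stand-ins for the trained RNN-lstm and RNN-at-lstm DRAW decoders):

    import numpy as np

    def decode_lstm(z):
        # Hypothetical stand-in for the trained RNN-lstm DRAW decoder.
        return np.tanh(z.sum()) * np.ones((28, 28))

    def decode_at_lstm(z):
        # Hypothetical stand-in for the trained RNN-at-lstm DRAW decoder.
        return np.tanh(z.mean()) * np.ones((28, 28))

    # Draw ONE random latent sample and feed it to both decoders.
    rng = np.random.RandomState(0)
    z = rng.randn(100)                     # assumed latent dimensionality

    canvas_lstm = decode_lstm(z)
    canvas_at = decode_at_lstm(z)
    print("mean pixel difference:", np.abs(canvas_lstm - canvas_at).mean())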

Plan to do:

   1. Extend last week's MNIST results to Chinese images.
   2. Give a report after reading some papers.
   3. Test some word sense vector models (a sketch follows this list).
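
A minimal sketch of how item 3 could be tested, under the assumption that a word sense vector model gives each word several sense vectors plus per-sense frequency counts (the numbers and the alpha weighting below are illustrative assumptions):

    import numpy as np

    # Hypothetical sense vectors and raw sense frequencies for one word.
    rng = np.random.RandomState(0)
    sense_vectors = rng.randn(3, 50)                # 3 senses, 50-dim vectors
    sense_counts = np.array([900.0, 80.0, 20.0])    # how often each sense occurs

    # Frequency-balanced combination: the exponent alpha keeps rare senses
    # from being washed out by the dominant sense.
    alpha = 0.5
    weights = sense_counts ** alpha
    weights /= weights.sum()

    word_vector = np.dot(weights, sense_vectors)    # one combined vector
    print("sense weights:", np.round(weights, 3))
    print("combined vector norm: %.3f" % np.linalg.norm(word_vector))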