Chao Xing 2015-12-7

Latest revision as of 01:23, 7 December 2015 (Mon)

Last Week:


Solution:

  • None.

Plan to do:

   1. Reproduce the RNN picture model with a specified image.
   2. Finish Matrix Factorization.
   3. Start investigating word segmentation.
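
Plan item 2 concerns matrix factorization. As a minimal sketch only (not the actual project code), a rank-k factorization of a matrix can be obtained with a truncated SVD in numpy:

```python
import numpy as np

# Hypothetical sketch: approximate a matrix M by a rank-k factorization
# using truncated SVD. All sizes here are made up for illustration.
rng = np.random.RandomState(0)
M = rng.rand(20, 15)

k = 5
U, s, Vt = np.linalg.svd(M, full_matrices=False)
# Keep only the top-k singular triplets to form the low-rank approximation.
M_hat = (U[:, :k] * s[:k]) @ Vt[:k, :]

# Truncation error is bounded by the discarded singular values,
# so it is strictly smaller than the norm of M itself.
err = np.linalg.norm(M - M_hat)
```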

This Week:


What has been done:

   1. Investigated DRAW with RNN-LSTM and RNN-at-LSTM by feeding both the same random sample.
   2. Investigated word sense vectors; tried to find a strategy that balances word frequency, word sense vectors, and neural computation.
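
The point of feeding both DRAW variants the same random sample is that any difference in the generated canvases is then attributable to the model, not to the noise. A minimal sketch of that comparison protocol, with toy stand-in decoders (the real DRAW networks are not shown, and "RNN-at-lstm" is assumed here to be the attention variant):

```python
import numpy as np

# Hypothetical sketch of the comparison protocol: draw ONE latent sample
# and pass the same sample through both model variants.
rng = np.random.RandomState(42)
z = rng.randn(1, 100)  # shared latent sample; size 100 is an assumption

def decode_lstm(z):
    """Toy stand-in for the RNN-LSTM decoder (not a real DRAW network)."""
    W = np.full((100, 784), 0.01)
    return 1.0 / (1.0 + np.exp(-(z @ W)))  # sigmoid canvas, 28x28 flattened

def decode_at_lstm(z):
    """Toy stand-in for the attention (RNN-at-LSTM) variant."""
    W = np.full((100, 784), 0.02)
    return 1.0 / (1.0 + np.exp(-(z @ W)))

canvas_a = decode_lstm(z)
canvas_b = decode_at_lstm(z)
# With the noise held fixed, this difference reflects only the models.
diff = np.abs(canvas_a - canvas_b).mean()
```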

Plan to do:

   1. Extend last week's MNIST results to Chinese images.
   2. Give a report after reading some papers.
   3. Test some word sense vector models.
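
One simple strategy for balancing word frequency against word sense vectors, sketched here with made-up numbers only, is to collapse a word's sense vectors into a single representation weighted by each sense's corpus frequency:

```python
import numpy as np

# Hypothetical sketch: a word with three senses, each with a 2-d sense
# vector and an assumed corpus count. Frequency-weighted averaging is
# one naive way to trade off frequency against the sense vectors.
sense_vectors = np.array([[1.0, 0.0],
                          [0.0, 1.0],
                          [0.5, 0.5]])
sense_counts = np.array([80.0, 15.0, 5.0])

weights = sense_counts / sense_counts.sum()  # normalize to probabilities
word_vector = weights @ sense_vectors        # frequency-weighted mean
```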