Difference between revisions of "Chao Xing"

From cslt Wiki
(Page created with content: "Last Week: Last Week Solution: 1. DNN program has some problems, fix these & DNN contain same performance to Linear. Problem is training process, training sa...")
 
 
(2 intermediate revisions by the same user not shown)
Initial revision:

Last Week:

Solution:
  1. The DNN program had some problems; these were fixed, and the DNN now reaches the same performance as the linear model.
     The problem was in the training process: the number of training samples was taken to be equal to the number of speakers, so the model effectively used only about 50% of the training samples.
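The report does not show the actual code, so the following Python sketch is only a hedged illustration of the kind of loop bug described above, where the number of updates is tied to the number of speakers rather than the number of samples; model and samples_by_speaker are hypothetical names, not the project's code.

def train_epoch_buggy(model, samples_by_speaker):
    # Bug: one update per speaker, so only one sample per speaker is ever seen.
    for speaker, samples in samples_by_speaker.items():
        model.update(samples[0])

def train_epoch_fixed(model, samples_by_speaker):
    # Fix: iterate over every sample of every speaker.
    for speaker, samples in samples_by_speaker.items():
        for sample in samples:
            model.update(sample)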
Plan to do:
  (Hold)

Problem:
  1. The DNN is still not better than the linear model.
This Week:

Solution:
  1. Train the DNN with the dropout method, and add SAA to reduce the randomness of the results.
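As a generic illustration of the dropout item above (not the project's actual network), the following minimal numpy sketch applies inverted dropout to one hidden layer; the layer sizes and keep probability are placeholders.

import numpy as np

rng = np.random.default_rng(0)

def dense_relu_dropout(x, W, b, keep_prob=0.5, train=True):
    h = np.maximum(0.0, x @ W + b)        # affine + ReLU
    if train:
        mask = rng.random(h.shape) < keep_prob
        h = h * mask / keep_prob          # rescale so the expected activation is unchanged
    return h

x = rng.standard_normal((4, 10))          # batch of 4, input dimension 10
W = 0.1 * rng.standard_normal((10, 8))
b = np.zeros(8)
h_train = dense_relu_dropout(x, W, b, train=True)
h_test = dense_relu_dropout(x, W, b, train=False)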
Plan to do:
  1. Run for a small vocabulary size, such as 5000.
  2. Run and test some ideas.

Latest revision as of 06:50, 26 October 2015 (Mon)

Last Week:

Solution:

1. None.

Plan to do:

1. Test different rules for matrix factorization.
2. Learn about RNN, LSTM, and memory networks.

This Week:

What has been done:

1. Test different rules for matrix factorization.
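The specific factorization rules that were tested are not listed in this report; as a hedged sketch only, the Python snippet below shows one common rule that such tests would vary: plain SGD on the squared reconstruction error of X ≈ U Vᵀ with an L2 penalty, where the rank, learning rate, and regularization strength are illustrative values.

import numpy as np

def mf_sgd(X, rank=5, lr=0.01, reg=0.1, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(epochs):
        for i in range(n):
            for j in range(m):
                err = X[i, j] - U[i] @ V[j]
                u_i = U[i].copy()
                # Update rule: gradient step on squared error plus L2 regularization.
                U[i] += lr * (err * V[j] - reg * u_i)
                V[j] += lr * (err * u_i - reg * V[j])
    return U, V

X = np.random.default_rng(1).random((20, 15))
U, V = mf_sgd(X)
print("reconstruction error:", np.linalg.norm(X - U @ V.T))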

Plan to do:

1. Learn about RNN, LSTM, and memory networks (a minimal LSTM sketch follows this list).
2. Write a technical report about Matrix Factorization.
3. Dig deeper into the different theories of matrix decomposition.
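For item 1 above, the following numpy sketch of a single LSTM cell step is only a generic study reference, not code from the project; the weights are random and untrained, and the dimensions are arbitrary.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    # W maps the concatenation [x, h_prev] to the four gate pre-activations
    # (input gate, forget gate, output gate, candidate cell).
    z = np.concatenate([x, h_prev]) @ W + b
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c

dim_x, dim_h = 3, 4
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((dim_x + dim_h, 4 * dim_h))
b = np.zeros(4 * dim_h)
h, c = lstm_step(rng.standard_normal(dim_x), np.zeros(dim_h), np.zeros(dim_h), W, b)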