Difference between revisions of "Xiangyu Zeng 2015-11-09"

From cslt Wiki

Latest revision as of 08:34, 9 November 2015 (Mon)

Last week:

1. Ran additional sequence-training experiments with Adam-max. The results show that Adam-max is good at helping training jump out of local minima when new data are introduced.
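The Adam-max optimizer mentioned above is presumably AdaMax, the infinity-norm variant of Adam. As a minimal sketch (not the author's actual training code; `adamax_step` and all hyperparameter defaults are illustrative), one update step looks like:

```python
import numpy as np

def adamax_step(theta, grad, m, u, t,
                alpha=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaMax parameter update (illustrative sketch).

    theta: current parameters; grad: gradient at theta;
    m: first-moment estimate; u: infinity-norm second moment; t: step (1-based).
    """
    # Exponential moving average of the gradient (same as Adam's first moment)
    m = beta1 * m + (1 - beta1) * grad
    # Infinity-norm accumulator: replaces Adam's squared-gradient moment
    u = np.maximum(beta2 * u, np.abs(grad))
    # Bias-correct only the first moment; u needs no correction
    theta = theta - (alpha / (1 - beta1 ** t)) * m / (u + eps)
    return theta, m, u
```

For example, iterating this step on a simple quadratic loss drives the parameter toward zero, which is the "escaping local minima" behavior the report attributes to Adam-max at larger scale.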

2. Ran multitask experiments on speech rate. Found that the speech-rate learning method is useful, especially when the rate is extreme, but the multitask setup did not show good results. More details in cvss.

Next week:

1. Complete the remaining Adam-max sequence-training experiments.

2. Prepare my own application.