ASR:2015-02-09
Latest revision as of 02:43, 9 February 2015 (Monday)
Speech Processing
AM development
Environment
- The gpu760 on grid-14 may still be under repair.
- grid-11 often shuts down automatically, and its computation speed is too slow.
RNN AM
- details at http://liuc.cslt.org/pages/rnnam.html
Mic-Array
- XueWei is reading papers and preparing the technical report
Dropout & Maxout & rectifier
- Need to solve the problem of the learning rate being too small.
- 20h small-scale sparse DNN with rectifier. --Chao Liu
- 20h small-scale sparse DNN with Maxout/rectifier based on weight-magnitude pruning (see the pruning sketch after this list). --Mengyuan Zhao
- On hold
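- A minimal sketch of weight-magnitude pruning as used above, assuming a plain NumPy weight matrix; the sparsity level and layer shape are illustrative, not the settings of the actual 20h experiment.
 import numpy as np

 def magnitude_prune(weights, sparsity):
     """Zero out the smallest-magnitude entries so that a `sparsity` fraction becomes zero."""
     w = weights.copy()
     k = int(sparsity * w.size)                 # number of weights to drop
     if k == 0:
         return w, np.ones_like(w, dtype=bool)
     threshold = np.partition(np.abs(w), k - 1, axis=None)[k - 1]
     mask = np.abs(w) > threshold               # keep only weights above the magnitude threshold
     return w * mask, mask

 # toy example: prune 70% of a 1200x800 hidden layer (shape and sparsity are illustrative)
 rng = np.random.RandomState(0)
 W = rng.randn(1200, 800).astype(np.float32)
 W_sparse, mask = magnitude_prune(W, sparsity=0.7)
 print("kept fraction:", mask.mean())           # roughly 0.3; retraining with the mask fixed would follow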
Convolutive network
- Convolutive network (DAE)
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=311
- Technical report to be drafted by Mian Wang, Yiye Lin, Shi Yin, Mengyuan Zhao
DNN-DAE (Deep Auto-Encoder DNN)
- Technical report to be drafted by Xiangyu Zeng, Shi Yin, Mengyuan Zhao and Zhiyong Zhang
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=318
RNN-DAE (RNN-based Deep Auto-Encoder)
VAD
- DAE
- HOLD
- Technical report -- Shi Yin
Speech rate training
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=268
- Technical report to be drafted by Xiangyu Zeng, Shi Yin
- Prepare for ChinaSIP
Confidence
- Reproduce the experiments on the Fisher dataset.
- Use the Fisher DNN model to decode the all-WSJ dataset.
- Preparing scoring for the Puqiang data.
- HOLD
Neural network visualization
- http://cslt.riit.tsinghua.edu.cn/cgi-bin/cvss/cvss_request.pl?account=zhangzy&step=view_request&cvssid=324
- Technical report, Mian Wang.
Speaker ID
Text Processing
LM development
Domain specific LM
- LM2.X
- Mix in the sougou2T LM with Kneser-Ney discounting. (done)
- Train a large LM using the 250k-word (25w) dictionary. (hanzhenglong/wxx)
- v2.0a: adjusted the interpolation weights; a smaller weight on the transcription data works better (done; see the weight-tuning sketch after this list).
- v2.0b: add the v1.0 vocabulary. (this week)
- v2.0c: filter out useless words. (next week)
- Set up a test set for new words. (on hold)
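- A minimal sketch of the interpolation-weight tuning behind v2.0a, assuming the two component models can be queried for word probabilities; the toy unigram tables below are only stand-ins for the real Kneser-Ney smoothed sougou2T and transcription LMs.
 import math

 # stand-in unigram models; the real setup interpolates the sougou2T LM with the transcription LM
 general_lm = {"the": 0.05, "weather": 0.002, "stock": 0.004, "<unk>": 1e-4}
 transcript_lm = {"the": 0.04, "weather": 0.01, "stock": 0.001, "<unk>": 1e-4}

 def prob(lm, word):
     return lm.get(word, lm["<unk>"])

 def perplexity(dev_words, lam):
     """Perplexity of the mixture lam * transcript + (1 - lam) * general."""
     logp = sum(math.log(lam * prob(transcript_lm, w) + (1 - lam) * prob(general_lm, w))
                for w in dev_words)
     return math.exp(-logp / len(dev_words))

 dev = ["the", "weather", "the", "stock"]
 for lam in (0.1, 0.2, 0.3, 0.5):
     print(lam, perplexity(dev, lam))            # keep the weight with the lowest dev perplexity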
tag LM
- Tag LM
- Added the 3-class tags and tested them.
- Similar-word extension in the FST.
- Increased the keyword weights in G; keyword recognition results are good (see the weight-boosting sketch after this list).
- Ready to handle mixed English-Chinese input.
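- A minimal sketch of boosting keyword weights in G, assuming the grammar FST has been printed to OpenFst text format with word labels (arc lines "src dst ilabel olabel [cost]"); the file names, keyword list and boost value are illustrative.
 # Lower the cost of G arcs whose output word is a keyword (smaller cost = larger weight).
 KEYWORDS = {"转账", "查询"}          # illustrative keyword list
 BOOST = 2.0                          # cost subtracted from keyword arcs; tune on held-out data

 def boost_keywords(in_path="G.txt", out_path="G_boosted.txt"):
     with open(in_path) as fin, open(out_path, "w") as fout:
         for line in fin:
             fields = line.split()
             if len(fields) >= 4:                       # arc line
                 cost = float(fields[4]) if len(fields) > 4 else 0.0
                 if fields[3] in KEYWORDS:              # output label is a keyword
                     cost -= BOOST
                 fout.write(" ".join(fields[:4] + ["%.6f" % cost]) + "\n")
             else:
                 fout.write(line)                       # state/final line, copy unchanged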
RNN LM
- rnn
- Test the WER of the RNNLM on Chinese data from jietong-data.
- Generate an n-gram model from the RNNLM by sampling text, and test the perplexity with different amounts of sampled text (see the sketch after this list).
- lstm+rnn
- Check how the lstm-rnnlm code initializes and updates the learning rate. (hold)
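- A minimal sketch of approximating the RNNLM with an n-gram model by sampling text from it and measuring perplexity for different sample sizes; sample_sentence below is only a stand-in for sampling from the real RNNLM, and add-one bigram smoothing stands in for the real Kneser-Ney n-gram training.
 import math, random
 from collections import Counter

 random.seed(0)
 VOCAB = ["we", "like", "speech", "recognition", "</s>"]
 V = len(VOCAB)                                   # possible next words for smoothing

 def sample_sentence():
     """Stand-in for drawing a sentence from the RNNLM's next-word distribution."""
     sent = ["<s>"]
     while sent[-1] != "</s>" and len(sent) < 15:
         sent.append(random.choice(VOCAB))
     return sent

 def train_bigram(sentences):
     bi, uni = Counter(), Counter()
     for s in sentences:
         for a, b in zip(s[:-1], s[1:]):
             bi[(a, b)] += 1
             uni[a] += 1
     return bi, uni

 def perplexity(bi, uni, sentences):
     logp, n = 0.0, 0
     for s in sentences:
         for a, b in zip(s[:-1], s[1:]):
             logp += math.log((bi[(a, b)] + 1.0) / (uni[a] + V))   # add-one smoothing
             n += 1
     return math.exp(-logp / n)

 dev = [sample_sentence() for _ in range(200)]
 for size in (1000, 5000, 20000):                 # vary the amount of sampled text
     bi, uni = train_bigram([sample_sentence() for _ in range(size)])
     print(size, perplexity(bi, uni, dev))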
Word2Vector
W2V based doc classification
- Data preparation (a document-classification sketch follows below).
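- A minimal sketch of word2vec-based document classification: represent a document by the average of its word vectors and train a linear classifier; the tiny embedding table and labels are placeholders for the real word2vec model and the prepared data.
 import numpy as np
 from sklearn.linear_model import LogisticRegression

 # placeholder 4-dimensional embeddings; the real setup would load trained word2vec vectors
 EMB = {
     "stock":  np.array([0.9, 0.1, 0.0, 0.0]),
     "market": np.array([0.8, 0.2, 0.1, 0.0]),
     "goal":   np.array([0.0, 0.1, 0.9, 0.2]),
     "match":  np.array([0.1, 0.0, 0.8, 0.3]),
 }

 def doc_vector(tokens):
     """Average the vectors of in-vocabulary tokens (zero vector if none are known)."""
     vecs = [EMB[t] for t in tokens if t in EMB]
     return np.mean(vecs, axis=0) if vecs else np.zeros(4)

 docs = [["stock", "market"], ["market", "stock"], ["goal", "match"], ["match", "goal", "goal"]]
 labels = [0, 0, 1, 1]                            # 0 = finance, 1 = sports (illustrative)
 clf = LogisticRegression().fit(np.vstack([doc_vector(d) for d in docs]), labels)
 print(clf.predict([doc_vector(["stock", "market", "goal"])]))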
Knowledge vector
- Run on the full dataset.
- Write the paper.
Character to word
- Character-to-word conversion (hold)
Word vector online learning
- Prepare the ACL submission.
Translation
- v5.0 demo released
- Cut down the dictionary and use the new segmentation tool.
Sparse NN in NLP
- Write a technical report (by Wednesday) and give a presentation on it.
- Ready to prepare the ACL submission.
QA
improve fuzzy match
- Add synonym similarity using the MERT-4 method (hold); a fuzzy-match sketch follows after this item.
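- A minimal sketch of adding synonym similarity to the fuzzy match; the synonym groups and the 0.8 synonym score are illustrative, and the weight tuning that MERT would provide is not shown.
 # illustrative synonym groups; the real system would use a larger synonym dictionary
 SYNONYMS = [{"buy", "purchase"}, {"cheap", "inexpensive"}]

 def token_sim(a, b):
     """1.0 for identical tokens, 0.8 for dictionary synonyms, 0.0 otherwise."""
     if a == b:
         return 1.0
     return 0.8 if any(a in g and b in g for g in SYNONYMS) else 0.0

 def fuzzy_similarity(query_tokens, cand_tokens):
     """For each query token take its best match in the candidate, normalized by query length."""
     if not query_tokens:
         return 0.0
     return sum(max((token_sim(q, c) for c in cand_tokens), default=0.0)
                for q in query_tokens) / len(query_tokens)

 print(fuzzy_similarity(["where", "to", "buy", "cheap", "phones"],
                        ["where", "to", "purchase", "inexpensive", "phones"]))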
improve Lucene search
- Add more features to improve search:
- POS, NER, tf, idf
- Result: P@1: 0.68734335 --> 0.7763158; P@5: 0.80325814 --> 0.8383459 [1] (see the P@k sketch after this list)
- Optimize the feature-extraction and reranking code and hand it to Rong Liu to check in.
- Using sentence vectors does not work.
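- A minimal sketch of how reranked candidate lists are scored with P@1 / P@5 of the kind reported above (here taken as the fraction of questions with a correct answer in the top k); the features, weights and data are placeholders, not the committed code.
 def precision_at_k(ranked_lists, k):
     """ranked_lists: per question, a list of (candidate_id, is_correct) in ranked order."""
     hits = sum(1 for cands in ranked_lists if any(ok for _, ok in cands[:k]))
     return hits / float(len(ranked_lists))

 def rerank(cands, weights):
     """Sort candidates by a linear combination of their features."""
     return sorted(cands, key=lambda c: sum(w * c["features"][f] for f, w in weights.items()),
                   reverse=True)

 # toy candidates: Lucene score plus the extra features (tf, idf, POS/NER overlap)
 cands = [
     {"id": "a1", "correct": 0, "features": {"lucene": 9.1, "tf": 0.2, "idf": 0.3, "ner": 0.0}},
     {"id": "a2", "correct": 1, "features": {"lucene": 8.7, "tf": 0.6, "idf": 0.5, "ner": 1.0}},
 ]
 weights = {"lucene": 0.5, "tf": 1.0, "idf": 1.0, "ner": 2.0}
 ranked = [(c["id"], bool(c["correct"])) for c in rerank(cands, weights)]
 print(precision_at_k([ranked], 1), precision_at_k([ranked], 5))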
- online learning to rank
- The tool is ready and about to be tested (a pairwise-update sketch follows below).
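- A minimal sketch of an online pairwise learning-to-rank update (perceptron-style on preferred-vs-skipped candidate pairs); it illustrates the general technique only, not the specific tool mentioned above.
 import numpy as np

 class OnlinePairwiseRanker:
     """Linear scoring model updated one feedback pair at a time."""
     def __init__(self, dim, lr=0.1):
         self.w = np.zeros(dim)
         self.lr = lr

     def score(self, x):
         return float(np.dot(self.w, x))

     def update(self, x_pos, x_neg):
         """If the preferred candidate is not ranked above the other, nudge the weights."""
         if self.score(x_pos) <= self.score(x_neg):
             self.w += self.lr * (np.asarray(x_pos, dtype=float) - np.asarray(x_neg, dtype=float))

 ranker = OnlinePairwiseRanker(dim=3)
 ranker.update([0.9, 0.2, 1.0], [0.7, 0.1, 0.0])   # the user preferred the first candidate
 print(ranker.w)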
online learning
- A simple first version of the online-learning part of QA.
context framework
- Organizing the code.
query normalization
- Use NER to normalize words in the query (see the sketch after this list).
- The new intern will install SEMPRE.
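- A minimal sketch of NER-based query normalization: recognized entity mentions are replaced by their type tags so that differently worded queries map to the same form; the dictionary lookup below is only a stand-in for the real NER tagger.
 # stand-in entity lexicon; the real system would call an NER tagger instead
 ENTITIES = {"北京": "CITY", "上海": "CITY", "周杰伦": "PERSON"}

 def normalize_query(tokens):
     """Replace entity tokens with their type tag, e.g. ['北京', '天气'] -> ['CITY', '天气']."""
     return [ENTITIES.get(t, t) for t in tokens]

 print(normalize_query(["北京", "天气"]))
 print(normalize_query(["上海", "天气"]))           # both normalize to ['CITY', '天气']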