Difference between revisions of "ASR:2015-03-09"

(Created page with "==Speech Processing == === AM development === ==== Environment ==== * grid-11 often shutdown automatically, too slow computation speed. * buy a new 800W power -- Xu...")
 
Lr (talk | contribs)
Text Processing
Line 67: → Line 67:
 :* mix the sougou2T-lm, KN-discount (done)
 :* train a large LM using the 25w-dict (hanzhenglong/wxx)
-::* v2.0a adjust the weight; a smaller weight on the transcription is better (done)
-::* v2.0b add the v1.0 vocab (done)
 ::* v2.0c filter the useless words (next week)
 ::* set the test set for new words (hold)
 ====tag LM====
-* Tag Lm
+* Tag Lm (JT)
-:* add 3-class tag and test
+:* error
 * similar word extension in FST
-:* improve the keyword weight in G; the result is good in keyword recognition
+:* add the experiment to the tag-lm paper
-:* ready to deal with English-Chinese
 
 ====RNN LM====
 * rnn
-:* test the RNNLM WER on Chinese data from jietong-data
+:* discuss the rnn-lstm LM
-:* generate an n-gram model from the rnnlm and test the ppl with different-sized text
 * lstm+rnn
 :* check how the lstm-rnnlm code initializes and updates the learning rate (hold)
Line 89: → Line 85:
 ====W2V based doc classification====
-* data prepare.
+* data preparation (hold)
 ====Knowledge vector====
 * paper is done, submitted to ACL
-====Character to word====
-* character-to-word conversion (hold)
-====Word vector online learning====
-* prepare the ACL submission
 ===Translation===
  
Line 110: → Line 100:
 ====improve fuzzy match====
 * add synonym similarity using the MERT-4 method (hold)
-====improve lucene search====
-:* commit to Rong Liu to check in
-* online learning to rank
-:* the tool is OK and ready to test
 ===online learning===
-* a simple edition of the online-learning part of QA
+* data is ready; prepare the ACL paper
 ====context framework====
-* code for organization
+* code for demo
-:* change to a knowledge graph, and learn the D2R tool and JENA [[媒体文件:政府组织机构图谱--汇联.pdf|government organization inference]] [[媒体文件:员工信息推演_知识图谱.pdf|employee information instance inference]]
+:* demo is done
-:* code for demo
-====query normalization====
-* use NER to normalize the words
 * new intern will install SEMPRE

Revision as of 06:10, 9 March 2015 (Mon)

Speech Processing

AM development

Environment

  • grid-11 often shuts down automatically; computation speed is too slow.
  • buy a new 800 W power supply -- Xuewei

RNN AM

Mic-Array

  • the technical report is done.
  • reproduce the experimental environment for Interspeech

Dropout & Maxout & rectifier

  • HOLD
  • Need to solve the too-small learning-rate problem
  • 20h small-scale sparse DNN with rectifier. -- Chao Liu
  • 20h small-scale sparse DNN with Maxout/rectifier based on weight-magnitude pruning. -- Mengyuan Zhao
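As a reminder of what the rectifier and maxout units above compute, here is a minimal NumPy sketch (a generic illustration with made-up inputs, not the group's training code):

```python
import numpy as np

def relu(x):
    """Rectifier: max(0, x), elementwise."""
    return np.maximum(0.0, x)

def maxout(x, k):
    """Maxout: split the last dimension into groups of k linear units
    and keep the maximum of each group."""
    assert x.shape[-1] % k == 0
    return x.reshape(x.shape[:-1] + (-1, k)).max(axis=-1)

h = np.array([-1.0, 2.0, 0.5, -3.0])
print(relu(h))       # zeroes the negative activations
print(maxout(h, 2))  # max of (-1, 2) and (0.5, -3) -> [2., 0.5]
```

In a sparse DNN the pruning acts on the weight matrices; these activations are unchanged by it.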

Convolutive network

  • Convolutive network (DAE)

DNN-DAE (Deep Auto-Encoder DNN)

RNN-DAE (Deep Auto-Encoder RNN)

VAD

  • DAE
  • Technical report done. -- Shi Yin

Speech rate training

Confidence

  • HOLD
  • Reproduce the experiments on the Fisher dataset.
  • Use the Fisher DNN model to decode the all-wsj dataset.
  • Prepare scoring for the puqiang data.

Neural network visualization

Speaker ID


Text Processing

LM development

Domain specific LM

  • LM2.X
  • mix the sougou2T-lm, KN-discount (done)
  • train a large LM using the 25w-dict (hanzhenglong/wxx)
  • v2.0c: filter the useless words (next week)
  • set up the test set for new words (hold)
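The LM mixing above amounts to linearly interpolating component models and picking the mixture weight by held-out perplexity. A toy sketch of that operation (unigram models and all numbers are invented; the real setup interpolates KN-discounted n-gram models, e.g. built with SRILM):

```python
import math

# Toy unigram models standing in for the large general LM (sougou2T)
# and the in-domain transcription LM; probabilities are made up.
lm_general = {"hello": 0.4, "weather": 0.3, "stocks": 0.3}
lm_domain = {"hello": 0.1, "weather": 0.1, "stocks": 0.8}

def mix_prob(word, lam, floor=1e-8):
    """Linear interpolation: lam * p_domain + (1 - lam) * p_general."""
    return lam * lm_domain.get(word, floor) + (1 - lam) * lm_general.get(word, floor)

def perplexity(words, lam):
    logp = sum(math.log(mix_prob(w, lam)) for w in words)
    return math.exp(-logp / len(words))

# Sweep the weight on a held-out set: a domain-heavy test set
# prefers a larger domain weight.
test_set = ["stocks", "stocks", "weather"]
for lam in (0.2, 0.5, 0.8):
    print(lam, round(perplexity(test_set, lam), 3))
```

Tuning the transcription weight down, as in the v2.0a note, is exactly this sweep with the roles of the components swapped.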

tag LM

  • Tag LM (JT)
  • error
  • similar word extension in FST
  • add the experiment to the tag-lm paper
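A tag LM of the kind above replaces open-class words with class tags and factors the probability as p(word | history) = p(tag | history) * p(word | tag). A toy sketch (history, tags, words, and probabilities are all invented for illustration):

```python
# Factorization used by a class/tag LM: the n-gram part only sees the
# tag, and a per-tag unigram fills in the concrete word.
p_tag_given_hist = {("call",): {"NAME": 0.6, "the": 0.4}}
p_word_given_tag = {"NAME": {"zhangsan": 0.5, "lisi": 0.5}}

def tag_lm_prob(history, tag, word):
    return p_tag_given_hist[history][tag] * p_word_given_tag[tag][word]

print(tag_lm_prob(("call",), "NAME", "zhangsan"))  # 0.6 * 0.5 = 0.3
```

In the FST setting, the same idea appears as a tag arc in G that is expanded by a per-tag word lattice, which is what makes the similar-word extension cheap.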

RNN LM

  • rnn
  • discuss the rnn-lstm LM
  • lstm+rnn
  • check how the lstm-rnnlm code initializes and updates the learning rate (hold)
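On the learning-rate item: Mikolov's rnnlm toolkit starts from a fixed rate and halves it once validation entropy stops improving enough. A sketch of that halving idea (the threshold, starting rate, and entropy values are illustrative, and the toolkit's exact logic differs in details):

```python
def lr_schedule(valid_entropies, lr0=0.1, min_improvement=0.003):
    """Return the learning rate used at each epoch: keep lr0 until the
    per-epoch entropy improvement falls below min_improvement, then
    halve the rate every epoch from that point on."""
    lr, halving, prev = lr0, False, None
    rates = []
    for ent in valid_entropies:
        rates.append(lr)
        if prev is not None and (prev - ent) < min_improvement:
            halving = True
        if halving:
            lr /= 2.0
        prev = ent
    return rates

# Improvement stalls at the 4th epoch, so halving starts after it.
print(lr_schedule([5.0, 4.5, 4.2, 4.199, 4.198]))
```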

Word2Vector

W2V based doc classification

  • data preparation (hold)
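One common W2V-based doc-classification recipe, sketched here with tiny hand-made 2-d embeddings (the real pipeline would use trained word2vec vectors and a stronger classifier than nearest centroid):

```python
import numpy as np

# Represent a document by the mean of its word vectors and assign the
# nearest class centroid. Words, vectors, and classes are invented.
emb = {"stock": [1.0, 0.0], "market": [0.9, 0.1],
       "goal": [0.0, 1.0], "match": [0.1, 0.9]}

def doc_vec(words):
    return np.mean([emb[w] for w in words], axis=0)

centroids = {"finance": doc_vec(["stock", "market"]),
             "sports": doc_vec(["goal", "match"])}

def classify(words):
    v = doc_vec(words)
    return min(centroids, key=lambda c: np.linalg.norm(v - centroids[c]))

print(classify(["stock", "goal", "market"]))  # finance
```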

Knowledge vector

  • paper is done, submitted to ACL

Translation

  • v5.0 demo released
  • cut the dict and use the new segmentation tool

Sparse NN in NLP

  • write a technical report (Wednesday) and present it
  • ready to prepare the ACL submission

QA

improve fuzzy match

  • add synonym similarity using the MERT-4 method (hold)
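The fuzzy-match item combines surface similarity with a synonym feature, and MERT would tune the feature weights. A stdlib-only sketch with hand-set weights and a toy synonym table (everything here is illustrative, not the project's matcher):

```python
import difflib

# Toy synonym table; in practice this would come from a thesaurus.
synonyms = {"buy": {"purchase"}, "purchase": {"buy"}}

def surface_sim(a, b):
    """Character-level similarity in [0, 1]."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def synonym_sim(a_words, b_words):
    """Fraction of query words matched literally or via a synonym."""
    hits = sum(1 for w in a_words
               if w in b_words or synonyms.get(w, set()) & set(b_words))
    return hits / max(len(a_words), 1)

def fuzzy_score(a, b, w_surf=0.5, w_syn=0.5):
    """Weighted feature sum; MERT would tune w_surf and w_syn."""
    return w_surf * surface_sim(a, b) + w_syn * synonym_sim(a.split(), b.split())

print(fuzzy_score("buy a ticket", "purchase a ticket"))
print(fuzzy_score("buy a ticket", "sell a car"))
```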

online learning

  • data is ready; prepare the ACL paper

context framework

  • code for demo
  • demo is done


  • new intern will install SEMPRE