Ran Xin (冉鑫), Zhang Yongxin. Online split-and-merge expectation-maximization training of Gaussian mixture model and its optimization[J]. 高技术通讯 (English edition), 2012, 18(3): 302-307
Online split-and-merge expectation-maximization training of Gaussian mixture model and its optimization
Revised: August 10, 2010
DOI:10.3772/j.issn.1006-6748.2012.03.014 |
Keywords: Gaussian mixture model (GMM), online training, split-and-merge expectation-maximization (SMEM), speech processing
Authors: Ran Xin (冉鑫); Zhang Yongxin
Abstract:
This paper presents a new online incremental training algorithm for the Gaussian mixture model (GMM), which performs expectation-maximization (EM) training incrementally, updating the GMM parameters online sample by sample instead of waiting for a block of data of sufficient size before training can start, as in the traditional EM procedure. The proposed method extends the split-and-merge EM (SMEM) procedure, so it inherently retains the ability to escape from local maxima and to reduce the chance of singularities. The algorithm is further optimized in the context of speech processing applications. Experiments on synthetic data show the advantage and efficiency of the new method, and results on a speech processing task also confirm the improvement in system performance.
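To make the idea of sample-by-sample EM updates concrete, the following sketch shows one common way such an online GMM update can be organized: incremental (stepwise) updates of sufficient statistics with a decaying step size, followed by an M-step that recovers the parameters. This is only a minimal illustration under those assumptions, not the paper's exact algorithm; in particular, the split-and-merge moves described in the abstract are not implemented, and the names OnlineGMM, partial_fit and step_power are hypothetical.

```python
import numpy as np

class OnlineGMM:
    """Minimal sketch of sample-by-sample EM updates for a diagonal-covariance GMM.

    Illustrative only: a stepwise/online EM with decaying step size, without the
    split-and-merge operations discussed in the paper.
    """

    def __init__(self, n_components, n_dims, step_power=0.6, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = np.full(n_components, 1.0 / n_components)
        self.means = rng.standard_normal((n_components, n_dims))
        self.vars = np.ones((n_components, n_dims))
        # Running sufficient statistics: s0 ~ E[gamma], s1 ~ E[gamma*x], s2 ~ E[gamma*x^2]
        self.s0 = self.weights.copy()
        self.s1 = self.means * self.s0[:, None]
        self.s2 = (self.vars + self.means ** 2) * self.s0[:, None]
        self.t = 0
        self.step_power = step_power  # step size eta_t = (t + 2)^(-step_power)

    def _responsibilities(self, x):
        # E-step for a single sample: posterior probability of each component.
        diff = x[None, :] - self.means
        log_prob = -0.5 * np.sum(diff ** 2 / self.vars + np.log(2 * np.pi * self.vars), axis=1)
        log_prob += np.log(self.weights)
        log_prob -= log_prob.max()          # numerical stability
        gamma = np.exp(log_prob)
        return gamma / gamma.sum()

    def partial_fit(self, x):
        # One online EM step on a single observation x of shape (n_dims,).
        x = np.asarray(x, dtype=float)
        gamma = self._responsibilities(x)
        self.t += 1
        eta = (self.t + 2.0) ** (-self.step_power)   # decaying step size
        # Stochastic-approximation update of the sufficient statistics.
        self.s0 = (1 - eta) * self.s0 + eta * gamma
        self.s1 = (1 - eta) * self.s1 + eta * gamma[:, None] * x[None, :]
        self.s2 = (1 - eta) * self.s2 + eta * gamma[:, None] * (x[None, :] ** 2)
        # M-step: recover weights, means and variances from the running statistics.
        self.weights = self.s0 / self.s0.sum()
        self.means = self.s1 / self.s0[:, None]
        self.vars = np.maximum(self.s2 / self.s0[:, None] - self.means ** 2, 1e-6)
```

In a speech-processing setting of the kind the abstract mentions, each incoming feature frame would be passed to partial_fit as it arrives, so the model is refined continuously instead of buffering a large block of data for batch EM.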