RNN language model with word clustering and class-based output layer

Yongzhe Shi, Wei Qiang Zhang, Jia Liu, Michael T. Johnson

Research output: Article › peer-review

28 Citations (Scopus)

Abstract

The recurrent neural network language model (RNNLM) has shown significant promise for statistical language modeling. In this work, a new class-based output layer method is introduced to further improve the RNNLM. In this method, word class information is incorporated into the output layer by using the Brown clustering algorithm to estimate a class-based language model. Experimental results show that the new output layer with word clustering not only noticeably improves convergence but also reduces perplexity and word error rate in large-vocabulary continuous speech recognition.
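The factorization behind a class-based output layer can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the word-to-class mapping, weight matrices, and dimensions are all invented for the example (in the paper's setup the classes would come from Brown clustering over the training corpus). The key identity is P(w | h) = P(c(w) | h) · P(w | c(w), h), which replaces one softmax over the full vocabulary with two much smaller softmaxes.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, num_classes, hidden_dim = 10, 3, 8

# Hypothetical word-to-class assignment; a real system would derive this
# from Brown clustering of the training corpus.
word2class = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 2])

# Output weights: one matrix scores the classes, one scores the words.
W_class = rng.normal(size=(num_classes, hidden_dim))
W_word = rng.normal(size=(vocab_size, hidden_dim))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def word_probability(h, w):
    """P(w | h) factored through the word's class: P(c|h) * P(w|c,h)."""
    c = word2class[w]
    p_class = softmax(W_class @ h)[c]           # P(c | h) over all classes
    members = np.flatnonzero(word2class == c)   # words sharing class c
    scores = W_word[members] @ h                # softmax only within the class
    p_word_given_class = softmax(scores)[np.where(members == w)[0][0]]
    return p_class * p_word_given_class

h = rng.normal(size=hidden_dim)  # a hidden state from the recurrent layer
total = sum(word_probability(h, w) for w in range(vocab_size))
print(round(total, 6))  # the factored probabilities still sum to 1
```

Because each within-class softmax normalizes over only its members, computing P(w | h) costs O(|classes| + |class of w|) rather than O(|vocabulary|), which is the main source of the training speed-up.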

Original language: English
Article number: 22
Journal: Eurasip Journal on Audio, Speech, and Music Processing
Volume: 2013
Issue: 1
DOI
Status: Published - 2013

Bibliographical note

Funding Information:
This work was supported by the National Natural Science Foundation of China under grant nos. 61273268, 61005019 and 90920302, and in part by Beijing Natural Science Foundation Program under grant no. KZ201110005005.

Funding


Funders	Funder number
Natural Science Foundation of Beijing Municipality	KZ201110005005
National Natural Science Foundation of China (NSFC)	61273268, 61005019, 90920302

ASJC Scopus subject areas

• Acoustics and Ultrasonics
• Electrical and Electronic Engineering
