
Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate

Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the predic...
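As a minimal illustration of the quantity the abstract refers to: the cross entropy of a model on a text is the average negative log-probability (in bits) the model assigns to the tokens it actually observes, and as the text grows this average upper-bounds the entropy rate of the source. The sketch below assumes we already have the per-token probabilities; the function name and toy values are illustrative, not from the paper.

```python
import math

def cross_entropy_bits(probs):
    """Average negative log2-probability per observed token.

    `probs` holds the probability a language model assigned to each
    token that actually occurred. Lower cross entropy means better
    prediction; in the limit it bounds the source's entropy rate.
    """
    return -sum(math.log2(p) for p in probs) / len(probs)

# Toy example: a model assigning probability 0.5 to every token
# yields exactly 1 bit per token.
print(cross_entropy_bits([0.5, 0.5, 0.5, 0.5]))
```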


Bibliographic Details
Published in: Entropy (Basel)
Main Authors: Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Article
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://ncbi.nlm.nih.gov/pmc/articles/PMC7512401/
https://ncbi.nlm.nih.gov/pubmed/33266563
http://dx.doi.org/10.3390/e20110839