A Neural Probabilistic Language Model


About this listen

This paper, published in 2003, introduces a neural probabilistic language model designed to address the curse of dimensionality inherent in modeling word sequences. The authors propose learning a distributed representation for words, which lets the model generalize from sentences seen in training to an exponential number of semantically similar, unseen sentences. The approach simultaneously learns a feature vector for each word and the probability function for word sequences, both expressed through a neural network. The paper details the network architecture, the training procedure (stochastic gradient ascent on the log-likelihood), and parallel-implementation techniques for managing the computational demands of large datasets. Experiments on two corpora show that this neural network approach significantly improves on state-of-the-art n-gram models, in part by exploiting longer word contexts.


Source: https://www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf
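For listeners who want a concrete picture of the architecture described above, here is a minimal sketch: a shared lookup table of word feature vectors feeding a one-hidden-layer tanh network with a softmax over the vocabulary, including the paper's optional direct input-to-output connections. PyTorch is a modern stand-in for the paper's original implementation, and the hyperparameter values (vocabulary size, feature dimension, context length, hidden size) are illustrative rather than taken from the paper's experiments.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NPLM(nn.Module):
    """Sketch of the neural probabilistic language model (Bengio et al., 2003).

    Models P(w_t | w_{t-1}, ..., w_{t-n+1}) via:
      1. a shared lookup table C mapping each word to an m-dim feature vector,
      2. a tanh hidden layer over the concatenated context vectors,
      3. a softmax over the vocabulary, with optional direct connections
         from the input layer to the output (the paper's W matrix).
    """

    def __init__(self, vocab_size=10_000, m=60, context=4, hidden=50,
                 direct=True):
        super().__init__()
        self.C = nn.Embedding(vocab_size, m)      # word feature vectors
        self.H = nn.Linear(context * m, hidden)   # input -> hidden
        self.U = nn.Linear(hidden, vocab_size)    # hidden -> output
        self.W = nn.Linear(context * m, vocab_size) if direct else None

    def forward(self, ctx):                       # ctx: (batch, context) word ids
        x = self.C(ctx).flatten(1)                # concatenated context vectors
        y = self.U(torch.tanh(self.H(x)))
        if self.W is not None:
            y = y + self.W(x)                     # direct input->output path
        return F.log_softmax(y, dim=-1)           # log P(w_t | context)

# Illustrative training step with dummy data: minimizing negative
# log-likelihood by SGD is equivalent to the paper's stochastic gradient
# ascent on the log-likelihood.
model = NPLM()
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
ctx = torch.randint(0, 10_000, (32, 4))           # dummy context windows
target = torch.randint(0, 10_000, (32,))          # dummy next words
opt.zero_grad()
loss = F.nll_loss(model(ctx), target)
loss.backward()
opt.step()
```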
