Reputation: 1
I have been trying to learn NLTK and NLP. Using n-grams to build a next-word predictor seems relatively simple, so what are some other ways I might approach this problem?
Upvotes: 0
Views: 108
Reputation: 4349
This task is called language modeling, and it is one of the primary tasks in NLP. This article is old now, but it explains in detail how to build a character-level language model (given characters c_0 through c_(n-1), predict character c_n).
LSTMs offer a good balance of resource usage and accuracy; ULMFiT is a strong example of LSTM-based language modeling. Most state-of-the-art results come from enormous Transformers, like the famous BERT and GPT-2.
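For comparison with the neural approaches above, here is a minimal sketch of the n-gram baseline you mentioned, in pure Python (no NLTK dependency). It counts trigrams in a toy corpus and predicts the most frequent word that follows a two-word context; the corpus and function names are illustrative, not from any particular library:

```python
from collections import Counter, defaultdict

def build_trigram_model(tokens):
    """Map each two-word context to a Counter of observed next words."""
    model = defaultdict(Counter)
    for w1, w2, w3 in zip(tokens, tokens[1:], tokens[2:]):
        model[(w1, w2)][w3] += 1
    return model

def predict_next(model, w1, w2):
    """Return the most frequent next word for the context, or None if unseen."""
    counts = model.get((w1, w2))
    if not counts:
        return None
    return counts.most_common(1)[0][0]

# Toy corpus for illustration only.
corpus = "the cat sat on the mat and the cat sat on the rug".split()
model = build_trigram_model(corpus)
print(predict_next(model, "cat", "sat"))  # → on
```

A language model like ULMFiT or GPT-2 replaces the raw counts with a learned probability distribution over the vocabulary, which lets it generalize to contexts it has never seen, something a count-based n-gram model cannot do without smoothing.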
Upvotes: 1