Vignesh Muthu.S

Reputation: 103

Tokenize the words based on a list

I have a requirement of tokenizing the words in a sentence based on the specific word list.

wordlist = ["nlp - nltk", "CIFA R12 - INV"]

Example-input: This is sample text for nlp - nltk CIFA R12 - INV.

When using word_tokenize(example-input), I need "nlp - nltk" as one token and "CIFA R12 - INV" as another token. Is that possible, rather than getting "nlp", "-", "nltk", "CIFA", etc. as separate tokens?

Upvotes: 0

Views: 300

Answers (1)

Vignesh Muthu.S

Reputation: 103

For those who come here in the future:
After some reading, I found that the nltk.tokenize.mwe module (multi-word expression tokenizer) is the way to achieve the above requirement.

Reference: http://www.nltk.org/api/nltk.tokenize.mwe.html
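A minimal sketch of how this could look with MWETokenizer from that module (the phrase tuples below are assumed from the question's word list; MWETokenizer merges matching runs in an already-tokenized list, and separator=" " keeps the original spacing in the merged token):

```python
from nltk.tokenize import MWETokenizer

# Each multi-word expression is given as a tuple of its individual tokens.
tokenizer = MWETokenizer(
    [("nlp", "-", "nltk"), ("CIFA", "R12", "-", "INV")],
    separator=" ",  # default is "_"; a space keeps "nlp - nltk" readable
)

# MWETokenizer operates on a pre-tokenized list, so split (or
# word_tokenize) the sentence first, then merge the expressions.
tokens = tokenizer.tokenize(
    "This is sample text for nlp - nltk CIFA R12 - INV".split()
)
print(tokens)
# ['This', 'is', 'sample', 'text', 'for', 'nlp - nltk', 'CIFA R12 - INV']
```

In real use you would likely run word_tokenize first and feed its output to tokenizer.tokenize; the plain split() above just keeps the sketch self-contained.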

Upvotes: 2

Related Questions