Reputation: 644
I'm trying to use spaCy's noun_chunks, but it's throwing an error.
I downloaded the model with
python -m spacy download en_core_web_sm
but this code:
NLP = spacy.load('en_core_web_sm')
NOUN_CHUNKS = NLP.noun_chunks
raises:
AttributeError: 'English' object has no attribute 'noun_chunks'
Upvotes: 0
Views: 6880
Reputation: 7105
How did you come up with that code? The loaded nlp
processing object doesn't have a noun_chunks property;
instead, you want to access the noun chunks of a processed document:
nlp = spacy.load("en_core_web_sm") # load the English model
doc = nlp("There is a big dog.") # process a text and create a Doc object
for chunk in doc.noun_chunks: # iterate over the noun chunks in the Doc
print(chunk.text)
# 'a big dog'
For more details, see the documentation here.
Upvotes: 7