Reputation: 63
I tried training a spaCy model, but recently I started getting errors. I get the error below and would like someone to help me resolve it.
def train_model(model, train_data, optimizer, batch_size, epochs=10):
    losses = {}
    random.seed(1)
    for epoch in range(epochs):
        random.shuffle(train_data)
        batches = minibatch(train_data, size=batch_size)
        for batch in batches:
            # Split batch into texts and labels
            texts, labels = zip(*batch)
            # Update model with texts and labels
            nlp.update(texts, labels, sgd=optimizer, losses=losses)
        print("Loss: {}".format(losses['textcat']))
    return losses['textcat']
optimizer = nlp.begin_training()
batch_size = 5
epochs = 20
# Training the model
train_model(nlp, train_data, optimizer, batch_size, epochs)
Below is the error, which shows that a ValueError is raised:
ValueError                                Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_16120/3494358196.py in <module>
      4
      5 # Training the model
----> 6 train_model(nlp, train_data, optimizer, batch_size, epochs)

~\AppData\Local\Temp/ipykernel_16120/3158014372.py in train_model(model, train_data, optimizer, batch_size, epochs)
     12
     13             # Update model with texts and labels
---> 14             nlp.update(texts, labels, sgd=optimizer, losses=losses)
     15         print("Loss: {}".format(losses['textcat']))
     16

~\anaconda3\lib\site-packages\spacy\language.py in update(self, examples, _, drop, sgd, losses, component_cfg, exclude, annotates)
   1132         """
   1133         if _ is not None:
-> 1134             raise ValueError(Errors.E989)
   1135         if losses is None:
   1136             losses = {}

ValueError: [E989] `nlp.update()` was called with two positional arguments. This may be due to a backwards-incompatible change to the format of the training data in spaCy 3.0 onwards. The 'update' function should now be called with a batch of Example objects, instead of `(text, annotation)` tuples.
Upvotes: 2
Views: 593
Reputation: 143098
Based on the documentation, they made some changes in version 3.x: nlp.update() now takes the batch directly, without splitting it with texts, labels = zip(*batch).
for batch in batches:
    nlp.update(batch, sgd=optimizer, losses=losses)
That's all.
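One note: this only works if the items in train_data are already Example objects. If train_data is still a list of (text, annotations) tuples, as the texts, labels = zip(*batch) line in your code suggests, each tuple has to be converted with Example.from_dict() first. A minimal sketch of the whole loop, assuming a textcat pipe and placeholder labels/training data (yours will differ):

import random
import spacy
from spacy.training import Example
from spacy.util import minibatch

nlp = spacy.blank("en")
textcat = nlp.add_pipe("textcat")
# Placeholder labels and data, just to make the sketch self-contained
textcat.add_label("POSITIVE")
textcat.add_label("NEGATIVE")
train_data = [
    ("I loved this movie", {"cats": {"POSITIVE": 1.0, "NEGATIVE": 0.0}}),
    ("I hated this movie", {"cats": {"POSITIVE": 0.0, "NEGATIVE": 1.0}}),
]

optimizer = nlp.begin_training()
losses = {}
random.seed(1)
for epoch in range(10):
    random.shuffle(train_data)
    for batch in minibatch(train_data, size=5):
        # Convert each (text, annotations) tuple into an Example object
        examples = [
            Example.from_dict(nlp.make_doc(text), annotations)
            for text, annotations in batch
        ]
        nlp.update(examples, sgd=optimizer, losses=losses)
    print("Loss: {}".format(losses["textcat"]))

Also, in spaCy 3.x nlp.initialize() is the preferred replacement for nlp.begin_training(); the old name still works but is deprecated.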
Upvotes: 1