Ali.Turkkan

Reputation: 265

i can't use pretrained_model=URLs.WT103 from fastai.text

I'm trying to build a model that takes a word as input and outputs a paragraph. When I apply the example from the fastai.text documentation to my own data set, I get an error at the step below. Everything works up to this line, but this line fails. What could be causing this error?

Code:

from fastai import *
from fastai.text import * 

path = untar_data(URLs.IMDB_SAMPLE)

df = pd.read_csv(path/'texts.csv')

# Language model data
data_lm = TextLMDataBunch.from_csv(path, 'texts.csv')
# Classifier model data
data_clas = TextClasDataBunch.from_csv(path, 'texts.csv',
                                       vocab=data_lm.train_ds.vocab, bs=32)

data_lm.save()
data_clas.save()

data_lm = TextLMDataBunch.load(path)
data_clas = TextClasDataBunch.load(path, bs=32)

learn = language_model_learner(data_lm, pretrained_model=URLs.WT103, drop_mult=0.5)
learn.fit_one_cycle(1, 1e-2)

Failing line:

learn = language_model_learner(data_lm, pretrained_model=URLs.WT103, drop_mult=0.5)

Output:

    102     if not ps: return None
    103     if b is None: return ps[0].requires_grad
--> 104     for p in ps: p.requires_grad=b
    105 
    106 def trainable_params(m:nn.Module)->ParamList:

RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach().

Upvotes: 0

Views: 898

Answers (1)

VaRuN SiNgH

Reputation: 36

Disable gradient tracking with `torch.set_grad_enabled(False)` (call it before creating the learner object),

and wrap the training call (`learn.fit_one_cycle()`) in a `with torch.enable_grad():` block.
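The mechanics behind this workaround can be sketched with plain PyTorch (no fastai needed): `torch.set_grad_enabled(False)` stops new operations from building an autograd graph, so the learner setup never produces the non-leaf tensors that trigger the `RuntimeError`, while `torch.enable_grad()` restores gradient tracking just for the training call. The tensors below are stand-ins, not fastai model parameters:

```python
import torch

# Globally disable gradient tracking, as the answer suggests doing
# before creating the learner object.
torch.set_grad_enabled(False)

w = torch.randn(3, requires_grad=True)  # a leaf tensor; the flag is allowed here
y = w * 2                               # computed while grad is disabled
print(y.requires_grad)                  # False: no graph was recorded

# Re-enable gradients only around the training step, mirroring
# wrapping learn.fit_one_cycle() in torch.enable_grad().
with torch.enable_grad():
    z = w * 2
    print(z.requires_grad)              # True inside the context

torch.set_grad_enabled(True)            # restore the default afterwards
```

Note that `requires_grad` can only be flipped on leaf tensors like `w` above; trying to set it on a computed tensor like `y` raises exactly the `RuntimeError` shown in the question, which is why disabling grad during learner creation sidesteps it.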

Upvotes: 2
