Matt

Reputation: 1839

tqdm not showing bar

I'm using the tqdm library, but it doesn't give me a progress bar; instead, it gives me output that looks like this, where it only tells me the iteration count:

251it [01:44, 2.39it/s]

Any idea why the code would do this? I thought it might be because I was passing it a generator, but then again I've used generators in the past that have worked. I've never really messed with tqdm formatting before. Here is part of the source code:

train_iter = zip(train_x, train_y) #train_x and train_y are just lists of elements
....
def train(train_iter, model, criterion, optimizer):
    model.train()
    total_loss = 0
    for x, y in tqdm(train_iter):
        x = x.transpose(0, 1)
        y = y.transpose(0, 1)
        optimizer.zero_grad()
        bloss = model.forward(x, y, criterion)   
        bloss.backward()
        torch.nn.utils.clip_grad_norm(model.parameters(), args.clip)
        optimizer.step()        
        total_loss += bloss.data[0]
    return total_loss

Upvotes: 34

Views: 57109

Answers (3)

Viman Deb

Reputation: 41

@Dogus's answer is the more natural use of tqdm, but you need to make sure that your data loader (if it is a custom iterator) also exposes the __len__ method, so that len() works on it and tqdm can determine the total.
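For example, here is a minimal sketch (the PairDataset name is just for illustration) of a custom iterable that defines __len__; because len() works on it, tqdm can render a full bar without an explicit total:

from tqdm import tqdm

# Hypothetical wrapper around paired training data; since it defines
# __len__, tqdm can infer the total number of iterations on its own.
class PairDataset:
    def __init__(self, xs, ys):
        self.pairs = list(zip(xs, ys))

    def __len__(self):
        return len(self.pairs)

    def __iter__(self):
        return iter(self.pairs)

train_iter = PairDataset(range(100), range(100))

for x, y in tqdm(train_iter):  # bar appears because len(train_iter) is known
    pass  # training step would go here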

Upvotes: 0

Doğuş

Reputation: 1957

Filling in the "total" parameter with the length worked for me. Now the progress bar appears.

from tqdm import tqdm

# ...
for imgs, targets in tqdm(train_dataloader, total=len(train_dataloader)):
    # ...

Upvotes: 17

DDGG

Reputation: 1241

tqdm needs to know how many iterations will be performed (the total amount) in order to show a progress bar.

You can try this:

from tqdm import tqdm

train_x = range(100)
train_y = range(200)

train_iter = zip(train_x, train_y)

# Notice `train_iter` can only be iterated over once, so I get `total` this way.
total = min(len(train_x), len(train_y))

with tqdm(total=total) as pbar:
    for item in train_iter:
        # do something ...
        pbar.update(1)

Upvotes: 52
