spiridon_the_sun_rotator

Reputation: 1034

How to disable logging from PyTorch-Lightning logger?

The logger in PyTorch Lightning prints information about the model to be trained (or evaluated) and reports progress during training.

However, in my case I would like to hide all messages from the logger so that they do not flood the output in my Jupyter Notebook.

I've looked into the API of the Trainer class on the official docs page https://pytorch-lightning.readthedocs.io/en/latest/common/trainer.html#trainer-flags and it seems like there is no option to turn off the messages from the logger.

There is a parameter log_every_n_steps, which can be set to a large value, but the logging output after each epoch is still displayed nevertheless.

How can one disable the logging?

Upvotes: 9

Views: 12048

Answers (4)

carusyte

Reputation: 1739

In newer versions, the model summary and progress reporting can be suppressed using these two Trainer arguments:

enable_progress_bar (Optional[bool]) – Whether to enable the progress bar by default. Default: True.

enable_model_summary (Optional[bool]) – Whether to enable model summarization by default. Default: True.

See the Trainer API for more details.

However, I was still wondering how to suppress these device-usage logs:

GPU available: False, used: False
TPU available: False, using: 0 TPU cores
HPU available: False, using: 0 HPUs

---Update #1---

It turns out that, to disable the device-availability messages above, you can raise the level of the logger that emits them:

import logging
logging.getLogger("lightning.pytorch.utilities.rank_zero").setLevel(logging.FATAL)
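As a sanity check, here is a minimal stdlib-only sketch of why this works — no Lightning import is needed, since the logger name is just a string key (the name "lightning.pytorch.utilities.rank_zero" is assumed to be the one recent Lightning versions use for this banner):

```python
import logging

# Raising the named logger's level makes it drop INFO-level records,
# which is what silences the device-availability banner.
log = logging.getLogger("lightning.pytorch.utilities.rank_zero")
log.setLevel(logging.FATAL)

print(log.isEnabledFor(logging.INFO))  # False: INFO messages are now dropped
```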

Upvotes: 1

Artyrm Sergeev

Reputation: 329

Maybe try something like this, replacing "package" with the name of the logger you want to silence (e.g. "pytorch_lightning")?

logging.getLogger("package").propagate = False
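For context, a minimal stdlib sketch of what propagate = False does — the logger name "pytorch_lightning" is an assumption about the library's top-level logger:

```python
import io
import logging

# Attach a handler to the root logger to observe what propagates up to it.
buf = io.StringIO()
logging.getLogger().addHandler(logging.StreamHandler(buf))

noisy = logging.getLogger("pytorch_lightning")
noisy.addHandler(logging.NullHandler())  # avoid the last-resort stderr handler
noisy.propagate = False                  # records no longer bubble up to root

noisy.warning("hidden message")
print(repr(buf.getvalue()))  # '' -- nothing reached the root handler
```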

Upvotes: 0

ayandas

Reputation: 2268

I am assuming that two things are particularly bothering you in terms of flooding the output stream:

First, the "weight summary":

  | Name | Type   | Params
--------------------------------
0 | l1   | Linear | 100 K 
1 | l2   | Linear | 1.3 K 
--------------------------------
...

Second, the progress bar:

Epoch 0:  74%|███████████   | 642/1874 [00:02<00:05, 233.59it/s, loss=0.85, v_num=wxln]

PyTorch Lightning provides clear and elegant solutions for turning them off: Trainer(progress_bar_refresh_rate=0) disables the progress bar, and Trainer(weights_summary=None) disables the weight summary.

Upvotes: 7

spiridon_the_sun_rotator

Reputation: 1034

The solution was a combination of @Artyrm Sergeev's suggestion and the answer suggested here: https://stackoverflow.com/a/52559560/13614416.

  1. Get all pytorch_lightning loggers and stop them from propagating to the root logger (@Artyrm Sergeev's suggestion):

    import logging

    pl_loggers = [
        logging.getLogger(name)
        for name in logging.root.manager.loggerDict
        if "pytorch_lightning" in name
    ]
    for pl_logger in pl_loggers:
        pl_logger.propagate = False

  2. Put the trainer.fit call inside the following construction (io here is IPython.utils.io):

    from IPython.utils import io

    with io.capture_output() as captured:
        trainer.fit(...)
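Outside Jupyter, the same two steps can be sketched with only the standard library — contextlib.redirect_stdout stands in for IPython's io.capture_output, and an ordinary print stands in for trainer.fit:

```python
import contextlib
import io
import logging

# Step 1: silence every logger whose name mentions pytorch_lightning.
for name in list(logging.root.manager.loggerDict):
    if "pytorch_lightning" in name:
        logging.getLogger(name).propagate = False

# Step 2: swallow anything written to stdout during the call.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    print("progress output")  # stands in for trainer.fit(...)

print("captured:", repr(buf.getvalue()))  # captured: 'progress output\n'
```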

Upvotes: 0
