Reputation: 35
I am not able to see logging messages created in my Python code while Dataflow jobs are running. I have followed the documentation, but no logs are appearing!
https://cloud.google.com/dataflow/docs/guides/logging
I have imported logging and even created a custom logger, as below:
logger = logging.getLogger('TEST_LOGGER')
logger.setLevel(logging.INFO)
Below is an extracted function into which I have added a number of log and print statements, just to try to see any sort of output in the Dataflow UI:
def check_file_size(file):
    file = 'gs://Redacted'
    file_size = gcsio.GcsIO.size(file)
    print('FILE SIZE:' + file_size)
    logger.info('FILE SIZE:' + file_size)
    logger.info(f'file_size..... - {file_size}')
    if file_size <= 4:
        print('FILE SIZE:' + file_size)
        logger.info('FILE SIZE:' + file_size)
        raise Exception("ERROR IN FILE")
    else:
        print('FILE SIZE:' + file_size)
        logger.info('FILE SIZE:' + file_size)
        return file_size
None of the above messages are shown in the UI; in fact, no custom log messages are shown at all.
I would expect the logs to appear both under 'worker logs' and also in the individual step logs for that function. I am really running out of ideas as to why these logs are not appearing; any guidance would be greatly appreciated.
Upvotes: 1
Views: 779
Reputation: 6582
If you import logging in the following way, it should work:
import logging

from apache_beam.io.gcp import gcsio  # needed for GcsIO below


def check_file_size(file):
    file = 'gs://Redacted'
    file_size = gcsio.GcsIO().size(file)  # object size in bytes
    print(f'FILE SIZE: {file_size}')
    logging.info(f'FILE SIZE: {file_size}')
    logging.info(f'file_size..... - {file_size}')
    if file_size <= 4:
        print(f'FILE SIZE: {file_size}')
        logging.info(f'FILE SIZE: {file_size}')
        raise Exception("ERROR IN FILE")
    else:
        print(f'FILE SIZE: {file_size}')
        logging.info(f'FILE SIZE: {file_size}')
        return file_size
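The main differences from your snippet are that the records go through the standard logging module directly rather than the custom TEST_LOGGER object, a GcsIO instance is created before calling size(), and the messages are built with f-strings so that concatenating the integer size does not raise a TypeError before the log is ever emitted.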
In this case, the logs should appear in the Dataflow UI steps and also in Cloud Logging.
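For completeness, here is a minimal sketch of how such a function could be wired into a pipeline so that its log lines show up as worker logs for that step. The bucket path, step names, and pipeline setup below are placeholders, not taken from your job:

import logging

import apache_beam as beam
from apache_beam.io.gcp import gcsio


def check_file_size(file):
    # GcsIO().size() returns the object size in bytes
    file_size = gcsio.GcsIO().size(file)
    logging.info('FILE SIZE: %s', file_size)
    if file_size <= 4:
        raise Exception("ERROR IN FILE")
    return file_size


if __name__ == '__main__':
    # Make sure INFO-level records are not filtered out
    logging.getLogger().setLevel(logging.INFO)
    with beam.Pipeline() as p:  # pass your Dataflow PipelineOptions here
        (p
         | 'CreateFiles' >> beam.Create(['gs://my-bucket/my-file'])  # placeholder path
         | 'CheckFileSize' >> beam.Map(check_file_size))

Run on Dataflow, the 'FILE SIZE' message should then be visible both in the worker logs for the CheckFileSize step and in Cloud Logging.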
Upvotes: 2