Pradyot Mohanty

Reputation: 149

implement FileNotFound exception in databricks using pyspark

I am trying to implement exception handling using PySpark in Databricks, where I need to check whether a file exists in the source location.

  df = spark.read.option("inferschema", "true").csv("mnt/pnt/abc.csv")

  try:
      df = open("abc.csv", "rt")
      print("File opened")
  except FileNotFoundError:
      print("File does not exist")
  except Exception:
      print("Other error")

I would like to have something like the above code snippet, but I have not been able to make this approach work. Any help would be much appreciated.

Upvotes: 5

Views: 4274

Answers (1)

Davide Anghileri

Reputation: 901

You can't directly catch java.io exceptions in Python; however, you could do something like:

def read_file(path):
  try:
    # dbutils.fs.ls raises an error wrapping java.io.FileNotFoundException
    # when the path does not exist
    dbutils.fs.ls(path)
    return spark.read.option("inferschema", "true").csv(path)
  except Exception as e:
    # The Java exception class name appears in the Python error message
    if 'java.io.FileNotFoundException' in str(e):
      print('File does not exist')
    else:
      print('Other error')

read_file('mnt/pnt/abc.csv')
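The string-matching pattern itself is plain Python and works because Py4J (the bridge Spark uses) surfaces the Java exception class name inside the Python error message. A minimal standalone sketch of just that dispatch logic, with a simulated error message since `dbutils` is only available inside Databricks:

```python
# Dispatch on the Java exception class name embedded in the error string.
def classify_read_error(e: Exception) -> str:
    if 'java.io.FileNotFoundException' in str(e):
        return 'File does not exist'
    return 'Other error'

# Simulated Py4J-style error message (illustrative only, not a real log)
err = Exception('An error occurred while calling o123.ls. '
                ': java.io.FileNotFoundException: /mnt/pnt/abc.csv')
print(classify_read_error(err))  # File does not exist
```

Note that matching on `str(e)` is somewhat brittle: it depends on the Java class name appearing verbatim in the message, so it is best kept as a fallback when the underlying exception type cannot be imported directly.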

Upvotes: 13
