Dex

Reputation: 57

How to escape a single quote in Spark SQL

I am new to PySpark and SQL. I am working on the query below:

sqlContext.sql("Select Crime_type, substring(Location,11,100) as Location_where_crime_happened, count(*) as Count\
                            From street_SQLTB\
                            where LSOA_name = 'City of London 001F' and \
                            group by Location_where_crime_happened, Crime_type\
                            having Location_where_crime_happened = 'Alderman'S Walk'")

I am struggling with the single quote: I need to filter on Alderman'S Walk. It is probably an easy fix, but I am unable to figure it out. Your help is much appreciated.

Upvotes: 2

Views: 8325

Answers (1)

Sudhin

Reputation: 149

Try this:

import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('SparkByExamples.com').getOrCreate()

# Sample data; the last row holds the value with the embedded single quote.
simpleData = [("James", "Sales", "NY", 90000, 34, 10000),
    ("Michael", "Sales", "NY", 86000, 56, 20000),
    ("Robert", "Sales", "CA", 81000, 30, 23000),
    ("Maria", "Alderman'S Walk", "CA", 90000, 24, 23000)
]
columns = ["employee_name", "department", "state", "salary", "age", "bonus"]
df1 = spark.createDataFrame(data=simpleData, schema=columns)
df1.createOrReplaceTempView('temp')

# Option 1: wrap the SQL string literal in double quotes, so the single
# quote inside the value needs no escaping.
df = spark.sql("""select * from temp where department = "Alderman'S Walk" """)
df.show()
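This works because Spark SQL accepts string literals written in either single or double quotes by default, so a double-quoted literal can carry an embedded single quote as-is.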

or

# Option 2: escape the quote with a backslash. The Python string needs
# \\ so that Spark SQL itself receives \'.
df = spark.sql("select * from temp where department = 'Alderman\\'S Walk' ")
df.show()
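If you would rather sidestep SQL escaping entirely, here is a minimal sketch using the DataFrame API on the df1 built above:

from pyspark.sql.functions import col

# The comparison value is an ordinary Python string, so the embedded
# single quote needs no SQL-level escaping at all.
df = df1.filter(col("department") == "Alderman'S Walk")
df.show()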

Filtered output: the single row whose department is Alderman'S Walk.
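Applied to the query from the question, the same idea looks like this (a sketch, assuming the street_SQLTB view from the question is registered; the location filter is moved from HAVING to WHERE, since it is a plain column filter, and the GROUP BY repeats the expression rather than the alias):

spark.sql("""
    SELECT Crime_type,
           substring(Location, 11, 100) AS Location_where_crime_happened,
           count(*) AS Count
    FROM street_SQLTB
    WHERE LSOA_name = 'City of London 001F'
      AND substring(Location, 11, 100) = "Alderman'S Walk"
    GROUP BY Crime_type, substring(Location, 11, 100)
""").show()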

Upvotes: 2
