Reputation: 8427
I have a Spark DataFrame query that is guaranteed to return a single column with a single Int value. What is the best way to extract this value as an Int from the resulting DataFrame?
Upvotes: 59
Views: 98330
Reputation: 143
If we have a Spark DataFrame such as:
+----------+
|_c0 |
+----------+
|2021-08-31|
+----------+
x = df.first()[0]
print(x)
# 2021-08-31
Upvotes: 0
Reputation: 7396
In PySpark, if the result is a single row with a single column, you can simply take the first element:
df.head()[0]
If you call head(n) instead, a list of Rows is returned, so you have to index twice, e.g. df.head(1)[0][0].
Upvotes: 8
Reputation: 9559
You can use head
df.head().getInt(0)
or first
df.first().getInt(0)
Check the DataFrame Scala docs for more details.
Upvotes: 85
Reputation: 13346
This could solve your problem (Scala):
df.map { row =>
  row.getInt(0)
}.first()
Upvotes: 7