Reputation: 353
I have a Spark DataFrame in Java, something like below:
I want it sorted by "Col3", but all rows with the same values of Col1 and Col2 should stay grouped together. The result should be something like below:
Upvotes: 0
Views: 437
Reputation: 532
The groupBy() function is used for aggregation; your requirement only needs orderBy().
Assuming a DataFrame df with the three columns Col1, Col2, and Col3, you can do the following in Spark:
import org.apache.spark.sql.functions.col

val sortedDf = df.orderBy(col("Col1").desc, col("Col2").desc, col("Col3").asc)
A proof of concept is available here: SQLFIDDLE
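Since the question is about Java: Spark's Java API expresses the same call as `df.orderBy(functions.col("Col1").desc(), functions.col("Col2").desc(), functions.col("Col3").asc())`. The ordering this produces (rows grouped by Col1/Col2, ascending by Col3 within each group) can be illustrated with plain Java collections; the `Row` record and `sortGrouped` helper below are illustrative names, not part of any Spark API:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class GroupedSort {
    // A row with the same three columns as the question's DataFrame.
    record Row(String col1, String col2, int col3) {}

    // Mirrors the ordering of:
    // df.orderBy(col("Col1").desc(), col("Col2").desc(), col("Col3").asc())
    static List<Row> sortGrouped(List<Row> rows) {
        List<Row> out = new ArrayList<>(rows);
        out.sort(Comparator.comparing(Row::col1, Comparator.reverseOrder())
                           .thenComparing(Row::col2, Comparator.reverseOrder())
                           .thenComparingInt(Row::col3));
        return out;
    }

    public static void main(String[] args) {
        List<Row> sorted = sortGrouped(List.of(
                new Row("a", "x", 3),
                new Row("b", "y", 1),
                new Row("a", "x", 2),
                new Row("b", "y", 4)));
        // Rows with equal (Col1, Col2) end up adjacent, sorted by Col3 inside each group.
        sorted.forEach(r -> System.out.println(r.col1() + " " + r.col2() + " " + r.col3()));
    }
}
```

Because the sort keys are applied left to right, every row sharing a (Col1, Col2) pair lands in one contiguous block, and Col3 only breaks ties inside that block, which is exactly the "grouped but sorted" layout the question asks for.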
Upvotes: 1