Reputation: 95
Hi, I'm new to PySpark and I'm trying to convert a pyspark.sql.dataframe.DataFrame into a list of dictionaries.
Below is my dataframe; its type is <class 'pyspark.sql.dataframe.DataFrame'>:
+------------------+----------+------------------------+
| title|imdb_score|Worldwide_Gross(dollars)|
+------------------+----------+------------------------+
| The Eight Hundred| 7.2| 460699653|
| Bad Boys for Life| 6.6| 426505244|
| Tenet| 7.8| 334000000|
|Sonic the Hedgehog| 6.5| 308439401|
| Dolittle| 5.6| 245229088|
+------------------+----------+------------------------+
I would like to convert it into:
[{"title":"The Eight Hundred", "imdb_score":7.2, "Worldwide_Gross(dollars)":460699653},
{"title":"Bad Boys for Life", "imdb_score":6.6, "Worldwide_Gross(dollars)":426505244},
{"title":"Tenet", "imdb_score":7.8, "Worldwide_Gross(dollars)":334000000},
{"title":"Sonic the Hedgehog", "imdb_score":6.5, "Worldwide_Gross(dollars)":308439401},
{"title":"Dolittle", "imdb_score":5.6, "Worldwide_Gross(dollars)":245229088}]
How should I do this? Thanks in advance!
Upvotes: 8
Views: 7417
Reputation: 42352
You can map each row to a dictionary and collect the results:
df.rdd.map(lambda row: row.asDict()).collect()
Upvotes: 9