user2763088

Reputation: 383

How to convert spark sql dataframe to numpy array?

I'm using pyspark and imported a hive table into a dataframe.

df = sqlContext.sql("from hive_table select *") 

I need help on converting this df to numpy array. You may assume hive_table has only one column.

Can you please suggest? Thank you in advance.

Upvotes: 3

Views: 8239

Answers (1)

zero323

Reputation: 330423

You can:

sqlContext.range(0, 10).toPandas().values  # .reshape(-1) for 1d array
array([[0],
       [1],
       [2],
       [3],
       [4],
       [5],
       [6],
       [7],
       [8],
       [9]])
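The `.reshape(-1)` comment above is worth spelling out: `toPandas()` returns an ordinary pandas DataFrame, and `.values` on a one-column DataFrame is a 2-D array of shape `(n, 1)`, not a 1-D array. A minimal sketch with plain pandas and numpy (using a made-up `id` column to stand in for the single column of `hive_table`):

```python
import numpy as np
import pandas as pd

# Stand-in for toPandas() on a one-column DataFrame,
# like the single-column hive_table in the question.
df = pd.DataFrame({"id": range(10)})

arr = df.values        # 2-D array, shape (10, 1)
flat = arr.reshape(-1) # 1-D array, shape (10,)

print(arr.shape)   # (10, 1)
print(flat.shape)  # (10,)
```

So for the question's one-column case, `df.toPandas().values.reshape(-1)` yields the flat numpy array most callers expect.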

but it is unlikely you really want to. The created array will be local to the driver node, so it is rarely useful. If you're looking for some variant of a distributed array-like data structure, there are a number of possible choices in Apache Spark:

and independent of Apache Spark:

Upvotes: 4
