0xF

Reputation: 586

Why pyspark sql isn't returning value

I have created a Spark RDD table which I'm trying to query, but the result is not a value as expected. Any idea what's going wrong?

In [8]:people.take(15)
Out[8]:
[Row(num1=u'27477.23', num2=u'28759.862564'),
 Row(num1=u'14595.27', num2=u'4753.822798'),
 Row(num1=u'16799.17', num2=u'535.51891148'),
 Row(num1=u'171.85602', num2=u'905.14'),
 Row(num1=u'878488.70139', num2=u'1064731.4136'),
 Row(num1=u'1014.59748', num2=u'1105.91'),
 Row(num1=u'184.53171', num2=u'2415.61'),
 Row(num1=u'28113.931963', num2=u'71011.376036'),
 Row(num1=u'1471.75', num2=u'38.0268375'),
 Row(num1=u'33645.52', num2=u'15341.160558'),
 Row(num1=u'5464.95822', num2=u'14457.08'),
 Row(num1=u'753.58258673', num2=u'3243.75'),
 Row(num1=u'26469.395374', num2=u'38398.135846'),
 Row(num1=u'4709.5768681', num2=u'1554.61'),
 Row(num1=u'1593.1114983', num2=u'2786.4538546')]

The schema is encoded in a string.

In [9]:
schemaString = "num1 num2"
In [10]:

fields = [StructField(field_name, StringType(), True) for field_name in schemaString.split()]
schema = StructType(fields)
In [11]:

# Apply the schema to the RDD
schemaPeople = sqlContext.applySchema(people, schema)

Register the SchemaRDD as a table.

In [12]:
schemaPeople.registerTempTable("people")

SQL can be run over SchemaRDDs that have been registered as a table.

In [14]:
results = sqlContext.sql("SELECT sum(num1) FROM people")
In [18]:
results
Out[18]:
MapPartitionsRDD[52] at mapPartitions at SerDeUtil.scala:143

Upvotes: 0

Views: 1821

Answers (1)

zero323

Reputation: 330443

The same as transformations on plain RDDs, Spark SQL queries are only a description of the required operations. If you want to get the results, you have to trigger an action:

>>> results.first()
Row(_c0=1040953.1831101299)
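The transformation/action split above can be sketched with plain Python generators (a loose analogy, not Spark's actual machinery): building the pipeline does no work, and only consuming it computes anything.

```python
# Loose analogy: a generator pipeline describes work without doing it,
# much like a Spark SQL query or an RDD transformation.
nums = [u'27477.23', u'14595.27', u'16799.17']

# "Transformation": nothing is computed yet, we only get a lazy object.
as_floats = (float(x) for x in nums)

# "Action": consuming the generator actually performs the casts.
total = sum(as_floats)
print(total)
```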

Just for clarity, it would be better to cast your data explicitly rather than depend on an implicit conversion:

>>> result = sqlContext.sql("SELECT SUM(CAST(num1 AS FLOAT)) FROM people")
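As a sanity check outside Spark, casting the fifteen `num1` strings from the question to floats and summing them in plain Python reproduces the value Spark returned (`1040953.1831101299`):

```python
# The num1 column from the question's people.take(15) output.
num1_values = [
    u'27477.23', u'14595.27', u'16799.17', u'171.85602',
    u'878488.70139', u'1014.59748', u'184.53171', u'28113.931963',
    u'1471.75', u'33645.52', u'5464.95822', u'753.58258673',
    u'26469.395374', u'4709.5768681', u'1593.1114983',
]

# Explicit cast, mirroring CAST(num1 AS FLOAT) in the SQL query.
total = sum(float(v) for v in num1_values)
print(total)  # matches Spark's sum up to floating-point precision
```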

Upvotes: 2
