user3447653
user3447653

Reputation: 4148

Convert spark rdd to pandas dataframe

I have an rdd with 15 fields. To do some computation, I have to convert it to pandas dataframe.

I tried df.toPandas(), which did not work. I also tried extracting every record from the rdd, splitting it on spaces, and putting the result into a dataframe; that did not work either.

[u'2015-07-22T09:00:28.019143Z ssh 123.242.248.130:54635 10.0.6.158:80 0.000022 0.026109 0.00002 200 200 0 699 "GET https://google.coml HTTP/1.1" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.130 Safari/537.36" ECDE-PAM TLSv1.2',
 u'2015-07-22T09:00:27.894580Z ssh 203.91.211.44:51402 10.0.4.150:80 0.000024 0.15334 0.000026 200 200 0 1497 "GET https://yahoo.com HTTP/1.1" "Mozilla/5.0 (Windows NT 6.1; rv:39.0) Gecko/20100101 Firefox/39.0" ECDL-RAT TLSv1.2']

Is there some function I can use?

Thanks in advance!!

Upvotes: 2

Views: 4114

Answers (1)

Mariusz
Mariusz

Reputation: 13926

If you have an rdd in the following form:

>>> rdd.collect()
[[u'2015-07-22T09:00:28.019143Z', u'ssh', u'123.242.248.130:54635', u'10.0.6.158:80', u'0.000022', u'0.026109', u'0.00002', u'200', u'200', u'0', u'699', u'"GET https://google.coml HTTP/1.1"', u'"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML like Gecko) Chrome/43.0.2357.130 Safari/537.36"', u'ECDE-PAM', u'TLSv1.2'], 
 [u'2015-07-22T09:00:27.894580Z', u'ssh', u'203.91.211.44:51402', u'10.0.4.150:80', u'0.000024', u'0.15334', u'0.000026', u'200', u'200', u'0', u'1497', u'"GET https://yahoo.com HTTP/1.1"', u'"Mozilla/5.0 (Windows NT 6.1; rv:39.0) Gecko/20100101 Firefox/39.0"', u'ECDL-RAT', u'TLSv1.2']]

then rdd.toDF(['column1_name', 'column2_name', ...., 'column15_name']).toPandas() will do the job. Note that this requires the pandas Python package to be installed on the driver.
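If the rdd still holds raw log strings like in the question, rather than lists of 15 fields, you need to tokenize each line first. A minimal sketch (the column names below are my own guesses for an ELB-style access log, not an official schema) using shlex.split, which keeps the quoted request and user-agent strings together as single fields:

```python
import shlex
import pandas as pd

# Raw log line, as shown in the question (one of the rdd elements).
lines = [
    '2015-07-22T09:00:28.019143Z ssh 123.242.248.130:54635 10.0.6.158:80 '
    '0.000022 0.026109 0.00002 200 200 0 699 '
    '"GET https://google.coml HTTP/1.1" '
    '"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 '
    '(KHTML, like Gecko) Chrome/43.0.2357.130 Safari/537.36" '
    'ECDE-PAM TLSv1.2',
]

# Placeholder column names -- rename to whatever your 15 fields mean.
columns = ['timestamp', 'name', 'client', 'backend', 'request_time',
           'backend_time', 'response_time', 'elb_status', 'backend_status',
           'received_bytes', 'sent_bytes', 'request', 'user_agent',
           'cipher', 'protocol']

# shlex.split treats quoted substrings as single tokens, giving 15 fields.
rows = [shlex.split(line) for line in lines]
df = pd.DataFrame(rows, columns=columns)
```

On the Spark side the same idea would be rdd.map(shlex.split).toDF(columns).toPandas(), with collect happening on the driver, so this only makes sense when the data fits in memory.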

Upvotes: 2

Related Questions