Reputation: 21
I'm running a very simple test example using Azure HDInsight and Hive/Python. Hive does not appear to be loading the Python script.
Hive Code:
ADD FILE asv:///mapper_test.py;
SELECT
  TRANSFORM (dob)
  USING 'python asv:///mapper_test.py' AS (dob)
FROM test_table;
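For context, a Hive TRANSFORM script just reads tab-separated rows from standard input and writes tab-separated rows to standard output. mapper_test.py is a simple pass-through of the dob column, roughly along these lines (simplified sketch):

#!/usr/bin/env python
# mapper_test.py (simplified sketch): echo the dob column back to Hive.
import sys

for line in sys.stdin:
    # Hive passes each input row as tab-separated fields ending in a newline.
    dob = line.strip().split('\t')[0]
    # Whatever is written to stdout becomes the AS (dob) output column.
    print(dob)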
Error:
Hive history file=c:\apps\dist\hive-0.9.0\logs/hive_job_log_RD00155DD090CC$_201308202117_1738335083.txt
Logging initialized using configuration in file:/C:/apps/dist/hive-0.9.0/conf/hive-log4j.properties
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201308201542_0025, Tracking URL = http://jobtrackerhost:50030/jobdetails.jsp?jobid=job_201308201542_0025
Kill Command = c:\apps\dist\hadoop-1.1.0-SNAPSHOT\bin\hadoop.cmd job -Dmapred.job.tracker=jobtrackerhost:9010 -kill job_201308201542_0025
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2013-08-20 21:18:04,911 Stage-1 map = 0%, reduce = 0%
2013-08-20 21:19:05,175 Stage-1 map = 0%, reduce = 0%
2013-08-20 21:19:32,292 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201308201542_0025 with errors
Error during job, obtaining debugging information...
Examining task ID: task_201308201542_0025_m_000002 (and more) from job job_201308201542_0025
Exception in thread "Thread-24" java.lang.RuntimeException: Error while reading from task log url
at org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getStackTraces(TaskLogProcessor.java:242)
at org.apache.hadoop.hive.ql.exec.JobDebugger.showJobFailDebugInfo(JobDebugger.java:227)
at org.apache.hadoop.hive.ql.exec.JobDebugger.run(JobDebugger.java:92)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.IOException: Server returned HTTP response code: 400 for URL: http://workernode1:50060/tasklog?taskid=attempt_201308201542_0025_m_000000_7&start=-8193
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1616)
at java.net.URL.openStream(URL.java:1035)
at org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getStackTraces(TaskLogProcessor.java:193)
... 3 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
Upvotes: 0
Views: 535
Reputation: 491
I had a similar experience while using Pig + Python on Azure HDInsight Version 2.0. One thing I found out was that Python was available only on the head node and not on all the nodes in the cluster. You can see a similar question here.
You can remote login to the head node of the cluster, find the IPs of the task tracker nodes from there, and then remote login to any of the task tracker nodes to check whether Python is installed on that node.
This issue is fixed in HDInsight Version 2.1 clusters, but Python is still not added to the PATH; you may need to do that yourself.
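Even when Python is installed, the TRANSFORM can still fail with the same kind of generic map-task error if python.exe cannot be found by the task process. As a rough diagnostic (a sketch only; you would ship and invoke it with ADD FILE and TRANSFORM just like the original script, possibly giving the full path to python.exe in the USING clause if it is not on the PATH), you can run a script that reports which interpreter and PATH the task nodes actually see:

# diagnose_env.py (hypothetical diagnostic TRANSFORM script):
# if a task node manages to launch it at all, it reports the interpreter
# and the PATH visible to the task process.
import os
import sys

for line in sys.stdin:
    # Emit one tab-separated diagnostic row per input row so Hive has
    # output columns to read back.
    print('\t'.join([sys.executable, os.environ.get('PATH', '')]))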
Upvotes: 1