
Error executing Hive processing

description

The command hive -f "c:\Analyze Tweets with Hive.txt" generates the following error after building three tables and stops execution:

c:\apps\dist\hive-0.9.0\bin>hive -f "c:\Analyze Tweets with Hive.txt"
Hive history file=c:\apps\dist\hive-0.9.0\logs\history/hive_job_log_admin_201304021247_470612975.txt
Logging initialized using configuration in file:/C:/apps/dist/hive-0.9.0/conf/hive-log4j.properties
OK
Time taken: 21.853 seconds
OK
Time taken: 0.41 seconds
OK
Time taken: 0.094 seconds
OK
Time taken: 0.447 seconds
Total MapReduce jobs = 2
Launching Job 1 out of 2
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201304020848_0002, Tracking URL = http://jobtrackerhost:50030/jobdetails.jsp?jobid=job_201304020848_0002
Kill Command = c:\apps\dist\hadoop-1.1.0-SNAPSHOT\bin\hadoop.cmd job -Dmapred.job.tracker=jobtrackerhost:9010 -kill job_201304020848_0002
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2013-04-02 12:48:36,743 Stage-1 map = 0%, reduce = 0%
2013-04-02 12:49:37,191 Stage-1 map = 0%, reduce = 0%
2013-04-02 12:50:11,365 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201304020848_0002 with errors
Hive history file=c:\apps\dist\hive-0.9.0\logs\history/hive_job_log_admin_201304021250_1991860970.txt
Hive history file=c:\apps\dist\hive-0.9.0\logs\history/hive_job_log_admin_201304021250_1644682636.txt
Error during job, obtaining debugging information...
Hive history file=c:\apps\dist\hive-0.9.0\logs\history/hive_job_log_admin_201304021250_77722224.txt
Hive history file=c:\apps\dist\hive-0.9.0\logs\history/hive_job_log_admin_201304021250_1214186134.txt
Examining task ID: task_201304020848_0002_m_000002 (and more) from job job_201304020848_0002
Exception in thread "Thread-31" java.lang.RuntimeException: Error while reading from task log url
    at org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getStackTraces(TaskLogProcessor.java:242)
    at org.apache.hadoop.hive.ql.exec.JobDebugger.showJobFailDebugInfo(JobDebugger.java:227)
    at org.apache.hadoop.hive.ql.exec.JobDebugger.run(JobDebugger.java:92)
    at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.IOException: Server returned HTTP response code: 400 for URL: http://10.241.212.24:50060/tasklog?taskid=attempt_201304020848_0002_m_000000_1&start=-8193
    at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1612)
    at java.net.URL.openStream(URL.java:1035)
    at org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getStackTraces(TaskLogProcessor.java:193)
    ... 3 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

comments

Amoll wrote May 17, 2013 at 11:55 PM

There was a slight update to the API for writing hive external tables. You will need to change the location for the external tables in analyze tweets with hive.txt to the following format: asv://container@storageaccount.blob.core.windows.net/foldername New instructions are coming soon