
Hive script fails with Error: java.lang.RuntimeException: Error in configuring object and Caused by: java.lang.IndexOutOfBoundsException: Index: 9, Size: 9


The error is thrown in the reduce phase. The detailed error message is:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: Error in configuring object
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:409)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
	... 9 more
Caused by: java.lang.RuntimeException: Reduce operator initialization failed
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.configure(ExecReducer.java:173)
	... 14 more
Caused by: java.lang.IndexOutOfBoundsException: Index: 9, Size: 9
	at java.util.ArrayList.rangeCheck(ArrayList.java:635)
	at java.util.ArrayList.get(ArrayList.java:411)
	at org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector.init(StandardStructObjectInspector.java:121)
	at org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector.<init>(StandardStructObjectInspector.java:109)
	at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory.getStandardStructObjectInspector(ObjectInspectorFactory.java:283)
	at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory.getStandardStructObjectInspector(ObjectInspectorFactory.java:268)
	at org.apache.hadoop.hive.ql.exec.LateralViewJoinOperator.initializeOp(LateralViewJoinOperator.java:105)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeOp(Operator.java:401)
	at org.apache.hadoop.hive.ql.exec.UDTFOperator.initializeOp(UDTFOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:65)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeOp(Operator.java:401)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:65)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:460)
	at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:416)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.initializeOp(GroupByOperator.java:427)
	at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:376)
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.configure(ExecReducer.java:166)
	... 14 more


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched: 
Stage-Stage-1: Map: 75  Reduce: 5   Cumulative CPU: 2318.05 sec   HDFS Read: 5031117214 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 38 minutes 38 seconds 50 msec

My Hive version is hive-0.13.1+cdh5.3.6+397.

 

The script being executed is:

insert into table Hive_OnlineFirstActive_newtest partition(ProductId,partitiondate)
select newData.* from (
  select mytable.ClientIp,
         iptocode(mytable.ClientIp,'6') as province,
         iptocode(mytable.ClientIp,'7') as city,
         mytable.imei1, mytable.imei2, mytable.plat, mytable.nettype, mytable.myvername,
         mytable.os, mytable.clientchannel, hour(mytable.serverdate), mytable.productid,
         substring(mytable.serverdate,1,10) as partitiondate
  from (
    select UniqueImeiUDAF(concat_ws('^',ServerDate,IMEI1,IMEI2,Plat,NetType,MyVername,Os,ProductId,ClientChannel,ClientIp)) as r
    from Hive_OnlinePvData
    where partitiondate>='2016-04-16' and partitiondate<='2016-05-01'
    group by IMEI1,ProductId
  ) tt lateral view FirstActiveUDTF(r) mytable
) newData
left join Hive_OnlineFirstActive_newtest oldData
  on newData.imei1 = oldData.imei1 and newData.ProductId = oldData.ProductId
where oldData.imei1 is null;

Note that iptocode is a UDF.
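For context, the custom functions used in the script (iptocode, UniqueImeiUDAF, FirstActiveUDTF) have to be registered in the Hive session before the INSERT runs. Below is a minimal registration sketch; the jar path and class names are hypothetical placeholders, not taken from the original post:

-- Hypothetical registration of the custom functions; jar path and class names are placeholders.
ADD JAR /path/to/custom-hive-functions.jar;
CREATE TEMPORARY FUNCTION iptocode AS 'com.example.hive.udf.IpToCode';
CREATE TEMPORARY FUNCTION UniqueImeiUDAF AS 'com.example.hive.udaf.UniqueImei';
CREATE TEMPORARY FUNCTION FirstActiveUDTF AS 'com.example.hive.udtf.FirstActive';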

 

Searching online turned up this JIRA: https://issues.apache.org/jira/browse/HIVE-5771

 

My understanding is that this is a Hive bug: when the SQL optimizer rewrites the query it may fail to locate the UDF's jar. The issue was fixed in Hive 0.14.0.
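If upgrading to 0.14 or later is not immediately possible, one workaround I would try first (an assumption on my part, not something confirmed by HIVE-5771) is to disable the column-pruning optimizer for the session, since the IndexOutOfBoundsException is thrown while the operators under the LATERAL VIEW are being initialized:

-- Untested workaround sketch (assumption, not from the original post):
-- turn off the column pruner for this session, then re-run the INSERT statement above unchanged.
set hive.optimize.cp=false;

Otherwise the fix, as noted above, is to move to Hive 0.14.0 or later.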

 



Original post: http://www.cnblogs.com/hark0623/p/5578566.html
