Hive issue: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
Hive reported an error during execution; focus on the key part of the log (highlighted in yellow in the original):
2019-02-01 09:56:54,623 ERROR [pool-7-thread-4] dao.IHiveDaoImpl - java.sql.SQLException: org.apache.hive.service.cli.HiveSQLException:
Error while processing statement: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
    at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:380)
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:257)
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:348)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:362)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
So the failure happened while the MapReduce job was running. Checking confirmed that the MapReduce job was indeed launched and executed, so the next step is to pull the MR task error log (for example from the JobHistory UI, or with yarn logs -applicationId <appId>, using the application id that matches the attempt id below):
2019-02-01 10:28:35,832 INFO [IPC Server handler 4 on 38091] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1537175606568_162793_m_000000_3: Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"log":"5\u001aNEWEHIREWEB17.2019012911\u001a1\u001a3\u001a1548730807629\u001a43\u001a14\u001a2223123\u001a2577551\u001a8e56221be35a44f8845064b8cc8f21f9\u001a61.170.197.152\u001a","webname":"ehireLog","mon":"201901","dt":"20190129"}
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:169)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"log":"5\u001aNEWEHIREWEB17.2019012911\u001a1\u001a3\u001a1548730807629\u001a43\u001a14\u001a2223123\u001a2577551\u001a8e56221be35a44f8845064b8cc8f21f9\u001a61.170.197.152\u001a","webname":"ehireLog","mon":"201901","dt":"20190129"}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
    ... 8 more
Caused by: com.tracker.common.db.simplehbase.exception.SimpleHBaseException: convert result exception. cells=[003\x111/data:id/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:isInSelector/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:isStats/1538028988105/Put/vlen=4/seqid=0, 003\x111/data:pageDesc/1538028988105/Put/vlen=6/seqid=0, 003\x111/data:pageType/1548918298621/Put/vlen=1/seqid=0, 003\x111/data:webId/1538028988105/Put/vlen=4/seqid=0] type=class com.tracker.common.data.model.dict.website.Page
    at com.tracker.common.db.simplehbase.HbaseClient.convertToHbaseObjectResult(HbaseClient.java:337)
    at com.tracker.common.db.simplehbase.HbaseClientImpl$6.handleData(HbaseClientImpl.java:177)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.handData_internal(HbaseClientImpl.java:733)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.handDataByRowPrefixList(HbaseClientImpl.java:651)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.findObjectByRowPrefixList(HbaseClientImpl.java:174)
    at com.tracker.common.db.simplehbase.HbaseClientImpl.findObjectByRowPrefix(HbaseClientImpl.java:167)
    at com.tracker.common.data.dao.dict.WebDictDataDao$6.apply(WebDictDataDao.java:154)
    at com.tracker.common.data.dao.dict.WebDictDataDao$6.apply(WebDictDataDao.java:151)
    at com.tracker.common.cache.LocalMapCache.getOrElse(LocalMapCache.java:66)
    at com.tracker.common.data.dao.dict.WebDictDataDao.getPageList(WebDictDataDao.java:151)
    at com.tracker.common.data.dao.dict.WebDictDataDao.loadDictToCache(WebDictDataDao.java:36)
    at com.tracker.common.data.query.DictDataQuery.loadLogPaserDict(DictDataQuery.java:84)
    at com.tracker.hive.func.udf.parse.ParseLog.initialize(ParseLog.java:64)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:141)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:146)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:140)
    at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorHead.initialize(ExprNodeEvaluatorHead.java:39)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:80)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
    ... 9 more
Caused by: com.tracker.common.db.simplehbase.exception.SimpleHBaseException: java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 1
    at com.tracker.common.db.simplehbase.HbaseClient.convertBytesToPOJOField(HbaseClient.java:374)
    at com.tracker.common.db.simplehbase.HbaseClient.convertToHbaseObjectResult(HbaseClient.java:332)
    ... 33 more
Caused by: java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 1
    at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset(Bytes.java:632)
    at org.apache.hadoop.hbase.util.Bytes.toInt(Bytes.java:802)
    at org.apache.hadoop.hbase.util.Bytes.toInt(Bytes.java:778)
    at com.tracker.coprocessor.utils.TypeHandlerHolder$IntegerHandler.toObject(TypeHandlerHolder.java:311)
    at com.tracker.common.db.simplehbase.HbaseClient.convertBytesToPOJOField(HbaseClient.java:371)
    ... 34 more
Looking at the highlighted part, the failure happens while converting an HBase result into its mapped entity class, com.tracker.common.data.model.dict.website.Page. The root cause, java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 1, means Bytes.toInt is being asked to decode a 4-byte int from a cell value that is only 1 byte long (note the data:pageType cell with vlen=1 in the cell dump, while the other numeric columns have vlen=4).
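To make the mismatch concrete, here is a minimal, self-contained sketch (not from the original post; the class name ByteWidthMismatchDemo and the sample value are made up) that reproduces the bottom-most exception with the same org.apache.hadoop.hbase.util.Bytes utility seen in the trace:

```java
import org.apache.hadoop.hbase.util.Bytes;

public class ByteWidthMismatchDemo {
    public static void main(String[] args) {
        // The data:pageType cell in the dump has vlen=1, i.e. only one byte
        // was stored after the dictionary table's type was changed.
        byte[] storedValue = new byte[] { 3 };

        // The old entity class still maps the column to a 4-byte int, so the
        // integer handler calls Bytes.toInt on a 1-byte array, which throws:
        //   java.lang.IllegalArgumentException:
        //     offset (0) + length (4) exceed the capacity of the array: 1
        int pageType = Bytes.toInt(storedValue);
        System.out.println(pageType);
    }
}
```

This is exactly the situation TypeHandlerHolder$IntegerHandler runs into when the entity field is still declared as an int but the stored cell has shrunk to a single byte.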
Cause: a column type in the HBase data dictionary table was changed, but the jar used by Hive was not updated, so the Hive UDF still carries the old entity class mapping.
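For illustration, a hypothetical sketch of the kind of mismatch sitting in the stale jar: the field names below come from the cell dump, but the real types and annotations of com.tracker.common.data.model.dict.website.Page are not visible in the log, so treat this purely as an assumption about what the old and new mappings look like.

```java
// Hypothetical illustration only: field names are taken from the HBase cell dump,
// types are inferred from the vlen values in the MR error log.
public class Page {
    private int id;            // data:id            vlen=4 -> 4-byte int, decodes fine
    private int isInSelector;  // data:isInSelector  vlen=4
    private int isStats;       // data:isStats       vlen=4
    private String pageDesc;   // data:pageDesc      vlen=6
    private int webId;         // data:webId         vlen=4

    // Old jar: the field was declared so that the handler expects 4 bytes.
    // After the dictionary-table change the cell is one byte (vlen=1), so the
    // rebuilt jar needs a field type that matches the new storage width, e.g.:
    private byte pageType;     // data:pageType      vlen=1
}
```

After rebuilding the entity class against the new dictionary schema, the updated jar also has to be made visible to the Hive job again (for example by replacing it on Hive's auxiliary library path or re-running ADD JAR before the query); otherwise the map tasks keep loading the old Page class and fail the same way.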
Original post: https://www.cnblogs.com/parent-absent-son/p/10345722.html