While building an Apache open-source project at home, I hit the following error during compilation:
error: error while loading <root>, error in opening zip file
[ERROR] error: error while loading <root>, error in opening zip file
error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
    at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
    at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
    at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
    at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
    at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
    at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
    at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
    at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
    at scala.tools.nsc.Main$.doCompile(Main.scala:79)
    at scala.tools.nsc.Driver.process(Driver.scala:54)
    at scala.tools.nsc.Driver.main(Driver.scala:67)
    at scala.tools.nsc.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
    at org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
The project I was building was the open-source CarbonData project, with Maven 3.3.9 and JDK 1.8. I searched all over the web and tried the various answers I found, but none of them solved the problem. One hint suggested that a corrupted jar file could be the reason the zip could not be opened. So I deleted every jar in my local Maven repository and rebuilt, and this time the build succeeded. (You could also write a small program to find out which jar is actually broken; see the sketch below.)
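A minimal sketch of such a check, assuming the local repository is at the default ~/.m2/repository (the class name CheckJars and the path are my own illustration, not from the original post): it walks the repository and tries to open every jar as a zip archive, printing the ones that fail.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipFile;

public class CheckJars {
    public static void main(String[] args) throws IOException {
        // Default local Maven repository; adjust the path if yours lives elsewhere (assumption).
        Path repo = Paths.get(System.getProperty("user.home"), ".m2", "repository");
        try (Stream<Path> paths = Files.walk(repo)) {
            paths.filter(p -> p.toString().endsWith(".jar"))
                 .forEach(p -> {
                     // A jar is just a zip archive; if it cannot be opened, it is damaged.
                     try (ZipFile zip = new ZipFile(p.toFile())) {
                         // Opened cleanly, nothing to report.
                     } catch (IOException e) {
                         System.out.println("Corrupted jar: " + p);
                     }
                 });
        }
    }
}

Any jar it reports can then be deleted individually instead of wiping the whole repository.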
But why would a jar file get corrupted in the first place? Thinking back over the build, jar downloads were often very slow at the beginning, and I interrupted them midway and restarted; that is probably what caused it.
With the build done, I started running the example code and got another error:
Starting CarbonExample using spark version 1.5.2
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
    at org.apache.spark.sql.CarbonContext.<init>(CarbonContext.scala:41)
    at org.apache.carbondata.examples.util.ExampleUtils$.createCarbonContext(ExampleUtils.scala:44)
    at org.apache.carbondata.examples.CarbonExample$.main(CarbonExample.scala:27)
    at org.apache.carbondata.examples.CarbonExample.main(CarbonExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 13 more
More searching. Some answers said to download the correct winutils.exe, some said to use the winutils.exe command to grant permissions on the directory, some said a conflict with some installed software was to blame... I tried all of it and still got the same error. In the end, rebooting the machine fixed it.
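For reference, the permission fix those answers describe looks roughly like this on Windows (a sketch only: it assumes winutils.exe sits under %HADOOP_HOME%\bin and that D:\tmp\hive is the scratch directory on the drive you run from; both paths are assumptions, not from my setup, and in my case only the reboot helped):

rem Grant full permissions on the Hive scratch directory (paths assumed)
%HADOOP_HOME%\bin\winutils.exe chmod 777 D:\tmp\hive
rem Check what permissions winutils now reports
%HADOOP_HOME%\bin\winutils.exe ls D:\tmp\hive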
I could have cried......
Original post: http://www.cnblogs.com/nurseryboy/p/6155925.html