1. Preface
When consuming from Kafka with Spark Streaming's Kafka Direct API, submitting the jar via spark-submit fails with the following error, which says that KafkaUtils cannot be found.
```
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
    at com.zhkmxx.scala.app.KafkaStream$.main(KafkaStream.scala:33)
    at com.zhkmxx.scala.app.KafkaStream.main(KafkaStream.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 11 more
16/12/13 13:16:09 INFO spark.SparkContext: Invoking stop() from shutdown hook
```
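For context, the failing line in the trace (KafkaStream.scala:33) is typically a call to KafkaUtils.createDirectStream along the following lines. This is only a sketch of the usual Spark 1.6 Direct API pattern; the broker address, topic name, and object name are placeholders, not values from the original project:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaStreamSketch")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Hypothetical broker list and topic; replace with your own.
    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")
    val topics      = Set("test-topic")

    // This call is what requires the Kafka integration classes on the
    // classpath at runtime; without them, spark-submit fails with
    // NoClassDefFoundError for KafkaUtils$.
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    stream.map(_._2).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```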
2. Solution
Because I set up my Spark environment by building it with Maven, the KafkaUtils class lives in spark-examples-1.6.2-hadoop2.6.0.jar. You therefore need to point IDEA at this jar's location on the Linux machine, so that the jar you submit can resolve it on the classpath.
The configuration is shown in the figure below; open Project Structure:
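An alternative that avoids IDE classpath configuration altogether is to hand the extra jar to spark-submit directly with the `--jars` option, so it is shipped with the application at launch. A minimal sketch, where the jar paths are examples and must be replaced with the actual locations on your machine:

```shell
# Ship the jar containing KafkaUtils alongside the application jar.
# Both paths below are illustrative placeholders.
spark-submit \
  --class com.zhkmxx.scala.app.KafkaStream \
  --jars /opt/spark/lib/spark-examples-1.6.2-hadoop2.6.0.jar \
  /path/to/your-app.jar
```

Another common approach is to declare the Kafka integration (for Spark 1.6.x, the `spark-streaming-kafka_2.10` artifact) as a build dependency and package an uber/fat jar, so no external jar needs to be configured at all.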