
Spark distributed environment --- slave node fails to start (solved)

Date: 2017-11-30 19:14:19


soyo@soyo-VPCCB3S1C:~$ start-slaves.sh 
soyo-slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
soyo-slave01: failed to launch: nice -n 0 /usr/local2/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://soyo-VPCCB3S1C:7077
soyo-slave01:   /usr/local2/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/bin/java: No such file or directory
soyo-slave01: full log in /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
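
The launch fails because spark-class on soyo-slave01 resolves the java binary from JAVA_HOME, which points at /usr/lib/jvm/java-8-openjdk-amd64, a path that does not exist on that node. A quick generic check (not from the original post) to find where the JDK actually lives on the slave:

soyo@soyo-slave01:~$ readlink -f $(which java)   # resolve the real java binary behind any symlinks
soyo@soyo-slave01:~$ ls /usr/lib/jvm/            # list the JDKs actually installed on this node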
Solution:
On the soyo-slave01 node, fix the JDK installation path set in .bashrc (the JDK on that Ubuntu 14.04 machine is not the default OpenJDK install, so the path differs). After that the worker starts normally.
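
A minimal sketch of the .bashrc fix, assuming the JDK on soyo-slave01 is installed under /usr/local2/jdk1.8.0_151 (a hypothetical path; substitute whatever the check above reports):

# in ~/.bashrc on soyo-slave01
export JAVA_HOME=/usr/local2/jdk1.8.0_151   # assumed install location, replace with the real JDK path
export PATH=$JAVA_HOME/bin:$PATH

Then reload the file on the slave and restart the workers from the master:

soyo@soyo-slave01:~$ source ~/.bashrc
soyo@soyo-VPCCB3S1C:~$ start-slaves.sh

Note that workers launched over ssh may not source .bashrc in a non-interactive shell; setting JAVA_HOME in /usr/local2/spark/conf/spark-env.sh on the slave is another place Spark's launch scripts look for it.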


Original post: http://www.cnblogs.com/soyo/p/7930205.html
