Tags: usr failed work -- bin class cal .sh bashrc
soyo@soyo-VPCCB3S1C:~$ start-slaves.sh
soyo-slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
soyo-slave01: failed to launch: nice -n 0 /usr/local2/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://soyo-VPCCB3S1C:7077
soyo-slave01: /usr/local2/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/bin/java: No such file or directory
soyo-slave01: full log in /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
Solution:
Edit the JDK install path (JAVA_HOME) in ~/.bashrc on the soyo-slave01 node — on Ubuntu 14.04 the JDK was not the default OpenJDK, so the path spark-class resolved did not exist. After correcting it, the worker started normally.
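A minimal sketch of the fix, assuming the real JDK lives somewhere other than the OpenJDK path in the error log. The install path `/usr/lib/jvm/jdk1.8.0` below is hypothetical — substitute wherever the JDK was actually unpacked on soyo-slave01:

```shell
# On soyo-slave01, in ~/.bashrc:
# point JAVA_HOME at the real JDK instead of the missing default OpenJDK path.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0   # hypothetical path; use the actual install dir
export PATH=$JAVA_HOME/bin:$PATH

# Then reload the environment and confirm the binary spark-class will invoke exists:
source ~/.bashrc
ls "$JAVA_HOME/bin/java"
"$JAVA_HOME/bin/java" -version
```

Note that `start-slaves.sh` launches the worker over SSH in a non-interactive shell, so the variable must be visible there — setting `JAVA_HOME` in `conf/spark-env.sh` on the slave is an alternative if the `.bashrc` change alone is not picked up.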
Spark distributed environment --- slave node fails to start (solved)
Original article: http://www.cnblogs.com/soyo/p/7930205.html