
Spark cluster setup: Worker fails to start (failed to launch org.apache.spark.deploy.worker.Worker)

Posted: 2016-07-19 14:06:46

Tags: spark worker cluster-deployment case



[dyq@master spark-1.5.0]$ ./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.master.Master-1-master.out

slave2: starting org.apache.spark.deploy.worker.Worker, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave2.out

slave1: starting org.apache.spark.deploy.worker.Worker, logging to /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave1.out

slave2: failed to launch org.apache.spark.deploy.worker.Worker:

slave1: failed to launch org.apache.spark.deploy.worker.Worker:

slave1: full log in /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave1.out

slave2: full log in /srv/spark-1.5.0/sbin/../logs/spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave2.out
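The `failed to launch` lines above only summarize the failure; the actual stack trace is in the `.out` files whose paths are printed. As a sketch, a small helper (hypothetical; the path format matches the log locations printed by `start-all.sh`) to read the tail of such a log on each slave:

```python
from pathlib import Path

# Hypothetical helper: return the tail of a Spark daemon log file.
# The example path below matches the locations printed by start-all.sh above.
def tail_log(path: str, lines: int = 50) -> str:
    """Return the last `lines` lines of a log file, or "" if it is missing."""
    p = Path(path)
    if not p.exists():
        return ""
    return "\n".join(p.read_text(errors="replace").splitlines()[-lines:])

# Example (run on slave1):
# print(tail_log("/srv/spark-1.5.0/logs/"
#                "spark-dyq-org.apache.spark.deploy.worker.Worker-1-slave1.out"))
```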


What is causing this?

On the master machine:


Spark Command: /srv/jdk1.7.0_79/bin/java -cp /srv/spark-1.5.0/sbin/../conf/:/srv/spark-1.5.0/lib/spark-assembly-1.5.0-hadoop2.6.0.jar:/srv/spark-1.5.0/lib/datanucleus-core-3.2.10.jar:/srv/spark-1.5.0/lib/datanucleus-api-jdo-3.2.6.jar:/srv/spark-1.5.0/lib/datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip 192.168.0.100 --port 7077 --webui-port 8080

========================================

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

16/07/19 09:11:23 INFO Master: Registered signal handlers for [TERM, HUP, INT]

16/07/19 09:11:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/07/19 09:12:08 INFO SecurityManager: Changing view acls to: dyq

16/07/19 09:12:08 INFO SecurityManager: Changing modify acls to: dyq

16/07/19 09:12:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dyq); users with modify permissions: Set(dyq)

16/07/19 09:13:45 INFO Slf4jLogger: Slf4jLogger started

16/07/19 09:13:56 INFO Remoting: Starting remoting

Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
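Note the timestamps in the master log: between 09:11:23 and 09:13:56 the master spends well over two minutes on routine startup steps, and then the 10-second RPC future times out before remoting comes up. One common culprit for stalls like this (an assumption here, not something the post confirms) is slow hostname/DNS resolution on the master. A quick way to measure a lookup:

```python
import socket
import time

# Assumption, not confirmed by the post: multi-minute gaps between the
# master's startup log lines are often caused by slow hostname/DNS
# resolution. This measures how long a single lookup takes.
def resolve_time(name: str) -> float:
    """Return seconds spent resolving `name` (failures still return elapsed time)."""
    start = time.monotonic()
    try:
        socket.gethostbyname(name)
    except OSError:
        pass  # an unresolvable hostname is itself a useful finding
    return time.monotonic() - start

# On a healthy node this should be well under a second:
# resolve_time(socket.gethostname())
```

If the lookup takes seconds, adding the cluster hostnames to `/etc/hosts` on every node is the usual remedy.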


On the slave machine:

Spark Command: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64/jre/bin/java -cp /srv/spark-1.5.0/sbin/../conf/:/srv/spark-1.5.0/lib/spark-assembly-1.5.0-hadoop2.6.0.jar:/srv/spark-1.5.0/lib/datanucleus-core-3.2.10.jar:/srv/spark-1.5.0/lib/datanucleus-api-jdo-3.2.6.jar:/srv/spark-1.5.0/lib/datanucleus-rdbms-3.2.9.jar -Xms1g -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://192.168.0.100:7077

========================================

Caused by: [Connection refused: /192.168.0.100:7077]
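`Connection refused` means the worker reached the host but nothing was listening on port 7077, which is consistent with the master above never finishing startup. A minimal reachability probe (hypothetical helper; host and port taken from the logs above) that a slave could run before starting its worker:

```python
import socket

# Hypothetical probe: can this worker host open a TCP connection to the
# master's RPC port? The host/port values come from the logs above.
def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run on slave1/slave2 before starting the worker:
# port_reachable("192.168.0.100", 7077)
```

If the probe fails even while the Master process is running, check the firewall on the master (e.g. whether port 7077 is open) and that the Master is bound to the expected address.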



Original post: http://36006798.blog.51cto.com/988282/1827615
