
My Complete Walkthrough of SparkContext, the Core of the Spark Source Code

Date: 2016-09-23 21:42:39

Tags: spark, sparkcontext




Key classes in this walkthrough and where they live in the Spark source tree:

Driver Program (SparkConf)   org.apache.spark.SparkConf

Master                       org.apache.spark.deploy.master.Master

SparkContext                 org.apache.spark.SparkContext

Stage                        org.apache.spark.scheduler.Stage

Task                         org.apache.spark.scheduler.Task

DAGScheduler                 org.apache.spark.scheduler.DAGScheduler

TaskScheduler                org.apache.spark.scheduler.TaskScheduler

TaskSchedulerImpl            org.apache.spark.scheduler.TaskSchedulerImpl

Worker                       org.apache.spark.deploy.worker.Worker

Executor                     org.apache.spark.executor.Executor

BlockManager                 org.apache.spark.storage.BlockManager

TaskSet                      org.apache.spark.scheduler.TaskSet
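
Before diving into the source, a minimal driver-program sketch shows where these classes enter the picture from the user's side (the object name, app name and the toy job are illustrative, not from the Spark source): constructing a SparkContext is what kicks off all of the wiring examined below.

    import org.apache.spark.{SparkConf, SparkContext}

    object WalkthroughDriver {                       // illustrative driver program
      def main(args: Array[String]): Unit = {
        // SparkConf carries the master URL and application settings for the driver.
        val conf = new SparkConf().setAppName("walkthrough").setMaster("local[2]")
        // Constructing SparkContext is where createTaskScheduler, the DAGScheduler
        // and the SchedulerBackend (all shown below) get created and started.
        val sc = new SparkContext(conf)
        // A toy job whose stages and tasks flow through DAGScheduler -> TaskScheduler -> Executors.
        val sum = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
        println(s"sum = $sum")
        sc.stop()
      }
    }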


// During SparkContext initialization, the schedulers are created and started:

// Create and start the scheduler

    val (sched, ts) = SparkContext.createTaskScheduler(this, master)

    _schedulerBackend = sched

    _taskScheduler = ts

    _dagScheduler = new DAGScheduler(this)

    _heartbeatReceiver.send(TaskSchedulerIsSet)
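
For orientation, here is a trimmed view (not the full source) of the TaskScheduler contract that the _taskScheduler stored above satisfies; only the members relevant to this walkthrough are shown, the full trait lives in org.apache.spark.scheduler.TaskScheduler.

    // Trimmed view of the TaskScheduler contract; remaining members elided.
    private[spark] trait TaskScheduler {
      def start(): Unit                         // called by SparkContext once wiring is done
      def stop(): Unit
      def submitTasks(taskSet: TaskSet): Unit   // DAGScheduler submits one TaskSet per Stage
      def defaultParallelism(): Int
    }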

 

/**

   * Create a task scheduler based on a given master URL.

   * Return a 2-tuple of the scheduler backend and the task scheduler.

   */

  private def createTaskScheduler(

      sc: SparkContext,

      master: String): (SchedulerBackend, TaskScheduler) = {


master match {

      // Note: the excerpt below is the "local-cluster[N, cores, memoryMB]" branch
      // (matched by LOCAL_CLUSTER_REGEX); the plain "local" master uses a LocalBackend
      // instead. The memory-per-worker validation in the real source is elided here.
      case LOCAL_CLUSTER_REGEX(numSlaves, coresPerSlave, memoryPerSlave) =>

Instantiate the scheduler itself:

        val scheduler = new TaskSchedulerImpl(sc)

Start an in-process cluster and build the masterUrls:

        val localCluster = new LocalSparkCluster(
          numSlaves.toInt, coresPerSlave.toInt, memoryPerSlave.toInt, sc.conf)
        val masterUrls = localCluster.start()

The backend, said to be the crucial piece:

        val backend = new SparkDeploySchedulerBackend(scheduler, sc, masterUrls)
        scheduler.initialize(backend)
        backend.shutdownCallback = (backend: SparkDeploySchedulerBackend) => {
          localCluster.stop()
        }
        (backend, scheduler)

      // ... cases for the other master URL patterns omitted ...
    }
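
What routes createTaskScheduler into this branch is a master URL of the form "local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB]". A small usage sketch (the app name is illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    // "local-cluster[2,1,1024]" asks for 2 in-process workers with 1 core and 1024 MB each.
    val conf = new SparkConf()
      .setAppName("local-cluster-demo")     // illustrative app name
      .setMaster("local-cluster[2,1,1024]")
    val sc = new SparkContext(conf)         // hits the LOCAL_CLUSTER_REGEX branch above
    // ... run jobs ...
    sc.stop()                               // the shutdownCallback then stops the LocalSparkCluster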




Original post: http://36006798.blog.51cto.com/988282/1855949
