
Spark wordcount compile error -- reduceByKey is not a member of RDD


Attempting to build and run the standalone Scala app from http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala from source.

This line

val wordCounts = textFile.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_+_)

fails to compile with:

value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, Int)]
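
For context, a minimal standalone app that reproduces the error might look like the sketch below (the object name WordCount and the input path README.md are illustrative, not taken from the quick-start page):

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    val textFile = sc.textFile("README.md")        // illustrative input path
    val wordCounts = textFile
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)                          // compile error here without the implicit conversions

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}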

 

Resolution:

Import the implicit conversions from SparkContext:

import org.apache.spark.SparkContext._

These conversions use the "pimp my library" pattern to add methods to RDDs of specific element types. If curious, see SparkContext.scala:1296.
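
A sketch of the same program with the import added (again, the object name and input path are illustrative). On Spark 1.x releases before 1.3, the implicit rddToPairRDDFunctions in the SparkContext companion object wraps an RDD[(K, V)] in PairRDDFunctions, which is where reduceByKey is defined; from Spark 1.3 onward the conversion lives in the RDD companion object, so the extra import is no longer required:

import org.apache.spark.SparkContext._             // brings the pair-RDD implicits into scope
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("WordCount"))

    val wordCounts = sc.textFile("README.md")       // illustrative input path
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)                           // now resolves via PairRDDFunctions

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}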


Original post: http://www.cnblogs.com/abelstronger/p/4079293.html
