
1. Getting started with Spark: word count in the Spark shell

Posted: 2016-08-02 22:16:25


1. After installing Spark, go into the bin directory of the Spark installation and start the shell: bin/spark-shell
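The Spark shell creates a SparkContext named sc automatically, which is what the word-count snippet below calls textFile on. A quick sanity check (a minimal sketch, assuming Spark 1.6.x; the printed values depend on your setup):

scala> sc.version   // the running Spark version, e.g. 1.6.1
scala> sc.master    // the master URL, e.g. local[*] when none was specified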
 
scala> // read README.md as an RDD of lines
scala> val textFile = sc.textFile("/Users/admin/spark/spark-1.6.1-bin-hadoop2.6/README.md")
scala> // split each line into words, drop empty strings, count each word, and print the (word, count) pairs
scala> textFile.flatMap(_.split(" ")).filter(!_.isEmpty).map((_,1)).reduceByKey(_+_).collect().foreach(println)
 
Result (excerpt):
(-Psparkr,1)
(Build,1)
(built,1)
(-Phive-thriftserver,1)
(2.4.0,1)
(-Phadoop-2.4,1)
(Spark,1)
(-Pyarn,1)
(1.5.1,1)
(flags:,1)
(for,1)
(-Phive,1)
(-DzincPort=3034,1)
(Hadoop,1)
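
As a possible follow-up (a sketch not in the original post), the same pipeline can be kept as an RDD, sorted by count, and saved to disk; the output path /tmp/wordcount-output is only an illustrative placeholder:

scala> val counts = textFile.flatMap(_.split(" ")).filter(!_.isEmpty).map((_, 1)).reduceByKey(_ + _)
scala> counts.sortBy(-_._2).take(10).foreach(println)    // ten most frequent words, highest count first
scala> counts.saveAsTextFile("/tmp/wordcount-output")    // writes one part file per partition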
 


Original post: http://www.cnblogs.com/ylcoder/p/5730926.html
