HBase shell commands: https://www.cnblogs.com/lzh-boy/p/8966826.html
Environment setup is omitted here.
Note: Spark 2.0 does not ship with the jar that converts HBase data into a form readable from Python (the org.apache.spark.examples.pythonconverters classes), so it has to be supplied separately.
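One way to make those converter classes available is to hand the jar to Spark when the session is created. A minimal sketch, assuming the converters are packaged in a jar such as spark-examples_2.11-1.6.0-typesafe-001.jar placed under Spark's jars directory (the path and file name below are assumptions, adjust them to your own installation); the same config line can also be merged into the builder call in the script further down.

# Sketch: register the converter jar via spark.jars (path is an assumption)
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .master("local") \
    .appName("hbase") \
    .config("spark.jars", "/usr/local/spark/jars/hbase/spark-examples_2.11-1.6.0-typesafe-001.jar") \
    .getOrCreate()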
Code: reading table data
from pyspark.sql import SparkSession
import os

# Use the Anaconda Python interpreter for the PySpark workers
os.environ['PYSPARK_PYTHON'] = '/opt/anaconda2/bin/python'

spark = SparkSession.builder.master("local").appName("hbase").getOrCreate()

host = 'localhost'
table = 'student'
conf = {"hbase.zookeeper.quorum": host, "hbase.mapreduce.inputtable": table}

# Converters from the Spark examples jar that turn HBase key/value types into strings
keyConv = "org.apache.spark.examples.pythonconverters.ImmutableBytesWritableToStringConverter"
valueConv = "org.apache.spark.examples.pythonconverters.HBaseResultToStringConverter"

# Read the HBase table as an RDD of (row key, cell data) string pairs
hbase_rdd = spark.sparkContext.newAPIHadoopRDD(
    "org.apache.hadoop.hbase.mapreduce.TableInputFormat",
    "org.apache.hadoop.hbase.io.ImmutableBytesWritable",
    "org.apache.hadoop.hbase.client.Result",
    keyConverter=keyConv, valueConverter=valueConv, conf=conf)

hbase_rdd.cache()
count = hbase_rdd.count()
output = hbase_rdd.collect()
for (k, v) in output:
    print(k, v)
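Each value produced by HBaseResultToStringConverter is, as far as I recall, a newline-separated list of JSON strings, one per HBase cell, with fields such as "row", "qualifier" and "value"; verify the exact field names against your own output. A small sketch for turning the RDD into per-cell records:

import json

# Split each row's value into individual cell strings, then parse each one as JSON
cells = hbase_rdd.flatMap(lambda kv: kv[1].split("\n")).map(json.loads)
for cell in cells.collect():
    print(cell.get("qualifier"), cell.get("value"))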
Original post: https://www.cnblogs.com/xiennnnn/p/11609290.html