1. Preparation
Upload apache-hive-1.2.1.tar.gz and the MySQL JDBC driver mysql-connector-java-5.1.6-bin.jar to node01
cd /tools
tar -zxvf apache-hive-1.2.1.tar.gz -C /ren/
cd /ren
mv apache-hive-1.2.1 hive-1.2.1
This cluster uses MySQL as Hive's metastore.
vi /etc/profile
export HIVE_HOME=/ren/hive-1.2.1
export PATH=$PATH:$HIVE_HOME/bin
source /etc/profile
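The two exports above can be sanity-checked after sourcing the profile (a minimal sketch, assuming the paths from this post):

```shell
# Re-apply the two exports from /etc/profile and confirm that hive's
# bin directory ended up on PATH.
export HIVE_HOME=/ren/hive-1.2.1
export PATH=$PATH:$HIVE_HOME/bin
echo "$HIVE_HOME"
echo "$PATH" | grep -o '/ren/hive-1.2.1/bin'
```

If the second line prints nothing, the PATH export did not take effect.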
2. Install MySQL
yum -y install mysql mysql-server mysql-devel
Create the hive database: create database hive;
Create the hive user and grant it privileges:
grant all privileges on hive.* to hive@node01 identified by '123456';
grant all privileges on hive.* to hive@'%' identified by '123456';
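The SQL above can also be generated by a small script and piped into `mysql -u root -p` (a hypothetical helper, not from the original post; HIVE_USER and HIVE_PASS are assumptions to adjust for your cluster):

```shell
# Hypothetical helper: write the metastore setup SQL with the user and
# password parameterized. This only generates the file; feed it to the
# mysql client yourself.
HIVE_USER=hive
HIVE_PASS=123456
cat > /tmp/hive-grants.sql <<EOF
CREATE DATABASE IF NOT EXISTS hive;
GRANT ALL PRIVILEGES ON hive.* TO '${HIVE_USER}'@'node01' IDENTIFIED BY '${HIVE_PASS}';
GRANT ALL PRIVILEGES ON hive.* TO '${HIVE_USER}'@'%' IDENTIFIED BY '${HIVE_PASS}';
FLUSH PRIVILEGES;
EOF
cat /tmp/hive-grants.sql
```

Note that `GRANT ... IDENTIFIED BY` is the MySQL 5.x form, matching the yum-installed server used here; it was removed in MySQL 8.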
3. Install Hive
cd /ren/hive-1.2.1/conf
cp /root/mysql-connector-java-5.1.6-bin.jar /ren/hive-1.2.1/lib/
mv hive-default.xml.template hive-site.xml
vi hive-site.xml
Edit the <configuration></configuration> section:
<property>
<name>hive.exec.scratchdir</name>
<value>/ren/hive-1.2.1/data</value>
</property>
<property>
<name>hive.exec.local.scratchdir</name>
<value>/ren/hive-1.2.1/data/tmp</value>
</property>
<property>
<name>hive.downloaded.resources.dir</name>
<value>/ren/hive-1.2.1/data/${hive.session.id}_resources</value>
</property>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://node01:3306/hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>123456</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/ren/hive-1.2.1/warehouse</value>
</property>
<property>
<name>hive.querylog.location</name>
<value>/ren/hive-1.2.1/data/log</value>
</property>
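A mangled hive-site.xml is a common source of startup failures, so it is worth confirming the file is still well-formed after editing (a sketch, assuming python3 is available; shown on a small stand-in file containing one of the properties from above):

```shell
# Write a stand-in config with one of the properties above, then parse it
# to confirm it is well-formed XML and print the property name found.
cat > /tmp/hive-site-check.xml <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://node01:3306/hive</value>
  </property>
</configuration>
EOF
python3 -c "import xml.etree.ElementTree as ET; print(ET.parse('/tmp/hive-site-check.xml').find('./property/name').text)"
```

Run the same parse against the real /ren/hive-1.2.1/conf/hive-site.xml; any syntax error raises a ParseError with the offending line.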
Sync to the other nodes: scp -r /ren/hive-1.2.1 root@node02:/ren
scp -r /ren/hive-1.2.1 root@node03:/ren
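The two scp commands above can be wrapped in a small loop (an illustration only; the node names and paths are those used in this post, so adjust for your cluster):

```shell
# Hypothetical sync helper: copy the hive directory to each worker node.
# Defined only, not executed here; requires passwordless ssh as root.
sync_hive() {
  for node in node02 node03; do
    scp -r /ren/hive-1.2.1 "root@${node}:/ren"
  done
}
```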
Start hive: hive
Start hiveserver2: hive --service hiveserver2
Start the metastore: hive --service metastore
4. spark-sql
cd /ren/spark-2.02/conf
Add a file hive-site.xml with the following content:
<configuration>
<property>
<name>hive.metastore.uris</name>
<value>thrift://node01:9083</value>
</property>
</configuration>
Start: spark-sql (the Hive metastore must be running first)
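Since spark-sql only works once the metastore is up, the startup order can be sketched as a small wrapper (a hypothetical helper, not from the original post; assumes hive and spark-sql are on PATH and the metastore listens on thrift://node01:9083 as configured above):

```shell
# Hypothetical startup wrapper: launch the Hive metastore in the
# background, wait for its thrift port to open, then start spark-sql.
# Defined only, not executed here.
start_spark_sql() {
  nohup hive --service metastore > /tmp/metastore.log 2>&1 &
  # wait up to 30 seconds for node01:9083 to accept connections
  for i in $(seq 1 30); do
    if (exec 3<>/dev/tcp/node01/9083) 2>/dev/null; then
      break
    fi
    sleep 1
  done
  spark-sql
}
```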
hadoop-spark cluster installation --- 5. hive and spark-sql
Original post: http://www.cnblogs.com/renjian1995/p/6217873.html