Tags: host, sql, hadoop, connect, spark, password
(1) As the root user (in MySQL):
1. As root, create the Hive metastore database:
create database if not exists hivedb;
2. As root, add the user hadoop:
Note: the host can be '%' (any host, as used below) or a specific hostname such as spark1 — whichever you choose must be used consistently in the grant statement and in the JDBC URL!
insert into mysql.user(Host,User,Password) values('%','hadoop',password('hadoop'));
(After modifying the mysql.user table directly, run flush privileges; so the change takes effect.)
3. As root, grant the hadoop user privileges on the hivedb database from any host:
Note: identified by 'hadoop' specifies the password.
grant all privileges on hivedb.* to 'hadoop'@'%' identified by 'hadoop';
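To confirm the account works before touching Hive, you can log in to MySQL as the new user and check what it can see (a quick sketch; the user/password/host values follow the statements above):

```sql
-- Connect as the new user first, e.g.: mysql -u hadoop -phadoop
show databases;                  -- hivedb should appear in the list
use hivedb;                      -- should succeed without an access error
show grants for 'hadoop'@'%';    -- should list ALL PRIVILEGES ON hivedb.*
```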
(2) Edit Hive's configuration file hive-site.xml
Before:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.137.5:3306/1608b?characterEncoding=UTF-8</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>${HIVE_HOME}/auxlib</value>
  </property>
</configuration>
After:
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.137.5:3306/hivedb?characterEncoding=UTF-8</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hadoop</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hadoop</value>
  </property>
  <property>
    <name>hive.aux.jars.path</name>
    <value>${HIVE_HOME}/auxlib</value>
  </property>
</configuration>
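Only three values change: the database in the JDBC URL (hivedb), the connection user (hadoop), and its password. Hive also needs the MySQL JDBC driver (matching the com.mysql.jdbc.Driver class above) on its classpath; a typical step looks like this (the JAR filename is an example — substitute the connector version you actually downloaded):

```shell
# Copy the MySQL JDBC driver into Hive's lib directory
# (example filename; use your actual mysql-connector JAR)
cp mysql-connector-java-5.1.40-bin.jar $HIVE_HOME/lib/
```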
(3) Start Hive
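With the metastore database created and hive-site.xml updated, Hive can be started. On Hive 2.x and later the metastore schema must be initialized once before first use (the commands below assume HIVE_HOME is set; on Hive 1.x the schema is created automatically on first start):

```shell
# One-time schema initialization in the MySQL metastore (Hive 2.x+)
$HIVE_HOME/bin/schematool -dbType mysql -initSchema
# Start the Hive CLI
$HIVE_HOME/bin/hive
```

If startup succeeds, the tables backing the metastore will appear inside the hivedb database in MySQL.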
Original article: http://www.cnblogs.com/lishengnan/p/hadoop8.html