
Installing Hive

Posted: 2016-12-02 16:40:56

Tags: hive, java, hadoop


Note up front:

Hadoop, ZooKeeper, Spark, and Kafka should already be installed and running normally.

With that in place, we can install and deploy Hive.

Base dependencies:

1. JDK 1.6+
2. Hadoop 2.x
3. Hive 0.13 or later (2.1.0 is installed below)
4. MySQL (plus the mysql-connector JDBC driver jar)

The detailed setup is as follows. First, add the environment variables (e.g. to /etc/profile or ~/.bashrc):

# java
export JAVA_HOME=/soft/jdk1.7.0_79/
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
# hadoop
export HADOOP_HOME=/usr/local/hadoop/hadoop
# scala
export SCALA_HOME=/usr/local/hadoop/scala
# spark
export SPARK_HOME=/usr/local/hadoop/spark
# hive
export HIVE_HOME=/usr/local/hadoop/hive
# bin (PATH goes last so that all the *_HOME variables above are already defined)
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin
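After reloading the profile, a quick sanity check (a minimal sketch, assuming the exports above live in /etc/profile) confirms the variables are visible:

source /etc/profile            # or ~/.bashrc, wherever the exports were added
echo $HADOOP_HOME $HIVE_HOME   # both paths should print
hive --version                 # works once Hive has been unpacked in the steps below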

                              


Part 1: Installation

1. Download

https://hive.apache.org/downloads.html
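For example, the 2.1.0 binary tarball can be pulled from the Apache archive (the exact mirror URL is an assumption; pick one from the downloads page):

wget https://archive.apache.org/dist/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz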

Extract:

tar xvf apache-hive-2.1.0-bin.tar.gz -C /usr/local/hadoop/
cd /usr/local/hadoop/
mv apache-hive-2.1.0-bin hive

2. Edit the configuration

First, set the startup environment in bin/hive-config.sh:

cd /usr/local/hadoop/hive
vim bin/hive-config.sh

and append the following exports:

# java
export JAVA_HOME=/soft/jdk1.7.0_79/
# hadoop
export HADOOP_HOME=/usr/local/hadoop/hadoop
# hive
export HIVE_HOME=/usr/local/hadoop/hive

Then edit the main configuration file:

cd   /usr/local/hadoop/hive
vim conf/hive-site.xml
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>Username to use against metastore database</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>xujun</value>
        <description>Password to use against metastore database</description>
    </property>
</configuration>
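Hive also needs the MySQL JDBC driver (the mysql-connector jar from the dependency list) on its classpath before the metastore can connect; copying it into $HIVE_HOME/lib is the usual approach. The jar name below is only an example and depends on the connector version actually downloaded:

cp mysql-connector-java-5.1.40-bin.jar /usr/local/hadoop/hive/lib/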

3. Change the tmp dir

In the same hive-site.xml, replace the value of every property that contains "${system:java.io.tmpdir}" with a real local directory, for example:

/tmp/hive
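As a sketch, the affected properties in a stock Hive 2.1 hive-site.xml typically include the following (the exact set may differ between releases); each one is rewritten to point under the local path above:

<property>
    <name>hive.exec.local.scratchdir</name>
    <value>/tmp/hive</value>
</property>
<property>
    <name>hive.downloaded.resources.dir</name>
    <value>/tmp/hive/resources</value>
</property>
<property>
    <name>hive.querylog.location</name>
    <value>/tmp/hive</value>
</property>
<property>
    <name>hive.server2.logging.operation.log.location</name>
    <value>/tmp/hive/operation_logs</value>
</property>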

 

Part 2: Install and start MySQL

1. Create the metastore database and user

create database hive;
grant all on *.* to hive@'%' identified by 'hive';
flush privileges;

Note that the password granted here must match javax.jdo.option.ConnectionPassword in hive-site.xml.
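A quick connectivity check from the Hive host before going further can save debugging later; this is just an illustrative one-liner using the account created above:

mysql -h master -u hive -p -e "show databases;"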

Part 3: Initialize Hive

cd   /usr/local/hadoop/hive
bin/schematool -initSchema -dbType mysql 
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL: jdbc:mysql://hadoop3:3306/hive?createDatabaseInfoNotExist=true
Metastore Connection Driver : com.mysql.jdbc.Driver
Metastore connection User: hive
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Initialization script completed
schemaTool completed
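If initialization succeeded, the metastore schema now lives in the hive database; listing its tables (standard metastore tables such as DBS, TBLS and SDS should appear) is an easy spot-check:

mysql -h master -u hive -p hive -e "show tables;"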



Part 4: Start Hive

[hadoop@hadoop1 hadoop]$ hive/bin/hive
which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin://soft/jdk1.7.0_79//bin:/bin:/bin:/bin:/usr/local/hadoop/hive/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/usr/local/hadoop/hive/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. tez, spark) or using Hive 1.X releases.
hive> show databases;
OK
default
Time taken: 1.184 seconds, Fetched: 1 row(s)
hive>
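Beyond show databases, a small end-to-end smoke test of the metastore can be run from the same prompt; the database and table names here are arbitrary examples:

create database test_db;
use test_db;
create table t1 (id int, name string);
show tables;
drop table t1;
drop database test_db;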


This post originally appeared on the "crazy_sir" blog; please keep the attribution: http://douya.blog.51cto.com/6173221/1878779
