
Accessing Hive via JDBC from Eclipse (hive-0.12.0 + hadoop-2.4.0 cluster)


1. Create a Map/Reduce Project in Eclipse (New > Other... > Map/Reduce Project)

The project automatically includes the relevant Hadoop jars.

In addition, import the Hive jars and the MySQL connector jar (in Eclipse: Project > Properties > Java Build Path > Libraries > Add External JARs):

hive/lib/*.jar

mysql-connector-java-5.1.24-bin.jar

 

2. Start HiveServer

Command: bin/hive --service hiveserver &

After running this command several times without success, it failed with: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:10000 (the default port 10000 was already in use).

Workaround: specify an explicit port at startup:

bin/hive --service hiveserver -p 10002

If the first command starts without errors, this workaround is unnecessary, but then the port number in the code below must be changed to the default 10000 so that the two match.
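
To check whether the default port is already occupied before restarting (a quick diagnostic, assuming netstat is available on the server):

Command: netstat -anp | grep 10000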


3. Java test code in Eclipse

If the project was not created as a Hadoop project, you will get the error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable

Fix: this happens because the Hadoop jars were not added to the classpath. Recreate the project as New > Other... > Map/Reduce Project, which brings in the Hadoop development environment automatically.

You also need to prepare a user_info.txt file under /home/hadoop/file/ with the following content (tab-separated):

1001  jack    30
1002  tom    25
1003  kate    20
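
For reference, one way to create this file with literal tab separators (a sketch only; any editor works as long as the fields are separated by tabs):

printf '1001\tjack\t30\n1002\ttom\t25\n1003\tkate\t20\n' > /home/hadoop/file/user_info.txt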

//-------------- HiveQuery.java -----------------
package test;
import java.sql.SQLException;  
import java.sql.Connection;  
import java.sql.ResultSet;  
import java.sql.Statement;  
import java.sql.DriverManager;  
 
public class HiveQuery {  
  private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";  
 
  /**
 * @param args
 * @throws SQLException
   */  
  public static void main(String[] args) throws SQLException {  
      try {  
      Class.forName(driverName);  
    } catch (ClassNotFoundException e) {  
      // TODO Auto-generated catch block  
      e.printStackTrace();  
      System.exit(1);  
    }  
    Connection con = DriverManager.getConnection("jdbc:hive://192.168.1.200:10002/default", "", "");  
    Statement stmt = con.createStatement();  
    String tableName = "testHiveDriverTable";  
    stmt.executeQuery("drop table " + tableName);  
    ResultSet res = stmt.executeQuery("create table " + tableName + " (id int, name string, age string) row format delimited fields terminated by '\t' lines terminated by '\n'");
    // show tables  
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);  
    res = stmt.executeQuery(sql);  
    if (res.next()) {  
      System.out.println(res.getString(1));  
    }  
    // describe table  
    sql = "describe " + tableName;  
    System.out.println("Running: " + sql);  
    res = stmt.executeQuery(sql);  
    while (res.next()) {  
      System.out.println(res.getString(1) + "\t" + res.getString(2)+ "\t" + res.getString(3));  
    }  
 
    // load data into table  
    // NOTE: the file path has to be local to the Hive server
    // NOTE: user_info.txt is a tab-separated file with three fields per line
    String filepath = "/home/hadoop/file/user_info.txt";
    sql = "load data local inpath '" + filepath + "' into table " + tableName;
    System.out.println("Running: " + sql);  
    res = stmt.executeQuery(sql);  
 
    // select * query  
    sql = "select * from " + tableName;  
    System.out.println("Running: " + sql);  
    res = stmt.executeQuery(sql);  
    while (res.next()) {  
      System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2) + "\t" + res.getString(3));  
    }  
 
    // regular hive query  
    sql = "select count(1) from " + tableName;  
    System.out.println("Running: " + sql);  
    res = stmt.executeQuery(sql);  
    while (res.next()) {  
      System.out.println(res.getString(1));  
    }  
  }
}
//------------ end ---------------------------------------------
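
Note that the example never closes its ResultSet, Statement, or Connection. Below is a minimal cleanup sketch using the standard java.sql API (the class name HiveQueryWithCleanup is hypothetical, added only for illustration; it assumes the same connection URL and table as above):

package test;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveQueryWithCleanup {
  public static void main(String[] args) throws SQLException, ClassNotFoundException {
    // Same HiveServer1 driver as above (hive-0.12.0).
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection con = null;
    Statement stmt = null;
    ResultSet res = null;
    try {
      con = DriverManager.getConnection("jdbc:hive://192.168.1.200:10002/default", "", "");
      stmt = con.createStatement();
      res = stmt.executeQuery("select count(1) from testHiveDriverTable");
      while (res.next()) {
        System.out.println(res.getString(1));
      }
    } finally {
      // Close resources in reverse order of creation; each close() may throw SQLException.
      if (res != null) res.close();
      if (stmt != null) stmt.close();
      if (con != null) con.close();
    }
  }
}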

4. Output

Running: show tables 'testHiveDriverTable'

testhivedrivertable

Running: describe testHiveDriverTable

id                  int                

name             string              

age               string              

Running: load data local inpath '/home/hadoop/file/user_info.txt' into table testHiveDriverTable

Running: select * from testHiveDriverTable

1001 jack 30

1002 tom 25

1003 kate 20

Running: select count(1) from testHiveDriverTable

3


Original post: http://www.cnblogs.com/zhaohz/p/4403066.html
