1. Prerequisites
First, the two Hadoop services (HDFS and YARN) must be running.
Next, start the Hive metastore.
Then start spark-shell. If the Hive metastore is not running, this step fails with an error saying Hive's port 9083 cannot be reached. Starting spark-shell here is mainly so you can check the JDBC/ODBC server on the web UI at port 4040.
Finally, start the Hive Thrift service.
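Assuming a default installation with `$HADOOP_HOME`, `$HIVE_HOME`, and `$SPARK_HOME` set (these paths are assumptions, not from the original post), the startup sequence above might look like this; adjust scripts and paths to your own deployment:

```shell
# 1. Start the two Hadoop services: HDFS and YARN
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh

# 2. Start the Hive metastore (listens on port 9083 by default)
nohup $HIVE_HOME/bin/hive --service metastore &

# 3. Start spark-shell; its web UI on port 4040 has a JDBC/ODBC tab
$SPARK_HOME/bin/spark-shell

# 4. Start the Hive Thrift service (HiveServer2, port 10000 by default)
nohup $HIVE_HOME/bin/hive --service hiveserver2 &
```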
Example code
package com.spark

import java.sql.DriverManager

/* Access the Thrift server via JDBC */
object SparkSQLThriftServer {
  def main(args: Array[String]): Unit = {
    // Load the Hive JDBC driver
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    // The original left the URL/user/password blank; jdbc:hive2://localhost:10000
    // is the default HiveServer2 address -- substitute your own
    val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000", "", "")
    val pstmt = conn.prepareStatement("select * from table")
    val rs = pstmt.executeQuery()
    while (rs.next()) {
      // read the column as a string (the original used getInt on a name column)
      println("name:" + rs.getString("name"))
    }
    rs.close()
    pstmt.close()
    conn.close()
  }
}
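Before running the program, you can sanity-check that the Thrift service is reachable with beeline (assuming the default port 10000; the table name is whatever exists in your metastore):

```shell
# Connect to the Thrift server; press Enter for empty user/password
$HIVE_HOME/bin/beeline -u jdbc:hive2://localhost:10000

# Then, inside the beeline prompt:
#   show tables;
#   select * from your_table limit 10;
```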
Original post: https://www.cnblogs.com/aishanyishi/p/10317925.html