
[Sqoop] Importing a MySQL Table into Hive

Published: 2015-06-08


Background

The MySQL table YHD_CATEG_PRIOR has the following structure:

-- Table "YHD_CATEG_PRIOR" DDL

CREATE TABLE `YHD_CATEG_PRIOR` (
  `category_id`                     int(11) NOT NULL COMMENT 'category ID',
  `category_name`                   varchar(250) DEFAULT NULL COMMENT 'category name',
  `category_level`                  int(11) DEFAULT '0' COMMENT 'category level',
  `default_import_categ_prior`      int(11) DEFAULT '0' COMMENT 'default import priority',
  `user_import_categ_prior`         int(11) DEFAULT NULL COMMENT 'user import priority',
  `default_eliminate_categ_prior`   int(11) DEFAULT NULL COMMENT 'default elimination priority',
  `user_eliminate_categ_prior`      int(11) DEFAULT NULL COMMENT 'user elimination priority',
  `UPDATE_TIME`                     timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP COMMENT 'last refresh time',
  PRIMARY KEY (`category_id`)
) ENGINE=InnoDB AUTO_INCREMENT=61125 DEFAULT CHARSET=utf8;

This table now needs to be imported into Hive.

Implementation

The script is as follows:

# Create the Hive table pms.yhd_categ_prior_user
hive -e "
set mapred.job.queue.name=pms;
set mapred.job.name=[CIS]yhd_categ_prior_user;

-- Hive DDL
DROP TABLE IF EXISTS pms.yhd_categ_prior_user;
CREATE TABLE pms.yhd_categ_prior_user
(
    category_id                     bigint,
    category_name                   string,
    category_level                  int,
    default_import_categ_prior      int,
    user_import_categ_prior         int,
    default_eliminate_categ_prior   int,
    user_eliminate_categ_prior      int,
    update_time                     string
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;"

# Sync the MySQL table market.YHD_CATEG_PRIOR into Hive
hadoop fs -rmr /user/pms/YHD_CATEG_PRIOR 

sqoop import -Dmapred.job.queue.name=pms --connect jdbc:mysql://127.0.0.1:3306/market \
--username admin \
--password 123456 \
--table YHD_CATEG_PRIOR \
--hive-table pms.yhd_categ_prior_user \
--fields-terminated-by '\t' \
--lines-terminated-by '\n' \
--hive-overwrite \
--hive-drop-import-delims \
--hive-import 

The script above does the following:

  • Create the Hive table pms.yhd_categ_prior_user
  • Use Sqoop to import the MySQL table YHD_CATEG_PRIOR into the Hive table pms.yhd_categ_prior_user; after the import, fields in the Hive table are separated by \t and rows by \n.
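The --hive-drop-import-delims flag matters here because a \t, \n, or \r embedded in a MySQL string field (e.g. category_name) would otherwise be interpreted by Hive as a field or row boundary. A minimal local sketch of the same transformation, using a made-up sample value rather than real table data:

```shell
# Strip \n, \r and \t from a field value, mirroring what
# --hive-drop-import-delims does during the import, so embedded control
# characters cannot collide with the \t / \n delimiters chosen above.
raw=$'bad\tname\nwith breaks'
clean=$(printf '%s' "$raw" | tr -d '\n\r\t')
echo "$clean"
```

An alternative is --hive-delims-replacement, which substitutes a replacement string instead of dropping the characters outright.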

Results

The imported table, as reported by DESCRIBE FORMATTED pms.yhd_categ_prior_user:

# col_name              data_type               comment             

category_id             bigint                  None                
category_name           string                  None                
category_level          int                     None                
default_import_categ_prior  int                     None                
user_import_categ_prior int                     None                
default_eliminate_categ_prior   int                     None                
user_eliminate_categ_prior  int                     None                
update_time             string                  None                

# Detailed Table Information         
Database:               pms                      
Owner:                  pms                      
CreateTime:             Fri Jun 05 18:48:01 CST 2015     
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                     
Retention:              0                        
Location:               hdfs://yhd-jqhadoop2.int.yihaodian.com:8020/user/hive/pms/yhd_categ_prior_user   
Table Type:             MANAGED_TABLE            
Table Parameters:        
    numFiles                5                   
    numPartitions           0                   
    numRows                 0                   
    rawDataSize             0                   
    totalSize               447779              
    transient_lastDdlTime   1433501435          

# Storage Information        
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe   
InputFormat:            org.apache.hadoop.mapred.TextInputFormat     
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat   
Compressed:             No                       
Num Buckets:            -1                       
Bucket Columns:         []                       
Sort Columns:           []                       
Storage Desc Params:         
    field.delim             \t                  
    line.delim              \n                  
    serialization.format    \t
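Given the field.delim of \t shown above, each record in the table's underlying text file is a single line with eight tab-separated columns. A local sketch of how such a row splits, using a hypothetical sample row (values are illustrative, not taken from the real table):

```shell
# A made-up exported row matching the eight-column schema
# (category_id .. update_time).
row=$'61001\tHome Appliances\t2\t0\t1\t0\t0\t2015-06-05 18:48:01'
# Split on \t and print the field count plus the first two columns,
# much as Hive's LazySimpleSerDe parses each line.
out=$(printf '%s\n' "$row" | awk -F '\t' '{print NF, $1, $2}')
echo "$out"
```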



Original article: http://blog.csdn.net/yeweiouyang/article/details/46409245
