Basic Hive Commands


Create a table

CREATE TABLE pokes (foo INT, bar STRING); 

Create a table partitioned by the column ds

CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (ds STRING); 
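
To verify the table definition and see the partition column, DESCRIBE can be used (a quick check, not part of the original example):

DESCRIBE invites;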

Show all tables

SHOW TABLES;

Show tables matching a pattern (regular expression)

SHOW TABLES '.*s';

Add a column to a table

ALTER TABLE pokes ADD COLUMNS (new_col INT);

Add a column with a column comment

ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');

Rename a table

ALTER TABLE events RENAME TO 3koobecaf;

Drop a table

DROP TABLE pokes;

Metadata storage

Hive records table definitions in its metastore; by default this is an embedded Derby database created in a metastore_db directory under the current working directory, and the DDL statements above update it automatically.

Load data from a file into a table

LOAD DATA LOCAL INPATH './examples/files/kv1.txt' OVERWRITE INTO TABLE pokes;

Load local data, specifying partition information

LOAD DATA LOCAL INPATH './examples/files/kv2.txt' OVERWRITE INTO TABLE invites PARTITION (ds='2008-08-15');

Load data already in DFS (HDFS), specifying partition information

LOAD DATA INPATH '/user/myname/kv2.txt' OVERWRITE INTO TABLE invites PARTITION (ds='2008-08-15');
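
After loading, the partitions that now exist can be listed (a small verification step added here, not in the original):

SHOW PARTITIONS invites;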

SQL operations

Query with a filter condition

SELECT a.foo FROM invites a WHERE a.ds='<DATE>';

Write query results to an HDFS directory

INSERT OVERWRITE DIRECTORY '/tmp/hdfs_out' SELECT a.* FROM invites a WHERE a.ds='<DATE>';

Write query results to a local directory

INSERT OVERWRITE LOCAL DIRECTORY '/tmp/local_out' SELECT a.* FROM pokes a;

More examples of inserting query results into tables, HDFS directories, and local directories

INSERT OVERWRITE TABLE events SELECT a.* FROM profiles a;

INSERT OVERWRITE TABLE events SELECT a.* FROM profiles a WHERE a.key < 100; 

INSERT OVERWRITE LOCAL DIRECTORY '/tmp/reg_3' SELECT a.* FROM events a;

INSERT OVERWRITE DIRECTORY '/tmp/reg_4' SELECT a.invites, a.pokes FROM profiles a;

INSERT OVERWRITE DIRECTORY '/tmp/reg_5' SELECT COUNT(1) FROM invites a WHERE a.ds='<DATE>';

INSERT OVERWRITE DIRECTORY '/tmp/reg_5' SELECT a.foo, a.bar FROM invites a;

INSERT OVERWRITE LOCAL DIRECTORY '/tmp/sum' SELECT SUM(a.pc) FROM pc1 a;

Insert aggregated results from one table into another (two equivalent forms)

FROM invites a INSERT OVERWRITE TABLE events SELECT a.bar, count(1) WHERE a.foo > 0 GROUP BY a.bar;

INSERT OVERWRITE TABLE events SELECT a.bar, count(1) FROM invites a WHERE a.foo > 0 GROUP BY a.bar;

Using JOIN

FROM pokes t1 JOIN invites t2 ON (t1.bar = t2.bar) INSERT OVERWRITE TABLE events SELECT t1.bar, t1.foo, t2.foo;
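
Other join types follow the same shape; for example, a LEFT OUTER JOIN keeps every row from pokes even when there is no match in invites (a sketch using the same tables as above):

FROM pokes t1 LEFT OUTER JOIN invites t2 ON (t1.bar = t2.bar) INSERT OVERWRITE TABLE events SELECT t1.bar, t1.foo, t2.foo;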

Multi-insert: write from one source table into several tables and directories in a single statement

FROM src
INSERT OVERWRITE TABLE dest1 SELECT src.* WHERE src.key < 100
INSERT OVERWRITE TABLE dest2 SELECT src.key, src.value WHERE src.key >= 100 AND src.key < 200
INSERT OVERWRITE TABLE dest3 PARTITION(ds='2008-04-08', hr='12') SELECT src.key WHERE src.key >= 200 AND src.key < 300
INSERT OVERWRITE LOCAL DIRECTORY '/tmp/dest4.out' SELECT src.value WHERE src.key >= 300;

Stream rows through an external program with TRANSFORM

FROM invites a INSERT OVERWRITE TABLE events SELECT TRANSFORM(a.foo, a.bar) AS (oof, rab) USING '/bin/cat' WHERE a.ds > '2008-08-09';

A complete example

Create a table

CREATE TABLE u_data (
  userid INT,
  movieid INT,
  rating INT,
  unixtime STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

Download the sample data file and extract it

wget http://www.grouplens.org/system/files/ml-data.tar__0.gz

tar xvzf ml-data.tar__0.gz

Load the data into the table

LOAD DATA LOCAL INPATH 'ml-data/u.data'
OVERWRITE INTO TABLE u_data;

Count the total number of rows

SELECT COUNT(1) FROM u_data;

Now do some more complex data analysis

Create a weekday_mapper.py script that maps each record's unix timestamp to a day of the week

import sys
import datetime

for line in sys.stdin:
    line = line.strip()
    userid, movieid, rating, unixtime = line.split('\t')
    # derive the day of the week from the unix timestamp
    weekday = datetime.datetime.fromtimestamp(float(unixtime)).isoweekday()
    print('\t'.join([userid, movieid, rating, str(weekday)]))

Use the mapper script

-- create the table, splitting fields on the delimiter

CREATE TABLE u_data_new (
  userid INT,
  movieid INT,
  rating INT,
  weekday INT)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t';

-- add the Python script as a Hive resource so it is shipped to the cluster

ADD FILE weekday_mapper.py;
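
To confirm the script was registered as a resource, the Hive CLI can list added files (a quick check, not in the original example):

LIST FILES;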

Transform the data, replacing the timestamp with the day of the week

INSERT OVERWRITE TABLE u_data_new
SELECT
  TRANSFORM (userid, movieid, rating, unixtime)
  USING 'python weekday_mapper.py'
  AS (userid, movieid, rating, weekday)
FROM u_data;

SELECT weekday, COUNT(1)
FROM u_data_new
GROUP BY weekday;
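
To get the days back in calendar order, a minor variation adds an ORDER BY on the grouping key (assuming the same u_data_new table):

SELECT weekday, COUNT(1) AS cnt
FROM u_data_new
GROUP BY weekday
ORDER BY weekday;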


Original article: http://www.cnblogs.com/seeyouseeme/p/3772262.html
