After fiddling with this for a whole day, I finally discovered that Sqoop2 currently only supports MySQL-to-HDFS or HDFS-to-MySQL transfers, not Hive or HBase. Frustrating. Still, I'm writing down the Sqoop2 installation steps here, so that if Sqoop2 adds that support later I can get back up to speed quickly.
First, download it; the version used here is Sqoop 1.99.6. I'll skip extraction and the like. The main work is configuration, starting with the environment variables:
export SQOOP_HOME=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200
export PATH=$SQOOP_HOME/bin:$PATH
export CATALINA_BASE=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200/server
export LOGDIR=$SQOOP_HOME/logs/
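To make these exports survive new sessions, one option (a minimal sketch, assuming a bash shell that sources ~/.bashrc) is to append them there and reload:

# append the exports above to ~/.bashrc; 'EOF' is quoted so $SQOOP_HOME is written literally
cat >> ~/.bashrc <<'EOF'
export SQOOP_HOME=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200
export PATH=$SQOOP_HOME/bin:$PATH
export CATALINA_BASE=/home/hadoop/sqoop/sqoop-1.99.6-bin-hadoop200/server
export LOGDIR=$SQOOP_HOME/logs/
EOF
source ~/.bashrc
echo $SQOOP_HOME   # quick check that the variable is now visible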
Next, edit sqoop-1.99.6-bin-hadoop200/server/conf/sqoop.properties and point the property org.apache.sqoop.submission.engine.mapreduce.configuration.directory at your Hadoop configuration directory:
org.apache.sqoop.submission.engine.mapreduce.configuration.directory=/home/hadoop/hadoop/hadoop-2.6.1/etc/hadoop
Then edit sqoop-1.99.6-bin-hadoop200/server/conf/catalina.properties and change the common.loader property:
common.loader=/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/common/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/common/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/hdfs/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/hdfs/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/mapreduce/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/mapreduce/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/tools/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/tools/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/yarn/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/yarn/lib/*.jar,/home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/httpfs/tomcat/lib/*.jar
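A quick sanity check that those common.loader entries actually resolve to jars (the Hadoop path is the one used above; adjust it if your layout differs):

# list a few of the jars common.loader is expected to pick up
ls /home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/common/*.jar | head
ls /home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/hdfs/*.jar | head
ls /home/hadoop/hadoop/hadoop-2.6.1/share/hadoop/yarn/*.jar | head
# if any of these come back empty, the server will hit class-loading errors at startup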
Then download the MySQL JDBC driver jar and put it under server/lib.
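For example (the jar name and version below are just illustrative; use whichever MySQL Connector/J you downloaded):

# copy the MySQL JDBC driver into the Sqoop2 server's lib directory
cp mysql-connector-java-5.1.36-bin.jar $SQOOP_HOME/server/lib/
# restart the server afterwards (see the commands below) so it picks the driver up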
Below are the commonly used commands:
./sqoop.sh server start                                        # start the server
./sqoop.sh server stop                                         # stop the server
./sqoop.sh client                                              # enter the client shell
set server --host hadoopMaster --port 12000 --webapp sqoop     # point the client at the server; hadoopMaster is the HDFS master hostname
show connector --all                                           # list connector types
create link --cid 1                                            # create a link, where cid is the connector type id
show link                                                      # list links
update link -l 1                                               # modify the link with id 1
delete link -l 1                                               # delete the link with id 1
create job -f 1 -t 2                                           # create a job from link 1 to link 2
show job                                                       # list jobs
update job -jid 1                                              # modify a job
delete job -jid 1                                              # delete a job
status job -jid 1                                              # check a job's status
stop job -jid 1                                                # stop a job
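Putting the commands above together, a MySQL-to-HDFS transfer could be set up roughly as follows. This is only a sketch: the connector ids, link ids and job id are assumptions, the real ones come from the output of show connector, show link and show job, each create command prompts interactively for details such as the JDBC URL or the HDFS output directory, and the exact flag spelling (e.g. -jid vs -j) may vary between 1.99.x releases. The command that actually launches a job is start job, which is not in the list above.

./sqoop.sh client
set server --host hadoopMaster --port 12000 --webapp sqoop
show connector --all          # note the ids of the generic JDBC connector and the HDFS connector
create link --cid 1           # e.g. a JDBC link pointing at MySQL (connector id 1 assumed)
create link --cid 2           # e.g. an HDFS link (connector id 2 assumed)
create job -f 1 -t 2          # job reading from link 1 (MySQL) and writing to link 2 (HDFS)
start job -jid 1              # launch the transfer
status job -jid 1             # poll until it finishes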
Logs are under server/logs.
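For example, to watch the server log while debugging a failed start (the exact file names in that directory may differ, so list it first):

ls $SQOOP_HOME/server/logs/
tail -f $SQOOP_HOME/server/logs/catalina.out   # assuming the usual Tomcat catalina.out is present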
Original post: http://my.oschina.net/shyloveliyi/blog/514423