Operating Sqoop 1.4.6 from Eclipse on Windows
Sqoop imports and exports data between relational databases and Hadoop. To drive it from Eclipse on Windows, a working Hadoop development environment must be set up first.
Following the pom.xml used for operating HDFS from Java, add these dependencies:
<dependency>
    <groupId>org.apache.sqoop</groupId>
    <artifactId>sqoop</artifactId>
    <version>1.4.6</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.36</version>
</dependency>
Importing and exporting data between MySQL and Hadoop requires the MySQL JDBC driver. The sqoop-1.4.6.jar artifact cannot be downloaded from the Maven repositories, so copy sqoop-1.4.6.jar from the Sqoop installation directory ($SQOOP_HOME) into the corresponding sqoop directory of your local Maven repository.
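Instead of copying the file by hand, the jar can also be registered in the local repository with Maven's install-file goal. A minimal sketch, assuming the jar sits directly under %SQOOP_HOME% and that the coordinates match the dependency above:
mvn install:install-file -Dfile=%SQOOP_HOME%\sqoop-1.4.6.jar -DgroupId=org.apache.sqoop -DartifactId=sqoop -Dversion=1.4.6 -Dpackaging=jar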
Configuring the Sqoop environment variables on Windows
Operating Sqoop on Windows requires the jars shipped with Sqoop, so the environment variables need to be configured.
Extract the downloaded sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz to a local directory and set the Sqoop environment variables:
SQOOP_HOME=F:\data\sqoop-1.4.6.bin__hadoop-2.0.4-alpha
PATH=%PATH%;%SQOOP_HOME%\bin;
Check that the configuration succeeded:
sqoop version
The warnings it prints can be ignored; those components do not need to be configured on Windows.
Operating Sqoop from Java
The local MySQL service on Windows must be started first.
4.1 Importing MySQL data into HDFS
import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.tool.SqoopTool;
import org.apache.sqoop.util.OptionsFileUtil;

public class SqoopTest {

    private static int importDataFromMysql() throws Exception {
        // Arguments for the Sqoop "import" tool: source database, credentials, table and target directory
        String[] args = new String[] {
                "--connect", "jdbc:mysql://192.168.1.97:3306/mydb",
                "--driver", "com.mysql.jdbc.Driver",
                "--username", "root",
                "--password", "root",
                "--table", "user",
                "-m", "1",
                "--target-dir", "java_import_user"
        };
        String[] expandArguments = OptionsFileUtil.expandArguments(args);
        SqoopTool tool = SqoopTool.getTool("import");
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://192.168.1.200:9000"); // HDFS service address
        Configuration loadPlugins = SqoopTool.loadPlugins(conf);
        Sqoop sqoop = new Sqoop((com.cloudera.sqoop.tool.SqoopTool) tool, loadPlugins);
        return Sqoop.runSqoop(sqoop, expandArguments);
    }

    public static void main(String[] args) throws Exception {
        importDataFromMysql();
    }
}
The generated files can be viewed on HDFS. A directory /user/cyyun/java_import_user was created: since --target-dir in the program does not specify an absolute path and the local Windows user name is cyyun, running the program from Windows creates a directory named after the Windows user under /user/.
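To confirm the output, you can first list the directory (the path below assumes the default user directory described above):
hadoop fs -ls /user/cyyun/java_import_user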
View the results:
hadoop fs -cat /user/cyyun/java_import_user/part-m-00000
hadoop fs -cat /user/cyyun/java_import_user/part-m-00001
Change --target-dir in the program to /user/root/java_import_user, run it again, and view the results:
hadoop fs -cat /user/root/java_import_user/part-m-00000
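The export direction follows the same pattern, only with the "export" tool and an --export-dir argument. The sketch below is not from the original post: it assumes a table named user_export already exists in mydb with columns matching the exported files, and it reuses the connection settings and the /user/root/java_import_user directory from the example above.
import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;
import org.apache.sqoop.tool.SqoopTool;
import org.apache.sqoop.util.OptionsFileUtil;

public class SqoopExportTest {

    // Exports the files previously imported into HDFS back into a MySQL table.
    private static int exportDataToMysql() throws Exception {
        String[] args = new String[] {
                "--connect", "jdbc:mysql://192.168.1.97:3306/mydb",
                "--driver", "com.mysql.jdbc.Driver",
                "--username", "root",
                "--password", "root",
                "--table", "user_export",                       // hypothetical target table with a matching schema
                "--export-dir", "/user/root/java_import_user",  // HDFS directory produced by the import example
                "-m", "1"
        };
        String[] expandArguments = OptionsFileUtil.expandArguments(args);
        SqoopTool tool = SqoopTool.getTool("export");           // "export" instead of "import"
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://192.168.1.200:9000"); // HDFS service address
        Configuration loadPlugins = SqoopTool.loadPlugins(conf);
        Sqoop sqoop = new Sqoop((com.cloudera.sqoop.tool.SqoopTool) tool, loadPlugins);
        return Sqoop.runSqoop(sqoop, expandArguments);
    }

    public static void main(String[] args) throws Exception {
        exportDataToMysql();
    }
}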
Reference:
http://blog.csdn.net/guzicheng/article/details/41519947
Original post: http://www.cnblogs.com/pejsidney/p/6952457.html