
Hadoop Reading Notes (3): Operating HDFS with the Java API

Posted: 2014-11-20 00:10:32

Tags: hadoop, hdfs, Java API operations on HDFS

Hadoop Reading Notes (1), Introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629

Hadoop Reading Notes (2), HDFS Shell Operations: http://blog.csdn.net/caicongyang/article/details/41253927

Operating HDFS with java.net.URL

OperateByURL.java

package hdfs;

import java.io.InputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class OperateByURL {
	private static final String PATH = "hdfs://192.168.80.100:9000/test.txt";

	public static void main(String[] args) throws Exception {
		// Register Hadoop's URL stream handler so that java.net.URL
		// understands the hdfs:// scheme.
		URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
		// Open the file and copy its contents to standard output; the final
		// argument tells IOUtils to close the input stream when done.
		URL url = new URL(PATH);
		InputStream in = url.openStream();
		IOUtils.copyBytes(in, System.out, 1024, true);
	}
}
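One caveat with this approach: java.net.URL.setURLStreamHandlerFactory may be called at most once per JVM, and a second call throws an Error. A common way around this, sketched below, is to register the factory in a static initializer so it runs exactly once when the class is loaded (the class name HdfsUrlSetup is hypothetical, not from the original post):

package hdfs;

import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;

public class HdfsUrlSetup {
	// Register the hdfs:// protocol handler exactly once for the whole JVM;
	// java.net.URL throws an Error if the factory is set a second time.
	static {
		URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
	}
}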

Operating HDFS with the Hadoop FileSystem API

OperateByHadoopAPI.java

package hdfs;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class OperateByHadoopAPI {
	// Base HDFS URI (NameNode address)
	private static final String PATH="hdfs://192.168.80.100:9000/";
	private static final String DIR="/d1";
	private static final String FILE="/d1/default.cfg";
	public static void main(String[] args) throws Exception {
		FileSystem fileSystem = FileSystem.get(new URI(PATH), new Configuration());
		// Create a directory
		fileSystem.mkdirs(new Path(DIR));
		
		// Upload a file
		// Method 1: copy a local file straight into HDFS
		//fileSystem.copyFromLocalFile(new Path("F:/hadoopbaiduyundownload/liclog.txt"), new Path(DIR));
		// Method 2: create the target file in HDFS and stream the local file into it
		FSDataOutputStream out = fileSystem.create(new Path(FILE));
		FileInputStream in = new FileInputStream(new File("F:/hadoopbaiduyundownload/default.cfg"));
		IOUtils.copyBytes(in, out, 1024, true);
		
		// Download a file
		// Method 1: copy to the local file system. Produces a WARN (discussed after the listing), still to be resolved:
		// 14/11/19 21:39:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
		File file = new File("F:/hadoopbaiduyundownload/test.txt");
		File file2 = new File("F:/hadoopbaiduyundownload/test2.txt");
		fileSystem.copyToLocalFile(new Path("hdfs://192.168.80.100:9000/test.txt"), new Path(file.getAbsolutePath()));
		
		// Method 2: open an HDFS input stream and copy it to a local output stream
		FSDataInputStream inputStream = fileSystem.open(new Path("hdfs://192.168.80.100:9000/test.txt"));
		FileOutputStream outputStream = new FileOutputStream(file2.getAbsolutePath());
		IOUtils.copyBytes(inputStream, outputStream, 1024, true);
		
		// List the entries directly under the root directory
		FileStatus[] listStatus = fileSystem.listStatus(new Path("/"));
		for (FileStatus fileStatus : listStatus) {
			// The ternary must be parenthesized: string concatenation binds
			// tighter than ?:, so without the parentheses the directory
			// branch would print only the word "directory" and nothing else.
			System.out.println((fileStatus.isDir() ? "directory" : "file") + " " + fileStatus.getOwner()
					+ " " + fileStatus.getReplication() + " " + fileStatus.getPath());
		}
		
		// Delete a path
		// delete(Path path, boolean recursive): when recursive is true,
		// a directory is deleted together with everything under it.
		fileSystem.delete(new Path(DIR), true);
		
	}
}
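On the NativeCodeLoader warning from download method 1: it generally just means the platform-specific native-hadoop library could not be loaded, so Hadoop falls back to its pure-Java implementations; the copy itself still succeeds.

Two small refinements the example above skips: FileSystem.exists can guard the mkdirs and delete calls, and the FileSystem handle should be closed when you are done. A minimal sketch against the same cluster address (the class name ExistsAndCleanup is hypothetical, not from the original post):

package hdfs;

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ExistsAndCleanup {
	public static void main(String[] args) throws Exception {
		FileSystem fileSystem = FileSystem.get(
				new URI("hdfs://192.168.80.100:9000/"), new Configuration());
		try {
			Path dir = new Path("/d1");
			// Only create the directory if it does not already exist.
			if (!fileSystem.exists(dir)) {
				fileSystem.mkdirs(dir);
			}
			// delete(path, true) removes a directory recursively and
			// returns false if the path did not exist.
			boolean deleted = fileSystem.delete(dir, true);
			System.out.println("deleted: " + deleted);
		} finally {
			// Release the connection to the NameNode.
			fileSystem.close();
		}
	}
}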



Everyone is welcome to discuss and learn together!

If you find this useful, bookmark it!

Recording and sharing help us grow together! Feel free to browse my other posts; my blog: http://blog.csdn.net/caicongyang






Original post: http://blog.csdn.net/caicongyang/article/details/41290955
