
[Hadoop 2.6] Compiling the Hadoop 2.6 source on Red Hat 5.8 [Appendix: software downloads]


Tags: hadoop, source compilation

When you download Hadoop 2.6 from the official site, install it, and start using it, the console keeps printing this line:

 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

That is because the official release is built for 32-bit systems; to use it on a 64-bit operating system you have to compile the source yourself.

Here is my build process:


First, download Maven, Ant, and FindBugs, extract each of them, and configure the environment variables.
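For reference, a minimal ~/.bashrc sketch, assuming a JDK is already installed with JAVA_HOME set and that the three tools were extracted under /usr/local (the directory names below are placeholders; adjust them to your actual versions and paths):

export MAVEN_HOME=/usr/local/apache-maven-3.0.5
export ANT_HOME=/usr/local/apache-ant-1.9.4
export FINDBUGS_HOME=/usr/local/findbugs-3.0.0
export PATH=$PATH:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin

Afterwards run source ~/.bashrc and confirm the setup with mvn -version and ant -version.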


This is the Hadoop 2.6 source tree; it only needs to be extracted.
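For example, assuming the source tarball from the download page and the /home/hadoop location that reappears in step 7 below:

tar -zxvf hadoop-2.6.0-src.tar.gz -C /home/hadoop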

Next, install the remaining required software:

1. Install gcc and gcc-c++

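On Red Hat 5.8 this is a single yum command (assuming a configured yum repository or RHN subscription):

yum install -y gcc gcc-c++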

2. Install make and cmake

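Again via yum; if the base channels on RHEL 5 do not carry cmake, it can be pulled from EPEL or built from source:

yum install -y make cmake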

3. Install openssl-devel and ncurses-devel

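These development headers are required by the native part of the build:

yum install -y openssl-devel ncurses-devel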

4. Finally, install protobuf

After extracting the tarball, build and install it:

./configure

make

make check 

make install
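A quick sanity check, plus one step that is often needed on RHEL 5: Hadoop 2.6 expects protobuf 2.5.0 specifically, and make install places the library under /usr/local/lib, which may not be on the runtime linker path by default (the file name protobuf.conf below is just an illustrative choice):

protoc --version                                          # should print: libprotoc 2.5.0
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf   # only if /usr/local/lib is not already listed
ldconfig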

5. Build: run the following two Maven commands from the top of the extracted source tree

mvn clean install -DskipTests


mvn package -Pdist,native -DskipTests -Dtar
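If the build succeeds, the packaged 64-bit distribution, including the native libraries referenced in step 7, should land under hadoop-dist/target inside the source tree, for example:

ls /home/hadoop/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native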


6. Build succeeded:

[INFO] Apache Hadoop Main ................................ SUCCESS [7.430s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [16.054s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [4.956s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.536s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [18.575s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [5.347s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [4.062s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [6.434s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [4.823s]
[INFO] Apache Hadoop Common .............................. SUCCESS [2:44.417s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.645s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [19.681s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.074s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [5:05.240s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [28.723s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [15.884s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [6.125s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.072s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.068s]
[INFO] hadoop-yarn-api ................................... SUCCESS [4:41.183s]
[INFO] hadoop-yarn-common ................................ SUCCESS [43.377s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.073s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [17.120s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [23.088s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.509s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [9.256s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [25.804s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [8.690s]
[INFO] hadoop-yarn-client ................................ SUCCESS [10.050s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.062s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.423s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.732s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.064s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [8.125s]
[INFO] hadoop-yarn-project ............................... SUCCESS [7.215s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.114s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [27.948s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [31.602s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [6.624s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [12.136s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [12.574s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [6.447s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.662s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [8.593s]
[INFO] hadoop-mapreduce .................................. SUCCESS [6.852s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.445s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.579s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [3.206s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [8.972s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.194s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [4.075s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [3.546s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.469s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.900s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [8.320s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [6.285s]
[INFO] Apache Hadoop Client .............................. SUCCESS [10.249s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.183s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [6.699s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [14.929s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.058s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [48.591s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 22:18.595s
[INFO] Finished at: Wed Feb 04 16:35:19 CST 2015
[INFO] Final Memory: 101M/251M
[INFO] ------------------------------------------------------------------------


7. Replace the native folder

/home/hadoop/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native

Copy everything from the freshly built folder above over the native files of your original 32-bit Hadoop installation.
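A minimal sketch of the copy plus a verification step, assuming HADOOP_HOME points at the existing installation:

cp -r /home/hadoop/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/* $HADOOP_HOME/lib/native/
$HADOOP_HOME/bin/hadoop checknative -a    # the hadoop native library should now be reported as loaded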

Now you can work with the file system in peace and will never again see the annoying:

 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Appendix:

Software downloads:

maven       ant       findbugs 3.0.0       protobuf 2.5.0       hadoop 2.6







Original source: http://blog.csdn.net/simonchi/article/details/43487183
