
Fixing the "Unable to load native-hadoop library" warning when YARN loads native libraries

Posted: 2014-08-23 12:30:40



After installing the official Hadoop 2.1.0-beta build, every hadoop command prints this warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
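Newer Hadoop releases also ship a diagnostic subcommand that reports which native libraries were actually found and loaded; a quick sketch, assuming a `hadoop` binary (2.4 or later) on the PATH:

```shell
# Report native-library status directly; falls back to a message
# when hadoop is not on the PATH.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH"
fi
```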

Raise the logger level to see the underlying cause:

 

export HADOOP_ROOT_LOGGER=DEBUG,console

...

DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

14/08/23 10:04:21 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
report: Failed on local exception: java.io.IOException: Connection reset by peer; Host Details : local host is: "VM_160_34_centos/127.0.0.1"; destination host is: "Master":9000; 
 
The debug output shows "wrong ELFCLASS32". Could the .so being loaded have been built for the wrong architecture? Run the file command on it:

file libhadoop.so.1.0.0

 

hadoop@VM_160_34_centos:/usr/local/hadoop-2.4.0/lib/native> file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
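The `file` verdict comes straight from the ELF header: the byte at offset 4 encodes the class, 01 for 32-bit (ELFCLASS32) and 02 for 64-bit. A minimal sketch that classifies every library under lib/native (the path is the one from this post; adjust it to your install):

```shell
# Print the ELF class of each native library by reading the class
# byte at offset 4: 01 = ELFCLASS32, 02 = ELFCLASS64.
elf_class() {
  case "$(od -An -tx1 -j4 -N1 "$1" 2>/dev/null | tr -d ' ')" in
    01) echo "32-bit" ;;
    02) echo "64-bit" ;;
    *)  echo "not an ELF file" ;;
  esac
}

for lib in /usr/local/hadoop-2.4.0/lib/native/*; do
  printf '%s: %s\n' "$lib" "$(elf_class "$lib")"
done
```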

 

Sure enough: Intel 80386 means a 32-bit binary, while my Hadoop environment runs on a 64-bit OS.

It turns out the pre-built Hadoop releases downloaded from the Apache mirrors all ship 32-bit native libraries; for 64-bit support you have to recompile Hadoop yourself. That is rather painful, given that almost every production environment runs a 64-bit OS.
A line from the official documentation on native libraries confirms this:
"The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory."

Fix: recompile Hadoop

The fix is to rebuild the Hadoop software from source:

Set up the build environment

1. Required packages

yum install svn
yum install autoconf automake libtool cmake
yum install ncurses-devel
yum install openssl-devel
yum install gcc*
2. Install Maven

Download and unpack:

 

wget  -c http://mirrors.hust.edu.cn/apache/maven/maven-3/3.2.3/binaries/apache-maven-3.2.3-bin.tar.gz
tar -zxvf apache-maven-3.2.3-bin.tar.gz  -C /usr/local/

Add /usr/local/apache-maven-3.2.3/bin to the PATH:

 

root@VM_160_34_centos:~/tools> vi /etc/profile.d/maven-development.sh 
export M2_HOME=/usr/local/apache-maven-3.2.3
export PATH=$PATH:$M2_HOME/bin
root@VM_160_34_centos:~/tools> source /etc/profile

Test Maven:

root@VM_160_34_centos:/usr/local/apache-maven-3.2.3> mvn -version
Apache Maven 3.2.3 (33f8c3e1027c3ddde99d3cdebad2656a31e8fdf4; 2014-08-12T04:58:10+08:00)
Maven home: /usr/local/apache-maven-3.2.3
Java version: 1.7.0_55, vendor: Oracle Corporation
Java home: /usr/local/java/jdk1.7.0_55/jre
Default locale: en_US, platform encoding: ANSI_X3.4-1968
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"

 

3. Install protobuf

Without protobuf installed, the build cannot finish; it fails like this:

[INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---

[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory

[ERROR] stdout: []

...

[INFO] Apache Hadoop Main ................................ SUCCESS [5.672s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.682s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [8.921s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.676s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.590s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [9.172s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [10.123s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.170s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.224s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ....................... SKIPPED
Installing protobuf:

Download:

root@VM_160_34_centos:~/tools> wget -c https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

Unpack:

root@VM_160_34_centos:~/tools> tar -xvzf protobuf-2.5.0.tar.gz 
root@VM_160_34_centos:~/tools> cd protobuf-2.5.0

 

Then run the following commands in order:

./configure

make

make check

make install
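One caveat worth noting: `make install` puts the protobuf libraries under /usr/local/lib by default, and on some CentOS setups that directory is not on the dynamic linker's search path, so protoc can then fail with "error while loading shared libraries". A sketch of the usual fix (needs root; the path assumes the default install prefix):

```shell
# Register /usr/local/lib with the dynamic linker so protoc can
# find libprotobuf/libprotoc at runtime; skipped when not root.
if [ -w /etc/ld.so.conf.d ]; then
  echo /usr/local/lib > /etc/ld.so.conf.d/protobuf.conf
  ldconfig
else
  echo "re-run as root to register /usr/local/lib with ldconfig"
fi
```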

Test the installation:

 

protoc --version

 

libprotoc 2.5.0
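Before starting the long Maven build, it can save a wasted hour to confirm the whole toolchain is reachable; a small sketch (the tool names correspond to the packages installed above):

```shell
# Verify the build prerequisites are on the PATH; prints a line
# for each missing tool, nothing when everything is present.
for tool in mvn gcc g++ cmake autoconf automake libtool protoc svn; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```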

Check out the release source again:
svn checkout http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.4.0/

 

Add the native profile; the build then generates native libraries matching the current OS architecture:
mvn package -Pdist,native -DskipTests -Dtar

 

 
Run file on everything under lib/native again: the libraries are now all 64-bit. Replace the files on the live cluster, and the WARNING is gone.
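The built distribution lands under hadoop-dist/target inside the checkout; a sketch of the swap (SRC and DST are assumptions based on the paths used in this post, adjust them to your layout):

```shell
# Copy the freshly built 64-bit native libraries over the 32-bit
# ones; guarded so it only runs when both directories exist.
SRC=release-2.4.0/hadoop-dist/target/hadoop-2.4.0/lib/native
DST=/usr/local/hadoop-2.4.0/lib/native
if [ -d "$SRC" ] && [ -d "$DST" ]; then
  cp -f "$SRC"/* "$DST"/
else
  echo "adjust SRC and DST to your build and install paths"
fi
```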

 

 

Thanks to: http://www.kankanews.com/ICkengine/archives/81648.shtml

 


Original article: http://www.cnblogs.com/mjorcen/p/3930808.html
