
Compiling the Hadoop Source on Linux


To kick off this Hadoop series, we follow the usual routine: compile the source code and import it into Eclipse, so that later, whenever we want to understand a module or a module breaks, we can go straight to the source.

Building Hadoop requires protoc 2.5.0 (this holds for Hadoop 2.4.1 as well as for the 3.0.0-SNAPSHOT trunk built below), so protoc has to be downloaded as well. I downloaded protobuf-2.5.0.tar.bz2.

Before compiling and installing protoc, install a few dependency packages (skip any that are already present): gcc, gcc-c++, and make are needed for protoc itself, while cmake, openssl-devel, and ncurses-devel are needed later by Hadoop's native build.

yum install gcc
yum install gcc-c++
yum install make
yum install cmake  
yum install openssl-devel  
yum install ncurses-devel  
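
As a shortcut (my addition, not from the original post), the whole set can be installed in one command:

yum install -y gcc gcc-c++ make cmake openssl-devel ncurses-devel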

Install protoc:

tar -xvf protobuf-2.5.0.tar.bz2  
cd protobuf-2.5.0  
./configure --prefix=/opt/protoc/  
make && make install
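
protoc was installed under /opt/protoc/, and the Hadoop build has to be able to find it; a minimal sketch of putting it on the PATH and checking the version (assuming a bash shell):

export PATH=/opt/protoc/bin:$PATH   # make the freshly built protoc visible to the Maven build
protoc --version                    # should print: libprotoc 2.5.0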

On Linux, run the build command from the Hadoop source root:

mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
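
For reference, here is the same command with each option annotated (the gloss is mine, not the original author's):

# install                    build every module and install it into the local Maven repository
# eclipse:eclipse            generate Eclipse .project/.classpath files so the source can be imported
# -Pdist,native              build the binary distribution together with the native (JNI) libraries
# -DskipTests                skip the unit tests to shorten the build
# -Dtar                      additionally package the distribution as a .tar.gz
# -Dmaven.javadoc.skip=true  skip javadoc generation
mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true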

After the build completes, look at the hadoop-dist/target folder:
[root@localhost target]# ll
total 153824
drwxr-xr-x. 2 root root      4096 Jul  9 17:00 antrun
-rw-r--r--. 1 root root      4809 Jul  9 17:00 dist-layout-stitching.sh
-rw-r--r--. 1 root root       666 Jul  9 17:01 dist-tar-stitching.sh
drwxr-xr-x. 9 root root      4096 Jul  9 17:00 hadoop-3.0.0-SNAPSHOT
-rw-r--r--. 1 root root 157482988 Jul  9 17:01 hadoop-3.0.0-SNAPSHOT.tar.gz
-rw-r--r--. 1 root root      3445 Jul  9 17:01 hadoop-dist-3.0.0-SNAPSHOT.jar
drwxr-xr-x. 2 root root      4096 Jul  9 17:01 maven-archiver
drwxr-xr-x. 2 root root      4096 Jul  9 17:00 test-dir
[root@localhost target]# pwd
/home/fish/hadoop/hadoop-dist/target

Check the Hadoop version:
[root@localhost bin]# cd /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin
[root@localhost bin]# ./hadoop version
Hadoop 3.0.0-SNAPSHOT
Source code repository https://github.com/apache/hadoop.git -r e0febce0e74ec69597376774f771da46834c42b1
Compiled by root on 2015-07-09T08:53Z
Compiled with protoc 2.5.0
From source with checksum d69dd13fde158d22d95a263a0f12bc8
This command was run using /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop/common/hadoop-common-3.0.0-SNAPSHOT.jar
[root@localhost bin]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin

Inspect the compiled native libraries:
[root@localhost hadoop-3.0.0-SNAPSHOT]# file lib//native/*
lib//native/libhadoop.a:            current ar archive
lib//native/libhadooppipes.a:       current ar archive
lib//native/libhadoop.so:           symbolic link to `libhadoop.so.1.0.0'
lib//native/libhadoop.so.1.0.0:     ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libhadooputils.a:       current ar archive
lib//native/libhdfs.a:              current ar archive
lib//native/libhdfs.so:             symbolic link to `libhdfs.so.0.0.0'
lib//native/libhdfs.so.0.0.0:       ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
lib//native/libnativetask.a:        current ar archive
lib//native/libnativetask.so:       symbolic link to `libnativetask.so.1.0.0'
lib//native/libnativetask.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
[root@localhost hadoop-3.0.0-SNAPSHOT]# pwd
/home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
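
Besides file, Hadoop can also report on its own whether the native libraries are usable; a quick check (assuming the same target directory as above):

cd /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
./bin/hadoop checknative -a    # lists which native components (hadoop, zlib, snappy, openssl, ...) are available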

Build problems

Problem 1:

[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:3.0.0-SNAPSHOT: Failure to find org.apache.hadoop:hadoop-auth:jar:tests:3.0.0-SNAPSHOT in http://10.0.1.88:8081/nexus/content/repositories/thirdparty/ was cached in the local repository, resolution will not be reattempted until the update interval of thirdparty has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common

Rename the affected files in the local .m2 repository:

mv /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-auth/3.0.0-SNAPSHOT/hadoop-auth-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-kms/3.0.0-SNAPSHOT/hadoop-kms-3.0.0-SNAPSHOT-tests.jar
mv /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar.lastUpdated /root/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
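
If renaming does not help, a commonly used alternative (my suggestion, not from the original post) is to delete the *.lastUpdated markers that record the failed downloads and force Maven to re-resolve the snapshots:

find /root/.m2/repository/org/apache/hadoop -name "*.lastUpdated" -delete    # drop the cached failure markers
mvn install eclipse:eclipse -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true -U    # -U forces snapshot updates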

Problem 2:

Some errors report that a jar cannot be downloaded. In that case, search the official repository at http://search.maven.org/ to see whether the artifact really exists; if it does, the failure is most likely a network issue, and re-running the build a few times usually gets past it.

Problem 3:

[root@localhost bin]# ./hadoop
: No such file or directory
The hadoop script was saved with Windows (CRLF) line endings, which is why the shell reports "No such file or directory"; convert it to Unix format:
dos2unix /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/hadoop
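
If dos2unix itself is missing, or other scripts show the same symptom, the following should cover them (the paths follow the build output above; this extends the author's fix and is my suggestion):

yum install dos2unix    # in case the tool is not installed yet
dos2unix /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/bin/*
dos2unix /home/fish/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/sbin/*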


Original article: http://blog.csdn.net/qianshangding0708/article/details/47679991
