
Compiling and Installing Hadoop 2.4.1 on RedHat Enterprise 6.5 (64-bit)




Thanks to the original post: http://blog.csdn.net/w13770269691/article/details/16883663/

  Step 1. Reconfigure yum (only needed for unregistered RedHat Enterprise systems; registered users can skip this step).

    Reference: http://blog.csdn.net/zhngjan/article/details/20843465 (a sketch of the idea follows below)
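    The usual workaround on an unregistered system is to point yum at public CentOS 6 repositories instead. A minimal sketch of such a repo file, assuming the standard CentOS mirror layout (the repo name and mirror URL are illustrative and may need adjusting for your environment):

cat > /etc/yum.repos.d/centos6.repo << 'EOF'
[centos6-base]
name=CentOS 6 - Base
baseurl=http://mirror.centos.org/centos/6/os/x86_64/
gpgcheck=0
enabled=1
EOF
yum clean all && yum makecache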

  Step 2. Download the source package from http://mirrors.hust.edu.cn/apache/hadoop/common/stable2/

wget http://mirrors.hust.edu.cn/apache/hadoop/common/stable2/hadoop-2.4.1-src.tar.gz

  Step 3. Set up Maven as the build tool for Hadoop.

    a. Download the prebuilt Maven binary package:

wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.tar.gz
tar -zxvf apache-maven-3.1.1-bin.tar.gz -C /opt/

    b. Configure Maven's environment variables by appending the following lines to the end of /etc/profile:

export MAVEN_HOME=/opt/apache-maven-3.1.1
export PATH=$PATH:${MAVEN_HOME}/bin

    c. Run the following command so the changes take effect:

source /etc/profile

    d. Test Maven:

mvn -version
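If everything is wired up correctly, the output should look roughly like the following (exact versions, paths, and locale depend on your environment):

Apache Maven 3.1.1
Maven home: /opt/apache-maven-3.1.1
Java version: 1.7.0_xx, vendor: Oracle Corporation
Default locale: en_US, platform encoding: UTF-8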

   e. Because Maven's default (overseas) repository servers may be unreachable, configure a domestic mirror for Maven first.

    In conf/settings.xml under the Maven installation directory, add the following inside the <mirrors></mirrors> element (be careful not to place it inside a commented-out example):

<mirror> 
  <id>nexus-osc</id>  
  <mirrorOf>*</mirrorOf>  
  <name>Nexusosc</name>  
  <url>http://maven.oschina.net/content/groups/public/</url>  
</mirror>

 In the same conf/settings.xml, add the following inside the <profiles></profiles> element (again, not inside a comment):

<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>
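To confirm that the mirror and the jdk-1.7 profile were picked up (and that the edited XML is well-formed), Maven's effective settings can be printed; this only reads configuration and does not build anything:

mvn help:effective-settings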

  Step 4. Building Hadoop 2.4.1 requires protoc 2.5.0, so download and install protobuf 2.5.0 as well.

    Official download page: https://code.google.com/p/protobuf/downloads/list

    Baidu Pan mirror: http://pan.baidu.com/s/1pJlZubT

    a. Before building and installing protoc, install a few dependencies: gcc, gcc-c++, and make (skip any that are already installed):

yum install gcc  
yum install gcc-c++  
yum install make 

   b. Build and install protoc:

tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure --prefix=/opt/protoc/
make && make install

    Here /opt/protoc/ is the custom protoc install prefix. Once the commands above finish, configure the protoc environment variable by appending the following to the end of /etc/profile:

export PATH=/opt/protoc/bin:$PATH
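After reloading the profile, a quick check confirms that the newly built protoc is the one on the PATH (the version shown is what Hadoop 2.4.1 expects):

source /etc/profile
protoc --version        # should print: libprotoc 2.5.0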

  Step 5. Don't start the build just yet, or you will run into all kinds of errors: the cmake, openssl-devel, and ncurses-devel dependencies need to be installed first (a combined one-liner is shown after the individual commands below).

yum install cmake  
yum install openssl-devel  
yum install ncurses-devel  
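The three packages can also be installed in a single pass; the -y flag merely skips the confirmation prompts:

yum install -y cmake openssl-devel ncurses-devel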

  Step 6. Unpack and build the Hadoop source (be sure to cd into the unpacked source directory before running the mvn build; a note on memory settings follows the commands):

tar -zxvf hadoop-2.4.1-src.tar.gz -C /opt/
cd /opt/hadoop-2.4.1-src
mvn package -Pdist,native -DskipTests -Dtar
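If the build aborts with Java heap space or PermGen errors, it can help to give Maven more memory before re-running the same mvn command; the values below are only a suggestion for a JDK 7 environment:

export MAVEN_OPTS="-Xms256m -Xmx1024m -XX:MaxPermSize=256m"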

The build takes quite a while, so be patient. On success the tail of the output looks similar to:

[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [33.648s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [7.303s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [21.288s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [14.611s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [8.334s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [10.737s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [18.321s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [17.136s]
[INFO] Apache Hadoop Client .............................. SUCCESS [14.563s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.254s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [17.245s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [14.478s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.084s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [41.979s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:24.464s
[INFO] Finished at: Thu Aug 07 14:25:51 CST 2014
[INFO] Final Memory: 159M/814M
[INFO] ------------------------------------------------------------------------

The compiled distribution ends up under: hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1

Enter the hadoop-2.4.1 directory and verify that the build succeeded:

[root@cupcs-redhat6 bin]# cd /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/bin
[root@cupcs-redhat6 bin]# ./hadoop version
Hadoop 2.4.1
Subversion Unknown -r Unknown
Compiled by root on 2014-08-07T05:52Z
Compiled with protoc 2.5.0
From source with checksum bb7ac0a3c73dc131f4844b873c74b630
This command was run using
/opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1/share/hadoop/common/hadoop-common-2.4.1.jar
[root@cupcs-redhat6 bin]# cd ..
[root@cupcs-redhat6 hadoop-2.4.1]# file lib/native/*
lib/native/libhadoop.a: current ar archive
lib/native/libhadooppipes.a: current ar archive
lib/native/libhadoop.so: symbolic link to `libhadoop.so.1.0.0'
lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1
(SYSV), dynamically linked, not stripped
lib/native/libhadooputils.a: current ar archive
lib/native/libhdfs.a: current ar archive
lib/native/libhdfs.so: symbolic link to `libhdfs.so.0.0.0'
lib/native/libhdfs.so.0.0.0: ELF 64-bit LSB shared object, x86-64, version 1
(SYSV), dynamically linked, not stripped
[root@cupcs-redhat6 hadoop-2.4.1]#
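From here the compiled distribution can be copied to wherever Hadoop will actually run. A minimal sketch, assuming /opt/hadoop-2.4.1 as the install location and /etc/profile for the environment variables (both are just one possible choice):

cp -r /opt/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1 /opt/
echo 'export HADOOP_HOME=/opt/hadoop-2.4.1' >> /etc/profile
echo 'export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin' >> /etc/profile
source /etc/profile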

 

 

Original article: http://www.cnblogs.com/wrencai/p/3897438.html
