
Compiling CDH4.5 on Linux


 

  1. Install the JDK

  JDK: I installed jdk1.6.0_23 here.

    1.1: Make the installer executable: chmod u+x jdk-6u23-linux-x64.bin

    1.2: Run ./jdk-6u23-linux-x64.bin; it unpacks into a jdk1.6.0_23 directory.

    1.3: Add the environment variables to /etc/profile:

export JAVA_HOME=/home/hadoop/jdk1.6.0_23
export PATH=$PATH:$JAVA_HOME/bin

    Run java -version; if it prints a version, the JDK is installed correctly.
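
    To apply the new variables in the current shell and verify, a minimal check using the paths configured above:

    source /etc/profile
    java -version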

  2. Install Maven

    Download the archive and extract it, then add the path in /etc/profile.
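
    A minimal sketch of the download and extraction (the Apache archive URL and the /home/hadoop target directory are my assumptions, not from the original post):

    wget http://archive.apache.org/dist/maven/maven-3/3.2.1/binaries/apache-maven-3.2.1-bin.tar.gz
    tar zxvf apache-maven-3.2.1-bin.tar.gz -C /home/hadoop

    Then append to /etc/profile: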

    

export MAVEN_HOME=/home/hadoop/apache-maven-3.2.1
export PATH=$PATH:$MAVEN_HOME/bin

  Run mvn -v:

[hadoop@master ~]$ mvn -v
Apache Maven 3.2.1 (ea8b2b07643dbb1b84b6d16e1f08391b666bc1e9; 2014-02-14T09:37:52-08:00)
Maven home: /home/hadoop/apache-maven-3.2.1
Java version: 1.6.0_23, vendor: Sun Microsystems Inc.
Java home: /home/hadoop/jdk1.6.0_23/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-431.el6.x86_64", arch: "amd64", family: "unix"

  This output indicates the installation succeeded.

  3. gcc and g++

   Refer to my earlier post on g++ failing to install on CentOS ("linux centos 装g++安装不了").
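
   On CentOS, the compilers usually come from these packages (a sketch; run as root or prefix with sudo):

    yum install gcc gcc-c++ make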

  4. protoc

    I installed protobuf-2.4.1.

    tar zxvf protobuf-2.4.1.tar.gz
    cd protobuf-2.4.1
    ./configure  && make && make check && make install

    Environment variable configuration:

export PROTOC_HOME=/home/hadoop/protobuf-2.4.1
export PATH=$PROTOC_HOME/src:$PATH

    Note that the 2.4.1 release has no bin directory; the protoc binary sits in src instead. This took me a long time to figure out.
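
    A quick way to confirm that the shell resolves protoc from that src directory (my own check, not from the original post):

    which protoc
    # should print /home/hadoop/protobuf-2.4.1/src/protoc given the PATH above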

    

[hadoop@master ~]$ protoc --version
libprotoc 2.4.1

  This indicates the installation succeeded.

  5. Other dependencies

Run the following commands in order:
    yum install cmake  
    yum install openssl-devel  
    yum install ncurses-devel

 

  6. ant

  Some online guides say Ant must be installed, but the build worked fine for me without it.

  7. Compile the CDH4.5 source

   tar -zxvf hadoop-2.0.0-cdh4.5.0.tar.gz

   Enter the source directory with cd $HADOOP_HOME/src, then run mvn package -DskipTests -Pdist,native,docs (the build takes quite a while).
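
   Spelled out as commands (assuming the tarball was extracted under /home/hadoop and HADOOP_HOME points at it; the original post does not state this explicitly):

    export HADOOP_HOME=/home/hadoop/hadoop-2.0.0-cdh4.5.0
    cd $HADOOP_HOME/src
    mvn package -DskipTests -Pdist,native,docs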

  The first run hit an error; deleting the whole extracted tree and running the build again from scratch worked.

  The log shows the build finished successfully:

[INFO] Apache Hadoop Rumen ............................... SUCCESS [  6.738 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [  4.798 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [  2.669 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  3.134 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [ 18.866 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [  3.303 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [  0.027 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [ 51.497 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [ 16.715 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  4.659 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:12 min
[INFO] Finished at: 2014-07-23T00:11:20-08:00
[INFO] Final Memory: 93M/238M
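
  After a successful build, the packaged distribution normally lands under the hadoop-dist module's target directory (an assumption based on the standard Hadoop build layout; the original post does not say where it ends up):

    ls $HADOOP_HOME/src/hadoop-dist/target/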

  The exception hit on the first run:

[ERROR] Failed to execute goal org.apache.avro:avro-maven-plugin:1.7.4:schema
(generate-avro-test-sources) on project hadoop-common:
Execution generate-avro-test-sources of goal org.apache.avro:avro-maven-plugin:1.7.4:schema
failed: Plugin org.apache.avro:avro-maven-plugin:1.7.4 or one of its dependencies
could not be resolved: Failed to collect dependencies at org.apache.avro:avro-maven-plugin:jar:1.7.4 ->
org.apache.avro:avro-compiler:jar:1.7.4 -> commons-lang:commons-lang:jar:2.6: Failed to read artifact
descriptor for commons-lang:commons-lang:jar:2.6: Could not transfer
artifact org.apache.commons:commons-parent:pom:17 from/to central
(http://repo.maven.apache.org/maven2): GET request of: org/apache/commons/commons-parent/17/commons-parent-17.pom from central
failed: Connection reset -> [Help 1]
I never tracked down the root cause of this exception. The failure is a "Connection reset" while downloading commons-parent-17.pom from Maven central, so it looks like a transient network problem; the clean re-run described above did not hit it again.

    


Original article: http://www.cnblogs.com/zhanggl/p/3863065.html
