Hadoop 2.7.1 — Compiling the 64-bit Version from Source


Document download: http://download.csdn.net/detail/hanxindere/9153021

Software environment:

CentOS 6.5, 64-bit

JDK 1.7.0_02, 64-bit

Maven 3.2.3

Ant 1.9.4

protobuf-2.5.0.tar.gz

All of the software can be downloaded from http://yunpan.cn/cH5ebUqNPC6Be (extraction code: 2c57).



Notes:

- If the machine has less than 1 GB of memory, there must be a swap area; see the steps below for creating one.

- Do not use a Maven version that is too new; newer versions sometimes cause problems.

- Use a 64-bit JDK between 1.7.0 and 1.7.0_45 (versions newer than 1.7.0_45 may cause problems in hadoop-common/security). All JDK releases can be downloaded from: http://www.oracle.com/technetwork/java/archive-139210.html

- Be sure to install the basic software below first.

1. Basic Software Installation

Install the basic build environment:

yum -y install svn ncurses-devel gcc*

yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
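After the packages are installed, a quick sanity check (just a sketch, using the package names from the commands above) confirms the key build tools are present:

gcc --version                      # C/C++ compiler used for the native parts of the build
cmake --version                    # needed by the native modules (e.g. hadoop-pipes)
rpm -q openssl-devel zlib-devel    # confirm the development headers were installed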

 

2. Software Installation

1) JDK 1.7.0_02 (versions newer than 1.7.0_45 may cause problems in hadoop-common/security and report many errors when compiling the Hadoop 2.7.1 source)

2) Maven 3.0 or newer

3) ProtocolBuffer 2.5.0

4) Findbugs 1.3.9, optional (not installed for this build)

5) Ant 1.9.4

 

This article installs everything as the root user under /root, but a non-root user and a directory other than /root can be used instead.

2.1. Install ProtocolBuffer

 

The standard automake configure/make/install procedure:

1) cd /root

2) tar xzf protobuf-2.5.0.tar.gz

3) cd protobuf-2.5.0

4) ./configure --prefix=/root/protobuf

5) make

6) make install
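Once make install finishes, it is worth confirming that the protoc the build will pick up is exactly version 2.5.0, which the Hadoop 2.7.1 build requires (a minimal check, assuming the --prefix used above):

export PATH=/root/protobuf/bin:$PATH
protoc --version    # should print: libprotoc 2.5.0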

 

2.2. Install the JDK

1) cd /root

2) tar xzf jdk-7u02-linux-x64.gz

3) ln -s jdk1.7.0_02 jdk
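A quick check that the symlink points at a working 64-bit JDK (paths as created above):

/root/jdk/bin/java -version    # should report version 1.7.0_02 and a 64-Bit Server VM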

 

2.3. Install Maven

1) cd /root

2) tar xzf apache-maven-3.0.5-bin.tar.gz

3) ln -s apache-maven-3.0.5 maven

 

2.4. Install Ant

1) cd /root

2) tar xzf apache-ant-1.9.4-bin.tar.gz

3) ln -s apache-ant-1.9.4 ant

 

2.5. Set the Environment Variables

After the software is installed, the environment variables still need to be set. Edit /etc/profile (or ~/.profile) and add the following. The paths below match the /root install locations and symlinks created above; adjust HADOOP_HOME to wherever the compiled Hadoop will eventually be installed. The protobuf bin directory is added to PATH so that the build can find protoc.

#set environment

export JAVA_HOME=/root/jdk

export HADOOP_HOME=/root/hadoop

export MAVEN_HOME=/root/maven

# guard against running out of memory

export MAVEN_OPTS="-Xms512m -Xmx1024m"

export JAVA_OPTS="-Xms512m -Xmx1024m"

export ANT_HOME=/root/ant

export FINDBUGS_HOME=/root/findbugs

 

export PATH=$FINDBUGS_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$JAVA_HOME/bin:$JAVA_HOME/jre/bin:/root/protobuf/bin:$PATH

export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
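After saving the file, reload it and confirm that each tool resolves to the expected version (a quick sanity check; paths as set above):

source /etc/profile
java -version       # 1.7.0_02
mvn -version        # the Maven version installed in 2.3
ant -version        # Apache Ant 1.9.4
protoc --version    # libprotoc 2.5.0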

 

3. Compiling the Hadoop Source Code

If the host has less than 1 GB of memory, create swap space to compensate.

Run free -m to check whether swap space already exists.

If there is none, create it as follows:

1) Choose the swap size and create a swap file somewhere with enough free space, e.g. /swap1:

    #dd if=/dev/zero of=/swap1 bs=1M count=2048

         (if is the input file, of is the output file, bs=1M is the block size, and count=2048 means 2048 blocks, i.e. 2 GB of space.)

2) Format the file as swap space:

     #mkswap /swap1

3) Activate it so the swap file is used immediately:

    #swapon /swap1

4) Add a line like the following to /etc/fstab so it is enabled at boot:

 /swap1  swap      swap    defaults   0 0
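You can then confirm the swap space is active:

swapon -s    # /swap1 should be listed as an active swap file
free -m      # the Swap line should now show roughly 2 GB in total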


Before compiling, edit hadoop-src-2.7.1/hadoop-common-project/hadoop-auth/pom.xml:

         <dependency>

     <groupId>org.mortbay.jetty</groupId>

     <artifactId>jetty-util</artifactId>

     <scope>test</scope>

   </dependency>

   <dependency>

     <groupId>org.mortbay.jetty</groupId>

     <artifactId>jetty</artifactId>               <!-- modify this line -->

     <scope>test</scope>

   </dependency>
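After editing, it is worth making sure the file is still well-formed XML before starting the long build (a minimal check; xmllint is part of the libxml2 package that CentOS installs by default):

xmllint --noout hadoop-src-2.7.1/hadoop-common-project/hadoop-auth/pom.xml    # prints nothing if the XML parses cleanly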

 

Once the preparation above is complete, start the build by running:

mvn package -Pdist -DskipTests -Dtar

If Maven downloads dependencies too slowly, you can unpack the contents of mvn3.2.2_hadoop2.7.1.tar.gz into ~/.m2/repository; using the software versions from the 360 cloud-drive link above is recommended.

After a successful build, the Hadoop binary package hadoop-2.7.1.tar.gz is generated under the hadoop-dist/target subdirectory of the source tree:

 

[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.075 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 2.469 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 6.505 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.309 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.699 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 5.294 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 6.434 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 10.735 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 4.638 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:52 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 14.563 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 17.671 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.047 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [05:23 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:04 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [02:09 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 7.087 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.067 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.041 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [02:32 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [33:53 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.295 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 23.604 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 35.011 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 6.994 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 14.614 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 40.431 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 10.644 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 13.446 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 5.840 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.074 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 3.636 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 3.179 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.079 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 9.001 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 7.124 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.116 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 37.457 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 34.469 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 8.019 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 15.131 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 9.849 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [17:20 min]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 3.078 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 9.110 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 5.265 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [03:17 min]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [08:16 min]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.757 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 8.926 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 7.391 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 5.499 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 3.722 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 5.240 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 12.424 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 8.173 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [30:11 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [10:52 min]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 14.450 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.290 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 8.773 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 15.627 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.026 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:14 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2015-10-02T14:10:50+08:00
[INFO] Final Memory: 116M/494M
[INFO] ------------------------------------------------------------------------
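The resulting tarball can then be located from the top of the source tree, for example:

ls -lh hadoop-dist/target/hadoop-2.7.1.tar.gz    # the binary distribution produced by the build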
