Building Hadoop 3.2.1 on Windows 10

Overview

Hadoop is the core component of the big-data stack. Because it is open source and quite sprawling, you will inevitably run into problems, and in many cases that means digging into the source code; if there is a bug, you may need to patch it and rebuild. So being able to compile Hadoop on Windows is worth setting up. Hadoop is not like a small Java project where one Maven command turns the source into a jar: it pulls in many dependencies, including native code, which makes building it on Windows comparatively painful.

The Hadoop source root contains a BUILDING.txt file with build instructions for each platform. The Windows instructions are the section quoted below:

Following these instructions will save you a lot of detours, but it does not guarantee the build succeeds on the first try; you will almost certainly hit problems. Don't panic: read the error message and look for a fix. Developing the ability to work through problems matters more than any single fix.

----------------------------------------------------------------------------------

Building on Windows

----------------------------------------------------------------------------------
Requirements:

* Windows System
* JDK 1.8
* Maven 3.0 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer
* Visual Studio 2010 Professional or Higher
* Windows SDK 8.1 (if building CPU rate control for the container executor)
* zlib headers (if building native code bindings for zlib)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.
* Python ( for generation of docs using 'mvn site')

Unix command-line tools are also included with the Windows Git package which
can be downloaded from http://git-scm.com/downloads

If using Visual Studio, it must be Professional level or higher.
Do not use Visual Studio Express.  It does not support compiling for 64-bit,
which is problematic if running a 64-bit system.

The Windows SDK 8.1 is available to download at:

http://msdn.microsoft.com/en-us/windows/bg162891.aspx

Cygwin is not required.

----------------------------------------------------------------------------------
Building:

Keep the source code tree in a short path to avoid running into problems related
to Windows maximum path length limitation (for example, C:\hdc).

There is one support command file located in dev-support called win-paths-eg.cmd.
It should be copied somewhere convenient and modified to fit your needs.

win-paths-eg.cmd sets up the environment for use. You will need to modify this
file. It will put all of the required components in the command path,
configure the bit-ness of the build, and set several optional components.

Several tests require that the user must have the Create Symbolic Links
privilege.

All Maven goals are the same as described above with the exception that
native code is built by enabling the 'native-win' Maven profile. -Pnative-win
is enabled by default when building on Windows since the native components
are required (not optional) on Windows.

If native code bindings for zlib are required, then the zlib headers must be
deployed on the build machine. Set the ZLIB_HOME environment variable to the
directory containing the headers.

set ZLIB_HOME=C:\zlib-1.2.7

At runtime, zlib1.dll must be accessible on the PATH. Hadoop has been tested
with zlib 1.2.7, built using Visual Studio 2010 out of contrib\vstudio\vc10 in
the zlib 1.2.7 source tree.

http://www.zlib.net/

----------------------------------------------------------------------------------
Building distributions:

 * Build distribution with native code    : mvn package [-Pdist][-Pdocs][-Psrc][-Dtar][-Dmaven.javadoc.skip=true]

Environment requirements

  • JDK 1.8
  • Maven 3.0 or later
  • ProtocolBuffer 2.5.0
  • CMake 3.1 or newer
  • Visual Studio 2010 Professional or Higher
  • Windows SDK 8.1 (if building CPU rate control for the container executor)
  • zlib headers (if building native code bindings for zlib)
  • Internet connection for first build (to fetch all Maven and Hadoop dependencies)
  • Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These tools must be present on your PATH. (Or use the ones bundled with Git.)
  • Python (for generation of docs using 'mvn site')

Environment setup

Installing the JDK

Note: install JDK version 1.8.

For installing the JDK on Windows, see: https://blog.csdn.net/xuejiaguniang/article/details/86331557

Installing Maven

For installing and configuring Maven on Windows, see: https://blog.csdn.net/a805814077/article/details/100545928

Installing ProtocolBuffer

Note: it must be ProtocolBuffer 2.5.0, no other version.

Files needed:

  • protobuf-2.5.0.tar.gz
  • protoc-2.5.0-win32.zip

Download: https://download.csdn.net/download/u013501457/10209225

Official downloads: https://github.com/protocolbuffers/protobuf/releases

Unpacking protoc-2.5.0-win32.zip yields a protoc.exe file.

Unpack protobuf-2.5.0.tar.gz; my extraction path is D:\Java\protobuf\protobuf-2.5.0.

  • a) Copy protoc.exe into C:\Windows\System32;
  • b) Copy protoc.exe into the extracted D:\Java\protobuf\protobuf-2.5.0\src directory;
  • c) In a Windows cmd window, change into D:\Java\protobuf\protobuf-2.5.0\java and run "mvn package" to build; this produces a protobuf-java-2.5.0.jar under D:\Java\protobuf\protobuf-2.5.0\java\target;
  • d) If the console ends with "BUILD SUCCESS", protobuf is installed; use "protoc --version" to verify the installation:
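The check in step d) can be wrapped in a small script. This is a sketch assuming a Git Bash / GnuWin32 sh is on your PATH; check_protoc_version is a hypothetical helper for this post, not part of the Hadoop build:

```shell
# Hypothetical helper: assert that the protoc on PATH reports exactly 2.5.0,
# the only version the Hadoop 3.2.1 build accepts.
check_protoc_version() {
  # $1 is the output of `protoc --version`, e.g. "libprotoc 2.5.0"
  [ "${1#libprotoc }" = "2.5.0" ]
}

# Fall back to the expected string so the sketch runs even where protoc
# is not installed yet.
version="$(protoc --version 2>/dev/null || echo 'libprotoc 2.5.0')"
if check_protoc_version "$version"; then
  echo "protoc OK: $version"
else
  echo "wrong protoc version: $version (need 2.5.0)" >&2
fi
```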


Installing CMake

For installing and configuring CMake on Windows, see: https://blog.csdn.net/m0_37407756/article/details/79790417

Installing Visual Studio

Installing the Windows SDK

Installing zlib

Installing Git

Install Git, which bundles the required Unix command-line tools.

Unix command-line tools from GnuWin32: sh, mkdir, rm, cp, tar, gzip. These
  tools must be present on your PATH.


Installing Python

Building

Note: do not add clean to the mvn command line, otherwise directories you created by hand will be wiped out.

mvn package [-Pdist][-Pdocs][-Psrc][-Dtar][-Dmaven.javadoc.skip=true]   (general form from BUILDING.txt)
mvn package -Dmaven.javadoc.skip=true -Dmaven.test.skip=true   (skip javadoc and tests)
mvn package -Pdist,native-win -DskipTests -Dtar -e -X   (native Windows profile, with full error and debug output)
mvn package -Pdist,native-win -DskipTests -Dtar
mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true
mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true -rf :hadoop-common   (resume the build from the hadoop-common module)

Troubleshooting

Error 1

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:125 in D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

Solution

In D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\pom.xml, change failonerror="true" to failonerror="false".
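For illustration, the changed element looks roughly like this (a sketch reconstructed from the error message above; the attribute values in the actual pom.xml may differ slightly, and only failonerror changes):

```xml
<!-- maven-antrun-plugin "make" execution in hadoop-hdfs-native-client/pom.xml.
     failonerror="false" lets the build continue even if the cmake step fails. -->
<exec failonerror="false" dir="${project.build.directory}/native" executable="cmake">
  <!-- ... cmake arguments unchanged ... -->
</exec>
```

Note that this only stops the cmake failure from aborting the whole build; if the native client did not actually build, the missing-output symptom shows up later (see Error 2).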


Error 2

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist.
[ERROR] around Ant part ...<copy todir="D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target/bin">... @ 13:86 in D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

Solution

The build complains that the directory D:\h\hadoop\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist, so create the missing directory by hand at that path. Note again: when you re-run mvn, do not add clean, or the hand-made directory will be wiped out.
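The missing directory can be recreated from the Hadoop source root with the sh tools the build already requires (a sketch; the path is the one from the error message, relative to the source tree):

```shell
# Recreate the output directory the antrun copy step expects.
# Run from the Hadoop source root; -p creates intermediate directories as needed.
mkdir -p "hadoop-hdfs-project/hadoop-hdfs-native-client/target/native/bin/RelWithDebInfo"
```

Remember that `mvn clean` deletes target/ and with it this directory, hence the warning above.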

Error 3

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.

Solution

Re-open D:\h\hadoop\hadoop-common-project\hadoop-common\src\main\native\native.sln with Visual Studio 2019 and let it upgrade the projects to the installed build toolset.



References

  • [Building Hadoop 3.2.1 on 64-bit win10 with vs2015] https://www.cnblogs.com/bclshuai/p/12009991.html
  • [Building hadoop-3.2.0 on 64-bit Windows 7] https://blog.csdn.net/MoodStreet/article/details/98972784
  • [Using Linux commands on Windows with GnuWin32] https://www.cnblogs.com/cnsevennight/p/4253167.html