Compiling Hadoop 3.2.1 on 64-bit Windows 10 with VS2015

1.1 Downloading and Installing the JDK

1.1.1 Download

JDK 1.8 (jdk1.8.0_102)
Download: http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html (choose the build that matches your system)

1.1.2 Installation

Extract/install to a folder of your choice, then:

(1) Install the JDK
(2) Create a system variable JAVA_HOME = D:\Program Files\Java\jdk1.8.0_102
(3) Edit the Path system variable and append %JAVA_HOME%\bin;%JAVA_HOME%\jre\bin
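Steps (2) and (3) can also be done from an elevated cmd prompt instead of the System Properties dialog. A sketch (the JDK path is the example path above; adjust it to your install, and note that setx truncates values longer than 1024 characters, so a long Path is safer to edit in the GUI):

```bat
:: Set JAVA_HOME machine-wide and extend Path (run cmd as Administrator).
setx /M JAVA_HOME "D:\Program Files\Java\jdk1.8.0_102"
setx /M PATH "%PATH%;%JAVA_HOME%\bin;%JAVA_HOME%\jre\bin"
```

Open a new cmd window afterwards and verify with java -version.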

1.2 Downloading and Installing Maven

1.2.1 Download

Download from http://maven.apache.org/download.cgi (installation notes: https://blog.csdn.net/changge458/article/details/53576178)


1.2.2 Installation

Extract to D:\marven\apache-maven-3.6.3

Add a system environment variable MAVEN_HOME = D:\marven\apache-maven-3.6.3, then append %MAVEN_HOME%\bin to the Path system variable.

To test whether mvn works, open cmd as Administrator; otherwise you may get the error: 'mvn' is not recognized as an internal or external command, operable program or batch file.

 


To verify the configuration, run mvn -v in the console; if it prints version information, Maven is configured correctly.

 


Open D:\marven\apache-maven-3.6.3\conf\settings.xml and add the Aliyun mirror: locate the mirrors tag and insert the entry below, so that Maven downloads dependencies from the faster Aliyun repository.

<mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>

 

1.3 Building and Installing protobuf

protobuf is a data-serialization library; the Hadoop build depends on it.

1.3.1 Download

ProtocolBuffer 2.5.0 (two files: protobuf-2.5.0.zip and protoc-2.5.0-win32.zip)
Download: https://github.com/google/protobuf/releases/tag/v2.5.0
Note: besides the protobuf source, you also need the matching precompiled Windows protoc command (protoc-2.5.0-win32.zip), which compiles .proto files into Java or C++ source files.
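For reference, protoc's job can be seen with a minimal .proto file (this example file is mine, not part of the Hadoop source; protobuf 2.5.0 predates proto3, so proto2's required/optional field rules apply):

```proto
// sample.proto - an illustrative proto2 message definition.
// Generate Java sources with:  protoc --java_out=. sample.proto
package demo;

message Greeting {
  required string text = 1;   // field tag 1
  optional int32 id = 2;      // field tag 2
}
```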

1.3.2 Installation

① Extract ProtocolBuffer to a directory of your choice
② Extract protoc-2.5.0-win32.zip and copy protoc.exe into C:\WorkSpace\protobuf-2.5.0\src
③ Build ProtocolBuffer from a CMD window:

 

cd C:\WorkSpace\protobuf-2.5.0\java

mvn test


mvn install


If protoc --version fails at this point, add the directory containing protoc.exe (C:\WorkSpace\protobuf-2.5.0\src) to the Path system variable.


1.4 Downloading and Installing Git

Git provides bash and other Linux-style commands in the Windows cmd environment, which the Hadoop build scripts rely on. Without it, the build fails with:

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-project-dist: Command execution failed.: Cannot run program "bash" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-project-dist\target"): CreateProcess error=2, 系统找不到指定的文件。 -> [Help 1]


1.4.1 Download

 https://git-for-windows.github.io/

1.4.2 Installation

Run the downloaded installer and click through the setup wizard (the original post illustrates each of the five steps with a screenshot).

1.5 Installing CMake

1.5.1 Download

 https://cmake.org/download/

Windows win64-x64 ZIP  cmake-3.16.0-win64-x64.zip

1.5.2 Installation

Extract to a folder of your choice and add the bin directory to the Path environment variable:

D:\hadoop\cmake-3.16.0-win64-x64\bin


1.6 Downloading and Installing zlib

1.6.1 Download

http://jaist.dl.sourceforge.net/project/libpng/zlib/1.2.8/


1.6.2 Installation

Extract to a folder of your choice and add a system variable for zlib pointing to that folder.


2 Building Hadoop

2.1 Upgrading the Visual Studio Project Version

Hadoop releases can be downloaded from http://archive.apache.org/dist/hadoop/core/

During the build, Maven locates a Visual Studio toolchain via the native project files. Hadoop 3.2.1 ships its projects for VS2010 by default, but here we build with VS2015, so the solution files winutils.sln and native.sln must be upgraded. Open each solution in Visual Studio 2015; a dialog offers to upgrade the projects, and accepting it converts them to the VS2015 format:

(1) The winutils project (Windows tools):

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\winutils

(2) The native project (hadoop.dll):

D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\native

This project's build output is hadoop.dll. If you simply click Build, it reports many header-file errors, because the project includes headers from ..\..\..\target\native\javah; during the Maven build those headers are first copied into that folder, and only then can the project compile.

Open a cmd window, change into D:\hadoop\hadoop-3.2.1-src\, and run the following command:

mvn package -Pdist,native-win -DskipTests -Dtar

2.2 Apache Hadoop Common Build Errors

2.2.1 convert-ms-winutils error

Error description:

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 12.916 s]

[INFO] Apache Hadoop Common ............................... FAILURE [02:06 min]

[INFO] Apache Hadoop NFS .................................. SKIPPED

[INFO] Apache Hadoop KMS .................................. SKIPPED

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (convert-ms-winutils) on project hadoop-common: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

[ERROR]

[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.

[ERROR] Re-run Maven using the -X switch to enable full debug logging.

[ERROR]

[ERROR] For more information about the errors and possible solutions, please read the following articles:

[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

[ERROR]

[ERROR] After correcting the problems, you can resume the build with the command

[ERROR]   mvn <args> -rf :hadoop-common

Solution:

(1) Confirm that both projects from 2.1 were upgraded successfully; the .sln files should show Visual Studio 14.

(2) From the VS2015 x64 Native Tools Command Prompt, run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X

 


2.2.2 javah error

Error description:

[ERROR] Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command: Error executing command line. Exit code:1 -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:native-maven-plugin:1.0-alpha-8:javah (default) on project hadoop-common: Error running javah command


Solution:

In the pom.xml under D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common, the setting <javahPath>${env.JAVA_HOME}/bin/javah</javahPath> fails because env.JAVA_HOME is not resolved. Replace it with the absolute path to javah (here D:\Java\jdk1.8.0_181\bin\); there are two occurrences in the file.


Then resume the build from hadoop-common by adding -rf :hadoop-common:

mvn package -Pdist,native-win -DskipTests -Dtar -rf :hadoop-common

2.3 HDFS Native Client Build Error

2.3.1 Error description

[INFO] Apache Hadoop HDFS Native Client ................... FAILURE [02:26 min]

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1

[ERROR] around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

[ERROR] -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: exec returned: 1

around Ant part ...<exec failonerror="true" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="cmake">... @ 5:140 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml


2.3.2 Solution

Open D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\pom.xml and change the true in the relevant section to false.


2.4 hadoop-hdfs-native-client: RelWithDebInfo build error

2.4.1 Error description

Finished at: 2019-12-01T18:20:30+08:00

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist.

[ERROR] around Ant part ...<copy todir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/bin">... @ 13:101 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

[ERROR] -> [Help 1]

 

2.4.2 Solution

The error says that the directory D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native\bin\RelWithDebInfo does not exist, so simply create it.
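A sketch of the fix from a Git Bash or Cygwin shell. TARGET below is a stand-in for D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target (a temp directory is used here so the commands run anywhere):

```shell
# Create the directory the Ant <copy> task expects before resuming the build.
# TARGET stands in for ...\hadoop-hdfs-native-client\target in the real tree.
TARGET="${TARGET:-$(mktemp -d)}"
mkdir -p "$TARGET/native/bin/RelWithDebInfo"
```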

2.5 exec-maven-plugin:1.3.1:exec (pre-dist) failure

2.5.1 Error description

Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (pre-dist) on project hadoop-hdfs-native-client: Command execution failed.

2.5.2 Solution

Download Cygwin from http://www.cygwin.com/setup-x86_64.exe, install it, and add D:\cygwin64\bin to PATH.


Then open a Cygwin64 terminal and run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-hdfs-native-client


2.6 msbuild build error

2.6.1 Error description

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs-native-client: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "msbuild" (in directory "D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\native"): CreateProcess error=2, 系统找不到指定的文件。

[ERROR] around Ant part ...<exec failonerror="false" dir="D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target/native" executable="msbuild">... @ 9:143 in D:\hadoop\hadoop-3.2.1-src\hadoop-hdfs-project\hadoop-hdfs-native-client\target\antrun\build-main.xml

 

2.6.2 Solution

Download and install VS Code (https://code.visualstudio.com/docs/?dv=win), then add C:\Program Files (x86)\MSBuild\14.0\Bin to the Path environment variable (this MSBuild version ships with Visual Studio 2015).

 

2.7 maven-surefire-plugin build error

2.7.1 Error description

Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M1:test (default-test)

2.7.2 Solution

In D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\pom.xml, locate the maven-surefire-plugin entry (groupId org.apache.maven.plugins) and add <testFailureIgnore>true</testFailureIgnore> to its configuration to suppress test failures:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <testFailureIgnore>true</testFailureIgnore>
    <forkCount>${testsThreadCount}</forkCount>
    <reuseForks>false</reuseForks>
    <argLine>${maven-surefire-plugin.argLine} -DminiClusterDedicatedDirs=true</argLine>
    <systemPropertyVariables>
      <testsThreadCount>${testsThreadCount}</testsThreadCount>
      <test.build.data>${test.build.data}/${surefire.forkNumber}</test.build.data>
      <test.build.dir>${test.build.dir}/${surefire.forkNumber}</test.build.dir>
      <hadoop.tmp.dir>${hadoop.tmp.dir}/${surefire.forkNumber}</hadoop.tmp.dir>
      <!-- Due to a Maven quirk, setting this to just -->
      <!-- surefire.forkNumber won't do the parameter substitution. -->
      <!-- Putting a prefix in front of it like "fork-" makes it -->
      <!-- work. -->
      <test.unique.fork.id>fork-${surefire.forkNumber}</test.unique.fork.id>
    </systemPropertyVariables>
  </configuration>
</plugin>

 

2.8 Apache Hadoop Distribution build error

2.8.1 Error description

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (dist) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]


2.8.2 Solution

Run mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist; the -e and -X flags print the detailed error. It turns out to be a directory-layout problem: the path below cannot be found. The fix is to create the missing directory and copy the built jar artifacts into it.

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

cp: cannot stat '/d/hadoop/hadoop-3.2.1-src/hadoop-common-project/hadoop-kms/target/hadoop-kms-3.2.1/*': No such file or directory
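A sketch of the workaround in bash (Git Bash or Cygwin). SRC stands in for /d/hadoop/hadoop-3.2.1-src and the jar name is illustrative; the point is to create the directory the dist script expects and copy the built jars into it:

```shell
# Recreate the layout expected by dist-layout-stitching for hadoop-kms.
# SRC stands in for /d/hadoop/hadoop-3.2.1-src; a temp dir keeps this runnable.
SRC="${SRC:-$(mktemp -d)}"
KMS="$SRC/hadoop-common-project/hadoop-kms/target"
mkdir -p "$KMS"
touch "$KMS/hadoop-kms-3.2.1.jar"       # illustrative built artifact
mkdir -p "$KMS/hadoop-kms-3.2.1"        # the missing directory from the error
cp "$KMS"/*.jar "$KMS/hadoop-kms-3.2.1/"
```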

2.9 hadoop-dist: exec (toolshooks) build error

2.9.1 Error description

Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../hadoop-tools]

找不到文件 - *.tools-builtin.txt

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker: line 137: D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new: No such file or directory

mv: cannot stat 'D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new': No such file or directory

Rewriting D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:

[INFO]

[INFO] Apache Hadoop Distribution ......................... FAILURE [ 25.920 s]

[INFO] Apache Hadoop Client Modules ....................... SKIPPED

[INFO] Apache Hadoop Cloud Storage ........................ SKIPPED

[INFO] Apache Hadoop Cloud Storage Project ................ SKIPPED

[INFO] ------------------------------------------------------------------------

[INFO] BUILD FAILURE

[INFO] ------------------------------------------------------------------------

[INFO] Total time:  33.530 s

[INFO] Finished at: 2019-12-08T11:56:12+08:00

[INFO] ------------------------------------------------------------------------

[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.: Process exited with an error: 1 (Exit value: 1) -> [Help 1]

org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (toolshooks) on project hadoop-dist: Command execution failed.

2.9.2 Solution

(1) The cause is similar to 2.8: folders and files are missing. Create the folders named in the error output and put the missing files into them.

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../hadoop-tools]

找不到文件 - *.tools-builtin.txt

D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-tools-hooks-maker: line 137: D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new: No such file or directory

mv: cannot stat 'D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh.new': No such file or directory

Rewriting D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target/hadoop-3.2.1/etc/hadoop/hadoop-env.sh

(2) Following the messages, the etc/hadoop folder was created and hadoop-env.sh was copied into it from D:\hadoop\hadoop-3.2.1-src\hadoop-common-project\hadoop-common\src\main\conf\, but the build kept deleting the folder and file again. The deletion happens while the following command runs:

[DEBUG] Executing command line: [bash, D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching, 3.2.1, D:\hadoop\hadoop-3.2.1-src\hadoop-dist\target]

Current directory /d/hadoop/hadoop-3.2.1-src/hadoop-dist/target

$ rm -rf hadoop-3.2.1

$ mkdir hadoop-3.2.1

$ cd hadoop-3.2.1

(3) Opening D:\hadoop\hadoop-3.2.1-src\hadoop-dist/../dev-support/bin/dist-layout-stitching reveals the statement run rm -rf "hadoop-${VERSION}": the script deletes the hadoop-3.2.1 folder, recreates it, and then copies files into it, so any etc/hadoop folder added by hand is always wiped out.

(4) Append the following lines at the end of that script so it creates the folder and copies hadoop-env.sh itself:

run mkdir "etc"

run cd "etc"

run mkdir "hadoop"

run cd "hadoop"

run copy  "${ROOT}\hadoop-common-project\hadoop-common\src\main\conf\hadoop-env.sh"


(5) Run: mvn package -Pdist,native-win -DskipTests -Dtar -e -X -rf :hadoop-dist

3 Build Complete

[DEBUG]   (f) siteDirectory = D:\hadoop\hadoop-3.2.1-src\hadoop-cloud-storage-project\src\site

[DEBUG]   (f) skip = false

[DEBUG] -- end configuration --

[INFO] No site descriptor found: nothing to attach.

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary for Apache Hadoop Distribution 3.2.1:

[INFO]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.961 s]

[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  5.721 s]

[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  10:36 h]

[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  1.471 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time:  10:37 h

[INFO] Finished at: 2019-12-09T10:05:07+08:00

[INFO] ------------------------------------------------------------------------

 

 

 


References

https://blog.csdn.net/qq_37475168/article/details/90746823

 
