Backend: Compiling Hadoop 3.3.4 on CentOS 7


1. Background

I have been learning Hadoop recently, and this article briefly documents how to compile Hadoop from source. The reason for recompiling the Hadoop source is to match the native library environment of the operating system you are running on.

2. Compiling the Source

2.1 Download and Extract the Source

[root@hadoop01 ~]# mkdir /opt/hadoop
[root@hadoop01 ~]# cd /opt/hadoop/
[root@hadoop01 hadoop]# wget https://archive.apache.org/dist/hadoop/common/hadoop-3.3.4/hadoop-3.3.4-src.tar.gz
[root@hadoop01 hadoop]# tar -zxvf hadoop-3.3.4-src.tar.gz
[root@hadoop01 hadoop]# rm -rvf hadoop-3.3.4-src.tar.gz

2.2 Check the Environment Required to Build Hadoop

[root@hadoop01 hadoop]# pwd
/opt/hadoop
[root@hadoop01 hadoop]# cd hadoop-3.3.4-src/
[root@hadoop01 hadoop-3.3.4-src]# cat BUILDING.txt
Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.8
* Maven 3.3 or later
* Protocol Buffers 3.7.1 (if compiling native code)
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* Cyrus SASL devel (if compiling native code)
* One of the compilers that support thread_local storage: GCC 4.8.1 or later, Visual Studio,
  Clang (community version), Clang (version for iOS 9 and later) (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Doxygen (if compiling libhdfspp and generating the documents)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* bats (for shell code testing)
* Node.js / bower / Ember-cli (for YARN UI v2 building)

----------------------------------------------------------------------------------

You can see that the build requires JDK 1.8, Maven 3.3 or later, and so on.

2.3 Install the JDK

Pay attention to the JDK version; refer to this document: https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions
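
A minimal sketch of the JDK setup, assuming a JDK 8 has already been unpacked to /usr/local/jdk8 (the path that shows up later in the mvn -version output); adjust the path to wherever your JDK actually lives:

[root@hadoop01 hadoop]# vim /etc/profile
# Configure the JDK (the /usr/local/jdk8 location is an assumption)
export JAVA_HOME=/usr/local/jdk8
export PATH=${JAVA_HOME}/bin:$PATH
[root@hadoop01 hadoop]# source /etc/profile
# Verify the JDK is on the PATH
[root@hadoop01 hadoop]# java -version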

2.4 Install Maven

[root@hadoop01 hadoop]# wget https://dlcdn.apache.org/maven/maven-3/3.9.0/binaries/apache-maven-3.9.0-bin.tar.gz
[root@hadoop01 hadoop]# tar -zxvf apache-maven-3.9.0-bin.tar.gz -C /usr/local
# Edit the environment variables
[root@hadoop01 hadoop]# vim /etc/profile
# Configure Maven
export M2_HOME=/usr/local/apache-maven-3.9.0
export PATH=${M2_HOME}/bin:$PATH
[root@hadoop01 hadoop]# source /etc/profile
# Check the Maven version
[root@hadoop01 apache-maven-3.9.0]# mvn -version
Apache Maven 3.9.0 (9b58d2bad23a66be161c4664ef21ce219c2c8584)
Maven home: /usr/local/apache-maven-3.9.0
Java version: 1.8.0_333, vendor: Oracle Corporation, runtime: /usr/local/jdk8/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "5.11.12-300.el7.aarch64", arch: "aarch64", family: "unix"
# Configure the Aliyun mirror to speed up downloads
[root@hadoop01 hadoop]# vim /usr/local/apache-maven-3.9.0/conf/settings.xml
<mirrors>
     <mirror>
           <id>alimaven</id>
           <name>aliyun maven</name>
           <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
           <mirrorOf>central</mirrorOf>
      </mirror>
</mirrors>

2.5 Install Build Dependencies

[root@hadoop01 hadoop]# yum install gcc gcc-c++ make autoconf automake libtool curl lzo-devel zlib-devel openssl openssl-devel ncurses-devel snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop libXtst zlib doxygen cyrus-sasl* saslwrapper-devel* -y

2.6 Install CMake

Hadoop requires "CMake 3.1 or newer (if compiling native code)", that is, CMake 3.1 or above. CMake is only needed when compiling the native code.

# Remove any existing cmake
[root@hadoop01 hadoop]# yum erase cmake
# Download cmake (pick the package that matches your own system)
[root@hadoop01 hadoop]# wget https://github.com/Kitware/CMake/releases/download/v3.25.2/cmake-3.25.2.tar.gz
[root@hadoop01 hadoop]# tar -zxvf cmake-3.25.2.tar.gz
# Build and install cmake
[root@hadoop01 hadoop]# cd cmake-3.25.2/ && ./configure && make && make install
# Check the cmake version
[root@hadoop01 cmake-3.25.2]# cmake -version
cmake version 3.25.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).

2.7 Install Snappy

* Snappy compression (only used for hadoop-mapreduce-client-nativetask)

# Remove any previously installed snappy
[root@hadoop01 hadoop]# rm -rf /usr/local/lib/libsnappy* && rm -rf /lib64/libsnappy*
[root@hadoop01 hadoop]# wget https://src.fedoraproject.org/repo/pkgs/snappy/snappy-1.1.3.tar.gz/7358c82f133dc77798e4c2062a749b73/snappy-1.1.3.tar.gz
[root@hadoop01 snappy]# tar -zxvf snappy-1.1.3.tar.gz
[root@hadoop01 snappy]# cd snappy-1.1.3/ && ./configure && make && make install
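
A quick sanity check, assuming the default autotools install prefix of /usr/local (the same /usr/local/lib that -Dsnappy.lib points at in section 2.9):

# List the snappy shared libraries that make install produced
[root@hadoop01 snappy-1.1.3]# ls -l /usr/local/lib | grep snappy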

2.8 Install Protocol Buffers

* Protocol Buffers 3.7.1 (if compiling native code): install version 3.7.1

[root@hadoop01 hadoop]# wget https://github.com/protocolbuffers/protobuf/releases/download/v3.7.1/protobuf-java-3.7.1.tar.gz
[root@hadoop01 hadoop]# tar -zxvf protobuf-java-3.7.1.tar.gz
# Build and install
[root@hadoop01 hadoop]# cd protobuf-3.7.1/ && ./autogen.sh && ./configure && make && make install
# Verify the installation succeeded
[root@hadoop01 protobuf-3.7.1]# protoc --version
libprotoc 3.7.1

2.9 Compile Hadoop

[root@hadoop01 hadoop-3.3.4-src]# pwd
/opt/hadoop/hadoop-3.3.4-src
[root@hadoop01 hadoop-3.3.4-src]# export MAVEN_OPTS="-Xms3072m -Xmx3072m" && mvn clean package -Pdist,native -DskipTests -Dtar -Dbundle.snappy -Dsnappy.lib=/usr/local/lib -e

The mvn command used here can also be found in the BUILDING.txt file.
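
A rough breakdown of the options, based on the build option descriptions in BUILDING.txt and standard Maven behavior:

# -Pdist,native      build the binary distribution together with the native libraries
# -DskipTests        skip the unit tests to speed up the build
# -Dtar              package the distribution as a tar.gz
# -Dbundle.snappy    copy the snappy libraries into the final tar
# -Dsnappy.lib=...   where to find libsnappy (here /usr/local/lib, from section 2.7)
# -e                 print full stack traces if the build fails
# MAVEN_OPTS         gives Maven a 3 GB heap so the build does not run out of memory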

2.10 Path of the Built Package

hadoop-3.3.4-src/hadoop-dist/target/hadoop-3.3.4.tar.gz
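
A quick check that the tarball was actually produced (a simple ls from the source root):

[root@hadoop01 hadoop-3.3.4-src]# ls -lh hadoop-dist/target/hadoop-3.3.4.tar.gz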

2.11 Check the Native Libraries

[root@hadoop01 hadoop]# tar -zxvf hadoop-3.3.4.tar.gz
[root@hadoop01 hadoop]# cd hadoop-3.3.4/bin
[root@hadoop01 bin]# ./hadoop checknative -a
2023-02-18 16:58:39,698 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
2023-02-18 16:58:39,700 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2023-02-18 16:58:39,700 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
2023-02-18 16:58:39,760 INFO nativeio.NativeIO: The native code was built without PMDK support.
Native library checking:
hadoop:  true /opt/hadoop/hadoop-3.3.4/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
zstd  :  false
bzip2:   true /lib64/libbz2.so.1
openssl: true /lib64/libcrypto.so
ISA-L:   false libhadoop was built without ISA-L support
PMDK:    false The native code was built without PMDK support.
2023-02-18 16:58:39,764 INFO util.ExitUtil: Exiting with status 1: ExitException

You can see that several entries above are still false, but this does not affect using Hadoop. If you want to fix them, install the corresponding dependencies and then recompile Hadoop.
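
For example, a sketch for adding zstd support, assuming libzstd-devel is available from EPEL on CentOS 7 and using the -Drequire.zstd option described in BUILDING.txt:

# Install the zstd development headers (package names assume EPEL is enabled)
[root@hadoop01 hadoop]# yum install -y epel-release && yum install -y libzstd libzstd-devel
# Rebuild; -Drequire.zstd makes the build fail if zstd support cannot be compiled in
[root@hadoop01 hadoop-3.3.4-src]# mvn clean package -Pdist,native -DskipTests -Dtar -Drequire.zstd

ISA-L support similarly requires the isa-l library to be installed before recompiling.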

3. References

1. https://www.vvave.net/archives/how-to-build-hadoop-334-native-libraries-full-kit-on-amd64.html
2. https://cwiki.apache.org/confluence/display/HADOOP/Hadoop+Java+Versions
