Building Hadoop 2.7.3 from Source

Prerequisites and Environment Setup

Download the hadoop-2.7.3 source:
https://archive.apache.org/dist/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz

Install dependency libraries

sudo yum install ncurses-devel gcc gcc-c++
sudo yum install autoconf automake libtool cmake 
sudo yum install lzo-devel zlib-devel bzip2
sudo yum install openssl-devel openssl-libs

Install protobuf

Download the 2.5.0 release:
https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz

$ tar zxvf protobuf-2.5.0.tar.gz  
$ mv protobuf-2.5.0 /usr/local/ 
$ cd /usr/local/protobuf-2.5.0 
$ ./configure 
$ make && make install 
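
After installation, it is worth confirming that the protoc binary Maven will pick up is exactly 2.5.0, since Hadoop 2.7.x pins that version and aborts the build with any other. A quick check (assuming protoc was installed onto the default PATH):

```shell
# Hadoop 2.7.x hard-codes protobuf 2.5.0; any other version aborts the build
# with a "protoc version ... expected version is '2.5.0'" error.
protoc --version    # should print: libprotoc 2.5.0

# If protoc complains that libprotobuf.so cannot be found, refresh the
# dynamic linker cache after make install:
sudo ldconfig
```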

Install cmake 2.6+:

Online installation of cmake:

sudo yum install cmake

Prerequisite: gcc, gcc-c++, and ncurses-devel must be installed first

sudo yum install gcc gcc-c++ ncurses-devel

Check that cmake was installed successfully:

$ cmake
$ cmake --version

Install snappy

Snappy is required when compiling the native code.
Download snappy 1.1.3 and build/install it:
https://github.com/google/snappy/releases/download/1.1.3/snappy-1.1.3.tar.gz

$ sudo tar xf snappy-1.1.3.tar.gz -C /usr/local
$ cd /usr/local/snappy-1.1.3 
$ ./configure 
$ make && make install  

After installation, check the result:

[root@localhost snappy-1.1.3]# ls -lh /usr/local/lib | grep snappy
-rw-r--r--. 1 root root 615K Jan 17 16:01 libsnappy.a
-rwxr-xr-x. 1 root root  955 Jan 17 16:01 libsnappy.la
lrwxrwxrwx. 1 root root   18 Jan 17 16:01 libsnappy.so -> libsnappy.so.1.3.1
lrwxrwxrwx. 1 root root   18 Jan 17 16:01 libsnappy.so.1 -> libsnappy.so.1.3.1
-rwxr-xr-x. 1 root root 304K Jan 17 16:01 libsnappy.so.1.3.1

After installation, the generated shared and static libraries live under /usr/local/lib, and the header files needed for development live under /usr/local/include. Note that you need to cp these library files into /usr/lib; otherwise, even if you add -L/usr/local/lib at link time, you will still get a runtime error: ./main: error while loading shared libraries: libsnappy.so.1:
cannot open shared object file: No such file or directory
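
Instead of copying the libraries into /usr/lib, a cleaner alternative (a sketch, assuming a standard glibc/ldconfig setup; the file name under /etc/ld.so.conf.d/ is arbitrary) is to register /usr/local/lib with the dynamic linker:

```shell
# Tell the dynamic linker about /usr/local/lib instead of copying files.
echo "/usr/local/lib" | sudo tee /etc/ld.so.conf.d/usr-local.conf
sudo ldconfig

# Confirm libsnappy is now resolvable at runtime:
ldconfig -p | grep snappy
```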

Download and configure Maven

Download apache-maven-3.3.9.
You can optionally point it at the Aliyun mirror (this build actually used the default Maven Central repository):
vim apache-maven-3.3.9/conf/settings.xml

<mirrors>
    <!-- Aliyun mirror -->
    <mirror>
        <id>alimaven</id>
        <mirrorOf>central</mirrorOf>
        <name>aliyun maven</name>
        <url>http://maven.aliyun.com/nexus/content/repositories/central/</url>
    </mirror>
</mirrors>

Download and configure Ant

Download apache-ant-1.10.1 and set the Ant environment variables; see the environment variable configuration below.

Environment variable configuration

## JDK configuration
export JAVA_HOME=/usr/java/jdk1.8.0_45
export CLASSPATH=$JAVA_HOME/lib:$JAVA_HOME/jre/lib:.
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH

## Maven configuration
export MAVEN_HOME=/mnt/soft-backup/SOFT/devtools/mavenSet/apache-maven-3.3.9
export PATH=$MAVEN_HOME/bin:$PATH
export M2_HOME=$MAVEN_HOME

## Ant configuration
export ANT_HOME=/mnt/soft-backup/SOFT/devtools/buiding/apache-ant-1.10.1
export PATH=$PATH:$ANT_HOME/bin
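
After appending the exports above to your shell profile (e.g. /etc/profile or ~/.bashrc), reload it and confirm each tool resolves to the expected version:

```shell
source /etc/profile   # or: source ~/.bashrc, wherever the exports were added
java -version         # expect 1.8.0_45
mvn -v                # expect Apache Maven 3.3.9
ant -version          # expect Apache Ant 1.10.1
```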

Build

  • Package
    mvn clean package -DskipTests

  • Install into the local repository
    mvn clean install -DskipTests

  • Build the distribution (without native libraries)
    mvn clean package -DskipTests -Pdist -Dtar

  • Build the distribution (with native libraries)
    mvn clean package -DskipTests -Pdist,native -Dtar

Note:
When building the binary distribution, the release tarball is located at hadoop/hadoop-dist/target/hadoop-2.7.3.tar.gz after a successful build.
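
If you built with the native profile, you can verify the native libraries inside the unpacked distribution with Hadoop's built-in checknative command (the /tmp extraction path here is just an example):

```shell
# Unpack the freshly built distribution and inspect its native libraries.
tar xf hadoop-dist/target/hadoop-2.7.3.tar.gz -C /tmp
/tmp/hadoop-2.7.3/bin/hadoop checknative -a
# A successful native build reports hadoop, zlib, snappy, lz4, bzip2,
# and openssl as "true", along with the resolved library paths.
```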

Result:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  1.278 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  1.264 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  1.017 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.939 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.328 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.670 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.434 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.854 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  5.480 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.686 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:39 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  6.373 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [06:56 min]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.042 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [02:43 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 48.137 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [  5.578 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.336 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.042 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.039 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 34.300 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 43.067 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.046 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 12.388 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 21.126 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.920 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  8.933 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 31.412 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  4.802 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  6.483 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  4.677 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.572 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.984 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.431 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.046 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  6.800 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 39.523 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.252 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 30.617 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 18.328 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  4.297 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 14.719 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  6.102 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 13.940 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.872 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  7.894 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 25.867 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  4.595 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 15.949 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.933 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  8.079 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  5.082 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.260 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  1.912 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.826 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.032 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  5.213 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  4.694 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  6.842 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 11.917 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.067 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.651 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 14.084 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.032 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 53.803 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 21:05 min
[INFO] Finished at: 2018-01-22T10:30:41+08:00
[INFO] Final Memory: 114M/667M
[INFO] ------------------------------------------------------------------------

QA

  1. OpensslCipher.c:256:14: error: dereferencing pointer to incomplete type ‘EVP_CIPHER_CTX {aka struct evp_cipher_ctx_st}’
     [exec] [ 19%] Building C object CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o
     [exec] /usr/bin/cc  -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/target/native/javah -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/src -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/target/native -I/usr/java/jdk1.8.0_45/include -I/usr/java/jdk1.8.0_45/include/linux -I/usr/local/include -I/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/util  -g -Wall -O2 -D_REENTRANT -D_GNU_SOURCE -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64   -o CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o   -c /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c
     [exec] /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c: In function ‘check_update_max_output_len’:
     [exec] /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:256:14: error: dereferencing pointer to incomplete type ‘EVP_CIPHER_CTX {aka struct evp_cipher_ctx_st}’
     [exec]    if (context->flags & EVP_CIPH_NO_PADDING) {
     [exec]               ^~
     [exec] /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:275:1: warning: control reaches end of non-void function [-Wreturn-type]
     [exec]  }
     [exec]  ^
     [exec] /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c: In function ‘check_doFinal_max_output_len’:
     [exec] /mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/src/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c:320:1: warning: control reaches end of non-void function [-Wreturn-type]
     [exec]  }
     [exec]  ^
     [exec] make[2]: *** [CMakeFiles/hadoop_static.dir/build.make:231:CMakeFiles/hadoop_static.dir/main/native/src/org/apache/hadoop/crypto/OpensslCipher.c.o] Error 1
     [exec] make[1]: *** [CMakeFiles/Makefile2:105:CMakeFiles/hadoop_static.dir/all] Error 2
     [exec] make: *** [Makefile:84:all] Error 2
     [exec] make[2]: Leaving directory ‘/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/target/native’
     [exec] make[1]: Leaving directory ‘/mnt/soft-backup/Repository_git/hadoop/hadoop-common-project/hadoop-common/target/native’

Fix:

This is an OpenSSL version problem: the OpenSSL shipped with Fedora 27 is too new (the 1.1.x series made EVP_CIPHER_CTX an opaque struct, so member access like context->flags no longer compiles). On CentOS 6 or RHEL 6, the openssl and openssl-devel packages are version 1.0.1, which compiles cleanly.
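
Before retrying the build, confirm which OpenSSL the native code will compile against; the 1.0.x series works with this source tree, while 1.1.x triggers the error above. A quick check on RPM-based systems:

```shell
openssl version               # 1.0.x is fine; 1.1.x hits this error
rpm -q openssl openssl-devel  # the headers in openssl-devel are what matter
```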

    Original author: Kigo
    Original article: https://www.jianshu.com/p/4a0cbda79f35
    This article was reposted from the web to share knowledge; in case of infringement, please contact the blogger for removal.