1. System requirements:
   The local system is CentOS 7.
   Source download: http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.0.0-beta1/hadoop-3.0.0-beta1-src.tar.gz
   After downloading, extract the source, enter the root directory, and read BUILDING.txt:
   tar -zxvf hadoop*.tar.gz
   [hadoop@localhost hadoop-3.0.0-beta1-src]$ cat BUILDING.txt
   It lists the following requirements:
   Requirements:
   * Unix System
   * JDK 1.8
   * Maven 3.3 or later
   * ProtocolBuffer 2.5.0
   * CMake 3.1 or newer (if compiling native code)
   * Zlib devel (if compiling native code)
   * openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
   * Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
   * Internet connection for first build (to fetch all Maven and Hadoop dependencies)
   * python (for releasedocs)
   * bats (for shell code testing)
   * Node.js / bower / Ember-cli (for YARN UI v2 building)
   You will need to install:
   JDK 1.8
   protobuf (must be exactly 2.5.0)
   Maven 3.3 or later
   CMake 3.1 or later
   Because we are compiling native code, the Zlib development library is also required.
   There are also a few packages BUILDING.txt does not mention, but whose absence can cause build failures:
   apache-ant
   automake
   autoconf
   findbugs
   For convenience, I have uploaded all of these packages to Gitee; the link is at the end of this article.
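Before installing anything, it can help to see which of these tools are already on PATH. A minimal sketch (the `check_tools` helper is my own, not part of any of these packages; it only reports, it installs nothing):

```shell
# Report which build prerequisites are already on PATH.
# Nothing is installed here; this only checks for existing tools.
check_tools() {
    for tool in "$@"; do
        if command -v "$tool" >/dev/null 2>&1; then
            printf '%-10s found at %s\n' "$tool" "$(command -v "$tool")"
        else
            printf '%-10s MISSING\n' "$tool"
        fi
    done
}

check_tools java mvn cmake protoc ant findbugs automake autoconf
```

Anything reported MISSING is covered by one of the installation steps below.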
  
2. Install JDK 1.8
   Download JDK 1.8 from the Oracle site (http://www.oracle.com/technetwork/java/javase/downloads/index.html)
   After downloading, extract it to /opt:
   tar -zxvf jdk1.8**.tar.gz -C /opt/
   Configure the Java environment variables:
   JAVA_HOME=/opt/jdk1.8**
   CLASSPATH=$JAVA_HOME/lib/
   PATH=$PATH:$JAVA_HOME/bin
   export PATH JAVA_HOME CLASSPATH
  
   
3. Update the local CMake
   Check the locally installed CMake version:
   [root@localhost hadoop]# cmake --version
   cmake version 2.8.12.2
   This is below the required minimum of 3.1.
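The "too low" judgment can be scripted. Here is a small sketch (the `version_ge` helper is my own, not part of CMake) that compares the installed version against the 3.1 minimum using `sort -V`:

```shell
# version_ge A B: succeeds (exit 0) when version A >= version B,
# using sort -V for natural version-number ordering.
version_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

required=3.1
# Third field of the first output line, e.g. "cmake version 2.8.12.2".
installed=$(cmake --version 2>/dev/null | head -n1 | awk '{print $3}')
if version_ge "${installed:-0}" "$required"; then
    echo "cmake $installed is new enough"
else
    echo "cmake ${installed:-(not found)} is too old; need >= $required"
fi
```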
   
   So here we build and install CMake 3.3.2:
   wget https://cmake.org/files/v3.3/cmake-3.3.2.tar.gz
   Extract it to /opt:
   tar -zxvf cmake-3.3.2.tar.gz -C /opt/
   Build and install:
   ./configure
   make
   make install
   On success, set the environment variable by adding the following to /etc/profile:
   export PATH=/opt/cmake-3.3.2/bin:$PATH
   Then source the profile:
   source /etc/profile
   Check the current CMake version:
   cmake --version
   If you see output like the following, the installation succeeded:
   [root@localhost cmake-3.3.2]# cmake --version
   cmake version 3.3.2
   CMake suite maintained and supported by Kitware (kitware.com/cmake).
  
4. Install findbugs
   findbugs-3.0.1 download: http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download
   After downloading, extract it to /opt:
   tar -zxvf ***.tar.gz -C /opt/
   Configure the environment variables:
   export FINDBUGS_HOME=/opt/findbugs-3.0.1
   export PATH=$PATH:$FINDBUGS_HOME/bin
  
5. Install ant
   apache-ant download: http://mirror.bit.edu.cn/apache//ant/binaries/apache-ant-1.9.9-bin.zip
   After downloading, extract it to /opt:
   unzip apache-ant*.zip -d /opt/
   Configure the ant environment variables:
   export ANT_HOME=/opt/apache-ant-1.9.9
   export PATH=$PATH:$ANT_HOME/bin
  
6. Install Maven 3.5.2
   Maven 3.5.2 download: http://mirror.bit.edu.cn/apache/maven/maven-3/3.5.2/binaries/apache-maven-3.5.2-bin.tar.gz
   After downloading, extract it to /opt:
   tar -zxvf ***.tar.gz -C /opt/
   Configure the environment variables:
   export MAVEN_HOME=/opt/apache-maven-3.5.2
   export MAVEN_OPTS="-Xms256m -Xmx512m"
   export PATH=$PATH:$MAVEN_HOME/bin
  
7. Install automake and autoconf
   yum install automake autoconf -y
   One command installs both, though these two may already be present on the system.
  
8. Install protobuf-2.5.0
   Download protobuf-2.5.0.tar.gz (a download link is given at the end of this article).
   Extract it to /opt:
   tar -zxvf ***.tar.gz -C /opt/
   Enter the extracted directory, then build and install:
   ./configure
   make
   make install
   Test whether the installation succeeded:
   protoc --version
   If you see the version number, the installation succeeded.
  
Next, source /etc/profile so that all of the settings above take effect:
source /etc/profile
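For reference, with the install locations used in the steps above, the combined additions to /etc/profile look roughly like this. The JDK directory name below is a placeholder, since it depends on the exact JDK build you downloaded; substitute your actual extracted directory:

```shell
# Consolidated /etc/profile additions for the Hadoop build toolchain.
export JAVA_HOME=/opt/jdk1.8.0_151          # placeholder; use your actual JDK directory
export CLASSPATH=$JAVA_HOME/lib/
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export ANT_HOME=/opt/apache-ant-1.9.9
export MAVEN_HOME=/opt/apache-maven-3.5.2
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=/opt/cmake-3.3.2/bin:$PATH:$JAVA_HOME/bin:$FINDBUGS_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin
```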
9. Build Hadoop 3.0.0 with Maven
   Enter the Hadoop source root directory and run:
   mvn package -Pdist,native -DskipTests -Dtar
   If you are lucky, you will see the following result:
   [INFO] Reactor Summary:
   [INFO]
   [INFO] Apache Hadoop Main ................................. SUCCESS [  7.003 s]
   [INFO] Apache Hadoop Build Tools .......................... SUCCESS [  8.003 s]
   [INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.948 s]
   [INFO] Apache Hadoop Annotations .......................... SUCCESS [  4.626 s]
   [INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.245 s]
   [INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  4.155 s]
   [INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  7.021 s]
   [INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  2.951 s]
   [INFO] Apache Hadoop Auth ................................. SUCCESS [  9.341 s]
   [INFO] Apache Hadoop Common ............................... SUCCESS [01:08 min]
   [INFO] Apache Hadoop HDFS ................................. SUCCESS [01:23 min]
   [INFO] ... (the remaining HDFS, YARN, MapReduce, and tools modules, all SUCCESS) ...
   [INFO] Apache Hadoop Distribution ......................... SUCCESS [ 50.456 s]
   [INFO] Apache Hadoop Client Modules ....................... SUCCESS [  0.734 s]
   [INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  4.236 s]
   [INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.025 s]
   [INFO] ------------------------------------------------------------------------
   [INFO] BUILD SUCCESS
   [INFO] ------------------------------------------------------------------------
   [INFO] Total time: 16:39 min
   [INFO] Finished at: 2017-11-22T23:08:06+08:00
   [INFO] Final Memory: 169M/494M
   [INFO] ------------------------------------------------------------------------
   This indicates the build completed successfully.
   The packaged Hadoop tarball is under hadoop-dist/target:
   [root@localhost target]# ll
   total 256112
   drwxr-xr-x. 2 root root        28 Nov 22 23:07 antrun
   drwxr-xr-x. 3 root root        22 Nov 22 23:07 classes
   drwxr-xr-x. 9 root root       149 Nov 22 23:07 hadoop-3.0.0-beta1
   -rw-r--r--. 1 root root 262258495 Nov 22 23:07 hadoop-3.0.0-beta1.tar.gz
   drwxr-xr-x. 2 root root        33 Nov 22 23:02 hadoop-tools-deps
   drwxr-xr-x. 3 root root        22 Nov 22 23:07 maven-shared-archive-resources
   drwxr-xr-x. 3 root root        22 Nov 22 23:07 test-classes
   drwxr-xr-x. 2 root root         6 Nov 22 23:07 test-dir
   [root@localhost target]# pwd
   /home/hadoop/hadoop-3.0.0-beta1-src/hadoop-dist/target
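As a quick sanity check on the artifact, you can list the top-level entries inside the tarball without extracting it. A small sketch (the `list_top` helper is my own):

```shell
# Print the unique top-level entries inside a .tar.gz without extracting it.
list_top() {
    tar -tzf "$1" | awk -F/ '{print $1}' | sort -u
}

tarball=hadoop-dist/target/hadoop-3.0.0-beta1.tar.gz
if [ -f "$tarball" ]; then
    list_top "$tarball"     # should show a single hadoop-3.0.0-beta1 directory
else
    echo "tarball not found: $tarball"
fi
```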
  
   Summary:
   In my view, building Hadoop is not particularly difficult, provided you are reasonably comfortable with Linux. Read the documentation shipped with the source before building, and make good use of Google when something goes wrong. There are no major obstacles; it just takes some time. The whole build took me about two hours.
  
   Download link for the build-environment packages (Gitee): https://gitee.com/nanxun/hadoop3.0HuanJingBao