Compiling Hadoop 2.6.0 on 64-bit CentOS

Hadoop does not ship a pre-built 64-bit release, so a 64-bit version has to be compiled from source. Learning a technology starts with installing it; learning Hadoop starts with compiling it.

1. Operating system build dependencies

yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst

2. Install the JDK

Download JDK 1.7. Note that only 1.7 can be used; other versions cause the build to fail.

tar zxvf jdk-7u75-linux-x64.tar.gz -C /app
export JAVA_HOME=/app/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$JAVA_HOME/bin

3. Install protobuf

Download protobuf-2.5.0. A newer version cannot be used, or the Hadoop build will not pass.

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
ldconfig
protoc --version

4. Install Ant

wget
tar zxvf apache-ant-1.9.4-bin.tar.gz -C /app
vi /etc/profile
export ANT_HOME=/app/apache-ant-1.9.4
PATH=$PATH:$ANT_HOME/bin

5. Install Maven

wget
tar zxvf apache-maven-3.3.1-bin.tar.gz -C /app
vi /etc/profile
export MAVEN_HOME=/app/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin

Edit the Maven configuration file:

vi /app/apache-maven-3.3.1/conf/settings.xml

To change the Maven repository, add the following inside <mirrors></mirrors>:

<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexusosc</name>
  <url></url>
</mirror>

Then add a new profile inside <profiles></profiles>:

<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url></url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url></url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>

6. Install FindBugs (optional)

wget ?download
tar zxvf findbugs-3.0.1.tar.gz -C /app
vi /etc/profile
export FINDBUGS_HOME=/app/findbugs-3.0.1
PATH=$PATH:$FINDBUGS_HOME/bin
export PATH

Note: in the end, the PATH variable in /etc/profile should be set as follows:

PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin
export PATH

Run the following in the shell so the environment variables take effect:

. /etc/profile

7. Compile Hadoop 2.6.0

wget
tar zxvf hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar
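If the build fails with an OutOfMemoryError, one optional tweak (not part of the original steps; the heap sizes below are only illustrative) is to give Maven a larger heap before re-running the command above:

export MAVEN_OPTS="-Xms512m -Xmx1024m"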
The build output ends with a reactor summary similar to the following:

[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ............................. SUCCESS [ 4.401 s]
[INFO] Apache Hadoop Project POM ...................... SUCCESS [ 3.864 s]
[INFO] Apache Hadoop Annotations ...................... SUCCESS [ 7.591 s]
[INFO] Apache Hadoop Assemblies ....................... SUCCESS [ 0.535 s]
[INFO] Apache Hadoop Project Dist POM ................. SUCCESS [ 3.585 s]
[INFO] Apache Hadoop Maven Plugins .................... SUCCESS [ 6.623 s]
[INFO] Apache Hadoop MiniKDC .......................... SUCCESS [ 4.722 s]
[INFO] Apache Hadoop Auth ............................. SUCCESS [ 7.787 s]
[INFO] Apache Hadoop Auth Examples .................... SUCCESS [ 5.500 s]
[INFO] Apache Hadoop Common ........................... SUCCESS [02:47 min]
[INFO] Apache Hadoop NFS .............................. SUCCESS [ 12.793 s]
[INFO] Apache Hadoop KMS .............................. SUCCESS [ 20.443 s]
[INFO] Apache Hadoop Common Project ................... SUCCESS [ 0.111 s]
[INFO] Apache Hadoop HDFS ............................. SUCCESS [04:35 min]
[INFO] Apache Hadoop HttpFS ........................... SUCCESS [ 29.896 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .......... SUCCESS [ 11.100 s]
[INFO] Apache Hadoop HDFS-NFS ......................... SUCCESS [ 8.262 s]
[INFO] Apache Hadoop HDFS Project ..................... SUCCESS [ 0.069 s]
[INFO] hadoop-yarn .................................... SUCCESS [ 0.066 s]
[INFO] hadoop-yarn-api ................................ SUCCESS [02:05 min]
[INFO] hadoop-yarn-common ............................. SUCCESS [ 46.132 s]
[INFO] hadoop-yarn-server ............................. SUCCESS [ 0.123 s]
[INFO] hadoop-yarn-server-common ...................... SUCCESS [ 19.166 s]
[INFO] hadoop-yarn-server-nodemanager ................. SUCCESS [ 25.552 s]
[INFO] hadoop-yarn-server-web-proxy ................... SUCCESS [ 5.456 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ... SUCCESS [ 11.781 s]
[INFO] hadoop-yarn-server-resourcemanager ............. SUCCESS [ 30.557 s]
[INFO] hadoop-yarn-server-tests ....................... SUCCESS [ 9.765 s]
[INFO] hadoop-yarn-client ............................. SUCCESS [ 14.016 s]
[INFO] hadoop-yarn-applications ....................... SUCCESS [ 0.101 s]
[INFO] hadoop-yarn-applications-distributedshell ...... SUCCESS [ 4.116 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher . SUCCESS [ 2.993 s]
[INFO] hadoop-yarn-site ............................... SUCCESS [ 0.093 s]
[INFO] hadoop-yarn-registry ........................... SUCCESS [ 9.036 s]
[INFO] hadoop-yarn-project ............................ SUCCESS [ 6.557 s]
[INFO] hadoop-mapreduce-client ........................ SUCCESS [ 0.267 s]
[INFO] hadoop-mapreduce-client-core ................... SUCCESS [ 36.775 s]
[INFO] hadoop-mapreduce-client-common ................. SUCCESS [ 28.049 s]
[INFO] hadoop-mapreduce-client-shuffle ................ SUCCESS [ 7.285 s]
[INFO] hadoop-mapreduce-client-app .................... SUCCESS [ 17.333 s]
[INFO] hadoop-mapreduce-client-hs ..................... SUCCESS [ 15.283 s]
[INFO] hadoop-mapreduce-client-jobclient .............. SUCCESS [ 7.110 s]
[INFO] hadoop-mapreduce-client-hs-plugins ............. SUCCESS [ 3.843 s]
[INFO] Apache Hadoop MapReduce Examples ............... SUCCESS [ 12.559 s]
[INFO] hadoop-mapreduce ............................... SUCCESS [ 6.331 s]
[INFO] Apache Hadoop MapReduce Streaming .............. SUCCESS [ 45.863 s]
[INFO] Apache Hadoop Distributed Copy ................. SUCCESS [ 46.304 s]
[INFO] Apache Hadoop Archives ......................... SUCCESS [ 3.575 s]
[INFO] Apache Hadoop Rumen ............................ SUCCESS [ 12.991 s]
[INFO] Apache Hadoop Gridmix .......................... SUCCESS [ 10.105 s]
[INFO] Apache Hadoop Data Join ........................ SUCCESS [ 5.021 s]
[INFO] Apache Hadoop Ant Tasks ........................ SUCCESS [ 3.804 s]
[INFO] Apache Hadoop Extras ........................... SUCCESS [ 5.298 s]
[INFO] Apache Hadoop Pipes ............................ SUCCESS [ 10.290 s]
[INFO] Apache Hadoop OpenStack support ................ SUCCESS [ 9.220 s]
[INFO] Apache Hadoop Amazon Web Services support ...... SUCCESS [11:12 min]
[INFO] Apache Hadoop Client ........................... SUCCESS [ 10.714 s]
[INFO] Apache Hadoop Mini-Cluster ..................... SUCCESS [ 0.143 s]
[INFO] Apache Hadoop Scheduler Load Simulator ......... SUCCESS [ 7.664 s]
[INFO] Apache Hadoop Tools Dist ....................... SUCCESS [ 29.970 s]
[INFO] Apache Hadoop Tools ............................ SUCCESS [ 0.057 s]
[INFO] Apache Hadoop Distribution ..................... SUCCESS [ 49.425 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:26 min
[INFO] Finished at: 2015-03-19T19:56:40+08:00
[INFO] Final Memory: 99M/298M
[INFO] ------------------------------------------------------------------------

After the build succeeds, the packaged distribution is placed under hadoop-dist/target:

# ls
antrun                    dist-tar-stitching.sh  hadoop-2.6.0.tar.gz    hadoop-dist-2.6.0-javadoc.jar  maven-archiver
dist-layout-stitching.sh  hadoop-2.6.0           hadoop-dist-2.6.0.jar  javadoc-bundle-options         test-dir
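To confirm that the native libraries in the new package really are 64-bit, a quick sanity check (the path below simply follows the default layout of the generated distribution shown above) is:

file hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0

The output should report an ELF 64-bit x86-64 shared object. Running bin/hadoop checknative -a from inside the unpacked hadoop-2.6.0 directory additionally reports which native codecs (zlib, snappy, lz4, bzip2, openssl) were detected.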
