Compiling Hadoop 2.6.0 on 64-bit CentOS

The hadoop-2.6.0.tar.gz release tarball was built on a 32-bit machine, so 64-bit machines fail to load the bundled native .so libraries, with errors such as:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V

Hadoop therefore needs to be recompiled from source.
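A quick way to confirm that a bundled native library really is 32-bit is to read the ELF class byte of the .so file; a minimal sketch (the Hadoop path in the example is an assumption, adjust it to your install):

```shell
# Print whether a shared library is a 32-bit or 64-bit ELF binary.
# Byte 5 of an ELF file (offset 4) is EI_CLASS: 1 = ELF32, 2 = ELF64.
elf_class() {
  cls=$(od -An -j4 -N1 -tu1 "$1" | tr -d ' ')
  case "$cls" in
    1) echo "32-bit" ;;
    2) echo "64-bit" ;;
    *) echo "unknown (not an ELF file?)" ;;
  esac
}

# Example invocation (path is an assumption):
# elf_class /usr/local/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
```

If this prints "32-bit" for libhadoop.so on a 64-bit machine, that is the cause of the UnsatisfiedLinkError above.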

1. Install the build dependencies

yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst

2. Install the JDK (download JDK 1.7; only 1.7 works, other versions cause build errors)

tar -zxvf jdk-7u75-linux-x64.tar.gz -C /usr/local
export JAVA_HOME=/usr/local/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin

3. Install protobuf

Download protobuf-2.5.0; do not use a newer version, or the Hadoop build will fail.

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz

or download it from Baidu Pan: ?shareid=830873155&uk=3573928349

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
protoc --version

4. Install Ant

wget
tar -zxvf apache-ant-1.9.4-bin.tar.gz -C /usr/local
vi /etc/profile
export ANT_HOME=/usr/local/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin

5. Install Maven

wget
tar -zxvf apache-maven-3.3.1-bin.tar.gz -C /usr/local
vi /etc/profile
export MAVEN_HOME=/usr/local/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin

Edit the Maven configuration file:

vi /usr/local/apache-maven-3.3.1/conf/settings.xml

To change the Maven repository, add the following inside <mirrors></mirrors>:

<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexusosc</name>
  <url></url>
</mirror>

and add a new profile inside <profiles></profiles>:

<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url></url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url></url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>

6. Run the following in the shell so the environment variables take effect:

source /etc/profile

7. Compile Hadoop 2.6.0

wget
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar

If this is the first time Maven is used, it will print many log lines like the following:

Downloading:
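Before kicking off the long Maven build, it can save time to verify that every tool installed in the steps above is actually on the PATH; a minimal sketch:

```shell
# Check that each build prerequisite is reachable; print its location or a warning.
for tool in java mvn ant protoc cmake gcc g++; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool -> $(command -v "$tool")"
  else
    echo "MISSING: $tool"
  fi
done
```

Any "MISSING" line here means a step above was skipped or /etc/profile has not been sourced in the current shell.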

Scanning for projects…

[INFO] Apache Hadoop Main ................................ SUCCESS [  4.590 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [  3.503 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  5.870 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.540 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  3.921 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [  7.731 s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [  6.805 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [  9.008 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  6.991 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [03:12 min]
[INFO] Apache Hadoop NFS ................................. SUCCESS [ 16.557 s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [ 24.476 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.115 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [05:09 min]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [ 40.145 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [ 15.876 s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [  9.236 s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.125 s]
[INFO] hadoop-yarn ....................................... SUCCESS [  0.129 s]
[INFO] hadoop-yarn-api ................................... SUCCESS [02:49 min]
[INFO] hadoop-yarn-common ................................ SUCCESS [01:01 min]
[INFO] hadoop-yarn-server ................................ SUCCESS [  0.099 s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [ 25.019 s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [ 33.655 s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [  5.761 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [ 13.714 s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [ 41.930 s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [ 13.364 s]
[INFO] hadoop-yarn-client ................................ SUCCESS [ 17.408 s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [  0.042 s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [  5.131 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [  3.710 s]
[INFO] hadoop-yarn-site .................................. SUCCESS [  0.107 s]
[INFO] hadoop-yarn-registry .............................. SUCCESS [ 12.531 s]
[INFO] hadoop-yarn-project ............................... SUCCESS [  7.781 s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [  0.116 s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [ 47.915 s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [ 38.104 s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [  9.073 s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [01:01 min]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [ 18.149 s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [  9.002 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [  3.222 s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [ 13.224 s]
[INFO] hadoop-mapreduce .................................. SUCCESS [  6.571 s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [  9.781 s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [ 16.254 s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [  5.302 s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [ 13.760 s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [  8.858 s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [  6.252 s]
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [  4.276 s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [  6.206 s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [  1.945 s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [ 12.239 s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [ 38.137 s]
[INFO] Apache Hadoop Client .............................. SUCCESS [ 13.213 s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [  0.169 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [ 13.206 s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [ 15.248 s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [  0.162 s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [01:09 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:19 min
[INFO] Finished at: 2015-03-26T17:54:10+08:00
[INFO] Final Memory: 106M/402M
[INFO] ------------------------------------------------------------------------

After the long wait, the build completes and the packaged distribution is placed under hadoop-dist/target:

# ll hadoop-dist/target
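To use the rebuilt 64-bit native libraries with an existing installation, one option is to copy lib/native from the build output over the old copies. A minimal sketch; both paths below are assumptions, adjust them to your deployment:

```shell
# Copy the freshly built 64-bit native libraries over an existing install.
# SRC is the default build-output layout; DST is an example install path.
SRC=hadoop-dist/target/hadoop-2.6.0/lib/native
DST=/usr/local/hadoop-2.6.0/lib/native
if [ -d "$SRC" ]; then
  cp -r "$SRC"/. "$DST"/
  echo "native libraries updated in $DST"
else
  echo "build output not found at $SRC"
fi
```

After copying, restart the Hadoop daemons so they load the new libraries.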
