
Snappy installation errors

With Flume configured to use gzip, the error is:

Caused by: java.lang.IllegalArgumentException: SequenceFile doesn't work with GzipCodec without native-hadoop code!

Configured with snappy, the error becomes:

Caused by: java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.

In other words: GzipCodec cannot find its native library, and the libhadoop currently in use was built without snappy support.
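The errors above come from a Flume HDFS sink writing SequenceFiles through a compression codec. A minimal sketch of such a sink configuration that triggers this code path (the agent/sink names a1 and k1 and the HDFS path are placeholders, not taken from the original setup):

```properties
# Hypothetical Flume HDFS sink; hdfs.codeC selects the compression codec
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events
a1.sinks.k1.hdfs.fileType = SequenceFile
a1.sinks.k1.hdfs.codeC = snappy
```

Switching hdfs.codeC between gzip and snappy is what produced the two exceptions above; either way, the codec needs a working native libhadoop.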

Since both errors point at the native libraries, the fix is to recompile the Hadoop source and replace the production native libraries with the ones produced by the build.

Many articles online never state which versions they used, and most of them simply do not work. When building from source, always read BUILDING.txt first; it spells out exactly which software has to be installed.

        

[email protected]:~/hadoop-2.7.1-src$ cat BUILDING.txt

Build instructions for Hadoop

----------------------------------------------------------------------

Requirements:

* Unix System

* JDK 1.7+

* Maven 3.0 or later

* Findbugs 1.3.9 (if running findbugs)

* ProtocolBuffer 2.5.0

* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac

* Zlib devel (if compiling native code)

* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )

* Jansson C XML parsing library ( if compiling libwebhdfs )

* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )

* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

        

BUILDING.txt contains plenty of other useful information as well; in short, it is the single most important file to read before building.
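Before starting, it can save time to verify that the toolchain from the requirements list is actually on the PATH (a quick sketch; extend the list of tools as needed):

```shell
# Report which build prerequisites from BUILDING.txt are installed
for tool in gcc g++ javac mvn protoc cmake; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: ok"
  else
    echo "$tool: MISSING"
  fi
done
```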

The rough installation steps are as follows, for reference:
        

yum -y install gcc-c++

  • Install Maven

tar -zxvf apache-maven-3.3.3-bin.tar.gz  # extract

mv apache-maven-3.3.3 maven  # rename

vi /etc/profile  # add the environment variables

MAVEN_HOME=/usr/java/maven

PATH=$MAVEN_HOME/bin:$PATH

export MAVEN_HOME



Save and exit.

source /etc/profile  # apply the changes

mvn -v  # check the Maven version

 

  • Install ProtocolBuffer


tar -zxvf protobuf-2.5.0.tar.gz

mv protobuf-2.5.0 protobuf

cd protobuf

./configure --prefix=/usr/local/protobuf

make

make check

make install

vi /etc/profile  # add the environment variables

PROTOBUF_HOME=/usr/local/protobuf

PATH=$PROTOBUF_HOME/bin:$PATH

export PROTOBUF_HOME


Save and exit.


source /etc/profile  # apply the changes

protoc --version  # check the version


  • Install CMake

yum -y install cmake

 

  • Install zlib

yum -y install zlib zlib-devel


  • Install openssl-devel

yum -y install openssl-devel


  • Install snappy

tar -zxvf snappy-1.1.3.tar.gz

mv snappy-1.1.3 snappy

cd snappy

./configure

make

make install

 


  • Build the hadoop-2.7.1-src source code

    tar -zxvf hadoop-2.7.1-src.tar.gz

    cd hadoop-2.7.1-src


mvn package -Pdist,native -DskipTests -Dtar -Dbundle.snappy -Dsnappy.lib=/usr/local/lib


Or run a clean first:


mvn clean package -Pdist,native -DskipTests -Dtar  -Drequire.snappy  -Dbundle.snappy -Dsnappy.lib=/usr/local/lib

mvn clean package  -X -Pdist,native -DskipTests -Dtar  -Drequire.snappy  -Dbundle.snappy -Dsnappy.lib=/usr/local/lib

The build eventually succeeds with a long list of SUCCESS lines; it took about 15 minutes.

[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---

[INFO] Building jar: /home/hadoop/hadoop-2.7.1-src/hadoop-dist/target/hadoop-dist-2.7.1-javadoc.jar

[INFO] ------------------------------------------------------------------------

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main …………………………… SUCCESS [  4.404 s]

[INFO] Apache Hadoop Project POM …………………….. SUCCESS [  3.943 s]

[INFO] Apache Hadoop Annotations …………………….. SUCCESS [  5.518 s]

[INFO] Apache Hadoop Assemblies ……………………… SUCCESS [  0.182 s]

[INFO] Apache Hadoop Project Dist POM ………………… SUCCESS [  6.141 s]

[INFO] Apache Hadoop Maven Plugins …………………… SUCCESS [  2.640 s]

[INFO] Apache Hadoop MiniKDC ………………………… SUCCESS [  2.311 s]

[INFO] Apache Hadoop Auth …………………………… SUCCESS [  2.277 s]

[INFO] Apache Hadoop Auth Examples …………………… SUCCESS [  2.704 s]

[INFO] Apache Hadoop Common …………………………. SUCCESS [02:11 min]

[INFO] Apache Hadoop NFS ……………………………. SUCCESS [  6.156 s]

[INFO] Apache Hadoop KMS ……………………………. SUCCESS [ 14.329 s]

[INFO] Apache Hadoop Common Project ………………….. SUCCESS [  0.452 s]

[INFO] Apache Hadoop HDFS …………………………… SUCCESS [04:37 min]

[INFO] Apache Hadoop HttpFS …………………………. SUCCESS [ 23.211 s]

[INFO] Apache Hadoop HDFS BookKeeper Journal ………….. SUCCESS [  5.884 s]

[INFO] Apache Hadoop HDFS-NFS ……………………….. SUCCESS [  8.351 s]

[INFO] Apache Hadoop HDFS Project ……………………. SUCCESS [  0.113 s]

[INFO] hadoop-yarn …………………………………. SUCCESS [  0.074 s]

[INFO] hadoop-yarn-api ……………………………… SUCCESS [02:31 min]

[INFO] hadoop-yarn-common …………………………… SUCCESS [ 29.533 s]

[INFO] hadoop-yarn-server …………………………… SUCCESS [  0.216 s]

[INFO] hadoop-yarn-server-common …………………….. SUCCESS [  9.086 s]

[INFO] hadoop-yarn-server-nodemanager ………………… SUCCESS [ 12.703 s]

[INFO] hadoop-yarn-server-web-proxy ………………….. SUCCESS [  1.688 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ……. SUCCESS [  4.188 s]

[INFO] hadoop-yarn-server-resourcemanager …………….. SUCCESS [ 12.948 s]

[INFO] hadoop-yarn-server-tests ……………………… SUCCESS [  3.110 s]

[INFO] hadoop-yarn-client …………………………… SUCCESS [  3.472 s]

[INFO] hadoop-yarn-server-sharedcachemanager ………….. SUCCESS [  1.811 s]

[INFO] hadoop-yarn-applications ……………………… SUCCESS [  0.053 s]

[INFO] hadoop-yarn-applications-distributedshell ………. SUCCESS [  1.309 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ….. SUCCESS [  0.987 s]

[INFO] hadoop-yarn-site …………………………….. SUCCESS [  0.060 s]

[INFO] hadoop-yarn-registry …………………………. SUCCESS [  3.070 s]

[INFO] hadoop-yarn-project ………………………….. SUCCESS [  5.895 s]

[INFO] hadoop-mapreduce-client ………………………. SUCCESS [  0.062 s]

[INFO] hadoop-mapreduce-client-core ………………….. SUCCESS [ 16.904 s]

[INFO] hadoop-mapreduce-client-common ………………… SUCCESS [ 15.219 s]

[INFO] hadoop-mapreduce-client-shuffle ……………….. SUCCESS [  2.322 s]

[INFO] hadoop-mapreduce-client-app …………………… SUCCESS [  4.418 s]

[INFO] hadoop-mapreduce-client-hs ……………………. SUCCESS [  3.028 s]

[INFO] hadoop-mapreduce-client-jobclient ……………… SUCCESS [  2.791 s]

[INFO] hadoop-mapreduce-client-hs-plugins …………….. SUCCESS [  1.104 s]

[INFO] Apache Hadoop MapReduce Examples ………………. SUCCESS [  2.405 s]

[INFO] hadoop-mapreduce …………………………….. SUCCESS [  3.137 s]

[INFO] Apache Hadoop MapReduce Streaming ……………… SUCCESS [  2.338 s]

[INFO] Apache Hadoop Distributed Copy ………………… SUCCESS [  8.093 s]

[INFO] Apache Hadoop Archives ……………………….. SUCCESS [  1.290 s]

[INFO] Apache Hadoop Rumen ………………………….. SUCCESS [  3.027 s]

[INFO] Apache Hadoop Gridmix ………………………… SUCCESS [  2.096 s]

[INFO] Apache Hadoop Data Join ………………………. SUCCESS [  1.231 s]

[INFO] Apache Hadoop Ant Tasks ………………………. SUCCESS [  1.268 s]

[INFO] Apache Hadoop Extras …………………………. SUCCESS [  1.506 s]

[INFO] Apache Hadoop Pipes ………………………….. SUCCESS [  4.323 s]

[INFO] Apache Hadoop OpenStack support ……………….. SUCCESS [  2.154 s]

[INFO] Apache Hadoop Amazon Web Services support ………. SUCCESS [  2.260 s]

[INFO] Apache Hadoop Azure support …………………… SUCCESS [  2.000 s]

[INFO] Apache Hadoop Client …………………………. SUCCESS [  9.934 s]

[INFO] Apache Hadoop Mini-Cluster ……………………. SUCCESS [  0.250 s]

[INFO] Apache Hadoop Scheduler Load Simulator …………. SUCCESS [  2.677 s]

[INFO] Apache Hadoop Tools Dist ……………………… SUCCESS [  8.720 s]

[INFO] Apache Hadoop Tools ………………………….. SUCCESS [  0.084 s]

[INFO] Apache Hadoop Distribution ……………………. SUCCESS [01:22 min]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 15:36 min

[INFO] Finished at: 2016-06-13T18:51:37+08:00

[INFO] Final Memory: 104M/352M

In hadoop-2.7.1-src/hadoop-dist/target you will find the compiled native folder. Replace the native folder of the production installation with this freshly built one; the target directory is hadoop/lib/native.

Update the Hadoop configuration

Edit hadoop-env.sh:

export HADOOP_HOME=/usr/java/hadoop  # Hadoop install directory

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native

Edit core-site.xml:


<property>

<name>io.compression.codecs</name>

<value>org.apache.hadoop.io.compress.SnappyCodec</value>

</property>
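Note that io.compression.codecs takes a comma-separated list; a fuller value that keeps the standard codecs registered alongside snappy looks like this (a sketch):

```xml
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```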

                

If the intermediate MapReduce output should be snappy-compressed as well, edit mapred-site.xml:

<property>

<name>mapreduce.map.output.compress</name>

<value>true</value>

</property>

<property>

<name>mapreduce.map.output.compress.codec</name>

<value>org.apache.hadoop.io.compress.SnappyCodec</value>

</property>
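To also compress the final job output with snappy (not just the map-side intermediate data), the corresponding Hadoop 2.x properties are (a sketch):

```xml
<property>
  <name>mapreduce.output.fileoutputformat.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.output.fileoutputformat.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```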


Check whether the native libraries are installed correctly with hadoop checknative -a:


[[email protected] ~]# hadoop checknative -a

Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.

It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.

16/06/13 19:27:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Native library checking:

hadoop:  false

zlib:    false

snappy:  false

lz4:     false

bzip2:   false

openssl: false 

The key to solving the whole problem:

export HADOOP_ROOT_LOGGER=DEBUG,console

This enables debug logging; without it you cannot see what is actually going wrong.

[email protected]:/usr/local/hadoop/lib/native$ hadoop checknative

16/06/13 19:14:52 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...

16/06/13 19:14:52 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path

16/06/13 19:14:52 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop/lib

16/06/13 19:14:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/06/13 19:14:52 DEBUG util.Shell: setsid exited with exit code 0

Native library checking:

hadoop:  false

zlib:    false

snappy:  false

lz4:     false

bzip2:   false

openssl: false

The required native libraries cannot be found on java.library.path (it points at /usr/local/hadoop/lib rather than lib/native), so the environment variables need to be fixed.

vi ~/.bashrc

# ~/.bashrc: executed by bash(1) for non-login shells.

# see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)

# for examples

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386

export ZOOKEEPER_HOME=/usr/local/zookeeper/bin

# User specific environment and startup programs

PATH=$PATH:$HOME/bin

export HADOOP_HOME=/usr/local/hadoop

export HADOOP_COMMON_HOME=$HADOOP_HOME

export HADOOP_HDFS_HOME=$HADOOP_HOME

export HADOOP_MAPRED_HOME=$HADOOP_HOME

export HADOOP_YARN_HOME=$HADOOP_HOME

export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/lib:/usr/local/hadoop/sbin:/usr/local/hadoop/bin

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"

export PATH 

source ~/.bashrc

Check the native libraries again and everything is fine:

[email protected]:/usr/local/hadoop/lib/native$ source ~/.bashrc

[email protected]:/usr/local/hadoop/lib/native$ hadoop checknative

16/06/13 19:23:03 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...

16/06/13 19:23:03 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library

16/06/13 19:23:03 DEBUG util.Shell: setsid exited with exit code 0

16/06/13 19:23:03 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version

16/06/13 19:23:03 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library

Native library checking:

hadoop:  true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0

zlib:    true /lib/i386-linux-gnu/libz.so.1

snappy:  true /usr/local/hadoop/lib/native/libsnappy.so.1

lz4:    true revision:99

bzip2:  false

openssl: true /usr/lib/i386-linux-gnu/libcrypto.so

On another machine the same debug trick exposed a different cause: a 32-bit libhadoop.so being loaded by a 64-bit JVM (wrong ELF class), which means the native libraries have to be built on a machine matching the target architecture.

[[email protected] ~]# export HADOOP_ROOT_LOGGER=DEBUG,console

[[email protected] ~]# hadoop checknative     

16/06/13 19:30:35 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...

Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.

It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.

16/06/13 19:30:35 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /usr/local/hadoop/lib/native/libhadoop.so: /usr/local/hadoop/lib/native/libhadoop.so: wrong ELF class: ELFCLASS32 (Possible cause: architecture word width mismatch)

16/06/13 19:30:35 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/hadoop/lib/native/

16/06/13 19:30:35 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

16/06/13 19:30:36 DEBUG util.Shell: setsid exited with exit code 0

Native library checking:

hadoop:  false

zlib:    false

snappy:  false

lz4:     false

bzip2:   false

openssl: false

16/06/13 19:30:36 INFO util.ExitUtil: Exiting with status 1



