Required components
- Hadoop 2.7.3: https://github.com/apache/had...
- Snappy 1.1.3: https://github.com/google/sna...
- protobuf-2.5.0: https://github.com/google/pro...
Install the dependencies
ant, gcc, gcc-c++, make, autoconf, automake, cmake, gzip, libssl-devel, openssl-devel, libtool, Java 7+, Maven 3+
Most of these can be installed directly with yum install, for example:
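A minimal sketch on a CentOS-style system (exact package names vary by distribution, and ant may need an extra repository such as EPEL):
yum -y install gcc gcc-c++ make autoconf automake cmake gzip libtool openssl-devel ant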
Install snappy
- Unpack: unzip snappy-1.1.3.zip
- Enter the directory: cd snappy-1.1.3
- Build and install:
./configure && make && make install
After a successful build, /usr/local/lib/ contains the following:
# ls /usr/local/lib/
libsnappy.a libsnappy.la libsnappy.so libsnappy.so.1 libsnappy.so.1.3.0
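Note that on many systems /usr/local/lib is not on the default dynamic-linker search path, so libsnappy may not be found at runtime even though it installed cleanly. A minimal sketch of registering it, assuming a CentOS-style /etc/ld.so.conf.d layout (the file name snappy.conf is arbitrary):
echo "/usr/local/lib" > /etc/ld.so.conf.d/snappy.conf
ldconfig
ldconfig -p | grep snappy
Alternatively, the LD_LIBRARY_PATH exports added to hadoop-env.sh and hbase-env.sh later in this guide cover the same need.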
Install protobuf
- Unpack:
unzip protobuf-2.5.0.zip
- Enter the directory:
cd protobuf-2.5.0/
- Build and install:
./configure && make && make install
- Configure the environment variable:
export PATH=/usr/local/protoc/bin:$PATH
- Apply it immediately:
source /etc/profile
- Check the version:
protoc --version
[root@SparkWorker3 data]# protoc --version
libprotoc 2.5.0
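The PATH export above points at /usr/local/protoc/bin, which implies protobuf was configured with a custom prefix; by default ./configure installs protoc into /usr/local/bin. A minimal sketch that matches that PATH line (the --prefix value and writing the export into /etc/profile are assumptions, not part of the original steps):
./configure --prefix=/usr/local/protoc
make && make install
echo 'export PATH=/usr/local/protoc/bin:$PATH' >> /etc/profile
source /etc/profile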
Build the Hadoop source
- Install the JDK and Maven, then install the build dependencies:
yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
- Unpack:
unzip hadoop-release-2.7.3-RC2.zip
- Enter the directory:
cd hadoop-release-2.7.3-RC2/
- Set the Maven memory limits:
export MAVEN_OPTS="-Xms256m -Xmx512m"
- Build:
mvn package -DskipTests -Pdist,native -Dtar -Dsnappy.lib=/usr/local/lib -Dbundle.snappy
- The build takes quite a while; most failures are caused by missing dependencies.
- The result looks like this:
# ls hadoop-release-2.7.3-RC2/hadoop-dist/target/hadoop-2.7.3/lib/native
libhadoop.a libhadooputils.a libsnappy.a libsnappy.so.1.3.0
libhadooppipes.a libhdfs.a libsnappy.la
libhadoop.so libhdfs.so libsnappy.so
libhadoop.so.1.0.0 libhdfs.so.0.0.0 libsnappy.so.1
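Before copying these files anywhere, it is worth confirming they are 64-bit ELF objects that match the target machines; a quick check (the path assumes the build tree from the step above):
cd hadoop-release-2.7.3-RC2/hadoop-dist/target/hadoop-2.7.3/lib/native
file libhadoop.so.1.0.0 libsnappy.so.1.3.0
Both should be reported as 64-bit LSB shared objects for x86-64.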
Add snappy to Hadoop and HBase
- Copy the library files to Hadoop:
# cp -r /data/hadoop-compile/hadoop-release-2.7.3-RC2/hadoop-dist/target/hadoop-2.7.3/lib/native/* $HADOOP_HOME/lib/native/
- Create the directory in HBase:
# mkdir -p $HBASE_HOME/lib/native/Linux-amd64-64
- Copy the library files to HBase:
# cp -r /data/hadoop-compile/hadoop-release-2.7.3-RC2/hadoop-dist/target/hadoop-2.7.3/lib/native/* $HBASE_HOME/lib/native/Linux-amd64-64/
- Add the following environment variable to $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/hadoop/hadoop-2.7.3/lib/native/:/usr/local/lib/
- Add the following environment variables to $HBASE_HOME/conf/hbase-env.sh:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/hadoop/hadoop-2.7.3/lib/native/:/usr/local/lib/
export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
Modify the configuration files
- Add to $HADOOP_HOME/etc/hadoop/core-site.xml:
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
- Add to $HADOOP_HOME/etc/hadoop/mapred-site.xml:
<property>
  <name>mapred.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
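With these mapred-site.xml settings in place, MapReduce jobs should now write snappy-compressed output. A quick end-to-end check using the examples jar bundled with Hadoop 2.7.3 (the /tmp/wc-in and /tmp/wc-out HDFS paths are placeholders chosen for this sketch):
hdfs dfs -mkdir -p /tmp/wc-in
hdfs dfs -put $HADOOP_HOME/etc/hadoop/core-site.xml /tmp/wc-in
hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount /tmp/wc-in /tmp/wc-out
hdfs dfs -ls /tmp/wc-out
If compression is working, the part files under /tmp/wc-out carry a .snappy suffix.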
- Add to $HBASE_HOME/conf/hbase-site.xml (with this set, a RegionServer refuses to start unless the snappy codec can be loaded):
<property>
  <name>hbase.regionserver.codecs</name>
  <value>snappy</value>
</property>
Restart Hadoop and HBase, then test
Hadoop test: $HADOOP_HOME/bin/hadoop checknative -a
# $HADOOP_HOME/bin/hadoop checknative -a
17/09/21 10:31:50 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/hadoop/hadoop-2.7.3/lib/native/libhadoop.so
zlib: true /lib64/libz.so.1
snappy: true /usr/local/hadoop/hadoop-2.7.3/lib/native/libsnappy.so.1
lz4: true revision:99
bzip2: false
openssl: true /lib64/libcrypto.so
17/09/21 10:31:50 INFO util.ExitUtil: Exiting with status 1
(The exit status 1 above is expected here: with -a, checknative exits non-zero because bzip2 is reported as false.)
HBase test: create a table that uses snappy compression:
create 'snappytest', { NAME => 'info', COMPRESSION => 'snappy'}
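Two further ways to confirm the codec actually works from HBase (CompressionTest is HBase's own utility class; the file:///tmp/snappy-test path is just a scratch location chosen for this sketch):
hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
Then, in the hbase shell, describe 'snappytest' should report COMPRESSION => 'SNAPPY' for the info column family.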