scribe + hdfs

Scribe installation and configuration
1. Dependencies:

a. autoconf:
 wget http://ftp.gnu.org/gnu/autoconf/autoconf-2.69.tar.gz
 tar xvf autoconf-2.69.tar.gz
 cd autoconf-2.69
 ./configure --bindir=/usr/bin
 make && make install
 
b. Dependency libraries:
 yum install libevent libevent-devel python-devel
 yum install gcc-c++
 yum install libtool
 yum install automake
 yum install byacc flex
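Before moving on, a quick pre-flight check (a sketch, not part of the original steps) confirms the build tools installed above are actually on PATH:

```shell
# Pre-flight check: walk the tools from steps a/b and record any missing ones.
checked=0
missing=""
for tool in autoconf libtool automake flex byacc gcc g++; do
    checked=$((checked + 1))
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
echo "checked $checked tools; missing:${missing:- none}"
```

If anything shows up as missing, rerun the corresponding yum install before continuing.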

c. boost, preferably 1.45 (note: this installs Boost under the unusual prefix /usr/local/bootstrap, which all the later steps reference):
 wget http://jaist.dl.sourceforge.net/project/boost/boost/1.45.0/boost_1_45_0.tar.gz
 tar -xf boost_1_45_0.tar.gz
 cd boost_1_45_0
 ./bootstrap.sh
 ./bjam install --prefix=/usr/local/bootstrap
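To confirm which Boost actually got installed, decode the BOOST_VERSION value found in /usr/local/bootstrap/include/boost/version.hpp. Boost encodes the version as one integer, major*100000 + minor*100 + patch, so 1.45.0 appears as 104500 (hard-coded below for illustration; substitute the value from your header):

```shell
# Decode a BOOST_VERSION integer (1.45.0's value, 104500) into dotted form.
BOOST_VERSION=104500
ver="$((BOOST_VERSION / 100000)).$((BOOST_VERSION / 100 % 1000)).$((BOOST_VERSION % 100))"
echo "$ver"
```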

d. thrift:
 wget --no-check-certificate https://dist.apache.org/repos/dist/release/thrift/0.8.0/thrift-0.8.0.tar.gz
 tar xzf thrift-0.8.0.tar.gz
 cd thrift-0.8.0
 ./configure --with-boost=/usr/local/bootstrap/ --with-java --prefix=/usr/local/thrift --with-php-config=/usr/local/php/bin/php-config
 make && make install

 cd contrib/fb303
 ./bootstrap.sh --prefix=/usr/local/thrift/fb303 --with-boost=/usr/local/bootstrap/ --with-thriftpath=/usr/local/thrift/
 ./configure --prefix=/usr/local/thrift/fb303 --with-boost=/usr/local/bootstrap/ --with-thriftpath=/usr/local/thrift/ CPPFLAGS="-DHAVE_INTTYPES_H -DHAVE_NETINET_IN_H"
 make && make install
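With thrift built, one way to sanity-check the compiler is to feed it a small IDL file. The snippet below writes an interface roughly in the shape of scribe's logging service (struct and service names here are illustrative, not scribe's exact scribe.thrift); compile it with /usr/local/thrift/bin/thrift --gen cpp /tmp/ping.thrift:

```shell
# Write a minimal Thrift IDL resembling a scribe-style logging interface.
cat > /tmp/ping.thrift <<'EOF'
struct LogEntry {
  1: string category,
  2: string message,
}

service ping_scribe {
  i32 Log(1: list<LogEntry> messages),
}
EOF
echo "wrote $(wc -l < /tmp/ping.thrift) lines to /tmp/ping.thrift"
```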

2. Installing scribe
 export BOOST_ROOT=/usr/local/bootstrap/
 export LD_LIBRARY_PATH=/usr/local/thrift/lib:/usr/lib:/usr/local/lib:/usr/local/bootstrap/lib/
 ./bootstrap.sh --prefix=/usr/local/scribe --with-boost=/usr/local/bootstrap/ --with-thriftpath=/usr/local/thrift/ --with-fb303path=/usr/local/thrift/fb303 --with-hadooppath=$HADOOP_HOME --enable-hdfs
 ./configure --prefix=/usr/local/scribe --with-boost=/usr/local/bootstrap/ --with-thriftpath=/usr/local/thrift/ --with-fb303path=/usr/local/thrift/fb303 --with-hadooppath=$HADOOP_HOME --enable-hdfs CPPFLAGS="-I$HADOOP_HOME/src/c++/libhdfs/ -I$JAVA_HOME/include/ -I$JAVA_HOME/include/linux/ -I$JAVA_HOME/jre/lib/amd64" LDFLAGS="-ljvm -lhdfs -L$JAVA_HOME/jre/lib/amd64 -L$HADOOP_HOME/c++/Linux-amd64-64/lib"
 This step may fail with errors about the C compiler; config.log will show that gcc -ljvm and gcc -lhdfs cannot find those libraries.
 -ljvm:
  ln -s $JAVA_HOME/jre/lib/amd64/server/libjvm.so /usr/lib64/libjvm.so.2.2
  ln -s /usr/lib64/libjvm.so.2.2 /usr/lib64/libjvm.so

 -lhdfs:
  ln -s $HADOOP_HOME/c++/Linux-amd64-64/lib/libhdfs.so.0 /usr/lib64/libhdfs.so.0
 Be sure to create symlinks here rather than copying the files: a copied library will fail to find the libraries it in turn depends on.
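The chained-symlink trick can be illustrated on stand-in files (the /tmp paths below are made up for the demo; the real ones are $JAVA_HOME/jre/lib/amd64/server/libjvm.so and /usr/lib64). Both links ultimately resolve to the original file, which stays in its original directory next to its own dependencies:

```shell
# Demo of the two-step symlink: libjvm.so -> libjvm.so.2.2 -> the real file.
mkdir -p /tmp/ldfix/server /tmp/ldfix/lib64
printf 'real-libjvm' > /tmp/ldfix/server/libjvm.so      # stand-in for the JVM lib
ln -sf /tmp/ldfix/server/libjvm.so /tmp/ldfix/lib64/libjvm.so.2.2
ln -sf /tmp/ldfix/lib64/libjvm.so.2.2 /tmp/ldfix/lib64/libjvm.so
cat /tmp/ldfix/lib64/libjvm.so
```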
 At this point the build should succeed (should...).
 Next, copy $HADOOP_HOME/src/c++/libhdfs/hdfs.h into src/.
 Then edit src/HdfsFile.cpp and change hdfsConnectNewInstance() to hdfsConnect(); hdfsConnectNewInstance dates from the historical Hadoop version scribe was written against and is no longer supported.
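The rename can be scripted with sed; demonstrated here on a sample line rather than the real src/HdfsFile.cpp (back the file up before patching it in place):

```shell
# Apply the rename to a sample line; run the same sed against src/HdfsFile.cpp.
printf 'fileSys = hdfsConnectNewInstance(host, port);\n' > /tmp/HdfsFile.sample
sed -i 's/hdfsConnectNewInstance/hdfsConnect/g' /tmp/HdfsFile.sample
cat /tmp/HdfsFile.sample
```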
 make && make install

3. Environment variables to add
 export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$HADOOP_HOME/hadoop-core-1.0.4.jar:$HADOOP_HOME/lib/commons-logging-1.1.1.jar:$HADOOP_HOME/lib/commons-configuration-1.6.jar:$HADOOP_HOME/lib/commons-logging-api-1.1.1.jar:$HADOOP_HOME/lib/commons-lang-2.4.jar
 export BOOST_ROOT=/usr/local/bootstrap/
 export LD_LIBRARY_PATH=/usr/local/thrift/lib:/usr/lib:/usr/local/lib:/usr/local/bootstrap/lib/
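The jar names in that CLASSPATH are tied to Hadoop 1.0.4. A less brittle alternative (a sketch, shown against a stand-in directory rather than a real $HADOOP_HOME) is to glob whatever jars are actually present:

```shell
# Build CLASSPATH from every jar under a Hadoop home, instead of naming versions.
HADOOP=/tmp/hadoop-cp-demo                       # stand-in for $HADOOP_HOME
mkdir -p "$HADOOP/lib"
touch "$HADOOP/hadoop-core-1.0.4.jar" "$HADOOP/lib/commons-logging-1.1.1.jar"
CP="$JAVA_HOME/lib:$JAVA_HOME/jre/lib"
for jar in "$HADOOP"/*.jar "$HADOOP"/lib/*.jar; do
    CP="$CP:$jar"
done
export CLASSPATH="$CP"
echo "$CLASSPATH"
```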

4. Problems you may hit along the way
 a. While building scribe: configure: error: Could not link against boost_filesystem!
  Set the environment variables:
  export BOOST_ROOT=/usr/local/bootstrap/
  export LD_LIBRARY_PATH=/usr/local/thrift/lib:/usr/lib:/usr/local/lib:/usr/local/bootstrap/lib/
  (中交兴路 - Infrastructure R&D - 高健峰, 2013-05-28)
 b. During make: hdfsConnectNewInstance not in this scope
  Change hdfsConnectNewInstance to hdfsConnect in HdfsFile.cpp.
 c. At runtime: Configuration, LogFactory, StringUtil not found
  export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$HADOOP_HOME/hadoop-core-1.0.4.jar:$HADOOP_HOME/lib/commons-logging-1.1.1.jar:$HADOOP_HOME/lib/commons-configuration-1.6.jar:$HADOOP_HOME/lib/commons-logging-api-1.1.1.jar:$HADOOP_HOME/lib/commons-lang-2.4.jar
 d. At runtime: hdfs append not supported!
  Edit the Hadoop configuration file hdfs-site.xml and add:
  <property>
   <name>dfs.support.append</name>
   <value>true</value>
  </property>
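A minimal hdfs-site.xml carrying that property looks like the following (written to /tmp here for illustration; merge the property into your existing $HADOOP_HOME/conf/hdfs-site.xml rather than overwriting it):

```shell
# Write a minimal hdfs-site.xml enabling append support.
cat > /tmp/hdfs-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.support.append</name>
    <value>true</value>
  </property>
</configuration>
EOF
echo "wrote /tmp/hdfs-site.xml"
```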

5. A few source-code modifications
   HdfsFile.cpp: changes to the hdfsExist handling
   Store.cpp: changes to the baseFileName handling

Reposted from josephgao.iteye.com/blog/1909134