Installing Hive 1.2.2 on Hadoop 2.6.0

Copyright notice: this is an original article by 九师兄 (QQ group "spark source code": 198279782, technical discussion welcome); do not reproduce without the author's permission. https://blog.csdn.net/qq_21383435/article/details/81905805

1. Installing Hadoop 2.6.0

See: https://blog.csdn.net/qq_21383435/article/details/51691344
The Hadoop configuration is unchanged from that guide.

2. Installing Hive 1.2.2

2.1 Prerequisites

Hive is installed on the cluster and connects to MySQL, which serves as the metastore database; MySQL must be reachable before you start.

2.2 Unpack

Mind the version number:

lcc@lcc apache-hive-1.2.2-bin$ pwd
/Users/lcc/soft/hive/apache-hive-1.2.2-bin
lcc@lcc apache-hive-1.2.2-bin$

Configure the environment:

lcc@localhost conf$ vim hive-env.sh
HADOOP_HOME=/Users/lcc/soft/hadoop/hadoop-2.6.0
export HADOOP_CONF_DIR=/Users/lcc/soft/hadoop/hadoop-2.6.0/etc/hadoop/
export HIVE_CONF_DIR=/Users/lcc/soft/hive/apache-hive-1.2.2-bin/conf

lcc@lcc apache-hive-1.2.2-bin$ vim conf/hive-site.xml


<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>

  <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
  </property>

  <!-- MySQL connection URL -->
  <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://localhost:3306/hive_meta1x?createDatabaseIfNotExist=true&amp;useUnicode=true&amp;characterEncoding=utf-8&amp;useSSL=false</value>
  </property>

  <property>
      <name>hive.hmshandler.retry.attempts</name>
      <value>3</value>
  </property>

  <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>/Users/lcc/soft/hive/apache-hive-1.2.2-bin/hive_data</value>
  </property>
  <!-- remaining properties omitted -->
</configuration>
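One detail worth calling out: the `&` separators in the ConnectionURL value are written as `&amp;` because hive-site.xml is XML, where a bare `&` is illegal. A minimal Python sketch of the round trip:

```python
# hive-site.xml is XML, so '&' in the JDBC URL must be escaped as '&amp;'.
# The standard library can do the conversion in both directions.
from xml.sax.saxutils import escape, unescape

jdbc_url = ("jdbc:mysql://localhost:3306/hive_meta1x"
            "?createDatabaseIfNotExist=true&useUnicode=true"
            "&characterEncoding=utf-8&useSSL=false")

xml_value = escape(jdbc_url)   # each '&' becomes '&amp;'
print(xml_value)

# Hive unescapes the value when parsing the file, recovering the raw URL.
assert unescape(xml_value) == jdbc_url
```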

Copy the MySQL JDBC driver jar into Hive's lib directory:

lcc@lcc apache-hive-1.2.2-bin$ pwd
/Users/lcc/soft/hive/apache-hive-1.2.2-bin
lcc@lcc apache-hive-1.2.2-bin$ ll
-rw-r--r--@   1 lcc  staff    848401  8  9 12:13 mysql-connector-java-5.1.25-bin.jar

Then initialize the metastore schema:

schematool -initSchema -dbType mysql --verbose

This step may fail with Error 1, described at the end of this article.

Initialization succeeded.


Then start the metastore and HiveServer2 services:

nohup bin/hive --service metastore &
nohup bin/hive --service hiveserver2 &
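Besides jps, you can probe the default service ports directly: 9083 for the metastore and 10000 for HiveServer2. A small sketch assuming the default ports (adjust if you changed them):

```python
# Probe the default Hive service ports: 9083 (metastore), 10000 (HiveServer2).
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in [("metastore", 9083), ("hiveserver2", 10000)]:
    print(name, "up" if port_open("localhost", port) else "down")
```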

Check that both started; each service appears in jps as a RunJar process:

lcc@localhost dubhe-node$ jps
8961 RunJar
14020 Jps
8116 SecondaryNameNode
8229 ResourceManager
8581 DataNode
13737 Launcher
13738 JUnitStarter
8906 RunJar
12334 
7935 NameNode
lcc@localhost dubhe-node$ jps -ml
8961 org.apache.hadoop.util.RunJar /Users/lcc/soft/hive/hive/lib/hive-metastore-2.2.0.jar org.apache.hadoop.hive.metastore.HiveMetaStore
8116 org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
8229 org.apache.hadoop.yarn.server.resourcemanager.ResourceManager
8581 org.apache.hadoop.hdfs.server.datanode.DataNode
14022 sun.tools.jps.Jps -ml

13738 com.intellij.rt.execution.junit.JUnitStarter -ideVersion5 -junit4 com.dtwave.dipper.dubhe.node.util.JdbcUtilsTest,mysql
8906 org.apache.hadoop.util.RunJar /Users/lcc/soft/hive/hive/lib/hive-service-2.2.0.jar org.apache.hive.service.server.HiveServer2
12334 
7935 org.apache.hadoop.hdfs.server.namenode.NameNode
lcc@localhost dubhe-node$ 

Then test a remote connection with Beeline:

lcc@localhost dubhe-node$ beeline
Beeline version 1.2.2 by Apache Hive
beeline> !connect jdbc:hive2://localhost:10000/default
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/lcc/soft/hive/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/lcc/soft/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/default
Enter username for jdbc:hive2://localhost:10000/default: 
Enter password for jdbc:hive2://localhost:10000/default: 
Connected to: Apache Hive (version 2.2.0)
Driver: Hive JDBC (version 2.2.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000/default> 
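Beyond Beeline, the same HiveServer2 endpoint can be reached from Python. This is a sketch using the third-party PyHive package (an assumption, not part of this setup; install it with `pip install 'pyhive[hive]'`):

```python
def show_databases(host="localhost", port=10000, username="lcc"):
    """Connect to HiveServer2 and return the list of database names."""
    # PyHive is imported lazily so this module loads even without it installed.
    from pyhive import hive

    conn = hive.connect(host=host, port=port, username=username)
    try:
        cur = conn.cursor()
        cur.execute("SHOW DATABASES")
        return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()
```

The host, port, and username default to the values used in the Beeline session above.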

Error 1

With Hadoop 2.6.0, $HADOOP_HOME/share/hadoop/yarn/lib/ still contains the old jline-0.9.94.jar (Hadoop 2.7.x no longer ships it). This stale jar shadows Hive's newer jline on the classpath and causes the following error when Hive initializes the metastore:

java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.TerminalFactory.create(TerminalFactory.java:101)
        at jline.TerminalFactory.get(TerminalFactory.java:158)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
        at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

Cause: an old jline version sits under the Hadoop directory and wins on the classpath.

Fix: copy Hive's jline-2.12.jar into the YARN lib directory and move the old jar aside:

lcc@lcc ~$ ll /Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib/ | grep jl
-rw-r--r--@  1 lcc  staff    87325 11 14  2014 jline-0.9.94.jar
lcc@lcc ~$

lcc@lcc ~$ cp /Users/lcc/soft/hive/apache-hive-1.2.2-bin/lib/jline-2.12.jar /Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib/

lcc@lcc ~$ mv /Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar /Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar.back

lcc@lcc ~$ ll /Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib/ | grep jl
-rw-r--r--@  1 lcc  staff    87325 11 14  2014 jline-0.9.94.jar.back
-rw-r--r--   1 lcc  staff   213854  8 21 12:57 jline-2.12.jar
lcc@lcc ~$
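The check above can be automated. A small sketch (the path is illustrative; point it at your own installation) that lists the jline jars in a lib directory, so you can confirm only one active version remains:

```python
import os

def find_jars(directory, keyword="jline"):
    """Return .jar filenames in `directory` whose name contains `keyword`."""
    if not os.path.isdir(directory):
        return []
    return sorted(f for f in os.listdir(directory)
                  if f.endswith(".jar") and keyword in f)

# Path is illustrative; renamed '.jar.back' files no longer match.
yarn_lib = "/Users/lcc/soft/hadoop/hadoop-2.6.0/share/hadoop/yarn/lib"
print(find_jars(yarn_lib))
```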
