Integrating Spark with Hive (letting Hive computations run on Spark)


1. Install MySQL
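
Before wiring up Spark, the metastore database that hive-site.xml points at has to exist. A minimal sketch, assuming MySQL 5.x and the root/root credentials used in the config below (Hive's metastore schema traditionally expects latin1):

# create the metastore database and grant access (MySQL 5.x GRANT syntax)
mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS hive DEFAULT CHARACTER SET latin1;"
mysql -u root -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'root'@'%' IDENTIFIED BY 'root'; FLUSH PRIVILEGES;"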

2. Create a hive-site.xml under spark/conf

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at
 
       http://www.apache.org/licenses/LICENSE-2.0
 
   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <!-- host, port, and database name below are examples; point this at your MySQL metastore -->
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
 
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
 
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>username to use against metastore database</description>
  </property>
 
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>root</value>
    <description>password to use against metastore database</description>
  </property>
</configuration>

3. Put the MySQL JDBC driver jar into Spark's jars directory
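
For example (the connector version and paths are placeholders; use the jar that matches your MySQL version):

cp mysql-connector-java-5.1.47.jar /usr/local/spark/jars/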

There are two ways to let Spark find the data of Hive tables on HDFS:

Method 1: edit /etc/profile and add:

export HADOOP_CONF_DIR=/home/xss/java/hadoop/etc/hadoop

Method 2: add the same variable to spark-env.sh in Spark's conf directory:

export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop/

The two methods above have the same effect, so either one is fine; you only need to do one of them. A quick sketch of applying each follows.
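
Both lines below reuse the example paths from above; adjust them to your own install:

echo 'export HADOOP_CONF_DIR=/home/xss/java/hadoop/etc/hadoop' >> /etc/profile      # Method 1
source /etc/profile                                                                 # reload so it takes effect in this shell
echo 'export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop/' >> /usr/local/spark/conf/spark-env.sh    # Method 2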

4. Start spark-sql (from the command line)     ****the Spark cluster must be started first (start-spark.sh) before running this command

bin/spark-sql --master spark://192.168.224.132:7077 --driver-class-path /usr/local/spark/mysql.jar

(If the MySQL jar was already placed in jars/ in step 3, the --driver-class-path option is unnecessary.)
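
Once the shell comes up, a quick smoke test confirms spark-sql is talking to the Hive metastore (the table below is just an example):

spark-sql> show databases;
spark-sql> create table t1(id int, name string);
spark-sql> insert into t1 values (1, 'spark');
spark-sql> select * from t1;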

5. Modify the DBS table of the hive database in MySQL     ***this step was not performed here

Change the data storage path to an HDFS directory.
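
For reference, each database's location is stored in the DB_LOCATION_URI column of the metastore's DBS table. A sketch of the update, assuming the NameNode runs at 192.168.224.132:9000 (match this to your fs.defaultFS):

mysql -u root -p hive -e "UPDATE DBS SET DB_LOCATION_URI='hdfs://192.168.224.132:9000/user/hive/warehouse' WHERE NAME='default';"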