Integrating Spark SQL with Hive

Start the Hive services:
./hive --service metastore & 
./hive --service hiveserver2 &
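Once both daemons are launched, a quick way to confirm they are listening is to probe their default ports (9083 for the metastore, 10000 for HiveServer2). A minimal sketch using bash's /dev/tcp device; the host and ports are assumptions and should match your hive-site.xml:

```shell
# Probe a TCP port via bash's /dev/tcp; succeeds only if something is listening.
port_open() { (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null ; }

# Default Hive ports; adjust host/ports if your config overrides them.
port_open localhost 9083  && echo "metastore up"   || echo "metastore down"
port_open localhost 10000 && echo "hiveserver2 up" || echo "hiveserver2 down"
```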


Spark configuration
Copy or symlink $HIVE_HOME/conf/hive-site.xml into $SPARK_HOME/conf/
Copy or symlink $HIVE_HOME/lib/mysql-connector-java-5.1.12.jar into $SPARK_HOME/lib/ (use the connector version that ships with your installation; the examples below use 5.1.42)
Placing the jar under $SPARK_HOME/lib/ is what makes it available in Spark standalone mode
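The copy/symlink steps above can be sketched as shell commands. The block below demonstrates them in a throwaway mktemp sandbox so it runs anywhere; in a real deployment HIVE_HOME and SPARK_HOME point at the installed trees and the mkdir/touch setup lines are unnecessary (the jar version here is an assumption, use whichever your Hive ships):

```shell
# Sandbox stand-ins for the real installations (demo only).
HIVE_HOME=$(mktemp -d)
SPARK_HOME=$(mktemp -d)
mkdir -p "$HIVE_HOME/conf" "$HIVE_HOME/lib" "$SPARK_HOME/conf" "$SPARK_HOME/lib"
touch "$HIVE_HOME/conf/hive-site.xml" \
      "$HIVE_HOME/lib/mysql-connector-java-5.1.42-bin.jar"

# The actual integration step: link Hive's config and the MySQL JDBC
# driver into the directories Spark reads.
ln -sf "$HIVE_HOME/conf/hive-site.xml" "$SPARK_HOME/conf/"
ln -sf "$HIVE_HOME/lib/mysql-connector-java-5.1.42-bin.jar" "$SPARK_HOME/lib/"

ls -l "$SPARK_HOME/conf" "$SPARK_HOME/lib"
```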


1. Standalone mode
/spark/bin/spark-sql --master spark://spark-master:7077 --jars /spark/examples/jars/mysql-connector-java-5.1.42-bin.jar

2. YARN client mode
/spark/bin/spark-sql --master yarn-client --jars /spark/examples/jars/mysql-connector-java-5.1.42-bin.jar




-----------------------hive-site.xml------------------------

<configuration>
	<property>
		<name>hive.metastore.warehouse.dir</name>
		<value>/usr/hive/warehouse</value>
	</property>
	<property>
		<name>hive.metastore.uris</name>
		<value>thrift://master:9083</value>
	</property>
	<property>
		<name>hive.metastore.local</name>
		<value>true</value>
	</property>
	<property>
		<name>hive.exec.scratchdir</name>
		<value>/tmp/hive</value>
	</property>	
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		<value>jdbc:mysql://172.18.0.21:3306/hive_db?createDatabaseIfNotExist=true</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionDriverName</name>
		<value>com.mysql.jdbc.Driver</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionUserName</name>
		<value>root</value>
	</property>
	<property>
		<name>javax.jdo.option.ConnectionPassword</name>
		<value>hadoop</value>
	</property>
</configuration>
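Before launching spark-sql it can help to confirm that the hive-site.xml Spark will pick up actually defines the properties it needs (the metastore URI and the JDBC connection settings). A grep-based sketch, run here against a minimal temp-file copy of the config above so it is self-contained; in a real deployment point CONF at $SPARK_HOME/conf/hive-site.xml:

```shell
# Minimal stand-in for $SPARK_HOME/conf/hive-site.xml (demo only).
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://master:9083</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://172.18.0.21:3306/hive_db?createDatabaseIfNotExist=true</value>
  </property>
</configuration>
EOF

# Properties spark-sql needs to reach the metastore and its backing MySQL db.
for prop in hive.metastore.uris javax.jdo.option.ConnectionURL; do
  if grep -q "<name>$prop</name>" "$CONF"; then
    echo "ok: $prop"
  else
    echo "missing: $prop"
  fi
done
```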

  References

http://blog.csdn.net/stark_summer/article/details/48443147


Reposted from m635674608.iteye.com/blog/2375479