Exporting MySQL data to Hive with Sqoop

After installing Sqoop, you sometimes need to copy hive-common-3.1.2.jar from the Hive installation directory into the Sqoop directory (this can happen when Sqoop fails to resolve the Hive installation directory variable from the profile).
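A minimal sketch of that jar copy, assuming `HIVE_HOME` and `SQOOP_HOME` point at the two installations (here the two directory trees are simulated under /tmp so the commands can be run anywhere; on a real cluster, use your actual install paths):

```shell
# Simulated install locations; replace with your real paths, e.g. /opt/hive.
HIVE_HOME=/tmp/demo/hive
SQOOP_HOME=/tmp/demo/sqoop
mkdir -p "$HIVE_HOME/lib" "$SQOOP_HOME/lib"
touch "$HIVE_HOME/lib/hive-common-3.1.2.jar"   # stand-in for the real jar

# The actual fix: make Hive's common classes visible to Sqoop.
cp "$HIVE_HOME/lib/hive-common-3.1.2.jar" "$SQOOP_HOME/lib/"
```

If Sqoop still cannot find Hive classes after this, check that `HIVE_HOME` is exported in the profile that the Sqoop launcher sources.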
1、## View the data on HDFS
hadoop fs -ls /sqphdfsdata/dh_call_info2/datajob1004

2、## Import the MySQL table into HDFS with Sqoop (the original post used `sqoop list-tables` here, but that subcommand only lists tables and does not accept `--target-dir`; `sqoop import` is the command that writes to HDFS)
sqoop import --username root --password '2019_Mysql' --connect jdbc:mysql://localhost:3306/bgdmysqldb --table dh_call_info2 --target-dir /sqphdfsdata/dh_call_info2/datajob1004

3、## Create a table in Hive with the same schema as the MySQL table

sqoop create-hive-table --connect jdbc:mysql://192.168.91.112:3306/bgdmysqldb --username root --password '2019_Mysql' --table dh_call_info2

4、## Create the table manually on the Hive side
CREATE TABLE IF NOT EXISTS `rdw.dh_call_info2`
(id BIGINT, telephone string, name string, create_time int, update_time int)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' STORED AS TEXTFILE;

5、## Import into Hive with Sqoop
sqoop import \
--connect jdbc:mysql://192.168.91.112:3306/bgdmysqldb \
--username root \
--password '2019_Mysql' \
--table dh_call_info2 \
--fields-terminated-by '\t' \
--num-mappers 1 \
--hive-import \
--hive-database default \
--hive-table dh_call_info2 \
--delete-target-dir
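The `--fields-terminated-by '\t'` flag above must match the `FIELDS TERMINATED BY '\t'` clause in the Hive DDL from step 4, because Hive's text SerDe splits each line of the underlying files on that delimiter. A small local sketch of what one imported line looks like (the sample row values are hypothetical, not from the real table):

```shell
# Simulate one tab-delimited line as the import would write it to the
# table's warehouse directory (columns: id, telephone, name, create_time, update_time).
printf '1\t13800000000\talice\t1570000000\t1570000001\n' > /tmp/part-m-00000

# Extract the "name" column (field 3) the same way Hive's text SerDe
# would split the line on '\t'.
cut -f3 /tmp/part-m-00000
```

If the delimiters disagree between the import and the DDL, the table is still queryable but every row collapses into the first column, which is a common source of all-NULL query results.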


Reposted from www.cnblogs.com/bjxdd/p/11979827.html