Spring Boot Multi-DataSource Configuration (Hive/MySQL)

Introduction

  • In real projects, a single application often needs to integrate several data sources so that data scattered across different databases and tables can be queried and aggregated in one place;
  • Spring Boot ships with a default single-data-source configuration; this post walks through a multi-data-source setup in Spring Boot, covering MySQL, Hive and more, used through JdbcTemplate or MyBatis;
  • Versions (pick your own; these are the versions tested here): JDK 1.8, Spring Boot 1.5.9, Hive 2.1.1, Hadoop 2.7.6, MySQL 5.7;

Hands-on

  • First, add the dependencies to the pom file:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<spring.version>4.3.9.RELEASE</spring.version>
</properties>

<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.9.RELEASE</version>
</parent>

<dependencies>

<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-web</artifactId>
</dependency>

<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-core</artifactId>
</dependency>
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-context</artifactId>
</dependency>


<!-- MyBatis support -->
<dependency>
	<groupId>org.mybatis.spring.boot</groupId>
	<artifactId>mybatis-spring-boot-starter</artifactId>
	<version>1.3.2</version>
</dependency>

<!-- MySQL driver -->
<dependency>
	<groupId>mysql</groupId>
	<artifactId>mysql-connector-java</artifactId>
</dependency>

<!-- Database connection pool (Druid) -->
<dependency>
	<groupId>com.alibaba</groupId>
	<artifactId>druid</artifactId>
	<version>1.0.29</version>
</dependency>

<!-- Spring bean management -->
<dependency>
	<groupId>org.springframework</groupId>
	<artifactId>spring-beans</artifactId>
</dependency>

<!-- Hadoop dependencies -->
<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-common</artifactId>
	<version>2.6.0</version>
</dependency>

<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-core</artifactId>
	<version>2.6.0</version>
</dependency>

<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-mapreduce-client-common</artifactId>
	<version>2.6.0</version>
</dependency>

<dependency>
	<groupId>org.apache.hadoop</groupId>
	<artifactId>hadoop-hdfs</artifactId>
	<version>2.6.0</version>
</dependency>

<dependency>
	<groupId>jdk.tools</groupId>
	<artifactId>jdk.tools</artifactId>
	<version>1.8</version>
	<scope>system</scope>
	<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>

<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-configuration-processor</artifactId>
	<optional>true</optional>
</dependency>

<!-- Hive JDBC dependency -->
<dependency>
	<groupId>org.apache.hive</groupId>
	<artifactId>hive-jdbc</artifactId>
	<version>2.1.1</version>
	<exclusions>
		<exclusion>
			<groupId>org.eclipse.jetty.aggregate</groupId>
			<artifactId>*</artifactId>
		</exclusion>
	</exclusions>
</dependency>

</dependencies>	
  • Configure the properties of each data source, plus the shared connection-pool properties, in the yml file:
#MyBatis configuration
mybatis:
     mapperLocations: classpath:/com/springboot/sixmonth/dao/mapper/**/*.xml

#Multiple data source properties (only two are configured here; add more as needed)
spring:
    datasource:
      mysqlMain: #primary MySQL data source; can also be used by MyBatis
         type: com.alibaba.druid.pool.DruidDataSource
         url: jdbc:mysql://127.0.0.1:3306/master?useUnicode=true&characterEncoding=utf-8&zeroDateTimeBehavior=convertToNull&allowMultiQueries=true&useSSL=true&rewriteBatchedStatements=true
         username: root
         password: 1234
         driver-class-name: com.mysql.jdbc.Driver
      hive: #Hive data source
          url: jdbc:hive2://47.100.200.200:9019/default
          type: com.alibaba.druid.pool.DruidDataSource
          username: sixmonth
          password: sixmonth
          driver-class-name: org.apache.hive.jdbc.HiveDriver
      commonConfig: #shared connection-pool settings, applied to every data source
         initialSize: 1
         minIdle: 1
         maxIdle: 5
         maxActive: 50
         maxWait: 10000
         timeBetweenEvictionRunsMillis: 10000
         minEvictableIdleTimeMillis: 300000
         validationQuery: select 'x'
         testWhileIdle: true
         testOnBorrow: false
         testOnReturn: false
         poolPreparedStatements: true
         maxOpenPreparedStatements: 20
         filters: stat	
  • Create the following classes:
  1. HiveDruidConfig (Hive connection-pool configuration);
  2. MysqlMainDruidConfig (connection-pool configuration for the primary MySQL data source);
  3. DataSourceCommonProperties (shared connection-pool properties, applied to all data sources);
  4. DataSourceProperties (central properties class that reads the data source settings from the configuration file);
  5. HiveJdbcBaseDaoImpl (JdbcTemplate wired to the Hive data source);
  6. MysqlMainJdbcBaseDaoImpl (JdbcTemplate wired to the MySQL data source);

  • Create HiveDruidConfig:
package com.springboot.sixmonth.common.config.druid.hive;

import java.sql.SQLException;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.alibaba.druid.pool.DruidDataSource;
import com.springboot.sixmonth.common.config.druid.DataSourceCommonProperties;
import com.springboot.sixmonth.common.config.druid.DataSourceProperties;

/**
 * Configures the Hive data source.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@Configuration
@EnableConfigurationProperties({DataSourceProperties.class,DataSourceCommonProperties.class})//register the properties classes as beans so the @ConfigurationProperties annotations take effect
public class HiveDruidConfig {

	    private static Logger logger = LoggerFactory.getLogger(HiveDruidConfig.class);

	    @Autowired
	    private DataSourceProperties dataSourceProperties;
	    
	    @Autowired
	    private DataSourceCommonProperties dataSourceCommonProperties;

	    @Bean("hiveDruidDataSource") //create the DataSource bean
	    @Qualifier("hiveDruidDataSource")//qualifier used when injecting this data source
	    public DataSource dataSource(){
	        DruidDataSource datasource = new DruidDataSource();

	        //data source properties
	        datasource.setUrl(dataSourceProperties.getHive().get("url"));
	        datasource.setUsername(dataSourceProperties.getHive().get("username"));
	        datasource.setPassword(dataSourceProperties.getHive().get("password"));
	        datasource.setDriverClassName(dataSourceProperties.getHive().get("driver-class-name"));
	        
	        //shared pool properties
	        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
	        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
	        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
	        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
	        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
	        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
	        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
	        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
	        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
	        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
	        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
	        try {
	            datasource.setFilters(dataSourceCommonProperties.getFilters());
	        } catch (SQLException e) {
	            logger.error("Druid configuration initialization filter error.", e);
	        }
	        return datasource;
	    }
	    
}
  • Create MysqlMainDruidConfig:
package com.springboot.sixmonth.common.config.druid.mysqlMain;

import java.sql.SQLException;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

import com.alibaba.druid.pool.DruidDataSource;
import com.springboot.sixmonth.common.config.druid.DataSourceCommonProperties;
import com.springboot.sixmonth.common.config.druid.DataSourceProperties;

/**
 * Configures the primary MySQL data source; MySQL connections default to this primary data source.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@Configuration
@EnableConfigurationProperties({DataSourceProperties.class,DataSourceCommonProperties.class})//register the properties classes as beans so the @ConfigurationProperties annotations take effect
public class MysqlMainDruidConfig {

	    private static Logger logger = LoggerFactory.getLogger(MysqlMainDruidConfig.class);

	    @Autowired
	    private DataSourceProperties dataSourceProperties;
	    
	    @Autowired
	    private DataSourceCommonProperties dataSourceCommonProperties;

	    @Primary //marks the primary data source; only one bean may be primary, and MyBatis uses it by default
	    @Bean("mysqlDruidDataSource") //create the DataSource bean
	    @Qualifier("mysqlDruidDataSource")//qualifier used when injecting this data source
	    public DataSource dataSource(){
	        DruidDataSource datasource = new DruidDataSource();

	        //data source properties
	        datasource.setUrl(dataSourceProperties.getMysqlMain().get("url"));
	        datasource.setUsername(dataSourceProperties.getMysqlMain().get("username"));
	        datasource.setPassword(dataSourceProperties.getMysqlMain().get("password"));
	        datasource.setDriverClassName(dataSourceProperties.getMysqlMain().get("driver-class-name"));
	        
	        //shared pool properties
	        datasource.setInitialSize(dataSourceCommonProperties.getInitialSize());
	        datasource.setMinIdle(dataSourceCommonProperties.getMinIdle());
	        datasource.setMaxActive(dataSourceCommonProperties.getMaxActive());
	        datasource.setMaxWait(dataSourceCommonProperties.getMaxWait());
	        datasource.setTimeBetweenEvictionRunsMillis(dataSourceCommonProperties.getTimeBetweenEvictionRunsMillis());
	        datasource.setMinEvictableIdleTimeMillis(dataSourceCommonProperties.getMinEvictableIdleTimeMillis());
	        datasource.setValidationQuery(dataSourceCommonProperties.getValidationQuery());
	        datasource.setTestWhileIdle(dataSourceCommonProperties.isTestWhileIdle());
	        datasource.setTestOnBorrow(dataSourceCommonProperties.isTestOnBorrow());
	        datasource.setTestOnReturn(dataSourceCommonProperties.isTestOnReturn());
	        datasource.setPoolPreparedStatements(dataSourceCommonProperties.isPoolPreparedStatements());
	        try {
	            datasource.setFilters(dataSourceCommonProperties.getFilters());
	        } catch (SQLException e) {
	            logger.error("Druid configuration initialization filter error.", e);
	        }
	        return datasource;
	    }
	    
}
  • Create DataSourceCommonProperties:
package com.springboot.sixmonth.common.config.druid;

import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * Shared connection-pool properties, applied to all data sources.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@ConfigurationProperties(prefix = DataSourceCommonProperties.DS, ignoreUnknownFields = false)
public class DataSourceCommonProperties {
	final static String DS = "spring.datasource.commonConfig";

	private int initialSize = 10;
	private int minIdle;
	private int maxIdle;
	private int maxActive;
	private int maxWait;
	private int timeBetweenEvictionRunsMillis;
	private int minEvictableIdleTimeMillis;
	private String validationQuery;
	private boolean testWhileIdle;
	private boolean testOnBorrow;
	private boolean testOnReturn;
	private boolean poolPreparedStatements;
	private int maxOpenPreparedStatements;
	private String filters;

	private String mapperLocations;
	private String typeAliasPackage;


	//getters and setters are omitted here to save space; add them yourself (a minimal sketch follows the class)
	
}
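  • Note that @ConfigurationProperties binds values through ordinary JavaBean setters, so the omitted accessors are not optional. A minimal sketch of the missing methods (only two fields shown; the remaining fields follow the same pattern):

	public int getInitialSize() {
		return initialSize;
	}

	public void setInitialSize(int initialSize) {
		this.initialSize = initialSize;
	}

	public String getValidationQuery() {
		return validationQuery;
	}

	public void setValidationQuery(String validationQuery) {
		this.validationQuery = validationQuery;
	}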
  • Create DataSourceProperties:
package com.springboot.sixmonth.common.config.druid;

import java.util.Map;

import org.springframework.boot.context.properties.ConfigurationProperties;

/**
 * Central properties class; reads the data source settings from the configuration file.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@ConfigurationProperties(prefix = DataSourceProperties.DS, ignoreUnknownFields = false)
public class DataSourceProperties {
	final static String DS = "spring.datasource";

	private Map<String,String> mysqlMain;
	
	private Map<String,String> hive;
	
	private Map<String,String> commonConfig;
	
    
	//getters and setters are omitted here to save space; add them yourself (a minimal sketch follows the class)

}
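  • The getters on this class are called directly by HiveDruidConfig and MysqlMainDruidConfig (getHive(), getMysqlMain()), so they must exist for the configuration classes above to compile. A minimal sketch of the accessors for the two map fields:

	public Map<String, String> getMysqlMain() {
		return mysqlMain;
	}

	public void setMysqlMain(Map<String, String> mysqlMain) {
		this.mysqlMain = mysqlMain;
	}

	public Map<String, String> getHive() {
		return hive;
	}

	public void setHive(Map<String, String> hive) {
		this.hive = hive;
	}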
  • Create HiveJdbcBaseDaoImpl:
package com.springboot.sixmonth.common.config.jdbcConfig;

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * Wires a JdbcTemplate to the Hive data source.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@Repository
public class HiveJdbcBaseDaoImpl {
	
	private JdbcTemplate jdbcTemplate;

	public JdbcTemplate getJdbcTemplate() {
		return jdbcTemplate;
	}

	@Autowired
	public void setJdbcTemplate(@Qualifier("hiveDruidDataSource") DataSource dataSource) {
		this.jdbcTemplate = new JdbcTemplate(dataSource);
	}
	
}
  • Create MysqlMainJdbcBaseDaoImpl:
package com.springboot.sixmonth.common.config.jdbcConfig;

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;

/**
 * Wires a JdbcTemplate to the primary MySQL data source.
 * @author sixmonth
 * 2018-11-12
 */
@Repository
public class MysqlMainJdbcBaseDaoImpl {
	
	private JdbcTemplate jdbcTemplate;

	public JdbcTemplate getJdbcTemplate() {
		return jdbcTemplate;
	}

	@Autowired
	public void setJdbcTemplate(@Qualifier("mysqlDruidDataSource") DataSource dataSource) {
		this.jdbcTemplate = new JdbcTemplate(dataSource);
	}
	
}
  • Create a test class TestDao to verify the JdbcTemplate connection to MySQL:
package com.springboot.sixmonth.dao.jdbcDao.test;

import org.springframework.stereotype.Repository;

import com.springboot.sixmonth.common.config.jdbcConfig.MysqlMainJdbcBaseDaoImpl;

/**
 * Tests the JDBC connection to MySQL.
 * @author sixmonth
 * @Date 2019-05-20
 *
 */
@Repository
public class TestDao extends MysqlMainJdbcBaseDaoImpl{

	/**
	 * Test query against the MySQL data source.
	 * @return
	 */
	public String test() {
		String sql = "SELECT username from sys_user where username='admin'";
		String param = this.getJdbcTemplate().queryForObject(sql,String.class);
		return param;
	}
	
}	
  • Create a test class TestHiveDao to verify the JdbcTemplate connection to Hive:
package com.springboot.sixmonth.dao.jdbcDao.test;

import org.springframework.stereotype.Repository;
import com.springboot.sixmonth.common.config.jdbcConfig.HiveJdbcBaseDaoImpl;

/**
 * Tests the connection to Hive.
 * @author sixmonth
 * @Date 2019-05-18
 *
 */
@Repository
public class TestHiveDao extends HiveJdbcBaseDaoImpl{

	/**
	 * Fetches a sample value from the Hive database.
	 * @return
	 */
	public String test() {
		String sql = "SELECT name from sixmonth limit 1";
		String param = this.getJdbcTemplate().queryForObject(sql,String.class);
		return param;
	}
	
}	
  • Usage: inject the DAOs wherever they are needed:
	@Autowired
	private TestDao testDao;
	
	@Autowired
	private TestHiveDao testHiveDao;
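
  • The two DAOs can then be called directly from any Spring bean. As an illustration (the controller class, its package and the request paths below are assumptions, not part of the original project), a simple REST controller could expose both test queries:

package com.springboot.sixmonth.controller.test;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import com.springboot.sixmonth.dao.jdbcDao.test.TestDao;
import com.springboot.sixmonth.dao.jdbcDao.test.TestHiveDao;

/**
 * Illustrative controller that exercises both data sources.
 */
@RestController
public class TestController {

	@Autowired
	private TestDao testDao;

	@Autowired
	private TestHiveDao testHiveDao;

	//query the primary MySQL data source
	@GetMapping("/test/mysql")
	public String testMysql() {
		return testDao.test();
	}

	//query the Hive data source
	@GetMapping("/test/hive")
	public String testHive() {
		return testHiveDao.test();
	}

}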

Summary

  1. Spring Boot can be configured with multiple MySQL data sources, and also with different kinds of data sources such as Hive and Oracle; add whatever you need;
  2. When integrating MyBatis, one data source must be marked as primary; MyBatis uses the primary data source by default (see the sketch after this list);
  3. Practice is the only test of truth: roll up your sleeves and try it yourself~~
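
  • As an illustration of point 2: because the MySQL DataSource bean is marked @Primary, the mybatis-spring-boot-starter auto-configuration builds its SqlSessionFactory on that data source, so a plain mapper interface runs against MySQL with no extra wiring. A minimal sketch (the mapper interface, its package and the query are assumptions for illustration; the XML mappers configured via mapperLocations in the yml work the same way):

package com.springboot.sixmonth.dao.mapper.test;

import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

/**
 * Illustrative MyBatis mapper; it is served by the @Primary MySQL data source.
 */
@Mapper
public interface SysUserMapper {

	//annotation-based query against the primary (MySQL) data source
	@Select("SELECT username FROM sys_user WHERE username = #{username}")
	String findUsername(@Param("username") String username);

}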
