Spring Boot JPA: How do I connect multiple databases? - java

I currently have one database connected and it is working. I would like to connect another (and eventually 2 more) databases. How do I do so? There should be a solution using only annotations and properties files.
I read this
Profile Specific Properties
and it sort of helps, but I still don't know how to switch from one profile to the other in the code at runtime. I'm assuming I need to be connected to one profile at a time before I try to retrieve/persist things from different databases.
I also read this question, How to use 2 or more databases with spring?, but I don't know how well it works or whether it will apply. I'm not using a controller class and I don't know what that does. I'm also not sure how the config class they mention in the answer actually connects to the specific DO.
This is my application.properties file (I marked out the username and password, but they are there in my file):
hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
hibernate.show_sql=true
hibernate.format_sql=true
hibernate.default_schema=dbo
hibernate.packagesToScan=src.repositories.LMClientRepository.java
spring.jpa.generate-ddl=true
spring.jpa.hibernate.naming-strategy=org.hibernate.cfg.DefaultNamingStrategy
spring.datasource.username=***
spring.datasource.password=***
spring.datasource.url=jdbc:sqlserver://schqvsqlaod:1433;database=dbMOBClientTemp;integratedSecurity=false;
spring.datasource.testOnBorrow=true
spring.datasource.validationQuery=SELECT 1
spring.jpa.database=dbMOBClientTemp
spring.jpa.show-sql=true
spring.jpa.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
This is my application file:
package testApplication;
import java.util.ArrayList;
import java.util.List;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.orm.jpa.EntityScan;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Bean;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import fileRetrieval.InputFileParse;
import lmDataObjects.LMClientDO;
import lmDataObjects.LoadMethodDO;
import repositories.LMClientRepository;
import repositories.LoadMethodRepository;
@SpringBootApplication
@EnableJpaRepositories(basePackageClasses = LoadMethodRepository.class)
@EntityScan(basePackageClasses = LoadMethodDO.class)
@EnableCaching
public class Application {
private static final Logger log = LoggerFactory.getLogger(Application.class);
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Bean
public CommandLineRunner demo(LoadMethodRepository lm_repo, LMClientRepository lmc_repo) {
return (args) -> {
List<LMClientDO> lmlist = InputFileParse.getMultiGroupfile();
List<String> uniqueMediaIds = new ArrayList(InputFileParse.getUniqueMediaIds());
for (int i = 0; i < InputFileParse.getUniqueMediaIds().size(); i ++){
lm_repo.save(new LoadMethodDO(uniqueMediaIds.get(i)));
}
for (int i = 0; i < lmlist.size(); i++){
lmc_repo.save(new LMClientDO(lmlist.get(i).getClientId(), lmlist.get(i).getMediaId()));
}
//Here is where I would like to do stuff with data from the other database that I have not connected yet
};
}
}
I also made a new properties file called application-MTS.properties and I put data for the new database in there. Still unsure of what to do with it.
spring.datasource.username=***
spring.datasource.password=***
spring.datasource.url=jdbc:sqlserver://SCHQVSQLCON2\VSPD:1433;database=dbMTS;integratedSecurity=false;

You will need to define multiple DataSource beans that each represent the various database connection resources you plan to use.
You will then need to add a TransactionManager and EntityManagerFactory bean definition for each of those DataSource beans.
If you intend to have each DataSource participate in a JTA transaction, you'll need to also consider configuring a JTA transaction manager rather than individual resource local transaction managers.
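For example, here is a minimal sketch of what that can look like for the two databases in the question. The property prefixes (app.datasource.client and app.datasource.mts), the class name ClientDbConfig and the repository/entity package names are placeholders I have invented, and the imports assume Spring Boot 2.x, so treat this as an outline rather than a drop-in answer. Each database gets its own prefix in application.properties instead of sharing spring.datasource.*:
app.datasource.client.url=jdbc:sqlserver://schqvsqlaod:1433;database=dbMOBClientTemp;integratedSecurity=false;
app.datasource.client.username=***
app.datasource.client.password=***
app.datasource.mts.url=jdbc:sqlserver://SCHQVSQLCON2\VSPD:1433;database=dbMTS;integratedSecurity=false;
app.datasource.mts.username=***
app.datasource.mts.password=***
One @Configuration class per database then builds a DataSource, an EntityManagerFactory and a TransactionManager from its prefix, and tells Spring Data which repository package should use them:
package testApplication;

import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        basePackages = "repositories.client",               // hypothetical package for repositories that use the first DB
        entityManagerFactoryRef = "clientEntityManagerFactory",
        transactionManagerRef = "clientTransactionManager")
public class ClientDbConfig {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "app.datasource.client")
    public DataSource clientDataSource() {
        // Populated from the app.datasource.client.* properties above
        return DataSourceBuilder.create().build();
    }

    @Bean
    @Primary
    public LocalContainerEntityManagerFactoryBean clientEntityManagerFactory(EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(clientDataSource())
                .packages("lmDataObjects")                   // entities persisted in the first DB
                .persistenceUnit("client")
                .build();
    }

    @Bean
    @Primary
    public PlatformTransactionManager clientTransactionManager(
            @Qualifier("clientEntityManagerFactory") EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}
A second class (say MtsDbConfig) repeats the same three beans, without @Primary, bound to the app.datasource.mts prefix and pointing at the repository and entity packages for the MTS database. The @EnableJpaRepositories and @EntityScan annotations then move off the Application class into these per-database config classes, and each repository talks to the right database simply because of the package it lives in, so no profile switching at runtime is needed.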

Related

Spring boot JPA with different parent for entity and service

I have two different Maven projects. In the first I am trying to keep two modules, one for the "repository and entities" and a second for the services. The second project contains only one module, with the "controllers". Now I am having many problems; the first is "Not a managed type" for the entities. Another thing: if I keep everything in one module, or even in different modules under one parent project, it works flawlessly. However, I am trying to put the different packages in different projects and modules.
The @EntityScan, @EnableJpaRepositories and all the others are in place; the debug output states:
name: default
persistence provider classname: null
classloader: sun.misc.Launcher$AppClassLoader@42a57993
excludeUnlistedClasses: true
JTA datasource: null
Non JTA datasource: HikariDataSource (null)
Transaction type: RESOURCE_LOCAL
PU root URL: file:/F:/Software/MavenRepo/com/company/repo/1.0.0/repo-1.0.0.jar
Shared Cache Mode: UNSPECIFIED
Validation Mode: AUTO
Jar files URLs []
Managed classes names [
com.company.sitemap.repo.Page]
Mapping files names []
Properties []
However, at the end it states
Caused by: java.lang.IllegalArgumentException: Not a managed type: class com.company.sitemap.repo.Page
and the application fails to start.
Can you please help me out with this?
Here is my Application class file
package com.company.sitemap;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import com.company.sitemap.repo.SitemapRepoConfig;
import com.company.sitemap.service.SitemapConfig;
@SpringBootApplication
@EnableJpaRepositories(basePackageClasses = { SitemapRepoConfig.class })
@EntityScan(basePackages = {"com.nie.learn.sitemap.repo"})
@ComponentScan(basePackageClasses = { SitemapConfig.class, SitemapRepoConfig.class })
public class Sitemap extends SpringBootServletInitializer {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Sitemap.class, args);
for (String name : applicationContext.getBeanDefinitionNames()) {
System.out.println(name);
}
}
}
The entity class resides in a different project and module, let's say project Project-libs and module module-repo. I am trying to add this as a Maven dependency.
The entity file is as follows:
package com.company.sitemap.repo;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
@Entity
public class Page {
// id and column fields omitted here
}
Repo config to scan repo classes.
package com.company.sitemap.repo;
import org.springframework.context.annotation.Configuration;
@Configuration
public class SitemapRepoConfig {
}
Service config to scan service classes:
package com.company.sitemap.service;
import javax.validation.constraints.NotNull;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.company.sitemap.repo.PageRepository;
@Configuration
public class SitemapConfig {
@Bean
@NotNull
public SitemapService service(@NotNull PageRepository repo) {
return new SitemapService(repo);
}
}
I'll show you an example that helps us in a somewhat different case. This kind of config is used in Module1TestConf to allow testing modules separately using Spring Boot slicing (@DataJpaTest, etc.) in our multi-module Gradle project, to avoid loading the whole context.
We just limit our module scanning to the current module + additional entities.
@SpringBootConfiguration
@EnableAutoConfiguration
@EntityScan(
basePackages = "your.other.module.package.entity",
basePackageClasses = {SomeEntity1.class, SomeEntity2.class}
)
@ComponentScan(
value = "your.current.module.package",
excludeFilters = {
@ComponentScan.Filter(type = FilterType.CUSTOM, classes = TypeExcludeFilter.class),
@ComponentScan.Filter(type = FilterType.CUSTOM, classes = AutoConfigurationExcludeFilter.class)})
public class Module1TestConf {
}
That is basically what @SpringBootApplication does under the hood.
So I'm not 100% sure that it would help you. Is there a valid reason to keep the projects as separate Maven projects (use Gradle! :) )?
Anyway, you just need to get those entities scanned by Spring so they become managed types...
Also try specifying the entity classes directly in basePackageClasses.
You need to scan the package so that Spring can create the beans automatically. You are defining Page as a bean but not scanning it. For the scan you need to add
@ComponentScan(basePackages = { "com.nie.learn.*" })
If you have already added it, then please check whether Page is annotated with @Entity like below
@javax.persistence.Entity
public class Page {}
@SpringBootApplication
@EnableJpaRepositories(basePackages = { "com.company.sitemap.repo" })
@EntityScan(basePackages = {"com.company.sitemap.repo"})
public class Sitemap extends SpringBootServletInitializer {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Sitemap.class, args);
for (String name : applicationContext.getBeanDefinitionNames()) {
System.out.println(name);
}
}
}
Try the above configuration.

Spring boot connect with existing JDBC Connection

Spring Boot provides its own database connection according to the configuration in application.properties. But here I have a service which provides me an object of type javax.sql.Connection.
src/main/resources/application.properties
server.port=9090
spring.jpa.database=POSTGRESQL
spring.datasource.platform=postgres
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=root
spring.jpa.show-sql=true
spring.jpa.generate-ddl=true
spring.jpa.hibernate.ddl-auto=update
spring.jpa.properties.hibernate.jdbc.lob.non_contextual_creation=true
Here is code for repository
package com.example.springbootdemo.repositories;
import org.springframework.data.repository.CrudRepository;
import com.example.springbootdemo.model.Box;
public interface BoxRepository extends CrudRepository<Box, Long> {
}
Code for controller
package com.example.springbootdemo.controllers;
import com.example.springbootdemo.model.Box;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;
import com.example.springbootdemo.repositories.BoxRepository;
@RestController
public class BoxController {
@Autowired
BoxRepository boxrepository;
@PostMapping("/box")
public Box addBox(Box box){
return this.boxrepository.save(box);
}
}
Here, when I call the save function of the JPA repository, it saves the object using a database connection that Spring Boot builds internally through its own wrapper.
But I have to use a jar which gives me a database Connection. Instead of the configuration in src/main/resources/application.properties, I have to use the Connection object returned from this jar. So I need to override the connection object that Spring Boot is using internally, and I am not able to figure out how to do this.
You have this path: src/main/resources/application.properties
and here you need to configure the connection.
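If the goal really is to reuse the Connection handed out by the jar rather than letting Spring Boot build its own pool from those properties, one possible approach (just a sketch; ExternalConnectionProvider is a made-up stand-in for whatever class the jar actually exposes) is to declare your own DataSource bean wrapping that connection. Spring Boot backs off its auto-configured DataSource when it sees one defined by you:
package com.example.springbootdemo.config;   // hypothetical package

import java.sql.Connection;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.SingleConnectionDataSource;

@Configuration
public class ExternalConnectionConfig {

    @Bean
    public DataSource dataSource() {
        // ExternalConnectionProvider stands in for the class in the jar that
        // returns the Connection mentioned in the question.
        Connection connection = ExternalConnectionProvider.getConnection();
        // suppressClose = true keeps JPA/JdbcTemplate from closing the shared connection
        return new SingleConnectionDataSource(connection, true);
    }
}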

How to query and retrieve results from ~100 customer databases using SpringBoot?

I have a scenario where I have to run 10 different queries on 100 customer databases with a similar structure and then push the results to an Elasticsearch cluster for analysis. All the database connections are configured inside my application.properties file. I decided to use Spring Boot for the project and the Java High Level REST Client as the Elasticsearch API. However, I found that in Spring Boot I have to create an entity class for every entity and a separate class and methods for each database connection. I am new to Spring Boot and am also not understanding the concepts of entitymanager or rowmapper. It is quite different from Java connection-statement-query-resultset format. Kindly help me.
I have tried creating this database configuration class, where I configure a single database read from the properties file. I have created the basic dataSource() and jdbcTemplate() methods:
package elasticsearch;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactory",
basePackages = { "elasticsearch" }
)
public class DatabaseConfig {
#Bean(name = "dataSource")
#ConfigurationProperties(prefix = "primary.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "jdbcTemplate")
public JdbcTemplate jdbcTemplate(#Qualifier("dataSource") DataSource dataSource){
return new JdbcTemplate(dataSource);
}
#Bean(name = "entityManagerFactory")
public LocalContainerEntityManagerFactoryBean entityManagerFactory(
EntityManagerFactoryBuilder builder,
#Qualifier("dataSource") DataSource dataSource) {
return builder
.dataSource(dataSource)
.packages("elasticsearch")
.persistenceUnit("elasticsearch")
.build();
}
#Bean(name = "transactionManager")
public PlatformTransactionManager transactionManager(
#Qualifier("entityManagerFactory") EntityManagerFactory
entityManagerFactory
) {
return new JpaTransactionManager(entityManagerFactory);
}
}
I do not want to create an entity class for every object because the queries and also the results can vary. Also, I do not want to create RowMappers, because I have already written JSON mappers for the retrieved rows to push them to Elasticsearch.
I am new to Spring Boot and am also not understanding the concepts of
entitymanager or rowmapper.
You don't show any of the details of your schema, but I don't think you need both JPA and JdbcTemplate. I'd recommend one or the other.
My preference would be JdbcTemplate.
JPA/Hibernate is overkill, adding complexity that you don't need. JdbcTemplate will be fine if you are comfortable with writing SQL SELECTs.
It is quite different from Java connection-statement-query-resultset format.
Not really. JdbcTemplate helps you with the boilerplate, but it's still JDBC underneath.
100 client databases will require 100 sets of URL and credentials, one for each. That is a lot of configuration. You can't get around that.
The problem is intractable if the schemas are not identical for all customers.
I would separate the two problems: querying for customer data and pushing to Elastic Search.
You only need a single RowMapper per query if the schema and query are identical for all customers.
I think a single repository/data access object can be used. You only need to write and test it once, but you need to instantiate a new instance at runtime for each database connection.
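As a rough sketch of that idea (class, method and parameter names here are invented for illustration, not taken from the question): build one DataSource per customer from the configured URL and credentials, then run the same SQL through a JdbcTemplate against each one. queryForList returns each row as a Map, which can be fed straight into the JSON mappers you already have for Elasticsearch, so no per-query RowMapper or entity class is needed.
package elasticsearch;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.jdbc.core.JdbcTemplate;

public class CustomerQueryRunner {

    // One DataSource per customer database; in practice the URLs and credentials
    // would be read from application.properties.
    private final List<DataSource> customerDataSources = new ArrayList<>();

    public void addCustomer(String url, String username, String password) {
        customerDataSources.add(DataSourceBuilder.create()
                .url(url)
                .username(username)
                .password(password)
                .build());
    }

    // Runs the same query against every customer database and collects the rows.
    public List<Map<String, Object>> runEverywhere(String sql) {
        List<Map<String, Object>> allRows = new ArrayList<>();
        for (DataSource dataSource : customerDataSources) {
            JdbcTemplate template = new JdbcTemplate(dataSource);
            allRows.addAll(template.queryForList(sql));
        }
        return allRows;
    }
}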

Spring Boot Test: @TestPropertySource not overriding @EnableAutoConfiguration

I am using Spring Data LDAP to get user data from an LDAP server.
My file structure looks like this:
main
java
com.test.ldap
Application.java
Person.java
PersonRepository.java
resources
application.yml
schema.ldif
test
java
Tests.java
resources
test.yml
test_schema.ldif
And here is my test class:
import com.test.ldap.Person;
import com.test.ldap.PersonRepository;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.TestPropertySource;
import org.springframework.test.context.junit4.SpringRunner;
import java.util.List;
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {PersonRepository.class})
@TestPropertySource(locations = "classpath:test.yml")
@EnableAutoConfiguration
public class Tests {
#Autowired
private PersonRepository personRepository;
#Test
public void testGetPersonByLastName() {
List<Person> names = personRepository.getPersonNamesByLastName("Bachman");
assert(names.size() > 0);
}
}
The problem is, Spring Boot is loading the application.yml and schema.ldif files instead of my test YAML and LDIF files, despite the fact that my @TestPropertySource annotation is explicitly listing test.yml. This seems to be due to the auto-configuration, which I would prefer to use for convenience.
I would expect @TestPropertySource to take higher precedence than the auto-configuration, but that does not seem to be the case. Is this a bug in Spring, or am I misunderstanding something?
For the record, here is my test.yml file (it does specify test_schema.ldif):
spring:
  ldap:
    # Embedded Spring LDAP
    embedded:
      base-dn: dc=test,dc=com
      credential:
        username: uid=admin
        password: secret
      ldif: classpath:test_schema.ldif
      port: 12345
      validation:
        enabled: false
So I was able to work around this by manually specifying the properties needed to make use of the LDIF file. This is because, according to the @TestPropertySource documentation, inlined properties have higher precedence than property files.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = {PersonRepository.class})
@TestPropertySource(properties =
{"spring.ldap.embedded.ldif=test_schema.ldif", "spring.ldap.embedded.base-dn=dc=test,dc=com"})
@EnableAutoConfiguration
public class Tests {
//...
}
This is not the best workaround, however: what if I had more than just two properties I needed to define? It would be impractical to list them all there.
Edit:
Renaming my test.yml file to application.yml, so it overrides the production file that way, did the trick. As it turns out, the @TestPropertySource annotation only works for .properties files.
I discovered that YML files DO NOT work with the @TestPropertySource annotation.
A clean way around this is to use @ActiveProfiles. Assuming that your YML file with the test properties is called
application-integration-test.yml
then you should use the annotation like this
@ActiveProfiles("integration-test")

How do these 2 Spring Java configuration classes work together?

I am studying for the Spring Core certification and I have the following doubt about an exercise related to configuring beans the Java configuration way.
So I have the following RewardsConfig class that configures my beans (this class is in the application folder src/main/java):
package config;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import rewards.RewardNetwork;
import rewards.internal.RewardNetworkImpl;
import rewards.internal.account.AccountRepository;
import rewards.internal.account.JdbcAccountRepository;
import rewards.internal.restaurant.JdbcRestaurantRepository;
import rewards.internal.restaurant.RestaurantRepository;
import rewards.internal.reward.JdbcRewardRepository;
import rewards.internal.reward.RewardRepository;
@Configuration
public class RewardsConfig {
@Autowired
DataSource dataSource;
@Bean
public RewardNetwork rewardNetwork(){
return new RewardNetworkImpl(accountRepository(), restaurantRepository(), rewardRepository());
}
@Bean
public AccountRepository accountRepository(){
JdbcAccountRepository repository = new JdbcAccountRepository();
repository.setDataSource(dataSource);
return repository;
}
@Bean
public RestaurantRepository restaurantRepository(){
JdbcRestaurantRepository repository = new JdbcRestaurantRepository();
repository.setDataSource(dataSource);
return repository;
}
@Bean
public RewardRepository rewardRepository(){
JdbcRewardRepository repository = new JdbcRewardRepository();
repository.setDataSource(dataSource);
return repository;
}
}
As you can see, I declare 4 methods that are used to create 4 beans and that specify the dependencies among these beans.
So I have a RewardNetwork bean that is implemented by the RewardNetworkImpl class, which depends on the following 3 beans: AccountRepository, RestaurantRepository and RewardRepository.
Is this the correct interpretation of Java configuration in Spring?
Can I say, for example, that RewardNetwork is the declared bean and that RewardNetworkImpl is the concrete implementation of this bean?
All 3 beans (AccountRepository, RestaurantRepository and RewardRepository) depend on another bean, dataSource, which, as you can see in the previous code snippet, is declared as @Autowired:
@Autowired
DataSource dataSource;
This bean is not declared in this configuration class because it changes according to the environment (test, development, production).
So, in my case it is declared into the unit test folder src/test/java:
package rewards;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
@Configuration
public class TestInfrastructureConfig {
/**
* Creates an in-memory "rewards" database populated
* with test data for fast testing
*/
@Bean
public DataSource dataSource(){
return
(new EmbeddedDatabaseBuilder())
.addScript("classpath:rewards/testdb/schema.sql")
.addScript("classpath:rewards/testdb/test-data.sql")
.build();
}
}
So the dataSource bean defines a DataSource that is valid only for the test environment (used when I perform a unit test).
Now my doubt is: I have 2 different configuration classes, and the dataSource bean is not defined in the RewardsConfig configuration class that contains the 3 beans that use it. Couldn't I just use the @Import annotation to pull it into RewardsConfig?
Something like this:
@Import(TestInfrastructureConfig.class)
How does it work exactly?
Thanks
You don't have to import beans to make them available for autowiring. @Import is used to add extra configuration classes.
You really don't want to hard-import a test configuration class, because then your production code is referring to test-only code (and, in this case, always activating it). Instead, think of your configuration class more like an abstract class: declare autowired beans, but don't worry about how they get there. The downstream (runtime) configuration will supply them, and you don't need to know how. Maybe you're supplying an in-memory H2 for testing and using Spring Cloud Connectors for actual runs, doesn't matter.
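One way to see this concretely (a sketch, not taken from the course material; imports as in the TestInfrastructureConfig shown above, plus org.springframework.context.annotation.Import and the context classes): keep the import direction pointing from the environment-specific configuration to the shared one, so the DataSource defined for the environment satisfies the @Autowired field in RewardsConfig.
@Configuration
@Import(RewardsConfig.class)   // pulls in rewardNetwork() and the three repository beans
public class TestInfrastructureConfig {

    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .addScript("classpath:rewards/testdb/schema.sql")
                .addScript("classpath:rewards/testdb/test-data.sql")
                .build();
    }
}
Bootstrapping only the test configuration then yields the fully wired object graph:
ApplicationContext context = new AnnotationConfigApplicationContext(TestInfrastructureConfig.class);
RewardNetwork rewardNetwork = context.getBean(RewardNetwork.class);
RewardsConfig itself never imports TestInfrastructureConfig, so it stays free of environment-specific code; a production configuration class would import it in exactly the same way while defining a different dataSource bean.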
