Spring Multi datasource with similar schema - java

I have 4 databases with a similar schema on PostgreSQL.
My current code looks like this:
Resources
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=postgres
DAO
public interface AccountRepository extends JpaRepository<Account, Long>{}
Configuration
@Configuration
public class AccountServiceConfiguration {

    @Autowired
    private AccountRepository accountRepository;

    @Bean
    public AccountService accountService() {
        return new AccountService(accountRepository);
    }
}
Controller
@RestController
@RequestMapping("/accounts")
public class AccountController {

    @Autowired
    private AccountService accountService;

    @RequestMapping(value = "/", method = RequestMethod.GET)
    public Page<Account> getAccounts(Integer page, Integer size) {
        return accountService.getAll(page, size);
    }
}
Service
public class AccountService {

    private final AccountRepository accountRepository;

    public AccountService(AccountRepository accountRepository) {
        this.accountRepository = accountRepository;
    }

    public Page<Account> getAll(Integer page, Integer size) {
        PageRequest pageRequest = new PageRequest(page, size);
        return accountRepository.findAll(pageRequest);
    }
}
I want to change it to something like this:
Resources
spring.db1.url=jdbc:postgresql://db1:5432/postgres
spring.db1.username=postgres1
spring.db1.password=postgres1
spring.db2.url=jdbc:postgresql://db2:5432/postgres
spring.db2.username=postgres2
spring.db2.password=postgres2
spring.db3.url=jdbc:postgresql://db3:5432/postgres
spring.db3.username=postgres3
spring.db3.password=postgres3
spring.db4.url=jdbc:postgresql://db4:5432/postgres
spring.db4.username=postgres4
spring.db4.password=postgres4
Controller
...
public Page<Account> getAccounts(Integer page, Integer size, String env) {
return accountService.getAll(page, size, env);
}
...
Service
public class AccountService {

    private final Map<String, AccountRepository> mapAccountRepository;

    public AccountService(Map<String, AccountRepository> mapAccountRepository) {
        this.mapAccountRepository = mapAccountRepository;
    }

    public Page<Account> getAll(Integer page, Integer size, String env) {
        PageRequest pageRequest = new PageRequest(page, size);
        // look up the repository for the given env and run the query there
    }
}
How can I load the 4 data sources (maybe into a map) and search by environment?
If I send env=db1, I want the request to run against db1.
If you have another solution I'll take it, but it must use one repository and one entity to search across all the databases.
Thank you :)

According to your comments you want a single Repository instance to switch between different schemata.
This won't work.
What you can do is provide a Facade over multiple Repository instances that delegates each call to one of them according to some parameter/field/property.
But one way or another you have to create a separate Repository instance, with a different database connection, for each database.
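A minimal sketch of such a Facade, assuming one AccountRepository instance per database has already been created and collected into a map keyed by the env value (db1 .. db4) in your configuration:
import java.util.Map;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;

public class AccountService {

    // one repository per database, keyed by the env name ("db1" .. "db4")
    private final Map<String, AccountRepository> repositories;

    public AccountService(Map<String, AccountRepository> repositories) {
        this.repositories = repositories;
    }

    public Page<Account> getAll(Integer page, Integer size, String env) {
        AccountRepository repository = repositories.get(env);
        if (repository == null) {
            throw new IllegalArgumentException("Unknown environment: " + env);
        }
        // new PageRequest(...) as in the question; newer Spring Data uses PageRequest.of(page, size)
        return repository.findAll(new PageRequest(page, size));
    }
}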

What you are describing is called multi-tenancy using multiple databases.
To accomplish this you need to configure the persistence layer manually rather than rely completely on Spring Boot's auto-configuration.
The persistence layer configuration involves:
Hibernate, JPA and datasource properties
Datasource beans (a minimal sketch follows after this list)
Entity manager factory bean (in the case of Hibernate, with properties specifying that this is a multi-tenant entity manager factory bean, plus a tenant connection provider and a tenant resolver)
Transaction manager bean
Spring Data JPA and transaction support configuration
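As a rough sketch of the datasource beans item only, assuming Spring Boot's DataSourceProperties/DataSourceBuilder and the spring.db1/spring.db2 prefixes from the question (db3 and db4 follow the same pattern):
import javax.sql.DataSource;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MultipleDataSourceConfig {

    @Bean
    @ConfigurationProperties(prefix = "spring.db1")
    public DataSourceProperties db1Properties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource db1DataSource() {
        return db1Properties().initializeDataSourceBuilder().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "spring.db2")
    public DataSourceProperties db2Properties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource db2DataSource() {
        return db2Properties().initializeDataSourceBuilder().build();
    }

    // db3 and db4 follow the same pattern; each DataSource then needs its own
    // entity manager factory and transaction manager (or a Hibernate multi-tenant
    // connection provider), as listed above.
}
The repositories built on top of these DataSource beans can then be collected into the map used by the Facade from the previous answer.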
In a blog post I recently published, Multi-tenant applications using Spring Boot, JPA, Hibernate and Postgres, I cover this exact problem with a detailed implementation.

Related

Spring mvc - which layer should convert entities to dtos (and vice versa)

In which layer should DTO/entity conversion take place?
Given the following structure in a Spring MVC application:
Controller
Service
Repository
The approach I'm using now, where the service layer is @Transactional:
@RestController
public class ExampleController {

    @Autowired
    private ExampleService exampleService;

    @Autowired
    private ExampleMapper exampleMapper;

    @GetMapping("/examples")
    public ResponseEntity<List<ExampleDto>> getAll() {
        var examples = exampleService.getAll();
        return ResponseEntity.ok(exampleMapper.examplesToExampleDtos(examples));
    }

    @PostMapping("/examples")
    public ResponseEntity<Void> create(@RequestBody @Valid ExampleCreateDto createDto) {
        var example = exampleService.create(createDto);
        return ResponseEntity.created(URI.create("examples/" + example.getId())).build();
    }

    // PUT, DELETE, ...
}
@Service
@Transactional
public class ExampleService {

    @Autowired
    private ExampleRepository exampleRepository;

    @Autowired
    private ExampleMapper exampleMapper;

    public List<Example> getAll() {
        return exampleRepository.findAll();
    }

    public Example create(ExampleCreateDto createDto) {
        var example = exampleMapper.asExample(createDto);
        return exampleRepository.save(example);
    }
}

public interface ExampleRepository extends JpaRepository<Example, Long> {
}
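The ExampleMapper used above is not shown in the question; assuming a MapStruct-style mapper (the method names are taken from the snippets above), it could look roughly like this:
import java.util.List;
import org.mapstruct.Mapper;

// hypothetical sketch; assumes MapStruct is on the classpath
@Mapper(componentModel = "spring")
public interface ExampleMapper {

    List<ExampleDto> examplesToExampleDtos(List<Example> examples);

    Example asExample(ExampleCreateDto createDto);
}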
Why I chose this approach:
The service layer is transactional, so by the time we get back to the controller all changes have been flushed and generated fields (the version field, for example) are set.
It makes you think about your entity graph. Say you have a Person entity with a list of Departments, and the PersonDto also contains the list of DepartmentDtos; this forces you to fetch all departments beforehand or you will run into a LazyInitializationException in the controller layer.
Which in my opinion is a good thing, because if you performed the mapping in the service you would be doing N + 1 queries (N being the number of departments) without realizing it.
Services that need each other to perform their business tasks work on the entity model instead of the DTO model; the DTOs might carry validation (@NotNull, @Size, ...) that is only supposed to be applied when data comes from the outside, while internally not every validation should apply.
Business rules will still be checked in the service layer as part of the service method.
The only thing here is that for updates/creates the services still communicate by passing DTOs instead of entities.
I googled this topic a lot, but couldn't find a definitive answer.

Use Spring @RefreshScope, @Conditional annotations to replace bean injection at runtime after a ConfigurationProperties has changed

I'm running a PoC around replacing bean injection at runtime after a ConfigurationProperties has changed. This is based on Spring Boot's dynamic configuration properties support, as summarised here by Dave Syer from Pivotal.
In my application I have a simple interface implemented by two different concrete classes:
@Component
@RefreshScope
@ConditionalOnExpression(value = "'${config.dynamic.context.country}' == 'it'")
public class HelloIT implements HelloService {

    @Override
    public String sayHello() {
        return "Ciao dall'italia";
    }
}
and
@Component
@RefreshScope
@ConditionalOnExpression(value = "'${config.dynamic.context.country}' == 'us'")
public class HelloUS implements HelloService {

    @Override
    public String sayHello() {
        return "Hi from US";
    }
}
The application.yaml served by the Spring Cloud Config server is:
config:
  name: Default App
  dynamic:
    context:
      country: us
and the related ConfigurationProperties class:
@Configuration
@ConfigurationProperties(prefix = "config.dynamic")
public class ContextHolder {

    private Map<String, String> context;

    Map<String, String> getContext() {
        return context;
    }

    public void setContext(Map<String, String> context) {
        this.context = context;
    }
}
My client app entrypoint is:
@SpringBootApplication
@RestController
@RefreshScope
public class App1Application {

    @Autowired
    private HelloService helloService;

    @RequestMapping("/hello")
    public String hello() {
        return helloService.sayHello();
    }
}
The first time I browse the http://localhost:8080/hello endpoint it returns "Hi from US".
After that I change country: us to country: it in application.yaml on the Spring Cloud Config server, and then hit the actuator/refresh endpoint (on the client app).
The second time I browse http://localhost:8080/hello it still returns "Hi from US" instead of "Ciao dall'italia" as I would expect.
Is this use case supported in Spring Boot 2 when using @RefreshScope? In particular, I'm referring to using it along with @Conditional annotations.
This implementation worked for me:
@Component
@RefreshScope
public class HelloDelegate implements HelloService {

    @Delegate // Lombok delegate (for the sake of brevity)
    private final HelloService delegate;

    public HelloDelegate(
            // just inject the value from the Spring configuration
            @Value("${country}") String country
    ) {
        switch (country) {
            case "it":
                this.delegate = new HelloIT();
                break;
            default:
                this.delegate = new HelloUS();
                break;
        }
    }
}
It works the following way:
When the first invocation of a service method happens, Spring creates the HelloDelegate bean with the configuration effective at that moment; the bean is put into the refresh scope cache
Because of @RefreshScope, whenever the configuration changes (the country property in particular in this case) the HelloDelegate bean gets cleared from the refresh scope cache
When the next invocation happens, Spring has to create the bean again because it no longer exists in the cache, so step 1 is repeated with the new country property
As far as I have observed the behavior of this implementation, Spring will try to avoid recreating a RefreshScope bean if its configuration was untouched.
I was looking for a more generic way of doing this kind of "runtime" implementation replacement when I found this question. The implementation has one significant disadvantage: if the delegated beans have complex, non-homogeneous configuration (e.g. each bean has its own properties) the code becomes messy and therefore unsafe.
I use this approach to provide additional testability, so that QA can switch between a stub and the real integration without significant effort. I would strongly recommend avoiding this approach for business functionality.
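For reference, a minimal sketch of what the refresh flow above assumes on the client app (assuming Spring Boot 2 actuator property names): the refresh endpoint is not exposed over HTTP by default, so application.properties needs something like
management.endpoints.web.exposure.include=refresh
and the refresh is then triggered with an empty POST to /actuator/refresh after the configuration has changed on the config server.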

Spring (Boot) validation annotations for different layers on the same class

Given a web application with Spring Boot, Spring MVC and Spring Data (with MongoDB as the database), and one class used to represent a request on multiple layers (REST, service, persistence):
Is it possible to declaratively specify validation constraints on the fields of the class such that some of them apply only to certain layers (or are ignored by some)?
Example:
Entity (getter and setter autogenerated)
public class User {

    private String name;

    @NotEmpty
    private String role;
}
where @NotEmpty is a JSR 303 annotation.
REST API layer
role does not exist here
@RestController
public class RegisterController {

    @Autowired
    private UserService service;

    @PostMapping
    public User register(@Valid User u) {
        return service.createAppUser(u);
    }
}
Service layer
role is set by the implementation and is required by the persistence layer
@Service
public class UserService {

    @Autowired
    private UserRepo repo;

    public User createAppUser(User u) {
        u.setRole("APP_USER");
        return repo.save(u);
    }
}
where repo is a Spring Data MongoRepository.
I can think of two approaches which solve this:
Introduce a DTO object for the REST API layer
Manual/procedural validation, either using a Spring Validator or something else; it doesn't matter - simply nothing declarative
I don't like either of these very much, as they require a lot of boilerplate for such a trivial case.
You can use a validation group and the @Validated annotation,
like this:
Entity
@NotEmpty(groups = Create.class)
Method
public User register(@Validated(Create.class) User u) {
    return service.createAppUser(u);
}
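The Create group used here is just a marker interface you define yourself. A minimal sketch (shown together for brevity, each type in its own file; assuming javax.validation.constraints.NotEmpty, older setups use org.hibernate.validator.constraints.NotEmpty):
import javax.validation.constraints.NotEmpty;

// marker interface used only to identify the validation group
public interface Create {
}

public class User {

    private String name;

    // only validated when the Create group is requested, e.g. via @Validated(Create.class)
    @NotEmpty(groups = Create.class)
    private String role;

    // getters and setters omitted
}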

ACL security in Spring Boot

I am having issues setting up ACL through Java config in a Spring Boot application. I have created one small project to reproduce the issues.
I have tried a few different approaches. The first issue I had was with EhCache, and after I fixed that (I assume I did) I couldn't log in any more, and it looks like all the data is gone.
There are 4 classes with different configurations:
ACLConfig1.class
ACLConfig2.class
ACLConfig3.class
ACLConfig4.class
All @PreAuthorize and @PostAuthorize annotations are working as expected, except hasPermission.
The controller holds 4 endpoints: one for User, one for Admin, one public, and the last one, which gives me a headache: @PostAuthorize("hasPermission(returnObject, 'administration')")
I am pretty sure the inserts in the DB are correct. This class is one of the four configurations, the last one I tried:
@Configuration
@EnableGlobalMethodSecurity(prePostEnabled = true, securedEnabled = true)
public class ACLConfig4 {

    @Autowired
    DataSource dataSource;

    @Bean
    public EhCacheBasedAclCache aclCache() {
        return new EhCacheBasedAclCache(aclEhCacheFactoryBean().getObject(), permissionGrantingStrategy(), aclAuthorizationStrategy());
    }

    @Bean
    public EhCacheFactoryBean aclEhCacheFactoryBean() {
        EhCacheFactoryBean ehCacheFactoryBean = new EhCacheFactoryBean();
        ehCacheFactoryBean.setCacheManager(aclCacheManager().getObject());
        ehCacheFactoryBean.setCacheName("aclCache");
        return ehCacheFactoryBean;
    }

    @Bean
    public EhCacheManagerFactoryBean aclCacheManager() {
        return new EhCacheManagerFactoryBean();
    }

    @Bean
    public DefaultPermissionGrantingStrategy permissionGrantingStrategy() {
        ConsoleAuditLogger consoleAuditLogger = new ConsoleAuditLogger();
        return new DefaultPermissionGrantingStrategy(consoleAuditLogger);
    }

    @Bean
    public AclAuthorizationStrategy aclAuthorizationStrategy() {
        return new AclAuthorizationStrategyImpl(new SimpleGrantedAuthority("ROLE_ADMINISTRATOR"));
    }

    @Bean
    public LookupStrategy lookupStrategy() {
        return new BasicLookupStrategy(dataSource, aclCache(), aclAuthorizationStrategy(), new ConsoleAuditLogger());
    }

    @Bean
    public JdbcMutableAclService aclService() {
        JdbcMutableAclService service = new JdbcMutableAclService(dataSource, lookupStrategy(), aclCache());
        return service;
    }

    @Bean
    public DefaultMethodSecurityExpressionHandler defaultMethodSecurityExpressionHandler() {
        return new DefaultMethodSecurityExpressionHandler();
    }

    @Bean
    public MethodSecurityExpressionHandler createExpressionHandler() {
        DefaultMethodSecurityExpressionHandler expressionHandler = defaultMethodSecurityExpressionHandler();
        expressionHandler.setPermissionEvaluator(new AclPermissionEvaluator(aclService()));
        expressionHandler.setPermissionCacheOptimizer(new AclPermissionCacheOptimizer(aclService()));
        return expressionHandler;
    }
}
What am I missing here? Why do I have no data if I use ACLConfig3.class or ACLConfig4.class? Is there any example of how this should be configured programmatically in Spring Boot?
The reason why you have no data was a bit tricky to find out. As soon as you define a MethodSecurityExpressionHandler bean in your config, there is no data in the database tables. This is because your data.sql file isn't executed.
Before explaining why data.sql isn't executed I'd first like to point out that you're not using the file as intended.
data.sql is executed by spring-boot after Hibernate has been initialized and normally only contains DML statements. Your data.sql contains both DDL (schema) statements and DML (data) statements. This isn't ideal, as some of your DDL statements clash with Hibernate's hibernate.hbm2ddl.auto behaviour (note that spring-boot uses 'create-drop' when an embedded DataSource is being used). You should put your DDL statements in schema.sql and your DML statements in data.sql. As you're manually defining all tables you should disable hibernate.hbm2ddl.auto (by adding spring.jpa.hibernate.ddl-auto=none to application.properties).
That being said, let's take a look at why data.sql isn't executed.
The execution of data.sql is triggered via an ApplicationEvent that's fired via a BeanPostProcessor. This BeanPostProcessor (DataSourceInitializedPublisher) is created as a part of spring-boot's Hibernate/JPA auto configuration (see org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration, org.springframework.boot.autoconfigure.orm.jpa.DataSourceInitializedPublisher and org.springframework.boot.autoconfigure.jdbc.DataSourceInitializer).
Normally the DataSourceInitializedPublisher is created before the (embedded) DataSource and everything works as expected, but defining a custom MethodSecurityExpressionHandler alters the normal bean creation order.
As you've configured @EnableGlobalMethodSecurity, you're automatically importing GlobalMethodSecurityConfiguration.
Spring-security related beans are created early on. As your MethodSecurityExpressionHandler requires a DataSource for the ACL stuff, and the spring-security related beans require your custom MethodSecurityExpressionHandler, the DataSource is created earlier than usual; in fact it's created so early that spring-boot's DataSourceInitializedPublisher isn't created yet.
The DataSourceInitializedPublisher is created later on but as it didn't notice the creation of a DataSource bean, it also doesn't trigger the execution of data.sql.
So long story short: the security configuration alters the normal bean creation order which results in data.sql not being loaded.
I guess that fixing the bean creation order would do the trick, but as I don't know how (without further experimentation), I propose the following solution: manually define your DataSource and take care of data initialization yourself.
@Configuration
public class DataSourceConfig {

    @Bean
    public EmbeddedDatabase dataSource() {
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2)
                // as your data.sql file contains both DDL & DML you might want to rename it (e.g. init.sql)
                .addScript("classpath:/data.sql")
                .build();
    }
}
As your data.sql file contains all DDL required by your application you can disable hibernate.hbm2ddl.auto. Add spring.jpa.hibernate.ddl-auto=none to application.properties.
When defining your own DataSource, spring-boot's DataSourceAutoConfiguration normally backs out, but if you want to be sure you can also exclude it (optional).
@SpringBootConfiguration
@EnableAutoConfiguration(exclude = DataSourceAutoConfiguration.class)
@ComponentScan
@EnableCaching
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
This should fix your 'no data' problem. But in order to get everything working as expected you need to make 2 more modifications.
First of all, you should only define one MethodSecurityExpressionHandler bean. Currently you're defining 2 MethodSecurityExpressionHandler beans. Spring-security won't know which one to use and will (silently) use its own internal MethodSecurityExpressionHandler instead. See org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration#setMethodSecurityExpressionHandler.
@Configuration
@EnableGlobalMethodSecurity(prePostEnabled = true, securedEnabled = true)
public class MyACLConfig {

    //...

    @Bean
    public MethodSecurityExpressionHandler createExpressionHandler() {
        DefaultMethodSecurityExpressionHandler securityExpressionHandler = new DefaultMethodSecurityExpressionHandler();
        securityExpressionHandler.setPermissionEvaluator(new AclPermissionEvaluator(aclService()));
        securityExpressionHandler.setPermissionCacheOptimizer(new AclPermissionCacheOptimizer(aclService()));
        return securityExpressionHandler;
    }
}
The last thing you need to do is make the getId() method in Car public.
@Entity
public class Car {

    //...

    public long getId() {
        return id;
    }

    //...
}
The standard ObjectIdentityRetrievalStrategy will look for a public method 'getId()' when trying to determine an object's identity during ACL permission evaluation.
(Note that I've based my answer upon ACLConfig4.)

@Transactional has no effect

I'm currently having the issue that the @Transactional annotation doesn't seem to start a transaction for Neo4j (it doesn't work with any of my @Transactional annotated methods, not just the following example).
Example:
I have this method (UserService.createUser), which creates a user node in the Neo4j graph first and then creates the user (with additional information) in MongoDB. (MongoDB doesn't support transactions, hence the idea to create the user node first, then insert the entity into MongoDB, and commit the Neo4j transaction afterwards.)
The method is annotated with @Transactional, yet an org.neo4j.graphdb.NotInTransactionException is thrown when it comes to creating the user in Neo4j.
Here are my configuration and code, respectively:
Code-based SDN-Neo4j configuration:
@Configuration
@EnableTransactionManagement // mode = proxy
@EnableNeo4jRepositories(basePackages = "graph.repository")
public class Neo4jConfig extends Neo4jConfiguration {

    private static final String DB_PATH = "path_to.db";
    private static final String CONFIG_PATH = "path_to.properties";

    @Bean(destroyMethod = "shutdown")
    public GraphDatabaseService graphDatabaseService() {
        return new GraphDatabaseFactory().newEmbeddedDatabaseBuilder(DB_PATH)
                .loadPropertiesFromFile(CONFIG_PATH).newGraphDatabase();
    }
}
Service for creating the user in Neo4j and the MongoDB:
@Service
public class UserService {

    @Inject
    private UserMdbRepository mdbUserRepository; // MongoRepository

    @Inject
    private Neo4jTemplate neo4jTemplate;

    @Transactional
    public User createUser(User user) {
        // Create the graph node first, because if this fails the user
        // shall not be created in MongoDB
        this.neo4jTemplate.save(user); // NotInTransactionException is thrown here

        // Then create the MongoDB user. This can't be rolled back, but
        // if this fails, the Neo4j modification shall be rolled back too
        return this.mdbUserRepository.save(user);
    }

    ...
}
Side-notes:
I'm using spring version 3.2.3.RELEASE and spring-data-neo4j version 2.3.0.M1
UserService and Neo4jConfig are in separate Maven artifacts
Starting the server and SDN read operations work so far; I'm just having trouble with write operations.
I'm currently migrating our project from the Tinkerpop framework to SDN-Neo4j. This user-creation process worked before (with Tinkerpop); I just have to make it work again with SDN-Neo4j.
I'm running the application in Jetty
Does anyone have any clue why this is not working (yet)?
I hope, this information is sufficient. If anything is missing, please let me know and I'll add it.
Edit:
I forgot to mention that manual transaction handling works, but of course I'd like to implement it the way it's meant to be.
public User createUser(User user) throws ServiceException {
    Transaction tx = this.graphDatabaseService.beginTx();
    try {
        this.neo4jTemplate.save(user);
        User persistantUser = this.mdbUserRepository.save(user);
        tx.success();
        return persistantUser;
    } catch (Exception e) {
        tx.failure();
        throw new ServiceException(e);
    } finally {
        tx.finish();
    }
}
Thanks to m-deinum I finally found the issue. The problem was that I scanned for those components/services in a different Spring configuration file than the one where I configured SDN-Neo4j. I moved the component scan for the packages which might require transactions into my Neo4jConfig and now it works:
@Configuration
@EnableTransactionManagement // mode = proxy
@EnableNeo4jRepositories(basePackages = "graph.repository")
@ComponentScan({
        "graph.component",
        "graph.service",
        "core.service"
})
public class Neo4jConfig extends Neo4jConfiguration {

    private static final String DB_PATH = "path_to.db";
    private static final String CONFIG_PATH = "path_to.properties";

    @Bean(destroyMethod = "shutdown")
    public GraphDatabaseService graphDatabaseService() {
        return new GraphDatabaseFactory().newEmbeddedDatabaseBuilder(DB_PATH)
                .loadPropertiesFromFile(CONFIG_PATH).newGraphDatabase();
    }
}
I will still have to separate the components/services which require transactions from those which don't, though. However, this works for now.
I assume the issue was that the other Spring configuration file (which included the component scan) was loaded before Neo4jConfig, since neo4j:repositories has to be placed before context:component-scan. (See the note in Example 20.26, "Composing repositories": http://static.springsource.org/spring-data/data-neo4j/docs/current/reference/html/programming-model.html#d0e2948)
