Small question on Spring Boot: how can I use a design pattern combined with a Spring @Value configuration property in order to select the appropriate @Repository, please?
Setup: a Spring Boot project which does nothing but save a POJO. The "difficulty" is the need to choose where to save the POJO, based on some info from inside the request payload.
I started with a first straightforward version, which looks like this:
@RestController
public class ControllerVersionOne {

    @Autowired private ElasticRepository elasticRepository;
    @Autowired private MongoDbRepository mongoRepository;
    @Autowired private RedisRepository redisRepository;
    //imagine many more other repositories
    //imagine many more other repositories
    //imagine many more other repositories

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        if (whereToSave.equals("elastic")) {
            return elasticRepository.save(myPojo).toString();
        } else if (whereToSave.equals("mongo")) {
            return mongoRepository.save(myPojo).toString();
        } else if (whereToSave.equals("redis")) {
            return redisRepository.save(myPojo).toString();
        // imagine many more if
        // imagine many more if
        // imagine many more if
        } else {
            return "unknown destination";
        }
    }
}
With the appropriate @Configuration and @Repository for each and every database. I am showing 3 here, but imagine many. The project also has a way to inject future @Configuration and @Repository classes (the question is not about that part actually).
@Configuration
public class ElasticConfiguration extends ElasticsearchConfiguration {

@Repository
public interface ElasticRepository extends CrudRepository<MyPojo, String> {

@Configuration
public class MongoConfiguration extends AbstractMongoClientConfiguration {

@Repository
public interface MongoDbRepository extends MongoRepository<MyPojo, String> {

@Configuration
public class RedisConfiguration {

@Repository
public interface RedisRepository {
Please note that some of the repositories are not children of CrudRepository. There is no single ___Repository parent that can cover everything.
And this first version works fine. I am very happy with it in the sense that I am able to save the POJO to where it should be saved, since I get the correct repository bean through this if-else structure.
In my opinion, though, this structure is not very elegant (it is OK if we have different opinions here) and, above all, it is not flexible at all (I need to hardcode each and every possible repository; again, imagine many).
This is why I refactored it into this second version:
@RestController
public class ControllerVersionTwo {

    private ElasticRepository elasticRepository;
    private MongoDbRepository mongoRepository;
    private RedisRepository redisRepository;

    private Map<String, Function<MyPojo, MyPojo>> designPattern;

    @Autowired
    public ControllerVersionTwo(ElasticRepository elasticRepository, MongoDbRepository mongoRepository, RedisRepository redisRepository) {
        this.elasticRepository = elasticRepository;
        this.mongoRepository = mongoRepository;
        this.redisRepository = redisRepository;
        // many more repositories
        designPattern = new HashMap<>();
        designPattern.put("elastic", myPojo -> elasticRepository.save(myPojo));
        designPattern.put("mongo", myPojo -> mongoRepository.save(myPojo));
        designPattern.put("redis", myPojo -> redisRepository.save(myPojo));
        //many more put
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPattern.get(whereToSave).apply(myPojo).toString();
    }
}
As you can see, I am leveraging a design pattern, refactoring the if-else into a HashMap.
This post is not about if-else vs. HashMap, by the way.
Working fine, but please note, the map is a Map<String, Function<MyPojo, MyPojo>>, as I cannot construct a Map<String, @Repository>.
With this second version, the if-else is refactored away, but again, we need to hardcode the HashMap.
This is why I have the idea of building a third version, where I can configure the map itself via a Spring Boot @Value property holding a Map.
Here is what I tried:
@RestController
public class ControllerVersionThree {

    @Value("#{${configuration.design.pattern.map}}")
    Map<String, String> configurationDesignPatternMap;

    private Map<String, Function<MyPojo, MyPojo>> designPatternStrategy;

    public ControllerVersionThree() {
        convertConfigurationDesignPatternMapToDesignPatternStrategy(configurationDesignPatternMap, designPatternStrategy);
    }

    private void convertConfigurationDesignPatternMapToDesignPatternStrategy(Map<String, String> configurationDesignPatternMap, Map<String, Function<MyPojo, MyPojo>> designPatternStrategy) {
        // convert configurationDesignPatternMap
        // {elastic:ElasticRepository, mongo:MongoDbRepository, redis:RedisRepository, ...}
        // to a map where I can directly get the appropriate repository based on the key
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPatternStrategy.get(whereToSave).apply(myPojo).toString();
    }
}
And I would configure in the property file:
configuration.design.pattern.map={elastic:ElasticRepository, mongo:MongoDbRepository, saveToRedis:RedisRepository, redis:RedisRepository, ...}
And tomorrow, I would be able to add or remove future repository targets just by changing the configuration:
configuration.design.pattern.map={elastic:ElasticRepository, anotherElasticKeyForSameElasticRepository:ElasticRepository, redis:RedisRepository, postgre:PostGreRepository}
Unfortunately, I am stuck.
What is the correct code in order to leverage a configurable property that maps a key to "which @Repository to use", please?
Thank you for your help.
You can create a base repository to be extended by all your repositories:
public interface BaseRepository {
    MyPojo save(MyPojo myPojo);
}
So you will have a bunch of repositories like:

@Repository("repoA")
public interface ARepository extends JpaRepository<MyPojo, String>, BaseRepository {
}

@Repository("repoB")
public interface BRepository extends JpaRepository<MyPojo, String>, BaseRepository {
}
...
Those repositories will be provided by a factory:
public interface BaseRepositoryFactory {
    BaseRepository getBaseRepository(String whereToSave);
}
that you must configure in a ServiceLocatorFactoryBean:
@Bean
public ServiceLocatorFactoryBean baseRepositoryBean() {
    ServiceLocatorFactoryBean serviceLocatorFactoryBean = new ServiceLocatorFactoryBean();
    serviceLocatorFactoryBean.setServiceLocatorInterface(BaseRepositoryFactory.class);
    return serviceLocatorFactoryBean;
}
Now you can inject the factory wherever you need it and get the repo you want:

@Autowired
private BaseRepositoryFactory baseRepositoryFactory;
...
baseRepositoryFactory.getBaseRepository("repoA").save(myPojo);
...
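To get the configurable aliasing asked for in the question, the factory call can be combined with a property-backed map. A minimal, hypothetical sketch (it assumes the configured map holds the factory bean names, e.g. elastic: repoA, and that it is injected as in the question):

// Resolve the alias from configuration, then ask the factory for the bean registered under that name.
String repoBeanName = configurationDesignPatternMap.get(whereToSave); // e.g. "repoA"
return baseRepositoryFactory.getBaseRepository(repoBeanName).save(myPojo).toString();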
Hope it helps.
Short answer:
create a shared interface
create multiple sub-classes of this interface (one per storage) using different Spring component names
use a map to deal with aliases
use the Spring context to retrieve the right bean by alias (instead of creating a custom factory)
Now adding a new storage only means adding a new repository class with a name.
Explanation:
As mentioned in the other answer, you first need to define a common interface, as you can't use CrudRepository.save(...) for every repository.
In my example I reuse the same signature as the save method, to avoid re-implementing it in the CrudRepository sub-classes.
public interface MyInterface<T> {
    <S extends T> S save(S entity);
}
Redis Repository:
#Repository("redis") // Here is the name of the redis repo
public class RedisRepository implements MyInterface<MyPojo> {
#Override
public <S extends MyPojo> S save(S entity) {
entity.setValue(entity.getValue() + " saved by redis");
return entity;
}
}
For the repositories based on CrudRepository there is no need to provide an implementation:

@Repository("elastic") // Here is the name of the elastic repo
public interface ElasticRepository extends CrudRepository<MyPojo, String>, MyInterface<MyPojo> {
}
Create a configuration for your aliases in application.yml
configuration:
  design:
    pattern:
      map:
        redis: redis
        saveToRedisPlease: redis
        elastic: elastic
Create a custom properties class to retrieve the map:

@Component
@ConfigurationProperties(prefix = "configuration.design.pattern")
public class PatternProperties {

    private Map<String, String> map;

    public String getRepoName(String alias) {
        return map.get(alias);
    }

    public Map<String, String> getMap() {
        return map;
    }

    public void setMap(Map<String, String> map) {
        this.map = map;
    }
}
Now create the version three of your controller with the injection of the Spring context:

@RestController
public class ControllerVersionThree {

    private final ApplicationContext context;
    private PatternProperties designPatternMap;

    public ControllerVersionThree(ApplicationContext context,
                                  PatternProperties designPatternMap) {
        this.context = context;
        this.designPatternMap = designPatternMap;
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        String repoName = designPatternMap.getRepoName(whereToSave);
        MyInterface<MyPojo> repo = context.getBean(repoName, MyInterface.class);
        return repo.save(myPojo).toString();
    }
}
You can check that this is working with a test:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.boot.test.web.server.LocalServerPort;
import org.springframework.http.HttpEntity;

import static org.junit.jupiter.api.Assertions.assertEquals;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class ControllerVersionThreeTest {

    @LocalServerPort
    private int port;

    @Autowired
    private TestRestTemplate restTemplate;

    @Test
    void testSaveByRedis() {
        // Given: here 'redis' is the name of the spring bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("redis", "aValue"));

        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);

        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }

    @Test
    void testSaveByRedisAlias() {
        // Given: here 'saveToRedisPlease' is an alias for the name of the spring bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("saveToRedisPlease", "aValue"));

        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);

        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }
}
Have you tried creating a configuration class to create your repository map?

@Configuration
public class MyConfiguration {

    // Inject the Spring Data repository beans instead of instantiating the interfaces with "new".
    // CrudRepository is used as the common map value type here; if not every repository extends it,
    // a shared base interface (as in the other answers) can be used instead.
    @Bean
    public Map<String, CrudRepository<MyPojo, String>> repositoryMap(RedisRepository redisRepository,
                                                                     MongoDbRepository mongoRepository,
                                                                     ElasticRepository elasticRepository) {
        Map<String, CrudRepository<MyPojo, String>> repositoryMap = new HashMap<>();
        repositoryMap.put("redis", redisRepository);
        repositoryMap.put("mongo", mongoRepository);
        repositoryMap.put("elastic", elasticRepository);
        return Collections.unmodifiableMap(repositoryMap);
    }
}
Then you could have the following in your rest controller:

@RestController
public class ControllerVersionFour {

    @Autowired
    private Map<String, CrudRepository<MyPojo, String>> repositoryMap;

    @PostMapping(path = "/save/{dbname}")
    public String save(@RequestBody MyRequest myRequest, @PathVariable("dbname") String dbname) {
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return repositoryMap.get(dbname).save(myPojo).toString();
    }
}
It might be better to have the db as a path or query parameter instead of having it in the request body. That way, depending on your use case, you might be able to save the request body directly instead of creating another POJO; for example:
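A minimal sketch of that variant (the parameter name db and the reuse of the repository map above are my own assumptions):

// Hypothetical variant: the target store comes from ?db=... and the request body is saved as-is.
@PostMapping(path = "/save")
public String save(@RequestBody MyPojo myPojo, @RequestParam("db") String dbname) {
    return repositoryMap.get(dbname).save(myPojo).toString();
}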
This post may also be useful for autowiring a map
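For reference, a minimal sketch of what such autowiring can look like (assuming the shared BaseRepository interface from the first answer): when a Map<String, SomeBeanType> is injected, Spring populates it with every bean of that type, keyed by bean name, so no explicit @Bean method is needed.

@RestController
public class ControllerVersionFive {

    // Spring fills this map automatically: keys are the bean names ("redis", "mongo", "elastic"),
    // values are all beans implementing the shared interface.
    private final Map<String, BaseRepository> repositoryMap;

    public ControllerVersionFive(Map<String, BaseRepository> repositoryMap) {
        this.repositoryMap = repositoryMap;
    }
}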
I have an @Autowired field in one of my serializers, which extends StdSerializer:

public class RefSerializer extends StdSerializer<LabeledElement> {

    @Autowired
    I18n i18n;

    public RefSerializer() {
        this(null);
    }

    public RefSerializer(Class<LabeledElement> t) {
        super(t);
    }

    @Override
    public void serialize(LabeledElement element, JsonGenerator generator, SerializerProvider provider) throws IOException {
        String identifier = null;
        String label = LabelUtils.labelPlanElement(this.i18n, element, "ref");
        generator.writeObject(ReferenceElement.of(element.getId(), label, identifier));
    }
}
and it is used via @JsonSerialize inside the model class:

@JsonSerialize(contentUsing = RefSerializer.class)
@JsonDeserialize(contentUsing = PlanElementDeserializer.class)
@OneToMany(fetch = FetchType.EAGER)
private List<PlanElement> planElements;
If the serializer is called inside my @RestController-annotated endpoints, the @Autowired field is resolved and everything works fine for incoming and returned models.
Now I want to send the model actively via RestTemplate#exchange, but then the @Autowired field inside the serializer is null.
restTemplate.exchange(endpointUrl, httpMethod, new HttpEntity<>(planElement, authHeader), Map.class, authParameters);
Is there a way to get the autowiring to work for outgoing REST calls with RestTemplate?
Using Spring Boot 2.6.3, Java 17.
If the RestTemplate is provided as a bean inside a @Configuration class like so:

@Bean
public RestTemplate restTemplate(RestTemplateBuilder builder) {
    return builder.build();
}

and then injected via @Autowired inside the service which uses the RestTemplate, rather than instantiated with "new RestTemplate()", the autowired services inside the custom serializer are also available for the RestTemplate's REST calls.
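For illustration, a minimal sketch of that wiring (the service class name and endpoint URL are assumptions of mine, not from the question):

@Service
public class PlanElementClient {

    private final RestTemplate restTemplate;

    // The builder-created RestTemplate bean is injected here, not created with "new RestTemplate()",
    // which is the condition described above for the serializer's autowired fields to be available.
    public PlanElementClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    public void send(PlanElement planElement, HttpHeaders authHeader) {
        restTemplate.exchange("https://example.org/api/plan-elements", HttpMethod.POST,
                new HttpEntity<>(planElement, authHeader), Map.class);
    }
}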
I'm trying to inject my config into my custom StdDeserializer. The problem, however, is that even though I marked the class as @Component, the field is never injected. The injection works without any problems in other places of the application.
Therefore, I've come to the conclusion that the problem lies in the way the deserializer is instantiated, since it doesn't get instantiated by us but rather like in the example below:
ClassToDeserialize myClass = new ObjectMapper().readValue(mockJson, ClassToDeserialize.class);
As you can see, there is no explicit usage of my custom deserializer ClassToDeserializeDeserializer; Jackson detects the classes that have a custom deserializer through the @JsonDeserialize(using = ClassToDeserializeDeserializer.class) annotation.
Class that should be deserialized:

@Data
@AllArgsConstructor
@SuperBuilder
@JsonDeserialize(using = MyClassDeserializer.class)
public class MyClass {

    private final String field1;
    private final String field2;
}
Config class that should be injected:

@Configuration
@ConfigurationProperties(prefix = "myconfig")
@Data
@AllArgsConstructor
@NoArgsConstructor
public class MyConfig {

    private String confField1;
    private String confField2;
}
MyClass's custom deserializer:

@Component
public class MyClassDeserializer extends StdDeserializer<MyClass> {

    @Autowired
    MyConfig myConfig;

    public MyClassDeserializer() {
        this(null);
    }

    public MyClassDeserializer(Class<?> vc) {
        super(vc);
    }

    @Override
    public MyClass deserialize(JsonParser parser, DeserializationContext context)
            throws IOException, JsonProcessingException {
        // Deserializing code that basically tries to read from myConfig
        myConfig.getConfField1(); // Hello NullPointerException my old friend
    }
}
Usage of the deserializer
MyClass myClass = new ObjectMapper().readValue(mockJson, MyClass.class);
Reason why it doesn't work:
Jackson doesn't know anything about Spring, so when you call readValue(..), Jackson sees the @JsonDeserialize annotation with the deserializer class and creates a new instance of the deserializer itself (it doesn't pick up the Spring bean, but rather just calls new MyClassDeserializer(..)); that is why you never see MyConfig being injected.
If you want to make it work, you need to register this deserializer through Spring Boot, for example like this: How to provide a custom deserializer with Jackson and Spring Boot
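One possible way to do that (a minimal sketch; @JsonComponent from org.springframework.boot.jackson is one option, not necessarily the exact approach in the linked post) is to let Spring Boot create and register the deserializer as a bean:

// Sketch: @JsonComponent makes Spring Boot register this deserializer
// with its auto-configured ObjectMapper, so the @Autowired field gets populated.
@JsonComponent
public class MyClassDeserializer extends StdDeserializer<MyClass> {

    @Autowired
    MyConfig myConfig;

    public MyClassDeserializer() {
        super(MyClass.class);
    }

    @Override
    public MyClass deserialize(JsonParser parser, DeserializationContext context) throws IOException {
        String conf = myConfig.getConfField1(); // injected now, because this instance is a Spring bean
        // ... the real mapping logic from the question goes here ...
        return null; // placeholder
    }
}

The JSON must then be read with the Spring-managed mapper (for example an @Autowired ObjectMapper) rather than new ObjectMapper(), because a manually created mapper still knows nothing about the Spring context.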
As I started writing tests for my Spring Boot based REST API, I noticed that an OffsetDateTime attribute in my DTO is serialized differently depending on whether I use Spring MockMvc to do a request or the Jackson ObjectMapper directly. When using Spring, my @JsonFormat annotation is applied correctly, but when using the ObjectMapper it isn't.
@EqualsAndHashCode
@Builder
public class FooDTO {

    public int id;

    @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd HH:mm:ss")
    public OffsetDateTime arrival;

    public FooDTO(int id, OffsetDateTime arrival) {
        this.id = id;
        this.arrival = arrival;
    }
}
@RestController
public class FooController {

    @Autowired
    private FooRepository fooRepository;

    @RequestMapping(value = "/foo/bar/{id}")
    public FooDTO getFoo(@PathVariable int id) {
        return fooRepository.loadDTO(id);
    }
}
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = Application.class)
@WebMvcTest(FooController.class)
public class FooControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private FooRepository fooRepository;

    @Test
    public void fooTest() throws Exception {
        FooDTO fooDTO = FooDTO.builder().id(1).arrival(OffsetDateTime.now()).build();
        String fooDTOJSON = new ObjectMapper().writeValueAsString(fooDTO);

        when(fooRepository.loadDTO(1)).thenReturn(fooDTO);

        String response = mockMvc.perform(request(HttpMethod.GET, "/foo/bar/1").accept(APPLICATION_JSON))
                .andReturn().getResponse().getContentAsString();

        assertEquals(fooDTOJSON, response);
    }
}
The Spring MockMVC response looks like this:
{"id":1, "arrival": "2020-03-28 12:29:44"}
While the fooDTOJSON from the ObjectMapper looks like this:
{"id":1, "arrival":{"offset":{"totalSeconds":3600,"id":"+01:00","rules":{"fixedOffset":true,"transitions":[],"transitionRules":[]}},"nano":697162400,"year":2020,"monthValue":3,"dayOfMonth":28,"hour":12,"minute":29,"second":44,"dayOfWeek":"SATURDAY","dayOfYear":88,"month":"MARCH"}}
Ideally, I would expect the ObjectMapper to return the same result as MockMvc and use my annotation on the DTO. I'd really appreciate someone's help on this, even if the solution might be quite obvious. I'm not too used to working in the Java ecosystem, especially Spring.
The solution that led to the expected result was adding the jackson-modules-java8 library (in my case I think it was already present due to Spring Boot or another dependency of my application) and registering the corresponding module on the Jackson mapper.
The working solution to receive the "correct" JSON String was:
String fooDTOJSON = new ObjectMapper().registerModule(new JavaTimeModule()).writeValueAsString(fooDTO);
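Alternatively (this is an assumption on my part, not something the original answer states), inside a Spring Boot test you can inject the ObjectMapper that Spring Boot has already configured; it has the JavaTimeModule registered and therefore honours the @JsonFormat pattern the same way MockMvc does:

@Autowired
private ObjectMapper objectMapper; // Spring Boot's pre-configured mapper, JavaTimeModule already registered

// ... inside the test method:
String fooDTOJSON = objectMapper.writeValueAsString(fooDTO);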
I have a controller that looks like this:

@RestController
@RequestMapping(value = "/api/events")
public class EventController {

    @Inject
    private EventValidator eventValidator;

    @InitBinder
    @Qualifier("eventValidator")
    private void initBinder(WebDataBinder binder) {
        binder.setValidator(eventValidator);
    }

    @PostMapping()
    public ResponseEntity<EventModel> save(@Valid @RequestBody EventRequest request, BindingResult result) {
        if (result.hasErrors()) {
            //some validation
        }
        //some other logic
    }
}
Then I have an EventRequest POJO:

public class EventRequest {

    private String eventName;

    @Valid
    @NotNull
    private List<Event> events;

    //setters and getters
}
In my controller, I have 2 types of validation: the @InitBinder custom validator, and also Java bean validation (JSR-303), which uses @NotNull in the EventRequest class.
The problem is, if I have BindingResult result in the controller, the @NotNull annotation won't work. And the cascaded validation in the Event class is not working either.
Why is that, and how can I have both types of validation?
I tried to add this, but it is still not working:
@Configuration
public class ValidatorConfig {

    @Bean
    public LocalValidatorFactoryBean defaultValidator() {
        return new LocalValidatorFactoryBean();
    }

    @Bean
    public MethodValidationPostProcessor methodValidationPostProcessor() {
        return new MethodValidationPostProcessor();
    }
}
binder.setValidator(eventValidator); will replace the other registered validators (including the default JSR-303 bean validator).
Change to:
binder.addValidators(eventValidator);
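Applied to the controller from the question, the binder method then looks like this (the rest of the controller stays unchanged):

@InitBinder
private void initBinder(WebDataBinder binder) {
    // addValidators keeps the default JSR-303 validator and adds the custom one on top,
    // so both the @NotNull/@Valid checks and the EventValidator rules run.
    binder.addValidators(eventValidator);
}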