A small question on Spring Boot: how can I combine a design pattern with Spring @Value configuration in order to select the appropriate @Repository, please?
Setup: a Spring Boot project which does nothing but save a POJO. The "difficulty" is the need to choose where to save the POJO, based on some info inside the request payload.
I started with a first straightforward version, which looks like this:
@RestController
public class ControllerVersionOne {

    @Autowired private ElasticRepository elasticRepository;
    @Autowired private MongoDbRepository mongoRepository;
    @Autowired private RedisRepository redisRepository;
    // imagine many more other repositories

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        if (whereToSave.equals("elastic")) {
            return elasticRepository.save(myPojo).toString();
        } else if (whereToSave.equals("mongo")) {
            return mongoRepository.save(myPojo).toString();
        } else if (whereToSave.equals("redis")) {
            return redisRepository.save(myPojo).toString();
        // imagine many more else-if branches
        } else {
            return "unknown destination";
        }
    }
}
With the appropriate @Configuration and @Repository for each and every database. I am showing three here, but imagine many. The project also has a way to inject future @Configuration and @Repository classes (that is not the question here).
@Configuration
public class ElasticConfiguration extends ElasticsearchConfiguration {

@Repository
public interface ElasticRepository extends CrudRepository<MyPojo, String> {

@Configuration
public class MongoConfiguration extends AbstractMongoClientConfiguration {

@Repository
public interface MongoDbRepository extends MongoRepository<MyPojo, String> {

@Configuration
public class RedisConfiguration {

@Repository
public interface RedisRepository {
Please note, some of the repositories are not subtypes of CrudRepository. There is no single ___Repository type which can cover everything.
This first version works fine. Very happy, meaning I am able to save the POJO where it should be saved, as I am getting the correct repository bean via this if-else structure.
In my opinion, this structure is not very elegant (it is OK if we have different opinions here) and, especially, not flexible at all (each and every possible repository has to be hardcoded; again, imagine many).
This is why I refactored it into this second version:
@RestController
public class ControllerVersionTwo {

    private ElasticRepository elasticRepository;
    private MongoDbRepository mongoRepository;
    private RedisRepository redisRepository;
    private Map<String, Function<MyPojo, MyPojo>> designPattern;

    @Autowired
    public ControllerVersionTwo(ElasticRepository elasticRepository, MongoDbRepository mongoRepository, RedisRepository redisRepository) {
        this.elasticRepository = elasticRepository;
        this.mongoRepository = mongoRepository;
        this.redisRepository = redisRepository;
        // many more repositories
        designPattern = new HashMap<>();
        designPattern.put("elastic", myPojo -> elasticRepository.save(myPojo));
        designPattern.put("mongo", myPojo -> mongoRepository.save(myPojo));
        designPattern.put("redis", myPojo -> redisRepository.save(myPojo));
        // many more put
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPattern.get(whereToSave).apply(myPojo).toString();
    }
}
As you can see, I am leveraging a design pattern, refactoring the if-else into a hashmap.
(This post is not about if-else vs hashmap, by the way.)
Working fine, but please note the map is a Map<String, Function<MyPojo, MyPojo>>, as I cannot construct a Map<String, @Repository>.
With this second version, the if-else is refactored, but again, the hashmap has to be hardcoded.
This is why I had the idea of building a third version, where I can configure the map itself via a Spring Boot @Value property for a Map.
Here is what I tried:
@RestController
public class ControllerVersionThree {

    @Value("#{${configuration.design.pattern.map}}")
    Map<String, String> configurationDesignPatternMap;

    private Map<String, Function<MyPojo, MyPojo>> designPatternStrategy;

    public ControllerVersionThree() {
        convertConfigurationDesignPatternMapToDesignPatternStrategy(configurationDesignPatternMap, designPatternStrategy);
    }

    private void convertConfigurationDesignPatternMapToDesignPatternStrategy(Map<String, String> configurationDesignPatternMap, Map<String, Function<MyPojo, MyPojo>> designPatternStrategy) {
        // convert configurationDesignPatternMap
        // {elastic:ElasticRepository, mongo:MongoDbRepository, redis:RedisRepository, ...}
        // to a map where I can directly get the appropriate repository based on the key
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPatternStrategy.get(whereToSave).apply(myPojo).toString();
    }
}
And I would configure in the property file:
configuration.design.pattern.map={elastic:ElasticRepository, mongo:MongoDbRepository, saveToRedis:RedisRepository, redis:RedisRepository, ...}
And tomorrow, I would be able to add or remove future repository targets through configuration alone:
configuration.design.pattern.map={elastic:ElasticRepository, anotherElasticKey:ElasticRepository, redis:RedisRepository, postgre:PostGreRepository}
Unfortunately, I am stuck.
What is the correct code to leverage a configurable property that maps a key to its "which @Repository to use", please?
Thank you for your help.
You can create a base repository to be extended by all your repositories:
public interface BaseRepository {
    MyPojo save(MyPojo myPojo);
}
so you will have a bunch of repositories like:
@Repository("repoA")
public interface ARepository extends JpaRepository<MyPojo, String>, BaseRepository {
}

@Repository("repoB")
public interface BRepository extends JpaRepository<MyPojo, String>, BaseRepository {
}
...
Those repositories will be provided by a factory:
public interface BaseRepositoryFactory {
    BaseRepository getBaseRepository(String whereToSave);
}
that you must configure in a ServiceLocatorFactoryBean:
@Bean
public ServiceLocatorFactoryBean baseRepositoryBean() {
    ServiceLocatorFactoryBean serviceLocatorFactoryBean = new ServiceLocatorFactoryBean();
    serviceLocatorFactoryBean.setServiceLocatorInterface(BaseRepositoryFactory.class);
    return serviceLocatorFactoryBean;
}
Now you can inject the factory wherever you need it and get the repo you want:
@Autowired
private BaseRepositoryFactory baseRepositoryFactory;
...
baseRepositoryFactory.getBaseRepository("repoA").save(myPojo);
...
Hope it helps.
Short answer:
create a shared interface
create multiple sub-classes of this interface (one per storage), using different Spring component names
use a map to deal with aliases
use the Spring context to retrieve the right bean by alias (instead of creating a custom factory)
Now adding a new storage only means adding a new repository class with a name.
Explanation:
As mentioned in the other answer, you first need to define a common interface, since you can't use CrudRepository.save(...) for every repository.
In my example I reuse the same signature as the save method, to avoid re-implementing it in the sub-classes of CrudRepository.
public interface MyInterface<T> {
    <S extends T> S save(S entity);
}
Redis Repository:
@Repository("redis") // Here is the name of the redis repo
public class RedisRepository implements MyInterface<MyPojo> {

    @Override
    public <S extends MyPojo> S save(S entity) {
        entity.setValue(entity.getValue() + " saved by redis");
        return entity;
    }
}
For the repositories that are already CrudRepository sub-types, there is no need to provide an implementation:
@Repository("elastic") // Here is the name of the elastic repo
public interface ElasticRepository extends CrudRepository<MyPojo, String>, MyInterface<MyPojo> {
}
Create a configuration for your aliases in application.yml:
configuration:
  design:
    pattern:
      map:
        redis: redis
        saveToRedisPlease: redis
        elastic: elastic
Create a custom properties class to retrieve the map:
@Component
@ConfigurationProperties(prefix = "configuration.design.pattern")
public class PatternProperties {

    private Map<String, String> map;

    public String getRepoName(String alias) {
        return map.get(alias);
    }

    public Map<String, String> getMap() {
        return map;
    }

    public void setMap(Map<String, String> map) {
        this.map = map;
    }
}
Now create the version three of your controller with the Spring ApplicationContext injected:
@RestController
public class ControllerVersionThree {

    private final ApplicationContext context;
    private final PatternProperties designPatternMap;

    public ControllerVersionThree(ApplicationContext context,
                                  PatternProperties designPatternMap) {
        this.context = context;
        this.designPatternMap = designPatternMap;
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        String repoName = designPatternMap.getRepoName(whereToSave);
        MyInterface<MyPojo> repo = context.getBean(repoName, MyInterface.class);
        return repo.save(myPojo).toString();
    }
}
You can check that this is working with a test:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.boot.test.web.server.LocalServerPort;
import org.springframework.http.HttpEntity;

import static org.junit.jupiter.api.Assertions.assertEquals;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class ControllerVersionThreeTest {

    @LocalServerPort
    private int port;

    @Autowired
    private TestRestTemplate restTemplate;

    @Test
    void testSaveByRedis() {
        // Given: here 'redis' is the name of the spring bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("redis", "aValue"));
        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);
        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }

    @Test
    void testSaveByRedisAlias() {
        // Given: here 'saveToRedisPlease' is an alias for the redis bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("saveToRedisPlease", "aValue"));
        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);
        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }
}
Have you tried creating a configuration class to build your repository map?
@Configuration
public class MyConfiguration {

    @Bean
    public Map<String, CrudRepository<MyPojo, String>> repositoryMap(RedisRepository redisRepository,
                                                                     MongoDbRepository mongoRepository,
                                                                     ElasticRepository elasticRepository) {
        Map<String, CrudRepository<MyPojo, String>> repositoryMap = new HashMap<>();
        repositoryMap.put("redis", redisRepository);
        repositoryMap.put("mongo", mongoRepository);
        repositoryMap.put("elastic", elasticRepository);
        return Collections.unmodifiableMap(repositoryMap);
    }
}
Then you could have the following in your rest controller
@RestController
public class ControllerVersionFour {

    @Autowired
    private Map<String, CrudRepository<MyPojo, String>> repositoryMap;

    @PostMapping(path = "/save/{dbname}")
    public String save(@RequestBody MyRequest myRequest, @PathVariable("dbname") String dbname) {
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return repositoryMap.get(dbname).save(myPojo).toString();
    }
}
It might be better to have the db as a path/query parameter instead of having it in the request body. That way, depending on your use case, you may be able to save the request body directly instead of creating another POJO.
Related
I have two controllers, namely ControllerA and ControllerB. In ControllerA I have an object (a HashMap) which maps principal names for connected users to their respective usernames:
@Controller
public class ControllerA {
    HashMap<String, String> principalToUsername = new HashMap<>();
    ...
}
I would like to be able to access this object from ControllerB, which is another websocket controller:
@Controller
public class ControllerB {
    private String getUsername(String principalName) {
        // I want to access usernames here
    }
}
How can I do this? All the posts I've read are about MVC controllers, where I can use @SessionAttributes or flash attributes. But how do I accomplish the same with WebSocket controllers?
You could create an additional @Component which holds the HashMap and autowire it into both controllers. Be aware that, by default, this single instance will be shared by all controllers in the Spring application.
@Component
public class UserMap {

    private final Map<String, String> map = new HashMap<>();

    public String getUserName(String principalName) {
        return map.get(principalName);
    }
}
In the controller:
private final UserMap userMap;

// autowire via the constructor
public ControllerA(UserMap userMap) {
    this.userMap = userMap;
}

private String getUsername(String principalName) {
    return userMap.getUserName(principalName);
}
I'm trying to autowire a JdbcTemplate inside a MapStore, but I'm getting a NullPointerException.
I have worked through many examples but am still not able to resolve this issue.
Here is my main class
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class TestCacheApplication {
    public static void main(String[] args) {
        SpringApplication.run(TestCacheApplication.class, args);
        System.err.println("......running successfully......");
    }
}
Here is my cache configuration code:
@Component
public class CacheConfig {

    @Bean
    public static Config config() {
        System.err.println("config class");
        Config config = new Config();
        config.setInstanceName("hazelcast");

        MapConfig mapCfg = new MapConfig();
        mapCfg.setName("first-map");
        mapCfg.setBackupCount(2);
        mapCfg.setTimeToLiveSeconds(300);

        MapStoreConfig mapStoreCfg = new MapStoreConfig();
        mapStoreCfg.setClassName(DataMapStore.class.getName()).setEnabled(true);
        mapCfg.setMapStoreConfig(mapStoreCfg);
        config.addMapConfig(mapCfg);
        return config;
    }
}
and the TblRepo implementation:
@Service
public class DataTblRepoImpl implements DataTblRepo {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Override
    public void save(String id, String name) {
        Object[] params = new Object[] { id, name };
        int[] types = new int[] { Types.VARCHAR, Types.VARCHAR };
        String insertSql = "INSERT INTO public.person(id, name) VALUES(?, ?)";
        jdbcTemplate.update(insertSql, params, types);
    }
}
The DataTblRepo interface itself is annotated with the @Repository annotation.
And my map store class:
@SpringAware
public class DataMapStore implements MapStore<String, ModelClass> {

    @Autowired
    DataTblRepo dataTblRepo;

    @Override
    public void store(String key, ModelClass value) {
        dataTblRepo.save(value.getId(), value.getName());
    }

    // remaining methods will come here
}
and Controller
@RestController
@CrossOrigin(origins = "*")
@RequestMapping("/api/v1")
public class DataController {

    @Autowired
    DataService dataService;

    HazelcastInstance hazelCast = Hazelcast.getHazelcastInstanceByName("hazelcast");

    @PostMapping("/{test}")
    public String saveDatafrom(@RequestBody ModelClass model) {
        hazelCast.getMap("first-map").put(model.getId(), model);
        return "stored";
    }
}
Here is the program flow: when I start the application, the CacheConfig class runs first.
In the controller, when I perform the map.put() operation, the data goes to the DataMapStore class, which calls the store method to save the data in the database. Since DataTblRepo is null, the operation fails in the store method itself.
I tried adding @Component on the DataMapStore class as well, but then I get this error:
"message": "Cannot invoke "com.example.demo.repo.DataTblRepository.save(String, String)" because "this.dataTableRepo" is null"
I have seen this same issue on many platforms but am still not able to resolve it.
Any suggestions would be very helpful.
@SpringAware is for Hazelcast distributed objects (cf. documentation).
The MapStore in your example is not a distributed object but a plain object, so it should be managed by Spring itself. You should replace the @SpringAware annotation with a Spring @Component annotation.
The next issue is that your map store configuration makes Hazelcast responsible for instantiating the MapStore. If that happens, you won't benefit from Spring's dependency injection mechanism. You should instead directly set the instance created by Spring.
Replace @SpringAware with @Component:
@Component
public class DataMapStore implements MapStore<String, ModelClass> {
    // ...
}
Use the Spring-configured MapStore instance
@Bean
public Config config(DataMapStore mapStore) { // Ask Spring to inject the instance
    // ...
    MapStoreConfig mapStoreCfg = new MapStoreConfig();
    mapStoreCfg.setImplementation(mapStore); // Use it
    mapCfg.setMapStoreConfig(mapStoreCfg);
    config.addMapConfig(mapCfg);
    return config;
}
I also removed the static keyword on the config() method.
Note that this way of using MapStore couples it with the "client" code. This means you need to use Hazelcast embedded. For more information about embedded mode vs. client/server, please check the documentation related to topology.
I am writing a service that receives an input, based on which I need to call certain implementations of one service. This input is a list of the names of the impls that need to be called.
public interface Processor {
    Map<String, String> execute();
}

@Service("BUCKET_PROCESSOR")
public class BucketProcessor implements Processor {
    ..... // first impl
}

@Service("QUERY_PROCESSOR")
public class QueryProcessor implements Processor {
    ..... // second impl
}

@Service("SQL_PROCESSOR")
public class SQLProcessor implements Processor {
    ..... // third impl
}
Then I have a service where I want to inject a map of all these impls, so that I can iterate over the input and call the respective impl.
@Service
public class MyAysncClient {

    @Autowired
    private Map<String, Processor> processorMap;

    public void execute(List<String> processors) {
        List<Future> tasks = new ArrayList<>();
        for (String p : processors) {
            final Processor processor = this.processorMap.get(p);
            processor.execute();
            ....
        }
    }
}
You can just use getBeansOfType(Processor.class), which "returns a Map with the matching beans, containing the bean names as keys and the corresponding bean instances as values":
@Bean
public Map<String, Processor> processorMap(ApplicationContext context) {
    return context.getBeansOfType(Processor.class);
}
Yes, you can; Spring has this feature enabled by default. Namely, you can inject a Map<String, Processor> into a Spring bean.
This instructs Spring to find all beans that are implementations of the Processor interface; these become the values of the map, and the corresponding keys are the bean names.
So the code presented in the question should work.
Check the documentation of the well-known @Autowired annotation.
In the section "Autowiring Arrays, Collections, and Maps" it states the following:
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and #Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
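To make the quoted behavior concrete, here is a minimal plain-Java sketch (no Spring container at runtime): the HashMap built in main simulates what the container produces when it autowires a Map<String, Processor>, with bean names as keys. All class and method names here are illustrative, not taken from the question.

```java
import java.util.HashMap;
import java.util.Map;

public class MapInjectionSketch {

    interface Processor {
        Map<String, String> execute();
    }

    // Spring would register these under their bean names; we simulate that below.
    static class QueryProcessor implements Processor {
        public Map<String, String> execute() {
            return Map.of("processor", "QUERY_PROCESSOR");
        }
    }

    static class SqlProcessor implements Processor {
        public Map<String, String> execute() {
            return Map.of("processor", "SQL_PROCESSOR");
        }
    }

    // The client bean: the container fills this map with bean-name -> bean instance.
    static class MyAsyncClient {
        private final Map<String, Processor> processorMap;

        MyAsyncClient(Map<String, Processor> processorMap) {
            this.processorMap = processorMap;
        }

        String execute(String name) {
            return processorMap.get(name).execute().get("processor");
        }
    }

    public static void main(String[] args) {
        // Simulate what the container does when autowiring Map<String, Processor>
        Map<String, Processor> beans = new HashMap<>();
        beans.put("QUERY_PROCESSOR", new QueryProcessor());
        beans.put("SQL_PROCESSOR", new SqlProcessor());

        MyAsyncClient client = new MyAsyncClient(beans);
        System.out.println(client.execute("QUERY_PROCESSOR")); // prints QUERY_PROCESSOR
    }
}
```

In a real application you would delete the HashMap wiring and let Spring inject the map into the constructor.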
A better and more elegant way to do the same is to define a service locator pattern, using the code below:
@Configuration
public class ProcessorConfig {

    @Bean("processorFactory")
    public FactoryBean<?> serviceLocatorFactoryBean() {
        ServiceLocatorFactoryBean factoryBean = new ServiceLocatorFactoryBean();
        factoryBean.setServiceLocatorInterface(ProcessorFactory.class);
        return factoryBean;
    }
}

public interface ProcessorFactory {
    Processor getProcessor(ProcessorTypes processorTypes);
}
then
public interface Processor {
    Map<String, String> execute();
}

@Component(ProcessorTypes.ProcessorConstants.BUCKET_PROCESSOR)
@Slf4j
public class BucketProcessor implements Processor {
    @Override
    public Map<String, String> execute() {
        return Collections.singletonMap("processor", "BUCKET_PROCESSOR");
    }
}

@Component(ProcessorTypes.ProcessorConstants.QUERY_PROCESSOR)
@Slf4j
public class QueryProcessor implements Processor {
    @Override
    public Map<String, String> execute() {
        return Collections.singletonMap("processor", "QUERY_PROCESSOR");
    }
}

@Component(ProcessorTypes.ProcessorConstants.SQL_PROCESSOR)
@Slf4j
public class SqlProcessor implements Processor {
    @Override
    public Map<String, String> execute() {
        return Collections.singletonMap("processor", "SQL_PROCESSOR");
    }
}
Now define your service, injecting the factory:
@Service
@RequiredArgsConstructor
@Slf4j
public class ProcessorService {

    private final ProcessorFactory processorFactory;

    public void parseIndividual(ProcessorTypes processorType) {
        processorFactory
                .getProcessor(processorType)
                .execute();
    }

    public void parseAll(List<ProcessorTypes> processorTypes) {
        processorTypes.forEach(this::parseIndividual);
    }
}
In the client, you can execute it as below:
processorService.parseAll(Arrays.asList(ProcessorTypes.SQL, ProcessorTypes.BUCKET, ProcessorTypes.QUERY));
processorService.parseIndividual(ProcessorTypes.BUCKET);
If you want to expose it as a REST API, you can do it this way:
@RestController
@RequestMapping("/processors")
@RequiredArgsConstructor
@Validated
public class ProcessorController {

    private final ProcessorService processorService;

    @GetMapping("/process")
    public ResponseEntity<?> parseContent(@RequestParam("processorType") @Valid ProcessorTypes processorType) {
        processorService.parseIndividual(processorType);
        return ResponseEntity.status(HttpStatus.OK).body("ok");
    }

    @GetMapping("/process-all")
    public ResponseEntity<?> parseContent() {
        processorService.parseAll(Arrays.asList(ProcessorTypes.SQL, ProcessorTypes.BUCKET, ProcessorTypes.QUERY));
        return ResponseEntity.status(HttpStatus.OK).body("ok");
    }
}
Hope this resolves your problem.
I think this will help you: add a bean definition to your configuration class.
@Bean(name = "mapBean")
public Map<String, Processor> mapBean() {
    Map<String, Processor> map = new HashMap<>();
    // populate the map here
    return map;
}
in your service
@Service
public class MyAysncClient {

    @Autowired
    @Qualifier("mapBean")
    private Map<String, Processor> processorMap;

    public void execute(List<String> processors) {
        List<Future> tasks = new ArrayList<>();
        for (String p : processors) {
            final Processor processor = this.processorMap.get(p);
            processor.execute();
            ....
        }
    }
}
By the way, if you don't need the names of the beans (according to your example), you can define a list; Spring will inject all beans defined for the same interface:
@Autowired
private List<Processor> processors; // includes all defined beans
After that, iterate over each of them and call the execute method.
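A plain-Java sketch of that last step (the List below stands in for the one Spring would inject, and the impl bodies are made up for illustration):

```java
import java.util.Collections;
import java.util.List;
import java.util.Map;

public class ProcessorListSketch {

    interface Processor {
        Map<String, String> execute();
    }

    static class BucketProcessor implements Processor {
        public Map<String, String> execute() {
            return Collections.singletonMap("processor", "BUCKET_PROCESSOR");
        }
    }

    static class QueryProcessor implements Processor {
        public Map<String, String> execute() {
            return Collections.singletonMap("processor", "QUERY_PROCESSOR");
        }
    }

    public static void main(String[] args) {
        // Stand-in for: @Autowired private List<Processor> processors;
        List<Processor> processors = List.of(new BucketProcessor(), new QueryProcessor());

        // Iterate over each of them and call the execute method
        for (Processor p : processors) {
            System.out.println(p.execute());
        }
    }
}
```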
Yes, you can, but your current code needs some improvements to make it work this way.
First of all, you have to add a getProcessorName method to the Processor interface:
public interface Processor {
    Map<String, String> execute();
    String getProcessorName();
}
When you implement it, each implementation should return its own name from the getProcessorName method:
@Service
public class QueryProcessor implements Processor {
    //...
    @Override
    public String getProcessorName() {
        return "QUERY_PROCESSOR";
    }
}
Then you must create a Spring configuration, or add the bean creation to an existing one:
@Configuration
public class MyShinyProcessorsConfiguration {

    @Bean
    @Qualifier("processorsMap")
    public Map<String, Processor> processorsMap(List<Processor> processors) {
        Map<String, Processor> procMap = new HashMap<>();
        processors.forEach(processor -> procMap.put(processor.getProcessorName(), processor));
        return procMap;
    }
}
...and then you can simply inject your processors map into any component:
@Service
public class MyAysncClient {

    @Autowired
    @Qualifier("processorsMap")
    private Map<String, Processor> processorsMap;
}
We are trying to use the Spring Data CrudRepository in our project to provide persistence for our domain objects.
For a start I chose Redis as the backend, since a first experiment with a CrudRepository<ExperimentDomainObject, String> made it seem easy to get running.
When trying to put it into our production code, things got more complicated, because there our domain objects do not necessarily use a simple type as key, so the repository was CrudRepository<TestObject, ObjectId>.
Now I got the exception:
No converter found capable of converting from type [...ObjectId] to type [byte[]]
Searching for this exception led me to an answer that prompted some uneducated experimenting with the RedisTemplate configuration. (For my experiment I am using embedded-redis.)
My idea was to provide a RedisTemplate<Object, Object> instead of RedisTemplate<String, Object>, to let the Jackson2JsonRedisSerializer do the work as keySerializer as well.
Still, calling testRepository.save(testObject) fails.
Please see my code:
I use public fields and left out the imports for the brevity of this example. If they are required (to make this an MVCE), I will happily provide them; just leave me a comment.
dependencies:
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-data-redis'
    implementation 'org.springframework.boot:spring-boot-starter-web'
    compileOnly 'org.projectlombok:lombok'
    annotationProcessor 'org.projectlombok:lombok'
    implementation group: 'redis.clients', name: 'jedis', version: '2.9.0'
    implementation group: 'it.ozimov', name: 'embedded-redis', version: '0.7.2'
}
RedisConfiguration:
@Configuration
@EnableRedisRepositories
public class RedisConfiguration {

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        return new JedisConnectionFactory();
    }

    @Bean
    public RedisTemplate<Object, Object> redisTemplate() {
        Jackson2JsonRedisSerializer jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer(Object.class);
        final RedisTemplate<Object, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        template.setDefaultSerializer(jackson2JsonRedisSerializer);
        template.setKeySerializer(jackson2JsonRedisSerializer);
        template.setHashValueSerializer(jackson2JsonRedisSerializer);
        template.setValueSerializer(jackson2JsonRedisSerializer);
        template.setEnableDefaultSerializer(true);
        return template;
    }
}
TestObject
@RedisHash("test")
public class TestObject {

    @Id public ObjectId testId;
    public String value;

    public TestObject(ObjectId id, String value) {
        this.testId = id;
        this.value = value; // In the experiment this is: "magic"
    }
}
ObjectId
@EqualsAndHashCode
public class ObjectId {
    public String creator; // In the experiment, this is "me"
    public String name;    // In the experiment, this is "fool"
}
TestRepository
@Repository
public interface TestRepository extends CrudRepository<TestObject, ObjectId> {
}
EmbeddedRedisConfiguration
@Configuration
public class EmbeddedRedisConfiguration {

    private final redis.embedded.RedisServer redisServer;

    EmbeddedRedisConfiguration(RedisProperties redisProperties) {
        this.redisServer = new redis.embedded.RedisServer(redisProperties.getPort());
    }

    @PostConstruct
    public void init() {
        redisServer.start();
    }

    @PreDestroy
    public void shutdown() {
        redisServer.stop();
    }
}
Application:
@SpringBootApplication
public class ExperimentApplication {
    public static void main(String[] args) {
        SpringApplication.run(ExperimentApplication.class, args);
    }
}
Not the desired answer:
Of course, I might introduce some special ID which is a simple data type, e.g. a JSON string which I build manually using Jackson's ObjectMapper, and then use a CrudRepository<TestObject, String>.
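For illustration, a minimal sketch of that (undesired) workaround, assuming a hand-built composite string instead of Jackson's ObjectMapper, just to show the key flattened to a simple type; the delimiter and helper method are hypothetical:

```java
public class StringKeySketch {

    // Hypothetical: flatten the ObjectId fields (creator, name) into one plain
    // String, so that a CrudRepository<TestObject, String> could be used instead
    // of CrudRepository<TestObject, ObjectId>. The question suggests Jackson's
    // ObjectMapper for the same purpose; here we concatenate by hand.
    static String toKey(String creator, String name) {
        return creator + ":" + name;
    }

    public static void main(String[] args) {
        String key = toKey("me", "fool");
        System.out.println(key); // me:fool
    }
}
```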
What I also tried in the meantime:
RedisTemplate<String, String>
RedisTemplate<String, Object>
Autowiring a RedisTemplate and setting its default serializer
Registering a Converter<ObjectId, byte[]> with an autowired ConverterRegistry, and with an autowired GenericConversionService
but apparently those were the wrong ones.
Basically, the Redis repositories use the RedisKeyValueTemplate under the hood to store data as key (id) and value pairs, so your RedisTemplate configuration will not be used unless you use that template directly.
So one way for you is to use the RedisTemplate directly; something like this will work:
@Service
public class TestService {

    @Autowired
    private RedisTemplate redisTemplate;

    public void saveIt(TestObject testObject) {
        ValueOperations<ObjectId, TestObject> values = redisTemplate.opsForValue();
        values.set(testObject.testId, testObject);
    }
}
The above code will use your configuration and store the pair in Redis, using Jackson as the mapper for both the key and the value.
But if you want to use the Redis repositories via CrudRepository, you need to create reading and writing converters for ObjectId, from and to both String and byte[], and register them as custom Redis conversions.
So let's create the reading and writing converters for ObjectId <-> String.
Reader
@Component
@ReadingConverter
@Slf4j
public class RedisReadingStringConverter implements Converter<String, ObjectId> {

    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public ObjectId convert(String source) {
        try {
            return objectMapper.readValue(source, ObjectId.class);
        } catch (IOException e) {
            log.warn("Error while converting to ObjectId.", e);
            throw new IllegalArgumentException("Can not convert to ObjectId");
        }
    }
}
Writer
@Component
@WritingConverter
@Slf4j
public class RedisWritingStringConverter implements Converter<ObjectId, String> {

    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public String convert(ObjectId source) {
        try {
            return objectMapper.writeValueAsString(source);
        } catch (JsonProcessingException e) {
            log.warn("Error while converting ObjectId to String.", e);
            throw new IllegalArgumentException("Can not convert ObjectId to String");
        }
    }
}
And the reading and writing converters for ObjectId <-> byte[]
Writer
@Component
@WritingConverter
public class RedisWritingByteConverter implements Converter<ObjectId, byte[]> {

    Jackson2JsonRedisSerializer<ObjectId> jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer(ObjectId.class);

    @Override
    public byte[] convert(ObjectId source) {
        return jackson2JsonRedisSerializer.serialize(source);
    }
}
Reader
@Component
@ReadingConverter
public class RedisReadingByteConverter implements Converter<byte[], ObjectId> {

    Jackson2JsonRedisSerializer<ObjectId> jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer(ObjectId.class);

    @Override
    public ObjectId convert(byte[] source) {
        return jackson2JsonRedisSerializer.deserialize(source);
    }
}
Finally, register the custom Redis conversions. Just put this code into the RedisConfiguration:
@Bean
public RedisCustomConversions redisCustomConversions(RedisReadingByteConverter byteReadingConverter,
                                                     RedisWritingByteConverter byteWritingConverter,
                                                     RedisReadingStringConverter stringReadingConverter,
                                                     RedisWritingStringConverter stringWritingConverter) {
    return new RedisCustomConversions(Arrays.asList(byteReadingConverter, byteWritingConverter, stringReadingConverter, stringWritingConverter));
}
Now that the converters are created and registered as custom Redis conversions, the RedisKeyValueTemplate can use them and your code should work as expected.
This is what I have for the repo:
@RepositoryRestResource(collectionResourceRel = "people", path = "people")
public interface PersonRepo extends PagingAndSortingRepository<Person, Long> {
    List<Person> findByName(@Param("name") String name);
}
http://localhost:8080/system/people is what I'm trying to access, where "http://localhost:8080/system" is the base url for the spring MVC project
I've found the solution. According to http://docs.spring.io/spring-data/rest/docs/2.3.0.RELEASE/reference/html/#getting-started.introduction, I added a class for the data access configuration:
@Configuration
public class CustomizedRestMvcConfiguration extends RepositoryRestMvcConfiguration {

    @Override
    public RepositoryRestConfiguration config() {
        RepositoryRestConfiguration config = super.config();
        config.setBasePath("/api");
        return config;
    }
}
In addition, I also added @Import(CustomizedRestMvcConfiguration.class) to my Spring application's config class. Now, whenever I call http://localhost:8080/system/api/people, the database entries are retrieved.
try
@RequestMapping(value = "/people/", method = RequestMethod.GET)
public interface PersonRepo extends PagingAndSortingRepository<Person, Long> {
    List<Person> findByName(@Param("name") String name);
}