How to use inheritance with ModelMapper's PropertyMap - java

I have been using ModelMapper to convert Identity objects (entities) into DTO objects. I want to implement generic property conversions in a generic class for all entities (GenericEToDtoPropertyMap) and explicit property conversions in separate child classes for each entity (PersonEToDTOPropertyMap for the entity Person). To make it clearer, this is my code:
Generic property map:
public class GenericEToDtoPropertyMap<E extends Identity, DTO extends PersistentObjectDTO> extends PropertyMap<E, DTO> {
    @Override
    protected void configure() {
        // E.oid to DTO.id conversion
        map().setId(source.getOid());
    }
}
Specific property map for entity Person:
public class PersonEToDTOPropertyMap extends GenericEToDtoPropertyMap<Person, PersonDTO> {
    @Override
    protected void configure() {
        super.configure();
        // implement explicit conversions here
    }
}
Usage of property map:
modelMapper = new ModelMapper();
Configuration configuration = modelMapper.getConfiguration();
configuration.setMatchingStrategy(MatchingStrategies.STRICT);
modelMapper.addMappings(new PersonEToDTOPropertyMap());
// convert person object
PersonDTO personDto = modelMapper.map(person, PersonDTO.class);
The problem is that the generic conversions do not apply. In my case person.oid does not get copied to personDto.id. It works correctly only if I remove the part:
map().setId(source.getOid());
from the GenericEToDtoPropertyMap.configure() method and put it in the PersonEToDTOPropertyMap.configure() method.
I guess it has something to do with ModelMapper using reflection to implement the mappings, but it would be nice if I could use inheritance in my property maps. Do you have any idea how to do this?

I just found an answer from the creator of ModelMapper, Jonathan Halterman:
https://groups.google.com/forum/#!topic/modelmapper/cvLTfqnHhqQ
This sort of mapping inheritance is not currently possible.
So I guess I have to implement all conversions in the child class.
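For illustration, a minimal sketch of what that workaround looks like, using the classes from the question: the child map extends PropertyMap directly and repeats the generic oid-to-id line. The extra setter is a hypothetical Person property, not taken from the question.
public class PersonEToDTOPropertyMap extends PropertyMap<Person, PersonDTO> {
    @Override
    protected void configure() {
        // generic E.oid to DTO.id conversion, repeated here because it is not inherited
        map().setId(source.getOid());
        // explicit Person-specific conversions
        map().setFullName(source.getName()); // hypothetical properties, for illustration only
    }
}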

Well, after 5 years... I have a solution (I had the same problem).
After looking at some posts and searching for answers, I found this post. The problem happens because the configure() method restricts what you can do inside it, so I thought of creating a method outside of it to build the custom format. This is my code:
public class UserMap extends PropertyMap<User, UserLoginDTO> {
    @Override
    protected void configure() {
        using(generateFullname()).map(source, destination.getFullName());
    }

    private Converter<User, String> generateFullname() {
        return context -> {
            User user = context.getSource();
            return user.getName() + " " + user.getFirstLastname() + " " + user.getSecondLastname();
        };
    }
}
So you can call it from your main method or wherever you need it, like this:
ModelMapper modelMapper = new ModelMapper();
modelMapper.addMappings(new UserMap());
and the output is:
UserLoginDTO{id='001', fullName='Jhon Sunderland Rex'}
As you can see, the logic is outside the configure() method and it works. I hope this helps others.
P.S.: sorry for my English.

Related

Java Spring Webflux: issue types MyInterface and org.springframework.data.repository.reactive.ReactiveCrudRepository<MyPojo,String> are incompatible

A small question regarding Java Spring Webflux and type incompatibility issues, please.
I have a very simple Spring Webflux application, where I declare a common interface for all my repositories to save a pojo:
public interface MyInterface {
    Mono<MyPojo> save(MyPojo myPojo);
}
Here is an example of a concrete implementation, for instance Redis:
#Repository("redis")
public class MyRedisRepository implements MyInterface {
private final ReactiveRedisOperations<String, String> reactiveRedisOperations;
public MyRedisRepository(ReactiveRedisOperations<String, String> reactiveRedisOperations) {
this.reactiveRedisOperations = reactiveRedisOperations;
}
#Override
public Mono<MyPojo> save(MyPojo myPojo) {
return reactiveRedisOperations.opsForValue().set("someKey", "someValue").map(__ -> myPojo);
}
}
And now an example with Elastic.
#Repository("elastic")
public interface MyElasticRepository extends ReactiveElasticsearchRepository<MyPojo, String>, MyInterface {
}
Please note, the important point is that some are regular classes (like the Redis one) which need to implement the save method, while others are interfaces that extend Reactive___Repository<MyPojo, String> (which already has a save method).
When trying to compile, I am faced with this issue:
types question.MyInterface and org.springframework.data.repository.reactive.ReactiveCrudRepository<question.MyPojo,java.lang.String> are incompatible;
This is a bit strange to me, as my intention is just to have all the repositories under a common MyInterface, with a save method.
May I ask why I am facing this issue, and most importantly, how to resolve it (keeping MyInterface), please?
Thank you
The return type and parameter type of the save method defined on the two interfaces are different, which makes them incompatible.
The ReactiveCrudRepository that ReactiveElasticsearchRepository extends specifies that types derived from MyPojo can be passed to and will be returned from the save method.
In your custom interface you limit the passed argument and return type strictly to MyPojo. So the compiler recognizes there is no way to determine which save method is called at runtime and complains.
Try adjusting the return type of your interface to the following and adjusting your implementations:
public interface MyInterface<T extends MyPojo> {
    Mono<T> save(T myPojo);
}
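For illustration, a minimal sketch of what "adjusting your implementations" could look like for the Redis repository (my own reading of the answer, not code from it); the Elastic interface would declare MyInterface<MyPojo> alongside ReactiveElasticsearchRepository<MyPojo, String> in the same way.
@Repository("redis")
public class MyRedisRepository implements MyInterface<MyPojo> {

    private final ReactiveRedisOperations<String, String> reactiveRedisOperations;

    public MyRedisRepository(ReactiveRedisOperations<String, String> reactiveRedisOperations) {
        this.reactiveRedisOperations = reactiveRedisOperations;
    }

    @Override
    public Mono<MyPojo> save(MyPojo myPojo) {
        // same behaviour as before; only the implemented interface changed
        return reactiveRedisOperations.opsForValue().set("someKey", "someValue").map(__ -> myPojo);
    }
}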

Cannot use generic with Util class in Java

I have a Util class and there is a static method called csvToEmployees. In order to use this method with different types of request classes, I am trying to convert the class so that it takes a generic parameter, as shown below:
public class CsvHelper<T> {
    public List<T> csvToEmployees(InputStream is) {
        // code omitted
        for (CSVRecord rec : records) {
            T employee = new T(
                // ...
            );
            employees.add(employee);
        }
        return employees;
    }
}
I call this method from my service by injecting this util class as shown below:
@Service
@RequiredArgsConstructor
public class EmployeeService {

    private final EmployeeRepository employeeRepository;
    private final CsvHelper<EmployeeRequest> helper;

    public void create(MultipartFile file) {
        List<Employee> employees = helper.csvToEmployees(file.getInputStream()).stream()
                .map(EmployeeRequestMapper::mapToEntity)
                .toList();
        // ...
    }
}
My problems are:
1. Is the implementation approach above OK or not? I mean, assuming that there are different kinds of requests with the same fields, is using generics with that approach OK?
2. I get a "Type parameter 'T' cannot be instantiated directly" error on the T employee = new T( line of the util class. How can I fix it?
The best solution, in my opinion, is just creating multiple csvToObject methods inside the classes that you need to process.
I mean, if you already know that you're transforming a stream into a list of employees (it's "hard coded" in the service method), why would you need to use generics? Just use a method for employees instead, as sketched below.
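A minimal sketch of that suggestion, assuming Apache Commons CSV (implied by the CSVRecord in the question) and a hypothetical EmployeeRequest constructor with made-up column names:
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

public class CsvHelper {

    // no type parameter needed: the target type is fixed to EmployeeRequest
    public static List<EmployeeRequest> csvToEmployees(InputStream is) throws IOException {
        try (Reader reader = new InputStreamReader(is, StandardCharsets.UTF_8);
             CSVParser parser = CSVFormat.DEFAULT.withFirstRecordAsHeader().parse(reader)) {
            List<EmployeeRequest> employees = new ArrayList<>();
            for (CSVRecord rec : parser) {
                // column names are placeholders for the example
                employees.add(new EmployeeRequest(rec.get("firstName"), rec.get("lastName")));
            }
            return employees;
        }
    }
}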

Dynamic dependency injection for multiple implementations of the same interface with Spring MVC

I am working on a REST API where I have an interface that defines a list of methods which are implemented by 4 different classes, with the possibility of adding many more in the future.
When I receive an HTTP request from the client there is some information included in the URL which will determine which implementation needs to be used.
Within my controller, I would like to have the end-point method contain a switch statement that checks the URL path variable and then uses the appropriate implementation.
I know that I can define and inject the concrete implementations into the controller and then insert which one I would like to use in each particular case in the switch statement, but this doesn't seem very elegant or scalable for 2 reasons:
I now have to instantiate all of the services, even though I only need to use one.
The code seems like it could be much leaner, since I am literally calling the same method defined in the interface with the same parameters. In the example it is not really an issue, but as the list of implementations grows, so does the number of cases and the amount of redundant code.
Is there a better solution to solve this type of situation? I am using SpringBoot 2 and JDK 10, ideally, I'd like to implement the most modern solution.
My Current Approach
@RequestMapping(Requests.MY_BASE_API_URL)
public class MyController {

    //== FIELDS ==
    private final ConcreteServiceImpl1 concreteService1;
    private final ConcreteServiceImpl2 concreteService2;
    private final ConcreteServiceImpl3 concreteService3;

    //== CONSTRUCTORS ==
    @Autowired
    public MyController(ConcreteServiceImpl1 concreteService1, ConcreteServiceImpl2 concreteService2,
                        ConcreteServiceImpl3 concreteService3) {
        this.concreteService1 = concreteService1;
        this.concreteService2 = concreteService2;
        this.concreteService3 = concreteService3;
    }

    //== REQUEST MAPPINGS ==
    @GetMapping(Requests.SPECIFIC_REQUEST)
    public ResponseEntity<?> handleSpecificRequest(@PathVariable String source,
                                                   @RequestParam String start,
                                                   @RequestParam String end) {
        source = source.toLowerCase();
        if (MyConstants.SOURCES.contains(source)) {
            switch (source) {
                case ("value1"):
                    concreteService1.doSomething(start, end);
                    break;
                case ("value2"):
                    concreteService2.doSomething(start, end);
                    break;
                case ("value3"):
                    concreteService3.doSomething(start, end);
                    break;
            }
        } else {
            // An invalid source path variable was received
        }
        // Return something after additional processing
        return null;
    }
}
In Spring you can get all implementations of an interface (say T) by injecting a List<T> or a Map<String, T> field. In the second case the names of the beans will become the keys of the map. You could consider this if there are a lot of possible implementations or if they change often; this way you could add or remove an implementation without changing the controller.
Injecting a List and injecting a Map each have some benefits and drawbacks in this case. If you inject a List, you would probably need to add some method to map the name to the implementation, something like:
interface MyInterface {
    // ...
    String name();
}
This way you could transform it into a Map<String, MyInterface>, for example using the Streams API (see the sketch below). While this would be more explicit, it would pollute your interface a bit (why should it be aware that there are multiple implementations?).
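A minimal sketch of that transformation (my own illustration, assuming the name() method above):
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

@RestController
class Controller {

    private final Map<String, MyInterface> implementations;

    Controller(List<MyInterface> implementations) {
        // key each injected implementation by the name it reports
        this.implementations = implementations.stream()
                .collect(Collectors.toMap(MyInterface::name, Function.identity()));
    }
}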
When using the Map you should probably name the beans explicitly, or even introduce an annotation, to follow the principle of least astonishment. If you name the beans after the class name or the method name of the configuration class, you could break the app by renaming those (and in effect changing the URL), even though renaming is usually a safe operation.
A simplistic implementation in Spring Boot could look like this:
@SpringBootApplication
public class DynamicDependencyInjectionForMultipleImplementationsApplication {

    public static void main(String[] args) {
        SpringApplication.run(DynamicDependencyInjectionForMultipleImplementationsApplication.class, args);
    }

    interface MyInterface {
        Object getStuff();
    }

    class Implementation1 implements MyInterface {
        @Override public Object getStuff() {
            return "foo";
        }
    }

    class Implementation2 implements MyInterface {
        @Override public Object getStuff() {
            return "bar";
        }
    }

    @Configuration
    class Config {
        @Bean("getFoo")
        Implementation1 implementation1() {
            return new Implementation1();
        }

        @Bean("getBar")
        Implementation2 implementation2() {
            return new Implementation2();
        }
    }

    @RestController
    class Controller {

        private final Map<String, MyInterface> implementations;

        Controller(Map<String, MyInterface> implementations) {
            this.implementations = implementations;
        }

        @GetMapping("/run/{beanName}")
        Object runSelectedImplementation(@PathVariable String beanName) {
            return Optional.ofNullable(implementations.get(beanName))
                    .orElseThrow(UnknownImplementation::new)
                    .getStuff();
        }

        @ResponseStatus(BAD_REQUEST)
        class UnknownImplementation extends RuntimeException {
        }
    }
}
It passes the following tests:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class DynamicDependencyInjectionForMultipleImplementationsApplicationTests {

    @Autowired
    private MockMvc mockMvc;

    @Test
    public void shouldCallImplementation1() throws Exception {
        mockMvc.perform(get("/run/getFoo"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("foo")));
    }

    @Test
    public void shouldCallImplementation2() throws Exception {
        mockMvc.perform(get("/run/getBar"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("bar")));
    }

    @Test
    public void shouldRejectUnknownImplementations() throws Exception {
        mockMvc.perform(get("/run/getSomethingElse"))
                .andExpect(status().isBadRequest());
    }
}
Regarding two of your doubts:
1. Instantiating the service objects should not be an issue, as this is a one-time job and the controller will need them to serve all types of requests.
2. You can use exact path mappings to get rid of the switch case, e.g.:
@GetMapping("/specificRequest/value1")
@GetMapping("/specificRequest/value2")
@GetMapping("/specificRequest/value3")
Each of the above mappings would go on a separate method that deals with one specific source value and invokes the respective service method (see the sketch below).
Hope this helps make the code cleaner and more elegant.
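A minimal sketch of the per-path approach, reusing the names from the question (@RestController added and return values used as placeholders; my own illustration, not code from the answer):
@RestController
@RequestMapping(Requests.MY_BASE_API_URL)
public class MyController {

    private final ConcreteServiceImpl1 concreteService1;
    private final ConcreteServiceImpl2 concreteService2;

    public MyController(ConcreteServiceImpl1 concreteService1, ConcreteServiceImpl2 concreteService2) {
        this.concreteService1 = concreteService1;
        this.concreteService2 = concreteService2;
    }

    // one mapping and one method per source value; no switch statement needed
    @GetMapping("/specificRequest/value1")
    public ResponseEntity<?> handleValue1(@RequestParam String start, @RequestParam String end) {
        concreteService1.doSomething(start, end);
        return ResponseEntity.ok().build();
    }

    @GetMapping("/specificRequest/value2")
    public ResponseEntity<?> handleValue2(@RequestParam String start, @RequestParam String end) {
        concreteService2.doSomething(start, end);
        return ResponseEntity.ok().build();
    }
}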
There is one more option: separating this at the service layer and having only one endpoint serve all source types. But since, as you said, there is a different implementation for each source value, the source is effectively a resource of your application, and having a separate URI/method per source makes perfect sense here. A few advantages I see with this are:
It makes it easy to write test cases.
Each source scales without impacting any other source/service.
Your code treats each source as an entity separate from the other sources.
The above approach should be fine when you have a limited set of source values. If you have no control over the source values, then a further redesign is needed, for example differentiating sources by one more value such as sourceType and then having a separate controller for each group of sources.

How to force Spring Data to create query methods with entity runtime type?

I've got around 5 objects that I want to do similar things with.
I figured that, in order not to pollute the code, I would put the logic for those objects in one place.
public class MetaObjectController<T extends MetaObject> {

    @Autowired
    private final MetaObjectRepository<T> repository;

    // generic logic
Here's how the repository looks:
public interface MetaObjectRepository<T extends MetaObject> extends GraphRepository<T> {
    T findByName(String name);
}
Now I create a concrete class which uses delegation:
public class ExperimentalController {

    @Autowired
    private final MetaObjectController<MetaCategory> metaController;

    @RequestMapping(method = RequestMethod.POST)
    public void add(@RequestBody MetaCategory toAdd) {
        metaController.add(toAdd);
    }
Now, when I look at the generated queries, I see that although the repository is instantiated correctly, it uses MetaObject as the entity name instead of the runtime type.
Is there a way to force the repository to use runtime type?
Please don't advise putting a @Query annotation on it; that's not what I am looking for.
This is most probably due to type erasure: at runtime only the type constraint is available, which is MetaObject. If you want to use (via spring-data) the actually relevant subclass, you will have to create explicit sub-interfaces of MetaObjectRepository like this:
public class Transmogrifier extends MetaObject {}

public interface MetaTransmogrifierRepository
        extends MetaObjectRepository<Transmogrifier> {}
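For completeness, a minimal usage sketch (my own illustration, not code from the answer): injecting the explicit repository makes Spring Data derive queries for the runtime type. The controller name and path are hypothetical.
@RestController
@RequestMapping("/transmogrifiers") // hypothetical path, for illustration only
public class TransmogrifierController {

    private final MetaTransmogrifierRepository repository;

    public TransmogrifierController(MetaTransmogrifierRepository repository) {
        this.repository = repository;
    }

    @GetMapping("/{name}")
    public Transmogrifier findByName(@PathVariable String name) {
        // the derived query now targets the Transmogrifier entity, not MetaObject
        return repository.findByName(name);
    }
}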

How to use Spring MVC #JsonView when returning an object hierarchy from a Rest Controller

I'm building an application which uses Spring MVC 4.10 with Jackson 2.3.2.
I have a Project class which has children Proposal objects and a Customer object. These Proposal objects are complex and I want to return a summarized JSON view of them. A similar situation happens with the Customer object. I'm trying to implement this with @JsonView annotations.
I wanted to ask if extending the views of the member object classes in the container object class view is the way to do this or, if not, if there is a cleaner way to implement this that I am unaware of.
Context
Before today, I was under the false impression that you could annotate your controller with multiple views and that the resulting JSON representation would be filtered accordingly.
@JsonView({Project.Extended.class, Proposal.Summary.class, Customer.Summary.class})
@Transactional
@RequestMapping(value = "/project", method = RequestMethod.GET)
public @ResponseBody List<Project> findAll() {
    return projectDAO.findAll();
}
Where each class had its own JsonView annotations and interfaces
e.g.:
public class Customer {
    ...
    public interface Summary {}
    public interface Normal extends Summary {}
    public interface Extended extends Normal {}
}
Nevertheless, it is only the first view in the array that gets taken into account. According to https://spring.io/blog/2014/12/02/latest-jackson-integration-improvements-in-spring
Only one class or interface can be specified with the @JsonView annotation, but you can use inheritance to represent JSON View hierarchies (if a field is part of a JSON View, it will be also part of a parent view). For example, this handler method will serialize fields annotated with @JsonView(View.Summary.class) and @JsonView(View.SummaryWithRecipients.class):
and the official documentation in http://docs.spring.io/spring/docs/current/spring-framework-reference/html/mvc.html#mvc-ann-jsonview
To use it with an @ResponseBody controller method or controller methods that return ResponseEntity, simply add the @JsonView annotation with a class argument specifying the view class or interface to be used:
So, I ended up extending the views of the members in the view of the container object, like this
@Entity
public class Project {
    ...
    public static interface Extended extends Normal, Proposal.Extended {}
    public static interface Normal extends Summary, Customer.Normal {}
    public static interface Summary {}
}
and changed my controller to this
@JsonView(Project.Extended.class)
@Transactional
@RequestMapping(value = "/project", method = RequestMethod.GET)
public @ResponseBody List<Project> findAll() {
    return projectDAO.findAll();
}
This seems to do the trick, but I couldn't find documentation or discussion about this situation. Is this the intended use of JsonViews or is it kind of hackish?
Thank you in advance
-Patricio Marrone
I believe you have configured your views as necessary. The root of the issue is not Spring's @JsonView, but rather Jackson's implementation of views. As stated in Jackson's view documentation:
Only single active view per serialization; but due to inheritance of Views, can combine Views via aggregation.
So, it appears that Spring is simply passing on and adhering to the limitation set in place by Jackson 2.
I use Jersey + Jackson but ran into just the same problem.
Here is a trick I use in my application to allow requiring several JSON views during serialization. I bet it is also possible with Spring MVC instead of Jersey, but I am not 100% sure. It also does not seem to have performance issues. Maybe it is a bit complicated for your case, but if you have a large object with a big number of possible views, it may be better than doing a lot of inheritance.
So I use the Jackson filter approach to require several views during serialization. However, I haven't found a way to avoid putting @JsonFilter("name") above the classes to map, which is not so clean, but at least I hide it behind a custom annotation:
@Retention(RetentionPolicy.RUNTIME)
@JacksonAnnotationsInside
@JsonFilter(JSONUtils.JACKSON_MULTIPLE_VIEWS_FILTER_NAME)
public @interface JsonMultipleViews {}
The filter itself looks like this:
public class JsonMultipleViewsFilter extends SimpleBeanPropertyFilter {

    private Collection<Class<?>> wantedViews;

    public JsonMultipleViewsFilter(Collection<Class<?>> wantedViews) {
        this.wantedViews = wantedViews;
    }

    @Override
    public void serializeAsField(Object pojo, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (include(writer)) {
            JsonView jsonViewAnnotation = writer.getAnnotation(JsonView.class);
            // serialize the field only if there is no @JsonView annotation or, if there is one, check that at least one
            // of the view classes above the field fits one of the required classes. If yes, serialize the field; if no, skip the field
            if (jsonViewAnnotation == null || containsJsonViews(jsonViewAnnotation.value())) {
                writer.serializeAsField(pojo, jgen, provider);
            }
        } else if (!jgen.canOmitFields()) {
            // since 2.3
            writer.serializeAsOmittedField(pojo, jgen, provider);
        }
    }

    private boolean containsJsonViews(Class<?>[] viewsOfProperty) {
        for (Class<?> viewOfProperty : viewsOfProperty) {
            for (Class<?> wantedView : wantedViews) {
                // check also subclasses of the required view class
                if (viewOfProperty.isAssignableFrom(wantedView)) {
                    return true;
                }
            }
        }
        return false;
    }

    @Override
    protected boolean include(BeanPropertyWriter writer) {
        return true;
    }

    @Override
    protected boolean include(PropertyWriter writer) {
        return true;
    }
}
I can use this filter like this:
public static String toJson(Object object, Collection<Class<?>> jsonViewClasses) throws JsonProcessingException {
    // if no json view class is provided, just map without the view approach
    if (jsonViewClasses.isEmpty()) {
        return mapper.writeValueAsString(object);
    }
    // if only one json view class is provided, use the out-of-the-box jackson mechanism for handling json views
    if (jsonViewClasses.size() == 1) {
        return mapper.writerWithView(jsonViewClasses.iterator().next()).writeValueAsString(object);
    }
    // if more than one json view class is provided, use the custom filter to serialize with multiple views
    JsonMultipleViewsFilter jsonMultipleViewsFilter = new JsonMultipleViewsFilter(jsonViewClasses);
    return mapper.writer(new SimpleFilterProvider()                     // use the filter approach when serializing
            .setDefaultFilter(jsonMultipleViewsFilter)                  // set it as the default filter in case of an error in the filter name
            .addFilter(JACKSON_MULTIPLE_VIEWS_FILTER_NAME, jsonMultipleViewsFilter) // set the custom filter for multiple views by name
            .setFailOnUnknownId(false))                                 // if the filter is unknown, don't fail, use the default one
            .writeValueAsString(object);
}
After that, Jersey allows us to add Jersey filters when the application starts up (it goes through each endpoint in each controller at startup, and at that moment we can easily bind the Jersey filters wherever the @JsonView annotation above an endpoint has multiple values).
In the Jersey filter for a @JsonView annotation with multiple values above an endpoint, once it is bound on startup to the correct endpoints depending on the annotations, we can easily override the response entity by calling that utility method:
toJson(previousResponseObjectReturned, viewClassesFromAnnotation);
There is no reason to provide the code of the Jersey filter here since you're using Spring MVC. I just hope that it's easy to do the same in Spring MVC.
The Domain Object would look like this:
@JsonMultipleViews
public class Example {

    private int id;
    private String name;

    @JsonView(JsonViews.Extended.class)
    private String extendedInfo;

    @JsonView(JsonViews.Meta.class)
    private Date updateDate;

    public static class JsonViews {
        public interface Min {}
        public interface Extended extends Min {}
        public interface Meta extends Min {}
        // ...
        public interface All extends Extended, Meta {} // interfaces are needed for multiple inheritance of views
    }
}
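For example, requesting both Extended and Meta through the toJson utility above would include every field of Example: id and name carry no @JsonView annotation, and each of the other two fields matches one of the requested views. Given an Example instance named example:
String json = toJson(example, Arrays.asList(Example.JsonViews.Extended.class, Example.JsonViews.Meta.class));
// json now contains id, name, extendedInfo and updateDate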
We can omit putting Min.class on those fields that are always required regardless of the view. We just put Min among the required views and all fields without a @JsonView annotation will be serialized.
The All.class view is needed in my case because, if we have a specific set of views for each domain class (like in my case) and then need to map a complex model consisting of several domain objects that both use the views approach (some view for object one, but all views for object two), it's easier to put it above the endpoint like this:
@JsonView({ObjectOneViews.SomeView.class, ObjectTwoViews.All.class})
because if we omit ObjectTwoViews.All.class here and require only ObjectOneViews.SomeView.class, the fields that are marked with an annotation in object two will not be serialized.
