I have created the following projection:
@Projection(name = "select", types = {Organisation.class})
public interface OrganisationSelectProjection {
    Long getId();
    String getName();
}
I basically want to use it in a "Select" component, so I need as little data as possible. I also wanted to remove all the links with a ResourceProcessor, so I created this:
@Bean
public ResourceProcessor<Resource<OrganisationSelectProjection>> organisationProcessor() {
    return resource -> {
        resource.removeLinks();
        return resource;
    };
}
However, it looks like this breaks the entire API: whatever endpoint I hit, I get the following exception message: org.springframework.hateoas.PagedResources cannot be cast to org.springframework.hateoas.Resource
Any idea what I have done wrong?
If you'd like to keep the anonymous class in place, using ResourceSupport instead of Resource can resolve the issue.
@Bean
public ResourceProcessor<ResourceSupport> organisationProcessor() {
    return resource -> {
        resource.removeLinks();
        return resource;
    };
}
But in this case the processor will be applied to each and every resource, regardless of the type of its content (you can check the type inside the method body, though; see the sketch below).
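A minimal sketch of such a check, assuming you only want to strip links when the wrapped content is the OrganisationSelectProjection (the instanceof guard is the only addition here):
@Bean
public ResourceProcessor<ResourceSupport> organisationProcessor() {
    return resource -> {
        // only strip links when the resource wraps the select projection
        if (resource instanceof Resource
                && ((Resource<?>) resource).getContent() instanceof OrganisationSelectProjection) {
            resource.removeLinks();
        }
        return resource;
    };
}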
Instead of an anonymous descendant of ResourceProcessor, you can define it as a stand-alone class:
public class OrganizationResourceProcessor implements ResourceProcessor<Resource<OrganisationSelectProjection>> {
    @Override
    public Resource<OrganisationSelectProjection> process(Resource<OrganisationSelectProjection> resource) {
        resource.removeLinks();
        return resource;
    }
}
@Bean
public OrganizationResourceProcessor organisationProcessor() {
    return new OrganizationResourceProcessor();
}
The issue you experienced has to do with type erasure: there are no type parameters in the anonymous class implementation. Your definition is type-safe, but it lacks the type information that is used at runtime to determine whether a particular ResourceProcessor can process a given resource. So spring-data-rest thinks the anonymous organisationProcessor can process PagedResources and feeds it to the processor, where the ClassCastException happens.
Not everything spring-data-rest puts through ResourceProcessors is a Resource. There are different descendants of the org.springframework.hateoas.ResourceSupport class (like PagedResources in your case), and the majority of them do not inherit from Resource.
I am kind of stuck on a problem with creating beans, or probably I have the wrong idea. Maybe you can help me solve it:
I have an application which takes in requests for batch processing. For every batch I need to create its own context, depending on the parameters issued by the request.
I will try to simplify it with the following example:
I receive a request to process, in a batch, FunctionA, which is an implementation of my Function_I interface and has the sub-implementations FunctionA_DE and FunctionA_AT.
Something like this:
public interface Function_I {
    String doFunctionStuff();
}
public abstract class FunctionA implements Function_I {
    FunctionConfig funcConfig;

    public FunctionA(FunctionConfig funcConfig) {
        this.funcConfig = funcConfig;
    }

    public String doFunctionStuff() {
        // some code
        String result = callSpecificFunctionStuff();
        // more code
        return result;
    }

    protected abstract String callSpecificFunctionStuff();
}
public class FunctionA_DE extends FunctionA {
    public FunctionA_DE(FunctionConfig funcConf) {
        super(funcConf);
    }

    @Override
    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        String result = "...";
        return result;
    }
}
public class FunctionA_AT extends FunctionA {
    public FunctionA_AT(FunctionConfig funcConf) {
        super(funcConf);
    }

    @Override
    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        String result = "...";
        return result;
    }
}
What would be the Spring Boot way of creating an instance of FunctionA_DE and obtaining it as a Function_I for the calling part of the application? And what should it look like when I add FunctionB with FunctionB_DE / FunctionB_AT to my classes?
I thought it could be something like:
PSEUDO CODE
@Configuration
public class FunctionFactory {

    @Bean
    @Scope(SCOPE_PROTOTYPE) // I need a new instance every time I call it
    public Function_I createFunctionA(FunctionConfiguration funcConfig) {
        // create the Function depending on the funcConfig, so either FunctionA_DE or FunctionA_AT
    }
}
and I would call it by autowiring the FunctionFactory into my calling class and using it with
someSpringFactory.createFunction(functionConfiguration);
but I can't figure out how to create a prototype bean for the function while passing a parameter. And I can't really find a solution to my question by browsing through SO, but maybe I just have the wrong search terms. Or my approach to solving this issue is totally wrong (maybe stupid), and nobody would solve it the Spring Boot way but would stick to factories.
Appreciate your help!
You could use Spring's application context. Create a bean for each of the interfaces but annotate it with a specific profile, e.g. "Function-A-AT". Now when you have to invoke it, you can simply set the active Spring profile accordingly and the right bean should be used by Spring.
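A minimal sketch of what those profile-specific bean definitions could look like (the profile names and the FunctionConfig bean are assumptions; also note that active profiles are normally fixed when the context starts, so this fits best when the variant is known at startup):
@Configuration
public class FunctionProfileConfig {

    // assumes a FunctionConfig bean is available in the context
    @Bean
    @Profile("function-a-de")
    public Function_I functionADe(FunctionConfig funcConfig) {
        return new FunctionA_DE(funcConfig);
    }

    @Bean
    @Profile("function-a-at")
    public Function_I functionAAt(FunctionConfig funcConfig) {
        return new FunctionA_AT(funcConfig);
    }
}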
Hello everyone and thanks for reading my question.
After a discussion with a friend who is well versed in the Spring framework, I came to the conclusion that my approach, or my favoured solution, was not what I was searching for and is not how Spring should be used. Because the Function_I instance depends on the configuration loaded for the specific batch, it is not recommended to manage all these instances as @Beans.
In the end I decided not to manage the instances of my Function_I with Spring. Instead I built a controller / factory, which is a @Controller class, and let this class build the instance I need, using the passed parameters for decision making at runtime.
This is how it looks (pseudo-code):
@Controller
public class FunctionController {

    private final SomeSpringManagedClass ssmc;

    public FunctionController(@Autowired SomeSpringManagedClass ssmc) {
        this.ssmc = ssmc;
    }

    public Function_I createFunction(FunctionConfiguration funcConf) {
        boolean funcA, funcB, cntryDE;
        // code to decide the function
        if (funcA && cntryDE) {
            return new FunctionA_DE(funcConf);
        } else if (funcB && cntryDE) {
            return new FunctionB_DE(funcConf);
        } // maybe more else if...
        throw new IllegalArgumentException("No matching function for " + funcConf);
    }
}
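For completeness, a hypothetical calling class could then use this factory roughly as follows (the BatchService name and its method are made up for illustration):
@Service
public class BatchService {

    private final FunctionController functionController;

    public BatchService(FunctionController functionController) {
        this.functionController = functionController;
    }

    public String processBatch(FunctionConfiguration funcConf) {
        // the factory decides at runtime which concrete Function_I to build
        Function_I function = functionController.createFunction(funcConf);
        return function.doFunctionStuff();
    }
}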
I am working on a REST API where I have an interface that defines a list of methods which are implemented by 4 different classes, with the possibility of adding many more in the future.
When I receive an HTTP request from the client there is some information included in the URL which will determine which implementation needs to be used.
Within my controller, I would like to have the end-point method contain a switch statement that checks the URL path variable and then uses the appropriate implementation.
I know that I can define and inject the concrete implementations into the controller and then insert which one I would like to use in each particular case in the switch statement, but this doesn't seem very elegant or scalable for 2 reasons:
I now have to instantiate all of the services, even though I only need to use one.
The code seems like it could be much leaner, since I am literally calling the same method defined in the interface with the same parameters. In this example it is not really an issue, but as the list of implementations grows, so does the number of cases and the amount of redundant code.
Is there a better way to solve this type of situation? I am using Spring Boot 2 and JDK 10; ideally, I'd like to implement the most modern solution.
My Current Approach
@RestController
@RequestMapping(Requests.MY_BASE_API_URL)
public class MyController {

    //== FIELDS ==
    private final ConcreteServiceImpl1 concreteService1;
    private final ConcreteServiceImpl2 concreteService2;
    private final ConcreteServiceImpl3 concreteService3;

    //== CONSTRUCTORS ==
    @Autowired
    public MyController(ConcreteServiceImpl1 concreteService1, ConcreteServiceImpl2 concreteService2,
                        ConcreteServiceImpl3 concreteService3) {
        this.concreteService1 = concreteService1;
        this.concreteService2 = concreteService2;
        this.concreteService3 = concreteService3;
    }

    //== REQUEST MAPPINGS ==
    @GetMapping(Requests.SPECIFIC_REQUEST)
    public ResponseEntity<?> handleSpecificRequest(@PathVariable String source,
                                                   @RequestParam String start,
                                                   @RequestParam String end) {
        source = source.toLowerCase();
        if (MyConstants.SOURCES.contains(source)) {
            switch (source) {
                case "value1":
                    concreteService1.doSomething(start, end);
                    break;
                case "value2":
                    concreteService2.doSomething(start, end);
                    break;
                case "value3":
                    concreteService3.doSomething(start, end);
                    break;
            }
        } else {
            // An invalid source path variable was received
        }
        // Return something after additional processing
        return null;
    }
}
In Spring you can get all implementations of an interface (say T) by injecting a List<T> or a Map<String, T> field. In the second case the names of the beans will become the keys of the map. You could consider this if there are a lot of possible implementations or if they change often. Thanks to it you could add or remove an implementation without changing the controller.
Injecting a List and injecting a Map each have some benefits and drawbacks in this case. If you inject a List, you would probably need to add some method to map a name to each implementation. Something like:
interface MyInterface {
    // ...
    String name();
}
This way you could transform it into a Map<String, MyInterface>, for example using the Streams API (see the sketch below). While this would be more explicit, it would pollute your interface a bit (why should it be aware that there are multiple implementations?).
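A minimal sketch of that transformation, assuming the implementations are injected as a List (the registry class shown here is purely illustrative, not part of the question's code):
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

import org.springframework.stereotype.Component;

@Component
class ImplementationRegistry {

    private final Map<String, MyInterface> byName;

    ImplementationRegistry(List<MyInterface> implementations) {
        // index every injected implementation by its self-declared name
        this.byName = implementations.stream()
                .collect(Collectors.toMap(MyInterface::name, Function.identity()));
    }

    MyInterface get(String name) {
        return byName.get(name);
    }
}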
When using the Map you should probably name the beans explicitly, or even introduce an annotation, to follow the principle of least astonishment. If you name the beans after the class name or the method name of the configuration class, you could break the app by renaming those (and in effect changing the URL), even though renaming is usually a safe operation.
A simplistic implementation in Spring Boot could look like this:
@SpringBootApplication
public class DynamicDependencyInjectionForMultipleImplementationsApplication {

    public static void main(String[] args) {
        SpringApplication.run(DynamicDependencyInjectionForMultipleImplementationsApplication.class, args);
    }

    interface MyInterface {
        Object getStuff();
    }

    class Implementation1 implements MyInterface {
        @Override public Object getStuff() {
            return "foo";
        }
    }

    class Implementation2 implements MyInterface {
        @Override public Object getStuff() {
            return "bar";
        }
    }

    @Configuration
    class Config {
        @Bean("getFoo")
        Implementation1 implementation1() {
            return new Implementation1();
        }

        @Bean("getBar")
        Implementation2 implementation2() {
            return new Implementation2();
        }
    }

    @RestController
    class Controller {
        private final Map<String, MyInterface> implementations;

        Controller(Map<String, MyInterface> implementations) {
            this.implementations = implementations;
        }

        @GetMapping("/run/{beanName}")
        Object runSelectedImplementation(@PathVariable String beanName) {
            return Optional.ofNullable(implementations.get(beanName))
                    .orElseThrow(UnknownImplementation::new)
                    .getStuff();
        }

        @ResponseStatus(BAD_REQUEST)
        class UnknownImplementation extends RuntimeException {
        }
    }
}
It passes the following tests:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class DynamicDependencyInjectionForMultipleImplementationsApplicationTests {

    @Autowired
    private MockMvc mockMvc;

    @Test
    public void shouldCallImplementation1() throws Exception {
        mockMvc.perform(get("/run/getFoo"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("foo")));
    }

    @Test
    public void shouldCallImplementation2() throws Exception {
        mockMvc.perform(get("/run/getBar"))
                .andExpect(status().isOk())
                .andExpect(content().string(containsString("bar")));
    }

    @Test
    public void shouldRejectUnknownImplementations() throws Exception {
        mockMvc.perform(get("/run/getSomethingElse"))
                .andExpect(status().isBadRequest());
    }
}
Regarding two of your doubts:
1. Instantiating the service objects should not be an issue, as this is a one-time job and the controller is going to need them to serve all types of requests.
2. You can use exact path mappings to get rid of the switch case. For example:
@GetMapping("/specificRequest/value1")
@GetMapping("/specificRequest/value2")
@GetMapping("/specificRequest/value3")
Each of the above mappings goes on a separate method that deals with its specific source value and invokes the respective service method (see the sketch below).
Hope this helps to make the code cleaner and more elegant.
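A minimal sketch of what two of those separate handler methods could look like, reusing the service fields from the question (the response handling is simplified and just an assumption):
@GetMapping("/specificRequest/value1")
public ResponseEntity<?> handleValue1(@RequestParam String start, @RequestParam String end) {
    // delegates directly to the implementation responsible for "value1"
    concreteService1.doSomething(start, end);
    return ResponseEntity.ok().build();
}

@GetMapping("/specificRequest/value2")
public ResponseEntity<?> handleValue2(@RequestParam String start, @RequestParam String end) {
    concreteService2.doSomething(start, end);
    return ResponseEntity.ok().build();
}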
There is one more option: separating this at the service layer and having only one endpoint to serve all types of source. But since, as you said, there is a different implementation for each source value, the source is really a resource of your application, so having a separate URI / separate method makes perfect sense here. A few advantages I see with this are:
It makes it easy to write test cases.
Each source/service can be scaled without impacting any other source/service.
Your code deals with each source as an entity separate from the other sources.
The above approach should be fine when you have a limited set of source values. If you have no control over the source values, then a further redesign is needed, for example by differentiating sources with an additional value like sourceType and then having a separate controller for each group of sources.
I am working on a Spring 2.0.1.RELEASE application.
Brief of the application:
1. I have separate Transformer beans that transform my DTOs to domain objects and vice versa.
2. I have separate Validator beans that validate the domain object being passed.
3. I have Service classes that take care of applying rules and calling the persistence layer.
Now, I want to build a workflow in my application,
where I will just call the start of the workflow, the steps mentioned below will be executed in order, and exception handling will be done per step:
1. First - Transformation: the transformToDomain() method will be called for that object type.
2. Second - Validator: the class's valid() method will be called for that object.
3. Third - Service: the class's save() method will be called for that object.
4. Fourth - Transformation: the transformToDTO() method will be called for that object type.
After this my workflow ends and I will return the DTO object as the response of my REST API.
Exception handling is the part I also want to take care of: if a particular exception handler exists for a step, call it, otherwise call a global exception handler.
I designed a prototype of this, but I am looking for some expert advice on how this can be achieved with a better design in Java.
An explanation with an example based on the above use case would be highly appreciated.
I'm not so sure that what you are describing is a workflow system in its true sense; perhaps a Chain of Responsibility is more like what you are talking about?
Following what you described as a sequence of execution, here is a simplified example of how I would implement the chain:
Transformer.java
public interface Transformer<IN, OUT> {
    OUT transformToDomain(IN dto);
    IN transformToDTO(OUT domainObject);
}
Validator.java
public interface Validator<T> {
    boolean isValid(T object);
}
Service.java
public interface Service {
    void save(Object object);
}
And the implementation that binds everything:
ProcessChain.java
public class ProcessChain {

    private Transformer transformer;
    private Service service;
    private Validator validator;

    Object process(Object dto) throws MyValidationException {
        Object domainObject = transformer.transformToDomain(dto);
        boolean isValid = validator.isValid(domainObject);
        if (!isValid) {
            throw new MyValidationException("Validation message here");
        }
        service.save(domainObject);
        return transformer.transformToDTO(domainObject);
    }
}
I haven't specified any Spring-related things here because your question seems to be a design question rather than a technology question.
Hope this helps
A brief account of what I implemented, without much hassle:
This is how I created the flow of handlers:
Stream.<Supplier<RequestHandler>>of(
        TransformToDomainRequestHandler::new,
        ValidateRequestHandler::new,
        PersistenceHandler::new,
        TransformToDTORequestHandler::new)
    .sequential()
    .map(c -> c.get())                    /* create the handler instance */
    .reduce((processed, unProcessed) -> { /* chains all handlers together */
        RequestHandler previous = processed;
        RequestHandler target = previous.getNextRequestHandler();
        while (target != null && previous != null) {
            previous = target;
            target = target.getNextRequestHandler();
        }
        previous.setNextRequestHandler(unProcessed);
        return processed;
    }).get();
This is my RequestHandler, which all the other handlers extend.
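The RequestHandler class itself is not included in the excerpt above, so the following is only a guessed sketch of its shape, inferred from the getNextRequestHandler / setNextRequestHandler calls in the chaining code (the handle / doHandle methods and their signatures are assumptions):
public abstract class RequestHandler {

    private RequestHandler nextRequestHandler;

    public RequestHandler getNextRequestHandler() {
        return nextRequestHandler;
    }

    public void setNextRequestHandler(RequestHandler nextRequestHandler) {
        this.nextRequestHandler = nextRequestHandler;
    }

    // handles the current step, then delegates to the next handler if one is chained
    public Object handle(Object request) {
        Object result = doHandle(request);
        if (nextRequestHandler != null) {
            return nextRequestHandler.handle(result);
        }
        return result;
    }

    protected abstract Object doHandle(Object request);
}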
I'm building an application which uses Spring MVC 4.10 with Jackson 2.3.2.
I have a Project class which has child Proposal objects and a Customer object. These Proposal objects are complex, and I want to return a summarized JSON view of them. A similar situation applies to the Customer object. I'm trying to implement this with @JsonView annotations.
I wanted to ask if extending the views of the member object classes in the container object class view is the way to do this or, if not, if there is a cleaner way to implement this that I am unaware of.
Context
Before today, I was under the false impression that you could annotate your controller with multiple views and that the resulting JSON representation would be filtered accordingly.
@JsonView({Project.Extended.class, Proposal.Summary.class, Customer.Summary.class})
@Transactional
@RequestMapping(value="/project", method=RequestMethod.GET)
public @ResponseBody List<Project> findAll() {
    return projectDAO.findAll();
}
Where each class had its own JsonView annotations and interfaces
e.g.:
public class Customer {
    ...
    public interface Summary {}
    public interface Normal extends Summary {}
    public interface Extended extends Normal {}
}
Nevertheless, it is only the first view in the array that gets taken into account. According to https://spring.io/blog/2014/12/02/latest-jackson-integration-improvements-in-spring
Only one class or interface can be specified with the @JsonView annotation, but you can use inheritance to represent JSON View hierarchies (if a field is part of a JSON View, it will be also part of parent view). For example, this handler method will serialize fields annotated with @JsonView(View.Summary.class) and @JsonView(View.SummaryWithRecipients.class):
and the official documentation in http://docs.spring.io/spring/docs/current/spring-framework-reference/html/mvc.html#mvc-ann-jsonview
To use it with an @ResponseBody controller method or controller methods that return ResponseEntity, simply add the @JsonView annotation with a class argument specifying the view class or interface to be used:
So, I ended up extending the views of the members in the view of the container object, like this
@Entity
public class Project {
    ...
    public static interface Extended extends Normal, Proposal.Extended {}
    public static interface Normal extends Summary, Customer.Normal {}
    public static interface Summary {}
}
and changed my controller to this
@JsonView(Project.Extended.class)
@Transactional
@RequestMapping(value="/project", method=RequestMethod.GET)
public @ResponseBody List<Project> findAll() {
    return projectDAO.findAll();
}
This seems to do the trick, but I couldn't find documentation or discussion about this situation. Is this the intended use of JsonViews or is it kind of hackish?
Thank you in advance
-Patricio Marrone
I believe you have configured your views as necessary. The root of the issue is not Spring's @JsonView support, but rather Jackson's implementation of views. As stated in Jackson's view documentation:
Only single active view per serialization; but due to inheritance of Views, can combine Views via aggregation.
So, it appears that Spring is simply passing on and adhering to the limitation set in place by Jackson 2.
I use Jersey + Jackson but ran into exactly the same problem.
Here is a trick I use in my application to allow requesting several JSON views during serialization. I bet it is also possible with Spring MVC instead of Jersey, though I'm not 100% sure. It does not seem to have performance issues either. Maybe it is a bit complicated for your case, but if you have a large object with a big number of possible views, it may be better than doing a lot of inheritance.
So I use the Jackson filter approach to request several views during serialization. However, I haven't found a way around putting @JsonFilter("name") above the classes to map, which is not so clean. But at least I mask it behind a custom annotation:
@Retention(RetentionPolicy.RUNTIME)
@JacksonAnnotationsInside
@JsonFilter(JSONUtils.JACKSON_MULTIPLE_VIEWS_FILTER_NAME)
public @interface JsonMultipleViews {}
The filter itself looks like this:
public class JsonMultipleViewsFilter extends SimpleBeanPropertyFilter {

    private Collection<Class<?>> wantedViews;

    public JsonMultipleViewsFilter(Collection<Class<?>> wantedViews) {
        this.wantedViews = wantedViews;
    }

    @Override
    public void serializeAsField(Object pojo, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
        if (include(writer)) {
            JsonView jsonViewAnnotation = writer.getAnnotation(JsonView.class);
            // serialize the field only if there is no @JsonView annotation or, if there is one, check that at least one
            // of the view classes above the field fits one of the required classes. If yes, serialize the field; if no, skip it
            if (jsonViewAnnotation == null || containsJsonViews(jsonViewAnnotation.value())) {
                writer.serializeAsField(pojo, jgen, provider);
            }
        } else if (!jgen.canOmitFields()) {
            // since 2.3
            writer.serializeAsOmittedField(pojo, jgen, provider);
        }
    }

    private boolean containsJsonViews(Class<?>[] viewsOfProperty) {
        for (Class<?> viewOfProperty : viewsOfProperty) {
            for (Class<?> wantedView : wantedViews) {
                // check also subclasses of the required view class
                if (viewOfProperty.isAssignableFrom(wantedView)) {
                    return true;
                }
            }
        }
        return false;
    }

    @Override
    protected boolean include(BeanPropertyWriter writer) {
        return true;
    }

    @Override
    protected boolean include(PropertyWriter writer) {
        return true;
    }
}
I can use this filter like this:
public static String toJson(Object object, Collection<Class<?>> jsonViewClasses) throws JsonProcessingException {
    // if no json view class is provided, just map without the view approach
    if (jsonViewClasses.isEmpty()) {
        return mapper.writeValueAsString(object);
    }
    // if only one json view class is provided, use the out-of-the-box jackson mechanism for handling json views
    if (jsonViewClasses.size() == 1) {
        return mapper.writerWithView(jsonViewClasses.iterator().next()).writeValueAsString(object);
    }
    // if more than one json view class is provided, use the custom filter to serialize with multiple views
    JsonMultipleViewsFilter jsonMultipleViewsFilter = new JsonMultipleViewsFilter(jsonViewClasses);
    return mapper.writer(new SimpleFilterProvider()                                     // use the filter approach when serializing
                .setDefaultFilter(jsonMultipleViewsFilter)                              // set it as default filter in case of an error in the filter name
                .addFilter(JACKSON_MULTIPLE_VIEWS_FILTER_NAME, jsonMultipleViewsFilter) // set the custom filter for multiple views by name
                .setFailOnUnknownId(false))                                             // if a filter is unknown, don't fail, use the default one
            .writeValueAsString(object);
}
After that, Jersey allows us to add Jersey filters when the application starts (it goes through each endpoint in each controller at application startup, and we can easily bind the Jersey filters at this moment if there are multiple values in the @JsonView annotation above the endpoint).
In the Jersey filter for a @JsonView annotation with multiple values above an endpoint, once it is bound on startup to the correct endpoints depending on the annotations, we can easily override the response entity by calling that utility method:
toJson(previousResponseObjectReturned, viewClassesFromAnnotation);
There is no reason to provide the code of the Jersey filter here since you're using Spring MVC. I just hope it's easy to do the same thing in Spring MVC.
The Domain Object would look like this:
@JsonMultipleViews
public class Example {

    private int id;
    private String name;

    @JsonView(JsonViews.Extended.class)
    private String extendedInfo;

    @JsonView(JsonViews.Meta.class)
    private Date updateDate;

    public static class JsonViews {
        public interface Min {}
        public interface Extended extends Min {}
        public interface Meta extends Min {}
        //...
        public interface All extends Extended, Meta {} // interfaces are needed for multiple inheritance of views
    }
}
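As a usage sketch, combining two of the views of this Example via the toJson helper above might look roughly like this (it assumes the mapper can see the private fields, e.g. through getters or field visibility settings):
// hypothetical call site: serialize with both Extended and Meta views applied
Example example = new Example();
String json = toJson(example, Arrays.asList(
        Example.JsonViews.Extended.class,
        Example.JsonViews.Meta.class));
// id and name (no @JsonView) plus extendedInfo and updateDate end up in the output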
We can omit putting Min.class, in my case, on those fields that are always required regardless of the view. We just put Min in the required views and it will serialize all fields without a @JsonView annotation.
The All.class view is required in my case because, if we have a specific set of views for each domain class (like in my case) and we then need to map a complex model consisting of several domain objects that both use the views approach - some view for object one, but all views for object two - it's easier to put it above the endpoint like this:
@JsonView({ObjectOneViews.SomeView.class, ObjectTwoViews.All.class})
because if we omit ObjectTwoViews.All.class here and require only ObjectOneViews.SomeView.class, the fields that are marked with an annotation in object two will not be serialized.
Consider the following ServerResource derived type:
public class UserResource extends ServerResource {
    @Get
    public User getUser(int id) {
        return new User(id, "Mark", "Kharitonov");
    }
}
(Yes, it always returns the same user no matter the given id).
Is it possible to make it work in Restlet? Because, as far as I understand, the expected signature of the GET handler is:
Representation get();
OR
Representation get(Variant v); // (no idea what it means yet)
Now I understand that I can implement the non-type-safe GET handler to extract the arguments from the request and then invoke getUser, after which I compose the respective Representation instance from the result and return it. But this is boilerplate code; it does not belong in the application code, its place is inside the framework. At least, this is how it is done by OpenRasta, the REST framework I have been using in .NET.
Thanks.
You should remove the parameter from the signature:
@Get
public User getUser() {
    // read the id from the query string, e.g. /users?id=42
    int id = Integer.parseInt(getQuery().getFirstValue("id"));
    return new User(id, "Mark", "Kharitonov");
}
No need to override the get() methods in this case, as the @Get annotation will be automatically detected.
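If the id is instead part of the URL template (for example a route like /users/{id}, which is an assumption here), a similar sketch could read it from the request attributes:
@Get
public User getUser() {
    // "id" refers to a {id} variable in the route template, e.g. router.attach("/users/{id}", UserResource.class)
    int id = Integer.parseInt((String) getRequestAttributes().get("id"));
    return new User(id, "Mark", "Kharitonov");
}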