A very specific usage of callbacks in Java

This question is about a specific usage of the callback pattern. By callback I mean an interface on which I can define method(s) that are optionally called from a lower layer in my application (optional because, thanks to Java 8, the default can be 'do nothing'). My "application" is in fact a product which may change a lot between client projects, so I need to separate the parts I can reuse (technical code, integration of technologies) from the rest (model, business rules).
Let's take an example:
I developed a search service based on Apache CXF JAX-RS Search.
This service parses a FIQL query, which can only combine AND/OR conditions with =/</>/LIKE/... operators, to create a JPA criteria query. I can't express a condition like 'isNull' in FIQL.
Using a specific interface, I can define a callback that is invoked once my search service receives the criteria query from the Apache CXF layer, and that adds my conditions to the existing ones before the query is executed. These conditions are defined in the upper layer of my search service (the REST controller). The goal is to reduce code duplication, such as returning a criteria query and finalizing it in every method where I need it. Also, @Transactional in a CXF JAX-RS controller does not play well with the way Spring proxies and CXF interact (some JAX-RS annotations get ignored).
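To make this concrete, here is a minimal sketch of what such a callback could look like. QueryCustomizer, SearchService, and the Person entity are invented names, and the fiqlPredicate parameter stands in for whatever predicate the CXF layer builds from the FIQL query:

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.criteria.CriteriaBuilder;
import javax.persistence.criteria.CriteriaQuery;
import javax.persistence.criteria.Predicate;
import javax.persistence.criteria.Root;

// Hypothetical callback: the Java 8 default means "do nothing", so callers
// only override it when they need a condition FIQL cannot express.
interface QueryCustomizer<T> {
    default Predicate extraPredicate(CriteriaBuilder cb, Root<T> root) {
        return null; // no extra condition by default
    }
}

class SearchService {
    private final EntityManager em;

    SearchService(EntityManager em) { this.em = em; }

    <T> List<T> search(Class<T> type, Predicate fiqlPredicate, QueryCustomizer<T> customizer) {
        CriteriaBuilder cb = em.getCriteriaBuilder();
        CriteriaQuery<T> cq = cb.createQuery(type);
        Root<T> root = cq.from(type);
        Predicate extra = customizer.extraPredicate(cb, root);
        cq.select(root).where(extra == null ? fiqlPredicate : cb.and(fiqlPredicate, extra));
        return em.createQuery(cq).getResultList();
    }
}

// In the REST controller, e.g. adding the isNull condition:
// new QueryCustomizer<Person>() {
//     @Override
//     public Predicate extraPredicate(CriteriaBuilder cb, Root<Person> root) {
//         return cb.isNull(root.get("deletedAt"));
//     }
// }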
First question: does this example seem like a good idea in terms of design?
Now another example: I have an object whose basic fields are populated by a service layer. But I want to be able to set other non-nullable fields, unrelated to the service's process, before the entity is persisted. These fields may vary from one project to another, so I'd rather not change the signature of my service method every time we add or remove columns. So again I'm considering a callback pattern, to be able to set those fields within the same transaction and before the object is persisted by the service layer.
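A hedged sketch of that second case, using java.util.function.Consumer as the callback; Person, PersonService, and the costCenter column are placeholders:

import java.util.function.Consumer;
import javax.persistence.EntityManager;

class Person {
    private String name;
    private String costCenter; // a project-specific, non-nullable column
    void setName(String name) { this.name = name; }
    void setCostCenter(String costCenter) { this.costCenter = costCenter; }
}

class PersonService {
    private final EntityManager em;

    PersonService(EntityManager em) { this.em = em; }

    // The callback lets the caller populate project-specific fields inside
    // the same transaction, just before persist, so this signature never
    // changes when columns are added or removed.
    Person create(String name, Consumer<Person> beforePersist) {
        Person p = new Person();
        p.setName(name);         // fields the generic service knows about
        beforePersist.accept(p); // project-specific fields, set by the caller
        em.persist(p);
        return p;
    }
}

// One project's caller:
// personService.create("Alice", p -> p.setCostCenter("CC-42"));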
Second question: what about this example?
Global question: except for the classic usage of callbacks for events, is it good practice to use this pattern for such specific cases, or is there a better way to handle them?
If you need a code sample, ask me and I'll write one (I can't post my current code).

I wouldn't say that what you've described is a very specific usage of "an interface from which I can define method(s) that are optionally called from a lower layer". I think it is a reasonable and quite common solution.
Your doubts may be due to the naming. I'd rather use the term command pattern here; it seems less confusing to me. Your approach also resembles the strategy pattern, i.e. you provide (inject) an object which performs some computation, and depending on the context you inject objects that behave differently (for example, adding different conditions to a query).
To sum up, callbacks/commands are not used only for events; I'd even say that events are a specific usage of them. The command/callback pattern is used whenever we need to encapsulate an operation within an object and transfer/pass it around somehow (by the way, in Java there is no other way to do so, but in C++, for example, there are pointers to member functions, and in C# there are delegates...).
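A tiny illustration of that encapsulation (names invented):

// An operation wrapped in an object that can be stored, passed around,
// and executed later - the essence of the command pattern.
interface Command {
    void execute();
}

class Invoker {
    void run(Command command) {
        command.execute(); // the invoker knows nothing about the operation
    }
}

// With Java 8, a command is just a lambda:
// new Invoker().run(() -> System.out.println("encapsulated operation"));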
As to your second example, I'm not sure I understand it correctly. Why can't you simply populate all required fields of the object before calling the service?

Related

Significance of Delegate Design Pattern in Swagger Generated Code?

When I generate code for Spring from my Swagger YAML, the controller layer is usually generated using the delegate pattern, such that three files are generated for a single model. For example, if I define a model named Person in my Swagger/OpenAPI YAML file, the three generated files are:
PersonApi (an interface that contains the signatures of all Person operations/methods)
PersonApiDelegate (an interface that provides a default implementation of all PersonApi methods; meant to be overridden)
PersonApiController (which has a reference to a PersonApiDelegate, so that any implementation can override it and provide custom behavior)
My question, for anyone familiar with building APIs from Swagger/OpenAPI generated code: what is the significance of having such a pattern, instead of just exposing your service endpoints using a PersonController class, rather than going through a PersonApi interface, then a PersonApiDelegate, and finally exposing the service through a PersonApiController?
What valuable design extensibility do we gain through this pattern? I tried to find information in other resources on the internet, but couldn't find a good answer in the context of a Swagger-first API development approach. Any insights will be really helpful.
First of all, a clarification: as already mentioned in a comment, you are not forced to use delegation. On the contrary, the default behavior of the Spring generator is to not use the delegation pattern, as you can easily check in the docs. In that case it will generate only the PersonApi interface and the PersonApiController.
Coming to your question: why use delegation?
It allows you to write a class that implements PersonApiDelegate, which can easily be injected into the generated code, without any need to touch the generated sources manually, and it keeps the implementation safe from possible future changes in the code generation.
Let's think what could happen without delegation.
A naive approach would be to generate the sources and then write the implementation directly inside the generated PersonApiController. Of course, the next time you need to run the generator, it would be a big mess: all the implementation would be lost...
A slightly better, but still imperfect, scenario would be to write a class that extends PersonApiController. That would keep the implementation safe from being overwritten during generation, but it would not protect it from future changes to the generation engine: at a bare minimum, the implementation class would need to call the PersonApiController constructor. Right now the constructor of a generated controller has the signature PersonApiController(ObjectMapper objectMapper, HttpServletRequest request), but the developers of the generator may need to change it in the future, and then the implementation would need to change too.
A third approach would be to forget completely about the generated PersonApiController and just write a class that implements the PersonApi interface. That would be fine, but every time the code is generated you would need to delete the PersonApiController, otherwise the Spring router will complain. Still manual work...
But with delegation, the implementation code is completely safe: no need to delete stuff manually, no need to adapt to future changes. Also, the class that implements PersonApiDelegate can be treated as an independent service, so you can inject/autowire whatever you need into it.
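A stripped-down sketch of the three generated pieces plus a hand-written delegate (this simplifies what the generator actually emits):

class Person {
    final Long id; final String name;
    Person(Long id, String name) { this.id = id; this.name = name; }
}

// Generated: the API contract.
interface PersonApi {
    Person getPerson(Long id);
}

// Generated: default behavior, meant to be overridden.
interface PersonApiDelegate {
    default Person getPerson(Long id) {
        throw new UnsupportedOperationException("not implemented");
    }
}

// Generated: the controller only forwards to the delegate.
class PersonApiController implements PersonApi {
    private final PersonApiDelegate delegate;
    PersonApiController(PersonApiDelegate delegate) { this.delegate = delegate; }
    @Override public Person getPerson(Long id) { return delegate.getPerson(id); }
}

// Hand-written, survives regeneration untouched; in Spring this would be a
// bean that gets injected into the generated controller.
class PersonApiDelegateImpl implements PersonApiDelegate {
    @Override public Person getPerson(Long id) {
        return new Person(id, "Ada"); // real lookup goes here
    }
}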

Rails application, but the entire data layer uses a JSON/XML-based web service

I have a web service layer that is written in Java/Jersey, and it serves JSON.
For the front-end of the application, I want to use Rails.
How should I go about building my models?
Should I do something like this?
response = api_client.get_user(123)
user = User.new(response)
What is the best approach to mapping the JSON to the Ruby object?
What options do I have? Since this is a critical part, I want to know my options, because performance is a factor. Mapping JSON to a Ruby object, and going from a Ruby object to JSON, is a common occurrence in the application.
Would I still be able to make use of validations? Or would that not make sense, since I would have validation duplicated on the front-end and in the service layer?
Models in Rails do not have to do database operations; they are just normal classes. Normally they are imbued with ActiveRecord magic when you subclass them from ActiveRecord::Base.
You can use a gem such as Virtus that will give you models with attributes, and for validations you can go with Vanguard. If you want something close to ActiveRecord but without the database, and you are running Rails 3+, you can also include ActiveModel into your model to get attributes and validations, as well as have them work in forms. See Yehuda Katz's post for details on that.
In your case it will depend on the data you consume. If all the data sources share the same basic format, for example, you could create your own base class to keep all the logic that you want to share across the individual classes (inheritance).
If you have a few different types of data coming in, you could create modules to encapsulate behavior for the different types and include the modules you need in the appropriate classes (composition).
Generally, though, you probably want to end up with one class per resource in the remote API that maps one-to-one to whatever domain logic you have. You can do this in many different ways, but following the method naming used by ActiveRecord might be a good idea: you learn ActiveRecord while building your class structure, and it will help other Rails developers later if your API looks and works like ActiveRecord's.
Think about it in terms of what you want to be able to do to an object (this is where TDD comes in). You want to be able to fetch a collection (Model.all), a specific element (Model.find(identifier)), push a changed element to the remote service (updated_model.save), and so on.
What the actual logic inside these methods will have to be depends on the remote service. But you will probably want each model class to hold a URL to its resource endpoint, and you will definitely want to keep the logic in your models. So instead of:
response = api_client.get_user(123)
user = User.new(response)
you will do
class User
  ...
  def self.find(id)
    # api_client.get_user(id)
  end
  ...
end

User.find(123)
or more probably
class ApiClient
  ...
  def self.uri(resource_uri)
    @uri = resource_uri
  end

  def self.get(id)
    # basically whatever code you envisioned for api_client.get_user
  end
  ...
end

class User < ApiClient
  uri 'http://path.to.remote/resource.json'
  ...
  def self.find(id)
    get(id)
  end
  ...
end

User.find(123)
Basic principles: collect all the shared logic in a base class (ApiClient) and subclass it on a per-resource basis (User). Keep all the logic in your models; no other part of your system should have to know whether it's a DB-backed app or whether you are using an external REST API. Best of all is if you can keep the integration logic completely in the base class: that way you have only one place to update if the external data source changes.
As for going the other way, Rails has several good ways to convert objects to JSON, from the to_json method to using a gem such as RABL to have actual views for your JSON objects.
You can get validations by using part of the ActiveRecord modules. As of Rails 4 this is a module called ActiveModel, but you can do it in Rails 3 too, and there are several tutorials for it online, not least of all a RailsCast.
Performance will not be a problem, except for what you incur when calling the remote service: if the network is slow, you will be too. Some of that could probably be helped with caching (see another answer of mine for details), but that also depends on the data you are using.
Hope that puts you on the right track. And if you want a more thorough grounding in how to design this kind of structure, you should pick up a book on the subject, for example Practical Object-Oriented Design in Ruby: An Agile Primer by Sandi Metz.

On properly implementing complex service layers

I have the following situation:
Three concrete service classes implement a service interface: one is for persistence, another deals with notifications, and the third deals with adding points for specific actions (gamification). The interface has roughly the following structure:
public interface IPhotoService {
    void upload();
    Photo get(Long id);
    void like(Long id);
    // etc...
}
I did not want to mix the three types of logic in one service (or even worse, in the controller class) because I want to be able to change them (or shut them off) independently. The problem comes when I have to inject a concrete service into the controller. Usually I create a fourth class, named roughly ApplicationNamePhotoService, which implements the same interface and works as a wrapper (mediator) between the other three services: it gets input from the controller and calls each service correspondingly. It is a working approach, though it creates a lot of boilerplate code.
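A bare sketch of that fourth class, reusing the interface above; the fan-out order inside each method is only a guess at the intended sequencing:

// Wrapper (mediator) that implements the same interface and fans out
// to the three concrete services.
public class AppPhotoService implements IPhotoService {
    private final IPhotoService persistence;
    private final IPhotoService notifications;
    private final IPhotoService gamification;

    public AppPhotoService(IPhotoService persistence,
                           IPhotoService notifications,
                           IPhotoService gamification) {
        this.persistence = persistence;
        this.notifications = notifications;
        this.gamification = gamification;
    }

    @Override
    public void like(Long id) {
        persistence.like(id);   // store the like
        gamification.like(id);  // award points
        notifications.like(id); // notify interested users
    }

    // upload(), get(id), ... are delegated the same way - hence the boilerplate.
    @Override public void upload() { persistence.upload(); }
    @Override public Photo get(Long id) { return persistence.get(id); }
}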
Is this the right approach? Currently I am not aware of a better one, although I would highly appreciate knowing whether it is possible to declare the execution sequence declaratively (in the context) and to inject the controller with an on-the-fly generated wrapper instance.
Also, it would be nice to cache some things between the three services. For example, all of them use DAOs, i.e. they sometimes make the same calls to the DB over and over again. If all the logic were in one place that could have been avoided, but now... I know that it is possible to enable request- or session-based caching. Can you suggest some example code? BTW, I am using Hibernate for the persistence part. Is there already some caching provided (perhaps if the calls happen in the same transaction or something; with that one I am totally lost)?
The service layer should consist of classes whose methods are units of work: actions that belong in the same transaction. It sounds like you are splitting logic across service classes when it could live in the same class and method. You can also inject service classes into one another when required, rather than create another "mediator".
It is perfectly acceptable to "mix the three types of logic"; in fact it is preferable if together they form an expected use case/unit of work.
For caching, I would look at Ehcache, which is, I believe, well integrated with Hibernate.
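For illustration, assuming Hibernate 4.x with the hibernate-ehcache module (verify the property names against your versions); note also that within a single session/transaction Hibernate's first-level cache already avoids reloading the same entity twice:

// persistence.xml fragment enabling the second-level cache:
// <property name="hibernate.cache.use_second_level_cache" value="true"/>
// <property name="hibernate.cache.region.factory_class"
//           value="org.hibernate.cache.ehcache.EhCacheRegionFactory"/>

import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
@Cacheable // repeated em.find(Photo.class, id) calls can then hit the cache
class Photo {
    @Id
    private Long id;
}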

Creating user events on certain actions. What is the recommended solution?

This question is about how to effectively create and persist event domain objects on certain system- or user-triggered events, which themselves may or may not persist changes to the database.
I'm creating a system where a user can tag some object, and when tagging occurs I should create a UserTagEvent which holds the object that was tagged, the tag that was applied or removed, and the user that tagged the object. (EDIT: this is not the actual TAG object, just a log of a tagging event.)
The relationship for such a taggable object is one-to-many (a taggable object has many tags).
As far as I can see, I have three alternatives:
Inline the code in the controller/service which does the tagging (I don't want to do this as it mixes two different business processes).
Use the Hibernate pre-collection-update and post-collection-update listeners to fetch the necessary information and create and persist a new UserTagEvent.
Use AOP.
Do I have any other alternatives? Has anyone done something similar? What do you think I should do? Any help is appreciated.
It is not 100% clear whether the UserTagEvent represents the actual tag or just acts as a log of a tag event.
Use the Hibernate pre-collection-update and post-collection-update listeners to fetch the necessary information and create and persist a new UserTagEvent.
If the UserTagEvent were your tag, the Hibernate listeners would not make much sense, because they would only get fired when you create a UserTagEvent and add it to the object yourself, and then you would have gained nothing.
Inline the code in the controller/service which does the tagging (I don't want to do this as it mixes two different business processes).
I would start by creating a TagService that is responsible for tagging/tag-logging. You could use it either from a controller or via AOP, but you should encapsulate the functionality along the lines of: tagService.createTag(tag, object, user).
This could come in handy especially if you later want to use a different technology to store the events, such as some NoSQL solution.
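A minimal sketch of that encapsulation, with placeholder types standing in for the real domain classes:

import java.util.ArrayList;
import java.util.List;

// Placeholder types for illustration only.
interface Tag {}
interface User {}
interface Taggable { void addTag(Tag tag); }

class UserTagEvent {
    final User user; final Tag tag; final Taggable target;
    UserTagEvent(User user, Tag tag, Taggable target) {
        this.user = user; this.tag = tag; this.target = target;
    }
}

// Single entry point for tagging: applies the tag and logs the event in
// one place, so callers never deal with UserTagEvent directly.
class TagService {
    private final List<UserTagEvent> eventStore = new ArrayList<>();

    void createTag(Tag tag, Taggable object, User user) {
        object.addTag(tag);
        eventStore.add(new UserTagEvent(user, tag, object)); // persist in real code
    }
}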
The following is what I learned when exploring my options:
1) Inline the code in the controller/service which does the tagging (I don't want to do this as it mixes two different business processes).
I didn't give this alternative a try.
2) Use the Hibernate pre-collection-update and post-collection-update listeners to fetch the necessary information and create and persist a new UserTagEvent.
This turned out to be very difficult, inefficient, and problematic for several reasons.
For example, you are working with a collection of items which may or may not be lazily initialized. In order to detect changes in the collection, I had to listen for the collection initialization event, get a cloned collection, store it in a field, then listen for the collection update event, get a cloned collection, and compare it with the collection previously stored.
In addition, these events got fired for ALL Hibernate collection events, not just for the domain objects I was interested in. So this was a "no go"...
3) Use AOP.
I was originally very optimistic about this solution, but after a few tries I soon realized that it wasn't as simple as I first thought. There were very few guides on the web describing Grails AND AOP, and those that existed were rather old.
There was a lot more work involved than I originally thought. My overall impression is that Grails seems to have a lot of bugs associated with AOP integration, and I also didn't like the fact that I had to add bean definitions to resources.groovy for each aspect I created. I tried to make aspects load automatically through annotations (auto-proxy), but with no luck.
In addition, I never got the pointcut to work outside the main project. As my tagging solution is defined as a Grails plugin, it seems that AOP can't be applied to classes of the plugin (even if it is an in-place plugin).
So this turned out to be a "no go" as well.
So drum roll please.
What I ended up with was using the observer pattern to fire off an event whenever a tag was added or removed. This involved changing my tagger plugin so that I could specify listeners through Spring beans (which implement a TagEventListener interface) and have the plugin fire events on those beans from the addTag and removeTag method calls.
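In code, the hook could look roughly like this (TagEventListener is the interface mentioned above; the rest is illustrative):

import java.util.ArrayList;
import java.util.List;

// Implementations are registered as Spring beans and injected into the plugin.
interface TagEventListener {
    void onTagAdded(Object target, String tag, String user);
    void onTagRemoved(Object target, String tag, String user);
}

class Tagger {
    private final List<TagEventListener> listeners = new ArrayList<>();

    void register(TagEventListener listener) { listeners.add(listener); }

    void addTag(Object target, String tag, String user) {
        // ... the actual tagging logic ...
        for (TagEventListener l : listeners) {
            l.onTagAdded(target, tag, user); // e.g. persist a UserTagEvent
        }
    }
}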
Overall I'm pretty happy with this solution. It involves one or two more method calls than would be necessary if I had just inlined the code as described in option 1, but this way I have cleaner code and I don't mix business processes, so I think the extra 1 ns of overhead is worth it.

In Java, how can I construct a "proxy wrapper" around an object which invokes a method upon changing a property?

I'm looking for something similar to the Proxy pattern or the Dynamic Proxy Classes, except that I don't want to intercept method calls before they are invoked on the real object; rather, I'd like to intercept properties that are being changed. I'd also like the proxy to be able to represent multiple objects with different sets of properties. Something like the Proxy class in ActionScript 3 would be fine.
Here's what I want to achieve in general:
I have a thread running with an object that manages a list of values (numbers, strings, objects) which were handed over by other threads in the program, so that the class can take care of creating regular persistent snapshots on disk for checkpointing the application. This persistor object manages a "dirty" flag that signifies whether the list of values has changed since the last checkpoint, and it needs to lock the list while it's busy writing it to disk.
The persistor and the other components identify a particular item via a common name, so that when recovering from a crash, the other components can first check if the persistor has their latest copy saved and continue working where they left off.
During normal operation, in order to work with the objects they handed over to the persistor, I want them to receive a reference to a proxy object that looks as if it were the original one, but whenever they change some value on it, the persistor notices and acts accordingly, for example by marking the item or the list as dirty before actually setting the real value.
Edit: Alternatively, are there generic setters in Java (like in PHP 5), that is, a method that gets called if a property doesn't exist? Or is there a type of object to which I can add properties at runtime?
If with "properties" you mean JavaBean properties, i.e. represented bay a getter and/or a setter method, then you can use a dynamic proxy to intercept the set method.
If you mean instance variables, then no can do, not at the Java language level. Perhaps something could be done by manipulating the bytecode, though.
Actually, the easiest way to do it is probably to use AspectJ and define a set() pointcut (which will intercept the field writes at the bytecode level).
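For the JavaBean case, here is a minimal java.lang.reflect.Proxy sketch; the Settings interface and the dirty-flag hook are invented for illustration:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

interface Settings {
    void setName(String name);
    String getName();
}

class SettingsImpl implements Settings {
    private String name;
    public void setName(String name) { this.name = name; }
    public String getName() { return name; }
}

public class DirtyTrackingDemo {
    // Wraps an interface-typed target; runs the hook before every set* call.
    @SuppressWarnings("unchecked")
    static <T> T dirtyTracking(T target, Class<T> iface, Runnable markDirty) {
        InvocationHandler handler = (proxy, method, args) -> {
            if (method.getName().startsWith("set")) {
                markDirty.run(); // e.g. the persistor flags the list as dirty
            }
            return method.invoke(target, args);
        };
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[] { iface }, handler);
    }

    public static void main(String[] args) {
        Settings s = dirtyTracking(new SettingsImpl(), Settings.class,
                () -> System.out.println("dirty!"));
        s.setName("checkpoint-me"); // prints "dirty!" before the real setter runs
    }
}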
The design pattern you are looking for is Differential Execution, I do believe. "How does differential execution work?" is a question I answered that deals with this.
However, may I suggest that you use a callback instead? You will have to read up on this, but the general idea is that you implement interfaces (often called listeners) that are activated upon "something interesting" happening, such as a data structure being changed.
Obligatory links:
Wiki Differential execution
Wiki Callback
Alright, here is the answer as I see it. Differential Execution is O(N) time, which is really reasonable, but if that doesn't work for ya, callbacks will. Callbacks basically work by passing a method as a parameter to the class that changes the array. This method takes the changed value and the location of the item, passes them back to the "storage class", and changes the value appropriately. So yes, you have to back each change with a method call.
I realize now this is not what you want. What it appears you want is a way to attach some kind of listener to each variable in an array, to be called when that item is changed. The listener would then change the corresponding array in your "backup" to reflect the change.
Natively, I can't think of a way to do this. You can, of course, create your own listeners and events using an interface. This is basically the same idea as the callbacks, though nicer to look at.
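A hand-rolled version of that idea might look like this (all names invented):

import java.util.ArrayList;
import java.util.List;

// Listener plumbing: the list notifies on every change, so a backup copy
// (or a dirty flag) can be updated without polling.
interface ChangeListener<T> {
    void changed(int index, T newValue);
}

class WatchedList<T> {
    private final List<T> items = new ArrayList<>();
    private final List<ChangeListener<T>> listeners = new ArrayList<>();

    void addListener(ChangeListener<T> listener) { listeners.add(listener); }

    void set(int index, T value) {
        while (items.size() <= index) items.add(null); // grow as needed
        items.set(index, value);
        for (ChangeListener<T> l : listeners) l.changed(index, value);
    }

    T get(int index) { return items.get(index); }
}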
Then there is reflection... Java has reflection, and I am positive you could write something using it to do this. However, reflection is notoriously slow, not to mention a pain to code (in my opinion).
Hope that helps...
I don't want to intercept method calls before they are invoked on the real object, but rather I'd like to intercept properties that are being changed
So, in fact, the objects you want to monitor are not convenient beans but a resurgence of C structs. The only way that comes to my mind to do that is with the field access watch facility in JVMTI.
I wanted to do the same thing myself. My solution was to use dynamic proxy wrappers generated with Javassist. I would generate a class that implements the same interface as the class of my target object, wrap the proxy around the original, and delegate all method calls on the proxy to the original, except for setters, which would also fire a PropertyChangeEvent.
Anyway I posted the full explanation and the code on my blog here:
http://clockwork-fig.blogspot.com/2010/11/javabean-property-change-listener-with.html
