Using Play Framework 1.2.7, I have a class that extends play.jobs.Job and performs database writes (MongoDB, using the Play Morphia plugin).
Here's an abbreviated example:
/* controller */
public static void doThings(@Required String id) {
    User me = User.findById(id);
    notFoundIfNull(me);
    new MyJob(me).now();
}
/* MyJob */
public class MyJob extends Job {

    private final User me;

    public MyJob(User me) {
        this.me = me;
    }

    @Override
    public void doJob() {
        int newValue = me.someInt;
        newValue++;
        me.someInt = newValue;
        me.save();
    }
}
Here's the weird part (weird to me anyway):
The write in doJob() does happen the first time the job is executed, and sometimes a second time, but for any further instantiations of the job the write never occurs. No exceptions are thrown.
If I just remove extends Job from MyJob, instantiate the class myself and call doJob() directly, it works every time:
/* controller */
public static void doThings(@Required String id) {
    User me = User.findById(id);
    notFoundIfNull(me);
    new MyJob(me).doJob(); // assumes this class no longer extends Job
}
I've been using Play for 4+ years and have never seen this kind of behavior; I'm at a loss as to what is actually happening.
I'm not sure, but I think it could be an (unhandled) conflict in the Morphia plugin, specifically around its context handling.
I'm fairly sure there is something very similar in the JPA model for Play 1, where there are two contexts.
From your code I notice the object is loaded in the controller context but saved in the job context.
When you run it without a Job, Morphia still uses the controller context.
Try passing only the id and reloading the entity inside the Job, or use JPDA remote debugging: trap every call (inside the controller and the job), step into the Play framework and compare the context objects.
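A minimal sketch of the first suggestion, passing only the id and reloading the entity inside the Job so that the load and the save both happen in the Job's own context (untested against the Morphia plugin; it only reshapes the code from the question):
/* MyJob, taking only the id */
public class MyJob extends Job {

    private final String userId;

    public MyJob(String userId) {
        this.userId = userId;
    }

    @Override
    public void doJob() {
        // reload here so the entity is attached to the Job's context, not the controller's
        User me = User.findById(userId);
        if (me == null) {
            return;
        }
        me.someInt++;
        me.save();
    }
}
/* controller */
new MyJob(id).now();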
Good luck
I'm working with Java Spring.
I want to write simple utility methods (get the current time as a string, split a string on a special condition, etc.).
I think there are two ways to implement them:
1. Using a static method
/* using a static method */
public class TestUtil {

    public static void printTest() {
        System.out.println("test!");
    }
}
// Call Method
public class Caller {

    public void callTest() {
        TestUtil.printTest();
    }
}
2. Registering it as a component
/* using @Component */
@Component
public class TestUtil {

    public void printTest() {
        System.out.println("test!");
    }
}
// Call Method
public class Caller {

    @Autowired
    TestUtil testUtil;

    public void callTest() {
        testUtil.printTest();
    }
}
Which of the two looks better? And what is the main difference?
If you can answer, have mercy on this miserable junior developer.
Thanks for your answer.
First of all, you shouldn't use a static method to get the time. You can inject a Clock implementation into Spring components so that tests can substitute their own predictable clock. That way, tests of code that reads the time can assert the expected value of the time.
Spring has control over how it creates component objects, but it has no control over classloading: Spring can't control which class loads first, or guarantee that static fields have valid values at a given time. It's better not to use static methods in Spring for anything except code that has no dependencies or side effects whatsoever.
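A minimal sketch of the Clock-injection idea (the class names TimeUtil and ClockConfig are made up for illustration):
import java.time.Clock;
import java.time.LocalDateTime;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;

@Component
public class TimeUtil {

    private final Clock clock;

    public TimeUtil(Clock clock) {
        this.clock = clock;
    }

    public String currentTimeAsString() {
        return LocalDateTime.now(clock).toString();
    }
}

@Configuration
class ClockConfig {

    @Bean
    Clock clock() {
        return Clock.systemDefaultZone();
    }
}
In a test you can construct TimeUtil with Clock.fixed(Instant.parse("2020-01-01T00:00:00Z"), ZoneOffset.UTC) and assert that currentTimeAsString() returns exactly "2020-01-01T00:00".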
I am kind of stuck on a problem with creating beans, or maybe I have the wrong idea altogether. Perhaps you can help me solve it:
I have an application which takes in requests for batch processing. For every batch I need to create its own context, depending on the parameters issued by the request.
I will try to simplify it with the following example:
I receive a request to process FunctionA in a batch. FunctionA is an implementation of my Function_I interface and has the sub-implementations FunctionA_DE and FunctionA_AT.
Something like this:
public interface Function_I {
    String doFunctionStuff();
}

public abstract class FunctionA implements Function_I {

    FunctionConfig funcConfig;

    public FunctionA(FunctionConfig funcConfig) {
        this.funcConfig = funcConfig;
    }

    public String doFunctionStuff() {
        // some code
        String result = callSpecificFunctionStuff();
        // more code
        return result;
    }

    protected abstract String callSpecificFunctionStuff();
}

public class FunctionA_DE extends FunctionA {

    public FunctionA_DE(FunctionConfig funcConf) {
        super(funcConf);
    }

    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        String result = "..."; // placeholder
        return result;
    }
}

public class FunctionA_AT extends FunctionA {

    public FunctionA_AT(FunctionConfig funcConf) {
        super(funcConf);
    }

    protected String callSpecificFunctionStuff() {
        // do some specific stuff
        String result = "..."; // placeholder
        return result;
    }
}
What would be the Spring Boot way of creating an instance of FunctionA_DE and getting it as a Function_I for the calling part of the application? And what should it look like when I add FunctionB with FunctionB_DE / FunctionB_AT to my classes?
I thought it could be something like:
PSEUDO CODE
@Configuration
public class FunctionFactory {

    @Bean
    @Scope("prototype") // I need a new instance every time I call it
    public Function_I createFunctionA(FunctionConfiguration funcConfig) {
        // create the Function depending on funcConfig, so either FunctionA_DE or FunctionA_AT
    }
}
and I would call it by autowiring the FunctionFactory into my calling class and using it with
someSpringFactory.createFunction(functionConfiguration);
but I can't figure out how to create a prototype bean for the function while passing a parameter. I also can't really find a solution to my question by browsing through SO, but maybe I just have the wrong search terms. Or maybe my approach to this issue is totally wrong (maybe stupid), and nobody would solve it the Spring Boot way but would stick to factories.
Appreciate your help!
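For reference, Spring can pass runtime arguments to a prototype @Bean at lookup time. A rough sketch of that wiring (the ObjectProvider lookup, the BatchRunner class and the funcConfig.getCountry() accessor are assumptions, not part of the question):
import org.springframework.beans.factory.ObjectProvider;
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Service;

@Configuration
public class FunctionFactory {

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE) // new instance on every lookup
    public Function_I functionA(FunctionConfig funcConfig) {
        // decide the concrete implementation from the config (accessor is assumed)
        return "DE".equals(funcConfig.getCountry())
                ? new FunctionA_DE(funcConfig)
                : new FunctionA_AT(funcConfig);
    }
}

@Service
class BatchRunner {

    private final ObjectProvider<Function_I> functionProvider;

    BatchRunner(ObjectProvider<Function_I> functionProvider) {
        this.functionProvider = functionProvider;
    }

    String runBatch(FunctionConfig funcConfig) {
        // the argument is passed through to the @Bean factory method
        Function_I function = functionProvider.getObject(funcConfig);
        return function.doFunctionStuff();
    }
}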
You could use Spring's application context. Create a bean for each of the implementations but annotate it with a specific profile, e.g. "Function-A-AT". Now when you have to invoke it, you can simply set Spring's active profiles accordingly and the right bean should be used by Spring.
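A small illustration of this suggestion (the profile and class names are made up); note that active profiles are normally selected when the context starts, e.g. via spring.profiles.active, rather than per incoming request:
import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;

@Component
@Profile("function-a-at")
public class AustrianFunctionBean implements Function_I {

    @Override
    public String doFunctionStuff() {
        return "AT-specific processing";
    }
}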
Hello everyone and thanks for reading my question.
After a discussion with a friend who is well versed in the Spring framework, I came to the conclusion that my approach, or rather my favoured solution, was not what I was searching for and is not how Spring should be used. Because the Function_I instance depends on the configuration loaded for the specific batch, it is not recommended to manage all these instances as @Beans.
In the end I decided not to manage the Function_I instances with Spring. Instead I built a controller/factory, which is a @Controller class, and let this class build the instance I need, with the passed parameters used for decision making at runtime.
This is how it looks (Pseudo-Code)
@Controller
public class FunctionController {

    private final SomeSpringManagedClass ssmc;

    public FunctionController(@Autowired SomeSpringManagedClass ssmc) {
        this.ssmc = ssmc;
    }

    public Function_I createFunction(FunctionConfiguration funcConf) {
        boolean funcA, funcB, cntryDE;
        // code to decide the function, setting the flags above
        if (funcA && cntryDE) {
            return new FunctionA_DE(funcConf);
        } else if (funcB && cntryDE) {
            return new FunctionB_DE(funcConf);
        } // maybe more else if...
        throw new IllegalArgumentException("no matching function for " + funcConf);
    }
}
I am writing an application which uses the JiraRestClient by Atlassian. I cannot create this client on every Jira interaction, so I thought of caching it.
The client needs to log in again every 35 minutes, so I thought of caching it for 30 minutes and then performing the login again.
For that purpose I need a single instance of this client so that all the threads doing Jira interactions can use it. I created a provider class which keeps track of the time and performs the login if the instance is more than 30 minutes old, ensuring that the instance is always logged in.
public class JiraRestClientProvider {

    private static JiraRestClient jiraRestClient;
    private static final long EXPIRATION_TIME = 1800L;
    private static long creationTime = Instant.now().getEpochSecond();

    public static synchronized JiraRestClient getJiraRestClient() {
        if (jiraRestClient == null || Instant.now().getEpochSecond() > creationTime + EXPIRATION_TIME) {
            return createJiraRestClient();
        }
        return jiraRestClient;
    }
}
where createJiraRestClient is a private method that reads the credentials, updates the creation time, and updates the private static variable jiraRestClient.
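A sketch of what that private method might look like (reading the connection details from system properties and using AsynchronousJiraRestClientFactory from the Atlassian client library are assumptions based on the description, not the actual code):
// inside JiraRestClientProvider
private static JiraRestClient createJiraRestClient() {
    URI serverUri = URI.create(System.getProperty("jira.url"));   // assumed credential source
    String user = System.getProperty("jira.user");
    String password = System.getProperty("jira.password");
    jiraRestClient = new AsynchronousJiraRestClientFactory()
            .createWithBasicHttpAuthentication(serverUri, user, password);
    creationTime = Instant.now().getEpochSecond();
    return jiraRestClient;
}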
Another class uses this JiraRestClientProvider class to perform the Jira actions, like creating issues or commenting on an issue, as follows:
JiraRestClientProvider.getJiraRestClient().
getIssueClient().createIssue(issueInput).claim().getKey();
or
JiraRestClientProvider.getJiraRestClient().getIssueClient()
.getIssue(issueKey).claim().getCommentsUri().toString();
Now, while writing unit tests for the class that uses this, I cannot mock the static method getJiraRestClient and am therefore unable to write proper unit tests.
(Also, I cannot use PowerMock.)
My question is: is there a way I can write my provider class such that I have only a single, fresh instance of JiraRestClient for all threads, and can also unit test it?
Yes you can. But not with your design.
Whenever you call:
JiraRestClientProvider.getJiraRestClient()
you automatically tightly couple the calling class to the singleton, which means that every time you call your method under test in a unit test it will call the singleton. There's no way around that, unless of course you want to use PowerMock :).
A quick win is to wrap your singleton behind an interface and use dependency injection to inject it into your classes.
Written as complete plain code, this would look like:
interface IJiraClientProvider {
    JiraRestClient getJiraRestClient();
}

class Wrapper implements IJiraClientProvider {

    @Override
    public JiraRestClient getJiraRestClient() {
        return JiraRestClientProvider.getJiraRestClient();
    }
}

class YourClass {

    private final IJiraClientProvider jiraClientProvider;

    public YourClass(IJiraClientProvider jiraClientProvider) {
        this.jiraClientProvider = jiraClientProvider;
    }

    // now you can unit test your code and mock the dependency
}
When you instantiate YourClass you'll have to pass the wrapper:
YourClass cls = new YourClass(new Wrapper());
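In a unit test the provider can then simply be mocked. A small sketch, assuming Mockito and JUnit 5 (the test and method names are made up):
import static org.mockito.Mockito.*;
import org.junit.jupiter.api.Test;
import com.atlassian.jira.rest.client.api.JiraRestClient;

public class YourClassTest {

    @Test
    public void usesTheInjectedClient() {
        IJiraClientProvider provider = mock(IJiraClientProvider.class);
        JiraRestClient client = mock(JiraRestClient.class, RETURNS_DEEP_STUBS);
        when(provider.getJiraRestClient()).thenReturn(client);

        YourClass cls = new YourClass(provider);
        // call the method under test here, then verify interactions, e.g.:
        // verify(client.getIssueClient()).createIssue(any());
    }
}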
The title might be incorrect, but I will try to explain my issue. My project is a Spring Boot project. I have services which call external REST endpoints.
I have a service method which contains several calls to other services of mine. Every individual call can succeed or fail. Each call goes to a REST endpoint, and there can be issues, for example the web service is not available, or in rare cases it throws an unknown exception. Whatever happens, I need to be able to track which calls were successful, and if any one of them fails I want to roll back to the original state as if nothing happened, a bit like the @Transactional annotation. The REST calls go to different endpoints, need to be called separately, and belong to an external party I have no influence on. Example:
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;
    @Autowired
    private Process2Service process2Service;
    @Autowired
    private Process3Service process3Service;
    @Autowired
    private Process4Service process4Service;

    public void bundledProcess() {
        process1Service.createFileRESTcall();
        process2Service.addFilePermissionsRESTcall();
        process3Service.addFileMetadataRESTcall(); // <-- might fail for example
        process4Service.addFileTimestampRESTcall();
    }
}
If, for example, process3Service.addFileMetadataRESTcall fails, I want to do something like an undo (in reverse order) for every step before process3:
process2Service.removeFilePermissionsRESTcall();
process1Service.deleteFileRESTcall();
I read about the Command pattern, but that seems to be used for undo actions inside an application, as a sort of history of actions performed, not inside a Spring web application. Is it correct for my use case too, or should I track per method/web-service call whether it was successful? Is there a best practice for this?
I guess that however I track it, I need to know which call failed and from there on perform my 'undo' REST calls. Although in theory even those calls might fail, of course.
My main goal is to not have files created (in my example) on which the further processing steps were not performed. It should either be all successful or nothing: a sort of transaction.
Update 1: improved pseudo implementation based on the comments:
public class Process1ServiceImpl implements Process1Service {

    public void createFileRESTcall() throws MyException {
        // Call an external REST API, pseudo code:
        if (REST-call fails) {
            throw new MyException("External REST api failed");
        }
    }
}
public class BundledProcessEvent {

    private boolean createFileSuccess;
    private boolean addFilePermissionsSuccess;
    private boolean addFileMetadataSuccess;
    private boolean addFileTimestampSuccess;

    // Getters and setters
}
public class MyServiceImpl implements MyService {

    @Autowired
    private Process1Service process1Service;
    @Autowired
    private Process2Service process2Service;
    @Autowired
    private Process3Service process3Service;
    @Autowired
    private Process4Service process4Service;

    @Autowired
    private ApplicationEventPublisher applicationEventPublisher;

    @Transactional(rollbackOn = MyException.class)
    public void bundledProcess() {
        BundledProcessEvent bundledProcessEvent = new BundledProcessEvent();
        this.applicationEventPublisher.publishEvent(bundledProcessEvent);

        process1Service.createFileRESTcall();
        bundledProcessEvent.setCreateFileSuccess(true);
        process2Service.addFilePermissionsRESTcall();
        bundledProcessEvent.setAddFilePermissionsSuccess(true);
        process3Service.addFileMetadataRESTcall();
        bundledProcessEvent.setAddFileMetadataSuccess(true);
        process4Service.addFileTimestampRESTcall();
        bundledProcessEvent.setAddFileTimestampSuccess(true);
    }
    @TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
    public void rollback(BundledProcessEvent bundledProcessEvent) {
        // If the last process call succeeded, we should not even be
        // in this rollback method.
        //if (bundledProcessEvent.isAddFileTimestampSuccess()) {
        //    // remove timestamp
        //}
        if (bundledProcessEvent.isAddFileMetadataSuccess()) {
            // remove metadata
        }
        if (bundledProcessEvent.isAddFilePermissionsSuccess()) {
            // remove file permissions
        }
        if (bundledProcessEvent.isCreateFileSuccess()) {
            // remove file
        }
    }
}
Your operation looks like a transaction, so you can use the @Transactional annotation. From your code I can't really tell how you are managing the HTTP responses of those operations, but you should consider having your service methods return them and then rolling back depending on the response codes. You can create an array of methods like below, but how exactly you want your logic to work is up to you.
private Process[] restCalls = new Process[] {
        new Process() { public void call() { process1Service.createFileRESTcall(); } },
        new Process() { public void call() { process2Service.addFilePermissionsRESTcall(); } },
        new Process() { public void call() { process3Service.addFileMetadataRESTcall(); } },
        new Process() { public void call() { process4Service.addFileTimestampRESTcall(); } },
};

interface Process {
    void call();
}
@Transactional(rollbackOn = Exception.class)
public void bundledProcess() {
    restCalls[0].call();
    ... // say, see which process returned a wrong response code
}

@TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
public void rollback() {
    // handle the rollback according to the failed method index
}
Check this article. Might come in handy.
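To make the rollback idea concrete without relying on transaction infrastructure, one common shape is to record a compensating action for each completed step and run them in reverse order when a later step fails. A rough sketch (the removeFileMetadataRESTcall undo method and the exception handling are assumptions beyond the question):
import java.util.ArrayDeque;
import java.util.Deque;

public class BundledProcessRunner {

    @FunctionalInterface
    interface Compensation {
        void undo() throws Exception;
    }

    private final Process1Service process1Service;
    private final Process2Service process2Service;
    private final Process3Service process3Service;
    private final Process4Service process4Service;

    public BundledProcessRunner(Process1Service p1, Process2Service p2,
                                Process3Service p3, Process4Service p4) {
        this.process1Service = p1;
        this.process2Service = p2;
        this.process3Service = p3;
        this.process4Service = p4;
    }

    public void bundledProcess() throws Exception {
        // each successful step pushes its compensating action
        Deque<Compensation> compensations = new ArrayDeque<>();
        try {
            process1Service.createFileRESTcall();
            compensations.push(() -> process1Service.deleteFileRESTcall());

            process2Service.addFilePermissionsRESTcall();
            compensations.push(() -> process2Service.removeFilePermissionsRESTcall());

            process3Service.addFileMetadataRESTcall();
            compensations.push(() -> process3Service.removeFileMetadataRESTcall()); // assumed undo call

            process4Service.addFileTimestampRESTcall();
        } catch (Exception e) {
            // undo completed steps in reverse order; these calls can fail too
            // and need at least logging, possibly retries
            while (!compensations.isEmpty()) {
                try {
                    compensations.pop().undo();
                } catch (Exception undoFailure) {
                    // log and continue with the remaining compensations
                }
            }
            throw e;
        }
    }
}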
The answer to this question is quite broad. There are too many ways to do distributed transactions to go through them all here. However, since you are using Java and Spring, your best bet is to use something like JTA (the Java Transaction API), which enables distributed transactions across multiple services/instances/etc. Fortunately, Spring Boot supports JTA using either Atomikos or Bitronix. You can read the doc here.
One approach to enabling distributed transactions is through a message broker such as JMS, RabbitMQ, Kafka, ActiveMQ, etc., using a protocol like XA transactions (two-phase commit). For external services that do not support distributed transactions, one approach is to write a wrapper service that understands XA transactions around that external service.
I wanted to make a little "log" of what the user is doing. I have different panels, and all of these have Ajax behaviors such as "onclick", "onevent" and "onchange". What I planned was to define an application-wide ArrayList of Strings to log all these things.
I wrote the following in WicketApplication.java:
public class WicketApplication extends WebApplication {

    private List<String> log = new ArrayList<String>();

    @Override
    public Class<? extends WebPage> getHomePage() {
        //code
    }

    @Override
    public void init() {
        //code
    }

    public List<String> getLog() {
        return log;
    }

    public void setLog(List<String> log) {
        this.log = log;
    }
}
Then in one of my panels:
public class Foo extends Panel {

    private static final long serialVersionUID = 1L;

    private WicketApplication wapp = (WicketApplication) Application.get();

    public Foo(String id) {
        super(id);
    }

    public void bar() {
        List<String> protocol = wapp.getLog();
        protocol.add(foo.getBarName() + " has been added to " + selectedKontakt.getObject().getName());
        wapp.setLog(protocol);
    }
}
In another panel I tried to create a new reference to WicketApplication, but it does not seem to be the same one.
Now I have these questions:
Isn't WicketApplication unique and therefore usable for this kind of manipulation?
Do I have to use a session for this?
Can I even cast Application to WebApplication? Because I have this error in the console:
wapp <----- field that is causing the problem
Is there any other way to create an application-wide variable?
I think you are doing it wrong (on multiple levels).
Firstly: if you want to log, use a logging framework, e.g. Logback, preferably accessed through SLF4J.
Secondly: if you don't want to use a log framework, create a log service (a dedicated object, not the Wicket Application) and use dependency injection to inject that service into all components where you need it; Wicket supports both Spring and Guice (see the sketch after this list).
Third: static access to the WebApplication, as suggested by the accepted answer, sounds like a very bad idea (but, to be fair, it is apparently suggested by Wicket).
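A minimal sketch of the second suggestion, assuming Spring plus wicket-spring's @SpringBean (the UserActionLog interface is made up, and the application needs a SpringComponentInjector registered in init()):
import org.apache.wicket.markup.html.panel.Panel;
import org.apache.wicket.spring.injection.annot.SpringBean;

public class Foo extends Panel {

    // injected by SpringComponentInjector; UserActionLog is a dedicated,
    // Spring-managed log service (assumed interface, not part of the question)
    @SpringBean
    private UserActionLog userActionLog;

    public Foo(String id) {
        super(id);
    }

    public void bar() {
        userActionLog.record("something has been added");
    }
}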
The normal way is a (static) method. It's typical, don't be afraid:
MyApplication m = MyApplication.get();
So it is generally easy to get hold of in every Wicket object.
The static get() is usually "statically overridden" to return the correct type (and it can give additional control):
public static MyApplication getMyApplication() {
    return (MyApplication) get();
}
or
public static MyApplication get() {
    return (MyApplication) WebApplication.get();
}
When this static method returns the correct type, your problem is resolved.
Analyse how AuthenticatedWebApplication is built on top of WebApplication (or WebApplication on top of Application); it's from the Wicket team and seems to be canonical.
BTW, if you need to access or execute actions that depend on the user/session, a similar idea exists: override WebSession in a MySession class.
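A small sketch of that session-side idea, keeping a per-user action log in a custom session (the field and accessor names are assumptions, and the session type also has to be returned from Application.newSession()):
import java.util.ArrayList;
import java.util.List;
import org.apache.wicket.Session;
import org.apache.wicket.protocol.http.WebSession;
import org.apache.wicket.request.Request;

public class MySession extends WebSession {

    private final List<String> userActionLog = new ArrayList<String>();

    public MySession(Request request) {
        super(request);
    }

    // typed static accessor, same pattern as the application one above
    public static MySession get() {
        return (MySession) Session.get();
    }

    public List<String> getUserActionLog() {
        return userActionLog;
    }
}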