I'm developing an Android library. When the user receives a push notification, it may contain deep links that I need to return to the app.
I did this in Kotlin with no problem.
This is the function that is called when the library needs to send the deep links:
fun getDeepLinkFlow(): Flow<HashMap<String, String>?> = flow {
    emit(deepLinks)
}
And in my Kotlin test app I managed to use it with no problems as well, like this:
GlobalScope.launch(coroutineContext) {
    SDK.getDeepLinkFlow().collect { deepLinks ->
        println(deepLinks)
    }
}
But now there's a React Native (RN) project that wants to use the lib. To do that, we are building an RN module that joins the iOS code and the Android code, but it uses Java.
So, how can I use collect from coroutines in Java code? Or what could I do differently?
Coroutines/Flow are inherently awkward to use from Java, since they rely on compiler-transformed suspend functions to work.
One possible solution is to expose an alternative way of consuming the Flow to your Java code. Using the RxJava integration library (kotlinx-coroutines-rx2) you can expose a compatible Flowable that the Java side can consume.
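For example, if the library exposed a wrapper such as getDeepLinkFlowable() (a hypothetical method that could be built with asFlowable() from kotlinx-coroutines-rx2), the Java side could consume it roughly like this:
import io.reactivex.disposables.Disposable;

public class DeepLinkConsumer {
    private Disposable subscription;

    public void start() {
        // SDK.getDeepLinkFlowable() is assumed to return a Flowable<HashMap<String, String>>
        // built on the Kotlin side around the existing getDeepLinkFlow().
        subscription = SDK.getDeepLinkFlowable()
                .subscribe(
                        deepLinks -> System.out.println(deepLinks), // each emitted map of deep links
                        Throwable::printStackTrace);                // stream errors
    }

    public void stop() {
        // dispose when the consumer goes away to stop receiving emissions
        if (subscription != null) {
            subscription.dispose();
        }
    }
}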
I've decided to change the way deep links are delivered and use a ViewModel with LiveData instead.
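For reference, LiveData is convenient here because its observer API is plain-Java friendly. A minimal sketch, assuming a hypothetical DeepLinkViewModel that exposes LiveData<HashMap<String, String>>:
import java.util.HashMap;
import androidx.lifecycle.LifecycleOwner;

public class DeepLinkObserverExample {
    // DeepLinkViewModel and getDeepLinks() are hypothetical names for a ViewModel
    // exposing LiveData<HashMap<String, String>> to the Java side.
    public static void observe(DeepLinkViewModel viewModel, LifecycleOwner owner) {
        viewModel.getDeepLinks().observe(owner, (HashMap<String, String> deepLinks) ->
                System.out.println(deepLinks)); // called on the main thread for each update
    }
}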
I built an Android app and a corresponding Node.js back-end to store data from it. I've got some API endpoints created, along with methods to save and retrieve data from MongoDB.
I understand how to make HTTPS requests from Java/Android, but I'm not sure how things should ideally be set up and structured. Is it best practice to create a new package for API-related things? For example, a package like:
API package
    Class containing GET requests
    Class containing POST requests
These classes might contain methods like:
public static String getItem(int item_id) {
    // HTTPS boilerplate
    // make the HTTPS GET request and process the data
    return result;
}
Or is it best to create a class to handle and make HTTPS requests and then implement all of the required API methods in a single, separate class?
There are several popular libraries out there that you can check out, like Retrofit or Android Volley. I recommend not writing the code that makes requests to the backend yourself, because things like parsing JSON data shouldn't be done manually; everything becomes hard to manage as your project grows.
How to structure the code depends on personal preference. I personally use Retrofit, so I keep all network requests in an interface. For example:
interface APIs {
    @GET("api/user")
    Call<Users> getUsers();

    @GET("api/comment")
    Call<Comments> getComments();

    @GET("api/photos")
    Call<Photos> getPhotos();
}
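A rough sketch of how such an interface might be wired up and called (the base URL is a placeholder, and it assumes the Gson converter artifact is on the classpath):
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Response;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;

public class ApiClient {
    public static void main(String[] args) {
        Retrofit retrofit = new Retrofit.Builder()
                .baseUrl("https://example.com/")                     // placeholder base URL
                .addConverterFactory(GsonConverterFactory.create())  // handles the JSON parsing
                .build();

        APIs api = retrofit.create(APIs.class);

        // Asynchronous call: Retrofit performs the request and converts the JSON body.
        api.getUsers().enqueue(new Callback<Users>() {
            @Override
            public void onResponse(Call<Users> call, Response<Users> response) {
                Users users = response.body(); // use the result (update the UI, store it, ...)
            }

            @Override
            public void onFailure(Call<Users> call, Throwable t) {
                // handle network or conversion errors
            }
        });
    }
}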
But if you want to learn how things should be set up and structured nicely, I suggest reading about Android Architecture Components and Clean Architecture, where code logic is kept separate and grouped in a manageable way.
I am using vertx-rx-java
I have two verticles which communicate with each other using the EventBus; one of the verticles returns a complex object as the result. To do so I created a custom MessageCodec and wanted to register it in the EventBus.
In standard io.vertx.core.eventbus.EventBus there is a method registerCodec which gives possibility to register custom codecs for this EventBus.
However, since I am using io.vertx.rxjava.core.Vertx, calling vertx.eventBus() gives me io.vertx.rxjava.core.eventbus.EventBus, which doesn't have such a method. What is the purpose of removing this method from the rxjava EventBus? Is it considered bad practice to use custom codecs when using rxjava? If so, what is the recommended approach?
The only way I found to add a custom codec is to call eventBus.getDelegate():
EventBus eb = vertx.eventBus();
((io.vertx.core.eventbus.EventBus) eb.getDelegate()).registerCodec(new StringMessageCodec());
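For context, StringMessageCodec implements io.vertx.core.eventbus.MessageCodec. A minimal sketch of such a codec (the body below is illustrative, not the exact class from my project) looks like this:
import java.nio.charset.StandardCharsets;

import io.vertx.core.buffer.Buffer;
import io.vertx.core.eventbus.MessageCodec;

public class StringMessageCodec implements MessageCodec<String, String> {

    @Override
    public void encodeToWire(Buffer buffer, String s) {
        byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
        buffer.appendInt(bytes.length); // length prefix so the receiver knows how much to read
        buffer.appendBytes(bytes);
    }

    @Override
    public String decodeFromWire(int pos, Buffer buffer) {
        int length = buffer.getInt(pos);
        pos += 4;
        return new String(buffer.getBytes(pos, pos + length), StandardCharsets.UTF_8);
    }

    @Override
    public String transform(String s) {
        return s; // local (same-JVM) delivery; immutable strings can be passed as-is
    }

    @Override
    public String name() {
        return "string-message-codec"; // must be unique per event bus
    }

    @Override
    public byte systemCodecID() {
        return -1; // always -1 for user codecs
    }
}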
The short answer appears to be that it has not been supported. However, this has changed recently (August) on the unreleased master branch; see the commit where they removed @GenIgnore from EventBus. That change is not yet in release 3.5.3, though. Presumably it will go out in one of the next releases.
I am using Gatling for the first time. I have functional tests written in Java/Cucumber. I want to run these functional tests from a Gatling Scala script to do performance testing of my application. Is there any way to do so?
The idea is to use the existing functional tests and wrap them in Gatling scripts so that they can be executed concurrently for multiple users.
What you want to do is to call a Java method from Scala.
Make sure that the method you want to call is available on the class path Scala sees. Then refer to the method you want to call.
This blog post may help you.
If you are using Gatling for the first time, have you considered other performance tools that can give you such options? As an alternative to Gatling for your case (functional tests written in Java that you later want to run under load), I would recommend checking out Locust.
Using Locust you can write the tests in Java or even Kotlin. You can find a handy tutorial at this link:
https://www.blazemeter.com/blog/locust-performance-testing-using-java-and-kotlin
Another good option might be the Taurus framework, which allows you to run JUnit/TestNG tests right away:
https://gettaurus.org/docs/JUnit/
https://www.blazemeter.com/blog/navigating-your-first-steps-using-taurus
Gatling is primarily for HTTP testing. What I would do is call Java code from within a Gatling test and check the value it returns. In the example below I return a boolean from the Java code for the performance test (the same works for a functional test, which needs to extend GatlingHttpFunSpec instead of the Simulation class). You will also need a dummy endpoint (like a health-check URL that always returns 200).
val myJavaTest: MyJavaTest = new MyJavaTest()
val baseURL = "http://localhost:8080"
val endpoint_headers = Map("header1" -> "val1")
val endPoint = "/myurl/healthcheck"

setUp(
  scenario("Scenario")
    .exec(
      http("run first test")
        .get(endPoint)
        .headers(endpoint_headers)
        .check(bodyString.transform(str => {
          myJavaTest.runTest1() // should return a boolean
        }).is(true)))
    .inject(atOnceUsers(1))
).protocols(http.baseURL(baseURL))
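For completeness, a rough sketch of what the MyJavaTest used above could look like if it runs the existing Cucumber features programmatically (this assumes a recent cucumber-java on the classpath; older versions expose the same entry point as cucumber.api.cli.Main, and the glue package and feature path are placeholders):
import io.cucumber.core.cli.Main;

public class MyJavaTest {

    public boolean runTest1() {
        // Exit status 0 means every scenario passed. The glue package and feature
        // path are placeholders for your own project layout.
        byte exitStatus = Main.run(
                new String[]{"--glue", "com.example.steps", "classpath:features"},
                Thread.currentThread().getContextClassLoader());
        return exitStatus == 0;
    }
}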
I have created several tasks. Each takes an input, has an execution function that keeps updating its status, and a function to get the task's output. They will execute serially or in parallel. Some outputs are Lists, so there will be loops as well.
public class Task1 { // each task looks like this
    void addInput(String key, String value) {
        // ...
    }
    void run() {
        // ...
        updateStatus();
        // ...
    }
    HashMap getOutput() {
        // ...
    }
    Status getStatus() {
        // ...
    }
}
I want to make a workflow from these tasks and then use the workflow structure information to build a dynamic GUI and monitor the outputs of each task. Do I have to write a workflow execution system from scratch, or is there a simple alternative available?
Is there any workflow engine to which I can give (maybe in XML) my Java classes, inputs, outputs and execution functions and let it execute?
In the Java world, your use case is known as BPM (Business Process Management).
In the .NET world, the equivalent is Windows Workflow Foundation (WWF).
There are many Java-based open source BPM tools. The one I like is jBPM.
It is quite powerful and can be integrated with rule engines like Drools.
Activiti is another good choice.
Check out Activiti. It is not strictly designed to solve your use case, but you may use and adapt its process engine, because it's written purely in Java. Activiti is primarily a process-modelling engine, so it's not designed to control tasks running in parallel at runtime; however, you will get many things you can reuse "out of the box".
You will get task linking based on an XML file.
You will get a GUI for linking tasks for free (based on Eclipse).
You will get a GUI in the web browser to browse running processes, start new ones, and see the current status of tasks: http://activiti.org/userguide/index.html#N12E60
You will get a reporting engine for free, where you can see reports and charts ("how long did the tasks take", "how long was the process running").
You will get a REST API for free; other applications will be able to get the current state of your application via simple REST calls.
So going in this direction you get many things for free. From a programmer's point of view you can, for example, inherit from the Task class in the Activiti API. Later, when the task is completed, call
taskService.complete(task.getId(), taskVariables);
You can also go the other way around. Supposing that your class which calculates in the background is called CalculationTask, you can connect CalculationTasks with a new instance of an Activiti Task. This gives you a bridge to the Activiti process engine. So you can do something like:
class CustomActivityTask extends Task { // inherit from the Activiti Task class to add your own fields
    private int someStateOne;
    private String someOtherState;
    // (...)
    // getters and setters
}
class CalculationTask {
    private CustomActivityTask activityTask; // by updating the state of this task you also update the state of the task in the Activiti process engine
    private TaskService activitiTaskService;

    public void run() { // this is your execution function
        while (true) {
            // calculate
            activityTask.setSomeStateOne(45);
            activityTask.setSomeOtherState("Task is almost completing...");
            // (...)
            if (allCompleted) {
                activitiTaskService.complete(activityTask.getId(), taskVariables);
                break;
            }
        }
    }
}
Apache Camel is an open source integration framework that can be used as a lightweight workflow system. Routes can be defined with a Java, XML, Groovy or Scala DSL. Apache Camel includes integrated monitoring capabilities. Besides that, you may use external monitoring tools such as Hawtio.
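For illustration, a minimal sketch of a route in Camel's Java DSL that chains task steps (the endpoint names and the Task2 step are assumptions, not from the question):
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class WorkflowRoutes {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Each bean call is one task step; the log endpoint gives basic monitoring.
                from("direct:workflow")
                        .bean(Task1.class, "run")        // Task1 is the class from the question
                        .bean(Task2.class, "run")        // Task2 is a hypothetical second step
                        .to("log:workflow?level=INFO");
            }
        });
        context.start();
        context.createProducerTemplate().sendBody("direct:workflow", "some input");
        context.stop();
    }
}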
Also have a look at Work Flow in Camel vs BPM.
Take a look at Copper Engine: http://copper-engine.org/
Unlike Activiti and the like, it does not require writing a ton of XML just to get a simple workflow going.
Perhaps State Chart XML (SCXML) can help you. Currently it is a Working Draft specification published by the W3C.
SCXML provides a generic state-machine-based execution environment based on Harel State Tables.
The Apache Foundation provides a Java implementation that we (my company) are currently using to perform state transitions on "jobs". Here is the Apache Commons SCXML implementation.
If your analysis is time-consuming and you don't need immediate feedback, then perhaps you don't need a workflow engine at all but a batch processor. Have a look at the Spring Batch project, which can be used together with Apache Camel (see my other answer for more information about this option).
I would definitely consider the Apache Storm project. Storm is designed to be an easily-extensible, parallel computation engine. Among its many features, its ease of management, fault tolerance and general simplicity to set up (in comparison to other similar technologies like Hadoop, I believe) are probably going to be attractive to you for a prototype system.
Workflows would be analogous to Storm topologies; the different tasks would be streams; and the different methods in tasks would correspond to spouts and bolts. Also, Storm supports several programming languages in its API, much as Hadoop does.
Storm was initially designed by Twitter and then open-sourced, similarly to other projects like Cassandra (Facebook) and Hadoop itself (Yahoo!). What that means for you as a user is that it was built for actual use, instead of as a purely theoretical concept. It's also pretty battle tested.
I hope this was useful to you, and wish you the best of luck with your project!
I have a JavaScript function (a very big one!) whose functionality I need in a Java (Groovy) class. It is a simple calendar converter. I can rewrite it in Groovy, but I just want to know if it is possible to call a JavaScript function from a Java (Groovy) method. I guess functional testing libraries like Selenium and Canoo should have something like this, am I right?
PS: I don't want to wake up a real-world browser in order to use its JS runtime environment.
Thanks,
As mentioned in the other answers, it is possible to use the Scripting API provided as part of the javax.script package, available since Java 6.
The following is a Groovy example which executes a little bit of JavaScript:
import javax.script.*
manager = new ScriptEngineManager()
engine = manager.getEngineByName("JavaScript")
javascriptString = """
obj = {"value" : 42}
print(obj["value"])
"""
engine.eval(javascriptString) // prints 42
It is not necessary to call a browser to execute JavaScript when using the Scripting API, but keep in mind that browser-specific features (most notably the DOM-related functionality) will not be available.
You can use Rhino, an implementation of the JavaScript language in Java. Here is an example of calling a JavaScript function from Java, but you can do the same from Groovy.
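For completeness, here is a sketch of calling a named JavaScript function from plain Java through the javax.script API (backed by a Rhino-based engine on Java 6/7 and Nashorn on Java 8); the convertDate function below is only a stand-in for the real calendar converter:
import javax.script.Invocable;
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

public class CalendarConverterBridge {
    public static void main(String[] args) throws Exception {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("JavaScript");

        // Load the converter; in practice you would read the big JavaScript file
        // from disk or the classpath instead of inlining a dummy function.
        engine.eval("function convertDate(year) { return year + 621; }");

        // Invocable lets us call a named function and get its return value back.
        Invocable invocable = (Invocable) engine;
        Object result = invocable.invokeFunction("convertDate", 1400);
        System.out.println(result); // prints 2021 (the exact numeric type depends on the engine)
    }
}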