Single-threaded Java Websocket for Testing - java

We are developing an application with Scala and Websockets. For the latter we use Java-Websocket. The application itself works great and we are in the middle of writing unit tests.
We use a WebSocket class as follows
class WebSocket(uri: URI) extends WebSocketClient(uri) {
  connectBlocking()
  var response = ""

  def onOpen(handshakedata: ServerHandshake) {
    println("onOpen")
  }

  def onMessage(message: String) {
    println("Received: " + message)
    response = message
  }

  def onClose(code: Int, reason: String, remote: Boolean) {
    println("onClose")
  }

  def onError(ex: Exception) {
    println("onError")
  }
}
A test might look like this (pseudo code):
websocketTest {
  ws = new WebSocket("ws://example.org")
  ws.send("foo")
  res = ws.getResponse()
  ...
}
Sending and receiving data works. However, the problem is that connecting to the websocket creates a new thread, and only that new thread has access to response via the onMessage handler. What is the best way to either make the websocket implementation single-threaded or connect the two threads so that we can access the response in the test case? Or is there another, even better way of doing it? In the end we should somehow be able to test the response of the websocket.

There are a number of ways you could try to do this. The issue is that you might get either an error or a successful response from the server, so the best way is probably to use some sort of timeout. In the past I have used a pattern like this (note, this is untested code):
...
set response in onMessage like you did (but initialize it to null rather than "")
...
long start = System.currentTimeMillis();
long timeout = 5000; // 5 seconds
while ((System.currentTimeMillis() - start) < timeout && response == null) {
    Thread.sleep(100);
}
if (response == null) {
    // timed out
} else {
    // do something with the response
}
If you want to be especially safe you can use an AtomicReference for the response.
Of course the timeout and sleep can be minimized based on your test case.
Moreover, you can wrap this in a utility method.
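For illustration, here is a minimal sketch of such a utility method, assuming the handler publishes the message into an AtomicReference<String> (the field and method names are just placeholders):
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical helper: poll an AtomicReference until a value arrives or the timeout expires.
public static String awaitResponse(AtomicReference<String> response, long timeoutMillis)
        throws InterruptedException {
    long start = System.currentTimeMillis();
    while (System.currentTimeMillis() - start < timeoutMillis) {
        String value = response.get(); // set by onMessage on the websocket thread
        if (value != null) {
            return value;
        }
        Thread.sleep(100); // back off briefly before polling again
    }
    return null; // caller treats null as "timed out"
}
A CountDownLatch or BlockingQueue would avoid the polling entirely, but the idea is the same: the test thread waits until the websocket thread has published the value.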

Related

Vertx client is taking time to check for failure

I have a requirement where one microservice connects to another microservice via a Vert.x client. In the code I check whether the other microservice is down; on failure it should create a JsonObject with solrError as the key and the failure message as the value. If there is a solr error, i.e. the other microservice (which calls solr via load balancing) is down, it should return an error response. However, the Vert.x client takes some time to report the failure, so by the time the condition is checked there is no solrError in the JsonObject yet, the condition fails and resp is still null. What can be done so that the failure is detected before the solrError check runs, and an Internal Server Error response is returned?
Below is the code:
solrQueryService.executeQuery(query).subscribe().with(jsonObject -> {
    ObjectMapper objMapper = new ObjectMapper();
    SolrOutput solrOutput = new SolrOutput();
    List<Doc> docs = new ArrayList<>();
    try {
        if (null != jsonObject.getMap().get("solrError")) {
            resp = Response.status(Response.Status.INTERNAL_SERVER_ERROR)
                    .entity(new BaseException(
                            exceptionService.processSolrDownError(request.header.referenceId))
                            .getResponse()).build();
        }
        solrOutput = objMapper.readValue(jsonObject.toString(), SolrOutput.class);
        if (null != solrOutput.getResponse()
                && CollectionUtils.isNotEmpty(solrOutput.getResponse().getDocs())) {
            docs.addAll(solrOutput.getResponse().getDocs());
            uniDocList = Uni.createFrom().item(docs);
        }
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
});
if (null != resp && resp.getStatus() != 200) {
    return resp;
}
SolrQueryService prepares the query and sends the URL and query to the Vert.x web client as below:
public Uni<JsonObject> search(URL url, SolrQuery query, Integer timeout) {
    int port = url.getPort();
    if (port == -1 && "https".equals(url.getProtocol())) {
        port = 443;
    }
    if (port == -1 && "http".equals(url.getProtocol())) {
        port = 80;
    }
    HttpRequest<Buffer> request = client.post(port, url.getHost(), url.getPath()).timeout(timeout);
    return request.sendJson(query).map(resp -> {
        return resp.bodyAsJsonObject();
    }).onFailure().recoverWithUni(f -> {
        return Uni.createFrom().item(new JsonObject().put("solrError", f.getMessage()));
    });
}
I have not used the Vert.x client, but I assume it is reactive and non-blocking. Assuming this is the case, your code is mixing imperative and reactive constructs. The subscribe in the first line is reactive, and the lambda you provide will be called when the server responds to the client request. However, the imperative code after the subscribe runs before the lambda even has a chance to be called, so your checks and access to the resp object will never reflect what happened inside the lambda.
You need to move all the code into the lambda or at least make subsequent code chain onto the result of the subscribe.
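As a rough sketch of what that chaining could look like, assuming a Mutiny Uni pipeline and a resource method that is allowed to return Uni<Response> (the method name and error entity below are made up for illustration):
// Hypothetical resource method: build the Response inside the pipeline
// instead of subscribing and then inspecting a shared variable.
public Uni<Response> querySolr(SolrQuery query) {
    return solrQueryService.executeQuery(query)
            .onItem().transform(jsonObject -> {
                // upstream recoverWithUni already turned failures into a "solrError" entry
                if (jsonObject.getMap().get("solrError") != null) {
                    return Response.status(Response.Status.INTERNAL_SERVER_ERROR)
                            .entity(jsonObject.getString("solrError"))
                            .build();
                }
                return Response.ok(jsonObject).build();
            });
}
That way the error check only runs after the client call has actually completed or failed, so there is no window where resp is still null.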

Can my selenium framework consume an incoming message

I would like to know how my Selenium framework can dequeue a message sitting in a message queue. I have built an application to send a JSON string containing k/v pairs to a message queue.
My architecture is as follows (separate apps):
A JSP web application exists, accepting parameters and resulting in a JSON string
A message sender exists that takes the JSON string and publishes it to a queue
A message consumer exists and consumes the messages. It's basically just sitting there
A Selenium Java framework exists; I would like it to process the messages, and for each message interpret the k/v pairs and kick off the script
I would like to use the messages already in the queue and process them within the Selenium framework. How can I achieve this?
I would appreciate the help. I have edited the question with the code.
This is the code snippet to send the JSON message:
public class MessageSender {
    public static void main(String[] args) throws IOException {
        SingleNumberLogin generateLogin = new SingleNumberLogin();
        // function call to build the JSON object
        String jsonQueue = generateLogin.buildJASONObject();
        ConnectionFactory conFactory = new ConnectionFactory();
        // configure the factory before creating the connection
        conFactory.setUsername("guest");
        conFactory.setPassword("guest");
        conFactory.setVirtualHost("/");
        conFactory.setHost("localhost");
        conFactory.setPort(5672);
        try {
            Connection connInterface = conFactory.newConnection();
            Channel mqChannel = connInterface.createChannel();
            mqChannel.queueDeclare("MyQueue", false, false, false, null);
            // just assigning json to another string, then publish the message
            String myMessage = jsonQueue;
            mqChannel.basicPublish("", "MyQueue", false, false, null, myMessage.getBytes());
        } catch (IOException | TimeoutException e) {
            System.out.println(e.getStackTrace());
        }
    }
}
This is the consumer code that I have inserted into the startup function of the automation script, so that if a message arrives a single test case is executed:
@BeforeTest
public static void initializeTestBaseSetup() throws Exception, IOException, TimeoutException {
    ConnectionFactory conFactory = new ConnectionFactory();
    Connection connInterface = conFactory.newConnection();
    Channel mqChannel = connInterface.createChannel();
    mqChannel.queueDeclare("MyQueue", false, false, false, null);
    mqChannel.basicConsume("MyQueue", true, (consumerTag, message) -> {
        // convert the message body to a String
        String m = new String(message.getBody(), "UTF-8");
        System.out.println("Message received" + m);
    }, consumerTag -> {
    });
}
Output JSON
JSON Message received 2020-08-28T20:39:30.845{
"NUMBER": "0000011111",
"Type": "BAU",
"User": "MyUser ",
"Email": "riidonesh#gmail.com",
}
When tested in isolation it works perfectly fine: I send the message and check that the consumer receives it. Adding the consumer code to my framework is where I am stuck.
I would suggest you don't think about what you have as a "selenium framework" - think of it as a "java framework".
Selenium is a set of libraries that allow you to automate the web browser at a GUI level. The framework is the coded solution that facilitates creation and management of your test suite - it doesn't have to be limited to Selenium, and chances are Selenium is already just one of its components.
Trying to answer your question directly:
SELENIUM cannot read messages
JAVA can read messages
If your rabbitmq has a web front end then you may be able to use selenium for it, but this isn't a very efficient or a logical solution.
What you might want to consider, and what I would do, is extending your framework to use the rabbitmq libraries to process messages as you need. These libraries are designed for this task.
You say:
I would like to process the messages and for each message it will interpret the k/v pairs and kicks off the script.
I understand this to mean that the messages are the pre-req data for the tests. If you want to read the values of a message before the test you can either:
Place the get/read in a generic @Before method (see the sketch after the hello world example below)
or, if it's a specific message per test case, add it into the start of the test.
You're working in Java so you can do whatever you want really.
To get you started, the rabbitmq tutorial starts here.
This is their hello world example for reading messages from the queue:
public class Recv {

    private final static String QUEUE_NAME = "hello";

    public static void main(String[] argv) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();
        channel.queueDeclare(QUEUE_NAME, false, false, false, null);
        System.out.println(" [*] Waiting for messages. To exit press CTRL+C");

        // register a callback that is invoked for each delivered message
        DeliverCallback deliverCallback = (consumerTag, delivery) -> {
            String message = new String(delivery.getBody(), "UTF-8");
            System.out.println(" [x] Received '" + message + "'");
        };
        channel.basicConsume(QUEUE_NAME, true, deliverCallback, consumerTag -> { });
    }
}
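If the test setup needs to block until a message is available (rather than react to deliveries asynchronously), a simple option is to poll the queue with basicGet. A rough sketch, assuming the same "MyQueue" queue and default connection settings (the class and method names are just illustrative):
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.GetResponse;

public class QueueReader {

    // Pull a single message from the queue, or return null if the queue is empty.
    public static String readOneMessage(String queueName) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection connection = factory.newConnection()) {
            Channel channel = connection.createChannel();
            GetResponse response = channel.basicGet(queueName, true); // auto-ack
            return response == null ? null : new String(response.getBody(), "UTF-8");
        }
    }
}
Your @BeforeTest (or a per-test setup step) could then call readOneMessage("MyQueue"), parse the JSON, and hand the k/v pairs to the test.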

Dynamic Rest Client in Java

I'm trying to create a dynamic REST client where I can set the HTTP method (GET, POST, PUT, DELETE), query params and body (JSON, plain text, XML); this is basically what I need. For the request I think I know how to do it, but my concern is reading the response: I know what format I should get, but I don't know how to read it properly. So far I return an Object; below is the code (only for POST, but the idea is the same):
Response responseRest = null;
Client client = null;
try {
    client = new ResteasyClientBuilder()
            .establishConnectionTimeout(TIME_OUT, TimeUnit.MILLISECONDS)
            .socketTimeout(TIME_OUT, TimeUnit.MILLISECONDS)
            .build();
    WebTarget target = client.target(request.getUrlTarget());
    MediaType type = assignResponseType(request.getTypeResponse());
    switch (request.getProtocol()) {
        case POST: {
            if (request.getParamQuery() != null) {
                for (VarRequestDTO varRequest : request.getParamQuery()) {
                    target = target.queryParam(varRequest.getName(), varRequest.getValue());
                }
            }
            responseRest = target.request().post(Entity.entity(new ResponseWrapper(), type));
            break;
        }
        default:
            // HTTP method not supported
    }
    Object result = responseRest.readEntity(Object.class);
} catch (Exception e) {
    response.setError(Boolean.TRUE);
    response.setMessage(e.getMessage());
    e.printStackTrace();
} finally {
    if (responseRest != null) {
        responseRest.close();
    }
    if (client != null) {
        client.close();
    }
}
What I basically need is to return the object in the format needed; the caller is then supposed to cast it to the correct type. I just need it to be dynamic and usable for any service.
Thanks
With every request that a REST client makes to a REST service, it passes an "Accept" header.
This is to indicate to the service the MIME-type of the resource the client is willing to accept.
In the above case, what are the acceptable formats (json/ plain text/ etc.) for you?
Depending on the "accept" format you choose, and the "Content-type" header that you receive, you can write a deserializer to accept that data and process.
Also, instead of returning an Object which is too generic, consider returning a readable Stream to the caller.
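As a rough sketch of the idea, using only the JAX-RS client API (the branching on Content-Type is illustrative, not a complete deserializer):
// Read the entity according to the Content-Type the service actually returned.
String contentType = responseRest.getHeaderString("Content-Type");

if (contentType != null && contentType.contains(MediaType.APPLICATION_JSON)) {
    // hand the raw JSON (or an InputStream) back to the caller to deserialize
    String json = responseRest.readEntity(String.class);
    // ... pass json to the JSON mapper of your choice
} else if (contentType != null && contentType.contains(MediaType.APPLICATION_XML)) {
    InputStream xmlStream = responseRest.readEntity(InputStream.class);
    // ... let the caller consume the stream
} else {
    String plain = responseRest.readEntity(String.class);
    // ... plain text, return as-is
}
You can also pass the desired type to target.request(MediaType.APPLICATION_JSON) so the Accept header tells the service what you are willing to receive.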

Streaming in jersey 2?

I've been trying to get JSON streaming to work in Jersey 2. For the life of me, nothing streams until the stream is complete.
I've tried this example, trying to simulate a slow producer of data.
#Path("/foo")
#GET
public void getAsyncStream(#Suspended AsyncResponse response) {
StreamingOutput streamingOutput = output -> {
JsonGenerator jg = new ObjectMapper().getFactory().createGenerator(output, JsonEncoding.UTF8);
jg.writeStartArray();
for (int i = 0; i < 100; i++) {
jg.writeObject(i);
try {
Thread.sleep(100);
}
catch (InterruptedException e) {
logger.error(e, "Error");
}
}
jg.writeEndArray();
jg.flush();
jg.close();
};
response.resume(Response.ok(streamingOutput).build());
}
And yet Jersey just sits there until the JSON generator is done before returning the results. I'm watching the results come through in Charles proxy.
Do I need to enable something? Not sure why this won't stream out.
Edit:
This may actually be working, just not how I expected. I don't think StreamingOutput writes things out in real time, which is what I wanted; it's more about not having to buffer responses before writing them out to the client. If I run a loop of a million with no thread sleep, then data does get written out in chunks without having to buffer it in memory.
Your edit is correct. It is working as expected. StreamingOutput is just a wrapper that lets us write directly to the response stream, but it does not actually mean the response is streamed on each server-side write to the stream. Also, AsyncResponse does not provide any different response as far as the client is concerned. It is simply there to help increase throughput with long-running tasks. The long-running task should actually be done in another thread, so the method can return.
See more at Asynchronous Server API
What you seem to be looking for instead is Chunked Output
Jersey offers a facility for sending response to the client in multiple more-or-less independent chunks using a chunked output. Each response chunk usually takes some (longer) time to prepare before sending it to the client. The most important fact about response chunks is that you want to send them to the client immediately as they become available without waiting for the remaining chunks to become available too.
Not sure how it will work for your particular use case, as the JsonGenerator expects an OutputStream (which the ChunkedOutput we use is not), but here is a simpler example:
#Path("async")
public class AsyncResource {
#GET
public ChunkedOutput<String> getChunkedStream() throws Exception {
final ChunkedOutput<String> output = new ChunkedOutput<>(String.class);
new Thread(() -> {
try {
String chunk = "Message";
for (int i = 0; i < 10; i++) {
output.write(chunk + "#" + i);
Thread.sleep(1000);
}
} catch (Exception e) {
} finally {
try {
output.close();
} catch (IOException ex) {
Logger.getLogger(AsyncResource.class.getName())
.log(Level.SEVERE, null, ex);
}
}
}).start();
return output;
}
}
Note: I had a problem getting this to work at first. I would only get the delayed, complete result. The problem seemed to be something completely separate from the program. It was actually my AVG causing the problem. Some feature called "LinkScanner" was stopping this chunking process from occurring. I disabled that feature and it started working.
I haven't explored chunking much, and am not sure of the security implications, so I am not sure why the AVG application has a problem with it.
EDIT
Seems the real problem is due to Jersey buffering the response in order to calculate the Content-Length header. You can see this post for how you can change this behavior.
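For reference, that buffering is controlled by Jersey's outbound content-length buffer; a minimal sketch of turning it off in a ResourceConfig, assuming Jersey 2.x, might look like this:
import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.server.ServerProperties;

public class AppConfig extends ResourceConfig {
    public AppConfig() {
        packages("com.example.resources"); // hypothetical package containing the resources
        // Disable the buffer Jersey uses to compute Content-Length,
        // so entity bytes are flushed to the client as they are written.
        property(ServerProperties.OUTBOUND_CONTENT_LENGTH_BUFFER, 0);
    }
}
With the buffer set to 0 the response is sent chunked, since Jersey can no longer know the length up front.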

What's a good, robust, ASYNC way to check if URLs exist from a Play controller?

I originally tried this:
private static boolean checkUrlsAreReachable(String... urls) {
    checkArgument(urls.length > 0);
    List<F.Promise<WS.HttpResponse>> promises = newArrayList();
    for (String url : urls) {
        promises.add(WS.url(url).followRedirects(true).timeout("30s").getAsync());
    }
    List<WS.HttpResponse> results = await(F.Promise.waitAll(promises));
    for (WS.HttpResponse response : results) {
        if (!response.success()) {
            logger.debug("Failed accessing one of " + Joiner.on(", ").join(urls));
            return false;
        }
    }
    return true;
}
But I found several caveats:
I'm getting an exception on WS.url(url) if the URL in question does not resolve well (e.g. http://a.com/).
At least when debugging, it seems the call to getAsync() blocks ... is it really async in production? I know Play has fewer threads in Dev mode, but I thought the call wouldn't even start executing at this point.
If one of the URLs is not reachable, I'm not sure how to log which one failed (how to access the URL from the WS.HttpResponse object).
So I turned to using sync HTTP instead of async. The following implementation seems to work:
private static boolean checkUrlsAreReachable(String... urls) {
    checkArgument(urls.length > 0);
    List<F.Promise<Boolean>> promises = newArrayList();
    for (final String url : urls) {
        promises.add(new Job<Boolean>() {
            @Override
            public Boolean doJobWithResult() throws Exception {
                try {
                    WS.HttpResponse result = WS.url(url).followRedirects(true)
                            .timeout("30s").get();
                    return result.success();
                } catch (Exception e) {
                    return false;
                }
            }
        }.now());
    }
    F.Promise<List<Boolean>> allResults = F.Promise.waitAll(promises);
    List<Boolean> booleans = await(allResults);
    return Booleans3.and(booleans);
}
Is there a way to make the async implementation work?
Set the job pool settings in application.conf:
# Jobs executor
# ~~~~~~
# Size of the Jobs pool
play.jobs.pool=20
# Execution pool
# ~~~~~
# Default to 1 thread in DEV mode or (nb processors + 1) threads in PROD mode.
# Try to keep a low as possible. 1 thread will serialize all requests (very useful for debugging purpose)
play.pool=5
And just put the checking part in a job, such as CheckingJob, and start it using:
new CheckingJob().now()
It will then be async.
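A minimal sketch of such a job, assuming Play 1.x's Job API and delegating to the synchronous checkUrlsAreReachable shown above (the class name CheckingJob comes from the answer, the rest is illustrative):
// Hypothetical job wrapping the synchronous URL checks so a controller
// can await the result without tying up the HTTP request threads.
public class CheckingJob extends play.jobs.Job<Boolean> {

    private final String[] urls;

    public CheckingJob(String... urls) {
        this.urls = urls;
    }

    @Override
    public Boolean doJobWithResult() throws Exception {
        return checkUrlsAreReachable(urls);
    }
}
A controller can then do boolean ok = await(new CheckingJob(urls).now()); which suspends the request until the job completes on the jobs pool.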
