Camel Restlet maxThreads not working as expected - java

I am working on an application where a lot of Camel routes are exposed as Restlet routes; let's call them endpoints. These endpoints are consumed by an Angular application. Each endpoint calls a 3rd-party system to gather data and, after processing it, passes the response back to the Angular application.
There are times when the 3rd-party system is very slow, and in such cases our server's (WebSphere 8.5.5.9) thread pool reaches its maximum size (because most of the threads are waiting for a response from the 3rd party). As a result there are no threads available for the other parts of the application (those that do not interact with the server via these endpoints), so they suffer as well.
So basically we want to limit the number of requests served by these endpoints when the server is considerably overloaded, so that the other parts of the application are not affected. We therefore wanted to play around with the number of threads that can process incoming requests on any of the endpoints. To do that, as a proof of concept I used this example: https://github.com/apache/camel/tree/master/examples/camel-example-restlet-jdbc
In this example I changed the following configuration
<bean id="RestletComponentService" class="org.apache.camel.component.restlet.RestletComponent">
  <constructor-arg ref="RestletComponent" />
  <property name="maxQueued" value="0" />
  <property name="maxThreads" value="1" />
</bean>
And in org.apache.camel.example.restlet.jdbc.MyRouteConfig I added a 20-second sleep to one of the direct routes, as follows:
from("direct:getPersons")
    .process(exchange -> Thread.sleep(20000)) // simulate a slow 3rd-party call
    .setBody(simple("select * from person"))
    .to("jdbc:dataSource");
Now my assumption (based on my reading of the Camel documentation at http://camel.apache.org/restlet.html) is that only one request can be served at a given time, and that no other requests will be accepted while the original request is still in progress (since maxQueued is set to 0). But that is not what happens: with this code I can call the endpoint many times concurrently, and all of the calls return a response after 20 seconds and a few millis.
I have been searching for this kind of setup for the last few days and haven't found anything yet. I want to understand whether I am doing something wrong or have misunderstood the documentation.
The Camel version used here is 2.23.0-SNAPSHOT.

Instead of trying to configure the thread pool of a Camel component, you could use Camel Hystrix to control the downstream calls of your application with the Circuit Breaker pattern.
As soon as the downstream service returns errors or responds too slowly, you can return an alternative response to the caller.
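As a sketch, assuming Camel 2.18+ with the camel-hystrix component on the classpath (the downstream URI, timeout value, and fallback message below are placeholders), the slow 3rd-party call could be wrapped like this in the same Spring XML DSL the example project uses:

```xml
<route>
  <from uri="direct:getPersons"/>
  <hystrix>
    <hystrixConfiguration executionTimeoutInMilliseconds="2000"/>
    <!-- the protected downstream call; URI is a placeholder -->
    <to uri="http4://thirdparty.example.com/api/persons"/>
    <onFallback>
      <!-- returned immediately when the call times out or the circuit is open -->
      <transform>
        <constant>Service temporarily unavailable</constant>
      </transform>
    </onFallback>
  </hystrix>
</route>
```

With this in place, slow 3rd-party responses fail fast instead of pinning a server thread for the full duration of the downstream call.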

Related

Spring boot - Threads / Feign-Client / Messaging / Streamlistener

We struggle to find a solution for the following scenario:
Situation
Receive a message via Spring Cloud Streamlistener
Invoke a REST-Service via Feign-Client
We have configured several Feign RequestInterceptors to enrich request header data.
We want to avoid passing every request header on each method call, and we like the central configuration approach of the request interceptors.
Problem:
How do we access data from a specific message that contains information which needs to be added to every request via the Feign RequestInterceptor?
We don't have a request context, since we start from a message.
Can we be sure that the message consumption and the REST call happen on the same thread? If so, we could use a NamedThreadLocal to store the information.
Yes, unless you hand off to another thread in your StreamListener, the REST call will be made on the same thread (assuming you are using RestTemplate and not the reactive web client).
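A minimal, framework-free sketch of that idea: the listener thread stores the message's data in a ThreadLocal, and an interceptor-like hook running on the same thread reads it back. `HeaderContext` and the header name are hypothetical; in the real setup the `set()` would happen in the StreamListener and the `get()` in a Feign RequestInterceptor.

```java
// Sketch: pass per-message data to an interceptor via a ThreadLocal.
// HeaderContext is a hypothetical holder, not Spring or Feign API.
public class HeaderContext {
    private static final ThreadLocal<String> CORRELATION_ID = new ThreadLocal<>();

    public static void set(String id) { CORRELATION_ID.set(id); }
    public static String get()        { return CORRELATION_ID.get(); }
    public static void clear()        { CORRELATION_ID.remove(); } // avoid leaks on pooled threads

    // Simulates the interceptor: builds the header value on the current thread.
    public static String headerForCurrentRequest() {
        return "X-Correlation-Id: " + get();
    }

    public static void main(String[] args) {
        HeaderContext.set("msg-42");              // done when the message arrives
        try {
            // the REST call happens on the same thread, so the interceptor sees the value
            System.out.println(headerForCurrentRequest());
        } finally {
            HeaderContext.clear();                // always clean up on pooled listener threads
        }
    }
}
```

The `clear()` in a finally block matters: listener threads are pooled, so a stale value would otherwise leak into the next message handled by the same thread.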

Java commons-httpclient: Testing timeout values

Abstract: how do developers integration-test timeouts for HTTP requests?
Backstory: My team is having issues related to unusually long-lasting HTTP requests. We use Apache commons-httpclient version 3. The code looks similar to this:
PostMethod post = new PostMethod(endpoint);
post.getParams().setSoTimeout(someInt);
httpClient.executeMethod(post);
The time to complete this request is usually acceptable (2 seconds or so), but occasionally we see 50-60 second requests despite having our SO timeout set to 4 seconds. This prompted me to do some research, and I found that most people set connection timeouts AND SO timeouts. It appears that the SO timeout should be set lower (since it simply times the gap between bytes in transit), while the connection timeout is what we originally planned to use (i.e. the initial delay between the request and the first byte returned).
Here is the code we scraped and plan on using:
httpClient.getHttpConnectionManager().getParams()
.setConnectionTimeout(someInt);
httpClient.getHttpConnectionManager().getParams()
.setSoTimeout(someInt);
The main pain here is that we are unable to integration-test this change. More precisely, we are confused about how to integration-test the delays coming from a socket connection to a foreign server. After digging through commons-httpclient, I see protected and private classes that we would have to reproduce (because they cannot be extended or used from outside the class), mock, and string together to ultimately get down to Java's socket class (which relies on a native method that we would also need to reproduce and inject via mocks, something I don't see often at that level).
The reason I am reaching out to Stack Overflow is to see how others are testing, or not testing, this. I want to avoid testing this functionality in a performance environment at all costs.
Another thought of mine was to set up a mock server that responds to the httpclient with a programmable delay. I haven't seen an example of that yet.
First of all, there is no such thing as unit testing HTTP requests; that would be integration testing.
Secondly, you can use a tool like JMeter to send HTTP requests and test whether the response is received within a certain amount of time.
Taking the mock server route, I managed to set up a small web server with an API endpoint that I could test against. Here is the related code:
Lightweight server setup:
TJWSEmbeddedJaxrsServer server = new TJWSEmbeddedJaxrsServer();
server.setPort(SERVER_PORT);
server.getDeployment().getResources().add(new TestResource());
server.start();
API Endpoint:
/**
 * In order to test the timeout, the resource will be injected into an embedded server.
 * Each endpoint should have a unique use case.
 */
@Path("tests")
public class TestResource {
    @POST
    @Produces({MediaType.APPLICATION_XML})
    @Path("socket-timeout")
    public Response testSocketTimeout() throws InterruptedException {
        Thread.sleep(SOCKET_TIMEOUT_SLEEP);
        return Response.ok().build();
    }
}
Within the class backing the API endpoint, I can control the sleep time, which then triggers a socket timeout within the httpclient class. It's a bit hacky, but it works to test the functionality in the way I wanted (simple, lightweight and effective).
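The same delay-then-assert idea can be sketched with only the JDK (no RESTEasy dependency): an embedded `com.sun.net.httpserver.HttpServer` sleeps longer than the client's read timeout, so the client must observe a `SocketTimeoutException`. The paths and delay values are arbitrary, and `HttpURLConnection` stands in for commons-httpclient to keep the sketch dependency-free.

```java
import com.sun.net.httpserver.HttpServer;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.SocketTimeoutException;
import java.net.URL;

public class TimeoutIT {

    // Starts a server that sleeps past the client's read timeout and
    // returns true only if the client throws SocketTimeoutException.
    public static boolean clientTimesOut() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/tests/socket-timeout", exchange -> {
            try {
                Thread.sleep(2000);               // longer than the client's SO timeout
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            exchange.sendResponseHeaders(200, -1); // no body
            exchange.close();
        });
        server.start();
        int port = server.getAddress().getPort();
        try {
            URL url = new URL("http://localhost:" + port + "/tests/socket-timeout");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(1000);
            conn.setReadTimeout(500);              // the timeout under test
            conn.getResponseCode();                // should block, then time out
            return false;                          // no timeout: the test should fail
        } catch (SocketTimeoutException expected) {
            return true;                           // timeout observed as intended
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("client timed out: " + clientTimesOut());
    }
}
```

An integration test would simply assert that `clientTimesOut()` returns true; the same server can host additional contexts, one per timeout scenario, mirroring the "unique use case per endpoint" idea above.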

Spring Integration HTTP Inbound Gateway Request Overlap

I have an HTTP inbound gateway in my integration application, which I call during a save operation. If I have one product, I call the API once; if I have more than one, I call it multiple times. The problem is that for a single invocation SI works just fine, but for multiple calls the requests and responses get mixed up. I thought Spring Integration channels are just like MQs, but are they not?
Let me explain. Let's say I have 2 products. First I invoke SI for product A and then for B. The response for A gets mapped to the request for B! It happens every time. I don't want to use a dirty hack like waiting for the first response to come back before invoking again; that means the system has to wait a long time. I guess we could do this in Spring Integration using a task executor, but with all the basic samples out there I can't find the right one. So please help me figure out how to fix this issue!
My Configuration is :
<int:channel id="n2iMotorCNInvokeRequest" />
<int:channel id="n2iMotorCNInvokeResponse" />
<int:channel id="n2iInvoketransformerOut" />
<int:channel id="n2iInvokeobjTransformerOut" />
<int:channel id="n2iInvokegatewayOut" />

<int-http:inbound-gateway id="i2nInvokeFromPOS"
        supported-methods="GET"
        request-channel="i2nInvokeRequest"
        reply-channel="i2nInvokeResponse"
        path="/postProduct/{Id}"
        mapped-response-headers="Return-Status, Return-Status-Msg, HTTP_RESPONSE_HEADERS"
        reply-timeout="50000">
    <int-http:header name="Id" expression="#pathVariables.Id"/>
</int-http:inbound-gateway>

<int:service-activator id="InvokeActivator"
        input-channel="i2nInvokeRequest"
        output-channel="i2nInvokeResponse"
        ref="apiService"
        method="getProductId"
        requires-reply="true"
        send-timeout="60000"/>

<int:transformer input-channel="i2nInvokeResponse"
        ref="apiTransformer"
        method="retrieveProductJson"
        output-channel="n2iInvokeRequest"/>

<int-http:outbound-gateway request-channel="n2iInvokeRequest"
        reply-channel="n2iInvoketransformerOut"
        url="http://10.xx.xx.xx/api/index.php"
        http-method="POST"
        expected-response-type="java.lang.String"/>

<int:service-activator input-channel="n2iInvoketransformerOut"
        output-channel="n2iInvokeobjTransformerOut"
        ref="apiService"
        method="productResponse"
        requires-reply="true"
        send-timeout="60000"/>
The i2nInvokeFromPOS gateway is what we call from the web application, which is where all the products are created. This integration API fetches that data and posts it to the backend system so that it gets updated in the other POS locations too!
Steps :
I will send the productId to i2nInvokeFromPOS.
apiTransformer -> retrieveProductJson() fetches the product details from the DB based on the ID
Send the request JSON to the backend system using http:outbound-gateway
Get the response from the backend and update the product status as uploaded in the DB. This happens in apiService -> productResponse()
Once the response for A is received, all I get is an HTTP 500 error for request B! But the backend API is just fine.
The framework is completely thread-safe; if you are seeing cross-talk between different requests/responses, then one (or more) of the components the framework is invoking is not thread-safe.
You cannot keep state in instance fields in, for example, code invoked from a service activator.
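To illustrate the point, here is a framework-free sketch (the class and method names are hypothetical, not Spring Integration API) of why state in an instance field cross-talks: one shared handler instance serves all requests, so a field written for request A can be overwritten by request B before A's response is built.

```java
// Illustration of shared-instance state vs. stack-local state.
public class StatefulHandler {
    private String currentRequest;               // shared mutable state: the bug

    // Unsafe: the field may be overwritten by another thread during the pause.
    public String handleUnsafe(String request) throws InterruptedException {
        currentRequest = request;
        Thread.sleep(50);                        // simulates the outbound HTTP call
        return "response-for-" + currentRequest; // may now hold another request's value
    }

    // Safe: state lives on the calling thread's stack, never shared.
    public String handleSafe(String request) throws InterruptedException {
        String local = request;
        Thread.sleep(50);
        return "response-for-" + local;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(new StatefulHandler().handleSafe("A"));
    }
}
```

With two concurrent callers, `handleUnsafe("A")` can return "response-for-B", which is exactly the A-response-mapped-to-B symptom described above; `handleSafe` cannot, because each invocation keeps its state in local variables. The fix in the SI application is the same: pass data through the message payload and headers, not through fields on `apiService` or `apiTransformer`.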

callback function do background jobs after the completing the action in java spring

I am new to the Java Spring Framework; I am a Rails developer. I have a requirement in Spring where I need to run background jobs after the response has been sent to the end user. The response should not wait for the jobs to complete, but the jobs should run every time the action completes.
It is a webservice app. We have Service, BO and DAO layers, and we log any exceptions that occur while processing the user's data in the database before the response is sent to the user. Now we want to move the exception handling to after the response is sent, to improve performance.
I remember that in Rails we have callbacks/filters: after the action executes, Rails calls the methods we want executed. Is the same available in Spring?
Thanks,
Senthil
I assume the use case is something like a user requests a long-running task, and you want to return a response immediately and then launch the task in the background.
Spring can help with this. See
http://docs.spring.io/spring/docs/3.2.x/spring-framework-reference/html/scheduling.html
In particular, see the @Async annotation.
With respect to the client getting a response back following the async processing (exception or otherwise), you can do it, but it's extra work.
Normally the immediate response would include some kind of ID that the client can come back with after some period of time. (For example, when you run a search against the Splunk API, it gives you a job ID, and you come back later with that job ID to check on the result.) If this works for you, do that; the client has to poll, but the implementation is the simplest.
If not, then you have to have some way for the client to listen for the response. This could be a "reply-to" web service endpoint on the client (perhaps passed in with the original request as a custom X-Reply-To HTTP header), or it could be a message queue, etc.
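A framework-free sketch of the fire-and-return pattern that Spring's @Async automates (class and method names are hypothetical): the handler submits the slow work, here standing in for the exception logging, to a separate pool and returns the response without waiting for it.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class AsyncAfterResponse {
    private static final ExecutorService background = Executors.newFixedThreadPool(2);

    // Returns immediately; the background job runs on the pool afterwards.
    static String handleRequest(String payload) {
        CompletableFuture.runAsync(() -> logToDatabase(payload), background);
        return "OK";                              // the response does not wait for the job
    }

    static void logToDatabase(String payload) {
        // placeholder for the real DAO call that persists the exception log
        System.out.println("logged: " + payload);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(handleRequest("order-1"));
        background.shutdown();
        background.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

In Spring you would instead put the slow work in a method annotated with @Async on a bean (with @EnableAsync, or `<task:annotation-driven/>` in XML), and the framework supplies the executor for you.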

Can multiple camel routes cause a very large number of threads?

I will clarify my question.
I have a task to integrate two systems: a frontend serving HTML, and a backend that provides data to the frontend.
The backend has a very large REST API, so I have to use multiple routes.
I planned to use a single Camel context and wrap all the routes in it.
<camelContext xmlns="http://activemq.apache.org/camel/schema/spring">
  <route>
    <from uri="direct:data"/>
    <to uri="ahc:http://localhost/data"/>
  </route>
  <!-- And so on. More than 70 routes -->
</camelContext>
Then I planned to invoke the route using the @Produce annotation on a service method, as advised in the Hiding Middleware article:
public interface Service {
    String data();
}

public class MyBean {
    @Produce(uri = "direct:data")
    protected Service producer;

    public void doSomething() {
        // let's send a message
        String response = producer.data();
    }
}
As I understand the information taken from here and here, I'll end up with an additional 70 threads in my app (one per route). I fear this could cause a serious performance hit, and as the backend API grows, the thread count will grow with it. Is that correct? How can I avoid it if so? As I understand it, I can't employ an ExecutorService thread pool in this case.
Thanks in advance for any answer.
No, you will not end up with a thread per route. The threading model is often tied to the threading model of the consumer (i.e. the route input).
For example, a route that uses a timer component will use a scheduled thread pool (1 thread), and the JMS component will use 1 or more threads, depending on whether you set concurrentConsumers=N, etc.
The direct component is like a direct method invocation: it uses the caller's thread, so that threading model creates no new threads.
If all your 70 routes use AHC in the <to>, then you may want to re-use the same endpoint so you reuse the thread pool of the AHC library, or alternatively configure a shared pool to be used by all AHC endpoints.
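A sketch of the endpoint re-use in the Spring XML DSL (the endpoint id and URI are placeholders): define the AHC endpoint once and reference it from each route via `ref`, so the routes share one client and its thread pool instead of creating 70 inline endpoints.

```xml
<camelContext xmlns="http://camel.apache.org/schema/spring">
  <!-- one shared AHC endpoint instead of one per route -->
  <endpoint id="backendData" uri="ahc:http://localhost/data"/>

  <route>
    <from uri="direct:data"/>
    <to ref="backendData"/>
  </route>
  <!-- further routes reference the same endpoint via ref="backendData" -->
</camelContext>
```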
And by the way, this question was also posted on the Camel user forum / mailing list: http://camel.465427.n5.nabble.com/Can-multiple-camel-routes-cause-a-very-large-number-of-threads-tp5736620.html
