We have a lot of FTL code that performs hash concatenation like this:
<#local event_data = event_data + {
'subaction_name': subactionName
} />
However, these constructs cause some overhead because of the following code in the freemarker.core.AddConcatExpression class's _getAsTemplateModel(Environment env) method:
try {
    String s1 = getStringValue(leftModel, left, env);
    if(s1 == null) s1 = "null";
    String s2 = getStringValue(rightModel, right, env);
    if(s2 == null) s2 = "null";
    return new SimpleScalar(s1.concat(s2));
} catch (NonStringException e) {
    if (leftModel instanceof TemplateHashModel && rightModel instanceof TemplateHashModel) {
        if (leftModel instanceof TemplateHashModelEx && rightModel instanceof TemplateHashModelEx) {
            TemplateHashModelEx leftModelEx = (TemplateHashModelEx)leftModel;
            TemplateHashModelEx rightModelEx = (TemplateHashModelEx)rightModel;
            if (leftModelEx.size() == 0) {
                return rightModelEx;
            } else if (rightModelEx.size() == 0) {
                return leftModelEx;
            } else {
                return new ConcatenatedHashEx(leftModelEx, rightModelEx);
            }
        } else {
            return new ConcatenatedHash((TemplateHashModel)leftModel,
                    (TemplateHashModel)rightModel);
        }
    } else {
        throw e;
    }
}
because the getStringValue method throws a NonStringException in such cases. NonStringException, in turn, inherits the following constructor logic from TemplateException:
super(getDescription(description, cause));
causeException = cause;
this.env = env;
if(env != null)
{
    StringWriter sw = new StringWriter();
    PrintWriter pw = new PrintWriter(sw);
    env.outputInstructionStack(pw);
    pw.flush();
    ftlInstructionStack = sw.toString();
}
else
{
    ftlInstructionStack = "";
}
which fetches a full instruction stack every time, regardless of how the exception is handled afterwards.
This leads to up to 50 ms of execution time spent on the exception constructors per page request in our case, which is quite critical for the overall throughput.
Could anyone give some advice on how to avoid these exceptions without touching the FTL code, please? Or maybe it's possible to get some kind of patch that would move the TemplateHashModel instanceof check before the catch block in the AddConcatExpression._getAsTemplateModel method?
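For clarity, the kind of reordering I have in mind is sketched below. This is only an illustration based on the snippet quoted above, not actual FreeMarker source; getStringValue, ConcatenatedHash and ConcatenatedHashEx are simply the names from that snippet.
// Sketch only: test for hash operands up front, so the string-coercion path
// (which throws NonStringException for hashes) is never entered for hash + hash.
if (leftModel instanceof TemplateHashModel && rightModel instanceof TemplateHashModel) {
    if (leftModel instanceof TemplateHashModelEx && rightModel instanceof TemplateHashModelEx) {
        TemplateHashModelEx leftModelEx = (TemplateHashModelEx) leftModel;
        TemplateHashModelEx rightModelEx = (TemplateHashModelEx) rightModel;
        if (leftModelEx.size() == 0) return rightModelEx;
        if (rightModelEx.size() == 0) return leftModelEx;
        return new ConcatenatedHashEx(leftModelEx, rightModelEx);
    }
    return new ConcatenatedHash((TemplateHashModel) leftModel, (TemplateHashModel) rightModel);
}
// Otherwise fall through to the existing string concatenation path.
String s1 = getStringValue(leftModel, left, env);
if (s1 == null) s1 = "null";
String s2 = getStringValue(rightModel, right, env);
if (s2 == null) s2 = "null";
return new SimpleScalar(s1.concat(s2));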
EDIT:
freemarker version 2.3.19
Try the latest stable FreeMarker release (2.3.25 at the moment) and see if the speed is acceptable. A commit from 2013-06-13 says:
This reduces TemplateException creation resource usage further (as far as it will be silently handled), as the final message from the blamed expression, tips and so on is only assembled when the message is indeed needed.
2.3.19 was released earlier than that, on 2012-02-29.
Update: I have committed a change into 2.3.26-nightly so that falling back to hash addition doesn't rely on exception throwing/catching at all. Not sure how dominant this was in your performance problem, but it was inefficient for sure.
I have this method in Java, which is called within another method's try block. importFormat returns the class with the assigned values (//do what is needed) in the try block. The method should read a file line by line. If the method containing the try block is called more times than there are lines in the file, importFormat() should return null.
I tried to check for that with an if block, but it doesn't do much and a ClassName instance is always returned. It seems the class always stores the first line from the file.
private ClassName importFormat(BufferedReader br) throws IOException, ParseException {
    String out;
    ClassName t = new ClassName();
    if((out = br.readLine()) == null) return null;
    if(/* line has the expected format */){
        //do what is needed
    }else{
        t = null; //here I unsuccessfully tried to force the method to again return null, no luck
        System.err.print(out);
        throw new ParseException("", 0);
    }
    return t;
}
I have also tried the br.ready() method; it didn't make any difference.
EDIT: I noticed I reproduced the code incorrectly, I'm sorry for that. The following should be clearer.
Minimal reproducible code:
private ClassName foo(BufferedReader br) throws IOException {
    ClassName t = new ClassName();
    String out = null;
    out = br.readLine();
    if(out.equals(null)) return null; //handle the case where there's no more line to read
    if(/* !string read from BufferedReader.isEmpty() */){
        //do something
    }else{
        t = null; //ensure that null would be returned
        //do something more unrelated to this question
    }
    return t;
}
I don't really understand your question, but I think you should not compare like this:
if((out = br.readLine()) == null) return null;
To compare strings in Java, use str1.equals(str2) instead. So I think you should try:
if(out.equals(br.readLine())) {
    //do sth here because "out" exists in BufferReader.
} else {
    System.out.println("Continue searching...\n");
}
return t;
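For reference, here is a small self-contained sketch of the usual end-of-input check with BufferedReader, where readLine() returns null once the file has no more lines. The Record class, file name, and parse step are made up for illustration only:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class LineReaderSketch {
    // Illustrative record type; stands in for ClassName from the question.
    static class Record {
        final String raw;
        Record(String raw) { this.raw = raw; }
    }

    // Returns the next record, or null once there are no more lines to read.
    static Record nextRecord(BufferedReader br) throws IOException {
        String line = br.readLine();
        if (line == null) {
            return null; // end of file reached
        }
        return new Record(line); // "do what is needed" with the line here
    }

    public static void main(String[] args) throws IOException {
        try (BufferedReader br = new BufferedReader(new FileReader("input.txt"))) {
            Record r;
            while ((r = nextRecord(br)) != null) {
                System.out.println(r.raw);
            }
        }
    }
}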
I have the following code running in my project:
HashMap<String, DeviceData> deviceMap = getAllDevices();
int status = 0;
DeviceHandle devHandle = null;
for (LicenseData licenseData : listLicenses) {
    Map<String, String> licenseMap = licenseData.getLicenseKeyValues();
    if (licenseMap != null && !licenseMap.isEmpty()) {
        String keyDecrypt = licenseMap.get("key");
        Date expiryDate = new Date(Long.parseLong(licenseMap.get("expiryDate")));
        boolean allowForeign = Boolean.parseBoolean(licenseMap.get("allowForeign"));
        String ipDecrypt = licenseMap.get("ipAddress");
        if (expiryDate.compareTo(new Date()) > 0
                || keyDecrypt.equals(licenseData.getKey().getCurrentValueAsString())) {
            try {
                DeviceData device = deviceMap.get(ipDecrypt);
                devHandle = (DeviceHandle) device.getHandle();
                if (device != null && devHandle != null) {
                    deviceMap.remove(ipDecrypt, device);
                    System.out.println("After deletion device map.");
                    System.out.println(deviceMap);
                    createUser(devHandle);
                    try {
                        if (allowForeign) {
                            Process pr = Runtime.getRuntime().exec(SomeOperation);
                            status = pr.waitFor();
                            if (status == 0) {
                                //Debug Statement
                            } else {
                                //Error Debug Statement
                            }
                            deleteUser(devHandle);
                        } else {
                            Process pr = Runtime.getRuntime().exec(SomeOperation);
                            status = pr.waitFor();
                            if (status == 0) {
                                //Debug Statement
                            } else {
                                //Error Debug Statement
                            }
                            deleteUser(devHandle);
                        }
                    } catch (Exception e) {
                        //Exception statement
                        deleteUser(devHandle);
                    }
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
Explanation: I have a list of licenses for my application in listLicenses. All the devices present on the server are in deviceMap. For each license, I decrypt it and get the values. If a license for a device is present, I get a handle on that device and do some operations.
The issue is:
If I am not able to get a handle on a device (getHandle()), or if I am not able to create a user after getting the device handle (createUser()), an exception is thrown. These methods are very hierarchical, i.e. I call them from here, they live in another class and throw their own exceptions, and they in turn call other methods to do their work.
If there are three devices in the map and three licenses, and for the first one I am not able to get a handle or create a user, the device is removed from deviceMap but no further execution happens, i.e. the next two devices are never processed.
If an exception occurs for one device, I want to continue execution for the other two devices. I tried using return but couldn't get it to work.
Please help. Also, please forgive the syntax and any mismatches in the code.
Make use of the first try's catch block.
This is how I handled it when I faced the same kind of situation.
catch (Exception exp) {
    if (exp instanceof NullPointerException) {
        log.info("Invalid/ Inactive ");
    } else if (exp instanceof NonUniqueResultException) {
        log.info("Multiple records existed");
    } else {
        exp.printStackTrace();
        errorMsgs.append("Unexpected Error Occured. Please contact Admin.");
    }
}
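In the same spirit, here is a minimal sketch of keeping the try/catch inside the loop body so that a failure for one device does not stop the remaining iterations. The processDevice method and the device list are illustrative stand-ins for the real getHandle()/createUser() calls:
import java.util.Arrays;
import java.util.List;

public class ContinueOnErrorSketch {

    // Stand-in for the hierarchical calls that may throw (getHandle, createUser, ...).
    static void processDevice(String device) throws Exception {
        if (device.isEmpty()) {
            throw new Exception("could not get a handle for device");
        }
        System.out.println("Processed " + device);
    }

    public static void main(String[] args) {
        List<String> devices = Arrays.asList("dev-1", "", "dev-3");
        for (String device : devices) {
            try {
                processDevice(device);
            } catch (Exception e) {
                // Log and move on; the loop continues with the next device.
                System.err.println("Skipping " + device + ": " + e.getMessage());
            }
        }
    }
}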
I have written a piece of software in Java that checks whether proxies are working by sending an HTTP request through the proxy.
It takes around 30,000 proxies from a database, then attempts to check if they are operational. The proxies received from the database used to be returned as an ArrayList<String>, but have been changed to Deque<String> for reasons stated below.
The way the program works is there is a ProxyRequest object that stores the IP & Port as a String and int respectively. The ProxyRequest object has a method isWorkingProxy() which attempts to send a request using a proxy and returns a boolean on whether it was successful.
This ProxyRequest object is wrapped by a RunnableProxyRequest object that calls super.isWorkingProxy() in the overridden run() method. Based on the response from super.isWorkingProxy(), the RunnableProxyRequest object updates a MySQL database.
Do note that the updating of the MySQL database is synchronized().
It runs on 750 threads using a FixedThreadPool (on a VPS), but towards the end, it becomes very slow (stuck on ~50 threads), which obviously implies the garbage collector is working. This is the problem.
I have attempted the following to improve the lag, but it does not seem to work:
1) Using a Deque<String> proxies and Deque.pop() to obtain the String containing the proxy. This (I believe) continuously makes the Deque<String> smaller, which should reduce lag caused by the GC.
2) Setting con.setConnectTimeout(this.timeout);, where this.timeout = 5000;. This way, the connection should return a result within 5 seconds. If not, the thread is completed and should no longer be active in the thread pool.
Besides this, I don't know any other way I can improve performance.
Can anyone recommend a way to improve performance and avoid/stop the GC-related lag towards the end of the run? I know there is a Stack Overflow question about this (Java threads slow down towards the end of processing), but I have tried everything in the answer and it has not worked for me.
Thank you for your time.
Code snippets:
Loop adding threads to the FixedThreadPool:
//This code is executed recursively (at the end, main(args) is called again)
//Create the threadpool for requests
//Threads is an argument that is set to 750.
ThreadPoolExecutor executor = (ThreadPoolExecutor) Executors.newFixedThreadPool(threads);
Deque<String> proxies = DB.getProxiesToCheck();
while (proxies.isEmpty() == false) {
    try {
        String[] split = proxies.pop().split(":");
        Runnable[] checks = new Runnable[] {
            //HTTP check
            new RunnableProxyRequest(split[0], split[1], Proxy.Type.HTTP, false),
            //SSL check
            new RunnableProxyRequest(split[0], split[1], Proxy.Type.HTTP, true),
            //SOCKS check
            new RunnableProxyRequest(split[0], split[1], Proxy.Type.SOCKS, false)
            //Add more checks to this list as time goes...
        };
        for (Runnable check : checks) {
            executor.submit(check);
        }
    } catch (IndexOutOfBoundsException e) {
        continue;
    }
}
ProxyRequest class:
//Proxy details
private String proxyIp;
private int proxyPort;
private Proxy.Type testingType;
//Request details
private boolean useSsl;

public ProxyRequest(String proxyIp, String proxyPort, Proxy.Type testingType, boolean useSsl) {
    this.proxyIp = proxyIp;
    try {
        this.proxyPort = Integer.parseInt(proxyPort);
    } catch (NumberFormatException e) {
        this.proxyPort = -1;
    }
    this.testingType = testingType;
    this.useSsl = useSsl;
}

public boolean isWorkingProxy() {
    //Case of an invalid proxy
    if (proxyPort == -1) {
        return false;
    }
    HttpURLConnection con = null;
    //Perform checks on URL
    //If any exception occurs here, the proxy is obviously bad.
    try {
        URL url = new URL(this.getTestingUrl());
        //Create proxy
        Proxy p = new Proxy(this.testingType, new InetSocketAddress(this.proxyIp, this.proxyPort));
        //No redirect
        HttpURLConnection.setFollowRedirects(false);
        //Open connection with proxy
        con = (HttpURLConnection) url.openConnection(p);
        //Set the request method
        con.setRequestMethod("GET");
        //Set max timeout for a request.
        con.setConnectTimeout(this.timeout);
    } catch (MalformedURLException e) {
        System.out.println("The testing URL is bad. Please fix this.");
        return false;
    } catch (Exception e) {
        return false;
    }
    try (
        BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
    ) {
        String inputLine = null;
        StringBuilder response = new StringBuilder();
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine);
        }
        //A valid proxy!
        return con.getResponseCode() > 0;
    } catch (Exception e) {
        return false;
    }
}
RunnableProxyRequest class:
public class RunnableProxyRequest extends ProxyRequest implements Runnable {
    public RunnableProxyRequest(String proxyIp, String proxyPort, Proxy.Type testingType, boolean useSsl) {
        super(proxyIp, proxyPort, testingType, useSsl);
    }

    @Override
    public void run() {
        String test = super.getTest();
        if (super.isWorkingProxy()) {
            System.out.println("-- Working proxy: " + super.getProxy() + " | Test: " + test);
            this.updateDB(true, test);
        } else {
            System.out.println("-- Not working: " + super.getProxy() + " | Test: " + test);
            this.updateDB(false, test);
        }
    }

    private void updateDB(boolean success, String testingType) {
        switch (testingType) {
            case "SSL":
                DB.updateSsl(super.getProxyIp(), super.getProxyPort(), success);
                break;
            case "HTTP":
                DB.updateHttp(super.getProxyIp(), super.getProxyPort(), success);
                break;
            case "SOCKS":
                DB.updateSocks(super.getProxyIp(), super.getProxyPort(), success);
                break;
            default:
                break;
        }
    }
}
DB class:
//Locker for async
private static Object locker = new Object();

private static void executeUpdateQuery(String query, String proxy, int port, boolean toSet) {
    synchronized (locker) {
        //Some prepared statements here.
    }
}
Thanks to Peter Lawrey for guiding me to the solution! :)
His comment:
@ILoveKali I have found network libraries are not aggressive enough in shutting down a connection when things go really wrong. Timeouts tend to work best when the connection is fine. YMMV
So I did some research and found that I also had to call setReadTimeout(this.timeout);. Previously, I was only calling setConnectTimeout(this.timeout);!
Thanks to this post (HttpURLConnection timeout defaults), which explained the following:
Unfortunately, in my experience, it appears using these defaults can lead to an unstable state, depending on what happens with your connection to the server. If you use an HttpURLConnection and don't explicitly set (at least read) timeouts, your connection can get into a permanent stale state. By default. So always set setReadTimeout to "something" or you might orphan connections (and possibly threads depending on how your app runs).
So the final answer is: the GC was doing just fine; it was not responsible for the lag. The threads were simply stuck FOREVER at a single number because I did not set the read timeout, so the isWorkingProxy() method never got a result and kept reading.
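For anyone hitting the same thing, here is a minimal sketch of setting both timeouts on an HttpURLConnection; the URL and the 5000 ms value are just examples:
import java.net.HttpURLConnection;
import java.net.URL;

public class TimeoutSketch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("GET");
        con.setConnectTimeout(5000); // max time to establish the connection
        con.setReadTimeout(5000);    // max time to wait for data once connected
        System.out.println("Response code: " + con.getResponseCode());
        con.disconnect();
    }
}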
I have a block of code that deserializes multiple objects from a file. How can I avoid using while(true)?
ObjectInputStream in = new ObjectInputStream(new FileInputStream(filename));
while (true) {
    try {
        MyObject o = (MyObject) in.readObject();
        // Do something with the object
    } catch (EOFException e) {
        break;
    }
}
in.close();
You should either write a collection (with a size) or put a marker before each object:
try {
    for (; in.readBoolean();) {
        MyObject o = (MyObject) in.readObject();
    }
} catch (EOFException e) {
    // ...
}
When you write your objects, write a boolean just before each one (it will, however, take 1 byte, if I remember that part correctly):
for (MyObject o : iterable) {
    out.writeBoolean(true);
    out.writeObject(o);
}
out.writeBoolean(false);
If iterable is a collection or map, you can use default serialization:
out.writeObject(iterable); // default collection serialization
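For instance, a small sketch of that whole-collection approach, assuming MyObject implements Serializable; the file name is arbitrary, and the enclosing method must handle IOException (and ClassNotFoundException on the reading side):
// Writing side: the whole collection goes through a single writeObject call.
List<MyObject> items = new ArrayList<>();
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("data.bin"))) {
    out.writeObject(items);
}

// Reading side: one readObject call, then a cast back to the collection type.
try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("data.bin"))) {
    @SuppressWarnings("unchecked")
    List<MyObject> read = (List<MyObject>) in.readObject();
    for (MyObject o : read) {
        // Do something with each object
    }
}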
Besides, don't catch an exception for each item; catch it globally (especially EOFException!). It is better for performance.
I don't know if you work with Java 7, but your code + my for loop can be written like this:
try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(filename))) {
    for (; in.readBoolean();) {
        MyObject o = (MyObject) in.readObject();
    }
} catch (EOFException e) {
    // ...
}
// no need to close, the try-with-resources does the job for you.
How can I avoid using while(true)?
You can't.
More to the point, why do you think you want to?
This is a classic example of the tail wagging the dog. EOFException is thrown to indicate end of stream. Ergo you have to catch it, and ergo you have to loop until it is thrown, ergo you have to use while (true) or one of its cognates.
The exception thought police would have you prepend an object count, taking the curious position that external data structures should be designed to suit the coder's phobias, and overlooking that you may not know it in advance, or may need to change your mind, or may need to exit prematurely; or would have you write a null as an end-of-stream marker, overlooking that it prevents the use of null for any other purpose; and in both cases overlooking the fact that the API is already designed to throw EOFException, and already works the way it already works, so you already have to code accordingly.
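As an aside, the EOFException-terminated loop can at least be confined to one small helper, so the while (true) appears in exactly one place. A sketch only; MyObject stands for any serializable type, and the enclosing class needs the usual java.io and java.util imports:
// Reads objects until EOFException signals the end of the stream.
static List<MyObject> readAll(ObjectInputStream in) throws IOException, ClassNotFoundException {
    List<MyObject> result = new ArrayList<>();
    while (true) {
        try {
            result.add((MyObject) in.readObject());
        } catch (EOFException e) {
            return result; // end of stream reached
        }
    }
}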
The code that I'm proposing lets you serialize and deserialize multiple objects really easily, without any problems and without the (in my opinion) awful while(true):
public class EntityClass implements Serializable {

    private int intVal;
    private String stringVal;

    public EntityClass(int intVal, String stringVal) {
        this.intVal = intVal;
        this.stringVal = stringVal;
    }

    @Override
    public String toString() {
        return "EntityClass{" +
                "intVal=" + intVal +
                ", stringVal='" + stringVal + '\'' +
                '}';
    }

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        EntityClass a = new EntityClass(1, "1");
        EntityClass b = new EntityClass(2, "2");
        EntityClass c = new EntityClass(3, "3");

        ObjectOutputStream stream = new ObjectOutputStream(new FileOutputStream("out"));
        stream.writeObject(a);
        stream.writeObject(b);
        stream.writeObject(c);
        stream.close();

        ObjectInputStream streamRead = new ObjectInputStream(new FileInputStream("out"));
        EntityClass[] entities = new EntityClass[3];
        int cont = 0;
        try {
            while (streamRead.available() >= 0) {
                entities[cont] = (EntityClass) streamRead.readObject();
                System.out.println(entities[cont]);
                cont++;
            }
        } catch (EOFException exp) {
        } finally {
            streamRead.close();
        }
    }
}
I'm writing a file reader that returns an object and I'd like it to warn on parse errors and continue to the next record.
The code below is the obvious implementation of this, but involves recursing from inside the catch block. Is there any technical or stylistic reason not to do this?
public RecordType nextRecord() throws IOException {
    if (reader == null) {
        throw new IllegalStateException("Reader closed.");
    }
    String line = reader.readLine();
    if (line == null) {
        return null;
    } else {
        try {
            return parseRecord(line);
        } catch (ParseException pex) {
            logger.warn("Record ignored due to parse error: "
                    + pex.getMessage());
            //Note the recursion here
            return nextRecord();
        }
    }
}
I would prefer to use a loop. With recursion, you never know how deep you can safely go.
String line;
while ((line = reader.readLine()) != null) {
    try {
        return parseRecord(line);
    } catch (ParseException pex) {
        logger.warn("Record ignored due to parse error: " + pex);
    }
}
return null;
Why not replace the recursion with a loop:
public RecordType nextRecord() throws IOException {
    if (reader == null) {
        throw new IllegalStateException("Reader closed.");
    }
    for (;;) {
        String line = reader.readLine();
        if (line == null) {
            return null;
        } else {
            try {
                return parseRecord(line);
            } catch (ParseException pex) {
                logger.warn("Record ignored due to parse error: "
                        + pex.getMessage());
                // continue to the next record
            }
        }
    }
}
Stylistically, I find this preferable.
Would it be cleaner to let the ParseException propagate back to the caller? The caller could then decide what to do about it.
What it seems like to me is that whatever is calling your method is going to keep calling it until the method returns null.
I would probably follow the advice of the previous posters and use a loop; however, I would also look at whatever is calling the method (as it is probably already using a loop) and have it skip the line when an exception is thrown.
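For illustration, a rough sketch of that caller-side approach: nextRecord() would declare ParseException, and the calling loop skips any record whose parse fails. The names mirror the question, but the caller itself is made up:
// Caller-side handling: the reading loop decides what to do with bad records.
List<RecordType> records = new ArrayList<>();
while (true) {
    try {
        RecordType record = reader.nextRecord(); // would now declare ParseException
        if (record == null) {
            break; // end of input
        }
        records.add(record);
    } catch (ParseException pex) {
        // The bad line has already been consumed, so just log and keep reading.
        logger.warn("Record skipped due to parse error: " + pex.getMessage());
    }
}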