Java exception out of memory - java

This is extremely strange. I am getting an out of memory exception when trying to create a file. However, there is more than enough space on the disk for the file to be created.
The application is running on a Citrix server (not sure if that is even relevant).
I have no idea where to even start to debug this since I can clearly see that there is space on the disk.
The file I'm trying to create is 4 KB and named history.db.
Any ideas here?
This is the code I'm using to create the file:
try {
    String databaseFileLocation = "";
    String fileSeparator = System.getProperty("file.separator");
    String homeDir = System.getProperty("user.home");
    File myAppDir = new File(homeDir, ".imbox");
    if (System.getProperty("os.name").contains("Windows")) {
        databaseFileLocation = "jdbc:sqlite:" + myAppDir + fileSeparator + "history_" + agentID + ".db";
    } else if (System.getProperty("os.name").contains("Mac")) {
        databaseFileLocation = "jdbc:sqlite:history_" + agentID + ".db";
    }
    Class.forName("org.sqlite.JDBC");
    Connection conn = DriverManager.getConnection(databaseFileLocation);
    Statement stat = conn.createStatement();
    stat.executeUpdate("CREATE TABLE IF NOT EXISTS visitorInfo (channelID text UNIQUE, currentPage, userCountry, userCity, org);");
    stat.executeUpdate("CREATE TABLE IF NOT EXISTS chatHistory (channelID, sender, message, recipient, time);");
    stat.executeUpdate("ALTER TABLE visitorInfo ADD COLUMN visitorTag;");
} catch (Exception eef) {
    eef.printStackTrace();
    final ImageIcon icon = new javax.swing.ImageIcon(getClass().getResource("/resources/warning_icon.gif"));
    JOptionPane.showMessageDialog(null, "Failed to create database file.\n\nError description:\n" + eef.getMessage(),
            "Error when creating database file", JOptionPane.WARNING_MESSAGE, icon);
}

I can only say so much given such a small piece of code and no stack trace, but judging by how this code is written I'll make a few assumptions.
Resources are being opened and never closed here. If that is happening in this snippet, I can imagine the same thing is happening elsewhere in the code, and in that case there is likely other unsafe code around.
With so little information, the best I can say is that if the code has been running for a while, you have a memory leak somewhere, probably from unclosed resources. If the application has been started and restarted more than a few times, it may not be shutting down cleanly and old instances of the program may be hanging around.
To properly figure this out, we would need more code and/or a stack trace. Running jps on the machine might help as well; otherwise we don't have enough information to say more.
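As a first step, closing the JDBC resources deterministically rules out the most obvious leak. Here is a minimal sketch of the same setup using try-with-resources, assuming the databaseFileLocation and agentID values from the question:

Class.forName("org.sqlite.JDBC");
try (Connection conn = DriverManager.getConnection(databaseFileLocation);
     Statement stat = conn.createStatement()) {
    // conn and stat are closed automatically, even if an update throws
    stat.executeUpdate("CREATE TABLE IF NOT EXISTS visitorInfo (channelID text UNIQUE, currentPage, userCountry, userCity, org);");
    stat.executeUpdate("CREATE TABLE IF NOT EXISTS chatHistory (channelID, sender, message, recipient, time);");
} catch (Exception eef) {
    eef.printStackTrace();
}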

Related

Java: Writing Variables to File, and Reading back

I am currently using Eclipse Java Neon as my IDE, and I am trying to implement a save and load feature in a project I am currently working on. I know it requires me to use a try/catch block, but I have no idea how to really handle it. Not only that, but what I tried is giving me a bit of an error:
try {
    System.out.println("Writing to file...");
    charWrite = new FileWriter("player.dat");
    charWrite.write(player.getName());     // String
    charWrite.write(player.getJob());      // String
    charWrite.write(player.getLevel());    // int
    charWrite.write(player.getCurrency()); // int
    charWrite.write(player.getHealth());   // int
    charWrite.write(player.getExp());      // int
}
catch (IOException excpt) {
    System.out.println("Caught IOException: " + excpt.getMessage());
}
The system seems to recognize what is happening, but when I open the file to see if anything has been written, it is still blank.
And if I am this lost on writing, I am going to be even more lost when reading the data back into the class's fields.
Thanks for the help.
You are trying to write an object of type java.lang.Class to a file. If you want the String representation of the class name use toString():
charWrite.write(player.getClass().toString());
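Separately, note two things about the question's snippet: a FileWriter buffers output, so the file stays blank until the writer is flushed or closed, and Writer.write(int) interprets an int as a single character code rather than printing the number. A minimal sketch that addresses both, assuming the player getters from the question:

try (PrintWriter out = new PrintWriter(new FileWriter("player.dat"))) {
    // println converts ints to their decimal text form, and
    // try-with-resources guarantees the file is flushed and closed
    out.println(player.getName());
    out.println(player.getJob());
    out.println(player.getLevel());
    out.println(player.getCurrency());
    out.println(player.getHealth());
    out.println(player.getExp());
} catch (IOException excpt) {
    System.out.println("Caught IOException: " + excpt.getMessage());
}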

OutOfMemoryExceptions on Production Servers

Can the following code snippets leak memory?
A
BufferedWriter logoutput;
FileWriter fstream = null;
try {
    Calendar cal = Calendar.getInstance();
    SimpleDateFormat sdf = new SimpleDateFormat(DATE_FILE_FORMAT_NOW);
    fstream = new FileWriter("..\\GetInfoLogs\\" + sdf.format(cal.getTime()) + ".log", true);
    logoutput = new BufferedWriter(fstream);
    logoutput.write(statement);
    // Missing fstream.close();
    logoutput.close();
} catch (IOException e) {
    System.err.println("Unable to write to file");
}
B
String info[] = {"", ""};
try {
    conn.setAutoCommit(false);
    Statement stmt = conn.createStatement();
    ResultSet rset = stmt.executeQuery("select ....");
    boolean hasRows = rset.next();
    if (!hasRows) {
        stmt.close();
        return info;
    } else {
        info[0] = rset.getString(1);
        info[1] = rset.getString(2);
    }
    // MISSING rset.close();
    stmt.close();
} catch (SQLException e) {
    logTransaction(service, "error at getPFpercentage: " + e.getMessage() + " ");
}
I would recommend using YourKit Java Profiler, as it is a very intuitive and easy-to-use tool.
Start your application locally, connect the profiler to it, and run through some of your application's use cases.
No, they can't.
Objects are collected by the garbage collector when there are no longer any live references to them in a program. This is in general unrelated to whether they have been closed.
The only way closing an object (or calling any other method on it) could affect its eligibility for collection would be if some global structure held a reference to the object, and closing it had the side effect of removing it from that structure. I am not aware of any such structure in the JDK's IO libraries. Indeed, IO classes in the JDK are generally designed to close themselves when they get garbage collected, which would be pretty futile if their being open prevented them from being collected.
Database classes like Connections are a bit trickier, because they have implementations provided by the JDBC driver. It is possible that a poorly written JDBC driver would prevent unclosed objects from being collected. It seems unlikely, though, as that would be a huge screwup, frankly.
You can use the JDK's jmap tool to get a heap dump of a running application. You can then analyse this to try to work out why your application is using so much memory. Be warned that the dump files are huge (bigger than the dumped heap), and analysing them is a real pain. A colleague of mine has got good results using the Eclipse Memory Analyzer plugin.
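That said, even though unclosed streams don't by themselves stop objects from being collected, closing them deterministically is still worth doing, since file handles are a finite resource. A sketch of snippet A using try-with-resources, assuming the statement and DATE_FILE_FORMAT_NOW values from the question:

Calendar cal = Calendar.getInstance();
SimpleDateFormat sdf = new SimpleDateFormat(DATE_FILE_FORMAT_NOW);
try (BufferedWriter logoutput = new BufferedWriter(
        new FileWriter("..\\GetInfoLogs\\" + sdf.format(cal.getTime()) + ".log", true))) {
    // closing the BufferedWriter also closes the wrapped FileWriter
    logoutput.write(statement);
} catch (IOException e) {
    System.err.println("Unable to write to file");
}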

Create a database / execute a bunch of mysql statements from Java

I have a library that needs to create a schema in MySQL from Java. Currently, I have a dump of the schema that I just pipe into the mysql command. This works okay, but it is not ideal because:
It's brittle: the mysql command needs to be on the path, which usually doesn't work on OS X or Windows without additional configuration.
Also brittle because the schema is stored as statements, not descriptively
Java already can access the mysql database, so it seems silly to depend on an external program to do this.
Does anyone know of a better way to do this? Perhaps...
I can read the statements in from the file and execute them directly from Java? Is there a way to do this that doesn't involve parsing semicolons and dividing up the statements manually?
I can store the schema in some other way - either as a config file or directly in Java, not as statements (in the style of rails' db:schema or database.yml) and there is a library that will create the schema from this description?
Here is a snippet of the existing code, which works (when mysql is on the command line):
if (db == null) throw new Exception("Need database name!");
String userStr = user == null ? "" : String.format("-u %s ", user);
String hostStr = host == null ? "" : String.format("-h %s ", host);
String pwStr = pw == null ? "" : String.format("-p%s ", pw);
String cmd = String.format("mysql %s %s %s %s", hostStr, userStr, pwStr, db);
System.out.println(cmd + " < schema.sql");

final Process pr = Runtime.getRuntime().exec(cmd);
new Thread() {
    public void run() {
        try (OutputStream stdin = pr.getOutputStream()) {
            Files.copy(f, stdin);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}.start();
new Thread() {
    public void run() {
        try (InputStream stdout = pr.getInputStream()) {
            ByteStreams.copy(stdout, System.out);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}.start();

int exitVal = pr.waitFor();
if (exitVal == 0)
    System.out.println("Create db succeeded!");
else
    System.out.println("Exited with error code " + exitVal);
The short answer (as far as I know) is no.
You will have to do some parsing of the file into separate statements.
I have faced the same situation, and you can find many questions on this topic here on SO.
Some, like here, show a parser; others point to tools, like this post from Apache, that can convert the schema to an XML format and then read it back.
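If the dump really is just a list of plain statements, a naive splitter can be quite short. A hedged sketch (it assumes a schema.sql file next to the application, and it will break on stored procedures, triggers, or semicolons inside string literals):

// Naive splitter: fine for plain CREATE/INSERT dumps, wrong for
// stored procedures or semicolons embedded in string literals.
String script = new String(Files.readAllBytes(Paths.get("schema.sql")), StandardCharsets.UTF_8);
try (Statement stmt = conn.createStatement()) {
    for (String sql : script.split(";")) {
        if (!sql.trim().isEmpty()) {
            stmt.execute(sql);
        }
    }
}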
My main intention in writing this answer is to say that I chose to use the command line in the end.
Extra configuration: it may be additional work, but you can handle it through config or at runtime based on the system you are running on. You do the effort one time and you are done.
Depending on an external tool: it is not as bad as it seems; you get some benefits too.
1. You don't need to write extra code or introduce additional libraries just for parsing the schema commands.
2. The tool is provided by the vendor, so it is probably better debugged and tested than any other code that would do the parsing.
3. It is safer in the long run: any additions or changes in the dump format that might break a hand-written parser will most probably be supported by the tool that ships with the database release, so you won't need to change your code.
4. The nature of the action where you would use the tool (creating a schema) does not suggest frequent usage, minimizing the risk of it becoming a performance bottleneck.
I hope you can find the best solution for your needs.
Check out Yank, and more specifically the code examples linked to on that page. It's a lightweight persistence layer built on top of DBUtils, and it hides all the nitty-gritty details of handling connections and result sets. You can also easily load a config file like you mentioned, and you can store SQL statements in a properties file and/or hard-code them in your code.
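For the properties-file idea, here is a generic sketch in plain JDBC (not Yank-specific; the file name and key are made up for illustration):

// statements.properties might contain, for example:
//   create.users=CREATE TABLE users (id INT PRIMARY KEY, name VARCHAR(100))
Properties statements = new Properties();
try (InputStream in = new FileInputStream("statements.properties")) {
    statements.load(in);
}
try (Statement stmt = conn.createStatement()) {
    stmt.execute(statements.getProperty("create.users"));
}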

Java storedProcedure stops with OutOfMemoryError

I'm working on a Java project, running on Tomcat 6, which connects to a MySQL database. All procedures run as they should, both when testing locally and when testing on our customer's server. There is one exception, however: a procedure which retrieves a whole lot of data to generate a report. The stored procedure takes around 13 minutes when executed from MySQL directly. When I run the application locally and connect to the online database, the procedure does work; the only time it doesn't work is when it is run on the client's server.
The client is pretty protective of his server, so we have limited control over it, but they do want us to solve the problem. When I check the log files, no errors are thrown from the function that executes the stored procedure. Putting some debug logs in the code shows that it does get to the execute call, but the debug line right after the call is never logged, nor is the error in the catch block, though it does reach the finally section.
They claim there are no time-out errors in the MySQL logs.
If anyone has any idea on what might cause this problem, any help will be appreciated.
Update:
After some nagging at the server administrator, I've finally got access to the Catalina logs, and in those logs I've finally found an error that has some meaning:
Exception in thread "Thread-16" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2894)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:117)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:407)
at java.lang.StringBuffer.append(StringBuffer.java:241)
at be.playlane.mink.database.SelectExportDataProcedure.bufferField(SelectExportDataProcedure.java:68)
at be.playlane.mink.database.SelectExportDataProcedure.extractData(SelectExportDataProcedure.java:54)
at org.springframework.jdbc.core.JdbcTemplate.processResultSet(JdbcTemplate.java:1033)
at org.springframework.jdbc.core.JdbcTemplate.extractReturnedResultSets(JdbcTemplate.java:947)
at org.springframework.jdbc.core.JdbcTemplate$5.doInCallableStatement(JdbcTemplate.java:918)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:876)
at org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:908)
at org.springframework.jdbc.object.StoredProcedure.execute(StoredProcedure.java:113)
at be.playlane.mink.database.SelectExportDataProcedure.execute(SelectExportDataProcedure.java:29)
at be.playlane.mink.service.impl.DefaultExportService$ExportDataRunnable.run(DefaultExportService.java:82)
at java.lang.Thread.run(Thread.java:636)
Strangely, this doesn't show up in the application logs, even though the call is wrapped in a try/catch. Based on the error, the problem lies within these methods:
public Object extractData(ResultSet rs) throws SQLException, DataAccessException
{
    StringBuffer buffer = new StringBuffer();
    try
    {
        // get result set meta data
        ResultSetMetaData meta = rs.getMetaData();
        int count = meta.getColumnCount();
        // get the column names; column indices start from 1
        for (int i = 1; i < count + 1; ++i)
        {
            String name = meta.getColumnName(i);
            bufferField(name, i == count, buffer);
        }
        while (rs.next())
        {
            // get the column values; column indices start from 1
            for (int i = 1; i < count + 1; ++i)
            {
                String value = rs.getString(i);
                bufferField(value, i == count, buffer);
            }
        }
    }
    catch (Exception e)
    {
        logger.error("Failed to extractData SelectExportDataProcedue: ", e);
    }
    return buffer.toString();
}

private void bufferField(String field, boolean last, StringBuffer buffer)
{
    try
    {
        if (field != null)
        {
            field = field.replace('\r', ' ');
            field = field.replace('\n', ' ');
            buffer.append(field);
        }
        if (last)
        {
            buffer.append('\n');
        }
        else
        {
            buffer.append('\t');
        }
    }
    catch (Exception e)
    {
        logger.error("Failed to bufferField SelectExportDataProcedue: ", e);
    }
}
The goal of these functions is to export a certain result set to an Excel file (which happens at a higher level).
So if anyone has tips on optimising this, they are very welcome.
Ok, your stack trace gives you the answer:
Exception in thread "Thread-16" java.lang.OutOfMemoryError: Java heap space
That's also why nothing reaches your logs: OutOfMemoryError is an Error, not an Exception, so your catch (Exception e) blocks never see it and the thread simply dies. Judging from your description it sounds like you have a massive dataset that needs to be paged.
while (rs.next())
{
    // get the column values; column indices start from 1
    for (int i = 1; i < count + 1; ++i)
    {
        String value = rs.getString(i);
        bufferField(value, i == count, buffer);
    }
}
This is (probably) where your thread dies: the StringBuffer simply runs out of heap. As for correcting it, there is a huge number of options. You can throw more memory at the problem on the client side, either by configuring the JVM (see: How to set the maximum memory usage for JVM?) or, if you're already doing that, by putting more RAM into the machine.
From a programming perspective, it sounds like this is a hell of a report. You could offload some of the number crunching to MySQL rather than buffering on your end (if possible), or, if this is a giant report, I would consider streaming it to a file and then reading it back via a buffered stream to fill in the report.
It totally depends on what the report is. If it is tiny, I would aim at doing more work in SQL to minimize the result set. If it is a giant report, then buffering to disk is the other option.
Another possibility you might be missing is that the ResultSet (depending on the implementation) is probably buffered. That means that instead of reading it all into strings, maybe your report can take the ResultSet object directly and print from it. The downside, of course, is that a stray SQLException will kill your report.
Best of luck, I'd try the memory options first. You might be running with something hilariously small like 128 MB, and then the fix is simple (I've seen this happen a lot on remotely administered machines).
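For the streaming option, here is a minimal sketch of extractData writing each row straight to disk instead of accumulating everything in a StringBuffer (the temp-file destination and tab-separated layout are assumptions, carried over from the bufferField logic in the question):

// Sketch: stream rows to a temp file instead of building one huge String,
// so memory use stays flat no matter how many rows the procedure returns.
public File extractDataToFile(ResultSet rs) throws SQLException, IOException
{
    File out = File.createTempFile("export", ".tsv");
    try (BufferedWriter writer = new BufferedWriter(new FileWriter(out)))
    {
        ResultSetMetaData meta = rs.getMetaData();
        int count = meta.getColumnCount();
        for (int i = 1; i <= count; ++i)
        {
            writer.write(meta.getColumnName(i));
            writer.write(i == count ? '\n' : '\t');
        }
        while (rs.next())
        {
            for (int i = 1; i <= count; ++i)
            {
                String value = rs.getString(i);
                if (value != null)
                {
                    writer.write(value.replace('\r', ' ').replace('\n', ' '));
                }
                writer.write(i == count ? '\n' : '\t');
            }
        }
    }
    return out;
}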

Java Heap Space Exception, with big amount of data, any solution?

I have a little big problem with Java heap memory.
I'm trying to migrate from an Oracle Database 11g to an Access 2007 file.
This is not a problem below 65,000 records; from there on, though...
The application is throwing a Java heap exception; memory consumption rises over 600 MB and CPU usage over 50% until the exception.
As far as I know, rset.next() doesn't fetch all the data (over 50 columns x 65,000+ rows) at once, but some records at a time.
I've tried to set the fetch size too; nothing happened:
rset.setFetchSize(1000);
I've stripped my code down to just the loop with some output, and I get the same error:
while (rset.next()) {
    if (cont % 5000 == 0) {
        System.out.println(cont + " processed and counting ...");
    }
}
Please don't give me the answer of using -Xms/-Xmx 512, 1024, etc.
That might solve it in general, but not in my particular case (I've tried setting it even higher, and nothing happened; I get the same exception at 65,000 records too).
Are there any other options I could try,
maybe changing some driver configuration or connection string?
Please help.
Sorry about my English.
This is my connection:
Class.forName("oracle.jdbc.driver.OracleDriver");
this.conn = DriverManager.getConnection("jdbc:oracle:thin:#" + getServer() + ":1521:orcl", getUser(), getPassword());
this.stmt = this.conn.createStatement(java.sql.ResultSet.TYPE_SCROLL_INSENSITIVE, java.sql.ResultSet.CONCUR_UPDATABLE);
It looks like the problem is that you are using a scrollable ResultSet: TYPE_SCROLL_INSENSITIVE result sets are typically implemented by caching every fetched row on the client side, so the heap grows with the table regardless of the fetch size.
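A minimal sketch of the forward-only alternative (the query string stands in for the question's actual select):

// A forward-only, read-only statement lets the driver stream rows in
// fetch-size batches instead of caching the whole table client-side.
this.stmt = this.conn.createStatement(
        java.sql.ResultSet.TYPE_FORWARD_ONLY,
        java.sql.ResultSet.CONCUR_READ_ONLY);
this.stmt.setFetchSize(1000); // hint: up to 1000 rows per round trip
ResultSet rset = this.stmt.executeQuery("select ..."); // the question's query
while (rset.next()) {
    // process one row at a time; rows already passed become garbage
}
rset.close();
this.stmt.close();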
