Java application runs much slower in console than from Eclipse

I am having a very strange problem with Java that I have been unable to make any progress in fixing. I have designed a small application for generating and viewing simple fractals as part of a coursework assignment. When I run the code from Eclipse it runs very quickly, generally using less than 5% of my processor, and it is generally very responsive.
However, when I run this same program from the Windows command line (after compiling it, of course), it runs much more slowly. It uses about 20% of the processor, and if I had to put a figure on it I would say it runs about 10x slower; it is generally unusable. I have done a good bit of research on this problem and it hasn't been easy to come across relevant information; it certainly doesn't seem to be a common bug. The times it has been asked before differ from my situation: in those cases people were writing an excessive amount of output to the console, which caused the slowdown. I am not printing anything to the console.
The process started by Eclipse when the code is run uses up to about 700 MB of RAM. The code run from the console uses up to about 70 MB. I tried running it with a greater heap size, which did increase the RAM the process uses but did virtually nothing to improve the performance.
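For what it's worth, by a greater heap size I mean an invocation along these lines (the class name here is just a placeholder, not my real one):
java -Xmx1024m -cp bin FractalViewer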
I would really appreciate it if anybody could help with this issue, I am tearing my hair out here.
Thanks very much!

Related

How do I properly limit JVM's memory usage?

I've been stuck on my problem for quite some time. To give you a little context, I have written a bot in Java and was planning to run it 24/7 on a Raspberry Pi 3 Model A+. To my surprise, when I tested the almost-finished program, its memory consumption kept rising indefinitely.
I soon realised I had to limit the memory usage, which I have been reading up on across several sites over the past couple of months. Unfortunately, most of them are outdated (2013 and older), and the very few newer ones don't cover the important changes that must have taken place, because I'm still not able to figure out why my issue is occurring.
I've tried so many things over such a long period of time that I'm not sure I'll be able to sum them all up, but I will update this post if I remember some important details.
Please see the pictures of my last test with the following settings:
java -Xmx4m -Xms4m -Xss64k -XX:MaxMetaspaceSize=8m -jar bot.jar
As you can see, the memory was not limited and rose to the point where the process was killed shortly after. In some of my previous tests I used an empty while(true) loop, because I don't think I have a memory leak in my program. Weirdly enough, even with the empty loop the memory footprint still grew very slowly, and it never decreased over time.
At this point I'm not sure the JVM is even capable of enforcing a specified memory limit. My code uses the Robot class to take screen captures and fire certain buttons in nested while loops, and it also nudges the garbage collector to run a collection with System.gc(). Do I also have to use the following argument for the JVM?
-XX:MaxDirectMemorySize
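For context, the capture loop looks roughly like this; this is a simplified sketch with made-up names, not my actual code:

import java.awt.AWTException;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.KeyEvent;
import java.awt.image.BufferedImage;

public class BotLoopSketch {
    public static void main(String[] args) throws AWTException, InterruptedException {
        Robot robot = new Robot();
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        while (true) {
            // every capture allocates a fresh BufferedImage on the heap
            BufferedImage shot = robot.createScreenCapture(screen);
            // ... inspect pixels, decide which button to fire ...
            robot.keyPress(KeyEvent.VK_SPACE);
            robot.keyRelease(KeyEvent.VK_SPACE);
            shot.flush();      // release the image's resources promptly
            Thread.sleep(100); // pace the loop instead of spinning
        }
    }
}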
I'm really confused by all the changes to Java as well. I've tried a few different JDKs because I thought that might solve the problem. The last test was compiled with JDK 14 and runs on Java 11. Do I need to configure something in the OS in order to limit the memory consumption?
Maybe you could also recommend a profiler with which I can check what is actually allocating the memory, so I can figure out what needs to be limited via the arguments. I would definitely need some help with that, though, because I have never worked with one before.
I appreciate any help on this topic! Please let me know if you need any additional information and I will do my best to follow up during the week.
Maybe you can use the following args: -XX:+PrintGCDetails -Xloggc:/tmp/jvm-gc.log. They will log GC details to /tmp/jvm-gc.log.
Or you can check the size of the runtime heap with the following commands:
# get the pid
ps aux | grep bot.jar
# show the heap info
jmap -heap <pid>
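Side note, since the last test runs on Java 11: on Java 9 and later, -XX:+PrintGCDetails is deprecated in favour of unified logging (e.g. -Xlog:gc*:file=/tmp/jvm-gc.log), and jmap -heap was removed there; the replacement is:
jhsdb jmap --heap --pid <pid>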

Why does my Java app run faster with profiler attached?

I am developing a Java 8 SE application in NetBeans. A new feature I added recently to the app was running too slowly (about a minute until the calculations stopped). So I fired up the profiler to see where the major bottleneck was. To my surprise, the calculations completed in about 7 seconds.
I couldn't believe it at first, but the results were correct.
I tried it a few more times, but the app always ran about 10 times faster with the profiler attached to it. I also tried to run the compiled .jar file directly from the Windows command line, but the computations took about a minute, again and again.
How is it possible that the attached profiler provides such a massive boost to performance? What changes does it make to the JVM or the application?
BTW, I am using native OpenCV in these calculations via the provided Java wrapper, if that makes any difference.
// Edit - additional info: I am using the built-in NetBeans 8.1 profiler, which I believe is basically VisualVM. As the profiling method I chose to monitor "Methods", i.e. their execution times and invocation counts. The performance bump happens with both instrumented and sampled profiling.
Unfortunately, there probably isn't one single answer that will explain why this is the case. Of course, it will depend on what the program is doing as well as how the program is being launched. For example, if you're using the profiler to launch the application (as opposed to connecting afterwards), then the profiler may be launching it with a different configuration (heap size, garbage collector, etc.), and that could be the cause of the difference.
If you run jcmd you should see a list of Java processes. You can then run jcmd <id> VM.flags to see what the JVM has been configured with, and verify that the flags are the same for the application both with and without the profiler.
Another possibility is that your program is locking excessively, and this excessive locking is causing thrashing in your application when the profiler isn't attached. With it attached the locking may be slower, resulting in the application threads co-operating and ultimately making faster progress.
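If you want to test the locking theory from inside the application, the standard ThreadMXBean can report per-thread blocking statistics. A rough sketch (the class name is mine, and you would hook this into your own code rather than a separate main):

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ContentionCheck {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        if (threads.isThreadContentionMonitoringSupported()) {
            threads.setThreadContentionMonitoringEnabled(true);
        }
        // ... run the suspect workload here, then dump the stats ...
        for (ThreadInfo info : threads.getThreadInfo(threads.getAllThreadIds())) {
            if (info != null && info.getBlockedCount() > 0) {
                System.out.printf("%s: blocked %d times, %d ms total%n",
                        info.getThreadName(), info.getBlockedCount(), info.getBlockedTime());
            }
        }
    }
}

Threads showing high blocked counts and times when the profiler is detached would support the contention explanation.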
However, these are just suggestions for how you can investigate further; it's quite possible that what you're seeing has another, as yet undiscovered cause that is completely different (e.g. the application defaulting to a different level of logging ...).

JVM performance bottomed out overnight. What happened?

I'm not a Java developer but I find myself wearing that hat today, so bear with me.
When I stopped working on a file parser last night I was seeing benchmarks of 100k records per second, which I was more than happy with. When I re-ran the same code against the same files this morning, I'm only seeing 10-12k records per second.
My first thought was that I had changed something and introduced inefficient code, but I've commented out larger swaths of the code than I had running last night and performance is still abysmal. At this point the program does virtually nothing except read files in the main loop, and it still reads them slowly.
I had a coworker run the fully functional version of the jar on his own machine, and he is seeing the 100k/s benchmarks I saw last night, so I can only assume something is wrong with my JVM/workstation.
Any ideas or thoughts on what I should be looking at? I hesitate to get into JVM performance tuning when I already know the stock JVM is more than capable of doing this task. I just don't understand what's changed since last night.
EDIT: I have rebooted the machine.
EDIT 2: It is now the next day. I booted up my laptop and ran the code, and it's back to where it was in the first place: 100k records/s. I checked Windows' performance monitor yesterday and it didn't show an unusual amount of CPU, RAM or disk I/O, so I'm really at a loss as to why this happened.
Perhaps I will look into JVM tuning after all if only to ensure I have a consistent experience.
Problem seems to be resolved.
Since posting, one thing I've done is eliminate the Windows pagefile in case this was a matter of paging. It helped but not as much as I needed.
What seems to have made the biggest impact is dedicating a sizable amount of memory to the JVM upon launch. I'm guessing the JVM might have been struggling to allocate enough memory dynamically. I'm now running it with the following parameters:
-server -Xmx4096m -Xms4096m
I added the -server flag since this is intended to be a long-running process and I don't care about start-up time, and the matched memory arguments give the JVM a solid, static 4 GB heap to work with.
I've not had any performance issues since.
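For anyone who hits the same thing: a quick way to sanity-check that the flags actually took effect is to print what the runtime reports. This is just a throwaway class, nothing from the real parser:

public class HeapCheck {
    public static void main(String[] args) {
        long mb = 1024 * 1024;
        Runtime rt = Runtime.getRuntime();
        // with -Xms4096m -Xmx4096m both figures should come out near 4096 MB
        System.out.println("max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("total heap: " + rt.totalMemory() / mb + " MB");
    }
}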

Swing application becomes unresponsive after keeping operating for about 6 hours continuously

I have to say that I am not familiar with Swing technology; I am doing test automation on a Swing application. The problem I encountered is that when I ran my tests automatically overnight, the application became unresponsive after a certain period of time, and the tests then failed. I recorded the time in the test log; the period was mostly around 6 hours.
The application was still working, but very slowly. No out-of-memory error was thrown, so I reckon it could be something wrong with the app's use of Swing. And maybe, because the app keeps running, garbage collection cannot catch up?
I couldn't find much information on Google. Could anyone who has had a similar experience shed some light on how to solve this issue? Many thanks!
Try connecting to the application with JConsole and leave it running for the duration of the test. Check the heap memory usage chart: if the troughs of the sawtooth pattern keep slowly rising, you have a memory leak somewhere.
There are tools to examine the heap, like IBM's Heap Analyzer.
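If watching JConsole for six hours isn't practical, the same numbers can be logged from inside the application under test. A minimal sketch, assuming you can start one extra thread in the test harness (the class name is made up):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapSampler implements Runnable {
    @Override
    public void run() {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        while (true) {
            MemoryUsage heap = memory.getHeapMemoryUsage();
            System.out.printf("heap used: %d MB of %d MB max%n",
                    heap.getUsed() / (1024 * 1024), heap.getMax() / (1024 * 1024));
            try {
                Thread.sleep(60_000); // one sample per minute over the whole run
            } catch (InterruptedException e) {
                return; // stop sampling when the harness shuts down
            }
        }
    }
}

Start it during test setup with new Thread(new HeapSampler(), "heap-sampler").start(); and compare the logged values across the 6 hours.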

Why is the Eclipse IDE getting slower?

I have downloaded the latest Eclipse IDE, Galileo, and tested it to see if it is good for developing web applications in Java. I have also tried the Ganymede version of Eclipse and found that it is also good.
My problem is that sometimes it hangs and stops responding while I am developing. Sometimes when I open a file, Eclipse hangs and does not respond for a while. It seems that Eclipse is getting slower, and my work is getting slower because of the time I spend waiting for Eclipse to respond.
When I switched to NetBeans 6.7, it was good and the performance was good. It loads faster and the IDE responds well during my development testing.
My computer has 1 GB of RAM and a 1.6 GHz CPU.
What can you say about this?
I'm using Eclipse PDT 2.1 (also based on Galileo) for PHP development, and I've been using Eclipse-based IDEs for 3 years now; my observation is that 1 GB of RAM is generally not enough to run Eclipse + some kind of web server + DB server + browser + other stuff :-(
I'm currently working on a machine with 1 GB of RAM, and it's slow as hell... A few months ago I had a 2 GB machine, and things were going really fine; and I'm running less software on the "new" machine than I was on the other one!
Other things that seem to affect Eclipse's responsiveness are:
opening a project that's on a network drive (accessing sources that are on a development server via samba, for instance)
sometimes, using an SVN plugin like Subversive seems to freeze Eclipse for a couple of seconds/minutes
A nice thing to do with languages like PHP (it might not be OK for Java projects, though) is to disable "Build Automatically" in the "Project" menu.
As a side note: I've already seen questions about Eclipse's speed on SO; you might want to try some searches, to get answers faster ;-)
This is a common concern and others have posted similar questions. There are optimizations that you can perform on your Eclipse environment. Take a look at the solutions posted here.
NetBeans is really damn hot; I just didn't get it to automatically release my Android projects...
Thinking of features, I'd prefer Eclipse...
To speed it up a little more, just disable 'Build Automatically'. It doesn't really change anything (a build just takes a little longer),
but it feels noticeably faster...
But after 1 or 2 hours I also have to close it, wait, and re-open it.
Kind of sucks... (I've got a MacBook Pro, 2.26 GHz (I think), 3 GB RAM,
gave it a minimum of 768 MB of RAM, and it keeps getting slower...)
Really sucks.
::edit::
I also realized that after opening an XML file, Eclipse instantly gets a little more laggy (I already disabled XML live validation, or something similar; it makes no difference :( )
Our machines are bigger: 2 GB of RAM, and a faster CPU.
I'm sure that, like all software, Eclipse gets bigger and slower with each version upgrade, due to all the new functionality included. The good news is that from time to time a release also brings some notable performance improvements. But in my experience, every time I have tried ten-year-old software on my current machine, it was lightning fast, so I'm sure the tendency is to get slower. I agree that this is sad for us when we don't get a better machine.
There might be some things you can do to improve the responsiveness of your Eclipse.
I don't know if you have already tried everything ... ?
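One thing that often helps is giving Eclipse a bigger heap via the eclipse.ini file next to the executable. The values below are only a guess for a 1 GB machine; you would have to experiment:

-vmargs
-Xms128m
-Xmx512m
-XX:MaxPermSize=256m

Everything after the -vmargs line is passed straight to the JVM that runs Eclipse.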
My experience has been that NetBeans, Aptana, and Komodo are fast on computers where Eclipse is painfully slow. Maxing out the RAM has seemed to help. Any chance you can bump up to 2 GB?
NetBeans has sped up quite a bit in the last few years; perhaps your comparison is relative to the speed of NetBeans?
Lately I had to increase my Eclipse -Xmx from 64 MB, and I figured I might as well go to 512, and it got a bit chunkier. At 64 I never saw the slightest pause; at 512, when it actually NEEDS a collection because a long-running process isn't letting the background GC thread run, it can get a little pausey.
I'm running a pretty old version of Eclipse (customized by the cable industry so it can run and display cable apps on a TV emulator), so your mileage may vary.
Check if you can disable unwanted plugins during start-up.
