How to profile spring-boot application memory consumption? - java

I have a Spring Boot app that I suspect might have a memory leak. Over time the memory consumption seems to increase, reaching around 500M until I restart the application. After a fresh restart it takes something like 150M. The app should be a pretty stateless REST app, and there shouldn't be any objects left around after a request is completed. I would expect the garbage collector to take care of this.
Currently in production the app seems to use 343M of memory (RSS). I took a heap dump of the application and analysed it; according to the analysis, the heap dump is only 31M in size. So where is the missing 300M? How does the heap dump correlate with the memory the application is actually using, and how could I profile memory consumption beyond the heap dump? If the memory used is not in the heap, then where is it? How do I discover what is consuming the memory of the Spring Boot application?

So where is the missing 300M?
A lot of research has gone into this, especially in trying to tune the parameters that control the non-heap. One result of this research is the memory calculator (binary).
You see, in Docker environments with a hard limit on the available amount of memory, the JVM will crash when it tries to allocate more memory than is available. Even with all the research, the memory calculator still has a slack option called "head-room", usually set to 5 to 10% of total available memory, in case the JVM decides to grab some more memory anyway (e.g. during intensive garbage collection).
Apart from "head-room", the memory calculator needs 4 additional input-parameters to calculate the Java options that control memory usage.
total-memory - a minimum of 384 MB for a Spring Boot application; start with 512 MB.
loaded-class-count - for a recent Spring Boot application, about 19 000. This seems to grow with each Spring version. Note that this is a maximum: setting too low a value will result in all kinds of weird behavior (sometimes an "OutOfMemory: non-heap" exception is thrown, but not always).
thread-count - 40 for a "normal usage" Spring Boot web application.
jvm-options - see the two parameters below.
The "Algorithm" section mentions additional parameters that can be tuned, of which I found two worth the effort to investigate per application and specify:
-Xss set to 256kb. Unless your application has really deep stacks (recursion), going from 1 MB to 256kb per thread saves a lot of memory.
-XX:ReservedCodeCacheSize set to 64MB. Peak "CodeCache" usage is often during application startup, and going from 192 MB to 64 MB frees a lot of memory which can then be used as heap. Applications that have a lot of active code at runtime (e.g. a web application with many endpoints) may need more "CodeCache". If "CodeCache" is too low, your application will use a lot of CPU without doing much; this can also manifest during startup, where the application takes a very long time to start. "CodeCache" is reported by the JVM as a non-heap memory region, so it should not be hard to measure.
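If you want to verify "CodeCache" (and the other non-heap regions) yourself, here is a minimal sketch using standard JDK tools - the jar name and <pid> are placeholders, and Native Memory Tracking adds a small overhead:
$ java -XX:NativeMemoryTracking=summary -XX:ReservedCodeCacheSize=64M -jar app.jar
$ jcmd <pid> VM.native_memory summary   # breaks usage down into heap, metaspace, code cache, thread stacks, ...
$ jcmd <pid> Compiler.codecache         # current code cache usage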
The output of the memory calculator is a bunch of Java options that all have an effect on what memory the JVM uses. If you really want to know where "the missing 300M" is, study and research each of these options in addition to the "Java Buildpack Memory Calculator v3" rationale.
# Memory calculator 4.2.0
$ ./java-buildpack-memory-calculator --total-memory 512M --loaded-class-count 19000 --thread-count 40 --head-room 5 --jvm-options "-Xss256k -XX:ReservedCodeCacheSize=64M"
-XX:MaxDirectMemorySize=10M -XX:MaxMetaspaceSize=121289K -Xmx290768K
# Combined JVM options to keep your total application memory usage under 512 MB:
-Xss256k -XX:ReservedCodeCacheSize=64M -XX:MaxDirectMemorySize=10M -XX:MaxMetaspaceSize=121289K -Xmx290768K

Besides the heap, you have thread stacks, metaspace, the JIT code cache, native shared libraries and the off-heap store (direct allocations).
I would start with thread stacks: how many threads does your application spawn at peak? Each thread is likely to allocate 1MB for its stack by default, depending on Java version, platform, etc. With (say) 300 active threads (idle or not), you'll allocate 300MB of stack memory.
Consider making all your thread pools fixed-size (or at least give them reasonable upper bounds). Even if this proves not to be the root cause of what you observed, it makes the app's behaviour more deterministic and will help you isolate the problem.
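As a hedged sketch of how to check and bound this (thread count via a standard JDK tool; the Tomcat property name assumes Spring Boot 2.3+, older versions use server.tomcat.max-threads):
$ jcmd <pid> Thread.print | grep -c 'java.lang.Thread.State'   # rough count of live Java threads
$ java -Xss256k -Dserver.tomcat.threads.max=50 -jar app.jar    # smaller stacks, bounded request-handling pool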

You can view the memory consumption of a Spring Boot app this way.
Build the Spring Boot app as a .jar file and run it with java -jar springboot-example.jar.
Open a terminal, type jconsole and hit enter.
Note: the .jar file must already be running before you open jconsole.
A window opens and the application you just started appears in the Local Process section.
Select springboot-example.jar and click the Connect button.
When prompted, choose the Insecure connection option.
Finally, the Overview tab shows heap memory, threads and more.
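If the app runs on a headless server where jconsole's GUI is not an option, a few command-line alternatives (assuming a full JDK is installed; <pid> is a placeholder):
$ jps -l                     # find the PID of springboot-example.jar
$ jstat -gcutil <pid> 5000   # heap and GC utilization, sampled every 5 seconds
$ jmap -heap <pid>           # heap configuration and usage on JDK 8 (newer JDKs: jhsdb jmap --heap --pid <pid>)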

You can use "JProfiler" https://www.ej-technologies.com/products/jprofiler/overview.html
remotely or locally to monitor running java app memory usage.

You can using "yourkit" with IntelliJ if you are using that as your IDE to troubleshoot memory related issues for your spring boot app. I have used this before and it provides better insight to applications.
https://www.yourkit.com/docs/java/help/idea.jsp

Interesting article about memory profiling: https://www.baeldung.com/java-profilers

Related

Is setting JVM Xmx option in Google App Engine Standard useful?

I have a simple Spring Boot application running on Google App Engine Standard (Java 11 environment) F2 instances. However, I occasionally get errors such as:
Exceeded soft memory limit of 512 MB with 592 MB after servicing 18320
requests total. Consider setting a larger instance class in app.yaml.
Instead of upgrading to a larger instance type, I'd prefer to limit memory usage, even if it degrades performance a bit.
Since an F2 instance has 512MB of available memory, would it help to set JVM's -Xmx option to a value like say, 480MB? Or will it make things worse by converting Google's "Exceeded soft memory limit" warning to a full blown OutOfMemory error?
Thanks
A Java process can consume more memory than what is specified via -Xmx. That applies only to max heap size.
As such, if you want to limit it, you should specify significantly less than 512 MB.
There are other flags to limit some portions of non-heap memory like -XX:MaxDirectMemorySize.
By default, one quarter of the available RAM is assigned as -Xmx.
You should check the actual settings your app is running with, for example via java -XX:+PrintFlagsFinal.
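For example, a quick sketch of inspecting the defaults the JVM would pick on a given machine (standard HotSpot flags; the grep pattern is just a convenience):
$ java -XX:+PrintFlagsFinal -version | grep -Ei 'MaxHeapSize|MaxRAMPercentage|MaxMetaspaceSize|ThreadStackSize'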
I suggest reading this excellent post by a JVM expert explaining various types of JVM memory: Java using much more memory than heap size (or size correctly Docker memory limit)
You can limit the amount of memory the JVM is using, but there are no universal "recommended" arguments/settings. Settings that may be good for one application or use case can be terrible for another. As a general rule, no JVM settings at all is a good starting point.
When you face the "Exceeded soft private memory limit" error you have two alternatives:
You can upgrade your instance to another one with more memory.
You can reduce the chunks of data you process in each request (e.g. process the XML file in smaller pieces) and keep the smaller instance doing the work.
Here is a stackoverflow post which can help.

How to get JVM Heap Size of Running Spring Boot (Spring Web MVC) API (runtime)?

I have a Spring Boot API hosted on AWS Elastic Container Service (ECS) running inside a Docker container. I am using an m5.xlarge instance, which has 4 vCPUs and 16GB of physical RAM, on the cluster. I am currently fine-tuning CPU and memory, but finding it very random and tedious, despite following this article:
https://medium.com/#vlad.fedosov/how-to-calculate-resources-reservation-for-ecs-task-3c68a1e12725
I'm still a little confused about what to set the JVM heap size to. From what I read, Java 10+ detects that it is running in a container and sets the JVM heap to 25% of the container RAM (or is this the actual physical RAM of the cluster?). However, I am using Java 8.
I am logging garbage collection via VM arguments:
-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintGCDateStamps -Xloggc:${LOG_DIR}/gc.log
My questions are:
What is the easiest way to get the JVM Heap size my app is using at runtime? Is there a command, a tool etc...?
I am running Java 8, which I believe does NOT detect the container and will size the JVM based on the PHYSICAL SERVER RAM. I am using an m5.xlarge instance on AWS with 4 vCPUs and 16GB RAM, so if I didn't specify -Xms or -Xmx, the JVM heap size would be 16GB * 0.25 = 4GB, correct??
If my container/task memory on AWS ECS is currently at 120%, how do I know whether it is the JVM heap or the container memory that is the problem/too low, or maybe even my application code being inefficient? This is an API and it is querying the database many thousands of times per minute, so many objects are floating around in memory and perhaps not being garbage collected?? I'm not sure, but would appreciate any help.
Regarding questions 1 and 3: you'll have the data in the log file you've specified, ${LOG_DIR}/gc.log.
TIP: use gc_%t.log in the -Xloggc path to keep all the GC logs (not only the last one) and write them to persistent storage.
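A minimal sketch of the Java 8 GC-logging flags with that tip applied - the rotation flags are optional additions, not something from the question:
-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:${LOG_DIR}/gc_%t.log -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 -XX:GCLogFileSize=20M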
You can visualise the data, e.g. at gceasy.io; you'll see the entire heap: collections, times, etc.
But remember your app is not only heap; there is also off-heap memory, and it's not so easy to track (e.g. native memory tracking consumes additional resources). The easiest thing is to grab a thread dump: the fastest way (though not recommended in a PROD environment) is to build your app with a JDK, run the jstack command in the app's terminal and redirect it to a file on persistent storage. Then you can also visualise it at https://fastthread.io/
Regarding question 2: it depends on which Java 8 update you're running; later Java 8 updates backported container awareness (8u191+ with -XX:+UseContainerSupport), see e.g. this blog.
Especially note the VM flags -XX:InitialRAMPercentage, -XX:MinRAMPercentage and -XX:MaxRAMPercentage.
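For illustration, a hedged example of relying on the percentage flags instead of a fixed -Xmx (works on Java 8u191+ and Java 10+; the percentages are placeholders to tune, not recommendations):
$ java -XX:+UseContainerSupport -XX:InitialRAMPercentage=50.0 -XX:MaxRAMPercentage=75.0 -jar api.jar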
Start by investigating your app's gc.log; if you see nothing suspicious there, then check your container utilization.

How to know what is the proper heap size for Java app (Spring app)?

I have a Spring Boot backend application and I wonder how much memory I should give it. Currently I give it about 1GB and it seems to work (though I am not sure what I should check).
However, I have heard people say that Java is memory-intensive and one should give it about a 1:4 CPU:memory ratio (i.e. for a 4-core CPU, 16GB of memory). Currently my Spring app uses 3~6 CPU cores, but only 1~2 GB of memory. So I am quite confused. Am I wrong? What should I do?
Thank you!
More details: It connects to MySQL, Redis, etc, and does not have many memory-heavy things inside itself.
More details: GC and memory usage are captured by Micrometer, Prometheus and Grafana.
You should do a load test to see how the application behaves under pressure in a production-like environment. Create a load test suite using SoapUI or a similar tool and run it in ramp-up mode to see how the application uses CPU and memory. From what is shown above you don't have frequent major GCs, which suggests there are no memory-heavy operations in the application, but to be sure you still need the load test, and don't forget to profile the application during it (YourKit preferred, or VisualVM). From the results you can decide how much memory to assign to the different segments of the heap (JVM parameters) and also which garbage collector to use (CMS, G1, ...).
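Once the load test gives you numbers, a hedged sketch of how those decisions typically translate into startup flags (values and jar name are placeholders, not recommendations); pinning -Xms equal to -Xmx avoids heap resizing under load, and the collector is chosen explicitly:
$ java -Xms1g -Xmx1g -XX:+UseG1GC -jar backend-app.jar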

Spring Boot embedded Tomcat server takes over 800 MB RAM?

I'm working on a Spring Boot application that uses the embedded Tomcat server. The application takes more than 800MB of RAM. Is that common? Is there any way to bring memory usage down?
The amount of memory consumed by your Tomcat depends entirely on your application's requirements.
You need to do some sort of memory profiling of your application.
Is that common?
Yes, it could be. It all depends on your application, the way you create objects and the amount of memory used by your objects.
You can start by setting your -Xms to 1GB, running your application and performing normal operations.
Use tools like JVisualVM or JConsole to observe the heap size, GC performance and even the amount of memory consumed by different types of objects in the JVM.
This will give you an initial idea of the amount of heap required by your application.
After this, use a tool like JMeter to load-test your application and check how the load affects your heap usage.
Suggested Reading:
http://blog.manupk.com/2012/09/java-memory-model-simplified.html
This is pretty common. Java VMs are pretty heavy. Look at the JVM start-up flags, which will tell you what the heap size can grow to (you may see something like -Xmx768m, which allows a maximum of 768M of heap). You might try setting the CATALINA_OPTS environment variable: CATALINA_OPTS=-Xmx512m, but if the Spring Boot script that starts the VM overrides this you will have to track down the value being set in the script. However, the default value generally works well and will prevent the JVM from throwing out-of-memory errors if you start instantiating many or large (read: Hibernate) objects that take a while to be garbage collected.
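Note that a Spring Boot fat jar with embedded Tomcat usually has no separate Tomcat startup script, so the flags are typically passed straight to the java command (or via whatever launcher/Dockerfile you use); a minimal sketch with placeholder values:
$ java -Xms256m -Xmx512m -XX:MaxMetaspaceSize=128m -jar myapp.jar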
Is there any way to bring memory usage down?
There are two approaches:
You can attempt to "squeeze" the heap size. That is not recommended, as it leads to the JVM spending a larger percentage of the CPU in the GCs, more frequent GC pauses, and ultimately OOMEs.
This approach often doesn't work at all; i.e. it just causes the application to die quicker.
You can figure out why your application is using so much memory. It could be due to many things:
The problem you are solving may simply be too big for the memory available.
Your application could be "bloated" with all sorts of unnecessary libraries, features, etc.
Your in-memory data structures could be poorly designed.
Your application could be caching too much in memory.
Your application could have memory leaks.
I agree with @cowbert's advice. Use performance monitoring tools to try to track down what is using most of the JVM's memory. If there is a memory leak, it often shows up as an unexpectedly large amount of memory used by certain kinds of objects.
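To test the memory-leak hypothesis concretely, you can capture a heap dump of the running JVM and open it in a tool such as Eclipse MAT or VisualVM; a minimal sketch using a standard JDK tool (<pid> and the file path are placeholders):
$ jmap -dump:live,format=b,file=/tmp/app-heap.hprof <pid>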

Java/Tomcat heap size question

I am not a Java dev, but an app landed on my desk. It's a web-service server-side app that runs in a Tomcat container. The users hit it up from a client application.
The users constantly complain about how slow it is and the app has to be restarted about twice a week, cause things get really bad.
The previous developer told me that the app simply runs out of memory (as it loads more data over time) and eventually spends all its time doing garbage collection. Meanwhile, the Heap Size for Tomcat is set at 6GB. The box itself has 32GB of RAM.
Is there any harm in increasing the Heap Size to 16GB?
Seems like an easy way to fix the issue, but I am no Java expert.
You should identify the leak and fix it, not add more heap space. That's just a stopgap.
You should configure Tomcat to dump the heap on error, then analyze the heap in one of any number of tools after a crash. You can compute the retained sizes of all the classes, which should give you a very clear picture of what is wrong.
In my profile I have a link to a blog post about this, since I had to do it recently.
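A hedged sketch of the usual way to get that automatic dump: add the HotSpot flags below to Tomcat's JVM options, commonly via CATALINA_OPTS or setenv.sh (the dump path is a placeholder):
CATALINA_OPTS="$CATALINA_OPTS -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/tomcat/dumps"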
No, there is no harm in increasing the Heap Size to 16GB.
The previous developer told me that the app simply runs out of memory (as it loads more data over time)
This looks like a memory leak, a serious bug in the application. If you increase the amount of memory available from 6 to 16 GiB, you're still going to have to restart the application, only less frequently. An experienced developer should take a look at the application heap while it is running (see hvgotcodes's tips above) and fix the application.
To resolve these issues you need to do performance testing. This includes both CPU and memory analysis. The JDK (6) bundles a tool called VisualVM; on my Mac OS X machine it is on the path by default as "jvisualvm". It's free and bundled, so it's a place to start.
Next up is the NetBeans Profiler (netbeans.org). That does more memory and CPU analysis. It's free as well, but a bit more complicated.
If you can spend the money, I highly recommend YourKit (http://www.yourkit.com/). It's not terribly expensive but it has a lot of built-in diagnostics that make it easier to figure out what's going on.
The one thing you can't do is assume that just adding more memory will fix the problem. If it's a leak, adding more memory may just make it run really badly a bit longer between restarts.
I suggest you use a profiling tool like JProfiler, VisualVM, jConsole, YourKit etc. You can take a heap dump of your application and analyze which objects are eating up memory.
