Here is my problem:
First of all, I'm working with Hadoop in a single-node configuration.
I'm developing an application with just one map function; inside this map function I call about 10 other functions.
The application reads from a CSV file and processes a certain column. I have already built the jar file, so when I run the app on a CSV with 4,000 rows on Windows 7 (using Cygwin) on a 4 GB RAM machine, it works fine. But when I run it on Ubuntu Linux on a 2 GB RAM machine, it processes some rows and then throws a "Java heap space" error, or sometimes the thread is killed.
On the Linux machine:
I have already tried changing the exported Hadoop HEAP_SIZE as well as the -Xmx and -Xms parameters of the app. It made some difference, but not much; the error still happens...
Do you know why this is happening? Is it because of the 4 GB vs. 2 GB RAM difference between the machines?
One thing I ran into with a mapper: if you call functions or use objects that start their own threads from within the map function, they can easily create enough threads to use up all the heap space for that JVM.
Each mapper has its setup and cleanup methods called exactly once. In my situation I was able to collect all my data into an ArrayList during map, then do the additional processing I needed in the cleanup method, roughly as in the sketch below.
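Here is a minimal sketch of that pattern using the org.apache.hadoop.mapreduce.Mapper API; the CsvColumnMapper class name, the column index, and the processing done in cleanup are hypothetical placeholders, not the actual code behind this answer.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: accumulate per-row values in map(), do the heavy work once in cleanup().
public class CsvColumnMapper extends Mapper<LongWritable, Text, Text, NullWritable> {

    private final List<String> rows = new ArrayList<String>();

    @Override
    protected void setup(Context context) {
        // Called once per mapper, before any map() call; open resources here if needed.
    }

    @Override
    protected void map(LongWritable key, Text value, Context context) {
        // Just collect the column of interest; avoid spawning threads or heavy objects here.
        String[] fields = value.toString().split(",");
        if (fields.length > 2) {
            rows.add(fields[2]);               // column index 2 is an arbitrary example
        }
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        // Called once per mapper, after the last map() call; do the additional processing here.
        for (String columnValue : rows) {
            context.write(new Text(columnValue.trim()), NullWritable.get());
        }
    }
}

Note that buffering every row in an ArrayList trades threads for heap, so this only helps if the per-mapper input comfortably fits within the configured -Xmx.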
I have scheduled (via cron) a jar file on a Linux box. The jar connects to a Hive server over JDBC, runs a select query, and then writes the selected data to a CSV file. The daily data volume is around 150 million records, and the CSV file is approximately 30 GB in size.
Now, this job does not complete every time it is invoked and ends up writing only part of the data. I checked the PID for errors with dmesg | grep -E 31866 and I can see:
[1208443.268977] Out of memory: Kill process 31866 (java) score 178 or sacrifice child
[1208443.270552] Killed process 31866 (java) total-vm:25522888kB, anon-rss:11498464kB, file-rss:104kB, shmem-rss:0kB
I am invoking my jar with memory options like:
java -Xms5g -Xmx20g -XX:+UseG1GC -cp jarFile
I want to know what exactly the error text means, and whether there is any solution I can apply to ensure my job will not run out of memory. The weird thing is that the job does not fail every time; its behaviour is inconsistent.
That message is actually from the Linux kernel (its OOM killer), not from your job. It means that your system ran out of memory and the kernel killed your process to resolve the problem (otherwise you'd probably get a kernel panic).
You could try modifying your app to lower its memory requirements, e.g. load your data incrementally as sketched below, or write a distributed job that performs the needed transformations on the cluster rather than on one machine.
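A minimal sketch of the incremental approach, assuming a plain JDBC connection (the JDBC URL, credentials, query, and output path below are placeholders, not taken from the question): stream the ResultSet with a modest fetch size and write each row out immediately, so the heap never has to hold 150 million records at once.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingExport {
    public static void main(String[] args) throws SQLException, IOException {
        // Placeholder connection details - substitute your real Hive JDBC URL and credentials.
        try (Connection conn = DriverManager.getConnection("jdbc:hive2://hiveserver:10000/default", "user", "");
             Statement stmt = conn.createStatement();
             BufferedWriter out = Files.newBufferedWriter(Paths.get("/data/export.csv"), StandardCharsets.UTF_8)) {

            // Ask the driver to fetch rows in small batches instead of buffering the whole result.
            stmt.setFetchSize(10_000);

            try (ResultSet rs = stmt.executeQuery("SELECT col_a, col_b FROM some_table")) {
                while (rs.next()) {
                    // Write each row immediately so heap usage stays roughly constant.
                    out.write(rs.getString(1) + "," + rs.getString(2));
                    out.newLine();
                }
            }
        }
    }
}

Whether the Hive JDBC driver fully honours setFetchSize depends on the driver version, so treat this as a sketch of the approach rather than a guaranteed fix.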
Need some help from the experts!
We have a project here (still in dev) that needs to run 50 Java processes (for now; the number will probably double or triple in the future) at the same time every 5 minutes. I set -Xmx50m for every process, and our server has only 4 GB of RAM, so I know that will really slow our server down. What I have in mind is to upgrade our RAM. My question is: do I have any other options to prevent our server from being slow when running that many Java processes?
Since you have 50 processes, by your own numbers they need about 2.5 GB of heap in total.
To prevent your server from being slow, you can follow some best practices for setting Java memory parameters: set -Xms and -Xmx to the same value, and choose that value based on your processes' actual usage. You can also profile your processes at runtime to make sure everything is OK, as in the sketch below.
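As one hedged example of profiling at runtime (the 5-second interval and console output are arbitrary choices, not part of this answer), each process could periodically report its own heap usage via the standard java.lang.management API:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapReporter {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        while (true) {
            MemoryUsage heap = memory.getHeapMemoryUsage();
            // used vs. committed tells you whether -Xmx50m is actually enough for this process.
            System.out.printf("heap used=%d MB committed=%d MB max=%d MB%n",
                    heap.getUsed() / (1024 * 1024),
                    heap.getCommitted() / (1024 * 1024),
                    heap.getMax() / (1024 * 1024));
            Thread.sleep(5_000);   // report every 5 seconds; adjust as needed
        }
    }
}

Launching every process with matching values, e.g. java -Xms50m -Xmx50m -jar yourApp.jar, keeps each heap fixed so the 50 processes together stay within a predictable ~2.5 GB.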
I'm developing a graphical user interface with NetBeans IDE 7.0.1. I need to operate on long Strings (about 1 GB), and I've changed the startup configuration parameters to:
-J-Xss512M
-J-Xms4G
-J-Xmx12G
-J-XX:PermSize=4G
-J-XX:MaxPermSize=8G
When I execute the applet I receive the "java.lang.OutOfMemoryError: Java heap space" exception. I've checked the point of the exception: at that moment, the applet was trying to concatenate two Strings, one of length 550,000,000 and the other of length 68,000,000.
If I change the parameters above, nothing changes about the applet's exception.
My computer has 16 GB of RAM, so I don't think that is the problem. Maybe some applet configuration?
Can you help me?
Thanks a lot
What operating system are you running on? If it's a 32-bit one, you won't be able to address more than around 4 GB of RAM (less in practice, especially on a Windows machine).
The second point here is that the config you're changing is, I think, for NetBeans itself, not for the applet. If so (and I'm not sure how this works in NetBeans), then you need to alter the memory settings for your applet itself, not for NetBeans.
This is typically done via the run configuration settings for the app/applet etc. you're trying to run.
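One quick way to verify which settings the JVM running your code actually received (just a sketch; you could drop the equivalent line into your applet's init method instead) is to print the heap limit from inside the program:

public class HeapCheck {
    public static void main(String[] args) {
        // If this prints a value far below the 12 GB you expect, the -J-Xmx flag is being
        // applied to the NetBeans IDE process, not to the JVM that actually runs your code.
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxHeapMb + " MB");
    }
}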
Hope this helps!
When I run PowerShell in a remote session (etsn {servername}), I sometimes can't seem to run Java processes, even the simplest:
[chi-queuing]: PS C:\temp> java -cp .\hello.jar Hello
Error occurred during initialization of VM
Could not reserve enough space for object heap
Hello.jar is a "Hello, world!" application that should just print "Hello" to standard output.
So the question is: is there something special about running processes on the other side of a PowerShell session? Is there something special about how the Java VM works that might not allow treatment like this? The memory is allocated on the remote computer, right? Here is a readout of the physical memory available:
[chi-queuing]: PS C:\temp> $mem = Get-wmiobject -class Win32_OperatingSystem
[chi-queuing]: PS C:\temp> $mem.FreePhysicalMemory
1013000
But when I use Remote Desktop to the server and ask the OS how much free memory there is, it says 270 MB of physical memory free. Let me know what you think!
According to this:
http://msdn.microsoft.com/en-us/library/aa384372(VS.85).aspx
MaxMemoryPerShellMB
Specifies the maximum amount of memory allocated per shell, including the shell's child processes. The default is 150 MB.
Increase MaxMemoryPerShellMB:
winrm set winrm/config/winrs '#{MaxMemoryPerShellMB="1000"}'
I have a different answer to share with you guys. I found myself in the same situation, and increasing the min/max memory for java.exe or using winrm did NOT solve my issue.
I compared two servers: one working and one not working.
I used this link https://technet.microsoft.com/en-us/library/ff520073%28v=ws.10%29.aspx to check my Windows Management Framework version, which is needed to run WinRS and also remote PowerShell.
The result: both servers were running Windows Server 2008 R2, but one was running WMF 2.0 and the other WMF 3.0.
To my surprise, the server running 2.0 was working and the one running 3.0 was NOT!
My solution: I upgraded WMF from 3.0 to 4.0!
Just an FYI: we suffered the same symptoms and had an endless investigation based on the other two answers.
The actual solution for us was changing jdk1.8.0_31 to jdk1.8.0_51.
I have a multi-threaded application that launches an external app to do data conversion in preparation for later parts of the application.
The issue is that when I set my thread count higher than 6 concurrent threads, Runtime.getRuntime().exec() fails to launch the external application (I have also tried using ProcessBuilder, with the same results). It does not throw any sort of exception, and nothing is captured on either the standard output stream or the standard error stream. What's even stranger is that after rebooting the server, I can run at least 8 concurrent threads for a few minutes without the issue, but then it returns.
I have read that Linux's implementation uses the fork() call, which can cause an "unable to allocate memory" error, but I am not getting any such error.
I have written another test app that just launches X copies of notepad.exe without issue, with X as large as 100; the kind of launch code involved is sketched below.
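For reference, here is a minimal sketch (kept compatible with the Java 6 runtime mentioned below) of the kind of launch-and-drain code involved; notepad.exe and the count of 8 launcher threads are placeholders taken from the description, not the actual application code.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class LaunchTest {

    public static void main(String[] args) throws InterruptedException {
        List<Thread> threads = new ArrayList<Thread>();
        for (int i = 0; i < 8; i++) {                   // 8 concurrent launchers, as in the test
            Thread t = new Thread(new Runnable() {
                public void run() {
                    launchOnce();
                }
            });
            threads.add(t);
            t.start();
        }
        for (Thread t : threads) {
            t.join();
        }
    }

    private static void launchOnce() {
        BufferedReader reader = null;
        try {
            ProcessBuilder pb = new ProcessBuilder("notepad.exe");   // placeholder external app
            pb.redirectErrorStream(true);                            // merge stderr into stdout
            Process p = pb.start();

            // Drain the child's output so it never blocks on a full pipe buffer.
            reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println("[child] " + line);
            }
            System.out.println("exit code: " + p.waitFor());
        } catch (IOException e) {
            // A failure to launch usually surfaces here as an IOException.
            e.printStackTrace();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            if (reader != null) {
                try { reader.close(); } catch (IOException ignored) { }
            }
        }
    }
}

ProcessBuilder.start() normally throws an IOException when the operating system refuses to create the process, so logging in that catch block is the first place to look when launches seem to vanish silently.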
The application is running on a Windows Server 2003 Standard x64 server in a VMware environment.
JVM version is 1.6.0_11.
I can update the JVM in an attempt to resolve the issue, but I would like to leave that as a last resort, to avoid having to retest every application against the new JVM version.
Try downloading the DLL "framedyn.dll" from http://www.dlldump.com/download-dll-files_new.php/dllfiles/F/framedyn.dll/5.1.2600.2180/download.html and pasting it into C:\Windows\System32.