Thanks everyone ^_^, the problem is solved: a single line was too big (over 400 MB... I had downloaded a damaged file without realizing it), so it threw an OutOfMemoryError.
I want to split a file using Java, but it always throws OutOfMemoryError: Java heap space. I searched all over the Internet, but it looks like nothing helps :(
ps. the file's size is 600 MB, it has over 30,000,000 lines, and every line is no longer than 100 chars.
(maybe you can generate a "level file" like this; a small generator sketch follows the example: {
id:0000000001,level:1
id:0000000002,level:2
....(over 30 million lines)
})
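A small generator sketch for a test file in that format (the path, line count and level values are assumptions, so adjust them to taste):
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class GenerateLevelFile {
    public static void main(String[] args) throws IOException {
        // Write ~30 million "id:...,level:..." lines; each line stays well under 100 chars.
        try (BufferedWriter w = Files.newBufferedWriter(
                Paths.get("/home/work/bingo/level.txt"), StandardCharsets.UTF_8)) {
            for (long i = 1; i <= 30_000_000L; i++) {
                w.write(String.format("id:%010d,level:%d%n", i, (i - 1) % 10 + 1));
            }
        }
    }
}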
pss. setting the JVM memory size larger does not work :(
psss. I changed to another PC, and the problem remains /(ㄒoㄒ)/~~
No matter how large I set -Xms or -Xmx, the output file's size is always the same (and Runtime.getRuntime().totalMemory() really does change).
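(For reference, totalMemory() only reports the currently committed heap; printing maxMemory() next to it shows whether the -Xmx setting actually took effect. A minimal check:)
// Quick heap diagnostic: maxMemory() reflects -Xmx, totalMemory() is only the committed heap.
Runtime rt = Runtime.getRuntime();
System.out.println("Max heap  (-Xmx)   = " + rt.maxMemory());
System.out.println("Committed (total)  = " + rt.totalMemory());
System.out.println("Free in committed  = " + rt.freeMemory());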
here's the stack trace:
Heap Size = 2058027008
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:515)
at java.lang.StringBuffer.append(StringBuffer.java:306)
at java.io.BufferedReader.readLine(BufferedReader.java:345)
at java.io.BufferedReader.readLine(BufferedReader.java:362)
at com.xiaomi.vip.tools.ptupdate.updator.Spilt.main(Spilt.java:39)
...
here's my code:
package com.updator;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
public class Spilt {
public static void main(String[] args) throws Exception {
long heapSize = Runtime.getRuntime().totalMemory();
// Print the jvm heap size.
System.out.println("Heap Size = " + heapSize);
String mainPath = "/home/work/bingo/";
File mainFilePath = new File(mainPath);
FileInputStream inputStream = null;
FileOutputStream outputStream = null;
try {
if (!mainFilePath.exists())
mainFilePath.mkdir();
String sourcePath = "/home/work/bingo/level.txt";
inputStream = new FileInputStream(sourcePath);
BufferedReader bufferedReader = new BufferedReader(new FileReader(
new File(sourcePath)));
String savePath = mainPath + "tmp/";
Integer i = 0;
File file = new File(savePath + "part"
+ String.format("%0" + 5 + "d", i) + ".txt");
if (!file.getParentFile().exists())
file.getParentFile().mkdir();
file.createNewFile();
outputStream = new FileOutputStream(file);
int count = 0, total = 0;
String line = null;
while ((line = bufferedReader.readLine()) != null) {
line += '\n';
outputStream.write(line.getBytes("UTF-8"));
count++;
total++;
if (count > 4000000) {
outputStream.flush();
outputStream.close();
System.gc();
count = 0;
i++;
file = new File(savePath + "part"
+ String.format("%0" + 5 + "d", i) + ".txt");
file.createNewFile();
outputStream = new FileOutputStream(file);
}
}
outputStream.close();
file = new File(mainFilePath + "_SUCCESS");
file.createNewFile();
outputStream = new FileOutputStream(file);
outputStream.write(i.toString().getBytes("UTF-8"));
} finally {
if (inputStream != null)
inputStream.close();
if (outputStream != null)
outputStream.close();
}
}
}
I think maybe when outputStream.close() is called, the memory is not released?
So you open the original file and create a BufferedReader and a counter for the lines.
char[] buffer = new char[5120];
BufferedReader reader = Files.newBufferedReader(Paths.get(sourcePath), StandardCharsets.UTF_8);
int lineCount = 0;
Now you read into your buffer, and write the characters as they come in.
int read;
BufferedWriter writer = Files.newBufferedWriter(Paths.get(fileName), StandardCharsets.UTF_8);
while ((read = reader.read(buffer, 0, 5120)) > 0) {
    int offset = 0;
    for (int i = 0; i < read; i++) {
        char c = buffer[i];
        if (c == '\n') {
            lineCount++;
            if (lineCount == maxLineCount) {
                // write the range from offset up to (and including) this newline to the old writer,
                // then switch to a fresh writer for the next part file
                writer.write(buffer, offset, i - offset + 1);
                writer.close();
                offset = i + 1;
                lineCount = 0;
                writer = Files.newBufferedWriter(Paths.get(newName), StandardCharsets.UTF_8);
            }
        }
    }
    // write whatever remains of this chunk to the current part file
    writer.write(buffer, offset, read - offset);
}
writer.close();
That should keep the memory usage low and prevent you from ever reading too large a line into memory at once. You could go without BufferedWriters and control the memory even more, but I don't think that is necessary.
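For completeness, one possible setup around that loop; sourcePath, maxLineCount and the part-file naming here are assumptions for illustration, not something the answer prescribes:
// Hypothetical setup for the snippet above, reusing the paths from the question.
String sourcePath = "/home/work/bingo/level.txt";
String saveDir = "/home/work/bingo/tmp/";
int maxLineCount = 4000000;   // lines per part, as in the question
int partIndex = 0;
// Name of the first part file; newName inside the loop is built the same way
// with ++partIndex, giving part00000.txt, part00001.txt, ...
String fileName = saveDir + "part" + String.format("%05d", partIndex) + ".txt";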
I've tested it with a large text file (250 MB).
It works well.
You need to add try/catch blocks for the file streams.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Scanner;

public class MyTest {
public static void main(String[] args) {
String mainPath = "/home/work/bingo/";
File mainFilePath = new File(mainPath);
FileInputStream inputStream = null;
FileOutputStream outputStream = null;
try {
if (!mainFilePath.exists())
mainFilePath.mkdir();
String sourcePath = "/home/work/bingo/level.txt";
inputStream = new FileInputStream(sourcePath);
Scanner scanner = new Scanner(inputStream, "UTF-8");
String savePath = mainPath + "tmp/";
Integer i = 0;
File file = new File(savePath + "part" + String.format("%0" + 5 + "d", i) + ".txt");
if (!file.getParentFile().exists())
file.getParentFile().mkdir();
file.createNewFile();
outputStream = new FileOutputStream(file);
int count = 0, total = 0;
while (scanner.hasNextLine()) {
String line = scanner.nextLine() + "\n";
outputStream.write(line.getBytes("UTF-8"));
count++;
total++;
if (count > 4000000) {
outputStream.flush();
outputStream.close();
count = 0;
i++;
file = new File(savePath + "part" + String.format("%0" + 5 + "d", i) + ".txt");
file.createNewFile();
outputStream = new FileOutputStream(file);
}
}
outputStream.close();
file = new File(mainFilePath + "_SUCCESS");
file.createNewFile();
outputStream = new FileOutputStream(file);
outputStream.write(i.toString().getBytes("UTF-8"));
} catch (FileNotFoundException e) {
System.out.println("ERROR: FileNotFoundException :: " + e.getStackTrace());
} catch (IOException e) {
System.out.println("ERROR: IOException :: " + e.getStackTrace());
} finally {
    try {
        if (inputStream != null)
            inputStream.close();
        if (outputStream != null)
            outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
}
}
If the problem still occurs, increase the Java heap size with the following option at the shell prompt.
e.g.)
-Xmx1g : 1 GB heap size,
MyTest : class name
java -Xmx1g MyTest
Related
I have been writing an updater for my game.
It checks a .version file on drop box and compares it to the local .version file.
If any links are missing from the local version of the file, it downloads the required links one by one.
This is the error that it shows
Exception in thread "Thread-9" java.lang.OutOfMemoryError: Java heap space
at com.fox.listeners.ButtonListener.readFile(ButtonListener.java:209)
at com.fox.listeners.ButtonListener.readFile(ButtonListener.java:204)
at com.fox.listeners.ButtonListener.UpdateStart(ButtonListener.java:132)
at com.fox.listeners.ButtonListener$1.run(ButtonListener.java:58)
It only shows up on some computers though, not all of them. This is the readFile method:
private byte[] readFile(URL u) throws IOException {
return readFile(u, getFileSize(u));
}
private static byte[] readFile(URL u, int size) throws IOException {
byte[] data = new byte[size];
int index = 0, read = 0;
try {
HttpURLConnection conn = null;
conn = (HttpURLConnection) u.openConnection();
conn.addRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
InputStream is = conn.getInputStream();
progress_a = 0;
progress_b = data.length;
while(index < data.length) {
read = is.read(data, index, size-index);
index += read;
progress_a = index;
}
} catch(Exception e) {
e.printStackTrace();
}
return data;
}
private byte[] readFile(File f) {
byte[] data = null;
try {
data = new byte[(int)f.length()];
@SuppressWarnings("resource")
DataInputStream dis = new DataInputStream(new FileInputStream(f));
dis.readFully(data);
} catch (IOException e) {
e.printStackTrace();
}
return data;
}
This is the main method that is run:
public void UpdateStart() {
System.out.println("Starting Updater..");
if(new File(cache_dir).exists() == false) {
System.out.print("Creating cache dir.. ");
while(new File(cache_dir).mkdir() == false);
System.out.println("Done");
}
try {
version_live = new Version(new URL(version_file_live));
} catch(MalformedURLException e) {
e.printStackTrace();
}
version_local = new Version(new File(version_file_local));
Version updates = version_live.differences(version_local);
System.out.println("Updated");
int i = 1;
try {
byte[] b = null, data = null;
FileOutputStream fos = null;
BufferedWriter bw = null;
for(String s : updates.files) {
if(s.equals(""))
continue;
System.out.println("Reading file "+s);
AppFrame.pbar.setString("Downloading file "+ i + " of "+updates.files.size());
if(progress_b > 0) {
s = s + " " +(progress_a * 1000L / progress_b / 10.0)+"%";
}
b = readFile(new URL(s));
progress_a = 0;
progress_b = b.length;
AppFrame.pbar.setString("Unzipping file "+ i++ +" of "+updates.files.size());
ZipInputStream zipStream = new ZipInputStream(new ByteArrayInputStream(b));
File f = null, parent = null;
ZipEntry entry = null;
int read = 0, entry_read = 0;
long entry_size = 0;
progress_b = 0;
while((entry = zipStream.getNextEntry()) != null)
progress_b += entry.getSize();
zipStream = new ZipInputStream(new ByteArrayInputStream(b));
while((entry = zipStream.getNextEntry()) != null) {
f = new File(cache_dir+entry.getName());
if(entry.isDirectory())
continue;
System.out.println("Making file "+f.toString());
parent = f.getParentFile();
if(parent != null && !parent.exists()) {
System.out.println("Trying to create directory "+parent.getAbsolutePath());
while(parent.mkdirs() == false);
}
entry_read = 0;
entry_size = entry.getSize();
data = new byte[1024];
fos = new FileOutputStream(f);
while(entry_read < entry_size) {
read = zipStream.read(data, 0, (int)Math.min(1024, entry_size-entry_read));
entry_read += read;
progress_a += read;
fos.write(data, 0, read);
}
fos.close();
}
bw = new BufferedWriter(new FileWriter(new File(version_file_local), true));
bw.write(s);
bw.newLine();
bw.close();
}
} catch(Exception e) {
e.printStackTrace();
return;
}
System.out.println(version_live);
System.out.println(version_local);
System.out.println(updates);
CacheUpdated = true;
if(CacheUpdated) {
AppFrame.pbar.setString("All Files are downloaded click Launch to play!");
}
}
I don't get why it works for some of my players and not for others. I have been trying to fix this all day and I am just stumped at this point, but this seems like the only big issue left for me to fix.
Either increase the memory allocated to your JVM (How can I increase the JVM memory?), or make sure that the file being loaded in memory isn't gigantic (if it is, you'll need to find an alternate solution, or just read chunks of it at a time instead of loading the entire thing in memory).
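As a sketch of the chunked approach (not the poster's code; the connection handling and buffer size are assumptions), the download can be streamed straight to a file so the whole thing never has to fit in the heap:
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical helper: copy a URL's content to a file in fixed-size chunks.
static void downloadToFile(URL u, String destPath) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) u.openConnection();
    try (InputStream in = conn.getInputStream();
         OutputStream out = new FileOutputStream(destPath)) {
        byte[] buffer = new byte[8192];   // 8 KB chunks; size is arbitrary
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
}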
Do your update in several steps. Here's some pseudo-code with Java 8. It's way shorter than what you wrote, because Java has a lot of built-in tools for this that you are re-writing much less efficiently.
// Download
Path zipDestination = Paths.get(...);
try (InputStream in = source.openStream()) {
Files.copy(in, zipDestination);
}
// Unzip
try (ZipFile zipFile = new ZipFile(zipDestination.toFile())) {
for (ZipEntry e: Collections.list(zipFile.entries())) {
Path entryDestination = Paths.get(...);
Files.copy(zipFile.getInputStream(e), entryDestination);
}
}
// Done.
I've split a 10 MB mp3 file into 10 parts of 1 MB each on my Android device. Each part plays successfully in the player, but after reading the data of all 10 files and writing it to a single file, the new file's total size is more than 17 MB and it doesn't play. Following is the code:
CODE FOR FILE SPLIT :
File file = new File(Environment.getExternalStorageDirectory()
+ "/MusicFile.mp3");
try {
FileInputStream fis = new FileInputStream(file);
FileOutputStream fos = null;
int size = 1048576; // 1 MB of data
byte buffer[] = new byte[size];
int count = 0;
int i = 0;
while (true) {
i = fis.read(buffer, 0, size);
if (i == -1) {
break;
}
File filename = getSplitFileName("split_" + count);
fos = new FileOutputStream(filename);
fos.write(buffer, 0, i);
++count;
}
fis.close();
fos.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e1) {
e1.printStackTrace();
} catch (Exception e2) {
e2.printStackTrace();
}
CODE FOR FILE JOIN :
File folder = new File(cacheDirSplit.getAbsolutePath());
File files[] = folder.listFiles();
BufferedReader bufReader = null;
BufferedWriter bufWriter = null;
if (files.length > 1) {
try {
File fileName = getJoinedFileName("NewMusicFile");
String data;
for (int i = 0; i < files.length; i++) {
long dataSize = 0;
bufReader = new BufferedReader(new FileReader(
files[i]));
bufWriter = new BufferedWriter(new FileWriter(
fileName, true));
while ((data = bufReader.readLine()) != null) {
bufWriter.write(data);
dataSize = dataSize + data.getBytes().length;
}
Log.i("TAG", "File : " + files[i] + "size ==> "
+ dataSize);
}
bufReader.close();
bufWriter.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
}
What I do not understand is that each file is read as 1.7 MB according to the LogCat output, but when I check on the device, each split file is only 1 MB. Is there anything wrong with the code, or is there something else I'm missing? Thanks in advance.
You cannot use readLine() on the content of an mp3 file. readLine() is for text files only. And if the ten parts were really playable, real mp3 files, you would have to strip the header first, as Onur explained.
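A minimal sketch of joining the parts with raw byte streams instead of readers and writers (the method and variable names here are made up for illustration):
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical join: append each part's raw bytes, in order, to one output file.
// The parts array must already be in the correct order (e.g. sorted by name).
static void joinParts(File[] parts, File joined) throws IOException {
    try (OutputStream out = new BufferedOutputStream(new FileOutputStream(joined))) {
        byte[] buffer = new byte[8192];
        for (File part : parts) {
            try (InputStream in = new FileInputStream(part)) {
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        }
    }
}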
My program copies all the data from an external drive to a particular location on my PC.
Here is my program:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
public class Copy
{
public static void main(String[] args)
{
String[] letters = new String[]{"A", "B", "C", "D", "E", "F", "G", "H", "I"};
File[] drives = new File[letters.length];
int copy=0;int l;File files[]=null;boolean pluggedIn=false;
FileInputStream fis=null;
FileOutputStream fos=null;
boolean[] isDrive = new boolean[letters.length];
for (int i = 0; i < letters.length; ++i)
{
drives[i] = new File(letters[i] + ":/");
isDrive[i] = drives[i].canRead();
}
System.out.println("FindDrive: waiting for devices...");
while (true)
{
try
{
for (int i = 0; i < letters.length; ++i)
{
pluggedIn = drives[i].canRead();
if (pluggedIn != isDrive[i])
{
if (pluggedIn)
{
System.out.println("Drive " + letters[i] + " has been plugged in");
files = drives[i].getAbsoluteFile().listFiles();
File file;
int fread;
for (l = 0; l < files.length; l++)
{
if (files[l].isFile())
{
file = new File("G://copied//" + files[l].getName());
file.createNewFile();
fis = new FileInputStream(drives[i].getAbsolutePath() + files[l].getName());
fos = new FileOutputStream(file);
while (true)
{
fread = fis.read();
if (fread == -1)
{
break;
}
fos.write(fread);
}
}
else
{
func(files[l].getAbsoluteFile(), "G://copied");
}
if(l==files.length-1)
{
System.out.print("copy complete");
fos.close();
fis.close();
}
}
}
else
{
System.out.println("Drive " + letters[i] + " has been unplugged");
}
isDrive[i] = pluggedIn;
}
}
Thread.sleep(5000);
}
catch (FileNotFoundException e) { }
catch (IOException e) { }
catch (InterruptedException e) {}
}
}
public static void func(File dir, String path)
{
File file = new File(path + "//" + dir.getName());
file.mkdir();
File[] files = dir.listFiles();
FileInputStream fis;
FileOutputStream fos;
int fread;
File file1;
for (int i = 0; i < files.length; i++)
{
if (files[i].isFile())
{
file1 = new File(file.getAbsolutePath() + "//" + files[i].getName());
try
{
file1.createNewFile();
fis = new FileInputStream(files[i]);
fos = new FileOutputStream(file1);
while (true)
{
fread = fis.read();
if (fread == -1)
{
break;
}
fos.write(fread);
}
} catch (FileNotFoundException e) {} catch (IOException e) {}
}
else
{
func(files[i], file.getAbsolutePath());
}
}
}
}
Now it is taking too long to copy large files.
Is there any way the copy operation can be performed faster?
Thanks in advance for any suggestions.
If you can use Java 7 or later: java.nio.file.Files#copy.
If you are stuck with older Java: java.nio.channels.FileChannel#transferTo
A basic example that obtains FileChannel instances from the file streams:
public void copy( FileInputStream fis, FileOutputStream fos ) throws IOException {
FileChannel fic = fis.getChannel();
FileChannel foc = fos.getChannel();
long position = 0;
long remaining = fic.size();
while ( remaining > 0 ) {
long transferred = fic.transferTo( position, remaining, foc );
position += transferred;
remaining -= transferred;
}
}
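And for the Java 7+ route, a minimal sketch of java.nio.file.Files#copy (the paths in the usage comment are placeholders):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Copy one file using the JDK's built-in implementation.
public void copy(Path source, Path target) throws IOException {
    Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
}

// e.g. copy(Paths.get("E:/photo.jpg"), Paths.get("G:/copied/photo.jpg"));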
You have to use a buffer. The copy logic should be something like:
byte[] buffer = new byte[4096];
int n;
while ((n = input.read(buffer)) != -1)
{
output.write(buffer, 0, n);
}
output.close();
input.close();
This way, you copy a chunk of 4096 bytes at once, instead of byte per byte.
file.createNewFile();
Remove that. It is redundant. new FileOutputStream() will do that anyway. You're just adding processing here, and disk processing at that.
fis = new FileInputStream(drives[i].getAbsolutePath() + files[l].getName());
fos = new FileOutputStream(file);
Now add:
int count;
byte[] buffer = new byte[8192]; // or much more if you can afford the space
while ((count = fis.read(buffer)) > 0)
{
fos.write(buffer, 0, count);
}
Back to your code:
while (true)
{
fread = fis.read();
if (fread == -1)
{
break;
}
fos.write(fread);
}
Remove all that. Reading a byte at a time is as inefficient as it gets.
I am reading a .jpg file over an InputStream using this code, but I am receiving NULNUL...n in the stream after some text. I am reading this file (link to file), and the file I received is at the Written File link.
while ((ret = input.read(imageCharArray)) != -1) {
packet.append(new String(imageCharArray, 0, ret));
totRead += ret;
imageCharArray = new char[4096];
}
file = new File(
Environment.getExternalStorageDirectory()
+ "/FileName_/"
+ m_httpParser.filename + ".jpg");
PrintWriter printWriter = new PrintWriter(file);
// outputStream = new FileOutputStream(file); //also Used FileoutputStream for writting
// outputStream.write(packet.toString().getBytes());//
// ,
printWriter.write(packet.toString());
// outputStream.close();
printWriter.close();
}
I have also tried FileOutputStream, but had no luck with that either, as commented in my code.
Edit
I have also used this. I have a content-length field up to which I am reading and writing:
byte[] bytes = new byte[1024];
int totalReadLength = 0;
// read until we have bytes
while ((read = inputStream.read(bytes)) != -1
&& contentLength >= (totalReadLength)) {
outputStream.write(bytes, 0, read);
totalReadLength += read;
System.out.println(" read size ======= "
+ read + " totalReadLength = "
+ totalReadLength);
}
String is not a container for binary data, and PrintWriter isn't a way to write it. Get rid of all, all, the conversions between bytes and String and vice versa, and just transfer the bytes with input and output streams:
while ((count = in.read(buffer)) > 0)
{
out.write(buffer, 0, count);
}
If you need to constrain the number of bytes read from the input, you have to do that before calling read(), and you also have to constrain the read() correctly:
while (total < length && (count = in.read(buffer, 0, length-total > buffer.length ? buffer.length: (int)(length-total))) > 0)
{
total += count;
out.write(buffer, 0, count);
}
I tested it on my Nexus 4 and it's working for me. Here is the snippet of code I tried:
public void saveImage(String urlPath)throws Exception{
String fileName = "kumar.jpg";
File folder = new File("/sdcard/MyImages/");
// have the object build the directory structure, if needed.
folder.mkdirs();
final File output = new File(folder,
fileName);
if (output.exists()) {
output.delete();
}
InputStream stream = null;
FileOutputStream fos = null;
try {
URL url = new URL(urlPath);
stream = url.openConnection().getInputStream();
// InputStreamReader reader = new InputStreamReader(stream);
DataInputStream dis = new DataInputStream(url.openConnection().getInputStream());
byte[] fileData = new byte[url.openConnection().getContentLength()];
for (int x = 0; x < fileData.length; x++) { // fill byte array with bytes from the data input stream
fileData[x] = dis.readByte();
}
dis.close();
fos = new FileOutputStream(output.getPath());
fos.write(fileData);
} catch (Exception e) {
e.printStackTrace();
} finally {
if (stream != null) {
try {
stream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
if (fos != null) {
try {
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
Just call the above function from a background thread and pass your URL. It'll work for sure. Let me know if it helps.
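For example, one way to invoke it off the main thread (the URL here is just a placeholder):
// Hypothetical caller: run saveImage() from a background thread, not the UI thread.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            saveImage("http://example.com/some-image.jpg");  // placeholder URL
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}).start();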
You can check the code below.
destinationFile = new File(
Environment.getExternalStorageDirectory()
+ "/FileName_/"
+ m_httpParser.filename + ".jpg");
BufferedOutputStream buffer = new BufferedOutputStream(new FileOutputStream(destinationFile));
byte byt[] = new byte[1024];
int i;
for (long l = 0L; (i = input.read(byt)) != -1; l += i ) {
buffer.write(byt, 0, i);
}
buffer.close();
Below is my code to convert a PDF file to byte array
import java.io.BufferedReader;
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStreamReader;

public class ByteArrayExample {
public static void main(String[] args) {
try{
BufferedReader bf = new BufferedReader(new InputStreamReader(System.in));
System.out.println("Enter File name: ");
String str = bf.readLine();
File file = new File(str);
//File length
int size = (int)file.length();
if (size > Integer.MAX_VALUE){
System.out.println("File is to larger");
}
byte[] bytes = new byte[size];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
int read = 0;
int numRead = 0;
while (read < bytes.length && (numRead=dis.read(bytes, read,
bytes.length-read)) >= 0) {
read = read + numRead;
}
System.out.println("File size: " + read);
// Ensure all the bytes have been read in
if (read < bytes.length) {
System.out.println("Could not completely read: "+file.getName());
}
}
catch (Exception e){
e.getMessage();
}
}
}
The issue is that this actually converts the file name into the byte array, not the actual PDF file. Can anyone please help me with this?
I added this to the end to check it, and it copied the PDF file. Your code is working fine:
dis.close();
DataOutputStream out = new DataOutputStream(new FileOutputStream(new File("c:\\out.pdf")));
out.write(bytes);
out.close();
System.out.println("File size: " + read);
// Ensure all the bytes have been read in
if (read < bytes.length) {
System.out.println("Could not completely read: "+file.getName());
}
Edit: here is my entire code; it's just copied from yours. I ran it in an IDE (Eclipse) and entered "c:\mypdf.pdf" for the input, and it copied it to out.pdf. Identical copies. Do note that I closed both streams, which I noticed you forgot to do in your code.
import java.io.BufferedReader;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStreamReader;

public class Main {
public static void main(String[] args) {
try {
BufferedReader bf = new BufferedReader(new InputStreamReader(System.in));
System.out.println("Enter File name: ");
String str = bf.readLine();
File file = new File(str);
//File length
int size = (int) file.length();
if (size > Integer.MAX_VALUE) {
System.out.println("File is to larger");
}
byte[] bytes = new byte[size];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
int read = 0;
int numRead = 0;
while (read < bytes.length && (numRead = dis.read(bytes, read,
bytes.length - read)) >= 0) {
read = read + numRead;
}
dis.close();
DataOutputStream out = new DataOutputStream(new FileOutputStream(new File("c:\\out.pdf")));
out.write(bytes);
out.close();
System.out.println("File size: " + read);
// Ensure all the bytes have been read in
if (read < bytes.length) {
System.out.println("Could not completely read: " + file.getName());
}
} catch (Exception e) {
e.getMessage();
}
}
}