I want to send data from an array to a txt file - java

String[][] EmployeeArray = new String [1000][25];
This is my array; it already has all the info I need in it, but I want to write all the data from it into a text file called EmployeeFile. How should I do this?

You can serialize it, or even better add some JSON decoration/formatting and then write that to a file...
Using JSON (here via Gson) could be as simple as:
String[][] x = { { "0-0", "0-1", "0-2" }, { "1-0", "1-1", "1-2" }, { "2-0", "2-1", "2-2" } };
try (Writer writer = new FileWriter("my2DArray.json")) {
    Gson gson = new GsonBuilder().create();
    gson.toJson(x, writer);
}

First of all you will need to loop over your array and create a single string containing your data to be written to the file. This is done so you can add the new line characters where you want them.
String fileContents = new String();
for (int i = 0; i < 1000; i++) {
    for (int j = 0; j < 25; j++) {
        fileContents += EmployeeArray[i][j];
    }
    fileContents += "\n";
}
The code above is very basic and is meant to demonstrate the idea; note that without a delimiter between fields the values will run together, so you may want to append a separator after each element.
It would be more efficient to use a StringBuilder, and there are no doubt many ways to improve these lines further.
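As a rough sketch of that StringBuilder variant (the sample data and the tab separator are illustrative choices, not part of the question):

```java
import java.util.StringJoiner;

public class BuildContents {
    public static void main(String[] args) {
        // Tiny stand-in for the 1000x25 EmployeeArray from the question
        String[][] employeeArray = { { "Alice", "42" }, { "Bob", "7" } };
        StringBuilder fileContents = new StringBuilder();
        for (String[] row : employeeArray) {
            StringJoiner line = new StringJoiner("\t"); // tab-separated fields
            for (String field : row) {
                line.add(field == null ? "" : field);   // guard against unfilled cells
            }
            fileContents.append(line).append('\n');     // one row per line
        }
        System.out.print(fileContents);
    }
}
```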
Then you can use the following method to write to the file:
public boolean writeFile(String data, String path) {
    try {
        byte[] encoded = data.getBytes(StandardCharsets.UTF_8);
        java.nio.file.Files.write(Paths.get(path),
                encoded, StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING);
        return true;
    } catch (IOException e) {
        e.printStackTrace();
    }
    return false;
}
Be careful with the options StandardOpenOption.CREATE and StandardOpenOption.TRUNCATE_EXISTING: together they will overwrite an existing file.
Sometimes you need to append to the file instead; adjust the options accordingly.
See the StandardOpenOption documentation for details; StandardOpenOption.APPEND comes in handy for logging purposes.
Also note that the character set used is UTF-8. It is generally a good idea to use UTF-8 if it covers your needs; if you get strange characters in your data, you may need to adjust that as well.
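For instance, an append-mode write could look like this (a minimal sketch; the temp-file target is purely illustrative):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class AppendDemo {
    public static void main(String[] args) throws IOException {
        Path log = Files.createTempFile("demo", ".log"); // illustrative target file
        // CREATE + APPEND: create the file if missing, otherwise add to the end
        Files.write(log, "first line\n".getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        Files.write(log, "second line\n".getBytes(StandardCharsets.UTF_8),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        System.out.print(Files.readString(log)); // both lines survive
    }
}
```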

Related

What's the most efficient way to read in a massive log file, and post to an API endpoint in Java?

Currently I have a massive log file in my application that I need to post to an endpoint. I periodically run a method that reads the entire file into a list, performs some formatting so the endpoint will accept it, converts the result to a string using a StringBuilder, and posts that string to my endpoint. I should also mention that I batch the data out in chunks of X characters. I am seeing some memory issues in my application and am trying to deal with them.
So this is how I am partitioning out the data to a temporary list
if (logFile.exists()) {
    try (BufferedReader br = new BufferedReader(new FileReader(logFile.getPath()))) {
        String line;
        while ((line = br.readLine()) != null) {
            if (isJSONValid(line)) {
                temp.add(line);
                tempCharCount += line.length();
            }
            if (tempCharCount >= LOG_PARTITION_CHAR_COUNT) {
                // Formatting for the backend
                String tempString = postFormat(temp);
                // Send
                sendLogs(tempString);
                // Refresh
                temp = new ArrayList<>();
                tempCharCount = 0;
            }
        }
        // Send "dangling" data
        // Formatting for the backend
        String tempString = postFormat(temp);
        // Send
        sendLogs(tempString);
    } catch (FileNotFoundException e) {
        Timber.e(new Exception(e));
    } catch (IOException e) {
        Timber.e(new Exception(e));
    }
}
So when we reach our partition limit for character count, you can see that we are running
String tempString = postFormat(temp);
This is where we make sure our data is formatted into a string of json data that the endpoint will accept.
private String postFormat(ArrayList<String> list) {
    list.add(0, LOG_ARRAY_START);
    list.add(LOG_ARRAY_END);
    StringBuilder sb = new StringBuilder();
    for (int stringCount = 0; stringCount < list.size(); stringCount++) {
        sb.append(list.get(stringCount));
        // Only add comma separators after the initial element, but never after the final
        // element or its preceding element, to match the expected backend input
        if (stringCount > 0 && stringCount < list.size() - 2) {
            sb.append(",");
        }
    }
    return sb.toString();
}
As you might imagine, if you have a large log file and these requests go out asynchronously, we end up using a lot of memory. Once our StringBuilder is done, we return it as a string that will eventually be gzip-compressed and posted to an endpoint.
I am looking for ways to decrease the memory usage of this. I profiled it a bit on the side and could see how obviously inefficient it is, but am not sure of how I can do this better. Any ideas are appreciated.
I have one suggestion for you.
Formatted output in a temp file - You can write the formatted output to a temp file. Once the transformation is complete, you can read the temp file back and send it to the endpoint. If sequence is not a concern, you can use multiple threads to append to the same file.
With this approach you are not storing any data in memory during the transformation, which will save a lot of memory.
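A rough sketch of that idea (the bracket/comma framing stands in for the question's LOG_ARRAY_START/LOG_ARRAY_END constants, and treating each line as already-valid JSON is an assumption):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TempFileFormatter {
    // Stream the log through a temp file instead of holding it all in memory
    static Path formatToTempFile(Path logFile) throws IOException {
        Path temp = Files.createTempFile("formatted", ".json");
        try (BufferedReader br = Files.newBufferedReader(logFile);
             BufferedWriter bw = Files.newBufferedWriter(temp)) {
            String line;
            boolean first = true;
            bw.write("[");                    // stands in for LOG_ARRAY_START
            while ((line = br.readLine()) != null) {
                if (!first) bw.write(",");
                bw.write(line);               // assumes each line is a valid JSON object
                first = false;
            }
            bw.write("]");                    // stands in for LOG_ARRAY_END
        }
        return temp;
    }

    public static void main(String[] args) throws IOException {
        Path log = Files.createTempFile("log", ".txt");
        Files.write(log, java.util.List.of("{\"a\":1}", "{\"b\":2}")); // sample log lines
        Path out = formatToTempFile(log);
        // The temp file can now be read back in chunks and posted to the endpoint
        System.out.print(Files.readString(out));
    }
}
```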

Get data from JSON API and put into CSV in loop

I managed to get the data out of an API and into a CSV, but I am having trouble writing the data to the CSV in a loop, because right now each iteration overwrites the CSV. The next problem is that the data does not end up in separate fields in the CSV. In the CSV it looks like this:
and I want all the data like in my console:
my code right now:
JSONArray jsonarr_1 = (JSONArray) jobj.get("infectedByRegion");
// Get data for the results array
for (int i = 0; i < jsonarr_1.size(); i++) {
    // Get the JSON object at this index and read its values
    JSONObject jsonobj_1 = (JSONObject) jsonarr_1.get(i);
    String str_data1 = (String) jsonobj_1.get("region");
    Long str_data2 = (Long) jsonobj_1.get("infectedCount");
    Long str_data3 = (Long) jsonobj_1.get("deceasedCount");
    System.out.println(str_data1);
    System.out.println("Infizierte: " + str_data2);
    System.out.println("Tote: " + str_data3);
    System.out.println("\n");
    PrintWriter pw = null;
    try {
        pw = new PrintWriter(new File("C:/Users/stelz/OneDrive/Desktop/Corona Daten/28.04.2020.csv"));
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    StringBuilder builder = new StringBuilder();
    String columnNamesList = "Bundesland,Infizierte,Tote";
    builder.append(columnNamesList + "\n");
    builder.append(str_data1 + ",");
    builder.append(str_data2 + ",");
    builder.append(str_data3);
    builder.append('\n');
    pw.write(builder.toString());
    pw.close();
    System.out.println("done!");
}
// Disconnect the HttpURLConnection stream
conn.disconnect();
//Disconnect the HttpURLConnection stream
conn.disconnect();
Your CSV is probably a nice comma-delimited file with UTF-8 encoding. The first line in the image from Excel is evidence that, in the file and as text, the first line is correctly:
Thüringen,2170,80
But your Excel probably has national defaults for Western Europe, meaning a semicolon (';') as the CSV separator and CP1252 (a slight variation of the Latin-1 charset) for the encoding.
How to fix it?
Excel is known to have very poor support for CSV files: read the file as text (Notepad or Notepad++), or use LibreOffice Calc, which lets you declare the separator and the encoding when opening a CSV file.
If you have to stick with Excel, carefully build the file for your own Excel: use the ISO-8859-1 charset and semicolons:
...
PrintWriter pw = null;
try {
    pw = new PrintWriter(new File("C:/Users/stelz/OneDrive/Desktop/Corona Daten/28.04.2020.csv"),
            "ISO-8859-1");
} catch (FileNotFoundException | UnsupportedEncodingException e) {
    e.printStackTrace();
}
StringBuilder builder = new StringBuilder();
String columnNamesList = "Bundesland;Infizierte;Tote";
builder.append(columnNamesList + "\n");
builder.append(str_data1 + ";");
...
It looks like you're trying to create many different CSVs. If that is not what you want, and you want all the records in one CSV, then I recommend looping through the array, building the CSV with a StringBuilder (or one of the many CSV libraries), and then creating the file outside of your for loop.
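A minimal sketch of that structure (the header matches the question; the sample rows are made up):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class CsvOnce {
    public static void main(String[] args) throws IOException {
        // Sample data standing in for the values parsed out of the API response
        String[][] rows = { { "Thüringen", "2170", "80" }, { "Bayern", "40000", "1500" } };
        StringBuilder builder = new StringBuilder("Bundesland,Infizierte,Tote\n"); // header once
        for (String[] row : rows) {
            builder.append(String.join(",", row)).append('\n'); // one line per record
        }
        // One write, outside the loop, so earlier rows are not overwritten
        Path out = Files.createTempFile("corona", ".csv");
        Files.write(out, builder.toString().getBytes(StandardCharsets.UTF_8));
        System.out.print(Files.readString(out));
    }
}
```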

Saving ArrayList to text document on separate lines

I am trying to save ArrayList items to a text file. I sort of have it working, but it saves the whole ArrayList on one line.
I am hoping to save one item per line, without duplicates or the empty brackets at the start. Any help would be much appreciated. If possible I would also like to remove the brackets around the text, for easier reading back into an ArrayList.
FileOutputStream is a better fit when your data is already in byte form. I suggest you use something like a PrintWriter, here with try-with-resources so the writer is closed safely even if the constructor throws:
try (PrintWriter pw = new PrintWriter(new File("file.txt"))) {
    for (String item : Subreddit_Array_List) {
        pw.println(item);
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
Keep in mind this overwrites what was in the file before rather than appending to it. The output will be formatted like:
Cats
Dogs
Birds
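If appending is what you want instead, FileWriter's two-argument constructor can be used underneath the PrintWriter (a small sketch against a throwaway temp file):

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;

public class AppendExample {
    public static void main(String[] args) throws IOException {
        File file = File.createTempFile("subs", ".txt"); // illustrative target
        // The second FileWriter argument (true) switches to append mode
        try (PrintWriter pw = new PrintWriter(new FileWriter(file, true))) {
            pw.println("Cats");
        }
        try (PrintWriter pw = new PrintWriter(new FileWriter(file, true))) {
            pw.println("Dogs"); // appended after the earlier write, not over it
        }
        System.out.print(Files.readString(file.toPath()));
    }
}
```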
You can iterate over the list and write each element on a separate line.
Example:
private List<Object> objects;

private void example() {
    // JDK >= 8
    this.objects.forEach(this::writeInFile);
    // JDK < 8
    for (Object object : this.objects) {
        this.writeInFile(object);
    }
}

private void writeInFile(Object object) {
    //your code here
}

How to read large Base64 file (150MB) on android app?

I'm trying to read a large Base64 text file (~150 MB) in an Android application.
The file contains a JSON string that I need to decode and transform into a JSONObject, to use throughout the app. The problem is that I'm getting an OutOfMemory exception while trying to read this data.
The app needs to work offline, so I need to download the full data.
Here's the code:
String localPath = getApplicationContext().getFilesDir().getPath();
String key = "dataFile.txt";
StringBuilder text = new StringBuilder();
File file = new File(localPath + "/" + key);
byte[] fileContent = new byte[3000];
try (FileInputStream fin = new FileInputStream(file)) {
    while (fin.read(fileContent) >= 0) {
        byte[] data = Base64.decode(fileContent, Base64.DEFAULT);
        try {
            text.append(new String(data, "UTF-8"));
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
    }
    obj = new JSONObject(text.toString());
} catch (Exception e) {
    e.printStackTrace();
}
How can I read this kind of file?
You are reading the whole file into the text object by iterating over it and appending each chunk to text; you create the JSONObject out of the text object only in the last step, which is the only point where it is actually useful to your application.
By the time your code reaches the line obj = new JSONObject(text.toString()); you have already filled up the heap with nearly the size of your input file, because the complete file is in memory in the form of the text object. You then build the JSONObject out of that text object on top of it.
What you can do to eliminate this problem is as follows:
Use a BufferedReader to read the file in chunks (optional). Using read() directly can be a bit slow, and it is nice to have a buffer.
Iterate over the file and put the entries into the text object in batches of 1,000 or 10,000.
Prepare a JSONObject out of text and append it to obj.
Clear the text object before processing the next batch, and then repeat the whole process.
By doing this you read only a small portion of the file into memory at a time, and the text object acts as a buffer, consuming only a small amount of memory.
Here is the sample code snippet:
int counter = 0;
String temp = null;
final int BATCH_SIZE = 1000;
try (BufferedReader br = new BufferedReader(new FileReader(path))) {
    while ((temp = br.readLine()) != null) {
        text.append(temp);
        ++counter;
        /* Process in batches */
        if (counter % BATCH_SIZE == 0) {
            /* Prepare & append JSON objects */
            obj = prepareAppendJSON(text.toString(), obj);
            /* Clear text */
            text.setLength(0);
        }
    }
    /* Last (partial) batch */
    obj = prepareAppendJSON(text.toString(), obj);
    text.setLength(0);
} catch (IOException ex) {
    ex.printStackTrace();
}
The only option you have is to use JSON streaming and react to the events you are interested in.
import org.codehaus.jackson.*;
.....
JsonParser parser = new JsonFactory().createJsonParser(yourFileInputStream);
parser.configure(Feature.ALLOW_BACKSLASH_ESCAPING_ANY_CHARACTER, true);
parser.configure(Feature.ALLOW_SINGLE_QUOTES, true);
// add more features
for (JsonToken token = parser.nextToken(); null != token; token = parser.nextToken()) {
    switch (token) {
        case FIELD_NAME:
            doStuffWithName();
            break;
        case START_OBJECT:
            doObjectStart();
            break;
        case END_OBJECT:
            processObject();
            break;
        // other events
    }
}
I used the above code on a 4.0 device with a 10 MB JSON file.
PS: you are going to need to decode your original Base64 file first. I am not sure whether you can do it on the fly within a java.io stream; in the worst case, unpack the Base64 file into plain JSON and then use the JSON streaming code above.
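On that last point: since Java 8, java.util.Base64 can in fact wrap an InputStream so the decoding happens on the fly; a small sketch (the in-memory source stream just stands in for the real file stream):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64StreamDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for the Base64 file on disk
        byte[] encoded = Base64.getEncoder().encode("{\"ok\":true}".getBytes(StandardCharsets.UTF_8));
        // wrap() decodes lazily as the stream is read, so the whole file never sits in memory;
        // a streaming JsonParser could be created directly over this decoded stream
        try (InputStream decoded = Base64.getDecoder().wrap(new ByteArrayInputStream(encoded))) {
            System.out.print(new String(decoded.readAllBytes(), StandardCharsets.UTF_8));
        }
    }
}
```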

Modify a .txt file in Java

I have a text file that I want to edit using Java. It has many thousands of lines. I basically want to iterate through the lines and change/edit/delete some text. This will need to happen quite often.
From the solutions I saw on other sites, the general approach seems to be:
Open the existing file using a BufferedReader
Read each line, make modifications to each line, and add it to a StringBuilder
Once all the text has been read and modified, write the contents of the StringBuilder to a new file
Replace the old file with the new file
This solution seems slightly "hacky" to me, especially if I have thousands of lines in my text file.
Anybody know of a better solution?
I haven't done this in Java recently, but reading an entire file into memory seems like a bad idea.
The best idea I can come up with is to open a temporary file in write mode at the same time and, for each line, read it, modify it if necessary, then write it into the temporary file. At the end, delete the original and rename the temporary file.
If you have modify permissions on the file system, you probably also have delete and rename permissions.
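A rough sketch of that temp-file approach (the uppercase transform is just a placeholder for the real modification, and the move may not be atomic across filesystems):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class InPlaceEdit {
    static void editLines(Path original) throws IOException {
        Path temp = Files.createTempFile("edit", ".tmp");
        try (BufferedReader reader = Files.newBufferedReader(original);
             BufferedWriter writer = Files.newBufferedWriter(temp)) {
            String line;
            while ((line = reader.readLine()) != null) {
                writer.write(line.toUpperCase()); // placeholder for the real edit
                writer.newLine();
            }
        }
        // Replace the original with the edited copy
        Files.move(temp, original, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path f = Files.createTempFile("demo", ".txt");
        Files.write(f, java.util.List.of("hello", "world"));
        editLines(f);
        System.out.print(Files.readString(f));
    }
}
```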
If the file is just a few thousand lines, you should be able to read it in a single read and convert it to a String.
You can use Apache Commons IO's IOUtils, or a pair of helpers like the following:
public static String readFile(String filename) throws IOException {
    File file = new File(filename);
    int len = (int) file.length();
    byte[] bytes = new byte[len];
    FileInputStream fis = null;
    try {
        fis = new FileInputStream(file);
        assert len == fis.read(bytes);
    } finally {
        close(fis); // close on both the success and failure paths
    }
    return new String(bytes, "UTF-8");
}

public static void writeFile(String filename, String text) throws IOException {
    FileOutputStream fos = null;
    try {
        fos = new FileOutputStream(filename);
        fos.write(text.getBytes("UTF-8"));
    } finally {
        close(fos);
    }
}

public static void close(Closeable closeable) {
    if (closeable == null) {
        return;
    }
    try {
        closeable.close();
    } catch (IOException ignored) {
    }
}
You can use RandomAccessFile in Java to modify the file, on one condition:
The size of each line has to be fixed; otherwise, when a new string is written back, it might overwrite the string on the next line.
Therefore, in my example, I set the line length to 100 and pad with spaces when creating the file and when writing back to it.
So, in order to allow updates, you need to set the line length a little larger than the longest line in the file.
public class RandomAccessFileUtil {
    public static final long RECORD_LENGTH = 100;
    public static final String EMPTY_STRING = " ";
    public static final String CRLF = "\n";
    public static final String PATHNAME = "/home/mjiang/JM/mahtew.txt";

    /**
     * Sample file contents:
     *   one two three
     *   Text to be appended with
     *   five six seven
     *   eight nine ten
     *
     * @param args
     * @throws IOException
     */
    public static void main(String[] args) throws IOException {
        String starPrefix = "Text to be appended with";
        String replacedString = "new text has been appended";
        RandomAccessFile file = new RandomAccessFile(new File(PATHNAME), "rw");
        String line = "";
        while ((line = file.readLine()) != null) {
            if (line.startsWith(starPrefix)) {
                // Jump back to the start of the padded record and overwrite it
                file.seek(file.getFilePointer() - RECORD_LENGTH - 1);
                file.writeBytes(replacedString);
            }
        }
        file.close();
    }

    public static void createFile() throws IOException {
        RandomAccessFile file = new RandomAccessFile(new File(PATHNAME), "rw");
        String line1 = "one two three";
        String line2 = "Text to be appended with";
        String line3 = "five six seven";
        String line4 = "eight nine ten";
        file.writeBytes(paddingRight(line1));
        file.writeBytes(CRLF);
        file.writeBytes(paddingRight(line2));
        file.writeBytes(CRLF);
        file.writeBytes(paddingRight(line3));
        file.writeBytes(CRLF);
        file.writeBytes(paddingRight(line4));
        file.writeBytes(CRLF);
        file.close();
        System.out.println(String.format("File is created in [%s]", PATHNAME));
    }

    public static String paddingRight(String source) {
        StringBuilder result = new StringBuilder(100);
        if (source != null) {
            result.append(source);
            for (int i = 0; i < RECORD_LENGTH - source.length(); i++) {
                result.append(EMPTY_STRING);
            }
        }
        return result.toString();
    }
}
If the file is large, you might want to use a stream (e.g. a FileOutputStream) for output, but the approach above seems like the simplest way to do what you're asking (and without more specifics, i.e. on what types of changes/edits/deletions you're trying to make, it's impossible to say which more complicated way might work better).
No reason to buffer the entire file.
Simply write each line as you read it; insert lines when necessary, delete lines when necessary, replace lines when necessary.
Fundamentally, you will not get around having to recreate the file wholesale, especially if it's just a text file.
What kind of data is it? Do you control the format of the file?
If the file contains name/value pairs (or similar), you could have some luck with Properties, or perhaps cobbling together something using a flat file JDBC driver.
Alternatively, have you considered not writing the data so often? Operating on an in-memory copy of your file should be relatively trivial. If there are no external resources which need real time updates of the file, then there is no need to go to disk every time you want to make a modification. You can run a scheduled task to write periodic updates to disk if you are worried about data backup.
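If the name/value route fits, java.util.Properties already covers the load/modify/store cycle; a small sketch (the timeout key is invented for illustration):

```java
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class PropsEdit {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("config", ".properties");
        Files.writeString(file, "timeout=30\n"); // seed data; the key is illustrative
        Properties props = new Properties();
        try (Reader r = Files.newBufferedReader(file)) {
            props.load(r);                        // read the whole file into memory
        }
        props.setProperty("timeout", "60");       // edit in memory
        try (Writer w = Files.newBufferedWriter(file)) {
            props.store(w, null);                 // write back to disk
        }
        System.out.print(Files.readString(file).contains("timeout=60"));
    }
}
```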
In general you cannot edit the file in place; it's simply a very long sequence of characters, which happens to include newline characters. You could edit in place if your changes don't change the number of characters in each line.
Can't you use regular expressions, if you know what you want to change? Jakarta Regexp should probably do the trick.
Although this question was posted a while ago, I think it is good to put my answer here.
I think the best approach is to use FileChannel from the java.nio.channels package in this scenario, but only if you need good performance! You get a FileChannel via a RandomAccessFile, like this:
java.nio.channels.FileChannel channel = new java.io.RandomAccessFile("/my/fyle/path", "rw").getChannel();
After this, you need to create a ByteBuffer to read from the FileChannel into. That looks something like this:
java.nio.ByteBuffer inBuffer = java.nio.ByteBuffer.allocate(100);
int pos = 0;
int aux = 0;
byte[] b;
StringBuilder sb = new StringBuilder();
while (pos != -1) {
    aux = channel.read(inBuffer, pos);
    pos = (aux != -1) ? pos + aux : -1;
    b = inBuffer.array();
    sb.delete(0, sb.length());
    for (int i = 0; i < b.length; ++i) {
        sb.append((char) b[i]);
    }
    // here you can do your stuff on sb
    inBuffer = java.nio.ByteBuffer.allocate(100);
}
Hope that my answer will help you!
I think, FileOutputStream.getFileChannel() will help a lot, see FileChannel api
http://java.sun.com/javase/6/docs/api/java/nio/channels/FileChannel.html
private static void modifyFile(String filePath, String oldString, String newString) {
    File fileToBeModified = new File(filePath);
    StringBuilder oldContent = new StringBuilder();
    try (BufferedReader reader = new BufferedReader(new FileReader(fileToBeModified))) {
        String line = reader.readLine();
        while (line != null) {
            oldContent.append(line).append(System.lineSeparator());
            line = reader.readLine();
        }
        String content = oldContent.toString();
        // Note: replaceAll treats oldString as a regular expression
        String newContent = content.replaceAll(oldString, newString);
        try (FileWriter writer = new FileWriter(fileToBeModified)) {
            writer.write(newContent);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
You can change the .txt file to a .java file by clicking "Save As" and saving it with the *.java extension.
