Write CSV file column-by-column - java

I searched for an answer to this but didn't find one. Does anyone have a solution for this kind of problem? I have a set of text variables that I have to write into a .CSV file using Java. I am currently working on a project in JavaScript that calls into Java. Here is the function I have right now; it does the job well and writes the text into the .CSV file line by line.
function writeFile(filename, data)
{
    try
    {
        // write the data
        out = new java.io.BufferedWriter(new java.io.FileWriter(filename, true));
        out.newLine();
        out.write(data);
        out.close();
        out = null;
    }
    catch (e) // catch and report any errors
    {
        alert("" + e);
    }
}
But now I have to write parts of the text one by one, like the example below.
first0,second0,third0
first1,second1,third1
first2,second2,third2
.
.
.
first9,second9,third9
So the algorithm goes like this: the function writes first0 followed by a comma, goes to the next line and writes first1, goes to the next line and writes first2, and so on until first9. After that part is done, the script goes back to the beginning of the file and writes second0 after the comma, goes to the next line and writes second1 after the comma, and so on. You get the idea.
So now I need a way to do this in Java.

You might want to consider using Super CSV to write the CSV file. As well as taking care of escaping embedded double-quotes and commas, it offers a range of writing implementations that write from arrays/Lists, Maps or even POJOs, which means you can easily try out your ideas.
If you wanted to keep it really simple, you can assemble your CSV file in a two-dimensional array. This allows you to assemble it column-first, and then write the whole thing to CSV when it's ready.
package example;

import java.io.FileWriter;
import java.io.IOException;

import org.supercsv.io.CsvListWriter;
import org.supercsv.io.ICsvListWriter;
import org.supercsv.prefs.CsvPreference;

public class ColumnFirst {

    public static void main(String[] args) {
        // you can assemble this 2D array however you want
        final String[][] csvMatrix = new String[3][3];
        csvMatrix[0][0] = "first0";
        csvMatrix[0][1] = "second0";
        csvMatrix[0][2] = "third0";
        csvMatrix[1][0] = "first1";
        csvMatrix[1][1] = "second1";
        csvMatrix[1][2] = "third1";
        csvMatrix[2][0] = "first2";
        csvMatrix[2][1] = "second2";
        csvMatrix[2][2] = "third2";
        writeCsv(csvMatrix);
    }

    private static void writeCsv(String[][] csvMatrix) {
        ICsvListWriter csvWriter = null;
        try {
            csvWriter = new CsvListWriter(new FileWriter("out.csv"),
                    CsvPreference.STANDARD_PREFERENCE);
            for (int i = 0; i < csvMatrix.length; i++) {
                csvWriter.write(csvMatrix[i]);
            }
        } catch (IOException e) {
            e.printStackTrace(); // TODO handle exception properly
        } finally {
            if (csvWriter != null) { // guard against an NPE if the writer was never created
                try {
                    csvWriter.close();
                } catch (IOException e) {
                    // ignore failure on close
                }
            }
        }
    }
}
Output:
first0,second0,third0
first1,second1,third1
first2,second2,third2

Here is my solution to the problem. Thanks to low-level random-access file mechanisms, you don't need to keep the whole data set in a buffer; you still load your records one column at a time:
package file.csv;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.Arrays;
import java.util.List;

public class CsvColumnWriter {

    public static void main(String args[]) throws Exception {
        CsvColumnWriter csvWriter = new CsvColumnWriter(new File("d:\\csv.txt"),
                new File("d:\\csv.work.txt"), 3);
        csvWriter.writeNextCol(Arrays.asList("first0", "first1", "first2"));
        csvWriter.writeNextCol(Arrays.asList("second0", "second1", "second2"));
        csvWriter.writeNextCol(Arrays.asList("third0", "third1", "third2"));
    }

    public void writeNextCol(List<String> colOfValues) throws IOException {
        // we are going to create a new target file, so we first have to
        // make a duplicate of the current version
        copyFile(targetFile, workFile);
        this.targetStream = new BufferedOutputStream(new FileOutputStream(targetFile));
        int lineNo = 0;
        for (String nextColValue : colOfValues) {
            String nextChunk = nextColValue + ",";
            // before we add the next chunk to the current line, we must retrieve
            // the line from the duplicated file based on its offset and length
            int lineOffset = findLineOffset(lineNo);
            workRndAccFile.seek(lineOffset);
            int bytesToRead = lineInBytes[lineNo];
            byte[] curLineBytes = new byte[bytesToRead];
            workRndAccFile.read(curLineBytes);
            // now we write the previous version of the line fetched from the
            // duplicated file, plus the new chunk, plus a newline character
            targetStream.write(curLineBytes);
            targetStream.write(nextChunk.getBytes());
            targetStream.write("\n".getBytes());
            // update the length of the line
            lineInBytes[lineNo] += nextChunk.getBytes().length;
            lineNo++;
        }
        // TODO: handle the case where fewer column values are provided to this
        // method than the total number of lines
        targetStream.flush();
        workFile.delete();
        firstColWritten = true;
    }

    // finds the byte offset of the given line in the duplicated file
    private int findLineOffset(int lineNo) {
        int offset = 0;
        for (int i = 0; i < lineNo; i++)
            offset += lineInBytes[i] +
                    (firstColWritten ? 1 : 0); // 1 byte is added for '\n' if at least one column has been written
        return offset;
    }

    // helper method for the file copy operation
    public static void copyFile(File from, File to) throws IOException {
        try (FileChannel in = new FileInputStream(from).getChannel();
             FileChannel out = new FileOutputStream(to).getChannel()) {
            out.transferFrom(in, 0, in.size());
        }
    }

    public CsvColumnWriter(File targetFile, File workFile, int lines) throws Exception {
        this.targetFile = targetFile;
        this.workFile = workFile;
        targetFile.createNewFile(); // ensure the target exists before the first copy
        workFile.createNewFile();
        this.workRndAccFile = new RandomAccessFile(workFile, "rw");
        lineInBytes = new int[lines]; // array elements are already zero-initialized
        firstColWritten = false;
    }

    private File targetFile;
    private File workFile;
    private int[] lineInBytes;
    private OutputStream targetStream;
    private RandomAccessFile workRndAccFile;
    private boolean firstColWritten;
}

I'm just going to assume that you have some freedom in how you fulfill this task. To my knowledge, you can't 'insert' text into a file. You can only read the file completely, change it in memory, and then write the result back to the file.
So it would be better to invert your data structure in memory and then write it out. If your data object is a matrix, just transpose it, so that it is in the format you want to write.
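A transpose helper is only a few lines. This is a sketch; the class and method names here are illustrative, not from the original post:

```java
public class Transpose {

    // Transpose a rectangular String matrix so that data assembled
    // column-first can be written out row by row.
    static String[][] transpose(String[][] m) {
        String[][] t = new String[m[0].length][m.length];
        for (int r = 0; r < m.length; r++) {
            for (int c = 0; c < m[r].length; c++) {
                t[c][r] = m[r][c];
            }
        }
        return t;
    }

    public static void main(String[] args) {
        String[][] columns = {
            { "first0", "first1", "first2" },   // first column of the CSV
            { "second0", "second1", "second2" } // second column of the CSV
        };
        for (String[] row : transpose(columns)) {
            System.out.println(String.join(",", row));
        }
        // prints:
        // first0,second0
        // first1,second1
        // first2,second2
    }
}
```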

How about this:
Scanner input = new Scanner(System.in);
String[] lines = new String[9];
java.util.Arrays.fill(lines, ""); // start empty, otherwise += prepends the literal "null"
for (int j = 0; j < 2; j++) {
    for (int i = 0; i < 9; i++) {
        lines[i] += input.nextLine() + ",";
    }
}
for (int i = 0; i < 9; i++) {
    lines[i] += input.nextLine();
}

Based on your requirements of not losing any data if an error occurs, perhaps you should rethink the design and use an embedded database (there is a discussion of the merits of various embedded databases at Embedded java databases). You would just need a single table in the database.
I suggest this because in your original question it sounds like you are trying to use a CSV file like a database where you can update the columns of any row in any order. In that case, why not just bite the bullet and use a real database?
Anyhow, once you have all the columns and rows of your table filled in, export the database to a CSV file in "text file order" row1-col1, row1-col2 ... row2-col1 etc.
If an error occurs during the building of the database, or the exporting of the CSV file at least you will still have all the data from the previous run and can try again.

Related

reading in CSV file, arrayindex out of bounds error

I'm attempting to read in a CSV file. I've created a test file with 9 entries and their values, but my code won't read past the second line; it says the file isn't found past the second key. I've tried tweaking it as much as possible. Can someone help me? Sample input would look something like this, but in a CSV file (each entry on a new line; I'm new here and still learning to edit text here):
Diego,2
Maria,2
Armando,5
Ken, 1
public static void main(String[] args) {
    HashMap<String, Integer> h = new HashMap<String, Integer>(511);
    try
    {
        Scanner readIn = new Scanner(new File("test1.csv"));
        System.out.println("I'm here 1");
        while (readIn.hasNext())
        {
            System.out.print(readIn.next()); // for testing purposes only
            System.out.println("Check 2");   // for testing purposes only
            String line = readIn.nextLine();
            String str[] = line.split(",");
            for (int i = 0; i < str.length; i++)
            {
                String k = str[0];
                int v = Integer.parseInt(str[1]);
                h.insert(k, v);
            }
            System.out.println(h.toString());
        }
        readIn.close();
    }
    catch (ArrayIndexOutOfBoundsException ob)
    {
        System.out.println(" - The file wasn't found.");
    }
    catch (FileNotFoundException e)
    {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
A call to next() or nextLine() should be preceded by a call to hasNext(). In your code you check that hasNext() returns true in the while loop, but then invoke both next() and nextLine() inside the loop, consuming two tokens for each check.
You can modify your code as below:
while (readIn.hasNext())
{
    String line = readIn.nextLine();
    String str[] = line.split(",");
    for (int i = 0; i < str.length; i++)
    {
        String k = str[0];
        int v = Integer.parseInt(str[1]);
        h.put(k, v);
    }
    System.out.println(h.toString());
}
Your for loop isn't actually serving a purpose; you will notice that you never reference i inside it. Before the fix above, you were trying to split a string that didn't contain a comma, but your code assumes one will be there, hence the out-of-bounds exception. This is also why the println() was problematic.
As far as your question about hasNext(), this is the only way you will know that you can read another line from the file. If you try to read past the end you will run into problems.
Rather than writing code to read the CSV file on your own, I'd suggest you use a standard library like Apache Commons CSV. It provides methods to deal with CSV, tab-separated files, etc.
import java.io.FileReader;
import java.util.List;

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;

public class SO35859431 {

    public static void main(String[] args) {
        String filePath = "D:\\user.csv";
        try {
            List<CSVRecord> listRecords = CSVFormat.DEFAULT.parse(new FileReader(filePath)).getRecords();
            for (CSVRecord csvRecord : listRecords) {
                /* Get record using index/position */
                System.out.println(csvRecord.get(0));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
First, there is no insert() method in the HashMap class; the correct one is put(k, v). Also, the while loop condition should be hasNext(). Following is my code, using a BufferedReader as an alternative.
package read;

import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;

/**
 * @author Isuru Rangana Jr
 */
public class Read {

    public static void main(String args[]) throws FileNotFoundException, IOException {
        HashMap<String, Integer> h = new HashMap<String, Integer>();
        try {
            BufferedReader readIn = new BufferedReader(new FileReader("test.csv"));
            while (readIn.ready()) {
                String line = readIn.readLine();
                String str[] = line.split(",");
                String k = str[0];
                int v = Integer.parseInt(str[1]);
                h.put(k, v);
                System.out.println(h.toString());
            }
            readIn.close();
        } catch (ArrayIndexOutOfBoundsException ob) {
            System.out.println(" - Malformed line in the file.");
        }
    }
}

Formatting Output to CSV File Format to Account for Empty Values In Some Rows

tl;dr: I want to output my HashMap in the same CSV file format that I am reading it in.
I want to preface this with: I may be going about this the wrong way. Because I started this thinking one thing and didn't account for this problem I'm having. So I'm trying to make a small application that will give you a random movie to watch depending on what genre you are in the mood for (similarly to Netflix's Max application, but considerably dumbed down). I have a list of movies that I'm going to format myself in CSV format, because I recently wrote some code that reads in values from a CSV file and I didn't have much to alter.
Here is the dilemma: I have read in the CSV formatted file (only a two line sample file), since I know what columns contain what I use a BufferedReader to read in line by line storing each value of the delimited value in its own ArrayList (I know there is a better way but this is what I came up with for now) I then store each ArrayList according to genre into a HashMap. Now I want to be able to write back out to the same file at some point to edit it. So movies that have been watched will be removed from the HashMap and then overwrite the file, so that when it's read back in next time the movie that was already watched will not be in the file anymore. So the difficulty I am having at this point is formatting the output back out to account for empty spaces in the actual CSV.
So for example, the test file I have only contains two lines where each movie genre has two movies except for drama and comedy. So the file looks like this
Action,Drama,Sci-fi/Fantasy,Thriller/Suspense,Comedy
Action2,,Sci-fi/Fantasy2,Thriller/Suspense2,
Just to solidify what output I want/expected; Say I watch Sci-fi/Fantasy2, it was a good movie, but it's gotta go, the output I want once removed is this
Action,Drama,Sci-fi/Fantasy,Thriller/Suspense,Comedy
Action2,,,Thriller/Suspense2,
But I know I'm not going to get those results because when I simply read the file then output it back out I get:
Action,Action2,
Drama,,
Thriller/Suspense,Thriller/Suspense2,
Comedy,,
Sci-fi/Fantasy,Sci-fi/Fantasy2,
So after getting these results I now realize I didn't plan well enough, and wonder if I'm going about this the wrong way. Does anyone know how to format in the way I described? I tried to find a solution, but after coming up empty handed I deduced that maybe the format I want goes against how a CSV should look, considering some cells in the file would have to be blank. I tried keeping any blank spaces from the file so they would go into the hashmap, but results were the same. I thought about bypassing the output file altogether, but I'm not sure how to save the values I originally read in that go into my map. Any ideas or solutions would greatly appreciated. Here is the class that does all the work:
package rngesus;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;

public class ReadWrite {

    private static final String[] GENRES = { "Action", "Drama",
            "Sci-Fi/Fantasy", "Thriller/Suspense", "Comedy" };
    private static final String NEW_LINE = "\n";
    private static final String DELIMITER = ",";
    private static final int NUM_OF_COL = 5;
    private static final int GENRE_1 = 0;
    private static final int GENRE_2 = 1;
    private static final int GENRE_3 = 2;
    private static final int GENRE_4 = 3;
    private static final int GENRE_5 = 4;

    private static String moviesFile;
    private HashMap<String, ArrayList<String>> moviesByGenre;
    ArrayList<String> actionMovies;
    ArrayList<String> dramaMovies;
    ArrayList<String> sciFiFantasyMovies;
    ArrayList<String> thrillerSuspenseMovies;
    ArrayList<String> comedyMovies;

    public ReadWrite() {
        moviesFile = "";
        moviesByGenre = new HashMap<String, ArrayList<String>>();
        actionMovies = new ArrayList<String>();
        dramaMovies = new ArrayList<String>();
        sciFiFantasyMovies = new ArrayList<String>();
        thrillerSuspenseMovies = new ArrayList<String>();
        comedyMovies = new ArrayList<String>();
    }

    public void readAndSortInputFile(String fileOfMovies) throws IOException {
        try {
            BufferedReader buffRdr = new BufferedReader(new FileReader(
                    new File(fileOfMovies)));
            String line = "";
            while ((line = buffRdr.readLine()) != null) {
                String[] lnPtr = line.split(",", NUM_OF_COL);
                int diff = Math.min(lnPtr.length, NUM_OF_COL);
                for (int i = 0; i < diff; i++) {
                    if ((i == GENRE_1) && !lnPtr[i].isEmpty()) {
                        actionMovies.add(lnPtr[i]);
                    } else if ((i == GENRE_2) && !lnPtr[i].isEmpty()) {
                        dramaMovies.add(lnPtr[i]);
                    } else if ((i == GENRE_3) && !lnPtr[i].isEmpty()) {
                        sciFiFantasyMovies.add(lnPtr[i]);
                    } else if ((i == GENRE_4) && !lnPtr[i].isEmpty()) {
                        thrillerSuspenseMovies.add(lnPtr[i]);
                    } else if ((i == GENRE_5) && !lnPtr[i].isEmpty()) {
                        comedyMovies.add(lnPtr[i]);
                    }
                }
            }
            buffRdr.close();
            moviesFile = fileOfMovies;
        } catch (FileNotFoundException err) {
            err.printStackTrace();
            System.out.println("Error: Unable to locate file specified");
        }
    }

    public void mapMoviesToGenre() {
        moviesByGenre.put(GENRES[GENRE_1], actionMovies);
        moviesByGenre.put(GENRES[GENRE_2], dramaMovies);
        moviesByGenre.put(GENRES[GENRE_3], sciFiFantasyMovies);
        moviesByGenre.put(GENRES[GENRE_4], thrillerSuspenseMovies);
        moviesByGenre.put(GENRES[GENRE_5], comedyMovies);
    }

    public void initMapToOutput() {
        mapMoviesToGenre();
        FileWriter fileWriter = null;
        try {
            fileWriter = new FileWriter(moviesFile);
            fileWriter.append(NEW_LINE);
            for (ArrayList<String> moviesInGenre : moviesByGenre.values()) {
                for (String movie : moviesInGenre) {
                    fileWriter.append(movie);
                    fileWriter.append(DELIMITER);
                }
                fileWriter.append(NEW_LINE);
            }
        } catch (Exception err) {
            err.printStackTrace();
        } finally {
            try {
                fileWriter.flush();
                fileWriter.close();
            } catch (IOException err) {
                err.printStackTrace();
            }
        }
    }
}
I don't think that code is formatted very well, but I have some ignorance there; feel free to edit away. I'm already aware that there must be a better way to store the values read in from the file, but for now I'm focused on solving this other dilemma. If anyone is willing to point anything else out, feel free.
Okay, so I worked through it, for whom it may concern: I went ahead and grouped the movies line by line as opposed to column by column. All I had to do was change the while part of my code to this:
int index = 0;
while ((line = buffRdr.readLine()) != null) {
    String[] lnPtr = line.split(",", NUM_OF_COL);
    int diff = Math.min(lnPtr.length, NUM_OF_COL);
    for (int i = 0; i < diff; i++) {
        if ((index == GENRE_1) && !lnPtr[i].isEmpty()) {
            actionMovies.add(lnPtr[i]);
        } else if ((index == GENRE_2) && !lnPtr[i].isEmpty()) {
            dramaMovies.add(lnPtr[i]);
        } else if ((index == GENRE_3) && !lnPtr[i].isEmpty()) {
            sciFiFantasyMovies.add(lnPtr[i]);
        } else if ((index == GENRE_4) && !lnPtr[i].isEmpty()) {
            thrillerSuspenseMovies.add(lnPtr[i]);
        } else if ((index == GENRE_5) && !lnPtr[i].isEmpty()) {
            comedyMovies.add(lnPtr[i]);
        }
    }
    index++;
}
And I now get movies of the same genre on the same line every time, although sometimes in a different order, which I assume is due to going into and out of the HashMap. If anyone knows a cleaner, more efficient way to do this, feel free to share, but I solved my problem for now.
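The varying order is expected: HashMap makes no guarantee about iteration order. A LinkedHashMap iterates in insertion order, so swapping it in would make the output stable. A minimal illustration (the map contents here are invented for the example):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OrderDemo {

    public static void main(String[] args) {
        // LinkedHashMap returns entries in the order the keys were put(),
        // so rows come back out in a stable, predictable order.
        Map<String, String> genres = new LinkedHashMap<>();
        genres.put("Action", "Action2");
        genres.put("Drama", "");
        genres.put("Sci-Fi/Fantasy", "Sci-fi/Fantasy2");
        for (Map.Entry<String, String> e : genres.entrySet()) {
            System.out.println(e.getKey() + "," + e.getValue());
        }
        // Action always prints first, then Drama, then Sci-Fi/Fantasy.
    }
}
```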

Reading data from a file in Java

So I have a background in C++ and I am trying to learn Java. Everything is pretty similar. I am having a problem, though, with file I/O. I am messing around and writing really simple programs to get the basic ideas. Here is my code to read data from a file. I am reading Core Java Volume 1 by Cay Horstmann, and it tells me to write this to read from a file:
Scanner in = new Scanner(Paths.get("myFile.txt"));
But when I write it in my code, it gives me a red line under Paths. So I am not sure how to read from a file; the book does not go into much detail about the subject. In my program below I am trying to just read numbers in from a file and store them in an array.
package practice.with.arrays.and.io;

import java.io.IOException;
import java.nio.file.Path;
import java.util.*;

public class PracticeWithArraysAndIO
{
    static final int TEN = 10;

    public static void main(String[] args) throws IOException
    {
        // Declaring a scanner object to read in data
        Scanner in = new Scanner(Paths.get("myFile.txt"));
        // Declaring an array to store the data from the file
        int[] arrayOfInts = new int[TEN];
        // Local variable to store data in from the file
        int data = 0;
        try
        {
            for (int i = 0; i < TEN; i++)
            {
                data = in.nextInt();
                arrayOfInts[i] = data;
            }
        }
        finally
        {
            in.close();
        }
    }
}
It is not clear why you are doing Paths.get(filename)).
You can wrap a Scanner around a file like this. As the comments below mention, you should choose an appropriate charset for your file.
Scanner in = new Scanner(new File("myFile.txt"), StandardCharsets.UTF_8.name());
To use the constant above, you need the following import, and Java 7 (the Scanner constructor takes the charset name as a String):
import java.nio.charset.StandardCharsets;
With my experience in Java, I've used the BufferedReader class for reading a text file instead of the Scanner. I usually reserve the Scanner class for user input in a terminal. Perhaps you could try this method out.
Create a BufferedReader with FileReader like so:
BufferedReader buffReader = new BufferedReader(new FileReader("myFile.txt"));
After setting this up, you can read lines with:
stringName = buffReader.readLine();
This example will set the String, stringName, to the first line in your document. To continue reading more lines, you'll need to create a loop.
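Such a loop typically reads until readLine() returns null, which signals the end of the file. A small self-contained sketch (the file name is just a placeholder):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadAllLines {

    public static void main(String[] args) throws IOException {
        // try-with-resources closes the reader even if an exception is thrown
        try (BufferedReader buffReader = new BufferedReader(new FileReader("myFile.txt"))) {
            String line;
            // readLine() returns null at end of file, which ends the loop
            while ((line = buffReader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```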
You need to import java.nio.file.Paths.
I've used the BufferedReader class.
I hope it is helpful for you
public class PracticeWithArraysAndIO {

    static final int TEN = 10;

    public static void main(String[] args) throws IOException
    {
        BufferedReader br = null;
        try {
            br = new BufferedReader(new FileReader("/home/myFile.txt")); // input your file path
            int value = 0;
            int[] arrayOfInts = new int[TEN];
            int i = 0;
            while ((value = br.read()) != -1)
            {
                if (i == TEN) // if out of index, break
                    break;
                char c = (char) value;                      // convert value to char
                int number = Character.getNumericValue(c);  // convert char to int
                arrayOfInts[i] = number;                    // insert number into array
                i++;
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (br != null)
                br.close(); // close the buffer
        }
    }
}

Converting an CSV file to a JSON object in Java

Is there an open source java library to convert a CSV (or XLS) file to a JSON object?
I tried using the CDL class from the org.json library, but somehow it does not seem to work for large CSV strings.
I'm trying to find something like http://www.cparker15.com/code/utilities/csv-to-json/, but written in Java.
You can use Open CSV to map CSV to a Java Bean, and then use JAXB to convert the Java Bean into a JSON object.
http://opencsv.sourceforge.net/#javabean-integration
http://jaxb.java.net/guide/Mapping_your_favorite_class.html
Here is my Java program and hope somebody finds it useful.
Format needs to be like this:
"SYMBOL,DATE,CLOSE_PRICE,OPEN_PRICE,HIGH_PRICE,LOW_PRICE,VOLUME,ADJ_CLOSE
AAIT,2015-02-26 00:00:00.000,-35.152,0,35.152,35.12,679,0
AAL,2015-02-26 00:00:00.000,49.35,50.38,50.38,49.02,7572135,0"
First line is the column headers. No quotation marks anywhere. Separate with commas, not semicolons. You get the idea.
/* Summary: Converts a CSV file to a JSON file. */
import java.io.*;
import javax.swing.*;
import javax.swing.filechooser.FileNameExtensionFilter;

public class CSVtoJSON extends JFrame {

    private static final long serialVersionUID = 1L;
    private static File CSVFile;
    private static BufferedReader read;
    private static BufferedWriter write;

    public CSVtoJSON() {
        FileNameExtensionFilter filter = new FileNameExtensionFilter("comma separated values", "csv");
        JFileChooser choice = new JFileChooser();
        choice.setFileFilter(filter); // limit the files displayed
        int option = choice.showOpenDialog(this);
        if (option == JFileChooser.APPROVE_OPTION) {
            CSVFile = choice.getSelectedFile();
        } else {
            JOptionPane.showMessageDialog(this, "Did not select file. Program will exit.",
                    "System Dialog", JOptionPane.PLAIN_MESSAGE);
            System.exit(1);
        }
    }

    public static void main(String args[]) {
        CSVtoJSON parse = new CSVtoJSON();
        parse.convert();
        System.exit(0);
    }

    private void convert() {
        /* Converts a .csv file to .json. Assumes the first line is a header with column names. */
        try {
            read = new BufferedReader(new FileReader(CSVFile));
            String outputName = CSVFile.toString().substring(0,
                    CSVFile.toString().lastIndexOf(".")) + ".json";
            write = new BufferedWriter(new FileWriter(new File(outputName)));
            String line;
            String columns[]; // contains column names
            int num_cols;
            String tokens[];
            int progress = 0; // check progress

            // initialize columns
            line = read.readLine();
            columns = line.split(",");
            num_cols = columns.length;

            write.write("["); // begin file as array
            line = read.readLine();
            while (true) {
                tokens = line.split(",");
                if (tokens.length == num_cols) { // if number of entries equals number of columns
                    write.write("{");
                    for (int k = 0; k < num_cols; ++k) { // for each column
                        if (tokens[k].matches("^-?[0-9]*\\.?[0-9]*$")) { // if a number
                            write.write("\"" + columns[k] + "\": " + tokens[k]);
                            if (k < num_cols - 1) write.write(", ");
                        } else { // if a string
                            write.write("\"" + columns[k] + "\": \"" + tokens[k] + "\"");
                            if (k < num_cols - 1) write.write(", ");
                        }
                    }
                    ++progress; // progress update
                    if (progress % 10000 == 0) System.out.println(progress); // print progress
                    if ((line = read.readLine()) != null) { // if not last line
                        write.write("},");
                        write.newLine();
                    } else {
                        write.write("}]"); // if last line
                        write.newLine();
                        break;
                    }
                } else {
                    //line = read.readLine(); // read the next line if you wish to continue parsing despite the error
                    JOptionPane.showMessageDialog(this, "ERROR: Formatting error on line " + (progress + 2)
                            + ". Failed to parse.",
                            "System Dialog", JOptionPane.PLAIN_MESSAGE);
                    System.exit(-1); // error message
                }
            }
            JOptionPane.showMessageDialog(this, "File converted successfully to " + outputName,
                    "System Dialog", JOptionPane.PLAIN_MESSAGE);
            write.close();
            read.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Requires Swing but comes with a nifty little GUI so those who know absolutely no Java can use it once packaged into an executable .jar. Feel free to improve upon it. Thank you StackOverflow for helping me out all these years.
@Mouscellaneous basically answered this for you, so please give him the credit.
Here is what I came up with:
package edu.apollogrp.csvtojson;

import au.com.bytecode.opencsv.bean.CsvToBean;
import au.com.bytecode.opencsv.bean.HeaderColumnNameMappingStrategy;
import org.codehaus.jackson.map.ObjectMapper;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.List;

public class ConvertCsvToJson {

    public static void main(String[] args) throws IOException, ClassNotFoundException {
        if (args.length > 1) {
            String pathToCsvFile = args[0];
            String javaBeanClassName = "edu.apollogrp.csvtojson.bean." + args[1];

            final File file = new File(pathToCsvFile);
            if (!file.exists()) {
                System.out.println("The file you specified does not exist. path=" + pathToCsvFile);
            }

            Class<?> type = null;
            try {
                type = Class.forName(javaBeanClassName);
            } catch (ClassNotFoundException e) {
                System.out.println("The java bean you specified does not exist. className=" + javaBeanClassName);
            }

            HeaderColumnNameMappingStrategy strat = new HeaderColumnNameMappingStrategy();
            strat.setType(type);
            CsvToBean csv = new CsvToBean();
            List list = csv.parse(strat, new InputStreamReader(new FileInputStream(file)));
            System.out.println(new ObjectMapper().writeValueAsString(list));
        } else {
            System.out.println("Please specify the path to the csv file.");
        }
    }
}
I used maven to include the dependencies, but you could also download them manually and include them in your classpath.
<dependency>
    <groupId>net.sf.opencsv</groupId>
    <artifactId>opencsv</artifactId>
    <version>2.0</version>
</dependency>
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-mapper-asl</artifactId>
    <version>1.9.12</version>
</dependency>
<dependency>
    <groupId>org.codehaus.jackson</groupId>
    <artifactId>jackson-core-asl</artifactId>
    <version>1.9.12</version>
</dependency>
I have used an Excel file in this code; you can use CSV. I wrote this class for a particular Excel/CSV format that is known to me.
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintStream;
import java.util.ArrayList;

import org.json.JSONObject;

import jxl.Cell;
import jxl.Sheet;
import jxl.Workbook;
import jxl.read.biff.BiffException;

public class ReadExcel {

    private String inputFile;

    public void setInputFile(String inputFile) {
        this.inputFile = inputFile;
    }

    public void read() throws IOException {
        File inputWorkbook = new File(inputFile);
        Workbook w;
        try {
            w = Workbook.getWorkbook(inputWorkbook);
            // Get the first sheet
            Sheet sheet = w.getSheet(0);
            // Loop over all columns and rows (skipping the header row)
            int columns = sheet.getColumns();
            int rows = sheet.getRows();
            // Contact and ContactList are the author's own bean classes (not shown)
            ContactList clist = new ContactList();
            ArrayList<Contact> contacts = new ArrayList<Contact>();
            for (int j = 1; j < rows; j++) {
                Contact contact = new Contact();
                for (int i = 0; i < columns; i++) {
                    Cell cell = sheet.getCell(i, j);
                    switch (i) {
                        case 0:
                            if (!cell.getContents().equalsIgnoreCase("")) {
                                contact.setSrNo(Integer.parseInt(cell.getContents()));
                            } else {
                                contact.setSrNo(j);
                            }
                            break;
                        case 1:
                            contact.setName(cell.getContents());
                            break;
                        case 2:
                            contact.setAddress(cell.getContents());
                            break;
                        case 3:
                            contact.setCity(cell.getContents());
                            break;
                        case 4:
                            contact.setContactNo(cell.getContents());
                            break;
                        case 5:
                            contact.setCategory(cell.getContents());
                            break;
                    }
                }
                contacts.add(contact);
            }
            System.out.println("done");
            clist.setContactList(contacts);
            JSONObject jsonlist = new JSONObject(clist);
            File f = new File("/home/vishal/Downloads/database.txt");
            FileOutputStream fos = new FileOutputStream(f, true);
            PrintStream ps = new PrintStream(fos);
            ps.append(jsonlist.toString());
            ps.close();
        } catch (BiffException e) {
            e.printStackTrace();
            System.out.println("error");
        }
    }

    public static void main(String[] args) throws IOException {
        ReadExcel test = new ReadExcel();
        test.setInputFile("/home/vishal/Downloads/database.xls");
        test.read();
    }
}
I have used jxl.jar for reading the Excel file.
If your CSV is simple, this is easy to write by hand, but CSV can include nasty edge cases with quoting, missing values, etc.

- load the file using BufferedReader.readLine()
- use String.split(",") to get the values from each line; NB this approach will only work correctly if your values don't have commas in them!
- write each value to the output using BufferedWriter, with the necessary JSON braces and quoting

You might want to use a CSV library, then convert to JSON 'by hand'.
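A minimal by-hand version along those lines (no quote handling, per the caveat above; class and method names are illustrative) might look like:

```java
import java.util.Arrays;
import java.util.List;

public class SimpleCsvToJson {

    // Naive conversion: the first line is the header, and values are
    // assumed to contain no commas or quotes.
    static String toJson(List<String> csvLines) {
        String[] headers = csvLines.get(0).split(",");
        StringBuilder sb = new StringBuilder("[");
        for (int i = 1; i < csvLines.size(); i++) {
            String[] values = csvLines.get(i).split(",");
            sb.append(i > 1 ? ",{" : "{");
            for (int j = 0; j < headers.length; j++) {
                if (j > 0) sb.append(",");
                sb.append("\"").append(headers[j]).append("\":\"")
                  .append(j < values.length ? values[j] : "").append("\"");
            }
            sb.append("}");
        }
        return sb.append("]").toString();
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("name,score", "Diego,2", "Maria,2");
        System.out.println(toJson(lines));
        // prints: [{"name":"Diego","score":"2"},{"name":"Maria","score":"2"}]
    }
}
```

Every value is emitted as a JSON string here; detecting numbers (as the Swing example above does with a regex) is left out for brevity.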
Here is a class I generated to return JSONArray, not just to print to a file.
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;
import org.json.simple.JSONArray;
import org.json.simple.parser.JSONParser;

import java.io.File;
import java.util.List;
import java.util.Map;

public class CsvToJson {

    public static JSONArray convert(File input) throws Exception {
        JSONParser parser = new JSONParser();

        CsvSchema csvSchema = CsvSchema.builder().setUseHeader(true).build();
        CsvMapper csvMapper = new CsvMapper();

        // Read data from the CSV file
        List<? extends Object> readAll = csvMapper.readerFor(Map.class).with(csvSchema).readValues(input).readAll();

        ObjectMapper mapper = new ObjectMapper();
        JSONArray jsonArray = (JSONArray) parser.parse(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(readAll));
        System.out.print(jsonArray.toString());
        return jsonArray;
    }
}
With Java 8, writing JSON is at hand.
You didn't specify what JSON API you want, so I assume by "JSON object" you mean a string with a serialized JSON object.
What I did in the CSV Cruncher project:

- Load the CSV using HSQLDB. That's a relatively small (~2 MB) library which actually implements a SQL 2008 database.
- Query that database using JDBC.
- Build a JDK JSON object (javax.json.JsonObject) and serialize it.
Here's how to do it:
static void convertResultToJson(ResultSet resultSet, Path destFile, boolean printAsArray) throws IOException, SQLException
{
    try (OutputStream outS = new BufferedOutputStream(new FileOutputStream(destFile.toFile()));
         Writer outW = new OutputStreamWriter(outS, StandardCharsets.UTF_8))
    {
        ResultSetMetaData metaData = resultSet.getMetaData();

        // javax.json way
        JsonObjectBuilder builder = Json.createObjectBuilder();

        // Columns
        for (int colIndex = 1; colIndex <= metaData.getColumnCount(); colIndex++) {
            addTheRightTypeToJavaxJsonBuilder(resultSet, colIndex, builder);
        }
        JsonObject jsonObject = builder.build();
        JsonWriter writer = Json.createWriter(outW);
        writer.writeObject(jsonObject);
    }
}
The whole implementation is here. (Originally I wrote my own CSV parsing and JSON writing, but figured out both are complicated enough to reach for a tested off-the-shelf library.)
If you're using Java 8, you can do something like this. No libraries or complicated logic required.
Firstly, create a POJO representing your new JSON object. In my example it's called 'YourJSONObject' and has a constructor taking two strings.
The code first reads the file, then creates a stream of String-based lines (each line in the stream corresponds to a line in your CSV file).
We then pass each line into the map function, which splits it on a comma and creates a YourJSONObject.
All of these objects are collected into a list, which we pass to the JSONArray constructor.
You now have an array of JSONObjects. You can call toString() on this object if you want to see its text representation.
JSONArray objects = new JSONArray(Files.readAllLines(Paths.get("src/main/resources/your_csv_file.csv"))
        .stream()
        .map(s -> s.split(","))
        .map(fields -> new YourJSONObject(fields[0], fields[1]))
        .collect(Collectors.toList()));
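The POJO the answer refers to isn't shown; a minimal version might look like this (the field names are my invention - org.json derives the JSON keys from the getter names via reflection):

```java
public class YourJSONObject {

    private final String firstField;    // hypothetical field names
    private final String secondField;

    public YourJSONObject(String firstField, String secondField) {
        this.firstField = firstField;
        this.secondField = secondField;
    }

    // org.json's JSONArray/JSONObject read these getters when serializing
    public String getFirstField() { return firstField; }

    public String getSecondField() { return secondField; }
}
```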
Old post, but I thought I'd share my own solution. It assumes quotation marks are used around any value containing a comma, and it removes all quotation marks afterwards.
This method accepts a String in CSV format, so it assumes you've already read the CSV file into a string. Make sure you didn't remove the newline characters ('\n') while reading.
This method is in no way perfect, but it might be the quick one-method solution in pure Java you are looking for.
public String CSVtoJSON(String output) {
    String[] lines = output.split("\n");
    StringBuilder builder = new StringBuilder();
    builder.append('[');
    String[] headers = new String[0];
    //CSV TO JSON
    for (int i = 0; i < lines.length; i++) {
        // replace commas outside quotes with a placeholder (assumed absent
        // from the data), strip the quotes, then split on the placeholder
        String[] values = lines[i]
                .replaceAll(",(?=(?:[^\"]*\"[^\"]*\")*[^\"]*$)", "۞")
                .replaceAll("\"", "")
                .split("۞");
        if (i == 0) { //INDEX LIST
            headers = values;
        } else {
            builder.append('{');
            for (int j = 0; j < values.length && j < headers.length; j++) {
                String jsonvalue = "\"" + headers[j] + "\":\"" + values[j] + "\"";
                if (j != values.length - 1) { //if not last value of values...
                    jsonvalue += ',';
                }
                builder.append(jsonvalue);
            }
            builder.append('}');
            if (i != lines.length - 1) {
                builder.append(',');
            }
        }
    }
    builder.append(']');
    return builder.toString();
}

Modify a .txt file in Java

I have a text file that I want to edit using Java. It has many thousands of lines. I basically want to iterate through the lines and change/edit/delete some text. This will need to happen quite often.
From the solutions I saw on other sites, the general approach seems to be:
Open the existing file using a BufferedReader
Read each line, make modifications to each line, and add it to a StringBuilder
Once all the text has been read and modified, write the contents of the StringBuilder to a new file
Replace the old file with the new file
This solution seems slightly "hacky" to me, especially if I have thousands of lines in my text file.
Anybody know of a better solution?
I haven't done this in Java recently, but reading an entire file into memory seems like a bad idea.
The best idea that I can come up with is open a temporary file in writing mode at the same time, and for each line, read it, modify if necessary, then write into the temporary file. At the end, delete the original and rename the temporary file.
If you have modify permissions on the file system, you probably also have deleting and renaming permissions.
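A sketch of that temp-file approach (the class name, and the convention of returning null to delete a line, are mine):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.function.UnaryOperator;

public class LineRewriter {

    // Streams the source file line by line into a temp file in the same
    // directory, applying edit to each line (return null to delete the
    // line), then replaces the original with the temp file.
    public static void rewrite(Path source, UnaryOperator<String> edit) throws IOException {
        Path temp = Files.createTempFile(source.getParent(), "edit", ".tmp");
        try (BufferedReader in = Files.newBufferedReader(source, StandardCharsets.UTF_8);
             BufferedWriter out = Files.newBufferedWriter(temp, StandardCharsets.UTF_8)) {
            String line;
            while ((line = in.readLine()) != null) {
                String edited = edit.apply(line);
                if (edited != null) {          // null means "delete this line"
                    out.write(edited);
                    out.newLine();
                }
            }
        }
        Files.move(temp, source, StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Only one line is held in memory at a time, so this works for files of any size.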
If the file is just a few thousand lines, you should be able to read the entire file in one go and convert it to a String.
You can use Apache Commons IO (e.g. FileUtils.readFileToString), or roll your own helpers like the following.
public static String readFile(String filename) throws IOException {
    File file = new File(filename);
    byte[] bytes = new byte[(int) file.length()];
    try (FileInputStream fis = new FileInputStream(file)) {
        // a single read() may return fewer bytes than requested;
        // a loop (or DataInputStream.readFully) would be more robust
        int read = fis.read(bytes);
        if (read != bytes.length) {
            throw new IOException("Expected " + bytes.length + " bytes, read " + read);
        }
    }
    return new String(bytes, "UTF-8");
}

public static void writeFile(String filename, String text) throws IOException {
    try (FileOutputStream fos = new FileOutputStream(filename)) {
        fos.write(text.getBytes("UTF-8"));
    }
}
You can use RandomAccessFile in Java to modify the file, on one condition:
the size of each line has to be fixed; otherwise, when the new string is written back, it might overwrite the string in the next line.
Therefore, in my example, I set the line length to 100 and pad with spaces when creating the file and when writing back to the file.
So in order to allow updates, you need to set the line length a little larger than the longest line in the file.
public class RandomAccessFileUtil {

    public static final long RECORD_LENGTH = 100;
    public static final String EMPTY_STRING = " ";
    public static final String CRLF = "\n";
    public static final String PATHNAME = "/home/mjiang/JM/mahtew.txt";

    /**
     * Example file content:
     *   one two three
     *   Text to be appended with
     *   five six seven
     *   eight nine ten
     *
     * @param args
     * @throws IOException
     */
public static void main(String[] args) throws IOException
{
    String starPrefix = "Text to be appended with";
    String replacedString = "new text has been appended";
    RandomAccessFile file = new RandomAccessFile(new File(PATHNAME), "rw");
    String line = "";
    while ((line = file.readLine()) != null)
    {
        if (line.startsWith(starPrefix))
        {
            // seek back over the line just read (RECORD_LENGTH chars + "\n")
            file.seek(file.getFilePointer() - RECORD_LENGTH - 1);
            file.writeBytes(replacedString);
        }
    }
    file.close();
}
public static void createFile() throws IOException
{
    RandomAccessFile file = new RandomAccessFile(new File(PATHNAME), "rw");
    String line1 = "one two three";
    String line2 = "Text to be appended with";
    String line3 = "five six seven";
    String line4 = "eight nine ten";
    file.writeBytes(paddingRight(line1));
    file.writeBytes(CRLF);
    file.writeBytes(paddingRight(line2));
    file.writeBytes(CRLF);
    file.writeBytes(paddingRight(line3));
    file.writeBytes(CRLF);
    file.writeBytes(paddingRight(line4));
    file.writeBytes(CRLF);
    file.close();
    System.out.println(String.format("File is created in [%s]", PATHNAME));
}

public static String paddingRight(String source)
{
    StringBuilder result = new StringBuilder((int) RECORD_LENGTH);
    if (source != null)
    {
        result.append(source);
        for (int i = 0; i < RECORD_LENGTH - source.length(); i++)
        {
            result.append(EMPTY_STRING);
        }
    }
    return result.toString();
}
}
If the file is large, you might want to use a FileOutputStream for the output, but that seems pretty much like the simplest process to do what you're asking (and without more specificity, i.e. on what types of changes/edits/deletions you're trying to do, it's impossible to determine what more complicated approach might work).
No reason to buffer the entire file.
Simply write each line as your read it, insert lines when necessary, delete lines when necessary, replace lines when necessary.
Fundamentally, you will not get around having to recreate the file wholesale, especially if it's just a text file.
What kind of data is it? Do you control the format of the file?
If the file contains name/value pairs (or similar), you could have some luck with Properties, or perhaps cobbling together something using a flat file JDBC driver.
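For the name/value case, java.util.Properties already handles the load-modify-store cycle; a sketch (the file name and keys here are invented):

```java
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.util.Properties;

public class PropsEdit {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        File file = new File("settings.properties");    // hypothetical file
        if (file.exists()) {
            try (Reader in = new FileReader(file)) {
                props.load(in);                          // read existing pairs
            }
        }
        props.setProperty("lastRun", "2024-01-01");      // edit in memory
        props.remove("obsoleteKey");                     // delete a pair
        try (Writer out = new FileWriter(file)) {
            props.store(out, "updated by PropsEdit");    // rewrite the file
        }
    }
}
```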
Alternatively, have you considered not writing the data so often? Operating on an in-memory copy of your file should be relatively trivial. If there are no external resources which need real time updates of the file, then there is no need to go to disk every time you want to make a modification. You can run a scheduled task to write periodic updates to disk if you are worried about data backup.
In general you cannot edit the file in place; it's simply a very long sequence of characters, which happens to include newline characters. You could edit in place if your changes don't change the number of characters in each line.
Can't you use regular expressions, if you know what you want to change? Jakarta Regexp should probably do the trick.
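Jakarta Regexp is long obsolete; the same idea works with the built-in java.util.regex package (the pattern and replacement here are just examples):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RegexEdit {

    // Replaces every ISO date (yyyy-MM-dd) in the text with "REDACTED".
    public static String redactDates(String text) {
        Pattern iso = Pattern.compile("\\d{4}-\\d{2}-\\d{2}");
        Matcher m = iso.matcher(text);
        return m.replaceAll("REDACTED");
    }
}
```

Apply this per line while streaming the file through a reader/writer pair, and you avoid holding the whole file in memory.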
Although this question was posted a while ago, I think it is good to put my answer here.
I think the best approach is to use FileChannel from the java.nio.channels package in this scenario - but only if you need good performance! You get a FileChannel via a RandomAccessFile, like this:
java.nio.channels.FileChannel channel = new java.io.RandomAccessFile("/my/fyle/path", "rw").getChannel();
After this, you need to create a ByteBuffer that you will read into from the FileChannel.
That looks something like this:
java.nio.ByteBuffer inBuffer = java.nio.ByteBuffer.allocate(100);
int pos = 0;
int aux = 0;
StringBuilder sb = new StringBuilder();
byte[] b;
while (pos != -1) {
    aux = channel.read(inBuffer, pos);
    pos = (aux != -1) ? pos + aux : -1;
    b = inBuffer.array();
    sb.delete(0, sb.length());
    // only the first aux bytes are valid; note this byte-to-char cast
    // assumes a single-byte charset
    for (int i = 0; i < aux; ++i) {
        sb.append((char) b[i]);
    }
    //here you can do your stuff on sb
    inBuffer = java.nio.ByteBuffer.allocate(100);
}
Hope that my answer will help you!
I think FileOutputStream.getChannel() will help a lot; see the FileChannel API:
http://java.sun.com/javase/6/docs/api/java/nio/channels/FileChannel.html
private static void modifyFile(String filePath, String oldString, String newString) {
    File fileToBeModified = new File(filePath);
    StringBuilder oldContent = new StringBuilder();
    try (BufferedReader reader = new BufferedReader(new FileReader(fileToBeModified))) {
        String line = reader.readLine();
        while (line != null) {
            oldContent.append(line).append(System.lineSeparator());
            line = reader.readLine();
        }
        String content = oldContent.toString();
        // use replace() rather than replaceAll() so oldString is treated
        // as a literal string, not a regular expression
        String newContent = content.replace(oldString, newString);
        try (FileWriter writer = new FileWriter(fileToBeModified)) {
            writer.write(newContent);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
You can change the .txt file to .java by clicking "Save As" and saving it with a *.java extension.
