Modify a CSV file with OpenCSV, Spring Boot, extra quote (") in the result - java

I want to modify a column of a CSV file with OpenCSV.
In the result I get an extra quote ("). How can I get rid of it?
Example:
Input file: "Order ID,Customer Name, 123, John"
Output file: ""Order ID,Customer Name, 123, Chris""
public static byte[] updateCSV(String fileToUpdate) throws IOException {
    final byte[] fallback = {};
    // Read existing file
    CSVReader reader = new CSVReader(new StringReader(fileToUpdate));
    List<String[]> csvBody = reader.readAll();
    // Walk each row and modify the second column
    for (int i = 1; i < csvBody.size(); i++) {
        csvBody.get(i)[1] = "-" + csvBody.get(i)[1];
    }
    reader.close();
    // Write back as CSV
    try (StringWriter writer = new StringWriter();
         CSVWriter csvWriter = new CSVWriter(writer)
    ) {
        csvWriter.writeAll(csvBody);
        csvWriter.flush();
        return writer.toString().getBytes();
    } catch (Exception e) {
        LOGGER.error(LogUtil.systemLoggingContext(), "Cannot modify the current CSV file");
        return fallback;
    }
}
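A likely cause, not confirmed by the question, is that CSVWriter wraps every field in quotes by default, so input that already carries quotes comes out with extra ones. A minimal sketch of the write-back step with default quoting turned off (only safe if the fields contain no separators that would otherwise need quoting):
try (StringWriter writer = new StringWriter();
     CSVWriter csvWriter = new CSVWriter(writer,
             CSVWriter.DEFAULT_SEPARATOR,
             CSVWriter.NO_QUOTE_CHARACTER,
             CSVWriter.DEFAULT_ESCAPE_CHARACTER,
             CSVWriter.DEFAULT_LINE_END)
) {
    // Same write-back as above, but fields are emitted without surrounding quotes.
    csvWriter.writeAll(csvBody);
    csvWriter.flush();
    return writer.toString().getBytes();
}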

Related

How to write a column on the same CSV file with OpenCSV, even when the file is empty?

I have to write code that writes a column (a String[] array of values) and the header of that column into a CSV file. It has to write the column into an output .csv file in both cases: when the file doesn't exist yet and when a file with that name already exists. Of course I want to read and write the same file.
Here's the code:
public void afegir_columna_csv(String file, String header, String[] contingut) {
    try {
        FileWriter filewriter = new FileWriter(file, true);
        CSVWriter csvWriter = new CSVWriter(filewriter);
        FileReader filereader = new FileReader(file);
        CSVReader csvReader = new CSVReader(filereader);
        String head = header;
        String[] values = contingut;
        String[] entries = null;
        // Adding the header part:
        String[] H = csvReader.readNext();
        ArrayList listH = new ArrayList(Arrays.asList(H));
        listH.add(head);
        csvWriter.writeNext((String[]) listH.toArray());
        // Adding the values:
        int i = 0;
        while ((entries = csvReader.readNext()) != null) {
            ArrayList list = new ArrayList(Arrays.asList(entries));
            list.add(values[i]);
            csvWriter.writeNext((String[]) list.toArray());
        }
        csvWriter.close();
    }
    catch (Exception e) {
        e.printStackTrace();
    }
}
I've been testing the code and this is what happens in both cases:
1. If the file exists but is empty: it doesn't write the first column.
2. If the file already exists and has columns in it: it throws a cast exception (where I cast (String[]) list.toArray()).
Any ideas how to do this properly?
Thanks!
Here's the error I get in test number 2:
java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.lang.String; ([Ljava.lang.Object; and [Ljava.lang.String; are in module java.base of loader 'bootstrap')
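For reference, that ClassCastException happens because the no-argument toArray() returns an Object[], which cannot be cast to String[]. A minimal sketch of the change inside the loop above, using the typed overload instead:
// Sketch: let toArray produce a String[] directly instead of casting an Object[].
List<String> list = new ArrayList<>(Arrays.asList(entries));
list.add(values[i]);
csvWriter.writeNext(list.toArray(new String[0]));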
After a while analysing the problem I've reached a solution that works. I'm not sure it's very efficient, but it does its job. I had to change the way I pass the information for a file, from header and content to content only.
Here's the code:
public void add_column_csv(String[] contingut) {
    try {
        String filename = "output_preproces.csv";
        File _file = new File(filename);
        if (_file.createNewFile()) { // If the file doesn't exist, create it and add the first column
            FileWriter filewriter = new FileWriter(filename, true);
            CSVWriter csvWriter = new CSVWriter(filewriter);
            List<String[]> data = new ArrayList<String[]>();
            int i = 0;
            while (i < contingut.length) {
                data.add(new String[] {contingut[i]});
                i++;
            }
            csvWriter.writeAll(data);
            csvWriter.close();
        }
        else { // If the file already exists, add a column to it:
            FileReader filereader = new FileReader(filename);
            CSVReader csvReader = new CSVReader(filereader);
            List<String[]> data = new ArrayList<String[]>();
            int i = 0;
            while (i < contingut.length) {
                String[] nextLine;
                nextLine = csvReader.readNext();
                String[] aux = new String[nextLine.length + 1];
                aux[nextLine.length] = contingut[i];
                int j = 0;
                while (j < nextLine.length) { aux[j] = nextLine[j]; j++; }
                data.add(aux);
                i++;
            }
            csvReader.close();
            new FileWriter(filename, false).close(); // delete the old content of the file
            FileWriter filewriter = new FileWriter(filename, true);
            CSVWriter csvWriter = new CSVWriter(filewriter);
            csvWriter.writeAll(data);
            csvWriter.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
As you can see, I had to separate the CSVReader from the CSVWriter and keep all the data in lists.

OpenCSV does not write all records

I have written a method in Java for updating a tab-delimited text file. I'm using the opencsv library. First I read in the existing file, then I change the values of some columns, and then I overwrite the file. I'm running Windows 7. The problem is that not everything gets written to the file. Even if I don't change any values and just overwrite the file, not all records are written.
My code is the following. What is wrong with it?
private void csvWrite(int[] boundary, String filename, int answerCount) {
    CSVReader csvReader = null;
    List<String[]> csvBody = null;
    try {
        csvReader = new CSVReader(new FileReader(PATH + FILE), '\t');
        csvBody = csvReader.readAll();
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Search for the file
    int count = 0;
    for (String[] str : csvBody) {
        if (str[0].equals(filename)) {
            // found
            csvBody.get(count)[POS_START_INDEX + answerCount - 2] = Arrays.toString(boundary);
            break;
        }
        count++;
    }
    try {
        csvReader.close();
        CSVWriter writer = new CSVWriter(new FileWriter(PATH + FILE), '\t');
        writer.writeAll(csvBody);
        writer.flush();
        writer.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I came across the same issue. For me, it was because I was not properly closing all the readers and writers.
It got resolved after switching to try-with-resources (AutoCloseable):
private File writeToTempFile(File file) throws IOException {
    File tempFile = new File("/var/tmp/newcsv.csv");
    tempFile.createNewFile();
    try (FileReader fileReader = new FileReader(file);
         CSVReader csvReader = new CSVReaderBuilder(fileReader).withSkipLines(4).build();
         FileWriter outputFileWriter = new FileWriter(tempFile);
         CSVWriter writer = new CSVWriter(outputFileWriter, CSVWriter.DEFAULT_SEPARATOR,
                 CSVWriter.NO_QUOTE_CHARACTER,
                 CSVWriter.NO_ESCAPE_CHARACTER,
                 CSVWriter.DEFAULT_LINE_END)) {
        writer.writeAll(csvReader.readAll());
    } catch (Exception e) {
        e.printStackTrace();
    }
    return tempFile;
}
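A possible way to wire this helper back into the original overwrite scenario, as a sketch: write the cleaned copy to the temp file and then replace the source with it. PATH and FILE are assumed to be the constants from the question above.
// Sketch: requires java.nio.file.Files and java.nio.file.StandardCopyOption.
File source = new File(PATH + FILE);
File temp = writeToTempFile(source);
Files.move(temp.toPath(), source.toPath(), StandardCopyOption.REPLACE_EXISTING);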

Export a string in Java to CSV

How do I export a string in Java to a CSV file with this format, using only one column?
This is what I am expecting:
Column 1
Row 1: string1,string2,string3
Row 2: string4, string5, string6
Thanks in advance
In the code below you provide a List of elements. Each element contains the info for one line of the CSV file.
The StringBuilder is used to build the String for one line, which is then written to the file at once.
public void writeCsvFile(List<Elements> elements, String fileName) throws IOException {
    BufferedWriter csvFile = null;
    String delim = ",";
    try {
        csvFile = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(fileName), StandardCharsets.UTF_8));
        for (int i = 0; i < elements.size(); i++) {
            StringBuilder buf = new StringBuilder();
            Elements elem = elements.get(i);
            buf.append(elem.info1).append(delim);
            buf.append(elem.info2).append(delim);
            buf.append(elem.info3);
            csvFile.write(buf.toString());
            csvFile.newLine();
        }
    } finally {
        try {
            if (csvFile != null) {
                csvFile.close();
            }
        } catch (IOException e) {
            // empty
        }
    }
}
You essentially have to "escape" the commas so the CSV reader won't interpret them as column delimiters.
If you wrap your row values in quotes, the commas will be ignored as delimiters.
This will give you 3 columns:
Value1,Value2,Value3
This should give you 1 column with the entire string as a single value:
"Value1,Value2,Value3"

OpenNLP - Tokenize an Array of Strings

I am trying to tokenize a text file using the OpenNLP tokenizer.
What I do is read in a .txt file, store it in a list, iterate over every line, tokenize the line, and write the tokenized line to a new file.
In the line:
tokens[i] = tokenizer.tokenize(output[i]);
I get:
Type mismatch: cannot convert from String[] to String
This is my code:
public class Tokenizer {
    public static void main(String[] args) throws Exception {
        InputStream modelIn = new FileInputStream("en-token-max.bin");
        try {
            TokenizerModel model = new TokenizerModel(modelIn);
            Tokenizer tokenizer = new TokenizerME(model);
            CSVReader reader = new CSVReader(new FileReader("ParsedRawText1.txt"), ',', '"', 1);
            String csv = "ParsedRawText2.txt";
            CSVWriter writer = new CSVWriter(new FileWriter(csv), CSVWriter.NO_ESCAPE_CHARACTER, CSVWriter.NO_QUOTE_CHARACTER);
            // Read all rows at once
            List<String[]> allRows = reader.readAll();
            for (String[] output : allRows) {
                // get current row
                String[] tokens = new String[output.length];
                for (int i = 0; i < output.length; i++) {
                    tokens[i] = tokenizer.tokenize(output[i]);
                    System.out.println(tokens[i]);
                }
                // write line
                writer.writeNext(tokens);
            }
            writer.close();
        }
        catch (IOException e) {
            e.printStackTrace();
        }
        finally {
            if (modelIn != null) {
                try {
                    modelIn.close();
                }
                catch (IOException e) {
                }
            }
        }
    }
}
Does anyone have any idea how to complete this task?
As the compiler says, you are trying to assign an array of Strings (the result of tokenize()) to a String (tokens[i] is a String). So you should declare and use tokens inside the inner loop and write it there, too:
for (String[] output : allRows) {
    // get current row
    for (int i = 0; i < output.length; i++) {
        String[] tokens = tokenizer.tokenize(output[i]);
        System.out.println(tokens);
        // write line
        writer.writeNext(tokens);
    }
}
writer.close();
By the way, are you sure that your source file is a CSV? If it is actually a plain text file, then you are splitting the text by commas and giving such chunks to OpenNLP, and it can perform worse, because its model was trained on normal sentences, not text split like yours.
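If the source really is plain text rather than CSV, a line-by-line approach avoids the comma splitting entirely. A minimal sketch, assuming the same tokenizer instance and the file names from the question:
// Sketch: tokenize whole lines of a plain text file, one tokenized line per output row.
try (BufferedReader in = new BufferedReader(new FileReader("ParsedRawText1.txt"));
     CSVWriter out = new CSVWriter(new FileWriter("ParsedRawText2.txt"))) {
    String line;
    while ((line = in.readLine()) != null) {
        String[] tokens = tokenizer.tokenize(line); // tokenize the whole line at once
        out.writeNext(tokens);
    }
}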

delete a row in csv file

I am appending data to the last row of a CSV. I want to delete the existing row and then rewrite it with the appended element. Is there any way of deleting a row in a CSV? I am using opencsv to read and write the file. I tried using the CSVIterator class. However, it seems the iterator does not support the remove() operation.
Here is the code that I tried:
static String[] readLastRecord(File outputCSVFile) throws WAGException {
    checkArgument(outputCSVFile != null, "Output CSV file cannot be null");
    FileReader fileReader = null;
    CSVReader csvFileReader = null;
    CSVIterator csvIterator = null;
    String[] csvLastRecord = null;
    try {
        fileReader = new FileReader(outputCSVFile);
        csvFileReader = new CSVReader(fileReader, ',', '\'',
                csvRowCount - 1);
        csvIterator = new CSVIterator(csvFileReader);
        while (csvIterator.hasNext()) {
            csvLastRecord = csvIterator.next();
            csvIterator.remove();
        }
    } catch (IOException ioEx) {
        throw new WAGException(
                WAGInputExceptionMessage.FILE_READ_ERR.getMessage());
    } finally {
        try {
            if (csvFileReader != null)
                csvFileReader.close();
        } catch (IOException ioEx) {
            throw new WAGException(
                    WAGInputExceptionMessage.FILE_CLOSE_ERR.getMessage());
        }
    }
    return csvLastRecord;
}
I just found an answer. Hope it helps.
You need to read the CSV into a list of String[], remove the specific row from it with allElements.remove(rowNumber), and then write the list back to the CSV file.
rowNumber is an int holding the row number.
CSVReader reader2 = new CSVReader(new FileReader(filelocation));
List<String[]> allElements = reader2.readAll();
allElements.remove(rowNumber);
FileWriter sw = new FileWriter(filelocation);
CSVWriter writer = new CSVWriter(sw);
writer.writeAll(allElements);
writer.close();
Look at this example from opencsv: opencsv example
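If the goal is to extend the last row rather than remove it, the same read-modify-write pattern applies. A sketch, where newElement is a hypothetical value to append and filelocation is the path used above:
// Sketch: read everything, rebuild the last row with one extra element, write it all back.
CSVReader reader = new CSVReader(new FileReader(filelocation));
List<String[]> allElements = reader.readAll();
reader.close();
String[] lastRow = allElements.get(allElements.size() - 1);
String[] extended = Arrays.copyOf(lastRow, lastRow.length + 1);
extended[lastRow.length] = newElement; // hypothetical value to append
allElements.set(allElements.size() - 1, extended);
CSVWriter writer = new CSVWriter(new FileWriter(filelocation));
writer.writeAll(allElements);
writer.close();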
Use unset to remove the row from the CSV (PHP):
function readCSV($csvFile) {
    $file_handle = fopen($csvFile, 'r');
    while (!feof($file_handle)) {
        $line_of_text[] = fgetcsv($file_handle, 1024);
    }
    fclose($file_handle);
    return $line_of_text;
}

$csvFile1 = '../build/js/snowcem.csv';
$csv1 = readCSV($csvFile1);
// pass the row number you want to delete as $id
unset($csv1[$id]);
$file = fopen("../build/js/snowcem.csv", "w");
foreach ($csv1 as $file1) {
    $result = [];
    array_walk_recursive($file1, function($item) use (&$result) {
        $item = '"'.$item.'"';
        $result[] = $item;
    });
    fputcsv($file, $result);
}
fclose($file);
