I am appending data to the last row of a CSV file. I want to delete the existing last row and then rewrite it with the appended element. Is there any way to delete a row in a CSV? I am using opencsv to read and write the file. I tried using the CSVIterator class; however, it seems the iterator does not support the remove() operation.
Here is the code that I tried:
static String[] readLastRecord(File outputCSVFile) throws WAGException {
    checkArgument(outputCSVFile != null, "Output CSV file cannot be null");
    FileReader fileReader = null;
    CSVReader csvFileReader = null;
    CSVIterator csvIterator = null;
    String[] csvLastRecord = null;
    try {
        fileReader = new FileReader(outputCSVFile);
        // Skip straight to the last row (csvRowCount is tracked elsewhere)
        csvFileReader = new CSVReader(fileReader, ',', '\'', csvRowCount - 1);
        csvIterator = new CSVIterator(csvFileReader);
        while (csvIterator.hasNext()) {
            csvLastRecord = csvIterator.next();
            csvIterator.remove(); // throws UnsupportedOperationException
        }
    } catch (IOException ioEx) {
        throw new WAGException(WAGInputExceptionMessage.FILE_READ_ERR.getMessage());
    } finally {
        try {
            if (csvFileReader != null)
                csvFileReader.close();
        } catch (IOException ioEx) {
            throw new WAGException(WAGInputExceptionMessage.FILE_CLOSE_ERR.getMessage());
        }
    }
    return csvLastRecord;
}
I just found an answer; hope it helps.
You need to read the CSV into a List<String[]>, remove the specific row from it with allElements.remove(rowNumber), and then write the list back to the CSV file.
rowNumber is an int holding the index of the row to delete.
CSVReader reader2 = new CSVReader(new FileReader(filelocation));
List<String[]> allElements = reader2.readAll();
reader2.close();
allElements.remove(rowNumber);

FileWriter sw = new FileWriter(filelocation);
CSVWriter writer = new CSVWriter(sw);
writer.writeAll(allElements);
writer.close();
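For the last row specifically (the original question), the same read-remove-rewrite idea can be sketched with just the standard library; the temp file here is only for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class RemoveLastRowDemo {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("rows", ".csv");
        Files.write(file, List.of("a,1", "b,2", "c,3"));

        // Read everything, drop the last row, then rewrite the file
        List<String> rows = new ArrayList<>(Files.readAllLines(file));
        String lastRow = rows.remove(rows.size() - 1);
        Files.write(file, rows);

        // lastRow ("c,3") is now free to be modified and appended again
        System.out.println(lastRow);
    }
}
```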
Also have a look at the examples in the opencsv documentation.
In PHP, you can use unset() to remove the row from the parsed CSV:
function readCSV($csvFile) {
    $lines = [];
    $file_handle = fopen($csvFile, 'r');
    // fgetcsv() returns false at EOF, so test the return value directly
    while (($line = fgetcsv($file_handle, 1024)) !== false) {
        $lines[] = $line;
    }
    fclose($file_handle);
    return $lines;
}

$csvFile1 = '../build/js/snowcem.csv';
$csv1 = readCSV($csvFile1);
// $id is the row number you want to delete
unset($csv1[$id]);

$file = fopen('../build/js/snowcem.csv', 'w');
foreach ($csv1 as $row) {
    // fputcsv() handles quoting itself; adding quotes manually would double them
    fputcsv($file, $row);
}
fclose($file);
I want to modify a column of a CSV file with opencsv, but the result contains an extra quote ("). How can I get rid of it?
Ex.
Input file: "Order ID,Customer Name, 123, John"
Output file: ""Order ID,Customer Name, 123, Chris""
public static byte[] updateCSV(String fileToUpdate) throws IOException {
    final byte[] fallback = {};
    // Read the existing file
    CSVReader reader = new CSVReader(new StringReader(fileToUpdate));
    List<String[]> csvBody = reader.readAll();
    // Prefix the second column of every data row
    for (int i = 1; i < csvBody.size(); i++) {
        csvBody.get(i)[1] = "-" + csvBody.get(i)[1];
    }
    reader.close();
    // Write back as CSV
    try (StringWriter writer = new StringWriter();
         CSVWriter csvWriter = new CSVWriter(writer)) {
        csvWriter.writeAll(csvBody);
        csvWriter.flush();
        return writer.toString().getBytes();
    } catch (Exception e) {
        LOGGER.error(LogUtil.systemLoggingContext(), "Cannot modify the current CSV file");
        return fallback;
    }
}
I have to write code that can write a column (a String[] of values) and that column's header into a CSV file. It has to write the column into an output .csv file in both cases: when the file doesn't exist yet and when a file with the input name already exists. Of course, I want to read and write the same file.
Here's the code:
public void afegir_columna_csv(String file, String header, String[] contingut) {
    try {
        FileWriter filewriter = new FileWriter(file, true);
        CSVWriter csvWriter = new CSVWriter(filewriter);
        FileReader filereader = new FileReader(file);
        CSVReader csvReader = new CSVReader(filereader);
        String head = header;
        String[] values = contingut;
        String[] entries = null;
        // Adding the header part:
        String[] H = csvReader.readNext();
        ArrayList listH = new ArrayList(Arrays.asList(H));
        listH.add(head);
        csvWriter.writeNext((String[]) listH.toArray());
        // Adding the values:
        int i = 0;
        while ((entries = csvReader.readNext()) != null) {
            ArrayList list = new ArrayList(Arrays.asList(entries));
            list.add(values[i]);
            csvWriter.writeNext((String[]) list.toArray());
        }
        csvWriter.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I've been testing the code, and this is what happens in each case:
1. If the file exists but is empty: it doesn't write the first column.
2. If the file already exists and has columns in it: it throws a cast exception (where I cast (String[]) list.toArray()).
Any ideas of how it's properly done?
Thanks!
Here's the error i get on the testing number 2:
java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.lang.String; ([Ljava.lang.Object; and [Ljava.lang.String; are in module java.base of loader 'bootstrap')
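For what it's worth, that cast fails because the no-argument toArray() returns an Object[], which is not a String[] at runtime. The typed overload allocates a real String[] and avoids the cast entirely. A minimal stdlib-only illustration:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>(Arrays.asList("col1", "col2"));
        list.add("newCol");

        // (String[]) list.toArray() would throw ClassCastException here;
        // the typed overload builds a String[] directly.
        String[] row = list.toArray(new String[0]);
        System.out.println(String.join(",", row)); // col1,col2,newCol
    }
}
```

In the method above, writing csvWriter.writeNext(listH.toArray(new String[0])) would sidestep the exception the same way.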
After a while analysing the problem, I've reached a solution that works. I'm not sure it's super efficient, but it does its job. I had to change the way I pass the information in: from file, header and content down to only content.
Here's the code:
public void add_column_csv(String[] contingut) {
    try {
        String filename = "output_preproces.csv";
        File _file = new File(filename);
        if (_file.createNewFile()) { // If the file doesn't exist, create it and add the first column
            FileWriter filewriter = new FileWriter(filename, true);
            CSVWriter csvWriter = new CSVWriter(filewriter);
            List<String[]> data = new ArrayList<String[]>();
            int i = 0;
            while (i < contingut.length) {
                data.add(new String[] {contingut[i]});
                i++;
            }
            csvWriter.writeAll(data);
            csvWriter.close();
        } else { // If the file already exists, add a column to it:
            FileReader filereader = new FileReader(filename);
            CSVReader csvReader = new CSVReader(filereader);
            List<String[]> data = new ArrayList<String[]>();
            int i = 0;
            while (i < contingut.length) {
                String[] nextLine = csvReader.readNext();
                String[] aux = new String[nextLine.length + 1];
                aux[nextLine.length] = contingut[i];
                int j = 0;
                while (j < nextLine.length) { aux[j] = nextLine[j]; j++; }
                data.add(aux);
                i++;
            }
            csvReader.close();
            new FileWriter(filename, false).close(); // delete the old content of the file
            FileWriter filewriter = new FileWriter(filename, true);
            CSVWriter csvWriter = new CSVWriter(filewriter);
            csvWriter.writeAll(data);
            csvWriter.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
As you can see, I had to separate the CSVReader from the CSVWriter and keep all the data in lists in between.
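That read-everything-first, only-then-rewrite pattern can be sketched without opencsv. This toy version (naive splitting, no quoted fields, file names invented for the example) shows why the reader must finish before the writer reopens the same file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class AppendColumnDemo {
    // Appends one value per row; assumes simple CSV without embedded commas
    static List<String> appendColumn(List<String> rows, String[] values) {
        List<String> result = new ArrayList<>();
        for (int i = 0; i < rows.size(); i++) {
            result.add(rows.get(i) + "," + values[i]);
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("cols", ".csv");
        Files.write(file, List.of("a,b", "c,d"));

        // 1) read the whole file; 2) only then open it again for writing
        List<String> rows = Files.readAllLines(file);
        Files.write(file, appendColumn(rows, new String[] {"x", "y"}));

        System.out.println(Files.readAllLines(file));
    }
}
```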
I have written a method in Java for updating a tab-delimited text file, using the opencsv library. First I read in the existing file, then I change the values of some columns, and then I overwrite the file. I'm running Windows 7. The problem is that not everything gets written to the file; even if I don't change any value and just overwrite the file, not all records are written.
My code is the following. What is wrong with it?
private void csvWrite(int[] boundary, String filename, int answerCount) {
    CSVReader csvReader = null;
    List<String[]> csvBody = null;
    try {
        csvReader = new CSVReader(new FileReader(PATH + FILE), '\t');
        csvBody = csvReader.readAll();
    } catch (IOException e) {
        e.printStackTrace();
    }
    // Search for the file
    int count = 0;
    for (String[] str : csvBody) {
        if (str[0].equals(filename)) {
            // found
            csvBody.get(count)[POS_START_INDEX + answerCount - 2] = Arrays.toString(boundary);
            break;
        }
        count++;
    }
    try {
        csvReader.close();
        CSVWriter writer = new CSVWriter(new FileWriter(PATH + FILE), '\t');
        writer.writeAll(csvBody);
        writer.flush();
        writer.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I came across the same issue. For me, it was because I was not properly closing all the readers and writers.
It got resolved after switching to try-with-resources (AutoCloseable):
private File writeToTempFile(File file) throws IOException {
    File tempFile = new File("/var/tmp/newcsv.csv");
    tempFile.createNewFile();
    try (FileReader fileReader = new FileReader(file);
         CSVReader csvReader = new CSVReaderBuilder(fileReader).withSkipLines(4).build();
         FileWriter outputFileWriter = new FileWriter(tempFile);
         CSVWriter writer = new CSVWriter(outputFileWriter, CSVWriter.DEFAULT_SEPARATOR,
                 CSVWriter.NO_QUOTE_CHARACTER,
                 CSVWriter.NO_ESCAPE_CHARACTER,
                 CSVWriter.DEFAULT_LINE_END)) {
        writer.writeAll(csvReader.readAll());
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow the exception silently
    }
    return tempFile;
}
I am trying to read from one CSV file using opencsv. I then want to copy all the data from the input CSV and output it to another CSV file while adding a new column with extra information.
public void run_streets_tsv(String tsvIn, String tsvOut) throws Exception {
    CSVReader reader = null;
    CSVWriter writer = null;
    try {
        reader = new CSVReader(new FileReader(tsvIn));
        writer = new CSVWriter(new FileWriter(tsvOut), '\t');
        String[] element = null;
        List<String[]> a = new ArrayList<String[]>();
        while ((element = reader.readNext()) != null) {
            for (int i = 0; i < element.length; i++) {
                a.add(i, element);
                //a.add("JSON"); need to add this json element at the end of each column
            }
        }
        writer.writeAll(a);
    } catch (Exception e) {
        throw e;
    } finally {
        reader.close();
        writer.close();
    }
}
Another method I am trying is like this (changing the while loop, all other code remains the same):
String[] element = null;
while ((element = reader.readNext()) != null) {
    ArrayList list = new ArrayList(Arrays.asList(reader));
    list.add(element);
    list.add("JSON");
    writer.writeNext(element);
}
This does correctly print all the lines, but it just copies. I want to add that extra "JSON" column with its data.
The following "enlarges" the element array by one, enabling you to put something in the newly created last index. Then just save that array.
import java.util.Arrays;

String[] element = null;
while ((element = reader.readNext()) != null) {
    element = Arrays.copyOf(element, element.length + 1);
    element[element.length - 1] = "JSON";
    writer.writeNext(element);
}
OK, you are close, although I see a few errors.
reader.readNext() returns a line from the input as a String array; we basically need to add an element to it for the output.
while ((element = reader.readNext()) != null) {
    String[] output = getExpandedArray(element);
    a.add(output);
}
You will need to implement getExpandedArray, I will start it off.
private String[] getExpandedArray(String[] input) {
    // Create output from input, but with the array one element bigger
    String[] output = Arrays.copyOf(input, input.length + 1);
    output[output.length - 1] = "JSON";
    return output;
}
I am trying to tokenize a text file using the OpenNLP tokenizer.
I read in a .txt file and store it in a list; I want to iterate over every line, tokenize it, and write the tokenized line to a new file.
In the line:
tokens[i] = tokenizer.tokenize(output[i]);
I get:
Type mismatch: cannot convert from String[] to String
This is my code:
public class Tokenizer {
    public static void main(String[] args) throws Exception {
        InputStream modelIn = new FileInputStream("en-token-max.bin");
        try {
            TokenizerModel model = new TokenizerModel(modelIn);
            Tokenizer tokenizer = new TokenizerME(model);
            CSVReader reader = new CSVReader(new FileReader("ParsedRawText1.txt"), ',', '"', 1);
            String csv = "ParsedRawText2.txt";
            CSVWriter writer = new CSVWriter(new FileWriter(csv),
                    CSVWriter.NO_ESCAPE_CHARACTER, CSVWriter.NO_QUOTE_CHARACTER);
            // Read all rows at once
            List<String[]> allRows = reader.readAll();
            for (String[] output : allRows) {
                // get current row
                String[] tokens = new String[output.length];
                for (int i = 0; i < output.length; i++) {
                    tokens[i] = tokenizer.tokenize(output[i]);
                    System.out.println(tokens[i]);
                }
                // write line
                writer.writeNext(tokens);
            }
            writer.close();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (modelIn != null) {
                try {
                    modelIn.close();
                } catch (IOException e) {
                }
            }
        }
    }
}
Does anyone have any idea how to complete this task?
As the compiler says, you are trying to assign an array of Strings (the result of tokenize()) to a String (tokens[i] is a String). So you should declare and use tokens inside the inner loop and write tokens[] there, too:
for (String[] output : allRows) {
    // get current row
    for (int i = 0; i < output.length; i++) {
        String[] tokens = tokenizer.tokenize(output[i]);
        System.out.println(Arrays.toString(tokens));
        // write line
        writer.writeNext(tokens);
    }
}
writer.close();
By the way, are you sure that your source file is a CSV? If it is actually a plain text file, then you are splitting the text by commas and giving such chunks to OpenNLP, and it can perform worse, because its model was trained on normal sentences, not fragments split like yours.