Converting Vector to ArrayList - Java

Hello, is there any way to convert a Vector into an ArrayList? I want to add a search option for my table, and it looks like that would be much easier for me with an ArrayList. I searched but, surprisingly, found nothing about Vectors.
public Vector<Vector<Object>> InfoForTheTable() {
    Scanner s = null;
    Vector<Vector<Object>> data = new Vector<Vector<Object>>();
    try {
        s = new Scanner(new File("info.txt"));
        while (s.hasNextLine()) {
            String line = s.nextLine();
            if (line.startsWith("")) {
                String[] atoms = line.split("[#]");
                Vector<Object> row = new Vector<Object>();
                row.add(atoms[0]);
                row.add(atoms[1]);
                row.add(atoms[2]);
                row.add(atoms[3]);
                row.add(atoms[4]);
                row.add(atoms[5]);
                data.add(row);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (s != null) {
            s.close();
        }
    }
    return data;
}

Instead of
Vector<Vector<Object>> data = new Vector<Vector<Object>>();
I recommend using:
List<List<Object>> data = new ArrayList<>();
Or, even better, create a class MyInfo and assign the atoms' values to its properties. Then use:
List<MyInfo> data = new ArrayList<>();
So here is a more modern version of your code:
public List<Info> readInfoFromFile() {
    List<Info> infoList = new ArrayList<>();
    try (Scanner s = new Scanner(Paths.get("info.txt"))) {
        while (s.hasNextLine()) {
            String line = s.nextLine();
            if (line.startsWith("")) {
                String[] atoms = line.split("[#]");
                Info info = new Info();
                info.setA(atoms[0]);
                info.setB(atoms[1]);
                info.setC(atoms[2]);
                info.setD(atoms[3]);
                info.setE(atoms[4]);
                info.setG(atoms[5]);
                infoList.add(info);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return infoList;
}
Replace the properties and their types as needed.
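The Info bean itself isn't shown in the answer; a minimal sketch might look like this (the property names a through g are just placeholders matching the setters used above):
public class Info {
    private String a, b, c, d, e, g;

    public void setA(String a) { this.a = a; }
    public void setB(String b) { this.b = b; }
    public void setC(String c) { this.c = c; }
    public void setD(String d) { this.d = d; }
    public void setE(String e) { this.e = e; }
    public void setG(String g) { this.g = g; }
    // add matching getters (and better property names) as needed
}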

I'd recommend not using that type of collection. First of all, the main difference between ArrayList and Vector is that all Vector operations are synchronized. Second, you have a Vector<Vector<Object>> and you want it to become an ArrayList<ArrayList<Object>>, so in my opinion you should create your own (bean) class.
Example (requires at least Java 7):
public class MyTableModel {
    private String somePropertyName1;
    // ...
    private String somePropertyNameN;

    public MyTableModel(String... array) {
        // assign values to instance attributes.
    }

    // getters and setters
}

// remember: method names in Java start with lower-case
public List<MyTableModel> infoForTheTable() {
    List<MyTableModel> data = new ArrayList<>(); // diamond inference
    // use try-with-resources
    try (Scanner s = new Scanner(Paths.get("info.txt"))) {
        while (s.hasNextLine()) {
            String line = s.nextLine();
            if (line.startsWith("")) {
                String[] atoms = line.split("[#]");
                data.add(new MyTableModel(atoms[0], atoms[1], atoms[2], atoms[3], atoms[4], atoms[5]));
            }
        }
    } catch (IOException e) {
        // handle exception or throw it up!
    }
    return data;
}

Yes, ArrayList has a constructor that takes a Collection (which a Vector obviously is), so you can do:
ArrayList(Collection<? extends E> c)

Just use the constructor that takes a collection as its parameter:
ArrayList<String> list = new ArrayList<String>(row);
Note that it only does a shallow copy.
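Applied to the nested structure from the question, the copy has to be done per row if the inner lists should become ArrayLists too; a sketch, assuming the data vector returned by InfoForTheTable():
// copy the outer Vector and each inner Vector; the Object elements themselves
// are shared, not cloned (shallow copy)
List<List<Object>> converted = new ArrayList<>();
for (Vector<Object> row : data) {
    converted.add(new ArrayList<Object>(row));
}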

Related

Why do I need to convert from Integer[] to int[]?

I have the following code
public static int[] readCSV() {
    ArrayList<Integer> entries = new ArrayList<>();
    try {
        File file = new File("someDataFile.csv");
        FileReader fr = new FileReader(file);
        BufferedReader br = new BufferedReader(fr);
        String line = "";
        String[] row;
        while ((line = br.readLine()) != null) {
            row = line.split(",");
            for (String value : row) {
                int entry = Integer.parseInt(value);
                entries.add(entry);
            }
        }
        br.close();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
    int[] IDs = entries.toArray();
    return IDs;
}
Every entry of the csv is an integer stored as a string. I get the following error: "Type mismatch: cannot convert from Object[] to int[]". As far as I understand, "entries" is not an Object[] here, it's an ArrayList<Integer>.
I was using an example given on geeksforgeeks. That didn't work and I'm not sure why.
I also checked the previous answers to the same question, and the top answer works for me. That said, I still don't have an int[], I only have Integer[]. Then I have to do this to convert from Integer[] to int[]. My question is why do I have to do all that instead of int[] IDs = entries.toArray();?
If I do
int[] IDs = new int[entries.size()];
for (int i = 0; i < entries.size(); i++) {
    IDs[i] = entries.get(i);
}
it works fine. Why is that different from int[] IDs = entries.toArray()?
Is there a better way to get the contents of the csv file in an int[]?
First, to answer your question: a collection (like ArrayList) can only contain object instances, which means you must use the Integer wrapper type instead of the int primitive type. However, in Java 8+ there are simple ways to perform that conversion. I would also strongly recommend try-with-resources over manually closing the BufferedReader. I also simplified the code a little:
public static int[] readCSV() {
    List<Integer> entries = new ArrayList<>();
    File file = new File("someDataFile.csv");
    try (BufferedReader br = new BufferedReader(new FileReader(file))) {
        String line;
        while ((line = br.readLine()) != null) {
            String[] row = line.split("\\s*,\\s*"); // consume surrounding whitespace
            for (String value : row) {
                entries.add(Integer.parseInt(value));
            }
        }
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
    return entries.stream().mapToInt(Integer::intValue).toArray();
}
The no-argument List#toArray() always returns an Object[]. The closest you can get is entries.toArray(new Integer[0]), which gives you an Integer[].
To get an int[] you can use the Streams API or loop over the List and copy it over to an array.
Integer[] arr = list.toArray(new Integer[0]);
int[] arr2 = list.stream().mapToInt(i -> i).toArray();
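If you already have an Integer[] (for example from the toArray call above), the same stream approach also works directly on the array; a small sketch:
Integer[] boxed = list.toArray(new Integer[0]);
// unbox the wrapper array into a primitive int[] (uses java.util.Arrays)
int[] primitives = Arrays.stream(boxed).mapToInt(Integer::intValue).toArray();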

How to remove specific duplicate data in array after sorting?

This is my code:
FileWriter writers = null;
try {
    BufferedReader reader = new BufferedReader(new FileReader("Database.txt"));
    ArrayList<Data> dataList = new ArrayList<>();
    String line = "";
    while ((line = reader.readLine()) != null) {
        // split string, construct Data object and add it to dataList
        dataList.add(parse(line));
    }
    reader.close();
    Collections.sort(dataList);
    writers = new FileWriter("final.txt");
    for (Data d : dataList) {
        writers.write(d.toString());
        writers.write("\r\n");
    }
    writers.close();
} catch (Exception ex) {
    ex.printStackTrace();
} finally {
}
Input/output in this code:
Input:
mamy, 30, new, old
daddy, 21, new, new
Output:
daddy, 21,new,new
mamy , 30, new, old
Expected output:
daddy,21,new
mamy,30,new,old
My problem is how to remove the duplicate data before storing it to final.txt. Any suggestions?
I think a Set is perfect for you; it eliminates duplicates.
Set<Data> dataSet = new HashSet<>(dataList);
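Note that a HashSet can only detect duplicates if Data overrides equals() and hashCode(); a LinkedHashSet additionally preserves insertion order. A sketch of deduplicating before the existing sort, assuming those methods are implemented:
// deduplicate (relies on Data.equals()/hashCode()), then sort as before
List<Data> dataListNoDupes = new ArrayList<>(new LinkedHashSet<>(dataList));
Collections.sort(dataListNoDupes);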
To remove duplicates, use this code right before sorting:
ArrayList<Data> newDataList = new ArrayList<>();
for (Data element : dataList) {
    if (!newDataList.contains(element)) {
        newDataList.add(element);
    }
}
dataList = newDataList;

Convert CSV to JSON array in Java Springboot

Hi, I am trying to convert a CSV file into a JSON array using a dependency called CSVReader, but when I run the code it prints out the JSON response incorrectly and I am not sure why. Would anyone be able to point me in the right direction?
#GetMapping("/convert")
public List<List<String>> convertCSV() throws FileNotFoundException {
List<List<String>> records = new ArrayList<List<String>>();
try (CSVReader csvReader = new CSVReader(new FileReader("C:/Download/cities.csv"));) {
String[] values = null;
while ((values = csvReader.readNext()) != null) {
records.add(Arrays.asList(values));
}
} catch (IOException e) {
e.printStackTrace();
}
return values;
}
Your case is not a big deal: you can read the CSV and build the JSON yourself. Read the first row to determine the columns; the remaining rows are the values.
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class Foo {
    public static void main(String[] args) throws Exception {
        List<String> csvRows = null;
        try (var reader = Files.lines(Paths.get("dataFile.csv"))) {
            csvRows = reader.collect(Collectors.toList());
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (csvRows != null) {
            String json = csvToJson(csvRows);
            System.out.println(json);
        }
    }

    public static String csvToJson(List<String> csv) {
        // remove empty lines
        // this permanently modifies the list,
        // so be careful if you want to use it after executing this method
        csv.removeIf(e -> e.trim().isEmpty());

        // csv is empty or declares only columns
        if (csv.size() <= 1) {
            return "[]";
        }

        // first line = column names
        String[] columns = csv.get(0).split(",");

        // build the JSON from the remaining rows
        StringBuilder json = new StringBuilder("[\n");
        csv.subList(1, csv.size()) // sublist without the first row (columns)
            .stream()
            .map(e -> e.split(","))
            .filter(e -> e.length == columns.length) // value count should match column count
            .forEach(row -> {
                json.append("\t{\n");
                for (int i = 0; i < columns.length; i++) {
                    json.append("\t\t\"")
                        .append(columns[i])
                        .append("\" : \"")
                        .append(row[i])
                        .append("\",\n"); // comma-1
                }
                // replace comma-1 with \n
                json.replace(json.lastIndexOf(","), json.length(), "\n");
                json.append("\t},"); // comma-2
            });
        // remove comma-2
        json.replace(json.lastIndexOf(","), json.length(), "");
        json.append("\n]");
        return json.toString();
    }
}
Tested on:
fname,lname,note
Shaun,Curtis,a
Kirby,Beil,b
-----------------------
[
{
"fname" : "Shaun",
"lname" : "Curtis",
"note" : "a"
}, {
"fname" : "Kirby",
"lname" : "Beil",
"note" : "b"
}
]
This method works on any CSV structure; you don't need to map the columns.
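If you want to return that JSON from the original Spring endpoint, one possible wiring (a sketch; it assumes the csvToJson helper above is accessible to the controller, and the file path is the one from the question) is to read the lines in the controller and delegate:
@GetMapping(value = "/convert", produces = MediaType.APPLICATION_JSON_VALUE)
public String convertCSV() throws IOException {
    // read all lines of the CSV and build the JSON string with the helper above
    List<String> csvRows = Files.readAllLines(Paths.get("C:/Download/cities.csv"));
    return csvToJson(csvRows);
}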
That is because you are reading the data into Strings and printing a List of Strings. If you want to map the CSV to objects (a JSON object), you need to read the CSV as bean objects. Please find a code snippet below; to print as JSON, override the toString method to produce JSON format.
User.java
public class User {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @NotNull
    private String name;

    @NotNull
    private String surname;

    // Getters and Setters
}
CsvReaderUtil.java
public static List<User> readCsvFile() throws IOException {
    List<User> list = null;
    CSVReader reader = null;
    InputStream is = null;
    try {
        File initialFile = new File("C:\\Users\\ER\\Desktop\\test.csv");
        is = new FileInputStream(initialFile);
        reader = new CSVReader(new InputStreamReader(is), ',', '"', 1);
        ColumnPositionMappingStrategy strat = new ColumnPositionMappingStrategy();
        strat.setType(User.class);
        String[] columns = new String[]{"id", "name", "surname"};
        strat.setColumnMapping(columns);
        CsvToBean csv = new CsvToBean();
        list = csv.parse(strat, reader);
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        // guard against NullPointerException if opening the file failed
        if (is != null) {
            is.close();
        }
        if (reader != null) {
            reader.close();
        }
    }
    return list;
}
Now print this list of Users as JSON.
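For example, with Jackson on the classpath this could look like the following sketch (it assumes the getters exist on User; writeValueAsString throws JsonProcessingException, so declare or catch it):
// serialize the list of beans read above into a JSON array string
List<User> users = CsvReaderUtil.readCsvFile();
String json = new ObjectMapper().writeValueAsString(users);
System.out.println(json);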
Here is a useful example of how to transform CSV to JSON using Java 11+:
private String fromCsvToJson(String csvFile) throws JsonProcessingException {
    String[] lines = csvFile.split("\n");
    if (lines.length <= 1) {
        return "[]";
    }
    var headers = lines[0].split(",");
    var jsonFormat = Arrays.stream(lines)
        .skip(1)
        .map(line -> line.split(","))
        .filter(line -> headers.length == line.length)
        .map(line -> IntStream.range(0, headers.length)
            .boxed()
            .collect(Collectors.toMap(i -> headers[i], i -> line[i], (a, b) -> b)))
        .collect(Collectors.toList());
    return new ObjectMapper().writeValueAsString(jsonFormat);
}
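Calling it could look like this (a sketch; the file name is a placeholder and Files.readString requires Java 11+):
// read the whole CSV file into a single String and convert it to JSON
String csv = Files.readString(Path.of("cities.csv"));
String json = fromCsvToJson(csv);
System.out.println(json);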

Reading a file into an array Java

So I have a project in which I have to read book reference numbers and book titles from a .txt file into an array, and then the user enters a reference number that searches for the book with that reference number. Here is what I have; keep in mind I'm not very experienced with Java.
public class Book {
    ArrayList<String> books = new ArrayList<String>();
    BufferedReader br = null;

    {
        try {
            br = new BufferedReader(new FileReader("BookList.txt"));
            String book;
            while ((book = br.readLine()) != null) {
                books.add(book);
            }
        } catch (IOException e) {
        } finally {
            try {
                br.close();
            } catch (IOException ex) {
            }
        }
        String[] bookList = new String[books.size()];
        books.toArray(bookList);
    }
}
That is supposed to read the file into an ArrayList and then convert the ArrayList into an array.
I'm not 100% sure that's right, so if there's a problem I would gladly take your solution.
The problem I'm having is when I try to set up a method that allows a user to search:
private void FindItActionPerformed(java.awt.event.ActionEvent evt) {
    String input;
    input = Input.getText();
    for (int i = 0; i < bookList.length; i++) {
    }
}
I get an error that says "cannot find symbol: bookList", but I'm not sure why.
Thanks for any help or advice you may be able to offer.
Your init code is inside an instance initializer block, and by the way you're missing a method declaration.
You can't reach bookList because it isn't a class field; it's declared inside that block's scope.
Put a method declaration above the try, and declare bookList next to your BufferedReader field instead of inside the block.
You need to have bookList available to all of your methods
public class Book {
    ArrayList<String> books = new ArrayList<String>();
    String[] bookList;
    BufferedReader br = null;
    // ...
Then you need to set it to something. Your current line books.toArray(bookList); uses bookList as the argument so toArray knows what kind of array it is producing, and it returns an array of that type. So you need to assign the result, passing a non-null array (an empty one is enough):
this.bookList = books.toArray(new String[0]);
Your code should look like this:
public class Book {
    ArrayList<String> books = new ArrayList<String>();
    BufferedReader br = null;
    String[] bookList; // difference (bookList is now visible to all methods in the class)

    {
        try {
            br = new BufferedReader(new FileReader("BookList.txt"));
            String book;
            while ((book = br.readLine()) != null) {
                books.add(book);
            }
        } catch (IOException e) {
        } finally {
            try {
                br.close();
            } catch (IOException ex) {
            }
        }
        bookList = new String[books.size()]; // difference
        books.toArray(bookList);
    }
}

private void FindItActionPerformed(java.awt.event.ActionEvent evt) {
    String input;
    input = Input.getText();
    for (int i = 0; i < bookList.length; i++) {
        // do something...
    }
}
The problem in your code is that you tried to use the variable bookList, which was defined in a different scope.

Writing back to a csv with Java using super csv

I've been working on this code for quite some time and just want a simple heads-up if I'm heading down a dead end. The point I'm at now is to match identical cells from different .csv files and copy one row into another CSV file. The question really is: would it be possible to write at specific lines? For example, if the two cells match at row 50, I wish to write back onto row 50. I'm assuming I would maybe extract everything into a HashMap, write it in there, then write back to the .csv file? Is there an easier way?
For example, I have one CSV that has person details, and the other has property details of where the actual person lives. I wish to copy the property details into the person CSV, as well as match them up with the correct person details. Hope this makes sense.
public class Old {

    public static void main(String[] args) throws IOException {
        List<String[]> cols;
        List<String[]> cols1;
        int row = 0;
        int count = 0;
        boolean b;
        CsvMapReader Reader = new CsvMapReader(new FileReader("file1.csv"), CsvPreference.EXCEL_PREFERENCE);
        CsvMapReader Reader2 = new CsvMapReader(new FileReader("file2.csv"), CsvPreference.EXCEL_PREFERENCE);
        try {
            cols = readFile("file1.csv");
            cols1 = readFile("file2.csv");
            String[] headers = Reader.getCSVHeader(true);
            headers = header(cols1, headers);
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }
        for (int j = 1; j < cols.size(); j++) { //1
            for (int i = 1; i < cols1.size(); i++) {
                if (cols.get(j)[0].equals(cols1.get(i)[0])) {
                }
            }
        }
    }

    private static List<String[]> readFile(String fileName) throws IOException {
        List<String[]> values = new ArrayList<String[]>();
        Scanner s = new Scanner(new File(fileName));
        while (s.hasNextLine()) {
            String line = s.nextLine();
            values.add(line.split(","));
        }
        return values;
    }

    public static void csvWriter(String fileName, String[] nameMapping) throws FileNotFoundException {
        ICsvListWriter writer = new CsvListWriter(new PrintWriter(fileName), CsvPreference.STANDARD_PREFERENCE);
        try {
            writer.writeHeader(nameMapping);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static String[] header(List<String[]> cols1, String[] headers) {
        List<String> list = new ArrayList<String>();
        String[] add;
        int count = 0;
        for (int i = 0; i < headers.length; i++) {
            list.add(headers[i]);
        }
        boolean c;
        c = true;
        while (c) {
            add = cols1.get(0);
            list.add(add[count]);
            if (cols1.get(0)[count].equals(null)) { // this line is never reached - error
                c = false;
                break;
            } else {
                count++;
            }
        }
        String[] array = new String[list.size()];
        list.toArray(array);
        return array;
    }
}
Just be careful if you read all of the addresses and person details into memory first (as Thomas has suggested) - if you're only dealing with small CSV files then it's fine, but you may run out of memory if you're dealing with larger files.
As an alternative, I've put together an example that reads the addresses in first, then writes the combined person/address details while it reads in the person details.
Just a few things to note:
I've used CsvMapReader and CsvMapWriter because you were - this meant I've had to use a Map containing a Map for storing the addresses. Using CsvBeanReader/CsvBeanWriter would make this a bit more elegant.
The code from your question doesn't actually use Super CSV to read the CSV (you're using Scanner and String.split()). You'll run into issues if your CSV contains commas in the data (which is quite possible with addresses), so it's a lot safer to use Super CSV, which will handle escaped commas for you.
Example:
package example;

import java.io.StringReader;
import java.io.StringWriter;
import java.util.HashMap;
import java.util.Map;

import org.supercsv.io.CsvMapReader;
import org.supercsv.io.CsvMapWriter;
import org.supercsv.io.ICsvMapReader;
import org.supercsv.io.ICsvMapWriter;
import org.supercsv.prefs.CsvPreference;

public class CombiningPersonAndAddress {

    private static final String PERSON_CSV = "id,firstName,lastName\n"
        + "1,philip,fry\n2,amy,wong\n3,hubert,farnsworth";

    private static final String ADDRESS_CSV = "personId,address,country\n"
        + "1,address 1,USA\n2,address 2,UK\n3,address 3,AUS";

    private static final String[] COMBINED_HEADER = new String[] { "id",
        "firstName", "lastName", "address", "country" };

    public static void main(String[] args) throws Exception {
        ICsvMapReader personReader = null;
        ICsvMapReader addressReader = null;
        ICsvMapWriter combinedWriter = null;
        final StringWriter output = new StringWriter();
        try {
            // set up the readers/writer
            personReader = new CsvMapReader(new StringReader(PERSON_CSV),
                CsvPreference.STANDARD_PREFERENCE);
            addressReader = new CsvMapReader(new StringReader(ADDRESS_CSV),
                CsvPreference.STANDARD_PREFERENCE);
            combinedWriter = new CsvMapWriter(output,
                CsvPreference.STANDARD_PREFERENCE);

            // map of personId -> address (inner map is address details)
            final Map<String, Map<String, String>> addresses =
                new HashMap<String, Map<String, String>>();

            // read in all of the addresses
            Map<String, String> address;
            final String[] addressHeader = addressReader.getCSVHeader(true);
            while ((address = addressReader.read(addressHeader)) != null) {
                final String personId = address.get("personId");
                addresses.put(personId, address);
            }

            // write the header
            combinedWriter.writeHeader(COMBINED_HEADER);

            // read each person
            Map<String, String> person;
            final String[] personHeader = personReader.getCSVHeader(true);
            while ((person = personReader.read(personHeader)) != null) {
                // copy address details to person if they exist
                final String personId = person.get("id");
                final Map<String, String> personAddress = addresses.get(personId);
                if (personAddress != null) {
                    person.putAll(personAddress);
                }
                // write the combined details
                combinedWriter.write(person, COMBINED_HEADER);
            }
        } finally {
            personReader.close();
            addressReader.close();
            combinedWriter.close();
        }
        // print the output
        System.out.println(output);
    }
}
Output:
id,firstName,lastName,address,country
1,philip,fry,address 1,USA
2,amy,wong,address 2,UK
3,hubert,farnsworth,address 3,AUS
From your comment, it seems like you have the following situation:
File 1 contains persons
File 2 contains addresses
You then want to match persons and addresses by some key (one or more fields) and write the combination back to a CSV file.
Thus the simplest approach might be something like this:
// use a LinkedHashMap to preserve the order of the persons as found in file 1
Map<PersonKey, String[]> persons = new LinkedHashMap<>();
// fill in the persons from file 1 here

Map<PersonKey, String[]> addresses = new HashMap<>();
// fill in the addresses from file 2 here

List<String[]> outputLines = new ArrayList<>(persons.size());
for (Map.Entry<PersonKey, String[]> personEntry : persons.entrySet()) {
    String[] person = personEntry.getValue();
    String[] address = addresses.get(personEntry.getKey());
    // merge the two arrays and put them into outputLines
}
// write outputLines to a file
Note that PersonKey might just be a String or a wrapper object (Integer etc.) if you can match persons and addresses by one field. If you have more fields you might need a custom PersonKey object with equals() and hashCode() properly overridden.
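If the match needs more than one field, a hand-rolled PersonKey could look like this (a sketch; the field names are placeholders):
public final class PersonKey {
    private final String firstName;
    private final String lastName;

    public PersonKey(String firstName, String lastName) {
        this.firstName = firstName;
        this.lastName = lastName;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PersonKey)) return false;
        PersonKey other = (PersonKey) o;
        return firstName.equals(other.firstName) && lastName.equals(other.lastName);
    }

    @Override
    public int hashCode() {
        return 31 * firstName.hashCode() + lastName.hashCode();
    }
}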
