How to map key and values from two files? - java

I have searched around but couldn't find any similar question so I am posting a new one.
Currently I have basically two files. One file defines the keys; the other holds the values that map to those keys (CSV format, but I'd like not to restrict it to CSV files).
File 1:
id:TEXT,name:TEXT,city:TEXT,state:TEXT,zip:Integer
What it means is that the file has 5 fields: it defines id of type TEXT, name of type TEXT, zip of type Integer, and so on.
File 2 (each record is separated by a newline; there will be thousands of lines):
11212, karen, new york, NY, 10000
21312, jim, boston, MA, 10000
12312,,seattle,,10000 // name and state are not available in this record
So file 2 holds the values that map to the keys in file 1; notice that if a value is null or empty, it is simply ignored in the result.
What would be an elegant way to convert these files into a Java object like the one below?
@Data
@AllArgsConstructor
public class RecordCollection {
// Key is to map to the `id`, whereas the rest of the values map to Record
Map<String, Record> records;
}
@Data
@AllArgsConstructor
public class Record {
String name;
String city;
String state;
Integer zip;
}
To start I have:
String keyValues = "id:TEXT,name:TEXT,city:TEXT,state:TEXT,zip:Integer";
Now I have an input stream for file 2, and here is where I am at:
BufferedReader file2InputStreamBuffered = new BufferedReader(new FileReader("file 2"));
Now, how do I map the values to my Java objects in an elegant way (with 3rd-party tools or any common libraries)?

You need to build a CsvSchema from your format file and then use it to read the CSV file. The code could look like this:
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.HashMap;
import java.util.Map;
public class CsvApp {
public static void main(String[] args) throws Exception {
File formatFile = new File("./resource/test.format").getAbsoluteFile();
File csvFile = new File("./resource/test.csv").getAbsoluteFile();
CsvSchema schema = createSchemaForFormat(formatFile);
CsvMapper csvMapper = new CsvMapper();
MappingIterator<Record> rows = csvMapper.readerFor(Record.class).with(schema).readValues(csvFile);
RecordCollection recordCollection = new RecordCollection(100);
while (rows.hasNext()) {
recordCollection.add(rows.next());
}
recordCollection.getRecords().forEach((k, v) -> {
System.out.println(k + " => " + v);
});
}
private static CsvSchema createSchemaForFormat(File formatFile) throws IOException {
String content = String.join("", Files.readAllLines(formatFile.toPath()));
String[] columns = content.split(",");
CsvSchema.Builder builder = CsvSchema.builder();
for (String column : columns) {
String[] columnData = column.split(":");
String name = columnData[0];
String type = columnData[1];
builder.addColumn(name, "Integer".equalsIgnoreCase(type) ? CsvSchema.ColumnType.NUMBER : CsvSchema.ColumnType.STRING);
}
return builder.build();
}
}
class RecordCollection {
private final Map<String, Record> records;
RecordCollection(int expectedSize) {
this.records = new HashMap<>(expectedSize);
}
public void add(Record record) {
this.records.put(record.getId(), record);
}
public Map<String, Record> getRecords() {
return records;
}
}
class Record {
private final String id;
private final String name;
private final String city;
private final String state;
private final Integer zip;
@JsonCreator
public Record(
@JsonProperty("id") String id,
@JsonProperty("name") String name,
@JsonProperty("city") String city,
@JsonProperty("state") String state,
@JsonProperty("zip") Integer zip) {
this.id = id;
this.name = name;
this.city = city;
this.state = state;
this.zip = zip;
}
public String getId() {
return id;
}
public String getName() {
return name;
}
public String getCity() {
return city;
}
public String getState() {
return state;
}
public Integer getZip() {
return zip;
}
@Override
public String toString() {
return "Record{" +
"name='" + name + '\'' +
", city='" + city + '\'' +
", state='" + state + '\'' +
", zip=" + zip +
'}';
}
}
The above code prints:
12312 => Record{name='', city='seattle', state='', zip=10000}
21312 => Record{name='jim', city='boston', state='MA', zip=10000}
11212 => Record{name='karen', city='new york', state='NY', zip=10000}
In the above case the Record class is tied to the configuration file that defines the columns. If the CSV format is not dynamic, you can build the schema directly in Java without reading it from a file. If it is dynamic, you could read each row into a Map<String, Object> instead of a Record, which handles dynamic columns.
For more information, take a look at the Jackson CSV documentation.
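If you'd rather see the mapping logic without any third-party dependency, here is a plain-Java sketch of the same idea (a simplified illustration, not a drop-in replacement: it parses one record into a map keyed by column name, using the format line from the question):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PlainFormatParser {
    // Parse the format line ("id:TEXT,name:TEXT,...") into [name, type] pairs.
    static String[][] parseFormat(String format) {
        String[] columns = format.split(",");
        String[][] result = new String[columns.length][];
        for (int i = 0; i < columns.length; i++) {
            result[i] = columns[i].split(":");
        }
        return result;
    }

    // Map one CSV record onto the columns; empty values are skipped, and
    // Integer columns are converted, mirroring the declared types.
    static Map<String, Object> parseRecord(String[][] columns, String line) {
        String[] values = line.split(",", -1);
        Map<String, Object> row = new LinkedHashMap<>();
        for (int i = 0; i < columns.length && i < values.length; i++) {
            String value = values[i].trim();
            if (value.isEmpty()) {
                continue; // null/empty values are ignored, as the question asks
            }
            String type = columns[i][1];
            row.put(columns[i][0],
                    "Integer".equalsIgnoreCase(type) ? Integer.valueOf(value) : value);
        }
        return row;
    }

    public static void main(String[] args) {
        String[][] columns = parseFormat("id:TEXT,name:TEXT,city:TEXT,state:TEXT,zip:Integer");
        System.out.println(parseRecord(columns, "12312,,seattle,,10000"));
    }
}
```

From here, building the Map<String, Record> is a matter of pulling the id entry out of each row map and using it as the key.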

Related

How to get the name of an Attribute from an Entity

I have the following entity class:
public class Conversation {
private String id;
private String ownerId;
private Long creationDate;
public Conversation(String id, String ownerId, Long creationDate){
this.id = id;
this.ownerId = ownerId;
this.creationDate = creationDate;
}
}
In another submodule, through an external service, I receive on each insertion a map of the following entities:
public class AttributeValue {
private String s; //string attribute
private String n; //number attribute
public String getS() {
return this.s;
}
public String getN() {
return this.n;
}
public AttributeValue(String s, String n){
this.s = s;
this.n = n;
}
}
//Example: if I insert this conversation: new Conversation("1", "2", 1623221757971L)
// I receive this map:
Map<String, AttributeValue> insertStream = Map.ofEntries(
entry("id", new AttributeValue("1", null)),
entry("ownerId", new AttributeValue("2", null)),
entry("creationDate", new AttributeValue(null, "1623221757971"))
);
To read the ownerId field from the map, I have to do this:
String ownerId = insertStream.get("ownerId").getS();
My question is: instead of having to write insertStream.get("ownerId"), is there any way, through reflection, to read the name of the field from the entity (Conversation.ownerId)?
This is because we want to maintain the submodule, and if we make a change to the entity, for example renaming ownerId to ownerIdentifier, the submodule should show a compilation error or be changed automatically.
Is this what you want? Field#getName()
Example code:
Field[] conversationFields = Conversation.class.getDeclaredFields();
String field0Name = conversationFields[0].getName();
Depending on the JVM used, field0Name can be "id" (the order of declared fields is not guaranteed). You can also use Class#getFields(), which includes all fields accessible in this class (including superclass fields).
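As a runnable illustration of Field#getName, here is a minimal sketch (the Conversation class is a trimmed-down stand-in for the one in the question); note it checks membership, not position, since declared-field order is not guaranteed:

```java
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

public class FieldNames {
    static class Conversation {
        String id;
        String ownerId;
        Long creationDate;
    }

    // Collect the declared field names of a class via reflection.
    static List<String> names(Class<?> type) {
        List<String> result = new ArrayList<>();
        for (Field field : type.getDeclaredFields()) {
            result.add(field.getName());
        }
        return result;
    }

    public static void main(String[] args) {
        // Renaming Conversation.ownerId changes this output automatically.
        System.out.println(names(Conversation.class));
    }
}
```

This keeps the field names in sync with the entity at runtime, though it still won't give you a compile-time error on rename.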
Another option (not using reflection) would be to refactor your code.
import java.util.Map;
import java.util.HashMap;
public class Conversation {
public static String[] names = {
"id", "ownerId", "creationDate"
};
private Map<String, Object> data = new HashMap<String,Object>();
public Conversation(Object... values) {
if(values.length!=names.length)
throw new IllegalArgumentException("You need to pass "+names.length+" arguments!");
for(int i=0; i<names.length; i++)
data.put(names[i],values[i]);
}
public Map<String,Object> getData() { return data; }
// You can pass "id"/"ownerId" or names[0]/names[1]
public String getString(String key) {
return (String)data.get(key);
}
// You can pass "creationDate" or names[2]
public long getLong(String key) {
return (long)data.get(key);
}
}
You could then create Conversation Objects like before:
Conversation c = new Conversation("myId","myOwnerId",123456789L);
You could also add public static String fields like ID="id", but changing the value of a field will never change the field's name.

How to convert JSON field name to Java bean class property with Jackson

I have access to a RESTful API which returns JSON Strings, such as the following:
{
"Container1": {
"active": true
},
"Container2": {
"active": false
},
}
The problem is that the RESTful API is a bit maldesigned: the field name already contains data. With the Jackson library it is not possible, out of the box, to deserialize a field name into a property of the corresponding Java bean class. I assume this isn't intended by the JSON specification either. The above JSON string needs to be deserialized into instances of the following class:
public class Container {
private Boolean active;
private String name;
}
I end up with UnrecognizedPropertyException for the field Container1.
I thought to configure to ignore unknown properties and to provide a JsonDeserializer for that property like this:
@JsonIgnoreProperties(ignoreUnknown = true)
public class Container {
private Boolean active;
private String name;
@JsonDeserialize(using = FieldNameToPropertyDeserializer.class)
public void setName(String name) {
this.name = name;
}
}
and the FieldNameToPropertyDeserializer:
public class FieldNameToPropertyDeserializer extends StdDeserializer<String> {
public FieldNameToPropertyDeserializer() {
super(String.class);
}
@Override
public String deserialize(JsonParser parser, DeserializationContext context) throws IOException, JsonProcessingException {
return parser.getCurrentName();
}
}
The invocation of the deserialization is achieved as follows:
String jsonString = response.readEntity(String.class);
ObjectMapper objectMapper = new ObjectMapper();
ObjectReader readerFor = objectMapper.readerFor(Container.class);
MappingIterator<Container> mappingIterator = readerFor.readValues(jsonString);
while (mappingIterator.hasNext()) {
Container container = (Container) mappingIterator.next();
containers.add(container);
}
But I only receive empty objects (properties set to null) because the parsing of the properties is skipped since I set #JsonIgnoreProperties(ignoreUnknown = true).
Is this possible at all? Or should I implement something like a post-processing afterwards?
How about this: create a class ContainerActive like this:
public class ContainerActive {
private boolean active;
// constructors, setters, getters
}
And you could just do
Map<String, ContainerActive> map = mapper.readValue(jsonString, new TypeReference<Map<String, ContainerActive>>() {});
With this you will have "Container1" and "Container2" as the keys, and ContainerActive objects, each with an active field, as the values.
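If you also want the map key carried into the bean as its name, the conversion after deserialization is plain Java. A sketch with minimal stand-in classes (these are hypothetical, mirroring the question's fields):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ContainerMapping {
    // Minimal stand-ins for the beans discussed above.
    static class ContainerActive {
        final boolean active;
        ContainerActive(boolean active) { this.active = active; }
    }

    static class Container {
        final String name;
        final boolean active;
        Container(String name, boolean active) { this.name = name; this.active = active; }
        @Override public String toString() { return name + ":" + active; }
    }

    // Turn each (key, value) entry into a Container carrying the key as its name.
    static List<Container> toContainers(Map<String, ContainerActive> map) {
        return map.entrySet().stream()
                .map(e -> new Container(e.getKey(), e.getValue().active))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, ContainerActive> map = new LinkedHashMap<>();
        map.put("Container1", new ContainerActive(true));
        map.put("Container2", new ContainerActive(false));
        System.out.println(toContainers(map)); // [Container1:true, Container2:false]
    }
}
```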
As a quick alternative, if every top-level field is such a container object, you can read the JSON into a JSONObject and iterate over its keys, as in the code below:
import java.io.IOException;
import org.json.JSONException;
import org.json.JSONObject;
import com.fasterxml.jackson.core.JsonParseException;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.ObjectMapper;
public class TestSO {
public static void main(String[] args) throws JsonParseException, JsonMappingException, JSONException, IOException {
String jsonString = "{\r\n" +
" \"Container1\": {\r\n" +
" \"active\": true\r\n" +
" },\r\n" +
" \"Container2\": {\r\n" +
" \"active\": false\r\n" +
" },\r\n" +
"}";
JSONObject jsonObject = new JSONObject(jsonString);
ObjectMapper mapper = new ObjectMapper();
for (String key : jsonObject.keySet()) {
Container container = mapper.readValue(jsonObject.get(key).toString(), Container.class);
System.out.println(container);
}
}
static class Container{
private String name;
private Boolean active;
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public Boolean getActive() {
return active;
}
public void setActive(Boolean active) {
this.active = active;
}
@Override
public String toString() {
return "Container [name=" + name + ", active=" + active + "]";
}
}
}

Jackson-CSV schema for array

I have CSV file which I need to parse. The schema for this file is following:
name, contacts - where contacts is a variable number of strings per person;
the number of contact columns is not fixed.
For example:
john, john#wick.com, 123 123 123, fb/john.wick
mike, 123 0303 11
dave,
I'm trying to create a CsvSchema with Jackson CSV for my bean:
public class Person {
private String name;
private String[] contacts;
}
By creating custom schema:
CsvSchema schema = CsvSchema.builder()
.addColumn("name")
.addArrayColumn("contacts", ",")
.build();
But I am getting this:
com.fasterxml.jackson.dataformat.csv.CsvMappingException: Too many entries: expected at most 2
How can I solve a problem like that with Jackson CSV?
Java code:
CsvMapper mapper = new CsvMapper();
CsvSchema schema = CsvSchema.builder()
.addColumn("name")
.addArrayColumn("contacts", ",")
.build();
MappingIterator<Person> it = mapper.readerFor(Person.class).with(schema)
.readValues(csvString);
List<Person> all = it.readAll();
You can use the CsvParser.Feature.WRAP_AS_ARRAY feature and read each row as a List<String>. In the constructor you can then convert the List into a Person object. See the example below:
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvParser;
import java.io.File;
import java.util.List;
import java.util.stream.Collectors;
public class CsvApp {
public static void main(String[] args) throws Exception {
File csvFile = new File("./resource/test.csv").getAbsoluteFile();
CsvMapper csvMapper = new CsvMapper();
csvMapper.enable(CsvParser.Feature.WRAP_AS_ARRAY);
MappingIterator<List<String>> rows = csvMapper.readerFor(List.class).readValues(csvFile);
List<Person> persons = rows.readAll().stream()
.filter(row -> !row.isEmpty())
.map(Person::new)
.collect(Collectors.toList());
persons.forEach(System.out::println);
}
}
class Person {
private String name;
private List<String> contacts;
public Person(List<String> row) {
this.name = row.remove(0);
this.contacts = row;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public List<String> getContacts() {
return contacts;
}
public void setContacts(List<String> contacts) {
this.contacts = contacts;
}
@Override
public String toString() {
return "Person{" +
"name='" + name + '\'' +
", contacts=" + contacts +
'}';
}
}
For your input above code prints:
Person{name='john', contacts=[john#wick.com, 123 123 123, fb/john.wick]}
Person{name='mike', contacts=[123 0303 11]}
Person{name='dave', contacts=[]}
I think the problem is that the column separator is the same as the array element separator.
You can use ; instead of , between the contacts in your CSV.
The row should be: john,john#wick.com;123 123 123;fb/john.wick
This way you can keep using Jackson's features instead of manually instantiating from a List of Strings.
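For comparison, the split itself is easy to hand-roll when you control the format; a plain-Java sketch assuming ';' between contacts (an illustration, not a replacement for the Jackson approach above):

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class PersonLineParser {
    // Parse "name,contact;contact;...": the first comma separates the name
    // from the contact list, ';' separates the individual contacts.
    static Map.Entry<String, List<String>> parse(String line) {
        String[] parts = line.split(",", 2);
        String name = parts[0].trim();
        List<String> contacts = new ArrayList<>();
        if (parts.length > 1 && !parts[1].trim().isEmpty()) {
            for (String contact : parts[1].split(";")) {
                contacts.add(contact.trim());
            }
        }
        return new AbstractMap.SimpleEntry<>(name, contacts);
    }

    public static void main(String[] args) {
        System.out.println(parse("john,john#wick.com;123 123 123;fb/john.wick"));
        System.out.println(parse("dave,")); // empty contact list
    }
}
```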

How to dynamically assign headers to a csv file using CsvMapper in Java

Can anyone help please?
I am stuck on reading a CSV file and deserializing it into a POJO.
I am using CsvMapper from the Jackson library. The reading and deserialization parts are done and work fine-ish. The issue is that when the user moves the headers/columns around, the mapper assumes the values in the CSV file follow the alphabetical order of the properties.
E.g. (the file below has headers in the first row and a person's detail values in the second row):
personNameHeader,personAgeHeader
Wiliam,32
Now my POJO is as follow
@JsonIgnoreProperties(ignoreUnknown = true)
// @JsonPropertyOrder(value = {"personNameHeader", "personAgeHeader" })
public class PersonDetailsCSVTemplate {
@JsonProperty("personNameHeader")
private String name;
@JsonProperty("personAgeHeader")
private String age;
//Public constructor and getters and setters...
This is the code that reads the values from the CSV and maps them onto the class:
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.dataformat.csv.CsvMapper;
import com.fasterxml.jackson.dataformat.csv.CsvSchema;
...
CsvMapper csvMapper = new CsvMapper();
CsvSchema schema = csvMapper.typedSchemaFor(PersonDetailsCSVTemplate.class).withHeader();
MappingIterator<PersonDetailsCSVTemplate > dataIterator = csvMapper.readerFor(PersonDetailsCSVTemplate.class).with(schema)
.readValues(data);
while (dataIterator.hasNextValue()) {
PersonDetailsCSVTemplate dataCSV = dataIterator.nextValue();
}
After deserialization it can be seen that CsvMapper mapped the following:
PersonDetailsCSVTemplate.name = "32" and PersonDetailsCSVTemplate.age = "Wiliam"
Annotating the class with @JsonPropertyOrder(value = {"personNameHeader", "personAgeHeader" }) forces the CSV to always have the name column followed by the age column, which isn't ideal.
Can anyone suggest anything that they think will work?
Regards
Since Jackson 2.7, you can use withColumnReordering(true) instead of sortedBy()
CsvSchema schema = csvMapper
.typedSchemaFor(PersonDetailsCSVTemplate.class)
.withHeader()
.withColumnReordering(true);
You could use sortedBy and give it the order in which the properties appear in your CSV data:
@JsonIgnoreProperties(ignoreUnknown = true)
@JsonPropertyOrder(value = { "personNameHeader", "personAgeHeader" })
static class PersonDetailsCSVTemplate
{
@JsonProperty("personNameHeader")
private String name;
@JsonProperty("personAgeHeader")
private String age;
public String getName()
{
return name;
}
public void setName(String name)
{
this.name = name;
}
public String getAge()
{
return age;
}
public void setAge(String age)
{
this.age = age;
}
@Override
public String toString()
{
return "PersonDetailsCSVTemplate [name=" + name + ", age=" + age + "]";
}
}
You can keep or drop @JsonPropertyOrder; it won't affect the output.
@Test
public void sort() throws IOException
{
CsvMapper csvMapper = new CsvMapper();
CsvSchema schema = csvMapper
.typedSchemaFor(PersonDetailsCSVTemplate.class)
.withHeader()
.sortedBy("personNameHeader", "personAgeHeader")
.withColumnSeparator(',')
.withComments();
MappingIterator<PersonDetailsCSVTemplate> dataIterator =
csvMapper
.readerFor(PersonDetailsCSVTemplate.class)
.with(schema)
.readValues("personNameHeader,personAgeHeader\r\n"
+
"Wiliam,32\r\n");
while (dataIterator.hasNextValue())
{
PersonDetailsCSVTemplate dataCSV = dataIterator.nextValue();
System.out.println(dataCSV);
}
}
Output:
PersonDetailsCSVTemplate [name=Wiliam, age=32]

Parsing JSON data into model objects in Java

I haven't worked with JSON data before, thus the question.
I've the following JSON object in a file.
{
"courses": [
{ "id":998", "name":"Java Data Structures", "teacherId":"375" },
{ "id":"999", "name":"Java Generics", "teacherId":"376" }
],
"teachers": [
{ "id":"375", "firstName":"Amiyo", "lastName":"Bagchi"},
{ "id":"376", "firstName":"Dennis", "lastName":"Ritchie"}
]
}
Here are my model Objects.
public class Course {
private int _id;
private String _name;
private Teacher _teacher;
}
public class Teacher {
private int _id;
private String _firstName;
private String _lastName;
}
My task is to read the JSON Objects and return a list of Model objects.
I've imported the json-simple jar, and here's my code that reads the file:
FileReader reader = new FileReader(path);
JSONParser parser = new JSONParser();
Object obj = parser.parse(reader);
JSONObject jsonObject = (JSONObject) obj;
My question is,
How do I parse the JSON document into my Model objects?
If the input file is JSON but of a different format how do I throw exception/handle the anomaly?
Any help appreciated.
UPDATE: I suggest you use a JSON parser (org.json) to parse the data:
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.ArrayList;
import java.util.List;
import java.util.HashMap;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
class Course {
public int _id;
public String _name;
public Teacher _teacher;
private Course(int id, String name, Teacher teacher){
this._id = id;
this._name = name;
this._teacher = teacher;
}
public Course() {
}
}
class Teacher {
public int _id;
public String _firstName;
public String _lastName;
private Teacher(int id, String fname, String lname){
this._id = id;
this._firstName = fname;
this._lastName = lname;
}
public Teacher(){
}
}
public class JsonTest {
public static void main(String[] args) throws JSONException, IOException {
// String JSON_DATA = "{\n"+
// " \"courses\": [\n"+
// " { \"id\":\"998\", \"name\":\"Java Data Structures\", \"teacherId\":\"375\" },\n"+
// " { \"id\":\"999\", \"name\":\"Java Generics\", \"teacherId\":\"376\" }\n"+
// "\n"+
// " ],\n"+
// " \"teachers\": [\n"+
// " { \"id\":\"375\", \"firstName\":\"Amiyo\", \"lastName\":\"Bagchi\"},\n"+
// " { \"id\":\"376\", \"firstName\":\"Dennis\", \"lastName\":\"Ritchie\"} \n"+
// " ]\n"+
// "}\n"+
// "";
// read json file into string
String JSON_DATA = new String(Files.readAllBytes(Paths.get("path_to_json_file")), StandardCharsets.UTF_8);
// using a JSON parser
JSONObject obj = new JSONObject(JSON_DATA);
// parse "teachers" first
List<Teacher> listTeachers = new ArrayList<>();
List<JSONObject> listObjs = parseJsonData(obj, "teachers");
for (JSONObject c : listObjs) {
Teacher teacher = new Teacher();
teacher._id = c.getInt("id");
teacher._firstName = c.getString("firstName");
teacher._lastName = c.getString("lastName");
listTeachers.add(teacher);
}
// index teachers by id so courses can look them up
HashMap<String, Teacher> teachersById = new HashMap<>();
for (Teacher t : listTeachers) {
teachersById.put(Integer.toString(t._id), t);
}
// parse "courses" next
List<Course> resultCourses = new ArrayList<>();
List<JSONObject> listObjs2 = parseJsonData(obj, "courses");
for (JSONObject c : listObjs2) {
Course course = new Course();
course._id = c.getInt("id");
course._name = c.getString("name");
course._teacher = teachersById.get(Integer.toString(c.getInt("teacherId")));
resultCourses.add(course);
}
}
public static List<JSONObject> parseJsonData(JSONObject obj, String pattern)throws JSONException {
List<JSONObject> listObjs = new ArrayList<JSONObject>();
JSONArray geodata = obj.getJSONArray (pattern);
for (int i = 0; i < geodata.length(); ++i) {
final JSONObject site = geodata.getJSONObject(i);
listObjs.add(site);
}
return listObjs;
}
}
BTW: the JSON data in the example has one value whose double quotes are not paired ("id":998"). It must be fixed before parsing can proceed.
You should try using Jackson as the JSON parsing library instead. There is a lot more support and features that come with it.
In your case, a couple of annotations to map the JSON properties to the Java fields should be sufficient.
https://github.com/FasterXML/jackson-annotations
https://github.com/FasterXML/jackson-databind
UPDATE: Some code, to show just how much better this can be done with Jackson.
public class Course {
@JsonProperty("id")
private int _id;
@JsonProperty("name")
private String _name;
@JsonProperty("teacher")
private Teacher _teacher;
// ...public getters and setters
}
public class Teacher {
@JsonProperty("id")
private int _id;
@JsonProperty("firstName")
private String _firstName;
@JsonProperty("lastName")
private String _lastName;
// ...public getters and setters
}
// Container class to conform to JSON structure
public class CoursesDto {
private List<Teacher> teachers;
private List<Course> courses;
}
// In your parser place
ObjectMapper mapper = new ObjectMapper();
FileReader reader = new FileReader(path);
CoursesDto dto = mapper.readValue(reader, CoursesDto.class);
The @JsonProperty annotations tell Jackson which JSON key to use when deserializing. They are not necessary if the property names match the JSON keys: if you remove the leading underscore from your property names, this would work without annotations. Jackson also defaults to using public fields and getter/setter methods, so you can keep your fields prefixed with _ as long as the getters/setters don't have it (setFirstName(String firstName)).
