JSON/Java: write at the next free line

I'm trying to write a JSON file and I want to get output like this:
{"key1" : "value"..}
{"key4" : "value"..}
in the same file.
I've done it this way:
public class Writer {
    public void writerMeth(String[] dataArray) throws IOException {
        JsonObjectBuilder builder = Json.createObjectBuilder();
        builder.add("key1", dataArray[0])
               .add("key2", dataArray[1])
               .add("key3", dataArray[2]);
        JsonStructure output = builder.build();
        HashMap<String, Object> config = new HashMap<String, Object>();
        config.put(JsonGenerator.PRETTY_PRINTING, true);
        JsonWriterFactory factory = Json.createWriterFactory(config);
        JsonWriter writer = factory.createWriter(new FileOutputStream("file.json"));
        writer.write(output);
        writer.close();
    }
}
My problem is that each call to the method deletes the old data; I want to write the next entry on the next free line. Is that possible?
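It is possible if you open the file in append mode and write one object per line yourself; note that the file then contains a sequence of JSON objects (one per line) rather than a single valid JSON document. A minimal sketch of that approach (not from the original post, and without pretty printing, which would spread each object over several lines):
public class Writer {
    public void writerMeth(String[] dataArray) throws IOException {
        JsonObject output = Json.createObjectBuilder()
                .add("key1", dataArray[0])
                .add("key2", dataArray[1])
                .add("key3", dataArray[2])
                .build();

        // Render the object to a String first ...
        StringWriter buffer = new StringWriter();
        try (JsonWriter jsonWriter = Json.createWriter(buffer)) {
            jsonWriter.writeObject(output);
        }

        // ... then append it as one line; the 'true' flag opens the file
        // in append mode, so earlier entries are kept.
        try (FileWriter file = new FileWriter("file.json", true)) {
            file.write(buffer.toString());
            file.write(System.lineSeparator());
        }
    }
}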

Related

SnakeYaml dump function writes with single quotes

Consider the following code:
public void testDumpWriter() throws IOException {
    Map<String, Object> data = new HashMap<String, Object>();
    data.put("NAME1", "Raj");
    data.put("NAME2", "Kumar");
    Yaml yaml = new Yaml();
    FileWriter writer = new FileWriter("/path/to/file.yaml");
    for (Map.Entry m : data.entrySet()) {
        String temp = new StringBuilder().append(m.getKey()).append(": ").append(m.getValue()).toString();
        yaml.dump(temp, writer);
    }
}
The output of the above code is
'NAME1: Raj'
'NAME2: Kumar'
But I want the output without the single quotes, like
NAME1: Raj
NAME2: Kumar
That format is much easier to parse.
If anyone has a solution, please help me fix this. Thanks in advance.
Well, SnakeYaml does exactly what you tell it to: for each entry in the Map, it dumps the concatenation of the key, the String ": ", and the value as a YAML document. A String maps to a scalar in YAML, and since the scalar contains a ":" followed by a space, it must be quoted (otherwise it would be parsed as a key-value pair).
What you actually want to do is to dump the Map as YAML mapping. You can do it like this:
public void testDumpWriter() throws IOException {
    Map<String, Object> data = new HashMap<String, Object>();
    data.put("NAME1", "Raj");
    data.put("NAME2", "Kumar");
    DumperOptions options = new DumperOptions();
    options.setDefaultFlowStyle(DumperOptions.FlowStyle.BLOCK);
    Yaml yaml = new Yaml(options);
    FileWriter writer = new FileWriter("/path/to/file.yaml");
    yaml.dump(data, writer);
}
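With DumperOptions.FlowStyle.BLOCK the map is dumped as a block mapping, one NAME: value pair per line and without quotes, which is the format asked for.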

Maps and Jxls - process different excel sheets separately with XLSTransformer

I have an Excel template with two sheets that I want to populate through XLSTransformer. The data are different for the two sheets (lists of different lengths, with one taking the results of a table; see the code below), meaning that I cannot pass them through one map.
I've tried with two maps:
//for the first excel sheet
Map<String, List<ListData>> beanParams = new HashMap<String, List<ListData>>();
beanParams.put("rs", rs);
//for the second one
Map<String, List<DetailResult>> beanParams2 = new HashMap<String, List<DetailResult>>();
beanParams2.put("detRes", detRes);
XLSTransformer former = new XLSTransformer();
former.transformXLS(srcFilePath, beanParams, destFilePath);
former.transformXLS(srcFilePath, beanParams2, destFilePath);
The lists look like this:
List<Results> rs = new ArrayList<Results>();
Results s1 = new Results(compteurFemme, compteurHomme, compteurTot, averageTempsFemme, averageTempsHomme, averageTempsTot);
rs.add(s1);

List<ResultsDetails> detRes = new ArrayList<ResultsDetails>();
for (int i = 0; i < tableau.getRowCount(); i++) {
    item[i] = ((DataIdGenre) tableau.getModel()).getValueAt(i, 2).toString();
    rep[i] = ((DataIdGenre) tableau.getModel()).getValueAt(i, 3).toString();
    justefaux[i] = ((DataIdGenre) tableau.getModel()).getValueAt(i, 4).toString();
    tempsrep[i] = ((DataIdGenre) tableau.getModel()).getValueAt(i, 5).toString();
    ResultsDetails newRes = new ResultsDetails(item[i], rep[i], justefaux[i], tempsrep[i]);
    detRes.add(newRes);
}
Individually, each export works on its respective sheet, but together the second erases the first.
I then tried to use some kind of multimap, with one key (the one I put in the Excel template) for two values:
Map<String, List<Object>> hm = new HashMap<String, List<Object>>();
List<Object> values = new ArrayList<Object>();
values.add(rs);
values.add(detRes);
hm.put("det", values);
XLSTransformer former = new XLSTransformer();
former.transformXLS(srcFilePath, hm, destFilePath);
But I got an error telling me that the data was inaccessible.
So my question is: is there a way to deal directly with different sheets when using XLSTransformer?
OK, I've come up with something, using a temporary file:
private void exportDataDet(File file) throws ParseException, IOException, ParsePropertyException, InvalidFormatException {
    List<ResultsDetails> detRes = generateResultsDetails();
    try (InputStream is = IdGenre.class.getResourceAsStream("/xlsTemplates/IdGenre/IdGenre_20-29_+12.xlsx")) {
        try (OutputStream os = new FileOutputStream("d:/IdGenreYOLO.xlsx")) {
            Context context = new Context();
            context.putVar("detRes", detRes);
            JxlsHelper.getInstance().processTemplate(is, os, context);
        }
    }
}

private void exportData(File file) throws ParseException, IOException, ParsePropertyException, InvalidFormatException {
    List<Results> rs = generateResults();
    try {
        String srcFilePath = "d:/IdGenreYOLO.xlsx";
        String destFilePath = "d:/IdGenreRes.xlsx";
        Map<String, List<Results>> beanParams = new HashMap<String, List<Results>>();
        beanParams.put("rs", rs);
        XLSTransformer former = new XLSTransformer();
        former.transformXLS(srcFilePath, beanParams, destFilePath);
    } finally {
        // Delete the temporary file once the final workbook has been written.
        Files.delete(Paths.get("d:/IdGenreYOLO.xlsx"));
    }
}
It's probably not the best solution, especially since I have to add other data that doesn't fit in the two existing lists, and, at least for now, that will be done through a second temporary file.
I didn't use the same method twice because XLSTransformer was the only approach I was able to get working with the Excel template I was given (which I can't modify).
I'm open to any suggestion.
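One suggestion, as a sketch (assuming each sheet of the template only references its own bean name, i.e. rs and detRes): put both lists into a single map and call transformXLS once, so the second pass cannot erase the output of the first:
// One map holding the beans for both sheets; a single transform pass
// fills both sheets in the same destination file.
Map<String, Object> beanParams = new HashMap<String, Object>();
beanParams.put("rs", rs);
beanParams.put("detRes", detRes);

XLSTransformer former = new XLSTransformer();
former.transformXLS(srcFilePath, beanParams, destFilePath);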

How to rename columns via a lambda function - FasterXML

I'm using the FasterXML library to parse my CSV file. The CSV file has the column names in its first line. Unfortunately, I need the columns to be renamed. I have a lambda function for this, to which I can pass the value read from the CSV file and get the new name back.
My code looks like this, but it does not work.
CsvSchema csvSchema = CsvSchema.emptySchema().withHeader();
ArrayList<HashMap<String, String>> result = new ArrayList<HashMap<String, String>>();
MappingIterator<HashMap<String, String>> it = new CsvMapper().reader(HashMap.class)
        .with(csvSchema)
        .readValues(new File(fileName));
while (it.hasNext())
    result.add(it.next());

System.out.println("changing the schema columns.");
for (int i = 0; i < csvSchema.size(); i++) {
    String name = csvSchema.column(i).getName();
    String newName = getNewName(name);
    csvSchema.builder().renameColumn(i, newName);
}
csvSchema.rebuild();
When I try to print out the columns later, they are still the same as in the first line of my CSV file.
Additionally, I noticed that csvSchema.size() equals 0. Why is that?
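A note on the last point: emptySchema().withHeader() only tells the parser to read column names from the first row; the CsvSchema object itself holds no columns, which is why csvSchema.size() is 0, and since schemas are immutable the builders returned by builder()/rebuild() are discarded without effect. One workaround that stays with Jackson (a sketch continuing from the result list and the getNewName function above, not tested against your data) is to rename the keys of the parsed maps after reading:
// Hypothetical post-processing step: build new maps whose keys have been
// passed through the existing getNewName(...) renaming function.
ArrayList<HashMap<String, String>> renamed = new ArrayList<HashMap<String, String>>();
for (HashMap<String, String> row : result) {
    HashMap<String, String> newRow = new HashMap<String, String>();
    for (Map.Entry<String, String> e : row.entrySet()) {
        newRow.put(getNewName(e.getKey()), e.getValue());
    }
    renamed.add(newRow);
}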
You could instead use uniVocity-parsers for that. The following solution streams the input rows to the output, so you don't need to load everything into memory before writing your data back with new headers. It will be much faster:
public static void main(String... args) throws Exception {
    Writer output = new StringWriter(); // use a FileWriter for your case
    CsvWriterSettings writerSettings = new CsvWriterSettings(); // many options here - check the documentation
    final CsvWriter writer = new CsvWriter(output, writerSettings);

    CsvParserSettings parserSettings = new CsvParserSettings(); // many options here as well
    parserSettings.setHeaderExtractionEnabled(true); // indicates the first row of the input are headers
    parserSettings.setRowProcessor(new AbstractRowProcessor() {
        @Override
        public void processStarted(ParsingContext context) {
            writer.writeHeaders("Column A", "Column B", "... etc");
        }

        @Override
        public void rowProcessed(String[] row, ParsingContext context) {
            writer.writeRow(row);
        }

        @Override
        public void processEnded(ParsingContext context) {
            writer.close();
        }
    });

    CsvParser parser = new CsvParser(parserSettings);
    Reader reader = new StringReader("A,B,C\n1,2,3\n4,5,6"); // use a FileReader for your case
    parser.parse(reader); // all rows are parsed and submitted to the RowProcessor implementation of the parserSettings.

    System.out.println(output.toString());
    // nothing else to do. All resources are closed automatically in case of errors.
}
You can easily select the columns by using parserSettings.selectFields("B", "A") in case you want to reorder/eliminate columns.
Disclosure: I am the author of this library. It's open-source and free (Apache V2.0 license).

YAML values not getting returned correctly

I've got a YAML file that looks like this:
---
name:
  storage:
    documentfiles:
      username: rafa
      password: hello
And I'm trying to get the username and password values at the bottom. My current code is shown below. I'm using a Map to store the YAML values, but since there is more than one level of nesting, calling map.get() on anything past name gives me a null value. If I do map.get("name") I get {storage={documentfiles={username=rafa, password=hello}}}. Does anyone know how I can correctly get the username and password?
public Map grabYaml() {
    Yaml reader = new Yaml();
    InputStream inputStream = getClass().getClassLoader().getResourceAsStream(yamlFileName);
    Map map = (Map) reader.load(inputStream);
    return map;
}
Something like this:
public class Test {
    public Map grabYaml() throws IOException {
        Yaml reader = new Yaml();
        InputStream inputStream = new FileInputStream(new File(yamlFileName));
        Map map = (Map) reader.load(inputStream);
        return map;
    }

    public static void main(String[] args) throws IOException {
        // Walk down the nested maps: name -> storage -> documentfiles
        Map storage = (Map) new Test().grabYaml().get("name");
        Map documentfiles = (Map) storage.get("storage");
        Map userData = (Map) documentfiles.get("documentfiles");
        System.out.println(userData.get("username"));
        System.out.println(userData.get("password"));
    }
}

Writing and reading ListMultimap<Object, Object> in file

I tried writing a ListMultimap to a file using Properties, but it seems impossible; see the question Writing and reading ListMultimap to file using Properties.
Going forward, if using Properties to store a ListMultimap is not the correct way, how can we store a ListMultimap in a file? And how can we read it back from the file?
E.g. let's say I have:
ListMultimap<Object, Object> index = ArrayListMultimap.create();
How can I write methods to write this ListMultimap to a file and read it back from the file:
void writeToFile(ListMultimap multiMap, String filePath) {
    // ??
}

ListMultimap readFromFile(String filePath) {
    ListMultimap multiMap;
    // multiMap = read from file
    return multiMap;
}
You need to decide how you will represent each object in the file. For example, if your ListMultimap contains Strings you can simply write the string value, but if you're dealing with complex objects you need to produce a representation of each object as a byte[], which, if you want to use Properties, should then be Base64 encoded.
The basic read method should be something like:
public ListMultimap<Object, Object> read(InputStream in) throws IOException
{
    ListMultimap<Object, Object> index = ArrayListMultimap.create();
    Properties properties = new Properties();
    properties.load(in);
    for (Object serializedKey : properties.keySet())
    {
        Object deserializedKey = deserialize((String) serializedKey);
        String values = properties.getProperty((String) serializedKey);
        for (String value : values.split(","))
        {
            index.put(deserializedKey, deserialize(value));
        }
    }
    return index;
}
And the write method this:
public void write(ListMultimap<Object, Object> index, OutputStream out) throws IOException
{
    Properties properties = new Properties();
    for (Object key : index.keySet())
    {
        StringBuilder values = new StringBuilder();
        for (Object value : index.get(key))
        {
            values.append(serialize(value)).append(",");
        }
        properties.setProperty(serialize(key), values.substring(0, values.length() - 1));
    }
    properties.store(out, "saving");
}
This example makes use of serialize and deserialize methods that you'll need to define according to your requirements, but the signatures are:
public String serialize(Object object)
and
public Object deserialize(String s)
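For instance, if both keys and values are plain Strings, a minimal sketch of those two methods could be the following (anything more complex would need real serialization, e.g. Java serialization plus Base64 as mentioned above, and the comma separator would need escaping if values can contain commas):
// Simplest possible case: keys and values are already Strings.
public String serialize(Object object)
{
    return String.valueOf(object);
}

public Object deserialize(String s)
{
    return s;
}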
