Ordering keys in .properties file alphabetically while ignoring case sensitivity - java

I have this function that stores values from a .properties file into a TreeMap (translatedMap), then retrieves new values from "keyMap" and stores them into "translatedMap" as well. The issue is that, no matter what I do, it always separates capitalized keys from non-capitalized keys. Here is my code:
Properties translation = new Properties(){
private static final long serialVersionUID = 1L;
@Override
public synchronized Enumeration<Object> keys() {
return Collections.enumeration(new TreeSet<Object>(super
.keySet()));
}
};
//creates file and stores values of keyMap into the file
try {
TreeMap<String, String> translatedMap = new TreeMap<String, String>(String.CASE_INSENSITIVE_ORDER);
InputStreamReader in = new InputStreamReader(new FileInputStream(filePath), "UTF-8");
translation.load(in);
// Store all values to TreeMap and sort
Enumeration<?> e = translation.propertyNames();
while (e.hasMoreElements()) {
String key = (String) e.nextElement();
if (key.matches(".#")) {
} else {
String value = translation.getProperty(key);
translatedMap.put(key, value);
}
}
// Add new values to translatedMap
for (String key : keyMap.keySet()) {
// Handle if some keys have already been added; delete so they can be re-added
if (translatedMap.containsKey(key)) {
translatedMap.remove(key);
}
translatedMap.put(key, keyMap.get(key));
}
in.close();
translation.putAll(translatedMap);
File translationFile = new File(filePath);
OutputStreamWriter out = new OutputStreamWriter(new FileOutputStream(translationFile, false), "UTF-8");
translation.store(out, null);
out.close();
} catch (IOException e) {
e.printStackTrace();
}
}
The output I'm getting is something like:
CAPITALIZED_KEY1=value1
CAPITALIZED_KEY2=value2
alowercase.key=value3
anotherlowercase.key=value4
morelowercase.keys=value5
When I would want it to come out like:
alowercase.key=value3
anotherlowercase.key=value4
CAPITALIZED_KEY1=value1
CAPITALIZED_KEY2=value2
morelowercase.keys=value5

Properties are not ordered. It doesn't matter what order you insert into them, or whether you call putAll() with something that is sorted: Properties extends Hashtable, which maintains no ordering. See here.

The basic problem is that, although the sorting should be case-insensitive, the map itself must remain case-sensitive, because property names are case-sensitive (a purely case-insensitive ordered map would treat keys differing only in case as the same key).
Hence override Properties and, when writing, sort the names case-insensitively.
public class SortedProperties extends Properties {
    @Override
    public void store(Writer writer, String comments) throws IOException {
        List<String> names = new ArrayList<>();
        for (Enumeration<?> en = propertyNames(); en.hasMoreElements(); ) {
            names.add(en.nextElement().toString());
        }
        Collections.sort(names, new Comparator<String>() {
            @Override
            public int compare(String one, String other) {
                return one.toLowerCase().compareTo(other.toLowerCase());
            }
        });
        //... write all properties
    }
}
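The "//... write all properties" step is left to the reader. As a rough sketch of what could go in its place (the BufferedWriter variable is mine, and unlike the real Properties.store() this does no escaping of '=', ':' or backslashes):
// Sketch of the elided write step, inside store():
BufferedWriter out = new BufferedWriter(writer);
if (comments != null) {
    out.write("#" + comments);
    out.newLine();
}
out.write("#" + new Date());          // Properties.store() also writes a date comment
out.newLine();
for (String name : names) {
    // NOTE: no escaping of special characters, unlike Properties.store()
    out.write(name + "=" + getProperty(name));
    out.newLine();
}
out.flush();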

To achieve this I ended up avoiding the store() function altogether. I did the sorting inside the TreeMap, then used a BufferedWriter to write to the file, like this:
Properties translation = new Properties();
//creates file and stores values of keyMap into the file
try {
TreeMap<String, String> translatedMap = new TreeMap<String, String>(new Comparator<String>() {
public int compare(String o1, String o2) {
return o1.toLowerCase().compareTo(o2.toLowerCase());
}
});
InputStreamReader in = new InputStreamReader(new FileInputStream(filePath), "UTF-8");
translation.load(in);
// Store all values to TreeMap and sort
for (String key : translation.stringPropertyNames()) {
keyMap.put(key, translation.getProperty(key));
}
in.close();
Iterator<String> it = keyMap.keySet().iterator();
while (it.hasNext()) {
String key = it.next();
translatedMap.put(key, keyMap.get(key));
}
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(filePath, false), "UTF-8"));
bw.write("#" + new Date().toString());
bw.newLine();
Iterator<String> it2 = translatedMap.keySet().iterator();
while (it2.hasNext()) {
String key = it2.next();
bw.write(key + '=' + translatedMap.get(key));
bw.newLine();
}
bw.close();
} catch (IOException e) {
e.printStackTrace();
}
}
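One caveat with writing the pairs by hand like this: Properties.load() treats backslashes, '=' and ':' specially, so keys or values containing those characters would need escaping before being written. A minimal helper, illustrative and not part of the original answer, could be:
// Escapes the characters that Properties.load() treats specially.
// A fully faithful version would also escape leading whitespace and '#'/'!'.
private static String escapeForProperties(String s) {
    return s.replace("\\", "\\\\")
            .replace("=", "\\=")
            .replace(":", "\\:");
}
The write line then becomes bw.write(escapeForProperties(key) + '=' + escapeForProperties(translatedMap.get(key)));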

Related

Append to existing CSV with headers

I have a method that appends to a .csv file, but the problem is that it adds a header row every time as well. How can I append to the .csv correctly?
I am aware that adding to a List would do the job but this method is called in separate runs.
public static void writeToCSVFileAndSend(String facilityId, int candidateStockTakeContainersCount) throws IOException {
FileWriter report = new FileWriter("/tmp/MonthlyExpectedComplianceSuggestions.csv", true);
LocalDate today = java.time.LocalDate.now();
String[] headers = { "Warehouse", "Expected Count for "+ today.getMonth().getDisplayName(TextStyle.SHORT, Locale.ENGLISH)};
Map<String, Integer> facilityExpectedMonthlyCountMap= new HashMap<String, Integer>() {
{
put(facilityId, candidateStockTakeContainersCount);
}
};
try (CSVPrinter printer = new CSVPrinter(report, CSVFormat.DEFAULT
.withHeader(headers))) {
facilityExpectedMonthlyCountMap.forEach((a, b) -> {
try {
printer.printRecord(a, b);
} catch (IOException e) {
e.printStackTrace();
}
});
}
}
Current Output
Warehouse,Expected Count for Dec
A,2147
Warehouse,Expected Count for Dec
B,0
Expected Output
Warehouse,Expected Count for Dec
A,2147
B,0
To avoid multiple headers, you should create the CSVPrinter object once and reuse it.
Depending on how you are getting the data, you may want to split the function in two and pass the CSVPrinter object around, as sketched after the code below.
public static void writeToCSVFileAndSend() throws IOException {
    File outputCSV = new File("/tmp/MonthlyExpectedComplianceSuggestions.csv");
    LocalDate today = java.time.LocalDate.now();
    String[] headers = { "Warehouse", "Expected Count for " + today.getMonth().getDisplayName(TextStyle.SHORT, Locale.ENGLISH) };
    // Only write the header row when the file does not exist yet
    boolean headerRequired = !outputCSV.exists();
    FileWriter report = new FileWriter(outputCSV, true); // append mode
    CSVPrinter printer;
    if (headerRequired) {
        printer = new CSVPrinter(report, CSVFormat.DEFAULT.withHeader(headers));
    } else {
        printer = new CSVPrinter(report, CSVFormat.DEFAULT);
    }
    // Iterate through combinations of facilityId and candidateStockTakeContainersCount
    // and call printRecord for each entry
    Map<String, Integer> facilityExpectedMonthlyCountMap = new HashMap<String, Integer>();
    // fill in your data in the map here
    facilityExpectedMonthlyCountMap.forEach((a, b) -> {
        try {
            printer.printRecord(a, b);
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    printer.close();
}
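If the data arrives in several batches, the "split the function in two" suggestion could look roughly like this sketch (method and variable names are illustrative, not from the original answer):
// Sketch: create the printer once, then pass it to whatever produces records.
public static CSVPrinter openMonthlyReport(File outputCSV, String[] headers) throws IOException {
    boolean headerRequired = !outputCSV.exists();
    FileWriter report = new FileWriter(outputCSV, true); // append mode
    CSVFormat format = headerRequired ? CSVFormat.DEFAULT.withHeader(headers) : CSVFormat.DEFAULT;
    return new CSVPrinter(report, format);
}

public static void appendCounts(CSVPrinter printer, Map<String, Integer> counts) throws IOException {
    for (Map.Entry<String, Integer> entry : counts.entrySet()) {
        printer.printRecord(entry.getKey(), entry.getValue());
    }
}
The caller creates the printer once, passes it to each producer of records, and closes it (for example in a try-with-resources block) after the last batch has been appended.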

how to fetch and validate csv header in open csv?

I want to fetch the header from a csv file. If I don't use skipLines I get the header as the array at index 0, but I want to fetch the header directly using HeaderColumnNameMappingStrategy, and it does not work with my code.
I also want to validate the header column list (e.g. the csv should not be allowed to contain extra columns).
I have also checked How to validate the csv headers using opencsv but it was not helpful to me.
@SuppressWarnings({ "unchecked", "rawtypes" })
public Map<String, Object> handleStockFileUpload(MultipartFile file, Long customerId) {
Map<String, Object> responseMap = new HashMap<>();
responseMap.put("datamap", "");
responseMap.put("errormap", "");
responseMap.put("errorkeys", "");
List<Map<String, Integer>> list = new ArrayList<>();
List<StockCsvDTO> csvStockList = new ArrayList<>();
try {
String fileName = new SimpleDateFormat("yyyy_MM_dd_HHmmss").format(new Date()) + "_" + file.getOriginalFilename();
responseMap.put("filename", fileName);
File stockFile = new File(productsUploadFilePath + fileName);
stockFile.getParentFile().mkdirs();
FileOutputStream fos = new FileOutputStream(stockFile);
fos.write(file.getBytes());
fos.close();
CsvTransfer csvTransfer = new CsvTransfer();
ColumnPositionMappingStrategy ms = new ColumnPositionMappingStrategy();
ms.setType(StockCsv.class);
Reader reader = Files.newBufferedReader(Paths.get(productsUploadFilePath + fileName));
CSVReader csvReader = new CSVReader(reader);
CsvToBean cb = new CsvToBeanBuilder(reader)
.withType(StockCsv.class)
.withMappingStrategy(ms)
.withSkipLines(1)
.build();
csvTransfer.setCsvList(cb.parse());
reader.close();
csvStockList = csvTransfer.getCsvList();
} catch (Exception e) {
e.printStackTrace();
responseMap.put("status", "servererror");
}
responseMap.put("datamap", csvStockList);
return responseMap;
}
I found the following solution:
Use @CsvBindByName with HeaderColumnNameMappingStrategy, e.g. annotate your bean properties with @CsvBindByName:
public static class HollywoodActor {
    private int id;

    @CsvBindByName(column = "First Name")
    private String firstName;

    @CsvBindByName(column = "Last Name")
    private String lastName;

    // getter / setter
}
Add a method like this:
public class CsvParser {
    public <T> ParseResult<T> parseByPropertyNames(Reader csvReader, Class<T> beanClass) throws IOException {
        CSVReader reader = new CSVReaderBuilder(csvReader)
                .withCSVParser(new CSVParserBuilder().build())
                .build();
        CsvToBean<T> bean = new CsvToBean<>();
        HeaderColumnNameMappingStrategy<T> mappingStrategy = new HeaderColumnNameMappingStrategy<>();
        mappingStrategy.setType(beanClass);
        bean.setMappingStrategy(mappingStrategy);
        bean.setCsvReader(reader);
        List<T> beans = bean.parse();
        return new ParseResult<>(mappingStrategy.generateHeader(), beans);
    }
}
Also don't forget to add the ParseResult class:
public class ParseResult <T> {
private final String[] headers;
private final List<T> lines;
// all-args constructor & getters
}
Then use them in your code:
String csv = "Id,First Name,Last Name\n" + "1, \"Johnny\", \"Depp\"\n" + "2, \"Al\", \"Pacino\"";
ParseResult<HollywoodActor> parseResult = parser.parseByPropertyNames(
        new InputStreamReader(new ByteArrayInputStream(csv.getBytes(StandardCharsets.UTF_8))),
        HollywoodActor.class);
From ParseResult.headers you can get the actual headers that were in your .csv file. Just compare them with what's expected.
Hope that helps!
Here I was comparing my csvHeader with originalHeader:
List<String> originalHeader = fileUploadUtility.getHeader(new StockCsv());
List<String> invalidHeader = csvHeader.stream().filter(o -> (originalHeader.stream().filter(f -> f.equalsIgnoreCase(o)).count()) < 1).collect(Collectors.toList());
if(null != invalidHeader && invalidHeader.size() > 0 && invalidHeader.toString().replaceAll("\\[\\]", "").length() > 0) {
msg = "Invalid column(s) : " + invalidHeader.toString().replace(", ]", "]") + ". Please remove invalid column(s) from file.";
resultMap.put(1, msg);
}
public List<String> getHeader(T pojo) {
// TODO Auto-generated method stub
final CustomMappingStrategy<T> mappingStrategy = new CustomMappingStrategy<>();
mappingStrategy.setType((Class<? extends T>) pojo.getClass());
String header[] = mappingStrategy.generateHeader();
List<String> strHeader = Arrays.asList(header);
return strHeader;
}
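The snippet above assumes csvHeader already holds the header row of the uploaded file; one way to obtain it, using only the JDK (names here are illustrative), could be:
// Read just the first line of the uploaded csv and split it into header names.
List<String> csvHeader;
try (BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()))) {
    String firstLine = br.readLine();                // null if the file is empty
    csvHeader = (firstLine == null)
            ? Collections.<String>emptyList()
            : Arrays.asList(firstLine.split(","));   // naive split; quoted commas are not handled
}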
Here is an alternative to your present problem. First, define what you expect your headers to look like. For example:
public static final ArrayList<String> fileFormat = new ArrayList<> (Arrays.asList("Values1", "Values2", "Values3", "Values4"));
Now, write a method to return custom errors if any exist:
public String validateCsvFileDetails(MultipartFile file, Set<String> requiredHeadersArray) {
Set<String> errors = new HashSet<>();
try {
InputStream stream = file.getInputStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
String headerLine = reader.readLine();
if (Objects.isNull(headerLine))
return "The file has no headers, please ensure it has the correct upload format";
List<String> headersInFileList;
String[] headersInFileArray;
if (headerLine.contains(",")) {
headersInFileArray = StringUtils.split(headerLine, ",");
headersInFileList = Arrays.asList(headersInFileArray);
} else//the headerline has only one headerfield
{
headersInFileList = Collections.singletonList(headerLine);
}
for (String header : requiredHeadersArray) {
if (!headersInFileList.contains(header))
errors.add("The file has the wrong header format, please ensure " + header + " header is present");
}
//if there are errors, return it
if (!errors.isEmpty())
return sysUtils.getStringFromSet(errors);
//Ensure the csv file actually has values after the header, but don't read beyond the first line
String line;
int counter = 0;
while ((line = reader.readLine()) != null) {
counter++;
if (counter > 0)
break;
}
//if line is null return validation error
if (Objects.isNull(line))
return "Cannot upload empty file";
} catch (Exception e) {
logger.error(new Object() {
}.getClass().getEnclosingMethod().getName(), e);
return "System Error";
}
return null;
}
Now you can validate your file headers as follows:
String errors = validateCsvFileDetails(file, new HashSet<>(fileFormat));
if (errors != null)
return error
//proceed
Give this a try using captureHeader as a pre-filter:
...
private class CustomHeaderColumnNameMappingStrategy<T> extends HeaderColumnNameMappingStrategy {
private String[] expectedHeadersOrdered = {"Column1", "Column2", "Column3", "Column4", "Column5"};
@Override
public void captureHeader(CSVReader reader) throws IOException, CsvRequiredFieldEmptyException {
String[] actualCsvHeaders = reader.peek();
String actualHeader, expectedHeader;
if (expectedHeadersOrdered.length > actualCsvHeaders.length) {
throw new CsvRequiredFieldEmptyException("Missing header column.");
} else if (expectedHeadersOrdered.length < actualCsvHeaders.length) {
throw new IOException("Unexpected extra header column.");
}
// Enforce strict column ordering with index
// TODO: you might want to employ simple hashMap, List, set, etc. as needed
for (int i=0; i<actualCsvHeaders.length; i++) {
actualHeader = actualCsvHeaders[i];
expectedHeader = expectedHeadersOrdered[i];
if ( ! expectedHeader.equals(actualHeader) ) {
throw new IOException("Header columns mismatch in ordering.");
}
}
super.captureHeader(reader); // Back to default processing if the headers include ordering are as expected
}
}
CustomHeaderColumnNameMappingStrategy<YourPOJO> yourMappingStrategy = new CustomHeaderColumnNameMappingStrategy<YourPOJO>();
yourMappingStrategy.setType(YourPOJO.class);
try {
    pojosFromCsv = new CsvToBeanBuilder<YourPOJO>(new FileReader(csvFile))
            .withType(YourPOJO.class)
            .withMappingStrategy(yourMappingStrategy)
            .build();
    pojosFromCsv.stream();
} catch (IOException e) {
    e.printStackTrace();
}
Inspired by Using captureHeader in OpenCSV

HASHMAP, key present still giving "false" result

Following is my code, which returns false even if the key exists:
import java.util.HashMap;
import java.util.Map;
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
public class SequenceNumber {
public static int getSequenceNumber (String TcOrderId){
// Create a hash map to set key values pair.
Map<String, Integer> map = new HashMap<String, Integer>();
int i= 1;
// check if hashmap contains the key.
System.out.println("key present " +map.containsKey(TcOrderId));
if (map.containsKey(TcOrderId))
{
//Key Present
System.out.println("Inside IF ");
int value = map.get(TcOrderId);
System.out.println("value from the key " + value);
map.remove(value);
map.put(TcOrderId, value+1);
return map.get(TcOrderId);
}
else
{
//Key Not present
System.out.println("INSIDE ELSE ");
map.put(TcOrderId, i);
System.out.println("map "+ map);
return map.get(TcOrderId);
}
}
public static void main(String[] args) throws IOException {
String sCurrentLine;
BufferedReader br = null;
try {
br = new BufferedReader(new FileReader("C:\\Users\\BongAn\\Desktop\\Package\\testing.txt"));
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
{
while ((sCurrentLine = br.readLine()) != null) {
//String orderid = sCurrentLine.substring(0, 6);
System.out.println("reading line " +sCurrentLine);
int seqvalue = getSequenceNumber(sCurrentLine);
System.out.println("seqvalue "+seqvalue);
}
}
}
}
Input data in the file:
1233
1233
1234
The result should be
1
2
1
But every time it goes into the else branch and the result is
1
1
1
I am trying to use a HashMap because I am creating my own index.
In your code, every time you call the getSequenceNumber function you create a new HashMap. I believe this is not what you want.
To avoid that, you can simply move Map<String, Integer> map = new HashMap<String, Integer>(); into the body of the class. Since getSequenceNumber is a static function, you will need to make the variable static as well. Hope this helps.
Snippet:
public class SequenceNumber {
// PUT STATIC VARIABLE HERE:
static Map<String, Integer> map = new HashMap<String, Integer>();
public static int getSequenceNumber (String TcOrderId){
// Create a hash map to set key values pair.
// (REMOVE) Map<String, Integer> map = new HashMap<String, Integer>();
int i= 1;
// check if hashmap contains the key.
...
}
...
}
Another alternative (perhaps better) would be to avoid static functions and variables and create an instance of a SequenceNumber object. That way you could keep several independent sequence counters.
Simple snippet:
public class SequenceNumber {
// Your hashmap here:
Map<String, Integer> map = new HashMap<String, Integer>();
public int getSequenceNumber (String TcOrderId) {
// ...
}
public static void main(String[] args) throws IOException {
// Instance of SequenceNumber object:
SequenceNumber sequenceNumber = new SequenceNumber();
String sCurrentLine;
BufferedReader br = null;
// ...
while ((sCurrentLine = br.readLine()) != null) {
//String orderid = sCurrentLine.substring(0, 6);
System.out.println("reading line " +sCurrentLine);
int seqvalue = sequenceNumber.getSequenceNumber(sCurrentLine);
System.out.println("seqvalue "+seqvalue);
}
// ...
}
}
Something like this should work. Haven't tried running it though.
public class SequenceNumber {
public static int getSequenceNumber (String TcOrderId, Map<String, Integer> map){
if (!map.containsKey(TcOrderId)) {
map.put(TcOrderId, 0);
}
map.put(TcOrderId, map.get(TcOrderId)+1);
return map.get(TcOrderId);
}
public static void main(String[] args) throws IOException {
String sCurrentLine;
BufferedReader br = null;
Map<String, Integer> map = new HashMap<String, Integer>();
try {
br = new BufferedReader(new FileReader("C:\\Users\\BongAn\\Desktop\\Package\\testing.txt"));
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
{
while ((sCurrentLine = br.readLine()) != null) {
//String orderid = sCurrentLine.substring(0, 6);
System.out.println("reading line " +sCurrentLine);
int seqvalue = getSequenceNumber(sCurrentLine, map);
System.out.println("seqvalue "+seqvalue);
}
}
}
}
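As a side note, the increment-or-initialize step can also be written with Map.merge, which removes the separate containsKey check. A minimal sketch with the same method shape (parameter renamed to tcOrderId by convention):
public static int getSequenceNumber(String tcOrderId, Map<String, Integer> map) {
    // Inserts 1 if the key is absent, otherwise adds 1 to the existing value,
    // and returns the resulting count.
    return map.merge(tcOrderId, 1, Integer::sum);
}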

Java: Write HashMap into File

I'm working on a program that saves two words into a HashMap. I need to be able to take each HashMap key and value and write them into a file in "key:value" format. When my save() method is called, the HashMap contents are supposed to be written into the file whose name was given as a parameter to the constructor. The method returns false if the file can't be saved; otherwise it returns true. However, it's not working if the file does not exist, and it's also not saving changes made to an existing file. I don't understand reading/writing files too well... Thank you.
package dictionary;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.util.HashMap;
import java.util.Scanner;
public class MindfulDictionary {
private HashMap<String, String> words;
private File file;
public MindfulDictionary() {
this.words = new HashMap<String, String>();
}
public MindfulDictionary(String file) {
this.file = new File(file);
this.words = new HashMap<String, String>();
}
public boolean load() {
try {
Scanner fileReader = new Scanner(this.file);
while (fileReader.hasNextLine()) {
String line = fileReader.nextLine();
String[] parts = line.split(":"); // the line is split at :
String word = parts[0];
String trans = parts[1];
this.add(word, trans);
}
} catch (Exception e) {
System.out.println("nope");
}
return true;
}
public boolean save() {
boolean saved = true;
BufferedWriter writer = null;
try {
writer = new BufferedWriter(new FileWriter(this.file.getName(), true));
for (String key : this.words.keySet()) {
writer.write(key + ":" + this.words.get(key) + "\n");
writer.newLine();
writer.flush();
writer.close();
}
} catch (Exception e) {
}
return saved;
}
public void add(String word, String translation) {
if ((!this.words.containsKey(word))) {
this.words.put(word, translation);
}
}
public String translate(String word) {
if (this.words.containsKey(word)) {
return this.words.get(word);
} else if (this.words.containsValue(word)) {
for (String key : this.words.keySet()) {
if (this.words.get(key).equals(word)) {
return key;
}
}
}
return null;
}
public void remove(String word) {
if (this.words.containsKey(word)) {
this.words.remove(word);
} else if (this.words.containsValue(word)) {
String remove = "";
for (String key : this.words.keySet()) {
if (this.words.get(key).equals(word)) {
remove += key;
}
}
this.words.remove(remove);
}
}
}
Notice this part of your code,
try {
writer = new BufferedWriter(new FileWriter(this.file.getName(), true));
for (String key : this.words.keySet()) {
writer.write(key + ":" + this.words.get(key) + "\n");
writer.newLine();
writer.flush();
writer.close(); // !!
}
} catch (Exception e) {
}
Here, you are calling close() on the BufferedWriter object inside the loop. You cannot use the object after you have called close() on it.
Once the stream has been closed, further write() or flush() invocations will cause an IOException to be thrown.
Read more about close() here.
Also, since you are catching all the exceptions and not doing anything with them, you did not notice the IOException. In the future, never do this; at the very least, log any exception that occurs. This will help you with your debugging.
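For reference, a corrected save() along those lines might look like the following sketch. It is an assumption-based rewrite, not the poster's code: it writes to this.file directly (rather than this.file.getName()), overwrites instead of appending so repeated saves don't duplicate entries, and uses try-with-resources so the writer is closed exactly once, after the loop:
public boolean save() {
    // try-with-resources closes the writer once, after all entries are written
    try (BufferedWriter writer = new BufferedWriter(new FileWriter(this.file))) {
        for (String key : this.words.keySet()) {
            writer.write(key + ":" + this.words.get(key));
            writer.newLine();
        }
        return true;
    } catch (IOException e) {
        e.printStackTrace(); // at least log the failure instead of swallowing it
        return false;
    }
}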

Matching Keys in a HashMap

I am attempting to do the following (in pseudocode):
Generate HashMapOne, populated by results found in a DICOM file (the key was manipulated for matching purposes).
Generate a second HashMapTwo that will be read from a text document.
Compare the keys of both HashMaps; if they match, add the corresponding value from HashMapOne to a new HashMapThree.
I am getting stuck with adding the matched key's value to HashMapThree. It always populates a null value despite me declaring this a public static variable. Can anyone please tell me why this may be? Here are the code snippets below:
public class viewDICOMTags {
HashMap<String,String> dicomFile = new HashMap<String,String>();
HashMap<String,String> dicomTagList = new HashMap<String,String>();
HashMap<String,String> Result = new HashMap<String, String>();
Iterator<org.dcm4che2.data.DicomElement> iter = null;
DicomObject working;
public static DicomElement element;
DicomElement elementTwo;
public static String result;
File dicomList = new File("C:\\Users\\Ryan\\dicomTagList.txt");
public void readDICOMObject(String path) throws IOException
{
DicomInputStream din = null;
din = new DicomInputStream(new File(path));
try {
working = din.readDicomObject();
iter = working.iterator();
while (iter.hasNext())
{
element = iter.next();
result = element.toString();
String s = element.toString().substring(0, Math.min(element.toString().length(), 11));
dicomFile.put(String.valueOf(s.toString()), element.vr().toString());
}
System.out.println("Collected tags, VR Code, and Description from DICOM file....");
}
catch (IOException e)
{
e.printStackTrace();
return;
}
finally {
try {
din.close();
}
catch (IOException ignore){
}
}
readFromTextFile();
}
public void readFromTextFile() throws IOException
{
try
{
String dicomData = "DICOM";
String line = null;
BufferedReader bReader = new BufferedReader(new FileReader(dicomList));
while((line = bReader.readLine()) != null)
{
dicomTagList.put(line.toString(), dicomData);
}
System.out.println("Reading Tags from Text File....");
bReader.close();
}
catch(FileNotFoundException e)
{
System.err.print(e);
}
catch(IOException i)
{
System.err.print(i);
}
compareDICOMSets();
}
public void compareDICOMSets() throws IOException
{
for (Entry<String, String> entry : dicomFile.entrySet())
{
if(dicomTagList.containsKey(entry.getKey()))
Result.put(entry.getKey(), dicomFile.get(element.toString()));
System.out.println(dicomFile.get(element.toString()));
}
SortedSet<String> keys = new TreeSet<String>(Result.keySet());
for (String key : keys) {
String value = Result.get(key);
System.out.println(key);
}
}
}
This line of code looks very wrong
Result.put(entry.getKey(), dicomFile.get(element.toString()));
If you are trying to copy the key/value pair from HashMapOne, then this is not correct.
The value for each key added to Result will be null because you are calling dicomFile.get() with element.toString() as the lookup key, where element is the last element that was read from your file, so the lookup does not find a matching entry.
I think you should be using
Result.put(entry.getKey(), entry.getValue());
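For context, the fixed loop in compareDICOMSets() would then look something like this sketch (braces added so the println only runs for matched keys; the field names are the ones from the question):
for (Entry<String, String> entry : dicomFile.entrySet()) {
    if (dicomTagList.containsKey(entry.getKey())) {
        Result.put(entry.getKey(), entry.getValue());
        System.out.println(entry.getValue());
    }
}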
