The code given below takes two different inputs, but I want to pass only a single input, the path of the folder "test", and keep the rest of the functionality as it is.
Also, the final.tbl that is generated should be created in the same input folder path:
public class Migrator {
private static final String KEY1 = "post_tran_id";
private static final String KEY2 = "post_tran_cust_id";
void migrate(String post_tran, String post_tran_cust) throws IOException {
Map<String, Map<String, String>> h1 = loadFile(post_tran, KEY1);
Map<String, Map<String, String>> h2 = loadFile(post_tran_cust, KEY2);
PrintStream out = new PrintStream("final.tbl");
for (Map.Entry<String, Map<String, String>> entry : h1.entrySet()) {
Map<String, String> data = entry.getValue();
String k = data.get(KEY2);
if (k != null && h2.containsKey(k)) {
print(out, KEY1, data.get(KEY1));
print(out, KEY2, data.get(KEY2));
// Print remaining rows in any order
for (String key : data.keySet()) {
if ( ! key.equals(KEY1) && ! key.equals(KEY2) ) {
print(out, key, data.get(key));
}
}
data = h2.get(k);
for (String key : data.keySet()) {
if ( ! key.equals(KEY2) ) {
print(out, key, data.get(key));
}
}
out.println(); // Record separator
}
}
}
private void print(PrintStream out, String key, String data) {
out.print("[name]");
out.print(key);
out.print("[/name]");
out.print("=");
out.print("[data]");
out.print(data);
out.print("[/data]");
out.println();
}
private Map<String, Map<String, String>> loadFile(String fileName, String key) throws IOException {
Map<String, Map<String, String>> result = new HashMap<String, Map<String, String>>();
BufferedReader br = new BufferedReader(new FileReader(fileName));
String line;
do {
Map<String, String> data = new HashMap<String, String>();
while ((line = br.readLine()) != null && !line.isEmpty()) {
data.put(getKey(line), getData(line));
}
result.put(data.get(key), data);
} while (line != null);
br.close();
return result;
}
private String getKey(String line) {
String[] tokens = line.split("=");
int length = tokens[0].length();
return tokens[0].substring(6, length - 7);
}
private String getData(String line) {
String[] tokens = line.split("=");
int length = tokens[1].length();
return tokens[1].substring(6, length - 7);
}
public static void main(String[] args) throws IOException {
Migrator mg = new Migrator();
mg.migrate("D:\\test\\post_tran.tbl", "D:\\test\\post_tran_cust.tbl");
}
}
To make your migrate method take one argument but still work with multiple paths, you can append all the paths into one string and parse them inside the migrate method.
Example:
String appendedArgument = "D:\\test\\post_tran.tbl;D:\\test\\post_tran_cust.tbl";
Notice the semi-colon separating both paths.
Then you can call your method:
mg.migrate(appendedArgument);
And parse it on the other side:
void migrate(String argument) throws IOException
{
String[] splitArgument = argument.split(";");
String post_tran = splitArgument[0];
String post_tran_cust = splitArgument[1];
Map<String, Map<String, String>> h1 = loadFile(post_tran, KEY1);
Map<String, Map<String, String>> h2 = loadFile(post_tran_cust, KEY2);
}
Using this kind of method you can send as many paths into your migrate method as you want; in this particular case, it also enables you to send the path where you want to store the final.tbl file.
That would make the appendedArgument string look like:
String appendedArgument = "D:\\test\\;D:\\test\\post_tran.tbl;D:\\test\\post_tran_cust.tbl";
And then you would need to parse it accordingly inside the migrate method.
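A minimal sketch of what that parsing could look like, assuming the output directory comes first as in the string above; the rest of the method stays as in the original:
void migrate(String argument) throws IOException {
    String[] splitArgument = argument.split(";");
    String outputDir = splitArgument[0];
    String post_tran = splitArgument[1];
    String post_tran_cust = splitArgument[2];
    Map<String, Map<String, String>> h1 = loadFile(post_tran, KEY1);
    Map<String, Map<String, String>> h2 = loadFile(post_tran_cust, KEY2);
    // Write final.tbl into the requested folder instead of the working directory
    PrintStream out = new PrintStream(new File(outputDir, "final.tbl"));
    // ... the rest of the original migrate logic is unchanged ...
}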
Issue adding a list from one hashmap's value to another's
Basically, I have 2 hashmaps (map1 and map2); both have the same keys (Integers from 0-500) but different values. What I want to do is use the value of map1, which is a String, as the key, and the value of map2, which is a List, as the value. Adding map1's value as the key works, no problem, but when I try to add map2's value as the map's value, it just returns null.
This is for a homework project, where we are given 2 .csv files, one with labels and another with fake image file names, and have to be able to search by either image label or image file name.
Map<String, List<String>> map = new HashMap<String, List<String>>();
@SuppressWarnings({ "resource", "null", "unlikely-arg-type" })
public ImageLabelReader(String labelMappingFile, String imageMappingFile) throws IOException {
Map<Integer, String> map1 = new HashMap<Integer, String>();
Map<Integer, List<String>> map2 = new HashMap<Integer, List<String>>();
BufferedReader labelIn = new BufferedReader(new FileReader(labelMappingFile));
BufferedReader imageIn = new BufferedReader(new FileReader(imageMappingFile));
String row;
String[] rowArray;
while ((row = labelIn.readLine()) != null) {
rowArray = row.split(" ", 2);
map1.put(Integer.parseInt(rowArray[0]), rowArray[1]);
}
labelIn.close();
while ((row = imageIn.readLine()) != null) {
rowArray = row.split(" ", 2);
if(map2.containsKey(Integer.parseInt(rowArray[1]))) {
List<String> tempList = map2.get(Integer.parseInt(rowArray[1]));
tempList.add(rowArray[0]);
} else {
List<String> l = new ArrayList<String>();
l.add(rowArray[0]);
map2.put(Integer.parseInt(rowArray[1]), l);
}
}
imageIn.close();
List<String> t = new ArrayList<String>();
for(int i = 0; i < map1.size(); i++) {
t.clear();
for(String s : map2.get(i)) {
t.add(s);
System.out.println(t);
}
map.put(map1.get(i), map2.get(i));
}
System.out.println(map.containsKey("burrito"));
System.out.print(map2.get("burrito"));
}
Output is "True null" when the output should be "True [list containing strings]"
Try replacing -
map.put(map1.get(i), map2.get(i));
with
map.put(map1.get(i), t);
And also -
System.out.print(map2.get("burrito"));
with
System.out.print(map.get("burrito"));
Also, you're trying to get map2's value using a String key while you said its keys are Integers; please check that.
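For reference, a minimal sketch of how the combined map could be built and queried, assuming map1 and map2 are populated as in the question:
// map1's String value becomes the key; the List from map2 under the
// same Integer key becomes the value.
Map<String, List<String>> map = new HashMap<>();
for (Map.Entry<Integer, String> labelEntry : map1.entrySet()) {
    List<String> images = map2.get(labelEntry.getKey());
    if (images != null) {
        map.put(labelEntry.getValue(), images);
    }
}
// Look up by label (a String) on the combined map, not on map2 (whose keys are Integers)
System.out.println(map.containsKey("burrito"));
System.out.println(map.get("burrito"));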
I use a HashMap to store data (certificate details) which is read from a file.
The key and value are stored in the HashMap, but after calling the put method, ALL values have the value of the last added entry.
I guess it is also related to
hashmap.get() returning wrong values even though they are all correct in the map
but I don't see my error:
HashMap<String, String[]> certDataMap = new HashMap<String, String[]>();
String line="";
String bankName = "", validTill = "", fingerPrint = "";
File certDat = new File(certDataFile);
int cntEntries=0;
String[] data = {"dummy", "dummy"};
if (certDat.exists()) {
try {
Scanner scanner = new Scanner(certDat);
while (scanner.hasNextLine()) {
line=scanner.nextLine();
bankName=line.split("\\|")[0];
validTill=line.split("\\|")[1];
fingerPrint=line.split("\\|")[2];
logger.debug("line: {} bankName: {} validTill: {} fingerPrint: {}",line, bankName, validTill, fingerPrint);
data[0]=validTill;
data[1]=fingerPrint;
certDataMap.put(bankName, data);
debugCertMap();
cntEntries++;
}
scanner.close();
logger.debug("{} read from {}", cntEntries, certDataFile);
} catch (IOException e) {
logger.error(certDataFile,e);
}
} else
logger.error(certDataFile+" not found! New file will be created if certificates were downloaded");
The problem was the declaration of string array data outside the loop as mentioned by Jonathan:
while (scanner.hasNextLine()) {
line=scanner.nextLine();
bankName=line.split("\\|")[0];
validTill=line.split("\\|")[1];
fingerPrint=line.split("\\|")[2];
logger.debug("line: {} bankName: {} validTill: {} fingerPrint: {}",line, bankName, validTill, fingerPrint);
String[] data = {validTill, fingerPrint};
certDataMap.put(bankName, data);
debugCertMap();
cntEntries++;
An object variable is actually a reference, and you are using the same data object for every line. Use a new object.
Yes, you use the same object String[] data = {"dummy", "dummy"};, where data is the reference to the array.
But look at your code: all of this could be done much more simply, avoiding these problems.
Create a data holder class that represents a single line from the file:
public static final class Data {
private final String bankName;
private final String validTill;
private final String fingerPrint;
public Data(String[] line) {
bankName = line[0];
validTill = line[1];
fingerPrint = line[2];
}
// Getter required by the Data::getBankName method reference below
public String getBankName() {
return bankName;
}
}
And provide a method that accepts a Path and retrieves the file content in the required format:
public static Map<String, Data> read(Path path) throws IOException {
return Files.lines(path)
.map(line -> new Data(line.split("\\|")))
.collect(Collectors.toMap(Data::getBankName, Function.identity()));
}
That's all!
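A hypothetical caller could then look like this (the file path is just an assumption for illustration; read throws IOException, and Paths comes from java.nio.file):
// Build the map once and look entries up by bank name
Map<String, Data> certDataMap = read(Paths.get("certData.txt"));
System.out.println(certDataMap.keySet()); // all bank names read from the file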
I have to generate execution plans for each query extracted from a log file. I have to generate a HashMap that contains the query as the key and the execution plan as the value. This HashMap is to be displayed as a table in Angular.
So far I have tried the following. I am getting the table with the query and the execution plan as two columns. But the explain plan is displayed in a single line, instead of the table form I get when I print it out on the console (see the picture). Is there a way to get the same format while displaying it in the Angular table?
The console prints it in this form, which is what I want in the table as well.
CODE:
ExecutionPlanService: Connects to the Oracle DB and generates Explain Plan.
@Service
public class ExecutionPlanService {
List<String> l = new ArrayList<>();
String execute;
public List<String> getExecPlan(String line) throws IOException, SQLException, ClassNotFoundException{
ResultSet rs=null;
Class.forName("oracle.jdbc.driver.OracleDriver");
Connection con=DriverManager.getConnection("jdbc:oracle:thin:@10.49.7.212:1521:AVMPRD","CtmDitRW","CtmDit1743");
Statement stmt = con.createStatement();
stmt.execute(line);
rs = stmt.executeQuery("select plan_table_output from table(dbms_xplan.display())");
while (rs.next())
{
l.add(rs.getString(1));
}
return l;
}
public String prepend(String line, String prepend){
String output = null;
for (int index = 0; index < line.length(); index++) {
output = prepend + line;
}
return output;
}
}
ExecPlanService: Checks whether the given line is a SQL statement, generates the explain plan, and puts both into a HashMap as key and value.
private SQLCheckerService checker = new SQLCheckerService();
private ExecutionPlanService service = new ExecutionPlanService();
List<String>l = new ArrayList<String>();
Map<String, List<String>> map = new HashMap<>();
String trimmedLine;
public Map<String, List<String>> processLine(String line) throws FileNotFoundException, UnsupportedEncodingException, ClassNotFoundException, IOException, SQLException {
if(this.checker.isSelectStatement(line))
{
System.out.println("working!!");
//Trim the line to start with a SQL keyword
trimmedLine= this.checker.trimString(line);
String prepend = this.service.prepend(trimmedLine, "Explain plan for ");
l =this.service.getExecPlan(prepend);
l.forEach(System.out::println);
map.put(trimmedLine,l );
System.out.println(map);
}
return map;
}
ReadFileExecPlanService: Reads a log file line by line and processes it to generate explain plan.
public Map<String, List<String>> readFile(String filename) throws IOException, ClassNotFoundException, SQLException{
ExecPlanService eservice = new ExecPlanService();
Map<String, List<String>> hashMap = new HashMap<>();
try (BufferedReader br = new BufferedReader(new FileReader(filename))){
for(String line; (line = br.readLine()) != null; ){
hashMap = this.eservice.processLine(line);
}
}
System.out.println(hashMap);
return hashMap;
}
ExecPlanController
@RestController
public class ExecPlanController {
@Autowired
ReadFileExecPlan exservice;
Map<String, List<String>> map = new HashMap<>();
@GetMapping("/getPlan")
@ResponseBody
public Map<String, List<String>> getPlan() throws IOException, ClassNotFoundException, SQLException{
String rootLocation = "C:\\Users\\Apoorva_Sharma\\Desktop\\upload-dir";
File directory = new File(rootLocation);
//get all the files from a directory
File[] fList = directory.listFiles();
for (File file : fList){
if (file.isFile()){
map = (exservice.readFile(file.getAbsolutePath()));
}
}
System.out.println("WORKING!!");
System.out.println(map);
return map;
}
}
I have a String array which contains some records. Now I have to put those records in a file, read the values back, and check the records against the String array values. Here is my String array:
public final static String fields[] = { "FileID", "FileName", "EventType",
"recordType", "accessPointNameNI", "apnSelectionMode",
"causeForRecClosing", "chChSelectionMode",
"chargingCharacteristics", "chargingID", "duration",
"dynamicAddressFlag", "iPBinV4AddressGgsn",
"datavolumeFBCDownlink", "datavolumeFBCUplink",
"qoSInformationNeg"};
I have to put these records in a map using this:
static LinkedHashMap<String, String> getMetaData1() {
LinkedHashMap<String, String> md = new LinkedHashMap<>();
for (String fieldName : fields) md.put(fieldName, "");
return md;
}
Now my file is:
FileID
FileName
EventType
recordType
accessPointNameNI
apnSelectionMode
causeForRecClosing
chChSelectionMode
chargingCharacteristics
chargingID
duration
dynamicAddressFlag
iPBinV4AddressGgsn
datavolumeFBCDownlink
datavolumeFBCUplink
qoSInformationNeg
Now I am reading this file with this function:
static LinkedHashMap<String, String> getMetaData() {
LinkedHashMap<String, String> md = new LinkedHashMap<>();
BufferedReader br = null;
try {
String sCurrentLine;
String file[];
br = new BufferedReader(new FileReader("./file/HuaGPRSConf"));
while ((sCurrentLine = br.readLine()) != null) {
md.put(sCurrentLine, "");
}
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
if (br != null)
br.close();
} catch (IOException ex) {
ex.printStackTrace();
}
}
return md;
}
Now those two functions are returning values in two different ways. The String array version gives:
{FileID=, FileName=, EventType=, recordType=, accessPointNameNI=, apnSelectionMode=, causeForRecClosing=, chChSelectionMode=, chargingCharacteristics=, chargingID=, duration=, dynamicAddressFlag=, iPBinV4AddressGgsn=, datavolumeFBCDownlink=, datavolumeFBCUplink=, qoSInformationNeg=}
But the version that reads the values from the file gives values with big spaces:
{ FileID =, FileName=, EventType=, recordType=, accessPointNameNI=, apnSelectionMode=, causeForRecClosing=, chChSelectionMode=, chargingCharacteristics=, chargingID=, duration=, dynamicAddressFlag=, iPBinV4AddressGgsn=, datavolumeFBCDownlink=, datavolumeFBCUplink=, qoSInformationNeg=, rATType=, ratingGroup=, resultCode=, serviceConditionChange=, iPBinV4Address=, sgsnPLMNIdentifier=, timeOfFirstUsage=, timeOfLastUsage=, timeOfReport=, timeUsage=, changeCondition=, changeTime=,.... so on
Now when I try to compare the two maps using this code, they are not equal:
LinkedHashMap<String, String> md1=getMetaData();
LinkedHashMap<String, String> md2=getMetaData1();
if(md1.equals(md2)){
System.out.println(md1);
}else{
System.out.println("Not");
}
I cannot understand the problem. Can anyone help?
You should use sCurrentLine.trim() to remove unnecessary whitespace.
I suggest checking the data before comparing. If you find extra spaces, first apply trim() to remove them and then compare.
You are checking whether 2 different instances of LinkedHashMap are equal, and they are not.
You have to use the get method of LinkedHashMap to compare values.
Also, you should remove extra spaces with String's trim method.
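A minimal sketch of the read loop with trimming applied, assuming the same file handling as in the question:
while ((sCurrentLine = br.readLine()) != null) {
    String key = sCurrentLine.trim(); // strip leading/trailing whitespace before using it as a key
    if (!key.isEmpty()) {
        md.put(key, "");
    }
}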
I've been working on this code for quite some time and just want a simple heads-up if I'm heading down a dead end. The point I'm at now is to match identical cells from different .csv files and copy one row into another CSV file. The question really is: would it be possible to write at specific lines? For example, if the two cells match at row 50, I wish to write back onto row 50. I'm assuming that I would maybe extract everything to a HashMap, write it in there, and then write back to the .csv file? Is there an easier way?
For example, I have one CSV that has person details, and the other has property details of where the actual person lives. I wish to copy the property details to the person CSV, as well as match them up with the correct person details. Hope this makes sense.
public class Old {
public static void main(String [] args) throws IOException
{
List<String[]> cols;
List<String[]> cols1;
int row =0;
int count= 0;
boolean b;
CsvMapReader Reader = new CsvMapReader(new FileReader("file1.csv"), CsvPreference.EXCEL_PREFERENCE);
CsvMapReader Reader2 = new CsvMapReader(new FileReader("file2.csv"), CsvPreference.EXCEL_PREFERENCE);
try {
cols = readFile("file1.csv");
cols1 = readFile("fiel2.csv");
String [] headers = Reader.getCSVHeader(true);
headers = header(cols1, headers);
} catch (IOException e) {
e.printStackTrace();
return;
}
for (int j =1; j<cols.size();j++) //1
{
for (int i=1;i<cols1.size();i++){
if (cols.get(j)[0].equals(cols1.get(i)[0]))
{
}
}
}
}
private static List<String[]> readFile(String fileName) throws IOException
{
List<String[]> values = new ArrayList<String[]>();
Scanner s = new Scanner(new File(fileName));
while (s.hasNextLine()) {
String line = s.nextLine();
values.add(line.split(","));
}
return values;
}
public static void csvWriter (String fileName, String [] nameMapping ) throws FileNotFoundException
{
ICsvListWriter writer = new CsvListWriter(new PrintWriter(fileName),CsvPreference.STANDARD_PREFERENCE);
try {
writer.writeHeader(nameMapping);
} catch (IOException e) {
e.printStackTrace();
}
}
public static String[] header(List<String[]> cols1, String[] headers){
List<String> list = new ArrayList<String>();
String [] add;
int count= 0;
for (int i=0;i<headers.length;i++){
list.add(headers[i]);
}
boolean c;
c= true;
while(c) {
add = cols1.get(0);
list.add(add[count]);
if (cols1.get(0)[count].equals(null)) // this line is never reached - error
{
c=false;
break;
} else
count ++;
}
String[] array = new String[list.size()];
list.toArray(array);
return array;
}
Just be careful if you read all of the addresses and person details into memory first (as Thomas has suggested) - if you're only dealing with small CSV files then it's fine, but you may run out of memory if you're dealing with larger files.
As an alternative, I've put together an example that reads the addresses in first, then writes the combined person/address details while it reads in the person details.
Just a few things to note:
I've used CsvMapReader and CsvMapWriter because you were using them - this meant I had to use a Map containing a Map for storing the addresses. Using CsvBeanReader/CsvBeanWriter would make this a bit more elegant.
The code from your question doesn't actually use Super CSV to read the CSV (you're using Scanner and String.split()). You'll run into issues if your CSV contains commas in the data (which is quite possible with addresses), so it's a lot safer to use Super CSV, which will handle escaped commas for you.
Example:
package example;
import java.io.StringReader;
import java.io.StringWriter;
import java.util.HashMap;
import java.util.Map;
import org.supercsv.io.CsvMapReader;
import org.supercsv.io.CsvMapWriter;
import org.supercsv.io.ICsvMapReader;
import org.supercsv.io.ICsvMapWriter;
import org.supercsv.prefs.CsvPreference;
public class CombiningPersonAndAddress {
private static final String PERSON_CSV = "id,firstName,lastName\n"
+ "1,philip,fry\n2,amy,wong\n3,hubert,farnsworth";
private static final String ADDRESS_CSV = "personId,address,country\n"
+ "1,address 1,USA\n2,address 2,UK\n3,address 3,AUS";
private static final String[] COMBINED_HEADER = new String[] { "id",
"firstName", "lastName", "address", "country" };
public static void main(String[] args) throws Exception {
ICsvMapReader personReader = null;
ICsvMapReader addressReader = null;
ICsvMapWriter combinedWriter = null;
final StringWriter output = new StringWriter();
try {
// set up the readers/writer
personReader = new CsvMapReader(new StringReader(PERSON_CSV),
CsvPreference.STANDARD_PREFERENCE);
addressReader = new CsvMapReader(new StringReader(ADDRESS_CSV),
CsvPreference.STANDARD_PREFERENCE);
combinedWriter = new CsvMapWriter(output,
CsvPreference.STANDARD_PREFERENCE);
// map of personId -> address (inner map is address details)
final Map<String, Map<String, String>> addresses =
new HashMap<String, Map<String, String>>();
// read in all of the addresses
Map<String, String> address;
final String[] addressHeader = addressReader.getCSVHeader(true);
while ((address = addressReader.read(addressHeader)) != null) {
final String personId = address.get("personId");
addresses.put(personId, address);
}
// write the header
combinedWriter.writeHeader(COMBINED_HEADER);
// read each person
Map<String, String> person;
final String[] personHeader = personReader.getCSVHeader(true);
while ((person = personReader.read(personHeader)) != null) {
// copy address details to person if they exist
final String personId = person.get("id");
final Map<String, String> personAddress = addresses.get(personId);
if (personAddress != null) {
person.putAll(personAddress);
}
// write the combined details
combinedWriter.write(person, COMBINED_HEADER);
}
} finally {
personReader.close();
addressReader.close();
combinedWriter.close();
}
// print the output
System.out.println(output);
}
}
Output:
id,firstName,lastName,address,country
1,philip,fry,address 1,USA
2,amy,wong,address 2,UK
3,hubert,farnsworth,address 3,AUS
From your comment, it seems like you have the following situation:
File 1 contains persons
File 2 contains addresses
You then want to match persons and addresses by some key (one or more fields) and write the combination back to a CSV file.
Thus the simplest approach might be something like this:
//use a LinkedHashMap to preserve the order of the persons as found in file 1
Map<PersonKey, String[]> persons = new LinkedHashMap<>();
//fill in the persons from file 1 here
Map<PersonKey, String[]> addresses = new HashMap<>();
//fill in the addresses from file 2 here
List<String[]> outputLines = new ArrayList<>(persons.size());
for( Map.Entry<PersonKey, String[]> personEntry: persons.entrySet() ) {
String[] person = personEntry.getValue();
String[] address = addresses.get( personEntry.getKey() );
//merge the two arrays and put them into outputLines
}
//write outputLines to a file
Note that PersonKey might just be a String or a wrapper object (Integer etc.) if you can match persons and addresses by one field. If you have more fields, you might need a custom PersonKey object with equals() and hashCode() properly overridden, as sketched below.
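A minimal sketch of such a key class, assuming, purely for illustration, that persons and addresses are matched on last name and postcode:
public final class PersonKey {
    private final String lastName;
    private final String postcode; // the matching fields are an assumption for this example

    public PersonKey(String lastName, String postcode) {
        this.lastName = lastName;
        this.postcode = postcode;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PersonKey)) return false;
        PersonKey other = (PersonKey) o;
        return lastName.equals(other.lastName) && postcode.equals(other.postcode);
    }

    @Override
    public int hashCode() {
        return 31 * lastName.hashCode() + postcode.hashCode();
    }
}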