I am trying to take a very long file of strings and convert it to XML according to a schema I was given. I used JAXB to create classes from that schema. Since the file is very large I created a thread pool to improve performance, but since then each thread only processes one line of the file and marshals it to the XML file.
Below is my home class, where I read from the file. Each line is a record of a transaction; for every new user encountered, a list is made to store all of that user's transactions, and each list is put into a HashMap. I made it a ConcurrentHashMap because multiple threads will work on the map simultaneously; is this the correct thing to do?
After the lists are created, a thread is made for each user. Each thread runs the method ProcessCommands below and receives from home the list of transactions for its user.
public class home{
public static File XMLFile = new File("LogFile.xml");
Map<String,List<String>> UserMap= new ConcurrentHashMap<String,List<String>>();
String[] UserNames = new String[5000];
int numberOfUsers = 0;
try{
BufferedReader reader = new BufferedReader(new FileReader("test.txt"));
String line;
while ((line = reader.readLine()) != null)
{
String[] parsed = line.split(",|\\s+");
if(!parsed[2].equals("./testLOG")){
if(Utilities.checkUserExists(parsed[2], UserNames) == false){ //User does not already exist
System.out.println("New User: " + parsed[2]);
UserMap.put(parsed[2],new ArrayList<String>()); //Create list of transactions for new user
UserMap.get(parsed[2]).add(line); //Add First Item to new list
UserNames[numberOfUsers] = parsed[2]; //Add new user
numberOfUsers++;
}
else{ //User Already Existed
UserMap.get(parsed[2]).add(line);
}
}
}
reader.close();
} catch (IOException x) {
System.err.println(x);
}
//get start time
long startTime = new Date().getTime();
int tCount = numberOfUsers;
ExecutorService threadPool = Executors.newFixedThreadPool(tCount);
for(int i = 0; i < numberOfUsers; i++){
System.out.println("Starting Thread " + i + " for user " + UserNames[i]);
Runnable worker = new ProcessCommands(UserMap.get(UserNames[i]), UserNames[i], XMLFile);
threadPool.execute(worker);
}
threadPool.shutdown();
while(!threadPool.isTerminated()){
}
System.out.println("Finished all threads");
}
Here is the ProcessCommands class. The thread receives the list for its user and creates a marshaller. From what I understand, marshalling is not thread safe, so it is best to create one marshaller per thread; is this the best way to do that?
When I create the marshallers, I know that each one (from each thread) will want to access the created file, causing conflicts, so I used synchronized; is that correct?
As the thread iterates through its list, each line triggers a certain case. There are a lot of them, so I just made pseudo-cases for clarity. Each case calls the function below.
public class ProcessCommands implements Runnable{
private static final boolean DEBUG = false;
private List<String> list = null;
private String threadName;
private File XMLfile = null;
public Thread myThread;
public ProcessCommands(List<String> list, String threadName, File XMLfile){
this.list = list;
this.threadName = threadName;
this.XMLfile = XMLfile;
}
public void run(){
Date start = null;
int transactionNumber = 0;
String[] parsed = new String[8];
String[] quoteParsed = null;
String[] universalFormatCommand = new String[9];
String userCommand = null;
Connection connection = null;
Statement stmt = null;
Map<String, UserObject> usersMap = null;
Map<String, Stack<BLO>> buyMap = null;
Map<String, Stack<SLO>> sellMap = null;
Map<String, QLO> stockCodeMap = null;
Map<String, BTO> buyTriggerMap = null;
Map<String, STO> sellTriggerMap = null;
Map<String, USO> usersStocksMap = null;
String SQL = null;
int amountToAdd = 0;
int tempDollars = 0;
UserObject tempUO = null;
BLO tempBLO = null;
SLO tempSLO = null;
Stack<BLO> tempStBLO = null;
Stack<SLO> tempStSLO = null;
BTO tempBTO = null;
STO tempSTO = null;
USO tempUSO = null;
QLO tempQLO = null;
String stockCode = null;
String quoteResponse = null;
int usersDollars = 0;
int dollarAmountToBuy = 0;
int dollarAmountToSell = 0;
int numberOfSharesToBuy = 0;
int numberOfSharesToSell = 0;
int quoteStockInDollars = 0;
int shares = 0;
Iterator<String> itr = null;
int transactionCount = list.size();
System.out.println("Starting "+threadName+" - listSize = "+transactionCount);
//UO dollars, reserved
usersMap = new HashMap<String, UserObject>(3); //userName -> UO
//USO shares
usersStocksMap = new HashMap<String, USO>(); //userName+stockCode -> shares
//BLO code, timestamp, dollarAmountToBuy, stockPriceInDollars
buyMap = new HashMap<String, Stack<BLO>>(); //userName -> Stack<BLO>
//SLO code, timestamp, dollarAmountToSell, stockPriceInDollars
sellMap = new HashMap<String, Stack<SLO>>(); //userName -> Stack<SLO>
//BTO code, timestamp, dollarAmountToBuy, stockPriceInDollars
buyTriggerMap = new ConcurrentHashMap<String, BTO>(); //userName+stockCode -> BTO
//STO code, timestamp, dollarAmountToBuy, stockPriceInDollars
sellTriggerMap = new HashMap<String, STO>(); //userName+stockCode -> STO
//QLO timestamp, stockPriceInDollars
stockCodeMap = new HashMap<String, QLO>(); //stockCode -> QLO
//create user object and initialize stacks
usersMap.put(threadName, new UserObject(0, 0));
buyMap.put(threadName, new Stack<BLO>());
sellMap.put(threadName, new Stack<SLO>());
try {
//Marshaller marshaller = getMarshaller();
synchronized (this){
Marshaller marshaller = init.jc.createMarshaller();
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
marshaller.setProperty(Marshaller.JAXB_FRAGMENT, true);
marshaller.marshal(LogServer.Root,XMLfile);
marshaller.marshal(LogServer.Root,System.out);
}
} catch (JAXBException M) {
M.printStackTrace();
}
Date timing = new Date();
//universalFormatCommand = new String[8];
parsed = new String[8];
//iterate through workload file
itr = this.list.iterator();
while(itr.hasNext()){
userCommand = (String) itr.next();
itr.remove();
parsed = userCommand.split(",|\\s+");
transactionNumber = Integer.parseInt(parsed[0].replaceAll("\\[", "").replaceAll("\\]", ""));
universalFormatCommand = Utilities.FormatCommand(parsed, parsed[0]);
if(transactionNumber % 100 == 0){
System.out.println(this.threadName + " - " +transactionNumber+ " - "+(new Date().getTime() - timing.getTime())/1000);
}
/*System.out.print("UserCommand " +transactionNumber + ": ");
for(int i = 0;i<8;i++)System.out.print(universalFormatCommand[i]+ " ");
System.out.print("\n");*/
//switch for user command
switch (parsed[1].toLowerCase()) {
case "One"
*Do Stuff"
LogServer.create_Log(universalFormatCommand, transactionNumber, CommandType.ADD);
break;
case "Two"
*Do Stuff"
LogServer.create_Log(universalFormatCommand, transactionNumber, CommandType.ADD);
break;
}
}
}
The function create_Log has multiple cases so, as before, I left just one for clarity. The case "QUOTE" only calls one object-creation function, but the other cases can create multiple objects. The type 'log' is a complex XML type that defines all the other object types, so in each call to create_Log I use a log object called Root. The class 'log' generated by JAXB includes a function to create a list of objects. The statement:
Root.getUserCommandOrQuoteServerOrAccountTransaction().add(quote_QuoteType);
takes the Root element I created, creates a list, and adds the newly created object 'quote_QuoteType' to that list. Before I added threading, this method successfully created a list of as many objects as I wanted and then marshalled them, so I'm pretty positive the code in class 'LogServer' is not the issue. It is something to do with the marshalling and synchronization in the ProcessCommands class above.
public class LogServer{
public static log Root = new log();
public static QuoteServerType Log_Quote(String[] input, int TransactionNumber){
ObjectFactory factory = new ObjectFactory();
QuoteServerType quoteCall = factory.createQuoteServerType();
// ...populate the QuoteServerType object called quoteCall...
return quoteCall;
}
public static void create_Log(String[] input, int TransactionNumber, CommandType Command){
System.out.print("TRANSACTION "+TransactionNumber + " is " + Command + ": ");
for(int i = 0; i<input.length;i++) System.out.print(input[i] + " ");
System.out.print("\n");
switch(input[1]){
case "QUOTE":
System.out.print("QUOTE CASE");
QuoteServerType quote_QuoteType = Log_Quote(input,TransactionNumber);
Root.getUserCommandOrQuoteServerOrAccountTransaction().add(quote_QuoteType);
break;
}
}
So you wrote a lot of code, but have you tried whether it actually works? After a quick look I doubt it. You should test your code logic part by part, not go all the way to the end. It seems you are just starting with Java. I would recommend practicing first on simple single-threaded applications. Sorry if I sound harsh, but I will try to be constructive as well:
Per convention, class names start with a capital letter and variables with a lowercase one; you do it the other way around.
You should make a method in your home (Home) class, not put all your code in the static block.
You are reading the whole file into memory; you do not process it line by line. After home is initialized, literally the whole content of the file will be under the UserMap variable. If the file is really large you will run out of heap memory. If you assume a large file then you cannot do it this way and you have to redesign your app to store partial results somewhere. If your file is smaller than memory you could keep it like that (but you said it is large).
There is no need for UserNames; UserMap.containsKey will do the job.
Your thread pool size should be in the range of your number of cores, not the number of users, or you will get thread thrashing (if you have blocking operations in your code, make tCount = 2 * processors; if not, keep it at the number of processors). Once one ProcessCommands finishes, the executor will start another one until you have finished them all, and you will be using all your processor cores efficiently.
DO NOT while(!threadPool.isTerminated()); this line will completely consume one processor because it checks constantly. Call awaitTermination instead (see the sketch after this list).
Your ProcessCommands has a few map variables which will only ever hold one entry each because, as you said, each thread processes data from one user.
The synchronized(this) in ProcessCommands will not work, as each thread will synchronize on a different object (a different instance of ProcessCommands); synchronize on a shared lock object instead (also shown in the sketch below).
I believe creating a marshaller is thread safe (check it), so there may be no need for synchronization at all.
You save your log (whatever it is) before you have done the actual processing of the transaction lists.
The marshalling will overwrite the content of the file with the current state of LogServer.Root. If it is shared between your ProcessCommands instances (it seems so), what is the point of saving it in each thread? Do it once you are finished.
You don't need itr.remove().
The log class (for the Root variable!) needs to be thread-safe, as all the threads will call operations on it (so the list inside the log class must be a concurrent list, etc.).
And so on...
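A minimal sketch of the pool sizing, awaitTermination, and shared-lock points above. Home and ProcessCommands are from the question; the FILE_LOCK object and the main method are assumptions added for illustration:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class Home {
    // one lock shared by ALL workers; synchronized(this) inside ProcessCommands
    // locks a different object per instance, so it protects nothing
    public static final Object FILE_LOCK = new Object();

    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService threadPool = Executors.newFixedThreadPool(cores);
        // ... submit one ProcessCommands per user, as in the question ...
        threadPool.shutdown();
        // blocks until the workers finish, without burning a core on polling
        threadPool.awaitTermination(1, TimeUnit.HOURS);
        System.out.println("Finished all threads");
    }
}
Each worker would then wrap its marshalling in synchronized (Home.FILE_LOCK) { ... }, and the list inside the shared log root would still need to be a synchronized or concurrent list, as noted above.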
I would recommend the following:
Start with a simple single-threaded version that actually works.
Deal with processing line by line (store the results for each user in a different file; you can keep a cache with transactions for recently used users so as not to keep writing to the disk all the time, see the Guava cache).
Process each user's transactions into your user log objects multithreaded (again, if there are a lot of them, you have to save them to disk rather than keep them all in memory).
Write code that combines the logs from different users into one (again, you may want to do it multithreaded), though it will be mostly IO operations, so there is not much to gain and it is more tricky to do.
Good luck
I have an application which reads and writes a number of files. The aim is to prevent a specific file from being read or written while it is being written by another thread. I do not want to lock the reading and writing of all files while a single file is being written, as this causes unnecessary locking.
To try and achieve this I am using a ConcurrentHashMap in conjunction with a synchronized block, but if there is a better solution I am open to it.
Here is the rough code.
private static final ConcurrentMap<String, String> lockMap = new ConcurrentHashMap<>();
private void createCache(String templatePath, String cachePath){
//get template
String temp = getTemplate(templatePath);
String myRand = randomString();
lockMap.put(cachePath,myRand);
// save cache file
try {
// ** is lockMap.get(cachePath) still threadsafe if another thread has changed the row's value?
synchronized ( lockMap.get(cachePath) ){
Files.write(Paths.get(cachePath),temp.getBytes(StandardCharsets.UTF_8));
}
} finally {
// remove lock if not locked by another thread in the meantime
lockMap.remove(cachePath, myRand);
}
}
private String getCache(String cachePath){
String output = null;
//only lock if this specific file is being written at the moment
if ( lockMap.containsKey(cachePath) ){
synchronized ( lockMap.get(cachePath) ){
output = getFile(cachePath);
}
} else {
output = getFile(cachePath);
}
return output;
}
// main event
private String cacheToString (String templatePath, String cachePath){
File cache = new File(cachePath);
if ( !cache.exists() ){
createCache(templatePath, cachePath);
}
return getCache(cachePath);
}
The problem I have is that although the thread will only remove the lock for the requested file if it is unchanged by another thread, it is still possible for another thread to update the value in the lockMap for this entry. If that happens, will the synchronization fail?
I would write a new temporary file each time and rename it when finished; a rename within the same file system can be performed atomically.
// a unique counter across restarts
final AtomicLong counter = new AtomicLong(System.currentTimeMillis()*1000);
private void createCache(String templatePath, String cachePath) throws IOException {
//get template
String temp = getTemplate(templatePath);
Path path = Paths.get(cachePath);
Path tmpPath = Paths.get(path.getParent().toString(), counter.getAndIncrement() + ".tmp");
// save cache file
Files.write(tmpPath, temp.getBytes(StandardCharsets.UTF_8));
// requires: import static java.nio.file.StandardCopyOption.*;
Files.move(tmpPath, path, ATOMIC_MOVE, REPLACE_EXISTING);
}
If multiple threads try to write to the same file, the last one to perform a move wins.
I'm reading file paths from a collection:
Collection<String> FileList = new ArrayList<>();
This collection can contain more than 600,000 file paths, but with my current method it takes up to a few hours to create the text file with all the information.
Every XML file contains a list of "item" elements, each of which can have a "value" tag with the attribute is_special="true". In that case, the name of the "item" should be stored. The result looks like:
C:\bar\foo\archive\T16-0B07186E3B194D2341256D2F003FF1FE.xml
C:\bar\foo\archive\C1257FBF0040265C-1\T26-75A218AFA1FC460B41256D9C00406708.xml
C:\bar\foo\archive\C1257FBF0040265C-1\T26-75A218AFA1FC460B41256D9C99406708.xml
Itemname:CreationDate
Itemname:PublishingDate
Itemname:ValidThruDate
Itemname:ArchiveDate
Itemname:ReleaseDate
Itemname:EraseDate
Current function:
public void FullFilterAndExport() throws JAXBException, IOException {
totalFilesCount = 0;
totalFilesCountPositive = 0;
PrintWriter pWriter = new PrintWriter(new BufferedWriter(new FileWriter(DB_Path.toString() + "\\export_full.txt")));
for(String file: FileList) {
if (file.endsWith(".xml") && !file.contains("databaseinfo.xml")) {
totalFilesCount = totalFilesCount +1;
ItemList.clear();
JAXBContext context = JAXBContext.newInstance(NotesDocumentMetaFile.class);
Unmarshaller um = context.createUnmarshaller();
NotesDocumentMetaFile docMetaFile = (NotesDocumentMetaFile) um.unmarshal(new FileReader(file));
for(int i = 0; i < docMetaFile.getItems().size(); i++) {
if(docMetaFile.getItems().get(i).getValueIsSpecial() == true) {
ItemList.add("Itemname:" + docMetaFile.getItems().get(i).getName());
}
}
if(!ItemList.isEmpty()) {
totalFilesCountPositive = totalFilesCountPositive + 1;
pWriter.println(file);
pWriter.println();
for(String item : ItemList) {
pWriter.println(item);
}
pWriter.println();
}
}
}
pWriter.println();
pWriter.println("------------------");
pWriter.println("Anzahl der geprüften Dateien: " + totalFilesCount);
pWriter.println("Anzahl der geprüften positiven Dateien: " + totalFilesCountPositive);
if (pWriter != null){
pWriter.flush();
pWriter.close();
}
}
Is there any chance to improve the performance?
Profile (using jvisualvm, included in the Oracle JDK) and inspect a CPU sampling snapshot.
A culprit might be JAXB. If that is the case, try any streaming XML reader. The code will be uglier, but it should be faster. Re-test / re-profile to check what is taking CPU time.
You might want to de-correlate reading the XML files from writing the output text file, using for example a BlockingDeque that contains the results of reading the XML files. This queue would be fed by several threads reading XML in parallel and consumed by a single writing thread, in order to take advantage of all the cores of your CPU (see the sketch below).
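A rough sketch of that producer/consumer split; parseOneFile is a placeholder for the unmarshal-and-filter step from the question, and the queue capacity is an arbitrary illustration value:
import java.io.PrintWriter;
import java.util.List;
import java.util.concurrent.*;

public class ParallelExport {
    private static final String POISON = "\u0000EOF"; // sentinel that stops the writer

    public static void export(List<String> fileList, PrintWriter out)
            throws InterruptedException {
        BlockingQueue<String> results = new LinkedBlockingQueue<>(10000);
        ExecutorService parsers = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());
        // a single writer thread, so the output file is never written concurrently
        Thread writer = new Thread(() -> {
            try {
                String block;
                while (!(block = results.take()).equals(POISON)) {
                    out.print(block);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        writer.start();
        for (String file : fileList) {
            parsers.submit(() -> {
                String block = parseOneFile(file); // unmarshal + filter, as in the question
                if (!block.isEmpty()) {
                    try {
                        results.put(block); // blocks if the writer falls behind
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            });
        }
        parsers.shutdown();
        parsers.awaitTermination(1, TimeUnit.HOURS);
        results.put(POISON);
        writer.join();
    }

    private static String parseOneFile(String file) {
        return ""; // placeholder: return the "path + item names" block for one file
    }
}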
EDIT:
As a quick win, I think this code:
JAXBContext context = JAXBContext.newInstance(NotesDocumentMetaFile.class);
Unmarshaller um = context.createUnmarshaller();
can be moved outside the for loop. It should give you a good boost. The context is thread safe, while the unmarshaller is not but can be reused for several files.
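In other words, roughly this (a sketch; reusing the Unmarshaller sequentially is fine here because only one thread touches it):
// create the context once; it is thread safe and expensive to build
JAXBContext context = JAXBContext.newInstance(NotesDocumentMetaFile.class);
// the unmarshaller is not thread safe, but one thread may reuse it for many files
Unmarshaller um = context.createUnmarshaller();
for (String file : FileList) {
    if (file.endsWith(".xml") && !file.contains("databaseinfo.xml")) {
        NotesDocumentMetaFile docMetaFile =
                (NotesDocumentMetaFile) um.unmarshal(new File(file));
        // ... filter the items and write them to the PrintWriter as before ...
    }
}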
I have a csv file named abc.csv which contains data as follows:
All,Friday,0:00,315.06,327.92,347.24
All,Friday,1:00,316.03,347.73,370.55
and so on .....
I wish to import the data into Redis. How can I do this through the Java API?
Please suggest the steps.
I wish to run the jar and have the data imported into the Redis db.
Any help on mass insertion would also be appreciated in case the Java option is not possible.
You can do it using Jedis (https://github.com/xetorthio/jedis), a Java client for Redis. Just create a class with a main method, create a connection, and set the keys.
void processData() throws IOException {
List<String> lines = Files.readAllLines(
Paths.get("\\Path\\abc.csv"), Charset.forName("UTF-8"));
Jedis connection = new Jedis("host", port);
Pipeline p = connection.pipelined();
for(String line: lines){
String key = getKey(line);
String value = getValue(line);
p.set(key,value);
}
p.sync();
}
If the file is big you can create an input stream from it and read line by line instead of loading the whole file. You should then also call p.sync() in batches: just keep a counter and take a modulo with the batch size, as in the sketch below.
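A minimal sketch of that, reusing the hypothetical getKey/getValue helpers from above and a batch size of 1000:
try (BufferedReader reader = Files.newBufferedReader(
        Paths.get("\\Path\\abc.csv"), StandardCharsets.UTF_8);
     Jedis connection = new Jedis("host", port)) {
    Pipeline p = connection.pipelined();
    int count = 0;
    String line;
    while ((line = reader.readLine()) != null) {
        p.set(getKey(line), getValue(line));
        if (++count % 1000 == 0) {
            p.sync(); // flush every 1000 commands instead of buffering everything
        }
    }
    p.sync(); // flush the remainder
}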
Redisson allows you to organize multiple commands into one batch. This is called Redis pipelining. Here is an example:
List<String> lines = ...
RBatch batch = redisson.createBatch();
RList<String> list = batch.getList("yourList");
for(String line: lines){
String key = getKey(line);
batch.getBucket(key).set(line);
}
// send to Redis as single command
batch.execute();
You can use the Jedis lib. Here I have extended my Java object to provide a unique key and a hash to set in Redis. In your case, you can have a list of parsed CSV values and keys sent directly as a list.
private static final int BATCH_SIZE = 1000;
public void setRedisHashInBatch(List<? extends RedisHashMaker> Obj) {
try (Jedis jedis = jedisPool.getResource()) {
log.info("Connection IP -{}", jedis.getClient().getSocket().getInetAddress());
Pipeline p = jedis.pipelined();
log.info("Setting Records in Cache. Size: " + Obj.size());
int j = 0;
for (RedisHashMaker obj : Obj) {
p.hmset(obj.getUniqueKey(), obj.getRedisHashStringFromObject());
j++;
if (j == BATCH_SIZE) {
p.sync();
j = 0;
}
}
p.sync();
}
}
I am trying to use JAXB to marshal objects I create to XML. What I want is to create a list, print it to the file, then create a new list and print it to the same file, but every time I do, it overwrites the first one. I want the final XML file to look as if I had just one big list of objects. I would do exactly that, but there are so many objects that I quickly max out my heap size.
So, my main creates a bunch of threads, each of which iterates through a list of objects it receives and calls create_Log on each object. Once it is finished, it calls printToFile, which is where it marshals the list to the file.
public class LogThread implements Runnable {
//private Thread myThread;
private Log_Message message = null;
private LinkedList<Log_Message> lmList = null;
LogServer Log = null;
private String Username = null;
public LogThread(LinkedList<Log_Message> lmList){
this.lmList = lmList;
}
public void run(){
//System.out.println("thread running");
LogServer Log = new LogServer();
//create iterator for list
final ListIterator<Log_Message> listIterator = lmList.listIterator();
while(listIterator.hasNext()){
message = listIterator.next();
CountTrans.addTransNumber(message.TransactionNumber);
Username = message.input[2];
Log.create_Log(message.input, message.TransactionNumber, message.Message, message.CMD);
}
Log.printToFile();
init_LogServer.threadCount--;
init_LogServer.doneList();
init_LogServer.doneUser();
System.out.println("Thread "+ Thread.currentThread().getId() +" Completed user: "+ Username+"... Number of Users Complete: " + init_LogServer.getUsersComplete());
//Thread.interrupt();
}
}
The above calls the function create_Log below to build new objects of the types I generated from the XSD I was given (SystemEventType, QuoteServerType, etc.). These objects are all added to an ArrayList using the function below and attached to the Root object. Once the LogThread loop is finished, it calls printToFile, which takes the list from the Root object and marshals it to the file... overwriting what was already there. How can I add to the same file without overwriting it and without creating one master list in the heap?
public class LogServer {
public log Root = null;
public static String fileName = "LogFile.xml";
public static File XMLfile = new File(fileName);
public LogServer(){
this.Root = new log();
}
//output LogFile.xml
public synchronized void printToFile(){
System.out.println("Printing XML");
//write to xml file
try {
init_LogServer.marshaller.marshal(Root,XMLfile);
} catch (JAXBException e) {
e.printStackTrace();
}
System.out.println("Done Printing XML");
}
private BigDecimal ConvertStringtoBD(String input){
DecimalFormatSymbols symbols = new DecimalFormatSymbols();
symbols.setGroupingSeparator(',');
symbols.setDecimalSeparator('.');
String pattern = "#,##0.0#";
DecimalFormat decimalFormat = new DecimalFormat(pattern, symbols);
decimalFormat.setParseBigDecimal(true);
// parse the string
BigDecimal bigDecimal = new BigDecimal("0");
try {
bigDecimal = (BigDecimal) decimalFormat.parse(input);
} catch (ParseException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return bigDecimal;
}
public QuoteServerType Log_Quote(String[] input, int TransactionNumber){
BigDecimal quote = ConvertStringtoBD(input[4]);
BigInteger TransNumber = BigInteger.valueOf(TransactionNumber);
BigInteger ServerTimeStamp = new BigInteger(input[6]);
Date date = new Date();
long timestamp = date.getTime();
ObjectFactory factory = new ObjectFactory();
QuoteServerType quoteCall = factory.createQuoteServerType();
quoteCall.setTimestamp(timestamp);
quoteCall.setServer(input[8]);
quoteCall.setTransactionNum(TransNumber);
quoteCall.setPrice(quote);
quoteCall.setStockSymbol(input[3]);
quoteCall.setUsername(input[2]);
quoteCall.setQuoteServerTime(ServerTimeStamp);
quoteCall.setCryptokey(input[7]);
return quoteCall;
}
public SystemEventType Log_SystemEvent(String[] input, int TransactionNumber, CommandType CMD){
BigInteger TransNumber = BigInteger.valueOf(TransactionNumber);
Date date = new Date();
long timestamp = date.getTime();
ObjectFactory factory = new ObjectFactory();
SystemEventType SysEvent = factory.createSystemEventType();
SysEvent.setTimestamp(timestamp);
SysEvent.setServer(input[8]);
SysEvent.setTransactionNum(TransNumber);
SysEvent.setCommand(CMD);
SysEvent.setFilename(fileName);
return SysEvent;
}
public void create_Log(String[] input, int TransactionNumber, String Message, CommandType Command){
switch(Command.toString()){
case "QUOTE": //Quote_Log
QuoteServerType quote_QuoteType = Log_Quote(input,TransactionNumber);
Root.getUserCommandOrQuoteServerOrAccountTransaction().add(quote_QuoteType);
break;
case "QUOTE_CACHED":
SystemEventType Quote_Cached_SysType = Log_SystemEvent(input, TransactionNumber, CommandType.QUOTE);
Root.getUserCommandOrQuoteServerOrAccountTransaction().add(Quote_Cached_SysType);
break;
}
}
EDIT: The code below shows how the objects are added to the ArrayList.
public List<Object> getUserCommandOrQuoteServerOrAccountTransaction() {
if (userCommandOrQuoteServerOrAccountTransaction == null) {
userCommandOrQuoteServerOrAccountTransaction = new ArrayList<Object>();
}
return this.userCommandOrQuoteServerOrAccountTransaction;
}
JAXB is about mapping a Java object tree to an XML document, or vice versa. So in principle, you need the complete object model before you can save it to XML.
Of course that would not be possible for very large data, for example a DB dump, so JAXB allows marshalling an object tree in fragments, letting the user control the moment of object creation and marshalling. A typical use case would be fetching records from a DB one by one and marshalling them one by one to a file, so there would be no problem with the heap.
However, you are asking about appending one object tree to another (one fresh in memory, the second already represented in an XML file). That is not normally possible, as it is not really appending but creating a new object tree that contains the content of both (there is only one document root element, not two).
So what you could do is:
create a new xml representation with a manually initiated root element,
copy the existing xml content into the new xml, either using XMLStreamWriter/XMLStreamReader read/write operations or by unmarshalling the log objects and marshalling them one by one,
marshal your log objects into the same xml stream,
complete the xml with the root closing element.
Vaguely, something like this:
XMLStreamWriter writer = XMLOutputFactory.newInstance().createXMLStreamWriter(new FileOutputStream(...), StandardCharsets.UTF_8.name());
//"mannually" output the beginign of the xml document == its declaration and the root element
writer.writeStartDocument();
writer.writeStartElement("YOUR_ROOT_ELM");
Marshaller mar = ...
mar.setProperty(Marshaller.JAXB_FRAGMENT, true); //instructs jaxb to output only objects not the whole xml document
PartialUnmarshaler existing = ...; // allows reading the xml content of the existing file one object at a time
while (existing.hasNext()) {
YourObject obj = existing.next();
mar.marshal(obj, writer);
writer.flush();
}
List<YourObject> toAppend = ...
for (YourObject obj : toAppend) {
mar.marshal(obj,writer);
writer.flush();
}
//finishing the document, closing the root element
writer.writeEndElement();
writer.writeEndDocument();
Reading the objects one by one from a large xml file, and a complete implementation of PartialUnmarshaler, is described in this answer:
https://stackoverflow.com/a/9260039/4483840
That is the 'elegant' solution.
Less elegant is to have your threads write their log lists to individual files and then append them yourself. You only need to read and copy the header of the first file, then copy all of its content apart from the last closing tag, copy the content of the other files while ignoring their document opening and closing tags, and finally output the closing tag.
If your marshaller is set to marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
each opening/closing tag will be on its own line, so the ugly hack is to
copy all the lines from the 3rd to the one before last, then output the closing tag.
It is an ugly hack, because it is sensitive to your output format (if you, for example, change your container root element), but it is faster to implement than the full JAXB solution.
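A rough sketch of that line-based merge, assuming each thread produced one formatted fragment file and <log> is a placeholder for your real root element:
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class LogMerger {
    // merges per-thread fragment files into one document; assumes FORMATTED_OUTPUT,
    // i.e. the declaration and the root tags each sit on their own line
    public static void mergeLogs(List<Path> fragments, Path merged) throws IOException {
        try (PrintWriter out = new PrintWriter(
                Files.newBufferedWriter(merged, StandardCharsets.UTF_8))) {
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<log>"); // placeholder root element name
            for (Path fragment : fragments) {
                List<String> lines = Files.readAllLines(fragment, StandardCharsets.UTF_8);
                // copy from the 3rd line to the one before last, skipping each
                // fragment's declaration, root opening tag and root closing tag
                for (int i = 2; i < lines.size() - 1; i++) {
                    out.println(lines.get(i));
                }
            }
            out.println("</log>");
        }
    }
}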
I am getting a Java heap space error while writing large data from a database to an Excel sheet.
I don't want to use the JVM -Xmx option to increase memory.
Following are the details:
1) I am using the org.apache.poi.hssf API for Excel sheet writing.
2) JDK version 1.5
3) Tomcat 6.0
The code I have written works well for around 23 thousand records, but it fails for more than 23K records.
Following is the code:
ArrayList l_objAllTBMList= new ArrayList();
l_objAllTBMList = (ArrayList) m_objFreqCvrgDAO.fetchAllTBMUsers(p_strUserTerritoryId);
ArrayList l_objDocList = new ArrayList();
m_objTotalDocDtlsInDVL= new HashMap();
Object l_objTBMRecord[] = null;
Object l_objVstdDocRecord[] = null;
int l_intDocLstSize=0;
VisitedDoctorsVO l_objVisitedDoctorsVO=null;
int l_tbmListSize=l_objAllTBMList.size();
System.out.println(" getMissedDocDtlsList_NSM ");
for(int i=0; i<l_tbmListSize;i++)
{
l_objTBMRecord = (Object[]) l_objAllTBMList.get(i);
l_objDocList = (ArrayList) m_objGenerateVisitdDocsReportDAO.fetchAllDocDtlsInDVL_NSM((String) l_objTBMRecord[1], p_divCode, (String) l_objTBMRecord[2], p_startDt, p_endDt, p_planType, p_LMSValue, p_CycleId, p_finYrId);
l_intDocLstSize=l_objDocList.size();
try {
l_objVOFactoryForDoctors = new VOFactory(l_intDocLstSize, VisitedDoctorsVO.class);
/* Factory class written to create and maintain limited no of Value Objects (VOs)*/
} catch (ClassNotFoundException ex) {
m_objLogger.debug("DEBUG:getMissedDocDtlsList_NSM :Exception:"+ex);
} catch (InstantiationException ex) {
m_objLogger.debug("DEBUG:getMissedDocDtlsList_NSM :Exception:"+ex);
} catch (IllegalAccessException ex) {
m_objLogger.debug("DEBUG:getMissedDocDtlsList_NSM :Exception:"+ex);
}
for(int j=0; j<l_intDocLstSize;j++)
{
l_objVstdDocRecord = (Object[]) l_objDocList.get(j);
l_objVisitedDoctorsVO = (VisitedDoctorsVO) l_objVOFactoryForDoctors.getVo();
if (((String) l_objVstdDocRecord[6]).equalsIgnoreCase("-"))
{
if (String.valueOf(l_objVstdDocRecord[2]) != "null")
{
l_objVisitedDoctorsVO.setPotential_score(String.valueOf(l_objVstdDocRecord[2]));
l_objVisitedDoctorsVO.setEmpcode((String) l_objTBMRecord[1]);
l_objVisitedDoctorsVO.setEmpname((String) l_objTBMRecord[0]);
l_objVisitedDoctorsVO.setDoctorid((String) l_objVstdDocRecord[1]);
l_objVisitedDoctorsVO.setDr_name((String) l_objVstdDocRecord[4] + " " + (String) l_objVstdDocRecord[5]);
l_objVisitedDoctorsVO.setDoctor_potential((String) l_objVstdDocRecord[3]);
l_objVisitedDoctorsVO.setSpeciality((String) l_objVstdDocRecord[7]);
l_objVisitedDoctorsVO.setActualpractice((String) l_objVstdDocRecord[8]);
l_objVisitedDoctorsVO.setLastmet("-");
l_objVisitedDoctorsVO.setPreviousmet("-");
m_objTotalDocDtlsInDVL.put((String) l_objVstdDocRecord[1], l_objVisitedDoctorsVO);
}
}
}// End of While
writeExcelSheet(); // Pasting this method at the end
// Clean up code
l_objVOFactoryForDoctors.resetFactory();
m_objTotalDocDtlsInDVL.clear();// Clear the used map
l_objDocList=null;
l_objTBMRecord=null;
l_objVstdDocRecord=null;
}// End of While
l_objAllTBMList=null;
m_objTotalDocDtlsInDVL=null;
-------------------------------------------------------------------
private void writeExcelSheet() throws IOException
{
HSSFRow l_objRow = null;
HSSFCell l_objCell = null;
VisitedDoctorsVO l_objVisitedDoctorsVO = null;
Iterator l_itrDocMap = m_objTotalDocDtlsInDVL.keySet().iterator();
while (l_itrDocMap.hasNext())
{
Object key = l_itrDocMap.next();
l_objVisitedDoctorsVO = (VisitedDoctorsVO) m_objTotalDocDtlsInDVL.get(key);
l_objRow = m_objSheet.createRow(m_iRowCount++);
l_objCell = l_objRow.createCell(0);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(String.valueOf(l_intSrNo++));
l_objCell = l_objRow.createCell(1);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getEmpname() + " (" + l_objVisitedDoctorsVO.getEmpcode() + ")"); // TBM Name
l_objCell = l_objRow.createCell(2);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getDr_name());// Doc Name
l_objCell = l_objRow.createCell(3);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getPotential_score());// Freq potential score
l_objCell = l_objRow.createCell(4);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getDoctor_potential());// Freq potential score
l_objCell = l_objRow.createCell(5);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getSpeciality());//CP_GP_SPL
l_objCell = l_objRow.createCell(6);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getActualpractice());// Actual practise
l_objCell = l_objRow.createCell(7);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getPreviousmet());// Previousmet
l_objCell = l_objRow.createCell(8);
l_objCell.setCellStyle(m_objCellStyle4);
l_objCell.setCellValue(l_objVisitedDoctorsVO.getLastmet());// Lastmet
}
// Write OutPut Stream
try {
out = new FileOutputStream(m_objFile);
outBf = new BufferedOutputStream(out);
m_objWorkBook.write(outBf);
} catch (Exception ioe) {
ioe.printStackTrace();
System.out.println(" Exception in chunk write");
} finally {
if (outBf != null) {
outBf.flush();
outBf.close();
out.close();
l_objRow=null;
l_objCell=null;
}
}
}
Instead of populating the complete list in memory before starting to write to Excel, you need to modify the code so that each object is written to the file as it is read from the database. Take a look at this question to get some idea of the other approach.
Well, I'm not sure whether POI can handle incremental updates, but if so you might want to write chunks of, say, 10000 rows to the file. If not, you might have to use CSV instead (so no formatting) or increase memory.
The problem is that you need to make the objects already written to the file eligible for garbage collection (no references from a live thread anymore) before writing the file is finished (before all rows have been generated and written to the file).
Edit:
If you can write smaller chunks of data to the file, you would also have to load only the necessary chunks from the DB. It doesn't make sense to load 50000 records at once and then try to write 5 chunks of 10000, since those 50000 records are likely to consume a lot of memory already.
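A sketch of that chunked shape, where fetchChunk and writeChunk are hypothetical stand-ins for the DAO call and the row-writing code from the question:
int chunkSize = 10000;
int offset = 0;
while (true) {
    // hypothetical DAO call: fetch only one page of records from the DB
    List<Object[]> chunk = dao.fetchChunk(offset, chunkSize);
    if (chunk.isEmpty()) {
        break;
    }
    writeChunk(chunk); // hypothetical: append just these rows to the output file
    offset += chunk.size();
    // 'chunk' goes out of scope here, so its records become eligible for GC
}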
As Thomas points out, you have too many objects taking up too much space and need a way to reduce that. There are a couple of strategies for this I can think of:
Do you need to create a new factory each time in the loop, or can you reuse it?
Can you start with a loop that gets the information you need into a new structure, and then discard the old one?
Can you split the processing into a chain of threads, each sending information forward to the next step, avoiding building a large memory-consuming structure at all?