How to read and write data in a text file like a database - Java

I'm trying to develop a small program. I want to write data to and read data from a text file as if it were a database. I have some data, e.g.:
User = abc,
Age = 12,
No = 154
I want to write that data to the text file, and afterwards I want to search the data using User. I don't know how to do that. Can anyone tell me how to do this?
BufferedWriter writer = null;
try {
    writer = new BufferedWriter(new FileWriter("./output.txt"));
    writer.write("your data here");
} catch (IOException e) {
    System.err.println(e);
} finally {
    if (writer != null) {
        try {
            writer.close();
        } catch (IOException e) {
            System.err.println(e);
        }
    }
}

May I ask why you want this? As the number of read and write requests increases, your code will hit a bottleneck: you would be performing heavy I/O operations just to fetch lightweight data, and disk I/O comes with its own concurrent-read restrictions. So I would not suggest this approach for small data. Putting heavy data such as images, videos or songs into files keyed by a unique ID is a good approach, but not this. If you really do want to do it anyway, go for a properties file, which works on keys and values. Put token-separated values and split them at consumption time.
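For illustration, a minimal sketch of the properties-file idea suggested above; the file name user-data.properties and the save/find method names are assumptions for this example, not anything from the question:

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

public class UserStore {

    // Stores one record keyed by user name, with age and no as a token-separated value.
    static void save(String user, int age, int no) throws IOException {
        Properties props = new Properties();
        File file = new File("user-data.properties");
        if (file.exists()) {
            try (FileInputStream in = new FileInputStream(file)) {
                props.load(in);              // keep any records already stored
            }
        }
        props.setProperty(user, age + "," + no);
        try (FileOutputStream out = new FileOutputStream(file)) {
            props.store(out, "user records");
        }
    }

    // Looks up a record by user name and splits the token-separated value.
    static String[] find(String user) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("user-data.properties")) {
            props.load(in);
        }
        String value = props.getProperty(user);
        return value == null ? null : value.split(",");
    }
}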

Related

How can I add to the value stored in a file?

I'm making a new game and I want to build a coin collector so that, later, the player can buy things with those coins. I'm using Eclipse.
void save() {
    try {
        PrintWriter out = new PrintWriter("coins.txt");
        out.write(Integer.toString(nmonedas));
        out.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
}
void load() {
    StringBuffer texto = new StringBuffer();
    try {
        int c;
        FileReader entrada = new FileReader("coins.txt");
        while ((c = entrada.read()) != -1) {
            texto.append((char) c);
        }
        entrada.close();
    } catch (IOException ex) {
        ex.printStackTrace();
    }
    labelshow.setText(texto.toString());
}
I have this code, but I can't add to the stored value. I need help, please.
Well, the thing is, I'm making a game in Eclipse in which you collect coins, and I want to keep them in a file.
They are collected and stored in the file perfectly, but when I start the game again I want the newly collected coins to be added to the previous total.
I assume you are referring to appending text to a .txt file. If so, you can use something like this:
Files.write(Paths.get("Path to text file here"), "Content".getBytes(), StandardOpenOption.APPEND);
I would put the above in a try-catch block. Also look into PrintWriter, as it may be more appropriate for what you need, since it allows you to continuously write to the file.
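For what you actually describe (adding new coins to the previous total rather than appending lines), here is a minimal sketch, assuming the file holds a single integer; the method name and parameter are made up for illustration:

// Reads the previous total from coins.txt (0 if the file is missing),
// adds the newly collected coins, and writes the new total back.
void saveTotal(int collectedThisRun) {
    int previous = 0;
    java.io.File file = new java.io.File("coins.txt");
    if (file.exists()) {
        try (java.util.Scanner in = new java.util.Scanner(file)) {
            if (in.hasNextInt()) {
                previous = in.nextInt();
            }
        } catch (java.io.FileNotFoundException e) {
            e.printStackTrace();
        }
    }
    try (java.io.PrintWriter out = new java.io.PrintWriter(file)) {
        out.print(previous + collectedThisRun);
    } catch (java.io.FileNotFoundException e) {
        e.printStackTrace();
    }
}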

How to insert data as fast as possible with Hibernate

I read a file, create an object from it, and store it in a PostgreSQL database. My file has 100,000 documents that I read from one file, split, and finally store in the database.
I can't create a List<> and store all the documents in it because my RAM is small. My code to read the file and write to the database is below, but my JVM heap fills up and it cannot continue storing more documents. How can I read the file and store it in the database efficiently?
public void readFile() {
    StringBuilder wholeDocument = new StringBuilder();
    try {
        bufferedReader = new BufferedReader(new FileReader(files));
        String line;
        int count = 0;
        while ((line = bufferedReader.readLine()) != null) {
            if (line.contains("<page>")) {
                wholeDocument.append(line);
                while ((line = bufferedReader.readLine()) != null) {
                    wholeDocument = wholeDocument.append("\n" + line);
                    if (line.contains("</page>")) {
                        System.out.println(count++);
                        addBodyToDatabase(wholeDocument.toString());
                        wholeDocument.setLength(0);
                        break;
                    }
                }
            }
        }
        wikiParser.commit();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            bufferedReader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

public void addBodyToDatabase(String wholeContent) {
    Page page = new Page(new Timestamp(System.currentTimeMillis()), wholeContent);
    database.addPageToDatabase(page);
}

public static int counter = 1;

public void addPageToDatabase(Page page) {
    session.save(page);
    if (counter % 3000 == 0) {
        commit();
    }
    counter++;
}
First of all you should apply a fork-join approach here.
The main task parses the file and sends batches of at most 100 items to an ExecutorService. The ExecutorService should have a number of worker threads equal to the number of available database connections. If you have 4 CPU cores, let's say the database can take 8 concurrent connections without doing too much context switching.
You should then configure a connection pooling DataSource and have a minSize equal to maxSize and equal to 8. Try HikariCP or ViburDBCP for connection pooling.
Then you need to configure JDBC batching. If you're using MySQL, the IDENTITY generator will disable batching. If you're using a database that supports sequences, make sure you also use the enhanced identifier generators (they are the default option in Hibernate 5.x).
This way the entity insert process is parallelized and decoupled from the main parsing thread. The main thread should wait for the ExecutorService to finish processing all tasks prior to shutting down.
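A rough sketch of the producer/worker split described above; the class name, Page batch type, addBatchToDatabase and the sizes are illustrative assumptions, not part of the question's code:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelLoader {

    private static final int WORKERS = 8;       // match the connection pool size
    private static final int BATCH_SIZE = 100;

    private final ExecutorService executor = Executors.newFixedThreadPool(WORKERS);
    private List<String> batch = new ArrayList<>();

    // Called by the main parsing thread for every parsed document.
    void submit(String document) {
        batch.add(document);
        if (batch.size() == BATCH_SIZE) {
            final List<String> toSave = batch;
            batch = new ArrayList<>();
            executor.execute(() -> addBatchToDatabase(toSave)); // worker persists the batch
        }
    }

    // Wait for all pending batches before the program ends.
    void finish() throws InterruptedException {
        if (!batch.isEmpty()) {
            final List<String> toSave = batch;
            executor.execute(() -> addBatchToDatabase(toSave));
        }
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.HOURS);
    }

    void addBatchToDatabase(List<String> documents) {
        // hypothetical: take a session/connection from the pool and insert the whole batch
    }
}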
Actually it is hard to make suggestions without doing real profiling to find out what's making your code slow or inefficient.
However, there are several things we can see from your code:
You are using StringBuilder inefficiently
wholeDocument.append("\n" + line); should be written as wholeDocument.append("\n").append(line); instead.
Because what you originally wrote will be translated by the compiler to
wholeDocument.append(new StringBuilder("\n").append(line).toString()). You can see how many unnecessary StringBuilders you have created :)
Considerations in using Hibernate
I am not sure how you manage your session or how you implemented your commit(); I assume you have done it right, but there are still more things to consider:
Have you properly set the batch size in Hibernate? (hibernate.jdbc.batch_size) By default, the JDBC batch size is something around 5. You may want to set it to a bigger value (so that internally Hibernate sends inserts in bigger batches).
Given that you do not need the entities in the first-level cache for later use, you may want to do an intermittent session flush() + clear() to
Trigger the batch inserts mentioned in the previous point
Clear out the first-level cache
Switch away from Hibernate for this feature.
Hibernate is cool, but it is not a panacea for everything. Given that in this feature you are just saving records into the DB based on text file content, you neither need any entity behavior nor the first-level cache for later processing, so there is not much reason to use Hibernate here given the extra processing and space overhead. Simply doing JDBC with manual batch handling is going to save you a lot of trouble.
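A minimal sketch of the batch_size plus intermittent flush()/clear() idea described above, assuming a plain Hibernate Session and a hypothetical pages list; the batch size of 50 is just an example:

// assumed configuration property: hibernate.jdbc.batch_size = 50
int batchSize = 50;
Transaction tx = session.beginTransaction();
for (int i = 0; i < pages.size(); i++) {
    session.save(pages.get(i));
    if (i > 0 && i % batchSize == 0) {
        session.flush();   // push the batched inserts to the database
        session.clear();   // evict the saved entities from the first-level cache
    }
}
tx.commit();
session.close();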
I used @RookieGuy's answer.
stackoverflow.com/questions/14581865/hibernate-commit-and-flush
I used
session.flush();
session.clear();
and finally, after reading all documents and storing them in the database,
tx.commit();
session.close();
and changed
wholeDocument = wholeDocument.append("\n" + line);
to
wholeDocument.append("\n" + line);
I'm not very sure about the structure of your data file. It would be easier to understand if you could provide a sample of your file.
The root cause of the memory consumption is the way you read/iterate the file. Once something gets read, it stays in memory. You should rather use either java.io.FileInputStream or org.apache.commons.io.FileUtils.
Here is sample code to iterate with java.io.FileInputStream:
try (
    FileInputStream inputStream = new FileInputStream("/tmp/sample.txt");
    Scanner sc = new Scanner(inputStream, "UTF-8")
) {
    while (sc.hasNextLine()) {
        String line = sc.nextLine();
        addBodyToDatabase(line);
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
Here is sample code to iterate with org.apache.commons.io.FileUtils:
File file = new File("/tmp/sample.txt");
LineIterator it = FileUtils.lineIterator(file, "UTF-8");
try {
    while (it.hasNext()) {
        String line = it.nextLine();
        addBodyToDatabase(line);
    }
} finally {
    LineIterator.closeQuietly(it);
}
You should begin a transaction, do the save operation, and then commit the transaction. (Don't begin a transaction after save!) You can try to use StatelessSession to avoid memory consumption by the cache.
And use a smaller value, for example 20, in this code:
if (counter % 20 == 0)
You can also try to pass the StringBuilder as a method argument as far as possible.
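A short sketch of the StatelessSession suggestion above; the sessionFactory and pages names are assumed, the Page entity comes from the question's code:

StatelessSession statelessSession = sessionFactory.openStatelessSession();
Transaction tx = statelessSession.beginTransaction();
try {
    for (Page page : pages) {
        statelessSession.insert(page);  // no first-level cache, no dirty checking
    }
    tx.commit();
} catch (RuntimeException e) {
    tx.rollback();
    throw e;
} finally {
    statelessSession.close();
}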

Can't read previously written JSON data (Unterminated String error) in Java

I'm using the twitter4j package for an information retrieval class and have collected some tweets. However, for the next part of the assignment, I am to use Lucene to index the tweets. In order to do this, my plan was to save the tweets as JSON strings to a file and then re-read them when needed. However, I'm running into an error.
When the file is written, I can see the entire JSON object just fine. The total object is quite large (2,500 characters). However, when reading back from the file, I get an Unterminated string at xxxx error. I am using the TwitterObjectFactory methods to both write and read the string. Here is some sample code:
Writing:
public void onStatus(Status status) {
    try {
        String jsonString = TwitterObjectFactory.getRawJSON(status);
        output.write(jsonString + "\n");
        numTweets++;
        if (numTweets > 10) {
            synchronized (lock) {
                lock.notify();
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Reading:
Scanner input = new Scanner(file);
while (input.hasNext()) {
    Status status = TwitterObjectFactory.createStatus(input.nextLine());
    System.out.println(status.getUser().getScreenName());
}
This works only some of the time. If I run the program multiple times and get many tweets, the program almost always crashes after 2-3 tweets have been read from the file, always with the same error. If you'd like to replicate the code, you can follow this example. I've added a synchronized block in order to close the stream after 10 tweets, but it's not necessary to replicate the error.
Can someone explain what is happening? My guess is that there's something wrong with the way I'm encoding the JSON into the file. I'm using BufferedWriter wrapping an OutputStreamWriter in order to output in UTF-8 format.
Edit: I do close the stream. Here's the bottom snippet of the code:
twitterStream.addListener(listener);
twitterStream.sample("en");
try {
    synchronized (lock) {
        lock.wait();
    }
} catch (InterruptedException e) {
    e.printStackTrace();
}
twitterStream.clearListeners();
twitterStream.cleanUp();
twitterStream.shutdown();
output.close();
You probably need to flush your output before you notify the reader. Otherwise parts of your string will stay in the buffer.
public void onStatus(Status status) {
    try {
        String jsonString = TwitterObjectFactory.getRawJSON(status);
        output.write(jsonString + "\n");
        output.flush();
        numTweets++;
        if (numTweets > 10) {
            synchronized (lock) {
                lock.notify();
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I don't see the code where you properly close the BufferedWriter. If you don't close it manually before the first program ends, data might remain in the internal buffer and never be written to the file.
You can also try to open the file in a text editor and look at the contents. Tools like http://codebeautify.org/jsonviewer or http://jsonlint.com/ allow you to validate/beautify the contents to see errors.
Lastly, try new BufferedReader(new InputStreamReader(new FileInputStream(file), "UTF-8")). Maybe non-ASCII characters in the input are confusing Scanner.
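Putting that last suggestion together, a sketch of the reading side with an explicit UTF-8 reader instead of Scanner; the variable names mirror the question's code, and exception handling is left as in your original:

BufferedReader reader = new BufferedReader(
        new InputStreamReader(new FileInputStream(file), "UTF-8"));
String line;
while ((line = reader.readLine()) != null) {
    // createStatus throws TwitterException; handle or declare it as in your code
    Status status = TwitterObjectFactory.createStatus(line);
    System.out.println(status.getUser().getScreenName());
}
reader.close();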

Exporting to CSV/Excel in Java

I'm trying to export data into a CSV file through Java and I've got some code to do it, but it doesn't seem to be outputting the CSV file. Could someone tell me what's wrong? What I would like, rather than saving the file somewhere, is for it to be exported directly to the user.
EDIT: Just in case it's not clear, I don't want the file to be saved anywhere; I would like it to be sent automatically to the user, i.e. they click export, get the "Run/Save results.csv" window, and open the file. Currently the file is getting saved, so I know the method works, just in the opposite way from what I want.
public static void writeToCSV(List<Map> objectList) {
    String CSV_SEPARATOR = ",";
    try {
        BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream("results.csv"), "UTF-8"));
        for (Map objectDetails : objectList) {
            StringBuffer oneLine = new StringBuffer();
            Iterator it = objectDetails.values().iterator();
            while (it.hasNext()) {
                Object value = it.next();
                if (value != null) {
                    oneLine.append(value.toString());
                }
                if (it.hasNext()) {
                    oneLine.append(CSV_SEPARATOR);
                }
            }
            bw.write(oneLine.toString());
            bw.newLine();
        }
        bw.flush();
        bw.close();
    } catch (UnsupportedEncodingException e) {
    } catch (FileNotFoundException e) {
    } catch (IOException e) {
    }
}
I would recommend using a framework like opencsv for that. It also does escaping and quoting for you.
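For example, a minimal opencsv sketch (assuming the com.opencsv artifact is on the classpath); it quotes and escapes values containing commas or quotes for you:

import com.opencsv.CSVWriter;
import java.io.FileWriter;
import java.io.IOException;

public class OpenCsvExample {
    public static void main(String[] args) throws IOException {
        try (CSVWriter writer = new CSVWriter(new FileWriter("results.csv"))) {
            writer.writeNext(new String[] { "id", "name", "comment" });
            writer.writeNext(new String[] { "1", "abc", "contains, a comma" }); // quoted automatically
        }
    }
}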
If you're not getting errors, check the working directory your program runs in. Without a specific path, your file is being saved there.
EDIT: Since the file is being saved and you want it to open automatically, use
Runtime.getRuntime().exec("cmd /c start results.csv");
(For Windows - opens the CSV file in the default application for CSV files)
Runtime.getRuntime().exec("open results.csv");
(For Mac - opens the CSV file in the default application for CSV files)
I recommend HSSFWorkbook to easily read and write Excel files.
http://poi.apache.org/apidocs/org/apache/poi/hssf/usermodel/HSSFWorkbook.html
To do that, the CSV reader would need to read your program's memory, which is a little complex to do. So save the file in a temp folder instead; there is no problem with doing this sort of thing.

How / when to delete a file in Java?

The problem is, the user clicks a button in a JSP, which exports the displayed data. So what I am doing is creating a temp file, writing the contents to it [ resultSet >> xml >> csv ], and then writing the contents to the ServletResponse. After closing the response output stream, I try to delete the file, but it returns false every time.
Code:
public static void writeFileContentToResponse(HttpServletResponse response, String fileName) throws IOException {
    ServletOutputStream responseoutputStream = response.getOutputStream();
    File file = new File(fileName);
    if (file.exists()) {
        file.deleteOnExit();
        DataInputStream dis = new DataInputStream(new FileInputStream(file));
        response.setContentType("text/csv");
        int size = (int) file.length();
        response.setContentLength(size);
        response.setHeader("Content-Disposition",
                "attachment; filename=\"" + file.getName() + "\"");
        response.setHeader("Pragma", "public");
        response.setHeader("Cache-control", "must-revalidate");
        if (size > Integer.MAX_VALUE) {
        }
        byte[] bytes = new byte[size];
        dis.read(bytes);
        FileCopyUtils.copy(bytes, responseoutputStream);
    }
    responseoutputStream.flush();
    responseoutputStream.close();
    file.delete();
}
I have used file.deleteOnExit() and file.delete(), but neither of them is working.
file.deleteOnExit() isn't going to produce the result you want here - its purpose is to delete the file when the JVM exits - if this is called from a servlet, that means the file is deleted when the server shuts down.
As for why file.delete() isn't working - all I see in this code is reading from the file and writing to the servlet's output stream - is it possible that when you wrote the data to the file you left the file's input stream open? Files won't be deleted while they're in use.
Also, even though your method throws IOException you still need to clean up things if there's an exception while accessing the file - put the file operations in a try block, and put the stream.close() into a finally block.
Don't create that file.
Write your data directly from your resultset to your CSV responseoutputStream.
That saves time, memory, diskspace and headache.
If you really need it, try using the File.createTempFile() method.
These files will be deleted when your VM stops normally, if they haven't been deleted before.
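A small sketch of the temp-file route mentioned above, in case you do keep the file; the prefix and suffix are arbitrary:

// Creates a uniquely named file in the system temp directory.
File tempFile = File.createTempFile("export-", ".csv");
tempFile.deleteOnExit();  // fallback cleanup when the JVM exits normally
try {
    // ... write the CSV content and stream it to the response ...
} finally {
    if (!tempFile.delete()) {
        // deletion can fail if a stream on the file is still open
    }
}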
I'm assuming you have some sort of concurrency issue going on here. Consider making this method non-static, and use a unique name for your temp file (for example, append the current time, or use a GUID for the filename). Chances are that you're opening the file, then someone else opens it, so the first delete fails.
As I see it, you are not closing the DataInputStream dis - this results in the false status when you want to delete the file. Also, you should handle the streams in a try-catch-finally block and close them within finally. The code is a bit rough, but it is safe:
DataInputStream dis = null;
try {
    dis = new DataInputStream(new FileInputStream(file));
    ... // your other code
} catch (FileNotFoundException P_ex) {
    // catch only Exceptions you want, react to them
} finally {
    if (dis != null) {
        try {
            dis.close();
        } catch (IOException P_ex) {
            // handle exception, again react only to exceptions that must be reacted on
        }
    }
}
How are you creating the file? You probably need to use createTempFile.
You should be able to delete a temporary file just fine (no need for deleteOnExit). Are you sure the file isn't in use when you are trying to delete it? You should have one file per user request (that is another reason you should avoid temp files and store everything in memory).
You can try a piped input and piped output stream. Those buffers need two threads: one to feed the pipe (the exporter) and the other (the servlet) to consume data from the pipe and write it to the response output stream.
You really don't want to create a temporary file for a request. Keep the resulting CSV in memory if at all possible.
You may need to tie the writing of the file in directly with the output. So parse a row of the result set, write it out to response stream, parse the next row and so on. That way you only keep one row in memory at a time. Problem there is that the response could time out.
If you want a shortcut method, take a look at Display tag library. It makes it very easy to show a bunch of results in a table and then add pre-built export options to said table. CSV is one of those options.
You don't need a temporary file. The byte buffer which you're creating there based on the file size may also cause an OutOfMemoryError. It's all plainly inefficient.
Just write the data of the ResultSet immediately to the HTTP response while iterating over the rows. Basically: writer.write(resultSet.getString("columnname")). This way you don't need to write it to a temporary file or gobble everything into Java's memory.
Further, most JDBC drivers will by default cache everything in Java's memory before giving anything to ResultSet#next(). This is also inefficient. You'd like it to give the data immediately, row by row, by setting Statement#setFetchSize(). How to do this properly depends on the JDBC driver used. In the case of, for example, MySQL, you can read up on it in its JDBC driver documentation.
Here's a kickoff example, assuming that you're using MySQL:
protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/csv");
    response.setCharacterEncoding("UTF-8");
    Connection connection = null;
    Statement statement = null;
    ResultSet resultSet = null;
    PrintWriter writer = response.getWriter();

    try {
        connection = database.getConnection();
        statement = connection.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        statement.setFetchSize(Integer.MIN_VALUE);
        resultSet = statement.executeQuery("SELECT col1, col2, col3 FROM tbl");

        while (resultSet.next()) {
            writer.append(resultSet.getString("col1")).append(',');
            writer.append(resultSet.getString("col2")).append(',');
            writer.append(resultSet.getString("col3")).println();
            // Note: don't forget to escape quotes/commas as per RFC 4180.
        }
    } catch (SQLException e) {
        throw new ServletException("Retrieving CSV rows from DB failed", e);
    } finally {
        if (resultSet != null) try { resultSet.close(); } catch (SQLException logOrIgnore) {}
        if (statement != null) try { statement.close(); } catch (SQLException logOrIgnore) {}
        if (connection != null) try { connection.close(); } catch (SQLException logOrIgnore) {}
    }
}
That's it. This way, effectively only one database row is kept in memory at any time.
