Storing objects (document files) into SQL using JDBC - Java

I have a JSF form where the user uploads data. Now I need to store the uploaded file (which could be PDF, DOC, XLS, etc.) in the database. Of course, this file will be loaded later by reviewers. Any idea what the best way to do this is? Serialization? And if yes, is there anywhere specific I should look?
I've seen that most serialization examples deal with user-created objects (ones I am familiar with) like employee objects, buildings, etc. But I don't know how to represent the file document as an object, as in what to pass to the input reader to get the binary string...
Thanks,

There's no need for serialization at all here - a file is (simplistically) just a name and the data within it. So you just need a name field of type VARCHAR (or whatever) and a data field of type IMAGE, or whatever type your database uses for blobs.
You can use PreparedStatement.setBlob (or potentially PreparedStatement.setBinaryStream; I'll readily admit I'm not confident in the differences) to upload the data.

To insert a binary document into a database you need a table with a column of type BLOB (binary large object).
A skeleton to write:
File f = new File("your.doc");
FileInputStream fis = new FileInputStream(f);
PreparedStatement pstmt = dbCon.prepareStatement("insert into docs(doc) values(?)");
pstmt.setBinaryStream(1, fis, (int) f.length());
pstmt.executeUpdate();
pstmt.close();
to read from the DB:
Statement st = dbCon.createStatement();
ResultSet rs = st.executeQuery("select doc from docs where ..." );
if (rs.next()) {
    InputStream is = rs.getBinaryStream("doc");
    FileOutputStream fos = new FileOutputStream("filename");
    // copy is to fos
}
rs.close();
st.close();
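The "// copy is to fos" step above is just a byte-shuffling loop. A minimal sketch of such a copy helper (the class and method names are mine, not from the answer), shown here against in-memory streams so it runs without a database:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copies everything from in to out; returns the number of bytes copied.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-ins for rs.getBinaryStream("doc") and the FileOutputStream:
        byte[] data = "hello blob".getBytes();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), out);
        System.out.println(copied + " bytes copied");
    }
}
```

The same loop works unchanged with the `InputStream` returned by `rs.getBinaryStream` and a `FileOutputStream`.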

Related

Export a writable sqlite database

Is there a way to export a writable database? I need to use the database I have exported from the phone in a Visual Studio app as part of the system. It's bad, I know, but it's the only way I could do my project. This is the code I use to export the database, from the SQLiteImporterExporter library:
public void exportDataBase(String path) {
    InputStream myInput = null;
    OutputStream myOutput = null;
    try {
        String inFileName = DB_PATH + DB_NAME;
        myInput = new FileInputStream(inFileName);
        String outFileName = path + DB_NAME;
        myOutput = new FileOutputStream(outFileName);
        byte[] buffer = new byte[1024];
        int length;
        while ((length = myInput.read(buffer)) > 0) {
            myOutput.write(buffer, 0, length);
        }
        if (exportListener != null)
            exportListener.onSuccess("Successfully Exported");
    } catch (Exception e) {
        if (exportListener != null)
            exportListener.onFailure(e);
    } finally {
        try {
            // Close the streams (guard against nulls in case opening them failed)
            if (myOutput != null) {
                myOutput.flush();
                myOutput.close();
            }
            if (myInput != null)
                myInput.close();
        } catch (Exception ex) {
            if (exportListener != null)
                exportListener.onFailure(ex);
        }
    }
}
Is there a way to export a writable database?
From your code, assuming that it's working, that's exactly what you are doing. It is also assumed that you are ensuring that the database is not in use elsewhere and that it has been closed.
That is, an SQLite database is a single file; copy that file and you can then use that database anywhere you have SQLite. The writable/readable concept (i.e. getWritableDatabase vs getReadableDatabase) is one of the more common misconceptions. The latter will in most cases get a writable database; it will only return a read-only database if the database cannot be written to, e.g. because of a full disk. As per:
Create and/or open a database. This will be the same object returned
by getWritableDatabase() unless some problem, such as a full disk,
requires the database to be opened read-only. In that case, a
read-only database object will be returned. If the problem is fixed, a
future call to getWritableDatabase() may succeed, in which case the
read-only database object will be closed and the read/write object
will be returned in the future.
getReadableDatabase
Exceptions/considerations would be:
- If the database is encrypted, in which case you would use the encryption code/extension along with the key.
- If you use in-memory/temporary tables AND you need them, you could create a copy of the database with the in-memory/temporary tables changed to permanent ones.
- SQLite does have a backup API, which does exactly this. However, I don't believe the SQLite API in Android implements the backup API (so it could well be a bigger pain to try utilising it than implementing the copy method).
- Being on Android, there will be an additional table in the database named android_metadata; it simply stores the locale and can be ignored elsewhere.

How to save an image to SQL Server with Java when the image is in a package of my project

I am creating an application where the user can add profiles of different people, including a profile picture. However, I want to support the case where there is NO picture to upload. In that case I want to save a default image that is in a package of my project.
Here is part of the code that I use to save a picture using FileChooser; it should be almost the same, I think.
String F = TxtRutaFoto.getText();
FileInputStream fis = null;
try {
    File file = new File(F);
    fis = new FileInputStream(file);
    int k = JOptionPane.showConfirmDialog(null, "DESEA GUARDAR LOS DATOS DEL JUGADOR?",
            "PREGUNTA", JOptionPane.YES_NO_OPTION);
    if (k == JOptionPane.YES_OPTION) {
        try (CallableStatement pstm = conexion.conectar.getConnection()
                .prepareCall("{call INSERTARJUGADOR(?,?,?,?,?,?,?)}")) {
            pstm.setString(1, txtID.getText());
            pstm.setString(2, txt_ape_pat.getText());
            pstm.setString(3, txt_ape_mat.getText());
            pstm.setString(4, txt_nombre.getText());
            pstm.setString(5, txt_correo.getText());
            pstm.setString(6, txt_direc.getText());
            pstm.setBinaryStream(7, fis, (int) file.length());
            ResultSet r = pstm.executeQuery();
You need to get the InputStream of the picture in your jar. You can do that with the ClassLoader#getResource method:
URL loc = getClass().getClassLoader().getResource("path/to/resource.jpg");
InputStream is = loc.openStream();
In the default case you can pass that stream instead of the FileInputStream. You should check loc for null in case the file can't be found (sometimes you have to leave the leading slash off).
Problem solved. I used this and it works. First I created a 'resources' folder containing my default picture. Then I just used the following code:
File file = new File(getClass().getResource("/resources/user.jpg").getFile());
FileInputStream fis = new FileInputStream(file);
Then I just call my procedure:
pstm.setBinaryStream(7, fis);
ResultSet r = pstm.executeQuery();
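As an aside, `new File(getResource(...).getFile())` stops working once the application is packed into a jar, because the resource is then an entry inside the jar rather than a plain file. A sketch of the stream-based alternative, whose result can be handed directly to `setBinaryStream` (the `/resources/user.jpg` path is the question's; the demo resource is just one guaranteed to exist in any JRE):

```java
import java.io.InputStream;

public class ResourceStreamDemo {
    // Opens a classpath resource as a stream; returns null if it is missing.
    public static InputStream open(String path) {
        return ResourceStreamDemo.class.getResourceAsStream(path);
    }

    public static void main(String[] args) {
        // The question's default image would be opened as open("/resources/user.jpg")
        // and passed straight to pstm.setBinaryStream(7, is).
        // Demonstrated here with a .class resource that every JRE can resolve:
        InputStream is = Object.class.getResourceAsStream("Object.class");
        System.out.println(is != null ? "stream opened" : "not found");
    }
}
```

Note that `setBinaryStream(int, InputStream)` without a length argument (as used in the accepted fix) is the convenient overload here, since a jar resource's length is not as easy to obtain as a file's.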

java.sql.SQLException: ORA-00600: internal error code, arguments: [12811], [93233]

I got this while trying to execute
File f = new File("file location");
FileInputStream fis = new FileInputStream(f);
PreparedStatement ps = db.prepareStatement("INSERT INTO table (title,file,id) VALUES('title',?,7)");
ps.setBinaryStream(1, fis, (int)f.length());
ps.execute();
ps.close();
title - VARCHAR2
file - BLOB
id - FK to another table
Operations with other tables are working; everything is fine but this one. I tried changing the statement and dropping and recreating the table, without success.

Store and retrieve word documents with MySQL

I need to store and retrieve MS Word documents in MySQL 5.1 with Servlets. I have the code to upload a file, but I don't know how I can feed it into the table. I've used BLOB for the field where I have to insert .doc files.
Here's my code snippet to upload files:
protected void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    response.setContentType("text/html;charset=UTF-8");
    PrintWriter out = response.getWriter();
    try {
        // get access to file that is uploaded from client
        Part p1 = request.getPart("file");
        String type = p1.getContentType();
        String name = p1.getName();
        long size = p1.getSize();
        InputStream is = p1.getInputStream();
        // read filename which is sent as a part
        Part p2 = request.getPart("name");
        Scanner s = new Scanner(p2.getInputStream());
        String filename = s.nextLine(); // read filename from stream
        // get filename to use on the server
        String outputfile = this.getServletContext().getRealPath(filename); // get path on the server
        FileOutputStream os = new FileOutputStream(outputfile);
        // write bytes taken from uploaded file to target file
        int ch = is.read();
        while (ch != -1) {
            os.write(ch);
            ch = is.read();
        }
        os.close();
        out.println("<h3>File : '" + name + "' Type : '" + type + "' "
                + "of Size : " + ((double) size / 1024) + "KB uploaded successfully!</h3>");
    } catch (Exception ex) {
        out.println("Exception -->" + ex.getMessage());
    } finally {
        out.close();
    }
}
Here, I've used the Servlet 3.0 file-upload feature...
My table schema :
resources
- UserID [varchar(15)]
- Document [mediumblob]
Could anyone tell me how I can store the document in the table, and, since BLOB is a type representing binary data, how I can retrieve it back as a Word document (*.doc)?
I agree with Archimedix... Instead of putting them into MySQL as BLOBs, you can store the file on disk and store its path in MySQL as a TEXT field. This way your retrieval time will be low. If you are space-conscious, you can zip the doc, save it on disk, and on request uncompress and send it.
UPDATE
From your code it appears that you already have a handle on the file and are able to save it on the server.
Now, to save space, you can zip it using the default Java zip utility.
You might face a problem when two people upload two different files with the same name. To avoid scenarios like this, you can either rename the archived document with a UUID (use the java.util.UUID class) or generate the SHA-1 of the file and use that as the name.
Now you can use the absolute path of the archived (and renamed) file for storing in MySQL.
Instead of table schema
resources
UserID [varchar(15)]
Document [mediumblob]
You can use this
resources
UserID [varchar(15)]
Document [varchar(512)]
So for a query like this:
SELECT Document FROM resources WHERE UserID = 'abcd';
you will now get an absolute path for the zipped file. Uncompress this file and send it.
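The zip-and-rename step described above can be sketched as follows (the class and method names are hypothetical, not from the answer); the returned absolute path is what would go into the Document column:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.UUID;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class DocArchiver {
    // Zips the given file under a collision-free UUID name and returns the
    // absolute path to store in the database instead of the BLOB.
    public static String archive(File doc, File targetDir) throws IOException {
        String name = UUID.randomUUID() + ".zip"; // avoids same-filename clashes
        File zipFile = new File(targetDir, name);
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile));
             InputStream in = new FileInputStream(doc)) {
            zos.putNextEntry(new ZipEntry(doc.getName()));
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) zos.write(buf, 0, n);
            zos.closeEntry();
        }
        return zipFile.getAbsolutePath();
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempDirectory("docs");
        Path doc = Files.write(tmp.resolve("report.doc"), "sample content".getBytes());
        String stored = archive(doc.toFile(), tmp.toFile());
        System.out.println(stored.endsWith(".zip") && new File(stored).length() > 0
                ? "archived" : "failed");
    }
}
```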
A partial answer on storing the Word documents in files:
You don't need any additional column to store the file name, as the document's record ID can serve as the file name.
When saving a new document, do it in a database transaction so that you can undo the process when something goes wrong.
In pseudo code, this would look like this:
begin transaction;
try {
save new record for document;
save Word document in predefined directory, using record's ID as the filename;
} catch (Exception e) {
rollback transaction;
throw e; // Rethrow exception
}
commit transaction;
The code above assumes that an exception is thrown when an error occurs.
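A rough Java rendering of this pseudocode: the JDBC calls are left as comments since they depend on the actual schema, and the failure is simulated with a flag to show the compensating delete that plays the role of the rollback on the file side.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TransactionalSave {
    // Sketch of the pseudocode above: insert the record, then write the file
    // named after the record id; on any failure, undo both sides.
    public static void saveDocument(long recordId, byte[] doc, Path dir, boolean failAfterWrite)
            throws IOException {
        // conn.setAutoCommit(false);
        // insert record; obtain recordId via Statement.getGeneratedKeys()
        Path file = dir.resolve(recordId + ".doc");
        try {
            Files.write(file, doc);
            if (failAfterWrite) throw new IOException("simulated failure");
            // conn.commit();
        } catch (IOException e) {
            Files.deleteIfExists(file); // compensate the file write
            // conn.rollback();
            throw e; // rethrow exception
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("docs");
        saveDocument(1, "ok".getBytes(), dir, false);
        try {
            saveDocument(2, "bad".getBytes(), dir, true);
        } catch (IOException expected) { /* rolled back */ }
        System.out.println(Files.exists(dir.resolve("1.doc")) + " "
                + Files.exists(dir.resolve("2.doc")));
    }
}
```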

How can I store large amount of data from a database to XML (memory problem)?

First, I had a problem getting the data from the database; it took too much memory and failed. I've set -Xmx1500M and I'm using a scrolling ResultSet, so that was taken care of. Now I need to make an XML file from the data, but I can't put it all in one file. At the moment, I'm doing it like this:
while (rs.next()) {
    i++;
    xmlStringBuilder.append("\n\t<row>");
    xmlStringBuilder.append("\n\t\t<ID>" + Util.transformToHTML(rs.getInt("id")) + "</ID>");
    xmlStringBuilder.append("\n\t\t<JED_ID>" + Util.transformToHTML(rs.getInt("jed_id")) + "</JED_ID>");
    xmlStringBuilder.append("\n\t\t<IME_PJ>" + Util.transformToHTML(rs.getString("ime_pj")) + "</IME_PJ>");
    // etc.
    xmlStringBuilder.append("\n\t</row>");
    if (i % 100000 == 0) {
        // stores the data to a file with the name i.xml
        storeKBR(xmlStringBuilder.toString(), i);
        xmlStringBuilder = new StringBuilder();
    }
}
and it works; I get twelve 100 MB files. Now, what I'd like to do is have all that data in one file (which I then compress), but if I just remove the if part, I run out of memory. I thought about writing to a file, closing it, then reopening it, but that wouldn't gain me much since I'd have to load the file into memory when I open it.
Why not write all data to one file and open the file with the "append" option? There is no need to read in all the data in the file if you are just going to write to it.
However, this might be a better solution:
PrintWriter writer = new PrintWriter(new BufferedOutputStream(new FileOutputStream("data.xml")));
while (rs.next()) {
    i++;
    writer.print("\n\t<row>");
    writer.print("\n\t\t<ID>" + Util.transformToHTML(rs.getInt("id")) + "</ID>");
    writer.print("\n\t\t<JED_ID>" + Util.transformToHTML(rs.getInt("jed_id")) + "</JED_ID>");
    writer.print("\n\t\t<IME_PJ>" + Util.transformToHTML(rs.getString("ime_pj")) + "</IME_PJ>");
    // ...
    writer.print("\n\t</row>");
}
writer.close();
The BufferedOutputStream will buffer the data before printing it, and you can specify the buffer size in the constructor if the default value does not suit your needs. See the java API for details: http://java.sun.com/javase/6/docs/api/.
You are assembling the complete file in memory: what you should be doing is writing the data directly to the file.
Additionally, you might consider using a proper XML API rather than assembling XML as a text file. A short tutorial is available here.
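For example, with the StAX XMLStreamWriter that ships with the JDK, the rows can be streamed straight to the output without ever holding the document in memory. A sketch (the element names mirror the question's; the single hard-coded row stands in for the ResultSet loop, and a StringWriter stands in for the FileOutputStream so it runs standalone):

```java
import java.io.StringWriter;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class StaxDemo {
    // Streams one small document and returns it; a real app would pass the
    // FileOutputStream to createXMLStreamWriter instead of a StringWriter.
    public static String writeSample() throws Exception {
        StringWriter out = new StringWriter();
        XMLStreamWriter w = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        w.writeStartDocument("UTF-8", "1.0");
        w.writeStartElement("PROSTORNE_JEDINICE");
        // in the question, this block would run once per rs.next()
        w.writeStartElement("row");
        w.writeStartElement("ID");
        w.writeCharacters("42"); // <, > and & are escaped automatically
        w.writeEndElement();
        w.writeEndElement();
        w.writeEndElement();
        w.writeEndDocument();
        w.close();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(writeSample());
    }
}
```

This also removes the need for hand-rolled escaping such as Util.transformToHTML, since the writer escapes character data itself.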
I have never encountered this use case, but I am pretty sure vtd-xml supports XML files larger than 1 GB. It is worth checking out at http://vtd-xml.sourceforge.net
Or you can also follow the article series "Output large XML documents" at http://www.ibm.com/developerworks/
Ok, so the code is rewritten and I'll include the whole operation:
// this is the calling/writing function; I have 8 types of "proizvod" which makes
// 8 XML files. After an XML file is created, it needs to be zipped by a custom zip class
generateXML(tmpParam, queryRBR, proizvod.getOznaka());
writeToZip(proizvod.getOznaka());

// inside writeToZip
ZipEntry ze = new ZipEntry(oznaka + ".xml");
FileOutputStream fos = new FileOutputStream(new File(zipFolder + oznaka + ".zip"));
ZipOutputStream zos = new ZipOutputStream(fos);
zos.putNextEntry(ze);
FileInputStream fis = new FileInputStream(new File(zipFolder + oznaka + ".xml"));
final byte[] buffer = new byte[1024];
int n;
while ((n = fis.read(buffer)) != -1)
    zos.write(buffer, 0, n);
zos.closeEntry();
zos.flush();
zos.close();
fis.close();

// inside generateXML
PrintWriter writer = new PrintWriter(new BufferedOutputStream(
        new FileOutputStream(zipFolder + oznaka + ".xml")));
writer.print("<?xml version=\"1.0\" encoding=\"UTF-8\" ?>"); // the declaration must come first, with no leading newline
writer.print("\n<PROSTORNE_JEDINICE>");
stmt = cm.getConnection().createStatement(ResultSet.TYPE_SCROLL_INSENSITIVE,
        ResultSet.CONCUR_READ_ONLY);
String q = "";
rs = stmt.executeQuery(q);
if (rs != null) {
    System.out.println("Početak u : " + Util.nowTime()); // "Start at"
    while (rs.next()) {
        writer.print("\n\t<row>");
        writer.print("\n\t\t<ID>" + Util.transformToHTML(rs.getInt("id")) + "</ID>");
        writer.print("\n\t\t<JED_ID>" + Util.transformToHTML(rs.getInt("jed_id")) + "</JED_ID>");
        // etc.
        writer.print("\n\t</row>");
    }
    System.out.println("Kraj u : " + Util.nowTime()); // "End at"
}
writer.print("\n</PROSTORNE_JEDINICE>");
writer.close(); // flush the buffered output, or the end of the file is lost
But the generateXML part still takes a lot of memory (if I'm guessing correctly, it grabs as much as it can, bit by bit) and I don't see how I could optimize it (is there an alternative way to feed the writer.print calls)?

Categories

Resources