Is there a way to read and write a blob in chunks using Hibernate?
Right now I am getting an OutOfMemoryError because the whole blob is loaded into memory as a byte[].
To be more specific, let's say I want to save a large file into a database table called File.
public class File {
private byte[] data;
}
I open the file with a FileInputStream, and then what?
How do I tell Hibernate that I need to stream the content and won't hand it the whole byte[] array at once?
Should I use Blob instead of byte[]? Either way, how can I stream the content?
As for reading, is there a way to tell Hibernate (besides the lazy loading it already does) that I need the blob loaded in chunks, so that retrieving my File does not throw an OutOfMemoryError?
I am using:
Oracle 11.2.0.3.0
Hibernate 4.2.3.Final
Oracle Driver 11.2
If you are going the Blob route, have you tried Hibernate's LobHelper and its createBlob method, which takes an InputStream? To create a Blob and persist it to the database, you would supply the FileInputStream object and the number of bytes.
Your File bean/entity class could map the Blob like this (using JPA annotations):
@Lob
@Column(name = "DATA")
private Blob data;
// Getter and setter
And the business logic/data access class could create the Blob for your bean/entity object like this, taking care not to close the input stream before persisting to the database:
FileInputStream fis = new FileInputStream(file);
Blob data = getSession().getLobHelper().createBlob(fis, file.length());
fileEntity.setData(data);
// Persist file entity object to database
To go the other way and read the Blob from the database as a stream in chunks, you could call the Blob's getBinaryStream method, giving you the InputStream and allowing you to set the buffer size later if needed:
InputStream is = fileEntity.getData().getBinaryStream();
Struts 2 has a convenient configuration setting that can control the InputStream result's buffer size.
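If you want to control the chunking yourself, here is a minimal sketch that copies the Blob to a file in fixed-size chunks, so the full payload never sits in memory (the target path and the 8 KB buffer size are placeholders):
try (InputStream is = fileEntity.getData().getBinaryStream();
     OutputStream os = new FileOutputStream("/tmp/file.bin")) {
    byte[] buffer = new byte[8192];
    int read;
    // Copy until the stream reports end-of-input (-1)
    while ((read = is.read(buffer)) != -1) {
        os.write(buffer, 0, read);
    }
}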
I am creating an Azure Function using Java. My requirement is to copy a blob from one container to another with encryption,
so for encrypting the blob I am adding 4 bytes before and after the blob content while uploading to the sink container.
Now I need to fetch the blob content, and for this I found the @BlobInput binding in Azure:
@BlobInput(
    name = "InputFileName",
    dataType = "binary",
    path = sourceContainerName + "/{InputFileName}")
byte[] content,
Here byte[] content holds the content of the blob,
but I am facing some issues: whatever file name I pass as the InputFileName parameter, it returns 200 OK (successful), which also makes exception handling difficult for me.
So I am looking for other ways to fetch the blob content. Please let me know if there are any methods or classes for this.
If you are looking for more control, instead of using the bindings you can use the Azure Storage SDK directly. Check out the quickstart doc for getting set up.
This sample has full end-to-end code that you could build upon. Here is the part you are looking for, for reference:
String data = "Hello world!";
InputStream dataStream = new ByteArrayInputStream(data.getBytes(StandardCharsets.UTF_8));
/*
* Create the blob with string (plain text) content.
*/
blobClient.upload(dataStream, data.length());
dataStream.close();
/*
* Download the blob's content to output stream.
*/
int dataSize = (int) blobClient.getProperties().getBlobSize();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream(dataSize);
blobClient.downloadStream(outputStream);
outputStream.close();
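Note that the sample above buffers the whole download in memory. If the blob is large, recent versions of the azure-storage-blob SDK can stream it instead (a sketch, assuming your SDK version has BlobClient.openInputStream, and with "downloaded.bin" as a placeholder target):
try (InputStream blobIn = blobClient.openInputStream();
     OutputStream out = new FileOutputStream("downloaded.bin")) {
    byte[] buffer = new byte[8192];
    int read;
    // Copy the blob in fixed-size chunks instead of one big byte array
    while ((read = blobIn.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}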
I am uploading an image to HBase using a Java program. After retrieving the image, I found the file size had increased and most of the Exif and other metadata was lost
(GPS location data, camera details, etc.).
Code:
public ArrayList<Object> uploadImagesToHbase(MultipartFile uploadedFileRef){
byte[] bytes =uploadedFileRef.getBytes();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
ImageIO.write(image, "jpg", outputStream);
HBaseAdmin admin = new HBaseAdmin(configuration);
HTable table = new HTable(configuration, "sample");
Put image = new Put(Bytes.toBytes("1"));
image.add(Bytes.toBytes("DataColumn"), Bytes.toBytes(DataQualifier), bytes);
table.put(image);
How can I store and retrieve an image without any change or loss?
Please try SerializationUtils from Apache Commons Lang.
These are its methods:
static Object clone(Serializable object) //Deep clone an Object using serialization.
static Object deserialize(byte[] objectData) //Deserializes a single Object from an array of bytes.
static Object deserialize(InputStream inputStream) //Deserializes an Object from the specified stream.
static byte[] serialize(Serializable obj) //Serializes an Object to a byte array for storage/serialization.
static void serialize(Serializable obj, OutputStream outputStream) //Serializes an Object to the specified stream.
While storing into HBase, you can store the byte[] returned from serialize.
When reading the value back, you can deserialize it and cast the result to the corresponding type (for example a File object).
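A minimal sketch of that round trip (MyType and myObject are placeholders, and the object must implement Serializable):
import org.apache.commons.lang3.SerializationUtils; // or org.apache.commons.lang for Commons Lang 2.x

// Serialize the object to a byte[] suitable for an HBase cell value...
byte[] cellValue = SerializationUtils.serialize(myObject);
// ...and later deserialize the bytes read back from the cell
MyType restored = (MyType) SerializationUtils.deserialize(cellValue);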
Most likely you are just over-complicating things. :-)
The reason you are losing the Exif and other metadata is that the ImageIO convenience methods ImageIO.read(...) and ImageIO.write(...) do not preserve metadata. The good news is, they are not needed.
As you seem to already have the image data from the MultipartFile, you should simply store that data (the byte array) in the database, and you will store exactly what the user uploaded. No difference in file size, and metadata will be untouched.
Your code above doesn't compile for me, and I'm no HBase expert, so I'll leave the HBase part out (as you have already been able to store an image, and to see the size/quality difference and metadata loss, I assume you know how to do that :-) ). But here are the basics:
public ArrayList<Object> uploadImagesToHbase(MultipartFile uploadedFileRef) throws IOException {
    byte[] bytes = uploadedFileRef.getBytes();
    // Store the above "bytes" byte array in HBase *as is* (no ImageIO)
}
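And for retrieval, writing those same raw bytes back out keeps the file byte-identical, Exif included (a sketch; the output path is a placeholder):
// Write the stored bytes back out unchanged; no ImageIO, so nothing is re-encoded
java.nio.file.Files.write(java.nio.file.Paths.get("restored.jpg"), bytes);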
Problem Context
I am working on a small POC (which will be integrated into a bigger application) which consists of:
1. reading an image in a simple Java application program,
2. converting the image into a byte array (byte[] imageEncodedBytes = baos.toByteArray()),
3. storing it into a remote DB2 database (the technology used was not important, so I am using plain JDBC for now),
4. reading it back from the DB, and
5. converting it back to ensure that the re-creation of the image works (I am able to open the re-created image in any image viewer).
Issues
The issues occur at step 5.
1. I read the image using a select query into a ResultSet.
2. I use rs.getBlob("ColumnName") to get the blob value.
3. I fetch the byte array from the blob value using byte[] decodedArray = myBlob.getBytes(1, (int) myBlob.length()).
4. I create the image from the obtained byte array.
At step 3, the byte array decodedArray obtained from the blob differs from the byte array imageEncodedBytes that I get when I read the image.
As a consequence, the following code to create the image from the byte array decodedArray fails:
ByteArrayInputStream bais = new ByteArrayInputStream(decodedArray);
// Reading the image back from the byte array
BufferedImage imag = ImageIO.read(bais); // Line of failure: no registered provider able to read bais
ImageIO.write(imag, "jpg", new File(dirName, "snap.jpg"));
References and other data for issue investigation
I have referred to the following links for verification:
1. Inserting image in DB2
2. This link here offers insight, but I was not yet able to determine how to register the ImageReader.
3. When inserting the image into DB2, I am using the following query:
Statement st = conn.createStatement();
st.executeUpdate("INSERT INTO PHOTO (ID,PHOTO_NM,PHOTO_IM, THMBNL_IM) " + "VALUES (1,'blob("+bl+")',blob('"+bl+"')")
As an alternative to fetching the blob value from the result set, I have also used binaryStream = rs.getBinaryStream("PHOTO_IM") to get a binary stream and then obtained the byte array from that stream. Even in this case, decodedArray differs from imageEncodedBytes.
Please assist; I may be missing something extremely trivial here, but I am not able to figure out what. Any help/pointers will be greatly appreciated. Thanks in advance.
The resolution is sort of a workaround that I worked out.
I used JdbcTemplate to resolve the issue; the lobHandler object used with the JDBC template provides an easy way to manage blobs.
Steps to resolve using the Spring LobHandler:
1) Create a data source.
2) Configure the JdbcTemplate to use the data source.
3) Use the lobHandler code to execute the insert query.
Code below
LobHandler lobHandler = new DefaultLobHandler();
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.execute("INSERT INTO PHOTO (PHOTO_IM) VALUES (?)",
    new AbstractLobCreatingPreparedStatementCallback(lobHandler) {
        protected void setValues(PreparedStatement ps, LobCreator lobCreator) {
            try {
                // Stream the image into the BLOB column instead of materializing a byte[]
                lobCreator.setBlobAsBinaryStream(ps, 1, bean.getImageOrig(), bean.getImageLength());
            } catch (java.sql.SQLException e) {
                e.printStackTrace();
            }
        }
    });
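Reading the image back can go through the same lobHandler (a sketch; the column name, the ID predicate, and the lobHandler instance match the insert above):
byte[] imageBytes = jdbcTemplate.query(
    "SELECT PHOTO_IM FROM PHOTO WHERE ID = 1",
    new ResultSetExtractor<byte[]>() {
        public byte[] extractData(ResultSet rs) throws SQLException {
            // getBlobAsBytes materializes the BLOB; use
            // lobHandler.getBlobAsBinaryStream(rs, "PHOTO_IM") to stream instead
            return rs.next() ? lobHandler.getBlobAsBytes(rs, "PHOTO_IM") : null;
        }
    });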
References
1. Stack Overflow: Spring JDBC Template Insert Blob
2. Stack Overflow: Close InputStream
3. Spring documentation for LobHandler
I am trying to write a simple webapp using JSP / Servlets / EJB (for JPA) for uploading files (up to 50 MB) to a DB.
In my entity class (User) I have the following code:
@Lob
private byte[] file;
Here is how I retrieve the file in my Servlet (currently it saves the file on my computer, and I want to change that):
for (Part part : request.getParts()) {
InputStream is = request.getPart(part.getName()).getInputStream();
int i = is.available();
byte[] b = new byte[i];
is.read(b);
String fileName = getFileName(part);
FileOutputStream os = new FileOutputStream("C:/files/" + fileName);
os.write(b);
is.close();
}
I don't know how to write the byte arrays (using a for loop) to my User entity. Any ideas?
Your input stream processing is incorrect: the available method doesn't return the length of the entire stream, it only gives you the (estimated) number of bytes that can be read before the stream blocks. Emphasis on estimated. You need to loop, reading the entire stream until a read call returns -1, or use a utility like IOUtils from Apache Commons IO:
final byte[] data = IOUtils.toByteArray(inputStream);
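If you'd rather avoid the dependency, the manual read loop looks roughly like this (a sketch; the 8 KB buffer size is an arbitrary choice):
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int read;
// Read until the stream reports end-of-input (-1)
while ((read = inputStream.read(chunk)) != -1) {
    buffer.write(chunk, 0, read);
}
final byte[] data = buffer.toByteArray();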
Once you have the data, just set it to your entity:
entity.setFile(data);
And then save it.
A user uploads a large file to my website and I want to gzip the file and store it in a blob. So I have an uncompressed InputStream, and the blob wants an InputStream. I know how to compress an InputStream to an OutputStream using GZIPOutputStream, but how do I go from the gzipped OutputStream back to the InputStream needed by the blob?
The only way I could find involves a ByteArrayOutputStream and then creating a new InputStream using toByteArray. But that means I have an entire copy of the file in memory. And it wouldn't surprise me if the JDBC driver implementation converted the stream to a byte[] as well, so I'd have two copies in memory.
If you are on Java 1.6, you can use java.util.zip.DeflaterInputStream. As far as I can tell, this does exactly what you want (note that it produces raw deflate data rather than the gzip file format, which is fine as long as the same application decompresses it). If you can't use 1.6, you should be able to reimplement DeflaterInputStream using java.util.zip.Deflater. When reading the data back from the BLOB, use an InflaterInputStream as a filter to get the original data back.
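A minimal sketch of that round trip (fis is the uploaded file's stream and blob is the one read back from JDBC; both names are placeholders):
// Wrapping the upload stream compresses on the fly as the JDBC driver reads it,
// so the file is never fully buffered in memory
InputStream compressed = new java.util.zip.DeflaterInputStream(fis);
// hand "compressed" to the Blob / PreparedStatement.setBinaryStream(...)

// Reading back: wrap the BLOB's stream to decompress as you read
InputStream original = new java.util.zip.InflaterInputStream(blob.getBinaryStream());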