Reading MEDIUMBLOB from MySQL in java - java

This is the first time I am working with BLOB data types.
I am storing image files in a MEDIUMBLOB column in a MySQL database. Now I want to retrieve them from a Java application.
So far, the tutorials I have found use com.mysql.jdbc.Blob.
I tried:
Blob imgAsBlob = resultset.getBlob("image");
byte[] imgAsBytes = imgAsBlob.getBytes(1, (int) imgAsBlob.length());
but imgAsBlob.getBytes(1, (int) imgAsBlob.length()) throws an error because the second argument (46352) is reported as larger than the BLOB's length:
java.sql.SQLException: "pos" + "length" arguments cannot be larger than the BLOB's length.
Is there any other way than the getBlob() method, or any class other than com.mysql.jdbc.Blob, that I can use?
Thanks.
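One common alternative is to skip the Blob wrapper entirely and stream the column with getBinaryStream(). A minimal sketch, assuming a standard JDBC connection (the table and column names here are illustrative, not from the question):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ReadBlob {
    /** Copy an InputStream fully into a byte array. */
    public static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        return buf.toByteArray();
    }

    /** Fetch a MEDIUMBLOB column as bytes; table/column names are assumptions. */
    static byte[] fetchImage(Connection conn, long id) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "SELECT image FROM pictures WHERE id = ?")) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) return null;
                // Stream the column without materializing a Blob object
                try (InputStream in = rs.getBinaryStream("image")) {
                    return readAll(in);
                }
            }
        }
    }
}
```

This avoids the getBytes(pos, length) call entirely, so the length mismatch never comes up.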

Related

Extracting the hex value after converting a given byte array into a blob object in Java

I have a requirement to insert a blob value into one of my tables (DB type: Oracle). I am using Java to convert my image file into a byte array first, and then into a blob object using javax.sql.rowset.serial.SerialBlob.
However, when I create a blob object using
Blob blob = new SerialBlob(byteArray);
the resulting blob object prints as javax.sql.rowset.serial.SerialBlob@'some-hex-value'. When I use the blob object in my insert query, Oracle gives me an "Invalid hex number" error.
But when I insert using only the hex part, removing the 'javax.sql.rowset.serial.SerialBlob' prefix from my blob object, the insert query works fine and the record gets inserted.
So is there any way to extract the hex value from the blob object that I am converting my image file into?
Thanks!
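The symptoms suggest the blob's toString() output is being concatenated into the SQL text, which is why only the hex part "works". The usual remedy is to bind the blob as a parameter instead of building the SQL string by hand; a sketch under that assumption (the table and column names are illustrative):

```java
import java.sql.Blob;
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.sql.rowset.serial.SerialBlob;

public class BlobInsert {
    /** Wrap a byte array as a java.sql.Blob. */
    public static Blob toBlob(byte[] bytes) throws Exception {
        return new SerialBlob(bytes);
    }

    /** Bind the blob as a parameter rather than concatenating it into the SQL.
        Table and column names are assumptions, not from the question. */
    static void insertImage(Connection conn, long id, byte[] imageBytes) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO images (id, image_data) VALUES (?, ?)")) {
            ps.setLong(1, id);
            ps.setBlob(2, toBlob(imageBytes));
            ps.executeUpdate();
        }
    }
}
```

With parameter binding there is no hex value to extract; the driver transfers the bytes directly.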

convert byte array to java.sql.Clob

Is there any way to convert a byte array into java.sql.Clob?
I am running into the following issue:
getHibernateTemplate().save(object)
where object has a field private Clob docData; which is mapped into an Oracle table as CLOB.
This docData clob is created somewhere in my Java code via Hibernate.createClob(someString).
I tried to save it with type="clob" but got: can't cast com.sun.proxy.$Proxy124 to oracle.sql.CLOB. I have tried many ways to get rid of this proxy but finally failed.
So I decided to convert it with byte[] data = IOUtils.toByteArray(docData.getCharacterStream()); or byte[] data = IOUtils.toByteArray(docData.getAsciiStream()); and save it as type="binary", but then I get: Caused by: java.sql.BatchUpdateException: ORA-01461: can bind a LONG value only for insert into a LONG column.
So now I want to create a Clob from a byte[].
Any help welcome.
Note: earlier I was using Hibernate 3.3 and it worked fine without any such byte array conversion; after upgrading to Hibernate 3.6.10 I am getting this issue.
I'm using this method to create Blobs:
org.hibernate.engine.jdbc.NonContextualLobCreator.INSTANCE.createBlob( buffer )
where buffer is an array of bytes.
There are 2 similar methods for creating CLOBs:
NonContextualLobCreator.INSTANCE.createClob( reader, length )
NonContextualLobCreator.INSTANCE.createClob( string )
Pick the one that fits your data better.
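If a Hibernate-independent route is acceptable, the byte[]-to-Clob conversion can also be sketched with the JDK's own SerialClob. This assumes the bytes hold UTF-8 text; the charset is an assumption and should match whatever encoding produced the bytes:

```java
import java.nio.charset.StandardCharsets;
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class ClobUtil {
    /** Build a java.sql.Clob from a byte array, assuming UTF-8 encoded text. */
    public static Clob toClob(byte[] data) throws Exception {
        // Decode the bytes to characters first; a Clob is character data
        return new SerialClob(new String(data, StandardCharsets.UTF_8).toCharArray());
    }
}
```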
Your error message says
can't cast com.sun.proxy.$Proxy124 to oracle.sql.CLOB
but in the rest of your text you refer to java.sql.Clob. Check your imports; you might be using the Clob from the oracle.sql package instead of the java.sql package somewhere.
Well, the issue is resolved. I kept the Java data type as Clob and changed the Hibernate mapping to type="string". That resolved it, since my digital signature data never exceeds 2 MB, which fits comfortably in a Java String.

Spring JDBC `SimpleJdbcCall` with BLOB outputs, only returns 8000 bytes

I have attempted to use a stored procedure to extract a PDF stored as a BLOB in a SQL Server database. A bit similar to the following:
SimpleJdbcCall sp = new SimpleJdbcCall(getJdbcTemplate())
        .withSchemaName("dbo")
        .withProcedureName("sp_find_doc")
        .declareParameters(
                new SqlParameter("DocID", java.sql.Types.BIGINT),
                new SqlOutParameter("DocBLOB", java.sql.Types.BLOB));
SqlParameterSource args = new MapSqlParameterSource().addValue("DocID", 1L);
Map<String, Object> map = sp.execute(args);
The stored procedure defines the output as varbinary(max), so it should support pretty large outputs. However, the PDF is corrupted. Examining more closely, I can see that the object returned by the execute method to the map is truncated to 8000 bytes.
8000 bytes is a basic varbinary (without the max), so I'm guessing that some component is failing to specify the output type correctly. Or perhaps the execute method is only reading off the first 8000 bytes?
Is there something missing from my code above to ensure that Spring JDBC defines the correct types and maps all the bytes in the BLOB?
Note that I have also tried specifying useLOBs=false on the database URL and LONGVARBINARY for the outputs, but again, 8000 bytes was the limit to the arrays returned.
FYI - I'm using SQL Server 2014, jTDS 1.3.1 and Spring Boot 1.3.1 (Spring JDBC 4.2.3).
Workaround
Note that I have implemented a workaround of rewriting the stored procedure to return a result set rather than output parameters. This way, I can use rs.getBlob("DocBLOB"), which works nicely.
But of course I would be interested to find out whether there is a correct/better way to execute stored procedures which return BLOBs in output parameters.
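The result-set workaround described above might look like the following sketch in plain JDBC terms (the procedure and column names are carried over from the question; the stored procedure is assumed to have been rewritten to SELECT the document instead of setting an output parameter):

```java
import java.sql.Blob;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.ResultSet;

public class BlobProc {
    /** Materialize a java.sql.Blob into a byte array. */
    public static byte[] blobToBytes(Blob blob) throws Exception {
        return blob.getBytes(1, (int) blob.length());
    }

    /** Call the rewritten procedure and read the BLOB from its result set,
        which avoids the 8000-byte output-parameter truncation. */
    static byte[] findDoc(Connection conn, long docId) throws Exception {
        try (CallableStatement cs = conn.prepareCall("{call dbo.sp_find_doc(?)}")) {
            cs.setLong(1, docId);
            try (ResultSet rs = cs.executeQuery()) {
                if (!rs.next()) return null;
                return blobToBytes(rs.getBlob("DocBLOB"));
            }
        }
    }
}
```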

Get File Size of stored files and new uploaded ones

I have a web app connected to a MySQL DB where some files are stored in a BLOB column. How can I fetch their sizes and add them to a new column in the table I'm using? And what is the best approach to get the file size of an uploaded file in Java?
For new records that will be processed in Java you can use File.length() if you're working with a File object.
For existing records, you can use OCTET_LENGTH(your_column_name) (source: Calculating total data size of BLOB column in a table), which gives you the size in bytes:
SELECT file_name, OCTET_LENGTH(blob_column) AS "Size in Bytes" FROM table_name;
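The File.length() approach mentioned above, in a minimal self-contained form (a temporary file stands in for the uploaded file):

```java
import java.io.File;
import java.io.FileOutputStream;

public class FileSize {
    /** Write a known payload to a temp file and read back its size in bytes. */
    public static long demo() throws Exception {
        File tmp = File.createTempFile("upload", ".bin");
        tmp.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            out.write(new byte[1024]); // 1 KiB payload
        }
        return tmp.length();
    }
}
```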

Converting cassandra blob type to string

I have an old column family which has a column named "value" which was defined as a blob data type. This column usually holds two numbers separated with an underscore, like "421_2".
When I'm using the Python DataStax driver and execute the query, the results come back with that field parsed as a string:
In [21]: session.execute(q)
Out[21]:
[Row(column1=4776015, value='145_0'),
Row(column1=4891778, value='114_0'),
Row(column1=4891780, value='195_0'),
Row(column1=4893662, value='105_0'),
Row(column1=4893664, value='115_0'),
Row(column1=4898493, value='168_0'),
Row(column1=4945162, value='148_0'),
Row(column1=4945163, value='131_0'),
Row(column1=4945168, value='125_0'),
Row(column1=4945169, value='211_0'),
Row(column1=4998426, value='463_0')]
When I use the Java driver I get a com.datastax.driver.core.Row object back. When I try to read the value field with, for example, row.getString("value"), I get the expected InvalidTypeException: Column value is of type blob. It seems the only way to read the field is via row.getBytes("value"), which returns a java.nio.HeapByteBuffer object.
Problem is, I can't seem to convert this object to a string in an easy fashion. Googling yielded two answers from 2012 that suggest the following:
String string_value = new String(result.getBytes("value"), "UTF-8");
But getBytes("value") returns a ByteBuffer, not a byte[], so no matching String constructor exists.
So, my questions are:
How do I convert HeapByteBuffer into string?
How come the python driver converted the blob easily and the java one did not?
Side Note:
I could debug the Python driver, but currently that seems like too much work for something that should be trivial. (And the fact that no one has asked about it suggests I'm missing something simple here...)
An easier way is to change the CQL statement:
SELECT column1, blobAsText(value) FROM YourTable WHERE key = xxx
The second column will then come back as a String.
You can also get direct access to the Java driver's serializers. This way you don't have to deal with low-level details, and it also works for other types.
Driver 2.0.x:
String s = (String)DataType.text().deserialize(byteBuffer);
Driver 2.1.x:
ProtocolVersion protocolVersion = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String s = (String)DataType.text().deserialize(byteBuffer, protocolVersion);
Driver 2.2.x:
ProtocolVersion protocolVersion = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String s = TypeCodec.VarcharCodec.instance.deserialize(byteBuffer, protocolVersion);
For version 3.1.4 of the datastax java driver the following will convert a blob to a string:
ProtocolVersion proto = cluster.getConfiguration().getProtocolOptions().getProtocolVersion();
String deserialize = TypeCodec.varchar().deserialize(row.getBytes(i), proto);
1.) Converting from byte buffer in Java is discussed in this answer.
2.) Assuming you're using Python 2, it's coming back as a string in Python because str is the binary type.
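For the ByteBuffer question itself, a driver-independent sketch (assuming the blob actually holds UTF-8 text, as the "145_0"-style values suggest):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class BlobToString {
    /** Decode a ByteBuffer as UTF-8 text.
        duplicate() leaves the original buffer's position untouched. */
    public static String decode(ByteBuffer buf) {
        return StandardCharsets.UTF_8.decode(buf.duplicate()).toString();
    }
}
```

This avoids copying into an intermediate byte[] and works with whatever ByteBuffer implementation the driver hands back.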
