I am trying to upload a Flux object into Azure Blob Storage, but I'm not sure how to send a Flux POJO using BlobAsyncClient. BlobAsyncClient has upload methods that take Flux<ByteBuffer> or BinaryData, but I've had no luck trying to convert my CombinedResponse to a ByteBuffer or BinaryData. Does anyone have any suggestions or know how to upload a Flux object to Blob Storage?
You will need an async blob container client:
#Bean("blobServiceClient")
BlobContainerAsyncClient blobServiceClient(ClientSecretCredential azureClientCredentials, String storageAccount, String containerName) {
BlobServiceClientBuilder blobServiceClientBuilder = new BlobServiceClientBuilder();
return blobServiceClientBuilder
.endpoint(format("https://%s.blob.core.windows.net/", storageAccount))
.credential(azureClientCredentials)
.buildAsyncClient()
.getBlobContainerAsyncClient(containerName);
}
And in your code you can use it to get a client, and save your Flux to it:
Flux<ByteBuffer> content = getContent();
blobServiceClient.getBlobAsyncClient(id)
        .upload(content, new ParallelTransferOptions(), true);
I get that the getContent() step is the part you are struggling with. You can save either a BinaryData object or a Flux<ByteBuffer> stream.
To turn your object into a BinaryData object, use the static helper method:
BinaryData foo = BinaryData.fromObject(myObject);
BinaryData is meant for exactly what the name says: binary data. For example the content of an image file.
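For instance, to wrap the raw contents of a file (just an illustration; the path is made up):
import java.nio.file.Paths;
import com.azure.core.util.BinaryData;

BinaryData imageData = BinaryData.fromFile(Paths.get("/tmp/photo.png"));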
If you want to turn it into a ByteBuffer, keep in mind that you're trying to turn an object into a stream of data. You will probably want a standardized way of doing that, so it can be reliably reversed. Rather than an ad-hoc stream of bytes that may break if you ever load the data in a different client (or even just a different version of the same one), we usually save a JSON or XML representation of the object.
My go-to tool for this is Jackson:
byte[] myBytes = new ObjectMapper().writeValueAsBytes(myObject);
var myByteBuffer = ByteBuffer.wrap(myBytes);
And return it as a Flux:
Flux<ByteBuffer> myFlux = Flux.just(myByteBuffer);
By the way, Azure uses a JSON serializer under the hood in the BinaryData.fromObject() method. From the JavaDoc:
Creates an instance of BinaryData by serializing the Object using the default JsonSerializer.
Note: This method first looks for a JsonSerializerProvider implementation on the classpath. If no implementation is found, a default Jackson-based implementation will be used to serialize the object.
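Putting the pieces together, here is a minimal sketch of what getContent() plus the upload could look like. It assumes your POJO is the CombinedResponse you mentioned, that id is the target blob name, and that Jackson and the Azure Storage Blob SDK are on the classpath; also note that nothing happens until the Mono returned by upload() is subscribed to.
import java.nio.ByteBuffer;
import com.azure.storage.blob.BlobContainerAsyncClient;
import com.azure.storage.blob.models.ParallelTransferOptions;
import com.fasterxml.jackson.databind.ObjectMapper;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class CombinedResponseUploader {

    private final BlobContainerAsyncClient blobServiceClient;  // the bean defined above
    private final ObjectMapper objectMapper = new ObjectMapper();

    public CombinedResponseUploader(BlobContainerAsyncClient blobServiceClient) {
        this.blobServiceClient = blobServiceClient;
    }

    public Mono<Void> upload(String id, Object combinedResponse) {
        // Serialize the POJO to JSON bytes lazily and wrap them in a one-element Flux<ByteBuffer>.
        Flux<ByteBuffer> content = Mono
                .fromCallable(() -> objectMapper.writeValueAsBytes(combinedResponse))
                .map(ByteBuffer::wrap)
                .flux();

        // Hand the Flux to the async blob client; the caller has to subscribe to the
        // returned Mono (or nothing is uploaded).
        return blobServiceClient.getBlobAsyncClient(id)
                .upload(content, new ParallelTransferOptions(), true)  // true = overwrite
                .then();
    }
}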
I am creating an Azure Function using Java. My requirement is to copy a blob from one container to another container with encryption.
So, for encrypting the blob, I am adding 4 bytes before and after the blob while uploading it to the sink container.
Now I need to fetch the blob content. For this I found one class in Azure, i.e.:
@BlobInput(
    name = "InputFileName",
    dataType = "binary",
    path = sourceContainerName + "/{InputFileName}")
byte[] content,
Here byte[] content fetches the content of the blob,
but I am facing some issues: whatever file name I pass as the InputFileName parameter, it returns 200 OK, i.e. success, and exception handling is also difficult for me.
So I am looking for other ways of fetching the blob content. Please let me know if there are any other methods or classes I can use.
If you are looking for more control, instead of using the bindings, you can use the Azure Storage SDK directly. Check out the quickstart doc for getting set up.
This sample code has full end-to-end code that you could build upon. Here is the relevant part of it for reference:
String data = "Hello world!";
InputStream dataStream = new ByteArrayInputStream(data.getBytes(StandardCharsets.UTF_8));
/*
* Create the blob with string (plain text) content.
*/
blobClient.upload(dataStream, data.length());
dataStream.close();
/*
* Download the blob's content to output stream.
*/
int dataSize = (int) blobClient.getProperties().getBlobSize();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream(dataSize);
blobClient.downloadStream(outputStream);
outputStream.close();
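Building on that sample, a rough sketch of your copy-with-padding scenario could look like the code below. The container clients, the file name parameter, and the 4 padding bytes are all placeholders (and simple byte padding is not real encryption):
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;

static void copyWithPadding(BlobContainerClient sourceContainer,
                            BlobContainerClient sinkContainer,
                            String inputFileName) {
    // Download the source blob into memory.
    BlobClient source = sourceContainer.getBlobClient(inputFileName);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    source.downloadStream(out);
    byte[] original = out.toByteArray();

    // Add 4 bytes before and after the content (placeholder values).
    byte[] padding = {0x01, 0x02, 0x03, 0x04};
    byte[] padded = new byte[original.length + 2 * padding.length];
    System.arraycopy(padding, 0, padded, 0, padding.length);
    System.arraycopy(original, 0, padded, padding.length, original.length);
    System.arraycopy(padding, 0, padded, padding.length + original.length, padding.length);

    // Upload the padded bytes to the sink container, overwriting if the blob already exists.
    BlobClient sink = sinkContainer.getBlobClient(inputFileName);
    sink.upload(new ByteArrayInputStream(padded), padded.length, true);
}
Working with the SDK like this also gives you full control over error handling, which was the part the binding made difficult.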
I have a Java program (a WAR) that runs out of memory when manipulating a big XML file.
The program is a REST API that returns the manipulated XML via a REST Controller.
First, the program gets an XML file from a remote URL.
Then it replaces the values of id attributes.
Finally, it returns the new XML to the caller via the API controller.
What I get from the remote URL is a byte[] body with XML data.
Then, I convert it to a String.
Next, I do a regexp search-replace on the whole string.
Then I convert it back to a byte[].
I'm guessing that the XML is now in memory three times (the incoming bytes, the String, and the outgoing bytes).
I'm looking for ways to improve this.
I have no local copies on the filesystem btw.
You can delete the incoming bytes from memory after converting them to a String, so the garbage collector can reclaim them:
byte[] bytes = bytesFromURL;
String xml = new String(bytes);
// ... manipulate xml ...
bytes = null;   // drop the reference to the incoming bytes
System.gc();    // only a hint; the JVM may ignore it
bytes = xml.getBytes();
I'm planning to use SheetJS with Rhino, and SheetJS takes a binary object (a Blob, if I'm correct) as its input. So I need to read a file from the system using standard Java I/O methods and store it in a blob before passing it to SheetJS, e.g.:
var XLDataWorkBook = XLSX.read(blobInput, {type : "binary"});
So how can I create a Blob (or an appropriate type) from a binary file in Java in order to pass it in?
I guess I can't pass streams, because XLSX needs a completely created object to process.
I found the answer to this by myself. I was able to get it done this way.
Read the file with an InputStream and then write it to a ByteArrayOutputStream, like below:
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] bytes = new byte[4096];
int len;
while ((len = inputStream.read(bytes)) != -1) {  // read in chunks until end of stream
    buffer.write(bytes, 0, len);
}
Then create a byte array from it.
byte[] byteArray = buffer.toByteArray();
Finally, I converted it to a Base64 String (which also works in my case) using the Base64.encodeBase64String() method from the org.apache.commons.codec.binary package, so I can pass the Base64 String as a method parameter.
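For completeness, that last step might look like this (assuming Apache Commons Codec is on the classpath):
import org.apache.commons.codec.binary.Base64;

String base64String = Base64.encodeBase64String(byteArray);
// base64String can now be passed to the Rhino/SheetJS side as a plain String parameter.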
If you need to go further, there are lots of libraries (third-party and built-in) available for Base64-to-Blob conversion as well.
The objective I'm trying to reach is to save a PDF into an Oracle database as a BLOB.
Currently, the servlet I'm using only sends back a PDF via HttpServletResponse:
Printers.getPDFPrinter(0).printToResponse(myTemplate, response, 0, TemplateA.PDF);
I don't have access to the printToResponse source, so I don't know what it does.
All I know is that response is an HttpServletResponse, from which I can get the OutputStream, and that myTemplate implements an IDocument interface to which I also have no access.
If I could get either myTemplate or response into a byte array (in order to save it as a BLOB), it would be OK.
However, in all my searches, I only found code to create a byte array from an input stream, not from an output stream.
Can anyone help, please?
That code probably requires the full HttpServletResponse because it also needs to set the content type and some other headers. As awful as it may sound, you can create a "mock" response object and override the relevant methods in order to intercept its writes to the output stream. You can provide the PDF printer with a ByteArrayOutputStream so that you can then get the byte[] and write it into your DB.
I am not sure which library the servlet uses to access Printers.getPDFPrinter(0), but:
the library may offer other methods than printToResponse (printToStream, printToFile, ...?)
you may pass your own HttpServletResponse that returns a dummy ServletOutputStream from getOutputStream(). This dummy subclass has to implement write(int b) by delegating to the result of Blob.setBinaryStream(1).
If you want to write your own HttpServletResponse, I would prefer inheriting from HttpServletResponseWrapper. If the servlet should also return the PDF, your ServletOutputStream needs to delegate to both the original stream and the Blob stream.
If you want the servlet to return just an id for retrieving the PDF from the database later, you need to implement your own HttpServletResponse; in that case I would use a Proxy whose InvocationHandler handles getOutputStream().
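To make the wrapper idea concrete, here is a minimal sketch that captures everything written to the response into a byte[] (the class name is made up, it targets the Servlet 3.1+ API, and you would adapt getOutputStream() if you also need to delegate to the original stream):
import java.io.ByteArrayOutputStream;
import javax.servlet.ServletOutputStream;
import javax.servlet.WriteListener;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

public class CapturingResponse extends HttpServletResponseWrapper {

    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    public CapturingResponse(HttpServletResponse response) {
        super(response);
    }

    @Override
    public ServletOutputStream getOutputStream() {
        return new ServletOutputStream() {
            @Override
            public void write(int b) {
                buffer.write(b);  // capture instead of sending to the client
            }

            @Override
            public boolean isReady() {
                return true;
            }

            @Override
            public void setWriteListener(WriteListener listener) {
                // not needed for this in-memory sketch
            }
        };
    }

    public byte[] toByteArray() {
        return buffer.toByteArray();
    }
}
Usage would then be along these lines:
CapturingResponse capture = new CapturingResponse(response);
Printers.getPDFPrinter(0).printToResponse(myTemplate, capture, 0, TemplateA.PDF);
byte[] pdfBytes = capture.toByteArray();  // ready to be stored in the Oracle BLOB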
I am trying to use Protocol Buffers to store serialized data in a database for a web application built in Java.
I have created .proto files and compiled them to get the generated classes. I can also build the message objects using the setters and, finally, the build() method. But to store it in the database, I need the serialized data as a byte[] or byte buffer. How do I get that from the message instances?
import com.paratha.serializers.protocolbuffers.CommentProto.Comment;
Comment.Builder comment=Comment.newBuilder();
comment.setCommentBody("This is the first comment!").setUserId(32433).build();
How do I get the serialized data from here to write to the database?
Google has made it very easy :)
MyProtocolBufferObject myObject = MyProtocolBufferObject.newBuilder().setName("bob").build();
byte[] bytes = myObject.toByteArray();
Edit
With your example:
Comment.Builder commentBuilder=Comment.newBuilder();
Comment comment = commentBuilder.setCommentBody("This is the first comment!").setUserId(32433).build();
byte[] bytes = comment.toByteArray();
Note that when you call the newBuilder() method you are getting an instance of Comment.Builder, not an instance of Comment. It is only when you call the Comment.Builder's build() method that you get an instance of Comment.
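And when you later read those bytes back out of the database, the generated class can rebuild the message (a small sketch using the same Comment class):
// parseFrom throws InvalidProtocolBufferException if the bytes are not a valid Comment.
Comment restored = Comment.parseFrom(bytes);
System.out.println(restored.getCommentBody());  // "This is the first comment!"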