I am creating an Azure Function using Java. My requirement is to copy a blob from one container to another container with encryption.
So, for encrypting the blob, I am adding 4 bytes before and after the blob content while uploading it to the sink container.
Now I need to fetch the blob content, and for this I found one class in Azure, i.e.,
@BlobInput(
    name = "InputFileName",
    dataType = "binary",
    path = sourceContainerName + "/{InputFileName}")
byte[] content,
Here byte[] content fetches the content of the blob.
But I am facing some issues: if I pass any file name as the InputFileName parameter, it gives 200 OK, i.e. it returns success even when the blob does not exist. It also makes exception handling difficult for me.
So I am looking for other ways of fetching blob content. Please answer if there are any other methods or classes.
If you are looking for more control, instead of using the bindings you can use the Azure Storage SDK directly. Check out the quickstart doc for getting set up.
This sample has full end-to-end code that you could build upon. Here is the code that you are looking for in it, for reference.
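The snippet below assumes a blobClient has already been constructed. A minimal sketch of that setup, assuming connection-string authentication (the setting name, container name, and blob name here are placeholders, not from the sample):
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

// Build a client for one blob; replace the placeholder names with your own.
BlobClient blobClient = new BlobServiceClientBuilder()
        .connectionString(System.getenv("AzureWebJobsStorage")) // placeholder setting name
        .buildClient()
        .getBlobContainerClient("source-container")             // placeholder container
        .getBlobClient("my-blob.txt");                          // placeholder blob name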
String data = "Hello world!";
InputStream dataStream = new ByteArrayInputStream(data.getBytes(StandardCharsets.UTF_8));
/*
* Create the blob with string (plain text) content.
*/
blobClient.upload(dataStream, data.length());
dataStream.close();
/*
* Download the blob's content to output stream.
*/
int dataSize = (int) blobClient.getProperties().getBlobSize();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream(dataSize);
blobClient.downloadStream(outputStream);
outputStream.close();
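Since exception handling was the original pain point: the SDK surfaces storage failures as BlobStorageException, so a missing blob can be turned into a proper error response instead of a silent 200 OK. A small sketch (the status-code check is the usual pattern; the handling itself is up to you):
import com.azure.storage.blob.models.BlobStorageException;

try {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    blobClient.downloadStream(out);
} catch (BlobStorageException e) {
    if (e.getStatusCode() == 404) {
        // blob not found: return an error from the function instead of 200 OK
    }
}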
I am trying to upload a Flux object into Azure Blob Storage, but I'm not sure how to send a Flux POJO using BlobAsyncClient. BlobAsyncClient has upload methods that take Flux<ByteBuffer> or BinaryData, but I have had no luck trying to convert CombinedResponse to ByteBuffer or BinaryData. Does anyone have any suggestions or know how to upload a Flux object to blob storage?
You will need an async blob container client:
@Bean("blobServiceClient")
BlobContainerAsyncClient blobServiceClient(ClientSecretCredential azureClientCredentials, String storageAccount, String containerName) {
    BlobServiceClientBuilder blobServiceClientBuilder = new BlobServiceClientBuilder();
    return blobServiceClientBuilder
            .endpoint(format("https://%s.blob.core.windows.net/", storageAccount)) // static import of String.format
            .credential(azureClientCredentials)
            .buildAsyncClient()
            .getBlobContainerAsyncClient(containerName);
}
And in your code you can use it to get a client, and save your Flux to it:
Flux<ByteBuffer> content = getContent();
blobServiceClient.getBlobAsyncClient(id)
.upload(content, new ParallelTransferOptions(), true);
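One caveat worth adding (not from the original answer): upload returns a Mono<BlockBlobItem>, and reactive pipelines are lazy, so nothing is actually sent until something subscribes. For a quick sketch:
blobServiceClient.getBlobAsyncClient(id)
        .upload(content, new ParallelTransferOptions(), true)
        .block(); // or .subscribe(...) if the surrounding code is fully reactive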
I get that the getContent() step is the part you are struggling with. You can save either a BinaryData object or a Flux<ByteBuffer> stream.
To turn your object into a BinaryData object, use the static helper method:
BinaryData foo = BinaryData.fromObject(myObject);
BinaryData is meant for exactly what the name says: binary data. For example the content of an image file.
If you want to turn it into a ByteBuffer, keep in mind that you're trying to turn an object into a stream of data. You will probably want some standardized way of doing that, so it can be reliably reversed; rather than an ad-hoc stream of bytes that may break if you ever load the data in a different client (or even just a different version of the same one), we usually save a JSON or XML representation of the object.
My go-to tool for this is Jackson:
byte[] myBytes = new ObjectMapper().writeValueAsBytes(myObject);
var myByteBuffer = ByteBuffer.wrap(myBytes);
And return it as a Flux:
Flux<ByteBuffer> myFlux = Flux.just(myByteBuffer);
By the way, Azure uses a JSON serializer under the hood in the BinaryData.fromObject() method. From the JavaDoc:
Creates an instance of BinaryData by serializing the Object using the default JsonSerializer. Note: This method first looks for a JsonSerializerProvider implementation on the classpath. If no implementation is found, a default Jackson-based implementation will be used to serialize the object.
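That also means the round trip can be done with azure-core itself. A small sketch (MyPojo stands in for your own class):
BinaryData saved = BinaryData.fromObject(myObject); // serialized as JSON by default
MyPojo restored = saved.toObject(MyPojo.class);     // deserialize it back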
I'm using Java and the jclouds SDK to upload files to a Swift container as multipart. The uploading is going fine, yet I need to add metadata to the file. I noticed that there is a function called getContentMetadata() which can get metadata such as content length and type, yet I was unable to add custom metadata. I tried to cast the put options to metadata; that didn't generate a compile error, but it threw an exception when I ran the code. The code is here:
try {
    PutOptions putOptions = PutOptions.Builder.metadata(ImmutableMap.of("test", String.valueOf(strValue)));
    ByteSource fileBytes = Files.asByteSource(file);
    Payload payload = Payloads.newByteSourcePayload(fileBytes);
    // setting the headers for the request
    payload.getContentMetadata().setContentLength(file.length());
    payload.getContentMetadata().setContentType(contentType);
    payload.setContentMetadata((MutableContentMetadata) putOptions); // this cast fails at runtime
    Blob blob = blobStore.blobBuilder(file.getName()).payload(payload).build();
    // sending the request
    blobStore.putBlob("testContainer", blob, multipart());
    return contentLength;
}
The line payload.setContentMetadata((MutableContentMetadata) putOptions); is what generated the exception.
any idea how to solve this?
Thanks
You should set the metadata via BlobBuilder, e.g.:
Blob blob = blobStore.blobBuilder(file.getName())
    .payload(fileBytes)
    .contentLength(file.length())
    .contentType(contentType)
    .userMetadata(ImmutableMap.of("test", String.valueOf(strValue)))
    .build();
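To verify the custom entry after the upload, it should come back as user metadata; a quick sketch of that check (my addition, reusing the container and key names from the question):
Map<String, String> meta = blobStore.blobMetadata("testContainer", file.getName())
        .getUserMetadata();
System.out.println(meta.get("test")); // should print the value that was set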
Problem Context
I am working on a small POC (which will be integrated into a bigger application) which consists of:
1. reading an image in a simple Java application program,
2. converting the image into a byte array: byte[] imageEncodedBytes = baos.toByteArray(),
3. storing it into a remote DB2 database (the technology used is not important, so I am using plain JDBC for now),
4. reading it back from the DB, and
5. converting it back to ensure that the re-creation of the image works (I am able to open the new re-created image in any image viewer).
Issues
The issues occur at step 5:
1. I read the image using a select query into a ResultSet,
2. use rs.getBlob("ColumnName") to get the blob value,
3. fetch the byte array from the blob value using byte[] decodedArray = myBlob.getBytes(1, (int) myBlob.length()), and
4. create the image from the obtained byte array.
At step 3 the byte array decodedArray obtained from the blob differs from the byte array imageEncodedBytes that I get when I read the image.
As a consequence, the following code to create the image from the byte array decodedArray fails:
ByteArrayInputStream bais = new ByteArrayInputStream(decodedArray);
// Writing to image
BufferedImage imag = ImageIO.read(bais); // Line of failure: no registered provider able to read bais
ImageIO.write(imag, "jpg", new File(dirName, "snap.jpg"));
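As a side note (mine, not the asker's): ImageIO.read returns null instead of throwing when no registered reader recognizes the bytes, so a null check makes the corruption explicit before ImageIO.write fails on the null image:
BufferedImage imag = ImageIO.read(bais);
if (imag == null) {
    // none of the registered ImageReaders could decode the stream,
    // i.e. the byte array no longer holds a valid image
    throw new IOException("decoded bytes are not a readable image");
}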
References and Other data for issue investigation
I have referred to the following links for verification:
1. Inserting image in DB2
2. This link here offers insight, but I was not able to determine how to register the ImageReader.
When inserting the image into DB2 I am using the following query:
Statement st = conn.createStatement();
st.executeUpdate("INSERT INTO PHOTO (ID,PHOTO_NM,PHOTO_IM, THMBNL_IM) " + "VALUES (1,'blob("+bl+")',blob('"+bl+"')")
As an alternative to fetching the blob value from the result set, I have also used binaryStream = rs.getBinaryStream("PHOTO_IM") to get a binary stream and then obtained the byte array from that stream. Even in this case, decodedArray differs from imageEncodedBytes.
Please assist; I may be missing something extremely trivial here, but I am not able to figure out what. Any help/pointers will be greatly appreciated. Thanks in advance.
The resolution is sort of a workaround that I worked out.
I used Spring's JdbcTemplate to resolve the issue. The LobHandler object used with the template provides an easy way to manage BLOBs.
Steps to resolve using the Spring LobHandler:
1) created a DataSource,
2) configured the JdbcTemplate to use the data source,
3) used the LobHandler code to execute the insert query.
Code below:
LobHandler lobHandler = new DefaultLobHandler(); // Spring's standard LobHandler implementation
jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.execute("INSERT INTO PHOTO (PHOTO_IM) VALUES (?)",
        new AbstractLobCreatingPreparedStatementCallback(lobHandler) {
            protected void setValues(PreparedStatement ps, LobCreator lobCreator) {
                try {
                    // stream the original image bytes into the BLOB column
                    lobCreator.setBlobAsBinaryStream(ps, 1, bean.getImageOrig(), bean.getImageLength());
                } catch (java.sql.SQLException e) {
                    e.printStackTrace();
                }
            }
        }
);
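For completeness, the same fix should work with plain JDBC too, since the corruption almost certainly came from splicing the bytes into the SQL string in the original insert; parameter binding keeps them intact. A sketch (column and variable names follow the question):
PreparedStatement ps = conn.prepareStatement("INSERT INTO PHOTO (PHOTO_IM) VALUES (?)");
ps.setBinaryStream(1, new ByteArrayInputStream(imageEncodedBytes), imageEncodedBytes.length);
ps.executeUpdate();
ps.close();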
References
1. Stack Overflow Link - Spring JDBC Template Insert Blob
2. Stack Overflow Link - Close InputStream
3. Spring Doc for LobHandler
I wrote a Google App Engine application that makes use of Blobstore to save programmatically generated data. To do so, I used the Files API, which unfortunately has been deprecated in favor of Google Cloud Storage, so I'm rewriting my helper class to work with GCS.
I'd like to keep the interface as similar as possible to what it was before, also because I persist BlobKeys in the Datastore to keep references to the files (and changing the model of a production application is always painful). When I save something to GCS, I retrieve a BlobKey with
BlobKey blobKey = blobstoreService.createGsBlobKey("/gs/" + fileName.getBucketName() + "/" + fileName.getObjectName());
as prescribed here, and I persist it in the Datastore.
So here's the question: the documentation tells me how to serve a GCS file with blobstoreService.serve(blobKey, resp); in a servlet response, but how can I retrieve the file content (as an InputStream, byte array, or whatever) to use it in my code for further processing? In my current implementation I do that with a FileReadChannel reading from an AppEngineFile (both deprecated).
Here is the code to open a Google Cloud Storage object as an InputStream. Unfortunately, you have to use the bucket name and object name rather than the blob key:
GcsFilename gcsFilename = new GcsFilename(bucketName, objectName);
GcsService service = GcsServiceFactory.createGcsService();
ReadableByteChannel rbc = service.openReadChannel(gcsFilename, 0); // read from offset 0
InputStream stream = Channels.newInputStream(rbc);
Given a blobKey, use the BlobstoreInputStream class to read the value from Blobstore, as described in the documentation:
BlobstoreInputStream in = new BlobstoreInputStream(blobKey);
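If you need the whole value as a byte array (as in the old FileReadChannel code), a minimal sketch of draining that stream follows; the 8 KB buffer size is an arbitrary choice:
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];
int n;
while ((n = in.read(buffer)) != -1) {
    out.write(buffer, 0, n);
}
in.close();
byte[] content = out.toByteArray();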
You can get the Cloud Storage filename only in the upload handler (fileInfo.gs_object_name), so you must store it in your database there. After that it is lost, and it does not seem to be preserved in BlobInfo or other metadata structures.
Google says:
Unlike BlobInfo metadata, FileInfo metadata is not persisted to datastore. (There is no blob key either, but you can create one later if needed by calling create_gs_key.) You must save the gs_object_name yourself in your upload handler or this data will be lost.
Sorry, this is a Python link, but it should be easy to find something similar in Java:
https://developers.google.com/appengine/docs/python/blobstore/fileinfoclass
Here is the Blobstore approach (sorry, this is Python, but I am sure you will find the Java equivalent quite similar):
blob_reader = blobstore.BlobReader(blob_key)
if blob_reader:
file_content = blob_reader.read()
I'm hoping the answer to this question is quite simple, but I can't get it working after looking at the Azure Java API documentation.
I am trying to create an empty CloudBlockBlob, which will have blocks uploaded to it at a later point. I have successfully uploaded blocks before, when the blob is created upon the first block being uploaded, but I can't seem to get anything other than "the specified blob does not exist" when I try to create a new blob without any data and then access it. I need this because in my service, a call is first made to create the new blob in Azure, and later calls are used to upload blocks (at which point a check is made to see if the blob exists). Is it possible to create an empty blob in Azure and upload data to it later? What have I missed?
I've not worked with the Java SDK, so I may be wrong, but I tried creating an empty blob using C# code (storage client library 2.0), and if I upload an empty input stream, an empty blob of zero bytes is created. I did something like the following:
CloudBlockBlob emptyBlob = blobContainer.GetBlockBlobReference("emptyblob.txt");
using (MemoryStream ms = new MemoryStream())
{
emptyBlob.UploadFromStream(ms);//Empty memory stream. Will create an empty blob.
}
I did look at the Azure SDK for Java source code on GitHub here: https://github.com/WindowsAzure/azure-sdk-for-java/blob/master/microsoft-azure-api/src/main/java/com/microsoft/windowsazure/services/blob/client/CloudBlockBlob.java and found an "upload" function where you can specify an input stream. Try it out and see if it works for you.
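A rough Java equivalent of the C# snippet above, untested and based on that same upload(InputStream, long) method, might look like this:
CloudBlockBlob emptyBlob = blobContainer.getBlockBlobReference("emptyblob.txt");
ByteArrayInputStream empty = new ByteArrayInputStream(new byte[0]);
emptyBlob.upload(empty, 0); // a zero-length upload creates a zero-byte blob
empty.close();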