Saving blobs with Google Endpoint - java

I have an app that allows users to save blobs in the blobstore. I have a schema that does so presently, but I am interested in something simpler and less twisted. For context, imagine my app allows users to upload the picture of an animal with a paragraph describing what the animal is doing.
Present schema
1. The user calls my endpoint API to save the paragraph and the name of the animal in an Animal entity. Note: the Animal entity actually has four fields (name, paragraph, BlobKey, and blobServingUrl as a String), but the endpoint API only allows saving the two mentioned.
2. Within the endpoint method, on the App Engine side, after saving name and paragraph I make the following call to generate a blob-serving URL, which my endpoint method returns to the caller:
@ApiMethod(name = "saveAnimalData", httpMethod = HttpMethod.POST)
public String saveAnimalData(AnimalData request) throws Exception {
    ...
    BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
    String url = blobstoreService.createUploadUrl("/upload");
    return url;
}
3. On the Android side, I use a plain HTTP call to send the byte[] of the image to the blobstore, using Apache's DefaultHttpClient. Note: after saving the image, the blobstore calls my App Engine server with the blob key and serving URL.
4. I read the response from the blobstore (the blobstore calls my callback URL) using a plain Java servlet, i.e. public void doPost(HttpServletRequest req, HttpServletResponse res) throws ServletException, IOException. From the servlet, I put the BlobKey and blobServingUrl into the Animal entity for the associated animal. (I had passed some metadata to the blobstore, which I use as markers to identify the associated Animal entity.)
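For reference, the callback servlet in step 4 might look like the following sketch; the upload field name ("image") and the meta-data parameter ("animalId") are assumptions, not from the original code:

```java
import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import com.google.appengine.api.images.ImagesServiceFactory;
import com.google.appengine.api.images.ServingUrlOptions;

// Sketch of the blobstore callback servlet; field and parameter names are assumptions.
public class UploadCallbackServlet extends HttpServlet {
    private final BlobstoreService blobstoreService =
            BlobstoreServiceFactory.getBlobstoreService();

    @Override
    public void doPost(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {
        Map<String, List<BlobKey>> uploads = blobstoreService.getUploads(req);
        BlobKey blobKey = uploads.get("image").get(0);   // upload field name assumed
        String servingUrl = ImagesServiceFactory.getImagesService()
                .getServingUrl(ServingUrlOptions.Builder.withBlobKey(blobKey));
        String animalId = req.getParameter("animalId");  // meta-data marker assumed
        // ...load the matching Animal entity and store blobKey + servingUrl...
    }
}
```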
Desired Schema
This is where your response comes in. Essentially, I would like to eliminate the Java servlet and restrict my entire API to Google Cloud Endpoints. So my question is: how would I use my endpoint to execute steps 3 and 4?
So the idea would be to send the image bytes to the endpoint method saveAnimalData at the same time that I am sending the paragraph and name data. And then within the endpoint method, send the image to the blobstore and then persist the BlobKey and blobServingUrl in my entity Animal.
Your response must be in java. Thanks.

I see two questions in one here:
Can Google Cloud Endpoints handle multipart files? To be honest, I don't know.
Is there a simpler process for storing blobs than using the BlobStoreService?
It depends on the size of your image. If you limit your users to files under 1 MB, you could just store your image as a Blob property of your Animal entity, which lets you bypass the BlobStoreService plumbing. See: https://developers.google.com/appengine/docs/java/datastore/entities?hl=FR
This solution still depends on how the Cloud Endpoint would handle the multipart file as a raw byte[]...
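If the endpoint can receive the image bytes in the request (e.g. base64-encoded into a byte[] field), storing them as a datastore Blob property might look like this sketch; the AnimalData getters are assumptions about your request class:

```java
import com.google.appengine.api.datastore.Blob;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;

// Sketch only: assumes AnimalData also carries the image as byte[] (< 1 MB).
@ApiMethod(name = "saveAnimalData", httpMethod = HttpMethod.POST)
public void saveAnimalData(AnimalData request) {
    Entity animal = new Entity("Animal");
    animal.setProperty("name", request.getName());           // assumed getter
    animal.setProperty("paragraph", request.getParagraph()); // assumed getter
    // Blob properties are not indexed; store the raw bytes directly.
    animal.setUnindexedProperty("image", new Blob(request.getImageBytes()));
    DatastoreServiceFactory.getDatastoreService().put(animal);
}
```

With this approach there is no BlobKey and no serving URL; the image is served by reading the Blob property back out of the entity.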
We encountered the same issue with GWT + Google App Engine in 2009, before the BlobStoreService was made available.
GWT RPC and Cloud Endpoints interfaces share some similarities, and for us it was not possible. We had to create a plain HTTP servlet and use a streaming multipart file resolver, because the one from Apache HTTP Commons used the file system.


How to set up the DFS endpoint instead of the blob endpoint for BlobContainerClient

In general, I need to create a Java app that performs some operations on Azure Storage,
such as uploading a file, appending to a file, renaming, checking existence, and so on. Importantly, it has to communicate with the DFS endpoint https://xxxx.dfs.core.windows..
But I have encountered some problems:
while using BlobContainerClient to upload a file to Azure Storage, an error appears:
com.azure.storage.blob.models.BlobStorageException: Status code 400,
"{"error":{"code":"MissingRequiredHeader","message":"An HTTP header
that's mandatory for this request is not
specified.\nRequestId:b225d695-201f-00ed-212e-c7c9e8000000\nTime:2021-10-22T10:23:12.4983407Z"}}"
How can I avoid this situation? Which header is required, and how do I set it?
Afterward I implemented something similar using DataLakeFileSystemClient, and this time uploading the file was totally fine. Unfortunately, not all operations can be performed; e.g. the exists() method internally uses a BlobContainerClient
and performs the call via the blob endpoint https://xxxx.blob.core.windows.., which is forbidden in my case.
IMO this is caused by BlobContainerClientBuilder.endpoint(String endpoint) setting the BlobContainerClient
endpoint to blob, and the DFS endpoint for the DataLakeFileSystemClient.
source code:
public DataLakeFileSystemClientBuilder endpoint(String endpoint) {
    // Ensure endpoint provided is dfs endpoint
    endpoint = DataLakeImplUtils.endpointToDesiredEndpoint(endpoint, "dfs", "blob");
    blobContainerClientBuilder.endpoint(DataLakeImplUtils.endpointToDesiredEndpoint(endpoint, "blob", "dfs"));
So the question is: is this a bug in BlobContainerClientBuilder.endpoint(String endpoint),
or how can I fix the problem so that both clients use the same endpoint?
Currently I have implemented a workaround and I'm using both clients: DataLakeFileSystemClient to perform actions
like upload, append, etc., and BlobContainerClient to check whether a file exists. I would like to use only one of the clients.
Could you help me somehow, please?
Azure Blob Storage is designed for storing large amounts of unstructured data, i.e. data that does not adhere to a particular model or definition, such as text or binary data.
Blob Storage provides three resources: the storage account (SA), containers inside the SA, and blobs inside a container. We use Java classes to interact with these resources.
The BlobContainerClient class lets you manipulate Azure Storage containers and their blobs; it is mainly used to work on the containers (file systems). If you want to work on or manipulate blobs (files), it's recommended to use BlobClient.
Check the following snippets for creating a container and uploading a file.
Create a container using a BlobContainerClient.
blobContainerClient.create();
Upload BinaryData to a blob using a BlobClient generated from a BlobContainerClient.
BlobClient blobClient = blobContainerClient.getBlobClient("myblockblob");
String dataSample = "samples";
blobClient.upload(BinaryData.fromString(dataSample));
To rename a blob (file), copy-and-delete is the only method. For larger blobs, you need to start an asynchronous copy and check periodically for its completion.
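A copy-then-delete "rename" might be sketched like this; the blob names are placeholders, and for very large blobs you would poll the copy status instead of blocking:

```java
import com.azure.storage.blob.BlobClient;

// "Rename" oldName -> newName by copying and then deleting the source.
BlobClient source = blobContainerClient.getBlobClient("oldName");
BlobClient target = blobContainerClient.getBlobClient("newName");

// beginCopy returns a poller; waitForCompletion blocks until the copy finishes.
target.beginCopy(source.getBlobUrl(), null).waitForCompletion();
source.delete();
```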
Check this Manage blobs with Java v12 SDK and Azure Storage Blob client library for Java document for more information.

How to retrieve metadata associated with a video or image sent from a RESTful web service?

I have the controller shown below:
@RequestMapping(value = "/videos/{id}",
        headers = "Accept=image/jpeg, image/jpg, image/png, video/mp4",
        method = RequestMethod.GET)
public ResponseEntity<byte[]> loadVideo(@PathVariable("id") long campaignId,
        Principal principal) throws IOException {
This controller returns a byte stream of the media associated with the given id, and it works fine. The only issue I'm having is loading the video's associated metadata (title, description, view count, etc.): since I'm sending back an array of bytes, I'm not sure where to put the metadata.
Should I place the metadata in the response headers?
Should I have two separate calls, one for the video (byte stream) and
another which returns an object containing the metadata?
Or is there a better way to go about this than either of the two
options above?
As my comment was already lengthy, I decided to repost it here:
If you deal with media types like image/jpeg or video/mp4, you should include the metadata as headers, since the payload of the response should only contain the bytes of the respective file. This also enables looking up the metadata via a simple HEAD request, without having to download the bytes of the actual file.
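The header approach could look like this in the controller; the header names and the loading helper are illustrative, not from the original code:

```java
// Sketch: expose metadata as custom response headers alongside the bytes.
@RequestMapping(value = "/videos/{id}", method = RequestMethod.GET)
public ResponseEntity<byte[]> loadVideo(@PathVariable("id") long campaignId)
        throws IOException {
    byte[] video = loadVideoBytes(campaignId);   // your existing loading logic (assumed)
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.parseMediaType("video/mp4"));
    headers.set("X-Video-Title", "Some title");  // illustrative metadata headers
    headers.set("X-Video-View-Count", "42");
    return new ResponseEntity<>(video, headers, HttpStatus.OK);
}
```

A HEAD request to the same URL then returns the X-Video-* headers without the body.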
Certain API providers, however, define their own media type or send a JSON- or XML-based response to the client. In these cases, the payload often contains a predefined structure which includes the bytes of the file as a base64-encoded string as well as the metadata as plain text. These APIs argue that sending multiple files at once is easier this way than handling multipart content.

Get BlobKey type from JSON Object

We're trying to build an application which will store images in the blobstore.
I used the code from here and added it to endpoint project.
Now the /uploaded servlet returns a JSONObject like this to the client:
json.put("servingUrl", servingUrl);
json.put("blobKey", blobKey);
There is a PictureEntity class whose blobKey field is of type BlobKey. I can send another HttpPost request with all the needed data to the server, get the needed String from the HTML entity, and apply
picture.setBlobKey(blobKey); on the server side.
But since I use Endpoints, I don't want to transfer all the data again to another servlet on the server; I want to use endpoints. And it seems that they should somehow support the BlobKey type.
At least GAE creates a model for them, but when I try to do the same on the client side I always get an error:
The method setBlobKey(BlobKey) in the type PictureEntity is not
applicable for the arguments (String)
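Since the generated model carries the key as a String, one workaround on the server side is simply to wrap that String in a BlobKey; blobKeyString here stands for the value read from the JSON response:

```java
import com.google.appengine.api.blobstore.BlobKey;

// blobKeyString is the String taken from the JSON response.
picture.setBlobKey(new BlobKey(blobKeyString));
```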
Any help would be appreciated. Thanks!

How to download a file from a web site running a Liferay portlet, using Java code

I'm trying to download a file from a site that runs a Liferay server.
I have been reading a lot about this, but everything describes how to configure a server, not how to read from one. All the examples I saw use HttpServletRequest, which needs a request as input. How can I turn a URL into a request? Where do I even start?
In other words: I have the URL; on the webpage I select a date and a download link is generated. How can I perform that download in Java?
I tried this:
HttpServletRequest request = PortalUtil.getHttpServletRequest(PortletRequest);
So how do I link my URL to the PortletRequest?
If you have the URL of the download, the only thing you need is to perform a client request against that URL.
First, to be sure that the URL you have is the one that will give the expected results, paste it into a new browser window and verify that the download starts.
Then, if you want to perform that download through Java, you can do so very easily using the URL and URLConnection (HttpURLConnection in this case) classes:
String urlString = "..."; // Your URL
URL url = new URL(urlString);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
if (conn.getResponseCode() == 200) {
    try (InputStream stream = conn.getInputStream()) {
        // Read the data from the stream, e.g. copy it to a local file
        Files.copy(stream, Paths.get("download.bin"));
    }
}
You could also do the same using Apache HTTP Client.
Note: PortalUtil.getHttpServletRequest(...) is used internally by Liferay and you won't have any access to that API if you are doing a client request.
If you're writing a portlet, by design you don't get access to the HttpServletRequest.
What you can do is utilize the "resource-serving" lifecycle phase of a portlet. There you get access to a ResourceRequest and a ResourceResponse object, which behave almost like HttpServletRequest/-Response objects.
As you don't name the framework that you're using: javax.portlet.GenericPortlet.serveResource() is the method that you want to override in the pure JSR-286 API.
On the UI side, <portlet:resourceURL/> will provide the URL to your portlet's resource handling method.
This should provide you with enough google-food to find tutorials on how to implement different lifecycle phases - I can't judge the required level of detail you need. Note that Liferay has quite a few sample portlets that you can utilize as a source for sample code.
Edit: Following your comment below, let me give you some pseudo code (just typed here, never compiled/run):
on a jsp frontend, e.g. view.jsp:
<a href="<portlet:resourceURL />">Download File</a>
Then, in your portlet, assuming you're implementing javax.portlet.GenericPortlet in one way or another (e.g. indirectly through Liferay's MVCPortlet or any other superclass):
public class MyPortlet extends GenericPortlet {
    ....
    @Override
    public void serveResource(ResourceRequest request, ResourceResponse response) {
        // implement the file streaming here,
        // use ResourceResponse the way you find illustrated
        // in samples for HttpServletResponse
    }
}

trying to retrieve file from blobstore and send it as mail attachment using google app engine

I am trying to design an application that retrieves data stored in the blobstore and sends it as an attachment. Does Google App Engine allow this? From the documentation, I could not find a way to retrieve data from the blobstore for processing within the app. Can someone please tell me how to accomplish this? Code examples and/or pointers to related online resources would be really helpful.
You can now read data from the blobstore, using BlobReader, which provides a simple, file-like interface.
As of now, it seems this isn't possible; you can only cause the file to be sent to the client.
It's possible you could do what you need using a Datastore Blob?
It's worth also noting that the Blobstore is "experimental" and may change; additional functionality may be added that would allow what you're trying to do.
This can be accomplished in two steps using the code from the Complete Sample App.
http://code.google.com/appengine/docs/java/blobstore/overview.html#Complete_Sample_App
1) Write a servlet that takes a blob key and returns the contents of the blob.
private final BlobstoreService blobstoreService =
        BlobstoreServiceFactory.getBlobstoreService();

public void doGet(HttpServletRequest req, HttpServletResponse res)
        throws IOException {
    BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
    blobstoreService.serve(blobKey, res);
}
2) Within your application, use the URLFetchService.fetch(java.net.URL url) with the proper blobkey to retrieve the blob (as a stream) and attach it to the email.
You can use BlobstoreInputStream to do almost the same thing as BlobReader does.
https://developers.google.com/appengine/docs/java/javadoc/com/google/appengine/api/blobstore/BlobstoreInputStream
BlobstoreInputStream provides an InputStream view of a blob in Blobstore. It is thread compatible but not thread safe: there is no static state, but any multithreaded use must be externally synchronized.
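Putting the pieces together, a sketch of reading a blob through BlobstoreInputStream and attaching it with the App Engine Mail API could look like this; the addresses, file name, and MIME type are placeholders, and blobKeyString stands for your stored key:

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.util.Properties;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeBodyPart;
import javax.mail.internet.MimeMessage;
import javax.mail.internet.MimeMultipart;
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreInputStream;

// Read the blob into memory (App Engine limits outgoing attachment sizes).
InputStream in = new BlobstoreInputStream(new BlobKey(blobKeyString));
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];
int n;
while ((n = in.read(buffer)) != -1) {
    out.write(buffer, 0, n);
}
in.close();

// Build the mail with the blob bytes as an attachment.
Session session = Session.getDefaultInstance(new Properties(), null);
MimeMessage msg = new MimeMessage(session);
msg.setFrom(new InternetAddress("admin@example.com"));        // placeholder sender
msg.addRecipient(Message.RecipientType.TO,
        new InternetAddress("user@example.com"));             // placeholder recipient
msg.setSubject("Your file");

MimeBodyPart attachment = new MimeBodyPart();
attachment.setFileName("attachment.pdf");                     // placeholder name
attachment.setContent(out.toByteArray(), "application/pdf");  // placeholder MIME type

MimeMultipart mp = new MimeMultipart();
mp.addBodyPart(attachment);
msg.setContent(mp);
Transport.send(msg);
```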
