I have a simple Spring Boot microservice that takes care of uploading, retrieving and deleting images to/from Google Cloud Storage. I have the following code for the get request in my service:
public StorageObject getImage(String fileName) throws IOException {
    StorageObject object = storage.objects().get(bucketName, fileName).execute();
    File file = new File("./" + fileName);
    FileOutputStream os = new FileOutputStream(file);
    storage.getRequestFactory()
            .buildGetRequest(new GenericUrl(object.getMediaLink()))
            .execute()
            .download(os);
    object.set("file", file);
    return object;
}
And this is my controller part:
@GetMapping("/get/image/{id}")
public ResponseEntity<byte[]> getImage(@PathVariable("id") Long id) {
    try {
        String fileName = imageService.findImageById(id);
        StorageObject object = gcsService.getImage(fileName);
        byte[] res = Files.toByteArray((File) object.get("file"));
        return ResponseEntity.ok()
                .contentType(MediaType.IMAGE_JPEG)
                .body(res);
    } catch (IOException e) {
        e.printStackTrace();
        throw new RuntimeException("No such file or directory");
    }
}
It all works fine in terms of getting the image in the response, but my problem is that the images also get downloaded to the root directory of the project. Many images are going to be uploaded through this service, so this is an issue. I only want to return the images in the response (as a byte array), without having them downloaded locally. I tried playing with the code but couldn't manage to get it to work the way I want.
I'd suggest streaming the download instead, skipping the FileChannel operation. For reference, the standard streaming sample writes to a FileChannel:
public static void streamObjectDownload(
        String projectId, String bucketName, String objectName, String targetFile
) {
    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    try (ReadChannel reader = storage.reader(BlobId.of(bucketName, objectName));
         FileChannel targetFileChannel = FileChannel.open(Paths.get(targetFile), StandardOpenOption.WRITE)) {
        ByteStreams.copy(reader, targetFileChannel);
        System.out.println(
                "Downloaded object " + objectName
                        + " from bucket " + bucketName
                        + " to " + targetFile
                        + " using a ReadChannel.");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
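If the goal is to skip the local file entirely and just hand the bytes to the HTTP response, a minimal sketch (assuming the com.google.cloud:google-cloud-storage client with default credentials; the method and variable names are illustrative, not the poster's code):

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class InMemoryDownload {
    // Reads the whole object into memory; nothing is written to the local disk.
    public static byte[] downloadToBytes(String bucketName, String objectName) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        return storage.readAllBytes(BlobId.of(bucketName, objectName));
    }
}

The controller can then return that byte[] directly, which sidesteps the temporary-file problem from the question.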
Alternatively, if you do want a file on disk, one can e.g. obtain a FileChannel from a RandomAccessFile (note that RandomAccessFile takes a file name or File plus a mode string, not a Path and open options):
RandomAccessFile file = new RandomAccessFile(targetFile, "rw");
FileChannel channel = file.getChannel();
While Spring (via the Spring Cloud GCP project) similarly has a GoogleStorageResource:
public OutputStream getOutputStream() throws IOException
Returns the output stream for a Google Cloud Storage file.
Then convert from the OutputStream to byte[] (this may be binary or ASCII data); toByteArray() is available when you write into a ByteArrayOutputStream:
byte[] bytes = os.toByteArray();
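For the read direction, which is what the question actually needs, a hedged sketch using the Spring Resource abstraction (assuming Spring Cloud GCP resolves gs:// locations to a GoogleStorageResource; the injected location is illustrative):

import org.springframework.core.io.Resource;
import org.springframework.util.StreamUtils;
import java.io.IOException;
import java.io.InputStream;

// Reads a GCS-backed Resource fully into memory, e.g. one injected with
// @Value("gs://my-bucket/my-image.jpg").
public byte[] readImage(Resource gcsResource) throws IOException {
    try (InputStream in = gcsResource.getInputStream()) {
        return StreamUtils.copyToByteArray(in);
    }
}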
Would it work for you to create Signed URLs in Cloud Storage to display your images? These URLs give access to storage bucket files for a limited time and then expire, so you would not need to store temporary copies of the image locally as is suggested in this post.
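If it would, a minimal sketch (assuming the google-cloud-storage client and credentials that are allowed to sign; names are illustrative):

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.net.URL;
import java.util.concurrent.TimeUnit;

// Generates a V4 signed URL that expires after 15 minutes, so clients can
// fetch the image straight from Cloud Storage without a local copy.
public URL signedImageUrl(String bucketName, String objectName) {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    BlobInfo blobInfo = BlobInfo.newBuilder(bucketName, objectName).build();
    return storage.signUrl(blobInfo, 15, TimeUnit.MINUTES, Storage.SignUrlOption.withV4Signature());
}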
Related
I created a Java program that saves files to Amazon S3 storage. It works OK, but it saves files not only in the S3 bucket but also in my project directory.
Here is my code that saves files to S3. I suppose the reason it also saves to the project directory is the creation of a File instance with the specified path: File file = new File(timestamp + ".jpg"). But how can I avoid that and still set the needed file name without saving the file to the project directory?
public void saveFileToStorage(String url, Long timestamp, Integer deviceId) {
    S3Repository repository = new S3Repository(bucketName);
    File file = new File(timestamp + ".jpg");
    try {
        URL link = new URL(url);
        Thread.sleep(1500); // wait until the URL is ready for download
        FileUtils.copyURLToFile(link, file);
        repository.uploadFile(timestamp.toString(), file, deviceId.toString() + "/");
    } catch (IOException | InterruptedException e) {
        log.error(e.getMessage() + " - check thread sleep time!");
        throw new RuntimeException(e);
    }
}
Here is my upload method from the repository:
public void uploadFile(String keyName, File file, String folder) {
    // Create a zero-byte object so the "folder" shows up in the bucket listing.
    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentLength(0);
    s3client.putObject(bucketName, folder, new ByteArrayInputStream(new byte[0]), metadata);
    s3client.putObject(new PutObjectRequest(bucketName, folder + keyName, file));
}
It's quite common to do something similar to what you've done.
I personally like the PutObjectRequest builder.
S3Client client = S3Client.builder().build();
PutObjectRequest request = PutObjectRequest.builder()
        .bucket("bucketName").key("fileName").build();
client.putObject(request, RequestBody.fromFile(new File("filePath")));
To address your problem, what about using RequestBody.fromByteBuffer() instead of RequestBody.fromFile()?
Here you can find an example:
https://stackabuse.com/aws-s3-with-java-uploading-files-creating-and-deleting-s3-buckets/
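For instance, a sketch under those assumptions (AWS SDK v2; the URL-to-bytes step replaces FileUtils.copyURLToFile, and all names are illustrative):

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.nio.ByteBuffer;

// Downloads the image into memory and uploads it straight to S3, so no file
// is ever created in the project directory.
public void saveUrlToS3(S3Client client, String url, String bucketName, String key) throws IOException {
    byte[] bytes;
    try (InputStream in = new URL(url).openStream()) {
        bytes = in.readAllBytes(); // Java 9+
    }
    PutObjectRequest request = PutObjectRequest.builder()
            .bucket(bucketName)
            .key(key)
            .build();
    client.putObject(request, RequestBody.fromByteBuffer(ByteBuffer.wrap(bytes)));
}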
I am fetching S3 objects and then sending each object in an email as an attachment. I am saving the contents in a temporary file. For images the code works fine, but in the case of documents (pdf, docx, csv) the attachments are sent without an extension, so they are not accessible.
try {
    fullObject = s3Client.getObject(new GetObjectRequest(bucketName, key));
    System.out.println("fullObject: " + fullObject);
    ObjectMetadata metadata = fullObject.getObjectMetadata();
    System.out.println(" meta data type: " + metadata.getContentType());
    InputStream inputStream = fullObject.getObjectContent();
    String extension = fullObject.getKey();
    int index = extension.lastIndexOf('.');
    if (index > 0) {
        extension = extension.substring(index + 1);
        System.out.println("File extension is " + extension);
    }
    File file = File.createTempFile(key, "." + extension);
    System.out.println("file: " + file);
    try (OutputStream outputStream = new FileOutputStream(file)) {
        IOUtils.copy(inputStream, outputStream);
    } catch (Exception e) {
        System.out.println("error in copying data from one file to another");
    }
    dataSource = new FileDataSource(file);
    System.out.println("added datasource in the list");
    attachmentsList.add(dataSource);
} catch (IOException e) {
    e.printStackTrace();
}
Upon going through this code, I realized that the issue was not in this code but in how I was setting the name of the File. I was setting the filename without any extension; for example, setting it to "temporary" caused the documents to be saved with a .tmp extension. All I had to do was append the object's extension to its name ("temporary.docx"); this solved the issue, and the attachments were sent properly and were accessible.
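In code, a sketch of that fix, reusing the extension already parsed from the S3 key (the "attachment" prefix is an illustrative choice):

import java.io.File;
import java.io.IOException;

// Creates a temp file whose suffix matches the object's extension, so the
// email attachment stays openable; falls back to ".tmp" if the key has none.
static File tempFileFor(String key) throws IOException {
    int index = key.lastIndexOf('.');
    String suffix = (index > 0) ? key.substring(index) : ".tmp"; // e.g. ".docx"
    return File.createTempFile("attachment", suffix);
}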
I'm trying to write an Azure Function in Java. I need to create an Excel file and upload it to a blob container.
When I build the project and the tests run, it works without problems and uploads the file to the container. When I instead debug the project, or deploy it to Azure and invoke it over the internet (calling the service), it doesn't upload the file. It blocks when it tries to upload it.
Can you help me please? I've been stuck on this problem for a few days.
Thank you.
Here is the method that uploads the file:
#FunctionName("FunctionTest")
public HttpResponseMessage run(
#HttpTrigger(
name = "req",
methods = {HttpMethod.GET, HttpMethod.POST},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
final ExecutionContext context) {
context.getLogger().info("Java HTTP trigger processed a request.");
final String queryAccountName = request.getQueryParameters().get("AccountName");
String accountName = request.getBody().orElse(queryAccountName);
final String queryAccountKey = request.getQueryParameters().get("AccountKey");
String accountKey = request.getBody().orElse(queryAccountKey);
context.getLogger().info("Azure Blob storage v12 - Java quickstart sample\n");
// Retrieve the connection string for use with the application. The storage
// connection string is stored in an environment variable on the machine
// running the application called AZURE_STORAGE_CONNECTION_STRING. If the environment variable
// is created after the application is launched in a console or with
// Visual Studio, the shell or application needs to be closed and reloaded
// to take the environment variable into account.
// String connectStr = System.getenv("AZURE_STORAGE_CONNECTION_STRING");
//String connectStr = "DefaultEndpointsProtocol=https;AccountName="+accountName+";AccountKey="+accountKey+";EndpointSuffix=core.windows.net";
// Create a BlobServiceClient object which will be used to create a container client
//BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().connectionString(connectStr).buildClient();
StorageSharedKeyCredential credential = new StorageSharedKeyCredential(accountName, accountKey);
String endpoint = String.format(Locale.ROOT, "https://%s.blob.core.windows.net", accountName);
BlobServiceClient blobServiceClient = new BlobServiceClientBuilder().endpoint(endpoint).credential(credential).buildClient();
//Create a unique name for the container
String containerName = "container-name";
// Create the container and return a container client object
//BlobContainerClient containerClient = blobServiceClient.createBlobContainer(containerName);
BlobContainerClient containerClient = blobServiceClient.getBlobContainerClient(containerName);
// Create a local file in the ./data/ directory for uploading and downloading
/*String pathFile = "./data/";
String fileName = "quickstart" + java.util.UUID.randomUUID() + ".txt";
File localFile = new File(pathFile + fileName);
// Write text to the file
FileWriter writer;
try {
writer = new FileWriter(pathFile + fileName, true);
writer.write("Hello, World!");
writer.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}*/
// Get a reference to a blob
// Upload the blob
String pathFile = System.getenv("TEMP") + "\\";
String fileName = creaReport(context)+".xlsx"; // creating file Excel - IT DOESN'T EVEN WORK WITH TXT FILE
BlobClient blobClient = containerClient.getBlobClient(fileName);
System.out.println("\nUploading to Blob storage as blob:\n\t" + blobClient.getBlobUrl());
blobClient.uploadFromFile(pathFile + fileName, true); // IT BLOCKS HERE
System.out.println("\nListing blobs...");
// List the blob(s) in the container.
for (BlobItem blobItem : containerClient.listBlobs()) {
System.out.println("\t" + blobItem.getName());
}
// Download the blob to a local file
// Append the string "DOWNLOAD" before the .txt extension so that you can see both files.
//String downloadFileName = fileName.replace(".txt", "DOWNLOAD.txt");
String downloadFileName = fileName.replace(".xlsx", "DOWNLOAD.xlsx");
File downloadedFile = new File(pathFile + downloadFileName);
System.out.println("\nDownloading blob to\n\t " + pathFile + downloadFileName);
blobClient.downloadToFile(pathFile + downloadFileName, true);
// Clean up
System.out.println("\nPress the Enter key to begin clean up");
System.console().readLine();
/*System.out.println("Deleting blob container...");
containerClient.delete();*/
System.out.println("Deleting the local source and downloaded files...");
localFile.delete();
downloadedFile.delete();
System.out.println("Done");
return request.createResponseBuilder(HttpStatus.OK).body("Blob uploaded").build();
}
For this problem, I tested it on my side and can summarize the main points below:
The reason for this problem is that files under local/temp are not shared among site instances. You can refer to this page.
I ran into the same problem as you: I deployed my Java function to Azure and manually added a file under the local/temp path. When I then ran the function, it couldn't access the data in the file.
After that, I edited my function code to create a txt file with the code below:
String filePath="d:\\local\\Temp\\test1.txt";
File file = new File(filePath);
try {
FileWriter writer = new FileWriter(file);
writer.write("Test data");
writer.close();
} catch (IOException e1) {
context.getLogger().info(e1.getMessage());
e1.printStackTrace();
}
And then read the file back in the same function with the code below:
InputStream is = null;
int i;
char c;
try {
    is = new FileInputStream(filePath);
    while ((i = is.read()) != -1) {
        c = (char) i;
        context.getLogger().info("===inputstream==" + c);
    }
} catch (Exception e) {
    context.getLogger().info("===try catch error");
    e.printStackTrace();
} finally {
    if (is != null) {
        try {
            is.close();
        } catch (IOException e) {
            context.getLogger().info("===finally error");
            e.printStackTrace();
        }
    }
}
I deployed the code to Azure; the function app runs in a consumption plan (so it will use just one instance if I run it once). Running it, I read the file's data successfully.
Based on the test above, I suggest you do not create the file in local/temp. Instead, create it under d:\home\site\wwwroot; you can create a folder under wwwroot and put the files in that folder. I tested this and it works fine.
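For example, a sketch of writing to the shared folder instead (assuming a Windows App Service where %HOME% maps to the shared d:\home content store; the "data" subfolder and reportBytes are illustrative):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Writes the report into d:\home\site\wwwroot\data, which is shared across
// instances, unlike local/temp.
static Path writeToSharedFolder(byte[] reportBytes, String fileName) throws IOException {
    Path sharedDir = Paths.get(System.getenv("HOME"), "site", "wwwroot", "data");
    Files.createDirectories(sharedDir);
    return Files.write(sharedDir.resolve(fileName), reportBytes);
}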
Hope it helps~
I am trying to detect the file extension of a file passed as an InputStream. The extension is detected correctly, but the file tends to become corrupted after that. Here is my method for detecting the extension:
public static Optional<String> detectFileExtension(InputStream inputStream) {
    // To provide the mark/reset functionality required by Tika.
    InputStream bufferedInputStream = new BufferedInputStream(inputStream);
    String extension = null;
    try {
        MimeTypes mimeRepository = getMimeRepository();
        MediaType mediaType = mimeRepository.detect(bufferedInputStream, new Metadata());
        MimeType mimeType = mimeRepository.forName(mediaType.toString());
        extension = mimeType.getExtension();
        log.info("File Extension detected: {}", extension);
        // Need to reset the input stream position marker since it was updated while detecting the extension
        inputStream.reset();
        bufferedInputStream.close();
    } catch (MimeTypeException | IOException ignored) {
        log.error("Unable to detect extension of the file from the provided stream");
    }
    return Optional.ofNullable(extension);
}

private static MimeTypes getMimeRepository() {
    TikaConfig config = TikaConfig.getDefaultConfig();
    return config.getMimeRepository();
}
Now when I try to save this file after extension detection, again using the same InputStream, like this:
byte[] documentContentByteArray = IOUtils.toByteArray(inputStream);
Optional<String> extension = FileTypeHelper.detectFileExtension(inputStream);
if (extension.isPresent()) {
    fileName = fileName + extension.get();
} else {
    log.warn("File: {} does not have a valid extension", fileName);
}
File file = new File("/tmp/" + fileName);
FileUtils.writeByteArrayToFile(file, documentContentByteArray);
It creates a file, but a corrupted one. I guess that after the stream is consumed in detectFileExtension it is not getting reset properly. If someone has done this before, some guidance would be great; thanks in advance.
I fixed it by not using the same input stream again and again.
I created a new stream to pass for extension detection and the initial stream for creating the file.
byte[] documentContentByteArray = IOUtils.toByteArray(inputStream);

// Extension detection
InputStream extensionDetectionInputStream = new ByteArrayInputStream(documentContentByteArray);
Optional<String> extension = FileTypeHelper.detectFileExtension(extensionDetectionInputStream);
if (extension.isPresent()) {
    fileName = fileName + extension.get();
} else {
    log.warn("File: {} does not have a valid extension", fileName);
}
extensionDetectionInputStream.close();

// File creation
File file = new File("/tmp/" + fileName);
FileUtils.writeByteArrayToFile(file, documentContentByteArray);
If there is a better way to do this by reusing the same stream, that would be great and I'll gladly accept that answer; for now, I am marking this as the accepted answer.
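One hedged way to reuse a single stream would be mark/reset on one BufferedInputStream wrapper, assuming Tika only needs the leading bytes of the file (the 1 MB mark limit is an illustrative guess, not a Tika guarantee):

// Wrap once, mark enough bytes for detection, detect, then reset and read all.
InputStream stream = new BufferedInputStream(inputStream);
stream.mark(1024 * 1024);
Optional<String> extension = FileTypeHelper.detectFileExtension(stream);
stream.reset(); // back to the marked start
byte[] documentContentByteArray = IOUtils.toByteArray(stream);

Since detectFileExtension already calls reset() on the stream it receives, the mark set here is what lets that reset land back at the start.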
Hi to all you Java experts.
I have this piece of code that I finally managed to put together, and it works (it's mostly Java with a little ADF code):
public String upload() {
    UploadedFile myfile = this.getFile();
    FacesContext fctx = FacesContext.getCurrentInstance();
    ServletContext servletCtx =
            (ServletContext) fctx.getExternalContext().getContext();
    String imageDirPath = servletCtx.getRealPath("/");
    String nomdefichier = myfile.getFilename();
    String mimetype = nomdefichier.substring(nomdefichier.length() - 3);
    try {
        InputStream inputStream = myfile.getInputStream();
        BufferedImage input = ImageIO.read(inputStream);
        File outputFile =
                new File(System.getProperty("user.home") + File.separator + this.path + File.separator + nomdefichier);
        ImageIO.write(input, mimetype, outputFile);
    } catch (Exception ex) {
        // handle exception
    }
    FacesMessage message =
            new FacesMessage(mimetype + "Successfully uploaded file " + nomdefichier +
                    " (" + myfile.getLength() + " bytes)" + mimetype);
    fctx.addMessage(null, message);
    return null;
}
This code uploads a picture just fine. I would really like to know if there is a file equivalent of ImageIO.write so that I could upload PDF, DOCX and similar files.
Thanks in advance for any response.
Best regards.
Marc Arbour
A simplified version of your code could be written as follows (omitting some of the JSF-related stuff).
InputStream in = myFile.getInputStream();
Files.copy(in, Paths.get(yourPath));
You can write a byte array or an InputStream to a file with the java.nio.file.Files class (since Java 1.7).
// Saving data from an inputstream to a file:
Files.copy(inputStream, targetPath, copyOptions...);
// Saving a byte array to a file:
Files.write(path, byteArray, openOptions...);
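As a usage sketch in the upload method from the question (keeping the question's myfile and path names; REPLACE_EXISTING is an illustrative choice):

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Copies the uploaded file byte-for-byte, so it works for PDF, DOCX, etc.,
// not just images.
try (InputStream in = myfile.getInputStream()) {
    Path target = Paths.get(System.getProperty("user.home"), path, myfile.getFilename());
    Files.createDirectories(target.getParent());
    Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
}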