Files remain open/locked in Azure Function - unable to delete - java

My program uses a library to upload a file located in an Azure File Share to SharePoint, after which the file is deleted from the Azure File Share. Below is the relevant part of my code; when I run it the file is uploaded correctly, but it isn't removed afterwards because it is still in use by an SMB client (it's "marked for deletion", but is only actually deleted once the Azure Function is disabled).
My guess was that the cause is the InputStream that is opened in wrapper.uploadFile but never closed, yet resource.isOpen() always returns false.
main.class
File file = new File (filepath);
Resource resource = new FileSystemResource(filepath);
PLGSharepointClient wrapper = new PLGSharepointClient(user, passwd, domain, spSiteUrl);
JSONObject jsonMetadata = new JSONObject();
wrapper.uploadFile(spFolder, resource, jsonMetadata);
resource.getInputStream().close();
System.out.println(resource.isOpen());
file.delete();
wrapper.uploadFile
public JSONObject uploadFile(String folder, Resource resource, JSONObject jsonMetadata) throws Exception {
    LOG.debug("Uploading file {} to folder {}", resource.getFilename(), folder);
    JSONObject submeta = new JSONObject();
    submeta.put("type", "SP.ListItem");
    jsonMetadata.put("__metadata", submeta);
    headers = headerHelper.getPostHeaders("");
    headers.remove("Content-Length");
    byte[] resBytes = IOUtils.readFully(resource.getInputStream(), (int) resource.contentLength());
    RequestEntity<byte[]> requestEntity = new RequestEntity<>(resBytes,
        headers, HttpMethod.POST,
        this.tokenHelper.getSharepointSiteUrl(
            "/_api/web/GetFolderByServerRelativeUrl('" + UriUtils.encodeQuery(folder, StandardCharsets.UTF_8) + "')/Files/add(url='"
            + UriUtils.encodeQuery(resource.getFilename(), StandardCharsets.UTF_8) + "',overwrite=true)"
        )
    );
    ResponseEntity<String> responseEntity =
        restTemplate.exchange(requestEntity, String.class);
    String fileInfoStr = responseEntity.getBody();
    LOG.debug("Retrieved response from server with json");
    JSONObject jsonFileInfo = new JSONObject(fileInfoStr);
    String serverRelFileUrl = jsonFileInfo.getJSONObject("d").getString("ServerRelativeUrl");
    LOG.debug("File uploaded to URI {}", serverRelFileUrl);
    String metadata = jsonMetadata.toString();
    headers = headerHelper.getUpdateHeaders(metadata);
    LOG.debug("Updating file adding metadata {}", jsonMetadata);
    RequestEntity<String> requestEntity1 = new RequestEntity<>(metadata,
        headers, HttpMethod.POST,
        this.tokenHelper.getSharepointSiteUrl("/_api/web/GetFileByServerRelativeUrl('" + UriUtils.encodeQuery(serverRelFileUrl, StandardCharsets.UTF_8) + "')/listitemallfields")
    );
    ResponseEntity<String> responseEntity1 =
        restTemplate.exchange(requestEntity1, String.class);
    LOG.debug("Updated file metadata Status {}", responseEntity1.getStatusCode());
    return jsonFileInfo;
}

In your wrapper.uploadFile, close the InputStream that is opened there (the one passed to IOUtils.readFully) and check if this works. Note that FileSystemResource.getInputStream() returns a new stream on each call, so calling resource.getInputStream().close() from main.class only opens and closes a fresh stream; the stream opened during the upload has to be closed inside uploadFile itself, e.g. with try-with-resources.
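A minimal sketch of that change (only the byte-reading part of uploadFile changes; everything else stays as posted):
// Wrap the stream in try-with-resources so the handle on the Azure File Share
// is released as soon as the bytes have been read.
byte[] resBytes;
try (InputStream in = resource.getInputStream()) {
    resBytes = IOUtils.readFully(in, (int) resource.contentLength());
}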

Related

Uploaded files duplicated in my project and AWS S3 bucket

I created a Java program that saves files in Amazon S3 storage. It works, but it saves files not only in the S3 bucket but also in my project directory.
Here is my code that saves files to S3. I suppose the reason it also saves to the project directory is the creation of a file instance with a specified path - File file = new File(timestamp + ".jpg"); - but how can I avoid that and still set the needed file name without saving the file to the project directory?
public void saveFileToStorage(String url, Long timestamp, Integer deviceId) {
    S3Repository repository = new S3Repository(bucketName);
    File file = new File(timestamp + ".jpg");
    try {
        URL link = new URL(url);
        Thread.sleep(1500); // wait until URL is ready for download
        FileUtils.copyURLToFile(link, file);
        repository.uploadFile(timestamp.toString(), file, deviceId.toString() + "/");
    } catch (IOException | InterruptedException e) {
        log.error(e.getMessage() + " - check thread sleep time!");
        throw new RuntimeException(e);
    }
}
Here is my upload method from the repository:
public void uploadFile(String keyName, File file, String folder) {
    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentLength(0);
    s3client.putObject(bucketName, folder, new ByteArrayInputStream(new byte[0]), metadata);
    s3client.putObject(new PutObjectRequest(bucketName, folder + keyName, file));
}
It's quite common to do something similar to what you've done.
I personally like the PutObjectRequest builder.
S3Client client = S3Client.builder().build();
PutObjectRequest request = PutObjectRequest.builder()
    .bucket("bucketName").key("fileName").build();
client.putObject(request, RequestBody.fromFile(new File("filePath")));
To address your problem, what about using RequestBody.fromByteBuffer() instead of RequestBody.fromFile()?
Here you can find an example:
https://stackabuse.com/aws-s3-with-java-uploading-files-creating-and-deleting-s3-buckets/
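A rough sketch of that suggestion applied to the original saveFileToStorage (the method name is hypothetical, IOUtils and ByteBuffer come from Apache Commons IO and java.nio, and bucketName is the same field used above): read the image from the URL straight into memory and upload it with the v2 client, so no local File is ever created.
public void saveFileToStorageWithoutLocalFile(String url, Long timestamp, Integer deviceId) throws IOException {
    // Read the image from the URL directly into memory instead of copying it to a local file
    byte[] bytes = IOUtils.toByteArray(new URL(url));

    S3Client client = S3Client.builder().build();
    PutObjectRequest request = PutObjectRequest.builder()
            .bucket(bucketName)                                   // same bucket as before
            .key(deviceId.toString() + "/" + timestamp + ".jpg")  // "folder" prefix + desired file name
            .build();

    // Upload straight from memory; nothing is written to the project directory
    client.putObject(request, RequestBody.fromByteBuffer(ByteBuffer.wrap(bytes)));
}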

Get file from GCS without downloading it locally

I have a simple Spring Boot microservice that takes care of uploading, retrieving and deleting images to/from Google Cloud Storage. I have the following code for the get request in my service:
public StorageObject getImage(String fileName) throws IOException {
    StorageObject object = storage.objects().get(bucketName, fileName).execute();
    File file = new File("./" + fileName);
    FileOutputStream os = new FileOutputStream(file);
    storage.getRequestFactory()
        .buildGetRequest(new GenericUrl(object.getMediaLink()))
        .execute()
        .download(os);
    object.set("file", file);
    return object;
}
And this is my controller part:
@GetMapping("/get/image/{id}")
public ResponseEntity<byte[]> getImage(@PathVariable("id") Long id) {
    try {
        String fileName = imageService.findImageById(id);
        StorageObject object = gcsService.getImage(fileName);
        byte[] res = Files.toByteArray((File) object.get("file"));
        return ResponseEntity.ok()
            .contentType(MediaType.IMAGE_JPEG)
            .body(res);
    } catch (IOException e) {
        e.printStackTrace();
        throw new RuntimeException("No such file or directory");
    }
}
It all works fine in terms of getting the image in the response, but my problem is that the images also get downloaded to the root directory of the project. Many images are going to be uploaded through this service, so this is an issue. I only want to return the images in the response (as a byte array), without having them downloaded locally. I tried playing with the code but couldn't manage to get it to work the way I want.
I'd suggest streaming the download instead, while skipping the FileChannel operation:
public static void streamObjectDownload(
    String projectId, String bucketName, String objectName, String targetFile
) {
    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    try (ReadChannel reader = storage.reader(BlobId.of(bucketName, objectName));
         FileChannel targetFileChannel = FileChannel.open(Paths.get(targetFile), StandardOpenOption.WRITE)) {
        ByteStreams.copy(reader, targetFileChannel);
        System.out.println(
            "Downloaded object " + objectName
                + " from bucket " + bucketName
                + " to " + targetFile
                + " using a ReadChannel.");
    } catch (IOException e) {
        e.printStackTrace();
    }
}
One can e.g. obtain a FileChannel from a RandomAccessFile:
RandomAccessFile file = new RandomAccessFile(targetFile, "rw");
FileChannel channel = file.getChannel();
While the Spring framework similarly has a GoogleStorageResource:
public OutputStream getOutputStream() throws IOException - Returns the output stream for a Google Cloud Storage file.
Then convert the OutputStream to byte[] (this may be binary or ASCII data); if you write into a ByteArrayOutputStream, that is simply:
byte[] bytes = os.toByteArray();
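Putting that together, a minimal sketch (using the same google-cloud-storage client as the sample above; getImageBytes is a hypothetical helper) that reads the object into a byte[] without ever writing a local file:
public static byte[] getImageBytes(String bucketName, String objectName) {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    // readAllBytes streams the object content straight into memory; no temporary file is created
    return storage.readAllBytes(BlobId.of(bucketName, objectName));
}
The controller can then put those bytes straight into the ResponseEntity<byte[]> body instead of reading them back from a File.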
Would it work for you to create Signed URLs in Cloud Storage to display your images? These URLs give access to storage bucket files for a limited time and then expire, so you would not need to store temporary copies of the images locally, as is suggested in this post.

Contents and format of URI when uploading or downloading file

I have an application which is an API on a server (say 192.168.0.2), to which files (of any format) can be uploaded or from which they can be downloaded.
If an application on another machine on the network (say 192.168.0.3) wants to upload a file to the API, it passes the information in a JSON object, e.g. { "FILE_LOCATION":"file:/192.168.0.3/c:/testDocs/testFile.docx" }
The code in the API goes roughly:
private static void doPost (HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException
{
    String errorMessage = "";
    try
    {
        String src = request.getParameter ("src");
        Object obj = jsonParser.parse (src);
        JSONObject jsonObj = (JSONObject) obj;
        String fileLocation = (String) jsonObj.get ("FILE_LOCATION");
        URI uri = new URI (fileLocation);
        URL url = uri.toURL (); // get URL from your uri object
        URLConnection urlConnection = url.openConnection ();
        InputStream is = urlConnection.getInputStream ();
        System.out.println ("InputStream = " + is);
        if (is != null)
        {
            // create output file, output stream etc
        }
    }
    catch (Exception e)
    {
        errorMessage = e.getMessage ();
        System.out.println (e.getClass().getName () + " : " + errorMessage);
    }
    PrintWriter pw = response.getWriter ();
    pw.append (errorMessage);
}
The system log invariably shows something like:
"java.io.FileNotFoundException : 192.168.0.3/c:/testDocs/testFile.docx (No such file or directory)"
What am I doing wrong? I am convinced that the fault lies in the way I have constructed the String which will be used to create the URI.
That does not look like a valid file: URL. On Windows you can check whether a file on a remote server is accessible by seeing if dir works in CMD.EXE. For example, try the UNC pathname:
dir \\IP_OR_HOSTNAME\NAMEOFSHARE\path\etc\filename.xyz
If that works - and that rather depends on whether the remote server is sharing that filesystem as NAMEOFSHARE - then the equivalent file: encoding of the above would be:
String u = new File("\\\\IP_OR_HOSTNAME\\NAMEOFSHARE\\path\\etc\\filename.xyz").toURL().toString();
==> "file://IP_OR_HOSTNAME/NAMEOFSHARE/path/etc/filename.xyz"

Posting FileList in RestAssured

Currently I use the code below to post a single file using RestAssured.
RestAssured.given().contentType(ContentType.MULTIPART_FORM_DATA.toString()).request().multiPart("files", ScreenshotFile).post().then().statusCode(200);
However, I want to upload multiple files from the FileList mentioned below.
File ScreenShotFolder = new File("C:\\Users\\1451615\\Desktop\\SessionScreenshot\\");
File ScreenShotFiles[] = ScreenShotFolder.listFiles();
I have put a for loop in place to post multiple files in the same request. Please find the code for this below.
File ScreenShotFolder = new File("C:\\Users\\1451615\\Desktop\\SessionScreenshot\\");
File ScreenShotFiles[] = ScreenShotFolder.listFiles();
RestAssured.baseURI = "http://10.141.188.112:7080/PIMSelfService/testing/uploadResultImg";
RequestSpecification request = RestAssured.given().contentType(ContentType.MULTIPART_FORM_DATA.toString()).request();
for (File file : ScreenShotFiles) {
    System.out.println("File name: " + file.getName());
    String FilePath = file.getAbsolutePath();
    File ScreenShotPath = new File(FilePath);
    System.out.println(ScreenShotPath);
    request.multiPart("files", ScreenShotPath);
}
request.post().then().statusCode(200);
ValidatableResponse createAttachemnetResponse = expect()
.given()
.spec(requestSpecification)
.header("content-type", "multipart/form-data")
.multiPart("files-0", new File("testImages/1.jpg"))
.multiPart("files-1", new File("testImages/2.png"))
.multiPart("files-2", new File("testImages/3.png"))
.multiPart("files-3", new File("testImages/4.png"))
.multiPart("files-4", new File("testImages/5.png"))
.formParams("txn_id", transactionId)
.when()
.post(TRANSACTION_BASEPATH + POST_ATTACHMENT)
.then()
.spec(responseSpecification);
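For the folder-of-screenshots case from the question, a sketch that combines the loop with the indexed part names used above (whether the endpoint expects files-0, files-1, ... or a repeated files part is an assumption that depends on the server):
File ScreenShotFolder = new File("C:\\Users\\1451615\\Desktop\\SessionScreenshot\\");
File ScreenShotFiles[] = ScreenShotFolder.listFiles();

RequestSpecification request = RestAssured.given()
    .contentType(ContentType.MULTIPART_FORM_DATA.toString())
    .request();

// Attach each screenshot as its own multipart part, using indexed part names
for (int i = 0; i < ScreenShotFiles.length; i++) {
    request.multiPart("files-" + i, ScreenShotFiles[i]);
}

request.post().then().statusCode(200);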

Downloading a file with file name greater than 255 characters using MS edge

I am using the Spring framework to serve a file for download. I have the following code.
public ResponseEntity<List<Integer>> defExport() throws IllegalStateException, IOException {
    Map<String, Object> resultMap = Maps.newHashMap();
    int status = STATUS_SUCCESS;
    HttpHeaders headers = new HttpHeaders();
    List<Integer> byteList = Lists.newArrayList();
    try {
        File file = transformService.executeExport(something);
        byte[] fileContent = null;
        try (InputStream is = new FileInputStream(file)) {
            fileContent = read(is);
        }
        String fileName = StringUtils.join(something, ".xlsx");
        headers.add("fileName", fileName);
        headers.add("Content-Disposition", StringUtils.join(
            "attachment; filename=\"", URLEncoder.encode(fileName, "UTF8"), "\""));
        for (byte b : fileContent) {
            byteList.add(new Integer(b));
        }
    } catch (Exception e) {
        status = STATUS_ERROR;
    }
    headers.add("ifxResultStatus", String.valueOf(status));
    return new ResponseEntity<>(byteList, headers, HttpStatus.OK);
}
On the JS side, I have the following:
_self.xhr.open('POST', targetUrl, true);
_self.xhr.onreadystatechange = goog.bind(this.blankDlResponseHandler, this);
_self.xhr.setRequestHeader('X-CSRF-TOKEN', project.core.app.View.getCsrfToken());
var form_data = new FormData();
_self.xhr.send(form_data);
When I try to download a file whose name is longer than 255 characters, I can do so successfully in Chrome and IE11 on Windows 10.
However, when I try to do so in MS Edge, the download fails because of the long file name.
How can I make it work on MS Edge as well?
Edit
I just realized that in Chrome the file name is always limited to 218 characters and the trailing characters are trimmed.
So, my new question is: how can I limit the file name to 218 characters, especially in the case where a file with the same name already exists (file names then get (1), (2) and so on appended)?
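A minimal sketch of the truncation described in the edit, assuming the 218-character limit applies to the whole file name including the extension (trimToLimit is a hypothetical helper; the browser itself appends (1), (2), ... when a file with the same name already exists):
// Hypothetical helper: trim the base name so that name + extension stays within
// the given limit, keeping the extension intact.
static String trimToLimit(String fileName, int maxLength) {
    if (fileName.length() <= maxLength) {
        return fileName;
    }
    int dot = fileName.lastIndexOf('.');
    String extension = dot >= 0 ? fileName.substring(dot) : "";
    String baseName = dot >= 0 ? fileName.substring(0, dot) : fileName;
    int keep = Math.max(0, maxLength - extension.length());
    return baseName.substring(0, Math.min(baseName.length(), keep)) + extension;
}

// Usage in defExport, before the headers are set:
// String fileName = trimToLimit(StringUtils.join(something, ".xlsx"), 218);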
