Java REST - call method after POST to REST

I'm using a Java REST service for a file upload.
The file should land on my server, which it does, and then move to an Amazon S3 bucket.
The upload to the server is fine, but the second call to another method does not work.
I assume there is a timeout issue?
The code that moves the file to Amazon works in another app, but I am not able to get it working within my REST project.
Here is the method:
@POST
@Path("/upload")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public Response uploadFile(@FormDataParam("file") InputStream inputStream,
        @FormDataParam("file") FormDataContentDisposition file,
        @FormDataParam("filename") String filename) {
    Logger log = Logger.getLogger("Mike");
    String response = "";
    File f = null;
    try {
        final String FILE_DESTINATION = "C://uploads//" + file.getFileName();
        f = new File(FILE_DESTINATION);
        OutputStream outputStream = new FileOutputStream(f);
        int size = 0;
        byte[] bytes = new byte[1024];
        while ((size = inputStream.read(bytes)) != -1) {
            outputStream.write(bytes, 0, size);
        }
        outputStream.flush();
        outputStream.close();
        log.info("upload complete for initial file!");

        // move file to Amazon S3 bucket
        AmazonS3 s3 = new AmazonS3Client(
                new ClasspathPropertiesFileCredentialsProvider());
        log.info("trying put request");
        PutObjectRequest request = new PutObjectRequest("site.address.org",
                "/pdf/PDF_Web_Service/work/" + f.getName(), f);
        log.info(f.getName());
        log.info(f.getAbsolutePath());
        s3.putObject(request);
        log.info("put request complete");

        response = "File uploaded " + FILE_DESTINATION;
    } catch (Exception e) {
        e.printStackTrace();
    }
    return Response.status(200).entity(response).build();
}
Specifically, here is the part that is not working. I am not getting any log output from it either:
// move file to Amazon S3 bucket
AmazonS3 s3 = new AmazonS3Client(
        new ClasspathPropertiesFileCredentialsProvider());
log.info("trying put request");
PutObjectRequest request = new PutObjectRequest("site.address.org",
        "/pdf/PDF_Web_Service/work/" + f.getName(), f);
log.info(f.getName());
log.info(f.getAbsolutePath());
s3.putObject(request);
log.info("put request complete");

Michael,
If it's a timeout issue, it's common practice to use Guava's ListenableFuture to chain your tasks together. Your web sequence would then look like this:
a) Client sends the file.
b) Server responds with 200 once the file finishes uploading.
c) Once the server is done saving the file, chain the future to then upload to S3.
Chaining listenable futures is common practice to separate functionality and avoid a timeout by breaking up your code and essentially pipelining it.
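For illustration, here is a minimal sketch of that chaining with Guava. This is not code from the question: saveToDisk and pushToS3 are hypothetical placeholders standing in for the FileOutputStream loop and the AmazonS3Client putObject call shown above.
import com.google.common.util.concurrent.FutureCallback;
import com.google.common.util.concurrent.Futures;
import com.google.common.util.concurrent.ListenableFuture;
import com.google.common.util.concurrent.ListeningExecutorService;
import com.google.common.util.concurrent.MoreExecutors;
import java.io.File;
import java.io.InputStream;
import java.util.concurrent.Executors;

public class UploadPipeline {
    private final ListeningExecutorService executor =
            MoreExecutors.listeningDecorator(Executors.newSingleThreadExecutor());

    public void handleUpload(InputStream inputStream, String fileName) {
        // Step 1: write the uploaded stream to local disk; the HTTP response
        // can be returned as soon as this step completes.
        ListenableFuture<File> localCopy =
                executor.submit(() -> saveToDisk(inputStream, fileName));

        // Step 2: once the local copy exists, push it to S3 on the background
        // executor, so the request thread is never held open for the S3 transfer.
        ListenableFuture<String> s3Upload =
                Futures.transform(localCopy, file -> pushToS3(file), executor);

        // Log the outcome of the chained task instead of silently swallowing failures.
        Futures.addCallback(s3Upload, new FutureCallback<String>() {
            @Override public void onSuccess(String key) { System.out.println("S3 upload done: " + key); }
            @Override public void onFailure(Throwable t) { t.printStackTrace(); }
        }, executor);
    }

    // Hypothetical placeholders: saveToDisk would contain the FileOutputStream loop
    // from the question, and pushToS3 the AmazonS3Client putObject call.
    private File saveToDisk(InputStream in, String fileName) { /* ... */ return new File(fileName); }
    private String pushToS3(File file) { /* ... */ return file.getName(); }
}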
Please let me know if you have any questions!

I moved the Amazon code into the try block and now it works.

Related

How to download a file that gets processed over an HTTP request in Java?

I'm writing a program that builds stuff in a GUI (blah blah blah... irrelevant details), and the user is allowed to export that data as a .tex file which can be compiled to a PDF. Since I don't really want to assume they have a TeX environment installed, I'm using an API (latexonline.cc). That way, I can construct an HTTP GET request, send it to the API, and (hopefully!) get the PDF back as a byte stream. The issue, though, is that when I submit the request, I'm only getting the page data back instead of the data for the PDF. I'm not sure if it's because of how I'm doing my request or not...
Here's the code:
... // preceding code
DataOutputStream dos = new DataOutputStream(new FileOutputStream("test.pdf"));
StringBuilder httpTex = new StringBuilder();
httpTex.append(this.getTexCode(...)); // This appends the TeX code (nothing wrong here)

// Build the URL and HTTP request.
String texURL = "https://latexonline.cc/compile?text=";
String paramURL = URLEncoder.encode(httpTex.toString(), "UTF-8");
URL url = new URL(texURL + paramURL);

byte[] buffer = new byte[1024];
try {
    InputStream is = url.openStream();
    int bufferLen = -1;
    while ((bufferLen = is.read(buffer)) > -1) {
        this.getOutputStream().write(buffer, 0, bufferLen);
    }
    dos.close();
    is.close();
} catch (IOException ex) {
    ex.printStackTrace();
}
Edit: Here's the data I'm getting from the GET request:
https://pastebin.com/qYtGXUsd
Solved! I used a different API and it works perfectly.
https://github.com/YtoTech/latex-on-http
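For anyone who wants to stay with the original approach, here is a minimal, self-contained sketch (not the asker's code, and independent of latex-on-http) that streams the GET response straight to a file with Java 11's HttpClient and prints the status and Content-Type, which makes it obvious whether the server returned a PDF or an HTML page. The TeX string and output path are placeholders.
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;

public class TexDownload {
    public static void main(String[] args) throws Exception {
        // Placeholder TeX source; in the real program this would come from getTexCode().
        String tex = "\\documentclass{article}\\begin{document}Hello\\end{document}";
        String url = "https://latexonline.cc/compile?text="
                + URLEncoder.encode(tex, StandardCharsets.UTF_8);

        HttpClient client = HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NORMAL) // the compile endpoint may redirect to the built PDF
                .build();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();

        // Stream the response body straight to disk instead of buffering it by hand.
        HttpResponse<Path> response =
                client.send(request, HttpResponse.BodyHandlers.ofFile(Path.of("test.pdf")));

        // Check what actually came back before assuming it is a PDF.
        System.out.println("Status: " + response.statusCode());
        System.out.println("Content-Type: "
                + response.headers().firstValue("Content-Type").orElse("unknown"));
    }
}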

How to send large file with Telegram Bot API?

The Telegram Bot API has a 50 MB file size limit for sending files.
I need to send larger files. Is there any way around this?
I know about this project https://github.com/pwrtelegram/pwrtelegram but I couldn't make it work.
Maybe someone has already solved this problem?
One option is to upload the file via the Telegram API and then send it by file_id with the bot.
I am writing the bot in Java using the library https://github.com/rubenlagus/TelegramBots
UPDATE
To solve this problem I used the Telegram API, which has a 1.5 GB limit for big files.
I prefer kotlogram, a great library with good documentation: https://github.com/badoualy/kotlogram
UPDATE 2
An example of how I use this library:
private void uploadToServer(TelegramClient telegramClient, TLInputPeerChannel tlInputPeerChannel,
        Path pathToFile, int partSize) {
    File file = pathToFile.toFile();
    long fileId = getRandomId();
    int totalParts = Math.toIntExact(file.length() / partSize + 1);
    int filePart = 0;
    int offset = filePart * partSize;
    try (InputStream is = new FileInputStream(file)) {
        byte[] buffer = new byte[partSize];
        int read;
        while ((read = is.read(buffer, offset, partSize)) != -1) {
            TLBytes bytes = new TLBytes(buffer, 0, read);
            TLBool tlBool = telegramClient.uploadSaveBigFilePart(fileId, filePart, totalParts, bytes);
            telegramClient.clearSentMessageList();
            filePart++;
        }
    } catch (Exception e) {
        log.error("Error uploading file to server", e);
    } finally {
        telegramClient.close();
    }
    sendToChannel(telegramClient, tlInputPeerChannel, "FILE_NAME.zip", fileId, totalParts);
}
private void sendToChannel(TelegramClient telegramClient, TLInputPeerChannel tlInputPeerChannel,
        String name, long fileId, int totalParts) {
    try {
        String mimeType = name.substring(name.indexOf(".") + 1);
        TLVector<TLAbsDocumentAttribute> attributes = new TLVector<>();
        attributes.add(new TLDocumentAttributeFilename(name));
        TLInputFileBig inputFileBig = new TLInputFileBig(fileId, totalParts, name);
        TLInputMediaUploadedDocument document = new TLInputMediaUploadedDocument(inputFileBig, mimeType, attributes, "", null);
        TLAbsUpdates tlAbsUpdates = telegramClient.messagesSendMedia(false, false, false,
                tlInputPeerChannel, null, document, getRandomId(), null);
    } catch (Exception e) {
        log.error("Error sending file by id into channel", e);
    } finally {
        telegramClient.close();
    }
}
where the TelegramClient and the TLInputPeerChannel can be created as described in the documentation.
DON'T COPY-PASTE; adapt this to your needs.
With a local Telegram Bot API server you are allowed to send an InputStream with a 2000 MB file size limit, raised from the 50 MB default.
If you want to send a file via a Telegram bot, you have three options:
InputStream (10 MB limit for photos, 50 MB for other files)
From an HTTP URL (Telegram will download and send the file; 5 MB max size for photos and 20 MB max for other types of content)
Send cached files by their file_id (there are no limits for files sent this way)
So I recommend storing file_ids beforehand and sending files by those ids (this is recommended by the API docs too); a sketch of that follows.
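A minimal sketch of sending by file_id with OkHttp (the chat id, bot token, and file_id are placeholders; it assumes the file_id was captured from the response of an earlier upload):
import okhttp3.FormBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;

public class SendByFileId {
    public static void main(String[] args) throws Exception {
        OkHttpClient client = new OkHttpClient();

        // Re-sending a cached file: pass the stored file_id as the "document" field.
        RequestBody body = new FormBody.Builder()
                .add("chat_id", "your_chat_id_here")
                .add("document", "previously_stored_file_id")
                .build();

        Request request = new Request.Builder()
                .url("https://api.telegram.org/bot<token>/sendDocument")
                .post(body)
                .build();

        try (Response response = client.newCall(request).execute()) {
            System.out.println(response.body().string());
        }
    }
}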
Using a Local Bot API Server you can send a large file up to 2GB.
GitHub Source Code:
https://github.com/tdlib/telegram-bot-api
Official Documentation
https://core.telegram.org/bots/api#using-a-local-bot-api-server
You can build and install this on your server by following the instructions at https://tdlib.github.io/telegram-bot-api/build.html
Basic setup:
Generate a Telegram application id and hash at https://my.telegram.org/apps
Start the server: ./telegram-bot-api --api-id=<your-app-id> --api-hash=<your-app-hash> --verbosity=20
The default address is http://127.0.0.1:8081/ and the port is 8081.
All the official APIs will work with this setup. Just change the address to http://127.0.0.1:8081/bot<token>/METHOD_NAME (reference: https://core.telegram.org/bots/api).
Example Code:
OkHttpClient client = new OkHttpClient().newBuilder()
        .build();
MediaType mediaType = MediaType.parse("text/plain");
RequestBody body = new MultipartBody.Builder().setType(MultipartBody.FORM)
        .addFormDataPart("chat_id", "your_chat_id_here")
        .addFormDataPart("video", "file_location",
                RequestBody.create(MediaType.parse("application/octet-stream"),
                        new File("file_location")))
        .addFormDataPart("supports_streaming", "true")
        .build();
// http://127.0.0.1:8081/bot<token>/METHOD_NAME
Request request = new Request.Builder()
        .url("http://127.0.0.1:8081/bot<token>/sendVideo")
        .method("POST", body)
        .build();
Response response = client.newCall(request).execute();

Jersey HTTP POST method corrupting non-text files

I have an HTTP POST method that works fine if I upload text files. But if I try to upload a Word document, PDF, zip, gzip, etc., the uploaded files get corrupted in the process.
I'm using Postman to send the request: I do a POST, enter the URL, add headers (I tried all sorts of headers and it really does not change anything, so now I don't have any entered), and then on the body I select "form-data" and select the file. I really just need to fix this to support files that end in .csv.gz and .csv. Currently .csv is fine, but .csv.gz is the type that is corrupting. I tried other non-text files as well just to see what happens and they corrupt too. I cannot figure out if there is some encoding, filter, etc. causing this that I can remove, or some setting I need to apply, or if there is some other way to handle this with Jersey so the non-text files stay the same as the original file.
My application is running Spring v1.5.3 and Jersey 2.25.
@Override
public Response uploadTopicFile(String topic, FormDataMultiPart formDataMultipart) throws Exception {
    List<BodyPart> bodyParts = formDataMultipart.getBodyParts();
    // Getting the body of the request (should be a file)
    for (BodyPart bodyPart : bodyParts) {
        String fileName = bodyPart.getContentDisposition().getFileName();
        InputStream fileInputStream = bodyPart.getEntityAs(InputStream.class);
        String uploadedFileLocation = env.getProperty("temp.upload.path") + File.separator + fileName;
        this.saveFile(fileInputStream, uploadedFileLocation);
        String output = "File uploaded to : " + uploadedFileLocation;
        log.debug(output);
    }
    return Response.status(201).build();
}

private void saveFile(InputStream uploadedInputStream, String serverLocation) {
    try {
        // Create the output directory
        Files.createDirectories(Paths.get(serverLocation).getParent());
        // Get the output stream
        OutputStream outputStream = new FileOutputStream(new File(serverLocation));
        int read = 0;
        byte[] bytes = new byte[1024];
        // Loop through the stream
        while ((read = uploadedInputStream.read(bytes)) != -1) {
            // Output to file
            outputStream.write(bytes, 0, read);
        }
        // Flush and close
        outputStream.flush();
        outputStream.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return;
}
There was a filter causing the corruption. The filter was updated and the issue was resolved.

Azure storage block blob upload from android example

I am using the following code from an Android application to upload a blob to Azure Blob Storage. Note: the sasUrl parameter below is a signed URL acquired from my web service:
// upload file to azure blob storage
private static Boolean upload(String sasUrl, String filePath, String mimeType) {
    try {
        // Get the file data
        File file = new File(filePath);
        if (!file.exists()) {
            return false;
        }
        String absoluteFilePath = file.getAbsolutePath();
        FileInputStream fis = new FileInputStream(absoluteFilePath);
        int bytesRead = 0;
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] b = new byte[1024];
        while ((bytesRead = fis.read(b)) != -1) {
            bos.write(b, 0, bytesRead);
        }
        fis.close();
        byte[] bytes = bos.toByteArray();
        // Post our image data (byte array) to the server
        URL url = new URL(sasUrl.replace("\"", ""));
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        urlConnection.setDoOutput(true);
        urlConnection.setConnectTimeout(15000);
        urlConnection.setReadTimeout(15000);
        urlConnection.setRequestMethod("PUT");
        urlConnection.addRequestProperty("Content-Type", mimeType);
        urlConnection.setRequestProperty("Content-Length", "" + bytes.length);
        urlConnection.setRequestProperty("x-ms-blob-type", "BlockBlob");
        // Write file data to server
        DataOutputStream wr = new DataOutputStream(urlConnection.getOutputStream());
        wr.write(bytes);
        wr.flush();
        wr.close();
        int response = urlConnection.getResponseCode();
        if (response == 201 && urlConnection.getResponseMessage().equals("Created")) {
            return true;
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return false;
}
The code works fine for small blobs, but when a blob reaches a certain size (depending on the phone I am testing with) I start to get out-of-memory exceptions. I would like to split the blobs and upload them in blocks. However, all the examples I find on the web are C#-based and use the Storage Client library. I am looking for a Java/Android example that uploads a blob in blocks using the Azure Storage REST API.
There is an Azure Storage Android library published here. A basic blob storage example is in the samples folder. The method you’d probably like to use is uploadFromFile in the blob class. This will, by default, attempt to put the blob in a single put if the size is less than 64 MB and otherwise send the blob in 4 MB blocks. If you’d like to reduce the 64 MB limit, you can set the singleBlobPutThresholdInBytes property on the BlobRequestOptions object of either the CloudBlobClient (which will affect all requests) or passed to the uploadFromFile method (to affect only that request). The storage library includes many convenient features such as automated retries and a maximum execution timeout across the block put requests, all of which are configurable.
If you’d still like to use a more manual approach, the PutBlock and Put Block List API references are here and provide generic, cross-language documentation. These have nice wrappers in the CloudBlockBlob class of the Azure Storage Android library called uploadBlock and commitBlockList which may save you a lot of time in manual request construction and can provide some of the aforementioned conveniences.
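As a rough sketch of how that might look against a SAS URL like the one in the question (exact overloads can differ between library versions, so treat this as an outline rather than the library's definitive API; the URL and file path are placeholders):
import com.microsoft.azure.storage.OperationContext;
import com.microsoft.azure.storage.blob.BlobRequestOptions;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import java.net.URI;

public class BlockBlobUpload {
    public static void main(String[] args) throws Exception {
        // Assumes the SAS URL points directly at the target blob, as in the question.
        String sasUrl = "https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>";
        CloudBlockBlob blob = new CloudBlockBlob(new URI(sasUrl));

        // Force block uploads for anything larger than 1 MB instead of the 64 MB default,
        // so large files go up in 4 MB blocks rather than one big in-memory buffer.
        BlobRequestOptions options = new BlobRequestOptions();
        options.setSingleBlobPutThresholdInBytes(1024 * 1024);

        // uploadFromFile streams the file from disk, so the whole blob never sits in memory.
        blob.uploadFromFile("/path/to/local/file", null /* access condition */, options, new OperationContext());
    }
}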

Sending Large Image in chunks

I am sending images from my Android client to a Java Jersey RESTful service and I succeeded in doing that. But when I try to send large images, say > 1 MB, it takes more time, so I would like to send the image in chunks. Can anyone help me with this? How do I send (POST) an image stream in chunks to the server?
References used:
server code & client call
server function name
/*** SERVER SIDE CODE ***/
@POST
@Path("/upload/{attachmentName}")
@Consumes(MediaType.APPLICATION_OCTET_STREAM)
public Response uploadAttachment(
        @PathParam("attachmentName") String attachmentName,
        InputStream attachmentInputStream) throws IOException {
    // do something better than this
    OutputStream out = new FileOutputStream("content.txt");
    byte[] buffer = new byte[1024];
    int len;
    while ((len = attachmentInputStream.read(buffer)) != -1) {
        // whatever processing you want here
        out.write(buffer, 0, len);
    }
    out.close();
    return Response.status(201).build();
}
/**********************************************/

/** CLIENT SIDE CODE **/
// .....
client.setChunkedEncodingSize(1024);
WebResource rootResource = client.resource("your-server-base-url");
File file = new File("your-file-path");
InputStream fileInStream = new FileInputStream(file);
String contentDisposition = "attachment; filename=\"" + file.getName() + "\"";
ClientResponse response = rootResource.path("attachment").path("upload").path("your-file-name")
        .type(MediaType.APPLICATION_OCTET_STREAM).header("Content-Disposition", contentDisposition)
        .post(ClientResponse.class, fileInStream);
You should split the file on the client and store each part on the server, and after that merge the parts back together. Take a look at split/merge file on CodeRanch.
Enjoy! :)
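A rough client-side sketch of the splitting step (the /attachment/upload path and the chunkIndex query parameter are hypothetical, and the server would need to append the parts in order, e.g. with a FileOutputStream opened in append mode):
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedUploadClient {

    private static final int CHUNK_SIZE = 512 * 1024; // 512 KB per part

    public static void upload(String baseUrl, String filePath, String fileName) throws Exception {
        byte[] chunk = new byte[CHUNK_SIZE];
        int chunkIndex = 0;
        int read;
        try (InputStream in = new FileInputStream(filePath)) {
            while ((read = in.read(chunk)) != -1) {
                // One POST per part; the server reassembles the parts by chunkIndex.
                URL url = new URL(baseUrl + "/attachment/upload/" + fileName
                        + "?chunkIndex=" + chunkIndex);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Content-Type", "application/octet-stream");
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(chunk, 0, read);
                }
                if (conn.getResponseCode() != 201) {
                    throw new IllegalStateException("Chunk " + chunkIndex + " failed: "
                            + conn.getResponseCode());
                }
                conn.disconnect();
                chunkIndex++;
            }
        }
    }
}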
Another path is available if you don't want to code too much: consider using Apache Commons FileUpload, which is great! :)
