Heap exception when writing a file on my server - Java

I have a problem writing files on the server. I have two approaches to file upload: one with a Spring MultipartFile, which works fine, and one with a byte array, which throws java.lang.OutOfMemoryError: Java heap space when the file is big (I can see the occupied RAM increase with this approach). I use a byte array because I pass the file between REST web services inside an object that also carries the file name.
This is my code for file write:
@Override
public void storeAcquisition(Response response, String toStorePath) throws Exception {
    @SuppressWarnings("unchecked")
    LinkedHashMap<String, String> result = (LinkedHashMap<String, String>) response.getResult();
    // parseBase64Binary decodes the whole file into memory at once
    Files.write(Paths.get(toStorePath + "/" + result.get("name")),
            DatatypeConverter.parseBase64Binary(result.get("content")));
}
This is the sender method:
byte[] file = getFile(filePath);
if (file != null) {
    FileTransfer fileTransfer = new FileTransfer(file, Paths.get(filePath).getFileName().toString());
    //TODO POST TO SERVER
    Response responseSend = new Response(true, true, fileTransfer, null);
    RestTemplate restTemplate = new RestTemplate();
    Response responseStatus = restTemplate.postForObject(serverIp + "ATS/client/file/?toStorePath={toStorePath}",
            responseSend, Response.class, toStorePath);
    return responseStatus;
}
and
public byte[] getFile(String path) throws Exception {
    if (Files.exists(Paths.get(path)))
        return Files.readAllBytes(Paths.get(path));
    else
        return null;
}
Is there a problem in my code, or is the only solution to use a server with more memory?
The problem seems to be in restTemplate.postForObject: if I put a breakpoint on the LinkedHashMap<String,String> result line, it never gets called.
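One direction that might help (a sketch, not a drop-in fix): the byte-array approach holds the whole file, plus a Base64 copy of it, in the heap on both sides. Since a Spring MultipartFile endpoint already works, the upload could be streamed from disk instead. The endpoint path ATS/client/upload below is made up; the key points are FileSystemResource, which streams the file, and turning off request buffering on the RestTemplate:

import org.springframework.core.io.FileSystemResource;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.client.SimpleClientHttpRequestFactory;
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public Response sendFileStreaming(String serverIp, String filePath, String toStorePath) {
    // Don't buffer the whole request body in memory before sending it
    SimpleClientHttpRequestFactory factory = new SimpleClientHttpRequestFactory();
    factory.setBufferRequestBody(false);
    RestTemplate restTemplate = new RestTemplate(factory);

    MultiValueMap<String, Object> body = new LinkedMultiValueMap<>();
    body.add("file", new FileSystemResource(filePath)); // streamed from disk, not read into a byte[]

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.MULTIPART_FORM_DATA);

    // "ATS/client/upload" is a hypothetical multipart endpoint, not the existing one
    return restTemplate.postForObject(serverIp + "ATS/client/upload/?toStorePath={toStorePath}",
            new HttpEntity<>(body, headers), Response.class, toStorePath);
}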

Related

How to create a POST request for a file upload

I am currently uploading files via the API of OpenText Content Server 21.2. I have already implemented most of the method calls: I can create folders, delete files, and so on. However, I am currently failing at the file upload. Mainly PDFs and images (JPEG, PNG, etc.) need to be uploaded.
The current API documentation can be found here:
https://developer.opentext.com/apis/14ba85a7-4693-48d3-8c93-9214c663edd2/d7540c64-7da2-4554-9966-069c56a9341d/a35ce271-5bb7-4bcf-b672-0c8bcf747091#operation/createNode2
My current code looks like this:
@Override
public ClientResponse saveFile(String sessionId, String folderId, File document, String filename) throws DmsException, IOException {
    client = ClientHelper.createClient();
    client.addFilter(new LoggingFilter());
    Builder webResource = client.resource(getRestUri() + REST_CREATE).header("otcsticket", sessionId);
    MultivaluedMap<String, String> postBody = new MultivaluedMapImpl();
    postBody.add("name", filename);
    postBody.add("type", TYPE_FILE);
    postBody.add("parent_id", folderId);
    postBody.add("file", FileUtils.readFileToString(document, StandardCharsets.UTF_8));
    ClientResponse response = webResource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).post(ClientResponse.class, postBody);
    if (response.getStatus() == 200) {
        return response;
    } else {
        System.out.println(response.toString());
        System.out.println(response.getEntity(String.class));
        throw new DmsException("XYZ-001", "XYZ: HTTP-CODE "
                + response.getStatusInfo().getStatusCode() + " - " + response.getStatusInfo().getReasonPhrase());
    }
}
The call returns HTTP status 200 OK. However, no file is created; a folder is created instead. The API description for creating a folder is identical except that no file is passed, so I assume the file parameter is being ignored.
PS: I am using Jersey 1.19.1
I'm asking for help and am grateful for any answer.
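One possible direction, offered as a sketch rather than a verified fix: file uploads are normally sent as multipart/form-data, whereas the code above sends the file content as a URL-encoded string, which the server appears to ignore. With Jersey 1.19 the jersey-multipart module can build such a request; the field names are taken from the code above, everything else is an assumption:

import javax.ws.rs.core.MediaType;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.multipart.FormDataMultiPart;
import com.sun.jersey.multipart.file.FileDataBodyPart;

// Sketch: send the node metadata and the file as multipart/form-data instead of
// application/x-www-form-urlencoded (assumes jersey-multipart 1.19 is on the classpath).
FormDataMultiPart multiPart = new FormDataMultiPart();
multiPart.field("name", filename);
multiPart.field("type", TYPE_FILE);
multiPart.field("parent_id", folderId);
multiPart.bodyPart(new FileDataBodyPart("file", document, MediaType.APPLICATION_OCTET_STREAM_TYPE));

ClientResponse response = webResource
        .type(MediaType.MULTIPART_FORM_DATA_TYPE)
        .post(ClientResponse.class, multiPart);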

Custom sized Flux DataBuffer within WebSocket

I'm trying to build a live video stream where, on the backend, chunks are received, stored in a DataBuffer and continuously written to a file. But instead of an ever-growing DataBuffer and file, I want a DataBuffer that is limited in size and (hopefully) works like a FIFO, so that when new chunks are pushed in, older chunks are pushed out.
What I tried is creating a DataBuffer with a DataBufferFactory and allocateBuffer(), but this always gives me type-mismatch errors, because I have the Flux<DataBuffer> videoDataFlux containing the chunk data on the one hand and a DataBufferFactory on the other, and I can't see how to combine them.
So, what's the way to go?
What is working so far:
@Override
public Mono<Void> handle(WebSocketSession webSocketSession) {
    String filename = "strm";
    Path path = FileSystems.getDefault().getPath(
            "C:\\Program Files (x86)\\Apache Software Foundation\\Tomcat 9.0\\webapps\\stream\\videos");
    Flux<DataBuffer> videoDataFlux = webSocketSession.receive()
            .map(WebSocketMessage::getPayload);
    try {
        Path file = Files.createTempFile(path, filename, ".webm");
        AsynchronousFileChannel channel = AsynchronousFileChannel.open(file, StandardOpenOption.WRITE);
        return DataBufferUtils.write(videoDataFlux, channel, 0)
                .then()
                .doOnNext(s -> {
                    if ((!videoDataFlux.equals(null)) && (!webSocketSession.equals(null))) {
                        DataBufferUtils.write(videoDataFlux, channel, 0).subscribe();
                    }
                });
    } catch (IOException e) {
        // ignored
    }
    return null;
}
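One way to get the FIFO behaviour (a sketch only, and it leaves out the file writing): instead of allocating a DataBuffer yourself, keep the incoming chunk buffers in a bounded deque and release the ones you evict. maxChunks is a made-up cap; retain() is needed so the payload buffers stay valid after the WebSocketMessage is released:

import java.util.ArrayDeque;
import java.util.Deque;
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.web.reactive.socket.WebSocketSession;
import reactor.core.publisher.Mono;

public Mono<Void> handleBounded(WebSocketSession session) {
    final int maxChunks = 64;                        // hypothetical window size
    final Deque<DataBuffer> window = new ArrayDeque<>();

    return session.receive()
            .map(message -> message.retain().getPayload()) // keep the buffer alive past this message
            .doOnNext(buffer -> {
                synchronized (window) {
                    window.addLast(buffer);
                    while (window.size() > maxChunks) {
                        // drop the oldest chunk and give its memory back to the pool
                        DataBufferUtils.release(window.removeFirst());
                    }
                }
            })
            .then();
}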

How to copy S3 object from one region to another when vpc endpoint is enabled

Recently I was unable to copy files using s3.copyObject(sourceBucket, sourceKey, destBucket, destKey); for two reasons:
1) The source and destination buckets are in two different regions (us-east-1 and us-east-2 in my case).
2) The server resides in a VPC which has an S3 endpoint enabled. An S3 endpoint is an internal connection to S3, but only within the same region.
Given that we are moving large files, downloading and then re-uploading them, even temporarily, was not an option. We also wanted to keep the S3 endpoint in place, because the application makes heavy use of S3 assets once it is in region.
The solution is to stream-copy the file from the source client to the destination client. I wrote this simple function to handle it.
ZipException is just a custom exception. Throw whatever you want.
Hopefully this helps somebody.
public static void copyObject(AmazonS3 sourceClient, AmazonS3 destClient, String sourceBucket,
        String sourceKey, String destBucket, String destKey) throws IOException {
    S3ObjectInputStream inStream = null;
    try {
        GetObjectRequest request = new GetObjectRequest(sourceBucket, sourceKey);
        S3Object object = sourceClient.getObject(request);
        inStream = object.getObjectContent();
        // Stream the source object's content straight into the destination bucket;
        // nothing is written to local disk.
        destClient.putObject(destBucket, destKey, inStream, object.getObjectMetadata());
    } catch (SdkClientException e) {
        throw new ZipException("Unable to copy file.", e);
    } finally {
        if (inStream != null) {
            inStream.close();
        }
    }
}
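For completeness, a usage sketch (bucket names and keys are placeholders; the regions mirror the us-east-1 / us-east-2 example above): build one client per region and pass both to copyObject.

import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

// One client per region; credentials come from the default provider chain.
AmazonS3 sourceClient = AmazonS3ClientBuilder.standard().withRegion(Regions.US_EAST_1).build();
AmazonS3 destClient = AmazonS3ClientBuilder.standard().withRegion(Regions.US_EAST_2).build();

copyObject(sourceClient, destClient, "source-bucket", "path/to/file.zip",
        "dest-bucket", "path/to/file.zip");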

Java File URI error?

I need to get a file object online, and I know the file is located at: http://nmjava.com/Dir_App_IDs/Dir_GlassPaneDemo/GlassPaneDemo_2010_04_06_15_00_SNGRGLJAMX
If I paste it into my browser's URL bar, I can download the file. Now I'm trying to get it with Java; my code looks like this:
String File_Url = "http://nmjava.com/Dir_App_IDs/Dir_GlassPaneDemo/GlassPaneDemo_2010_04_06_15_00_SNGRGLJAMX";
Object myObject = Get_Online_File(new URI(File_Url));

Object Get_Online_File(URI File_Uri) throws IOException
{
    return readObject(new ObjectInputStream(new FileInputStream(new File(File_Uri))));
}

public static synchronized Object readObject(ObjectInput in) throws IOException
{
    Object o;
    ......
    return o;
}
But I got the following error message:
java.lang.IllegalArgumentException: URI scheme is not "file"
at java.io.File.<init>(File.java:366)
Why? How can I fix it?
Frank
I'm not sure FileInputStream is designed for reading over the internet... try new URL(File_Url).openConnection().getInputStream() instead.
Don't use FileInputStream for this purpose. Create a URL, then get an input stream and read the data from it.
URL url = new URL(fileUrl);
InputStream inputStream = url.openStream();
readData(inputStream);
For reading the data I recommend the Commons IO library (especially if there are two or more places where you work with streams; it'll save you time and make your code more expressive):
private byte[] readData(InputStream in) throws IOException {
    try {
        return IOUtils.toByteArray(in);
    } finally {
        IOUtils.closeQuietly(in);
    }
}
You also use object streams (like ObjectInputStream) in your code. But that kind of stream should only be used to read serialized Java objects, and as I understand from the description that is not the case here (if it were a serialized object, your browser wouldn't have opened the file).
I got inspired; the correct answer is:
Object myObject = Get_Online_File(new URL(File_Url));

Object Get_Online_File(URL File_Url) throws IOException
{
    return readObject(new ObjectInputStream(File_Url.openConnection().getInputStream()));
    // or: readObject(new ObjectInputStream(File_Url.openStream()));
}
Try "file://nmjava.com/Dir_App_IDs/Dir_GlassPaneDemo/GlassPaneDemo_2010_04_06_15_00_SNGRGLJAMX"

How to clear the screen output of a Java HttpServletResponse

I'm writing to the browser window using servletResponse.getWriter().write(String).
But how do I clear the text which was written previously by some other similar write call?
The short answer is: you cannot. Once the browser receives the response, there is no way to take it back (unless there is some way to abnormally stop an HTTP response and force the client to reload the page, or something to that effect).
Probably the last place a response can be "cleared", in a sense, is via the ServletResponse.reset method, which according to the Servlet Specification resets the buffer of the servlet's response.
However, this method has a catch: it only works if the buffer has not yet been committed (i.e. sent to the client), for example by the ServletOutputStream's flush method.
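A minimal sketch of that approach, using nothing beyond the standard Servlet API (the example strings are made up): check isCommitted() before calling reset(), because reset() throws an IllegalStateException once the buffer has been flushed to the client.

import java.io.IOException;
import javax.servlet.http.HttpServletResponse;

void writeOrStartOver(HttpServletResponse response) throws IOException {
    response.getWriter().write("Hello, wor"); // partially written output, still in the buffer

    if (!response.isCommitted()) {
        // Discards everything in the buffer: status, headers and the body written so far
        response.reset();
        response.getWriter().write("Something completely different");
    }
    // If the response was already committed, the earlier output is on its way to the client.
}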
You cannot. The best thing is to write to a buffer (StringWriter / StringBuilder) first, so you can replace the written data at any time. Only when you know for sure what the response should be do you write the buffer's content to the response.
On the same matter: any reason to write the response this way and not use some view technology for your output, such as JSP, Velocity, FreeMarker, etc.?
If you have an immediate problem that you need to solve quickly, you could work around this design problem by increasing the size of the response buffer; you'll have to read your application server's docs to see whether this is possible. However, this solution will not scale, as you'll soon run into out-of-memory issues when your site traffic peaks.
No view technology will protect you from this issue. You should design your application to figure out what you're going to show the user before you start writing the response. That means doing all your DB access and business logic ahead of time. This is a common issue I've seen in convoluted system designs that use proxy objects to lazily access the database. E.g. ORM entity relationships are bad news if accessed from your view layer! There's not much you can do about an exception that happens three quarters of the way into a rendered page.
Thinking about it, there might be some way to inject a page redirect via AJAX. Anyone ever heard of a solution like that?
Good luck with re-architecting your design!
I know the post is pretty old, but I just thought I'd share my views on this.
I suppose you could actually use a Filter and an HttpServletResponseWrapper to wrap the response and pass it along the chain.
That is, you can have an output stream in the wrapper class and write to it instead of writing to the original response's output stream. You can clear the wrapper's output stream whenever you please, and finally write its content to the original response's output stream when you are done with your processing.
For example,
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletOutputStream;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpServletResponseWrapper;

public class MyResponseWrapper extends HttpServletResponseWrapper {
    protected ByteArrayOutputStream baos = null;
    protected ServletOutputStream stream = null;
    protected PrintWriter writer = null;
    protected HttpServletResponse origResponse = null;

    public MyResponseWrapper( HttpServletResponse response ) {
        super( response );
        origResponse = response;
    }

    public ServletOutputStream getOutputStream() throws IOException {
        if( writer != null ) {
            throw new IllegalStateException( "getWriter() has already been " +
                    "called for this response" );
        }
        if( stream == null ) {
            baos = new ByteArrayOutputStream();
            stream = new MyServletStream(baos);
        }
        return stream;
    }

    public PrintWriter getWriter() throws IOException {
        if( writer != null ) {
            return writer;
        }
        if( stream != null ) {
            throw new IllegalStateException( "getOutputStream() has already " +
                    "been called for this response" );
        }
        baos = new ByteArrayOutputStream();
        stream = new MyServletStream(baos);
        writer = new PrintWriter( stream );
        return writer;
    }

    public void commitToResponse() throws IOException {
        if( writer != null ) {
            writer.flush(); // push any buffered characters into the ByteArrayOutputStream
        }
        if( baos != null ) {
            origResponse.getOutputStream().write(baos.toByteArray());
        }
        origResponse.flushBuffer(); // HttpServletResponse has flushBuffer(), not flush()
    }

    private static class MyServletStream extends ServletOutputStream {
        ByteArrayOutputStream baos;

        MyServletStream(ByteArrayOutputStream baos) {
            this.baos = baos;
        }

        public void write(int param) throws IOException {
            baos.write(param);
        }
    }

    //other methods you want to implement
}
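To tie it together, here is a sketch of the Filter side (the class name and the point where you decide what to do with the buffer are made up; only the wrapper above comes from the answer): wrap the response, let the chain write into the in-memory buffer, then commit or discard it at the end.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class BufferingFilter implements Filter {

    public void init(FilterConfig filterConfig) { }

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        MyResponseWrapper wrapper = new MyResponseWrapper((HttpServletResponse) response);

        // Downstream servlets/JSPs write into the wrapper's ByteArrayOutputStream,
        // not into the real response, so nothing is committed to the client yet.
        chain.doFilter(request, wrapper);

        // At this point the buffered output could still be inspected, replaced or discarded.
        wrapper.commitToResponse();
    }

    public void destroy() { }
}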
