I am trying to render an image from a byte array (BufferedArray) using Play Framework. The image is not being rendered with any of the following approaches. Any pointers are really helpful.
Returned a byte[] from a bean and rendered it in main.scala using #bean.property. I can see the data coming through by viewing the page source.
Wrote the image to a temporary location and used the returned URL in an <img> tag. No success.
Used the inline image approach (http://en.wikipedia.org/wiki/Data_URI_scheme), with and without Base64 encoding and with an image file size under 32KB, but still no luck.
Any help/pointers are really appreciated.
I am using Play! Framework 2.1.0. Suppose the image is located at D:\\Images\\juventus.jpg (I am a Windows user). Below is a solution to your problem:
public static Result showImage() {
    try {
        // Read the whole image file into memory (Files.toByteArray is from Guava, com.google.common.io.Files)
        byte[] array = Files.toByteArray(new File("D:\\Images\\juventus.jpg"));
        // Return the bytes with an explicit Content-Type so the browser renders the image
        return ok(array).as("image/jpeg");
    } catch (IOException e) {
        Logger.error("An IO exception occurred while reading the file!", e);
    }
    return internalServerError("An IO exception occurred while reading the file!");
}
That should render the image as a response. Hope this post is useful. ;)
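If the image lives on disk anyway, you could also let Play serve the File object directly instead of reading the bytes yourself. This is only a minimal sketch under the same path assumption, with the Content-Type set explicitly:

public static Result showImageFromFile() {
    // Serve the file directly and set the Content-Type so the browser renders it inline
    return ok(new java.io.File("D:\\Images\\juventus.jpg")).as("image/jpeg");
}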
I need to encode multiple images (of which I have the complete paths) into a video with a certain FPS on Android.
Trials:
How to create a video from an array of images in Android?
Why I couldn't get it to work:
I added the JCodec dependencies to my Gradle file:
compile 'org.jcodec:jcodec:0.2.3'
compile 'org.jcodec:jcodec-android:0.2.2'
I then pasted the code into a function and this is what I get:
As you can see, I managed to import SequenceEncoder (import org.jcodec.api.SequenceEncoder;).
But it doesn't recognize BufferedImage (I think it's because I have to use Bitmap).
And it gives me an error in the SequenceEncoder.
It also doesn't recognize the encodeImage method.
Then I tried the code I found on the JCodec website:
SeekableByteChannel out = null;
try {
    out = NIOUtils.writableFileChannel("/tmp/output.mp4");
    // for Android use: AndroidSequenceEncoder
    AWTSequenceEncoder encoder = new AWTSequenceEncoder(out, Rational.R(25, 1));
    for (...) {
        // Generate the image, for Android use Bitmap
        BufferedImage image = ...;
        // Encode the image
        encoder.encodeImage(image);
    }
    // Finalize the encoding, i.e. clear the buffers, write the header, etc.
    encoder.finish();
} finally {
    NIOUtils.closeQuietly(out);
}
But it doesn't recognize AWTSequenceEncoder, and thus the encodeImage and finish methods.
What am I doing wrong?
OK, I found the answer to the problem; technically it is in the answers to this question:
How to create a video from an array of images in Android?
But it has only two votes, despite being the only one that worked for me and, as far as I can tell, the only one that should work. You cannot use BufferedImage on Android, while the most-voted answer does, and the SequenceEncoder that I couldn't find is replaced with AndroidSequenceEncoder.
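A minimal sketch of what that looks like with jcodec-android 0.2.x, assuming the frames are already available as Bitmap objects and the output path is writable (both are assumptions here, and 25 fps is just an example):

import android.graphics.Bitmap;
import org.jcodec.api.android.AndroidSequenceEncoder;
import org.jcodec.common.io.NIOUtils;
import org.jcodec.common.io.SeekableByteChannel;
import org.jcodec.common.model.Rational;
import java.io.File;
import java.io.IOException;
import java.util.List;

void encodeBitmaps(List<Bitmap> frames, File output) throws IOException {
    SeekableByteChannel out = null;
    try {
        out = NIOUtils.writableFileChannel(output.getAbsolutePath());
        // AndroidSequenceEncoder takes Bitmap instead of BufferedImage
        AndroidSequenceEncoder encoder = new AndroidSequenceEncoder(out, Rational.R(25, 1));
        for (Bitmap frame : frames) {
            encoder.encodeImage(frame);
        }
        // Finalize the encoding (write headers, flush buffers)
        encoder.finish();
    } finally {
        NIOUtils.closeQuietly(out);
    }
}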
I am trying to save an image to an FTP server using the Apache Commons FTPClient.storeFile(...) method, which requires an InputStream. The image starts out as a byte[].
Here is some code:
InputStream is = new ByteArrayInputStream(byteArray);
boolean done = ftpClient.storeFile(remotePath, is);
However, when I download the uploaded image, it looks very strange; even though the dimensions are preserved, the image does not look like it should.
The image after uploading looks like this: [image after uploading]
In reality I do not have access to the original image, but I know it is a picture of the open sea with blue water in it.
Thank you!
So, as VGR accurately pointed out, the Apache Commons FTPClient has two transfer modes, ASCII and binary. By default ASCII mode is used, and one must explicitly force the data to be treated as binary via:
try {
    ftpClient.setFileType(FTP.BINARY_FILE_TYPE);
} catch (IOException ex) {
    System.out.println("Error: " + ex.getMessage());
    ex.printStackTrace();
}
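Putting the two snippets together, a minimal upload sketch could look like this; the host, credentials and remote path are placeholders, passive mode is just a common choice, and exception handling is omitted for brevity:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

FTPClient ftpClient = new FTPClient();
ftpClient.connect("ftp.example.com");
ftpClient.login("user", "password");
ftpClient.enterLocalPassiveMode();
// Crucial for images: transfer the data as binary so the bytes are not mangled
ftpClient.setFileType(FTP.BINARY_FILE_TYPE);

InputStream is = new ByteArrayInputStream(byteArray); // same byte[] as above
boolean done = ftpClient.storeFile("/images/sea.jpg", is);
is.close();
ftpClient.logout();
ftpClient.disconnect();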
I have these lines of code that I have been trying to use to read a PDF file with Apache PDFBox.
private void readPdf() {
    try {
        File PDF_Path = new File("/home/olyjosh/Downloads/my project.pdf");
        PDDocument inputPDF = PDDocument.load(PDF_Path);
        List<PDPage> allPages = inputPDF.getDocumentCatalog().getAllPages();
        PDPage testPage = (PDPage) allPages.get(5);
        System.out.println("Number of pages " + allPages.size());
        PDFPagePanel pdfPanel = new PDFPagePanel();
        jPanel1.add(pdfPanel);
        pdfPanel.setPage(testPage);
        // this.revalidate();
        inputPDF.close();
    } catch (IOException ex) {
        Logger.getLogger(NewJFrame.class.getName()).log(Level.SEVERE, null, ex);
    }
}
I want this PDF to be displayed on a Swing component like a JPanel, but this only displays the panel without the expected content of the PDF file. However, I was able to display the PDF as an image using
convertToImage = testPage.convertToImage();
Please, how do I work around this, or what am I doing wrong?
Apache PDFBox has a mailing list where I was able to ask the same question, and this was the response I got:
This was removed in 2.0 because it made trouble. Obviously, it doesn't work for 1.8 either, at least for you, so why bother?
There are two ways to display, either get a BufferedImage (renderImage / renderImageWithDPI) and display that somehow (see in PDFDebugger how to do it), or renderPageToGraphics which renders to a graphics device object.
If you really want to get the source code of the deleted PDFReader application (which includes PDFPagePanel), use svn to get revision 1702125 or earlier, that should have it. But if it didn't work for you in 1.8, it won't work for you now.
The point is that swing display of PDF pages isn't part of the API, it's part of some tool (now: in PDFDebugger, previously: in PDFReader)
You need to have some understanding of awt / swing. If you don't, learn it, or hire somebody. (That's what we did, and the best is: google paid it, as part of the google summer of code)
Tilman
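Following that advice, a minimal sketch for PDFBox 2.0 would render a page to a BufferedImage with PDFRenderer and show it in the panel via a JLabel; the file path is taken from the question and the 150 DPI value is just an example:

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.swing.ImageIcon;
import javax.swing.JLabel;
import javax.swing.JPanel;
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.rendering.PDFRenderer;

private void showPage(JPanel panel, int pageIndex) throws IOException {
    PDDocument document = PDDocument.load(new File("/home/olyjosh/Downloads/my project.pdf"));
    try {
        PDFRenderer renderer = new PDFRenderer(document);
        // Render the requested page to an image at 150 DPI
        BufferedImage pageImage = renderer.renderImageWithDPI(pageIndex, 150);
        panel.add(new JLabel(new ImageIcon(pageImage)));
        panel.revalidate();
        panel.repaint();
    } finally {
        document.close();
    }
}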
I store the images in SQLite by converting the bitmap to a byte array.
Should I do the same thing: get the byte array from the bitmap, then to JSON, then to PHP, and finally to MySQL?
If yes, how can I do that? I could store strings in MySQL from the app, but couldn't do it with byte arrays.
Just convert the byte array into Base64 ... Base64 is technically a string, so JSON it to your web service, convert it back from Base64 there ... and the rest is history.
Here is a link for converting an image to Base64 in Android: How to convert an image into a Base64 string?
BTW, the other guys are right: it's not so efficient to store the image itself in the DB; storing a reference would be much better ... unless you do not want your images sitting right on the SD card, which is a different matter, security-wise!
You can refer to this SO discussion that talks about uploading files to a server using Android.
As a matter of fact, it is not recommended to store binary data in relational databases; refer to this SO discussion. A recommended way would be to store the binary data as a file on the server's disk and simply place the path of the file in the database. This prevents corruption of the data due to discrepancies in the database character set and encodings.
Since you are willing to use a better solution, I am explaining the procedure below.
Instead of storing the image in the database, create a directory specifically for images. After you successfully upload an image, store its path in your database and then use that path to refer to the image.
You can upload the image via a POST request and then store a reference in the database (e.g. images/img1.bmp).
Whenever you need it, you can get the file via an HTTP request (you need to write the PHP that handles that request).
You can access the image using your server's public IP or domain, for example: mydomain.com/app1/images/img1.bmp
This is just one way to do it, so if you are thinking about implementing it, look for file upload examples via POST.
Hope this helps
Use a multipart file upload to send the file to your web service (use a library like Android Asynchronous Http Client to make the job easy). You could then encode the file to Base64 and store it in the database as text, but it is not best practice to store image files in the database; you should save the image as a file on the server and keep the path in MySQL.
public static String imgToBase64(Bitmap bitmap) {
    ByteArrayOutputStream out = null;
    try {
        out = new ByteArrayOutputStream();
        // Compress the bitmap to JPEG (quality 20) into the in-memory stream
        bitmap.compress(Bitmap.CompressFormat.JPEG, 20, out);
        byte[] imgBytes = out.toByteArray();
        return Base64.encodeToString(imgBytes, Base64.DEFAULT);
    } catch (Exception e) {
        return null;
    } finally {
        try {
            if (out != null) {
                out.flush();
                out.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
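From there, a rough sketch of pushing the Base64 string to a PHP endpoint as JSON might look like the following; the URL and the "image" key are assumptions, org.json (bundled with Android) is used to build the payload, and exception handling is omitted for brevity:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import org.json.JSONObject;

// Run this off the main thread (e.g. in an AsyncTask or background thread)
String base64 = imgToBase64(bitmap);
JSONObject payload = new JSONObject();
payload.put("image", base64);

HttpURLConnection conn = (HttpURLConnection) new URL("http://example.com/upload.php").openConnection();
conn.setRequestMethod("POST");
conn.setRequestProperty("Content-Type", "application/json");
conn.setDoOutput(true);
OutputStream os = conn.getOutputStream();
os.write(payload.toString().getBytes("UTF-8"));
os.close();
int status = conn.getResponseCode(); // 200 means the PHP side accepted the upload
conn.disconnect();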
Could anyone help me with the Java framework Stripes?
I am trying to upload an image with stripes:file, resize it on the server, and return it with new StreamingResolution("image/jpeg", ...).
I don't know exactly how to send it with a StreamingResolution, and how can I load the image after the stripes:file element on the JSP?
Many Thanks
Well if you want to show the image on the page after uploading, there are a couple of things to deal with. First, the "POST" is going to have to have some sort of result, and that result probably won't be the image data. That is, having your action (your "event handler", as Stripes people call them) return a StreamingResolution doesn't really make sense unless you just want the user to be able to save the image like a downloaded file.
Thus, your image upload "POST" might involve just a plain resolution that forwards to a result page. Inside that page, you can put an HTML <img> tag that has a "src" set to another Stripes action. Now that action will return the StreamingResolution for your image data.
One problem to solve is where to keep the image data across the two HTML transactions. For that, I'd use a Stripes "flash scope" because it's pretty simple. If your server code is going to store the image in a database anyway, of course, your image action URL would simply include some sort of identifying information.
Assuming you can find the image data, all your server-side handler has to do is to create a StreamingResolution instance that has an implementation for the stream() method. That takes a single parameter (the HttpServletResponse). From that, you'd open up an output stream with response.getOutputStream(), copy the image data out to that, and then close the stream. There's really not much to it. Here's an example that sends out a simple file, but for you the main difference would be keeping track of the image data and of course a different MIME type:
public Resolution image() {
    return new StreamingResolution("text/plain") {
        public void stream(final HttpServletResponse response) throws Exception {
            InputStream sample = null;
            try {
                sample = getResourceAsStream(SAMPLE + getContext().getLocale().toString());
                if (sample == null) sample = getResourceAsStream(SAMPLE + "en_US");
                final OutputStream out = response.getOutputStream();
                final byte buffer[] = new byte[8192];
                out.write((HEADER + "\n").getBytes());
                for (int rc = sample.read(buffer); rc > 0; rc = sample.read(buffer))
                    out.write(buffer, 0, rc);
            } finally {
                if (sample != null) try { sample.close(); } catch (IOException ioe) { }
            }
        }
    };
}
You'd want to also call setAttachment(false); before you start writing out the image data. This example is for a file download, so in my case I want it to be an attachment (and that's the default). If you're responding to a "GET" generated from an <img> tag, however, you don't want it to look like an attachment.
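For the image case specifically, a minimal sketch might look like this; loadImageBytes() is a hypothetical helper that fetches the bytes from flash scope or a database, and the MIME type assumes a JPEG:

public Resolution image() {
    byte[] imageData = loadImageBytes(); // hypothetical helper: flash scope, database, etc.
    // Stream the bytes back with an image MIME type, inline rather than as an attachment
    return new StreamingResolution("image/jpeg", new ByteArrayInputStream(imageData))
            .setAttachment(false);
}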