I am trying to upload files to Amazon S3 storage using Amazon’s Java API for it. The code is
byte[] b = data.getBytes();
InputStream stream = new ByteArrayInputStream(b);
//InputStream stream = new FileInputStream(new File("D:/samples/test.txt"));
AWSCredentials credentials = new BasicAWSCredentials("<key>", "<key1>");
AmazonS3 s3client = new AmazonS3Client(credentials);
s3client.putObject(new PutObjectRequest("myBucket",name,stream, new ObjectMetadata()));
When I run the code with the first two lines commented out and the third line uncommented, i.e. stream is a FileInputStream, the file is uploaded correctly. But when data is a Base64-encoded String containing image data, the file is uploaded but the image is corrupted.
Amazon's documentation says I need to create and attach a POST policy and signature for this to work. How can I do that in Java? I am not using an HTML form for uploading.
First you should remove data:image/png;base64, from the beginning of the string:
Sample Code Block:
// Drop everything up to and including the comma (the data-URI prefix), then Base64-decode the rest.
byte[] bI = org.apache.commons.codec.binary.Base64.decodeBase64(base64Data.substring(base64Data.indexOf(",") + 1));
InputStream fis = new ByteArrayInputStream(bI);
AmazonS3 s3 = new AmazonS3Client();
Region usWest02 = Region.getRegion(Regions.US_WEST_2);
s3.setRegion(usWest02);
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(bI.length); // set the length explicitly so the SDK does not have to buffer the whole stream
metadata.setContentType("image/png");
metadata.setCacheControl("public, max-age=31536000");
s3.putObject(BUCKET_NAME, filename, fis, metadata);
s3.setObjectAcl(BUCKET_NAME, filename, CannedAccessControlList.PublicRead);
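If you would rather not depend on commons-codec, the JDK's java.util.Base64 (Java 8+) does the same job. A minimal sketch, assuming base64Data still carries the data: URI prefix:
// Strip the "data:image/png;base64," prefix, then decode with the JDK decoder.
String payload = base64Data.substring(base64Data.indexOf(',') + 1);
byte[] bytes = java.util.Base64.getDecoder().decode(payload);
InputStream fis = new ByteArrayInputStream(bytes);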
Here's a DTO class that takes the base64 image data passed in directly from your client and parses it into components that can easily be handed to your uploadToAwsS3 method:
public class Base64ImageDto {
private byte[] imageBytes;
private String fileName;
private String fileType;
private boolean hasErrors;
private List<String> errorMessages;
private static final List<String> VALID_FILE_TYPES = new ArrayList<String>(3);
static {
VALID_FILE_TYPES.add("jpg");
VALID_FILE_TYPES.add("jpeg");
VALID_FILE_TYPES.add("png");
}
public Base64ImageDto(String b64ImageData, String fileName) {
this.fileName = fileName;
this.errorMessages = new ArrayList<String>(2);
String[] base64Components = b64ImageData.split(",");
if (base64Components.length != 2) {
this.hasErrors = true;
this.errorMessages.add("Invalid base64 data: " + b64ImageData);
}
if (!this.hasErrors) {
String base64Data = base64Components[0];
this.fileType = base64Data.substring(base64Data.indexOf('/') + 1, base64Data.indexOf(';'));
if (!VALID_FILE_TYPES.contains(fileType)) {
this.hasErrors = true;
this.errorMessages.add("Invalid file type: " + fileType);
}
if (!this.hasErrors) {
String base64Image = base64Components[1];
this.imageBytes = javax.xml.bind.DatatypeConverter.parseBase64Binary(base64Image);
}
}
}
public byte[] getImageBytes() {
return imageBytes;
}
public void setImageBytes(byte[] imageBytes) {
this.imageBytes = imageBytes;
}
public boolean isHasErrors() {
return hasErrors;
}
public void setHasErrors(boolean hasErrors) {
this.hasErrors = hasErrors;
}
public List<String> getErrorMessages() {
return errorMessages;
}
public void setErrorMessages(List<String> errorMessages) {
this.errorMessages = errorMessages;
}
public String getFileType() {
return fileType;
}
public void setFileType(String fileType) {
this.fileType = fileType;
}
public String getFileName() {
return fileName;
}
public void setFileName(String fileName) {
this.fileName = fileName;
}
}
And here's the method you can add to your AwsS3Service that will put the object up there (note: if you are not using a TransferManager for your puts, you'll need to adjust that part accordingly):
public void uploadBase64Image(Base64ImageDto base64ImageDto, String pathToFile) {
InputStream stream = new ByteArrayInputStream(base64ImageDto.getImageBytes());
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(base64ImageDto.getImageBytes().length);
metadata.setContentType("image/"+base64ImageDto.getFileType());
String bucketName = awsS3Configuration.getBucketName();
String key = pathToFile + base64ImageDto.getFileName();
try {
LOGGER.info("Uploading file " + base64ImageDto.getFileName() + " to AWS S3");
PutObjectRequest objectRequest = new PutObjectRequest(bucketName, key, stream, metadata);
objectRequest.setCannedAcl(CannedAccessControlList.PublicRead);
Upload s3FileUpload = s3TransferManager.upload(objectRequest);
s3FileUpload.waitForCompletion();
} catch (Exception e) {
e.printStackTrace();
LOGGER.info("Error uploading file " + base64ImageDto.getFileName() + " to AWS S3");
}
}
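Usage is then a couple of lines. A sketch, assuming the service has its TransferManager and bucket configuration wired in; the "avatars/" prefix is just an illustrative key path:
Base64ImageDto dto = new Base64ImageDto(base64ImageData, "profile.png");
if (!dto.isHasErrors()) {
    awsS3Service.uploadBase64Image(dto, "avatars/");
} else {
    dto.getErrorMessages().forEach(LOGGER::warn);
}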
For those who use a later SDK:
implementation group: 'software.amazon.awssdk', name: 's3', version: '2.10.3'
byte[] bI = Base64.decodeBase64((base64Data.substring(base64Data.indexOf(",") + 1)).getBytes());
InputStream fis = new ByteArrayInputStream(bI);
amazonS3Client.putObject(PutObjectRequest.builder().bucket(bucketName).key(fileName)
.contentType(contentType)
.contentLength(Long.valueOf(bI.length))
.build(),
RequestBody.fromInputStream(fis, Long.valueOf(bI.length)));
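Since the decoded bytes are already in memory, v2 also lets you skip the InputStream and hand the array straight to RequestBody.fromBytes, which works out the content length for you. A sketch with the same variables:
byte[] bI = Base64.decodeBase64(base64Data.substring(base64Data.indexOf(",") + 1));
amazonS3Client.putObject(PutObjectRequest.builder()
        .bucket(bucketName)
        .key(fileName)
        .contentType(contentType)
        .build(),
    RequestBody.fromBytes(bI)); // content length is derived from the array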
Sample code for uploading images (png/jpg) is as follows:
try {
BasicAWSCredentials awsCreds = new BasicAWSCredentials(accessKey, secretKey);
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withRegion(clientRegion)
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .build();
PutObjectRequest request = new PutObjectRequest(bucketName, fileName, new File(fileToUpload));
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentType("image/jpg");
request.setMetadata(metadata);
s3Client.putObject(request.withCannedAcl(CannedAccessControlList.PublicRead));
logger.info("File " + fileToUpload + " uploaded to AWS bucket " + bucketName);
} catch (AmazonServiceException e) {
logger.error(e);
fileName = Common.NO_VALUE.toString();
} catch (SdkClientException e) {
logger.error(e);
fileName = Common.NO_VALUE.toString();
}
However, I did not use any concept of encoding or decoding. This plain, simple metadata content type of "image/jpg" worked.
Related
I have byte[] zipFileAsByteArray
This zip file has the following structure:
rootDir
|--- Folder1/first.txt
|--- Folder2/second.txt
|--- PictureFolder/image.png
What I need is to get two txt files and read them, without saving any files on disk. Just do it in memory.
I tried something like this:
ByteArrayInputStream bis = new ByteArrayInputStream(processZip);
ZipInputStream zis = new ZipInputStream(bis);
Also I will need a separate method to get the picture. Something like this:
public byte[] getImage(byte[] zipContent);
Can someone help me with an idea or a good example of how to do that?
Here is an example:
public static void main(String[] args) throws IOException {
ZipFile zip = new ZipFile("C:\\Users\\mofh\\Desktop\\test.zip");
for (Enumeration<? extends ZipEntry> e = zip.entries(); e.hasMoreElements(); ) {
ZipEntry entry = e.nextElement();
if (!entry.isDirectory()) {
if (FilenameUtils.getExtension(entry.getName()).equals("png")) {
byte[] image = getImage(zip.getInputStream(entry));
//do your thing
} else if (FilenameUtils.getExtension(entry.getName()).equals("txt")) {
StringBuilder out = getTxtFiles(zip.getInputStream(entry));
//do your thing
}
}
}
}
private static StringBuilder getTxtFiles(InputStream in) {
StringBuilder out = new StringBuilder();
BufferedReader reader = new BufferedReader(new InputStreamReader(in));
String line;
try {
while ((line = reader.readLine()) != null) {
out.append(line);
}
} catch (IOException e) {
// do something, probably not a text file
e.printStackTrace();
}
return out;
}
private static byte[] getImage(InputStream in) {
try {
BufferedImage image = ImageIO.read(in); //just checking if the InputStream belongs in fact to an image
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "png", baos);
return baos.toByteArray();
} catch (IOException e) {
// do something, it is not a image
e.printStackTrace();
}
return null;
}
Keep in mind though that I am checking a string to differentiate the possible types, and this is error prone: nothing stops someone from sending another type of file with an expected extension.
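If you want something sturdier than the extension check, you can sniff the entry's content instead. A sketch using the JDK's URLConnection.guessContentTypeFromStream (it needs a mark/reset-capable stream, hence the BufferedInputStream, and returns null when it cannot tell):
try (InputStream in = new BufferedInputStream(zip.getInputStream(entry))) {
    // Peeks at the leading bytes (magic numbers) rather than trusting the file name.
    String guessedType = java.net.URLConnection.guessContentTypeFromStream(in);
    if ("image/png".equals(guessedType)) {
        byte[] image = getImage(in);
    } else {
        StringBuilder out = getTxtFiles(in); // fall back to treating it as text
    }
}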
You can do something like:
public static void main(String args[]) throws Exception
{
//bis, zis as you have
try{
ZipEntry file;
while((file = zis.getNextEntry())!=null) // get next file and continue only if file is not null
{
byte[] b = new byte[(int) file.getSize()]; // note: getSize() can return -1 if the size is not stored in the entry header
int off = 0, n;
while (off < b.length && (n = zis.read(b, off, b.length - off)) != -1) {
off += n; // a single read() is not guaranteed to fill the array
}
if(file.getName().endsWith(".txt")){
// read files. You have data in `b`
}else if(file.getName().endsWith(".png")){
// process image
}
}
}
finally{
zis.close();
}
}
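The separate getImage(byte[]) method from the question can be built the same way, entirely in memory. A sketch, assuming Java 9+ for readAllBytes (which, on a ZipInputStream, reads to the end of the current entry):
public static byte[] getImage(byte[] zipContent) throws IOException {
    try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zipContent))) {
        ZipEntry entry;
        while ((entry = zis.getNextEntry()) != null) {
            if (!entry.isDirectory() && entry.getName().endsWith(".png")) {
                return zis.readAllBytes(); // bytes of the image entry only
            }
        }
    }
    return null; // no image found in the archive
}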
You can use the code below, but make sure your S3 bucket is set up first.
import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.ResponseHeaderOverrides;
import com.amazonaws.services.s3.model.S3Object;
import java.io.*;
import static com.amazonaws.regions.Regions.US_EAST_1;
public class GetObject2 {
public static void main(String[] args) throws IOException {
String bucketName = "Give Yout Bucket Name";
String key = "Give your String Key";
S3Object fullObject = null, objectPortion = null, headerOverrideObject = null;
try {
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
.withRegion(US_EAST_1)
.withCredentials(new ProfileCredentialsProvider())
.build();
// Get an object and print its contents.
System.out.println("Downloading an object");
fullObject = s3Client.getObject(new GetObjectRequest(bucketName, key));
System.out.println("Content-Type: " + fullObject.getObjectMetadata().getContentType());
System.out.println("Content: ");
displayTextInputStream(fullObject.getObjectContent());
File localFile = new File("C:\\awstest.zip");
ObjectMetadata object = s3Client.getObject(new GetObjectRequest(bucketName, key), localFile);
// Get a range of bytes from an object and print the bytes.
GetObjectRequest rangeObjectRequest = new GetObjectRequest(bucketName, key)
.withRange(0, 9);
objectPortion = s3Client.getObject(rangeObjectRequest);
System.out.println("Printing bytes retrieved.");
displayTextInputStream(objectPortion.getObjectContent());
// Get an entire object, overriding the specified response headers, and print the object's content.
ResponseHeaderOverrides headerOverrides = new ResponseHeaderOverrides()
.withCacheControl("No-cache")
.withContentDisposition("attachment; filename=example.txt");
GetObjectRequest getObjectRequestHeaderOverride = new GetObjectRequest(bucketName, key)
.withResponseHeaders(headerOverrides);
headerOverrideObject = s3Client.getObject(getObjectRequestHeaderOverride);
displayTextInputStream(headerOverrideObject.getObjectContent());
} catch (AmazonServiceException e) {
// The call was transmitted successfully, but Amazon S3 couldn't process
// it, so it returned an error response.
e.printStackTrace();
} catch (SdkClientException e) {
// Amazon S3 couldn't be contacted for a response, or the client
// couldn't parse the response from Amazon S3.
e.printStackTrace();
} finally {
// To ensure that the network connection doesn't remain open, close any open input streams.
if (fullObject != null) {
fullObject.close();
}
if (objectPortion != null) {
objectPortion.close();
}
if (headerOverrideObject != null) {
headerOverrideObject.close();
}
}
}
static void displayTextInputStream(InputStream input) throws IOException {
// Read the text input stream one line at a time and display each line.
BufferedReader reader = new BufferedReader(new InputStreamReader(input));
String line = null;
while ((line = reader.readLine()) != null) {
System.out.println(line);
}
System.out.println();
}
}
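Side note: S3Object is Closeable, so when you only need the simple download the try/finally bookkeeping can be collapsed into try-with-resources. A minimal sketch:
try (S3Object object = s3Client.getObject(new GetObjectRequest(bucketName, key))) {
    System.out.println("Content-Type: " + object.getObjectMetadata().getContentType());
    displayTextInputStream(object.getObjectContent());
} // the underlying HTTP connection is released here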
I have two nodes in the production environment. I have placed the PDF files on one server and want to read them from both servers. When I call the 'file' method directly, the PDF is displayed in the browser, but when I call 'pdfFile' nothing is displayed.
public Resolution file(){
try {
final HttpServletRequest request = getContext().getRequest();
String fileName = (String) request.getParameter("file");
File file = new File("pdf file directory ex /root/pdffiles/" + fileName);
getContext().getResponse().setContentType("application/pdf");
getContext().getResponse().addHeader("Content-Disposition",
"inline; filename=" + fileName);
FileInputStream streamIn = new FileInputStream(file);
BufferedInputStream buf = new BufferedInputStream(streamIn);
int readBytes = 0;
ServletOutputStream stream = getContext().getResponse().getOutputStream();
// read from the file; write to the ServletOutputStream
while ((readBytes = buf.read()) != -1)
stream.write(readBytes);
} catch (Exception exc) {
LOGGER.logError("reports", exc);
}
return null;
}
public Resolution pdfFile() {
final HttpServletRequest request = getContext().getRequest();
final HttpClient client = new HttpClient();
try {
String fileName = (String) request.getParameter("file");
final String url = "http://" + serverNameNode1 //having pdf files
+ "/test/sm.action?reports&file=" + fileName;
final PostMethod method = new PostMethod(url);
try {
client.executeMethod(method);
} finally {
method.releaseConnection();
}
} catch (final Exception e) {
LOGGER.logError("pdfReports", "error occured2 " + e.getMessage());
}
return null;
}
Including the part of the code below after 'client.executeMethod(method);' in the 'pdfFile()' method made it work for me.
buf = new BufferedInputStream(method.getResponseBodyAsStream());
int readBytes = 0;
stream = getContext().getResponse().getOutputStream();
// write to the ServletOutputStream
while ((readBytes = buf.read()) != -1)
stream.write(readBytes);
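It is also worth setting the response headers before that copy loop, otherwise the browser has no hint that the proxied bytes are a PDF. A short sketch mirroring what file() already does:
// Set these before streaming the proxied bytes, just as file() does.
getContext().getResponse().setContentType("application/pdf");
getContext().getResponse().addHeader("Content-Disposition", "inline; filename=" + fileName);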
I am trying to create a file upload API using Jersey. I would like to obtain details about the upload progress on the server side (is it possible?). Searching the web, the suggestion was to use a stream to transfer the file. But even with the approach described below, the server only executes the 'putFile' method after the file has arrived completely. Another problem is that this code only works for small files; it fails when I try a file larger than 40 MB.
#Path("/file")
public class LargeUpload {
private static final String SERVER_UPLOAD_LOCATION_FOLDER = "/Users/diego/Documents/uploads/";
@PUT
@Path("/upload/{attachmentName}")
@Consumes(MediaType.APPLICATION_OCTET_STREAM)
public Response putFile(@PathParam("attachmentName") String attachmentName,
InputStream fileInputStream) throws Throwable {
String filePath = SERVER_UPLOAD_LOCATION_FOLDER + attachmentName;
saveFile(fileInputStream, filePath);
String output = "File saved to server location : ";
return Response.status(200).entity(output).build();
}
// save uploaded file to a defined location on the server
private void saveFile(InputStream uploadedInputStream, String serverLocation) {
try {
OutputStream outpuStream = new FileOutputStream(new File(serverLocation));
int read = 0;
byte[] bytes = new byte[1024];
while ((read = uploadedInputStream.read(bytes)) != -1) {
outpuStream.write(bytes, 0, read);
}
outpuStream.flush();
outpuStream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public static void main(String[] args) throws FileNotFoundException {
ClientConfig config = new ClientConfig();
config.property(ClientProperties.CHUNKED_ENCODING_SIZE, 1024);
Client client = ClientBuilder.newClient(config);
File fileName = new File("/Users/diego/Movies/ff.mp4");
InputStream fileInStream = new FileInputStream(fileName);
String sContentDisposition = "attachment; filename=\"" + fileName.getName()+"\"";
Response response = client.target("http://localhost:8080").path("upload-controller/webapi/file/upload/"+fileName.getName()).
request(MediaType.APPLICATION_OCTET_STREAM).header("Content-Disposition", sContentDisposition).
put(Entity.entity(fileInStream, MediaType.APPLICATION_OCTET_STREAM));
System.out.println(response);
}
}
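For reference, server-side progress would boil down to a running total inside the copy loop of saveFile. A sketch, where totalBytes is a hypothetical parameter taken from the request's Content-Length header:
private void saveFile(InputStream uploadedInputStream, String serverLocation, long totalBytes) {
    try (OutputStream out = new FileOutputStream(serverLocation)) {
        byte[] buffer = new byte[8192];
        long written = 0;
        int read;
        while ((read = uploadedInputStream.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            written += read;
            if (totalBytes > 0) {
                // crude progress signal; replace with whatever reporting mechanism fits
                System.out.printf("upload progress: %d%%%n", (written * 100) / totalBytes);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}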
I already know where the image is, but for simplicity's sake I wanted to download the image using JSoup itself. (This is to simplify getting cookies, referrer, etc.)
This is what I have so far:
//Open a URL Stream
Response resultImageResponse = Jsoup.connect(imageLocation).cookies(cookies).ignoreContentType(true).execute();
// output here
OutputStreamWriter out = new OutputStreamWriter(new FileOutputStream(new java.io.File(outputFolder + name)));
//BufferedWriter out = new BufferedWriter(new FileWriter(outputFolder + name));
out.write(resultImageResponse.body()); // resultImageResponse.body() is where the image's contents are.
out.close();
I didn't even finish writing the question before I found the answer via JSoup and a little experimentation.
//Open a URL Stream
Response resultImageResponse = Jsoup.connect(imageLocation).cookies(cookies)
.ignoreContentType(true).execute();
// output here
FileOutputStream out = (new FileOutputStream(new java.io.File(outputFolder + name)));
out.write(resultImageResponse.bodyAsBytes()); // bodyAsBytes() holds the image's raw contents
out.close();
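For larger images it can be worth streaming to disk instead of buffering the whole body in memory. A sketch, assuming a reasonably recent jsoup (Connection.Response.bodyStream()), java.nio.file.Files, and lifting the default body-size cap with maxBodySize(0):
Connection.Response response = Jsoup.connect(imageLocation)
        .cookies(cookies)
        .ignoreContentType(true)
        .maxBodySize(0) // don't truncate large bodies
        .execute();
try (InputStream in = response.bodyStream()) {
    Files.copy(in, Paths.get(outputFolder + name), StandardCopyOption.REPLACE_EXISTING);
}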
You can simply use these methods:
public static String storeImageIntoFS(String imageUrl, String fileName, String relativePath) {
String imagePath = null;
try {
byte[] bytes = Jsoup.connect(imageUrl).ignoreContentType(true).execute().bodyAsBytes();
ByteBuffer buffer = ByteBuffer.wrap(bytes);
String rootTargetDirectory = IMAGE_HOME + "/"+relativePath;
imagePath = rootTargetDirectory + "/"+fileName;
saveByteBufferImage(buffer, rootTargetDirectory, fileName);
} catch (IOException e) {
e.printStackTrace();
}
return imagePath;
}
public static void saveByteBufferImage(ByteBuffer imageDataBytes, String rootTargetDirectory, String savedFileName) {
String uploadInputFile = rootTargetDirectory + "/"+savedFileName;
File rootTargetDir = new File(rootTargetDirectory);
if (!rootTargetDir.exists()) {
boolean created = rootTargetDir.mkdirs();
if (!created) {
System.out.println("Error while creating directory for location- "+rootTargetDirectory);
}
}
String[] fileNameParts = savedFileName.split("\\.");
String format = fileNameParts[fileNameParts.length-1];
File file = new File(uploadInputFile);
BufferedImage bufferedImage;
InputStream in = new ByteArrayInputStream(imageDataBytes.array());
try {
bufferedImage = ImageIO.read(in);
ImageIO.write(bufferedImage, format, file);
} catch (IOException e) {
e.printStackTrace();
}
}
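Usage is a single call; a sketch with placeholder arguments:
String savedPath = storeImageIntoFS(
        "https://example.com/images/logo.png", // remote image URL
        "logo.png",                            // file name to store it under
        "site-assets");                        // folder relative to IMAGE_HOME
System.out.println("Stored at: " + savedPath);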
I am writing a small file upload utility thing as part of a larger project. Originally I was handling this from a servlet using the Apache commons File utility classes. Here is a snippet from a quick test client I wrote for the service:
public static void main(String[] args) {
JaxWsProxyFactoryBean factory = new JaxWsProxyFactoryBean();
factory.getInInterceptors().add(new LoggingInInterceptor());
factory.getOutInterceptors().add(new LoggingOutInterceptor());
factory.setServiceClass(FileUploadService.class);
factory.setAddress("http://localhost:8080/FileUploadService/FileUploadService");
FileUploadService client = (FileUploadService) factory.create();
FileType file = new FileType();
file.setName("statemo_1256144312279");
file.setType("xls");
DataSource source = new FileDataSource(new File("c:/development/statemo_1256144312279.xls"));
file.setHandler(new DataHandler(source));
Boolean ret = client.uploadFile(file);
System.out.println (ret);
System.exit(0);
}
This works absolutely fine. Now the problem comes when I am trying to replace the Apache commons utilities. In the above code I am creating a DataSource from a File with an absolute path name. In my servlet, I can't get an absolute path name however and the file I am sending over the wire is empty.
Here is the servlet code:
@SuppressWarnings("unchecked")
protected void doPost (final HttpServletRequest request, final HttpServletResponse response)
throws ServletException, IOException {
// form should have enctype="multipart/form-data" as an attribute
if (!ServletFileUpload.isMultipartContent (request)) {
LOG.info("Invalid form attribute");
return;
}
//DataInputStream in = new DataInputStream(request.getInputStream());
final DiskFileItemFactory factory = new DiskFileItemFactory ();
factory.setSizeThreshold(FILE_THRESHOLD_SIZE);
final ServletFileUpload sfu = new ServletFileUpload (factory);
sfu.setSizeMax(MAX_FILE_SIZE);
final HttpSession session = request.getSession();
final List<FileItem> files = new ArrayList<FileItem>();
final List<String> filesToProcess = new ArrayList<String>();
try {
final List<FileItem> items = sfu.parseRequest(request);
for (final FileItem f : items) {
if (!f.isFormField())
files.add(f);
}
/*for (final FileItem f : files) {
final String absoluteFileName = UPLOAD_DESTINATION + FilenameUtils.getName(f.getName());
//f.write(new File (absoluteFileName));
filesToProcess.add(absoluteFileName);
}*/
FileItem f = files.get(0);
LOG.info("File: " + FilenameUtils.getName(f.getName()));
LOG.info("FileBaseName: " + FilenameUtils.getBaseName(f.getName()));
LOG.info("FileExtension: " + FilenameUtils.getExtension(f.getName()));
FileUploadServiceClient client = new FileUploadServiceClient();
DataSource source = new FileDataSource(new File(f.getName()));
FileType file = new FileType();
file.setHandler(new DataHandler(source));
file.setName(FilenameUtils.getBaseName(f.getName()));
file.setType(FilenameUtils.getExtension(f.getName()));
Boolean ret = client.uploadFile(file);
LOG.info("File uploaded - " + ret);
filesToProcess.add(UPLOAD_DESTINATION + FilenameUtils.getName(f.getName()));
session.setAttribute("filesToProcess", filesToProcess);
final RequestDispatcher dispatcher = request.getRequestDispatcher("Validate");
if (null != dispatcher) {
dispatcher.forward(request, response);
}
} catch (FileUploadException e) {
LOG.info("Exception " + e.getMessage());
e.printStackTrace();
} catch (Exception e) {
LOG.info("Exception " + e.getMessage());
e.printStackTrace();
}
}
I've been working on this for the better part of this morning and am not getting anywhere. Even if I get rid of the Apache commons file stuff completely and handle the parsing of the request myself, I still can't construct the DataSource appropriately.
Thanks!
This was rather simple actually, I just copied over the bytes from the InputStream to the DataSource:
FileItem f = files.get(0);
// there is a problem here where the file being created is empty, since we only have a
// partial path:
DataSource source = new FileDataSource(new File(f.getName()));
// because of the above problem, we are going to copy over the data ourselves:
byte[] sourceBytes = f.get();
OutputStream sourceOS = source.getOutputStream();
sourceOS.write(sourceBytes);
For reference, see the code of commons-email's ByteArrayDataSource.
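If pulling in a byte-array-backed DataSource is acceptable, the JavaMail one saves you from writing the copy yourself. A minimal sketch, assuming javax.mail.util.ByteArrayDataSource is on the classpath (the content type here matches the xls from the question):
byte[] sourceBytes = f.get(); // FileItem contents held by commons-fileupload
DataSource source = new ByteArrayDataSource(sourceBytes, "application/vnd.ms-excel");
FileType file = new FileType();
file.setHandler(new DataHandler(source));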
It sounds odd to try to replace Apache Commons - don't, unless you have a really good reason.
You can get absolute paths in a servlet: getServletContext().getRealPath("/") returns the absolute path of your application, and you can then resolve files relative to it.
In our application there are objects that have InputStream and Name properties. We use the following class to construct a DataSource from those properties.
public class InputStreamDataSource implements DataSource {
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
private final String name;
public InputStreamDataSource(InputStream inputStream, String name) {
this.name = name;
try {
int nRead;
byte[] data = new byte[16384];
while ((nRead = inputStream.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
inputStream.close();
buffer.flush();
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public String getContentType() {
return new MimetypesFileTypeMap().getContentType(name);
}
@Override
public InputStream getInputStream() throws IOException {
return new ByteArrayInputStream(buffer.toByteArray());
}
@Override
public String getName() {
return name;
}
@Override
public OutputStream getOutputStream() throws IOException {
throw new IOException("Read-only data");
}
}
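Usage then looks like this; a sketch, assuming you already have the stream and a file name for the content-type lookup:
DataSource source = new InputStreamDataSource(uploadedInputStream, "statement.xls");
DataHandler handler = new DataHandler(source); // e.g. to attach to a web-service request or a mail part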
Most of the solutions shown here require that the InputStream be closed (read fully into memory). It is possible, though, to wrap the InputStream in a DataSource object without closing it:
private record PipedDataSource(InputStream in, String contentType, String encoding)
implements DataSource, EncodingAware {
public String getContentType() {
return contentType;
}
public InputStream getInputStream() {
return in;
}
public String getName() {
return "PipedDataSource";
}
public OutputStream getOutputStream() throws IOException {
throw new IOException("No OutputStream");
}
@Override
public String getEncoding() {
return encoding;
}
}
The example above also implements EncodingAware. This can prevent the InputStream from being closed by third-party libraries (for example javax.mail.internet.MimeUtility) when they ask the data source for its encoding.
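For illustration, wiring it into a mail part might look like this; a sketch, assuming a JavaMail MimeBodyPart and an already-open stream:
DataSource ds = new PipedDataSource(pdfStream, "application/pdf", "base64");
MimeBodyPart part = new MimeBodyPart();
part.setDataHandler(new DataHandler(ds));
// MimeUtility.getEncoding(ds) now returns "base64" without reading (or closing) the stream.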