I am trying to create a Lambda function that accepts an image as a multipart object, does some processing, uploads it to an S3 bucket, and returns a response to the user.
I have found some examples of how to proceed; however, I do not understand whether I have to create two Lambda functions and upload separate JARs, or whether it can be done differently.
So far I have a service that parses the multipart request and uploads to S3. My question is how to approach this using AWS Lambdas. Thank you!
public String uploadFile(MultipartFile multipartFile) {
    String fileUrl = "";
    try {
        File file = convertMultiPartToFile(multipartFile);
        String fileName = generateFileName(multipartFile);
        fileUrl = endpointUrl + "/" + bucketName + "/" + fileName;
        uploadFileTos3bucket(fileName, file);
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
    return fileUrl;
}

private File convertMultiPartToFile(MultipartFile file) {
    File convFile = new File(file.getOriginalFilename());
    try (FileOutputStream fos = new FileOutputStream(convFile)) {
        fos.write(file.getBytes());
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
    return convFile;
}

private String generateFileName(MultipartFile multiPart) {
    return new Date().getTime() + "-" + multiPart.getOriginalFilename().replace(" ", "_");
}

private void uploadFileTos3bucket(String fileName, File file) {
    s3client.putObject(new PutObjectRequest(bucketName, fileName, file)
            .withCannedAcl(CannedAccessControlList.PublicRead));
}
I'd suggest you simply use one Lambda function, integrated with an API Gateway endpoint. The user invokes the endpoint (POST) with the file that needs to be uploaded to S3; from the Lambda function you do the rest (processing + uploading to S3) and then return a response to the user.
This could be a starting point.
To get the S3 URL of your uploaded file:
s3Client.getUrl("your_bucket_name", "your_file_key").toExternalForm();
Here is another example that resizes images in S3 using Lambda. It's JS code, though, and uses only one Lambda function.
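To make the single-function shape concrete, here is a rough sketch of what the handler body could look like. Everything here is illustrative: `handleUpload`, the endpoint and bucket values are placeholders, and the actual `putObject` call is left as a comment since it requires the AWS SDK on the classpath. API Gateway delivers binary bodies base64-encoded, so the handler decodes the body, derives a timestamped key (same scheme as `generateFileName` in the question), and returns the resulting object URL.

```java
import java.util.Base64;
import java.util.Date;

public class UploadSketch {

    // Hypothetical handler body: decode the base64 payload API Gateway passes in,
    // derive a key, upload, and return the object URL to the caller.
    static String handleUpload(String base64Body, String originalFilename) {
        byte[] imageBytes = Base64.getDecoder().decode(base64Body);

        // same key scheme as generateFileName(...) in the question
        String fileName = new Date().getTime() + "-" + originalFilename.replace(" ", "_");

        // ... do the image processing on imageBytes here ...

        // with the AWS SDK this is where you would call:
        // s3client.putObject(new PutObjectRequest(bucketName, fileName, ...));

        String endpointUrl = "https://s3.amazonaws.com"; // placeholder
        String bucketName = "my-bucket";                 // placeholder
        return endpointUrl + "/" + bucketName + "/" + fileName;
    }

    public static void main(String[] args) {
        String body = Base64.getEncoder().encodeToString("fake image bytes".getBytes());
        String url = handleUpload(body, "my photo.jpg");
        System.out.println(url.endsWith("-my_photo.jpg"));
    }
}
```

The response body returned from the handler is what the user ultimately receives through API Gateway.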
I am attempting to use authentication from Google; however, I have doubts about how to use it in this method:
GoogleCredential credential = GoogleCredential.fromStream(new FileInputStream("MyProject-1234.json"))
        .createScoped(Collections.singleton(SQLAdminScopes.SQLSERVICE_ADMIN));
The file MyProject-1234.json is stored in an S3 bucket, and this code is currently running inside a Lambda. How can I use this file for authentication? I am not sure whether I should be sending the path (and how), or whether I should be doing something else.
You need to use the AWS SDK for Java to download the file from S3 to the Lambda function's local file system first. You can't open an S3 object with FileInputStream; that only works for local files.
Here's how you can pull the file from S3 and use it.
In short, the getFileFromS3(...) method returns a File object that you can use to create the FileInputStream.
public class S3FileTest implements RequestStreamHandler {

    private LambdaLogger logger;

    @Override
    public void handleRequest(InputStream input, OutputStream output, Context context) throws IOException {
        logger = context.getLogger();

        String bucketName = "==== S3 BUCKET NAME ====";
        String fileName = "==== S3 FILE NAME ====";

        File localFile = getFileFromS3(context, bucketName, fileName);
        if (localFile == null) {
            // handle error
            // return ....
        }

        // use the file
        GoogleCredential credential = GoogleCredential.fromStream(new FileInputStream(localFile))
                .createScoped(Collections.singleton(SQLAdminScopes.SQLSERVICE_ADMIN));

        // do more
        // ...
    }

    private File getFileFromS3(Context context, String bucketName, String fileName) {
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard().withRegion(Regions.US_EAST_1).build();

        // s3 client
        if (s3Client == null) {
            logger.log("S3 Client is null - can't continue!");
            return null;
        }

        // s3 bucket - make sure it exists
        if (!s3Client.doesBucketExistV2(bucketName)) {
            logger.log("S3 Bucket does not exist - can't continue!");
            return null;
        }

        File localFile = null;
        try {
            localFile = File.createTempFile(fileName, "");
            // get S3Object
            S3Object s3Object = s3Client.getObject(bucketName, fileName);
            // get stream from S3Object
            InputStream inputStream = s3Object.getObjectContent();
            // write S3Object stream into a temp file
            Files.copy(inputStream, localFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
            return localFile;
        } catch (Exception e) {
            logger.log("Failed to get file from S3: " + e.toString());
            return null;
        }
    }
}
This Spring app performs a simple file upload;
here's the controller method:
@Override
public String fileUpload(MultipartFile file) {
    try {
        // save uploaded image to images folder in root dir
        Files.write(Paths.get("images/" + file.getOriginalFilename()), file.getBytes());
        // perform some tasks on image
        return "";
    } catch (IOException ioException) {
        return "File upload has failed.";
    } finally {
        Files.delete(Paths.get("images/" + file.getOriginalFilename()));
    }
}
but when I build the JAR and run it, it throws an IOException saying:
java.nio.file.NoSuchFileException: images\8c9.jpeg
So my question is: how can I add the images folder inside the executable JAR itself?
Thanks.
You should provide a full path for the images folder, or save in java.io.tmpdir, creating the images folder there first.
But in my opinion, you should configure your upload folder from a config file for flexibility. Take a look at this:
app:
profile-image:
upload-dir: C:\\projs\\web\\profile_image
file-types: jpg, JPG, png, PNG
width-height: 360, 360
max-size: 5242880
In your service or controller, do whatever you like: validate the image type, size, etc., and process it as needed, for instance to generate thumbnails (or an avatar).
In your controller or service class, get the directory:
@Value("${app.profile-image.upload-dir:../images}")
private String imageUploadDir;
Finally,
public static Path uploadFileToPath(String fullFileName, String uploadDir, byte[] fileContent) throws IOException {
    Path fileAbsolutePath = Paths.get(StringUtils.join(uploadDir, File.separatorChar, fullFileName));
    return Files.write(fileAbsolutePath, fileContent); // returns the full path of the file
}
For your question in the comment: you can use the java.io.File.deleteOnExit() method, which deletes the file or directory denoted by the abstract pathname when the virtual machine terminates. TAKE GOOD CARE THOUGH: it might leave some files behind if not handled properly.
InputStream inputStream = null;
OutputStream outputStream = null;
try (ByteArrayOutputStream output = new ByteArrayOutputStream()) {
    URL fileUrl = new URL(url);
    String tempDir = System.getProperty("java.io.tmpdir");
    String path = tempDir + new Date().getTime() + ".jpg"; // note file extension
    java.io.File file = new java.io.File(path);
    file.deleteOnExit();
    inputStream = fileUrl.openStream();
    ByteStreams.copy(inputStream, output); // ByteStreams - Guava
    outputStream = new FileOutputStream(file);
    output.writeTo(outputStream);
    outputStream.flush();
    return file;
} catch (Exception e) {
    throw e;
} finally {
    try {
        if (inputStream != null) {
            inputStream.close();
        }
        if (outputStream != null) {
            outputStream.close();
        }
    } catch (Exception e) {
        // skip
    }
}
I have developed a REST client using Spring MVC which uploads files and form data to a REST service using Jersey.
I can see the files that I uploaded in the REST client's Tomcat home directory.
How can I automatically delete the files stored on my Tomcat after I get a success response?
Some of my configuration, for your reference:
Multipart config in "web.xml":
<multipart-config>
<location>/tmp</location>
<max-file-size>26214400</max-file-size>
<max-request-size>31457280</max-request-size>
<file-size-threshold>0</file-size-threshold>
</multipart-config>
Multipart config in "dispatcher-servlet.xml":
<bean id="multipartResolver" class="org.springframework.web.multipart.support.StandardServletMultipartResolver"/>
My business logic,
public Map<Object, Object> upload(ModelMap model) {
    Map<Object, Object> responseMap = new HashMap<>();
    sendMailBean = (SendMailBean) model.get("sendMailBean");

    FormDataMultiPart formDataMultiPart = new FormDataMultiPart();
    formDataMultiPart.field("firstName", sendMailBean.getFirstname());
    formDataMultiPart.field("lastName", sendMailBean.getLastname());
    formDataMultiPart.field("fromAddress", sendMailBean.getEmail());
    formDataMultiPart.field("subject", sendMailBean.getSubject());
    formDataMultiPart.field("text", sendMailBean.getMessage());

    List<MultipartFile> files = sendMailBean.getAttachments();
    try {
        for (MultipartFile file : files) {
            File convFile = convert(file);
            FileDataBodyPart filePart = new FileDataBodyPart("files", convFile);
            filePart.setContentDisposition(FormDataContentDisposition.name("files").fileName(file.getOriginalFilename()).build());
            formDataMultiPart.bodyPart(filePart);
        }
        Client client = new Client();
        WebResource webResource = client.resource(----rest url-----);
        ClientResponse response = webResource.type(MediaType.MULTIPART_FORM_DATA).post(ClientResponse.class, formDataMultiPart);
        if (response.getStatus() != 200) {
            model.addAttribute("errormsg", "Failed : HTTP error code : " + response.getStatus());
            responseMap.put("model", model);
            responseMap.put("redirectToPage", "redirect:/views/error");
        } else {
            // responseMap.put("redirectToPage", "/views/email");
            responseMap.put("model", model);
            responseMap.put("redirectToPage", "");
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return responseMap;
}
public File convert(MultipartFile file) {
    File convFile = new File(file.getOriginalFilename());
    try {
        convFile.createNewFile();
        FileOutputStream fos = new FileOutputStream(convFile);
        fos.write(file.getBytes());
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return convFile;
}
I've had the same problem.
The best way to get the tmp file deleted is to ensure your InputStream is closed after you are done with it:
} finally {
    if (inputStream != null) {
        try {
            inputStream.close();
        } catch (IOException io) {
            // ignore
        }
    }
}
For some odd reason the InputStream stays open even after your REST service executes; this prevents the temp file from being cleaned up and deleted.
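On Java 7+, try-with-resources is the simplest way to guarantee the stream gets closed. A small self-contained sketch (using a plain temp file as a stand-in for the multipart upload) showing that once the stream is closed, nothing blocks deleting the file:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class TryWithResources {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("upload-", ".tmp");

        // try-with-resources closes the stream even if reading throws,
        // so no open handle remains on the temp file afterwards
        try (InputStream in = Files.newInputStream(tmp)) {
            in.read();
        }

        System.out.println(Files.deleteIfExists(tmp)); // true: deletion succeeds
    }
}
```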
Looks like the problem lies with the FileOutputStream logic used to convert the multipart file to a File. Below is the workaround I used to resolve this.
I replaced the conversion logic with:
public File multipartToFile(MultipartFile file) throws IllegalStateException, IOException {
    File tmpFile = new File(System.getProperty("user.dir") + File.separator + file.getOriginalFilename());
    file.transferTo(tmpFile);
    return tmpFile;
}
For each multipart file in the iteration, I put the converted file into a list; after I finished uploading the files, I iterated over the list and deleted them.
Cheers.
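The collect-then-delete pattern described above can be sketched like this (temp files stand in for the files produced by `multipartToFile(...)`, and the upload step is elided):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CleanupAfterUpload {
    public static void main(String[] args) throws IOException {
        List<Path> converted = new ArrayList<>();

        // convert/collect phase: stand-ins for the files from multipartToFile(...)
        for (int i = 0; i < 3; i++) {
            converted.add(Files.createTempFile("attachment-" + i + "-", ".tmp"));
        }

        // ... upload the files here ...

        // cleanup phase: delete every converted file once the upload succeeded
        for (Path p : converted) {
            Files.deleteIfExists(p);
        }

        System.out.println(converted.stream().noneMatch(Files::exists)); // true: all deleted
    }
}
```

Deleting only after the whole upload has succeeded keeps the files available in case the request has to be retried.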
I am using Grizzly for a Java REST service and consuming these web services in an Android app.
It's working fine as far as "text" data is concerned.
Now I want to load images (from the server) in my Android application using this REST service, and also allow users to update the image from the device.
I have tried this code:
@GET
@Path("/img3")
@Produces(MediaType.APPLICATION_OCTET_STREAM)
public Response getFile() {
    File file = new File("img/3.jpg");
    return Response.ok(file, MediaType.APPLICATION_OCTET_STREAM)
            .header("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"") // optional
            .build();
}
The code above allows me to download the file, but is it possible to display the result in the browser, like this?
http://docs.oracle.com/javase/tutorial/images/oracle-java-logo.png
Solution of Part 1:
I have made the changes in my code as suggested by Shadow:
@GET
@Path("/img3")
@Produces("image/jpg")
public Response getFile(@PathParam("id") String id) throws SQLException {
    File file = new File("img/3.jpg");
    return Response.ok(file, "image/jpg")
            .header("Inline", "filename=\"" + file.getName() + "\"")
            .build();
}
The requested image will be displayed in the browser.
Part 2:
The code used to convert the Base64-encoded image back:
@POST
@Path("/upload/{primaryKey}")
@Consumes(MediaType.APPLICATION_FORM_URLENCODED)
@Produces("image/jpg")
public String uploadImage(@FormParam("image") String image, @PathParam("primaryKey") String primaryKey)
        throws SQLException, FileNotFoundException {
    String result = "false";
    FileOutputStream fos = new FileOutputStream("img/" + primaryKey + ".jpg");
    // decode Base64 String to image
    try {
        byte[] byteArray = Base64.getMimeDecoder().decode(image);
        fos.write(byteArray);
        result = "true";
        fos.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return result;
}
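For reference, the client side just needs to produce the matching Base64 string for the "image" form parameter. A minimal round-trip sketch of the encoding this endpoint expects (no HTTP involved, only the MIME Base64 codec used above):

```java
import java.util.Arrays;
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        byte[] original = {(byte) 0xFF, (byte) 0xD8, (byte) 0xFF, (byte) 0xE0}; // JPEG magic bytes

        // what the Android client would send as the "image" form param
        String image = Base64.getMimeEncoder().encodeToString(original);

        // what the uploadImage(...) endpoint does with it
        byte[] decoded = Base64.getMimeDecoder().decode(image);

        System.out.println(Arrays.equals(original, decoded)); // true: lossless round trip
    }
}
```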
I save user-uploaded images on FTP.
The FTP service is running on server Server-A. The actual problem: when I view an uploaded image from the web application running on my localhost, everything works, but when I deploy the application to Tomcat running on the same Server-A, images are not displayed correctly.
The picture when I run the web application in local Tomcat:
The same picture when I run the web application in the remote Tomcat:
You can see that the second image is not displayed correctly. I also want to mention that it is the same FTP server in both cases.
I am using Spring with the Apache FTPClient library for the image upload/download functionality.
Controller source code:
@RequestMapping(value = "/{id:\\d+}/image", method = RequestMethod.GET, produces = MediaType.IMAGE_JPEG_VALUE)
protected byte[] getUserImage(BaseForm form, @PathVariable("id") int userId) {
    try {
        User user = checkToken(form.getToken());
        log.info("/users/{id}/image [GET]. User: " + user + ", form: " + form + ", User id: " + userId);
        FileWrapper image = service.getUserImage(userId);
        if (image != null) {
            return ftpService.downloadFtpFile(image.getName());
        }
    } catch (Exception e) {
        log.error(e.getMessage(), e);
    }
    return null;
}
FtpService source code:
public byte[] downloadFtpFile(String filePath) throws IOException {
    FTPClient client = new FTPClient();
    try {
        client.connect(host, port);
        if (!client.login(username, password)) {
            throw new AdminException("Invalid ftp username/password");
        }
        client.enterLocalPassiveMode();
        try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
            client.retrieveFile(filePath, outputStream);
            return outputStream.toByteArray();
        }
    } catch (Exception e) {
        log.error(e.getMessage(), e);
    } finally {
        if (client.isConnected()) {
            client.logout();
            client.disconnect();
        }
    }
    return null;
}
Thanks in advance!
If you've not set the FTP transfer to binary (as opposed to ASCII), it will "convert the line endings" (or what it thinks are line endings), which will corrupt the picture. With Apache Commons Net, call client.setFileType(FTP.BINARY_FILE_TYPE) after logging in and before retrieving the file.
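The corruption can be reproduced with plain Java: an ASCII-mode transfer rewrites line endings, but a JPEG inevitably contains 0x0A bytes that are image data, not line endings. A small sketch crudely simulating the LF to CRLF translation on a few JPEG-like bytes:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class AsciiModeCorruption {
    public static void main(String[] args) {
        // a few JPEG-like bytes; any real JPEG will contain 0x0A (LF) somewhere
        byte[] jpegChunk = {(byte) 0xFF, (byte) 0xD8, 0x0A, 0x42, 0x0A};

        // crude model of an ASCII-mode transfer: every LF becomes CRLF
        String asText = new String(jpegChunk, StandardCharsets.ISO_8859_1);
        byte[] corrupted = asText.replace("\n", "\r\n").getBytes(StandardCharsets.ISO_8859_1);

        System.out.println(Arrays.equals(jpegChunk, corrupted)); // false: bytes changed
        System.out.println(corrupted.length - jpegChunk.length); // 2 extra bytes inserted
    }
}
```

Two spurious 0x0D bytes in the middle of compressed image data are enough to make the browser render a broken picture, which matches the symptom described in the question.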