How to set the MD5 checksum on a file while uploading to Amazon S3 - Java

I am using the Amazon S3 Android low-level SDK to upload a file and want to set an MD5 checksum on the uploaded data.
1) Code to create the credentials:
BasicAWSCredentials lAwsCredentials = new BasicAWSCredentials(
        Constants.ACCESS_KEY_ID, Constants.SECRET_KEY);
AmazonS3Client lS3Client = new AmazonS3Client(lAwsCredentials);
2) Code to calculate the MD5 digest:
public class MD5CheckSum {

    public static byte[] createChecksum(String pFilepath) throws Exception {
        MessageDigest lMessageDigest = MessageDigest.getInstance("MD5");
        byte[] lBuffer = new byte[1024];
        int lNumRead;
        // try-with-resources closes the stream even if read() throws
        try (InputStream lFis = new FileInputStream(pFilepath)) {
            while ((lNumRead = lFis.read(lBuffer)) != -1) {
                lMessageDigest.update(lBuffer, 0, lNumRead);
            }
        }
        return lMessageDigest.digest();
    }

    public static String getMD5Checksum(String pFilepath) throws Exception {
        byte[] lBytes = createChecksum(pFilepath);
        return Base64.encodeToString(lBytes, Base64.DEFAULT);
    }
}
3) Code to set the MD5 via object metadata:
String lMd5 = null;
try {
    lMd5 = MD5CheckSum.getMD5Checksum(pFile.getAbsolutePath());
    Log.v(TAG, "CheckSum:====" + lMd5);
} catch (Exception lException) {
    lException.printStackTrace();
}
ObjectMetadata lObjectMetadata = new ObjectMetadata();
if (lMd5 != null) {
    lObjectMetadata.setContentMD5(lMd5);
}
InitiateMultipartUploadResult mInitResponse = mS3Client.initiateMultipartUpload(
        new InitiateMultipartUploadRequest(mBucketName, mKeyName, lObjectMetadata));
But an exception is thrown by Amazon when I set the MD5:
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Anonymous users cannot initiate multipart uploads. Please authenticate. (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: BA0C68FC884703FD), S3 Extended Request ID: re2sdbzf8MMqGAyrNQOoqYJ8EdXERoWE7cjG+UpfAtFuP5IeAbXmk6Riw+PX8Uw3Jcspn1rSQvI=
Is this the correct way to set the MD5?
Note: when the MD5 is not set (i.e. ObjectMetadata is not set), the upload works without any exception.

I have also faced this type of issue.
I fixed it by using Base64.DEFAULT instead of other flags such as WRAP and NO_WRAP.
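As an aside: the Content-MD5 set on the initiate request cannot checksum the object data, since that request has no body. With the low-level multipart API, each part can carry its own digest instead. A minimal sketch, assuming SDK v1's UploadPartRequest and a partMd5 value (hypothetical here) computed over just that part's bytes:

UploadPartRequest lPartRequest = new UploadPartRequest()
        .withBucketName(mBucketName)
        .withKey(mKeyName)
        .withUploadId(mInitResponse.getUploadId())
        .withPartNumber(1)
        .withFile(pFile)
        .withPartSize(pFile.length())
        .withMD5Digest(partMd5); // Base64 of the raw 16-byte MD5 of this part; S3 verifies the part against it
UploadPartResult lPartResult = mS3Client.uploadPart(lPartRequest);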

Related

No primary or single public constructor found for class java.io.File

I'm trying to pass a file to my Spring Boot backend (which will then upload it to an S3 bucket), but I'm receiving this error that I can't figure out.
The file itself will contain an array of arrays of strings.
Error -
java.lang.IllegalStateException: No primary or single public constructor found for class java.io.File - and no default constructor found either
Data Source -
// Data saved to S3 bucket / downloadableData function
if (this.lng !== "0.000000" && this.lng !== "") {
    this.locationData.push([`Longitude: ${this.lng}, Latitude: ${this.lat}, Uncertainty Radius: ${this.uncertainty_radius} meters, Address: ${this.place_name}, Source: TEXT`])
    this.locationData = JSON.parse(JSON.stringify(this.locationData))
}
Axios Post -
downloadableData() {
    const blob = new Blob([this.locationData], {type: 'application/json'});
    const data = new FormData();
    data.append("document", blob);
    axios.post("http://localhost:8080/api/v1/targetLocation/uploadStreamToS3Bucket", blob)
},
Spring Boot method -
public void uploadStreamToS3Bucket(File locations) {
    try {
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                .withRegion(String.valueOf(awsRegion))
                .build();
        String bucketName = "downloadable-cases";
        String fileName = connectionRequestRepository.findStream() + ".json";
        s3Client.putObject(new PutObjectRequest(bucketName, fileName, locations));
    } catch (AmazonServiceException ex) {
        System.out.println("Error: " + ex.getMessage());
    }
}
I see you want to upload a file that contains JSON data. This can be done in a Spring Boot app with logic like this.
<p>Upload images to an S3 Bucket. Each image will be analyzed!</p>
<form method="POST" onsubmit="myFunction()" action="/upload" enctype="multipart/form-data">
    <input type="file" name="file" /><br/><br/>
    <input type="submit" value="Submit" />
</form>
To handle this upload in a Spring Controller, you can use this logic:
// Upload a file to place into an Amazon S3 bucket.
@RequestMapping(value = "/upload", method = RequestMethod.POST)
@ResponseBody
public ModelAndView singleFileUpload(@RequestParam("file") MultipartFile file) {
    try {
        byte[] bytes = file.getBytes();
        String name = file.getOriginalFilename();
        // Put the file into the bucket.
        s3Client.putObject(bytes, bucketName, name);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return new ModelAndView(new RedirectView("photo"));
}
Now you have the byte array and file name. You can place this into an Amazon S3 bucket by using the AWS SDK for Java V2.
private S3Client getClient() {
    Region region = Region.US_WEST_2;
    S3Client s3 = S3Client.builder()
            .credentialsProvider(EnvironmentVariableCredentialsProvider.create())
            .region(region)
            .build();
    return s3;
}
// Places an image into an S3 bucket.
public String putObject(byte[] data, String bucketName, String objectKey) {
    S3Client s3 = getClient();
    try {
        PutObjectResponse response = s3.putObject(PutObjectRequest.builder()
                        .bucket(bucketName)
                        .key(objectKey)
                        .build(),
                RequestBody.fromBytes(data));
        return response.eTag();
    } catch (S3Exception e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
    return "";
}
Here is a complete document that shows this use case. It actually uses the Amazon Rekognition service to analyze photos in an Amazon S3 bucket; however, it still demonstrates how to successfully upload a file from your desktop to an Amazon S3 bucket. It is also implemented with the AWS SDK for Java V2, which is the version Amazon recommends.
Creating a dynamic web application that analyzes photos using the AWS SDK for Java
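On the specific error in the question: Spring cannot bind an HTTP multipart body to a java.io.File parameter, which is exactly what produces "No primary or single public constructor found for class java.io.File". A minimal sketch of a handler that matches the Axios call (the class name UploadController is hypothetical; note also that the question's axios.post sends blob where it presumably meant the data FormData, whose part is named "document"):

import java.io.IOException;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class UploadController {

    // Bind the multipart part named "document" to a MultipartFile;
    // Spring cannot construct a java.io.File from a request body.
    @PostMapping("/api/v1/targetLocation/uploadStreamToS3Bucket")
    public void uploadStreamToS3Bucket(@RequestParam("document") MultipartFile locations) throws IOException {
        byte[] bytes = locations.getBytes();
        // ... hand bytes (or locations.getInputStream()) to the S3 client as shown above ...
    }
}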

ConnectionPoolTimeoutException: Timeout waiting for connection from pool on putObject() with s3Client - Java

I'm uploading image files to S3 using the AWS S3 client in my Java application, but sometimes I get the error:
ERROR 9 --- [io-8080-exec-27] b.c.i.h.e.handler.HttpExceptionHandler : com.amazonaws.SdkClientException: Unable to execute HTTP request: Timeout waiting for connection from pool
com.amazonaws.SdkClientException: Unable to execute HTTP request: Timeout waiting for connection from pool
Caused by: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
but I haven't identified why this error occurs or what fix I need to implement. I saw in the documentation that you can set maxConnections on a ClientConfiguration and pass it to the AmazonS3ClientBuilder (a sketch of that approach is shown after the service code below), but I believe that would only mask the problem rather than actually correct it; am I right?
I did not find details on why this connection-pool problem occurs when using putObject(). If someone knows the reason, or can explain from my implementation why this happens, please share. In our application there is also an SQS configuration for queues.
S3Config
public class S3Config {
    @Bean
    public AmazonS3 s3client() {
        return AmazonS3ClientBuilder.standard()
                .build();
    }
}
Service Upload
public List<String> uploadImage(Long id, List<MultipartFile> files) throws Exception {
    Random rand = new Random();
    Product product = this.productService.findById(id);
    List<String> imgPath = new ArrayList<>();
    for (MultipartFile file : files) {
        String name = (product.getName() + this.brandService.findBrandById(product.getBrand()).getName() + rand.nextInt(999999)).replaceAll(" ", "-");
        String fullPath = this.s3Service.uploadImageFile(
                file,
                '.' + Objects.requireNonNull(file.getOriginalFilename()).split("\\.")[1],
                name,
                awsBucketProperties.getName(),
                awsBucketProperties.getEndpoint());
        imgPath.add(this.utils.removeImageDomain(fullPath));
    }
    return imgPath;
}
Service S3
public String uploadImageFile(final MultipartFile file, final String ext, final String filename, final String bucketName, final String bucketEndpoint) throws IOException {
    byte[] imageData = file.getBytes();
    InputStream stream = new ByteArrayInputStream(imageData);
    String s3FileName = filename + ext;
    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentLength(stream.available());
    try {
        s3client.putObject(new PutObjectRequest(bucketName, s3FileName, stream, metadata)
                .withCannedAcl(CannedAccessControlList.PublicRead));
    } catch (AmazonClientException ex) {
        ex.printStackTrace();
    }
    return String.format("%s/%s", bucketEndpoint, s3FileName);
}

Mock for the local variable not working - getting "Specified key does not exist" (Service: Amazon S3)

This is the actual method which I want to test using mocks:
@Override
public void processTransaction(Exchange exchange) throws Exception {
    AmazonS3 s3Client = new AmazonS3Client(new DefaultAWSCredentialsProviderChain());
    url = String.format("%s%s%s", tenantConfig.getSchemaName(), PATH_LOC_NEW, fileName);
    region = System.getProperty("aws.region.prefix") != null ? System.getProperty("aws.region.prefix") : "";
    bucket = String.format("%s%s-%s", System.getProperty("aws.Environment"), region, "gfcp-tenant");
    S3Object object = s3Client.getObject(new GetObjectRequest(bucket, url));
    InputStream inputStream = new BufferedInputStream(object.getObjectContent());
    ObjectMetadata meta = new ObjectMetadata();
}
Here is my test logic, but it seems AmazonS3Client is not being mocked, which is why I am getting the exception:
@RunWith(PowerMockRunner.class)
@PrepareForTest({TransactionProcessingService.class})
@PowerMockIgnore({"javax.net.ssl.*", "javax.security.*", "javax.management.*", "javax.crypto.*"})
public class TransactionProcessingServiceTest {
    @Test
    public void testProcessTransaction() throws Exception {
        DefaultAWSCredentialsProviderChain credentialMock = PowerMockito.mock(DefaultAWSCredentialsProviderChain.class);
        PowerMockito.whenNew(DefaultAWSCredentialsProviderChain.class).withNoArguments().thenReturn(credentialMock);
        AmazonS3Client s3Client = PowerMockito.mock(AmazonS3Client.class);
        HttpRequestBase httprequest = PowerMockito.mock(HttpRequestBase.class);
        S3Object object = PowerMockito.mock(S3Object.class);
        InputStream in = PowerMockito.mock(InputStream.class);
        object.setObjectContent(new S3ObjectInputStream(in, httprequest));
        when(s3Client.getObject(new GetObjectRequest("local_vittal-gfcp-tenant", "cdta/TransactionProcessing/New/junit_test_file.acf")))
                .thenReturn(object);
        PowerMockito.whenNew(AmazonS3Client.class).withArguments(credentialMock).thenReturn(s3Client);
        transactionProcessingService.processTransaction(exchange);
    }
}
Can anyone please help me out?
Thanks in advance.
It is working now.
I needed to include the implementation class in @PrepareForTest as well, like this:
@PrepareForTest(TransactionProcessingServiceImpl.class)
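That is, the annotations on the test class end up looking something like this (class names taken from the question; PowerMock must prepare the class whose code actually calls new AmazonS3Client(...) so that whenNew() can intercept the constructor):

@RunWith(PowerMockRunner.class)
// Prepare the class(es) whose code invokes the constructors stubbed with whenNew().
@PrepareForTest({TransactionProcessingService.class, TransactionProcessingServiceImpl.class})
@PowerMockIgnore({"javax.net.ssl.*", "javax.security.*", "javax.management.*", "javax.crypto.*"})
public class TransactionProcessingServiceTest {
    // ... test methods unchanged ...
}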

Send .docx file from server to client in a JHipster application

I am using JHipster.
I am using docx4j to create a .docx file.
I want to download this .docx file from the server to the client.
But the file I download is corrupted.
On the server side:
I generate my file and put it in a byte[]:
WordprocessingMLPackage p = null;
...
File f = new File(filePath);
p.save(f);
byte[] stream = Files.readAllBytes(f.toPath());
I have tried to send it to the client in different formats:
byte[]
byte[] encoded Base64
String
String encoded Base64
An example of what my method looks like:
// send back as a String encoded in Base64
public ResponseEntity<FileDTO> getFile(@PathVariable Long id) throws URISyntaxException, IOException {
    FileDTO result = fillRepository.findOne(id);
    byte[] stream = FileUtil.getFile(id); // retrieve the file as byte[]
    byte[] encoded = Base64.encodeBase64(stream);
    String encodedString = new String(encoded, "UTF-8");
    result.setFile(encodedString);
    return ResponseUtil.wrapOrNotFound(Optional.ofNullable(result));
}
On the client side:
I retrieve my file as byte[] or String and I put it in a blob to be downloaded.
FileService.get({id: id}, function(result) {
    var res = result.file;
    // var res = Base64.decode(result.file);
    vm.blob = new Blob([res], {type: 'data:attachment;charset=utf-8;application/vnd.openxmlformats-officedocument.wordprocessingml.document'});
    vm.url = (window.URL || window.webkitURL).createObjectURL(vm.blob);
});
My service is declared like this:
(function() {
    'use strict';
    angular
        .module('myApp')
        .factory('FileService', FileService);
    FileService.$inject = ['$resource', 'DateUtils'];
    function FileService($resource, DateUtils) {
        var resourceUrl = 'api/file/:id/generate';
        return $resource(resourceUrl, {}, {
            'get': {
                method: 'GET',
                responseType: 'arraybuffer'
            }
        });
    }
})();
When I download the file, Word says:
"We're sorry. We can't open file.docx because we found a problem with its content."
And when I compare my original file and the downloaded one in Notepad++, for example, I see that the binary content is not exactly the same, as if there were encode/decode issues.
Also, the sizes are not the same:
Original file: 13 KB
Downloaded file: 18 KB
(roughly the 4/3 growth you would expect if the payload stayed Base64-encoded)
Could you help me understand how and why the downloaded file is corrupted?
I finally found a solution:
I directly send back the binary in the response, without any conversion, and access it with a window.location.
I added a new REST controller without the @RequestMapping("/api") annotation:
@RestController
public class FileGenerationResource {
    ...
    @GetMapping("/file/{id}")
    @Timed
    public void getFile(@PathVariable Long id, HttpServletResponse response) throws URISyntaxException, IOException {
        FileInputStream stream = fileService.getFile(id);
        response.setContentType("application/vnd.openxmlformats-officedocument.wordprocessingml.document");
        response.setHeader("Content-disposition", "attachment; filename=test.docx");
        IOUtils.copy(stream, response.getOutputStream());
        stream.close();
    }
}
The Angular controller content:
(function() {
    'use strict';
    angular
        .module('myApp')
        .controller('MyController', MyController);
    MyController.$inject = ['$timeout', '$scope', '$stateParams', '$uibModalInstance'];
    function MyController ($timeout, $scope, $stateParams, $uibModalInstance) {
        var vm = this;
        vm.clear = clear;
        vm.dwl = dwl;
        function dwl (id) {
            window.location = "http://localhost:8080/file/" + id;
            vm.clear();
        }
        function clear () {
            $uibModalInstance.dismiss('cancel');
        }
    }
})();
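An alternative sketch (not the poster's original fix) that serves the same download in Spring's ResponseEntity style; fileService.getFileAsBytes is a hypothetical helper assumed to return the raw .docx bytes:

@GetMapping("/file/{id}")
public ResponseEntity<byte[]> getFile(@PathVariable Long id) throws IOException {
    byte[] document = fileService.getFileAsBytes(id); // hypothetical: raw bytes of the generated .docx
    return ResponseEntity.ok()
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=test.docx")
            .contentType(MediaType.parseMediaType("application/vnd.openxmlformats-officedocument.wordprocessingml.document"))
            .body(document);
}

The key point in both variants is that the bytes never pass through a string conversion, which is what corrupted the earlier attempts.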

AWS S3 Java SDK times out when attempting a PutObject request

I have some code that attempts to upload images to an S3 bucket. All of them are around 100-200 KB.
However, after a few upload attempts I always get the following stack trace:
com.amazonaws.AmazonClientException: Unable to execute HTTP request: Timeout waiting for connection from pool
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:713)
    at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:453)
    at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:415)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:364)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3964)
    at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1538)
If I do a GetObject in my code, I do not get that problem; it happens only on uploads.
The code is the following:
public PutObjectResult uploadImage(String key, InputStream inputStream, ObjectMetadata metadata) {
    Optional<String> bucketName = propertyResolver.instance().value("s3.bucket.url");
    String resourcePath = BASE_PATH + key;
    PutObjectRequest request = new PutObjectRequest(bucketName.get(), resourcePath, inputStream, metadata);
    PutObjectResult result = null; // initialized so the method compiles; null is returned on failure
    try {
        result = s3Client.putObject(request);
    } catch (AmazonClientException amazonClientException) {
        amazonClientException.printStackTrace();
    }
    return result;
}
I've attempted to find a solution online, but all I could find were issues regarding GetObject and not consuming the response properly (a sketch of one commonly suggested mitigation follows).
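A hedged sketch of that mitigation, under two assumptions the question does not confirm: that a new client (and therefore a new connection pool) is being built per call somewhere, and that the request stream is sometimes left open. Reusing one client and closing the stream deterministically addresses both (BASE_PATH and bucketName stand in for the question's resolved values):

// Reuse a single client: each AmazonS3 instance owns its own connection pool.
private static final AmazonS3 S3_CLIENT = AmazonS3ClientBuilder.standard().build();

public PutObjectResult uploadImage(String key, InputStream inputStream, ObjectMetadata metadata) throws IOException {
    String resourcePath = BASE_PATH + key;
    // try-with-resources closes the stream even when putObject throws.
    try (InputStream in = inputStream) {
        return S3_CLIENT.putObject(new PutObjectRequest(bucketName, resourcePath, in, metadata));
    }
}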
