I am using MvcUriComponentsBuilder.fromMethodName() to build a List of URLs and return them to the front end. Below is example output, where the domain comes back as localhost:
[
http://localhost:8081/files/1800_tiger.jpg/slideshow, http://localhost:8081/files/1800_trees.jpg/slideshow
]
Instead of localhost, I want MvcUriComponentsBuilder to return the IP address of my system. Here is my code:
@CrossOrigin
@RestController
public class ContentResource {
@RequestMapping("/getAllFiles")
public ResponseEntity<List<String>> getAllFiles(@RequestParam String panelName) {
List<String> fileNamesList = panelFileListMap.get(panelName);
if (fileNamesList != null) {
List<String> allFiles = fileNamesList.stream()
.map(fileName -> MvcUriComponentsBuilder
.fromMethodName(ContentResource.class, "getFile", fileName, panelName).build().toString())
.collect(Collectors.toList());
return ResponseEntity.ok().body(allFiles);
} else {
throw new RuntimeException("No images are uploaded in category = " + panelName);
}
}
@GetMapping("/files/{filename:.+}/{panelName}")
@ResponseBody
public ResponseEntity<Resource> getFile(@PathVariable("filename") String filename,
@PathVariable("panelName") String panelname) {
Resource file = storageService.loadFile(filename, panelname);
return ResponseEntity.ok()
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + file.getFilename() + "\"")
.body(file);
}
}
I found the solution myself: I changed the getAllFiles method as follows:
InetAddress ip = null;
@RequestMapping("/getAllFiles")
public ResponseEntity<List<String>> getAllFiles(@RequestParam String panelName) {
try {
ip = InetAddress.getLocalHost();
} catch (UnknownHostException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
List<String> fileNamesList = panelFileListMap.get(panelName);
if (fileNamesList != null) {
List<String> allFiles = fileNamesList.stream()
.map(fileName -> MvcUriComponentsBuilder
.fromMethodName(ContentResource.class, "getFile", fileName, panelName)
.host(ip.getHostAddress()).build().toString())
.collect(Collectors.toList());
return ResponseEntity.ok().body(allFiles);
} else {
throw new RuntimeException("No images are uploaded in category = " + panelName);
}
}
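A note on this approach: InetAddress.getLocalHost() can return the loopback address depending on how the host's /etc/hosts is set up, and resolving it on every request is unnecessary. Below is a minimal sketch of the same idea with the address resolved once; it reuses the getFile mapping from ContentResource above, and the FileUrlBuilder class name is purely illustrative.
import java.net.InetAddress;
import java.net.UnknownHostException;
import org.springframework.web.servlet.mvc.method.annotation.MvcUriComponentsBuilder;

public class FileUrlBuilder {
    // Resolve the host address once; fall back to "localhost" if the lookup fails.
    private final String hostAddress = resolveHostAddress();

    private static String resolveHostAddress() {
        try {
            return InetAddress.getLocalHost().getHostAddress();
        } catch (UnknownHostException e) {
            return "localhost";
        }
    }

    public String buildFileUrl(String fileName, String panelName) {
        return MvcUriComponentsBuilder
                .fromMethodName(ContentResource.class, "getFile", fileName, panelName)
                .host(hostAddress)
                .build()
                .toString();
    }
}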
public UaaGroup createGroup() {
String requestUrl = appConfig.getUaa().getBase_url() + "/Groups";
LOGGER.info("requestUrl : {}", requestUrl);
UaaGroup uaaGroup = new UaaGroup();
uaaGroup.setDescription("description");
uaaGroup.setDisplayName(UUID.randomUUID().toString());
LOGGER.info("DisplayName before rest call : {}", uaaGroup.getDisplayName());
try {
ResponseEntity<UaaGroup> responseEntity = restTemplate.postForEntity(requestUrl, uaaGroup, UaaGroup.class,
"");
uaaGroup = responseEntity.getBody();
LOGGER.info("UaaGroupServiceImpl.createGroup: uaaGroup={}", responseEntity.getBody().toString());
return uaaGroup;
} catch (Exception e) {
LOGGER.error("Create UAA Group failed: {}", e);
throw e;
}
}
public UaaGroup updateGroup(String groupId, GroupRequest groupRequest) {
String requestUrl = appConfig.getUaa().getBase_url() + "/Groups/{groupId}";
UaaGroup uaaGroup = new UaaGroup();
if (!Strings.isNullOrEmpty(groupId)) {
String displayName = "eid-" + groupRequest.getEnterpriseId() + '-' + "gid-" + groupId + '-'
+ groupRequest.getRole();
String description = groupRequest.getEnterpriseName() + ":" + groupRequest.getName();
uaaGroup.setDisplayName(displayName);
uaaGroup.setDescription(description);
try {
HttpEntity<UaaGroup> entity = new HttpEntity<UaaGroup>(uaaGroup);
ResponseEntity<UaaGroup> responseEntity = restTemplate.exchange(requestUrl, HttpMethod.PUT, entity,
UaaGroup.class, groupId);
uaaGroup = responseEntity.getBody();
LOGGER.info("Updated Group", responseEntity.getBody().toString());
return uaaGroup;
} catch (Exception e) {
LOGGER.info("Failed to update the Group: {}", e.getMessage());
}
}
return uaaGroup;
}
@Override
public UaaGroup handleGroup(GroupRequest request) {
UaaGroup uaaGroup = this.createGroup();
LOGGER.info("handleGroup() createdGroup: {}", uaaGroup);
UaaGroupList uaaGroupList = uaaService.listUaaGroups(); // the newly created group does not show up here
String groupId = "";
if (uaaGroup != null) {
for (UaaGroupList.Resources resources : uaaGroupList.getResources()) {
if (uaaGroup.getDisplayName().equals(resources.getDisplayName())) {
groupId = resources.getId();
LOGGER.info("groupId: {}", groupId);
}
}
}
// if (Strings.isNullOrEmpty(groupId)) {
// UaaGroup uaagroup = createGroup();
// uaaGroupList = uaaService.listUaaGroups();
// if (uaaGroupList != null) {
// for (UaaGroupList.Resources resources : uaaGroupList.getResources()) {
// if (uaagroup.getDisplayName().equals(resources.getDisplayName())) {
// groupId = resources.getId();
// LOGGER.info("Uaa User Group Id found: {}", groupId);
// }
// }
// if (Strings.isNullOrEmpty(groupId)) {
// // this should never happen...
// LOGGER.error("Failed to create UAA Group : {}");
// }
// }
// }
// }
if (!Strings.isNullOrEmpty(groupId)) {
LOGGER.info("grupId:{}", groupId);
uaaGroup = updateGroup(groupId, request);
LOGGER.info("upfatedGroup : {}", uaaGroup);
return uaaGroup;
}
return uaaGroup; // every time I only get the object returned by createGroup here
}
When creating a group, I first create it with a random UUID as the display name, and then use that UUID to look up the groupId. The problem is that after creating the group I cannot see the newly created group in the list of groups.
In the handleGroup() method I always get the created group object back, but that created group does not show up in the list of groups.
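It is not clear from the code above why the new group is missing from the listing. If the UAA /Groups listing is eventually consistent — which is only an assumption, not something the post confirms — one option is to retry the lookup a few times before giving up. A sketch under that assumption, reusing the question's uaaService and group types (the retry count and sleep are illustrative values):
// Sketch only: retry the group lookup a few times in case the listing lags behind the create call.
private String findGroupIdByDisplayName(String displayName) {
    for (int attempt = 0; attempt < 5; attempt++) {
        UaaGroupList uaaGroupList = uaaService.listUaaGroups();
        if (uaaGroupList != null) {
            for (UaaGroupList.Resources resources : uaaGroupList.getResources()) {
                if (displayName.equals(resources.getDisplayName())) {
                    return resources.getId();
                }
            }
        }
        try {
            Thread.sleep(500L); // brief pause before re-listing
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            break;
        }
    }
    return "";
}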
I am creating an AWS bucket in the 'Singapore' region. After creating it, I fetch the ACL of the bucket, and at that point it throws an exception: "(S3Exception) software.amazon.awssdk.services.s3.model.S3Exception: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint. (Service: S3, Status Code: 301)"
My code is -
Main Method
{
s3Client = awsJdkClient.createCredentialsWithAWSJdk(sLocationConstaints);
sBucket = awsJdkClient.createAWSBucket(s3Client, sTempBucketName);
AmazonS3Requests.increase_PUT_BUCKET_Requeset();
if (getDetails(sBucket)) {
objBucketOperationEvent.addMessageToSatusbar("Bucket " + sBucket + " has been created successfully.", true);
} else {
objBucketOperationEvent.addMessageToSatusbar("Bucket " + sBucket + " has not been created.", true);
return false;
}
}
public GetBucketAclResponse getBucketACL( String bucketName) {
try {
GetBucketAclRequest getBucketAclRequest = GetBucketAclRequest.builder().bucket(bucketName).build();
GetBucketAclResponse bucketAclS3Client = this.s3Client.getBucketAcl(getBucketAclRequest); // the exception is thrown here
return bucketAclS3Client;
} catch (S3Exception e) {
throw e;
}
}
private boolean getDetails(String bucketname) {
try {
objBucketExplorer.writeDebug("CreateBucket: getDetails() is invoked");
if (awsJdkClient == null) {
objBucketExplorer.writeDebug("CreateBucket-while getting detail: return unsuccessfully from getDetails() due to objService found null");
return false;
}
if (awsJdkClient.getBucketACL(bucketname) == null) {
AmazonS3Requests.increase_GET_BUCKET_Requeset();
return false;
} else {
AmazonS3Requests.increase_GET_BUCKET_Requeset();
return true;
}
} catch (S3Exception ex) {
String stackNum = Utility.exceptionHandler(ex);
objBucketExplorer.writeDebug("CreateBucket: return unsuccessfully from getDetails() :Error " + ex.getMessage() + " Stacktrace " + stackNum);
return false;
}
}
public S3Client createCredentialsWithAWSJdk(String regionString) {
try {
AwsBasicCredentials awsCreds = AwsBasicCredentials.create(
ACCESS_KEY,
SECRET_KEY);
S3ClientBuilder s3ClientBuilder = S3Client.builder().credentialsProvider(StaticCredentialsProvider.create(awsCreds));
if (regionString != null && !regionString.isEmpty()) {
Region region = Region.of(regionString);
s3ClientBuilder.region(region);
}
S3Client s3Client = s3ClientBuilder.build();
return s3Client;
} catch (S3Exception ex) {
throw ex;
}
}
public String createAWSBucket(S3Client s3Client, String bucketName) {
try {
S3Waiter s3Waiter = s3Client.waiter();
CreateBucketRequest bucketRequest = CreateBucketRequest.builder()
.bucket(bucketName)
.build();
s3Client.createBucket(bucketRequest);
HeadBucketRequest bucketRequestWait = HeadBucketRequest.builder()
.bucket(bucketName)
.build();
// Actually wait until the bucket exists before returning (the waiter was created but never used).
s3Waiter.waitUntilBucketExists(bucketRequestWait);
return bucketName;
} catch (S3Exception e) {
if(e.awsErrorDetails().errorCode().equalsIgnoreCase("BucketAlreadyOwnedByYou")){
return bucketName;
}
throw e;
}
}
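For completeness, the 301 in the error above generally means the request was sent to an endpoint in a different region from the one the bucket lives in. Below is a minimal, hedged sketch (not the original code) that pins both the S3Client and the bucket's location constraint to ap-southeast-1 (Singapore); credentials and error handling are left out, and the bucket name is a placeholder.
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.BucketLocationConstraint;
import software.amazon.awssdk.services.s3.model.CreateBucketConfiguration;
import software.amazon.awssdk.services.s3.model.CreateBucketRequest;
import software.amazon.awssdk.services.s3.model.GetBucketAclRequest;

public class SingaporeBucketSketch {
    public static void main(String[] args) {
        String bucketName = "my-example-bucket"; // placeholder name

        // Client pinned to the bucket's region.
        S3Client s3Client = S3Client.builder()
                .region(Region.AP_SOUTHEAST_1)
                .build();

        // Create the bucket with an explicit location constraint for the same region.
        s3Client.createBucket(CreateBucketRequest.builder()
                .bucket(bucketName)
                .createBucketConfiguration(CreateBucketConfiguration.builder()
                        .locationConstraint(BucketLocationConstraint.AP_SOUTHEAST_1)
                        .build())
                .build());

        // The ACL call now goes to the same regional endpoint as the bucket.
        s3Client.getBucketAcl(GetBucketAclRequest.builder().bucket(bucketName).build());
    }
}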
I'm trying to find a way to download a file from an API without window.open().
I'd like to get an instant download when calling the API.
Currently I download the .xls file generated by a REST API using window.open().
API Endpoint
@GetMapping("/applications/export")
@Timed
public ResponseEntity<byte[]> exportApplicationsList() {
log.debug("REST request to export applications list");
byte[] result = applicationService.generateApplicationsListAsExcel();
if (result == null) {
return ResponseEntity.status(500).build();
}
String date = LocalDateTime.now().format(DateTimeFormatter.ofPattern("dd_MM_yyyy_HH_mm"));
return ResponseEntity.ok()
.header("Content-Disposition", "attachment; filename=liste_applications_" + date + ".xls")
.contentLength(result.length)
.contentType(MediaType.APPLICATION_OCTET_STREAM)
.body(result);
}
Service
/**
* Generate an xls file from the applications list.
*
* @return the generated file as a byte array, or null if generation fails
*/
public byte[] generateApplicationsListAsExcel() {
log.info("Génération fichier xls de la liste des applications");
List<Application> applications = applicationRepository.findAll();
Collections.sort(applications);
try (InputStream is = new FileInputStream(ResourceUtils.getFile("classpath:jxls-templates/liste_applications_template.xls"))) {
try (ByteArrayOutputStream os = new ByteArrayOutputStream()) {
Context context = new Context();
context.putVar("applications", applications);
JxlsHelper.getInstance().processTemplate(is, os, context);
return os.toByteArray();
} catch (IOException e) {
log.error(e.toString());
}
} catch (IOException e) {
log.error(e.toString());
}
return null;
}
Invocation
exportApplicationsList(): void {
window.open('/api/applications/export');
}
You can return the file as a blob from the backend and then use file-saver to download it:
this.http.get(`/api/applications/export`, { params, responseType: 'blob' })
.subscribe((resp: any) => {
saveAs(resp, `${filename}.xlsx`)
});
Quick solution: window.location.href = url;
I used file-saver for this; I think it will fulfill your needs.
this.filesService.getDownloadFile(id).subscribe(
data => {
importedSaveAs(data, name);
},
err => {
console.error(err);
});
For the backend:
@GetMapping("download-file/{id}")
public ResponseEntity<?> downloadFile(@PathVariable(value = "id") Long id) {
final Optional<FileEntity> file = fileRepository.findById(id);
if (!file.isPresent()) {
return ResponseEntity.badRequest().body(getErrorResponse("File not found"));
}
ByteArrayOutputStream downloadInputStream = amazonClient.downloadFile(file.get().getLink());
return ResponseEntity.ok()
.contentType(contentType(file.get().getName()))
.header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + file.get().getName() + "\"")
.body(downloadInputStream.toByteArray());
}
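The contentType(...) helper used above isn't shown in the post. Purely as an illustration (a hypothetical helper, not the author's code), one way to write such a method with Spring's MediaTypeFactory, falling back to application/octet-stream for unknown extensions:
import org.springframework.http.MediaType;
import org.springframework.http.MediaTypeFactory;

// Hypothetical helper: derive the media type from the file name, with a safe fallback.
private MediaType contentType(String fileName) {
    return MediaTypeFactory.getMediaType(fileName)
            .orElse(MediaType.APPLICATION_OCTET_STREAM);
}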
I have a Java application connecting to Linux over SFTP. I'm using JSch as the library. Now, let's say I need to rename a file.
public boolean rename(String name) {
boolean result = false;
channelSftp = (ChannelSftp) channel;
LsEntry currentFile = //here I have an LsEntry object pointing to the specific record;
logger.info("Renaming CRC file " + currentFile.getFilename() + " to " + name);
try {
//For the first parameter I take the current position in the directory and append the filename of the currently processed file.
//For the second parameter I take the current position in the directory and append the new name.
channelSftp.rename(channelSftp.pwd() + "/" + currentFile.getFilename(), channelSftp.pwd() + "/" + name);
result = true;
} catch (Exception e) {
logger.error("Error renaming crc file to " + name, e);
result = false;
}
return result;
}
Now, after renaming the file on the filesystem, I also need to rename the file in the current LsEntry object I'm working with. The problem is that LsEntry doesn't provide such a method, so I have to load it again. How do I look it up? I need to find the specific file so I can use it as the updated LsEntry object for later use. Is that possible?
EDIT1:
The LsEntry object, which represents an entry on the filesystem, has to be created somehow; I do that by casting the elements of the returned Vector, like so:
System.out.println("searching for files in following directory" + directory);
channelSftp.cd(directory);
Vector foundFiles = channelSftp.ls(directory);
for(int i=2; i<foundFiles.size();i++){
LsEntry files = (LsEntry) foundFiles.get(i);
System.out.println("found file: " + files.getFilename());
System.out.println("Found file with details : " + files.getLongname());
System.out.println("Found file on path: " + channelSftp.pwd());
channelSftp.rename(channelSftp.pwd() + files.getFilename(), channelSftp.pwd() + "picovina");
//LsEntry has now old name.
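One simple way to get a fresh LsEntry after the rename (a sketch, not from the original post) is to ls() the new path again and cast the result back; this assumes channelSftp is the connected ChannelSftp from the snippet above and newName is the file's new name:
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.SftpException;
import java.util.Vector;

// Re-list just the renamed file so the caller gets an up-to-date LsEntry for it.
private ChannelSftp.LsEntry refreshEntry(ChannelSftp channelSftp, String newName) throws SftpException {
    String directory = channelSftp.pwd();
    Vector<?> matches = channelSftp.ls(directory + "/" + newName); // ls on the exact new path
    return matches.isEmpty() ? null : (ChannelSftp.LsEntry) matches.get(0);
}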
public class SftpClient {
private final JSch jSch;
private Session session;
private ChannelSftp channelSftp;
private boolean connected;
public SftpClient() { this.jSch = new JSch(); }
public void connect(final ConnectionDetails details) throws ConnectionException {
try {
if (details.usesDefaultPort()) {
session = jSch.getSession(details.getUserName(), details.getHost());
} else {
session = jSch.getSession(details.getUserName(), details.getHost(), details.getPort());
}
channelSftp = createSftp(session, details.getPassword());
channelSftp.connect();
connected = session.isConnected();
} catch (JSchException e) {
throw new ConnectionException(e.getMessage());
}
}
public void disconnect() {
if (connected) {
channelSftp.disconnect();
session.disconnect();
}
}
public void cd(final String path) throws FileActionException {
try {
channelSftp.cd(path);
} catch (SftpException e) {
throw new FileActionException(e.getMessage());
}
}
public List<FileWrapper> list() throws FileActionException {
try {
return collectToWrapperList(channelSftp.ls("*"));
} catch (SftpException e) {
throw new FileActionException(e.getMessage());
}
}
public String pwd() throws FileActionException {
try {
return channelSftp.pwd();
} catch (SftpException e) {
throw new FileActionException(e.getMessage());
}
}
public boolean rename(final FileWrapper wrapper, final String newFileName) throws FileActionException {
try {
String currentPath = channelSftp.pwd();
channelSftp.rename(currentPath + "/" + wrapper.getFileName(), currentPath + "/" + newFileName); // join with an explicit path separator
} catch (SftpException e) {
throw new FileActionException(e.getMessage());
}
return true;
}
private List<FileWrapper> collectToWrapperList(Vector<ChannelSftp.LsEntry> entries) {
return entries.stream()
.filter(entry -> !entry.getAttrs().isDir())
.map(entry -> FileWrapper.from(entry.getAttrs().getMtimeString(), entry.getFilename(), entry.getAttrs().getSize()))
.collect(Collectors.toList());
}
private ChannelSftp createSftp(final Session session, final String password) throws JSchException {
session.setPassword(password);
Properties properties = new Properties();
properties.setProperty("StrictHostKeyChecking", "no");
session.setConfig(properties);
session.connect();
return (ChannelSftp) session.openChannel("sftp");
}
}
Note here that the list method effectively returns a list of FileWrapper objects instead of LsEntry objects.
public class FileWrapper {
private static final String TIME_FORMAT = "EEE MMM dd HH:mm:ss zzz yyyy";
private Date timeStamp;
public Date getTimeStamp() { return timeStamp; }
private String fileName;
public String getFileName() { return fileName; }
private Long fileSize;
public Long getFileSize() { return fileSize; }
private FileWrapper(String timeStamp, String fileName, Long fileSize) throws ParseException {
this.timeStamp = new SimpleDateFormat(TIME_FORMAT).parse(timeStamp);
this.fileName = fileName;
this.fileSize = fileSize;
}
public static FileWrapper from(final String timeStamp, final String fileName, final Long fileSize) {
try {
return new FileWrapper(timeStamp, fileName, fileSize);
} catch (ParseException e) {
e.printStackTrace();
}
return null;
}
}
With this, you can easily list the remote directory and get all the files' attributes.
With that on hand you can simply invoke SftpClient#rename and rename the file you want.
I know that you want to avoid refactoring, but given the very tight nature of LsEntry, as well as the fact that the library still uses Vector and the like, I think this is the best way to go (you'll avoid headaches in the future).
I know this may not be 100% the answer you expect, but I think it will be helpful for you.
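For reference, a short usage sketch of the SftpClient and FileWrapper classes above. The host, credentials and paths are placeholders, and the ConnectionDetails constructor shown is assumed (only its getters appear in the code above):
import java.util.List;

// Illustrative usage only; ConnectionException and FileActionException come from the code above.
public static void main(String[] args) throws ConnectionException, FileActionException {
    SftpClient client = new SftpClient();
    client.connect(new ConnectionDetails("user", "password", "sftp.example.com", 22)); // assumed constructor
    try {
        client.cd("/upload/incoming");
        List<FileWrapper> files = client.list();
        for (FileWrapper file : files) {
            System.out.println(file.getFileName() + " (" + file.getFileSize() + " bytes)");
        }
        if (!files.isEmpty()) {
            // Rename the first listed file; the FileWrapper identifies which entry to rename.
            client.rename(files.get(0), "renamed-file.crc");
        }
    } finally {
        client.disconnect();
    }
}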
I am trying to access an S3 bucket. I am able to do so from my local machine (i.e. from my local machine to the S3 bucket), but I get an access denied error when trying to access it from an EC2 instance running Tomcat 8 and Java 8.
Also, when I upload the file, the permissions are set for the root user; but if I keep my bucket public and upload the file from EC2, the permissions are not set for the root user.
public class AmazonS3UtilService {
public static final String NAME = "amazonS3Util";
private static String S3_SECRET = "S3_SECRET";
private static String S3_ID = "S3_ID";
private static String BUCKET_NAME = "S3_BUCKET";
private static final String SUFFIX = "/";
private static final String DEFAULT_FOLDER_PATH = "PHR/Reports/";
@Autowired
protected Environment props;
private AWSCredentials awsCredentials = null;
private AmazonS3 s3Client = null;
private String bucketName = null;
private static final Logger log = Logger.getLogger(AmazonS3UtilService.class);
private void prepareAWSCredentials() {
if (awsCredentials == null) {
log.info("Preparing AWS Credentials");
awsCredentials = new AWSCredentials() {
@SuppressWarnings("unused")
Map<String, String> env = System.getenv();
public String getAWSSecretKey() {
String S3_SECRET = System.getProperty(AmazonS3UtilService.S3_SECRET);
if (S3_SECRET == null) {
S3_SECRET = System.getenv(AmazonS3UtilService.S3_SECRET);
if (S3_SECRET == null) {
S3_SECRET = props.getProperty(AmazonS3UtilService.S3_SECRET);
}
}
log.info("S3_SECRET ---->" + S3_SECRET);
return S3_SECRET;
}
public String getAWSAccessKeyId() {
String S3_ID = System.getProperty(AmazonS3UtilService.S3_ID);
if (S3_ID == null) {
S3_ID = System.getenv(AmazonS3UtilService.S3_ID);
if (S3_ID == null) {
S3_ID = props.getProperty(AmazonS3UtilService.S3_ID);
}
}
log.info("S3_ID ---->" + S3_ID);
return S3_ID;
}
};
}
}
private void prepareAmazonS3Client() {
if (s3Client == null) {
log.info("Preparing S3 Client");
ClientConfiguration clientCfg = new ClientConfiguration();
clientCfg.setProtocol(Protocol.HTTP);
s3Client = new AmazonS3Client(awsCredentials, clientCfg);
Region region = Region.getRegion(Regions.fromName(props.getProperty("s3client.region")));
log.info("Region ----->" + props.getProperty("s3client.region"));
s3Client.setRegion(region);
}
}
private void prepareBucketName() {
bucketName = System.getenv(AmazonS3UtilService.BUCKET_NAME);
log.info("bucketName ------>" + bucketName);
}
private void prepare() {
try {
awsCredentials = null;
prepareAWSCredentials();
prepareAmazonS3Client();
prepareBucketName();
} catch (AmazonServiceException ase) {
log.error("Caught an AmazonServiceException, which means your request made it "
+ "to Amazon S3, but was rejected with an error response for some reason.");
log.error("Error Message: " + ase.getMessage() + " HTTP Status Code: " + ase.getStatusCode()
+ " AWS Error Code: " + ase.getErrorCode() + " Error Type: " + ase.getErrorType()
+ " Request ID: " + ase.getRequestId());
throw new AmazonS3ClientException(ase, ase.getMessage());
} catch (AmazonClientException ace) {
log.error(ace);
throw new AmazonS3ClientException(ace, ace.getMessage());
}
}
@SuppressWarnings("unused")
public String uploadDocument(UploadDocumentDetailDTO uploadDocumentDetail) {
prepare();
String tempFileName = new SimpleDateFormat("yyyy-MM-dd hh-mm-ss").format(new Date());
String fileURL = null;
try {
File uploadFileContent = readBase64File(uploadDocumentDetail.getFileContent(), tempFileName);
uploadDocumentDetail.setContentType(FileContentTypeEnum.PDF);
String uploadFileName = getUploadFileName(uploadDocumentDetail);
PutObjectRequest request = new PutObjectRequest(bucketName, uploadFileName, uploadFileContent);
request.putCustomRequestHeader("Content-Type", "application/pdf");
request.putCustomRequestHeader("Content-Disposition", "inline");
PutObjectResult putObjectResult = s3Client.putObject(request);
URL url = generatePresignedUrlRequest(uploadFileName);
fileURL = url.toString();
} catch (Exception e) {
log.info(LoggerException.printException(e));
fileURL = "";
}
return fileURL;
}
public URL generatePresignedUrlRequest(String fileURL) {
log.info("Inside generatePresignedUrlRequest");
java.util.Date expiration = new java.util.Date();
long msec = expiration.getTime();
msec += 1000 * 60 * 60; // 1 hour.
expiration.setTime(msec);
GeneratePresignedUrlRequest generatePresignedUrlRequest = new GeneratePresignedUrlRequest(bucketName, fileURL);
generatePresignedUrlRequest.setMethod(HttpMethod.GET); // Default.
generatePresignedUrlRequest.setExpiration(expiration);
URL s = s3Client.generatePresignedUrl(generatePresignedUrlRequest);
log.info("Url --->" + s);
return s;
}
private String getUploadFileName(UploadDocumentDetailDTO uploadDocumentDetail) {
StringBuffer uploadFileName = new StringBuffer();
uploadFileName.append(DEFAULT_FOLDER_PATH);
if (uploadDocumentDetail.getBeneficiaryId() != null)
uploadFileName.append(uploadDocumentDetail.getBeneficiaryId() + SUFFIX);
if (uploadDocumentDetail.getDocumentType() != null)
uploadFileName.append(uploadDocumentDetail.getDocumentType().getName() + SUFFIX);
// Check and create Folder
validateAndCreateFolder(uploadFileName.toString());
if (uploadDocumentDetail.getAssesmentId() != null)
uploadFileName.append(
uploadDocumentDetail.getAssesmentId() + "." + uploadDocumentDetail.getContentType().getName());
else
uploadFileName.append(
uploadDocumentDetail.getDefaultFileName() + "." + uploadDocumentDetail.getContentType().getName());
return uploadFileName.toString();
}
private static File readBase64File(String content, String fileName) throws Exception {
File file = File.createTempFile(fileName, ".tmp");
file.deleteOnExit();
FileOutputStream fileOutputStream = new FileOutputStream(file);
fileOutputStream.write(Base64.decodeBase64(content));
fileOutputStream.close();
return file;
}
public void validateAndCreateFolder(String folderName) {
List<S3ObjectSummary> fileList = null;
try {
fileList = s3Client.listObjects(bucketName, folderName).getObjectSummaries();
} catch (AmazonServiceException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (AmazonClientException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
if (fileList == null || fileList.isEmpty()) {
// create meta-data for your folder and set content-length to 0
ObjectMetadata metadata = new ObjectMetadata();
metadata.setContentLength(0);
// create empty content
InputStream emptyContent = new ByteArrayInputStream(new byte[0]);
// create a PutObjectRequest passing the folder name suffixed by /
PutObjectRequest putObjectRequest = new PutObjectRequest(bucketName, folderName, emptyContent, metadata);
// send request to S3 to create folder
s3Client.putObject(putObjectRequest);
}
}
/**
* This method first deletes all the files in the given folder and then the
* folder itself.
*/
}
Following is the exception when accessing S3 from the EC2 instance:
INFO com.medscheme.common.util.AmazonS3UtilService - com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 926E1213366626B9), S3 Extended Request ID: zQbb4JCalYExHZtDSv0GmWxoHrQZJUV3M+jlUiaVJY/sDxW/qoNFC8hizfangVCjweWZtOqC7/A=
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1275)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:873)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:576)
at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:362)
at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:328)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:307)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3649)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3602)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:679)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:664)
at com.medscheme.common.util.AmazonS3UtilService.validateAndCreateFolder(AmazonS3UtilService.java:222)
at com.medscheme.common.util.AmazonS3UtilService.getUploadFileName(AmazonS3UtilService.java:200)
at com.medscheme.common.util.AmazonS3UtilService.uploadDocument(AmazonS3UtilService.java:166)
at com.medscheme.service.impl.ReportsServiceImpl.getReport(ReportsServiceImpl.java:133)
at com.medscheme.service.impl.ReportsServiceImpl.getReport(ReportsServiceImpl.java:1)
at com.medscheme.controller.ReportsController.getWellnessReportDetails(ReportsController.java:69)
I was able to resolve the issue by using the BasicAWSCredentials class instead of AWSCredentials when creating the Amazon client.
The problem only occurred on the EC2 instance.
Does anybody know what was going wrong on EC2?
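The post doesn't show the final code, but a minimal sketch of the described change (method and parameter names are illustrative) using BasicAWSCredentials with the same ClientConfiguration and region handling as before might look like this:
import com.amazonaws.ClientConfiguration;
import com.amazonaws.Protocol;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3Client;

// Sketch: build the client from concrete BasicAWSCredentials instead of the anonymous AWSCredentials above.
private AmazonS3Client createS3Client(String accessKey, String secretKey, String regionName) {
    BasicAWSCredentials basicCredentials = new BasicAWSCredentials(accessKey, secretKey);
    ClientConfiguration clientCfg = new ClientConfiguration();
    clientCfg.setProtocol(Protocol.HTTP);
    AmazonS3Client client = new AmazonS3Client(basicCredentials, clientCfg);
    client.setRegion(Region.getRegion(Regions.fromName(regionName)));
    return client;
}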