AmazonS3Exception: request signature error only in linux remote server? - java

I wrote a Java program to list all the buckets and upload a file to an S3-compatible object storage service.
The program works fine on my local Windows machine, but when I transfer the runnable jar to the remote Linux server (after changing the path of the file to be uploaded, of course) and execute it, I get the following error:
> Exception in thread "main"
> com.amazonaws.services.s3.model.AmazonS3Exception: The request
> signature we calculated does not match the signature you provided.
> Check your AWS Secret Access Key and signing method. For more
> information, see REST Authentication and SOAP Authentication for
> details. (Service: Amazon S3; Status Code: 403; Error Code:
> SignatureDoesNotMatch; Request ID:
> 4e271b5b-d7f5-42b3-a4ad-886988bcb785; S3 Extended Request ID: null),
> S3 Extended Request ID: null
The issue seems to be in the second half of the program: the list of buckets is returned correctly in the Linux environment as well, but the file upload throws the error.
import java.io.File;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.S3ClientOptions;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.services.s3.model.Bucket;

/**
 * List your Amazon S3 buckets.
 */
public class ListBuckets
{
    private static void listObjects(AmazonS3 s3) {
        List<Bucket> buckets = s3.listBuckets();
        System.out.println("Your Amazon S3 buckets are:");
        for (Bucket b : buckets) {
            System.out.println("* " + b.getName());
        }
    }

    private static void putObject(AmazonS3 s3, String bucketName, String objectName, String pathName) throws Exception
    {
        s3.putObject(bucketName, objectName, new File(pathName));
    }

    private static void time(String t) {
        DateFormat dateFormat = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        Date date = new Date();
        System.out.println(t + "-->" + dateFormat.format(date));
    }

    public static void main(String[] args) throws Exception
    {
        final String accessKey = "XXXXXXXXXXXXXX";
        final String secretKey = "XXXXXXXXXXXXXXXXXXXXXXXX";
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        @SuppressWarnings("deprecation")
        final AmazonS3 s3 = new AmazonS3Client(credentials);
        S3ClientOptions opts = new S3ClientOptions().withPathStyleAccess(true);
        s3.setS3ClientOptions(opts);
        s3.setEndpoint("https://XXXXXX.com");
        ListBuckets.time("startTime");
        ListBuckets.listObjects(s3);
        //String pathName = "C:\\Users\\XXXXXX\\Documents\\New folder\\New Text Document - Copy.txt";
        String pathName = "/home/abcd/XXXXX/objectStorage/CHANGELOG.mdown";
        ListBuckets.putObject(s3, "snap-shot/sample-aws-ex", pathName, pathName);
        ListBuckets.time("end time");
    }
}

Unbelievable!
You know what the issue was on Linux? The object name and the path name are two different things.
putObject(AmazonS3 s3, String bucketName, String objectName, String pathName)
where pathName is the path of your file, i.e.
String pathName = "/home/abc/xxxxx/objectStorage/errorlog.txt";
Notice it starts with a forward slash, whereas the object name must not start with /, i.e.
String objectName = "home/abc/xxxxxx/objectStorage/errorlog.txt";
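To make it concrete, here is a minimal sketch of the corrected call from the question's main method (the key shown is hypothetical; any key without a leading slash works):

// Pass the S3 key and the filesystem path as separate arguments,
// instead of reusing pathName for both.
String objectName = "objectStorage/CHANGELOG.mdown"; // hypothetical key, no leading slash
ListBuckets.putObject(s3, "snap-shot/sample-aws-ex", objectName, pathName);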
I wish the exception thrown had given better clarity about what was wrong. The exception only steered me away from the root cause.

Related

When calling export on Google Docs with the v3 API using service account impersonation, the viewedByMeTime timestamp is updated

I am using a service account with impersonation to access the Google Docs files of users in my enterprise Google account.
See:
https://developers.google.com/drive/api/v3/about-auth#OAuth2Authorizing
So far so good.
Then I need to download the contents of the Google Docs.
When calling the Google Drive API to download the contents of a Google Doc, the documentation says to use the export method described here:
https://developers.google.com/drive/api/v3/manage-downloads
Here is a Java program that should reproduce the problem:
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.client.util.SecurityUtils;
import com.google.api.services.drive.Drive;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.Arrays;
import java.util.List;

public class FetchGoogleDocContentsWithServiceAccount {
    static int readTimeout = 60000;
    static int connectTimeout = 60000;
    static String serviceAccountId = "";
    static String serviceAccountEmail = "";
    static String serviceAccountPrivateKeyFile = "";
    static String serviceAccountPrivateKeyFilePassword = "";
    static String fileId = "";
    static JacksonFactory jacksonFactory = new JacksonFactory();
    static NetHttpTransport httpTransport = new NetHttpTransport();
    static List<String> googleScopeList = Arrays.asList(
            "https://www.googleapis.com/auth/drive.readonly",
            "https://www.googleapis.com/auth/admin.directory.group.readonly",
            "https://www.googleapis.com/auth/admin.directory.user.alias.readonly",
            "https://www.googleapis.com/auth/admin.directory.group",
            "https://www.googleapis.com/auth/admin.directory.user",
            "https://www.googleapis.com/auth/drive");

    public static void main(String[] args) throws Exception {
        Drive drive = (new Drive.Builder(httpTransport,
                jacksonFactory,
                getRequestInitializer(getGoogleCredentials())))
                .setApplicationName("Sample app").build();
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        drive.files().export(fileId, "application/vnd.google-apps.document")
                .executeMediaAndDownloadTo(baos);
        System.out.println(baos.toString("UTF-8"));
    }

    public static HttpRequestInitializer getRequestInitializer(final GoogleCredential requestInitializer) {
        return httpRequest -> {
            requestInitializer.initialize(httpRequest);
            httpRequest.setConnectTimeout(connectTimeout);
            httpRequest.setReadTimeout(readTimeout);
        };
    }

    public static GoogleCredential getGoogleCredentials() {
        GoogleCredential credential;
        try {
            GoogleCredential.Builder b = new GoogleCredential.Builder().setTransport(httpTransport)
                    .setJsonFactory(jacksonFactory).setServiceAccountId(serviceAccountId)
                    .setServiceAccountPrivateKey(SecurityUtils.loadPrivateKeyFromKeyStore(SecurityUtils.getPkcs12KeyStore(),
                            new FileInputStream(new File(serviceAccountPrivateKeyFile)), serviceAccountPrivateKeyFilePassword,
                            "privatekey", serviceAccountPrivateKeyFilePassword))
                    .setServiceAccountScopes(googleScopeList);
            if (serviceAccountEmail != null) {
                b = b.setServiceAccountUser(serviceAccountEmail);
            }
            credential = b.build();
        } catch (IOException | GeneralSecurityException e1) {
            throw new RuntimeException("Could not build client secrets", e1);
        }
        return credential;
    }
}
When I perform this operation, we see that the viewedByMeTime field is actually updated as the impersonated user.
This is not good, because now people think someone might have stolen access to their account. They are going to open tickets with the security team.
Is this expected? How can I make this stop? Is there another method in the API I can call to download the google docs without updating this timestamp?
I also opened a ticket on GitHub for the Google Drive Java SDK: https://github.com/googleapis/google-api-java-client-services/issues/3160
Updating the viewedByMeTime field upon calling the endpoint is indeed intended behaviour. Any action performed through the API is treated the same way as if the user had done it manually (i.e. that field would also be updated when the user visits the document through the UI).
When using domain-wide delegation (or "user impersonation"), you have no way to avoid this issue.
The only workaround would be to give the service account access to this file and let it export the file without domain-wide delegation. The viewedByMeTime field will then be updated only for the service account itself, not for the original owner of the file (or any other user having access to it).
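For illustration, a minimal sketch of that workaround using the question's own helper fields (this assumes the file has already been shared with the service account's email; the only change from the question's code is dropping setServiceAccountUser):

// Sketch: build the credential WITHOUT impersonation. The export then runs as
// the service account itself, so only its own viewedByMeTime is touched.
GoogleCredential credential = new GoogleCredential.Builder()
        .setTransport(httpTransport)
        .setJsonFactory(jacksonFactory)
        .setServiceAccountId(serviceAccountId)
        .setServiceAccountPrivateKey(SecurityUtils.loadPrivateKeyFromKeyStore(
                SecurityUtils.getPkcs12KeyStore(),
                new FileInputStream(new File(serviceAccountPrivateKeyFile)),
                serviceAccountPrivateKeyFilePassword, "privatekey",
                serviceAccountPrivateKeyFilePassword))
        .setServiceAccountScopes(googleScopeList)
        // No setServiceAccountUser(...): no domain-wide delegation is involved.
        .build();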

Why is withRegion() of AmazonSNSClientBuilder not visible?

I am writing code in Eclipse to create an Amazon Web Services SNS client, and I get an error saying
The method withRegion(Region) from the type
AwsClientBuilder is not visible
Here is my code
package com.amazonaws.samples;

import java.util.Date;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.sns.AmazonSNS;
import com.amazonaws.services.sns.AmazonSNSClient;
import com.amazonaws.services.sns.AmazonSNSClientBuilder;
import com.amazonaws.services.sns.model.CreateTopicRequest;
import com.amazonaws.services.sns.model.CreateTopicResult;
import com.amazonaws.services.sns.model.PublishRequest;

// Example SNS Sender
public class Main {
    // AWS credentials -- replace with your credentials
    static String ACCESS_KEY = "<Your AWS Access Key>";
    static String SECRET_KEY = "<Your AWS Secret Key>";

    // Sender loop
    public static void main(String... args) throws Exception {
        // Create a client
        AWSCredentials awsCred = new AnonymousAWSCredentials();
        AWSStaticCredentialsProvider cred = new AWSStaticCredentialsProvider(awsCred);
        Region region = Region.getRegion(Regions.US_EAST_1);
        AmazonSNS service = AmazonSNSClientBuilder.standard().withRegion(region).withCredentials(cred).build(); // Error message: The method withRegion(Region) from the type AwsClientBuilder<AmazonSNSClientBuilder,AmazonSNS> is not visible

        // Create a topic
        CreateTopicRequest createReq = new CreateTopicRequest()
                .withName("MyTopic3");
        CreateTopicResult createRes = service.createTopic(createReq);

        for (;;) {
            // Publish to a topic
            PublishRequest publishReq = new PublishRequest()
                    .withTopicArn(createRes.getTopicArn())
                    .withMessage("Example notification sent at " + new Date());
            service.publish(publishReq);
            Thread.sleep(1000);
        }
    }
}
(Screenshot omitted: Eclipse marks the failing withRegion call with a dotted red underline.)
What should I check to correct this?
You are passing the wrong parameter: withRegion takes either a String or a Regions (note: Regions, not the singular Region).
Try passing Regions.EU_WEST_1.
Both
AmazonSNSClientBuilder.standard().withRegion(Regions.EU_WEST_1).build();
and
AmazonSNSClientBuilder.standard().withRegion("eu-west-1").build();
work fine for me.
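Applied to the question's code, a minimal sketch of the fix (keeping the question's US_EAST_1 region and credentials provider) would be:

// Pass the Regions enum (or a region string) instead of a Region object.
AmazonSNS service = AmazonSNSClientBuilder.standard()
        .withRegion(Regions.US_EAST_1)
        .withCredentials(cred)
        .build();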

Unable to move file from one folder to another in S3

I want to move a file inside an S3 folder to another folder in the same S3 bucket. I tried the code below:
CopyObjectRequest copyObjRequest = new CopyObjectRequest(bucketName,
        srcFolder + "/" + Filename, bucketName,
        targetFolder + "/" + Filename);
s3Client.copyObject(copyObjRequest);

DeleteObjectRequest deleteObjRequest = new DeleteObjectRequest(bucketName,
        srcFolder + "/" + Filename);
s3Client.deleteObject(deleteObjRequest);
The folder may contain multiple files; I want to move only the selected file. The above code does not show any error, but nothing happens. Can anyone suggest the right solution?
A good initial stab would be to just run the following code and check what the output is, without any deletions.
Also worth checking the ACL and bucket policy on the object.
This is the expected format:
CopyObjectRequest(java.lang.String sourceBucketName, java.lang.String sourceKey, java.lang.String destinationBucketName, java.lang.String destinationKey)
If you want a copy of the object in the same bucket
CopyObjectRequest copyObjRequest = new CopyObjectRequest("myBucket", "myObject.txt", "myBucket", "myNewObject.txt");
s3Client.copyObject(copyObjRequest);
If you want a copy of the object in a different bucket
CopyObjectRequest copyObjRequest = new CopyObjectRequest("myBucket", "myObject.txt", "myOtherBucket", "myNewObject.txt");
s3Client.copyObject(copyObjRequest);
Sample code for testing
import java.io.IOException;

import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CopyObjectRequest;

public class CopyObjectSingleOperation {
    public static void main(String[] args) throws IOException {
        String clientRegion = "*** Client region ***";
        String bucketName = "*** Bucket name ***";
        String sourceKey = "*** Source object key *** ";
        String destinationKey = "*** Destination object key ***";

        try {
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withCredentials(new ProfileCredentialsProvider())
                    .withRegion(clientRegion)
                    .build();

            // Copy the object into a new object in the same bucket.
            CopyObjectRequest copyObjRequest = new CopyObjectRequest(bucketName, sourceKey, bucketName, destinationKey);
            s3Client.copyObject(copyObjRequest);
        }
        catch (AmazonServiceException e) {
            // The call was transmitted successfully, but Amazon S3 couldn't process
            // it, so it returned an error response.
            e.printStackTrace();
        }
        catch (SdkClientException e) {
            // Amazon S3 couldn't be contacted for a response, or the client
            // couldn't parse the response from Amazon S3.
            e.printStackTrace();
        }
    }
}
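Since S3 has no native "move" operation, a move is a copy followed by a delete of the source key. A minimal sketch on top of the sample above (reusing its s3Client, bucketName, sourceKey, and destinationKey, adding the DeleteObjectRequest import from the question, and assuming an SDK version that provides doesObjectExist):

// Complete the "move": delete the source only after verifying the copy.
s3Client.copyObject(new CopyObjectRequest(bucketName, sourceKey, bucketName, destinationKey));
if (s3Client.doesObjectExist(bucketName, destinationKey)) {
    s3Client.deleteObject(new DeleteObjectRequest(bucketName, sourceKey));
}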

How to fetch the file list from gcs?

Following Google's Getting Started guide, I use the following code to get the list of all files in a remote directory:
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import com.google.appengine.tools.cloudstorage.GcsService;
import com.google.appengine.tools.cloudstorage.GcsServiceFactory;
import com.google.appengine.tools.cloudstorage.ListItem;
import com.google.appengine.tools.cloudstorage.ListOptions;
import com.google.appengine.tools.cloudstorage.ListResult;
import com.google.appengine.tools.cloudstorage.RetryParams;

class GCSFileStorage {
    String bucket = "bucket_name";
    String remoteDirectoryPath = "remote/path";
    int fetchBlockSize = 1024 * 1024;
    GcsService gcsService =
            GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());

    List<String> list() throws IOException {
        List<String> filenames = new ArrayList<>();
        ListResult listResult = gcsService.list(bucket, ListOptions.DEFAULT);
        while (listResult.hasNext()) {
            ListItem listItem = listResult.next();
            filenames.add(listItem.getName());
        }
        return filenames;
    }
}

GCSFileStorage gcs = new GCSFileStorage();
gcs.list();
But this code fails with an exception:
java.io.IOException: com.google.appengine.tools.cloudstorage.RetriesExhaustedException:
...
Caused by: java.io.IOException: java.lang.NullPointerException
...
Caused by: java.lang.NullPointerException
at com.google.appengine.tools.cloudstorage.dev.LocalRawGcsService$BlobStorageAdapter.<init>(LocalRawGcsService.java:123)
at com.google.appengine.tools.cloudstorage.dev.LocalRawGcsService$BlobStorageAdapter.getInstance(LocalRawGcsService.java:184)
I suspect that I somehow need to authorize with GCS, and this may be the reason for the failure. However, I haven't found the proper way to initialize everything GCS needs.
As @ozarov mentioned, the client I was using is specific to App Engine. It was added through the dependency
com.google.appengine.tools:appengine-gcs-client:0.5
Instead, the REST API client should be used. Its dependency is
com.google.apis:google-api-services-storage:v1-rev44-1.20.0
Then the code to fetch the file list may look as follows:
import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.storage.Storage;
import com.google.api.services.storage.StorageScopes;
import com.google.api.services.storage.model.Objects;
import com.google.api.services.storage.model.StorageObject;
import com.google.common.collect.Lists;

import java.io.File;
import java.io.IOException;
import java.security.GeneralSecurityException;
import java.util.LinkedList;
import java.util.List;

class GCSFileStorage {
    String bucket = "bucket_name";
    String remoteDirectoryPath = "remote/path";
    Storage storage;

    public GCSFileStorage() throws GeneralSecurityException, IOException {
        storage = setupStorage();
    }

    List<String> list() throws IOException {
        List<String> allItems = new LinkedList<String>();
        Objects response = storage.objects().list(bucket).
                setPrefix(remoteDirectoryPath).execute();
        for (StorageObject obj : response.getItems()) {
            allItems.add(obj.getName());
        }
        while (response.getNextPageToken() != null) {
            String pageToken = response.getNextPageToken();
            response = storage.objects().list(bucket).
                    setPrefix(remoteDirectoryPath).setPageToken(pageToken).execute();
            for (StorageObject obj : response.getItems()) {
                allItems.add(obj.getName());
            }
        }
        return allItems;
    }

    Storage setupStorage() throws GeneralSecurityException, IOException {
        GoogleCredential credential = new GoogleCredential.Builder().
                setTransport(new NetHttpTransport()).
                setJsonFactory(new JacksonFactory()).
                setServiceAccountId("your_account_id").
                setServiceAccountScopes(
                        Lists.newArrayList(StorageScopes.DEVSTORAGE_FULL_CONTROL)).
                setServiceAccountPrivateKeyFromP12File(
                        new File("/local/path/to/private/key.p12")).
                build();
        return new Storage.Builder(new NetHttpTransport(),
                new JacksonFactory(), credential).
                setApplicationName("foo").build();
    }
}
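A hypothetical usage sketch of the class above (the bucket name and prefix are the question's placeholders):

// List everything under remote/path in the bucket and print each name.
GCSFileStorage gcs = new GCSFileStorage();
for (String name : gcs.list()) {
    System.out.println(name);
}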
How do you run this code? This GCS client is specific to App Engine and should be run by either a deployed app or locally, using the AE dev appserver or unit tests (which should configure the AE runtime environment using LocalServiceTestHelper).

Jackrabbit WebDAV Synchronization Examples?

I'm using the Jackrabbit library to communicate with a cloud storage service over the WebDAV protocol. I need a way to list all files in a specific directory and get the last-modified property, but I can't seem to find any working examples of this.
I basically need code to synchronize files from the local directory with the WebDAV URL.
import java.io.File;
import java.io.FileInputStream;

import org.apache.commons.httpclient.Credentials;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.UsernamePasswordCredentials;
import org.apache.commons.httpclient.auth.AuthScope;
import org.apache.commons.httpclient.methods.InputStreamRequestEntity;
import org.apache.commons.httpclient.methods.RequestEntity;
import org.apache.jackrabbit.webdav.client.methods.DavMethod;
import org.apache.jackrabbit.webdav.client.methods.MkColMethod;
import org.apache.jackrabbit.webdav.client.methods.PutMethod;

public class WebDavClient
{
    private String resourceUrl;
    private HttpClient client;
    private Credentials credentials;
    private DavMethod method;

    public WebDavClient(String resourceUrl, String username, String password)
            throws Exception
    {
        this.resourceUrl = resourceUrl;
        client = new HttpClient();
        credentials = new UsernamePasswordCredentials(username, password);
        client.getState().setCredentials(AuthScope.ANY, credentials);
    }

    public int upload(String fileToUpload) throws Exception
    {
        method = new PutMethod(getUpdatedWebDavPath(fileToUpload));
        RequestEntity requestEntity = new InputStreamRequestEntity(
                new FileInputStream(fileToUpload));
        ((PutMethod) method).setRequestEntity(requestEntity);
        client.executeMethod(method);
        return method.getStatusCode();
    }

    public int createFolder(String folder) throws Exception
    {
        method = new MkColMethod(getUpdatedWebDavPath(folder));
        client.executeMethod(method);
        return method.getStatusCode();
    }

    private String getUpdatedWebDavPath(String file)
    {
        // Make sure file names do not contain spaces
        return resourceUrl + "/" + new File(file).getName().replace(" ", "");
    }
}
Usage example for uploading the file Test.txt to the Backup folder:
String myAccountName = "...";
String myPassword = "...";
WebDavClient webdavUploader = new WebDavClient("https://webdav.hidrive.strato.com/users/" + myAccountName + "/Backup", myAccountName, myPassword);
webdavUploader.upload("C:\\Users\\Username\\Desktop\\Test.txt");
Here's a list of different DavMethods that could be helpful:
http://jackrabbit.apache.org/api/1.6/org/apache/jackrabbit/webdav/client/methods/package-summary.html
Please help, I've been struggling with this for so long!
Take a look at the AMES WebDAV Client code from Krusche and Partner on the EU portal. It is licensed under the GPL, so it may fit your purpose.
https://joinup.ec.europa.eu/svn/ames-web-service/trunk/AMES-WebDAV/ames-webdav/src/de/kp/ames/webdav/WebDAVClient.java
It works for me, though to access e.g. Win32LastModifiedTime I need to get the custom namespace, e.g.
private static final Namespace WIN32_NAMESPACE = Namespace.getNamespace("Z2", "urn:schemas-microsoft-com:");
and retrieve the custom property Win32LastModifiedTime from the properties:
/*
 * Win32LastModifiedTime
 */
String win32lastmodifiedtime = null;
DavProperty<?> Win32LastModifiedTime = properties.get("Win32LastModifiedTime", WIN32_NAMESPACE);
if ((Win32LastModifiedTime != null) && (Win32LastModifiedTime.getValue() != null))
    win32lastmodifiedtime = Win32LastModifiedTime.getValue().toString();
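To come back to the original question of listing a directory with last-modified times, here is a minimal PROPFIND sketch against the same Jackrabbit 1.x client API used above. It assumes the client and resourceUrl fields from the WebDavClient class, and it uses the standard DAV:getlastmodified property rather than the Microsoft one (imports from org.apache.jackrabbit.webdav, its property subpackage, and client.methods are assumed):

// Sketch: PROPFIND at depth 1 lists the collection's immediate members
// together with the requested properties.
DavPropertyNameSet names = new DavPropertyNameSet();
names.add(DavPropertyName.GETLASTMODIFIED);
PropFindMethod propFind = new PropFindMethod(resourceUrl, names, DavConstants.DEPTH_1);
client.executeMethod(propFind);

MultiStatus multiStatus = propFind.getResponseBodyAsMultiStatus();
for (MultiStatusResponse response : multiStatus.getResponses()) {
    DavPropertySet props = response.getProperties(200); // properties returned with HTTP 200
    DavProperty<?> lastModified = props.get(DavPropertyName.GETLASTMODIFIED);
    System.out.println(response.getHref() + " -> "
            + (lastModified == null ? "n/a" : lastModified.getValue()));
}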
