AWS ProfileCredentialsProvider not able to get credentials - java

I am trying to upload a file to S3. The code to do so is below:
AmazonS3 s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
String key = String.format(Constants.KEY_NAME + "/%s/%s", activity_id, aFile.getName());
s3Client.putObject(Constants.BUCKET_NAME, key, aFile.getInputStream(), new ObjectMetadata());
The problem I am having is that my ProfileCredentialsProvider cannot access my AWS keys. I have set my environment variables:
AWS_ACCESS_KEY=keys go here
AWS_SECRET_KEY=keys go here
AWS_ACCESS_KEY_ID=keys go here
AWS_DEFAULT_REGION=us-east-1
AWS_SECRET_ACCESS_KEY=keys go here
And per Amazon's documentation, environment variables take precedence over any configuration files. This leads me to ask: why are my keys not being picked up from my environment variables?

Figured it out.
If you specify a ProfileCredentialsProvider(), the AWS SDK only looks in the shared credentials/config file, regardless of the usual precedence. Simply creating an S3 client like this:
AmazonS3 s3Client = new AmazonS3Client();
will let the default credential provider chain check the various credential locations, including environment variables.
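If you do want a client that reads only the environment variables, you can also pass an EnvironmentVariableCredentialsProvider explicitly. A minimal sketch (SDK for Java v1; the bucket name, key, and region below are placeholders):

import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

// Reads AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY (or the legacy AWS_ACCESS_KEY/AWS_SECRET_KEY)
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new EnvironmentVariableCredentialsProvider())
        .withRegion(Regions.US_EAST_1)               // placeholder region
        .build();

s3Client.putObject("my-bucket", "my-key", "hello");  // placeholder bucket/key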

Related

Issue uploading a file to an Amazon S3 bucket using the AWS SDK for Java v2

I am trying to upload a file to an AWS S3 Bucket using the AWS SDK 2.0 for Java, but I am getting an error when trying to do so.
software.amazon.awssdk.services.s3.model.S3Exception: The request signature we calculated does not match the signature you provided. Check your key and signing method.
I am not sure what I am missing. I have tried adding a key, but I am not even sure what I need to put in there; I think it is just a name to refer to what has been uploaded.
private S3Client s3Client;

private void upload() {
    setUpS3Client();
    PutObjectRequest putObjectRequest = PutObjectRequest.builder()
            .bucket(bucketName) // name of the bucket I am trying to upload to
            .key("testing")     // No idea what goes in here.
            .build();
    byte[] objectByteArray = getObjectByteArray(bucketRequest.getPathToFile()); // bucketRequest just holds the data that will be sent
    PutObjectResponse putObjectResponse = s3Client.putObject(putObjectRequest, RequestBody.fromBytes(objectByteArray));
}

private void setUpS3Client() {
    Region region = Region.AF_SOUTH_1;
    s3Client = S3Client.builder()
            .region(region)
            .credentialsProvider(createStaticCredentialsProvider())
            .build();
    this.s3Client = s3Client;
}
Does anyone know what this error is referring to and what I need to change to get the file to upload? Any help will be appreciated.
This Java example works fine.
You state:
key("testing") //No idea what goes in here.
.build();
The key is the name of the object to upload, for example book.pdf to upload a PDF file. All input values for the Java SDK V2 examples are documented at the start of the main method.
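For reference, here is a minimal sketch of a v2 upload with a concrete key, assuming the bucket already exists and the client region matches the bucket's region (the bucket name and file path are placeholders):

import java.nio.file.Paths;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

S3Client s3 = S3Client.builder()
        .region(Region.AF_SOUTH_1)       // must match the bucket's region
        .build();                        // default credentials provider chain

PutObjectRequest request = PutObjectRequest.builder()
        .bucket("my-bucket")             // placeholder bucket name
        .key("reports/book.pdf")         // the object's name (and optional prefix) within the bucket
        .build();

s3.putObject(request, RequestBody.fromFile(Paths.get("/path/to/book.pdf")));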
Now for your problem - make sure you have the required dependencies in the POM file. Use the POM file located here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/javav2/example_code/s3
Also, at the start of every program there is a link to the Java V2 developer guide that covers setting up your development environment, including your credentials.
The problem was my credentials. The secret access key I provided was incorrect. :(

Setting GOOGLE_APPLICATION_CREDENTIALS in Spring

I'm attempting to use Spring to access files from Google Storage buckets, with the end goal of using MultiResourceItemReader to read in multiple XML files from the bucket. I currently have this process working in Spring when the XML files are local on my machine (not in GCP).
Now, I want to do the same thing, but instead of XML files on my machine, the files are in a GCP Storage bucket. I can access the bucket contents outside of Spring, one file at a time. For example, this little bit of test code lets me access the bucket and then see the files in it. In this snippet, I set up the credentials via the JSON key file (not an environment variable).
public static void storageDriver() throws IOException {
    // Load credentials from JSON key file. If you can't set the GOOGLE_APPLICATION_CREDENTIALS
    // environment variable, you can explicitly load the credentials file to construct the
    // credentials.
    String name = "";
    String bucketName = "";
    String bucketFileName = "";
    String bucketFullPath = "";
    Resource myBucket;
    GoogleCredentials credentials;
    File credentialsPath = new File("mycreds.json");
    try (FileInputStream serviceAccountStream = new FileInputStream(credentialsPath)) {
        credentials = ServiceAccountCredentials.fromStream(serviceAccountStream);
    }

    Storage storage = StorageOptions.newBuilder()
            .setCredentials(credentials)
            .setProjectId("myProject")
            .build()
            .getService();

    for (Bucket bucket : storage.list().iterateAll()) {
        if (bucket.getName().equalsIgnoreCase("myGoogleBucket")) {
            bucketName = bucket.getName();
            System.out.println(bucket);
            for (Blob blob : bucket.list().iterateAll()) {
                bucketFileName = blob.getName();
                bucketFullPath = "gs://" + bucketName + "/" + bucketFileName;
                System.out.println(bucketFullPath);
            }
        }
    }
}
However, when I try to do the same thing with Spring, Spring complains that I don't have GOOGLE_APPLICATION_CREDENTIALS defined (which of course I don't, since I'm providing the credentials programmatically).
For example, I'll add the following and then get this error:
@Value("gs://myGoogleBucket")
private Resource[] resources;
The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials.
Spring Cloud GCP simplifies your GCS configuration.
You can add Storage support to your app. Then, either specify the location of your service account credentials through the spring.cloud.gcp.storage.credentials.location property, or log in with application default credentials using the Google Cloud SDK.
This will automatically provide you with a fully configured Storage object, and things like @Value("gs://YOUR-BUCKET/YOUR-FILE") should just work.
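As a rough sketch, assuming the spring-cloud-gcp-starter-storage dependency is on the classpath (the property values, bucket, and object names below are placeholders):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.Resource;
import org.springframework.stereotype.Component;

// application.properties (placeholder values):
//   spring.cloud.gcp.project-id=myProject
//   spring.cloud.gcp.storage.credentials.location=file:/path/to/mycreds.json

@Component
public class GcsFileHolder {

    // The starter registers a resolver for gs:// URIs, so a bucket object
    // can be injected directly as a Spring Resource.
    @Value("gs://myGoogleBucket/myfile.xml")  // placeholder bucket/object
    private Resource gcsFile;
}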
I tried many ways, but in the end this excerpt from the Spring docs is the one that worked for me:
Due to the way logging is set up, the GCP project ID and credentials defined in application.properties are ignored. Instead, you should set the GOOGLE_CLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS environment variables to the project ID and credentials private key location, respectively. You can do this easily if you’re using the Google Cloud SDK, using the gcloud config set project [YOUR_PROJECT_ID] and gcloud auth application-default login commands, respectively.

Java aws sdk - The specified location-constraint is not valid (non-amazon)

I would like to create a bucket in Ceph object storage via the S3 API, which works fine if I use Python's boto3:
s3 = boto3.resource(
    's3',
    endpoint_url='https://my.non-amazon-endpoint.com',
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)
bucket = s3.create_bucket(Bucket="my-bucket")  # successfully creates bucket
Trying the same with java leads to an exception:
BasicAWSCredentials awsCreds = new BasicAWSCredentials(access_key, secret_key);
AwsClientBuilder.EndpointConfiguration config =
        new AwsClientBuilder.EndpointConfiguration(
                "https://my.non-amazon-endpoint.com",
                "MyRegion");

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .withEndpointConfiguration(config)
        .build();

List<Bucket> buckets = s3Client.listBuckets();
// this works and lists all containers, hence the connection should be fine
for (Bucket bucket : buckets) {
    System.out.println(bucket.getName() + "\t" +
            StringUtils.fromDate(bucket.getCreationDate()));
}

Bucket bucket = s3Client.createBucket("my-bucket");
// AmazonS3Exception: The specified location-constraint is not valid (Service: Amazon S3; Status Code: 400; Error Code: InvalidLocationConstraint...
I am aware of several related issues, for instance this issue, but I was not able to adjust the suggested solutions to my non-amazon storage.
Digging deeper into the boto3 code, it turns out that the LocationConstraint is set to None if no region has been specified. But omitting the region in Java leads to the InvalidLocationConstraint error, too.
How do I have to configure the endpoint with the java s3 aws sdk to successfully create buckets?
Kind regards
UPDATE
Setting the signingRegion to "us-east-1" enables bucket creation functionality:
AwsClientBuilder.EndpointConfiguration config =
new AwsClientBuilder.EndpointConfiguration(
"https://my.non-amazon.endpoint.com",
"us-east-1");
If one assigns another region, the SDK will parse the region from the endpoint URL, as specified here.
In my case, this leads to an invalid region, for instance non-amazon.
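Putting it together, a minimal sketch of the working setup (the endpoint and bucket name are placeholders, the credentials are the ones from the question, and "us-east-1" is used only as the signing region, so the SDK sends no location constraint):

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.Bucket;

BasicAWSCredentials awsCreds = new BasicAWSCredentials(access_key, secret_key);

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                "https://my.non-amazon-endpoint.com",   // placeholder endpoint
                "us-east-1"))                           // signing region only
        // .withPathStyleAccessEnabled(true)            // may also help, depending on how the endpoint resolves bucket hostnames
        .build();

Bucket bucket = s3Client.createBucket("my-bucket");     // no location constraint is sent for us-east-1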

AmazonS3Client is deprecated how to get s3client object with using credential

To get an S3 client object, I am using the code below.
BasicAWSCredentials creds = new BasicAWSCredentials(key, S3secretKey);
AmazonS3 s3Client =AmazonS3ClientBuilder.standard().withCredentials(new AWSStaticCredentialsProvider(creds)).build();
I am getting the error below:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.
I had to change to:
AmazonS3 client = AmazonS3ClientBuilder.standard()
.withRegion(Regions.US_EAST_1)
.withForceGlobalBucketAccess(true)
.build();
to emulate the "old" way (i.e. new AmazonS3Client() )
With a builder, you need to provide your S3 bucket region using the builder method, like .withRegion(Regions.US_EAST_1).
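For example, combining the static credentials from the question with an explicit region (a sketch; replace the region with the one your bucket lives in):

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

BasicAWSCredentials creds = new BasicAWSCredentials(key, S3secretKey);

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(creds))
        .withRegion(Regions.US_EAST_1)   // placeholder region
        .build();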
One way to do it with version 1.11.98 of the SDK: in your code, you would do:
AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
And you need to have ~/.aws/credentials and ~/.aws/config files:
~/.aws/credentials contents:
[pca]
aws_access_key_id = KDDDJGIzzz3VVBXYA6Z
aws_secret_access_key = afafaoRDrJhzzzzg/Hhcccppeeddaf
[default]
aws_access_key_id = AMVKNEIzzzNEBXYJ4m
aws_secret_access_key = bU4rUwwwhhzzzzcppeeddoRDrJhogA
~/.aws/config contents:
[default]
region = us-west-1
[pca]
region = us-west-1
Make sure they're readable, and that you export a profile (if you have multiple, as above) before starting your service:
alper$ export AWS_PROFILE="pca"
There's a good description here
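If you would rather select the profile in code than export AWS_PROFILE, you can pass the profile name to the provider (a sketch using the example profile above):

import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(new ProfileCredentialsProvider("pca"))  // profile from ~/.aws/credentials
        .withRegion(Regions.US_WEST_1)                           // matches the profile's region above
        .build();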

Java AWS SDK - How to upload file using IAM profile from Java API

I have just started using AWS for my project. I want to write a project that uploads critical files to an S3 bucket. I do not want to expose any secret keys, which would let other developers / users access the uploaded documents. Please provide some pointers on how to begin.
My Current Implementation:
return new AmazonS3Client(new AWSCredentials() {
    @Override
    public String getAWSAccessKeyId() {
        return accessKey;
    }

    @Override
    public String getAWSSecretKey() {
        return accessKeySecret;
    }
}, clientConfiguration);
Then I use amazonS3Client.putObject(putReq); to upload file.
So here I am exposing my keys, which enables other colleagues to download/view the files. Anyone can use them to download/upload files via s3cmd, browser plugins, etc.
On reading the AWS docs, I learned that I can use an EC2 instance and set up an IAM profile. But I am not sure how to do this with Java code. Please provide some links and examples.
Look at the InstanceProfileCredentialsProvider class. It gets IAM credentials (access/secret key) from the instance's metadata. Launch your instance under an IAM role that has a policy that permits access to S3.
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
.withCredentials(new InstanceProfileCredentialsProvider())
.build();
Source reference
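Putting it together, a hedged sketch of an upload that relies only on the instance's IAM role, with no keys in the code (the bucket, key, file path, and region are placeholders; the singleton getInstance() variant of the provider is used here):

import java.io.File;
import com.amazonaws.auth.InstanceProfileCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.PutObjectRequest;

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(InstanceProfileCredentialsProvider.getInstance())
        .withRegion(Regions.US_EAST_1)   // placeholder region
        .build();

// Credentials are fetched from the EC2 instance metadata service,
// so nothing secret appears in the code or on developers' machines.
PutObjectRequest putReq = new PutObjectRequest(
        "my-bucket", "critical/report.pdf", new File("/path/to/report.pdf"));
s3Client.putObject(putReq);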
If your users need access to upload to S3, then they will need access to the keys; there's nothing you can do about that.
What you can do, though, is give them keys which have permission to upload files to S3 but no permission to read/download. So you'd have an upload policy with the PutObject permission, and a read policy with the List/Get permissions.
