To get an S3 client object, I am using the code below:
BasicAWSCredentials creds = new BasicAWSCredentials(key, S3secretKey);
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(creds))
        .build();
I am getting the error below:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.
I had to change to:
AmazonS3 client = AmazonS3ClientBuilder.standard()
.withRegion(Regions.US_EAST_1)
.withForceGlobalBucketAccess(true)
.build();
to emulate the "old" way (i.e., new AmazonS3Client()).
With a builder, you need to provide your S3 bucket's region using the builder method, e.g. .withRegion(Regions.US_EAST_1).
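For example, combining the static credentials from the question with an explicit region would look like this (a minimal sketch; the region is an assumption, so substitute your bucket's region):
BasicAWSCredentials creds = new BasicAWSCredentials(key, S3secretKey);
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(creds))
        .withRegion(Regions.US_EAST_1) // replace with your bucket's region
        .build();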
One way to do it with version 1.11.98 of the SDK: in your code, you would do:
AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
And you need to have ~/.aws/credentials and ~/.aws/config files:
~/.aws/credentials contents:
[pca]
aws_access_key_id = KDDDJGIzzz3VVBXYA6Z
aws_secret_access_key = afafaoRDrJhzzzzg/Hhcccppeeddaf
[default]
aws_access_key_id = AMVKNEIzzzNEBXYJ4m
aws_secret_access_key = bU4rUwwwhhzzzzcppeeddoRDrJhogA
~/.aws/config contents:
[default]
region = us-west-1
[pca]
region = us-west-1
Make sure they are readable, and that you export a profile (if you have multiple, as above) before starting your service:
alper$ export AWS_PROFILE="pca"
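Alternatively, you can select the profile in code instead of exporting AWS_PROFILE. A minimal sketch, assuming the "pca" profile and region from the files above:
AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(new ProfileCredentialsProvider("pca")) // com.amazonaws.auth.profile
        .withRegion(Regions.US_WEST_1) // matches the region in ~/.aws/config
        .build();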
There's a good description here
I'm trying to put an item into a DynamoDB table using the AWS SDK for Java.
I am using the EnhancedPutItem.java example from the docs:
public static void main(String[] args) {
    ProfileCredentialsProvider credentialsProvider = ProfileCredentialsProvider.create();
    Region region = Region.US_EAST_1;
    DynamoDbClient ddb = DynamoDbClient.builder()
            .credentialsProvider(credentialsProvider)
            .region(region)
            .build();
    DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
            .dynamoDbClient(ddb)
            .build();
    putRecord(enhancedClient);
    ddb.close();
}
...
When running locally I can put the item successfully, but when I run my application as a task on Fargate, it throws this error:
software.amazon.awssdk.core.exception.SdkClientException: Profile file contained no credentials for profile 'default': ProfileFile(profilesAndSectionsMap=[]).
The error states that the SDK is looking for credentials within the default credential profiles file & cannot find any.
This is because the sample code explicitly specifies a ProfileCredentialsProvider as the credentials provider for the DynamoDbClient.
This overrides the default credential provider chain, which by default would be looking for credentials in a variety of locations - including the ECS container credentials.
You need to remove the use of the ProfileCredentialsProvider.
You have two options:
1. Specify no credentials provider when creating the client. The SDK falls back to the default credential provider chain, which will find the ECS container credentials.
2. Replace the ProfileCredentialsProvider with a ContainerCredentialsProvider, which looks specifically for ECS container credentials.
Option 1 is recommended: it is the most common configuration, and your code has the highest chance of working across various environments, because the chain looks sequentially through multiple credential providers.
To implement option 1, change:
ProfileCredentialsProvider credentialsProvider = ProfileCredentialsProvider.create();
Region region = Region.US_EAST_1;
DynamoDbClient ddb = DynamoDbClient.builder()
.credentialsProvider(credentialsProvider)
.region(region)
.build();
to:
Region region = Region.US_EAST_1;
DynamoDbClient ddb = DynamoDbClient.builder()
.region(region)
.build();
Option 2 looks like this:
private AwsCredentialsProvider awsCredentialsProvider = ContainerCredentialsProvider.builder().build();
private final DynamoDbClient ddb = DynamoDbClient.builder()
.credentialsProvider(awsCredentialsProvider)
.region(Region.of(REGION))
.build();
I am working on a Java project and using the AWS SDK for Java version 2 library for S3 services. I am using S3Client for operations such as getObject(), getBucket, listObjects, etc.
I also want to use S3TransferManager to log the progress of file uploads and downloads.
Here is my code sample:
static String sAccessKey = "XXXXXXXXXX";
static String sSecretKey = "XXXXXXXXXXXXX";
static AwsBasicCredentials awsCreds = AwsBasicCredentials.create(
sAccessKey,
sSecretKey);
AwsCredentialsProvider awsCredentialsProvider = StaticCredentialsProvider.create(awsCreds);
static S3Client s3Client = S3Client.builder().region(Region.US_WEST_2)
.credentialsProvider(StaticCredentialsProvider.create(awsCreds))
.build();
S3TransferManager s3TransferManager = S3TransferManager
.builder()
.s3ClientConfiguration(S3ClientConfiguration.builder()
.region(Region.US_WEST_2)
.credentialsProvider(awsCredentialsProvider).build()).build();
S3TransferManager s3TransferManager1 = S3TransferManager.create();
Region region = Region.US_WEST_2;
S3ClientConfiguration s3ClientConfiguration =
S3ClientConfiguration.builder()
.region(region)
.credentialsProvider(awsCredentialsProvider)
.targetThroughputInGbps(20.0)
.build();
Upload upload =
s3TransferManager1.upload(b -> b.putObjectRequest(r -> r.bucket("SSSS").key("test.ppt"))
.source(Paths.get("fileToUpload.txt")));
My issue is that when I use both the s3 and s3-transfer-manager JARs together, I get an error while creating the S3TransferManager object.
The error is:
Exception in thread "main" java.lang.NoSuchMethodError: software.amazon.awssdk.utils.Validate.isPositiveOrNull(Ljava/lang/Double;Ljava/lang/String;)Ljava/lang/Double;
at software.amazon.awssdk.transfer.s3.S3ClientConfiguration.<init>(S3ClientConfiguration.java:49)
at software.amazon.awssdk.transfer.s3.S3ClientConfiguration.<init>(S3ClientConfiguration.java:37)
at software.amazon.awssdk.transfer.s3.S3ClientConfiguration$DefaultBuilder.build(S3ClientConfiguration.java:301)
at software.amazon.awssdk.transfer.s3.S3ClientConfiguration$DefaultBuilder.build(S3ClientConfiguration.java:243)
at software.amazon.awssdk.transfer.s3.internal.DefaultS3TransferManager$DefaultBuilder.<init>(DefaultS3TransferManager.java:405)
at software.amazon.awssdk.transfer.s3.internal.DefaultS3TransferManager$DefaultBuilder.<init>(DefaultS3TransferManager.java:404)
at software.amazon.awssdk.transfer.s3.internal.DefaultS3TransferManager.builder(DefaultS3TransferManager.java:364)
at software.amazon.awssdk.transfer.s3.S3TransferManager.builder(S3TransferManager.java:497)
at com.bucketexplorer.main.SdkTest.main(SdkTest.java:88)
I think the utility classes from the two JARs conflict.
The JAR files I am using are:
s3-2.16.46.jar
s3-transfer-manager-2.17.257-PREVIEW.jar
Please suggest how I can use both JARs together.
It is a version conflict between the two libraries.
Upgrading s3-2.16.46.jar to the same version as s3-transfer-manager (2.17.257, i.e. the version number without the PREVIEW suffix) will solve the problem.
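For example, with Maven the aligned dependencies would look roughly like this (a sketch; adapt the coordinates to your build tool):
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <version>2.17.257</version>
</dependency>
<dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3-transfer-manager</artifactId>
    <version>2.17.257-PREVIEW</version>
</dependency>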
I would like to create a bucket in Ceph object storage via the S3 API, which works fine if I use Python's boto3:
s3 = boto3.resource(
's3',
endpoint_url='https://my.non-amazon-endpoint.com',
aws_access_key_id=access_key,
aws_secret_access_key=secret_key
)
bucket = s3.create_bucket(Bucket="my-bucket") # successfully creates bucket
Trying the same with Java leads to an exception:
BasicAWSCredentials awsCreds = new BasicAWSCredentials(access_key, secret_key);
AwsClientBuilder.EndpointConfiguration config =
new AwsClientBuilder.EndpointConfiguration(
"https://my.non-amazon-endpoint.com",
"MyRegion");
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
.withCredentials(new AWSStaticCredentialsProvider(awsCreds))
.withEndpointConfiguration(config)
.build();
List<Bucket> buckets = s3Client.listBuckets();
// this works and lists all containers, hence the connection should be fine
for (Bucket bucket : buckets) {
System.out.println(bucket.getName() + "\t" +
StringUtils.fromDate(bucket.getCreationDate()));
}
Bucket bucket = s3Client.createBucket("my-bucket");
// AmazonS3Exception: The specified location-constraint is not valid (Service: Amazon S3; Status Code: 400; Error Code: InvalidLocationConstraint...
I am aware of several related issues, for instance this issue, but I was not able to adapt the suggested solutions to my non-Amazon storage.
Digging deeper into the boto3 code, it turns out that the LocationConstraint is set to None if no region has been specified. But omitting the region in Java leads to the InvalidLocationConstraint error, too.
How do I have to configure the endpoint with the Java AWS S3 SDK to successfully create buckets?
Kind regards
UPDATE
Setting the signingRegion to "us-east-1" enables bucket creation functionality:
AwsClientBuilder.EndpointConfiguration config =
new AwsClientBuilder.EndpointConfiguration(
"https://my.non-amazon.endpoint.com",
"us-east-1");
If one assigns another region, the SDK will parse the region from the endpoint URL, as specified here.
In my case, this leads to an invalid region, for instance "non-amazon".
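Putting it together, a minimal sketch of the working configuration (endpoint and bucket name are the placeholders from the question):
BasicAWSCredentials awsCreds = new BasicAWSCredentials(access_key, secret_key);
AwsClientBuilder.EndpointConfiguration config =
        new AwsClientBuilder.EndpointConfiguration(
                "https://my.non-amazon-endpoint.com",
                "us-east-1"); // signing region
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .withEndpointConfiguration(config)
        // .withPathStyleAccessEnabled(true) // some non-AWS endpoints also require path-style access
        .build();
Bucket bucket = s3Client.createBucket("my-bucket");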
I am trying to create an aws s3 bucket using the following java code.
AmazonS3 s3client = AmazonS3ClientBuilder.defaultClient();
s3client.setRegion(Region.getRegion(Regions.AP_SOUTH_1));
But I am getting the following error:
"exception": "com.amazonaws.SdkClientException",
"message": "Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region."
Am I trying to set the region in an incorrect way? Please advise.
If you are not using any proxies and you have already set up your credentials, you can use the code below:
AmazonS3 s3client = AmazonS3ClientBuilder.standard()
        .withRegion(Regions.AP_SOUTH_1)
        .build();
But if you need to set up a proxy and manually supply the credentials, you can use the code below:
AWSCredentials cred = new BasicAWSCredentials(<accessKey>, <secretKey>);
AmazonS3 s3client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(cred))
        .withClientConfiguration(<your configuration>)
        .withRegion(Regions.AP_SOUTH_1)
        .build();
The reason you are getting the error is that you have not set up AWS with Eclipse.
If you are using Eclipse as your IDE then read:
http://docs.aws.amazon.com/toolkit-for-eclipse/v1/user-guide/welcome.html
Once the profile is set up, then:
AmazonS3 s3 = new AmazonS3Client(new ProfileCredentialsProvider());
Region apSouth1 = Region.getRegion(Regions.AP_SOUTH_1);
s3.setRegion(apSouth1);
Also make sure to import:
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
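If you prefer the builder style that newer versions of the SDK recommend, an equivalent sketch would be:
AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withCredentials(new ProfileCredentialsProvider())
        .withRegion(Regions.AP_SOUTH_1)
        .build();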
I am trying to upload a file to S3. The code to do so is below:
AmazonS3 s3Client = new AmazonS3Client(new ProfileCredentialsProvider());
String key = String.format(Constants.KEY_NAME + "/%s/%s", activity_id, aFile.getName());
s3Client.putObject(Constants.BUCKET_NAME, key, aFile.getInputStream(), new ObjectMetadata());
The problem I am having is that my ProfileCredentialsProvider cannot access my AWS keys. I have set my environment variables:
AWS_ACCESS_KEY=keys go here
AWS_SECRET_KEY=keys go here
AWS_ACCESS_KEY_ID=keys go here
AWS_DEFAULT_REGION=us-east-1
AWS_SECRET_ACCESS_KEY=keys go here
And as per Amazon's Documentation the set environment variables have precedence over any configuration files. This leads me to ask, why are my keys not being grabbed from my environment variables?
Figured it out.
If you specify a ProfileCredentialsProvider, the AWS SDK will look for a credentials profile file, regardless of precedence. Simply creating an S3 client like this:
AmazonS3 s3Client = new AmazonS3Client();
will check the various locations specified for credentials, including your environment variables.
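If you want to force the SDK to read only the environment variables, here is a minimal sketch using EnvironmentVariableCredentialsProvider (from com.amazonaws.auth; the region is an assumption):
// Reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY (or AWS_ACCESS_KEY / AWS_SECRET_KEY)
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new EnvironmentVariableCredentialsProvider())
        .withRegion(Regions.US_EAST_1)
        .build();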