I'm trying to use the Amazon AWS SDK for developing against Eucalyptus, but I am always getting access denied. This is the code:

import java.util.List;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.ec2.AmazonEC2;
import com.amazonaws.services.identitymanagement.AmazonIdentityManagement;
import com.amazonaws.services.identitymanagement.AmazonIdentityManagementClient;
import com.amazonaws.services.identitymanagement.model.CreateUserRequest;
import com.amazonaws.services.identitymanagement.model.CreateUserResult;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.Bucket;

public class App {

    public static AmazonEC2 ec2;
    public static AmazonS3 s3;
    public static AmazonIdentityManagement identityManagement;

    private static String ACCESS_KEY;
    private static String SECRET_KEY;
    private static String IDENTITY_END_POINT;
    private static String Walrus_END_POINT;

    public static void init() {
        ACCESS_KEY = "myaccesskey";
        SECRET_KEY = "mysecretkey";
        IDENTITY_END_POINT = "http://192.168.1.101:8773/services/Euare";
        // Walrus is the S3-compatible service in Eucalyptus; adjust the host to your cloud.
        Walrus_END_POINT = "http://192.168.1.101:8773/services/Walrus";

        AWSCredentials myCredential = new BasicAWSCredentials(ACCESS_KEY, SECRET_KEY);
        //ec2 = new AmazonEC2Client(myCredential);
        s3 = new AmazonS3Client(myCredential);
        //ec2.setEndpoint(EC2_END_POINT);
        s3.setEndpoint(Walrus_END_POINT);
        identityManagement = new AmazonIdentityManagementClient(myCredential);
        identityManagement.setEndpoint(IDENTITY_END_POINT);
    }

    public static void main(String[] args) {
        init();

        List<Bucket> buckets = s3.listBuckets();
        System.out.println("List of all buckets in your cloud:\n");
        for (Bucket bucket : buckets) {
            System.out.println(bucket.getName() + "\n");
        }

        CreateUserRequest createUserRequest = new CreateUserRequest().withPath("/").withUserName("summer");
        CreateUserResult createUserResult = identityManagement.createUser(createUserRequest);
    }
}
That is the error I get:
Exception in thread "main" Status Code: 404, AWS Service: AmazonIdentityManagement, AWS Request ID: null, AWS Error Code: null, AWS Error Message: null
    at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:614)
    at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:312)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:165)
    at com.amazonaws.services.identitymanagement.AmazonIdentityManagementClient.invoke(AmazonIdentityManagementClient.java:3130)
    at com.amazonaws.services.identitymanagement.AmazonIdentityManagementClient.getUser(AmazonIdentityManagementClient.java:1201)
    at com.test2.Group.main(Group.java:107)
I'm not sure what would cause a 404 when using the IAM API; the code you posted does not seem to match the posted exception (the stack trace shows a getUser call, but the code calls createUser).
You don't say which versions of Eucalyptus and the AWS Java SDK you are using, but with some versions it is necessary to add a / to the endpoint URL so that the AWS Java SDK calculates the signature correctly, e.g. for IAM:
"http://$EXAMLE:8773/services/Euare/"
I do not think this would cause the 404 issue, however.
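For reference, a quick sketch of what that looks like with the client from your init() method (same placeholder host and credentials as in your question; note the trailing slash):

    AWSCredentials myCredential = new BasicAWSCredentials("myaccesskey", "mysecretkey");
    AmazonIdentityManagement identityManagement = new AmazonIdentityManagementClient(myCredential);
    // Trailing slash so that affected SDK versions compute the request signature correctly.
    identityManagement.setEndpoint("http://192.168.1.101:8773/services/Euare/");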
You can find details on using the AWS Java SDK with Eucalyptus here:
https://github.com/eucalyptus/eucalyptus/wiki/HOWTO-Use-AWS-Java-SDK-with-Eucalyptus
Unit tests for the AWS Java SDK with Eucalyptus are here:
https://github.com/eucalyptus/eutester/tree/testing/eutester4j/com/eucalyptus/tests/awssdk
This includes some coverage for IAM/STS services.
I tried to run the DescribeConfigurationSettings API method for Elastic Beanstalk as follows:
AWSElasticBeanstalk ebs = AWSElasticBeanstalkClientBuilder.standard()
        .withRegion(Regions.EU_CENTRAL_1)
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .build();

for (ApplicationDescription ad : ebs.describeApplications().getApplications()) {
    System.out.println(ad);
    for (EnvironmentDescription ed : ebs.describeEnvironments(new DescribeEnvironmentsRequest()
            .withApplicationName(ad.getApplicationName())).getEnvironments()) {
        System.out.println(ebs.describeConfigurationSettings(new DescribeConfigurationSettingsRequest()
                .withApplicationName(ad.getApplicationName())
                .withEnvironmentName(ed.getEnvironmentName()))
                .getConfigurationSettings());
    }
}
However, I got an Access Denied exception with the following message:
Exception in thread "main" com.amazonaws.services.elasticbeanstalk.model.AWSElasticBeanstalkException:
Access Denied: S3Bucket=elasticbeanstalk-env-resources-eu-central-1, S3Key=eb_patching_resources/instance_patch_extension.linux
(Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: NB44V0RXQG2WHH4T; Proxy: null)
(Service: AWSElasticBeanstalk; Status Code: 400; Error Code: InvalidParameterValue; Request ID: b058aa54-fc9c-4879-9502-5cb5818bc64a; Proxy: null)
How can I resolve this issue?
Based on the error you get, it seems that you are missing some IAM permissions. I would start by attaching the AWSElasticBeanstalkManagedUpdatesCustomerRolePolicy managed policy to your user.
This policy is probably more permissive than what you actually need, but it would be difficult to pinpoint exactly which permissions are necessary.
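If you prefer to attach the managed policy programmatically rather than through the console, a sketch along these lines should work with the V2 SDK (the user name is a placeholder, and the ARN assumes the standard arn:aws:iam::aws:policy/ prefix used by AWS managed policies):

    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.iam.IamClient;
    import software.amazon.awssdk.services.iam.model.AttachUserPolicyRequest;

    public class AttachBeanstalkPolicy {

        public static void main(String[] args) {
            // IAM is a global service, so the AWS_GLOBAL pseudo-region is used.
            IamClient iam = IamClient.builder()
                    .region(Region.AWS_GLOBAL)
                    .build();

            // "your-user-name" is a placeholder for the IAM user making the Beanstalk calls.
            iam.attachUserPolicy(AttachUserPolicyRequest.builder()
                    .userName("your-user-name")
                    .policyArn("arn:aws:iam::aws:policy/AWSElasticBeanstalkManagedUpdatesCustomerRolePolicy")
                    .build());
        }
    }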
Amazon recommends using the AWS SDK for Java V2.
Updated Code
Here is the Java V2 code for this use case.
package com.aws.example;

import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.elasticbeanstalk.ElasticBeanstalkClient;
import software.amazon.awssdk.services.elasticbeanstalk.model.*;

import java.util.List;

public class DescribeApplications {

    public static void main(String[] args) {

        Region region = Region.US_EAST_1;
        ElasticBeanstalkClient beanstalkClient = ElasticBeanstalkClient.builder()
                .region(region)
                .build();

        DescribeApplicationsResponse applicationsResponse = beanstalkClient.describeApplications();
        List<ApplicationDescription> apps = applicationsResponse.applications();
        for (ApplicationDescription app : apps) {
            System.out.println("The application name is " + app.applicationName());

            DescribeEnvironmentsRequest desRequest = DescribeEnvironmentsRequest.builder()
                    .applicationName(app.applicationName())
                    .build();

            DescribeEnvironmentsResponse res = beanstalkClient.describeEnvironments(desRequest);
            List<EnvironmentDescription> envDesc = res.environments();
            for (EnvironmentDescription desc : envDesc) {
                System.out.println("The Environment ARN is " + desc.environmentArn());
            }
        }
    }
}
I'm trying to create a file in a Google Drive folder.
Reading from the folder works. I'm already reading files in different applications with different Service Accounts.
I shared my G Suite Drive folder with the generated service account email and gave it Editor access (read, write, edit).
I tried to copy an existing file (and also to create an empty one from scratch) from my Java Spring Boot application.
public class TemplateDocumentManager {

    @Value("${printing.template.id}")
    private String baseTemplateFileId;

    @Autowired
    private Drive driveService;

    public void createNewContractFromEmptyTemplate() throws IOException {
        File templateFile = getDriveFiles().get(baseTemplateFileId).execute();
        File newFileInstance = getDriveFiles()
                .copy(baseTemplateFileId, templateFile)
                .setSupportsAllDrives(true)
                .execute();
        log.error("Id of newly created file is: {}", newFileInstance.getId());
    }

    protected Drive.Files getDriveFiles() {
        return driveService.files();
    }
}
The Google Drive service injected with the @Autowired annotation is working properly. It is created as follows:
@Configuration
public class DriveService {

    public static final String APPLICATION_NAME = "appname";

    @Autowired
    GoogleCredential googleCredential;

    @Bean("driveService")
    public Drive createDriveService() throws GeneralSecurityException, IOException {
        return new Drive.Builder(GoogleNetHttpTransport.newTrustedTransport(), JacksonFactory.getDefaultInstance(), googleCredential)
                .setApplicationName(APPLICATION_NAME)
                .build();
    }
    ...
}
Any ideas on what the service account needs in order to be able to write to the G Suite Drive folder?
You have to give the required scopes to the GoogleCredential object when you build it, i.e.

    GoogleCredential.fromStream(source).createScoped(scopes)

The scopes can be found in, e.g., SheetsScopes or DriveScopes (depending on which Google dependencies you use).
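For example, a minimal sketch of building the credential for the Drive bean above (the key file path is a placeholder; DriveScopes.DRIVE grants full read/write access, while DriveScopes.DRIVE_FILE is a narrower alternative):

    import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
    import com.google.api.services.drive.DriveScopes;

    import java.io.FileInputStream;
    import java.util.Collections;

    // "service-account.json" is a placeholder path to the service account key file.
    GoogleCredential googleCredential = GoogleCredential
            .fromStream(new FileInputStream("service-account.json"))
            .createScoped(Collections.singletonList(DriveScopes.DRIVE));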
I am trying to connect to my S3 bucket (already created; I am able to test it by listing with s3cmd) and list its contents.
I do not see any error message when creating the connection. However, when I try to list the contents of my bucket, I get this exception:
java.lang.BootstrapMethodError: bootstrap method initialization exception
I am using this dependency (in Gradle):
implementation "software.amazon.awssdk:s3:2.7.10"
Here is the code snippet (I am getting the above error from adapterSmsS3Client.listBuckets()):
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

import java.net.URI;
....
private static final String BASE_URL = "https://s3.us-west-2.amazonaws.com";
private static final String BUCKET_NAME = "adapter-stream/dev";
....
S3Client adapterSmsS3Client = S3Client.builder()
        .region(Region.US_WEST_2)
        .credentialsProvider(StaticCredentialsProvider.create(
                AwsBasicCredentials.create("access-key", "secret-key")))
        .endpointOverride(URI.create(BASE_URL))
        .build();

System.out.println("====Checking connection: " + adapterSmsS3Client.listBuckets());
Any insight on what I am missing?
I have a Google App Engine Java app that is returning null from SystemProperty.environment.value(), and all other static members of SystemProperty. I see this when running my JUnit tests via Maven.
import com.google.appengine.api.utils.SystemProperty;
...

void printProps() {
    log.info("props:" + System.getProperties());
    log.info("env=" + SystemProperty.environment.value());
    log.info("log=" + System.getProperty("java.util.logging.config.file"));
    log.info("id=" + SystemProperty.applicationId.get());
    log.info("ver=" + SystemProperty.applicationVersion.get());
}
The only item above that returns non-null is System.getProperties().
Here are some of the details of my setup:
IntelliJ IDEA EAP 13
Maven
App Engine SDK 1.8.5
Java 7 (1.7.0_40)
JUnit 4
I was having the same problem. I attempted to call these methods on LocalServiceTestHelper, but they did NOT populate SystemProperty.applicationId or SystemProperty.version:
    setEnvAppId(java.lang.String envAppId)
    setEnvVersionId(java.lang.String envVersionId)

e.g.

    public final LocalServiceTestHelper helper = new LocalServiceTestHelper(
            new LocalDatastoreServiceTestConfig(),
            new LocalTaskQueueTestConfig())
            .setEnvAppId("JUnitApplicationId")
            .setEnvVersionId("JUnitVersion");
My solution was simply to populate those properties myself in my JUnit setUp() method:
@Before
public void setUp() throws Exception {
    SystemProperty.version.set("JUnitVersion");
    SystemProperty.applicationId.set("JUnitApplicationId");
    SystemProperty.applicationVersion.set("JUnitApplicationVersion");
    SystemProperty.environment.set(SystemProperty.Environment.Value.Development);

    helper.setUp();
    datastore = DatastoreServiceFactory.getDatastoreService();
    queue = QueueFactory.getDefaultQueue();
}
Note that the only valid values for SystemProperty.environment are Production and Development (from SystemProperty.Environment.Value).
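For instance, you can branch on the environment like this (a quick sketch; the println calls stand in for whatever you actually do differently per environment):

    import com.google.appengine.api.utils.SystemProperty;

    // ...
    if (SystemProperty.environment.value() == SystemProperty.Environment.Value.Production) {
        // Only true when running on App Engine itself.
        System.out.println("Running in Production");
    } else {
        // Development when set as in the setUp() above, or null outside App Engine entirely.
        System.out.println("Running in Development (or value() returned null)");
    }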
I am using Quercus in Apache JMeter for simple scripting of tests. I have a requirement to log from PHP using log4j, and on the whole this works well. So I wrote a Quercus module like this:
public class LogFunction extends AbstractQuercusModule {

    private static Logger log = Logger.getLogger(LogFunction.class);

    public void log_str(Env env, String str) {
        log.info(str);
    }
}
Now, I am testing this with the following code:
public class QuercusTest {

    private static ScriptEngine engine;

    static {
        // set up Quercus
        ScriptEngineManager manager = new ScriptEngineManager();
        engine = manager.getEngineByName("php");
    }

    public static void main(String[] args) throws ScriptException {
        engine.eval("<?php log_str('Hello');");
    }
}
This throws an exception (as I would expect) because this custom function isn't registered.
Exception in thread "main" com.caucho.quercus.QuercusErrorException: eval::1: Fatal Error: 'log_str' is an unknown function.
    at com.caucho.quercus.env.Env.error(Env.java:6420)
    at com.caucho.quercus.env.Env.error(Env.java:6306)
    at com.caucho.quercus.env.Env.error(Env.java:5990)
    at com.caucho.quercus.expr.CallExpr.evalImpl(CallExpr.java:198)
    at com.caucho.quercus.expr.CallExpr.eval(CallExpr.java:151)
    at com.caucho.quercus.expr.Expr.evalTop(Expr.java:523)
    at com.caucho.quercus.statement.ExprStatement.execute(ExprStatement.java:67)
    at com.caucho.quercus.program.QuercusProgram.execute(QuercusProgram.java:413)
    at com.caucho.quercus.script.QuercusScriptEngine.eval(QuercusScriptEngine.java:134)
    at com.caucho.quercus.script.QuercusScriptEngine.eval(QuercusScriptEngine.java:179)
    at javax.script.AbstractScriptEngine.eval(AbstractScriptEngine.java:247)
    at com.succeed.QuercusTest.main(QuercusTest.java:18)
However, I can't see how to register this Quercus module with the Java scripting engine. Docs are a bit sparse... Any help would be appreciated.
1. Get the PHP script engine:

       ScriptEngineManager manager = new ScriptEngineManager();
       engine = manager.getEngineByName("php");

2. Register the module on the underlying Quercus instance:

       if (engine instanceof QuercusScriptEngine) {
           ((QuercusScriptEngine) engine).getQuercus().addModule(new LogFunction());
       }

This works.
(quercus-4.0.18-src + resin 4.0)
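Putting the two steps together, a minimal runnable sketch (it assumes the LogFunction class from the question is on the classpath):

    import javax.script.ScriptEngine;
    import javax.script.ScriptEngineManager;
    import javax.script.ScriptException;

    import com.caucho.quercus.script.QuercusScriptEngine;

    public class QuercusModuleDemo {

        public static void main(String[] args) throws ScriptException {
            ScriptEngineManager manager = new ScriptEngineManager();
            ScriptEngine engine = manager.getEngineByName("php");

            // Register the custom module before evaluating any PHP code.
            if (engine instanceof QuercusScriptEngine) {
                ((QuercusScriptEngine) engine).getQuercus().addModule(new LogFunction());
            }

            engine.eval("<?php log_str('Hello');");
        }
    }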
I ended up ditching the scripting engine code and going native-Quercus:
    QuercusEngine engine = new QuercusEngine();
    engine.getQuercus().getModuleContext().addModule("LogFunction", new LogFunction());
    engine.setOutputStream(os);
    engine.getQuercus().init();
    engine.execute(phpCode);
This works OK. It at least has fairly predictable behaviour.