I created a JAR executable of a program which uses AWS to communicate with a DynamoDB database. I remember setting up my credentials through Eclipse so that I would have access to the database, but this doesn't seem to have carried over to the JAR file since these credentials are saved on my computer and do not get packaged with the JAR. I've noticed that the JAR actually has access to the database on the computer I used to create it, but on any other computer, any tasks that require the program to access the database give the error message "Error scanning table: Unable to load AWS credentials from any provider in the chain."
How should I include the credentials with the app in order to ensure that anyone running the JAR will have access to the database?
EDIT: In case it makes a difference, I think the way I originally set the credentials was in Eclipse via AWS > Preferences > Profile Configuration.
Based on what you've posted I see a few options:
Create an AWS user of the "Programmatic Access" type for each user who is going to run your JAR. Each user will then need to create ~/.aws/credentials with the credentials you give them. This is a semi-manual step, but it allows you to turn off a particular user's access later.
Assuming you have a shell script that starts your JAR, create the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in it and export them for your users.
If this is truly a raw JAR, then in your main() (or some other method you know runs before any AWS code) call System.setProperty("aws.accessKeyId", "youraccesskey") and System.setProperty("aws.secretKey", "yoursecretkey") to set these as system properties.
The last two suggestions are pretty dangerous. You may trust your users today, but what happens when one of them is no longer trusted? The first option allows you to disable a single user.
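For the first option, each user's ~/.aws/credentials file would look something like this (the key values below are placeholders, not real credentials):

```ini
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = exampleSecretAccessKey
```

Revoking a user's access is then just a matter of deactivating that user's access key in the IAM console.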
I don't recommend hardcoding credentials in the JAR, but if you really want to, there are a few ways.
When you create the client you can use .withCredentials() to pass in your key and secret. For example:
BasicAWSCredentials awsCreds = new BasicAWSCredentials("access_key_id", "secret_key");
AmazonDynamoDB dbClient = AmazonDynamoDBClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .build();
If you want to make it global and have every client use it automatically, you can set these Java properties before you use any AWS code:
System.setProperty("aws.accessKeyId", "access_key_id");
System.setProperty("aws.secretKey", "secret_key");
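As a plain-Java sanity check of the property-based approach (the key values are placeholders; the SDK reads these two property names through its SystemPropertiesCredentialsProvider):

```java
public class AwsSystemProps {
    // Sets the two system property names the AWS SDK for Java v1 checks.
    // The values here are placeholders; never ship real keys this way.
    static void configure(String accessKey, String secretKey) {
        System.setProperty("aws.accessKeyId", accessKey);
        System.setProperty("aws.secretKey", secretKey);
    }

    public static void main(String[] args) {
        configure("AKIA_EXAMPLE", "example-secret");
        // Any AWS client built after this point can pick these values up.
        System.out.println(System.getProperty("aws.accessKeyId"));
    }
}
```

Remember that anyone who can unpack the JAR can recover values embedded this way.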
For more information, see:
http://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html
You can create an 'AWSCredentials' object programmatically like this:
AWSCredentials credentials = new AWSCredentials() {
    public String getAWSSecretKey() { return "your secret key here"; }
    public String getAWSAccessKeyId() { return "your access id here"; }
};
AmazonSimpleDB simpleDb = new AmazonSimpleDBClient(credentials);
AmazonS3Client s3Client = new AmazonS3Client(credentials);
//and so on...
Related
I'll premise this by saying that I already googled and read the documentation before writing; I've noticed it's a popular topic here on Stack Overflow as well, but none of the answers already given have helped me.
I created a Google Cloud account to use the Google Vision API.
To do this I followed the steps of creating the project, adding the above API and finally creating a service account with a key.
I downloaded the key and put it in a folder in the java project on the PC.
Then, since it is a maven project I added the dependencies to the pom as described in the tutorials.
At this point I inserted the suggested piece of code to start using the API.
Everything seemed to be OK, everything was read, the various libraries/interfaces were imported.
But an error came up as soon as I tried to run the program:
The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials.
I must admit I didn't know what 'Google Compute Engine' was, but since there was an alternative and I had some credentials, I wanted to try and follow that.
So I followed the instructions:
After creating your service account, you need to download the service account key to your machine(s) where your application runs. You can either use the GOOGLE_APPLICATION_CREDENTIALS environment variable or write code to pass the service account key to the client library.
OK, I tried the first way, to pass the credentials via environment variable:
With powershell -> no response
$env:GOOGLE_APPLICATION_CREDENTIALS="my key path"
With cmd -> no response either
set GOOGLE_APPLICATION_CREDENTIALS=my key path
So I tried the second way, passing the credentials in code, and if I'm here, something else must have gone wrong: with what I have, it was only possible to import io.grpc.Context, while everything else gives the error "Cannot resolve symbol ..".
import com.google.common.collect.Lists;
import io.grpc.Context;
import java.io.FileInputStream;
import java.io.IOException;
public class Main
{
    static void authExplicit(String jsonPath)
    {
        GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(jsonPath))
                .createScoped(Lists.newArrayList("https://www.googleapis.com/auth/cloud-platform"));
        Context.Storage storage = StorageOptions.newBuilder().setCredentials(credentials).build().getService();
        System.out.println("Buckets:");
        Page<Bucket> buckets = storage.list();
        for (Bucket bucket : buckets.iterateAll()) {
            System.out.println(bucket.toString());
        }
    }

    public static void main(String[] args) throws IOException
    {
        OCR.detectText();
    }
}
(I can't upload a screenshot of the code to show where it gives me errors, but this is it.)
PS: I also already installed the Cloud SDK and restarted the PC because some comments said to do that to fix things.
I also tried setting the environment variable manually, but nothing changed.
Solution
I don't know if it all worked because of the series of operations I performed or because of this one command line; anyway, for posterity's sake:
Fill in these fields manually, using the email field from the JSON file and the path to the JSON file, both without quotation marks; then start cmd, paste the string, and run!
gcloud auth activate-service-account your_account_email_id --key-file=json_file_path
It also helped me to reboot the PC afterwards.
Please consider reading the ImageAnnotatorClient class-level javadocs; they give you the right guidance on how to accomplish the authentication process. I modified the provided code to give you a full example:
// Please, set the appropriate path to your JSON credentials file here
String credentialsPath = "...";
// The credentials could be loaded as well as this.getClass().getResourceAsStream(), for example
GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(credentialsPath))
.createScoped(Lists.newArrayList("https://www.googleapis.com/auth/cloud-platform"));
// Use that credentials to build your image annotation client
ImageAnnotatorSettings imageAnnotatorSettings =
    ImageAnnotatorSettings.newBuilder()
        .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
        .build();
ImageAnnotatorClient imageAnnotatorClient = ImageAnnotatorClient.create(imageAnnotatorSettings);
// Perform the stuff you need to...
The only dependency you need to provide in your pom.xml is this:
<dependency>
<groupId>com.google.cloud</groupId>
<artifactId>google-cloud-vision</artifactId>
<version>2.0.18</version>
</dependency>
In any case, please, consider for reference the pom.xml provided in the Cloud Vision examples.
The Cloud Vision examples repository gives some useful code snippets as well in the Detect class for the detection process itself.
Be sure that the service account you are using to connect to Google Cloud has the necessary permissions.
Although this solution can work, it has the drawback that you probably need to store the credentials file somewhere on your machine, maybe in your source code repository, etcetera, and that can pose a security risk. If you are running your program from Google Cloud, it is always advisable to grant the VM or service you are using the necessary permissions and use the Application Default Credentials.
Using the GOOGLE_APPLICATION_CREDENTIALS environment variable could be preferable as well. If you are trying this variable, first try configuring it using your IDE's mechanisms instead of cmd, PowerShell, or bash, and see if it works.
With either of the two last-mentioned solutions, you don't need to provide any additional information when constructing your ImageAnnotatorClient:
ImageAnnotatorClient imageAnnotatorClient = ImageAnnotatorClient.create();
Your approach is correct.
To authenticate code, you should use a Service Account.
Google provides a useful mechanism called Application Default Credentials (ADCs). See finding credentials automatically. When you use ADCs, Google's SDKs use a predefined mechanism to try to authenticate as the Service Account:
Checking GOOGLE_APPLICATION_CREDENTIALS in your environment, as you've tried;
When running on a GCP service (e.g. Compute Engine) by looking for the service's (Service Account) credentials. With Compute Engine, this is done by checking the so-called Metadata service.
For #1, you can either use GOOGLE_APPLICATION_CREDENTIALS in the process' environment or you can manually load the file as you appear to be trying in your code.
That all said:
I don't see where GoogleCredentials is being imported in your code.
Did you grant the Service Account a suitable role (permissions) so that it can access any other GCP services that it needs?
You should be able to use this List objects example.
The link above, finding credentials automatically, shows how to create a Service Account, assign it a role, and export its key.
You will perhaps want to start (for development!) with roles/storage.objectAdmin (see IAM roles for Cloud Storage) and refine it before deployment.
I am working on creating some scheduled jobs using the Java SDK for Google Cloud Scheduler. Here is the link for the application code, which is already posted as part of another question. The application basically creates a Cloud Scheduler job which, every time it runs, triggers a custom training job on VertexAI. The call from the scheduler to VertexAI to create the custom job is authenticated using the service account. My question is about the authentication of the application code that creates the Cloud Scheduler job itself. I have set this application up as a Maven project and I create a single executable JAR. The application itself runs on my local workstation. The following are my points/questions:
1. When I create a docker image, copy this JAR and the service account key into the image, and then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the key within the container, the application runs fine and the Cloud Scheduler job gets created.
2. When I do the same as above, except that I simply execute the JAR in PowerShell (with the GOOGLE_APPLICATION_CREDENTIALS environment variable pointing to the service account key), permission is denied.
3. Same as 2, except I simply run the application using the Eclipse "Run App" button.
How can I authenticate to run the application without having to run it in a docker container? And is there a way to authenticate without using the GOOGLE_APPLICATION_CREDENTIALS environment variable, i.e., directly in the application code itself? Links to sample code/examples would be helpful.
EDIT:
For point 2, the problem was a typo in the name of the environment variable. For point 3, you can set environment variables directly in Eclipse, as mentioned in the answer by @RJC.
I don't have Eclipse on my machine, but I've found a related answer showing how you can add a specific environment variable within the IDE itself. I suggest you try that and see if it fixes the problem.
There is another way to authenticate without using GOOGLE_APPLICATION_CREDENTIALS, and that's by explicitly pointing to your service account file in your code. I've created a sample that retrieves a job name without using GOOGLE_APPLICATION_CREDENTIALS. Authentication is done by specifying a credentials setting when initializing the CloudSchedulerClient.
Here's what I've done on my end:
Use gcloud iam service-accounts keys create serviceaccount.json --iam-account=NAME@PROJECT_ID.iam.gserviceaccount.com, which will generate a JSON key file for the service account to be used in the CredentialsProvider.
Create a CredentialsProvider object that will call the created JSON file of the service account.
try {
    JobName name = JobName.of("[PROJECT]", "[LOCATION]", "[JOB]");
    CredentialsProvider credentialsProvider =
        FixedCredentialsProvider.create(
            ServiceAccountCredentials.fromStream(new FileInputStream("/path/to/serviceaccount.json")));
    CloudSchedulerSettings cloudSchedulerSettings =
        CloudSchedulerSettings.newBuilder().setCredentialsProvider(credentialsProvider).build();
    CloudSchedulerClient cloudSchedulerClient = CloudSchedulerClient.create(cloudSchedulerSettings);
    System.out.println(cloudSchedulerClient.getJob(name).toString()); // To display the output
    cloudSchedulerClient.close();
} catch (IOException e) {
    e.printStackTrace();
}
For additional guidance, here's an API reference for customizing credentials.
Note that you are using a service account key, which can be read by unauthorized persons if mishandled. My suggestion is to grant your service accounts only the permissions required to perform your task(s). You can also follow these best practices for managing your credentials going forward.
I am following the AWS Docs to set up the credentials. The problem is that it requires me to create a .aws folder on the machine. I want to get the keys and other secrets from a custom location. How can that be achieved?
P.S. If I follow the tutorial's recommendation, then every machine running the project would have to set up that .aws folder, which would be a big hassle for everyone.
Where exactly would you suggest getting the credentials from? You could store them somewhere else, like a HashiCorp Vault server, and write a script or something to pull the values and set them as environment variables, but then you'll need to figure out how to give each computer secure credentials to access the Vault server.
If by "custom location" you simply mean a different local file system location, like a mapped drive or something, then you can specify that using the AWS_CREDENTIAL_PROFILES_FILE environment variable. It sounds like you want to do this on multiple people's workstations, though, and I would caution against sharing credentials files in that scenario. You really want to assign each person different AWS access keys so that you can track each person's AWS API actions and revoke one person's access if they leave the company or something.
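For example, the custom path can be supplied through the environment before launching the application (the path below is a placeholder; check that your SDK version honors this variable):

```shell
# Placeholder path: point this at wherever the credentials file actually lives.
export AWS_CREDENTIAL_PROFILES_FILE="/opt/secrets/aws-credentials"
echo "$AWS_CREDENTIAL_PROFILES_FILE"
```

The Java application started from the same shell session will then read profiles from that file instead of ~/.aws/credentials.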
I recommend reading this page for understanding all the options to configure credentials for the AWS SDK.
Assuming you are using Amazon EC2 to host your application, you can use an IAM role to grant permissions by attaching the role to your EC2 instances.
Furthermore, using an IAM role avoids storing sensitive credential files on your instances.
Read this document, or watch this video to implement it.
I want to automate the creation of a new email from my Java application. For this, I generate a VBS script which collects all the information (email content, subject, attachments, etc.) and opens a new email view in Outlook with all the fields correctly filled. (I don't want to send the email automatically; I just want to create a new mail in the Outlook client.) Below you can see how I run the script from the Java app:
Runtime.getRuntime().exec("wscript " + nameOfScript);
My Java app is always run with admin privileges.
When the user has the Outlook application open with admin privileges, everything works fine.
When the user has the Outlook application open without admin privileges, the script no longer works. It fails on the following line:
Set Outlook = GetObject(, "Outlook.Application")
Error code is 429 -> https://support.microsoft.com/en-ca/help/828550/you-receive-run-time-error-429-when-you-automate-office-applications
It seems that because I am running the script with admin privileges, the GetObject function will always fail if Outlook is running without admin privileges.
Is there a workaround for this issue?
Thanks in advance for your help!
You can use the runas command from Windows, and your command will become something like this:
String command = "runas /user:" + domain + "\\" + user + " \"wscript C:\\Path\\to\\your\\script.vbs\"";
The bad part is that the runas command requires the user's password, and the worst part is that you cannot provide it via a process from Java. It must be typed at the keyboard.
This will NOT work:
while ((line = is.readLine()) != null)
{
    System.out.println(line);
    if (line.toLowerCase().matches(".*enter.*password.*"))
    {
        System.out.println("Writing password for the user");
        os.write(password);
        os.write(System.getProperty("line.separator"));
        os.close();
    }
}
What you could do instead is write a .bat file with the runas command above, execute it, and let the user type their password into the cmd window.
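The .bat approach can be sketched from the Java side like this (the domain, user name, and script path are placeholder values; the user still types the password into the console window that opens):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RunAsLauncher {
    // Builds the runas command line; all arguments here are placeholders.
    static String buildCommand(String domain, String user, String scriptPath) {
        return "runas /user:" + domain + "\\" + user + " \"wscript " + scriptPath + "\"";
    }

    public static void main(String[] args) throws IOException {
        String command = buildCommand("MYDOMAIN", "someuser", "C:\\scripts\\newmail.vbs");
        Path bat = Files.createTempFile("launch", ".bat");
        Files.write(bat, command.getBytes());
        // On Windows, open a visible console so the user can type the password:
        // new ProcessBuilder("cmd", "/c", "start", "", bat.toString()).start();
        System.out.println(Files.readAllLines(bat).get(0));
    }
}
```

Because the password prompt happens in the spawned console, the Java process never has to handle the password at all.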
Based on your comment and the error message, it is difficult to pinpoint the exact cause of the problem. I would like to offer some suggestions, as follows:
The object is available on the machine, but it is a licensed Automation object, and can't verify the availability of the license necessary to instantiate it.
Some objects can be instantiated only after the component finds a license key, which verifies that the object is registered for instantiation on the current machine. When a reference is made to an object through a properly installed type library or object library, the correct key is supplied automatically.
If the attempt to instantiate is the result of a CreateObject or GetObject call, the object must find the key. In this case, it may search the system registry or look for a special file that it creates when it is installed, for example, one with the extension .lic. If the key can't be found, the object can't be instantiated. If an end user has improperly set up the object's application, inadvertently deleted a necessary file, or changed the system registry, the object may not be able to find its key. In this case, the instantiation may work on the developer's system, but not on the user's system. It may be necessary for the user to reinstall the licensed object.
You are trying to use the GetObject function to retrieve a reference to class created with Visual Basic.
GetObject can't be used to obtain a reference to a class created with Visual Basic.
Access to the object has explicitly been denied. For example, you may be trying to access a data object that's currently being used and is locked to prevent deadlock situations. If that's the case, you may be able to access the object at another time.
For more information, please refer to these links:
Run-time error '429': ActiveX component can't create object
Run-Time error 429 when using GetObject (,"Outlook.Application")
We want to download files from Google Storage in our application server. It is important to have read-only restricted access to a single bucket and nothing else.
At first I used a regular user account (not a service account) which has permissions to access all buckets in our Google Cloud project, and everything worked fine: my Java code opened buckets and downloaded files without problems.
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Then I wanted to switch to use a specially created service account which has access to a single bucket only. So I created a service account, gave permissions to read a single bucket, and downloaded its key file. The permissions in Google Cloud Console are named as:
Storage Object Viewer (3 members): Read access to GCS objects.
The gsutil command-line utility works fine with this account: from the command line it allows access to this bucket but not the others.
The initialization from the command line is done using the following command:
gcloud --project myprojectname auth activate-service-account files-viewer2#myprojectname.iam.gserviceaccount.com --key-file=/.../keyfilename.json
I even tried two different service accounts which have access to different buckets, and from the command line I can switch between them and gsutil gives access to a relevant bucket only, and for any other it returns the error:
"AccessDeniedException: 403 Caller does not have storage.objects.list access to bucket xxxxxxxxxx."
So, from the command line everything worked fine.
But in Java there is some problem with the authentication.
The default authentication I previously used with a regular user account stopped working - it reports the error:
com.google.cloud.storage.StorageException: Anonymous users does not have storage.buckets.get access to bucket xxxxxxxxxx.
Then I've tried the following code (this is the simplest variant because it relies on the key json file, but I've already tried a number of other variants found in various forums, with no success):
FileInputStream fis = new FileInputStream( "/path/to/the/key-file.json" );
ServiceAccountCredentials credentials = ServiceAccountCredentials.fromStream( fis );
Storage storage = StorageOptions.newBuilder().setCredentials( credentials )
.setProjectId( "myprojectid" ).build().getService();
Bucket b = storage.get( "mybucketname" );
And all I receive is this error:
com.google.cloud.storage.StorageException: Caller does not have storage.buckets.get access to bucket mybucketname.
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
The same error is returned no matter to what buckets I'm trying to access (even non-existing).
What confuses me is that the same service account, initialized with the same JSON key file, works fine from the command line.
So I think something is missing in Java code that ensures correct authentication.
TL;DR - If you're using Application Default Credentials (which BTW you are when you do StorageOptions.getDefaultInstance().getService();), and if you need to use the credentials from a service account, you can do so without changing your code. All you need to do is set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the full path of your service account json file and you are all set.
Longer version of the solution using Application Default Credentials
Use your original code as-is
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of your json file containing the service account credentials.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service_account_credentials.json
Run your java application once again to verify that it is working as expected.
Alternate solution using hard-coded Service Account Credentials
The code example you posted for initializing ServiceAccountCredentials looks valid to me on a quick glance. I tried the following code snippet and it is working for me as expected.
String SERVICE_ACCOUNT_JSON_PATH = "/path/to/service_account_credentials.json";
Storage storage =
    StorageOptions.newBuilder()
        .setCredentials(
            ServiceAccountCredentials.fromStream(
                new FileInputStream(SERVICE_ACCOUNT_JSON_PATH)))
        .build()
        .getService();
Bucket b = storage.get("mybucketname");
When specifying a service account credential, the project ID is automatically picked up from the information present in the json file. So you do not have to specify it once again. I'm not entirely sure though if this is related to the issue you're observing.
Application Default Credentials
Here is the full documentation regarding Application Default Credentials explaining which credentials are picked up based on your environment.
How the Application Default Credentials work
You can get Application Default Credentials by making a single client
library call. The credentials returned are determined by the
environment the code is running in. Conditions are checked in the
following order:
1. The environment variable GOOGLE_APPLICATION_CREDENTIALS is checked. If this variable is specified, it should point to a file that defines the credentials. The simplest way to get a credential for this purpose is to create a Service account key in the Google API Console:
a. Go to the API Console Credentials page.
b. From the project drop-down, select your project.
c. On the Credentials page, select the Create credentials drop-down, then select Service account key.
d. From the Service account drop-down, select an existing service account or create a new one.
e. For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
f. Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
g. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file downloaded.
2. If you have installed the Google Cloud SDK on your machine and have run the command gcloud auth application-default login, your identity can be used as a proxy to test code calling APIs from that machine.
3. If you are running in Google App Engine production, the built-in service account associated with the application will be used.
4. If you are running in Google Compute Engine production, the built-in service account associated with the virtual machine instance will be used.
5. If none of these conditions is true, an error will occur.
IAM roles
I would recommend going over the IAM permissions and the IAM roles available for Cloud Storage. These provide control at project and bucket level. In addition, you can use ACLs to control permissions at the object level within the bucket.
If your use case involves just invoking storage.get(bucketName), that operation requires only the storage.buckets.get permission, and the best-fitting IAM role for just this permission is roles/storage.legacyBucketReader.
If you also want to grant the service account permissions to get (storage.objects.get) and list (storage.objects.list) individual objects, then also add the role roles/storage.objectViewer to the service account.
Thanks to @Taxdude's long explanation, I understood that my Java code should be all right, and started looking at other possible reasons for the problem.
One of the additional things I checked was the set of permissions given to the service account, and there I found the solution, which was unexpected, actually.
When a service account is created, it must not be given permissions to read from Google Storage, because it will then have read permissions to ALL buckets, and it is impossible to change that (not sure why), because the system marks these permissions as "inherited".
Therefore, you have to:
Create a "blank" service account with no permissions, and
Configure permissions from the bucket configuration
To do so:
Open Google Cloud Web console
Open Storage Browser
Select your bucket
Open the INFO PANEL with Permissions
Add the service account with the Storage Object Viewer permission; note that there are also permissions named Storage Legacy Object Reader and Storage Legacy Bucket Reader.
Because of the word "Legacy", I thought those should not be used; they look like something kept for backward compatibility. But after experimenting and adding these "legacy" permissions, all of a sudden the same code I had been trying all along started working properly.
I'm still not entirely sure what the minimal set of permissions to assign to a service account is, but at least now it works with all three "read" permissions on the bucket: two "legacy" and one "normal".