Connecting to the Google Cloud Datastore emulator - Java

I have installed the Google Cloud Datastore emulator on my local machine and written a sample Spring Boot application against it.
I can't connect to the Datastore emulator.
This is my application.properties config:
spring.cloud.gcp.datastore.project-id=project-id
spring.cloud.gcp.datastore.emulator.enabled=true
spring.cloud.gcp.datastore.emulator-host=http://localhost:8081
With this config, the application throws this exception:
The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.

When using the Datastore emulator, you don't need credentials to run the application, so it might be that the library doesn't know that.
However, if you want to try providing credentials anyway, create a service account and then run the following in the shell:
export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"
Replace KEY_PATH with the path of the JSON file that contains your service account key. You can find more information here.
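If you want to take credentials out of the picture entirely, the plain Java client can be pointed at the emulator with NoCredentials. A minimal sketch, assuming the emulator listens on localhost:8081 and the project id from the question:

import com.google.cloud.NoCredentials;
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;

public class EmulatorConnection {
    public static void main(String[] args) {
        // Point the client at the local emulator and skip real credentials.
        // "localhost:8081" and "project-id" are assumptions taken from the question.
        Datastore datastore = DatastoreOptions.newBuilder()
                .setProjectId("project-id")
                .setHost("localhost:8081") // host:port, no scheme
                .setCredentials(NoCredentials.getInstance())
                .build()
                .getService();
        System.out.println("Connected to " + datastore.getOptions().getHost());
    }
}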

Related

Cannot authenticate to Google Cloud

I am trying to use the Google Cloud Translation API, but I cannot authenticate to Google Cloud.
TranslationServiceClient client = TranslationServiceClient.create();
This line fires the error below:
W/System.err: java.io.IOException: The Application Default Credentials are not available. They are available if running in Google Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined pointing to a file defining the credentials. See https://developers.google.com/accounts/docs/application-default-credentials for more information.
I created the project on Google Cloud, downloaded the JSON file when I created the Service Account as required by the API, and exported the variable on the command line, but it still does not work:
export GOOGLE_APPLICATION_CREDENTIALS="/Users/admin/Desktop/franzosischlernen-d8db6-9457de40bc93.json"

Unable to load AWS credentials on EC2 Instance

The application
A simple REST registration service in Spring: after a proper POST request, a new user is created in the database and Amazon SES sends an email with a verification link.
The problem
Locally, after setting the environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION) in my OS (Windows), the app works just fine, but the problems start after deploying it. I have an EC2 instance with the Amazon Linux AMI on AWS:
created a user in AWS Identity and Access Management (IAM) and granted it the AmazonSESFullAccess policy
logged in from CMD using ssh with the private key file *.pem
started the Tomcat8 service
started the MySQL service
deployed the application *.war file
created environment variables using the 'export' command and checked with 'printenv' to be sure everything was fine
after sending a POST request I got the exception below, which means the user was created but Amazon SES didn't send the confirmation email because it couldn't authenticate
{
"timestamp": "2020-04-26T15:44:44.010+0000",
"message": "Unable to load AWS credentials from any provider in the chain: [EnvironmentVariableCredentialsProvider: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY)), SystemPropertiesCredentialsProvider: Unable to load AWS credentials from Java system properties (aws.accessKeyId and aws.secretKey), WebIdentityTokenCredentialsProvider: To use assume role profiles the aws-java-sdk-sts module must be on the class path., com.amazonaws.auth.profile.ProfileCredentialsProvider#23fac1a3: profile file cannot be null, com.amazonaws.auth.EC2ContainerCredentialsProviderWrapper#68aa5a98: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/]"
}
I checked the environment variables on my EC2 instance again and they looked fine, but to be sure I re-configured them using the 'aws configure' command.
The exception keeps showing up; somehow the application cannot see the environment variables. I've been fighting with this for over 5 hours now, so hopefully someone will come here to rescue me...
Piece of code (works fine locally):
AmazonSimpleEmailService client =
        AmazonSimpleEmailServiceClientBuilder
                .standard()
                .withCredentials(DefaultAWSCredentialsProviderChain.getInstance())
                .withRegion(Regions.EU_CENTRAL_1)
                .build();
I am a total Linux noob having problems with even simple commands, so please be gentle with solutions that require console commands.
If you're running the app on EC2, don't use an IAM user.
Instead, create an IAM role with the same permissions and assign that role to the instance; if the app uses the AWS SDK, it will be able to pick up the credentials without any problems.
In your case the problem is probably that the app's environment differs from yours: credentials exported in your bash session will not be visible to the app if it runs under a different user or session.
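With a role attached, the last link of the default chain (the EC2 metadata lookup mentioned in the error above) finds the credentials automatically. As a sketch, you could also request the instance profile credentials explicitly to verify the role is visible; this assumes AWS SDK for Java v1, as in the question:

import com.amazonaws.auth.InstanceProfileCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailService;
import com.amazonaws.services.simpleemail.AmazonSimpleEmailServiceClientBuilder;

public class SesWithInstanceRole {
    public static void main(String[] args) {
        // Resolves credentials from the EC2 instance metadata service;
        // this only works when the instance has an IAM role attached.
        AmazonSimpleEmailService client = AmazonSimpleEmailServiceClientBuilder
                .standard()
                .withCredentials(InstanceProfileCredentialsProvider.getInstance())
                .withRegion(Regions.EU_CENTRAL_1)
                .build();
        System.out.println("SES client built using the instance role.");
    }
}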
The DefaultAWSCredentialsProviderChain has multiple places it will look for credentials. Instead of setting up your credentials as environment variables, you can set up a credentials profile. See this documentation: Working with AWS Credentials.
Make sure you have the AWS CLI installed, then you can run the following command to configure your profile: aws configure
Click here for the documentation on the aws configure command.
If you have already configured your AWS profile and it still does not work, you have most likely configured the profile for the wrong Linux user. For example, if a Linux user named tomcat8 is the user who runs your Tomcat instance, then you need to set up a credentials profile at /home/tomcat8/.aws/credentials.
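For reference, the profile file that the default chain reads uses this standard layout (the values are placeholders); it must live under the home directory of the user that actually runs Tomcat:

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY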

How to connect to Azure Key Vault from a Spring Boot application for local development using MSI

I am trying to connect to Azure Key Vault from my locally running Spring Boot application. I can't keep the secrets in properties or yaml files during dev, because my application will generate and delete many secrets and tokens to be saved in Key Vault at run time.
I am aware of the process in which you create an Azure service principal from an application registration and use
azure.keyvault.client-id
azure.keyvault.client-key
in application.properties to connect.
But creating an Azure service principal may not be allowed in our case. So is there any way to connect to Key Vault using MSI (MSI_ENDPOINT and MSI_SECRET) from a locally running Spring Boot application?
So is there any way to connect to Key Vault using MSI (MSI_ENDPOINT and MSI_SECRET) from a locally running Spring Boot application?
I don't think you can use MSI_ENDPOINT and MSI_SECRET to get the token locally; it only works when the web app is published in the cloud.
But creating an Azure service principal may not be allowed in our case.
As you know, you can use the service principal's client id and secret (key) to access the Key Vault. Actually, when you enable MSI on the web app, it automatically creates a service principal in your Azure AD tenant, so you can just use that principal's client id and secret.
Navigate to the portal -> Azure Active Directory -> Enterprise applications -> search for your web app name (select the Application Type All Applications); there you get the client id (application id).
Note: remember to check that the object id of the service principal matches the one in your web app -> Identity, to make sure you use the correct one.
For the service principal secret, you can create one via PowerShell as below (your account needs the Application administrator or Global administrator role in your AAD tenant).
New-AzureADServicePrincipalPasswordCredential -ObjectId <service principal object id>
Then you will be able to access the Key Vault with the client id and secret. For details in Java, you can refer to this link.
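As a sketch, the local configuration in application.properties would then look something like this; the vault name, client id and secret are placeholders, and the property names assume the Azure Key Vault Secrets Spring Boot starter referenced in the question:

azure.keyvault.enabled=true
azure.keyvault.uri=https://your-keyvault-name.vault.azure.net/
azure.keyvault.client-id=application-id-of-the-msi-service-principal
azure.keyvault.client-key=secret-created-with-the-powershell-command-above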
You can't get a token using those variables locally, because there is no Azure AD identity registered on your local machine, and Microsoft didn't build an MSI emulator, so those variables will never be set.
I can recommend what Microsoft did in their .NET library:
Run Azure CLI and log in.
In code, check for the variables; if they don't exist, run the CLI command
az account get-access-token --resource 'https://vault.azure.net'
In the CLI you can log in with either a service principal or your own account. Make sure to add that account to the Key Vault access policy.
I know it's weird, but you can even check it HERE on their GitHub.
I have an article that may help if you want more details:
https://marczak.io/posts/2019/07/securing-websites-with-msi/
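In Java, the azure-identity library wraps the same trick: AzureCliCredential obtains the token from your az login session. A minimal sketch, assuming the newer Azure SDK (azure-identity and azure-security-keyvault-secrets) and placeholder vault URL and secret name:

import com.azure.identity.AzureCliCredential;
import com.azure.identity.AzureCliCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;

public class LocalKeyVaultAccess {
    public static void main(String[] args) {
        // Reuses the token from your "az login" session; no MSI needed locally.
        AzureCliCredential credential = new AzureCliCredentialBuilder().build();
        SecretClient client = new SecretClientBuilder()
                .vaultUrl("https://your-keyvault-name.vault.azure.net") // placeholder
                .credential(credential)
                .buildClient();
        System.out.println(client.getSecret("my-secret-name").getValue()); // placeholder name
    }
}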

Google Cloud Storage with a service account in Java - 403 Caller does not have storage.objects.list access to bucket

We want to download files from Google Storage on our application server. It is important to have read-only, restricted access to a single bucket and nothing else.
At first I used a regular user account (not a service account) which has permissions to access all buckets in our Google Cloud project, and everything worked fine - my Java code opened buckets and downloaded files without problems.
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Then I wanted to switch to a specially created service account which has access to a single bucket only. So I created a service account, gave it permissions to read a single bucket, and downloaded its key file. The permission in the Google Cloud Console is named:
Storage Object Viewer (3 members) Read access to GCS objects.
The gsutil command-line utility works fine with this account - from the command line it allows access to this bucket but not the others.
The initialization from the command line is done using the following command:
gcloud --project myprojectname auth activate-service-account files-viewer2@myprojectname.iam.gserviceaccount.com --key-file=/.../keyfilename.json
I even tried two different service accounts which have access to different buckets; from the command line I can switch between them, and gsutil gives access to the relevant bucket only, returning this error for any other:
"AccessDeniedException: 403 Caller does not have storage.objects.list access to bucket xxxxxxxxxx."
So, from the command line everything worked fine.
But in Java there is some problem with the authentication.
The default authentication I previously used with a regular user account stopped working - it reports the error:
com.google.cloud.storage.StorageException: Anonymous users does not have storage.buckets.get access to bucket xxxxxxxxxx.
Then I tried the following code (this is the simplest variant because it relies on the key JSON file, but I've already tried a number of other variants found in various forums, with no success):
FileInputStream fis = new FileInputStream( "/path/to/the/key-file.json" );
ServiceAccountCredentials credentials = ServiceAccountCredentials.fromStream( fis );
Storage storage = StorageOptions.newBuilder().setCredentials( credentials )
.setProjectId( "myprojectid" ).build().getService();
Bucket b = storage.get( "mybucketname" );
And all I receive is this error:
com.google.cloud.storage.StorageException: Caller does not have storage.buckets.get access to bucket mybucketname.
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
The same error is returned no matter which bucket I try to access (even non-existing ones).
What confuses me is that the same service account, initialized with the same JSON key file, works fine from the command line.
So I think something is missing in the Java code that would ensure correct authentication.
TL;DR - If you're using Application Default Credentials (which, BTW, you are when you do StorageOptions.getDefaultInstance().getService()), and you need to use the credentials from a service account, you can do so without changing your code. All you need to do is set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the full path of your service account json file and you are all set.
Longer version of the solution using Application Default Credentials
Use your original code as-is
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of your json file containing the service account credentials.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service_account_credentials.json
Run your java application once again to verify that it is working as expected.
Alternate solution using hard-coded Service Account Credentials
The code example you posted for initializing ServiceAccountCredentials looks valid to me at a quick glance. I tried the following code snippet and it works for me as expected.
String SERVICE_ACCOUNT_JSON_PATH = "/path/to/service_account_credentials.json";
Storage storage =
StorageOptions.newBuilder()
.setCredentials(
ServiceAccountCredentials.fromStream(
new FileInputStream(SERVICE_ACCOUNT_JSON_PATH)))
.build()
.getService();
Bucket b = storage.get("mybucketname");
When specifying service account credentials, the project ID is automatically picked up from the information in the json file, so you do not have to specify it again. I'm not entirely sure, though, whether this is related to the issue you're observing.
Application Default Credentials
Here is the full documentation regarding Application Default Credentials explaining which credentials are picked up based on your environment.
How the Application Default Credentials work
You can get Application Default Credentials by making a single client library call. The credentials returned are determined by the environment the code is running in. Conditions are checked in the following order:
1. The environment variable GOOGLE_APPLICATION_CREDENTIALS is checked. If this variable is specified, it should point to a file that defines the credentials. The simplest way to get a credential for this purpose is to create a service account key in the Google API Console:
a. Go to the API Console Credentials page.
b. From the project drop-down, select your project.
c. On the Credentials page, select the Create credentials drop-down, then select Service account key.
d. From the Service account drop-down, select an existing service account or create a new one.
e. For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
f. Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
g. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file downloaded.
2. If you have installed the Google Cloud SDK on your machine and have run the command gcloud auth application-default login, your identity can be used as a proxy to test code calling APIs from that machine.
3. If you are running in Google App Engine production, the built-in service account associated with the application will be used.
4. If you are running in Google Compute Engine production, the built-in service account associated with the virtual machine instance will be used.
5. If none of these conditions is true, an error will occur.
IAM roles
I would recommend going over the IAM permissions and the IAM roles available for Cloud Storage. These provide control at project and bucket level. In addition, you can use ACLs to control permissions at the object level within the bucket.
If your use case involves just invoking storage.get(bucketName), that operation requires only the storage.buckets.get permission, and the narrowest IAM role containing it is roles/storage.legacyBucketReader.
If you also want to grant the service account permissions to get (storage.objects.get) and list (storage.objects.list) individual objects, then also add the role roles/storage.objectViewer to the service account.
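For example, both roles can be granted on just the one bucket from the command line; the service account address and bucket name below reuse the names from the question:

gsutil iam ch serviceAccount:files-viewer2@myprojectname.iam.gserviceaccount.com:roles/storage.legacyBucketReader gs://mybucketname
gsutil iam ch serviceAccount:files-viewer2@myprojectname.iam.gserviceaccount.com:roles/storage.objectViewer gs://mybucketname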
Thanks to @Taxdude's long explanation, I understood that my Java code should be all right, and started looking at other possible reasons for the problem.
One of the additional things I tried was the permissions set on the service account, and there I found the solution - it was unexpected, actually.
When a service account is created, it must not be given project-wide permissions to read from Google Storage, because then it will have read permissions to ALL buckets, and it is impossible to change that (not sure why), because the system marks these permissions as "inherited".
Therefore, you have to:
Create a "blank" service account with no permissions, and
Configure permissions from the bucket configuration
To do so:
Open Google Cloud Web console
Open Storage Browser
Select your bucket
Open the INFO PANEL with Permissions
Add the service account with the Storage Object Viewer role; note there are also roles named Storage Legacy Object Reader and Storage Legacy Bucket Reader
Because of the word "Legacy" I thought those should not be used - they look like something kept for backward compatibility. But after experimenting and adding these "legacy" roles, all of a sudden the same code I had been trying all along started working properly.
I'm still not entirely sure what the minimal set of permissions to assign to a service account is, but at least now it works with all three "read" roles on the bucket - two "legacy" and one "normal".

AWS Java SDK credentials on a Linux EC2 instance

I've created a Java web application on a Tomcat server which starts another instance using the AWS Java SDK; on Windows I just place the credentials under my user directory. I'm now trying to host my application on an AWS EC2 instance, and hence I'm trying to place my credentials on the Linux EC2. I've followed the steps in the AWS SDK guide - http://docs.aws.amazon.com/AWSSdkDocsJava/latest/DeveloperGuide/java-dg-setup.html - but I'm still thrown the same error upon calling the method:
Cannot load the credentials from the credential profiles file. Please
make sure that your credentials file is at the correct location
(~/.aws/credentials), and is in valid format.
I've created a .aws folder in my home directory and placed the credentials file within it; I've also added the export commands to the .bashrc file, but it doesn't seem to work.
At wits' end here :(
Check which user Tomcat runs as on the other machine.
When you store the credentials for your user, they are stored at ~/.aws/credentials.
That's OK for you, but Tomcat may not be running as you.
So ensure a copy is also present at /home/{whateveryourtomcatuseris}/.aws/credentials.
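A quick way to check which user owns the Tomcat process and to copy the credentials over (a generic sketch; 'tomcatuser' is a placeholder for whatever the first command prints):

ps aux | grep tomcat            # first column shows the user running Tomcat
sudo mkdir -p /home/tomcatuser/.aws
sudo cp ~/.aws/credentials /home/tomcatuser/.aws/credentials
sudo chown -R tomcatuser /home/tomcatuser/.aws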
