I am authenticated into Google Cloud both locally and within our Kubernetes pods; in both environments gcloud info returns the correct response.
However, when I want to access Google Drive, I need to use GoogleCredential as follows:
GoogleCredential.Builder()
    .setTransport(transport)
    .setJsonFactory(jsonFactory)
    .setServiceAccountPrivateKey(privateKey)
    .setServiceAccountId(serviceAccount)
    .setServiceAccountScopes(scopes.toList())
    .build()
Meaning - I need to specifically set privateKey and serviceAccount. Is there a way to force it to use the locally authenticated account?
When using Google buckets this can be done quite easily:
StorageOptions.getDefaultInstance().service.options.credentials
I cannot find the same way for Google Drive.
As @DazWilkin indicated, many GCP client libraries (such as the GCS library in your case) know how to automatically detect the Application Default Credentials (ADC) that are available.
These ADC credentials currently work only with Google Cloud Platform APIs (Google Drive predates that). You can read the Google Drive Java quickstart to learn how to retrieve credentials: https://developers.google.com/drive/api/v3/quickstart/java
For reference, GCP client libraries look for ADC in this order:
1. The GOOGLE_APPLICATION_CREDENTIALS environment variable, if set, pointing to the JSON key file of a service account.
2. %APPDATA%/gcloud/application_default_credentials.json (Windows) or $HOME/.config/gcloud/application_default_credentials.json (elsewhere), if the user has executed the gcloud auth application-default login command.
3. On Google App Engine 1st gen (not GAE Flex), the appengine.AccessToken API.
4. On GCE, GKE or GAE 2nd gen environments, a call to the GCE Metadata API (a URL like http://metadata.google.internal or http://169.254.169.254) to retrieve a short-lived access_token.
In your case, your GKE pods are using method #4 to retrieve a token for GCS bucket operations, but not for the Drive API.
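If you want to see what ADC resolves to, you can trigger the same lookup explicitly with the google-auth-library; here is a minimal sketch, assuming the google-auth-library-oauth2-http and google-cloud-storage dependencies are on the classpath:

import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class AdcExample {
    public static void main(String[] args) throws Exception {
        // Walks the ADC chain described above: env var, gcloud user credentials,
        // App Engine, then the GCE/GKE metadata server.
        GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();

        // The GCS client performs the same lookup implicitly when no credentials are set.
        Storage storage = StorageOptions.newBuilder()
                .setCredentials(credentials)
                .build()
                .getService();
        System.out.println(storage.getOptions().getCredentials());
    }
}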
I am using GCP Secret Manager to store passwords, via the google-cloud-secretmanager client library (Java). The client library expects the path to a service account key (JSON) file for a GCP project in an environment variable. I am able to do this for a single project, but when I try to access the Secret Manager of multiple GCP projects, I don't know how to set the keys for the different projects in the environment variable. I need help setting the keys in the environment, or is there a way to set them in Java code?
I am using this Maven dependency:
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-secretmanager</artifactId>
</dependency>
https://cloud.google.com/secret-manager/docs/reference/libraries
Thanks in advance.
Like most Google Cloud services, Google Secret Manager supports cross-project permissions. You can grant your service account access to secrets in other projects by applying the IAM permission to the service account. Even though the service account resides in project-a, it can still be given permission to access Secret Manager secrets in project-b:
gcloud secrets add-iam-policy-binding "my-secret" \
    --project "project-b" \
    --member "serviceAccount:my-service-account@project-a.iam.gserviceaccount.com" \
    --role "roles/secretmanager.secretAccessor"
As an aside, the client library does not require the path to a JSON service account key. It accepts one, but you can provide authentication via multiple paths, including Application Default Credentials (preferred).
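Once the binding is in place, the same client can read secrets from the other project without any extra key files, as long as ADC resolves to that service account; a minimal sketch (the project, secret and version IDs below are placeholders):

import com.google.cloud.secretmanager.v1.AccessSecretVersionResponse;
import com.google.cloud.secretmanager.v1.SecretManagerServiceClient;
import com.google.cloud.secretmanager.v1.SecretVersionName;

public class CrossProjectSecret {
    public static void main(String[] args) throws Exception {
        // Uses Application Default Credentials; no JSON key path is required.
        try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
            // The secret lives in project-b, while the caller's service account is in project-a.
            SecretVersionName name = SecretVersionName.of("project-b", "my-secret", "latest");
            AccessSecretVersionResponse response = client.accessSecretVersion(name);
            System.out.println(response.getPayload().getData().toStringUtf8());
        }
    }
}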
My Java service will run on my computers (let's say I'll have more than 1,000 computers) and will send some data to S3. I use the AWS Java SDK for it.
If I'm right, to do that I need an access key & secret key on each of my computers (let's say in the .aws/credentials file).
I have read a lot of AWS documentation about best practices for programmatic access to resources, but I still can't understand it.
1. Rotating access keys: after an access key is rotated, how can I change it in all the applications running on my computers? Should my application update itself?
2. Temporary credentials: with this approach, do I still need an access key & secret key on my computers? If yes, I have the same problem as in Q1.
Can somebody advise me on the best and most secure way to programmatically access AWS resources in my situation? What do I need to do with the access key & secret key?
Thank you.
UPDATES:
Computers are in different networks
Java app sends to S3 and also reads from S3
New computers can be added at any time
The computers will need AWS credentials to talk with S3.
The simplest way is to store the credentials on each computer. However, as you say, it makes it hard to rotate the keys.
Another option is to store the credentials in a database that they can access, so they always get the latest credentials. However, they will need some sort of login to access the database.
Alternatively, you could set up identity federation, so that the computers can authenticate against something like Active Directory, and then you can write a central service that provides temporary credentials to each computer.
The process is basically:
The computers authenticate to AD
They call your service and prove that they are authenticated to AD
Your service then calls STS and generates temporary credentials valid for up to 36 hours
It provides those credentials to the computers
See: GetFederationToken - AWS Security Token Service
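A rough sketch of what the central service's STS call could look like with the AWS SDK for Java v2 (the federated user name, policy ARN and duration below are placeholders):

import software.amazon.awssdk.services.sts.StsClient;
import software.amazon.awssdk.services.sts.model.Credentials;
import software.amazon.awssdk.services.sts.model.GetFederationTokenRequest;
import software.amazon.awssdk.services.sts.model.GetFederationTokenResponse;
import software.amazon.awssdk.services.sts.model.PolicyDescriptorType;

public class TokenVendor {
    public static Credentials vendCredentials(String computerId) {
        // The central service itself runs with long-lived IAM credentials;
        // the computers only ever see the temporary credentials returned here.
        try (StsClient sts = StsClient.create()) {
            GetFederationTokenResponse response = sts.getFederationToken(
                    GetFederationTokenRequest.builder()
                            .name(computerId) // shows up in CloudTrail for auditing
                            // Effective permissions are the intersection of the caller's
                            // policy and the policy passed here (placeholder managed policy).
                            .policyArns(PolicyDescriptorType.builder()
                                    .arn("arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess")
                                    .build())
                            .durationSeconds(36 * 60 * 60) // up to 36 hours
                            .build());
            return response.credentials();
        }
    }
}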
AFAIK you need to ensure that the application on each computer has an up-to-date access key. My recommendation is to store the access key in a centralized place from which the application retrieves it. Thus, once you rotate the key and update the centralized storage, the change will be reflected in all your application instances.
The AWS Java SDKs use a credential chain. The credential chain just means the SDK will look for credentials in 6 different places in this order:
Java system properties–aws.accessKeyId and aws.secretAccessKey. The AWS SDK for Java uses the SystemPropertyCredentialsProvider to load these credentials.
Environment variables–AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY. The AWS SDK for Java uses the EnvironmentVariableCredentialsProvider class to load these credentials.
Web identity token from AWS STS– The AWS SDK for Java uses the WebIdentityTokenFileCredentialsProvider to load these credentials.
The default credential profiles file– The specific location of this file can vary per platform, but is typically located at ~/.aws/credentials. This file is shared by many of the AWS SDKs and by the AWS CLI. The AWS SDK for Java uses the ProfileCredentialsProvider to load these credentials.
You can create a credentials file by using the aws configure command provided by the AWS CLI. You can also create it by editing the file with a text editor. For information about the credentials file format, see AWS Credentials File Format.
Amazon ECS container credentials– This is loaded from Amazon ECS if the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI is set. The AWS SDK for Java uses the ContainerCredentialsProvider to load these credentials.
Instance profile credentials– This is used on Amazon EC2 instances, and delivered through the Amazon EC2 metadata service. The AWS SDK for Java uses the InstanceProfileCredentialsProvider to load these credentials.
https://docs.aws.amazon.com/sdk-for-java/v2/developer-guide/credentials.html
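In practice this means your code never has to mention credentials at all; a minimal sketch with the v2 SDK that relies on the default chain above (the region, bucket and key are placeholders):

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class S3Upload {
    public static void main(String[] args) {
        // No explicit credentials: the SDK walks the chain listed above
        // (system properties, env vars, web identity, profile file, ECS, instance profile).
        S3Client s3 = S3Client.builder()
                .region(Region.EU_WEST_1) // placeholder region
                .build();

        s3.putObject(PutObjectRequest.builder()
                        .bucket("my-bucket")      // placeholder bucket
                        .key("data/report.txt")   // placeholder key
                        .build(),
                RequestBody.fromString("hello"));
    }
}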
There are lots of questions and articles on how to do this with .NET, but how (and is it even possible) to easily authenticate for local development through an Azure AD shared secret credential using Java (Spring Boot specifically)?
For .NET, it is as easy as specifying the RunAs=CurrentUser property in the connection string to connect to the Azure Key Vault (per this article: https://learn.microsoft.com/en-us/azure/key-vault/service-to-service-authentication); it connects automatically, assuming my account is listed in the access policy for the key vault I want to access. Ideally, I would not want to pull in a thousand Java dependencies to do this. I could manually obtain a token to authenticate, but it would be nice to save developers the hassle of having to manually obtain a token from Azure every time we want to test things during local development.
Thanks!
Here is an example of using MSICredentials: Read Azure key vault secret through MSI in Java
Just try using AzureCliCredentials instead: https://azure.github.io/azure-sdk-for-java/com/microsoft/azure/credentials/AzureCliCredentials.html
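A rough sketch with the older azure-keyvault and azure-client-authentication libraries that the linked javadoc belongs to (the vault URL and secret name are placeholders; the newer azure-identity packages use different class names):

import com.microsoft.azure.credentials.AzureCliCredentials;
import com.microsoft.azure.keyvault.KeyVaultClient;

public class LocalDevKeyVault {
    public static void main(String[] args) throws Exception {
        // Reuses the token cached by `az login`, so developers don't have to
        // obtain a token by hand for local runs.
        AzureCliCredentials credentials = AzureCliCredentials.create();
        KeyVaultClient client = new KeyVaultClient(credentials);
        String secret = client.getSecret("https://my-vault.vault.azure.net", "my-secret")
                .value();
        System.out.println(secret);
    }
}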
I am using Google App Engine and Google Datastore.
I am using the Google library com.google.cloud.datastore.Datastore.
My sample code is:
Datastore datastore = DatastoreOptions.getDefaultInstance().getService();
Key taskKey = datastore.newKeyFactory().setKind(entityName).newKey(id);
Entity task = Entity.newBuilder(taskKey)
        // populate some fields....
        .build();
datastore.put(task);
I am using Spring Boot with Jetty as the container.
Locally, it works properly and the data is updated in Google Datastore.
The issue is that when I deploy the app to Google App Engine, I get the exception below when the code reaches the datastore.put method.
com.google.cloud.datastore.DatastoreException: Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.
com.google.cloud.datastore.spi.v1.HttpDatastoreRpc.translate(HttpDatastoreRpc.java:129)
com.google.cloud.datastore.spi.v1.HttpDatastoreRpc.commit(HttpDatastoreRpc.java:155)
com.google.cloud.datastore.DatastoreImpl$4.call(DatastoreImpl.java:485)
Note: in both environments, local and Google App Engine, I defined the environment variable GOOGLE_APPLICATION_CREDENTIALS pointing to a JSON file with all the required credentials generated by the Google API.
According to the documentation for connecting to Datastore from App Engine in Java, there are several options available: you can go with Objectify (a third-party library), the Datastore API, or the Datastore Client Library.
When using the Client Libraries, keep in mind that they make use of Application Default Credentials: as documented, if the environment variable GOOGLE_APPLICATION_CREDENTIALS is not set, ADC uses the default service account that App Engine provides for applications running on that service. So in your case, I think you should not define the environment variable, so that App Engine uses its default service account.
If you are still struggling with com.google.cloud.datastore.DatastoreException: Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential., you can set the credentials explicitly, as shown in the example below:
// Load the service account key file from the classpath.
Resource credentialsCyberpower = resourceLoader.getResource("classpath:yourservice-datastore-access.json");
GoogleCredentials credentials = GoogleCredentials.fromStream(credentialsCyberpower.getInputStream())
        .createScoped(Lists.newArrayList("https://www.googleapis.com/auth/cloud-platform"));

// Build a Datastore client with the explicit credentials and project ID.
DatastoreOptions options = DatastoreOptions.newBuilder()
        .setProjectId("XXXXXX")
        .setCredentials(credentials)
        .build();
Datastore datastore = options.getService();

ObjectifyService.init(new ObjectifyFactory(datastore));
Generate yourservice-datastore-access.json under IAM & Admin > Service Accounts. This works with Objectify 6.0.5.
If you are using <url-stream-handler>urlfetch</url-stream-handler> with Java 8 and Objectify 6, you will have to switch to native and enable billing.
I was hit by this issue recently and spent a lot of time fixing the problem; more info can be found here
The question is: how can I set application secrets to make them available in application.yml?
On Heroku I did this simply by setting environment variables for the dyno and accessing them like this:
server:
  port: ${PORT}
security:
  user:
    password: ${USERPASSWORD}
eureka:
  client:
    register-with-eureka: false
    fetch-registry: false
  instance:
    hostname: localhost
    securePortEnabled: true
  password: ${EUREKAPASSWORD}
How can I achieve that in Google App Engine? I was trying with Datastore; unfortunately I don't know how to inject those values into my *.yml file.
EDIT:
One more important thing to add: I am using the Maven App Engine plugin to deploy my app via a CI pipeline, so there is no possibility for me to push an app.yaml file to App Engine.
If you want to store secrets that are available to the app at runtime, keeping them in the datastore isn't a bad idea. I know of many apps that do that.
Here's an app used by the Khan Academy that's a good example of storing secret credentials in the datastore. It's in Python, but you can get the general idea. Note that on first admin login, it prompts for secrets to store.
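If you go that route with the Java Datastore client, reading a secret at startup only takes a few lines; a rough sketch (the kind and property names are made up for illustration):

import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;

public class SecretStore {
    private final Datastore datastore = DatastoreOptions.getDefaultInstance().getService();

    /** Looks up a secret stored as an entity of kind "Secret" with a "value" property. */
    public String getSecret(String name) {
        Key key = datastore.newKeyFactory().setKind("Secret").newKey(name);
        Entity entity = datastore.get(key);
        return entity == null ? null : entity.getString("value");
    }
}

How you then feed the value into Spring's ${...} placeholders is up to you; one option is to register it as a system property before the application context resolves application.yml.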
Google also has a tutorial on how to store encrypted secrets:
https://cloud.google.com/kms/docs/store-secrets
TL;DR: use a separate bucket to store the encrypted secrets; instances download them when needed, decrypt them using Google KMS (https://cloud.google.com/kms/), and remove them afterwards.
The best and most secure way is to use GCP KMS or a third-party secrets-management product like Vault.
GCP KMS
Use a service account with the encrypt and decrypt permissions (role) to encrypt the credentials (secrets) file.
Upload the encrypted credentials file to GCS.
Fetch the encrypted credentials from GCS, then decrypt and parse them (e.g. into a plain Java object) at runtime in your application code.
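A rough sketch of the last two steps with the GCS and Cloud KMS client libraries (the bucket, object, project, key ring and key names below are placeholders):

import com.google.cloud.kms.v1.CryptoKeyName;
import com.google.cloud.kms.v1.DecryptResponse;
import com.google.cloud.kms.v1.KeyManagementServiceClient;
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.protobuf.ByteString;

public class EncryptedSecrets {
    public static String loadSecrets() throws Exception {
        // Download the encrypted file from GCS (ADC supplies the credentials).
        Storage storage = StorageOptions.getDefaultInstance().getService();
        Blob blob = storage.get("my-secrets-bucket", "credentials.json.enc");
        byte[] ciphertext = blob.getContent();

        // Decrypt it with Cloud KMS and keep the plaintext only in memory.
        try (KeyManagementServiceClient kms = KeyManagementServiceClient.create()) {
            CryptoKeyName keyName = CryptoKeyName.of(
                    "my-project", "global", "my-key-ring", "my-key");
            DecryptResponse response = kms.decrypt(keyName, ByteString.copyFrom(ciphertext));
            return response.getPlaintext().toStringUtf8();
        }
    }
}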
Datastore
Yes, we can store credentials/secret environment variables in Datastore and fetch them at runtime in application code.
Pros:
Simple
It can be used almost everywhere, GAE standard environment, GAE flexible environment, GCE, GCF, GKE, Cloud Run.
Cons:
Security is not as good as KMS.
GCE metadata
I used to use GCE metadata server to store my secret environment variables.
Pros:
It supports GAE, GCE, GKE.
Very simple. We just need to send HTTP requests to the http://metadata.google.internal/computeMetadata/v1/ endpoint to fetch our custom metadata (the secret environment variables); see the sketch after this list.
Cons:
As of last year, the GCE metadata server did not support Cloud Functions (runtime: nodejs10): I couldn't fetch my custom secret environment variables from the metadata server within a Cloud Function, though built-in metadata such as projectId could still be fetched.
Security is not as good as KMS.
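For the custom-metadata approach, a rough sketch of the HTTP call with Java 11's HttpClient (the attribute name is a placeholder; the Metadata-Flavor header is required):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MetadataSecrets {
    public static String readProjectAttribute(String name) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(
                        "http://metadata.google.internal/computeMetadata/v1/project/attributes/" + name))
                .header("Metadata-Flavor", "Google") // required, otherwise the server refuses the request
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}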
ConfigMaps and Secrets (only for GKE)
Kubernetes Secrets only apply simple base64 encoding, not real encryption. Medium difficulty to use. Security is not as good as KMS.
Another hacky way
I also created a post about this question here: How to pass system environment variables to app.yaml?
Yes, a Linux script can do everything, but I don't like these hacky approaches.