I'm working on some improvements to our Google Drive integration.
Current state:
We already save files into Google Drive folders using a hard-coded folderId, and that works with no problems.
Now I want to extend this logic, and for that I need a list of all folders.
So I followed this guide:
https://developers.google.com/drive/api/guides/search-files
The problem is that I receive only **one** folder, but in Google Drive there are 10.
Does anyone have an idea what I missed or overlooked? Why doesn't the result contain a nextPageToken? I've spent a whole day on this and it's driving me crazy.
This is my method (I'm using a service account for the connection):
@Override
public List<File> getAllFolders() throws IOException {
    Drive service = googleDriveProvider.getService();
    List<File> files = new ArrayList<>();
    String pageToken = null;
    do {
        FileList result = service.files().list()
                .setQ("mimeType='application/vnd.google-apps.folder'")
                .setSpaces("drive")
                .setSupportsAllDrives(true)
                .setIncludeItemsFromAllDrives(true)
                .setFields("nextPageToken, files(id, name, parents)")
                .setPageToken(pageToken)
                .execute();
        for (File file : result.getFiles()) {
            System.out.printf("Found file: %s (%s)\n",
                    file.getName(), file.getId());
        }
        files.addAll(result.getFiles());
        pageToken = result.getNextPageToken();
    } while (pageToken != null);
    return files;
}
And this is GoogleDriveProvider:
@Service
public class GoogleDriveProvider {

    private static final JsonFactory JSON_FACTORY = GsonFactory.getDefaultInstance();
    private static final Set<String> SCOPES = DriveScopes.all();
    private static final String GET_DRIVE_SERVICE_ERROR_MESSAGE = "Getting instance of Google drive has failed, error: [%s]";

    @Value("${google.drive.service.account.auth.json}")
    private String authJson;

    @Value("${info.app.name}")
    private String appName;

    public Drive getService() throws GoogleDriveException {
        try {
            final NetHttpTransport httpTransport = GoogleNetHttpTransport.newTrustedTransport();
            GoogleCredentials credentials = GoogleCredentials.fromStream(
                    new ByteArrayInputStream(authJson.getBytes())).createScoped(SCOPES);
            credentials.refreshIfExpired();
            HttpRequestInitializer requestInitializer = new HttpCredentialsAdapter(credentials);
            return new Drive.Builder(httpTransport, JSON_FACTORY, requestInitializer)
                    .setApplicationName(appName)
                    .build();
        } catch (IOException | GeneralSecurityException e) {
            throw new GoogleDriveException(
                    format(GET_DRIVE_SERVICE_ERROR_MESSAGE, e.getMessage()), e);
        }
    }
}
Problem solved: a service account only sees folders it created itself or folders that have been explicitly shared with it, so the missing folders must be shared with the service account first.
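For existing folders, sharing can also be done programmatically through the Drive v3 Permissions API. A minimal sketch, run under credentials that already own the folder; the folder ID and service-account email are placeholders:

```java
// Share an existing folder with the service account so it shows up in the
// service account's files().list() results.
// "folderId123" and the email address below are placeholders.
Permission permission = new Permission()
        .setType("user")
        .setRole("writer")
        .setEmailAddress("my-sa@my-project.iam.gserviceaccount.com");
service.permissions().create("folderId123", permission).execute();
```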
I have the method below that returns an AmazonS3 client for uploading documents. In my local environment I have to connect to an S3 bucket in a different region, but in the other environments the S3 bucket and the application code run in the same AWS region.
public AmazonS3 getAmazonS3Client() {
    if ("local".equals(hostEnvironment)) {
        final AssumeRoleRequest roleRequest = new AssumeRoleRequest()
                .withRoleArn("arnrole").withRoleSessionName("s3Session");
        final AssumeRoleResult assumeRoleResult = AWSSecurityTokenServiceAsyncClientBuilder.defaultClient()
                .assumeRole(roleRequest);
        final Credentials sessionCredentials = assumeRoleResult.getCredentials();
        final BasicSessionCredentials basicSessionCredentials = new BasicSessionCredentials(
                sessionCredentials.getAccessKeyId(), sessionCredentials.getSecretAccessKey(),
                sessionCredentials.getSessionToken());
        return AmazonS3Client.builder().withRegion("us-east-2")
                .withCredentials(new AWSStaticCredentialsProvider(basicSessionCredentials)).build();
    } else {
        return AmazonS3Client.builder().withRegion("us-east-2")
                .withCredentials(new InstanceProfileCredentialsProvider(true)).build();
    }
}
I am getting the exception below when running locally. What am I missing here?
Caused by: com.amazonaws.SdkClientException: Unable to find a region via the region provider chain.
Must provide an explicit region in the builder or setup environment to supply a region.
    at com.amazonaws.client.builder.AwsClientBuilder.setRegion(AwsClientBuilder.java:462)
    at com.amazonaws.client.builder.AwsClientBuilder.configureMutableProperties(AwsClientBuilder.java:424)
    at com.amazonaws.client.builder.AwsAsyncClientBuilder.build(AwsAsyncClientBuilder.java:80)
    at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceAsyncClientBuilder.defaultClient(AWSSecurityTokenServiceAsyncClientBuilder.java:45)
After I set the region on the AmazonS3Client directly, this works:

AmazonS3Client amazonS3 = new AmazonS3Client(basicSessionCredentials);
amazonS3.setRegion(RegionUtils.getRegion("us-east-2"));
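Note that the stack trace points at AWSSecurityTokenServiceAsyncClientBuilder.defaultClient(), which resolves the region from the default provider chain; that lookup is what fails locally. A sketch of an alternative that stays on the non-deprecated builder API by giving the STS client an explicit region (region string and role request as in the original code):

```java
// Build the STS client with an explicit region instead of defaultClient(),
// so the default region provider chain is never consulted.
AWSSecurityTokenService sts = AWSSecurityTokenServiceClientBuilder.standard()
        .withRegion("us-east-2")
        .build();
AssumeRoleResult assumeRoleResult = sts.assumeRole(roleRequest);
```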
In order to remove identities from a Google Cloud Storage bucket, I use the example provided in the GCP examples repo: here. I am wondering if there is something I am missing. I have the correct root credentials for the cloud account, as well as the project ownership credentials. Basically, the removal operations do not work, either from Java code or using gsutil from the GCP web console.
Here is the original policy:
Policy{
  bindings={
    roles/storage.legacyBucketOwner=[
      projectOwner:csbauditor
    ],
    roles/storage.objectAdmin=[
      serviceAccount:company-kiehn-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-kiehn-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-howe-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-satterfield-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:customer-0c1e8536-8bf5-46f4-8e@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-fahey-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-hammes-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-howe-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-sipes-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-doyle-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:customer-6a53ee71-95eb-49b2-8a@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-bergnaum-file@csbauditor.iam.gserviceaccount.com
    ],
    roles/storage.legacyBucketReader=[
      projectViewer:csbauditor
    ],
    roles/storage.objectViewer=[
      serviceAccount:company-block-log@csbauditor.iam.gserviceaccount.com
    ]
  },
  etag=CLgE,
  version=0
}
Here is the second policy version, before writing to IAM:
Policy{
  bindings={
    roles/storage.legacyBucketOwner=[
      projectOwner:csbauditor
    ],
    roles/storage.objectAdmin=[
      serviceAccount:company-kiehn-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-kiehn-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-howe-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-satterfield-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:customer-0c1e8536-8bf5-46f4-8e@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-fahey-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-hammes-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-howe-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-sipes-file@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-doyle-log@csbauditor.iam.gserviceaccount.com,
      serviceAccount:customer-6a53ee71-95eb-49b2-8a@csbauditor.iam.gserviceaccount.com,
      serviceAccount:company-bergnaum-file@csbauditor.iam.gserviceaccount.com
    ],
    roles/storage.legacyBucketReader=[
      projectViewer:csbauditor
    ],
    roles/storage.objectViewer=[
      serviceAccount:company-block-log@csbauditor.iam.gserviceaccount.com
    ]
  },
  etag=CLgE,
  version=0
}
Here is my code snippet:
Read the bucket policy and extract the unwanted identities:

Set<Identity> wrongIdentities = new HashSet<Identity>();
Role roler = null;
Policy p = Cache.GCSStorage.getIamPolicy("bucketxyz");
Map<Role, Set<Identity>> policyBindings = p.getBindings();
for (Map.Entry<Role, Set<Identity>> entry : policyBindings.entrySet()) {
    roler = entry.getKey();
    if (roler.getValue().equals("roles/storage.objectAdmin")) {
        Set<Identity> setidentities = entry.getValue();
        for (Identity set : setidentities) {
            if (set.equals("serviceAccount:attacker@csbauditor.iam.gserviceaccount.com")) {
                continue;
            } else {
                wrongIdentities.add(set);
            }
        }
    }
}
for (Identity identity : wrongIdentities) {
    removeBucketIamMember("bucektxyz", roler, identity);
}
Remove the unwanted identities from the policy:

public static Policy removeBucketIamMember(String bucketName, Role role, Identity identity) {
    Storage storage = GoogleStorage.initStorage();
    Policy policy = storage.getIamPolicy(bucketName);
    System.out.println("policy " + policy);
    Policy updatedPolicy = policy.toBuilder()
            .removeIdentity(role, Identity.serviceAccount(identity.getValue()))
            .build();
    System.out.println("updatedPolicy " + updatedPolicy);
    storage.setIamPolicy(bucketName, updatedPolicy);
    if (updatedPolicy.getBindings().get(role) == null
            || !updatedPolicy.getBindings().get(role).contains(identity)) {
        System.out.printf("Removed %s with role %s from %s\n", identity, role, bucketName);
    }
    return updatedPolicy;
}
Update 01
I also tried using gsutil from within the web console; it still does not work.

myaccount@cloudshell:~ (csbauditor)$ gsutil iam ch -d user:company-sipes-file@csbauditor.iam.gserviceaccount.com gs://company-block-log-fce65e82-a0cd-4f71-8693-381100d93c18
No changes made to gs://company-block-log-fce65e82-a0cd-4f71-8693-381100d93c18/
Update 02
As advised by @JohnHanley, gsutil worked after I replaced user with serviceAccount. However, the Java code is not working yet.
I have found the issue in your code. I cannot be completely sure it was the only issue, since your code as posted did not compile and I had to change several classes as well.
Once I was able to compile and run it, I noticed that even when the remove function was executed, nothing actually happened. After adding a few prints, I saw that it was trying to remove the service accounts using the wrong role: you were reassigning the role variable inside the for loop, so whenever an identity wasn't the attacker service account, the next iteration overwrote the role value before the removal call.
Here's the code of my class (a modification of the example snippet):
package com.google.cloud.examples.storage.snippets;

import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.cloud.storage.StorageRoles;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/** This class contains Bucket-level IAM snippets for the {@link Storage} interface. */
public class BucketIamSnippets {

    /** Example of listing the Bucket-level IAM roles and members. */
    public Policy listBucketIamMembers(String bucketName) {
        // [START view_bucket_iam_members]
        // Initialize a Cloud Storage client
        Storage storage = StorageOptions.getDefaultInstance().getService();
        // Get the IAM policy for the bucket
        Policy policy = storage.getIamPolicy(bucketName);
        // Print each role and its identities
        Map<Role, Set<Identity>> policyBindings = policy.getBindings();
        for (Map.Entry<Role, Set<Identity>> entry : policyBindings.entrySet()) {
            System.out.printf("Role: %s Identities: %s\n", entry.getKey(), entry.getValue());
        }
        // [END view_bucket_iam_members]
        return policy;
    }
    /** Example of adding a member to the Bucket-level IAM. */
    public Policy addBucketIamMember(String bucketName, Role role, Identity identity) {
        // [START add_bucket_iam_member]
        // Initialize a Cloud Storage client
        Storage storage = StorageOptions.getDefaultInstance().getService();
        // Get the IAM policy for the bucket
        Policy policy = storage.getIamPolicy(bucketName);
        // Add the identity to the Bucket-level IAM role
        Policy updatedPolicy =
                storage.setIamPolicy(bucketName, policy.toBuilder().addIdentity(role, identity).build());
        if (updatedPolicy.getBindings().get(role).contains(identity)) {
            System.out.printf("Added %s with role %s to %s\n", identity, role, bucketName);
        }
        // [END add_bucket_iam_member]
        return updatedPolicy;
    }
    public static void removeUserFromBucketUsingEmail(String bucketName, Role role, String email) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        Policy policy = storage.getIamPolicy(bucketName);
        Identity identity = Identity.serviceAccount(email);
        String eTag = policy.getEtag();
        System.out.println("etag: " + eTag);
        Policy updatedPolicy =
                storage.setIamPolicy(bucketName, policy.toBuilder().removeIdentity(role, identity).build());
        if (updatedPolicy.getBindings().get(role) == null
                || !updatedPolicy.getBindings().get(role).contains(identity)) {
            System.out.printf("Removed %s with role %s from %s\n", identity, role, bucketName);
        }
    }
    public static void main(String... args) throws Exception {
        try {
            String bucketName = "my-bucket-name";
            BucketIamSnippets obj = new BucketIamSnippets();
            Role role_admin = StorageRoles.objectAdmin();
            String acc_1 = "test1@my.iam.gserviceaccount.com";
            String acc_2 = "test2@my.iam.gserviceaccount.com";
            Identity identity_1 = Identity.serviceAccount(acc_1);
            Identity identity_2 = Identity.serviceAccount(acc_2);
            System.out.println(obj.addBucketIamMember(bucketName, role_admin, identity_1));
            System.out.println(obj.addBucketIamMember(bucketName, role_admin, identity_2));
            Storage storage = StorageOptions.getDefaultInstance().getService();
            Policy policy = storage.getIamPolicy(bucketName);
            System.out.println(policy);
            // Collect the identities to be removed
            Set<Identity> wrongIdentities = new HashSet<>();
            Map<Role, Set<Identity>> policyBindings = policy.getBindings();
            for (Map.Entry<Role, Set<Identity>> entry : policyBindings.entrySet()) {
                Role aux = entry.getKey();
                System.out.println("role: " + aux.getValue());
                if (aux.getValue().equals("roles/storage.objectAdmin")) {
                    Set<Identity> setidentities = entry.getValue();
                    System.out.println("setidentities: " + setidentities);
                    for (Identity set : setidentities) {
                        // Compare Identity to Identity; comparing an Identity against a
                        // raw String is always false.
                        if (set.equals(identity_2)) {
                            System.out.println("keeping: " + set);
                        } else {
                            wrongIdentities.add(set);
                            System.out.println("to remove: " + set);
                        }
                    }
                }
            }
            System.out.println("wrongIdentities: " + wrongIdentities);
            System.out.println("policyEtag: " + policy.getEtag());
            for (Identity identity : wrongIdentities) {
                BucketIamSnippets.removeUserFromBucketUsingEmail(bucketName, role_admin, identity.getValue());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Notes:
I added two test service accounts and then ran your code (with slight modifications).
I initialized the role as objectAdmin directly, and that is what I pass to the removal function.
Modify the code to fit your actual use case.
I compiled this with the same dependencies used in the example.
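The core fix can be illustrated without the GCS dependency. A simplified sketch, with plain Strings standing in for the Role and Identity types: the removals are collected per role inside the loop, so each identity is later removed under the role it was actually found in, rather than whatever a shared mutable role variable last held.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** Simplified stand-in for the IAM cleanup: Strings replace Role/Identity. */
public class IamCleanupSketch {

    /**
     * Walks the bindings and records, per role, the identities to remove,
     * skipping the one identity we want to keep. Pairing each identity with
     * its own role avoids the bug of reusing one mutable role variable.
     */
    static Map<String, Set<String>> findRemovals(Map<String, Set<String>> bindings, String keep) {
        Map<String, Set<String>> removals = new HashMap<>();
        for (Map.Entry<String, Set<String>> entry : bindings.entrySet()) {
            // Only clean up the objectAdmin binding, as in the original code
            if (!entry.getKey().equals("roles/storage.objectAdmin")) {
                continue;
            }
            for (String identity : entry.getValue()) {
                if (identity.equals(keep)) {
                    continue; // the account we intend to keep
                }
                removals.computeIfAbsent(entry.getKey(), k -> new HashSet<>()).add(identity);
            }
        }
        return removals;
    }
}
```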
I'm trying to create a new AWS EC2 instance using the AWS Java SDK, but I'm getting "Value () for parameter groupId is invalid. The value cannot be empty". Here is my code:
AWSCredentials credentials = null;
try {
    credentials = new ProfileCredentialsProvider().getCredentials();
} catch (Exception e) {
    throw new AmazonClientException(
            "Cannot load the credentials from the credential profiles file. " +
            "Please make sure that your credentials file is at the correct " +
            "location (~/.aws/credentials), and is in valid format.",
            e);
}
ec2 = AmazonEC2ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .withRegion(Regions.US_WEST_2)
        .build();
RunInstancesRequest runInstancesRequest = new RunInstancesRequest();
String ami_id = "ami-efd0428f"; // ubuntu/images/hvm-ssd/ubuntu-xenial-16.04-amd64-server-20170414
Collection<String> securityGroups = new ArrayList<>();
securityGroups.add("launch-wizard-1");
securityGroups.add("sg-9405c2f3");
runInstancesRequest.withImageId(ami_id)
        .withInstanceType("t2.medium")
        .withMinCount(1)
        .withMaxCount(1)
        .withKeyName("MyKeyName")
        .withSecurityGroups(securityGroups);
RunInstancesResult run_response = ec2.runInstances(runInstancesRequest); // fails here!
String instance_id = run_response.getReservation().getReservationId();
Tag tag = new Tag()
        .withKey("Name")
        .withValue(tfCompanyName.getText());
Collection<Tag> tags = new ArrayList<>();
tags.add(tag);
CreateTagsRequest tag_request = new CreateTagsRequest();
tag_request.setTags(tags);
CreateTagsResult tag_response = ec2.createTags(tag_request);
String s = String.format("Successfully started EC2 instance %s based on AMI %s", instance_id, ami_id);
System.out.println(s);
Any suggestions?
You might also need to add VPC details; PrivateIpAddress and Monitoring are among the other fields that may be required.
I would recommend trying to create an EC2 instance manually in the AWS Console first, to see exactly which parameters it asks for.
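One likely cause of the "groupId is invalid" message (an assumption, not confirmed by the question): withSecurityGroups() expects security-group *names*, while values like "sg-9405c2f3" are security-group *IDs* and belong in withSecurityGroupIds(). A small stand-alone helper sketching how a mixed collection could be split before building the request:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Splits security-group identifiers so IDs and names go to the right setter. */
public class SecurityGroupSplitter {

    // Security-group IDs look like "sg-..."; everything else is treated as a name.
    public static boolean isGroupId(String value) {
        return value.startsWith("sg-");
    }

    /**
     * Returns a map with keys "ids" and "names": IDs would go to
     * withSecurityGroupIds(...), names to withSecurityGroups(...).
     */
    public static Map<String, List<String>> split(Collection<String> groups) {
        Map<String, List<String>> out = new HashMap<>();
        out.put("ids", new ArrayList<>());
        out.put("names", new ArrayList<>());
        for (String g : groups) {
            out.get(isGroupId(g) ? "ids" : "names").add(g);
        }
        return out;
    }
}
```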
Hello, I've downloaded and installed the latest version of the Drive REST API for Java and want to get the metadata of a public file from Google Drive by its file ID. I have the following code:
private static final String APPLICATION_NAME = "test";
private static final String FILE_ID = "theFileId";

public static void main(String[] args) {
    HttpTransport httpTransport = new NetHttpTransport();
    JacksonFactory jsonFactory = new JacksonFactory();
    Drive service = new Drive.Builder(httpTransport, jsonFactory, null)
            .setApplicationName(APPLICATION_NAME).build();
    printFile(service, FILE_ID);
}

private static void printFile(Drive service, String fileId) {
    try {
        File file = service.files().get(fileId).execute();
        System.out.println("Title: " + file.getTitle());
    } catch (IOException e) {
        System.out.println("An error occurred: " + e);
    }
}
But I get the error message: "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup."
I've tried it on https://developers.google.com/drive/v2/reference/files/get, which worked out just fine.
Do I have to authenticate with an API key even when the file is public, and how would I do that if so?
Thanks for your time.
Yes, you need to authenticate with an API key. See the documentation here: https://developers.google.com/drive/web/about-auth
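For a public file, an API key (created in the Developers Console) is enough; no OAuth flow is needed. A minimal sketch, assuming a hypothetical API_KEY value and reusing the transport, JSON factory, and constants from your code; the generated Drive request classes accept the standard key query parameter via setKey:

```java
// Hypothetical API key from the Developers Console.
String API_KEY = "your-api-key";
Drive service = new Drive.Builder(httpTransport, jsonFactory, null)
        .setApplicationName(APPLICATION_NAME)
        .build();
// Attach the API key to this request so it is no longer "unauthenticated".
File file = service.files().get(FILE_ID).setKey(API_KEY).execute();
System.out.println("Title: " + file.getTitle());
```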