I want to automate the creation of a new email from my Java application. For this I generate a VBS script which collects all the information (email content, subject, attachments, etc.) and opens a new email window in Outlook with all the fields correctly filled. (I don't want to send the email automatically, just create a new mail in the Outlook client.) Below you can see how I run the script from the Java app:
Runtime.getRuntime().exec("wscript " + nameOfScript);
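As a side note, Runtime.exec with a single concatenated string splits the command on whitespace, which breaks when the script path contains spaces. A small sketch (the script path here is a placeholder) that builds the command as separate arguments for ProcessBuilder:

```java
import java.util.Arrays;
import java.util.List;

public class ScriptLauncher {

    // Build the wscript command as separate arguments so a script path
    // containing spaces survives without manual quoting.
    static List<String> buildCommand(String scriptPath) {
        return Arrays.asList("wscript", scriptPath);
    }

    public static void main(String[] args) {
        List<String> command = buildCommand("C:\\temp\\newMail.vbs"); // placeholder path
        System.out.println(command);
        // On a Windows machine with the script in place you would launch it with:
        // new ProcessBuilder(command).inheritIO().start().waitFor();
    }
}
```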
My Java app always runs with admin privileges.
When the user has the Outlook application open with admin privileges, everything works fine.
When the user has the Outlook application open without admin privileges, the script no longer works. It fails on the following line:
Set Outlook = GetObject(, "Outlook.Application")
Error code is 429 -> https://support.microsoft.com/en-ca/help/828550/you-receive-run-time-error-429-when-you-automate-office-applications
It seems that because I am running the script with admin privileges, the GetObject call will always fail if Outlook is running without admin privileges.
Is there a workaround for this issue?
Thanks in advance for your help!
You can use the Windows runas command, and your command will become like this (note that the domain and user are separated by a backslash, not a slash):
String command = "runas /user:" + domain + "\\" + user + " \"wscript C:\\Path\\to\\your\\script.vbs\"";
The bad part is that runas requires the user's password, and the worst part is that you cannot provide it programmatically from Java. It must be typed at the keyboard.
This will NOT work:
// is = BufferedReader on the process's stdout, os = Writer on its stdin
while ((line = is.readLine()) != null)
{
    System.out.println(line);
    if (line.toLowerCase().matches(".*enter.*password.*"))
    {
        System.out.println("Writing password for the user");
        os.write(password);
        os.write(System.getProperty("line.separator"));
        os.close();
    }
}
What you could do is:
write a .bat file with the runas command above, execute it, and let the user type his password into the cmd window.
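A minimal sketch of that approach, assuming placeholder values for the domain, user, and script path. The Java side only writes the .bat file and (on Windows) would open a console window; the password itself is typed by the user and never touches the Java process:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RunAsHelper {

    // Compose the runas command line; the password is typed by the user in
    // the console window and is never handled by the Java process.
    static String batchContent(String domain, String user, String scriptPath) {
        return "runas /user:" + domain + "\\" + user
                + " \"wscript " + scriptPath + "\"\r\n";
    }

    // Write the one-line batch file that the user will run.
    static Path writeBatchFile(Path target, String domain, String user, String scriptPath)
            throws IOException {
        return Files.write(target, batchContent(domain, user, scriptPath).getBytes());
    }

    public static void main(String[] args) throws IOException {
        // Placeholder domain, user, and script path:
        Path bat = writeBatchFile(Files.createTempFile("runScript", ".bat"),
                "MYDOMAIN", "someUser", "C:\\Path\\to\\your\\script.vbs");
        System.out.println("Wrote " + bat);
        // On Windows, open a visible console so the user can type the password:
        // new ProcessBuilder("cmd", "/c", "start", bat.toString()).start();
    }
}
```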
Based on your comment and the error message, it is difficult to pinpoint the exact cause of the problem. I would like to offer some suggestions, as follows:
The object is available on the machine, but it is a licensed Automation object, and can't verify the availability of the license necessary to instantiate it.
Some objects can be instantiated only after the component finds a license key, which verifies that the object is registered for instantiation on the current machine. When a reference is made to an object through a properly installed type library or object library, the correct key is supplied automatically.
If the attempt to instantiate is the result of a CreateObject or GetObject call, the object must find the key. In this case, it may search the system registry or look for a special file that it creates when it is installed, for example, one with the extension .lic. If an end user has improperly set up the object's application, inadvertently deleted a necessary file, or changed the system registry, the object may not be able to find its key. If the key can't be found, the object can't be instantiated. In this case, the instantiation may work on the developer's system, but not on the user's system. It may be necessary for the user to reinstall the licensed object.
You are trying to use the GetObject function to retrieve a reference to a class created with Visual Basic.
GetObject can't be used to obtain a reference to a class created with Visual Basic.
Access to the object has explicitly been denied. For example, you may be trying to access a data object that's currently being used and is locked to prevent deadlock situations. If that's the case, you may be able to access the object at another time.
For more information, please refer to these links:
Run-time error '429': ActiveX component can't create object
Run-Time error 429 when using GetObject (,"Outlook.Application")
I am working on creating some scheduled jobs using the Java SDK for google cloud scheduler. Here is the link for the application code which is already posted as a part of another question. The application basically creates a Cloud Scheduler job, which every time it runs, triggers a custom training job on VertexAI. Now the call from the scheduler to VertexAI to create the custom job is authenticated using the service account. My question is about the authentication of the application code that creates the Cloud Scheduler job itself. I have set this application as a maven project and I create a single executable jar. The application itself runs on my local workstation. The following are my points/questions:
When I create a docker image and copy this jar, and the service account key into the image, and then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to the key within the container, then the application runs fine and the Cloud Scheduler job gets created.
When I do the same as above, except I simply execute the jar in powershell (with GOOGLE_APPLICATION_CREDENTIALS environment variable pointing to the service account key), the permission is denied.
Same as 2, except I simply run the application using the Eclipse "Run App" button.
How can I authenticate to run the application without having to run it in a Docker container? And is there a way to authenticate without using the GOOGLE_APPLICATION_CREDENTIALS environment variable, i.e., directly in the application code itself? Links to sample code/examples would be helpful.
EDIT:
For point 2, the problem was a typo in the name of the environment variable. For point 3, you can set environment variables directly in Eclipse, as mentioned in the answer by @RJC.
I don't have Eclipse on my machine, but I've found a related answer where you can add a specific environment variable within the IDE itself. I suggest that you try to do the following and see if it fixes the problem.
There is another way to authenticate without using GOOGLE_APPLICATION_CREDENTIALS, and that's through explicitly pointing to your service account file in your code. I've created a sample code that retrieves a Job Name without using the GOOGLE_APPLICATION_CREDENTIALS. Authentication is done by specifying a credential setting when initializing the CloudSchedulerClient.
Here's what I've done on my end:
Run gcloud iam service-accounts keys create serviceaccount.json --iam-account=NAME@PROJECT_ID.iam.gserviceaccount.com, which will generate a JSON key file for the service account to be used in the CredentialsProvider.
Create a CredentialsProvider object that will call the created JSON file of the service account.
import java.io.FileInputStream;
import java.io.IOException;
import com.google.api.gax.core.CredentialsProvider;
import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.ServiceAccountCredentials;
import com.google.cloud.scheduler.v1.CloudSchedulerClient;
import com.google.cloud.scheduler.v1.CloudSchedulerSettings;
import com.google.cloud.scheduler.v1.JobName;

try {
    JobName name = JobName.of("[PROJECT]", "[LOCATION]", "[JOB]");
    // Build credentials explicitly from the service account key file
    CredentialsProvider credentialsProvider = FixedCredentialsProvider.create(
        ServiceAccountCredentials.fromStream(new FileInputStream("/path/to/serviceaccount.json")));
    CloudSchedulerSettings cloudSchedulerSettings = CloudSchedulerSettings.newBuilder()
        .setCredentialsProvider(credentialsProvider).build();
    CloudSchedulerClient cloudSchedulerClient = CloudSchedulerClient.create(cloudSchedulerSettings);
    System.out.println(cloudSchedulerClient.getJob(name).toString()); // display the output
    cloudSchedulerClient.close();
} catch (IOException e) {
    e.printStackTrace();
}
For additional guidance, here's an API reference on customizing credentials.
Note that a service account key file can be read by an unauthorized person if mishandled. My suggestion is to grant your service accounts only the permissions required to perform your tasks. You can also follow these best practices for managing your credentials going forward.
We have a requirement in which we show a page to the end user. On click of the submit button on the page, an OSGi service is called, which in turn calls an API.
The issue we are facing is that we are not allowed to store the API password anywhere; it can only be entered by the permitted person. The challenge is:
We do not have any user interface to enable such a thing.
What we could think of is having the password entered in the console during AEM startup, but we are not aware of how this is possible. Also, every time any of the multiple instances is restarted, we would need to call the permitted person to enter the password.
Can anybody provide his/her input on achieving this?
I would strongly suggest questioning the requirement:
AEM works best with statically rendered pages delivered from the dispatcher;
personalised content therefore requires extra effort.
But it can be achieved nonetheless:
you could deliver a static page and render the personalised information about the user via JavaScript and retrieve it via servlet call.
In case you need to store the password, you could create an OSGi bundle which is deployed and started when an instance is started. Usually you can achieve this by packaging it via content package and put it into an install folder: https://helpx.adobe.com/experience-manager/kb/HowToInstallPackagesUsingRepositoryInstall.html
The dispatcher must be configured to not cache this page though.
If I understand correctly, you are not permitted to store the API password in the AEM console. Can you store it on the server, away from the AEM instance? If yes, then you can create a file, such as a properties file, store it on the server, and read that file from your service before making the API call.
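As a rough sketch of that idea (the file path and property key are placeholders, not anything AEM-specific), the service could load the password with java.util.Properties:

```java
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;

public class ApiPasswordReader {

    // Load the API password from a properties file kept outside the AEM instance.
    static String readPassword(Path propertiesFile) throws IOException {
        Properties props = new Properties();
        try (Reader reader = Files.newBufferedReader(propertiesFile)) {
            props.load(reader);
        }
        return props.getProperty("api.password");
    }

    public static void main(String[] args) throws IOException {
        // Placeholder path; the real service would read a configured location.
        Path file = Paths.get("/etc/myapp/api.properties");
        if (Files.exists(file)) {
            System.out.println("Password present: " + (readPassword(file) != null));
        } else {
            System.out.println("No properties file at " + file);
        }
    }
}
```

The file should of course be readable only by the account the AEM instance runs under.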
I have created a dummy script to avoid scope issues when trying to run it from Java code. Then I check advanced Google services under Resources and go to the API console to try to set up the credentials.
I can see 'Apps Script' credentials have already been created for me. When I try to set the redirect URI to point to an endpoint I am exposing and then save, I get an error message saying I have no rights to perform the action.
Is there any code example for running Google Apps Script from Java? I have tried the tutorial, but it does not focus on authorization. So far I have been able to create the script from Java code with the token I get from the credentials I created, but not to run the scripts.
We want to download files from Google Storage in our application server. It is important to have read-only restricted access to a single bucket and nothing else.
At first I used a regular user account (not a service account) which has permission to access all buckets in our Google Cloud project, and everything worked fine: my Java code opened buckets and downloaded files without problems.
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Then I wanted to switch to use a specially created service account which has access to a single bucket only. So I created a service account, gave permissions to read a single bucket, and downloaded its key file. The permissions in Google Cloud Console are named as:
Storage Object Viewer (3 members) Read access to GCS objects.
gsutil command line utility works fine with this account - from the command line it allows accessing this bucket but not the others.
The initialization from the command line is done using the following command:
gcloud --project myprojectname auth activate-service-account files-viewer2#myprojectname.iam.gserviceaccount.com --key-file=/.../keyfilename.json
I even tried two different service accounts which have access to different buckets, and from the command line I can switch between them and gsutil gives access to a relevant bucket only, and for any other it returns the error:
"AccessDeniedException: 403 Caller does not have storage.objects.list access to bucket xxxxxxxxxx."
So, from the command line everything worked fine.
But in Java there is some problem with the authentication.
The default authentication I previously used with a regular user account stopped working - it reports the error:
com.google.cloud.storage.StorageException: Anonymous users does not have storage.buckets.get access to bucket xxxxxxxxxx.
Then I tried the following code (this is the simplest variant because it relies on the JSON key file, but I've already tried a number of other variants found on various forums, with no success):
FileInputStream fis = new FileInputStream( "/path/to/the/key-file.json" );
ServiceAccountCredentials credentials = ServiceAccountCredentials.fromStream( fis );
Storage storage = StorageOptions.newBuilder().setCredentials( credentials )
.setProjectId( "myprojectid" ).build().getService();
Bucket b = storage.get( "mybucketname" );
And all I receive is this error:
com.google.cloud.storage.StorageException: Caller does not have storage.buckets.get access to bucket mybucketname.
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
The same error is returned no matter which bucket I try to access (even a non-existing one).
What confuses me is that the same service account, initialized with the same JSON key file, works fine from the command line.
So I think something is missing in Java code that ensures correct authentication.
TL;DR - If you're using Application Default Credentials (which, BTW, you are when you do StorageOptions.getDefaultInstance().getService()), and you need to use the credentials from a service account, you can do so without changing your code. All you need to do is set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the full path of your service account JSON file and you are all set.
Longer version of the solution using Application Default Credentials
Use your original code as-is
Storage storage = StorageOptions.getDefaultInstance().getService();
Bucket b = storage.get( "mybucketname" );
Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of your json file containing the service account credentials.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service_account_credentials.json
Run your java application once again to verify that it is working as expected.
Alternate solution using hard-coded Service Account Credentials
The code example you posted for initializing ServiceAccountCredentials looks valid to me on a quick glance. I tried the following code snippet and it is working for me as expected.
String SERVICE_ACCOUNT_JSON_PATH = "/path/to/service_account_credentials.json";
Storage storage =
StorageOptions.newBuilder()
.setCredentials(
ServiceAccountCredentials.fromStream(
new FileInputStream(SERVICE_ACCOUNT_JSON_PATH)))
.build()
.getService();
Bucket b = storage.get("mybucketname");
When specifying service account credentials, the project ID is automatically picked up from the information present in the JSON file, so you do not have to specify it again. I'm not entirely sure, though, whether this is related to the issue you're observing.
Application Default Credentials
Here is the full documentation regarding Application Default Credentials explaining which credentials are picked up based on your environment.
How the Application Default Credentials work
You can get Application Default Credentials by making a single client
library call. The credentials returned are determined by the
environment the code is running in. Conditions are checked in the
following order:
1. The environment variable GOOGLE_APPLICATION_CREDENTIALS is checked. If this variable is specified, it should point to a file that defines the credentials. The simplest way to get a credential for this purpose is to create a Service account key in the Google API Console:
a. Go to the API Console Credentials page.
b. From the project drop-down, select your project.
c. On the Credentials page, select the Create credentials drop-down, then select Service account key.
d. From the Service account drop-down, select an existing service account or create a new one.
e. For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
f. Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
g. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file downloaded.
2. If you have installed the Google Cloud SDK on your machine and have run the command gcloud auth application-default login, your identity can be used as a proxy to test code calling APIs from that machine.
3. If you are running in Google App Engine production, the built-in service account associated with the application will be used.
4. If you are running in Google Compute Engine production, the built-in service account associated with the virtual machine instance will be used.
5. If none of these conditions is true, an error will occur.
IAM roles
I would recommend going over the IAM permissions and the IAM roles available for Cloud Storage. These provide control at project and bucket level. In addition, you can use ACLs to control permissions at the object level within the bucket.
If your use case involves just invoking storage.get(bucketName), that operation requires only the storage.buckets.get permission, and the narrowest predefined role that includes it is roles/storage.legacyBucketReader.
If you also want to grant the service account permissions to get (storage.objects.get) and list (storage.objects.list) individual objects, then also add the role roles/storage.objectViewer to the service account.
Thanks to @Taxdude's long explanation, I understood that my Java code should be fine, and started looking at other possible reasons for the problem.
One of the additional things I checked was the set of permissions given to the service account, and that is where I found the solution; it was unexpected, actually.
When a service account is created, it must not be given project-level permission to read from Google Storage, because then it will have read permission to ALL buckets, and it is impossible to change that per bucket (not sure why); the system marks these permissions as "inherited".
Therefore, you have to:
Create a "blank" service account with no permissions, and
Configure permissions from the bucket configuration
To do so:
Open Google Cloud Web console
Open Storage Browser
Select your bucket
Open the INFO PANEL with Permissions
Add the service account with the Storage Object Viewer role; there are also roles named Storage Legacy Object Reader and Storage Legacy Bucket Reader.
Because of the word "Legacy" I thought those should not be used; they look like something kept for backward compatibility. But after experimenting and adding these "legacy" roles, all of a sudden the same code I had been trying all along started working properly.
I'm still not entirely sure what the minimal set of permissions to assign to a service account is, but at least now it works with all three "read" roles on the bucket: two "legacy" and one "normal".
On a Domino setup configured with an ID Vault, when we register a user using the Admin console, his/her ID file gets uploaded to the ID Vault. In addition to this, the admin can also choose to create the ID file at some other specified location.
That is, Admin will be able to perform registration in two ways:
Option A: Admin will only chose to create file in ID Vault.
Option B: Admin will specify path where ID file will be created in addition to ID Vault.
I am using the Lotus Notes Java client API to perform registration against the same setup. The issue is that I am not able to perform registration using Option A (see above).
The method we call for user registration has a mandatory parameter for the ID file path. When executed, this method creates an ID file at the specified path in addition to the one uploaded to the ID Vault.
I have a requirement to perform registration with Option A.
Things I have tried:
Sending a null/blank value for this parameter causes a run-time exception.
Giving only the file name creates the ID file in the Lotus installation directory.
Setup details:
Lotus Domino 9
ID Vault configured
Client API details:
Lotus notes Java client API (NCSO.jar)
I think it's not possible to perform registration without creating a file through this API.
I need some expert opinion here. Can someone please point me in the right direction?
Unfortunately the NotesRegistration class has not kept pace with the development of the admin client. As of today there is no method to register a user without having an ID saved locally, as "filepath" is a mandatory parameter of the registerNewUser method.
The only property that comes close is the IsNoIDFile property, but it does not help here: with that property set to False, in my test no ID was generated at all (the ID Vault was empty)...
Knowing that, you can only work around the behaviour by deleting the created ID immediately after the method has finished.
I tried to find a PMR/SPR for this issue, as it has probably been reported to IBM before, but I could not find one. If you need this feature, you will need to open a PMR with IBM and ask them for a fix for the class.
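To illustrate the workaround of deleting the created ID right after registration, here is a sketch. The registerNewUser call itself is only outlined in comments, since it needs a live Domino session and the exact NCSO parameter list; the runnable part is just the cleanup step that removes the locally created ID file:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class IdFileCleanup {

    // After registration succeeds, remove the locally created ID file so that
    // only the copy stored in the ID Vault remains.
    static boolean deleteIdFile(Path idFile) throws IOException {
        return Files.deleteIfExists(idFile);
    }

    public static void main(String[] args) throws IOException {
        Path idFile = Paths.get("C:\\temp\\newuser.id"); // placeholder path
        // Sketch of the registration call (lotus.domino NCSO API, not runnable here;
        // parameter names are illustrative):
        // Registration reg = session.createRegistration();
        // reg.registerNewUser(lastName, idFile.toString(), server, firstName,
        //         middleInitial, certPassword, location, comment, mailDbPath,
        //         forwardDomain, userPassword);
        System.out.println("Local ID removed: " + deleteIdFile(idFile));
    }
}
```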