Authenticate Subversion remotely - Java

I'm developing a Java application that manages Jenkins jobs remotely.
How can I supply the SVN username and password when creating a new job?

I am not aware of any method of doing so; however, you can have the master cache the credentials after it has authenticated successfully once.
The setting lives under Manage Jenkins > Configure System > Subversion: check the box labelled "Update default Subversion credentials cache after successful authentication".
If you have multiple repositories this won't help, but if your jobs all use the same repo then you should be able to create a new job via the API without explicitly specifying the SVN credentials.
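If it helps, here is a minimal sketch (not from the original answer) of creating a job from Java by POSTing a config.xml to the Jenkins createItem endpoint. The URL, user/API token, job name, and config.xml path are placeholders, and a secured instance may additionally require a CSRF crumb:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class CreateJenkinsJob {
    public static void main(String[] args) throws Exception {
        // Placeholder values: your Jenkins URL, a user/API token, and the job name.
        String jenkinsUrl = "http://jenkins.example.com";
        String jobName = "my-new-job";
        String auth = Base64.getEncoder()
                .encodeToString("user:apiToken".getBytes(StandardCharsets.UTF_8));
        // The job definition; its <scm> section points at the repository.
        byte[] configXml = Files.readAllBytes(Paths.get("config.xml"));

        HttpURLConnection conn = (HttpURLConnection) new URL(
                jenkinsUrl + "/createItem?name=" + jobName).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Content-Type", "application/xml");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(configXml);
        }
        System.out.println("HTTP status: " + conn.getResponseCode()); // 200 on success
    }
}

With the credentials cache enabled as described above, the <scm> section of the posted config.xml only needs the repository URL, not the SVN username or password.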

Related

GCP Dataflow: There was a problem refreshing your credentials

I'm trying to create a Dataflow job in Java with Gradle. I've created a Gradle task that uploads the job to GCP via the DataflowRunner and sets my credentials via environment "GOOGLE_APPLICATION_CREDENTIALS", "....json", but when I view the job, all I see is:
Workflow failed. Causes: There was a problem refreshing your credentials. Please check:
1. Dataflow API is enabled for your project.
2. There is a robot service account for your project:
service-[project number]@dataflow-service-producer-prod.iam.gserviceaccount.com
should have access to your project.
If this account does not appear in the permissions tab for your project, contact Dataflow support.
I've already made sure the Dataflow API is enabled for my project and that the service account exists and has the Editor and Cloud Dataflow Service Agent roles. I've tested with both my code and the sample code from the getting started page; same issue.
This issue can be triggered if the Compute Engine default service account in the Google Cloud project is disabled: check it on the IAM & Admin > Service Accounts page; its status should show a green tick (enabled).
When you first use Dataflow in a new GCP project, you need to enable the API (Step 3 in the Quickstart), which can take a few minutes. During this step a couple of service accounts are created:
Cloud Dataflow service account: service-<project-number>@dataflow-service-producer-prod.iam.gserviceaccount.com
Controller service account: <project-number>-compute@developer.gserviceaccount.com
The first account is the one in the error message. Normally I would suspect that the Dataflow API had not been enabled, or that the job was executed while the API was still being enabled. Since you mentioned you have already verified this, the next step is to check whether the default service accounts were modified, for example recreated or given permissions different from the defaults. The same checks apply to the Controller service account.
In addition, don't forget to assign the proper permissions to the account that executes the Dataflow job.
UPDATE:
If the issue persists, it is probable that the service accounts were somehow corrupted, in which case it is recommended to create a new GCP project with fresh service accounts, or to contact Cloud Support.
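One way to take the environment variable out of the equation, if you want to rule out how the Gradle task passes it, is to load the service account key explicitly and hand it to the pipeline options. A sketch, assuming the Beam/Dataflow Java SDK with its GCP extensions and the google-auth-library on the classpath; the key path is a placeholder:

import java.io.FileInputStream;
import java.util.Collections;

import com.google.auth.oauth2.GoogleCredentials;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class ExplicitCredentialsPipeline {
    public static void main(String[] args) throws Exception {
        GcpOptions options = PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(GcpOptions.class);

        // Load the service account key explicitly instead of relying on the
        // GOOGLE_APPLICATION_CREDENTIALS environment variable set by the Gradle task.
        GoogleCredentials credentials = GoogleCredentials
                .fromStream(new FileInputStream("/path/to/key.json")) // placeholder path
                .createScoped(Collections.singletonList("https://www.googleapis.com/auth/cloud-platform"));
        options.setGcpCredential(credentials);

        Pipeline pipeline = Pipeline.create(options);
        // ... build the pipeline as usual, then run it on Dataflow ...
        pipeline.run();
    }
}

If the job still fails with the same message when the key is loaded explicitly, the problem is almost certainly with the service accounts or API enablement described above rather than with how the credentials reach the runner.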

How to connect to Azure Key Vault from a Spring Boot application for local development using MSI

I am trying to connect to Azure Key Vault from my locally running Spring Boot application. I can't keep the secrets in separate properties or YAML files during development, because my application generates and deletes many secrets and tokens in Key Vault at runtime.
I am aware of the process in which you can create an Azure service principal from your application registration and then use
azure.keyvault.client-id
azure.keyvault.client-key
in application.properties to connect.
But creating an Azure service principal may not be allowed in our case. So is there any way to connect to Key Vault using MSI from a locally running Spring Boot application, using MSI_ENDPOINT and MSI_SECRET?
"So is there any way to connect to Key Vault using MSI from a locally running Spring Boot application, using MSI_ENDPOINT and MSI_SECRET?"
I don't think you can use MSI_ENDPOINT and MSI_SECRET to get the token locally; they only work once the web app is published in the cloud.
"But creating an Azure service principal may not be allowed in our case."
As you know, you can use a service principal's client id and secret (key) to access Key Vault. In fact, when you enable MSI on the web app, a service principal is created in your Azure AD tenant automatically, so you can just use its client id and a secret created for it.
Navigate to the portal -> Azure Active Directory -> Enterprise applications -> search for your web app's name (set the Application Type filter to All Applications); there you get the client id (application id).
Note: remember to compare the object id of the service principal with the one shown under your web app -> Identity, to make sure you use the correct one.
For the service principal secret, you can create one via PowerShell as below (your account needs the Application administrator or Global administrator role in your AAD tenant).
New-AzureADServicePrincipalPasswordCredential -ObjectId <service principal object id>
Then you will be able to access Key Vault with that client id and secret. For the details in Java, you can refer to this link.
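For illustration, a minimal sketch using the azure-identity and azure-security-keyvault-secrets libraries (the tenant id, client id, secret, vault URL, and secret name below are placeholders) could look like this:

import com.azure.identity.ClientSecretCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;

public class KeyVaultWithServicePrincipal {
    public static void main(String[] args) {
        // Placeholders: the ids/secret of the service principal described above.
        SecretClient client = new SecretClientBuilder()
                .vaultUrl("https://<your-vault>.vault.azure.net")
                .credential(new ClientSecretCredentialBuilder()
                        .tenantId("<tenant-id>")
                        .clientId("<client-id>")         // the application id found above
                        .clientSecret("<client-secret>") // the password created via PowerShell
                        .build())
                .buildClient();

        System.out.println(client.getSecret("my-secret-name").getValue());
    }
}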
You can't get a token using those variables, because locally there is no Azure AD identity registered for your machine, and Microsoft didn't build an MSI emulator, so those variables will never be set.
I can recommend what Microsoft did in their .NET library:
1. Run the Azure CLI and log in.
2. In code, check for the MSI variables; if they don't exist, run the CLI command az account get-access-token --resource 'https://vault.azure.net' and use the token it returns.
In the CLI, simply log in as either a service principal or your own account, and make sure that account is added to the Key Vault access policy.
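For the Java side, the azure-identity library offers a similar mechanism: AzureCliCredential obtains tokens from your local az login session, much like running az account get-access-token yourself. A minimal sketch (the vault URL and secret name are placeholders):

import com.azure.identity.AzureCliCredential;
import com.azure.identity.AzureCliCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;

public class LocalDevKeyVault {
    public static void main(String[] args) {
        // Reuses the token from your local "az login" session;
        // no MSI_ENDPOINT/MSI_SECRET variables are needed.
        AzureCliCredential credential = new AzureCliCredentialBuilder().build();

        SecretClient client = new SecretClientBuilder()
                .vaultUrl("https://<your-vault>.vault.azure.net") // placeholder
                .credential(credential)
                .buildClient();

        System.out.println(client.getSecret("my-secret-name").getValue());
    }
}

In practice you may prefer DefaultAzureCredential, which tries managed identity when the app runs in Azure and falls back to the Azure CLI locally, so the same code works in both environments.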
I know it's weird, but you can even check this approach in their code on GitHub.
I also have an article that may help if you want more details:
https://marczak.io/posts/2019/07/securing-websites-with-msi/

Google App Engine: deploy to two different accounts from the same machine

When a deploy is performed using the Google App Engine Maven plugin, the browser opens and an OAuth key is copied.
However, I want to use two different Google accounts from the same laptop. I have already registered one OAuth key previously, but now want to use another; I never get prompted for a key, and the deploy fails because the application id is not correct, which is expected.
Is there any way to use two different Google accounts with the App Engine SDK?
I can deploy the app fine from a different machine, as it is set up with the correct account.
(This would not be a problem if push-to-deploy/pipelines worked, but it doesn't.)
To register more than one account with the SDK, you should use the gcloud command-line tool.
Once you have logged in with multiple accounts, gcloud auth list should yield the list of accounts. Switching is as simple as running gcloud config set account ACCOUNT, which makes the relevant tokens active and allows you to use appcfg.[py|sh] with the selected account.
If you look in your home directory, you should find a couple of files used by appcfg: .appcfg_oauth2_tokens_java and .appcfg_cookies. Deleting the former (I think) will prompt appcfg to re-trigger the OAuth process.
So if you have multiple OAuth token files, you can create a short shell script that takes the username as an argument and copies the token file you need to .appcfg_oauth2_tokens_java just before the appcfg update.

How to handle credentials in an open source project

I have developed an application that lives in a public GitHub repository.
The app interacts with systems that require credentials, which are currently stored in a properties file.
A Jenkins box runs the app periodically.
The problem of saving the project in GitHub without exposing my credentials is succinctly addressed here.
How do I pass my credentials to the Jenkins job without exposing the credentials (needed by the app) to my workmates?
In my case I went for a Jenkins parameterized build, which allowed me to provide string and password type parameters.
The params are read in the program via System.getProperty.
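A minimal sketch of that pattern (the parameter and property names are made up for illustration): the Jenkins build step passes the parameters to the JVM as -D system properties, and the app reads them instead of a checked-in properties file:

public class BuildCredentials {
    public static void main(String[] args) {
        // The Jenkins build step passes the parameters to the JVM, e.g.
        //   java -Dsvn.user=$SVN_USER -Dsvn.password=$SVN_PASSWORD -jar app.jar
        // (parameter and property names here are made up for illustration).
        String user = System.getProperty("svn.user", System.getenv("SVN_USER"));
        String password = System.getProperty("svn.password", System.getenv("SVN_PASSWORD"));

        if (user == null || password == null) {
            throw new IllegalStateException("Credentials were not supplied by the build");
        }
        // ... use the credentials; nothing sensitive ever lands in the repository ...
    }
}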

Hudson project without user interface

Can I configure, create, or update an existing project in Hudson without using its user interface?
Is it possible by changing a configuration file or by some other means?
The Remote Access API page mentions that you can create/copy a job with it.
Remote access API is offered in a REST-like style.
That is, there is no single entry point for all features; instead, they are available under the ".../api/" URL, where the "..." portion is the data that it acts on.
For example, if your Hudson installation sits at http://deadlock.netbeans.org/hudson/, http://deadlock.netbeans.org/hudson/api/ will give you HTML lists of all available functionality that act on the Hudson root.
On my Hudson, the /api address gives:
Create Job
To create a new job, post config.xml to this URL with query parameter name=JOBNAME.
You'll get a 200 status code if the creation is successful, or a 4xx/5xx code if it fails.
config.xml is the format Hudson uses to store the project on the file system, so you can see examples of it under /server/path/to/your/hudson/home.
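The same API also covers updating an existing project: each job exposes its configuration at .../job/JOBNAME/config.xml, which can be fetched, modified, and POSTed back. A rough Java sketch (the job name and the edit are placeholders, and a secured installation will additionally require authentication and possibly a CSRF crumb):

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class UpdateJobConfig {
    public static void main(String[] args) throws Exception {
        // Placeholder job URL on the example installation mentioned above.
        String jobUrl = "http://deadlock.netbeans.org/hudson/job/MyJob/config.xml";

        // 1. Fetch the current job configuration.
        HttpURLConnection get = (HttpURLConnection) new URL(jobUrl).openConnection();
        String configXml;
        try (InputStream in = get.getInputStream()) {
            configXml = new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }

        // 2. Modify the XML as needed (placeholder edit).
        configXml = configXml.replace("<disabled>true</disabled>", "<disabled>false</disabled>");

        // 3. POST it back to the same URL to update the job.
        HttpURLConnection post = (HttpURLConnection) new URL(jobUrl).openConnection();
        post.setRequestMethod("POST");
        post.setDoOutput(true);
        post.setRequestProperty("Content-Type", "application/xml");
        try (OutputStream out = post.getOutputStream()) {
            out.write(configXml.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + post.getResponseCode()); // 200 on success
    }
}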
