I would like to ask how I can connect to the Spark service on Bluemix from an application deployed on Java Liberty (also on Bluemix)?
Thank you
Open your Bluemix dashboard, then open your Java Liberty CF app by clicking on it under CF Apps.
Click Overview; from there you can add a new Spark service, or bind an existing Spark service from Bluemix, by clicking the ADD A SERVICE OR API or BIND A SERVICE OR API tile.
Once the service is added, you can check its credentials by clicking Show Credentials.
Now, whatever you are trying to do from your Liberty app, you can use those credentials to do it.
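Inside the Liberty app, those credentials arrive through the VCAP_SERVICES environment variable. Here is a minimal sketch of pulling a single field out; the JSON shape and field names below are hypothetical (check your own Show Credentials output), and a real app should use a proper JSON parser such as javax.json rather than a regex:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class VcapCredentials {
    // Minimal sketch: extract one string-valued field from the VCAP_SERVICES
    // JSON. Prefer a real JSON parser (e.g. javax.json on Liberty) in practice.
    static String field(String vcapJson, String name) {
        Matcher m = Pattern.compile("\"" + Pattern.quote(name) + "\"\\s*:\\s*\"([^\"]*)\"")
                           .matcher(vcapJson);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Hypothetical payload; the real one comes from
        // System.getenv("VCAP_SERVICES") inside the Liberty app.
        String sample = "{\"spark\":[{\"credentials\":"
                + "{\"cluster_master_url\":\"https://spark.example.net\",\"tenant_id\":\"abc123\"}}]}";
        System.out.println(field(sample, "cluster_master_url")); // prints https://spark.example.net
    }
}
```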
Practically speaking, though, the Apache Spark service is used for analytics: interactive data analysis with notebooks, or jobs run with spark-submit, which is a command-line utility.
So if your Java Liberty app is going to consume some analytical output,
you could run spark-submit jobs from your Java Liberty app programmatically and then read the output from the console (but I am not sure this approach would be good).
The recommended approach would be to let your spark-submit job store its results in an Object Storage service and then read them from your Java Liberty app.
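If you do go the programmatic route, a minimal sketch of building and launching the spark-submit command line from Java might look like this; the script path, jar, and class name are placeholders for your own job:

```java
import java.util.ArrayList;
import java.util.List;

public class SparkSubmitLauncher {
    // Builds the spark-submit command line. The arguments are placeholders --
    // substitute the script path, main class, and jar from your own job.
    static List<String> buildCommand(String sparkSubmit, String mainClass, String appJar) {
        List<String> cmd = new ArrayList<>();
        cmd.add(sparkSubmit);
        cmd.add("--class");
        cmd.add(mainClass);
        cmd.add(appJar);
        return cmd;
    }

    public static void main(String[] args) {
        List<String> cmd = buildCommand("./spark-submit.sh", "com.example.Job", "analytics-job.jar");
        System.out.println(String.join(" ", cmd));
        // To actually launch it and read the console output from the Liberty app:
        //   Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        //   ...read p.getInputStream() line by line, then p.waitFor()...
    }
}
```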
https://console.ng.bluemix.net/docs/services/AnalyticsforApacheSpark/index-gentopic3.html#genTopProcId4
Thanks,
Charles.
I have a Spring Boot web app that I want to deploy to Azure. The app is a Spring Boot jar. I have been able to use the azure-webapp plugin to achieve this, but it uses OAuth2, which limits integrating this deployment method into our CD (Bitbucket).
So the more generic question would be: how can I deploy my Spring Boot app via a Bitbucket deployment pipeline?
I looked at a sample YAML file from Bitbucket, and it looks like it needs these variables:
AZURE_APP_ID: $AZURE_APP_ID
AZURE_PASSWORD: $AZURE_PASSWORD
AZURE_TENANT_ID: $AZURE_TENANT_ID
AZURE_RESOURCE_GROUP: $AZURE_RESOURCE_GROUP
AZURE_APP_NAME: $AZURE_APP_NAME
ZIP_FILE: app-$BITBUCKET_BUILD_NUMBER.zip
So where would I get these values from?
Azure App ID: I am assuming this comes from Azure App Service? But I don't see any app ID on my currently deployed app.
Azure Password: is this the password for my (admin) account?
Azure Tenant ID: what is this, and where do I get it from?
Also, is this the correct approach, or should I be using some other method, such as Azure Pipelines?
AZURE_APP_ID, AZURE_PASSWORD and AZURE_TENANT_ID are the key-value pairs you get back after creating a service principal for your application in Azure.
Brief explanation:
AZURE_APP_ID is the associated service principal's application ID, used for login.
AZURE_PASSWORD is the service principal's credential (its client secret).
AZURE_TENANT_ID is the ID of the Azure Active Directory tenant the service principal belongs to.
Since you mentioned the Bitbucket pipeline, the official Bitbucket documentation will help you create a service principal for your web app and gather the details required by the deployment YAML script.
It links to the Azure CLI documentation on creating a service principal; that page is focused on deploying a Function App from Bitbucket, but you can follow the same steps and substitute your Web App for the Function App.
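As a sketch, the service principal can be created with the Azure CLI; the name and scope below are placeholders, and the fields of the JSON output map onto the pipeline variables:

```shell
# Placeholders: substitute your own name, subscription ID, and resource group.
az ad sp create-for-rbac --name my-springboot-deployer \
  --role contributor \
  --scopes /subscriptions/<subscription-id>/resourceGroups/<resource-group>

# Fields in the JSON output map to the pipeline variables like so:
#   appId    -> AZURE_APP_ID
#   password -> AZURE_PASSWORD
#   tenant   -> AZURE_TENANT_ID
```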
1) I have pushed a sink app to PCF using cf push -p abcdef.jar sinkapp. It went fine.
2) Now I also have my SCDF server on PCF.
How can I register sinkapp on the SCDF server, which is in the same PCF, same org, same space? I have no clue what to reference when registering it. I am looking for the command I can give to SCDF from the Dataflow shell.
Thank you.
I'd highly recommend going through the getting-started experience for Cloud Foundry.
You should not be pushing the apps standalone and manually; instead, you'd "register" the app(s) in SCDF, and you'd then use the registered app(s) in the stream definition.
When you deploy the stream, SCDF will interpret the definition and in turn, it will push the apps to the desired org/space on your behalf. Here's a sample manifest for SCDF, where you'd define the org/space and other overrides.
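As a sketch, the registration and deployment commands from the Dataflow shell look like this; the app name, Maven coordinates, and stream definition are placeholders:

```shell
dataflow:> app register --name mysink --type sink --uri maven://com.example:my-sink:1.0.0
dataflow:> stream create --name mystream --definition "http | mysink"
dataflow:> stream deploy --name mystream
```

With a maven:// (or docker://) URI, SCDF can resolve the artifact itself and push it to your org/space as part of the stream deployment, which is why you don't need to cf push the sink app manually first.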
I need to deploy my Java application on Azure Cloud Services. I don't want the extra overhead that comes with managing my own machines using Azure VMs (IaaS), nor do I want to use App Service, since the max cores per machine there is 4. My application is very compute intensive, and I would like to use at least 16 cores per instance, which Azure Cloud Services provides (D5v2 instances).
My build system is Maven, and I would like to use something like Codeship to build my .war and deploy it to Azure Cloud Services (rather than using the Azure Eclipse SDK to manually publish to Azure Cloud Services). I've spent hours on the Azure documentation but haven't found any way of doing this. (Azure App Service has a simple "upload a war to deploy" model; I don't know why the same isn't there for Cloud Services: https://azure.microsoft.com/en-in/documentation/articles/web-sites-java-get-started/.)
Remember that Cloud Services are the original deployment mechanism for Azure, dating back to 2010 (ok, 2009 if you want to count pre-production days). The .cspkg format is pretty much the same as it ever has been. The Web Apps deployment mechanism is completely different.
Eclipse (on Windows) has a specific plugin available for constructing .cspkg which you can then automate deploying, via PowerShell or CLI.
Alternatively, you can bundle your .war files within a .cspkg generated by Visual Studio, and then get things started within `OnStart()`. Again, you can automate deployment from scripts - no need to ever publish directly from within an IDE.
Also: there's nothing stopping your automation process from pushing .war files to blob storage (or somewhere else) and then sending your app some type of message letting it know to update itself. At that point, there's no redeployment of a .cspkg - rather, it's just a matter of downloading a new .war to the running web/worker instances and restarting the java process.
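A minimal sketch of that last idea, using only the JDK; the blob URL and webapps path are hypothetical, and the demo in main uses a local file:// URL in place of real blob storage:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class WarUpdater {
    // Downloads the artifact at 'source' to 'target', replacing any old copy.
    // In production, 'source' would be the blob-storage URL of the new build
    // and 'target' a path under the instance's webapps directory.
    static void download(URL source, Path target) throws Exception {
        try (InputStream in = source.openStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws Exception {
        // Demo round-trip with a local file:// URL.
        Path demo = Files.createTempFile("build", ".war");
        Files.write(demo, "new build".getBytes());
        Path target = Files.createTempFile("deployed", ".war");
        download(demo.toUri().toURL(), target);
        System.out.println(new String(Files.readAllBytes(target))); // prints new build
    }
}
```

After the copy, the app would restart its java/servlet process so the new .war is picked up.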
@DylanColaco, as @DavidMakogon said, you can install the azure-tools-for-java plugins for Eclipse or IntelliJ IDEA to deploy your war file as a web/worker role instance into a cloud service.
There is an official tutorial which shows how to get started.
For reference, see the articles below and a very helpful video at Channel 9.
For Eclipse, https://azure.microsoft.com/en-us/documentation/articles/azure-toolkit-for-eclipse/
For IntelliJ, https://azure.microsoft.com/en-us/documentation/articles/azure-toolkit-for-intellij/
Java Applications in Windows Azure Cloud Services using Eclipse
I would like to let users run a transformation file created with Pentaho Data Integration without needing the PDI application (Spoon), and I want it to run from a web application. For example, when the user clicks or triggers the PDI file in the web application, the .ktr (transformation) file should run automatically.
Since I am new to PDI, can someone give me a link or a step-by-step guide on how to do this? All the links I have found are either incomplete or hard to follow.
You can expose your transformation as a web service with a Carte server:
http://wiki.pentaho.com/display/EAI/PDI+data+over+web+services
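As a rough sketch (host, port, and the path to the .ktr are placeholders), you start a Carte server and then trigger the transformation over HTTP from the web application:

```shell
# Start a Carte server on this machine, listening on port 8081:
./carte.sh localhost 8081

# The web app can then trigger the transformation with a plain HTTP GET, e.g.:
#   http://localhost:8081/kettle/executeTrans/?trans=/path/to/my-transformation.ktr
```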
I am pretty new to Amazon AWS technologies, and I have been going through all their documentation. My goal is to create a new web service for a use case (preferably REST) using Tomcat.
I want to use this service from multiple clients: Android, iPhone, tablet, web, etc.
Some examples of the calls I want to support:
GET http://myservice.com/user/{userid}
PUT http://myservice.com/user/{user-data}
Does AWS or any other cloud service providers provide anything out of the box for deploying such services with minimal code changes?
With AWS, you create a virtual server, customize it, and then use it. When you create a server, you pick your operating system and the size of the server you need. Once it is running, you can login and customize it.
For example, you might start a linux server using the Amazon Linux AMI (amazon machine image). You can use yum to install tomcat. You can drop your war file into the tomcat webapps directory.
Set up access in the security group (firewall) to allow your clients to access the relevant port(s).
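The middle steps, sketched for an instance running the Amazon Linux AMI (the package name and paths may differ on other distributions or newer images):

```shell
# Install Tomcat, drop in the war, and start the service.
sudo yum install -y tomcat8
sudo cp myservice.war /var/lib/tomcat8/webapps/
sudo service tomcat8 start
```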
Bottom line is that the process is basically the same as if you are doing this on a new server of your own.