Embedding/integrating Pentaho Data Integration into web applications? (Java)

I would like to let the user run a transformation file created with Pentaho Data Integration without using the PDI application (Spoon), and I want it to run in a web application. For example, when the user clicks or triggers the PDI file in the web application, the .ktr (transformation) file should run automatically.
Since I am new to PDI, can someone give me a link or a step-by-step guide on how to do this? All the links that I have found are either incomplete or hard to follow.

You can expose your transformation as a web service with the Carte server:
http://wiki.pentaho.com/display/EAI/PDI+data+over+web+services
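
As a concrete illustration, here is a minimal Java sketch of triggering a transformation exposed through Carte over plain HTTP. The host, port, and .ktr path are hypothetical, it assumes Carte's default cluster/cluster credentials, and the executeTrans endpoint should be verified against the wiki page above for your PDI version:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RunKtrOverCarte {
    public static void main(String[] args) throws Exception {
        // Hypothetical Carte host/port and transformation path; adjust to your setup.
        URL url = new URL("http://localhost:8081/kettle/executeTrans/"
                + "?trans=/opt/etl/my_transformation.ktr");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Carte ships with cluster/cluster as its default basic-auth credentials.
        String token = Base64.getEncoder()
                .encodeToString("cluster:cluster".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + token);

        // Read whatever the transformation's servlet output step returns.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}

A web application would make the same call from a controller or service method when the user triggers the file.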

Related

How to deploy a Vue.js application

My application has two separate parts: a back end in Java (Spring Boot) and a front end in Vue.js.
I can simply deploy the JAR for my back-end code where I need to, and that's it for the back-end deployment.
When it comes to deploying the Vue.js app, I can do something similar and just put the compiled Vue.js application at the proper path in the Spring Boot application, and that would be all for the front end too.
It just doesn't seem right to me to put that application inside Spring Boot when it doesn't have anything to do with it other than deployment (maybe I know nothing, like Jon Snow).
Also, when it's put under a Spring Boot application, manual URL editing doesn't work.
This app doesn't do anything on its own, it fetches all its data from the back-end app.
So what are my options here, can someone please guide me in the right direction?
Do I just set up a Node.js server and deploy the Vue.js app on that? I am not sure how that works, or whether I should even be doing that for a production application. And if so, where do I start with setting up Node.js?
It makes sense to deploy it together with Spring, and it's very common practice, at least from my experience with Angular (which I suppose is very similar to Vue.js).
You don't need to have two servers running. You just let Spring serve your HTML/JS/CSS files, which helps you avoid any problems with CORS.
I am not really sure what you mean by 'manual URL editing' - navigating the web page by editing the URL? To be honest I don't see many use cases for that, and I would guess it is only a matter of a few settings (see the sketch below).
In Gradle, I would set up a build task (not sure if 'task' is the correct word): two build.gradle files, one each for the front end and back end, with the back-end build depending on the front-end build. The front-end build runs whenever the back end is built; it produces static HTML/JS (in my case from Angular, but it should be similar for Vue), and the back-end task adds that output to the classpath of the Java application so that Spring can register the HTML and serve it to you.
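
On the 'manual URL editing' point: if the Vue router runs in history mode, deep links hit Spring directly and return 404 unless they are forwarded back to index.html. A minimal sketch of the usual workaround; the controller name and path patterns are illustrative:

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

// Hypothetical controller: any path without a file extension is forwarded to
// index.html so the Vue router can resolve it client-side. If your REST API
// lives under the same context, map it first or exclude it from these patterns.
@Controller
public class SpaForwardingController {

    @RequestMapping({"/{path:[^\\.]*}", "/**/{path:[^\\.]*}"})
    public String forwardToIndex() {
        return "forward:/index.html";
    }
}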
You could use Docker to create a Dockerized version of your Vue.js app and then you can deploy this onto a cloud service provider such as AWS (e.g. EC2).
Check out this link for a basic guide: https://v2.vuejs.org/v2/cookbook/dockerize-vuejs-app.html
My approach is to deploy the front end and back end separately.
You can use a web server to proxy requests to either the Vue.js app or Spring Boot.
For example, if you use Nginx, you can use this configuration to route requests:
# pass root requests to the index file
location / {
    root /front_files/;
    index /index.html;
}

# pass requests for static files
location ~ ^/(js|styles) {
    root /front_files/;
}

# pass requests to the back end
location /api/ {
    proxy_pass http://127.0.0.1:8080/;
}

How to ship a Java-based web application in the AWS cloud using Docker

We have built a Java web application which provides various REST APIs. I would like to have a painless deployment process. Here is the desired scene:
Users -> Load Balancer -> AS1, AS2, AS3 ...
Here AS = application server (Tomcat on EC2) or Docker instances (I would prefer Docker instances).
First-time desired flow:
Developer fires Maven and builds the .war file
We may develop a script which will generate a Docker image using this .war file
Execute steps which will bring these Docker containers up behind the ELB
Redeployment:
Developer fires Maven and builds the .war file
We may develop a script which will generate a Docker image using this .war file
Execute steps which will bring the new Docker containers up behind the ELB and destroy the previous ones
I am kind of new to DevOps and may be making some mistakes in the above steps, so please feel free to correct me and provide guidance to achieve this goal.
(If this is duplicate please provide link to related question)
Thanks in advance.

Calling the Spark service on Bluemix from a Java Liberty app

I would like to ask how I can connect to the Spark service on Bluemix from an application deployed on Java Liberty (also on Bluemix)?
Thank you
Open your Bluemix dashboard and then open your Java Liberty CF app by clicking on it under CF Apps.
Click on Overview; you can then add a new Spark service, or bind your existing Spark service, by clicking the ADD A SERVICE OR API or BIND A SERVICE OR API tiles.
Once your service is added, you can check the credentials by clicking Show Credentials.
Now, whatever you are trying to do from your Liberty app, you can use those credentials to do it.
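
For reference, the credentials of a bound service are also available to the running app through Cloud Foundry's VCAP_SERVICES environment variable. A minimal sketch of reading them with the javax.json (JSR 353) API that Liberty provides; the "spark" service key shown here is hypothetical, so check your actual VCAP_SERVICES for the real entry name:

import java.io.StringReader;
import javax.json.Json;
import javax.json.JsonObject;
import javax.json.JsonReader;

public class SparkServiceCredentials {
    public static void main(String[] args) {
        // Cloud Foundry injects bound-service credentials as a JSON document.
        String vcap = System.getenv("VCAP_SERVICES");
        try (JsonReader reader = Json.createReader(new StringReader(vcap))) {
            JsonObject services = reader.readObject();
            // "spark" is a placeholder key; use the name your dashboard shows.
            JsonObject credentials = services.getJsonArray("spark")
                    .getJsonObject(0)
                    .getJsonObject("credentials");
            System.out.println(credentials);
        }
    }
}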
Practically, though, the Apache Spark service is used for analytics: with notebooks for interactive data analysis, or by running jobs using spark-submit, a command-line utility.
So if your Java Liberty app is going to consume some analytical output, you can run spark-submit jobs from it programmatically and then read the output from the console (but I am not sure this approach would be good).
The recommended approach would be to let your spark-submit job store its results in some object store and then read them from your Java Liberty app:
https://console.ng.bluemix.net/docs/services/AnalyticsforApacheSpark/index-gentopic3.html#genTopProcId4
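
As a rough sketch of the 'run spark-submit programmatically' idea mentioned above; the binary path, class name, and jar location are all hypothetical:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SparkSubmitLauncher {
    public static void main(String[] args) throws Exception {
        // Hypothetical paths: point these at your spark-submit binary and job jar.
        ProcessBuilder pb = new ProcessBuilder(
                "/opt/spark/bin/spark-submit",
                "--class", "com.example.MyAnalyticsJob",
                "/opt/jobs/my-analytics-job.jar");
        pb.redirectErrorStream(true); // merge stderr into stdout

        Process process = pb.start();
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                System.out.println(line); // job output / progress
            }
        }
        int exitCode = process.waitFor();
        System.out.println("spark-submit exited with " + exitCode);
    }
}

As the answer notes, scraping console output is fragile; having the job write its results to object storage and reading them separately is the more robust design.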
Thanks,
Charles.

Generate a report using the Pentaho Java API for an HDFS file, passing the HDFS file path programmatically

I am new to Pentaho. I have installed Pentaho Enterprise Edition and set the required configurations for Big Data. I was able to run a PDI transformation and generate reports using that tool. In my Java web application I have downloaded, via Maven, the pentaho-kettle jars required for generating reports. Is it possible to generate a report for an HDFS file by passing the HDFS file path dynamically or programmatically using the report API? If yes, what steps are required to accomplish this?
Thanks in advance.
You can have a Pentaho report on the server side. To be able to query HDFS, this report should query a PDI transformation/job; when the report attempts to query it, it starts the transformation. So you have to have a full PDI infrastructure to be able to execute this job/transformation.
Separate jars will not help, since PDI (also known as pentaho-kettle or Spoon) is 'an installation' rather than a library. It interacts with HDFS using 'shims' as plugins, and these shims have to have a correct structure of folders, config files, etc. In Enterprise Edition all of this is typically hidden under the hood of the enterprise server.
In case you want to use 'my own java web application', the simplest way to get everything working (from my point of view) is to create a Kettle transformation, install a Carte server (next to your web server or on another machine), and configure the HDFS steps to run on this Carte server.
Technically, when such a report is launched in the context of your web application, it triggers the Kettle transformation's execution, which in turn calls the Carte server for the HDFS steps. Since the Carte server has a correct PDI installation able to interact with HDFS, it will fetch the HDFS data and send it back to your application. The data travels over the network: your web application runs the report, the report runs the transformation, the transformation fetches data from the Carte server, and the Carte server fetches data from HDFS.
You may consider making the Carte server accessible only from localhost, while exposing your web app to external requests. Hope it will help.
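To make the HDFS path dynamic in this setup, one option is to define a named parameter in the transformation and pass it on the Carte request. A minimal sketch; the host, port, .ktr path, and parameter name are hypothetical, it assumes Carte's default cluster/cluster credentials, and the executeTrans endpoint should be checked against your PDI version's Carte documentation:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RunHdfsReportTrans {
    public static void main(String[] args) throws Exception {
        // Hypothetical named parameter defined in the .ktr, e.g. ${HDFS_INPUT_PATH}.
        String hdfsPath = URLEncoder.encode(
                "hdfs://namenode:8020/data/input.csv", "UTF-8");
        URL url = new URL("http://localhost:8081/kettle/executeTrans/"
                + "?trans=/opt/etl/report_source.ktr"
                + "&HDFS_INPUT_PATH=" + hdfsPath);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Carte's default basic-auth credentials are cluster/cluster.
        String token = Base64.getEncoder()
                .encodeToString("cluster:cluster".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + token);

        // Stream back whatever the transformation's output step writes.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}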

Java standalone app with dynamic configuration via server over HTTP

I am writing a standalone Java app. The app's properties should be configurable from a webpage deployed with the app. How do I achieve this?
Thanks in advance
Note: the app has an embedded HTTP client/server module. It should only run from the command prompt.
I don't think that's a good idea. Webpage forms are designed to work with a server, not with a standalone client app. You could have the app run its own web server, but that would mean the app has to be running for the configuration page to work, and it's also a rather contrived setup just to do some configuration.
It might be possible for the webpage to contain JavaScript that writes to a local file - I don't know enough about the JavaScript security model to say.
But why not have the configuration dialog as part of the app's GUI? That's the normal and expected behaviour - you'd need a pretty compelling reason to deviate from it.
JMX might be the answer that you're looking for. If you expose all of your configurable properties through MBeans, then adding a web page on top of that exposing these properties is just configuration.
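
A minimal sketch of that idea; the MBean interface/class names and the single property are illustrative. Once registered, the property can be read and changed at runtime, e.g. via JConsole or whatever web layer you put on top:

import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// Standard MBean convention: the interface name is the class name + "MBean".
interface AppConfigMBean {
    String getLogLevel();
    void setLogLevel(String level);
}

class AppConfig implements AppConfigMBean {
    private volatile String logLevel = "INFO";
    @Override public String getLogLevel() { return logLevel; }
    @Override public void setLogLevel(String level) { this.logLevel = level; }
}

public class JmxConfigExample {
    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        server.registerMBean(new AppConfig(),
                new ObjectName("com.example:type=AppConfig"));
        System.out.println("AppConfig MBean registered; attach with JConsole.");
        Thread.sleep(Long.MAX_VALUE); // keep the JVM alive for demonstration
    }
}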
You can launch a standalone Java app using JNLP files (Java Web Start). If you want the user to be able to configure the application before it's launched, you can have the JNLP file dynamically generated and then pass properties as environment variables through the JNLP file.
You can configure your standalone Java app to read configurable properties from a properties file (say conf.properties) on the server.
You may have a UI webpage (HTML/JSP) with all the fields to be configured. When the page is submitted, a JSP/servlet may write/update the contents of conf.properties on the server.
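
On the client side, the standalone app can load that file over HTTP at startup (or on a refresh interval). A minimal sketch with a hypothetical URL and property name:

import java.io.InputStream;
import java.net.URL;
import java.util.Properties;

public class RemoteConfigLoader {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical URL where the server publishes conf.properties.
        URL configUrl = new URL("http://config.example.com/app/conf.properties");
        try (InputStream in = configUrl.openStream()) {
            props.load(in);
        }
        // Example read with a default fallback value.
        String pollInterval = props.getProperty("poll.interval.seconds", "30");
        System.out.println("poll.interval.seconds = " + pollInterval);
    }
}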
UPDATE: The above solution will work assuming only an admin user wants to update the properties file.
In case anybody should be able to update it, then concurrency issues have to be taken into account.
In that scenario, you have to implement a mechanism similar to how WebLogic 10 updates config.xml using the Admin Console.
That is, you will have two conf.properties files, confA and confB (initially in sync). The standalone app will always read from confB. The UI will have two buttons, say Lock & Release Configurations. When an edit is made (locked and released), it will be written to confA, and at the same time the changes in confA have to be replicated to confB.
