We have scripted all of the processes (creating models, table mappings, scenarios and load plans) in Java using Groovy libraries.
Now we want to execute the created load plan from Java.
Could somebody please provide guidelines for scripting the execution code in Java?
We found how to execute it from the command prompt, but we are looking for a way to script it in Java.
A load plan or scenario execution can be triggered through the RemoteRuntimeAgentInvoker class from the oracle.odi.runtime.agent.invocation package. Check the invokeStartLoadPlan methods.
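For example, a minimal sketch (the agent URL, credentials, and plan/context names are placeholders, and the exact invokeStartLoadPlan overload varies by ODI version, so the parameter list below is an assumption — check the SDK Javadoc):

import oracle.odi.runtime.agent.invocation.RemoteRuntimeAgentInvoker;

public class StartLoadPlan {
    public static void main(String[] args) throws Exception {
        // Connect to a running ODI agent (URL and credentials are placeholders)
        RemoteRuntimeAgentInvoker invoker = new RemoteRuntimeAgentInvoker(
                "http://odi-host:20910/oraclediagent", "SUPERVISOR", "password".toCharArray());
        // Start the load plan in a context; this argument list
        // (plan name, context code, log level, keywords, startup params)
        // is an assumption -- use the overload your SDK version provides
        invoker.invokeStartLoadPlan("MY_LOAD_PLAN", "GLOBAL", 5, null, null);
    }
}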
In Postman, using REST, I created a collection of requests that need to be executed. I was hoping to link Gherkin acceptance criteria with the Cucumber shell to execute the Postman collection.
Is there a way for me, in the Cucumber shell, to execute the collection using Java?
Gherkin, Cucumber and Java are all new to me, so I apologize if I'm not being exactly clear.
thanks,
Scott
Can you execute the collection of requests using Java? That is, leaving the Cucumber and Gherkin out of the equation? If you can, then you can do it using Cucumber as well.
The execution of steps defined in Gherkin using Cucumber is just a way of executing selected Java methods matching the steps you define in Gherkin.
This is obviously under the condition that you choose to run Cucumber for Java.
To get started and have something to build from, clone and run the getting-started project offered by the Cucumber team: https://github.com/cucumber/cucumber-java-skeleton
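If it helps to see the shape of this, here is a sketch of a step definition that shells out to Newman (Postman's command-line collection runner, assumed to be installed and on the PATH); the annotation package matches recent cucumber-java releases:

import io.cucumber.java.en.When;

public class PostmanSteps {
    // Matches a Gherkin step like:
    //   When the Postman collection "requests.postman_collection.json" is executed
    @When("the Postman collection {string} is executed")
    public void runCollection(String collectionFile) throws Exception {
        // Delegate to Newman so the collection runs outside the Postman GUI
        Process newman = new ProcessBuilder("newman", "run", collectionFile)
                .inheritIO()
                .start();
        if (newman.waitFor() != 0) {
            throw new AssertionError("Postman collection failed: " + collectionFile);
        }
    }
}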
I developed a transformation that generates a Kermeta XMI file.
My problem is that I would like to run this transformation (.kmt file) in the background, that is to say from a Java program.
I have tried a lot of code, but always without result.
Can someone help me?
Thank you in advance for your help.
Running a Kermeta program depends on the version of Kermeta and on its execution context.
With Kermeta 2, the kmt code is compiled into Scala. This Scala code can be called directly from Java thanks to Scala's compatibility with Java.
You first need to declare at least one main operation in order to generate the appropriate initialization operation.
Let's say you have a method "transformFooToBar" in a class "org::example::FooToBar" in your Kermeta code.
Declare it as a main entry point using the tags
#mainClass "org::example::FooToBar"
#mainOperation "transformFooToBar"
This main operation must have only String parameters.
This will generate a utility Scala class org.example.MainRunner which contains useful initialization methods, plus a main operation that you can call from the command line.
Case 1/ With Kermeta 2 code called from an Eclipse plugin:
This is the simplest case: call the method org.example.MainRunner.init4eclipse()
before any other call. Then call your main method org.example.KerRichFactory.createFooToBar.transformFooToBar()
You can call any other classes or methods (i.e. even with parameters that aren't String).
This is quite convenient for building Eclipse tools based on Kermeta transformations; a minimal sketch follows below.
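Assuming the generated names match the FooToBar example above (and that, from Java, the generated factory accessor is written with parentheses), case 1 boils down to:

// Inside an Eclipse plugin: initialize the generated runtime once...
org.example.MainRunner.init4eclipse();
// ...then invoke the transformation through the generated factory
org.example.KerRichFactory.createFooToBar().transformFooToBar();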
Case 2/ Kermeta 2 code called from a standard Java application (i.e. not running in an Eclipse plugin):
The initialization method is then org.example.MainRunner.init()
Common trap for case 2: many transformations that run standalone still need to manipulate models using Eclipse URI schemes in their internal referencing system, i.e. platform:/plugin/..., platform:/resource/..., pathmap:/..., or even more complex URI mappings (typically using custom protocols). You can check this easily by looking at the XMI files as text.
In that case, since the Eclipse platform isn't running to provide the lookup mechanism, you need to manually provide the equivalent URI mapping to map these URIs to your local system URIs (i.e. to file:/... or jar:file:/... URIs).
One possibility is to use a urimap.properties file that provides such a mapping. By default, when running a Kermeta program in Eclipse, a urimap.properties file is generated for the current Eclipse configuration.
When deploying a Kermeta program to another computer, or when using a custom deployment packaging, you will have to provide/compute an equivalent file for the target computer.
For convenience, you can set the location of this urimap.properties file via the system property "urimap.file.location".
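A corresponding sketch for case 2 (the paths and the FooToBar names are placeholders; the property name comes from the explanation above):

public class StandaloneLauncher {
    public static void main(String[] args) {
        // Point the runtime at the URI mapping before initializing
        System.setProperty("urimap.file.location", "/path/to/urimap.properties");
        // Standalone initialization (no Eclipse platform running)
        org.example.MainRunner.init();
        org.example.KerRichFactory.createFooToBar().transformFooToBar();
    }
}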
Does anyone know if it is possible to call a shell script from within a Java program, and also to pass an argument to that shell script from a for loop in the Java class? In my shell script I am setting MySQL system variables to different values to see whether those values affect the performance of the database application. I could have set those values through JDBC, but as I am working with MySQL, it is not possible to restart the database from JDBC after each query execution.
Yes, it is possible. For something like this you would probably be better off just using a batch file or similar, though.
If you really do need to use Java, try:
How to run Unix shell script from Java code?
Runtime.exec() is what you are looking for.
Runtime.getRuntime().exec("theScript.sh param1 param2");
...modulo exception handling.
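For the loop over settings described in the question, a slightly fuller sketch using ProcessBuilder (the script name and values are placeholders):

String[] values = {"1000", "2000", "4000"};
for (String value : values) {
    // Pass the current setting to the script as an argument
    ProcessBuilder pb = new ProcessBuilder("./theScript.sh", value);
    pb.inheritIO(); // forward the script's output to this console
    int exitCode = pb.start().waitFor();
    if (exitCode != 0) {
        System.err.println("Script failed for value " + value);
    }
}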
You may want to look into Apache Commons Exec, which wraps exactly this kind of process handling.
I'm thinking about using Pentaho to help me transform different XML files from several sources to integrate the data into my system. Those XML files are downloaded from the internet every 10 minutes by a Java program. If I want to use Kettle to transform the data, do I need a Pentaho server in order to run the transformations? Or is there a way to export the transformations to Java classes so I can use them? I'd appreciate any kind of orientation :)
If you want to run transformations you only need the command console pan.sh; if you are going to run jobs you will need to execute the kitchen.sh command. Take a peek at the carte.sh server: it's a self-contained web server that allows you to send transformations and jobs remotely to another machine (intranet, internet).
Answering your other question about Java code: no, Kettle does not generate Java code based on your transformations and jobs, only XML.
You don't need the server. You can run your Kettle jobs using the Kitchen command line. You could easily set up a cron job to execute your job at a set interval.
http://wiki.pentaho.com/display/EAI/Kitchen+User+Documentation
To run transformations and jobs you need to use pan and kitchen (the .sh versions for Linux, .bat for Windows); read here: http://infocenter.pentaho.com/help/index.jsp?topic=%2Fpdi_user_guide%2Fconcept_cli_scripting.html. If you want to run transformations/jobs directly from Java, you need to import "lib" and "libswt" (and "libext" for some versions of Kettle) into your Java project and use Kettle's Java API:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

// Initialize the Kettle environment once per JVM
KettleEnvironment.init();
// Load the job definition from its file; "job path", "param name"
// and value are placeholders (no repository, hence the nulls)
JobMeta jobMeta = new JobMeta("job path", null);
jobMeta.setParameterValue("param name", value);
Job job = new Job(null, jobMeta);
job.setLogLevel(LogLevel.BASIC);
job.start();
job.waitUntilFinished(); // block until the job completes
Result result = job.getResult();
if (!result.getResult()) {
    // manage the error case
}
This is an example of using the Java API; Kettle itself is written in Java, so it is fully integrable.
In a Java application that uses JPL to interact with Prolog, I want to be able to restart the Prolog engine with different settings. As an example, I would like to change from SWI to YAP (I configure which engine to use with the method JPL.setNativeLibraryDir, passing the path of the native library I need to use).
So after changing the JPL configuration, I was trying to halt the already-running Prolog engine in order to restart it afterwards with JPL.init().
First I took a look at JPL.halt(), but the documentation says it is deprecated, and the comments in the source code of the method say that it is a no-op.
Afterwards, I tried to just launch a query with 'halt', but although I see "YAP execution halted" in the console as expected, my Java application is halted as well (!).
Is there a way to restart the logic engine using JPL without killing my Java application?
Why not use a shell script to run your Prolog code instead? You can have /Applications/swi -l your/swi/code in the script, and use halt inside your Prolog code so that it halts after execution. You can then run that script from Java using the Runtime method.
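A minimal sketch of that workaround (the file name is a placeholder, and swipl is assumed to be on the PATH):

// Launch Prolog in a separate process; halt/0 then terminates
// only that process, not the Java application
Process prolog = Runtime.getRuntime().exec(
        new String[] {"swipl", "-l", "your/swi/code.pl"});
prolog.waitFor();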