My process simply adds some content to the PATH system variable. I'm currently doing this with a Process that uses setx.exe:
public void changePath(String newPath) {
    String path = System.getenv("PATH") + ";";
    String[] cmd = new String[]{"C:\\Windows\\System32\\setx.exe", "PATH",
            path + newPath, "-m"};
    ProcessBuilder builder = new ProcessBuilder(cmd);
    ...
}
So I tried to write a test case to it.
class UpdatePathTest {
    @Test
    public void testUpdatePath() {
        // call the method that updates the path
        changePath("C:\\somebin");
        assertTrue(System.getenv("PATH").contains("C:\\somebin")); // fails
        // A ProcessBuilder with command String[]{"cmd", "/C", "echo", "%PATH%"} fails too,
        // and so does the above in a new Thread.
    }
}
So, is there any way to get the new PATH value? Writing the new path is the only option, because I'm developing a jar that will install a desktop application.
I'm not sure changing the path is a good idea in a unit test. What if the test fails? You will have to make sure you do all the relevant tidy up.
Consider inverting your dependencies and use dependency injection.
This article explains it quite well I think.
So instead of having a method that does:
public void method() {
    String path = System.getenv("PATH") + ";";
    // do stuff on path
}
consider doing:
public void method(String path) {
    // do stuff on path
}
which allows you to stub the path. If you cannot change the signature of the method, consider using the factory pattern and a test factory to get the path.
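For example, a minimal sketch of the injected version (the class and method names are illustrative, not from the original code; JUnit 4 imports assumed):

// Illustrative sketch only: the PATH value is passed in, so a test can supply any value.
public class PathUpdater {
    public String buildNewPath(String currentPath, String newEntry) {
        return currentPath + ";" + newEntry;
    }
}

// The test never touches the real environment:
@Test
public void appendsEntryToInjectedPath() {
    String result = new PathUpdater().buildNewPath("C:\\existing", "C:\\somebin");
    assertEquals("C:\\existing;C:\\somebin", result);
}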
EDIT: after update to question
What you have to think about here is what you are testing. When you call C:\Windows\System32\setx.exe, you have read the API docs and are calling it with the correct parameters. This is much like calling another method in a Java API: if you are manipulating a list, you "know" it is zero-based. You do not need to test that; you just read the API, and the community backs you up on it. For testing changePath, what you probably want to test is what goes into ProcessBuilder. Again, you have read the API docs and you have to assume that you are passing in the correct variables (see //1 at the bottom). And again, you have to assume that ProcessBuilder works properly and that the Oracle (or most likely Sun) developers implemented it to the API documentation.
So what you want to do is check that you are passing variables to ProcessBuilder that match the specification as you understand it. For this you can mock ProcessBuilder and then verify that you are passing the correct parameters and calling the correct method on this class.
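As a rough sketch of that idea (the ProcessStarter interface and PathChanger class below are my own invention, not part of the question's code; JUnit 4 imports assumed): hide the process start behind a small seam so the test can capture the command array instead of actually running setx.exe.

// Hypothetical seam around ProcessBuilder so the command can be inspected in tests.
interface ProcessStarter {
    void start(String[] cmd) throws IOException;
}

class PathChanger {
    private final ProcessStarter starter;

    PathChanger(ProcessStarter starter) {
        this.starter = starter;
    }

    public void changePath(String newPath) throws IOException {
        String path = System.getenv("PATH") + ";";
        starter.start(new String[]{"C:\\Windows\\System32\\setx.exe", "PATH",
                path + newPath, "-m"});
    }
}

// The test injects a recording fake and asserts on what would reach ProcessBuilder.
@Test
public void passesSetxCommandToStarter() throws IOException {
    final String[][] captured = new String[1][];
    PathChanger changer = new PathChanger(cmd -> captured[0] = cmd);
    changer.changePath("C:\\somebin");
    assertEquals("C:\\Windows\\System32\\setx.exe", captured[0][0]);
    assertTrue(captured[0][2].endsWith("C:\\somebin"));
}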
In general it is a hard one to test because you don't want to test the windows functions but want to test java's interaction with them.
//1 The main problem I have had with calling these external commands is understanding the API documents correctly, or setting up the command itself. Usually you have to get the command line out and check that you are using it correctly (especially the cmd functions). This can mean that you work out how to use the cmd function, code it into ProcessBuilder, and then write a test (or vice versa on the ProcessBuilder/test). Not the ideal way, but sometimes documents are hard to understand.
picoCLI's @-file mechanism is almost what I need, but not exactly. The reason is that I want to control the exact location of the additional files being parsed, depending on previous option values.
Example: When called with the options
srcfolder=/a/b optionfile=of.txt, my program should see the additional options read from /a/b/of.txt, but when called with srcfolder=../c optionfile=of.txt, it should see those from ../c/of.txt.
The @-file mechanism can't do that, because it expands ALL the option files (always relative to the current folder, if they're relative) prior to processing ANY option values.
So I'd like to have picoCLI...
process options "from left to right",
recursively parse an option file when it's mentioned in an optionfile option,
and after that continue with the following options.
I might be able to solve this by recursively starting to parse from within the annotated setter method:
...
Config cfg = new Config();
CommandLine cmd = new CommandLine(cfg);
cmd.parseArgs(a);
...
public class Config {
    @Option(names="srcfolder")
    public void setSrcfolder(String path) {
        this.srcfolder = path;
    }

    @Option(names="optionfile")
    public void parseOptionFile(String pathAndName) {
        // validate path, do some other housekeeping...
        CommandLine cmd = new CommandLine(this /* same Config instance! */);
        cmd.parseArgs(new String[] { "@" + this.srcfolder + pathAndName });
    }
    ...
This way several CommandLine instances would call setter methods on the same Config instance, recursively "interrupting" each other. Now comes the actual question: Is that a problem?
Of course my Config class has state. But do CommandLine instances also have state that might get messed up if other CommandLine instances also modify cfg "in between options"?
Thanks for any insights!
Edited to add: I tried, and I'm getting an UnmatchedArgumentException on the @-file option:
Exception in thread "main" picocli.CommandLine$UnmatchedArgumentException: Unmatched argument at index 0: '@/path/to/configfile'
at picocli.CommandLine$Interpreter.validateConstraints(CommandLine.java:13490)
...
So first I have to get around this: Obviously picoCLI doesn't expand the @-file option unless it's coming directly from the command line.
I did get it to work: several CommandLine instances can indeed work on the same instance of an annotated class, without interfering with each other.
There are some catches and I had to work around a strange picoCLI quirk, but that's not exactly part of an answer to this question, so I explain them in this other question.
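For illustration, a sketch of one possible manual expansion inside the setter (my assumption of a plain one-argument-per-line option file, not necessarily the actual workaround; java.nio.file and java.util imports assumed):

@Option(names = "optionfile")
public void parseOptionFile(String pathAndName) throws IOException {
    // read the file relative to the previously seen srcfolder and parse its
    // lines as if they had been given on the command line
    List<String> args = Files.readAllLines(Paths.get(this.srcfolder, pathAndName));
    new CommandLine(this).parseArgs(args.toArray(new String[0]));
}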
First, some background on why I want this crazy thing: I'm building a plugin for Jenkins that provides an API for scripts that are started from a pipeline script, so they can independently communicate with Jenkins.
For example, a shell script can then tell Jenkins to start a new stage from the running script.
I've got the communication between the script and Jenkins working, but the problem is that I now want to start a stage from a callback in my code, and I can't seem to figure out how to do it.
Stuff I've tried and failed at:
Start a new StageStep.java
I can't seem to find a way to correctly instantiate and inject the step into the lifecycle. I've looked into DSL.java, but can't seem to get to an instance to call invokeStep(), nor was I able to find out how to instantiate DSL.java with the right environment.
Look at StageStepExecution.java and do what it does.
It seems to either invoke the body with an Environment Variable and nothing else, or set some actions and save the state in a config file when it has no body. I could not find out how the Pipeline: Stage View Plugin hooks into this, but it doesn't seem to read the config file. I've tried setting the Actions (even the inner class through reflection) but that did not seem to do anything.
Inject a custom string as Groovy body and call it with csc.newBodyInvoker()
A hacky solution I came up with was just generating the Groovy script and running it like the ParallelStep does. But the sandbox does not allow me to call new GroovyShell().evaluate(""), and if I approve that call, the 'stage' step throws a MissingMethodException. So I also do not instantiate the script with the right environment. Providing the EnvironmentExpander does not make any difference.
Referencing and modifying workflow/{n}.xml
Changing the name of a stage in the relevant workflow/{n}.xml and rebooting the server updates the name of the stage, but modifying my custom stage to look like a regular one does not seem to add the step as a stage.
Stuff I've researched:
Whether some other plugin does something like this, but I couldn't find any example of plugins starting other steps.
How Jenkins handles the scripts and starts the steps, but it seems as though every step is directly called through the method name after the script is parsed, and I found no way to hook into this.
Other plugins using the StageView through other methods, but I could not find any.
Adding an AtomNode as a head onto the running thread, but I couldn't find out how to replace/add the head, and I'm hesitant to mess with Jenkins' threading.
I've spent multiple days on this seemingly trivial call, but I can't seem to figure it out.
So the latest thing I tried actually worked, and is displayed correctly, but it ain't pretty.
I basically reimplemented DSL.invokeStep(), which required me to use reflection a LOT. This is not safe and will of course break with any changes, so I'll open an issue in Jenkins' ticket system in the hope that they will add a public interface for doing this. I'm just hoping this won't give me any weird side effects.
// First, get some environment stuff
CpsThread cpsThread = CpsThread.current();
CpsFlowExecution currentFlowExecution = (CpsFlowExecution) getContext().get(FlowExecution.class);
// instantiate the stage's descriptor
StageStep.DescriptorImpl stageStepDescriptor = new StageStep.DescriptorImpl();
// now we need to put a new FlowNode at the head of the step stack. This is of course not possible directly,
// and everything is outside of the sandbox, so putting the class in the same package doesn't work either
// get the 'head' field
Field cpsHeadField = CpsThread.class.getDeclaredField("head");
cpsHeadField.setAccessible(true);
Object headValue = cpsHeadField.get(cpsThread);
// get its value
Method head_get = headValue.getClass().getDeclaredMethod("get");
head_get.setAccessible(true);
FlowNode currentHead = (FlowNode) head_get.invoke(headValue);
// create a new StepAtomNode starting at the current value of 'head'.
FlowNode an = new StepAtomNode(currentFlowExecution, stageStepDescriptor, currentHead);
// now set this as the new head.
Method head_setNewHead = headValue.getClass().getDeclaredMethod("setNewHead", FlowNode.class);
head_setNewHead.setAccessible(true);
head_setNewHead.invoke(headValue, an);
// Create a new CpsStepContext, and as the constructor is protected, use reflection again
Constructor<?> declaredConstructor = CpsStepContext.class.getDeclaredConstructors()[0];
declaredConstructor.setAccessible(true);
CpsStepContext context = (CpsStepContext) declaredConstructor.newInstance(stageStepDescriptor,cpsThread,currentFlowExecution.getOwner(),an,null);
stageStepDescriptor.checkContextAvailability(context); // Good to check stuff I guess
// Create a new instance of the step, passing in arguments as a Map
Map<String, Object> stageArguments = new HashMap<>();
stageArguments.put("name", "mynutest");
Step stageStep = stageStepDescriptor.newInstance(stageArguments);
// so start the damn thing
StepExecution execution = stageStep.start(context);
// now that we have a callable instance, we set the step on the Cps Thread. Reflection to the rescue
Method mSetStep = cpsThread.getClass().getDeclaredMethod("setStep", StepExecution.class);
mSetStep.setAccessible(true);
mSetStep.invoke(cpsThread, execution);
// Finally. Start running the step
execution.start();
I have two .m files. One is the function and the other one (read.m) reads the function and exports the results into an Excel file. I have a Java program that makes some changes to the .m files. After the changes I want to automate the execution of the .m files. I have downloaded matlabcontrol.jar and I am looking for a way to use it to invoke and run the read.m file, which then reads the function.
Can anyone help me with the code? Thanks
I have tried this code but it does not work.
public static void tomatlab() throws MatlabConnectionException, MatlabInvocationException {
    MatlabProxyFactoryOptions options = new MatlabProxyFactoryOptions.Builder()
            .setUsePreviouslyControlledSession(true)
            .build();

    MatlabProxyFactory factory = new MatlabProxyFactory(options);
    MatlabProxy proxy = factory.getProxy();

    proxy.eval("addpath('C:\\path_to_read.m')");
    proxy.feval("read");
    proxy.eval("rmpath('C:\\path_to_read.m')");

    // close connection
    proxy.disconnect();
}
Based on the official tutorial in the Wiki of the project, it seems quite straightforward to start with this API.
The path manipulation might be a bit tricky, but I would give loading the whole script into a string and passing it to eval a try (please note I have no prior experience with this specific MATLAB library). That could be done quite easily, for example by joining Files.readAllLines().
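Something along these lines, perhaps (untested sketch; the file path is just a placeholder):

// Untested sketch: read the whole read.m and hand it to MATLAB as a single eval call.
String script = String.join("\n",
        Files.readAllLines(Paths.get("C:\\path_to_read\\read.m")));
proxy.eval(script);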
Hope that helps something.
From Java code I am able to run the VBScript by using this code:
Runtime.getRuntime().exec("wscript C:\\ppt\\test1.vbs ");
But I want to know how to call a method of the VBScript from Java. For example, in test1.vbs:
Set objPPT = CreateObject("PowerPoint.Application")
objPPT.Visible = True
Set objPresentation = objPPT.Presentations.Open("C:\ppt\Labo.ppt")
Set objSlideShow = objPresentation.SlideShowSettings.Run.View
Sub ssn1()
    objPPT.Run "C:\ppt\Labo.ppt!.SSN"
End Sub
How do I call only the ssn1() method from Java? Otherwise, can we run a PowerPoint macro from Java code? Kindly help!
This should make you happy :) Go to the WScript section: http://technet.microsoft.com/library/ee156618.aspx
Here's my idea... in your VBScript file, make your script listen to a command line parameter that specifies which method to call. Then, in Java, you would only have to pass this parameter whenever you want to call a specific method in the file.
Otherwise, if you want to access PowerPoint from Java, you will need to access its API like you did in VBScript, which is possible if VBScript can do it, but the approach/syntax may change.
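For instance, the Java side of the first idea could be as simple as this (illustrative only; it assumes test1.vbs inspects WScript.Arguments(0) and calls the matching Sub, which is not shown here):

// Pass the name of the Sub to run as an argument to the script.
Runtime.getRuntime().exec("wscript C:\\ppt\\test1.vbs ssn1");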
I'm not so much into the VBScript side, but if you can expose your VBScript as a COM object, then you can access its methods from Java by using frameworks such as, for example, com4j:
http://com4j.java.net/
The PowerPoint application object's .Run method lets you call any public subroutine or function in any open presentation or loaded add-in.
This post answers the OP's question:
Otherwise can we run the macro of a power point from java code..kindly help!!
(but does not address the original vbscript question)
There's the JACOB library, which stands for Java COM Bridge, and which you can find here: http://sourceforge.net/projects/jacob-project/?source=directory
With it you can invoke Excel, Word, Outlook, PowerPoint application object model methods.
I've tried this with Excel but not PowerPoint. (This is just some sample code, one might want to make it more object oriented.)
public class Excel {
    private static ActiveXComponent xl = null;

    public static void Init() {
        try {
            ComThread.InitSTA();
            xl = ActiveXComponent.connectToActiveInstance("Excel.Application.14");
            // 14 is Office 2010; if you don't know what version, you can use "Excel.Application"
            if (xl == null) {
                // code to launch Excel if not running:
                xl = new ActiveXComponent("Excel.Application");
                Dispatch.put(xl, "Visible", Constants.kTrue);
            }
        }
        catch (Exception e) {
            ComThread.Release();
        }
    }

    public static String Run(String vbName) {
        // Variant v = Dispatch.call(xl, "Run", vbName); // using string name lookup
        Variant v = Dispatch.call(xl, 0x103, vbName);    // using COM offset
        return v.getString();
    }

    public static Variant Run1p(String vbName, Object param) {
        // Variant v = Dispatch.call(xl, "Run", vbName, param);
        return Dispatch.call(xl, 0x103, vbName, param);
    }

    public static Dispatch GetActiveWorksheet() {
        // Dispatch d = xl.getProperty("ActiveSheet").toDispatch();
        Dispatch d = Dispatch.get(xl, 0x133).toDispatch();
        return d; // you may want to put a wrapper around this...
    }
}
Notes:
For Excel, at least, to get Run to invoke a VBA macro/subroutine several things have to be true:
The Excel workbook containing the macro must be "Active" (i.e. it must be the ActiveWorkbook), otherwise Run will not find the VBA subroutine. (However, the workbook does not have to be visible on screen! This means you can call a VBA macro that is in an add-in!)
You can then pass the name of the macro using the following syntax as a string literal:
VBAProjectName.VBAModuleName.SubroutineName
For COM object invocations, you can use the name lookup version or the id number version. The id numbers come from the published COM interfaces (which you can find in C++ header files, or possibly have JACOB look them up for you).
If you successfully did the connection to Excel, be sure to call ComThread.Release() when you're done. Put it in some appropriately surrounding finally. If the process of your Java code terminates without calling it, the COM reference count on Excel will be wrong, and the Excel process will never terminate, even after you exit the Excel application. Once that happens, needless to say, Excel starts to behave screwy then (when you try to use it next, it runs but will fail to load any plug-ins/add-ons). If that happens (as it can during debugging esp. if you are bypassing finally's for better debugging) you have to use the task manager to kill the Excel process.
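Putting these notes together, a typical calling sequence might look roughly like this (sketch only; the macro name is made up, and error handling is omitted):

// Initialize, run a macro by its fully qualified name, and always release the
// COM thread so the Excel process can terminate cleanly.
public static void main(String[] args) {
    try {
        Excel.Init();
        String result = Excel.Run("VBAProject.Module1.MyMacro");
        System.out.println(result);
    } finally {
        ComThread.Release();
    }
}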
I'm writing a debugger for a Z80-emulator we are writing in a school project, using Java. The debugger reads a command from the user, executes it, reads another command, etc.
Commands can either be argument less, have optional arguments, or take an unlimited amount of arguments. Arguments are mostly integers, but occasionally they're strings.
Currently, we're using the Scanner class for reading and parsing input. The read method looks kinda like this (I'm writing this off the top of my head, not paying attention to syntax or correctness).
This was a kludge written in the beginning of the project, which quickly got out of hand as we added more and more commands to the debugger.
The major issues I have with this code are the large amount of repetition, the high level of if/else nesting, and the all-around ugliness.
I would like suggestions on how to make this code more beautiful and modular, and on what kinds of patterns are suitable for this kind of program.
I would also like more general suggestions on code style.
Yup, there is a simpler/better way, especially in Java or other OO languages.
The basic insight, first, is that your command parser is a finite state machine: the START state is an empty line (or index at the start of a line).
Let's think about echo:
$ echo foo bar "bletch quux"
tokenize the line into pieces:
"echo" "foo" "bar" "bletch quux"
In a shell, the grammar is usually verb noun noun noun..., so interpret it that way. You CAN do it with a sequence of if/else-if branches, but a hash is better. You load the hash with the command strings as keys, mapped to something else. That something could be just a number, which would go into a switch:
(this is pseudocode):
enum CmdIndex { ECHO, LS, ... }

Map<String, CmdIndex> cmds = new HashMap<>();
cmds.put("echo", CmdIndex.ECHO); // ...

// get the token into tok
switch (cmds.get(tok)) {
    case ECHO: // process it
        // get each successor token and copy it to stdout
        break;
    ...
    default:
        // didn't recognize the token
        // process errors
}
EVEN better, you can apply the Command and Object Factory patterns. Now you have a Command interface:
public interface Command {
    public void doThis(String[] nouns);
    public Command factory();
}

public class Echo implements Command {
    public void doThis(String[] nouns) {
        // the code is: for each noun in nouns, echo it
    }
    public Command factory() {
        // this clones the object and returns it
    }
}
Now, your code becomes
// Load the hash
Map<String, Command> cmds = new HashMap<>();
cmds.put("echo", new Echo()); // one for each command
// the token is in tok
// the "nouns" or "arguments" are in a String[] nouns
((cmds.get(tok)).factory()).doThis(nouns);
See how this works? You look up the object in the hash. You call the factory method to get a new copy. You then invoke the processing for that command using the doThis method.
Update
This may be a bit too general, in that it uses the Factory pattern. Why have a factory method? Mainly, you'd use that so that each time you execute a command, the "verb" object (like the instance of Echo) can have its own internal state. If you don't need state to persist for a long time, you can simplify this to
(cmds.get(tok)).doThis(nouns);
Now it simply gets and uses the Echo object you created when you instantiated it with cmds.put("echo", new Echo());.
Have you looked at doing the dispatching with a Map? A HashMap would be pretty easy to put in there. Just make the key the command and make an interface or abstract class that represents a command, like this:
interface Command {
    void execute(String args);
}
Or even better you could chop up the arguments in advance:
interface Command {
    void execute(String[] args);
}
Then you would use a HashMap<String, Command>.
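Wiring it up might look something like this (a sketch; EchoCommand and StepCommand are hypothetical classes implementing the Command interface above, inputLine is the line read from the user, and java.util imports are assumed):

// Dispatch sketch: look the verb up in the map and hand it the remaining tokens.
Map<String, Command> commands = new HashMap<>();
commands.put("echo", new EchoCommand());
commands.put("step", new StepCommand()); // hypothetical debugger command

String[] parts = inputLine.trim().split("\\s+");
Command cmd = commands.get(parts[0]);
if (cmd != null) {
    cmd.execute(Arrays.copyOfRange(parts, 1, parts.length));
} else {
    System.out.println("Unknown command: " + parts[0]);
}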