I'm creating a test jar with a bunch of JUnit tests in it. Since I'm testing 3rd-party code, I don't run the tests with "mvn test"; instead I generate an executable jar and run it with "java -jar".
My tests fall into many different categories, and I'd like to choose on the command line which categories are executed and which are not.
I tried the option -Dgroups="categories", but it does not work with "java -jar".
The thing I want to do is something like that:
java -jar -Dcategories="cat1,cat2,cat5" executable.jar
The only way I see about reading this on my code is with
System.getProperty("categories");
I tried using something like that:
@Before
public void setup() {
Assume.assumeTrue(System.getProperty("categories") != null && System.getProperty("categories").contains("cat1"));
}
It skips the test but still gives me a stack trace, which looks pretty bad.
Any other option to skip the tests?
The Java documentation says:
java [ options ] -jar file.jar [ argument ... ]
so in your case
java -jar executable.jar cat1 cat2 cat5
public static void main(String[] args) {
for (String input : args) {
if (input.equalsIgnoreCase("cat1")) {
runTests(TestOne.class);
} else if (input.equalsIgnoreCase("cat2")) {
runTests(TestTwo.class);
} // ...and so on, mapping each remaining category to its test class
}
}
private static void runTests(Class<?> test) {
Result result = JUnitCore.runClasses(test);
for (Failure failure : result.getFailures()) {
System.out.println(failure.toString());
}
}
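If you'd rather keep the -Dcategories system property from the question instead of plain program arguments, the selection logic only needs to split the property value into a set of names. A minimal sketch (the class name is hypothetical; the "categories" key is from the question, and treating an absent property as "run nothing" is an assumption):

```java
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;

public class CategorySelector {

    // Splits a comma-separated -Dcategories value into a set of names;
    // an absent property yields an empty set (assumed to mean "run nothing").
    static Set<String> parse(String property) {
        Set<String> result = new LinkedHashSet<>();
        if (property == null) {
            return Collections.unmodifiableSet(result);
        }
        for (String part : property.split(",")) {
            String trimmed = part.trim();
            if (!trimmed.isEmpty()) {
                result.add(trimmed);
            }
        }
        return Collections.unmodifiableSet(result);
    }

    public static void main(String[] args) {
        // In the executable jar this would be System.getProperty("categories").
        Set<String> enabled = parse("cat1, cat2,cat5");
        System.out.println(enabled.contains("cat2")); // cat2 tests should run
        System.out.println(enabled.contains("cat3")); // cat3 tests should not
    }
}
```

Each enabled name can then be mapped to its test classes and handed to JUnitCore.runClasses, exactly as in the main method above.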
I am very new to Sonar.
I am trying to make my own plugin for Sonar. After downloading the plugin example, I made it Eclipse-ready using mvn eclipse:eclipse and imported it into my workspace. It compiles fine.
But I need to add my own Rule files to it.
For that purpose, I have created 2 files.
MyCustomNLSRuleTest.java
package org.sonar.samples.java.checks;
import org.junit.Test;
import org.sonar.java.checks.verifier.JavaCheckVerifier;
public class MyCustomNLSRuleTest {
@Test
public void check() {
// Verifies that the check will raise the adequate issues with the expected message.
// In the test file, lines which should raise an issue have been commented out
// by using the following syntax: "// Noncompliant {{EXPECTED_MESSAGE}}"
JavaCheckVerifier.verify("src/test/files/MissingCheck.java", new MyCustomSubscriptionRule());
}
}
The actual rule is provided to me in the following java file which looks like below -
MissingCheck.java
public class MissingCheck extends Check
{
private HashMap<Integer, Integer> lineStringMap;
@Override
public void beginTree(DetailAST aRootAST) {
super.beginTree(aRootAST);
lineStringMap = new HashMap<>();
}
@Override
public int[] getDefaultTokens() {
return new int[] { TokenTypes.STRING_LITERAL};
}
@Override
public void visitToken(DetailAST ast) {
DetailAST parent = ast.getParent();
if (parent != null) {
DetailAST grandpa = parent.getParent();
if (isAnnotation(grandpa.getType())) {
return;
}
}
Integer count = lineStringMap.get(ast.getLineNo());
if (count == null) {
count = 1;
} else {
count++;
}
FileContents contents = getFileContents();
String[] line = contents.getLines();
if (line.length >= ast.getLineNo()) {
String l = line[ast.getLineNo() - 1];
if (!l.contains("$NON-NLS-" + count + "$")) {
log(ast.getLineNo(), "String_Not_Externalized", new Object[] { ast.getText() });
}
}
lineStringMap.put(ast.getLineNo(), count);
}
/**
* Checks if type is an annotation.
* @param type the type to check
* @return <code>true</code> if type is an annotation.
*/
private boolean isAnnotation(int type) {
return (type >= TokenTypes.ANNOTATION_DEF && type <= TokenTypes.ANNOTATION_ARRAY_INIT);
}
}
But when I run mvn clean package on this project, it gives me this error:
Results :
Tests in error:
MyCustomNLSRuleTest.check:13 » IllegalState At least one issue expected
Tests run: 8, Failures: 0, Errors: 1, Skipped: 0
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ---------------------------------------------------------------------
Any idea, how I can add a new rule in the plugin?
Thanks!
Okay, it seems that you are very far from what you are supposed to do when writing custom rules for the java plugin... A few questions first:
Did you actually try to have a look at the dedicated page from the SonarQube confluence? http://docs.sonarqube.org/display/DEV/Custom+Rules+for+Java
Did you actually look at the following links before trying to write a rule?
The rule already implemented in the example project,
How they are tested,
With which test files,
How they are registered in the custom plugin,
The comments from the unit test you are actually writing.
Now... Let's start by explaining to you what you are currently doing, as apparently it's not clear at all.
You created a test file called MyCustomNLSRuleTest.java, which should theoretically correspond to a rule called MyCustomNLSRule. Note that it's probably not the case, as you are saying that the rule is provided to you in the MissingCheck.java file.
Your unit test uses JavaCheckVerifier to verify that the file provided as argument, "src/test/files/MissingCheck.java", will raise all the expected issues when playing the rule MyCustomSubscriptionRule against it.
At this point, you are not testing your MissingCheck at all, but using it as data for the MyCustomSubscriptionRule rule... And that's probably your main issue.
However, if this is actually really what you are trying to achieve, it means that:
You modified the rule MyCustomSubscriptionRule to have a custom behavior, different from the one from the original example project.
When executing it on the file MissingCheck.java, the check is supposed to raise issues (with each line expected to raise an issue being flagged with the comment // Noncompliant {{expected message}})
Your custom rule does not work, as it apparently raised no issue at all when playing the test.
Please look at all the links provided above to see how custom rules work, what is available in the java plugin API, and what you can achieve with it.
I have pw_check.java, and I need to run it first with an argument and then without an argument in the terminal.
java pw_check -g
java pw_check
But with the second command, without an argument, the program throws an exception. How can I handle this to meet my requirement?
Check the code pw_check.java.
Probably there is something like
public static void main(String[] args) {
// Code accessing args[0]
}
This will cause an error if you don't have a parameter.
Modify it with code similar to the following:
public static void main(String[] args) {
String arg = DEFAULT_ARG;
if (args.length == 1) {
arg = args[0];
}
... // Code using arg DEFAULT or passed value
}
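A runnable version of this pattern, with a hypothetical default value standing in for whatever pw_check should assume when no flag is given:

```java
public class PwCheck {

    // Hypothetical default; substitute whatever pw_check should
    // assume when it is started without a flag.
    static final String DEFAULT_ARG = "none";

    // Falls back to the default instead of blindly reading args[0].
    static String resolveArg(String[] args) {
        return args.length >= 1 ? args[0] : DEFAULT_ARG;
    }

    public static void main(String[] args) {
        String arg = resolveArg(args);
        System.out.println("Running with option: " + arg);
    }
}
```

Now `java PwCheck -g` prints "Running with option: -g", while `java PwCheck` falls back to the default instead of throwing an ArrayIndexOutOfBoundsException.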
My Java application consists of two parts:
core libraries (classes, interfaces, etc)
command line interface (CLI), which uses the core libraries
For 1. I use JUnit for unit testing, but what would you do for 2.?
How can I create automated tests for a command line interface?
I had the exact same problem, landed here and didn't find a good answer, so I thought I would post the solution I eventually came to as a starting point for anyone who lands here in the future.
I wrote my tests after the CLI (shame on me, I know), so first I made sure the CLI was written in a testable way. It looks something like this (I've omitted the exception handling and simplified a lot to make it more readable):
public class CLI {
public static void main(String... args) {
new CLI(args).startInterface();
}
CLI(String... args) {
System.out.println("Welcome to the CLI!");
// parse args, load resources, etc
}
void startInterface() {
BufferedReader consoleReader = new BufferedReader(new InputStreamReader(System.in));
while (true) {
String[] input = sanitiseInput(consoleReader.readLine());
if (input[0].equalsIgnoreCase("help")) {
help();
} else if (input[0].equalsIgnoreCase("exit")) {
break;
} else if (input[0].equalsIgnoreCase("save")) {
save(input);
} else {
System.out.println("Unknown command.");
}
}
}
String[] sanitiseInput(String rawInput) {
// process the input and return each part of it in order in an array, something like:
return rawInput.trim().split("[ \t]+");
}
void help() {
// print help information
System.out.println("Helpful help.");
}
void save(String[] args) {
// save something based on the argument(s)
}
}
On to testing. CLI is not a part of the public libraries, so it should be protected from library users. As is mentioned here, you can use the default access modifier to make it package private. This gives your tests full access to the class (as long as they are in the same package) while still protecting it, so that's that taken care of.
Writing a method for each command accepted by the CLI allows JUnit tests to almost perfectly simulate user input. Since the object won't read from stdin until you call startInterface(), you can simply instantiate it and test the individual methods.
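You can also drive the whole read loop from a test by swapping stdin, the mirror image of the System.setOut substitution used for output further down. A minimal, self-contained sketch of the technique (the scripted commands and class name are arbitrary):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class StdinSwapSketch {
    public static void main(String[] args) throws IOException {
        InputStream realStdin = System.in;
        try {
            // Script two commands, ending with "exit" so a CLI loop would terminate.
            System.setIn(new ByteArrayInputStream(
                    "help\nexit\n".getBytes(StandardCharsets.UTF_8)));
            // A CLI's startInterface() would build this reader itself and
            // consume the scripted lines exactly as if a user had typed them.
            BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
            System.out.println(reader.readLine()); // help
            System.out.println(reader.readLine()); // exit
        } finally {
            System.setIn(realStdin); // always restore the real stdin
        }
    }
}
```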
First, it's good to test that the raw input is being correctly sanitised, which you can do trivially by writing JUnit tests for sanitiseInput(). I wrote tests like this:
@Test
public void commandAndArgumentsSeparatedBySpaces() throws Exception {
String[] processedInput = uut.sanitiseInput("command argument1 argument2");
assertEquals("Wrong array length.", 3, processedInput.length);
assertEquals("command", processedInput[0]);
assertEquals("argument1", processedInput[1]);
assertEquals("argument2", processedInput[2]);
}
It's easy to cover some edge cases too:
@Test
public void leadingTrailingAndIntermediaryWhiteSpace() throws Exception {
String[] processedInput = uut.sanitiseInput(" \t this \twas \t \t a triumph \t\t ");
assertEquals("Wrong array length.", 4, processedInput.length);
assertEquals("this", processedInput[0]);
assertEquals("was", processedInput[1]);
assertEquals("a", processedInput[2]);
assertEquals("triumph", processedInput[3]);
}
Next we can test the individual command methods by monitoring stdout. I did this (which I found here):
private CLI uut;
private ByteArrayOutputStream testOutput;
private PrintStream console = System.out;
private static final String EOL = System.getProperty("line.separator");
@Before
public void setUp() throws Exception {
uut = new CLI();
testOutput = new ByteArrayOutputStream();
}
@Test
public void helpIsPrintedToStdout() throws Exception {
try {
System.setOut(new PrintStream(testOutput));
uut.help();
} finally {
System.setOut(console);
}
assertEquals("Helpful help." + EOL, testOutput.toString());
}
In other words, substitute the JVM's out with something you can query just before the exercise, and then set the old console back in the test's teardown.
Of course, CLI applications often do more than just print to the console. Supposing your program saves information to a file, you could test it as such (as of JUnit 4.7):
@Rule
public TemporaryFolder tempFolder = new TemporaryFolder();
@Test
public void informationIsSavedToFile() throws Exception {
File testFile = tempFolder.newFile();
String expectedContent = "This should be written to the file.";
uut.save(new String[] { testFile.getAbsolutePath(), expectedContent });
try (Scanner scanner = new Scanner(testFile)) {
String actualContent = scanner.useDelimiter("\\Z").next();
assertEquals(expectedContent, actualContent);
}
}
JUnit will take care of creating a valid file and removing it at the end of the test run, leaving you free to test that it is properly treated by the CLI methods.
For any CLI you can use BATS (Bash Automated Testing System):
The test-specification from the docs is a script-file like example.bats:
#!/usr/bin/env bats
@test "addition using bc" {
result="$(echo 2+2 | bc)"
[ "$result" -eq 4 ]
}
@test "addition using dc" {
result="$(echo 2 2+p | dc)"
[ "$result" -eq 4 ]
}
When executing with the bats command, the output looks like this:
$ bats example.bats
✓ addition using bc
✓ addition using dc
2 tests, 0 failures
See related tag for more questions: bats-core
I am currently developing a program that can execute JUnit test cases on external classes. These external classes are sent in by students and we would like to evaluate them.
I have the following test case
import static org.junit.Assert.*;
import org.junit.Test;
public class Task1Test {
@Test
public void testAdd() {
Task1 t = new Task1();
int a = 5;
int b = 11;
assertEquals("Wrong add result", a+b, t.add(a,b));
}
}
and I compiled it with:
$ javac -cp .:../lib/junit/junit-4.11.jar Task1Test.java
The Task1 will be a student's class, but for now it is just a sample class with an add method that will return a wrong result. The file Task1.java is located in the same folder as Task1Test.java.
In my program I load the test case class and try to run it with JUnitCore:
String testsPath = "/path/to/classes";
String junitJar = "/path/to/junit-4.11.jar";
URL taskUrl = new File(testsPath).toURI().toURL();
URL junitUrl = new File(junitJar).toURI().toURL();
URL[] urls = new URL[] { taskUrl, junitUrl };
#SuppressWarnings("resource")
ClassLoader loader = new URLClassLoader(urls);
Class<?> clazz = loader.loadClass(task);
Result res = JUnitCore.runClasses(clazz);
if(!res.wasSuccessful()) {
for(Failure f : res.getFailures()) {
System.out.println(f.toString());
}
}
However, it does not work as expected. When I run this code, I get this message:
initializationError(Task1Test): No runnable methods
When I look into the loaded class using reflection, I can see that the method testAdd has no annotation (i.e. method.getAnnotation(org.junit.Test.class) returns null).
Does anyone have an idea? Did I forget a compiler switch or anything?
I am using Java 1.7.0_11, and the code runs in a web application on Glassfish 3.1.2.2.
EDIT:
I can run the test case from command line with:
$ java -cp .:../../code/lib/junit/junit-4.11.jar:../../code/lib/junit/hamcrest-core-1.3.jar org.junit.runner.JUnitCore Task1Test
I found a solution in this answer.
I did not set a parent class loader, which seems to have caused the trouble. After setting it as described in that answer, the test now executes.
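For reference, the shape of the fix can be sketched with the standard library only: pass an explicit parent to the URLClassLoader, so that shared classes such as org.junit.Test resolve through one loader instead of being loaded twice. The class name and classpath URL below are placeholders:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ParentLoaderSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder path; in the question this would be /path/to/classes.
        URL[] urls = { new File("external-classes").toURI().toURL() };

        // Explicit parent: anything not found under the URLs above is
        // delegated to the loader that loaded this class, so both sides
        // agree on shared classes.
        ClassLoader parent = ParentLoaderSketch.class.getClassLoader();
        try (URLClassLoader loader = new URLClassLoader(urls, parent)) {
            // java.lang.String is not under the URLs, so delegation kicks in:
            Class<?> c = loader.loadClass("java.lang.String");
            System.out.println(c == String.class); // true: one class, one loader
        }
    }
}
```

With JUnit, the parent would typically be JUnitCore.class.getClassLoader(), so the @Test annotation class JUnitCore compares against is the very one seen on the loaded student class.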
I want to run google closure compiler on many javascript files from python.
I use this:
subprocess.Popen('/usr/bin/java -jar /var/www/compiler.jar --js "%s" --js_output_file "%s"' % (fullname, dest_filename),shell=True);
But as I understand it, this creates a Java VM for each process.
So this eats all the RAM; I can only hear my HDD grinding as swap is used,
and the system almost hangs. Can I somehow tell Java to use only one VM for all the started processes?
Maybe I wrote something incorrectly; I totally don't know Java. Sorry for that.
Possible answer 1:
The Google closure compiler does accept multiple input files with a syntax like this:
java -jar compiler.jar --js=in1.js --js=in2.js ... --js_output_file=out.js
This will produce only one output file that is the combination of all of the inputs. But this might not be what you want if you're trying to compile each file separately.
Possible answer 2:
It would not be hard to write a small wrapper script (using bash, python, or your favorite scripting language) that accepts pairs of parameters, e.g.
wrapper.sh in1.js out1.js in2.js out2.js ...
The code in wrapper.sh could loop over the pairs of parameters and call java -jar compiler.jar --js=xxx --js_output_file=yyy repeatedly, waiting for each to complete before beginning the next. This would have the benefit of not starting the processes in parallel, so at least you wouldn't have (potentially) many JVMs running at the same time, though you still pay the cost of restarting the JVM for each run.
Possible answer 3:
If you really want just one JVM, then there is no way to do what you ask without writing a little bit of Java code (as far as I know). If you are familiar with Java, you could copy the source code of CommandLineRunner.java and modify it to suit your needs.
Or perhaps even easier, just write a small Java class whose main function simply invokes the CommandLineRunner main any number of times, passing in appropriate parameters to simulate a normal command line invocation. Here's something quick and dirty that would do the trick (hat tip to VonC):
import com.google.javascript.jscomp.CommandLineRunner;
import java.security.Permission;
public class MyRunner {
public static void main(String [] args) {
// Necessary since the closure compiler calls System.exit(...).
System.setSecurityManager(new NoExitSecurityManager());
for (int i=0; i<args.length; i+=2) {
System.out.println("Compiling " + args[i] + " into " + args[i+1] + "...");
try {
CommandLineRunner.main(new String[] {
"--js=" + args[i],
"--js_output_file=" + args[i+1]
});
}
catch (ExitException ee) {
System.out.println("Finished with status: " + ee.getStatus());
}
}
}
private static class ExitException extends SecurityException {
private int status;
public ExitException(int status) { this.status = status; }
public int getStatus() { return status; }
}
private static class NoExitSecurityManager extends SecurityManager {
public void checkPermission(Permission p) { }
public void checkPermission(Permission p, Object context) { }
public void checkExit(int status) { throw new ExitException(status); }
}
}
Compile it with something like this:
javac -classpath compiler.jar MyRunner.java
Run it with something like this:
java -classpath .:compiler.jar MyRunner in1.js out1.js in2.js out2.js ...
And see output like this:
Compiling in1.js into out1.js...
Finished with status: 0
Compiling in2.js into out2.js...
Finished with status: 0