I'm running a method annotated with @Test and I want to get a reference to the JUnitCore object that invokes the method by reflection.
How can I get a reference to that object, if it's possible at all? (Maybe it's a security issue.)
I tried reflection and a ClassLoader but I couldn't make it work.
Thanks
JUnitCore is the basic entry point for JUnit tests. It works by building a list of classes from the java command-line arguments and using them to create a Runner with which it runs the test cases.
At no point during processing does the main method in JUnitCore pass a reference to the JUnitCore instance it creates to any other object. As such, the instance is not retrievable, either directly or with reflection.
The relevant code in JUnitCore is as follows:
public static void main(String... args) {
    runMainAndExit(new RealSystem(), args);
}

public static void runMainAndExit(JUnitSystem system, String... args) {
    Result result= new JUnitCore().runMain(system, args);
    system.exit(result.wasSuccessful() ? 0 : 1);
}

public Result runMain(JUnitSystem system, String... args) {
    system.out().println("JUnit version " + Version.id());
    List<Class<?>> classes= new ArrayList<Class<?>>();
    List<Failure> missingClasses= new ArrayList<Failure>();
    for (String each : args)
        try {
            classes.add(Class.forName(each));
        } catch (ClassNotFoundException e) {
            system.out().println("Could not find class: " + each);
            Description description= Description.createSuiteDescription(each);
            Failure failure= new Failure(description, e);
            missingClasses.add(failure);
        }
    RunListener listener= new TextListener(system);
    addListener(listener);
    Result result= run(classes.toArray(new Class[0]));
    for (Failure each : missingClasses)
        result.getFailures().add(each);
    return result;
}

... // and more
Nowhere in this implementation is a reference to this passed as an argument, so you cannot get a reference to it.
The only option is to create a JUnitCore instance yourself and run the tests with it:
JUnitCore junit = new JUnitCore();
// we can add a listener to listen for events as we run the tests
junit.addListener(new RunListener() {
    @Override
    public void testFailure(Failure failure) throws Exception {
        System.out.println("failed " + failure);
    }
});
Result result = junit.run(Class.forName(nameOfTestSuite));
Related
I have to unit test the method below, but all of its lines call the third-party AWS library. The method also returns nothing, so the only test I can do is verify the exception. Is there any other test I can write to improve the code coverage?
public void multipartUpload() throws InterruptedException {
    TransferManager tm = TransferManagerBuilder.standard()
            .withS3Client(s3Client)
            .withMultipartUploadThreshold(1024L)
            .build();
    PutObjectRequest request = new PutObjectRequest(bucketName, keyName, filePath);
    Upload upload = tm.upload(request);
    upload.waitForCompletion();
}
Let's look at the code that needs to be tested:
public class DemoCodeCoverage {

    public void showDemo(LibraryCode library) {
        System.out.println("Hello World!");
        library.runDemoApplication();
        // Extract the code below into a method, since LibraryCode is not passed in,
        // then skip running that method in the test
        // LibraryCode library = new LibraryCode();
        // library.runDemoApplication_1();
        // library.runDemoApplication_2();
        // library.runDemoApplication_3();
        System.out.println("World ends here!");
    }

    public boolean showBranchingDemo(boolean signal) {
        if (signal) {
            signalShown();
        } else {
            noSignal();
        }
        return signal;
    }

    public void signalShown() {
        System.out.println("signalShown!");
    }

    public void noSignal() {
        System.out.println("NoSignal!");
    }
}

public class LibraryCode {

    // The library can be AWS/database code which needs authentication.
    // That authentication is not a concern for our unit tests,
    // but it will still throw an exception when we run them.
    public void runDemoApplication() {
        throw new RuntimeException();
    }
}
The tests below give good code coverage:
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.times;

import org.junit.Test;
import org.mockito.Mockito;

public class DemoCodeCoverageTest {

    @Test
    public void testShowDemo() {
        DemoCodeCoverage t = Mockito.spy(new DemoCodeCoverage());
        LibraryCode lib = Mockito.mock(LibraryCode.class);
        Mockito.doNothing().when(lib).runDemoApplication();
        t.showDemo(lib);
        // when(bloMock.doSomeStuff()).thenReturn(1);
        // doReturn(1).when(bloMock).doSomeStuff();
    }

    @Test
    public void testShowBranchingDemo() {
        DemoCodeCoverage t = Mockito.spy(new DemoCodeCoverage());
        assertEquals(true, t.showBranchingDemo(true));
        assertEquals(false, t.showBranchingDemo(false));
    }

    @Test
    public void testSignalShown() {
        DemoCodeCoverage t = Mockito.spy(new DemoCodeCoverage());
        t.showBranchingDemo(true);
        Mockito.verify(t, times(1)).signalShown();
    }

    @Test
    public void testNoSignal() {
        DemoCodeCoverage t = Mockito.spy(new DemoCodeCoverage());
        t.showBranchingDemo(false);
        Mockito.verify(t, times(1)).noSignal();
    }
}
Below are the steps to increase the test code coverage:
Case_1: Testing a void method
Assume you have a method that takes no parameters and returns nothing.
public void printHelloWorld() {
    System.out.println("Hello World");
}
You can still write a test that calls this method and checks that it returns without any runtime exception.
We haven't really tested anything here other than giving our tests a way to run the code, which increases the code coverage.
Additionally you can verify the invocation:
Mockito.verify(instance, times(1)).printHelloWorld();
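For that verification to work, the object has to be a Mockito mock or spy. A minimal sketch, assuming JUnit 4 and Mockito, where HelloPrinter is a hypothetical class containing the printHelloWorld() method above:

import static org.mockito.Mockito.spy;
import static org.mockito.Mockito.times;
import static org.mockito.Mockito.verify;

import org.junit.Test;

public class HelloPrinterTest {

    @Test
    public void testPrintHelloWorldIsInvoked() {
        // Wrap the real object in a spy so invocations can be verified
        HelloPrinter instance = spy(new HelloPrinter());

        instance.printHelloWorld();

        // The spy records the call, so the verification passes
        verify(instance, times(1)).printHelloWorld();
    }
}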
There are circumstances where you cannot test more than that, for example when it is a third-party library call; the library is presumably already tested, and we just need to run through it.
@Test
public void testPrintHelloWorld() {
    // may be a Hibernate call or another 3rd-party method call
    instance.printHelloWorld();
}
If your tool is not strict about 100% code coverage, you can even exclude such methods and justify the exclusion.
Case_2: Testing a method that creates an object and calls another method on it
Assume you have a method that calls the DB to add an entry to the Hello_World table and also prints to the console, like below.
public void printHelloWorld() throws DBException {
    DBConnection db = new DBConnection();
    db.createEntry(TABLE_NAME, "Hello World");
    System.out.println("Hello World");
}
You can extract the DB code into a new method and then test it separately.
public void printHelloWorld() throws DBException {
    makeHelloWorldEntryInTable();
    System.out.println("Hello World");
}

public void makeHelloWorldEntryInTable() throws DBException {
    DBConnection db = new DBConnection();
    db.createEntry(TABLE_NAME, "Hello World");
}
While testing against the DB you would expect a DBException, since this is just a unit test. So write one test with @Test(expected = DBException.class) for makeHelloWorldEntryInTable, and another test for printHelloWorld() that skips the makeHelloWorldEntryInTable call, like below. This increases the code coverage.
@Test(expected = DBException.class)
public void testMakeHelloWorldEntryInTable() throws Exception {
    // This can be any third-party library call that cannot be configured for UT.
    // One example is testing whether an AWS bucket exists.
    // instance is a real instance of the class under test
    instance.makeHelloWorldEntryInTable();
}

@Test
public void testPrintHelloWorld() throws Exception {
    // localInstance is a Mockito spy of the class under test
    Mockito.doNothing()
            .when(localInstance)
            .makeHelloWorldEntryInTable();
    localInstance.printHelloWorld();
}
Case_3: If you have a private method, make it package-private (default access) and test it directly. This also improves the code coverage.
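A minimal sketch of that approach, using a hypothetical Calculator class: drop the private modifier so the method is package-private, and put the test in the same package.

// File: Calculator.java (production code, hypothetical example)
package com.example.demo;

public class Calculator {

    // Was private; package-private access lets a test in the same package call it
    int addInternal(int a, int b) {
        return a + b;
    }
}

// File: CalculatorTest.java (same package as the class under test)
package com.example.demo;

import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class CalculatorTest {

    @Test
    public void testAddInternal() {
        assertEquals(5, new Calculator().addInternal(2, 3));
    }
}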
I want to run integration tests only if they have a given annotation. The thing is that the test cases need some variables which have to be initialized in @Before and destroyed in @After.
I wrote code which executes the tests that have the given annotation, but they all fail because the variables that should be initialized in the @Before phase are null.
I first invoke the @Before phase (so I assume the variables are initialized), then run the test method, then invoke the @After phase. But I get a NullPointerException in the test method.
How do I initialize the variables for the test methods? Isn't it enough to invoke the @Before phase?
The code I have:
public static void main(String[] args) throws Exception {
    Class<TodoMapperTest> obj = TodoMapperTest.class;
    int passed = 0;
    int failed = 0;
    int count = 0;
    for (Method method : obj.getDeclaredMethods()) {
        if (method.isAnnotationPresent(Before.class))
            method.invoke(obj.newInstance());
        if (method.isAnnotationPresent(DEV.class)) {
            try {
                method.invoke(obj.newInstance());
                System.out.printf("%s - Test '%s' - passed %n", ++count, method.getName());
                passed++;
            } catch (Throwable ex) {
                System.out.printf("%s - Test '%s' - failed: %s %n", ++count, method.getName(), ex);
                failed++;
            }
        }
        if (method.isAnnotationPresent(After.class))
            method.invoke(obj.newInstance());
    }
    System.out.printf("%nResult : Total : %d, Passed: %d, Failed %d%n", count, passed, failed);
}
@Before phase:
TodoQueryMapper mapper;
SqlSession session;

@Before
public void setUp() throws Exception {
    InputStream inputStream = Resources.getResourceAsStream("todo-mybatis/mybatis-test.xml");
    SqlSessionFactory sqlSessionFactory = new SqlSessionFactoryBuilder().build(inputStream);
    inputStream.close();
    session = sqlSessionFactory.openSession();
    mapper = session.getMapper(TodoQueryMapper.class);
}
Edit:
Test case:
@Test
@DEV
public void test_case() throws Exception {
    SqlParams params = new SqlParams();
    params.idList = Collections.singletonList(1234567);
    // In here, the 'mapper' variable is null, even though @Before was invoked
    List<TodoDto> data = mapper.findByIdList(params);
    assertEquals(1, data.size());
}
DEV Annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface DEV {
}
Each time you invoke a method via reflection you create a new instance of your test class: method.invoke(obj.newInstance()). So the @Before method, the test method, and the @After method each run on a different object.
You should separate your test execution into three phases: Before, Test and After. Iterate over the test methods, find the Before and After methods, and execute them on the same instance in the desired order.
Pseudo code:
Class<AccountDaoMapperTest> objClass = AccountDaoMapperTest.class;
for (Method testMethod : findTestMethods(objClass)) {
    AccountDaoMapperTest objUnderTest = objClass.newInstance();
    findBeforeMethod(objClass).invoke(objUnderTest);
    testMethod.invoke(objUnderTest);
    findAfterMethod(objClass).invoke(objUnderTest);
}
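The helper methods in the pseudo code could be implemented roughly like this (a sketch, assuming the custom @DEV annotation from the question marks the test methods):

// Sketch of the lookup helpers used in the pseudo code above
private static List<Method> findTestMethods(Class<?> clazz) {
    List<Method> result = new ArrayList<>();
    for (Method m : clazz.getDeclaredMethods()) {
        if (m.isAnnotationPresent(DEV.class)) {
            result.add(m);
        }
    }
    return result;
}

private static Method findBeforeMethod(Class<?> clazz) {
    for (Method m : clazz.getDeclaredMethods()) {
        if (m.isAnnotationPresent(Before.class)) {
            return m;
        }
    }
    return null; // no @Before method present
}

private static Method findAfterMethod(Class<?> clazz) {
    for (Method m : clazz.getDeclaredMethods()) {
        if (m.isAnnotationPresent(After.class)) {
            return m;
        }
    }
    return null; // no @After method present
}

In real code you would null-check the Before/After lookups before invoking them.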
As stated in the Java documentation (https://docs.oracle.com/javase/1.5.0/docs/guide/language/varargs.html):
There is a strong synergy between autoboxing and varargs, which is illustrated in the following program using reflection:
// Simple test framework
public class Test {
    public static void main(String[] args) {
        int passed = 0;
        int failed = 0;
        for (String className : args) {
            try {
                Class c = Class.forName(className);
                c.getMethod("test").invoke(c.newInstance());
                passed++;
            } catch (Exception ex) {
                System.out.printf("%s failed: %s%n", className, ex);
                failed++;
            }
        }
        System.out.printf("passed=%d; failed=%d%n", passed, failed);
    }
}
But I don't understand how or where those getMethod and invoke methods use the autoboxing concept here.
NOTE: I know these two methods are varargs-based, but where is autoboxing being used here?
I figured out how to use the Ant API to run a JUnit test and create an XML report of the result.
String pathToReports = "/tmp/junitreports";
Project project = new Project();
JUnitTest test = null;
try {
    new File(pathToReports).mkdir();
    JUnitTask task = new JUnitTask();
    project.setProperty("java.io.tmpdir", pathToReports);
    task.setProject(project);
    FormatterElement.TypeAttribute type = new FormatterElement.TypeAttribute();
    type.setValue("xml");
    FormatterElement formater = new FormatterElement();
    formater.setType(type);
    task.addFormatter(formater);
    test = new JUnitTest(TestClass.class.getName());
    test.setTodir(new File(pathToReports));
    task.addTest(test);
    task.execute();
}
...
TestClass:
public class TestClass {
    @Test
    public void test() {
        fail("failed");
    }
}
The code works just fine. The XML is created and I can see that the test failed.
Now my question: is there any way to also get the test results programmatically? I expected to somehow get an updated version of the JUnitTest object, where I could call the failureCount() method.
test.failureCount() after execution of the task returns 0, of course. Parsing the XML seems odd to me, since the number of failures should already be stored somewhere.
You could have a variable (int failedTests) that is incremented each time a test fails. You can do this with a TestWatcher rule, as sketched below.
After all the tests have run, you could print it out (or do whatever you want with it) with:
@AfterClass
public static void printFailedTestsCount() {
    System.out.println(failedTests + " tests failed.");
}
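A minimal sketch of such a rule, assuming JUnit 4; MyTests is a hypothetical test class, and the field and rule sit next to the @AfterClass method shown above:

import org.junit.Rule;
import org.junit.rules.TestWatcher;
import org.junit.runner.Description;

public class MyTests {

    static int failedTests = 0;

    // Counts every test in this class that fails
    @Rule
    public TestWatcher watcher = new TestWatcher() {
        @Override
        protected void failed(Throwable e, Description description) {
            failedTests++;
        }
    };

    // ... @Test methods and the @AfterClass method from above ...
}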
I am testing with the wonderful TestNG framework. My question is whether it is possible to set the attributes of the @Test annotation in the testng.xml configuration file.
I don't want to hard-code the @Test annotation like
@Test(dataProvider = "dataFileProvider", dataProviderClass = TestDataProvider.class)
I want to configure it in testng.xml.
I have two ideas on this:
Workaround 1: StaticProvider
You can easily change the static provider if needed.
Workaround 2: Annotation Transformer
Never tried that, but it should work even if you have to grab the XML data manually.
Looking forward to Mr. Beust's answer... ;)
The short answer is: no, you can't add annotations to your code from testng.xml.
You can modify existing annotations with an Annotation Transformer, as explained by Frank.
Sometimes you just really want to do something and you can't, like accessing private variables to fix memory leaks. Figuring out how to do things like this, despite the fact that you can't, is fun. If you really want to, I might suggest running your suite using the TestNG object and loading the testng.xml file before running.
Personally, I like using 'mvn test', and unfortunately, adding the pom.xml configuration to run from a testng.xml file requires that you supply a testng.xml file, so plain 'mvn test' won't work. Always make sure what 95% of programmers use works, then allow overriding.
So I might suggest extending the testng.xml format yourself and writing some code that reads the testng.xml file and configures annotations using an annotation transformer class.
Here is some code to get you started:
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

import org.testng.IAnnotationTransformer;
import org.testng.ITestResult;
import org.testng.TestListenerAdapter;
import org.testng.TestNG;
import org.testng.annotations.ITestAnnotation;
import org.testng.xml.XmlClass;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;

public class TestNGSuite {

    public static void main(String[] args) {
        System.out.println("main start");
        try {
            // "demoTest" is a hypothetical @Test method name in Demo.class
            new TestNGSuite(new Class[]{ Demo.class }, "demoTest");
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("main finish");
    }

    public TestNGSuite(Class[] classes, String methodName) throws Exception {
        // Create Suite List
        List<XmlSuite> suites = new ArrayList<XmlSuite>();

        // Add Suite to Suite List
        XmlSuite suite = new XmlSuite();
        suites.add(suite);
        suite.setName("MyTestSuite");

        // Add Test to Suite
        XmlTest test = new XmlTest(suite);
        test.setName("MyTest");

        // Add Class List to Test
        List<XmlClass> xmlClasses = new ArrayList<XmlClass>();
        test.setXmlClasses(xmlClasses);

        // Add Class to Class List
        for (Class clazz : classes) {
            XmlClass xmlClass = new XmlClass(clazz);
            xmlClasses.add(xmlClass);
        }

        // Run TestNG
        TestNG testNG = new TestNG();
        testNG.setXmlSuites(suites);
        testNG.addListener(new TestNGAnnotationTransformer(methodName));
        testNG.addListener(new TestNGSuiteConsoleLogger());
        testNG.run();
        if (testNG.hasFailure()) { // Throw an exception to make the mvn goal fail
            throw new Exception("Failed Tests");
        }
    }

    public static class TestNGSuiteConsoleLogger extends TestListenerAdapter {
        @Override
        public void onTestFailure(ITestResult tr) {
            // Console is a custom logging helper (not shown); System.out would also work
            Console.log(TestNGSuiteConsoleLogger.class, "FAILURE:" + tr.getMethod());
            tr.getThrowable().printStackTrace();
        }
    }

    public static class TestNGAnnotationTransformer implements IAnnotationTransformer {
        String methodToRun;

        public TestNGAnnotationTransformer(String methodName) {
            methodToRun = methodName;
        }

        public void transform(ITestAnnotation annotation, Class arg1,
                Constructor arg2, Method testMethod) {
            if (methodToRun.equals(testMethod.getName())) {
                annotation.setEnabled(true);
            }
        }
    }
}
If you want to run Demo.class, make sure there is a method in it annotated with the TestNG annotation @Test.
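For example, a hypothetical Demo class could look like this; the test is disabled by default and the annotation transformer above re-enables it when its name matches the methodName passed to TestNGSuite:

import org.testng.annotations.Test;

public class Demo {

    // Disabled by default; TestNGAnnotationTransformer re-enables it by name
    @Test(enabled = false)
    public void demoTest() {
        System.out.println("demoTest executed");
    }
}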