I am not sure if this is possible, but if so, how do I do it? I have a project which contains two other projects as dependencies. The parent project is only used for testing purposes (a client/server app). When testing, I have difficulty reading through my test output because of the large amount of output the client and server projects produce. I am trying to find a way to hide all the output (printf()) of my sub-projects from the console, so that I can see only my testing output. Is there a way to do so?
For testing I am using JUnit.
Thanks
You should use the Java Logger API (or Log4j) instead of calling System.out.print directly, so you can disable specific loggers during the execution of your tests.
If you can't change legacy code to use Logger API, there are two options:
Option 1:
Create a PrintStream decorator class like this pseudo-code:
public class FilteredPrintStream extends PrintStream {

    private final PrintStream stream;

    public FilteredPrintStream(PrintStream stream) {
        super(stream);
        this.stream = stream;
    }

    @Override
    public void print(String str) {
        if (notCalledBySubproject()) {
            stream.print(str);
        }
    }

    // override the other print/println/printf methods too
}
Then you set the default output: System.setOut(new FilteredPrintStream(System.out));
To find the method caller (in order to implement notCalledBySubproject()), see How do I find the caller of a method using stacktrace or reflection?
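For example, notCalledBySubproject() could walk the current stack trace. A sketch, where "com.myapp.subproject" is a placeholder for your sub-projects' real package prefix:
private boolean notCalledBySubproject() {
    for (StackTraceElement element : Thread.currentThread().getStackTrace()) {
        // placeholder package prefix; use your sub-projects' actual packages
        if (element.getClassName().startsWith("com.myapp.subproject")) {
            return false;
        }
    }
    return true;
}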
Option 2:
Use the Logger API in your test code so you can, for example, redirect your output to a file; then you ignore the console output and read the file generated by your tests.
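A minimal sketch using the JDK's own java.util.logging (Logger, FileHandler, SimpleFormatter); the logger and file names are arbitrary examples:
static Logger createTestLogger() throws IOException {
    Logger logger = Logger.getLogger("my.tests");              // arbitrary logger name
    FileHandler handler = new FileHandler("test-output.log");  // arbitrary file name
    handler.setFormatter(new SimpleFormatter());
    logger.addHandler(handler);
    logger.setUseParentHandlers(false); // don't also echo to the console
    return logger;
}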
It is not a perfect solution, nor good coding practice, but you could add a class like this:
public class SystemOutput
{
    public static final boolean DO_PRINT = true;

    public static void printf(String format, Object... args)
    {
        if (DO_PRINT)
            System.out.printf(format, args);
    }
}
and use Find and Replace once to replace "System.out.printf" with "SystemOutput.printf" in every class needed. Because both methods have the same signature, nothing else has to change. When you want to block the output, you can just set DO_PRINT to false.
Eclipse, for example, provides a search tool which can find and replace a given string in every .java file in a project (Ctrl + H, under the File Search tab).
Of course, it is also possible to call System.setOut() with your own subclass of PrintStream that overrides the printf() method to print only when a certain boolean flag is true.
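A sketch of that variant; the class name TogglePrintStream and the flag are invented here:
public class TogglePrintStream extends PrintStream {

    public static volatile boolean enabled = true; // flip to false to silence output

    public TogglePrintStream(PrintStream target) {
        super(target);
    }

    @Override
    public PrintStream printf(String format, Object... args) {
        return enabled ? super.printf(format, args) : this;
    }
}
Install it once at the start of your tests with System.setOut(new TogglePrintStream(System.out));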
Why do you need the console to run unit tests? Ignore it. If your tests pass, you get a 0 exit code or a green bar (in the IDE or Jenkins); any error stack trace can be found in, for example, the Maven test results. Just ignore the standard output.
Another thing: using the console in your application is usually a bad idea; avoid it. Use a logging framework instead (it will let you control the destination and level of logging). Use your IDE to refactor: replace all calls to printf with log.debug or with your own wrapper. If your IDE doesn't support that, use a regex and replace-all.
If you want to get rid of all the output, you can redirect stdout to /dev/null or change the output stream in Java, but that's not a proper solution.
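In code, the redirection looks roughly like this (remember to restore the original stream afterwards):
PrintStream original = System.out;
System.setOut(new PrintStream(new OutputStream() {
    @Override
    public void write(int b) {
        // discard everything
    }
}));
try {
    // run the noisy code here
} finally {
    System.setOut(original); // restore stdout so later output is visible
}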
I have this method that I am using in a NetBeans plugin:
public static SourceCodeFile getCurrentlyOpenedFile() {
    MainProjectManager mainProjectManager = new MainProjectManager();
    Project openedProject = mainProjectManager.getMainProject();

    /* Get the Java file currently displayed in the IDE if there is an opened project */
    if (openedProject != null) {
        TopComponent activeTC = TopComponent.getRegistry().getActivated();
        DataObject dataLookup = activeTC.getLookup().lookup(DataObject.class);
        File file = FileUtil.toFile(dataLookup.getPrimaryFile()); // currently opened file

        // Check if the opened file is a Java file
        if (FilenameUtils.getExtension(file.getAbsoluteFile().getAbsolutePath()).equalsIgnoreCase("java")) {
            return new SourceCodeFile(file);
        } else {
            return null;
        }
    } else {
        return null;
    }
}
Basically, using the NetBeans API, it detects the file currently opened by the user in the IDE. Then it loads it and creates a SourceCodeFile object out of it.
Now I want to unit test this method using JUnit. The problem is that I don't know how to test it.
Since it doesn't receive any arguments, I can't test how it behaves when given wrong arguments. I also thought about trying to manipulate openedProject in order to test the method's behaviour for different values of that object, but as far as I know, I can't manipulate a variable that way in JUnit. I also cannot check what the method returns, because under a unit test it will always return null, since it doesn't detect any opened file in NetBeans.
So, my question is: how can I approach the unit testing of this method?
Well, your method does take parameters, "between the lines":
MainProjectManager mainProjectManager = new MainProjectManager();
Project openedProject = mainProjectManager.getMainProject();
basically fetches the object to work on.
So the first step would be to change that method signature, to:
public static SourceCodeFile getCurrentlyOpenedFile(Project project) {
...
Of course, that object isn't used, except for that null check. So the next level would be to have a distinct method like
SourceCodeFile lookup(DataObject dataLookup) {
In other words: your real problem is that you wrote hard-to-test code. The "default" answer is: you have to change your production code to make it easier to test.
For example by ripping it apart, and putting all the different aspects into smaller helper methods.
You see, that last method lookup(), that one takes a parameter, and now it becomes (somehow) possible to think up test cases for this. Probably you will have to use a mocking framework such as Mockito to pass mocked instances of that DataObject class within your test code.
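For example, a minimal sketch of that restructuring: pull the extension check into a small pure helper (the name isJavaFile is invented here), which can then be tested without any mocking:
static boolean isJavaFile(File file) {
    return FilenameUtils.getExtension(file.getAbsolutePath()).equalsIgnoreCase("java");
}

@Test
public void recognizesJavaFilesByExtension() {
    assertTrue(isJavaFile(new File("Foo.java")));
    assertFalse(isJavaFile(new File("notes.txt")));
}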
Long story short: there are no detours here. You can't test your code (in reasonable ways) as it is currently structured. Re-structure your production code, then all your ideas about "when I pass X, then Y should happen" can work out.
Disclaimer: yes, theoretically, you could test the above code by heavily relying on frameworks like PowerMock(ito) or JMockit. These frameworks allow you to control (mock) calls to static methods, or to new(). So they would give you full control over everything in your method. But that would basically force your tests to know everything that is going on in the method under test, which is a really bad thing.
I have an existing Java class that writes to a file.
public final class WriteToFile {

    private static final Logger log = LoggerFactory.getLogger(WriteToFile.class); // e.g. SLF4J
    private static final String encoding_format = "UTF8";

    private Writer file_writer;

    public WriteToFile(final File fpath) throws IOException {
        this.file_writer = new OutputStreamWriter(new FileOutputStream(fpath), encoding_format);
    }

    public void fileWrite(final String msg) {
        try {
            this.file_writer.write(msg);
            this.file_writer.write("\n");
            this.file_writer.flush();
            this.file_writer.close();
        } catch (IOException e) {
            log.error("File write failed", e);
        }
    }
}
In order to unit test this, I learnt that creating a file mock using a mocking framework is not good practice. What do I test here? The only way of testing this is probably to do the file write again and check whether the expected and actual contents are the same. In that case, doing it the JUnit way would be as mentioned in this post: How to test write to file in Java?. However, I am not going to rewrite the file-writing code to include interface wrappers. How do I go about this?
@Test
public void testfileWrite() throws Exception {
    String msg = "somemessage";
    String fpath = "path/to/file";
    Writer file_writer = new OutputStreamWriter(new FileOutputStream(fpath), "UTF8");
    file_writer.write(msg);
    assertEquals("somemessage", file_writer.toString());
}
Is this all that needs to be tested?
The point of this class is to write a file. It does nothing else (and that is a good thing). So don't bother with a mockist unit test; all it shows is that you can write a ton of mock code. Instead, write an integration test.
Use the JUnit rule TemporaryFolder to create and destroy a folder to put your test file in, then verify at the end of the test that the file contains what you expect. The only time you should consider mocking for this kind of test is if the exceptional case does something funky. Then you can either do some evil black magic involving PowerMock or pass in some form of "File Stream factory". Or ask yourself whether that is really such a great place for complex logic that needs testing, and then move it.
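A minimal sketch of such an integration test for the WriteToFile class above (the expected content reflects the fileWrite behaviour shown earlier, which appends a newline):
public class WriteToFileIT {

    @Rule
    public TemporaryFolder folder = new TemporaryFolder();

    @Test
    public void writesMessageFollowedByNewline() throws Exception {
        File target = folder.newFile("out.txt");

        new WriteToFile(target).fileWrite("somemessage");

        String written = new String(Files.readAllBytes(target.toPath()), StandardCharsets.UTF_8);
        assertEquals("somemessage\n", written);
    }
}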
When testing classes that make use of WriteToFile, mock or stub WriteToFile.
When it comes to writing unit tests that must check generated files, I always prepare a repository of cases: for each case, an input file (if necessary) and a set of expected output files.
I write one test method for each case, in which I call the business logic, which generates one or more files in the working directory, and I then check whether the generated files are equal to the expected ones.
I prepare the expected files manually and check them into the source control system, so that they belong to each released version. If, in the future, the business logic must change its behaviour, then the expected file must be changed accordingly, and both the code and the file are checked in and tagged together in the same release.
That is the easiest and safest way I found for checking generated files.
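The comparison step itself can be as simple as this sketch, using java.nio.file (the paths are arbitrary examples):
Path expected = Paths.get("src/test/resources/expected/report.txt");
Path actual = Paths.get("target/out/report.txt"); // produced by the business logic
assertEquals(Files.readAllLines(expected), Files.readAllLines(actual));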
Use PowerMockito to mock the calls to the constructors of FileOutputStream and OutputStreamWriter, as in: http://benkiefer.com/blog/2013/04/23/powermockito-constructor-mocking/
Then verify that the file_writer methods write(String) and flush() have been invoked twice and once, respectively, and that close() was called at the end.
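A rough sketch in PowerMock/JUnit 4 style, assuming WriteToFile constructs the streams internally as shown above:
@RunWith(PowerMockRunner.class)
@PrepareForTest(WriteToFile.class)
public class WriteToFileTest {

    @Test
    public void writesFlushesAndCloses() throws Exception {
        OutputStreamWriter writer = Mockito.mock(OutputStreamWriter.class);
        PowerMockito.whenNew(FileOutputStream.class).withAnyArguments()
                .thenReturn(Mockito.mock(FileOutputStream.class));
        PowerMockito.whenNew(OutputStreamWriter.class).withAnyArguments()
                .thenReturn(writer);

        new WriteToFile(new File("ignored")).fileWrite("msg");

        Mockito.verify(writer, Mockito.times(2)).write(Mockito.anyString()); // msg and "\n"
        Mockito.verify(writer).flush();
        Mockito.verify(writer).close();
    }
}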
I am writing tests for an interpreter for a programming language in Java, using the JUnit framework. To this end I've created a large number of test cases, most of them containing code snippets in the language under test. Since these snippets are normally small, it is convenient to embed them in the Java code. However, Java doesn't support multiline string literals, which makes the code snippets a bit obscure due to escape sequences and the necessity to split longer string literals, for example:
String output = run("let a := 21;\n" +
"let b := 21;\n" +
"print a + b;");
assertEquals(output, "42");
Ideally I would like something like:
String output = run("""
let a := 21;
let b := 21;
print a + b;
""");
assertEquals(output, "42");
One possible solution is to move the code snippets to external files and refer to each file from the corresponding test case. However, this adds a significant maintenance burden.
Another solution is to use a different JVM language, such as Scala or Jython, which supports multiline string literals, to write the tests. This would add a new dependency to the project and require porting the existing tests.
Is there any other way to keep the clarity of the test code snippets while not adding too much maintenance?
Moving the test cases to a file worked for me in the past; it was an interpreter as well:
- created an XML file containing the snippets to be interpreted as well as the expected results. It was a fairly simple XML definition: a list of test elements mainly containing testID, value, expected result, type, and a description.
- implemented exactly one JUnit test that read the file and looped through its contents; in case of failure, we used the testID and description to log the failing tests.
It mainly worked because we had one generic, well-defined interface to the interpreter, like your run method, so refactoring was still possible. In our case this did not increase the maintenance effort; in fact, we could easily create new tests by just adding more elements to the XML file. A sketch of the looping test follows below.
Maybe this is not the optimal way in which unit tests should be used, but it worked well for us.
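A rough sketch of that single looping test, using the JDK's DOM parser; the element and attribute names (test, id, value, expected) and the file path are invented for illustration:
@Test
public void runAllSnippets() throws Exception {
    Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new File("src/test/resources/interpreter-tests.xml"));
    NodeList tests = doc.getElementsByTagName("test");
    for (int i = 0; i < tests.getLength(); i++) {
        Element test = (Element) tests.item(i);
        String id = test.getAttribute("id");
        String snippet = test.getAttribute("value");
        String expected = test.getAttribute("expected");
        assertEquals("test " + id + " failed", expected, run(snippet));
    }
}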
Since you are talking about other JVM languages, have you considered Groovy? You would have to add an external dependency, but only at compile/test time (you don't have to put it in your production package), and it provides multiline strings. And one major advantage in your case: its syntax is backwards compatible with Java, meaning you won't have to rewrite your tests!
I have done this in the past. I did something similar to what home suggested: I used external file(s) containing the tests and their expected results, but with the @Parameterized test runner.
@RunWith(Parameterized.class)
public class ParameterTest {

    @Parameters
    public static List<Object[]> data() {
        List<Object[]> list = new LinkedList<Object[]>();
        for (File file : new File("/temp").listFiles()) {
            list.add(new Object[]{file.getAbsolutePath(), readFile(file)});
        }
        return list;
    }

    private static String readFile(File file) {
        try {
            return new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    private String filename;
    private String contents;

    public ParameterTest(String filename, String contents) {
        this.filename = filename;
        this.contents = contents;
    }

    @Test
    public void test1() {
        // here we test something, e.g. run the interpreter on 'contents'
    }

    @Test
    public void test2() {
        // here we test something else
    }
}
Here we are running test1() and test2() once for each file in /temp, with the filename and the contents of the file as parameters. The test class is instantiated and invoked for each item that you add to the list in the method annotated with @Parameters.
Using this test runner, you can rerun a particular file if it fails; most IDEs support rerunning a single failed test. The disadvantage of @Parameterized is that there isn't any way to sensibly identify the tests so that the names appear in the Eclipse JUnit plugin; all you get is 0, 1, 2, etc. But at least you can rerun the failed tests.
As home says, good logging is important to identify the failing tests correctly and to aid debugging, especially when running outside the IDE.
I have a log statement in which I always use this.getClass().getSimpleName() as the first parameter.
I would like to put this in some sort of macro-like constant and use that in all my log statements.
But I learned that Java has no such simple mechanism, unlike, say, C++.
What is the best way to achieve this sort of functionality in Java?
My example log statement (from Android) is as follows:
Log.v(this.getClass().getSimpleName(),"Starting LocIden service...");
Java doesn't have macros but you can make your code much shorter:
Log.v(this, "Starting LocIden service...");
And in the Log class:
public static void v(Object object, String s)
{
    _v(object.getClass().getSimpleName(), s); // _v is the existing implementation that takes a tag
}
Another approach could be to inspect the call stack.
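A sketch of the call-stack idea; note that the stack index is fragile and can vary between JVMs:
public static void v(String message) {
    // index 0 = getStackTrace, 1 = this method, 2 = the caller (JVM-dependent)
    StackTraceElement caller = Thread.currentThread().getStackTrace()[2];
    String className = caller.getClassName();
    String tag = className.substring(className.lastIndexOf('.') + 1);
    Log.v(tag, message);
}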
Karthik, most logging tools allow you to specify the format of the output, and one of the available parameters is the class name, obtained via the method Mark mentioned (stack inspection).
For example, in log4j the pattern parameter %C references the class name.
Another approach is to follow what Android suggests for its logging functionality:
Log.v(TAG, "Message");
where TAG is a private static final String in your class.
Use a proper logging framework (e.g. slf4j). Each class that logs has its own logger, so there's no need to pass the class name to the log method call.
Logger logger = LoggerFactory.getLogger(this.getClass());
logger.debug("Starting service");
//...
logger.debug("Service started");
Could a sensible unit test be written for this code, which extracts a rar archive by delegating to a capable tool on the host system if one exists?
I can write a test case based on the fact that my machine runs Linux and has the unrar tool installed, but if another developer who runs Windows checked out the code, the test would fail, although there would be nothing wrong with the extractor code.
I need to find a way to write a meaningful test which is not bound to the system or to the unrar tool being installed.
How would you tackle this?
public class Extractor {

    private EventBus eventBus;
    private ExtractCommand[] linuxExtractCommands = new ExtractCommand[]{new LinuxUnrarCommand()};
    private ExtractCommand[] windowsExtractCommands = new ExtractCommand[]{};
    private ExtractCommand[] macExtractCommands = new ExtractCommand[]{};

    @Inject
    public Extractor(EventBus eventBus) {
        this.eventBus = eventBus;
    }

    public boolean extract(DownloadCandidate downloadCandidate) {
        for (ExtractCommand command : getSystemSpecificExtractCommands()) {
            if (command.extract(downloadCandidate)) {
                eventBus.fireEvent(this, new ExtractCompletedEvent());
                return true;
            }
        }
        eventBus.fireEvent(this, new ExtractFailedEvent());
        return false;
    }

    private ExtractCommand[] getSystemSpecificExtractCommands() {
        String os = System.getProperty("os.name");
        if (Pattern.compile("linux", Pattern.CASE_INSENSITIVE).matcher(os).find()) {
            return linuxExtractCommands;
        } else if (Pattern.compile("windows", Pattern.CASE_INSENSITIVE).matcher(os).find()) {
            return windowsExtractCommands;
        } else if (Pattern.compile("mac os x", Pattern.CASE_INSENSITIVE).matcher(os).find()) {
            return macExtractCommands;
        }
        return null;
    }
}
Could you not pass the class a Map<String, ExtractCommand[]> instance and then add a protected method, say getOsName(), for getting the string to match? Then you could look up the match string in the map to get the extract commands in the getSystemSpecificExtractCommands() method. This would allow you to inject a map containing a mock ExtractCommand and override getOsName() to return the key of your mock command, so you could test that when the extraction works, the eventBus event is fired, etc.
private Map<String, ExtractCommand[]> eventMap;

@Inject
public Extractor(EventBus eventBus, Map<String, ExtractCommand[]> eventMap) {
    this.eventBus = eventBus;
    this.eventMap = eventMap;
}

private ExtractCommand[] getSystemSpecificExtractCommands() {
    String os = getOsName();
    return eventMap.get(os);
}

protected String getOsName() {
    return System.getProperty("os.name");
}
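With that in place, a test could look roughly like this Mockito-style sketch (the "test-os" key is arbitrary):
@Test
public void firesCompletedEventWhenExtractionSucceeds() {
    EventBus eventBus = mock(EventBus.class);
    ExtractCommand command = mock(ExtractCommand.class);
    when(command.extract(any(DownloadCandidate.class))).thenReturn(true);

    Map<String, ExtractCommand[]> map = new HashMap<String, ExtractCommand[]>();
    map.put("test-os", new ExtractCommand[]{command});

    Extractor extractor = new Extractor(eventBus, map) {
        @Override
        protected String getOsName() {
            return "test-os";
        }
    };

    assertTrue(extractor.extract(mock(DownloadCandidate.class)));
    verify(eventBus).fireEvent(same(extractor), any(ExtractCompletedEvent.class));
}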
I would look for a pure Java API for manipulating rar files. That way the code will not be system dependent.
A quick search on Google returned this:
http://www.example-code.com/java/rar_unrar.asp
Start with a mock framework. You'll need to refactor a bit, as you will need to ensure that some of those private and local scope properties/variables can be overridden if need be.
Then, when you are testing extract(), make sure you've mocked out the commands, and verify that extract() is called on your mocked objects. You'll also want to ensure that your event got fired.
Now, to make it more testable, you can use constructor or property injection. Either way, you'll need to make the private ExtractCommand arrays overridable.
Sorry, don't have time to recode it, and post, but that should just about get you started nicely.
Good luck.
EDIT: It does sound like you are more after a functional test anyway, if you want to test that the archive is actually extracted correctly.
Testing can be tricky, especially getting the divides right between the different types of tests and when they should be run and what their responsibilities are. This is even more so with cross-platform code.
While it's possible to think of this as one code base you are testing, it's really multiple code bases: the generic Java code plus the code for each target platform, so you will need multiple tests.
To begin with unit testing, you will not be exercising the external command. Rather, each platform specific class is tested to see that it generates the correct command line, without actually executing it.
Your Java class that hides all the platform specifics (which command to use) has a unit test to verify that it instantiates the correct platform-specific class for a given platform. The platform can be a parameter to the core test, so multiple platforms can be "emulated". To take the unit test further, you could mock out the command implementation (e.g. having a RAR file and its uncompressed form as part of your test data, where the command is a simple copy of the uncompressed data).
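For example, a sketch in which the OS name becomes a parameter, a hypothetical refactoring of getSystemSpecificExtractCommands:
// The OS name is passed in instead of being read inside the method,
// so each platform can be emulated in a plain unit test.
ExtractCommand[] getExtractCommandsFor(String osName) {
    String os = osName.toLowerCase();
    if (os.contains("linux")) {
        return linuxExtractCommands;
    }
    if (os.contains("windows")) {
        return windowsExtractCommands;
    }
    if (os.contains("mac os x")) {
        return macExtractCommands;
    }
    return new ExtractCommand[0];
}

@Test
public void selectsLinuxCommandsForLinuxOsNames() {
    Extractor extractor = new Extractor(mock(EventBus.class));
    ExtractCommand[] commands = extractor.getExtractCommandsFor("Linux 5.15");
    assertTrue(commands[0] instanceof LinuxUnrarCommand);
}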
Once these unit tests are in place and green, you can then move on to functional tests, where the real platform-specific commands are executed. Of course, these functional tests have to run on the actual platform. Each functional test corresponds to a platform-specific class that knows how to create the correct command line to unrar.
Your build is configured to exclude tests for classes that don't apply to the current platform, so that, for example, LinuxUnrarer is not tested on Windows. The platform-independent Java class is always tested, and it will instantiate the appropriate platform-specific test. This gives you an integration test to see that the system works end to end.
As to cross-platform UNRAR, there is a Java RAR scanner, but it doesn't decompress.