Suppose I want to write tests for a Java class that provides a method for reading and parsing external files (to be precise, the files are JSON and I'm using Jackson).
Moreover, I've got some examples of the JSON files I'd parse, and I also have a vague idea of what kind of Java object this SomeMagicalReader.readPony("path/to/location/pony.json") method should return; if I manage to get readPony to return some kind of PonyObject, I think I have an idea of how to test that the produced PonyObject is what I expect.
The question I have concerns providing the readPony function with test data. I'm probably thinking about this way too much, but (1) is there an idiomatic "Java + JUnit" way of testing a method that reads external files? Copy-paste the contents of the example file into a String variable in the test code? (They're fairly short, but that would still end up looking ugly quite fast.) Place the example JSONs just ...somewhere and call readPony with the path? (This sounds more sensible.) (2) What would then be the canonical place to put such external JSON test files, if my tests are organized in a Maven-style test package hierarchy, e.g. src/test/java/com/stuff/app/package/SomeMagicalReaderTest.java?
As per Maven's standard directory layout, I would advise you to put your JSON test files in src/test/resources, since they are test resources. Note that you can (and should) organize your own folder hierarchy under the resources folder (other developers will find it easier to locate specific test resources when fixing or adding tests).
So yes, your JSON files would end up somewhere, but not just anywhere, as long as your test resources hierarchy is well thought out (for example, if you think your package structure is well organized with meaningful package names, following it for your test resources hierarchy isn't a bad idea at all).
You should ask yourself what the mission-critical code is for your project: reading files or parsing their content. For my projects, parsing was the interesting part, so I placed the files to parse under the test resources, read them into a String in the unit test, and passed that to the parser to unit-test it. It is also possible to include the contents directly in the unit tests as big ugly strings, but when you have a dedicated place for test resources, why not use it.
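For illustration, here is a minimal sketch of such a test, assuming the example file lives at src/test/resources/ponies/pony.json and that the parser exposes a parse(String) method; PonyParser, the resource path, and the expected value are all made up. Reading the stream uses InputStream.readAllBytes, so Java 9+ is assumed.

import static org.junit.Assert.assertEquals;

import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import org.junit.Test;

public class PonyParserTest {

    @Test
    public void parsesExamplePony() throws Exception {
        // Resolves against the test classpath; the leading slash means
        // "from the classpath root", i.e. src/test/resources in Maven.
        String json = readResource("/ponies/pony.json");
        PonyObject pony = new PonyParser().parse(json); // hypothetical parser API
        assertEquals("Applejack", pony.getName());      // made-up expected value
    }

    private String readResource(String path) throws Exception {
        try (InputStream in = getClass().getResourceAsStream(path)) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}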
Copy-paste the contents of the example file into a String variable in the test code
I advise against doing this, as it makes modifying the test input more difficult. Using an external file also makes your tests more flexible: for example, reading an external file allows you to create multiple tests while reusing the basic framework for each one. Of course, this also means you will need to take some time to design the methods that actually perform the tests.
Related
Current Situation
As a pre-build test, I am trying to check that any commits/changes to Java fixtures will not break any FitNesse tests I have on my FitNesse server.
The approach I have taken is to grab all the test files (context.txt) that I want to verify the commit will not break, parse them as best I can, and compare the result with the methods I can obtain via reflection from my projects.
Currently, I have a HashMap of 'class name' to 'class object' for all my available Java fixtures. I have also been able to get access to all the FitNesse tests as File objects. It looks a little like this:
HashMap<String, Class<?>> availableJavaClassesMap = initJavaMap();
List<File> allFitnesseTestFiles = initTestFiles();
Goal
Now I want to be able to do something like this:
HashMap<String, Method> parsedMethodsFromFitnesseMap = new HashMap<>();
for (File file : allFitnesseTestFiles) {
    parsedMethodsFromFitnesseMap.putAll(parseFile(file));
}
Then I would simply compare the two maps and make sure parsedMethodsFromFitnesseMap is a subset of availableJavaClassesMap.
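A rough sketch of that comparison; it assumes parseFile keys its results as "ClassName.methodName", which is an assumption, as is everything else not named in the question:

import java.lang.reflect.Method;
import java.util.Map;

public class FixtureConsistencyCheck {

    static void assertFixturesCoverParsedMethods(
            Map<String, Class<?>> availableJavaClassesMap,
            Map<String, Method> parsedMethodsFromFitnesseMap) {
        for (Map.Entry<String, Method> entry : parsedMethodsFromFitnesseMap.entrySet()) {
            String key = entry.getKey();
            String className = key.substring(0, key.indexOf('.'));
            Class<?> fixture = availableJavaClassesMap.get(className);
            if (fixture == null) {
                throw new AssertionError("Unknown fixture class: " + className);
            }
            Method parsed = entry.getValue();
            try {
                // Succeeds only if the fixture still declares a matching method.
                fixture.getMethod(parsed.getName(), parsed.getParameterTypes());
            } catch (NoSuchMethodException e) {
                throw new AssertionError("Fixture method missing: " + key);
            }
        }
    }
}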
Worries
Include files: how to handle parsing those first/other approaches
Scenarios: create my own list of known scenarios and how they work
Ideal Solution
Is there an already made parser that will do this?
I have found the Slim code and think it could be refactored to suit my needs.
Is this the best approach for this pre-build check?
Note
Each test is very expensive to run, therefore simply running all the tests to see if they still work is not an option for my needs.
Ideas? Thanks.
My Solution
The approach I settled on was using SVNKit to programmatically pull from SVN into the system's temp folder, then doing my parsing and comparison between methods, and deleting the directory when I was done.
For execution time: pulling and parsing 200 tests took about 20 seconds. However, about 90% of that time is spent simply pulling the tests from the SVN repo.
Holding on to the repo instead of deleting it would solve this timing issue (apart from the first pull, obviously), but for my JUnit-style approach I will probably take the longer time hit for the sake of encapsulation.
I made sure that the import files were parsed beforehand so that the scenarios in them could be used when parsing the actual tests.
I ended up not using any of the Slim code (however, if you want to give it a shot, the InstructionExecutor class was what I thought was most useful/closest to a parser).
This was a fairly easy approach and I would recommend it.
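For reference, a minimal sketch of the pull-parse-delete lifecycle; the SVNKit checkout itself is elided, since its exact client API depends on the SVNKit version in use, and the class and method names here are made up:

import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

public class CheckoutScratchArea {

    static void runPreBuildCheck() throws IOException {
        Path tempRepo = Files.createTempDirectory("fitnesse-tests");
        try {
            // SVNKit checkout into tempRepo would go here.
            // ... then parse the pulled test files and compare them
            // against the fixture methods ...
        } finally {
            deleteRecursively(tempRepo); // always clean up the scratch area
        }
    }

    static void deleteRecursively(Path root) throws IOException {
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs)
                    throws IOException {
                Files.delete(file);
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult postVisitDirectory(Path dir, IOException exc)
                    throws IOException {
                Files.delete(dir); // children are gone by the time we get here
                return FileVisitResult.CONTINUE;
            }
        });
    }
}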
Note
I never tried the RESTful approach, which might have yielded a faster execution time, but I did enjoy the simplicity once the repo was successfully pulled onto the machine.
I had to implement some code to traverse a directory structure and return a list of files found. The requirements were pretty simple:
Given a base directory, find all files (which are not directories themselves) within.
If a directory is found, repeat step 1 for it.
I wanted to develop the code using TDD. As I started writing the tests, I realized that I was mocking class File, so I could intercept calls to File.isDirectory() and so on. In this way, I was forcing myself to use a solution where I would call that method.
I didn't like it, because this test is definitely tightly coupled to the implementation. If I ever change the way in which I ask whether a file is a directory, this test is going to fail even if I keep the contract intact. Looking at it as a Private Unit Test made me feel uneasy, for all the reasons expressed in that post. I'm not sure if this is one of those cases where I need to use that kind of testing. On the other hand, I really want to be sure that it returns every file that is not also a directory, traversing the entire structure. To me, that requires a nice, simple test.
I wanted to avoid having to create a testing directory structure with real testing files "on disk", as that seemed rather clumsy and against some of the best practices I have read about.
Bear in mind that I don't need to do anything with the contents, so tricks like using StringReader instead of FileReader do not apply here. I thought I could do something equivalent, though, like being able to create a directory structure in memory when I set up the test, then tearing it down. I haven't found a way to do it.
How would you develop this code using TDD?
Thanks!
The mistake you have made is to mock File. There is a testing anti-pattern that assumes that if your class delegates to class X, you must mock class X to test your class. There is also a general rule to be cautious about writing unit tests that do file I/O, because they tend to be too slow. But there is no absolute prohibition on file I/O in unit tests.
In your unit tests, set up and tear down a temporary directory, and create test files and directories within it. Yes, your tests will be slower than pure CPU tests, but they will still be fast. JUnit even has support code to help with this very scenario: the TemporaryFolder @Rule.
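A minimal sketch of such a test; FileFinder.findFiles is a hypothetical name for the method under development:

import static org.junit.Assert.assertEquals;

import java.io.File;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;

public class FileFinderTest {

    @Rule
    public TemporaryFolder tmp = new TemporaryFolder(); // deleted after each test

    @Test
    public void findsFilesInNestedDirectories() throws Exception {
        File top = tmp.newFile("top.txt");
        File sub = tmp.newFolder("sub");
        File nested = new File(sub, "nested.txt");
        nested.createNewFile();

        List<File> found = FileFinder.findFiles(tmp.getRoot());

        // Only the plain files are returned, never the directories.
        assertEquals(new HashSet<>(Arrays.asList(top, nested)),
                     new HashSet<>(found));
    }
}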
Just this week I implemented, using TDD, some housekeeping code that had to scan through a directory and delete files, so I know this works.
As someone who gets very antsy about unit tests that take longer than a few milliseconds to complete, I strongly recommend mocking out the file I/O.
However, I don't think you should mock the File class directly. Instead, look at your use of the File class as the "how", and try to identify the "what". Then codify that with an interface.
For example: you mentioned that one of the things you do is intercept calls to File.isDirectory. Instead of interacting with the File class, what if your code interacted with some implementation of an interface like:
public interface FileSystemNavigator {
    public boolean isDirectory(String path);
    // ... other relevant methods
}
This hides the use of File.isDirectory from the rest of your code, while simultaneously reframing the problem into something more relevant to your program.
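A sketch of the two sides this enables: a thin production adapter over java.io.File, and a hypothetical in-memory fake that tests can populate without touching the disk:

import java.io.File;
import java.util.HashSet;
import java.util.Set;

// Thin production adapter: the only place that touches java.io.File.
public class RealFileSystemNavigator implements FileSystemNavigator {
    @Override
    public boolean isDirectory(String path) {
        return new File(path).isDirectory();
    }
}

// In-memory fake for tests: deterministic and fast, no disk access.
class InMemoryFileSystemNavigator implements FileSystemNavigator {
    private final Set<String> directories = new HashSet<>();

    void addDirectory(String path) {
        directories.add(path);
    }

    @Override
    public boolean isDirectory(String path) {
        return directories.contains(path);
    }
}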
I have a class that performs operations on files on disk. More precisely, it traverses a directory, reads all files with a given suffix, does some operations on the data, and then outputs the results to a new file.
I'm a bit unsure how to design a unit test for this class.
I'm thinking of having the setup method create a temporary directory and temporary files in /tmp/somefolder, but I suspect this is a bad idea for a couple of reasons (developers using Windows, file permissions, etc.).
Another idea would be to mock the classes I'm using to read from and write to the disk, by encapsulating them behind an interface and then providing a mock object, but it seems a bit messy.
What would be the standard way of approaching such a problem?
If using JUnit 4.7 and up, you can use the TemporaryFolder @Rule to transparently obtain a temporary folder, which is automatically cleaned up after each test.
Your strategy is the right one, IMO. Just make sure not to hardcode the temp directory. Use System.getProperty("java.io.tmpdir") to get the path of the temp directory, and use a finally block in your test or an @After method to clean up the created files and directories once your test is finished.
Mocking everything out is possible, but probably much more effort than it's worth. You can use the temporary directory obtained from System.getProperty("java.io.tmpdir"), which you should be able to write to no matter which system you're on. Stick to short file names and you'll be safe even if running on something ancient.
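A sketch of that strategy, using a uniquely named working directory under java.io.tmpdir with @Before/@After housekeeping; the test class and fixture layout are made up:

import java.io.File;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class SuffixProcessorTest {

    private File workDir;

    @Before
    public void createWorkDir() {
        // Unique name so parallel or aborted runs cannot collide.
        workDir = new File(System.getProperty("java.io.tmpdir"),
                           "suffix-test-" + System.nanoTime());
        workDir.mkdirs();
    }

    @After
    public void deleteWorkDir() {
        File[] children = workDir.listFiles();
        if (children != null) {
            for (File child : children) {
                child.delete(); // flat fixture, so no recursion needed
            }
        }
        workDir.delete();
    }

    @Test
    public void processesFilesWithMatchingSuffix() throws Exception {
        // create fixture files in workDir, run the class under test,
        // then assert on the produced output file
    }
}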
I am very new to TDD in general, so please forgive me if my question does not make a lot of sense.
After looking around for a bit, it seems that JUnit is capable of running integration tests. I am hoping the community can provide me some guidance on how to write an integration test. Here is a simple overview of my design.
I have Main1, which accepts a list of zip files. Main1 will extract the zip files, edit the content of the PDFs inside them, and put the final PDF files into folder X. If the number of PDFs reaches a THRESHOLD, then Main2Processor (not a main class) gets invoked; it zips all the PDF files and also creates a report text file with the same name as the newly created zip file.
If I run Main2, it will also kick off Main2Processor, which will zip the PDF files and create the text file reports (even if the number of PDFs in folder X has not reached the THRESHOLD).
How do I write an integration test verifying the correctness of the above design?
You're right; JUnit can be used to write tests that would be called integration tests. All you have to do is relax the rules regarding tests not touching external resources.
First, I would refactor your application's main() to do as little as you can possibly make it do; there isn't a really good way to test the code in a main() function. Have it construct and run an object (that object can be the one containing main(), if you wish), passing the new object your list of ZIP files. That object can now be tested using JUnit by just instantiating it.
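For example (ZipBatchProcessor and its constructor arguments are illustrative names, not classes from the question):

import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class Main1 {

    public static void main(String[] args) {
        // main() only adapts the command line; all real work happens in a
        // plain object that a JUnit test can construct directly.
        List<File> zipFiles = new ArrayList<>();
        for (String arg : args) {
            zipFiles.add(new File(arg));
        }
        new ZipBatchProcessor(new File("X")).process(zipFiles);
    }
}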
Now you just have to architect the test to set up a known, constant test environment and then perform a repeatable test. Create, or clear out, a temp directory somewhere, then copy some test ZIP files into that directory. Then run your main processor.
To detect that the proper behavior occurs when the threshold is reached, you just test for the existence of a zip file (and/or its absence if the threshold isn't reached).
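Sketched as JUnit tests, with every helper and class name hypothetical (TestEnv stands in for whatever fixture-setup utilities you write):

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import java.io.File;
import org.junit.Test;

public class ThresholdBehaviorTest {

    private static final int THRESHOLD = 10; // assumed example value

    @Test
    public void zipsOutputOnceThresholdIsReached() {
        File folderX = TestEnv.folderXContainingPdfs(THRESHOLD);
        new ZipBatchProcessor(folderX).process(TestEnv.sampleZips());
        assertTrue(new File(folderX, "batch-1.zip").exists());
    }

    @Test
    public void doesNotZipBelowThreshold() {
        File folderX = TestEnv.folderXContainingPdfs(THRESHOLD - 1);
        new ZipBatchProcessor(folderX).process(TestEnv.sampleZips());
        assertFalse(new File(folderX, "batch-1.zip").exists());
    }
}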
Do you really want an "integration test"? (That term is overloaded beyond comprehension now, so stating your end goals would help.) How about an acceptance test, where you use this console/GUI app like a real user, with specific input, and check for the expected output?
JUnit is just a test runner and is oblivious to what the test actually does. So yes, you could use it to write any test. However, it has been built for unit testing, and that will leak through sometimes, e.g. the fact that a test shuts down on the first error / the first assert that does not succeed. Coarse tests usually like to push ahead and report a bunch of errors at the end.
If you want an integration test, you will have to redesign your app to be callable from tests (if it is not already). Raise specific obstacles to writing this test and I can offer more specific advice.
I think that you should create some utility methods for your tests, for example: running the application, checking a directory, clearing a directory, etc.
Then you will be able to implement tests like the following:
@Test
public void mytest1() {
    exec(Main1.class, "f1.zip", "f2.zip");
    Assert.assertTrue(getFileCount(OUTPUT_DIR) < THRESHOLD);
    // perform verification of your files etc...
}
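One possible sketch of those helpers; exec runs a main class in-process via reflection, and getFileCount counts plain files. The names come from the test above, but the implementations here are assumptions:

import java.io.File;
import java.lang.reflect.Method;

public class TestUtils {

    // Invokes mainClass.main(args) in the current JVM.
    static void exec(Class<?> mainClass, String... args) throws Exception {
        Method main = mainClass.getMethod("main", String[].class);
        main.invoke(null, (Object) args); // cast so args isn't expanded to varargs
    }

    // Counts plain files (not subdirectories) directly inside dir.
    static int getFileCount(File dir) {
        File[] files = dir.listFiles(File::isFile);
        return files == null ? 0 : files.length;
    }
}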
First of all, you might restate your specification above from a "test sequence" point of view. For example, test one would provide Main1 with a set of N PDF files, N being under the threshold. Then your test code, after Main1 returns, would check the contents of folder X, as well as the reports, to verify that your expectations are met.
JUnit itself just helps run test cases; it does not really help write the tests.
And JUnit is "unit test" oriented (you can use it for integration tests as well, although some situations do not suit it well; when a global setup is required, for example, or when the test cases are expected to run in a specific order...).
Some additional libraries can help greatly to interact easily with the rest of your code: DbUnit, HttpUnit, and so on.
In the TDD (Test-Driven Development) process, how do you deal with test data?
Assume a scenario: parsing a log file to extract the needed columns. How do I prepare the test data so the test is robust? And is it proper to locate such files next to the test class files?
Maven, for example, uses a convention for folder structures that takes care of test data:
src
  main
    java        <-- java source files of main application
    resources   <-- resource files for application (logger config, etc)
  test
    java        <-- test suites and classes
    resources   <-- additional resources for testing
If you use Maven for building, you'll want to place the test resources in that folder; if you're building with something different, you may still want to adopt this structure, as it is more than just a Maven convention: in my opinion it's close to best practice.
Another option is to mock out your data, eliminating any dependency on external sources. This way it's easy to test various data conditions without having to have multiple instances of external test data. I then generally use full-fledged integration tests for lightweight smoke testing.
Hard-code them in the tests so that they are close to the tests that use them, making the tests more readable.
Create the test data from a real log file. Write a list of the tests intended to be written, tackle them one by one and tick them off once they pass.
getClass().getClassLoader().getResourceAsStream("....xml");
inside the test worked for me. But
getClass().getResourceAsStream("....xml");
didn't work.
Don't know why, but maybe it helps some others. (The likely reason: Class.getResourceAsStream resolves a relative path against the class's own package, while ClassLoader.getResourceAsStream always resolves it from the classpath root.)
When my test data must be an external file - a situation I try to avoid, but can't always - I put it into a reserved test-data directory at the same level as my project and use getClass().getClassLoader().getResourceAsStream(path) to read it. The test-data directory isn't a requirement, just a convenience. But try to avoid needing to do this; as @philippe points out, it's almost always nicer to have the values hard-coded in the tests, right where you can see them.
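A small sketch of that reading pattern (the helper class is made up; InputStream.readAllBytes requires Java 9+):

import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;

public final class TestData {

    // Resolves the path from the classpath root, so it works for a
    // test-data directory on the test classpath or for src/test/resources.
    static String load(String path) {
        try (InputStream in = TestData.class.getClassLoader().getResourceAsStream(path)) {
            if (in == null) {
                throw new IllegalArgumentException("Resource not found: " + path);
            }
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}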