I'm new to unit testing with JUnit (and Java development in general). When creating unit tests for private methods on classes, it looks as though the test class must be in the same package as the class being tested. What is the typical way of avoiding exporting the APIs of the unit tests? Can I make the test classes/methods package-private? Or do developers typically have a separate build for release that excludes unit test files?
I can tell IntelliJ or Ant not to package JUnit tests in the deployment. I have tests in a separate directory from the source code, which is what makes it possible.
Don't mingle source and test classes together. Keep them separate to make it easier for the tool/script you use to deploy.
The test file does not necessarily have to be in the same package as the class being tested. In fact, it is a good practice to have the test files in a completely separate package, allowing them to test the public API without being concerned with package-level implementation details.
Alternatively, you can set up your build script (e.g. Ant) to ignore files containing "Test" when you build your release artifact.
Personally, my approach is to test only exposed functionality, so you end up testing only well-encapsulated parts.
This usually leads my design to contain small classes with well defined functionality, which are easier to test.
Generally, when unit testing you shouldn't be concerned with the internals of what you're testing, so I find this is the best way to approach it.
I also agree it's best to separate test and production code.
Keep test source code out of application source code. In general, only test exposed functionality. If you really need to test private behavior, create a test object that extends the real object and allows public access to the private behavior.
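A minimal sketch of that last idea, assuming the member you want to reach is protected rather than strictly private (a subclass cannot see truly private members); all names here are hypothetical:

public class Parser {
    protected int countTokens(String line) {
        return line.trim().split("\\s+").length;
    }
}

// Lives only in the test source tree, so it never ships with the application.
class TestableParser extends Parser {
    @Override
    public int countTokens(String line) {   // widen protected to public for the test
        return super.countTokens(line);
    }
}

A test can then call new TestableParser().countTokens("a b c") and assert on the result, without the widened accessor ever appearing in the production API.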
I think it's a mistake to move your test code out of the package of the CUT (Class Under Test). At some point you may want to test a protected method or class, and having your test code in another package makes that hard or impossible.
A better solution is to create a separate directory for your test code that simply mirrors the package structure of your production code. Here's what I do:
src/main/java/com/example/Foo.java
src/test/java/com/example/FooTest.java
Then your build script can very simply ignore src/test/** when it comes time for packaging and deployment.
I have some classes for which I don't need to write JUnit tests, so I want to skip JUnit for those Java classes and still increase JUnit code coverage using Maven. For example, I have placed all such Java classes in the com.test.xxx package so that I can declare that all classes in it are to be skipped.
You can use the @Disabled annotation to disable a specific test. If you just have empty test classes, for whatever reason, delete them.
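If these are JUnit 5 tests, a minimal sketch of that annotation (class and method names are hypothetical; in JUnit 4 the equivalent is @Ignore):

import org.junit.jupiter.api.Disabled;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class PricingServiceTest {

    @Test
    void addsTax() {
        assertEquals(4, 2 + 2);   // a real test would assert on your service here
    }

    @Disabled("feature not implemented yet")
    @Test
    void appliesSeasonalDiscount() {
        // reported as skipped by the runner, but kept in the source tree
    }
}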
If you want to fake code coverage for whatever reason, you can create tests and call your services inside the tests, but not add any assertions... However, this is extremely bad practice and I can't really recommend it to you.
I am currently writing Selenium WebDriver tests for a variety of websites each using the same proprietary framework.
Because of this, there are many test cases that can be quite similar across different websites. As such I have made my test classes as generic as possible and made it so that every XPath/CSS Selector/ID used to locate elements is defined in a Constants class, which is unique to every project.
But, in some cases, the code for the same test can be the same across different websites, since I wrote them generically.
In addition each test is a direct/indirect extension of a BasicTest class which contains code that is likely to be reused by different tests (ex: WebDriver instance declaration, etc)
The way I thought about setting my test structure was the following:
one generic project that is "reused" by each subsequent project;
one project per website with its own definition of the Constants class and a TestSuite class that it can use to run both generic tests and tests specific to itself.
This would allow me not to have copies of these generic tests in each of my test projects.
The problem is that I don't really know how to set this up. The GenericProject is going to contain tests that require variables from Constants, but it makes no sense to have generic Constants. Plus, will I be able to call those tests inside my website-specific TestSuites? If I redefine Constants in each specific project, will those constants be used for the generic tests defined in GenericProject?
How can I even set it up so that I can reuse Project A's classes inside of Project B, C, D... etc?
Extract your constants to a properties file which exists in each module as src/test/resources/common.properties.
Use org.apache.commons:commons-configuration2 PropertiesConfiguration to read this file. It will handle nested properties just fine.
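A small sketch of loading it with commons-configuration2, assuming common.properties is on the module's test classpath and that the key used below is just a placeholder:

import org.apache.commons.configuration2.PropertiesConfiguration;
import org.apache.commons.configuration2.builder.fluent.Configurations;
import org.apache.commons.configuration2.ex.ConfigurationException;

class TestProperties {

    static PropertiesConfiguration load() throws ConfigurationException {
        // Resolved via the default location strategies, which include the
        // classpath, so each module picks up its own src/test/resources copy.
        return new Configurations().properties("common.properties");
    }
}

A test can then call TestProperties.load().getString("sidePanel.selector"), with each website module providing its own value for the same key.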
Common code can be shared by depending on your GenericModule. Official instructions for the two models of doing this (extracting common tests to a new module, or using a test-jar) are here.
In general, in order to reuse code across projects you would create a library containing the reusable code. To do so you'd need to think about a suitable API for the library.
This contains decisions about:
How will functionality be called from dependent code
How will dependent code provide required data.
If you are using constants for, e.g., CSS selectors that are different per site but have the same semantics, e.g.
root frame
side panel
main area
...
you might want to define an interface that the dependent code can provide. This could look like:
interface CssSelectors {
    String rootFrame();
    String sidePanel();
    // ...
}
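A hypothetical sketch of how that plays out with the interface above: the site-specific project supplies the implementation, and the generic test code never mentions a concrete website (class names and selector values are made up):

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Lives in the website-specific project.
class AcmeSelectors implements CssSelectors {
    public String rootFrame() { return "#root"; }
    public String sidePanel() { return ".side-panel"; }
}

// Lives in the generic project and only depends on the interface.
class SidePanelSupport {
    static boolean sidePanelPresent(WebDriver driver, CssSelectors selectors) {
        return !driver.findElements(By.cssSelector(selectors.sidePanel())).isEmpty();
    }
}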
If you are building this for tests you might also want to use features of your test framework (e.g. Rules in JUnit).
When reusing code in tests you also should consider another aspect:
If someone reads the tests written with your library, will she be able to understand enough of what is happening behind the boundary of the library to grasp what the test is all about? This is much more of a question with test code than with production code, because for the coverage and validity of tests it often matters a lot more how a setup or verification is done than it does in production code.
I am new to JUnit and I am finding it difficult to test the small project I'm working on. All the examples of JUnit testing seem to involve math, and what I wrote is a simple application that takes information via Scanner, creates various objects, and stores them in an ArrayList. One of my thoughts was to test whether the ArrayList was empty, but I obviously cannot access it from a separate class, and the standard seems to be to separate the test from the code. So I am not sure what to do.
You could provide a protected accessor for the array so that it can be used by the unit tests (which should be in the same package).
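For example (class and method names are hypothetical), a minimal sketch of such an accessor:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class Registry {
    private final List<String> entries = new ArrayList<>();

    public void add(String entry) {
        entries.add(entry);
    }

    // Reachable from a test in the same package, but not part of the public API.
    protected List<String> entries() {
        return Collections.unmodifiableList(entries);
    }
}

A test in the same package can then assert that registry.entries().isEmpty() before anything has been added.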
The typical well established practices are:
Unit test and production code go into different files, living in different projects but using the same package names.
Unit tests should not rely on internal state of production code. You don't want to write a test that needs to know about a field within the class under test. Because that means that your test can break when you change the production code to solve the problem differently.
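As a hypothetical sketch of the second point, applied to the Scanner-and-ArrayList example: rather than reaching into the list, test behaviour the class exposes anyway, or expose a small query method that is genuinely part of its job:

import java.util.ArrayList;
import java.util.List;

public class ItemStore {
    private final List<String> items = new ArrayList<>();

    public void add(String item) {
        items.add(item);
    }

    public int count() {   // observable behaviour, not internal state
        return items.size();
    }
}

A test only needs assertEquals(1, store.count()) after one add(...) call, and it keeps passing even if the ArrayList is later swapped for another data structure.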
The real answer here: you should share pieces of your code with us, to receive really helpful feedback.
In the TDD (Test-Driven Development) process, how do you deal with test data?
Assume a scenario: parse a log file to get the needed columns. For a solid test, how do I prepare the test data? And is it proper for me to put such files next to the test class files?
Maven, for example, uses a convention for folder structures that takes care of test data:
src
  main
    java       <-- java source files of main application
    resources  <-- resource files for application (logger config, etc)
  test
    java       <-- test suites and classes
    resources  <-- additional resources for testing
If you use Maven for building, you'll want to place the test resources in the right folder. If you're building with something different, you may still want to use this structure, as it is more than just a Maven convention; in my opinion it's close to best practice.
Another option is to mock out your data, eliminating any dependency on external sources. This way it's easy to test various data conditions without having to have multiple instances of external test data. I then generally use full-fledged integration tests for lightweight smoke testing.
Hard code them in the tests so that they are close to the tests that use them, making the test more readable.
Create the test data from a real log file. Write a list of the tests intended to be written, tackle them one by one and tick them off once they pass.
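A hypothetical sketch of the hard-coded approach with a line taken from a real log file (LogParser and its column() method are invented here purely to make the example self-contained):

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class LogParserTest {

    // The sample line sits right next to the assertion that uses it.
    private static final String LINE =
            "2015-06-01 12:00:03 INFO checkout completed in 132ms";

    @Test
    void extractsTheLogLevelColumn() {
        assertEquals("INFO", new LogParser().column(LINE, 2));
    }
}

// Minimal stand-in implementation so the sketch compiles on its own.
class LogParser {
    String column(String line, int index) {
        return line.trim().split("\\s+")[index];
    }
}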
getClass().getClassLoader().getResourceAsStream("....xml");
inside the test worked for me. But
getClass().getResourceAsStream("....xml");
didn't work.
Don't know why, but maybe it helps someone else.
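For anyone hitting the same thing: ClassLoader.getResourceAsStream resolves names from the classpath root, while Class.getResourceAsStream treats a name without a leading slash as relative to the class's own package. A small sketch (the file name is hypothetical):

import java.io.InputStream;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertNotNull;

class ResourceLoadingTest {

    @Test
    void loadsFromTheClasspathRoot() {
        // Found as src/test/resources/test-data.xml in a Maven-style layout.
        InputStream in = getClass().getClassLoader().getResourceAsStream("test-data.xml");
        assertNotNull(in);
    }

    @Test
    void classRelativeLookupNeedsALeadingSlash() {
        // Without the slash, the name would be resolved relative to this
        // class's package (e.g. com/example/test-data.xml) and could return null.
        InputStream in = getClass().getResourceAsStream("/test-data.xml");
        assertNotNull(in);
    }
}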
When my test data must be an external file - a situation I try to avoid, but can't always - I put it into a reserved test-data directory at the same level as my project, and use getClass().getClassLoader().getResourceAsStream(path) to read it. The test-data directory isn't a requirement, just a convenience. But try to avoid needing to do this; as #philippe points out, it's almost always nicer to have the values hard-coded in the tests, right where you can see them.
I've got 2 questions about organising Unit tests.
Do I have to put tests in the same package as the tested class, or can I organise tests in different packages?
For example if I have validity and other tests, is it correct to split them into different packages, even if they are for same class?
What about mock and stub classes? Shall I separate them from packages containing only tests, or put them together?
The way we do our JUnit test cases is to put them in the same package, but in a different root directory. Since we use Maven, we just use the standard locations making the structure similar to the following.
src/main/java/com/foo/Bar.java
src/test/java/com/foo/BarTest.java
Obviously there's more to the structure, but this lets us build the tests separately from the mainline code, but still access protected classes and the like. With respect to different types of tests, this is very subjective. When we started our testing effort (which unfortunately started after development), I tried to keep things pretty isolated. Unfortunately, it quickly became a nightmare when we got to the 500+ test case point. I've since tried to do more consolidation. This led to reduced amounts of code to maintain. As I said, though, it's very subjective.
As far as test-only code, we keep it in a separate com.foo.test package that resides only in the src/test/java tree.
I too tend to put my tests in the same package but under a different root directory. This allows me to test package-private classes, or to access package-private classes while testing something else in the package. The tests are kept in a separate directory tree so they can be excluded from the deployed result (in particular to ensure that test code doesn't accidentally get into production code). What matters most, however, is what works for your situation.
In terms of how many test classes per production class, the theory I've seen is that you write one test class per fixture, that is, per setup structure. In many cases that is the same (or close enough) as one test class per production class, but I have sometimes written more test classes (in particular, equality tests tend to be separated out) for a given production class, and occasionally one test class for a group of (related) production classes (say, for testing the Strategy pattern).
Mostly, I don't worry too much about the theory, but rework the tests as needed to keep duplication to an absolute minimum.
Keeping it the same package allows you to use package-private visibility for code that is intended to be accessed via the test only.
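A hypothetical two-file sketch of what that looks like with separate source roots (paths and names are made up):

// src/main/java/com/example/Tokenizer.java
package com.example;

class Tokenizer {   // package-private: invisible outside com.example
    String[] split(String line) {
        return line.split(",");
    }
}

// src/test/java/com/example/TokenizerTest.java
package com.example;

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class TokenizerTest {
    @Test
    void splitsOnCommas() {
        assertEquals(2, new Tokenizer().split("a,b").length);   // same package, so visible
    }
}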
Regarding using separate root directories, that is a good practice. It also has an advantage for us: since we use IDEA, the IDE recognizes that production code cannot reference test code.
In terms of keeping them separate, there is a great power in having one, and only one, test class per production class at the unit level. Of course, some classes get created in production as part of refactoring that have no test classes at all, and that is fine, but when you want to know what test tests a certain class, having a convention that says ClassNameTest is the tests for ClassName is very helpful.
TestNG is much friendlier to this paradigm than JUnit, though.
Test classes should rather be in different packages; it's easier to separate them from the production code when you package it for release. I usually keep lots of test fluff in those packages: all sorts of mocks, configurations, scenarios. But when you build, none of that gets included. In some situations it's a good idea to keep your testing stuff even in different projects. Depends.