If I have a long method which gathers data from two or three different sources and returns a result, how can I refactor it so that it is more unit-testable? This method is a web service, and I want to make one call from client code to gather all the data.
I can refactor some portions out into smaller methods which will be more testable. But the current method will still be calling those five methods and will remain hard to test. Assuming Java as the programming language, is there a pattern for making such code testable?
This is a very common testing problem, and the most common solution I come across for this is to separate the sourcing of data from the code which uses the data using dependency injection. This not only supports good testing, but is generally a good strategy when working with external data sources (good segregation of responsibilities, isolates the integration point, promotes code reuse being some reasons for this).
The changes you need to make go something like:
For each data source, create an interface to define how data from that source is accessed, and then factor out the code which returns the data into a separate class which implements this.
Dependency inject the data source into the class containing your 'long' function.
For unit testing, inject a mock implementation of each data source.
Here are some code examples showing what this would look like. Note that this code is merely illustrative of the pattern; you will need more sensible names for things. It is worth studying this pattern and learning more about dependency injection and mocking, two of the most powerful weapons in the unit tester's armory.
Data Sources
public interface DataSourceOne {
public Data getData();
}
public class DataSourceOneImpl implements DataSourceOne {
public Data getData() {
...
return data;
}
}
public interface DataSourceTwo {
public Data getData();
}
public class DataSourceTwoImpl implements DataSourceTwo {
public Data getData() {
...
return data;
}
}
Class with Long Method
public class ClassWithLongMethod {
private DataSourceOne dataSourceOne;
private DataSourceTwo dataSourceTwo;
public ClassWithLongMethod(DataSourceOne dataSourceOne,
DataSourceTwo dataSourceTwo) {
this.dataSourceOne = dataSourceOne;
this.dataSourceTwo = dataSourceTwo;
}
public Result longMethod() {
Data someData = dataSourceOne.getData();
Data someMoreData = dataSourceTwo.getData();
...
return result;
}
}
Unit Test
import org.junit.Test;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
public class ClassWithLongMethodTest {
@Test
public void testLongMethod() {
// Create mocked data sources which return the data required by your test
DataSourceOne dataSourceOne = mock(DataSourceOne.class);
when(dataSourceOne.getData()).thenReturn(...);
DataSourceTwo dataSourceTwo = mock(DataSourceTwo.class);
when(dataSourceTwo.getData()).thenReturn(...);
// Create the object under test using the mocked data sources
ClassWithLongMethod sut = new ClassWithLongMethod(dataSourceOne,
dataSourceTwo);
// Now you can unit test the long method in isolation from its dependencies
Result result = sut.longMethod();
// Assertions on result
...
}
}
Please forgive (and correct) any syntactic mistakes, I don't write much java these days.
The test for the "big" method will look like integration test where the smaller methods can be mocked.
If you can separate the "big" method into five isolated methods, then the "big" method could be further partitioned into semantically-/contextually-meaningful groups of the isolated methods.
Then you can mock the larger groupings of isolated methods for the "big" method.
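As a hedged sketch of that idea (all of the names below are hypothetical, not from the question): the five isolated methods are hidden behind two grouped collaborators, so the test of the big method only has to mock two dependencies.
// Hypothetical grouping of the smaller methods behind two collaborators.
public interface CustomerDataGathering {
    CustomerData gather(String customerId);   // would wrap, say, two of the five methods
}
public interface OrderDataGathering {
    OrderData gather(String customerId);      // would wrap the remaining three
}
public class BigMethodService {
    private final CustomerDataGathering customerData;
    private final OrderDataGathering orderData;
    public BigMethodService(CustomerDataGathering customerData, OrderDataGathering orderData) {
        this.customerData = customerData;
        this.orderData = orderData;
    }
    public Result bigMethod(String customerId) {
        // The "big" method now only combines the grouped results, so its unit test
        // mocks the two collaborators instead of stubbing five separate methods.
        return new Result(customerData.gather(customerId), orderData.gather(customerId));
    }
}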
I am working on a REST API where I have an interface that defines a list of methods which are implemented by 4 different classes, with the possibility of adding many more in the future.
When I receive an HTTP request from the client there is some information included in the URL which will determine which implementation needs to be used.
Within my controller, I would like to have the end-point method contain a switch statement that checks the URL path variable and then uses the appropriate implementation.
I know that I can define and inject the concrete implementations into the controller and then insert which one I would like to use in each particular case in the switch statement, but this doesn't seem very elegant or scalable for 2 reasons:
I now have to instantiate all of the services, even though I only need to use one.
The code seems like it could be much leaner, since I am literally calling the same method that is defined in the interface with the same parameters in every case. In this example that is not really an issue, but as the list of implementations grows, so does the number of cases and the amount of redundant code.
Is there a better solution to this type of situation? I am using Spring Boot 2 and JDK 10; ideally, I'd like to implement the most modern solution.
My Current Approach
@RequestMapping(Requests.MY_BASE_API_URL)
public class MyController {
//== FIELDS ==
private final ConcreteServiceImpl1 concreteService1;
private final ConcreteServiceImpl2 concreteService2;
private final ConcreteServiceImpl3 concreteService3;
//== CONSTRUCTORS ==
@Autowired
public MyController(ConcreteServiceImpl1 concreteService1, ConcreteServiceImpl2 concreteService2,
ConcreteServiceImpl3 concreteService3){
this.concreteService1 = concreteService1;
this.concreteService2 = concreteService2;
this.concreteService3 = concreteService3;
}
//== REQUEST MAPPINGS ==
@GetMapping(Requests.SPECIFIC_REQUEST)
public ResponseEntity<?> handleSpecificRequest(@PathVariable String source,
@RequestParam String start,
@RequestParam String end){
source = source.toLowerCase();
if(MyConstants.SOURCES.contains(source)){
switch(source){
case("value1"):
concreteService1.doSomething(start, end);
break;
case("value2"):
concreteService2.doSomething(start, end);
break;
case("value3"):
concreteService3.doSomething(start, end);
break;
}
}else{
//An invalid source path variable was received
}
//Return something after additional processing
return null;
}
}
In Spring you can get all implementations of an interface (say T) by injecting a List<T> or a Map<String, T> field. In the second case the names of the beans will become the keys of the map. You could consider this if there are a lot of possible implementations or if they change often. Thanks to it you could add or remove an implementation without changing the controller.
Injecting a List and injecting a Map each have some benefits and drawbacks in this case. If you inject a List you would probably need to add some method to map the name to the implementation. Something like:
interface MyInterface {
(...)
String name();
}
This way you could transform it to a Map<String, MyInterface>, for example using the Streams API (a sketch follows below). While this would be more explicit, it would pollute your interface a bit (why should it be aware that there are multiple implementations?).
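As a hedged sketch of that transformation (the registry class and its method names are illustrative, not part of the question), a small Spring bean could index the injected List by each implementation's name():
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;
import java.util.stream.Collectors;
import org.springframework.stereotype.Component;

@Component
class MyInterfaceRegistry {
    private final Map<String, MyInterface> byName;

    // Spring injects every MyInterface bean; index them by their self-reported name()
    MyInterfaceRegistry(List<MyInterface> implementations) {
        this.byName = implementations.stream()
                .collect(Collectors.toMap(MyInterface::name, Function.identity()));
    }

    MyInterface lookup(String name) {
        return Optional.ofNullable(byName.get(name))
                .orElseThrow(() -> new IllegalArgumentException("Unknown implementation: " + name));
    }
}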
When using the Map you should probably name the beans explicitly, or even introduce an annotation, to follow the principle of least astonishment. If you name the beans after the class name or the method name of the configuration class, you could break the app by renaming those (and in effect changing the URL), even though renaming is usually a safe operation.
A simplistic implementation in Spring Boot could look like this:
@SpringBootApplication
public class DynamicDependencyInjectionForMultipleImplementationsApplication {
public static void main(String[] args) {
SpringApplication.run(DynamicDependencyInjectionForMultipleImplementationsApplication.class, args);
}
interface MyInterface {
Object getStuff();
}
class Implementation1 implements MyInterface {
@Override public Object getStuff() {
return "foo";
}
}
class Implementation2 implements MyInterface {
@Override public Object getStuff() {
return "bar";
}
}
@Configuration
class Config {
#Bean("getFoo")
Implementation1 implementation1() {
return new Implementation1();
}
#Bean("getBar")
Implementation2 implementation2() {
return new Implementation2();
}
}
@RestController
class Controller {
private final Map<String, MyInterface> implementations;
Controller(Map<String, MyInterface> implementations) {
this.implementations = implementations;
}
#GetMapping("/run/{beanName}")
Object runSelectedImplementation(#PathVariable String beanName) {
return Optional.ofNullable(implementations.get(beanName))
.orElseThrow(UnknownImplementation::new)
.getStuff();
}
@ResponseStatus(BAD_REQUEST)
class UnknownImplementation extends RuntimeException {
}
}
}
It passes the following tests:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class DynamicDependencyInjectionForMultipleImplementationsApplicationTests {
@Autowired
private MockMvc mockMvc;
@Test
public void shouldCallImplementation1() throws Exception {
mockMvc.perform(get("/run/getFoo"))
.andExpect(status().isOk())
.andExpect(content().string(containsString("foo")));
}
@Test
public void shouldCallImplementation2() throws Exception {
mockMvc.perform(get("/run/getBar"))
.andExpect(status().isOk())
.andExpect(content().string(containsString("bar")));
}
@Test
public void shouldRejectUnknownImplementations() throws Exception {
mockMvc.perform(get("/run/getSomethingElse"))
.andExpect(status().isBadRequest());
}
}
Regarding two of your doubts:
1. Instantiating the service objects should not be an issue, as this is a one-time job and the controller will need them to serve all types of requests.
2. You can use exact path mappings to get rid of the switch case. For example:
#GetMapping("/specificRequest/value1")
#GetMapping("/specificRequest/value2")
#GetMapping("/specificRequest/value3")
Each of the above mappings goes on a separate method that deals with its specific source value and invokes the respective service method, as in the sketch below.
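A hedged sketch of what those separate handler methods might look like (the response handling is assumed here, since the original code discards the doSomething result):
@GetMapping("/specificRequest/value1")
public ResponseEntity<?> handleValue1(@RequestParam String start, @RequestParam String end) {
    concreteService1.doSomething(start, end);
    return ResponseEntity.ok().build();   // assumed response; adapt to your additional processing
}

@GetMapping("/specificRequest/value2")
public ResponseEntity<?> handleValue2(@RequestParam String start, @RequestParam String end) {
    concreteService2.doSomething(start, end);
    return ResponseEntity.ok().build();
}

@GetMapping("/specificRequest/value3")
public ResponseEntity<?> handleValue3(@RequestParam String start, @RequestParam String end) {
    concreteService3.doSomething(start, end);
    return ResponseEntity.ok().build();
}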
Hopefully this helps make the code cleaner and more elegant.
There is one more option: separating this at the service layer and having only one endpoint serve all source types. But since, as you said, there is a different implementation for each source value, the source is really a resource of your application, and having a separate URI/separate method makes perfect sense here. A few advantages I see with this approach are:
Makes it easy to write the test cases.
Scaling one source without impacting any other source/service.
Your code treats each source as an entity separate from the other sources.
The above approach should be fine when you have a limited set of source values. If you have no control over the source values, then a further redesign is needed, for example differentiating sources by an additional value such as a sourceType and then having a separate controller for each group of sources.
I have a Java class like the following:
public class MyClass {
/** Database Connection. */
private DbConnection dbCon;
public MyClass() {
dbCon = ...
}
public void doSomethingWith(MyData data) {
data = convertData(data);
dbCon.storeData(data);
}
private MyData convertData(MyData data) {
// Some complex logic...
return data;
}
}
Since the true logic of this class lies in the convertData() method, I want to write a unit test for that method.
So I read this post
How do I test a private function or a class that has private methods, fields or inner classes?
where a lot of people say that the need to test a private method is a design smell. How can it be done better?
I see 2 approaches:
Extract the convertData() method into some utility class with a public API. But I think this would also be bad practice, since such utility classes violate the single responsibility principle, unless I create a lot of utility classes with maybe only one or two methods.
Write a second constructor that allows injection of the dbCon, which lets me inject a mocked version of the database connection and run my test against the public doSomethingWith() method. This would be my preferred approach, but there are also discussions about whether the need for mocking is itself a code smell.
Are there any best practices regarding this problem?
Extract the convertData() method into some utility class with a public api. But I think this would be also bad practice since such utility classes will violate the single responsibility principle, unless I create a lot of utility classes with maybe only one or two methods.
Your interpretation of this is wrong. That is exactly what the SRP and SoC (Separation of Concerns) suggest:
public interface MyDataConverter {
MyData convertData(MyData data);
}
public class MyDataConverterImplementation implements MyDataConverter {
public MyData convertData(MyData data) {
// Some complex logic...
return data;
}
}
The convertData implementation can now be tested in isolation, independent of MyClass.
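A minimal test sketch for the extracted converter (assuming MyData can be constructed directly; the real assertions depend on your conversion rules):
import static org.junit.Assert.assertNotNull;

import org.junit.Test;

public class MyDataConverterImplementationTest {
    private final MyDataConverter converter = new MyDataConverterImplementation();

    @Test
    public void convertsDataWithoutNeedingADatabase() {
        MyData input = new MyData();              // assumed constructor, for illustration only
        MyData converted = converter.convertData(input);
        assertNotNull(converted);
        // assert the actual conversion rules here
    }
}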
Write a second constructor that allows injection of the dbCon, which lets me inject a mocked version of the database connection and run my test against the public doSomethingWith() method. This would be my preferred approach, but there are also discussions about whether the need for mocking is itself a code smell.
Wrong again. Research the Explicit Dependencies Principle.
public class MyClass {
private DbConnection dbCon;
private MyDataConverter converter;
public MyClass(DbConnection dbCon, MyDataConverter converter) {
this.dbCon = dbCon;
this.converter = converter;
}
public void doSomethingWith(MyData data) {
data = converter.convertData(data);
dbCon.storeData(data);
}
}
MyClass is now more honest about what it needs to perform its desired function.
It can also be unit tested in isolation with the injection of mocked dependencies.
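For example, a hedged sketch of such an isolated test, with both collaborators mocked via Mockito (again assuming MyData has a usable constructor):
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class MyClassTest {
    @Test
    public void storesTheConvertedData() {
        DbConnection dbCon = mock(DbConnection.class);
        MyDataConverter converter = mock(MyDataConverter.class);
        MyData raw = new MyData();                // assumed constructor, for illustration only
        MyData converted = new MyData();
        when(converter.convertData(raw)).thenReturn(converted);

        new MyClass(dbCon, converter).doSomethingWith(raw);

        // MyClass should hand the converted data, not the raw input, to the connection
        verify(dbCon).storeData(converted);
    }
}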
My question is about testing a class that implements many interfaces. For example, I have this class:
public class ServiceControllerImpl extends ServiceController implements IDataChanged, IEventChanged {
}
Now there are two ways for testing. The first is testing directly on the concrete class. That means the object type is the concrete class rather than the interface.
public class ServiceControllerImplTest {
ServiceControllerImpl instance;
@Before
public void setUp() {
instance = new ServiceControllerImpl();
// you can bring this instance anywhere
}
}
The second way is testing on the interface only. We must typecast this object to all interfaces it implements.
public class ServiceControllerImplTest {
ServiceController instance; // use interface here
IDataChanged dataChangeListener;
@Before
public void setUp() {
instance = new ServiceControllerImpl();
dataChangeListener = (IDataChanged) instance;
// instance and dataChangeListener "look like" two different object.
}
}
I prefer the second solution because in the future we might change which class implements the interface, so using the concrete class might lead to failing tests later. I don't know the best practice for this problem.
Thanks :)
I prefer the second solution because in the future we might change which class implements the interface, so using the concrete class might lead to failing tests later.
I guess it will lead to failing tests anyway, because you usually test that assertions are true or false. The question is: do those tests apply to any IDataChanged, or do these assertions only apply to the ServiceControllerImpl?
If the assertions only apply to the ServiceControllerImpl, it doesn't matter whether you use an IDataChanged instead of a ServiceControllerImpl, because you must edit the test anyway when you use another IDataChanged object (different assertions). The test will fail if you use another object.
The way you set up unit tests itself gives you an answer. A unit test usually tests one class in isolation. This means that you mock the environment. But mocking the environment means that you know the dependencies of the class you test, and these are implementation details. So your test is written against an implementation rather than only against the interface.
It's possible to write tests that only exercise an abstract API, like an interface, but this usually means that your tests are abstract too. E.g.
public abstract class SetTest {
@Test
public void addAlreadyExistentObject(){
Set<String> setUnderTest = createSetUnderTest();
Assert.assertTrue(setUnderTest.isEmpty());
boolean setChanged = setUnderTest.add("Hello");
Assert.assertTrue(setChanged);
setChanged = setUnderTest.add("Hello");
Assert.assertFalse(setChanged);
Assert.assertEquals(setUnderTest.size(), 1);
}
protected abstract Set<String> createSetUnderTest();
}
You can then extend these abstract tests to test the API for concrete classes. E.g.
public class HashSetTest extends SetTest {
@Override
protected Set<String> createSetUnderTest() {
return new HashSet<String>();
}
}
In this case you can replace the implementation and the test must remain green.
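For example, the same abstract test can be reused unchanged for another Set implementation, and it should stay green:
import java.util.LinkedHashSet;
import java.util.Set;

public class LinkedHashSetTest extends SetTest {
    @Override
    protected Set<String> createSetUnderTest() {
        return new LinkedHashSet<String>();
    }
}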
But here is another example of an abstract API where replacing the object under test does not really make sense.
What about writing a test for all Runnables?
public class RunnableTest {
@Test
public void run(){
Runnable runnable = ...;
// What to test here?
// run is invoked without throwing any runtime exceptions?
runnable.run();
}
}
As you can see it does not make sense in some cases to write tests in a way so that you can easily replace the object under test.
If an API like the Set API defines concrete state handling, you can write abstract tests that test this.
JayC667 already correctly answered that it's best to refer to a class through its supertype(s) in tests of methods defined by those types. But I'd change the way you did that a bit to avoid casting:
public class ServiceControllerImplTest {
ServiceController controller;
IDataChanged dataChangeListener;
@Before
public void setUp() {
ServiceControllerImpl instance = new ServiceControllerImpl();
controller = instance;
dataChangeListener = instance;
}
}
An example of scenario I'm trying to unit test somehow:
A method gets three parameters: Company name, Plane Number 1, Plane Number 2
public Double getDistanceBetweenPlanes(String company, String plane1, String plane2)
{
// - apply some validation logic to check that the plane numbers and company really exist
// - use complex geoLocation and satellite data to get each plane's location
return plane1.geoLocation() - plane2.geoLocation();
}
Now, I want to unit test this.
I do not have real plane numbers, and I do not have a live connection to the satellite.
Still I would like to be able to mock the whole thing somehow so that I can test this API.
Any suggestion on how to refactor it? Any new public method to be created just for testing?
Usually, when refactoring untested code, I fall back on what I would do when practicing TDD and BDD.
So first I would write a simple contract, one line = one requirement (of the existing code).
Then, while coding the test, I would probably notice things I don't want to test. In this case, DI will help you move code that has a different responsibility into another class. When working on an object design it is important to focus on behavior and on the interaction between objects with different concerns. Mockito, with its alias class BDDMockito, can help you favor this approach.
In your case, here's an example of what you can do.
public class PlaneLocator {
private CompanyProvider companyProvider;
private PlaneProvider planeProvider;
private LocationUtil locationUtil;
// Constructor injection or property injection
public Double getDistanceBetweenPlanes(String companyId, String plane1Id, String plane2Id) {
Company company = companyProvider.get(companyId);
Plane plane1 = planeProvider.get(plane1Id);
Plane plane2 = planeProvider.get(plane2Id);
return locationUtil.distance(plane1.geoLocation(), plane2.geoLocation());
}
}
A simple JUnit test might look like this (using features from Mockito 1.9.0):
@RunWith(MockitoJUnitRunner.class)
public class PlaneLocatorTest {
@Mock CompanyProvider companyProvider;
@Mock PlaneProvider planeProvider;
@Mock LocationUtil locationUtil;
@InjectMocks PlaneLocator testedPlaneLocator;
@Test public void should_use_LocationUtil_to_get_distance_between_plane_location() {
// given
given(companyProvider.get(anyString())).willReturn(new Company());
given(planeProvider.get(anyString()))
.willReturn(Plane.builder().withLocation(new Location("A")).build())
.willReturn(Plane.builder().withLocation(new Location("B")).build());
// when
testedPlaneLocator.getDistanceBetweenPlanes("AF", "1251", "721");
// then
verify(locationUtil).distance(isA(Location.class), isA(Location.class));
}
// other more specific test on PlaneLocator
}
And you will have the following dependencies injected, each with its own unit test describing how the class behaves given its input and collaborators:
public class DefaultCompanyProvider implements CompanyProvider {
public Company get(String companyId) {
companyIdValidator.validate(companyId);
// retrieve / create the company
return company;
}
}
public class SatellitePlaneProvider implements PlaneProvider {
public Plane get(String planeId) {
planeIdValidator.validate(planeId);
// retrieve / create the plane with costly satellite data (using a SatelliteService for example)
return plane;
}
}
This way you can refactor your code toward much lower coupling. The better separation of concerns allows a better understanding of the code base, better maintainability, and easier evolution.
I would also like to point you to further reading in the blog post "The Transformation Priority Premise" from Uncle Bob Martin: http://cleancoder.posterous.com/the-transformation-priority-premise
You may use mocking libraries like JMock or EasyMock.
With JMock you'll write something like this:
@Test
public void testMethod(){
Mockery context = new Mockery();
final Plane plane1 = context.mock(Plane.class, "plane1");
final Plane plane2 = context.mock(Plane.class, "plane2");
context.checking(new Expectations(){{
oneOf(plane1).geoLocation(); will(returnValue(integerNumber1));
oneOf(plane2).geoLocation(); will(returnValue(integerNumber2));
}});
assertEquals(
instanceUnderTest.method(plane1,plane2),
integerNumber1-integerNumber2
);
context.assertIsSatisfied();
}
The assertIsSatisfied() call ensures that the methods set in the expectations were actually invoked. If not, an exception is raised and the test fails.
I'm new to using Mockito and am trying to understand a way to make a unit test of a class that relies on injected dependencies. What I want to do is to create mock objects of the dependencies and make the class that I am testing use those instead of the regular injected dependencies that would be injected by Spring. I have been reading tutorials but am a bit confused on how to do this.
I have the class I want to test, like this:
package org.rd.server.beans;
import org.springframework.beans.factory.annotation.Autowired;
public class TestBean1 {
@Autowired
private SubBean1 subBean1;
private String helloString;
public String testReturn () {
subBean1.setSomething("its working");
String something = subBean1.getSomething();
helloString = "Hello...... " + something;
return helloString;
}
}
Then I have the class that I want to use as a mock object (rather than the regular SubBean1 class), like below:
package org.rd.server.beans.mock;
public class SubBean1Mock {
private String something;
public String getSomething() {
return something;
}
public void setSomething(String something) {
this.something = something;
}
}
I just want to try running a simple test like this:
package test.rd.beans;
import org.rd.server.beans.TestBean1;
import junit.framework.*;
public class TestBean1Test extends TestCase
{
private TestBean1 testBean1;
public TestBean1Test(String name)
{
super(name);
}
public void setUp()
{
testBean1 = new TestBean1();
// Somehow inject the mock dependency SubBean1Mock ???
}
public void test1() {
assertEquals(testBean1.testReturn(),"working");
}
}
I figure there must be some fairly simple way to do this but I can't seem to understand the tutorials as I don't have the context yet to understand everything they are doing / explaining. If anyone could shed some light on this I would appreciate it.
If you're using Mockito you create mocks by calling Mockito's static mock method. You can then just pass in the mock to the class you're trying to test. Your setup method would look something like this:
testBean1 = new TestBean1();
SubBean1 subBeanMock = mock(SubBean1.class);
testBean1.setSubBean(subBeanMock);
You can then add the appropriate behavior to your mock objects for whatever you're trying to test with Mockito's static when method, for example:
when(subBeanMock.getSomething()).thenReturn("its working");
In Mockito you aren't really going to create new "mock" implementations, but rather you are going to mock out the methods on the interface of the injected dependency by telling Mockito what to return when the method is called.
I wrote a test of a Spring MVC controller using Mockito and treated it just like any other Java class. I was able to mock out the various other Spring beans I had and inject those using Spring's ReflectionTestUtils to pass in the Mockito-based values. I wrote about it in my blog back in February. It has the full source for the test class and most of the source from the controller, so it's probably too long to put the contents here.
http://digitaljoel.nerd-herders.com/2011/02/05/mock-testing-spring-mvc-controller/
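For completeness, here is a minimal sketch of that idea applied to the TestBean1 above (assuming spring-test and Mockito are on the classpath, and that SubBean1 is importable from its package):
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Before;
import org.junit.Test;
import org.rd.server.beans.TestBean1;
import org.springframework.test.util.ReflectionTestUtils;

public class TestBean1MockitoTest {
    private TestBean1 testBean1;
    private SubBean1 subBean1;

    @Before
    public void setUp() {
        testBean1 = new TestBean1();
        subBean1 = mock(SubBean1.class);
        // Inject the mock into the private @Autowired field by its field name
        ReflectionTestUtils.setField(testBean1, "subBean1", subBean1);
    }

    @Test
    public void testReturnUsesTheMockedSubBean() {
        when(subBean1.getSomething()).thenReturn("its working");
        assertEquals("Hello...... its working", testBean1.testReturn());
    }
}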
I stumbled on this thread while trying to set up some mocks for a slightly more complicated situation and figured I'd share my results for posterity.
My situation was similar in that I needed to mock dependencies, but I also wanted to mock some of the methods on the class I was testing. This was the solution:
@MockBean
DependentService mockDependentService;
ControllerToTest controllerToTest;
@BeforeEach
public void setup() {
mockDependentService = mock(DependentService.class);
controllerToTest = mock(ControllerToTest.class);
ReflectionTestUtils.setField(controllerToTest, "dependantService", mockDependentService);
}
@Test
void test() {
//set up test and other mocks
//be sure to implement the below code that will call the real method that you are wanting to test
when(controllerToTest.methodToTest()).thenCallRealMethod();
//assertions
}
Note that "dependantService" needs to match whatever you have named the instance of the service on your controller. If that doesn't match the reflection will not find it and inject the mock for you.
This approach mocks all the methods on the controller by default; you then specifically call out the method for which you want the real implementation. Finally, reflection is used to set any needed dependencies to the respective mock objects.
Hope this helps someone down the road as it stumped me for a while.