I am using the Akka framework with its Java API, and Mockito + TestKit for unit testing the actor.
Here is the actor:
public class K8sDeploymentCreator extends AbstractActor {
private final LoggingAdapter log = Logging.getLogger(getContext().getSystem(), this);
@Override
public Receive createReceive() {
return receiveBuilder().match(createK8sDeployment.class, msg -> {
KubeNamespace kubenamespace = new KubeNamespace();
KubeDeployment kubeDeployment = new KubeDeployment();
Namespace namespace = kubenamespace.createNamespace(msg.kubeClient, msg.service);
Deployment deployment = kubeDeployment.createDeployment(msg.service, msg.kubeClient, namespace);
log.info("sending complete depl msg");
getSender().tell(new K8sDeploymentComplete(deployment), getSelf());
})
.matchAny(o -> log.info("received unknown message")).build();
}
}
And here is the test class:
public class K8sDeploymentCreatorTest extends JUnitSuite {
static ActorSystem system;
@Before
public void setup() {
system = ActorSystem.create();
KubeDeployment mockKubeDeployment = mock(KubeDeployment.class);
KubeNamespace mockKubeNamespace = mock(KubeNamespace.class);
Deployment deployment = Mockito.mock(Deployment.class);
Namespace namespace = Mockito.mock(Namespace.class);
KubernetesClient kubeClient = Mockito.mock(KubernetesClient.class);
Service serviceTodeploy = new Service("group","artifact","version");
DeployEnvironment deployEnvironment = new DeployEnvironment();
deployEnvironment.setName("K8sDeploymentCreatorTest");
serviceTodeploy.setDeployEnvironment(deployEnvironment);
when(mockKubeNamespace.createNamespace(kubeClient, serviceTodeploy)).thenReturn(namespace);
when(mockKubeDeployment.createDeployment(serviceTodeploy, kubeClient, namespace)).thenReturn(deployment);
}
@AfterClass
public static void teardown() {
TestKit.shutdownActorSystem(system);
system = null;
}
@Test
public void testK8sDeployment() {
new TestKit(system) {
{
final Props props = Props.create(K8sDeploymentCreator.class);
final ActorRef underTest = system.actorOf(props);
KubeDeployment mockKubeDeployment = mock(KubeDeployment.class);
KubeNamespace mockKubeNamespace = mock(KubeNamespace.class);
Deployment deployment = Mockito.mock(Deployment.class);
Namespace namespace = Mockito.mock(Namespace.class);
KubernetesClient kubeClient = Mockito.mock(KubernetesClient.class);
DeployEnvironment deployEnvironment = new DeployEnvironment();
deployEnvironment.setName("K8sDeploymentCreatorTest");
Service serviceTodeploy = new Service("group","artifact","version");
serviceTodeploy.setDeployEnvironment(deployEnvironment);
createK8sDeployment msg = new createK8sDeployment(serviceTodeploy, kubeClient);
underTest.tell(msg, getRef());
expectMsgClass(K8sDeploymentComplete.class);
}
};
}
}
This fails with a NullPointerException while trying to execute code inside createNamespace(). Since this method has been mocked, shouldn't it skip the execution and just return whatever the when statement says it should return?
Is this because I am instantiating new objects of KubeNamespace and KubeDeployment, whereas the intent is for them to be mocks?
You are not actually mocking anything in your test. You are creating mock objects, but they are never injected into the code under test. Your actor executes the following code in response to a message:
KubeNamespace kubeNamespace = new KubeNamespace();
KubeDeployment kubeDeployment = new KubeDeployment();
This creates new un-mocked objects which will run their course as coded -- and often result in NPEs since they don't have the external dependencies they rely upon.
If you want to mock objects that are created this way, you either have to refactor your code to extract their creation into a mockable factory class, or use a more invasive mocking library such as PowerMock or jMockit.
Example of a factory mock:
class KubeFactory {
public KubeNamespace makeNamespace() {
return new KubeNamespace();
}
public KubeDeployment makeDeployment() {
return new KubeDeployment();
}
}
public class K8sDeploymentCreator extends AbstractActor {
private final KubeFactory factory;
K8sDeploymentCreator() {
this(new KubeFactory());
}
// This constructor allows you to override the factory used for testing
K8sDeploymentCreator(KubeFactory factory) {
this.factory = factory;
}
@Override
public Receive createReceive() {
return receiveBuilder().match(createK8sDeployment.class, msg -> {
KubeNamespace kubenamespace = factory.makeNamespace();
KubeDeployment kubeDeployment = factory.makeDeployment();
// rest is as before...
}).build();
}
}
Then in your test class you create a test KubeFactory which returns mocked instances for the classes you are testing with:
@Test
public void testK8sDeployment() {
new TestKit(system) {
{
final KubeFactory mockFactory = mock(KubeFactory.class);
when(mockFactory.makeNamespace()).thenReturn(mockKubeNamespace);
when(mockFactory.makeDeployment()).thenReturn(mockKubeDeployment);
final Props props = Props.create(K8sDeploymentCreator.class, mockFactory);
final ActorRef underTest = system.actorOf(props);
// and so on...
}
};
}
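Putting it together, the body of the init block could look roughly like the sketch below. This is only a sketch: it reuses the message and domain classes from the question (createK8sDeployment, K8sDeploymentComplete, Service, KubernetesClient, and so on), uses Mockito argument matchers so the stubs do not depend on the exact Service instance, and asserts on the reply type with TestKit's expectMsgClass.
// Hedged sketch; the domain classes are assumed from the question
KubeNamespace mockKubeNamespace = mock(KubeNamespace.class);
KubeDeployment mockKubeDeployment = mock(KubeDeployment.class);
Namespace namespace = mock(Namespace.class);
Deployment deployment = mock(Deployment.class);
KubernetesClient kubeClient = mock(KubernetesClient.class);
KubeFactory mockFactory = mock(KubeFactory.class);
when(mockFactory.makeNamespace()).thenReturn(mockKubeNamespace);
when(mockFactory.makeDeployment()).thenReturn(mockKubeDeployment);
// Matchers keep the stubs independent of the Service instance built below
when(mockKubeNamespace.createNamespace(any(), any())).thenReturn(namespace);
when(mockKubeDeployment.createDeployment(any(), any(), any())).thenReturn(deployment);
ActorRef underTest = system.actorOf(Props.create(K8sDeploymentCreator.class, mockFactory));
Service serviceTodeploy = new Service("group", "artifact", "version");
underTest.tell(new createK8sDeployment(serviceTodeploy, kubeClient), getRef());
// The Deployment carried by the reply is the mocked instance stubbed above
expectMsgClass(K8sDeploymentComplete.class);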
I have created a messaging component which will be called by another service for consuming and sending messages from Kafka. The producer part is working fine, but I am not sure what is wrong with the consumer listener below: it does not print messages, and in debug mode control never enters the @KafkaListener method. Yet the GUI-based Kafka Manager app shows that the offset gets committed, even though it is a manual offset commit.
Here is my message listener class code; I have checked that the topic and group id are set and fetched properly.
@Component
public class SpringKafkaMessageListner {
public CountDownLatch latch = new CountDownLatch(1);
@KafkaListener(topics = "#{consumerFactory.getConfigurationProperties().get(\"topic-name\")}",
groupId = "#{consumerFactory.getConfigurationProperties().get(\"group.id\")}",
containerFactory = "springKafkaListenerContainerFactory")
public void listen(ConsumerRecord<?, ?> consumerRecord, Acknowledgment ack) {
System.out.println("listening...");
System.out.println("Received Message in group : "
+ " and message: " + consumerRecord.value());
System.out.println("current offsetId : " + consumerRecord.offset());
ack.acknowledge();
latch.countDown();
}
}
Consumer config class:
@Configuration
@EnableKafka
public class KafkaConsumerBeanConfig<T> {
@Autowired
@Lazy
private KafkaConsumerConfigDTO kafkaConsumerConfigDTO;
@Bean
public ConsumerFactory<Object, T> consumerFactory() {
return new DefaultKafkaConsumerFactory<>(kafkaConsumerConfigDTO.getConfigs());
}
//for spring kafka with manual offset commit
@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<Object, T>>
springKafkaListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<Object, T> factory =
new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory());
//manual commit
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
@Bean
SpringKafkaMessageListner consumerListner(){
return new SpringKafkaMessageListner();
}
}
The code snippet below is the consumer interface implementation which exposes the subscribe() method; all other bean creation is done through the ConfigurableApplicationContext.
public class SpringKafkaConsumer<T> implements Consumer<T> {
private ConsumerConfig<T> consumerConfig;
private ConfigurableApplicationContext context;
private ConsumerFactory<Object, T> consumerFactory;
private ConcurrentKafkaListenerContainerFactory<Object, T> springKafkaContainer;
private SpringKafkaMessageListner consumerListner;
public SpringKafkaConsumer(ConsumerConfig<T> consumerConfig,
ConfigurableApplicationContext context) {
this.consumerConfig = consumerConfig;
this.context = context;
this.consumerFactory = context.getBean("consumerFactory", ConsumerFactory.class);
this.springKafkaContainer = context.getBean("springKafkaListenerContainerFactory",
ConcurrentKafkaListenerContainerFactory.class);
}
// here it is just simple code to initialize the SpringKafkaMessageListner class and invoke the listening part
@Override
public void subscribe() {
consumerListner = context.getBean("consumerListner", SpringKafkaMessageListner.class);
try {
consumerListner.latch.await(30, TimeUnit.SECONDS);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
Test class with my local Docker Kafka setup:
@RunWith(SpringRunner.class)
@DirtiesContext
@ContextConfiguration(classes = QueueManagerSpringConfig.class)
public class SpringKafkaTest extends AbstractJUnit4SpringContextTests {
@Autowired
private QueueManager queueManager;
private Consumer<KafkaMessage> consumer;
// test method
@Test
public void testSubscribeWithLocalBroker() {
String topicName = "topic1";
String brokerServer = "127.0.0.1:9092";
String groupId = "grp1";
Map<String, String> additionalProp = new HashMap<>();
additionalProp.put(KafkaConsumerConfig.GROUP_ID, groupId);
additionalProp.put(KafkaConsumerConfig.AUTO_COMMIT, "false");
additionalProp.put(KafkaConsumerConfig.AUTO_COMMIT_INTERVAL, "100");
ConsumerConfig<KafkaMessage> consumerConfig =
new ConsumerConfig.Builder<>(topicName, new KafkaSuccessMessageHandler(new
KafkaMessageSerializerTest()),
new KafkaMessageDeserializerTest())
.additionalProperties(additionalProp)
.enableSpringKafka(true)
.offsetPositionStrategy(new EarliestPositionStrategy())
.build();
consumer = queueManager.getConsumer(consumerConfig);
System.out.println("start subscriber");
// calling the subscribe method of the consumer, which will invoke the Kafka listener
consumer.subscribe();
}
}
@Configuration
public class QueueManagerSpringConfig {
@Bean
public QueueManager queueManager() {
Map<String, String> kafkaProperties = new HashMap<>();
kafkaProperties.put(KafkaPropertyNamespace.NS_PREFIX +
KafkaPropertyNamespace.BOOTSTRAP_SERVERS,
"127.0.0.1:9092");
return QueueManagerFactory.getInstance(new KafkaPropertyNamespace(kafkaProperties));
}
}
I am using the Akka framework for my use case, where I created one SupervisorActor and two child actors. In parallel to that I have a token service which needs to update my cache before the token expires. Please find the code:
public class TokenCacheService {
final Logger logger = LoggerFactory.getLogger(TokenCacheService.class);
private static final String KEY = "USER_TOKEN";
private LoadingCache<String, String> tokenCache;
private final ScheduledExecutorService cacheScheduler;
ThreadFactory threadFactory = new ThreadFactoryBuilder()
.setNameFormat("MyCacheRefresher-pool-%d").setDaemon(true)
.build();
public TokenCacheService(CacheConfig cacheConfig) {
cacheScheduler = Executors.newSingleThreadScheduledExecutor(threadFactory);
buildCache(cacheConfig);
}
public String getToken() {
String token = StringUtils.EMPTY;
try {
token = tokenCache.get(KEY);
} catch (ExecutionException ex) {
logger.debug("unable to process get token...");
}
return token;
}
private void buildCache(CacheConfig cacheConfig) {
tokenCache = CacheBuilder.newBuilder()
.refreshAfterWrite(4, TimeUnit.HOURS)
.expireAfterWrite(5, TimeUnit.HOURS)
.maximumSize(2)
.build(new CacheLoader<String, String>() {
@Override
@ParametersAreNonnullByDefault
public String load(String queryKey) {
logger.debug("cache load()");
return <token method call which returns token>;
}
@Override
@ParametersAreNonnullByDefault
public ListenableFutureTask<String> reload(final String key, String prevToken) {
logger.debug("cache reload()");
ListenableFutureTask<String> task = ListenableFutureTask.create(() -> <token method call which returns token>);
cacheScheduler.execute(task);
return task;
}
});
cacheScheduler.scheduleWithFixedDelay(() -> tokenCache.refresh(KEY), 0,
4, TimeUnit.HOURS);
}
}
It is working fine with this test class:
public static void main(String[] args) throws InterruptedException {
TokenCacheService tokenCacheService = new TokenCacheService();
while(true){
System.out.println(tokenCacheService.getToken());
Thread.sleep(180000);
}
}
The above method prints the expected logs, including the reload after 4 hours. But when I run the same code within my actual application (with the Akka actors), I only see the first "cache load()" log; it never prints any further logs for reloading the cache.
Please suggest what I am doing wrong here.
I tweaked the code a little bit by setting the thread priority to 1 and replacing scheduleWithFixedDelay with scheduleAtFixedRate:
ThreadFactory threadFactory = new ThreadFactoryBuilder()
.setNameFormat("MyCacheRefresher-pool-%d")
.setPriority(1)
.build();
public TokenCacheService(CacheConfig cacheConfig) {
idsTokenApplication = new IdsTokenApplication();
cacheScheduler = Executors.newSingleThreadScheduledExecutor(threadFactory);
buildCache(cacheConfig);
}
cacheScheduler.scheduleAtFixedRate(() -> tokenCache.refresh(KEY), 0,
cacheConfig.getReloadCache(), TimeUnit.valueOf(cacheConfig.getReloadCacheTimeUnit()));
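For reference, here is a minimal self-contained sketch of the same Guava refresh wiring with explicit TimeUnit values. TokenCacheSketch is just an illustrative name, and the tokenSupplier parameter is a hypothetical stand-in for the elided token call; adjust both to your actual code.
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;
import com.google.common.util.concurrent.ListenableFutureTask;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

public class TokenCacheSketch {

    private static final String KEY = "USER_TOKEN";

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final LoadingCache<String, String> cache;

    public TokenCacheSketch(Supplier<String> tokenSupplier) { // hypothetical token source
        cache = CacheBuilder.newBuilder()
                .refreshAfterWrite(4, TimeUnit.HOURS)
                .expireAfterWrite(5, TimeUnit.HOURS)
                .build(new CacheLoader<String, String>() {
                    @Override
                    public String load(String key) {
                        return tokenSupplier.get();
                    }

                    @Override
                    public ListenableFutureTask<String> reload(String key, String oldValue) {
                        // reload asynchronously on the scheduler, as in the question
                        ListenableFutureTask<String> task = ListenableFutureTask.create(tokenSupplier::get);
                        scheduler.execute(task);
                        return task;
                    }
                });
        // proactively refresh so reload() runs even when nothing reads the cache
        scheduler.scheduleAtFixedRate(() -> cache.refresh(KEY), 0, 4, TimeUnit.HOURS);
    }

    public String token() throws Exception {
        return cache.get(KEY);
    }
}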
I am new to JUnit 5. There are two functions in the class under test: the first function calls the second function, and the second function returns a value which is used in the first function for processing.
So I have created a mock for this class, but I am not able to mock the second function call when testing the first function.
First function: exportOpportunityListing()
Second function: entityToCsvReport()
public class OpportunityReportServiceImpl extends BaseService implements OpportunityReportService {
@Value("${nfs.mountPath}")
private String fileMountPath;
@Value("${take1.url.host}")
private String take1HostURL;
@Autowired
UsersRepository usersRepository;
@Autowired
MailUtil mailUtil;
@Autowired
OpportunityJDBCRepository ojdbc;
@Override
@Async
public void exportOpportunityListing(Map<String, Object> paramMap, List<OpportunityCriteria> lfvo,
String xRemoteUser) {
try {
List<OpportunityJDBCDTO> lo = ojdbc.getOppListWithoutPagination(paramMap, lfvo);
List<OpportunityReport> exportData = lo.parallelStream().map(this::entityToCsvReport)
.collect(Collectors.toList());
CsvCustomMappingStrategy<OpportunityReport> mappingStrategy = new CsvCustomMappingStrategy<>();
mappingStrategy.setType(OpportunityReport.class);
String dirPath = fileMountPath + REPORT_PATH;
File fileDir = new File(dirPath);
if (!fileDir.exists()) {
FileUtils.forceMkdir(fileDir);
}
String pathWithoutExtension = dirPath + "opportunity_data_"
+ LocalDateTime.now().format(DateTimeFormatter.ofPattern(YYYYMMDDHHMMSS));
File reportFile = new File(pathWithoutExtension + EXTENSION_CSV);
Writer writer = new PrintWriter(reportFile);
StatefulBeanToCsv<OpportunityReport> beanToCsv = new StatefulBeanToCsvBuilder<OpportunityReport>(writer)
.withMappingStrategy(mappingStrategy).build();
beanToCsv.write(exportData);
writer.close();
String zipFilePath = pathWithoutExtension + EXTENSION_ZIP;
ZipUtil.zip(reportFile, zipFilePath);
Users remoteUser = usersRepository.findByUsername(xRemoteUser)
.orElseThrow(() -> new Take1Exception(ErrorMessage.USER_NOT_FOUND_WITH_USERNAME, xRemoteUser));
Mail mail = Mail.builder().to(new String[] { remoteUser.getEmail() })
.model(MailModel.builder().name(remoteUser.getName())
.body("Please find attached the opportunity report you requested.").build())
.subject("Opportunity Report").attachments(Arrays.asList(new File(zipFilePath))).build();
mailUtil.sendMail(mail);
Files.delete(reportFile.toPath());
} catch (IOException | CsvDataTypeMismatchException | CsvRequiredFieldEmptyException e) {
throw new Take1Exception(ErrorMessage.INTERNAL_SERVER_EXCEPTION, e);
}
}
public OpportunityReport entityToCsvReport(OpportunityJDBCDTO o) {
OpportunityReport or = modelMapper.map(o, OpportunityReport.class);
or.setCurrency("USD");
or.setOnline(Boolean.TRUE.equals(o.getIsOnline()) ? "YES" : "NO");
return or;
}
}
Here is my JUnit test case:
class OpportunityReportServiceImplTest {
@InjectMocks
OpportunityReportServiceImpl opportunityReportServiceImpl;
@Autowired
OpportunityReportServiceImpl ors;
@Mock
OpportunityJDBCRepository ojdbc;
@Mock
UsersRepository usersRepository;
@Mock
MailUtil mailUtil;
@Mock
ModelMapper mp;
String username = "anandabhishe";
String nfusername = "ananda";
Mail mail;
List<OpportunityJDBCDTO> lo = new ArrayList<OpportunityJDBCDTO>();
List<OpportunityReport> lor = new ArrayList<OpportunityReport>();
@BeforeEach
void setUp() throws Exception {
MockitoAnnotations.initMocks(this);
ReflectionTestUtils.setField(opportunityReportServiceImpl, "fileMountPath", ".");
ReflectionTestUtils.setField(opportunityReportServiceImpl, "take1HostURL", "");
lo.add(new OpportunityJDBCDTO());
lor.add(new OpportunityReport());
}
@Test
void testExportOpportunityListing() throws IOException {
OpportunityReport or = new OpportunityReport();
or.setCurrency("USD");
or.setOnline("Yes");
when(ojdbc.getOppListWithoutPagination(getParamMap(), getOppCriteria())).thenReturn(lo);
when(usersRepository.findByUsername(username)).thenReturn(Optional.of(getUser()));
doNothing().when(mailUtil).sendMail(mail);
// doNothing().when(opportunityReportServiceImpl).entityToCsvReport(oj);
when(opportunityReportServiceImpl.entityToCsvReport(getOpportunityJDBCDTO())).thenReturn(or);
opportunityReportServiceImpl.exportOpportunityListing(getParamMap(), getOppCriteria(), username);
assertTrue(true);
FileUtils.forceDelete(new File("." + REPORT_PATH));
}
private Map<String, Object> getParamMap() {
return new HashMap<String, Object>();
}
private List<OpportunityCriteria> getOppCriteria() {
List<OpportunityCriteria> loc = new ArrayList<>();
loc.add(new OpportunityCriteria());
return loc;
}
private OpportunityJDBCDTO getOpportunityJDBCDTO() {
OpportunityJDBCDTO oj = new OpportunityJDBCDTO();
oj.setIsOnline(true);
oj.setApplicationCount(2);
oj.setCost(200);
oj.setCountryCode("in");
oj.setCreationDate(LocalDateTime.now());
oj.setEndDate(LocalDate.now());
oj.setLocation("test");
oj.setOpportunityId(123);
oj.setOpportunityStatus("test");
oj.setOpportunityStatusId(1);
oj.setOpportunityTitle("test");
oj.setOpportunityType("test");
oj.setOpportunityTypeColor("test");
oj.setOpportunityTypeId(1);
oj.setPublishedAt(LocalDateTime.now());
oj.setPublishedBy("test");
oj.setPublishedByUserName("test");
oj.setRegistrationUrl("test");
oj.setStartDate(LocalDate.now());
oj.setSummary("test");
oj.setUserEmail("test");
oj.setUserFullName("test");
oj.setUserId(1);
oj.setUserName("test");
oj.setVendorName("test");
return oj;
}
private Users getUser() {
Users user = new Users();
return user;
}
}
I am getting a NullPointerException when this line in the test class is called:
when(opportunityReportServiceImpl.entityToCsvReport(getOpportunityJDBCDTO())).thenReturn(or);
I was missing the ModelMapper stub, which is used in the second function; after I added that, the test passed:
OpportunityReport or = new OpportunityReport();
OpportunityJDBCDTO oj = new OpportunityJDBCDTO();
when(ojdbc.getOppListWithoutPagination(any(HashMap.class), anyList())).thenReturn(lo);
when(usersRepository.findByUsername(anyString())).thenReturn(Optional.of(getUser()));
doNothing().when(mailUtil).sendMail(mail);
doReturn(or).when(mp).map(oj, OpportunityReport.class);
opportunityReportServiceImpl.exportOpportunityListing(getParamMap(), getOppCriteria(), username);
assertTrue(true);
That's happening because opportunityReportServiceImpl is not a mock - it's the object that you're trying to test, but you're trying to stub a method of it as if it were a mock.
I would recommend that you don't try to stub the methods of the object that you're trying to test. But if you have to, you'll need to declare it as a @Spy. Then, to stub it, you'll need the doReturn/when syntax instead of when/thenReturn. This might look like:
doReturn(or).when(opportunityReportServiceImpl).entityToCsvReport(getOpportunityJDBCDTO());
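If you do go the spy route, a minimal sketch of how it could be wired with the annotations from the question is shown below; this is an assumption about your setup rather than code from the question, and the any() matcher avoids relying on OpportunityJDBCDTO implementing equals().
@Spy
@InjectMocks
OpportunityReportServiceImpl opportunityReportServiceImpl;

@Test
void testExportOpportunityListing() {
    OpportunityReport or = new OpportunityReport();
    // doReturn/when stubs the spy without invoking the real entityToCsvReport
    doReturn(or).when(opportunityReportServiceImpl).entityToCsvReport(any(OpportunityJDBCDTO.class));
    // ... rest of the stubbing and the call to exportOpportunityListing as in the question
}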
I am not able to test a method that is inside an abstract class instance. I have already tried several ways and would like to know if it is possible to do this. The contents of the abstract class can be seen in the link below.
Jacoco Class Report
Below is the JUnit and Mockito test that I wrote, trying to cover the cases in the report linked above.
@RunWith(MockitoJUnitRunner.class)
public class PahoRxMqttCallbackTest {
@Test
public void whenConnectionLostOccurs() {
PahoRxMqttCallback rxMqttCallback = mock(PahoRxMqttCallback.class);
assertThat(rxMqttCallback).isNotNull();
PahoRxMqttException exception = new PahoRxMqttException(
new MqttException(MqttException.REASON_CODE_CONNECTION_LOST));
ArgumentCaptor<Throwable> onConnectionLostCauseArgumentCaptor = ArgumentCaptor.forClass(Throwable.class);
rxMqttCallback.connectionLost(exception);
verify(rxMqttCallback).connectionLost(onConnectionLostCauseArgumentCaptor.capture());
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isInstanceOf(PahoRxMqttException.class);
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).hasCauseInstanceOf(MqttException.class);
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isEqualTo(exception);
}
@Test
public void whenConnectCompleteOccurs() {
PahoRxMqttCallback rxMqttCallback = mock(PahoRxMqttCallback.class);
assertThat(rxMqttCallback).isNotNull();
boolean reconnect = true;
String brokerUri = "tcp://localhost:1883";
ArgumentCaptor<Boolean> onConnectCompleteReconnectArgumentCaptor = ArgumentCaptor.forClass(Boolean.class);
ArgumentCaptor<String> onConnectCompleteServerUriArgumentCaptor = ArgumentCaptor.forClass(String.class);
rxMqttCallback.connectComplete(reconnect, brokerUri);
verify(rxMqttCallback).connectComplete(
onConnectCompleteReconnectArgumentCaptor.capture(),
onConnectCompleteServerUriArgumentCaptor.capture());
assertThat(onConnectCompleteReconnectArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectCompleteReconnectArgumentCaptor.getValue()).isEqualTo(reconnect);
assertThat(onConnectCompleteServerUriArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectCompleteServerUriArgumentCaptor.getValue()).isEqualTo(brokerUri);
}
@Test
public void whenDeliveryCompleteOccurs() {
PahoRxMqttCallback rxMqttCallback = mock(PahoRxMqttCallback.class);
assertThat(rxMqttCallback).isNotNull();
IMqttDeliveryToken deliveryToken = mock(IMqttDeliveryToken.class);
assertThat(deliveryToken).isNotNull();
RxMqttToken rxMqttToken = new PahoRxMqttToken(deliveryToken);
//ArgumentCaptor<IMqttDeliveryToken> onDeliveryCompleteTokenArgumentCaptor = ArgumentCaptor.forClass(IMqttDeliveryToken.class);
ArgumentCaptor<RxMqttToken> onDeliveryCompleteRxTokenArgumentCaptor = ArgumentCaptor.forClass(RxMqttToken.class);
//rxMqttCallback.deliveryComplete(deliveryToken);
rxMqttCallback.deliveryComplete(rxMqttToken);
/*
* Following methods *cannot* be stubbed/verified: final/private/equals()/hashCode().
* Mocking methods declared on non-public parent classes is not supported.
*/
//verify(rxMqttCallback).deliveryComplete(onDeliveryCompleteTokenArgumentCaptor.capture());
verify(rxMqttCallback).deliveryComplete(onDeliveryCompleteRxTokenArgumentCaptor.capture());
//assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isNotNull();
//assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isExactlyInstanceOf(IMqttDeliveryToken.class);
//assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isEqualTo(deliveryToken);
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isNotNull();
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isExactlyInstanceOf(PahoRxMqttToken.class);
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isEqualTo(rxMqttToken);
}
//@Test
public void whenMessageArrived() throws Exception {
PahoRxMqttCallback rxMqttCallback = mock(PahoRxMqttCallback.class);
assertThat(rxMqttCallback).isNotNull();
String topic = "topic";
MqttMessage message = new MqttMessage();
ArgumentCaptor<String> onMessageArrivedTopicArgumentCaptor = ArgumentCaptor.forClass(String.class);
ArgumentCaptor<MqttMessage> onMessageArrivedMessageArgumentCaptor = ArgumentCaptor.forClass(MqttMessage.class);
rxMqttCallback.messageArrived(topic, message);
/*
* Following methods *cannot* be stubbed/verified: final/private/equals()/hashCode().
* Mocking methods declared on non-public parent classes is not supported.
*/
verify(rxMqttCallback).messageArrived(onMessageArrivedTopicArgumentCaptor.capture(), onMessageArrivedMessageArgumentCaptor.capture());
assertThat(onMessageArrivedTopicArgumentCaptor.getValue()).isNotNull();
assertThat(onMessageArrivedTopicArgumentCaptor.getValue()).isEqualTo(topic);
assertThat(onMessageArrivedMessageArgumentCaptor.getValue()).isNotNull();
assertThat(onMessageArrivedMessageArgumentCaptor.getValue()).isEqualTo(message);
}
}
I really could not figure it out, even after searching the web, so I would appreciate the help.
Update
I was able to perform the tests and cover all the cases that Jacoco had flagged. But for this I had to create an implementation of the abstract class rather than use an anonymous class, as can be seen in the following link:
Jacoco Class Report 2
The updated unit tests:
@RunWith(MockitoJUnitRunner.class)
public class PahoRxMqttCallbackTest {
@Test
public void whenConnectionLostOccurs() {
PahoRxMqttCallback rxMqttCallback = spy(PahoRxMqttCallback.create(cause -> {}, (recon, uri) -> {}, t -> {}));
PahoRxMqttException exception = new PahoRxMqttException(
new MqttException(MqttException.REASON_CODE_CONNECTION_LOST));
ArgumentCaptor<Throwable> onConnectionLostCauseArgumentCaptor = ArgumentCaptor.forClass(Throwable.class);
rxMqttCallback.connectionLost(exception);
verify(rxMqttCallback).connectionLost(onConnectionLostCauseArgumentCaptor.capture());
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isInstanceOf(PahoRxMqttException.class);
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).hasCauseInstanceOf(MqttException.class);
assertThat(onConnectionLostCauseArgumentCaptor.getValue()).isEqualTo(exception);
}
@Test
public void whenConnectCompleteOccurs() {
PahoRxMqttCallback rxMqttCallback = spy(PahoRxMqttCallback.create(cause -> {}, (r, u) -> {}, t -> {}));
boolean reconnect = true;
String brokerUri = "tcp://localhost:1883";
ArgumentCaptor<Boolean> onConnectCompleteReconnectArgumentCaptor = ArgumentCaptor.forClass(Boolean.class);
ArgumentCaptor<String> onConnectCompleteServerUriArgumentCaptor = ArgumentCaptor.forClass(String.class);
rxMqttCallback.connectComplete(reconnect, brokerUri);
verify(rxMqttCallback).connectComplete(
onConnectCompleteReconnectArgumentCaptor.capture(),
onConnectCompleteServerUriArgumentCaptor.capture());
assertThat(onConnectCompleteReconnectArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectCompleteReconnectArgumentCaptor.getValue()).isEqualTo(reconnect);
assertThat(onConnectCompleteServerUriArgumentCaptor.getValue()).isNotNull();
assertThat(onConnectCompleteServerUriArgumentCaptor.getValue()).isEqualTo(brokerUri);
}
@Test
public void whenDeliveryCompleteOccurs() {
PahoRxMqttCallback rxMqttCallback = spy(PahoRxMqttCallback.create(cause -> {}, (r, u) -> {}));
IMqttDeliveryToken deliveryToken = new MqttDeliveryToken();
RxMqttToken rxMqttToken = new PahoRxMqttToken(deliveryToken);
ArgumentCaptor<IMqttDeliveryToken> onDeliveryCompleteTokenArgumentCaptor = ArgumentCaptor.forClass(IMqttDeliveryToken.class);
ArgumentCaptor<RxMqttToken> onDeliveryCompleteRxTokenArgumentCaptor = ArgumentCaptor.forClass(RxMqttToken.class);
rxMqttCallback.deliveryComplete(deliveryToken);
rxMqttCallback.deliveryComplete(rxMqttToken);
verify(rxMqttCallback).deliveryComplete(onDeliveryCompleteTokenArgumentCaptor.capture());
verify(rxMqttCallback, times(2)).deliveryComplete(onDeliveryCompleteRxTokenArgumentCaptor.capture());
assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isNotNull();
assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isExactlyInstanceOf(MqttDeliveryToken.class);
assertThat(onDeliveryCompleteTokenArgumentCaptor.getValue()).isEqualTo(deliveryToken);
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isNotNull();
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isExactlyInstanceOf(PahoRxMqttToken.class);
assertThat(onDeliveryCompleteRxTokenArgumentCaptor.getValue()).isEqualTo(rxMqttToken);
}
@Test
public void whenMessageArrived() throws Exception {
PahoRxMqttCallback rxMqttCallback = spy(PahoRxMqttCallback.create(cause -> {}, (r, u) -> {}, t -> {}));
String topic = "topic";
MqttMessage message = new MqttMessage();
ArgumentCaptor<String> onMessageArrivedTopicArgumentCaptor = ArgumentCaptor.forClass(String.class);
ArgumentCaptor<MqttMessage> onMessageArrivedMessageArgumentCaptor = ArgumentCaptor.forClass(MqttMessage.class);
rxMqttCallback.messageArrived(topic, message);
verify(rxMqttCallback).messageArrived(onMessageArrivedTopicArgumentCaptor.capture(), onMessageArrivedMessageArgumentCaptor.capture());
assertThat(onMessageArrivedTopicArgumentCaptor.getValue()).isNotNull();
assertThat(onMessageArrivedTopicArgumentCaptor.getValue()).isEqualTo(topic);
assertThat(onMessageArrivedMessageArgumentCaptor.getValue()).isNotNull();
assertThat(onMessageArrivedMessageArgumentCaptor.getValue()).isEqualTo(message);
}
}
You never instantiate the class, only mocks of the class.
PahoRxMqttCallback rxMqttCallback = mock(PahoRxMqttCallback.class);
A mock is not the real class, just a fake copy.
Instead you should create a real instance. Since PahoRxMqttCallback is abstract, that means an anonymous subclass or the create(...) factory you use in your update, for example
PahoRxMqttCallback rxMqttCallback = PahoRxMqttCallback.create(cause -> {}, (reconnect, serverUri) -> {}, token -> {});
or, if you still need to verify interactions on it,
PahoRxMqttCallback rxMqttCallback = spy(PahoRxMqttCallback.create(cause -> {}, (reconnect, serverUri) -> {}, token -> {}));
I am trying to deploy an HASingleton on JBoss 6.4. I have followed this tutorial to come up with the following:
I create a Service which is supposed to start a timer (my own timer interface) by looking up the timer bean through JNDI.
public class HATimerService implements Service<String> {
private Logger logger = Logger.getLogger(HATimerService.class);
private final AtomicBoolean started = new AtomicBoolean(false);
private ServiceName serviceName;
private final InjectedValue<ServerEnvironment> env = new InjectedValue();
private String JNDI = "java:global/my-ear/my-module/MyTimer";
public HATimerService() {
serviceName = ServiceName.JBOSS.append(new String[]{"my", "ha", "singleton", "MyHaService"});
}
public String getValue() throws IllegalStateException, IllegalArgumentException {
return "";
}
public void start(StartContext context) throws StartException {
if(!started.compareAndSet(false, true)) {
throw new StartException("The service is already started!");
} else {
try {
InitialContext e = new InitialContext();
TimerScheduler myTimer = (TimerScheduler)e.lookup(JNDI);
myTimer.startTimer();
} catch (NamingException var6) {
throw new StartException("Could not initialize timer", var6);
}
}
}
public void stop(StopContext context) {
if(started.compareAndSet(true, false)) {
try {
InitialContext e = new InitialContext();
((TimerScheduler)e.lookup(JNDI)).stopTimer();
} catch (NamingException var4) {
logger.error("Could not stop timer", var4);
}
}
}
public ServiceName getServiceName() {
return serviceName;
}
public InjectedValue<ServerEnvironment> getEnvironment() {
return env;
}
}
I also have an activator which activates the service.
public class HATimerServiceActivator implements ServiceActivator {
private final Logger log = Logger.getLogger(this.getClass());
public HATimerServiceActivator() {
}
public void activate(ServiceActivatorContext context) {
HATimerService service = new HATimerService();
this.log.info(service.getServiceName() + " HATimerService will be installed");
SingletonService singleton = new SingletonService(service, service.getServiceName());
singleton.build(new DelegatingServiceContainer(context.getServiceTarget(), context.getServiceRegistry()))
.addDependency(ServerEnvironmentService.SERVICE_NAME, ServerEnvironment.class, service.getEnvironment())
.setInitialMode(Mode.ACTIVE)
.install();
}
}
The timer bean, HATimerService, and the HATimerServiceActivator are all deployed in an ear called my-ear. In the log files I can see:
JNDI bindings for session bean named MyTimer.... :
java:global/my-ear/my-module/MyTimer
However, every once in a while (approx. 1/3 of all deploys), this setup fails with a NameNotFoundException during the JNDI lookup. The full exception is: Caused by: javax.naming.NameNotFoundException: Error looking up my-ear/my-module/MyTimer, service service jboss.naming.context.java.global.my-ear.my-module.MyTimer is not started
My guess is that this is some sort of race condition where the bean isn't registered in the JNDI tree yet. How can I make the service wait to perform the lookup until the bean is available?
It turns out it is possible to declare dependencies on deployment units. When creating the SingletonService, the following dependency can be added:
ServiceName ejbDependency = ServiceName.of("jboss", "deployment", "subunit", "my-ear.ear", "my-module.jar", "component", "MyTimerBean", "START");
singleton.build(new DelegatingServiceContainer(context.getServiceTarget(), context.getServiceRegistry()))
.addDependency(ServerEnvironmentService.SERVICE_NAME, ServerEnvironment.class, service.getEnvironment())
.setInitialMode(Mode.ACTIVE)
.addDependency(ejbDependency)
.install();
As long as ejbDependency refers to the correct service, the singleton will not start, and therefore not perform the lookup, until the bean has been started.
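For reference, the segments of ejbDependency mirror the deployment structure: the EAR file name (my-ear.ear), the sub-deployment JAR (my-module.jar), the EJB component name (MyTimerBean), and finally the START phase of that component. Each segment has to match the actual deployment, otherwise the dependency can never be satisfied and the singleton service itself will never start.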