I am currently struggling with writing tests for my ExecuteGroovyScript processor. I can do a lot with the NiFi TestRunner and MockFlowFiles, but I don't find a way to trigger the onStart callback for the Groovy script.
I need onStart, as described here, to load a config for the processor once at startup and provide it to the script. It works as described in NiFi itself, but onStart is not called when executing the tests with the TestRunner.
@BeforeEach
public void setup() throws IOException {
    runner = TestRunners.newTestRunner(ExecuteScript.class);
    runner.setProperty("Script Engine", "Groovy");
    runner.setProperty("converterconfig", IOUtils.toString(Objects.requireNonNull(this.getClass().getResourceAsStream("config.json")), StandardCharsets.UTF_8));
    runner.setProperty(ScriptingComponentUtils.SCRIPT_BODY,
            IOUtils.toString(Objects.requireNonNull(this.getClass().getResourceAsStream("converter.groovy")), StandardCharsets.UTF_8));
    runner.assertValid();
}
@Test
public void processSuccessfulEvent() throws IOException {
    String inputString = IOUtils.toString(Objects.requireNonNull(this.getClass().getResourceAsStream("input.json")), StandardCharsets.UTF_8);
    runner.enqueue(inputString);
    runner.run();
    runner.assertTransferCount(ExecuteScript.REL_SUCCESS, 1);
}
// Groovy script
static onStart(ProcessContext context) {
    // do stuff with the config, but this is never called/triggered in the TestRunner
    def config = context.getProperty("converterconfig")
    ...
}
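For reference, here is a minimal sketch of how the same setup could be pointed at ExecuteGroovyScript, the processor whose documentation describes the onStart convention. This is only a sketch: the script-body property name is an assumption and may differ between NiFi versions, scriptBody/configJson are just stand-ins for the strings loaded in setup(), and it assumes onStart is wired to the processor's scheduled lifecycle, which the TestRunner fires on run():
// Sketch only: assumes the nifi-groovyx processor module is on the test classpath.
TestRunner groovyRunner = TestRunners.newTestRunner(ExecuteGroovyScript.class);
groovyRunner.setProperty("groovyx-script-body", scriptBody); // property name is an assumption; check the processor's descriptors
groovyRunner.setProperty("converterconfig", configJson);     // dynamic property read by the script's onStart
groovyRunner.enqueue(inputString);
// run(iterations, stopOnFinish, initialize): initialize=true triggers the @OnScheduled lifecycle
groovyRunner.run(1, true, true);
groovyRunner.assertTransferCount(ExecuteGroovyScript.REL_SUCCESS, 1);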
Does anyone have advice on how to get the tests to trigger onStart? The NiFi version is 1.12 and Groovy is 2.4.16.
Thank you in advance.
Often when I consider a new library or technology, I create a small POC or test program to get a feel for it. So I did with gRPC-Spring-Boot-Starter; a simple code example is posted below my question text.
This sample has grown in complexity and eventually the library found its way into production code. So far it has survived many runs under moderate load.
Note that, naturally, the production service is not a client to itself. But the production gRPC service is in fact a client to other gRPC services.
Now I would like to write some kind of between-unit-and-integration test where I spin up local instances (starting with a single one) of those other gRPC services (pulling data from some static local resource, for example). Basically, this test code looks very much like the one posted below my question.
However, as soon as we poll for results in forEachRemaining(), the test ends up hanging: I suspect a deadlock in ClientCalls#waitAndDrain (io.grpc:grpc-stub).
The funny thing is that this does not happen if the client is created "manually", i.e. without utilizing the third-party Spring extension:
ManagedChannel channel = ManagedChannelBuilder.forTarget("localhost:9091")
        .defaultLoadBalancingPolicy("round_robin")
        .usePlaintext()
        .build();
StockStaticDataRequestServiceBlockingStub stub = StockStaticDataRequestServiceGrpc.newBlockingStub(channel);
I use Spring Boot 2.6.3, gRPC-Spring-Boot-Starter 2.13.1, gRPC 1.44.0, protobuf 3.19.2 and Netty 4.1.73, for what it is worth.
Now I wonder if someone here has encountered similar issues or can give me some pointers while I try to understand the inner workings of gRPC better.
I added a sample project on GitHub.
The main branch contains the (maybe dubious) test setup I chose at the beginning; the other branches are refinements, like using @Abhijit Sarkar's grpc-test library. Tests are green so far.
grpc:
  client:
    stocks:
      address: 'static://localhost:9091'
      enableKeepAlive: false
      negotiationType: plaintext
  server:
    port: 9092
@SpringBootTest
class TestGrpc {

    @GrpcClient("stocks")
    private StockStaticDataRequestServiceBlockingStub stub;

    @BeforeAll
    public static void setUp() throws Exception {
        final Server server = ServerBuilder
                .forPort(9091)
                .addService(new StockStaticDataRequestTestService())
                .build();
        server.start();
        final Thread serverThread = new Thread(() -> {
            try {
                server.awaitTermination();
            } catch (final InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        serverThread.setDaemon(false);
        serverThread.start();
    }

    @Test
    void testClient() {
        StockStaticManyDataRequest request = StockStaticManyDataRequest.newBuilder()
                .addAllTickerSymbols(List.of("AAPL"))
                .build();
        stub.getManyStockStatics(request).forEachRemaining(security -> {
            LOG.info("security={}", security);
        });
    }
}
public class StockStaticDataRequestTestService extends StockStaticDataRequestServiceImplBase {

    @Override
    public void getManyStockStatics(StockStaticManyDataRequest request, StreamObserver<Security> responseObserver) {
        responseObserver.onNext(Security.newBuilder()
                .setSecurity("TEST-MANY")
                .build());
        responseObserver.onNext(Security.newBuilder()
                .setSecurity("TEST-MORE")
                .build());
        responseObserver.onCompleted();
    }
}
message Security {
  string tickerSymbol = 1;
  string security = 2;
}

message StockStaticManyDataRequest {
  repeated string tickerSymbols = 1;
}

service StockStaticDataRequestService {
  rpc getManyStockStatics(StockStaticManyDataRequest) returns (stream Security) {}
}
I think the problem might be that you should not be starting the server at all. There are grpc-spring-boot-starter annotations that should be added to a test configuration class, and they will start and stop the server for you. See the details here.
https://yidongnan.github.io/grpc-spring-boot-starter/en/server/testing.html#integration-tests
I also tried to make what you have work, but once the server is started it really won't shut down. This makes the next test suite that runs fail with port conflicts when it tries to start.
Here's my test class.
@Slf4j
@SpringBootTest
@ActiveProfiles("test")
@SpringJUnitConfig(classes = { ServiceIntegrationTestConfiguration.class })
@DirtiesContext
class TestGprc {

    @GrpcClient("stocks")
    private StockStaticDataRequestServiceBlockingStub stub;

    /**
     * @throws java.lang.Exception
     */
    @BeforeAll
    static void setUpBeforeClass() throws Exception {
        log.info("setUpBeforeClass");
    }

    /**
     * @throws java.lang.Exception
     */
    @AfterAll
    static void tearDownAfterClass() throws Exception {
        log.info("tearDownAfterClass");
    }

    /**
     * @throws java.lang.Exception
     */
    @BeforeEach
    void setUp() throws Exception {
    }

    /**
     * @throws java.lang.Exception
     */
    @AfterEach
    void tearDown() throws Exception {
    }

    @Test
    @DirtiesContext
    void testClient() {
        StockStaticManyDataRequest request = StockStaticManyDataRequest.newBuilder()
                .addAllTickerSymbols(List.of("AAPL"))
                .build();
        stub.getManyStockStatics(request).forEachRemaining(security -> {
            log.info("security={}", security);
        });
    }
}
Here's the configuration class.
@Configuration
@ImportAutoConfiguration({ GrpcServerAutoConfiguration.class, // Create required server beans
        GrpcServerFactoryAutoConfiguration.class, // Select server implementation
        GrpcClientAutoConfiguration.class,
        GrpcStarterApplication.class })
public class ServiceIntegrationTestConfiguration {
    // add mock beans here if needed
}
My overrides for the properties; see application-test.yaml:
grpc:
  client:
    stocks:
      address: in-process:test
      enableKeepAlive:
      negotiationType:
  server:
    inProcessName: test
    port: -1
I posted the entire Maven project here:
https://github.com/aerobiotic/grpc-spring-starter
Simply clone it and mvn clean install :-)
As far as the deadlock in your production code goes (a short sketch follows below):
make sure you are calling onCompleted
check your catch blocks, make sure onError is being called, and make sure there is logging happening
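As a hedged illustration of those two points, a streaming handler like the sample service above might guard its work roughly like this (the io.grpc.Status mapping and the log call are just examples, not your actual production code):
@Override
public void getManyStockStatics(StockStaticManyDataRequest request, StreamObserver<Security> responseObserver) {
    try {
        for (String symbol : request.getTickerSymbolsList()) {
            responseObserver.onNext(Security.newBuilder().setTickerSymbol(symbol).build());
        }
        // Without onCompleted() a blocking client iterating the stream hangs forever.
        responseObserver.onCompleted();
    } catch (RuntimeException e) {
        log.error("getManyStockStatics failed", e); // make the failure visible in the logs
        responseObserver.onError(Status.INTERNAL    // ...and propagate it to the client
                .withDescription("stock lookup failed")
                .withCause(e)
                .asRuntimeException());
    }
}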
It's possible that starting the server and not getting it shut down is affecting something. Perhaps test code is connecting to a server left over from a previous test.
I wrote a Camel route which uses the dead letter channel (DLC) pattern with a Processor that is executed before the Exchange is sent to the DLC.
errorHandler(deadLetterChannel("{{myDLCEndpoint}}")
        .onPrepareFailure(getErrorProcessor()));
During the route I throw a RuntimeException, which is then handled by the error processor and the DLC. Everything works as expected when I start the application and let the route run.
Now I wanted to write a unit test just to be sure that it works.
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE)
@RunWith(SpringRunner.class)
public class MyRouteTest {

    @MockBean
    private ErrorProcessor errorProcessor;

    @EndpointInject(uri = "{{quelle}}")
    private ProducerTemplate quelle;

    @EndpointInject(uri = "{{myDLCEndpoint}}")
    private ProducerTemplate dlc;

    @Test
    @Transactional("myDataSourceTransactionManager") // for rollback
    public void test() throws Exception {
        Mockito.verify(errorProcessor, never()).process(Mockito.any());
        String inputXML = TestDataReader.readXML("myfile.xml");
        assertNotNull(inputXML);
        quelle.sendBody(inputXML);
    }
}
I started the test and checked the log: an exception occurs while the route is executed. The exception is handled by Camel, and the mocked error processor is definitely called, because I debugged it.
Unfortunately, the unit test still succeeds, even with Mockito.verify(errorProcessor, never()).process(Mockito.any());
Now I have no clue why it does not fail, which is the result I would expect in this situation.
I'm an idiot. The call to Mockito.verify was before the call to quelle.sendBody.
WAAAH. Sorry guys, I just didn't see it :-D
The correct way to get Mockito to work is:
String inputXML = TestDataReader.readXML("myfile.xml");
assertNotNull(inputXML);
quelle.sendBody(inputXML);
// verify with Mockito AFTER the code under test has been executed!
Mockito.verify(errorProcessor, never()).process(Mockito.any());
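If the route were fully asynchronous (for example behind a seda or JMS endpoint), the verify could still race the exchange even when placed after sendBody. A hedged variant using Mockito's timeout() verification mode, which waits up to a given time for the expected invocation, would be:
// Wait up to 5 seconds for the error processor to be invoked exactly once.
Mockito.verify(errorProcessor, Mockito.timeout(5000).times(1)).process(Mockito.any());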
Frustrating. Everywhere I look, I see samples of testing async Vert.x code, but nothing that comes close to what I am trying to test.
Vert.x 3.3.2, JUnit 4.12, Java 8.
The method under test sends a message to the event bus. I want to verify that what is supposed to happen in the eventBus().send() response handler actually happens.
So many examples I see have the eventBus().send() call in the test itself (thus testing the other end of the event bus, the consumer). I want to test the response handler passed to .send().
I have tried Async in the test. Tried ArgumentCaptor. Tried Thread.sleep(). Tried doAnswer(). Nothing seems to get the test to (a) wait for the async eventBus().send() call in the method under test to finish and (b) verify() that there was an interaction (I think this might have to do with the difference between the Vert.x TestContext and the JUnit runner context).
Code:
Method under test:
public void sendToEventBusAddress(RoutingContext context, String requestId, String userId) {
    List<String> stuff = split(context.request().getParam("stuffToSplit"));
    JsonObject eventBusMessage = new JsonObject()
            .put("requestId", requestId)
            .put("stuffList", new JsonArray(stuff))
            .put("userId", userId);
    LOGGER.info("Putting message: {} onto the EventBus at address: {}", eventBusMessage.encodePrettily(), EventBusEndpointEnum.STUFF_ACCESS.getValue());
    context.vertx().eventBus().send(EventBusEndpointEnum.STUFF_ACCESS.getValue(), eventBusMessage, new DeliveryOptions().setSendTimeout(timeout), async -> {
        if (async.succeeded()) {
            LOGGER.info("EventBus Response: {}", async.result().body().toString());
            context.response().setStatusCode(HttpStatus.SC_OK);
            context.response().headers().set(HttpHeaders.CONTENT_TYPE, MediaType.TEXT_PLAIN);
            context.response().end(async.result().body().toString());
        } else {
            LOGGER.error(errorMessage);
            context.response().setStatusCode(HttpStatus.SC_BAD_REQUEST);
            context.response().end(errorMessage);
        }
    });
}
Simplified (non-working) Test case and class:
@RunWith(VertxUnitRunner.class)
public class MyBrokenTest {

    @Mock private RoutingContext routingContext;
    @Mock private HttpServerRequest contextRequest;
    @Mock private HttpServerResponse contextResponse;
    @Mock private MultiMap responseHeaders;

    @Rule public RunTestOnContext rule = new RunTestOnContext();

    @Before
    public void setUp(TestContext context) {
        MockitoAnnotations.initMocks(this);
    }

    @Test
    public void testOne(TestContext context) {
        when(routingContext.vertx()).thenReturn(rule.vertx());
        when(routingContext.request()).thenReturn(contextRequest);
        when(contextRequest.getParam("stuffToSplit")).thenReturn("04MA");
        when(routingContext.response()).thenReturn(contextResponse);
        when(contextResponse.headers()).thenReturn(responseHeaders);

        rule.vertx().eventBus().consumer(EventBusEndpointEnum.STUFF_ACCESS.getValue(), res -> {
            res.reply("yo");
        });

        ClassUnderTest cut = new ClassUnderTest(180000);
        cut.sendToEventBusAddress(routingContext, "testRequestId", "UnitTestId");

        verify(contextResponse).setStatusCode(200);
    }
}
I know that the test in its current form won't work, because the method under test returns as soon as eventBus().send() is called inside it, and therefore the verify fails with 'no interactions'.
What I can't figure out is how to verify it properly, given the async nature of Vert.x!
Thanks
I did it like this:
In the @BeforeAll annotated method I deploy only the sending verticle.
In @BeforeEach I create a consumer for the given address and store the received message(s) in a variable/collection.
Something like:
receivedMessage = new Message[1];
eventBus.consumer("DB",
        message -> {
            message.reply("OK");
            receivedMessage[0] = message;
        });
context.completeNow();
In the test I validate the stored value(s):
client.get(8080, "localhost", "/user/" + id)
        .as(BodyCodec.string())
        .send(context.succeeding(response -> context.verify(() -> {
            Assertions.assertEquals(expectedMessage, receivedMessage[0].body());
            context.completeNow();
        })));
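Back in the JUnit 4 / VertxUnitRunner setup from the question, a sketch of the same idea could look like this: an Async is completed from a doAnswer stub on end(), so the verify only runs after the send() response handler has actually executed (the 5-second timeout is arbitrary):
@Test
public void testOne(TestContext context) {
    when(routingContext.vertx()).thenReturn(rule.vertx());
    when(routingContext.request()).thenReturn(contextRequest);
    when(contextRequest.getParam("stuffToSplit")).thenReturn("04MA");
    when(routingContext.response()).thenReturn(contextResponse);
    when(contextResponse.headers()).thenReturn(responseHeaders);

    // Complete the Async from the stubbed end(...) call, i.e. only once the
    // send() response handler has run.
    Async async = context.async();
    doAnswer(invocation -> {
        async.complete();
        return null;
    }).when(contextResponse).end(anyString());

    rule.vertx().eventBus().consumer(EventBusEndpointEnum.STUFF_ACCESS.getValue(),
            res -> res.reply("yo"));

    new ClassUnderTest(180000).sendToEventBusAddress(routingContext, "testRequestId", "UnitTestId");

    async.awaitSuccess(5000);                    // block the test thread until the handler ran
    verify(contextResponse).setStatusCode(200);  // the interaction has happened by now
}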
I'm trying to figure out how to integrate an external API and run every integration test against it. I've been reading and looking at:
https://github.com/dropwizard/dropwizard/blob/master/dropwizard-example/src/test/java/com/example/helloworld/IntegrationTest.java
https://github.com/dropwizard/dropwizard/blob/master/docs/source/manual/testing.rst
but it looks like these are examples of testing local endpoints and not external ones. I would like to be able to test my API calls with JUnit tests. Currently I have to start up and run my app to make sure they're working.
This is the direction I'm currently exploring:
private Client client;

@Before
public void setUp() throws Exception {
    client = ClientBuilder.newClient();
}

@After
public void tearDown() throws Exception {
    client.close();
}

@Test
public void testHitApi() throws Exception {
    client.target("https://api.github.com/users/" + getUser() + "/repos");
}
Any help would be much appreciated, thanks!
You need to actually make the API call to hit the endpoint.
Doing just:
client.target("https://api.github.com/users/" + getUser() + "/repos")
returns a WebTarget.
You should ideally do something like:
client
    .target("https://api.github.com/users/" + getUser() + "/repos")
    .request()
    .get(); // for a GET call
Google for the exact POST/PUT/DELETE calls.
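For example, a hedged sketch of the test from the question with the call actually executed and a simple status assertion (using javax.ws.rs.core.Response and MediaType; the GitHub endpoint is the one from the question):
@Test
public void testHitApi() throws Exception {
    Response response = client
            .target("https://api.github.com/users/" + getUser() + "/repos")
            .request(MediaType.APPLICATION_JSON)
            .get();                                  // executes the HTTP GET against the external API
    assertEquals(200, response.getStatus());         // fails if the external call did not succeed
    response.close();
}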
If you mean to run your integration tests against an external API or a separately running instance of your API:
testEnvironment = new Environment("Test environment", Jackson.newObjectMapper(),
        null, new MetricRegistry(), null);

ObjectMapper mapper = Jackson.newObjectMapper(new YAMLFactory());
IntegrationTestConfiguration integrationTestConfiguration = mapper.readValue(fixture("integration-testing-config.yml"),
        IntegrationTestConfiguration.class);
Instantiate your client like so:
exampleClient = new exampleClient(testEnvironment, clientConfiguration);
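(For reference, a hedged sketch of how a plain JAX-RS client could be built from that test Environment using Dropwizard's JerseyClientBuilder; clientConfiguration is assumed to be a JerseyClientConfiguration read from the test YAML:)
// Sketch: build a JAX-RS Client from the test Environment and a JerseyClientConfiguration.
Client client = new JerseyClientBuilder(testEnvironment)
        .using(clientConfiguration)
        .build("integration-test-client");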
Hope this helps.
I am testing with the wonderful TestNG framework. My question is whether it is possible to set the attributes of the @Test annotation in the testng.xml configuration file.
I don't want to hard-code the @Test annotation like
@Test(dataProvider = "dataFileProvider", dataProviderClass = TestDataProvider.class)
I want to configure it in testng.xml.
I have two ideas on this:
Workaround 1: Static provider
You can easily change the static provider if needed.
Workaround 2: Annotation Transformer
I never tried that, but it should work, even if you have to grab the XML data manually.
Looking forward to Mr. Beust's answer... ;)
The short answer is: no, you can't add annotations to your code from testng.xml.
You can modify existing annotations with an Annotation Transformer, as explained by Frank.
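As a hedged sketch of that idea, an IAnnotationTransformer could pull the data provider settings from a JVM system property instead of hard-coding them; the property name dataProviderClass and the provider method name are assumptions taken from the question:
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;

// Register via <listeners> in testng.xml (or testNG.addListener(...)).
public class DataProviderTransformer implements IAnnotationTransformer {

    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
                          Constructor testConstructor, Method testMethod) {
        // Assumption: the provider class is passed on the command line,
        // e.g. -DdataProviderClass=TestDataProvider (fully qualified name)
        String providerClass = System.getProperty("dataProviderClass");
        if (providerClass != null) {
            try {
                annotation.setDataProviderClass(Class.forName(providerClass));
                annotation.setDataProvider("dataFileProvider"); // provider method name from the question
            } catch (ClassNotFoundException e) {
                throw new IllegalStateException(e);
            }
        }
    }
}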
Sometimes you just really want to do something and you can't, like accessing private variables to fix memory leaks. Figuring out how to do things like this, despite the fact that you "can't", is fun. In case you really want to, I might suggest running your suite using the TestNG object and loading the testng.xml file before running.
Personally, I like using 'mvn test', and unfortunately, adding the pom.xml configuration to run from a testng.xml file requires that you supply a testng.xml file, so plain 'mvn test' won't work. Always make sure that what 95% of programmers use works, then allow overriding.
So I might suggest extending the testng.xml format yourself and writing some code to read the testng.xml file and configure annotations using the annotation transformer class.
Here is some code to get you started:
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

import org.testng.IAnnotationTransformer;
import org.testng.ITestResult;
import org.testng.TestListenerAdapter;
import org.testng.TestNG;
import org.testng.annotations.ITestAnnotation;
import org.testng.xml.XmlClass;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;

public class TestNGSuite {

    public static void main(String[] args) {
        System.out.println("main start");
        try {
            // "testDemo" is a placeholder: the name of the test method to enable
            new TestNGSuite(new Class[]{ Demo.class }, "testDemo");
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("main finish");
    }

    public TestNGSuite(Class[] classes, String methodName) throws Exception {
        // Create suite list
        List<XmlSuite> suites = new ArrayList<XmlSuite>();

        // Add suite to suite list
        XmlSuite suite = new XmlSuite();
        suites.add(suite);
        suite.setName("MyTestSuite");

        // Add test to suite
        XmlTest test = new XmlTest(suite);
        test.setName("MyTest");

        // Add class list to test
        List<XmlClass> xmlClasses = new ArrayList<XmlClass>();
        test.setXmlClasses(xmlClasses);

        // Add classes to class list
        for (Class clazz : classes) {
            XmlClass xmlClass = new XmlClass(clazz);
            xmlClasses.add(xmlClass);
        }

        // Run TestNG
        TestNG testNG = new TestNG();
        testNG.setXmlSuites(suites);
        testNG.addListener(new TestNGAnnotationTransformer(methodName));
        testNG.addListener(new TestNGSuiteConsoleLogger());
        testNG.run();
        if (testNG.hasFailure()) { // throw an exception to make the mvn goal fail
            throw new Exception("Failed Tests");
        }
    }

    public static class TestNGSuiteConsoleLogger extends TestListenerAdapter {
        @Override
        public void onTestFailure(ITestResult tr) {
            // Console was a custom logging helper; System.err works just as well here
            System.err.println("FAILURE: " + tr.getMethod());
            tr.getThrowable().printStackTrace();
        }
    }

    public static class TestNGAnnotationTransformer implements IAnnotationTransformer {
        String methodToRun;

        public TestNGAnnotationTransformer(String methodName) {
            methodToRun = methodName;
        }

        public void transform(ITestAnnotation annotation, Class arg1,
                              Constructor arg2, Method testMethod) {
            if (methodToRun.equals(testMethod.getName())) {
                annotation.setEnabled(true);
            }
        }
    }
}
If you want to run Demo.class, make sure there is a method there with the TestNG @Test annotation.