Why is the reactive code not executed in Spring Boot? - java

I am using Spring Boot to implement reactive microservices, but the reactive code in my lambda is never executed. My implementation is below. publishEventScheduler is created during application start-up. I am using this code together with Kafka to send an event to the user microservice to create a user.
MainServiceApplication.java
@SpringBootApplication
public class MainServiceApplication {

    private static final Logger LOG = LoggerFactory.getLogger(MainServiceApplication.class);

    private final Integer threadPoolSize;
    private final Integer taskQueueSize;

    public MainServiceApplication(
            @Value("${app.threadPoolSize:10}") Integer threadPoolSize,
            @Value("${app.taskQueueSize:100}") Integer taskQueueSize) {
        this.threadPoolSize = threadPoolSize;
        this.taskQueueSize = taskQueueSize;
    }

    @Bean
    public Scheduler publishEventScheduler() {
        LOG.info("Creating a message scheduler with connectionPoolSize = {}", threadPoolSize);
        return Schedulers.newBoundedElastic(threadPoolSize, taskQueueSize, "publish-pool");
    }

    public static void main(String[] args) {
        SpringApplication.run(MainServiceApplication.class, args);
    }
}
MainIntegration.java
createUser() is called with a POST request from Postman (a breakpoint stops at subscribeOn(publishEventScheduler)), but sendMessageUser() is never executed (a breakpoint inside that method is never hit).
@Component
public class MainIntegration implements UserService, TodoService {

    private final String todoServiceUrl;
    private final String userServiceUrl;
    private final WebClient webClient;
    private final StreamBridge streamBridge;
    private final Scheduler publishEventScheduler;

    public MainIntegration(
            @Qualifier("publishEventScheduler") Scheduler publishEventScheduler,
            WebClient.Builder webClient,
            StreamBridge streamBridge,
            @Value("${app.user-service.host}") String userServiceHost,
            @Value("${app.user-service.port}") int userServicePort
    ) {
        this.publishEventScheduler = publishEventScheduler;
        this.webClient = webClient.build();
        this.streamBridge = streamBridge;
        userServiceUrl = "http://" + userServiceHost + ":" + userServicePort + "/user";
    }

    @Override
    public Mono<User> createUser(User body) {
        return Mono.fromCallable(() -> {
            sendMessageUser("user-out-0", new Event<Event.Type, String, User>(Event.Type.CREATE, body.getUserName(), body));
            return body;
        }).subscribeOn(publishEventScheduler);
    }

    private void sendMessageUser(String bindingName, Event<Type, String, User> event) {
        LOG.debug("Sending a {} message to {}", bindingName, event.getEventType());
        Message<Event<Type, String, User>> message = MessageBuilder.withPayload(event)
                .setHeader("partitionKey", event.getKey())
                .build();
        streamBridge.send(bindingName, message);
    }
}
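Note on Reactor semantics (general behavior, not specific to this code): Mono.fromCallable is lazy, so the callable body runs only when the returned Mono is subscribed. A breakpoint on subscribeOn(publishEventScheduler) fires at assembly time, but nothing inside the lambda executes until a subscriber arrives, e.g. a WebFlux controller returning the Mono to the framework. A minimal sketch of the behavior:

import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

public class LazyMonoDemo {
    public static void main(String[] args) throws InterruptedException {
        // Assembly runs no user code; a breakpoint on subscribeOn(...) fires here.
        Mono<String> mono = Mono.fromCallable(() -> {
            System.out.println("callable runs"); // printed only after subscription
            return "done";
        }).subscribeOn(Schedulers.boundedElastic());

        // Without this subscribe() (or the framework subscribing for you),
        // the callable never executes.
        mono.subscribe(result -> System.out.println("got " + result));

        Thread.sleep(500); // let the bounded-elastic thread run before the JVM exits
    }
}

So if sendMessageUser() is never reached, the first thing to verify is that the Mono returned by createUser() is actually subscribed (for example, returned all the way out of the controller).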
application.yaml
server.port: 7000
server.error.include-message: always

app:
  user-service:
    host: localhost
    port: 7002

spring:
  cloud:
    stream:
      default-binder: kafka
      default-contentType: application/json
      bindings:
        user-out-0:
          destination: user-service
          producer:
            required-groups: auditGroup
      kafka:
        binder:
          brokers: 127.0.0.1
          defaultBrokerPort: 2181
  rabbitmq:
    host: 127.0.0.1
    port: 5672
    username: guest
    password: guest

Related

Spring Reactive WebClient is not calling another service

I have 2 Spring Boot microservices. Microservice (B) calls a reactive API exposed by Microservice (A).
Microservice (A) RestController code:
@RestController
@RequestMapping(value = "/documents")
public class ElasticDocumentController {

    private static final Logger LOG = LoggerFactory.getLogger(ElasticDocumentController.class);

    private final ElasticQueryService elasticQueryService;

    public ElasticDocumentController(ElasticQueryService queryService) {
        this.elasticQueryService = queryService;
    }

    @GetMapping(value = "/", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<ElasticQueryServiceResponseModel> getAllDocuments() {
        Flux<ElasticQueryServiceResponseModel> response = elasticQueryService.getAllDocuments();
        response = response.log();
        LOG.info("Returning from query reactive service for all documents");
        return response;
    }
}
When I call the getAllDocuments() API from Postman, I can see the documents scrolling in the output console, so Microservice (A) is correct.
But when I call the API from Microservice (B), I cannot retrieve any documents; Microservice (B) cannot communicate with Microservice (A).
Microservice (B) service code:
@Service
public class TwitterElasticQueryWebClient implements ElasticQueryWebClient {

    private static final Logger LOG = LoggerFactory.getLogger(TwitterElasticQueryWebClient.class);

    private final WebClient.Builder webClientBuilder;
    private final ElasticQueryWebClientConfigData elasticQueryWebClientConfigData;

    public TwitterElasticQueryWebClient(
            @Qualifier("webClientBuilder") WebClient.Builder clientBuilder,
            ElasticQueryWebClientConfigData configData
    ) {
        this.webClientBuilder = clientBuilder;
        this.elasticQueryWebClientConfigData = configData;
    }

    @Override
    public Flux<ElasticQueryWebClientResponseModel> getAllData() {
        LOG.info("Querying all data");
        return webClientBuilder
                .build()
                .get()
                .uri("/")
                .accept(MediaType.valueOf(elasticQueryWebClientConfigData.getQuery().getAccept()))
                .retrieve()
                .bodyToFlux(ElasticQueryWebClientResponseModel.class);
    }
}
Microservice (B) config code:
@Configuration
public class WebClientConfig {

    private final ElasticQueryWebClientConfigData.WebClient webClientConfig;

    public WebClientConfig(ElasticQueryWebClientConfigData webClientConfigData) {
        this.webClientConfig = webClientConfigData.getWebClient();
    }

    @Bean("webClientBuilder")
    WebClient.Builder webClientBuilder() {
        return WebClient.builder()
                .baseUrl(webClientConfig.getBaseUrl())
                .defaultHeader(HttpHeaders.CONTENT_TYPE, webClientConfig.getContentType())
                .defaultHeader(HttpHeaders.ACCEPT, webClientConfig.getAcceptType())
                .clientConnector(new ReactorClientHttpConnector(HttpClient.from(getTcpClient())))
                .codecs(configurer -> configurer.defaultCodecs()
                        .maxInMemorySize(webClientConfig.getMaxInMemorySize()));
    }

    private TcpClient getTcpClient() {
        return TcpClient.create()
                .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, webClientConfig.getConnectTimeoutMs())
                .doOnConnected(connection -> {
                    connection.addHandlerLast(new ReadTimeoutHandler(webClientConfig.getReadTimeoutMs(), TimeUnit.MILLISECONDS));
                    connection.addHandlerLast(new WriteTimeoutHandler(webClientConfig.getWriteTimeoutMs(), TimeUnit.MILLISECONDS));
                });
    }
}
Microservice (B) application.yml:
elastic-query-web-client:
  webclient:
    connect-timeout-ms: 10000
    read-timeout-ms: 10000
    write-timeout-ms: 10000
    max-in-memory-size: 10485760 # 10MB
    content-type: 'application/json'
    accept-type: 'text/event-stream'
    base-url: 'http://localhost:8183/reactive-elastic-query-service/documents'
  query:
    method: POST
    uri: "/get-doc-by-text"
    accept: ${elastic-query-web-client.webclient.accept-type}

server:
  port: 8184

spring:
  webflux:
    base-path: /reactive-elastic-query-web-client
  thymeleaf:
    cache: false
    reactive:
      max-chunk-size: 8192
  codec:
    max-in-memory-size: 25MB
Microservice (B) controller:
@Controller
public class QueryController {

    private static final Logger LOG = LoggerFactory.getLogger(QueryController.class);

    private final ElasticQueryWebClient elasticQueryWebClient;

    public QueryController(ElasticQueryWebClient webClient) {
        this.elasticQueryWebClient = webClient;
    }

    @GetMapping("/all")
    public String queryAll(Model model) {
        Flux<ElasticQueryWebClientResponseModel> responseModels = elasticQueryWebClient.getAllData();
        responseModels = responseModels.log();
        IReactiveDataDriverContextVariable reactiveData = new ReactiveDataDriverContextVariable(responseModels, 1);
        model.addAttribute("elasticQueryWebClientResponseModels", reactiveData);
        model.addAttribute("searchText", "");
        model.addAttribute("elasticQueryWebClientRequestModel", ElasticQueryWebClientRequestModel.builder().build());
        LOG.info("Returning from reactive client controller for all data");
        return "home";
    }
}
There are no exceptions in the output consoles.
I don't see what I am missing here.
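One way to narrow this down (a debugging sketch, not from the original post; it reuses the webClientBuilder bean and a logger): like any Flux, the one returned by bodyToFlux() is lazy and does nothing until subscribed, so subscribing explicitly isolates whether (B) can reach (A) at all, independent of the Thymeleaf reactive rendering:

// Hedged debugging sketch: trigger the call explicitly and log the outcome.
webClientBuilder.build()
        .get()
        .uri("/")
        .accept(MediaType.TEXT_EVENT_STREAM)
        .retrieve()
        .bodyToFlux(String.class)
        .subscribe(
                chunk -> LOG.info("received chunk: {}", chunk),
                error -> LOG.error("call to Microservice (A) failed", error),
                () -> LOG.info("stream completed"));

If the error callback fires, the exception (connection refused, a 404 from a wrong base-url, a codec mismatch) points at the actual failure, which never surfaces when nothing subscribes downstream.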

Spring Cloud Circuit Breaker: Feign doesn't open circuit breaker

The circuit breaker works and the fallback is called, but the circuit breaker doesn't change its state and sends a request to the failing service every time.
I tried the same YAML config with RestTemplate - it works correctly.
Feign client
@FeignClient(
        name = MyFeignClient.SERVICE_NAME,
        url = "https://httpbin.org/",
        configuration = {FeignClientConfiguration.class})
public interface MyFeignClient {

    String SERVICE_NAME = "producer-service";

    @GetMapping(value = "/status/502")
    ResponseEntity<String> gerRequest();
}
Fallback class
public class MyFallback implements MyFeignClient {

    private final Exception cause;

    public MyFallback(Exception cause) {
        this.cause = cause;
    }

    public ResponseEntity<String> gerRequest() {
        if (cause instanceof HttpServerErrorException) {
            return ResponseEntity.of(Optional.of(cause.getMessage()));
        } else {
            return ResponseEntity.of(Optional.of(cause.getMessage()));
        }
    }
}
Feign client configuration
@RequiredArgsConstructor
public class FeignClientConfiguration {

    private final CircuitBreakerRegistry registry;

    @Bean
    @Scope("prototype")
    public Feign.Builder feignBuilder() {
        CircuitBreaker circuitBreaker = registry.circuitBreaker("producer-service");
        FeignDecorators decorators = FeignDecorators.builder()
                .withCircuitBreaker(circuitBreaker)
                .withFallbackFactory(MyFallback::new)
                .build();
        return Resilience4jFeign.builder(decorators);
    }
}
Circuit breaker YAML config
resilience4j.circuitbreaker:
  configs:
    default:
      registerHealthIndicator: true
      slidingWindowType: COUNT_BASED
      slidingWindowSize: 5
      minimumNumberOfCalls: 3
      permittedNumberOfCallsInHalfOpenState: 1
      automaticTransitionFromOpenToHalfOpenEnabled: true
      waitDurationInOpenState: 5s
      failureRateThreshold: 50
      eventConsumerBufferSize: 10
      writableStackTraceEnabled: true
      recordExceptions:
        - org.springframework.web.client.HttpServerErrorException
        - java.util.concurrent.TimeoutException
        - java.io.IOException
    shared:
      slidingWindowSize: 100
      permittedNumberOfCallsInHalfOpenState: 30
      waitDurationInOpenState: 1s
      failureRateThreshold: 50
      eventConsumerBufferSize: 10
  instances:
    producer-service:
      baseConfig: default
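A point worth checking (an assumption, not confirmed in the post): with recordExceptions set, only the listed types count as failures, and Feign raises feign.FeignException for a 502 rather than Spring's HttpServerErrorException (a RestTemplate exception), so the breaker may be treating every call as ignored. That would also explain why the identical YAML works with RestTemplate. To see what the breaker actually records, its events can be logged (a minimal sketch against the same registry):

// Hedged sketch: log circuit breaker events to see whether failures are
// recorded and whether the state machine ever transitions.
CircuitBreaker circuitBreaker = registry.circuitBreaker("producer-service");
circuitBreaker.getEventPublisher()
        .onError(event -> System.out.println("recorded failure: " + event.getThrowable()))
        .onIgnoredError(event -> System.out.println("ignored: " + event.getThrowable()))
        .onStateTransition(event -> System.out.println("transition: " + event.getStateTransition()));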

Micronaut testing with declarative HTTP client Junit 5

Trying to do JUnit 5 E2E functional testing using the Micronaut declarative HTTP client.
public interface IProductOperation {

    @Get(value = "/search/{text}")
    @Secured(SecurityRule.IS_ANONYMOUS)
    Maybe<?> freeTextSearch(@NotBlank String text);
}
Declarative Micronaut HTTP client
@Client(
        id = "feteBirdProduct",
        path = "/product"
)
public interface IProductClient extends IProductOperation {
}
JUnit 5 testing
@MicronautTest
public record ProductControllerTest(IProductClient iProductClient) {

    @Test
    @DisplayName("Should search the item based on the name")
    void shouldSearchTheItemBasedOnTheName() {
        var value = iProductClient.freeTextSearch("test").blockingGet();
        System.out.println(value);
    }
}
Controller
@Controller("/product")
public class ProductController implements IProductOperation {

    private final IProductManager iProductManager;

    public ProductController(IProductManager iProductManager) {
        this.iProductManager = iProductManager;
    }

    @Override
    public Maybe<List> freeTextSearch(String text) {
        LOG.info("Controller --> Finding all the products");
        return iProductManager.findFreeText(text);
    }
}
When I run the test, I get a 500 Internal Server Error. I think the application is also running when I run the test. I am not sure what the reason for the 500 Internal Server Error is.
Any help will be appreciated.
Is @Get(value = "/search/{text}") causing the issue? If so, how can I solve it with the declarative client?
Service discovery
application.yml
consul:
  client:
    defaultZone: ${CONSUL_HOST:localhost}:${CONSUL_PORT:8500}
    registration:
      enabled: true
application-test.yml
micronaut:
  server:
    port: -1
  http:
    services:
      feteBirdProduct:
        urls:
          - http://product
consul:
  client:
    registration:
      enabled: false
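To surface the real cause behind the 500, it may help to catch the client exception in the test and print the error body the embedded server returned (a hedged sketch using Micronaut's io.micronaut.http.client.exceptions.HttpClientResponseException; the test logic itself is unchanged):

@Test
void shouldSearchTheItemBasedOnTheName() {
    try {
        var value = iProductClient.freeTextSearch("test").blockingGet();
        System.out.println(value);
    } catch (HttpClientResponseException e) {
        // The status and the raw error body usually point at the failing server-side code.
        System.out.println(e.getStatus());
        System.out.println(e.getResponse().getBody(String.class).orElse("<no body>"));
    }
}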

Listening to many Kafka Streams in Spring

I'm developing an application with an event-driven architecture.
I'm trying to model the following flow of events:
UserAccountCreated (user-management-events) -> sending an e-mail -> MailNotificationSent (notification-service-events)
The notification-service application executes the whole flow. It waits for the UserAccountCreated event by listening to the user-management-events topic. When the event is received, the application sends the email and publishes a new event, MailNotificationSent, to the notification-service-events topic.
I have no problems with listening to the first event (UserAccountCreated): the application receives it and performs the rest of the flow. I also have no problem with publishing the MailNotificationSent event. Unfortunately, for development purposes, I want to listen to the MailNotificationSent event in the notification service as well, so the application has to listen to both UserAccountCreated and MailNotificationSent. This is where I'm not able to make it work.
Let's take a look at the implementation:
NotificationStreams:
public interface NotificationStreams {

    String INPUT = "notification-service-events-in";
    String OUTPUT = "notification-service-events-out";

    @Input(INPUT)
    SubscribableChannel inboundEvents();

    @Output(OUTPUT)
    MessageChannel outboundEvents();
}
NotificationEventsListener:
@Slf4j
@Component
@RequiredArgsConstructor
public class NotificationEventsListener {

    @StreamListener(NotificationStreams.INPUT)
    public void notificationServiceEventsIn(Flux<ActivationLinkSent> input) {
        input.subscribe(event -> {
            log.info("Received event ActivationLinkSent: " + event.toString());
        });
    }
}
UserManagementEvents:
public interface UserManagementEvents {

    String INPUT = "user-management-events";

    @Input(INPUT)
    SubscribableChannel inboundEvents();
}
UserManagementEventsListener:
@Slf4j
@Component
@RequiredArgsConstructor
public class UserManagementEventsListener {

    private final Gate gate;

    @StreamListener(UserManagementEvents.INPUT)
    public void userManagementEvents(Flux<UserAccountCreated> input) {
        input.subscribe(event -> {
            log.info("Received event UserAccountCreated: " + event.toString());
            gate.dispatch(SendActivationLink.builder()
                    .email(event.getEmail())
                    .username(event.getUsername())
                    .build()
            );
        });
    }
}
KafkaStreamsConfig:
@EnableBinding(value = {NotificationStreams.class, UserManagementEvents.class})
public class KafkaStreamsConfig {
}
EventPublisher:
@Slf4j
@RequiredArgsConstructor
@Component
public class EventPublisher {

    private final NotificationStreams eventsStreams;
    private final AvroMessageBuilder messageBuilder;

    public void publish(Event event) {
        MessageChannel messageChannel = eventsStreams.outboundEvents();
        AvroActivationLinkSent activationLinkSent = new AvroActivationLinkSent();
        activationLinkSent.setEmail(((ActivationLinkSent) event).getEmail());
        activationLinkSent.setUsername(((ActivationLinkSent) event).getUsername() + "-domain");
        activationLinkSent.setTimestamp(System.currentTimeMillis());
        messageChannel.send(messageBuilder.buildMessage(activationLinkSent));
    }
}
application config:
spring:
  devtools:
    restart:
      enabled: true
  cloud:
    stream:
      default:
        contentType: application/*+avro
      kafka:
        binder:
          brokers: localhost:9092
      schemaRegistryClient:
        endpoint: http://localhost:8990
  kafka:
    consumer:
      group-id: notification-group
      auto-offset-reset: earliest
kafka:
  bootstrap:
    servers: localhost:9092
The application seems to ignore the notification-service-events listener; it works when listening to only one stream.
I'm almost 100% sure that this is not an issue with publishing the event, because I've connected to Kafka manually and verified that messages are published properly:
kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic notification-service-events-out --from-beginning
Do you have any ideas what else I should check? Is there any additional configuration on the Spring side?
I've found where the problem was.
I was missing the bindings configuration. In the application properties, I should have added the following lines:
cloud:
  stream:
    bindings:
      notification-service-events-in:
        destination: notification-service-events
      notification-service-events-out:
        destination: notification-service-events
      user-management-events-in:
        destination: user-management-events
(Without an explicit destination, Spring Cloud Stream uses the binding name itself as the topic name, so the consumer had been reading from a topic literally named notification-service-events-in while the publisher was writing to notification-service-events-out.) In the user-management-service I didn't have such a problem because I used a different property:
cloud:
  stream:
    default:
      contentType: application/*+avro
      destination: user-management-events

Kafka consumer in Spring Cloud Stream doesn't start

Here is the configuration of my consumer:
spring:
  cloud:
    stream:
      defaultBinder: kafka
      bindings:
        input:
          destination: greeting
          content-type: application/json
      kafka:
        binder:
          brokers: kafka
          zkNodes: zookeeper
The code of my app:
@SpringBootApplication
@EnableIntegration
@EnableBinding(CommandSink.class)
public class KafkaTesterApplication {

    private static Logger logger = LogManager.getLogger(KafkaTesterApplication.class);

    /**
     * @param args
     */
    public static void main(String[] args) {
        SpringApplication.run(KafkaTesterApplication.class, args);
    }

    @ServiceActivator(inputChannel = "input")
    public void receiveMessage(String message) {
        logger.debug("receive {}", message);
    }
}
And the sink interface:
public interface CommandSink {

    public static final String CHANNEL = "input";

    @Input(CommandSink.CHANNEL)
    SubscribableChannel command();
}
It looks like the consumer doesn't connect to ZooKeeper and Kafka.
Any idea?
OK, we found the solution...
We don't know why, but a topic was missing. The most curious part of the problem was that a consumer using ZooKeeper (old style) could consume messages.
The missing topic was
__consumer_offsets
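For anyone hitting the same symptom: __consumer_offsets is the internal topic in which Kafka-based (new-style) consumers store their offsets, which is why an old ZooKeeper-based consumer still worked without it. A hedged sketch for checking that the topic exists, using the kafka-clients AdminClient (the broker address is an assumption; adjust to your environment):

import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListTopicsOptions;

public class ListTopics {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092"); // assumed broker
        try (AdminClient admin = AdminClient.create(props)) {
            // listInternal(true) includes internal topics such as __consumer_offsets.
            admin.listTopics(new ListTopicsOptions().listInternal(true))
                 .names()
                 .get()
                 .forEach(System.out::println);
        }
    }
}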
