Spring Integration DSL Syntax problem - how to dynamically construct subflows? - java

I am trying to construct a complex flow in Spring Integration where the sub-flows are dynamically defined at runtime. Code that compiles and functions perfectly well in the master flow definition fails to compile in the sub-flow definition. Since the constructs appear identical, it is not obvious what is going on. Any explanation would be appreciated.
Thank you in advance.
The master flow definition is coded something like this:
StandardIntegrationFlow flow = IntegrationFlows
        .from(setupAdapter,
                c -> c.poller(Pollers.fixedRate(1000L, TimeUnit.MILLISECONDS).maxMessagesPerPoll(1)))
        // This one compiles fine
        .enrichHeaders(h -> h.headerExpression("start", "payload[0].get(\"start\")")
                .headerExpression("end", "payload[0].get(\"end\")"))
        .split(tableSplitter)
        .enrichHeaders(h -> h.headerExpression("object", "payload[0].get(\"object\")"))
        .channel(c -> c.executor(stepTaskExecutor))
        .routeToRecipients(r -> this.buildRecipientListRouterSpecForRules(r, rules))
        .aggregate()
        .handle(cleanupAdapter).get();
buildRecipientListRouterSpecForRules is defined as:
private RecipientListRouterSpec buildRecipientListRouterSpecForRules(RecipientListRouterSpec recipientListSpec,
        Collection<RuleMetadata> rules) {
    rules.forEach(
            rule -> recipientListSpec.recipientFlow(getFilterExpression(rule), f -> createFlowDefForRule(f, rule)));
    return recipientListSpec;
}
createFlowDefForRule() is just a switch() wrapper that chooses which actual DSL to run for the flow defined by the rule. Here is a sample:
public IntegrationFlowDefinition constructASpecificFlowDef(IntegrationFlowDefinition flowDef, RuleMetadata rule) {
    return flowDef
            // This enrichHeaders element fails to compile:
            // The method headerExpression(String, String) is undefined for the type Object
            .enrichHeaders(h -> h.headerExpression("ALC_operation", "payload[0].get(\"ALC_operation\")"));
}

In general, it's better to put such explanations in the question text, rather than as comments in the code snippets; I completely missed that comment.
Can you provide a stripped-down (simpler) example (complete class) that exhibits this behavior so we can play with it?
I tried to simplify what you are doing, and this compiles fine and works as expected:
@SpringBootApplication
public class So65010958Application {

    public static void main(String[] args) {
        SpringApplication.run(So65010958Application.class, args);
    }

    @Bean
    IntegrationFlow flow() {
        return IntegrationFlows.from("foo")
                .routeToRecipients(r -> r.recipientFlow("true", f -> buildFlow(f)))
                .get();
    }

    private IntegrationFlowDefinition<?> buildFlow(IntegrationFlowDefinition<?> f) {
        return f.enrichHeaders(h -> h.headerExpression("foo", "'bar'"))
                .channel(MessageChannels.queue("bar"));
    }

    @Bean
    public ApplicationRunner runner(MessageChannel foo, PollableChannel bar) {
        return args -> {
            foo.send(new GenericMessage<>("foo"));
            System.out.println(bar.receive(0));
        };
    }

}
GenericMessage [payload=foo, headers={foo=bar, id=d526b8fb-c6f8-7731-b1ad-e68e326fcc00, timestamp=1606333567749}]
So, I must be missing something.
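One difference between the two snippets seems worth noting: the failing method in the question declares its parameter as the raw IntegrationFlowDefinition type, while the working buildFlow() above uses IntegrationFlowDefinition<?>. With a raw type, Java erases the generics on all of the class's methods, so the enrichHeaders consumer parameter is inferred as plain Object and headerExpression(String, String) is reported as "undefined for the type Object", exactly as the compiler says. Keeping the wildcard should make the sub-flow definition compile; a sketch:
// Sketch: same method as in the question, only the generic wildcard is kept on the
// parameter and return type, so the HeaderEnricherSpec lambda keeps its proper type.
public IntegrationFlowDefinition<?> constructASpecificFlowDef(IntegrationFlowDefinition<?> flowDef, RuleMetadata rule) {
    return flowDef
            .enrichHeaders(h -> h.headerExpression("ALC_operation", "payload[0].get(\"ALC_operation\")"));
}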

Related

How can I turn the BlockHound check on and off

I have an app with WebFlux and I want to use BlockHound, but I need to be able to turn it on and off through a parameter in application.properties, through Spring profiles, or something else.
I also want to override the action taken when a blocking operation is caught, so that instead of throwing an error it logs a warning. First, I did it through a parameter in application.properties:
@SpringBootApplication
@Slf4j
public class GazPayApplication {

    public static void main(String[] args) {
        ConfigurableApplicationContext context =
                SpringApplication.run(GazPayApplication.class, args);
        BlockHoundSwitch blockHoundSwitch = (BlockHoundSwitch) context.getBean("BlockHoundSwitchBean");
        if (blockHoundSwitch.isBlockHoundEnabled()) {
            BlockHound.install(builder ->
                    builder.blockingMethodCallback(it ->
                            log.warn("find block operation: {}", it.toString())));
        }
    }

}
And my BlockHoundSwitch:
@Component("BlockHoundSwitchBean")
@Getter
public class BlockHoundSwitch {

    @Value("${blockhound.enabled}")
    private boolean blockHoundEnabled;

}
It works for me, but in my opinion this solution is quite clumsy and a little unpredictable.
Next I tried to solve this task through profiles:
@Profile("blockhound_enabled")
@Slf4j
@Component
public class BlockHoundSwitch {

    public BlockHoundSwitch() {
        BlockHound.install(builder ->
                builder.blockingMethodCallback(it ->
                        log.warn("find block operation: {}", it.toString())));
    }

}
And it works too. Well, I have a few questions:
Which way is better, why, and is there maybe another solution?
I need to locate and log where the blocking operation happened. How can I get the class name and method where it happened?
I resolved it; maybe it will be useful to someone. I did it through profiles, and my code is below:
@Profile("blockhound_on")
@Slf4j
@Component
@Getter
public class BlockHoundSwitch {

    public BlockHoundSwitch() {
        BlockHound.install(builder ->
                builder.blockingMethodCallback(it -> {
                    List<StackTraceElement> itemList = Arrays.stream(new Exception(it.toString()).getStackTrace())
                            .filter(i -> i.toString().contains("application.package"))
                            .collect(Collectors.toList());
                    log.warn("find block operation: \n{}", itemList);
                }));
    }

}
where application.package is the main package of my project, which I am looking for in the stack trace.
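On the question of whether there is another solution: one more option (a sketch of my own, not taken from the answers above) is to let Spring Boot gate the installation with @ConditionalOnProperty, so there is no manual bean lookup and no profile needed:
// Sketch only: class name, property name and callback are illustrative.
// The bean is created (and BlockHound installed) only when blockhound.enabled=true.
@Slf4j
@Configuration
@ConditionalOnProperty(name = "blockhound.enabled", havingValue = "true")
public class BlockHoundConfig {

    @PostConstruct
    public void installBlockHound() {
        BlockHound.install(builder ->
                builder.blockingMethodCallback(it ->
                        log.warn("find block operation: {}", it.toString())));
    }
}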

Throwing an exception vs Mono.error() in Spring webflux

I'm working on a Spring webflux project and I want to understand the difference between throwing an exception vs using Mono.error().
If there is a validation class like this for example:
public class NameValidator {
    public static boolean isValid(String name) {
        if (StringUtils.isEmpty(name)) {
            throw new RuntimeException("Invalid name");
        }
        return true;
    }
}

public class NameValidator2 {
    public static Mono<Object> isValid(String name) {
        if (StringUtils.isEmpty(name)) {
            return Mono.error(new RuntimeException("Invalid name"));
        }
        return Mono.just(true);
    }
}
What are the pros and cons of each approach? When should one be used over the other while working with reactive streams in Spring WebFlux?
As @Joao already stated, the recommended way to deal with an error is to call the error method on a Publisher (Mono.error/Flux.error).
I would like to show you an example in which the traditional throw does not work as you may expect:
public void testErrorHandling() {
    Flux.just("a", "b", "c")
            .flatMap(e -> performAction()
                    .onErrorResume(t -> {
                        System.out.println("Error occurred");
                        return Mono.empty();
                    }))
            .subscribe();
}

Mono<Void> performAction() {
    throw new RuntimeException();
}
The onErrorResume operator will never be executed, because the exception is thrown before the Mono is even assembled.
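If performAction() has to stay imperative, one way around this (my sketch, not part of the original answer) is to defer the call so the exception becomes an onError signal and flows through the pipeline:
// Sketch: Mono.defer (or Mono.fromCallable) catches the exception thrown by the supplier
// at subscription time and turns it into an error signal, so onErrorResume is invoked.
Mono<Void> performActionDeferred() {
    return Mono.defer(() -> {
        throw new RuntimeException();
    });
}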
Basically, you will have the same result in the end and no real difference between the two options (maybe performance-wise, but I have not found anything backing this opinion, so I guess it is negligible).
The only "difference" is that Mono.error follows the Reactive Streams specification, while throwing an exception as-is does not (read more at https://github.com/reactive-streams/reactive-streams-jvm/blob/v1.0.3/README.md#2.13). It is not prohibited, but if you like to follow standards and specifications (I guess you do), you should consider using Mono.error.

spring kafka: filtering KafkaNull values from payload

My consumer is configured as follows:
@Bean
public Consumer<Message<List<Foo>>> consumer() {
    return message -> {
        message.getPayload().forEach(it -> {
            // process
        });
    };
}
I've also configured an ErrorHandlingDeserializer, which may produce KafkaNull values in the message payload collection. The problem is that I cannot filter out such values, because accessing the collection with forEach() produces a ClassCastException:
java.lang.ClassCastException: class org.springframework.kafka.support.KafkaNull cannot be cast to class Foo
How can I exclude KafkaNull values from processing (without changing the consumer's signature to Consumer<Message<List<Object>>>)?
So we don't have a filtering function at the moment. I did raise the issue - https://github.com/spring-cloud/spring-cloud-function/issues/736
But for now the best approach for you would be to do something like this:
public Consumer<Message<List<Object>>> consumer() {
    return message -> {
        message.getPayload().stream().filter(it -> it instanceof Foo).forEach(it -> {
            // process
        });
    };
}
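If the remaining elements then need to be treated as Foo instances, a cast step can follow the filter; a small sketch of that variant:
// Sketch: drop KafkaNull (and anything else that is not a Foo), then cast for processing.
message.getPayload().stream()
        .filter(Foo.class::isInstance)
        .map(Foo.class::cast)
        .forEach(foo -> {
            // process a real Foo here
        });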

Java/Spring: Provide a common implementation of a Java function (Supplier or Function...)

Currently, I've this class:
@Component
public class AuditFactory {

    private Supplier<String> auditIdSupplier;

    public AuditFactory(Supplier<String> auditIdSupplier) {
        this.auditIdSupplier = auditIdSupplier;
    }
}
I've coded two projects that are using this AuditFactory.
Currently I'm providing the supplier using this @Bean:
In project front-office:
@Bean
public Supplier<String> auditIdSupplier(FrontOfficeProperties frontOfficeProperties) {
    return () -> String.join(
            "-",
            frontOfficeProperties.getCpdId(),
            UUID.randomUUID().toString()
    );
}
In project back-office:
@Bean
public Supplier<String> auditIdSupplier(BackOfficeProperties backOfficeProperties) {
    return () -> String.join(
            "-",
            backOfficeProperties.getCpdId(),
            UUID.randomUUID().toString()
    );
}
So, I'd like to avoid creating as many Supplier<String> implementations as I have projects, and instead have one common way of building the id.
The only thing that changes is one parameter (cpdId).
I guess I could create a class that implements Supplier<String>, but I can't quite figure out how to get there.
Assuming that AuditFactory is stored in a separate module, you could simply move Supplier<String> auditIdSupplier() to that module and have both FrontOfficeProperties and BackOfficeProperties implement the same common interface.
Then, whenever you build front-office or back-office, the proper OfficeProperties implementation would be injected.
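A minimal sketch of that suggestion (my illustration; OfficeProperties is used here as the shared abstraction the answer refers to):
// Shared module: common abstraction implemented by FrontOfficeProperties and BackOfficeProperties.
public interface OfficeProperties {
    String getCpdId();
}

// Shared module: a single bean definition reused by both projects.
@Bean
public Supplier<String> auditIdSupplier(OfficeProperties officeProperties) {
    return () -> String.join(
            "-",
            officeProperties.getCpdId(),
            UUID.randomUUID().toString()
    );
}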

Headerenricher Spring Integration and java dsl

I'm using Spring Integration and the Java DSL to implement my IntegrationFlow.
I want to use a custom header enricher to add some file names to the headers; it will be something like:
public class FileHeaderNamingEnricher {

    public Message<File> enrichHeader(Message<File> fileMessage) {
        // getting some details from the database ...
        return MessageBuilder.fromMessage(fileMessage)
                .setHeader("filename", "somestuff")
                .build();
    }
}
And my integration flow will look like:
public IntegrationFlow myflow() {
    return IntegrationFlows.from("input")
            .enrich // here I want to enrich the header using my class
}
Can anyone help me with this, please?
You can have your FileHeaderNamingEnricher extend AbstractReplyProducingMessageHandler (put your code in handleRequestMessage()).
Or, implement GenericHandler<T> (its handle method gets the payload and headers as parameters and can return a message).
Then use the .handle method...
...
.handle(myEnricher())
...
@Bean
public MessageHandler myEnricher() {
    return new FileHeaderNamingEnricher();
}
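For the GenericHandler<T> option, a minimal sketch of what the enricher could look like (my illustration, assuming Spring Integration 5.x, where the Message returned from handle() replaces the one in flight):
// Sketch: the payload and headers are passed in; returning a Message carries the new header.
public class FileHeaderNamingEnricher implements GenericHandler<File> {

    @Override
    public Object handle(File payload, MessageHeaders headers) {
        // getting some details from the database ...
        return MessageBuilder.withPayload(payload)
                .copyHeaders(headers)
                .setHeader("filename", "somestuff")
                .build();
    }
}
It then plugs into the flow with .handle(new FileHeaderNamingEnricher()).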
