JMSListener - dynamic selector - java

I currently have a @JmsListener as shown below. It uses a selector whose value comes from a properties file. This works fine.
@JmsListener(destination = "myQueueDest",
        selector = MyHeaders.SELECTOR_KEY + " = '${myapp.selector_val}'")
private void consumeData(MyCustomObj mycustomObj) { }
I now need to use a dynamic selector whose value lives in memory, rather than a Spring property. Is there a way to use @JmsListener (or some other listener mechanism) to do a selection off the ActiveMQ queue?
Update:
It may be possible to assign an ID to my @JmsListener and then retrieve it from my JmsListenerEndpointRegistry bean: get the listener container by ID, cast it to DefaultMessageListenerContainer, and call setMessageSelector(), although I'm not entirely sure if this will work yet.
This requires setting my DefaultJmsListenerContainerFactory bean to have the cache level of CACHE_SESSION.
But this doesn't seem to work, as the listener picks up all messages, regardless of what I set the message selector to be.
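For reference, the attempt described above looks roughly like this (a sketch only; the listener id "myListenerId" is a placeholder, and the container is stopped and restarted because a selector change only applies to consumers created after it is set):

@Autowired
private JmsListenerEndpointRegistry registry;

public void changeSelector(String newSelectorValue) {
    // look up the container behind the @JmsListener(id = "myListenerId") endpoint
    DefaultMessageListenerContainer container =
            (DefaultMessageListenerContainer) registry.getListenerContainer("myListenerId");
    container.stop();
    container.setMessageSelector(MyHeaders.SELECTOR_KEY + " = '" + newSelectorValue + "'");
    container.start();
}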

The JMS specification says the selector string must be provided when the consumer is created. So the answer is no: a consumer must be closed and recreated with a different selector string to receive messages that match different selection criteria.
If using the JMS API is not a must for your project, you could explore ActiveMQ's native APIs. I am sure the API will have a way to specify a different selection string every time a receive is called; IBM MQ's native API provides such functionality.
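For illustration, a plain-JMS (javax.jms) sketch of that close-and-recreate approach; the selector key and the connection factory are assumptions for the example, and the destination name is reused from the question:

Connection connection = connectionFactory.createConnection();
connection.start();
Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
Queue queue = session.createQueue("myQueueDest");

// consume with one selector...
MessageConsumer consumer = session.createConsumer(queue, "mySelectorKey = 'A'");
Message first = consumer.receive(1000);

// ...then close the consumer and recreate it with a different selector
consumer.close();
consumer = session.createConsumer(queue, "mySelectorKey = 'B'");
Message second = consumer.receive(1000);

connection.close();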

As stated in one of the comments:
the javadoc for setMessageSelector says it can be set at runtime. http://docs.spring.io/spring-framework/docs/2.5.x/api/org/springframework/jms/listener/AbstractMessageListenerContainer.html#setMessageSelector(java.lang.String)
This example shows how to set it up at startup, but doing it dynamically should be possible with a few more tricks:
@EnableJms
@Configuration
public class JmsConfiguration {

    @Value("${my.int.param:100}")
    private int config;

    @Bean
    public MessageConverter messageConverter() {
        final MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
        converter.setTargetType(MessageType.TEXT);
        converter.setTypeIdPropertyName("_type");
        return converter;
    }

    @Bean
    public JmsListenerContainerFactory<?> specialQueueListenerFactory() {
        final String selector = "foo > " + config;
        final DefaultJmsListenerContainerFactory factory = new CustomJmsListenerContainerFactory(selector);
        factory.setMessageConverter(messageConverter());
        return factory;
    }
}
And the CustomJmsListenerContainerFactory
public class CustomJmsListenerContainerFactory extends DefaultJmsListenerContainerFactory {

    private final String selector;

    public CustomJmsListenerContainerFactory(String jmsSelector) {
        this.selector = jmsSelector;
    }

    @Override
    public DefaultMessageListenerContainer createListenerContainer(JmsListenerEndpoint endpoint) {
        final DefaultMessageListenerContainer instance = super.createListenerContainer(endpoint);
        instance.setMessageSelector(selector);
        return instance;
    }
}
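A listener can then opt into this factory by bean name via the containerFactory attribute; for example (a sketch reusing the destination and types from the question):

@JmsListener(destination = "myQueueDest", containerFactory = "specialQueueListenerFactory")
public void consumeData(MyCustomObj myCustomObj) {
    // only messages matching the "foo > ..." selector arrive here
}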

Related

Handle inbound AMQP messages based on a header attribute (e.g. routing key)

I have a service that receives AMQP messages. This service is bound to a queue, which receives all messages that match a set of routing keys.
My setup is as follows:
...
private SomeController controller;

@Autowired
private SimpleMessageListenerContainer receiverContainer;

@Bean
public IntegrationFlow inboundFlow() {
    var adapter = Amqp.inboundAdapter(receiverContainer);
    return IntegrationFlows.from(adapter)
            // some transformations
            .handle(controller, "processMessage")
            .get();
}
This already works fine. However, now I want to handle a message with different controllers, depending on a header attribute. In this case I'd like to have a controller for each routing key. Is it also a good idea to use a single queue with multiple routing keys only to handle it differently for each key?
It is perfectly legitimate to have several bindings between an exchange and a single queue.
See more info in this tutorial: https://www.rabbitmq.com/tutorials/tutorial-four-spring-amqp.html.
The Amqp.inboundAdapter() relies on the DefaultAmqpHeaderMapper.inboundMapper() by default which populates for us an AmqpHeaders.RECEIVED_ROUTING_KEY message header before producing. So, you indeed can use a route(Message.class, m -> m.getHeaders().get(AmqpHeaders.RECEIVED_ROUTING_KEY)) with appropriate channelMapping() for the routing key value.
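For reference, a minimal sketch of that router using channelMapping() (the routing key values and channel names here are placeholders):

@Bean
public IntegrationFlow inboundFlow() {
    return IntegrationFlows.from(Amqp.inboundAdapter(receiverContainer))
            .route(Message.class,
                    // route on the routing key populated by the header mapper
                    m -> m.getHeaders().get(AmqpHeaders.RECEIVED_ROUTING_KEY, String.class),
                    r -> r.channelMapping("routingKey1", "firstChannel")
                          .channelMapping("routingKey2", "secondChannel"))
            .get();
}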
I just wanted to add a code example incorporating Artem Bilan's (correct) answer, because in addition to that, I had to incorporate a gateway (hinted at by Artem Bilan with "appropriate channelMapping()").
For more about why you need a gateway, or in some cases a bridge, refer to this part of the documentation.
My initial code snippet becomes something like the following:
...
@Autowired
private FirstController firstController;

@Autowired
private SecondController secondController;

@Autowired
private SimpleMessageListenerContainer receiverContainer;

@Bean
public IntegrationFlow inboundFlow() {
    var adapter = Amqp.inboundAdapter(receiverContainer);
    return IntegrationFlows.from(adapter)
            // some transformations
            .route(Message.class, m -> getMessageRoutingKey(m),
                    r -> r.subFlowMapping("routingKey1", firstFlow())
                          // after the first subflow, all further integration flows are wrapped in a gateway
                          .subFlowMapping("routingKey2", sf -> sf.gateway(secondFlow())))
            .get();
}

@Bean
public IntegrationFlow firstFlow() {
    return f -> f
            // e.g. additional transformations
            .handle(firstController, "processMessageInFirstFashion");
}

@Bean
public IntegrationFlow secondFlow() {
    return f -> f
            // e.g. additional transformations
            .handle(secondController, "processMessageInSecondFashion");
}

private static String getMessageRoutingKey(final Message<?> message) {
    return message.getHeaders().get(AmqpHeaders.RECEIVED_ROUTING_KEY).toString();
}

Spring boot: push message to specific topic for each request

I am using Pub/Sub integration with Spring Boot, and my configuration class looks like this:
@Configuration
public class PubSubConfiguration {

    @Value("${spring.pubsub.topic.name}")
    private String topicName;

    @Bean
    @ServiceActivator(inputChannel = "MyOutputChannel")
    public PubSubMessageHandler messageSender(PubSubTemplate pubsubTemplate) {
        return new PubSubMessageHandler(pubsubTemplate, topicName);
    }

    @MessagingGateway(defaultRequestChannel = "MyOutputChannel")
    public interface PubsubOutboundGateway {
        void sendToPubsub(String attribute);
    }
}
So far, I have only been calling the sendToPubsub method, which publishes the payload to the topic, like this:
@Autowired
private PubSubConfiguration.PubsubOutboundGateway outboundGateway;

// used in my code wherever it is needed
outboundGateway.sendToPubsub(jsonInString);
The above code is tied to the one topic that I load from the application properties file.
But now I want to pass the topic name to messageSender dynamically. How can I do that?
To override the default topic you can use the GcpPubSubHeaders.TOPIC header.
final Message<?> message = MessageBuilder
        .withPayload(msg.getPayload())
        .setHeader(GcpPubSubHeaders.TOPIC, "newTopic")
        .build();
and modify your sendToPubsub(Message<byte[]> message) gateway method to take the message as input.
Refer to the documentation for more information.
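Putting that together, a minimal sketch of the adjusted gateway and call site (the topic name "newTopic" is just an example):

@MessagingGateway(defaultRequestChannel = "MyOutputChannel")
public interface PubsubOutboundGateway {
    void sendToPubsub(Message<byte[]> message);
}

// at the call site, choose the topic per request via the header
Message<byte[]> message = MessageBuilder
        .withPayload(jsonInString.getBytes(StandardCharsets.UTF_8))
        .setHeader(GcpPubSubHeaders.TOPIC, "newTopic")
        .build();
outboundGateway.sendToPubsub(message);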
Consider creating a BeanFactory to generate a PubSubMessageHandler Bean given a topic name. PubSubMessageHandler also has a setTopic() method, which may be of use.

How to filter meters when publishing NewRelicMeterRegistry

I am using custom timers to instrument a ton of fields via Micrometer. Ideally, I do not want metrics reported for this specific meter when its count is zero within the configured step interval. This is not crucial, but I would love to reduce the noise of what gets sent to New Relic every x seconds.
I've created an extension of NewRelicMeterRegistry that overrides the publish() method to add this filtering before the default behavior.
public class FilteringNewRelicMeterRegistry extends NewRelicMeterRegistry {

    public FilteringNewRelicMeterRegistry(NewRelicConfig config, Clock clock) {
        super(config, clock);
    }

    /**
     * Remove field metrics that have not been used since the last publish.
     */
    @Override
    protected void publish() {
        getMeters().stream()
                .filter(filterByMeterId(...))
                .filter(meter -> ((Timer) meter).count() == 0)
                .forEach(this::remove);
        super.publish();
    }
}
But for the life of me, I can't figure out how to get the AutoConfiguration to prefer this implementation over the default NewRelicMeterRegistry.
How do I get Spring Boot or Micrometer to honor my implementation and use it as the designated bean in the application context for autowiring purposes?
Also, if there is an out-of-the-box way to override this behavior via Micrometer abstractions or utilities, that would be even better! Please let me know. I've tried MeterRegistryCustomizer, but it didn't seem to have what I needed.
I want to avoid Spring's scheduling functionality via @Scheduled; I would like to do this on an "on publish" basis.
If you don't want the default auto-configuration, disable it with:
@SpringBootApplication(exclude = { NewRelicMetricsExportAutoConfiguration.class })
Then have your FilteringNewRelicMeterRegistry extend a StepMeterRegistry (NewRelicMeterRegistry already is one) and configure it with your own behavior; because StepMeterRegistry is a subclass of MeterRegistry, Micrometer will detect your registry.
After that, register your custom registry with a configuration class modelled on NewRelicMetricsExportAutoConfiguration: a StepMeterRegistry needs a StepRegistryConfig (here the default NewRelicConfig) and a Clock. I read NewRelicMetricsExportAutoConfiguration and simplified the configuration like this:
@Configuration(proxyBeanMethods = false)
@AutoConfigureBefore({ CompositeMeterRegistryAutoConfiguration.class, SimpleMetricsExportAutoConfiguration.class })
@AutoConfigureAfter(MetricsAutoConfiguration.class)
@ConditionalOnProperty(prefix = "management.metrics.export.newrelic", name = "enabled", havingValue = "true",
        matchIfMissing = true)
@EnableConfigurationProperties(NewRelicProperties.class)
public class FilteringNewRelicConfiguration {

    private final NewRelicProperties properties;

    public FilteringNewRelicConfiguration(NewRelicProperties properties) {
        this.properties = properties;
    }

    @Bean
    public NewRelicConfig newRelicConfig() {
        return new NewRelicPropertiesConfigAdapter(this.properties);
    }

    @Bean
    public FilteringNewRelicMeterRegistry filteringNewRelicMeterRegistry(NewRelicConfig newRelicConfig, Clock clock) {
        return new FilteringNewRelicMeterRegistry(newRelicConfig, clock);
    }
}

Dynamic Queues on RabbitListener Annotation

I'd like to use queue names following a specific pattern, like project.{queue-name}.queue. To keep this pattern consistent, I wrote a helper class that generates the name from a simple identifier. So foo would generate a queue called project.foo.queue. Simple.
But the @RabbitListener annotation demands a constant string and gives me an error when I use my helper class. How can I achieve this (or maybe another approach) using the @RabbitListener annotation?
@Component
public class FooListener {

    // it doesn't work
    @RabbitListener(queues = QueueName.for("foo"))
    // it works
    @RabbitListener(queues = "project.foo.queue")
    void receive(final FooMessage message) {
        // ...
    }
}
To create and listen to a queue whose name is constructed from a dynamic UUID, you could use ${random.uuid}.
The catch is that the value must be captured into a Java variable in exactly one place, because a new random value is generated each time the property is referenced.
The solution is to use Spring Expression Language (SpEL) to call a function that provides the configured value, something like:
@RabbitListener(queues = "#{configureAMQP.getControlQueueName()}")
void receive(final FooMessage message) {
    // ...
}
Create the queue with something like this:
@Configuration
public class ConfigureAMQP {

    @Value("${controlQueuePrefix}-${random.uuid}")
    private String controlQueueName;

    public String getControlQueueName() {
        return controlQueueName;
    }

    @Bean
    public Queue controlQueue() {
        System.out.println("controlQueue(): controlQueueName=" + controlQueueName);
        return new Queue(controlQueueName, true, true, true);
    }
}
Notice that the necessary bean used in the SpEL expression was created implicitly based on the @Configuration class (with a slight alteration of the spelling: ConfigureAMQP -> configureAMQP).
Declare a magic bean, in this case implicitly named queueName:
@Component
public class QueueName {

    public String buildFor(String name) {
        return "project." + name + ".queue";
    }
}
Access this using a "constant string" that will be evaluated at runtime:
@RabbitListener(queues = "#{queueName.buildFor(\"foo\")}")
If {queue-name} came from a YAML file, it would work:
@RabbitListener(queues = "${queue-name}")
public void receiveMessage(FooMessage message) {
}
Spring will inject the value from application.yml.
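For example, assuming an entry like the following in application.yml (the property name simply mirrors the placeholder above):

queue-name: project.foo.queue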

How do I configure Spring to partially and optionally override properties?

I would like to have a properties setup which can, in certain environments, override specific properties. For example, our default JDBC properties for dev are:
db.driverClassName=com.mysql.jdbc.Driver
db.url=jdbc:mysql://localhost:3306/ourdb
db.username=root
db.password=
The problem is that some of our devs would like a different username/password for the db, or possibly even a db that is not hosted locally. The same is true for our RabbitMQ configuration, which currently uses a similar localhost, guest/guest setup. Being able to override certain elements of this configuration on a per-developer basis would allow us to move much of the infrastructure/installation requirements for building the software off the local machine and onto dedicated servers.
I have set up a simple project to wrap my head around the configuration required to achieve what I want. This is my first foray into the world of Spring property configuration; up till now, property loading and management has been done with custom code. Here is my setup:
class Main_PropertyTest {

    public static void main(String[] args) {
        String environment = System.getenv("APPLICATION_ENVIRONMENT");        // Environment, for example: "dev"
        String subEnvironment = System.getenv("APPLICATION_SUB_ENVIRONMENT"); // Developer name, for example: "joe.bloggs"
        System.setProperty("spring.profiles.active", environment);
        System.setProperty("spring.profiles.sub", subEnvironment);
        try (AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(PropertyTestConfiguration.class)) {
            Main_PropertyTest main = context.getBean(Main_PropertyTest.class);
            main.printProperty();
        }
    }

    private final String property;

    public Main_PropertyTest(String property) {
        this.property = property;
    }

    public void printProperty() {
        System.out.println("And the property is: '" + property + "'.");
    }
}
And my configuration:
@Configuration
public class PropertyTestConfiguration {

    @Bean
    public static PropertySourcesPlaceholderConfigurer primaryPlaceholderConfigurer() {
        PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer = new PropertySourcesPlaceholderConfigurer();
        propertySourcesPlaceholderConfigurer.setLocation(new ClassPathResource(System.getProperty("spring.profiles.active") + ".main.properties"));
        return propertySourcesPlaceholderConfigurer;
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer secondaryPlaceholderConfigurer() {
        PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer = new PropertySourcesPlaceholderConfigurer();
        propertySourcesPlaceholderConfigurer.setLocation(new ClassPathResource(System.getProperty("spring.profiles.sub") + ".main.properties"));
        propertySourcesPlaceholderConfigurer.setIgnoreResourceNotFound(true);
        propertySourcesPlaceholderConfigurer.setOrder(-1);
        return propertySourcesPlaceholderConfigurer;
    }

    @Bean
    public Main_PropertyTest main_PropertyTest(@Value("${main.property}") String property) {
        Main_PropertyTest main_PropertyTest = new Main_PropertyTest(property);
        return main_PropertyTest;
    }
}
And for completeness, my dev.main.properties and test.main.properties:
main.property=dev
main.property=test
The main problem is that I get an IllegalArgumentException. As far as I can tell, what I have written should be the Java config equivalent of this approach: http://taidevcouk.wordpress.com/2013/07/04/overriding-a-packaged-spring-application-properties-file-via-an-external-file/
Unfortunately I get the following error: java.lang.IllegalArgumentException: Could not resolve placeholder 'main.property' in string value "${main.property}". Note that I also need to take care of the case where there is no sub-environment, and this is the case I have started with (although I get the same error even if both files exist). If I remove the bean which sets up the second PropertySourcesPlaceholderConfigurer, then it all works fine (by which I mean dev.main.properties is loaded and "And the property is: 'dev'." is printed out).
A secondary problem is that the code doesn't look great, and each layer of the system will need two PSPCs set up so that they can access these properties. Furthermore, it requires a lot of manual calls to System.getProperty(), since I couldn't pass ${spring.profiles.active} to PSPC.setLocation().
Note: I have tried @PropertySources({primaryProperties, secondaryProperties}), but this fails because secondaryProperties does not exist. I have also tried @Autowired Environment environment; and getting the properties from that, but the secondary PSPC causes the environment not to be autowired...
So following this lengthy explanation, my questions are:
Is this the right way of solving this problem?
If so, what is wrong with my configuration?
How can I simplify the configuration (if at all)?
Is there an alternative mechanism available which would solve my problem?
Thank you for your time! :)
Your configuration is flawed: when configuring a BeanFactoryPostProcessor with Java config, the methods should be static. However, it can be even easier; instead of registering your own PropertySourcesPlaceholderConfigurer, utilize the default @PropertySource support.
Rewrite your Java config to the following:
@Configuration
@PropertySource(name = "main", value = "${spring.profiles.active}.main.properties")
public class PropertyTestConfiguration {

    @Autowired
    private Environment env;

    @PostConstruct
    public void initialize() throws IOException {
        String resource = env.getProperty("spring.profiles.sub") + ".main.properties";
        Resource props = new ClassPathResource(resource);
        if (env instanceof ConfigurableEnvironment && props.exists()) {
            MutablePropertySources sources = ((ConfigurableEnvironment) env).getPropertySources();
            sources.addBefore("main", new ResourcePropertySource(props));
        }
    }

    @Bean
    public Main_PropertyTest main_PropertyTest(@Value("${main.property}") String property) {
        Main_PropertyTest main_PropertyTest = new Main_PropertyTest(property);
        return main_PropertyTest;
    }
}
This should first load dev.main.properties and then, additionally, test.main.properties, which will override the earlier loaded properties (when present, of course).
I had a similar issue with overwriting already existing properties in integration tests. I came up with this solution:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {
        SomeProdConfig.class,
        MyWebTest.TestConfig.class
})
@WebIntegrationTest
public class MyWebTest {

    @Configuration
    public static class TestConfig {

        @Inject
        private Environment env;

        @PostConstruct
        public void overwriteProperties() throws Exception {
            final Map<String, Object> systemProperties = ((ConfigurableEnvironment) env)
                    .getSystemProperties();
            systemProperties.put("some.prop", "test.value");
        }
    }
}
