Consuming messages from an ActiveMQ queue using Spring Integration - Java

So I have an application that sends messages to an ActiveMQ queue with Spring Integration.
<int-feed:inbound-channel-adapter id="feedAdapter"
        channel="feedChannel"
        auto-startup="${auto.startup:true}"
        url="https://stackoverflow.com/feeds/question/49479712">
    <int:poller fixed-rate="10000"/>
</int-feed:inbound-channel-adapter>
<int:channel id="feedChannel"/>
<int:transformer id="transformer" input-channel="feedChannel"
        expression="payload.title + payload.author + '#{systemProperties['line.separator']}'"
        output-channel="feedOutputChannel"/>
<int:channel id="feedOutputChannel"/>
<jms:outbound-gateway id="jmsOutGateway"
        request-destination="inputQueue"
        request-channel="feedOutputChannel"
        requires-reply="false"/>
But now I want to create a different application which consumes messages from that queue and just prints them to the console with Spring Integration. This is what I have made:
<jms:message-driven-channel-adapter id="JMSInboundAdapter" destination="inputQueue"/>
<bean id="inputQueue" class="org.apache.activemq.command.ActiveMQQueue">
    <constructor-arg value="input.queue"/>
</bean>
It works when I run the application that sends messages to the queue, but not when I run the consuming application.
The error I get: Dispatcher has no subscribers for channel 'application.JMSInboundAdapter'.
How do I need to configure my message consumer application?

If there is no channel on the adapter, the id becomes the channel name.
You need something subscribed to that channel (e.g. a <int:service-activator input-channel="JMSInboundAdapter" ... />).
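For example, a minimal consumer configuration along these lines should work (a sketch, not the only option: it assumes the int-stream namespace is declared, and the stdout adapter is just one simple subscriber for printing; a <int:service-activator> would work equally well):

<jms:message-driven-channel-adapter id="JMSInboundAdapter"
        destination="inputQueue"
        channel="printChannel"/>
<int:channel id="printChannel"/>
<!-- prints each incoming payload to System.out -->
<int-stream:stdout-channel-adapter id="consoleOut" channel="printChannel" append-newline="true"/>
<bean id="inputQueue" class="org.apache.activemq.command.ActiveMQQueue">
    <constructor-arg value="input.queue"/>
</bean>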

Related

How to use multiple SFTP outbound-gateways

I'm new to Spring Integration and I'm working with two processes:
1. Read files from SFTP and save their information; I use the "mget" command.
2. After saving the information, rename the processed files on the SFTP server with the "mput" command.
This is my configuration:
<bean id="filePrinter" class="com.maven.integration.FilePrinter" />
<int:channel id="outboundChannel"></int:channel>
<int:channel id="aggregateResultsChannel"/>
<int:channel id="toGet"></int:channel>
<int:gateway id="simpleGateway" service-interface="com.maven.integration.FileGateway"
default-request-channel="ftpChannel" />
<int-sftp:outbound-gateway
session-factory="sftpClientFactory"
request-channel="ftpChannel"
command="mget"
command-options="-R -P"
expression="payload"
filename-regex="(new|.*.csv)"
mode="IGNORE"
local-directory-expression="#targetDir.get() + #remoteDirectory"
reply-channel="outboundChannel"
>
<int-sftp:request-handler-advice-chain>
<int:retry-advice />
</int-sftp:request-handler-advice-chain>
</int-sftp:outbound-gateway>
<int:splitter input-channel="outboundChannel" output-channel="toGet"/>
<bean id="targetDir" class="java.util.concurrent.atomic.AtomicReference">
<constructor-arg value="${target}"/>
</bean>
<int:service-activator ref="filePrinter" method="print"
input-channel="toGet"/>
<int-sftp:outbound-gateway
session-factory="sftpClientFactory"
request-channel="toGet"
command="mput"
command-options="-R"
local-directory="src/test/"
expression="payload"
filename-regex="(new|.*.csv)"
mode="IGNORE"
remote-filename-generator-expression="payload.getName() + '.ack'"
remote-directory-expression="#remoteDirectory"
reply-channel="aggregateResultsChannel"
>
<int-sftp:request-handler-advice-chain>
<int:retry-advice />
</int-sftp:request-handler-advice-chain>
</int-sftp:outbound-gateway>
<int:aggregator input-channel="aggregateResultsChannel"/>
Currently only the first outbound-gateway (the mget command) is executed; the second outbound-gateway never runs. How can I make the second process run?
Because your toGet is a simple DirectChannel with "only one subscriber per message" semantics. At the same time, your mput gateway uses it as its request-channel="toGet", sharing this channel with the <int:service-activator>. And since the service-activator comes earlier in the config, it is the first subscriber on toGet and therefore the only one that processes the messages sent to toGet by the splitter.
I think what you need to do (if the story is still about mput) is exactly the opposite of what you have for mget: do that logic alongside the splitting, not after it. In other words, send the MGET result to the splitter and to the MPUT gateway in parallel. For this purpose I suggest changing your outboundChannel to a <publish-subscribe-channel> and using that outboundChannel as the mput gateway's request-channel, as sketched below.
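Something like this (a sketch of just the changed pieces; all other attributes stay as in your original config):

<int:publish-subscribe-channel id="outboundChannel"/>
<!-- subscriber 1: the existing splitter feeding the service-activator -->
<int:splitter input-channel="outboundChannel" output-channel="toGet"/>
<!-- subscriber 2: the mput gateway now subscribes to outboundChannel instead of toGet -->
<int-sftp:outbound-gateway
        session-factory="sftpClientFactory"
        request-channel="outboundChannel"
        command="mput"
        command-options="-R"
        local-directory="src/test/"
        expression="payload"
        mode="IGNORE"
        remote-filename-generator-expression="payload.getName() + '.ack'"
        remote-directory-expression="#remoteDirectory"
        reply-channel="aggregateResultsChannel"/>

With that, every mget result is delivered to both subscribers, so the splitter/service-activator flow and the mput flow run in parallel. Note that the mput gateway now receives the collection payload produced by mget rather than individual files, so you may need to adjust its expression and filename generation accordingly.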

Message only gets published to one queue in a RabbitMQ fanout exchange (Java)

So, I have 2 queues, outboundEmailQueue and storeEmailQueue:
<rabbit:queue name="outboundEmailQueue"/>
<rabbit:queue name="storeEmailQueue"/>
bound to a fanout exchange called integrationExchange:
<rabbit:fanout-exchange name="integrationExchange" auto-declare="true">
    <rabbit:bindings>
        <rabbit:binding queue="outboundEmailQueue"/>
        <rabbit:binding queue="storeEmailQueue"/>
    </rabbit:bindings>
</rabbit:fanout-exchange>
the template:
<rabbit:template id="integrationRabbitTemplate"
        connection-factory="connectionFactory" exchange="integrationExchange"
        message-converter="jsonMessageConverter" return-callback="returnCallback"
        confirm-callback="confirmCallback" />
This is how I send an object to the exchange:
integrationRabbitTemplate.convertAndSend("integrationExchange", "", outboundEmail);
However, the message only gets published to storeEmailQueue.
What is wrong with my configuration? Why is the message not being queued to outboundEmailQueue?
From the screen captures, it seems your configuration is OK and the message is reaching both queues.
But the consumer configuration on each queue is not the same:
storeEmailQueue has manual consumer acks configured
outboundEmailQueue has auto-ack configured
If you have any doubt:
check the bindings section of either the exchange or the queues to confirm the link is there (but again, from your screen captures, it seems to be present)
stop the consumers and push a message to the exchange; you should see the message ready count (and total count) increase on both queues.
I created the same example and it's working fine; the message is added to both queues. However, I configured it through annotations instead of XML. If you want the annotation-based solution, please follow the link below:
https://stackoverflow.com/questions/45803231/how-to-publish-messages-on-rabbitmq-with-fanout-exchange-using-spring-boot
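For reference, the annotation-based equivalent looks roughly like this (a sketch assuming Spring Boot's RabbitMQ auto-configuration declares the exchange, queues, and bindings on startup; the names mirror the question):

import org.springframework.amqp.core.Binding;
import org.springframework.amqp.core.BindingBuilder;
import org.springframework.amqp.core.FanoutExchange;
import org.springframework.amqp.core.Queue;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FanoutConfig {

    @Bean
    public Queue outboundEmailQueue() {
        return new Queue("outboundEmailQueue");
    }

    @Bean
    public Queue storeEmailQueue() {
        return new Queue("storeEmailQueue");
    }

    @Bean
    public FanoutExchange integrationExchange() {
        return new FanoutExchange("integrationExchange");
    }

    // One binding per queue; a fanout exchange ignores routing keys,
    // so every bound queue receives a copy of each message.
    @Bean
    public Binding bindOutbound() {
        return BindingBuilder.bind(outboundEmailQueue()).to(integrationExchange());
    }

    @Bean
    public Binding bindStore() {
        return BindingBuilder.bind(storeEmailQueue()).to(integrationExchange());
    }
}

With either style of configuration, convertAndSend("integrationExchange", "", outboundEmail) should deliver a copy to every bound queue, since the routing key is ignored by a fanout exchange.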

Spring Integration: Why is client socket waiting for response from server?

I am using Spring Integration to build a client socket (Java code) which must send messages to a server socket (a Flash client). Basically, I want to push messages to Flash over a socket connection without caring about any response from it. The message to be sent comes from a gateway whenever its send method is called.
I am able to push a message, but the problem is that my client socket waits for a response, and if it doesn't get one it eventually times out. Here is my configuration:
<int:gateway id="gw"
service-interface="integration.MessageGateway"
default-request-channel="input"/>
<int-ip:tcp-connection-factory id="client"
type="client"
host="localhost"
port="6767"
serializer="clientSerializer"
single-use="true" so-keep-alive="true"
so-timeout="10000"/>
<bean id="clientSerializer" class="org.springframework.integration.ip.tcp.serializer.ByteArrayCrLfSerializer" />
<int:channel id="input" />
<int-ip:tcp-outbound-gateway id="outGateway"
request-channel="input"
connection-factory="client"
request-timeout="10000"
reply-timeout="10000"/>
Consider using the one-way <int-ip:tcp-outbound-channel-adapter> instead: http://docs.spring.io/spring-integration/docs/latest-ga/reference/html/ip.html#tcp-adapters
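That is, replace the outbound gateway with something along these lines (a sketch reusing the connection factory and channel from the question):

<int-ip:tcp-outbound-channel-adapter id="outAdapter"
        channel="input"
        connection-factory="client"/>

Unlike the gateway, the adapter is one-way: it sends the payload and never waits for (or times out on) a reply.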

Spring AMQP Integration - Consumer Manual Acknowledgement

I am testing Spring AMQP with Spring Integration support. I have the following configuration and test:
<rabbit:connection-factory id="connectionFactory" />
<rabbit:queue name="durableQ"/>
<int:channel id="consumingChannel">
    <int:queue capacity="2"/> <!-- Messages get acked as soon as they are placed in this queue -->
</int:channel>
<int-amqp:inbound-channel-adapter
        channel="consumingChannel"
        queue-names="durableQ"
        connection-factory="connectionFactory"
        concurrent-consumers="1"
        acknowledge-mode="AUTO"
        />
public static void main(String[] args) {
    System.out.println("Starting consumer with integration..");
    AbstractApplicationContext context = new ClassPathXmlApplicationContext(
            "classpath:META-INF/spring/integration/spring-integration-context-consumer.xml");
    PollableChannel consumingChannel = context.getBean("consumingChannel",
            PollableChannel.class);
    int count = 0;
    while (true) {
        Message<?> msg = consumingChannel.receive(1000);
        System.out.println((count++) + " \t -> " + msg);
        try { // sleep to check the number of messages in the queue
            Thread.sleep(50000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
In this configuration it was evident that as soon as messages arrive at consumingChannel they are acked and hence removed from the queue. I validated this by placing a long sleep after receive and checking the queue size. There is no further control over it.
Now if I set acknowledge-mode="MANUAL", there seems to be no way to do the manual ack via Spring Integration.
What I need is to process the message and do a manual ack after processing, so that until the ack the message remains persisted in durableQ.
Is there any way to handle MANUAL acks with Spring AMQP integration? I want to avoid passing a ChannelAwareMessageListener to the inbound-channel-adapter, since I want to keep control of the consumer's receive.
Update:
It doesn't even seem to be possible when using my own listener-container with the inbound-channel-adapter:
<!-- Below creates a default DirectChannel (Spring Integration channel) named "adapter"; receiving from this channel is the same as above -->
<int-amqp:inbound-channel-adapter id="adapter" listener-container="amqpListenerContainer" />
<bean id="amqpListenerContainer" class="org.springframework.amqp.rabbit.listener.SimpleMessageListenerContainer">
    <property name="connectionFactory" ref="connectionFactory" />
    <property name="queueNames" value="durableQ" />
    <property name="acknowledgeMode" value="MANUAL" />
    <!-- messageListener is not allowed when used with the adapter, so there is no way to have
         my own ChannelAwareMessageListener, no Channel exposed to onMessage, hence no way to ack -->
    <property name="messageListener" ref="listener"/>
</bean>
<bean id="listener" class="com.sd.springint.rmq.MsgListener"/>
The above configuration throws an error because the messageListener property is not allowed (see the inline comment on the tag), so the purpose of using a listener-container (exposing the Channel via a ChannelAwareMessageListener) is defeated.
To me it looks like Spring Integration cannot be used for manual acknowledgement (I know, that's a strong claim!). Can anyone help me validate this, or is there a specific approach/configuration for this that I am missing?
The problem is that you are using an async handoff with a QueueChannel. It is generally better to control the concurrency in the container (concurrent-consumers="2") and not do any async handoffs in your flow (use DirectChannels). That way, AUTO ack will work just fine. Instead of receiving from the PollableChannel, subscribe a new MessageHandler() to a SubscribableChannel.
Update:
You normally don't need to deal with Messages in an SI application, but the equivalent of your test with a DirectChannel would be...
SubscribableChannel channel = context.getBean("fromRabbit", SubscribableChannel.class);
channel.subscribe(new MessageHandler() {

    @Override
    public void handleMessage(Message<?> message) throws MessagingException {
        System.out.println("Got " + message);
    }

});
A MANUAL ack is only possible via Channel.basicAck(), so you need access to the Channel on which your message was received.
Try to play with the advice-chain of the <int-amqp:inbound-channel-adapter>:
Implement some Advice as a MethodBeforeAdvice.
The advice-chain on the container is applied to ContainerDelegate#invokeListener.
The first argument of that method is exactly the Channel.
Within that Advice, place that Channel into the MessageProperties.headers.
Configure the <int-amqp:inbound-channel-adapter> with mapped-request-headers that include that Channel header.
And in the end, invoke basicAck() on that Channel header from the Spring Integration Message at any point in your downstream flow; a rough sketch follows.
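A rough sketch of such an advice, under the assumptions above (the invokeListener(Channel, Message) argument order and the "rabbitChannel" header name are illustrative assumptions, not a documented contract):

import java.lang.reflect.Method;

import org.springframework.amqp.core.Message;
import org.springframework.aop.MethodBeforeAdvice;

import com.rabbitmq.client.Channel;

public class ChannelStashingAdvice implements MethodBeforeAdvice {

    @Override
    public void before(Method method, Object[] args, Object target) {
        // Assumption: the advised method is ContainerDelegate#invokeListener(Channel, Message),
        // so args[0] is the AMQP Channel and args[1] the incoming AMQP Message.
        if (args.length > 1 && args[0] instanceof Channel && args[1] instanceof Message) {
            Message message = (Message) args[1];
            // Stash the Channel in a header so the adapter can map it into
            // the Spring Integration message ("rabbitChannel" is an arbitrary name).
            message.getMessageProperties().getHeaders().put("rabbitChannel", args[0]);
        }
    }
}

Downstream, with mapped-request-headers="rabbitChannel,STANDARD_REQUEST_HEADERS" on the adapter, you could then ack manually (basicAck() throws IOException, so handle it appropriately):

Channel channel = message.getHeaders().get("rabbitChannel", Channel.class);
Long deliveryTag = message.getHeaders().get(AmqpHeaders.DELIVERY_TAG, Long.class);
channel.basicAck(deliveryTag, false);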

Keep messages in ActiveMQ queue if ThreadPoolTaskExecutor has no free capacity

I have two Java processes: the first one produces messages and puts them onto an
ActiveMQ queue. The second process (the consumer) uses Spring Integration to get
messages from the queue and processes them in threads.
I have two requirements:
The consumer should have 3 processing threads. If I have 10 messages
coming in through the queue, I want to have 3 threads processing the first 3
messages, and the other 7 messages should be buffered.
When the consumer stops while some messages are not yet processed, it
should continue processing the messages after a restart.
Here's my config:
<bean id="messageActiveMqQueue" class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg value="example.queue" />
</bean>
<int-jms:message-driven-channel-adapter
destination="messageActiveMqQueue" channel="incomingMessageChannel" />
<int:channel id="incomingMessageChannel">
<int:dispatcher task-executor="incomingMessageChannelExecutor" />
</int:channel>
<bean id="incomingMessageChannelExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="daemon" value="false" />
<property name="maxPoolSize" value="3" />
</bean>
<int:service-activator input-channel="incomingMessageChannel"
ref="myMessageProcessor" method="processMessage" />
The first requirement works as expected. I produce 10 messages and 3
myMessageProcessors start processing a message each. As soon as the 1st message has
finished, the 4th message is processed.
However, when I kill the consumer before all messages are processed, those
messages are lost. After a restart, the consumer does not get those messages
again.
I think that's because, in the above configuration, the messages are queued up by the
ThreadPoolTaskExecutor, so they have already been removed from the
incomingMessageChannel. Hence I tried setting the queue capacity of the
incomingMessageChannelExecutor to zero:
<property name="queueCapacity" value="0" />
But now I get error messages when I have more than 3 messages:
2013-06-12 11:47:52,670 WARN [org.springframework.jms.listener.DefaultMessageListenerContainer] - Execution of JMS message listener failed, and no ErrorHandler has been set.
org.springframework.integration.MessageDeliveryException: failed to send Message to channel 'incomingMessageChannel'
I also tried changing the message-driven-channel-adapter to an inbound-gateway,
but this gives me the same error.
Do I have to set an error handler in the inbound-gateway, so that the errors go back to the ActiveMQ queue? How do I have to configure the queue so that the messages are kept in the queue if the ThreadPoolTaskExecutor doesn't have a free thread?
Thanks in advance,
Benedikt
No; instead of using an executor channel, you should be controlling the concurrency with the <message-driven-channel-adapter/>.
Remove the <dispatcher/> from the channel and set concurrent-consumers="3" on the adapter.
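That is, roughly (a sketch of the adjusted configuration; the bean definitions for the queue and the processor stay as they are):

<int-jms:message-driven-channel-adapter
        destination="messageActiveMqQueue"
        channel="incomingMessageChannel"
        concurrent-consumers="3"/>

<!-- plain DirectChannel: processing happens on the container's consumer threads,
     so unconsumed messages remain in ActiveMQ until a thread is free -->
<int:channel id="incomingMessageChannel"/>

<int:service-activator input-channel="incomingMessageChannel"
        ref="myMessageProcessor" method="processMessage"/>

With this setup the JMS acknowledgement happens only after processMessage returns, so messages not yet processed when the consumer is killed stay on the broker and are redelivered after a restart.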
