How to send and receive a file with Java RabbitMQ?

How to send a file with Java RabbitMQ?
Especially using a message converter.
I'm using the Spring Framework and can send a String or an ArrayList, but I can't send a File. I only use convertAndSend and convertAndReceive to send the File, but I get:
org.springframework.amqp.AmqpIOException: java.io.FileNotFoundException
I don't know how to use a message converter. The code is from here, with some classes changed:
HelloWorldHandler.java
package org.springframework.amqp.helloworld.async;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import org.springframework.amqp.core.Message;
public class HelloWorldHandler {
public void handleMessage(File message) throws IOException {
BufferedReader br = new BufferedReader(new FileReader(message));
System.out.println(br.readLine());
}
}
ProducerConfiguration.java
package org.springframework.amqp.helloworld.async;
import java.io.File;
import java.util.concurrent.atomic.AtomicInteger;
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.scheduling.annotation.ScheduledAnnotationBeanPostProcessor;
@Configuration
public class ProducerConfiguration {
protected final String helloWorldQueueName = "hello.world.queue";
@Bean
public RabbitTemplate rabbitTemplate() {
RabbitTemplate template = new RabbitTemplate(connectionFactory());
template.setRoutingKey(this.helloWorldQueueName);
return template;
}
@Bean
public ConnectionFactory connectionFactory() {
CachingConnectionFactory connectionFactory = new CachingConnectionFactory("x.x.x.x");
connectionFactory.setUsername("username");
connectionFactory.setPassword("password");
return connectionFactory;
}
@Bean
public ScheduledProducer scheduledProducer() {
return new ScheduledProducer();
}
@Bean
public BeanPostProcessor postProcessor() {
return new ScheduledAnnotationBeanPostProcessor();
}
static class ScheduledProducer {
@Autowired
private volatile RabbitTemplate rabbitTemplate;
private final AtomicInteger counter = new AtomicInteger();
@Scheduled(fixedRate = 3000)
public void sendMessage() {
rabbitTemplate.convertAndSend(new File("test.txt"));
}
}
}

You can convert the file content into a byte array and send the byte[] as shown below.
byte[] fileData = // get content from file as byte[] [Refer Here][1]
String fileType = // get file type from file
Message message = MessageBuilder.withBody(fileData).setHeader("ContentType", fileType).build();
rabbitTemplate.send("exchange name", "routing key", message);
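For context, here is a minimal sketch of both sides of that approach; it assumes the queue name from the question ("hello.world.queue") and a made-up "fileName" header, and is not taken from the original answer:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.MessageBuilder;
import org.springframework.amqp.rabbit.core.RabbitTemplate;

public class FileMessagingSketch {

    private final RabbitTemplate rabbitTemplate;

    public FileMessagingSketch(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    // Producer side: read the whole file into memory and ship it as the message body.
    public void sendFile(String path) throws IOException {
        byte[] fileData = Files.readAllBytes(Paths.get(path));
        Message message = MessageBuilder.withBody(fileData)
                .setHeader("fileName", Paths.get(path).getFileName().toString())
                .build();
        // default exchange, queue name as routing key
        rabbitTemplate.send("", "hello.world.queue", message);
    }

    // Consumer side (assumed to be wired via a listener container): write the bytes back to disk.
    public void handleMessage(Message message) throws IOException {
        String fileName = (String) message.getMessageProperties().getHeaders().get("fileName");
        Path target = Paths.get("received-" + fileName);
        Files.write(target, message.getBody());
    }
}
Only the bytes cross the broker, so anything the receiver needs (file name, content type) has to travel in headers.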

Related

How to create a working TCP Server socket in spring boot and how to handle the incoming message?

I have tried to implement a TCP server socket with Spring Integration in an already existing Spring Boot application, but I am facing a problem, and this problem drives me crazy...
The client sends a message (a byte array) to the server and times out. That's it.
I am not receiving any exceptions from the server. It seemed I had provided the wrong port or something, but after checking the port, I am sure it is the right one.
This is my annotation based configuration class:
import home.brew.server.socket.ServerSocketHandler;
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.ip.dsl.Tcp;
@Log4j2
@Configuration
@EnableIntegration
public class TcpServerSocketConfiguration {
@Value("${socket.port}")
private int serverSocketPort;
@Bean
public IntegrationFlow server(ServerSocketHandler serverSocketHandler) {
TcpServerConnectionFactorySpec connectionFactory =
Tcp.netServer(serverSocketPort)
.deserializer(new CustomSerializerDeserializer())
.serializer(new CustomSerializerDeserializer())
.soTcpNoDelay(true);
TcpInboundGatewaySpec inboundGateway =
Tcp.inboundGateway(connectionFactory);
return IntegrationFlows
.from(inboundGateway)
.handle(serverSocketHandler::handleMessage)
.get();
}
@Bean
public ServerSocketHandler serverSocketHandler() {
return new ServerSocketHandler();
}
}
I wanted to make the receive functionality work before I try to send an answer; that's why I have a minimal configuration.
The following class should process the received message from the server socket:
import lombok.extern.log4j.Log4j2;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.MessagingException;
@Log4j2
public class ServerSocketHandler {
public String handleMessage(Message<?> message, MessageHeaders messageHeaders) {
log.info(message.getPayload());
// TODO implement something useful to process the incoming message here...
return message.getPayload().toString();
}
}
The handler method above was never invoked, not even once!
I have googled for example implementations and tutorials, but I haven't found anything that worked for me.
I already tried the implementations from these sites:
https://vispud.blogspot.com/2019/03/how-to-implement-simple-echo-socket.html
https://docs.spring.io/spring-integration/docs/current/reference/html/ip.html#note-nio
Spring Boot TCP Client
and a bunch more sites... but nothing helped me :-(
UPDATE 1
I have implemented a custom serializer/deserializer:
import lombok.Data;
import lombok.extern.log4j.Log4j2;
import org.springframework.core.serializer.Deserializer;
import org.springframework.core.serializer.Serializer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
@Log4j2
@Data
public class CustomSerializerDeserializer implements Serializer<byte[]>,
Deserializer<byte[]> {
@Override
public byte[] deserialize(InputStream inputStream) throws IOException {
return inputStream.readAllBytes();
}
@Override
public void serialize(byte[] object, OutputStream outputStream) throws IOException {
outputStream.write(object);
}
}
After the client has sent a message, the custom deserializer is invoked, but the content is always empty. I have no idea why... The deserializer needs a lot of time to read all bytes from the stream, and in the end it is empty. The procedure repeats over and over, so I think I have built an infinite loop by accident...
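(For reference: readAllBytes() only returns once the client closes the connection, which explains the blocking. A deserializer that stops at a delimiter avoids this; below is a minimal sketch, not the poster's code, assuming the client terminates each message with a line feed. Spring Integration's ByteArrayLfSerializer / TcpCodecs.lf() provide the same behaviour out of the box.)
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import org.springframework.core.serializer.Deserializer;
import org.springframework.core.serializer.Serializer;
import org.springframework.integration.ip.tcp.serializer.SoftEndOfStreamException;

// Hypothetical delimiter-based codec: reads one message up to '\n' instead of
// calling readAllBytes(), which only returns when the peer closes the stream.
public class LfSerializerDeserializer implements Serializer<byte[]>, Deserializer<byte[]> {

    @Override
    public byte[] deserialize(InputStream inputStream) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        int b;
        while ((b = inputStream.read()) != -1) {
            if (b == '\n') {
                return buffer.toByteArray();   // one complete message
            }
            buffer.write(b);
        }
        if (buffer.size() == 0) {
            // the peer closed the connection between messages
            throw new SoftEndOfStreamException("Stream closed between messages");
        }
        return buffer.toByteArray();
    }

    @Override
    public void serialize(byte[] message, OutputStream outputStream) throws IOException {
        outputStream.write(message);
        outputStream.write('\n');              // terminate outgoing messages the same way
        outputStream.flush();
    }
}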
UPDATE 2
I have captured the communication between the client and the server socket:
It looks like I am stuck in the handshake, and therefore there is no payload...
So if anybody could help me out with this, I would be very thankful and if you need some more information, just let me know.
Thanks in advance!
Well, after a few days of analysing and coding, I found the best solution for me to handle TCP socket communication using Spring Integration. For other developers who are struggling with the same problems, here is what I've done so far.
This class contains a working (for me) annotation-based TCP socket connection configuration:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.IntegrationComponentScan;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.ip.IpHeaders;
import org.springframework.integration.ip.tcp.TcpInboundGateway;
import org.springframework.integration.ip.tcp.TcpOutboundGateway;
import org.springframework.integration.ip.tcp.connection.AbstractClientConnectionFactory;
import org.springframework.integration.ip.tcp.connection.AbstractServerConnectionFactory;
import org.springframework.integration.ip.tcp.connection.TcpNetClientConnectionFactory;
import org.springframework.integration.ip.tcp.connection.TcpNetServerConnectionFactory;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.web.context.request.RequestContextListener;
/**
* Spring annotation based configuration
*/
@Configuration
@EnableIntegration
@IntegrationComponentScan
public class TcpServerSocketConfiguration {
public static final CustomSerializerDeserializer SERIALIZER = new CustomSerializerDeserializer();
@Value("${socket.port}")
private int socketPort;
/**
* Reply messages are routed to the connection only if the reply contains the ip_connectionId header
* that was inserted into the original message by the connection factory.
*/
@MessagingGateway(defaultRequestChannel = "toTcp")
public interface Gateway {
void send(String message, @Header(IpHeaders.CONNECTION_ID) String connectionId);
}
@Bean
public MessageChannel fromTcp() {
return new DirectChannel();
}
@Bean
public MessageChannel toTcp() {
return new DirectChannel();
}
@Bean
public AbstractServerConnectionFactory serverCF() {
TcpNetServerConnectionFactory serverCf = new TcpNetServerConnectionFactory(socketPort);
serverCf.setSerializer(SERIALIZER);
serverCf.setDeserializer(SERIALIZER);
serverCf.setSoTcpNoDelay(true);
serverCf.setSoKeepAlive(true);
// serverCf.setSingleUse(true);
// final int soTimeout = 5000;
// serverCf.setSoTimeout(soTimeout);
return serverCf;
}
@Bean
public AbstractClientConnectionFactory clientCF() {
TcpNetClientConnectionFactory clientCf = new TcpNetClientConnectionFactory("localhost", socketPort);
clientCf.setSerializer(SERIALIZER);
clientCf.setDeserializer(SERIALIZER);
clientCf.setSoTcpNoDelay(true);
clientCf.setSoKeepAlive(true);
// clientCf.setSingleUse(true);
// final int soTimeout = 5000;
// clientCf.setSoTimeout(soTimeout);
return clientCf;
}
@Bean
public TcpInboundGateway tcpInGate() {
TcpInboundGateway inGate = new TcpInboundGateway();
inGate.setConnectionFactory(serverCF());
inGate.setRequestChannel(fromTcp());
inGate.setReplyChannel(toTcp());
return inGate;
}
@Bean
public TcpOutboundGateway tcpOutGate() {
TcpOutboundGateway outGate = new TcpOutboundGateway();
outGate.setConnectionFactory(clientCF());
outGate.setReplyChannel(toTcp());
return outGate;
}
}
This class contains a custom serializer and deserializer:
import lombok.extern.log4j.Log4j2;
import org.jetbrains.annotations.NotNull;
import org.springframework.core.serializer.Deserializer;
import org.springframework.core.serializer.Serializer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
/**
* A custom serializer for incoming and/or outgoing messages.
*/
@Log4j2
public class CustomSerializerDeserializer implements Serializer<byte[]>, Deserializer<byte[]> {
@NotNull
@Override
public byte[] deserialize(InputStream inputStream) throws IOException {
byte[] message = new byte[0];
if (inputStream.available() > 0) {
message = inputStream.readAllBytes();
}
log.debug("Deserialized message {}", new String(message, StandardCharsets.UTF_8));
return message;
}
@Override
public void serialize(@NotNull byte[] message, OutputStream outputStream) throws IOException {
log.info("Serializing {}", new String(message, StandardCharsets.UTF_8));
outputStream.write(message);
outputStream.flush();
}
}
In the following classes you can implement some business logic to process incoming ...
import lombok.extern.log4j.Log4j2;
import org.springframework.integration.annotation.MessageEndpoint;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.stereotype.Component;
@Log4j2
@Component
@MessageEndpoint
public class ServerSocketHandler {
@ServiceActivator(inputChannel = "fromTcp")
public byte[] handleMessage(byte[] msg) {
// TODO implement some business logic here
return msg;
}
}
and outgoing messages.
import lombok.extern.log4j.Log4j2;
import org.springframework.integration.annotation.MessageEndpoint;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.stereotype.Component;
@Log4j2
@Component
@MessageEndpoint
public class ClientSocketHandler {
@ServiceActivator(inputChannel = "toTcp")
public byte[] handleMessage(byte[] msg) {
// implement some business logic here
return msg;
}
}
Hope it helps. ;-)
How are you communicating with this server? By default the connection factory is configured to require the input to be terminated by CRLF (e.g. Telnet). You have to configure a different deserializer if your client uses something else to indicate a message end.
Also, your method signature is incorrect; it should be:
public String handleMessage(byte[] message, MessageHeaders messageHeaders) {
String string = new String(message);
System.out.println(string);
return string.toUpperCase();
}
This works fine for me with Telnet:
$ telnet localhost 1234
Trying ::1...
Connected to localhost.
Escape character is '^]'.
foo
FOO
^]
telnet> quit
Connection closed.
And here is a version that works with just LF (e.g. netcat):
@Bean
public IntegrationFlow server(ServerSocketHandler serverSocketHandler) {
return IntegrationFlows.from(Tcp.inboundGateway(
Tcp.netServer(1234)
.deserializer(TcpCodecs.lf())
.serializer(TcpCodecs.lf())))
.handle(serverSocketHandler::handleMessage)
.get();
}
$ nc localhost 1234
foo
FOO
^C

@Value returning null in unit test

I have a Spring Boot app with an endpoint test configuration class and a unit test for my HTTP client. I am trying to get my server address and port from my application.properties, which is located in my src/test. (All the classes are in my src/test.)
Here is my config class code :
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import javax.xml.bind.JAXBException;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.ResourceUtils;
import com.nulogix.billing.service.PredictionEngineService;
import com.nulogix.billing.ws.endpoint.AnalyzeEndPoint;
import com.nulogix.billing.ws.endpoint.GetVersionEndPoint;
@Configuration
public class EndPointTestConfiguration {
@Value("${billing.engine.address}")
private String mockAddress;
@Value("${billing.engine.port}")
private String mockPort;
@Bean
public String getAddress() {
String serverAddress = "http://" + mockAddress + ":" + mockPort;
return serverAddress;
}
@Bean
public GetVersionEndPoint getVersionEndPoint() {
return new GetVersionEndPoint();
}
}
I annotated the values from my .properties with @Value and then created a method, exposed as a bean, to return my server address string.
I then use that string value here in my HttpClientTest class:
import static org.junit.Assert.*;
import java.io.IOException;
import java.util.Map;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.fluent.Request;
import org.apache.http.entity.ContentType;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.ConfigurableApplicationContext;
import com.google.gson.Gson;
import com.nulogix.billing.configuration.EndPointTestConfiguration;
import com.nulogix.billing.mockserver.MockServerApp;
@SpringBootTest(classes = EndPointTestConfiguration.class)
public class HttpClientTest {
@Autowired
EndPointTestConfiguration endpoint;
public static final String request_bad = "ncs|56-2629193|1972-03-28|20190218|77067|6208|3209440|self|";
public static final String request_good = "ncs|56-2629193|1972-03-28|20190218|77067|6208|3209440|self|-123|-123|-123|0.0|0.0|0.0|0.0|0.0|0.0|0.0";
//gets application context
static ConfigurableApplicationContext context;
//call mock server before class
@BeforeClass
static public void setup(){
SpringApplication springApplication = new SpringApplicationBuilder()
.sources(MockServerApp.class)
.build();
context = springApplication.run();
}
//shutdown mock server after class
@AfterClass
static public void tearDown(){
SpringApplication.exit(context);
}
@Test
public void test_bad() throws ClientProtocolException, IOException {
// missing parameter
String result = Request.Post(endpoint.getAddress())
.connectTimeout(2000)
.socketTimeout(2000)
.bodyString(request_bad, ContentType.TEXT_PLAIN)
.execute().returnContent().asString();
Map<?, ?> resultJsonObj = new Gson().fromJson(result, Map.class);
// ensure the key exists
assertEquals(resultJsonObj.containsKey("status"), true);
assertEquals(resultJsonObj.containsKey("errorMessage"), true);
// validate values
Boolean status = (Boolean) resultJsonObj.get("status");
assertEquals(status, false);
String errorMessage = (String) resultJsonObj.get("errorMessage");
assertEquals(errorMessage.contains("Payload has incorrect amount of parts"), true);
}
@Test
public void test_good() throws ClientProtocolException, IOException {
String result = Request.Post(endpoint.getAddress())
.connectTimeout(2000)
.socketTimeout(2000)
.bodyString(request_good, ContentType.TEXT_PLAIN)
.execute().returnContent().asString();
Map<?, ?> resultJsonObj = new Gson().fromJson(result, Map.class);
// ensure the key exists
assertEquals(resultJsonObj.containsKey("status"), true);
assertEquals(resultJsonObj.containsKey("errorMessage"), false);
assertEquals(resultJsonObj.containsKey("HasCopay"), true);
assertEquals(resultJsonObj.containsKey("CopayAmount"), true);
assertEquals(resultJsonObj.containsKey("HasCoinsurance"), true);
assertEquals(resultJsonObj.containsKey("CoinsuranceAmount"), true);
assertEquals(resultJsonObj.containsKey("version"), true);
// validate values
Boolean status = (Boolean) resultJsonObj.get("status");
assertEquals(status, true);
String version = (String) resultJsonObj.get("version");
assertEquals(version, "0.97");
}
}
I use it in the Request.Post because I didn't want to hardcode my IP address and port number.
When I run the test it says:
[ERROR] HttpClientTest.test_bad:63 NullPointer
[ERROR] HttpClientTest.test_good:86 NullPointer
But I am not sure why it is null. I am pretty sure I have everything instantiated, and the string is clearly populated.
My package structure for my config is com.billing.mockserver and my package structure for my unit test is com.billing.ws.endpoint.
Here is my application.properties
server.port=9119
server.ssl.enabled=false
logging.config=classpath:logback-spring.xml
logging.file=messages
logging.file.max-size=50MB
logging.level.com.nulogix=DEBUG
billing.engine.address=127.0.0.1
billing.engine.port=9119
billing.engine.api.version=0.97
billing.engine.core.name=Patient_Responsibility
You are missing some Spring Boot basics here. A @Configuration class is there to initialize other Spring beans and infrastructure, and such classes are among the first to get initialized. You should not @Autowire a @Configuration class.
In your configuration class you can either create a Spring bean for the address and port and autowire that in your test class, or use @Value directly in your test class.
Example: in your configuration class you are creating a bean of GetVersionEndPoint, and you can autowire that in your test class.
Update 2:
For test classes, you need to add an application.properties file in src/test/resources.
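For illustration, a minimal sketch of the second option (injecting the properties straight into the test); it assumes JUnit 4 with spring-boot-test on the classpath and application.properties under src/test/resources, and the class name is made up:
import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

import com.nulogix.billing.configuration.EndPointTestConfiguration;

// Hypothetical test class: the SpringRunner starts the context, so @Value fields are resolved.
@RunWith(SpringRunner.class)
@SpringBootTest(classes = EndPointTestConfiguration.class)
public class BillingPropertiesTest {

    @Value("${billing.engine.address}")
    private String mockAddress;

    @Value("${billing.engine.port}")
    private String mockPort;

    @Test
    public void propertiesAreResolved() {
        assertNotNull(mockAddress);
        assertNotNull(mockPort);
        // with the properties shown in the question this prints http://127.0.0.1:9119
        System.out.println("http://" + mockAddress + ":" + mockPort);
    }
}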

How to write a proper unit test for Elasticsearch in Java

Overview:
I'm totally new to Elasticsearch testing and I'm going to add proper unit tests. The project compatibilities are as follows:
Java 8
Elasticsearch 6.2.4
The project uses the low-level REST client for fetching data from ES
More info about the ES configuration is as follows:
import static java.net.InetAddress.getByName;
import static java.util.Arrays.stream;
import java.net.UnknownHostException;
import java.util.Map;
import java.util.Objects;
import javax.inject.Inject;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.TransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import au.com.api.util.RestClientUtil;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@Configuration
public class ElasticConfiguration implements InitializingBean{
@Value(value = "${elasticsearch.hosts}")
private String[] hosts;
@Value(value = "${elasticsearch.httpPort}")
private int httpPort;
@Value(value = "${elasticsearch.tcpPort}")
private int tcpPort;
@Value(value = "${elasticsearch.clusterName}")
private String clusterName;
@Inject
private RestClientUtil client;
@Bean
public RestHighLevelClient restHighClient() {
return new RestHighLevelClient(RestClient.builder(httpHosts()));
}
@Bean
@Deprecated
public RestClient restClient() {
return RestClient.builder(httpHosts()).build();
}
/**
* @return TransportClient
* @throws UnknownHostException
*/
@SuppressWarnings("resource")
@Bean
public TransportClient transportClient() throws UnknownHostException{
Settings settings = Settings.builder()
.put("cluster.name", clusterName).build();
return new PreBuiltTransportClient(settings).addTransportAddresses(transportAddresses());
}
@Override
public void afterPropertiesSet() throws Exception {
log.debug("loading search templates...");
try {
for (Map.Entry<String, String> entry : Constants.SEARCH_TEMPLATE_MAP.entrySet()) {
client.putInlineSearchTemplateToElasticsearch(entry.getKey(), entry.getValue());
}
} catch (Exception e) {
log.error("Exception has occurred in putting search templates into ES.", e);
}
}
private HttpHost[] httpHosts() {
return stream(hosts).map(h -> new HttpHost(h, httpPort, "http")).toArray(HttpHost[]::new);
}
private TransportAddress[] transportAddresses() throws UnknownHostException {
TransportAddress[] transportAddresses = stream(hosts).map(h -> {
try {
return new TransportAddress(getByName(h), tcpPort);
} catch (UnknownHostException e) {
log.error("Exception has occurred in creating ES TransportAddress. host: '{}', tcpPort: '{}'", h, tcpPort, e);
}
return null;
}).filter(Objects::nonNull).toArray(TransportAddress[]::new);
if (transportAddresses.length == 0) {
throw new UnknownHostException();
}
return transportAddresses;
}
}
Issue:
I don't know how to mock ES or how to test ES without running a standalone ES instance on my machine. Please use the following class as an example and let me know how I could write a test case (a unit test, not an integration test) for the getSearchResponse method:
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.transport.NoNodeAvailableException;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.script.ScriptType;
import org.elasticsearch.script.mustache.SearchTemplateRequestBuilder;
import org.elasticsearch.search.Scroll;
import org.elasticsearch.search.aggregations.Aggregation;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.MessageSource;
import org.springframework.stereotype.Repository;
@Slf4j
@Repository
@NoArgsConstructor
public abstract class NewBaseElasticsearchRepository {
@Autowired
protected NewIndexLocator newIndexLocator;
@Value(value = "${elasticsearch.client.timeout}")
private Long timeout;
@Autowired
protected TransportClient transportClient;
@Autowired
protected ThresholdService thresholdService;
@Autowired
protected MessageSource messageSource;
/**
* @param script the name of the script to be executed
* @param templateParams a map of the parameters to be sent to the script
* @param indexName the index to target (an empty indexName will search all indexes)
*
* @return a Search Response object containing details of the request results from Elasticsearch
*
* @throws NoNodeAvailableException thrown when the transport client cannot connect to any ES Nodes (or Coordinators)
* @throws Exception thrown for all other request errors such as parsing and non-connectivity related issues
*/
protected SearchResponse getSearchResponse(String script, Map<String, Object> templateParams, String... indexName) {
log.debug("transport client >> index name --> {}", Arrays.toString(indexName));
SearchResponse searchResponse;
try {
searchResponse = new SearchTemplateRequestBuilder(transportClient)
.setScript(script)
.setScriptType(ScriptType.STORED)
.setScriptParams(templateParams)
.setRequest(new SearchRequest(indexName))
.execute()
.actionGet(timeout)
.getResponse();
} catch (NoNodeAvailableException e) {
log.error(ELASTIC_SEARCH_EXCEPTION_NOT_FOUND, e.getMessage());
throw new ElasticSearchException(ELASTIC_SEARCH_EXCEPTION_NOT_FOUND);
} catch (Exception e) {
log.error(ELASTIC_SEARCH_EXCEPTION, e.getMessage());
throw new ElasticSearchException(ELASTIC_SEARCH_EXCEPTION);
}
log.debug("searchResponse ==> {}", searchResponse);
return searchResponse;
}
So, I would be grateful if you could have a look at the example class and share your solutions for how I could mock TransportClient and get a proper response from the SearchResponse object.
Note:
I tried to use ESTestCase from org.elasticsearch.test:framework:6.2.4 but faced a jar hell issue and couldn't resolve it. In the meantime, I couldn't find any proper docs related to that, or to Java ES unit testing in general.

Spring Batch java config error using ClassifierCompositeItemWriter

I'm using Spring Batch with Java configuration (new to this) and I'm running into an error when trying to use a ClassifierCompositeItemWriter to generate separate files based on a classifier.
The error I'm getting is org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to
My configuration looks as follows:
package com.infonova.btcompute.batch.geneva.job;
import com.infonova.btcompute.batch.billruntransfer.BillRunTranStatusFinishedJobAssignment;
import com.infonova.btcompute.batch.billruntransfer.BillRunTranStatusInprogressJobAssignment;
import com.infonova.btcompute.batch.billruntransfer.BillRunTransferStatus;
import com.infonova.btcompute.batch.geneva.camel.GenevaJobLauncher;
import com.infonova.btcompute.batch.geneva.dto.GenevaDetailsResultsDto;
import com.infonova.btcompute.batch.geneva.dto.GenveaDetailsTransactionDto;
import com.infonova.btcompute.batch.geneva.properties.GenevaDetailsExportJobProperties;
import com.infonova.btcompute.batch.geneva.rowmapper.GenevaDetailsTransactionsRowMapper;
import com.infonova.btcompute.batch.geneva.steps.*;
import com.infonova.btcompute.batch.repository.BillrunTransferStatusMapper;
import com.infonova.btcompute.batch.utils.FileNameGeneration;
import com.infonova.product.batch.camel.CamelEnabledJob;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.support.ClassifierCompositeItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.classify.BackToBackPatternClassifier;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.JdbcTemplate;
import javax.sql.DataSource;
import java.io.File;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
public abstract class AbstractGenevaDetailsExportJob extends CamelEnabledJob {
private static final Logger LOGGER = LoggerFactory.getLogger(AbstractGenevaDetailsExportJob.class);
@Autowired
protected JobBuilderFactory jobBuilders;
@Autowired
protected StepBuilderFactory stepBuilders;
@Autowired
protected DataSource datasource;
@Autowired
private BillrunTransferStatusMapper billrunTransferStatusMapper;
@Autowired
protected JdbcTemplate jdbcTemplate;
public abstract GenevaDetailsExportJobProperties jobProperties();
@Bean
public RouteBuilder routeBuilder(final GenevaDetailsExportJobProperties jobProperties, final Job job) {
return new RouteBuilder() {
@Override
public void configure() throws Exception {
from(jobProperties.getConsumer())
.transacted("PROPAGATION_REQUIRED")
.routeId(jobProperties.getInputRouteName())
.process(genevaJobLauncher(job));
//.to("ftp://app#127.0.0.1?password=secret");
}
};
}
@Bean
public Processor genevaJobLauncher(Job job) {
return new GenevaJobLauncher(job);
}
@Bean
@StepScope
public GenevaDetailsReader reader() {
GenevaDetailsReader reader = new GenevaDetailsReader(jobProperties().getMandatorKey(),
jobProperties().getInvoiceType(), jobProperties().getSqlResourcePath());
reader.setSql("");
reader.setDataSource(datasource);
reader.setRowMapper(new GenevaDetailsTransactionsRowMapper());
reader.setFetchSize(jobProperties().getFetchSize());
return reader;
}
@Bean
@StepScope
public GenevaDetailsItemProcessor processor() {
return new GenevaDetailsItemProcessor();
}
@Bean
@StepScope
public ClassifierCompositeItemWriter writer() {
List<String> serviceCodes = new ArrayList<>();//billrunTransferStatusMapper.getServiceCodes(jobProperties().getMandatorKey());
Long billingTaskId = billrunTransferStatusMapper.getCurrentTaskId(jobProperties().getMandatorKey());
String countryKey = billrunTransferStatusMapper.getCountryKey(billingTaskId);
serviceCodes.add("BTCC");
serviceCodes.add("CCMS");
BackToBackPatternClassifier classifier = new BackToBackPatternClassifier();
classifier.setRouterDelegate(new GenveaDetailsRouterClassifier());
HashMap<String, Object> map = new HashMap<>();
for (String serviceCode : serviceCodes) {
map.put(serviceCode, genevaDetailsWriter(serviceCode, countryKey));
}
classifier.setMatcherMap(map);
ClassifierCompositeItemWriter<GenveaDetailsTransactionDto> writer = new ClassifierCompositeItemWriter<>();
writer.setClassifier(classifier);
return writer;
}
@Bean
@StepScope
public GenevaDetailsFlatFileItemWriter genevaDetailsWriter(String serviceCode, String countryKey) {
GenevaDetailsFlatFileItemWriter writer = new GenevaDetailsFlatFileItemWriter(jobProperties().getDelimiter());
FileNameGeneration fileNameGeneration = new FileNameGeneration();
try {
FileSystemResource fileSystemResource = new FileSystemResource(new File(jobProperties().getExportDir(), fileNameGeneration.generateFileName(jdbcTemplate,
serviceCode, countryKey)));
writer.setResource(fileSystemResource);
} catch (SQLException e) {
LOGGER.error("Error creating FileSystemResource : " + e.getMessage());
}
return writer;
}
@Bean
public Job job() {
return jobBuilders.get(jobProperties().getJobName())
.start(setBillRunTransferStatusDetailInprogressStep())
.next(processGenevaDetailsStep())
.next(setBillRunTransferStatusProcessedStep())
.build();
}
@Bean
public Step setBillRunTransferStatusDetailInprogressStep() {
return stepBuilders.get("setBillRunTransferStatusDetailInprogressStep")
.tasklet(setBillRunTransferStatusDetailInprogress())
.build();
}
@Bean
public Tasklet setBillRunTransferStatusDetailInprogress() {
return new BillRunTranStatusInprogressJobAssignment(BillRunTransferStatus.SUMMARY.toString(), BillRunTransferStatus.DETAILS_INPROGRESS.toString(),
jobProperties().getMandatorKey(), jobProperties().getInvoiceTypeNum(), jobProperties().getReportTypeNum());
}
@Bean
public Step setBillRunTransferStatusProcessedStep() {
return stepBuilders.get("setBillRunTransferStatusProcessedStep")
.tasklet(setBillRunTransferStatusProcessed())
.build();
}
@Bean
public Tasklet setBillRunTransferStatusProcessed() {
return new BillRunTranStatusFinishedJobAssignment(BillRunTransferStatus.PROCESSED.toString());
}
@Bean
public Step processGenevaDetailsStep() {
return stepBuilders.get("processGenevaDetailsStep")
.<GenveaDetailsTransactionDto, GenevaDetailsResultsDto>chunk(jobProperties().getChunkSize())
.reader(reader())
.processor(processor())
.writer(writer())
.build();
}
}
and my writer looks like:
package com.infonova.btcompute.batch.geneva.steps;
import com.infonova.btcompute.batch.geneva.dto.GenevaDetailsResultsDto;
import com.infonova.btcompute.batch.repository.BillrunTransferStatusMapper;
import com.infonova.btcompute.batch.utils.FileNameGeneration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.annotation.BeforeStep;
import org.springframework.batch.item.*;
import org.springframework.batch.item.file.FlatFileHeaderCallback;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;
import java.io.File;
import java.io.IOException;
import java.io.Writer;
import java.sql.SQLException;
import java.util.Iterator;
import java.util.List;
@Component
public class GenevaDetailsFlatFileItemWriter extends FlatFileItemWriter<GenevaDetailsResultsDto> {
private static final Logger LOGGER = LoggerFactory.getLogger(GenevaDetailsFlatFileItemWriter.class);
@Autowired
protected JdbcTemplate jdbcTemplate;
@Autowired
private BillrunTransferStatusMapper billrunTransferStatusMapper;
private String delimiter;
public GenevaDetailsFlatFileItemWriter(String delimiter) {
this.delimiter = delimiter;
this.setLineAggregator(getLineAggregator());
this.setHeaderCallback(getHeaderCallback());
}
private DelimitedLineAggregator<GenevaDetailsResultsDto> getLineAggregator() {
DelimitedLineAggregator<GenevaDetailsResultsDto> delLineAgg = new DelimitedLineAggregator<>();
delLineAgg.setDelimiter(delimiter);
BeanWrapperFieldExtractor<GenevaDetailsResultsDto> fieldExtractor = new BeanWrapperFieldExtractor<>();
fieldExtractor.setNames(getNames());
delLineAgg.setFieldExtractor(fieldExtractor);
return delLineAgg;
}
private String[] getHeaderNames() {
return new String[] {"Record ID", "Service Identifier", "Billing Account Reference", "Cost Description", "Event Cost",
"Event Date and Time", "Currency Code", "Charge Category", "Order Identifier", "Net Usage", "UOM",
"Quantity", "Service Start Date", "Service End Date"};
}
private String[] getNames() {
return new String[] {"RECORD_ID", "SERVICE_CODE", "BILLING_ACCOUNT_REFERENCE", "COST_DESCRIPTION", "EVENT_COST",
"EVENT_DATE_AND_TIME", "CURRENCY_CODE", "CHARGE_CATEGORY", "ORDER_IDENTIFIER", "NET_USAGE", "UOM",
"QUANTITY", "SERVICE_START_DATE", "SERVICE_END_DATE"};
}
private FlatFileHeaderCallback getHeaderCallback()
{
return new FlatFileHeaderCallback() {
@Override
public void writeHeader(Writer writer) throws IOException {
writer.write(String.join(delimiter, getHeaderNames()));
}
};
}
// @BeforeStep
// public void beforeStep(StepExecution stepExecution) {
// billingTaskId = (Long) stepExecution.getJobExecution().getExecutionContext().get("billingTaskId");
// FileNameGeneration fileNameGeneration = new FileNameGeneration();
//
// try {
// FileSystemResource fileSystemResource = new FileSystemResource(new File(exportDir, fileNameGeneration.generateFileName(jdbcTemplate,
// serviceCode, billrunTransferStatusMapper.getCountryKey(billingTaskId))));
// setResource(fileSystemResource);
// } catch (SQLException e) {
// LOGGER.error("Error creating FileSystemResource : " + e.getMessage());
// }
// }
}
I have searched the web and cannot find a solution to this issue.
What @Hansjoerg Wingeier wrote about ClassifierCompositeItemWriter is correct, but the right way to resolve the problem is to register the delegated writer(s) as stream(s) using AbstractTaskletStepBuilder.stream(), so that Spring Batch manages the execution context lifecycle for them.
ClassifierCompositeItemWriter does not implement the ItemStream interface, hence the open method of your FlatFileItemWriter is never called.
The easiest thing to do is to call the open method when you create your classifier map:
for (String serviceCode : serviceCodes) {
FlatFileItemWriter writer = genevaDetailsWriter(serviceCode, countryKey);
writer.open(new ExecutionContext());
map.put(serviceCode, writer);
}
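For completeness, here is a sketch of the stream-registration approach mentioned above, based on the step definition from the question. It assumes the classifier map is filled with these same writer instances and that the country key is looked up the same way as in writer(); it is not the original configuration:
// Not the original code: the delegate writers are created up front and registered with
// .stream(...) so Spring Batch calls open()/update()/close() on them, while the
// ClassifierCompositeItemWriter (which is not an ItemStream) keeps routing the items.
@Bean
public Step processGenevaDetailsStep() {
    Long billingTaskId = billrunTransferStatusMapper.getCurrentTaskId(jobProperties().getMandatorKey());
    String countryKey = billrunTransferStatusMapper.getCountryKey(billingTaskId);
    GenevaDetailsFlatFileItemWriter btccWriter = genevaDetailsWriter("BTCC", countryKey);
    GenevaDetailsFlatFileItemWriter ccmsWriter = genevaDetailsWriter("CCMS", countryKey);

    return stepBuilders.get("processGenevaDetailsStep")
            .<GenveaDetailsTransactionDto, GenevaDetailsResultsDto>chunk(jobProperties().getChunkSize())
            .reader(reader())
            .processor(processor())
            .writer(writer())        // ClassifierCompositeItemWriter delegating to the two writers
            .stream(btccWriter)
            .stream(ccmsWriter)
            .build();
}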

Capturing LOGIN COOKIE via Rest Template

I have the code below for making a POST call to the REST API of a Tableau system, which is working, and I can see the response output.
However, I would like to capture the cookie from this output, and it needs to be used for further consumption. Can somebody help me with this problem?
Code:
package com.abc.it.automation.service;
import java.io.FileInputStream;
import java.io.IOException;
import java.net.CookieStore;
import java.net.HttpCookie;
import java.net.URI;
import java.security.KeyManagementException;
import java.security.NoSuchAlgorithmException;
import java.util.List;
import java.util.Properties;
import org.springframework.http.HttpHeaders;
import org.springframework.http.RequestEntity.HeadersBuilder;
import org.springframework.http.ResponseEntity;
import org.springframework.http.client.ClientHttpResponse;
import org.springframework.web.client.ResponseExtractor;
import org.springframework.web.client.RestTemplate;
import com.abc.it.automation.utils.SSLUtil;
public class BIaaSTableauService {
private static Properties tableau_properties = new Properties();
static {
// Loads the values from configuration file into the Properties instance
try {
tableau_properties.load(new FileInputStream("res/config.properties"));
} catch (IOException e) {
System.out.println(e);
}
}
private static final String loginURL = tableau_properties.getProperty("server.host");
private static final String siteSearchURL = tableau_properties.getProperty("site.search.url");
public static void main(String[] args) throws KeyManagementException, NoSuchAlgorithmException {
RestTemplate restTableau = new RestTemplate();
String requestLogin = "<tsRequest>"+
"<credentials name=\"svc_tableau\" password=\"xxxxxxxxx\" >"+
"<site contentUrl=\"\"/>"+
"</credentials>"+
"</tsRequest>";
SSLUtil.turnOffSslChecking();
ResponseEntity<String> responseLogin = restTableau.postForEntity(loginURL, requestLogin, String.class);
System.out.println(responseLogin.getBody());
}
}
You need to build your RestTemplate as follows.
RestTemplate restTableau = new RestTemplate(new MyClientHttpRequestFactory());
Extend SimpleClientHttpRequestFactory as follows.
public class MyClientHttpRequestFactory extends SimpleClientHttpRequestFactory {
private Cookie cookie;
//setters and getters.
@Override
protected void prepareConnection(HttpURLConnection connection, String httpMethod) {
this.setCookie(connection.getRequestProperty("Cookie"));
}
}
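An alternative sketch that stays inside the poster's main method: read the Set-Cookie header from the login response and replay it on the follow-up request. It assumes the usual org.springframework.http imports (HttpEntity, HttpMethod) and is not part of the original answer:
// Capture the cookie(s) returned by the sign-in call...
List<String> cookies = responseLogin.getHeaders().get(HttpHeaders.SET_COOKIE);

// ...and send them back on the next request.
HttpHeaders headers = new HttpHeaders();
if (cookies != null) {
    headers.put(HttpHeaders.COOKIE, cookies);
}
ResponseEntity<String> siteResponse = restTableau.exchange(
        siteSearchURL, HttpMethod.GET, new HttpEntity<>(headers), String.class);
System.out.println(siteResponse.getBody());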
