Problem upgrading Spring Batch from version 3.0.7 to 3.0.9 - Java

I am working on a Java project using the Spring Batch framework and I have to upgrade it from version 3.0.7 to 3.0.9, but I ran into a problem:
@Bean
public Step bonjourRetourJpaToX(StepBuilderFactory stepBuilderFactory, TaskExecutor taskExecutor,
        ItemProcessor<BonjourRetourGroup, BonjourElementBlocAgent> compBonjourRetourBonjourProcessor,
        @Qualifier("promotionListenerBonjourRetour") ExecutionContextPromotionListener promotionListenerBonjourRetour) {
    return stepBuilderFactory.get("bonjourRetourJpaToX").<BonjourRetourGroup, BonjourElementBlocAgent>chunk(batchSizeLoadXml)
            .reader(bonjourRetourJpaReader)
            .processor(compBonjourRetourBonjourProcessor)
            .writer(bonjourRetourXmlWriter)
            .taskExecutor(taskExecutor)
            .listener(promotionListenerBonjourRetour)
            .listener(dsBonjourRetourFinalProcessor())
            .listener(dsBonjourRetourTemporaryProcessor)
            .listener(bonjourRetourBonjourBoucleStepListener())
            .throttleLimit(bonjourRetourJobThrottleLimit)
            .build();
}
It fails with this Maven compilation error:
[ERROR] src/main/java/source/bonjourRetour/batch/BonjourRetourJobConfig.java:[1025,33] cannot find symbol
symbol: method throttleLimit(int)
This step is multi-threaded.

For Mahmoud Ben Hassine:
The composite processor:
@Scope(value = "step", proxyMode = ScopedProxyMode.NO)
@Bean(name = "compBonjourRetourBonjourProcessorX")
public ItemProcessor<BonjourRetourGroup, XElementBlocAgent> compBonjourRetourBonjourProcessorX(@Value("#{stepExecution}") final StepExecution stepExecution) {
    CompositeItemProcessor<BonjourRetourGroup, XElementBlocAgent> compositeProcessor = new CompositeItemProcessor<>();
    compositeProcessor
            .setDelegates(Arrays.asList(dsBonjourRetourXTemporaryProcessor, dsBonjourRetourXinalProcessor()));
    return compositeProcessor;
}
I noticed that when I moved the listener bonjourRetourBonjourBoucleStepListener() up like this, it compiles:
@Bean
public Step bonjourRetourJpaToX(StepBuilderFactory stepBuilderFactory, TaskExecutor taskExecutor,
        ItemProcessor<BonjourRetourGroup, BonjourElementBlocAgent> compBonjourRetourBonjourProcessor,
        @Qualifier("promotionListenerBonjourRetour") ExecutionContextPromotionListener promotionListenerBonjourRetour) {
    return stepBuilderFactory.get("bonjourRetourJpaToX").<BonjourRetourGroup, BonjourElementBlocAgent>chunk(batchSizeLoadXml)
            .reader(bonjourRetourJpaReader)
            .processor(compBonjourRetourBonjourProcessor)
            .writer(bonjourRetourXmlWriter)
            .taskExecutor(taskExecutor)
            .listener(promotionListenerBonjourRetour)
            .listener(dsBonjourRetourFinalProcessor())
            .listener(bonjourRetourBonjourBoucleStepListener())
            // SWITCHED
            .listener(dsBonjourRetourTemporaryProcessor)
            .throttleLimit(bonjourRetourJobThrottleLimit)
            .build();
}
I know that dsBonjourRetourTemporaryProcessor returns an ItemProcessor and bonjourRetourBonjourBoucleStepListener returns a StepExecutionListener.
In the other parts of my project that had this problem, moving a StepExecutionListener in the same way also made the code compile.
Maybe the problem is there?

Related

Spring Cloud Data Flow datasource overrides Spring Batch app datasource

I'm setting up an instance of Spring Cloud Data Flow. I've run the following command:
java -jar spring-cloud-dataflow-server-2.9.2.jar \
--spring.cloud.dataflow.features.streams-enabled=false \
--spring.cloud.dataflow.features.schedules-enabled=true \
--spring.datasource.url=jdbc:postgresql://localhost:5432/batch \
--spring.datasource.username=postgres \
--spring.datasource.password=postgres \
--spring.datasource.driver-class-name=org.postgresql.Driver \
--spring.datasource.initialization_mode=always
I've developed a batch job using Spring Batch to be deployed on this platform. The job uses two data sources: batch for Spring Batch and Spring Cloud Task metadata, and app_db for my business logic. When I run the app locally, it persists metadata in batch and my business data in app_db, as expected. The problem is when I try to execute the job inside Spring Cloud Data Flow. The platform overrides my configured business-logic database and uses only the batch database, which is supposed to store metadata only.
application.yaml
spring:
  batch:
    datasource:
      url: jdbc:postgresql://localhost:5432/batch
      username: postgres
      password: postgres
  datasource:
    url: jdbc:postgresql://localhost:5432/app_db
    username: postgres
    password: postgres
DatasourceConfiguration
public class DatasourceConfiguration {

    @Bean
    @ConfigurationProperties("spring.datasource")
    @Primary
    public DataSourceProperties dataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource dataSource(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.initializeDataSourceBuilder().build();
    }

    @Bean(name = "batchDataSourceProperties")
    @ConfigurationProperties("spring.batch.datasource")
    public DataSourceProperties batchDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource(@Qualifier("batchDataSourceProperties") DataSourceProperties batchDataSourceProperties) {
        return batchDataSourceProperties.initializeDataSourceBuilder().build();
    }
}
@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchApplication {

    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultTaskConfigurer(dataSource);
    }

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
Job
@Bean
public Job startJob(JobBuilderFactory jobBuilderFactory, DataSource dataSource) {
    try {
        System.out.println(dataSource.getConnection().getMetaData().getURL());
    } catch (Exception e) {
        // TODO: handle exception
    }
    // ... (rest of the job definition omitted in the original snippet)
}
When I look at the data source, jdbc:postgresql://localhost:5432/app_db is printed when the batch is executed locally, and jdbc:postgresql://localhost:5432/batch is printed when the batch (task) is executed from SCDF.
I want to know how Data Flow is overriding the application's spring.datasource even though I am not passing any arguments while executing the task. Please suggest a solution to avoid the datasource being overridden.
One solution I am thinking of is creating an AppDatasourceConfiguration (app.datasource) and using it. But is there a way to keep using spring.datasource without it getting overridden by SCDF?
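If the custom-prefix route is acceptable, a minimal sketch of the AppDatasourceConfiguration idea mentioned above could look like this. The app.datasource prefix, class name, and bean names are assumptions for illustration, not part of the original project:
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

// Sketch only: bind the business datasource to a prefix SCDF does not manage (assumed: app.datasource),
// so the launcher can freely override spring.datasource without touching the business database.
@Configuration
public class AppDatasourceConfiguration {

    @Bean(name = "appDataSourceProperties")
    @Primary
    @ConfigurationProperties("app.datasource")
    public DataSourceProperties appDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "appDataSource")
    @Primary
    public DataSource appDataSource(@Qualifier("appDataSourceProperties") DataSourceProperties properties) {
        return properties.initializeDataSourceBuilder().build();
    }
}
The business repositories would then point at appDataSource, while the batch/task metadata keeps using the batchDataSource bean already defined above.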

Spring Batch job param always shows zsh: no matches found

I'm learning how to use Spring Batch for the first time.
I set up my Spring configuration like this:
@EnableBatchProcessing
@SpringBootApplication
public class BatchChap4Application {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job job() {
        return jobBuilderFactory.get("basicJob").start(step1()).build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .tasklet((stepContribution, chunkContext) -> {
                    System.out.println("Hello World");
                    return RepeatStatus.FINISHED;
                }).build();
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchChap4Application.class, args);
    }
}
When I run the jar manually from the command line with a string value, it works perfectly:
java -jar batch.jar executionDate=2021/02/21
But when I try to change the param type from string to date:
java -jar batch.jar executionDate(date)=2021/02/21
I get an unexpected output like this:
zsh: no matches found: executionDate(date)=2021/02/21
I searched for the error but unfortunately found no answer. I'm using macOS for development.
Please help.
You need to escape the parentheses, because zsh treats them as glob characters and reports "no matches found" when the pattern matches nothing:
java -jar batch.jar executionDate\(date\)=2021/02/21
Or pass the job parameter between single quotes:
java -jar batch.jar 'executionDate(date)=2021/02/21'
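Once the parameter reaches the job, the (date)-typed value can be injected into a step-scoped bean. A small sketch, assuming the tasklet from the question is turned into a bean; the class and bean names are illustrative:
import java.util.Date;

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ExecutionDateTaskletConfig {

    // Sketch: late-bind the typed job parameter; "executionDate" matches the name passed on the command line.
    @Bean
    @StepScope
    public Tasklet helloTasklet(@Value("#{jobParameters['executionDate']}") Date executionDate) {
        return (contribution, chunkContext) -> {
            System.out.println("Hello World, executionDate = " + executionDate);
            return RepeatStatus.FINISHED;
        };
    }
}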

Spring Boot tests with Testcontainers' Kafka without DirtiesContext

My goal is to use Kafka Testcontainers with a Spring Boot context in tests without @DirtiesContext. The problem is that, without starting a container separately for each test class, I have no idea how to consume only the messages that were produced by a particular test class or method.
So I end up consuming messages that were not even part of the test class that is running.
One solution might be to purge the topic of messages. I have no idea how to do this; I've tried restarting the container, but then the next test was not able to connect to Kafka.
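On the first idea (purging the topic instead of restarting the container), one way this can be done is with the Kafka AdminClient's deleteRecords API. This is only a sketch under the assumption that a reasonably recent kafka-clients is on the test classpath (listOffsets and OffsetSpec need 2.5+); the class and method names are illustrative:
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.admin.RecordsToDelete;
import org.apache.kafka.clients.admin.TopicDescription;
import org.apache.kafka.common.TopicPartition;

public class TopicPurger {

    // Sketch: remove all records currently on a topic by deleting everything before the latest
    // offset of each partition, so the broker (and the shared container) keeps running.
    public static void purgeTopic(String topic, String bootstrapServers) throws Exception {
        try (AdminClient admin = AdminClient.create(
                Map.<String, Object>of(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers))) {
            TopicDescription description = admin.describeTopics(List.of(topic)).values().get(topic).get();
            // resolve the current end offset of every partition of the topic
            Map<TopicPartition, OffsetSpec> latest = description.partitions().stream()
                    .collect(Collectors.toMap(
                            p -> new TopicPartition(topic, p.partition()),
                            p -> OffsetSpec.latest()));
            // delete all records up to (excluding) those end offsets
            Map<TopicPartition, RecordsToDelete> toDelete = admin.listOffsets(latest).all().get().entrySet().stream()
                    .collect(Collectors.toMap(
                            Map.Entry::getKey,
                            e -> RecordsToDelete.beforeOffset(e.getValue().offset())));
            admin.deleteRecords(toDelete).all().get();
        }
    }
}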
A second solution I had in mind is to create a consumer at the beginning of the test method that records messages from the latest offset onwards while the rest of the test runs. I've been able to do this with embedded Kafka, but I have no idea how to do it using Testcontainers.
The current configuration looks like this:
@TestConfiguration
public class KafkaContainerConfig {

    @Bean(initMethod = "start", destroyMethod = "stop")
    public KafkaContainer kafkaContainer() {
        return new KafkaContainer("5.0.3");
    }

    @Bean
    public KafkaAdmin kafkaAdmin(KafkaProperties kafkaProperties, KafkaContainer kafkaContainer) {
        kafkaProperties.setBootstrapServers(List.of(kafkaContainer.getBootstrapServers()));
        return new KafkaAdmin(kafkaProperties.buildAdminProperties());
    }
}
With an annotation that provides the above configuration:
@Target({ElementType.TYPE})
@Retention(RetentionPolicy.RUNTIME)
@Import(KafkaContainerConfig.class)
@EnableAutoConfiguration(exclude = TestSupportBinderAutoConfiguration.class)
@TestPropertySource("classpath:/application-test.properties")
@DirtiesContext
public @interface IncludeKafkaTestContainer {
}
And a test class with multiple such configurations looks like this:
@IncludeKafkaTestContainer
@IncludePostgresTestContainer
@SpringBootTest(webEnvironment = RANDOM_PORT)
class SomeTest {
    ...
}
Currently, the consumer in a test method is created this way:
KafkaConsumer<String, String> kafkaConsumer = createKafkaConsumer("topic_name");
ConsumerRecords<String, String> consumerRecords = kafkaConsumer.poll(Duration.ofSeconds(1));
List<ConsumerRecord<String, String>> topicMsgs = Lists.newArrayList(consumerRecords.iterator());
And:
public static KafkaConsumer<String, String> createKafkaConsumer(String topicName) {
    Properties properties = new Properties();
    properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaContainer.getBootstrapServers());
    properties.put(ConsumerConfig.GROUP_ID_CONFIG, "testGroup_" + topicName);
    properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

    KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(properties);
    kafkaConsumer.subscribe(List.of(topicName));
    return kafkaConsumer;
}
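A sketch of the second idea (only reading records produced after the test starts), assuming the same container and topics as above; the helper name and the random group id are illustrative, not from the original post:
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.UUID;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class FromLatestConsumers {

    // Sketch: a consumer that ignores everything already on the topic, so a shared container
    // can be reused across test classes without @DirtiesContext.
    public static KafkaConsumer<String, String> createConsumerFromLatest(String topicName, String bootstrapServers) {
        Properties properties = new Properties();
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        // fresh group id per call so committed offsets from earlier tests are never reused
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, "testGroup_" + UUID.randomUUID());
        properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties);
        consumer.subscribe(List.of(topicName));
        // short poll to trigger the initial partition assignment, then jump to the current end
        // of each partition; later polls only return records produced by the running test
        consumer.poll(Duration.ofSeconds(1));
        consumer.seekToEnd(consumer.assignment());
        return consumer;
    }
}
Calling this at the start of the test method, before the code under test produces anything, gives the poll() in the existing snippet a clean starting point.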

When my Spring app runs, it isn't using my TogglzConfig file

I have a large Spring application that is set up without XML using only annotations. I have made some changes to this application and have a separate project with what should be almost all the same code. However, in this separate project, Togglz seems to be using some sort of default config instead of the TogglzConfig file I've set up.
The first sign that something was wrong was when I couldn't access the Togglz console. I get a 403 Forbidden error despite my config being set to allow anyone to use it (as shown on the Togglz site). I then did some tests and tried to see a list of features and the list is empty when I call FeatureContext.getFeatureManager().getFeatures() despite my Feature class having several features included. This is why I think it's using some sort of default.
Features.java
public enum Features implements Feature {

    FEATURE1,
    FEATURE2,
    FEATURE3,
    FEATURE4,
    FEATURE5;

    public boolean isActive() {
        return FeatureContext.getFeatureManager().isActive(this);
    }
}
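For reference, this is how such a feature enum is typically consumed at a call site (a trivial illustration, not code from the project):
// hypothetical call site guarded by the FEATURE1 toggle
if (Features.FEATURE1.isActive()) {
    // behaviour enabled by the toggle
} else {
    // existing behaviour
}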
TogglzConfiguration.java
@Component
public class TogglzConfiguration implements TogglzConfig {

    public Class<? extends Feature> getFeatureClass() {
        return Features.class;
    }

    public StateRepository getStateRepository() {
        File properties = [internal call to property file];
        try {
            return new FileBasedStateRepository(properties);
        } catch (Exception e) {
            throw new TogglzConfigException("Error getting Togglz configuration from " + properties + ".", e);
        }
    }

    @Override
    public UserProvider getUserProvider() {
        return new UserProvider() {
            @Override
            public FeatureUser getCurrentUser() {
                return new SimpleFeatureUser("admin", true);
            }
        };
    }
}
SpringConfiguration.java
#EnableTransactionManagement
#Configuration
#ComponentScan(basePackages = { "root package for the entire project" }, excludeFilters =
#ComponentScan.Filter(type=FilterType.ANNOTATION, value=Controller.class))
public class SpringConfiguration {
#Bean
public TransformerFactory transformerFactory() {
return TransformerFactory.newInstance();
}
#Bean
public DocumentBuilderFactory documentBuilderfactory() {
return DocumentBuilderFactory.newInstance();
}
#Bean
public RestTemplate restTemplate() {
return new RestTemplate();
}
}
My project finds a bunch of other beans set up with the @Component annotation. I don't know if the problem is that this component isn't being picked up at all or if Togglz simply isn't using it for some reason. I tried printing the name of the FeatureManager returned by FeatureContext.getFeatureManager() and it is FallbackTestFeatureManager, so this seems to confirm my suspicion that it's just not using my config at all.
Anyone have any ideas on what I can check? I'm flat out of ideas, especially since this is working in an almost completely identical IntelliJ project on my machine right now. I just can't find out what's different about the Togglz setup or the Spring configurations. Thanks in advance for your help.
I finally had my light bulb moment and solved this problem. In case anyone else has a similar issue, it seems my mistake was having the Togglz testing and JUnit dependencies added to my project but not limiting them to the test scope. I overlooked that part of the site.
<!-- Togglz testing support -->
<dependency>
    <groupId>org.togglz</groupId>
    <artifactId>togglz-testing</artifactId>
    <version>2.5.0.Final</version>
    <scope>test</scope>
</dependency>
Without that scope, I assume these were overriding the Togglz configuration I created with a default test configuration and that was causing my issue.

Solving a bean cycle with three datasources

Look at this code:
#ConfigurationProperties(prefix = "first.datasource")
#Bean
public DataSource dataSourceFIRST() {
return DataSourceBuilder
.create()
.build();
}
#ConfigurationProperties(prefix = "second.datasource")
#Bean
public DataSource dataSourceSECOND {
return DataSourceBuilder
.create()
.build();
}
#Primary
#Bean
public MyRoutingDataSource routingDataSource(){
MyRoutingDataSource rDS= new MyRoutingDataSource ();
rDS.setDefaultTargetDataSource(dataSourceFIRST);
// some logic for config routing datasource (setting datasources)
// and creating targed data source tDS
//rDS.afterPropertiesSet(); (***)
rDS.setTargetDataSources(tDS);
return rDS;
}
It fails with this error:
┌─────┐
| routingDataSource defined in App
↑ ↓
| dataSourceFIRST defined in App
↑ ↓
| dataSourceInitializer
└─────┘
Uncommenting (***) makes this code work. However, I can't uncomment (***) because it overwrites necessary config in application.properties.
However, by accident I found a solution (it seems to me).
I annotated the first and second datasource methods with @PostConstruct (next to the @Bean annotation - I didn't remove that annotation).
Can you explain why this helps? And is it an acceptable solution? Maybe there is something wrong with this approach.
In case you would like to try something different: from my side, I fought with this for a long time, and discovered in the end that excluding the class DataSourceAutoConfiguration fixed it:
@EnableAutoConfiguration(exclude = { DataSourceAutoConfiguration.class })
I met the same problem. After a lot of time I found the solution in this issue:
Circular dependencies error on Spring Boot's DataSourceInitializer
Set spring.datasource.initialize = false. Hope this can help you.
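For completeness, a sketch of where the exclusion from the first suggestion would typically go; the application class name is illustrative, and @SpringBootApplication exposes the same exclude attribute as @EnableAutoConfiguration:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

// Sketch: disable Spring Boot's DataSource auto-configuration (which, in Boot 1.x, also registers
// the dataSourceInitializer bean shown in the cycle), since the DataSource beans are defined manually anyway.
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class })
public class RoutingApplication {

    public static void main(String[] args) {
        SpringApplication.run(RoutingApplication.class, args);
    }
}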
