Elasticsearch Stable Version - Java

I'm new to Elasticsearch. We are building a Spring Boot application with Elasticsearch.
Currently, we are bound to use Spring Boot 2.1.3.RELEASE.
After a lot of R&D, I've decided to use elasticsearch-rest-high-level-client to integrate Elasticsearch into my Spring Boot application.
I'm thinking of setting up the latest Elasticsearch version, 7.7.1.
Is it fine to proceed with the latest version, or is there any reason I should go with a previous version of Elasticsearch?

As you are just starting a new project, I would recommend going with a stable version, so that at least you can be sure that problems in your application are not coming from Elasticsearch itself.
Later in the development process, as you analyse the changelog between releases, you can decide whether you need the fixes and new features of a newer release, and test that in a separate branch.
I suspect that by your first go-live you will end up on a completely different Spring release, and also a different Elastic Stack release, if you are building anything substantial.
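Whichever 7.x version you end up on, wiring the high-level REST client into Spring Boot 2.1 usually comes down to exposing it as a bean yourself. A minimal sketch, assuming the client dependency matches your cluster version and that the host and port are placeholders for your own nodes:

```java
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ElasticsearchClientConfig {

    // "localhost:9200" is a placeholder; point this at your own cluster nodes.
    @Bean(destroyMethod = "close")
    public RestHighLevelClient restHighLevelClient() {
        return new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")));
    }
}
```

Keeping the client and the server on the same minor version (here 7.7.x) avoids most compatibility surprises.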

Related

Camunda BPM - migration from 7.14.0 to 7.18.0

I'm trying to upgrade the Camunda version in one of my current Spring Boot projects, from 7.14.0 to 7.18.0.
I've changed the versions of the Camunda libraries and, from the application perspective, everything works great, but I'm wondering whether I should also make some changes to the database schema to avoid problems with old, unfinished processes.
I've read the Camunda docs and found that the latest patch script is 'engine_7.14_patch_7.14.2_to_7.14.3.sql':
https://docs.camunda.org/manual/7.18/installation/database-schema/#liquibase-patch-level-update
https://jira.camunda.com/browse/CAM-12832
Does that mean that after version 7.14.3 the database schema stays the same?
If not, can someone give me some tips on where I can find information about the migration steps?
Please follow
https://docs.camunda.org/manual/7.18/update/minor/714-to-715/
https://docs.camunda.org/manual/7.18/update/minor/715-to-716/
https://docs.camunda.org/manual/7.18/update/minor/716-to-717/
https://docs.camunda.org/manual/7.18/update/minor/717-to-718/
to get from 7.14 to 7.18, one minor version at a time.
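If you want to script the schema part, each minor update guide points to an engine upgrade script that has to be applied in order. A rough sketch using Spring's ScriptUtils, assuming a PostgreSQL database and Camunda's usual {database}_engine_{from}_to_{to}.sql naming (check the exact file names and location against your Camunda distribution):

```java
import java.sql.Connection;
import javax.sql.DataSource;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.datasource.init.ScriptUtils;

public class CamundaSchemaUpgrade {

    // Illustrative only: runs each minor engine upgrade script in sequence.
    public static void upgrade(DataSource dataSource, String scriptDir) throws Exception {
        String[] scripts = {
                "postgres_engine_7.14_to_7.15.sql",
                "postgres_engine_7.15_to_7.16.sql",
                "postgres_engine_7.16_to_7.17.sql",
                "postgres_engine_7.17_to_7.18.sql"
        };
        try (Connection connection = dataSource.getConnection()) {
            for (String script : scripts) {
                ScriptUtils.executeSqlScript(connection,
                        new FileSystemResource(scriptDir + "/" + script));
            }
        }
    }
}
```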

How to figure out why some file was deleted from spring-boot repository?

I have been updating my Spring Boot application from v2.2.1.RELEASE to v2.6.6, and I've noticed that one of the configurations (HealthIndicatorAutoConfiguration, for instance) no longer exists in the Spring Boot repository. Can you please explain how to figure out why the authors deleted it and what I should use instead? And what should I do if I can't find that information in the release notes?
P.S. Configurations have also disappeared from other Spring repositories (e.g. Spring Cloud Sleuth):
TraceAutoConfiguration.class
SleuthTagPropagationAutoConfiguration.class
TraceWebServletAutoConfiguration.class
SleuthLogAutoConfiguration.class
If you upgrade in stages, going from 2.2.x -> 2.3.x -> 2.4.x -> 2.5.x -> 2.6.x rather than jumping straight from 2.2 to 2.6, you'll see that the classes are deprecated for a period of time before they're removed. The deprecation message should point to a replacement. For example, HealthIndicatorAutoConfiguration was deprecated in 2.2.0 in favor of HealthContributorAutoConfiguration.
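Custom health checks written against the HealthIndicator interface generally keep working across this change, because from 2.2 onwards HealthIndicator extends the new HealthContributor abstraction. A minimal sketch of such a check (the class name and threshold are invented for illustration):

```java
import java.io.File;

import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

@Component
public class FreeDiskSpaceHealthIndicator implements HealthIndicator {

    // Hypothetical check: reports DOWN when the root partition has
    // less than ~10 MB of usable space left.
    @Override
    public Health health() {
        long freeBytes = new File("/").getUsableSpace();
        if (freeBytes < 10_000_000L) {
            return Health.down().withDetail("freeBytes", freeBytes).build();
        }
        return Health.up().withDetail("freeBytes", freeBytes).build();
    }
}
```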

Upgrade Hazelcast 3.5.* to a newer version without losing data

According to the official Hazelcast documentation, rolling upgrades are supported starting from version 3.8.
Given that my server version is 3.5, is there a way to form a single working cluster that includes new boxes running a newer version of Hazelcast?
Naively upgrading to 3.6.* resulted in two separate clusters: the old boxes still running 3.5, and a new cluster of boxes running 3.6 that obviously has no data, as it was never able to touch base with the existing boxes.
My deployment process is as follows:
create a new set of boxes
remove the existing boxes one by one
repeat with a second batch of boxes
My thoughts have gone towards storing a snapshot on disk or in a database and remounting the partition / loading from the database at rollout time, but this might not even be supported, and I'm hopeful there is a better way.
What data structures do you use? For IMaps, ICaches and ILists, you can use Hazelcast Jet: it connects to the old cluster and pumps the data into the new cluster.
This works if your new cluster is on a 3.x version; 3.x -> 4.x isn't possible this way, so use the Jet 3.x version for it.
See https://docs.hazelcast.org/docs/jet/3.2.2/manual/manual.html#connector-imdg
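A rough sketch of such a Jet 3.x job, assuming a single IMap named "my-map" and placeholder member addresses (group/cluster credentials would also need to be configured for your real clusters):

```java
import com.hazelcast.client.config.ClientConfig;
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.Sources;

public class MapMigrationJob {

    public static void main(String[] args) {
        // Placeholder addresses for the old (3.5) and new (3.x) clusters.
        ClientConfig oldCluster = new ClientConfig();
        oldCluster.getNetworkConfig().addAddress("old-box-1:5701");

        ClientConfig newCluster = new ClientConfig();
        newCluster.getNetworkConfig().addAddress("new-box-1:5701");

        // Read every entry from the old cluster's IMap and write it to the new one.
        Pipeline pipeline = Pipeline.create();
        pipeline.drawFrom(Sources.remoteMap("my-map", oldCluster))
                .drainTo(Sinks.remoteMap("my-map", newCluster));

        JetInstance jet = Jet.newJetInstance();
        try {
            jet.newJob(pipeline).join();
        } finally {
            Jet.shutdownAll();
        }
    }
}
```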

What are the pros/cons of using the Gradle integration vs Spring Boot integration for Flyway?

Flyway has several integration options.
I'm trying to determine the pros and cons of using the Gradle integration versus the Spring Boot integration, given that the project is already using both Spring Boot and Gradle.
The only thing I can think of is that if you want to be able to run migrations without starting the application, or want to save time by not migrating every time the app starts, then the Gradle option could be better.
Think of it as build time vs run time.
In general you will build an artifact once and deploy it to many environments, so run time is a much better fit.
However, sometimes build time makes sense. This is primarily for situations where you need a fully migrated database as part of the build, for example in order to generate code based on the structure of that database using frameworks like jOOQ or QueryDSL.
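For the build-time route (or any setup where you want migrations to run without booting the Spring context), the plain Flyway Java API is enough. A minimal sketch, with a placeholder JDBC URL, credentials and migration location:

```java
import org.flywaydb.core.Flyway;

public class MigrateDatabase {

    public static void main(String[] args) {
        // Placeholder connection details; in practice these would come from
        // Gradle properties or environment variables.
        Flyway flyway = Flyway.configure()
                .dataSource("jdbc:postgresql://localhost:5432/mydb", "user", "password")
                .locations("classpath:db/migration")
                .load();
        flyway.migrate();
    }
}
```

Both the Gradle plugin and the Spring Boot auto-configuration ultimately boil down to calls like this; the real decision is when in the lifecycle you want them to happen.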

How to integrate Magnolia CMS with Hibernate Search?

Has anybody managed to get a working application that includes these two frameworks?
The problem I'm facing is that the dependencies are like this:
magnolia 4.4.5 -> apache jackrabbit 1.6.4 -> apache lucene 2.4.1
hibernate search 3.4.1.Final -> apache lucene 3.1.0
So there is a conflict in the Lucene version.
I also need a version of Hibernate Search that supports annotations.
I've really tried to integrate these two, but with no luck. I hope somebody else has managed to.
Thanks :)
We have successfully used this combination for a long time. It really sucks that Jackrabbit is so slow to update Lucene; that is what makes this hard. I think you should consider using Solr in a separate JVM just to get rid of this dependency.
Having said that, this is how you can solve it. Jackrabbit defines what is possible.
With Jackrabbit 1.6.x you MUST use Lucene 2.4.x. We did that successfully for a long time, but were then forced to use an old Hibernate Search. If you want, I can give you that config as well.
But the recently released Jackrabbit 2.3.0 depends on Lucene 3.0.3. It breaks on 3.1.0, so you must use Lucene 3.0.3.
This is our config simplified:
org.apache.jackrabbit:jackrabbit-core 2.3.0
- exclude org.apache.lucene:lucene-core
org.hibernate:hibernate-core:3.6.7.Final
org.hibernate:hibernate-commons-annotations:3.2.0.Final
- exclude org.hibernate:hibernate
org.hibernate:hibernate-search:3.3.0.Final or 3.4.0.CR1
- exclude org.hibernate:ejb3-persistence
- exclude org.apache.lucene:lucene-core
(3.4.0.CR1 is the last Hibernate Search release that depends on Lucene 3.0.3; if you don't want pre-release versions, use 3.3.0 or 3.3.1)
org.apache.lucene:lucene-core:3.0.3
Magnolia 4.4.5
The separate Hibernate Annotations project is now included in org.hibernate:hibernate-core:3.6.7.Final, so there is no need to depend on it explicitly.
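For completeness, here is what the annotation-driven mapping looks like with Hibernate Search 3.x. A minimal hypothetical entity (the names are made up for illustration):

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.hibernate.search.annotations.Field;
import org.hibernate.search.annotations.Indexed;

// Example entity: @Indexed marks it for Lucene indexing,
// @Field makes the individual properties searchable.
@Entity
@Indexed
public class Article {

    @Id
    @GeneratedValue
    private Long id;

    @Field
    private String title;

    @Field
    private String body;

    // getters and setters omitted for brevity
}
```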
