Spring Boot force reload DB connection on INI file read - java

I have an application with a UI which manages some facets of a Spring Boot application while it is live.
First of all, an INI file name is passed in when the application starts; the file contains the username, password and host for the DB connection.
I have successfully implemented this dynamic database capability by starting the Spring Boot application after the initial INI file load.
However, I need to be able to change the @Primary data source on the fly during execution.
When the user clicks a button in the UI, a new INI file is loaded from a source specified in the UI; at that point I want the DB connection to drop, switch to the new properties read from the INI file, and reconnect.
It seems to me that the @RefreshScope + actuator approach will not work for me, since I am refreshing from a UI inside the application and not via an endpoint.
AbstractRoutingDataSource seems like it requires you to know the DB connection properties for the various sources at compile time, and furthermore it is a lot more complex than I think is necessary to solve a simple problem such as this. I would think there should be some class which allows a simple reload by telling it to call getDataSource() again and reinitialize.
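For what it's worth, as far as I know @RefreshScope does not strictly require an actuator endpoint (Spring Cloud's ContextRefresher bean can be injected and its refresh() method called directly from a UI handler), but pulling in Spring Cloud may be more than is needed here. Something along the lines of what I have in mind, as a hypothetical sketch (ReloadableDataSource and swap() are made-up names, not an existing Spring class), would be a thin wrapper around Spring's DelegatingDataSource that can be re-pointed at runtime:
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.DelegatingDataSource;
import com.zaxxer.hikari.HikariDataSource;

// Hypothetical wrapper: JPA and JDBC callers keep holding this single bean,
// while the underlying connection pool can be swapped out at runtime.
public class ReloadableDataSource extends DelegatingDataSource {

    public ReloadableDataSource(DataSource initial) {
        super(initial);
    }

    // Called from the UI after a new INI file has been loaded.
    public synchronized void swap(DataSource replacement) {
        DataSource old = getTargetDataSource();
        setTargetDataSource(replacement);
        if (old instanceof HikariDataSource) {
            // Close the old pool so its connections are actually dropped.
            ((HikariDataSource) old).close();
        }
    }
}
How this could plug into the configuration below and the UI button handler is sketched after the main entry point.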
Configuration class:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "smsEntityManagerFactory",
        transactionManagerRef = "smsTransactionManager",
        basePackages = { "com.conceptualsystems.sms.db.repository" })
public class JpaConfig {

    @Bean(name = "SMSX")
    @Primary
    public DataSource getDataSource() {
        Logger logger = LoggerFactory.getLogger(this.getClass());
        logger.error("DATABASE INITIALIZING: getDataSource() called!");
        DataSourceBuilder builder = DataSourceBuilder.create();
        builder.driverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        builder.username(IniSettings.getInstance().getIniFile().getDbUser());
        builder.password(IniSettings.getInstance().getIniFile().getDbPassword());
        String host = IniSettings.getInstance().getIniFile().getDbPath();
        String db = IniSettings.getInstance().getIniFile().getDbName();
        String connectionString = "jdbc:sqlserver://" + host + ";databaseName=" + db;
        logger.info("Connecting [" + connectionString + "] as [" +
                IniSettings.getInstance().getIniFile().getDbUser() + ":" +
                IniSettings.getInstance().getIniFile().getDbPassword() + "]");
        builder.url(connectionString);
        return builder.build();
    }

    @Bean(name = "smsEntityManagerFactory")
    @Primary
    public LocalContainerEntityManagerFactoryBean smsEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("SMSX") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .persistenceUnit("smsEntityManagerFactory")
                .packages("com.conceptualsystems.sms.db.entity")
                .build();
    }

    @Bean(name = "smsTransactionManager")
    @Primary
    public PlatformTransactionManager smsTransactionManager(
            @Qualifier("smsEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
main entry point:
public static void main(String[] args) {
    try {
        UIManager.setLookAndFeel(new FlatLightLaf());
    } catch (Exception ex) {
        System.err.println("Failed to initialize LaF");
    }
    javax.swing.SwingUtilities.invokeLater(() -> createAndShowSplash());
    System.out.println("Scrap Management System v" + VERSION);
    String iniFilename = null;
    String user = null;
    String pass = null;
    for (String arg : args) {
        if (arg.startsWith(ARG_HELP)) {
            System.out.println(HELP_TEXT);
            System.exit(0);
        }
        if (arg.startsWith(ARG_INI)) {
            iniFilename = arg.substring(ARG_INI.length());
            System.out.println("User entered INI file location on command line: " + iniFilename);
        }
        if (arg.startsWith(ARG_USER)) {
            user = arg.substring(ARG_USER.length());
            System.out.println("User entered DB username on command line: " + user);
        }
        if (arg.startsWith(ARG_PSWD)) {
            pass = arg.substring(ARG_PSWD.length());
            System.out.println("User entered DB password on command line: [****]");
        }
    }
    mIniFile = new IniFile(iniFilename);
    try {
        mIniFile.load();
    } catch (Exception e) {
        System.out.println("Error loading INI file!");
        e.printStackTrace();
    }
    IniSettings.getInstance().setIniFile(mIniFile);
    System.out.println("INI file set completed, starting Spring Boot Application context...");
    SplashFrame.getInstance().enableSiteSelection();
    mApplicationContext = new SpringApplicationBuilder(Main.class)
            .web(WebApplicationType.SERVLET)
            .headless(false)
            .bannerMode(Banner.Mode.LOG)
            .run(args);
    try {
        IniSettings.getInstance().setIniSource(new IniJPA());
        IniSettings.getInstance().load();
    } catch (Exception e) {
        System.out.println("Unable to setup INI from database!");
        e.printStackTrace();
    }
}
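Roughly, the wiring I have in mind (again a hypothetical sketch; buildPoolFromIni() and the handler call below are made-up, untested names):
// In JpaConfig: the @Primary bean becomes the reloadable wrapper, so JPA only ever sees this instance.
@Bean(name = "SMSX")
@Primary
public ReloadableDataSource getDataSource() {
    return new ReloadableDataSource(buildPoolFromIni());
}

// Hypothetical helper: builds a fresh pool from whatever IniSettings currently holds.
public static DataSource buildPoolFromIni() {
    return DataSourceBuilder.create()
            .driverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver")
            .username(IniSettings.getInstance().getIniFile().getDbUser())
            .password(IniSettings.getInstance().getIniFile().getDbPassword())
            .url("jdbc:sqlserver://" + IniSettings.getInstance().getIniFile().getDbPath()
                    + ";databaseName=" + IniSettings.getInstance().getIniFile().getDbName())
            .build();
}

// In the UI button handler, after the new INI file has been loaded into IniSettings:
mApplicationContext.getBean("SMSX", ReloadableDataSource.class)
        .swap(JpaConfig.buildPoolFromIni());
EntityManagers that have already borrowed a connection keep it until they close, so long-running transactions would need to drain before the swap takes effect everywhere.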

Related

Spring Boot Rest + ClearDB mySQL "exceeded 'max_user_connections'" error

Every once in a while I randomly get this error: "java.sql.SQLSyntaxErrorException: User '{key}' has exceeded the 'max_user_connections' resource (current value: 10)".
I have tried googling for help, but all I can find is:
"increase the max connections limit" (which can't be done in free ClearDB)
"adjust maxActive amount" or "release old connections" (neither of which I could find out how to do in Spring Boot; a pool-size sketch follows the code below)
Here's what my code looks like:
// application.properties
# Connect to heroku ClearDB MySql database
spring.datasource.url=jdbc:mysql://{heroku_url}?reconnect=true
spring.datasource.username={user}
spring.datasource.password={password}
# Hibernate ddl auto (create, create-drop, update)
spring.jpa.hibernate.ddl-auto=update
#MySQL DIALECT
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQL5Dialect
spring.jpa.open-in-view=false
server.port=8080
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(dbUrl);
        return new HikariDataSource(config);
    }
}
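For reference, a hedged sketch assuming HikariCP (Spring Boot's default pool): its maximum pool size plays the role of maxActive and can be capped either with spring.datasource.hikari.maximum-pool-size in application.properties or directly on the HikariConfig, which keeps the application under ClearDB's limit of 10 connections (the values below are examples only):
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(dbUrl);
        // Stay well under ClearDB's max_user_connections of 10.
        config.setMaximumPoolSize(5);
        // Allow idle connections to be released instead of held open.
        config.setMinimumIdle(1);
        config.setIdleTimeout(60_000);
        return new HikariDataSource(config);
    }
}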
EDIT 1: I was following PauMAVA's instructions as best as I could and I came up with this code, which for some reason fails:
@Configuration
public class DatabaseConfig {

    @Value("${spring.datasource.url}")
    private String dbUrl;

    public static DataSource ds;

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(dbUrl);
        DataSource ds = new HikariDataSource(config);
        DatabaseConfig.ds = ds;
        return ds;
    }
}
// Main class
public static void main(String[] args) {
    SpringApplication.run(BloggerApplication.class, args);
    Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
        public void run() {
            DataSource ds = DatabaseConfig.ds;
            if (ds != null) {
                try {
                    ds.getConnection().close();
                } catch (SQLException e) {
                    e.printStackTrace();
                }
            }
        }
    }, "Shutdown-thread"));
}
Whenever you create a connection object in your code, it is advisable to close it in a finally block. That way the number of connections does not get exhausted.
Hope this helps!
You should close the DataSource on application termination so that no unused connections remain open.
public void close(HikariDataSource ds) {
    if (ds != null) {
        // javax.sql.DataSource has no close(); the concrete pool class (HikariDataSource) does.
        ds.close();
    }
}
But do this only on program termination as stated here.
To use the data source later (when closing) you can keep the DataSource as a field in your class:
private DataSource ds;

@Bean
public DataSource dataSource() {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(dbUrl);
    DataSource ds = new HikariDataSource(config);
    this.ds = ds;
    return ds;
}
If you are going to have more than one data source you can use a List-based approach:
private List<HikariDataSource> activeDataSources = new ArrayList<>();

public DataSource dataSource() {
    HikariConfig config = new HikariConfig();
    config.setJdbcUrl(dbUrl);
    HikariDataSource ds = new HikariDataSource(config);
    this.activeDataSources.add(ds);
    return ds;
}

public void closeAllDataSources() {
    for (HikariDataSource ds : this.activeDataSources) {
        if (ds != null) {
            ds.close();
        }
    }
    this.activeDataSources = new ArrayList<>();
}
To execute a function on program close refer to this.
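A Spring-flavoured alternative (a sketch; as far as I know Spring also closes a HikariDataSource bean itself when the context shuts down, so an explicit hook mainly matters for pools created outside the container) is to put the cleanup in a @PreDestroy method on the configuration class:
// javax.annotation.PreDestroy (jakarta.annotation.PreDestroy on Spring Boot 3+)
private final List<HikariDataSource> activeDataSources = new ArrayList<>();

@PreDestroy
public void closeAllDataSources() {
    for (HikariDataSource ds : activeDataSources) {
        if (ds != null) {
            ds.close();
        }
    }
    activeDataSources.clear();
}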

JPA datasource ignores username in properties file

I have a test class which I use to check whether the connection to the database can be established. The credentials are saved in a properties file. When I run the tests in Eclipse everything works fine, but when I run a Maven build the tests fail because the username used to connect to the database is not the one I set in the properties file; it is the Windows username. This is my code:
Properties File:
driverClassName=oracle.jdbc.driver.OracleDriver
user=database_dev1
password=password_dev1
url=jdbc:oracle:thin:@MyAwsomeDatabase:1521:DEV01
Config Class:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("de.xxx.bvd.mobisl.service")
@PropertySource("classpath:database.properties")
@ComponentScan("de.xxx.bvd.mobisl.service")
public class JPAConfig {

    @Value("${driverClassName}")
    protected String driverClassName;

    @Value("${url}")
    protected String url;

    @Value("${user}")
    protected String username;

    @Value("${password}")
    protected String password;

    private static final Logger logger = Logger.getLogger(JPAConfig.class);

    @SuppressWarnings("unchecked")
    @Lazy
    @Bean
    public DataSource dataSource() {
        try {
            SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
            Class<? extends Driver> driver = (Class<? extends Driver>) Class.forName(driverClassName);
            dataSource.setDriverClass(driver);
            dataSource.setUrl(url);
            dataSource.setUsername(username);
            dataSource.setPassword(password);
            logger.info("created DataSource with username " + username + " and password " + password);
            return dataSource;
        } catch (ClassNotFoundException e) {
            logger.error("cannot create datasource!!", e);
            return null;
        }
    }
}
As I said, running from eclipse works fine. The logfile says:
[[XXX-LOG]] 2018-09-04 08:27:23 INFO JPAConfig:57 - created DataSource with username database_dev1
[[XXX-LOG]] 2018-09-04 08:27:27 INFO JPAConfigTest:52 - got result from database
But running from maven the logfile says:
[[XXX-LOG]] 2018-09-04 08:27:53 INFO JPAConfig:57 - created DataSource with username <<Windows-Username>>
How can I tell maven to use the username from the properties file?
${user} is replaced by Maven with the environment/system property user.
You can see this if you run mvn help:system.
Solution: rename the property to something more specific, like
db.username
As a side note, user is very ambiguous in bigger projects; if you rename it, it is much clearer where it is used.
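Concretely, the rename could look like this (db.username and db.password are just example key names):
# database.properties
driverClassName=oracle.jdbc.driver.OracleDriver
db.username=database_dev1
db.password=password_dev1
url=jdbc:oracle:thin:@MyAwsomeDatabase:1521:DEV01

@Value("${db.username}")
protected String username;

@Value("${db.password}")
protected String password;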

How can I start flyway migration before hibernate validation?

I use Flyway + Hibernate validate. I have a Flyway bean:
@Component
public class DbMigration {

    private static final Logger LOG = LoggerFactory.getLogger(DbMigration.class);

    private final Config config;

    @Autowired
    public DbMigration(Config config) {
        this.config = config;
    }

    public void runMigration() {
        try {
            Flyway flyway = new Flyway();
            flyway.configure(properties());
            int migrationApplied = flyway.migrate();
            LOG.info("[" + migrationApplied + "] migrations are applied");
        } catch (FlywayException ex) {
            throw new DatabaseException("Exception during database migrations: ", ex);
        }
    }

    public Properties properties() {
        // my prop
    }
}
And in the Application class I do this:
public static void main(String[] args) {
    try {
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(ApplicationConfiguration.class);
        context.getBean(DbMigration.class).runMigration();
But Hibernate starts before runMigration(), and validation throws an exception. How can I make it run in this order:
run the migration
start Hibernate validation
EDIT:
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource datasource) {
    log.info("entityManagerFactory start");
    dbMigration.runMigration();
But I think this is bad.
In your Spring application configuration, if you have an entity manager factory bean definition, you can make it depend on the Flyway bean so that it gets initialized after it. Something like:
@Bean
@DependsOn("flyway")
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    // Initialize EntityManagerFactory here
}
The flyway bean configuration can be something like:
@Bean(initMethod = "migrate")
public Flyway flyway() {
    Flyway flyway = new Flyway();
    // configure bean here
    return flyway;
}
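On newer Flyway versions (the fluent API; the no-arg constructor used above was removed in Flyway 6), an equivalent bean could look like this sketch, assuming a DataSource bean is available for injection:
@Bean(initMethod = "migrate")
public Flyway flyway(DataSource dataSource) {
    return Flyway.configure()
            .dataSource(dataSource)
            // default script location; adjust to wherever your migrations live
            .locations("classpath:db/migration")
            .load();
}
Either way, the name in @DependsOn("flyway") has to match this bean's name so the migration runs before Hibernate's schema validation.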

ServiceName is getting ignored while connecting to HA enabled HDFS from Java using RPC

Here is my build config method
private Configuration buildConfiguration() {
    Configuration conf = new Configuration();
    if (connectivityDetail.isSecureMode()) {
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("hadoop.http.authentication.type", "kerberos");
        conf.set("dfs.namenode.kerberos.principal", connectivityDetail.getHdfsServicePrincipal());
    }
    if (isHAEnabled()) {
        String hdfsServiceName = connectivityDetail.getHdfsServiceName();
        conf.set("fs.defaultFS", "hdfs://" + hdfsServiceName);
        conf.set("dfs.ha.namenodes." + hdfsServiceName, "nn0,nn1");
        conf.set("dfs.nameservices", hdfsServiceName);
        conf.set("dfs.client.failover.proxy.provider." + hdfsServiceName,
                "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
        conf.set("dfs.namenode.rpc-address." + hdfsServiceName + ".nn1",
                connectivityDetail.getNameNodeUris().get(0));
        conf.set("dfs.namenode.rpc-address." + hdfsServiceName + ".nn0",
                connectivityDetail.getNameNodeUris().get(1));
    }
    return conf;
}
The HDFS service name is set incorrectly in the configuration object, yet I am still able to get a FileSystem and everything works correctly. I am not sure why it is not using the service name.
This is how I am creating the Path:
public static Path getHdfsResourceLocation(String resourceLocation) throws Exception {
    String[] hdfsURIs = OrchestrationConfigUtil.getHdfsUri();
    Path hdfsResourceLoc = null;
    if (isHAEnabled()) {
        hdfsResourceLoc = new Path(resourceLocation);
    } else {
        hdfsResourceLoc = FileContext.getFileContext(new URI(hdfsURIs[0])).makeQualified(new Path(resourceLocation));
    }
    return hdfsResourceLoc;
}
Everything is working fine with the wrong service name and I am not sure why.
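To narrow this down, a small diagnostic helper (a sketch; debugResolution is a made-up name) can print which authority the client actually resolves and how an unqualified Path gets qualified. As far as I can tell, the nameservice string is purely logical, so any value works as long as fs.defaultFS and the dfs.* keys all agree with each other, which would explain the behaviour:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical diagnostic: shows which authority the HDFS client is really using.
public static void debugResolution(Configuration conf, String resourceLocation) throws Exception {
    FileSystem fs = FileSystem.get(conf);
    System.out.println("fs.defaultFS   = " + conf.get("fs.defaultFS"));
    System.out.println("FileSystem URI = " + fs.getUri());
    // An unqualified Path picks up the default filesystem's scheme and authority here.
    System.out.println("Qualified path = " + fs.makeQualified(new Path(resourceLocation)));
}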

How to export database schema using Hibernate SchemaExport with BeanValidation constraints?

Please see my requirement: I am using SchemaExport to export a database schema that applies BeanValidation constraints (e.g., @Length(32) should create the DB constraint column(32)).
In Hibernate 4.1.x, I can use the hack posted here: https://forum.hibernate.org/viewtopic.php?f=1&t=1024911&view=previous
But the Ejb3Configuration class required by that hack was removed in Hibernate 4.3.5.
So how can I export a database schema with the BeanValidation constraints applied, without using Ejb3Configuration?
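For concreteness, the sort of entity I mean (a hypothetical example, not from my real code):
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.validator.constraints.Length;

@Entity
public class Customer {

    @Id
    private Long id;

    // With the validation-to-DDL integration active, this should be exported as e.g. VARCHAR(32).
    @Length(max = 32)
    private String name;
}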
Something like this should work:
PersistenceUnitDescriptorAdapter pu = new PersistenceUnitDescriptorAdapter() {
    @Override
    public List<String> getManagedClassNames() {
        return Arrays.asList( MyClass.class.getName(), ... );
    }
};

Map<Object, Object> settings = new HashMap<Object, Object>();
settings.put( "javax.persistence.schema-generation.scripts.action", "create" );
settings.put( "javax.persistence.schema-generation.scripts.create-target", "<path-to-export-file>" );

EntityManagerFactoryBuilderImpl factoryBuilder = new EntityManagerFactoryBuilderImpl( pu, settings );
factoryBuilder.generateSchema();
It relies on Hibernate internal classes, but so did your earlier solution. You could create an issue here - https://hibernate.onjira.com/browse/HHH - explaining your use case. Maybe a solution using a public API can be made available.
I found a temporary solution using the Hibernate Configuration built by EntityManagerFactoryBuilderImpl. It uses the JPA configuration to emit the schema script (with BeanValidation constraints).
public final class JpaSchemaExporter
{
    private final DialectType dialect;
    private final Path outputPath;
    private final ParsedPersistenceXmlDescriptor pud;
    private final EntityManagerFactoryBuilderImpl factoryBuilder;

    public JpaSchemaExporter(String utilName, String packageName, Properties properties, DialectType dialect,
            Path outputPath) throws Exception
    {
        this.dialect = dialect;
        this.outputPath = outputPath;
        if (Files.exists(outputPath) && !Files.isDirectory(outputPath)) {
            throw new IllegalArgumentException(
                    "Given path already exists and is not a directory! the path: " + outputPath);
        }
        Files.createDirectories(outputPath);

        pud = new ParsedPersistenceXmlDescriptor(Resources.getResourceURL("META-INF"));
        pud.setName(utilName);
        pud.addClasses(Resources.getClasseNames(packageName));
        pud.addMappingFiles("META-INF/orm.xml");

        properties.setProperty("hibernate.dialect", dialect.getDialectClass());

        ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
        factoryBuilder = new EntityManagerFactoryBuilderImpl( pud, properties );
        factoryBuilder.withValidatorFactory(validatorFactory).build().close(); // create HibernateConfiguration instance
        this.injectBeanValidationConstraintToDdlTranslator();
        validatorFactory.close();
    }

    private void injectBeanValidationConstraintToDdlTranslator() {
        try {
            Configuration hibernateConfiguration = factoryBuilder.getHibernateConfiguration();
            ValidatorFactory validatorFactory = (ValidatorFactory) factoryBuilder.getConfigurationValues().get(AvailableSettings.VALIDATION_FACTORY);

            // private class in Hibernate
            Method applyRelationalConstraints = Class.forName("org.hibernate.cfg.beanvalidation.TypeSafeActivator")
                    .getMethod("applyRelationalConstraints",
                            ValidatorFactory.class,
                            java.util.Collection.class,
                            Properties.class,
                            Dialect.class);
            applyRelationalConstraints.setAccessible(true);
            Dialect dialectInstance = (Dialect) Class.forName(dialect.getDialectClass()).newInstance();
            applyRelationalConstraints.invoke(null, validatorFactory,
                    Arrays.asList(Iterators.toArray(hibernateConfiguration.getClassMappings(), PersistentClass.class)),
                    hibernateConfiguration.getProperties(),
                    dialectInstance);
        }
        catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @SuppressWarnings("unchecked")
    public void create() throws IOException {
        Configuration cfg = factoryBuilder.getHibernateConfiguration();
        cfg.setProperty("hibernate.hbm2ddl.auto", "create");

        SchemaExport export = new SchemaExport(cfg);
        export.setDelimiter(";");
        export.setOutputFile(Paths.get(outputPath.toString(), "ddl_create_" + dialect.name().toLowerCase() + ".sql").toString());
        export.execute(true, false, false, true);

        if (!export.getExceptions().isEmpty()) {
            System.out.println();
            System.out.println("SOME EXCEPTIONS OCCURRED WHILE GENERATING THE CREATE SCRIPT:");
            for (Exception e : (List<Exception>) export.getExceptions()) {
                System.out.println(e.getMessage());
            }
        }
    }

    @SuppressWarnings("unchecked")
    public void update() throws IOException {
        Configuration cfg = factoryBuilder.getHibernateConfiguration();
        cfg.setProperty("hibernate.hbm2ddl.auto", "update");

        SchemaUpdate updater = new SchemaUpdate(cfg);
        updater.setDelimiter(";");
        updater.setOutputFile(Paths.get(outputPath.toString(), "ddl_update_" + dialect.name().toLowerCase() + ".sql").toString());
        updater.execute(true, false);

        if (!updater.getExceptions().isEmpty()) {
            System.out.println();
            System.out.println("SOME EXCEPTIONS OCCURRED WHILE GENERATING THE UPDATE SCRIPT:");
            for (Exception e : ((List<Exception>) updater.getExceptions())) {
                System.out.println(e.getMessage());
            }
        }
    }

    public void validate() {
        Configuration hibernateConfiguration = factoryBuilder.getHibernateConfiguration();
        hibernateConfiguration.setProperty("hibernate.hbm2ddl.auto", "validate");
        SchemaValidator validator = new SchemaValidator(hibernateConfiguration);
        validator.validate();
    }

    public static void main(String[] args) throws Exception {
        Properties prop = new Properties(System.getProperties());
        prop.setProperty("hibernate.connection.driver_class", "value in your env");
        prop.setProperty("hibernate.connection.url", "value in your env");
        prop.setProperty("hibernate.connection.username", "value in your env");
        prop.setProperty("hibernate.connection.password", "value in your env");

        Path path = Paths.get("schema output path in your env");
        String packageName = prop.getProperty("package names of jpa classes");
        String unitName = prop.getProperty("jpa Unit Name");
        String[] dialects = "HSQL,MYSQL".split(",");

        for (String dialect : dialects) {
            DialectType dialectType = DialectType.valueOf(dialect);
            JpaSchemaExporter ddlExporter = new JpaSchemaExporter(unitName, packageName, prop, dialectType, path);
            ddlExporter.update();
            ddlExporter.create();
        }
    }
}
