I'm trying to write a simple function that connects to Postgres and executes a select statement.
PostgresqlConnectionFactory connectionFactory = new PostgresqlConnectionFactory(
        PostgresqlConnectionConfiguration.builder()
                .host("localhost")
                .port(5432)
                .database("MyDB")
                .username("username")
                .password("password")
                .build());
DatabaseClient client = DatabaseClient.create(connectionFactory);
Flux<Map<String, Object>> result = client.execute("select * from table").fetch().all();
result.map(s -> {
    System.out.println(s);
    return null;
});
The above piece of code isn't printing anything, and there is no error either. I can connect to the DB using the same credentials. What is missing in this code to stream data from the DB?
Create a configuration class similar to the code below to connect to the PostgreSQL database:
@Configuration
@EnableR2dbcRepositories
public class DatabaseConfig extends AbstractR2dbcConfiguration {

    @Override
    public ConnectionFactory connectionFactory() {
        return ConnectionFactories.get("r2dbc:postgresql://localhost:5432/DATABASE_NAME");
    }
}
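Separately, note that a Flux is lazy: the pipeline in the question never runs because nothing subscribes to it. A minimal sketch of executing the query and printing each row (the table name my_table is a placeholder, and the API shown is the Spring Data R2DBC 1.x DatabaseClient used in the question):

// Nothing is fetched until subscribe() is called.
ConnectionFactory connectionFactory =
        ConnectionFactories.get("r2dbc:postgresql://username:password@localhost:5432/MyDB");
DatabaseClient client = DatabaseClient.create(connectionFactory);
client.execute("SELECT * FROM my_table")
        .fetch()
        .all()
        .doOnNext(System.out::println)
        .subscribe();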
I'm creating a Spring Batch application, and I'm having trouble writing a JUnit test for my reader, which uses JdbcPagingItemReaderBuilder.
Reader code:
@Configuration
public class RelatorioReader {

    @Bean("relatorioreader")
    @StepScope
    public ItemReader<Relatorio> relatorioItemReader(
            @Qualifier("dataSource") DataSource dataSource,
            @Value("#{jobParameters[dateParam]}") String dateParam) {
        return new JdbcPagingItemReaderBuilder<Relatorio>()
                .name("relatorioDiario")
                .dataSource(dataSource)
                .selectClause("SELECT * ")
                .fromClause("FROM myTable ")
                .whereClause(" WHERE date = :dateParam")
                .parameterValues(Collections.singletonMap("dateParam", dateParam))
                .sortKeys(Collections.singletonMap("ID", Order.ASCENDING))
                .rowMapper(new RelatorioMapper())
                .build();
    }
}
JUnit code:
@ExtendWith(MockitoExtension.class)
public class RelatorioReaderTest {

    @InjectMocks
    RelatorioReader reader;

    @Mock
    DataSource dataSource;

    @Test
    public void test_itemReader() {
        ItemReader<Relatorio> itemReader = reader.relatorioItemReader(dataSource, "2023-02-16");
        assertNotNull(itemReader);
    }
}
Exception when running the JUnit test:
java.lang.IllegalArgumentException: Unable to determine PagingQueryProvider type
at org.springframework.batch.item.database.builder.JdbcPagingItemReaderBuilder.determineQueryProvider(JdbcPagingItemReaderBuilder.java:383)
at org.springframework.batch.item.database.builder.JdbcPagingItemReaderBuilder.build(JdbcPagingItemReaderBuilder.java:335)
at com.erico.relatorio.item.reader.RelatorioReader.relatorioItemReader(RelatorioReader.java:34)
at com.erico.relatorio.item.reader.RelatorioReaderTest.test_itemReader(RelatorioReaderTest.java:27)
...
Caused by: org.springframework.jdbc.support.MetaDataAccessException: Could not get Connection for extracting meta-data; nested exception is org.springframework.jdbc.CannotGetJdbcConnectionException: Failed to obtain JDBC Connection: DataSource returned null from getConnection(): dataSource
at ...
Caused by: org.springframework.jdbc.CannotGetJdbcConnectionException: Failed to obtain JDBC Connection: DataSource returned null from getConnection(): dataSource
at ...
When you do not specify a paging query provider, the builder tries to determine a suitable one from the meta-data of your data source. Since you are using a mocked DataSource, you need to mock the call to getConnection(). Otherwise, you have to use a stub database for tests (like an embedded H2 or HSQLDB).
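For instance, a minimal sketch of stubbing the metadata lookup with Mockito (returning "H2" as the product name is an assumption; use the database you actually target):

// Stub just enough of the DataSource for the builder to detect the database type.
Connection connection = Mockito.mock(Connection.class);
DatabaseMetaData metaData = Mockito.mock(DatabaseMetaData.class);
Mockito.when(dataSource.getConnection()).thenReturn(connection);
Mockito.when(connection.getMetaData()).thenReturn(metaData);
Mockito.when(metaData.getDatabaseProductName()).thenReturn("H2");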
If you know which database you will be using, the best approach is to specify its paging query provider implementation in your builder. Here is an example for H2:
@Configuration
public class RelatorioReader {

    @Bean("relatorioreader")
    @StepScope
    public ItemReader<Relatorio> relatorioItemReader(
            @Qualifier("dataSource") DataSource dataSource,
            @Value("#{jobParameters[dateParam]}") String dateParam) {
        return new JdbcPagingItemReaderBuilder<Relatorio>()
                .name("relatorioDiario")
                .dataSource(dataSource)
                .selectClause("SELECT * ")
                .fromClause("FROM myTable ")
                .whereClause(" WHERE date = :dateParam")
                .parameterValues(Collections.singletonMap("dateParam", dateParam))
                .sortKeys(Collections.singletonMap("ID", Order.ASCENDING))
                .rowMapper(new RelatorioMapper())
                .queryProvider(new H2PagingQueryProvider())
                .build();
    }
}
I'm setting up an instance of Spring Cloud Data Flow. I've run the following command:
java -jar spring-cloud-dataflow-server-2.9.2.jar \
--spring.cloud.dataflow.features.streams-enabled=false \
--spring.cloud.dataflow.features.schedules-enabled=true \
--spring.datasource.url=jdbc:postgresql://localhost:5432/batch \
--spring.datasource.username=postgres \
--spring.datasource.password=postgres \
--spring.datasource.driver-class-name=org.postgresql.Driver \
--spring.datasource.initialization_mode=always
I've developed a batch job using Spring Batch to be deployed on this platform. The job uses two data sources: batch for the Spring Batch and Task metadata, and app_db for my business logic. When I run the app locally, it persists metadata in batch and my business data in app_db, as expected. The problem appears when I execute the job inside Spring Cloud Data Flow: the platform overrides my configured business-logic database and uses only the batch database, which is supposed to store metadata only.
application.yaml
spring:
  batch:
    datasource:
      url: jdbc:postgresql://localhost:5432/batch
      username: postgres
      password: postgres
  datasource:
    url: jdbc:postgresql://localhost:5432/app_db
    username: postgres
    password: postgres
DatasourceConfiguration
@Configuration
public class DatasourceConfiguration {

    @Bean
    @ConfigurationProperties("spring.datasource")
    @Primary
    public DataSourceProperties dataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource dataSource(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.initializeDataSourceBuilder().build();
    }

    @Bean(name = "batchDataSourceProperties")
    @ConfigurationProperties("spring.batch.datasource")
    public DataSourceProperties batchDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource(
            @Qualifier("batchDataSourceProperties") DataSourceProperties batchDataSourceProperties) {
        return batchDataSourceProperties.initializeDataSourceBuilder().build();
    }
}
@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchApplication {

    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultTaskConfigurer(dataSource);
    }

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
Job
@Bean
public Job startJob(JobBuilderFactory jobBuilderFactory, DataSource dataSource) {
    try {
        System.out.println(dataSource.getConnection().getMetaData().getURL());
    } catch (Exception e) {
        // TODO: handle exception
    }
    // ... job definition omitted
}
When I look at the data source, jdbc:postgresql://localhost:5432/app_db is printed when the job runs locally, and jdbc:postgresql://localhost:5432/batch is printed when the job (task) is executed from SCDF.
I want to know how Data Flow overrides the application's spring.datasource even though I am not passing any arguments when executing the task. Please suggest a solution to avoid the data source being overridden.
One solution I am considering is creating an AppDatasourceConfiguration (app.datasource) and using it. But is there a way to keep using spring.datasource without it being overridden by SCDF?
Is it possible to retrieve an OracleDataSource from the default Spring Boot 2 Hikari connection pool using a NamedParameterJdbcTemplate object?
Using Java 8, Oracle 11g (ojdbc6-11.2.0.1.jar) and Gradle.
This is what I've tried:
@Repository
public class MyClass {

    @Autowired
    NamedParameterJdbcTemplate jdbcTemplate;

    public void myMethod() {
        try {
            // OracleDataSource ods = new OracleDataSource(); // This works but is obviously not Spring
            OracleDataSource ods = (OracleDataSource) jdbcTemplate.getJdbcTemplate().getDataSource(); // This fails
            ods.setURL(url);
            ods.setUser(user);
            ods.setPassword(pass);
            // ...
        } catch (Exception e) {
            System.out.println("In Exception");
            e.printStackTrace();
        }
    }
}
Application.properties:
spring.datasource.url=jdbc:oracle:thin:@//${ORA_HOST}:${ORA_PORT}/${ORA_SID}
spring.datasource.username=${USER}
spring.datasource.password=${PASS}
Error message:
In Exception
java.lang.ClassCastException: com.zaxxer.hikari.HikariDataSource cannot be cast to oracle.jdbc.pool.OracleDataSource
I don't think this is possible (or necessary). The easiest way is to unwrap() a Connection object that is already connected to the DB:
Connection conn = this.jdbcTemplate.getJdbcTemplate().getDataSource().getConnection().unwrap(OracleConnection.class);
Just a quick query: how are you able to get OracleConnection.class? In my case I get a squiggly line underneath OracleConnection.class and there are no packages available to import.
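For reference, OracleConnection is part of the Oracle JDBC driver itself (package oracle.jdbc), so the import only resolves once an ojdbc jar is on the classpath. A minimal sketch, assuming the ojdbc dependency is declared:

import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;
import oracle.jdbc.OracleConnection; // ships in the ojdbc driver jar

public class UnwrapExample {
    // Borrow a pooled connection and unwrap the Oracle-specific interface from it.
    static OracleConnection oracleConnection(DataSource dataSource) throws SQLException {
        Connection conn = dataSource.getConnection();
        return conn.unwrap(OracleConnection.class);
    }
}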
I am not able to figure out how to implement this. Any help and/or pointers will be greatly appreciated.
Currently, my Java/Spring application backend is deployed on EC2 and accessing MySQL on RDS successfully using the regular Spring JDBC setup. That is, storing database info in application.properties and configuring DataSource and JdbcTemplate in #Configuration class. Everything works fine.
Now, I need to access MySQL on RDS securely. RDS instance has IAM Authentication enabled. I have also successfully created IAM role and applied inline policy. Then, following the AWS RDS documentation and Java example on this link, I am able to access the database from a standalone Java class successfully using Authentication Token and the user I created instead of regular db username and password. This standalone Java class is dealing with "Connection" object directly.
The place I am stuck is how I translate this to Spring JDBC configuration. That is, setting up DataSource and JdbcTemplate beans for this in my #Configuration class.
What would be a correct/right approach to implement this?
----- EDIT - Start -----
I am trying to implement this as a library that can be used for multiple projects. That is, it will be used as a JAR and declared as a dependency in a project's POM file. This library is going to include configurable AWS Services like this RDS access using general DB username and password, RDS access using IAM Authentication, KMS (CMK/data keys) for data encryption, etc.
Idea is to use this library on any web/app server depending on the project.
Hope this clarifies my need more.
----- EDIT - End -----
DataSource internally has getConnection() so I can basically create my own DataSource implementation to achieve what I want. But is this a good approach?
Something like:
public class MyDataSource implements DataSource {

    @Override
    public Connection getConnection() throws SQLException {
        Connection conn = null;
        // get a connection using an IAM authentication token for AWS RDS, etc., as in the AWS docs
        return conn;
    }

    @Override
    public Connection getConnection(String username, String password) throws SQLException {
        return getConnection();
    }

    // other methods
}
You can use the following snippet as a replacement for the default connection pool provided by Spring Boot/Tomcat. It refreshes the token password every 10 minutes, since the token is valid for 15 minutes. It also assumes the region can be extracted from the DNS hostname; if this is not the case, you'll need to specify the region to use.
public class RdsIamAuthDataSource extends org.apache.tomcat.jdbc.pool.DataSource {

    private static final Logger LOG = LoggerFactory.getLogger(RdsIamAuthDataSource.class);

    /**
     * The Java KeyStore (JKS) file that contains the Amazon root CAs.
     */
    public static final String RDS_CACERTS = "/rds-cacerts";

    /**
     * Password for the ca-certs file.
     */
    public static final String PASSWORD = "changeit";

    public static final int DEFAULT_PORT = 3306;

    @Override
    public ConnectionPool createPool() throws SQLException {
        return pool != null ? pool : createPoolImpl();
    }

    protected synchronized ConnectionPool createPoolImpl() throws SQLException {
        return pool = new RdsIamAuthConnectionPool(poolProperties);
    }

    public static class RdsIamAuthConnectionPool extends ConnectionPool implements Runnable {

        private RdsIamAuthTokenGenerator rdsIamAuthTokenGenerator;
        private String host;
        private String region;
        private int port;
        private String username;
        private Thread tokenThread;

        public RdsIamAuthConnectionPool(PoolConfiguration prop) throws SQLException {
            super(prop);
        }

        @Override
        protected void init(PoolConfiguration prop) throws SQLException {
            try {
                URI uri = new URI(prop.getUrl().substring(5));
                this.host = uri.getHost();
                this.port = uri.getPort();
                if (this.port < 0) {
                    this.port = DEFAULT_PORT;
                }
                this.region = StringUtils.split(this.host, '.')[2]; // extract region from the RDS hostname
                this.username = prop.getUsername();
                this.rdsIamAuthTokenGenerator = RdsIamAuthTokenGenerator.builder()
                        .credentials(new DefaultAWSCredentialsProviderChain())
                        .region(this.region)
                        .build();
                updatePassword(prop);
                final Properties props = prop.getDbProperties();
                props.setProperty("useSSL", "true");
                props.setProperty("requireSSL", "true");
                props.setProperty("trustCertificateKeyStoreUrl", getClass().getResource(RDS_CACERTS).toString());
                props.setProperty("trustCertificateKeyStorePassword", PASSWORD);
                super.init(prop);
                this.tokenThread = new Thread(this, "RdsIamAuthDataSourceTokenThread");
                this.tokenThread.setDaemon(true);
                this.tokenThread.start();
            } catch (URISyntaxException e) {
                throw new RuntimeException(e.getMessage());
            }
        }

        @Override
        public void run() {
            try {
                while (this.tokenThread != null) {
                    Thread.sleep(10 * 60 * 1000); // wait for 10 minutes, then recreate the token
                    updatePassword(getPoolProperties());
                }
            } catch (InterruptedException e) {
                LOG.debug("Background token thread interrupted");
            }
        }

        @Override
        protected void close(boolean force) {
            super.close(force);
            Thread t = tokenThread;
            tokenThread = null;
            if (t != null) {
                t.interrupt();
            }
        }

        private void updatePassword(PoolConfiguration props) {
            String token = rdsIamAuthTokenGenerator.getAuthToken(
                    GetIamAuthTokenRequest.builder().hostname(host).port(port).userName(this.username).build());
            LOG.debug("Updated IAM token for connection pool");
            props.setPassword(token);
        }
    }
}
Please note that you'll need to import Amazon's root/intermediate certificates to establish a trusted connection. The example code above assumes the certificates have been imported into a file called 'rds-cacerts' that is available on the classpath. Alternatively, you can import them into the JVM's 'cacerts' file.
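As an aside, a sketch of building such a trust store programmatically (file names and the alias are placeholders; 'changeit' matches the PASSWORD constant above; the same result can be achieved with keytool -importcert):

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.KeyStore;
import java.security.cert.CertificateFactory;

// Build a JKS trust store containing an Amazon RDS CA certificate.
public class BuildTrustStore {
    public static void main(String[] args) throws Exception {
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        KeyStore ks = KeyStore.getInstance("JKS");
        ks.load(null, null); // start from an empty keystore
        try (FileInputStream pem = new FileInputStream("rds-ca-root.pem")) {
            ks.setCertificateEntry("rds-root", cf.generateCertificate(pem));
        }
        try (FileOutputStream out = new FileOutputStream("rds-cacerts")) {
            ks.store(out, "changeit".toCharArray());
        }
    }
}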
To use this data-source, you can use the following properties for Spring:
spring:
  datasource:
    url: jdbc:mysql://dbhost.xyz123abc.us-east-1.rds.amazonaws.com/dbname
    username: iam_app_user
    driver-class-name: com.mysql.cj.jdbc.Driver
    type: com.mydomain.jdbc.RdsIamAuthDataSource
Using Spring Java config:
@Bean
public DataSource dataSource() {
    PoolConfiguration props = new PoolProperties();
    props.setUrl("jdbc:mysql://dbname.abc123xyz.us-east-1.rds.amazonaws.com/dbschema");
    props.setUsername("iam_dbuser_app");
    props.setDriverClassName("com.mysql.jdbc.Driver");
    return new RdsIamAuthDataSource(props);
}
UPDATE: when using MySQL, you can also decide to use the MariaDB JDBC driver, which has built-in support for IAM authentication:
spring:
  datasource:
    host: dbhost.cluster-xxx.eu-west-1.rds.amazonaws.com
    url: jdbc:mariadb:aurora://${spring.datasource.host}/db?user=xxx&credentialType=AWS-IAM&useSsl&serverSslCert=classpath:rds-combined-ca-bundle.pem
    type: org.mariadb.jdbc.MariaDbPoolDataSource
The above requires the MariaDB and AWS SDK libraries, and needs the CA bundle on the classpath.
I know this is an older question, but after some searching I found a pretty easy way you can now do this using the MariaDB driver. In version 2.5 they added an AWS IAM credential plugin to the driver; it handles generating, caching, and refreshing the token automatically.
I've tested using Spring Boot 2.3 with the default HikariCP connection pool and it is working fine for me with these settings:
spring.datasource.url=jdbc:mariadb://host/db?credentialType=AWS-IAM&useSsl&serverSslCert=classpath:rds-combined-ca-bundle.pem
spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.datasource.username=iam_username
#spring.datasource.password=dont-need-this
spring.datasource.hikari.maxLifetime=600000
Download rds-combined-ca-bundle.pem and put it in src/main/resources so you can connect via SSL.
You will need these dependencies on the classpath as well:
runtime 'org.mariadb.jdbc:mariadb-java-client'
runtime 'com.amazonaws:aws-java-sdk-rds:1.11.880'
The driver uses the standard DefaultAWSCredentialsProviderChain so make sure you have credentials with policy allowing IAM DB access available wherever you are running your app.
Hope this helps someone else. Most examples I found online involved custom code, background threads, etc., but using the new driver feature is much easier!
There is a library that can make this easy. Effectively you just override the getPassword() method in the HikariDataSource. You use STS to assume the role and send a "password" for that role.
<dependency>
    <groupId>io.volcanolabs</groupId>
    <artifactId>rds-iam-hikari-datasource</artifactId>
    <version>1.0.4</version>
</dependency>
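For illustration, a minimal sketch of the getPassword() idea behind that library (the class name, region, and hostname are assumptions, and the token generator shown is the AWS SDK v1 API used earlier in this thread; the library's actual implementation may differ):

import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.rds.auth.GetIamAuthTokenRequest;
import com.amazonaws.services.rds.auth.RdsIamAuthTokenGenerator;
import com.zaxxer.hikari.HikariDataSource;

// Hikari reads the password when it opens a physical connection,
// so returning a freshly generated IAM token acts as a short-lived password.
public class IamAuthHikariDataSource extends HikariDataSource {

    @Override
    public String getPassword() {
        RdsIamAuthTokenGenerator generator = RdsIamAuthTokenGenerator.builder()
                .credentials(new DefaultAWSCredentialsProviderChain())
                .region("us-east-1") // assumption: set your RDS region
                .build();
        return generator.getAuthToken(GetIamAuthTokenRequest.builder()
                .hostname("dbhost.xyz123abc.us-east-1.rds.amazonaws.com") // placeholder host
                .port(3306)
                .userName(getUsername())
                .build());
    }
}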
I'm using Spring-Data-Neo4j 4.0.0.M1, and trying to connect to the server. I'm getting an exception:
Caused by: org.apache.http.client.HttpResponseException: Unauthorized
I have a password on the server interface, but I'm not sure how to tell Spring about it.
@Configuration
@EnableNeo4jRepositories(basePackages = "com.noxgroup.nitro.persistence")
@EnableTransactionManagement
public class MyConfiguration extends Neo4jConfiguration {

    @Bean
    public Neo4jServer neo4jServer() {
        /*** I was quite surprised not to see an overloaded parameter here ***/
        return new RemoteServer("http://localhost:7474");
    }

    @Bean
    public SessionFactory getSessionFactory() {
        return new SessionFactory("org.my.software.domain");
    }

    @Bean
    ApplicationListener<BeforeSaveEvent> beforeSaveEventApplicationListener() {
        return new ApplicationListener<BeforeSaveEvent>() {
            @Override
            public void onApplicationEvent(BeforeSaveEvent event) {
                if (event.getEntity() instanceof User) {
                    User user = (User) event.getEntity();
                    user.encodePassword();
                }
            }
        };
    }
}
Side Note
4.0.0 Milestone 1 is absolutely fantastic. If anyone is using 3.x.x, I'd recommend checking it out!
The username and password are currently passed via system properties, e.g.:
-Drun.jvmArguments="-Dusername=<usr> -Dpassword=<pwd>"
or
System.setProperty("username", "neo4j");
System.setProperty("password", "password");
https://jira.spring.io/browse/DATAGRAPH-627 is open (not targeted for 4.0 RC1, though); please feel free to add comments or vote it up.