Cannot connect to Cloud SQL from Cloud Dataflow transform - Java

I am unable to connect to Cloud SQL from inside a custom DoFn while running on Cloud Dataflow. The errors that show up in the log are:
Connecting to Cloud SQL instance [] via ssl socket.
[Docbuilder-worker-exception]: com.zaxxer.hikari.pool.HikariPool$PoolInitializationException: Failed to initialize pool: Could not create connection to database server.
The same code and config work fine when connecting to Cloud SQL from the App Engine handler.
I have explicitly given the Compute Engine service account (-compute@developer.gserviceaccount.com) the Cloud SQL Client, Cloud SQL Viewer and Editor roles.
Any help troubleshooting this is greatly appreciated!

To connect to Cloud SQL from external applications there are several methods you can follow. The document How to connect to Cloud SQL from external applications[1] lists the alternatives and the steps to achieve your goal.
[1] https://cloud.google.com/sql/docs/postgres/connect-external-app
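One approach from that document that tends to work well from Dataflow workers is the Cloud SQL JDBC socket factory, which opens a TLS connection authorized by the worker's service account instead of relying on an authorized network. Below is only a rough sketch for a Postgres instance, assuming the com.google.cloud.sql:postgres-socket-factory artifact is on the classpath; the instance connection name, database name, and credentials are placeholders:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class CloudSqlPool {
    public static HikariDataSource createPool() {
        HikariConfig config = new HikariConfig();
        // Database name only; the socket factory handles the network path.
        config.setJdbcUrl("jdbc:postgresql:///my_db");
        // Placeholder instance connection name in <PROJECT>:<REGION>:<INSTANCE> form.
        config.addDataSourceProperty("cloudSqlInstance", "my-project:us-central1:my-instance");
        config.addDataSourceProperty("socketFactory", "com.google.cloud.sql.postgres.SocketFactory");
        config.setUsername("my_user");      // placeholder
        config.setPassword("my_password");  // placeholder
        return new HikariDataSource(config);
    }
}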

I've also run into a lot of issues when trying to use connection pooling from Cloud Dataflow to Cloud SQL with a custom DoFn. I don't remember whether my error was the same as yours, but my solution was to create an @Setup method in the DoFn class like this:
static class ProcessDatabaseEvent extends DoFn<String, String> {

    // Pool is created once per DoFn instance and reused across bundles.
    private transient HikariDataSource pool;

    @Setup
    public void createConnectionPool() throws IOException {
        final Properties properties = new Properties();
        properties.load(Thread.currentThread().getContextClassLoader().getResourceAsStream("config.properties"));
        final String JDBC_URL = properties.getProperty("jdbc.url");
        final String JDBC_USER = properties.getProperty("jdbc.username");
        final String JDBC_PASS = properties.getProperty("jdbc.password");

        final HikariConfig config = new HikariConfig();
        config.setMinimumIdle(5);
        config.setMaximumPoolSize(50);
        config.setConnectionTimeout(10000);
        config.setIdleTimeout(600000);
        config.setMaxLifetime(1800000);
        config.setJdbcUrl(JDBC_URL);
        config.setUsername(JDBC_USER);
        config.setPassword(JDBC_PASS);
        pool = new HikariDataSource(config);
    }

    @ProcessElement
    public void processElement(final ProcessContext context) throws IOException, SQLException {
        // Your DoFn code here...
    }
}
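Since each worker builds its own pool in @Setup, it is also worth releasing it when the worker retires the DoFn instance. A small sketch of a matching teardown method to add inside the same class, assuming pool is the HikariDataSource field created above:
@Teardown
public void closeConnectionPool() {
    // Close the pooled connections when this DoFn instance is discarded.
    if (pool != null) {
        pool.close();
    }
}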

Related

Storage account connection string for 'AzureWebJobsAzureCosmosDBConnection' is invalid

I have this problem when I try to run a function with a BlobTrigger.
Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.myFunction'. Microsoft.Azure.WebJobs.Extensions.Storage:
Storage account connection string for 'AzureWebJobsAzureCosmosDBConnection' is invalid.
The variables are:
AzureWebJobsAzureCosmosDBConnection = AccountEndpoint=https://example.com/;AccountKey=YYYYYYYYYYY;
AzureWebJobsStorage = UseDevelopmentStorage=true
AzureCosmosDBConnection = AccountEndpoint=https://example.com/;AccountKey=YYYYYYYYYYY;
I don't know why this function throws an exception.
I'm not sure whether your local.settings.json is really configured in key = value format or whether that is just how you wrote it in the question.
The format of the local.settings.json configuration for any Azure Function is "key": "value":
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=pravustorageac88;AccountKey=<alpha-numeric-symbolic_access_key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "java",
    "MAIN_CLASS": "com.example.DemoApplication",
    "AzureWebJobsDashboard": "DefaultEndpointsProtocol=https;AccountName=pravustorageac88;AccountKey=<alpha-numeric-symbolic_access_key>;EndpointSuffix=core.windows.net",
    "AzureCosmosdBConnStr": "Cosmos_db_conn_str"
  }
}
If you are using a Cosmos DB connection string, then you have to configure it in this way:
public HttpResponseMessage execute(
        @HttpTrigger(name = "request", methods = {HttpMethod.GET, HttpMethod.POST}, authLevel = AuthorizationLevel.ANONYMOUS) HttpRequestMessage<Optional<User>> request,
        @CosmosDBOutput(name = "database", databaseName = "db_name", collectionName = "collection_name", connectionStringSetting = "AzureCosmosDBConnection") OutputBinding<String> output,
        ExecutionContext context) {
    // write the outgoing document through the output binding here
}
Make sure the Cosmos database connection string present in the local.settings.json file is also published to the Azure Function App under Configuration Menu > Application Settings.
For that, either remove local.settings.json from the .gitignore file so it gets published, or add the configuration settings manually in the Azure Function App Configuration.
I uncommented local.settings.json in the .gitignore file and then published to the Azure Function App, so the Cosmos database connection string was also updated in the configuration.
Note:
If you have a proxy on the system, then you have to add the proxy settings to the func.exe configuration file, as described here by @p31415926.
You can configure the Cosmos DB connection in the Azure Functions Java stack in two ways: through bindings (as in the code above) and through the SDK described in this MS Doc.
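For the SDK route, here is a minimal sketch using the azure-cosmos Java SDK v4; the endpoint, key, database, and container names are placeholders, and in a real Function App you would read them from the application settings rather than hard-coding them:
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;
import com.azure.cosmos.CosmosContainer;

public class CosmosSdkExample {
    // Simple document type; Cosmos DB items need an "id" property.
    public static class Message {
        public String id;
        public String message;
    }

    public static void main(String[] args) {
        // Placeholder endpoint and key.
        CosmosClient client = new CosmosClientBuilder()
                .endpoint("https://example.documents.azure.com:443/")
                .key("<ACCOUNT_KEY>")
                .buildClient();
        CosmosContainer container = client.getDatabase("db_name").getContainer("collection_name");

        Message doc = new Message();
        doc.id = "1";
        doc.message = "hello from the SDK";
        container.upsertItem(doc);

        client.close();
    }
}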

Spark JDBC connectivity to SQL Server using Kerberos authentication in Java

I have a use case where the Spark Dataset API has to connect to SQL Server over JDBC to retrieve data.
The DB supports Kerberos authentication, which is why I am using the jTDS driver with Spring's DriverManagerDataSource.
The code for the JDBC connectivity is:
/**
 * Returns the data source for db connection
 * @return
 * @throws Exception
 */
private static DriverManagerDataSource getDataSource() throws Exception {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("net.sourceforge.jtds.jdbc.Driver");
    String dataSourceUrl = "jdbc:jtds:sqlserver://" + "DBDEV.abc.com" + "/" + "TestDB";
    dataSource.setUrl(dataSourceUrl);

    Properties connProps = new Properties();
    connProps.setProperty(DescapDataConstants.APP_NAME_PROPERTY, "Test");
    connProps.setProperty(DescapDataConstants.USE_KERBEROS, Boolean.TRUE.toString());
    connProps.setProperty(DescapDataConstants.LOGIN_TIMEOUT, "60");
    connProps.setProperty(DescapDataConstants.SOCKET_TIMEOUT, "7200");
    dataSource.setConnectionProperties(connProps);
    return dataSource;
}
But there is no provision to use this data source with the Spark JDBC API, as far as I can tell from this page:
https://spark.apache.org/docs/2.3.1/api/java/org/apache/spark/sql/DataFrameReader.html#jdbc-java.lang.String-java.lang.String-java.util.Properties-
Is there any way to use a DataSource for connecting over JDBC via the Spark APIs?
As already noted in the comments, and from a quick look at the source code here, it looks like the implementation indeed does not support a DataSource lookup.
One option is to take that source code and extend it to accept a connection object directly; you would then have to figure out the error handling yourself and remember to close the connection!
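Since DataFrameReader.jdbc only accepts a java.util.Properties object rather than a DataSource, one pragmatic option is to pass the same jTDS settings directly as JDBC properties. This is only a rough sketch under that assumption; the property names mirror what your constants appear to stand for, and the host and table names are placeholders:
import java.util.Properties;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SqlServerKerberosRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("SqlServerKerberosRead").getOrCreate();

        // Same settings as the DriverManagerDataSource, but as plain JDBC properties.
        Properties connProps = new Properties();
        connProps.setProperty("driver", "net.sourceforge.jtds.jdbc.Driver");
        connProps.setProperty("appName", "Test");
        connProps.setProperty("useKerberos", "true");
        connProps.setProperty("loginTimeout", "60");
        connProps.setProperty("socketTimeout", "7200");

        String url = "jdbc:jtds:sqlserver://DBDEV.abc.com/TestDB";

        // Each executor opens its own JDBC connection using these properties.
        Dataset<Row> df = spark.read().jdbc(url, "dbo.some_table", connProps);
        df.show();
    }
}
Note that every executor also needs a valid Kerberos ticket or keytab available locally; the JDBC properties alone do not take care of that.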

Attempting to connect to Atlas via Lambda: No address associated with hostname

I'm trying to upload a document from a Lambda script, but I'm stuck: I keep getting the following error whenever the Lambda script starts:
com.mongodb.MongoSocketException: cluster0-whnfd.mongodb.net: No address associated with hostname
The error seems obvious; however, I can connect using that same URL via MongoDB Compass. The Java class I'm using looks like this:
public class MongoStore {

    private final static String MONGO_ADDRESS = "mongodb+srv://<USERNAME>:<PASSWORD>@cluster0-whnfd.mongodb.net/test";

    private MongoCollection<Document> collection;

    public MongoStore() {
        final MongoClientURI uri = new MongoClientURI(MONGO_ADDRESS);
        final MongoClient mongoClient = new MongoClient(uri);
        final MongoDatabase database = mongoClient.getDatabase("test");
        this.collection = database.getCollection("test");
    }

    public void save(String payload) {
        Document document = new Document();
        document.append("message", payload);
        collection.insertOne(document);
    }
}
Have I just misconfigured my Java class, or is there something more tricky going on here?
I had the same problem with a freshly created MongoDB Atlas database when I started migrating my Python web application away from Heroku.
I realised that the DNS name cluster0.hgmft.mongodb.net just doesn't exist.
The magic happened when I installed the dnspython library (my app is written in Python); with that library the MongoDB client was able to connect to my database in Mongo Atlas.
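The Java equivalent of that issue is driver support for SRV lookups: mongodb+srv:// connection strings only work with the MongoDB Java driver from version 3.6 onward. Here is a small sketch of the same store using the newer client API, assuming a 3.7+ driver on the classpath; the username and password remain placeholders as in the question:
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class MongoStoreSrv {
    // Same SRV host as in the question; credentials are placeholders.
    private static final String MONGO_ADDRESS =
            "mongodb+srv://<USERNAME>:<PASSWORD>@cluster0-whnfd.mongodb.net/test";

    private final MongoCollection<Document> collection;

    public MongoStoreSrv() {
        // MongoClients.create performs the SRV lookup itself.
        MongoClient mongoClient = MongoClients.create(MONGO_ADDRESS);
        MongoDatabase database = mongoClient.getDatabase("test");
        this.collection = database.getCollection("test");
    }

    public void save(String payload) {
        collection.insertOne(new Document("message", payload));
    }
}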

How to hook up AWS RDS - Aurora with AWS Lambda Java function

I am trying to hook up an AWS RDS Aurora database with an AWS Lambda Java function. I have yet to see any concrete examples for this; the ones I have found are not in Java.
I would also like to configure a MySQL DBMS tool with Aurora, which I am not able to do :( Can someone help me with that as well? I have got the connection strings from https://console.aws.amazon.com/rds/home?region=us-east-1#dbinstances.
Also, the code I am using to try to connect to the DB from the Java Lambda is:
private Statement createConnection(Context context) {
    logger = context.getLogger();
    try {
        String url = "jdbc:mysql://HOSTNAME:3306";
        String username = "USERNAME";
        String password = "PASSWORD";
        Connection conn = DriverManager.getConnection(url, username, password);
        return conn.createStatement();
    } catch (Exception e) {
        e.printStackTrace();
        logger.log("Caught exception: " + e.getMessage());
    }
    return null;
}
And yes, this doesn't work; I always get null back when using the DB instance config.
RDS needs to be in a security group that opens the DB port to the security group attached to the ENI of the Lambda.
To enable your Lambda function to access resources inside your private VPC, you must provide additional VPC-specific configuration information that includes VPC subnet IDs and security group IDs. AWS Lambda uses this information to set up elastic network interfaces (ENIs) that enable your function to connect securely to other resources within your private VPC.
http://docs.aws.amazon.com/lambda/latest/dg/vpc.html
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-eni.html
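The subnet and security group wiring is normally done in the console or CLI rather than in code, but purely as an illustration of which fields matter, here is a hedged sketch using the AWS SDK for Java v1; the function name, subnet IDs, and security group ID are placeholders:
import com.amazonaws.services.lambda.AWSLambda;
import com.amazonaws.services.lambda.AWSLambdaClientBuilder;
import com.amazonaws.services.lambda.model.UpdateFunctionConfigurationRequest;
import com.amazonaws.services.lambda.model.VpcConfig;

public class AttachLambdaToVpc {
    public static void main(String[] args) {
        AWSLambda lambda = AWSLambdaClientBuilder.defaultClient();
        // The subnets must be able to reach the Aurora instance, and the security
        // group below must be allowed on port 3306 by the RDS security group.
        lambda.updateFunctionConfiguration(new UpdateFunctionConfigurationRequest()
                .withFunctionName("my-aurora-function")
                .withVpcConfig(new VpcConfig()
                        .withSubnetIds("subnet-11111111", "subnet-22222222")
                        .withSecurityGroupIds("sg-33333333")));
    }
}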

Connect MS OLAP with Java

I have a data source whose provider is MSOLAP, and I want to connect to this source from a Java-based application. I used the following:
public static void main(String[] args) throws Exception {
    // Load the driver
    Class.forName("org.olap4j.driver.xmla.XmlaOlap4jDriver");

    // Connect
    final Connection connection =
        DriverManager.getConnection(
            // This is the SQL Server service end point.
            "jdbc:xmla:Server=http://localhost:81/mondrian/xmla"
            // Tells the XMLA driver to use a SOAP request cache layer.
            // We will use an in-memory static cache.
            + ";Cache=org.olap4j.driver.xmla.cache.XmlaOlap4jNamedMemoryCache"
            // Sets the cache name to use. This allows cross-connection
            // cache sharing. Don't give the driver a cache name and it
            // disables sharing.
            + ";Cache.Name=MyNiftyConnection"
            // Some cache performance tweaks.
            // Look at the javadoc for details.
            + ";Cache.Mode=LFU;Cache.Timeout=600;Cache.Size=100",
            // XMLA is over HTTP, so BASIC authentication is used.
            null,
            null);

    // We are dealing with an olap connection. We must unwrap it.
    final OlapConnection olapConnection = connection.unwrap(OlapConnection.class);

    // Check if it's all groovy
    ResultSet databases = olapConnection.getMetaData().getDatabases();
    databases.first();
    System.out.println(
        olapConnection.getMetaData().getDriverName()
        + " -> "
        + databases.getString(1));

    // Done
    connection.close();
}
I get an error that the class OlapConnection is not compiled. I have two questions:
1. I am using Maven to build this test and it is not showing any errors, so why would this class not be found?
2. Is there any other way to connect to MSOLAP without using olap4j?
This isn't how you connect remotely to an XMLA service. Start by reading this code, and then you'll need to edit the connection string.
In SSAS, the connection string should look something like this:
jdbc:xmla:Server=http://localhost/olap/msmdpump.dll;Catalog=myCatalog
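Put together with the code from the question, a connection against SSAS over HTTP would then look roughly like this; the endpoint and catalog name are placeholders, and msmdpump.dll has to be exposed through IIS for XMLA access to work:
import java.sql.Connection;
import java.sql.DriverManager;
import org.olap4j.OlapConnection;

public class SsasXmlaConnect {
    public static void main(String[] args) throws Exception {
        // olap4j XMLA driver, pointed at the SSAS HTTP endpoint instead of Mondrian.
        Class.forName("org.olap4j.driver.xmla.XmlaOlap4jDriver");
        Connection connection = DriverManager.getConnection(
                "jdbc:xmla:Server=http://localhost/olap/msmdpump.dll;Catalog=myCatalog",
                null, null);
        OlapConnection olapConnection = connection.unwrap(OlapConnection.class);
        System.out.println(olapConnection.getMetaData().getDriverName());
        connection.close();
    }
}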
