Not able to connect to Cloud SQL using Java SocketFactory Library - java

I am trying to connect to Cloud SQL (MySQL) from my Java code. I am getting the error below:
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create socket factory 'com.google.cloud.sql.mysql.SocketFactory' due to underlying exception:
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.cloud.sql.mysql.SocketFactory
Here is my code -
package utils

import java.sql.DriverManager
import java.sql.Connection
import scala.collection.mutable.ListBuffer
import entity.AnalyticFieldEntity
import compute.driver.AnalyticTools
import entity.ErrorHandlingEntity

object ScalaDbConnect {

  def getAnalyticBatchMap(toolId: Int, paramMap: Map[String, String]): Map[String, Int] = {
    val methodName = "getAnalyticBatchMap"
    val errorMode = paramMap.get("mode") + "(" + paramMap.get("analyticSource") + ")"
    val dbTuple = DbPropertiesReader.getDbProperties()
    val databaseName = dbTuple._3
    val instanceConnectionName = dbTuple._4
    val username = dbTuple._1
    val password = dbTuple._2
    var connection: Connection = null
    val analyticMap = collection.mutable.Map.empty[String, Int]
    try {
      //[START doc-example]
      val jdbcUrl = String.format(
        "jdbc:mysql://google/%s?cloudSqlInstance=%s&"
          + "socketFactory=com.google.cloud.sql.mysql.SocketFactory", databaseName, instanceConnectionName)
      println(jdbcUrl)
      //Class.forName("com.mysql.jdbc.GoogleDriver")
      // Assign to the outer var so the connection.close() at the end closes this connection
      connection = DriverManager.getConnection(jdbcUrl, username, password)
      println(connection)
      //[END doc-example]
      try {
        val statement = connection.createStatement()
        val resultSet = statement.executeQuery(
          "SELECT omnitureColumnHeader.columnHeaderId, case when analyticFieldMap.isTag = 1 then concat(\"tag_\", analyticFieldMap.entityField) else " +
            "analyticFieldMap.entityField end as entityField FROM omnitureColumnHeader INNER JOIN analyticFieldMap ON " +
            "analyticFieldMap.analyticFieldBatch = omnitureColumnHeader.columnHeaderValue where analyticFieldMap.toolId = " + toolId)
        System.out.println("statement: 2" + statement)
        System.out.println("resultSet: 2" + resultSet)
        while (resultSet.next()) {
          System.out.println("inside the content loop: 2")
          analyticMap += resultSet.getString("entityField") -> resultSet.getInt("columnHeaderId")
        }
        System.out.println("analyticMap: 2" + analyticMap)
      } catch {
        case _: Throwable => println("Got some other kind of exception")
      }
    } catch {
      case e: Exception =>
        val errorHandlingEntity = new ErrorHandlingEntity()
        errorHandlingEntity.Mode = errorMode
        errorHandlingEntity.Tool = paramMap.get("tool").toString()
        errorHandlingEntity.Message = "DB Connection Issue"
        errorHandlingEntity.Trace = e.getStackTrace.mkString("\n") // printStackTrace() returns Unit, so capture the trace as a String instead
        errorHandlingEntity.Source = "Spark"
        errorHandlingEntity.YarnAppId = paramMap.get("appID").toString()
        errorHandlingEntity.MethodName = methodName
        errorHandlingEntity.ReThrow = true
        errorHandlingEntity.CurrentException = e
        ErrorHandlingFramework.HandleException(errorHandlingEntity)
    }
    connection.close()
    analyticMap.toMap
  }
}
I have added the following dependency to my pom.xml:
<dependency>
    <groupId>com.google.cloud.sql</groupId>
    <artifactId>mysql-socket-factory</artifactId>
    <version>1.0.3</version>
</dependency>
Here is the complete pom.xml: https://pastebin.com/jvxSBZMX
I am trying to connect to Google Cloud SQL from my Scala code, using the Java APIs.
The error indicates that the socket factory class cannot be found on the classpath, so I am not loading the correct class for the connection.
Any help would be appreciated.
Thanks.

The issue was with how the Maven build was being run on Google Cloud: the classes from that dependency were not visible on the Spark classpath at runtime. I passed those JAR files explicitly with the --jars option instead, and this solved my issue.
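For anyone hitting the same thing: one way to make those classes available is to list the JARs in the Spark configuration itself (a minimal sketch; the paths below are placeholders, and the same comma-separated list can equally be passed on the command line via spark-submit --jars):

import org.apache.spark.SparkConf

// Sketch only: ship the Cloud SQL socket factory (and the MySQL driver) to the driver and executors.
// The paths are placeholders for wherever your build puts the dependency JARs.
val conf = new SparkConf()
  .setAppName("cloud-sql-connect")
  .set("spark.jars", "/path/to/mysql-socket-factory-1.0.3.jar,/path/to/mysql-connector-java.jar")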

Related

Databricks JDBC driver: java.lang.ClassCastException: cannot be cast to java.lang.String

I am trying to read a table from MS SQL Server using Spark, following the Databricks example (https://docs.databricks.com/data/data-sources/sql-databases.html).
Running the following Scala code:
val jdbcHostname = "hostname"
val jdbcPort = 1433
val jdbcDatabase = "database"
val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", s"${jdbcUsername}")
connectionProperties.put("password", s"${jdbcPassword}")
Getting the following exception:
scala> val stg = spark.read.jdbc(jdbcUrl, myTable, connectionProperties)
java.lang.ClassCastException: org.apache.spark.sql.ColumnName cannot be cast to java.lang.String
at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$3.next(Wrappers.scala:414)
at scala.collection.convert.Wrappers$JPropertiesWrapper$$anon$3.next(Wrappers.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.AbstractMap.$plus$plus$eq(Map.scala:80)
at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:235)
... 52 elided
What should I do in order to overcome this java exception?
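Not a definitive answer, but the stack trace points at the iteration over connectionProperties inside DataFrameReader.jdbc, which casts every entry to String. That usually means one of the values put into the Properties (or the table argument) is a Spark Column such as $"username" rather than a plain String. A minimal sketch with everything forced to Strings, reusing jdbcUrl and the spark session from the snippet above (the other names are illustrative placeholders):

import java.util.Properties

// Sketch: every value handed to spark.read.jdbc must be a plain String.
val jdbcUsername: String = "username"        // not $"username" / a ColumnName
val jdbcPassword: String = "password"
val myTable: String = "dbo.my_table"         // placeholder table name

val connectionProperties = new Properties()
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)

val stg = spark.read.jdbc(jdbcUrl, myTable, connectionProperties)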

How to correctly pass parameters in controllers?

I am new to the Play Framework and need some advice.
In my project I send an AJAX request, but unfortunately Play raises the error you can see below. From the error message it appears the problem is most likely in the controller. Where is my mistake, and how can I fix this error?
[error] application -
! #79mg8k016 - Internal server error, for (GET) [/] ->
play.api.UnexpectedException: Unexpected exception[CreationException: Unable to create injector, see the following errors:
1) No implementation for play.api.db.Database was bound.
while locating play.api.db.Database
for the 1st parameter of controllers.GetValuesController.<init>(GetValuesController.scala:14)
while locating controllers.GetValuesController
for the 4th parameter of router.Routes.<init>(Routes.scala:33)
at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
1 error]
at play.core.server.DevServerStart$$anon$1.reload(DevServerStart.scala:186)
at play.core.server.DevServerStart$$anon$1.get(DevServerStart.scala:124)
at play.core.server.AkkaHttpServer.handleRequest(AkkaHttpServer.scala:241)
at play.core.server.AkkaHttpServer.$anonfun$createServerBinding$1(AkkaHttpServer.scala:138)
at akka.stream.impl.fusing.MapAsyncUnordered$$anon$26.onPush(Ops.scala:1304)
at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:519)
at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:482)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:378)
at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:588)
at akka.stream.impl.fusing.GraphInterpreterShell$AsyncInput.execute(ActorGraphInterpreter.scala:472)
Caused by: com.google.inject.CreationException: Unable to create injector, see the following errors:
1) No implementation for play.api.db.Database was bound.
while locating play.api.db.Database
for the 1st parameter of controllers.GetValuesController.<init>(GetValuesController.scala:14)
while locating controllers.GetValuesController
for the 4th parameter of router.Routes.<init>(Routes.scala:33)
at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
It seems like something is wrong in the controller. Where is my mistake, and how do I fix it?
I use:
JDK 1.8.0_181
SBT 0.13.5
Scala 2.12
Play Framework 2.6.20
routes:
GET /get_values controllers.GetValuesController.get_data_from_db(start_date_time:String, end_date_time:String, city_name:String)
GetValuesController.scala:
package controllers

import javax.inject._
import akka.actor.ActorSystem
import play.api.Configuration
import play.api.mvc.{AbstractController, ControllerComponents}
import play.api.libs.ws._
import scala.concurrent.duration._
import scala.concurrent.{ExecutionContext, Future, Promise}
import services._
import play.api.db.Database

class GetValuesController @Inject()(db: Database, conf: Configuration, ws: WSClient, cc: ControllerComponents, actorSystem: ActorSystem)(implicit exec: ExecutionContext) extends AbstractController(cc) {

  def get_data_from_db(start_date_time: String, end_date_time: String, city_name: String) = Action.async {
    getValue(1.second, start_date_time, end_date_time, city_name).map {
      message => Ok(message)
    }
  }

  private def getValue(delayTime: FiniteDuration, start_date_time: String, end_date_time: String, city_name: String): Future[String] = {
    val promise: Promise[String] = Promise[String]()
    val service: GetValuesService = new GetValuesService(db)
    actorSystem.scheduler.scheduleOnce(delayTime) {
      promise.success(service.get_values(start_date_time, end_date_time, city_name))
    }(actorSystem.dispatcher)
    promise.future
  }
}
GetValuesService.scala:
package services

import play.api.db.Database
import play.api.libs.json._

class GetValuesService(db: Database) {

  def get_values(start_date_time: String, end_date_time: String, city_name: String): String = {
    val SQL_STATEMENT = "SELECT " +
      "table_name.\"Stamper\" AS DATE_TIME, " +
      "table_name.\"CITY\" AS CITY, " +
      "MAX(table_name.avg) AS MAX_SPEED " +
      "FROM table_name " + // trailing space so the SQL does not read "table_nameWHERE"
      "WHERE table_name.\"CITY\"='" + city_name + "' " +
      "AND (table_name.\"Stamper\" BETWEEN '" + start_date_time + "' AND '" + end_date_time + "') " +
      "GROUP BY table_name.\"Stamper\", table_name.\"CITY\";"
    val connection = db.getConnection()
    var json_array = Json.arr()
    try {
      val query = connection.createStatement.executeQuery(SQL_STATEMENT)
      while (query.next()) {
        val json_object = Json.obj(
          "DATE_TIME" -> query.getString(1),
          "CITY" -> query.getString(2),
          "MAX_SPEED" -> query.getString(3)
        )
        json_array +:= json_object
      }
    } finally {
      connection.close()
    }
    println(json_array)
    json_array.toString()
  }
}
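As a side note, unrelated to the injection error: the query above splices user input directly into the SQL string. A rough sketch of the same query with bound parameters, inside get_values and reusing connection and the method parameters (the table and column names are the placeholders from the snippet above):

// Sketch only: the same query with a PreparedStatement instead of string concatenation.
val sql =
  """SELECT table_name."Stamper" AS DATE_TIME, table_name."CITY" AS CITY, MAX(table_name.avg) AS MAX_SPEED
    |FROM table_name
    |WHERE table_name."CITY" = ? AND (table_name."Stamper" BETWEEN ? AND ?)
    |GROUP BY table_name."Stamper", table_name."CITY"
    |""".stripMargin
val stmt = connection.prepareStatement(sql)
stmt.setString(1, city_name)
stmt.setString(2, start_date_time)
stmt.setString(3, end_date_time)
val query = stmt.executeQuery()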
application.conf:
db.postgres.driver=org.postgresql.Driver
db.postgres.url="jdbc:postgresql://host:port/database_name"
db.postgres.username = "username"
db.postgres.password = "password"
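One thing worth checking alongside the configuration: for play.api.db.Database to be bindable at all, the Play JDBC module and a driver have to be declared in build.sbt. A sketch of the relevant lines, assuming the standard Play sbt plugin is enabled (the PostgreSQL driver version is only an example):

// build.sbt (sketch): jdbc comes from the Play sbt plugin and provides the play.api.db bindings
libraryDependencies ++= Seq(
  jdbc,
  "org.postgresql" % "postgresql" % "42.2.5"
)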
Javascript:
$.ajax({
    type: "POST",
    url: "http://localhost:9000/get_values",
    data: {
        start_date_time: '2018-10-01 00:00:00',
        end_date_time: '2018-10-31 23:00:00',
        city_name: 'London'
    },
    success: function (result) {
        console.log(result);
    },
    error: function (jqXHR, textStatus, errorThrown) {
        console.log("jqXHR: " + jqXHR);
        console.log("textStatus: " + textStatus);
        console.log("errorThrown: " + errorThrown);
    }
});
ERROR:
Action Not Found
For request 'POST /get_values'
At the same time, the controller itself now works correctly: if I call, say, http://localhost:9000/get_values?start_date_time=2018-10-01%2000:00:00&end_date_time=2018-10-31%2023:00:00&city_name=London it returns JSON data.
Well, we finally found the problem: it was in the application.conf file. In my case I had to use the following database configuration (db.default instead of db.postgres):
db.default.driver=org.postgresql.Driver
db.default.url="jdbc:postgresql://host:port/database_name"
db.default.username = "username"
db.default.password = "password"
Also, in the AJAX code I removed this part: type: "POST".
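For completeness, the db.postgres.* keys from the original configuration can also be kept, as long as the controller asks for that specific database by name; a sketch using Play's NamedDatabase annotation:

import javax.inject.Inject
import play.api.db.{Database, NamedDatabase}
import play.api.mvc.{AbstractController, ControllerComponents}

// Sketch only: inject the database configured under db.postgres.* instead of db.default.*
class GetValuesController @Inject()(@NamedDatabase("postgres") db: Database,
                                    cc: ControllerComponents) extends AbstractController(cc) {
  // ... actions as in the original controller ...
}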

Firebase Java Admin SDK 5.6.0 FirestoreClient constantly throws io.grpc.StatusRuntimeException: UNKNOWN

When using the Firebase Admin SDK 5.6.0 for Java (in a Scala Play! application), we are constantly getting io.grpc.StatusRuntimeException: UNKNOWN anytime we get or set data using the FirestoreClient. The auth functions of Firebase seem to work without any issue, however.
Here is the exception we are getting:
ERROR application - method=GET uri=/v1/users/synchAllUsers remote-address=0:0:0:0:0:0:0:1 status=500 error=class java.util.concurrent.ExecutionException: io.grpc.StatusRuntimeException: UNKNOWN
com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:503)
com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:482)
com.google.api.core.AbstractApiFuture.get(AbstractApiFuture.java:56)
services.FirebaseAdminService.createToken(FirebaseAdminService.scala:98)
services.UsersService.$anonfun$synchAllUsers$2(UsersService.scala:37)
scala.collection.Iterator.foreach(Iterator.scala:929)
scala.collection.Iterator.foreach$(Iterator.scala:929)
scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
scala.collection.IterableLike.foreach(IterableLike.scala:71)
scala.collection.IterableLike.foreach$(IterableLike.scala:70)
scala.collection.AbstractIterable.foreach(Iterable.scala:54)
services.UsersService.$anonfun$synchAllUsers$1(UsersService.scala:34)
services.UsersService.$anonfun$synchAllUsers$1$adapted(UsersService.scala:34)
scala.util.Success.$anonfun$map$1(Try.scala:251)
scala.util.Success.map(Try.scala:209)
scala.concurrent.Future.$anonfun$map$1(Future.scala:287)
scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:38)
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Here is the code we are running:
private lazy val app = {
  FirebaseApp.getApps().stream().filter(a => a.getName == FirebaseApp.DEFAULT_APP_NAME).findFirst().orElseGet(
    () => {
      //val serviceAccount = new ByteArrayInputStream(Firebase.serviceAccountKey.getBytes(StandardCharsets.UTF_8.name()))
      val serviceAccount = new FileInputStream("conf/gcp_service_account.json")
      val options = new FirebaseOptions.Builder()
        .setCredentials(GoogleCredentials.fromStream(serviceAccount))
        //.setDatabaseUrl("https://festive-bazaar-146119.firebaseio.com")
        .build()
      serviceAccount.close()
      FirebaseApp.initializeApp(options, FirebaseApp.DEFAULT_APP_NAME)
    }
  )
}

def createToken(user: User, claims: Map[String, Object] = Map()) = {
  val auth = FirebaseAuth.getInstance(app)
  val db = FirestoreClient.getFirestore(app)
  val firebaseUser = getUserById(auth, user.id.toString)
  if (firebaseUser == null) {
    auth.createUserAsync(new CreateRequest()
      .setUid(user.id.toString)
      .setDisplayName(user.firstName + ' ' + user.lastName)
      .setEmail(user.email)
      .setEmailVerified(true)
      .setDisabled(false)).get()
  } else {
    auth.updateUserAsync(new UpdateRequest(user.id.toString)
      .setDisplayName(user.firstName + ' ' + user.lastName)
      .setEmail(user.email)
      .setEmailVerified(true)
      .setDisabled(false)).get()
  }
  // Set the user's info in our user metadata area
  val userInfoRec = new ImmutableMap.Builder[String, String]()
    .put("id", user.id.toString)
    .put("name", user.email)
    .put("email", user.email)
    .put("firstName", user.firstName)
    .put("lastName", user.lastName)
    .build()
  db.collection("tenants").document(user.companyId.toString).collection("users").document(user.id.toString).get()
  val users = db.collection("tenants").get.get
  val result = db.collection("tenants").document(user.companyId.toString).collection("users").document(user.id.toString).set(userInfoRec)
  result.get() // **** This triggers the exception shown above, every time ****
  auth.createCustomTokenAsync(user.id.toString, (claims + ("companyId" -> user.companyId.asInstanceOf[Object])).asJava).get()
}
The comment above with the **** marks the line that triggers the exception. Has anyone else run into this problem or know anything about it? In its current state it makes the FirestoreClient completely useless to us, as we can neither get nor set data in Firestore. I've checked the documentation and API reference and generally Googled around, but can't find anything useful.
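One thing that may help narrow it down (a debugging sketch, not a fix): the UNKNOWN status usually wraps a more specific cause, so unwrapping the ExecutionException thrown by ApiFuture.get() and logging the gRPC status and its cause can reveal what is actually failing. The helper name below is illustrative; calling getWithDiagnostics(result) instead of result.get() in the method above should at least surface the wrapped cause.

import java.util.concurrent.ExecutionException
import com.google.api.core.ApiFuture
import io.grpc.StatusRuntimeException

// Debugging sketch: unwrap the ExecutionException thrown by ApiFuture.get() and log the gRPC status.
def getWithDiagnostics[T](future: ApiFuture[T]): T =
  try {
    future.get()
  } catch {
    case e: ExecutionException =>
      e.getCause match {
        case s: StatusRuntimeException =>
          println(s"gRPC status: ${s.getStatus.getCode}, description: ${s.getStatus.getDescription}, cause: ${s.getCause}")
        case other =>
          println(s"non-gRPC cause: $other")
      }
      throw e
  }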

ChangeFileModeByMask error (5): Access is denied

I accessed a MySQL database and fetched the table.
Everything works fine up to that point.
When I try to save the records in text or other formats, I get the error:
ExitCodeException exitCode=1: ChangeFileModeByMask error (5): Access is denied.
Any help would be appreciated.
object jdbcConnect {
  def main(args: Array[String]) {
    val url = "jdbc:mysql://127.0.0.1:3306/mydb"
    val username = "root"
    val password = "token_password"
    Class.forName("com.mysql.jdbc.Driver").newInstance
    //DriverManager.registerDriver(new com.mysql.jdbc.Driver());
    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val myRDD = new JdbcRDD(sc, () =>
      DriverManager.getConnection(url, username, password),
      "select s_Id,issue_date from store_details limit ?, ?",
      0, 10, 1, r => r.getString("s_Id") + ", " + r.getString("issue_date"))
    myRDD.foreach(println)
    myRDD.saveAsTextFile("C:/jdbcrddexamplee")
  }
}
Error
17/07/18 11:10:19 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2)
ExitCodeException exitCode=1: ChangeFileModeByMask error (5): Access is denied.
at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)
at org.apache.hadoop.util.Shell.run(Shell.java:479)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:773)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
It turned out to be a permission error; my mistake.
Make sure to run everything as an administrator. That said, I would suggest using a DataFrame instead of an RDD. :D
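For reference, a rough sketch of the DataFrame-based equivalent (connection details reused from the snippet above; the output path is illustrative, and the MySQL JDBC driver still needs to be on the classpath):

import org.apache.spark.sql.SparkSession

// Sketch only: read the same MySQL table through the DataFrame JDBC source
// and write it out instead of using JdbcRDD + saveAsTextFile.
val spark = SparkSession.builder()
  .appName("JDBC DataFrame")
  .master("local[2]")
  .getOrCreate()

val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://127.0.0.1:3306/mydb")
  .option("dbtable", "store_details")
  .option("user", "root")
  .option("password", "token_password")
  .load()

df.select("s_Id", "issue_date")
  .write
  .csv("C:/jdbcdfexample") // write to a directory the current user can modify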

JdbcRDD error : connection established data fetched partially

I tried to connect to a MySQL database to fetch table records. I can establish the connection, and 10 records are fetched as well, but then the code suddenly crashes, and I don't know why. PS: I am new to Scala. Any help would be appreciated.
object jdbcConnect {
  def main(args: Array[String]) {
    val url = "jdbc:mysql://127.0.0.1:3306/mydb"
    val username = "root"
    val password = "token_password"
    Class.forName("com.mysql.jdbc.Driver").newInstance
    //DriverManager.registerDriver(new com.mysql.jdbc.Driver());
    val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]").set("spark.executor.memory", "1g")
    val sc = new SparkContext(conf)
    val myRDD = new JdbcRDD(sc, () => DriverManager.getConnection(url, username, password),
      "select s_Id,issue_date from store_details limit ?, ?",
      0, 10, 1, r => r.getString("s_Id") + ", " + r.getString("issue_date"))
    myRDD.foreach(println)
    myRDD.saveAsTextFile("C:/jdbcrddexamplee")
  }
}
ERROR
17/07/16 02:32:24 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
ExitCodeException exitCode=1: ChangeFileModeByMask error (5): Access is denied.
at org.apache.hadoop.util.Shell.runCommand(Shell.java:582)
at org.apache.hadoop.util.Shell.run(Shell.java:479)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:773)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
It turned out to be a permission error; my mistake.
Make sure to run everything as an administrator. That said, I would suggest using a DataFrame instead of an RDD. :D
Thanks
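For what it's worth, this ChangeFileModeByMask failure is the classic Spark-on-Windows symptom of Hadoop's native utilities not being set up. A hedged sketch of the usual workaround, separate from running as administrator (the paths are placeholders, and winutils.exe must actually be present in the bin directory):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch: point Hadoop at a directory that contains bin\winutils.exe before creating the SparkContext,
// and write to a directory the current user can modify.
System.setProperty("hadoop.home.dir", "C:\\hadoop") // placeholder path

val conf = new SparkConf().setAppName("JDBC RDD").setMaster("local[2]")
val sc = new SparkContext(conf)
// ... build the JdbcRDD exactly as above, then:
// myRDD.saveAsTextFile("C:/Users/yourname/jdbcrddexample")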
