I have a case class that I want to serialize and then deserialize for storage in MongoDB, but Java 8's LocalDateTime is causing a problem. I took help from this link:
how to deserialize DateTime in Lift
but with no luck; I am unable to write the equivalent for the Java 8 date-time API.
Can anyone please help me with this date-time issue? Here is my code:
import java.time.LocalDateTime
import net.liftweb.json.{NoTypeHints, Serialization}
import net.liftweb.json.Serialization.{read, write}
implicit val formats = Serialization.formats(NoTypeHints)
case class Child(var str: String, var Num: Int, var abc: Option[String], MyList: List[Int], val dateTime: LocalDateTime = LocalDateTime.now())
val ser = write(Child("Mary", 5, None, List(1, 2)))
println("Child class converted to string: " + ser)
var obj = read[Child](ser)
println("object of Child is " + obj)
And here is the error message printed on the console:
(run-main-0) java.lang.ArrayIndexOutOfBoundsException: 49938
java.lang.ArrayIndexOutOfBoundsException: 49938
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:451)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:431)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:492)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:337)
at com.thoughtworks.paranamer.BytecodeReadingParanamer.lookupParameterNames(BytecodeReadingParanamer.java:100)
at com.thoughtworks.paranamer.CachingParanamer.lookupParameterNames(CachingParanamer.java:75)
at com.thoughtworks.paranamer.CachingParanamer.lookupParameterNames(CachingParanamer.java:68)
at net.liftweb.json.Meta$ParanamerReader$.lookupParameterNames(Meta.scala:89)
at net.liftweb.json.Meta$Reflection$.argsInfo$1(Meta.scala:237)
at net.liftweb.json.Meta$Reflection$.constructorArgs(Meta.scala:253)
at net.liftweb.json.Meta$Reflection$.net$liftweb$json$Meta$Reflection$$findMostComprehensive$1(Meta.scala:266)
at net.liftweb.json.Meta$Reflection$$anonfun$primaryConstructorArgs$1.apply(Meta.scala:269)
at net.liftweb.json.Meta$Reflection$$anonfun$primaryConstructorArgs$1.apply(Meta.scala:269)
at net.liftweb.json.Meta$Memo.memoize(Meta.scala:199)
at net.liftweb.json.Meta$Reflection$.primaryConstructorArgs(Meta.scala:269)
at net.liftweb.json.Extraction$.decompose(Extraction.scala:88)
at net.liftweb.json.Extraction$$anonfun$1.applyOrElse(Extraction.scala:91)
at net.liftweb.json.Extraction$$anonfun$1.applyOrElse(Extraction.scala:89)
at scala.collection.immutable.List.collect(List.scala:305)
at net.liftweb.json.Extraction$.decompose(Extraction.scala:89)
at net.liftweb.json.Serialization$.write(Serialization.scala:38)
at TestActor$.delayedEndpoint$TestActor$1(TestActor.scala:437)
at TestActor$delayedInit$body.apply(TestActor.scala:54)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at TestActor$.main(TestActor.scala:54)
at TestActor.main(TestActor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
If I remove the dateTime parameter from the case class, it works fine. It seems like the problem lies with dateTime.
I ran your code in IntelliJ IDEA and got the same error. I tried to debug the cause, but the invocation stack is so deep that I finally gave up.
My guess is that it is because Lift doesn't provide a default Format for LocalDateTime, just as the post you mentioned says: "it is the DateParser format that Lift uses by default."
Here is a compromise for your reference. Lift-JSON provides a default Date format for us:
// net.liftweb.json.Serialization, line 72
def formats(hints: TypeHints) = new Formats {
  val dateFormat = DefaultFormats.lossless.dateFormat
  override val typeHints = hints
}
So instead of going all the way and writing a customized serializer, we may as well change our data type to fit the default Date format. Plus, net.liftweb.mongodb.DateSerializer (line 79) provides support for Date serialization.
Then we can provide a method to easily get the LocalDateTime back. The following is how I worked it out.
package jacoffee.scalabasic
import java.time.{ ZoneId, LocalDateTime }
import java.util.Date
// The package object gives the Scala compiler a place to find the implicit conversions used by the case class's date parameter
package object stackoverflow {
  implicit def toDate(ldt: LocalDateTime): Date =
    Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant())
  implicit def toLDT(date: Date): LocalDateTime =
    LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault())
}
package jacoffee.scalabasic.stackoverflow
import java.time.LocalDateTime
import java.util.Date
import net.liftweb.json.{ NoTypeHints, Serialization }
import net.liftweb.json.Serialization.{ read, write }
case class Child(var str: String, var Num: Int, var abc: Option[String],
    myList: List[Int], val date: Date = LocalDateTime.now()) {
  def getLDT: LocalDateTime = date
}
object DateTimeSerialization extends App {
  implicit val formats = Serialization.formats(NoTypeHints)
  val ser = write(Child("Mary", 5, None, List(1, 2)))
  // Child class converted to string {"str":"Mary","Num":5,"myList":[1,2],"date":"2015-07-21T03:07:05.699Z"}
  println(" Child class converted to string " + ser)
  var obj = read[Child](ser)
  // Object of Child is Child(Mary,5,None,List(1, 2),Tue Jul 21 11:48:22 CST 2015)
  println(" Object of Child is " + obj)
  // LocalDateTime of Child is 2015-07-21T11:48:22.075
  println(" LocalDateTime of Child is " + obj.getLDT)
}
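If you would rather keep LocalDateTime in the case class instead of switching to java.util.Date, lift-json also supports plugging in a custom serializer. Here is a minimal sketch, assuming ISO-8601 strings on the wire (the LocalDateTimeSerializer name is mine):
import java.time.LocalDateTime
import net.liftweb.json._
// Write LocalDateTime as an ISO-8601 JString, and parse it back on read.
object LocalDateTimeSerializer extends CustomSerializer[LocalDateTime](_ => (
  { case JString(s) => LocalDateTime.parse(s) },
  { case ldt: LocalDateTime => JString(ldt.toString) }
))
implicit val formats = Serialization.formats(NoTypeHints) + LocalDateTimeSerializer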
Anyway, hope it helps.
Related
I created a function based on Java's MaskFormatter in Databricks/Scala, but when I call it from Spark SQL I receive this error message:
Error in SQL statement: AnalysisException: Undefined function:
formatAccount. This function is neither a built-in/temporary function,
nor a persistent function that is qualified as
spark_catalog.default.formataccount.; line 1 pos 32
Here is my function:
import javax.swing.text.MaskFormatter
def formatAccount(account: String, mask: String): String = {
  val formatter = new MaskFormatter(mask.replace("X", "A"))
  formatter.setValueContainsLiteralCharacters(false)
  formatter.valueToString(account)
}
Here is the query that triggers the error:
sql("""select java_method(emitToKafka ,formatAccount("1222233334", "X-XXXX-XXXX-X"))""")
However, if I run the code below directly, it works fine:
formatAccount("1222233334", "X-XXXX-XXXX-X")
res0: String = 1-2222-3333-4
What could be missing?
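For what it's worth, Spark SQL can only resolve functions that have been registered with the session; a plain Scala def is not visible to sql(...). A minimal sketch, assuming a SparkSession named spark and the formatAccount definition above:
// Register the Scala function as a UDF so Spark SQL can resolve it by name.
spark.udf.register("formatAccount", formatAccount _)
spark.sql("""select formatAccount("1222233334", "X-XXXX-XXXX-X")""").show()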
I'm having a problem when I try to filter data where data < todayData.
If I use this code, I get wrong results.
Code:
val todayData = LocalDate.now.format(
  DateTimeFormatter.ofPattern("dd/MM/yyyy")) // 22/09/2021
val filtredDF = sampleData.where(sampleData("data_riferimento_condizioni") < todayData)
One of the results:
+--------+--------+---------------------------+-----------+
|istituto|servizio|data_riferimento_condizioni| stato|
+--------+--------+---------------------------+-----------+
| 62952| 923| 02/12/2022|in progress|
+--------+--------+---------------------------+-----------+
As you can see, I get rows whose date is after todayData. I want to convert data_riferimento_condizioni to a LocalDate so I can use public boolean isBefore(ChronoLocalDate other).
First you need to convert data_riferimento_condizioni from StringType to DateType or TimestampType, using the to_date() or to_timestamp() functions, and then filter your data; comparing the raw dd/MM/yyyy strings compares them lexicographically, which gives meaningless results.
For Spark 3 and newer, you can filter rows by comparing them directly with instances of java.time.LocalDate or java.time.Instant:
val filtredDF = sampleData
  .withColumn("converted", to_date(col("data_riferimento_condizioni"), "dd/MM/yyyy"))
  .where(col("converted") < LocalDate.now)
But if you're using Spark 2, you have to convert your LocalDate or Instant to java.sql.Date or java.sql.Timestamp:
val filtredDF = sampleData
  .withColumn("converted", to_date(col("data_riferimento_condizioni"), "dd/MM/yyyy"))
  .where(col("converted") < Date.valueOf(LocalDate.now))
You can read more about using dates in Spark, and the differences between Spark 2 and Spark 3, there.
I have a column in a Spark dataframe named
time_span
whose values are ISO 8601 durations,
e.g. P0Y0M0DT0H5M35S. I want to convert those values into seconds. Is there a function in Spark or Scala that will help me do that? I have been looking for a way, so far unsuccessfully.
I tried with java.time.Duration:
import java.time.Duration
java.time.Duration.parse("P0Y0M0DT0H5M35S")
This gives me an error:
java.time.format.DateTimeParseException: Text cannot be parsed to a Duration
Am I doing something wrong in passing the value to the function? I found this documentation:
https://docs.oracle.com/javase/8/docs/api/java/time/Duration.html
If I am successful doing it this way, I will then have to apply additional logic to apply it to the whole dataframe column.
Hope the approach below helps you. java.time.Duration understands at most days, hours, minutes, and seconds, so the year and month designators in P0Y0M0DT0H5M35S make the parse fail; the UDF below keeps only the part after T:
import org.apache.spark.sql.types._
import org.apache.spark.sql.functions._
val isoToSecondsUDF = udf((value: String) =>
  java.time.Duration.parse("PT".concat(value.split("T")(1)))
    .get(java.time.temporal.ChronoUnit.SECONDS))
val df = Seq("P0Y0M0DT0H5M35S").toDF("value")
df.withColumn("seconds", isoToSecondsUDF($"value")).show()
/*
+---------------+-------+
| value|seconds|
+---------------+-------+
|P0Y0M0DT0H5M35S| 335|
+---------------+-------+
*/
Updated solution to cover the case where months and days are present,
e.g. P0Y0M2DT23H59M56S and P0Y1M2DT23H59M56S.
We will need the time4j library: https://github.com/MenoData/Time4J
Here is the code:
import org.apache.spark.sql.types._
import org.apache.spark.sql.functions._
import net.time4j.Duration
def getSeconds(value: String): String = {
  val months = Duration.parsePeriod(value).toTemporalAmount().get(java.time.temporal.ChronoUnit.MONTHS)
  val days = Duration.parsePeriod(value).toTemporalAmount().get(java.time.temporal.ChronoUnit.DAYS)
  // Fold months into days, approximating a month as 30 days.
  val totalDays = ((months * 30) + days).toString()
  // Rebuild a string java.time.Duration can parse: PnDT... with no year/month designators.
  val timePart = if (value.contains("T")) value.split("T")(1) else value.split("D")(1)
  java.time.Duration.parse("P".concat(totalDays).concat("DT").concat(timePart))
    .get(java.time.temporal.ChronoUnit.SECONDS).toString()
}
val isoToSecondsUDF = udf( (value: String) => getSeconds(value))
spark.udf.register("isoToSecondsUDF", isoToSecondsUDF)
val df=Seq(("P0Y0M2DT23H59M56S")).toDF("value")
df.withColumn("seconds",isoToSecondsUDF($"value")).show()
First get the number of months, convert it to days, add that to the existing number of days, and then pass the result to the parse method.
Output:
+-----------------+-------+
| value|seconds|
+-----------------+-------+
|P0Y0M2DT23H59M56S| 259196|
+-----------------+-------+
+-----------------+-------+
| value|seconds|
+-----------------+-------+
|P0Y1M2DT23H59M56S|2851196|
+-----------------+-------+
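As an aside, if pulling in time4j is not an option, a similar result can be had with java.time alone by splitting the string at 'T' and parsing the halves with Period and Duration. A minimal sketch, using the same 30-day month approximation as above (the isoToSeconds helper is mine):
import java.time.{Duration, Period}
def isoToSeconds(value: String): Long = {
  val tIndex = value.indexOf('T')
  // Date part (PnYnMnD) goes to Period, time part (TnHnMnS) to Duration.
  val (datePart, timePart) =
    if (tIndex >= 0) (value.substring(0, tIndex), "P" + value.substring(tIndex))
    else (value, "PT0S")
  val period = Period.parse(datePart)
  // Approximate years as 365 days and months as 30 days, as in the answer above.
  val days = period.getYears * 365L + period.getMonths * 30L + period.getDays
  days * 86400L + Duration.parse(timePart).getSeconds
}
isoToSeconds("P0Y1M2DT23H59M56S") // 2851196L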
I have a class with this field:
@ApiModelProperty(value = "Date the balance was received", example = "2018-01-16T09:22:33.316Z")
@JsonProperty("date")
private Instant date;
When I generate yaml from this source (with swagger-maven-plugin) I get:
date:
  type: "integer"
  format: "int64"
  example: "2018-01-16T09:22:33.316Z"
  description: "Date the balance was received"
So when I generate my class back from the YAML (with swagger-codegen-maven-plugin), I get it with this field:
#JsonProperty("date")
private Long date = null;
Why does Instant get converted to Long?
This is a bug in swagger-core, and it was fixed in version 2.1.2 (see here).
If you're using a previous version, you can customize this behaviour by replacing the PrimitiveType for the Instant data type with the following snippet:
PrimitiveType.customClasses().put(java.time.Instant.class.getName(),
PrimitiveType.DATE_TIME);
After parsing JSON UTC date-time data from a server, I was presented with
2017-03-27 16:27:45.567
Is there any way to format this, without a tedious amount of String manipulation, so that the seconds part is rounded up to 46 before passing it in with a DateTimeFormat pattern of, say, "yyyy-MM-dd HH:mm:ss"?
You can round the second up like this:
DateTime dateTime = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss.SSS")
.withZoneUTC()
.parseDateTime("2017-03-27 16:27:45.567")
.secondOfMinute()
.roundCeilingCopy();
System.out.println(dateTime);
// 2017-03-27T16:27:46.000Z
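If you would rather avoid the Joda-Time dependency, the same ceiling rounding can be done with java.time. A minimal Scala sketch (the rounding logic is mine, not from the answer above):
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import java.time.temporal.ChronoUnit
val parsed = LocalDateTime.parse("2017-03-27 16:27:45.567",
  DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS"))
// Truncate to the second and bump by one when a fractional part is present.
val rounded =
  if (parsed.getNano == 0) parsed
  else parsed.truncatedTo(ChronoUnit.SECONDS).plusSeconds(1)
println(rounded.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")))
// 2017-03-27 16:27:46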
Have you looked at (and could you use) the MomentJS library? I had issues with reading various date formats from the server and making sense of them in JavaScript code (which led me here). Since then, I've used MomentJS and working with dates/times in JavaScript has been much easier.
Here is an example:
<script>
  try
  {
    var myDateString = "2017-03-27 16:27:45.567";
    var d = moment(myDateString);
    var result = d.format('YYYY/MM/DD HH:mm:ss');
    alert("Simple Format: " + result);
    // If we have milliseconds, increment to the next second so that
    // we can then get its 'floor' by using the startOf() function.
    if (d.millisecond() > 0)
      d = d.add(1, 'second');
    result = d.startOf('second').format('YYYY/MM/DD HH:mm:ss');
    alert("Rounded Format: " + result);
  }
  catch (er)
  {
    console.log(er);
  }
</script>
But of course, you'll probably want to wrap this logic into a function.