I am trying to use stax2 to write XML files with special characters escaped in attribute values.
What I am trying to achieve is an exact output like this:
<elem1 att1="This
That" />
But what I get when I use the usual XMLStreamWriter is this:
<elem1 att1="This 
 That" />
So I tried the following with stax2:
import org.codehaus.stax2.XMLOutputFactory2
import org.scalatest.funsuite.AnyFunSuite
import java.io.{File, FileOutputStream}
import javax.xml.stream.{XMLOutputFactory, XMLStreamWriter}
class testStreamXML extends AnyFunSuite {
val file = new File("stax2test.xml")
val fileOutputStream = new FileOutputStream(file)
val outputFactory: XMLOutputFactory2 = XMLOutputFactory.newInstance().asInstanceOf[XMLOutputFactory2]
//outputFactory.setProperty(XMLOutputFactory2.P_ATTR_VALUE_ESCAPER, true)
val writer = outputFactory.createXMLStreamWriter(fileOutputStream)
writer.writeStartDocument()
writer.writeStartElement("elem1")
writer.writeAttribute("att1", "This\nThat")
writer.writeEndElement()
writer.writeEndDocument()
}
And whenever I try to set the property P_ATTR_VALUE_ESCAPER to true or false, I receive this error:
An exception or error caused a run to abort: class java.lang.Boolean cannot be cast to class org.codehaus.stax2.io.EscapingWriterFactory (java.lang.Boolean is in module java.base of loader 'bootstrap'; org.codehaus.stax2.io.EscapingWriterFactory is in unnamed module of loader 'app')
java.lang.ClassCastException: class java.lang.Boolean cannot be cast to class org.codehaus.stax2.io.EscapingWriterFactory (java.lang.Boolean is in module java.base of loader 'bootstrap'; org.codehaus.stax2.io.EscapingWriterFactory is in unnamed module of loader 'app')
at com.ctc.wstx.api.WriterConfig.setProperty(WriterConfig.java:401)
at com.ctc.wstx.api.CommonConfig.setProperty(CommonConfig.java:100)
at com.ctc.wstx.stax.WstxOutputFactory.setProperty(WstxOutputFactory.java:153)
at testStreamXML3.<init>(testStreamXML3.scala:10)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500)
at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:350)
at java.base/java.lang.Class.newInstance(Class.java:645)
at org.scalatest.tools.Runner$.genSuiteConfig(Runner.scala:1402)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$8(Runner.scala:1199)
at scala.collection.immutable.List.map(List.scala:246)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1198)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1480)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
at org.scalatest.tools.Runner$.run(Runner.scala:798)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
Any suggestion on how to resolve this, or otherwise achieve my goal of escaping special characters in attributes?
The property you are referring to requires a value of type EscapingWriterFactory. Here are the docs:
Property that can be set if a custom output escaping for attribute value content is needed. The value set needs to be of type
EscapingWriterFactory. When set, the factory will be used to create a
per-writer instance used to escape all attribute values written, both
via explicit XMLStreamWriter.writeAttribute(java.lang.String,
java.lang.String) methods, and via copy methods
(XMLStreamWriter2.copyEventFromReader(org.codehaus.stax2.XMLStreamReader2,
boolean)). [1]
Regarding your question of how to achieve 'individual' escaping: an implementation of this factory does the job. Here is a simple implementation (inspired by [2]) that uses the given writer without applying any escaping; this might be your starting point for any special use case you want to address:
import java.io.{OutputStream, Writer}
import org.codehaus.stax2.io.EscapingWriterFactory

class CustomXmlEscapingWriterFactory extends EscapingWriterFactory {
  override def createEscapingWriterFor(writer: Writer, enc: String): Writer =
    new Writer() {
      override def write(cbuf: Array[Char], off: Int, len: Int): Unit =
        writer.write(cbuf, off, len)
      override def flush(): Unit = writer.flush()
      override def close(): Unit = writer.close()
    }

  override def createEscapingWriterFor(outputStream: OutputStream, enc: String): Writer =
    throw new IllegalArgumentException("not supported")
}
class TestStreamXML extends AnyFunSuite {
val file = new File("stax2test.xml")
val fileOutputStream = new FileOutputStream(file)
val oprovider: OutputFactoryProviderImpl = new OutputFactoryProviderImpl()
val outputFactory: XMLOutputFactory2 = oprovider.createOutputFactory()
// your factory implementation goes here as property
outputFactory.setProperty(XMLOutputFactory2.P_ATTR_VALUE_ESCAPER, new CustomXmlEscapingWriterFactory())
val writer = outputFactory.createXMLStreamWriter(fileOutputStream)
writer.writeStartDocument()
writer.writeStartElement("elem1")
writer.writeAttribute("att1", "This\nThat")
writer.writeEndElement()
writer.writeEndDocument()
}
The resulting output looks like this:
<?xml version='1.0' encoding='UTF-8'?><elem1 att1="This
That"/>
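One thing worth hedging on if you build a real escaper on top of this: per the XML spec, parsers normalize literal newlines in attribute values to spaces, so to make the newline survive a parse round trip you would emit the character reference &#10;. A sketch (my own suggestion, not part of the woodstox API) of the per-character logic such a custom Writer could apply:

```scala
// Escape the characters that cannot appear literally in a double-quoted
// attribute value. A literal newline is written fine, but XML parsers
// normalize it to a space on reading; emitting &#10; keeps it intact.
def escapeAttr(value: String): String =
  value.flatMap {
    case '&'  => "&amp;"
    case '<'  => "&lt;"
    case '"'  => "&quot;"
    case '\n' => "&#10;"
    case c    => c.toString
  }
```

Inside the factory, the overridden write(cbuf, off, len) would run the relevant slice through a function like this before delegating to the wrapped writer.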
[1] https://fasterxml.github.io/stax2-api/javadoc/4.0.0/org/codehaus/stax2/XMLOutputFactory2.html#P_ATTR_VALUE_ESCAPER
[2] Escaping quotes using jackson-dataformat-xml
Goal:
I am trying to parse the postman_echo collection JSON and persist the result into a new JSON copy on disk, producing the same file as the original one.
I would prefer built-in data structures from the language, but using a JSON library is fine too. I am not sure whether Antlr4 would be a better way.
Follow-up question:
Is it possible to allow any valid nested JSON in the body of a POST request?
Update:
https://github.com/chakpongchung/postman-parser
In the end we came up with this satisfactory solution.
An alternative to what zoran mentioned is to create case classes (with Play JSON), if the structure is not too dynamic. This makes it easier to compare the result.
case class MyObject(
queryString: List[KeyValue],
method: String,
url: String,
httpVersion: String
) ... and so on
object MyObject {
implicit val format: Format[MyObject] = Json.format
}
case class KeyValue(name: String, value: String)
object KeyValue {
implicit val format: Format[KeyValue] = Json.format
}
Then, you just need to do:
import play.api.libs.json.Json
import scala.util.{Failure, Success, Try}

object JsonParser extends App {
val postman_collections = "./scala_input.json"
val jsonifiedString = scala.io.Source.fromFile(postman_collections).mkString
val myJsonData = Try(Json.parse(jsonifiedString)).map(_.as[MyObject])
myJsonData match {
case Success(myValue) => // compare your case class here
case Failure(err) => println("none")
}
}
I'm not sure if I understand your question well, but if you are trying to iterate over json string, you might try something like this:
import play.api.libs.json.{JsDefined, JsObject, JsUndefined, JsValue, Json}
import scala.util.{Failure, Success, Try}
object JsonParser extends App {
val postman_collections = "./resources/scala_input.json"
val jsonifiedString = scala.io.Source.fromFile(postman_collections).mkString
val json: JsValue = Try(Json.parse(jsonifiedString)) match {
case Success(js) => js
case Failure(ex) => throw new Exception("Couldn't parse json", ex)
}
json.asInstanceOf[JsObject].fields.foreach{
case (key: String, value: JsValue)=>
println(s"Key:$key value:${value.toString}")
writeFile(s"$key.json", Json.prettyPrint(value))
}
//writing the whole postman input as a single file
writeFile("postmanInputFormatted.json", Json.prettyPrint(json))
writeFile("postmanInput.json", Json.stringify(json))
// To access individual property one option is to use this approach
val lookedValue = json \ "postData" \ "params" \ 1 \ "hello" \ "test"
lookedValue match {
case JsDefined(value) => println(s"Test value is $value")
case JsUndefined() => println("Didn't find test value")
}
// or
val lookedValueAlt = (json \ "postData" \ "params" \ 1 \ "hello" \ "test")
  .getOrElse(throw new Exception("Didn't find test value"))
}
There are multiple problems in your parser, and most of them come from using the default parser to handle a JSON object as a string. For example, in Request you are handling header as Seq[String] when it is actually a Seq of (key, value) pairs. For this particular case, you should change it to something like this:
case class Request(
method: String,
header: Seq[HeaderItem], // header: []
url: Option[Url] = None,
description: String = ""
)
object Request {
implicit val format: Format[Request] = Json.using[Json.WithDefaultValues].format
case class HeaderItem(key: String, value: String)
object HeaderItem {
implicit val format: Format[HeaderItem] = Json.format
}
}
You can convert header to Seq[String] if you want, but you will have to write a custom Reads for that.
In the above case you also have instances where description is missing, so you want to handle that with default values.
You have similar problems to handle in a few other places as well, e.g. "response".
Another problem I noticed is the way you handle the "type" property from the JSON string. type is a reserved keyword in Scala, and you can handle it by wrapping it in backticks, e.g.:
case class Script(
`type`: String,
exec: Seq[String]
)
A satisfactory solution is posted in the github link above in the question.
The title speaks for itself: I have a Config object (from https://github.com/typesafehub/config) and I want to pass it to a constructor which only supports java.util.Properties as an argument.
Is there an easy way to convert a Config to a Properties object?
Here is a way to convert a Typesafe Config object into a Java Properties object. I have only tested it in a simple case of creating Kafka properties.
Given this configuration in application.conf
kafka-topics {
my-topic {
zookeeper.connect = "localhost:2181",
group.id = "testgroup",
zookeeper.session.timeout.ms = "500",
zookeeper.sync.time.ms = "250",
auto.commit.interval.ms = "1000"
}
}
You can create the corresponding Properties object like that:
import com.typesafe.config.{Config, ConfigFactory}
import java.util.Properties
import kafka.consumer.ConsumerConfig
object Application extends App {
def propsFromConfig(config: Config): Properties = {
import scala.collection.JavaConversions._
val props = new Properties()
val map: Map[String, Object] = config.entrySet().map({ entry =>
entry.getKey -> entry.getValue.unwrapped()
})(collection.breakOut)
props.putAll(map)
props
}
val config = ConfigFactory.load()
val consumerConfig = {
val topicConfig = config.getConfig("kafka-topics.my-topic")
val props = propsFromConfig(topicConfig)
new ConsumerConfig(props)
}
// ...
}
The function propsFromConfig is what you are mainly interested in. The key points are the use of entrySet to get a flattened list of properties, and the unwrapped call on each entry value, which gives an Object whose concrete type depends on the configuration value.
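For newer Scala versions, note that scala.collection.JavaConversions and collection.breakOut are deprecated and gone in 2.13. A sketch of the same idea with only the standard library, where the Config-specific wiring is shown as a comment because it needs the typesafe-config dependency:

```scala
import java.util.Properties

// With typesafe-config on the classpath, build the map like this
// (hypothetical wiring, not compiled here):
//   import scala.jdk.CollectionConverters._
//   val entries = config.entrySet().asScala
//     .map(e => e.getKey -> e.getValue.unwrapped()).toMap
def propsFromEntries(entries: Map[String, AnyRef]): Properties = {
  val props = new Properties()
  entries.foreach { case (key, value) => props.put(key, value) }
  props
}
```

The rest of the answer's code stays the same; only the collection conversion changes.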
You can try my scala wrapper https://github.com/andr83/scalaconfig. Using it convert config object to java Properties is simple:
val properties = config.as[Properties]
As Typesafe Config/HOCON supports a much richer structure than java.util.Properties, a lossless conversion is hard to achieve.
Put differently: since properties can only express a subset of HOCON, the conversion is not well defined and may lose information.
So if your configuration is rather flat and does not contain UTF-8, you could transform the HOCON to JSON and then extract the values.
A better solution would be to implement a dedicated config class, populate it with the values from the HOCON, and pass that to the class you want to configure.
It is not possible directly through Typesafe Config. Even rendering the entire HOCON file into JSON does not produce truly valid JSON.
For example:
"play" : {
"filters" : {
"disabled" : ${?play.filters.disabled}[
"play.filters.hosts.AllowedHostsFilter"
],
"disabled" : ${?play.filters.disabled}[
"play.filters.csrf.CSRFFilter"
]
}
}
That format comes directly from Config.render.
As you can see, disabled is represented twice, with HOCON-style syntax.
I have also had problems with the hocon -> json -> hocon round trip.
Example hocon:
http {
port = "9000"
port = ${?HTTP_PORT}
}
Typesafe Config parses and renders this as:
{
"http": {
"port": "9000,${?HTTP_PORT}"
}
}
However, if you try to parse that as HOCON, it throws a syntax error: the , cannot be there.
The correct HOCON would be 9000${?HTTP_PORT}, with no comma between the values. I believe this applies to all array concatenation and substitution.
I have a case class which I want to serialize first, and then deserialize for storage in MongoDB, but the Java 8 LocalDateTime is creating a problem. I took help from this link:
how to deserialize DateTime in Lift
but with no luck; I am unable to make it work for the Java 8 date-time.
Can anyone please help me with this date-time issue? Here is my code:
import java.time.LocalDateTime
import net.liftweb.json.{NoTypeHints, Serialization}
import net.liftweb.json.Serialization.{read, write}

implicit val formats = Serialization.formats(NoTypeHints)
case class Child(var str: String, var Num: Int, var abc: Option[String], MyList: List[Int], val dateTime: LocalDateTime = LocalDateTime.now())
val ser = write(Child("Mary", 5, None, List(1, 2)))
println("Child class converted to string" + ser)
var obj = read[Child](ser)
println("object of Child is " + obj)
And here is the error message printed on the console:
(run-main-0) java.lang.ArrayIndexOutOfBoundsException: 49938
java.lang.ArrayIndexOutOfBoundsException: 49938
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:451)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:431)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:492)
at com.thoughtworks.paranamer.BytecodeReadingParanamer$ClassReader.<init>(BytecodeReadingParanamer.java:337)
at com.thoughtworks.paranamer.BytecodeReadingParanamer.lookupParameterNames(BytecodeReadingParanamer.java:100)
at com.thoughtworks.paranamer.CachingParanamer.lookupParameterNames(CachingParanamer.java:75)
at com.thoughtworks.paranamer.CachingParanamer.lookupParameterNames(CachingParanamer.java:68)
at net.liftweb.json.Meta$ParanamerReader$.lookupParameterNames(Meta.scala:89)
at net.liftweb.json.Meta$Reflection$.argsInfo$1(Meta.scala:237)
at net.liftweb.json.Meta$Reflection$.constructorArgs(Meta.scala:253)
at net.liftweb.json.Meta$Reflection$.net$liftweb$json$Meta$Reflection$$findMostComprehensive$1(Meta.scala:266)
at net.liftweb.json.Meta$Reflection$$anonfun$primaryConstructorArgs$1.apply(Meta.scala:269)
at net.liftweb.json.Meta$Reflection$$anonfun$primaryConstructorArgs$1.apply(Meta.scala:269)
at net.liftweb.json.Meta$Memo.memoize(Meta.scala:199)
at net.liftweb.json.Meta$Reflection$.primaryConstructorArgs(Meta.scala:269)
at net.liftweb.json.Extraction$.decompose(Extraction.scala:88)
at net.liftweb.json.Extraction$$anonfun$1.applyOrElse(Extraction.scala:91)
at net.liftweb.json.Extraction$$anonfun$1.applyOrElse(Extraction.scala:89)
at scala.collection.immutable.List.collect(List.scala:305)
at net.liftweb.json.Extraction$.decompose(Extraction.scala:89)
at net.liftweb.json.Serialization$.write(Serialization.scala:38)
at TestActor$.delayedEndpoint$TestActor$1(TestActor.scala:437)
at TestActor$delayedInit$body.apply(TestActor.scala:54)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:383)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at TestActor$.main(TestActor.scala:54)
at TestActor.main(TestActor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
If I remove the dateTime parameter from the case class, it works fine. It seems the problem is with dateTime.
I ran your code in my IntelliJ IDEA and got the same error. I tried to debug the cause, but the invocation stack is so deep that I eventually gave up.
My guess is that it is because Lift doesn't provide a default Format for LocalDateTime, just like the post you mentioned says: "it is the DateParser format that Lift uses by default."
Here is a compromise for your reference. Lift-JSON provides a default Date format for us:
// net.liftweb.json.Serialization Line 72
def formats(hints: TypeHints) = new Formats {
val dateFormat = DefaultFormats.lossless.dateFormat
override val typeHints = hints
}
So instead of going all the way to write a customized serializer, we may as well change our data type to fit the default Date format. Plus, net.liftweb.mongodb.DateSerializer (line 79) provides support for Date serialization.
Then we can provide a method to easily get a LocalDateTime. The following is how I figured it out:
package jacoffee.scalabasic
import java.time.{ ZoneId, LocalDateTime }
import java.util.Date
// the package object gives the Scala compiler implicit conversions for the case class's date parameter
package object stackoverflow {
implicit def toDate(ldt: LocalDateTime): Date =
Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant())
implicit def toLDT(date: Date): LocalDateTime =
LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault())
}
package jacoffee.scalabasic.stackoverflow
import java.time.LocalDateTime
import java.util.Date
import net.liftweb.json.{ NoTypeHints, Serialization }
import net.liftweb.json.Serialization.{ read, write }
case class Child(var str: String, var Num: Int, var abc: Option[String],
myList: List[Int], val date : Date = LocalDateTime.now()) {
def getLDT: LocalDateTime = date
}
object DateTimeSerialization extends App {
implicit val formats = Serialization.formats(NoTypeHints)
val ser = write(Child("Mary", 5, None, List(1, 2)))
// Child class converted to string {"str":"Mary","Num":5,"myList":[1,2],"date":"2015-07-21T03:07:05.699Z"}
println(" Child class converted to string " + ser)
var obj=read[Child](ser)
// Object of Child is Child(Mary,5,None,List(1, 2),Tue Jul 21 11:48:22 CST 2015)
println(" Object of Child is "+ obj)
// LocalDateTime of Child is 2015-07-21T11:48:22.075
println(" LocalDateTime of Child is "+ obj.getLDT)
}
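One caveat worth checking: java.util.Date only keeps millisecond precision, so the round trip truncates nanoseconds. A quick standalone check of the two conversions (same logic as the package object above):

```scala
import java.time.{LocalDateTime, ZoneId}
import java.util.Date

// LocalDateTime -> Date via an Instant in the system default zone
def toDate(ldt: LocalDateTime): Date =
  Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant())

// Date -> LocalDateTime, the inverse (at millisecond precision)
def toLDT(date: Date): LocalDateTime =
  LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault())
```

A value with no sub-millisecond part survives the round trip unchanged; anything finer is lost, which is usually acceptable for MongoDB storage.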
Anyway, hope it helps.
I am trying to transform a java.io.File to a java.sql.Blob. More specifically I have a model in Scala Play that needs to store a file inside of a database, here is what I have so far.
case class Complication(complicationId: Long,
vitalSignId: Long,
complication: String, definition: String, reason: String,
treatment: String, treatmentImage: Option[java.io.File], notes: String,
weblinks: String, upperlower: String)
object ComplicationDAO extends Table[Complication]("complications") with GasGuruDatabase {
def complicationId = column[Long]("complication_id", O.PrimaryKey, O.AutoInc)
def vitalSignId = column[Long]("vital_sign_id")
def complication = column[String]("complication")
def definition = column[String]("definition", O.DBType("varchar(4000)"))
def reason = column[String]("reason", O.DBType("varchar(4000)"))
def treatment = column[String]("treatment", O.DBType("varchar(4000)"))
def treatmentImage = column[Option[Blob]]("treatment_image")
def notes = column[String]("notes", O.DBType("varchar(4000)"))
def weblinks = column[String]("weblinks")
def upperlower = column[String]("upperlower")
def * = complicationId ~ vitalSignId ~ complication ~ definition ~ reason ~ treatment ~ treatmentImage ~ notes ~ weblinks ~ upperlower <> (Complication, Complication.unapply _)
The error happens in the mapping from a java.io.File to a Blob when storing the file in the database. How can I do this?
Thanks!
You have to actually read the file into something compatible with Blob before trying to store it; java.io.File is just a descriptor.
From Slick's documentation
LOB types: java.sql.Blob, java.sql.Clob, Array[Byte]
case class Complication(complicationId: Long,
vitalSignId: Long,
complication: String, definition: String, reason: String,
treatment: String, treatmentImage: Option[Array[Byte]], notes: String,
weblinks: String, upperlower: String)
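To fill that Option[Array[Byte]] from an actual file, reading the contents up front is enough; a minimal sketch using only the JDK:

```scala
import java.nio.file.{Files, Paths}

// java.io.File is only a path descriptor; read the bytes it points at
// so Slick can map them to the BLOB / byte-array column.
def fileToBytes(path: String): Array[Byte] =
  Files.readAllBytes(Paths.get(path))
```

Then Complication(..., treatmentImage = Some(fileToBytes(image.getPath)), ...) stores the image contents rather than the descriptor.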
In my Scala code this works fine:
import org.springframework.jmx.export.annotation.{ManagedOperationParameters, ManagedResource, ManagedOperation, ManagedOperationParameter}
@Override @ManagedOperation(description = "somedesk")
def getStatsAsStr: String = "blabla"
but as soon as I add @ManagedOperationParameters I get "illegal start of simple expression" at @ManagedOperationParameter(, although I do import it.
So while in Java this compiles fine:
@Override @ManagedOperation(description = "some description")
@ManagedOperationParameters({@ManagedOperationParameter(name = "myname", description = "myname")
})
in Scala this does not compile:
import org.springframework.jmx.export.annotation.{ManagedOperationParameters, ManagedResource, ManagedOperation, ManagedOperationParameter}
@Override @ManagedOperation(description = "some description")
@ManagedOperationParameters(Array(@ManagedOperationParameter(name = "myname", description = "mydesc")) // PRODUCES 'illegal start of simple expression' at @ManagedOperationParameter(
def getStatsAsStr(myname: String): String = "blabla"
Is there a way to make it work? If I create it as a .java file with Java syntax in the same project, all is fine (which means my dependencies are fine). I think it's something about the Scala syntax that I don't get. What is it?
Inner annotation values have to be constructed with a different syntax. This should work (whitespace added for clarity, it is not significant); if not, try replacing the named parameters with positional ones.
@ManagedOperationParameters(
Array(
new ManagedOperationParameter(name="myname", description="mydesc")
))