ZetaSQL sample using Apache Beam - Java

I am facing issues while using ZetaSQL in the Apache Beam framework (2.17.0-SNAPSHOT). After going through the Apache Beam documentation, I was not able to find any sample for ZetaSQL.
I tried to set the planner:
options.setPlannerName("org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner");
But I am still facing the issue. A snippet is added below for reference.
```
String sql =
    "SELECT CAST (1243 as INT64), "
        + "CAST ('2018-09-15 12:59:59.000000+00' as TIMESTAMP), "
        + "CAST ('string' as STRING);";
ZetaSQLQueryPlanner zetaSQLQueryPlanner = new ZetaSQLQueryPlanner();
BeamRelNode beamRelNode = zetaSQLQueryPlanner.convertToBeamRel(sql);
PCollection<Row> stream = BeamSqlRelUtils.toPCollection(p, beamRelNode);
p.run();
```
I understand that we need something like the snippet below, but I failed to create the config:
Frameworks.newConfigBuilder()
While running the code, I get the following exception:
Exception in thread "main" java.util.ServiceConfigurationError: com.google.zetasql.ClientChannelProvider: Provider com.google.zetasql.JniChannelProvider could not be instantiated
at java.util.ServiceLoader.fail(Unknown Source)
at java.util.ServiceLoader.access$100(Unknown Source)
at java.util.ServiceLoader$LazyIterator.nextService(Unknown Source)

Update: as of 06/23/2020, Beam ZetaSQL is supported on macOS as well (not all versions, but at least the most recent ones)!
====
I think it is related to your OS. Beam is a unified framework, but your exception looks like it comes from its dependency: the ZetaSQL parser. If you switch to a newer version of Linux, I think your code snippet should work.
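For reference, here is a minimal sketch of how the ZetaSQL planner is usually selected through pipeline options and SqlTransform rather than by driving the planner directly. The schema, field names, and query literal are illustrative assumptions, not taken from the question, and this assumes the beam-sdks-java-extensions-sql and beam-sdks-java-extensions-sql-zetasql artifacts are on the classpath.
```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.extensions.sql.impl.BeamSqlPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class ZetaSqlPlannerExample {
  public static void main(String[] args) {
    // Select the ZetaSQL planner via pipeline options instead of instantiating it by hand.
    BeamSqlPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(BeamSqlPipelineOptions.class);
    options.setPlannerName(
        "org.apache.beam.sdk.extensions.sql.zetasql.ZetaSQLQueryPlanner");

    Pipeline p = Pipeline.create(options);

    // A tiny schema-aware input; SqlTransform refers to it as PCOLLECTION.
    Schema schema = Schema.builder().addInt64Field("num").addStringField("str").build();
    PCollection<Row> input =
        p.apply(Create.of(Row.withSchema(schema).addValues(1243L, "string").build())
            .withRowSchema(schema));

    PCollection<Row> result =
        input.apply(SqlTransform.query("SELECT num, str FROM PCOLLECTION"));

    p.run().waitUntilFinish();
  }
}
```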

Related

java.lang.UnsupportedOperationException: option is not a supported field type

Using the following library
<dependency>
    <groupId>net.rcarz</groupId>
    <artifactId>jira-client</artifactId>
    <version>0.5</version>
</dependency>
I am getting an error while executing the code below:
BasicCredentials creds = new BasicCredentials("username", "password");
JiraClient jira = new JiraClient("xyz/rest/api/2/issue", creds);
Issue newIssue = jira.createIssue("XYZ", "Bug")
    .field(Field.SUMMARY, "tEST bUG")
    .field("customfield_20200", "No STeps")
    .field("customfield_20202", "No actual")
    .field("customfield_25600", Field.valueById("35650"))
    .execute();
I am getting the error for .field("customfield_25600", Field.valueById("35650")).
Error description:
java.lang.UnsupportedOperationException: option is not a supported field type
This is a customized field in JIRA.
Please let me know if more information is required.
Thanks in advance.
The Field#toJson() method did not know about the option type in v0.5; support was added later. That is why the method throws UnsupportedOperationException. Try to use the latest version from GitHub: https://github.com/rcarz/jira-client
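For illustration, a minimal sketch of creating the issue once a jira-client build containing the fix is on the classpath; the custom field id and option id are the ones from the question, while the URL and credentials are placeholders:
```java
import net.rcarz.jiraclient.BasicCredentials;
import net.rcarz.jiraclient.Field;
import net.rcarz.jiraclient.Issue;
import net.rcarz.jiraclient.JiraClient;
import net.rcarz.jiraclient.JiraException;

public class CreateIssueExample {
    public static void main(String[] args) throws JiraException {
        BasicCredentials creds = new BasicCredentials("username", "password");
        // The client expects the JIRA base URL; it builds the REST paths itself.
        JiraClient jira = new JiraClient("https://jira.example.com", creds);

        Issue newIssue = jira.createIssue("XYZ", "Bug")
                .field(Field.SUMMARY, "Test bug")
                // Single-select (option) custom field, set by option id.
                .field("customfield_25600", Field.valueById("35650"))
                .execute();

        System.out.println("Created issue " + newIssue.getKey());
    }
}
```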
This seems to be a known issue with the library; the field you are trying to add is probably an option, and that type is not supported.
The error was already reported here:
https://github.com/rcarz/jira-client/issues/123
Hi,
Trying to use custom fields, I encounter the following issue:
For a "Select List (single choice)" type of field, I get the following exception when trying to create an issue:
Exception: java.lang.UnsupportedOperationException: option is not a supported field type
at net.rcarz.jiraclient.Field.toJson(Field.java:655)
at net.rcarz.jiraclient.Issue$FluentCreate.executeCreate(Issue.java:104)
at net.rcarz.jiraclient.Issue$FluentCreate.execute(Issue.java:59)
I'm using JIRA v7.1.0-OD-05-006
It seems to have to do with the JIRA version.
Following the link to #154, it seems that it was not fixed.
https://github.com/rcarz/jira-client/pull/154
The issue still persists:
Caused by: java.lang.UnsupportedOperationException: option is not a supported field type
at net.rcarz.jiraclient.Field.toJson(Field.java:737)
at net.rcarz.jiraclient.Issue$FluentCreate.executeCreate(Issue.java:102)
at net.rcarz.jiraclient.Issue$FluentCreate.execute(Issue.java:57)
Here is what my code snippet looks like. The customfield_12133 field is an option type.
JiraClient jiraClient;
Issue issue = jiraClient.createIssue("MYPROJECT", "Internal Bug")
    .field(Field.SUMMARY, summary)
    .field(Field.DESCRIPTION, summary)
    .field("customfield_12133", "Other")
    .execute();
Finally, pull request #176 should actually have fixed it:
https://github.com/rcarz/jira-client/pull/176
It might be fixed in the next version (0.6) of the library.

Spark Streaming: class cast exception for SerializedOffset

I'm writing a custom Spark structured streaming source (using v2 interfaces and Spark 2.3.0) in Java/Scala.
When testing the integration with the Spark offsets/checkpoint, I get the following error:
18/06/20 11:58:49 ERROR MicroBatchExecution: Query [id = 58ec2604-3b04-4912-9ba8-c757d930ac05, runId = 5458caee-6ef7-4864-9968-9cb843075458] terminated with error
java.lang.ClassCastException: org.apache.spark.sql.execution.streaming.SerializedOffset cannot be cast to org.apache.spark.sql.sources.v2.reader.streaming.Offset
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$runBatch$1$$anonfun$apply$9.apply(MicroBatchExecution.scala:405)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$runBatch$1$$anonfun$apply$9.apply(MicroBatchExecution.scala:390)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at org.apache.spark.sql.execution.streaming.StreamProgress.foreach(StreamProgress.scala:25)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at org.apache.spark.sql.execution.streaming.StreamProgress.flatMap(StreamProgress.scala:25)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$runBatch$1.apply(MicroBatchExecution.scala:390)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$runBatch$1.apply(MicroBatchExecution.scala:390)
at org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.org$apache$spark$sql$execution$streaming$MicroBatchExecution$$runBatch(MicroBatchExecution.scala:389)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply$mcV$sp(MicroBatchExecution.scala:133)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)
at org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1.apply$mcZ$sp(MicroBatchExecution.scala:121)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:117)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
This is my Offset implementation (simplified version; I removed the JSON (de)serialization):
package mypackage

import org.apache.spark.sql.execution.streaming.SerializedOffset
import org.apache.spark.sql.sources.v2.reader.streaming.Offset

case class MyOffset(offset: Long) extends Offset {
  override val json = "{\"offset\":" + offset + "}"
}

private object MyOffset {
  def apply(offset: SerializedOffset): MyOffset = new MyOffset(0L)
}
Do you have any advice about how to solve this problem?
Check that the Spark version of your client app is exactly the same as the Spark version of your cluster. I used Spark v2.4.0 dependencies in my Spark job application, but the cluster had Spark engine v2.3.0. When I downgraded the dependencies to v2.3.0, the error was gone.
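For completeness, the piece of the DataSourceV2 contract that deals with SerializedOffset is the reader's deserializeOffset hook: Spark persists offsets as JSON in the checkpoint and hands that JSON back through this method, expecting the source's own Offset type in return. A minimal sketch against the Spark 2.3-style interfaces; the class is illustrative and the parsing is deliberately crude:
```java
import org.apache.spark.sql.sources.v2.reader.streaming.MicroBatchReader;
import org.apache.spark.sql.sources.v2.reader.streaming.Offset;

// Illustrative: only the offset-deserialization hook is shown; the rest of the
// MicroBatchReader contract (setOffsetRange, commit, the read methods, ...) is omitted.
public abstract class MyMicroBatchReader implements MicroBatchReader {

    @Override
    public Offset deserializeOffset(String json) {
        // MyOffset.json produces {"offset":N}; pull N back out so the engine
        // always receives our Offset subtype rather than a raw SerializedOffset.
        long value = Long.parseLong(json.replaceAll("[^0-9-]", ""));
        return new MyOffset(value);
    }
}
```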

Java MongoDB: org.bson.BsonSerializationException

Recently I found that the MongoDB Java driver threw the following exception:
org.bson.BsonSerializationException: Detected unknown BSON type "\xce" for fieldname "����\�z". Are you using the latest driver version?
org.bson.BsonBinaryReader.readBsonType(BsonBinaryReader.java:95)
com.mongodb.connection.ProtocolHelper.getField(ProtocolHelper.java:87)
com.mongodb.connection.ProtocolHelper.isCommandOk(ProtocolHelper.java:82)
com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:113)
com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:168)
com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:289)
com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:176)
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:216)
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:207)
com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:113)
com.mongodb.operation.FindOperation$1.call(FindOperation.java:516)
com.mongodb.operation.FindOperation$1.call(FindOperation.java:510)
com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:431)
com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:404)
com.mongodb.operation.FindOperation.execute(FindOperation.java:510)
com.mongodb.operation.FindOperation.execute(FindOperation.java:81)
com.mongodb.Mongo.execute(Mongo.java:836)
com.mongodb.Mongo$2.execute(Mongo.java:823)
com.mongodb.FindIterableImpl$FindOperationIterable.first(FindIterableImpl.java:216)
com.mongodb.FindIterableImpl.first(FindIterableImpl.java:156)
I want to know what circumstances can cause this problem, and how to solve it.

ArcGIS GeoEvent Processor - javax.xml.ws.soap.SOAPFaultException: Unmarshalling Error

Background
I'm using wsimport to create what is essentially a Java webservice client, connecting to a .Net webservice that is returning DataSets (unfortunately). To be more specific, I'm working on a project (inbound transport) for the GeoEvent Processor suite of ESRI ArcGIS Server 10.2, but I think this might be answered in more general terms in relation to JAXB and WSDL bindings. Bear with me, as I haven't touched Java since college (10+ years).
For purposes of the WSDL, the .Net DataSet is a polymorphic type whose actual layout isn't determined until run time, after the DataSet has been filled with data. This causes problems when you want to use that webservice with anything but .Net.
After some research I've managed to use wsimport to generate from the webservice wsdl. I was then able to put together a basic proof of concept program that gets results from the webservice as a DOM, then walks that DOM as a nodelist.
Reference:
JAX-WS error on WSDL file: "Error resolving component 's:schema'"
https://weblogs.java.net/blog/vivekp/archive/2007/05/how_to_deal_wit_1.html
The section on Toolkit Bindings and figure 6 in http://msdn.microsoft.com/en-us/magazine/cc188755.aspx
My wsimport looks like this (domain names have been changed to protect the innocent):
C:\Development\ArcGIS\WSDL>wsimport -b http://www.w3.org/2001/XMLSchema.xsd -b xsd.xjb -keep -p com.somecompany.services -XadditionalHeaders http://services.somecompany.com/DataRetrieval.asmx?wsdl
The Problem
Unfortunately, the same codebase that worked in my proof of concept, getting results from the webservice, fails once I implement in the ArcGIS GeoEvent Processor. My project is part of an OSGI bundle that the ArcGIS GeoEvent Processor will control. The error below is as shown in the Apache Karaf log for the GeoEvent Processor.
Based on the error, my understanding is there is a problem with how I did the binding in wsimport, referencing the generic schema per those links I have listed above. Looks like the generic schema lacks definitions for some of the elements that exist as classes generated by wsimport. Those classes appear to be properly generated when I check the output from wsimport.
I've not included the WSDL due to posting limitations, but will include in later responses if needed.
What I'm trying to figure out
How should this error be interpreted?
Why does the same wsimport generated code used to access the webservice in my basic proof of concept fail when run in the ArcGIS GeoEvent Processor?
The error mentions JAXB and SAX; I'm not consciously referencing either of those libraries in the proof of concept or in the project for the ArcGIS GeoEvent Processor. Could it be that the binding/unmarshalling of the webservice is handled differently, with the ArcGIS GeoEvent Processor wrapping it in JAXB/SAX and the proof of concept not?
What can I do to resolve this?
Use a different, custom, xsd and xjb that spells out the expected schema for the webservice? I'm not sure exactly how that would be done.
Use something other than wsimport to generate the webservice reference classes?
Tweak something in the java environment for the ArcGIS GeoEvent Processor?
Other options?
Commit seppuku, then it's not my problem?
The Error
2014-09-23 16:10:14,365 | ERROR | ansport Listener | SomeInboundTransport | 367 - com.somecompany.arcgis.geoevent.transport.inbound.somecompanyInboundTransport - 1.0.0 | Unable to call Webservice
javax.xml.ws.soap.SOAPFaultException: Unmarshalling Error: unexpected element (uri:"http://www.w3.org/2001/XMLSchema", local:"element"). Expected elements are <{http://services.somecompany.com/}complexType>,<{http://services.somecompany.com/}annotation>,<{http://services.somecompany.com/}redefine>,<{http://services.somecompany.com/}element>,<{http://services.somecompany.com/}include>,<{http://services.somecompany.com/}attributeGroup>,<{http://services.somecompany.com/}group>,<{http://services.somecompany.com/}notation>,<{http://services.somecompany.com/}import>,<{http://services.somecompany.com/}simpleType>,<{http://services.somecompany.com/}attribute>
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:156)[120:org.apache.cxf.cxf-rt-frontend-jaxws:2.6.1]
at com.sun.proxy.$Proxy198.getCompanyArcgisData(Unknown Source)[367:com.somecompany.arcgis.geoevent.transport.inbound.somecompanyInboundTransport:1.0.0]
at com.somecompany.arcgis.geoevent.transport.inbound.SomeInboundTransport.callWebService(SomeInboundTransport.java:184)[367:com.somecompany.arcgis.geoevent.transport.inbound.somecompanyInboundTransport:1.0.0]
at com.somecompany.arcgis.geoevent.transport.inbound.SomeInboundTransport.run(SomeInboundTransport.java:257)[367:com.somecompany.arcgis.geoevent.transport.inbound.somecompanyInboundTransport:1.0.0]
at java.lang.Thread.run(Thread.java:722)[:1.7.0_17]
Caused by: javax.xml.bind.UnmarshalException
- with linked exception:
[com.sun.istack.SAXParseException2; lineNumber: 1; columnNumber: 651; unexpected element (uri:"http://www.w3.org/2001/XMLSchema", local:"element"). Expected elements are <{http://services.somecompany.com/}complexType>,<{http://services.somecompany.com/}annotation>,<{http://services.somecompany.com/}redefine>,<{http://services.somecompany.com/}element>,<{http://services.somecompany.com/}include>,<{http://services.somecompany.com/}attributeGroup>,<{http://services.somecompany.com/}group>,<{http://services.somecompany.com/}notation>,<{http://services.somecompany.com/}import>,<{http://services.somecompany.com/}simpleType>,<{http://services.somecompany.com/}attribute>]
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.handleStreamException(UnmarshallerImpl.java:425)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:362)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal(UnmarshallerImpl.java:339)
at org.apache.cxf.jaxb.JAXBEncoderDecoder.doUnmarshal(JAXBEncoderDecoder.java:784)[91:org.apache.cxf.cxf-rt-databinding-jaxb:2.6.1]
at org.apache.cxf.jaxb.JAXBEncoderDecoder.access$100(JAXBEncoderDecoder.java:97)[91:org.apache.cxf.cxf-rt-databinding-jaxb:2.6.1]
at org.apache.cxf.jaxb.JAXBEncoderDecoder$1.run(JAXBEncoderDecoder.java:812)
at java.security.AccessController.doPrivileged(Native Method)[:1.7.0_17]
at org.apache.cxf.jaxb.JAXBEncoderDecoder.unmarshall(JAXBEncoderDecoder.java:810)[91:org.apache.cxf.cxf-rt-databinding-jaxb:2.6.1]
at org.apache.cxf.jaxb.JAXBEncoderDecoder.unmarshall(JAXBEncoderDecoder.java:644)[91:org.apache.cxf.cxf-rt-databinding-jaxb:2.6.1]
at org.apache.cxf.jaxb.io.DataReaderImpl.read(DataReaderImpl.java:157)[91:org.apache.cxf.cxf-rt-databinding-jaxb:2.6.1]
at org.apache.cxf.interceptor.DocLiteralInInterceptor.handleMessage(DocLiteralInInterceptor.java:108)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:262)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.endpoint.ClientImpl.onMessage(ClientImpl.java:798)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponseInternal(HTTPConduit.java:1667)[118:org.apache.cxf.cxf-rt-transports-http:2.6.1]
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponse(HTTPConduit.java:1520)[118:org.apache.cxf.cxf-rt-transports-http:2.6.1]
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.close(HTTPConduit.java:1428)[118:org.apache.cxf.cxf-rt-transports-http:2.6.1]
at org.apache.cxf.transport.AbstractConduit.close(AbstractConduit.java:56)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.transport.http.HTTPConduit.close(HTTPConduit.java:658)[118:org.apache.cxf.cxf-rt-transports-http:2.6.1]
at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:62)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:262)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:532)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:464)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:367)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:320)[87:org.apache.cxf.cxf-api:2.6.1]
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:89)[119:org.apache.cxf.cxf-rt-frontend-simple:2.6.1]
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:134)[120:org.apache.cxf.cxf-rt-frontend-jaxws:2.6.1]
... 4 more
Caused by: com.sun.istack.SAXParseException2; lineNumber: 1; columnNumber: 651; unexpected element (uri:"http://www.w3.org/2001/XMLSchema", local:"element"). Expected elements are <{http://services.somecompany.com/}complexType>,<{http://services.somecompany.com/}annotation>,<{http://services.somecompany.com/}redefine>,<{http://services.somecompany.com/}element>,<{http://services.somecompany.com/}include>,<{http://services.somecompany.com/}attributeGroup>,<{http://services.somecompany.com/}group>,<{http://services.somecompany.com/}notation>,<{http://services.somecompany.com/}import>,<{http://services.somecompany.com/}simpleType>,<{http://services.somecompany.com/}attribute>
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.handleEvent(UnmarshallingContext.java:642)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportError(Loader.java:254)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportError(Loader.java:249)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.reportUnexpectedChildElement(Loader.java:116)
at com.sun.xml.bind.v2.runtime.unmarshaller.Loader.childElement(Loader.java:101)
at com.sun.xml.bind.v2.runtime.unmarshaller.StructureLoader.childElement(StructureLoader.java:243)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext._startElement(UnmarshallingContext.java:478)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallingContext.startElement(UnmarshallingContext.java:459)
at com.sun.xml.bind.v2.runtime.unmarshaller.StAXStreamConnector.handleStartElement(StAXStreamConnector.java:242)
at com.sun.xml.bind.v2.runtime.unmarshaller.StAXStreamConnector.bridge(StAXStreamConnector.java:176)
at com.sun.xml.bind.v2.runtime.unmarshaller.UnmarshallerImpl.unmarshal0(UnmarshallerImpl.java:360)
... 28 more
Caused by: javax.xml.bind.UnmarshalException: unexpected element (uri:"http://www.w3.org/2001/XMLSchema", local:"element"). Expected elements are <{http://services.somecompany.com/}complexType>,<{http://services.somecompany.com/}annotation>,<{http://services.somecompany.com/}redefine>,<{http://services.somecompany.com/}element>,<{http://services.somecompany.com/}include>,<{http://services.somecompany.com/}attributeGroup>,<{http://services.somecompany.com/}group>,<{http://services.somecompany.com/}notation>,<{http://services.somecompany.com/}import>,<{http://services.somecompany.com/}simpleType>,<{http://services.somecompany.com/}attribute>
... 39 more
The Code (snippet)
import com.somecompany.services.*; //generated by wsimport
import javax.xml.ws.*;
//...
private com.somecompany.services.DataRetrieval myWS;
private com.somecompany.services.DataRetrievalSoap port;
private byte[] callWebService(String userName, String pwd, long dataTimeFrame)
{
    try
    {
        myWS = new com.somecompany.services.DataRetrieval();
        port = myWS.getDataRetrievalSoap();
        com.somecompany.services.AuthSoapHeader mySoapHeader = new com.somecompany.services.AuthSoapHeader();
        mySoapHeader.setUserName(userName);
        // Hash the password then set it for the SOAP header
        String pwdHash = hashMD5(pwd);
        mySoapHeader.setPassword(pwdHash);
        Holder holder = new Holder<AuthSoapHeader>(mySoapHeader);
        Date endTime = new Date();
        Date startTime = new Date(endTime.getTime() - dataTimeFrame);
        XMLGregorianCalendar gcEndTime = dateToGregorianTime(endTime);
        XMLGregorianCalendar gcStartTime = dateToGregorianTime(startTime);
        GetCompanyArcgisDataResponse.GetCompanyArcgisDataResult companyData =
            port.getCompanyArcgisData(gcStartTime, gcEndTime, holder);
        if (((AuthSoapHeader) holder.value).getError() != null)
        {
            log.error("Authentication to web services failed!");
            // OSGI stop service
            this.stop();
            return null;
        }
        else
        {
            log.info("Authentication to web services successful.");
        }
        // Convert the results to a java object and then to a byte array to send to the adapter
        Object companyDataAny = companyData.getAny();
        byte[] companyDataBytes = objectToBytes(companyDataAny);
        return companyDataBytes;
    }
    catch (Exception ex)
    {
        log.error("Unable to call Webservice", ex);
        // OSGI stop service
        this.stop();
        return null;
    }
}
Environment Specifics
JDK 7u17 (1.7.0_17) 64 bit. The ArcGIS GeoEvent Processor is using this version of the JRE, so I'm locked into that version for execution. Though I've done some development in 1.7.0_51 before I realized that.
wsimport - JAX-WS RI 2.2.4-b01
ArcGIS Server 10.2
ArcGIS GeoEvent Processor Extension
Karaf (used by ArcGIS GeoEvent Processor to run OSGI bundles)
This is probably not the best answer on this, but it's what I came up with.
The ArcGIS GeoEvent Processor that wrapped my OSGI project appeared to be doing some additional binding/unbinding of the web service that I referenced in my application. The work-around that I employed to get that .Net (DataSet return values) web service to function in Java just wasn't acceptable to that wrapper from the GeoEvent Processor.
My Solution
Ultimately what I did was create a secondary .Net web service which took the DataSet values, converted them to JSON, and returned JSON strings. This removed the problems encountered when attempting to reference DataSet return values from the web service; now I was dealing with a simple JSON string. The wsimport of that JSON web service went smoothly, with no work-around required. I tucked the newly imported web service files into my Java project and now have no problems.
For Reference on C# DataSet to JSON:
Using Newtonsoft.Json (http://james.newtonking.com/json). After playing with a few other libraries for JSON serialization, that is what I found worked best for me.
Newtonsoft.Json is available via a NuGet package.
Rick Strahl's site was a big help http://weblog.west-wind.com/posts/2008/Sep/03/DataTable-JSON-Serialization-in-JSONNET-and-JavaScriptSerializer
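On the Java side, consuming the JSON string returned by that secondary service then becomes a plain JSON-parsing exercise. Below is a minimal sketch using Jackson; the library choice, the helper method, and the "records"/"id" field names are assumptions for illustration, not part of the original project:
```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonResponseParser {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Parse the JSON string returned by the secondary web service and print one
    // value per record. The "records" array and "id" field are made-up names.
    public static void handleResponse(String json) throws Exception {
        JsonNode root = MAPPER.readTree(json);
        for (JsonNode record : root.path("records")) {
            System.out.println("Record id: " + record.path("id").asText());
        }
    }
}
```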

Neo4j REST API Java binding Uniqueness deprecated

I want to use Uniqueness for my Traversal.
Based on this tutorial, I'm using the following code:
GraphDatabaseService database = new RestGraphDatabase("http://localhost:7474/db/data");
TraversalDescription td = database.traversalDescription().uniqueness(Uniqueness.RELATIONSHIP_GLOBAL);
This code gave me the following error:
Exception in thread "main" java.lang.UnsupportedOperationException: Only values of class org.neo4j.kernel.Uniqueness are supported
at org.neo4j.rest.graphdb.traversal.RestTraversal.restify(RestTraversal.java:63)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:54)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:50)
at org.neo4j.rest.graphdb.traversal.RestTraversal.uniqueness(RestTraversal.java:37)
I already had to change Traversal.description() to database.traversalDescription() because of deprecation, but now I face the same problem for Uniqueness. In my example I used org.neo4j.graphdb.traversal.Uniqueness because org.neo4j.kernel.Uniqueness is deprecated...
When using the package mentioned by the error, I get a NullPointerException during the traverse() method, with no stack trace.
I'm using:
REST API: neo4j-rest-graphdb-2.0.0-M06.jar
Neo4j: neo4j-desktop-2.0.0.jar
Best regards.
There have been API changes in Neo4j 2.0 which are not in neo4j-rest-graphdb-2.0.0-M06.
If you pull the latest neo4j-rest-graphdb GitHub repo and build it locally, it should work against neo4j-rest-graphdb-2.0.0-SNAPSHOT.
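As a sketch, with a locally built neo4j-rest-graphdb-2.0.0-SNAPSHOT on the classpath, the 2.0-style API from the question should then be usable as written; the start node and the printing loop below are illustrative:
```java
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.graphdb.Node;
import org.neo4j.graphdb.Path;
import org.neo4j.graphdb.traversal.TraversalDescription;
import org.neo4j.graphdb.traversal.Uniqueness;
import org.neo4j.rest.graphdb.RestGraphDatabase;

public class TraversalExample {
    public static void main(String[] args) {
        GraphDatabaseService database =
                new RestGraphDatabase("http://localhost:7474/db/data");

        // With a build that tracks the Neo4j 2.0 API, the non-deprecated
        // org.neo4j.graphdb.traversal.Uniqueness enum should be accepted here.
        TraversalDescription td = database.traversalDescription()
                .uniqueness(Uniqueness.RELATIONSHIP_GLOBAL);

        Node start = database.getNodeById(0); // placeholder start node
        for (Path path : td.traverse(start)) {
            System.out.println(path);
        }
    }
}
```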
