How to update a document in TrueVault - Java

I am trying to update a document in TrueVault using the document ID and schema ID, but it gives me an error like this:
Response Code : 400
Exception in thread "main" java.io.IOException: Server returned HTTP response code: 400 for URL: https://api.truevault.com/v1/vaults/vault-id/documents/document-id
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at sun.net.www.protocol.http.HttpURLConnection$6.run(HttpURLConnection.java:1675)
at sun.net.www.protocol.http.HttpURLConnection$6.run(HttpURLConnection.java:1673)
at java.security.AccessController.doPrivileged(Native Method)
at sun.net.www.protocol.http.HttpURLConnection.getChainedException(HttpURLConnection.java:1671)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1244)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:254)
at TrueVaultGetRequest.sendPut(TrueVaultGetRequest.java:264)
at TrueVaultGetRequest.main(TrueVaultGetRequest.java:140)
Caused by: java.io.IOException: Server returned HTTP response code: 400 for URL: https://api.truevault.com/v1/vaults/vault-id/documents/document-id
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1626)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:468)
at sun.net.www.protocol.https.HttpsURLConnectionImpl.getResponseCode(HttpsURLConnectionImpl.java:338)
at TrueVaultGetRequest.sendPut(TrueVaultGetRequest.java:260)
... 1 more
My encoded JSON is also correct; I have checked it multiple times, but I still have not found a solution. Please give me a solution.
Thank you

This request is described in the TrueVault documentation on Updating A Document.
Your request needs to look something like this:
curl https://api.truevault.com/v1/vaults/00000000-0000-0000-0000-000000000000/documents/00000000-0000-0000-0000-000000000000 \
-u [API_KEY | ACCESS_TOKEN]: \
-X PUT \
-d "document=e30="
Try manually entering this curl command on the command line with your information. A more descriptive error message will be part of the response.
Note: I assume you replaced your actual Vault and Document IDs in this question with vault-id and document-id to keep that data private, but if not, then that is your mistake. Insert the actual Vault and Document IDs in place of those strings to move on.
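If you want to stay in Java, here is a minimal sketch of the same PUT with HttpURLConnection; the IDs, API key, and document payload are placeholders. The key detail is reading getErrorStream() when the response code is 400 or above, so you see the descriptive error message in the response body instead of just the IOException:

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class TrueVaultPutExample {
    public static void main(String[] args) throws Exception {
        // Placeholders: substitute your real vault ID, document ID, and API key.
        String endpoint = "https://api.truevault.com/v1/vaults/VAULT_ID/documents/DOCUMENT_ID";
        String apiKey = "YOUR_API_KEY";
        String body = "document=e30="; // e30= is the Base64 encoding of {}

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        // curl's "-u API_KEY:" is HTTP Basic auth with an empty password.
        String auth = Base64.getEncoder()
                .encodeToString((apiKey + ":").getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }

        int code = conn.getResponseCode();
        // On 4xx/5xx, getInputStream() throws; the error body is on getErrorStream().
        InputStream in = code < 400 ? conn.getInputStream() : conn.getErrorStream();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("Response code: " + code);
    }
}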

Related

I am getting java.io.IOException: Broken pipe when the response data is huge

I am getting a java.io.IOException: Broken pipe in the higher environment where the code is deployed, but it works fine on my local system.
The data is very large, more than 1000 JSON objects, and it takes about 40 seconds to get the response on my local system.
The error log is as follows:
org.apache.catalina.connector.ClientAbortException: java.io.IOException: Broken pipe
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:341) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.apache.catalina.connector.OutputBuffer.flushByteBuffer(OutputBuffer.java:766) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.apache.catalina.connector.OutputBuffer.append(OutputBuffer.java:671) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:376) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:354) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:96) ~[tomcat-embed-core-9.0.16.jar!/:9.0.16]
at org.springframework.session.web.http.OnCommittedResponseWrapper$SaveContextServletOutputStream.write(OnCommittedResponseWrapper.java:620) ~[spring-session-core-2.1.4.RELEASE.jar!/:2.1.4.RELEASE]
at org.springframework.security.web.util.OnCommittedResponseWrapper$SaveContextServletOutputStream.write(OnCommittedResponseWrapper.java:639) ~[spring-security-web-5.1.4.RELEASE.jar!/:5.1.4.RELEASE]
at org.springframework.security.web.util.OnCommittedResponseWrapper$SaveContextServletOutputStream.write(OnCommittedResponseWrapper.java:639) ~[spring-security-web-5.1.4.RELEASE.jar!/:5.1.4.RELEASE]
and so on... Please help; I am not finding any solution to fix this.
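For context, and not from this thread: ClientAbortException: Broken pipe means the client (or a proxy/load balancer in front of the application) closed the connection before Tomcat finished writing the response. A gateway timeout shorter than the roughly 40-second render time would cause exactly this, and would explain why it only happens in the higher environment. Besides raising that timeout, one common mitigation is to start streaming the JSON immediately instead of buffering it all; below is a minimal, hypothetical sketch with Spring's StreamingResponseBody and Jackson, where fetchAll() is a stand-in for however the 1000+ objects are actually loaded:

import java.util.List;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SequenceWriter;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@RestController
public class LargeResponseController {

    private final ObjectMapper mapper = new ObjectMapper();

    // Hypothetical data source standing in for the real 1000+ JSON objects.
    private List<Map<String, Object>> fetchAll() {
        return List.of(Map.of("id", 1), Map.of("id", 2));
    }

    @GetMapping(value = "/large", produces = MediaType.APPLICATION_JSON_VALUE)
    public StreamingResponseBody large() {
        return out -> {
            // Writing the array element by element gets bytes to the client
            // early, so intermediaries see traffic instead of 40 s of silence.
            try (SequenceWriter writer = mapper.writer().writeValuesAsArray(out)) {
                for (Map<String, Object> record : fetchAll()) {
                    writer.write(record);
                }
            }
        };
    }
}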

JanusGraph libs can't communicate with HBase in a Kerberos environment (Failed to specify server's Kerberos principal name)

I am getting "Failed to specify server's Kerberos principal name" when attempting to connect to habse with janusgraph in a kerberos hadoop cluster
First off a little environmental info -
OS: 7.6.1810
Java: 1.8.0_191-b12
Spark: 2.3.2.3.1.0.78-4
YARN: 2.5.0
Hbase: 2.0.2.3.1.0.78-4
Hadoop: 3.1.1.3.1.0.78-4
Kerberos: 5 version 1.15.1
Janusgraph: 0.4.0
I did kinit and tested the bundled Gremlin client to ensure the graph.properties for the env works. It was able to connect, create a simple test graph, add some vertices, restart, and retrieve the stored data. So cool, the bundled copy works.
For laziness/simplicity I decided to load the spark-shell with the JanusGraph libs. While attempting to connect to the same graph, it started throwing Kerberos errors.
My first thought was that maybe it's a Hadoop/Spark lib/conf conflict (pretty typical), so I built a very simple, barebones Java app to see if it would work. It got the same errors as Spark.
Spark Invocations -
First attempt:
spark-shell \
--conf spark.driver.userClassPathFirst=true \
--conf spark.executor.userClassPathFirst=true \
--conf spark.driver.userClassPathFirst=true \
--jars /etc/hadoop/conf/core-site.xml,/etc/hbase/conf/hbase-site.xml,groovy-console-2.5.6.jar,javax.servlet-api-3.1.0.jar,netty-buffer-4.1.25.Final.jar,RoaringBitmap-0.5.11.jar,groovy-groovysh-2.5.6-indy.jar,javax.ws.rs-api-2.0.1.jar,netty-codec-4.1.25.Final.jar,activation-1.1.jar,groovy-json-2.5.6-indy.jar,jaxb-api-2.2.2.jar,netty-common-4.1.25.Final.jar,airline-0.6.jar,groovy-jsr223-2.5.6-indy.jar,jaxb-impl-2.2.3-1.jar,netty-handler-4.1.25.Final.jar,antlr-2.7.7.jar,groovy-swing-2.5.6.jar,jbcrypt-0.4.jar,netty-resolver-4.1.25.Final.jar,antlr-3.2.jar,groovy-templates-2.5.6.jar,jboss-logging-3.1.2.GA.jar,netty-transport-4.1.25.Final.jar,antlr-runtime-3.2.jar,groovy-xml-2.5.6.jar,jcabi-log-0.14.jar,noggit-0.6.jar,aopalliance-repackaged-2.4.0-b34.jar,gson-2.2.4.jar,jcabi-manifests-1.1.jar,objenesis-2.1.jar,apacheds-i18n-2.0.0-M15.jar,guava-18.0.jar,jcl-over-slf4j-1.7.25.jar,ohc-core-0.3.4.jar,apacheds-kerberos-codec-2.0.0-M15.jar,hadoop-annotations-2.7.7.jar,je-7.5.11.jar,org.apache.servicemix.bundles.commons-csv-1.0-r706900_3.jar,api-asn1-api-1.0.0-M20.jar,hadoop-auth-2.7.7.jar,jersey-client-1.9.jar,oro-2.0.8.jar,api-util-1.0.0-M20.jar,hadoop-client-2.7.7.jar,jersey-client-2.22.2.jar,osgi-resource-locator-1.0.1.jar,asm-3.1.jar,hadoop-common-2.7.7.jar,jersey-common-2.22.2.jar,paranamer-2.6.jar,asm-5.0.3.jar,hadoop-distcp-2.7.7.jar,jersey-container-servlet-2.22.2.jar,picocli-3.9.2.jar,asm-analysis-5.0.3.jar,hadoop-gremlin-3.4.1.jar,jersey-container-servlet-core-2.22.2.jar,protobuf-java-2.5.0.jar,asm-commons-5.0.3.jar,hadoop-hdfs-2.7.7.jar,jersey-core-1.9.jar,py4j-0.10.7.jar,asm-tree-5.0.3.jar,hadoop-mapreduce-client-app-2.7.7.jar,jersey-guava-2.22.2.jar,pyrolite-4.13.jar,asm-util-5.0.3.jar,hadoop-mapreduce-client-common-2.7.7.jar,jersey-json-1.9.jar,reflections-0.9.9-RC1.jar,astyanax-cassandra-3.10.2.jar,hadoop-mapreduce-client-core-2.7.7.jar,jersey-media-jaxb-2.22.2.jar,reporter-config-base-3.0.0.jar,astyanax-cassandra-all-shaded-3.10.2.jar,hadoop-mapreduce-client-jobclient-2.7.7.jar,jersey-server-1.9.jar,reporter-config3-3.0.0.jar,astyanax-core-3.10.2.jar,hadoop-mapreduce-client-shuffle-2.7.7.jar,jersey-server-2.22.2.jar,scala-library-2.11.8.jar,astyanax-recipes-3.10.2.jar,hadoop-yarn-api-2.7.7.jar,jets3t-0.7.1.jar,scala-reflect-2.11.8.jar,astyanax-thrift-3.10.2.jar,hadoop-yarn-client-2.7.7.jar,jettison-1.3.3.jar,scala-xml_2.11-1.0.5.jar,audience-annotations-0.5.0.jar,hadoop-yarn-common-2.7.7.jar,jetty-6.1.26.jar,servlet-api-2.5.jar,avro-1.7.4.jar,hadoop-yarn-server-common-2.7.7.jar,jetty-sslengine-6.1.26.jar,sesame-model-2.7.10.jar,avro-ipc-1.8.2.jar,hamcrest-core-1.3.jar,jetty-util-6.1.26.jar,sesame-rio-api-2.7.10.jar,avro-mapred-1.8.2-hadoop2.jar,hbase-shaded-client-2.1.5.jar,jffi-1.2.16-native.jar,sesame-rio-datatypes-2.7.10.jar,bigtable-hbase-1.x-shaded-1.11.0.jar,hbase-shaded-mapreduce-2.1.5.jar,jffi-1.2.16.jar,sesame-rio-languages-2.7.10.jar,caffeine-2.3.1.jar,hibernate-validator-4.3.0.Final.jar,jline-2.14.6.jar,sesame-rio-n3-2.7.10.jar,cassandra-all-2.2.13.jar,high-scale-lib-1.0.6.jar,jna-4.0.0.jar,sesame-rio-ntriples-2.7.10.jar,cassandra-driver-core-3.7.1.jar,high-scale-lib-1.1.4.jar,jnr-constants-0.9.9.jar,sesame-rio-rdfxml-2.7.10.jar,cassandra-thrift-2.2.13.jar,hk2-api-2.4.0-b34.jar,jnr-ffi-2.1.7.jar,sesame-rio-trig-2.7.10.jar,checker-compat-qual-2.5.2.jar,hk2-locator-2.4.0-b34.jar,jnr-posix-3.0.44.jar,sesame-rio-trix-2.7.10.jar,chill-java-0.9.3.jar,hk2-utils-2.4.0-b34.jar,jnr-x86asm-1.0.2.jar,sesame-rio-turtle-2.7.10.jar,chill_2.11-0.9.3.jar,hppc-0.7.1.jar,joda-time-2.8.2.jar,\
sesame-util-2.7.10.jar,commons-cli-1.3.1.jar,htrace-core-3.1.0-incubating.jar,jsch-0.1.54.jar,sigar-1.6.4.jar,commons-codec-1.7.jar,htrace-core4-4.2.0-incubating.jar,json-20090211_1.jar,slf4j-api-1.7.12.jar,commons-collections-3.2.2.jar,httpasyncclient-4.1.2.jar,json-simple-1.1.jar,slf4j-log4j12-1.7.12.jar,commons-configuration-1.10.jar,httpclient-4.4.1.jar,json4s-ast_2.11-3.5.3.jar,snakeyaml-1.11.jar,commons-crypto-1.0.0.jar,httpcore-4.4.1.jar,json4s-core_2.11-3.5.3.jar,snappy-java-1.0.5-M3.jar,commons-httpclient-3.1.jar,httpcore-nio-4.4.5.jar,json4s-jackson_2.11-3.5.3.jar,solr-solrj-7.0.0.jar,commons-io-2.3.jar,httpmime-4.4.1.jar,json4s-scalap_2.11-3.5.3.jar,spark-core_2.11-2.4.0.jar,commons-lang-2.5.jar,ivy-2.3.0.jar,jsp-api-2.1.jar,spark-gremlin-3.4.1.jar,commons-lang3-3.3.1.jar,jackson-annotations-2.6.6.jar,jsr305-3.0.0.jar,spark-kvstore_2.11-2.4.0.jar,commons-logging-1.1.1.jar,jackson-core-2.6.6.jar,jts-core-1.15.0.jar,spark-launcher_2.11-2.4.0.jar,commons-math3-3.2.jar,jackson-core-asl-1.9.13.jar,jul-to-slf4j-1.7.16.jar,spark-network-common_2.11-2.4.0.jar,commons-net-1.4.1.jar,jackson-databind-2.6.6.jar,junit-4.12.jar,spark-network-shuffle_2.11-2.4.0.jar,commons-pool-1.6.jar,jackson-datatype-json-org-2.6.6.jar,kryo-shaded-4.0.2.jar,spark-tags_2.11-2.4.0.jar,commons-text-1.0.jar,jackson-jaxrs-1.9.13.jar,leveldbjni-all-1.8.jar,spark-unsafe_2.11-2.4.0.jar,compress-lzf-1.0.0.jar,jackson-mapper-asl-1.9.13.jar,libthrift-0.9.2.jar,spatial4j-0.7.jar,concurrentlinkedhashmap-lru-1.3.jar,jackson-module-paranamer-2.6.6.jar,log4j-1.2.16.jar,stax-api-1.0-2.jar,crc32ex-0.1.1.jar,jackson-module-scala_2.11-2.6.6.jar,logback-classic-1.1.3.jar,stax-api-1.0.1.jar,curator-client-2.7.1.jar,jackson-xc-1.9.13.jar,logback-core-1.1.3.jar,stax2-api-3.1.4.jar,curator-framework-2.7.1.jar,jamm-0.3.0.jar,lucene-analyzers-common-7.0.0.jar,stream-2.7.0.jar,curator-recipes-2.7.1.jar,janusgraph-all-0.4.0.jar,lucene-core-7.0.0.jar,stringtemplate-3.2.jar,disruptor-3.0.1.jar,janusgraph-berkeleyje-0.4.0.jar,lucene-queries-7.0.0.jar,super-csv-2.1.0.jar,dom4j-1.6.1.jar,janusgraph-bigtable-0.4.0.jar,lucene-queryparser-7.0.0.jar,thrift-server-0.3.7.jar,ecj-4.4.2.jar,janusgraph-cassandra-0.4.0.jar,lucene-sandbox-7.0.0.jar,tinkergraph-gremlin-3.4.1.jar,elasticsearch-rest-client-6.6.0.jar,janusgraph-core-0.4.0.jar,lucene-spatial-7.0.0.jar,unused-1.0.0.jar,exp4j-0.4.8.jar,janusgraph-cql-0.4.0.jar,lucene-spatial-extras-7.0.0.jar,uuid-3.2.jar,findbugs-annotations-1.3.9-1.jar,janusgraph-es-0.4.0.jar,lucene-spatial3d-7.0.0.jar,validation-api-1.1.0.Final.jar,gbench-0.4.3-groovy-2.4.jar,janusgraph-hadoop-0.4.0.jar,lz4-1.3.0.jar,vavr-0.9.0.jar,gmetric4j-1.0.7.jar,janusgraph-hbase-0.4.0.jar,lz4-java-1.4.0.jar,vavr-match-0.9.0.jar,gprof-0.3.1-groovy-2.4.jar,janusgraph-lucene-0.4.0.jar,metrics-core-3.0.2.jar,woodstox-core-asl-4.4.1.jar,gremlin-console-3.4.1.jar,janusgraph-server-0.4.0.jar,metrics-core-3.2.2.jar,xbean-asm6-shaded-4.8.jar,gremlin-core-3.4.1.jar,janusgraph-solr-0.4.0.jar,metrics-ganglia-3.2.2.jar,xercesImpl-2.9.1.jar,gremlin-driver-3.4.1.jar,javapoet-1.8.0.jar,metrics-graphite-3.2.2.jar,xml-apis-1.3.04.jar,gremlin-groovy-3.4.1.jar,javassist-3.18.0-GA.jar,metrics-json-3.1.5.jar,xmlenc-0.52.jar,gremlin-server-3.4.1.jar,javatuples-1.2.jar,metrics-jvm-3.2.2.jar,zookeeper-3.4.6.jar,gremlin-shaded-3.4.1.jar,javax.inject-1.jar,minlog-1.3.0.jar,zstd-jni-1.3.2-2.jar,groovy-2.5.6-indy.jar,javax.inject-2.4.0-b34.jar,netty-3.10.5.Final.jar,groovy-cli-picocli-2.5.6.jar,javax.json-1.0.jar,netty-all-4.1.25.Final.jar
Second attempt (fewer libs):
spark-shell \
--conf spark.driver.userClassPathFirst=true \
--conf spark.executor.userClassPathFirst=true \
--conf spark.driver.userClassPathFirst=true \
--jars /etc/hadoop/conf/core-site.xml,/etc/hbase/conf/hbase-site.xml,gremlin-core-3.4.1.jar,gremlin-driver-3.4.3.jar,gremlin-shaded-3.4.1.jar,groovy-2.5.7.jar,groovy-json-2.5.7.jar,javatuples-1.2.jar,commons-lang3-3.8.1.jar,commons-configuration-1.10.jar,janusgraph-core-0.4.0.jar,hbase-shaded-client-2.1.5.jar,janusgraph-hbase-0.4.0.jar,high-scale-lib-1.1.4.jar
Java attempt:
java \
-cp /etc/hadoop/conf/core-site.xml:/etc/hbase/conf/hbase-site.xml:hbase-shaded-client-2.1.5.jar:janusgraph-hbase-0.4.0.jar:janusgraph-core-0.4.0.jar:commons-lang3-3.8.1.jar:gremlin-driver-3.4.3.jar:groovy-2.5.7.jar:javatuples-1.2.jar:commons-configuration-1.10.jar:gremlin-core-3.4.1.jar:gremlin-shaded-3.4.1.jar:groovy-json-2.5.7.jar:high-scale-lib-1.1.4.jar:Janusgraph_Ingestion.jar:../janusgraph-0.4.0-hadoop2/lib/commons-lang-2.5.jar:../janusgraph-0.4.0-hadoop2/lib/slf4j-api-1.7.12.jar:../janusgraph-0.4.0-hadoop2/lib/slf4j-log4j12-1.7.12.jar:../janusgraph-0.4.0-hadoop2/lib/log4j-1.2.16.jar:../janusgraph-0.4.0-hadoop2/lib/guava-18.0.jar:../janusgraph-0.4.0-hadoop2/lib/commons-logging-1.1.1.jar:../janusgraph-0.4.0-hadoop2/lib/commons-io-2.3.jar:../janusgraph-0.4.0-hadoop2/lib/htrace-core4-4.2.0-incubating.jar \
Entry
This is the code being executed in the spark-shell or Java:
import org.janusgraph.core.JanusGraphFactory;
val g = JanusGraphFactory.open("/home/devuser/janusgraph-0.4.0-hadoop2/conf/janusgraph-hbase.properties").traversal()
I also tried adding the following before attempting to open the graph:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;
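// Tell the Hadoop security layer that Kerberos is in use, then log in from
// the current subject, i.e. the ticket cache populated earlier by kinit.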
val conf = new Configuration();
conf.set("hadoop.security.authentication", "Kerberos");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation.loginUserFromSubject(null);
I am including the graph connection config for completeness:
gremlin.graph=org.janusgraph.core.JanusGraphFactory
storage.backend=hbase
storage.hostname=hosta.example.com:2181,hostb.example.com:2181,hostc.example.com:2181
storage.hbase.table=JgraphTest
storage.hbase.ext.zookeeper.znode.parent=/hbase-secure
storage.batch-loading=false
java.security.krb5.conf=/etc/krb5.conf
storage.hbase.ext.hbase.security.authentication=kerberos
storage.hbase.ext.hbase.security.authorization=true
storage.hbase.ext.hadoop.security.authentication=kerberos
storage.hbase.ext.hadoop.security.authorization=true
storage.hbase.ext.hbase.regionserver.kerberos.principal=hbase/_HOST@HDPDEV.example.com
ids.block-size=10000
ids.renew-timeout=3600000
storage.buffer-size=10000
ids.num-partitions=10
ids.partition=true
schema.default=none
cache.db-cache = true
cache.db-cache-clean-wait = 20
cache.db-cache-time = 180000
cache.db-cache-size = 0.5
The expected result would be a usable traversal object. The actual result is below:
19/10/18 11:40:30 TRACE NettyRpcConnection: Connecting to hostb.example.com/192.168.1.101:16000
19/10/18 11:40:30 DEBUG AbstractHBaseSaslRpcClient: Creating SASL GSSAPI client. Server's Kerberos principal name is null
19/10/18 11:40:30 TRACE AbstractRpcClient: Call: IsMasterRunning, callTime: 4ms
19/10/18 11:40:30 DEBUG RpcRetryingCallerImpl: Call exception, tries=7, retries=16, started=8197 ms ago, cancelled=false, msg=java.io.IOException: Call to hostb.example.com/192.168.1.101:16000 failed on local exception: java.io.IOException: Failed to specify server's Kerberos principal name, details=, see https://s.apache.org/timeout, exception=org.apache.hadoop.hbase.MasterNotRunningException: java.io.IOException: Call to hostb.example.com/192.168.1.101:16000 failed on local exception: java.io.IOException: Failed to specify server's Kerberos principal name
at org.apache.hadoop.hbase.client.ConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionImplementation.java:1175)
at org.apache.hadoop.hbase.client.ConnectionImplementation.getKeepAliveMasterService(ConnectionImplementation.java:1234)
at org.apache.hadoop.hbase.client.ConnectionImplementation.getMaster(ConnectionImplementation.java:1223)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:57)
at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:105)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3089)
at org.apache.hadoop.hbase.client.HBaseAdmin.getHTableDescriptor(HBaseAdmin.java:569)
at org.apache.hadoop.hbase.client.HBaseAdmin.getTableDescriptor(HBaseAdmin.java:529)
at org.janusgraph.diskstorage.hbase.HBaseAdmin1_0.getTableDescriptor(HBaseAdmin1_0.java:105)
at org.janusgraph.diskstorage.hbase.HBaseStoreManager.ensureTableExists(HBaseStoreManager.java:726)
at org.janusgraph.diskstorage.hbase.HBaseStoreManager.getLocalKeyPartition(HBaseStoreManager.java:537)
at org.janusgraph.diskstorage.hbase.HBaseStoreManager.getDeployment(HBaseStoreManager.java:376)
at org.janusgraph.diskstorage.hbase.HBaseStoreManager.getFeatures(HBaseStoreManager.java:418)
at org.janusgraph.graphdb.configuration.builder.GraphDatabaseConfigurationBuilder.build(GraphDatabaseConfigurationBuilder.java:51)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:161)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:132)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:79)
at $line22.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:26)
at $line22.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:31)
at $line22.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:33)
at $line22.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:35)
at $line22.$read$$iw$$iw$$iw$$iw.<init>(<console>:37)
at $line22.$read$$iw$$iw$$iw.<init>(<console>:39)
at $line22.$read$$iw$$iw.<init>(<console>:41)
at $line22.$read$$iw.<init>(<console>:43)
at $line22.$read.<init>(<console>:45)
at $line22.$read$.<init>(<console>:49)
at $line22.$read$.<clinit>(<console>)
at $line22.$eval$.$print$lzycompute(<console>:7)
at $line22.$eval$.$print(<console>:6)
at $line22.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:415)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:923)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
at org.apache.spark.repl.Main$.doMain(Main.scala:76)
at org.apache.spark.repl.Main$.main(Main.scala:56)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: Call to hostb.example.com/192.168.1.101:16000 failed on local exception: java.io.IOException: Failed to specify server's Kerberos principal name
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:221)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:390)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:95)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:410)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:406)
at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:103)
at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:118)
at org.apache.hadoop.hbase.ipc.BufferCallBeforeInitHandler.userEventTriggered(BufferCallBeforeInitHandler.java:92)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:329)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:315)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireUserEventTriggered(AbstractChannelHandlerContext.java:307)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.userEventTriggered(DefaultChannelPipeline.java:1377)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:329)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeUserEventTriggered(AbstractChannelHandlerContext.java:315)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireUserEventTriggered(DefaultChannelPipeline.java:929)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection.failInit(NettyRpcConnection.java:179)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection.saslNegotiate(NettyRpcConnection.java:197)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection.access$800(NettyRpcConnection.java:71)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection$3.operationComplete(NettyRpcConnection.java:273)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection$3.operationComplete(NettyRpcConnection.java:261)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:500)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:479)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:104)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
at org.apache.hbase.thirdparty.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.fulfillConnectPromise(AbstractNioChannel.java:306)
at org.apache.hbase.thirdparty.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:341)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:633)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Failed to specify server's Kerberos principal name
at org.apache.hadoop.hbase.security.AbstractHBaseSaslRpcClient.<init>(AbstractHBaseSaslRpcClient.java:99)
at org.apache.hadoop.hbase.security.NettyHBaseSaslRpcClient.<init>(NettyHBaseSaslRpcClient.java:43)
at org.apache.hadoop.hbase.security.NettyHBaseSaslRpcClientHandler.<init>(NettyHBaseSaslRpcClientHandler.java:70)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection.saslNegotiate(NettyRpcConnection.java:194)
... 18 more
Well, I feel like an idiot. Apparently the answer was actually a really simple deal: the Gremlin client works fine when just using storage.hbase.ext.hbase.regionserver.kerberos.principal, but when using the libs outside of it, storage.hbase.ext.hbase.master.kerberos.principal is needed as well. With that, things are working; on to the next set of problems I made for myself, lol.
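In config terms, the fix amounts to adding the master principal alongside the regionserver one; the host and realm values below are the same placeholders used in the config above, with @ as the realm separator:

storage.hbase.ext.hbase.regionserver.kerberos.principal=hbase/_HOST@HDPDEV.example.com
storage.hbase.ext.hbase.master.kerberos.principal=hbase/_HOST@HDPDEV.example.com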

Getting Exceptions from logs on *nix using sed/awk/whatever

I want to extract exceptions from log files created by Tomcat.
Yes, I did some research, but because I don't have any experience with sed or awk, adjusting what I've found to what I need is kind of difficult.
A sample file is shown below to have something to work on:
at org.springframework.scheduling.quartz.QuartzJobBean.execute(QuartzJobBean.java:86)
Caused by: org.apache.cxf.transport.http.HTTPException: HTTP response '503: Service Temporarily Unavailable' when communicating with http://66.66.66.66:1234/aaa/bbb/ccc
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponseInternal(HTTPConduit.java:1546)
at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:62)
... 39 more
2014-10-24 11:40:01,558 ERROR [aaa.bbb.ccc.ddd.SomeClass] - some exception on parsing '2007/11/45' bla bla
javax.xml.ws.WebServiceException: Could not send Message.
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:145)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
Caused by: org.apache.cxf.transport.http.HTTPException: HTTP response '503: Service Temporarily Unavailable' when communicating with http://66.66.66.66:1234/aaa/bbb/ccc
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponseInternal(HTTPConduit.java:1546)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:88)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:134)
... 32 more
2014-10-24 11:40:01,561 ERROR [aaa.bbb.ccc.ddd] - some error with id = 1214
java.lang.NullPointerException
at sun.reflect.GeneratedMethodAccessor181.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
2014-10-24 11:44:48,253 INFO [org.springframework.orm.hibernate3.annotation.AnnotationSessionFactoryBean] - Closing Hibernate SessionFactory
2014-10-24 11:44:48,253 INFO [org.hibernate.impl.SessionFactoryImpl] - closing
2014-10-24 11:44:48 org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap
2014-10-24 11:44:50 org.apache.coyote.http11.Http11Protocol destroy
INFO: Stopping Coyote HTTP/1.1 on http-8096
As we can see there are two FULL EXCEPTIONS (we want them) and one PARTIAL EXCEPTION (we don't want it). To keep this example short I deleted some other stuff, like log4j:ERRORs, which we don't want either.
So far I have tried:
AWK (it's my first day with AWK, please don't laugh :D). It's pretty straightforward:
it finds "\t" (tab) + "at" + " " (a space) at the beginning of every line. If the two lines before it match the given conditions (exception and date), it prints them too. It works pretty well, but it also prints the partial exception, which we DO NOT want.
BEGIN {
preprevious = "";
previous = "";
}
/^\tat / {
if( previous != "" ) {
if(preprevious ~ /20[0-9][0-9]-[0-9][0-9]-[0-9][0-9]/){
print preprevious;
preprevious = "";
}
if(previous ~ /.*Exception/) {
print previous;
previous = "";
}
}
print;
next;
}
{ preprevious = previous;
previous = $0; }
which I run like this:
awk -f awkScript testFileExceptions.txt
And sed in a bash script (I prefer that):
#!/bin/sh
if [ "$#" -eq "2" ]
then
tail -n $2 $1 | sed -n "/ ERROR \[/,/20[0-9][0-9]-[0-9][0-9]-[0-9][0-9]/p"
else
echo "usage: scriptName filePath amountOfLastLinesInFile"
fi
It matches ' ERROR ' (with a space on both sides of ERROR, so log4j:ERROR will not be matched) through to the next date. It kind of works...
Disadvantages:
1) It only works IF there are some additional lines between exceptions, like this:
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:134)
... 32 more
2014-10-24 11:40:01,561 INFO [aaa.bbb.ccc.ddd] - AAAAAAAAAAAAAAAAAAAAAAAA
2014-10-24 11:40:01,561 ERROR [aaa.bbb.ccc.ddd] - some error with id = 1214
java.lang.NullPointerException
at sun.reflect.GeneratedMethodAccessor181.invoke(Unknown Source)
If not, then the second exception will not be shown.
2) It also prints out the last matched line (which is the one with the date match).
So to sum up, the output I want is:
2014-10-24 11:40:01,558 ERROR [aaa.bbb.ccc.ddd.SomeClass] - some exception on parsing '2007/11/45' bla bla javax.xml.ws.WebServiceException: Could not send Message.
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:145)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
Caused by: org.apache.cxf.transport.http.HTTPException: HTTP response '503: Service Temporarily Unavailable' when communicating with http://66.66.66.66:1234/aaa/bbb/ccc
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponseInternal(HTTPConduit.java:1546)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:88)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:134)
... 32 more
2014-10-24 11:40:01,561 ERROR [aaa.bbb.ccc.ddd] - some error with id = 1214 java.lang.NullPointerException
at sun.reflect.GeneratedMethodAccessor181.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
I also want to save different exceptions to different files (let's say exceptionOutputNNN.txt, for example exceptionOutput001.txt), but I'll figure that out later; it shouldn't be that hard after I figured out how to do it for XMLs... :P
Well, that's it. I hope someone can help me :)
cheers
Edit: please note that exceptions can end with "... NN more" as well as with a plain "\tat org.*" line.
Still not sure exactly what you want, but this should work for the output you want:
awk '/^[0-9]+/{x=0}/ERROR/{x=1}x' file
It keeps a flag x: a line starting with a digit (a new timestamped log line) clears the flag, a line containing ERROR sets it, and every line is printed while the flag is set.
Output:
2014-10-24 11:40:01,558 ERROR [aaa.bbb.ccc.ddd.SomeClass] - some exception on parsing '2007/11/45' bla bla
javax.xml.ws.WebServiceException: Could not send Message.
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:145)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
Caused by: org.apache.cxf.transport.http.HTTPException: HTTP response '503: Service Temporarily Unavailable' when communicating with http://66.66.66.66:1234/aaa/bbb/ccc
at org.apache.cxf.transport.http.HTTPConduit$WrappedOutputStream.handleResponseInternal(HTTPConduit.java:1546)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:88)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:134)
... 32 more
2014-10-24 11:40:01,561 ERROR [aaa.bbb.ccc.ddd] - some error with id = 1214
java.lang.NullPointerException
at sun.reflect.GeneratedMethodAccessor181.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:525)
Edit: For your original file:
awk 'a=/^[0-9]+/{x=0}a&&/ERROR/{x=1}x' file
or
awk '(/^[0-9]/&&x=/ERROR/)||x' file
Both reset the flag on every timestamped line and set it only when that timestamped line contains ERROR, so each ERROR block prints from its header until the next timestamped line.
For anyone viewing this post in the future, below is the script which uses Jidder's solution and saves the output to files (one file per exception):
#!/bin/sh
if [ "$#" -eq "2" ]
then
# clean up output files from the previous run (same directory awk writes to)
rm errorOutputFile* 2>/dev/null
tail -n "$2" "$1" | awk '(/^[0-9]/&&x=/ERROR/)||x' | awk '/^[0-9]/{g++} { print $0 > ("errorOutputFile" g ".txt")}'
else
echo "usage: scriptName filePath amountOfLastLinesInFile"
fi
cheers

NestedRuntimeException: Cannot parse POST parameters of request

As part of an application, we have a JSP page (parent page) from which we call another JSP page (child page). All the parameters of the child page are assigned to the form of the parent page, and then we try to submit the page.
At the time of submission, we get the following error on the frontend...
(I) Front end error
=====================
Error 500--Internal Server Error
From RFC 2068 Hypertext Transfer Protocol -- HTTP/1.1:
And on the backend we get the following error...
(II) Application server Error log
<Error> <HTTP> <BEA-101019> <[ServletContext(id=29607640,name=test,context-path=/Test)] Servlet failed with IOException
java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:129)
at weblogic.servlet.internal.PostInputStream.read(PostInputStream.java:170)
at weblogic.servlet.internal.ServletInputStreamImpl$1.read(ServletInputStreamImpl.java:115)
at weblogic.servlet.internal.ServletInputStreamImpl.read(ServletInputStreamImpl.java:180)
at weblogic.servlet.internal.ServletRequestImpl.mergePostParams(ServletRequestImpl.java:1339)
at weblogic.servlet.internal.ServletRequestImpl.parseQueryParams(ServletRequestImpl.java:1206)
at weblogic.servlet.internal.ServletRequestImpl.getParameter(ServletRequestImpl.java:1409)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:446)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:348)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:7047)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3902)
at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2773)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
--------------- nested within: ------------------
weblogic.utils.NestedRuntimeException: Cannot parse POST parameters of request: '/Test/test1.jsp' - with nested exception:
[java.net.SocketTimeoutException: Read timed out]
at weblogic.servlet.internal.ServletRequestImpl.mergePostParams(ServletRequestImpl.java:1364)
at weblogic.servlet.internal.ServletRequestImpl.parseQueryParams(ServletRequestImpl.java:1206)
at weblogic.servlet.internal.ServletRequestImpl.getParameter(ServletRequestImpl.java:1409)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:446)
at weblogic.servlet.internal.ServletStubImpl.invokeServlet(ServletStubImpl.java:348)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:7047)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
at weblogic.servlet.internal.WebAppServletContext.invokeServlet(WebAppServletContext.java:3902)
at weblogic.servlet.internal.ServletRequestImpl.execute(ServletRequestImpl.java:2773)
at weblogic.kernel.ExecuteThread.execute(ExecuteThread.java:224)
at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:183)
Could you please offer a suggestion as to the cause of the problem?
You are getting a SocketTimeoutException, so there is probably something wrong with your POST. For example, you might not be sending through enough bytes of data, or you might not be closing the connection once you're finished with it; basically, it's sitting there waiting for more information until it eventually times out.
Could you please post some of your JSP code, especially where you're creating and running your POST, so that we can provide some better feedback?
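As an illustration (placeholder URL and parameters, not the asker's actual code), here is a sketch of a client-side POST that declares its Content-Length and closes its stream. A body shorter than the declared length, or a stream that is never closed, is exactly the kind of thing that leaves mergePostParams blocking until the read times out:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FormPostExample {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint standing in for /Test/test1.jsp.
        URL url = new URL("http://localhost:7001/Test/test1.jsp");
        String body = "param1=" + URLEncoder.encode("value1", "UTF-8")
                    + "&param2=" + URLEncoder.encode("value2", "UTF-8");
        byte[] bytes = body.getBytes(StandardCharsets.UTF_8);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        // Declare exactly how many bytes will follow, then send all of them.
        conn.setFixedLengthStreamingMode(bytes.length);
        try (OutputStream os = conn.getOutputStream()) {
            os.write(bytes);
        } // try-with-resources closes the stream, so the server knows we are done
        System.out.println("Response code: " + conn.getResponseCode());
    }
}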

The response could not be parsed

I am using the redstone-xmlrpc-1.1.1 API in my code and getting this error:
redstone.xmlrpc.XmlRpcException: The response could not be parsed.
at redstone.xmlrpc.XmlRpcClient.handleResponse(Unknown Source)
at redstone.xmlrpc.XmlRpcClient.endCall(Unknown Source)
at redstone.xmlrpc.XmlRpcClient.invoke(Unknown Source)
at redstone.xmlrpc.XmlRpcProxy.invoke(Unknown Source)
at net.bican.wordpress.$Proxy1.newMediaObject(Unknown Source)
at net.bican.wordpress.Wordpress.newMediaObject(Wordpress.java:582)
at WordpressPost.DataWordpressPost.DataPost(DataWordpressPost.java:53)
at arrestcentral.ArrestData.readPdf(ArrestData.java:420)
at arrestcentral.ArrestData.main(ArrestData.java:447)
Caused by: java.io.FileNotFoundException: http://www.arrestcentral.com/XMLrpc.php?
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1478)
... 9 more
Can anyone help me figure out why I am unable to post to WordPress?
Well, the exception tells you what's happened - you've tried to fetch a URL of
http://www.arrestcentral.com/XMLrpc.php?
... and it was giving you an HTTP 404 (not found) error. You probably need to change the URL, but you should have more idea of what that URL should be than we do.
java.io.FileNotFoundException: http://www.arrestcentral.com/XMLrpc.php? should have given you a clue:
Nothing was found at the URL you specified. This means the server returned an HTTP response with error code 404.
You either mistyped the URL or it no longer exists (at this location).
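One thing worth checking, as an assumption on my part rather than something from the answers above: WordPress normally serves its endpoint at /xmlrpc.php in lowercase, and URL paths are case-sensitive on most servers, so XMLrpc.php could easily be the cause of the 404. A quick sketch to test the path before involving the XML-RPC library:

import java.net.HttpURLConnection;
import java.net.URL;

public class EndpointCheck {
    public static void main(String[] args) throws Exception {
        // Lowercase path, to compare against the XMLrpc.php in the stack trace.
        URL url = new URL("http://www.arrestcentral.com/xmlrpc.php");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // A 404 here means the path is wrong; anything else (e.g. a "POST
        // requests only" message) suggests the endpoint exists at this path.
        System.out.println("Response code: " + conn.getResponseCode());
    }
}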
