Can't retrieve data back from UMLS - java

I have been trying to run the ctakes-temporal-demo from GitHub.
I have configured everything in Eclipse and set the correct ID and password for my UMLS account. When I try to run it, I get the following error:
Exception in thread "main" java.lang.Exception: org.apache.uima.resource.ResourceInitializationException: Initialization of annotator class "org.apache.ctakes.dictionary.lookup2.ae.DefaultJCasTermAnnotator" failed. (Descriptor: <unknown>)
at com.optum.cda.Main.main(Main.java:103)
Caused by: org.apache.uima.resource.ResourceInitializationException: Initialization of annotator class "org.apache.ctakes.dictionary.lookup2.ae.DefaultJCasTermAnnotator" failed. (Descriptor: <unknown>)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:252)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:156)
at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:269)
at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:387)
at org.apache.uima.analysis_engine.asb.impl.ASB_impl.setup(ASB_impl.java:254)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initASB(AggregateAnalysisEngine_impl.java:431)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initializeAggregateAnalysisEngine(AggregateAnalysisEngine_impl.java:375)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initialize(AggregateAnalysisEngine_impl.java:185)
at org.apache.uima.fit.factory.AnalysisEngineFactory.createEngine(AnalysisEngineFactory.java:711)
at org.apache.uima.fit.factory.AggregateBuilder.createAggregate(AggregateBuilder.java:207)
at com.optum.cda.Main.main(Main.java:101)
Caused by: org.apache.uima.resource.ResourceInitializationException: EXCEPTION MESSAGE LOCALIZATION FAILED: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key Could not construct org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary
at org.apache.ctakes.dictionary.lookup2.ae.AbstractJCasTermAnnotator.initialize(AbstractJCasTermAnnotator.java:131)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:250)
... 12 more
Caused by: org.apache.uima.analysis_engine.annotator.AnnotatorContextException: EXCEPTION MESSAGE LOCALIZATION FAILED: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key Could not construct org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionary(DictionaryDescriptorParser.java:199)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionaries(DictionaryDescriptorParser.java:156)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDescriptor(DictionaryDescriptorParser.java:128)
at org.apache.ctakes.dictionary.lookup2.ae.AbstractJCasTermAnnotator.initialize(AbstractJCasTermAnnotator.java:129)
... 13 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionary(DictionaryDescriptorParser.java:196)
... 16 more
Caused by: java.lang.StringIndexOutOfBoundsException: String index out of range: -7
at java.lang.String.substring(Unknown Source)
at org.apache.ctakes.dictionary.lookup2.util.JdbcConnectionFactory.getConnectionUrl(JdbcConnectionFactory.java:110)
at org.apache.ctakes.dictionary.lookup2.util.JdbcConnectionFactory.getConnection(JdbcConnectionFactory.java:63)
at org.apache.ctakes.dictionary.lookup2.dictionary.JdbcRareWordDictionary.<init>(JdbcRareWordDictionary.java:91)
at org.apache.ctakes.dictionary.lookup2.dictionary.JdbcRareWordDictionary.<init>(JdbcRareWordDictionary.java:72)
at org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary.<init>(UmlsJdbcRareWordDictionary.java:31)
... 21 more
Can someone familiar with cTAKES help me find the cause of this error? I have tried the mailing list, but got no replies.

Finally found a solution! It seems the path to the XML descriptor of the fast dictionary lookup is not reachable, so after the UMLS user is validated the database setup cannot proceed.
The workaround I used was to give the database a static (absolute) path, which I will have to change whenever the project is moved. This is not very elegant, but it works.
I'd love to hear if anyone has another solution.
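Before falling back to a hard-coded path, it may be worth checking whether the dictionary's HSQLDB files are reachable on the classpath at all, since the StringIndexOutOfBoundsException in JdbcConnectionFactory.getConnectionUrl points at the jdbcUrl/path handling. A minimal sketch, assuming the default ctakessnorx layout of the fast dictionary (the path below matches the jdbcUrl in the answer below and may differ in your setup):
public class DictionaryPathCheck {
    public static void main(String[] args) {
        // Relative resource path taken from the fast-dictionary descriptor; adjust if yours differs.
        String scriptPath =
                "org/apache/ctakes/dictionary/lookup/fast/ctakessnorx/ctakessnorx.script";
        java.net.URL url = Thread.currentThread().getContextClassLoader().getResource(scriptPath);
        System.out.println(url == null
                ? "Dictionary script is NOT visible on the classpath"
                : "Dictionary script found at: " + url);
    }
}
If this prints that the script is not visible, the resources folder containing the dictionary is simply not on the runtime classpath, and fixing that is usually cleaner than hard-coding an absolute path.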

Have you loaded the data from ctakessnorx.script into the database?
You also need to update the following property values in the dictionary descriptor:
<property key="jdbcUrl" value="jdbc:hsqldb:file:org/apache/ctakes/dictionary/lookup/fast/ctakessnorx/ctakessnorx"/>
<property key="jdbcUser" value="sa"/>
<property key="jdbcPass" value=""/>
<property key="rareWordTable" value="cui_terms"/>

Related

java.lang.ClassNotFoundException: freemarker.template.Template when trying to implement Hibernate; there seems to be something wrong with FreeMarker

Whenever I try to create a Java class (or anything else) in my project, I get an unexpected error in NetBeans. I installed a Hibernate plugin that requested the FreeMarker jar, version 238, and ever since installing it, every time I try to create a Java class I get the error below. Thank you in advance for your help.
java.lang.ClassNotFoundException: freemarker.template.Template
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
at org.netbeans.ProxyClassLoader.doFindClass(ProxyClassLoader.java:209)
Caused: java.lang.ClassNotFoundException: freemarker.template.Template starting from ModuleCL#58762fd2[org.netbeans.libs.freemarker] with possible defining loaders null and declared parents [org.netbeans.JarClassLoader#6dedf07b, org.netbeans.MainImpl$BootClassLoader#51efea79, ModuleCL#9e4e8cd[org.netbeans.modules.queries]]
at org.netbeans.ProxyClassLoader.doFindClass(ProxyClassLoader.java:211)
at org.netbeans.ProxyClassLoader.loadClass(ProxyClassLoader.java:125)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
Caused: java.lang.NoClassDefFoundError: freemarker/template/Template
at org.netbeans.libs.freemarker.FreemarkerFactory.getScriptEngine(FreemarkerFactory.java:156)
at java.scripting/javax.script.ScriptEngineManager.getEngineByName(ScriptEngineManager.java:241)
at org.netbeans.api.scripting.Scripting$EngineManager.getEngineByName(Scripting.java:180)
at org.netbeans.modules.templates.ScriptingCreateFromTemplateHandler.getEngine(ScriptingCreateFromTemplateHandler.java:141)
at org.netbeans.modules.templates.ScriptingCreateFromTemplateHandler.engine(ScriptingCreateFromTemplateHandler.java:150)
at org.netbeans.modules.templates.ScriptingCreateFromTemplateHandler.accept(ScriptingCreateFromTemplateHandler.java:65)
at org.netbeans.api.templates.CreateFromTemplateImpl.build(CreateFromTemplateImpl.java:113)
at org.netbeans.api.templates.CreateFromTemplateImpl.build(CreateFromTemplateImpl.java:73)
at org.netbeans.api.templates.FileBuilder.build(FileBuilder.java:227)
at org.netbeans.api.templates.FileBuilder.createFromTemplate(FileBuilder.java:329)
at org.openide.loaders.MultiDataObject.handleCreateFromTemplate(MultiDataObject.java:853)
at org.openide.loaders.DataObject$CreateAction.run(DataObject.java:1572)
at org.openide.loaders.DataObjectPool$1WrapAtomicAction.run(DataObjectPool.java:236)
at org.openide.filesystems.EventControl.runAtomicAction(EventControl.java:102)
at org.openide.filesystems.FileSystem.runAtomicAction(FileSystem.java:494)
at org.openide.loaders.DataObjectPool.runAtomicAction(DataObjectPool.java:261)
at org.openide.loaders.DataObject.invokeAtomicAction(DataObject.java:1026)
at org.openide.loaders.DataObject.createFromTemplate(DataObject.java:958)
at org.netbeans.modules.java.project.ui.NewJavaFileWizardIterator.instantiate(NewJavaFileWizardIterator.java:340)
at org.openide.loaders.TemplateWizard$InstantiatingIteratorBridge.instantiate(TemplateWizard.java:1050)
at org.openide.loaders.TemplateWizard.handleInstantiate(TemplateWizard.java:602)
at org.openide.loaders.TemplateWizard.instantiateNewObjects(TemplateWizard.java:436)
at org.openide.loaders.TemplateWizardIterImpl.instantiate(TemplateWizardIterImpl.java:223)
at org.openide.loaders.TemplateWizardIteratorWrapper.instantiate(TemplateWizardIteratorWrapper.java:135)
at org.openide.WizardDescriptor.callInstantiateOpen(WizardDescriptor.java:1605)
at org.openide.WizardDescriptor.callInstantiate(WizardDescriptor.java:1546)
at org.openide.WizardDescriptor.access$2300(WizardDescriptor.java:67)
at org.openide.WizardDescriptor$Listener$2$1.run(WizardDescriptor.java:2233)
at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:1418)
at org.netbeans.modules.openide.util.GlobalLookup.execute(GlobalLookup.java:45)
at org.openide.util.lookup.Lookups.executeWith(Lookups.java:278)
[catch] at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:2033)

How to connect to Google BigQuery with Spark using Java locally?

I am trying to connect to Google BigQuery using Spark in Java, but I am unable to find accurate documentation for this.
I tried: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example
and
https://github.com/GoogleCloudPlatform/spark-bigquery-connector#compiling-against-the-connector
My code:
sparkSession.conf().set("credentialsFile", "/path/OfMyProjectJson.json");
Dataset<Row> dataset = sparkSession.read().format("bigquery")
        .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
        .load();
dataset.printSchema();
But this is throwing an exception:
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider com.google.cloud.spark.bigquery.BigQueryRelationProvider could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:614)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at com.mySparkConnector.getDataset(BigQueryFetchClass.java:12)
Caused by: java.lang.IllegalArgumentException: A project ID is required for this service but could not be determined from the builder or the environment. Please set a project ID using the builder.
at com.google.cloud.spark.bigquery.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:142)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.ServiceOptions.<init>(ServiceOptions.java:285)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:91)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:30)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions$Builder.build(BigQueryOptions.java:86)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.getDefaultInstance(BigQueryOptions.java:159)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider$.$lessinit$greater$default$2(BigQueryRelationProvider.scala:29)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider.<init>(BigQueryRelationProvider.scala:40)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 15 more
My JSON file contains a project_id.
I have searched for possible solutions but cannot find any, so please help me find a solution to this exception, or point me to any documentation on how to connect to BigQuery with Spark.
I got exactly the same error with the DataProcPySparkOperator operator in Airflow. The fix was to provide
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar'
instead of
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest.jar'
I guess in your case it should be passed as a command line argument like
--jars=gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar
Recently a PR handling this issue was merged into the spark-bigquery-connector; a new version of the connector will be released soon.
A simple solution for now is to add the environment variable GOOGLE_APPLICATION_CREDENTIALS=/path/OfMyProjectJson.json to the Spark runtime.
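For a local Java run, the same idea can also be expressed directly on the DataFrame reader instead of via sparkSession.conf(). A sketch, assuming the connector version in use supports the credentialsFile and parentProject options; the paths and table name are copied from the question, and the project id should be whatever appears as project_id in your JSON file.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BigQueryLocalRead {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("bigquery-local-read")
                .master("local[*]")
                .getOrCreate();

        Dataset<Row> dataset = spark.read()
                .format("bigquery")
                // Reader options instead of sparkSession.conf().set(...).
                .option("credentialsFile", "/path/OfMyProjectJson.json")
                .option("parentProject", "myProject") // project id used to bill the read
                .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
                .load();

        dataset.printSchema();
    }
}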

Apache Phoenix concurrent queries failing with an exception

We are running a sequence of operations: SELECT -> store the row keys into a collection -> split the collection across worker threads -> each thread creates its own connection through the Phoenix JDBC driver -> performs a SELECT and, depending on the result, UPSERTs into a different Phoenix table.
I am using an ExecutorService with a fixed thread pool of 4, and I am seeing exceptions like the one below.
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:538)
at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:764)
at com.vonage.test.PopulateStagingGWCDRWorker.run(MyCode.java:74)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:534)
... 8 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:122)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:73)
at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:67)
at org.apache.phoenix.iterate.ChunkedResultIterator.<init>(ChunkedResultIterator.java:92)
at org.apache.phoenix.iterate.ChunkedResultIterator$ChunkedResultIteratorFactory.newIterator(ChunkedResultIterator.java:72)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:92)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:83)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Caused by: java.io.IOException: The system cannot find the path specified
at java.io.WinNTFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.commons.io.output.DeferredFileOutputStream.thresholdReached(DeferredFileOutputStream.java:176)
at org.apache.phoenix.iterate.SpoolingResultIterator$1.thresholdReached(SpoolingResultIterator.java:98)
at org.apache.commons.io.output.ThresholdingOutputStream.checkThreshold(ThresholdingOutputStream.java:224)
at org.apache.commons.io.output.ThresholdingOutputStream.write(ThresholdingOutputStream.java:92)
at java.io.DataOutputStream.writeByte(DataOutputStream.java:153)
at org.apache.hadoop.io.WritableUtils.writeVLong(WritableUtils.java:273)
at org.apache.hadoop.io.WritableUtils.writeVInt(WritableUtils.java:253)
at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:146)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:107)
... 10 more
But if I use a pool size of 2 or less, it works fine. I was wondering if there is a client-side property that can be changed?
In my case I solved this by using the following dependency in pom.xml:
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-protocol</artifactId>
<version>1.1.11</version>
</dependency>
Just to add some context: I am on HBase 1.1 and Phoenix 4.7.
Setting phoenix.spool.directory in hbase-site.xml fixes this. Thanks.
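If editing hbase-site.xml is inconvenient, the same property can usually be supplied client-side through the Properties passed to the Phoenix JDBC connection. A minimal sketch, assuming a ZooKeeper quorum at zk-host:2181 and a spool directory that already exists and is writable on the client machine (both values are placeholders):
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class PhoenixSpoolDirectoryExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Point the client-side spool directory at a path that exists on this machine.
        props.setProperty("phoenix.spool.directory", "/tmp/phoenix-spool");

        try (Connection conn =
                     DriverManager.getConnection("jdbc:phoenix:zk-host:2181", props)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}
The original error comes from File.createTempFile failing because the spool location cannot be found, so whatever directory you configure must exist before the queries run.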

WebSphere onMessage error : Exception data: java.lang.reflect.InvocationTargetException

Here's my problem:
We have a banking WebSphere application (version 7.0.0.27) on an AIX multi-instance server. This application received 51 MQ messages to process within one minute, but for 5 of them we got the following errors:
[9/23/15 9:18:36:301 CEST] 00000065 LocalExceptio E CNTR0020E: EJB threw an unexpected (non-declared) exception during invocation of method "onMessage" on bean "BeanId(TrefleIntermediationEAR#TrefleEJB-3.4.3.4.jar#OPFRestitutionReception, null)". Exception data: java.lang.reflect.InvocationTargetException
at sun.reflect.GeneratedMethodAccessor521.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at com.ibm.ejs.jms.listener.ServerSessionDispatcher.dispatch(ServerSessionDispatcher.java:47)
at com.ibm.ejs.container.MDBWrapper.onMessage(MDBWrapper.java:98)
at com.ibm.ejs.container.MDBWrapper.onMessage(MDBWrapper.java:136)
at com.ibm.ejs.jms.listener.ServerSession.run(ServerSession.java:574)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1646)
Caused by: java.lang.RuntimeException: JMSCC0109: A message driven bean threw a runtime exception 'java.lang.ExceptionInInitializerError'.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
at com.ibm.msg.client.commonservices.j2se.NLSServices.createException(NLSServices.java:313)
at com.ibm.msg.client.commonservices.nls.NLSServices.createException(NLSServices.java:388)
at com.ibm.msg.client.jms.internal.JmsErrorUtils.createException(JmsErrorUtils.java:104)
at com.ibm.msg.client.jms.internal.JmsSessionImpl.run(JmsSessionImpl.java:3018)
at com.ibm.msg.client.jms.internal.JmsXAQueueSessionImpl$1.run(JmsXAQueueSessionImpl.java:395)
at com.ibm.mq.jms.MQSession.run(MQSession.java:862)
at com.ibm.ejs.jms.JMSSessionHandle.run(JMSSessionHandle.java:1055)
at com.ibm.ejs.jms.listener.ServerSession.connectionConsumerOnMessage(ServerSession.java:1082)
at com.ibm.ejs.jms.listener.ServerSession.onMessage(ServerSession.java:752)
at com.ibm.ejs.jms.listener.ServerSession.dispatch(ServerSession.java:718)
... 8 more
Caused by: java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.initialize(J9VMInternals.java:222)
at fr.bdf.trefle.trf.launcher.AbstractLauncher.retrieveOverridingStandaloneConnectionData(AbstractLauncher.java:117)
at fr.bdf.trefle.trf.launcher.AbstractLauncher.createFactory(AbstractLauncher.java:83)
at fr.bdf.trefle.trf.launcher.AbstractLauncher.<clinit>(AbstractLauncher.java:71)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:200)
at fr.bdf.trefle.trf.dao.FlowMonitorServiceImpl.insertFlowMonitor(FlowMonitorServiceImpl.java:465)
at fr.bdf.trefle.trf.dao.FlowMonitorServiceImpl.createFlowMonitor(FlowMonitorServiceImpl.java:176)
at fr.bdf.trefle.trf.mdb.OPFRestitutionReception.onMessage(OPFRestitutionReception.java:213)
at com.ibm.ejs.jms.listener.MDBWrapper$PriviledgedOnMessage.run(MDBWrapper.java:302)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:63)
at com.ibm.ejs.jms.listener.MDBWrapper.callOnMessage(MDBWrapper.java:271)
at com.ibm.ejs.jms.listener.MDBWrapper.onMessage(MDBWrapper.java:240)
at com.ibm.mq.jms.MQSession$FacadeMessageListener.onMessage(MQSession.java:147)
at com.ibm.msg.client.jms.internal.JmsSessionImpl.run(JmsSessionImpl.java:2846)
... 14 more
Caused by: java.util.MissingResourceException: Can't find resource for bundle configTechStandalone, key en_US
at java.util.ResourceBundle.getBundleImpl(ResourceBundle.java:379)
at java.util.ResourceBundle.getBundle(ResourceBundle.java:108)
at fr.bdf.trefle.common.util.StandaloneTechConfigurable.<clinit>(StandaloneTechConfigurable.java:31)
at java.lang.J9VMInternals.initializeImpl(Native Method)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:200)
... 28 more
From my analysis, here's what I can conclude:
In the stack trace, the root error, java.util.MissingResourceException, doesn't seem to be the real cause, because the resource the process is trying to load does exist and the other messages are processed fine.
I checked whether the method used to load the resource, getBundle(), has some performance issue, but I didn't find anything.
I also checked whether this is a known IBM bug, but if it is, I can't find it.
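One way to narrow this down is to reproduce the bundle lookup in isolation, outside the MDB and the EJB container. A small sketch, assuming (from the exception message) that the code loads a bundle named configTechStandalone for the en_US locale; both names are taken from the stack trace, and the only goal is to see whether the lookup succeeds on the same classpath the application uses.
import java.util.Collections;
import java.util.Locale;
import java.util.ResourceBundle;

public class BundleLookupCheck {
    public static void main(String[] args) {
        // Bundle base name and locale taken from the MissingResourceException message.
        ResourceBundle bundle =
                ResourceBundle.getBundle("configTechStandalone", new Locale("en", "US"));
        System.out.println("Bundle loaded, keys: " + Collections.list(bundle.getKeys()));
    }
}
If this works standalone but fails inside WebSphere, the problem is more likely classloader visibility of the properties file than the bundle content itself.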

Apache Gora - java.net.MalformedURLException while creating an HBase datastore

I am building a project which uses gora-hbase as the backend.
HBase is up and running. I am not using Maven or Ivy.
I have also specified the following in /conf/gora.properties:
gora.datastore.default=org.apache.gora.hbase.store.HBaseStore
gora.datastore.autocreateschema=true
In my code, I am using the following snippet to create a datastore:
datastore = DataStoreFactory.getDataStore(long.class, UserDetails.class, new Configuration());
I am getting the following exception at that line:
13/02/04 23:02:26 INFO zookeeper.ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x13ca8d9ecac000c, negotiated timeout = 40000
org.apache.gora.util.GoraException: java.lang.RuntimeException: java.net.MalformedURLException
at org.apache.gora.store.DataStoreFactory.createDataStore(DataStoreFactory.java:167)
at org.apache.gora.store.DataStoreFactory.getDataStore(DataStoreFactory.java:278)
at com.psl.gora.java.model.TestClass.init(TestClass.java:34)
at com.psl.gora.java.model.TestClass.<init>(TestClass.java:23)
at com.psl.gora.java.model.TestClass.main(TestClass.java:47)
Caused by: java.lang.RuntimeException: java.net.MalformedURLException
at org.apache.gora.hbase.store.HBaseStore.initialize(HBaseStore.java:125)
at org.apache.gora.store.DataStoreFactory.initializeDataStore(DataStoreFactory.java:102)
at org.apache.gora.store.DataStoreFactory.createDataStore(DataStoreFactory.java:161)
... 4 more
Caused by: java.net.MalformedURLException
at java.net.URL.<init>(URL.java:617)
at java.net.URL.<init>(URL.java:480)
at java.net.URL.<init>(URL.java:429)
at org.apache.xerces.impl.XMLEntityManager.setupCurrentEntity(Unknown Source)
at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.jdom.input.SAXBuilder.build(SAXBuilder.java:453)
at org.jdom.input.SAXBuilder.build(SAXBuilder.java:770)
at org.apache.gora.hbase.store.HBaseStore.readMapping(HBaseStore.java:524)
at org.apache.gora.hbase.store.HBaseStore.initialize(HBaseStore.java:111)
... 6 more
Caused by: java.lang.NullPointerException
at java.net.URL.<init>(URL.java:522)
... 19 more
Is there anything I am missing or am not aware of?
Any help or suggestion appreciated.
When this stack trace is shown, it is probably because gora-hbase-mapping.xml is missing.
This question is from months ago, but if another person has the same problem, maybe this will help.
At HBaseStore:524, builder.build(null) is being called, and the results look like the ones in http://www.eclipse.org/forums/index.php/t/262714/
---- Other possibility ----
Try String.class as the key class and check if it works. (Just checking...)
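A quick way to test both suggestions together: check whether gora-hbase-mapping.xml is visible on the classpath, then create the datastore with String.class as the key. This is only a sketch; it reuses the UserDetails bean from the question, and the mapping file name is the one gora-hbase conventionally reads.
import java.net.URL;

import org.apache.gora.store.DataStore;
import org.apache.gora.store.DataStoreFactory;
import org.apache.hadoop.conf.Configuration;

public class GoraMappingCheck {
    public static void main(String[] args) throws Exception {
        // 1. Is the mapping file actually on the classpath?
        URL mapping = Thread.currentThread().getContextClassLoader()
                .getResource("gora-hbase-mapping.xml");
        System.out.println(mapping == null
                ? "gora-hbase-mapping.xml NOT found on the classpath"
                : "gora-hbase-mapping.xml found at: " + mapping);

        // 2. Try the datastore with String as the key class, as suggested above.
        DataStore<String, UserDetails> datastore =
                DataStoreFactory.getDataStore(String.class, UserDetails.class, new Configuration());
        System.out.println("Datastore created: " + datastore.getClass().getName());
    }
}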
