I want to use MongoDB as a cache store for Infinispan, but I am getting the following exception.
It is different from the question Exception using mongodb as infinispan cache store, since the method b.loader() is not available in the Infinispan 7.0 jar. I worked around that by using b.persistence().addStore(), but I still get the exception mentioned in that question.
Code:
ConfigurationBuilder b = new ConfigurationBuilder();
b.loaders()
    .addStore(MongoDBCacheStoreConfigurationBuilder.class)
    .host("localhost")
    .port(27017)
    .timeout(1500)
    .acknowledgment(0)
    .username("mongoDBUSer")
    .password("mongoBDPassword")
    .database("infinispan_cachestore")
    .collection("entries");
final Configuration config = b.build();
MongoDBCacheStoreConfiguration store =
    (MongoDBCacheStoreConfiguration) config.loaders().cacheLoaders().get(0);
EmbeddedCacheManager cacheManager = new DefaultCacheManager();
cacheManager.defineConfiguration("usersCache", config);
//DefaultCacheManager manger=new DefaultCacheManager(config);
Cache<String, String> usersCache = cacheManager.getCache("usersCache");
Exception:
log4j:WARN No appenders could be found for logger (org.jboss.logging).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" org.infinispan.CacheException: Unable to invoke method public void org.infinispan.loaders.CacheLoaderManagerImpl.start() on object of type CacheLoaderManagerImpl
at org.infinispan.util.ReflectionUtil.invokeAccessibly(ReflectionUtil.java:207)
at org.infinispan.factories.AbstractComponentRegistry$PrioritizedMethod.invoke(AbstractComponentRegistry.java:889)
at org.infinispan.factories.AbstractComponentRegistry.invokeStartMethods(AbstractComponentRegistry.java:658)
at org.infinispan.factories.AbstractComponentRegistry.internalStart(AbstractComponentRegistry.java:647)
at org.infinispan.factories.AbstractComponentRegistry.start(AbstractComponentRegistry.java:550)
at org.infinispan.factories.ComponentRegistry.start(ComponentRegistry.java:221)
at org.infinispan.CacheImpl.start(CacheImpl.java:691)
at org.infinispan.manager.DefaultCacheManager.wireAndStartCache(DefaultCacheManager.java:685)
at org.infinispan.manager.DefaultCacheManager.createCache(DefaultCacheManager.java:648)
at org.infinispan.manager.DefaultCacheManager.getCache(DefaultCacheManager.java:544)
at com.infini.demo.DemoWithInfi5.main(DemoWithInfi5.java:31)
Caused by: org.infinispan.CacheException: Unable to start cache loaders
at org.infinispan.loaders.CacheLoaderManagerImpl.start(CacheLoaderManagerImpl.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.infinispan.util.ReflectionUtil.invokeAccessibly(ReflectionUtil.java:205)
... 10 more
Caused by: java.lang.IllegalArgumentException: Cannot load null class!
at org.infinispan.util.Util.getInstance(Util.java:224)
at org.infinispan.loaders.CacheLoaderManagerImpl.createCacheLoader(CacheLoaderManagerImpl.java:346)
at org.infinispan.loaders.CacheLoaderManagerImpl.createCacheLoader(CacheLoaderManagerImpl.java:336)
at org.infinispan.loaders.CacheLoaderManagerImpl.start(CacheLoaderManagerImpl.java:149)
... 15 more
Download infinispan-cachestore-mongodb-5.3.0.final.jar, add it to your project, and make sure you import its package in the class that uses it.
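If you want to double-check which cache store jar is actually picked up at runtime, a small diagnostic along these lines can help. This is just a sketch (not part of the original answer); it reuses the same MongoDBCacheStoreConfigurationBuilder import as the question's code:

// Diagnostic sketch: print which jar the MongoDB cache store builder was loaded from.
public class CheckCacheStoreJar {
    public static void main(String[] args) {
        java.security.CodeSource source = MongoDBCacheStoreConfigurationBuilder.class
                .getProtectionDomain().getCodeSource();
        // Expecting something like .../infinispan-cachestore-mongodb-5.3.0.final.jar
        System.out.println("Cache store builder loaded from: "
                + (source == null ? "unknown" : source.getLocation()));
    }
}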
I am using Flink v1.11.2 and Avro v1.10.1.
I am trying to deserialize an Avro record as a specific record from a Kafka topic, but for some reason I keep getting the error shown below.
I was able to output it as a generic record using this:
FlinkKafkaConsumer<GenericRecord> eventsConsumer = new FlinkKafkaConsumer(
        fsiProcessorProps.getKafkaEventsInput(),
        AvroDeserializationSchema.forGeneric(Sde.getClassSchema()),
        properties);
And am now trying:
FlinkKafkaConsumer<Sde> eventsConsumer = new FlinkKafkaConsumer(
        fsiProcessorProps.getKafkaEventsInput(),
        AvroDeserializationSchema.forSpecific(Sde.class),
        properties);
but get the following error when trying to start the job:
"org.apache.flink.runtime.rest.handler.RestHandlerException: Could not execute application.
at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$1(JarRunHandler.java:103)
at java.base/java.util.concurrent.CompletableFuture.uniHandle(Unknown Source)
at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(Unknown Source)
at java.base/java.util.concurrent.CompletableFuture.postComplete(Unknown Source)
at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.util.FlinkRuntimeException: Could not execute application.
at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(Unknown Source)
at java.base/java.util.concurrent.CompletableFuture.completeThrowable(Unknown Source)
... 7 more
Caused by: org.apache.flink.util.FlinkRuntimeException: Could not execute application.
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:81)
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.run(DetachedApplicationRunner.java:67)
at org.apache.flink.runtime.webmonitor.handlers.JarRunHandler.lambda$handleRequest$0(JarRunHandler.java:100)
... 7 more
Caused by: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Expecting type to be a PojoTypeInfo
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149)
at org.apache.flink.client.deployment.application.DetachedApplicationRunner.tryExecuteJobs(DetachedApplicationRunner.java:78)
... 9 more
Caused by: java.lang.IllegalStateException: Expecting type to be a PojoTypeInfo
at org.apache.flink.formats.avro.typeutils.AvroTypeInfo.generateFieldsFromAvroSchema(AvroTypeInfo.java:74)
at org.apache.flink.formats.avro.typeutils.AvroTypeInfo.<init>(AvroTypeInfo.java:56)
at org.apache.flink.formats.avro.AvroDeserializationSchema.getProducedType(AvroDeserializationSchema.java:168)
at org.apache.flink.streaming.connectors.kafka.internals.KafkaDeserializationSchemaWrapper.getProducedType(KafkaDeserializationSchemaWrapper.java:66)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.getProducedType(FlinkKafkaConsumerBase.java:1066)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getTypeInfo(StreamExecutionEnvironment.java:2172)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1608)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1569)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.java:1551)
at FsiProcessor.main(FsiProcessor.java:46)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.base/java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288)
... 12 more
"
The Sde class is a typical generated Avro class, so I am not sure what I am missing. Also, the data does not have the schema embedded (I tried with the schema embedded as well and got the same issue).
I wasn't able to find the answer to that, because we switched to using a schema registry, but I wanted to post how it was done with a schema registry at least.
So instead of doing:
FlinkKafkaConsumer<Sde> eventsConsumer = new FlinkKafkaConsumer(
        fsiProcessorProps.getKafkaEventsInput(),
        AvroDeserializationSchema.forSpecific(Sde.class),
        properties);
It was changed to:
private static <T extends SpecificRecordBase> FlinkKafkaConsumerBase<T> getFlinkKafkaConsumer(
        Properties properties, String topic, Class<? extends SpecificRecordBase> type,
        String schemaRegistryUrl, Map<String, String> sslConfiguration) {
    ConfluentRegistryAvroDeserializationSchema<?> schema =
            ConfluentRegistryAvroDeserializationSchema.forSpecific(
                    type, schemaRegistryUrl, sslConfiguration);
    return new FlinkKafkaConsumer<T>(topic, (DeserializationSchema<T>) schema, properties);
}
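For completeness, here is roughly how that helper might be wired up. This is only a sketch; the topic name, registry URL, and consumer properties below are placeholders, not from the original post:

// Hypothetical usage of the helper above; all literal values are placeholders.
Properties kafkaProps = new Properties();
kafkaProps.setProperty("bootstrap.servers", "localhost:9092");
kafkaProps.setProperty("group.id", "fsi-processor");

Map<String, String> registrySslConfig = new HashMap<>();  // schema registry SSL settings, if needed

FlinkKafkaConsumerBase<Sde> eventsConsumer = getFlinkKafkaConsumer(
        kafkaProps, "events-topic", Sde.class,
        "https://schema-registry:8081", registrySslConfig);

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Sde> events = env.addSource(eventsConsumer);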
I won't mark this as an answer, but wanted to give it as at least another approach involving a schema registry, which is a good thing to use with Avro anyway.
Make sure the Avro-generated POJO has both getters and setters implemented by adding this to the Avro code generation (avro-maven-plugin) configuration:
<createSetters>true</createSetters>
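For context (this illustration is mine, not part of the answer above): Flink only treats a class as a POJO when it has a public no-arg constructor and every private field is reachable through a getter and a setter, which is why the generated Avro class needs setters enabled. A minimal shape of what the analysis expects:

// Minimal illustration of the POJO shape Flink's type analysis expects.
// If a setter is missing, the class is not analyzed as a POJO, which is the
// condition the "Expecting type to be a PojoTypeInfo" error points at.
public class SdeLikePojo {
    private String id;

    public SdeLikePojo() { }                          // public no-arg constructor

    public String getId() { return id; }              // getter
    public void setId(String id) { this.id = id; }    // setter generated by createSetters=true
}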
I am trying to connect to Google BigQuery using Spark in Java, but I am unable to find accurate documentation for this.
I tried: https://cloud.google.com/dataproc/docs/tutorials/bigquery-connector-spark-example
and
https://github.com/GoogleCloudPlatform/spark-bigquery-connector#compiling-against-the-connector
My code:
sparkSession.conf().set("credentialsFile", "/path/OfMyProjectJson.json");
Dataset<Row> dataset = sparkSession.read().format("bigquery")
        .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
        .load();
dataset.printSchema();
But this throws the following exception:
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider com.google.cloud.spark.bigquery.BigQueryRelationProvider could not be instantiated
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:614)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:190)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
at com.mySparkConnector.getDataset(BigQueryFetchClass.java:12)
Caused by: java.lang.IllegalArgumentException: A project ID is required for this service but could not be determined from the builder or the environment. Please set a project ID using the builder.
at com.google.cloud.spark.bigquery.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:142)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.ServiceOptions.<init>(ServiceOptions.java:285)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:91)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.<init>(BigQueryOptions.java:30)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions$Builder.build(BigQueryOptions.java:86)
at com.google.cloud.spark.bigquery.repackaged.com.google.cloud.bigquery.BigQueryOptions.getDefaultInstance(BigQueryOptions.java:159)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider$.$lessinit$greater$default$2(BigQueryRelationProvider.scala:29)
at com.google.cloud.spark.bigquery.BigQueryRelationProvider.<init>(BigQueryRelationProvider.scala:40)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380)
... 15 more
My JSON file contains the project_id.
I tried searching for possible solutions but was unable to find any. Please help me find a solution to this exception, or point me to any documentation on how to connect to BigQuery with Spark.
I got exactly the same error with the DataProcPySparkOperator operator in Airflow. The fix was to provide
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar'
instead of
dataproc_pyspark_jars='gs://spark-lib/bigquery/spark-bigquery-latest.jar'
I guess in your case it should be passed as a command line argument like
--jars=gs://spark-lib/bigquery/spark-bigquery-latest_2.12.jar
Recently a PR handling this issue was merged into the spark-bigquery-connector; a new version of the connector will be released soon.
A simple solution for now is to add the environment variable GOOGLE_APPLICATION_CREDENTIALS=/path/OfMyProjectJson.json to the Spark runtime.
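As an alternative sketch (not from the original answers), the connector also accepts the credentials file and the billing project directly as read options; "credentialsFile" and "parentProject" are documented spark-bigquery-connector options, while the project and table names below are the question's placeholders:

// Sketch: pass credentials and billing project explicitly as read options.
Dataset<Row> dataset = sparkSession.read()
        .format("bigquery")
        .option("credentialsFile", "/path/OfMyProjectJson.json")
        .option("parentProject", "myProject")
        .option("table", "myProject.myBigQueryDb.myBigQuweryTable")
        .load();
dataset.printSchema();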
I have been trying to run the ctakes-temporal-demo from GitHub.
Currently I have configured everything in Eclipse correctly, and I have set the correct ID and password for my UMLS account. When I try to run it I get the following error:
Exception in thread "main" java.lang.Exception: org.apache.uima.resource.ResourceInitializationException: Initialization of annotator class "org.apache.ctakes.dictionary.lookup2.ae.DefaultJCasTermAnnotator" failed. (Descriptor: <unknown>)
at com.optum.cda.Main.main(Main.java:103)
Caused by: org.apache.uima.resource.ResourceInitializationException: Initialization of annotator class "org.apache.ctakes.dictionary.lookup2.ae.DefaultJCasTermAnnotator" failed. (Descriptor: <unknown>)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:252)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:156)
at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:269)
at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:387)
at org.apache.uima.analysis_engine.asb.impl.ASB_impl.setup(ASB_impl.java:254)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initASB(AggregateAnalysisEngine_impl.java:431)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initializeAggregateAnalysisEngine(AggregateAnalysisEngine_impl.java:375)
at org.apache.uima.analysis_engine.impl.AggregateAnalysisEngine_impl.initialize(AggregateAnalysisEngine_impl.java:185)
at org.apache.uima.fit.factory.AnalysisEngineFactory.createEngine(AnalysisEngineFactory.java:711)
at org.apache.uima.fit.factory.AggregateBuilder.createAggregate(AggregateBuilder.java:207)
at com.optum.cda.Main.main(Main.java:101)
Caused by: org.apache.uima.resource.ResourceInitializationException: EXCEPTION MESSAGE LOCALIZATION FAILED: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key Could not construct org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary
at org.apache.ctakes.dictionary.lookup2.ae.AbstractJCasTermAnnotator.initialize(AbstractJCasTermAnnotator.java:131)
at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initializeAnalysisComponent(PrimitiveAnalysisEngine_impl.java:250)
... 12 more
Caused by: org.apache.uima.analysis_engine.annotator.AnnotatorContextException: EXCEPTION MESSAGE LOCALIZATION FAILED: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key Could not construct org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionary(DictionaryDescriptorParser.java:199)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionaries(DictionaryDescriptorParser.java:156)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDescriptor(DictionaryDescriptorParser.java:128)
at org.apache.ctakes.dictionary.lookup2.ae.AbstractJCasTermAnnotator.initialize(AbstractJCasTermAnnotator.java:129)
... 13 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.ctakes.dictionary.lookup2.dictionary.DictionaryDescriptorParser.parseDictionary(DictionaryDescriptorParser.java:196)
... 16 more
Caused by: java.lang.StringIndexOutOfBoundsException: String index out of range: -7
at java.lang.String.substring(Unknown Source)
at org.apache.ctakes.dictionary.lookup2.util.JdbcConnectionFactory.getConnectionUrl(JdbcConnectionFactory.java:110)
at org.apache.ctakes.dictionary.lookup2.util.JdbcConnectionFactory.getConnection(JdbcConnectionFactory.java:63)
at org.apache.ctakes.dictionary.lookup2.dictionary.JdbcRareWordDictionary.<init>(JdbcRareWordDictionary.java:91)
at org.apache.ctakes.dictionary.lookup2.dictionary.JdbcRareWordDictionary.<init>(JdbcRareWordDictionary.java:72)
at org.apache.ctakes.dictionary.lookup2.dictionary.UmlsJdbcRareWordDictionary.<init>(UmlsJdbcRareWordDictionary.java:31)
... 21 more
Can someone familiar with cTAKES help me figure out the error? I have tried the mailing list, but got no replies.
Finally found a solution! It seems the path to the XML descriptor of the fast dictionary lookup is not reachable, so after the UMLS user is validated the database setup cannot proceed.
The solution I used was to use a hard-coded absolute path for the database (which I will have to change whenever the project is moved). This is not very elegant, but it works.
I'd love to hear if anyone has another solution.
Have you loaded the data from ctakessnorx.script into the database?
Also, we need to update the following property values:
<property key="jdbcUrl" value="jdbc:hsqldb:file:org/apache/ctakes/dictionary/lookup/fast/ctakessnorx/ctakessnorx"/>
<property key="jdbcUser" value="sa"/>
<property key="jdbcPass" value=""/>
<property key="rareWordTable" value="cui_terms"/>
I am using Spark 2.1.1 in Java and elasticsearch-spark-20_2.11 (version 5.3.2) in order to write data to Elasticsearch. I create a JavaStreamingContext which I then set to await termination, so the application should always retrieve new data.
After I read the stream, I split it into RDDs and for each one I apply SQL aggregations and then write it to Elasticsearch as follows:
recordStream.foreachRDD(rdd -> {
    if (rdd.count() > 0) {
        /*
         * Create RDD from JSON
         */
        Dataset<Row> df = spark.read().json(rdd.rdd());
        df.createOrReplaceTempView("data");
        df.cache();

        /*
         * Apply the aggregations
         */
        Dataset aggregators = spark.sql(ORDER_TYPE_DB);
        JavaEsSparkSQL.saveToEs(aggregators.toDF(), "order_analytics/record");

        aggregators = spark.sql(ORDER_CUSTOMER_DB);
        JavaEsSparkSQL.saveToEs(aggregators.toDF(), "customer_analytics/record");
    }
});
This works fine the first time data is read and inserted into Elasticsearch, but when more data is retrieved by the stream, I get the following error:
org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:250)
at org.elasticsearch.hadoop.rest.RestService.createWriter(RestService.java:546)
at org.elasticsearch.spark.rdd.EsRDDWriter.write(EsRDDWriter.scala:58)
at org.elasticsearch.spark.sql.EsSparkSQL$$anonfun$saveToEs$1.apply(EsSparkSQL.scala:94)
at org.elasticsearch.spark.sql.EsSparkSQL$$anonfun$saveToEs$1.apply(EsSparkSQL.scala:94)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:322)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.elasticsearch.hadoop.rest.EsHadoopTransportException: java.net.BindException: Address already in use: JVM_Bind
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:129)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:461)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:425)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:429)
at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:155)
at org.elasticsearch.hadoop.rest.RestClient.remoteEsVersion(RestClient.java:627)
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:243)
... 10 more
Caused by: java.net.BindException: Address already in use: JVM_Bind
at java.net.DualStackPlainSocketImpl.bind0(Native Method)
at java.net.DualStackPlainSocketImpl.socketBind(DualStackPlainSocketImpl.java:106)
at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387)
at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:190)
at java.net.Socket.bind(Socket.java:644)
at java.net.Socket.<init>(Socket.java:433)
at java.net.Socket.<init>(Socket.java:286)
at org.apache.commons.httpclient.protocol.DefaultProtocolSocketFactory.createSocket(DefaultProtocolSocketFactory.java:80)
at org.apache.commons.httpclient.protocol.DefaultProtocolSocketFactory.createSocket(DefaultProtocolSocketFactory.java:122)
at org.apache.commons.httpclient.HttpConnection.open(HttpConnection.java:707)
at org.apache.commons.httpclient.HttpMethodDirector.executeWithRetry(HttpMethodDirector.java:387)
at org.apache.commons.httpclient.HttpMethodDirector.executeMethod(HttpMethodDirector.java:171)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:397)
at org.apache.commons.httpclient.HttpClient.executeMethod(HttpClient.java:323)
at org.elasticsearch.hadoop.rest.commonshttp.CommonsHttpTransport.execute(CommonsHttpTransport.java:478)
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:112)
... 16 more
Any ideas what the problem could be?
Spark uses its default configuration and is instantiated in Java as:
SparkConf conf = new SparkConf().setAppName(topic).setMaster("local");
JavaStreamingContext streamingContext = new JavaStreamingContext(conf, Durations.seconds(2));
Elasticsearch is configured via Docker Compose with the following environment parameters:
- cluster.name=cp-es-cluster
- node.name=cloud1
- http.cors.enabled=true
- http.cors.allow-origin="*"
- network.host=0.0.0.0
- discovery.zen.ping.unicast.hosts=${ENV_IP}
- network.publish_host=${ENV_IP}
- discovery.zen.minimum_master_nodes=1
- xpack.security.enabled=false
- xpack.monitoring.enabled=false
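For reference (an editorial sketch, not part of the original question or of any answer): the setting called out in the error message, es.nodes.wan.only, together with the Elasticsearch node address, can be passed through the same SparkConf shown above. The host and port values below are placeholders:

// Sketch only: standard elasticsearch-hadoop connection settings on the SparkConf.
SparkConf conf = new SparkConf()
        .setAppName(topic)
        .setMaster("local")
        .set("es.nodes", "localhost")          // placeholder host
        .set("es.port", "9200")                // placeholder port
        .set("es.nodes.wan.only", "true");     // setting named in the error message
JavaStreamingContext streamingContext = new JavaStreamingContext(conf, Durations.seconds(2));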
Good afternoon,
When I attempt to use the SMTP Appender, I get an odd error in the console. The error appears to occur when the XML file is loaded, as it isn't logged via any output stream except stdout. The contents of the appender are as follows (and I have confirmed that the error is in this block of XML). I've removed our server information.
<SMTP name="Mailer">
<Subject>[ERROR] (software name) on ${hostName} has thrown a fatal error</Subject>
<To>(a valid email in the form of a#a.com)</To>
<From>(a valid email in the form of a#a.com)</From>
<SMTPHost>(The Internal IP address of a server in the form of 192.168.0.1)</SMTPHost>
<SMTPPort>587</SMTPPort>
<BufferSize>512</BufferSize>
</SMTP>
Whether or not it is referenced in a logger, I receive the following error immediately upon running the program and calling getLogger on the LogManager. I've removed a couple of the file names and replaced them with a rough description of what each file is doing at that point.
2015-05-19 19:08:18,812 ERROR Unable to invoke factory method in class class org.apache.logging.log4j.core.appender.SmtpAppender for element SMTP. java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:137)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:766)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:706)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:698)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:358)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:161)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:361)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:426)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:442)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:138)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:147)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:175)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:426)
at (Our Log Interface Class, which essentially just returns the log passed by the Apache log manager)
at (The global variable definition file, the first method to use getLogger)
at (The main method for our software, where the globals are loaded)
Caused by: java.lang.NoSuchMethodError: javax.mail.Session.setProtocolForAddress(Ljava/lang/String;Ljava/lang/String;)V
at org.apache.logging.log4j.core.net.SmtpManager$SMTPManagerFactory.createManager(SmtpManager.java:325)
at org.apache.logging.log4j.core.net.SmtpManager$SMTPManagerFactory.createManager(SmtpManager.java:299)
at org.apache.logging.log4j.core.appender.AbstractManager.getManager(AbstractManager.java:71)
at org.apache.logging.log4j.core.net.SmtpManager.getSMTPManager(SmtpManager.java:124)
at org.apache.logging.log4j.core.appender.SmtpAppender.createAppender(SmtpAppender.java:142)
... 21 more
2015-05-19 19:08:18,814 ERROR Null object returned for SMTP in Appenders.
2015-05-19 19:08:18,819 ERROR Unable to locate appender Mailer for logger fatalerror
The details of the configuration are correct (I know it's a valid IP, etc.); they worked in log4j 1. The error log tells me virtually nothing about the cause, so I am hoping someone has heard of this before. Thanks everyone!
This is the important line from your stacktrace:
Caused by: java.lang.NoSuchMethodError: javax.mail.Session.setProtocolForAddress(Ljava/lang/String;Ljava/lang/String;)V
The setProtocolForAddress method has been part of JavaMail since version 1.4, so this suggests to me that you are using an old version of that JAR. If you are, try upgrading to a later version.
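One quick way to confirm which javax.mail jar is actually being picked up at runtime is a small reflective check. This is a diagnostic sketch added here, not part of the original answer:

// Diagnostic sketch: verify that the javax.mail.Session on the classpath has
// setProtocolForAddress(String, String), and show which jar it was loaded from.
public class CheckJavaMail {
    public static void main(String[] args) throws Exception {
        Class<?> session = Class.forName("javax.mail.Session");
        System.out.println("javax.mail.Session loaded from: "
                + session.getProtectionDomain().getCodeSource().getLocation());
        try {
            session.getMethod("setProtocolForAddress", String.class, String.class);
            System.out.println("setProtocolForAddress(String, String) is present.");
        } catch (NoSuchMethodException e) {
            System.out.println("setProtocolForAddress is missing: an older JavaMail jar is on the classpath.");
        }
    }
}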
Note that log4j2 has many lookups, not just system properties. So you need to convert ${hostName} to ${sys:hostName}, and if you are using a lookup for the IP address, this may be the cause of the issue. (Try hardcoding all lookup values to exclude this possibility.)