I upgraded my Serenity and Cucumber tests today to Serenity 2.6.0 and Cucumber 6, only to find that a lot of packages needed to change and that the steps in my feature files no longer linked to the step definitions. This is the setup of my CucumberRunner:
import io.cucumber.junit.CucumberOptions;
import net.serenitybdd.cucumber.CucumberWithSerenity;
import org.junit.runner.RunWith;
@RunWith(CucumberWithSerenity.class)
@CucumberOptions(features = {"src/test/resources/features"},
        glue = {"be.nbb.hive.cucumber.steps"})
public class CucumberRunner {
}
But if I look at the feature files in IntelliJ, I can no longer click through from the steps to their definitions. And when I run a feature file I get the following exception:
10:40:50.005 [main] DEBUG n.thucydides.core.steps.StepEventBus - Test suite started for story net.thucydides.core.model.Story#aa2c74aa
10:40:50.006 [main] INFO - Test Suite Started: Smoke Test Login
sep 09, 2021 10:40:50 AM io.cucumber.core.runtime.Runtime run
SEVERE: Exception while executing pickle
java.util.concurrent.ExecutionException: java.util.regex.PatternSyntaxException: Dangling meta character '?' near index 15
^I click on (?:?:the )?\?(?:.*)?(?:?:.*)? \((name|id|css|xpath\): \?(?:.+)?\\)$
^
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at io.cucumber.core.runtime.Runtime.run(Runtime.java:93)
at net.serenitybdd.cucumber.cli.Main.run(Main.java:27)
at net.serenitybdd.cucumber.cli.Main.main(Main.java:18)
Caused by: java.util.regex.PatternSyntaxException: Dangling meta character '?' near index 15
^I click on (?:?:the )?\?(?:.*)?(?:?:.*)? \((name|id|css|xpath\): \?(?:.+)?\\)$
^
at java.util.regex.Pattern.error(Pattern.java:1955)
at java.util.regex.Pattern.sequence(Pattern.java:2123)
at java.util.regex.Pattern.expr(Pattern.java:1996)
at java.util.regex.Pattern.group0(Pattern.java:2821)
at java.util.regex.Pattern.sequence(Pattern.java:2051)
at java.util.regex.Pattern.expr(Pattern.java:1996)
at java.util.regex.Pattern.compile(Pattern.java:1696)
at java.util.regex.Pattern.<init>(Pattern.java:1351)
at java.util.regex.Pattern.compile(Pattern.java:1054)
at io.cucumber.cucumberexpressions.DefaultPatternCompiler.compile(DefaultPatternCompiler.java:12)
at io.cucumber.cucumberexpressions.TreeRegexp.<init>(TreeRegexp.java:22)
at io.cucumber.cucumberexpressions.CucumberExpression.<init>(CucumberExpression.java:37)
at io.cucumber.cucumberexpressions.ExpressionFactory.createExpression(ExpressionFactory.java:34)
at io.cucumber.core.stepexpression.StepExpressionFactory.crateExpression(StepExpressionFactory.java:88)
at io.cucumber.core.stepexpression.StepExpressionFactory.createExpression(StepExpressionFactory.java:61)
at io.cucumber.core.stepexpression.StepExpressionFactory.createExpression(StepExpressionFactory.java:49)
at io.cucumber.core.runner.CachingGlue.lambda$prepareGlue$3(CachingGlue.java:244)
at java.util.ArrayList.forEach(ArrayList.java:1249)
at io.cucumber.core.runner.CachingGlue.prepareGlue(CachingGlue.java:243)
at io.cucumber.core.runner.Runner.runPickle(Runner.java:70)
at io.cucumber.core.runtime.Runtime.lambda$execute$5(Runtime.java:110)
at io.cucumber.core.runtime.CucumberExecutionContext.runTestCase(CucumberExecutionContext.java:117)
at io.cucumber.core.runtime.Runtime.lambda$execute$6(Runtime.java:110)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at io.cucumber.core.runtime.Runtime$SameThreadExecutorService.execute(Runtime.java:233)
at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112)
at io.cucumber.core.runtime.Runtime.lambda$run$2(Runtime.java:86)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.SliceOps$1$1.accept(SliceOps.java:204)
at java.util.ArrayList$ArrayListSpliterator.tryAdvance(ArrayList.java:1351)
at java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:126)
at java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:498)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:485)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at io.cucumber.core.runtime.Runtime.run(Runtime.java:87)
... 2 more
I'm not sure what is going wrong, as the same code worked with the previous version.
This is a question related to upgrading to Cucumber 6, which has a different API, syntax, and architecture compared to Cucumber 2. You will need to rewrite many of the step definition methods, as the regular expressions used in Cucumber 2 will need updating, and the parameter conversion logic will also need updating (different classes are used).
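For illustration, here is a hedged sketch of the kind of rewrite this typically involves. The step text, class, and package names below are hypothetical and not taken from the question; the point is that Cucumber 6 annotations live in io.cucumber.java.en and step texts are usually Cucumber expressions rather than raw regular expressions:

import io.cucumber.java.en.When;

public class ClickSteps {

    // Old style (regular expression, cucumber.api.java.en.When):
    // @When("^I click on (?:the )?\"([^\"]*)\" by (name|id|css|xpath)$")

    // Cucumber 6 style: a Cucumber expression. "(the )" marks optional text,
    // and {string} / {word} are built-in parameter types that replace the
    // old capture groups and custom transformers.
    @When("I click on (the ){string} element by {word}")
    public void iClickOnElement(String element, String locatorType) {
        // driver interaction elided
    }
}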
Related
I'm upgrading the amazonaws SDKs in my project.
Following are the upgrades I made in the POM (see the pom.xml sketch after this list):
amazon-kinesis-client from 1.9.0 to 1.14.9
amazon-kinesis-producer from 0.10.2 to 0.15.2
aws-java-sdk-core from 1.11.272 to 1.12.398
jmespath-java from 1.11.98 to 1.12.398
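In pom.xml form these upgrades look roughly as follows (the group IDs are assumed to be the standard com.amazonaws ones):

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>amazon-kinesis-client</artifactId>
    <version>1.14.9</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>amazon-kinesis-producer</artifactId>
    <version>0.15.2</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-core</artifactId>
    <version>1.12.398</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>jmespath-java</artifactId>
    <version>1.12.398</version>
</dependency>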
After the changes I am getting the following runtime errors in the log file, and my Kinesis consumer/worker is not working. The kinesis-producer is working fine.
[ INFO] [] [RecordProcessor-0000] (06 Feb 2023 11:35:49) (KinesisDataFetcher.java:171) - Initializing shard shardId-000000000000 with 49636335084413016973448851393414073031389798471324139522
[ERROR] [] [Thread-10] (06 Feb 2023 11:35:50) (Worker.java:709) - Worker.run caught exception, sleeping for 1000 milli seconds!
java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: Could not initialize class com.amazonaws.protocol.json.SdkStructuredCborFactory
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisShardConsumer.determineTaskOutcome(KinesisShardConsumer.java:393)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisShardConsumer.checkAndSubmitNextTask(KinesisShardConsumer.java:328)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisShardConsumer.consumeShard(KinesisShardConsumer.java:316)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.Worker.runProcessLoop(Worker.java:698)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.Worker.run(Worker.java:681)
at com.hk.web.listener.KinesisConsumer$2.run(KinesisConsumer.java:109)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: Could not initialize class com.amazonaws.protocol.json.SdkStructuredCborFactory
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:192)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisShardConsumer.determineTaskOutcome(KinesisShardConsumer.java:376)
... 6 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.amazonaws.protocol.json.SdkStructuredCborFactory
at com.amazonaws.protocol.json.SdkJsonProtocolFactory.getSdkFactory(SdkJsonProtocolFactory.java:141)
at com.amazonaws.protocol.json.SdkJsonProtocolFactory.createGenerator(SdkJsonProtocolFactory.java:55)
at com.amazonaws.protocol.json.SdkJsonProtocolFactory.createGenerator(SdkJsonProtocolFactory.java:75)
at com.amazonaws.protocol.json.SdkJsonProtocolFactory.createProtocolMarshaller(SdkJsonProtocolFactory.java:65)
at com.amazonaws.services.kinesis.model.transform.GetShardIteratorRequestProtocolMarshaller.marshall(GetShardIteratorRequestProtocolMarshaller.java:52)
at com.amazonaws.services.kinesis.AmazonKinesisClient.executeGetShardIterator(AmazonKinesisClient.java:1420)
at com.amazonaws.services.kinesis.AmazonKinesisClient.getShardIterator(AmazonKinesisClient.java:1405)
at com.amazonaws.services.kinesis.clientlibrary.proxies.KinesisProxy.getIterator(KinesisProxy.java:574)
at com.amazonaws.services.kinesis.clientlibrary.proxies.MetricsCollectingKinesisProxyDecorator.getIterator(MetricsCollectingKinesisProxyDecorator.java:125)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisDataFetcher.getIterator(KinesisDataFetcher.java:224)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisDataFetcher.advanceIteratorTo(KinesisDataFetcher.java:200)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.KinesisDataFetcher.initialize(KinesisDataFetcher.java:172)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitializeTask.call(InitializeTask.java:94)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.MetricsCollectingTaskDecorator.call(MetricsCollectingTaskDecorator.java:49)
at com.amazonaws.services.kinesis.clientlibrary.lib.worker.MetricsCollectingTaskDecorator.call(MetricsCollectingTaskDecorator.java:24)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
I was checking whether both my Kinesis producer and consumer are working after the upgrade: the producer is working fine, but the Kinesis consumer/worker gives the error I mentioned.
I found the fix: I needed to upgrade com.fasterxml.jackson.core:jackson-databind from 2.6.x to 2.12.x+, and that resolved the error.
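In pom.xml terms the fix looks roughly like this (2.12.7 is only an example of a 2.12.x release; pick the patch version that suits your project):

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.12.7</version>
</dependency>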
Thanks to everyone who responded.
I am working on uplifting a Java component to Java 11, but I am encountering a runtime error. The component builds successfully, but I get this error when I try to smoke test the service: java.lang.NoClassDefFoundError: org/omg/CORBA/portable/IDLEntity.
After some research I found that CORBA is not supported in Java 11 and that the GlassFish CORBA JAR files need to be downloaded separately: http://support.sas.com/kb/63/716.html
I also tried adding the GlassFish dependency as mentioned in the second answer of "Is there a replacement library for CORBA in JDK 11", but I still get the same error.
I downloaded the GlassFish jars and pointed CLASSPATH at them in .bash_profile:
export CLASSPATH=${CLASSPATH}:/usr/local/glassfish-corba-internal-api-4.2.4.jar:/usr/local/glassfish-corba-omgapi-4.2.4.jar:/usr/local/glassfish-corba-orb-4.2.4.jar:/usr/local/pfl-basic-4.1.2.jar:/usr/local/pfl-tf-4.1.2.jar
Here's the complete error log:
21/11/07 15:52:39.865 GMT INFO [qtp1201454821-16] com.cerner.pophealth.bi.bodsp.service.lib.serviceclients.AnalyticsBiDirectoryServiceClient [cid:fa29ac1a-f394-44a5-abed-4055cf966362] Retrieving info from BI Directory Service for tenant 50187c3d-b72c-4f7a-9a34-b2abe35be868
21/11/07 15:52:41.726 GMT INFO [qtp1201454821-16] com.cerner.pophealth.bi.bodsp.service.lib.serviceclients.AnalyticsBiDirectoryServiceClient [cid:fa29ac1a-f394-44a5-abed-4055cf966362] Retrieving info from BI Directory Service for cluster 1ac5c9a5-4fca-4eb5-a5a8-9e73d53a1871
21/11/07 15:52:42.463 GMT INFO [qtp1201454821-16] com.cerner.pophealth.bi.bodsp.service.handlers.SessionUtil [cid:fa29ac1a-f394-44a5-abed-4055cf966362] Opening session to CMS: boapp1.northamerica.cerner.net
21/11/07 15:52:42.679 GMT ERROR [qtp1201454821-16] com.cerner.beadledom.jaxrs.exceptionmapping.ThrowableExceptionMapper [cid:fa29ac1a-f394-44a5-abed-4055cf966362] An unhandled exception was thrown.
java.lang.NoClassDefFoundError: org/omg/CORBA/portable/IDLEntity
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1017)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:550)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:458)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:452)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:451)
at org.eclipse.jetty.webapp.WebAppClassLoader.foundClass(WebAppClassLoader.java:642)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadAsResource(WebAppClassLoader.java:615)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:529)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1017)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:174)
at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:550)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:458)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:452)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:451)
at org.eclipse.jetty.webapp.WebAppClassLoader.foundClass(WebAppClassLoader.java:642)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadAsResource(WebAppClassLoader.java:615)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:529)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at com.crystaldecisions.thirdparty.com.ooc.OBCORBA.ORB_impl.initializeDefaultPolicies(ORB_impl.java:369)
at com.crystaldecisions.thirdparty.com.ooc.OBCORBA.ORB_impl.initialize(ORB_impl.java:196)
at com.crystaldecisions.thirdparty.com.ooc.OBCORBA.ORB_impl.setParameters(ORB_impl.java:812)
at com.crystaldecisions.thirdparty.com.ooc.OBCORBA.ORB_impl.init(ORB_impl.java:1313)
at com.crystaldecisions.enterprise.ocaframework.idl.helper.ORBHelper.init(ORBHelper.java:57)
at com.crystaldecisions.enterprise.ocaframework.ServiceMgr.<init>(ServiceMgr.java:402)
at com.crystaldecisions.enterprise.ocaframework.ServiceMgrFactory.getServiceMgr(ServiceMgrFactory.java:66)
at com.crystaldecisions.sdk.occa.security.internal.LogonService.ensureServiceStub(LogonService.java:646)
at com.crystaldecisions.sdk.occa.security.internal.LogonService.doUserLogon(LogonService.java:829)
at com.crystaldecisions.sdk.occa.security.internal.LogonService.doUserLogon(LogonService.java:806)
at com.crystaldecisions.sdk.occa.security.internal.LogonService.userLogon(LogonService.java:211)
at com.crystaldecisions.sdk.occa.security.internal.SecurityMgr.userLogon(SecurityMgr.java:166)
at com.crystaldecisions.sdk.framework.internal.SessionMgr.logon_aroundBody0(SessionMgr.java:458)
at com.crystaldecisions.sdk.framework.internal.SessionMgr.logon_aroundBody1$advice(SessionMgr.java:521)
at com.crystaldecisions.sdk.framework.internal.SessionMgr.logon(SessionMgr.java:1)
at com.cerner.pophealth.bi.bodsp.service.handlers.SessionUtil.openSession(SessionUtil.java:100)
at com.cerner.pophealth.bi.bodsp.service.handlers.ServiceProvider.<init>(ServiceProvider.java:47)
at com.cerner.pophealth.bi.bodsp.service.resource.BODSPResourceImpl.getBODSPHandler(BODSPResourceImpl.java:237)
at com.cerner.pophealth.bi.bodsp.service.resource.BODSPResourceImpl.getBODSPs(BODSPResourceImpl.java:71)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.jboss.resteasy.core.MethodInjectorImpl.invoke(MethodInjectorImpl.java:138)
at org.jboss.resteasy.core.ResourceMethodInvoker.internalInvokeOnTarget(ResourceMethodInvoker.java:517)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTargetAfterFilter(ResourceMethodInvoker.java:406)
at org.jboss.resteasy.core.ResourceMethodInvoker.lambda$invokeOnTarget$0(ResourceMethodInvoker.java:370)
at org.jboss.resteasy.core.interception.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:356)
at org.jboss.resteasy.core.ResourceMethodInvoker.invokeOnTarget(ResourceMethodInvoker.java:372)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:344)
at org.jboss.resteasy.core.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:317)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:440)
at org.jboss.resteasy.core.AsynchronousDispatcher.invoke(AsynchronousDispatcher.java:312)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$invoke$4(SynchronousDispatcher.java:229)
at org.jboss.resteasy.core.SynchronousDispatcher.lambda$preprocess$0(SynchronousDispatcher.java:135)
at org.jboss.resteasy.core.interception.PreMatchContainerRequestContext.filter(PreMatchContainerRequestContext.java:356)
at org.jboss.resteasy.core.SynchronousDispatcher.preprocess(SynchronousDispatcher.java:138)
at org.jboss.resteasy.core.SynchronousDispatcher.invoke(SynchronousDispatcher.java:215)
at org.jboss.resteasy.plugins.server.servlet.ServletContainerDispatcher.service(ServletContainerDispatcher.java:227)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:56)
at org.jboss.resteasy.plugins.server.servlet.HttpServletDispatcher.service(HttpServletDispatcher.java:51)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder$NotAsync.service(ServletHolder.java:1450)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:191)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.webapp.logging.ContextLogHandler.handle(ContextLogHandler.java:64)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassNotFoundException: org.omg.CORBA.portable.IDLEntity
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:538)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
... 102 common frames omitted
172.17.0.1 - - [21/11/07 15:52:42.738 GMT] "GET /data-security-profiles?tenantId=50187c3d-b72c-4f7a-9a34-b2abe35be868 HTTP/1.1" 500 64 "-" "PostmanRuntime/7.28.3" "localhost:8080" "-" "fa29ac1a-f394-44a5-abed-4055cf966362"
From your example classpath, I presume that you are not using a dependency management system in your build process.
Using one would eliminate the possibility that you mistyped your classpath or forgot a transitive dependency, both of which seem likely culprits here.
The most commonly used dependency management tools for Java are Gradle and Maven.
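For example, rather than hand-editing CLASSPATH, the same GlassFish CORBA jars can be declared as Maven dependencies. A sketch, assuming the org.glassfish.corba coordinates published on Maven Central:

<dependency>
    <groupId>org.glassfish.corba</groupId>
    <artifactId>glassfish-corba-omgapi</artifactId>
    <version>4.2.4</version>
</dependency>
<dependency>
    <groupId>org.glassfish.corba</groupId>
    <artifactId>glassfish-corba-orb</artifactId>
    <version>4.2.4</version>
</dependency>

Maven then resolves transitive dependencies such as the pfl-* jars for you, which is exactly the class of mistake a manual classpath invites.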
Whenever I try to read a Spark dataset using PySpark and convert it to a pandas DataFrame for modeling, I get the error java.io.StreamCorruptedException: invalid stream header: 204356EC at the toPandas() step.
I am not a Java coder (hence PySpark), so these errors can be pretty cryptic to me. I tried the following things, but I still have this issue:
Made sure my Spark and PySpark versions matched as suggested here: java.io.StreamCorruptedException when importing a CSV to a Spark DataFrame
Reinstalled Spark using the methods suggested here: Complete Guide to Installing PySpark on MacOS
The logging in the test script below verifies the Spark and PySpark versions are aligned.
test.py:
import logging
from pyspark.sql import SparkSession
from pyspark import SparkContext
import findspark
findspark.init()
logging.basicConfig(
    format='%(asctime)s %(levelname)-8s %(message)s',
    level=logging.INFO,
    datefmt='%Y-%m-%d %H:%M:%S')
sc = SparkContext('local[*]', 'test')
spark = SparkSession(sc)
logging.info('Spark location: {}'.format(findspark.find()))
logging.info('PySpark version: {}'.format(spark.sparkContext.version))
logging.info('Reading spark input dataframe')
test_df = spark.read.csv('./data', header=True, sep='|', inferSchema=True)
logging.info('Converting spark DF to pandas DF')
pandas_df = test_df.toPandas()
logging.info('DF record count: {}'.format(len(pandas_df)))
sc.stop()
Output:
$ python ./test.py
21/05/13 11:54:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
2021-05-13 11:54:34 INFO Spark location: /Users/username/server/spark-3.1.1-bin-hadoop2.7
2021-05-13 11:54:34 INFO PySpark version: 3.1.1
2021-05-13 11:54:34 INFO Reading spark input dataframe
2021-05-13 11:54:42 INFO Converting spark DF to pandas DF
21/05/13 11:54:42 WARN package: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.
21/05/13 11:54:45 ERROR TaskResultGetter: Exception while getting task result
java.io.StreamCorruptedException: invalid stream header: 204356EC
at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:936)
at java.io.ObjectInputStream.<init>(ObjectInputStream.java:394)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.<init>(JavaSerializer.scala:64)
at org.apache.spark.serializer.JavaDeserializationStream.<init>(JavaSerializer.scala:64)
at org.apache.spark.serializer.JavaSerializerInstance.deserializeStream(JavaSerializer.scala:123)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.scheduler.TaskResultGetter$$anon$3.$anonfun$run$1(TaskResultGetter.scala:97)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1996)
at org.apache.spark.scheduler.TaskResultGetter$$anon$3.run(TaskResultGetter.scala:63)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Traceback (most recent call last):
  File "./test.py", line 23, in <module>
    pandas_df = test_df.toPandas()
  File "/Users/username/server/spark-3.1.1-bin-hadoop2.7/python/pyspark/sql/pandas/conversion.py", line 141, in toPandas
    pdf = pd.DataFrame.from_records(self.collect(), columns=self.columns)
  File "/Users/username/server/spark-3.1.1-bin-hadoop2.7/python/pyspark/sql/dataframe.py", line 677, in collect
    sock_info = self._jdf.collectToPython()
  File "/Users/username/server/spark-3.1.1-bin-hadoop2.7/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1304, in __call__
  File "/Users/username/server/spark-3.1.1-bin-hadoop2.7/python/pyspark/sql/utils.py", line 111, in deco
    return f(*a, **kw)
  File "/Users/username/server/spark-3.1.1-bin-hadoop2.7/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 326, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o31.collectToPython.
: org.apache.spark.SparkException: Job aborted due to stage failure: Exception while getting task result: java.io.StreamCorruptedException: invalid stream header: 204356EC
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2253)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2202)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2201)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2201)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1078)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1078)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1078)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2440)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2382)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2371)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2202)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2223)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2242)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267)
at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:390)
at org.apache.spark.sql.Dataset.$anonfun$collectToPython$1(Dataset.scala:3519)
at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3687)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3685)
at org.apache.spark.sql.Dataset.collectToPython(Dataset.scala:3516)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
The issue was resolved for me by ensuring that the serialisation option (registered in configuration under spark.serializer) was not incompatible with pyarrow (typically used during the conversion between pandas and PySpark, and vice versa, if you have it enabled).
The fix was to remove the often-recommended spark.serializer: org.apache.spark.serializer.KryoSerializer from the configuration and rely instead on the potentially slower default.
For context, our set-up was an ML version of the Databricks Spark cluster (v7.3).
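As a minimal sketch, assuming the serializer was being set while building the session in code (rather than in spark-defaults.conf or the Databricks cluster config), the change amounts to dropping the override:

from pyspark.sql import SparkSession

# Before (problematic with toPandas() in our set-up):
# spark = (SparkSession.builder
#          .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
#          .getOrCreate())

# After: omit the override so Spark falls back to its default serializer.
spark = SparkSession.builder.appName("toPandas-test").getOrCreate()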
I had this exception with the Spark Thrift server.
The driver version and the cluster version were different.
In my case I deleted the following setting, so that the driver's version is used across the whole cluster:
spark.yarn.archive=hdfs:///spark/3.1.1.zip
I'm using Apache Calcite in my project for CSV, Excel, and other database management.
It works when I execute it through the main method, but it gives an error when executing through a web service:
Caused by: java.lang.RuntimeException: org.codehaus.commons.compiler.CompileException: Line 1, Column 0: package org.apache.calcite.rel.metadata does not exist (compiler.err.doesnt.exist)
at com.google.common.base.Throwables.propagate(Throwables.java:160)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.load3(JaninoRelMetadataProvider.java:361)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.access$000(JaninoRelMetadataProvider.java:94)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider$1.load(JaninoRelMetadataProvider.java:113)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider$1.load(JaninoRelMetadataProvider.java:110)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3527)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2319)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2282)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2197)
at com.google.common.cache.LocalCache.get(LocalCache.java:3937)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3941)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4824)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:448)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:460)
at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:186)
at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:484)
at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:207)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:117)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:115)
at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:238)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:113)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:103)
at org.apache.calcite.rel.core.RelFactories$ProjectFactoryImpl.createProject(RelFactories.java:120)
at org.apache.calcite.tools.RelBuilder.project(RelBuilder.java:853)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2881)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2839)
at org.apache.calcite.plan.RelOptUtil.createProject(RelOptUtil.java:2783)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectList(SqlToRelConverter.java:3495)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:665)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:622)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:2852)
at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:556)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:229)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:193)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare2_(CalcitePrepareImpl.java:733)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare_(CalcitePrepareImpl.java:597)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepareSql(CalcitePrepareImpl.java:567)
at org.apache.calcite.jdbc.CalciteConnectionImpl.parseQuery(CalciteConnectionImpl.java:215)
at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:594)
at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:613)
at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:139)
... 44 more
I think you're hitting CALCITE-1461, which will be fixed in Calcite 1.11. For now, disable shading.
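Once Calcite 1.11 is available, the upgrade should just be a version bump; a sketch assuming Maven and the calcite-core artifact:

<dependency>
    <groupId>org.apache.calcite</groupId>
    <artifactId>calcite-core</artifactId>
    <version>1.11.0</version>
</dependency>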
We are running a bunch of operations: SELECT -> store the row keys into a collection -> split the collection across worker threads -> each thread creates its own connection using Phoenix JDBC -> performs a SELECT and then, depending on the result, UPSERTs into a different Phoenix table.
I am using an ExecutorService with a fixed thread pool of 4, and I am seeing exceptions like the ones below.
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:538)
at org.apache.phoenix.iterate.ConcatResultIterator.getIterators(ConcatResultIterator.java:50)
at org.apache.phoenix.iterate.ConcatResultIterator.currentIterator(ConcatResultIterator.java:97)
at org.apache.phoenix.iterate.ConcatResultIterator.next(ConcatResultIterator.java:117)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:764)
at com.vonage.test.PopulateStagingGWCDRWorker.run(MyCode.java:74)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:534)
... 8 more
Caused by: org.apache.phoenix.exception.PhoenixIOException: The system cannot find the path specified
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:122)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:73)
at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:67)
at org.apache.phoenix.iterate.ChunkedResultIterator.<init>(ChunkedResultIterator.java:92)
at org.apache.phoenix.iterate.ChunkedResultIterator$ChunkedResultIteratorFactory.newIterator(ChunkedResultIterator.java:72)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:92)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:83)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
Caused by: java.io.IOException: The system cannot find the path specified
at java.io.WinNTFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.commons.io.output.DeferredFileOutputStream.thresholdReached(DeferredFileOutputStream.java:176)
at org.apache.phoenix.iterate.SpoolingResultIterator$1.thresholdReached(SpoolingResultIterator.java:98)
at org.apache.commons.io.output.ThresholdingOutputStream.checkThreshold(ThresholdingOutputStream.java:224)
at org.apache.commons.io.output.ThresholdingOutputStream.write(ThresholdingOutputStream.java:92)
at java.io.DataOutputStream.writeByte(DataOutputStream.java:153)
at org.apache.hadoop.io.WritableUtils.writeVLong(WritableUtils.java:273)
at org.apache.hadoop.io.WritableUtils.writeVInt(WritableUtils.java:253)
at org.apache.phoenix.util.TupleUtil.write(TupleUtil.java:146)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:107)
... 10 more
But if I use a pool size of 2 or less, it works fine. I was wondering if there is a client-side property that can be changed?
In my case I solved this by using the below dependency in pom.xml:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-protocol</artifactId>
    <version>1.1.11</version>
</dependency>
Just to update you, I am on HBase version 1.1 and Phoenix 4.7.
Setting phoenix.spool.directory in hbase-site.xml fixes this. Thanks.
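For reference, a sketch of the client-side hbase-site.xml entry; the directory here is an example and must exist and be writable on the client machine (the stack trace shows a Windows client, so use a Windows path there):

<property>
    <name>phoenix.spool.directory</name>
    <value>/tmp/phoenix-spool</value>
</property>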