How to open an .xls file with macros using Apache POI? (Java)

The problem is that I can't open a workbook from an InputStream containing an .xls file.
When I try to open it with WorkbookFactory.create(inputStream), I get:
Exception in thread "main" org.apache.poi.hssf.record.RecordFormatException: Not enough data (0) to read requested (1) bytes
at org.apache.poi.hssf.record.RecordInputStream.checkRecordPosition(RecordInputStream.java:216)
at org.apache.poi.hssf.record.RecordInputStream.readByte(RecordInputStream.java:224)
at org.apache.poi.hssf.record.RecordInputStream.readUByte(RecordInputStream.java:260)
at org.apache.poi.hssf.record.DConRefRecord.<init>(DConRefRecord.java:174)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.poi.hssf.record.RecordFactory$ReflectionConstructorRecordCreator.create(RecordFactory.java:87)
at org.apache.poi.hssf.record.RecordFactory.createSingleRecord(RecordFactory.java:338)
at org.apache.poi.hssf.record.RecordFactoryInputStream.readNextRecord(RecordFactoryInputStream.java:310)
at org.apache.poi.hssf.record.RecordFactoryInputStream.nextRecord(RecordFactoryInputStream.java:276)
at org.apache.poi.hssf.record.RecordFactory.createRecords(RecordFactory.java:480)
at org.apache.poi.hssf.usermodel.HSSFWorkbook.<init>(HSSFWorkbook.java:326)
at org.apache.poi.ss.usermodel.WorkbookFactory.create(WorkbookFactory.java:115)
at org.apache.poi.ss.usermodel.WorkbookFactory.create(WorkbookFactory.java:172)
at org.apache.poi.ss.usermodel.WorkbookFactory.create(WorkbookFactory.java:143)
at ru.efo.search.support.excel.WorkBookReader.init(WorkBookReader.java:67)
at ru.efo.search.support.excel.WorkBookReader.<init>(WorkBookReader.java:42)
at ru.efo.search.support.excel.WorkBookReader.main(WorkBookReader.java:126)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
But if the same file is in .xlsm format, it works fine, because WorkbookFactory then uses XSSFWorkbook instead of HSSFWorkbook. Unfortunately, I only have an InputStream in my application and I can't change the file format. Please help!
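For reference, a minimal sketch of the failing call, assuming the stream comes from a file on disk ("macros.xls" is a placeholder; in the real application the InputStream arrives from elsewhere):

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public class XlsOpenRepro {
    public static void main(String[] args) throws Exception {
        // "macros.xls" is a placeholder path; an .xls file with macros triggers the error
        try (InputStream in = new FileInputStream("macros.xls")) {
            Workbook wb = WorkbookFactory.create(in); // RecordFormatException is thrown here
            System.out.println(wb.getNumberOfSheets());
        }
    }
}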

Related

Need help on opening Interactive Trader Workstation

When I try to open Trader Workstation on macOS, I get a startup error.
What does this error message mean, and how do I resolve it?
The error message:
java.lang.IllegalArgumentException: Input byte[] should at least have 2 bytes for base64 bytes
at java.util.Base64$Decoder.outLength(Base64.java:659)
at java.util.Base64$Decoder.decode(Base64.java:525)
at java.util.Base64$Decoder.decode(Base64.java:549)
at twslaunch.jauthentication.aV.d(aV.java:210)
at twslaunch.jauthentication.aV.c(aV.java:183)
at twslaunch.jclient.login.l.a(l.java:528)
at java.util.Hashtable.forEach(Hashtable.java:879)
at twslaunch.jclient.login.l.a(l.java:528)
at twslaunch.jclient.login.l.a(l.java:408)
at jclient.Launcher.main(Launcher.java:189)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.exe4j.runtime.LauncherEngine.launch(LauncherEngine.java:85)
at com.install4j.runtime.launcher.MacLauncher.main(MacLauncher.java:52)
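For context on what the message means: java.util.Base64's decoder rejects any input shorter than two bytes, so the launcher appears to be decoding an empty or truncated stored value. A minimal sketch that reproduces the same message:

import java.util.Base64;

public class Base64Repro {
    public static void main(String[] args) {
        // A single input byte is below the 2-byte minimum a Base64 quantum
        // needs, so the JDK decoder throws:
        // java.lang.IllegalArgumentException: Input byte[] should at least
        // have 2 bytes for base64 bytes
        Base64.getDecoder().decode(new byte[] { 'A' });
    }
}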

RestAssured.get(url).statusCode causes java.lang.OutOfMemoryError: Java heap space

I'm using RestAssured.get(url).statusCode() to check whether a URL is working and responds successfully (code 200) or not. For some URLs (like url1) it works properly, but for others (such as url2) it causes the following error:
String url1 = "http://www.bbsys.com/playersdemo.exe";
String url2 = "http://www.ezhomeinspectionsoftware.com/filestorage/EZ-Setup.exe";
Both of these URLs result in downloading a file, but the second one downloads a much bigger file, so that is probably why it causes the out-of-memory error. I need to know how to handle cases like this, since I need to get a response from a large number of different URLs. Do you recommend using another library, or am I using this one in the wrong way?
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3236)
at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
at io.restassured.internal.util.IOUtils.toByteArray(IOUtils.java:31)
at io.restassured.internal.http.GZIPEncoding$GZIPDecompressingEntity.getContent(GZIPEncoding.java:69)
at org.apache.http.conn.BasicManagedEntity.getContent(BasicManagedEntity.java:85)
at io.restassured.internal.http.HTTPBuilder.parseResponse(HTTPBuilder.java:545)
at io.restassured.internal.RequestSpecificationImpl$RestAssuredHttpBuilder.super$2$parseResponse(RequestSpecificationImpl.groovy)
at sun.reflect.GeneratedMethodAccessor108.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1215)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:132)
at io.restassured.internal.RequestSpecificationImpl$RestAssuredHttpBuilder.parseResponse(RequestSpecificationImpl.groovy:2137)
at sun.reflect.GeneratedMethodAccessor107.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
at io.restassured.internal.RequestSpecificationImpl$RestAssuredHttpBuilder.doRequest(RequestSpecificationImpl.groovy:2073)
at io.restassured.internal.http.HTTPBuilder.doRequest(HTTPBuilder.java:494)
at io.restassured.internal.http.HTTPBuilder.request(HTTPBuilder.java:451)
at io.restassured.internal.http.HTTPBuilder$request$2.call(Unknown Source)
at io.restassured.internal.RequestSpecificationImpl.sendHttpRequest(RequestSpecificationImpl.groovy:1450)
at sun.reflect.GeneratedMethodAccessor96.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
Do you actually want to download the files, or do you only want to check whether the URL is alive?
If not, take a look at the head method (https://static.javadoc.io/io.rest-assured/rest-assured/3.0.7/io/restassured/RestAssured.html#head-java.lang.String-java.util.Map-) to send a HEAD request without triggering the download.
Otherwise your only option is to provide sufficient memory, as @Roberto suggested.
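A minimal sketch of the HEAD-based check (treating status 200 as "alive"; note that some servers answer HEAD differently than GET):

import io.restassured.RestAssured;

public class UrlChecker {
    // HEAD returns only the status line and headers, so the body of a
    // large file is never buffered into memory.
    static boolean isAlive(String url) {
        return RestAssured.head(url).statusCode() == 200;
    }

    public static void main(String[] args) {
        System.out.println(isAlive("http://www.bbsys.com/playersdemo.exe"));
    }
}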

Aerospike EOFException

I have a four-node cluster running Aerospike 3.8.4 with Java client 3.2.2. When I try to save an object, I get this error:
error code 0 for key test:docs2:80000001:cd60b46ba665c24e646f74053de2846b2a17f7d0
com.aerospike.client.AerospikeException: java.io.EOFException
at com.aerospike.client.command.SyncCommand.execute(SyncCommand.java:95)
at com.aerospike.client.AerospikeClient.put(AerospikeClient.java:339)
at $line58.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$.<init>(<console>:41)
at $line58.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$.<clinit>(<console>)
at $line58.$eval$.$print$lzycompute(<console>:7)
at $line58.$eval$.$print(<console>:6)
at $line58.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:784)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1039)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:636)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:635)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:635)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:563)
at scala.tools.nsc.interpreter.ILoop.reallyInterpret$1(ILoop.scala:802)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:836)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:694)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:424)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:925)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
at scala.tools.nsc.interpreter.ILoop.main(ILoop.scala:936)
at xsbt.ConsoleInterface.run(ConsoleInterface.scala:69)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:102)
at sbt.compiler.AnalyzingCompiler.console(AnalyzingCompiler.scala:77)
at sbt.Console.sbt$Console$$console0$1(Console.scala:23)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply$mcV$sp(Console.scala:24)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:24)
at sbt.Console$$anonfun$apply$2$$anonfun$apply$1.apply(Console.scala:24)
at sbt.Logger$$anon$4.apply(Logger.scala:90)
at sbt.TrapExit$App.run(TrapExit.scala:244)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException
at com.aerospike.client.cluster.Connection.readFully(Connection.java:100)
at com.aerospike.client.command.WriteCommand.parseResult(WriteCommand.java:67)
at com.aerospike.client.command.SyncCommand.execute(SyncCommand.java:57)
... 47 more
Why is this happening? One thing I noticed is that even with these exceptions, the objects are sometimes actually written to the KVS.
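For reference, a minimal sketch of the kind of write that fails; the namespace and set are taken from the error key ("test" / "docs2"), while the host, user key, and bin contents are placeholders:

import com.aerospike.client.AerospikeClient;
import com.aerospike.client.Bin;
import com.aerospike.client.Key;

public class PutRepro {
    public static void main(String[] args) {
        // Host and port are placeholders for one of the four cluster nodes.
        AerospikeClient client = new AerospikeClient("127.0.0.1", 3000);
        try {
            Key key = new Key("test", "docs2", "some-doc-id");  // placeholder user key
            client.put(null, key, new Bin("doc", "payload"));   // null = default write policy
        } finally {
            client.close();
        }
    }
}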

Write a file on hdfs with permissions enabled

I'm trying to write a file to HDFS using Java (v1.8) with permissions enabled.
As the Hadoop instance I used the ready-made Docker image: https://hub.docker.com/r/sequenceiq/hadoop-docker/
I followed "Write a file in hdfs with Java" to do something like the following:
Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://127.0.0.1:9000/user/root");
configuration.set("hadoop.job.ugi", "root");
FileSystem hdfs = FileSystem.get(configuration);
Path file = new Path("hdfs://127.0.0.1:9000/user/root/test.txt");
if (hdfs.exists(file)) {
    // hdfs.delete(file, true);
    System.out.println("File exists");
}
OutputStream os = hdfs.create(file,
    new Progressable() {
        public void progress() {
            System.out.println("");
        }
    });
BufferedWriter br = new BufferedWriter(new OutputStreamWriter(os, "UTF-8"));
br.write("Hello World");
br.close();
hdfs.close();
I get the following exception:
org.apache.hadoop.security.AccessControlException: Permission denied: user=xxx, access=WRITE, inode="/user/root/text.txt":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2494)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2429)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2312)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:622)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1628)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:802)
at com.finscience.Main.main(Main.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=stefano, access=WRITE, inode="/user/root/capacity-scheduler.xml":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2494)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2429)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2312)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:622)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at org.apache.hadoop.ipc.Client.call(Client.java:1476)
at org.apache.hadoop.ipc.Client.call(Client.java:1407)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy9.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy10.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
... 15 more
How can I write a file to HDFS without disabling permissions?
HDFS is a POSIX-like file system; access to directories and files is restricted by ACLs. To be able to write files, you have to deal with those ACLs.
If you are not familiar with the concept, check out the following link: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/HdfsPermissionsGuide.html
In your specific case, it seems that you want to write into the root user's home directory.
You have three options:
* Execute the application as the root user (see the sketch below)
* Add the service user under which the Java application runs to the root user's group
* Change the owner of that directory (chown) or its permissions (chmod)
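A sketch of the first option without switching OS accounts, assuming the cluster uses simple (non-Kerberos) authentication: wrap the HDFS calls in a remote-user context so the NameNode checks permissions against "root" instead of the local login user.

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class WriteAsRoot {
    public static void main(String[] args) throws Exception {
        // With simple authentication the NameNode trusts the user name the
        // client reports, so acting as "root" passes the permission check.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("root");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://127.0.0.1:9000");
            try (FileSystem hdfs = FileSystem.get(conf)) {
                hdfs.create(new Path("/user/root/test.txt")).close();
            }
            return null;
        });
    }
}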

ParameterizedAssertionError using EasyTest when test case fails

I wanted to try EasyTest to get input parameters from a CSV file, and found the following nicely written example in a blog post:
http://gpcmol.blogspot.com/2013/06/easytest-unit-testing-with-externalized.html
If I follow the example, it runs beautifully, giving me a nice PDF output as advertised.
However, if I cause a test case failure by changing the last line of input from ",9,-12" to ",9,-13", I no longer get the PDF output and get two failures rather than one.
The first failure is the correct assertion that the test case fails. The second failure is the following exception:
<testcase name="classMethod" classname="TransformCelciusTest" time="0.0">
<failure message="org.junit.experimental.theories.internal.ParameterizedAssertionError: testToCelsiusConverter(TestInfo [testClass=org.junit.runners.model.TestClass@90bb3e6, dataLoader=org.easetech.easytest.loader.CSVDataLoader@5f4fc5ad, filePaths=[data/temperatureConversionData.csv], methodName=testToCelsiusConverter])" type="org.junit.experimental.theories.internal.ParameterizedAssertionError">org.junit.experimental.theories.internal.ParameterizedAssertionError: testToCelsiusConverter(TestInfo [testClass=org.junit.runners.model.TestClass@90bb3e6, dataLoader=org.easetech.easytest.loader.CSVDataLoader@5f4fc5ad, filePaths=[data/temperatureConversionData.csv], methodName=testToCelsiusConverter])
at org.easetech.easytest.util.RunAftersWithOutputData.writeData(RunAftersWithOutputData.java:157)
at org.easetech.easytest.util.RunAftersWithOutputData.evaluate(RunAftersWithOutputData.java:133)
at org.junit.runners.ParentRunner.run(ParentRunner.java:300)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.runTestClass(JUnitTestClassExecuter.java:86)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecuter.execute(JUnitTestClassExecuter.java:49)
at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassProcessor.processTestClass(JUnitTestClassProcessor.java:69)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.messaging.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:32)
at org.gradle.messaging.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:93)
at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:105)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.messaging.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.messaging.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:355)
at org.gradle.internal.concurrent.DefaultExecutorFactory$StoppableExecutorImpl$1.run(DefaultExecutorFactory.java:64)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
at org.easetech.easytest.loader.CSVDataLoader.writeDataToCSV(CSVDataLoader.java:364)
at org.easetech.easytest.loader.CSVDataLoader.writeData(CSVDataLoader.java:180)
at org.easetech.easytest.util.RunAftersWithOutputData.writeData(RunAftersWithOutputData.java:154)
... 27 more
Caused by: java.lang.NullPointerException
at org.easetech.easytest.loader.CSVDataLoader.writeOutputData(CSVDataLoader.java:382)
at org.easetech.easytest.loader.CSVDataLoader.writeDataToCSV(CSVDataLoader.java:347)
... 29 more
</failure>
</testcase>
Does anybody understand how to modify the example so that it completes without the ParameterizedAssertionError and the test case failure can be properly reported in the output (PDF)?
I think this is related:
JUnit @Theory: is there a way to throw a meaningful exception?
I tested the scenario with the EasyTest Core 1.3.1 library, and it is indeed an issue (in fact a bug) in EasyTest 1.3.1. Specifically, there is a NullPointerException in CSVDataLoader because it expects the test duration value, which is not present because of the test failure. I have to see what the best solution for this problem is; I will keep you posted. In the meantime, you could try the Excel or XML data loader. Or, if you want a quick fix, you can copy CSVDataLoader and override line 364 so that it checks whether DURATION is present and only then calls toString on it. Then you can use this loader as a custom loader in the @DataLoader annotation.
Thanks,
Anuj Kumar
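A hypothetical sketch of the guard described above; the map shape and the "DURATION" key are illustrative, not EasyTest's actual internals:

import java.util.Map;

public class SafeDurationCell {
    // Only call toString() on the duration when the test actually produced
    // one; a failed test leaves it null, which is what triggers the
    // NullPointerException in CSVDataLoader's output-writing code.
    static String durationCell(Map<String, Object> outputRow) {
        Object duration = outputRow.get("DURATION");
        return duration != null ? duration.toString() : "";
    }
}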
