Update:
I have written the following updated code after input from Scala experts here.
The code compiles, but on "run" it throws an IllegalStateException; the error stack trace is posted after the code listing.
import java.io.IOException
import java.nio.file.FileSystems
import java.nio.file.FileVisitOption
import java.nio.file.FileVisitResult
import java.nio.file.FileVisitor
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import java.nio.file.attribute.BasicFileAttributes
import java.util.EnumSet
import java.nio.file.{DirectoryStream,DirectoryIteratorException}
import scala.collection.JavaConversions._
class TestFVis(val searchPath: Path) extends FileVisitor[Path] {
  //Here I provide implementations for postVisitDirectory,
  //preVisitDirectory, visitFile, and visitFileFailed
}

object Main {
  def main(args: Array[String]) {
    val searchFileOrFolder: Path = Paths.get("C://my_dir")
    println("The Path object is: " + searchFileOrFolder)
    var testFileVisitorTop = new TestFVis(searchFileOrFolder)
    println("Our top level FileVisitor is: " + testFileVisitorTop)
    val opts = EnumSet.of(FileVisitOption.FOLLOW_LINKS)
    val rootDirsIterable: Iterable[Path] = FileSystems.getDefault.getRootDirectories //returns the default filesystem
    // and then returns an Iterable[Path] to iterate over the paths of the root directories
    var dirStream: Option[DirectoryStream[Path]] = None
    for (rootDir <- rootDirsIterable) {
      println("in the Outer For")
      dirStream = Some(Files.newDirectoryStream(searchFileOrFolder))
      def dstream = dirStream.get
      val streamIter = dstream.iterator().filter((path) => {
        Files.isRegularFile(path)
      })
      for (dirStreamUnwrapped <- dirStream; (filePath: Path) <- dirStreamUnwrapped) {
        //for( (filePath: DirectoryStream[Path]) <- dirStream) {
        val tempPath = Files.walkFileTree(filePath, testFileVisitorTop)
        //val tempPath = Files.walkFileTree(fileOrDir,opts,Integer.MAX_VALUE,testFileVisitorTop)
        println("current path is: " + tempPath)
        if (!testFileVisitorTop.found) {
          println("The file or folder " + searchFileOrFolder + " was not found!")
        }
      }
    }
  }
}
However, for historical context, here is the compile error I got at first:
[error] found : java.nio.file.Path => Unit
[error] required: java.nio.file.DirectoryStream[java.nio.file.Path] => ?
[error] for((filePath:Path) <- dirStream) {
--
After changing the code, it compiles with no errors, but I get an IllegalStateException on sbt 'run':
> run
ur top level FileVisitor is C:\my_dir
Our top level FileVisitor is: com.me.ds.TestFileVisitor#5564baf6
in the Outer For
[error] (run-main-0) java.lang.IllegalStateException: Iterator already obtained
java.lang.IllegalStateException: Iterator already obtained
at sun.nio.fs.WindowsDirectoryStream.iterator(WindowsDirectoryStream.java:117)
at scala.collection.convert.Wrappers$JIterableWrapper.iterator(Wrappers.scala:54)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:777)
at com.me.ds.Main$$anonfun$main$1$$anonfun$apply$1.apply(SampleFileVis.scala:76)
at com.me.ds.Main$$anonfun$main$1$$anonfun$apply$1.apply(AFileVisitor.scala:76)
at scala.Option.foreach(Option.scala:256)
at **com.me.ds.Main$$anonfun$main$1.apply(SampleFileVisitor.scala:76)**
at com.me.ds.Main$$anonfun$main$1.apply(AFileVisitor.scala:68)
at scala.collection.Iterator$class.foreach(Iterator.scala:743)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at com.me.ds.Main$.main(SampleFileVis.scala:68)
at com.me.ds.Main.main(SampleFileVis.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
[trace] Stack trace suppressed: run last compile:run for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 0 s, completed Mar 24, 2015 7:32:38 AM
>
I am also investigating the error on my own. If someone can point me in the right direction so that my code compiles and runs, I will have accomplished my goal here.
Thanks.
All you are doing with the first for is unwrapping the Option, which cannot be cast to a Path. You just need to take your unwrapped object and use it in the next part:
for (dirStreamUnwrapped <- dirStream;
     (filePath: Path) <- dirStreamUnwrapped) {
  val tempPath = Files.walkFileTree(filePath, testFVis)
}
- object needs to be lowercase.
- dirStream needs to be preceded by val.
- you have too many } at the end.
- you have new TestFileVisitor but you meant new TestFVis.
- dirStream has type: Some[DirectoryStream[Path]], which means that you need (filePath: DirectoryStream[Path]) <- dirStream
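For reference, here is a minimal sketch (not the poster's exact code) of iterating the directory entries while obtaining the DirectoryStream's iterator only once. A DirectoryStream allows a single call to iterator(), so getting one iterator for the filter (streamIter) and another for the for comprehension appears to be what triggers "Iterator already obtained". The object name and the println bodies below are illustrative only:

import java.nio.file.{DirectoryStream, Files, Path, Paths}

object DirStreamSketch {
  def main(args: Array[String]): Unit = {
    val searchFileOrFolder: Path = Paths.get("C://my_dir")
    val dirStream: DirectoryStream[Path] = Files.newDirectoryStream(searchFileOrFolder)
    try {
      val iter = dirStream.iterator() // the first and only call to iterator()
      while (iter.hasNext) {
        val filePath = iter.next()
        if (Files.isRegularFile(filePath)) {
          // the Files.walkFileTree call from the original code would go here
          println("regular file: " + filePath)
        }
      }
    } finally {
      dirStream.close() // a DirectoryStream must be closed when no longer needed
    }
  }
}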
Related
I have written a method that is supposed to read the key values, but it gives an error when running via a Jenkinsfile.
Here's the code (ScanMethods.groovy):
package api.Scan

public static ScanPipeline(String VERACODE_API_ID, String VERACODE_API_SECRET, String failOnSeverity, String BaseFile) {
    Map custom_block = [
        VERACODE_API_ID: VERACODE_API_ID,
        VERACODE_API_SECRET: VERACODE_API_SECRET,
        failOnSeverity: failOnSeverity,
        BaseFile: "results.json"
    ]
    Scan.scanload(custom_block)
}
Jenkinsfile
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Veracode Pipeline') {
            agent { label "default" }
            steps {
                script {
                    ScanMethods.scan-veracode-pipeline(VERACODE_API_ID, VERACODE_API_SECRET, failOnSeverity, BaseFile)
                }
            }
        }
    }
}
Error I have received:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/var/jenkins_home/jobs/Test_pipelines/jobs/ankur-test/jobs/pipeline-scan-cit-cdw-jenkinsfile/branches/feature-T23D-4021.247156/builds/19/libs/ASTLib/src/api/ScanMethods.groovy: 11: Apparent variable 'Scan' was found in a static scope but doesn't refer to a local variable, static field or class. Possible causes:
You attempted to reference a variable in the binding or an instance variable from a static context.
You misspelled a classname or statically imported field. Please check the spelling.
You attempted to use a method 'Scan' but left out brackets in a place not allowed by the grammar.
# line 11, column 5.
Scan.scanload(custom_block)
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:254)
at groovy.lang.GroovyClassLoader.recompile(GroovyClassLoader.java:761)
at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:718)
at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:787)
at java.lang.ClassLoader.loadClass(ClassLoader.java:405)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell$TimingLoader.loadClass(CpsGroovyShell.java:170)
at java.lang.ClassLoader.loadClass(ClassLoader.java:405)
at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:677)
at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:545)
at org.codehaus.groovy.control.ClassNodeResolver.tryAsLoaderClassOrScript(ClassNodeResolver.java:185)
Caused: BUG! exception in phase 'semantic analysis' in source unit 'WorkflowScript' The lookup for api.ScanMethods caused a failed compilaton. There should not have been any compilation from this call.
Please help me resolve this error so that I can read the values.
I have a Drools file that I'm using for business logic on a Tomcat 6 server running Java 1.7.0_131 inside a Docker container. When I run the code below:
package org.fosstrak.capturingapp
import org.fosstrak.capturingapp.util.Util;
import org.fosstrak.ale.xsd.ale.epcglobal.ECReport;
import org.fosstrak.ale.xsd.ale.epcglobal.ECReports;
import org.fosstrak.ale.xsd.ale.epcglobal.ECReportGroupListMember;
import org.fosstrak.ale.xsd.epcglobal.EPC;
import org.fosstrak.capturingapp.util.SimpleEPCISDocument;
import org.fosstrak.epcis.model.ActionType;
import java.util.List;
import java.util.LinkedList;
import function org.fosstrak.capturingapp.util.Util.extractEPC;
import function org.fosstrak.capturingapp.util.Util.extractReportMembers;
// the global collector for all the EPCIS documents for further processing.
global java.util.List epcisResults
function List warehouseReportHandler(List reports, String reportName){
    // List of ECReports
    List epcs = new LinkedList();
    for(Object rs : reports){
        if(rs instanceof ECReports){
            ECReports rsc = (ECReports) rs;
            for(ECReport report : rsc.getReports().getReport()){
                if(report.getReportName() == reportName){
                    ecps.addAll(extractEPC(Util.selectTag, report));
                }
            }
        }
    }
    return epcs;
}

rule "ADDITIONS Rule Tags from reader 'Reader_Warehouse_Shelve1' from the specName 'ECSpec'"
    dialect "java"
    when
        $reports : ECReports( reports != null)
        $epcs : LinkedList( size > 0 ) from collect (
            EPC() from warehouseReportHandler($reports, "additions")
        )
    then
        SimpleEPCISDocument simpleDocument = new SimpleEPCISDocument();
        simpleDocument.addObjectEvent(
            $epcs,
            ActionType.OBSERVE,
            "urn:epcglobal:cbv:bizstep:storing",
            "urn:epcglobal:cbv:disp:sellable_not_accessible",
            "urn:epc:id:sgln:76300544.00000.1",
            "urn:epc:id:sgln:76300544.00000.0"
        );
        System.out.println("\n=====================================================");
        System.out.println("Additions tags seen:");
        for (Object o : $epcs) System.out.println(((EPC)o).getValue());
        System.out.println("=====================================================\n");
        epcisResults.add(simpleDocument.getDocument());
end
I get the following error message:
21146 [Thread-2] DEBUG org.fosstrak.capturingapp.ECReportsHandler - Unable to build expression for 'from' : Failed to compile: 1 compilation error(s):
capture | - (1,45) unable to resolve method using strict-mode: java.lang.Object.warehouseReportHandler(org.fosstrak.ale.xsd.ale.epcglobal.ECReports, java.lang.String) 'warehouseReportHandler($reports, "additions")' : [Rule name='ADDITIONS Rule Tags from reader 'Reader_Warehouse_Shelve1' from the specName 'ECSpec'']
capture | Error importing : 'org.fosstrak.capturingapp.WarehouseReportHandler.warehouseReportHandler'[ warehouseReportHandler : Function Compilation error
capture | warehouseReportHandler (line:28): ecps cannot be resolved
capture | ][ warehouseReportHandler : Function Compilation error
capture | warehouseReportHandler (line:28): ecps cannot be resolved
capture | ]Rule Compilation error : [Rule name='ADDITIONS Rule Tags from reader 'Reader_Warehouse_Shelve1' from the specName 'ECSpec'']
capture | org/fosstrak/capturingapp/Rule_ADDITIONS_Rule_Tags_from_reader__Reader_Warehouse_Shelve1__from_the_specName__ECSpec__0.java (2:489) : The import org.fosstrak.capturingapp.WarehouseReportHandler cannot be resolved
I'm new to Drools. I am not sure if it's a syntax problem.
Update: I've removed the generics I had previously and tried to follow the examples given in the project, without success. (https://github.com/Fosstrak/fosstrak/tree/master/capturingapp/trunk/src/main/resources/drools)
Thank you everyone for your time
This line
function List warehouseReportHandler(List reports, String reportName){
defines the function with its parameter as a List. However, the invocation
$reports : ECReports( reports != null)
$epcs : LinkedList( size > 0 ) from collect (
EPC() from warehouseReportHandler($reports, "additions")
)
shows that a parameter of type ECReports is being passed to the method. Can you fix this and try?
I have a pre-trained model like Inception-v3. I want to remove the output layer and use the model for image recognition. Here is the example given by TensorFlow:
Just like the Python framework Keras, it has a method like model.layers.pop(). I tried to do this with the TensorFlow Java API. First I tried to use DL4J, but when I imported the Keras model, I got an error like this:
2017-06-15 21:15:43 INFO KerasInceptionV3Net:52 - Importing Inception model from data/inception-model.json
2017-06-15 21:15:43 INFO KerasInceptionV3Net:53 - Importing Weights model from data/inception_v3_complete
Exception in thread "main" java.lang.RuntimeException: Unknown exception.
at org.bytedeco.javacpp.hdf5$H5File.allocate(Native Method)
at org.bytedeco.javacpp.hdf5$H5File.<init>(hdf5.java:12713)
at org.deeplearning4j.nn.modelimport.keras.Hdf5Archive.<init>(Hdf5Archive.java:61)
at org.deeplearning4j.nn.modelimport.keras.KerasModel$ModelBuilder.weightsHdf5Filename(KerasModel.java:603)
at org.deeplearning4j.nn.modelimport.keras.KerasModelImport.importKerasModelAndWeights(KerasModelImport.java:176)
at edu.usc.irds.dl.dl4j.examples.KerasInceptionV3Net.<init>(KerasInceptionV3Net.java:55)
at edu.usc.irds.dl.dl4j.examples.KerasInceptionV3Net.main(KerasInceptionV3Net.java:108)
HDF5-DIAG: Error detected in HDF5 (1.10.0-patch1) thread 0:
#000: C:\autotest\HDF5110ReleaseRWDITAR\src\H5F.c line 579 in H5Fopen(): unable to open file
major: File accessibilty
minor: Unable to open file
#001: C:\autotest\HDF5110ReleaseRWDITAR\src\H5Fint.c line 1100 in H5F_open(): unable to open file: time = Thu Jun 15 21:15:44 2017,name = 'data/inception_v3_complete', tent_flags = 0
major: File accessibilty
minor: Unable to open file
#002: C:\autotest\HDF5110ReleaseRWDITAR\src\H5FD.c line 812 in H5FD_open(): open failed
major: Virtual File Layer
minor: Unable to initialize object
#003: C:\autotest\HDF5110ReleaseRWDITAR\src\H5FDsec2.c line 348 in H5FD_sec2_open(): unable to open file: name = 'data/inception_v3_complete', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0
major: File accessibilty
minor: Unable to open file
So I went back to TensorFlow. I'm going to modify the model in Keras and convert it to a TensorFlow graph. Here is my conversion script:
input_fld = './'
output_node_names_of_input_network = ["pred0"]
write_graph_def_ascii_flag = True
output_node_names_of_final_network = 'output_node'
output_graph_name = 'test2.pb'
from keras.models import load_model
import tensorflow as tf
import os
import os.path as osp
from keras.applications.inception_v3 import InceptionV3
from keras.applications.vgg16 import VGG16
from keras.models import Sequential
from keras.layers.core import Flatten, Dense, Dropout
from keras.layers.convolutional import Convolution2D, MaxPooling2D, ZeroPadding2D
from keras.optimizers import SGD
output_fld = input_fld + 'tensorflow_model/'
if not os.path.isdir(output_fld):
    os.mkdir(output_fld)
net_model = InceptionV3(weights='imagenet', include_top=True)
num_output = len(output_node_names_of_input_network)
pred = [None]*num_output
pred_node_names = [None]*num_output
for i in range(num_output):
    pred_node_names[i] = output_node_names_of_final_network+str(i)
    pred[i] = tf.identity(net_model.output[i], name=pred_node_names[i])
print('output nodes names are: ', pred_node_names)
from keras import backend as K
sess = K.get_session()
if write_graph_def_ascii_flag:
    f = 'only_the_graph_def.pb.ascii'
    tf.train.write_graph(sess.graph.as_graph_def(), output_fld, f, as_text=True)
    print('saved the graph definition in ascii format at: ', osp.join(output_fld, f))
from tensorflow.python.framework import graph_util
from tensorflow.python.framework import graph_io
constant_graph = graph_util.convert_variables_to_constants(sess, sess.graph.as_graph_def(), pred_node_names)
graph_io.write_graph(constant_graph, output_fld, output_graph_name, as_text=False)
print('saved the constant graph (ready for inference) at: ', osp.join(output_fld, output_graph_name))
I got the model as a .pb file, but when I plugged it into the TensorFlow LabelImage example, I got this error:
Exception in thread "main" java.lang.IllegalArgumentException: You must feed a value for placeholder tensor 'batch_normalization_1/keras_learning_phase' with dtype bool
[[Node: batch_normalization_1/keras_learning_phase = Placeholder[dtype=DT_BOOL, shape=<unknown>, _device="/job:localhost/replica:0/task:0/cpu:0"]()]]
at org.tensorflow.Session.run(Native Method)
at org.tensorflow.Session.access$100(Session.java:48)
at org.tensorflow.Session$Runner.runHelper(Session.java:285)
at org.tensorflow.Session$Runner.run(Session.java:235)
at com.dlut.cmh.sheng.LabelImage.executeInceptionGraph(LabelImage.java:98)
at com.dlut.cmh.sheng.LabelImage.main(LabelImage.java:51)
I don't know how to solve this. Can anyone help me, or suggest another way to do this?
The error message you get from the TensorFlow Java API:
Exception in thread "main" java.lang.IllegalArgumentException: You must feed a value for placeholder tensor 'batch_normalization_1/keras_learning_phase' with dtype bool
[[Node: batch_normalization_1/keras_learning_phase = Placeholder[dtype=DT_BOOL, shape=<unknown>, _device="/job:localhost/replica:0/task:0/cpu:0"]()]]
suggests that the model is constructed in a way that requires you to feed a boolean value for the tensor named batch_normalization_1/keras_learning_phase.
So, you'd have to include that in your call to run by changing:
try (Session s = new Session(g);
Tensor result = s.runner().feed("input",image).fetch("output").run().get(0)) {
to something like:
try (Session s = new Session(g);
Tensor learning_phase = Tensor.create(false);
Tensor result = s.runner().feed("input", image).feed("batch_normalization_1/keras_learning_phase", learning_phase).fetch("output").run().get(0)) {
The names of nodes you feed and fetch depend on the model, so it's possible that the names of the 'input' and 'output' nodes are different as well.
You might also want to consider using the TensorFlow SavedModel format (see also https://github.com/tensorflow/serving/issues/310#issuecomment-297015251)
Hope that helps
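To make that concrete, here is a minimal sketch (written in Scala, like the other examples in this post; the calls map one-to-one to the Java shown above) of loading the frozen graph and feeding the learning-phase tensor. The graph path, the image tensor, and the "input"/"output" node names are placeholders and depend on your model:

import java.nio.file.{Files, Paths}
import org.tensorflow.{Graph, Session, Tensor}

object LabelImageSketch {
  def main(args: Array[String]): Unit = {
    val graphDef = Files.readAllBytes(Paths.get("tensorflow_model/test2.pb")) // .pb produced by the script above
    val g = new Graph()
    g.importGraphDef(graphDef)
    val image: Tensor = ??? // the preprocessed image tensor, built as in the LabelImage example
    val s = new Session(g)
    val learningPhase = Tensor.create(false) // DT_BOOL scalar for the Keras learning phase
    val result = s.runner()
      .feed("input", image) // placeholder name: use your model's real input node
      .feed("batch_normalization_1/keras_learning_phase", learningPhase)
      .fetch("output") // placeholder name: use your model's real output node
      .run()
      .get(0)
    println(result)
    // close result, learningPhase, image, s and g when finished
  }
}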
For a university project I have to use arules (an R package) from Java. I have successfully integrated R and Java using JRI. I did not understand how to get the output of "inspect(Groceries[1:1])". I have tried asString() and asStringArray(), but this gives me the following error:
Exception in thread "main" java.lang.NullPointerException
at TestR.main(TestR.java:11)
Also, how can I run summary(Groceries) from Java? How can I get the output of summary as a String array or a String?
R code:
>data(Groceries)
>inspect(Groceries[1:1])
>summary(Groceries)
Java code:
import org.rosuda.JRI.Rengine;
import org.rosuda.JRI.REXP;

public class TestR {
    public static void main(String[] args) {
        Rengine re = new Rengine(new String[]{"--no-save"}, false, null);
        re.eval("library(arules)");
        re.eval("data(Groceries)");
        REXP result = re.eval("inspect(Groceries[1:1])");
        System.out.println(result.asString());
    }
}
It appears that the inspect function in pkg:arules returns NULL. The output you see is a "side-effect". You can attempt to "capture output", but this is untested since I don't have experience with this integration across languages. Try instead:
REXP result = re.eval("capture.output( inspect(Groceries[1:1]) )");
In an R console session you will get:
library(arules)
data("Adult")
rules <- apriori(Adult)
val <- inspect(rules[1000])
> str(val)
NULL
> val.co <- capture.output(inspect(rules[1000]))
> val.co
[1] " lhs rhs support confidence lift"
[2] "1 {education=Some-college, "
[3] " sex=Male, "
[4] " capital-loss=None} => {native-country=United-States} 0.1208181 0.9256471 1.031449"
But I haven't tested this in a non-interactive session. May need to muck with the file argument to capture.output, ... or it may not work at all.
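Putting that together with the original program, a minimal sketch (shown in Scala, like the other examples in this post; the same JRI calls work identically from Java) that reads the captured lines back as a String array via asStringArray() might look like the following. It is untested, as noted above:

import org.rosuda.JRI.Rengine

object ArulesCaptureSketch {
  def main(args: Array[String]): Unit = {
    val re = new Rengine(Array("--no-save"), false, null)
    re.eval("library(arules)")
    re.eval("data(Groceries)")
    // capture.output() turns the printed side effect of inspect()/summary()
    // into an R character vector, which JRI returns as a String array
    val inspectLines: Array[String] =
      re.eval("capture.output(inspect(Groceries[1:1]))").asStringArray()
    val summaryLines: Array[String] =
      re.eval("capture.output(summary(Groceries))").asStringArray()
    inspectLines.foreach(println)
    summaryLines.foreach(println)
    re.end()
  }
}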
I'm attempting to use a unit test of mine for "load" testing on our browser. For various reasons, we have seen performance degradation on the browser side, because we heavily rely on the print dialog.
I have the following unit test working via ScalaTest:
class LoadPrePaidSpec extends FlatSpec with Matchers with Chrome with Eventually {

  implicit override val patienceConfig =
    PatienceConfig(timeout = scaled(Span(40, Seconds)), interval = scaled(Span(100, Millis)))

  def build(csvLine: String): TestCSVHolder = {
    val split = csvLine.split(",")
    TestCSVHolder(memberId = split(0), preSaleCode = split(1),
      prePaidCode = split(2), lastName = split(3), firstName = split(4), badgeName = split(5))
  }

  def memberHelper(member: TestCSVHolder): Unit = {
    //insert member id via prepaid code
    textField("member_id").value = member.prePaidCode
    //fire keyup event
    executeScript("var eventToFire=jQuery.Event(\"keyup\");eventToFire.keyCode=221;eventToFire.which=221;" +
      "$(\"#member_id\").trigger(eventToFire)")
    eventually {
      val eles = webDriver.findElements(By.xpath(s"//*[contains(@id, '${member.memberId}')]"))
      eles.get(0).getTagName
      //We remove the head element because it just says Prep For Print
      val tdEles = (eles.get(0).findElements(By.tagName("td")).toList.tail)
      tdEles(0).getText() should be(member.lastName)
      tdEles(1).getText() should be(member.firstName)
      tdEles(2).getText() should be(member.badgeName)
    }
  }

  "Scanning an ID" should "look up the member" in {
    val member = new TestCSVHolder("100001", "ABCD", "[-100001-ABCD]", "John", "Doe", "JohnDoe")
    go to (url)
    //login
    textField("user_name").value = "mrkaiser"
    webDriver.findElementById("credentials").sendKeys("somepassword")
    click on ("btnLogin")
    //click to pre-paid
    click on linkText("Pre-Paid")
    memberHelper(member)
    webDriver.quit()
  }
}
However, when I try to iterate through a list of elements with a foreach, passing each one to memberHelper, I get the following stack trace after about 5 elements:
The code passed to eventually never returned normally. Attempted 369 times over 40.110734904 seconds. Last failure message: Index: 0, Size: 0.
ScalaTestFailureLocation: LoadPrePaidSpec at (LoadPrePaidSpec.scala:43)
org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 369 times over 40.110734904 seconds. Last failure message: Index: 0, Size: 0.
at org.scalatest.concurrent.Eventually$class.tryTryAgain$1(Eventually.scala:420)
at org.scalatest.concurrent.Eventually$class.eventually(Eventually.scala:438)
at LoadPrePaidSpec.eventually(LoadPrePaidSpec.scala:17)
at LoadPrePaidSpec.memberHelper(LoadPrePaidSpec.scala:43)
at LoadPrePaidSpec$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(LoadPrePaidSpec.scala:70)
at LoadPrePaidSpec$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(LoadPrePaidSpec.scala:70)
at scala.collection.immutable.List.foreach(List.scala:383)
at LoadPrePaidSpec$$anonfun$1.apply$mcV$sp(LoadPrePaidSpec.scala:70)
at LoadPrePaidSpec$$anonfun$1.apply(LoadPrePaidSpec.scala:54)
at LoadPrePaidSpec$$anonfun$1.apply(LoadPrePaidSpec.scala:54)
at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:383)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:383)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
at org.scalatest.Suite$class.run(Suite.scala:1424)
at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
at scala.collection.immutable.List.foreach(List.scala:383)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
at org.scalatest.tools.Runner$.run(Runner.scala:883)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:138)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
at java.util.ArrayList.rangeCheck(ArrayList.java:653)
at java.util.ArrayList.get(ArrayList.java:429)
at LoadPrePaidSpec$$anonfun$memberHelper$1.apply$mcV$sp(LoadPrePaidSpec.scala:45)
at LoadPrePaidSpec$$anonfun$memberHelper$1.apply(LoadPrePaidSpec.scala:43)
at LoadPrePaidSpec$$anonfun$memberHelper$1.apply(LoadPrePaidSpec.scala:43)
at org.scalatest.concurrent.Eventually$class.makeAValiantAttempt$1(Eventually.scala:394)
at org.scalatest.concurrent.Eventually$class.tryTryAgain$1(Eventually.scala:408)
... 63 more
My end goal is to actually test something in the 20K range of elements from a file, but until I can get a small list like this working, I'm up a creek.
I'm using ChromeDriver and am on Scala 2.11.6, ScalaTest 2.2.0, and Selenium 2.35.0.
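For reference, the iteration described above would look roughly like this sketch inside the same spec class (the CSV file name "members.csv" and the test title are placeholders; build and memberHelper are the methods shown earlier):

"Scanning many IDs" should "look up each member" in {
  go to (url)
  // ...same login and navigation steps as in the single-member test...
  val csvLines: List[String] =
    scala.io.Source.fromFile("members.csv").getLines().toList
  csvLines.map(build).foreach(memberHelper)
  webDriver.quit()
}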