Selenium server is not starting for easyb project - java

[FAILURE: Could not contact Selenium Server; have you started it on 'localhost:4444' ? Read more at http://seleniumhq.org/projects/remote-control/not-started.html Connection refused]
Hi,
I am working on easyb and ran into the error above.
How do I start the Selenium RC server, and what is this problem about?
Thanks!

Well, you could add a Groovy script to [your-webapp]/scripts/_Events.groovy to start and stop Selenium.
(You would have to install the selenium-rc plugin first to have access to the _SeleniumConfig and _SeleniumServer scripts.)
includeTargets << new File("$seleniumRcPluginDir/scripts/_SeleniumConfig.groovy")
includeTargets << new File("$seleniumRcPluginDir/scripts/_SeleniumServer.groovy")

eventTestPhaseStart = { phase ->
    if (isAcceptance(phase)) {
        startSeleniumServer()
    }
}

eventTestPhaseEnd = { phase ->
    if (isAcceptance(phase)) {
        stopSeleniumServer()
    }
}

isAcceptance = { phase ->
    phase?.contains("acceptance")
}

You need to start the Selenium server before you can use the client instance.
So before you create your DefaultSelenium instance, start the server: create a RemoteControlConfiguration object (see its Javadoc), pass it to the SeleniumServer constructor, and then boot the server with serverInstance.boot().
Something like:
RemoteControlConfiguration rcc = new RemoteControlConfiguration();
// set whatever values you want your RC to start with: port, log output file, profile, etc.
SeleniumServer ss = new SeleniumServer(rcc);
ss.boot();
Make sure you shut it down when you are done with tests.
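To make the lifecycle concrete, here is a rough, self-contained sketch assuming the Selenium RC 1.x API (SeleniumServer, RemoteControlConfiguration, DefaultSelenium); the port, browser string and application URL are placeholders you would adapt to your setup:

import org.openqa.selenium.server.RemoteControlConfiguration;
import org.openqa.selenium.server.SeleniumServer;
import com.thoughtworks.selenium.DefaultSelenium;

public class SeleniumLifecycleSketch {
    public static void main(String[] args) throws Exception {
        RemoteControlConfiguration rcc = new RemoteControlConfiguration();
        rcc.setPort(4444); // must match the port the client expects

        SeleniumServer server = new SeleniumServer(rcc);
        server.boot(); // start the RC server before any client is created

        DefaultSelenium selenium =
                new DefaultSelenium("localhost", 4444, "*firefox", "http://localhost:8080/");
        try {
            selenium.start();
            // ... drive the browser / run your easyb scenarios here ...
        } finally {
            selenium.stop();  // close the browser session
            server.stop();    // shut the RC server down when the tests are done
        }
    }
}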

Related

Use of Java Smack 4.3.4 in a JUnit Testcase in Maven

I am working on a Java library with some services based on XMPP. For XMPP communication I use Smack version 4.3.4. Development has been problem-free so far, and I have also created some test routines that all run without errors. After migrating to a Maven project in order to generate a fat JAR, I wanted to convert the executable test cases into JUnit tests. Unexpectedly, an error occurs whose cause I cannot explain. As I said, the code runs outside of JUnit without any problems.
Below is the simplified test code (establishing a connection to the xmpp server):
@Test
public void connect()
{
    Builder builder = XMPPTCPConnectionConfiguration.builder();
    builder.setSecurityMode(SecurityMode.disabled);
    builder.setUsernameAndPassword("iec61850client", "iec61850client");
    builder.setPort(5222);
    builder.setSendPresence(true);
    try
    {
        builder.setXmppDomain("127.0.0.1");
        builder.setHostAddress(InetAddress.getByName("127.0.0.1"));
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
    XMPPTCPConnectionConfiguration config = builder.build();
    XMPPTCPConnection c = new XMPPTCPConnection(config);
    c.setReplyTimeout(5000);
    try
    {
        c.connect().login();
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
}
And here is the error message I get:
Exception in thread "Smack Reader (0)" java.lang.AssertionError
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader.parsePackets(XMPPTCPConnection.java:1154)
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader.access$1000(XMPPTCPConnection.java:1092)
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader$1.run(XMPPTCPConnection.java:1112)
In Smack it boils down to this 'assert' instruction:
assert (config.getXMPPServiceDomain().equals(reportedServerDomain));
Any idea what the problem might be or similar problems? I'm grateful for any help!
Thanks a lot,
Markus
If you look at the source code, you will find that reportedServerDomain is extracted from the server's stream open tag. In your case the XMPP domain reported by the server does not match the one you configured. This should usually not happen, so I assume it is related to the way you run the unit tests, or more precisely to the remote or mocked server used in those tests. If you enable Smack's debug output, you will see the stream open tag together with its 'from' attribute and value; compare that with the XMPP service domain configured in the ConnectionConfiguration.
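For reference, a minimal way to switch that debug output on, assuming Smack 4.x where SmackConfiguration.DEBUG is the global debug flag:

import org.jivesoftware.smack.SmackConfiguration;

public class SmackDebugSketch {
    public static void main(String[] args) {
        // Enable Smack's stream debugger for the whole JVM; set this before the
        // first connection is created so the server's stream open tag (and its
        // 'from' attribute) is printed to the console.
        SmackConfiguration.DEBUG = true;

        // ... then build the XMPPTCPConnectionConfiguration and connect as in the
        // question, and compare the logged 'from' value with the domain passed to
        // builder.setXmppDomain("127.0.0.1").
    }
}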

How to change server port in runtime with a spring boot application and spock testing

I am testing a Spring Boot application with Spock. In one of the test cases I need to mock or stub the calls to the auth server (using OAuth 2), so I am trying to redirect the requests to a dummy server and make the methods return a fixed token. I overwrite the port at runtime, but I still get an error because the dummy server is on a fixed port (read from application-test.yml). Is there a way to change this at runtime so that the server matches the random port the test is running on?
This is my setup function:
def setup() {
    omcService.soapClient = Stub(SOAPClient)
    String url = "http://localhost:${port}"
    nonRetryableExceptionProcessor.omsUrl = url
    omsService.omsUrl = url
    omsService.authUrl = "$url/oauth/token?scope=all"
    omsService = Spy(OmsService)
    producerTemplate.start()
}
When I debug this test, the properties are changed, but when the application performs a GET operation it always points to localhost:4321, which is not the random port picked by Spring.
You can inject the random port into your test.
For example, using @LocalManagementPort:
@LocalManagementPort
int port;
Or directly using @Value:
@Value("${local.server.port}")
int port;
But if the above doesn't work, then I believe this is your last resort:
int port = context.embeddedServletContainer.port
Having it injected, you can perform a GET to the server on that port.
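For comparison, here is a minimal Java sketch of the usual random-port pattern with Spring Boot's test support; the same annotations work on fields of a Spock specification. It assumes Spring Boot 2.x (on 1.x the @LocalServerPort annotation lives in org.springframework.boot.context.embedded):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.web.server.LocalServerPort;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class RandomPortInjectionSketch {

    // Injected with the random port the embedded server actually started on.
    @LocalServerPort
    int serverPort;

    // Equivalent injection via the property Spring Boot publishes for that port.
    @Value("${local.server.port}")
    int serverPortFromProperty;

    // Any stubbed URL (for example the dummy auth server) can then be built from
    // serverPort instead of a fixed value read from application-test.yml.
}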

Java remote debugging - how can I keep debugger listening?

I'm using IntelliJ IDEA to remote debug a Java CLI program with the debugger listening for connections.
This works fine for the first invocation, but the debugger stops listening after the CLI program disconnects. I want the debugger to keep listening since multiple CLI invocations will be made (in sequence, not in parallel) and only one of these will trigger the breakpoint I've set.
Here's my client debug config:
-agentlib:jdwp=transport=dt_socket,server=n,address=5005,suspend=y
Is it possible to keep the debugger listening?
Well, since your CLI program terminates, the debugger session also stops. If you still want to keep one debugging session across multiple runs of the CLI program, you can try the following:
Write a wrapper program that invokes your CLI program multiple times, and debug the wrapper instead of the CLI program itself.
Something like this:
public class Wrapper {
    public static void main(String[] args) {
        YourCLIProgram yp = new YourCLIProgram();

        // First invocation
        String[] arg1 = { }; // arguments required for your CLI program
        yp.main(arg1);

        // Second invocation
        String[] arg2 = { }; // arguments required for your CLI program
        yp.main(arg2);

        // Third invocation
        String[] arg3 = { }; // arguments required for your CLI program
        yp.main(arg3);

        // Fourth invocation
        String[] arg4 = { }; // arguments required for your CLI program
        yp.main(arg4);
    }
}
I hope it works.
It also depends on what you are trying to achieve.
If you just want to check which parameters are passed to your CLI, you can log them to a file or save whatever information you need in a database (or a file as well).
By the JPDA specification, a transport service may or may not support multiple connections.
In Eclipse, for example, it does not; I suppose the same is true for IDEA.
When setting up your run configuration, did you select the "Listen" Debugger mode? The command line arguments you show look like the normal "Attach" settings, whereas the arguments for "Listen" look like this:
-agentlib:jdwp=transport=dt_socket,server=n,address=yourhost.yourdomain:5005,suspend=y,onthrow=<FQ exception class name>,onuncaught=<y/n>
(Specifically, your arguments are missing the address that your application, the CLI program, should connect to IDEA at on start-up.)
I read a post that suggests the "onthrow" argument may not be necessary for general debugging, but I haven't tried it myself.
Try with suspend=n:
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005
On my local app (a Tomcat web app), even though I run on JDK 8, I still use the older way of doing it and it works fine (another thing you could try):
-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005

How to get the automatically defined port for a Spark Java Application?

In the API documentation for Java Spark (not Apache Spark), you can specify a port of 0 to have it automatically select a port. Great!
However, I cannot figure out how to get that port after the server is started. I can see it in the logs:
15:41:12.459 [Thread-2] INFO spark.webserver.JettySparkServer - >> Listening on 0.0.0.0:63134
But I need to be able to get to it programmatically, so that my integration tests are able to run reliably every time.
So how do I get that port?
I could find no way to get this information through the API, so I filed an issue on their GitHub.
I was able to get at it via an ugly pile of reflection:
/**
 * Meant to be called from a different thread, once the Spark app is running.
 * This is probably only going to be used during the integration testing process, not ever in prod!
 *
 * @return the port it's running on
 */
public static int awaitRunningPort() throws Exception {
    awaitInitialization();
    // I have to get the port via reflection, which is fugly, but the API doesn't exist :(
    // Since we'll only use this in testing, it's not going to kill us
    Object instance = getInstance();
    Class theClass = instance.getClass();
    Field serverField = theClass.getDeclaredField("server");
    serverField.setAccessible(true);
    Object oneLevelDeepServer = serverField.get(instance);
    Class jettyServerClass = oneLevelDeepServer.getClass();
    Field jettyServerField = jettyServerClass.getDeclaredField("server");
    jettyServerField.setAccessible(true);
    // Have to pull in the Jetty server classes to do this mess
    Server jettyServer = (Server) jettyServerField.get(oneLevelDeepServer);
    int acquiredPort = ((ServerConnector) jettyServer.getConnectors()[0]).getLocalPort();
    log.debug("Acquired port: {}", acquiredPort);
    return acquiredPort;
}
This works well for me in our integration tests, but I'm not using https, and it does reach about two levels deep into the API via reflection grabbing protected fields. I could not find any other way to do it. Would be quite happy to be proven wrong.
This will work on Spark 2.6.0:
public static int start(String keystoreFile, String keystorePw)
{
    secure(keystoreFile, keystorePw, null, null);
    port(0);
    staticFiles.location("/public");

    get(Path.CLOCK, ClockController.time);
    get(Path.CALENDAR, CalendarController.date);

    // This is the important line. It must be *after* creating the routes and *before* the call to port()
    awaitInitialization();

    return port();
}
Without the call to awaitInitialization(), port() would return 0.
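To show the same idea end to end without the HTTPS setup and routes from the snippet above, here is a bare-bones sketch assuming Spark 2.6.0 or later (all static methods come from spark.Spark):

import static spark.Spark.awaitInitialization;
import static spark.Spark.get;
import static spark.Spark.port;
import static spark.Spark.stop;

public class RandomPortSketch {
    public static void main(String[] args) {
        port(0);                       // 0 = let Jetty pick a free port
        get("/ping", (req, res) -> "pong");

        awaitInitialization();         // block until the embedded server is up
        int actualPort = port();       // now returns the real port, not 0
        System.out.println("Spark is listening on port " + actualPort);

        stop();                        // shut the embedded server down again
    }
}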

Pentaho - Execute a Kettle Transformation remotely using Java

I am executing my jobs/transformations using the Java API and I am able to do it correctly on my host.
Now I am looking for a way to execute the transformation on a remote host (where Carte is running). Please help me, or redirect me to the proper documentation where I can find the classes needed to accomplish this.
PDI Version - 5.0.1
Currently I am executing my job as below:
try {
    if (jobDetails.getGraphlocation() != null)
    {
        KettleEnvironment.init();
        JobMeta jobMeta = new JobMeta(jobDetails.getGraphlocation(), null);
        for (String s : jobDetails.getArguments())
        {
            String[] splitString = s.split("\\=");
            if (splitString.length == 2)
            {
                jobMeta.setParameterValue(splitString[0], splitString[1]);
            }
            else
                System.err.println("Parameter should be of the form - name=value");
        }
        Job job = new Job(null, jobMeta);
        job.setLogLevel(LogLevel.valueOf(jobDetails.getLoglevel().toString()));
        job.start();
        job.waitUntilFinished();
        if (job.getErrors() != 0) {
            System.out.println("Error encountered!");
        }
    }
The above code executes the job wherever I run it, but I want to execute it on a slave server by just passing the Carte username, password and server IP address.
You can do it through Spoon by registering the Carte server, or you can do it in a job by specifying the name and port of the Carte server in the actual job/transformation step, i.e. you can create a launcher job that just has Start, Job (pointing at the Carte server) and Success steps.
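If you would rather stay in the Java API, here is a rough outline of pushing the job to a Carte slave server. It is only a sketch assuming the PDI 5.x classes SlaveServer, JobExecutionConfiguration and Job.sendToSlaveServer; the host, port and credentials are placeholders, and the exact signatures should be checked against your PDI version:

import org.pentaho.di.cluster.SlaveServer;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobExecutionConfiguration;
import org.pentaho.di.job.JobMeta;

public class RemoteJobSketch {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();

        // Load the job definition exactly as in the local case.
        JobMeta jobMeta = new JobMeta("/path/to/job.kjb", null);

        // Describe the Carte instance (placeholder host, port and credentials).
        SlaveServer carte = new SlaveServer("remote-carte", "192.168.1.10", "8081", "cluster", "cluster");

        // Ask for remote instead of local execution.
        JobExecutionConfiguration config = new JobExecutionConfiguration();
        config.setRemoteServer(carte);
        config.setExecutingRemotely(true);
        config.setExecutingLocally(false);
        config.setLogLevel(LogLevel.BASIC);

        // Ship the job to Carte; the returned ID can be used to poll its status.
        String carteObjectId = Job.sendToSlaveServer(jobMeta, config, null, null);
        System.out.println("Job sent to Carte, object id: " + carteObjectId);
    }
}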
