I'm an absolute rookie here (Java, I mean). I've spent hours looking for a solution and I'm at my wits' end.
I want to create a string in a BeanShell Assertion, which is placed right above the HTTP Request.
In the BeanShell I wrote:
String docid = "abcd";
(In actuality I wish to concatenate a string with some variables.)
In the HTTP Request's send-parameters section I add ${docid}.
In the BeanShell Assertion's description section you can find the following:
vars - JMeterVariables - e.g. vars.get("VAR1"); vars.put("VAR2","value"); vars.putObject("OBJ1",new Object());
props - JMeterProperties (class java.util.Properties) - e.g. props.get("START.HMS"); props.put("PROP1","1234");
So to set a JMeter variable in BeanShell code (a BeanShell Assertion in your case), use the following:
String docid = "abcd";
vars.put("docid",docid);
or simply
vars.put("docid","abcd");
and then you can refer to it as ${docid}, as you've done in your HTTP Request.
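Since the goal is to concatenate a string with some variables, here is a minimal sketch of what the assertion code might look like (the variable name "prefix" is purely illustrative; JMeter injects the vars object into BeanShell automatically):

```java
// BeanShell Assertion body; `vars` is provided by JMeter at runtime.
String prefix = vars.get("prefix");              // a variable set earlier in the test plan (illustrative)
String docid = "abcd_" + prefix + "_" + System.currentTimeMillis();
vars.put("docid", docid);                        // now ${docid} is usable in the HTTP Request
```

Any JMeter variable retrieved with vars.get() comes back as a String, so plain + concatenation works directly.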
If you don't know Java well, you can use the BSF or JSR223 test elements instead and select JavaScript as the scripting language:
http://jmeter.apache.org/usermanual/component_reference.html#JSR223_Sampler
If you need to pass a value from one BeanShell sampler to another, you should use variables:
vars.put("a", "something");
In the other sampler, you would then have something like:
String otherSampler = vars.get("a");
As for debugging BeanShell samplers: it is not so easy. I suggest using the SampleResult object; for how to use it, see "Debugging Bean Shell Sampler".
I'm using closure-compiler, which is provided by Google. I have JavaScript in a string variable and need to compress that string using closure-compiler in Java.
I already tried the code from the following link: http://blog.bolinfest.com/2009/11/calling-closure-compiler-from-java.html
This is the code I used; the "source" variable holds the JavaScript:
Compiler compiler = new Compiler();
CompilerOptions options = new CompilerOptions();
// Simple optimizations are used here, but additional options could be set, too.
CompilationLevel.SIMPLE_OPTIMIZATIONS.setOptionsForCompilationLevel(options);
compiler.compile(source, options);
return compiler.toSource();
I get an error on the following line: compiler.compile(source, options);
The Compiler.compile() method does not take 2 parameters; it requires 3.
Have a look at this link; it shows the number and kinds of parameters required for the method you are calling.
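As a sketch of the 3-argument form: the raw JS string has to be wrapped in SourceFile objects, and an externs source is passed first (an empty externs file is used here as a placeholder; the file names are arbitrary labels used for error reporting):

```java
// From com.google.javascript.jscomp: compile(externs, input, options)
SourceFile externs = SourceFile.fromCode("externs.js", "");
SourceFile input = SourceFile.fromCode("input.js", source);
compiler.compile(externs, input, options);
return compiler.toSource();
```

If your code references browser globals under advanced optimizations, you would supply real externs instead of the empty placeholder.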
I am trying to use run time parameters with BigtableIO in Apache Beam to write to BigTable.
I have created a pipeline that reads from BigQuery and writes to Bigtable.
The pipeline works fine when I provide static parameters (using ConfigBigtableIO and ConfigBigtableConfiguration, following the example here: https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/blob/master/java/dataflow-connector-examples/src/main/java/com/google/cloud/bigtable/dataflow/example/HelloWorldWrite.java), but I get a compile error when trying to set up the pipeline with runtime parameters.
The options are set up with all parameters as runtime ValueProviders.
p.apply(BigQueryIO.readTableRows()
        .fromQuery(options.getBqQuery())
        .usingStandardSql())
 .apply(ParDo.of(new TransFormFn(options.getColumnFamily(), options.getRowKey(),
         options.getColumnKey(), options.getRowKeySuffix())))
 .apply(BigtableIO.write()
        .withProjectId(options.getBigtableProjectId())
        .withInstanceId(options.getBigtableInstanceId())
        .withTableId(options.getBigtableTableId()));
The compiler expects the argument of apply() to be an org.apache.beam.sdk.transforms.PTransform<…, OutputT>,
while BigtableIO.write() is returning a Write object.
Can you help with providing the correct syntax to fix this? Thanks.
Runtime parameters are meant to be used in Dataflow templates.
Are you trying to create a template and run the pipeline from it? If so, you would need the following steps:
Create an Options class that has the runtime parameters you need, e.g.
ValueProvider<String> tableId.
Pass these runtime parameters to the config object, e.g. withTableId(ValueProvider<String> tableId) =>
withTableId(options.getTableId())
Construct your template
Execute your pipeline using the template.
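The first two steps above can be sketched roughly as follows (the interface name MyOptions and the @Description text are illustrative; the getter names echo the question):

```java
// Options interface whose values are resolved at template execution time.
public interface MyOptions extends PipelineOptions {
    @Description("Bigtable table to write to")
    ValueProvider<String> getTableId();
    void setTableId(ValueProvider<String> value);
}

// Later, when constructing the pipeline: pass the ValueProvider
// straight through; it is only dereferenced at run time.
BigtableIO.write()
    .withProjectId(options.getBigtableProjectId())
    .withInstanceId(options.getBigtableInstanceId())
    .withTableId(options.getTableId());
```

BigtableIO.Write has overloads accepting both String and ValueProvider<String>, which is why the same builder chain works in both the static and the templated case.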
The advantage of using a template is that it allows the pipeline to be constructed once and then executed multiple times later with different runtime parameters.
For more information on how to use Dataflow templates, see https://cloud.google.com/dataflow/docs/templates/overview
When not using a Dataflow template, you shouldn't set runtime parameters, i.e. withTableId(ValueProvider<String> tableId). Instead, use withTableId(String tableId).
Hope this helps!
I have a need to extract a field parsed from a "complex" response header and use that value later in the test.
It seems that the "header" keyword in Karate is set up for setting request headers, not parsing response headers.
Is there a way to add a custom step definition maintaining access to the scenario variable stores? It appears the variable stores are private in the StepDefs class, and there doesn't seem to be a way to extend it easily.
You can get access to the response headers. Please look at the documentation for responseHeaders.
That said, the match header short-cut is most likely what you are looking for.
Karate's philosophy is that you never need to write custom step-definitions.
edit: here are some examples. It sounds like you just need to do some string manipulation of the Location header? You can freely mix JS code into Karate expressions:
* def location = responseHeaders['Location'][0]
# assume location = 'foo?bar=baz'
* def bar = location.substring(location.indexOf('bar=') + 4)
I'm trying to create a simple argument parser using commons-cli and I can't seem to figure out how to create the following options:
java ... com.my.path.to.MyClass producer
java ... com.my.path.to.MyClass consumer -j 8
The first argument to my program should be either producer or consumer, defining the mode which my program will run in. If it's in consumer mode, I'd like to have a -j argument which defines how many threads to service with.
Here's what I've got so far:
Options options = new Options();
options.addOption("mode", false, "Things.");
HelpFormatter formatter = new HelpFormatter();
formatter.printHelp("startup.sh", options);
When I print out these options, the mode parameter shows up as -mode.
In Python's argparse, I'd just do the following (positional arguments are required by default):
parser = argparse.ArgumentParser()
parser.add_argument('mode', choices=('producer', 'consumer'))
parser.print_help()
This does exactly what I'm looking for. How can I do this in commons-cli?
What I've done for things like this is to have a separate Options object for each mode. In your main, check the first argument to decide which one to pass to the parser. FWIW, I don't consider it a "hack" solution.
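A minimal sketch of that approach (option names follow the question; argument validation and error handling are omitted):

```java
import org.apache.commons.cli.*;
import java.util.Arrays;

public class MyClass {
    public static void main(String[] args) throws ParseException {
        String mode = args[0];  // "producer" or "consumer"

        // A dedicated Options object for consumer mode only.
        Options consumerOptions = new Options();
        consumerOptions.addOption("j", true, "number of threads to service with");

        // Pick the Options set based on the first positional argument,
        // then parse only the remaining arguments.
        Options chosen = mode.equals("consumer") ? consumerOptions : new Options();
        CommandLine cmd = new DefaultParser()
                .parse(chosen, Arrays.copyOfRange(args, 1, args.length));

        if (mode.equals("consumer")) {
            int threads = Integer.parseInt(cmd.getOptionValue("j", "1"));
        }
    }
}
```

commons-cli itself never sees the positional mode argument; it only parses the flags that follow it.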
JCommander is the answer; commons-cli doesn't seem to support subcommand-style options like this.
picocli is now included in Groovy 2.5.x. It has built-in support for subcommands.
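For completeness, a sketch of the picocli subcommand approach (class names and descriptions are illustrative):

```java
import picocli.CommandLine;
import picocli.CommandLine.Command;
import picocli.CommandLine.Option;

@Command(name = "mytool", subcommands = {Producer.class, Consumer.class})
class MyTool {}

@Command(name = "producer")
class Producer implements Runnable {
    public void run() { /* producer mode */ }
}

@Command(name = "consumer")
class Consumer implements Runnable {
    @Option(names = "-j", description = "how many threads to service with")
    int threads = 1;

    public void run() { /* consumer mode, using `threads` workers */ }
}

// In main: new CommandLine(new MyTool()).execute(args);
```

picocli routes "producer" and "consumer" to the matching subcommand class and parses -j only in consumer mode, which matches the behavior asked for in the question.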
enix12enix has written a standalone Sikuli server to remotely initiate Sikuli scripts. I have the server running and I'm now trying to pass values along with the URL. I imagine it will look something like this:
http://server:9000/test.do?script=/yourscript&argv[1]=arg1value
Everything before the & works properly as it stands. I know the answer is somewhere in the Java found here:
https://github.com/enix12enix/sikuliserver/blob/master/java/src/org/sikuli/SikuliScriptParamProcessor.java
as there is a function called extractParameters. Can anyone help me figure out the syntax for the URL?
Thanks a lot,
Jacob
According to the pattern that's used to check whether the parameter names are correct, you should send the params in this format: argv## (1-99). So instead of sending argv[1] in the URL, you should send argv1.
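A rough sketch of that name check (the exact regex is an assumption based on the 1-99 range described above, not copied from the server source):

```java
import java.util.regex.Pattern;

public class ParamNameCheck {
    public static void main(String[] args) {
        // Accepts argv1 through argv99; rejects argv0 and bracketed forms.
        Pattern p = Pattern.compile("^argv[1-9][0-9]?$");
        System.out.println(p.matcher("argv1").matches());   // true
        System.out.println(p.matcher("argv[1]").matches()); // false
    }
}
```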
Thanks Jair, I figured out how to reference this within the Sikuli script as well. I imagine that those familiar with Java/Python/Jython are already aware of this, but I'm just a noob setting up a remote Sikuli server.
Parameters can be passed through the url as follows:
http://server:9000/test.do?script=/yourScriptName.sikuli&argv1=value1&argv2=value2
and so on, through argv99.
Normally, when running a script from the command line (--args value1 value2), you would reference the arguments within the Sikuli script like this:
import sys
var1 = sys.argv[1]
var2 = sys.argv[2]
When working with the Java side of things, the reference is a little different:
import java
var1 = java.lang.System.getProperty('argv1')
var2 = java.lang.System.getProperty('argv2')
And so on.