create `KafkaServer` from Java

I am trying to start a Kafka server from Java.
Specifically, how can I translate this line of Scala into a line of Java?
private val server = new KafkaServer(serverConfig, kafkaMetricsReporters = reporters)
I can create the serverConfig easily, but I can't seem to create the kafkaMetricsReporters parameter.
Note: I can create a KafkaServerStartable, but I would like to create a normal KafkaServer to avoid the JVM exiting in case of error.
Apache Kafka version 0.11.0.1

The kafkaMetricsReporters parameter is a Scala Seq.
You can either:
Create a Java collection and convert it into a Seq:
You need to import scala.collection.JavaConverters:
List<KafkaMetricsReporter> reportersList = new ArrayList<>();
...
Seq<KafkaMetricsReporter> reportersSeq = JavaConverters.asScalaBufferConverter(reportersList).asScala();
Use the KafkaMetricsReporter.startReporters() method to create them from your configuration:
As KafkaMetricsReporter is a Scala object (singleton), you need the MODULE$ notation to access it from Java:
Seq<KafkaMetricsReporter> reporters = KafkaMetricsReporter$.MODULE$.startReporters(new VerifiableProperties(props));
The KafkaServer constructor also has two other arguments that are required when calling it from Java:
time can easily be created using new org.apache.kafka.common.utils.SystemTime()
threadNamePrefix is an Option. If you import scala.Option, you'll be able to call Option.apply("prefix")
Putting it all together:
Properties props = new Properties();
props.put(...);
KafkaConfig config = KafkaConfig.fromProps(props);
Seq<KafkaMetricsReporter> reporters = KafkaMetricsReporter$.MODULE$.startReporters(new VerifiableProperties(props));
KafkaServer server = new KafkaServer(config, new SystemTime(), Option.apply("prefix"), reporters);
server.startup();
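Unlike KafkaServerStartable, KafkaServer leaves error handling and shutdown to you; a minimal sketch of stopping the broker cleanly, assuming the server instance from above:
server.shutdown();      // request a clean broker shutdown
server.awaitShutdown(); // block until the shutdown has completed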

Related

Can't read from Jenkins pipeline workspaces using Java API

I am trying to understand whether what I want to achieve is feasible.
We want to write a shared library step that allows us to produce a Kafka message.
To do this we use:
@Grab(group = 'org.apache.kafka', module = 'kafka-clients', version = '3.2.0')
...
...
def producer = new KafkaProducer([
"bootstrap.servers": bootstrapServers,
// serializers
"value.serializer" : "org.apache.kafka.common.serialization.StringSerializer",
"key.serializer" : "org.apache.kafka.common.serialization.StringSerializer",
// acknowledgement control
"acks" : "all",
// TLS config
"ssl.truststore.type": "JKS",
"security.protocol": "SSL",
"ssl.enabled.protocols": "TLSv1.2",
"ssl.protocol": "TLSv1.2",
"ssl.truststore.location" : "<cacets_location>",
"ssl.truststore.password" : "changeit"
])
The method gets all params from outside except ssl.truststore.location, which is provided through the node volume.
The problem I realized is that Java commands are executed in a different workspace.
Let's say ssl.truststore.location is "/etc/pki/java/cacerts". Using the readFile step I'm able to read it; however, when I try to read it using a pure Java call, I get a NoSuchFileException.
I found that when I execute a Java command like
File folder = new File("").getAbsoluteFile();
my working dir is empty ("\"), as if the code were executed in a completely empty sandbox unrelated to the Jenkins workspace.
My question is whether what I'm trying to achieve is doable within the Jenkins pipeline scope, and how to get it working.
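A possible workaround, sketched here as an assumption rather than something from the question: read the truststore on the agent with readFile(file: path, encoding: 'Base64'), pass the resulting string into the Java code, and write it to a file the JVM can actually see. truststoreBase64 and producerProps are hypothetical names:
byte[] truststore = Base64.getDecoder().decode(truststoreBase64); // content read on the agent via readFile
File tmp = File.createTempFile("truststore", ".jks");
Files.write(tmp.toPath(), truststore); // java.nio.file.Files
producerProps.put("ssl.truststore.location", tmp.getAbsolutePath());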

SoapUI Java API, the TestCase.setPropertyValue operation with a long value blocks the Java program

I'm using SoapUI API version 5.5.0 to execute SoapUI tests from a Java test program. I want to pass the service endpoint under test and the input parameters by changing the properties of the SoapUI test case.
This is the dependency in my pom.xml:
<dependencies>
<dependency>
<groupId>com.smartbear.soapui</groupId>
<artifactId>soapui-maven-plugin</artifactId>
<version>5.5.0</version>
</dependency>
</dependencies>
The program blocks at execution time when I use a long value for one parameter.
import com.eviware.soapui.SoapUI;
import com.eviware.soapui.StandaloneSoapUICore;
import com.eviware.soapui.impl.wsdl.WsdlProject;
import com.eviware.soapui.model.support.PropertiesMap;
import com.eviware.soapui.model.testsuite.*;
private static void PutTestCaseProperties(TestCase testCase){
// Get keys of all properties of this TC
Map <String, TestProperty> propertiesTC = testCase.getProperties();
List<String> lKeys = new ArrayList<String>();
for(Map.Entry<String, TestProperty> entry : propertiesTC.entrySet()) {
lKeys.add(entry.getKey());
}
for(String keyTC : lKeys) {
String keyValue = "pppppp ppppppppppppppppppppppppppppppppppppppp ppppppppppppppppppppppppppppp ppppppppppppppppp ppppppppppppppppppppppppppppp pppp pppppppppppppppppppppppppppppppppppppppppppppppp pppppppppppppppppppppppppppp pppppppppppppppppppppppppppppppppppppppppppppppppppppp ppppppppppppppppppppppp ppppppppppppppppppppppppppppppppppppppppppppppppppppppp pppppppppppppppppppppppppppppppppppppppppppppppppppppppp pppppppppppppppppppppppppp ppppppppppppppppppppppppppppppppppppppppppppp";
//String keyValue = "short";
testCase.setPropertyValue(keyTC, keyValue);
}
}
If I use the "short" value for the keys, the SoapUI test executes completely, but if I use the long value the program blocks.
Is there any length limit on the custom properties of a SoapUI test case? I would like to use the parameters to pass whole XML files (each built as a single line of text).
The SoapUI application allows loading the custom properties of a test case from an external file. Is it also possible to do this from the SoapUI Java API?
I have found the following solution:
Based on this response I can load long parameters without errors, with only the following variation in my Groovy script: it uses a properties file whose path I configure with a parameter of the test case (and I set this parameter from my Java code using the SoapUI API):
def props = new Properties()
//replace the path with your file name below. use / instead of \ as path separator even on windows platform.
new File(context.expand('${#TestCase#propertiesFile}')).withInputStream { s ->
props.load(s)
}
props.each {
context.testCase.setPropertyValue(it.key, it.value)
}
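For the Java side, a minimal sketch using the same setPropertyValue API shown above to point the test case at the external file; the project file, suite, case, and property names are hypothetical:
WsdlProject project = new WsdlProject("my-soapui-project.xml");
TestCase testCase = project.getTestSuiteByName("MySuite").getTestCaseByName("MyCase");
testCase.setPropertyValue("propertiesFile", "/path/to/testcase.properties");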

How to create a stateful Groovy Binding for an interactive session

I am building a Groovy-based tool, and as an add-in I'd like to provide an interactive command line. I have this partially working, but the binding doesn't keep state between GroovyShell.evaluate() calls. I've gone through the Groovy documentation, and it has an example using a class called InteractiveGroovyShell, which is not available in version 2.0.x.
Is there a way to configure a normal GroovyShell to achieve this functionality?
Here is a simplified version of how I'm creating the groovy shell right now:
CompilerConfiguration config = new CompilerConfiguration();
Binding binding = new Binding();
shell = new GroovyShell(binding, config);
shell.evaluate("def a = 20");
shell.evaluate("println a"); //this throws an exception telling the variable does not exist
shell.evaluate("def a = 20");
Instead of def a = 20 you need just a = 20. Each evaluate call parses and compiles a separate script, and declarations (whether with def or with an explicit type such as int a = 20) become local variables in that specific script and do not store anything in the binding. Without the def you have a plain assignment to an otherwise undeclared variable, which will go into the binding and so be visible to later evaluate calls.
You should reuse the same binding for different shells. The binding itself will maintain the state:
import org.codehaus.groovy.control.CompilerConfiguration
def binding = new Binding()
def shell = new GroovyShell(binding)
shell.evaluate("a = 5")
assert binding.variables == [a:5]
shell.evaluate("println a; b = 6")
assert binding.variables == [a:5, b:6]
def shell2 = new GroovyShell(binding)
// even in a new shell, the binding keeps the state
shell2.evaluate("c = 7")
assert binding.variables == [a:5, b:6, c:7]
This worked in Groovy 2.0.5.
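Applied to the Java code from the question, the fix is simply to drop the def; a minimal sketch:
Binding binding = new Binding();
GroovyShell shell = new GroovyShell(binding);
shell.evaluate("a = 20");                     // no 'def': the value lands in the binding
shell.evaluate("println a");                  // prints 20, no exception
System.out.println(binding.getVariable("a")); // 20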

Run a simple Cascading application in local mode

I'm new to Cascading/Hadoop and am trying to run a simple example in local mode (i.e. in memory). The example just copies a file:
import java.util.Properties;
import cascading.flow.Flow;
import cascading.flow.FlowConnector;
import cascading.flow.FlowDef;
import cascading.flow.local.LocalFlowConnector;
import cascading.pipe.Pipe;
import cascading.property.AppProps;
import cascading.scheme.hadoop.TextLine;
import cascading.tap.Tap;
import cascading.tap.hadoop.Hfs;
public class CascadingTest {
    public static void main(String[] args) {
        Properties properties = new Properties();
        AppProps.setApplicationJarClass( properties, CascadingTest.class );
        FlowConnector flowConnector = new LocalFlowConnector();
        // create the source tap
        Tap inTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\rain.txt" );
        // create the sink tap
        Tap outTap = new Hfs( new TextLine(), "D:\\git_workspace\\Impatient\\part1\\data\\out.txt" );
        // specify a pipe to connect the taps
        Pipe copyPipe = new Pipe( "copy" );
        // connect the taps, pipes, etc., into a flow
        FlowDef flowDef = FlowDef.flowDef()
            .addSource( copyPipe, inTap )
            .addTailSink( copyPipe, outTap );
        // run the flow
        Flow flow = flowConnector.connect( flowDef );
        flow.complete();
    }
}
Here is the error I'm getting:
09-25-12 11:30:38,114 INFO - AppProps - using app.id: 9C82C76AC667FDAA2F6969A0DF3949C6
Exception in thread "main" cascading.flow.planner.PlannerException: could not build flow from assembly: [java.util.Properties cannot be cast to org.apache.hadoop.mapred.JobConf]
at cascading.flow.planner.FlowPlanner.handleExceptionDuringPlanning(FlowPlanner.java:515)
at cascading.flow.local.planner.LocalPlanner.buildFlow(LocalPlanner.java:84)
at cascading.flow.FlowConnector.connect(FlowConnector.java:454)
at com.x.y.CascadingTest.main(CascadingTest.java:37)
Caused by: java.lang.ClassCastException: java.util.Properties cannot be cast to org.apache.hadoop.mapred.JobConf
at cascading.tap.hadoop.Hfs.sourceConfInit(Hfs.java:78)
at cascading.flow.local.LocalFlowStep.initTaps(LocalFlowStep.java:77)
at cascading.flow.local.LocalFlowStep.getInitializedConfig(LocalFlowStep.java:56)
at cascading.flow.local.LocalFlowStep.createFlowStepJob(LocalFlowStep.java:135)
at cascading.flow.local.LocalFlowStep.createFlowStepJob(LocalFlowStep.java:38)
at cascading.flow.planner.BaseFlowStep.getFlowStepJob(BaseFlowStep.java:588)
at cascading.flow.BaseFlow.initializeNewJobsMap(BaseFlow.java:1162)
at cascading.flow.BaseFlow.initialize(BaseFlow.java:184)
at cascading.flow.local.planner.LocalPlanner.buildFlow(LocalPlanner.java:78)
... 2 more
Just to provide a bit more detail: you can't mix local and Hadoop classes in Cascading, as they assume different and incompatible environments. What's happening in your case is that you're trying to create a local flow with Hadoop taps; the latter expect a Hadoop JobConf instead of the Properties object used to configure local taps.
Your code will work if you use cascading.tap.local.FileTap instead of cascading.tap.hadoop.Hfs.
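A minimal sketch of that change, using fully qualified names to avoid a clash with the Hadoop TextLine already imported (the paths are placeholders):
Tap inTap = new cascading.tap.local.FileTap( new cascading.scheme.local.TextLine(), "data/rain.txt" );
Tap outTap = new cascading.tap.local.FileTap( new cascading.scheme.local.TextLine(), "data/out.txt" );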
Welcome to Cascading.
I just answered on the Cascading user list, but in brief, the problem is a mix of local and Hadoop mode classes: this code has a LocalFlowConnector but then uses Hfs taps.
When I revert to the classes used in the "Impatient" tutorial, it runs correctly:
https://gist.github.com/3784194
Yes, you need to use an Lfs (local file system) tap instead of Hfs (Hadoop file system).
You can also test your code in local mode using JUnit test cases (with the cascading-unittest jar), directly from Eclipse.
http://www.cascading.org/2012/08/07/cascading-for-the-impatient-part-6/

How to use automatic proxy configuration script in Java

My Internet Explorer is set to use an automatic proxy configuration file (a so-called PAC file) for web access. Is there a way to use this in my Java program as well?
My Java code below does not seem to use the proxy at all.
ArrayList<Proxy> ar = new ArrayList<Proxy>(ProxySelector.getDefault().select(new URI("http://service.myurlforproxy.com")));
for(Proxy p : ar){
System.out.println(p.toString()); // output is just DIRECT; it should be PROXY
}
I also set my proxy script in the Java Control Panel (Control Panel -> Java), but got the same result,
and I found there's no way to set a PAC file for Java programmatically.
People use http.proxyHost with System.setProperty(..), but that is only for setting a proxy host, not a proxy script (PAC file).
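For reference, a sketch of that host/port-only approach (host and port values are placeholders):
System.setProperty("http.proxyHost", "proxy.example.com");
System.setProperty("http.proxyPort", "8080");
System.setProperty("https.proxyHost", "proxy.example.com"); // HTTPS traffic needs its own pair
System.setProperty("https.proxyPort", "8080");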
Wow! I managed to load a Proxy Auto-Config (PAC) file in Java. Please see the code and package below.
import com.sun.deploy.net.proxy.*;
...
BrowserProxyInfo b = new BrowserProxyInfo();
b.setType(ProxyType.AUTO);
b.setAutoConfigURL("http://yourhost/proxy.file.pac");
DummyAutoProxyHandler handler = new DummyAutoProxyHandler();
handler.init(b);
URL url = new URL("http://host_to_query");
ProxyInfo[] ps = handler.getProxyInfo(url);
for(ProxyInfo p : ps){
System.out.println(p.toString());
}
You already have a [com.sun.deploy.net.proxy] package on your machine!
Find [deploy.jar] ;D
Java does not have any built-in support for parsing the JS PAC file. You are on your own. What you can do is download that file and parse the proxy host from it. You should read this.
In my case, I just figured out what the .pac file would return, and then hardcoded it.
Based on Jaeh's answer, I used the code below.
Note that SunAutoProxyHandler extends AbstractAutoProxyHandler, and there is an alternative concrete implementation called PluginAutoProxyHandler, but that implementation does not appear to be as robust:
BrowserProxyInfo b = new BrowserProxyInfo();
b.setType(ProxyType.AUTO);
b.setAutoConfigURL("http://example.com/proxy.pac");
SunAutoProxyHandler handler = new SunAutoProxyHandler();
handler.init(b);
ProxyInfo[] ps = handler.getProxyInfo(new URL(url));
for(ProxyInfo p : ps){
System.out.println(p.toString());
}
