I am learning Kafka with a test consumer and producer, but I am facing the error below.
Kafka consumer program:
package kafka001;

import java.util.Arrays;
import java.util.Properties;
import java.util.Scanner;

import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.errors.WakeupException;

public class ConsumerApp {
    private static Scanner in;
    private static boolean stop = false;

    public static void main(String[] args) throws Exception {
        System.out.println(args[0] + args.length);
        if (args.length != 2) {
            System.err.printf("Usage: %s <topicName> <groupId>%n", ConsumerApp.class.getName());
            System.exit(-1);
        }
        in = new Scanner(System.in);
        String topicName = args[0];
        String groupId = args[1];

        ConsumerThread consumerRunnable = new ConsumerThread(topicName, groupId);
        consumerRunnable.start();
        //System.out.println("Here");
        String line = "";
        while (!line.equals("exit")) {
            line = in.next();
        }
        consumerRunnable.getKafkaConsumer().wakeup();
        System.out.println("Stopping consumer now.....");
        consumerRunnable.join();
    }

    private static class ConsumerThread extends Thread {
        private String topicName;
        private String groupId;
        private KafkaConsumer<String, String> kafkaConsumer;

        public ConsumerThread(String topicName, String groupId) {
            //System.out.println("inside ConsumerThread constructor");
            this.topicName = topicName;
            this.groupId = groupId;
        }

        public void run() {
            //System.out.println("inside run");
            // Set up Kafka consumer properties
            Properties configProperties = new Properties();
            configProperties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "aup7727s.unix.anz:9092");
            configProperties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
            configProperties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
            configProperties.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
            configProperties.put(ConsumerConfig.CLIENT_ID_CONFIG, "simple");

            // Subscribe to the topic
            kafkaConsumer = new KafkaConsumer<String, String>(configProperties);
            kafkaConsumer.subscribe(Arrays.asList(topicName));

            // Poll messages from the topic and print them to the console
            try {
                while (true) {
                    ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
                    for (ConsumerRecord<String, String> record : records)
                        System.out.println(record.value());
                }
            } catch (WakeupException ex) {
                System.out.println("Exception caught " + ex.getMessage());
            } finally {
                kafkaConsumer.close();
                System.out.println("After closing KafkaConsumer");
            }
        }

        public KafkaConsumer<String, String> getKafkaConsumer() {
            return this.kafkaConsumer;
        }
    }
}
When I compile the code, I notice the following class files:
ConsumerApp$ConsumerThread.class and
ConsumerApp.class
I've generated a jar file named ConsumerApp.jar through Eclipse, and when I run it on the Hadoop cluster I get a NoClassDefFoundError as below:
java -cp ConsumerApp.jar kafka001/ConsumerApp console1 group1
or
hadoop jar ConsumerApp.jar console1 group1
Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.kafka.common.errors.WakeupException
at kafka001.ConsumerApp.main(ConsumerApp.java:24)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.errors.WakeupException
at java.net.URLClassLoader.findClass(URLClassLoader.java:607)
at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:846)
at java.lang.ClassLoader.loadClass(ClassLoader.java:825)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:325)
at java.lang.ClassLoader.loadClass(ClassLoader.java:805)
... 1 more
I am using Eclipse to compile, run the Maven build, and generate the jar file. Line number 24 corresponds to the creation of the ConsumerThread instance.
I can't work out whether this is because the ConsumerThread class name is being saved incorrectly (the class file is generated as ConsumerApp$ConsumerThread.class instead of ConsumerThread.class), or whether something needs to be taken care of while generating the jar file.
Since I can't view the entire project, I would try this: right-click on the project -> go to Maven 2 Tools -> click "Generate artifacts (check for updates)". That should create any missing dependencies. Also check other similar posts that may resolve your issue.
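If the missing class is simply not on the runtime classpath, one quick check (the kafka-clients path and version below are placeholders, not taken from the question) is to add the Kafka client jar next to your own jar when launching:

java -cp ConsumerApp.jar:/path/to/kafka-clients-0.9.0.0.jar kafka001.ConsumerApp console1 group1

Alternatively, building a fat/uber jar that bundles the dependencies (for example with the maven-shade-plugin) avoids having to list every dependency on the command line or rely on what the Hadoop cluster happens to provide.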
I'm trying to make a simple graph using Java, but I keep getting an error.
Code:
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
// PlantUML classes (SourceStringReader, FileFormat, FileFormatOption, ImageData,
// SequenceDiagramFactory, SequenceDiagram, Participant, Display, Message,
// ArrowConfiguration) plus a checkState helper (for example a static import of
// Guava's Preconditions.checkState) are also needed; those import lines were
// omitted from the original post.

public class PlantUMLDemoMain {
    public static void main(String[] args) throws Exception {
        generateFromStringSource(new File("from-string.png"));
        generateFromApi(new File("from-api.png"));
    }

    private static void generateFromApi(File file) throws IOException {
        // 1. setup:
        SequenceDiagramFactory f = new SequenceDiagramFactory();
        SequenceDiagram diagram = f.createEmptyDiagram();

        // 2. Build the diagram:
        // "Bob -> Alice : hello"
        // See net.sourceforge.plantuml.sequencediagram.command.CommandArrow#executeArg
        Display bobD = Display.getWithNewlines("Bob");
        Participant bobP = diagram.getOrCreateParticipant("Bob", bobD);
        Display aliceD = Display.getWithNewlines("Alice");
        Participant aliceP = diagram.getOrCreateParticipant("Alice", aliceD);
        Display label = Display.getWithNewlines("hello");
        ArrowConfiguration config = ArrowConfiguration.withDirectionNormal();
        Message msg = new Message(bobP, aliceP, label, config, diagram.getNextMessageNumber());
        checkState(null == diagram.addMessage(msg));

        // 3. Output the diagram
        // See net.sourceforge.plantuml.SourceStringReader#generateImage
        diagram.makeDiagramReady();
        checkState(1 == diagram.getNbImages());
        try (OutputStream os = new FileOutputStream(file)) {
            ImageData imageData = diagram.exportDiagram(os, 0, new FileFormatOption(FileFormat.PNG));
            System.out.println("generateFromApi: " + diagram.getDescription().getDescription());
        }
    }

    private static void generateFromStringSource(File file) throws IOException {
        String source = "@startuml\n";
        source += "Bob -> Alice : hello\n";
        source += "@enduml\n";

        SourceStringReader reader = new SourceStringReader(source);
        // Write the first image to "png"
        String desc = reader.generateImage(file);
        // Return a null string if no generation
        System.out.println("generateFromStringSource: " + desc);
    }
}
Error: Exception in thread "main" java.lang.IllegalAccessError: class net.sourceforge.plantuml.png.PngIOMetadata (in unnamed module @0x9597028) cannot access class com.sun.imageio.plugins.png.PNGMetadata (in module java.desktop) because module java.desktop does not export com.sun.imageio.plugins.png to unnamed module @0x9597028
at net.sourceforge.plantuml.png.PngIOMetadata.writeWithMetadata(PngIOMetadata.java:60)
at net.sourceforge.plantuml.png.PngIO.write(PngIO.java:86)
at net.sourceforge.plantuml.png.PngIO.write(PngIO.java:80)
at net.sourceforge.plantuml.ugraphic.g2d.UGraphicG2d.writeImageTOBEMOVED(UGraphicG2d.java:219)
at net.sourceforge.plantuml.ugraphic.ImageBuilder.writeImageInternal(ImageBuilder.java:249)
at net.sourceforge.plantuml.ugraphic.ImageBuilder.writeImageTOBEMOVED(ImageBuilder.java:171)
at net.sourceforge.plantuml.sequencediagram.graphic.SequenceDiagramFileMakerPuma2.createOne(SequenceDiagramFileMakerPuma2.java:234)
at net.sourceforge.plantuml.sequencediagram.SequenceDiagram.exportDiagramInternal(SequenceDiagram.java:222)
at net.sourceforge.plantuml.UmlDiagram.exportDiagramNow(UmlDiagram.java:236)
at net.sourceforge.plantuml.AbstractPSystem.exportDiagram(AbstractPSystem.java:127)
at net.sourceforge.plantuml.SourceStringReader.generateImage(SourceStringReader.java:124)
at net.sourceforge.plantuml.SourceStringReader.generateImage(SourceStringReader.java:111)
at net.sourceforge.plantuml.SourceStringReader.generateImage(SourceStringReader.java:101)
at scr.graphviz.sk.PlantUMLDemoMain.generateFromStringSource(PlantUMLDemoMain.java:66)
at scr.graphviz.sk.PlantUMLDemoMain.main(PlantUMLDemoMain.java:23)
I found someone with a similar problem, and an older version of PlantUML worked for him. I have the jar file of the older version but I'm not sure how to apply it. I tried inspecting the file to find out the versions of the libraries used and added Maven dependencies for them, but it didn't seem to work.
This is the similar problem I mentioned: https://github.com/plantuml/plantuml/issues/69
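For what it's worth, the IllegalAccessError above is a Java module-system restriction: the PlantUML build in use reaches into the JDK-internal package named in the message. Besides switching PlantUML versions, a workaround that is sometimes suggested (untested here; "your-app.jar" is a placeholder for however the program is actually launched) is to export that package to the unnamed module at startup:

java --add-exports java.desktop/com.sun.imageio.plugins.png=ALL-UNNAMED -jar your-app.jar

The module and package names come straight from the error message; the same flag can be added to a plain java <main-class> invocation or to the IDE run configuration.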
I was practicing accessing properties from a properties file using a singleton class called PropertyLoader; however, my Maven project is not able to locate the file in resources and throws a NullPointerException.
Here is the class code.
import java.io.IOException;
import java.util.Properties;

import org.apache.log4j.Logger;

public class PropertyLoader {
    private static PropertyLoader instance = null;
    private Properties properties;
    private final static Logger LOGGER = Logger.getLogger(PropertyLoader.class.getName());

    protected PropertyLoader() throws IOException {
        //TODO: Fix problem with loading properties file below
        properties = new Properties();
        properties.load(PropertyLoader.class.getResourceAsStream("app.properties"));
    }

    public static PropertyLoader getInstance() {
        if (instance == null) {
            try {
                instance = new PropertyLoader();
            } catch (IOException ioe) {
                LOGGER.error("Error Occurred while creating Property Loader instance: " + ioe.getMessage());
            }
        }
        return instance;
    }

    public String getValue(String key) {
        LOGGER.info("Getting property value for: " + key);
        return properties.getProperty(key);
    }
}
Error I am getting:
Exception in thread "main" java.lang.NullPointerException: inStream parameter is null
    at java.base/java.util.Objects.requireNonNull(Objects.java:247)
    at java.base/java.util.Properties.load(Properties.java:404)
    at in.net.sudhir.evernote.client.batchjob.PropertyLoader.<init>(PropertyLoader.java:16)
    at in.net.sudhir.evernote.client.batchjob.PropertyLoader.getInstance(PropertyLoader.java:23)
    at in.net.sudhir.evernote.client.batchjob.EvernoteClient.<init>(EvernoteClient.java:51)
    at in.net.sudhir.evernote.client.batchjob.BatchProcess.main(BatchProcess.java:33)
Here is a screenshot of the project structure in IntelliJ IDEA.
properties = new Properties();
try (InputStream inputStream = PropertyLoader.class.getClassLoader().getResourceAsStream("app.properties")) {
    if (inputStream == null)
        throw new FileNotFoundException("File not found in classpath");
    properties.load(inputStream);
}
NOTE: It's bad practice to do this kind of work in the constructor. It's better to create a separate method that loads the resource file.
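To illustrate that note, here is a minimal sketch of the same singleton with the loading moved out of the constructor, assuming app.properties sits at the root of src/main/resources (so it ends up at the classpath root). Letting getInstance propagate the IOException instead of swallowing it is a design choice for the sketch, not something from the original post:

import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class PropertyLoader {
    private static PropertyLoader instance = null;
    private final Properties properties = new Properties();

    private PropertyLoader() {
        // keep the constructor trivial; loading happens in load()
    }

    public static synchronized PropertyLoader getInstance() throws IOException {
        if (instance == null) {
            PropertyLoader loader = new PropertyLoader();
            loader.load("app.properties");
            instance = loader;
        }
        return instance;
    }

    // Loads the named file from the classpath root (e.g. src/main/resources).
    private void load(String resourceName) throws IOException {
        try (InputStream in = PropertyLoader.class.getClassLoader().getResourceAsStream(resourceName)) {
            if (in == null) {
                throw new FileNotFoundException(resourceName + " not found in classpath");
            }
            properties.load(in);
        }
    }

    public String getValue(String key) {
        return properties.getProperty(key);
    }
}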
I have an application.yml file that references a properties file.
When executing in my IDE (Eclipse) I have no problem, but when running the jar from the console (with java -jar) it doesn't load the properties file that is set in the YAML file.
Here is my application.yml:
apache:
  kafka:
    producer:
      properties: kafka-producer-${application.environment}.properties
    consumer:
      properties: kafka-consumer-${application.environment}.properties
and here is the .properties file:
#
# Apache Kafka Consumer Properties
##
bootstrap.servers=XXXX:9092
group.id=consumers
enable.auto.commit=true
auto.commit.interval.ms=1000
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
session.timeout.ms=300000
request.timeout.ms=305000
I'm loading the properties file like this:
@Value("${apache.kafka.producer.properties}")
private String kafkaProducerProperties;

@Bean
public KafkaProducer<String, String> eventProducer() {
    try {
        Properties properties = new Properties();
        properties.load(this.context.getResource("classpath:" + this.kafkaProducerProperties).getInputStream());
        return new KafkaProducer<String, String>(properties);
    } catch (final IOException exception) {
        LOG.error("Error loading Kafka producer properties", exception);
    }
    return null;
}
Executing in the IDE, this.kafkaProducerProperties has the right value, while executing the jar it is null.
What am I doing wrong, and why does it load correctly in the IDE but not when executing the jar?
Please try this code
@Bean
public KafkaProducer<String, String> eventProducer() {
    try {
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "XXXX:9092");
        properties.put("group.id", "consumers");
        properties.put("enable.auto.commit", "true");
        properties.put("auto.commit.interval.ms", "1000");
        properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("session.timeout.ms", "300000");
        properties.put("request.timeout.ms", "305000");
        return new KafkaProducer<String, String>(properties);
    } catch (final Exception exception) {
        LOG.error("Error loading Kafka producer properties", exception);
    }
    // Avoid returning null in your code; throw an exception instead
    throw new IllegalStateException("Error loading Kafka producer properties");
}
To avoid hardcoding, you can also do this
1- Add this in your application.yml
bootstrap.servers: XXXX
group.id: consumers
enable.auto.commit: true
auto.commit.interval.ms: 1000
key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
value.deserializer: org.apache.kafka.common.serialization.StringDeserializer
session.timeout.ms: 300000
request.timeout.ms: 305000
2- Add this code
@Value("${bootstrap.servers}")
private String bootstrapServers;
@Value("${group.id}")
private String groupId;
@Value("${enable.auto.commit}")
private String enableAutoCommit;
@Value("${auto.commit.interval.ms}")
private String autoCommit;
@Value("${key.deserializer}")
private String keyDeserializer;
@Value("${value.deserializer}")
private String valueDeserializer;
@Value("${session.timeout.ms}")
private String sessionTimeout;
@Value("${request.timeout.ms}")
private String requestTimeout;

@Bean
public KafkaProducer<String, String> eventProducer() {
    try {
        Properties properties = new Properties();
        properties.put("bootstrap.servers", bootstrapServers + ":9092");
        properties.put("group.id", groupId);
        properties.put("enable.auto.commit", enableAutoCommit);
        properties.put("auto.commit.interval.ms", autoCommit);
        properties.put("key.deserializer", keyDeserializer);
        properties.put("value.deserializer", valueDeserializer);
        properties.put("session.timeout.ms", sessionTimeout);
        properties.put("request.timeout.ms", requestTimeout);
        return new KafkaProducer<String, String>(properties);
    } catch (final Exception exception) {
        LOG.error("Error loading Kafka producer properties", exception);
    }
    // Avoid returning null in your code; throw an exception instead
    throw new IllegalStateException("Error loading Kafka producer properties");
}
Hope this helps.
So I am currently learning Kafka and have attempted to duplicate the examples provided by Apache here. This is the example code for the consumer, and I have written it in Java just as shown. When I attempt to execute the program, however, I run into some issues. I am able to get it to compile, but it will not run properly.
I am executing the program with the following line (without the quotation marks): "java TestConsumer localhost:2181 group1 test 4". This passes the 4 arguments necessary in the example code. When I execute this command, though, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Category
at kafka.utils.VerifiableProperties.<init>(Unknown Source)
at kafka.consumer.ConsumerConfig.<init>(Unknown Source)
at TestConsumer.ConsumerProps(TestConsumer.java:69)
at TestConsumer.<init>(TestConsumer.java:31)
at TestConsumer.main(TestConsumer.java:97)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Category
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 5 more
I have tried going in and manually replacing the arguments with the necessary values and executing that way, but I get a different issue. Below is the error message, along with the code I'm using, just in case I screwed something up from the example provided. If anyone can help me out I would be incredibly appreciative, since I am attempting to write my own consumer to test parsing of the produced information, etc. Thanks.
log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties).
log4j:WARN Please initialize the log4j system properly.
Exception in thread "main" java.lang.NoClassDefFoundError: org/I0Itec/zkclient/IZkStateListener
at kafka.javaapi.consumer.ZookeeperConsumerConnector.<init>(Unknown Source)
at kafka.javaapi.consumer.ZookeeperConsumerConnector.<init>(Unknown Source)
at kafka.consumer.Consumer$.createJavaConsumerConnector(Unknown Source)
at kafka.consumer.Consumer.createJavaConsumerConnector(Unknown Source)
at TestConsumer.<init>(TestConsumer.java:31)
at TestConsumer.main(TestConsumer.java:97)
Caused by: java.lang.ClassNotFoundException: org.I0Itec.zkclient.IZkStateListener
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 6 more
/*
 * Test Consumer to gather input from
 * a Producer. Attempt to perform functions
 * from the produced data
 */

// Kafka API
import kafka.consumer.ConsumerConfig;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

import java.util.Map;
import java.util.HashMap;
import java.util.Properties;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.TimeUnit;

public class TestConsumer {
    private final ConsumerConnector consumer;
    private final String topic;
    private ExecutorService executor;

    // CONSTRUCTOR
    public TestConsumer(String zookeeper, String groupid, String aTopic) {
        consumer = kafka.consumer.Consumer.createJavaConsumerConnector(ConsumerProps(zookeeper, groupid));
        this.topic = aTopic;
    }
    // END CONSTRUCTOR

    // RUN FUNCTION
    public void run(int threads) {
        Map<String, Integer> topicMap = new HashMap<String, Integer>();
        topicMap.put(topic, new Integer(threads));
        Map<String, List<KafkaStream<byte[], byte[]>>> consumerMap = consumer.createMessageStreams(topicMap);
        List<KafkaStream<byte[], byte[]>> streams = consumerMap.get(topic);
        executor = Executors.newFixedThreadPool(threads); // process threads
        int numThread = 0; // thread counter for consumption
        // consume all messages
        for (final KafkaStream stream : streams) {
            executor.submit(new TestConsumerRun(stream, numThread));
            numThread++;
        }
    }
    // END RUN FUNCTION

    // CREATE PROPERTIES FUNCTION
    private static ConsumerConfig ConsumerProps(String zookeeper, String groupid) {
        Properties properties = new Properties(); // config properties file
        properties.put("zookeeper.connect", zookeeper);
        properties.put("group.id", groupid);
        properties.put("zookeeper.session.timeout.ms", "400");
        properties.put("zookeeper.sync.time.ms", "200");
        properties.put("auto.commit.interval.ms", "1000");
        properties.put("auto.offset.reset", "smallest");
        return new ConsumerConfig(properties);
    }
    // END CREATE PROPERTIES FUNCTION

    // SHUTDOWN FUNCTION
    public void shutdown() {
        if (consumer != null) consumer.shutdown();
        if (executor != null) executor.shutdown();
        try {
            if (!executor.awaitTermination(5000, TimeUnit.MILLISECONDS)) {
                System.out.println("Timed out waiting for consumer threads to shut down, exiting uncleanly");
            }
        } catch (InterruptedException e) {
            System.out.println("Interrupted during shutdown, exiting uncleanly");
        }
    }
    // END SHUTDOWN FUNCTION

    // MAIN FUNCTION
    public static void main(String[] args) {
        String zookeeper = args[0];
        String groupid = args[1];
        String topic = args[2];
        int threads = Integer.parseInt(args[3]);

        TestConsumer test = new TestConsumer(zookeeper, groupid, topic); // send information to constructor
        test.run(threads); // pass threads for iteration

        try {
            Thread.sleep(10000);
        } catch (InterruptedException ie) {
        }
        test.shutdown(); // close program
    }
    // END MAIN FUNCTION
}
/*
 * Test Consumer to gather input from
 * a Producer. Attempt to perform functions
 * from the produced data
 */

// Kafka API
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;

public class TestConsumerRun implements Runnable {
    private KafkaStream aStream;
    private int aThread;

    // CONSTRUCTOR
    public TestConsumerRun(KafkaStream stream, int thread) {
        aStream = stream; // set stream from main read
        aThread = thread; // set thread from main read
    }
    // END CONSTRUCTOR

    // RUN FUNCTION
    public void run() {
        ConsumerIterator<byte[], byte[]> iterator = aStream.iterator(); // used to check throughout the list continuously
        while (iterator.hasNext())
            System.out.println("Thread " + aThread + ": " + new String(iterator.next().message()));
        System.out.println("Shutting down Thread: " + aThread);
    }
    // END RUN FUNCTION
}
Try adding BasicConfigurator.configure(); in the main method and it will work fine.
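For context, BasicConfigurator is log4j 1.x's programmatic configuration helper; a minimal sketch of where the call would sit in the example's main method (assuming the log4j jar is on the classpath) looks like this:

import org.apache.log4j.BasicConfigurator;

public static void main(String[] args) {
    // Installs a default console appender so log4j stops warning
    // "No appenders could be found for logger ...".
    BasicConfigurator.configure();

    String zookeeper = args[0];
    String groupid = args[1];
    String topic = args[2];
    int threads = Integer.parseInt(args[3]);
    TestConsumer test = new TestConsumer(zookeeper, groupid, topic);
    test.run(threads);
    // ... rest of main unchanged
}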
I had the same problem. You need to add the log4j jar to your classpath. You might also need to add slf4j and commons-logging.
java.lang.NoClassDefFoundError occurs when the JVM can't find a class at runtime, even though it was there at compile time. It happens when a jar is missing at runtime, and for many other reasons. Your classpath at compile time and at runtime needs to be the same. Sometimes you might have the same jar in different versions, so at runtime the JVM might find a different version than the one used at compile time.
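As a concrete illustration for the example above (jar names and versions are placeholders; use whatever sits in your Kafka distribution's libs directory), the old high-level consumer needs the Kafka jar plus its transitive dependencies, such as log4j, zkclient, zookeeper, and the Scala library, on both the compile and the runtime classpath:

javac -cp .:kafka_2.10-0.8.2.2.jar TestConsumer.java TestConsumerRun.java
java -cp .:kafka_2.10-0.8.2.2.jar:log4j-1.2.17.jar:zkclient-0.3.jar:zookeeper-3.4.6.jar:scala-library-2.10.4.jar TestConsumer localhost:2181 group1 test 4

The ClassNotFoundException messages in the stack traces (org.apache.log4j.Category, org.I0Itec.zkclient.IZkStateListener) point at exactly which jars are missing.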
I am trying to call "getOrderList" from the ST312_TestMain class. I am getting a java.lang.ExceptionInInitializerError from the class below.
package com.Main;

import org.w3c.dom.Document;
import com.yantra.ycp.core.YCPContext;
import com.Main.XMLUtil;
import com.Main.SterlingUtil;

public class ST312_TestMain {
    public static void main(String[] args) throws Exception {
        String ServiceName = "getOrderList";
        String sServiceFlag = "N";
        Document dTemplate = null;
        //ServiceName = "SendDN";
        //sServiceFlag = "Y";
        Document inputXML = null;
        inputXML = XMLUtil.getDocument("<Order OrderHeaderKey='201407181105267340509' />");
        //inputXML = XMLUtil.getXmlFromFile("src/Test.xml");
        dTemplate = XMLUtil.getDocument("<Order OrderHeaderKey='' OrderNo=''/>");
        if (args.length == 3) {
            ServiceName = args[0];
            sServiceFlag = args[1].equals("Y") ? "Y" : "N";
            inputXML = XMLUtil.getXmlFromFile(args[2]);
        } else {
            System.out.println("Usage: TestMain <API/Service Name> <API/Service(N/Y)> <Input XML File>");
            System.out.println("No Input received using preset XML to call preset Service");
            System.out.println("Service Name=" + ServiceName);
        }

        YCPContext env = new YCPContext("admin", "admin"); // <-- this is the line that fails
        System.out.println("Input XML \n" + XMLUtil.getXmlString(inputXML));
        try {
            Document outputXML = null;
            if ("Y".equals(sServiceFlag)) {
                outputXML = SterlingUtil.callService(env, inputXML, ServiceName, null);
            } else {
                outputXML = SterlingUtil.callAPI(env, inputXML, ServiceName, dTemplate);
            }
            env.commit();
        } catch (Exception ex) {
            ex.printStackTrace();
            System.out.println("Service Invocation Failed");
        }
    }
}
The exception is as follows:
Usage: TestMain <API/Service Name> <API/Service(N/Y)> <Input XML File>
No Input received using preset XML to call preset Service
Service Name=getOrderList
log4j:WARN No appenders could be found for logger (com.yantra.ycp.core.YCPContext).
log4j:WARN Please initialize the log4j system properly.
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.sterlingcommerce.woodstock.util.frame.Manager.getProperty(Manager.java:1365)
at com.yantra.yfc.util.YFCConfigurator.setStandalone(YFCConfigurator.java:37)
at com.yantra.yfs.core.YFSSystem.init(YFSSystem.java:62)
at com.yantra.yfs.core.YFSSystem.<clinit>(YFSSystem.java:47)
at com.yantra.ycp.core.YCPContext.<init>(YCPContext.java:288)
at com.yantra.ycp.core.YCPContext.<init>(YCPContext.java:276)
at com.Main.ST312_TestMain.main(ST312_TestMain.java:31)
Caused by: java.lang.NullPointerException
at com.sterlingcommerce.woodstock.util.frame.log.base.SCILogBaseConfig.doConfigure(SCILogBaseConfig.java:35)
at com.sterlingcommerce.woodstock.util.frame.log.LogService.<clinit>(LogService.java:110)
... 7 more
Please help me with this problem, since I am not sure how to handle the YCPContext object ("YCPContext env = new YCPContext("admin", "admin");"). Thanks in advance.
Request support from IBM about this.
No matter what mistakes you may have made configuring it (if any), the Sterling code should not be throwing an NPE at you.
If you are testing locally using the main() method, then you should comment out all the lines where you are using the environment variable. For your code, that is the env variable.
Probably you haven't added the database driver jars to your project build path (see the classpath sketch below the list). For example:
For DB2 database, db2jcc.jar has to be added
For Oracle database, ojdbc.jar has to be added
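As a sketch (jar names and paths are placeholders, not taken from the original post), the driver jar simply has to be on the runtime classpath along with the Sterling/Yantra jars, for example:

java -cp .:/path/to/sterling/jars/*:/path/to/db2jcc.jar com.Main.ST312_TestMain getOrderList N input.xml

In Eclipse the equivalent is Project -> Properties -> Java Build Path -> Libraries -> Add External JARs.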