Java SSHJ library and enabling logging

Below is an example section of my Java 1.8 program. It appears to be failing when trying to authenticate: it goes through a number of authentication methods and then declares it has run out.
I would like to see debug information from within the sshj library to help me determine what's failing: the username, the key exchange, or something else. I am familiar with log4j and can put logging statements in my own code, but I can't find a simple-to-follow example showing how to hook log4j up to slf4j and then tell sshj to use the logger.
'''
SSHClient sshClient = new SSHClient();
try
{
    String username = "testuser";
    File privateKey = new File("/mykeys/keyname");
    sshClient.addHostKeyVerifier(new PromiscuousVerifier());
    KeyProvider keys = sshClient.loadKeys(privateKey.getPath());
    sshClient.connect("1.2.3.4", 22);
    sshClient.authPublickey(username, keys);
    SFTPClient sftpClient = sshClient.newSFTPClient();
    sftpClient.put("./send/file1.xml", "file1.xml");
    sftpClient.close();
    sshClient.close();
}
catch (UserAuthException e)
{
    System.out.println(e.getMessage());
}
catch (TransportException e)
{
    System.out.println(e.getMessage());
}
catch (IOException e)
{
    System.out.println(e.getMessage());
}
'''

Adding

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.6.6</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.6</version>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.16</version>
</dependency>

to pom.xml and creating a log4j.properties file did the trick for me:
# Define the root logger with a file appender
log4j.rootLogger = DEBUG, FILE

# Define the file appender
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=ssh-test.log

# Define the layout for the file appender
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%m%n
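With the root logger at DEBUG this captures everything, including sshj's key-exchange and authentication messages. If other libraries make the log too noisy, a narrower variant (a sketch, assuming your sshj version lives under the net.schmizz package) is to keep the root at INFO and raise only sshj's loggers:

log4j.rootLogger = INFO, FILE
log4j.logger.net.schmizz=DEBUG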

Related

DOCX4j: spaces removed in PDF when deploying WAR file into Tomcat

I am using the code below to convert DOCX to PDF using DOCX4j. When I run this code in my IDE it works fine and spacing is preserved after conversion. For deployment I created a WAR and put it on a Tomcat server, but on the test server the spacing is not preserved in the document after conversion.
pom.xml
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml -->
<dependency>
    <groupId>org.apache.poi</groupId>
    <artifactId>poi-ooxml</artifactId>
    <version>5.2.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.docx4j/docx4j-JAXB-Internal -->
<dependency>
    <groupId>org.docx4j</groupId>
    <artifactId>docx4j-JAXB-Internal</artifactId>
    <version>8.3.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.docx4j/docx4j-JAXB-ReferenceImpl -->
<dependency>
    <groupId>org.docx4j</groupId>
    <artifactId>docx4j-JAXB-ReferenceImpl</artifactId>
    <version>8.3.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.docx4j/docx4j-JAXB-MOXy -->
<dependency>
    <groupId>org.docx4j</groupId>
    <artifactId>docx4j-JAXB-MOXy</artifactId>
    <version>8.3.7</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.docx4j/docx4j-export-fo -->
<dependency>
    <groupId>org.docx4j</groupId>
    <artifactId>docx4j-export-fo</artifactId>
    <version>8.3.7</version>
</dependency>
Code for the conversion:
private byte[] docxToPdfBytes() {
    InputStream templateInputStream = null;
    try {
        templateInputStream = new FileInputStream(ApplicationConstants.TEMP_DOCX_PATH);
        WordprocessingMLPackage wordMLPackage = WordprocessingMLPackage.load(templateInputStream);
        FileOutputStream os = new FileOutputStream(ApplicationConstants.TEMP_PDF_PATH);
        Docx4J.toPDF(wordMLPackage, os);
        os.flush();
        os.close();
        return Files.readAllBytes(Paths.get(ApplicationConstants.TEMP_PDF_PATH));
    } catch (Throwable e) {
        logger.info(XrefSubscriberServiceService.class.getName() + "==> Method : docxToPdfBytes");
        logger.error(e.getMessage(), e);
    } finally {
        if (null != templateInputStream) {
            try {
                templateInputStream.close();
            } catch (IOException e) {
                logger.info(XrefSubscriberServiceService.class.getName() + "==> Method : docxToPdfBytes");
                logger.error(e.getMessage(), e);
            }
        }
    }
    return null;
}
Does anyone have an idea why this behaviour happens?
This question is resolved: I just needed to correct the font styling in the DOCX document.
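For what it's worth, the usual cause is that the test server lacks the fonts the document references, so docx4j substitutes another font with different metrics. A sketch of pinning the substitution explicitly before the toPDF call, assuming a Liberation font is installed on the server (the font names here are examples, not taken from the question):

import org.docx4j.fonts.IdentityPlusMapper;
import org.docx4j.fonts.Mapper;
import org.docx4j.fonts.PhysicalFonts;

// map fonts named in the document onto fonts physically present on this machine
Mapper fontMapper = new IdentityPlusMapper();
fontMapper.put("Calibri", PhysicalFonts.get("Liberation Sans")); // example mapping
wordMLPackage.setFontMapper(fontMapper);
Docx4J.toPDF(wordMLPackage, os);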

Maven Junit test case crashing with java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory

I have been trying to implement a JUnit test case for a service which returns the different types of resources, as shown in the Tester code below:
public class Tester {
    MyInfoService myInfoService = null;

    @Before
    public void setUp() throws Exception {
        myInfoService = new MyInfoService();
        System.out.println(" ##$## myInfoService =" + myInfoService.getAllMyResourceTypes().size());
    }

    @Test
    public void testResTypeAll() {
        List<MyTypeInfoBean> resTypeBeanList = myInfoService.getAllMyResourceTypes();
        assertEquals("Testing size for res type...", 18, resTypeBeanList.size());
    }
}
Service class where the error occurs:
public class MyInfoService implements MyInfoServiceRemote {
    // some code here

    @Override
    public List<MyTypeInfoBean> getAllMyResourceTypes() {
        List<MyResourceTypeInfoDTO> resourceList = new ArrayList<>();
        List<MyTypeInfoBean> resourceListBean = new ArrayList<>();
        MyResourceTypeInfoDTO myResTypeInfoDTO = null;
        try {
            HashMap<Integer, MyConstantsUtilClass.MyResourceTypes> restypemap =
                    (HashMap<Integer, MyResourceTypes>) MyConstantsUtilClass.MyResourceTypes.getRestypemap();
            Set<Map.Entry<Integer, MyConstantsUtilClass.MyResourceTypes>> restypemapEntrySet = restypemap.entrySet();
            for (Entry<Integer, MyResourceTypes> entry : restypemapEntrySet) {
                myResTypeInfoDTO = new MyResourceTypeInfoDTO();
                // code to read the entry values and populate my DTO
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        // size gets printed, so everything is fine till now
        System.out.println("##$## resourceList =" + resourceList.size());
        // The resource list is printed correctly above. The problem begins below.
        Mapper mapper = null;
        if (resourceList != null) {
            System.out.println("resourceList is not null object");
            System.out.print("Original contents of al: ");
            try {
                // this does not work; NoClassDefFoundError is thrown here
                mapper = DozerBeanMapperSingletonWrapper.getInstance();
            } catch (Exception e) {
                e.printStackTrace();
            }
            Iterator<MyResourceTypeInfoDTO> itr = resourceList.iterator();
            while (itr.hasNext()) {
                MyResourceTypeInfoDTO element = (MyResourceTypeInfoDTO) itr.next();
                if (mapper == null)
                    System.out.println("Mapper is NULL");
                MyTypeInfoBean beanElement = mapper.map(element, MyTypeInfoBean.class);
                resourceListBean.add(beanElement);
            }
        }
        return resourceListBean;
    }
    // other methods here
}
Below is the error log I get on running the JUnit test case:
java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at org.dozer.DozerBeanMapper.<clinit>(DozerBeanMapper.java:58)
at org.dozer.DozerBeanMapperSingletonWrapper.getInstance(DozerBeanMapperSingletonWrapper.java:43)
at com.inv.service.MyInfoService.getAllResourceTypes(MyInfoService.java:508)
at com.bel.tropo.tester.Tester.setUp(Tester.java:21)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
My pom.xml file has the dozer dependency mentioned as:
<dependency>
    <groupId>net.sf.dozer</groupId>
    <artifactId>dozer</artifactId>
    <version>5.3.2</version>
    <exclusions>
        <exclusion>
            <groupId>*</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
The dependencies required by dozer are already resolved via mvn dependency:resolve.
Should I downgrade to a lower version of dozer and check whether it works?
These two questions (How can I resolve java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory? and dozer with maven) don't seem to give me a solution, or am I missing something?
Any help would be great.
Restate your dozer dependency as follows:
<dependency>
    <groupId>net.sf.dozer</groupId>
    <artifactId>dozer</artifactId>
    <version>5.3.2</version>
</dependency>
The dozer-5.3.2 POM declares SLF4J as a dependency, so it will be provided to you transitively as soon as you remove the unwanted exclusions from your Dozer declaration.
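To confirm that slf4j-api actually lands on the test classpath after the change, you can filter the resolved dependency tree (the -Dincludes filter is a standard option of the Maven dependency plugin):

mvn dependency:tree -Dincludes=org.slf4j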
<dependency>
    <groupId>net.sf.dozer</groupId>
    <artifactId>dozer</artifactId>
    <version>5.5.1</version>
</dependency>
Updating the dozer version and doing a Maven compile/install solved it for me, besides getting rid of the exclusions.

log4j trying to log to a file

I am trying to use log4j to log to a file.
Here's the code:
protected static Logger logger = Logger.getLogger(Application.class);
private static final String DIRECTORY = "/Users/me/Desktop";
private static final String EXTENSION = ".log";

protected void setupLogger(String fileName) {
    SimpleLayout layout = new SimpleLayout();
    FileAppender appender = new FileAppender(layout, DIRECTORY + "/logs/" + fileName + EXTENSION, false);
    logger.addAppender(appender);
    logger.setLevel((Level) Level.DEBUG);
}
and here's the pom that I use: http://pastebin.com/vXdFtzSU
The error that I am getting is:
Error:(40, 28) java: incompatible types: org.apache.log4j.FileAppender cannot be converted to org.apache.log4j.Appender
I am trying to follow this answer: configure log4j to log to custom file at runtime
Try changing your Maven dependencies. Add this dependency:
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
and change the Spring Boot dependencies to exclude the default logging dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-thymeleaf</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
then add a try/catch statement:
protected void setupLogger(String fileName) {
    try {
        SimpleLayout layout = new SimpleLayout();
        FileAppender appender;
        appender = new FileAppender(layout, DIRECTORY + "/logs/" + fileName + EXTENSION, false);
        logger.addAppender(appender);
        logger.setLevel((Level) Level.DEBUG);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
It almost looks like you are using a different version of the library at runtime than at compile time. If the types were truly incompatible, that would generate a compiler error. If you are running your program in a special environment like Tomcat, check whether the same version of log4j is installed there.
Why not use log4j.properties? It's a very simple way to configure printing to a file:
log4j.logger.register=INFO,R7
log4j.appender.R7=org.apache.log4j.DailyRollingFileAppender
log4j.appender.R7.DatePattern='.'yyyyMMdd
log4j.appender.R7.File=/appLogs/address/logFile.log
log4j.appender.R7.layout=org.apache.log4j.PatternLayout
log4j.appender.R7.layout.ConversionPattern=%d{dd MMM yyyy HH:mm:ss} | %m%n
private static final Logger logger = Logger.getLogger("register");
logger.info("print to file and console");

Unable to run spark job from EMR which connects to Cassandra on EC2

I'm running a Spark job from an EMR cluster which connects to Cassandra on EC2.
These are the dependencies I'm using for my project:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.5.0-M1</version>
</dependency>
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.6</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.5.0-M3</version>
</dependency>
The issue I'm facing here is that if I use cassandra-driver-core 3.0.0, I get the following error:
java.lang.ExceptionInInitializerError
at mobi.vserv.SparkAutomation.DriverTester.doTest(DriverTester.java:28)
at mobi.vserv.SparkAutomation.DriverTester.main(DriverTester.java:16)
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use. This introduces codec resolution issues and potentially other incompatibility issues in the driver. Please upgrade to Guava 16.01 or later.
at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
... 2 more
I have also tried including Guava version 19.0, but I'm still unable to run the job,
and when I downgrade to cassandra-driver-core 2.1.6 I get the following error:
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /EMR PUBLIC IP:9042 (com.datastax.driver.core.TransportException: [/EMR PUBLIC IP:9042] Cannot connect))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:223)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:78)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1272)
at com.datastax.driver.core.Cluster.init(Cluster.java:158)
at com.datastax.driver.core.Cluster.connect(Cluster.java:248)
Please note that I have tested my code locally, where it runs absolutely fine, and I have tried the different combinations of dependencies mentioned here: https://github.com/datastax/spark-cassandra-connector
Code:
public class App1 {
    private static Logger logger = LoggerFactory.getLogger(App1.class);
    static SparkConf conf = new SparkConf().setAppName("SparkAutomation").setMaster("yarn-cluster");
    static JavaSparkContext sc = null;
    static {
        sc = new JavaSparkContext(conf);
    }

    public static void main(String[] args) throws Exception {
        JavaRDD<String> Data = sc.textFile("S3 PATH TO GZ FILE/*.gz");
        JavaRDD<UserSetGet> usgRDD1 = Data.map(new ConverLineToUSerProfile());
        List<UserSetGet> t3 = usgRDD1.collect();
        for (int i = 0; i < t3.size(); i++) { // note: was i <= t3.size(), which overruns the list
            try {
                phpcallone php = new phpcallone();
                php.sendRequest(t3.get(i));
            } catch (Exception e) {
                logger.error("This Has reached ====> " + e);
            }
        }
    }
}
public class phpcallone {
    private static Logger logger = LoggerFactory.getLogger(phpcallone.class);
    static String pid;

    public void sendRequest(UserSetGet usg) throws JSONException, IOException, InterruptedException {
        UpdateCassandra uc = new UpdateCassandra();
        try {
            uc.UpdateCsrd();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}
public class UpdateCassandra {
    public void UpdateCsrd() throws ClassNotFoundException {
        Cluster.Builder clusterBuilder = Cluster.builder()
                .addContactPoint("PUBLIC IP").withPort(9042)
                .withCredentials("username", "password");
        clusterBuilder.getConfiguration().getSocketOptions().setConnectTimeoutMillis(10000);
        try {
            Session session = clusterBuilder.build().connect("dmp");
            session.execute("USE dmp");
            System.out.println("Connection established");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Assuming that you are using EMR 4.1+, you can pass the Guava jar to spark-submit's --jars option, then supply a configuration file telling EMR to put user class paths first.
For example, in a file setup.json
[
{
"Classification": "spark-defaults",
"Properties": {
"spark.driver.userClassPathFirst": "true",
"spark.executor.userClassPathFirst": "true"
}
}
]
You would supply the --configurations file://setup.json option to the aws emr create-cluster CLI command, as sketched below.
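A sketch of the two commands involved; the release label, jar path, and main class below are placeholders, not values from the question:

# create the cluster with the userClassPathFirst settings above
aws emr create-cluster --release-label emr-4.1.0 --configurations file://setup.json ...

# submit the job with the newer Guava ahead of the cluster's copy
spark-submit --jars /home/hadoop/guava-19.0.jar --class mobi.vserv.SparkAutomation.App1 app.jar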

NoClassDefFoundError while reading a pdf file

I am trying to read a PDF file. The code is:
try {
    File fileConn = new File(filePath);
    InputStream inp = new FileInputStream(fileConn);
    PdfReader reader = new PdfReader(inp);
    int pages = reader.getNumberOfPages();
    System.out.println("Pages" + pages);
} catch (Exception e) {
    // Handle Exception
}
But the method is throwing NoClassDefFoundError. What could be the possible reason?
Did you add pdfbox and itextpdf to your classpath?
Try this if you're using Maven:
<dependency>
    <groupId>org.apache.pdfbox</groupId>
    <artifactId>pdfbox</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>itextpdf</artifactId>
    <version>5.0.6</version>
</dependency>
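If the error persists even though the jars are present at compile time, they are most likely missing at runtime. A minimal, self-contained check (a sketch assuming iText 5.x, where PdfReader lives in com.itextpdf.text.pdf; the file path comes from the command line):

import java.io.FileInputStream;
import java.io.InputStream;

import com.itextpdf.text.pdf.PdfReader; // provided by the itextpdf artifact above

public class PageCount {
    public static void main(String[] args) throws Exception {
        // if itextpdf is absent from the runtime classpath, the
        // NoClassDefFoundError surfaces on the first use of PdfReader
        try (InputStream inp = new FileInputStream(args[0])) {
            PdfReader reader = new PdfReader(inp);
            System.out.println("Pages: " + reader.getNumberOfPages());
            reader.close();
        }
    }
}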
