log4j: trying to log to a file - Java

I am trying to use log4j to log to a file. Here's the code:
protected static Logger logger = Logger.getLogger(Application.class);
private static final String DIRECTORY = "/Users/me/Desktop";
private static final String EXTENSION = ".log";

protected void setupLogger(String fileName) {
    SimpleLayout layout = new SimpleLayout();
    FileAppender appender = new FileAppender(layout, DIRECTORY + "/logs/" + fileName + EXTENSION, false);
    logger.addAppender(appender);
    logger.setLevel((Level) Level.DEBUG);
}
Here's the pom that I use: http://pastebin.com/vXdFtzSU
The error that I am getting is:
Error:(40, 28) java: incompatible types: org.apache.log4j.FileAppender cannot be converted to org.apache.log4j.Appender
I am trying to follow this answer: configure log4j to log to custom file at runtime.

Try changing your Maven dependencies. Add this dependency:
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
and change the Spring Boot dependencies to exclude their logging module:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-thymeleaf</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Then wrap the appender creation in a try/catch, since the FileAppender constructor throws IOException:
protected void setupLogger(String fileName) {
    try {
        SimpleLayout layout = new SimpleLayout();
        FileAppender appender;
        appender = new FileAppender(layout, DIRECTORY + "/logs/" + fileName + EXTENSION, false);
        logger.addAppender(appender);
        logger.setLevel((Level) Level.DEBUG);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
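For completeness, a minimal usage sketch (my addition; it assumes setupLogger lives in the same Application class that owns the logger field):

public static void main(String[] args) {
    Application app = new Application();
    app.setupLogger("application"); // writes /Users/me/Desktop/logs/application.log
    logger.debug("logger wired up");
}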

It almost looks like you are using a different version of the library at runtime than at compile time. If the types were truly incompatible, that would generate a compiler error. If you are running your program in a special environment like Tomcat, check whether the same version of log4j is installed there.
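A quick way to check which jar actually supplies the log4j classes at runtime (a diagnostic sketch of mine, not part of the original answer; if the two locations differ, two log4j jars are on the classpath):

// Prints the jar location the JVM loaded each log4j class from.
System.out.println(org.apache.log4j.FileAppender.class
        .getProtectionDomain().getCodeSource().getLocation());
System.out.println(org.apache.log4j.Appender.class
        .getProtectionDomain().getCodeSource().getLocation());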

Why not use log4j.properties? It's a very simple way to configure printing to a file:
log4j.logger.register=INFO,R7
log4j.appender.R7=org.apache.log4j.DailyRollingFileAppender
log4j.appender.R7.DatePattern='.'yyyyMMdd
log4j.appender.R7.File=/appLogs/address/logFile.log
log4j.appender.R7.layout=org.apache.log4j.PatternLayout
log4j.appender.R7.layout.ConversionPattern=%d{dd MMM yyyy HH:mm:ss} | %m%n
private static final Logger logger = Logger.getLogger("register");
logger.info("print to file and console");
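Note that log4j only picks up log4j.properties automatically when the file is on the classpath (in a Maven build that typically means src/main/resources; an assumption about your layout). A minimal self-contained sketch using the "register" logger configured above:

import org.apache.log4j.Logger;

public class RegisterDemo {
    // "register" matches the logger name in log4j.properties above
    private static final Logger logger = Logger.getLogger("register");

    public static void main(String[] args) {
        logger.info("print to file and console");
    }
}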

Related

Java SSHJ library and enabling logging

Below is an example section of my Java 1.8 program. It appears to be failing when trying to authenticate: it goes through a number of authentication methods and then declares it has run out.
I would like to see debug information from within the sshj library to help me determine what's failing: username, key exchange, or something else. I am familiar with log4j and I can put logging statements within my own code, but I can't find a simple-to-follow example that shows how to hook up log4j to slf4j and then tell sshj to use the logger.
SSHClient sshClient = new SSHClient();
try
{
    String username = "testuser";
    File privateKey = new File("/mykeys/keyname");
    KeyProvider keys;
    sshClient.addHostKeyVerifier(new PromiscuousVerifier());
    keys = sshClient.loadKeys(privateKey.getPath());
    sshClient.connect("1.2.3.4", 22);
    sshClient.authPublickey(username, keys);
    SFTPClient sftpClient = sshClient.newSFTPClient();
    sftpClient.put("./send/file1.xml", "file1.xml");
    sshClient.close();
}
catch (UserAuthException e)
{
    // TODO Auto-generated catch block
    System.out.println(e.getMessage());
}
catch (TransportException e)
{
    // TODO Auto-generated catch block
    System.out.println(e.getMessage());
}
catch (IOException e)
{
    // TODO Auto-generated catch block
    System.out.println(e.getMessage());
}
Adding

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.6.6</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.6.6</version>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.16</version>
</dependency>

to pom.xml and creating a log4j.properties file did the trick for me:
# Define the root logger with appender file
log = ssh-test.log
log4j.rootLogger = DEBUG, FILE
# Define the file appender
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=${log}/log.out
# Define the layout for file appender
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.conversionPattern=%m%n
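To confirm the bridge is wired up before involving sshj, a minimal sketch (my addition, assuming the three dependencies above are on the classpath): log through the slf4j API, which is exactly what sshj does internally, and check that the message lands in the configured file.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingSmokeTest {
    public static void main(String[] args) {
        // slf4j-log4j12 routes this call to the log4j FILE appender configured above
        Logger log = LoggerFactory.getLogger(LoggingSmokeTest.class);
        log.debug("If this reaches the file, sshj's debug output will too.");
    }
}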

Programmatically connect LDAP and authenticate credentials in AEM

I want to connect to LDAP programmatically in AEM, using a Maven dependency that resolves in OSGi.
Approaches and the issues faced with each:
1. Cannot use

@Reference
private ExternalIdentityProviderManager externalIdentityProviderManager;

final String externalId = request.getParameter("externalId");
final String externalPassword = request.getParameter("externalPassword");
final ExternalIdentityProvider idap = externalIdentityProviderManager.getProvider("ldap");
final SimpleCredentials credentials = new SimpleCredentials(externalId, externalPassword.toCharArray());
final ExternalUser externalUser = idap.authenticate(credentials);

as this identity provider configuration is only present on the author environment and not on the publish servers (as per requirements).
2. Trying to use
<dependency>
    <groupId>org.apache.directory.api</groupId>
    <artifactId>api-ldap-client-api</artifactId>
    <version>2.0.0.AM4</version>
</dependency>
to resolve dependencies. It resolves my compile-time errors, but it is not an 'OSGi ready' library, so it couldn't be installed in OSGi; if installed manually, it has further unresolved dependencies.
Code references for this approach:
https://directory.apache.org/api/user-guide/2.1-connection-disconnection.html
https://directory.apache.org/api/user-guide/2.10-ldap-connection-template.html
3. I've also tried to use
String rootDN = "uid=admin,ou=system";
String rootPWD = "secret";

Hashtable<String, String> environment = new Hashtable<String, String>();
environment.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
environment.put(Context.PROVIDER_URL, "ldap://localhost:10389");
environment.put(Context.SECURITY_AUTHENTICATION, "simple");
environment.put(Context.SECURITY_PRINCIPAL, rootDN);
environment.put(Context.SECURITY_CREDENTIALS, rootPWD);

DirContext dirContext = null;
NamingEnumeration<?> results = null;
dirContext = new InitialDirContext(environment);

SearchControls controls = new SearchControls();
controls.setSearchScope(SearchControls.SUBTREE_SCOPE);

String userId = "abhishek";
String userPwd = "{SSHA}ip/DD+zUhv22NH3wE1dvJN7oauYE4TYQ3ziRtg=="; // "apple"
String filter = "(&(objectclass=person)(uid=" + userId + ")(userPassword=" + userPwd + "))";

results = dirContext.search("", filter, controls);
if (results.hasMore()) {
    System.out.println("User found");
} else {
    System.out.println("User not found");
}
It has two issues:
a) It works fine when tested as a plain Java class in a main method, but when integrated into an AEM/OSGi service class it throws:

javax.naming.NotContextException: Not an instance of DirContext
    at javax.naming.directory.InitialDirContext.getURLOrDefaultInitDirCtx(InitialDirContext.java:111)
    at javax.naming.directory.InitialDirContext.search(InitialDirContext.java:267)

b) Even in the plain Java class, I had to provide the hashed password to validate, which would be difficult to integrate:
String userPwd = "{SSHA}ip/DD+zUhv22NH3wE1dvJN7oauYE4TYQ3ziRtg==";//"apple";
Can someone suggest a Maven dependency/library that can integrate with OSGi and resolve its dependencies, and with which I don't need to provide a hashed password to validate user credentials? Any approach that may resolve these issues?
Step 1:
Add these dependencies to the project pom:
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-pool2</artifactId>
    <version>2.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.directory.api</groupId>
    <artifactId>api-all</artifactId>
    <version>1.0.0-RC2</version>
</dependency>
<dependency>
    <groupId>org.apache.mina</groupId>
    <artifactId>mina-core</artifactId>
    <version>2.1.3</version>
</dependency>
<dependency>
    <groupId>commons-pool</groupId>
    <artifactId>commons-pool</artifactId>
    <version>1.6</version>
</dependency>
<dependency>
    <groupId>antlr</groupId>
    <artifactId>antlr</artifactId>
    <version>2.7.7</version>
</dependency>
Step 2:
Add them to the bundle pom:
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-pool2</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.directory.api</groupId>
    <artifactId>api-all</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.mina</groupId>
    <artifactId>mina-core</artifactId>
</dependency>
<dependency>
    <groupId>commons-pool</groupId>
    <artifactId>commons-pool</artifactId>
</dependency>
<dependency>
    <groupId>antlr</groupId>
    <artifactId>antlr</artifactId>
</dependency>
Step 3:
In the bundle pom, in the plugin configuration:
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <Import-Package>!net.sf.cglib.proxy, javax.inject;version=0.0.0,*</Import-Package>
            <Export-Package />
            <Sling-Model-Packages></Sling-Model-Packages>
            <Bundle-SymbolicName></Bundle-SymbolicName>
            <Embed-Dependency>antlr, mina-core, api-all, commons-pool, commons-pool2</Embed-Dependency>
        </instructions>
    </configuration>
</plugin>
The key settings in the above-mentioned plugin are:

<Import-Package>!net.sf.cglib.proxy</Import-Package>
<Embed-Dependency>antlr, mina-core, api-all, commons-pool, commons-pool2</Embed-Dependency>
Step 4:
The imports are specific and work only when
<dependency>
    <groupId>org.apache.directory.api</groupId>
    <artifactId>api-all</artifactId>
    <version>1.0.0-RC2</version>
</dependency>
is used. There are other dependencies that provide the same packages/classes, but they fail at one point or another.
import org.apache.directory.api.ldap.model.message.SearchScope;
import org.apache.directory.ldap.client.api.DefaultPoolableLdapConnectionFactory;
import org.apache.directory.ldap.client.api.LdapConnectionConfig;
import org.apache.directory.ldap.client.api.LdapConnectionPool;
import org.apache.directory.ldap.client.template.LdapConnectionTemplate;
import org.apache.directory.ldap.client.template.PasswordWarning;
import org.apache.directory.ldap.client.template.exception.PasswordException;

private String ldapAuthenticationApacheDsFlow(final SlingHttpServletRequest request) {
    String status = "";
    try {
        LdapConnectionConfig config = new LdapConnectionConfig();
        config.setLdapHost("localhost");
        config.setLdapPort(10389);
        config.setName("uid=admin,ou=system");
        config.setCredentials("secret");

        final DefaultPoolableLdapConnectionFactory factory = new DefaultPoolableLdapConnectionFactory(config);
        final LdapConnectionPool pool = new LdapConnectionPool(factory);
        pool.setTestOnBorrow(true);
        final LdapConnectionTemplate ldapConnectionTemplate = new LdapConnectionTemplate(pool);

        final String uid = request.getParameter("externalId");
        final String password = request.getParameter("externalPassword");

        final PasswordWarning warning = ldapConnectionTemplate.authenticate(
                "ou=Users,dc=example,dc=com", "(uid=" + uid + ")", SearchScope.SUBTREE, password.toCharArray());
        status = "User credentials authenticated";
        if (warning != null) {
            status = status + " \n Warning!! " + warning.toString();
        }
    } catch (final PasswordException e) {
        status = e.toString();
        e.printStackTrace();
    }
    return status;
}
If no PasswordException is thrown at the ldapConnectionTemplate.authenticate(...) call, the user credentials were successfully validated.
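One caveat (my addition, not from the original answer): the method above builds a new connection pool on every request. In an OSGi component you would typically create the pool once and close it on deactivation; a rough sketch, assuming pool and ldapConnectionTemplate become fields and the usual Declarative Services annotations are available:

@Activate
protected void activate() {
    LdapConnectionConfig config = new LdapConnectionConfig();
    config.setLdapHost("localhost");
    config.setLdapPort(10389);
    config.setName("uid=admin,ou=system");
    config.setCredentials("secret");
    pool = new LdapConnectionPool(new DefaultPoolableLdapConnectionFactory(config));
    ldapConnectionTemplate = new LdapConnectionTemplate(pool);
}

@Deactivate
protected void deactivate() {
    try {
        pool.close(); // releases the pooled LDAP connections
    } catch (Exception e) {
        // nothing useful to do on shutdown
    }
}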

Java: Read JSON from a file, convert to ORC and write to a file

I need to automate the JSON-to-ORC conversion process. I was able to almost get there by using Apache's ORC-tools package, except that JsonReader doesn't handle the Map type and throws an exception. So, the following works but doesn't handle the Map type.
Path hadoopInputPath = new Path(input);
try (RecordReader recordReader = new JsonReader(hadoopInputPath, schema, hadoopConf)) { // throws when schema contains Map type
    try (Writer writer = OrcFile.createWriter(new Path(output), OrcFile.writerOptions(hadoopConf).setSchema(schema))) {
        VectorizedRowBatch batch = schema.createRowBatch();
        while (recordReader.nextBatch(batch)) {
            writer.addRowBatch(batch);
        }
    }
}
So, I started looking into using Hive classes for the JSON-to-ORC conversion, which has the added advantage that in the future I can convert to other formats, such as Avro, with minor code changes. However, I am not sure of the best way to do this using Hive classes. Specifically, it's not clear how to write an HCatRecord to a file, as shown below.
HCatRecordSerDe hCatRecordSerDe = new HCatRecordSerDe();
SerDeUtils.initializeSerDe(hCatRecordSerDe, conf, tblProps, null);

OrcSerde orcSerde = new OrcSerde();
SerDeUtils.initializeSerDe(orcSerde, conf, tblProps, null);

Writable orcOut = orcSerde.serialize(hCatRecord, hCatRecordSerDe.getObjectInspector());
assertNotNull(orcOut);

InputStream input = getClass().getClassLoader().getResourceAsStream("test.json.snappy");
SnappyCodec compressionCodec = new SnappyCodec();
try (CompressionInputStream inputStream = compressionCodec.createInputStream(input)) {
    LineReader lineReader = new LineReader(new InputStreamReader(inputStream, Charsets.UTF_8));
    String jsonLine = null;
    while ((jsonLine = lineReader.readLine()) != null) {
        Writable jsonWritable = new Text(jsonLine);
        DefaultHCatRecord hCatRecord = (DefaultHCatRecord) jsonSerDe.deserialize(jsonWritable);
        // TODO: Write ORC to file????
    }
}
Any ideas on how to complete the code above, or on simpler ways of doing the JSON-to-ORC conversion, would be greatly appreciated.
Here is what I ended up doing using the Spark libraries, per cricket_007's suggestion:
Maven dependency (with some exclusions to keep maven-duplicate-finder-plugin happy):
<properties>
    <dep.jackson.version>2.7.9</dep.jackson.version>
    <spark.version>2.2.0</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_${scala.binary.version}</artifactId>
    <version>${dep.jackson.version}</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>apache-log4j-extras</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
        </exclusion>
        <exclusion>
            <groupId>net.java.dev.jets3t</groupId>
            <artifactId>jets3t</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.google.code.findbugs</groupId>
            <artifactId>jsr305</artifactId>
        </exclusion>
        <exclusion>
            <groupId>stax</groupId>
            <artifactId>stax-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.objenesis</groupId>
            <artifactId>objenesis</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Java code synopsis:
SparkConf sparkConf = new SparkConf()
        .setAppName("Converter Service")
        .setMaster("local[*]");
SparkSession sparkSession = SparkSession.builder().config(sparkConf).enableHiveSupport().getOrCreate();

// read input data
Dataset<Row> events = sparkSession.read()
        .format("json")
        .schema(inputConfig.getSchema()) // StructType describing the input schema
        .load(inputFile.getPath());

// write data out; note that save() returns void, so its result is not assigned
events
        .selectExpr(
                // useful if you want to change the schema before writing it to ORC,
                // e.g. ["`col1` as `FirstName`", "`col2` as `LastName`"]
                JavaConversions.asScalaBuffer(outputSchema.getColumns()))
        .write()
        .options(ImmutableMap.of("compression", "zlib"))
        .format("orc")
        .save(outputUri.getPath());
Hope this helps somebody to get started.
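To sanity-check the result, the ORC output can be read straight back with the same session (a quick sketch of mine, reusing the names from the snippet above):

// Read the ORC output back and spot-check the schema and a few rows.
Dataset<Row> check = sparkSession.read().format("orc").load(outputUri.getPath());
check.printSchema();
check.show(5, false);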

java.lang.NoClassDefFoundError: com/fasterxml/jackson/core/JsonFactory getting this error even after adding the desired imports

Here is my code. I've added all the dependencies but I am still getting this error. The jar I added is:
google-http-client-jackson2-1.17.0-rc.jar
In the code below, I get the above-mentioned error at JsonFactory jsonFactory = new JacksonFactory();
import com.google.api.services.customsearch.Customsearch;
import com.google.api.services.customsearch.model.Search;
import com.google.api.services.customsearch.model.Result;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;

protected SearchResult[] doSearch() {
    HttpRequestInitializer httpRequestInitializer = new HttpRequestInitializer() {
        @Override
        public void initialize(HttpRequest request) throws IOException {
        }
    };
    JsonFactory jsonFactory = new JacksonFactory();
    Customsearch csearch = new Customsearch(new NetHttpTransport(), jsonFactory, httpRequestInitializer);
    Customsearch.Cse.List listReqst;
    try {
        listReqst = csearch.cse().list(query.getQueryString());
        listReqst.setKey(GOOGLE_KEY);
        // set the search engine ID got from API console
        listReqst.setCx("search engine ID");
        // set the query string
        listReqst.setQ(query.getQueryString());
        // language chosen is English for search results
        listReqst.setLr("lang_en");
        // set hit position of first search result
        listReqst.setStart((long) firstResult);
        // set max number of search results to return
        listReqst.setNum((long) maxResults);
        // performs search
        Search result = listReqst.execute();
        java.util.List<Result> results = result.getItems();
        // size the arrays from the returned item list (result.size() would count JSON keys)
        String urls[] = new String[results.size()];
        String snippets[] = new String[results.size()];
        int i = 0;
        for (Result r : results) {
            urls[i] = r.getLink();
            snippets[i] = r.getSnippet();
            i++;
        }
        return getResults(snippets, urls, true);
    } catch (IOException e) {
        // TODO Auto-generated catch block
        MsgPrinter.printSearchError(e);
        System.exit(1);
        return null;
    }
}
Kindly suggest how this should be fixed.
To answer the question directly (it was answered in the comments by Pavel): the Jackson core lib dependency was missing:
jackson-core-$x.y.z.jar
This happened to me when I had two Jackson versions on the classpath: codehaus vs. fasterxml.
Removing the fasterxml version (which was a transitive dependency of swagger) fixed the issue:
<dependency>
    <groupId>io.swagger</groupId>
    <artifactId>swagger-jersey-jaxrs</artifactId>
    <version>1.5.3</version>
    <exclusions>
        <exclusion>
            <groupId>javax.ws.rs</groupId>
            <artifactId>jsr311-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.jaxrs</groupId>
            <artifactId>jackson-jaxrs-json-provider</artifactId>
        </exclusion>
        <!--
        <exclusion>
            <groupId>com.fasterxml.jackson.datatype</groupId>
            <artifactId>jackson-datatype-joda</artifactId>
        </exclusion>
        -->
    </exclusions>
</dependency>
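A general Maven tip (mine, not from the original answer): running mvn dependency:tree and searching the output for jackson quickly shows which artifact drags in the second Jackson version, so you know where to place the exclusion.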
I had a similar problem and eventually discovered it was an issue with the build path and dependencies. The easiest (though not most efficient) fix is to add all the google-api-client jars to your project, and the error will disappear. A better way is to track down and properly add only the actual dependencies of JacksonFactory.
Download the Jackson Maven dependency from here, then add it to your dependencies.
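For reference, the coordinates look like this (the version below is just an example of mine; pick whatever matches the rest of your stack):

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <!-- example version, an assumption; align with your other Jackson artifacts -->
    <version>2.9.8</version>
</dependency>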

Classes not found in maven project despite the library being included in the pom

I'm trying to run this open-source project, rokuality-server (full codebase here: https://github.com/rokuality/rokuality-server).
But in this method below I'm getting a java.lang.NoClassDefFoundError when trying to instantiate any Sikuli classes like Pattern or Finder.
import org.sikuli.script.Finder;
import org.sikuli.script.Image;
import org.sikuli.script.Match;
import org.sikuli.script.Pattern;

@SuppressWarnings("unchecked")
public class ImageUtils {

    private static JSONObject getElement(String sessionID, String locator, File screenImage) {
        JSONObject element = new JSONObject();
        element.put(SessionConstants.ELEMENT_ID, UUID.randomUUID().toString());
        boolean isFile = locator != null && new File(locator).exists();
        boolean elementFound = false;
        if (!screenImage.exists()) {
            return null;
        }
        if (isFile) {
            Finder finder = null;
            float similarity = Float.valueOf(
                    SessionManager.getSessionInfo(sessionID).get(SessionConstants.IMAGE_MATCH_SIMILARITY).toString());
            Pattern pattern = null;
            try {
                // ******** THIS LINE BELOW THROWS THE ERROR ********
                pattern = new Pattern(locator).similar(similarity);
                finder = new Finder(screenImage.getAbsolutePath());
            } catch (Exception e) {
                Log.getRootLogger().warn(e);
            }
        }
        // more code here
    }
}
My suspicion is that something in the pom.xml file is wrong, so here's the SikuliX API dependency as it appears there:
<dependency>
    <groupId>com.sikulix</groupId>
    <artifactId>sikulixapi</artifactId>
    <version>1.1.2</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-nop</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.github.vidstige</groupId>
            <artifactId>jadb</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.sikulix</groupId>
            <artifactId>sikulix2tigervnc</artifactId>
        </exclusion>
    </exclusions>
</dependency>
I tried changing the version to the latest one, 2.0.0, but it caused some errors in the project, which I think are related to changes in the methods of the org.sikuli.script.Image class. Do I maybe need an earlier version?
This should be fixed in the newer releases of the rokuality project:
https://github.com/rokuality/rokuality-server/releases. It depended on the Java JDK version the user was running.
