I am trying to use documents4j to convert files from one type to another, and I have tried several source and target types.
The code executes successfully and I can intermittently see the converted files being produced. But by the end of the execution the converted files are, I think, deleted automatically; I cannot see them in the temp folder that is created.
I printed the Future conversion object and here is the result:
LocalConversion{pending=false, cancelled=false, done=true, priority=Priority{value=1000, creationTime=1527163966676}, file-system-target=C:\Users\USERNAME\Desktop\New folder\63cabe72-b2cf-4d52-b428-530dfc0fd63d\temp2}.
Is the target file moved to some other location after the conversion?
Or am I missing some lines of code that copies the target file to other location?
I am using version 1.0.3 of the documents4j libraries.
Code:
import java.io.BufferedInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import com.documents4j.api.DocumentType;
import com.documents4j.api.IConverter;
import com.documents4j.job.LocalConverter;
public class Test {

    public static void main(String[] args) {
        try {
            ByteArrayOutputStream bo = new ByteArrayOutputStream();
            InputStream in = new BufferedInputStream(new FileInputStream("SOME_.TXT_FILE"));
            IConverter converter = LocalConverter.builder()
                    .baseFolder(new File("SOME_FOLDER_PATH"))
                    .workerPool(20, 25, 2, TimeUnit.SECONDS)
                    .processTimeout(5, TimeUnit.SECONDS)
                    .build();
            Future<Boolean> conversion = converter
                    .convert(in).as(DocumentType.TEXT)
                    .to(bo).as(DocumentType.DOCX)
                    .prioritizeWith(1000)
                    .schedule();
            conversion.get();
            System.out.println(conversion);
            if (conversion.isDone()) {
                System.out.println("Done");
            } else if (conversion.isCancelled()) {
                System.out.println("Cancelled");
            }
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
I am using Maven for dependency management. My pom.xml:
<properties>
    <documents4j.version>1.0.3</documents4j.version>
</properties>
<dependencies>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-api</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-util-conversion</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-util-all</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-local</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer-msoffice-word</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
    <dependency>
        <groupId>com.documents4j</groupId>
        <artifactId>documents4j-transformer-msoffice-base</artifactId>
        <version>${documents4j.version}</version>
    </dependency>
</dependencies>
I figured it out after some debugging.
The converted document is written back into the ByteArrayOutputStream object bo.
It just needs to be saved as a file:
FileOutputStream fos = new FileOutputStream("C:\\Users\\USERNAME\\Desktop\\New folder\\OTGv4.docx");
bo.writeTo(fos);
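A slightly more robust version of the same step (just a sketch; the output path is a placeholder) uses try-with-resources so the stream is always closed, and shuts the converter down so the background Office bridge is released:

// bo and converter are the ByteArrayOutputStream and IConverter from the code above
try (FileOutputStream fos = new FileOutputStream("C:\\some\\folder\\converted.docx")) {
    bo.writeTo(fos);          // write the converted DOCX bytes to disk
} finally {
    converter.shutDown();     // stops the background conversion bridge documents4j started
}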
Hope this helps!
Related
I tried to replicate the same example given in the following question.
import javax.sql.DataSource;
import org.apache.camel.main.Main;
import org.apache.camel.builder.RouteBuilder;
import org.apache.commons.dbcp.BasicDataSource;
public class JDBCExample {

    private Main main;

    public static void main(String[] args) throws Exception {
        JDBCExample example = new JDBCExample();
        example.boot();
    }

    public void boot() throws Exception {
        // create a Main instance
        main = new Main();
        // enable hangup support so you can press ctrl + c to terminate the JVM
        main.enableHangupSupport();
        String url = "jdbc:oracle:thin:@MYSERVER:1521:myDB";
        DataSource dataSource = setupDataSource(url);
        // bind dataSource into the registry
        main.bind("myDataSource", dataSource);
        // add routes
        main.addRouteBuilder(new MyRouteBuilder());
        // run until you terminate the JVM
        System.out.println("Starting Camel. Use ctrl + c to terminate the JVM.\n");
        main.run();
    }

    class MyRouteBuilder extends RouteBuilder {
        public void configure() {
            String dst = "C:/Local Disk E/TestData/Destination";
            from("direct:myTable")
                .setBody(constant("select * from myTable"))
                .to("jdbc:myDataSource")
                .to("file:" + dst);
        }
    }

    private DataSource setupDataSource(String connectURI) {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        ds.setUsername("sa");
        ds.setPassword("devon1");
        ds.setUrl(connectURI);
        return ds;
    }
}
I have included camel-jdbc-3.0.1.jar and my DB-specific jar file on my classpath.
When I try to compile the code using the following command
javac -cp .;D:\Code\bin JDBCExample.java
I am getting the following error.
JDBCExample.java:2: error: package org.apache.camel.main does not exist
import org.apache.camel.main.Main;
^
JDBCExample.java:3: error: package org.apache.camel.builder does not exist
import org.apache.camel.builder.RouteBuilder;
^
JDBCExample.java:4: error: package org.apache.commons.dbcp does not exist
import org.apache.commons.dbcp.BasicDataSource;
Where am I going wrong? I tried adding camel-core to the classpath, but it didn't help.
Kindly let me know your thoughts, thanks in advance.
You did well by adding camel-core to your classpath, but camel-core and camel-jdbc are not enough; you should also add the following dependencies:
JDBCExample.java:2: error: package org.apache.camel.main does not exist
import org.apache.camel.main.Main;
Add camel-main dependency
JDBCExample.java:4: error: package org.apache.commons.dbcp does not exist
import org.apache.commons.dbcp.BasicDataSource;
Add commons-dbcp dependency
JDBCExample.java:3: error: package org.apache.camel.builder does not exist
import org.apache.camel.builder.RouteBuilder;
Add camel-core dependency
With these and the camel-jdbc dependency, you are good to go.
I suggest that you use Maven to handle your dependencies (and much more) if you can. If you have not used it before, this five-minute quickstart will gently introduce you to it.
Here is a sample pom.xml that resolves all these dependencies correctly
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>demo</groupId>
    <artifactId>camel-jdbc-demo</artifactId>
    <packaging>jar</packaging>
    <version>1.0-SNAPSHOT</version>
    <name>camel-jdbc-demo</name>
    <url>http://maven.apache.org</url>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/commons-dbcp/commons-dbcp -->
        <dependency>
            <groupId>commons-dbcp</groupId>
            <artifactId>commons-dbcp</artifactId>
            <version>1.4</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-main -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-main</artifactId>
            <version>3.0.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-core -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-core</artifactId>
            <version>3.0.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.apache.camel/camel-jdbc -->
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-jdbc</artifactId>
            <version>3.0.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>
I'm trying to run a Spark stream from a Kafka queue containing Avro messages.
As per https://spark.apache.org/docs/latest/sql-data-sources-avro.html I should be able to use from_avro to convert a column's value into a Dataset<Row>.
However, I'm unable to compile the project because it complains that from_avro cannot be found, even though I can see the method declared in the package.class of the dependency.
How can I use the from_avro method from org.apache.spark.sql.avro in my Java code locally?
import java.io.IOException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.*;
import org.apache.spark.sql.avro.*;

public class AvroStreamTest {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Creating local sparkSession here...

        Dataset<Row> df = sparkSession
                .readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "host:port")
                .option("subscribe", "avro_queue")
                .load();

        // Cannot resolve method 'from_avro'...
        df.select(from_avro(col("value"), jsonFormatSchema)).writeStream().format("console")
                .outputMode("update")
                .start();
    }
}
pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-avro_2.11</artifactId>
        <version>2.4.0</version>
    </dependency>
    <!-- more dependencies below -->
</dependencies>
It seems like Java is unable to import names from sql.avro.package.class.
It's because of the generated class names: import it as import org.apache.spark.sql.avro.package$; and then call package$.MODULE$.from_avro(...).
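Applied to the snippet above, that looks roughly like this (a sketch; jsonFormatSchema is assumed to already hold the Avro schema as a JSON string):

import org.apache.spark.sql.avro.package$;
import static org.apache.spark.sql.functions.col;

// package$ is the compiled name of the Scala package object that declares from_avro,
// so the call goes through its MODULE$ singleton
df.select(package$.MODULE$.from_avro(col("value"), jsonFormatSchema))
        .writeStream()
        .format("console")
        .outputMode("update")
        .start();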
You need to include spark-sql-avro in your pom.xml which is available at
https://mvnrepository.com/artifact/org.apache.spark/spark-sql-avro_2.11/2.4.0-palantir.28-1-gdf34e2d
While working through a "Hadoop RPC" sample, I keep getting this error.
Following previous similar questions and answers, I've checked the jar file on my classpath (hadoop-common.jar), and it shows that the jar does contain hadoop.conf.Configuration.class.
And here's the code to build RPCServer:
package rpc;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.RPC;
import org.apache.hadoop.ipc.Server;

public class RPCServer implements MyBizable {

    public String doSomething(String str) {
        return str;
    }

    public static void main(String[] args) throws Exception {
        Server server = new RPC.Builder(new Configuration())
                .setProtocol(MyBizable.class)
                .setInstance(new RPCServer())
                .setBindAddress("***.***.***.***")
                .setPort(****)
                .build();
        server.start();
    }
}
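For reference, MyBizable is just a plain protocol interface along these lines (a minimal sketch, since the actual interface is not part of the snippet; Hadoop RPC protocol interfaces conventionally declare a versionID field):

package rpc;

public interface MyBizable {
    // used by Hadoop's RPC layer for protocol version checking
    long versionID = 1L;

    String doSomething(String str);
}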
And still this error shows up; does anyone know how to solve it?
Any help will be greatly appreciated! Thanks in advance!
Are you using Maven?
If yes, then add the dependencies below.
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
I am trying to get data from Cloudant using Java code and I am getting an error.
I tried with the following Spark and cloudant-spark versions:
Spark 2.0.0,
Spark 2.0.1,
Spark 2.0.2
I get the same error for all versions, as posted below.
If I add the Scala dependencies to resolve this error, they conflict with the Spark library.
Below is my Java code:
package spark.cloudant.connecter;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SQLContext;
import com.cloudant.spark.*;
public class cloudantconnecter {
    public static void main(String[] args) throws Exception {
        try {
            SparkConf sparkConf = new SparkConf().setAppName("spark cloudant connecter").setMaster("local[*]");
            sparkConf.set("spark.streaming.concurrentJobs", "30");
            JavaSparkContext sc = new JavaSparkContext(sparkConf);
            SQLContext sqlContext = new SQLContext(sc);
            System.out.print("initialization successful");

            Dataset<org.apache.spark.sql.Row> st = sqlContext.read().format("com.cloudant.spark")
                    .option("cloudant.host", "HOSTNAME").option("cloudant.username", "USERNAME")
                    .option("cloudant.password", "PASSWORD").load("DATABASENAME");
            st.printSchema();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Maven Dependencies
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>cloudant-labs</groupId>
        <artifactId>spark-cloudant</artifactId>
        <version>2.0.0-s_2.11</version>
    </dependency>
</dependencies>
Error details:
Exception in thread "main" java.lang.NoSuchMethodError: scala/Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object; (loaded from file:/C:/Users/Administrator/.m2/repository/org/scala-lang/scala-library/2.10.6/scala-library-2.10.6.jar by sun.misc.Launcher$AppClassLoader#9f916f97) called from class scalaj.http.HttpConstants$ (loaded from file:/C:/Users/Administrator/.m2/repository/org/scalaj/scalaj-http_2.11/2.3.0/scalaj-http_2.11-2.3.0.jar by sun.misc.Launcher$AppClassLoader#9f916f97).
at scalaj.http.HttpConstants$.liftedTree1$1(Http.scala:637)
at scalaj.http.HttpConstants$.<init>(Http.scala:636)
at scalaj.http.HttpConstants$.<clinit>(Http.scala)
at scalaj.http.BaseHttp$.$lessinit$greater$default$2(Http.scala:754)
at scalaj.http.Http$.<init>(Http.scala:738)
at scalaj.http.Http$.<clinit>(Http.scala)
at com.cloudant.spark.common.JsonStoreDataAccess.getQueryResult(JsonStoreDataAccess.scala:152)
at com.cloudant.spark.common.JsonStoreDataAccess.getTotalRows(JsonStoreDataAccess.scala:99)
at com.cloudant.spark.common.JsonStoreRDD.totalRows$lzycompute(JsonStoreRDD.scala:56)
at com.cloudant.spark.common.JsonStoreRDD.totalRows(JsonStoreRDD.scala:55)
at com.cloudant.spark.common.JsonStoreRDD.totalPartition$lzycompute(JsonStoreRDD.scala:59)
at com.cloudant.spark.common.JsonStoreRDD.totalPartition(JsonStoreRDD.scala:58)
at com.cloudant.spark.common.JsonStoreRDD.getPartitions(JsonStoreRDD.scala:81)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1934)
at org.apache.spark.rdd.RDD$$anonfun$fold$1.apply(RDD.scala:1046)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
at org.apache.spark.rdd.RDD.fold(RDD.scala:1040)
at org.apache.spark.sql.execution.datasources.json.InferSchema$.infer(InferSchema.scala:68)
at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
at org.apache.spark.sql.DataFrameReader$$anonfun$3.apply(DataFrameReader.scala:317)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:316)
at com.cloudant.spark.DefaultSource.create(DefaultSource.scala:127)
at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:105)
at com.cloudant.spark.DefaultSource.createRelation(DefaultSource.scala:100)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:315)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)
at spark.cloudant.connecter.cloudantconnecter.main(cloudantconnecter.java:24)
The error shows up because the Spark libraries in the question are built for Scala 2.10, while the spark-cloudant package is built for Scala 2.11.
So please change spark-core_2.10 (and spark-mllib_2.10) to the _2.11 artifacts.
The dependencies are now:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>2.0.1</version>
</dependency>
<dependency>
    <groupId>cloudant-labs</groupId>
    <artifactId>spark-cloudant</artifactId>
    <version>2.0.0-s_2.11</version>
</dependency>
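After aligning these, running the standard dependency report is a quick way to confirm that only one Scala binary version (2.11 here) is left on the classpath before launching the job again:

mvn dependency:tree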
I need to convert JSON to a POJO. I decided to use Jackson and have added jackson-core-2.2.0.jar, jackson-databind-2.4.4.jar and jackson-annotations-2.1.2.jar to my project's classpath.
I created the following Main class:
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.List;

import com.fasterxml.jackson.core.JsonGenerationException;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
public class Json {

    private static String SRC = "";

    public static void main(String[] args) {
        AwardList awardList = null;
        ObjectMapper mapper = new ObjectMapper();
        try {
            awardList = (AwardList) mapper.readValue(new URL(SRC), AwardList.class);
        } catch (JsonGenerationException e) {
            e.printStackTrace();
        } catch (JsonMappingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println(awardList);
    }
}
And the following AwardList class:
public class AwardList {

    private Flights[] flights;
    private String[] connections;
    private SaverEconomy saverEconomy;
    private StandartEconomy standartEconomy;
    private SaverBusiness saverBusiness;
    private StandartFirst standartFirst;
    private SaverFirst saverFirst;

    public Flights[] getFlights() {
        return flights;
    }

    public void setFlights(Flights[] flights) {
        this.flights = flights;
    }

    public SaverEconomy getSaverEconomy() {
        return saverEconomy;
    }

    public void setSaverEconomy(SaverEconomy saverEconomy) {
        this.saverEconomy = saverEconomy;
    }

    public StandartEconomy getStandartEconomy() {
        return standartEconomy;
    }

    public void setStandartEconomy(StandartEconomy standartEconomy) {
        this.standartEconomy = standartEconomy;
    }

    public SaverBusiness getSaverBusiness() {
        return saverBusiness;
    }

    public void setSaverBusiness(SaverBusiness saverBusiness) {
        this.saverBusiness = saverBusiness;
    }

    public StandartFirst getStandartFirst() {
        return standartFirst;
    }

    public void setStandartFirst(StandartFirst standartFirst) {
        this.standartFirst = standartFirst;
    }

    public SaverFirst getSaverFirst() {
        return saverFirst;
    }

    public void setSaverFirst(SaverFirst saverFirst) {
        this.saverFirst = saverFirst;
    }

    public String[] getConnections() {
        return connections;
    }

    public void setConnections(String[] connections) {
        this.connections = connections;
    }
}
I want to convert the JSON to a POJO and save it in the database. I keep getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonFactory.requiresPropertyOrdering()Z
at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:457)
at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:379)
at Json.main(Json.java:72)
I was getting exactly the same issue. I was using Maven for dependency management and had added a dependency only for the jackson-databind module, like this:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>${jackson.version}</version>
</dependency>
I then resolved it by adding its transitive dependencies explicitly, with the same jackson.version specified for each of them in the pom.xml file, as guided here:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>${jackson.version}</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>${jackson.version}</version>
</dependency>
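If it is unclear which Jackson versions actually end up on the classpath at runtime, a quick check like this can help (a sketch; it just prints the versions baked into the loaded jars, which should match for core and databind):

public class JacksonVersionCheck {
    public static void main(String[] args) {
        // A NoSuchMethodError like the one above usually means these two do not match
        System.out.println("jackson-core:     " + com.fasterxml.jackson.core.json.PackageVersion.VERSION);
        System.out.println("jackson-databind: " + com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION);
    }
}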
I came here with a similar issue on Google App Engine. Here is how I fixed it.
First I ran:
mvn dependency:tree
to find out which dependency was pulling in the older version. I then excluded it from the offending dependency like so:
<dependency>
    <groupId>com.google.appengine.tools</groupId>
    <artifactId>appengine-gcs-client</artifactId>
    <version>0.6</version>
    <exclusions>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>
Next I added the newer version of the dependency in my pom.xml:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.8.7</version>
</dependency>
Hope this helps others who stumble here.
I had the same issue. There was an incompatibility between Jackson version 2.6.3 and another dependency (graphaware-framework-embedded).
I resolved the issue by simply removing the Jackson dependency from my own pom and letting the other dependency pull in whatever Jackson version it needed.
As of now, the latest redisson is:
<dependency>
    <groupId>org.redisson</groupId>
    <artifactId>redisson</artifactId>
    <version>3.13.4</version>
</dependency>
The latest Jackson version (the jackson.version property) is 2.11.2 for me:
<!-- faster JSON -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>${jackson.version}</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>${jackson.version}</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>${jackson.version}</version>
</dependency>
Then update the Eclipse project references; without this, my run configurations were still picking up an earlier version of the Jackson jars:
mvn eclipse:eclipse -U
I faced the same issue just now while upgrading my project from Spring 3 to 4.1.6.
A little background for reference, in case it helps anyone running into the same issue:
First, it prompted me with an error about MappingJacksonHttpMessageConverter vs MappingJackson2HttpMessageConverter, since the former no longer exists in Spring 4. Once I changed those references in code, I ran into an issue with my dispatcher servlet, since it still had the XSD definitions from Spring 3. Once that was corrected, the last step of the journey was this error, which you have all faced as well.
My solution:
I had older versions of the following jars:
jackson-annotations,
jackson-core,
jackson-databind,
jackson-core-asl,
jackson-mapper-asl
I upgraded the first three jars to the 2.10.0.pr1 release and the last two to the 1.9.13 release.
I also had one more older jar, com.fasterxml.jackson.core.jar, which I had to remove as well. Anyway, as the error suggested, this was the key reason for the mismatch (like Tobias said in the comment prior to mine).
I think only the last paragraph is important from this particular question's point of view, but I am hoping my narration of the issues leading up to it may save someone 4-5 hours :) Cheers, folks!
For me, updating to a recently released version, 2.12.4, made it work fine.
Note: I was using another project inside the build path of a larger project in Eclipse. The Jackson update was done inside the pom.xml file of the smaller project.