log4j2 with MongoDB: issue logging a throwable with a message - java

I'm trying to set up log4j2 with MongoDB 3 and everything works fine. The only problem is that when I log a throwable together with a message, like this
logger.error("Some Message", new Exception("Test"));
I get this error:
Caused by: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class org.apache.logging.log4j.mongodb3.MongoDbDocumentObject.
but I can easily log an error with
logger.error(new Exception("Test"));
without any issue.
Our project mainly uses slf4j for logging, and all of our error logs are styled like the first example.
Is there any way I can fix this error without changing all the exception logs?
Also, my config is fairly simple:
<NoSql name="databaseAppender">
    <MongoDb3 databaseName="admin" collectionName="testLogger" server="localhost"
              username="***" password="****" />
</NoSql>
Thank you

After a lot of headache and testing, I figured out how to solve this problem.
I'm just posting my result here in case someone else runs into this issue.
For some reason org.apache.logging.log4j.mongodb3.MongoDbDocumentObject is not registered with the MongoDB codec registry automatically, so I had to create a connection factory class that registers it.
public static MongoClient getMongoClient() {
    MongoClientOptions.Builder optionsBuilder = MongoClientOptions.builder()
            .codecRegistry(CodecRegistries.fromRegistries(
                    CodecRegistries.fromCodecs(new LevelCodec()),
                    // Register a POJO codec for the log4j document class the driver can't find:
                    CodecRegistries.fromProviders(
                            PojoCodecProvider.builder().register(MongoDbDocumentObject.class).build()),
                    MongoClient.getDefaultCodecRegistry()));
    // Standard connection string format: mongodb://user:password@host:port
    MongoClientURI uri = new MongoClientURI(
            "mongodb://" + userName + ":" + passWord + "@" + dbUrl + ":" + port, optionsBuilder);
    return new MongoClient(uri);
}
Obviously you have to change your log4j appender to use this class and method:
<NoSql name="databaseAppender">
    <MongoDb3 databaseName="test" collectionName="testLogger" factoryClassName="ClassName"
              factoryMethodName="getMongoClient" />
</NoSql>
My dependencies are:
implementation group: 'org.slf4j', name: 'slf4j-api', version: '1.7.30'
// Log4j
implementation group: 'org.apache.logging.log4j', name: 'log4j-core', version: '2.13.3'
implementation group: 'org.apache.logging.log4j', name: 'log4j-mongodb3', version: '2.13.3'
implementation group: 'org.apache.logging.log4j', name: 'log4j-slf4j-impl', version: '2.13.3'
implementation group: 'com.fasterxml.jackson.dataformat', name: 'jackson-dataformat-xml', version: '2.9.4'
// https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver
implementation group: 'org.mongodb', name: 'mongo-java-driver', version: '3.12.7'

Related

Imported Gradle Dependency not working at Java runtime

We are migrating from older auth methods to the Microsoft Graph API.
Previously, dependency management in this project was very manual; it included downloading .jar files and importing them into IntelliJ by hand for local development.
Since we need to pull in new dependencies for the Graph API SDK, I took it as a chance to pay off some tech debt and move toward more programmatic dependency management.
I was able to get the web app to run properly; however, bringing in the new dependencies
https://mvnrepository.com/artifact/com.microsoft.azure/msal4j
https://mvnrepository.com/artifact/com.microsoft.graph/microsoft-graph
https://mvnrepository.com/artifact/com.azure/azure-core
did not seem to work fully.
Our build.gradle looks like this:
dependencies {
    providedCompile 'com.microsoft.graph:microsoft-graph:5.40.0'
    providedCompile 'com.azure:azure-identity:1.7.0'
    // https://mvnrepository.com/artifact/com.azure/azure-core
    implementation group: 'com.azure', name: 'azure-core', version: '1.34.0'
    // https://mvnrepository.com/artifact/com.microsoft.azure/msal4j
    implementation 'com.microsoft.azure:msal4j:1.13.3'
    providedCompile fileTree(dir: "${webAppDirName}/WEB-INF/lib", excludes: [
            'aspose.pdf.jar',
            'aspose-pdf.jar',
            'async-http-client-2.4.4.jar',
            'commons-beanutils-core.jar',
            'commons-collections.jar',
            'commons-fileupload-1.1.jar',
            'commons-fileupload-1.2.2.jar',
            'commons-io-1.1.jar',
            'commons-io-2.1.jar',
            'commons-lang-2.3.jar',
            'commons-logging.jar',
            'commons-logging-1.0.jar',
            'commons-logging-1.1.1.jar',
            'commons-logging-1.1.3.jar',
            'core-1.5.0.jar',
            'empty.jar',
            'firebase-admin-6.3.0.jar',
            'groovy-all-1.7.5.jar',
            'httpclient-4.1.2.jar',
            'httpclient-4.5.jar',
            'httpcore-4.1.2.jar',
            'httpcore-4.4.1.jar',
            'httpmime-4.1.2.jar',
            'itext-2.0.4.jar',
            'jackson-annotations-2.4.0.jar',
            'jackson-annotations-2.8.0.jar',
            'jackson-core-2.4.0.jar',
            'jackson-databind-2.4.0-rc3.jar',
            'jackson-jaxrs-1.9.2.jar',
            'jackson-xc-1.9.2.jar',
            'jasperreports-5.6.0.jar',
            'jersey-json-1.17.1.jar',
            'jsch-0.1.53.jar',
            'json_simple-1.1.jar',
            'mailapi.jar',
            'netty-all-4.1.23.Final.jar',
            'opentok-server-sdk-2.2.0.jar',
            'opentok-server-sdk-4.2.0.jar',
            'reactive-streams-1.0.2.jar',
            'servlet.jar',
            'soap.jar',
            'twilio-7.1.0-jar-with-dependencies.jar',
            'twilio-java-sdk-3.3.10-jar-with-dependencies.jar',
            'webprovider-1.5.0.jar'
    ], include: '*.jar')
    providedCompile 'org.apache.tomcat:tomcat-catalina:8.5.32'
}
The code in particular that I am running into an issue with is located here
public MicrosoftEmailClient() {
    ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
            .clientId(CLIENT_ID)
            .clientSecret(CLIENT_SECRET)
            .tenantId(TENANT_ID)
            .build();
    TokenCredentialAuthProvider tokenCredentialAuthProvider =
            new TokenCredentialAuthProvider(SCOPE, clientSecretCredential);
    graphClient =
            GraphServiceClient
                    .builder()
                    .authenticationProvider(tokenCredentialAuthProvider)
                    .buildClient();
}
and the error itself is:
Exception in thread "Thread-8" java.lang.NoClassDefFoundError: com/azure/core/credential/TokenCredential
For context, we are deploying this project through Tomcat 8.5+.
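One thing that may be worth checking (an assumption on my part, since the full deployment setup isn't shown): providedCompile dependencies are available at compile time but are excluded from the packaged WAR, so the container must supply them at runtime. Splitting the same SDK stack between providedCompile (microsoft-graph, azure-identity) and implementation (azure-core, msal4j) means some of its jars are packaged and others aren't, which commonly produces a runtime NoClassDefFoundError like the one above. A sketch of a dependencies block that bundles the whole Graph/Azure stack into the WAR:

```groovy
dependencies {
    // Sketch (an assumption, not a confirmed fix): package the whole
    // Graph/Azure stack in the WAR rather than splitting it between
    // providedCompile (excluded from the WAR) and implementation (included).
    // Versions are copied from the question.
    implementation 'com.microsoft.graph:microsoft-graph:5.40.0'
    implementation 'com.azure:azure-identity:1.7.0'
    implementation 'com.azure:azure-core:1.34.0'
    implementation 'com.microsoft.azure:msal4j:1.13.3'

    // Only container-supplied APIs should remain provided; Tomcat ships these.
    providedCompile 'org.apache.tomcat:tomcat-catalina:8.5.32'
}
```

It may also be worth scanning the WEB-INF/lib fileTree for an older conflicting jar that could shadow the new dependencies.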

Spark SQL execution fails with error: java.lang.NoClassDefFoundError: org.codehaus.janino.InternalCompilerException

Running a Spark SQL program in Java fails immediately with the following exception, as soon as the first action is called on a dataset. I have tried all the suggestions in Spark SQL fails with java.lang.NoClassDefFoundError: org/codehaus/commons/compiler/UncheckedCompileException; nothing seems to work. I also tried upgrading Spark versions and still face the same error.
Please note I am not running with spark-submit, but via java -jar <app-name>.
Below is the Spark Gradle config:
compile group: 'org.apache.spark', name: 'spark-sql_2.12', version: '2.4.3'
implementation 'org.codehaus.janino:commons-compiler:3.0.16'
implementation 'org.codehaus.janino:janino:3.0.16'
I tried the exclusion config below too; the same error persists:
implementation('org.apache.spark:spark-sql_2.12:2.4.3') {
    exclude group: 'org.codehaus.janino', module: 'janino'
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
}
compile "org.codehaus.janino:commons-compiler:3.0.16"
compile "org.codehaus.janino:janino:3.0.16"
Exception stack trace:
java.lang.NoClassDefFoundError: org.codehaus.janino.InternalCompilerException
at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.variable(javaCode.scala:63)
at org.apache.spark.sql.catalyst.expressions.codegen.JavaCode$.isNullVariable(javaCode.scala:76)
at org.apache.spark.sql.catalyst.expressions.Expression.$anonfun$genCode$3(Expression.scala:109)
at org.apache.spark.sql.catalyst.expressions.Expression$$Lambda$2984/0x00000000bfa2b020.apply(Unknown Source)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.catalyst.expressions.Expression.genCode(Expression.scala:105)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.$anonfun$create$1(GenerateSafeProjection.scala:155)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$$$Lambda$2982/0x00000000ef9c1620.apply(Unknown Source)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
at scala.collection.TraversableLike$$Lambda$1442/0x00000000ff849e20.apply(Unknown Source)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.TraversableLike.map(TraversableLike.scala:238)
at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
at scala.collection.immutable.List.map(List.scala:298)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:152)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:38)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1193)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3382)
at org.apache.spark.sql.Dataset.$anonfun$collectAsList$1(Dataset.scala:2794)
at org.apache.spark.sql.Dataset$$Lambda$2827/0x00000000af983620.apply(Unknown Source)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3364)
at org.apache.spark.sql.Dataset$$Lambda$2919/0x00000000cf9f4220.apply(Unknown Source)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$$$Lambda$2920/0x00000000cfa4f020.apply(Unknown Source)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
at org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2793)
Code implementation:
SparkConf sparkConf = new SparkConf().setAppName("App demo").setMaster("local[*]");
try (SparkSession sparkSession = createSparkSession(sparkConf)) {
    Dataset<Row> df = sparkSession.read().json("/Users/shubhampr/Documents/spark/examples/src/main/resources/people.json");
    df.show();
} catch (Exception e) {
    log.error("Error in processing file: {}", e.getMessage());
    return;
}

SparkSession createSparkSession(SparkConf sparkConf) {
    return SparkSession.builder()
            .sparkContext(new JavaSparkContext(sparkConf).sc())
            .getOrCreate();
}
We needed to pin janino and commons-compiler strictly to version '3.0.16'.
Below is the fix which worked:
// Spark lib
compile "org.apache.spark:spark-core_2.12:2.4.3"
compile "org.apache.spark:spark-sql_2.12:2.4.3"
implementation('org.codehaus.janino:commons-compiler') {
    version {
        strictly '3.0.16'
    }
}
implementation('org.codehaus.janino:janino') {
    version {
        strictly '3.0.16'
    }
}
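For anyone on a Gradle version without rich version constraints, an equivalent way to express the same pin (my sketch, not part of the original answer) is a resolution strategy that forces the janino artifacts:

```groovy
configurations.all {
    resolutionStrategy {
        // Force every configuration to resolve janino 3.0.16, overriding
        // whatever version spark-sql pulls in transitively.
        force 'org.codehaus.janino:janino:3.0.16'
        force 'org.codehaus.janino:commons-compiler:3.0.16'
    }
}
```

Either way, running gradle dependencies shows which janino version actually resolved on each configuration.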

How to fix: "Unrecognized token 'Unrecognized': was expecting ('true', 'false' or 'null')" using Horton schema-registry

I'm trying to use the Hortonworks schema registry so I can use the Avro format for messages in Kafka. The problem is that when I try to publish an Avro message I get this error:
Caused by: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'Unrecognized': was expecting ('true', 'false' or 'null')
at [Source: (String)"Unrecognized field "initialState" (class com.hortonworks.registries.schemaregistry.SchemaVersion), not marked as ignorable (2 known properties: "description", "schemaText"])
at [Source: org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream#3daa4db5; line: 1, column: 321] (through reference chain: com.hortonworks.registries.schemaregistry.SchemaVersion["initialState"])"; line: 1, column: 13]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:703)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._reportInvalidToken(ReaderBasedJsonParser.java:2853)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleOddValue(ReaderBasedJsonParser.java:1899)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:757)
at com.fasterxml.jackson.databind.ObjectMapper._readTreeAndClose(ObjectMapper.java:4042)
at com.fasterxml.jackson.databind.ObjectMapper.readTree(ObjectMapper.java:2551)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.readCatalogResponse(SchemaRegistryClient.java:644)
I looked at the class definition (SchemaVersion) and saw that it has the annotation @JsonIgnoreProperties(ignoreUnknown = true), but I still get this error.
I'm also using Gradle as the build tool:
compile(group: 'org.apache.avro', name: 'avro', version: '1.8.2')
compile(group: 'com.hortonworks.registries', name: 'schema-registry-serdes', version: '0.7.0')
compile(group: 'com.hortonworks.registries', name: 'schema-registry-client', version: '0.7.0')
// confluent platform 5.1.1 provided with kafka 2.1.0
compile(group: 'org.apache.kafka', name: 'kafka-clients', version: '2.1.0')
compile(group: 'org.glassfish.jersey.core', name: 'jersey-client', version: '2.28')
compile(group: 'org.glassfish.jersey.inject', name: 'jersey-hk2', version: '2.28')
Code which causes the error:
package project;

import com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.*;

import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class HortonToConfluentTest {
    private static final String HORTON_SCHEMA_REGISTRY_URL = "http://localhost:9090/api/v1";
    private static final String BOOTSTRAP_SERVER = "localhost:29092";
    private static final String AVRO_SOURCE_TOPIC = "avro_topic";

    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties hortonProducerProperties = new Properties();
        hortonProducerProperties.put(ProducerConfig.CLIENT_ID_CONFIG, "horton-producer");
        hortonProducerProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVER);
        hortonProducerProperties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        hortonProducerProperties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, com.hortonworks.registries.schemaregistry.serdes.avro.kafka.KafkaAvroSerializer.class);
        hortonProducerProperties.put(SchemaRegistryClient.Configuration.SCHEMA_REGISTRY_URL.name(), HORTON_SCHEMA_REGISTRY_URL);

        KafkaProducer<String, GenericRecord> hortonProducer = new KafkaProducer<>(hortonProducerProperties);
        hortonProducer.send(new ProducerRecord<>(AVRO_SOURCE_TOPIC, GenerateRecord.generate(1, "body"))).get();
        hortonProducer.flush();
        hortonProducer.close();
    }
}
The schema registry and Kafka run in Docker (https://github.com/TheBookPeople/hortonworks-registry-docker):
version: '3'
services:
  db:
    image: mysql:5.7.17
    container_name: db
    hostname: db
    ports:
      - 3306:3306
    environment:
      MYSQL_ROOT_PASSWORD: password
      MYSQL_DATABASE: hortonworks
      MYSQL_USER: hortonworks
      MYSQL_PASSWORD: password
  horton-registry:
    image: thebookpeople/hortonworks-registry:latest
    container_name: horton-registry
    hostname: horton-registry
    depends_on:
      - db
    ports:
      - 9090:9090
    environment:
      DB_NAME: hortonworks
      DB_USER: hortonworks
      DB_PASSWORD: password
      DB_PORT: 3306
      DB_HOST: db
  zookeeper:
    image: confluentinc/cp-zookeeper:5.1.1
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - 22181:22181
    environment:
      - ZOOKEEPER_CLIENT_PORT=22181
  kafka:
    image: confluentinc/cp-enterprise-kafka:5.1.1
    hostname: kafka
    container_name: kafka
    ports:
      - 29092:29092
    depends_on:
      - zookeeper
    environment:
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:22181
      - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      - KAFKA_BROKER_ID=1
      - KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1
      - KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS=0
What I have already tried:
Defining the schema manually in the schema registry
Changing the version of the horton-schema-registry dependency
Explicitly declaring a dependency on Jackson (using the latest available version)
The problem was in the dependencies (according to https://hortonworks.com/tutorial/schema-registry-in-trucking-iot-on-hdf/section/4/); I needed:
compile(group: 'com.hortonworks.registries', name: 'schema-registry-serdes', version: '0.3.0.3.0.1.1-5')
compile(group: 'javax.xml.bind', name: 'jaxb-api', version: '2.3.0')
instead of:
compile(group: 'com.hortonworks.registries', name: 'schema-registry-serdes', version: '0.7.0')
compile(group: 'com.hortonworks.registries', name: 'schema-registry-client', version: '0.7.0')
Final dependencies list:
compile(group: 'org.apache.avro', name: 'avro', version: '1.8.2')
compile(group: 'com.hortonworks.registries', name: 'schema-registry-serdes', version: '0.3.0.3.0.1.1-5')
// confluent platform 5.1.1 provided with kafka 2.1.0
compile(group: 'org.apache.kafka', name: 'kafka-clients', version: '2.1.0')
compile(group: 'javax.xml.bind', name: 'jaxb-api', version: '2.3.0')

Fatal Exception OkHttp Dispatcher with Retrofit

I am using Retrofit to make a call to TheMovieDatabase API and am trying to get a list of the popular movies to populate a RecyclerView. However, when I make the call through Retrofit, I get an error related to OkHttp, which I am using in conjunction with Retrofit:
06-08 19:57:26.281 19232-19254/popularmovies.troychuinard.com.popularmovies E/AndroidRuntime: FATAL EXCEPTION: OkHttp Dispatcher
Process: popularmovies.troychuinard.com.popularmovies, PID: 19232
java.lang.NoClassDefFoundError: Failed resolution of: Lokhttp3/internal/Platform;
at okhttp3.logging.HttpLoggingInterceptor$Logger$1.log(HttpLoggingInterceptor.java:112)
at okhttp3.logging.HttpLoggingInterceptor.intercept(HttpLoggingInterceptor.java:160)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121)
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:200)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:147)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.lang.ClassNotFoundException: Didn't find class "okhttp3.internal.Platform" on path: DexPathList[[zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/base.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_dependencies_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_0_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_1_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_2_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_3_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_4_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_5_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_6_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_7_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_8_apk.apk", zip file "/data/app/popularmovies.troychuinard.com.popularmovies-2/split_lib_slice_9_apk.apk"],nativeLibraryDirectories=[/data/app/popularmovies.troychuinard.com.popularmovies-2/lib/arm, /system/lib, /vendor/lib]]
at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:56)
at java.lang.ClassLoader.loadClass(ClassLoader.java:380)
at java.lang.ClassLoader.loadClass(ClassLoader.java:312)
at okhttp3.logging.HttpLoggingInterceptor$Logger$1.log(HttpLoggingInterceptor.java:112) 
at okhttp3.logging.HttpLoggingInterceptor.intercept(HttpLoggingInterceptor.java:160) 
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:147) 
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:121) 
at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:200) 
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:147) 
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607) 
at java.lang.Thread.run(Thread.java:761) 
Below is my relevant Retrofit code:
spinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
    @Override
    public void onItemSelected(AdapterView<?> adapterView, View view, int i, long l) {
        Toast.makeText(getApplicationContext(), String.valueOf(i), Toast.LENGTH_LONG).show();
        String selection = String.valueOf(i);
        switch (i) {
            case 0:
                query = "popular";
                mBaseURL = "https://api.themoviedb.org/3/movie/popular/";
                break;
            case 1:
                query = "top_rated";
                mBaseURL = "https://api.themoviedb.org/3/movie/top_rated/";
                break;
            default:
                query = "popular";
                mBaseURL = "https://api.themoviedb.org/3/movie/popular/";
                break;
        }
        mMovieURLS.clear();
        mMovieResultsAdapter.notifyDataSetChanged();
        HttpLoggingInterceptor interceptor = new HttpLoggingInterceptor();
        interceptor.setLevel(HttpLoggingInterceptor.Level.BODY);
        OkHttpClient client = new OkHttpClient.Builder()
                .addInterceptor(interceptor)
                .build();
        Retrofit retrofit = new Retrofit.Builder()
                .baseUrl(mBaseURL)
                .client(client)
                .addConverterFactory(GsonConverterFactory.create())
                .build();
        ApiInterface apiInterface = retrofit.create(ApiInterface.class);
        Call<TheMovieDatabase> call = apiInterface.getImages();
        call.enqueue(new Callback<TheMovieDatabase>() {
            @Override
            public void onResponse(Call<TheMovieDatabase> call, Response<TheMovieDatabase> response) {
                String movieResponse = String.valueOf(response.isSuccessful());
                Log.v("SUCCESS", movieResponse);
            }

            @Override
            public void onFailure(Call<TheMovieDatabase> call, Throwable t) {
            }
        });
    }
API interface:
public interface ApiInterface {
    @GET("?api_key=xxxxxxxxx&language=en-US")
    Call<TheMovieDatabase> getImages();
}
Gradle Dependencies:
dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'com.android.support:appcompat-v7:27.1.1'
    implementation 'com.android.support.constraint:constraint-layout:1.1.0'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'com.android.support.test:runner:1.0.2'
    androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
    implementation 'com.squareup.picasso:picasso:2.5.2'
    implementation 'com.android.support:recyclerview-v7:27.1.1'
    implementation 'com.squareup.retrofit2:retrofit:2.4.0'
    implementation 'com.squareup.okhttp3:okhttp:3.10.0'
    implementation 'com.google.code.gson:gson:2.8.2'
    implementation 'com.squareup.retrofit2:converter-gson:2.1.0'
    implementation 'com.squareup.okhttp3:logging-interceptor:3.3.0'
}
I know this question is 2 years old, but I solved mine, so here is an alternate answer you could try.
My error started after upgrading the versions of my dependencies (Android Studio suggested it in the Project Structure dialog). Since the problem involves Retrofit and OkHttp, I reverted to the older versions I had used when my app was working. So, I suggest reverting to older versions of:
implementation 'com.squareup.retrofit2:retrofit:*older version here*'
implementation 'com.squareup.retrofit2:converter-gson:*older version here*'
implementation 'com.squareup.retrofit2:converter-moshi:*older version here*'
implementation 'oauth.signpost:oauth-signpost:*older version here*'
implementation 'se.akerfeldt:okhttp-signpost:*older version here*'
implementation 'com.squareup.okhttp3:okhttp:*older version here*'
implementation 'oauth.signpost:signpost-core:*older version here*'
Those are the dependencies I reverted to older versions. I suggest you change any Retrofit- or OkHttp-related dependencies back to the older versions from when your app was working.
If you have trouble figuring out which older versions are available, look at the dependency versions under Project Structure > Dependencies tab > app tab. From there you can find the dependencies you are looking for and change the versions.
Good luck

ServiceConfigurationError thrown when trying to create new Jetty WebSocketClient instance

I am trying to create a new WebSocketClient with the no-args constructor for CometD:
static BayeuxClient newInstace(String url) throws Exception {
    WebSocketClient wsClient = new WebSocketClient(); // exception here!!
    wsClient.start();
    Map<String, Object> options = new HashMap<>();
    ClientTransport transport = new JettyWebSocketTransport(options, Executors.newScheduledThreadPool(2), wsClient);
    BayeuxClient client = new BayeuxClient(url, transport);
    return client;
}
But this throws a runtime exception:
java.util.ServiceConfigurationError: org.eclipse.jetty.websocket.api.extensions.Extension: Provider org.eclipse.jetty.websocket.common.extensions.identity.IdentityExtension not found
at java.util.ServiceLoader.fail(ServiceLoader.java:225)
at java.util.ServiceLoader.-wrap1(ServiceLoader.java)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:366)
at java.util.ServiceLoader$1.next(ServiceLoader.java:448)
at org.eclipse.jetty.websocket.api.extensions.ExtensionFactory.<init>(ExtensionFactory.java:35)
at org.eclipse.jetty.websocket.client.common.extensions.WebSocketExtensionFactory.<init>(WebSocketExtensionFactory.java:36)
at org.eclipse.jetty.websocket.client.WebSocketClient.<init>(WebSocketClient.java:117)
at org.eclipse.jetty.websocket.client.WebSocketClient.<init>(WebSocketClient.java:108)
at org.eclipse.jetty.websocket.client.WebSocketClient.<init>(WebSocketClient.java:88)
at org.asd.util.customerSupportChat.LekaneClient.newInstace(LekaneClient.java:40)
This is happening on Android with:
minSdkVersion 21
targetSdkVersion 25
And I have included the library like this:
//https://mvnrepository.com/artifact/org.cometd.java/cometd-java-websocket-jetty-client/3.1.2
compile group: 'org.cometd.java', name: 'cometd-java-websocket-jetty-client', version: '3.1.2'
Do you know what is wrong and how I can fix this?
--------------- edit ----------------
this was also in the stacktrace:
Caused by: java.lang.ClassNotFoundException: Didn't find class "org.eclipse.jetty.websocket.common.extensions.identity.IdentityExtension" on path: DexPathList[[zip file "/data/app/org.asd.debug-2/base.apk"],nativeLibraryDirectories=[/data/app/org.asd.debug-2/lib/x86, /system/lib, /vendor/lib]]
As sbordet mentioned in the comments, the problem was with missing dependencies. Adding this to build.gradle fixed the problem:
compile group: 'org.eclipse.jetty.websocket', name: 'websocket-common', version: '9.2.22.v20170606'
(using an old version because Java 8 is not available)
I don't know why it isn't resolved automatically, though.
