Spring Cloud Auth Service: Unable to load config data from 'configserver:http://localhost:9296'

I have a problem running the auth service.
I can't even get it to start, so I can't tell whether it works.
I get the error shown below on the console.
java.lang.IllegalStateException: Unable to load config data from 'configserver:http://localhost:9296'
Caused by: java.lang.IllegalStateException: File extension is not known to any PropertySourceLoader. If the location is meant to reference a directory, it must end in '/' or File.separator
To run the example:
1) Run the registry service
2) Run the config service
3) Run the API gateway
4) Run the other services
Here is the project link: Project Link
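For reference, the auth service points at the config server like this (a simplified sketch based on the error message; the actual property file in the project may differ):
# assumed application.properties of the auth service; the URL is taken from the error above
spring.application.name=auth-service
spring.config.import=configserver:http://localhost:9296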

Here is the solution.
After I changed the Spring version of the auth service to match the other services, the issue disappeared.
Changed
<version>2.7.5</version>
to
<version>2.7.4</version>
in pom.xml
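In context, that <version> sits in the Spring Boot parent section of the auth service's pom.xml, roughly like this (a sketch; the parent coordinates are assumed, only the version values come from the lines above):
<!-- assumed parent block; only the version numbers are taken from the post -->
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.7.4</version> <!-- previously 2.7.5 -->
</parent>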

Related

Can't run my Quarkus app after adding JPA

I'm trying to learn Quarkus, but after adding a JPA dependency the app doesn't initialize anymore.
This is the added dependency:
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
These are the errors I'm getting:
[org.tes.uti.TestcontainersConfiguration] (build-47) Attempted to read Testcontainers configuration file at file:/home/fhb/.testcontainers.properties but the file was not found. Exception message: FileNotFoundException: /home/fhb/.testcontainers.properties (No such file or directory)
After that, Quarkus continues and then fails with the following error:
Caused by: java.lang.RuntimeException: io.quarkus.runtime.configuration.ConfigurationException: Model classes are defined for the default persistence unit <default> but configured datasource <default> not found: the default EntityManagerFactory will not be created. To solve this, configure the default datasource. Refer to https://quarkus.io/guides/datasource for guidance.
This is my application.properties file:
quarkus.datasource.db-kind=postgresql
quarkus.datasource.username=postgres
quarkus.datasource.password=admin
quarkus.datasource..jdbc.url=jdbc:postgresql://localhost:5432/quarkus-social
quarkus.datasource.jdbc.max-size=16
I think Quarkus is trying to run tests, and for that it needs the .testcontainers.properties file, which I've never created. Anyway, I don't want to create that file in /home/fhb/, so is there a way to specify that file's location?
Besides that, I would like to know whether Testcontainers has something to do with unit tests, which I would like to add to my Quarkus application.
Thanks in advance for your help.
I guess the problem is a small typo.
Change from
quarkus.datasource..jdbc.url=jdbc:postgresql://localhost:5432/quarkus-social
To
quarkus.datasource.jdbc.url=jdbc:postgresql://localhost:5432/quarkus-social
If you don't specify the database URL and run in dev or test mode, Quarkus uses Testcontainers to start a database for you.
There are tutorials at quarkus.io/guides/datasource.
For tests, you can use Testcontainers or an in-memory database such as H2. You can find all of this in the Quarkus guides.
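For example, a test profile backed by H2 could look roughly like this (a sketch; it assumes the quarkus-jdbc-h2 extension is added, and the database name is a placeholder):
# sketch: in-memory H2 only for the test profile (requires quarkus-jdbc-h2)
%test.quarkus.datasource.db-kind=h2
%test.quarkus.datasource.jdbc.url=jdbc:h2:mem:quarkus-social;DB_CLOSE_DELAY=-1
%test.quarkus.datasource.username=sa
%test.quarkus.datasource.password=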

How to use org.apache.httpcomponents inside a spark job on Hadoop/Spark?

I am trying to run a Spark job on a Hadoop cluster; the job also makes an HTTP request to another server. I am using org.apache.httpcomponents to make this request, which works fine locally on my machine. However, it fails the moment I submit the job to the cluster (managed by Cloudera), with the following error:
User class threw exception: java.lang.NoSuchFieldError: INSTANCE
at org.apache.http.conn.ssl.SSLConnectionSocketFactory.<clinit>(SSLConnectionSocketFactory.java:151)
at org.apache.http.impl.client.HttpClientBuilder.build(HttpClientBuilder.java:977)
at org.apache.http.impl.client.HttpClients.createDefault(HttpClients.java:56)
From all the reading I have done, this error is caused by having multiple versions of the Apache HTTP client jar on the classpath. It appears that the Hadoop/Spark runtime has its own dependency on the Apache HTTP client, and it is a different version than the one I am using. Because my jar runs as part of the Hadoop/Spark runtime, the classpath ends up containing both my version of the HTTP client and the one Hadoop requires.
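To double-check which jar actually wins on the cluster, a small diagnostic like this can be dropped into the job for one run (a sketch, debugging only; not part of the real job logic):
// Debugging sketch: print the jar that HttpClients was actually loaded from,
// to see whether it is the bundled httpclient or the one on the Hadoop/Spark classpath.
java.net.URL source = org.apache.http.impl.client.HttpClients.class
        .getProtectionDomain().getCodeSource().getLocation();
System.out.println("HttpClients loaded from: " + source);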
If I add 'compileOnly' for org.apache.httpcomponents in my build.gradle and submit, I get this error instead:
User class threw exception: java.lang.NoClassDefFoundError: org/apache/http/impl/client/HttpClients
Is there a way for me to configure this in Gradle so that when I build my jar, it uses the version that already exists on Hadoop? I.e., a way to declare a 'provided'-style dependency (when running locally, download and use the latest version, but when building the uber jar, drop the dependency)?
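For reference, the compileOnly declaration looks roughly like this in my build.gradle (simplified; the exact httpclient version here is a placeholder):
dependencies {
    // compile against httpclient but leave it out of the uber jar,
    // so the version already present on the cluster is used at runtime
    compileOnly 'org.apache.httpcomponents:httpclient:4.5.13'
}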
UPDATE
I decided to try switching to a different HTTP library (okhttp3) to see if that would resolve the issue. However, I get a very similar exception when running on the cluster:
User class threw exception: java.lang.NoSuchFieldError: Companion
at okhttp3.internal.Util.<clinit>(Util.kt:70)
at okhttp3.OkHttpClient.<clinit>(OkHttpClient.kt:959)
It looks like Cloudera also ships a version of okhttp with its spark2 client, which is unfortunate.

OpenTok-java-sdk issue: creating a session ID throws an exception

I am working with Maven and Spring.
I have created a simple class with a main method to create an OpenTok session ID, but I am getting the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.ning.http.client.AsyncHttpClient$BoundRequestBuilder.setParameters(Lcom/ning/http/client/FluentStringsMap;)Lcom/ning/http/client/AsyncHttpClient$BoundRequestBuilder;
at com.opentok.util.HttpClient.createSession(HttpClient.java:48)
at com.opentok.OpenTok.createSession(OpenTok.java:252)
at com.opentok.OpenTok.createSession(OpenTok.java:306)
at webapp.test.com.OpenToks.main(OpenTokProg.java:28)
My code inside the main method:
OpenTok opentok = new OpenTok(API_KEY,API_SECRET);
String sessionId = opentok.createSession().getSessionId();
I have tried the following steps, taken from "forums.tokbox.com/supported-server-api/exception-while-getting-session-object-in-java-t46638#p60778":
1) Make sure that the build path for the Java server SDK is set up correctly.
--> I use this in pom.xml:
<dependency>
<groupId>com.tokbox</groupId>
<artifactId>opentok-server-sdk</artifactId>
<version>2.3.2</version>
</dependency>
2) Try using a hard-coded API key and secret (for testing purposes) when creating the OpenTok object.
--> tried it, same error
3) Make sure you have access to the OpenTok server; run a diagnostic at this link:
http://tokbox.com/tools/connectivity/
--> I get "Successful" for all connections
Please help
The version of AsyncHttpClient that was used for compiling com.opentok.util.HttpClient (probably 1.8) is different from the one that's provided at runtime (probably 1.9).
Between those versions, the setParameters method was renamed to setFormParameters.
You have to find out where this clash comes from and resolve it. Use mvn dependency:tree to figure out which library depends on which.
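Once dependency:tree shows which artifact drags in the wrong AsyncHttpClient, you can exclude it; roughly like this (a sketch, the outer dependency is a placeholder for whatever your tree output shows):
<!-- sketch: exclude the transitive AsyncHttpClient from the library that pulls it in -->
<dependency>
    <groupId>some.group</groupId>
    <artifactId>library-that-pulls-in-ahc</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.ning</groupId>
            <artifactId>async-http-client</artifactId>
        </exclusion>
    </exclusions>
</dependency>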

Axis2 WS consumer in WebMethods8.2

I've got into a scenario where I have to get an Axis2-based web service consumer working within WebMethods as a Java service. I first implemented the consumer in NetBeans just to see whether it works, and found that the minimal set of jars I require is the following:
[ xmlschema-1.4.7.jar, apache-mime4j-core-0.7.2.jar,
axiom-api-1.2.13.jar, axiom-impl-1.2.13.jar, axis2-adb-1.6.2.jar,
axis2-kernel-1.6.2.jar, axis2-transport-http-1.6.2.jar,
axis2-transport-local-1.6.2.jar, commons-codec-1.3.jar,
commons-httpclient-3.1.jar, commons-logging-1.1.1.jar,
httpcore-4.0.jar, mail-1.4.jar, neethi-3.0.2.jar, wsdl4j-1.6.2.jar ]
I've uploaded these jar files under the IS/packages/{package_name}/code/jars folder. Whenever I try to execute the Java service that sends the request and processes the response, I get the following exception:
java.lang.reflect.InvocationTargetException:
org.apache.axiom.om.OMFactory.getMetaFactory()Lorg/apache/axiom/om/OMMetaFactory;
From the IS error log file I found that the actual error message is as follows:
org.apache.axiom.om.OMFactory.getMetaFactory()Lorg/apache/axiom/om/OMMetaFactory;
Caused by: java.lang.reflect.InvocationTargetException: null
Caused by: java.lang.NoSuchMethodError: org.apache.axiom.om.OMFactory.getMetaFactory()Lorg/apache/axiom/om/OMMetaFactory;
The platform is WebMethods 8.2 in a Linux environment. The JDK version is 1.6.0_32, and the application server under WebMethods is Jetty.
Actually, solving this problem was a bit trickier. First of all, I had to manually configure the manifest file of the package on the IS server to use the jars provided in the package, so that they wouldn't conflict with the Axis version used by the IS itself. On top of that, I had to handle the ClassLoader manually, because WebMethods apparently can't use the META-INF-based information in the jar files. To solve this problem, simply use:
System.setProperty("org.apache.axiom.om.OMMetaFactory", "org.apache.axiom.om.impl.llom.factory.OMLinkedListMetaFactory");
That solves all the problems.
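Put together, the Java service does something along these lines (a sketch; MyServiceStub stands in for the generated Axis2 stub and is not the real class name):
// sketch: force the LLOM meta factory and the package class loader before
// calling the generated Axis2 stub, then restore the previous loader
ClassLoader previous = Thread.currentThread().getContextClassLoader();
try {
    System.setProperty("org.apache.axiom.om.OMMetaFactory",
            "org.apache.axiom.om.impl.llom.factory.OMLinkedListMetaFactory");
    Thread.currentThread().setContextClassLoader(MyServiceStub.class.getClassLoader());
    // ... build the request, invoke the stub, map the response to the pipeline ...
} finally {
    Thread.currentThread().setContextClassLoader(previous);
}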

DWR invoke error in Javascript

I am using Spring 3.0.5 and DWR 3.0rc1 in my application for making AJAX calls. In my local environment (Eclipse and Tomcat) it works fine, but in the clustered environment I get the following errors while invoking DWR, and a script error saying DWR is not defined, even though I can see myApplication/dwr/index.html properly.
Error Log:
Skipping 'script' due to NoClassDefFoundError on org.directwebremoting.create.ScriptedCreator. Cause: org/apache/bsf/BSFException
Skipping 'pageflow' due to ClassNotFoundException on org.directwebremoting.beehive.PageFlowCreator. Cause: Beehive/Weblogic jar file not available.
adding creator type: none = class org.directwebremoting.create.NullCreator
adding creator type: new = class org.directwebremoting.create.NewCreator
Please help me out
It looks to me like the jars are not included in the classpath. Can you please check whether they are?
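For example, the class behind the first warning (org/apache/bsf/BSFException) lives in the Apache Bean Scripting Framework jar; if DWR's ScriptedCreator is actually needed, a dependency roughly like this would provide it (a sketch; the version is a guess):
<!-- sketch: Apache Bean Scripting Framework, provides org.apache.bsf.BSFException;
     only needed if DWR's ScriptedCreator is actually used -->
<dependency>
    <groupId>bsf</groupId>
    <artifactId>bsf</artifactId>
    <version>2.4.0</version>
</dependency>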
