I am using Dropwizard version 0.9.2, and the configuration YAML looks roughly like this:
server:
  applicationConnectors:
    - type: http
      port: 8090
  adminConnectors:
    - type: http
      port: 8091
  requestLog:
    timeZone: UTC
    appenders:
      - type: file
        currentLogFilename: file
        threshold: ALL
        archive: true
        archivedLogFilenamePattern: some-pattern
        archivedFileCount: 5
        maxFileSize: 10MB
When the application starts, I get the following error:
* Unrecognized field at: server.requestLog
Did you mean?:
- adminConnectors
- adminContextPath
- adminMaxThreads
Searching around, this error seems to be a known Jackson issue that was fixed in 2.7.3, so I upgraded Dropwizard to the latest 1.0.2, but the problem persists.
I also tried excluding Jackson explicitly and including the latest 2.8.3, which didn't help either. Any input on solving this issue?
The pom.xml I tried:
<dependency>
  <groupId>io.dropwizard</groupId>
  <artifactId>dropwizard-core</artifactId>
  <version>0.9.2</version>
  <exclusions>
    <exclusion>
      <groupId>io.dropwizard</groupId>
      <artifactId>dropwizard-jackson</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>io.dropwizard</groupId>
  <artifactId>dropwizard-jackson</artifactId>
  <version>1.0.2</version>
</dependency>
Logging is not part of the server configuration.
server:
  applicationConnectors:
    - type: http
      port: 8090
  adminConnectors:
    - type: http
      port: 8091
logging:
  level: INFO
  loggers:
    requestLog: INFO
  appenders:
Use "logging" instead
I am getting the error shown in the title. I have searched on Stack Overflow, and other people have run into the same problem in previous versions. One answer said it would be solved in a later version of DL4J, but that does not seem to have happened.
Below are the pom.xml and the dependencies I am using.
Please, can anybody help me?
Thank you in advance.
pom.xml:
<properties>
  <dl4j-master.version>1.0.0-M1.1</dl4j-master.version>
  <logback.version>1.2.3</logback.version>
  <java.version>1.8</java.version>
  <maven-shade-plugin.version>2.4.3</maven-shade-plugin.version>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
  <dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-core</artifactId>
    <version>${dl4j-master.version}</version>
  </dependency>
  <dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-nlp</artifactId>
    <version>${dl4j-master.version}</version>
  </dependency>
  <dependency>
    <groupId>org.datavec</groupId>
    <artifactId>datavec-api</artifactId>
    <version>${dl4j-master.version}</version>
  </dependency>
  <dependency>
    <groupId>org.nd4j</groupId>
    <artifactId>nd4j-cuda-11.0-platform</artifactId>
    <version>${dl4j-master.version}</version>
  </dependency>
  <dependency>
    <groupId>org.bytedeco</groupId>
    <artifactId>cuda-platform-redist</artifactId>
    <version>11.0-8.0-1.5.4</version>
  </dependency>
  <dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-cuda-11.0</artifactId>
    <version>${dl4j-master.version}</version>
  </dependency>
  <dependency>
    <groupId>org.bytedeco.javacpp-presets</groupId>
    <artifactId>cuda</artifactId>
    <version>10.0-7.4-1.4.4</version>
  </dependency>
  <dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>${logback.version}</version>
  </dependency>
</dependencies>
Error:
11:11:35.720 [main] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [JCublasBackend] backend
11:11:37.543 [main] INFO org.nd4j.nativeblas.NativeOpsHolder - Number of threads used for linear algebra: 32
11:11:37.675 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Backend used: [CUDA]; OS: [Windows 10]
11:11:37.676 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Cores: [4]; Memory: [3,5GB];
11:11:37.676 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Blas vendor: [CUBLAS]
11:11:37.702 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - ND4J CUDA build version: 11.0.221
11:11:37.705 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - CUDA device 0: [NVIDIA GeForce 930M]; cc: [5.0]; Total memory: [4294836224]
11:11:37.705 [main] INFO org.nd4j.linalg.jcublas.JCublasBackend - Backend build information:
MSVC: 192930038
STD version: 201703L
CUDA: 11.0.221
DEFAULT_ENGINE: samediff::ENGINE_CUDA
HAVE_FLATBUFFERS
11:11:37.782 [main] INFO org.deeplearning4j.models.sequencevectors.SequenceVectors - Starting vocabulary building...
11:11:37.783 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Target vocab size before building: [0]
11:11:37.814 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Trying source iterator: [0]
11:11:37.814 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Target vocab size before building: [0]
11:11:51.450 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Waiting till all processes stop...
11:11:51.457 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Vocab size before truncation: [168165], NumWords: [1952392], sequences parsed: [318], counter: [1952389]
11:11:51.457 [main] DEBUG org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Vocab size after truncation: [168165], NumWords: [1952392], sequences parsed: [318], counter: [1952389]
11:11:54.179 [main] INFO org.deeplearning4j.models.word2vec.wordstore.VocabConstructor - Sequences checked: [318], Current vocabulary size: [168165]; Sequences/sec: [19,39];
11:11:54.248 [main] INFO org.deeplearning4j.models.embeddings.loader.WordVectorSerializer - Projected memory use for model: [128,30 MB]
Exception in thread "main" java.lang.RuntimeException: cudaGetSymbolAddress(...) failed; Error code: [13]
at org.nd4j.linalg.jcublas.ops.executioner.CudaExecutioner.createShapeInfo(CudaExecutioner.java:2173)
at org.nd4j.linalg.api.shape.Shape.createShapeInformation(Shape.java:3279)
at org.nd4j.linalg.api.ndarray.BaseShapeInfoProvider.createShapeInformation(BaseShapeInfoProvider.java:75)
at org.nd4j.jita.constant.ProtectedCudaShapeInfoProvider.createShapeInformation(ProtectedCudaShapeInfoProvider.java:96)
at org.nd4j.jita.constant.ProtectedCudaShapeInfoProvider.createShapeInformation(ProtectedCudaShapeInfoProvider.java:77)
at org.nd4j.linalg.jcublas.CachedShapeInfoProvider.createShapeInformation(CachedShapeInfoProvider.java:46)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:180)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:174)
at org.nd4j.linalg.api.ndarray.BaseNDArray.<init>(BaseNDArray.java:316)
at org.nd4j.linalg.jcublas.JCublasNDArray.<init>(JCublasNDArray.java:135)
at org.nd4j.linalg.jcublas.JCublasNDArrayFactory.createUninitialized(JCublasNDArrayFactory.java:1533)
at org.nd4j.linalg.factory.Nd4j.createUninitialized(Nd4j.java:4379)
at org.nd4j.linalg.factory.Nd4j.rand(Nd4j.java:2957)
at org.nd4j.linalg.factory.Nd4j.rand(Nd4j.java:2946)
at org.deeplearning4j.models.embeddings.inmemory.InMemoryLookupTable.resetWeights(InMemoryLookupTable.java:145)
at org.deeplearning4j.models.sequencevectors.SequenceVectors.fit(SequenceVectors.java:278)
at org.deeplearning4j.models.paragraphvectors.ParagraphVectors.fit(ParagraphVectors.java:667)
at gov.rfb.cocaj.dl4jGPU.DocumentClassifier.main(DocumentClassifier.java:44)
This is always due to an incompatible CUDA version. Make sure that the CUDA version you have installed locally is the same as the one you are using with DL4J.
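For instance, the pom above mixes CUDA 11.0 artifacts with an org.bytedeco.javacpp-presets cuda 10.0 dependency. A sketch of a more consistent setup, assuming the locally installed CUDA toolkit is 11.0, would keep only the 11.0 artifacts already present in the pom and drop the 10.0 one:
<!-- Keep the CUDA 11.0 artifacts that match the DL4J version used above... -->
<dependency>
  <groupId>org.nd4j</groupId>
  <artifactId>nd4j-cuda-11.0-platform</artifactId>
  <version>${dl4j-master.version}</version>
</dependency>
<dependency>
  <groupId>org.bytedeco</groupId>
  <artifactId>cuda-platform-redist</artifactId>
  <version>11.0-8.0-1.5.4</version>
</dependency>
<!-- ...and remove the org.bytedeco.javacpp-presets cuda 10.0-7.4-1.4.4 dependency entirely. -->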
I have the following maven versions in my pom.xml (among others):
<dependency>
  <groupId>org.apache.camel.springboot</groupId>
  <artifactId>camel-spring-boot-starter</artifactId>
  <version>3.7.0</version>
</dependency>
<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-core</artifactId>
  <version>3.7.0</version>
</dependency>
<dependency>
  <groupId>org.apache-extras.camel-extra</groupId>
  <artifactId>camel-jcifs</artifactId>
  <version>2.25.2</version>
  <exclusions>
    <exclusion>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb-impl</artifactId>
    </exclusion>
  </exclusions>
</dependency>
The Camel Spring Boot version is 3.7.0, and I want to connect to an SMB endpoint like this:
smb://sharedriveuser@server-instance.sub.domain.net/folder?initialDelay=0&delay=9000&autoCreate=false&noop=true&idempotent=true&password=ThePassWorD&filter=#csvFileFilter
I read the Camel 3 Migration Guide and found nothing regarding the camel-extra components.
When trying to connect, I get an error as if the password option were no longer supported:
Caused by: org.apache.camel.ResolveEndpointFailedException: Failed to resolve endpoint: smb://sharedriveuser@server-instance.sub.domain.net/folder?initialDelay=0&delay=9000&autoCreate=false&noop=true&idempotent=true&password=xxxxxx&filter=#csvFileFilter due to: There are 1 parameters that couldn't be set on the endpoint. Check the uri if the parameters are spelt correctly and that they are properties of the endpoint. Unknown parameters=[{password=ThePassWorD}]
The documentation link that Google keeps returning seems to be dead.
On Maven Central there is no 3.x version of camel-jcifs, so I am wondering whether the library is still compatible with Camel 3.x.x; if not, is there another alternative for Camel 3?
I also tried downgrading camel-jcifs to 2.24.3, with the same error.
Camel-extra is a separate project from Apache Camel. There is some work in progress in the camel-extra repository to support Camel 3 [1], but it is not yet complete and there is no release in sight.
[1] https://github.com/camel-extra/camel-extra/commit/f028dfdfaa467958c58abea0d604f8fe2f17be04
There is now a pull request to add camel-jcifs to the 3.x version:
https://github.com/camel-extra/camel-extra/pull/39
You might also get my fork and build it yourself:
https://github.com/bebbo/camel-extra
It got merged and is in the official repository:
https://github.com/camel-extra/camel-extra
To use it with Quarkus, you have to convert some List types to arrays.
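If you build camel-extra locally (from the fork or the merged repository), you can then reference the resulting camel-jcifs artifact from your own pom; the version below is a placeholder for whatever version the local build installs:
<dependency>
  <groupId>org.apache-extras.camel-extra</groupId>
  <artifactId>camel-jcifs</artifactId>
  <!-- placeholder version: use the version produced by your local camel-extra build -->
  <version>3.x-SNAPSHOT</version>
</dependency>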
I am unable to get the value I am expecting; an exception is thrown at the line with @Value("${message:this-is-class-value}").
SERVER SIDE
pom.xml
<dependencies>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-security</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-config-server</artifactId>
  </dependency>
</dependencies>
src/main/resources/application.properties
server.port=8888
spring.application.name=config-service
spring.cloud.config.server.git.uri=file:///C:/config
management.endpoints.web.exposure.include=*
spring.security.user.name=root
spring.security.user.password=abc123
Application class
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.config.server.EnableConfigServer;

@SpringBootApplication
@EnableConfigServer
public class ConfigServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConfigServiceApplication.class, args);
    }
}
Local git folder
Configuration files with the same property but different values, to detect which one is picked up:
c:/config/application.properties
c:/config/api-gateway.properties
c:/config/api-gateway-PROD.properties
Output during server startup:
Completed initialization in 5 ms
WARN : Could not merge remote for master remote: null
INFO : o.s.c.c.s.e.NativeEnvironmentRepository : Adding property source: file:/C:/config/application.properties
If I access this URL:
http://localhost:8888/api-gateway/PROD
the console output is as follows:
WARN : Could not merge remote for master remote: null
INFO : o.s.c.c.s.e.NativeEnvironmentRepository : Adding property source: file:/C:/config/api-gateway-PROD.properties
INFO : o.s.c.c.s.e.NativeEnvironmentRepository : Adding property source: file:/C:/config/api-gateway.properties
INFO : o.s.c.c.s.e.NativeEnvironmentRepository : Adding property source: file:/C:/config/application.properties
CLIENT SIDE (Separate Project)
pom.xml
<dependencies>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-config</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-zuul</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
  </dependency>
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
  </dependency>
</dependencies>
Application class
@SpringBootApplication
@EnableZuulProxy
@EnableDiscoveryClient
public class ApiGatewayApplication {

    public static void main(String[] args) {
        SpringApplication.run(ApiGatewayApplication.class, args);
    }
}
Controller
@RestController
public class SettingsController {

    @Value("${message:this-is-class-value}")
    String name = "World";

    @RequestMapping("/")
    public String home() {
        return "Hello " + name;
    }
}
resources/application.yml
server:
  port: 8282
spring:
  application:
    name: api-gateway
eureka:
  instance:
    preferIpAddress: true
  client:
    registerWithEureka: true
    fetchRegistry: true
    serviceUrl:
      defaultZone: ${EUREKA_URI:http://localhost:8761/eureka}
resources/bootstrap.yml
spring:
  profiles:
    active: PROD
  cloud:
    config:
      name: api-gateway
      uri: http://localhost:8888/
      username: root
      password: abc123
management:
  endpoints:
    web:
      exposure:
        include: refresh
Console output
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'settingsController': Injection of autowired dependencies failed; nested exception is org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.util.LinkedHashMap<?, ?>] to type [java.lang.String]
Caused by: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.util.LinkedHashMap<?, ?>] to type [java.lang.String]
Please do let me know if anything else is required.
Add message=Hello from property file1 in src/main/resources/application.properties. Or, if you want to read this property from c:/config/application.properties or c:/config/config-service.properties, you need to configure an external configuration file, for example by using the @ConfigurationProperties annotation.
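As a rough sketch of the @ConfigurationProperties approach mentioned above (the class name and the "app" prefix are made up for this example):
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Hypothetical example: binds properties prefixed with "app" (e.g. app.message=...)
// to a bean instead of injecting them individually with @Value.
@Component
@ConfigurationProperties(prefix = "app")
public class AppProperties {

    private String message = "this-is-class-value"; // default used if the property is missing

    public String getMessage() {
        return message;
    }

    public void setMessage(String message) {
        this.message = message;
    }
}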
It looks like you are trying to use the @Value annotation on a controller class that resides in your Cloud Config server application. The idea is that the Cloud Config server application is the provider for other applications: it serves properties from Git or the local file system (configuration) to the requesting client applications. Other applications connect to your config server application by providing its URL, their application name, and the profile for which they want the properties. You can check the URL below for an example of a Cloud Config server and client application.
https://www.thetechnojournals.com/2019/10/spring-cloud-config.html
Your Cloud Config server has the following security configuration:
spring.security.user.name=root
spring.security.user.password=abc123
Your client configuration doesn't provide any credentials, which is causing the 401 error in your logs.
To solve this issue, make the changes below in your client configuration.
Client configuration:
spring:
  profiles:
    active: PROD
  cloud:
    config:
      name: api-gateway
      uri: http://localhost:8888/
      username: root
      password: abc123
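Once the credentials are in place, you can also check (outside the client) that the config server serves the profile, for example with a quick curl call using the same credentials:
# Expected to return a JSON document listing api-gateway-PROD.properties,
# api-gateway.properties and application.properties as property sources.
curl -u root:abc123 http://localhost:8888/api-gateway/PROD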
I followed the steps below to send metrics from Spring Boot to Prometheus.
Note: I have installed Prometheus locally on my Mac using a Docker image.
In pom.xml I added this:
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-actuator</artifactId>
  <version>2.0.4.RELEASE</version>
</dependency>
<!-- Micrometer core dependency -->
<dependency>
  <groupId>io.micrometer</groupId>
  <artifactId>micrometer-core</artifactId>
  <version>1.0.6</version>
</dependency>
<!-- Micrometer Prometheus registry -->
<dependency>
  <groupId>io.micrometer</groupId>
  <artifactId>micrometer-registry-prometheus</artifactId>
  <version>1.0.6</version>
</dependency>
In application.properties I added this:
server.port: 9000
management.server.port: 9001
management.server.address: 127.0.0.1
management.endpoint.metrics.enabled=true
management.endpoints.web.exposure.include=*
management.endpoint.prometheus.enabled=true
management.metrics.export.prometheus.enabled=true
I started Prometheus with the following lines in its configuration file:
# Global configurations
global:
  scrape_interval: 5s     # Set the scrape interval to every 5 seconds.
  evaluation_interval: 5s # Evaluate rules every 5 seconds.
scrape_configs:
  - job_name: 'hello-world-promethus'
    metrics_path: '/actuator/prometheus'
    static_configs:
      - targets: ['localhost:9001']
When I hit http://localhost:9001/actuator/prometheus, I can see the metrics, but they are not visible in the Prometheus UI.
What am I missing?
The solution was simple. You will only run into this if you're running Prometheus in a Docker container. I changed the target from 'localhost:9001' to 'docker.for.mac.localhost:9001'. For example:
- job_name: hello-world-promethus
  scrape_interval: 5s
  scrape_timeout: 5s
  metrics_path: /actuator/prometheus
  scheme: http
  static_configs:
    - targets:
        - docker.for.mac.localhost:9001
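For reference, this assumes Prometheus is started from the official Docker image with the configuration file mounted into the container, along these lines (the local path is an assumption):
# Mount the local prometheus.yml into the container and expose the UI on port 9090.
docker run -d -p 9090:9090 \
  -v /path/to/prometheus.yml:/etc/prometheus/prometheus.yml \
  prom/prometheus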
Below is my log4j configuration:
#log4j.additivity.org.apache.qpid=false
log4j.rootLogger=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.threshold=DEBUG
log4j.appender.console.layout.ConversionPattern=%-7p %d [%t] %c %x - %m%n
log4j.logger.javax.jms=DEBUG
log4j.logger.org.apache.qpid=DEBUG
log4j.logger.org.apache.qpid.amqp_1_0=DEBUG
log4j.logger.org.apache.qpid.amqp_1_0.jms=DEBUG
and then in code:
String log4jConfigFile = System.getProperty("user.dir") + File.separator + "log4j.properties";
PropertyConfigurator.configure(log4jConfigFile);
logger.debug("this is a debug log message");
My debug message "this is a debug log message" does get printed, but the log messages from org.apache.qpid are not printed to the console.
<dependency>
  <groupId>org.apache.qpid</groupId>
  <artifactId>qpid-amqp-1-0-client-jms</artifactId>
  <version>0.22</version>
</dependency>
EDIT
I am a newbie in Java... These are the logging dependencies I have added. Do I need to add some setting somewhere to redirect SLF4J logs to log4j?
<slf4j-version>1.6.6</slf4j-version>
<log4j-version>1.2.17</log4j-version>
<!-- Logging -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>${slf4j-version}</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>${slf4j-version}</version>
</dependency>
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>${log4j-version}</version>
</dependency>
The (deprecated) qpid-amqp-1-0-client-jms client used java.util.logging, not log4j. To quote from a mail I sent back in 2014 to the users@qpid.apache.org mailing list:
you can turn it on by setting the Java system property
java.util.logging.config.file to point to a file that looks something
like this:
handlers=java.util.logging.FileHandler
FRM.level=ALL
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.SimpleFormatter.format=[%1$tc] %4$s: %5$s%n
java.util.logging.FileHandler.level=ALL
# (The output file is placed in the directory
# defined by the "user.home" System property.)
java.util.logging.FileHandler.pattern=%h/qpid-jms-%u.log
When you run the client it should then generate a file called
qpid-jms-0.log in your home directory, with output that looks
something like:
[Mon Feb 24 18:45:58 CET 2014] FINE: RECV[/127.0.0.1:5672|0] :
SaslMechanisms{saslServerMechanisms=[ANONYMOUS]}
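The system property is typically passed on the command line when starting the client; the file path and jar name here are placeholders:
# Point java.util.logging at the configuration file shown above (paths are examples only).
java -Djava.util.logging.config.file=/path/to/jul-logging.properties -jar your-client.jar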
Note that the logging in this old client is really very minimal, and ideally you should instead migrate your code to the supported Qpid JMS client for AMQP 1.0 (https://qpid.apache.org/components/jms/index.html), which does use SLF4J but uses a different configuration syntax for connections and queues.
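If you do migrate, the newer client is a different Maven artifact; a sketch of the dependency, with the version left as a placeholder to be replaced by a current release:
<dependency>
  <groupId>org.apache.qpid</groupId>
  <artifactId>qpid-jms-client</artifactId>
  <!-- placeholder: substitute a current Qpid JMS release version -->
  <version>${qpid-jms.version}</version>
</dependency>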