I'm using Spring Boot.
In previous projects I used an application.properties file with content like this:
seconds.timeOut=10
interval.milliseconds.cleaner=#{${seconds.timeOut}*2*1000}
interval.seconds.cleanerOffset=#{${seconds.timeOut}*3}
The result was correct: cleaner=20000 and cleanerOffset=30.
In a new project I switched to an application.yml file with the same configuration:
seconds:
  timeOut: 10
interval:
  milliseconds:
    cleaner: ${seconds.timeOut}*2*1000
interval:
  seconds:
    cleanerOffset: ${seconds.timeOut}*3
but the result is the string cleaner = "10*2*1000".
Of course I get an exception:
Caused by: java.lang.IllegalStateException: Encountered invalid @Scheduled method 'cleaningWorker': Invalid fixedDelayString value "10*2*1000" - cannot parse into long
I can't find any workaround. Could you help me?
Thanks.
This solution works for me:
seconds:
  timeOut: 10
interval:
  milliseconds:
    cleaner: '#{${seconds.timeOut}*2*1000}'
interval:
  seconds:
    cleanerOffset: '#{${seconds.timeOut}*3}'
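For reference, a minimal sketch of how the resolved value could be consumed in a scheduled bean; the class name and method body are assumptions, only the cleaningWorker method name and the fixedDelayString usage come from the exception in the question:

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class CleanerJob {

    // With the SpEL expression quoted in application.yml, this placeholder
    // resolves to 20000, which Spring can parse into a long.
    @Scheduled(fixedDelayString = "${interval.milliseconds.cleaner}")
    public void cleaningWorker() {
        // cleanup logic goes here
    }
}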
Try:
$(( ${seconds.timeOut} * 2000seconds ))
If this doesn't work, please refer to this; it should help.
Does anyone know if it is possible to configure the JMeter HTML report so that it shows not only the first failed assertion, but all of them?
The generated xml_log.jtl looks like this:
<assertionResult>
  <name>Response Assertion [202]</name>
  <failure>true</failure>
  <error>false</error>
  <failureMessage>Test failed: code expected to equal /
received : [2]00
comparison: [3]00
/</failureMessage>
</assertionResult>
<assertionResult>
  <name>Duration Assertion [5ms] request</name>
  <failure>true</failure>
  <error>false</error>
  <failureMessage>The operation lasted too long: It took 293 milliseconds, but should not have lasted longer than 5 milliseconds.</failureMessage>
</assertionResult>
And the generated report:
Thanks.
The point is that the HTML Reporting Dashboard can only be generated from .jtl results files in CSV format:
The dashboard generator is a modular extension of JMeter. Its default behavior is to read and process samples from CSV files to generate HTML files containing graph views.
and a .jtl results file in CSV format stores information only about the first failed assertion.
You can work around this by adding a JSR223 Listener that walks through all the assertion results, combines the failure messages into a single one, and substitutes the first assertion's failure message with this combined cumulative one. Example code:
// Concatenate the failure messages of all assertion results for the previous sampler
def message = new StringBuilder()
prev.getAssertionResults().each { assertionResult ->
    message.append(assertionResult.getFailureMessage()).append(System.getProperty('line.separator'))
}
// Overwrite the first assertion's failure message so the combined text ends up in the CSV .jtl file
if (prev.getAssertionResults().size() > 0) {
    prev.getAssertionResults().first().setFailureMessage(message.toString())
}
More information on Groovy scripting in JMeter: Apache Groovy - Why and How You Should Use It
This might seem to be a simple question (if that's the case, please excuse me), but after 20 minutes of online searching I did not find any sensible answer.
I have several cron jobs to be executed via QuartzRunner; let's call the first FooBean and the second BarBean for now. FooBean runs daily at 00:00 for 6 (!) hours and sometimes it is not executed properly. After carefully studying the logs I found out that FooBean fails to execute when BarBean fails to execute. BarBean is executed daily at 03:00 and sometimes it throws:
java.lang.NullPointerException: File cannot be <null>
    at org.jconfig.FileWatcher.<init>(FileWatcher.java:54)
    at org.jconfig.handler.AbstractHandler.addFileListener(AbstractHandler.java:39)
    at org.jconfig.ConfigurationManager.addFileListener(ConfigurationManager.java:180)
    at org.jconfig.ConfigurationManager.getConfiguration(ConfigurationManager.java:122)
Sometimes it does not throw it, and then FooBean executes properly. If BarBean fails, the log shows a transaction deadlock issue repeatedly for ten minutes and then JDBC connection failures repeated again and again for almost three hours. I do not understand which file is involved. The line throwing the error looks like:
Configuration config = ConfigurationManager.getConfiguration("inventory");
and the org.jconfig namespace is involved here. Intuitively this seems to be a misconfiguration, but I did not find any sources that explain the issue.
The ConfigurationManager's getConfiguration method tries to load a config file from your classpath. The function concatenates the given name with '_config.xml'.
In your case this would be 'inventory_config.xml'; this file should be available on your classpath (e.g. main/resources) because ConfigurationManager tries to load it from there.
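As a minimal sketch (the surrounding class is hypothetical; only the getConfiguration call comes from the question), the expected setup would be:

// Requires inventory_config.xml on the classpath, e.g. under src/main/resources
import org.jconfig.Configuration;
import org.jconfig.ConfigurationManager;

public class InventoryConfigCheck {
    public static void main(String[] args) {
        // jconfig appends "_config.xml" to the given name, so this looks for inventory_config.xml;
        // the NullPointerException in FileWatcher indicates that file could not be resolved.
        Configuration config = ConfigurationManager.getConfiguration("inventory");
        System.out.println("Loaded configuration: " + config);
    }
}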
I am using Spring 5 with Kotlin and I have the following code:
@Scheduled(cron = "${job.notification.expiring.x}")
fun notify() {
}
and in application.yml:
job:
  notification:
    expiring.x: 0 0 12 1/1 * ?
On the line with @Scheduled, IntelliJ reports the following for the cron parameter:
An annotation parameter must be a compile-time constant.
How can I fix this? In Java the properties were loaded at run time.
You need to escape the $ character in Kotlin since this is used for String templates:
@Scheduled(cron = "\${job.notification.expiring.x}")
See the section on String templates at the bottom of this page:
https://kotlinlang.org/docs/reference/basic-types.html
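For comparison, a sketch of the equivalent Java declaration (the class and method names are hypothetical), where no escaping is needed because $ has no special meaning in Java string literals:

import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ExpiryNotifier {

    // The placeholder can be written as-is; Spring resolves it at run time.
    @Scheduled(cron = "${job.notification.expiring.x}")
    public void notifyExpiring() {
        // notification logic goes here
    }
}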
I'm trying to follow this tutorial to analyze Apache access log files using Pig:
http://venkatarun-n.blogspot.com/2013/01/analyzing-apache-logs-with-pig.html
And I'm stuck on this Pig script:
grpd = GROUP logs BY DayExtractor(dt) as day;
When I execute that in the grunt terminal, I get the following error:
ERROR 1200: mismatched input 'as' expecting SEMI_COLON
Failed to parse: mismatched input 'as' expecting SEMI_COLON
The DayExtractor function is defined from piggybank.jar as follows:
DEFINE DayExtractor
org.apache.pig.piggybank.evaluation.util.apachelogparser.DateExtractor('yyyy-MM-dd');
Ideas anyone?
I've been searching for a while about this. Any help would be greatly appreciated.
I am not sure how the author of the blog post got it to work, but as far as I know, you cannot use as in GROUP BY in Pig. Also, I don't think you can use UDFs in GROUP BY. Maybe the author had a different version of Pig that supported such operations. To get the same effect, you can split it into two steps:
logs_day = FOREACH logs GENERATE ....., DayExtractor(dt) as day;
grpd = GROUP logs_day BY day;
I am getting this strange error while executing the following code.
EncoderRequest encoderRequest = new EncoderRequest(sid,appTxnId,pfid,transactionType,"",isUpdatetype9,true);
I have checked that all the parameter values are valid. I am using the Java 7 platform.
Has anyone come across this situation? Please help.
The following is part of the stack trace I am getting:
Caused by: java.lang.ClassFormatError: Illegal local variable table length 48 in method com.cmc.facts.encoder.EncoderRequest.<init>(JLjava/lang/String;Ljava/lang/Long;Lcom/cmc/facts/enums/TransactionType;Ljava/lang/String;ZZ)V
    at com.cmc.facts.nist.NistReaderModel.preprossingOfNistFile(NistReaderModel.java:180)
    at com.cmc.facts.action.interstate.InterStateAction.uploadFIIF(InterStateAction.java:645)
    ... 115 more
There have been previous reports of the same error, on JUnit tests and similar.
For them, adding the JVM arg -XX:-UseSplitVerifier seemed to work.
Have a look at this article.
You can also do this configuration:
Add -noverify to your JVM args.
For an Ant configuration you can do: <jvmarg value="-noverify"/>
You can follow the link for more details on why we need to do this.