Java Logger Producing Multiple of the Same Log to Console

I have the following code. We are not using System.out.println statements but have to use a logger to print to the console.
Here is the example code (Java):
public void printColumnStats() {
    java.util.logging.Logger log = java.util.logging.Logger
            .getLogger("ProfileStatusClass");
    log.setLevel(Level.ALL);
    ConsoleHandler handler = new ConsoleHandler();
    handler.setFormatter(new MyFormatter());
    handler.setLevel(Level.ALL);
    log.addHandler(handler);
    // This will print the current Column Profiling stats
    log.fine("FieldName : " + this.datasetFieldName);
    log.fine("Field index : " + this.fieldIndex);
    NumberFormat formatter = new DecimalFormat("#0.00000000");
    if (this.fieldType.equalsIgnoreCase("number")) {
        log.fine("Field Null Count : " + this.datasetFieldNullCount);
        log.fine("Field Valid/Obs Count : " + this.datasetFieldObsCount);
        log.fine("Field Min : " + (0L + this.datasetFieldMin));
        ...
I call it like this (sorry, this part is in Scala, but it should be straightforward):
for (e <- tResults) {
    e._2.printColumnStats()
    println("++........................................................++")
}
What I am getting is tons of repeats before the next set of stats comes up, even though there is only one entry of each type per loop iteration:
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0
Field Null Count : 0.0

You are adding a new ConsoleHandler on every call to printColumnStats, so each call installs one more handler and every record is then published once per handler. You only want to install one handler. If you are going to use code to set up the logger, move the setup out of the printColumnStats function and into a static block:
private static final Logger log = Logger.getLogger("ProfileStatusClass");

static {
    log.setLevel(Level.ALL);
    ConsoleHandler handler = new ConsoleHandler();
    handler.setFormatter(new MyFormatter());
    handler.setLevel(Level.ALL);
    log.addHandler(handler);
}
By default, the JVM installs a ConsoleHandler on the root logger too. Your logger should call setUseParentHandlers(false) so that it doesn't publish to that handler as well.
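Putting both pieces together, a minimal sketch of the corrected class (the class name is illustrative; MyFormatter and the fields are your existing code):

import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class ProfileStatus {
    private static final Logger log = Logger.getLogger("ProfileStatusClass");

    static {
        log.setLevel(Level.ALL);
        ConsoleHandler handler = new ConsoleHandler();
        handler.setFormatter(new MyFormatter()); // your existing formatter
        handler.setLevel(Level.ALL);
        log.addHandler(handler);
        // Stop records from also propagating to the root logger's own
        // ConsoleHandler, which would print each message a second time.
        log.setUseParentHandlers(false);
    }

    private String datasetFieldName;

    public void printColumnStats() {
        // The handler is installed exactly once, so each record is
        // printed exactly once no matter how often this method runs.
        log.fine("FieldName : " + this.datasetFieldName);
        // ... rest of the stats logging ...
    }
}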

Related

org.hibernate.AssertionFailure: force initializing collection loading

After upgrading Spring to v6 and Spring Boot to v3 (and, transitively, Hibernate to 6.1.5), tests started to fail with org.hibernate.AssertionFailure: force initializing collection loading.
I have an entity class AClass with a field
@OneToMany(mappedBy = "aClass", fetch = FetchType.EAGER, cascade = CascadeType.ALL, orphanRemoval = true)
private Set<BClass> items;
When the AClass repository triggers certain methods, hashCode is invoked, and when it tries to execute items.hashCode() I get the exception:
2022-12-13T09:42:46.314+01:00 ERROR 15336 --- [ Test worker] org.hibernate.AssertionFailure : HHH000099: an assertion failure occurred (this may indicate a bug in Hibernate, but is more likely due to unsafe use of the session): org.hibernate.AssertionFailure: force initializing collection loading
org.hibernate.AssertionFailure: force initializing collection loading
at app//org.hibernate.collection.spi.AbstractPersistentCollection.forceInitialization(AbstractPersistentCollection.java:807)
at app//org.hibernate.engine.internal.StatefulPersistenceContext.initializeNonLazyCollections(StatefulPersistenceContext.java:995)
at app//org.hibernate.engine.internal.StatefulPersistenceContext.initializeNonLazyCollections(StatefulPersistenceContext.java:981)
at app//org.hibernate.sql.results.spi.ListResultsConsumer.consume(ListResultsConsumer.java:170)
at app//org.hibernate.sql.results.spi.ListResultsConsumer.consume(ListResultsConsumer.java:32)
at app//org.hibernate.sql.exec.internal.JdbcSelectExecutorStandardImpl.doExecuteQuery(JdbcSelectExecutorStandardImpl.java:443)
at app//org.hibernate.sql.exec.internal.JdbcSelectExecutorStandardImpl.executeQuery(JdbcSelectExecutorStandardImpl.java:166)
at app//org.hibernate.sql.exec.internal.JdbcSelectExecutorStandardImpl.list(JdbcSelectExecutorStandardImpl.java:91)
at app//org.hibernate.sql.exec.spi.JdbcSelectExecutor.list(JdbcSelectExecutor.java:31)
at app//org.hibernate.loader.ast.internal.CollectionLoaderSingleKey.load(CollectionLoaderSingleKey.java:121)
at app//org.hibernate.persister.collection.AbstractCollectionPersister.initialize(AbstractCollectionPersister.java:789)
at app//org.hibernate.event.internal.DefaultInitializeCollectionEventListener.onInitializeCollection(DefaultInitializeCollectionEventListener.java:75)
at app//org.hibernate.event.service.internal.EventListenerGroupImpl.fireEventOnEachListener(EventListenerGroupImpl.java:107)
at app//org.hibernate.internal.SessionImpl.initializeCollection(SessionImpl.java:1710)
at app//org.hibernate.collection.spi.AbstractPersistentCollection.lambda$initialize$3(AbstractPersistentCollection.java:613)
at app//org.hibernate.collection.spi.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:265)
at app//org.hibernate.collection.spi.AbstractPersistentCollection.initialize(AbstractPersistentCollection.java:611)
at app//org.hibernate.collection.spi.AbstractPersistentCollection.read(AbstractPersistentCollection.java:136)
at app//org.hibernate.collection.spi.PersistentSet.hashCode(PersistentSet.java:407)
at app//xyz.example.AClass.hashCode(AClass.java:52)
...
Whether I use the default Lombok hashCode method or the IntelliJ default like
@Override
public int hashCode() {
    int result = super.hashCode();
    result = 31 * result + (x != null ? x.hashCode() : 0);
    result = 31 * result + (y != null ? y.hashCode() : 0);
    result = 31 * result + (z != null ? z.hashCode() : 0);
    result = 31 * result + (items != null ? items.hashCode() : 0);
    return result;
}
I get the exception, but when I remove this line:
result = 31 * result + (items != null ? items.hashCode() : 0);
the tests pass. However, I would rather not remove items from the hashCode method.
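A minimal sketch of the usual workaround (an assumption on my part, not from the thread above): keep the persistent collection out of hashCode and hash only stable scalar fields, so hashCode can never force Hibernate to initialize items outside a session. Here x, y, and z stand in for the entity's scalar fields:

import java.util.Objects;

@Override
public int hashCode() {
    // Deliberately excludes 'items': touching the persistent Set is what
    // forces collection initialization and triggers the AssertionFailure.
    return Objects.hash(x, y, z);
}

If items really must participate, the collection has to be fully initialized inside an open session before hashCode is called, e.g. with Hibernate.initialize(entity.getItems()).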

Documentation for MBean attributes of a Datasource

I want to monitor my datasource pools in order to fine-tune my server to what real day-to-day demand actually requires, and to take decisions in advance, before it is overwhelmed.
I found this code, which works perfectly. It's a servlet that dumps the information to a web page when requested:
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
        throws ServletException, IOException {
    PrintWriter writer = resp.getWriter();
    writer.println("<!DOCTYPE html>");
    writer.println("<html>");
    writer.println("<body>");
    writer.println("<p><h1>Tomcat Pool</h1></p><p>");
    try {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        Set<ObjectName> objectNames = server.queryNames(null, null);
        for (ObjectName name : objectNames) {
            MBeanInfo info = server.getMBeanInfo(name);
            if (name.toString().contains("type=DataSource")) {
                writer.println("--------------<br/>");
                for (MBeanAttributeInfo mf : info.getAttributes()) {
                    Object attributeValue = server.getAttribute(name, mf.getName());
                    if (attributeValue != null) {
                        writer.println("" + mf.getName() + " : "
                                + attributeValue.toString() + "<br/>");
                    }
                }
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    writer.println("</p></body>");
    writer.println("</html>");
}
The output of one request might look like this (I am only posting the data for one datasource, to keep it simple):
propagateInterruptState : false
useDisposableConnectionFacade : true
validationInterval : 30000
jmxEnabled : true
ignoreExceptionOnPreLoad : false
logAbandoned : false
commitOnReturn : false
password : Password not available as DataSource/JMX operation.
maxIdle : 10
testWhileIdle : false
removeAbandoned : false
minIdle : 10
abandonWhenPercentageFull : 0
maxWait : -1
active : 0
size : 5
logValidationErrors : false
driverClassName : com.mysql.jdbc.Driver
name : Tomcat Connection Pool[2-2023938592]
poolSweeperEnabled : true
validationQueryTimeout : -1
numActive : 0
modelerType : org.apache.tomcat.jdbc.pool.DataSource
validationQuery : select 1
rollbackOnReturn : false
className : org.apache.tomcat.jdbc.pool.DataSource
numIdle : 5
alternateUsernameAllowed : false
suspectTimeout : 0
useEquals : true
removeAbandonedTimeout : 60
loginTimeout : 0
testOnConnect : false
idle : 5
initialSize : 5
defaultTransactionIsolation : -1
url : jdbc:mysql://localhost:3306/SYSTEMLOGS
numTestsPerEvictionRun : 0
testOnBorrow : false
fairQueue : true
timeBetweenEvictionRunsMillis : 5000
minEvictableIdleTimeMillis : 60000
accessToUnderlyingConnectionAllowed : true
maxAge : 0
testOnReturn : false
useLock : false
waitCount : 0
maxActive : 120
username : root
That's what I wanted, I think.
The problem is that I can't find the documentation where all these parameters are defined and explained, so I don't know exactly what I am getting. "active" seems to be the actual number of requests for that pool... or maybe "numActive"?
Any hint?
Thank you very much.
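Not a pointer to the docs, but a minimal sketch of a more targeted poll, using only attribute names that appear in the dump above (they may differ between pool implementations). As far as I can tell from the Tomcat JDBC pool, active and numActive report the same thing: connections currently checked out, while idle/numIdle are the connections sitting available.

import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class PoolPoller {
    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        // Match any MBean whose ObjectName carries type=DataSource.
        Set<ObjectName> names =
                server.queryNames(new ObjectName("*:type=DataSource,*"), null);
        for (ObjectName name : names) {
            // Attribute names exactly as they appear in the dump above.
            Object active = server.getAttribute(name, "active");
            Object idle = server.getAttribute(name, "idle");
            Object size = server.getAttribute(name, "size");
            Object waitCount = server.getAttribute(name, "waitCount");
            System.out.println(name + " active=" + active + " idle=" + idle
                    + " size=" + size + " waitCount=" + waitCount);
        }
    }
}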

Java GRIB-Decoder: Extract data from GRIB2 files

I've downloaded some GRIB data files from here: ftp://data-portal.ecmwf.int/20160721000000/ (the file type is .bin) and want to extract the data from these files in my Java application (I want to load the extracted data into a database later). I'm starting with the file ftp://wmo:essential@data-portal.ecmwf.int/20160721000000/A_HWXE85ECEM210000_C_ECMF_20160721000000_24h_em_ws_850hPa_global_0p5deg_grib2.bin.
Therefore I've created a new Java project and added the two libraries grib-8.0.29.jar and netcdfAll-4.6.6.jar. Documentation for the GRIB API can be found here: http://www.unidata.ucar.edu/software/decoders/grib/javadoc/. I need to open the downloaded files to get at the data. Retrieving some metadata via Grib2Dump seems to work (see below), and the Grib2Input instance says that I have a valid GRIB file of version 2.
Here is my working code for retrieving some metadata:
public static void main(String[] args) throws IOException, InterruptedException {
    File srcDir = new File("C://test//");
    File[] localFiles = srcDir.listFiles();
    for (File tempFile : localFiles) {
        RandomAccessFile raf = new RandomAccessFile(tempFile.getAbsolutePath(), "r");
        System.out.println("======= Grib2GDSVariables ==========");
        Grib2GDSVariables gdsVariables = new Grib2GDSVariables(raf.readBytes(raf.read()));
        System.out.println("Gds key : " + gdsVariables.getGdsKey());
        System.out.println("======= Grib2Input ==========");
        Grib2Input input = new Grib2Input(raf);
        System.out.println(Grib2Input.isValidFile(raf));
        System.out.println("scan : " + input.scan(true, true));
        System.out.println("getGDSs.size: " + input.getGDSs().size());
        System.out.println("getProducts.size: " + input.getProducts().size());
        System.out.println("getRecords.size: " + input.getRecords().size());
        System.out.println("edition: " + input.getEdition());
        System.out.println("======= Grib2Dump ==========");
        Grib2Dump dump = new Grib2Dump();
        dump.gribDump(new String[] {tempFile.getAbsolutePath()});
        System.out.println("======= Grib2ExtractRawData ==========");
        Grib2ExtractRawData extractRawData = new Grib2ExtractRawData(raf);
        extractRawData.main(new String[] {tempFile.getAbsolutePath()});
    }
    System.out.println("finished");
}
This produces the following output:
======= Grib2GDSVariables ==========
Gds key : -1732955898
======= Grib2Input ==========
true
scan : true
getGDSs.size: 0
getProducts.size: 0
getRecords.size: 0
edition: 2
======= Grib2Dump ==========
--------------------------------------------------------------------
Header : GRIB2
Discipline : 0 Meteorological products
GRIB Edition : 2
GRIB length : 113296
Originating Center : 98 European Center for Medium-Range Weather Forecasts (RSMC)
Originating Sub-Center : 0
Significance of Reference Time : 1 Start of forecast
Reference Time : 2016-07-21T00:00:00Z
Product Status : 0 Operational products
Product Type : 1 Forecast products
Number of data points : 259920
Grid Name : 0 Latitude_Longitude
Grid Shape: 6 Earth spherical with radius of 6,371,229.0 m
Number of points along parallel: 720
Number of points along meridian: 361
Basic angle : 0
Subdivisions of basic angle: -9999
Latitude of first grid point : 90.0
Longitude of first grid point : 0.0
Resolution & Component flags : 48
Winds : True
Latitude of last grid point : -90.0
Longitude of last grid point : 359.5
i direction increment : 0.5
j direction increment : 0.5
Grid Units : degrees
Scanning mode : 0
Product Definition : 2 Derived forecast on all ensemble members at a point in time
Parameter Category : 2 Momentum
Parameter Name : 1 Wind_speed
Parameter Units : m s-1
Generating Process Type : 4 Ensemble Forecast
ForecastTime : 24
First Surface Type : 100 Isobaric surface
First Surface value : 85000.0
Second Surface Type : 255 Missing
Second Surface value : -9.999E-252
======= Grib2ExtractRawData ==========
finished
I have been trying for two days now but can't get it to work. I can't obtain the actual content data (lat, lon, value) from the file...
Can someone give an example in Java?
You shouldn't use the GRIB classes in netCDF-Java directly. Instead, use
NetcdfFile.open()
That will give you access through the Common Data Model (CDM), which gives you a straightforward interface of variables and attributes. There's a tutorial here: https://www.unidata.ucar.edu/software/thredds/current/netcdf-java/tutorial/NetcdfFile.html
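A minimal sketch of that approach (the file path and variable name are placeholders; print ncfile.getVariables() first to see what the CDM actually found in your GRIB2 file):

import ucar.ma2.Array;
import ucar.nc2.NetcdfFile;
import ucar.nc2.Variable;

public class GribCdmExample {
    public static void main(String[] args) throws Exception {
        // netCDF-Java detects the GRIB2 format regardless of the .bin suffix.
        try (NetcdfFile ncfile = NetcdfFile.open("C://test//yourfile.bin")) {
            // List every variable the CDM exposes, with its dimensions.
            for (Variable v : ncfile.getVariables()) {
                System.out.println(v.getFullName() + "  (" + v.getDimensionsString() + ")");
            }
            // Hypothetical name; pick the real one from the listing above.
            Variable ws = ncfile.findVariable("Wind_speed_isobaric");
            if (ws != null) {
                Array data = ws.read(); // the whole grid, e.g. (time, level, lat, lon)
                System.out.println("read " + data.getSize() + " values");
            }
        }
    }
}

The lat/lon values themselves come from the coordinate variables (e.g. lat and lon) that the CDM builds alongside the data variable.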

How to resolve java.lang.NullPointerException in scala?

Assume part of my code is as follows,
where doc is a List[Document] that contains stu_name and roll_number.
Sometimes stu_name and roll_number may be null.
I used Try to avoid a NullPointerException in the first two lines,
but then why am I getting a NullPointerException again in val myRow?
val name = Try {Option.apply(doc.getFieldValue("stu_name"))}.getOrElse(null)
val rollNumber = {Option.apply(doc.getFieldValue("roll_number"))}.getOrElse(null)
val myRow = (
  doc.getFieldValue("ID").asInstanceOf[Int], // can't be null
  name.getOrElse(null).toString, // NullPointerException
  rollNumber.getOrElse(null).asInstanceOf[Int] // NullPointerException
)
.....
.....
I'm getting the following error:
[2016-01-14 22:40:16,896] WARN o.a.s.s.TaskSetManager [] [akka://JobServer/user/context-supervisor/demeter] - Lost task 0.0 in stage 0.0 (TID 0, 10.29.23.136): java.lang.NullPointerException
at com.test.events.Monitoring$$anonfun$geteventTableReplicateDayFunc$1.apply(Monitoring.scala:75)
at com.test.events.Monitoring$$anonfun$geteventTableReplicateDayFunc$1.apply(Monitoring.scala:57)
at com.test.events.Monitoring$$anonfun$27.apply(Monitoring.scala:104)
at com.test.events.Monitoring$$anonfun$27.apply(Monitoring.scala:104)
I tried the following in the console but did not see any error:
scala> val a = Try (Option.apply("atar")).getOrElse(null)
a: Option[String] = Some(atar)
scala> a.getOrElse(null)
res16: String = atar
scala> val a = Try (Option.apply(null)).getOrElse(null)
a: Option[Null] = None
scala> a.getOrElse(null)
res17: Null = null
This is all wrong. By using getOrElse(null) you are throwing away the advantages of using an Option in the first place, and adding far more complexity than needed. As for the exception itself: when the field is absent, name is None, so name.getOrElse(null) evaluates to null, and calling .toString on that null is exactly what throws the NullPointerException.
You need to decide what to do when the values are null. This version simply keeps them as Options (None on null input):
val myRow = (
  doc.getFieldValue("ID").toInt, // fails if null
  Option(doc.getFieldValue("stu_name")), // `None` if null
  Option(doc.getFieldValue("roll_number")).map(_.toInt) // `None` if null
)
Or use default values:
val myRow = (
  doc.getFieldValue("ID").toInt,
  Option(doc.getFieldValue("stu_name")).getOrElse("default"),
  Option(doc.getFieldValue("roll_number")).map(_.toInt).getOrElse(0)
)

Why am I not getting hystrix metrics?

I am trying to use Hystrix to monitor a certain network call, but all the metrics I try to monitor are always empty. What am I doing wrong?
I simulate a network call by implementing a (somewhat) RESTful interface that returns a pow calculation:
GetPowerCommand gpc = new GetPowerCommand(5, 82);
powerMetrics = gpc.getMetrics();
This is how I call the Hystrix command and expect to get some metrics (at least a Requests count that is not 0):
boolean run = true;
while (run) {
    try {
        Thread.sleep(1000);
    } catch (InterruptedException e) {
        e.printStackTrace();
        run = false;
    }
    System.out.println("GetPowerCommand.run(): " + gpc.run());
    System.out.println("GetPowerCommand.run(): " + gpc.run());
    System.out.println("getStatsStringFromMetrics(powerMetrics): " + getStatsStringFromMetrics(powerMetrics));
}
But all I get is:
GetPowerCommand.run(): <p>I guess .. </p><p>2^5 = 32</p>
GetPowerCommand.run(): <p>I guess .. </p><p>2^5 = 32</p>
getStatsStringFromMetrics(powerMetrics): Requests: 0 Errors: 0 (0%) Mean: 0 50th: 0 75th: 0 90th: 0 99th: 0
GetPowerCommand.run(): <p>I guess .. </p><p>2^5 = 32</p>
GetPowerCommand.run(): <p>I guess .. </p><p>2^5 = 32</p>
getStatsStringFromMetrics(powerMetrics): Requests: 0 Errors: 0 (0%) Mean: 0 50th: 0 75th: 0 90th: 0 99th: 0
Edit: here is my metrics retrieval method:
private static String getStatsStringFromMetrics(HystrixCommandMetrics metrics) {
    StringBuilder m = new StringBuilder();
    if (metrics != null) {
        HealthCounts health = metrics.getHealthCounts();
        m.append("Requests: ").append(health.getTotalRequests()).append(" ");
        m.append("Errors: ").append(health.getErrorCount()).append(" (")
                .append(health.getErrorPercentage()).append("%) ");
        m.append("Mean: ").append(metrics.getTotalTimeMean()).append(" ");
        m.append("50th: ").append(metrics.getExecutionTimePercentile(50)).append(" ");
        m.append("75th: ").append(metrics.getExecutionTimePercentile(75)).append(" ");
        m.append("90th: ").append(metrics.getExecutionTimePercentile(90)).append(" ");
        m.append("99th: ").append(metrics.getExecutionTimePercentile(99)).append(" ");
    }
    return m.toString();
}
You have already answered your own question: use execute() instead of run(). run() is the protected method you override with the work to be done; calling it yourself is just a plain Java method call that bypasses Hystrix entirely, so no metrics are ever recorded. execute() is the public entry point that wraps run() with the thread pool, circuit breaker, and metrics machinery. Have a look here as well.
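Note also that a HystrixCommand instance is single-use, so create a new command per call (your loop calls gpc.run() on the same instance). A minimal sketch (the group key and the pow logic are illustrative, not from your code):

import com.netflix.hystrix.HystrixCommand;
import com.netflix.hystrix.HystrixCommandGroupKey;
import com.netflix.hystrix.HystrixCommandKey;
import com.netflix.hystrix.HystrixCommandMetrics;

public class GetPowerCommand extends HystrixCommand<String> {
    private final int exponent;
    private final int base;

    public GetPowerCommand(int exponent, int base) {
        super(HystrixCommandGroupKey.Factory.asKey("PowerGroup"));
        this.exponent = exponent;
        this.base = base;
    }

    @Override
    protected String run() {
        // The simulated network call.
        return base + "^" + exponent + " = " + (long) Math.pow(base, exponent);
    }

    public static void main(String[] args) throws InterruptedException {
        // A new command instance per invocation: commands can only execute once.
        System.out.println(new GetPowerCommand(5, 2).execute());
        System.out.println(new GetPowerCommand(5, 2).execute());
        // Health counts are snapshotted on an interval, so give them a moment.
        Thread.sleep(1000);
        HystrixCommandMetrics metrics = HystrixCommandMetrics
                .getInstance(HystrixCommandKey.Factory.asKey("GetPowerCommand"));
        System.out.println("Requests: " + metrics.getHealthCounts().getTotalRequests());
    }
}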
