JPA 2 Constructor Expression cannot use constructor - java

I use OpenJPA 2.1.1 on WebSphere Application Server 8.
I want to create an object from a SELECT query using a constructor expression:
String queryString = "SELECT NEW mypackage.StatisticDataObject(c.source, "
+ "SUM(CASE WHEN (c.validTo <= CURRENT_TIMESTAMP AND c.expireNoProblem LIKE 'N') THEN 1 ELSE 0 END ), "
+ "SUM(CASE WHEN (c.validTo <= CURRENT_TIMESTAMP AND c.expireNoProblem LIKE 'Y') THEN 1 ELSE 0 END ), "
+ "SUM(CASE WHEN (c.validTo > :timestamp30 ) THEN 1 ELSE 0 END ), "
+ "SUM(CASE WHEN (c.validTo > :timestamp10 AND c.validTo <= :timestamp30 ) THEN 1 ELSE 0 END ), "
+ "SUM(CASE WHEN (c.validTo > CURRENT_TIMESTAMP AND c.validTo <= :timestamp10 ) THEN 1 ELSE 0 END ) )"
+ "FROM MYTABLE c GROUP BY c.source";
TypedQuery<StatisticDataObject> q = em.createQuery(queryString,
StatisticDataObject.class);
q.setParameter("timestamp30", getTimestampIn(30));
q.setParameter("timestamp10", getTimestampIn(10));
Constructor:
public StatisticDataObject(String name, Integer expired,
        Integer expiredButOK, Integer expireIn10Days, Integer expireIn30Days,
        Integer expireGT30Days) {
    this.name = name;
    this.expired = expired;
    this.expiredButOK = expiredButOK;
    this.expireIn10Days = expireIn10Days;
    this.expireIn30Days = expireIn30Days;
    this.expireGT30Days = expireGT30Days;
}
But I get the following exception:
Caused by: <openjpa-2.1.1-SNAPSHOT-r422266:1141200 nonfatal user error> org.apache.openjpa.persistence.ArgumentException: Query "SELECT NEW mypackage StatisticDataObject(c.source, ... at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:872)
at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:794)
at org.apache.openjpa.kernel.DelegatingQuery.execute(DelegatingQuery.java:542)
at org.apache.openjpa.persistence.QueryImpl.execute(QueryImpl.java:315)
at org.apache.openjpa.persistence.QueryImpl.getResultList(QueryImpl.java:331)
...
Caused by: java.lang.RuntimeException: Es wurde kein Konstruktor für "class mypackage.StatisticDataObject" mit den Argumenttypen "[class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String]" gefunden, um die Daten einzutragen.
// English translation: Caused by: java.lang.RuntimeException: No constructor was found for "class mypackage.StatisticDataObject" with the argument types "[class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String, class java.lang.String]" to fill in the data.
at org.apache.openjpa.kernel.FillStrategy$NewInstance.findConstructor(FillStrategy.java:139)
at org.apache.openjpa.kernel.FillStrategy$NewInstance.fill(FillStrategy.java:144)
at org.apache.openjpa.kernel.ResultShape.pack(ResultShape.java:362)
at org.apache.openjpa.kernel.ResultShapePacker.pack(ResultShapePacker.java:48)
at org.apache.openjpa.kernel.QueryImpl$PackingResultObjectProvider.getResultObject(QueryImpl.java:2082)
at org.apache.openjpa.lib.rop.EagerResultList.<init>(EagerResultList.java:36)
at org.apache.openjpa.kernel.QueryImpl.toResult(QueryImpl.java:1251)
at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:1007)
at org.apache.openjpa.kernel.QueryImpl.execute(QueryImpl.java:863)
... 84 more
If I run the query without NEW mypackage.StatisticDataObject() it works and returns Object[] rows, and the runtime class of object[1] through object[5] (checked with .getClass()) is Integer.
So why does JPA hand the constructor expression a String for each SUM() instead of an Integer?

In your constructor, take into account that the field you apply SUM to must be numeric, and the constructor parameter type must match the type the aggregate actually returns. For example, if a Double field is summed, the result comes back as a Double; if a Long field is summed, the result comes back as a Long.
That is the root cause of the problem: SUM does not return an Integer.
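For illustration, here is a minimal sketch (only the field names are taken from the question, the rest is assumed) of a constructor that accepts the aggregates as Long, which is what the JPA spec defines for SUM over integral operands, and narrows them to the Integer fields:

// Sketch only: take the SUM results as Long and convert them to the Integer fields.
public class StatisticDataObject {

    private final String name;
    private final Integer expired;
    private final Integer expiredButOK;
    private final Integer expireIn10Days;
    private final Integer expireIn30Days;
    private final Integer expireGT30Days;

    public StatisticDataObject(String name, Long expired, Long expiredButOK,
                               Long expireIn10Days, Long expireIn30Days, Long expireGT30Days) {
        this.name = name;
        this.expired = expired == null ? null : expired.intValue();
        this.expiredButOK = expiredButOK == null ? null : expiredButOK.intValue();
        this.expireIn10Days = expireIn10Days == null ? null : expireIn10Days.intValue();
        this.expireIn30Days = expireIn30Days == null ? null : expireIn30Days.intValue();
        this.expireGT30Days = expireGT30Days == null ? null : expireGT30Days.intValue();
    }
}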

Related

JDBi3 UnableToCreateStatementException: No argument factory registered for 'false' of qualified type org.jdbi.v3.core.argument.NullArgument

I'm running into a strange issue where if I try to bind("paramName", false) to a nullable boolean (SQL bit) in my database, I receive the following error:
org.jdbi.v3.core.statement.UnableToCreateStatementException: No argument factory registered for 'false' of qualified type org.jdbi.v3.core.argument.NullArgument [statement:"UPDATE dbo.TagValues SET BoolValue = :boolValue, NumericValue = :numericValue, StringValue = :stringValue WHERE TagID = :tagID", arguments:{positional:{}, named:{stringValue:NULL,tagID:17,numericValue:NULL,boolValue:false}, finder:[]}]
I am using MSSQL Server 2017 and JDBi 3. All three of my colon-delimited parameters (boolValue, numericValue, and stringValue) are nullable.
The weird thing is, I am hardcoding the boolean value, and that is the only instance where the statement seems to barf. I've attached the code, with the offending line. Sorry ahead of time for the line lengths.
PreparedBatch batch = handle.prepareBatch("UPDATE dbo.TagValues SET BoolValue = :boolValue, NumericValue = :numericValue, StringValue = :stringValue WHERE TagID = :tagID");
active_sites.forEach(site -> site.getEntryPoints().forEach(entryPoint -> {
    if (entryPoint.isGatewayResponding() && !entryPoint.hasError()) {
        entryPoint.getTiedTags().forEach(tag -> {
            if (boolTypes.contains(tag.getType())) {
                batch.bind("boolValue", tag.getValue().getValue()).bindNull("numericValue", Types.FLOAT).bindNull("stringValue", Types.VARCHAR).bind("tagID", tag.getIDX()).add();
            } else if (numericTypes.contains(tag.getType())) {
                batch.bindNull("boolValue", Types.BIT).bind("numericValue", tag.getValue().getValue()).bindNull("stringValue", Types.VARCHAR).bind("tagID", tag.getIDX()).add();
            } else if (stringTypes.contains(tag.getType())) {
                batch.bindNull("boolValue", Types.BIT).bindNull("numericValue", Types.FLOAT).bind("stringValue", tag.getValue().getValue()).bind("tagID", tag.getIDX()).add();
            }
        });
    } else {
        entryPoint.getTiedTags().forEach(tag -> batch.bindNull("boolValue", Types.BIT).bindNull("numericValue", Types.FLOAT).bindNull("stringValue", Types.VARCHAR).bind("tagID", tag.getIDX()).add());
    }
    entryPoint.getErrorTags().stream().filter(tag -> tag.getType() == 33 || tag.getType() == 34 || tag.getType() == 35).forEach(tag -> {
        if (Objects.nonNull(tag.getValue())) {
            batch.bind("boolValue", tag.getValue().getValue()).bindNull("numericValue", Types.FLOAT).bindNull("stringValue", Types.VARCHAR).bind("tagID", tag.getIDX()).add();
        } else {
            // TAG 17 FALLS INTO THIS CATEGORY, CONFIRMED BY PRINTLN. THIS IS THE OFFENDER.
            batch.bind("boolValue", false).bindNull("numericValue", Types.FLOAT).bindNull("stringValue", Types.VARCHAR).bind("tagID", tag.getIDX()).add();
        }
    });
}));
Update:
jdbi: 3.18.1
mssql jdbc driver: 9.2.1.jre15
Full stacktrace:
org.jdbi.v3.core.statement.UnableToCreateStatementException: No argument factory registered for 'false' of qualified type org.jdbi.v3.core.argument.NullArgument [statement:"UPDATE dbo.TagValues SET BoolValue = :boolValue, NumericValue = :numericValue, StringValue = :stringValue WHERE TagID = :tagID", arguments:{positional:{}, named:{stringValue:NULL,tagID:17,numericValue:NULL,boolValue:false}, finder:[]}]
at org.jdbi.v3.core.statement.ArgumentBinder.factoryNotFound(ArgumentBinder.java:174)
at org.jdbi.v3.core.statement.ArgumentBinder.lambda$null$2(ArgumentBinder.java:141)
at java.base/java.util.Optional.orElseThrow(Optional.java:403)
at org.jdbi.v3.core.statement.ArgumentBinder.lambda$null$3(ArgumentBinder.java:141)
at org.jdbi.v3.core.statement.ArgumentBinder.lambda$null$4(ArgumentBinder.java:142)
at org.jdbi.v3.core.statement.ArgumentBinder$Prepared.lambda$prepareBinder$12(ArgumentBinder.java:230)
at org.jdbi.v3.core.statement.ArgumentBinder.lambda$wrapExceptions$6(ArgumentBinder.java:153)
at org.jdbi.v3.core.statement.ArgumentBinder$Prepared.lambda$null$13(ArgumentBinder.java:234)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at org.jdbi.v3.core.statement.ArgumentBinder$Prepared.lambda$prepareBinder$14(ArgumentBinder.java:234)
at org.jdbi.v3.core.statement.ArgumentBinder$Prepared.bindNamed(ArgumentBinder.java:240)
at org.jdbi.v3.core.statement.ArgumentBinder.bind(ArgumentBinder.java:60)
at org.jdbi.v3.core.statement.PreparedBatch.internalBatchExecute(PreparedBatch.java:204)
at org.jdbi.v3.core.statement.PreparedBatch.execute(PreparedBatch.java:108)
at app.dao.services.impl.EntryDaoService.updateTagValues(EntryDaoService.java:165)
at app.SCADA.updateTagValues(SCADA.java:215)
at app.SCADA.run(SCADA.java:93)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
at java.base/java.lang.Thread.run(Thread.java:831)
Update: This was in fact a bug in JDBi where it couldn't bind null to a column for some rows when other rows in the same batch bound real values for the same parameters. I believe this was resolved in JDBi 3.24+.
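Until you can upgrade, one possible workaround (a sketch only; the helper method and exact types are my assumptions, not from the original code) is to route the nullable parameters through bindByType for every row, so that a null and a hardcoded false both resolve through the same Boolean argument factory instead of mixing NullArgument with a plain bind:

import org.jdbi.v3.core.statement.PreparedBatch;

public final class TagValueBatchHelper {

    // Sketch: bind each nullable parameter with an explicit type so every row in the
    // batch uses the same argument factory for a given name, whether the value is null or not.
    static void addRow(PreparedBatch batch, Boolean boolValue, Double numericValue,
                       String stringValue, long tagId) {
        batch.bindByType("boolValue", boolValue, Boolean.class)
             .bindByType("numericValue", numericValue, Double.class)
             .bindByType("stringValue", stringValue, String.class)
             .bind("tagID", tagId)
             .add();
    }
}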

InvalidRequestException(why:line 1:184 mismatched character ')' expecting '-')

When I try to save an entity to Cassandra using the persist() method and the Kundera framework, I receive the error:
18976 [Thread-15-localhostAMQPbolt0-executor[2 2]] INFO c.i.c.c.CassandraClientBase - Returning cql query INSERT INTO "pieces"("width","depth","height","idpiece") VALUES(10.0,12.0,11.0,'1') .
18998 [Thread-15-localhostAMQPbolt0-executor[2 2]] INFO d.c.DatabaseController - insert piece to database: SUCCESS
18998 [Thread-15-localhostAMQPbolt0-executor[2 2]] INFO d.d.SensorDAOImpl - start to insert data
19011 [Thread-15-localhostAMQPbolt0-executor[2 2]] INFO c.i.c.c.CassandraClientBase - Returning cql query INSERT INTO "sensors"("event_time","temperature","pressure","IdSensor","date","this$0") VALUES(1462959800344,10.0,10.0,'1',150055,sensor.entitie.predefinedModel.SensorEntitie#1c4a9b7b) .
19015 [Thread-15-localhostAMQPbolt0-executor[2 2]] ERROR c.i.c.c.CassandraClientBase - Error while executing query INSERT INTO "sensors"("event_time","temperature","pressure","IdSensor","date","this$0") VALUES(1462959800344,10.0,10.0,'1',150055,sensor.entitie.predefinedModel.SensorEntitie#1c4a9b7b)
19015 [Thread-15-localhostAMQPbolt0-executor[2 2]] INFO c.i.c.c.CassandraClientBase - Returning delete query DELETE FROM "pieces" WHERE "idpiece" = '1'.
19018 [Thread-15-localhostAMQPbolt0-executor[2 2]] ERROR o.a.s.util - Async loop died!
java.lang.RuntimeException: com.impetus.kundera.KunderaException: com.impetus.kundera.KunderaException: InvalidRequestException(why:line 1:184 mismatched character ')' expecting '-')
at org.apache.storm.utils.DisruptorQueue.consumeBatchToCursor(DisruptorQueue.java:448) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.utils.DisruptorQueue.consumeBatchWhenAvailable(DisruptorQueue.java:414) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.disruptor$consume_batch_when_available.invoke(disruptor.clj:73) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.daemon.executor$fn__8226$fn__8239$fn__8292.invoke(executor.clj:851) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.util$async_loop$fn__554.invoke(util.clj:484) [storm-core-1.0.0.jar:1.0.0]
at clojure.lang.AFn.run(AFn.java:22) [clojure-1.7.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.7.0_99]
Caused by: com.impetus.kundera.KunderaException: com.impetus.kundera.KunderaException: InvalidRequestException(why:line 1:184 mismatched character ')' expecting '-')
at com.impetus.kundera.persistence.EntityManagerImpl.persist(EntityManagerImpl.java:180) ~[project-0.0.1-SNAPSHOT-jar-with-dependencies.jar:?]
at database.dao.SensorDAOImpl.insert(SensorDAOImpl.java:54) ~[project-0.0.1-SNAPSHOT-jar-with-dependencies.jar:?]
at database.controller.DatabaseController.saveSensorEntitie(DatabaseController.java:49) ~[project-0.0.1-SNAPSHOT-jar-with-dependencies.jar:?]
at connector.bolt.PrinterBolt.execute(PrinterBolt.java:66) ~[project-0.0.1-SNAPSHOT-jar-with-dependencies.jar:?]
at org.apache.storm.daemon.executor$fn__8226$tuple_action_fn__8228.invoke(executor.clj:731) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.daemon.executor$mk_task_receiver$fn__8147.invoke(executor.clj:463) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.disruptor$clojure_handler$reify__7663.onEvent(disruptor.clj:40) ~[storm-core-1.0.0.jar:1.0.0]
at org.apache.storm.utils.DisruptorQueue.consumeBatchToCursor(DisruptorQueue.java:435) ~[storm-core-1.0.0.jar:1.0.0]
... 6 more
As you can see, I want to use a OneToMany relationship.
My piece entity class:
@Entity
@Table(name = "pieces", schema = "mykeyspace@cassandra_pu")
public class PieceEntitie implements Serializable {

    @Id
    private String IdPiece;

    @Column
    private double width;

    @Column
    private double height;

    @Column
    private double depth;
My sensor entity class:
@EmbeddedId
private CompoundKey key;

@Column
private float temperature;

@Column
private float pressure;

@OneToMany(cascade = { CascadeType.ALL }, fetch = FetchType.EAGER)
@JoinColumn(name = "idsensor")
private List<PieceEntitie> pieces;

@Embeddable
public class CompoundKey
{
    @Column
    private String IdSensor;

    @Column
    private long date;

    @Column(name = "event_time")
    private long eventTime;
}
My tables:
CREATE TABLE mykeyspace.sensors (
idsensor text,
date bigint,
event_time timestamp,
pressure float,
temperature float,
PRIMARY KEY ((idsensor, date), event_time)
) WITH CLUSTERING ORDER BY (event_time ASC)
AND bloom_filter_fp_chance = 0.01
AND caching = '{"keys":"ALL", "rows_per_partition":"NONE"}'
AND comment = ''
AND compaction = {'class': 'org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy'}
AND compression = {'sstable_compression': 'org.apache.cassandra.io.compress.LZ4Compressor'}
AND dclocal_read_repair_chance = 0.1
AND default_time_to_live = 0
AND gc_grace_seconds = 864000
AND max_index_interval = 2048
AND memtable_flush_period_in_ms = 0
AND min_index_interval = 128
AND read_repair_chance = 0.0
AND speculative_retry = '99.0PERCENTILE';
cqlsh:sensor> DESCRIBE table pieces ;
CREATE TABLE mykeyspace.pieces (
idpiece text PRIMARY KEY,
depth double,
height double,
idsensor text,
width double
) WITH bloom_filter_fp_chance = 0.01
AND caching = '{"keys":"ALL", "rows_per_partition":"NONE"}'
AND comment = ''
AND compaction = {'class': 'org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy'}
AND compression = {'sstable_compression': 'org.apache.cassandra.io.compress.LZ4Compressor'}
AND dclocal_read_repair_chance = 0.1
AND default_time_to_live = 0
AND gc_grace_seconds = 864000
AND max_index_interval = 2048
AND memtable_flush_period_in_ms = 0
AND min_index_interval = 128
AND read_repair_chance = 0.0
AND speculative_retry = '99.0PERCENTILE';
Tutorial followed: https://github.com/impetus-opensource/Kundera/wiki/Polyglot-Persistence
How can I resolve this problem?
I resolved the problem by separating the CompoundKey class from the sensor class.
Before, I had put the CompoundKey class inside the sensor class as an inner class, so Kundera was trying to insert the enclosing instance (the this$0 column visible in the generated INSERT) as an attribute.
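For reference, a minimal sketch of the separated key (reusing the fields shown above); as a top-level @Embeddable it no longer carries a synthetic reference to the enclosing entity:

// CompoundKey.java -- top-level class instead of an inner class of the sensor entity
import java.io.Serializable;

import javax.persistence.Column;
import javax.persistence.Embeddable;

@Embeddable
public class CompoundKey implements Serializable {

    @Column
    private String IdSensor;

    @Column
    private long date;

    @Column(name = "event_time")
    private long eventTime;
}

The sensor entity then keeps only @EmbeddedId private CompoundKey key; and the generated INSERT no longer contains a this$0 column.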

Passing a parameter in a jpql query select

I have a JPQL query that instantiates a Java object in the SELECT clause:
public List<ChampEtatOT> getEtatOT(Date dateDebut, Date dateFin) {
Query query = em.createQuery("SELECT NEW ChampEtatOT( ot.numero, uo.denominationFr, ot.etat, ot.dateDebutReelle , ot.dateFinReelle, :dateParam1, :dateParam2, :dateParam3) FROM ordre ot JOIN ot.unite uo")
.setParameter("dateParam1", dateDebut, TemporalType.DATE)
.setParameter("dateParam2", dateFin, TemporalType.DATE)
.setParameter("dateParam3", new Date("2015-01-01"), TemporalType.DATE);
return query.getResultList();
}
I pass 3 parameters so that I can use them in the constructor.
I get this error:
Caused by: Exception [EclipseLink-6137] (Eclipse Persistence Services - 2.3.2.v20111125-r10461): org.eclipse.persistence.exceptions.QueryException
Exception Description: An Exception was thrown while executing a ReportQuery with a constructor expression: java.lang.NoSuchMethodException: dz.elit.gmao.commun.reporting.classe.ChampEtatOT.<init>(java.lang.String, java.lang.String, java.lang.String, java.util.Date, java.util.Date)
Query: ReportQuery(referenceClass=TravOrdreTravail jpql="SELECT NEW dz.elit.gmao.commun.reporting.classe.ChampEtatOT( ot.numero, uo.denominationFr, ot.etat, ot.dateDebutReelle , ot.dateFinReelle, :dateParam1, :dateParam2, :dateParam3) FROM TravOrdreTravail ot JOIN ot.uniteOrganisationnellle uo")
I think it's not possible to put parameters in a SELECT clause, so does anyone have an idea? The constructor is as follows:
public ChampEtatOT(String numero, String denominationFr, String etat, Date dateDebutReelle,
                   Date dateFinReelle, Date dateParam1, Date dateParam2, Date dateParam3) {
    this.numero = numero;
    this.denominationFr = denominationFr;
    if (etat.equals("OUV")) {
        if (dateDebutReelle.before(dateParam1)) {
            etatEntreeSortie = "En instance debut du mois";
        } else {
            if (dateDebutReelle.before(dateParam2)) {
                etatEntreeSortie = "En instance fin du mois";
            } else {
                if (dateDebutReelle.after(dateParam1) && dateDebutReelle.before(dateParam2)) {
                    etatEntreeSortie = "Entree/Mois";
                }
            }
        }
    }
}
Problem solved. As bRIMOs Bor suggested, it's not possible to pass parameters in a SELECT clause here, so I retrieved all the results into a List and then filtered them according to the three dates date1, date2 and date3:
Query query = em.createQuery("SELECT NEW ChampEtatAteliers"
+ "( ot.numero, uo.denominationFr, ot.etat, ot.dateDebutReelle, ot.dateFinReelle) "
+ "FROM ordre ot JOIN ot.unite uo");
List<ChampEtatAteliers> champEtatAtelierses = query.getResultList();
for (ChampEtatAteliers champEtatAtelierse : champEtatAtelierses) {
    if (champEtatAtelierse.getDateDebutReelle().compareTo(date1) >= 0 && champEtatAtelierse.getDateDebutReelle().compareTo(date2) <= 0) {
        champEtatAtelierList2.add(new ChampEtatAteliers(champEtatAtelierse.getNumero(), champEtatAtelierse.getDenominationFr(), "Entree/Mois"));
    }
    if (champEtatAtelierse.getEtat().equals("OUV")) {
        if (champEtatAtelierse.getDateDebutReelle().compareTo(date1) < 0) {
            champEtatAtelierse.setEtatEntreeSortie("En instance début du mois");
        } else {
            if (champEtatAtelierse.getDateDebutReelle().compareTo(date2) <= 0) {
                champEtatAtelierse.setEtatEntreeSortie("En instance fin du mois");
            }
        }
    }
}
I think it's not possible to reference a parameter in the constructor.
In your case it throws a NoSuchMethodException: it means there is no constructor with that signature in your ChampEtatOT class (5 parameters instead of 8).
You can refer to this answer => Passing a parameter in a jpql query select
So try to retrieve all the data, then write a filter method that sets the etatEntreeSortie values on the ChampEtatOT instances of the result list.
Clearly the JPQL BNF does permit passing parameters as constructor arguments.
constructor_expression ::= NEW constructor_name ( constructor_item {, constructor_item}* )
constructor_item ::= single_valued_path_expression | scalar_expression | aggregate_expression |
identification_variable
scalar_expression ::= simple_arithmetic_expression | string_primary | enum_primary |
datetime_primary | boolean_primary | case_expression | entity_type_expression
string_primary ::= state_field_path_expression | string_literal |
input_parameter | functions_returning_strings | aggregate_expression | case_expression
i.e. a scalar_expression can be a string_primary, which can be an input_parameter. So your JPA provider is not meeting the JPA spec and you should raise a bug on it.
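Read literally, the grammar therefore allows a query like the following (a sketch using the fully qualified names from the error message; dateDebut, dateFin and dateReference are assumed java.util.Date values). Whether it actually executes depends on the provider, and EclipseLink 2.3.2 apparently rejects it:

// Sketch only: input parameters used as constructor arguments, which the BNF above permits.
public List<ChampEtatOT> getEtatOT(EntityManager em, Date dateDebut, Date dateFin, Date dateReference) {
    TypedQuery<ChampEtatOT> q = em.createQuery(
            "SELECT NEW dz.elit.gmao.commun.reporting.classe.ChampEtatOT("
          + "ot.numero, uo.denominationFr, ot.etat, ot.dateDebutReelle, ot.dateFinReelle, "
          + ":dateParam1, :dateParam2, :dateParam3) "
          + "FROM TravOrdreTravail ot JOIN ot.uniteOrganisationnellle uo",
            ChampEtatOT.class);
    q.setParameter("dateParam1", dateDebut, TemporalType.DATE);
    q.setParameter("dateParam2", dateFin, TemporalType.DATE);
    q.setParameter("dateParam3", dateReference, TemporalType.DATE);
    return q.getResultList();
}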

[Hive] I got an "ArrayIndexOutOfBoundsException" while querying the Hive database

I usually get an "ArrayIndexOutOfBoundsException" while querying the Hive database (both hive-0.11.0 and hive-0.12.0), though sometimes the query succeeds. Here is the error:
java.lang.RuntimeException: Hive Runtime Error while closing operators: java.lang.ArrayIndexOutOfBoundsException: 0
at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.close(ExecReducer.java:313)
at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:232)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:539)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ArrayIndexOutOfBoundsException: 0
at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.first(RowContainer.java:231)
at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.first(RowContainer.java:74)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:645)
at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:758)
at org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:257)
at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.close(ExecReducer.java:298)
... 8 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
at org.apache.hadoop.hive.ql.exec.persistence.RowContainer.first(RowContainer.java:220)
... 13 more
Could someone help me?
Update: my query:
Select distinct jabberUseragent.gameID,agentPlayInfo.gameLabel,jabberUseragent.userAgent,CONCAT(CONCAT(CONCAT(triggerUsageStart.generateDate,' '),triggerUsageStart.timezone),CONCAT(' ',triggerUsageStart.generateTime)) as generateDate,(unix_timestamp(CONCAT(CONCAT(triggerUsageStop.generateDate,' '),triggerUsageStop.generateTime)) - unix_timestamp(CONCAT(CONCAT(triggerUsageStart.generateDate,' '),triggerUsageStart.generateTime))) from
(Select gameSession,gameID,userAgent from(Select distinct regexp_extract(t.payload,'playRequestId:(.*), playRequest',1) as gameSession,regexp_extract(t.payload,'gameId:(.*), userAgent:',1) as gameID,regexp_extract(t.payload,', userAgent:(.*), agentLocation',1) as userAgent,payload from (select * from ${hiveconf:DATA_BASE} p where p.dt >= '${hiveconf:LOW_DATE}' and p.dt <= '${hiveconf:UPPER_DATE}') t where CONCAT(t.generatedate,t.generatetime) >= CONCAT('${hiveconf:LOW_DATE}','${hiveconf:LOW_TIME}') and CONCAT(t.generatedate,t.generatetime) <= CONCAT('${hiveconf:UPPER_DATE}','${hiveconf:UPPER_TIME}'))jabberUseragent where jabberUseragent.gameSession!='' and jabberUseragent.userAgent!='') jabberUseragent
join
(Select gameID,gameLabel from(Select distinct regexp_extract(t.payload,'gameId=(.*),gameLabel=.*,configFilePath',1) as gameID,regexp_extract(t.payload,'gameId=.*,gameLabel=(.*),configFilePath',1) as gameLabel,payload from (select * from ${hiveconf:DATA_BASE} p where p.dt >= '${hiveconf:LOW_DATE}' and p.dt <= '${hiveconf:UPPER_DATE}') t where CONCAT(t.generatedate,t.generatetime) >= CONCAT('${hiveconf:LOW_DATE}','${hiveconf:LOW_TIME}') and CONCAT(t.generatedate,t.generatetime) <= CONCAT('${hiveconf:UPPER_DATE}','${hiveconf:UPPER_TIME}'))agentPlayInfo where agentPlayInfo.gameID!='' and agentPlayInfo.gameLabel!='') agentPlayInfo
join
(Select gameSession,generateDate,generateTime,timezone,payload from(Select distinct regexp_extract(t.payload,'GAME_SESSION=.*((.{8})-(.{4})-(.{4})-(.{4})-(.{12}))\" USAGE=\"([\\w\\-\\(\\)\\.]*,){41}9.*\"',1) as gameSession,generateDate,generateTime,timezone,payload from (select * from ${hiveconf:DATA_BASE} p where p.dt >= '${hiveconf:LOW_DATE}' and p.dt <= '${hiveconf:UPPER_DATE}') t where t.payload like '%[e] usage_record%' and CONCAT(t.generatedate,t.generatetime) <= CONCAT('${hiveconf:UPPER_DATE}','${hiveconf:UPPER_TIME}') and CONCAT(t.generatedate,t.generatetime) >= CONCAT('${hiveconf:LOW_DATE}','${hiveconf:LOW_TIME}'))triggerStart where triggerStart.gameSession!='')triggerUsageStart
join
(Select gameSession,generateDate,generateTime,timezone,payload from(Select distinct regexp_extract(t.payload,'GAME_SESSION=.*((.{8})-(.{4})-(.{4})-(.{4})-(.{12}))\" USAGE=\"([\\w\\-\\(\\)\\.]*,){41}[1-5].*\"',1) as gameSession,generateDate,generateTime,timezone,payload from (select * from ${hiveconf:DATA_BASE} p where p.dt >= '${hiveconf:LOW_DATE}' and p.dt <= '${hiveconf:UPPER_DATE}') t where t.payload like '%[e] usage_record%' and CONCAT(t.generatedate,t.generatetime) <= CONCAT('${hiveconf:UPPER_DATE}','${hiveconf:UPPER_TIME}') and CONCAT(t.generatedate,t.generatetime) >= CONCAT('${hiveconf:LOW_DATE}','${hiveconf:LOW_TIME}'))triggerStop where triggerStop.gameSession!='')triggerUsageStop
on (jabberUseragent.gameSession = triggerUsageStart.gameSession and triggerUsageStart.gameSession = triggerUsageStop.gameSession and jabberUseragent.gameID = agentPlayInfo.gameID) order by generateDate
Sorry, I can't share my samples.
By the way, I also got this exception before the "ArrayIndexOutOfBoundsException":
javax.jdo.JDODataStoreException: Error executing SQL query "select PARTITIONS.PART_ID from PARTITIONS inner join TBLS on PARTITIONS.TBL_ID = TBLS.TBL_ID inner join DBS on TBLS.DB_ID = DBS.DB_ID where TBLS.TBL_NAME = ? and DBS.NAME = ? and PARTITIONS.PART_NAME in (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)".
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
at org.datanucleus.api.jdo.JDOQuery.executeWithArray(JDOQuery.java:321)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPartitionsViaSqlFilterInternal(MetaStoreDirectSql.java:181)
at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPartitionsViaSqlFilter(MetaStoreDirectSql.java:82)
at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByNamesInternal(ObjectStore.java:1717)
at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByNames(ObjectStore.java:1700)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)......
NestedThrowablesStackTrace:
org.postgresql.util.PSQLException: ERROR: relation "partitions" does not exist
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:1591)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1340)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:192)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:471)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:373)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:258)......
Based on the information provided, this is the only sensible direction I can suggest for your problem.
I have put the method definition here for reference; please go through it to understand.
If you read the source code closely, there are two areas / possibilities where the ArrayIndexOutOfBoundsException can be thrown:
Accessing the array of input splits obtained from the Configuration
Reading the row from the currentReadBlock array (this is mostly not the case for this exception, since its size is greater than 0)
Please check the set of input files for the job, because InputFormat#getSplits() returns an array of InputSplit, and each InputSplit is then assigned to an individual mapper for processing. Mostly, the exception occurs while accessing this InputSplit[] array.
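If you want to sanity-check the job input first, here is a hedged sketch (the path handling is an assumption, not taken from the question) that simply lists the files the job would read:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch: list the table/partition directory to confirm it actually contains data
// files, since an unexpected input set is the usual source of a bad InputSplit[] array.
public final class InputSanityCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path inputDir = new Path(args[0]);   // e.g. the HDFS location of the partition
        for (FileStatus status : fs.listStatus(inputDir)) {
            System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
        }
    }
}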

When do we get "'NaN' is not a valid numeric or approximate numeric value" in SQLException?

I have added the following code in my JSP and am getting an exception when saving the entity to the DB.
$('.validateWeight').click(function(event) {
    var id = event.target.id;
    var value;
    $("#saveValue").val("");
    if (id == "saveAndContinue") {
        var el = document.getElementById('saveValue');
        el.value = 1;
        //<c:set var="saveValue" scope="page" value="1"/>;
    } else if (id == "saveAndClose") {
        var el = document.getElementById('saveValue');
        el.value = 2;
        //<c:set var="saveValue" scope="page" value="2"/>;
    }
In my Java Action class, I have added the getter and setter.
The values are set to 1 and 2 in the cases that need them, but when saving the entity I don't know how or from where the 'NaN' value comes, and the save call fails.
Exception:
org.springframework.dao.TransientDataAccessResourceException: Hibernate operation: could not update: [com.hk.domain.inventory.GrnLineItem#100]; SQL [update person set age=?, no_of_children=?, address_line_1=?, address_line_2=?, mrp=?, state=?, city=?, pincode=?, weight=? where id=?]; 'NaN' is not a valid numeric or approximate numeric value; nested exception is java.sql.SQLException: 'NaN' is not a valid numeric or approximate numeric value
org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:107)
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
org.springframework.orm.hibernate3.HibernateAccessor.convertJdbcAccessException(HibernateAccessor.java:424)
org.springframework.orm.hibernate3.HibernateAccessor.convertHibernateAccessException(HibernateAccessor.java:410)
org.springframework.orm.hibernate3.HibernateTemplate.doExecute(HibernateTemplate.java:411)
org.springframework.orm.hibernate3.HibernateTemplate.executeWithNativeSession(HibernateTemplate.java:374)
org.springframework.orm.hibernate3.HibernateTemplate.flush(HibernateTemplate.java:881)
com.hk.impl.dao.BaseDaoImpl.resetHibernateAfterWrite(BaseDaoImpl.java:247)
and
java.sql.SQLException: 'NaN' is not a valid numeric or approximate numeric value
com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
com.mysql.jdbc.SQLError.createSQLException(SQLError.java:926)
com.mysql.jdbc.PreparedStatement.setDouble(PreparedStatement.java:3572)
org.apache.commons.dbcp.DelegatingPreparedStatement.setDouble(DelegatingPreparedStatement.java:129)
org.apache.commons.dbcp.DelegatingPreparedStatement.setDouble(DelegatingPreparedStatement.java:129)
org.apache.commons.dbcp.DelegatingPreparedStatement.setDouble(DelegatingPreparedStatement.java:129)
org.hibernate.type.DoubleType.set(DoubleType.java:60)
org.hibernate.type.NullableType.nullSafeSet(NullableType.java:154)
org.hibernate.type.NullableType.nullSafeSet(NullableType.java:131)
org.hibernate.persister.entity.AbstractEntityPersister.dehydrate(AbstractEntityPersister.java:2025)
org.hibernate.persister.entity.AbstractEntityPersister.update(AbstractEntityPersister.java:2399)
org.hibernate.persister.entity.AbstractEntityPersister.updateOrInsert(AbstractEntityPersister.java:2335)
org.hibernate.persister.entity.AbstractEntityPersister.update(AbstractEntityPersister.java:2635)
org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:115)
org.hibernate.engine.ActionQueue.execute(ActionQueue.java:279)
org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:263)
org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:168)
org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321)
org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50)
org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1027)
org.springframework.orm.hibernate3.HibernateTemplate$28.doInHibernate(HibernateTemplate.java:883)
org.springframework.orm.hibernate3.HibernateTemplate.doExecute(HibernateTemplate.java:406)
org.springframework.orm.hibernate3.HibernateTemplate.executeWithNativeSession(HibernateTemplate.java:374)
org.springframework.orm.hibernate3.HibernateTemplate.flush(HibernateTemplate.java:881)
com.hk.impl.dao.BaseDaoImpl.resetHibernateAfterWrite(BaseDaoImpl.java:247)
com.hk.impl.dao.BaseDaoImpl.save(BaseDaoImpl.java:237)
You could modify your JavaScript to use jQuery:
$('.validateWeight').click(function(event) {
    var id = event.target.id;
    $("#saveValue").val("");
    if (id == "saveAndContinue")
        $("#saveValue").val(1);
    else if (id == "saveAndClose")
        $("#saveValue").val(2);
});
You have to take into account that if id isn't "saveAndContinue" or "saveAndClose", the value of the input with ID "saveValue" will be "", which is not numeric and will produce your NaN error.
On the other hand, you should check whether on the server side you receive a String or an Integer, because you should pass an Integer to your Hibernate call.
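As a hedged server-side sketch (the helper name is hypothetical, not from the original action class), you can guard the numeric value before it is copied onto the entity so that an empty string or a non-finite number never reaches the Hibernate update:

public final class NumericParams {

    // Sketch: parse the raw request value defensively; return null for an empty value
    // and reject NaN/Infinity so they never end up in the numeric column.
    public static Double parseFiniteOrNull(String raw) {
        if (raw == null || raw.trim().isEmpty()) {
            return null;
        }
        double value = Double.parseDouble(raw.trim());
        if (Double.isNaN(value) || Double.isInfinite(value)) {
            throw new IllegalArgumentException("Not a finite number: " + raw);
        }
        return value;
    }
}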
