I have an entity class that has an embedded object within it:
@Entity
public class Flight implements Serializable {

    // ... other attributes

    @Embedded
    @AttributeOverrides({
        @AttributeOverride(name = "value", column = @Column(name = "FLIGHT_TIME")),
        @AttributeOverride(name = "dataState", column = @Column(name = "FLIGHT_TIME_TYPE", length = 20))
    })
    private DateDataStateValue flightDate;
}
The DateDataStateValue is as follows:
@Embeddable
public class DateDataStateValue implements DataStateValue<Date>, Serializable {

    private static final long serialVersionUID = 1L;

    @Column(name = "DATASTATE")
    @Enumerated(value = EnumType.STRING)
    private final DataState dataState;

    @Column(name = "DATAVALUE")
    @Temporal(TemporalType.TIMESTAMP)
    private final Date value;
}
When fetching Flights from the database using a CriteriaQuery and creating an Order object on the time column:
Path<Flight> propertyPath = queryRoot.get("flightDate");
Order order = isAscending() ? criteriaBuilder.asc(propertyPath) : criteriaBuilder.desc(propertyPath);
The ordering is not what I want. For instance, if the flight table has the following values:
Flight 1 | ESTIMATED | 1 Jan 2012
Flight 2 | ESTIMATED | 1 Jan 2011
Flight 3 | ACTUAL | 1 Jan 2010
Flight 4 | ESTIMATED | 1 Jan 2009
The result of an ascending sort will be:
Flight 3 | ACTUAL | 1 Jan 2010
Flight 4 | ESTIMATED | 1 Jan 2009
Flight 2 | ESTIMATED | 1 Jan 2011
Flight 1 | ESTIMATED | 1 Jan 2012
It appears that the default ordering of an @Embedded column is to sort by its columns in the order they are declared in the class, i.e. DATASTATE first, then DATAVALUE.
What I would like is that whenever the sort property is flightDate, the ordering is by the date first, then the state, i.e.:
Flight 4 | ESTIMATED | 1 Jan 2009
Flight 3 | ACTUAL | 1 Jan 2010
Flight 2 | ESTIMATED | 1 Jan 2011
Flight 1 | ESTIMATED | 1 Jan 2012
Making DateDataStateValue Comparable has no effect, and @OrderColumn/@OrderBy don't seem to be the right tools for the job. Does anyone have any ideas?
Thanks in advance.
I didn't even know you could add an order by query on an embeddable property like this. But I wouldn't rely on it, and simply add two orders to your query:
// Note: the dotted form queryRoot.get("flightDate.dataState") is not portable;
// standard JPA requires chaining get() calls for embedded attributes
Path<DataState> statePath = queryRoot.get("flightDate").get("dataState");
Path<Date> valuePath = queryRoot.get("flightDate").get("value");
Order[] orders;
if (isAscending()) {
    orders = new Order[] { criteriaBuilder.asc(valuePath), criteriaBuilder.asc(statePath) };
} else {
    orders = new Order[] { criteriaBuilder.desc(valuePath), criteriaBuilder.desc(statePath) };
}
query.orderBy(orders);
Something like "flightDate.value ASC, flightDate.dataState ASC", perhaps, since all you defined was "flightDate", which implies the natural ordering of that object.
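In JPQL that would look something like the following (a sketch; it assumes a standard EntityManager named em, which is not shown in the question):

TypedQuery<Flight> q = em.createQuery(
    "SELECT f FROM Flight f ORDER BY f.flightDate.value ASC, f.flightDate.dataState ASC",
    Flight.class); // embedded date first, then state
List<Flight> flights = q.getResultList();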
I got an unexpected result when trying to store OffsetTime entity properties into a PostgreSQL time with time zone column with Hibernate (5.6.1).
For example (if the current default zone is +02):
| OffsetTime (Java) | time with time zone (stored) |
| ----------------- | ---------------------------- |
| 00:00+01          | 00:00+02                     |
| 00:00+02          | 00:00+02                     |
| 00:00+03          | 00:00+02                     |
The original offset is lost and the default offset is stored instead.
I looked into two classes:
org.hibernate.type.descriptor.sql.TimeTypeDescriptor:

final Time time = javaTypeDescriptor.unwrap( value, Time.class, options );

org.hibernate.type.descriptor.java.OffsetTimeJavaDescriptor:

if ( java.sql.Time.class.isAssignableFrom( type ) ) {
    return (X) java.sql.Time.valueOf( offsetTime.toLocalTime() );
}
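Putting those two snippets together, the conversion is lossy. A minimal illustration of what happens (my own sketch, not Hibernate code):

OffsetTime original = OffsetTime.parse("00:00+01:00");
// Hibernate's unwrap keeps only the local-time part; the +01 offset is dropped
java.sql.Time stored = java.sql.Time.valueOf(original.toLocalTime());
// The driver then interprets the bare 00:00 in the JVM default zone (+02 in the
// table above), which is why every row comes back as 00:00+02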
I may be misunderstanding this logic, but in other answers I saw this recommendation: (LINK)
ZoneOffset zoneOffset = ZoneOffset.systemDefault().getRules()
        .getOffset(LocalDateTime.now());

Notification notification = new Notification()
        // ...
        .setClockAlarm(
            OffsetTime.of(7, 30, 0, 0, zoneOffset)
        );
So, must I convert all OffsetTime values to the default time zone so that they are stored correctly?
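If preserving the instant is what matters, one possible workaround (a sketch of an assumed approach; verify that your Hibernate version applies converters to OffsetTime attributes before relying on it) is a JPA AttributeConverter that shifts every value to the JVM's current default offset before persisting, so that nothing is lost when Hibernate later drops the offset:

import java.time.Instant;
import java.time.OffsetTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter
public class DefaultOffsetTimeConverter implements AttributeConverter<OffsetTime, OffsetTime> {

    @Override
    public OffsetTime convertToDatabaseColumn(OffsetTime attribute) {
        if (attribute == null) {
            return null;
        }
        // Same instant, re-expressed at the default offset: 00:00+01 becomes 01:00+02
        ZoneOffset defaultOffset = ZoneId.systemDefault().getRules().getOffset(Instant.now());
        return attribute.withOffsetSameInstant(defaultOffset);
    }

    @Override
    public OffsetTime convertToEntityAttribute(OffsetTime dbData) {
        return dbData; // already at the default offset when read back
    }
}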
When I group the select below, I get a type matching error. I have already tried to CAST as TIMESTAMP and to change the POJO's LocalDateTime type. Most sample code converts to Row.class; I could not find any example using a custom class.
SELECT name, MIN(price) AS minPrice, MAX(price) AS maxPrice, AVG(price) AS avarage,
       COUNT(name) AS sayi, TUMBLE_START(rowtime, INTERVAL '5' SECOND) AS zaman
FROM STOCK
GROUP BY TUMBLE(rowtime, INTERVAL '5' SECOND), name
Thrown Error:
Exception in thread "main" org.apache.flink.table.api.TableException: Result field 'zaman' does not match requested type. Requested: GenericType<java.time.LocalDateTime>; Actual: LocalDateTime
Code:
tableEnvironment.registerDataStream("STOCK", messageStream, "name, price, rowtime.rowtime");
Table result = tableEnvironment.sqlQuery(
"SELECT name, MIN(price) AS minPrice, MAX(price) AS maxPrice, AVG(price) AS avarage, COUNT(name) as sayi, TUMBLE_START(rowtime, INTERVAL '5' SECOND) AS zaman FROM STOCK GROUP BY TUMBLE(rowtime, INTERVAL '5' SECOND), name");
result.printSchema();
FlinkKafkaProducer011<String> myProducer = new FlinkKafkaProducer011<String>(kp.getProducerProperties().getProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG),
"STOCKGROUP", new SimpleStringSchema());
myProducer.setWriteTimestampToKafka(true);
DataStream<Tuple2<Boolean, StockGroup>> stream = tableEnvironment.toRetractStream(result, StockGroup.class);
stream.map(x -> x.f1.toString()).addSink(myProducer);
The StockGroup POJO:
public String name;
public Double minPrice;
public Double maxPrice;
public Double avarage;
public Long sayi;
public LocalDateTime zaman;
The printed schema:
root
|-- name: STRING
|-- minPrice: DOUBLE
|-- maxPrice: DOUBLE
|-- avarage: DOUBLE
|-- sayi: BIGINT NOT NULL
|-- zaman: TIMESTAMP(3) *ROWTIME*
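The schema shows zaman as a proper TIMESTAMP(3), while the POJO field is a LocalDateTime that Flink treats as a GenericType. One workaround I would try (an assumption on my part, not verified against this exact setup): declare the window field as java.sql.Timestamp, which the legacy planner maps natively to TIMESTAMP(3):

import java.sql.Timestamp;

// Hypothetical variant of the StockGroup POJO
public class StockGroup {
    public String name;
    public Double minPrice;
    public Double maxPrice;
    public Double avarage;
    public Long sayi;
    public Timestamp zaman; // was LocalDateTime; matches TIMESTAMP(3) *ROWTIME*
}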
Suppose my LDT (LargeMap) bin has the following values:
key1, value1
key2, value2
key3, value3
key4, value4
...
key50, value50
Now, I fetch the data I need using the following snippet:
Map<Object, Object> myFinalRecord = new HashMap<>();
// First call to the client to get the LargeMap associated with the bin
LargeMap largeMap = myDemoClient.getLargeMap(myPolicy, myKey, myLDTBinName, null);
for (String myLDTKey : myRequiredKeysFromLDTBin) {
    try {
        // Each get call results in one round trip to Aerospike
        myFinalRecord.putAll(largeMap.get(Value.get(myLDTKey)));
    } catch (Exception e) {
        log.warn("Key does not exist in LDT bin");
    }
}
The problem here is that if myRequiredKeysFromLDTBin contains, say, 20 keys, then largeMap.get(Value.get(myLDTKey)) will make 20 calls to Aerospike.
Thus, at a retrieval time of 1 ms per transaction, a single logical call to retrieve 20 ids from a record results in 20 calls to Aerospike, increasing my response time to approximately 20 ms!
So is there any way to pass a set of ids to be retrieved from an LDT bin in a single call?
There is no direct API for a multi-get. One way to do this is to call the lmap API on the server multiple times from within a single UDF.
Example 'mymap.lua':

local lmap = require('ldt/lib_lmap');

-- Return a map of key -> value for each requested key in the given lmap bin;
-- keys that do not exist in the lmap map to nil
function getmany(rec, binname, keys)
    local resultmap = map()
    local keycount = #keys
    for i = 1, keycount, 1 do
        local rc = lmap.exists(rec, binname, keys[i])
        if (rc == 1) then
            resultmap[keys[i]] = lmap.get(rec, binname, keys[i]);
        else
            resultmap[keys[i]] = nil;
        end
    end
    return resultmap;
end
Register this Lua file:
aql> register module 'mymap.lua'
OK, 1 module added.
aql> execute lmap.put('bin', 'c', 'd') on test.demo where PK='1'
+-----+
| put |
+-----+
| 0 |
+-----+
1 row in set (0.000 secs)
aql> execute lmap.put('bin', 'b', 'c') on test.demo where PK='1'
+-----+
| put |
+-----+
| 0 |
+-----+
1 row in set (0.001 secs)
aql> execute mymap.getmany('bin', 'JSON["b","a"]') on test.demo where PK='1'
+--------------------------+
| getmany |
+--------------------------+
| {"a":NIL, "b":{"b":"c"}} |
+--------------------------+
1 row in set (0.000 secs)
aql> execute mymap.getmany('bin', 'JSON["b","c"]') on test.demo where PK='1'
+--------------------------------+
| getmany |
+--------------------------------+
| {"b":{"b":"c"}, "c":{"c":"d"}} |
+--------------------------------+
1 row in set (0.000 secs)
The Java code to invoke this would be:
try {
    resultmap = myClient.execute(myPolicy, myKey, "mymap", "getmany",
            Value.get(myLDTBinName), Value.getAsList(myRequiredKeysFromLDTBin));
} catch (Exception e) {
    log.warn("One of the keys does not exist in the LDT bin");
}
The value will be set if the key exists, and NIL is returned if it does not.
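For completeness, the module can also be registered from the Java client instead of aql (a sketch; the client-side path "udf/mymap.lua" is a placeholder):

// Register mymap.lua with the cluster and wait for the registration to propagate
RegisterTask task = myClient.register(null, "udf/mymap.lua", "mymap.lua", Language.LUA);
task.waitTillComplete();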
I have a Java class that I'm trying to test with Spock. The Java class contains an inner enum:
import static java.util.Calendar.*;
import java.util.*;
public class FederalHolidays {
public enum Observance {
NEW_YEARS_DAY(JANUARY, 1),
BIRTHDAY_OF_MARTIN_LUTHER_KING_JR(JANUARY, MONDAY, 3),
WASHINGTONS_BIRTHDAY(FEBRUARY, MONDAY, 3),
MEMORIAL_DAY(MAY, MONDAY, -1),
INDEPENDENCE_DAY(JULY, 4),
LABOR_DAY(SEPTEMBER, MONDAY, 1),
COLUMBUS_DAY(OCTOBER, MONDAY, 2),
VETERANS_DAY(NOVEMBER, 11),
THANKSGIVING_DAY(NOVEMBER, THURSDAY, 4),
CHIRSTMAS_DAY(DECEMBER, 25);
private final int month;
private final int dayOfMonth;
private final int dayOfWeek;
private final int weekOfMonth;
private static final int NA = 0;
private Observance(int month, int dayOfMonth) {
this.month = month;
this.dayOfMonth = dayOfMonth;
this.dayOfWeek = NA;
this.weekOfMonth = NA;
}
private Observance(int month, int dayOfWeek, int weekOfMonth) {
this.month = month;
this.dayOfMonth = NA;
this.dayOfWeek = dayOfWeek;
this.weekOfMonth = weekOfMonth;
}
boolean isFixedDate() {
return dayOfMonth != NA;
}
}
public Date dateOf(Observance observance, int year) {
Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("EST"), Locale.ENGLISH);
cal.set(YEAR, year);
cal.set(MONTH, observance.month);
cal.clear(HOUR);
if (observance.isFixedDate()) {
cal.set(DAY_OF_MONTH, observance.dayOfMonth);
} else {
setNthDayOfWeek(cal, observance.dayOfWeek, observance.weekOfMonth);
}
adjustForWeekendsIfNecessary(cal);
return cal.getTime();
}
private void setNthDayOfWeek(Calendar cal, int dayOfWeek, int n) {
int week = 0;
int lastDay = cal.getActualMaximum(DAY_OF_MONTH);
int startDay = n > 0 ? 1 : lastDay;
int endDay = n > 0 ? lastDay : 1;
int incrementValue = n > 0 ? 1 : -1;
for (int day = startDay; day != endDay; day += incrementValue) {
cal.set(DAY_OF_MONTH, day);
if (cal.get(DAY_OF_WEEK) == dayOfWeek) {
week += incrementValue;
if (week == n) {
return;
}
}
}
}
private void adjustForWeekendsIfNecessary(Calendar cal) {
int dayOfWeek = cal.get(DAY_OF_WEEK);
cal.add(DAY_OF_MONTH, dayOfWeek == SATURDAY ? -1 : dayOfWeek == SUNDAY ? 1 : 0);
}
}
My Spock test spec looks like this:
class FederalHolidaysSpec extends Specification {
@Shared
def federalHolidays = new FederalHolidays()
def "holidays are correctly calculated"() {
expect:
federalHolidays.dateOf(observance, year).format('yyyy/MM/dd') == date
where:
observance | year | date
NEW_YEARS_DAY | 2011 | '2010/12/31'
BIRTHDAY_OF_MARTIN_LUTHER_KING_JR | 2011 | '2011/01/17'
WASHINGTONS_BIRTHDAY | 2011 | '2011/02/21'
MEMORIAL_DAY | 2011 | '2011/05/30'
INDEPENDENCE_DAY | 2011 | '2011/07/04'
LABOR_DAY | 2011 | '2011/09/05'
COLUMBUS_DAY | 2011 | '2011/10/10'
VETERANS_DAY | 2011 | '2011/11/11'
THANKSGIVING_DAY | 2011 | '2011/11/24'
CHIRSTMAS_DAY | 2011 | '2011/12/26'
}
}
When I run Spock, I get 10 test errors, one for each row of the table. Every error is identical:
groovy.lang.MissingPropertyException: No such property: Observance for class: bdkosher.FederalHolidaysSpec
at bdkosher.FederalHolidaysSpec.$spock_initializeFields(FederalHolidaysSpec.groovy)
(I'm using Spock 0.7-groovy-2.0, groovy-all 2.2.2 (non-indy), and Java 1.7.0_45.)
Why is Spock looking for a property named Observance on my FederalHolidaysSpec class?
I initially suspected the issue was related to inner enums/static imports, although changing my Spock test to use fully qualified enum values (e.g. bdkosher.FederalHolidays.Observance.CHRISTMAS_DAY) did not make a difference.
EDIT: The Java file implementation has been corrected so that it passes the test; typos in the test file have been corrected.
Make sure you have the correct packages for both the source code and the test case. The below works for me when both the Java class and the test class are in the same package.
package bdkosher
import spock.lang.Shared
import spock.lang.Specification
// import bdkosher.FederalHolidays.Observance // not needed
import static bdkosher.FederalHolidays.Observance.*
class FederalHolidaysSpec extends Specification { .. }
I also tested with a non-default package name (com.example) and in a Grails app, although the Spock version was the same.
After hearing of @elems' and @dmahaptro's successes, I started ripping things out of my project structure and reducing the dependencies in my POM.
After dropping the unused src/test/java and src/main/groovy directories entirely, I noticed the FederalHolidaysSpec.groovy source file in my IDE reverted to an older version (I'm not sure whether to blame gmavenplus-plugin or NetBeans for this; I suspect the latter, since I was consistently testing with mvn clean test). In any case, this older version contained an additional field:
class FederalHolidaysSpec extends Specification {
def data = [(Observance.NEW_YEARS_DAY) : ['2011/12/31']]
This field was causing the test errors. Even with an import of bdkosher.FederalHolidays.Observance, Groovy apparently believes the map key to be a property reference rather than an enum reference?
In any case, now I can fix the typos and legitimate test failures. Thanks for the help.
I have a Grails (2.2.1) project and want to create a simple "model" class with Hibernate:
@Entity
@Table(name="User")
class User {

    static mapping = {
        datasource 'system'
    }

    @Id
    @GeneratedValue
    @Column(name="id")
    private Long userId;

    @Column(name="email", nullable=false)
    private String email;

    public User() {
    }

    @Transient
    public Long getUserId() {
        return this.userId;
    }

    @Transient
    public String getEmail() {
        return this.email;
    }
}
but I get the following error:
Caused by MappingException: Could not determine type for: org.springframework.validation.Errors, at table: user, for columns: [org.hibernate.mapping.Column(errors)]
->> 334 | innerRun in java.util.concurrent.FutureTask$Sync
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| 166 | run in java.util.concurrent.FutureTask
| 1145 | runWorker in java.util.concurrent.ThreadPoolExecutor
| 615 | run in java.util.concurrent.ThreadPoolExecutor$Worker
^ 722 | run . . . in java.lang.Thread
When I remove all Hibernate annotations, the app is deployed on the server, but the User table has only two attributes (id, version).
Thank you for your help!
Move your domain class to grails-app/domain and change it to this:
class User {
String email
static mapping = {
datasource 'system'
}
static constraints = {
email nullable: false
}
}
11 lines vs. 31 lines. Groovy.