I use Vlad Mihalcea's library to map SQL arrays (PostgreSQL in my case) to JPA. Let's imagine I have an entity, e.g.:
@TypeDefs(
    {@TypeDef(name = "string-array", typeClass = StringArrayType.class)}
)
@Entity
public class Entity {

    @Type(type = "string-array")
    @Column(columnDefinition = "text[]")
    private String[] tags;
}
The appropriate SQL is:
CREATE TABLE entity (
tags text[]
);
Using QueryDSL, I'd like to fetch rows whose tags contain all the given ones. The raw SQL could be:
SELECT * FROM entity WHERE tags @> '{"someTag","anotherTag"}'::text[];
(taken from: https://www.postgresql.org/docs/9.1/static/functions-array.html)
Is it possible to do this with QueryDSL? Something like the code below?
predicate.and(entity.tags.eqAll(<whatever>));
The 1st step is to generate the proper SQL: WHERE tags @> '{"someTag","anotherTag"}'::text[];
The 2nd step is described by coladict (thanks a lot!): figure out which functions are called: @> maps to arraycontains, and the '{...}'::text[] array literal can be built with string_to_array.
The 3rd step is to call them properly. After hours of debugging I figured out that HQL doesn't treat functions as functions unless they appear inside an expression (in my case: ... = true), so the final solution looks like this:
predicate.and(
Expressions.booleanTemplate("arraycontains({0}, string_to_array({1}, ',')) = true",
entity.tags,
tagsStr)
);
where tagsStr is a String with values separated by commas.
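For example, tagsStr could be built like this (a minimal sketch; the separator must match the one passed to string_to_array):
// hedged sketch: join the tags with the same separator used by string_to_array
String tagsStr = String.join(",", "someTag", "anotherTag");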
Since you can't use custom operators, you will have to use their functional equivalents. You can look them up in the psql console with \doS+. For \doS+ @> we get several results, but this is the one you want:
                                 List of operators
   Schema   | Name | Left arg type | Right arg type | Result type |      Function       | Description
------------+------+---------------+----------------+-------------+---------------------+-------------
 pg_catalog | @>   | anyarray      | anyarray       | boolean     | arraycontains       | contains
It tells us the function used is called arraycontains, so now we look up that function to see its parameters using \df arraycontains:
                           List of functions
   Schema   |     Name      | Result data type | Argument data types |  Type
------------+---------------+------------------+---------------------+--------
 pg_catalog | arraycontains | boolean          | anyarray, anyarray  | normal
From here, we transform the target query you're aiming for into:
SELECT * FROM entity WHERE arraycontains(tags, '{"someTag","anotherTag"}'::text[]);
You should then be able to use the builder's function call to create this condition.
ParameterExpression<String[]> tags = cb.parameter(String[].class);
// "arraycontains" is the function identified above; root is the query's Root<Entity>
Expression<Boolean> tagcheck = cb.function("arraycontains", Boolean.class, root.get(Entity_.tags), tags);
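A hedged usage sketch, assuming cq is the CriteriaQuery<Entity> and em the EntityManager (both names are placeholders, not from the original):
// apply the boolean expression as the WHERE clause and bind the parameter
cq.where(cb.isTrue(tagcheck));
em.createQuery(cq)
    .setParameter(tags, new String[] {"someTag", "anotherTag"})
    .getResultList();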
Though I use a different array solution (might publish soon), I believe it should work, unless there are bugs in the underlying implementation.
An alternative method would be to compile the escaped string format of the array and pass it on as the second parameter. It's easier to print if you don't treat the double quotes as optional. In that case, you have to replace String[] with String in the ParameterExpression line above.
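For illustration, the escaped array literal could be assembled like this (a minimal sketch; tagList is a hypothetical List<String>):
// hypothetical helper: build the Postgres array literal, e.g. {"someTag","anotherTag"}
// requires java.util.stream.Collectors
String arrayLiteral = tagList.stream()
    .map(t -> "\"" + t.replace("\"", "\\\"") + "\"")
    .collect(Collectors.joining(",", "{", "}"));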
For EclipseLink I created a function
CREATE OR REPLACE FUNCTION check_array(array_val text[], string_comma character varying) RETURNS bool AS $$
BEGIN
    RETURN arraycontains(array_val, string_to_array(string_comma, ','));
END;
$$ LANGUAGE plpgsql;
As pointed out by Serhii, you can then use Expressions.booleanTemplate("FUNCTION('check_array', {0}, {1}) = true", entity.tags, tagsStr)
I am playing around with jOOQ and nesting queries. I have a JSON payload which may contain many subqueries. I want to treat these subqueries as a variable number of Common Table Expressions (i.e. CTEs in the WITH clause). Currently I have this working example, but it is static in terms of the number of CTEs it includes. How would I accomplish a variable number of CTEs within the WITH clause?
/*
+-----------------+
|sum_of_everything|
+-----------------+
| 100|
+-----------------+
*/
Supplier<Stream<Map<String, Object>>> resultsWith =
() ->
dslContext
.with("s1")
.as(query)
.select(sum(field("1", Integer.class)).as("sum_of_everything"))
.from(table(name("s1")))
.fetchStream()
.map(Record::intoMap);
Ultimately I will need to deserialize the JSON payload to ensure that the CTE reference hierarchy works properly, and I will need to see whether jOOQ supports referencing one CTE from another CTE before selecting the final result. I will need to accomplish something like this:
/*
+-----------------+
|sum_of_everything|
+-----------------+
| 100|
+-----------------+
*/
Supplier<Stream<Map<String, Object>>> resultsWith =
() ->
dslContext
.with("s1").as(query1)
.with("s2").as(query2) // should be able to reference "s1" i.e. query1
...
.with("sNMinus1").as(queryNMinus1)
.with("sN").as(queryN) // should be able to reference any upstream CTE
.select(sum(field("1", Integer.class)).as("sum_of_everything"))
.from(table(name("sN")))
.fetchStream()
.map(Record::intoMap);
You can create a CommonTableExpression instance starting out from a Name, using Name.as(Select), e.g.:
CommonTableExpression<?> s1 = name("s1").as(query1);
CommonTableExpression<?> s2 = name("s2").as(query2);
// And then (or, of course, use a Collection)
dslContext.with(s1, s2, ..., sN)
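A minimal sketch of the dynamic case, assuming the JSON payload has already been parsed into a List<Select<?>> called queries (the variable names are placeholders):
List<CommonTableExpression<?>> ctes = new ArrayList<>();
for (int i = 0; i < queries.size(); i++) {
    // each CTE may reference any previously declared one by name
    ctes.add(name("s" + (i + 1)).as(queries.get(i)));
}

Supplier<Stream<Map<String, Object>>> resultsWith =
    () ->
        dslContext
            .with(ctes) // with(Collection) accepts a variable number of CTEs
            .select(sum(field("1", Integer.class)).as("sum_of_everything"))
            .from(table(name("s" + queries.size()))) // select from the last CTE
            .fetchStream()
            .map(Record::intoMap);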
One of the functions we implemented in SAP is not working correctly.
In SAP all functions work correctly and return the right values; however, when called from Java via JCo, the client expects a structure rather than a String or an int.
When extracting the structure from the parameter, we get a structure that has two unnamed fields, each with a byte length of zero to be filled in.
Metadata:
{[],[]}
0,0
We tried different data types in SAP for the input parameter "I_REZEPT", such as int8 and char12.
private String sollwerte(JSONObject jsonin) throws JSONException, JCoException {
    String id = String.valueOf(jsonin.getInt("rezeptid"));
    JCoStructure in = input.getStructure("I_REZEPT");
    System.out.println("Fieldcount: " + in.getFieldCount());
    input.setValue("I_REZEPT", id);   // throws the ConversionException shown below
    // e.printStackTrace();           // presumably from an elided catch block
    function.execute(destination);
...
Stacktrace:
com.sap.conn.jco.ConversionException: (122) JCO_ERROR_CONVERSION: Cannot convert a value of '1' from type java.lang.String to STRUCTURE at field I_REZEPT
at com.sap.conn.jco.rt.AbstractRecord.createConversionException(AbstractRecord.java:436)
at com.sap.conn.jco.rt.AbstractRecord.createConversionException(AbstractRecord.java:430)
at com.sap.conn.jco.rt.AbstractRecord.setValue(AbstractRecord.java:2824)
at com.sap.conn.jco.rt.AbstractRecord.setValue(AbstractRecord.java:3933)
at edu.hsalbsig.intellifarm.connector.sap.IntellifarmSapFunction.sollwerte(IntellifarmSapFunction.java:226)
at edu.hsalbsig.intellifarm.connector.sap.IntellifarmSapFunction.execute(IntellifarmSapFunction.java:61)
at edu.hsalbsig.intellifarm.connector.mqtt.IntellifarmMqttClient.messageArrived(IntellifarmMqttClient.java:98)
at org.eclipse.paho.client.mqttv3.internal.CommsCallback.deliverMessage(CommsCallback.java:513)
at org.eclipse.paho.client.mqttv3.internal.CommsCallback.handleMessage(CommsCallback.java:416)
at org.eclipse.paho.client.mqttv3.internal.CommsCallback.run(CommsCallback.java:213)
at java.base/java.lang.Thread.run(Thread.java:834)
While debugging, the function from SAP looks like this:
Input:
|--------|
| PARAMETERS 'INPUT'
|--------|
|I_REZEPT|
|--------|
| |
|--------|
|I_REZEPT|
|--------|
What I expected was something like this:
Input:
|------------------|
| PARAMETERS 'INPUT'
|------------------|
|I_REZEPT |
|------------------|
|012345678901234567|
|------------------|
| |
|------------------|
Without knowing your function interface definition from the ABAP side it is difficult to help here. But if input.getStructure("I_REZEPT"); works, the import parameter I_REZEPT seems to be a structure. Therefore you cannot call input.setValue("I_REZEPT", (String)id); to set a string for it, and this is what the exception shows. I_REZEPT is an IMPORT parameter of type STRUCTURE; it is not a STRING or a CHAR type parameter. It contains other fields, at least one.
Instead, you may call in.setValue(0, id); to set the first field of this structure, or in.setValue("FIELDNAME", id); using the correct field name within the structure.
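For example, a hedged sketch (REZEPT_ID is a hypothetical field name; inspect the real ones via the structure's metadata):
JCoStructure in = input.getStructure("I_REZEPT");
// list the fields the structure actually contains
JCoMetaData meta = in.getMetaData();
for (int i = 0; i < meta.getFieldCount(); i++) {
    System.out.println(meta.getName(i) + " : " + meta.getTypeAsString(i));
}
in.setValue(0, id);              // set the first field by index, or
// in.setValue("REZEPT_ID", id); // set it by its real field name
function.execute(destination);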
Example:
| studentname | maths | computers |
|-------------|-------|-----------|
| s1          | 78    | 90        |
| s2          | 56    | 75        |
| s3          | 45    | 50        |
The above table represents data that is present in Elasticsearch.
Consider that the user enters "60 and above"; then Elasticsearch should display only s1, because that is the only student who has scored more than 60 in both subjects.
How do I solve it using Java API?
NOTE: I was able to do this for individual subjects with:
QueryBuilder query = QueryBuilders.boolQuery()
.should(
QueryBuilders.rangeQuery(maths)
.from(50)
.to(100)
)
You can have multiple .should() clauses in the bool query. So in your case:
QueryBuilder query = QueryBuilders.boolQuery()
.should(
QueryBuilders.rangeQuery(maths)
.from(61)
.to(100)
)
.should(
QueryBuilders.rangeQuery(computers)
.from(61)
.to(100)
)
Note:
RangeQueryBuilder (returned from QueryBuilders.rangeQuery()) also has the method #gte().
According to the docs: "In a boolean query with no must or filter clauses, one or more should clauses must match a document." So if you have no filter or must clauses, you might not get the desired behavior. The above answer assumes you are using some other clauses. The point is that you can combine clauses in the bool query. With the Java API this just means using the clause repeatedly.
You might want to use filter instead of should. This will lead to faster execution and caching (https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-bool-query.html)
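A minimal sketch using filter, assuming the field names are "maths" and "computers" and the threshold is 60:
// both range clauses must match, so only students above 60 in both subjects are returned
QueryBuilder query = QueryBuilders.boolQuery()
    .filter(QueryBuilders.rangeQuery("maths").gte(60))
    .filter(QueryBuilders.rangeQuery("computers").gte(60));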
I want to call PostgreSQL stored procedures in jOOQ by name, dynamically:
final Field function = function("report_" + name, Object.class, (Field[])params.toArray(new Field[params.size()]));
dsl().select(function).fetchArrays();
For example it generates:
select report_total_requests('83.84.85.3184');
Which returns:
report_total_requests
-----------------------
(3683,2111,0)
(29303,10644,1)
And in Java it is an array of "(3683,2111,0)" objects.
I want to generate:
select * from report_total_requests('83.84.85.3184')
To produce:
total | users | priority
------+-------+----------
3683 | 2111 | 0
29303 | 10644 | 1
That is, in Java, an array of arrays of objects.
Any ideas?
The way forward is to use plain SQL as follows:
// renamed to functionName to avoid shadowing the String variable name from the question
Name functionName = DSL.name("report_" + name);
QueryPart arguments = DSL.list(params);
dsl().select().from("{0}({1})", functionName, arguments).fetch();
Note, I've wrapped the function name in a DSL.name() object, to prevent SQL injection.
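If you want the result as an array of arrays of objects, as described in the question, a minimal sketch (using the functionName variable from above):
// Result.intoArrays() returns Object[][], one inner array per row
Object[][] rows = dsl()
    .select()
    .from("{0}({1})", functionName, arguments)
    .fetch()
    .intoArrays();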
I can run Cassandra CQL queries using a hexadecimal literal, but when I try to use textAsBlob it does not work.
For example the following works just fine:
cqlsh:sprich> SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (0x3631313230);
key | column1 | value
--------------+----------------+------------------------------------
0x3631313230 | 0x000330303100 | 0xf77374b5eced11e3a877005056b37d30
0x3631313230 | 0x000330303200 | 0xf7757084eced11e3a877005056b37d30
0x3631313230 | 0x000330303400 | 0xf7712ac8eced11e3a877005056b37d30
But the following two queries do not:
cqlsh:sprich> SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (textAsBlob("0x3631313230"));
Bad Request: line 1:84 missing EOF at ')'
text could not be lexed at line 1, char 15
cqlsh:sprich> SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (textAsBlob("3631313230"));
Bad Request: line 1:82 missing EOF at ')'
text could not be lexed at line 1, char 15
The following two queries do not give a syntax error, but they do not return the proper results either:
cqlsh:sprich> SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (textAsBlob('0x3631313230'));
cqlsh:sprich> SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (textAsBlob('3631313230'));
Why is textAsBlob not working? What mistake am I making in the use of textAsBlob?
Cassandra represents a blob data type as a hexadecimal number, such as 0x3631313230.
This is different than a text data type (which is enclosed in single quotation marks), such as '0x3631313230'.
For those reasons, in your statements:
textAsBlob doesn't work with double quotation marks because it expects a string in single quotation marks.
When you do use single quotation marks, since the blob 0x3631313230 is different from the text '0x3631313230', it doesn't return the correct results.
You can see it clearly in the following example. Assume:
CREATE TABLE test(b blob, t text, PRIMARY KEY(b));
Now execute the following INSERT:
INSERT INTO test(b,t) VALUES(0x3631313230, blobAsText(0x3631313230));
When you do a SELECT, something like this will be returned:
      b       |   t
--------------+-------
 0x3631313230 | 61120
As you can see, the blob 0x3631313230 is not equivalent to the text '0x3631313230' (it actually decodes to the text '61120'), and that's why you should use your first query to get the correct results:
SELECT * FROM "STOCK_CHECK_RTAM_ITEM_INDEX" WHERE key in (0x3631313230);
Here's the reference:
http://docs.datastax.com/en/cql/3.3/cql/cql_reference/cql_data_types_c.html