Query WHERE clause over array elements inside jsonb in PostgreSQL - Java

I have a jsonb column against which I have to write a PostgreSQL query.
My table schema is name (String), tagValues (jsonb). Example tagValues data is given below:
Name_TagsTable
 uid | name(String) | tagValues(jsonb)
-----+--------------+-----------------------------
  1  | myName       | { "tags": [{"key":"key1","value" : "value1"}, {"key":"key1","value" : "value2"}, {"key":"key3","value" : "value3"}, {"key":"key4","value" : "value4"}] }
I need a query that gives me the names for which
at least one of the tags in the tags list satisfies the condition
key = 'X' and value = 'Y'
Help me with the query. I am using PostgreSQL 10.0.

You can use the containment operator @>, which also works with arrays:
select *
from name_tagstable
where tagvalues -> 'tags' @> '[{"key": "x", "value": "y"}]';
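The containment test that @> performs on an array of objects can be mirrored in plain Java, e.g. to validate tag data before it reaches the database. This is a minimal sketch using only the standard library, with the tag list hard-coded for illustration:

```java
import java.util.List;
import java.util.Map;

public class TagContains {
    // True if at least one tag has the given key AND value,
    // matching the semantics of jsonb's @> on an array of objects.
    static boolean hasTag(List<Map<String, String>> tags, String key, String value) {
        return tags.stream()
                   .anyMatch(t -> key.equals(t.get("key")) && value.equals(t.get("value")));
    }

    public static void main(String[] args) {
        List<Map<String, String>> tags = List.of(
            Map.of("key", "key1", "value", "value1"),
            Map.of("key", "key1", "value", "value2"),
            Map.of("key", "key3", "value", "value3"));
        System.out.println(hasTag(tags, "key1", "value2")); // true
        System.out.println(hasTag(tags, "key3", "value1")); // false
    }
}
```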

Related

Why am I getting rows with null values when trying to create a view from a JSON file in Spark with Java?

I am reading a JSON file and creating a view in Spark with Java. When I try to display it, two extra rows appear, starting and ending with null values.
I have tried different options like multiLine true, but it's not working.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

class Something {
    public void doSomething() {
        SparkSession session = SparkSession.builder()
                .appName("jsonreader")
                .master("local[4]")
                .getOrCreate();
        Dataset<Row> jsondataset = session.read()
                .json("G:\\data\\employee.json");
        jsondataset.select("id", "name", "age").show();
    }
}
+----+-------+----+
| id| name| age|
+----+-------+----+
|null| null|null|
|1201| satish| 25|
|1202|krishna| 28|
|null| null|null|
+----+-------+----+
{
{"id" : "1201", "name" : "satish", "age" : "25"}
{"id" : "1202", "name" : "krishna", "age" : "28"}
}
This is my JSON file, and I am getting output rows with null values like above.
Can anyone help me understand why this happens?
The extra curly brackets are causing this. You will have to handle them either before reading the JSON or after reading it, i.e. through Spark. Also, the NULLs are read as strings, not actual NULLs. Below is my workaround; the filter condition uniquely identifies these faulty rows because the "null" is a string:
jsondataset = jsondataset.select("age", "id", "name").filter("age <> 'null'");
jsondataset.show();
// Result
// +---+----+-------+
// |age|id |name |
// +---+----+-------+
// |25 |1201|satish |
// |28 |1202|krishna|
// +---+----+-------+
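Alternatively, the file can be repaired before Spark reads it: drop the stray wrapping braces so each line is a standalone JSON object (the JSON Lines format Spark expects by default). A minimal sketch, assuming the braces sit alone on the first and last lines as in the example above:

```java
import java.util.List;
import java.util.stream.Collectors;

public class JsonLinesFixer {
    // Removes lines that consist solely of "{" or "}", i.e. the stray
    // braces wrapping the otherwise valid JSON Lines content.
    static List<String> stripWrappingBraces(List<String> lines) {
        return lines.stream()
                    .map(String::trim)
                    .filter(l -> !l.equals("{") && !l.equals("}"))
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = List.of(
            "{",
            "{\"id\" : \"1201\", \"name\" : \"satish\", \"age\" : \"25\"}",
            "{\"id\" : \"1202\", \"name\" : \"krishna\", \"age\" : \"28\"}",
            "}");
        stripWrappingBraces(raw).forEach(System.out::println);
    }
}
```

The cleaned lines can then be written back with java.nio.file.Files.write before calling session.read().json(...).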

Spark SQL and REGEX

In my case I use a Dataset (DataFrame) in Java Spark SQL.
This Dataset results from a JSON file. The JSON file is formed from key-value pairs. When I run a query to see a value, I write for example:
SELECT key1.name FROM table
Example JSON file:
{
  "key1":
    { "name": ".....", ....},
  "key2":
    { "name": "....", ....}
}
My question is: when I want to access all keys, I believe I should use a REGEX like
select key*.name from table
but I don't know the regex!
Please help.
I am afraid no such syntax is available in (Spark) SQL.
You may want to construct your query programmatically, though.
Something like :
String sql = Stream.of(ds.schema().fieldNames())
        .filter(name -> name.startsWith("key"))
        .collect(Collectors.joining(", ", "select ", " from table"));
System.out.println(sql);
or even
Dataset<Row> result = spark.table("table")
        .select(Stream.of(ds.schema().fieldNames())
                .filter(name -> name.startsWith("key"))
                .map(name -> ds.col(name))
                .toArray(Column[]::new));
result.show();
HTH!
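The string-building half of the answer can be exercised without a Spark session by substituting a plain array for ds.schema().fieldNames(); the field names below are made up for illustration:

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SelectBuilder {
    // Builds e.g. "select key1, key2 from table" from whichever
    // field names start with the given prefix.
    static String buildSelect(String[] fieldNames, String prefix) {
        return Stream.of(fieldNames)
                     .filter(name -> name.startsWith(prefix))
                     .collect(Collectors.joining(", ", "select ", " from table"));
    }

    public static void main(String[] args) {
        String[] fields = {"key1", "key2", "other"};
        System.out.println(buildSelect(fields, "key"));
        // prints: select key1, key2 from table
    }
}
```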

Empty JSON vs null conversions to a Java model

I'm trying to create an HTTP PUT endpoint in Java that takes a delta JSON from the front end, and I'm having some trouble figuring out how to handle nulls.
For example if we have a database model that looks like
id : 1
firstname : Luke
lastname : Xu
age : 24
fav_color : Red
And we send over a PUT request to /person/1 with a JSON of {age : 25}. Currently I have a jOOQ POJO that converts the incoming JSON into a Java model, but the problem is that it also updates my database values to null.
There's no difference between
{age : 25}
and
{id : 1,
firstname : null,
lastname : null,
age : 25,
fav_color : null}
Once it hits my Java endpoint, the Java model sets both cases to null, and there's no way to tell an explicitly passed null from a value that wasn't passed at all.
I also considered processing an input stream (of type JSON), but the problem with this is that our JSON field names would have to match the database column names exactly, which is also kind of unreasonable.
What is the standard approach for updating the database when we only want to send a "delta JSON"?
Since you're using jOOQ, I'd suggest you directly pass the JSON values to the jOOQ UpdatableRecord, which can in fact distinguish between:
null meaning not initialised (or default)
null meaning null
It does so by maintaining a changed() flag for each individual column.
For instance:
{age : 25}
... translates to this Java code:
// record.set(USER.ID, 1) I suspect this is still necessary...?
record.set(USER.AGE, 25);
record.update();
... and to this SQL statement:
UPDATE users
SET age = 25
WHERE id = 1
whereas
{id : 1,
firstname : null,
lastname : null,
age : 25,
fav_color : null}
... translates to this Java code
record.set(USER.ID, 1);
record.set(USER.FIRSTNAME, null);
record.set(USER.LASTNAME, null);
record.set(USER.AGE, 25);
record.set(USER.FAV_COLOR, null);
... and to this SQL statement
UPDATE users
SET firstname = null,
lastname = null,
age = 25,
fav_color = null
WHERE id = 1
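The absent-vs-null distinction that jOOQ's changed() flag encodes can be mirrored when parsing the incoming delta JSON into a Map: a missing key and an explicit null look the same through get(), but iterating only the keys actually present keeps them apart. A minimal sketch with a hand-built map standing in for the parsed JSON:

```java
import java.util.HashMap;
import java.util.Map;

public class DeltaJson {
    // Apply only the keys that are actually present in the delta,
    // so an omitted field is never confused with an explicit null.
    static void applyDelta(Map<String, Object> delta, Map<String, Object> record) {
        for (Map.Entry<String, Object> e : delta.entrySet()) {
            record.put(e.getKey(), e.getValue()); // value may legitimately be null
        }
    }

    public static void main(String[] args) {
        Map<String, Object> record = new HashMap<>();
        record.put("firstname", "Luke");
        record.put("age", 24);

        Map<String, Object> delta = new HashMap<>();
        delta.put("age", 25); // {"age": 25} -- firstname is absent, not null

        applyDelta(delta, record);
        System.out.println(record.get("firstname")); // Luke, untouched
        System.out.println(record.get("age"));       // 25
    }
}
```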

Cassandra Lucene Index boolean syntax

I am building a user search system on my Cassandra database. For that purpose I installed the Cassandra Lucene Index from Stratio.
I am able to look up users by username, but the problem is as follows.
This is my Cassandra users table and the Lucene Index:
CREATE TABLE user (
username text PRIMARY KEY,
email text,
password text,
is_verified boolean,
lucene text
);
CREATE CUSTOM INDEX search_main ON user (lucene) USING 'com.stratio.cassandra.lucene.Index' WITH OPTIONS = {
'refresh_seconds': '3600',
'schema': '{
fields : {
username : {type : "string"},
is_verified : {type : "boolean"}
}
}'
};
This is a normal query performed to look up a user by username:
SELECT * FROM user WHERE lucene = '{filter: {type : "wildcard", field : "username", value : "*%s*"}}' LIMIT 15;
My Question is:
How could I sort the returned results to ensure that any verified users appear among the first 15 results of the query? (The limit is 15.)
You can use this search:
SELECT * FROM user WHERE lucene = '{filter: {type:"boolean", must:[
{type : "wildcard", field : "username", value : "*%s*"},
{type : "match", field : "is_verified", value : true}
]}}' LIMIT 15;

Spring data mongodb removes a positional operator from "$unset" update query part

I have a collection of users:
> db.users.find().pretty()
{
"_id" : ObjectId("544ab933e4b099c3cfb62e12"),
"token" : "8c9f8cf4-1689-48ab-bf53-ee071a377f60",
"categories" : [
DBRef("cue_categories", ObjectId("544ab933e4b099c3cfb62e10")),
DBRef("cue_categories", ObjectId("544ab933e4b099c3cfb62e11"))
]
}
I want to find all users who have (let's say) ObjectId("544ab933e4b099c3cfb62e10") category and remove it (because this category was deleted and I don't want users to refer to it anymore).
The valid query to do it in JSON format would be:
db.users.update({
categories:{
$in:[
DBRef("cue_categories", ObjectId("544ab933e4b099c3cfb62e10"))
]
}
},
{
$unset:{
"categories.$":true
}
})
Here's a Spring mongodb query:
Query query = new Query();
query.addCriteria(Criteria.where("categories.$id").in(categoryIds));
Update update = new Update();
update.unset("categories.$");
operations.updateMulti(query, update, User.class);
In order to make an appropriate DB reference I have to provide a list of category IDs, each category ID (in categoryIds) is an instance of org.bson.types.ObjectId.
The problem is that the result query turns out to be without a positional operator:
DEBUG o.s.data.mongodb.core.MongoTemplate - Calling update using
query: { "categories.$id" : { "$in" : [ { "$oid" :
"544ab933e4b099c3cfb62e10"}]}} and update: { "$unset" : { "categories"
: 1}} in collection: users
So the update part must be { "$unset" : { "categories.$" : 1}}
P.S.
I managed to get around by falling back to the plain Java driver use
DBObject query = new BasicDBObject("categories.$id", new BasicDBObject("$in", categoryIds));
DBObject update = new BasicDBObject("$unset", new BasicDBObject("categories.$", true));
operations.getCollection("users").updateMulti(query, update);
But my question still remains open!
P.P.S.
My case is very similar to the Update Array Field Using Positional Operator ($) Does Not Work bug, and it looks like that was fixed in versions 1.4.1 and 1.5. That being said, I use spring-data-mongodb version 1.5.1, and I'm confused. Does anybody have a clue?
You cannot use the positional $ operator with $unset, as per the MongoDB documentation; instead of removing the matched element, it sets its value to null. https://docs.mongodb.com/manual/reference/operator/update/positional/
