How to create a column in Parse using code - java

I want to create a new column to store an array list in Parse, but I am unable to create the column (without using the dashboard). It needs to be created in the default "User" class. I've tried creating a Parse object in the User class, and I tried querying for the column (hoping that if it isn't found, it will be created). The column needs to be able to store an array list. I am not getting any errors in my code, so I am unsure what to do next.

My experience is with the .NET API, but I suspect the principle is the same.
Parse will not create a new column simply from a read; you must set a value in at least one instance, and save it to the DB. This will create the column. Previously existing rows will contain "Undefined" for the new column value, and will not contain a key for the column.
My practice has been to derive types for my various ParseObjects. One thing this affords is that I can wrap the check for the key in my property getters, and set a default value if it is missing.
A caveat: (I'm speaking C#-ese here, so you'll have to do a mental translation) When you derive from ParseObject, you decorate the class with a ParseClassName attribute that defines the name for the document type in your database that your class is bound to. However, Parse already has a derived type, ParseUser, and when you derive from that, you must bind to the predefined "_User" class. (This is true for "_Session" and "_Role" also.)
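In Java (the Parse Android SDK), a minimal sketch of that "set a value and save" step could look like the following; the column name "myList" and its contents are purely illustrative, and the column appears on the _User class the first time such an object is saved:
ParseUser user = ParseUser.getCurrentUser();

List<String> myList = new ArrayList<>();
myList.add("first entry");

user.put("myList", myList);            // set the key on this one instance
user.saveInBackground(new SaveCallback() {
    @Override
    public void done(ParseException e) {
        if (e == null) {
            // the "myList" column now exists on the _User class
        }
    }
});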

Related

Indexing a simple Java Record

I have a Java object, Record. It represents a single record as the result of SQL execution. Can CQEngine index a collection of Record?
My class is of the form
public class Record {
    private List<String> columnNames;
    private List<Object> values;
    // ... other getters
}
I have looked through some examples, but I had no luck there.
I want to index only specific column(s) by name with the corresponding value. Can this be achieved using CQEngine, or is there another alternative that achieves the same?
Thanks.
That seems to be a strange way to model data, but you can use CQEngine with that model if you wish.
(First off, CQEngine will have no use for your column names so you can remove that field.)
To do this, you will need to define a CQEngine virtual attribute for each of the indexes in your list of values.
Each attribute will need to be declared with the data type that will be stored in that column/index, and will need to cast the object at that index in your list of values to the appropriate data type (String, Double, Integer, etc.).
So let's say your Record has a column called 'price', which is of type Double, and is stored at index 5 in the list of values. You could define an attribute which reads it as follows:
public static final Attribute<Record, Double> PRICE =
    attribute("PRICE", record -> (Double) record.values.get(5));
If this sounds complicated, it's because that way of modelling data makes things a bit complicated :) It's usually easier to work with a data model which leverages the Java type system (which your model does not). As such, you will need to keep track of the data types etc. of each field programmatically yourself.
CQEngine itself will work fine with that model though, because at the end of the day CQEngine attributes don't need to read fields, the attributes are just functions which are programmed to fetch values.
There's a bunch of stuff not covered above. For example, can your values be null? (If so, you should use the nullable variety of attributes as discussed in the CQEngine docs.) Or, might each of your Record objects have different sets of columns? (If so, you can create attributes on-the-fly when you encounter a new column, but you should probably cache the attributes you have created somewhere.)
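As a rough usage sketch (assuming the PRICE attribute above, plus CQEngine's ConcurrentIndexedCollection, NavigableIndex and a static import of QueryFactory; someRecord is illustrative):
IndexedCollection<Record> records = new ConcurrentIndexedCollection<>();
records.addIndex(NavigableIndex.onAttribute(PRICE));

records.add(someRecord);   // add your Record instances

// Retrieve every record whose value at index 5 is greater than 100.0
ResultSet<Record> results = records.retrieve(greaterThan(PRICE, 100.0));
for (Record r : results) {
    System.out.println(r);
}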
Hope that helps,
Niall (CQEngine author)

Hibernate Custom Basic Type with Database Function

I'm making a custom Hibernate type using AbstractSingleColumnStandardBasicType to automatically convert the SQL UDT into a Java type. The SQL uses database functions to construct and serialize the column data from BLOBs and/or CLOBs, and I want to hide that from a user, so they don't have to use @ColumnTransformer(read="_serializer(column_name)", write="_constructor(?)") for every column.
I've made types using the BLOB SqlTypeDescriptor and a custom JavaTypeDescriptor to do the conversion at the JDBC level, but how can I have my custom type wrap the column and parameters in the proper SQL functions? I'm not finding anything describing anything like this.
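For reference, the custom type mentioned above is roughly of this shape (a Hibernate 5.x-style sketch; PolylineJavaTypeDescriptor stands in for my custom JavaTypeDescriptor that converts the serialized form to and from Polyline):
import org.hibernate.type.AbstractSingleColumnStandardBasicType;
import org.hibernate.type.descriptor.sql.BlobTypeDescriptor;

public class MyStLineType extends AbstractSingleColumnStandardBasicType<Polyline> {

    public MyStLineType() {
        // BLOB on the JDBC side, a custom descriptor on the Java side.
        super(BlobTypeDescriptor.DEFAULT, PolylineJavaTypeDescriptor.INSTANCE);
    }

    @Override
    public String getName() {
        return "org.example.MyStLineType";
    }
}
What this doesn't give me is a way to wrap the column and the bind parameter in the SDE functions, which is exactly the part I'm asking about.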
Edit:
To make this more concrete, this UDT (SDE ST_GEOMETRY) must be built via database functions (ST_LineFromText) or the defined UDT constructor (ST_LineString). Likewise, to read them from the database, you have to "export" them into a certain format (ST_AsText) for Java to read. The regular approach of reading the fields of the type doesn't work in this case.
So:
create table geo_lines (
    line SDE.ST_GEOMETRY
);
And:
@Entity
@Table(name = "geo_lines")
public class GeoLine {

    @Column(name = "line")
    @Type(type = "org.example.MyStLineType")
    // I want to be able to omit the following annotation
    @ColumnTransformer(read = "SDE.ST_AsText(line)",
                       write = "SDE.ST_LineFromText(?, 3857)")
    private Polyline line;
}
So, all my type really needs from the user is the number (3857 in this case), which can be passed as a parameter, and everything else is standard boilerplate for any line. Repeat for the other types like points, polygons, etc.

Updating values of functional data properties doesn't remove old values, only adds new triples

I have an RDF Ontology with a functional property hasTrendValue which relates instances of a class with integer values. I want to change these values programmatically using Jena. I tried the following code:
Property hasTrend = ontModel.getDatatypeProperty(preFix+"hasTrendValue");
Individual regionQualifier = ontModel.getIndividual(activityName);
ontModel.addLiteral(regionQualifier,hasTrend,34);
PrintStream p = new PrintStream(ontoPath);
ontModel.write(p,null);
p.close();
This code executes correctly, but it does not update the existing hasTrendValue value in the RDF; instead, it adds a new hasTrendValue triple to the ontology, even though the property is declared as functional. What is the better way of doing this?
RDF does not have the concept of "change", only "add" and "remove". To change a value, you need to remove the old one and add the new one.
Declaring it as a functional property does not change this; Jena does not check the ontology on every operation. Moreover, a functional property only says that the subject has a single value - and that value may be written in many ways (001 and 1 are the same value), so having multiple triples is not automatically wrong.
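A minimal sketch with Jena, reusing the variables from the code above: remove the existing hasTrendValue statement(s) for the individual first, then add the new one.
DatatypeProperty hasTrend = ontModel.getDatatypeProperty(preFix + "hasTrendValue");
Individual regionQualifier = ontModel.getIndividual(activityName);

// Remove every existing hasTrendValue triple for this individual...
regionQualifier.removeAll(hasTrend);

// ...then add the new value, so exactly one triple remains.
ontModel.addLiteral(regionQualifier, hasTrend, 34);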

Storing various object types in single column of database

I am working in Java. I have a class called Command. This class stores a variable-length List of parameters that are primitives (mostly int and double). The type, number, and order of the parameters are specific to each command, so the List is of type Object. I won't ever query the table based on what these parameter values are, so I figured I would concatenate them into a single String or serialize them in some way. I think this may be a better approach than normalizing the table, because I would have to join every time and that table would grow huge pretty quickly. (Edit: The Command object also stores some other members that won't be serialized, such as a String to identify the type of command and a Timestamp for when it was issued.)
So I have 2 questions:
Should I turn them into a delimited String? If so, how do I get each object as a String without knowing which type to cast it to? I attempted to loop through and use the .toString method, but that is not working; it seems to be returning null.
Or is there some way to just serialize the data of the array into a column of the DB? I read about serialization, but it seems to be described in the context of serializing whole classes.
I would use a JSON serializer and deserializer like Jackson to store and retrieve those command objects in the DB without losing the specific type information. On a side note, I would have these commands implement a common interface and store them in a list of commands, not in a list of objects.
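As a rough sketch with Jackson (the helper class name CommandParams is illustrative), the parameter list can be written to a JSON string for the column and read back into a list:
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class CommandParams {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Serialize the parameter list into a JSON string for a single DB column.
    public static String toJson(List<Object> parameters) throws Exception {
        return MAPPER.writeValueAsString(parameters);
    }

    // Read the JSON string from the column back into a parameter list.
    public static List<Object> fromJson(String json) throws Exception {
        return MAPPER.readValue(json, new TypeReference<List<Object>>() {});
    }
}
Plain numbers come back as Integer or Double, which covers the int and double parameters; if exact types are needed for arbitrary objects, Jackson's polymorphic type handling (e.g. @JsonTypeInfo on a common Command interface) is the usual route.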

java dynamic attributes table

I am developing a java application which needs a special component for dynamic attributes. The arguments are serialized (using JSON) and stored in a database and then deserialized at runtime. All attributes are displayed in a JTable with 3 columns (attribute name, attribute type and attribute value) and stored in a hashmap.
I have currently two problems to solve:
The hashmap can also store objects, and the objects can be set to null. If an object is set to null, I don't know which class it belongs to. How could I store objects even when they are null and still know which class they belong to? Do I need to wrap each object in a class that holds the class of the stored object?
The objects are deserialized from JSON at runtime. The problem with this is that there are many different types of objects, and I don't actually know all the object types that will be stored in the hashmap. So I am looking for a way to dynamically deserialize objects. Is there such a way? Would I have to store the class of the object in the serialized JSON string?
Thanks!
Take a look at the Null Object Pattern. You can use an extra class to represent a null instance of your type that still carries information about itself.
There is also something called a class token, which is the use of Class objects as keys for heterogeneous containers; take a look at Effective Java by Joshua Bloch, Item 29. I'm not sure how this approach would work for you, since you may have many instances of the same type, but I leave it as a reference.
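For the first point, a minimal sketch of that wrapper idea (the class name TypedValue is illustrative): each entry keeps its declared class next to the value, so a null value still knows its type.
public final class TypedValue<T> {

    private final Class<T> type;   // the declared class of the attribute
    private final T value;         // may be null

    public TypedValue(Class<T> type, T value) {
        this.type = type;
        this.value = value;
    }

    public Class<T> getType() {
        return type;
    }

    public T getValue() {
        return value;
    }
}
The JTable's backing hashmap can then be a Map<String, TypedValue<?>>, and the "attribute type" column can be filled from getType() even when getValue() is null.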
First of all, can you explain why you use JSON serialization for your attributes?
This method is disadvantageous in many ways, in my opinion: it can cause problems with database search and indexing, make database viewing painful, and produce unnecessary code in your application. These problems may not be an issue, depending on how you want to use your attributes.
My solution for situations like these is a simple table containing columns like:
id - int
attribute_name - varchar
And then add columns for each supported data type:
string_value - varchar
integer_value - int
date_value - date
... and any other types you want.
This design allows for very good performance using simple and typesafe ORM mapping, without any serialization or other boilerplate. It can store values of any type: you just set the column matching the attribute's type and leave all the others null. You can represent a null value by leaving all the data columns null. Indexing and searching also become a piece of cake.
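As a sketch, that table maps onto a very plain JPA entity along these lines (class and column names are illustrative):
import java.util.Date;
import javax.persistence.*;

@Entity
@Table(name = "attributes")
public class AttributeRow {

    @Id
    @GeneratedValue
    private Long id;

    @Column(name = "attribute_name")
    private String attributeName;

    // One column per supported data type; all but the matching one stay null.
    @Column(name = "string_value")
    private String stringValue;

    @Column(name = "integer_value")
    private Integer integerValue;

    @Column(name = "date_value")
    @Temporal(TemporalType.DATE)
    private Date dateValue;
}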
