I have defined a data object that maps to the fields stored in a DynamoDB table. Whenever I do a load, I specify the class, e.g. dynamodbmapper.load(Item.class, hashkey). Should I be making the Item class fields final? Conceptually, should this object be immutable? I don't want any service code to modify this object.
Yes, if you don't want any service code to modify this object, then this seems like a good idea in general. It is only viable for you if the DynamoDB Java library supports creating instances via constructor args only, with no direct field access or setters - it may not (I don't know offhand).
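For reference, a minimal sketch of what an immutable Item could look like in plain Java - the field names are made up, and whether the mapper can actually populate such a class through its constructor is exactly the open question above:

    public final class Item {
        private final String hashKey;  // the table's hash key attribute
        private final String payload;  // some other stored attribute

        public Item(String hashKey, String payload) {
            this.hashKey = hashKey;
            this.payload = payload;
        }

        public String getHashKey() { return hashKey; }
        public String getPayload() { return payload; }
        // no setters, so service code cannot modify the object after construction
    }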
Related
I have seen in many Rest Assured frameworks that we create POJO classes for serialization and deserialization. But say we have to automate more than 50-70 APIs; creating POJO classes for all of them seems like tedious work, so can we deal with the JSON objects and data directly? We can get rid of getters and setters by using Lombok annotations, but we will still have to set the variables. I'm just curious what best practice we should follow.
Not sure if I understood correctly, so maybe this answer goes in totally the wrong direction.
If you have lots of classes and member variables, you could introduce a level of abstraction to streamline handling.
As an example:
instead of a class and its member variables, you could have a HashMap that stores [variable name] as key and [variable value] as value
for multiple objects of the same class, instantiate multiple HashMaps
maybe hold all those produced HashMaps in a Collection like a List
maybe even have an 'outer' HashMap that maps [class name] to [collection]
in the end it might look like this: HashMap[class name -> Collection], where the Collection contains multiple 'object' HashMaps, and each object HashMap maps its [member variable name] to the [member variable value]
Now, to the JSON part: Google's GSON already has classes in place that do exactly that (object abstraction), so you now have an easy bridge between JSON and Java.
And then you bring it all together and only write ONE serializer and ONE deserializer for ALL classes you wanna handle.
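A rough sketch of that bridge with Gson (the class name below is made up; the point is that one Map-based conversion covers every class):

    import com.google.gson.Gson;
    import com.google.gson.reflect.TypeToken;
    import java.lang.reflect.Type;
    import java.util.Map;

    public class GenericJsonBridge {
        private static final Gson GSON = new Gson();
        private static final Type MAP_TYPE = new TypeToken<Map<String, Object>>() {}.getType();

        // the ONE deserializer: any JSON object becomes a Map of [variable name] -> [variable value]
        public static Map<String, Object> fromJson(String json) {
            return GSON.fromJson(json, MAP_TYPE);
        }

        // the ONE serializer: the same call works for every such map
        public static String toJson(Map<String, Object> object) {
            return GSON.toJson(object, MAP_TYPE);
        }
    }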
In the end, you still have to put values into a POJO or some other kind of object (Map, List, JSON object...) to create a JSON payload. I don't like the idea of manipulating data directly; it's too rigid.
To make your work less tedious, some techniques can be applied:
Create inner classes if you don't want to create too many POJO classes separately.
Use the Builder pattern (IntelliJ suggestion or the @Builder annotation in Lombok) to create POJO instances more straightforwardly (compared to basic setters).
Create the POJO instance with a default value for each property, then only set the properties that matter.
Separate the creation of POJO objects into other packages, like a PersonFactory to build Person instances. It will help make tests cleaner.
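Putting the last three points together, a hypothetical Person payload with Lombok might look like this (@Builder.Default supplies the default value, and only the properties that matter are set in the test):

    import lombok.Builder;
    import lombok.Data;

    @Data
    @Builder
    public class Person {
        @Builder.Default
        private String country = "US";  // default value, only overridden when it matters
        private String name;
        private int age;
    }

    // in the test, only the relevant properties are set:
    Person person = Person.builder()
            .name("Ann")
            .age(30)
            .build();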
I have a class named Database that is defined and contains some data in it's fields.
Now, I create an object of that class as
Database d1 = new Database();
Now, I make changes to the data values of d1 such that whenever I create another object of Database, say Database d2, that object has the updated values of the data. That means we need to be able to change the class definition using an object of that class.
Can this be achievable?
I think you might want to implement a copy constructor in your java code, you can do this:
class Database {
    Field field; // some field (Field is a placeholder for whatever type you actually use)

    Database() {
    }

    Database(Database db) { // copy constructor
        this.field = db.field.clone(); // deep-copy mutable fields as needed
    }
}

Database d2 = new Database(d1);
Then d2 will have all the fields as updated by d1.
"That means, we need to be able to change the class definition using an object of that class."
Really? You don't change the values in the fields of an instance of a class by changing the class's definition. The class definition is ... the source code / compiled bytecodes.
But if you DO change the class definition, you are liable to find that the new version of the class cannot deserialize instances that were serialized using the old version of the class. (Details depend on the precise nature of the changes ...)
More generally, Java serialization is not a good option for databases:
it is slow,
it is not incremental (if you are loading / saving the entire database),
it doesn't scale (if you are loading the entire database into memory),
it doesn't support querying (unless you load the entire database into memory and implement the queries in plain Java), and
you run into lots of problems if you need to change the "schema"; i.e. the classes that you are serializing / deserializing.
But modulo those limitations, it will work; i.e. it is "achievable".
And if you are really talking about how to change the state of your in-memory Database object, or how to create a new copy from an existing one, then:
implement / use getters, setters and / or other methods in the Database class,
implement a copy constructor, factory method or clone method that copies the Database object state ... to whatever depth you need.
All very achievable (simple "Java 101" level programming), though not necessarily a good idea.
I have a class that can potentially assign values to 50-plus variables. I don't want to write getters for all of these fields. I would rather have some way that can report which fields have had a value assigned to them and what that value is.
I had originally made these private and, I know that reflection basically breaks private. Additionally, Securecoding.org states this about reflection:
In particular, reflection must not be used to provide access to classes, methods, and fields unless these items are already accessible without the use of reflection. For example, the use of reflection to access or modify fields is not allowed unless those fields are already accessible and modifiable by other means, such as through getter and setter methods.
My main concern is mucking up my code by declaring dozens of instance variables (and possibly getters). Later in this project, I will have two more large sets of instance variables that need to be declared as well. I know that I can reduce the use of getters with some clever maps and enums, but that still means parsing dozens of null values. Could anyone suggest another way?
I know of only four ways to access the fields of a class:
Directly, unless the field is private
Using a method, e.g. a getter
Using a constructor
Using reflection
Ways 1 and 4 are outside the scope of this discussion.
Constructor usage is not convenient here because of the huge number of fields.
So, methods are the remaining possibility.
It is up to you whether you want to use the bean convention or, for example, the builder pattern, but if you need this class for persistence or for serialization into XML or JSON etc., you need at least getters.
Now, if you just want to validate the instance after its creation, you can declare your own interface Validatable that declares a method validate(), and call it when your object should be ready. You do, however, have to implement and maintain this method for each class.
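A minimal sketch of that idea (the Report class and its fields are just examples):

    public interface Validatable {
        void validate(); // throws if required fields have not been assigned
    }

    class Report implements Validatable {
        private String title;
        private String author;

        @Override
        public void validate() {
            if (title == null || author == null) {
                throw new IllegalStateException("Report is missing required fields");
            }
        }
    }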
An alternative is to use one of the available validation frameworks. In this case the validation can be done using annotations. You should remember, however, that behind the scenes such frameworks use reflection.
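For example, with Bean Validation (javax.validation plus an implementation such as Hibernate Validator on the classpath; the same hypothetical Report class as above, now driven by annotations):

    import javax.validation.ConstraintViolation;
    import javax.validation.Validation;
    import javax.validation.Validator;
    import javax.validation.constraints.NotNull;
    import java.util.Set;

    class Report {
        @NotNull private String title;
        @NotNull private String author;
    }

    // after the object has been populated:
    Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
    Set<ConstraintViolation<Report>> violations = validator.validate(new Report());
    // each violation reports which field failed and why - no hand-written getters needed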
Here are some links for further reading:
http://commons.apache.org/proper/commons-validator/
http://java-source.net/open-source/validation
http://docs.oracle.com/javaee/6/tutorial/doc/gircz.html
I already have a model instance (a basic POJO). How can I populate it (by issuing a SELECT) with values using dbutils, by calling the setters that are named to match the table column names?
So BasicRowProcessor should match; I just can't find the appropriate class/method to call with the object as a parameter.
There is only one instance I want to set, not an array.
I'm not sure I understand your question. Some source code would help.
There are lots of libraries that perform ORM. See SourceForge for some ORM projects. One of them is sormula, which I created. For the simplest use of it, see the POJO zero-config example.
All you can do is
YourObject result = new BasicRowProcessor().toBean(yourResultSet, YourObject.class);
It will create the instance though. This API is not designed to allow you to modify an already existing object.
If you really need to update the existing object you might implement a YourObject.copy(YourObject obj) method and call it with the result from BasicRowProcessor.toBean but it looks quite ugly.
Another (also ugly) solution would be to subclass BeanProcessor, override the BeanProcessor.newInstance(Class) method to return your existing object, then pass that subclass instance to the BasicRowProcessor constructor.
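Roughly like this - treat it as a sketch, since the exact newInstance signature differs between DbUtils versions, and existing / resultSet are placeholders for your own objects:

    import org.apache.commons.dbutils.BasicRowProcessor;
    import org.apache.commons.dbutils.BeanProcessor;
    import java.sql.SQLException;

    final YourObject existing = theInstanceYouAlreadyHave; // the POJO you want populated

    BeanProcessor reuseExisting = new BeanProcessor() {
        @Override
        @SuppressWarnings("unchecked")
        protected <T> T newInstance(Class<T> c) throws SQLException {
            return (T) existing; // hand back the existing object instead of creating a new one
        }
    };

    // toBean now fills 'existing' through its setters that match the column names
    YourObject populated = new BasicRowProcessor(reuseExisting).toBean(resultSet, YourObject.class);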
I am using an API that gives access to a certain set of subclasses with a common interface. I use the interface throughout my code, and the instances are resolved to the proper subclass based on user needs. My problem is that I need to create a copy of one of these objects, but I don't have access to the clone() method and the API doesn't provide a copy constructor.
ie:
ObjectInterface myObject = objectFromParameter.clone(); //Not possible...
Is there a workaround in Java?
You might be able to do what you want with reflection. Alternatively, if the object supports serialization, you can serialize it to a byte array and then reconstruct a new instance from that.
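A minimal sketch of that serialization round-trip - it only works if the concrete class behind the interface implements Serializable, which you'd have to verify:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    @SuppressWarnings("unchecked")
    static <T extends Serializable> T deepCopy(T original) throws Exception {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
            out.writeObject(original);  // write the whole object graph to bytes
        }
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray()))) {
            return (T) in.readObject(); // rebuild an independent copy from those bytes
        }
    }

Cast objectFromParameter to Serializable before passing it in, then cast the result back to ObjectInterface.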