Getting error in ByteBuffer column in Java class - java

I have made an entity in JPA as shown below and it is working fine; it represents the table as per the JPA specification. In this entity I have a member of type ByteBuffer so that the corresponding column in the backend table is of the Blob data type. The functionality works fine, but when I scan this class with SonarLint it raises the issue to make "configuration" transient or serializable.
But my class already implements Serializable, so please advise how to overcome this problem.
@Table(value = "abc_configurations")
public class Abc implements Serializable {

    private static final long serialVersionUID = 22528067957L;

    @PrimaryKeyColumn(name = "abcCode", type = PARTITIONED)
    private String abcCode;

    @Column
    private ByteBuffer configuration;

    // setters and getters
    // default constructor
}
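The Sonar issue comes from the rule that every field of a Serializable class must itself be transient or serializable, and java.nio.ByteBuffer is not serializable even though the owning class is. Below is a minimal sketch (not from the original post) of one common way to satisfy the rule: mark the field transient and serialize its contents manually as a byte[]. The mapping annotations are omitted here, and depending on the mapping framework the Java transient modifier may also exclude the field from persistence, so verify that before applying it.
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.ByteBuffer;

public class Abc implements Serializable {

    private static final long serialVersionUID = 22528067957L;

    private String abcCode;

    // Not serializable itself, so Sonar wants it transient; its contents are
    // written and read manually below.
    private transient ByteBuffer configuration;

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        byte[] bytes = null;
        if (configuration != null) {
            bytes = new byte[configuration.remaining()];
            configuration.duplicate().get(bytes); // copy without moving the buffer's position
        }
        out.writeObject(bytes);
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        byte[] bytes = (byte[]) in.readObject();
        this.configuration = (bytes != null) ? ByteBuffer.wrap(bytes) : null;
    }
}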

Related

How to resolve "Should not reach the end of iterator" while trying to use Query by example of Spring for Cosmos DB

I have a requirement for a Java model that has a couple of attributes, but which of them carry values is not fixed. For example, if it has 4 attributes, 3 may have values when it is passed to the controller method, or only 2 may have values and the rest will be null. To handle this I chose to use Spring's Query by Example, but I am getting
java.lang.IllegalArgumentException: "Should not reach the end of iterator"
I am trying to fetch data from an Azure Cosmos DB. Below is the code I have used:
ExampleMatcher matcher = ExampleMatcher.matching().withIgnoreNullValues();
Example<RequestModel> exampleQ = Example.of(new RequestModel(
        req.getEmp(),  // these attributes may have values or may be empty
        req.getBase(),
        req.getSeat(),
        req.getRent()
), matcher);
sampleRepo.findByEmpOrBaseOrSeatOrRent(exampleQ); // here I am getting the exception
The Repository
public interface SampleRepo extends CosmosRepository<TableA, String>, BaseContainerRepo {
}
The container
@Container(containerName = "${container-tableA}")
public class TableA extends BaseContainer {
}
Base model class
public class BaseContainer {

    @Id
    private String id;

    private Integer emp;

    @PartitionKey
    private String key;

    private String base;
    private String eqp;
}
The base container repo
public interface BaseContainerRepo {
    List<BaseContainer> findByEmpOrBaseOrSeatOrRent(Example<RequestModel> exampleQ);
}
Can anyone please let me know where I am going wrong?
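For comparison, Query by Example in Spring Data is normally wired through the QueryByExampleExecutor interface and its findAll(Example) method rather than through a derived finder that takes an Example parameter, and the probe object is built from the entity type itself (TableA here), not from a separate request model. A rough sketch under those assumptions is below; whether the Cosmos repository module exposes QueryByExampleExecutor depends on your azure-spring-data-cosmos version, and the setters on TableA are assumed.
// Repository extends QueryByExampleExecutor instead of declaring a derived
// finder that takes an Example, which Spring Data cannot resolve.
public interface SampleRepo extends CosmosRepository<TableA, String>, QueryByExampleExecutor<TableA> {
}

// Usage sketch: matchingAny() gives the "or" semantics of the original finder,
// and withIgnoreNullValues() drops the attributes that were left null.
TableA probe = new TableA();
probe.setEmp(req.getEmp());
probe.setBase(req.getBase());
ExampleMatcher matcher = ExampleMatcher.matchingAny().withIgnoreNullValues();
Iterable<TableA> results = sampleRepo.findAll(Example.of(probe, matcher));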

AWS DynamoDBMapper save method keeps throwing `DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted`

I followed everything that is outlined here - https://github.com/derjust/spring-data-dynamodb/wiki/Use-Hash-Range-keys. But still no luck.
I have a DynamoDB table with a hash key and a sort key.
Here is my entity class RecentlyPlayed.class
@DynamoDBTable(tableName = "some-table")
public class RecentlyPlayed {

    @Id
    private RecentlyPlayedId recentlyPlayedId;

    // ----- Constructor methods -----

    @DynamoDBHashKey(attributeName = "keyA")
    // Getter and setter

    @DynamoDBRangeKey(attributeName = "keyB")
    // Getter and setter
}
Here is my key class RecentlyPlayedId.class
public class RecentlyPlayedId implements Serializable {

    private static final long serialVersionUID = 1L;

    private String keyA;
    private String keyB;

    public RecentlyPlayedId(String keyA, String keyB) {
        this.keyA = keyA;
        this.keyB = keyB;
    }

    @DynamoDBHashKey
    // Getter and setter

    @DynamoDBRangeKey
    // Getter and setter
}
Here is my repository interface RecentlyPlayedRepository
@EnableScan
public interface RecentlyPlayedRepository extends CrudRepository<RecentlyPlayed, RecentlyPlayedId> {

    List<RecentlyPlayed> findAllByKeyA(@Param("keyA") String keyA);

    // Finding the entry for keyA with the highest keyB
    RecentlyPlayed findTop1ByKeyAOrderByKeyBDesc(@Param("keyA") String keyA);
}
I am trying to save an object like this
RecentlyPlayed rp = new RecentlyPlayed(...);
dynamoDBMapper.save(rp); // Throws that error
recentlyPlayedRepository.save(rp); // Also throws the same error
I am using Spring v2.0.1.RELEASE. The wiki in the original docs warns about this error and describes how to mitigate it. I did exactly what they said, but still no luck.
The link to that wiki is here - https://github.com/derjust/spring-data-dynamodb/wiki/Use-Hash-Range-keys
DynamoDB only supports primitive data types; it does not know how to convert your complex field (recentlyPlayedId) into a primitive such as a String.
To show that this is the case, you can add the @DynamoDBIgnore annotation to your recentlyPlayedId attribute like this:
@DynamoDBIgnore
private RecentlyPlayedId recentlyPlayedId;
You also need to remove the @Id annotation.
Your save function will then work, but recentlyPlayedId will not be stored in the item. If you do want to save this field, you need to use the @DynamoDBTypeConverted annotation and write a converter class. The converter class defines how to convert the complex field into a String, and how to unconvert that String back into the complex field.
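For illustration only, such a converter could look roughly like the sketch below; the class name and the "|" separator are assumptions, not part of the original answer, and the field would then be annotated with @DynamoDBTypeConverted(converter = RecentlyPlayedIdConverter.class).
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;

public class RecentlyPlayedIdConverter implements DynamoDBTypeConverter<String, RecentlyPlayedId> {

    // Flatten the composite key into a single String attribute.
    @Override
    public String convert(RecentlyPlayedId id) {
        return id.getKeyA() + "|" + id.getKeyB();
    }

    // Rebuild the composite key from the stored String.
    @Override
    public RecentlyPlayedId unconvert(String value) {
        String[] parts = value.split("\\|", 2);
        return new RecentlyPlayedId(parts[0], parts[1]);
    }
}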
Removing the getters/setters for the @Id field fixed the problem for me. This is suggested in https://github.com/derjust/spring-data-dynamodb/wiki/Use-Hash-Range-keys
"not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted"
I was getting this error when I defined a model class with a JsonNode field; after converting it to Map<String, String> it is working fine.

Java/Cassandra throwing error when making an attribute Frozen

I have a Java class that uses the DataStax Cassandra driver to write a POJO to a Cassandra table. Everything works fine until it comes to writing a class object to the Cassandra table. It throws this error:
Caused by: com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [frozen< projKeySpace.smi > <-> code.generic.common.data.MyCustomSmiObject]
So I have tried a lot of different things to make the attribute "Frozen", but nothing works and I keep getting the same error. Here is an example of the class object.
@Table(keyspace = "projkeyspace", name = "summarytable")
public class DataGroupingObject implements Serializable {

    @Column(name = "objid")
    private String objId;

    @Column(name = "timeofjob")
    private Date timeOfJob;

    @Column(name = "smiobjectinput")
    @Frozen
    // Have also tried:
    // @Frozen("frozen<projKeySpace.smi>")
    // @Frozen("frozen<smi>")
    // @Frozen("frozen<MyCustomSmiObject>")
    // And all other permutations I can think of...
    private MyCustomSmiObject myCustomSmiObject; // The problem attribute

    @Column(name = "column5")
    private String dataForColumn5;

    // Getters and setters....
}
So what am I overlooking? Digging into the DataStax documentation didn't show much beyond this, http://docs.datastax.com/en/drivers/java/2.2/com/datastax/driver/mapping/annotations/Frozen.html , which I tried.
I have also tried mapping MyCustomSmiObject to the frozen 'projkeyspace.smi', and that didn't work (of course I didn't think it would, since there isn't actually a table in Cassandra called smi; it's just a type), but here is an example of it:
@Table(keyspace = "projkeyspace", name = "smi")
public class MyCustomSmiObject implements Serializable {

    @Column(name = "idstring")
    private String idString;

    @Column(name = "valuenum")
    private Double valueNum;

    // Getters and Setters....
}
So like I said, I am at a loss. Any help would be greatly appreciated and thanks in advance!
smi is a UDT, isn't it? In that case MyCustomSmiObject should be annotated with @UDT(keyspace="projkeyspace", name="smi") instead of @Table. By doing that, the driver should detect that this is a UDT and register a custom codec for it, which will allow it to properly serialize and deserialize it.
On another note, the @Frozen annotation currently has no impact on the mapper; it is only informational at this time, until the mapper has support for schema generation.
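As a rough sketch of what that suggestion implies (the @Field names are assumptions based on the column names above, not code from the answer), the class mapped as a UDT would look something like this:
import java.io.Serializable;

import com.datastax.driver.mapping.annotations.Field;
import com.datastax.driver.mapping.annotations.UDT;

// Mapped to the user-defined type projkeyspace.smi instead of to a table.
@UDT(keyspace = "projkeyspace", name = "smi")
public class MyCustomSmiObject implements Serializable {

    @Field(name = "idstring")
    private String idString;

    @Field(name = "valuenum")
    private Double valueNum;

    // Getters and Setters....
}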

How to model a singleton object as a table through Hibernate?

I have a global config object in my project, and there can only ever be 0 or 1 instance of this class, which I want to persist in the DB. What is the best way to do this? One trick I know is to have a "constant" field mapped with a unique constraint set on it, but are there other ways, as this looks a little hacky?
Here's what I tried:
@Entity
public class DTLdapConfig implements Serializable {

    @GeneratedValue(strategy = GenerationType.TABLE)
    @Id
    private int id;

    @Column(unique = true)
    private boolean singletonGuard;

    // no public setter/getter for singletonGuard
    // other code below
}
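One other pattern sometimes used for this, shown below as a sketch (not taken from this thread), is to drop the generated key and give the entity a fixed, application-assigned primary key so every save targets the same single row.
import java.io.Serializable;

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class DTLdapConfig implements Serializable {

    private static final long serialVersionUID = 1L;

    // Constant key: the table can only ever hold the row with id = 1.
    @Id
    private int id = 1;

    // other config fields and accessors
}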

How is annotation mapping done in Java persistence?

We use annotations to map the entity class to the database table by simply specifying @Entity and others like @Id, table joins, and many more things. I do not know how these entity variables get mapped to the database table. Can anyone give a short description to help me understand?
Thanks :)
Well, the idea is to translate your objects and their connections with other objects into a relational database. These two ways of representing data (objects defined by classes, and tables in a database) are not directly compatible, and that is where a so-called Object Relational Mapper framework comes into play.
So a class like
class MyObject
{
    private String name;
    private int age;
    private String password;

    // Getters and setters
}
will translate into a database table containing a column name of type varchar, a column age of type int, and a column password of type varchar.
Annotations in Java simply add additional information (so-called metadata) to your class definitions, which can be read by other tools (e.g. the Javadoc tool) and, in the case of the Java Persistence API, is used by an ORM framework like Hibernate to obtain the extra information needed to translate your object into the database (your database table needs a primary id, and some information, such as what type of relation an object has to another, can't be determined automatically by just looking at your class definition).
Annotations are very well explained here:
http://docs.jboss.org/hibernate/stable/annotations/reference/en/html_single/
Annotations are just metadata on a class, nothing magical. You can write your own annotations. These annotations are given a retention policy of RUNTIME (which means you have access to that metadata at runtime). When you call persist etc., the persistence provider iterates through the fields (java.lang.reflect.Field) in your class and checks which annotations are present in order to build up your SQL statement. Try writing your own annotation and doing something with it. It won't seem very magical after that.
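As a rough illustration of that idea (this is not the actual code of any JPA provider, just a sketch that reads the same metadata via reflection on fields):
import java.lang.reflect.Field;

import javax.persistence.Column;
import javax.persistence.Id;

public class AnnotationScanner {

    // Print the column name each field would map to, using @Column if present
    // and falling back to the field name, and flag the @Id field.
    public static void describe(Class<?> entityClass) {
        for (Field field : entityClass.getDeclaredFields()) {
            Column column = field.getAnnotation(Column.class);
            String columnName = (column != null && !column.name().isEmpty())
                    ? column.name()
                    : field.getName();
            boolean isId = field.isAnnotationPresent(Id.class);
            System.out.println(columnName + (isId ? " (primary key)" : ""));
        }
    }
}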
In your case, the annotations map the entity class to a table name; it looks like this:
@Entity
@Table(name = "CompanyUser")
public class CompanyUserCAB implements java.io.Serializable
{
    private long companyUserID;
    private int companyID;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "companyUserID")
    public long getCompanyUserID()
    {
        return this.companyUserID;
    }

    public void setCompanyUserID(long companyUserID)
    {
        this.companyUserID = companyUserID;
    }

    @Column(name = "companyID")
    public int getCompanyID()
    {
        return this.companyID;
    }

    public void setCompanyID(int companyID)
    {
        this.companyID = companyID;
    }
}
