So, I am trying my hand at MongoDB CRUD operations using spring-data-mongodb. Below are my model classes:
@Document(collection = "alumni_students")
public class AlumniStudent {
@Id
private String id;
private String firstName;
private String lastName;
private String email;
@DBRef
private AlumniDepartment alumniDepartment;
@DBRef
private List<AlumniSubject> alumniSubjects;
... getters/setters
@Document(collection = "alumni_department")
public class AlumniDepartment {
@Id
private String id;
private String departmentName;
private String location;
... getters/setters
@Document(collection = "alumni_subjects")
public class AlumniSubject {
@Id
private String id;
private String subjectName;
private int marks;
... getters/setters
I am using a MongoRepository per collection for its operations, like below:
@Repository
public interface AlumniStudentRepository extends MongoRepository<AlumniStudent, String> { }
@Repository
public interface AlumniDepartmentRepository extends MongoRepository<AlumniDepartment, String> {}
@Repository
public interface AlumniSubjectRepository extends MongoRepository<AlumniSubject, String> {}
I have so far done fine with creating and fetching the student details. The issue I am facing is while updating the student data; specifically, updating the referenced data has me confused as hell.
Below is my update code from the service layer:
@Autowired
AlumniStudentRepository alumniStudentRepo;
@Autowired
AlumniDepartmentRepository alumniDeptRepo;
@Autowired
AlumniSubjectRepository alumniSubjRepo;
public AlumniStudent updateStudent(AlumniStudent student, String id) {
Optional<AlumniStudent> fetchedStudent = alumniStudentRepo.findById(id);
// UPDATE STUDENT DATA, WORKS FINE
if (fetchedStudent.isPresent()) {
AlumniStudent studentFromDB = fetchedStudent.get();
studentFromDB.setFirstName(student.getFirstName());
studentFromDB.setLastName(student.getLastName());
studentFromDB.setEmail(student.getEmail());
// UPDATE DEPARTMENT DATA, WORKS FINE
if (student.getAlumniDepartment() != null) {
Optional<AlumniDepartment> deptData = alumniDeptRepo.findById(studentFromDB.getAlumniDepartment().getId());
if (deptData.isPresent()) {
AlumniDepartment alumniDepartment = deptData.get();
alumniDepartment.setDepartmentName(student.getAlumniDepartment().getDepartmentName());
alumniDepartment.setLocation(student.getAlumniDepartment().getLocation());
alumniDeptRepo.save(alumniDepartment);
studentFromDB.setAlumniDepartment(alumniDepartment);
}
}
// UPDATE SUBJECTS ARRAY DATA.... HOW TO DO THIS?
if (student.getAlumniSubjects() != null && !student.getAlumniSubjects().isEmpty()) {
// Problematic area. How to perform update of arraylist here?
}
return alumniStudentRepo.save(studentFromDB);
}
}
This is the URL to hit in Postman:
localhost:8080/alumnistudents/60aa384ffbf1851f56c71bef
And this is the request body:
{
"firstName": "Babita",
"lastName": "Raman",
"email": "babita#gmail.com",
"alumniDepartment": {
"departmentName": "Android Developer",
"location": "Dubai"
},
"alumniSubjects": [
{
"subjectName": "Java",
"marks": 80
},
{
"subjectName": "Unit testing",
"marks": 60
},
{
"subjectName": "Docker",
"marks": 80
}
]
}
I tried some random code but ended up with
Cannot create a reference to an object with a NULL id.
Can someone help here with how to update the array data that is referenced as @DBRef?
Thanks in advance everyone.
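A minimal, untested sketch of one possible way to handle this, assuming the "NULL id" error means the incoming subjects have not been persisted yet: @DBRef can only reference documents that already exist, so each subject would need to be saved first (so it gets an id) and the saved instances set on the student. Names are taken from the code above:
// UPDATE SUBJECTS ARRAY DATA (sketch): persist each incoming subject first so it
// has an id, then reference the saved documents from the student.
if (student.getAlumniSubjects() != null && !student.getAlumniSubjects().isEmpty()) {
    List<AlumniSubject> savedSubjects = new ArrayList<>();
    for (AlumniSubject subject : student.getAlumniSubjects()) {
        // save() inserts subjects without an id and updates subjects that already have one
        savedSubjects.add(alumniSubjRepo.save(subject));
    }
    studentFromDB.setAlumniSubjects(savedSubjects);
}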
I have a problem with the inclusion of a nested class in an aggregation.
This is a preview of the JSON document in my collection:
{
"id": "1234",
"typeApp": "API",
"name": "name",
"versionNum": "1",
"release": {
"author": "name",
//some other data
}
}
The document java class :
@Document(collection = "myClassExamples")
public class MyClassExampleDocument {
@Id
private String id;
private String typeApp;
private String name;
private String versionNum;
private Release release;
public static class Release {
private String author;
//Other fields...
}
}
I am trying to build a query that finds the latest document per typeApp (passed as a parameter), sorting by versionNum DESC so the newest one per typeApp is kept.
I started with an easier query, a simple group by typeApp:
Aggregation aggregation = newAggregation(
Aggregation.sort(Sort.Direction.DESC, "versionNum"),
Aggregation.group("typeApp"),
project(MyClassExampleDocument.class)
);
The query returns a list of MyClassExampleDocument, with all fields with null values except for the id which is populated with the typeApp.
Do you know how to build the aggregation in order to get the entire object, as stored in my collection?
Thanks for the help !
You can use something like the following:
public List<MyClassExampleDocument> test() {
Aggregation aggregation = Aggregation.newAggregation(
sort(Sort.Direction.DESC, "versionNum"),
group("typeApp").first("$$ROOT").as("data")
replaceRoot("data")
).withOptions(AggregationOptions.builder().allowDiskUse(Boolean.TRUE).build());
return mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(MyClassExampleDocument.class), MyClassExampleDocument.class).getMappedResults();
}
Here is the equivalent aggregation pipeline:
db.collection.aggregate([
{ "$sort": { versionNum: -1 } },
{
"$group": {
"_id": "$typeApp",
"data": { "$first": "$$ROOT" }
}
},
{ "$replaceRoot": { "newRoot": "$data" } }
])
Working Mongo playground
Note: the Java code was not tested; it was written based on the working mongo script.
I have a MongoRepository query method which needs to fetch data based on conditions.
I have the following record in database.
{
'name' : 'test',
'show' : false,
'free' : true
}
And the following query, but the query doesn't return this record:
repository.findByNameNotNullAndShowIsTrueOrFreeIsTrue()
As per the conditions, name is not null and free is true, so why am I not getting the record?
I reproduced your scenario and it works here. The logged query is correct:
{
"$or": [
{
"name": {
"$ne": null
},
"show": true
},
{
"free": true
}
]
}
To enable MongoDB query logging, set the MongoTemplate logger to DEBUG:
logging.level.org.springframework.data.mongodb.core.MongoTemplate = DEBUG
Entity
@Document
@Data
@Builder
class Entity {
private String id;
private String name;
private boolean show;
private boolean free;
}
Repository
interface EntityRepository extends MongoRepository<Entity,String> {
List<Entity> findByNameIsNotNullAndShowIsTrueOrFreeIsTrue();
}
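If the derived-query precedence is a concern (And binds tighter than Or, giving (name != null AND show = true) OR (free = true)), a hedged alternative is to spell the filter out with @Query; the method name below is hypothetical:
interface EntityRepository extends MongoRepository<Entity, String> {
    // Hypothetical explicit form of the same filter
    // (@Query is org.springframework.data.mongodb.repository.Query)
    @Query("{ '$or': [ { 'name': { '$ne': null }, 'show': true }, { 'free': true } ] }")
    List<Entity> findVisibleOrFree();
}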
Test
@Test
public void testQuery() {
repository.deleteAll();
Entity entity = Entity.builder()
.free(true)
.show(false)
.name("test")
.build();
repository.save(entity);
List<Entity> entities = repository.findByNameIsNotNullAndShowIsTrueOrFreeIsTrue();
Assert.assertEquals(1, entities.size());
}
I want to write a spring-boot program to get the values of name, id, and key where abc.active is true. I have written some code:
@Repository
public interface SwitchRepoDao extends MongoRepository< SwitchRepo, String> {
public List<SwitchRepo> findByAbc_active(boolean active);
}
I have also written the document class for the interface:
@Document(collection="switchrepo")
public class SwitchRepo{
@Id
private String id;
private String type;
private List<Abc> abc;
// with getters and setters also constructors.
And Abc is the class:
public class Abc{
private String name;
private String id;
private String key;
private boolean active;
This is the code I am using to display output.
@Bean
CommandLineRunner runner(SwitchRepoDao switchRepoDao) {
return new CommandLineRunner() {
@Override
public void run(String... args) throws Exception {
Iterable<SwitchRepo> personList = switchRepoDao.findAllWithStatus(true);
System.out.println("Configuration : ");
for (SwitchRepo config : personList)
{
System.out.println(config.getAbc().toString());
}
}
};
}
Can anyone please help me with this? For any query-related questions, do comment. Thank you in advance.
Given below is the MongoDB document, from database test, collection switchrepo:
{
"_id" : "1234567890",
"type" : "xyz",
"abc" : [
{
"name" : "test",
"id" : "test1",
"key" : "secret",
"active" : true
},
{
"name" : "test2",
"id" : "test12",
"key" : "secret2",
"active" : false
}
]
}
In response, I need the output as:
{
"id" : "test1",
"key" : "secret",
"active" : true
}
because active is true in that sub-document array.
The actual result I got is "abc" : [{"name" : "test","id" : "test1","key" : "secret","active" : true},{"name" : "test2","id" : "test12","key" : "secret2","active" : false}]
You cannot use property expressions for a property when the field type is an array.
Here are two solutions, using @Query or an aggregation.
Solution 1 (Using @Query)
@Repository
public interface SwitchRepoDao extends MongoRepository< SwitchRepo, String> {
//public List<SwitchRepo> findByAbc_active(boolean active);
@Query(value = "{ 'abc.active' : ?0}", fields = "{ 'abc' : 1 }")
List<SwitchRepo> findAllWithStatus(Boolean status);
}
{ 'abc.active' : ?0 } is for filtering.
{ 'abc' : 1 } returns only that part of the document (abc).
Calling findAllWithStatus will return every SwitchRepo that has at least one Abc with active = true; you then still need to filter the inactive Abc entries out of the array yourself (for example with a Java 8 stream filter), as sketched below.
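A minimal sketch of that post-filtering step, assuming the usual getters/setters on SwitchRepo and an isActive() accessor on Abc:
// Keep only the active sub-documents of each returned SwitchRepo (untested sketch).
// Requires java.util.stream.Collectors.
List<SwitchRepo> repos = switchRepoDao.findAllWithStatus(true);
repos.forEach(repo -> repo.setAbc(
        repo.getAbc().stream()
                .filter(Abc::isActive)
                .collect(Collectors.toList())));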
Solution 2 (Using MongoDB aggregation)
Create a new DTO class:
import java.util.List;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
@Document(collection="switchrepo")
public class SwitchRepoDto {
@Id
private String id;
private String type;
private Abc abc;
// with getters and setters also constructors.
public SwitchRepoDto() {
}
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getType() {
return type;
}
public void setType(String type) {
this.type = type;
}
public Abc getAbc() {
return abc;
}
public void setAbc(Abc abc) {
this.abc = abc;
}
}
Create a custom method: add it to a custom repository implementation, or inject MongoOperations into your service layer.
@Autowired
private MongoOperations mongoOperations;
public List<SwitchRepoDto> findAllActive() {
UnwindOperation unwind = Aggregation.unwind("$abc");
MatchOperation match = Aggregation.match(Criteria.where("abc.active").is(true));
Aggregation aggregation = Aggregation.newAggregation(unwind,match);
AggregationResults<SwitchRepoDto> results = mongoOperations.aggregate(aggregation, SwitchRepoDto.class, SwitchRepoDto.class);
List<SwitchRepoDto> mappedResults = results.getMappedResults();
return mappedResults;
}
Given the following example POJO's: (Assume Getters and Setters for all properties)
class User {
String user_name;
String display_name;
}
class Message {
String title;
String question;
User user;
}
One can easily query a database (postgres in my case) and populate a list of Message classes using a BeanPropertyRowMapper where the db field matches the property in the POJO: (Assume the DB tables have corresponding fields to the POJO properties).
NamedParameterDatbase.query("SELECT * FROM message", new BeanPropertyRowMapper(Message.class));
I'm wondering - is there a convenient way to construct a single query and / or create a row mapper in such a way to also populate the properties of the inner 'user' POJO within the message.
That is, some syntactical magic where each result row in the query:
SELECT * FROM message, user WHERE user_id = message_id
would produce a list of Messages, each with its associated User populated.
Use Case:
Ultimately, the classes are passed back as a serialised object from a Spring Controller, the classes are nested so that the resulting JSON / XML has a decent structure.
At the moment, this situation is resolved by executing two queries and manually setting the user property of each message in a loop. Useable, but I imagine a more elegant way should be possible.
Update: Solution used.
Kudos to @Will Keeling for the inspiration of using a custom row mapper; my solution adds bean property maps in order to automate the field assignments.
The caveat is structuring the query so that the relevant table names are prefixed (however, there is no standard convention for this, so the query is built programmatically):
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
The custom row mapper then creates several bean maps and sets their properties based on the prefix of the column: (using meta data to get the column name).
public Object mapRow(ResultSet rs, int i) throws SQLException {
HashMap<String, BeanMap> beans_by_name = new HashMap();
beans_by_name.put("message", BeanMap.create(new Message()));
beans_by_name.put("user", BeanMap.create(new User()));
ResultSetMetaData resultSetMetaData = rs.getMetaData();
for (int colnum = 1; colnum <= resultSetMetaData.getColumnCount(); colnum++) {
String table = resultSetMetaData.getColumnName(colnum).split("\\.")[0];
String field = resultSetMetaData.getColumnName(colnum).split("\\.")[1];
BeanMap beanMap = beans_by_name.get(table);
if (rs.getObject(colnum) != null) {
beanMap.put(field, rs.getObject(colnum));
}
}
Message m = (Message) beans_by_name.get("message").getBean();
m.setUser((User)beans_by_name.get("user").getBean());
return m;
}
Again, this might seem like overkill for a two class join but the IRL use case involves multiple tables with tens of fields.
Perhaps you could pass in a custom RowMapper that could map each row of an aggregate join query (between message and user) to a Message and nested User. Something like this:
List<Message> messages = jdbcTemplate.query("SELECT * FROM message m, user u WHERE u.message_id = m.message_id", new RowMapper<Message>() {
@Override
public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
Message message = new Message();
message.setTitle(rs.getString(1));
message.setQuestion(rs.getString(2));
User user = new User();
user.setUserName(rs.getString(3));
user.setDisplayName(rs.getString(4));
message.setUser(user);
return message;
}
});
A bit late to the party, but I found this while googling the same question, and I found a different solution that may be favorable for others in the future.
Unfortunately there is not a native way to achieve the nested scenario without writing a custom RowMapper. However, I will share an easier way to make said custom RowMapper than some of the other solutions here.
Given your scenario you can do the following:
class User {
String user_name;
String display_name;
}
class Message {
String title;
String question;
User user;
}
public class MessageRowMapper implements RowMapper<Message> {
@Override
public Message mapRow(ResultSet rs, int rowNum) throws SQLException {
User user = (new BeanPropertyRowMapper<>(User.class)).mapRow(rs,rowNum);
Message message = (new BeanPropertyRowMapper<>(Message.class)).mapRow(rs,rowNum);
message.setUser(user);
return message;
}
}
The key thing to remember with BeanPropertyRowMapper is that you have to follow the naming of your columns and the properties of your class members to the letter with the following exceptions (see Spring Documentation):
column names are aliased exactly
column names with underscores will be converted into "camel" case (i.e. MY_COLUMN_WITH_UNDERSCORES == myColumnWithUnderscores)
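A hedged usage sketch, assuming the join query exposes column names that line up with the property names per the rules above (the jdbcTemplate and join condition are taken from the question):
// Untested sketch: map each joined row into a Message with its nested User.
List<Message> messages = jdbcTemplate.query(
        "SELECT title, question, user_name, display_name "
                + "FROM message, user WHERE user_id = message_id",
        new MessageRowMapper());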
Spring introduced an autoGrowNestedPaths property on the BeanWrapper interface.
As long as the SQL query aliases the column names with a . separator (as before), the row mapper will automatically target inner objects.
With this, I created a new generic row mapper as follows:
QUERY:
SELECT title AS "message.title", question AS "message.question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
ROW MAPPER:
package nested_row_mapper;
import org.springframework.beans.*;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.support.JdbcUtils;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
public class NestedRowMapper<T> implements RowMapper<T> {
private Class<T> mappedClass;
public NestedRowMapper(Class<T> mappedClass) {
this.mappedClass = mappedClass;
}
@Override
public T mapRow(ResultSet rs, int rowNum) throws SQLException {
T mappedObject = BeanUtils.instantiate(this.mappedClass);
BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(mappedObject);
bw.setAutoGrowNestedPaths(true);
ResultSetMetaData meta_data = rs.getMetaData();
int columnCount = meta_data.getColumnCount();
for (int index = 1; index <= columnCount; index++) {
try {
String column = JdbcUtils.lookupColumnName(meta_data, index);
Object value = JdbcUtils.getResultSetValue(rs, index, Class.forName(meta_data.getColumnClassName(index)));
bw.setPropertyValue(column, value);
} catch (TypeMismatchException | NotWritablePropertyException | ClassNotFoundException e) {
// Ignore
}
}
return mappedObject;
}
}
Update: 10/4/2015. I typically don't do any of this rowmapping anymore. You can accomplish selective JSON representation much more elegantly via annotations. See this gist.
I spent the better part of a full day trying to figure this out for my case of 3-layer nested objects and just finally nailed it. Here's my situation:
Accounts (i.e. users) --1tomany--> Roles --1tomany--> views (user is allowed to see)
(These POJO classes are pasted at the very bottom.)
And I wanted the controller to return an object like this:
[ {
"id" : 3,
"email" : "catchall#sdcl.org",
"password" : "sdclpass",
"org" : "Super-duper Candy Lab",
"role" : {
"id" : 2,
"name" : "ADMIN",
"views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
}
}, {
"id" : 5,
"email" : "catchall#stereolab.com",
"password" : "stereopass",
"org" : "Stereolab",
"role" : {
"id" : 1,
"name" : "USER",
"views" : [ "viewPublicReports", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "home", "viewMyOrders" ]
}
}, {
"id" : 6,
"email" : "catchall#ukmedschool.com",
"password" : "ukmedpass",
"org" : "University of Kentucky College of Medicine",
"role" : {
"id" : 2,
"name" : "ADMIN",
"views" : [ "viewPublicReports", "viewAllOrders", "viewProducts", "orderProducts", "viewOfferings", "viewMyData", "viewAllData", "home", "viewMyOrders", "manageUsers" ]
}
} ]
A key point is to realize that Spring doesn't just do all this automatically for you. If you just ask it to return an Account item without doing the work of nested objects, you'll merely get:
{
"id" : 6,
"email" : "catchall#ukmedschool.com",
"password" : "ukmedpass",
"org" : "University of Kentucky College of Medicine",
"role" : null
}
So, first, create your 3-table SQL JOIN query and make sure you're getting all the data you need. Here's mine, as it appears in my Controller:
@PreAuthorize("hasAuthority('ROLE_ADMIN')")
@RequestMapping("/accounts")
public List<Account> getAllAccounts3()
{
List<Account> accounts = jdbcTemplate.query("SELECT Account.id, Account.password, Account.org, Account.email, Account.role_for_this_account, Role.id AS roleid, Role.name AS rolename, role_views.role_id, role_views.views FROM Account JOIN Role on Account.role_for_this_account=Role.id JOIN role_views on Role.id=role_views.role_id", new AccountExtractor() {});
return accounts;
}
Note that I'm JOINing 3 tables. Now create a ResultSetExtractor class to put the nested objects together. The above examples show 2-layer nesting... this one goes a step further and does 3 levels. Note that I'm having to maintain the second-layer object in a map as well.
public class AccountExtractor implements ResultSetExtractor<List<Account>>{
@Override
public List<Account> extractData(ResultSet rs) throws SQLException, DataAccessException {
Map<Long, Account> accountmap = new HashMap<Long, Account>();
Map<Long, Role> rolemap = new HashMap<Long, Role>();
// loop through the JOINed resultset. If the account ID hasn't been seen before, create a new Account object.
// In either case, add the role to the account. Also maintain a map of Roles and add view (strings) to them when encountered.
Set<String> views = null;
while (rs.next())
{
Long id = rs.getLong("id");
Account account = accountmap.get(id);
if(account == null)
{
account = new Account();
account.setId(id);
account.setPassword(rs.getString("password"));
account.setEmail(rs.getString("email"));
account.setOrg(rs.getString("org"));
accountmap.put(id, account);
}
Long roleid = rs.getLong("roleid");
Role role = rolemap.get(roleid);
if(role == null)
{
role = new Role();
role.setId(rs.getLong("roleid"));
role.setName(rs.getString("rolename"));
views = new HashSet<String>();
rolemap.put(roleid, role);
}
else
{
views = role.getViews();
views.add(rs.getString("views"));
}
views.add(rs.getString("views"));
role.setViews(views);
account.setRole(role);
}
return new ArrayList<Account>(accountmap.values());
}
}
And this gives the desired output. POJOs below for reference. Note the @ElementCollection Set views in the Role class. This is what automatically generates the role_views table as referenced in the SQL query. Knowing that table exists, its name, and its field names are crucial to getting the SQL query right. It feels wrong to have to know that... it seems like this should be more automagic -- isn't that what Spring is for?... but I couldn't figure out a better way. You've got to do the work manually in this case, as far as I can tell.
@Entity
public class Account implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private long id;
@Column(unique=true, nullable=false)
private String email;
@Column(nullable = false)
private String password;
@Column(nullable = false)
private String org;
private String phone;
@ManyToOne(fetch = FetchType.EAGER, optional = false)
@JoinColumn(name = "roleForThisAccount") // @JoinColumn means this side is the *owner* of the relationship. In general, the "many" side should be the owner, or so I read.
private Role role;
public Account() {}
public Account(String email, String password, Role role, String org)
{
this.email = email;
this.password = password;
this.org = org;
this.role = role;
}
// getters and setters omitted
}
@Entity
public class Role implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private long id; // required
@Column(nullable = false)
@Pattern(regexp="(ADMIN|USER)")
private String name; // required
@Column
@ElementCollection(targetClass=String.class)
private Set<String> views;
@OneToMany(mappedBy="role")
private List<Account> accountsWithThisRole;
public Role() {}
// constructor with required fields
public Role(String name)
{
this.name = name;
views = new HashSet<String>();
// both USER and ADMIN
views.add("home");
views.add("viewOfferings");
views.add("viewPublicReports");
views.add("viewProducts");
views.add("orderProducts");
views.add("viewMyOrders");
views.add("viewMyData");
// ADMIN ONLY
if(name.equals("ADMIN"))
{
views.add("viewAllOrders");
views.add("viewAllData");
views.add("manageUsers");
}
}
public long getId() { return this.id;}
public void setId(long id) { this.id = id; };
public String getName() { return this.name; }
public void setName(String name) { this.name = name; }
public Set<String> getViews() { return this.views; }
public void setViews(Set<String> views) { this.views = views; };
}
I worked a lot on stuff like this and do not see an elegant way to achieve this without an OR mapper.
Any simple solution based on reflection would heavily rely on the 1:1 (or maybe N:1) relation. Further, the returned columns are not qualified by their source table, so you cannot say which column matches which class.
You may get away with spring-data and QueryDSL. I did not dig into them, but I think you need some meta-data for the query that is later used to map back the columns from your database into a proper data structure.
You may also try the new PostgreSQL json support that looks promising.
NestedRowMapper worked for me; the important part is getting the SQL correct. The Message properties shouldn't have the class name in them, so the query should look like this:
QUERY:
SELECT title AS "title", question AS "question", user_name AS "user.user_name", display_name AS "user.display_name" FROM message, user WHERE user_id = message_id
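A hedged usage sketch tying this together with the NestedRowMapper from the earlier answer (the jdbcTemplate is assumed to be configured elsewhere; untested):
// The plain aliases land on Message itself; the "user."-prefixed aliases land on the nested User.
List<Message> messages = jdbcTemplate.query(
        "SELECT title AS \"title\", question AS \"question\", "
                + "user_name AS \"user.user_name\", display_name AS \"user.display_name\" "
                + "FROM message, user WHERE user_id = message_id",
        new NestedRowMapper<>(Message.class));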