I'm using the ninja framework, which utilizes JPA to access a database.
I've managed to set up a connection and get it running in an example controller class.
I'd like to model a "UserManager" which, upon initialization, loads all current users from the database into a Java map.
When doing so, I get a java.lang.NullPointerException upon calling entitiyManagerProvider.get(), since entitiyManagerProvider is null.
I'm not sure what is causing this problem or how to solve it, as the "UserManager" has the same annotations as my (problem-free) test controller. Since I don't have any experience with Ninja or JPA, it might be a very simple fix I'm overlooking, and I'd appreciate any help.
This is the code for "UserManager.java":
package model;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.persistence.EntityManager;
import javax.persistence.Query;
import com.google.inject.Inject;
import com.google.inject.Provider;
import dto.UserDTO;
import ninja.jpa.UnitOfWork;
public class UserManager {

    @Inject
    Provider<EntityManager> entitiyManagerProvider;

    private static UserManager instance;

    private Map<Integer, UserDTO> users = new HashMap<Integer, UserDTO>();

    // UserManager is a Singleton
    public static synchronized UserManager getInstance() {
        if (UserManager.instance == null) {
            UserManager.instance = new UserManager();
        }
        return UserManager.instance;
    }

    private UserManager() {
        // load all existing users to map
        reloadUsersFromDb();
    }

    public int getAmountUsers() {
        return users.values().size();
    }

    /**
     * reloads ALL users in the map from the db
     */
    @UnitOfWork
    private void reloadUsersFromDb() {
        if (entitiyManagerProvider == null) {
            System.out.println("provider is null"); // this is printed
        } else {
            System.out.println("provider is NOT null"); // not printed
        }
        EntityManager entityManager = entitiyManagerProvider.get();
        // ^-- causes NullPointerException
        Query q = entityManager.createQuery("SELECT users FROM users");
        List<UserDTO> dbUsers = q.getResultList();
        int loadedUsers = 0;
        users.clear();
        for (UserDTO dbUser : dbUsers) {
            users.put(dbUser.getId(), dbUser);
            loadedUsers++;
        }
        System.out.println("loaded " + loadedUsers + " users from db to application.");
    }
}
The problem is that injection never works with objects created directly with the new keyword, and that is exactly what you are doing in the getInstance() method. You also should not depend on injected values in the constructor, as they are injected only after the object is created.
For injection to work, the instance of UserManager must be created by the framework. Controllers are created automatically by the framework, which is why injection works there.
To fix your code, you could turn UserManager into a service, remove the call to reloadUsersFromDb from the constructor, and mark that method to run at startup with @Start, as described in the Ninja framework documentation.
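A rough sketch of that approach might look like the following (untested; it assumes UserDTO is mapped as a JPA entity and that Ninja's lifecycle support, ninja.lifecycle.Start, is available; adjust the JPQL and the @Start order to your setup):

package model;

import java.util.HashMap;
import java.util.List;
import java.util.Map;

import javax.persistence.EntityManager;

import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.Singleton;

import dto.UserDTO;
import ninja.jpa.UnitOfWork;
import ninja.lifecycle.Start;

@Singleton
public class UserManager {

    @Inject
    Provider<EntityManager> entityManagerProvider;

    private final Map<Integer, UserDTO> users = new HashMap<Integer, UserDTO>();

    /**
     * Runs once at startup, after Guice has created the instance and performed
     * injection, so the provider is no longer null at this point.
     */
    @Start(order = 90)
    @UnitOfWork
    public void reloadUsersFromDb() {
        EntityManager entityManager = entityManagerProvider.get();
        List<UserDTO> dbUsers = entityManager
                .createQuery("SELECT u FROM UserDTO u", UserDTO.class)
                .getResultList();
        users.clear();
        for (UserDTO dbUser : dbUsers) {
            users.put(dbUser.getId(), dbUser);
        }
        System.out.println("loaded " + users.size() + " users from db.");
    }

    public int getAmountUsers() {
        return users.size();
    }
}

Controllers (and other Guice-managed classes) would then simply @Inject UserManager instead of calling getInstance().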
I have written an aspect around a service class. In the aspect, I do some work in the "before" section that I would like to be rolled back if an exception occurs in the enclosed service method.
The service class is as follows:
@Service
@Transactional
class ServiceA {
    ...
    public void doSomething() {
        ...
    }
    ...
}
The aspect is as follows:
@Aspect
@Order(2)
public class TcStateManagementAspect {
    ...
    @Around(value = "applicationServicePointcut()", argNames = "joinPoint")
    public Object process(ProceedingJoinPoint joinPoint)
            throws Throwable {
        ...
        /* Before section */
        // do some processing and persist in DB
        ...
        Object object = joinPoint.proceed();
        ...
        // do some post-processing
    }
}
I am seeing that an exception in the service method does not roll back the DB operation done in the before section. I tried putting @Transactional on the @Around advice, but it did not help.
In this context, I have gone through the following posts:
Spring @Transactional in an Aspect (AOP)
Custom Spring AOP Around + @Transactional
But I am not able to get a concrete idea of how to achieve this from them. Could anyone please help? Thanks.
Like I said in my comment, what your around advice does must be declared transactional too. You cannot do that directly, because @Transactional internally uses Spring AOP via dynamic proxies, and Spring AOP aspects cannot be the target of other aspects. But you can simply create a new helper @Component to which you delegate your advice's action.
Let us assume that the goal is to log the arguments of the @Transactional method targeted by your aspect. Then simply do this:
package com.example.managingtransactions;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class TxLogAspect {

    private final static Logger logger = LoggerFactory.getLogger(TxLogAspect.class);

    @Autowired
    TxLogService txLogService;

    @Pointcut(
        "@annotation(org.springframework.transaction.annotation.Transactional) && " +
        "!within(com.example.managingtransactions.TxLogService)"
    )
    public void applicationServicePointcut() {}

    @Around("applicationServicePointcut()")
    public Object process(ProceedingJoinPoint joinPoint) throws Throwable {
        logger.info(joinPoint.toString());
        // Delegate to helper component in order to be able to use @Transactional
        return txLogService.logToDB(joinPoint);
    }
}
package com.example.managingtransactions;

import org.aspectj.lang.ProceedingJoinPoint;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;

import java.util.Arrays;
import java.util.List;

/**
 * Helper component to delegate aspect advice execution to in order to make the
 * advice transactional.
 * <p>
 * Aspect methods themselves cannot be @Transactional, because Spring AOP aspects
 * cannot be targeted by other aspects. Delegation is a simple and elegant
 * workaround.
 */
@Component
public class TxLogService {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Transactional
    public Object logToDB(ProceedingJoinPoint joinPoint) throws Throwable {
        jdbcTemplate.update(
            "insert into TX_LOG(MESSAGE) values (?)",
            Arrays.deepToString(joinPoint.getArgs())
        );
        return joinPoint.proceed();
    }

    public List<String> findAllTxLogs() {
        return jdbcTemplate.query(
            "select MESSAGE from TX_LOG",
            (rs, rowNum) -> rs.getString("MESSAGE")
        );
    }
}
See? We are passing the join point instance through to the helper component's own @Transactional method, which means that the transaction is started when entering that method and committed or rolled back depending on the result of joinPoint.proceed(). I.e. whatever the aspect helper writes to the DB will also be rolled back if something goes wrong in the aspect's target method.
BTW, because I had never used Spring transactions before, I simply took the example from https://spring.io/guides/gs/managing-transactions/ and added the two classes above. Beforehand, I also added this to schema.sql:
create table TX_LOG(ID serial, MESSAGE varchar(255) NOT NULL);
Next, I made sure that TxLogService is injected into AppRunner:
private final BookingService bookingService;
private final TxLogService txLogService;

public AppRunner(BookingService bookingService, TxLogService txLogger) {
    this.bookingService = bookingService;
    this.txLogService = txLogger;
}
If you then add these two statements at the end of AppRunner.run(String...)
logger.info("BOOKINGS: " + bookingService.findAllBookings().toString());
logger.info("TX_LOGS: " + txLogService.findAllTxLogs().toString());
you should see something like this at the end of the console log:
c.e.managingtransactions.AppRunner : BOOKINGS: [Alice, Bob, Carol]
c.e.managingtransactions.AppRunner : TX_LOGS: [[[Alice, Bob, Carol]]]
I.e. you can see that a log message was written to the DB only for the successful booking transaction, not for the two failed ones.
I have the following class as a @RequestScope bean:
@RequestScope
class RequestContext {

    private String requestId;
    private String traceId;
    private String authorisedId;
    private String routeName;
    // few more fields

    @Inject
    RequestContext(SecurityService securityService) {
        this.requestId = UUID.randomUUID().toString();
        if (securityService.getAuthentication().isPresent()) {
            this.authorisedId = securityService
                    .getAuthentication().get().getUserId().toString();
        }
    }

    /* to be updated in controller method interceptors */
    public void updateRouteName(String name) {
        this.routeName = name;
    }
}
The idea is to have an object containing the request-level custom data of a REST call, accessible across the application; its scope should obviously be limited to the current request. This can be used for, say, logging: whenever devs log anything from the application, some of the request metadata goes with it.
I am not clear on what a @RequestScope bean really is:
From its definition, my assumption is that it is created for every new HTTP request and the same instance is shared for the life of that request.
When is it constructed by Micronaut? Is it immutable?
Across multiple requests I can see the same requestId (I expect a new UUID for every request).
Is this the right use-case for a @RequestScope bean?
I was running into an issue regarding @RequestScope so I'll post an answer here for others.
I was trying to inject a @RequestScope bean into an HTTP filter, set a value in the bean, and then read it later from another bean. For example:
@RequestScope
class RequestScopeBean() {
    var id: Int? = null
}

@Filter
class SetRequestScopeBeanHere(
    private val requestScopeBean: Provider<RequestScopeBean>
) {
    override fun doFilterOnce(request: HttpRequest<*>, chain: ServerFilterChain): Publisher<MutableHttpResponse<*>> {
        requestScopeBean.get().id = // id from Http Request
    }
}

@Singleton
class GetRequestScopeBeanHere(
    private val requestScopeBean: Provider<RequestScopeBean>
) {
    fun getIdFromRequestScopeBean() {
        println(requestScopeBean.get().id)
    }
}
In this example, before any controller is executed my filter (SetRequestScopeBeanHere) is called, and it sets requestScopeBean.id. The key is that the request scope bean must be wrapped in a javax.inject.Provider, otherwise setting the field won't work.
Down the line, when GetRequestScopeBeanHere::getIdFromRequestScopeBean is called, it has access to the requestScopeBean.id set earlier.
This is intentional by Micronaut:
https://github.com/micronaut-projects/micronaut-core/issues/1615
When is it constructed by Micronaut?
A @RequestScope bean is created during request processing, the first time the bean is needed.
Is it immutable?
It could be. You get to decide whether the bean is mutable or not when you write the class. As written in your example, RequestContext is mutable. If you removed the updateRouteName method, that bean would be immutable.
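For illustration, a minimal sketch of such an immutable variant (field and class names borrowed from the question; it assumes Micronaut Security's SecurityService is available, and routeName is left out because it only becomes known later in the request):

import java.util.UUID;

import javax.inject.Inject;

import io.micronaut.runtime.http.scope.RequestScope;
import io.micronaut.security.utils.SecurityService;

@RequestScope
public class RequestContext {

    private final String requestId;
    private final String authorisedId;

    @Inject
    RequestContext(SecurityService securityService) {
        // All state is fixed at construction time and no setters are exposed.
        this.requestId = UUID.randomUUID().toString();
        this.authorisedId = securityService.getAuthentication()
                .map(authentication -> authentication.getName())
                .orElse(null);
    }

    public String getRequestId() {
        return requestId;
    }

    public String getAuthorisedId() {
        return authorisedId;
    }
}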
Is it the right use-case for a @RequestScope bean?
I don't think so, but that is really an opinion-based question.
EDIT: Based On Comments Added Below
See the project at https://github.com/jeffbrown/rscope.
https://github.com/jeffbrown/rscope/blob/2935a4c1fc60f350198d7d3c1dbf9a7eedd333b3/src/main/java/rscope/DemoController.java
package rscope;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

@Controller("/")
public class DemoController {

    private final DemoBean demoBean;

    public DemoController(DemoBean demoBean) {
        this.demoBean = demoBean;
    }

    @Get("/doit")
    public String doit() {
        return String.format("Bean identity: %d", demoBean.getBeanIdentity());
    }
}
https://github.com/jeffbrown/rscope/blob/2935a4c1fc60f350198d7d3c1dbf9a7eedd333b3/src/main/java/rscope/DemoBean.java
package rscope;

import io.micronaut.runtime.http.scope.RequestScope;

@RequestScope
public class DemoBean {

    public DemoBean() {
    }

    public int getBeanIdentity() {
        return System.identityHashCode(this);
    }
}
https://github.com/jeffbrown/rscope/blob/2935a4c1fc60f350198d7d3c1dbf9a7eedd333b3/src/test/java/rscope/DemoControllerTest.java
package rscope;

import io.micronaut.http.client.RxHttpClient;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.test.annotation.MicronautTest;
import org.junit.jupiter.api.Test;

import javax.inject.Inject;

import static org.junit.jupiter.api.Assertions.assertNotEquals;
import static org.junit.jupiter.api.Assertions.assertTrue;

@MicronautTest
public class DemoControllerTest {

    @Inject
    @Client("/")
    RxHttpClient client;

    @Test
    public void testIndex() throws Exception {
        // these will contain the identity of the DemoBean used to handle these requests
        String firstResponse = client.toBlocking().retrieve("/doit");
        String secondResponse = client.toBlocking().retrieve("/doit");

        assertTrue(firstResponse.matches("^Bean identity: \\d*$"));
        assertTrue(secondResponse.matches("^Bean identity: \\d*$"));

        // if you modify DemoBean to be @Singleton instead of
        // @RequestScope, this will fail because the same instance
        // will be used for both requests
        assertNotEquals(firstResponse, secondResponse);
    }
}
I am having an issue with mocking a JDBC call using MockitoJUnitRunner.
Somehow Mockito is not mocking the actual call, even though I have the stubbing line below in the test class.
when(readOnlyJdbcTemplate.query(anyString(), any(Object[].class), any(int[].class), any(FeatureCollectionResponseExtractor.class))).thenReturn(actual);
Very similar mocking works in another class for a very similar type of method. The only difference between them is that the other class's query takes 3 parameters instead of 4. Below is the code that mocks successfully for the other class.
when(readOnlyJdbcTemplate.query(anyString(), any(Object[].class), any(FeaturesResultExtractor.class))).thenReturn(actual);
Below is my actual code.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.jdbc.core.JdbcTemplate;

import javax.inject.Inject;
import javax.inject.Named;
import java.net.HttpURLConnection;
import java.sql.Types;

import static com.accounts.features.utils.Constants.INTERNAL_SERVER_ERROR;

@Profile
@Log
@Named("featureLibraryDao")
public class FeatureLibraryDaoImpl implements FeatureLibraryDao {

    private static final Logger LOGGER = LogManager.getLogger(FeatureLibraryDaoImpl.class);

    @Value("${feature.library.function.sql.query}")
    private String sqlSelectQuery;

    @Inject
    @Named("readOnlyJdbcTemplate")
    private JdbcTemplate readOnlyJdbcTemplate;

    @Override
    public FeatureCollectionDTO getFeaturesData(FeatureRequest request) {
        try {
            int[] argTypes = new int[] { Types.BIGINT, Types.VARCHAR, Types.SMALLINT };
            return readOnlyJdbcTemplate.query(sqlSelectQuery,
                    new Object[] {
                        Long.parseLong(request.getAccountId()),
                        request.getRequestedFeatures(),
                        request.getApplicationSuffix()
                    },
                    argTypes,
                    new FeatureCollectionResponseExtractor(request));
        } catch (CustomException cbe) {
            throw cbe;
        } catch (Exception ex) {
            LOGGER.error("getFeaturesData method failed with error message:{}", ex.getMessage(), ex);
            CustomErrorCode error = new CustomErrorCode(INTERNAL_SERVER_ERROR);
            error.setDeveloperText(ex.getMessage());
            throw new CustomSystemException(error, HttpURLConnection.HTTP_INTERNAL_ERROR);
        }
    }
}
and below is my test class.
@RunWith(MockitoJUnitRunner.class)
public class FeatureLibraryDaoImplTest {

    @InjectMocks
    private FeatureLibraryDaoImpl dao;

    @Mock
    private JdbcTemplate readOnlyJdbcTemplate;

    private List<String> features = Arrays.asList("excl_clsd_ind_only", "excl_chrgoff_ind_only", "excl_dsput_ind_only");

    @Test
    public void getFeaturesDataWhenSuccess() {
        // given
        FeatureRequest request = getFeatureRequest();
        FeatureCollectionDTO actual = new FeatureCollectionDTO(features);
        when(readOnlyJdbcTemplate.query(anyString(), any(Object[].class), any(int[].class), any(FeatureCollectionResponseExtractor.class))).thenReturn(actual);

        // when
        FeatureCollectionDTO dto = dao.getFeaturesData(request);

        // then
        assertThat(dto, notNullValue());
    }
}
Any suggestions about what is wrong here? Is there an issue with any(int[].class)?
I see that you are not setting the SQL query value (sqlSelectQuery) during the test case, but in the stub you specified anyString(), which requires some value and does not match null. Since you are using a Spring project, you can use ReflectionTestUtils to set the field value on the object:
@Before
public void setUp() {
    ReflectionTestUtils.setField(dao, "sqlSelectQuery", "query");
}
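As a side note, here is a minimal, self-contained sketch (a hypothetical mock of java.util.List, not your DAO) of why the stub is skipped when the argument is null: with Mockito 2+, anyString() does not match null, so the unstubbed default is returned instead.

import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.List;

public class AnyStringNullDemo {

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        List<String> list = mock(List.class);
        when(list.add(anyString())).thenReturn(true);

        System.out.println(list.add("hello")); // true  -> the anyString() stub matched
        System.out.println(list.add(null));    // false -> anyString() did not match null,
                                               //          so Mockito returned the default value
    }
}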
Hey guys, thanks so much for all your suggestions. It turns out the test code is perfectly fine; somehow the @Value annotation wasn't injecting the actual value into sqlSelectQuery in the main code file:
@Value("${feature.library.function.sql.query}") private String sqlSelectQuery;
Instead of that I changed the code to private String sqlSelectQuery = "${feature.library.function.sql.query}" and all test cases are passing.
Somehow sqlSelectQuery wasn't getting the value, and hence Mockito wasn't mocking the actual method call. I am still reviewing why @Value is not working as it should.
I am trying to implement Spring Security ACL in my application. I have many classes that I want to use an ACL on.
I read in the documentation that AOP has been used with success before. Does this mean that all the services should share a common interface for doing CRUD against the objects, for maximum reuse of the advice?
Or is it normal to manually insert, delete, ... the ACL entries in the save, update, and delete methods of the service?
I can't manage to find many examples of how people use the framework.
---- Listener for Entity removal (includes cascading deletes) -----
package com.acme.model.aspects;

import javax.annotation.PostConstruct;
import javax.persistence.PreRemove;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;
import org.springframework.util.Assert;
import org.springframework.data.domain.Persistable;

import com.acme.PermissionService;

@Component
public class ObjectIdentityListener {

    private static final Logger LOG = LoggerFactory.getLogger(ObjectIdentityListener.class);

    private static PermissionService permissionService;

    @Autowired(required = true)
    @Qualifier("permissionService")
    public void setSearchService(PermissionService _permissionService) {
        permissionService = _permissionService;
    }

    @PreRemove
    public void preRemove(Object object) {
        if (object instanceof Persistable) {
            Persistable<?> persistable = (Persistable<?>) object;
            LOG.info("Deleting object identity for class {} id {}", persistable.getClass(), persistable.getId());
            permissionService.deleteObjectIdentity(persistable);
        }
    }

    @PostConstruct
    public void init() {
        Assert.notNull(permissionService, "'permissionService' is required");
    }
}
---- Delete method for permissionService ----
public void deleteObjectIdentity(Persistable persistable) {
    try {
        MutableAcl acl = (MutableAcl) mutableAclService.readAclById(identity(persistable));
        mutableAclService.deleteAcl(acl.getObjectIdentity(), true);
    } catch (NotFoundException e) {
        LOG.info("Could not find ACL for target {}", persistable);
    }
}
It all depends on your app. Having a centralized hierarchy of services would certainly make it simpler to implement single security checks for create/retrieve/update/delete methods. But if you have an existing app with different services that don't necessarily share a common parent implementation, then you'd have to add an ACL security annotation to each service method.
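For example, a per-method check could look roughly like this (DocumentService and Document are hypothetical names, and it assumes an AclPermissionEvaluator is registered with the method-security expression handler):

import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.stereotype.Service;

@Service
public class DocumentService {

    // Placeholder domain type, only here to keep the sketch self-contained.
    public static class Document {
        private Long id;
        public Long getId() { return id; }
    }

    // The permission evaluator looks up the ACL for the given domain object instance.
    @PreAuthorize("hasPermission(#document, 'WRITE')")
    public void update(Document document) {
        // persist changes here
    }

    // Alternatively, check by id and target type when only the id is available.
    @PreAuthorize("hasPermission(#id, 'com.acme.model.Document', 'DELETE')")
    public void delete(Long id) {
        // delete the entity and remove its ACL entries afterwards
    }
}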
Another option is to put ACL security on your DAO layer. It works fine, but for some reason it just doesn't feel right; IMHO DAOs shouldn't deal with things like security. I've spent a LOT of time dealing with Spring Security ACL and have a pretty good handle on it by now, so ping me if you need any concrete examples.
I have previously only used reflection to do things like dynamically getting a class and setting field values in it. My Google search showed me that I could also possibly use reflection for dynamic type casting.
My code is as follows:
import entity.Shipvia;
import entity.Route;

import java.lang.reflect.Field;
import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.Persistence;
import javax.persistence.Query;

public class RetrieveResultList {

    public static List retrieveResultList(String tablename) {
        EntityManager entityManager = Persistence.createEntityManagerFactory("EntityLibraryPU").createEntityManager();
        Query query = entityManager.createNamedQuery(tablename + ".findAll");
        List<Shipvia> resultList = query.getResultList();
        return resultList;
    }
}
I am using this method to dynamically retrieve the result list from a database table. Because the table name is always different, I cannot hard-code List<Shipvia>, as the element type will be different for each table.
How would I go about converting the tablename string that I am passing in into the element type of the List?
You can't do that, and even if you could, it would be useless: all generics information is removed from Java code when it is compiled, only casts remain, and since you would be using reflection there would be no casts to be made anyway.
The closest thing you can do is, instead of passing in a String, pass in a Class object. The caller would have to say which class it wants (the caller probably knows what kind of object it's working with), and you would use it to give the list the correct generic type.
Calling it would look something like this:
List<Shipvia> shipvias = RetrieveResultList.retrieveResultList( Shipvia.class );
And the implementation could be something like this:
public class RetrieveResultList {

    private static final EntityManagerFactory FACTORY = Persistence.createEntityManagerFactory("EntityLibraryPU");

    public static <T> List<T> retrieveResultList(Class<T> type) {
        EntityManager entityManager = FACTORY.createEntityManager();
        try {
            // Named queries here follow the "EntityName.findAll" convention,
            // so derive the query name from the entity class.
            String entityName = type.getSimpleName();
            Query query = entityManager.createNamedQuery(entityName + ".findAll");
            @SuppressWarnings("unchecked")
            List<T> resultList = query.getResultList();
            return resultList;
        } finally {
            entityManager.close();
        }
    }
}
And then you should have something like what you're looking for.
ALSO: do not create an EntityManagerFactory on every call to this method. The factory must be a singleton in your project, as it is a very expensive object to create. You should also close the EntityManager you created before leaving the method.