I'm trying to do some testing on a piece of legacy code and I have hit a bit of a wall. The code is part of a backend server for one of our Angular web applications. The particular code I need to test is responsible for managing the creation of a sales rep account using data passed in from the client. In addition to saving the new rep to our MongoDB, the code also has to handle saving to an external SQL database. The workflow looks something like this:
Receive a PUT request from the client.
Create a new 'rep' object with the passed-in data.
Save the rep to the Mongo DB.
Call a Groovy class that will do an insert on a remote DB and return the id of the remote record.
Save the remote id into the Mongo data.
Normally, I would use Mockito to mock out the SQL connection, but I couldn't get that to work in this case. My next thought was to try mocking the Groovy class instead. I don't actually care about the internals of the Groovy method; I just need an id back. So far that has not worked either. I suspect the reason is that the Groovy method is getting called from a protected method inside my service class. I don't have any control over the signature of this method; it is an override from another library.
Is there anything I can do to be able to test this code without having to set up an actual connection to the SQL database?
Web Service:
@Override
protected void beforeInsert() {
    super.beforeInsert();
    final Rep weakRep = this;
    dwPhase1 = injector.getInstance(DwPhase1.class);
    dwPhase1.insertRepDetail(weakRep); // no return here: the overridden method is void
}
Mocking code:
DwPhase1 dwGroovy = Mockito.mock(DwPhase1.class);
Mockito.when(dwGroovy.insertRepDetail(Mockito.any(Rep.class))).then(new Answer<Object>() {
    @Override
    public Object answer(InvocationOnMock invocation) throws Throwable {
        return 42L; // insertRepDetail returns a Number, so stub a numeric id
    }
});
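Since beforeInsert() pulls a fresh instance from the injector, I'm guessing the mock has to come from the injector itself rather than sit in a local variable. This is the kind of test wiring I was considering; it is only a sketch, and it assumes injector is a Guice Injector that the test can swap in, which I haven't confirmed:
import com.google.inject.AbstractModule;
import com.google.inject.Guice;
import com.google.inject.Injector;
import org.mockito.Mockito;

// Bind the mock in a test module so injector.getInstance(DwPhase1.class)
// returns it; how the service obtains this injector is the open question.
final DwPhase1 dwMock = Mockito.mock(DwPhase1.class);
Mockito.when(dwMock.insertRepDetail(Mockito.any(Rep.class))).thenReturn(42L);

Injector testInjector = Guice.createInjector(new AbstractModule() {
    @Override
    protected void configure() {
        bind(DwPhase1.class).toInstance(dwMock);
    }
});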
Groovy snippet:
class DwPhase1 extends SQL {
    public Number insertRepDetail(Rep r) {
        String insert = """insert into REP_DETAIL (some values)""";
        List<List<Object>> rows = null;
        sql.withTransaction {
            if (update != null) { // 'update' is defined in the surrounding context
                sql.executeUpdate(update);
            }
            rows = sql.executeInsert(params, insert); // 'params' likewise comes from context
        }
        Number dwId = null;
        if (rows != null && !rows.isEmpty()) {
            List columns = rows.get(0);
            if (columns != null && !columns.isEmpty()) {
                dwId = columns.get(0);
            }
        }
        return dwId;
    }
}
Hoping someone else has run into the same issue as me, or has other ideas.
I'm currently running Play 1.4.x (not by choice), but I'm also working on upgrading to Play 1.5.x, though I verified the same issue happens on both versions.
I created a simple functional test that loads data via fixtures. My fixture for loading test data looks like this:
data.yml
User(testUser):
    name: blah

AccessToken(accessToken):
    user: testUser
    token: foo

Data(testData):
    user: testUser
    ...
I've created a controller that does something with the data, with middleware for an authentication check. The routes file maps something like /foo to BasicController.test.
public class BasicController extends Controller {

    @Before
    public void doAuth() {
        String tokenHeader = "foo"; // get token somehow from the header
        AccessToken token = AccessToken.find("token = ?", tokenHeader).first(); // returns null!
        // do something with the token
        if (token == null) {
            // return 401
        }
        // continue to test()
    }

    public void test() {
        User user = ...; // assured to be the logged-in user
        // ... other stuff, not important
    }
}
Finally I have my functional test like so:
public class BasicControllerTest extends FunctionalTest {

    @org.junit.Before
    public void loadFixtures() {
        Fixtures.loadModels("data.yml");
    }

    @Test
    public void doTest() {
        Http.Request request = newRequest();
        request.headers.put(...); // add auth token to header
        Http.Response response = GET(request, "/foo");
        assertIsOk(response);
    }
}
Now, the problem I'm running into is that I can verify the token is still visible in the headers, but running AccessToken token = AccessToken.find("token = ?", token).first(); returns null.
I verified in the functional test, before calling the GET method, that the accessToken and user were created successfully by loading the fixtures. I can see the data in my H2 in-memory database through Play's new DBBrowser plugin in 1.5.x. But for some reason the data is not returned in the controller method.
Things I've tried
Ensuring that the fixtures are loaded only once, so there is no race condition where data is cleared while being read.
Using multiple ways of querying the database: native queries, JPQL/HQL, and Play's native way of querying data.
Testing on different versions of Play.
Any help would be very much appreciated!
This issue happens in functional tests because JPA transactions must be encapsulated in a Job to ensure that the result of the transaction is visible in your method. Otherwise, since the whole functional test runs inside a transaction, the result will only be visible at the end of the test (see how to setup database/fixture for functional tests in playframework for a similar case).
So you may try this:
@Test
public void doTest() {
    // ...
    AccessToken token = new Job<AccessToken>() {
        @Override
        public AccessToken doJobWithResult() throws Exception {
            return AccessToken.find("token = ?", tokenId).first();
        }
    }.now().get();
    // ...
}
Hoping it works!
I think I had a similar issue; maybe this helps someone.
There is one transaction for the functional test and a different transaction for the controller. Changes made in the test only become visible to other transactions once they are committed.
One can achieve this by closing and re-opening the transaction in the functional test, like so:
// Load / persist data here
JPA.em().getTransaction().commit(); // commit and close the transaction
JPA.em().getTransaction().begin();  // reopen (if you need it)
Now the data should be returned in the controller method.
So your test would look like this:
public class BasicControllerTest extends FunctionalTest {

    @org.junit.Before
    public void loadFixtures() {
        Fixtures.loadModels("data.yml");
        JPA.em().getTransaction().commit();
        // JPA.em().getTransaction().begin(); // reopen (if you need it)
    }

    @Test
    public void doTest() {
        Http.Request request = newRequest();
        request.headers.put(...); // add auth token to header
        Http.Response response = GET(request, "/foo");
        assertIsOk(response);
    }
}
I never tried this with fixtures, but I would assume they run in the same transaction.
I'm trying to develop a scenario where my code must be asynchronous, using the Quarkus framework. Below is a snippet of my code:
@Inject
ThreadContext threadContext;

@Inject
ManagedExecutor managedExecutor;

@Transactional
@ActivateRequestContext
private void asyncMethod(DataAccessAuthorisationEntity dataAccess) {
    dataAccess.setStatus(IN_PROGRESS); // first update of the entity
    dataAccessAuthorisationRepository.persist(dataAccess);
    threadContext.withContextCapture(CompletableFuture.completedFuture("T")).runAsync(() -> {
        logger.info("[][][] for dataAccess id we begin the treatment " + dataAccess.getId());
        boolean exit = false;
        PortfolioEntity portfolioEntity = portfolioRepository.findById(dataAccess.getPortfolioId());
        try {
            logger.info("[BEGIN][copyFileAfterSharing] for data access id= " + dataAccess.getId());
            String portfolioId = portfolioEntity.getExternalId() + "_" + portfolioEntity.getExternalIdType().getCode();
            fileService.copyFileOnAnotherServer(new CopyObject(portfolioId, dataAccess.getStartPoint().toString(),
                    dataAccess.getEmitterOrganisationId(), dataAccess.getRecipientOrganisationId()));
        } catch (Exception e) {
            dataAccess.setStatus(PENDING); // second update, from the async thread
            dataAccessAuthorisationRepository.persist(dataAccess);
            logger.info("[ERROR][copyFileAfterSharing][BELOW STACKTRACE] for data access id= " + dataAccess.getId());
            e.printStackTrace();
            exit = true;
        }
    }, managedExecutor);
}
but whenever execution passes through the exception catch block and I call dataAccessAuthorisationRepository.persist(dataAccess), I always get:
Transaction is not active, consider adding @Transactional to your
method to automatically activate one.
because I update my entity dataAccess twice in the same transaction.
Quarkus creates a proxy wrapper around your instance that is injected. When you call a method of a managed bean, you actually call this proxy object, which handles the annotations. If you call a method via "this.", the bean container/proxy will not detect the call, as it does not go through the proxy. You can't use annotations on calls made with "this.".
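A sketch of the usual workaround: move the transactional status update into a separate injected bean, so the call crosses the proxy boundary and @Transactional takes effect even from the async runnable. The bean and its fields below are illustrative names, the Status type is an assumption about your IN_PROGRESS/PENDING constants, and the jakarta.* imports assume a recent Quarkus (older versions use javax.*):
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.transaction.Transactional;

@ApplicationScoped
public class DataAccessStatusUpdater {

    @Inject
    DataAccessAuthorisationRepository repository;

    // Every call arrives through the CDI proxy, so a transaction is
    // started (or joined) for the status update on any thread.
    @Transactional
    public void updateStatus(DataAccessAuthorisationEntity dataAccess, Status status) {
        dataAccess.setStatus(status);
        repository.persist(dataAccess);
    }
}
Inside the catch block you would then call statusUpdater.updateStatus(dataAccess, PENDING) on an injected DataAccessStatusUpdater instead of persisting directly.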
I have a Spring Boot Gradle project using Apache Flink to process datastream signals. When a new signal comes through the datastream, I would like to look up (i.e., findById()) its details using an ID in a PostgreSQL database table which is already created, in order to get additional information about the signal and enrich the data. I would like to avoid using Spring dependencies to perform the lookup (i.e., autowiring a repository) and want to stick with a Flink implementation for the lookup.
Where can I specify the Postgres connection configuration such as port, database, URL, username, password, etc.? (For simplicity, assume the Postgres DB is local on my machine.) Is it as simple as adding the configuration to the application.properties file? If so, how can I write the query method to look up the record in the Postgres table when searching by a non-primary-key value?
Some online sources suggest using this skeleton code, but I am not sure how/if it fits my use case. (I have an EventEntity model created which contains all the params/columns from the table I'm looking up.) Like so:
public class DatabaseMapper extends RichFlatMapFunction<String, EventEntity> {
    // declare DB connection & query statements

    @Override
    public void open(Configuration parameters) throws Exception {
        // initialize DB connection
        // prepare query statements
    }

    @Override
    public void flatMap(String value, Collector<EventEntity> out) throws Exception {
    }
}
Your sample code is correct. You can put all your custom initialization and preparation code for PostgreSQL in the open() method, then use the pre-configured fields in your flatMap() function.
Here is one sample for Redis operations.
I have used RichAsyncFunction here and I suggest you do the same, as it is considered best practice. Read more here: https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/stream/operators/asyncio.html
You can pass configuration parameters in your constructor and use them in your initialization process:
public static class AsyncRedisOperations extends RichAsyncFunction<Object, Object> {

    private transient JedisPool jedisPool;
    private transient Logger logger;
    private final Configuration redisConf;

    public AsyncRedisOperations(Configuration redisConf) {
        this.redisConf = redisConf;
    }

    @Override
    public void open(Configuration parameters) {
        this.logger = LoggerFactory.getLogger(AsyncRedisOperations.class);
        JedisPoolConfig jedisPoolConfig = new JedisPoolConfig();
        jedisPoolConfig.setMaxTotal(this.redisConf.getInteger("pool", 8));
        jedisPoolConfig.setMaxIdle(this.redisConf.getInteger("pool", 8));
        jedisPoolConfig.setMaxWaitMillis(this.redisConf.getInteger("maxWait", 0));
        JedisPool jedisPool = new JedisPool(jedisPoolConfig,
                this.redisConf.getString("host", "192.168.10.10"),
                this.redisConf.getInteger("port", 6379), 5000);
        try {
            this.jedisPool = jedisPool;
            this.logger.info("Redis connected: " + jedisPool.getResource().isConnected());
        } catch (Exception e) {
            this.logger.error(BaseUtil.append("Exception while connecting Redis"));
        }
    }

    @Override
    public void asyncInvoke(Object in, ResultFuture<Object> out) {
        try (Jedis jedis = this.jedisPool.getResource()) {
            String value = jedis.get(in.toString()); // look up the incoming key
            this.logger.info("Redis value: " + value);
            out.complete(Collections.singleton(value)); // complete the async request
        }
    }
}
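Transposing the same pattern to the Postgres lookup in your question: Flink operator instances run on the task managers and do not read Spring's application.properties, so the connection details have to reach the function itself, through its constructor or hard-coded in open(). Below is a sketch; the URL, credentials, table name, and the non-primary-key column signal_id are all assumptions, and the org.postgresql JDBC driver must be on the classpath:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

public class DatabaseMapper extends RichFlatMapFunction<String, EventEntity> {

    private transient Connection connection;
    private transient PreparedStatement lookup;

    @Override
    public void open(Configuration parameters) throws Exception {
        // One connection per parallel subtask, created when the task starts
        connection = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
        // Look up by a non-primary-key column (column name is an assumption)
        lookup = connection.prepareStatement("SELECT * FROM events WHERE signal_id = ?");
    }

    @Override
    public void flatMap(String value, Collector<EventEntity> out) throws Exception {
        lookup.setString(1, value);
        try (ResultSet rs = lookup.executeQuery()) {
            while (rs.next()) {
                EventEntity entity = new EventEntity(); // assumes a no-arg constructor
                // entity.setXxx(rs.getXxx("column")); // map the columns you need
                out.collect(entity);
            }
        }
    }

    @Override
    public void close() throws Exception {
        if (connection != null) {
            connection.close(); // release the connection when the task shuts down
        }
    }
}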
Basically I have two services; each of them handles methods for one of the persistent objects in my project. These services hold the methods that the (Google) endpoint will call to perform something.
I'm using Google Cloud Endpoints + MySQL Cloud + Hibernate.
Two POs
@Entity
public class Device {
    ...
}

@Entity
public class User {
    ...
}
The services for each of the POs:
public class DeviceService {

    Device getDevice(Long devId) {
        return new Dao().getById(devId, Device.class);
    }

    void allocateDevice(Long userId) {
        User u = new UserService().getUser(userId);
        // ... do stuff
    }
}

public class UserService {

    User getUser(Long userId) {
        return new Dao().getById(userId, User.class);
    }
}
The endpoint for each one
public class DeviceEndpoint {

    @ApiMethod(
        name = "device.get",
        path = "device/{devId}",
        httpMethod = ApiMethod.HttpMethod.GET
    )
    Device getDevice(Long devId) {
        MyEntityManager em = new MyEntityManager();
        Device device;
        try {
            em.getTransaction().begin();
            device = new DeviceService().getDevice(devId);
            em.getTransaction().commit();
        } finally {
            em.cleanup(); // custom method that also rolls back
        }
        return device;
    }

    @ApiMethod(
        name = "device.allocate",
        path = "device/{userId}/allocate",
        httpMethod = ApiMethod.HttpMethod.GET
    )
    void allocateDevice(Long userId) {
        MyEntityManager em = new MyEntityManager();
        try {
            em.getTransaction().begin();
            new DeviceService().allocateDevice(userId);
            em.getTransaction().commit();
        } finally {
            em.cleanup(); // custom method that also rolls back
        }
    }
}
I would like to know where to put the database transaction logic (begin, commit, rollback).
Dao layer
Firstly I put it into the Dao class, but then every query/insert/update opened and closed its own connection, and when I had to use more than one CRUD operation I ended up with several open/close cycles, which was expensive and slow.
Example: in one endpoint request I want to fetch an object from the DB and update it. Two operations, and two open/close connections.
Endpoint layer (as in the example)
Secondly I put the open/close logic in the endpoint methods (as in the example above), but my colleagues said it isn't a good pattern: beginning and committing transactions in this layer isn't a good idea, so they suggested the third option.
Service layer
I put that logic (begin/commit/rollback) into the service layer, in each method. I tried it, but some methods call others that also open and close the connection, so when the second method returns, the transaction comes back closed.
Please let me know in case I'm missing some important info.
Typically this type of action is performed in the service layer, as that layer exists to provide the logic that operates on the data sent to and from the DAO layer; that being said, you could bundle these together into the same module.
The comment "I tried it, but some methods call others that also open and close the connection, so when the second method returns, the transaction comes back closed" is interesting; I am not sure how you are managing your connections, but you may want/need to revisit whether your connections are being closed before transactions are completed. You may want to look at Hibernate's HibernateTransactionManager.
Where should "@Transactional" be placed: service layer or DAO?
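One way to let nested service calls share a single transaction, sketched below with the question's class names (the constructor parameter is an assumption about how the services could be restructured): create the MyEntityManager once per request in the endpoint, keep the begin/commit/cleanup calls there, and hand the same instance down to every service.
// Sketch: the endpoint owns the transaction; services receive the
// shared MyEntityManager instead of opening their own connection.
public class DeviceService {

    private final MyEntityManager em;

    public DeviceService(MyEntityManager em) {
        this.em = em;
    }

    void allocateDevice(Long userId) {
        // The nested call reuses the same MyEntityManager, so the
        // transaction stays open until the endpoint commits it.
        User u = new UserService(em).getUser(userId);
        // ... do stuff
    }
}
This is essentially what frameworks automate with a thread-bound EntityManager or @Transactional, as the linked question discusses.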
I'm looking for some guidance on real unit testing for Restlet components, and specifically extractors. There is plenty of advice on running JUnit to test entire endpoints, but being picky, this is not unit testing but integration testing. I really don't want to set up an entire routing system and Spring just to check an extractor against a mock data repository.
The extractor looks like this:
public class CaseQueryExtractor extends Extractor {

    protected int beforeHandle(Request request, Response response) {
        extractFromQuery("offset", "offset", true);
        extractFromQuery("limit", "limit", true);
        // Stuff happens...
        attributes.put("query", query);
        return CONTINUE;
    }
}
I'm thinking part of the virtue of Restlet is that its nice routing model ought to make unit testing easy, but I can't figure out what I need to do to actually exercise extractFromQuery and its friends, and all my logic that builds a query object, without mocking so much that I lose testing against a realistic web request.
And yes, I am using Spring, but I don't want to have to set up the whole context for this; I'm not integration testing, as I haven't actually finished the app yet. I'm happy to inject manually, once I know what I need to create to get this method called.
Here's where I'm at now:
public class CaseQueryExtractorTest {

    private class TraceRestlet extends Restlet {
        // Does nothing, but prevents warning shouts
    }

    private CaseQueryExtractor extractor;

    @Before
    public void initialize() {
        Restlet mock = new TraceRestlet();
        extractor = new CaseQueryExtractor();
        extractor.setNext(mock);
    }

    @Test
    public void testBasicExtraction() {
        Reference reference = new Reference();
        reference.addQueryParameter("offset", "5");
        reference.addQueryParameter("limit", "3");
        Request request = new Request(Method.GET, reference);
        Response response = new Response(request);
        extractor.handle(request, response);
        CaseQuery query = (CaseQuery) request.getAttributes().get("query");
        assertNotNull(query);
    }
}
This of course fails, as whatever setup I am doing isn't enough for Restlet to extract the query parameters.
Any thoughts or pointers?
There is a test module in Restlet that can provide you some hints about unit testing. See https://github.com/restlet/restlet-framework-java/tree/master/modules/org.restlet.test/src/org/restlet/test.
You can have a look at class HeaderTestCase (see https://github.com/restlet/restlet-framework-java/blob/master/modules/org.restlet.test/src/org/restlet/test/HeaderTestCase.java).
For your information, if you use the attributes from the request, your unit test will pass ;-) See below:
public class CaseQueryExtractor extends Extractor {

    protected int beforeHandle(Request request, Response response) {
        extractFromQuery("offset", "offset", true);
        extractFromQuery("limit", "limit", true);
        // Stuff happens...
        CaseQuery query = new CaseQuery();
        Map<String, Object> attributes = request.getAttributes();
        attributes.put("query", query);
        return CONTINUE;
    }
}
I don't know if you want to go further...
Hope it helps you,
Thierry