JUnit test for BigQuery in Java

I'm trying to write JUnit test cases for BigQuery using Mockito and JUnit 5. I'm trying to mock, or even just initialize, Table, but I am not able to do so. Is there any way to do it?
private JsonStreamWriter streamWriter;

void WriteToBQ(TableName parentTable) {
    BigQuery bigquery = BigQueryOptions
            .newBuilder()
            .setProjectId(AppConstants.PROJECT_ID)
            .build()
            .getService();
    Table table = bigquery.getTable(parentTable.getDataset(), parentTable.getTable());
    Schema schema = table.getDefinition().getSchema();
    TableSchema tableSchema = BqToBqStorageSchemaConverter.convertTableSchema(schema);
    streamWriter = JsonStreamWriter.newBuilder(parentTable.toString(), tableSchema).build();
}
I tried to initialize and mock Table using
table = new Table(bigquery, new TableInfo.BuilderImpl(TABLE_INFO));
but I can't use BuilderImpl outside its package because it is not public.
I have even tried to mock Table, but still no luck:
bigquery = mock(BigQuery.class);
mockOptions = mock(BigQueryOptions.class);
table = mock(Table.class);
when(bigquery.getTable(any(),any())).thenReturn(table);
when(bigquery.getOptions()).thenReturn(mockOptions);

If I had to write a test for this scenario, I would make a slight change to the class: extract the BigQuery object creation into a protected method like the one below, and then override that method in a test subclass.
public class Writer {

    private JsonStreamWriter streamWriter;

    void WriteToBQ(TableName parentTable) throws Descriptors.DescriptorValidationException, IOException, InterruptedException {
        BigQuery bigquery = buildBigQueryService();
        Table table = bigquery.getTable(parentTable.getDataset(), parentTable.getTable());
        Schema schema = table.getDefinition().getSchema();
        TableSchema tableSchema = BqToBqStorageSchemaConverter.convertTableSchema(schema);
        streamWriter = JsonStreamWriter.newBuilder(parentTable.toString(), tableSchema).build();
    }

    // Extracted so that tests can override how the BigQuery client is created.
    protected BigQuery buildBigQueryService() {
        return BigQueryOptions
                .newBuilder()
                .setProjectId(AppConstants.PROJECT_ID)
                .build()
                .getService();
    }
}
The test could then look something like this:
class WriterTest {

    BigQuery bigquery;
    Table table;

    @BeforeEach
    public void setup() {
        bigquery = mock(BigQuery.class);
        table = mock(Table.class);
        when(bigquery.getTable(any(), any())).thenReturn(table);
    }

    @Test
    public void testWriteToBQ() throws Descriptors.DescriptorValidationException, IOException, InterruptedException {
        TableName tableName = TableName.newBuilder().setTable("A").build();
        Writer writer = new MockWriter();
        writer.WriteToBQ(tableName);
    }

    class MockWriter extends Writer {
        @Override
        protected BigQuery buildBigQueryService() {
            // Return the mock prepared in setup() so the getTable(...) stubbing applies.
            return bigquery;
        }
    }
}
Here, MockWriter only overrides buildBigQueryService so that it returns the mocked BigQuery; everything else behaves the same as in the parent class.
Also, it is generally not recommended to mock data objects, so please consider building a real table object instead of this:
table = mock(Table.class);
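Since Table's builder is not public (as noted in the question), a fully real Table is hard to construct. A compromise, sketched below as a suggestion rather than as part of the original answer, is to keep Table as a mock but back it with a real Schema and StandardTableDefinition, so the schema conversion in WriteToBQ runs against real data:
// Sketch: real schema and table definition behind a mocked Table (the field name is made up).
Schema schema = Schema.of(Field.of("name", StandardSQLTypeName.STRING));
StandardTableDefinition definition = StandardTableDefinition.newBuilder()
        .setSchema(schema)
        .build();
table = mock(Table.class);
when(table.getDefinition()).thenReturn(definition);
when(bigquery.getTable(any(), any())).thenReturn(table);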

Related

Is it possible to use MongoRepository without using Spring Boot? Just plain Java + MongoDB CRUD program

I get this error:
java.lang.NullPointerException
at com.mongodb.quickstart.ConnectToMongoDB.findByUserId(ConnectToMongoDB.java:49)
at com.mongodb.quickstart.ConnectToMongoDB.main(ConnectToMongoDB.java:41)
EmployeeRepository.java
public interface EmployeeRepository extends MongoRepository<Employee, String> {
    Employee findByUserId(String userId);
    List<Employee> findByLocation(String location);
}
And then I call it here:
public static boolean findByUserId(String userId) {
    Optional<Employee> employee = Optional.ofNullable(employeeRepository.findByUserId(userId));
    return employee.isPresent() ? true : false;
}
However, I get the NullPointerException shown above.
But when I just do this:
public static void main(String[] args) {
    // Replace the placeholder with your MongoDB deployment's connection string
    String uri = "mongodb://localhost:27017";
    try (MongoClient mongoClient = MongoClients.create(uri)) {
        MongoDatabase database = mongoClient.getDatabase("testdb");
        MongoCollection<Document> collection = database.getCollection("employee");
        Bson bsonFilter = Filters.eq("userId", "1234567");
        //FindIterable<Document> queryResult = collection.find(bsonFilter);
        Document doc = collection.find(bsonFilter).first();
        if (doc != null) {
            System.out.println(doc.toJson());
        } else {
            System.out.println("No matching documents found.");
        }
    }
}
I can fetch data from MongoDB and the program works fine.
Spring Data can definitely be used without Spring Boot.
You need to define @Configuration and @ComponentScan (or explicit bean definitions) for your project in order to tell Spring how to create the components.
The @SpringBootApplication annotation includes all of these annotations, which is why Spring Boot finds all the components automatically.
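As a minimal sketch of what that looks like without Spring Boot (the MongoConfig and PlainSpringMongoApp class names are made up; the URI and database name are reused from the question), you can register the MongoClient and MongoTemplate yourself and enable the repositories explicitly:
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

// Hypothetical plain-Spring configuration: tells Spring Data where the repositories live
// and which MongoTemplate to back them with.
@Configuration
@EnableMongoRepositories(basePackageClasses = EmployeeRepository.class)
class MongoConfig {

    @Bean
    MongoClient mongoClient() {
        return MongoClients.create("mongodb://localhost:27017");
    }

    @Bean
    MongoTemplate mongoTemplate(MongoClient mongoClient) {
        return new MongoTemplate(mongoClient, "testdb");
    }
}

// Bootstrap the context manually and fetch the repository bean; a static employeeRepository
// field that Spring never initialized is the likely cause of the NPE in the question.
public class PlainSpringMongoApp {
    public static void main(String[] args) {
        try (AnnotationConfigApplicationContext ctx =
                     new AnnotationConfigApplicationContext(MongoConfig.class)) {
            EmployeeRepository employeeRepository = ctx.getBean(EmployeeRepository.class);
            System.out.println(employeeRepository.findByUserId("1234567"));
        }
    }
}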

How to Mock Azure PagedIterable<T>

I have a Java Spring Boot web API project that uses Azure Table Storage as the data store. I'd like to create a unit test to make sure that the repository properly converts an Azure TableEntity into a custom Tag object. However, I have not been able to figure out a way to mock the Azure PagedIterable<TableEntity> that is returned by the Azure TableClient.listEntities() function.
At the core of my repository class is the following function that returns a filtered list of table entities:
private PagedIterable<TableEntity> getFilteredTableRows(String filter, String tableName) {
    ListEntitiesOptions options = new ListEntitiesOptions().setFilter(filter);
    TableClient tableClient = tableServiceClient.getTableClient(tableName);
    PagedIterable<TableEntity> pagedIterable = tableClient.listEntities(options, null, null);
    return pagedIterable;
}
How do I ensure the TableClient is mocked and returns a valid PagedIterable<TableEntity>?
Below is a sample JUnit test class that uses Mockito to mock the Azure PagedIterable<T> object and return a single TableEntity, which is mapped to a custom Tag model in the repository code.
The test setup requires four mocks:
A mock Iterator
A mock PagedIterable
A mock TableServiceClient
A mock TableClient
If there is an easier way to accomplish the same thing, I'm open to suggestions.
@ExtendWith(MockitoExtension.class)
@MockitoSettings(strictness = Strictness.LENIENT)
public class DocTagRepositoryTest {

    @InjectMocks
    @Spy
    DocTagRepository docTagRepository;

    @Mock
    TableServiceClient tableServiceClient;

    @Mock
    TableClient tableClient;

    private static TableEntity testTableEntity;
    private static Tag testTagObject;

    @SneakyThrows
    @BeforeAll
    public static void setup() {
        loadTableObjects();
    }

    @Test
    public void testGetTagList() {
        // Given: A request to get tags from Azure table storage...
        Iterator mockIterator = mock(Iterator.class);
        when(mockIterator.hasNext()).thenReturn(true, false);
        when(mockIterator.next()).thenReturn(testTableEntity);

        PagedIterable mockPagedTableEntities = mock(PagedIterable.class);
        when(mockPagedTableEntities.iterator()).thenReturn(mockIterator);

        when(tableServiceClient.getTableClient(Mockito.anyString())).thenReturn(tableClient);
        when(tableClient.listEntities(any(), any(), any())).thenReturn(mockPagedTableEntities);

        List<Tag> expected = new ArrayList<>();
        expected.add(testTagObject);

        // When: A call is made to the repository's getActiveTags() function...
        List<Tag> actual = docTagRepository.getActiveTags();

        // Then: Return an array of tag objects.
        assertArrayEquals(expected.toArray(), actual.toArray());
    }

    private static void loadTableObjects() {
        OffsetDateTime now = OffsetDateTime.now();
        String testUser = "buh0000";
        String rowKey = "test";
        String partitionKey = "v1";
        String activeStatus = "A";

        Map<String, Object> properties = new HashMap<>();
        properties.put("createdDate", now);
        properties.put("createdBy", testUser);
        properties.put("modifiedDate", now);
        properties.put("lastModifiedBy", testUser);
        properties.put("status", activeStatus);

        testTableEntity = new TableEntity(partitionKey, rowKey);
        testTableEntity.setProperties(properties);

        testTagObject = new Tag(partitionKey, rowKey, now, testUser, now, testUser, activeStatus);
    }
}
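As for an easier setup: one option, offered here only as a sketch and not as part of the original answer, is to back the mocked PagedIterable with a real collection, which removes the separate Iterator mock and its hasNext()/next() stubbing:
// Sketch: stub iterator() with a real iterator (and stream(), if the repository uses streams).
PagedIterable<TableEntity> mockPagedTableEntities = mock(PagedIterable.class);
when(mockPagedTableEntities.iterator()).thenReturn(List.of(testTableEntity).iterator());
when(mockPagedTableEntities.stream()).thenReturn(Stream.of(testTableEntity));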

Mock enhanced DynamoDbTable CRUD operations

How to mock software.amazon.awssdk.enhanced.dynamodb.DynamoDbTable.getItem?
So far I have tried the code below, which throws a NullPointerException from inside the SDK.
Any idea how to mock the table CRUD operations?
@Mock private DynamoDbEnhancedClient enhdynamodb;
@Mock private DynamoDbClient dynamodb;
@Mock private DynamoDbTable<EventRecord> dyamodbTable;
@Mock private SecurityContext securityContext;

@Before
public void setup() {
    MockitoAnnotations.initMocks(this);
    when(securityContext.getUserPrincipal()).thenReturn(principal);
    enhdynamodb = DynamoDbEnhancedClient.builder().dynamoDbClient(dynamodb).build();
    dyamodbTable = enhdynamodb.table(TABLE_NAME, TableSchema.fromBean(EventRecord.class));
    service = new EventsService(tokenSerializer, enhdynamodb, configProvider, clock);
    service.setSecurityContext(securityContext);
}
@Test
public void getEvent_null_notFound() {
    String userId = UUID.randomUUID().toString();
    String eventId = UUID.randomUUID().toString();
    GetItemResponse response = GetItemResponse.builder().build();
    EventRecord event = null;
    when(principal.getName()).thenReturn(userId);
    when(dyamodbTable.getItem(any(GetItemEnhancedRequest.class))).thenReturn(event);
    assertThatThrownBy(() -> service.getEvent(eventId)).isInstanceOf(NotFoundApiException.class);
}
public Event getEvent(String eventId) {
    log.info("Getting event {}", eventId);
    EventRecord eventRecord = loadEvent(eventId);
    return modelMapper.map(eventRecord, Event.class);
}

private EventRecord loadEvent(final String eventId) {
    String userId = securityContext.getUserPrincipal().getName();
    EventRecord event =
            getTable()
                    .getItem(
                            GetItemEnhancedRequest.builder()
                                    .consistentRead(Boolean.TRUE)
                                    .key(k -> k.partitionValue(userId).sortValue(eventId).build())
                                    .build());
    if (event == null) {
        throw new NotFoundApiException(
                new NotFoundException()
                        .errorCode("EventNotFound")
                        .message(String.format("Event %s can not be found.", eventId)));
    }
    return event;
}

private DynamoDbTable<EventRecord> getTable() {
    return dynamodb.table(tableName, TableSchema.fromBean(EventRecord.class));
}
I tried it like this and it does not throw exceptions.
@Test
public void getEvent_null_notFound() {
    String userId = UUID.randomUUID().toString();
    String eventId = UUID.randomUUID().toString();
    DynamoDbTable dynamoDbTable = mock(DynamoDbTable.class);
    EventRecord event = null;
    when(dynamoDbTable.getItem(any(GetItemEnhancedRequest.class))).thenReturn(event);
    assertEquals(event, dynamoDbTable.getItem(event));
}
Note that I am mocking DynamoDbTable instead of DynamoDbEnhancedClient.
Mocking calls to the client and unit testing your own code is of course a good idea, but I highly recommend using the DynamoDB Local library if you want to make actual DynamoDB calls against a local database.
Here is the full documentation. If you use this library in your unit tests, you don't need to mock the calls.
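For completeness, here is a rough sketch (not from the original answer; EventRecord, TABLE_NAME and the service wiring are the asker's names, assumed here) of stubbing the whole chain so that the service under test receives a mocked table instead of one built with a real DynamoDbEnhancedClient.builder():
// Sketch: mock the enhanced client so that table(...) returns a mocked DynamoDbTable,
// then stub getItem(...) on that table. Do not reassign the @Mock fields with real builders.
DynamoDbEnhancedClient enhancedClient = mock(DynamoDbEnhancedClient.class);
@SuppressWarnings("unchecked")
DynamoDbTable<EventRecord> eventTable = (DynamoDbTable<EventRecord>) mock(DynamoDbTable.class);

when(enhancedClient.table(anyString(), ArgumentMatchers.<TableSchema<EventRecord>>any()))
        .thenReturn(eventTable);
when(eventTable.getItem(any(GetItemEnhancedRequest.class))).thenReturn(null);

// A service constructed with enhancedClient now sees a missing item for every key,
// so getEvent(...) should end in NotFoundApiException.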

Can't switch AbstractRoutingDataSource without using @Transactional

I have a Spring Boot project. As mentioned in the title, I want to use AbstractRoutingDataSource to switch the DataSource. I don't want to use @Transactional since no transactions are needed.
But when I invoke my method through the controller, it behaves as though @Transactional were being used; when I call the same method from a unit test, it runs OK. I find that strange, because @Transactional shouldn't take effect in a unit test.
public class DynamicDataSource extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        return DynamicContextHolder.peek();
    }
}
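The post does not show DynamicContextHolder; a typical ThreadLocal-backed holder (a purely hypothetical sketch, only to make the routing mechanism concrete) looks roughly like this:
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch: determineCurrentLookupKey() returns whatever key was last pushed
// on the current thread, so the push/peek/poll calls and the JDBC access must happen
// on the same thread.
public final class DynamicContextHolder {

    private static final ThreadLocal<Deque<String>> KEYS = ThreadLocal.withInitial(ArrayDeque::new);

    public static void push(String dataSourceKey) {
        KEYS.get().push(dataSourceKey);
    }

    public static String peek() {
        return KEYS.get().peek();
    }

    public static void poll() {
        Deque<String> keys = KEYS.get();
        keys.poll();
        if (keys.isEmpty()) {
            KEYS.remove(); // avoid leaking the ThreadLocal on pooled threads
        }
    }

    private DynamicContextHolder() {
    }
}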
The controller method
@RequestMapping(value = "/pipeline/analyze/{systemId}/{dbName}/{dbSchemaName}/{dbTableName}", method = RequestMethod.GET)
public String pipelineAnalyze(@PathVariable("systemId") Long systemId, @PathVariable("dbName") String dbName, @PathVariable("dbSchemaName") String dbSchemaName, @PathVariable("dbTableName") String dbTableName) throws IOException {
    ETLServerAccessEntityInterface etlServerAccessEntity = etlServer.getETLServerAccessEntity();
    /* LOGGER.info("===etlServerAccessEntity.getAccessToken()===");
    LOGGER.info(etlServerAccessEntity.getAccessToken());
    LOGGER.info("isAlive=" + etlServer.isAlive());*/
    TableInfoMasterEntity tableInfoMaster = new TableInfoMasterEntity();
    TableInfoMasterEntityPk tableInfoMasterPk = new TableInfoMasterEntityPk();
    tableInfoMasterPk.setSystemId(systemId);
    tableInfoMasterPk.setDbName(dbName);
    tableInfoMasterPk.setDbSchemaName(dbSchemaName);
    tableInfoMasterPk.setDbTableName(dbTableName);
    tableInfoMaster.setId(tableInfoMasterPk);
    Example<TableInfoMasterEntity> exampleTableInfoMaster = Example.of(tableInfoMaster);
    List<TableInfoMasterEntity> filteredTableInfoMaster = tableInfoMasterService.findAll(exampleTableInfoMaster);
    System.out.println("filteredTableInfoMaster.size()=" + filteredTableInfoMaster.size());
    TableInfoMasterEntity tableInfoMasterEntity = filteredTableInfoMaster.get(0);
    pipelineRequirementAnalyzeService.analyze(tableInfoMasterEntity);
    return String.valueOf(pipelineRequirementAnalyzeService.toString());
}
The unit test
@Test
public void findByTemplateNameContaining() throws IOException {
    ETLServerAccessEntityInterface etlServerAccessEntity = etlServer.getETLServerAccessEntity();
    TableInfoMasterEntity tableInfoMaster = new TableInfoMasterEntity();
    TableInfoMasterEntityPk tableInfoMasterPk = new TableInfoMasterEntityPk();
    tableInfoMasterPk.setSystemId(1);
    tableInfoMasterPk.setDbName("DGRAMES");
    tableInfoMasterPk.setDbSchemaName("MESSERIES");
    tableInfoMasterPk.setDbTableName("TBLEMSACCESSORYSTATE_MAINTAIN");
    tableInfoMaster.setId(tableInfoMasterPk);
    Example<TableInfoMasterEntity> exampleTableInfoMaster = Example.of(tableInfoMaster);
    List<TableInfoMasterEntity> filteredTableInfoMaster = tableInfoMasterService.findAll(exampleTableInfoMaster);
    TableInfoMasterEntity tableInfoMasterEntity = filteredTableInfoMaster.get(0);
    pipelineRequirementAnalyzeService.analyze(tableInfoMasterEntity);
    System.out.println();
}
These two methods are similar; they both call pipelineRequirementAnalyzeService.analyze(tableInfoMasterEntity),
but the results are different. Here is the failure result and the success result.

How to populate database before running tests with Micronaut

I'm looking for a way to execute some SQL scripts before my test class is executed. With Spring I can easily annotate my test class (or test method) with the @Sql annotation. I haven't found any particular way to do the same with Micronaut.
The only way I found was to manually populate the data programmatically in the test method itself, but, in my experience, there are times when you have to perform multiple inserts to test a single case.
I've come up with the following code to test a REST controller:
Code
@Validated
@Controller("/automaker")
public class AutomakerController {

    private AutomakerService automakerService;

    public AutomakerController(AutomakerService automakerService) {
        this.automakerService = automakerService;
    }

    @Get("/{id}")
    public Automaker getById(Integer id) {
        return automakerService.getById(id).orElse(null);
    }

    @Get("/")
    public List<Automaker> getAll() {
        return automakerService.getAll();
    }

    @Post("/")
    public HttpResponse<Automaker> save(@Body @Valid AutomakerSaveRequest request) {
        var automaker = automakerService.create(request);
        return HttpResponse
                .created(automaker)
                .headers(headers -> headers.location(location(automaker.getId())));
    }

    @Put("/{id}")
    @Transactional
    public HttpResponse<Automaker> update(Integer id, @Body @Valid AutomakerSaveRequest request) {
        var automaker = automakerService.getById(id).orElse(null);
        return Objects.nonNull(automaker)
                ? HttpResponse
                        .ok(automakerService.update(automaker, request))
                        .headers(headers -> headers.location(location(id)))
                : HttpResponse
                        .notFound();
    }
}
Test
#Client("/automaker")
public interface AutomakerTestClient {
#Get("/{id}")
Automaker getById(Integer id);
#Post("/")
HttpResponse<Automaker> create(#Body AutomakerSaveRequest request);
#Put("/{id}")
HttpResponse<Automaker> update(Integer id, #Body AutomakerSaveRequest request);
}
@MicronautTest
public class AutomakerControllerTest {

    @Inject
    @Client("/automaker")
    AutomakerTestClient client;

    @Test
    public void testCreateAutomakerWhenBodyIsValid() {
        var request = new AutomakerSaveRequest("Honda", "Japan");
        var response = client.create(request);
        assertThat(response.code()).isEqualTo(HttpStatus.CREATED.getCode());
        var body = response.body();
        assertThat(body).isNotNull();
        assertThat(body.getId()).isNotNull();
        assertThat(body.getName()).isEqualTo("Honda");
        assertThat(body.getCountry()).isEqualTo("Japan");
    }

    @Test
    public void testUpdateAutomakerWhenBodyIsValid() {
        var responseCreated = client.create(new AutomakerSaveRequest("Chvrolet", "Canada"));
        assertThat(responseCreated.code()).isEqualTo(HttpStatus.CREATED.getCode());
        var itemCreated = responseCreated.body();
        assertThat(itemCreated).isNotNull();
        var responseUpdated = client.update(itemCreated.getId(), new AutomakerSaveRequest("Chevrolet", "United States"));
        assertThat(responseUpdated.code()).isEqualTo(HttpStatus.OK.getCode());
        var itemUpdated = responseUpdated.body();
        assertThat(itemUpdated).isNotNull();
        assertThat(itemUpdated.getName()).isEqualTo("Chevrolet");
        assertThat(itemUpdated.getCountry()).isEqualTo("United States");
    }
}
I could use a method annotated with @Before to populate all the data I need, but it would really be nice to be able to use *.sql scripts the way it is possible with Spring. Is there a way to provide such *.sql scripts before the tests are executed?
TL;DR: use Flyway.
With Flyway, you can set up and maintain a given database schema extremely easily. In your case, any migration script you put under ../test/resources/db/migration/ (or any other location you configure) will only be visible to your tests, and can be run automatically (if so configured) any time you run your tests.
Another solution would be to use an in-memory database (but I would stay away from that for real applications). For instance, H2 has a way to specify an initialization script and another for data seeding.
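If the Flyway route is taken, micronaut-flyway can run the migrations automatically from configuration; alternatively, Flyway's Java API can be invoked from the test itself. The following is only a rough sketch under assumed wiring (an injectable JDBC DataSource, JUnit 5, and migration scripts under src/test/resources/db/migration), not something from the original answer:
import javax.sql.DataSource;

import org.flywaydb.core.Flyway;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.TestInstance;

import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject; // or javax.inject.Inject on older Micronaut versions

@MicronautTest
@TestInstance(TestInstance.Lifecycle.PER_CLASS) // allows the non-static @BeforeAll below
public class AutomakerControllerSeededTest {

    @Inject
    DataSource dataSource; // assumed: a JDBC DataSource bean is configured for tests

    @BeforeAll
    void applyTestMigrations() {
        // Runs every script under src/test/resources/db/migration once before the tests.
        Flyway.configure()
              .dataSource(dataSource)
              .locations("classpath:db/migration")
              .load()
              .migrate();
    }
}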
