SpringBootApp NullPointerException with @Autowired repository - java

This is my Spring Boot application.
When I run the main method, a NullPointerException is always thrown.
I have no idea why the @Autowired JsonReaderService is null, as I define it as a component.
It is in a subfolder of the project's src folder, so the main method is above it. Spring should scan it correctly, right?
I also have a test method which works just fine.
@SpringBootApplication
public class DemoApplication {

    @Autowired
    private JsonReaderService jsonReaderService;

    private static JsonReaderService stat_jsonReaderService;

    static Logger logger = LoggerFactory.getLogger(DemoApplication.class);

    public static void main(String[] args) throws IOException {
        String textFileName = scanFileName();
        Reader reader = Files.newBufferedReader(Paths.get("src/main/resources/" + textFileName));
        // This line always throws a NullPointerException. The @Autowired annotation doesn't work with my JsonReaderService????? WHY
        List<EventDataJson> jsonReaderServicesList = stat_jsonReaderService.readEventDataFromJson(reader);
        stat_jsonReaderService.mappingToDatabase(jsonReaderServicesList);
    }

    public static String scanFileName() {
        logger.info("Scanning keyboard input");
        System.out.println("enter a file to scan");
        Scanner scanInput = new Scanner(System.in);
        String text = scanInput.nextLine();
        logger.info("successful keyboard input was : " + text);
        return text;
    }

    @PostConstruct
    public void init() {
        logger.info("initializing Demo Application");
        stat_jsonReaderService = jsonReaderService;
    }
}
Here is the class which uses the @Autowired repository to save some entities, but I always get a NullPointerException on the line repository.save(...).
@Component
public class JsonReaderService {

    static Logger logger = LoggerFactory.getLogger(DemoApplication.class);

    @Autowired
    EventDataRepository repository;

    private Reader reader;
    private List<EventDataJson> eventDataList;

    @Autowired
    public JsonReaderService() {}

    public List<EventDataJson> readEventDataFromJson(Reader reader) throws IOException {
        try {
            logger.info("parsing event data from json started");
            Gson gson = new Gson();
            EventDataJson[] eventData = gson.fromJson(reader, EventDataJson[].class);
            eventDataList = Arrays.asList(eventData);
            reader.close();
        } catch (IOException e) {
            logger.error("Error while reading the json file");
            e.printStackTrace();
        }
        logger.info("parsing json eventData successful finished");
        return eventDataList;
    }

    public Boolean mappingToDatabase(List<EventDataJson> eventDataList) {
        logger.info("mapping from json to database eventData started ...");
        Set<String> idList = eventDataList.stream().map(EventDataJson::getId).collect(Collectors.toSet());
        for (String id : idList) {
            Stream<EventDataJson> filteredEventDataList1 = eventDataList.stream().filter((item) -> item.getId().equals(id));
            Stream<EventDataJson> filteredEventDataList0 = eventDataList.stream().filter((item) -> item.getId().equals(id));
            EventDataJson startedEvent = filteredEventDataList1.filter((item) -> item.getState().equals("STARTED")).findAny().orElse(null);
            EventDataJson finishedEvent = filteredEventDataList0.filter((item) -> item.getState().equals("FINISHED")).findAny().orElse(null);
            long duration0 = finishedEvent.getTimestamp() - startedEvent.getTimestamp();
            Boolean alert;
            if (duration0 > 4) {
                alert = true;
            } else {
                alert = false;
            }
            try {
                this.repository.save(new EventDataDb(id, duration0, startedEvent.getType(), startedEvent.getHost(), alert));
                logger.info("mapping to Database Repository action successful");
            } catch (Exception e) {
                logger.error("Exception in database mapping occurred");
                e.printStackTrace();
                return false;
            }
        }
        return true;
    }
}
The repository with its annotation:
@Repository
public interface EventDataRepository extends JpaRepository<EventDataDb, String> {
    EventDataDb findAllById(String id);
}
The test case works just fine with the @Autowired annotation. I don't know why it doesn't work in the main method. Is it because it is static?
@Autowired
private EventDataRepository repository;

@Autowired
private JsonReaderService jReader;

@Test
public void whenParseJson_thenTransform_and_save_to_db() throws IOException {
    BufferedReader reader = Files.newBufferedReader(Paths.get("src/main/resources/" + "logfile.txt"));
    List<EventDataJson> eventDataList1 = jReader.readEventDataFromJson(reader);
    if (jReader.mappingToDatabase(eventDataList1)) {
        EventDataDb eventDataFromDb = this.repository.findAllById("scsmbstgra");
        Assertions.assertTrue(eventDataFromDb.getType().equals("APPLICATION_LOG"));
        Assertions.assertTrue(eventDataFromDb.getHost().equals("12345"));
        Assertions.assertTrue(eventDataFromDb.getAlert().equals(true));
        Assertions.assertTrue(eventDataFromDb.getDuration() == 5);
        logger.info("Assert successfully accomplished");
    } else {
        logger.error("Could not persist eventData to DB Error");
    }
}
Stack Trace
Exception in thread "restartedMain" java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Caused by: java.lang.NullPointerException
    at com.creditsuisse.demo.DemoApplication.main(DemoApplication.java:33)
    ... 5 more

You need to call SpringApplication.run(), because that method starts the whole Spring application context. Since you don't have it in your code, the beans are never autowired and JsonReaderService is null. You can do the following in your DemoApplication. Also, since this involves taking input from the CLI, why not use CommandLineRunner, as follows:
@SpringBootApplication
public class DemoApplication implements CommandLineRunner {

    @Autowired
    private JsonReaderService jsonReaderService;

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        String textFileName = scanFileName();
        Reader reader = Files.newBufferedReader(Paths.get("src/main/resources/" + textFileName));
        List<EventDataJson> jsonReaderServicesList = jsonReaderService.readEventDataFromJson(reader);
        jsonReaderService.mappingToDatabase(jsonReaderServicesList);
    }
}
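If you prefer to keep the logic in main, another option (a minimal sketch, assuming the scanFileName() helper from the question is kept in the class) is to grab the bean from the context that SpringApplication.run() returns:

public static void main(String[] args) throws Exception {
    // run() starts the context and returns it, so beans can be looked up explicitly
    ConfigurableApplicationContext ctx = SpringApplication.run(DemoApplication.class, args);
    JsonReaderService jsonReaderService = ctx.getBean(JsonReaderService.class);

    String textFileName = scanFileName();
    try (Reader reader = Files.newBufferedReader(Paths.get("src/main/resources/" + textFileName))) {
        jsonReaderService.mappingToDatabase(jsonReaderService.readEventDataFromJson(reader));
    }
}

Either way, the key point is the same: the @Autowired field is only populated on the bean instance that Spring creates, never on the class itself, so it cannot be used from a static context before the application has started.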

Related

How to transfer a list of strings from a file to property of application.properties (using Spring boot 2.3.x)

I have an application.yaml:
app:
  list: /list.txt
I also have a file, list.txt, with a list of strings. It is located in /resources (at the root of the resources folder):
first
second
third
The class:
public class Bean {

    @Value("${app.list}")
    private List<String> listProp = new ArrayList<>();

    public void print() {
        System.out.println(listProp);
    }
}
I have found that:
public class ResourceReader {

    public static String asString(Resource resource) {
        try (Reader reader = new InputStreamReader(resource.getInputStream(), UTF_8)) {
            return FileCopyUtils.copyToString(reader);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static String readFileToString(String path) {
        ResourceLoader resourceLoader = new DefaultResourceLoader();
        Resource resource = resourceLoader.getResource(path);
        return asString(resource);
    }
}

@Configuration
public class ConfigurationResource {

    @Value("${app.list}")
    private String pathToFile;

    @Bean
    public List<String> resourceString() {
        String blackList = ResourceReader.readFileToString(pathToFile);
        return List.of(blackList.split("\n"));
    }
}

@RequiredArgsConstructor
public class HelloController {
    private final List<String> resourceString;
    ...
}
This is necessary so that I don't have to write the list of strings into the app.list property manually (there are several hundred lines).
However, I find it difficult to figure out how to do this at low cost, so that it can be easily maintained.
Maybe there is an easier way? I would not like to hardcode a value in the configuration class.
Maybe someone has some ideas?
Here is the solution, as I understand it, if you have to keep the lines in the text file you've shared:
public class Bean {

    @Value("${app.list}")
    private String listProp; // only holds the name of the file

    public void print() {
        ClassLoader classLoader = getClass().getClassLoader();
        InputStream is = classLoader.getResourceAsStream(listProp);
        StringBuilder sb = new StringBuilder();
        try {
            for (int ch; (ch = is.read()) != -1; ) {
                sb.append((char) ch);
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        System.out.println(sb);
    }
}
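A possibly simpler alternative, sketched here only as a suggestion (it assumes the file stays on the classpath and that app.list is changed to classpath:list.txt, and the ListConfig class name is made up): let Spring resolve the property into a Resource and expose the lines as a bean, so nothing is hardcoded in the configuration class.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;

@Configuration
public class ListConfig {

    // Assumes application.yaml contains:  app.list: classpath:list.txt
    @Value("${app.list}")
    private Resource listFile;

    @Bean
    public List<String> resourceString() throws IOException {
        // Read the resource line by line and expose it as a List<String> bean
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(listFile.getInputStream(), StandardCharsets.UTF_8))) {
            return reader.lines().collect(Collectors.toList());
        }
    }
}

Spring converts the resolved property value into a Resource automatically, so the file path stays in application.yaml and the configuration class never mentions it.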

Create a simple job in Spring Boot

I created a Spring Boot project.
I use Spring Data with Elasticsearch.
The whole pipeline, controller -> service -> repository, is ready.
I now have a file that represents country objects (name and isoCode), and I want to create a job to insert them all into Elasticsearch.
I read the Spring documentation and found that there's too much configuration for such a simple job.
So I'm trying to write a simple main "job" that reads a CSV, creates objects and inserts them into Elasticsearch.
But I have a bit of trouble understanding how injection would work in this case:
@Component
public class InsertCountriesJob {

    private static final String file = "D:path\\to\\countries.dat";
    private static final Logger LOG = LoggerFactory.getLogger(InsertCountriesJob.class);

    @Autowired
    public CountryService service;

    public static void main(String[] args) {
        LOG.info("Starting insert countries job");
        try {
            saveCountries();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void saveCountries() throws Exception {
        try (CSVReader csvReader = new CSVReader(new FileReader(file))) {
            String[] values = null;
            while ((values = csvReader.readNext()) != null) {
                String name = values[0];
                String iso = values[1].equals("N") ? values[2] : values[1];
                Country country = new Country(iso, name);
                LOG.info("info: country: {}", country);
                // write in db;
                // service.save(country); <= can't do this because of the injection
            }
        }
    }
}
Based on Simon's comment, here's how I resolved my problem. It might help people who are getting into Spring and trying not to get lost.
Basically, to inject anything in Spring, you need a running Spring Boot application (a @SpringBootApplication):
@SpringBootApplication
public class InsertCountriesJob implements CommandLineRunner {

    private static final String file = "D:path\\to\\countries.dat";
    private static final Logger LOG = LoggerFactory.getLogger(InsertCountriesJob.class);

    @Autowired
    public CountryService service;

    public static void main(String[] args) {
        LOG.info("STARTING THE APPLICATION");
        SpringApplication.run(InsertCountriesJob.class, args);
        LOG.info("APPLICATION FINISHED");
    }

    @Override
    public void run(String... args) throws Exception {
        LOG.info("Starting insert countries job");
        try {
            saveCountry();
        } catch (Exception e) {
            e.printStackTrace();
        }
        LOG.info("job over");
    }

    public void saveCountry() throws Exception {
        try (CSVReader csvReader = new CSVReader(new FileReader(file))) {
            String[] values = null;
            while ((values = csvReader.readNext()) != null) {
                String name = values[0];
                String iso = values[1].equals("N") ? values[2] : values[1];
                Country country = new Country(iso, name);
                LOG.info("info: country: {}", country);
                // write in db;
                service.save(country);
            }
        }
    }
}
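A small variation on the same idea, sketched here only as a suggestion: constructor injection instead of field injection makes the CountryService dependency explicit and easier to test, while everything else (imports, CSV parsing, save logic) stays exactly as above.

@SpringBootApplication
public class InsertCountriesJob implements CommandLineRunner {

    private final CountryService service;

    // Spring passes the CountryService bean in when it constructs this class
    public InsertCountriesJob(CountryService service) {
        this.service = service;
    }

    public static void main(String[] args) {
        SpringApplication.run(InsertCountriesJob.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        // same CSV-reading and service.save(...) logic as in saveCountry() above
    }
}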

Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: The namespace property is required

I am trying to read and write Parquet files as described in the Spring Data Hadoop documentation, and I get the following error:
Error creating bean with name 'datasetRepositoryFactory' defined in class path resource [com/example/demo/DatasetConfig.class]: Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: The namespace property is required
Reference project in Spring Boot: https://github.com/spring-projects/spring-hadoop-samples/tree/master/dataset
Writing data into Parquet with Spring Data Hadoop: https://docs.spring.io/spring-hadoop/docs/current/reference/htmlsingle/#springandhadoop-store
DatasetConfig.java
@Configuration
@ImportResource("hadoop-context.xml")
public class DatasetConfig {

    private @Autowired org.apache.hadoop.conf.Configuration hadoopConfiguration;

    @Bean
    public DatasetRepositoryFactory datasetRepositoryFactory() {
        DatasetRepositoryFactory datasetRepositoryFactory = new DatasetRepositoryFactory();
        datasetRepositoryFactory.setConf(hadoopConfiguration);
        datasetRepositoryFactory.setBasePath("/tmp");
        return datasetRepositoryFactory;
    }

    @Bean
    public DataStoreWriter<FileInfo> dataStoreWriter() {
        return new AvroPojoDatasetStoreWriter<FileInfo>(FileInfo.class, datasetRepositoryFactory(), fileInfoDatasetDefinition());
    }

    @Bean
    public DatasetOperations datasetOperations() {
        DatasetTemplate datasetOperations = new DatasetTemplate();
        datasetOperations.setDatasetDefinitions(Arrays.asList(fileInfoDatasetDefinition()));
        datasetOperations.setDatasetRepositoryFactory(datasetRepositoryFactory());
        return datasetOperations;
    }

    @Bean
    public DatasetDefinition fileInfoDatasetDefinition() {
        DatasetDefinition definition = new DatasetDefinition();
        definition.setFormat(Formats.PARQUET.getName());
        definition.setTargetClass(FileInfo.class);
        definition.setAllowNullValues(false);
        return definition;
    }
}
Main.java
@ComponentScan
@EnableAutoConfiguration
public class ParquetReaderApplication implements CommandLineRunner {

    private DatasetOperations datasetOperations;
    private DataStoreWriter<FileInfo> writer;
    private long count;

    @Autowired
    public void setDatasetOperations(DatasetOperations datasetOperations) {
        this.datasetOperations = datasetOperations;
    }

    @Autowired
    public void setDataStoreWriter(DataStoreWriter dataStoreWriter) {
        this.writer = dataStoreWriter;
    }

    public static void main(String[] args) {
        SpringApplication.run(ParquetReaderApplication.class, args);
    }

    @Override
    public void run(String... strings) {
        String fileDir = System.getProperty("user.home");
        System.out.println("Processing " + fileDir + " ...");
        File f = new File(fileDir);
        try {
            processFile(f);
        } catch (IOException e) {
            throw new StoreException("Error writing FileInfo", e);
        } finally {
            close();
        }
        countFileInfoEntries();
        System.out.println("Done!");
    }

    private void processFile(File file) throws IOException {
        if (file.isDirectory()) {
            for (File f : file.listFiles()) {
                processFile(f);
            }
        } else {
            if (++count % 10000 == 0) {
                System.out.println("Writing " + count + " ...");
            }
            FileInfo fileInfo = new FileInfo(file.getName(), file.getParent(), (int) file.length(), file.lastModified());
            writer.write(fileInfo);
        }
    }
Expected result -
hdfs dfs -ls /tmp/*
Found 2 items
drwxr-xr-x - spring supergroup 0 2014-06-09 17:09 /user/spring/fileinfo/.metadata
-rw-r--r-- 3 spring supergroup 13824695 2014-06-09 17:10 /user/spring/fileinfo/6876f250-010a-404a-b8c8-0ce1ee759206.avro
By default there is no namespace available to DatasetRepositoryFactory, so set the namespace explicitly.
From the docs for public void setNamespace(java.lang.String namespace): Namespace to use. Defaults to no namespace ("default" used for Kite SDK API).
@Bean
public DatasetRepositoryFactory datasetRepositoryFactory() {
    DatasetRepositoryFactory datasetRepositoryFactory = new DatasetRepositoryFactory();
    datasetRepositoryFactory.setConf(hadoopConfiguration);
    datasetRepositoryFactory.setBasePath("/tmp");
    datasetRepositoryFactory.setNamespace("default");
    return datasetRepositoryFactory;
}

How to Mock LDAP template

I am trying to mock an LdapTemplate:
@Mock
private LdapTemplate ldapTemplate;
But it's giving this error:
org.mockito.exceptions.base.MockitoException:
Mockito couldn't inject mock dependency 'ldapTemplate' on field
'private org.springframework.ldap.core.LdapTemplate com.bt.resolve.dao.CadGroupsDAOImpl.ldapTemplate'
whose type 'com.bt.resolve.dao.CadGroupsDAOImpl' was annotated by @InjectMocks in your test.
Also I failed because: null
Can somebody please explain the right way to mock LdapTemplate?
#Repository("cadGroupsDAO")
public class CadGroupsDAOImpl implements CadGroupsDAO {
private LdapTemplate ldapTemplate;
/**
* Logger for trace and errors.
*/
private static final Logger LOGGER = Logger.getLogger(CadGroupsDAOImpl.class);
long startTime = System.currentTimeMillis();
public void setLdapTemplate(LdapTemplate ldapTemplate) {
LdapContextSource contextSource = (LdapContextSource) ldapTemplate.getContextSource();
Map<String, Object> baseEnvironmentProperties = new HashMap<>();
baseEnvironmentProperties.put("java.naming.ldap.factory.socket", BlindSSLSocketFactoryTest.class.getName());
contextSource.setBaseEnvironmentProperties(baseEnvironmentProperties);
contextSource.afterPropertiesSet();
this.ldapTemplate = new LdapTemplate(contextSource);
this.ldapTemplate.setIgnorePartialResultException(true);
}
#Override
public List<LdapAttributeDTO> getCadGroups() throws UnableToFetchCadGroupsException {
LOGGER.info("Entered into CadGroupsDAOImpl.getCadGroups()");
List<LdapAttributeDTO> cadGroupList= new ArrayList<>();
// set the options
try {
SearchControls searchControls = new SearchControls();
searchControls.setSearchScope(SearchControls.SUBTREE_SCOPE);
searchControls.setTimeLimit(Integer.parseInt(SpringUtils.getProperty("LDAP_CONNECTION_TIMEOUT")));
searchControls.setCountLimit(Integer.parseInt(SpringUtils.getProperty("LDAP_CONNECTION_SEARCH_LIMIT")));
searchControls.setReturningAttributes(new String[] { "cn" });
LOGGER.debug("Obtained time out"+searchControls.getTimeLimit()+"and Count Limit"+searchControls.getCountLimit());
// set the filter
String filter = "cn=*";
// fetch results from ldap
cadGroupList = ldapTemplate.search("", filter, searchControls,
new LdapAttributeMapper());
} catch (Exception e) {
LOGGER.error("Exception occured in getCadGroups",e);
throw new UnableToFetchCadGroupsException();
}
LOGGER.info("Exiting from CadGroupsDAOImpl.getCadGroups() with List of Cad Grops:"+cadGroupList+ "And TIME_TAKEN_TO_FINISH "+ (System.currentTimeMillis() - startTime) + "MILIS");
return cadGroupList;
}
}
Here is the test I have written
@RunWith(MockitoJUnitRunner.class)
public class CadGroupsDAOTest {

    @InjectMocks
    CadGroupsDAOImpl CadGroupsDAO;

    @Mock
    private LdapTemplate ldapTemplate;

    @Mock
    private BTOSPropertyManager btPropertyManager;

    LdapContextSource contextSource = new LdapContextSource();

    @Before
    public void setUp() throws Exception {
        given(ldapTemplate.getContextSource()).willReturn(contextSource);
        SpringUtils.setBtPropertyManager(btPropertyManager);
        given(btPropertyManager.getProperty(anyString(), anyString())).will(SpringPropertyInstance.getSpringProperty());
    }

    @Test
    public void getCadGroups_Success() throws UnableToFetchCadGroupsException {
        try {
            List<LdapAttributeDTO> cadGroups = CadGroupsDAO.getCadGroups();
        } catch (Exception e) {
            // TODO Auto-generated catch block
        }
        // need to write for success scenario
    }
}

How to test a REST API and mock the URL using Spring Boot and MockBean

I have a REST API.
The test class code is:
@SpringBootTest
@RunWith(SpringRunner.class)
public class FetchCoreVersionsListIT {

    @MockBean
    private RestTemplateBuilder restTemplateBuilder;

    @MockBean
    private RestTemplate restTemplate;

    private VersionsService versionsService;

    @Autowired
    private FetchCoreVersionsList fetchCoreVersionsList;

    private VersionList versionList;
    private ArtifactoryFolderInfoChild version;

    @Before
    public void setUp() throws Exception {
        this.version = new ArtifactoryFolderInfoChild();
        this.version.setUri("2.1.0");
        this.version.setFolder(true);
        when(restTemplateBuilder.build()).thenReturn(restTemplate);
    }

    @Test
    public void testCoreVersionsJsonHandle() throws Exception {
        when(restTemplate.getForObject("https://openmrs.jfrog.io/openmrs/api/storage/public/org/openmrs/api/openmrs-api/",
                String.class))
                .thenReturn(getFileAsString("core-versions.json"));
    }
}
This is the core-versions.json; it is nothing else but the data received from this REST API.
Basically I'm trying to run a test, and I have a Spring schedule that parses the JSON received from that REST URL. While testing the schedule, I want to return the same data but without connecting to the internet, and hence want to return the contents of core-versions.json. Unfortunately, I get the following error:
java.lang.IllegalStateException: File downloaded from could not be parsed
My schedule class is this:
@Component
public class FetchCoreVersionsList {

    private final Logger logger = LoggerFactory.getLogger(getClass());

    private static final String[] STRINGS_TO_EXCLUDE = {"alpha", "beta", "RC", "SNAPSHOT"};

    @Value("${core_version_list.url}")
    private String url;

    //@Value("${core_version_list.strategy}")
    //private FetchCoreVersionsList.Strategy strategy = FetchCoreVersionsList.Strategy.FETCH;

    private RestTemplateBuilder restTemplateBuilder;
    private ObjectMapper mapper;
    private VersionsService versionsService;

    @Autowired
    public FetchCoreVersionsList(RestTemplateBuilder restTemplateBuilder,
                                 ObjectMapper mapper,
                                 VersionsService versionsService) {
        this.restTemplateBuilder = restTemplateBuilder;
        this.mapper = mapper;
        this.versionsService = versionsService;
    }

    @Scheduled(
            initialDelayString = "${scheduler.fetch_core_versions_list.initial_delay}",
            fixedDelayString = "${scheduler.fetch_core_versions_list.period}")
    public void fetchCoreVersionsList() throws Exception {
        logger.info("Fetching list of OpenMRS-Core versions");
        // FetchCoreVersionsList.Strategy strategy = FetchCoreVersionsList.Strategy.FETCH;
        String json;
        /* if (strategy == Strategy.LOCAL) {
            logger.debug("LOCAL strategy");
            json = StreamUtils.copyToString(getClass().getClassLoader().getResourceAsStream("openmrs-core-versions.json"),
                    Charset.defaultCharset());
        } else {*/
        json = restTemplateBuilder.build().getForObject(url, String.class);
        logger.info("FETCH strategy: " + json);
        ArtifactoryFolderInfo versionlist;
        try {
            logger.info("FETCH strategy: " + json);
            logger.debug("papa strategy: " + url);
            versionlist = mapper.readValue(json, ArtifactoryFolderInfo.class);
        } catch (Exception ex) {
            throw new IllegalStateException("File downloaded from " + url + " could not be parsed", ex);
        }
        if (logger.isInfoEnabled()) {
            logger.info("There are " + versionlist.getChildren().size() + " openmrs-core versions");
        }
        if (versionlist.size() > 0) {
            List<String> versions = new ArrayList<>();
            List<ArtifactoryFolderInfoChild> allversions = versionlist.getChildren();
            for (ArtifactoryFolderInfoChild candidateVersion : allversions) {
                if (candidateVersion.getFolder() && !stringContainsItemFromList(candidateVersion.getUri(), STRINGS_TO_EXCLUDE)) {
                    versions.add(candidateVersion.getUri().replaceAll("/", ""));
                }
            }
            versionsService.setVersions(versions);
        } else {
            logger.warn("File downloaded from " + url + " does not list any Core Versions to index. Keeping our current list");
        }
    }

    private static boolean stringContainsItemFromList(String inputStr, String[] items) {
        return Arrays.stream(items).parallel().anyMatch(inputStr::contains);
    }

    public enum Strategy {
        FETCH, LOCAL
    }
}
Kindly bear with me if this is a silly error as I am completely new to testing.
