The order of processes when starting the application - java

I have a question about initializing data in Spring-Boot.
I have this :
@PostConstruct
public void run() {
    try {
        upload(path);
        logger.info("Seeding dictionary database...");
    } catch (IOException e) {
        //.....
    }
}
The run() method reads a .json file and fills the database with information from the file when the app starts. But I also have a .sql file that fills the database at startup, and the table created from the .sql file is related to the table populated from the .json file.
When the app starts, this line in my import.sql is executed:
INSERT INTO "USER_DICTIONARY"("DICTIONARY_ID", "USER_ID") VALUES (1, 0)
but it causes errors, because the DICTIONARY_ID doesn't exist yet: it comes from the .json file, which is loaded after import.sql. The data from the .json file is needed to correctly populate this table when the .sql file executes.
Is it possible to execute my run() method before the .sql file, or can this be solved in some other way? If so, please help me find the answer.

One option to seed a database is to use the CommandLineRunner in Spring Boot.
Example:
@Component
public class UserDataLoader implements CommandLineRunner {

    @Autowired
    UserRepository userRepository;

    @Override
    public void run(String... args) throws Exception {
        loadUserData();
    }

    private void loadUserData() {
        if (userRepository.count() == 0) {
            User user1 = new User("John", "Doe");
            User user2 = new User("John", "Cook");
            userRepository.save(user1);
            userRepository.save(user2);
        }
        System.out.println(userRepository.count());
    }
}
The CommandLineRunner interface executes just after the application starts.
Another way is to use an @EventListener that listens for the application's ContextRefreshedEvent.
Example:
@Component
public class DBSeed {

    @EventListener
    public void seed(ContextRefreshedEvent event) {
        try {
            upload(path);
            logger.info("Seeding dictionary database...");
        } catch (IOException e) {
            //.....
        }
    }
}
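Note that import.sql is executed by Hibernate while the schema is being created, i.e. before any @PostConstruct method or runner bean of your application fires, so the INSERT cannot easily be moved ahead of the JSON load. A common workaround is the reverse: drop the dependent INSERT from import.sql and issue it from a second runner that Spring orders after the JSON loader via @Order (runners with a lower order value run first). The plain-Java sketch below only illustrates that ordering idea; the runner names and tasks are hypothetical stand-ins:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class OrderedSeeding {
    // Mimics Spring's @Order semantics: runners with a lower value execute first.
    public record Runner(int order, String name, Runnable task) {}

    public static List<String> executed = new ArrayList<>();

    public static void runAll(List<Runner> runners) {
        runners.stream()
               .sorted(Comparator.comparingInt(Runner::order))
               .forEach(r -> { r.task().run(); executed.add(r.name()); });
    }

    public static void main(String[] args) {
        runAll(List.of(
            // order 2: would issue the USER_DICTIONARY insert, now moved out of import.sql
            new Runner(2, "linkUserDictionary",
                () -> System.out.println("INSERT INTO USER_DICTIONARY ...")),
            // order 1: would parse the .json file and seed the dictionary table
            new Runner(1, "loadJsonDictionary",
                () -> System.out.println("seed dictionary from JSON"))));
        System.out.println(executed); // loadJsonDictionary always runs first
    }
}
```

With real Spring beans the same effect comes from annotating the two CommandLineRunner components with @Order(1) and @Order(2).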

Related

Spring boot keep properties even after new deploy

Currently, I am sending app crashes logs of Android app via HTTP to my server (acra) and my server saves them in properties like this:
@RestController
public class EndlessBlowReportController {

    public int counter;

    @Autowired
    public static final Properties defaultProperties = new Properties();

    @PostMapping("/add_report")
    public void addReport(@RequestBody String report) {
        try {
            JSONObject jsonObject = new JSONObject(report);
            defaultProperties.put(counter, report);
            counter++;
        } catch (Exception ex) {
            System.out.println(ex.getMessage());
        }
    }

    @GetMapping("/get_reports")
    public List<String> getReports() {
        List<String> reports = new ArrayList<>();
        try {
            for (int i = 0; i < defaultProperties.size(); i++) {
                reports.add((String) defaultProperties.get(i));
            }
        } catch (Exception ex) {
        }
        return reports;
    }
}
and it works fine until I deploy a new version of the server.
How can I keep my properties even after deploy?
The properties are only stored in memory and won't be persisted to any permanent storage, such as a file or database. My recommendation would be not to store this information in properties, but instead in a database, or alternatively on disk as a file.
For example, if you went with the file solution, you could load the file during startup and update it each time you receive a new report. That way the information persists and won't disappear each time you restart your server.
I hope you find this answer helpful.
Good luck!
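A minimal sketch of the file-based approach, using java.util.Properties backed by a file so the reports survive a redeploy (the class and file name are made up for illustration; a real server would wire this into the controller instead of the in-memory Properties):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

public class ReportStore {
    private final Path file;
    private final Properties props = new Properties();
    private int counter;

    public ReportStore(Path file) throws IOException {
        this.file = file;
        if (Files.exists(file)) {
            // Reload reports that were saved before the restart.
            try (InputStream in = Files.newInputStream(file)) {
                props.load(in);
            }
            counter = props.size();
        }
    }

    public synchronized void addReport(String report) throws IOException {
        props.setProperty(Integer.toString(counter++), report);
        // Rewrite the file on every report so nothing is lost on shutdown.
        try (OutputStream out = Files.newOutputStream(file)) {
            props.store(out, "crash reports");
        }
    }

    public synchronized List<String> getReports() {
        List<String> reports = new ArrayList<>();
        for (int i = 0; i < counter; i++) {
            reports.add(props.getProperty(Integer.toString(i)));
        }
        return reports;
    }
}
```

Rewriting the whole file per report is simple but O(n) per write; for high report volumes, appending to a log file or using a database would scale better.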

spring boot execution of two dependent methods

I am developing an application using Spring Boot as the framework. I have two methods: the first deletes data from the database and the other deletes a folder from the disk. If I delete from the database but can't delete from the disk, the whole operation should fail. How can I do that with Spring Boot?
@Override
public ResponseEntity<?> delete(Long id) {
    return libelleRepository.findById(id).map(libelle -> {
        libelleRepository.delete(libelle);
        return ResponseEntity.ok().build();
    }).orElseThrow(() -> new GeneralResourceNotFoundException("Libelle not found with id " + id));
}
You can use Spring's @Transactional for this.
Here is sample code I have tried. It performs a database operation followed by a file operation; in my example I create the file before performing the database operation and use a TransactionSynchronizationAdapter to clean the file up if the transaction does not commit.
Code:
@Autowired
private UserService userService;

@Transactional
public String doFileOperation() {
    File testFile = new File("C:\\test.txt");
    TxnListener transactionListener = new TxnListener(testFile);
    TransactionSynchronizationManager.registerSynchronization(transactionListener);

    // DB operation
    userService.addUser();

    // File operation
    List<String> lines = Arrays.asList("1st line", "2nd line");
    try {
        Files.write(Paths.get(testFile.getPath()),
                lines,
                StandardCharsets.UTF_8,
                StandardOpenOption.CREATE,
                StandardOpenOption.APPEND);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return "";
}
public class TxnListener extends TransactionSynchronizationAdapter {

    private File outputFile;

    public TxnListener(File outputFile) {
        this.outputFile = outputFile;
    }

    @Override
    public void afterCompletion(int status) {
        if (STATUS_COMMITTED != status) {
            if (outputFile.exists()) {
                if (!outputFile.delete()) {
                    System.out.println("Could not delete file " + outputFile.getPath() + " after failed transaction");
                }
            }
        }
    }
}
If an exception occurs during the database operation, afterCompletion will be called with a non-committed status and the file will be deleted.
This way you can maintain the atomicity of the operation.
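Stripped of Spring, the compensation pattern above boils down to: do the file work, and undo it if the database step fails. A minimal standalone sketch of that idea, where the dbOperation callback is a hypothetical stand-in for the real repository call:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class AtomicFileAndDb {
    // Writes the file, then runs the DB step; deletes the file if the DB step
    // fails, mirroring what afterCompletion does on rollback in the Spring version.
    public static void writeWithCompensation(Path file, String content, Runnable dbOperation)
            throws IOException {
        Files.writeString(file, content);
        try {
            dbOperation.run();
        } catch (RuntimeException e) {
            Files.deleteIfExists(file);   // compensate: undo the file side
            throw e;
        }
    }
}
```

Note the compensation is best-effort: if the process crashes between the write and the delete, the file is left behind, which is why the Spring version also checks exists() before deleting.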

multiple batch queries within transaction template

I am trying to delete every 100 records read from a file across three tables, using Spring JDBC batch delete. If I wrap the logic inside a TransactionTemplate, will it work as expected? For example, say I create three batches out of 300 records and wrap the logic in a transaction: will the transaction roll back the first and second batches if the third batch hits a problem? My code is included below for reference. Is it correct?
TransactionTemplate txnTemplate = new TransactionTemplate(txnManager);
txnTemplate.execute(new TransactionCallbackWithoutResult() {
    @Override
    public void doInTransactionWithoutResult(final TransactionStatus status) {
        try {
            deleteFromABCTable(jdbcTemplate, successList);
            deleteFromDEFTable(jdbcTemplate, successList);
            deleteFromXYZTable(jdbcTemplate, successList);
        } catch (Exception e) {
            status.setRollbackOnly();
            throw e;
        }
    }
});
My delete methods:

private void deleteFromABCTable(JdbcTemplate jdbcTemplate, List<String> successList) {
    try {
        jdbcTemplate.batchUpdate(
                "delete from ABC where document_id in (select document_id from ABC where item in (?))",
                new BatchPreparedStatementSetter() {
                    @Override
                    public void setValues(PreparedStatement ps, int i) throws SQLException {
                        ps.setString(1, successList.get(i));
                    }

                    @Override
                    public int getBatchSize() {
                        return successList.size();
                    }
                });
    } catch (Exception e) { }
}

Shutting down spring boot application on db table space error

I have the following scheduled piece of code in my Spring Boot Application:
@Scheduled(fixedDelay = DELAY_SECONDS)
private void processJobQueue() {
    BlockingQueue<ReportDeliverable> jobQueue = JobQueueFactory.getJobQueueInstance();
    while (!jobQueue.isEmpty()) {
        //do stuff
        if (rCount == 0) {
            status = send(reportDeliverable);
            if (status == TransferStatus.FAILURE) {
                populateQueue(reportDeliverable);
            }
            if (status == TransferStatus.SUCCESS) { //write the metadata to database
                int i = dbMetadataWriter.writeMetadata(reportDeliverable);
            }
        } else if (rCount == -1) {
            populateQueue(reportDeliverable);
        } else {
            logger.info("Record exists in MetaData for {}. Ignoring the file transfer....", reportDeliverable.getFilePath());
        }
    }
}
In my DBMetadataWriter component, the writeMetadata() method looks something like this:
@Component
public class DBMetadataWriter {

    public int writeMetadata(final ReportDeliverable reportDeliverable) {
        int nbInserted = 0;
        try {
            nbInserted = jdbcTemplate.update(PORTAL_METADATA_INSERT, insertDataValues);
        } catch (Exception e) {
            logger.error("Could not insert metadata for {}, Exception: {} ", reportDeliverable.toString(), e.getMessage());
        }
        return nbInserted;
    }
}
In some cases, when the insert runs, I get tablespace issues with the database, at which point I think it would be wise to shut down the Spring Boot application until the tablespace problem is resolved.
What would be the correct way to handle these rare cases? What technique can I use to gracefully shut down the Spring Boot application, and how can I do it in the above code?
My entry point class where I initially validate all my database connections before processing etc has the following...
@Component
public class RegisterReportSchedules implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private ApplicationContext applicationContext;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
    }

    private void shutdownApplication() {
        int exitCode = SpringApplication.exit(applicationContext, (ExitCodeGenerator) () -> 0);
        System.exit(exitCode);
    }
}
The SpringApplication class has an exit() method, which can be used for exiting a Spring Boot application gracefully.
It requires two parameters:
ApplicationContext
ExitCodeGenerator
For further reading:
https://docs.spring.io/spring-boot/docs/current/api/org/springframework/boot/SpringApplication.html#exit-org.springframework.context.ApplicationContext-org.springframework.boot.ExitCodeGenerator...-
Code Example:
@Autowired
public void shutDown(ExecutorServiceExitCodeGenerator exitCodeGenerator) {
    SpringApplication.exit(applicationContext, exitCodeGenerator);
}
Call this method when you get the exception for no table space.
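To decide when to trigger the shutdown, one approach (a sketch, not the only way) is to walk the caught exception's cause chain for your database's tablespace error marker and, on a match, invoke the shutdownApplication() method from your listener instead of only logging. "ORA-01653" is Oracle's "unable to extend table ... in tablespace" code and is used here purely as an example; match whatever your database reports:

```java
public class TablespaceErrorDetector {
    // Walks the cause chain looking for a tablespace-related message.
    // "ORA-01653" is Oracle-specific; adjust the marker for your database.
    public static boolean isTablespaceError(Throwable t) {
        for (Throwable cur = t; cur != null; cur = cur.getCause()) {
            String msg = cur.getMessage();
            if (msg != null && msg.contains("ORA-01653")) {
                return true;
            }
        }
        return false;
    }
}
```

In writeMetadata() the catch block would then become: log the error, and if isTablespaceError(e) is true, call the exit logic rather than swallowing the exception and returning 0.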

Play framework: How to apply database evolutions to a fake application test database

I'm using Play Framework 2.4 together with Ebean data models and evolutions. At the moment I'm trying to write the first test for my models using a test MySQL database. I want to have a clean test database before I start my tests.
So far I tried this:
public class ModelUnitTest {

    public static FakeApplication app;

    @BeforeClass
    public static void startApp() {
        // Load test config with test server
        Map<String, String> settings = new HashMap<String, String>();
        settings.put("db.default.driver", "com.mysql.jdbc.Driver");
        settings.put("db.default.url", "jdbc:mysql://127.0.0.1:3306/sopra-ws1516-team10-test?autoReconnect=true&useSSL=false");
        settings.put("db.default.username", "root");
        settings.put("db.default.password", "root");
        settings.put("play.evolutions.autoApply", "true");
        settings.put("ebean.default", "models.ebean.*");
        app = play.test.Helpers.fakeApplication(settings);
        Evolutions.applyEvolutions(XXXXXX);
        Helpers.start(app);
    }

    @Test
    public void testCreateUser() {
        try {
            User testUser = User.createUser("Max", "Musterman", "01.01.1980", "max.mustermann@gmail.com", "1234");
            assertNotNull(testUser);
        } catch (Exception e) {
            assertNull(e);
        }
    }

    @AfterClass
    public static void stopApp() {
        Evolutions.cleanupEvolutions(XXXXXX);
        Helpers.stop(app);
    }
}
Without the evolutions it works, but the problem is that I cannot be sure to have an empty database when I start my tests. My solution would be to apply evolutions before and clean up evolutions after my tests, which would give me a clean environment.
But it is actually not possible to use the Evolutions class without a Database object, and there seems to be no way to create one from the DB connection my FakeApplication has created.
Does anyone have a solution for this problem?
I got around it by manually invoking the injector:
private static Database db;

@BeforeClass
public static void startApp() {
    ...
    app = play.test.Helpers.fakeApplication(settings);
    db = app.injector().instanceOf(Database.class);
    Evolutions.applyEvolutions(db);
    Helpers.start(app);
}

@AfterClass
public static void stopApp() {
    Evolutions.cleanupEvolutions(db);
    Helpers.stop(app);
}
