JSON file to REST API in Spring Boot - Java

I have a JSON file with an array of objects, and I want to read it and create a REST API with a couple of GET methods. What are the best practices for doing so? Should I create an in-memory database (H2), save the JSON objects there, and then build the REST layer on top? I am looking for the most efficient solution.

If the data is static and you're only serving GET requests, your data layer can simply read the contents of the file into POJOs. If you later need something more sophisticated, you can always swap that implementation detail for H2 or some other database.

If your JSON file is small and does not change frequently, you do not need to put it in H2 or another database. Just read the JSON file from disk once and use it in your REST API endpoints.
Jackson is a good library for processing JSON in Spring Boot. It offers multiple ways to read and consume JSON data.
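As a minimal sketch of the read-once idea, here is a JDK-only loader that reads the file on the first request and serves the cached contents afterwards. In a real Spring Boot app you would typically deserialize into a `List` of POJOs with Jackson's `ObjectMapper` (for example in a `@PostConstruct` method) rather than cache the raw string; the class name and file handling here are illustrative.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: load the JSON file once and serve the cached contents on every request.
// In Spring Boot you would deserialize with Jackson's ObjectMapper into POJOs;
// the caching idea is the same.
public class JsonStore {
    private final Path file;
    private volatile String cached;                           // loaded once, then reused
    public final AtomicInteger diskReads = new AtomicInteger(); // only to show we read once

    public JsonStore(Path file) { this.file = file; }

    // Returns the file contents; only the first call touches the disk.
    public String get() {
        if (cached == null) {
            synchronized (this) {
                if (cached == null) {
                    try {
                        cached = Files.readString(file);
                    } catch (IOException e) {
                        throw new UncheckedIOException(e);
                    }
                    diskReads.incrementAndGet();
                }
            }
        }
        return cached;
    }

    // Helper: build a store backed by a temp file with the given contents.
    public static JsonStore withContent(String json) {
        try {
            Path tmp = Files.createTempFile("data", ".json");
            Files.writeString(tmp, json);
            return new JsonStore(tmp);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

A GET endpoint can then just return `store.get()` (or the deserialized list); no database is involved.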

Related

Is it possible to save data temporarily at runtime of a Java GUI application without any database functionality?

I want to make a library management system that stores a book name, book ID, etc. But I am restricted from using database functionality, so the data should only live during execution.
How could I go about this?
Your options are straightforward:
keep the data in memory
use a temporary file in the file system
connect to some remote service where your app stores the data (then you are free to use whatever persistence mechanism you like)
You need to decide on some type of storage medium. There are a couple of choices: you can store the data as property files, XML, or JSON.
Tools like Jackson can serialize POJOs to JSON files and deserialize them back, which makes persistence easier. Similar tools exist for XML and property files as well.
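The temporary-file option above can be sketched with nothing but the JDK by using a property file (book ID mapped to book name). For JSON you would swap in Jackson's `ObjectMapper.writeValue` / `readValue` with a list of POJOs; the file name here is illustrative.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Properties;

// Sketch: persist a simple ID -> name mapping to a property file on disk.
// Jackson would replace Properties if you preferred JSON instead.
public class BookStore {
    // Write the map out as a .properties file (creates the file if needed).
    public static void save(Map<String, String> books, Path file) {
        Properties props = new Properties();
        props.putAll(books);
        try (Writer w = Files.newBufferedWriter(file)) {
            props.store(w, "library data");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Read the .properties file back into a map.
    public static Map<String, String> load(Path file) {
        Properties props = new Properties();
        try (Reader r = Files.newBufferedReader(file)) {
            props.load(r);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        Map<String, String> books = new LinkedHashMap<>();
        props.stringPropertyNames().forEach(id -> books.put(id, props.getProperty(id)));
        return books;
    }
}
```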
Yes, two quick ways come to mind (if you are not using a database):
1) use files (searching and updating could be harder)
2) use data structures for in-memory storage (select a flexible data structure; storing data objects in an ArrayList is easy, but it depends on your requirements and the nature of the data)
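The second option above, in-memory storage, can be sketched as a tiny repository over an `ArrayList`. The `Book` record and method names are illustrative; if lookups by ID dominate, a `HashMap` keyed by ID would be the better structure.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

// Sketch: keep records in an in-memory data structure for the lifetime
// of the application; nothing is persisted when the program exits.
public class InMemoryLibrary {
    public record Book(String id, String name) {}   // Java 16+ record as a simple data holder

    private final List<Book> books = new ArrayList<>();

    public void add(Book book) {
        books.add(book);
    }

    // Linear scan; fine for small data, switch to a HashMap for large data.
    public Optional<Book> findById(String id) {
        return books.stream().filter(b -> b.id().equals(id)).findFirst();
    }

    public int size() {
        return books.size();
    }
}
```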

How to pass SQL queries in json file?

I'm creating a dynamic application with Spring Boot. I want to take SQL queries from the user so that changes can be made without modifying code, so I plan to load them from a JSON property file.
Is this a good approach, or are there alternative ways to do the same?

Spring Data MongoDB - Append binary data to existing GridFS file

I'm implementing the TUS protocol to upload large files, using GridFS as the persistence layer for the binary data. The idea is that the server receives the data in chunks and appends every new chunk to an existing resource. All the chunks have the same size except for the last one.
I found a workaround here showing how to implement it myself, but I'm wondering whether there is a way to append new chunks of binary data to an existing file using GridFsTemplate or another abstraction in the Spring Data MongoDB project.
GridFS is a MongoDB-specific implementation. While it could make sense for MongoDB's GridFS to support appendable chunks, the folks at MongoDB are the right people to talk to in the first place.
Spring Data MongoDB can only implement such functionality if the driver provides it.
Although it's possible to work with MongoDB's file chunks directly, doing so would pull implementation details into Spring Data MongoDB and bind the library to a particular implementation of GridFS. Spring Data isn't maintained by MongoDB but by the Spring team, which isn't involved in any change process happening within the scope of MongoDB. So if GridFS undergoes changes in the future, this could break Spring Data MongoDB's support for appendable chunks.

How to send data from a java application to elastic search without using the elasticsearch libraries

I saw previously answered questions, but they are old enough that I couldn't use them. I tried the example at https://www.elastic.co/blog/found-java-clients-for-elasticsearch, but the code is not organized in a way that helps me, the libraries are old, and the code gives me errors.
I looked at the Spring Data project, but it only allows a specific type of document/class to be indexed and requires the model to be predefined, which is not my use case. My goal is to build a Java web application that can ingest any data documents, feed them to Elasticsearch, and analyze them with Kibana. I need to know how I can fire a REST call or curl command for bulk data. Can anyone give a complete example, please?
Use the REST client.
The Java High Level REST Client works on top of the Java Low Level REST Client. Its main goal is to expose API-specific methods that accept request objects as arguments and return response objects, so that request marshalling and response unmarshalling are handled by the client itself.
To upload data from a Java application to Elasticsearch, you can use the Bulk API.
For the full list of APIs, check the link.
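Since the question asks for a plain REST call without Elasticsearch client libraries, here is a sketch using only the JDK's `HttpClient`. The Bulk API body is NDJSON: an action line followed by the document source, each terminated by a newline. The base URL `http://localhost:9200` and index name `docs` are assumptions for the example.

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.List;

// Sketch: talk to Elasticsearch's Bulk API with nothing but the JDK,
// no Elasticsearch client libraries.
public class BulkIndexer {
    // Build the NDJSON request body for a bulk index of raw JSON documents.
    public static String buildBulkBody(String index, List<String> jsonDocs) {
        StringBuilder body = new StringBuilder();
        for (String doc : jsonDocs) {
            body.append("{\"index\":{\"_index\":\"").append(index).append("\"}}\n");
            body.append(doc).append("\n");   // the bulk body must end with a newline
        }
        return body.toString();
    }

    // Equivalent to:
    //   curl -XPOST localhost:9200/_bulk -H 'Content-Type: application/x-ndjson' --data-binary @body
    public static HttpRequest buildRequest(String baseUrl, String body) {
        return HttpRequest.newBuilder(URI.create(baseUrl + "/_bulk"))
                .header("Content-Type", "application/x-ndjson")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        String body = buildBulkBody("docs", List.of("{\"title\":\"a\"}", "{\"title\":\"b\"}"));
        // Sending requires a running cluster:
        // java.net.http.HttpClient.newHttpClient().send(
        //         buildRequest("http://localhost:9200", body),
        //         java.net.http.HttpResponse.BodyHandlers.ofString());
        System.out.println(body);
    }
}
```

Because the request is plain HTTP, the same payload works with curl or any other HTTP client, which is what makes the documents visible to Kibana once indexed.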

Using spring hibernate, read from read-only data source and write to read/write data source

I have two data sources with exactly the same schema, but one is read-only and the other is read/write. The read-only data source is updated by an external project. I am planning to use Spring Data with Hibernate to create entity model classes, read from the read-only data source, and write to the read/write data source.
Is this doable? Are there any best practices or design patterns for it?
Take a look at: http://spring.io/blog/2007/01/23/dynamic-datasource-routing/
Spring has an AbstractRoutingDataSource that lets you define multiple data sources, have Spring pick them up, and control which ones are read from and which are written to.
I could go into more depth, but the link leads to a good discussion of it.
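The routing idea behind AbstractRoutingDataSource can be sketched in plain Java, with the data sources stubbed as JDBC URL strings so the example is self-contained. In a real app you would subclass Spring's `AbstractRoutingDataSource`, return the lookup key from `determineCurrentLookupKey()`, and register two real `DataSource` beans; the URLs and names below are assumptions for illustration.

```java
import java.util.Map;

// Sketch of the per-thread routing idea behind Spring's AbstractRoutingDataSource.
// The targets are stub strings here; in Spring they would be DataSource beans.
public class RoutingDataSource {
    public enum Mode { READ_ONLY, READ_WRITE }

    // Per-thread routing decision, typically set by an AOP aspect
    // or derived from @Transactional(readOnly = ...).
    private static final ThreadLocal<Mode> CURRENT =
            ThreadLocal.withInitial(() -> Mode.READ_WRITE);

    private final Map<Mode, String> targets;   // stub: lookup key -> JDBC URL

    public RoutingDataSource(Map<Mode, String> targets) {
        this.targets = targets;
    }

    public static void setMode(Mode mode) {
        CURRENT.set(mode);
    }

    // Mirrors determineCurrentLookupKey() followed by the target lookup.
    public String currentTarget() {
        return targets.get(CURRENT.get());
    }
}
```

With real data sources, both entity managers share the same entity model classes, so reads hit the read-only source and writes go to the read/write source without duplicating the mapping layer.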
