I have a Spring Boot application that stores payment information in the database. The application has these endpoints:
GET /api/orders - get orders by filter
POST /api/orders - add a new order
PUT /api/orders - update order
DELETE /api/orders - delete order
These endpoints are not secured themselves, and I do not want to secure them at the application level. All traffic from the user goes through an HTTPS proxy, which decrypts it and forwards it to the application.
However, I use the MongoDB Atlas free tier for prototyping:
https://www.mongodb.com/cloud/atlas/faq
All the data I put into the database must be encrypted. Even the document structure (field names and types) must be encrypted.
Encrypting a single field does not work for me, because I want to encrypt the whole document.
I do not want to use unofficial libraries like the one below (since no one guarantees that the library is secure):
<dependency>
<groupId>com.bol</groupId>
<artifactId>spring-data-mongodb-encrypt</artifactId>
<version>1.3.0</version>
</dependency>
One idea that came to me is to configure the application to use a password (somehow configured or generated at a given moment in time).
/**
 * Provides a password to encrypt a document.
 **/
// TODO How to do it better? I still have to improve it.
@Component
public class PasswordProviderImpl implements PasswordProvider {

    private static final byte[] MASTER_PASSWORD = {1, 11, 37, (byte) 166, 11, 77};

    @Autowired
    private Environment environment;

    // I do not care about the implementation yet
    public char[] getPassword() {
        return environment.getProperty("applicationPassword").toCharArray();
    }
}
/**
 * Encrypts the input byte array with the provided password; afterwards, it clears the input
 * data and the password by overwriting them with zeros.
 **/
@Component
public class Encryptor {

    @Autowired
    private PasswordProvider passwordProvider;

    public byte[] encrypt(final byte[] unencrypted) {
        final char[] password = passwordProvider.getPassword();
        final byte[] encrypted = xor(unencrypted, password);
        makeZeros(password);
        makeZeros(unencrypted);
        return encrypted;
    }

    private void makeZeros(final byte[] array) { /* Implementation */ }

    private void makeZeros(final char[] array) { /* Implementation */ }

    private byte[] xor(final byte[] data, final char[] password) { /* Implementation */ }
}
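Since the TODO above asks how to do this better: XOR with a repeating password is not a real cipher, so as a hedged suggestion, here is a minimal sketch of an encryptor based on AES-GCM from the JDK's javax.crypto. The key-derivation parameters, the fixed demo salt, and the class name are illustrative assumptions, not a vetted design:
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.SecureRandom;
import java.security.spec.KeySpec;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class AesGcmEncryptor {

    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    @Autowired
    private PasswordProvider passwordProvider;

    private final SecureRandom random = new SecureRandom();

    public byte[] encrypt(final byte[] plaintext) {
        try {
            // Derive a 256-bit AES key from the configured password (PBKDF2; the fixed salt
            // is only for illustration - a real setup would manage the salt properly).
            final SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
            final KeySpec spec = new PBEKeySpec(passwordProvider.getPassword(),
                    "demo-salt".getBytes(StandardCharsets.UTF_8), 65_536, 256);
            final SecretKey key = new SecretKeySpec(factory.generateSecret(spec).getEncoded(), "AES");

            // Fresh random IV per document, prepended to the ciphertext so decryption can find it.
            final byte[] iv = new byte[IV_BYTES];
            random.nextBytes(iv);
            final Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
            final byte[] ciphertext = cipher.doFinal(plaintext);

            final byte[] out = new byte[iv.length + ciphertext.length];
            System.arraycopy(iv, 0, out, 0, iv.length);
            System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
            return out;
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException("Encryption failed", e);
        }
    }
}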
/**
* Represents a unit of data in my application. Its values, fields, and structure - everything should be encrypted.
**/
public class Order {
private ObjectId id;
private Instant createdDate;
private Instant updatedDate;
private Money amount;
private String additionalDetails;
// getters, setters, constructor
}
The converter (I found how to register it in "Set MongoDb converter programmatically"):
/**
 * It is declared in the application configuration. It defines how to store {@link Order} in MongoDB.
 **/
@Component
public class OrderConverter implements Converter<Order, SecuredOrder> {

    @Autowired
    private ObjectMapper objectMapper;
    @Autowired
    private Encryptor encryptor;
    @Autowired
    private Base64Converter base64Converter;

    @Override
    public SecuredOrder convert(Order source) {
        try {
            final String json = objectMapper.writeValueAsString(source);
            final byte[] unencrypted = json.getBytes(StandardCharsets.UTF_8);
            final byte[] encrypted = encryptor.encrypt(unencrypted);
            final String payload = base64Converter.toBase64(encrypted);
            return SecuredOrder.of(source.getId(), payload);
        } catch (JsonProcessingException e) {
            throw new IllegalStateException("Could not serialize order", e);
        }
    }
}
public class SecuredOrder {
private ObjectId id; // same as the order id
private String encryptedPayload; // produced by the converter
}
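For reading documents back there would also have to be a converter in the opposite direction. A minimal sketch, where the Decryptor and the Base64Converter.fromBase64 method are assumed counterparts of the classes above (they are not shown in the original code):
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;
import com.fasterxml.jackson.databind.ObjectMapper;

@Component
public class SecuredOrderConverter implements Converter<SecuredOrder, Order> {

    @Autowired
    private ObjectMapper objectMapper;
    @Autowired
    private Decryptor decryptor;             // assumed counterpart of Encryptor
    @Autowired
    private Base64Converter base64Converter; // assumed to expose fromBase64(String)

    @Override
    public Order convert(SecuredOrder source) {
        try {
            final byte[] encrypted = base64Converter.fromBase64(source.getEncryptedPayload());
            final byte[] json = decryptor.decrypt(encrypted);
            final Order order = objectMapper.readValue(new String(json, StandardCharsets.UTF_8), Order.class);
            order.setId(source.getId());
            return order;
        } catch (IOException e) {
            throw new IllegalStateException("Could not decrypt order " + source.getId(), e);
        }
    }
}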
@Service
public class OrderService {

    @Autowired
    private OrderRepository orderRepository;

    public void saveOrder(Order order) {
        orderRepository.save(order);
    }
}
If you have done a similar thing, please give me a direction. I would really like to do it properly.
Also, MongoDB provides its own encryption mechanisms, so maybe I should use those?
https://docs.mongodb.com/manual/core/security-encryption-at-rest/
https://docs.mongodb.com/manual/core/security-encryption-at-rest/#encrypted-storage-engine
If you're able to use the WiredTiger storage engine and you are using MongoDB 3.2 or above, you can utilize its Encryption at Rest capability (as you mentioned at the bottom of your post!), but be advised this is available in the Enterprise edition only.
Related
I have a Spring Boot project (version 2.5.5) and I'm using the spring-boot-starter-data-mongodb dependency to work with MongoDB.
I have a bean with these fields:
@Document(collection = "user_data")
public class UserData {

    @Id
    private String id;

    @Field("is_active")
    private Boolean isActive;

    @Field("organization_id")
    private String organizationId;

    @Field("system_mode")
    private SystemMode systemMode;

    @Field("first_name")
    private String firstName;

    @Field("last_name")
    private String lastName;
}
(Also with constructors, getters and setters, but I omitted them for simplicity.)
I also have a matching repository:
@Repository
public interface UsersDataRepository extends MongoRepository<UserData, String> {
}
Now the fields firstName and lastName are in fact encrypted and stored in the database as Binary type.
When I try to do, say,
Optional<UserData> optionalUserData = usersDataRepository.findById(userId);
I get an error stating that it failed to convert from Binary to String, which makes sense because the fields are encrypted.
In the database I have a key_vault collection that contains the keys to decrypt.
So how can I add MongoDB client side field level decryption using the above setup so that I can get the fields decrypted and use them in my project?
I followed this guide and got a working solution for my case:
https://blog.contactsunny.com/tech/encrypting-and-decrypting-data-in-mongodb-with-a-springboot-project
In a nutshell, create a component that will handle encrypting and decrypting fields.
Create two event listener classes that will listen to mongo save and get from database events.
My MongoDBAfterLoadEventListener ended up like this,
note that it currently only works for strings:
public class MongoDBAfterLoadEventListener extends AbstractMongoEventListener<Object> {
@Autowired
private EncryptionUtil encryptionUtil;
@Override
public void onAfterLoad(AfterLoadEvent<Object> event) {
Document eventObject = event.getDocument();
List<String> keysToDecrypt = encryptionUtil.ENCRYPTED_FIELDS_MAP.get(event.getCollectionName());
if (keysToDecrypt == null || keysToDecrypt.isEmpty()) {
return;
}
for (String key : eventObject.keySet()) {
if (keysToDecrypt.contains(key)) {
Binary encrypted = (Binary) eventObject.get(key);
BsonBinary bsonBinary = new BsonBinary(encrypted.getData());
BsonValue decrypted = this.encryptionUtil.decryptText(bsonBinary);
eventObject.put(key, decrypted.asString().getValue());
}
}
super.onAfterLoad(event);
}
}
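The counterpart on the write side is not shown in the snippet above. A sketch of how it could look, assuming EncryptionUtil also exposes an encryptText(String) method returning a BsonBinary (as in the linked guide):
public class MongoDBBeforeSaveEventListener extends AbstractMongoEventListener<Object> {

    @Autowired
    private EncryptionUtil encryptionUtil;

    @Override
    public void onBeforeSave(BeforeSaveEvent<Object> event) {
        Document eventObject = event.getDocument();
        List<String> keysToEncrypt = encryptionUtil.ENCRYPTED_FIELDS_MAP.get(event.getCollectionName());
        if (keysToEncrypt == null || keysToEncrypt.isEmpty()) {
            return;
        }
        for (String key : eventObject.keySet()) {
            if (keysToEncrypt.contains(key)) {
                // Encrypt the plain String and store it as BSON Binary, which is what
                // onAfterLoad above expects to decrypt.
                BsonBinary encrypted = encryptionUtil.encryptText(eventObject.getString(key));
                eventObject.put(key, new Binary(encrypted.getData()));
            }
        }
        super.onBeforeSave(event);
    }
}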
Use MongoDB GridFsTemplate to save, retrieve and delete the binary files. https://www.youtube.com/watch?v=7ciWYVx3ZrA&t=1267s
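For completeness, a small sketch of what the GridFsTemplate approach mentioned above might look like. Bean wiring and names are illustrative; the method names (store, findOne, getResource, delete) are those of recent Spring Data MongoDB versions:
import java.io.InputStream;
import org.bson.types.ObjectId;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.gridfs.GridFsResource;
import org.springframework.data.mongodb.gridfs.GridFsTemplate;
import org.springframework.stereotype.Service;
import com.mongodb.client.gridfs.model.GridFSFile;

@Service
public class BinaryFileService {

    @Autowired
    private GridFsTemplate gridFsTemplate;

    public ObjectId save(InputStream content, String filename) {
        // Streams the content into GridFS chunks instead of holding a byte[] in memory.
        return gridFsTemplate.store(content, filename, "application/octet-stream");
    }

    public GridFsResource read(String filename) {
        GridFSFile file = gridFsTemplate.findOne(new Query(Criteria.where("filename").is(filename)));
        return gridFsTemplate.getResource(file);
    }

    public void delete(String filename) {
        gridFsTemplate.delete(new Query(Criteria.where("filename").is(filename)));
    }
}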
So I have this case where the JPA entity is encrypted using its own ID as the salt.
Here is an example of doing the en/decrypt without annotation, I have to "manually" create custom get/setter to each encrypted fields.
StandardDbCipher is just my cipher class that accepts a salt during construction (which in this case is the ID field). The password is already fixed in some other file.
@Entity
public class Applicant implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private String id;
private String profilePic;
private String contact;
private String personalInfo;
@Transient
private StandardDbCipher cipher;
private StandardDbCipher getCipher() {
if (cipher == null) {
cipher = new StandardDbCipher(id);
}
return cipher;
}
private String encrypt(String plain) {
return getCipher().encrypt(plain);
}
private String decrypt(String crypt) {
return getCipher().decrypt(crypt);
}
public String getProfilePic() {
return decrypt(profilePic);
}
public void setProfilePic(String profilePic) {
this.profilePic = encrypt(profilePic);
}
public String getContact() {
return decrypt(contact);
}
public void setContact(String contact) {
this.contact = encrypt(contact);
}
public String getPersonalInfo() {
return decrypt(personalInfo);
}
public void setPersonalInfo(String personalInfo) {
this.personalInfo = encrypt(personalInfo);
}
}
I would like to simplify the code and reduce boilerplate using @Converter, but I couldn't figure out how to pass the ID as the salt. Any ideas? Maybe another annotation?
If it's something that you need to do in many entities, then I think you can try aspect-oriented programming (the best-known implementation is AspectJ). Spring also has an integration with it (I've not worked with that, as I'm not using Spring). The idea is that you have some intercepting code that is executed before or after calls to methods of your objects (in your case the getter/setter methods of your entities), and inside it you can manipulate the actual object, parameters, and return values.
You can call your encrypt method before the execution of the setter method and pass the encrypted value to the setter. For decryption, you run your decrypt method after the execution of the getter method.
By doing so, your entities remain simple POJOs and you don't need to provide a converter for each.
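A rough, hypothetical sketch of that idea. Full AspectJ weaving (compile-time or load-time) would be needed here, because JPA entities are not Spring-managed beans; the package name is made up and the cipher usage mirrors the StandardDbCipher from the question:
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class FieldCryptoAspect {

    // Encrypt the value on its way into the String setters of the entity.
    @Around("execution(* com.example.Applicant.set*(String))")
    public Object encryptOnSet(ProceedingJoinPoint pjp) throws Throwable {
        Applicant applicant = (Applicant) pjp.getTarget();
        String plain = (String) pjp.getArgs()[0];
        String encrypted = new StandardDbCipher(applicant.getId()).encrypt(plain);
        return pjp.proceed(new Object[] { encrypted });
    }

    // Decrypt the stored value on its way out of the String getters, excluding getId().
    @Around("execution(String com.example.Applicant.get*()) && !execution(* com.example.Applicant.getId())")
    public Object decryptOnGet(ProceedingJoinPoint pjp) throws Throwable {
        Applicant applicant = (Applicant) pjp.getTarget();
        String stored = (String) pjp.proceed();
        return stored == null ? null : new StandardDbCipher(applicant.getId()).decrypt(stored);
    }
}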
Here are some tutorials demonstrating the AOP concept:
@AspectJ Based AOP with Spring
Introduction to Spring AOP
Update:
Another solution could be to use JPA Entity Listeners. You can do the encryption on @PrePersist and the decryption on @PostLoad callbacks in your entity or use a single listener class for all such entities. You would just need to annotate your POJOs like this:
@Entity
@EntityListeners(EncDecListener.class)
public class Applicant implements Serializable {
}

public class EncDecListener {

    @PrePersist
    @PreUpdate
    public void encrypt(Applicant a) {
        // do encryption
    }

    @PostLoad
    public void decrypt(Applicant a) {
        // do decryption
    }
}
I've created a Spring application for CRUD. I can easily write data such as String, Long, and BLOB to the server, but when I try to retrieve it, the byte array comes back from the server as a BigInteger. How can I get the data as a byte array instead of a BigInteger? The column I insert the byte array into is a BLOB. Here is my code.
Repository
public interface ArriveRepository extends JpaRepository<ArriveEntity,Long>
{
@Query(value = "select arrive.time,air_lines.image,arrive.flight,arrive.destination_uzb," +
"arrive.destination_eng,arrive.destination_rus,arrive.status,arrive.status_time " +
"from arrive inner join air_lines on air_lines.id = arrive.airline_id where arrive.arrive_date = (:date1)",nativeQuery = true)
List<Object[]> getForArriveTerminal(@Param("date1") LocalDate date1);
}
When I retrieve data from the server I'm using this class:
ArriveTerminalDto
public class ArriveTerminalDto {
private String time;
private BigInteger logo;
private String flight;
private String destinationUzb;
private String destinationEng;
private String destinationRus;
private String status;
private String statusTime;
// getters and setters
}
Service class
public List<ArriveTerminalDto> getToShow(LocalDate date1)
{
List<ArriveTerminalDto> list = new ArrayList<>();
List<Object[]> list1 = arriveRepository.getForArriveTerminal(date1);
for(Object[] objects: list1)
{
ArriveTerminalDto arriveTerminalDto = new ArriveTerminalDto();
arriveTerminalDto.setTime((String)objects[0]);
arriveTerminalDto.setLogo((BigInteger) objects[1]);
arriveTerminalDto.setFlight((String) objects[2]);
arriveTerminalDto.setDestinationUzb((String) objects[3]);
arriveTerminalDto.setDestinationEng((String) objects[4]);
arriveTerminalDto.setDestinationRus((String) objects[5]);
arriveTerminalDto.setStatus((String) objects[6]);
arriveTerminalDto.setStatusTime((String) objects[7]);
list.add(arriveTerminalDto);
}
return list;
}
This code works, but it doesn't give me a byte array from the server.
When I try to change BigInteger into byte[], it gives me the following error in Postman:
{
"timestamp": "2019-01-28T09:33:52.038+0000",
"status": 500,
"error": "Internal Server Error",
"message": "java.math.BigInteger cannot be cast to [B",
"path": "/arrive/terminal/date=2019-01-27"
}
I changed Object[] into ArriveTerminalDto, but it still gives an error with the following repo:
public interface ArriveRepository extends JpaRepository<ArriveEntity,Long>
{
@Query(value = "select arrive.time,air_lines.image,arrive.flight,arrive.destination_uzb," +
"arrive.destination_eng,arrive.destination_rus,arrive.status,arrive.status_time " +
"from arrive inner join air_lines on air_lines.id = arrive.airline_id where arrive.arrive_date = (:date1)",nativeQuery = true)
List<ArriveTerminalDto> getForArriveTerminal(@Param("date1") LocalDate date1);
}
Why don't you take a look at the Spring Content community project? This project allows you to associate content with Spring Data entities. Think Spring Data, but for content, i.e. unstructured data. This can also give you REST endpoints for the content as well, like Spring Data REST.
This approach will give you a clear abstraction for your content with implementations for many different types of storage. It is stream-based, rather than byte-based. Using byte[] won't work if you want to transfer very large files. Also getting databases to stream properly is very idiosyncratic. You probably don't want to figure all that out yourself when Spring Content already has.
This is pretty easy to add to your existing projects. I am not sure whether you are using Spring Boot or not, so I'll give a non-Spring Boot example:
pom.xml
<!-- Java API -->
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>spring-content-jpa</artifactId>
<version>0.5.0</version>
</dependency>
<!-- REST API (if you want it) -->
<dependency>
<groupId>com.github.paulcwarren</groupId>
<artifactId>spring-content-rest</artifactId>
<version>0.5.0</version>
</dependency>
Configuration
@Configuration
@EnableJpaStores
@Import(org.springframework.content.rest.config.RestConfiguration.class)
public class ContentConfig {
// schema management
//
@Value("/org/springframework/content/jpa/schema-drop-mysql.sql")
private Resource dropContentTables;
@Value("/org/springframework/content/jpa/schema-mysql.sql")
private Resource createContentTables;
@Bean
DataSourceInitializer datasourceInitializer() {
ResourceDatabasePopulator databasePopulator =
new ResourceDatabasePopulator();
databasePopulator.addScript(dropContentTables);
databasePopulator.addScript(createContentTables);
databasePopulator.setIgnoreFailedDrops(true);
DataSourceInitializer initializer = new DataSourceInitializer();
initializer.setDataSource(dataSource());
initializer.setDatabasePopulator(databasePopulator);
return initializer;
}
}
To associate content, add Spring Content annotations to your ArriveEntity.
ArriveEntity.java
@Entity
public class ArriveEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // ... existing fields ...

    @ContentId
    private String contentId;

    @ContentLength
    private long contentLength = 0L;

    // if you have rest endpoints
    @MimeType
    private String mimeType = "text/plain";
}
Create a "store":
ArrivEntityContentStore.java
@StoreRestResource(path = "arriveEntityContent")
public interface ArrivEntityContentStore extends ContentStore<ArriveEntity, String> {
}
This is all you need to create REST endpoints at /arriveEntityContent. When your application starts, Spring Content will look at your dependencies (seeing Spring Content JPA/REST), look at your ArrivEntityContentStore interface, and inject an implementation of that interface for JPA. It will also inject a @Controller that forwards HTTP requests to that implementation. This saves you having to implement any of this yourself, which I think is what you are after.
So...
To access content with a Java API, auto-wire ArrivEntityContentStore and use its methods; for example:
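Here is a sketch of that; the setContent/getContent method names are the documented ContentStore operations, while the service class and everything else around them is illustrative:
@Service
public class ArriveEntityContentService {

    @Autowired
    private ArrivEntityContentStore contentStore;

    @Autowired
    private ArriveRepository arriveRepository;

    public void attachImage(Long arriveId, InputStream image) {
        ArriveEntity entity = arriveRepository.findById(arriveId)
                .orElseThrow(IllegalArgumentException::new);
        contentStore.setContent(entity, image);  // streams the bytes into the JPA store
        arriveRepository.save(entity);           // persists the contentId/contentLength set on the entity
    }

    public InputStream readImage(Long arriveId) {
        ArriveEntity entity = arriveRepository.findById(arriveId)
                .orElseThrow(IllegalArgumentException::new);
        return contentStore.getContent(entity);
    }
}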
Or to access content with a REST API:
curl -X POST /arriveEntityContent/{arriveEntityId}
with a multipart/form-data request will store the image in the database and associate it with the arrive entity whose id is arriveEntityId.
curl /arriveEntityContent/{arriveEntityId}
will fetch it again and so on...supports full CRUD.
There are a couple of getting started guides here. The reference guide is here. And there is a tutorial video here. The coding bit starts about 1/2 way through.
HTH
Try to change the entity definition to handle byte[] directly, but hint JPA to interpret it as a LOB. You can do it with the @Lob annotation:
public class ArriveTerminalDto {
private String time;
@Lob
private byte[] logo;
private String flight;
private String destinationUzb;
private String destinationEng;
private String destinationRus;
private String status;
private String statusTime;
}
Later, as @Clijsters suggested, you can change your repo to return List<ArriveTerminalDto>.
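One possible shape of that, sketched here as a Spring Data interface-based projection over the native query. The column aliases and whether your driver maps the BLOB column to byte[] are assumptions to verify against your setup:
public interface ArriveTerminalView {
    String getTime();
    byte[] getImage();
    String getFlight();
    String getDestinationUzb();
    String getDestinationEng();
    String getDestinationRus();
    String getStatus();
    String getStatusTime();
}

public interface ArriveRepository extends JpaRepository<ArriveEntity, Long> {

    @Query(value = "select arrive.time as time, air_lines.image as image, arrive.flight as flight, " +
            "arrive.destination_uzb as destinationUzb, arrive.destination_eng as destinationEng, " +
            "arrive.destination_rus as destinationRus, arrive.status as status, arrive.status_time as statusTime " +
            "from arrive inner join air_lines on air_lines.id = arrive.airline_id " +
            "where arrive.arrive_date = :date1", nativeQuery = true)
    List<ArriveTerminalView> getForArriveTerminal(@Param("date1") LocalDate date1);
}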
I am experimenting with Spring Data Elasticsearch by implementing a cluster which will host multi-tenant indexes, one index per tenant.
I am able to create indexes and set settings dynamically for each needed index, like this:
public class SpringDataES {
@Autowired
private ElasticsearchTemplate es;
@Autowired
private TenantIndexNamingService tenantIndexNamingService;
private void createIndex(String indexName) {
Settings indexSettings = Settings.builder()
.put("number_of_shards", 1)
.build();
CreateIndexRequest indexRequest = new CreateIndexRequest(indexName, indexSettings);
es.getClient().admin().indices().create(indexRequest).actionGet();
es.refresh(indexName);
}
private void prepareIndex(String indexName){
if (!es.indexExists(indexName)) {
createIndex(indexName);
}
updateMappings(indexName);
}
The model is created like this
@Document(indexName = "#{tenantIndexNamingService.getIndexName()}", type = "movies")
public class Movie {
@Id
@JsonIgnore
private String id;
private String movieTitle;
@CompletionField(maxInputLength = 100)
private Completion movieTitleSuggest;
private String director;
private Date releaseDate;
where the index name is passed dynamically via the SpEl
#{tenantIndexNamingService.getIndexName()}
that is served by
@Service
public class TenantIndexNamingService {
private static final String INDEX_PREFIX = "test_index_";
private String indexName = INDEX_PREFIX;
public TenantIndexNamingService() {
}
public String getIndexName() {
return indexName;
}
public void setIndexName(int tenantId) {
this.indexName = INDEX_PREFIX + tenantId;
}
public void setIndexName(String indexName) {
this.indexName = indexName;
}
}
So, whenever I have to execute a CRUD action, first I point to the right index and then execute the desired action:
tenantIndexNamingService.setIndexName(tenantId);
movieService.save(new Movie("Dead Poets Society", getCompletion("Dead Poets Society"), "Peter Weir", new Date()));
My assumption is that the following dynamic index assignment will not work correctly in a multi-threaded web application:
@Document(indexName = "#{tenantIndexNamingService.getIndexName()}")
This is because TenantIndexNamingService is a singleton.
So my question is: how do I achieve the right behavior in a thread-safe manner?
I would probably go with an approach similar to the following one proposed for Cassandra:
https://dzone.com/articles/multi-tenant-cassandra-cluster-with-spring-data-ca
You can have a look at the related GitHub repository here:
https://github.com/gitaroktato/spring-boot-cassandra-multitenant-example
Now, since Elastic has differences in how you define a Document, you should mainly focus on defining a request-scoped bean that will encapsulate your tenant id and bind it to your incoming requests.
Here is my solution: I create a request-scoped bean to hold the index name per HTTP request.
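A minimal sketch of that request-scoped variant of the naming service. @RequestScope proxies the bean (ScopedProxyMode.TARGET_CLASS by default), so it can still be injected into singletons and resolved from the SpEL expression; how you derive the tenant id from the request is up to you:
import org.springframework.stereotype.Service;
import org.springframework.web.context.annotation.RequestScope;

@Service("tenantIndexNamingService")
@RequestScope
public class TenantIndexNamingService {

    private static final String INDEX_PREFIX = "test_index_";

    private String indexName = INDEX_PREFIX;

    public String getIndexName() {
        return indexName;
    }

    // Called once per request, e.g. from a filter or interceptor that extracts the tenant id.
    public void setIndexName(int tenantId) {
        this.indexName = INDEX_PREFIX + tenantId;
    }
}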
I need a field to be ignored in the front-end UI, whereas the same field will be calculated in the backend and needs to be stored in a Postgres DB as a jsonb object. Other than transforming the value object into a new one, is there any feature in Jackson for this use case?
Test.java
public class Test {
private Integer score;
private Date dateValidated = null;
private Boolean consent = false;
private Date dateConsented;
public void setConsent(Boolean consent) {
this.consent = consent;
this.dateConsented = consent ? new Date() : null;
}
}
Based on consent, dateConsented will be set, and I don't want this to be settable when calling my service. I can use @JsonIgnore for this.
Problem
I will store this Test as a JSON object in Postgres (jsonb). So if I use @JsonIgnore, dateConsented will be ignored in the DB as well. I don't want that to happen. Any suggestions/solutions for this?
Just create a resource class for yourself and convert this class to it; finally, return this resource class to the front-end UI. Take a look at ConverterFactory from Spring.
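A minimal sketch of that idea, with an illustrative resource class and a plain Spring Converter (ConverterFactory works similarly when you have a whole family of such conversions). The class names are made up and the usual getters/setters on Test and TestResource are assumed; the entity keeps dateConsented for persistence, while the resource simply never exposes it:
// API-facing resource: no dateConsented field, so it is neither accepted from nor returned to the UI.
public class TestResource {

    private Integer score;
    private Date dateValidated;
    private Boolean consent;

    // getters and setters omitted
}

// Spring Converter used when mapping the persisted Test to the resource returned by the controller.
public class TestToResourceConverter implements Converter<Test, TestResource> {

    @Override
    public TestResource convert(Test source) {
        TestResource resource = new TestResource();
        resource.setScore(source.getScore());
        resource.setDateValidated(source.getDateValidated());
        resource.setConsent(source.getConsent());
        return resource;
    }
}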