DynamoDB and TableNameOverride with prefix - java

I am testing DynamoDB tables and want to set up different table names for the prod and dev environments, using the prefix "dev_" for development.
I made this test to print the table name:
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride;
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=null prefix=dev_
Why is the name null here?
TableNameOverride tbl = new TableNameOverride("test");//.withTableNamePrefix("dev_");
System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
This prints: name=test prefix=null
*How can I get the table name to be "dev_test" ?*
I want to use this later to get a "dev_" prefix for all tables in development mode like this:
DynamoDBTable annotation = (DynamoDBTable) myclass.getClass().getAnnotation(DynamoDBTable.class);
TableNameOverride tbl = new TableNameOverride(annotation.tableName()).withTableNamePrefix("dev_");
Or is there another solution for separating dev and prod tables?
I first thought of putting them in separate regions, but I am not sure about this.
Could also use this:
mapper.save(ck, new DynamoDBMapperConfig(new TableNameOverride((isDev ? "dev_" : "") + annotation.tableName())));

withTableNamePrefix is a static method. So this line creates a new instance of TableNameOverride with the String "test", and then throws that instance away, because calling the static withTableNamePrefix method through it just returns a brand-new override that carries only the prefix:
TableNameOverride tbl = new TableNameOverride("test").withTableNamePrefix("dev_");
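The pitfall is easier to see with a stripped-down stand-in class (hypothetical, NOT the AWS SDK class) that mirrors the TableNameOverride API: a static factory reached through an instance reference silently ignores that instance.

```java
public class StaticFactoryPitfall {

    // Stand-in for TableNameOverride (hypothetical, not the AWS SDK class).
    static class NameOverride {
        private final String name;
        private final String prefix;

        NameOverride(String name) {
            this.name = name;
            this.prefix = null;
        }

        private NameOverride(String name, String prefix) {
            this.name = name;
            this.prefix = prefix;
        }

        // Static factory: it cannot see any instance it is "called on".
        static NameOverride withTableNamePrefix(String prefix) {
            return new NameOverride(null, prefix);
        }

        String getTableName() { return name; }
        String getTableNamePrefix() { return prefix; }
    }

    public static void main(String[] args) {
        // Same shape as: new TableNameOverride("test").withTableNamePrefix("dev_")
        // The compiler rewrites this to NameOverride.withTableNamePrefix("dev_"),
        // so the instance built with "test" is discarded.
        NameOverride tbl = new NameOverride("test").withTableNamePrefix("dev_");
        System.out.println("name=" + tbl.getTableName() + " prefix=" + tbl.getTableNamePrefix());
        // prints: name=null prefix=dev_

        // To get a single override naming "dev_test", concatenate yourself:
        NameOverride full = new NameOverride("dev_" + "test");
        System.out.println("name=" + full.getTableName()); // prints: name=dev_test
    }
}
```

With the real SDK the prefix form is still useful: pass TableNameOverride.withTableNamePrefix("dev_") into a DynamoDBMapperConfig and the mapper prepends "dev_" to every name taken from the @DynamoDBTable annotation, so it resolves dev_test at save/load time even though getTableName() on the override itself is null.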
To answer the deeper question of separating test from prod, I would recommend having 2 separate AWS Accounts entirely, one for dev and one for prod. This is the only way you can:
See billing separately
Ensure you never leak data between prod and test systems
Keep high provisioned throughput on a dev table from eating into the account limits you need to scale a prod table higher

I've faced the same situation and struggled for a couple of days to get it working.
Just in case you're using DynamoDB + Spring here is what worked for me:
POJO class:
@DynamoDBTable(tableName = "APP-ACCESSKEY")
public class AccessKey {

    @NotBlank
    @Size(min = 1, max = 36)
    private String accessToken;

    @NotNull
    @Size(min = 3, max = 15)
    private String userName;

    private Date dateInsertion;

    public AccessKey() {
    }

    // ... All POJO stuff
}
Spring configuration:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<!-- Amazon Credentials -->
<bean id="tableNameOverride" class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="staticMethod" value="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix"/>
<property name="arguments" value="DES-" />
</bean>
<bean id="dynamoDBMapperConfig" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig">
<constructor-arg index="0" ref="tableNameOverride" />
</bean>
<bean id="BasicAWSCredentials" class="com.amazonaws.auth.BasicAWSCredentials">
<constructor-arg index="0" value="${amazon.accessKey}" />
<constructor-arg index="1" value="${amazon.secretKey}" />
</bean>
<bean id="amazonDynamoDBClient" class="com.amazonaws.services.dynamodbv2.AmazonDynamoDBClient">
<constructor-arg index="0" ref="BasicAWSCredentials" />
<property name="endpoint" value="http://dynamodb.us-west-2.amazonaws.com" />
</bean>
<bean id="dynamoDBMapper" class="com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper">
<constructor-arg index="0" ref="amazonDynamoDBClient" />
<constructor-arg index="1" ref="dynamoDBMapperConfig" />
</bean>
</beans>
Explanation:
Taking into account that my AccessKey entity points to the APP-ACCESSKEY table on AWS DynamoDB, it turns out that after running this, your application will point to DES-APP-ACCESSKEY instead.
Hope it helps someone facing a similar situation
Cheers

Same as Paolo Almeida's solution, but with Spring Boot annotations.
Just wanted to share it and maybe save someone time:
I have dynamodb tables for each namespace, e.g. myApp-dev-UserTable, myApp-prod-UserTable and I am using the EKS_NAMESPACE env variable, which in my case gets injected into the pods by kubernetes.
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.WebIdentityTokenCredentialsProvider;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperConfig;

@Configuration
@EnableDynamoDBRepositories(basePackages = "de.dynamodb")
public class DynamoDBConfig {

    @Value("${EKS_NAMESPACE}")
    String eksNamespace;

    @Bean
    public AmazonDynamoDB amazonDynamoDB() {
        return AmazonDynamoDBClientBuilder.standard()
                .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
                        "dynamodb.eu-central-1.amazonaws.com", "eu-central-1"))
                .withCredentials(awsCredentials())
                .build();
    }

    @Bean
    public AWSCredentialsProvider awsCredentials() {
        return WebIdentityTokenCredentialsProvider.builder().build();
    }

    // Table name override:
    @Bean
    public DynamoDBMapperConfig.TableNameOverride tableNameOverride() {
        return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix("myApp-" + eksNamespace + "-");
    }

    @Bean
    public DynamoDBMapperConfig dynamoDBMapperConfig() {
        return DynamoDBMapperConfig.builder().withTableNameOverride(tableNameOverride()).build();
    }

    // Marked as primary bean to override the default mapper bean.
    @Bean
    @Primary
    public DynamoDBMapper dynamoDBMapper() {
        return new DynamoDBMapper(amazonDynamoDB(), dynamoDBMapperConfig());
    }
}
With a table like this:
@Data
@DynamoDBTable(tableName = "UserTable")
public class User {

    @DynamoDBHashKey
    private String userId;

    @DynamoDBAttribute
    private String foo;

    @DynamoDBAttribute
    private String bar;
}

Related

Map(value set using spring beans) is empty when I try to access using Rest URL

I am new to Spring Beans. I am trying to set entry map using beans.xml file and accessing that value using GET REST request.
beans.xml
<bean name ="book" id="book" class=" org.test.model.Book" scope = "singleton">
<property name="id" value="123" />
<property name="bookName" value="FirstBeanBook"></property>
</bean>
<bean name="bookservice2" id = "bookservice" class="org.test.service.BookService" scope="singleton">
<property name="bookMap">
<map><entry key="123" value-ref="book" /></map>
</property>
</bean>
In Main class,
BookService bookService = (BookService) context.getBean("bookservice2");
bookService.getMap().toString(); // here it is working fine.
I guess that when I access this map using a GET request, another instance of the BookService class is created which has an empty bookMap.
Please provide some solution to get same result when I use GET request of REST.
Edit:
Handling get request as
@GET
@Produces(MediaType.APPLICATION_JSON)
@Path("/getBook/{id}")
public Book getBook(@PathParam("id") String id) {
    return bookService.getBook(id);
}
BookService.java
public class BookService {

    static Map<Integer, Book> bookMap = new HashMap<Integer, Book>();
    // This class has a getter and setter for bookMap too.

    public BookService() {}

    public Book getBook(String id) {
        return bookMap.get(Integer.parseInt(id));
    }
}

Spring MongoDB inserting unwanted objects

I haven't been able to find a single web page or other post about this issue. Hence, I'm here posting.
Within the documents I am storing to my mongodb, I have these things showing up:
"itemModifiers" : [
{
"val$implicitModifierString" : "16% increased Spell Damage",
"modifierName" : "16% increased Spell Damage"
}
]
The val$implicitModifierString is actually a variable from within my Java code which was never declared as a field on the ItemModifier instance. Basically, when I set a field on a class that I store to MongoDB, any variable or object that I used to set that field also gets stored to the database (or at least that is what it looks like to me!).
Here is some sample code of what the process looks like (if you hate maps, sorry; not really relevant here.):
public ItemModifier deriveModifier(final String modifier) {
    for (Pattern outerKey : tierMap.keySet()) {
        if (outerKey.matcher(modifier).matches()) {
            HashMap<Pattern, ItemModifierTier> innerMap = tierMap.get(outerKey);
            for (Pattern innerKey : innerMap.keySet()) {
                if (innerKey.matcher(modifier).matches()) {
                    Matcher innerMatcher = innerKey.matcher(modifier);
                    Double[] tierValues = new Double[innerMatcher.groupCount()];
                    innerMatcher.find();
                    for (int i = 1; i <= innerMatcher.groupCount(); i++) {
                        tierValues[i - 1] = Double.valueOf(innerMatcher.group(i));
                    }
                    return new ItemModifier() {{
                        setModifierName(modifier);
                        setModifierTerm(termMap.get(outerKey.pattern()));
                        setModifierTier(innerMap.get(innerKey));
                        setModifierType(itemModifierType);
                        setModifierValues(tierValues);
                    }};
                }
            }
        }
    }
    return null;
}
And here is the ItemModifier class (intentionally indexed every field because they are all queryable via a service; I have not yet created composite indexes but plan to once the issue at hand is sorted):
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
    "modifierName",
    "modifierTerm",
    "modifierType",
    "modifierTier",
    "modifierValues",
    "modifierAverage"
})
public class ItemModifier {

    @Indexed
    @JsonProperty("modifierName")
    private String modifierName;

    @Indexed
    @JsonProperty("modifierTerm")
    private String modifierTerm;

    @Indexed
    @JsonProperty("modifierType")
    private ItemModifierType modifierType;

    @Indexed
    @JsonProperty("modifierTier")
    private ItemModifierTier modifierTier;

    @Indexed
    @JsonProperty("modifierValues")
    private Double[] modifierValues;

    @Indexed
    @JsonProperty("modifierAverage")
    private Double modifierAverage;

    public ItemModifier() {
    }

    public String getModifierName() {
        return modifierName;
    }

    public void setModifierName(String modifierName) {
        this.modifierName = modifierName;
    }

    // ... the other getters/setters
}
This ItemModifiers.class is held within an ItemDocument.class and is stored to the mongo database simply by invoking mongoOperations.insert(itemDocumentInstance);.
In case it matters, this is my mongoConfig.xml:
<beans xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xmlns="http://www.springframework.org/schema/beans"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
http://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
<mongo:mongo host="127.0.0.1" port="27017"/>
<mongo:db-factory dbname="public-stash-api"/>
<bean id="mappingContext"
class="org.springframework.data.mongodb.core.mapping.MongoMappingContext"/>
<bean id="defaultMongoTypeMapper"
class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey">
<null/>
</constructor-arg>
</bean>
<bean id="mappingMongoConverter"
class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mappingContext" ref="mappingContext"/>
<property name="typeMapper" ref="defaultMongoTypeMapper"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory"/>
<constructor-arg name="mongoConverter" ref="mappingMongoConverter"/>
</bean>
</beans>
Thank you in advance for your help!
You are mixing Jackson with MongoDB. The fact that MongoDB uses documents does not mean it is using Jackson. MongoDB stores documents in BSON (Binary JSON) format, but you can't customize the way your documents are stored by using Jackson annotations.
There are Spring Data MongoDB annotations (like org.springframework.data.mongodb.core.mapping.Field) for that very purpose. You actually used one of them in your code (org.springframework.data.mongodb.core.index.Indexed).
As it turns out, this is the culprit:
return new ItemModifier() {{
    setModifierName(modifier);
    setModifierTerm(termMap.get(outerKey.pattern()));
    setModifierTier(innerMap.get(innerKey));
    setModifierType(itemModifierType);
    setModifierValues(tierValues);
}};
The double-brace initialization (an anonymous inner class) is what caused the weird val$... fields to be persisted into the Mongo documents via Spring: every local variable captured by the anonymous subclass becomes a synthetic val$ field on it.
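A minimal, self-contained sketch of this effect (no Mongo involved; class and method names here are illustrative): the double-brace form returns an anonymous subclass whose captured local shows up as a synthetic val$ field, which a field-based mapper will then serialize.

```java
import java.lang.reflect.Field;

public class DoubleBraceDemo {

    static class ItemModifier {
        private String modifierName;
        void setModifierName(String n) { modifierName = n; }
        String getModifierName() { return modifierName; }
    }

    // Double-brace style: returns an anonymous SUBCLASS of ItemModifier.
    static ItemModifier doubleBrace(final String modifier) {
        return new ItemModifier() {{
            setModifierName(modifier); // 'modifier' is captured as a synthetic val$ field
        }};
    }

    // Plain instantiation: exactly the declared class, no captured state.
    static ItemModifier plain(String modifier) {
        ItemModifier m = new ItemModifier();
        m.setModifierName(modifier);
        return m;
    }

    public static void main(String[] args) {
        for (Field f : doubleBrace("16% increased Spell Damage").getClass().getDeclaredFields()) {
            System.out.println(f.getName()); // with javac this is val$modifier
        }
        System.out.println(plain("x").getClass().getSimpleName()); // prints: ItemModifier
    }
}
```

The exact val$ naming is a javac convention, but the extra field exists regardless of compiler, which is why replacing the double-brace block with plain instantiation plus setters makes the stray attribute disappear from the stored documents.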

Reading 2 property files having same variable names in Spring

I am reading property files using below entry in my Spring xml.
<context:property-placeholder
location="classpath:resources/database1.properties,
classpath:resources/licence.properties"/>
I am injecting this values in variable using xml entry or using #Value annotation.
<bean id="myClass" class="MyClass">
<property name="driverClassName" value="${database.driver}" />
<property name="url" value="${database.url}" />
<property name="name" value="${database.name}" />
</bean>
I want to add a new property file(database2.properties) which has few same variable names as of database1.properties.
database1.properties:
database.driver=com.mysql.jdbc.Driver
database.url=jdbc:mysql://192.168.1.10/
database.name=dbname
database2.properties:
database.url=jdbc:mysql://192.168.1.50/
database.name=anotherdbname
database.user=sampleuser
You can see that a few properties have the same name, like database.url and database.name, in both property files.
Is it possible to inject database.url of database2.properties?
Or I have to change variable names?
Thank you.
You can do it by configuring two PropertyPlaceholderConfigurer beans. Usually there's only one instance that serves out all the properties; however, if you change the placeholderPrefix you can use two instances, something like:
<bean id="firstPropertyGroup" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations" value="classpath:resources/database1.properties,
classpath:resources/licence.properties" />
<property name="placeholderPrefix" value="${db1."/>
</bean>
<bean id="secondPropertyGroup" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations" value="classpath:resources/database2.properties" />
<property name="placeholderPrefix" value="${db2."/>
</bean>
Then you would access your properties like ${db1.database.url} or ${db2.database.url}
There might be a solution similar to what you want to achieve. Check the second answer to this question: Multiple properties access. It basically explains what to do in order to access the properties of the second file through another expression, which is defined by you.
Otherwise, the simplest solution would be just changing the key values (the variable names).
You will sooner or later switch to Spring Boot. With Spring Boot you can have a POJO like this:
public class Database {

    @NotBlank
    private String driver;

    @NotBlank
    private String url;

    @NotBlank
    private String dbname;

    public String getDriver() {
        return driver;
    }

    public void setDriver(String driver) {
        this.driver = driver;
    }

    public String getUrl() {
        return url;
    }

    public void setUrl(String url) {
        this.url = url;
    }

    public String getDbname() {
        return dbname;
    }

    public void setDbname(String dbname) {
        this.dbname = dbname;
    }
}
and use #ConfigurationProperties to fill it:
@Bean
@ConfigurationProperties(locations = "classpath:database1.properties", prefix = "driver")
public Database database1() {
    return new Database();
}

@Bean
@ConfigurationProperties(locations = "classpath:database2.properties", prefix = "driver")
public Database database2() {
    return new Database();
}
The downside is that the POJO is mutable. With the Lombok library, you can eliminate the boilerplate getters and setters.

How can I execute a stored procedure with JPA & Spring Data?

I am trying to call the Terminal_GetTicket stored procedure in my database but keep getting the following exception:
PropertyReferenceException: No property getTicket found for type TicketInfo
I have cross validated my configuration with a very simple test entity and everything seems to work fine, however for the actual case, something is wrong.
Here is my domain entity (TicketInfo):
@Entity
@NamedStoredProcedureQuery(name = "TicketInfo.getTicket", procedureName = "Terminal_GetTicket", resultClasses = TicketInfo.class, parameters = {
        @StoredProcedureParameter(mode = ParameterMode.IN, name = "sys_id_game", type = Integer.class)})
public class TicketInfo {

    @Id @GeneratedValue
    private Long id;

    private String idTicket;
    private Integer externalTicketCode;
    private Short sequenseAlert;
    private Integer dlTimeStamp;
All the instance variables have their getters and setters properly defined and the stored procedure has a total of 5 output parameters matching the attributes of TicketInfo.
Furthermore, here is my repository interface:
public interface TicketInfoRepository extends CrudRepository<TicketInfo, Long> {

    @Transactional(timeout = 5)
    @Procedure
    TicketInfo getTicket(Integer sys_id_game);
}
Also, here is my context.xml file (for Spring):
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:jpa="http://www.springframework.org/schema/data/jpa"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:repository="http://www.springframework.org/schema/data/repository"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-4.0.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context-4.1.xsd
http://www.springframework.org/schema/data/jpa
http://www.springframework.org/schema/data/jpa/spring-jpa-1.8.xsd
http://www.springframework.org/schema/data/repository
http://www.springframework.org/schema/data/repository/spring-repository-1.5.xsd">
<context:component-scan base-package="ar.com.boldt.godzilla" />
<jpa:repositories base-package="xx.xxx.xxx.godzilla.business.dao" />
<bean id="jpaVendorAdapter"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="showSql" value="${dataSource.show.sql}" />
<property name="generateDdl" value="false" />
<property name="database" value="SQL_SERVER" />
</bean>
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="jpaVendorAdapter" ref="jpaVendorAdapter" />
<!-- spring based scanning for entity classes -->
<property name="packagesToScan" value="xx.xxx.xxx.godzilla.business.dao" />
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager" />
<bean id="cacheManager" class="org.springframework.cache.ehcache.EhCacheCacheManager">
<property name="cacheManager" ref="ehcache" />
</bean>
<bean id="ehcache"
class="org.springframework.cache.ehcache.EhCacheManagerFactoryBean">
<property name="configLocation" value="classpath:ehcache.xml" />
</bean>
</beans>
And finally a watered-down version of the stored procedure itself:
ALTER PROCEDURE [Terminal_GetTicket](
    @arg int
    ,@res int output
    ,@res2 int output
)
as
Declare @error int
select 0, 1, 2
RETURN @error
Now, whenever I try setting the @Autowired annotation, I get the exception mentioned above.
I remember struggling with MS SQL stored procedures and spring-data-jpa myself. This is how I was able to run them successfully:
Model:
@NamedNativeQueries({
    @NamedNativeQuery(
        name = "yourInternalName",
        query = "EXEC [procedure_name] :param1, :param2",
        resultClass = Foo.class
    )
})
@Entity
public class Foo {
    /* Fields, getters, setters */
}
That's pretty straightforward. This approach is different though: you are not declaring procedures directly (which is also why it may stop working if you switch to a different RDBMS).
Then you have to extend your repository:
public interface FooRepositoryCustom {
Foo fancyMethodName(arg1, arg2);
}
And directly implement it:
public class FooRepositoryImpl implements FooRepositoryCustom {

    @PersistenceContext
    EntityManager entityManager;

    @Override
    public Foo fancyMethodName(arg1, arg2) {
        Query query = entityManager.createNamedQuery("yourInternalName");
        query.setParameter("param1", arg1);
        query.setParameter("param2", arg2);
        return (Foo) query.getSingleResult(); // use getResultList() if the procedure returns several rows
    }
}
Let's put it all together:
public interface FooRepository extends CrudRepository<Foo, Long>, FooRepositoryCustom {
}
Note that if you decide to return, for example, a List of Foo objects, you only need to change the return type in your custom repository.
I followed SirKometa's advice but I could not get it to work, so I came up with something that worked for me and that I think is better from a syntax point of view. First create your entity class like below.
@NamedStoredProcedureQueries({
    @NamedStoredProcedureQuery(
        name = "MySP",
        procedureName = "my_sp",
        parameters = {
            @StoredProcedureParameter(mode = ParameterMode.IN, name = "arg", type = String.class)},
        resultClasses = Foo.class)
})
@Entity
public class Foo {
Then the Implementation class of the repository would be:
@Component
public class FooRepositoryImpl implements FooCustomRepository {

    @PersistenceContext
    EntityManager entityManager;

    @Override
    public List<Foo> foo(String arg) {
        Query query = entityManager.createNamedStoredProcedureQuery("MySP");
        query.setParameter("arg", arg);
        return query.getResultList();
    }
}
The rest of the implementation is like the answer from SirKometa above. Note that you also have to create an EntityManager bean in your application for this to work.

How to configure a Spring beans with properties that are stored in a database table

In my project we'd like to externalize the properties of our Spring managed beans, that is very easy to do with standard Java .properties files, however we want to be able to read those properties from a DB table that behaves like a Map (key is the property name, value is the value assigned to that property).
I found this post that suggests the usage of Commons Configuration, but I don't know if there's a better way to do the same with Spring 3.x. Maybe implementing my own PropertyResource or something.
Any clues?
I'd use a FactoryBean<Properties> that I'd implement using JdbcTemplate. You can then use the generated Properties object with the <context:property-placeholder> mechanism.
Sample code:
public class JdbcPropertiesFactoryBean extends AbstractFactoryBean<Properties> {

    private JdbcTemplate jdbcTemplate;
    private String tableName;
    private String keyColumn;
    private String valueColumn;

    @Required
    public void setJdbcTemplate(final JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Required
    public void setTableName(final String tableName) {
        this.tableName = tableName;
    }

    @Required
    public void setKeyColumn(final String keyColumn) {
        this.keyColumn = keyColumn;
    }

    @Required
    public void setValueColumn(final String valueColumn) {
        this.valueColumn = valueColumn;
    }

    @Override
    public Class<?> getObjectType() {
        return Properties.class;
    }

    @Override
    protected Properties createInstance() throws Exception {
        final Properties props = new Properties();
        jdbcTemplate.query("Select " + keyColumn + ", " + valueColumn
                + " from " + tableName, new RowCallbackHandler() {
            @Override
            public void processRow(final ResultSet rs) throws SQLException {
                props.put(rs.getString(1), rs.getString(2));
            }
        });
        return props;
    }
}
XML Configuration:
<bean id="props" class="foo.bar.JdbcPropertiesFactoryBean">
<property name="jdbcTemplate">
<bean class="org.springframework.jdbc.core.JdbcTemplate">
<!-- reference to a defined data source -->
<constructor-arg ref="dataSource" />
</bean>
</property>
<property name="tableName" value="TBL_PROPERTIES" />
<property name="keyColumn" value="COL_KEY" />
<property name="valueColumn" value="COL_VAL" />
</bean>
<context:property-placeholder properties-ref="props" />
In addition to Sean's suggestion, you can extend PropertyPlaceholderConfigurer. Look at the two existing implementations, PreferencesPlaceholderConfigurer and ServletContextPropertyPlaceholderConfigurer, and roll your own jdbc-based version.
There are ways to create a PropertyPlaceholderConfigurer programmatically, see below.
Write a DAO which reads the properties, then create a PropertyPlaceholderConfigurer as shown here:
XmlBeanFactory factory = new XmlBeanFactory(new FileSystemResource("beans.xml"));
PropertyPlaceholderConfigurer cfg = new PropertyPlaceholderConfigurer();
cfg.setProperties(yourProperties);
cfg.postProcessBeanFactory(factory);
