We are connecting to Redis through a JedisConnectionFactory and using a CrudRepository to query data.
@Configuration
public class RedisConfig {

    @Bean
    public RedisTemplate<String, Object> redisTemplate(RedisProperties redisProperties) {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory(redisProperties));
        return template;
    }

    private JedisConnectionFactory jedisConnectionFactory(RedisProperties redisProperties) {
        RedisStandaloneConfiguration redisStandaloneConfiguration =
                new RedisStandaloneConfiguration(redisProperties.getHost(), redisProperties.getPort());
        return new JedisConnectionFactory(redisStandaloneConfiguration);
    }
}
My Redis object:
#RedisHash("Resource")
public class ResourceDto {
#Id private String resourceId;
#Indexed private Date lastUpdatedDate;
private String resource;
}
I want to fetch all unique ResourceDto entries by resourceId with the latest lastUpdatedDate.
I have tried a few things.
Using findByDistinctResourceId() gives an error. Is there an alternative?
I also don't know how to build this kind of query with RedisTemplate:
public static void main(String[] args) {
    ApplicationContext ctx = SpringApplication.run(Application.class, args);
    RedisTemplate redisTemplate = ctx.getBean(RedisTemplate.class);

    RedisScript<ResourceDto> redisScript = new RedisScript<ResourceDto>() {
        @Override
        public String getSha1() {
            return null;
        }

        @Override
        public Class<ResourceDto> getResultType() {
            return ResourceDto.class;
        }

        @Override
        public String getScriptAsString() {
            return "SORT Resource BY nosort";
        }
    };

    redisTemplate.execute(redisScript, null, null);
}
The RedisTemplate itself is configured properly, but I don't know how to query for all unique ResourceDto entries by resourceId with the latest lastUpdatedDate.
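This is not from the original post, but one hedged sketch of a possible direction, assuming ResourceDto has getters and using a plain CrudRepository (the ResourceRepository and ResourceQueryService names are hypothetical): load the entries and keep the newest lastUpdatedDate per resourceId in Java. Note that because resourceId is the @Id, each save overwrites the previous hash for that id, so findAll() should already return at most one entry per resourceId.

import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Service;

// Hypothetical repository; Spring Data Redis derives the implementation.
interface ResourceRepository extends CrudRepository<ResourceDto, String> {
}

@Service
class ResourceQueryService {

    private final ResourceRepository repository;

    ResourceQueryService(ResourceRepository repository) {
        this.repository = repository;
    }

    /** One ResourceDto per resourceId, keeping the latest lastUpdatedDate. */
    Collection<ResourceDto> latestPerResourceId() {
        Map<String, ResourceDto> latest = new HashMap<>();
        for (ResourceDto dto : repository.findAll()) {
            latest.merge(dto.getResourceId(), dto,
                    (a, b) -> a.getLastUpdatedDate().after(b.getLastUpdatedDate()) ? a : b);
        }
        return latest.values();
    }
}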
I have a use case where single entries need to be removed from the cache at a specific time, so the TTL needs to be set per key rather than at the cache level.
Following the Spring Data Redis documentation I tried to implement a key-specific TTL, but it does not work. No expiration event is fired; I used a listener to check, and an event only fires when the cache-level TTL runs out.
The cached object has a field annotated with @TimeToLive from org.springframework.data.redis.core.TimeToLive; according to the documentation this should trigger an expire event after the time has run out.
Cached object
@Data
@NoArgsConstructor
@AllArgsConstructor
public class BrandResponse {

    @TimeToLive
    private Long ttl;

    @NotBlank
    private String id;
}
Used dependencies
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-cache</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-redis</artifactId>
<version>2.6.6</version>
</dependency>
<dependency>
<groupId>redis.clients</groupId>
<artifactId>jedis</artifactId>
<version>3.6.3</version>
</dependency>
Enable Key Space Events
@SpringBootApplication
@ServletComponentScan
@EnableAsync
@EnableRedisRepositories(enableKeyspaceEvents = RedisKeyValueAdapter.EnableKeyspaceEvents.ON_STARTUP)
public class KikaRestApiApplication {

    public static void main(String[] args) {
        SpringApplication.run(KikaRestApiApplication.class, args);
    }
}
The default TTL for the cache is 5 minutes: .entryTtl(Duration.ofMinutes(5)).
Cache setup
@Configuration
@EnableCaching
public class RedisCachingConfiguration {

    private final KikaApiProperties kikaApiProperties;

    @Value("${spring.redis.host}")
    private String host;

    @Value("${spring.redis.port}")
    private Integer port;

    public RedisCachingConfiguration(KikaApiProperties kikaApiProperties) {
        this.kikaApiProperties = kikaApiProperties;
    }

    @Bean
    public RedisCacheConfiguration cacheConfiguration() {
        return RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofMinutes(5))
                .disableCachingNullValues()
                .serializeValuesWith(
                        SerializationPair.fromSerializer(new GenericJackson2JsonRedisSerializer()));
    }

    @Bean
    public RedisConnectionFactory redisConnectionFactory() {
        RedisStandaloneConfiguration configuration = new RedisStandaloneConfiguration();
        configuration.setHostName(host);
        configuration.setPort(port);
        return new JedisConnectionFactory(configuration);
    }

    @Bean
    public RedisTemplate<String, Idmap> redisTemplate() {
        RedisTemplate<String, Idmap> redisTemplate = new RedisTemplate<>();
        redisTemplate.setConnectionFactory(redisConnectionFactory());
        redisTemplate.setEnableTransactionSupport(true);
        return redisTemplate;
    }
}
Is there something I am missing, or does @TimeToLive not work together with Spring Redis caching?
As per the documentation:
@TimeToLive marks a single numeric property on an aggregate root to be used for setting expirations in Redis. The annotated property supersedes any other timeout configuration.
@RedisHash marks objects as aggregate roots to be stored in a Redis hash.
The default TTL unit is seconds:
@TimeToLive(unit = TimeUnit.SECONDS)
You can use other units such as TimeUnit.MINUTES.
Use @RedisHash on BrandResponse.
All the best.
Below is my working code
@RedisHash
@Data
@NoArgsConstructor
@AllArgsConstructor
public class BrandResponse {

    @TimeToLive(unit = TimeUnit.SECONDS)
    private Long ttl;

    @NotNull
    @Id
    private String id;
}
@Repository
public interface BrandRepository extends CrudRepository<BrandResponse, String> {
}
public interface CacheService {
    void add(BrandResponse response);
    boolean exists(String id);
}

@Service
public class RedisCacheServiceImpl implements CacheService {

    @Autowired
    private BrandRepository brandRepository;

    @Override
    public void add(BrandResponse response) {
        this.brandRepository.save(response);
    }

    @Override
    public boolean exists(String id) {
        return this.brandRepository.findById(id).isPresent();
    }
}
@Configuration
@EnableCaching
public class RedisCachingConfiguration {

    private String host = "192.168.1.59";
    private Integer port = 6379;

    @Bean
    public RedisCacheConfiguration cacheConfiguration() {
        return RedisCacheConfiguration.defaultCacheConfig()
                .entryTtl(Duration.ofMinutes(5))
                .disableCachingNullValues()
                .serializeValuesWith(
                        RedisSerializationContext.SerializationPair.fromSerializer(new GenericJackson2JsonRedisSerializer()));
    }

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory jedisConFactory = new JedisConnectionFactory();
        jedisConFactory.setHostName(host);
        jedisConFactory.setPort(port);
        return jedisConFactory;
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        return template;
    }
}
I used two data sources, one for Redis and another for the database:
@SpringBootApplication
@EnableRedisRepositories(enableKeyspaceEvents = RedisKeyValueAdapter.EnableKeyspaceEvents.ON_STARTUP,
        basePackages = {"com.c4c.authn.core.repository.redis"})
@EnableJpaRepositories(basePackages = {"com.c4c.authn.core.repository.db"})
public class AuthApplication {

    public static void main(String[] args) {
        SpringApplication.run(AuthApplication.class, args);
    }
}
Unit test
public class RedisCacheServiceImplTest extends BaseServiceTest {

    @Autowired
    private CacheService cacheService;

    @Test
    public void test_add_ok() throws InterruptedException {
        this.cacheService.add(new BrandResponse(5L, "ID1"));
        assertTrue(this.cacheService.exists("ID1"));
        Thread.sleep(6000);
        assertFalse(this.cacheService.exists("ID1"));
    }
}
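Since the original question mentions checking for expiration events with a listener, here is a hedged sketch of how such events can be observed once keyspace events are enabled. The listener class name is my own; RedisKeyExpiredEvent is the application event Spring Data Redis publishes when a @RedisHash entry expires.

import org.springframework.context.event.EventListener;
import org.springframework.data.redis.core.RedisKeyExpiredEvent;
import org.springframework.stereotype.Component;

@Component
public class BrandResponseExpirationListener {

    // Published by Spring Data Redis when a @RedisHash entry expires
    // (requires keyspace events to be enabled, as in the application class above).
    @EventListener
    public void onExpired(RedisKeyExpiredEvent<BrandResponse> event) {
        System.out.println("Expired entry in keyspace " + event.getKeyspace() + ": " + event.getValue());
    }
}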
I am new to JdbcTemplate and am trying to use a prepared statement to insert data into the database with auto-commit turned off for performance, but at the end I am not able to commit the transaction. Please suggest a correct approach or a reference that might solve my problem.
Thanks in advance.
SpringjdbcApplication.java
@SpringBootApplication
public class SpringjdbcApplication {

    public static void main(String[] args) {
        ApplicationContext context = SpringApplication.run(SpringjdbcApplication.class, args);
        SampleService service = context.getBean(SampleService.class);

        List<Batch> batchList = new ArrayList<>();
        batchList.add(new Batch("A", "B"));
        batchList.add(new Batch("B", "B"));
        batchList.add(new Batch("C", "B"));
        batchList.add(new Batch("D", "B"));
        batchList.add(new Batch("E", "B"));

        System.err.println("The number of rows inserted = " + service.singleInsert(batchList));
        System.err.println("The count of batch class is = " + service.getCount());
    }
}
SampleConfiguration.java
@Configuration
public class SampleConfiguration {

    @Bean
    public DataSource mysqlDataSource() {
        HikariConfig config = new HikariConfig();
        config.setDriverClassName("ClassName");
        config.setJdbcUrl("URL");
        config.setUsername("User");
        config.setPassword("Password");
        // Note: connectionTimeout and maxLifetime are in milliseconds, and
        // minimumIdle should not exceed maximumPoolSize; the values below are
        // almost certainly not what you want in production.
        config.setMinimumIdle(600);
        config.setMaximumPoolSize(30);
        config.setConnectionTimeout(251);
        config.setMaxLifetime(250);
        config.setAutoCommit(false);
        return new HikariDataSource(config);
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource);
        return jdbcTemplate;
    }
}
Batch.java
@Entity
public class Batch implements Serializable {

    private static final long serialVersionUID = -5687736664713755991L;

    @Id
    @Column(name = "field1")
    private String field1;

    @Column(name = "field2")
    private String field2;

    // ... getters, setters and constructors
}
SampleService.java
@Service
public interface SampleService {
    public int singleInsert(List<Batch> batchList);
}
SampleServiceImpl.java
@Service
public class SampleServiceImpl implements SampleService {

    @Autowired
    JdbcTemplate jdbcTemplate;

    @Override
    public int singleInsert(List<Batch> batchList) {
        for (Batch i : batchList) {
            jdbcTemplate.update("insert into batch values(?,?)", i.getField1(), i.getField2());
        }
        try {
            DataSourceUtils.getConnection(jdbcTemplate.getDataSource()).commit();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return 1;
    }
}
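For comparison, here is a hedged sketch (my assumption about the intent, not code from the question) that lets Spring manage the transaction with @Transactional, so the commit happens automatically when the method returns, and uses JdbcTemplate.batchUpdate for the performance gain. Spring Boot auto-configures a DataSourceTransactionManager for the DataSource, so no manual commit is needed.

import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

import org.springframework.jdbc.core.BatchPreparedStatementSetter;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class SampleServiceImpl implements SampleService {

    private final JdbcTemplate jdbcTemplate;

    public SampleServiceImpl(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    @Transactional // commits on success, rolls back on exception
    public int singleInsert(List<Batch> batchList) {
        int[] counts = jdbcTemplate.batchUpdate(
                "insert into batch values(?,?)",
                new BatchPreparedStatementSetter() {
                    @Override
                    public void setValues(PreparedStatement ps, int i) throws SQLException {
                        ps.setString(1, batchList.get(i).getField1());
                        ps.setString(2, batchList.get(i).getField2());
                    }

                    @Override
                    public int getBatchSize() {
                        return batchList.size();
                    }
                });
        return counts.length; // number of statements executed in the batch
    }
}

With this approach the config.setAutoCommit(false) call in the Hikari setup should not matter either way, since Spring's transaction manager controls the commit.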
I am new to Spring Boot. I am trying to set up a PropertySourcesPlaceholderConfigurer configuration that returns properties from a property file as well as from a database table. Here's what I wrote:
@Configuration
@PropertySource(value = { "classpath:application.properties" }, ignoreResourceNotFound = false)
public class SpringPropertiesConfig implements EnvironmentAware {

    private static final Logger log = LoggerFactory.getLogger(SpringPropertiesConfig.class);

    @Inject
    private org.springframework.core.env.Environment env;

    @PostConstruct
    public void initializeDatabasePropertySourceUsage() {
        MutablePropertySources propertySources = ((ConfigurableEnvironment) env).getPropertySources();
        System.out.println("propertySources : " + propertySources);
        try {
            // dataSource, table name, key column, value column
            DatabaseConfiguration databaseConfiguration = new DatabaseConfiguration(dataSource(),
                    "APPLICATION_CONFIGURATION", "KEY", "VALUE");
            Properties dbProps = ConfigurationConverter.getProperties(databaseConfiguration);
            PropertiesPropertySource dbPropertySource = new PropertiesPropertySource("dbPropertySource", dbProps);
            propertySources.addFirst(dbPropertySource);
        } catch (Exception e) {
            log.error("Error during database properties setup", e);
            throw new RuntimeException(e);
        }
    }

    @Bean(name = "pspc")
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        PropertySourcesPlaceholderConfigurer pspc = new PropertySourcesPlaceholderConfigurer();
        pspc.setIgnoreUnresolvablePlaceholders(true);
        // System.out.println("propertySourcesPlaceholderConfigurer = " + pspc.getAppliedPropertySources());
        return pspc;
    }

    @Bean
    public DataSource dataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(env.getProperty("dev.datasource.driver-class-name"));
        dataSource.setUrl(env.getProperty("dev.datasource.url"));
        dataSource.setUsername(env.getProperty("dev.datasource.username"));
        dataSource.setPassword(env.getProperty("dev.datasource.password"));
        return dataSource;
    }

    @Override
    public void setEnvironment(Environment paramEnvironment) {
        this.env = paramEnvironment;
    }
}
I found that properties from application.properties were getting resolved correctly:
@Value("${spnego.defaultRealm}")
private String defRealm;
Here, defRealm contained the correct value. However, properties from the database were not getting resolved:
@Value("${enviromentName}")
private String envir;
If I print the value of envir, it prints '${enviromentName}'.
In the SpringPropertiesConfig class, the table is read correctly, and the Properties object dbProps contains all the rows of the APPLICATION_CONFIGURATION table.
Any ideas?
Thank you M. Deinum, I followed your suggestion and implemented the following:
public class SpringPropertiesConfig implements ApplicationContextInitializer<ConfigurableApplicationContext> {
public DataSource getDataSource(org.springframework.core.env.PropertySource<?> propSrc) {
String profile = (String) propSrc.getProperty("spring.profiles.active");
if (profile.equals("dev")) {
BasicDataSource dataSource = new BasicDataSource();
dataSource.setDriverClassName((String) propSrc.getProperty("dev.datasource.driver-class-name"));
dataSource.setUrl((String) propSrc.getProperty("dev.datasource.url"));
dataSource.setUsername((String) propSrc.getProperty("dev.datasource.username"));
dataSource.setPassword((String) propSrc.getProperty("dev.datasource.password"));
return dataSource;
} else if (profile.equals("prod")) {
JndiDataSourceLookup dataSourceLookup = new JndiDataSourceLookup();
DataSource dataSource = dataSourceLookup
.getDataSource((String) propSrc.getProperty("prd.datasource.jndi-name"));
return dataSource;
}
return null;
}
@Override
public void initialize(ConfigurableApplicationContext ctx) {
org.springframework.core.env.PropertySource<?> p = ctx.getEnvironment().getPropertySources()
.get("applicationConfigurationProperties");
DatabaseConfiguration databaseConfiguration = new DatabaseConfiguration(getDataSource(p),
"APPLICATION_CONFIGURATION", "KEY", "VALUE");
System.out.println("databaseConfiguration created : " + databaseConfiguration);
Properties dbProps = ConfigurationConverter.getProperties(databaseConfiguration);
System.out.println("dbProps=" + dbProps);
PropertiesPropertySource dbPropertySource = new PropertiesPropertySource("dbPropertySource", dbProps);
ctx.getEnvironment().getPropertySources().addFirst(dbPropertySource);
}
}
But what I am not sure about is whether
org.springframework.core.env.PropertySource<?> p = ctx.getEnvironment().getPropertySources().get("applicationConfigurationProperties");
is the best way to read from the application.properties file.
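Regarding that doubt, a hedged alternative (my own suggestion, not from the original answer) is to resolve the keys through the ConfigurableEnvironment itself, which already merges application.properties and any profile-specific files, instead of looking up one property source by name. The sketch below reuses the same BasicDataSource, DatabaseConfiguration and ConfigurationConverter classes shown above:

public class SpringPropertiesConfig implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext ctx) {
        // The Environment already merges application.properties and profile-specific
        // files, so keys can be resolved directly instead of via a named PropertySource.
        ConfigurableEnvironment env = ctx.getEnvironment();

        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(env.getProperty("dev.datasource.driver-class-name"));
        dataSource.setUrl(env.getProperty("dev.datasource.url"));
        dataSource.setUsername(env.getProperty("dev.datasource.username"));
        dataSource.setPassword(env.getProperty("dev.datasource.password"));

        DatabaseConfiguration databaseConfiguration =
                new DatabaseConfiguration(dataSource, "APPLICATION_CONFIGURATION", "KEY", "VALUE");
        Properties dbProps = ConfigurationConverter.getProperties(databaseConfiguration);
        env.getPropertySources().addFirst(new PropertiesPropertySource("dbPropertySource", dbProps));
    }
}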
The above ApplicationContextInitializer is registered with Spring as follows:
@SpringBootApplication
public class Application extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(final SpringApplicationBuilder application) {
        return application.sources(Application.class);
    }

    public static void main(final String[] args) {
        new SpringApplicationBuilder(Application.class).initializers(new SpringPropertiesConfig()).run(args);
    }

    @Bean
    public RestTemplate restTemplate(final RestTemplateBuilder builder) {
        return builder.build();
    }
}
I have a properties class
@ConfigurationProperties(prefix = ShiroProperties.SHIRO_PREFIX)
public class ShiroProperties {

    public static final String SHIRO_PREFIX = "shiro";

    private String urlLogin;
    private String urlSuccessed;
and a Configuration class
@Configuration
@EnableConfigurationProperties({ ShiroProperties.class })
public class ShiroConfig implements ApplicationContextAware {

    ApplicationContext applicationContext;

    @Autowired
    private ShiroProperties shiroProperties;
shiroProperties is null, but I can get its value inside ShiroConfig using
applicationContext.getBean(ShiroProperties.class)
my Application class:
@SpringBootApplication
public class Bootstrap {

    public static void main(String[] args) {
        SpringApplication.run(Bootstrap.class, args);
    }
}
So weird; I can run similar code successfully in another project, but not in this one.
I met the same issue @Dean described. What I did was put the LifecycleBeanPostProcessor bean in a separate configuration class and configure the rest of Shiro in another configuration class; see the example below:
@Configuration
public class ShiroLifecycleBeanPostProcessorConfig {

    @Bean(name = "lifecycleBeanPostProcessor")
    public LifecycleBeanPostProcessor getLifecycleBeanPostProcessor() {
        return new LifecycleBeanPostProcessor();
    }
}
The main Shiro Configuration class:
@Configuration
@AutoConfigureAfter(value = ShiroLifecycleBeanPostProcessorConfig.class)
public class ShiroConfiguration {

    public static final String cacheFile = "encache.xml";
    private static final String active_cache_name = "activeSessionCache";

    @Autowired
    private RedisTemplate<String, Object> redisTemplate;

    /**
     * @throws UnknownHostException
     */
    @Bean(name = "shiroFilter")
    @ConditionalOnMissingBean
    public ShiroFilterFactoryBean shiroFilterFactoryBean(DefaultWebSecurityManager securityManager)
            throws UnknownHostException {
        ShiroFilterFactoryBean shiroFilterFactoryBean = new ShiroFilterFactoryBean();
        shiroFilterFactoryBean.setSecurityManager(securityManager);
        shiroFilterFactoryBean.setLoginUrl(ShiroSecurityUrls.LOGIN_PAGE);
        // shiroFilterFactoryBean.setSuccessUrl(ShiroSecurityUrls.LOGIN_SUCCESS_URL);
        shiroFilterFactoryBean.setUnauthorizedUrl("/error");

        Map<String, Filter> filters = new LinkedHashMap<String, Filter>();
        LogoutFilter logoutFilter = new LogoutFilter();
        logoutFilter.setRedirectUrl(ShiroSecurityUrls.LOGIN_PAGE);
        filters.put(DefaultFilter.logout.name(), logoutFilter);
        shiroFilterFactoryBean.setFilters(filters);

        Map<String, String> filterChainDefinitionManager = new LinkedHashMap<String, String>();
        filterChainDefinitionManager.put("/static/**", DefaultFilter.anon.name());
        filterChainDefinitionManager.put("/node_modules/**", DefaultFilter.anon.name());
        filterChainDefinitionManager.put("/pages/**", DefaultFilter.anon.name());
        filterChainDefinitionManager.put(ShiroSecurityUrls.LOGIN_PAGE, DefaultFilter.anon.name());
        filterChainDefinitionManager.put(ShiroSecurityUrls.LOGOUT_URL, DefaultFilter.logout.name());
        filterChainDefinitionManager.put(ShiroSecurityUrls.REGISTER_PROCESS_URL, DefaultFilter.anon.name());
        filterChainDefinitionManager.put("/**", DefaultFilter.user.name());
        shiroFilterFactoryBean.setFilterChainDefinitionMap(filterChainDefinitionManager);

        return shiroFilterFactoryBean;
    }

    /**
     * @throws UnknownHostException
     */
    @Bean(name = "securityManager")
    @DependsOn(value = { "ehCacheManager", "rememberMeManager", "sessionManager", "credentialsMatcher" })
    public DefaultWebSecurityManager securityManager(EhCacheManager ehCacheManager, RememberMeManager rememberMeManager,
            SessionManager sessionManager, CredentialsMatcher credentialsMatcher) throws UnknownHostException {
        DefaultWebSecurityManager securityManager = new DefaultWebSecurityManager();
        // 1. Cache support
        securityManager.setCacheManager(ehCacheManager);
        // 2. Session support (the session manager picks up the cache manager from the security manager)
        securityManager.setSessionManager(sessionManager);
        // 3. Remember-me support
        securityManager.setRememberMeManager(rememberMeManager);
        // 4. JDBC and LDAP realm implementations
        Collection<Realm> authorizingRealms = Lists.newArrayList(shiroDatabaseRealm(credentialsMatcher),
                shiroActiveDirectoryRealm(credentialsMatcher));
        securityManager.setRealms(authorizingRealms);

        if (securityManager.getAuthenticator() instanceof ModularRealmAuthenticator) {
            ModularRealmAuthenticator modularRealmAuthenticator = (ModularRealmAuthenticator) securityManager
                    .getAuthenticator();
            modularRealmAuthenticator.setAuthenticationStrategy(new FirstSuccessfulStrategy());
        }
        return securityManager;
    }
}
Hope this code helps you, thanks.
This, being another configuration class for your application, should be annotated with @Configuration so that its beans are created and registered in the context for wiring from other classes.
Normally, AutowiredAnnotationBeanPostProcessor populates properties annotated with @Autowired during the phase when Spring loads factory bean classes. If any of the following post-processors:
ApplicationContextAwareProcessor
ApplicationListenerDetector
ConfigurationClassPostProcessor$ImportAwareBeanPostProcessor
PostProcessorRegistrationDelegate$BeanPostProcessorChecker
CommonAnnotationBeanPostProcessor
refer to your config bean, that bean's properties will not be autowired after creation, because AutowiredAnnotationBeanPostProcessor has not been loaded yet.
For example, here properties is null and a NullPointerException is thrown:
@Component
public class BeanFactoryTest {

    @Autowired
    private IdGenProperties properties;

    @Bean
    public SnowflakeServer snowflakeServer() {
        System.out.println(properties.getBaseUrl()); // NullPointerException: properties was never autowired
        return null;
    }

    @Bean(name = "conversionService")
    public ConversionServiceFactoryBean getConversionService() {
        ConversionServiceFactoryBean bean = new ConversionServiceFactoryBean();
        Set<Converter> converters = new HashSet<>();
        converters.add(new StringToDateConverter());
        bean.setConverters(converters);
        return bean;
    }

    public static class StringToDateConverter implements Converter<String, Date> {

        public Date convert(String source) {
            SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            try {
                return sdf.parse(source);
            } catch (ParseException e) {
                e.printStackTrace();
            }
            return null;
        }
    }
}
Try adding @Component to your ShiroProperties class:
@Component
@ConfigurationProperties(prefix = ShiroProperties.SHIRO_PREFIX)
public class ShiroProperties {

    public static final String SHIRO_PREFIX = "shiro";

    private String urlLogin;
    private String urlSuccessed;
}
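A hedged addition of my own, not part of the answers above: since ShiroConfig already declares @EnableConfigurationProperties(ShiroProperties.class), constructor injection usually avoids the null field without adding @Component, because the dependency must be resolved before the configuration class itself is instantiated:

@Configuration
@EnableConfigurationProperties(ShiroProperties.class)
public class ShiroConfig {

    private final ShiroProperties shiroProperties;

    // Constructor injection: shiroProperties is resolved before any @Bean method runs.
    public ShiroConfig(ShiroProperties shiroProperties) {
        this.shiroProperties = shiroProperties;
    }
}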
I am having trouble caching internal methods within my DAO layer while in proxy mode.
I am aware that in proxy mode only external method calls coming in through the proxy are intercepted. However, I want to avoid switching to AspectJ mode and was wondering whether any other workarounds exist.
I am displaying my code below and am wondering what changes, if any, I can make to get this working.
-- Note: I am using Swagger to document my code.
-- Also note: my code has been watered down, for obvious reasons.
//Controller
@RestController
@Api(produces = "application/json", protocols = "https", tags = "Securities", description = "Securities information")
public class SecuritiesInfoController extends Controller {

    private SecuritiesInfoManager _securitiesInfoManager = new SecuritiesInfoManager();

    @RequestMapping(value = "/security", method = RequestMethod.GET)
    public List<SecuritiesInfo> getAll() {
        return _securitiesInfoManager.getAll();
    }
}

//Service
public class SecuritiesInfoManager extends Manager {

    private SecuritiesInfoDAO _securitiesDAO = new SecuritiesInfoDAO();

    public List<SecuritiesInfo> getAll() {
        return _securitiesDAO.getAll();
    }
}

//DAO
public class SecuritiesInfoDAO extends DAO {

    private static String securitiesTable = "Securities";

    @SecuritiesInfoDAOInterface
    public List<SecuritiesInfo> getAll() {
        //Magic
    }
}

//Interface
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD})
@Cacheable(cacheNames = "SecuritiesInfo", cacheManager = "cacheManager", keyGenerator = "keyGenerator")
public @interface SecuritiesInfoDAOInterface {
}
//CacheConfig
@Configuration
//@EnableCaching(mode = AdviceMode.PROXY)
@EnableCaching(proxyTargetClass = true)
//@EnableCaching
public class CacheConfig extends CachingConfigurerSupport {

    @Bean
    public SecuritiesInfoDAO myService() {
        // configure and return a class having @Cacheable methods
        return new SecuritiesInfoDAO();
    }

    @Bean
    public JedisConnectionFactory redisConnectionFactory() {
        JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
        // Defaults
        redisConnectionFactory.setHostName("Nope");
        redisConnectionFactory.setPort(LoL);
        System.out.println("IN CONNECTION");
        redisConnectionFactory.setPassword("Please help me :)");
        return redisConnectionFactory;
    }

    @Bean
    public RedisTemplate<String, String> redisTemplate(RedisConnectionFactory cf) {
        System.out.println("cf: " + cf.toString());
        RedisTemplate<String, String> redisTemplate = new RedisTemplate<String, String>();
        redisTemplate.setConnectionFactory(cf);
        return redisTemplate;
    }

    /*
    @Primary
    @Bean
    public RedisTemplate<String, ExpiringSession> redisTemplate2(RedisConnectionFactory connectionFactory) {
        RedisTemplate<String, ExpiringSession> template = new RedisTemplate<String, ExpiringSession>();
        template.setHashValueSerializer(new LdapFailAwareRedisObjectSerializer());
        template.setConnectionFactory(connectionFactory);
        return template;
    }
    */

    @Bean
    public CacheManager cacheManager(RedisTemplate<String, String> redisTemplate) {
        System.out.println("IN CACHE MANAGER");
        RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
        // Number of seconds before expiration. Defaults to unlimited (0)
        // cacheManager.setDefaultExpiration(300);
        return cacheManager;
    }

    @Bean
    public KeyGenerator keyGenerator() {
        return new KeyGenerator() {
            @Override
            public Object generate(Object o, Method method, Object... objects) {
                // This will generate a unique key of the class name, the method name,
                // and all method parameters appended.
                StringBuilder sb = new StringBuilder();
                sb.append(o.getClass().getName());
                sb.append(method.getName());
                for (Object obj : objects) {
                    sb.append(obj.toString());
                }
                System.out.println(sb.toString());
                return sb.toString();
            }
        };
    }
}
So I figured out the answer. It turns out I wasn't implementing/instantiating the interface correctly.
First I had to @Autowire my manager class in my controller, then @Autowire my DAO interface in my manager.
For a more detailed solution, I am placing my revised code below.
//Controller
@RestController
@Api(produces = "application/json", protocols = "https", tags = "Securities", description = "Securities information")
public class SecuritiesInfoController extends Controller {

    @Autowired
    private SecuritiesInfoManager _securitiesInfoManager;

    @RequestMapping(value = "/security", method = RequestMethod.GET)
    public List<SecuritiesInfo> getAll() {
        return _securitiesInfoManager.getAll();
    }
}

//Service
public class SecuritiesInfoManager extends Manager {

    private SecuritiesInfoDAOInterface _securitiesInfoDAOInterface;

    @Autowired
    public void setSecuritiesInfoDAOInterface(SecuritiesInfoDAOInterface _securitiesInfoDAOInterface) {
        this._securitiesInfoDAOInterface = _securitiesInfoDAOInterface;
    }

    public List<SecuritiesInfo> getAll() {
        return _securitiesInfoDAOInterface.getAll();
    }
}

//DAO
public class SecuritiesInfoDAO extends DAO implements SecuritiesInfoDAOInterface {

    private static String securitiesTable = "Securities";

    @Override
    public List<SecuritiesInfo> getAll() {
        //Magic
    }
}

//Interface
public interface SecuritiesInfoDAOInterface {

    @Cacheable(cacheNames = "SecuritiesInfo", cacheManager = "cacheManager", keyGenerator = "keyGenerator")
    List<SecuritiesInfo> getAll();
}
//CacheConfig
@Configuration
@EnableCaching
public class CacheConfig extends CachingConfigurerSupport {

    @Bean
    public SecuritiesInfoManager myService() {
        // configure and return a class having @Cacheable methods
        return new SecuritiesInfoManager();
    }

    // rest same as before
}
//WebConfig
@Configuration
@ComponentScan(basePackages = {"package name"})
public class WebConfig extends WebMvcConfigurerAdapter {

    @Override
    public void configurePathMatch(PathMatchConfigurer configurer) {
        AntPathMatcher matcher = new AntPathMatcher();
        matcher.setCaseSensitive(false);
        configurer.setPathMatcher(matcher);
    }
}