Guice injector.getInstance throwing ConfigurationException

I am creating an instance of Predicate using a provider.
@Provides
@Singleton
@Named("RecordFilters")
public Predicate<ImmutablePair<AbstractRecord, StreamRecord>> getAllFilters() {
    BackfillDataFilter backfillDataFilter = new BackfillDataFilter();
    DummyUpdateFilter dummyUpdateFilter = new DummyUpdateFilter();
    return input -> dummyUpdateFilter.test(input) && backfillDataFilter.test(input);
}
When I try to get an instance using injector.getInstance(Predicate.class) or injector.getInstance(Key.get(Predicate.class, Names.named("RecordFilters"))), I get an exception.
com.google.inject.ConfigurationException: Guice configuration errors:
No implementation for java.util.function.Predicate was bound.
while locating java.util.function.Predicate
Please suggest.

Try as described in this answer:
injector.getInstance(Key.get(new TypeLiteral<Predicate<ImmutablePair<AbstractRecord, StreamRecord>>>() {}))
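Since the provider also binds the name "RecordFilters", the lookup most likely needs both the full generic type and the Names.named qualifier. Here is a minimal sketch of that lookup; Commons Lang's ImmutablePair and the FiltersModule name are assumptions for illustration, not taken from the question:

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Key;
import com.google.inject.TypeLiteral;
import com.google.inject.name.Names;
import java.util.function.Predicate;
import org.apache.commons.lang3.tuple.ImmutablePair;

public class FilterLookup {

    public static void main(String[] args) {
        // FiltersModule is a placeholder for whichever module declares the @Provides method.
        Injector injector = Guice.createInjector(new FiltersModule());

        // Key.get(TypeLiteral, Annotation) keeps both the full generic type and the
        // @Named("RecordFilters") qualifier, so Guice can match the binding exactly.
        Predicate<ImmutablePair<AbstractRecord, StreamRecord>> filters =
                injector.getInstance(Key.get(
                        new TypeLiteral<Predicate<ImmutablePair<AbstractRecord, StreamRecord>>>() {},
                        Names.named("RecordFilters")));

        // The composed predicate can now be applied to each incoming pair.
    }
}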

UnsatisfiedDependencyException: bean expected to be of type 'ActiveMQConnectionFactory' but was actually of type 'org.springframework.cloud.sleuth.instrument.messaging.LazyTopicConnectionFactory'

I've been trying to find a solution to this for a long time; any insights would help!
I am getting the following error:
org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name
'routerConnectionFactory' defined in class path resource
[com/CONFIDENTIAL/event/processor/configuration/EventsConfiguration.class]: Unsatisfied dependency expressed through method 'routerConnectionFactory' parameter 0; nested exception is
org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named
'actionRouterConnectionFactory' is expected to be of type 'org.apache.activemq.ActiveMQConnectionFactory' but
was actually of type 'org.springframework.cloud.sleuth.instrument.messaging.LazyTopicConnectionFactory'
Code snippet:
@Bean(name = "routerConnectionFactory")
@Primary
public CachingConnectionFactory routerConnectionFactory(ActiveMQConnectionFactory actionRouterConnectionFactory) {
    CachingConnectionFactory cachingConnectionFactory = new CachingConnectionFactory();
    cachingConnectionFactory.setTargetConnectionFactory(actionRouterConnectionFactory);
    return cachingConnectionFactory;
}

@Bean
public ActiveMQConnectionFactory actionRouterConnectionFactory(
        @Value("${confidential.gateway.message.broker.url}") String brokerURL,
        @Value("${confidential.router.message.broker.user.name}") String userName,
        @Value("${confidential.router.message.broker.user.password}") String password,
        @Value("true") Boolean alwaysSyncSend, RedeliveryPolicy defaultEntry,
        @Value("${shared.amq.keystore.path:#{null}}") String keyStorePath,
        @Value("${shared.amq.keystore.password:#{null}}") String keyStorePassword) throws Exception {
    ActiveMQSslConnectionFactory targetConnectionFactory = new ActiveMQSslConnectionFactory();
    targetConnectionFactory.setBrokerURL(brokerURL);
    targetConnectionFactory.setUserName(userName);
    targetConnectionFactory.setPassword(password);
    if (!StringUtils.isEmpty(keyStorePath) && !StringUtils.isEmpty(keyStorePassword)) {
        targetConnectionFactory.setTrustStore(keyStorePath);
        targetConnectionFactory.setTrustStorePassword(keyStorePassword);
    }
    targetConnectionFactory.setAlwaysSyncSend(alwaysSyncSend);
    targetConnectionFactory.setRedeliveryPolicy(defaultEntry);
    return targetConnectionFactory;
}
spring-cloud-sleuth-core : 2.2.6.RELEASE
spring-cloud-sleuth-zipkin : 2.2.6.RELEASE
active-mq-broker, active-mq-camel, client, jms-pool, open-wire-legacy, pool, spring : 5.15.13
other spring boot and related dependencies : 2.2.6.RELEASE
https://edwin.baculsoft.com/2019/07/error-overriding-bean-of-same-name-declared-in-class-path-resource-when-integrating-spring-cloud-sleuth-and-activemq-library/
I have referred to multiple articles on this issue (also on Stack Overflow) and also tried disabling Sleuth, but it didn't help.
Any clue?
Your method signature asks for an ActiveMQConnectionFactory, which is tightly coupled to ActiveMQ. Most likely the intention is to couple to the JMS API instead, so change the code to use javax.jms.ConnectionFactory (ActiveMQConnectionFactory implements javax.jms.ConnectionFactory). As the error shows, Sleuth wraps the broker factory in a LazyTopicConnectionFactory, which still satisfies the JMS interface but is no longer an ActiveMQConnectionFactory, so the type-based injection fails.
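A minimal sketch of the change inside the same configuration class, keeping everything else from the question as-is; only the injected parameter type changes:

import javax.jms.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.jms.connection.CachingConnectionFactory;

// Depend on the JMS interface rather than the ActiveMQ class; the Sleuth-wrapped
// bean satisfies javax.jms.ConnectionFactory even though it is no longer an
// ActiveMQConnectionFactory.
@Bean(name = "routerConnectionFactory")
@Primary
public CachingConnectionFactory routerConnectionFactory(ConnectionFactory actionRouterConnectionFactory) {
    CachingConnectionFactory cachingConnectionFactory = new CachingConnectionFactory();
    // setTargetConnectionFactory accepts any javax.jms.ConnectionFactory
    cachingConnectionFactory.setTargetConnectionFactory(actionRouterConnectionFactory);
    return cachingConnectionFactory;
}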

Java 11 JUnit Jupiter assertThrows

I am trying to migrate from Java 8 to 11 and get an error in my test class that I don't understand.
My failing (Groovy) test is:
@SpringJUnitConfig
class TestSpringBeanScopeChecker {

    @Autowired
    ApplicationContext ctx

    @Test
    void testSingletonFail() {
        Assertions.assertThrows(IllegalStateException.class) {
            SpringBeanScopeChecker.check(ctx, DummyPrototype.class, BeanDefinition.SCOPE_SINGLETON)
        }
    }
}
The SpringBeanScopeChecker:
public class SpringBeanScopeChecker {

    private SpringBeanScopeChecker() {}

    public static void check(ApplicationContext ctx, Class<?> type, String scope)
            throws IllegalStateException {
        AbstractApplicationContext actx = (ctx instanceof AbstractApplicationContext)
                ? ((AbstractApplicationContext) ctx)
                : new StaticApplicationContext(ctx);
        ConfigurableListableBeanFactory factory = actx.getBeanFactory();
        for (String key : ctx.getBeanNamesForType(type)) {
            BeanDefinition definition = factory.getMergedBeanDefinition(key);
            if (!scope.equals(definition.getScope())) {
                throw new IllegalStateException(
                        "Every spring bean "
                                + "must be request scoped in the bean configuration. The current scope is: "
                                + definition.getScope());
            }
        }
    }
}
So for the test I'm expecting an IllegalStateException, and this works fine with Java 8.
When I switch to Java 11 and execute the test I get this error:
[ERROR] testSingletonFail Time elapsed: 0.009 s <<< FAILURE!
org.opentest4j.AssertionFailedError: Unexpected exception type thrown
==> expected: <java.lang.IllegalStateException> but was: <java.lang.AbstractMethodError>
at TestSpringBeanScopeChecker.testSingletonFail(TestSpringBeanScopeChecker.groovy:22)
Caused by: java.lang.AbstractMethodError: Receiver class
TestSpringBeanScopeChecker does not define or inherit an
implementation of the resolved method 'abstract java.lang.Object
getProperty(java.lang.String)' of interface groovy.lang.GroovyObject.
at TestSpringBeanScopeChecker.testSingletonFail(TestSpringBeanScopeChecker.groovy:22)
In case someone else has the same problem, here is the solution.
The problem was a misconfiguration of the groovy-eclipse-compiler and groovy-eclipse-batch.
My Groovy version is managed by Spring Boot, and I didn't update groovy-eclipse-batch to match the groovy.version from the Spring Boot POM.
According to this issue on GitHub:
You have to compile with groovy-eclipse-batch and the Groovy runtime in the same version; groovy-eclipse-batch and the Groovy runtime should be matched up, e.g. batch 2.5.10-0x with runtime 2.5.10, or batch 3.0.1-0x with runtime 3.0.1.
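For reference, a minimal sketch of the Maven compiler setup this refers to; the artifact coordinates are the standard Groovy-Eclipse ones, but the version numbers below are placeholders and must be aligned with the Groovy runtime your build actually resolves:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <!-- delegate compilation of Groovy and Java sources to the Groovy-Eclipse compiler -->
        <compilerId>groovy-eclipse-compiler</compilerId>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-eclipse-compiler</artifactId>
            <version>3.6.0-03</version> <!-- placeholder version -->
        </dependency>
        <dependency>
            <groupId>org.codehaus.groovy</groupId>
            <artifactId>groovy-eclipse-batch</artifactId>
            <!-- must match the Groovy runtime, e.g. batch 3.0.1-01 for runtime 3.0.1 -->
            <version>3.0.1-01</version>
        </dependency>
    </dependencies>
</plugin>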

When my Spring app runs, it isn't using my TogglzConfig file

I have a large Spring application that is set up without XML using only annotations. I have made some changes to this application and have a separate project with what should be almost all the same code. However, in this separate project, Togglz seems to be using some sort of default config instead of the TogglzConfig file I've set up.
The first sign that something was wrong was that I couldn't access the Togglz console: I got a 403 Forbidden error despite my config being set to allow anyone to use it (as shown on the Togglz site). I then did some tests and tried to see a list of features, and the list is empty when I call FeatureContext.getFeatureManager().getFeatures(), despite my Feature class having several features defined. This is why I think it's using some sort of default.
Features.java
public enum Features implements Feature {
    FEATURE1,
    FEATURE2,
    FEATURE3,
    FEATURE4,
    FEATURE5;

    public boolean isActive() {
        return FeatureContext.getFeatureManager().isActive(this);
    }
}
TogglzConfiguration.java
@Component
public class TogglzConfiguration implements TogglzConfig {

    public Class<? extends Feature> getFeatureClass() {
        return Features.class;
    }

    public StateRepository getStateRepository() {
        File properties = [internal call to property file];
        try {
            return new FileBasedStateRepository(properties);
        } catch (Exception e) {
            throw new TogglzConfigException("Error getting Togglz configuration from " + properties + ".", e);
        }
    }

    @Override
    public UserProvider getUserProvider() {
        return new UserProvider() {
            @Override
            public FeatureUser getCurrentUser() {
                return new SimpleFeatureUser("admin", true);
            }
        };
    }
}
SpringConfiguration.java
@EnableTransactionManagement
@Configuration
@ComponentScan(basePackages = { "root package for the entire project" }, excludeFilters =
        @ComponentScan.Filter(type = FilterType.ANNOTATION, value = Controller.class))
public class SpringConfiguration {

    @Bean
    public TransformerFactory transformerFactory() {
        return TransformerFactory.newInstance();
    }

    @Bean
    public DocumentBuilderFactory documentBuilderfactory() {
        return DocumentBuilderFactory.newInstance();
    }

    @Bean
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }
}
My project finds a bunch of other beans set up with the @Component annotation. I don't know if the problem is that this component isn't being picked up at all or if Togglz simply isn't using it for some reason. I tried printing the name of the FeatureManager returned by FeatureContext.getFeatureManager() and it is FallbackTestFeatureManager, so this seems to confirm my suspicion that it's just not using my config at all.
Does anyone have any ideas on what I can check? I'm flat out of ideas, especially since this is working in an almost identical IntelliJ project on my machine right now. I just can't find out what's different about the Togglz setup or the Spring configuration. Thanks in advance for your help.
I finally had my light bulb moment and solved this problem. In case anyone else has a similar issue, it seems my mistake was having the Togglz testing and JUnit dependencies added to my project but not limiting them to the test scope. I overlooked that part of the site.
<!-- Togglz testing support -->
<dependency>
    <groupId>org.togglz</groupId>
    <artifactId>togglz-testing</artifactId>
    <version>2.5.0.Final</version>
    <scope>test</scope>
</dependency>
Without that scope, I assume these were overriding the Togglz configuration I created with a default test configuration and that was causing my issue.

Grails: how to call a service when the service name comes from a string

I want to call services dynamically, so the service name will be given as a string value. We can list all the service names in the Grails project by using the code below.
import org.codehaus.groovy.grails.plugins.metadata.GrailsPlugin

for (type in ['service']) {
    for (artifactClass in ctx.grailsApplication."${type}Classes") {
        def clazz = artifactClass.clazz
        def annotation = clazz.getAnnotation(GrailsPlugin)
        if (annotation) {
            println "$type $clazz.name from plugin '${annotation.name()}'"
        } else {
            println "$type $clazz.name from application"
        }
    }
}
Here we get the artifactClass of the service. Is there any option to call the service by using this idea? Please help me.
You can get the bean for the service from the applicationContext:
// inject the application context bean
def applicationContext

// to use it
applicationContext."${yourServiceName}".serviceMethod()
You can get the bean of your service this way:
import grails.util.Holders
...
YourService yourService =
(YourService)Holders.grailsApplication.mainContext["yourService"]

Kafka Storm Integration using Kafka Spout

I am using KafkaSpout. Please find the test program below.
I am using Storm 0.8.1. The MultiScheme class is there in Storm 0.8.2, and I will be using that. I just want to know how the earlier versions worked just by instantiating the StringScheme class. Where can I download earlier versions of the Kafka spout? But I doubt that would be a better alternative than working with Storm 0.8.2. (Confused.)
When I run the code (given below) on the Storm cluster (i.e. when I push my topology) I get the following error (this happens when the scheme part is commented out; otherwise, of course, I get a compiler error because the class is not there in 0.8.1):
java.lang.NoClassDefFoundError: backtype/storm/spout/MultiScheme
at storm.kafka.TestTopology.main(TestTopology.java:37)
Caused by: java.lang.ClassNotFoundException: backtype.storm.spout.MultiScheme
In the code given below you may find the spoutConfig.scheme = new StringScheme(); part commented out. I was getting a compiler error if I didn't comment out that line, which is only natural as there is no matching constructor there. Also, when I instantiate MultiScheme I get an error as I don't have that class in 0.8.1.
public class TestTopology {

    public static class PrinterBolt extends BaseBasicBolt {

        public void declareOutputFields(OutputFieldsDeclarer declarer) {
        }

        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.println(tuple.toString());
        }
    }

    public static void main(String[] args) throws Exception {
        List<HostPort> hosts = new ArrayList<HostPort>();
        hosts.add(new HostPort("127.0.0.1", 9092));

        LocalCluster cluster = new LocalCluster();
        TopologyBuilder builder = new TopologyBuilder();

        SpoutConfig spoutConfig = new SpoutConfig(new KafkaConfig.StaticHosts(hosts, 1), "test", "/zkRootStorm", "STORM-ID");
        spoutConfig.zkServers = ImmutableList.of("localhost");
        spoutConfig.zkPort = 2181;
        //spoutConfig.scheme = new StringScheme();
        spoutConfig.scheme = new SchemeAsMultiScheme(new StringScheme());

        builder.setSpout("spout", new KafkaSpout(spoutConfig));
        builder.setBolt("printer", new PrinterBolt())
                .shuffleGrouping("spout");

        Config config = new Config();
        cluster.submitTopology("kafka-test", config, builder.createTopology());

        Thread.sleep(600000);
    }
}
I had the same problem. I finally resolved it and put a complete running example up on GitHub.
You are welcome to check it out here:
https://github.com/buildlackey/cep
(Click on the storm+kafka directory for a sample program that should get you up and running.)
We had a similar issue.
Our solution (see the sketch after these steps):
1. Open pom.xml
2. Change the scope from provided to <scope>compile</scope>
If you want to know more about dependency scopes, check the Maven documentation:
Maven docs - dependency scopes
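A minimal sketch of what that change looks like in the POM; the storm:storm coordinates and 0.8.1 version are assumptions based on the question, so substitute whichever Storm artifact your build actually declares:

<dependency>
    <groupId>storm</groupId>
    <artifactId>storm</artifactId>
    <version>0.8.1</version>
    <!-- was <scope>provided</scope>: with 'provided' the Storm classes are not on the
         runtime classpath when the topology is launched locally, which can surface as
         NoClassDefFoundError; 'compile' keeps them available -->
    <scope>compile</scope>
</dependency>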
