Is there any way I can provide a parameter to an API call (not to a member of a class) using Spring?
I know I can pass the result of one API call to a member of a class:
<bean id="registryService" class="foo.MyRegistry">
...properties set etc...
</bean>
<bean id="MyClient" class="foo.MyClient">
<property name="endPoint" value="#{registryService.getEndPoint('bar')}"/>
</bean>
But I want to pass a value to the API itself (basically I am trying to add an ActionListener to a JButton from Spring).
Not really a Spring expert, but... in Spring 3:
@Value("#{properties.getAppropriateActionListener()}")
public void setActionListener(ActionListener listener) {
    myJButton.addActionListener(listener);
}
Also, Spring expects setEndPoint() and getEndPoint() methods in order to resolve the property named "endPoint". Declaring the property like that effectively passes the value to the setEndPoint() method, so passing a value to an API (which I assume means invoking a method call) is actually pretty straightforward.
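For illustration, here is a minimal sketch of what the MyClient class from the question might look like; the field and setter names are assumptions that simply mirror the "endPoint" property:
public class MyClient {

    private String endPoint;

    // Spring calls this setter when it processes
    // <property name="endPoint" value="#{registryService.getEndPoint('bar')}"/>
    public void setEndPoint(String endPoint) {
        this.endPoint = endPoint;
    }

    public String getEndPoint() {
        return endPoint;
    }
}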
Is it possible to chain methods in factory-method in Spring to create beans? For example, I have the following API:
SomeObject.builder().build();
Is there some way I can create this bean in Spring XML config directly, without creating two beans? For example:
<bean id="fooBar" class="com.foo.bar.SomeObject" factory-method="builder().build"/>
Note: the SomeObject.builder() call returns a SomeObjectBuilder object (a private static class nested within SomeObject).
You can't do that; you can only specify a single method name (without parentheses). But in the SomeObject class you can create a static method that does it for you. For example:
// inside SomeObject
static SomeObject newFactoryMethod() {
    return builder().build();
}
And add it to the XML:
<bean id="fooBar" class="com.foo.bar.SomeObject" factory-method="newFactoryMethod"/>
I am trying to reproduce an assignment in Java code with an equivalent bean definition in Spring. As far as I can tell, though, Spring only lets you assign values to the fields within an object (provided that the class defines the appropriate setter methods). Is there a way to simply capture a reference to an object using Spring beans?
Here's an example of how I would expect this to work:
<!-- Non-working example. -->
<bean id="string" class="java.lang.String">
<value>"I am a string."</value>
</bean>
I realize that in this particular case I could just use a <constructor-arg>, but I'm looking for a more general solution, one that also works for classes that don't provide parameterized constructors.
The String class is immutable; there is no property setter method in java.lang.String. If you want to inject a property value, you can do it like this:
<bean id="emp" class="com.org.emp">
<property name="name" value="Alex" />
</bean>
In the above, the emp bean's name property will be set to "Alex".
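This works because the class behind com.org.emp exposes a matching setter. A minimal sketch of such a class (the field and setter are assumptions that mirror the "name" property; the class name is capitalized here for readability):
public class Emp {

    private String name;

    // Spring calls this when it processes <property name="name" value="Alex"/>
    public void setName(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}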
The thing to use here is a factory-method, possibly in conjunction with a factory-bean. (A non-static method must be invoked on a bean of the appropriate type.) In my example problem, I wanted to capture the output of a method that returns a String. Let's say the method looks like this:
class StringReturner {
    public String gimmeUhString(String inStr) {
        return "Your string is: " + inStr;
    }
}
First I need to create a bean of type StringReturner.
<bean name="stringReturner" class="how.do.i.java.StringReturner" />
Then I instantiate my String bean by calling the desired method as a factory-method. You can even provide parameters to the factory method using <constructor-arg> elements:
<bean id="string" factory-bean="stringReturner" factory-method="gimmeUhString">
<constructor-arg>
<value>I am a string.</value>
</constructor-arg>
</bean>
This is (for my purposes) equivalent to saying:
StringReturner stringReturner = new StringReturner();
String string = stringReturner.gimmeUhString("I am a string.");
String is not a bean. A bean is an object whose class has a no-argument constructor and whose properties are readable through getter methods and modifiable through setter methods.
The issue
A while back I started using MongoDB and Spring Data. I'd left most of the default functionality in place, so all of my documents were stored in MongoDB with a _class field pointing to the entity's fully-qualified class name.
Right away that didn't "smell" right to me, but I left it alone. Until recently, when I refactored a bunch of code, and suddenly none of my documents could be read back from MongoDB and converted into their (refactored/renamed) Java entities. I quickly realized that it was because there was now a fully-qualified-classname mismatch. I also quickly realized that--given that I might refactor again sometime in the future--if I didn't want all of my data to become unusable I'd need to figure something else out.
What I've tried
So that's what I'm doing, but I've hit a wall. I think that I need to do the following:
Annotate each entity with @TypeAlias("ta"), where "ta" is a unique, stable string.
Configure and use a different TypeInformationMapper for Spring Data to use when converting my documents back into their Java entities; it needs to know, for example, that a type-alias of "widget.foo" refers to com.myapp.document.FooWidget.
I determined that I should use a TypeInformationMapper of type org.springframework.data.convert.MappingContextTypeInformationMapper. Supposedly a MappingContextTypeInformationMapper will scan my entities/documents to find #TypeAlias'ed documents and store an alias->to->class mapping. But I can't pass that to my MappingMongoConverter; I have to pass a subtype of MongoTypeMapper. So I am configuring a DefaultMongoTypeMapper, and passing a List of one MappingContextTypeInformationMapper as its "mappers" constructor arg.
Code
Here's the relevant part of my spring XML config:
<bean id="mongoTypeMapper" class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey" value="_class"></constructor-arg>
<constructor-arg name="mappers">
<list>
<ref bean="mappingContextTypeMapper" />
</list>
</constructor-arg>
</bean>
<bean id="mappingContextTypeMapper" class="org.springframework.data.convert.MappingContextTypeInformationMapper">
<constructor-arg ref="mappingContext" />
</bean>
<bean id="mappingMongoConverter"
class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingContext" />
<property name="mapKeyDotReplacement" value="__dot__" />
<property name="typeMapper" ref="mongoTypeMapper"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingMongoConverter" />
</bean>
Here's a sample entity/document:
#Document(collection="widget")
#TypeAlias("widget.foo")
public class FooWidget extends Widget {
// ...
}
One important note is that any such "Widget" entity is stored as a nested document in Mongo. So in reality you won't really find a populated "Widget" collection in my MongoDB instance. Instead, a higher-level "Page" class can contain multiple "widgets" like so:
#Document(collection="page")
#TypeAlias("page")
public class Page extends BaseDocument {
// ...
private List<Widget> widgets = new ArrayList<Widget>();
}
The error I'm stuck on
What happens is that I can save a Page along with a number of nested Widgets in Mongo. But when I try to read said Page back out, I get something like the following:
org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [com.myapp.document.Widget]: Is it an abstract class?
I can indeed see pages in Mongo containing "_class" : "page", with nested widgets also containing "_class" : "widget.foo". It just appears that the mapping is not being applied in reverse.
Is there anything I might be missing?
In the default setting, the MappingMongoConverter creates a DefaultMongoTypeMapper which in turn creates a MappingContextTypeInformationMapper.
That last class is the one responsible for maintaining the typeMap cache between TypeInformation and aliases.
That cache is populated in two places:
In the constructor, for each mappingContext.getPersistentEntities()
When writing an object of an aliased type.
So if you want to make sure the alias is recognized in any context, you need to make sure that all your aliased entities are part of mappingContext.getPersistentEntities().
How you do that depends on your configuration. For instance:
if you're using AbstractMongoConfiguration, you can override its getMappingBasePackage() to return the name of a package containing all of your entities.
if you're using Spring Boot, you can use @EntityScan to declare which packages to scan for entities.
in any case, you can always configure it with a custom set (from a static list or a custom scan) using mongoMappingContext.setInitialEntitySet(), as sketched below.
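A minimal sketch of that last option, assuming a Java @Configuration class and reusing the Page and FooWidget classes from the question (any other aliased entities would be listed the same way):
@Bean
public MongoMappingContext mongoMappingContext() {
    MongoMappingContext context = new MongoMappingContext();
    // Register every aliased entity up front so the alias-to-class map
    // is populated before the first read, not only after a write.
    context.setInitialEntitySet(new HashSet<Class<?>>(
            Arrays.<Class<?>>asList(Page.class, FooWidget.class)));
    return context;
}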
One side note: for an entity to be discovered by a scan, it has to be annotated with either @Document or @Persistent.
More information can be found in the spring-data-commons Developer Guide.
I spent a bunch of time with my debugger and the Spring Data source code, and I learned that Spring Data isn't as good with polymorphism as it probably should be, especially given the schema-less nature of NoSQL solutions like MongoDB. But ultimately what I did was to write my own type mapper, and that wasn't too tough.
The main problem was that, when reading in my Page document, the default mappers used by Spring Data would see a collection called widgets, then consult the Page class to determine that widgets pointed to a List, then consult the Widget class to look for @TypeAlias information. What I needed instead was a mapper that scanned my persistent entities up front and stored an alias-to-class mapping for later use. That's what my custom type mapper does.
I wrote a blog post discussing the details.
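The core of it looks roughly like the sketch below. To be clear, this is an illustration rather than my exact code: the class name and constructor are hypothetical, the types (TypeInformationMapper, TypeInformation, ClassTypeInformation, TypeAlias) come from spring-data-commons, and the method signatures shown match the Spring Data versions of that era, so check them against whatever version you are on.
public class ScanningTypeInformationMapper implements TypeInformationMapper {

    private final Map<Object, TypeInformation<?>> aliasToType =
            new HashMap<Object, TypeInformation<?>>();

    // Scan the given entity classes once, up front, and remember their aliases.
    public ScanningTypeInformationMapper(Collection<Class<?>> entityClasses) {
        for (Class<?> entity : entityClasses) {
            TypeAlias alias = entity.getAnnotation(TypeAlias.class);
            if (alias != null) {
                aliasToType.put(alias.value(), ClassTypeInformation.from(entity));
            }
        }
    }

    @Override
    public TypeInformation<?> resolveTypeFrom(Object alias) {
        // Reading: look the alias up in the map built at startup.
        return aliasToType.get(alias);
    }

    @Override
    public Object createAliasFor(TypeInformation<?> type) {
        // Writing: store the alias if present, otherwise fall back to the class name.
        TypeAlias alias = type.getType().getAnnotation(TypeAlias.class);
        return alias != null ? alias.value() : type.getType().getName();
    }
}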
If you extend AbstractMongoConfiguration, you can override method getMappingBasePackage to specify the base package for your documents.
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {

    @Override
    protected String getMappingBasePackage() {
        return "com.example";
    }
}
Update: In spring-data-mongodb 2+ you should use:
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {

    @Override
    protected Collection<String> getMappingBasePackages() {
        return Arrays.asList("com.example");
    }
}
because getMappingBasePackage() is now deprecated and won't work.
Today I ran into the exact same issue. After more research I found out that my subclass was missing a repository. It appears that Spring Data uses the repositories to determine which concrete subclass to create, and when one is missing, it falls back to the superclass, which in this case is abstract.
So please try adding a FooWidgetRepository and mapping it to FooWidget with the correct ID type. It might work in your case as well.
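For example, a minimal sketch (assuming FooWidget uses a String id; adjust the ID type to whatever your entity actually declares):
public interface FooWidgetRepository extends MongoRepository<FooWidget, String> {
    // no query methods needed; its mere presence registers the subclass
}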
If you use Spring Boot with auto-configuration, declaring the following bean can help:
@Bean
MongoMappingContext mongoMappingContext(ApplicationContext applicationContext, MongoCustomConversions conversions) throws ClassNotFoundException {
    MongoMappingContext context = new MongoMappingContext();
    context.setInitialEntitySet(new EntityScanner(applicationContext).scan(Persistent.class));
    context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
    return context;
}
The line that does the trick is:
new EntityScanner(applicationContext).scan(Persistent.class)
Instead of scanning only for @Document, it will scan for both @Document and @TypeAlias, since both of these annotations are themselves annotated with @Persistent.
Andreas Svensson is right, this can be done much simpler than described by Dave Taubler.
I posted a slightly more elaborate answer than Andreas' (including sample code) in this post. Excerpt:
So all you need to do is declare an "unused" repository interface for each of your sub-classes, just like the one you proposed as "unsafe" in your OP:
public interface NodeRepository extends MongoRepository<Node, String> {
    // all of your repo methods go here
    Node findById(String id);
    Node findFirst100ByNodeType(String nodeType);
    // ... etc.
}

public interface LeafType1Repository extends MongoRepository<LeafType1, String> {
    // leave empty
}

public interface LeafType2Repository extends MongoRepository<LeafType2, String> {
    // leave empty
}
I have a requirement to initialize the "flag" field in a Java enum via Spring:
public enum EntitySequenceType {
    TypeOne(-1),
    TypeTwo(-1000);

    private boolean flag;

    EntitySequenceType(long firstId) {
        System.out.println("enum's constructor: " + this.name() + " firstId=" + firstId);
    }

    public void setFlag(boolean val) {
        this.flag = val;
    }

    public boolean getFlag() {
        return this.flag;
    }
}
The Spring config is:
<bean id="myEnum" class="com.maven.start.maven_spring.EntitySequenceType"
factory-method="valueOf">
<property name="flag" value="true"/>
<constructor-arg>
<value>TypeOne</value>
</constructor-arg>
</bean>
But I have run into some problems, so I have the following questions:
1. It seems that I can only write a single value in the <constructor-arg> tag in the config XML; I can't figure out why.
2. When debugging the code, I found that although I only wrote one constructor-arg value in the config XML, the constructor is called twice when Spring initializes the bean. How could this happen?
3. In the constructor of EntitySequenceType, I found that the "flag" value is null. Why? afterPropertiesSet() can be called if the enum implements InitializingBean, but it is not called every time an enum constant is constructed. Is there any method that is called after the field is set by Spring, and called for every enum constant?
Thanks for your answers!
It seems that I can only write a single value in the <constructor-arg> tag in the config XML; I can't figure out why.
Used with a factory-method, the constructor-arg values refer to the parameter list of that factory method. EntitySequenceType.valueOf(String) takes only one argument, a String.
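In plain Java, the bean definition amounts to something like this (a sketch of what Spring does, not generated code):
// <constructor-arg> supplies the factory method's argument
EntitySequenceType myEnum = EntitySequenceType.valueOf("TypeOne");
// <property name="flag" value="true"/> becomes a setter call on the result
myEnum.setFlag(true);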
When debugging the code, I found that although I only wrote one constructor-arg value in the config XML, the constructor is called twice when Spring initializes the bean. How could this happen?
Enum types, like any other types, have their .class file loaded and initialized when they are first referenced in the code. The enum constants
TypeOne(-1),
TypeTwo(-1000);
are actually static fields in the compiled byte code. As such, they are initialized when the class is initialized. Those are two constructor calls, so that is what you see.
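Roughly speaking, the compiler turns those constants into something like this (a simplified sketch of the generated static fields, omitting the synthetic name/ordinal arguments):
// each constant is a static field, initialized with its own constructor call
// when the enum class is first loaded
public static final EntitySequenceType TypeOne = new EntitySequenceType(-1);
public static final EntitySequenceType TypeTwo = new EntitySequenceType(-1000);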
The constructor-arg value has nothing to do with these constructors, it has to do with your factory-method.
In the constructor of EntitySequenceType, I found that the "flag" value is null. Why? afterPropertiesSet() can be called if the enum implements InitializingBean, but it is not called every time an enum constant is constructed. Is there any method that is called after the field is set by Spring, and called for every enum constant?
It can't be null; it's a primitive type (a boolean defaults to false). Your property is set after the factory-method has been called and executed. There is no need to implement InitializingBean.
Don't use an enum for this. Enums are meant to be constant.
The problem is:
The constructor for an enum type must be package-private or private access. It automatically creates the constants that are defined at the beginning of the enum body. You cannot invoke an enum constructor yourself.
I have a class ResourceFetcher with a static method fetchResource(String reference). I want to inject the resource returned by it into another class, JobRunner. Can anyone suggest the cleanest way of doing this?
I do not want to pass ResourceFetcher into JobRunner. In fact, I have an enum with a set of keys, and I need to pass a map of key-value pairs into JobRunner, with the values obtained by invoking fetchResource.
One thing I want to clarify is that ResourceFetcher's fetchResource returns an object of type String.
Thanks in advance.
<bean id="resource" class="com.x.y.ResourceFetcher" factory-method="fetchResource">
<constructor-arg value="someReference"/>
</bean>
You can then inject resource into your JobRunner bean.
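For example (a sketch; the jobRunner bean class and its "resource" property name are assumptions, not taken from the question):
<bean id="jobRunner" class="com.x.y.JobRunner">
    <property name="resource" ref="resource"/>
</bean>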
If the fetchResource method is static on ResourceFetcher, why can't JobRunner simply refer to it? I don't see the need to inject ResourceFetcher.