Dynamically trigger object method on application start - java

I have an application which should produce cars and operate on them. Car object creation is a complex process, so I need a factory for each type of car. I also want users to be able to provide their own car types and the factories which produce them. These car types and factories should be plugged into my application as jars (there is probably a better way than jars, but I don't see it).
I've come to the idea of a common CarFactory which accepts the name of the car ("mercedes", "bmw", "nissan", etc.) as an argument. CarFactory has a map where each name is mapped to its own factory class. The code looks something like this (sorry, I can't provide a working copy because I'm still evaluating the idea and don't have a version which compiles without errors):
import scala.collection.mutable.Map

object CarFactory {
  val knownCarTypes = Map[String, Class[_ <: Factory]]()

  def create(carType: String): Option[Car] = knownCarTypes.get(carType) match {
    case Some(factoryClass) => Some(factoryClass.getMethod("create").invoke(null).asInstanceOf[Car])
    case None => None
  }
}
The knownCarTypes map is mutable because I want user factories to register themselves on it, providing which type of car they are responsible for and the factory class. From a user class it looks like this:
class Mercedes extends Car

object MercedesFactory extends Factory {
  def register() {
    CarFactory.knownCarTypes("mercedes") = getClass
  }

  def create() = new Mercedes()
}
And here is my question: I don't know how to trigger the register() method of a user factory. Is it possible? Is there a better solution than my approach?
I thought about making a common trait for factories, finding all loaded classes implementing the trait, and triggering the method via reflection. But that looks quite complex. I hope some design pattern or OOP trick can be used here. What do you think?
Thanks!

If I understand your question correctly, all you have to do is call register from the object's "body":
object MercedesFactory extends Factory {
  def register() {
    CarFactory.knownCarTypes("mercedes") = getClass
  }

  register()

  def create() = new Mercedes()
}
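As a side note, the registration step can also avoid reflection entirely by registering factory functions instead of classes. A minimal Java sketch of that idea (Car, Mercedes, and CarRegistry are hypothetical stand-ins for the types in the question):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.function.Supplier;

// Hypothetical minimal car hierarchy, standing in for the question's types.
interface Car {}
class Mercedes implements Car {}

// A registry mapping car names to factory functions; each user factory
// calls register() once when its jar is loaded.
class CarRegistry {
    private static final Map<String, Supplier<Car>> known = new HashMap<>();

    static void register(String name, Supplier<Car> factory) {
        known.put(name, factory);
    }

    static Optional<Car> create(String name) {
        // Unknown names yield Optional.empty(), mirroring the None case above.
        return Optional.ofNullable(known.get(name)).map(Supplier::get);
    }
}
```

Registering a Supplier keeps creation type-checked at compile time, whereas getMethod("create").invoke(null) can only fail at runtime.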

Finally I got it working via reflection. I iterated over all jars on a specified path, found all classes implementing my com.example.Factory trait, and triggered their register() method. For jar inspection I used Clapper's ClassFinder, and for invoking the object method I followed Thomas Jung's advice. Here is the final code:
import java.io.File

import org.apache.commons.io.FileUtils
import org.clapper.classutil.{ClassFinder, ClassInfo}
import scala.collection.JavaConverters._

def triggerFactories() {
  val jars = FileUtils.iterateFiles(new File("lib"), Array[String]("jar"), true).asScala.toList
  val classes = ClassFinder(jars).getClasses
  val factories = ClassFinder.concreteSubclasses("com.example.Factory", classes)
  factories.foreach { (factory: ClassInfo) =>
    companion[Factory](factory.name).register()
  }
}

def companion[T](name: String)(implicit man: Manifest[T]): T =
  Class.forName(name).getField("MODULE$").get(man.erasure).asInstanceOf[T]
It worked for me. It looks tricky, but I hope it won't break anything in my application in the future. Please post if there is a better approach; I'll re-accept the answer.

Related

Mock a Scala object's method call in a Java class

I have Java code where I am calling a Scala object's method (from a dependent library). For writing a unit test I want to mock the Scala object's method invocation.
Operations.scala
object Operations {
  def load(spark: SparkSession, path: String): Dataset[Row] = {
    ??? // implementation omitted in the question
  }
}
Main.java
public class Main {
    public void callMethod() {
        Dataset<Row> df = Operations.load(sparkSession, path);
    }
}
I want to write a unit test for callMethod() and hence mock the Operations.load call. I did not find a way to do it.
Any help will be appreciated.
If you are willing to change the signature of your callMethod method, one possible solution is to use interfaces and dependency injection.
Create an interface which gives you a Dataset[Row]:
trait ObtainDataSet {
  def obtain: Dataset[Row]
}
And then have your callMethod have a parameter that takes that interface:
public void callMethod(ObtainDataSet obtainDataSet) {
    Dataset<Row> df = obtainDataSet.obtain();
}
Then on production code you inject the real implementation using the Operations.load method:
class ObtainDataSetImpl(spark: SparkSession, path: String) extends ObtainDataSet {
  override def obtain: Dataset[Row] = Operations.load(spark, path)
}
But you can easily swap that production implementation for something ad-hoc on the tests. You don't even need to use mocking frameworks:
class MyMock extends ObtainDataSet {
  override def obtain: Dataset[Row] = ??? // something useful for tests
}
But if you do want a mocking framework, ScalaMock is commonly used.
Here is how this is usually done ... I'll write it in scala, because java syntax is soooo much more verbose (and frankly, I don't remember it very well), but it is the same idea.
trait Operations {
  def load(s: SparkSession, path: String): Dataset[Row]
}

object Operations extends Operations {
  def load(s: SparkSession, path: String) = ??? // implementation here
}

class Main(operations: Operations = Operations) {
  def callMethod(sparkSession: SparkSession, path: String): Unit = {
    val df = operations.load(sparkSession, path)
    // do stuff
  }
}
and then in the test you write (this is with Mockito/ScalaTest, but, again, the same idea applies to whatever library you use for testing):
val ops = mock[Operations]
when(ops.load(any, any)).thenReturn(whatever)
new Main(ops).callMethod(mock[SparkSession], "/foo/bar")
This approach is called "dependency injection": your Main class has a "dependency" on Operations that is now "injected" at run time rather than being hardcoded as in your snippet. This allows you to use different dependency instances for production (the actual Operations object) and test (the mock) runs.
Also, this is a separate topic, but I thought it is still worth mentioning: Unit (void) return type for the method you are testing doesn't seem like the best idea, and will likely cause you more problems testing it down the road (how do you make sure the method actually produced whatever side effects you are expecting from it?).
It would be better to make this method return the actual result, and have the caller take care of the side effects. Then you could easily test that it produces the result you expect:
new Main(ops).callMethod(mock[SparkSession], "/foo/bar") shouldBe "ExpectedFooResult"
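The whole arrangement can be reduced to a self-contained sketch without Spark or a mocking library; Dataset, Operations, and Main below are hypothetical stand-ins for the real types, with load taking just a path for brevity:

```java
import java.util.Objects;

// Hypothetical stand-in for Spark's Dataset<Row>.
interface Dataset { int size(); }

// The dependency expressed as an interface, as suggested above.
interface Operations {
    Dataset load(String path);
}

class Main {
    private final Operations operations;

    // The dependency is injected through the constructor.
    Main(Operations operations) {
        this.operations = Objects.requireNonNull(operations);
    }

    // Returning the result (instead of void) makes the method easy to assert on.
    Dataset callMethod(String path) {
        return operations.load(path);
    }
}
```

In a test no framework is needed: new Main(path -> () -> 3) hands Main a fake Operations whose load returns a three-row dataset.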

How to deal with abstract classes as entry API contracts for microservices and tackle polymorphism at the same time?

I am going through dozens of tutorials which prove to be of very little help, because production code is not an animal, a bird, or a human, nor a weapon of the cutting or shooting type; it is much more complex to reason about.
So returning to reality, scenario:
service 1 is exchanging messages with service 2 through Kafka; messages are serialized/deserialized with Jackson, and the model class is shared between the services as a jar.
Now the plague part, the culmination of evil :
@JsonTypeInfo(
    use = Id.NAME,
    property = "type",
    visible = true
)
@JsonSubTypes({
    @Type(value = InternalTextContent.class, name = "text"),
    @Type(value = InternalImageContent.class, name = "image"),
    @Type(value = InternalAudioContent.class, name = "audio"),
    @Type(value = InternalCustomContent.class, name = "custom")
})
public abstract class InternalContent {
    @JsonIgnore
    private ContentType type;

    public InternalContent() {
    }
    // ...
}
Obviously, when the time comes to work with this content, we will have something like
message.getInternalContent()
which results in a sea of switch statements, if conditions, instanceof checks and, wait for it... downcasting everywhere.
And this is just one example property; the wrapping object contains more. Clearly I cannot add polymorphic behaviour to InternalContent, because (hello!) it lives inside a jar.
What went wrong here? Is it even wrong?
How do I add polymorphic behaviour? By adding a new mitigating layer? I would still need instanceof in some factory to create a new family of polymorphic objects that I can edit to add the desired behaviour. I'm not even sure that would be better; it just smells, and it makes me want to shoot the advocates who throw out blind statements like "instanceof with downcasting is a code smell", torturing people like me who genuinely care. It makes me wonder if they ever worked on a real project. I deliberately added the system environment details so we can discuss how to model not just the code but the interaction between systems. What are the possible redesign options to achieve the "by the book" solution?
So far I can only think that sharing a domain model is a sin. But if I use different self-contained classes to represent the same things for serialization/deserialization, I gain flexibility but lose the contract and increase unpredictability, which is what technically happens with HTTP contracts.
Should I send different types of messages with different structures along the wire, instead of trying to fit the common parts, with subtypes for the uncommon ones, into a single message type?
To throw more sand at OO, I consider Pivotal the best among the best, and yet:
https://github.com/spring-projects/spring-security/blob/master/core/src/main/java/org/springframework/security/authentication/dao/AbstractUserDetailsAuthenticationProvider.java
public boolean supports(Class<?> authentication) {
    return (UsernamePasswordAuthenticationToken.class
            .isAssignableFrom(authentication));
}
AuthenticationManager has a list of AuthenticationProviders like this and selects the correct one based on the method above. Does this violate polymorphism? Sometimes it all just feels like hype...
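For what it's worth, the selection mechanism behind supports() reduces to a short sketch (Provider and Manager are hypothetical names): the manager picks the first provider that claims the input's runtime type, and everything after that stays behind the interface:

```java
import java.util.List;
import java.util.Optional;

// Hypothetical reduction of the AuthenticationProvider idea.
interface Provider {
    boolean supports(Class<?> type);
    String handle(Object input);
}

class Manager {
    private final List<Provider> providers;

    Manager(List<Provider> providers) {
        this.providers = providers;
    }

    Optional<String> handle(Object input) {
        // Select the first provider that supports the runtime type,
        // then dispatch polymorphically through the interface.
        return providers.stream()
                .filter(p -> p.supports(input.getClass()))
                .findFirst()
                .map(p -> p.handle(input));
    }
}
```

The one isAssignableFrom-style check is confined to supports(); callers never downcast.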
Use the visitor pattern.
Example (I'll limit to two subclasses, but you should get the idea):
interface InternalContentVisitor<T> {
    T visitText(InternalTextContent c);
    T visitImage(InternalImageContent c);
}

public abstract class InternalContent {
    public abstract <T> T accept(InternalContentVisitor<T> visitor);
    // ...
}

public class InternalTextContent extends InternalContent {
    @Override
    public <T> T accept(InternalContentVisitor<T> visitor) {
        return visitor.visitText(this);
    }
}

public class InternalImageContent extends InternalContent {
    @Override
    public <T> T accept(InternalContentVisitor<T> visitor) {
        return visitor.visitImage(this);
    }
}
This code is completely generic, and can be shared by any application using the classes.
So now, if you want to polymorphically do something in project1 with an InternalContent, all you need to do is create a visitor. This visitor lives outside the InternalContent classes, and can thus contain code that is specific to project1. Suppose, for example, that project1 has a class Copier that can create a Copy of a text or of an image; you can use:
InternalContent content = ...; // you don't know the actual type
Copier copier = new Copier();
Copy copy = content.accept(new InternalContentVisitor<Copy>() {
    @Override
    public Copy visitText(InternalTextContent c) {
        return copier.copyText(c.getText());
    }

    @Override
    public Copy visitImage(InternalImageContent c) {
        return copier.copyImage(c.getImage());
    }
});
So, as you can see, there is no need for a switch statement. Everything is still done polymorphically, even though the InternalContent class and its subclasses have no dependency at all on the Copier class that only exists in project1.
And if a new InternalSoundContent class appears, all you have to do is to add a visitSound() method in the visitor interface, and implement it in all the implementations of this interface.
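To make the shape of the pattern concrete, here is a compilable reduction with a String-producing visitor in place of the Copier example (the class names are shortened, hypothetical variants of the ones above):

```java
// Generic visitor: one method per concrete subclass.
interface ContentVisitor<T> {
    T visitText(TextContent c);
    T visitImage(ImageContent c);
}

abstract class Content {
    abstract <T> T accept(ContentVisitor<T> visitor);
}

class TextContent extends Content {
    final String text;

    TextContent(String text) { this.text = text; }

    @Override
    <T> T accept(ContentVisitor<T> visitor) {
        // Double dispatch: the runtime type selects the visit method.
        return visitor.visitText(this);
    }
}

class ImageContent extends Content {
    @Override
    <T> T accept(ContentVisitor<T> visitor) {
        return visitor.visitImage(this);
    }
}
```

A caller holding a plain Content never needs instanceof; it passes a visitor in and gets a typed result back.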

Dependency injection into scala objects (not classes)

I have an import, import play.api.libs.ws.WSClient, which I want to use within my object:
object X {
  ...
}
But it doesn't seem to be available inside my object. I see that dependency injection is only available for classes. How do I get this to work?
Injecting a dependency into an object is impossible.
You have two options:
Ugly and deprecated: Access the injector via the global application:
val wsClient = Play.current.injector.instanceOf[WSClient]
The way to go if your code needs to live in an object: pass the dependency in as a parameter. However, this just defers the problem to the caller.
def myMethod(wsClient: WSClient) = // foo
If you're working with a legacy application where you have objects and need an injected dependency, I think one way to improve the situation and take a step in the right direction is to provide access to an injected class like so:
object MyObject {
  private def instance = Play.current.injector.instanceOf[MyObject]

  def myMethod(param: String): String =
    instance.myMethod(param)
}

class MyObject @Inject() (wsClient: WSClient) {
  def myMethod(param: String): String =
    ??? // foo
}
This allows legacy code to access the methods via the object, while new code can inject the dependency. You may also annotate the methods on the object as deprecated so that users know.
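The bridge can be sketched in isolation (Service and ServiceFacade are hypothetical names; in the answer the instance comes from Play's injector rather than a static field): the object-style static entry point just delegates to a swappable instance, which is what keeps the instance-based path testable:

```java
// The instance-based class: this is what new code injects and tests.
class Service {
    private final String prefix; // stands in for an injected dependency like WSClient

    Service(String prefix) { this.prefix = prefix; }

    String call(String arg) { return prefix + arg; }
}

// The legacy-facing facade: static methods delegate to the instance.
class ServiceFacade {
    // In the Play version this field would be resolved via the injector.
    static Service instance = new Service("real:");

    static String call(String arg) { return instance.call(arg); }
}
```

Legacy call sites keep using ServiceFacade.call, while tests or new code work against a Service instance directly.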

how to inject an instance of configuration when creating instance of the injected class?

I have a simple situation:
class MyClass @Inject() (configuration: Configuration) {
  val port = configuration.get[String]("port")
  ...
}
and now I want to use MyClass in some other object:
object Toyota extends Car {
  val myClass = new MyClass(???)
  ...
}
but I don't know how, when I use MyClass, to give it the Configuration instance so that the annotated dependency is injected when MyClass is instantiated.
I'm using Play 2.6 / Guice / Scala.
Thanks!
First of all, you should decide whether dependency injection is really what you need. The basic idea of DI: instead of factories or the objects themselves creating new objects, you pass the dependencies in externally and hand the instantiation problem to someone else.
You are supposed to go all-in with it if you rely on the framework, which is why there is no way of using new along with DI. You cannot pass/inject a class into a Scala object; here is a draft of what you can do:
Play/Guice require some preparation.
An injection module tells Guice how to create objects (if you cannot do this with annotations, or want to do it in one place):
class InjectionModule extends AbstractModule {
  override def configure() = {
    // ...
    bind(classOf[MyClass])
    bind(classOf[GlobalContext]).asEagerSingleton()
  }
}
Inject the injector to be able to access it.
class GlobalContext @Inject()(playInjector: Injector) {
  GlobalContext.injectorRef = playInjector
}

object GlobalContext {
  private var injectorRef: Injector = _

  def injector: Injector = injectorRef
}
Specify which modules to enable, because there can be more than one.
// application.conf
play.modules.enabled += "modules.InjectionModule"
And finally the client code.
object Toyota extends Car {
  import GlobalContext.injector

  // at this point Guice figures out how to instantiate MyClass,
  // creating and injecting all the required dependencies
  val myClass = injector.instanceOf[MyClass]
  ...
}
A simple situation, expanded with a framework's help. So you should really consider other possibilities: maybe it would be better to pass the configs as an implicit parameter in your case?
For dependency injection with guice take a look at:
ScalaDependencyInjection with play and Guice wiki

C# extension methods in Java using Scala

I need to create some extension methods for my Java code. I've read some posts here on SO, and people suggest Xtend or Scala to achieve this.
Now, my question would be: if I write a kind of adapter layer in Scala (adding my extension methods there) and then use that project as a dependency of my own Java project, are those extension methods available for me to use, or are they defined only within the scope of the Scala project, so the JVM output cannot provide those new methods to the other project using it?
EDIT:
What I need to do is extend a full hierarchy of classes in a given library with some new functionality. With a Java-first approach I would have to extend every class in that hierarchy, creating my own hierarchy of extended classes and adding the new method there. I would like to avoid this and give the final user the feel of native functionality on the original hierarchy.
Regards.
As mentioned in the comments above, it is very close to C#, but not exactly there because of type erasure. For example, this works fine:
object myLibExtensions {
  implicit class TypeXExtension(val obj: TypeX) extends AnyRef {
    def myCustomFunction(a: String): String = {
      obj.someMethod(a)
    }
  }
}
It will act somewhat similar to C# extension methods, i.e. create static method wrappers in reasonable cases (but not always).
The only thing I miss in Scala is that you can't (or at least I couldn't figure out how) return values of the type being extended. For example, assume I want an extension method withMeta that works as follows:
class TypeY extends TypeX { def methodOfY(...) = ... }
var y: TypeY = ...
y.withMeta(...).methodOfY(...)
The following didn't work for me:
object myLibExtensions {
  private val something = ...

  implicit class Extension[T <: TypeX](val obj: T) extends AnyRef {
    def withMeta(meta: Meta[T]): T = {
      something.associateMeta(obj, meta)
      obj
    }
  }
}
... because T is being erased to TypeX. So effectively you will have to write extensions for all specific leaf classes of the hierarchy in this case, which is sad.
