How can I get all @Entity classes from a Persistence Unit?

Problem
I'm writing a standalone utility program which, given a jar containing a JPA 2 annotated persistence unit, needs to programmatically get a list of all the @Entity classes in a particular persistence unit.
I'd like to decide which of two approaches would be the way to go to get this information, and why; or whether there is another, better way I haven't thought of.
Solution 1
The Java program puts the jar on the classpath and creates the persistence unit from the classes in the jar using Java SE methodologies. It then uses the javax.persistence classes to get the JPA Metamodel and pull back a list of class tokens from that.
EntityManagerFactory emf = Persistence.createEntityManagerFactory("MY_PERSISTENCE_UNIT");
Metamodel mm = emf.getMetamodel();
// loop these, using getJavaType() from Type sub-interface to get
// Class tokens for managed classes.
mm.getManagedTypes();
Solution 2
The program scans the directories and files inside the specified jar for persistence.xml files, then finds the one with the specified persistence unit name. It then uses XPath on the file to get the list of <class> XML elements and reads the fully qualified class names from there. From the names, it builds class tokens.
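For reference, a minimal sketch of what Solution 2 could look like, assuming the jar path and unit name are supplied by the caller (jarPath, unitName, and entityClasses are illustrative names, not part of the question):
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public static List<Class<?>> entityClasses(String jarPath, String unitName) throws Exception {
    List<Class<?>> result = new ArrayList<>();
    try (JarFile jar = new JarFile(jarPath)) {
        JarEntry entry = jar.getJarEntry("META-INF/persistence.xml");
        if (entry == null) {
            return result; // not a well-formed persistence archive
        }
        // The default DocumentBuilderFactory is not namespace-aware,
        // so the XPath below can match on local element names.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(jar.getInputStream(entry));
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList classes = (NodeList) xpath.evaluate(
                "//persistence-unit[@name='" + unitName + "']/class",
                doc, XPathConstants.NODESET);
        for (int i = 0; i < classes.getLength(); i++) {
            // Requires the jar to already be on the utility's classpath.
            result.add(Class.forName(classes.item(i).getTextContent().trim()));
        }
    }
    return result;
}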
Constraints/Concerns
I'd like to go with approach 1 if possible.
This utility will NOT run inside a container, but the jar is an EJB project designed to run inside one. Will this be a problem?
The utility will have OpenEJB available on the classpath to get implementations of all the Java EE 6 classes.
Even though the EJB project is built to run on Hibernate, the utility should not be Hibernate-specific.
Are there any stumbling blocks?

In case anyone's interested, Solution 1 worked. Here's essentially what I had to do:
public MySQLSchemaGenerator() throws ClassNotFoundException {
    Properties mySQLDialectProps = new Properties();
    mySQLDialectProps.setProperty("javax.persistence.transactionType", "RESOURCE_LOCAL");
    mySQLDialectProps.setProperty("javax.persistence.jtaDataSource", "");
    final EntityManagerFactory emf = Persistence.createEntityManagerFactory("<persistence_unit_name>", mySQLDialectProps);
    final Metamodel mm = emf.getMetamodel();
    for (final ManagedType<?> managedType : mm.getManagedTypes()) {
        managedType.getJavaType(); // this returns the Java class of the @Entity object
    }
}
The key was to override my transaction type and blank out the jtaDataSource which had been defined in my persistence.xml. Turns out everything else was unnecessary.

If your jar is well-formed (persistence.xml in the right place, the META-INF folder), then all looks fine.
It is not necessary to run your utility inside a container: JPA works in plain Java SE as well as in Java EE.

Related

error: package generated.schema does not exist

In my Android application I have an annotation processor which generates files using JavaPoet and places them under the package generated.schema.
The files are generated correctly. Whenever I use a generated file like so
GeneratedFile.someGeneratedMethod();
I get the following error:
error: package generated.schema does not exist.
But if I include the fully qualified class name instead of importing like so
generated.schema.GeneratedFile.someGeneratedMethod();
the code compiles and runs without any error.
I don't want to add the complete package name each time I use GeneratedFile. I'm not sure what I did wrong, since I'm still learning to work with annotation processors.
Files generated by other libraries, including Realm and DataBinding, are all working correctly as expected.
File generation:
Using JavaPoet, I run the following code.
if (roundEnvironment.processingOver()) {
    for (TypeElement element : apiList) {
        TypeSpec clazz = generateFile(element);
        JavaFile.builder(NamespaceCreator.generateClassPackage(element), clazz)
                .build()
                .writeTo(filer);
    }
}
NamespaceCreator.generateClassPackage(element) returns the package name for the class, i.e. generated.schema.
While generating classes I was waiting for the last processing pass; the code generation was encapsulated by
if (roundEnvironment.processingOver())
I was getting a warning because of this:
File for type 'generated.schema.GeneratedFile' created in the last round will not be subject to annotation processing.
I was aware of this warning before I posted the question, but I was willing to forgo further annotation processing on my generated files for the simplicity of generating all files in one go.
After removing the last-round check from file generation, I can correctly access the generated files (with an import) without any error. I still don't understand, though, why generating files only in the last round breaks accessing them via import during the build.
For that I will be posting a new question.
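For reference, a minimal sketch of the per-round generation that resolved the error, with the processingOver() guard removed. This is a fragment of the processor class with imports omitted; generateFile, NamespaceCreator, apiList, and filer are the question's own names, and messager is an assumed Messager field:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnvironment) {
    // Generate in every round instead of waiting for processingOver():
    // per the warning above, files created in the last round are not
    // subject to further processing, and in this project generating them
    // that late also broke imports of the generated.schema package.
    for (TypeElement element : apiList) {
        TypeSpec clazz = generateFile(element);
        try {
            JavaFile.builder(NamespaceCreator.generateClassPackage(element), clazz)
                    .build()
                    .writeTo(filer);
        } catch (IOException e) {
            messager.printMessage(Diagnostic.Kind.ERROR, e.getMessage());
        }
    }
    return true;
}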

Bazel build with Jackson annotations for ser/deser and generated Java classes

We have been exploring Bazel as a build system for our organization, since we have a huge monorepo.
One of the problems I'm facing is that we have some code-gen classes which use Jackson's annotation processors to generate immutable copies of some types.
E.g.:
@JsonSerialize
@JsonInclude(JsonInclude.Include.NON_EMPTY)
@Value.Immutable
@JsonDeserialize(as = ImmutableABC.class)
public abstract class ABC {
    ...
}
For this, I include a java_plugin rule in the Bazel BUILD file for this module as follows:
java_plugin(
    name = "abcgen",
    srcs = ["src/.../ABC.java"],
    deps = [ {jackson-deps go here} ],
    processor_class = "org.immutables.processor.ProxyProcessor",
)
This always fails, saying it cannot find the ImmutableABC class which is referenced in the annotation.
Any ideas? Am I missing the processor class for the Jackson annotations? Also, is it possible to include multiple processor classes?
For anyone facing issues like this: ensure generates_api = 1 is set for your plugin if the generated classes are used in the library. As for the Jackson part, that wasn't really the problem.
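A hedged sketch of what that could look like, assuming the plugin is attached to a java_library that compiles the annotated sources (the target names and empty deps lists are illustrative):
java_plugin(
    name = "abcgen",
    processor_class = "org.immutables.processor.ProxyProcessor",
    generates_api = 1,  # generated classes such as ImmutableABC are referenced by user code
    deps = [],  # immutables/jackson annotation-processor deps go here
)

java_library(
    name = "abc_model",
    srcs = ["src/.../ABC.java"],
    plugins = [":abcgen"],
    deps = [],  # immutables/jackson runtime deps go here
)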

ClassPathScanningCandidateComponentProvider.findCandidateComponents returns wrong class names

I have a java project containing a spring boot application called processor. This project depends on a project called rules and a project called service. Every project has the same package pattern - my.com.package.
The processor and rules projects both contain classes annotated with a custom annotation @Condition. The annotation interface is annotated with @Retention(RetentionPolicy.RUNTIME). When I scan for classes annotated with @Condition from service or processor like this
private ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);

scanner.addIncludeFilter(new AnnotationTypeFilter(Condition.class));
for (BeanDefinition bd : scanner.findCandidateComponents("my.com")) {
    try {
        Class<?> c = Class.forName(bd.getBeanClassName());
        Condition condition = c.getAnnotation(Condition.class);
        register(condition);
    } catch (ClassNotFoundException | IOException e) {
        logger.error(e.getLocalizedMessage());
    }
}
The classes annotated with @Condition in the processor project have the correct class name (my.com.package.x.Class), but the classes annotated with @Condition in the rules project have an incorrect fully qualified class name (my.com.Class), and it only finds 2 out of the 5 class names in the project that have the annotation.
If I change the argument to scanner.findCandidateComponents to the full package path in the rules project (my.com.package.rules) while scanning in either processor or service, the scanner finds no candidates. If I use my.com.* as the argument, it only finds the candidates in the processor project.
I saw a similar question here Using ClassPathScanningCandidateComponentProvider with multiple jar files? and the solution was to pass the class loader to the component provider. I tried getting the class loader of the class doing the scanning and passing it to the provider like this
scanner.setResourceLoader(new PathMatchingResourcePatternResolver(classLoader));
and it didn't change any results for me.
Silly mistake: the problem was that I had the wrong version of the rules project defined in the pom for my processor project, so it was using an older version of the code.
However, this
Condition condition = c.getAnnotation(Condition.class);
returned null for the classes taken from the jar, so it concerns me a little that this code behaves differently when it isn't being run from source in my workspace.

What is a good practice or design to swap algorithms at runtime?

I have several data processing algorithms that can be assembled into a pipeline to transform data. The code is split into two components: A pre-processing component that does data loading-related tasks, and a processing pipeline component.
I currently have the two parts compiled and packaged into two separate jars. The idea is that the same pre-processing jar can be shipped to all customers, but the pipeline jar can be exchanged depending on customer requirements. I would like to keep the code simple and minimize configuration, so that rules out the use of OSGi or CDI frameworks.
I've gotten some hints by looking at SLF4J's implementation. That project is split into two parts: a core API, and a bunch of implementations that wrap different logging APIs. The core API makes calls to dummy classes (which exist in the core project simply to allow compilation) that are meant to be overridden by the same classes found in the logging projects. At build time, the compiled dummy classes are deleted from the core API before packaging into the jar. At run time, the core jar and a logging jar are both required on the class path, and the missing class files in the core jar are filled in by the files from the logging jar. This works fine, but it feels a little hacky to me. I'm wondering if there is a better design, or if this is the best that can be done without using CDI frameworks.
Sounds like the strategy software design pattern.
https://en.wikipedia.org/wiki/Strategy_pattern
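As a rough illustration of how the strategy pattern maps onto the pipeline use case (all names below are made up for the example):
// Core API: the contract the pre-processing component codes against.
interface ProcessingStrategy {
    byte[] transform(byte[] input);
}

// One interchangeable implementation, shipped in a customer-specific jar.
final class PassThroughStrategy implements ProcessingStrategy {
    public byte[] transform(byte[] input) {
        return input; // no-op pipeline stage
    }
}

// The pre-processor only knows the interface; implementations are swapped
// by changing which jar supplies the ProcessingStrategy at runtime.
final class Preprocessor {
    private final ProcessingStrategy strategy;

    Preprocessor(ProcessingStrategy strategy) {
        this.strategy = strategy;
    }

    byte[] run(byte[] data) {
        return strategy.transform(data);
    }
}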
Take a look at the ServiceLoader.
Example: Suppose we have a service type com.example.CodecSet which is intended to represent sets of encoder/decoder pairs for some protocol. In this case it is an abstract class with two abstract methods:
public abstract Encoder getEncoder(String encodingName);
public abstract Decoder getDecoder(String encodingName);
Each method returns an appropriate object or null if the provider does
not support the given encoding. Typical providers support more than one
encoding. If com.example.impl.StandardCodecs is an implementation of
the CodecSet service then its jar file also contains a file named
META-INF/services/com.example.CodecSet
This file contains the single line:
com.example.impl.StandardCodecs # Standard codecs
The CodecSet class creates and saves a single service instance at
initialization:
private static ServiceLoader<CodecSet> codecSetLoader
= ServiceLoader.load(CodecSet.class);
To locate an encoder for a given encoding name it defines a static factory method which iterates
through the known and available providers, returning only when it has
located a suitable encoder or has run out of providers.
public static Encoder getEncoder(String encodingName) {
    for (CodecSet cp : codecSetLoader) {
        Encoder enc = cp.getEncoder(encodingName);
        if (enc != null)
            return enc;
    }
    return null;
}
A getDecoder method is defined similarly.
You already understand the gist of how to use it:
Split your project into parts (core, implementation 1, implementation 2, ...)
Ship the core API with the pre-processor
Have each implementation add the correct META-INF file to its .jar file.
The only configuration files that are necessary are the ones you package into your .jar files.
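Applied to the pipeline scenario, a minimal sketch (the Pipeline interface and the locator are illustrative assumptions, not the asker's actual API):
import java.util.ServiceLoader;

// Lives in the core/pre-processing jar.
public interface Pipeline {
    void process(byte[] data);
}

// Somewhere in the pre-processor:
final class PipelineLocator {
    static Pipeline findPipeline() {
        // ServiceLoader scans every jar on the classpath for a
        // META-INF/services/<fully.qualified.Pipeline> entry, so shipping a
        // different pipeline jar swaps the implementation with no configuration.
        for (Pipeline p : ServiceLoader.load(Pipeline.class)) {
            return p; // first provider found
        }
        throw new IllegalStateException("no Pipeline implementation on the classpath");
    }
}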
You can even have them automatically generated for you with an annotation:
package foo.bar;

import javax.annotation.processing.Processor;
import com.google.auto.service.AutoService;

@AutoService(Processor.class)
final class MyProcessor implements Processor {
    // …
}
AutoService will generate the file
META-INF/services/javax.annotation.processing.Processor
in the output classes folder. The file will contain:
foo.bar.MyProcessor

Having problems while trying to execute an Acceleo module in standalone mode

I have successfully created an Acceleo module for M2T purposes and am trying to execute it from a Java program.
This is what I tried:
String[] str = {"/home/hamza/workspace/HLRedundancy/model/System1.xmi", "/home/hamza/workspace/HLRedundancy/"};
Generate.main(str);
Generate being the name of the Acceleo module I created and thus the name of the Java class containing the Acceleo generation methods.
Here is the error I'm always getting:
Exception in thread "main" org.eclipse.acceleo.engine.AcceleoEvaluationException: The type of the first parameter of the main template named 'generateElement' is a proxy.
at org.eclipse.acceleo.engine.service.AcceleoService.doGenerate(AcceleoService.java:566)
at org.eclipse.acceleo.engine.service.AbstractAcceleoGenerator.generate(AbstractAcceleoGenerator.java:193)
at org.eclipse.acceleo.engine.service.AbstractAcceleoGenerator.doGenerate(AbstractAcceleoGenerator.java:158)
at HighLevelGenerator.main.Generate.doGenerate(Generate.java:250)
at HighLevelGenerator.main.Generate.main(Generate.java:160)
at Execute.main(Execute.java:11)
I've been searching for a while about this error but I have no idea about its cause.
Any idea about a solution to my problem?
Thanks
The most common cause of this issue is a failure to properly register the metamodel and factory corresponding to your input model (System1.xmi).
If you look at the comments in the generated class "Generate.java", you will notice a number of places where we indicate steps to follow if running in standalone. The most important being registerPackages, where you are required to register your metamodel.
If you debug the launch down to the point where the model is loaded (place a breakpoint right after the line model = ModelUtils.load(newModelURI, modelResourceSet);), you can look at the model.eResource().getErrors() list to see whether there were errors loading your model.
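For illustration, a minimal sketch of what standalone registration can look like inside registerPackages; SystemPackage is an assumed name for the metamodel's generated package class, and the resource-factory line may belong in registerResourceFactories in your generated class:
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

@Override
public void registerPackages(ResourceSet resourceSet) {
    super.registerPackages(resourceSet);
    // Register the metamodel so the elements of System1.xmi resolve to real
    // EClasses instead of unresolved proxies.
    resourceSet.getPackageRegistry().put(SystemPackage.eNS_URI, SystemPackage.eINSTANCE);
    // Make sure .xmi resources can be loaded in standalone mode.
    resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
            .put("xmi", new XMIResourceFactoryImpl());
}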
You might also be interested in looking at this video describing the process (registration required) .
Check out the first line of your Acceleo module: what is the URI of the metamodel? Does it start with 'http://'?
Maybe this can help:
Acceleo stand alone - first parameter is proxy
This issue happens when your metamodel contains sub-packages and the top package does not contain any class.
To solve the problem, add a dummy class to the top package and regenerate the metamodel code. It worked fine for me.
