I'd like to import some of my own classes for use inside a DSLD script, but DSLD compilation does not seem to use the project's classpath: import statements break the compilation, and Class.forName throws ClassNotFoundException.
Is there a way to put custom jars on the DSLD classpath, so I can use my own classes inside contribution blocks?
I am using Eclipse 3.7 and the latest Groovy plugin (2.6.0).
You can just pass a string with the fully qualified class name; as long as it's on the classpath of the project where the DSLD is being evaluated, it will work. This is described in the groovy-eclipse DSLD documentation:
Some subtleties about java.lang.Class references
Even though the DSLD script is being edited in the context of your
project, the script is actually loaded by Groovy-Eclipse. And so, the
runtime classpath of the script corresponds to Groovy-Eclipse's
classpath, rather than the classpath of your project.
Consequently, you cannot reference class objects for types defined in
your project. However, you can reference class objects that are
available to Groovy-Eclipse. This might be confusing since the
compiler will not show compile errors when types defined in your
project are referenced as class objects, but it will show compile
errors when Groovy-Eclipse types are referenced. This is because the
Groovy-Eclipse compiler works off of the project's classpath. It is
not yet aware that DSLD files will be run with a different classpath.
More specifically:
Instead of referencing the class MyLocalType directly, you can
reference it as a String "com.mycompany.MyLocalType". Standard JDK,
GDK, and all types defined in groovy-all are available directly in
your DSLD and will show compile errors. It is possible to reference
types in packages beginning with org.eclipse.jdt. and
org.codehaus.groovy.eclipse. if all references are fully qualified.
However, this is not recommended unless you really know what you are
doing.
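The class-loader distinction described above can be demonstrated in plain Java: Class.forName succeeds only if the requested type is visible to the loader doing the lookup. The name com.mycompany.MyLocalType below is a hypothetical project-only type, used to stand in for a class that is on the project's classpath but not on the script host's.

```java
public class ClasspathDemo {
    // Returns true if the given fully qualified name is visible
    // to this class's own class loader.
    public static boolean canLoad(String fqn) {
        try {
            Class.forName(fqn, false, ClasspathDemo.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // JDK types are visible to every loader.
        System.out.println(canLoad("java.util.List"));
        // A project-only type is invisible to the script host's loader,
        // which is why a plain String name must be passed instead.
        System.out.println(canLoad("com.mycompany.MyLocalType"));
    }
}
```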
I don't know much about the DSLD stuff, but it looks like Groovy might have its own means of doing that.
Related
I'm developing a plugin for IntelliJ IDEA. How can the plugin get the name and version of libraries that are imported into the project being checked by the plugin? I have a PsiClass from the project, but cannot convert it to java.lang.Class. Maybe there's a way to get a ClassLoader from a PsiElement?
super.visitImportStatement(psiImport);
Class importedClass = Class.forName(psiImport.getQualifiedName(), true, psiImport.getClass().getClassLoader());
psiImport.getClass().getClassLoader() returns the ClassLoader of the class PsiImportStatementImpl instead of the ClassLoader of the class that I've imported.
IntelliJ does mostly static analysis on your code. In fact, the IDE and the projects you run/debug have completely different classpaths. When you open a project, your dependencies are not added to the IDE classpath. Instead, the IDE will index the JARs, meaning it will automatically discover all the declarations (classes, methods, interfaces etc) and save them for later in a cache.
When you write code in your editor, the static analysis tool will leverage the contents of this index to validate your code and show errors when you're trying to use unknown definitions for example.
On the other hand, when you run a Main class from your project, it will spawn a new java process that has its own classpath. This classpath will likely contain every dependency declared in your module.
Knowing this, you should now understand why you can't "transform" a PsiClass to a corresponding Class.
Back to your original question:
How can plugin get the name and version of libraries that are imported to the project that is being checked by plugin?
You don't need to access Class objects for this. Instead, you can use IntelliJ SDK libraries. Here's an example:
Module mod = ModuleUtil.findModuleForFile(virtualFile, myProject);
ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
// do your thing here with `library`
return true;
});
I work for a company that distributes our product as a jar file, and I'm trying to write something that will be able to test past versions of these jars with various inputs. Ideally, I could then run the test framework like
java -jar testframework.jar -cp "version1.jar"
or
java -jar testframework.jar -cp "version2.jar"
and get different outputs. Since the methods that take in input are set in stone, I figured I could give the dependency on our product the "provided" or "runtime" scope in Maven, and then call input methods on whatever version of the jar was provided on the classpath. Something like this:
<dependency>
<groupId>com.ourCompany</groupId>
<artifactId>ourProduct</artifactId>
<scope>provided</scope>
</dependency>
and then in the main TestFramework class:
public static void main(String[] args) {
ProductClass.doSomething();
}
However, I'm getting a compilation error that the doSomething method doesn't exist. I imagine I'm misunderstanding exactly what "provided" and "runtime" mean with respect to maven dependencies, but I haven't been able to find any resources that explain my mistake. Does anyone know how I can do what I'm trying to do?
ProductClass definitely exists within ProductJar. It has no
problem importing the class, just calling the method doSomething. And
I'm getting that error when I use provided scope.
Since you confirm that the JAR exists, the issue is likely with the version of the JAR you are pointing to. Specify a <version>X</version> (one in which the doSomething method exists) for the <dependency> as well, and that should solve the problem.
I'm misunderstanding exactly what "provided" and "runtime" mean with
respect to maven dependencies
provided and runtime scopes are completely different, they are for two different purposes.
provided scope means that the dependency is required at compile and test time, but the dependency JAR will not be bundled as part of the packaged artifact; the JAR is expected to be available on the container's classpath.
runtime scope means that the dependency is required only during execution of the program, not at compile time.
Dependencies whose classes your code references always need to be available at compile time. Otherwise, how would the compiler be able to know whether your code is valid? Check that the version you've declared in the dependency does indeed have the doSomething method you want to use. If not, you will need to change the version to one that does have that method.
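Putting the two answers together, the corrected dependency would look like the sketch below. The version number is a placeholder, not a real release of the product:

```xml
<dependency>
  <groupId>com.ourCompany</groupId>
  <artifactId>ourProduct</artifactId>
  <!-- placeholder: pick a release that actually contains doSomething() -->
  <version>1.2.3</version>
  <scope>provided</scope>
</dependency>
```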
I'm using AspectJ and the 'ajc' command line compiler. I specify aspectjrt.jar, aspectjtools.jar, and aspectjweaver.jar on the classpath ('-cp') during compilation, yet when I call the standard 'thisJoinPoint', an exception is thrown:
Compilation:
ajc -cp lib/aspectjrt.jar:lib/aspectjtools.jar:lib/aspectjweaver.jar -inpath work/src/ -outjar ./mynewjar.jar #work/source.lst
Code which causes exception:
before() : onCreateCall() {
System.out.println("[-] PC Info: " + thisJoinPoint.getSignature());
}
And the exception itself:
Could not find class 'org.aspectj.runtime.reflect.Factory', referenced from method com.test.WooAspects.ajc$preClinit
Of course, I've tried specifying the import with the following, but no luck:
import org.aspectj.runtime.reflect.Factory;
import org.aspectj.runtime.reflect.*;
Any ideas?
When compiling your code, if it references types in a separate library, that library (possibly packaged as a .jar) needs to be available on the compilation classpath (javac or ajc in this case).
When running your code, if it references types in a separate library, that library needs to be available on the runtime classpath (java or the alternative for aspectj).
Note that an import statement is unrelated to the classpath. All an import statement does is let you use a type's or member's short name instead of its fully qualified name.
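That point can be shown with a small, self-contained example: no import statement is needed as long as the type is on the classpath and is referenced by its fully qualified name. Conversely, adding an import never puts anything on the classpath.

```java
public class NoImportDemo {
    // java.util.List and java.util.ArrayList are used without any
    // import statement; the fully qualified names compile because
    // the classes are on the classpath, not because of imports.
    public static int count() {
        java.util.List<String> jars = new java.util.ArrayList<>();
        jars.add("aspectjrt.jar");
        jars.add("my.jar");
        return jars.size();
    }

    public static void main(String[] args) {
        System.out.println(count());
    }
}
```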
The following things seem to be a little odd at first glance:
It looks as if you think that -inpath work/src actually is meant to include source files, but the inpath is actually meant to include class files. What you probably want is -sourceroots work/src.
Then you seem to use an argument file named work/source.lst which you have not shown us, so we do not know what is in there - maybe more command line switches, maybe more source files. I have no idea.
On your ajc classpath there are all three AspectJ libraries, but usually you only need aspectjrt.jar. The other two are only needed for load-time weaving [LTW] (aspectjweaver.jar) or if you want to use the AspectJ compiler and a few other tools during runtime (aspectjtools.jar).
For a simple project in which Java and AspectJ code are in the same source directory, the following works for me (inserting line breaks for better readability, but it is all one line on the console):
ajc
-1.7
-cp lib/aspectjrt.jar
-sourceroots src
-outjar my.jar
Then you run the aspect-enhanced JAR like this (again one line on the console):
java
-cp lib/aspectjrt.jar;my.jar
de.scrum_master.app.Application
I.e. during runtime you also just need the runtime JAR on your classpath.
Maybe you want to use a build tool like Maven managing your dependencies and the build process. You can also use plugins like Maven Shade or One-JAR in order to produce a single über-JAR containing both the compiled Java + AspectJ code and the AspectJ runtime. Then you do not have any problems with classpaths during runtime, you just call
java -jar my_uber.jar
Update: You may want to read the ajc documentation for more info.
To create and load a class at runtime, I first read its content from the database, create a new SimpleJavaFileObject and finally compile it at runtime by passing it to a CompilationTask.
The point is that this new file may refer to other files (directly imported, or "indirectly" via de.package.*) that are also stored in the DB and not available as compiled classes or source files.
public class Test1 {
public de.otherpackage.Test2 reply() {
return null;
}
}
Like Test1 I would have to create and compile Test2 a step ahead, because there are no JavaFileObjects or classes to feed the compiler with.
So: How do I get a list of all sources a compiler needs to compile one class?
It would be enough to know that Test1 needs Test2. I first tried passing a Processor to the CompilationTask. I checked all attributes in the Trees but didn't find anything useful or complete. If a class is imported using * on a package, there is no way to get a fully qualified name... at least not for me :-/
Any ideas? Maybe there are better ways to parse Java sources?
Thanks for helping :-)
If you are asking if there is a way to do this before you compile the class, then the answer is "No there isn't". The source code, and the source code alone determines the direct dependencies. And you need to compile the source code in order to extract them.
If you are asking if there is a way to extract the dependencies while or after compiling then there are a few alternatives:
The javac command has a -verbose option that causes it to list each class loaded, and each file compiled.
If you use the standard compiler APIs, it provides hooks for loading dependent classes and locating source files. You could use those to track what is going on.
You can get most of this information from the bytecode files themselves. There are a couple of caveats though:
If the code is compiled with -g:none there won't be source filenames in the ".class" files.
You can determine the dependencies, but compilation times are not recorded ... unless you can infer them from file timestamps.
A dependency on a compile-time constant declared in another class is fully resolved (and inlined) at compile time ... and won't have any trace in the generated ".class" file.
But note that you generally don't need to do this to compile a class. If the compiler finds that it needs to load or compile a dependent class, it does it automatically. At least, that is how javac behaves by default.
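That last behavior can be sketched with the standard javax.tools compiler API: hand all the in-memory sources to a single CompilationTask and the compiler resolves the dependency between them itself, so you never need the dependency list up front. The class names Test1 and Test2 mirror the question; everything else is an illustrative assumption.

```java
import javax.tools.*;
import java.net.URI;
import java.util.Arrays;

public class InMemoryCompileDemo {
    // A Java source held in memory instead of on disk.
    static class StringSource extends SimpleJavaFileObject {
        final String code;
        StringSource(String className, String code) {
            super(URI.create("string:///" + className.replace('.', '/') + ".java"),
                  Kind.SOURCE);
            this.code = code;
        }
        @Override public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    // Compiles the given sources together in one task; true on success.
    public static boolean compile(JavaFileObject... sources) {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        return compiler.getTask(null, null, null,
                Arrays.asList("-d", System.getProperty("java.io.tmpdir")),
                null, Arrays.asList(sources)).call();
    }

    public static void main(String[] args) {
        // Test1 depends on Test2; supplying both to the same task
        // lets the compiler resolve the dependency automatically.
        JavaFileObject t2 = new StringSource("Test2", "public class Test2 {}");
        JavaFileObject t1 = new StringSource("Test1",
                "public class Test1 { public Test2 reply() { return null; } }");
        System.out.println(compile(t1, t2));
    }
}
```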
I am having problems compiling some Scala with Maven or Eclipse where I try to import a class from a Java jar which contains both a namespace and class of the same name.
I can compile with scalac, however.
E.g. the Java project (jar) contains:
src/foo/bar.java
src/foo/bar/some_resource.txt
-> foobar.jar
Scala project references foobar.jar
Foobartest.scala:
import foo.bar
class foobartest {
}
The compiler complains with:
package foo contains object and package with same name: bar
one of them needs to be removed from classpath
Using Maven 3.0.03/Eclipse 3.7.1 with Scala 2.9.0.1 (and maven-scala-plugin).
The jar which I am having problems with is jenkins-core-1.399.jar - it definitely contains several instances where there is a namespace and object of the same name.
I am attempting to write a Jenkins plugin in Scala (I could do this in Java but would prefer scala since all of our libraries are in scala), which is dependent on using Maven -
https://wiki.jenkins-ci.org/display/JENKINS/Plugin+tutorial.
That kind of limitation was outlined in SI-4695: package object misbehaves in the presence of classfiles.
As suggested in SI-2089 (naming restriction makes some jars unusable), you could try and use the "resolve-term-conflict", as implemented in changeset 25145:
Added a -Y option to resolve namespace collisions between package and object.
It's a blunt instrument: if people have lots of these conflicts they need to resolve in individually nuanced fashion, they'll probably remain out of luck.
val termConflict = ChoiceSetting("-Yresolve-term-conflict", "strategy", "Resolve term conflicts",
  List("package", "object", "error"), "error")
// Some jars (often, obfuscated ones) include a package and
// object with the same name. Rather than render them unusable,
// offer a setting to resolve the conflict one way or the other.
// This was motivated by the desire to use YourKit probes, which
// require `yjp.jar` at runtime. See SI-2089.
The actual compiler option is "-Yresolve-term-conflict:strategy", where strategy is one of package, object, or error.
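For example, to compile the source from the question while resolving such clashes in favor of the object (file and jar names taken from the question; this is a sketch, not a tested build line):

```
scalac -Yresolve-term-conflict:object -classpath jenkins-core-1.399.jar Foobartest.scala
```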