Puppet "require" not working as expected - java

I have the following two manifests:
class profile::maven inherits profile::base {
  # Hiera
  $version      = hiera('profile::maven::version', '3.2.1')
  $settings     = hiera_hash('profile::maven::settings', undef)
  $environments = hiera_hash('profile::maven::environments', undef)
  # Dependencies
  require '::profile::java'
  # Modules
  class { '::maven::maven':
    version => $version,
  }
  if ($settings) {
    create_resources('::maven::settings', $settings)
  }
  if ($environments) {
    create_resources('::maven::environments', $environments)
  }
}
and
class profile::java inherits profile::base {
  # Hiera
  $distribution = hiera('profile::java::distribution', 'jdk')
  $version      = hiera('profile::java::version', 'present')
  # Modules
  class { '::java':
    distribution => $distribution,
    version      => $version,
  }
  # Parameters
  $java_home = $::java::java_home
  file { 'profile-script:java.sh':
    ensure  => present,
    path    => '/etc/profile.d/java.sh',
    content => template('profile/java.sh.erb'),
  }
}
I want profile::java to have completely finished before profile::maven is executed.
The site.pp looks as follows and should not be modified, in order to comply with Puppet's roles and profiles approach later (work in progress):
node 'gamma.localdomain' {
  include 'profile::java'
  include 'profile::maven'
}
After compilation, the script starts by downloading the Maven archive. Why does
require '::profile::java'
not enforce the execution order? Does anyone have an idea how to achieve the desired behavior?

I believe that the problem here is that the require profile::java is scoped to the profile::maven class, so all resources declared directly in that class depend on profile::java. However, this does not propagate to classes that profile::maven declares, such as maven::maven.
To achieve that, you can establish a dependency between those classes:
include profile::java
include maven::maven
Class[profile::java] -> Class[maven::maven]
This can incur substantial complexity in the dependency graph, so be wary of that. This can be avoided using the Anchor Pattern.
Note that use of the require function is discouraged due to possible dependency cycle issues.

Since at least Puppet 3.5 you can use "contain" to ensure that everything inside profile::java completes before profile::maven. The following addition to profile::java would be required, so that the ::java class is contained in (rather than merely declared by) profile::java:
class profile::java inherits profile::base {
  ...
  contain '::java'
  ...
}
Answering this old, answered question as it comes up first when googling "puppet require does not work"

Related

Access value of maven_jar.artifact from Skylark rule

In my Skylark rule, I am looking through all my deps - some of them are maven_jar instances defined in my WORKSPACE file. For those, I would like to access the value of maven_jar.artifact, but as far as I can tell it isn't available. Is it possible to get at that value?
For example, if my WORKSPACE has:
maven_jar(
    name = "com_google_guava_guava",
    artifact = "com.google.guava:guava:20.0",
)
And my BUILD file has something like this:
my_rule(
    name = "foo",
    deps = ["@com_google_guava_guava//jar"]
)
In the implementation of my_rule, I would like to get the value com.google.guava:guava:20.0.
I think you'll need to file a feature request for this:
https://github.com/bazelbuild/bazel/issues/new
The instance of the maven_jar rule in the workspace file isn't available to the rules in BUILD files; only the rules which the workspace rule generates are (i.e., @com_google_guava_guava//jar). Off the top of my head, maven_jar would have to generate a rule into the jar's workspace which has an attribute with the value of artifact, and that rule would need to create a provider containing that value for other rules to consume.
(There does happen to be a META-INF/maven/com.google.guava/guava/pom.xml inside the jar, which seems to have the information you want, but I don't know whether you can rely on that for all jars from Maven. Either way, the contents of the jar aren't available at analysis time, i.e. within the rule implementation.)

Java <-> Scala conversion - "value is not a member of"

I have a problem with Scala code that calls Java methods.
The compilation fails with:
value getDepth is not a member of amqpManagment.utils.data.ChessObject
var depth: Int = chessObjects.getDepth()
^
However, I use getDepth in many other places in the Java code and it works fine.
Also, after adding that code it kept working in IntelliJ for a few hours, which is weird, but maybe the project didn't rebuild itself after that change...
IntelliJ shows the code as okay, but during compilation it reports that error. Rebuilding from IntelliJ or the terminal doesn't help.
Scala code:
import amqpManagment.utils.data.ChessObject

object ChessScheduler {
  // DEPTH GAME
  def startGameWithDepthRule(chessObject: ChessObject): Integer = {
    ...
    val depth: Int = chessObjects.getDepth()
    ...
  }
}
Java Code:
@Getter
@Setter
public class ChessObject {
    private Integer depth;
    ...
}
build.sbt
import sbt.Keys._
import sbt.Level
name := "ChessEngineModuler"
logLevel := Level.Warn
version := "1.0"
scalaVersion := "2.12.2"
Thank you for your help.
Hello @Chenna Reddy :)
Thank you for your post. It seems it was a problem with Lombok indeed. However, after your answer I realised it happened because the Scala code was compiled before the Java code.
I checked three solutions, since I had already added the dependency and turned annotation processing on.
The first solution is adding the getters and setters to the Java class by hand instead of using Lombok; however, that is an ugly solution.
The second solution is setting Files -> Settings -> Build, Execution, Deployment -> Compiler -> Scala Compiler -> Compile Order -> Java then Scala.
The third one is setting compileOrder := CompileOrder.JavaThenScala in build.sbt.
I think the 3rd is the best one if we want to deploy that code somewhere :)
Looks like you are using Lombok for auto-generation of getters. Please add the Lombok dependency:
libraryDependencies += "org.projectlombok" % "lombok" % "1.16.16"
The above step is not required if you are building the Java project separately and that project has Lombok as a compile-time dependency; the generated jar file must then already contain all the getters.
Regarding why IntelliJ sometimes shows an error, it's possible that you didn't enable annotation processing under Files -> Settings -> Build, Execution, Deployment -> Compiler -> Annotation Processors.
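For reference, here is roughly the plain-Java equivalent of what Lombok's @Getter/@Setter generate for ChessObject (a hand-written sketch based on the field shown in the question, not Lombok's literal output); the Scala compiler can only resolve getDepth() once these methods exist in the compiled Java classes:
package amqpManagment.utils.data;

// Hand-written sketch of what @Getter/@Setter effectively add to ChessObject.
public class ChessObject {
    private Integer depth;

    public Integer getDepth() {           // generated by @Getter
        return depth;
    }

    public void setDepth(Integer depth) { // generated by @Setter
        this.depth = depth;
    }
}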

Construct the stack map of a method while using BCEL

I am trying to use BCEL to modify a method by inserting an invoke instruction before specific instructions.
It seems that my instrumentation results in a different stack map table, which cannot be auto-generated by the BCEL package itself.
So my instrumented class file contains the old stack map table, which causes an error in the JVM.
I have tried removeCodeAttributes, the MethodGen method that removes all the code attributes. It works in simple cases, a wrapped function for example, but it does not work in my case now.
import java.io.FileOutputStream;
import java.io.IOException;

import org.apache.bcel.Constants;
import org.apache.bcel.classfile.ClassFormatException;
import org.apache.bcel.classfile.ClassParser;
import org.apache.bcel.classfile.JavaClass;
import org.apache.bcel.generic.ClassGen;
import org.apache.bcel.generic.ConstantPoolGen;
import org.apache.bcel.generic.Instruction;
import org.apache.bcel.generic.InstructionFactory;
import org.apache.bcel.generic.InstructionHandle;
import org.apache.bcel.generic.InstructionList;
import org.apache.bcel.generic.MethodGen;
import org.apache.bcel.generic.Type;
import org.apache.bcel.util.InstructionFinder;

public class Insert {
    public static void main(String[] args) throws ClassFormatException, IOException {
        Insert isrt = new Insert();
        String className = "StringBuilder.class";
        JavaClass jclzz = new ClassParser(className).parse();
        ClassGen cgen = new ClassGen(jclzz);
        ConstantPoolGen cpgen = cgen.getConstantPool();
        MethodGen mgen = new MethodGen(jclzz.getMethods()[1], className, cpgen);
        InstructionFactory ifac = new InstructionFactory(cgen);
        InstructionList ilist = mgen.getInstructionList();
        for (InstructionHandle ihandle : ilist.getInstructionHandles()) {
            System.out.println(ihandle.toString());
        }
        InstructionFinder f = new InstructionFinder(ilist);
        InstructionHandle[] insert_pos = (InstructionHandle[]) (f.search("invokevirtual").next());
        Instruction inserted_inst = ifac.createInvoke("java.lang.System", "currentTimeMillis",
                Type.LONG, Type.NO_ARGS, Constants.INVOKESTATIC);
        System.out.println(inserted_inst.toString());
        ilist.insert(insert_pos[0], inserted_inst);
        mgen.setMaxStack();
        mgen.setMaxLocals();
        mgen.removeCodeAttributes();
        cgen.replaceMethod(jclzz.getMethods()[1], mgen.getMethod());
        ilist.dispose();
        // output the file
        FileOutputStream fos = new FileOutputStream(className);
        cgen.getJavaClass().dump(fos);
        fos.close();
    }
}
Removing a StackMapTable is not a proper solution for fixing a wrong StackMapTable. The important quote is:
4.7.4. The StackMapTable Attribute
In a class file whose version number is 50.0 or above, if a method's Code attribute does not have a StackMapTable attribute, it has an implicit stack map attribute (§4.10.1). This implicit stack map attribute is equivalent to a StackMapTable attribute with number_of_entries equal to zero.
Since a StackMapTable must have explicit entries for every branch target, such an implicit StackMapTable will work with branch-free methods only. But in these cases, the method usually doesn’t have an explicit StackMapTable anyway, so you wouldn’t have that problem then (unless the method had branches which your instrumentation removed).
Another conclusion is that you can get away with removing the StackMapTable, if you patch the class file version number to a value below 50. Of course, this is only a solution if you don’t need any class file feature introduced in version 50 or newer…
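If you go down that route with BCEL, a minimal sketch might look like the following (assuming ClassGen exposes setMajor/setMinor, as in BCEL 5.x/6.x; the helper name is made up for illustration):
import org.apache.bcel.generic.ClassGen;

// Hypothetical helper: downgrade the class file version below 50 so that the
// implicit (empty) stack map attribute is accepted by the old verifier again.
// Only viable if the class needs no features introduced in version 50 or later.
public final class VersionDowngrade {
    public static void downgradeBelow50(ClassGen cgen) {
        cgen.setMajor(49); // 49 = Java 5 class file format, i.e. below 50
        cgen.setMinor(0);
    }
}
Combined with removeCodeAttributes() from the question's code, this drops the stale StackMapTable while keeping the downgraded class verifiable.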
There was a grace period in which JVMs supported a fall-back mode for class files with broken StackMapTables just for scenarios like yours, where the tool support is not up-to-date (see -XX:+FailoverToOldVerifier or -XX:-UseSplitVerifier). But the grace period is over now and that support has been removed, i.e. Java 8 JVMs do not support the fall-back mode anymore.
If you want to keep up with Java development and instrument newer class files which might use features of these new versions, you have only two choices:
Calculate the correct StackMapTable manually
Use a tool which supports calculating the correct StackMapTable attributes; ASM, for example (see the java-bytecode-asm tag), does support it, as sketched below
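A minimal sketch of the ASM route (assuming the org.objectweb.asm ClassReader/ClassWriter API; plug your own ClassVisitor between reader and writer to do the actual instrumentation):
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassWriter;

// Sketch: copy a class and let ASM recompute the StackMapTable frames.
public final class RecomputeFrames {
    public static byte[] recompute(byte[] originalClass) {
        ClassReader reader = new ClassReader(originalClass);
        // COMPUTE_FRAMES recalculates stack map frames (and max stack/locals).
        ClassWriter writer = new ClassWriter(reader, ClassWriter.COMPUTE_FRAMES);
        reader.accept(writer, 0);
        return writer.toByteArray();
    }
}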

Starting JVM for Inline

I have a Perl script that uses Inline::Java and just has to fork (it is a server and I want it to handle multiple connections simultaneously).
So I wanted to implement this solution, which makes use of a shared JVM with SHARED_JVM => 1. Since the JVM is not shut down when the script exits, I want to reuse it with START_JVM => 0. But since it might just be the first time I start the server, I would also like to have a BEGIN block make sure a JVM is running before calling use Inline.
My question is very simple, but I couldn't find any answer on the web: How do I simply start a JVM? I've looked at man java and there seems to be just no option that means "start and just listen for connections".
Here is a simplified version of what I'm trying to do in Perl, if this helps:
BEGIN {
    &start_jvm unless &jvm_is_running;
}

use Inline (
    Java       => 'STUDY',
    SHARED_JVM => 1,
    START_JVM  => 0,
    STUDY      => ['JavaStuff'],
);

if (fork) {
    JavaStuff->do_something;
    wait;
}
else {
    Inline::Java::reconnect_JVM();
    JavaStuff->do_something;
}
What I need help with is writing the start_jvm subroutine.
If you've got a working jvm_is_running function, just use it to determine whether Inline::Java should start the JVM.
use Inline (
    Java       => 'STUDY',
    SHARED_JVM => 1,
    START_JVM  => jvm_is_running() ? 0 : 1,
    STUDY      => ['JavaStuff'],
);
Thanks to details provided by tobyink, I am able to answer my own question, which was based on the erroneous assumption that the JVM itself provides a server and a protocol.
As a matter of fact, one major component of Inline::Java is a server, written in Java, which handles requests from the Inline::Java::JVM client, written in Perl.
Therefore, the command-line to launch the server is:
$ java org.perl.inline.java.InlineJavaServer <DEBUG> <HOST> <PORT> <SHARED_JVM> <PRIVATE> <NATIVE_DOUBLES>
where all parameters correspond to configuration options described in the Inline::Java documentation.
Therefore, in my case, the start_jvm subroutine would be:
sub start_jvm {
    system 'java org.perl.inline.java.InlineJavaServer 0 localhost 7891 true false false';
}
(Not that it needs to be defined: tobyink's solution, while it did not directly address the question I asked, is much better.)
As for the jvm_is_running subroutine, this is how I defined it:
use Proc::ProcessTable;

use constant {
    JAVA          => 'java',
    INLINE_SERVER => 'org.perl.inline.java.InlineJavaServer',
};

sub jvm_is_running {
    my $pt = new Proc::ProcessTable;
    return grep {
        $_->fname eq JAVA && ( split /\s/, $_->cmndline )[1] eq INLINE_SERVER
    } @{ $pt->table };
}

Warning: File for type '[Insert class here]' created in the last round will not be subject to annotation processing

I switched an existing code base to Java 7 and I keep getting this warning:
warning: File for type '[Insert class here]' created in the last round
will not be subject to annotation processing.
A quick search reveals that no one has hit this warning.
It's not documented in the javac compiler source either:
From OpenJDK\langtools\src\share\classes\com\sun\tools\javac\processing\JavacFiler.java
private JavaFileObject createSourceOrClassFile(boolean isSourceFile, String name) throws IOException {
    checkNameAndExistence(name, isSourceFile);
    Location loc = (isSourceFile ? SOURCE_OUTPUT : CLASS_OUTPUT);
    JavaFileObject.Kind kind = (isSourceFile ?
                                JavaFileObject.Kind.SOURCE :
                                JavaFileObject.Kind.CLASS);
    JavaFileObject fileObject =
        fileManager.getJavaFileForOutput(loc, name, kind, null);
    checkFileReopening(fileObject, true);
    if (lastRound) // <------------------------------- TRIGGERS WARNING
        log.warning("proc.file.create.last.round", name);
    if (isSourceFile)
        aggregateGeneratedSourceNames.add(name);
    else
        aggregateGeneratedClassNames.add(name);
    openTypeNames.add(name);
    return new FilerOutputJavaFileObject(name, fileObject);
}
What does this mean and what steps can I take to clear this warning?
Thanks.
The warning
warning: File for type '[Insert class here]' created in the last round
will not be subject to annotation processing
means that you were running an annotation processor that creates a new class or source file using a javax.annotation.processing.Filer implementation (provided through the javax.annotation.processing.ProcessingEnvironment), although the processing tool had already decided it was "in the last round".
This may be a problem (and thus the warning) because the generated file itself may contain annotations that will be ignored by the annotation processor (because it is not going to do a further round).
The above ought to answer the first part of your question
What does this mean and what steps can I take to clear this warning?
(you figured this out already by yourself, didn't you :-))
What possible steps to take? Check your annotation processors:
1) Do you really have to use filer.createClassFile / filer.createSourceFile in the very last round of the annotation processor? Usually one uses the filer object inside a code block like
for (TypeElement annotation : annotations) {
...
}
(in the process method). This ensures that the annotation processor is not in its last round (the last round always being the one with an empty set of annotations); a sketch of this pattern follows after point 2.
2) If you really can't avoid writing your generated files in the last round and these files are source files, trick the annotation processor and use the filer object's createResource method (taking "SOURCE_OUTPUT" as the location).
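A minimal sketch of the pattern from point 1) (the processor name and the generated class name are made up for illustration), writing the file from inside the loop over the matched annotations so that the write can never happen in the last round:
import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.JavaFileObject;

@SupportedAnnotationTypes("org.junit.runner.RunWith") // example annotation type
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class EarlyWritingProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // The last round always has an empty annotation set, so code inside this
        // loop never runs in the last round and the warning is not triggered.
        for (TypeElement annotation : annotations) {
            try {
                JavaFileObject fo = processingEnv.getFiler().createSourceFile("Gen");
                try (Writer out = fo.openWriter()) {
                    out.write("class Gen { }");
                }
            } catch (IOException e) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.toString());
            }
        }
        return true;
    }
}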
In the OpenJDK test case this warning is produced because the processor uses "processingOver()" to write a new file exactly in the last round.
public boolean process(Set<? extends TypeElement> elems, RoundEnvironment renv) {
    if (renv.processingOver()) { // Write only at last round
        Filer filer = processingEnv.getFiler();
        Messager messager = processingEnv.getMessager();
        try {
            JavaFileObject fo = filer.createSourceFile("Gen");
            Writer out = fo.openWriter();
            out.write("class Gen { }");
            out.close();
            messager.printMessage(Diagnostic.Kind.NOTE, "File 'Gen' created");
        } catch (IOException e) {
            messager.printMessage(Diagnostic.Kind.ERROR, e.toString());
        }
    }
    return false;
}
I modified the original example code a bit: added the diagnostic note "File 'Gen' created", replaced the "*" mask with "org.junit.runner.RunWith", and set the return value to "true". The produced compiler log was:
Round 1:
input files: {ProcFileCreateLastRound}
annotations: [org.junit.runner.RunWith]
last round: false
Processor AnnoProc matches [org.junit.runner.RunWith] and returns true.
Round 2:
input files: {}
annotations: []
last round: true
Note: File 'Gen' created
Compilation completed successfully with 1 warning
0 errors
1 warning
Warning: File for type 'Gen' created in the last round will not be subject to annotation processing.
If we remove my custom note from the log, it's hard to tell that the file 'Gen' was actually created in 'Round 2', the last round. So the basic advice applies: if in doubt, add more logs.
There is also a little bit of useful info on this page:
http://docs.oracle.com/javase/7/docs/technotes/tools/solaris/javac.html
Read the section about "ANNOTATION PROCESSING" and try to get more info with these compiler options:
-XprintProcessorInfo
Print information about which annotations a processor is asked to process.
-XprintRounds Print information about initial and subsequent annotation processing rounds.
I poked around the Java 7 compiler options and found this:
-implicit:{class,none}
Controls the generation of class files for implicitly loaded source files. To automatically generate class files, use -implicit:class. To suppress class file generation, use -implicit:none. If this option is not specified, the default is to automatically generate class files. In this case, the compiler will issue a warning if any such class files are generated when also doing annotation processing. The warning will not be issued if this option is set explicitly. See Searching For Types.
Source
Can you try setting the -implicit option explicitly (e.g. -implicit:class) so that the warning is not issued?
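For illustration (the file and processor names are made up), setting the option explicitly on the javac command line should suppress the warning, per the documentation quoted above:
javac -implicit:class -processor com.example.MyProcessor Main.java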
