As a developer I want to create a Maven project and build an executable standalone JAR application. (No Spring Boot)
During development and at build time I want to add my Drools KIE artifact as a dependency:
<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>mydrools</artifactId>
    <version>[1.0.0,)</version>
</dependency>
build my application as an executable JAR and run it. My application contains the code that calls the Drools engine:
KieSession kSession = kContainer.newKieSession();
kSession.insert(myBean);
kSession.fireAllRules();
Above all, when I deploy my application to production:
I do not want to install Maven on my production server
I do not want my application to scan a local or a remote Maven repository
I want my application to periodically scan for a new version of my Drools KIE artifact without any reference to a Maven repository, just by looking at the filesystem
I have tried the following:
KieServices ks = KieServices.Factory.get();

// Load the KJAR directly from the filesystem and register it with the KieRepository
String fileName = System.getenv("HOME") + "/libs/mydrools-1.0.0.jar";
File file = new File(fileName);
KieRepository kieRepository = ks.getRepository();
kieRepository.addKieModule(ks.getResources().newFileSystemResource(file));

KieContainer kContainer = ks.newKieContainer(ks.newReleaseId("com.mycompany", "mydrools", "1.0.0"));

// Poll for a newer version every 10 seconds
KieScanner kScanner = ks.newKieScanner(kContainer);
kScanner.start(10000L);
Loading the JAR works fine, but it seems that I am also forced to configure at least a minimal Maven repository (a ~/.m2 folder and a settings.xml). I get a heap of errors from the org.apache.maven classes and related code.
Of course I do not want my production environment to rely on, or depend on, any Maven configuration. I just want to run a JAR with another JAR (e.g. libs/mydrools-1.0.0.jar) as a dependency, and possibly reload that dependency dynamically whenever I update libs/mydrools-1.0.0.jar.
Basically I need to completely disable the Maven machinery embedded in Drools KIE (i.e. run it fully offline).
How is it possible to do this with Drools 6.2.0.Final?
Update
This issue is closely related to:
Using Drools 6 Maven architecture completely offline
http://lists.jboss.org/pipermail/rules-users/2014-June/036245.html
The answer is: you don't. KIE 6.x (and 7) has Maven built into it; the KieScanner class uses Maven to find updates.
The scanner works better if you specify a version range in the ReleaseId, e.g. [1.0.0,).
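For illustration, a minimal sketch of a scanner set up that way (the coordinates are the ones from the question, and kie-ci must be on the classpath for the scanner to work):

import org.kie.api.KieServices;
import org.kie.api.builder.KieScanner;
import org.kie.api.builder.ReleaseId;
import org.kie.api.runtime.KieContainer;

public class ScannerExample {
    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();

        // Open-ended version range: the scanner picks up any newer release it finds
        ReleaseId releaseId = ks.newReleaseId("com.mycompany", "mydrools", "[1.0.0,)");
        KieContainer kContainer = ks.newKieContainer(releaseId);

        // Resolves updates through Maven under the hood (requires kie-ci)
        KieScanner kScanner = ks.newKieScanner(kContainer);
        kScanner.start(10000L); // check for new versions every 10 seconds
    }
}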
My company is in the process of deploying KIE-based applications to production. We're setting up an Artifactory repository in PROD, and there will be a Maven repository as well.
You can essentially disable the Maven part by not using KieScanner; instead, use getKieClasspathContainer() to get the KIE container. You won't get dynamic rule updates that way, though.
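Roughly, that looks like the following sketch, assuming the rules (kmodule.xml plus DRL files) are packaged as resources inside the application JAR and a default KieSession is declared:

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class ClasspathRulesExample {
    public static void main(String[] args) {
        // Rules are loaded from the application classpath;
        // no Maven repository and no kie-ci are needed at runtime.
        KieServices ks = KieServices.Factory.get();
        KieContainer kContainer = ks.getKieClasspathContainer();

        // Assumes a default KieSession is declared in kmodule.xml
        KieSession kSession = kContainer.newKieSession();
        try {
            kSession.fireAllRules();
        } finally {
            kSession.dispose();
        }
    }
}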
KIE also provides an Execution Server, which exposes the rules through a REST API. The Execution Server's rules can also be updated via Maven.
Architecturally speaking, you have three rule deployment models:
Model #1 is a dynamic rule updating model. In 5.3 it pulled compiled classes over HTTP; in 6 & 7 it uses Maven as the transport, since Maven provides versioning and is by far the most prolific artifact versioning and transport tool. In this model you have a production application (JAR or WAR) which pulls rules (over Maven Aether, through kie-ci) from a Maven repository (you can have a dedicated Maven repo for PROD if you like). If you use this model, you need kie-ci as a dependency and it will magically use Maven under the hood.
You could use the scanner and configure a Maven settings.xml that has no <servers>, and therefore it should only pull from the production server's ~/.m2 folder; that lets you deploy to the server's file system and use the out-of-the-box scanner without any danger of it pulling externally.
Model #2 is an immutable rule model. The concept is that you embed the rules in the application as a resource, so they cannot be updated. This works well with immutable deployments such as CD pipelines and container/Docker deployments that need to test the state of the app exactly as it is right now. Having said that, containers don't prohibit the option to update dynamically; I'm speaking from a purely architectural perspective. For this model, omit kie-ci from the dependencies and use getKieClasspathContainer() (as whomer said) to load the rules from the resources folder; it will never attempt to update without redeploying the app.
Model #3 is a centralized "server" model (I'm only adding it for completeness, given its limited use). Here you execute rules centrally, outside of your application's runtime, an approach made popular by IBM's rule (and marketing) engine. However, it is inferior for most use cases, except in a managed-service type application where you want to cross-charge. It doesn't scale naturally along with the app, and entities have to be de/serialized over the wire, so performance is poor; you do, however, get central logging.
Related
We have started to implement continuous delivery for our Java builds, using Maven and TeamCity for CI and build automation.
We have a few common JARs that are built as standalone artifacts and are consumed by the web modules.
These common modules change frequently; we have started to adopt the approach discussed in various forums (What is the Maven way for automatic project versions when doing continuous delivery?) and in this blog
http://blog.xebia.com/2012/09/30/continuous-releasing-of-maven-artifacts/
namely to use Major.Minor.BugFix-${revision} for all the common JARs.
The value of revision is set in the parent POM to SNAPSHOT for local development; for TeamCity builds it is set to ${BuildNumberCounter}-${SVNRepoRevisionNumber}, e.g. 1.0.0-10-233.
For a web module that needs to consume the JAR and always wants to pick up the latest version, the dependency range is defined as [1.0.0,2.0.0). This seems to be working fine; however, to be honest, we have not yet used it in anger, so we will see whether we hit challenges.
The immediate problem is that, for local desktop development, the dependency range in the web module always resolves to the latest numbered release rather than to the SNAPSHOT build the developer created for locally testing the common JAR with the web module. We believe a developer should be able to test a change to a common JAR against the web modules locally. The only way to achieve that today is to commit the change and have TeamCity produce a new numbered release, which is not ideal, as it could potentially break the build of every web module that uses that common JAR.
I wonder if anyone has faced a similar problem and has a solution.
I currently have a Jenkins instance installed on a Development box. This builds fine and deploys to our development environment without any issues.
During the build process my project makes use of a single properties file containing details such as a database connection URL (details such as these will obviously vary depending on the environment I'm pointing to).
What I would like to know is: what is the best way to configure my project so that when I want to release to production, the WAR file built by Jenkins contains the production properties instead of the development ones?
(Note I am also using Maven in my project).
I know 3 options:
We have used Maven profiles for that in the past, but they have the disadvantage that the Maven release plugin doesn't work with profiles, so we had to change the versions manually and were unable to deploy the artifacts to a remote repository like Nexus.
Another option is Maven's assembly plugin. That can be used together with the release plugin, as far as I know.
We decided to write a simple tool that changes the WAR files after the Maven build. It runs in a separate Jenkins job. The idea is that building and configuring are two separate steps: the artifacts coming out of Maven are always in a default configuration, and if we need the configuration for a production release, we start a Jenkins job that configures the WAR files.
You can create different Maven profiles, like dev and prod; then, in each profile's settings, use/filter the corresponding resource files, e.g. .../(dev|test|prod)/project.properties. In Jenkins, when you build for a given platform, build with -Pdev or -Pprod to get the WAR for the right target.
You may want to look at Maven profiles and Maven resource filtering for the detailed configuration.
Something not related: connect to the database via JNDI if possible.
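For example, a minimal sketch of looking up a container-managed DataSource via JNDI (the name jdbc/myAppDS is just an illustration; use whatever your container defines):

import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class JndiLookupExample {
    public Connection openConnection() throws NamingException, SQLException {
        // The connection URL and credentials live in the container's configuration,
        // not in a per-environment properties file packaged inside the WAR.
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup("java:comp/env/jdbc/myAppDS");
        return ds.getConnection();
    }
}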
I have my own Java library (MyLib), which I publish to a private Maven repository and use as a Maven dependency in another webapp project (MyWebapp). If I have both projects, MyLib and MyWebapp, open in Eclipse at the same time, is there a way to configure MyWebapp so that local changes made to MyLib are picked up directly when building/deploying MyWebapp? The issue is that during development it is not comfortable to make a change in MyLib, build it, deploy it to the Maven repository, then build and deploy MyWebapp, and only then see how the changes affect the webapp project...
I would like to reduce this overhead while developing and quickly see how changes work out. Of course, when it comes to real releases, the flow described above makes sense and works great.
Thanks!
There is no simple "just tick this option" solution, unfortunately. You can choose between these options:
Convert MyLib into a Maven module and add it to the sources of MyWebapp. This, of course, will make it harder later to reuse the library alone.
Stop deploying the application. If you look at the classpath in Eclipse, m2e should have added the dependency as a project from the workspace (instead of depending on the JAR in the repository). If not, there is an option in m2e to enable workspace resolution.
The next step is to create another project which depends on Jetty and MyWebapp. Create a Java application in there (i.e. a class with a main()) which creates a Jetty server and configures it to use the current classpath. That way, you can start the webapp just like any other Java application, without deploying; Jetty will simply load classes from the classpath that m2e assembled.
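A rough sketch of such a launcher, assuming a Jetty 9 style API and that the webapp sources live in ../MyWebapp/src/main/webapp (adjust the path to your layout):

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;

public class DevServer {
    public static void main(String[] args) throws Exception {
        Server server = new Server(8080);

        // Point Jetty at MyWebapp's source webapp folder; compiled classes
        // (including the MyLib project open in the workspace) are loaded from
        // the current classpath rather than from a deployed WAR.
        WebAppContext webapp = new WebAppContext();
        webapp.setContextPath("/");
        webapp.setWar("../MyWebapp/src/main/webapp"); // assumption: adjust to your layout
        webapp.setParentLoaderPriority(true);

        server.setHandler(webapp);
        server.start();
        server.join();
    }
}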
I have two Maven projects which I need to deploy through an automated deployment process (like a nightly build or similar).
The scenario is as follows:
mv-proj1
-dependency-1.jar
-dependency-2.jar
-dependency-3.jar
mv-proj2
-dependency-3.jar
-dependency-4.jar
-mv-proj1.jar
The sources of mv-proj1 and mv-proj2 cannot be disclosed.
mv-proj2 is an executable JAR and provides services to other application modules.
So what is the standard way of deploying these to a production machine, or let's say a UAT machine?
Do I need to set up an intra-organization Maven repository?
Do I need to install a Maven repository on the UAT machine?
One possible way I can think of is to set up and host an intra-organization Maven repository, set up Maven on the UAT machine to fetch artifacts from that repository, and deploy only the pom.xml.
I would let the choice depend on who the consumers of your artifacts are.
If the consumers are also Maven projects that can pull in your JARs from said intra-organizational Maven repository, that's definitely a great way to go. I believe that every organization that is serious about using Maven will sooner or later have use cases for such a repository of its own anyway. I've worked with Artifactory and Nexus and feel that both are great products (and free as in beer for the use case stated here). They're both easy to install, and it should not be a daunting exploration; go for it!
If your UAT machines were to use Maven to build and install anything that pulls in your artifacts as a dependency, they would be running Maven client-side. There would be a local repository (an artifact cache) on these clients, but that's a different beast from the organizational repository mentioned above, which you would likely not deploy on the actual UAT machines.
If mv-proj2 is rather a "final delivery", executable as you say, you may want to pack it all up as a single, self-contained JAR (Maven can do that for you) and distribute that to your users. You could again do that through an organizational repository, or ultimately release it to some network drive or web server. There are many ways to do so, e.g. use the maven-jar-plugin with outputDirectory pointing to wherever you want to release.
I'm confused about the use of Maven in development and production environments; I'm sure it's something simple that I'm missing. Grateful for any help.
I set up Maven inside Eclipse on my local machine and wrote some software. I really like how it has made things like including dependent JARs very easy.
So that's my development environment. But now I want to release the project to production on a remote server. I've searched the documentation, but I can't figure out how it's supposed to work or what the Maven best practice is. Are you supposed to:
a) Also run Maven in your production environment, upload all your files there, and rebuild your project on the server? (Something in me baulks at the idea of rebuilding 'released' code on the production server, so I'm fairly sure this isn't right.)
b) Use mvn package to create your JAR file and then copy that up to production? (But then what about all those nice dependencies? Isn't there a danger that your tested code will now run against different versions of the dependent JARs in the production environment, possibly breaking your code? Or a JAR could be missing entirely?)
c) Something else that I'm not figuring out?
Thanks in advance for any help!
You're supposed to have your code under version control (and you never "upload" files to another machine, you "download" them from the Version Control System if required).
You're supposed to package your code in a format (a WAR, an EAR, another kind of bundle) that can be deployed on the production environment for execution. Such bundles typically include the dependencies. To build more complex bundles, the Maven Assembly Plugin can help.
Maven generated artifacts (JARs, WARs, whatever) should be shared via a remote repository (and thus deployed - I mean mvn deploy here - to this remote repository). A remote repository can be a simple file system served via a web server or a more advanced solution like Nexus.
Development is usually done using SNAPSHOT dependencies (e.g. 1.0-SNAPSHOT). At release time, you're supposed to change the version into a "fixed" version (e.g. 1.0) and some other bits of your pom.xml, run the build to check that everything is ok, commit the modified pom.xml, create a tag in the VCS, promote the versions to a new SNAPSHOT (e.g. 1.1-SNAPSHOT) in the pom.xml, and commit the new pom.xml to the VCS. The entire process requires some work, but it can be automated using the Maven Release Plugin.
On the production environment, get the artifacts to be deployed from the remote repository and deploy them (some projects automate the deployment to the production server using Maven but that's another story).
Of course, there are variations around this (deployment to production is most of time company specific) but the general idea is there.
You need to look into the Maven Assembly Plugin and the Maven Release Plugin.
When building an artifact you usually state what scope each dependency has. In the default scope it should be packaged into your archive. If you do not want that, use scope "provided"; in that case you have to prepare a runtime environment that provides the dependency. It's generally a bad idea to rebuild a package only for deployment.
As for deploying, you can use Maven's antrun plugin to copy files locally or via scp.