I have an issue where Maven, the frontend-maven-plugin, and webpack don't play well together when I install an entire Maven module. Simply put, webpack's HtmlWebpackPlugin will not inject the bundled js/css files the first time I install the Maven module, for some reason, even though a template is provided:
new HtmlWebpackPlugin({
    template: '../resources/public/index.html',
    filename: 'index.html',
    inject: 'body',
})
However, if I manually run the frontend-maven-plugin after installing the entire Maven module, it does inject the correct files, which is rather strange behavior.
To work around this, I wanted to know if there is a manual way to inject these bundled files (I only have three: 1 CSS and 2 JS files) with a chunkhash into my own index.html template. That would make the build much more consistent.
A snippet of my webpack.config.js; as you can see, we add the chunkhash to the filenames if we are not in dev.
"use strict";
const ExtractTextPlugin = require("extract-text-webpack-plugin");
const HtmlWebpackPlugin = require('html-webpack-plugin');
let path = require('path');
let webpack = require("webpack");
const PATHS = {
    build: path.join(__dirname, '../../../target', 'classes', 'public'),
};
const env = process.env.NODE_ENV;
let isDev = false;
if (env == "dev") {
    isDev = true;
}
console.log(`Dev environment: ${isDev}`);
module.exports = {
    entry: {
        main: './js/index.jsx',
        vendor: [
            "react", "react-dom", "axios",
            "react-table", "mobx", "mobx-react", "mobx-utils", "lodash"],
    },
    output: {
        path: PATHS.build,
        filename: `bundle.${isDev ? '' : '[chunkhash]'}.js`
    },
    plugins: [
        new webpack.optimize.CommonsChunkPlugin({name: "vendor", filename: `/static/js/vendor.bundle.${isDev ? '' : '[chunkhash]'}.js`}),
        new ExtractTextPlugin(`/static/css/[name].${isDev ? '' : '[chunkhash]'}.css`),
        new HtmlWebpackPlugin({
            template: '../resources/public/index.html',
            filename: 'index.html',
            inject: 'body',
        })
    ],
    module: {
        loaders: [
            // Bunch of loaders
        ]
    },
};
I solved it. The issue was basically that Maven/Spring would take the index.html (which I used as a template) from resources/public outside my target folder and copy it into the target folder, overwriting the output from HtmlWebpackPlugin, which makes logical sense in this context.
I solved it by not having any index.html file in resources/public and instead keeping a template.html in the src folder where webpack lives. That way, Maven/Spring doesn't overwrite webpack's output with the empty template.
In our project (Java/Spring/Gradle stack), we are using OpenAPI specs. We have a couple of services and a spec for each of them. We also have common schemas that are duplicated in each spec, so we moved them into a separate spec file and reference it from the others. For instance:
specific_project1:
  openapi:
    spec.yaml
    common_modules.yaml
  build.gradle
spec.yaml contains the following:
openapi: 3.0.0
info:
  version: 0.0.1
paths:
  /specific/post:
    post:
      requestBody:
        content:
          application/json:
            schema:
              $ref: "common_modules.yaml#/components/schemas/TestModule"
and common_modules.yaml contains:
openapi: 3.0.0
info:
  version: 0.0.1
components:
  schemas:
    TestModule:
      type: object
      properties:
        value:
          type: string
As a result, we need spec.yaml to be generated with the package name com.specific.project1, common_modules.yaml to be generated with the package name com.common.modules, and the Java classes generated from spec.yaml to import TestModule from the com.common.modules package.
We found a solution for referring to the same object: we published TestModule as a separate project with its own package, and in the other projects we used the importMappings configuration to define the proper package name for the object:
importMappings = [
    TestModule: "com.common.modules.TestModule"
]
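This is what the mapping buys you: the generated code refers to the published class instead of generating its own copy of TestModule. A purely illustrative sketch of a generated model (class and property names are made up, not taken from real generator output):

package com.specific.project1.model;

// The import mapping makes the generator reference the published class
// instead of generating a duplicate TestModule in this package.
import com.common.modules.TestModule;

public class SpecificPostRequest {

    private TestModule testModule;

    public TestModule getTestModule() {
        return testModule;
    }

    public void setTestModule(TestModule testModule) {
        this.testModule = testModule;
    }
}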
Yes, it is possible.
When you add the OpenAPI plugin to build.gradle like this:
plugins {
    id "org.openapi.generator" version "4.3.0"
}
you'll get access to the openApiGenerate task, which can be configured like this, for example:
openApiGenerate {
    generatorName = "jaxrs-jersey"
    inputSpec = "$rootDir/spec-v1.yaml".toString()
    outputDir = "$rootDir/".toString()
    apiPackage = "name.of.api.package.rest.server.v1.api"
    invokerPackage = "name.of.api.package.rest.server.v1.invoker"
    modelPackage = "name.of.api.package.rest.server.v1.model"
    configOptions = [
        library: "jersey2"
    ]
}
Here is the link to all the possible options you can set, using the spring generator as an example: https://github.com/OpenAPITools/openapi-generator/blob/master/docs/generators/spring.md
and here is the full list of generators: https://github.com/OpenAPITools/openapi-generator/tree/master/docs/generators
In order to use two YAML files in a single project, here is what you have to do:
task common(type: org.openapitools.generator.gradle.plugin.tasks.GenerateTask) {
    generatorName = "spring"
    inputSpec = "$rootDir/schema/common.yaml".toString()
    outputDir = "$rootDir/generated".toString()
    apiPackage = "com.abc.code.generated.controller"
    modelPackage = "com.abc.code.generated.model"
    configOptions = [
        dateLibrary: "java8"
    ]
    systemProperties = [
        invoker: "false",
        generateSupportingFiles: "true"
    ]
    additionalProperties = [
        interfaceOnly: "true"
    ]
}
You can keep the openApiGenerate task as it is for spec.yaml.
But make sure to add
compileJava.dependsOn common, tasks.openApiGenerate
so that both of them are executed.
Hope I helped.
So the BUILD structure is like below:
java:
  /src/java/com/abc/code/Code.java
resources:
  /src/resources/com/abc/code/application.properties
The BUILD file for the resources defines a filegroup:
filegroup(
    name = "properties",
    srcs = glob(["application.properties"]),
    visibility = ["//visibility:public"],
)
The BUILD file of the app uses the filegroup as resources/classpath_resources:
java_binary(
    name = "app",
    classpath_resources = [
        "//src/resources/com/abc/code:properties",
    ],
    # resources = [
    #     "//src/resources/com/abc/code:properties",
    # ],
    main_class = "com.abc.code.Code",
    runtime_deps = [
        ":app_bin",
    ],
)
I get null back for Code.class.getResourceAsStream("application.properties");
and after checking the generated jar, I found that application.properties sits at the top level:
jar tf poc_delivery_system_app.jar
META-INF/
META-INF/MANIFEST.MF
application.properties
I then updated the code to Code.class.getResourceAsStream("/application.properties"), which works.
The question is why application.properties ends up at the top level instead of somewhere like /com/abc/code/application.properties.
The resources are declared with the following source(s):
srcs = glob(["application.properties"])
So, indeed they are at the source.
If you want them in a subdirectory, place the BUILD file at $WORKSPACE/src/resources/BUILD with
filegroup(
    name = "resources",
    srcs = glob(["**/*.properties"]),
)
and the java library will then use:
resources = [
    "//src/resources",
],
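With that layout the properties file should end up under com/abc/code/ inside the jar, so a package-relative lookup works again. A minimal sketch, assuming the resource really does land at /com/abc/code/application.properties:

package com.abc.code;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class Code {

    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        // Resolved relative to the package of Code,
        // i.e. /com/abc/code/application.properties on the classpath.
        try (InputStream in = Code.class.getResourceAsStream("application.properties")) {
            if (in == null) {
                throw new IllegalStateException("application.properties not found on the classpath");
            }
            props.load(in);
        }
        System.out.println(props);
    }
}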
When I try to edit a property within Gradle, it re-formats my entire properties file and removes the comments. I assume this is because of the way Gradle reads and writes the properties file. I would like to change just one property and leave the rest of the properties file untouched, including keeping the current comments and the order of the values. Is this possible with Gradle 5.2.1?
I have tried just using setProperty (which does not write to the file), using a different writer (versionPropsFile.withWriter { versionProps.store(it, null) }), and a different way of reading in the properties file (versionProps.load(versionPropsFile.newDataInputStream())).
Here is my current Gradle code:
File versionPropsFile = file("default.properties")
def versionProps = new Properties()
versionProps.load(versionPropsFile.newDataInputStream())
int version_minor = versionProps.getProperty("VERSION_MINOR") as Integer
int version_build = versionProps.getProperty("VERSION_BUILD") as Integer
versionProps.setProperty("VERSION_MINOR", "1")
versionProps.setProperty("VERSION_BUILD", "2")
versionPropsFile.withWriter { versionProps.store(it, null) }
Here is a piece of what the properties file looks like before Gradle touches it:
# Show splash screen at startup (yes* | no)
SHOW_SPLASH = yes
# Start in minimized mode (yes | no*)
START_MINIMIZED = no
# First day of week (mon | sun*)
# FIRST_DAY_OF_WEEK = sun
# Version number
# Format: MAJOR.MINOR.BUILD
VERSION_MAJOR = 1
VERSION_MINOR = 0
VERSION_BUILD = 0
# Build value is the date
BUILD = 4-3-2019
Here is what Gradle does to it:
#Wed Apr 03 11:49:09 CDT 2019
DISABLE_L10N=no
LOOK_AND_FEEL=default
ON_MINIMIZE=normal
CHECK_IF_ALREADY_STARTED=YES
VERSION_BUILD=0
ASK_ON_EXIT=yes
SHOW_SPLASH=yes
VERSION_MAJOR=1
VERSION_MINOR=0
VERSION_BUILD=0
BUILD=04-03-2019
START_MINIMIZED=no
ON_CLOSE=minimize
PORT_NUMBER=19432
DISABLE_SYSTRAY=no
This is not a Gradle issue per se. The default Properties object of Java does not preserve any layout/comment information of properties files. You can use Apache Commons Configuration, for example, to get layout-preserving properties files.
Here’s a self-contained sample build.gradle file that loads, changes and saves a properties file, preserving comments and layout information (at least to the degree that is required by your example file):
buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'org.apache.commons:commons-configuration2:2.4'
    }
}
import org.apache.commons.configuration2.io.FileHandler
import org.apache.commons.configuration2.PropertiesConfiguration
import org.apache.commons.configuration2.PropertiesConfigurationLayout
task propUpdater {
    doLast {
        def versionPropsFile = file('default.properties')
        def config = new PropertiesConfiguration()
        def fileHandler = new FileHandler(config)
        fileHandler.file = versionPropsFile
        fileHandler.load()
        // TODO change the properties in whatever way you like; as an example,
        // we’re simply incrementing the major version here:
        config.setProperty('VERSION_MAJOR',
                (config.getProperty('VERSION_MAJOR') as Integer) + 1)
        fileHandler.save()
    }
}
I've just upgraded to ES 2.0.0-rc1.
I use a local node for JUnit testing.
Settings settings = Settings.builder()
        .put("script.inline", "on")
        .put("script.indexed", "on")
        .put("path.home", "/").build();
return NodeBuilder.nodeBuilder()
        .settings(settings)
        .local(true)
        .clusterName("c").node();
My problem is that the upgraded version doesn't see my native scripts.
The query looks like this:
Script script = new Script("myscript", ScriptType.INDEXED, "native", params);
ScoreFunctionBuilder scoreBuilder = ScoreFunctionBuilders.scriptFunction(script);
The output is the following:
...
"functions" : [ {
"script_score" : {
"script" : {
"id" : "myscript",
"lang" : "native",
"params" : {
"searchMode" : "A"
}
}
}
...
The script plugin is in the Maven dependency list.
It worked well with the former version; however, with this new version I get the following exception:
Caused by: org.elasticsearch.index.query.QueryParsingException: script_score the script could not be loaded
Caused by: org.elasticsearch.index.IndexNotFoundException: no such index
So how could I install the plugin to a local node?
Edit 1:
https://www.elastic.co/guide/en/elasticsearch/plugins/2.0/plugin-authors.html / "Loading plugins from the classpath"
might be the solution. Nope, it isn't.
Edit 2:
It seems that the ScoreFunctionBuilder has been changed.
1.7:
ScoreFunctionBuilder scoreBuilder = ScoreFunctionBuilders.scriptFunction("myscript", "native", params);
2.0:
Script script = new Script("myscript", ScriptType.INDEXED, "native", params);
ScoreFunctionBuilder scoreBuilder = ScoreFunctionBuilders.scriptFunction(script);
However, this doesn't fit native scripts.
I don't know why, since it doesn't follow any logic, but all you need to do is use ScriptType.INLINE:
Script script = new Script("myscript", ScriptType.INLINE, "native", params);
We can't use INDEXED because Elasticsearch will look for an indexed script in its system, and since our script isn't indexed per se... it wouldn't work.
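For completeness, here is a rough sketch of how a native script is typically registered with the node in ES 2.x so that the "native" lang and script name can be resolved. The class names are illustrative, not from the question, and the exact NativeScriptFactory interface may differ slightly between 2.x versions:

import java.util.Map;

import org.elasticsearch.plugins.Plugin;
import org.elasticsearch.script.AbstractDoubleSearchScript;
import org.elasticsearch.script.ExecutableScript;
import org.elasticsearch.script.NativeScriptFactory;
import org.elasticsearch.script.ScriptModule;

// Illustrative plugin that registers the native script under the name "myscript".
public class MyScriptPlugin extends Plugin {

    @Override
    public String name() {
        return "my-script-plugin";
    }

    @Override
    public String description() {
        return "Registers the 'myscript' native script";
    }

    // Called by Elasticsearch during module setup; the name must match
    // new Script("myscript", ScriptType.INLINE, "native", params).
    public void onModule(ScriptModule module) {
        module.registerScript("myscript", MyScriptFactory.class);
    }

    public static class MyScriptFactory implements NativeScriptFactory {
        @Override
        public ExecutableScript newScript(Map<String, Object> params) {
            return new MyScript(params);
        }
    }

    public static class MyScript extends AbstractDoubleSearchScript {
        private final Map<String, Object> params;

        MyScript(Map<String, Object> params) {
            this.params = params;
        }

        @Override
        public double runAsDouble() {
            // Dummy score for the sketch; real logic would use params and doc values.
            return 1.0;
        }
    }
}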
I have a java project that is built with buildr and that has some external dependencies:
repositories.remote << "http://www.ibiblio.org/maven2"
repositories.remote << "http://packages.example/"
define "myproject" do
compile.options.target = '1.5'
project.version = "1.0.0"
compile.with 'dependency:dependency-xy:jar:1.2.3'
compile.with 'dependency2:dependency2:jar:4.5.6'
package(:jar)
end
I want this to build a single standalone jar file that includes all these dependencies.
How do I do that?
(there's a logical followup question: How can I strip all the unused code from the included dependencies and only package the classes I actually use?)
This is what I'm doing right now. This uses autojar to pull only the necessary dependencies:
def add_dependencies(pkg)
  tempfile = pkg.to_s.sub(/\.jar$/, "-without-dependencies.jar")
  mv pkg.to_s, tempfile
  dependencies = compile.dependencies.map { |d| "-c #{d}" }.join(" ")
  sh "java -jar tools/autojar.jar -baev -o #{pkg} #{dependencies} #{tempfile}"
end
and later:
package(:jar)
package(:jar).enhance { |pkg| pkg.enhance { |pkg| add_dependencies(pkg) }}
(caveat: I know little about buildr, this could be totally the wrong approach. It works for me, though)
I'm also learning Buildr, and currently I'm packaging the Scala runtime with my application this way:
package(:jar).with(:manifest => _('src/MANIFEST.MF')).exclude('.scala-deps')
.merge('/var/local/scala/lib/scala-library.jar')
No idea if this is inferior to autojar (comments are welcome), but it seems to work with a simple example. It takes 4.5 minutes to package that scala-library.jar though.
I'm going to use Cascading for my example:
cascading_dev_jars = Dir[_("#{ENV["CASCADING_HOME"]}/build/cascading-{core,xml}-*.jar")]
#...
package(:jar).include cascading_dev_jars, :path => "lib"
Here is how I create an uberjar with Buildr; this customizes what is put into the jar and how the manifest is created:
assembly_dir = 'target/assembly'
main_class = 'com.something.something.Blah'
artifacts = compile.dependencies
artifacts.each do |artifact|
  Unzip.new( _(assembly_dir) => artifact ).extract
end
# remove dirs from assembly that should not be in uberjar
FileUtils.rm_rf( "#{_(assembly_dir)}/example/package" )
FileUtils.rm_rf( "#{_(assembly_dir)}/example/dir" )
# create manifest file
File.open( _("#{assembly_dir}/META-INF/MANIFEST.MF"), 'w') do |f|
  f.write("Implementation-Title: Uberjar Example\n")
  f.write("Implementation-Version: #{project.version}\n")
  f.write("Main-Class: #{main_class}\n")
  f.write("Created-By: Buildr\n")
end
present_dir = Dir.pwd
Dir.chdir _(assembly_dir)
puts "Creating #{_("target/#{project.name}-#{project.version}.jar")}"
`jar -cfm #{_("target/#{project.name}-#{project.version}.jar")} #{_(assembly_dir)}/META-INF/MANIFEST.MF .`
Dir.chdir present_dir
There is also a version that supports Spring, which works by concatenating all the spring.schemas files from the dependencies; a sketch of that idea follows.
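For illustration only (this is not the original answer's code): when several Spring dependencies each ship a META-INF/spring.schemas file, an uberjar has to merge them rather than let one overwrite the others. A minimal standalone Java sketch of that merging step, with made-up file paths passed as arguments:

import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class MergeSpringSchemas {

    // Usage: java MergeSpringSchemas <output-file> <dependency1.jar> <dependency2.jar> ...
    public static void main(String[] args) throws IOException {
        Path output = Paths.get(args[0]);
        if (output.getParent() != null) {
            Files.createDirectories(output.getParent());
        }
        try (BufferedWriter writer = Files.newBufferedWriter(output, StandardCharsets.UTF_8)) {
            for (int i = 1; i < args.length; i++) {
                try (JarFile jar = new JarFile(args[i])) {
                    JarEntry entry = jar.getJarEntry("META-INF/spring.schemas");
                    if (entry == null) {
                        continue; // this dependency ships no schema mappings
                    }
                    try (InputStream in = jar.getInputStream(entry)) {
                        // Append this jar's mappings to the merged file.
                        writer.write(new String(in.readAllBytes(), StandardCharsets.UTF_8));
                        writer.newLine();
                    }
                }
            }
        }
    }
}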