I am trying to use the org.apache.hadoop.tools.DistCp class to copy some files over into an S3 bucket. However, the overwrite functionality is not working, in spite of explicitly setting the overwrite flag to true.
Copying works fine, but it does not overwrite existing files; the copy mapper skips them, even though I have explicitly set the "overwrite" option to true.
import com.typesafe.scalalogging.LazyLogging
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.tools.{DistCp, DistCpOptions}
import org.apache.hadoop.util.ToolRunner
import scala.collection.JavaConverters._
object distcptest extends App with LazyLogging {
def copytoS3( hdfsSrcFilePathStr: String, s3DestPathStr: String) = {
val hdfsSrcPathList = List(new Path(hdfsSrcFilePathStr))
val s3DestPath = new Path(s3DestPathStr)
val distcpOpt = new DistCpOptions(hdfsSrcPathList.asJava, s3DestPath)
// Overwriting is not working in spite of explicitly setting it to true.
distcpOpt.setOverwrite(true)
val conf: Configuration = new Configuration()
conf.set("fs.s3n.awsSecretAccessKey", "secret key")
conf.set("fs.s3n.awsAccessKeyId", "access key")
conf.set("fs.s3n.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
val distCp: DistCp = new DistCp(conf, distcpOpt)
val filepaths: Array[String] = Array(hdfsSrcFilePathStr, s3DestPathStr)
try {
val distCp_result = ToolRunner.run(distCp, filepaths)
if (distCp_result != 0) {
logger.error(s"DistCP has failed with - error code = $distCp_result")
}
}
catch {
case e: Exception => {
e.printStackTrace()
}
}
}
copytoS3("hdfs://abc/pqr", "s3n://xyz/wst")
}
I think the problem is that you called ToolRunner.run(distCp, filepaths).
If you check the source code of DistCp, the run method overwrites inputOptions, so the DistCpOptions you passed to the constructor will not take effect:
@Override
public int run(String[] argv) {
...
try {
inputOptions = (OptionsParser.parse(argv));
...
} catch (Throwable e) {
...
}
...
}
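A possible workaround, sketched below in Java (the question's code is Scala, but the idea is the same), is to put the -overwrite flag into the argument array so that OptionsParser sees it when run() re-parses argv; calling distCp.execute() instead of ToolRunner.run() should also honour the options built in code. Treat this as a sketch to verify against your Hadoop version:
import java.util.Collections;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.tools.DistCp;
import org.apache.hadoop.tools.DistCpOptions;
import org.apache.hadoop.util.ToolRunner;
public class DistCpOverwrite {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.s3n.awsSecretAccessKey", "secret key");
        conf.set("fs.s3n.awsAccessKeyId", "access key");
        // Same constructor as in the question; kept so distCp.execute() would also work.
        DistCpOptions options = new DistCpOptions(
                Collections.singletonList(new Path("hdfs://abc/pqr")),
                new Path("s3n://xyz/wst"));
        options.setOverwrite(true);
        // run() re-parses argv into new options, so the flag must appear here as well.
        String[] distCpArgs = {"-overwrite", "hdfs://abc/pqr", "s3n://xyz/wst"};
        int exitCode = ToolRunner.run(new DistCp(conf, options), distCpArgs);
        System.exit(exitCode);
    }
}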
I have a Groovy file gitClone.groovy which has a call function.
def call(credentials, url, project, branch, path, refs, noTags=false,
timeout=20)
{
}
I am writing a test to validate the 'url', i.e. a test case that checks whether the url is correct or not. My test file is as follows:
gitCloneSpec.groovy
import com.homeaway.devtools.jenkins.testing.JenkinsPipelineSpecification;
import spock.lang.*
import java.net.URL;
public class gitCloneSpec extends JenkinsPipelineSpecification {
def gitClone = null
def check = 0
def setup() {
gitClone = loadPipelineScriptForTest("vars/gitClone.groovy")
gitClone.getBinding().setVariable( "url", "https://www.google.com/")
}
def "validate url"(){
when:
try {
URL u = new URL(url)
u.toURI()
check = 1
}
// If there was an Exception
// while creating URL object
catch (Exception e) {
check = 2;
}
then:
check == 1
}
}
Somehow url does not hold the string "https://www.google.com/", and the exception is thrown, so 'check' gets updated with the value '2'.
How can I perform this test?
When working with gRPC, we need to generate the gRPC client and server interfaces from our .proto service definition via the protocol buffer compiler (protoc) or using the Gradle or Maven protoc build plugin.
Current flow: protobuf file -> Java code -> gRPC client.
So, is there any way to skip this step?
How can I create a generic gRPC client that calls the server directly from the protobuf file, without compiling it into Java code?
Or, is there a way to generate the code at runtime?
Expected flow: protobuf file -> gRPC client.
I want to build a generic gRPC client system whose input is protobuf files along with a description of the method, package, request message and so on, without having to recompile for each protobuf.
Thank you very much.
Protobuf systems really need protoc to be run. However, the generated code could be skipped. Instead of passing something like --java_out and --grpc_java_out to protoc you can pass --descriptor_set_out=FILE which will parse the .proto file into a descriptor file. A descriptor file is a proto-encoded FileDescriptorSet. This is the same basic format as used with the reflection service.
Once you have a descriptor, you can load it one FileDescriptor at a time and create DynamicMessages.
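For example, a minimal sketch of that loading step (the descriptor file name service.desc and the message name HelloMessage are illustrative assumptions, and the .proto is assumed to have no imports):
import com.google.protobuf.DescriptorProtos.FileDescriptorProto;
import com.google.protobuf.DescriptorProtos.FileDescriptorSet;
import com.google.protobuf.Descriptors.Descriptor;
import com.google.protobuf.Descriptors.FileDescriptor;
import com.google.protobuf.DynamicMessage;
import java.io.FileInputStream;
public class LoadDescriptorSet {
    public static void main(String[] args) throws Exception {
        // Parse the output of: protoc --descriptor_set_out=service.desc service.proto
        FileDescriptorSet set;
        try (FileInputStream in = new FileInputStream("service.desc")) {
            set = FileDescriptorSet.parseFrom(in);
        }
        // Build the FileDescriptor; with no imports the dependency array is empty.
        FileDescriptorProto protoFile = set.getFile(0);
        FileDescriptor fd = FileDescriptor.buildFrom(protoFile, new FileDescriptor[0]);
        // Create a DynamicMessage for one of the message types.
        Descriptor msgType = fd.findMessageTypeByName("HelloMessage");
        DynamicMessage msg = DynamicMessage.newBuilder(msgType)
                .setField(msgType.findFieldByName("message"), "hello")
                .build();
        System.out.println(msg);
    }
}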
Then for the gRPC piece, you need to create a gRPC MethodDescriptor.
static MethodDescriptor<DynamicMessage, DynamicMessage> from(
    Descriptors.MethodDescriptor methodDesc
) {
  return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
      // UNKNOWN is fine, but the "correct" value can be computed from
      // methodDesc.toProto().getClientStreaming()/getServerStreaming()
      .setType(getMethodTypeFromDesc(methodDesc))
      .setFullMethodName(MethodDescriptor.generateFullMethodName(
          methodDesc.getService().getFullName(), methodDesc.getName()))
      .setRequestMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getInputType())))
      .setResponseMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getOutputType())))
      .build();
}

static MethodDescriptor.MethodType getMethodTypeFromDesc(
    Descriptors.MethodDescriptor methodDesc
) {
  if (!methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.UNARY;
  } else if (methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.SERVER_STREAMING;
  } else if (!methodDesc.isServerStreaming()) {
    return MethodDescriptor.MethodType.CLIENT_STREAMING;
  } else {
    return MethodDescriptor.MethodType.BIDI_STREAMING;
  }
}
At that point you have everything you need and can call Channel.newCall(method, CallOptions.DEFAULT) in gRPC. You're also free to use ClientCalls to use something more similar to the stub APIs.
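For instance, a unary call with the dynamic descriptors might look like this (a sketch; the channel, method descriptor, and request message are assumed to be built as described above):
import com.google.protobuf.DynamicMessage;
import io.grpc.CallOptions;
import io.grpc.Channel;
import io.grpc.MethodDescriptor;
import io.grpc.stub.ClientCalls;
class DynamicUnaryCall {
    // Blocks for the response of a unary method described only by descriptors.
    static DynamicMessage call(Channel channel,
                               MethodDescriptor<DynamicMessage, DynamicMessage> method,
                               DynamicMessage request) {
        return ClientCalls.blockingUnaryCall(channel, method, CallOptions.DEFAULT, request);
    }
}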
So dynamic calls are definitely possible, and are used for things like grpcurl. But it is also not easy, and so it is generally only done when necessary.
I did it in Java, and the steps are:
1. Call the reflection service to get the FileDescriptorProto list by method name
2. Get the FileDescriptor of the method from the FileDescriptorProto list by package name and service name
3. Get the MethodDescriptor from the ServiceDescriptor, which is obtained from the FileDescriptor
4. Generate a MethodDescriptor<DynamicMessage, DynamicMessage> from that MethodDescriptor
5. Build the request DynamicMessage from content such as JSON
6. Call the method
7. Parse the DynamicMessage response into JSON
You can reference the full sample in project helloworlde/grpc-java-sample#reflection
And the proto is:
syntax = "proto3";
package io.github.helloworlde.grpc;
option go_package = "api;grpc_gateway";
option java_package = "io.github.helloworlde.grpc";
option java_multiple_files = true;
option java_outer_classname = "HelloWorldGrpc";
service HelloService{
rpc SayHello(HelloMessage) returns (HelloResponse){
}
}
message HelloMessage {
string message = 2;
}
message HelloResponse {
string message = 1;
}
Start the server for this proto yourself; the full code in Java looks like this:
import com.google.protobuf.ByteString;
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import com.google.protobuf.DynamicMessage;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.TypeRegistry;
import com.google.protobuf.util.JsonFormat;
import io.grpc.CallOptions;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.MethodDescriptor;
import io.grpc.protobuf.ProtoUtils;
import io.grpc.reflection.v1alpha.ServerReflectionGrpc;
import io.grpc.reflection.v1alpha.ServerReflectionRequest;
import io.grpc.reflection.v1alpha.ServerReflectionResponse;
import io.grpc.stub.ClientCalls;
import io.grpc.stub.StreamObserver;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
@Slf4j
public class ReflectionCall {
public static void main(String[] args) throws InterruptedException {
// The reflection method symbol only supports the formats package.service.method or package.service
String methodSymbol = "io.github.helloworlde.grpc.HelloService.SayHello";
String requestContent = "{\"message\": \"Reflection\"}";
// Build the Channel
ManagedChannel channel = ManagedChannelBuilder.forAddress("127.0.0.1", 9090)
.usePlaintext()
.build();
// Build the reflection stub using the Channel
ServerReflectionGrpc.ServerReflectionStub reflectionStub = ServerReflectionGrpc.newStub(channel);
// Response observer
StreamObserver<ServerReflectionResponse> streamObserver = new StreamObserver<ServerReflectionResponse>() {
@Override
public void onNext(ServerReflectionResponse response) {
try {
// Only file descriptor responses are of interest
if (response.getMessageResponseCase() == ServerReflectionResponse.MessageResponseCase.FILE_DESCRIPTOR_RESPONSE) {
List<ByteString> fileDescriptorProtoList = response.getFileDescriptorResponse().getFileDescriptorProtoList();
handleResponse(fileDescriptorProtoList, channel, methodSymbol, requestContent);
} else {
log.warn("未知响应类型: " + response.getMessageResponseCase());
}
} catch (Exception e) {
log.error("处理响应失败: {}", e.getMessage(), e);
}
}
@Override
public void onError(Throwable t) {
}
@Override
public void onCompleted() {
log.info("Complete");
}
};
// Request observer
StreamObserver<ServerReflectionRequest> requestStreamObserver = reflectionStub.serverReflectionInfo(streamObserver);
// Build and send the request for the file descriptor containing the method symbol
ServerReflectionRequest getFileContainingSymbolRequest = ServerReflectionRequest.newBuilder()
.setFileContainingSymbol(methodSymbol)
.build();
requestStreamObserver.onNext(getFileContainingSymbolRequest);
channel.awaitTermination(10, TimeUnit.SECONDS);
}
/**
* Handle the response
*/
private static void handleResponse(List<ByteString> fileDescriptorProtoList,
ManagedChannel channel,
String methodFullName,
String requestContent) {
try {
// Parse the method and service names
String fullServiceName = extraPrefix(methodFullName);
String methodName = extraSuffix(methodFullName);
String packageName = extraPrefix(fullServiceName);
String serviceName = extraSuffix(fullServiceName);
// Resolve the FileDescriptor from the response
Descriptors.FileDescriptor fileDescriptor = getFileDescriptor(fileDescriptorProtoList, packageName, serviceName);
// Find the service descriptor
Descriptors.ServiceDescriptor serviceDescriptor = fileDescriptor.getFile().findServiceByName(serviceName);
// Find the method descriptor
Descriptors.MethodDescriptor methodDescriptor = serviceDescriptor.findMethodByName(methodName);
// Execute the call
executeCall(channel, fileDescriptor, methodDescriptor, requestContent);
} catch (Exception e) {
log.error(e.getMessage(), e);
}
}
/**
* Parse and find the file descriptor corresponding to the method
*/
private static Descriptors.FileDescriptor getFileDescriptor(List<ByteString> fileDescriptorProtoList,
String packageName,
String serviceName) throws Exception {
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap =
fileDescriptorProtoList.stream()
.map(bs -> {
try {
return DescriptorProtos.FileDescriptorProto.parseFrom(bs);
} catch (InvalidProtocolBufferException e) {
e.printStackTrace();
}
return null;
})
.filter(Objects::nonNull)
.collect(Collectors.toMap(DescriptorProtos.FileDescriptorProto::getName, f -> f));
if (fileDescriptorProtoMap.isEmpty()) {
log.error("服务不存在");
throw new IllegalArgumentException("方法的文件描述不存在");
}
// Find the proto file descriptor for the service
DescriptorProtos.FileDescriptorProto fileDescriptorProto = findServiceFileDescriptorProto(packageName, serviceName, fileDescriptorProtoMap);
// Get the dependencies of this proto
Descriptors.FileDescriptor[] dependencies = getDependencies(fileDescriptorProto, fileDescriptorProtoMap);
// Build the FileDescriptor for the proto
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Find the file descriptor by package name and service name
*/
private static DescriptorProtos.FileDescriptorProto findServiceFileDescriptorProto(String packageName,
String serviceName,
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap) {
for (DescriptorProtos.FileDescriptorProto proto : fileDescriptorProtoMap.values()) {
if (proto.getPackage().equals(packageName)) {
boolean exist = proto.getServiceList()
.stream()
.anyMatch(s -> serviceName.equals(s.getName()));
if (exist) {
return proto;
}
}
}
throw new IllegalArgumentException("服务不存在");
}
/**
* Get the prefix
*/
private static String extraPrefix(String content) {
int index = content.lastIndexOf(".");
return content.substring(0, index);
}
/**
* Get the suffix
*/
private static String extraSuffix(String content) {
int index = content.lastIndexOf(".");
return content.substring(index + 1);
}
/**
* Get the dependency types
*/
private static Descriptors.FileDescriptor[] getDependencies(DescriptorProtos.FileDescriptorProto proto,
Map<String, DescriptorProtos.FileDescriptorProto> finalDescriptorProtoMap) {
return proto.getDependencyList()
.stream()
.map(finalDescriptorProtoMap::get)
.map(f -> toFileDescriptor(f, getDependencies(f, finalDescriptorProtoMap)))
.toArray(Descriptors.FileDescriptor[]::new);
}
/**
* Convert a FileDescriptorProto into a FileDescriptor
*/
@SneakyThrows
private static Descriptors.FileDescriptor toFileDescriptor(DescriptorProtos.FileDescriptorProto fileDescriptorProto,
Descriptors.FileDescriptor[] dependencies) {
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Execute the method call
*/
private static void executeCall(ManagedChannel channel,
Descriptors.FileDescriptor fileDescriptor,
Descriptors.MethodDescriptor originMethodDescriptor,
String requestContent) throws Exception {
// Regenerate the MethodDescriptor
MethodDescriptor<DynamicMessage, DynamicMessage> methodDescriptor = generateMethodDescriptor(originMethodDescriptor);
CallOptions callOptions = CallOptions.DEFAULT;
TypeRegistry registry = TypeRegistry.newBuilder()
.add(fileDescriptor.getMessageTypes())
.build();
// Convert the request content from a JSON string into the corresponding message type
JsonFormat.Parser parser = JsonFormat.parser().usingTypeRegistry(registry);
DynamicMessage.Builder messageBuilder = DynamicMessage.newBuilder(originMethodDescriptor.getInputType());
parser.merge(requestContent, messageBuilder);
DynamicMessage requestMessage = messageBuilder.build();
// Invoke the call; the call type can be inferred from originMethodDescriptor.isClientStreaming() and originMethodDescriptor.isServerStreaming()
DynamicMessage response = ClientCalls.blockingUnaryCall(channel, methodDescriptor, callOptions, requestMessage);
// Render the response as a JSON string
JsonFormat.Printer printer = JsonFormat.printer()
.usingTypeRegistry(registry)
.includingDefaultValueFields();
String responseContent = printer.print(response);
log.info("响应: {}", responseContent);
}
/**
* Regenerate the method descriptor
*/
private static MethodDescriptor<DynamicMessage, DynamicMessage> generateMethodDescriptor(Descriptors.MethodDescriptor originMethodDescriptor) {
// Generate the full method name
String fullMethodName = MethodDescriptor.generateFullMethodName(originMethodDescriptor.getService().getFullName(), originMethodDescriptor.getName());
// Request and response marshallers
MethodDescriptor.Marshaller<DynamicMessage> inputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getInputType())
.buildPartial());
MethodDescriptor.Marshaller<DynamicMessage> outputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getOutputType())
.buildPartial());
// Regenerate the method descriptor; the fullMethodName from originMethodDescriptor is not correct
return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
.setFullMethodName(fullMethodName)
.setRequestMarshaller(inputTypeMarshaller)
.setResponseMarshaller(outputTypeMarshaller)
// Use UNKNOWN; it is adjusted automatically
.setType(MethodDescriptor.MethodType.UNKNOWN)
.build();
}
}
There isn't much to prevent this technically. The two big hurdles are:
having a runtime-callable parser for reading the .proto, and
having a general purpose gRPC client available that takes things like the service method name as literals
Both are possible, but neither is trivial.
For 1, the crude way would be to shell/invoke protoc using the descriptor-set option to generate a schema binary, then deserialize that as a FileDescriptorSet (from descriptor.proto); this model gives you access to how protoc sees the file. Some platforms also have native parsers (essentially reimplementing protoc as a library in that platform), for example protobuf-net.Reflection does this in .NET-land
For 2, here's an implementation of that in C#. The approach should be fairly portable to Java, even if the details vary. You can look at a generated implementation to see how it works in any particular language.
(Sorry that the specific examples are C#/.NET, but that's where I live; the approaches should be portable, even if the specific code: not directly)
Technically both are possible.
The codegen simply generates a handful of classes: mainly the protobuf messages, the gRPC method descriptors, and the stubs. You can implement these yourself or check in the generated code to bypass the codegen. I am not sure what the benefit of doing this is, to be honest. Also, it will be very annoying whenever the proto changes.
It is also possible to do it dynamically using bytecode generation, as long as you check in some interfaces/abstract classes to represent those generated stubs/method descriptors and protobuf messages. You have to make sure that the non-dynamic code stays in sync with the proto definition, though (most likely via a runtime check/exception).
I am trying to use Kotlin instead of Java, and I cannot find a good way to handle try-with-resources.
The Java code looks like this:
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.Tensor;
import org.tensorflow.TensorFlow;
public class HelloTensorFlow {
public static void main(String[] args) throws Exception {
try (Graph g = new Graph()) {
final String value = "Hello from " + TensorFlow.version();
// Construct the computation graph with a single operation, a constant
// named "MyConst" with a value "value".
try (Tensor t = Tensor.create(value.getBytes("UTF-8"))) {
// The Java API doesn't yet include convenience functions for adding operations.
g.opBuilder("Const", "MyConst").setAttr("dtype", t.dataType()).setAttr("value", t).build();
}
// Execute the "MyConst" operation in a Session.
try (Session s = new Session(g);
// Generally, there may be multiple output tensors,
// all of them must be closed to prevent resource leaks.
Tensor output = s.runner().fetch("MyConst").run().get(0)) {
System.out.println(new String(output.bytesValue(), "UTF-8"));
}
}
}
}
When I do it in Kotlin, I have to do this:
fun main(args: Array<String>) {
val g = Graph();
try {
val value = "Hello from ${TensorFlow.version()}"
val t = Tensor.create(value.toByteArray(Charsets.UTF_8))
try {
g.opBuilder("Const", "MyConst").setAttr("dtype", t.dataType()).setAttr("value", t).build()
} finally {
t.close()
}
var sess = Session(g)
try {
val output = sess.runner().fetch("MyConst").run().get(0)
println(String(output.bytesValue(), Charsets.UTF_8))
} finally {
sess?.close()
}
} finally {
g.close()
}
}
I have tried to use Kotlin's use function like this:
Graph().use {
it -> ....
}
I got an error like this:
Error:(16, 20) Kotlin: Unresolved reference. None of the following candidates is applicable because of receiver type mismatch:
@InlineOnly public inline fun ???.use(block: (???) -> ???): ??? defined in kotlin.io
It turned out I was just using the wrong dependency:
compile "org.jetbrains.kotlin:kotlin-stdlib"
replace it with:
compile "org.jetbrains.kotlin:kotlin-stdlib-jdk8"
In Gradle, I can get project info (dependencies, artifacts, and group IDs) in Groovy like this:
class TestPlugin implements Plugin<Project> {
@Override
void apply(Project project) {
def example = project.tasks.create("example") << {
def dep = project.configurations.runtime.allDependencies
def info = project.configurations.runtime.getName()
def g = project.configurations.runtime.getAllArtifacts()
}
How can I get this in Java?
You can add a task that will write whatever values you like out to a java Properties file, like so:
apply plugin: 'java'
apply plugin: 'application'
def generatedResourcesDir = new File(project.buildDir, 'generated-resources')
tasks.withType(Jar).all { Jar jar ->
jar.doFirst {
def props = new Properties()
props.foobar = 'baz'
generatedResourcesDir.mkdirs()
def writer = new FileWriter(new File(generatedResourcesDir, 'build.properties'))
try {
props.store(writer, 'build properties')
writer.flush()
} finally {
writer.close()
}
}
}
sourceSets {
main {
resources {
srcDir generatedResourcesDir
}
}
}
mainClassName = 'BuildProps'
Notice that a directory is created in the build output directory of the root project (called generated-resources, though you can call it whatever you want, within reason). The properties file is then written to this directory as a result of running the custom task before any jar task. Finally, the generated-resources directory is added to the resources source set. This means it will become a resource within the generated jar file and as such can be accessed like any other resource; for example:
import java.util.Properties;
import java.io.InputStream;
import java.io.IOException;
class BuildProps {
public static void main(String[] args) {
try (InputStream inputStream =
BuildProps.class.getClassLoader().getResourceAsStream("build.properties")) {
Properties props = new Properties();
props.load(inputStream);
System.out.println("Build properties:");
System.out.println("foobar=" + props.getProperty("foobar", ""));
} catch (IOException e) {
e.printStackTrace();
}
}
}
which will print:
Build properties:
foobar=baz
As for your specific desired properties, you could set them like this: replace the line props.foobar = 'baz' with the following
def dependenciesProp = ''
for (def dependency : project.configurations.runtime.allDependencies) {
dependenciesProp += dependency.toString() + ','
}
props.dependencies = dependenciesProp
props.runtimename = project.configurations.runtime.name
def artifactsProp = ''
for (def artifact : project.configurations.runtime.allArtifacts) {
artifactsProp += artifact.toString() + ','
}
props.artifacts = artifactsProp
I have a simple parent project with modules/applications within it. My build tool of choice is Gradle. The parent build.gradle is defined below.
apply plugin: 'groovy'
dependencies {
compile gradleApi()
compile localGroovy()
}
allprojects {
repositories {
mavenCentral()
}
version "0.1.0-SNAPSHOT"
}
What I would like to do is utilize the version attribute (0.1.0-SNAPSHOT) within my swing application. Specifically, I'd like it to display in the titlebar of the main JFrame. I expect to be able to do something like this.setTitle("My Application - v." + ???.version);
The application is a plain Java project, but I'm not opposed to adding Groovy support if it will help.
I like creating a properties file during the build. Here's a way to do that from Gradle directly:
task createProperties(dependsOn: processResources) {
doLast {
new File("$buildDir/resources/main/version.properties").withWriter { w ->
Properties p = new Properties()
p['version'] = project.version.toString()
p.store w, null
}
}
}
classes {
dependsOn createProperties
}
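On the application side, one way to read that file and show the version in the title bar might be the following (a sketch; the resource name matches the task above, and the title text is an assumption):
import java.io.InputStream;
import java.util.Properties;
import javax.swing.JFrame;
public class VersionTitle {
    // Reads version.properties from the classpath and applies it to the frame title.
    public static void apply(JFrame frame) {
        Properties props = new Properties();
        try (InputStream in = VersionTitle.class.getClassLoader()
                .getResourceAsStream("version.properties")) {
            if (in != null) {
                props.load(in);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        frame.setTitle("My Application - v." + props.getProperty("version", "unknown"));
    }
}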
You can always use brute force, as somebody suggested, and generate a properties file during the build. A more elegant answer, which only works partially, would be to use
getClass().getPackage().getImplementationVersion()
The problem is that this only works if you run your application from the generated jar; if you run it directly from the IDE / expanded classes, getImplementationVersion() above will return null. It is good enough for many cases: just display 'DEVELOPMENT' if you run from the IDE (getting a null version), and it will work for actual client deployments.
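A sketch of that fallback (the class name and title text are placeholders; this also assumes the jar manifest sets Implementation-Version):
import javax.swing.JFrame;
public class MainFrame extends JFrame {
    public MainFrame() {
        // Null when running from the IDE / expanded classes rather than the built jar.
        Package pkg = MainFrame.class.getPackage();
        String version = pkg == null ? null : pkg.getImplementationVersion();
        setTitle("My Application - v." + (version != null ? version : "DEVELOPMENT"));
    }
}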
A better idea is to keep the project version in the gradle.properties file. All the properties from this file are automatically loaded and can be used in the build.gradle script.
Then, if you need the version in your Swing application, add a version.properties file under the src/main/resources folder and filter this file during the application build; here is a post that shows how it should be done.
version.properties will be included in the final jar, so it can be read via the ClassLoader, and the properties from this file can be displayed in the application.
A simpler and updated version of @Craig Trader's solution (ready for Gradle 4.0/5.0):
task createProperties {
doLast {
def version = project.version.toString()
def file = new File("$buildDir/resources/main/version.txt")
file.write(version)
}
}
war {
dependsOn createProperties
}
I used @Craig Trader's answer, but had to make quite a few changes to get it to work (it also adds Git details):
task createProperties() {
doLast {
// versionDetails() is assumed to come from a Git versioning plugin (e.g. com.palantir.git-version)
def details = versionDetails()
new File("$buildDir/resources/main/version.properties").withWriter { w ->
Properties p = new Properties()
p['version'] = project.version.toString()
p['gitLastTag'] = details.lastTag
p['gitCommitDistance'] = details.commitDistance.toString()
p['gitHash'] = details.gitHash.toString()
p['gitHashFull'] = details.gitHashFull.toString() // full 40-character Git commit hash
p['gitBranchName'] = details.branchName // is null if the repository in detached HEAD mode
p['gitIsCleanTag'] = details.isCleanTag.toString()
p.store w, null
}
// copy needed, otherwise the bean VersionController can't load the file at startup when running complete-app tests.
copy {
from "$buildDir/resources/main/version.properties"
into "bin/main/"
}
}
}
classes {
dependsOn createProperties
}
And load it in the constructor of the VersionController class:
import static net.logstash.logback.argument.StructuredArguments.v;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.info.BuildProperties;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
@RestController
public class VersionController {
final static Logger log = LoggerFactory.getLogger(VersionController.class);
private Properties versionProperties = new Properties();
private String gitLastTag;
private String gitHash;
private String gitBranchName;
private String gitIsCleanTag;
VersionController()
{
String AllGitVersionProperties = "";
InputStream inputStream = getClass().getClassLoader().getResourceAsStream("version.properties");
if(inputStream == null)
{
// When running unit tests, no jar is built, so we load a copy of the file that we saved during build.gradle.
// Possibly this also is the case during debugging, therefore we save in bin/main instead of bin/test.
try {
inputStream = new FileInputStream("bin/main/version.properties");
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
try {
versionProperties.load(inputStream);
} catch (IOException e) {
AllGitVersionProperties += e.getMessage()+":";
log.error("Could not load classpath:/version.properties",e);
}
gitLastTag = versionProperties.getProperty("gitLastTag","last-tag-not-found");
gitHash = versionProperties.getProperty("gitHash","git-hash-not-found");
gitBranchName = versionProperties.getProperty("gitBranchName","git-branch-name-not-found");
gitIsCleanTag = versionProperties.getProperty("gitIsCleanTag","git-isCleanTag-not-found");
Set<Map.Entry<Object, Object>> mainPropertiesSet = versionProperties.entrySet();
for(Map.Entry oneEntry : mainPropertiesSet){
AllGitVersionProperties += "+" + oneEntry.getKey()+":"+oneEntry.getValue();
}
log.info("All Git Version-Properties:",v("GitVersionProperties", AllGitVersionProperties));
}
}
Using @Craig Trader's solution to save the properties in a version.properties file, add the following to build.gradle:
task createProperties() {
doLast {
def details = versionDetails()
new File("$buildDir/resources/main/version.properties").withWriter { w ->
Properties p = new Properties()
p['version'] = project.version.toString()
p['gitLastTag'] = details.lastTag
p['gitCommitDistance'] = details.commitDistance.toString()
p['gitHash'] = details.gitHash.toString()
p['gitHashFull'] = details.gitHashFull.toString() // full 40-character Git commit hash
p['gitBranchName'] = details.branchName // is null if the repository in detached HEAD mode
p['gitIsCleanTag'] = details.isCleanTag.toString()
p.store w, null
}
// copy needed, otherwise the bean VersionController can't load the file at startup when running complete-app tests.
copy {
from "$buildDir/resources/main/version.properties"
into "bin/main/"
}
}
}
classes {
dependsOn createProperties
}
To load the properties from version.properties at runtime, you need to annotate your class with @PropertySource({"classpath:version.properties"}).
Then you can assign a property to a private variable with an annotation like:
@Value("${gitLastTag}")
private String gitLastTag;
Full example:
package com.versioncontroller;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.PropertySource;
import javax.annotation.PostConstruct;
import java.util.Properties;
@PropertySource({"classpath:version.properties"})
public class VersionController {
@Value("${gitLastTag}")
private String gitLastTag;
@Value("${gitHash}")
private String gitHash;
@Value("${gitBranchName}")
private String gitBranchName;
@Value("${gitIsCleanTag}")
private String gitIsCleanTag;
@PostConstruct // properties are only set after the constructor has run
private void logVersion(){
// when called during the constructor, all values are null.
System.out.println("All Git Version-Properties:");
System.out.println("gitLastTag: " + gitLastTag);
System.out.println("gitHash: " + gitHash);
System.out.println("gitBranchName: " + gitBranchName);
System.out.println("gitIsCleanTag: " + gitIsCleanTag);
}
}