I'm trying to create an XAdES-BES signature for a given blob. For this signature, I need to add two transforms on the content before it is signed: Base64 (http://www.w3.org/2000/09/xmldsig#base64) and a custom one (called optional-deflate).
The problem lies with that optional transform. I'm trying to figure out how to implement a custom Transform, register it, and finally have Xades4J use it.
So far I have figured out a lot (thanks to Google and a lot of time), and I got roughly to this: I've got a Provider class that, in its constructor, puts the new TransformService; in my main code I add my Provider to the Security instance; then I try to add the transform to my actual to-be-signed object.
Unfortunately, I always get the same error:
Exception in thread "main" xades4j.UnsupportedAlgorithmException: Unsupported transform on XML Signature provider (urn:xml:sig:transform:optional-deflate)
at xades4j.production.DataObjectDescsProcessor.processTransforms(DataObjectDescsProcessor.java:194)
at xades4j.production.DataObjectDescsProcessor.process(DataObjectDescsProcessor.java:87)
at xades4j.production.SignerBES.sign(SignerBES.java:173)
at xades4j.production.SignerBES.sign(SignerBES.java:122)
at com.mycompany.Test.createXades(Test.java:199)
at com.mycompany.Test.main(Test.java:47)
Caused by: org.apache.xml.security.transforms.TransformationException: Unknown transformation. No handler installed for URI urn:xml:sig:transform:optional-deflate
Original Exception was org.apache.xml.security.transforms.InvalidTransformException: Unknown transformation. No handler installed for URI urn:xml:sig:transform:optional-deflate
at org.apache.xml.security.transforms.Transforms.addTransform(Unknown Source)
at xades4j.production.DataObjectDescsProcessor.processTransforms(DataObjectDescsProcessor.java:185)
... 5 more
So, my code looks like this (abbreviated to what I think is necessary here):
TransformService class:
package com.mycompany.security;
import java.io.OutputStream;
import java.security.InvalidAlgorithmParameterException;
import java.security.spec.AlgorithmParameterSpec;
import javax.xml.crypto.Data;
import javax.xml.crypto.MarshalException;
import javax.xml.crypto.XMLCryptoContext;
import javax.xml.crypto.XMLStructure;
import javax.xml.crypto.dsig.TransformService;
import javax.xml.crypto.dsig.TransformException;
import javax.xml.crypto.dsig.spec.TransformParameterSpec;
public class OptionalDeflateTransform extends TransformService {
public AlgorithmParameterSpec getParameterSpec() {
return null;
}
public Data transform(Data data, XMLCryptoContext context) throws TransformException {
return null;
}
public Data transform(Data data, XMLCryptoContext context, OutputStream os) throws TransformException {
return null;
}
public boolean isFeatureSupported(String feature) {
return false;
}
public void init(TransformParameterSpec params) throws InvalidAlgorithmParameterException {}
public void marshalParams(XMLStructure parent, XMLCryptoContext context) throws MarshalException {}
public void init(XMLStructure parent, XMLCryptoContext context) throws InvalidAlgorithmParameterException {}
}
Provider subclass:
package com.mycompany.security;
import java.security.Provider;
public final class OptionalDeflateProvider extends Provider {
private static final long serialVersionUID = 8849833178389029123L;
public OptionalDeflateProvider() {
super("OptionalDeflate", 1.0, "OptionalDeflate provider 1.0 implementing the OptionalDeflate transform algorithm.");
put("TransformService.urn:xml:sig:transform:optional-deflate", "com.mycompany.security.OptionalDeflateTransform");
}
}
And finally, my main Test class, which contains the actual signing. Without that transform it works (but then it doesn't add the transform, which is necessary), so Base64 works.
protected static void createXades(String content) throws Exception {
/*Get certificate & private key*/
Certificates c = new Certificates();
c.initSession(); //some helper class where I can get my certificate & private key for signing
/*Create a document*/
DocumentBuilderFactory docFactory = DocumentBuilderFactory.newInstance();
DocumentBuilder docBuilder = docFactory.newDocumentBuilder();
Document doc = docBuilder.newDocument();
Element objectElement = doc.createElement("object");
doc.appendChild(objectElement);
Element requestElement = doc.createElement("request");
requestElement.appendChild(doc.createTextNode(content));
requestElement.setAttribute("ID", UUID.randomUUID().toString());
objectElement.appendChild(requestElement);
/*Key provider, signing profile & signer itself*/
KeyingDataProvider kp = new CustomKeyingDataProvider(c.getCertificate(), c.getPrivateKey());
XadesSigningProfile p = new XadesBesSigningProfile(kp);
p.withAlgorithmsProviderEx(new ProviderEx());
XadesSigner signer = p.newSigner();
/*Add the optional deflate provider*/
Security.addProvider(new OptionalDeflateProvider());
System.out.println("--- installed providers ---");
for (Provider pr : Security.getProviders())
System.out.println(pr.getName());
System.out.println("---");
/*Test if we can get the transformservice-instance*/
TransformService ts = TransformService.getInstance("urn:xml:sig:transform:optional-deflate", "DOM");
System.out.println(ts.getAlgorithm());
System.out.println("---");
/*Signed data*/
DataObjectDesc flatFile = new DataObjectReference("#" + requestElement.getAttribute("ID"))
.withTransform(new GenericAlgorithm("http://www.w3.org/2000/09/xmldsig#base64"))
.withTransform(new GenericAlgorithm("urn:xml:sig:transform:optional-deflate"));
SignedDataObjects dataObjs = new SignedDataObjects(flatFile);
/*Actual signing*/
signer.sign(dataObjs, objectElement);
log(objectElement.getLastChild());
}
As you can see, I print some things out. For instance, I logged the installed providers to check that the installation works fine. I get this as output:
--- installed providers ---
SUN
SunRsaSign
SunEC
SunJSSE
SunJCE
SunJGSS
SunSASL
XMLDSig
SunPCSC
SunMSCAPI
OptionalDeflate
---
urn:xml:sig:transform:optional-deflate
---
So as far as I can see, the provider has successfully been registered and the TransformService can be loaded without a problem, so I don't really see what's going on.
I've checked the source code of Xades4j as well, and what happens internally for the line .withTransform(new GenericAlgorithm("urn:xml:sig:transform:optional-deflate")) is pretty straightforward:
import org.apache.xml.security.transforms.Transforms;
...
private Transforms processTransforms(DataObjectDesc dataObjDesc, Document document) throws UnsupportedAlgorithmException {
Collection<Algorithm> dObjTransfs = dataObjDesc.getTransforms();
if (dObjTransfs.isEmpty()) {
return null;
}
Transforms transforms = new Transforms(document);
for (Algorithm dObjTransf : dObjTransfs) {
try {
List<Node> transfParams = this.algorithmsParametersMarshaller.marshalParameters(dObjTransf, document);
if (null == transfParams) {
transforms.addTransform(dObjTransf.getUri());
} else {
transforms.addTransform(dObjTransf.getUri(), DOMHelper.nodeList(transfParams));
}
} catch (TransformationException ex) {
throw new UnsupportedAlgorithmException("Unsupported transform on XML Signature provider", dObjTransf.getUri(), ex);
}
}
return transforms;
}
The exact line throwing the error is transforms.addTransform(dObjTransf.getUri()). This transforms object is a 'standard' Apache object (org.apache.xml.security.transforms.Transforms). So I'd guess it should be able to get the same TransformService that I get literally two lines higher up in my own code? But it isn't?
Can anyone point out what I'm missing? I'll be eternally grateful.
Apparently, Apache Santuario loads transforms from an internal map of its own, not from the JCA providers registered with java.security.Security. There is a register method that you can probably use to register your custom transform.
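A minimal sketch of that approach, assuming Santuario's static Transform.register(...) API (Xades4J delegates to Santuario's Transforms.addTransform, which only consults this registry). OptionalDeflateSantuarioTransform is a hypothetical class of yours that extends org.apache.xml.security.transforms.TransformSpi and implements the deflate logic; the exact SPI method signature to override differs between Santuario versions.
import org.apache.xml.security.transforms.Transform;
public class SantuarioTransformRegistration {
    public static void registerOptionalDeflate() throws Exception {
        // Make sure Santuario's default transforms are initialized before touching its registry.
        org.apache.xml.security.Init.init();
        // Register the hypothetical TransformSpi subclass under the custom algorithm URI.
        Transform.register("urn:xml:sig:transform:optional-deflate",
                "com.mycompany.security.OptionalDeflateSantuarioTransform");
    }
}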
Related
I'm using @lgoncalves' code to sign an XML with XAdES4J EPES. However, JDeveloper doesn't find SignerEPES when I have the xades4j.jar on my classpath. Here is the image of my library and the code:
Project Library
private static void signBes(Document doc) throws XadesProfileResolutionException, XAdES4jException,
KeyStoreException {
//Document doc = getTestDocument();
Element elemToSign = doc.getDocumentElement();
SignaturePolicyInfoProvider policyInfoProvider = new SignaturePolicyInfoProvider()
{
@Override
public SignaturePolicyBase getSignaturePolicy()
{
return new SignaturePolicyIdentifierProperty(
new ObjectIdentifier("oid:/1.2.4.0.9.4.5", IdentifierType.OIDAsURI, "Policy description"),
new ByteArrayInputStream("Test policy input stream".getBytes()))
.withLocationUrl(""); //<- I really don't know what to put right here.
}
};
KeyingDataProvider kdp = new FileSystemKeyStoreKeyingDataProvider("pkcs12","C:/****/****.pfx",new FirstCertificateSelector(),new DirectPasswordProvider("****"),new DirectPasswordProvider("****"),true);
SignerEPES signer = (SignerEPES) new XadesEpesSigningProfile(kdp, policyInfoProvider).newSigner();
new Enveloped(signer).sign(elemToSign);
}
Link to the sample code on GitHub: https://github.com/luisgoncalves/xades4j/blob/master/src/test/java/xades4j/production/SignerEPESTest.java
EDIT:
I tried to force the import (import xades4j.production.SignerEPES), but the IDE says "Cannot be accessed from outside package" and I don't really know what that means.
SignerEPES is a package-private class, so application code won't be able to import it. The tests use it just to be sure that the proper type is being returned.
In your code you can just use XadesSigner as the type of your variable.
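For example (a sketch, with kdp and policyInfoProvider set up exactly as in the code above), the cast can simply be dropped:
XadesSigner signer = new XadesEpesSigningProfile(kdp, policyInfoProvider).newSigner();
new Enveloped(signer).sign(elemToSign);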
When working with gRPC, we need to generate the gRPC client and server interfaces from our .proto service definition via the protocol buffer compiler (protoc) or a Gradle or Maven protoc build plugin.
Current flow: protobuf file -> Java code -> gRPC client.
So, is there any way to skip this step?
How can I create a generic gRPC client that can call the server directly from the protobuf file, without compiling it into Java code?
Or, is there a way to generate the code at runtime?
Expected flow: protobuf file -> gRPC client.
I want to build a generic gRPC client system whose input is protobuf files along with a description of the method, package, request message, and so on, without having to recompile for each protobuf.
Thank you very much.
Protobuf systems really need protoc to be run. However, the generated code could be skipped. Instead of passing something like --java_out and --grpc_java_out to protoc you can pass --descriptor_set_out=FILE which will parse the .proto file into a descriptor file. A descriptor file is a proto-encoded FileDescriptorSet. This is the same basic format as used with the reflection service.
Once you have a descriptor, you can load it one FileDescriptor at a time and create a DynamicMessage.
Then for the gRPC piece, you need to create a gRPC MethodDescriptor.
static MethodDescriptor<DynamicMessage, DynamicMessage> from(
    Descriptors.MethodDescriptor methodDesc
) {
  return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
      // UNKNOWN is fine, but the "correct" value can be computed from
      // methodDesc.toProto().getClientStreaming()/getServerStreaming()
      .setType(getMethodTypeFromDesc(methodDesc))
      .setFullMethodName(MethodDescriptor.generateFullMethodName(
          methodDesc.getService().getFullName(), methodDesc.getName()))
      .setRequestMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getInputType())))
      .setResponseMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getOutputType())))
      .build();
}

static MethodDescriptor.MethodType getMethodTypeFromDesc(
    Descriptors.MethodDescriptor methodDesc
) {
  if (!methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.UNARY;
  } else if (methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.SERVER_STREAMING;
  } else if (!methodDesc.isServerStreaming()) {
    return MethodDescriptor.MethodType.CLIENT_STREAMING;
  } else {
    return MethodDescriptor.MethodType.BIDI_STREAMING;
  }
}
At that point you have everything you need and can call Channel.newCall(method, CallOptions.DEFAULT) in gRPC. You're also free to use ClientCalls to use something more similar to the stub APIs.
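For example, a unary call could look like this (a sketch; channel is an existing io.grpc.Channel, method is the MethodDescriptor<DynamicMessage, DynamicMessage> built above, and request is a DynamicMessage of the method's input type):
import com.google.protobuf.DynamicMessage;
import io.grpc.CallOptions;
import io.grpc.Channel;
import io.grpc.MethodDescriptor;
import io.grpc.stub.ClientCalls;
static DynamicMessage callUnary(Channel channel,
                                MethodDescriptor<DynamicMessage, DynamicMessage> method,
                                DynamicMessage request) {
  // Blocking unary call using the dynamically built method descriptor.
  return ClientCalls.blockingUnaryCall(channel, method, CallOptions.DEFAULT, request);
}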
So dynamic calls are definitely possible, and they are used for things like grpcurl. But it also is not easy, so it is generally only done when necessary.
I did it in Java, and the steps are:
Call the reflection service to get the FileDescriptorProto list by method name
Get the FileDescriptor of the method from the FileDescriptorProto list by package name and service name
Get the MethodDescriptor from the ServiceDescriptor, which is obtained from the FileDescriptor
Generate a MethodDescriptor<DynamicMessage, DynamicMessage> from the MethodDescriptor
Build the request DynamicMessage from content such as JSON
Call the method
Parse the response DynamicMessage back to JSON
You can find the full sample in the project helloworlde/grpc-java-sample#reflection
And the proto is:
syntax = "proto3";
package io.github.helloworlde.grpc;
option go_package = "api;grpc_gateway";
option java_package = "io.github.helloworlde.grpc";
option java_multiple_files = true;
option java_outer_classname = "HelloWorldGrpc";
service HelloService{
rpc SayHello(HelloMessage) returns (HelloResponse){
}
}
message HelloMessage {
string message = 2;
}
message HelloResponse {
string message = 1;
}
Start a server for this proto yourself; the full code in Java looks like this:
import com.google.protobuf.ByteString;
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import com.google.protobuf.DynamicMessage;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.TypeRegistry;
import com.google.protobuf.util.JsonFormat;
import io.grpc.CallOptions;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.MethodDescriptor;
import io.grpc.protobuf.ProtoUtils;
import io.grpc.reflection.v1alpha.ServerReflectionGrpc;
import io.grpc.reflection.v1alpha.ServerReflectionRequest;
import io.grpc.reflection.v1alpha.ServerReflectionResponse;
import io.grpc.stub.ClientCalls;
import io.grpc.stub.StreamObserver;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
@Slf4j
public class ReflectionCall {
public static void main(String[] args) throws InterruptedException {
// The reflection symbol only supports the formats package.service.method or package.service
String methodSymbol = "io.github.helloworlde.grpc.HelloService.SayHello";
String requestContent = "{\"message\": \"Reflection\"}";
// Build the Channel
ManagedChannel channel = ManagedChannelBuilder.forAddress("127.0.0.1", 9090)
.usePlaintext()
.build();
// Use the Channel to build the reflection stub
ServerReflectionGrpc.ServerReflectionStub reflectionStub = ServerReflectionGrpc.newStub(channel);
// Response observer
StreamObserver<ServerReflectionResponse> streamObserver = new StreamObserver<ServerReflectionResponse>() {
@Override
public void onNext(ServerReflectionResponse response) {
try {
// Only file-descriptor responses need to be handled
if (response.getMessageResponseCase() == ServerReflectionResponse.MessageResponseCase.FILE_DESCRIPTOR_RESPONSE) {
List<ByteString> fileDescriptorProtoList = response.getFileDescriptorResponse().getFileDescriptorProtoList();
handleResponse(fileDescriptorProtoList, channel, methodSymbol, requestContent);
} else {
log.warn("未知响应类型: " + response.getMessageResponseCase());
}
} catch (Exception e) {
log.error("处理响应失败: {}", e.getMessage(), e);
}
}
@Override
public void onError(Throwable t) {
}
@Override
public void onCompleted() {
log.info("Complete");
}
};
// Request observer
StreamObserver<ServerReflectionRequest> requestStreamObserver = reflectionStub.serverReflectionInfo(streamObserver);
// Build and send the request for the file descriptors containing the method symbol
ServerReflectionRequest getFileContainingSymbolRequest = ServerReflectionRequest.newBuilder()
.setFileContainingSymbol(methodSymbol)
.build();
requestStreamObserver.onNext(getFileContainingSymbolRequest);
channel.awaitTermination(10, TimeUnit.SECONDS);
}
/**
* Handle the reflection response
*/
private static void handleResponse(List<ByteString> fileDescriptorProtoList,
ManagedChannel channel,
String methodFullName,
String requestContent) {
try {
// Parse the method and service names
String fullServiceName = extraPrefix(methodFullName);
String methodName = extraSuffix(methodFullName);
String packageName = extraPrefix(fullServiceName);
String serviceName = extraSuffix(fullServiceName);
// Resolve the FileDescriptor from the response
Descriptors.FileDescriptor fileDescriptor = getFileDescriptor(fileDescriptorProtoList, packageName, serviceName);
// Look up the service descriptor
Descriptors.ServiceDescriptor serviceDescriptor = fileDescriptor.getFile().findServiceByName(serviceName);
// Look up the method descriptor
Descriptors.MethodDescriptor methodDescriptor = serviceDescriptor.findMethodByName(methodName);
// Execute the call
executeCall(channel, fileDescriptor, methodDescriptor, requestContent);
} catch (Exception e) {
log.error(e.getMessage(), e);
}
}
/**
* Parse and look up the file descriptor for the method
*/
private static Descriptors.FileDescriptor getFileDescriptor(List<ByteString> fileDescriptorProtoList,
String packageName,
String serviceName) throws Exception {
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap =
fileDescriptorProtoList.stream()
.map(bs -> {
try {
return DescriptorProtos.FileDescriptorProto.parseFrom(bs);
} catch (InvalidProtocolBufferException e) {
e.printStackTrace();
}
return null;
})
.filter(Objects::nonNull)
.collect(Collectors.toMap(DescriptorProtos.FileDescriptorProto::getName, f -> f));
if (fileDescriptorProtoMap.isEmpty()) {
log.error("服务不存在");
throw new IllegalArgumentException("方法的文件描述不存在");
}
// Find the proto descriptor for the service
DescriptorProtos.FileDescriptorProto fileDescriptorProto = findServiceFileDescriptorProto(packageName, serviceName, fileDescriptorProtoMap);
// Resolve this proto's dependencies
Descriptors.FileDescriptor[] dependencies = getDependencies(fileDescriptorProto, fileDescriptorProtoMap);
// Build the proto's FileDescriptor
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Find the file descriptor by package name and service name
*/
private static DescriptorProtos.FileDescriptorProto findServiceFileDescriptorProto(String packageName,
String serviceName,
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap) {
for (DescriptorProtos.FileDescriptorProto proto : fileDescriptorProtoMap.values()) {
if (proto.getPackage().equals(packageName)) {
boolean exist = proto.getServiceList()
.stream()
.anyMatch(s -> serviceName.equals(s.getName()));
if (exist) {
return proto;
}
}
}
throw new IllegalArgumentException("服务不存在");
}
/**
* Extract the prefix
*/
private static String extraPrefix(String content) {
int index = content.lastIndexOf(".");
return content.substring(0, index);
}
/**
* Extract the suffix
*/
private static String extraSuffix(String content) {
int index = content.lastIndexOf(".");
return content.substring(index + 1);
}
/**
* Resolve the dependency descriptors
*/
private static Descriptors.FileDescriptor[] getDependencies(DescriptorProtos.FileDescriptorProto proto,
Map<String, DescriptorProtos.FileDescriptorProto> finalDescriptorProtoMap) {
return proto.getDependencyList()
.stream()
.map(finalDescriptorProtoMap::get)
.map(f -> toFileDescriptor(f, getDependencies(f, finalDescriptorProtoMap)))
.toArray(Descriptors.FileDescriptor[]::new);
}
/**
* Convert a FileDescriptorProto into a FileDescriptor
*/
@SneakyThrows
private static Descriptors.FileDescriptor toFileDescriptor(DescriptorProtos.FileDescriptorProto fileDescriptorProto,
Descriptors.FileDescriptor[] dependencies) {
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Execute the method call
*/
private static void executeCall(ManagedChannel channel,
Descriptors.FileDescriptor fileDescriptor,
Descriptors.MethodDescriptor originMethodDescriptor,
String requestContent) throws Exception {
// Regenerate the gRPC MethodDescriptor
MethodDescriptor<DynamicMessage, DynamicMessage> methodDescriptor = generateMethodDescriptor(originMethodDescriptor);
CallOptions callOptions = CallOptions.DEFAULT;
TypeRegistry registry = TypeRegistry.newBuilder()
.add(fileDescriptor.getMessageTypes())
.build();
// Convert the request content from a JSON string into the corresponding message type
JsonFormat.Parser parser = JsonFormat.parser().usingTypeRegistry(registry);
DynamicMessage.Builder messageBuilder = DynamicMessage.newBuilder(originMethodDescriptor.getInputType());
parser.merge(requestContent, messageBuilder);
DynamicMessage requestMessage = messageBuilder.build();
// Make the call; the call type can be inferred from originMethodDescriptor.isClientStreaming() and originMethodDescriptor.isServerStreaming()
DynamicMessage response = ClientCalls.blockingUnaryCall(channel, methodDescriptor, callOptions, requestMessage);
// Render the response as a JSON string
JsonFormat.Printer printer = JsonFormat.printer()
.usingTypeRegistry(registry)
.includingDefaultValueFields();
String responseContent = printer.print(response);
log.info("响应: {}", responseContent);
}
/**
* Regenerate the method descriptor
*/
private static MethodDescriptor<DynamicMessage, DynamicMessage> generateMethodDescriptor(Descriptors.MethodDescriptor originMethodDescriptor) {
// Generate the full method name
String fullMethodName = MethodDescriptor.generateFullMethodName(originMethodDescriptor.getService().getFullName(), originMethodDescriptor.getName());
// Request and response marshallers
MethodDescriptor.Marshaller<DynamicMessage> inputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getInputType())
.buildPartial());
MethodDescriptor.Marshaller<DynamicMessage> outputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getOutputType())
.buildPartial());
// Generate the method descriptor; the fullMethodName on originMethodDescriptor is not in the right format
return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
.setFullMethodName(fullMethodName)
.setRequestMarshaller(inputTypeMarshaller)
.setResponseMarshaller(outputTypeMarshaller)
// Use UNKNOWN; it is adjusted automatically
.setType(MethodDescriptor.MethodType.UNKNOWN)
.build();
}
}
There isn't much to prevent this technically. The two big hurdles are:
1. having a runtime-callable parser for reading the .proto, and
2. having a general purpose gRPC client available that takes things like the service method name as literals
Both are possible, but neither is trivial.
For 1, the crude way would be to shell/invoke protoc using the descriptor-set option to generate a schema binary, then deserialize that as a FileDescriptorSet (from descriptor.proto); this model gives you access to how protoc sees the file. Some platforms also have native parsers (essentially reimplementing protoc as a library in that platform), for example protobuf-net.Reflection does this in .NET-land
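In Java, loading such a descriptor set could look roughly like this (a sketch, assuming the set was produced with something like protoc --include_imports --descriptor_set_out=service.desc yourservice.proto and that each file appears after its dependencies; otherwise resolve dependencies recursively, as in the answer above):
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import java.io.FileInputStream;
import java.util.HashMap;
import java.util.Map;
public class DescriptorSetLoader {
    public static Map<String, Descriptors.FileDescriptor> load(String path) throws Exception {
        DescriptorProtos.FileDescriptorSet set;
        try (FileInputStream in = new FileInputStream(path)) {
            set = DescriptorProtos.FileDescriptorSet.parseFrom(in);
        }
        // Build each FileDescriptor, wiring up its already-built dependencies by file name.
        Map<String, Descriptors.FileDescriptor> built = new HashMap<>();
        for (DescriptorProtos.FileDescriptorProto proto : set.getFileList()) {
            Descriptors.FileDescriptor[] deps = proto.getDependencyList().stream()
                    .map(built::get)
                    .toArray(Descriptors.FileDescriptor[]::new);
            built.put(proto.getName(), Descriptors.FileDescriptor.buildFrom(proto, deps));
        }
        return built;
    }
}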
For 2, here's an implementation of that in C#. The approach should be fairly portable to Java, even if the details vary. You can look at a generated implementation to see how it works in any particular language.
(Sorry that the specific examples are C#/.NET, but that's where I live; the approaches should be portable, even if the specific code is not.)
Technically, both are possible.
The codegen simply generates a handful of classes: mainly protobuf messages, gRPC method descriptors and stubs. You can implement them yourself or check in the generated code to bypass the codegen. I am not sure what the benefit of doing this would be, to be honest. Also, it will be very annoying if the proto changes.
It is also possible to do it dynamically using bytecode generation, as long as you check in some interfaces/abstract classes to represent those generated stubs/method descriptors and protobuf messages. You have to make sure that non-dynamic code stays in sync with the proto definition, though (most likely via runtime checks/exceptions).
I am trying to deserialize some Kafka messages that were serialized by NiFi, using the Hortonworks Schema Registry.
Processor used as RecordWriter on the NiFi side: AvroRecordSetWriter
Schema Write Strategy: HWX Content-Encoded Schema Reference
I am able to deserialize these messages in another NiFi Kafka consumer. However, I am trying to deserialize them from my Flink application using Kafka code.
I have the following inside the Kafka deserializer handler of my Flink application:
final String SCHEMA_REGISTRY_CACHE_SIZE_KEY = SchemaRegistryClient.Configuration.CLASSLOADER_CACHE_SIZE.name();
final String SCHEMA_REGISTRY_CACHE_EXPIRY_INTERVAL_SECS_KEY = SchemaRegistryClient.Configuration.CLASSLOADER_CACHE_EXPIRY_INTERVAL_SECS.name();
final String SCHEMA_REGISTRY_SCHEMA_VERSION_CACHE_SIZE_KEY = SchemaRegistryClient.Configuration.SCHEMA_VERSION_CACHE_SIZE.name();
final String SCHEMA_REGISTRY_SCHEMA_VERSION_CACHE_EXPIRY_INTERVAL_SECS_KEY = SchemaRegistryClient.Configuration.SCHEMA_VERSION_CACHE_EXPIRY_INTERVAL_SECS.name();
final String SCHEMA_REGISTRY_URL_KEY = SchemaRegistryClient.Configuration.SCHEMA_REGISTRY_URL.name();
Properties schemaRegistryProperties = new Properties();
schemaRegistryProperties.put(SCHEMA_REGISTRY_CACHE_SIZE_KEY, 10L);
schemaRegistryProperties.put(SCHEMA_REGISTRY_CACHE_EXPIRY_INTERVAL_SECS_KEY, 5000L);
schemaRegistryProperties.put(SCHEMA_REGISTRY_SCHEMA_VERSION_CACHE_SIZE_KEY, 1000L);
schemaRegistryProperties.put(SCHEMA_REGISTRY_SCHEMA_VERSION_CACHE_EXPIRY_INTERVAL_SECS_KEY, 60 * 60 * 1000L);
schemaRegistryProperties.put(SCHEMA_REGISTRY_URL_KEY, "http://schema_registry_server:7788/api/v1");
return (Map<String, Object>) HWXSchemaRegistry.getInstance(schemaRegistryProperties).deserialize(message);
And here is the HWXSchemaRegistry code to deserialize the message:
import com.hortonworks.registries.schemaregistry.avro.AvroSchemaProvider;
import com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient;
import com.hortonworks.registries.schemaregistry.errors.SchemaNotFoundException;
import com.hortonworks.registries.schemaregistry.serdes.avro.AvroSnapshotDeserializer;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
public class HWXSchemaRegistry {
private SchemaRegistryClient client;
private Map<String,Object> config;
private AvroSnapshotDeserializer deserializer;
private static HWXSchemaRegistry hwxSRInstance = null;
public static HWXSchemaRegistry getInstance(Properties schemaRegistryConfig) {
if(hwxSRInstance == null)
hwxSRInstance = new HWXSchemaRegistry(schemaRegistryConfig);
return hwxSRInstance;
}
public Object deserialize(byte[] message) throws IOException {
Object o = hwxSRInstance.deserializer.deserialize(new ByteArrayInputStream(message), null);
return o;
}
private static Map<String,Object> properties2Map(Properties config) {
Enumeration<Object> keys = config.keys();
Map<String, Object> configMap = new HashMap<String,Object>();
while (keys.hasMoreElements()) {
Object key = (Object) keys.nextElement();
configMap.put(key.toString(), config.get(key));
}
return configMap;
}
private HWXSchemaRegistry(Properties schemaRegistryConfig) {
_log.debug("Init SchemaRegistry Client");
this.config = HWXSchemaRegistry.properties2Map(schemaRegistryConfig);
this.client = new SchemaRegistryClient(this.config);
this.deserializer = this.client.getDefaultDeserializer(AvroSchemaProvider.TYPE);
this.deserializer.init(this.config);
}
}
But I am getting a 404 HTTP error code (schema not found). I think this is due to incompatible "protocols" between the NiFi configuration and the HWX Schema Registry client implementation, so the schema identifier bytes that the client is looking for do not exist on the server, or something like that.
Can someone help with this?
Thank you.
Caused by: javax.ws.rs.NotFoundException: HTTP 404 Not Found
at org.glassfish.jersey.client.JerseyInvocation.convertToException(JerseyInvocation.java:1069)
at org.glassfish.jersey.client.JerseyInvocation.translate(JerseyInvocation.java:866)
at org.glassfish.jersey.client.JerseyInvocation.lambda$invoke$1(JerseyInvocation.java:750)
at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
at org.glassfish.jersey.internal.Errors.process(Errors.java:205)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:390)
at org.glassfish.jersey.client.JerseyInvocation.invoke(JerseyInvocation.java:748)
at org.glassfish.jersey.client.JerseyInvocation$Builder.method(JerseyInvocation.java:404)
at org.glassfish.jersey.client.JerseyInvocation$Builder.get(JerseyInvocation.java:300)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient$14.run(SchemaRegistryClient.java:1054)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient$14.run(SchemaRegistryClient.java:1051)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getEntities(SchemaRegistryClient.java:1051)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getAllVersions(SchemaRegistryClient.java:872)
at com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient.getAllVersions(SchemaRegistryClient.java:676)
at HWXSchemaRegistry.&lt;init&gt;(HWXSchemaRegistry.java:56)
at HWXSchemaRegistry.getInstance(HWXSchemaRegistry.java:26)
at SchemaService.deserialize(SchemaService.java:70)
at SchemaService.deserialize(SchemaService.java:26)
at org.apache.flink.streaming.connectors.kafka.internals.KafkaDeserializationSchemaWrapper.deserialize(KafkaDeserializationSchemaWrapper.java:45)
at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:140)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:712)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:93)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:57)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:97)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:302)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
at java.lang.Thread.run(Thread.java:745)
I found a workaround. Since I wasn't able to get this working, I take the first bytes of the byte array, make several calls to the schema registry to get the Avro schema, and then deserialize the rest of the byte array.
The first byte (0) is the protocol version (I figured out this is a NiFi-specific byte, since I didn't need it).
The next 8 bytes are the schema ID.
The next 4 bytes are the schema version.
The rest of the bytes are the message itself:
import com.hortonworks.registries.schemaregistry.SchemaMetadataInfo;
import com.hortonworks.registries.schemaregistry.SchemaVersionInfo;
import com.hortonworks.registries.schemaregistry.SchemaVersionKey;
import com.hortonworks.registries.schemaregistry.client.SchemaRegistryClient;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Arrays;
try(SchemaRegistryClient client = new SchemaRegistryClient(this.schemaRegistryConfig)) {
try {
Long schemaId = ByteBuffer.wrap(Arrays.copyOfRange(message, 1, 9)).getLong();
Integer schemaVersion = ByteBuffer.wrap(Arrays.copyOfRange(message, 9, 13)).getInt();
SchemaMetadataInfo schemaInfo = client.getSchemaMetadataInfo(schemaId);
String schemaName = schemaInfo.getSchemaMetadata().getName();
SchemaVersionInfo schemaVersionInfo = client.getSchemaVersionInfo(
new SchemaVersionKey(schemaName, schemaVersion));
String avroSchema = schemaVersionInfo.getSchemaText();
byte[] avroPayload = Arrays.copyOfRange(message, 13, message.length); // strip the 13-byte header
// Deserialize [...]
}
catch (Exception e)
{
throw new IOException(e.getMessage());
}
}
I also thought that maybe I had to remove the first byte before calling hwxSRInstance.deserializer.deserialize in my question code, since this byte seems to be a NiFi-specific byte used to communicate between NiFi processors, but it didn't work.
The next step is to build a cache with the schema texts to avoid calling the schema registry API multiple times.
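A minimal sketch of such a cache (the class and method names here are hypothetical; it assumes the schema text is identified by the schema ID and version parsed above):
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;
import org.apache.avro.Schema;
class SchemaCache {
    private final ConcurrentHashMap<String, Schema> cache = new ConcurrentHashMap<>();
    // loadSchemaText performs the actual schema registry lookup, only on a cache miss.
    Schema get(long schemaId, int schemaVersion, Supplier<String> loadSchemaText) {
        return cache.computeIfAbsent(schemaId + ":" + schemaVersion,
                key -> new Schema.Parser().parse(loadSchemaText.get()));
    }
}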
New info: I will extend my answer to include the Avro deserialization part, since it took some troubleshooting and I had to inspect the NiFi Avro Reader source code to figure it out (I was getting a "not valid Avro data" exception when trying to use the basic Avro deserialization code):
import org.apache.avro.Schema;
import org.apache.avro.file.SeekableByteArrayInput;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
private static GenericRecord deserializeMessage(byte[] message, String schemaText) throws IOException {
InputStream in = new SeekableByteArrayInput(message);
Schema schema = new Schema.Parser().parse(schemaText);
DatumReader<GenericRecord> datumReader = new GenericDatumReader<>(schema);
BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(in, null);
GenericRecord genericRecord = null;
genericRecord = datumReader.read(genericRecord, decoder);
in.close();
return genericRecord;
}
If you want to convert the GenericRecord to a Map, note that string values are not String objects; you need to convert the keys and any values typed as string:
private static Map<String, Object> avroGenericRecordToMap(GenericRecord record)
{
Map<String, Object> map = new HashMap<>();
record.getSchema().getFields().forEach(field ->
map.put(String.valueOf(field.name()), record.get(field.name())));
// Strings are mapped to the Utf8 class, so they need to be converted (all the record keys and any values typed as string)
if(map.get("value").getClass() == org.apache.avro.util.Utf8.class)
map.put("value", String.valueOf(map.get("value")));
return map;
}
I am trying to get some data from a D-Bus service and work with it in Java.
I can get the information on the CLI with the following command:
dbus-send --print-reply --system --dest=com.victronenergy.solarcharger.ttyUSB0 /Dc/0/Voltage com.victronenergy.BusItem.GetValue
The result is:
method return time=1538903662.321580 sender=:1.14 -> destination=:1.806 serial=335692 reply_serial=2
variant double 13.43
Here is what I tried to get this data in Java:
After hours of reading, I created an interface.
package javadbus;
import java.util.Map;
import org.freedesktop.dbus.DBusInterface;
import org.freedesktop.dbus.DBusSignal;
import org.freedesktop.dbus.Variant;
import org.freedesktop.dbus.exceptions.DBusException;
public interface BusItem extends DBusInterface
{
public static class PropertiesChanged extends DBusSignal
{
public final Map<String,Variant> changes;
public PropertiesChanged(String path, Map<String,Variant> changes) throws DBusException
{
super(path, changes);
this.changes = changes;
}
}
public String GetDescription(String language, int length);
public Variant GetValue();
public String GetText();
public int SetValue(Variant value);
public Variant GetMin();
public Variant GetMax();
public int SetDefault();
public Variant GetDefault();
}
Here I call getConnection() and getRemoteObject() successfully.
package javadbus;
import org.freedesktop.dbus.DBusConnection;
import org.freedesktop.dbus.exceptions.DBusException;
import org.freedesktop.dbus.Variant;
public class VictronEnergyDBusSolarCharger {
private String port;
private DBusConnection conn;
public VictronEnergyDBusSolarCharger(String port) {
this.port = port;
try {
this.conn = DBusConnection.getConnection(DBusConnection.SYSTEM);
} catch (DBusException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private String getData(String item) {
BusItem bi;
String data = null;
Variant vData = null;
try {
bi = (BusItem)conn.getRemoteObject("com.victronenergy.solarcharger." + this.port, item, BusItem.class);
vData = bi.GetValue();
//data = bi.GetText();
} catch (DBusException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return data;
}
...
}
It was a big task to resolve all dependencies and get the code compiled, but finally I did it. So javac now runs without errors.
But if I try to call the method GetValue(), I get the following exception:
[Sender] INFO org.freedesktop.dbus.MessageWriter - <= MethodCall(0,1) { Path=>/org/freedesktop/DBus, Interface=>org.freedesktop.DBus, Member=>Hello, Destination=>org.freedesktop.DBus } { }
[Sender] INFO org.freedesktop.dbus.MessageWriter - <= MethodCall(0,3) { Path=>/Dc/0/Voltage, Interface=>javadbus.BusItem, Member=>GetValue, Destination=>com.victronenergy.solarcharger.ttyUSB0 } { }
Exception in thread "main" org.freedesktop.DBus$Error$UnknownMethod: Method "GetValue" with signature "" on interface "javadbus.BusItem" doesn't exist
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.freedesktop.dbus.Error.getException(Error.java:141)
at org.freedesktop.dbus.Error.throwException(Error.java:171)
at org.freedesktop.dbus.RemoteInvocationHandler.executeRemoteMethod(RemoteInvocationHandler.java:158)
at org.freedesktop.dbus.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:222)
at com.sun.proxy.$Proxy1.GetValue(Unknown Source)
at javadbus.VictronEnergyDBusSolarCharger.getData(VictronEnergyDBusSolarCharger.java:28)
at javadbus.VictronEnergyDBusSolarCharger.getDcV(VictronEnergyDBusSolarCharger.java:38)
at javadbus.MainClass.main(MainClass.java:7)
Is it necessary to implement this GetValue method myself? If so, why, and how should I do this? I only want to get this information, not provide it like a server.
Why was it a big task to get all dependencies?
The dbus-java library and its dependencies are all available at Maven Central, so a proper Maven project should just work out of the box.
Back to topic:
You don't have to implement GetValue(), but you need a suitable java interface for BusItem.
As far as I can see in the documentation of victronenergy (https://www.victronenergy.com/live/open_source:ccgx:d-bus) , your interface is not correct.
You provide SetDefault()/GetDefault() methods, which are only available on com.victronenergy.settings objects, but you want to retrieve a com.victronenergy.BusItem (not part of the com.victronenergy.settings package).
This is one error. The second error is: you use the wrong package name for your BusItem class.
In your case DBus will try to resolve an object with the path javadbus.BusItem which is not provided by the connected BusAddress com.victronenergy.solarcharger.ttyUSB0.
The BusItem class has to be in the package com.victronenergy, or you have to use the annotation @DBusInterfaceName("com.victronenergy.BusItem").
The annotation will tell the DBus library to ignore the java package/class name and use the one provided in the annotation.
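A minimal sketch of the corrected interface (the import shown for @DBusInterfaceName assumes the dbus-java 2.x package layout used in the question; in newer dbus-java versions the annotation lives in org.freedesktop.dbus.annotations). The SetDefault()/GetDefault() methods are dropped since they belong to com.victronenergy.settings objects:
package javadbus;
import org.freedesktop.dbus.DBusInterface;
import org.freedesktop.dbus.DBusInterfaceName;
import org.freedesktop.dbus.Variant;
@DBusInterfaceName("com.victronenergy.BusItem")
public interface BusItem extends DBusInterface
{
    public String GetDescription(String language, int length);
    public Variant GetValue();
    public String GetText();
    public int SetValue(Variant value);
    public Variant GetMin();
    public Variant GetMax();
}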
The interface BusItem had been created by the CreateInterface script from https://dbus.freedesktop.org/doc/dbus-java/dbus-java/dbus-javase10.html and the XML from Introspect().
But you solved my real problem. I now use the annotation @DBusInterfaceName("com.victronenergy.BusItem"). No exception anymore and I get data from my solar charger. Thank you so much!
We are trying to track down a bug. We get the error local part cannot be "null" when creating a QName in the logs.
Can anyone explain what this message means? Are there any typical reasons for getting this message?
The stacktrace is:
org.apache.axiom.om.OMException: java.lang.IllegalArgumentException: local part cannot be "null" when creating a QName
at org.apache.axiom.om.impl.builder.StAXOMBuilder.next(StAXOMBuilder.java:206)
at org.apache.axiom.om.impl.llom.OMNodeImpl.build(OMNodeImpl.java:318)
at org.apache.axiom.om.impl.llom.OMElementImpl.build(OMElementImpl.java:618)
at org.apache.axis2.jaxws.message.util.impl.SAAJConverterImpl.toOM(SAAJConverterImpl.java:147)
at org.apache.axis2.jaxws.message.impl.XMLPartImpl._convertSE2OM(XMLPartImpl.java:77)
at org.apache.axis2.jaxws.message.impl.XMLPartBase.getContentAsOMElement(XMLPartBase.java:203)
at org.apache.axis2.jaxws.message.impl.XMLPartBase.getAsOMElement(XMLPartBase.java:255)
at org.apache.axis2.jaxws.message.impl.MessageImpl.getAsOMElement(MessageImpl.java:464)
at org.apache.axis2.jaxws.message.util.MessageUtils.putMessageOnMessageContext(MessageUtils.java:202)
at org.apache.axis2.jaxws.core.controller.AxisInvocationController.prepareRequest(AxisInvocationController.java:370)
at org.apache.axis2.jaxws.core.controller.InvocationController.invoke(InvocationController.java:120)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invokeSEIMethod(JAXWSProxyHandler.java:317)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invoke(JAXWSProxyHandler.java:148)
I got the same error message (local part cannot be "null" when creating a QName) while trying to construct an org.w3c.dom.Document from a String. The problem went away after calling setNamespaceAware(true) on the DocumentBuilderFactory. A working code snippet is given below.
private static Document getDocumentFromString(final String xmlContent)
throws Exception
{
DocumentBuilderFactory documentBuilderFactory =
DocumentBuilderFactory.newInstance();
documentBuilderFactory.setNamespaceAware(true);
try
{
return documentBuilderFactory
.newDocumentBuilder()
.parse(new InputSource(new StringReader(xmlContent)));
}
catch (Exception e)
{
throw new RuntimeException(e);
}
}
It means you are creating a DOM element or attribute using one of the namespace-aware methods like createElementNS, thus
document.createElementNS(namespace, null)
or createAttributeNS or setAttributeNS,
and the second argument, the qualified name, is null or includes a prefix but no local part, as in "foo:".
EDIT:
I would try to run the XML it's parsing through a validator. It's likely there's some tag or attribute name like foo: or foo:bar:baz that is a valid XML identifier but invalid according to the additional restrictions introduced by XML namespaces.
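For reference, the message text itself comes from the javax.xml.namespace.QName constructor, which is what ultimately rejects the missing local part; a minimal sketch that reproduces it directly:
import javax.xml.namespace.QName;
public class QNameDemo {
    public static void main(String[] args) {
        // Throws IllegalArgumentException: local part cannot be "null" when creating a QName
        new QName("urn:example", null);
    }
}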
After hours of searching, I just wanted to share that the answer in this thread helped me with a Talend code migration, involving SOAP messages, from Java 8 to Java 11.
// Used libraries
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.soap.SOAPBody;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
...
// This is the node I want to replace: "<DataArea><Contact/></DataArea>"
// <SOAP-ENV:Body> > SOAP Action (e.g. <ns:GetContact>) > <GetContactRequest> > <DataArea>
SOAPBody soapBodyRequest = objSOAPMessage.getSOAPPart().getEnvelope().getBody();
Node nodeDataArea = soapBodyRequest.getFirstChild().getFirstChild().getFirstChild();
// Build a valid Node object starting from a string e.g. "<Contact> etc etc nested-etc </Contact>"
DocumentBuilderFactory objDocumentBuilderFactory = DocumentBuilderFactory.newInstance();
// As of Java 11, this is essential. It forces the factory to treat ':' as a namespace separator rather than as part of a tag name.
objDocumentBuilderFactory.setNamespaceAware(true);
// Create the node to replace "<DataArea><Contact/></DataArea>" with "<DataArea><Contact>content and nested tags</Contact></DataArea>"
Node nodeParsedFromString = objDocumentBuilderFactory.newDocumentBuilder().parse(new ByteArrayInputStream(strDataArea.getBytes())).getDocumentElement();
// Import the newly parsed Node object in the request envelop being built and replace the existing node.
nodeDataArea.replaceChild(
/*newChild*/nodeDataArea.getOwnerDocument().importNode(nodeParsedFromString, true),
/*oldChild*/nodeDataArea.getFirstChild()
);
This throws the local part cannot be "null" when creating a QName exception if you don't add the .setNamespaceAware(true) instruction.
Although this is an old thread, I hope this answer will help someone else searching for this error.
I faced the same error when I was trying to build a web app using maven-enunciate-cxf-plugin:1.28.
For me, it was caused by adding a checked exception to my web service signature:
@WebMethod(operationName = "enquirySth")
public IBANResponse enquirySth(String input) throws InvalidInputException { ...
I used the JAX-WS spec approach for throwing exceptions, but with no success. In the end I found this issue in the Enunciate issue tracker, which indicates the issue is resolved in the current version, but I think it still exists.
Finally, I did the following workaround to fix my problem:
adding @XmlRootElement to my FaultBean.
@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "FaultBean", propOrder = {"errorDescription", "errorCode"})
public class FaultBean implements Serializable {
@XmlElement(required = true, nillable = true)
protected String errorDescription;
@XmlElement(required = true, nillable = true)
protected String errorCode;
public FaultBean() {
}
public String getErrorDescription() {
return this.errorDescription;
}
public void setErrorDescription(String var1) {
this.errorDescription = var1;
}
public String getErrorCode() {
return this.errorCode;
}
public void setErrorCode(String var1) {
this.errorCode = var1;
}
}
That's it. Hope it helps.