I am trying several examples from Jest to use as a POC for ElasticSearch integration.
Right now, I am trying just a basic GET. I created a POJO called Document. It has some basic setters and getters for some fields. I populate it and then use GSON to generate the JSON text.
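For reference, that serialization step is just plain Gson; a minimal sketch (the setter name is illustrative, assuming a reportNumber field on the POJO):
import com.google.gson.Gson;
Document document = new Document();
document.setReportNumber("101221895CRT-004"); // hypothetical setter on the POJO
String json = new Gson().toJson(document);
System.out.println(json);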
From this generated JSON, I go into ElasticSearch Sense and do the following:
PUT /reports/documents/3
{
// JSON code
}
This generates just fine. I then try using Get to pull the values out from Java, like so:
JestClientFactory factory = new JestClientFactory();
factory.setHttpClientConfig(new HttpClientConfig
.Builder("http://localhost:9200")
.multiThreaded(true)
.build());
client = factory.getObject();
Get get = new Get.Builder("reports", "3").type("documents").build();
try {
JestResult result = client.execute(get);
String json = result.getJsonString();
System.out.println(json);
Document doc = null;
doc = result.getSourceAsObject(Document.class);
System.out.println("is doc null? " + doc == null);
}catch (Exception e) {
System.err.println("Error getting document");
e.printStackTrace();
}
The String json returns what I would expect (showing _index, _type, _id and of course _source). However, doc always comes out as NULL. I am not sure why that is happening.
Just to see if this was just a Get problem, I proceeded to try a Search. I used the following code snippet:
try {
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.query(QueryBuilders.matchQuery("reportNumber", "101221895CRT-004"));
Search search = new Search.Builder(searchSourceBuilder.toString())
// multiple index or types can be added.
.addIndex("reports")
.addType("documents")
.build();
SearchResult result = client.execute(search);
//List<Document> results = result.getSourceAsObjectList(Document.class);
List<SearchResult.Hit<Document, Void>> hits = result.getHits(Document.class);
for (SearchResult.Hit hit : hits) {
Document source = (Document) hit.source;
Void ex = (Void) hit.explanation;
System.out.println();
}
System.out.println("Result size: " + hits.size());
}catch (Exception e) {
System.err.println("Error searching");
e.printStackTrace();
}
When looking at result, the JSON of the object is shown. However, the List<Document> results comes out as NULL. When using hits, the size of hits is correct, but the "source" and "ex" are both NULL.
Any ideas on what I am doing wrong with this?
UPDATE
After reading Cihat's comment, I went ahead and added logging. It turns out I am getting an error when trying to convert a date (which is why it always comes back as NULL).
I get the following error message:
Unhandled exception occurred while converting source to the object .com.someCompanyName.data.Document
com.google.gson.JsonSyntaxException: java.text.ParseException: Unparseable date: "Nov 6, 2014 8:29:00 AM"
I have tried all different formats:
11/06/2014 8:29:00 AM (and without time and making year just 14)
06-NOV-2014 8:29:00 AM (and without time and making year just 14)
2014-11-06 8:29:00 AM (same thing with time and year changes)
2014-NOV-06 8:29:00 AM (same thing with time and year changes)
06/11/2014 8:29:00 AM (same thing)
All of those failed. I am sure I tried some other formats as well, so I am not sure what format the date should be in. I even tried the exact date from the DateFormat JavaDocs and it still failed. Every time I search for this, the advice is to define the DateFormat on the GsonBuilder, but in Jest I do not have access to that.
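For later readers: recent Jest versions do let you supply a custom Gson instance on the client config builder, which is where the date format belongs. A hedged sketch, assuming your Jest version exposes gson(...) on HttpClientConfig.Builder and that the dates in _source use Elasticsearch's default ISO 8601 form:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
Gson gson = new GsonBuilder()
.setDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS") // must match the format actually stored in _source
.create();
JestClientFactory factory = new JestClientFactory();
factory.setHttpClientConfig(new HttpClientConfig.Builder("http://localhost:9200")
.gson(gson) // assumption: available on the config builder in recent Jest releases
.multiThreaded(true)
.build());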
This test case demonstrates indexing a document with Jest and then getting the same document back out. Not a complete answer, but hopefully it is useful to see something that is known to work.
import io.searchbox.client.JestClient;
import io.searchbox.client.JestClientFactory;
import io.searchbox.client.JestResult;
import io.searchbox.client.config.HttpClientConfig;
import io.searchbox.core.Get;
import io.searchbox.core.Index;
import static org.hamcrest.Matchers.*;
import static org.hamcrest.MatcherAssert.*;
import org.junit.Test;
public class JestRoundtripIT {
public static final String INDEX = "reports";
public static final String TYPE = "documents";
public static final String ID = "3";
@Test
public void documentRoundTrip() throws Exception {
JestClientFactory factory = new JestClientFactory();
factory.setHttpClientConfig(new HttpClientConfig
.Builder("http://localhost:9200")
.multiThreaded(true)
.build());
JestClient client = factory.getObject();
Document original = new Document()
.withAuthor("Shay Banon")
.withContent("You know, for search...");
JestResult indexResult = client.execute(
new Index.Builder(original)
.index(INDEX)
.type(TYPE)
.id(ID)
.build());
assertThat(indexResult.isSucceeded(), equalTo(true));
JestResult getResult = client.execute(
new Get.Builder(INDEX, ID)
.type(TYPE)
.build());
assertThat(getResult.isSucceeded(), equalTo(true));
Document fromEs = getResult.getSourceAsObject(Document.class);
assertThat(fromEs, notNullValue());
assertThat(fromEs.getAuthor(), equalTo(original.getAuthor()));
assertThat(fromEs.getContent(), equalTo(original.getContent()));
}
public static class Document {
protected String author;
protected String content;
public Document withAuthor( String author ) {
this.author = author;
return this;
}
public Document withContent( String content ) {
this.content = content;
return this;
}
public String getAuthor() {
return author;
}
public void setAuthor( String author ) {
this.author = author;
}
public String getContent() {
return content;
}
public void setContent( String content ) {
this.content = content;
}
}
}
Related
I have implemented a Spring Boot application to retrieve data from files and save it in separate collections. When I run the application it gives the following error, which I couldn't resolve. Can anyone help me with this?
Error
Description:
Parameter 2 of constructor in com.bezkoder.spring.jwt.mongodb.SpringBootSecurityJwtMongodbApplication required a bean of type 'com.bezkoder.spring.jwt.mongodb.models.LogRecordCollection' that could not be found.
Action:
Consider defining a bean of type 'com.bezkoder.spring.jwt.mongodb.models.LogRecordCollection' in your configuration.
Disconnected from the target VM, address: '127.0.0.1:55297', transport: 'socket'
LogRecordController.java
@CrossOrigin(origins = "*", maxAge = 3600)
@RestController
@RequestMapping("/api/auth/log")
public class LogRecordController {
@Autowired
LogRecordRepository logRecordRepository;
@GetMapping("")
public ResponseEntity<?> getAllLogRecordsByLogFileId(@RequestParam("fileId") String fileId) {
try{
LogRecordCollection logRecordCollection = new LogRecordCollection();
logRecordCollection.setCollectionName(fileId);
// List<LogRecord> logRecords = logRecordRepository.findAll(PageRequest.of(1, 10, Sort.by(Sort.Direction.ASC, "no"))).getContent();
List<LogRecord> logRecords = logRecordRepository.findAll();
return ResponseEntity.ok().body(logRecords);
}catch (Exception e){
return ResponseEntity.status(HttpStatus.EXPECTATION_FAILED).body(e.getMessage());
}
}
}
SpringBootSecurityJwtMongodbApplication.java
@SpringBootApplication
@CrossOrigin(origins = "*", maxAge = 3600)
@RestController
@RequestMapping("/api/auth/logFile")
public class SpringBootSecurityJwtMongodbApplication {
public SpringBootSecurityJwtMongodbApplication(LogFileRepository logfileRepo, LogRecordRepository logrecordRepo, LogRecordCollection logrecordColl) {
this.logfileRepo = logfileRepo;
this.logrecordRepo = logrecordRepo;
this.logrecordColl = logrecordColl;
}
public static void main(String[] args) {
SpringApplication.run(SpringBootSecurityJwtMongodbApplication.class, args);
}
@Bean
public ApplicationRunner runner(FTPConfiguration.GateFile gateFile) {
return args -> {
List<File> files = gateFile.mget(".");
for (File file : files) {
JSONArray arr = new JSONArray();
System.out.println("Result:" + file.getAbsolutePath());
run(file, arr);
}
};
}
void run(File file, JSONArray arr) throws IOException {
SimpleDateFormat formatter = new SimpleDateFormat("hh:mm:ss");
Pcap pcap = Pcap.openStream(file);
JSONObject obj = new JSONObject();
String fileName = file.getName();
pcap.loop(
packet -> {
String Time = null;
String Source = null;
String Destination = null;
String dataProtocol = null;
Long Length = null;
if (packet.hasProtocol(Protocol.TCP)) {
TCPPacket packet1 = (TCPPacket) packet.getPacket(Protocol.TCP);
Time = formatter.format(new Date(packet1.getArrivalTime() / 1000));
Source = packet1.getSourceIP();
Destination = packet1.getDestinationIP();
dataProtocol = packet1.getProtocol().toString();
Length = packet1.getTotalLength();
} else if (packet.hasProtocol(Protocol.UDP)) {
UDPPacket packet1 = (UDPPacket) packet.getPacket(Protocol.UDP);
Time = formatter.format(new Date(packet1.getArrivalTime() / 1000));
Source = packet1.getSourceIP();
Destination = packet1.getDestinationIP();
dataProtocol = packet1.getProtocol().toString();
Length = packet1.getTotalLength();
} else {
System.out.println("Not found protocol. | " + packet.getProtocol());
}
obj.put("Time", Time);
obj.put("Source", Source);
obj.put("Destination", Destination);
obj.put("Protocol", dataProtocol);
obj.put("Length", Length);
arr.add(obj);
return packet.getNextPacket() != null;
}
);
System.out.println(arr);
System.out.println(fileName);
Calendar calendar = Calendar.getInstance();
String now = String.valueOf(calendar.getTime());
LogFile data =logfileRepo.save(new LogFile("", fileName, now));
String collectionName = data.getFileName();
System.out.println(collectionName);
//Converting jsonData string into JSON object
//Creating an empty ArrayList of type Object
ArrayList<Object> listdata = new ArrayList<>();
//Checking whether the JSON array has some value or not
if (arr != null) {
//Iterating JSON array
for (int i=0;i<arr.size();i++){
//Adding each element of JSON array into ArrayList
listdata.add(arr.get(i));
}
}
logrecordColl.setCollectionName(collectionName);
listdata.addAll(logrecordRepo.findAll());
}
private final LogFileRepository logfileRepo;
private final LogRecordRepository logrecordRepo;
private final LogRecordCollection logrecordColl;
}
LogRecordRepository.java
import com.bezkoder.spring.jwt.mongodb.models.LogRecord;
import org.springframework.data.mongodb.repository.MongoRepository;
public interface LogRecordRepository extends MongoRepository<LogRecord, String>{
}
LogRecordCollection.java
public class LogRecordCollection {
private static String collectionName = "undefined";
public static String getCollectionName(){
return collectionName;
}
public void setCollectionName(String collectionName){
this.collectionName = collectionName;
}
}
Parameter 2 of constructor in com.bezkoder.spring.jwt.mongodb.SpringBootSecurityJwtMongodbApplication required a bean of type 'com.bezkoder.spring.jwt.mongodb.models.LogRecordCollection' that could not be found.
In a nutshell, an exception like this is self-explanatory: it means that Spring could not find a bean to inject into your class.
In your case the class SpringBootSecurityJwtMongodbApplication has a constructor:
public SpringBootSecurityJwtMongodbApplication(LogFileRepository logfileRepo, LogRecordRepository logrecordRepo, LogRecordCollection logrecordColl) {
this.logfileRepo = logfileRepo;
this.logrecordRepo = logrecordRepo;
this.logrecordColl = logrecordColl;
}
Now, LogRecordCollection has to be a bean (annotated with @Component for example, or defined via Java configuration: a @Configuration-marked class with a @Bean method that creates this class). Otherwise Spring won't "recognize" this class as a bean.
So strictly speaking this is your issue.
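As a minimal sketch of that fix (note the collectionName field is made non-static here so the instance setter behaves as intended):
import org.springframework.stereotype.Component;
@Component
public class LogRecordCollection {
private String collectionName = "undefined";
public String getCollectionName() {
return collectionName;
}
public void setCollectionName(String collectionName) {
this.collectionName = collectionName;
}
}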
Now, having said that - the code you've presented in the question looks extremely messy - you mix @SpringBootApplication, which is the entry point to the application, the rest controller and what not. I really recommend separating all this into different files to improve code clarity and avoid unexpected exceptions that can be tricky to fix.
Add the annotations below to SpringBootSecurityJwtMongodbApplication:
@SpringBootApplication
@ComponentScan("com.bezkoder.spring.jwt.mongodb") // to scan the packages mentioned
@EnableMongoRepositories("com.bezkoder.spring.jwt.mongodb") // to activate MongoDB repositories
public class SpringBootSecurityJwtMongodbApplication { ... }
When working with gRPC, we need to generate the gRPC client and server interfaces from our .proto service definition via the protocol buffer compiler (protoc) or using a Gradle or Maven protoc build plugin.
Current flow: protobuf file -> Java code -> gRPC client.
So, is there any way to skip this step?
How can I create a generic gRPC client that can call the server directly from the protobuf file, without compiling it into Java code?
Or, is there a way to generate the code at runtime?
Expected flow: protobuf file -> gRPC client.
I want to build a generic gRPC client system whose input is protobuf files along with a description of the method, package, message request, etc., without having to recompile for each protobuf.
Thank you very much.
Protobuf systems really need protoc to be run. However, the generated code can be skipped. Instead of passing something like --java_out and --grpc_java_out to protoc you can pass --descriptor_set_out=FILE, which will parse the .proto file into a descriptor file. A descriptor file is a proto-encoded FileDescriptorSet. This is the same basic format as used with the reflection service.
Once you have a descriptor, you can load it one FileDescriptor at a time and create a DynamicMessage.
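A minimal sketch of that loading step, assuming the descriptor file was produced with protoc --descriptor_set_out=service.desc --include_imports and that the first file in the set has no dependencies (real code must build dependencies in topological order); HelloMessage is the message type from the sample proto further down:
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import com.google.protobuf.DynamicMessage;
import java.io.FileInputStream;
DescriptorProtos.FileDescriptorSet set = DescriptorProtos.FileDescriptorSet
.parseFrom(new FileInputStream("service.desc"));
Descriptors.FileDescriptor fd = Descriptors.FileDescriptor
.buildFrom(set.getFile(0), new Descriptors.FileDescriptor[0]);
Descriptors.Descriptor messageType = fd.findMessageTypeByName("HelloMessage");
DynamicMessage request = DynamicMessage.newBuilder(messageType)
.setField(messageType.findFieldByName("message"), "hi")
.build();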
Then for the gRPC piece, you need to create a gRPC MethodDescriptor.
static MethodDescriptor<DynamicMessage, DynamicMessage> from(
Descriptors.MethodDescriptor methodDesc
) {
return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
// UNKNOWN is fine, but the "correct" value can be computed from
// methodDesc.toProto().getClientStreaming()/getServerStreaming()
.setType(getMethodTypeFromDesc(methodDesc))
.setFullMethodName(MethodDescriptor.generateFullMethodName(
methodDesc.getService().getFullName(), methodDesc.getName()))
.setRequestMarshaller(ProtoUtils.marshaller(
DynamicMessage.getDefaultInstance(methodDesc.getInputType())))
.setResponseMarshaller(ProtoUtils.marshaller(
DynamicMessage.getDefaultInstance(methodDesc.getOutputType())))
.build();
}
static MethodDescriptor.MethodType getMethodTypeFromDesc(
Descriptors.MethodDescriptor methodDesc
) {
if (!methodDesc.isServerStreaming()
&& !methodDesc.isClientStreaming()) {
return MethodDescriptor.MethodType.UNARY;
} else if (methodDesc.isServerStreaming()
&& !methodDesc.isClientStreaming()) {
return MethodDescriptor.MethodType.SERVER_STREAMING;
} else if (!methodDesc.isServerStreaming()) {
return MethodDescriptor.MethodType.CLIENT_STREAMING;
} else {
return MethodDescriptor.MethodType.BIDI_STREAMING;
}
}
At that point you have everything you need and can call Channel.newCall(method, CallOptions.DEFAULT) in gRPC. You're also free to use ClientCalls to use something more similar to the stub APIs.
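For a unary method that boils down to something like this sketch, where from(...) is the helper above and request is a DynamicMessage:
DynamicMessage reply = ClientCalls.blockingUnaryCall(
channel, from(methodDesc), CallOptions.DEFAULT, request);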
So dynamic calls are definitely possible, and are used for things like grpcurl. But they are also not easy, and so are generally only done when necessary.
I did it in Java, and the steps are:
Call reflection service to get FileDescriptorProto list by method name
Get FileDescriptor of method from FileDescriptorProto list by package name, service name
Get MethodDescriptor from ServiceDescriptor which get from the FileDescriptor
Generate a MethodDescriptor<DynamicMessage, DynamicMessage> by MethodDescriptor
Build request DynamicMessage from content like JSON or others
Call method
Parse response content to JSON from DynamicMessage response
You can reference the full sample in project helloworlde/grpc-java-sample#reflection
And proto is:
syntax = "proto3";
package io.github.helloworlde.grpc;
option go_package = "api;grpc_gateway";
option java_package = "io.github.helloworlde.grpc";
option java_multiple_files = true;
option java_outer_classname = "HelloWorldGrpc";
service HelloService{
rpc SayHello(HelloMessage) returns (HelloResponse){
}
}
message HelloMessage {
string message = 2;
}
message HelloResponse {
string message = 1;
}
Start a server for this proto yourself; the full code in Java looks like this:
import com.google.protobuf.ByteString;
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import com.google.protobuf.DynamicMessage;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.TypeRegistry;
import com.google.protobuf.util.JsonFormat;
import io.grpc.CallOptions;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.MethodDescriptor;
import io.grpc.protobuf.ProtoUtils;
import io.grpc.reflection.v1alpha.ServerReflectionGrpc;
import io.grpc.reflection.v1alpha.ServerReflectionRequest;
import io.grpc.reflection.v1alpha.ServerReflectionResponse;
import io.grpc.stub.ClientCalls;
import io.grpc.stub.StreamObserver;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
@Slf4j
public class ReflectionCall {
public static void main(String[] args) throws InterruptedException {
// The reflection method symbol only supports the formats package.service.method or package.service
String methodSymbol = "io.github.helloworlde.grpc.HelloService.SayHello";
String requestContent = "{\"message\": \"Reflection\"}";
// Build the Channel
ManagedChannel channel = ManagedChannelBuilder.forAddress("127.0.0.1", 9090)
.usePlaintext()
.build();
// Build the reflection stub from the Channel
ServerReflectionGrpc.ServerReflectionStub reflectionStub = ServerReflectionGrpc.newStub(channel);
// Response observer
StreamObserver<ServerReflectionResponse> streamObserver = new StreamObserver<ServerReflectionResponse>() {
@Override
public void onNext(ServerReflectionResponse response) {
try {
// Only file-descriptor responses are of interest
if (response.getMessageResponseCase() == ServerReflectionResponse.MessageResponseCase.FILE_DESCRIPTOR_RESPONSE) {
List<ByteString> fileDescriptorProtoList = response.getFileDescriptorResponse().getFileDescriptorProtoList();
handleResponse(fileDescriptorProtoList, channel, methodSymbol, requestContent);
} else {
log.warn("Unknown response type: " + response.getMessageResponseCase());
}
} catch (Exception e) {
log.error("Failed to handle the response: {}", e.getMessage(), e);
}
}
@Override
public void onError(Throwable t) {
}
@Override
public void onCompleted() {
log.info("Complete");
}
};
// Request observer
StreamObserver<ServerReflectionRequest> requestStreamObserver = reflectionStub.serverReflectionInfo(streamObserver);
// Build and send the request to fetch the file descriptor containing the method symbol
ServerReflectionRequest getFileContainingSymbolRequest = ServerReflectionRequest.newBuilder()
.setFileContainingSymbol(methodSymbol)
.build();
requestStreamObserver.onNext(getFileContainingSymbolRequest);
channel.awaitTermination(10, TimeUnit.SECONDS);
}
/**
* Handle the response
*/
private static void handleResponse(List<ByteString> fileDescriptorProtoList,
ManagedChannel channel,
String methodFullName,
String requestContent) {
try {
// Parse the method and service names
String fullServiceName = extraPrefix(methodFullName);
String methodName = extraSuffix(methodFullName);
String packageName = extraPrefix(fullServiceName);
String serviceName = extraSuffix(fullServiceName);
// Parse the FileDescriptor from the response
Descriptors.FileDescriptor fileDescriptor = getFileDescriptor(fileDescriptorProtoList, packageName, serviceName);
// Look up the service descriptor
Descriptors.ServiceDescriptor serviceDescriptor = fileDescriptor.getFile().findServiceByName(serviceName);
// Look up the method descriptor
Descriptors.MethodDescriptor methodDescriptor = serviceDescriptor.findMethodByName(methodName);
// Execute the call
executeCall(channel, fileDescriptor, methodDescriptor, requestContent);
} catch (Exception e) {
log.error(e.getMessage(), e);
}
}
/**
* Parse and find the file descriptor corresponding to the method
*/
private static Descriptors.FileDescriptor getFileDescriptor(List<ByteString> fileDescriptorProtoList,
String packageName,
String serviceName) throws Exception {
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap =
fileDescriptorProtoList.stream()
.map(bs -> {
try {
return DescriptorProtos.FileDescriptorProto.parseFrom(bs);
} catch (InvalidProtocolBufferException e) {
e.printStackTrace();
}
return null;
})
.filter(Objects::nonNull)
.collect(Collectors.toMap(DescriptorProtos.FileDescriptorProto::getName, f -> f));
if (fileDescriptorProtoMap.isEmpty()) {
log.error("Service does not exist");
throw new IllegalArgumentException("No file descriptor found for the method");
}
// Find the proto file descriptor for the service
DescriptorProtos.FileDescriptorProto fileDescriptorProto = findServiceFileDescriptorProto(packageName, serviceName, fileDescriptorProtoMap);
// Resolve this proto's dependencies
Descriptors.FileDescriptor[] dependencies = getDependencies(fileDescriptorProto, fileDescriptorProtoMap);
// Build the FileDescriptor for the proto
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Find the file descriptor by package name and service name
*/
private static DescriptorProtos.FileDescriptorProto findServiceFileDescriptorProto(String packageName,
String serviceName,
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap) {
for (DescriptorProtos.FileDescriptorProto proto : fileDescriptorProtoMap.values()) {
if (proto.getPackage().equals(packageName)) {
boolean exist = proto.getServiceList()
.stream()
.anyMatch(s -> serviceName.equals(s.getName()));
if (exist) {
return proto;
}
}
}
throw new IllegalArgumentException("Service does not exist");
}
/**
* Extract the prefix
*/
private static String extraPrefix(String content) {
int index = content.lastIndexOf(".");
return content.substring(0, index);
}
/**
* Extract the suffix
*/
private static String extraSuffix(String content) {
int index = content.lastIndexOf(".");
return content.substring(index + 1);
}
/**
* Resolve the dependencies
*/
private static Descriptors.FileDescriptor[] getDependencies(DescriptorProtos.FileDescriptorProto proto,
Map<String, DescriptorProtos.FileDescriptorProto> finalDescriptorProtoMap) {
return proto.getDependencyList()
.stream()
.map(finalDescriptorProtoMap::get)
.map(f -> toFileDescriptor(f, getDependencies(f, finalDescriptorProtoMap)))
.toArray(Descriptors.FileDescriptor[]::new);
}
/**
* Convert a FileDescriptorProto to a FileDescriptor
*/
@SneakyThrows
private static Descriptors.FileDescriptor toFileDescriptor(DescriptorProtos.FileDescriptorProto fileDescriptorProto,
Descriptors.FileDescriptor[] dependencies) {
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Execute the method call
*/
private static void executeCall(ManagedChannel channel,
Descriptors.FileDescriptor fileDescriptor,
Descriptors.MethodDescriptor originMethodDescriptor,
String requestContent) throws Exception {
// Regenerate the MethodDescriptor
MethodDescriptor<DynamicMessage, DynamicMessage> methodDescriptor = generateMethodDescriptor(originMethodDescriptor);
CallOptions callOptions = CallOptions.DEFAULT;
TypeRegistry registry = TypeRegistry.newBuilder()
.add(fileDescriptor.getMessageTypes())
.build();
// Convert the request content from a JSON string into the corresponding message type
JsonFormat.Parser parser = JsonFormat.parser().usingTypeRegistry(registry);
DynamicMessage.Builder messageBuilder = DynamicMessage.newBuilder(originMethodDescriptor.getInputType());
parser.merge(requestContent, messageBuilder);
DynamicMessage requestMessage = messageBuilder.build();
// Invoke; the call type can be inferred via originMethodDescriptor.isClientStreaming() and originMethodDescriptor.isServerStreaming()
DynamicMessage response = ClientCalls.blockingUnaryCall(channel, methodDescriptor, callOptions, requestMessage);
// Parse the response into a JSON string
JsonFormat.Printer printer = JsonFormat.printer()
.usingTypeRegistry(registry)
.includingDefaultValueFields();
String responseContent = printer.print(response);
log.info("响应: {}", responseContent);
}
/**
* Regenerate the method descriptor
*/
private static MethodDescriptor<DynamicMessage, DynamicMessage> generateMethodDescriptor(Descriptors.MethodDescriptor originMethodDescriptor) {
// Generate the full method name
String fullMethodName = MethodDescriptor.generateFullMethodName(originMethodDescriptor.getService().getFullName(), originMethodDescriptor.getName());
// Request and response marshallers
MethodDescriptor.Marshaller<DynamicMessage> inputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getInputType())
.buildPartial());
MethodDescriptor.Marshaller<DynamicMessage> outputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getOutputType())
.buildPartial());
// Generate the method descriptor; the fullMethodName on originMethodDescriptor itself is not correct
return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
.setFullMethodName(fullMethodName)
.setRequestMarshaller(inputTypeMarshaller)
.setResponseMarshaller(outputTypeMarshaller)
// Use UNKNOWN; it is adjusted automatically
.setType(MethodDescriptor.MethodType.UNKNOWN)
.build();
}
}
There isn't much to prevent this technically. The two big hurdles are:
having a runtime-callable parser for reading the .proto, and
having a general purpose gRPC client available that takes things like the service method name as literals
Both are possible, but neither is trivial.
For 1, the crude way would be to shell/invoke protoc using the descriptor-set option to generate a schema binary, then deserialize that as a FileDescriptorSet (from descriptor.proto); this model gives you access to how protoc sees the file. Some platforms also have native parsers (essentially reimplementing protoc as a library in that platform), for example protobuf-net.Reflection does this in .NET-land
For 2, here's an implementation of that in C#. The approach should be fairly portable to Java, even if the details vary. You can look at a generated implementation to see how it works in any particular language.
(Sorry that the specific examples are C#/.NET, but that's where I live; the approaches should be portable, even if the specific code: not directly)
Technically, both are possible.
The codegen simply generates a handful of classes: mainly protobuf messages, gRPC method descriptors and stubs. You can implement these by hand or check in the generated code to bypass the codegen. I am not sure what the benefit of doing this is, to be honest. It will also be very annoying whenever the proto changes.
It is also possible to do it dynamically using bytecode generation, as long as you check in some interfaces/abstract classes to represent those generated stubs/method descriptors and protobuf messages. You have to make sure that the non-dynamic code stays in sync with the proto definition, though (most likely via runtime checks/exceptions).
Most probably this issue occurs because JSONObject (org.json.JSONObject) is incompatible with the Cloudant library.
Is there an alternative way, using some other object?
I am using the Cloudant libraries below:
<dependency>
<groupId>com.cloudant</groupId>
<artifactId>cloudant-client</artifactId>
<version>2.6.2</version>
</dependency>
Here is my code
package data.repositories;
import org.json.JSONObject;
import com.cloudant.client.api.*;
import com.cloudant.client.api.CloudantClient;
import com.cloudant.client.api.Database;
import com.cloudant.client.api.model.Response;
import util.Config;
public class DatabaseRepository {
CloudantClient client = ClientBuilder.account(Config.CLOUDANT_ACCOUNT_NAME)
.username(Config.CLOUDANT_USER_NAME)
.password(Config.CLOUDANT_PASSWORD).build();
public DatabaseRepository() {
}
public void Save(String dbName) {
Database db = client.database("dbTempName", true);
JSONObject jsonObject = new JSONObject("{hello: data}");
db.save(jsonObject);
}
}
The document saved in the Cloudant database is:
{
"_id": "1c7f223f74a54e7c9f4c8a713feaa537",
"_rev": "1-a3cd12379eec936b61f899c8278c9d62",
"map": {
"hello": "data"
}
}
I'm not familiar with Cloudant, but my guess is that JSONObject has an internal field called "map" that holds your JSON data (JSONArray probably has a myArrayList field too), and Cloudant serializes the whole object into JSON, thus adding those unnecessary values.
My suggestions:
1) try to save your JSON string directly, like db.save("{hello: data}"), to avoid serialization
2) if you really need to create a JSONObject, try to customize Cloudant's serialization process to avoid the extra fields.
In response to the comment:
From what I read, I think you need a POJO which, when serialized into JSON, would look like:
{ 'hello' : 'data' }
which is something like:
public class MyClass implements Serializable {
String hello;
public MyClass(String hello) {
this.hello = hello;
}
public String getHello() {
return hello;
}
}
then save it like:
db.save(new MyClass("data"));
Or you can use a HashMap instead of a POJO:
Map<String, Object> map = new HashMap<>();
map.put("hello", "data");
db.save(map);
Look at the example in the README for the repo. It shows that you want a POJO, but you don't have to implement Serializable. Just create a class that has _id and _rev properties that are Strings, then add JavaScript-object-compatible properties as desired.
// A Java type that can be serialized to JSON
public class ExampleDocument {
private String _id = "example_id";
private String _rev = null;
private boolean isExample;
public ExampleDocument(boolean isExample) {
this.isExample = isExample;
}
public String toString() {
return "{ id: " + _id + ",\nrev: " + _rev + ",\nisExample: " + isExample + "\n}";
}
}
// Create an ExampleDocument and save it in the database
db.save(new ExampleDocument(true));
Although I haven't tried it, the HashMap approach may also work, as discussed in this tutorial: https://www.ibm.com/blogs/bluemix/2014/07/cloudant_on_bluemix/.
// create a simple doc to place into your new database
Map<String, Object> doc = new HashMap<String, Object>();
doc.put("_id", UUID.randomUUID().toString());
doc.put("season", "summer");
doc.put("climate", "arid");
dbc.create(doc);
The question uses org.json.JSONObject, which is not compatible with the Cloudant client library. I tried Google's object and it is working well for me.
The issue got resolved by using com.google.gson.JsonObject instead of org.json.JSONObject.
The corrected full code is given below:
Database db = client.database("dbTempName", true);
// Used google.gson.JsonObject instead of org.json.JSONObject.
com.google.gson.JsonParser parser = new com.google.gson.JsonParser();
com.google.gson.JsonObject jsonObject = parser.parse("{\"hello\": \"data\"}").getAsJsonObject();
db.save(jsonObject);
This is my code:
MongoDBSingleton dbSingleton = MongoDBSingleton.getInstance();
MongoDatabase db;
try {
db = dbSingleton.getTestdb();
MongoIterable<String> mg = db.listCollectionNames();
MongoCursor<String> iterator=mg.iterator();
while (iterator.hasNext()) {
MongoCollection<Document> table = db.getCollection(iterator.next());
for (Document doc: table.find()) {
System.out.println(doc.toJson());
}
}
} catch (Exception e) {
e.printStackTrace();
}
This the output of toJson:
"modified" : { "$date" : 1475789185087}
This is my output of toString:
{"modified":"Fri Oct 07 02:56:25 IST 2016"}
I want a String date format in the JSON. How do I do that?
Sadly, IMO, MongoDB Java support is broken.
That said, there is a @Deprecated class in the mongo-java-driver that you can use:
String json = com.mongodb.util.JSON.serialize(document);
System.out.println("JSON serialized Document: " + json);
I'm using this to produce fasterxml (jackson) compatible JSON from a Document object that I can deserialize via new ObjectMapper().readValue(json, MyObject.class).
However, I'm not sure what they expect you to use now that the JSON class is deprecated. But for the time being, it is still in the project (as of v3.4.2).
I'm importing the following in my pom:
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>3.4.2</version>
</dependency>
<!-- Sadly, we need the mongo-java-driver solely to serialize
Document objects in a sane manner -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.4.2</version>
</dependency>
I'm using the async driver for actually fetching and pushing updates to mongo, and the non-async driver solely for the use of the JSON.serialize method.
No, it is not possible to produce plain JSON. Please refer to this link.
However, it can produce JSON in two modes.
1) Strict mode - Output that you have already got
2) Shell mode
Shell Mode:-
JsonWriterSettings writerSettings = new JsonWriterSettings(JsonMode.SHELL, true);
System.out.println(doc.toJson(writerSettings));
Output:-
"createdOn" : ISODate("2016-07-16T16:26:51.951Z")
MongoDB Extended JSON
In theory we are supposed to use toJSON() per...
https://jira.mongodb.org/browse/JAVA-1770
However, it seems that, at least up through 3.6, toJSON() isn't supported on various types the old JSON.serialize() method handled without issue, such as the AggregateIterable<Document> objects output by aggregate().
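A workaround sketch for that case: iterate the AggregateIterable yourself and serialize each Document via toJson() (pipeline is a placeholder for your aggregation stages):
AggregateIterable<Document> results = collection.aggregate(pipeline);
StringBuilder sb = new StringBuilder("[");
for (Document d : results) {
if (sb.length() > 1) sb.append(",");
sb.append(d.toJson());
}
String json = sb.append("]").toString();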
Here is a 2020 update to answer exactly your question, i.e. getting this exact format:
"modified":"2016-07-16T16:26:51.951Z"
You have to use writerSettings like notionquest suggested, but with a custom date converter and DateTimeFormatter.ISO_INSTANT:
import java.time.Instant;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;
import java.util.Date;
import org.bson.json.Converter;
import org.bson.json.StrictJsonWriter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class JsonDateTimeConverter implements Converter<Long> {
private static final Logger LOGGER = LoggerFactory.getLogger(JsonDateTimeConverter.class);
static final DateTimeFormatter DATE_TIME_FORMATTER = DateTimeFormatter.ISO_INSTANT
.withZone(ZoneId.of("UTC"));
@Override
public void convert(Long value, StrictJsonWriter writer) {
try {
Instant instant = new Date(value).toInstant();
String s = DATE_TIME_FORMATTER.format(instant);
writer.writeString(s);
} catch (Exception e) {
LOGGER.error(String.format("Fail to convert offset %d to JSON date", value), e);
}
}
}
Use it like this:
doc.toJson(JsonWriterSettings
.builder()
.dateTimeConverter(new JsonDateTimeConverter())
.build())
If the bson.jar version is > 3.0.0, you may try document.toJson().
I used the following:
try {
MongoDatabase db = mongoClient.getDatabase("dbname");
MongoCollection<Document> collection = db.getCollection("nameofcollect");
Gson gson = new Gson();
ArrayList<JsonObject> array = new ArrayList<JsonObject>();
String jsonString = null;
/* WARNING: Gson serializes strings, i.e. it will add escape slashes if you convert an already-JSON string into a JSON string again */
for (Document doc : collection.find()) {
jsonString = gson.toJson(doc);
array.add(new Gson().fromJson(jsonString, JsonObject.class));
}
//String finalarry = gson.toJson(array);
Map<Object, ArrayList<JsonObject>> seedMap = new HashMap<Object, ArrayList<JsonObject>>();
// String encode = coCoRy.encryptAndEncode(jsonString);
seedMap.put("seed", array);
String seedJsonString = gson.toJson(seedMap);
mongoClient.close();
return seedJsonString;
} catch (MongoException | ClassCastException e) {
e.printStackTrace();
return null;
}
The result will look like the following:
{
"seed": [
{
"_id": {
"timestamp": 1590914828,
"counter": 10457170,
"randomValue1": 5587428,
"randomValue2": -25784
},
"FIR_EVID_NUM": "3436345346",
"FIR_REG_NUM": "89678967",
"LOGIN_ID": "pc_admin",
"MEDIA_PATH": "C:\\Users\\ALPHAMALE\\Documents\\ShareX\\Screenshots\\2020-05\\1590211570.png"
},
{
"_id": {
"timestamp": 1590924463,
"counter": 7254997,
"randomValue1": 5012578,
"randomValue2": 24700
},
"FIR_EVID_NUM": "999999",
"FIR_REG_NUM": "888888",
"LOGIN_ID": "32323",
"MEDIA_PATH": "C:/uploads/c46847c7e2d130ffd746c789c0f0932e.png"
}
]
}
try this:
final JsonWriterSettings settings = JsonWriterSettings.builder( ).outputMode( JsonMode.SHELL ).build( );
System.out.println(doc.toJson(settings));
You can change the JsonMode if you wish.
I am working in an XPages application with the OpenNTF Domino API to explore the graph data modelling capabilities. As an example I have taken the Teamroom application that ships with IBM Domino.
I have defined a method to migrate response documents into the graph db, but I get the error message: Cannot make a static reference to the non-static method.
Here is what the method looks like:
private void migrateResponses(DFramedTransactionalGraph<DGraph> profilesGraph) {
try {
Database db = Factory.getSession().getCurrentDatabase();
View view = db.getView("responsesOnly");
DocumentCollection col = view.getAllDocuments();
System.out.println("number of docs found " + col.getCount());
for (Document response : col) {
System.out.println("form:" + response.getFormName());
System.out.println("id:" + response.getUniversalID());
org.openntf.domino.ext.Document parent = response.getParentDocument();
if (null == parent.getParentDocument()){
//has no parent document so this parent document is a MainTopic/Post
Post post = profilesGraph.addVertex(parent.getMetaversalID(), Post.class);
Response vertexResponse = profilesGraph.addVertex(response.getUniversalID(), Response.class);
vertexResponse.setSubject(response.getItemValueString("Subject"));
Post.addResponse(vertexResponse);
}
}
profilesGraph.commit();
} catch (Throwable t) {
XspOpenLogUtil.logError(t);
}
}
The error occurs in line:
Post.addResponse(vertexResponse);
Here is what my Post class looks like:
package com.wordpress.quintessens.graph.teamroom;
import org.openntf.domino.graph2.annotations.AdjacencyUnique;
import org.openntf.domino.graph2.builtin.DVertexFrame;
import com.tinkerpop.blueprints.Direction;
import com.tinkerpop.frames.Property;
import com.tinkerpop.frames.modules.typedgraph.TypeValue;
#TypeValue("post")
public interface Post extends DVertexFrame {
#Property("$$Key")
public String getKey();
#Property("subject")
public String getSubject();
#Property("subject")
public void setSubject(String n);
// real edges!
#AdjacencyUnique(label = "hasWritten", direction = Direction.OUT)
public Iterable<Profile> getAuthors();
#AdjacencyUnique(label = "hasReaction", direction = Direction.IN)
public void addResponse(Response response);
#AdjacencyUnique(label = "hasReaction", direction = Direction.IN)
public void removeResponse(Response response);
#AdjacencyUnique(label = "hasReaction", direction = Direction.IN)
public Iterable<Response> getResponses();
}
Do you have a suggestion how I should adapt my code to make it work?
Unless OpenNTF or TinkerPop are doing some kind of magic with the supplied annotations, you are attempting to call a non-static method on an interface. Are you sure that you don't want to change:
Post.addResponse(vertexResponse);
to
post.addResponse(vertexResponse);