I want to make a POST request with URL Query Params set to the values of an object.
For example
http://test/data?a=1&b=2&c=3
I want to make a post request to this URL with a class like this:
public class Data {
private Integer a;
private Integer b;
private Integer c;
}
I do NOT want to do each field manually, like this:
public void sendRequest(Data data) {
String url = UriComponentsBuilder.fromHttpUrl("http://test/")
.queryParam("a", data.getA())
.queryParam("b", data.getB())
.queryParam("c", data.getC())
.toUriString();
restTemplate.postForObject(url, body, Void.class);
}
Instead, I want to use the entire object:
public void sendRequest(Data data) {
String url = UriComponentsBuilder.fromHttpUrl("http://test/")
.queryParamsAll(data) //pseudo
.toUriString();
restTemplate.postForObject(url, body, Void.class);
}
Your requirement is similar to QS in JavaScript. Take a look at qianshui423/qs, an implementation of QS in Java. First, git clone it and build it with the command below (JDK 8 required); you will get a jar called "qs-1.0.0.jar" in build/libs.
# cd qs directory
./gradlew build -x test
Import the jar; a simple demo is below. For your requirement, you can write a class that converts your object into a QSObject. Besides toQString, QS can also parse a query string back into a QSObject, which I find quite powerful.
import com.qs.core.QS;
import com.qs.core.model.QSObject;
public class Demo {
public static void main(String[] args) throws Exception{
QSObject qsobj = new QSObject();
qsobj.put("a",1);
qsobj.put("b",2);
qsobj.put("c",3);
String str = QS.toQString(qsobj);
System.out.println(str); // output is a=1&b=2&c=3
}
}
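To bridge the original question's Data object to a QSObject, one possible approach (a minimal sketch that is not part of the QS library; the toQSObject helper and the Data constructor used below are hypothetical) is to copy the POJO's fields via reflection:
import java.lang.reflect.Field;
import com.qs.core.QS;
import com.qs.core.model.QSObject;

public class QueryStringMapper {
    // Hypothetical helper: copies every non-null field of a POJO into a QSObject.
    static QSObject toQSObject(Object obj) throws IllegalAccessException {
        QSObject qsObject = new QSObject();
        for (Field field : obj.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            Object value = field.get(obj);
            if (value != null) {
                qsObject.put(field.getName(), value);
            }
        }
        return qsObject;
    }

    public static void main(String[] args) throws Exception {
        Data data = new Data(1, 2, 3); // assumes Data has such a constructor
        String query = QS.toQString(toQSObject(data));
        System.out.println(query); // a=1&b=2&c=3 (field order follows reflection order)
    }
}
The resulting string can then be appended to the base URL before calling restTemplate.postForObject, as in the question.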
When working with gRPC, we need to generate the gRPC client and server interfaces from our .proto service definition via the protocol buffer compiler (protoc) or a Gradle or Maven protoc build plugin.
Current flow: protobuf file -> Java code -> gRPC client.
So, is there any way to skip this step?
How can I create a generic gRPC client that calls the server directly from the protobuf file, without compiling it into Java code?
Or, is there a way to generate the code at runtime?
Expected flow: protobuf file -> gRPC client.
I want to build a generic gRPC client system whose input is protobuf files plus a description of the method, package, request message, and so on, without having to recompile for each protobuf.
Thank you very much.
Protobuf systems really need protoc to be run. However, the generated code could be skipped. Instead of passing something like --java_out and --grpc_java_out to protoc you can pass --descriptor_set_out=FILE which will parse the .proto file into a descriptor file. A descriptor file is a proto-encoded FileDescriptorSet. This is the same basic format as used with the reflection service.
Once you have a descriptor, you can load it one FileDescriptor at a time and create DynamicMessages.
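For example (a minimal sketch assuming the descriptor file was produced with --descriptor_set_out and the .proto has no imports; the file names and the "message" field below are placeholders, and checked exceptions are left to the caller):
// Parse the file written by: protoc --descriptor_set_out=descriptor.pb my_service.proto
DescriptorProtos.FileDescriptorSet set = DescriptorProtos.FileDescriptorSet.parseFrom(
        Files.newInputStream(Paths.get("descriptor.pb")));

// Build a FileDescriptor; for simplicity this assumes the .proto has no dependencies.
Descriptors.FileDescriptor fileDescriptor =
        Descriptors.FileDescriptor.buildFrom(set.getFile(0), new Descriptors.FileDescriptor[0]);

// Create a DynamicMessage for one of its message types.
Descriptors.Descriptor messageType = fileDescriptor.getMessageTypes().get(0);
DynamicMessage request = DynamicMessage.newBuilder(messageType)
        .setField(messageType.findFieldByName("message"), "hello") // placeholder field name
        .build();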
Then for the gRPC piece, you need to create a gRPC MethodDescriptor.
static MethodDescriptor<DynamicMessage, DynamicMessage> from(
    Descriptors.MethodDescriptor methodDesc
) {
  return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
      // UNKNOWN is fine, but the "correct" value can be computed from
      // methodDesc.toProto().getClientStreaming()/getServerStreaming()
      .setType(getMethodTypeFromDesc(methodDesc))
      .setFullMethodName(MethodDescriptor.generateFullMethodName(
          methodDesc.getService().getFullName(), methodDesc.getName()))
      .setRequestMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getInputType())))
      .setResponseMarshaller(ProtoUtils.marshaller(
          DynamicMessage.getDefaultInstance(methodDesc.getOutputType())))
      .build();
}

static MethodDescriptor.MethodType getMethodTypeFromDesc(
    Descriptors.MethodDescriptor methodDesc
) {
  if (!methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.UNARY;
  } else if (methodDesc.isServerStreaming()
      && !methodDesc.isClientStreaming()) {
    return MethodDescriptor.MethodType.SERVER_STREAMING;
  } else if (!methodDesc.isServerStreaming()) {
    return MethodDescriptor.MethodType.CLIENT_STREAMING;
  } else {
    return MethodDescriptor.MethodType.BIDI_STREAMING;
  }
}
At that point you have everything you need and can call Channel.newCall(method, CallOptions.DEFAULT) in gRPC. You're also free to use ClientCalls to use something more similar to the stub APIs.
So dynamic calls are definitely possible, and they are used for tools like grpcurl. But it is not easy, so it is generally only done when necessary.
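As a brief illustration of the ClientCalls route (a sketch; an open channel, the MethodDescriptor built above (here called method), and a request DynamicMessage are assumed to exist, and the method is assumed to be unary):
// Unary call with DynamicMessage request/response types.
DynamicMessage response = ClientCalls.blockingUnaryCall(
        channel, method, CallOptions.DEFAULT, request);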
I did it in Java, and the steps are:
Call the reflection service to get the FileDescriptorProto list by method name
Get the FileDescriptor of the method from the FileDescriptorProto list by package name and service name
Get the MethodDescriptor from the ServiceDescriptor, which is obtained from the FileDescriptor
Generate a MethodDescriptor<DynamicMessage, DynamicMessage> from that MethodDescriptor
Build the request DynamicMessage from content such as JSON
Call the method
Parse the response DynamicMessage back into JSON
You can find the full sample in the project helloworlde/grpc-java-sample#reflection
And the proto is:
syntax = "proto3";
package io.github.helloworlde.grpc;
option go_package = "api;grpc_gateway";
option java_package = "io.github.helloworlde.grpc";
option java_multiple_files = true;
option java_outer_classname = "HelloWorldGrpc";
service HelloService{
rpc SayHello(HelloMessage) returns (HelloResponse){
}
}
message HelloMessage {
string message = 2;
}
message HelloResponse {
string message = 1;
}
Start a server for this proto yourself; the full code in Java looks like this:
import com.google.protobuf.ByteString;
import com.google.protobuf.DescriptorProtos;
import com.google.protobuf.Descriptors;
import com.google.protobuf.DynamicMessage;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.TypeRegistry;
import com.google.protobuf.util.JsonFormat;
import io.grpc.CallOptions;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.MethodDescriptor;
import io.grpc.protobuf.ProtoUtils;
import io.grpc.reflection.v1alpha.ServerReflectionGrpc;
import io.grpc.reflection.v1alpha.ServerReflectionRequest;
import io.grpc.reflection.v1alpha.ServerReflectionResponse;
import io.grpc.stub.ClientCalls;
import io.grpc.stub.StreamObserver;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
@Slf4j
public class ReflectionCall {
public static void main(String[] args) throws InterruptedException {
// The reflection symbol only supports the package.service.method or package.service format
String methodSymbol = "io.github.helloworlde.grpc.HelloService.SayHello";
String requestContent = "{\"message\": \"Reflection\"}";
// Build the Channel
ManagedChannel channel = ManagedChannelBuilder.forAddress("127.0.0.1", 9090)
.usePlaintext()
.build();
// Use the Channel to build the reflection stub
ServerReflectionGrpc.ServerReflectionStub reflectionStub = ServerReflectionGrpc.newStub(channel);
// Response observer
StreamObserver<ServerReflectionResponse> streamObserver = new StreamObserver<ServerReflectionResponse>() {
@Override
public void onNext(ServerReflectionResponse response) {
try {
// We only care about file-descriptor responses
if (response.getMessageResponseCase() == ServerReflectionResponse.MessageResponseCase.FILE_DESCRIPTOR_RESPONSE) {
List<ByteString> fileDescriptorProtoList = response.getFileDescriptorResponse().getFileDescriptorProtoList();
handleResponse(fileDescriptorProtoList, channel, methodSymbol, requestContent);
} else {
log.warn("Unexpected response type: " + response.getMessageResponseCase());
}
} catch (Exception e) {
log.error("Failed to handle the response: {}", e.getMessage(), e);
}
}
@Override
public void onError(Throwable t) {
}
@Override
public void onCompleted() {
log.info("Complete");
}
};
// Request observer
StreamObserver<ServerReflectionRequest> requestStreamObserver = reflectionStub.serverReflectionInfo(streamObserver);
// Build and send the request that asks for the file descriptors containing the method symbol
ServerReflectionRequest getFileContainingSymbolRequest = ServerReflectionRequest.newBuilder()
.setFileContainingSymbol(methodSymbol)
.build();
requestStreamObserver.onNext(getFileContainingSymbolRequest);
channel.awaitTermination(10, TimeUnit.SECONDS);
}
/**
* Handle the response
*/
private static void handleResponse(List<ByteString> fileDescriptorProtoList,
ManagedChannel channel,
String methodFullName,
String requestContent) {
try {
// Parse the method and service names
String fullServiceName = extraPrefix(methodFullName);
String methodName = extraSuffix(methodFullName);
String packageName = extraPrefix(fullServiceName);
String serviceName = extraSuffix(fullServiceName);
// Resolve the FileDescriptor from the response
Descriptors.FileDescriptor fileDescriptor = getFileDescriptor(fileDescriptorProtoList, packageName, serviceName);
// Look up the service descriptor
Descriptors.ServiceDescriptor serviceDescriptor = fileDescriptor.getFile().findServiceByName(serviceName);
// Look up the method descriptor
Descriptors.MethodDescriptor methodDescriptor = serviceDescriptor.findMethodByName(methodName);
// Execute the call
executeCall(channel, fileDescriptor, methodDescriptor, requestContent);
} catch (Exception e) {
log.error(e.getMessage(), e);
}
}
/**
* Parse and find the file descriptor for the method
*/
private static Descriptors.FileDescriptor getFileDescriptor(List<ByteString> fileDescriptorProtoList,
String packageName,
String serviceName) throws Exception {
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap =
fileDescriptorProtoList.stream()
.map(bs -> {
try {
return DescriptorProtos.FileDescriptorProto.parseFrom(bs);
} catch (InvalidProtocolBufferException e) {
e.printStackTrace();
}
return null;
})
.filter(Objects::nonNull)
.collect(Collectors.toMap(DescriptorProtos.FileDescriptorProto::getName, f -> f));
if (fileDescriptorProtoMap.isEmpty()) {
log.error("Service not found");
throw new IllegalArgumentException("File descriptor for the method not found");
}
// Find the FileDescriptorProto that contains the service
DescriptorProtos.FileDescriptorProto fileDescriptorProto = findServiceFileDescriptorProto(packageName, serviceName, fileDescriptorProtoMap);
// Resolve this proto's dependencies
Descriptors.FileDescriptor[] dependencies = getDependencies(fileDescriptorProto, fileDescriptorProtoMap);
// Build the FileDescriptor for this proto
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Find the file descriptor by package name and service name
*/
private static DescriptorProtos.FileDescriptorProto findServiceFileDescriptorProto(String packageName,
String serviceName,
Map<String, DescriptorProtos.FileDescriptorProto> fileDescriptorProtoMap) {
for (DescriptorProtos.FileDescriptorProto proto : fileDescriptorProtoMap.values()) {
if (proto.getPackage().equals(packageName)) {
boolean exist = proto.getServiceList()
.stream()
.anyMatch(s -> serviceName.equals(s.getName()));
if (exist) {
return proto;
}
}
}
throw new IllegalArgumentException("Service not found");
}
/**
* Get the prefix (everything before the last dot)
*/
private static String extraPrefix(String content) {
int index = content.lastIndexOf(".");
return content.substring(0, index);
}
/**
* Get the suffix (everything after the last dot)
*/
private static String extraSuffix(String content) {
int index = content.lastIndexOf(".");
return content.substring(index + 1);
}
/**
* Resolve the dependency descriptors
*/
private static Descriptors.FileDescriptor[] getDependencies(DescriptorProtos.FileDescriptorProto proto,
Map<String, DescriptorProtos.FileDescriptorProto> finalDescriptorProtoMap) {
return proto.getDependencyList()
.stream()
.map(finalDescriptorProtoMap::get)
.map(f -> toFileDescriptor(f, getDependencies(f, finalDescriptorProtoMap)))
.toArray(Descriptors.FileDescriptor[]::new);
}
/**
* Convert a FileDescriptorProto into a FileDescriptor
*/
@SneakyThrows
private static Descriptors.FileDescriptor toFileDescriptor(DescriptorProtos.FileDescriptorProto fileDescriptorProto,
Descriptors.FileDescriptor[] dependencies) {
return Descriptors.FileDescriptor.buildFrom(fileDescriptorProto, dependencies);
}
/**
* Execute the method call
*/
private static void executeCall(ManagedChannel channel,
Descriptors.FileDescriptor fileDescriptor,
Descriptors.MethodDescriptor originMethodDescriptor,
String requestContent) throws Exception {
// Regenerate the MethodDescriptor
MethodDescriptor<DynamicMessage, DynamicMessage> methodDescriptor = generateMethodDescriptor(originMethodDescriptor);
CallOptions callOptions = CallOptions.DEFAULT;
TypeRegistry registry = TypeRegistry.newBuilder()
.add(fileDescriptor.getMessageTypes())
.build();
// Convert the request content from a JSON string into the corresponding message type
JsonFormat.Parser parser = JsonFormat.parser().usingTypeRegistry(registry);
DynamicMessage.Builder messageBuilder = DynamicMessage.newBuilder(originMethodDescriptor.getInputType());
parser.merge(requestContent, messageBuilder);
DynamicMessage requestMessage = messageBuilder.build();
// Make the call; the call style can be inferred from originMethodDescriptor.isClientStreaming() and originMethodDescriptor.isServerStreaming()
DynamicMessage response = ClientCalls.blockingUnaryCall(channel, methodDescriptor, callOptions, requestMessage);
// Print the response as a JSON string
JsonFormat.Printer printer = JsonFormat.printer()
.usingTypeRegistry(registry)
.includingDefaultValueFields();
String responseContent = printer.print(response);
log.info("Response: {}", responseContent);
}
/**
* Regenerate the method descriptor
*/
private static MethodDescriptor<DynamicMessage, DynamicMessage> generateMethodDescriptor(Descriptors.MethodDescriptor originMethodDescriptor) {
// Generate the full method name
String fullMethodName = MethodDescriptor.generateFullMethodName(originMethodDescriptor.getService().getFullName(), originMethodDescriptor.getName());
// Request and response marshallers
MethodDescriptor.Marshaller<DynamicMessage> inputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getInputType())
.buildPartial());
MethodDescriptor.Marshaller<DynamicMessage> outputTypeMarshaller = ProtoUtils.marshaller(DynamicMessage.newBuilder(originMethodDescriptor.getOutputType())
.buildPartial());
// Generate the method descriptor; the fullMethodName from originMethodDescriptor is not correct
return MethodDescriptor.<DynamicMessage, DynamicMessage>newBuilder()
.setFullMethodName(fullMethodName)
.setRequestMarshaller(inputTypeMarshaller)
.setResponseMarshaller(outputTypeMarshaller)
// Use UNKNOWN; it is adjusted automatically
.setType(MethodDescriptor.MethodType.UNKNOWN)
.build();
}
}
There isn't much to prevent this technically. The two big hurdles are:
having a runtime-callable parser for reading the .proto, and
having a general purpose gRPC client available that takes things like the service method name as literals
Both are possible, but neither is trivial.
For 1, the crude way would be to shell/invoke protoc using the descriptor-set option to generate a schema binary, then deserialize that as a FileDescriptorSet (from descriptor.proto); this model gives you access to how protoc sees the file. Some platforms also have native parsers (essentially reimplementing protoc as a library in that platform), for example protobuf-net.Reflection does this in .NET-land
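A rough Java sketch of that crude approach (hedged: the protoc binary is assumed to be on the PATH, the file names are placeholders, and error handling is omitted):
// Run protoc to emit a descriptor set, then parse it with the protobuf runtime.
Process protoc = new ProcessBuilder(
        "protoc", "--descriptor_set_out=descriptor.pb", "--include_imports", "my_service.proto")
        .inheritIO()
        .start();
protoc.waitFor();

DescriptorProtos.FileDescriptorSet descriptorSet;
try (InputStream in = Files.newInputStream(Paths.get("descriptor.pb"))) {
    descriptorSet = DescriptorProtos.FileDescriptorSet.parseFrom(in);
}
// Each FileDescriptorProto in descriptorSet can then be turned into a
// Descriptors.FileDescriptor (resolving dependencies first), as shown in the answers above.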
For 2, here's an implementation of that in C#. The approach should be fairly portable to Java, even if the details vary. You can look at a generated implementation to see how it works in any particular language.
(Sorry that the specific examples are C#/.NET, but that's where I live; the approaches should be portable, even if the specific code is not directly.)
Technically, both are possible.
The codegen simply generates a handful of classes: mainly protobuf messages, gRPC method descriptors, and stubs. You can implement them by hand or check in the generated code to bypass the codegen. I am not sure what the benefit of doing this is, to be honest. Also, it will be very annoying whenever the proto changes.
It is also possible to do it dynamically using bytecode generation, as long as you check in some interfaces/abstract classes to represent those generated stubs/method descriptors and protobuf messages. You have to make sure that the non-dynamic code stays in sync with the proto definition, though (most likely via a runtime check/exception).
How do I go about configuring the classpath when using the scripts package with atom/java?
I know my classpath is:
usr/local/algs4/algs4.jar
Here is the code I am testing with:
import edu.princeton.cs.algs4.*;
public class Wget {
public static void main(String[] args) {
// read in data from URL
String url = args[0];
In in = new In(url);
String data = in.readAll();
// write data to a file
String filename = url.substring(url.lastIndexOf('/') + 1);
Out out = new Out(filename);
out.println(data);
out.close();
}
}
Since you're using algs4, use Princeton's site and search for "classpath".
http://algs4.cs.princeton.edu/code/
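For reference, compiling and running from a terminal with an explicit classpath typically looks like the commands below (a hedged sketch: the jar path comes from the question, <url> is a placeholder, and the Atom script package has its own setting for passing these flags):
# Linux/macOS; on Windows use ';' instead of ':' as the path separator
javac -cp .:/usr/local/algs4/algs4.jar Wget.java
java  -cp .:/usr/local/algs4/algs4.jar Wget <url>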
Most probably this issue occurs because JSONObject (org.json.JSONObject) is incompatible with the Cloudant library.
Is there an alternative way, or another object I can use?
I am using the Cloudant library below:
<dependency>
<groupId>com.cloudant</groupId>
<artifactId>cloudant-client</artifactId>
<version>2.6.2</version>
</dependency>
Here is my code
package data.repositories;
import org.json.JSONObject;
import com.cloudant.client.api.*;
import com.cloudant.client.api.CloudantClient;
import com.cloudant.client.api.Database;
import com.cloudant.client.api.model.Response;
import util.Config;
public class DatabaseRepository {
CloudantClient client = ClientBuilder.account(Config.CLOUDANT_ACCOUNT_NAME)
.username(Config.CLOUDANT_USER_NAME)
.password(Config.CLOUDANT_PASSWORD).build();
public DatabaseRepository() {
}
public void Save(String dbName) {
Database db = client.database("dbTempName", true);
JSONObject jsonObject = new JSONObject("{hello: data}");
db.save(jsonObject);
}
}
The document saved in the Cloudant database is:
{
"_id": "1c7f223f74a54e7c9f4c8a713feaa537",
"_rev": "1-a3cd12379eec936b61f899c8278c9d62",
"map": {
"hello": "data"
}
}
I'm not familiar with Cloudant, but my guess is that JSONObject has a property called "map" that holds your JSON string data (probably a myArray property too), and Cloudant serializes the whole object into JSON, thus adding those unnecessary values.
My suggestions:
1) Try to save your JSON string directly, like db.save("{hello: data}"), to avoid serialization.
2) If you really need to create a JSONObject, try to customize Cloudant's serialization process to avoid those extra fields.
In response to the comment:
From what I read here, I think you need a POJO which, when serialized into JSON, would look like:
{ 'hello' : 'data' }
which is something like:
public class MyClass implements Serializable {
String hello;
public MyClass(String hello) {
this.hello = hello;
}
public String getHello() {
return hello;
}
}
Then save it like:
db.save(new MyClass("data"));
Or you can use a HashMap instead of a POJO:
Map<String, Object> map = new HashMap<>();
map.put("hello", "data");
db.save(map);
Look at the example in the README for the repo. It shows that you want a POJO, but you don't have to implement Serializable. Just create a class that has _id and _rev properties that are Strings, then add JavaScript-object-compatible properties as desired.
// A Java type that can be serialized to JSON
public class ExampleDocument {
private String _id = "example_id";
private String _rev = null;
private boolean isExample;
public ExampleDocument(boolean isExample) {
this.isExample = isExample;
}
public String toString() {
return "{ id: " + _id + ",\nrev: " + _rev + ",\nisExample: " + isExample + "\n}";
}
}
// Create an ExampleDocument and save it in the database
db.save(new ExampleDocument(true));
Although I haven't tried it, the HashMap approach may also work, as discussed in this tutorial: https://www.ibm.com/blogs/bluemix/2014/07/cloudant_on_bluemix/.
// create a simple doc to place into your new database
Map<String, Object> doc = new HashMap<String, Object>();
doc.put("_id", UUID.randomUUID().toString());
doc.put("season", "summer");
doc.put("climate", "arid");
dbc.create(doc);
It seems the question used org.json.JSONObject, which is not compatible with the Cloudant client library. I tried Google's Gson object and it works well for me.
The issue was resolved by using com.google.gson.JsonObject instead of org.json.JSONObject.
The corrected full code is given below:
Database db = client.database("dbTempName", true);
// Used google.gson.JsonObject instead of org.json.JSONObject.
com.google.gson.JsonParser parser = new com.google.gson.JsonParser();
com.google.gson.JsonObject jsonObject = parser.parse("{\"hello\": \"data\"}").getAsJsonObject();
db.save(jsonObject);
I did a Java OCR project with Tesseract in Mirth. When I run the jar file from Mirth, I get this error. When I searched for it, I found that there is an init() method in Tesseract.java and that it is a protected void method; I think that may be the reason for the error.
What should I do? Thank you very much for your help.
package Tess4jTest;
import java.io.File;
import java.io.IOException;
import net.sourceforge.tess4j.*;
public class TestTess {
public static String Tc;
public static String phone;
public static String date;
public static void main(String[] args) {
//System.out.println(returnText("C:\\Users\\Nevzat\\Desktop\\deneme.pdf"));
}
public static String returnText(String fileName){
File imageFile = new File(fileName);
if(imageFile.exists()){
Tesseract instance = new Tesseract();
instance.setDatapath("C:\\imageRAD\\Onam\\tessdata");
String result = null;
try {
result = instance.doOCR(imageFile);
} catch (TesseractException e) {
System.err.println(e.getMessage());
}
if(result!=null){
int i=result.indexOf("Numarasn: ");
int j=result.indexOf("Tel No:");
int k=result.indexOf("Bilgllendirme Tarihl:");
Tc = result.substring(i+10, i+21);
phone = result.substring(j+8,j+23);
date = result.substring(k+22,k+32);
//System.out.println(result);
}else{
return "Null Error!";
}
}else{
return "Does not found a file!";
}
return Tc+","+phone+","+date;
}
public static String returnTC() throws IOException{
return Tc;
}
public static String returnPhone() throws IOException{
return phone;
}
public static String returnDate() throws IOException{
return date;
}
}
The error you got occurs when you try to create an object with a private constructor. (<init>() is the name of a constructor with no parameters)
Looking at the tess4j source I found a method with the following documentation:
@deprecated As of Release 2.0, use default constructor instead.
Looking at the source before 2.0 reveals the default constructor was private.
This means your problem is most likely that you are compiling against a version newer than 2.0, but your environment is running one older than 2.0.
Either update your environment or downgrade the library you build against to fix it.
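In practice that means one of these two forms, depending on which tess4j version ends up on the runtime classpath (a sketch; check the version you actually deploy):
// tess4j 2.0 and later: the default constructor is public
Tesseract instance = new Tesseract();

// tess4j before 2.0: the no-arg constructor was private, so the (now deprecated)
// singleton accessor was used instead
Tesseract legacyInstance = Tesseract.getInstance();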
I solved the error and finished the project. Step by step:
1. You have to use the right jar files for tess4j.
2. Add everything in tess4j-3.2.1.zip to the Java project via Build Path, except tess4j-3.2.1.jar.
3. Add tess4j-1.5.jar from this.
4. Add the tessdata folder, ghost4j-0.5.1.jar, jna-4.1.jar, tess4j.jar, and the jar file of your Java project.
I have a class CollectionObject which creates an ArrayList.
public class CollectionObject {
private List<String> collectionObject;
public CollectionObject() {
collectionObject = new ArrayList<String>();
}
public List<String> getCollectionObject() {
return collectionObject;
}
public void add(final String stringToWrite) throws VerifyException {
collectionObject.add(stringToWrite);
}
}
There is another class which takes a CollectionObject and writes the contents of a file into it.
public class ReaderFileWriterObjectService {
private BufferedReader bufferedReader;
private CollectionObject collectionObject;
private String line;
public CollectionObject getCollectionObjectAfterWritingFromAFile(final File file)
throws VerifyException, IOException {
collectionObject = new CollectionObject();
bufferedReader = new BufferedReader(new FileReader(file));
while ((line = bufferedReader.readLine()) != null) {
collectionObject.add(line);
}
bufferedReader.close();
return collectionObject;
}
}
How do I test and mock the method of the ReaderFileWriterObjectService class?
Let me complement @LouisWasserman's answer.
You just cannot test APIs which rely on java.io.File; this class cannot be reliably unit tested (even though it is not even final at the JDK level).
But this is not the case with the new filesystem API, which appeared with Java 7.
Also known as JSR 203, this API provides a unified API to any storage medium providing "filesystem objects".
Short story:
a "filesystem object" is materialized by a Path in this API;
any JDK implementing JSR 203 (ie, any Java 7+ version) supports this API;
to get a Path from a resource on the default FileSystem, you can use Paths.get();
but you are not limited to that.
In short, in your API and test case, you should use Path, not File. And if you want to test anything related to some filesystem resource, use the JDK's Files class to test Path instances.
And you can create FileSystems other than your main, disk-based file system. Recommendation: use this.
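As a sketch of what that refactor could look like (hedged: the Path-based signature is my suggestion, not the original API, and the temp-file usage in the comments is only illustrative):
// Suggested Path-based variant of the method from the question.
public CollectionObject getCollectionObjectAfterWritingFromAFile(final Path path)
        throws VerifyException, IOException {
    CollectionObject collectionObject = new CollectionObject();
    for (String line : Files.readAllLines(path)) {
        collectionObject.add(line);
    }
    return collectionObject;
}

// In a test, any Path works, for example a temporary file:
// Path tmp = Files.createTempFile("collection", ".txt");
// Files.write(tmp, Arrays.asList("first", "second"));
// CollectionObject result = service.getCollectionObjectAfterWritingFromAFile(tmp);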
I am doing the same thing, and the following idea is working; I hope this will work for you too:
@InjectMocks
private CollectionObject collectionObject;
@Test
public void getCollectionObjectAfterWritingFromAFile() throws Exception {
CollectionObject expectedObject =new CollectionObject();
List<String> expectedList=new ArrayList<String>();
expectedList.add("100");
CollectionObject resultObject =new CollectionObject();
BufferedReader reader=new BufferedReader(new StringReader("100"));
PowerMockito.mock(BufferedReader.class);
PowerMockito.mock(FileReader.class);
PowerMockito.whenNew(FileReader.class).withArguments("test10.csv").thenReturn(null);
PowerMockito.whenNew(BufferedReader.class).withArguments(null).thenReturn(reader);
resultObject=collectionObject.getCollectionObjectAfterWritingFromAFile( "test10.csv");
assertEquals(expectedObject ,resultObject );
}
You can use JUnit's TemporaryFolder for creating a file and copy the contents from a resource to it.
public class YourTest {
@Rule
public TemporaryFolder folder = new TemporaryFolder();
@Test
public void checkSomething() throws Exception {
InputStream resource = getClass().getResourceAsStream("/your/resource");
File file = folder.newFile();
Files.copy(resource, file.toPath(), StandardCopyOption.REPLACE_EXISTING);
ReaderFileWriterObjectService service = ...
CollectionObject collection = service
.getCollectionObjectAfterWritingFromAFile(file);
...
}
You cannot. You're pretty much out of luck. A better design would accept a Java 7 java.nio.file.FileSystem and a Path that could be swapped out for a test implementation, e.g. https://github.com/google/jimfs.
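For illustration, an in-memory filesystem test could look like this (a sketch assuming the service is refactored to accept a Path and that Jimfs is on the test classpath):
import com.google.common.jimfs.Configuration;
import com.google.common.jimfs.Jimfs;
import java.nio.file.FileSystem;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

// Inside a test method:
try (FileSystem fs = Jimfs.newFileSystem(Configuration.unix())) {
    Path input = fs.getPath("/input.txt");
    Files.write(input, Arrays.asList("first line", "second line"));

    // Hypothetical Path-accepting variant of the method under test
    CollectionObject result = service.getCollectionObjectAfterWritingFromAFile(input);
}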
Okay, first let's consider what you want to test. If it is a unit test, then you don't want to test integrations like communication with the filesystem; you have to test your own logic, and your logic is something like:
1) Read the next line from the file using the filesystem integration
2) Add this line into my object
The second step you should not test, because this method is too simple to break. The first step you can't test, because it performs an integration call. So I don't think you need a unit test here.
But if your logic becomes more complicated, you can introduce an interface wrapper and mock it in your test:
public interface FileWrapper {
    String readLine();
    void close();
}
public class FileWrapperImpl implements FileWrapper {
    private final BufferedReader reader;
    public FileWrapperImpl(File file) throws IOException {
        this.reader = new BufferedReader(new FileReader(file));
    }
    @Override
    public String readLine() {
        try {
            return reader.readLine();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
    @Override
    public void close() {
        try {
            reader.close();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
And then your ReaderFileWriterObjectService:
public CollectionObject getCollectionObjectAfterWritingFromAFile(FileWrapper wrapper) {
CollectionObject collectionObject = new CollectionObject();
String line;
while ((line = wrapper.readLine()) != null) {
collectionObject.add(line);
}
wrapper.close();
return collectionObject;
}
And now you can easily mock FileWrapper in a test and pass it to your service.
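For example, with Mockito (a sketch; the stubbed lines and the assertion values are illustrative):
// readLine() returns "first", then "second", then null to end the loop.
FileWrapper wrapper = Mockito.mock(FileWrapper.class);
Mockito.when(wrapper.readLine()).thenReturn("first", "second", null);

ReaderFileWriterObjectService service = new ReaderFileWriterObjectService();
CollectionObject result = service.getCollectionObjectAfterWritingFromAFile(wrapper);

assertEquals(Arrays.asList("first", "second"), result.getCollectionObject());
Mockito.verify(wrapper).close();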
I'd suggest changing the API to accept Reader or BufferedReader - those can be mocked. Hide the dependency on file with a factory.
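A quick sketch of that direction (hedged: the Reader-accepting method below is a suggested overload, not the original API):
// Reader-accepting variant; in a test, feed it a StringReader instead of a real file.
public CollectionObject readFrom(BufferedReader reader) throws IOException {
    CollectionObject collectionObject = new CollectionObject();
    String line;
    while ((line = reader.readLine()) != null) {
        collectionObject.add(line);
    }
    return collectionObject;
}

// In a test:
// CollectionObject result = service.readFrom(new BufferedReader(new StringReader("100")));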