Making an HTTP request using Retrofit from a non-Activity class - Java

We used an Activity-BL-DAO-DB (SQLite) architecture while developing our app.
Due to a change in requirements, we now have to use a REST service from the server alone. I have looked at Retrofit for this, but I'm not sure how to use it in the DAO classes in place of the SQL queries.
We have looked into event-bus approaches, but they require more rework. We want to make minimal changes to the code to incorporate this change.
If anything else is needed, let me know.
Example:
The following is the sample flow that displays the list of technologies.
TechnologyActivity onCreate method:
techList=new ArrayList<Technology>();
techList=technologyBL.getAllTechnology(appId);
adapterTech=new TechnologyAdapter(this,new ArrayList<Technology> (techList));
listView.setAdapter(adapterTech);
Technology BL :
public List<Technology> getAllTechnology(String appId) {
techList=technologyDao.getAllTechnology(appId);
// some logic
return techList;
}
Technology DAO:
public List<Technology> getAllTechnology(String appId) {
//sql queries
return techList;
}
Technology Model:
class Technology{
String id,techName,techDescription;
//getters & setters
}
I have to replace the SQL queries with Retrofit requests. I have created the following Retrofit class and interface:
RestClient Interface:
public interface IRestClient {
@GET("/apps/{id}/technologies")
void getTechnologies(@Path("id") String id, Callback<List<Technology>> cb);
//Remaining methods
}
RestClient :
public class RestClient {
private static IRestClient REST_CLIENT;
public static final String BASE_URL = "http://16.180.48.236:22197/api";
Context context;
static {
setupRestClient();
}
private RestClient() {}
public static IRestClient get() {
return REST_CLIENT;
}
private static void setupRestClient() {
RestAdapter restAdapter = new RestAdapter.Builder()
.setEndpoint(BASE_URL)
.setClient(getClient())
.setRequestInterceptor(new RequestInterceptor() {
//cache related things
})
.setLogLevel(RestAdapter.LogLevel.FULL)
.build();
REST_CLIENT = restAdapter.create(IRestClient.class);
}
private static OkClient getClient(){
//cache related
}
}
I tried calling the service with both sync and async methods in the DAO. The sync call throws an error about doing network work on the main thread. The async call crashes because the response arrives only after the list has already been used.
Sync call in DAO:
techList = RestClient.get().getTechnologies(id);
Async call in DAO:
RestClient.get().getTechnologies(id, new Callback<List<Technology>>() {
@Override
public void success(List<Technology> technologies, Response response) {
techList = technologies;
}
@Override
public void failure(RetrofitError error) {}
});

You've got two options here.
The first is to create the Retrofit callback in the Activity:
RestClient.get().getTechnologies(id, new Callback<List<Technology>>() {
@Override
public void success(List<Technology> technologies, Response response) {
ArrayList<Technology> techList = technologyBL.someLogic(technologies);
adapterTech = new TechnologyAdapter(TechnologyActivity.this, techList); // not 'this': inside the callback, 'this' is the Callback
listView.setAdapter(adapterTech);
}
@Override
public void failure(RetrofitError error) {}
});
Note that you will have to extract your //some logic part into a separate BL method.
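A minimal sketch of that extracted BL method, assuming its body is whatever currently sits in your // some logic block:

public ArrayList<Technology> someLogic(List<Technology> technologies) {
    ArrayList<Technology> result = new ArrayList<Technology>(technologies);
    // apply the filtering/sorting/etc. that used to live in getAllTechnology()
    return result;
}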
The second option is to make the Retrofit API call return an RxJava Observable (which is integrated into Retrofit):
RestClient.get().getTechnologies(id)
.observeOn(AndroidSchedulers.mainThread())
.subscribe(new Action1<List<Technology>>() {
@Override
public void call(List<Technology> technologies) {
ArrayList<Technology> techList = technologyBL.someLogic(technologies);
adapterTech = new TechnologyAdapter(TechnologyActivity.this, techList);
listView.setAdapter(adapterTech);
}
});
In this case, your RestClient interface is:
public interface IRestClient {
@GET("/apps/{id}/technologies")
Observable<List<Technology>> getTechnologies(@Path("id") String id);
//Remaining methods
}
You can read more about it in the "SYNCHRONOUS VS. ASYNCHRONOUS VS. OBSERVABLE" section of http://square.github.io/retrofit/. Also, see these two blog posts to get your head around RxJava and Observables:

In my experience, I have found it useful to create a Service which executes calls to the Retrofit API by using custom AsyncTask implementations. This paradigm keeps all your data model interactions in one place (the service) and gets all the API calls off the main thread.
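As a rough illustration of that idea (using an IntentService here rather than a hand-rolled Service plus AsyncTask, since it already provides a worker thread; the action name and a synchronous getTechnologiesSync variant on the Retrofit interface are assumptions):

import android.app.IntentService;
import android.content.Intent;
import java.util.List;

public class TechnologyService extends IntentService {
    public static final String ACTION_TECH_LOADED = "tech_loaded"; // assumed action name

    public TechnologyService() {
        super("TechnologyService");
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        String appId = intent.getStringExtra("appId");
        // safe here: onHandleIntent already runs on a background thread
        List<Technology> technologies = RestClient.get().getTechnologiesSync(appId);
        Intent result = new Intent(ACTION_TECH_LOADED);
        // attach the result (e.g. make Technology Parcelable) and notify the UI
        sendBroadcast(result);
    }
}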

Related

Calling Spring controller method without going to internet

tldr: Is there a way to make an internal request (using the method's path) without going to the internet?
--
Why do I need it? I have a project which receives many events. The decision of who will handle each event is made by a Controller. So I have something similar to this:
@RestController
@RequestMapping("/events")
public class EventHandlerAPI {
@Autowired
private EventAHandler eventAhandler;
@Autowired
private EventBHandler eventBhandler;
@PostMapping("/a")
public void handleEventA(@RequestBody EventA event) {
eventAhandler.handle(event);
}
@PostMapping("/b")
public void handleEventB(@RequestBody EventB event) {
eventBhandler.handle(event);
}
}
We recently added support for receiving events through a queue service. It sends us the payload and the event class. Our decision was to keep both interfaces working (REST and queue). The solution to avoid code duplication was to keep the Controller choosing which handler will take care of the event. The code nowadays is similar to this:
@Configuration
public class EventHandlerQueueConsumer {
@Autowired
private EventHandlerAPI eventHandlerAPI;
private Map<Class, EventHandler> eventHandlers;
@PostConstruct
public void init() {
/* start listening to the queue */
declareEventHandlers();
}
private void declareEventHandlers() {
eventHandlers = new HashMap<>();
eventHandlers.put(EventA.class, (EventHandler<EventA>) eventHandlerAPI::handleEventA);
eventHandlers.put(EventB.class, (EventHandler<EventB>) eventHandlerAPI::handleEventB);
}
private void onEventReceived(AbstractEvent event) {
EventHandler eventHandler = eventHandlers.get(event.getClass());
eventHandler.handle(event);
}
private interface EventHandler<T extends AbstractEvent> {
void handle(T event);
}
}
This code works, but it doesn't let the controller choose who will handle the event (our intention). The decision is actually being made by the map.
What I would like to do is invoke the controller method through its request mapping without going to the internet. Something like this:
@Configuration
public class EventHandlerQueueConsumer {
// MADE UP CLASS TO SHOW WHAT I WANT
@Autowired
private ControllerInvoker controllerInvoker;
@PostConstruct
public void init() { /* start listen queue */ }
private void onEventReceived(AbstractEvent event) {
controllerInvoker.post(event.getPath(), new Object[] { event });
}
}
This way is much cleaner and let all the decisions be made by the controller.
I've researched a lot and didn't find a way to implement it. Debugging Spring, I found how it routes the request after the DispatcherServlet, but all the Spring internals use HttpServletRequest and HttpServletResponse :(
Is there a way to make an internal request (using the method's path) without going to the internet?
They are classes of the same application
Then it should be easy enough.
1) You can call your own API on http(s)://localhost:{port}/api/{path} using the RestTemplate utility class. This is the preferred way, since you'll follow the standard MVC pattern. Something like:
restTemplate.exchange(uri, HttpMethod.POST, httpEntity, ResponseClass.class);
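Fleshed out a little (a sketch only; the localhost URL, port, and EventA payload are assumptions based on the question's mappings):

private final RestTemplate restTemplate = new RestTemplate();

private void forwardEventA(EventA event) {
    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_JSON);
    HttpEntity<EventA> httpEntity = new HttpEntity<>(event, headers);
    // POSTs to the same application over the loopback interface
    restTemplate.exchange("http://localhost:8080/events/a",
            HttpMethod.POST, httpEntity, Void.class);
}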
2) If you don't want to invoke a network connection at all, then you can either use Spring's internals to find the mapping/method map, or use some reflection to build a custom map on the controller's startup. Then you can pass your event/object to the method from the map in the way shown in your mock-up class. Something like:
@RequestMapping("foo")
public void fooMethod() {
System.out.println("mapping = " + getMapping("fooMethod")); // you can get all methods/mappings in the @PostConstruct initialization phase
}
private String getMapping(String methodName) {
Method methods[] = this.getClass().getMethods();
for (int i = 0; i < methods.length; i++) {
if (methods[i].getName().equals(methodName)) {
String mapping[] = methods[i].getAnnotation(RequestMapping.class).value();
if (mapping.length > 0) {
return mapping[mapping.length - 1];
}
}
}
return null;
}

Map containing Retrofit API interface method from ServiceHelper

I am trying to follow the REST client implementation pattern described in the Google I/O (Dobjanschi) video here and am using Retrofit2 for the REST API calls.
Based on the REST client pattern described above, I introduced a ServiceHelper layer that calls the actual API method via Retrofit. However, I don't have a clean way to call the interface methods from the ServiceHelper layer.
I currently have an enum of the available API calls and pass that from the ServiceHelper. In my ApiProcessor I introduced a function that uses a giant if...else if ladder to return the appropriate Retrofit API interface call based on the enum passed in. I haven't found a better/cleaner approach to this.
Is there a better / cleaner way to map these? Or any other ideas to do this?
You should throw away that monolithic ServiceHelper and create several repositories following the repository pattern, in order to encapsulate and distribute responsibilities between classes.
Actually, the Retrofit API itself favors composition over inheritance, so you can easily create as many interfaces as needed and use each one in the right repository, as sketched below.
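A minimal sketch of one such repository (the UserApi interface and the names are illustrative, not from the question):

import java.util.List;
import retrofit2.Call;
import retrofit2.Callback;
import retrofit2.Retrofit;
import retrofit2.http.GET;

public class UserRepository {
    private final UserApi api; // a small Retrofit interface owned by this repository

    public UserRepository(Retrofit retrofit) {
        this.api = retrofit.create(UserApi.class);
    }

    public void getUsers(Callback<List<User>> callback) {
        api.getUsers().enqueue(callback);
    }

    // each repository declares only the endpoints it needs
    interface UserApi {
        @GET("users")
        Call<List<User>> getUsers();
    }
}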
Without the code it is a bit hard to "inspect" your solution. :)
As you describe it, this is not really the best way to solve the problem (in my opinion), although there are plenty of approaches along the lines of "if it works, it is OK".
In my opinion a slightly cleaner solution would be the following: your helper is a good thing. It should be used to hide all the details of the API you are using.
It is good to hide that API-specific stuff, because if the API changes you are only forced to change your helper/adapter. My recommendation is to use multiple methods in the ApiProcessor rather than an enum. It is a bit easier to maintain and fix if something changes, and you do not have to take care of the enum.
TL;DR: If it works, it is probably OK. You do not have to write million-dollar production code to test something, but if you want to build good habits you should consider refactoring that code into separate processor methods.
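For example, instead of one enum constant per call plus an if/else ladder, the processor can expose one plain method per call (a sketch, reusing a Retrofit 2 interface like the AppResource shown in the next answer):

public class ApiProcessor {
    private final AppResource resource;

    public ApiProcessor(Retrofit retrofit) {
        this.resource = retrofit.create(AppResource.class);
    }

    // one method per API call; the ServiceHelper calls these directly
    public Call<List<User>> getUsers() {
        return resource.getUsers();
    }
}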
You can follow the service pattern:
a) Create a resource interface for your exposed REST methods, e.g.:
public interface AppResource {
@Headers({"Accept: application/json", "Content-Type: application/json"})
@GET(ApiConstants.API_VERSION_V1 + "/users")
Call<List<User>> getUsers();
}
b) Create a RetrofitFactory:
public class RetrofitFactory {
private static Retrofit userRetrofit;
@NonNull
private static Retrofit initRetrofit(String serverUrl) {
final HttpLoggingInterceptor logging = new HttpLoggingInterceptor();
// set your desired log level
logging.setLevel(HttpLoggingInterceptor.Level.BODY);
final OkHttpClient okHttpClient = new OkHttpClient.Builder()
.connectTimeout(10, TimeUnit.SECONDS)
.readTimeout(30, TimeUnit.SECONDS)
.addInterceptor(new Interceptor() {
@Override
public Response intercept(Chain chain) throws IOException {
final Request original = chain.request();
final Request request = original.newBuilder()
.method(original.method(), original.body())
.build();
return chain.proceed(request);
}
})
.addInterceptor(logging)
.build();
return new Retrofit.Builder()
.baseUrl(serverUrl)
.addConverterFactory(JacksonConverterFactory.create())
.client(okHttpClient)
.build();
}
public static Retrofit getUserRetrofit() {
if (userRetrofit == null) {
final String serverUrl = context.getString(R.string.server_url); //Get context
userRetrofit = initRetrofit(serverUrl);
}
return userRetrofit;
}
}
c) Create an abstract BaseService which every service will extend:
public abstract class BaseService<Resource> {
protected final Resource resource;
final Retrofit retrofit;
public BaseService(Class<Resource> clazz) {
retrofit = RetrofitFactory.getUserRetrofit();
resource = retrofit.create(clazz);
}
protected <T> void handleResponse(Call<T> call, final ResponseHandler<T> responseHandler) {
call.enqueue(new Callback<T>() {
@Override
public void onResponse(final Call<T> call, final Response<T> response) {
if (response.isSuccessful()) {
if (responseHandler != null) {
responseHandler.onResponse(response.body());
}
} else {
final ErrorResponse errorResponse = parseError(response);
if (responseHandler != null) {
responseHandler.onError(errorResponse);
}
}
}
@Override
public void onFailure(final Call<T> call, final Throwable throwable) {
if (responseHandler != null) {
responseHandler.onFailure(throwable);
}
}
});
}
}
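BaseService refers to a ResponseHandler callback, an ErrorResponse model and a parseError helper that the answer does not show; a minimal sketch of the first two might be:

public interface ResponseHandler<T> {
    void onResponse(T response);
    void onError(ErrorResponse errorResponse);
    void onFailure(Throwable throwable);
}

public class ErrorResponse {
    private int code;
    private String message;
    // getters & setters
}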
d) Now your user service interface with its response handler:
public interface UserService {
void getUsers(ResponseHandler<List<User>> userListResponse);
}
e) Now your user service implementation class, which extends BaseService:
public class UserServiceImpl extends BaseService<AppResource> implements UserService {
public UserServiceImpl() {
super(AppResource.class);
}
@Override
public void getUsers(ResponseHandler<List<User>> userListResponse) {
final Call<List<User>> response = resource.getUsers();
handleResponse(response, userListResponse);
}
}
f) Create a service factory which you will reuse to call services, e.g.:
public class ServiceFactory {
private static UserService userService;
public static UserService getUserService() {
if (userService == null) {
userService = new UserServiceImpl();
}
return userService;
}
}
g) Now simply call the service and pass your response handler:
try {
ServiceFactory.getUserService().getUsers(getUserListResponseHandler());
} catch (Exception e) {
// log your exception
}

How to maintain versions of RESTful API in Java?

I'd like to implement versioning in my RESTful web service API. I intend to put the version into the URL, viz.: /api/v3/endpoint
What is the ideal way to do this (in Java)?
Although this irritates my aversion to manual version control, my best guess is to save the API interface into a new file and include a bunch of comments to defend against too much entropy:
/** Do not leave more than 2 previous versions in existence! **/
@Path("/api/v3")
public interface RestfulAPIv3
{
int version = 3;
@Path("/resources")
@GET
public Response getResources();
}
Of course the idea would be not to copy the implementation also, but to allow it to support multiple versions. This might require moving identical signatures forward to the newer versions so no collisions would happen across interfaces in the class file:
public class RestfulAPIImpl implements RestfulAPIv3, RestfulAPIv2
{
public Response getResources()
{
List<Resource> rs = HibernateHelper.getAll(Resource.class);
// Can we do something with v2/v3 diffs here?
}
@Deprecated
public Response getOptions() // ONLY in v2!
{
return HibernateHelper.getOptions();
}
}
Thinking it through, I have no idea how we'd know which version of an endpoint the client has called, except maybe forwarding the request into the methods which is not my favorite thing.
So, my question is - what have all the versioned API implementers been doing to keep all this stuff from getting out of hand? What's the best way to do this? Am I on the right track?
(Note: this other question is about the 'if' - my question is about the 'how'.)
An alternative to passing a version parameter forward is to add an annotation to the method, so that the framework automatically captures that information and saves it on a request object that can be read elsewhere.
Taking into account that your API might have requests with parameters that differ amongst versions, and also responses that look different, you might need multiple controllers and view-model classes, one for each version of the API.
UPDATE
As per request, follows some sample code (I've used Play Framework 2.4).
So the objective is to achieve something like this in a controller class:
@Versioned(version = 0.1)
public Result postUsers() {
// get post data
UsersService service = new UsersService(getContext());
service.postUsers(postData);
// return result
}
And like this in a service class:
public class UsersService extends Service {
public UsersService(RequestContext context) {
super(context);
}
public ReturnType postUsers() {
double apiVersion = getContext().getAPIVersion();
// business logic
}
}
In order to accomplish that, you would have a Versioned annotation:
import java.lang.annotation.*;
import play.mvc.With;
@With(VersioningController.class)
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface Versioned {
double version();
}
And a VersioningController:
import play.libs.F.Promise;
import play.mvc.*;
public class VersioningController extends Action<Versioned> {
public final static String API_VERSION = "version";
@Override
public Promise<Result> call(Http.Context context) throws Throwable {
context.args.put(API_VERSION, configuration.version());
return delegate.call(context);
}
}
And a RequestContext to help you manage that (you could also use the request context to manage the request timestamp, the user requesting the operation, etc):
public class RequestContext {
private double version;
public RequestContext(Double version) {
setAPIVersion(version);
}
public double getAPIVersion() {
return version;
}
public void setAPIVersion(double version) {
this.version = version;
}
}
Moreover, your controllers could have a GenericController from which they all extend:
import play.api.Play;
import play.mvc.*;
import play.mvc.Http.Request;
public abstract class GenericController extends Controller {
protected static RequestContext getContext() {
return new RequestContext(getAPIVersion());
}
protected static double getAPIVersion() {
return (double) Http.Context.current().args
.get(VersioningController.API_VERSION);
}
}
And an abstract Service from which all service classes extend:
public abstract class Service {
private RequestContext context;
public Service(RequestContext context) {
setContext(context);
}
public RequestContext getContext() {
return context;
}
public void setContext(RequestContext context) {
this.context = context;
}
}
Having all that said, keep in mind that it could be better to try to isolate the API versioning in as few layers as possible. If you can keep your business logic classes from having to manage API versions it's all the better.

Retrofit.Callback's success() and failure() in case of two interface implementations in same Activity

I am currently writing an app that connects to a server to make POST requests. For this reason, I have created multiple Retrofit interfaces for various network operations. I have one that performs registration: I take the username, e-mail, etc., make a POST request, and as the final parameter I pass a Callback (RegistrationResult is a POJO that holds "success" or "failure" in a class variable). This interface looks like this:
public interface RegistrationInterface {
@FormUrlEncoded
@POST("/api/apiregistration.php")
void connect(@Field("country") String country,
@Field("state") String state, @Field("email") String email,
@Field("pwd") String password,
Callback<RegistrationResult> resp);
}
Now, because I'm going to use GCM for push notifications, I have another interface that sends the registration id of the particular device to the server and, again, gets a response back. That interface looks like this:
public interface SendRegIDInterface {
@FormUrlEncoded
@POST("/api/apiregid.php")
void connect(@Field("reg_id") String reg_id,
Callback<RegIDResult> resp);
}
So far, so good. The problem arises when I try to create implementations of both interfaces in the same class. Let's say I have an Activity that, for some reason, should use implementations from both interfaces. So I have this:
public class MessageActivity extends Activity implements Callback {
public void onCreate(Bundle firebug) {
RestAdapter restAdapter1 = new RestAdapter.Builder().setEndpoint(endpoint).build();
RegistrationInterface regInter = restAdapter1.create(RegistrationInterface.class);
regInter.connect(/* parameters here */ MessageActivity.this);
RestAdapter restAdapter2 = new RestAdapter.Builder().setEndpoint(endpoint).build();
SendRegIDInterface regIDInter = restAdapter2.create(SendRegIDInterface.class);
regIDInter.connect(reg_id, MessageActivity.this);
}
@Override
public void failure(RetrofitError arg0) {
}
@Override
public void success(Object arg0, Response arg1) {
}
}
My problem is this: where do the overridden methods (failure and success) of the Retrofit.Callback interface correspond? Since I'm having two implementations of Retrofit interfaces in the same Activity, how can I distinguish what gets returned in, eg. the success() method? Is it the response from the RegistrationInterface implementation or the response from the SendRegIDInterface that's contained in the Callback's success() method's arguments? As long as I only had an implementation of the RegistrationInterface interface in the Activity, everything was clear: the success() method's arguments contain the server's response to the registration request. Now that I'm using a second interface implementation (SendRegIDInterface) I'm super-confused!
Or am I going completely wrong about this?
I think you need a bit more separation. If you want to place the callbacks in your Activity, the business logic will get mixed up too much with the UI-related things.
When I use Retrofit, I do it this way (I will demonstrate it with your code): First, I have a RegistrationClient.java, where I define all the endpoints for the API:
public interface RegistrationClient {
@FormUrlEncoded
@POST("/api/apiregistration.php")
void connect(@Field("country") String country,
@Field("state") String state, @Field("email") String email,
@Field("pwd") String password,
Callback<RegistrationResult> resp);
}
In this case, it is only one endpoint, but there will be cases, where there will be more, e.g.:
GET /persons/{id}
POST /persons/
PUT /persons/{id}
Once I have my client, I create a model for the interface. I would name it RegistrationModel in your case:
public class RegistrationModel {
private RegistrationClient _client;
public RegistrationModel() {
RestAdapter restAdapter = new RestAdapter.Builder() ... // create an adapter
_client = restAdapter.create(RegistrationClient.class);
}
public void connect(String country, String state, String email, String password) {
// you can add an additional callback parameter for returning info to the caller.
_client.connect(country, state, email, password, new Callback<RegistrationResult>() {
@Override
public void failure(RetrofitError error) {
// Do the essential things, and do a callback to the caller if needed
}
@Override
public void success(RegistrationResult result, Response response) {
// Do the essential things, and do a callback to the caller if needed
}
});
}
}
Then I would hold a singleton reference to each of the models, using my own service locator or dependency injection (with Dagger).
So from your activity, the call would be something like this:
...
ServiceLocator.getRegistrationModel().connect(country, state, email, password);
...
Or if you added your own callback:
...
ServiceLocator.getRegistrationModel().connect(country, state, email, password,
new RegistrationCallback() {
@Override
public void onResult(boolean result) { // the method name is up to your own callback interface
// do something.
}
});
...
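RegistrationCallback here would be your own small interface; for example (the onResult name is only illustrative):

public interface RegistrationCallback {
    void onResult(boolean result);
}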

DropWizard/Jersey API Clients

DropWizard uses Jersey under the hood for REST. I am trying to figure out how to write a client for the RESTful endpoints my DropWizard app will expose.
For the sake of this example, let's say my DropWizard app has a CarResource, which exposes a few simple RESTful endpoints for CRUDding cars:
@Path("/cars")
public class CarResource extends Resource {
// CRUDs car instances to some database (DAO).
public CarDao carDao = new CarDao();
@POST
public Car createCar(String make, String model, String rgbColor) {
Car car = new Car(make, model, rgbColor);
carDao.saveCar(car);
return car;
}
@GET
@Path("/make/{make}")
public List<Car> getCarsByMake(String make) {
List<Car> cars = carDao.getCarsByMake(make);
return cars;
}
}
So I would imagine that a structured API client would be something like a CarServiceClient:
// Packaged up in a JAR library. Can be used by any Java executable to hit the Car Service
// endpoints.
public class CarServiceClient {
public HttpClient httpClient;
public Car createCar(String make, String model, String rgbColor) {
// Use 'httpClient' to make an HTTP POST to the /cars endpoint.
// Needs to deserialize JSON returned from server into a `Car` instance.
// But also needs to handle if the server threw a `WebApplicationException` or
// returned a NULL.
}
public List<Car> getCarsByMake(String make) {
// Use 'httpClient' to make an HTTP GET to the /cars/make/{make} endpoint.
// Needs to deserialize JSON returned from server into a list of `Car` instances.
// But also needs to handle if the server threw a `WebApplicationException` or
// returned a NULL.
}
}
But the only two official references to DropWizard clients I can find are totally contradictory to one another:
DropWizard recommended project structure - which claims I should put my client code in a car-client project under car.service.client package; but then...
DropWizard Client manual - which makes it seem like a "DropWizard Client" is meant for integrating my DropWizard app with other RESTful web services (thus acting as a middleman).
So I ask, what is the standard way of writing Java API clients for your DropWizard web services? Does DropWizard have a client-library I can utilize for this type of use case? Am I supposed to be implementing the client via some Jersey client API? Can someone add pseudo-code to my CarServiceClient so I can understand how this would work?
Here is a pattern you can use using the JAX-RS client.
To get the client:
javax.ws.rs.client.Client init(JerseyClientConfiguration config, Environment environment) {
return new JerseyClientBuilder(environment).using(config).build("my-client");
}
You can then make calls the following way:
javax.ws.rs.core.Response post = client
.target("http://...")
.request(MediaType.APPLICATION_JSON)
.header("key", value)
.accept(MediaType.APPLICATION_JSON)
.post(Entity.json(myObj));
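The returned Response can then be checked and deserialized; for instance, assuming the endpoint returns a Car as JSON:

if (post.getStatus() == 200) {
    Car created = post.readEntity(Car.class);
}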
Yes, what dropwizard-client provides is only to be used by the service itself, most likely to communicate with other services. It doesn't provide anything for client applications directly.
It doesn't do much magic with HttpClients anyway. It simply configures the client according to the yml file, assigns the existing Jackson object mapper and validator to the Jersey client, and I think reuses the thread pool of the application. You can check all that in https://github.com/dropwizard/dropwizard/blob/master/dropwizard-client/src/main/java/io/dropwizard/client/JerseyClientBuilder.java
I think I'd go ahead and structure my classes as you did, using the Jersey client. The following is an abstract class I've been using for client services:
public abstract class HttpRemoteService {
private static final String AUTHORIZATION_HEADER = "Authorization";
private static final String TOKEN_PREFIX = "Bearer ";
private Client client;
protected HttpRemoteService(Client client) {
this.client = client;
}
protected abstract String getServiceUrl();
protected WebResource.Builder getSynchronousResource(String resourceUri) {
return client.resource(getServiceUrl() + resourceUri).type(MediaType.APPLICATION_JSON_TYPE);
}
protected WebResource.Builder getSynchronousResource(String resourceUri, String authToken) {
return getSynchronousResource(resourceUri).header(AUTHORIZATION_HEADER, TOKEN_PREFIX + authToken);
}
protected AsyncWebResource.Builder getAsynchronousResource(String resourceUri) {
return client.asyncResource(getServiceUrl() + resourceUri).type(MediaType.APPLICATION_JSON_TYPE);
}
protected AsyncWebResource.Builder getAsynchronousResource(String resourceUri, String authToken) {
return getAsynchronousResource(resourceUri).header(AUTHORIZATION_HEADER, TOKEN_PREFIX + authToken);
}
protected void isAlive() {
client.resource(getServiceUrl()).get(ClientResponse.class);
}
}
and here is how I make it concrete:
private class TestRemoteService extends HttpRemoteService {
protected TestRemoteService(Client client) {
super(client);
}
@Override
protected String getServiceUrl() {
return "http://localhost:8080";
}
public Future<TestDTO> get() {
return getAsynchronousResource("/get").get(TestDTO.class);
}
public void post(Object object) {
getSynchronousResource("/post").post(object);
}
public void unavailable() {
getSynchronousResource("/unavailable").get(Object.class);
}
public void authorize() {
getSynchronousResource("/authorize", "ma token").put();
}
}
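Usage would then look roughly like this (Jersey 1.x; note that Future.get() throws checked exceptions you would handle or declare):

Client client = Client.create();
TestRemoteService service = new TestRemoteService(client);
Future<TestDTO> futureDto = service.get();
TestDTO dto = futureDto.get(); // blocks until the asynchronous GET completes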
If anyone is trying to use DW 0.8.2 when building a client and you're getting the following error:
cannot access org.apache.http.config.Registry
class file for org.apache.http.config.Registry not found
at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:858)
at org.apache.maven.plugin.compiler.CompilerMojo.execute(CompilerMojo.java:129)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Update your dropwizard-client dependency in your pom.xml from 0.8.2 to 0.8.4 and you should be good. I believe a Jetty sub-dependency was updated, which fixed it.
<dependency>
<groupId>io.dropwizard</groupId>
<artifactId>dropwizard-client</artifactId>
<version>0.8.4</version>
<scope>compile</scope>
</dependency>
You can also integrate with the Spring Framework to implement this.
