Using RestTemplate with local json file - java

Background
I'm writing a web service that makes calls to an external API. This API has not yet been put into production. As such, when I make a call to my web service in my dev environment, I want to stub out the responses it returns. Note: this is not a unit-testing question.
My Solution So Far
The api calls are made using RestTemplate from the lovely Spring people, the url of which is held in application.properties. This has allowed me to set different urls for different environments using Active Profiles. So, for example, application-dev.properties holds a different url.
The dev url is ideally a pointer to a json file under resources/.
My Issue
I can't seem to get RestTemplate to pick up the local json file. The url I'm using is:
url = "file://staticJson.json"
However, that fails with:
Object of class [sun.net.www.protocol.ftp.FtpURLConnection] must be an instance of class java.net.HttpURLConnection
And now I'm unsure of how to proceed, or if this is even possible without extending RestTemplate.
Any directions to try or solutions would be fantastic.
If any more information is required I'll do my best to provide it asap.
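One profile-based way around the file:// URL, sketched under the assumption that the stub lives at src/main/resources/staticJson.json: in the dev profile, skip RestTemplate entirely and read the canned response from the classpath. The class and method names here are invented for illustration.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Dev-only stub: instead of pointing RestTemplate at a file:// URL, read the
// canned response straight from the classpath. In a Spring app this class
// would implement the same interface as the real RestTemplate-backed client
// and be selected with @Profile("dev"). Names are illustrative.
class StubApiClient {

    // e.g. readAll(getClass().getResourceAsStream("/staticJson.json"))
    static String readAll(InputStream in) throws IOException {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}
```

The real client and the stub would share one interface, so the rest of the service never knows which profile is active.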

Related

Is it possible to discover endpoint parameters in a Spring Boot application?

I have a Spring Boot application exposing some endpoints for REST requests. I am trying to decouple two components (which currently call each other's code directly), and I want them to make REST requests to each other (for an eventual microservice implementation). I know endpoint discovery can be done through a discovery service (e.g. Eureka), but is there a way to also communicate parameter information from the service which requires it to the client requesting it?
Example: I have a shopping cart service which relies on information about products available for purchase. Using Eureka (or another service discovery tool) I can register my services so that the shopping cart service is now aware of the product service. If the product service has something like:
@GetMapping("/product")
public Product findById(@RequestParam int id) {
    return products.find(id);
}
I know to use this endpoint to get product with ID 3 the cart service would have to make a request to http://localhost:1234/product?id=3, but is there a way that it can discover these parameters and their required types at run time? In my example, if I use Eureka, the shopping cart service is dynamically made aware of the product service's location, but is not made aware of the parameters that its endpoints will accept. I know Spring Boot Actuator is supposed to provide this information, but whenever I use it, the params field for my endpoints is always empty.
Is there a way that it can discover these parameters and their required types at run time?
Do you really need this information at runtime?
Because it would be much simpler if you need this at development time. Then you could just use something like Swagger, which will provide a service specification endpoint. It can even generate a client for your service.
But, if you really want it at runtime, you may be better served by a HATEOAS API. HATEOAS (Hypermedia as the Engine of Application State) means that your API is navigable: clients can "crawl" your API by following the links presented, like a human navigating through a website. How useful that is depends on what you are trying to achieve.
There are two popular "solutions" to this:
HAL (Hypertext Application Language) - Still an Internet Draft
JSON-LD (JSON for Linking Data) - A W3C Standard
Spring Boot has HAL support, see spring-boot-starter-hateoas.
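For illustration, a HAL response for the product resource above might look like this (the fields and link names are invented; only the _links convention comes from HAL):

```json
{
  "id": 3,
  "name": "Sample product",
  "_links": {
    "self": { "href": "http://localhost:1234/product?id=3" }
  }
}
```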
You can use a parameter map, which will be dynamically populated with all supplied parameters:
@GetMapping("/product")
public Product findById(@RequestParam Map<String, String> allRequestParams) {
    return products.find(Integer.parseInt(allRequestParams.get("id")));
}
You can iterate the parameter map to find which parameters are present.
You can use Swagger in your product service to generate a description of the APIs, exposed as a JSON file. For example, look at the sample petstore project, whose exposed JSON contains the APIs and their parameter definitions.
Then, in your shopping cart service, you can navigate the product service's APIs at runtime, iterating over the parameters of a specific API.
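For a sense of what that JSON contains, here is a trimmed, hand-written sketch in the Swagger 2.0 shape the petstore sample uses; a client could walk the paths entries at runtime to find each operation's parameters:

```json
{
  "paths": {
    "/product": {
      "get": {
        "parameters": [
          { "name": "id", "in": "query", "required": true, "type": "integer" }
        ]
      }
    }
  }
}
```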

How does Karate mock the external integration URLs? [duplicate]

This question already has an answer here:
In Karate API mocking not working as expected for me
(1 answer)
Closed 1 year ago.
I have a question about an external mocking server. My setup is:
I have an API which I want to test.
The service internally calls a database, gateways, and payment aggregators, which have their own URLs.
I control the mock URLs, which I can call. But if the call is initiated internally, how can I mock it without changing my code?
For example: I call the service.
I call the controller of the payment service, which I can mock.
But my controller calls a Java module which makes a call to a gateway.
I want to mock that gateway, not the controller. I have seen all the karate-netty and proxy examples. A proxy tracks all requests after host:port, but in my case the host will be the real host, so how will the proxy track it?
I have tried a lot but have not found a good solution.
This is not really a question about Karate; it is about making your code more testable.
Your controller that calls other services has to know how to reach them. I would expect that you have at least some sort of configuration file where all the URLs and other application properties are specified.
In a more complex environment, I would expect some sort of service discovery, with Consul for instance.
The simplest thing you can do is read a system or environment property in your controller to make the service URL configurable.
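A minimal sketch of that last suggestion, assuming a property named gateway.url (the name and default are invented): the real deployment uses the default, while the Karate test run passes -Dgateway.url=http://localhost:8090 to point the code at the mock server.

```java
// Resolve the gateway URL from a system property, then an environment
// variable, then a built-in default. The property name "gateway.url" and
// the default URL are illustrative.
class GatewayConfig {

    static String gatewayUrl() {
        String fromSystem = System.getProperty("gateway.url");
        if (fromSystem != null) return fromSystem;
        String fromEnv = System.getenv("GATEWAY_URL");
        if (fromEnv != null) return fromEnv;
        return "https://payments.example.com"; // production default
    }
}
```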

Handling Post Requests

Yesterday I started brainstorming for a project of mine, and I'm not sure if my approach is correct.
On my website I have a (kind of) order form which sends a POST to a target URL; this works with a simple cURL PHP script. The target is an external service (where I have no access, no rights, nothing). I only know that I will get back a POST from the service with the further-processed data, which I have to save into my DB.
In steps:
1. User fills out the (order) form, which posts the data to the external URL from my website.
2. The data gets handled externally; when that finishes, the service sends a POST back.
3. Read the incoming POST data.
4. Save the data into the DB.
5. Show a success page on my website.
My thought was to handle the incoming data with a servlet (Spring Maven project), but I'm not sure if this is the correct approach. Is there a better way to do this? Or is the first step, with the PHP script, wrong? Thanks for any help.
The simplest workflow could be:
1. Forward the initial request (the order form with its values) to a servlet.
2. Inside this servlet, invoke a POST request to the external URL using Java (using Apache HttpClient, or a library such as HtmlUnit).
3. Once you get the incoming response in your servlet, update your database.
If you are using Spring, the controller could forward the initial request to a business class, which would handle this post-processing and delegate the database update to the respective DAO.
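Step 3 (reading the incoming POST) might look like this, assuming the external service sends an ordinary form-encoded body; the field names are invented:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.LinkedHashMap;
import java.util.Map;

// Parse a form-encoded callback body into key/value pairs, ready to hand to
// a DAO for the database update. The actual field names are whatever the
// external service sends back.
class CallbackParser {

    static Map<String, String> parse(String body) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String pair : body.split("&")) {
            int eq = pair.indexOf('=');
            try {
                fields.put(URLDecoder.decode(pair.substring(0, eq), "UTF-8"),
                           URLDecoder.decode(pair.substring(eq + 1), "UTF-8"));
            } catch (UnsupportedEncodingException e) {
                throw new IllegalStateException(e); // UTF-8 always exists
            }
        }
        return fields;
    }
}
```

In a Spring MVC controller you would normally let @RequestParam binding do this for you; the point is only that the incoming callback is a plain POST your servlet can read.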
There are a number of suitable ways to handle this, and the decision is largely a matter of preference and what you're familiar with. Spring can handle this sort of job quite well.
Note: Maven is a build system for Java and some other JVM languages. I recommend using it, but it's not part of Spring; what you're probably looking for is Spring MVC.

How do I differentiate between GET & PUT request transaction in New relic, when URL is same?

I am monitoring my web application (CXF, Spring, Hibernate) running on Tomcat 7, using New Relic Java agent 2.18.0.
My services have endpoint URLs that are the same for GET and PUT requests; only the HTTP method differs. When I look at "web transactions" I see only one URL instead of separate URLs per method, so I cannot tell whether the GET requests or the PUT requests are slow. Is there any config/hack with which I can split transactions further by HTTP method instead of just by URL string?
Unfortunately your best bet is to make an API call to set the transaction name to what you want.
https://newrelic.com/docs/java/naming-web-transactions
If you think automatically adding the method would be a valuable feature, I suggest submitting a feature request to New Relic support.
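A sketch of that API call, made from a servlet filter wrapping your CXF endpoints; the helper class and its naming scheme are invented, and the NewRelic call itself requires the newrelic-api jar on the classpath:

```java
// Build a per-HTTP-method transaction name, e.g. "GET /product" vs
// "PUT /product", and hand it to the agent from a servlet filter:
//
//   NewRelic.setTransactionName(null, TransactionNames.of(
//       request.getMethod(), request.getRequestURI()));
//
// The naming convention below is illustrative.
class TransactionNames {

    static String of(String httpMethod, String requestUri) {
        return httpMethod.toUpperCase() + " " + requestUri;
    }
}
```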

Can I write a Java loader class that will hook HTTP requests in the loaded class?

I have a class that I want to hook and redirect HTTP requests in.
I also have a loader class already written, but all it does is replace the functions that contain the HTTP requests I want to change.
Is there a way to hook HTTP requests in Java so that I can redirect them all more easily?
Sort of like a proxy-wrapper.
Clarification:
The app sends out a GET or POST request to a URL.
I need the content to remain the same, just change the URL.
DNS redirects won't work, the Host HTTP header needs to be correct for the new server.
PS: This is a Desktop App, not a server script.
A cumbersome but reliable way of doing this would be to make your application use a proxy server, and then write a proxy server which makes the changes you need. The proxy server could be in-process in your application; it wouldn't need to be a separate program.
To use a proxy, set a couple of system properties - http.proxyHost and http.proxyPort. Requests made via HttpURLConnection will then use that proxy (unless they specifically override the default proxy settings). Requests made using some other method, like Apache HttpClient, will not, I think, be affected; but hopefully all your requests use HttpURLConnection.
To implement the proxy, if you're using a Sun JRE, then you should probably use the built-in HTTP server; set up a single handler mapped to the path "/", and this will pick up all requests being sent by your app, and can then determine the right URL to send them to, and make a connection to that URL (with all the right headers too). To make the connection, use URL.openConnection(Proxy.NO_PROXY) to avoid making a request to the proxy and so getting caught in an infinite loop. You'll then need to pump input and output between the two sockets.
The only other way I can think of would be to override HttpURLConnection with a new handler that steers requests to your desired destination; you'd need to find a way to persuade the URL class to use your handler instead of the default one. I don't know how you'd do that in a clean way.
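A skeleton of the in-process proxy idea above, using the JDK's built-in HTTP server. This shows only the wiring: a real version would rewrite the URL, open it with URL.openConnection(Proxy.NO_PROXY), copy the headers across, and pump the bodies between the connections; here the handler just echoes what it intercepted. (Note that by default http.nonProxyHosts exempts localhost from the proxy.)

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

// In-process "rewriting proxy" skeleton - a sketch of the approach described
// above, not a complete proxy implementation.
class RewritingProxy {

    static HttpServer start() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            // A real proxy would pick the new destination here and forward
            // the request via URL.openConnection(Proxy.NO_PROXY).
            byte[] body = ("intercepted " + exchange.getRequestURI()).getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
        // Route the app's HttpURLConnection traffic through this server.
        System.setProperty("http.proxyHost", "localhost");
        System.setProperty("http.proxyPort",
                String.valueOf(server.getAddress().getPort()));
        return server;
    }
}
```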
While an older post, this should give some ideas of the kinds of bytecode injection that can be performed: Java Programming: Bytecode Injection. Another tool is Javassist, and you may be able to find some links in the Aspect-oriented programming wiki article (look at the bytecode weavers section).
There are some products which extensively dynamically modify code.
Depending upon what is desired, there may be ... less painful ... methods. If you simply want to 'hook' HTTP requests, another option is just to use a proxy (which could be an external process) and funnel through that. Using a proxy would likely require control over the name resolution used.
You can use servlet filters, which intercept requests; from there the requests can be wrapped, redirected, forwarded, or completed.
http://www.oracle.com/technetwork/java/filters-137243.html
Do you control all of the code? If so, I suggest using Dependency Injection to inject the concrete implementation you want, which would allow you to instead inject a proxy class.
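A sketch of that injection idea, with all names invented: the HTTP call hides behind an interface, and the injected wrapper rewrites the URL before delegating, leaving the request content untouched.

```java
import java.util.function.Function;

// Hide the HTTP call behind an interface so a URL-rewriting wrapper can be
// injected in place of the real client. All names are illustrative.
interface Http {
    String get(String url);
}

class UrlRewritingHttp implements Http {

    private final Http delegate;
    private final Function<String, String> rewrite;

    UrlRewritingHttp(Http delegate, Function<String, String> rewrite) {
        this.delegate = delegate;
        this.rewrite = rewrite;
    }

    @Override
    public String get(String url) {
        // Only the URL changes; the content is whatever the delegate sends.
        return delegate.get(rewrite.apply(url));
    }
}
```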
If you can change the source code, just change it and add your extra code on each HTTP request.
If you can't change the source code, but it uses dependency injection, perhaps you can inject something to catch requests.
Otherwise: use aspect-oriented programming and hook the URL class, or whatever you use to make HTTP requests. @AspectJ (http://www.eclipse.org/aspectj/doc/next/adk15notebook/ataspectj.html) is quite easy and powerful.
