Mock data for client calling to spring web service - java

I need to provide mock data to a client calling Spring RESTful web services. I know that for unit testing purposes we can use mocks, but my case is not testing.

Use hard-coded data, an external data file, or an external data source to store the mock data. I understand the need to host a service that responds but may not be fully wired up, to allow early integration with downstream clients. These are the techniques I use; each has pros and cons (a sketch of the first two follows the list).
Hard-coded data - as you say, not intuitive or easy to change. Okay as a temporary state.
External data file - can be updated dynamically as needed
External data source - lets you create multiple scenarios with dynamic mock payloads, and change them on demand
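As a rough illustration of the first two options, here is a minimal sketch assuming a Spring Boot application; the endpoint paths and the mock-customers.json classpath file are made-up names for illustration only.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;

    import org.springframework.core.io.ClassPathResource;
    import org.springframework.http.MediaType;
    import org.springframework.util.StreamUtils;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class MockCustomerController {

        // Option 1: hard-coded payload - fine as a temporary state.
        @GetMapping(value = "/customers/42", produces = MediaType.APPLICATION_JSON_VALUE)
        public String hardCodedCustomer() {
            return "{\"id\": 42, \"name\": \"Test Customer\"}";
        }

        // Option 2: external data file on the classpath - editable without recompiling.
        @GetMapping(value = "/customers", produces = MediaType.APPLICATION_JSON_VALUE)
        public String customersFromFile() throws IOException {
            ClassPathResource resource = new ClassPathResource("mock-customers.json");
            return StreamUtils.copyToString(resource.getInputStream(), StandardCharsets.UTF_8);
        }
    }

The external data source option works the same way, except the handler reads from a database or similar store, so payloads can be switched per scenario at runtime.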

Related

mock a web service only for test purpose

I am working on integration tests, getting responses from an API A.
API A interacts with another API B, which in turn calls a web service to get data from it.
The problem is that the data may change in the future, so the integration tests may fail, and whenever the data changes I have to edit the tests to make them work.
I want to mock the web service from which I get the data, but I don't know how to tell API B to call the mock only for tests.
Does anyone have an idea about the best way to do this?
You can use tools like http://rest-assured.io/ or http://wiremock.org/.
With these, your API calls are made the same way they normally would be (you probably just need to change the hostname). You can then return a specific result for a given URI, Content-Type, etc.
It is even possible to do assertions, to see whether the request actually took place, and to do some checking on the request content.
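For example, a minimal WireMock sketch of this idea; the port and the /api/b/data path are assumptions made for illustration, not anything API B actually exposes.

    import static com.github.tomakehurst.wiremock.client.WireMock.*;

    import com.github.tomakehurst.wiremock.WireMockServer;

    public class MockedWebServiceExample {
        public static void main(String[] args) {
            // Start WireMock on a fixed port and point API B's configuration
            // at this host/port for the tests (the port is an arbitrary choice).
            WireMockServer server = new WireMockServer(8089);
            server.start();

            // Return a canned JSON body for a made-up path the real web service would serve.
            server.stubFor(get(urlEqualTo("/api/b/data"))
                    .willReturn(aResponse()
                            .withStatus(200)
                            .withHeader("Content-Type", "application/json")
                            .withBody("{\"value\": \"stable test data\"}")));

            // ... run the integration tests against API A here ...

            // Optionally assert that the stubbed endpoint was actually called.
            server.verify(getRequestedFor(urlEqualTo("/api/b/data")));
            server.stop();
        }
    }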

Microservices - Stubbing/Mocking

I am developing a product using microservices and am running into a bit of an issue. In order to do any work, I need to have all 9 services running on my local development environment. I am using Cloud Foundry to run the applications, but when running locally I am just running the Spring Boot jars themselves. Is there any way to set up a more lightweight environment so that I don't need everything running? Ideally, I would like only the service I am currently working on to have to be real.
I believe this is a matter of your testing strategy. If you have a lot of microservices in your system, it is not wise to always perform end-to-end testing at development time -- it costs you productivity, and the setup is usually complex (as you have observed).
You should really think about what it is you want to test. Within one service, it is usually good to decouple the core logic from the integration points with other services. Ideally, you should be able to write simple unit tests for your core logic. If you want to test the integration points with other services, use a mock library (a quick Google search shows this to be promising: http://spring.io/blog/2007/01/15/unit-testing-with-stubs-and-mocks/).
If you don't have one already, I would highly recommend setting up a separate staging area with all microservices running. You should perform all your end-to-end testing there, before deploying to production.
This post from Martin Fowler has a more comprehensive take on microservice testing strategy:
https://martinfowler.com/articles/microservice-testing
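To make the "mock the integration points" advice above concrete, here is a minimal Mockito-style sketch; OrderService, PricingClient and their methods are hypothetical names invented for the example.

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.*;

    import org.junit.Test;

    public class OrderServiceTest {

        // Hypothetical integration point with another microservice.
        interface PricingClient {
            double priceFor(String sku);
        }

        // Hypothetical core logic that depends on the integration point.
        static class OrderService {
            private final PricingClient pricing;
            OrderService(PricingClient pricing) { this.pricing = pricing; }
            double total(String sku, int quantity) { return pricing.priceFor(sku) * quantity; }
        }

        @Test
        public void totalUsesMockedPricingService() {
            // The remote service is replaced by a mock, so no other service needs to run.
            PricingClient pricing = mock(PricingClient.class);
            when(pricing.priceFor("ABC")).thenReturn(2.50);

            OrderService service = new OrderService(pricing);

            assertEquals(5.00, service.total("ABC", 2), 0.001);
            verify(pricing).priceFor("ABC");
        }
    }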
It boils down to the testing technique that you use. Here is my recent answer on another topic that you may find useful: https://stackoverflow.com/a/44486519/2328781.
In general, I think that Wiremock is a good choice because of the following reasons:
It has out-of-the-box support from Spring Boot.
It has out-of-the-box support from Spring Cloud Contract, which makes it possible to use a very powerful technique called Consumer Driven Contracts.
It has a recording feature. Set up your Wiremock as a proxy and make requests through it. This will generate stubs for you automatically based on your requests and responses (a sketch of this follows).
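A rough sketch of that recording setup, assuming the plain WireMock Java API; the target URL is a placeholder.

    import com.github.tomakehurst.wiremock.WireMockServer;

    public class RecordingExample {
        public static void main(String[] args) {
            // Start WireMock and record traffic proxied to the real service
            // (http://real-service.example.com is a placeholder URL).
            WireMockServer server = new WireMockServer(8089);
            server.start();
            server.startRecording("http://real-service.example.com");

            // ... point your application at http://localhost:8089 and exercise it ...

            // Stop recording; the generated stub mappings are written under the
            // server's files root and can be replayed later without the real service.
            server.stopRecording();
            server.stop();
        }
    }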
There are multiple tools out there that let you create mocked versions of your microservices.
When I encountered this exact problem myself I decided to create my own tool which is tailored for microservice testing. The goal is to never have to run all microservices at once, only the one that you are working on.
You can read more about the tool and how to use it to mock microservices here: https://mocki.io/mock-api-microservices. If you only want to run them locally, that is possible using the open-source CLI tool.
It can be solved if your microservices allow passing metadata along with requests.
A good microservice architecture should use central service discovery, and every service should be able to accept a metadata map along with the request payload. Known fields of this map can be interpreted and modified by a service and then passed on to the next service.
The most popular use of per-request metadata is request tracing (i.e. collecting the tree of nodes used to process a request, with timings for every node), but it can also be used to tell the entire system which nodes to use.
Thus the plan is:
register your local node in the dev environment's service discovery
send a request to the entry node of your system along with metadata telling every service to use your local instance instead of the default one
the metadata will propagate and your local node will be called by the dev environment; your local node then passes the processed results back to the dev environment (see the sketch after this list)
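As a rough illustration of such routing metadata in a Spring-based service, the sketch below forwards a custom header on outgoing calls; the X-Route-Override header name and the ThreadLocal holder are invented for the example, not a standard mechanism.

    import java.io.IOException;

    import org.springframework.http.HttpRequest;
    import org.springframework.http.client.ClientHttpRequestExecution;
    import org.springframework.http.client.ClientHttpRequestInterceptor;
    import org.springframework.http.client.ClientHttpResponse;

    // Forwards a routing hint so downstream services know to call the
    // developer's local instance instead of the default one.
    public class RouteOverrideInterceptor implements ClientHttpRequestInterceptor {

        public static final String HEADER = "X-Route-Override";
        // Populated from the incoming request (e.g. by a servlet filter) before outgoing calls are made.
        public static final ThreadLocal<String> CURRENT = new ThreadLocal<>();

        @Override
        public ClientHttpResponse intercept(HttpRequest request, byte[] body,
                                            ClientHttpRequestExecution execution) throws IOException {
            String override = CURRENT.get(); // e.g. "orders-service=http://my-laptop:8080"
            if (override != null) {
                request.getHeaders().add(HEADER, override);
            }
            return execution.execute(request, body);
        }
    }

Each service would register this interceptor on its RestTemplate and copy the header from incoming requests into the holder, so the override propagates along the whole call tree.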
Alternatively:
use code generation for inter-service communication to reduce the risk of failures caused by mistakes in RPC code
resort to integration tests, mocking all client APIs for the microservice under development
fully automate deployment of your system to your local machine. You will possibly need to run nodes with reduced memory (which is generally OK, as memory is commonly consumed only under load) or buy more RAM.
An approach would be to use / deploy an app which maps paths / URLs to JSON response files. I personally haven't used it, but I believe http://wiremock.org/ might help you.
For Java microservices, you should try Stubby4j. This will mock the JSON responses of other microservices using the Stubby server. If you feel that mocking is not enough to cover all the features of your microservices, you should set up a local Docker environment to deploy the dependent microservices.

Backup and restore entities from Google App engine

I am looking to implement a way to transfer data from one application to another programmatically in Google App Engine.
I know there is a way to achieve this with the database_admin console, but that process is very time-inefficient.
I am currently implementing this with the use of Google Cloud Storage (GCS), but that involves querying the data, saving it to GCS, and then reading it from GCS in the other app and restoring it.
Please let me know if anyone knows a simpler way of transferring data between two applications programmatically.
Thanks!
Haven't tried this myself but it sounds like it should work: use the Datastore Admin to back up your objects to GCS from one app, then use your other app to restore that file from GCS. This should be a good method if you only require a one-time sync.
If you need to constantly replicate data from one app to another, introducing REST endpoints at one or both sides could help:
https://code.google.com/p/appengine-rest-server/ (this is in Python, I know, but just define a version of your app for the REST endpoint)
You just need to make sure your model definitions match on both sides (pretty much update the app on both sides with the same deployment code) and only have the side that needs to sync data track the time of the last sync and use the REST endpoints to pull in new data. Cron jobs can do this.
Alternatively, create a PostPut callback on all of your models to make a POST call to the proper REST endpoint on the other app every time a model is written to your datastore.
You can batch update with one method, or keep a constantly updated version with the other method (at the expense of more calls).
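The original suggestion is a datastore post-put hook; as a rough Java-flavoured analogue, here is a sketch using a JPA lifecycle callback. The Note entity, the target URL and the JSON shape are all hypothetical, and error handling is only hinted at.

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.PostPersist;
    import javax.persistence.PostUpdate;

    @Entity
    public class Note {

        @Id
        private Long id;
        private String text;

        // After every write, push the entity to the other app's REST endpoint.
        @PostPersist
        @PostUpdate
        private void replicateToOtherApp() {
            try {
                URL url = new URL("https://other-app.appspot.com/api/notes"); // placeholder URL
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                conn.setRequestProperty("Content-Type", "application/json");
                String json = "{\"id\": " + id + ", \"text\": \"" + text + "\"}";
                conn.getOutputStream().write(json.getBytes(StandardCharsets.UTF_8));
                conn.getResponseCode(); // actually send the request
                conn.disconnect();
            } catch (IOException e) {
                // log, and decide whether a failed sync should fail the write
            }
        }
    }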
Are you trying to move data between two App Engine applications, or trying to export all your data from App Engine so you can move to a different hosting system? Your question doesn't have enough information to understand what you're attempting to do. Based on the vague requirements, I would say typically this would be handled with a web service that you write in one application to expose the data, which the other application calls to consume the data. I'm not sure why Cloud Endpoints was downvoted, because that provides a nice way to expose your data as a JSON-based web service with a minimum of coding fuss.
I'd recommend adding some more details into your question like exactly what are you trying to accomplish and maybe a mock data sample.
You could create a backup of your data using bulkloader and then restore it on another application.
Details here:
https://developers.google.com/appengine/docs/python/tools/uploadingdata?csw=1#Python_Downloading_and_uploading_all_data
Refer to this post if you are using Java:
Downloading Google App Engine Database
I don't know if this could be suitable for your concrete scenario, but Google Cloud Endpoints are definitely a simple way of transferring data programmatically from Google App Engine.
This is essentially Google's implementation of REST web services, so they allow you to share resources using URLs. This is still an experimental technology, but as far as I've worked with them, they work perfectly. Moreover, they are very well integrated with GAE and the Google Plugin for Eclipse.
You can automatically generate an endpoint from a JDO persistent class and I think you can automatically generate the client libraries as well (although I didn't try that).
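For a feel of what such an endpoint looks like, here is a minimal sketch using the Cloud Endpoints annotations; the RecordEndpoint class, the Record type and the lookup logic are invented for illustration.

    import javax.inject.Named;

    import com.google.api.server.spi.config.Api;
    import com.google.api.server.spi.config.ApiMethod;

    @Api(name = "records", version = "v1")
    public class RecordEndpoint {

        // Simple value type returned to the other application as JSON.
        public static class Record {
            private final Long id;
            private final String payload;
            public Record(Long id, String payload) { this.id = id; this.payload = payload; }
            public Long getId() { return id; }
            public String getPayload() { return payload; }
        }

        // Exposed as an HTTPS/JSON method; the other app (or a generated client
        // library) can call it to pull data out of this application.
        @ApiMethod(name = "records.get", httpMethod = ApiMethod.HttpMethod.GET)
        public Record getRecord(@Named("id") Long id) {
            // Look the entity up with JDO / the Datastore here.
            return new Record(id, "example payload");
        }
    }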

JNDI In .Net / PHP / IPhone

Let me explain the complete situation I am currently stuck in.
We are developing a very complex application in GWT and Hibernate, and we are trying to host the client and server code on different servers because of a client requirement. I have been able to achieve this using JNDI.
Here comes the tricky part: the client also needs the application on other platforms, say an iPhone or .NET version of our application, with the same database and the same methods. We don't want to generate the server code again because it's going to be the same for all of them.
I have tried a web services wrapper on top of my server code, but because of the complexity of the architecture and the class dependencies I am not able to do so. For example, let's consider the code below.
class Document {
    List<User> users;
    List<AccessLevels> accessLevels;
}
The Document class has a list of users, a list of access levels, and many more lists of other classes, and those other classes have more lists of their own. Some important server methods take a class (Document or any other) as input and return some other class as output. And we shouldn't use such a complex architecture in web services.
So, I need to stick with JNDI. Now, I don't know how I can make a JNDI call from any other application.
Please suggest ways to overcome this situation. I am open to technology changes, whether JNDI, web services, or any other technology that serves me well.
Thanking You,
Regards,
I have never seen JNDI used as a mechanism for request/response inter-process communication. I don't believe that this will be a productive line of attack.
You believe that Web Services are inappropriate when the payloads are complex. I disagree; I have seen many successful projects using quite large payloads, with many nested classes. Trivial example: Customers with Orders with Order Lines with Products with ... and so on.
It is clearly desirable to keep payload sizes small; there are serialization and network costs, and big objects will be more expensive. But it's far preferable to have one big request than lots of little ones. A "busy" interface will not perform well across a network.
I suspect that the one problem you may have is that certain of the server-side classes are not pure data: they refer to classes that only make sense on the server, and you don't want those classes in your client.
In this case you need to build an "adapter" layer. This is dull work, but no matter what inter-process communication technique you use, you will need to do it. You need what I refer to as Data Transfer Objects (DTOs) - these represent payloads that are understood by the client, using only classes reasonable for the client, and which the server can consume and create.
Let's suppose that you use technology XXX (JNDI, Web Service, direct socket call, JMS):
Client --- sends Document DTO ---XXX---> Adapter transforms DTO into the server's Document
and similarly in reverse. My claim is that no matter which XXX is chosen, you have the same problem: you need the client to work with "cut-down" objects that reveal none of the server's implementation details.
The adapter has responsibility for creating and understanding DTOs.
I find that working with RESTful web services using JAX-RS is very easy; once you have a set of DTOs, creating the web services is the work of minutes.
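A minimal sketch of that DTO/adapter split with JAX-RS, to make it concrete; DocumentDto, DocumentAdapter and the resource path are hypothetical names, and the mapping logic is elided.

    import java.util.List;

    import javax.ws.rs.Consumes;
    import javax.ws.rs.POST;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    // DTO: only data the client needs, no server-only types.
    public class DocumentDto {
        public String title;
        public List<String> userNames;
        public List<String> accessLevels;
    }

    // Adapter: owns the translation between the DTO and the server's Document.
    class DocumentAdapter {
        Document toServerDocument(DocumentDto dto) {
            Document doc = new Document();
            // map dto.userNames / dto.accessLevels onto the server-side lists here
            return doc;
        }
    }

    // JAX-RS resource: the wire-level entry point the non-Java clients call.
    @Path("/documents")
    class DocumentResource {
        private final DocumentAdapter adapter = new DocumentAdapter();

        @POST
        @Consumes(MediaType.APPLICATION_JSON)
        @Produces(MediaType.APPLICATION_JSON)
        public DocumentDto save(DocumentDto dto) {
            Document doc = adapter.toServerDocument(dto);
            // call the existing server logic with doc, then map the result back to a DTO
            return dto;
        }
    }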

Simpler-than-JBoss way to have POJO RPC in Java with container session management

Currently, I know only one way of doing RPC for POJOs in Java, and that is the very complex EJB/JBoss solution.
Is there any better way of providing similar functionality with a thinner layer (within or without a Java EE container), using RMI or something that can serialize and send full-blown objects over the wire?
I'm not currently interested in HTTP/JSON serialization, BTW.
EDIT: For clarification: I'm trying to replace an old EJB 2.1/JBoss 4 solution with something easier to manage at the container level. I need to have entire control over the database (planning to use iBATIS, which would allow me to use fairly complex SQL very easily), but the only things I want to keep over the wire are:
Invocation of lookup/data modification methods (automagic serialization goes here).
Transparent session control (authentication/authorization). I still have to see how to accomplish this.
Both items have to work as a whole, of course. No access should be granted to users without credentials.
Because I'm not very fond of writing webapps, I plan to build a GUI (Swing or SWT) that would only manage POJOs, do some reporting and invoke methods from the container. I want the serialization to be as easy as possible.
As is nearly always the case, Spring comes to the rescue. From the reference documentation, you will want to read Chapter 17. Remoting and web services using Spring.
There are several methods to choose from. The beauty of Spring is that all your interfaces and implementations are vanilla POJOs. The wiring into RMI or whatever is handled by Spring. You can:
1. Export services using RMI: probably the simplest approach;
2. Use HTTP invoker: if remote access is an issue, this might be better for firewalls, etc. than pure RMI; or
3. Use Web Services, in which case I would favour JAX-WS over JAX-RPC.
Spring has the additional benefit in that it can do the wiring for both the server and the client, easily and transparently.
Personally I would choose either (2) or (3). HTTP is network friendly. It's easy to deploy in a web container. Jetty's long-lived connections give you the option of (effectively) server push over HTTP.
All of these methods allow complex objects to be sent across the wire, but they are subtly different in this regard. You need to consider whether your server and client are going to be distributed separately, and whether it's an issue that changing the interface means redistributing the class files. Or you can use a customized serialization solution (even XML) to avoid this. But that has issues as well.
Using a Web container will allow you to easily plug-in Spring Security, which can be a bit daunting at first just because there are so many options. Also, HttpSession can be used to provide state information between requests.
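For a feel of option (2), here is a minimal sketch using Spring's classic HTTP invoker remoting; the AccountService interface, the bean names and the URL are assumptions for illustration, and an AccountService implementation bean is assumed to exist elsewhere.

    import java.util.List;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.remoting.httpinvoker.HttpInvokerProxyFactoryBean;
    import org.springframework.remoting.httpinvoker.HttpInvokerServiceExporter;

    // Plain POJO interface shared by client and server.
    interface AccountService {
        List<String> findAccountNames(String filter);
    }

    @Configuration
    class RemotingConfig {

        // Server side: expose an existing POJO implementation over HTTP.
        // The bean name doubles as the URL mapping when BeanNameUrlHandlerMapping is in use.
        @Bean(name = "/remoting/AccountService")
        HttpInvokerServiceExporter accountServiceExporter(AccountService accountService) {
            HttpInvokerServiceExporter exporter = new HttpInvokerServiceExporter();
            exporter.setService(accountService);
            exporter.setServiceInterface(AccountService.class);
            return exporter;
        }

        // Client side: a proxy that looks and feels like a local AccountService.
        @Bean
        HttpInvokerProxyFactoryBean accountServiceProxy() {
            HttpInvokerProxyFactoryBean proxy = new HttpInvokerProxyFactoryBean();
            proxy.setServiceUrl("http://server-host:8080/remoting/AccountService");
            proxy.setServiceInterface(AccountService.class);
            return proxy;
        }
    }

The RMI variant is analogous with RmiServiceExporter and RmiProxyFactoryBean; in either case, arguments and return values just need to be serializable POJOs.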
Simple RPC is exactly what RMI was built for. If you define a remote interface and make the argument and return types serializable, you can call methods in one app from another app.
If you only need value objects then just ensure the POJOs implement Serializable and write the objects across sockets (using ObjectOutputStream). On the receiving end read the objects using ObjectInputStream. The receiving end has to have a compatible version of the POJO (see serialVersionUID).
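A minimal sketch of that plain-sockets approach; the Person class, port and host are arbitrary choices for the example.

    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.net.ServerSocket;
    import java.net.Socket;

    // Illustrative value object; both sides need a compatible class (same serialVersionUID).
    class Person implements Serializable {
        private static final long serialVersionUID = 1L;
        final String name;
        Person(String name) { this.name = name; }
    }

    public class PlainSerializationExample {
        public static void main(String[] args) throws Exception {
            // Receiving end: bind first, then accept one object (port 9000 is arbitrary).
            ServerSocket listener = new ServerSocket(9000);
            Thread receiver = new Thread(() -> {
                try (Socket socket = listener.accept();
                     ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {
                    Person received = (Person) in.readObject();
                    System.out.println("Received: " + received.name);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
            receiver.start();

            // Sending end: write the POJO across the wire.
            try (Socket socket = new Socket("localhost", 9000);
                 ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream())) {
                out.writeObject(new Person("Alice"));
            }
            receiver.join();
            listener.close();
        }
    }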
Hessian/Burlap 'protocol-ize' this: http://hessian.caucho.com/ and http://www.caucho.com/resin-3.0/protocols/burlap.xtp
You could try XStream (http://x-stream.github.io/) over REST. Easy to apply to a pre-existing set of POJOs.
Can you give some further information as to what you're trying to achieve, since you're not interested in REST/JSON?
