Validation of method parameters - Java

I have a RESTful web service implemented with JAX-RS (Jersey).
It has the following method:
public void foo(@PathParam("name") String uuid) {
    ...
}
I need to validate the input parameters and, if the data is invalid, throw a WebApplicationException.
I added my custom annotation CheckUuid (extends ):
public void foo(@PathParam("name") @CheckUuid String uuid) {
    ...
}
Is it possible to do validation using annotations at the stage when the method has been chosen but not yet invoked, for example using a PreProcessInterceptor?

Java EE 6 has some built-in validation functionality:
http://docs.oracle.com/javaee/6/tutorial/doc/gircz.html
I have not used it myself, but I saw it brought up during JavaOne and it looks pretty cool.
I'm not sure at what point the validation would happen, but I think it might work out for you.
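If you go down that route, a custom constraint for the UUID case might look roughly like the sketch below. This is a minimal sketch assuming Bean Validation (JSR 303); whether Jersey validates resource method parameters automatically depends on the version and setup, and the names here are only illustrative.
import java.lang.annotation.*;
import java.util.UUID;
import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

//Custom constraint annotation that can be placed on the String parameter
@Target({ElementType.PARAMETER, ElementType.FIELD})
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = CheckUuid.Validator.class)
public @interface CheckUuid {

    String message() default "not a valid UUID";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};

    //Validator that rejects anything java.util.UUID cannot parse
    class Validator implements ConstraintValidator<CheckUuid, String> {

        @Override
        public void initialize(CheckUuid constraint) { }

        @Override
        public boolean isValid(String value, ConstraintValidatorContext context) {
            if (value == null) {
                return false;
            }
            try {
                UUID.fromString(value);
                return true;
            } catch (IllegalArgumentException e) {
                return false;
            }
        }
    }
}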

In the end, I decided to use the standard pattern of validating inside the method, because Jersey does not have a PreProcessInterceptor.
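For reference, the in-method approach boils down to something like this (a minimal sketch; the 400 status and the UUID.fromString check are illustrative choices, not part of the original question):
import java.util.UUID;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.Response;

@Path("/resource")
public class FooResource {

    @GET
    @Path("/{name}")
    public Response foo(@PathParam("name") String uuid) {
        //validate up front and abort with a WebApplicationException if the value is bad
        try {
            UUID.fromString(uuid);
        } catch (IllegalArgumentException e) {
            throw new WebApplicationException(Response.Status.BAD_REQUEST);
        }
        // ... actual work ...
        return Response.ok().build();
    }
}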

Related

How do I add a custom directive to a query resolved through a singleton

I have managed to add custom directives to the GraphQL schema but I am struggling to work out how to add a custom directive to a field definition. Any hints on the correct implementation would be very helpful.
I am using GraphQL SPQR 0.9.6 to generate my schema.
ORIGINAL ANSWER: (now outdated, see the 2 updates below)
It's currently not possible to do this. GraphQL SPQR v0.9.9 will be the first to support custom directives.
Still, in 0.9.8 there's a possible work-around, depending on what you're trying to achieve. SPQR's own meta-data about a field or a type is kept inside custom directives. Knowing that, you can get a hold of the Java method/field underlying the GraphQL field definition. If what you want is e.g. an instrumentation that does something based on a directive, you could instead obtain any annotations on the underlying element, having the full power of Java at your disposal.
The way to get the method would be something like:
Operation operation = Directives.getMappedOperation(env.getField()).get();
Resolver resolver = operation.getApplicableResolver(env.getArguments().keySet());
Member underlyingElement = resolver.getExecutable().getDelegate();
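Continuing that snippet, reading annotations off the element is plain reflection (a small follow-up sketch; Auth here stands for whatever custom annotation of yours you want to inspect, it is not part of SPQR):
//java.lang.reflect.Member does not expose annotations, but the underlying Method/Field
//also implements java.lang.reflect.AnnotatedElement, so a cast is enough
AnnotatedElement annotated = (AnnotatedElement) underlyingElement;
if (annotated.isAnnotationPresent(Auth.class)) { //Auth is your own annotation
    Auth auth = annotated.getAnnotation(Auth.class);
    //act on the annotation here, e.g. enforce roles
}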
UPDATE:
I posted a huge answer on this GitHub issue. Pasting it here as well.
You can register an additional directive like so:
generator.withSchemaProcessors(
        (schemaBuilder, buildContext) -> schemaBuilder.additionalDirective(...));
But (according to my current understanding), this only makes sense for query directives (something the client sends as part of the query, like @skip or @deferred).
Directives like @dateFormat simply make no sense in SPQR: they're there to help you when parsing SDL and mapping it to your code. In SPQR, there's no SDL and you start from your code.
E.g. @dateFormat is used to tell you that you need to apply date formatting to a specific field when mapping it to Java. In SPQR you start from the Java part and the GraphQL field is generated from a Java method, so the method must already know what format it should return, or it already has an appropriate annotation. In SPQR, Java is the source of truth. You use annotations to provide extra mapping info. Directives are basically annotations in SDL.
Still, field- or type-level directives (or annotations) are very useful in instrumentations, e.g. if you want to intercept field resolution and inspect the authentication directives.
In that case, I'd suggest you simply use annotations for the same purpose.
public class BookService {

    @Auth(roles = {"Admin"}) //example custom annotation
    public Book addBook(Book book) { /*insert a Book into the DB */ }
}
As each GraphQLFieldDefinition is backed by a Java method (or a field), you can get the underlying objects in your interceptor or wherever:
GraphQLFieldDefinition field = ...;
Operation operation = Directives.getMappedOperation(field).get();

//Multiple methods can be hooked up to a single GraphQL operation. This gets the @Auth annotations from all of them
Set<Auth> allAuthAnnotations = operation.getResolvers().stream()
        .map(res -> res.getExecutable().getDelegate()) //get the underlying method
        .filter(method -> method.isAnnotationPresent(Auth.class))
        .map(method -> method.getAnnotation(Auth.class))
        .collect(Collectors.toSet());
Or, to inspect only the method that can handle the current request:
DataFetchingEnvironment env = ...; //get it from the instrumentation params
Auth auth = operation.getApplicableResolver(env.getArguments().keySet())
        .getExecutable().getDelegate()
        .getAnnotation(Auth.class);
Then you can inspect your annotations as you wish, e.g.
Set<String> allNeededRoles = allAuthAnnotations.stream()
        .flatMap(auth -> Arrays.stream(auth.roles()))
        .collect(Collectors.toSet());

if (!currentUser.getRoles().containsAll(allNeededRoles)) {
    throw new AccessDeniedException(); //or whatever is appropriate
}
Of course, there's no real need to actually implement authentication this way, as you're probably using a framework like Spring or Guice (maybe even Jersey has the needed security features), that already has a way to intercept all methods and implement security. So you can just use that instead. Much simpler and safer. E.g. for Spring Security, just keep using it as normal:
public class BookService {

    @PreAuthorize(...) //standard Spring Security
    public Book addBook(Book book) { /*insert a Book into the DB */ }
}
Make sure you also read my answer on implementing security in GraphQL if that's what you're after.
You can use instrumentations to dynamically filter the results in the same way: add an annotation on a method, access it from the instrumentation, and process the result dynamically:
public class BookService {

    @Filter("title ~ 'Monkey'") //example custom annotation
    public List<Book> findBooks(...) { /*get books from the DB */ }
}
new SimpleInstrumentation() {

    // You can also use beginFieldFetch and then onCompleted instead of instrumentDataFetcher
    @Override
    public DataFetcher<?> instrumentDataFetcher(DataFetcher<?> dataFetcher, InstrumentationFieldFetchParameters parameters) {
        GraphQLFieldDefinition field = parameters.getEnvironment().getFieldDefinition();
        Optional<String> filterExpression = Directives.getMappedOperation(field)
                .map(operation -> operation.getApplicableResolver(parameters.getEnvironment().getArguments().keySet())
                        .getExecutable().getDelegate()
                        .getAnnotation(Filter.class).value()); //get the filtering expression from the annotation
        return filterExpression.isPresent()
                ? env -> filterResultBasedOnExpression(dataFetcher.get(parameters.getEnvironment()), filterExpression)
                : dataFetcher;
    }
}
For directives on types, again, just use Java annotations. You have access to the underlying types via:
Directives.getMappedType(graphQLType).getAnnotation(...);
This, again, probably only makes sense in instrumentations. I say that because normally the directives provide extra info for mapping SDL to a GraphQL type. In SPQR you map a Java type to a GraphQL type, so a directive makes no sense in that context in most cases.
Of course, if you still need actual GraphQL directives on a type, you can always provide a custom TypeMapper that puts them there.
For directives on a field, it is currently not possible in 0.9.8.
0.9.9 will have full custom directive support on any element, in case you still need them.
UPDATE 2: GraphQL SPQR 0.9.9 is out.
Custom directives are now supported. See issue #200 for details.
Any custom annotation meta-annotated with @GraphQLDirective will be mapped as a directive on the annotated element.
E.g. imagine a custom annotation @Auth(requiredRole = "Admin") used to denote access restrictions:
@GraphQLDirective //Should be mapped as a GraphQLDirective
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD}) //Applicable to methods
public @interface Auth {
    String requiredRole();
}
If a resolver method is then annotated with @Auth:
@GraphQLMutation
@Auth(requiredRole = "Admin")
public Book addBook(Book newBook) { ... }
The resulting GraphQL field will look like:
type Mutation {
    addBook(newBook: BookInput): Book @auth(requiredRole : "Admin")
}
That is to say, the @Auth annotation got mapped to a directive due to the presence of the @GraphQLDirective meta-annotation.
Client directives can be added via: GraphQLSchemaGenerator#withAdditionalDirectives(java.lang.reflect.Type...).
SPQR 0.9.9 also comes with ResolverInterceptors which can intercept the resolver method invocation and inspect the annotations/directives. They are much more convenient to use than Instrumentations, but are not as general (have a much more limited scope). See issue #180 for details, and the related tests for usage examples.
E.g. to make use of the @Auth annotation from above (note that @Auth does not need to be a directive for this to work):
public class AuthInterceptor implements ResolverInterceptor {

    @Override
    public Object aroundInvoke(InvocationContext context, Continuation continuation) throws Exception {
        Auth auth = context.getResolver().getExecutable().getDelegate().getAnnotation(Auth.class);
        User currentUser = context.getResolutionEnvironment().dataFetchingEnvironment.getContext();
        if (auth != null && !currentUser.getRoles().contains(auth.requiredRole())) {
            throw new IllegalAccessException("Access denied"); // or return null
        }
        return continuation.proceed(context);
    }
}
If @Auth is a directive, you can also get it via the regular API, e.g.
List<GraphQLDirective> directives = dataFetchingEnvironment.getFieldDefinition().getDirectives();
DirectivesUtil.directivesByName(directives);

Togglz annotation-based approach for feature validation

I have been using Togglz for the last few days.
I am trying to find out whether there is an annotation-based approach available in the Togglz API.
I want to do it like below:
public class Application {

    public static void main(String[] args) {
        Application application = new Application();
        boolean first = false;
        first = application.validate1();
        System.out.println(first);
    }

    @Togglz(feature = "FEATURE_01")
    public boolean validate1() {
        System.out.println("validate1");
        return false;
    }
}
Is there anything like this available in Togglz?
I could not find it anywhere; if you have any idea about such an annotation, please help.
My requirement is to skip the method execution based on the feature value passed into it.
No, there is no such annotation in Togglz. You will need a framework that supports interceptors for that (like Spring, CDI or EJB). Then you can implement such an interceptor yourself.
However, to be honest I'm not sure whether such an annotation would make sense. What should the result be if the feature is off? What does the method return? null? Explicit feature checks using a simple if statement are more straightforward in these cases. But that's just my opinion. ;-)
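For illustration, with Spring AOP such an interceptor could look roughly like the sketch below. The @Togglz annotation is the asker's own idea and not part of Togglz; NamedFeature and FeatureContext are Togglz core classes, but treat the whole thing as an untested sketch.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;
import org.togglz.core.context.FeatureContext;
import org.togglz.core.util.NamedFeature;

//The custom annotation the question asks for (not part of Togglz)
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Togglz {
    String feature();
}

//Aspect that skips the annotated method when the feature is inactive
@Aspect
@Component
public class TogglzFeatureAspect {

    @Around("@annotation(togglz)")
    public Object checkFeature(ProceedingJoinPoint pjp, Togglz togglz) throws Throwable {
        boolean active = FeatureContext.getFeatureManager()
                .isActive(new NamedFeature(togglz.feature()));
        //skipping means the caller gets null, which is exactly the ambiguity the answer above warns about
        return active ? pjp.proceed() : null;
    }
}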

Java: Is `@WebParam` optional?

I put in place a simple web service using JAX-WS RI (the default Java implementation).
I read many tutorials where I found web methods with parameters declared with the @WebParam annotation. Ex:
@WebMethod
void foobar(@WebParam("foo") String bar);
In my case I didn't put it and it worked.
Is @WebParam optional?
Regards.
Yes, it's optional. This annotation is basically used to give a custom name to your web method parameter, and the proper format is:
@WebMethod
void foobar(@WebParam(name="foo") String bar);
There's also the concept of Holders, where this annotation can be helpful as well, i.e. if you want your method to return more than one thing, try the method below:
@WebMethod
void foobar(@WebParam(name="foo", mode=WebParam.Mode.INOUT) Holder<String> bar,
            @WebParam(name="param2", mode=WebParam.Mode.INOUT) Holder<String> newParam);
What this does is let you pass two strings into the web service and get two outputs back from that service's method.
One last thing to mention: there are three supported modes (see the sketch after the list for an example implementation):
IN
OUT
INOUT
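To make the Holder part concrete, an implementation of that INOUT signature might look roughly like this (a sketch; the class name and the body are made up for illustration):
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.Holder;

@WebService
public class FoobarService {

    @WebMethod
    public void foobar(
            @WebParam(name = "foo", mode = WebParam.Mode.INOUT) Holder<String> bar,
            @WebParam(name = "param2", mode = WebParam.Mode.INOUT) Holder<String> newParam) {
        //both values come in from the request and, because the mode is INOUT,
        //whatever is written back into the holders goes out in the response
        bar.value = bar.value.toUpperCase();
        newParam.value = "processed: " + newParam.value;
    }
}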

Can we have more than one @Path annotation for the same REST method [duplicate]

This question already has answers here:
JAX-RS: Multiple paths
(4 answers)
Closed 2 years ago.
Can we have more than one @Path annotation for the same REST method, i.e. the method executed is the same, but it is executed on accessing more than one URL?
E.g.: I want to run the searchNames() method on both http://a/b/c and http://a/b.
You can't have multiple @Path annotations on a single method. It causes a "duplicate annotation" syntax error.
However, there are a number of ways you can effectively map two paths to a method.
Regular expressions in #Path annotation
The @Path annotation in JAX-RS accepts parameters whose values can be restricted using regular expressions.
This annotation:
#Path("a/{parameter: path1|path2}")
would enable the method to be reached by requests for both /a/path1 and /a/path2. If you need to work with subpaths, escape slashes: {a:path1\\/subPath1|path2\\/subPath2}
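Put together, a resource method using that regex might look like this (a small sketch; the class name and the method body are illustrative):
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Response;

@Path("/")
public class NamesResource {

    @GET
    @Path("a/{parameter: path1|path2}")
    public Response searchNames(@PathParam("parameter") String parameter) {
        //"parameter" is either "path1" or "path2", so both URLs reach this one method
        return Response.ok("reached via /a/" + parameter).build();
    }
}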
Serving responses with a redirection status code
Alternatively, you could set up a redirection. Here's a way to do it in Jersey (the reference implementation of JAX-RS) by defining another subresource. This is just an example; if you prefer a different way of handling redirections, feel free to use it.
#Path("basepath")
public class YourBaseResource {
//this gets injected after the class is instantiated by Jersey
#Context
UriInfo uriInfo;
#Path("a/b")
#GET
public Responce method1(){
return Response.ok("blah blah").build();
}
#Path("a/b/c")
#GET
public Response method2(){
UriBuilder addressBuilder = uriInfo.getBaseUriBuilder();
addressBuilder.path("a/b");
return Response.seeOther(addressBuilder.build()).build();
}
}
Using a servlet filter to rewrite URLs
If you're going to need such functionality often, I suggest intercepting the incoming requests using a servlet filter and rewriting the paths on the fly. This should help you keep all redirections in one place. Ideally, you could use a ready-made library. UrlRewriteFilter can do the trick, as long as you're fine with a BSD license (check out their Google Code site for details).
Another option is to handle this with a proxy set up in front of your Java app. You can set up an Apache server to offer basic caching and rewrite rules without complicating your Java code.
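If you'd rather not pull in a library for a single rule, a hand-rolled filter along these lines could do the internal forwarding (a sketch only; the hard-coded paths are just this question's example, and how the path is matched depends on your servlet mapping):
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

public class PathRewriteFilter implements Filter {

    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        //strip the context path so only the application-relative path is compared
        String path = request.getRequestURI().substring(request.getContextPath().length());
        if ("/a/b/c".equals(path)) {
            //forward internally to /a/b so both URLs hit the same resource method
            request.getRequestDispatcher("/a/b").forward(req, res);
            return;
        }
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() { }
}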
As explained in Tom's answer, you cannot use more than one @Path annotation on a single method, because you will run into a "duplicate annotation" error at compile time.
I think the simplest way to get around this is to use method overloading:
#Path("{foo}")
public Response rest(#PathParam("foo") final String foo) {
return this.rest(foo, "");
}
#Path("{foo}/{bar}")
public Response rest(#PathParam("foo") final String foo,
#PathParam("bar") final String bar) {
return Response.ok(foo + " " + bar).build();
}
You could also use different method names if you run into the case where multiple overloaded methods would have the same signature.
Another solution for your particular example:
http://a/b/c
http://a/b
Let's suppose that:
/a is for the resource class
/b/c and /b are the paths for the methods
because a full path looks like:
<protocol><host><port><app><url-pattern><resource-path><method-path>.
Use an optional parameter
#Path("/b{c : (/c)?}")
public Response searchNames(#PathParam("c") String val) {
...
}
The example above works for all of the following paths:
/b
/b/
/b/c
/b/c/
but when c is provided, val is /c (it has a leading /).
If you want to fix the problem above (to avoid parsing in Java), you need something more complex:
@Path("/b{slash : (/)?}{c:((?<=/).*)?}")
which will return only c (not /c) for the 3rd bullet point, but for the 4th bullet point it will return c/ which has to be parsed in Java.
But for your case ("the method executed is the same"), don't worry about parsing because you don't have different actions.
If you are using Spring, then try:
@RequestMapping(value = {"/def", "/abc"}, method = RequestMethod.POST)
This will work for both /abc and /def.

Can we implement method overloading in a web service class?

I would like to implement method overloading in a Java web service class as follows:
public String myMethod(User user)
{
    // My code
}

public String myMethod(User[] user)
{
    for (int i = 0; i < user.length; i++)
    {
        myMethod(user[i]);
    }
    return null; // some return value is needed for this to compile
}
If I pass a single User object to myMethod(), it should trigger the first method, and if I send an array of Users, it should trigger the second method.
In the WSDL file only a single method shows up. However, if I try to use @WebMethod(operationName="") on both methods, I am unable to generate the WSDL file.
Operation overloading is not allowed for web services.
It is explicitly prohibited in WS-BP (the WS-I Basic Profile), and WSDL 1.2 also disallows it.
Even if you found a stack that has some support for this, I would recommend not following this approach.
Overloading is an OO concept; don't try to apply it to the service-oriented paradigm.
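If you control the contract, the portable workaround is to not overload at all and to expose two distinctly named operations instead (a sketch, not taken from the answers above; the service, operation names and bodies are illustrative, and User is the question's own class):
import javax.jws.WebMethod;
import javax.jws.WebService;

@WebService
public class UserService {

    //one operation per signature; no overloading, so WSDL generation is unambiguous
    @WebMethod(operationName = "processUser")
    public String processUser(User user) {
        //handle a single user
        return "processed one user";
    }

    @WebMethod(operationName = "processUsers")
    public String processUsers(User[] users) {
        for (User user : users) {
            processUser(user);
        }
        return "processed " + users.length + " users";
    }
}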
Overloading web service methods is not difficult. With Axis 1.4, at least, it is fairly simple. If there are two overloaded methods in the service like those below:
public String myMethod(String firstName, String lastName) throws RemoteException
public String myMethod(String name) throws RemoteException
Then a request like this:
http://localhost:8080/services/testService?method=myMethod&name=<name>
will invoke the second method.
And a request like this one:
http://localhost:8080/services/testService?method=myMethod&firstName=<first_name>&lastName=<last_name>
will invoke the first method.
The resolution is done by Axis.
