I could not find many resources on this question, so I guess it does not have an easy resolution.
We use JodaTime in our codebase, and I would like to forbid (or at least warn about) the use of some methods from this library, as they are error-prone (around timezone management).
I already tried the Reflections library, without success, due to an issue whose fix has not been released yet.
We used to have a custom Sonar rule to handle this, but custom rules are not supported by SonarCloud, so I am looking for another way.
Do you have any leads?
I would recommend ArchUnit for this; it allows you to express restrictions like this as unit tests:
import com.tngtech.archunit.core.domain.JavaClasses;
import com.tngtech.archunit.core.importer.ClassFileImporter;
import com.tngtech.archunit.lang.ArchRule;
import org.joda.time.DateTime;
import org.junit.Test;
import static com.tngtech.archunit.core.domain.JavaCall.Predicates.target;
import static com.tngtech.archunit.core.domain.JavaClass.Predicates.assignableTo;
import static com.tngtech.archunit.core.domain.properties.HasName.Predicates.name;
import static com.tngtech.archunit.core.domain.properties.HasOwner.Predicates.With.owner;
import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

public class DisallowedMethodsTest {

    @Test
    public void forbidJodaTimeMethods() {
        JavaClasses importedClasses = new ClassFileImporter().importPackages("your.base.package");
        ArchRule rule = noClasses().should()
                .callMethodWhere(target(name("disallowedMethodName"))
                        .and(target(owner(assignableTo(DateTime.class)))))
                .because("Your reasons");
        rule.check(importedClasses);
    }
}
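If these rules run regularly (for example in CI), ArchUnit's JUnit support is also worth a look: the imported classes are cached across all rules declared in the test class. A sketch reusing the same rule as above (with the archunit-junit5 artifact; for JUnit 4 you would additionally annotate the class with @RunWith(ArchUnitRunner.class)):

import com.tngtech.archunit.junit.AnalyzeClasses;
import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.ArchRule;
import org.joda.time.DateTime;
// plus the same static imports for noClasses(), target(), name(), owner() and assignableTo() as above

// Classes under "your.base.package" are imported once and cached for every rule in this class
@AnalyzeClasses(packages = "your.base.package")
public class DisallowedMethodsArchTest {

    @ArchTest
    static final ArchRule forbidJodaTimeMethods = noClasses().should()
            .callMethodWhere(target(name("disallowedMethodName"))
                    .and(target(owner(assignableTo(DateTime.class)))))
            .because("Your reasons");
}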
If you are looking for something that works in a unit-test environment, Jeroen Steenbeeke's answer might be helpful.
If you are looking for something that works in a production environment, you will need a hook.
If you cannot require callers to construct the relevant objects through java.lang.reflect.Proxy, I would recommend having a look at AspectJ for a regular Java project, or Xposed for an Android project.
Both of them can add such restrictions without modifying the existing codebase or the program flow.
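For illustration, here is a minimal AspectJ sketch (the aspect name and the targeted method are made-up examples; it assumes the aspect is woven with ajc or load-time weaving) that fails fast whenever a forbidden JodaTime method is called:

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

// Woven into the existing code at compile or load time; no source changes required
@Aspect
public class ForbiddenJodaCallsAspect {

    // Matches every call to DateTime.toDate(), used here only as an example of an error-prone method
    @Before("call(* org.joda.time.DateTime.toDate(..))")
    public void forbid(JoinPoint joinPoint) {
        throw new IllegalStateException("Forbidden JodaTime call at " + joinPoint.getSourceLocation());
    }
}

Throwing is the strictest option; logging a warning from the same advice works just as well if you only want to be notified.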
I solved this kind of problem by writing an interceptor like the following, as explained at https://docs.oracle.com/javaee/7/tutorial/interceptors002.htm:
import javax.interceptor.AroundInvoke;
import javax.interceptor.InvocationContext;
import java.lang.reflect.Method;

public class MethodCallTracerInterceptor {

    @AroundInvoke
    Object intercept(InvocationContext context) throws Exception {
        Method method = context.getMethod();
        String methodClass = method.getDeclaringClass().getName();
        String methodName = method.getName();
        // "myClass" stands for the fully qualified name of the class you want to track
        if (methodClass.equals("myClass") && methodName.equals("myMethod")) {
            // TODO Raise an exception or log a warning.
        }
        return context.proceed();
    }
}
If I wanted to keep a certain Java package free of 3rd party dependencies with ArchUnit, how would I do it?
More specifically, I am looking at keeping my domain model in a hexagonal architecture free from Spring code. I specified some rules which I believe ought to prevent the model from using Spring. However, I am able to use Spring annotations like @Component and @Bean without causing a violation.
What I tried so far is
layeredArchitecture().
        layer("domain").definedBy(DOMAIN_LAYER).
        layer("application").definedBy(APPLICATION_LAYER).
        layer("primary-adapters").definedBy(PRIMARY_ADAPTERS).
        layer("secondary-adapters").definedBy(SECONDARY_ADAPTERS).
        layer("spring").definedBy("org.springframework..").
        whereLayer("spring").mayOnlyBeAccessedByLayers("primary-adapters", "secondary-adapters", "application").
        because("Domain should be kept spring-free").
        check(CLASSES);
As well as
noClasses().that().resideInAPackage(DOMAIN_LAYER).
should().dependOnClassesThat().resideInAPackage("org.springframework..").
check(CLASSES);
noClasses().that().resideInAPackage(DOMAIN_LAYER).
should().accessClassesThat().resideInAPackage("org.springframework..").
check(CLASSES);
Here is a code example which executes the tests just fine, although com.example.app.domain.Factory is importing org.springframework classes.
You can use DescribedPredicate:
void domainSpring() {
    DescribedPredicate<JavaAnnotation> springAnnotationPredicate =
            new DescribedPredicate<JavaAnnotation>("Spring filter") {
                @Override
                public boolean apply(JavaAnnotation input) {
                    return input.getType().getPackageName().startsWith("org.springframework");
                }
            };
    classes().that().resideInAPackage(DOMAIN_LAYER).should()
            .notBeAnnotatedWith(springAnnotationPredicate).check(CLASSES);
}
You can also go with name matching, so you don't have to write a custom DescribedPredicate.
ArchRule applicationCoreMustNotDependOnFrameworks = noClasses()
        .that().resideInAnyPackage(DOMAIN_LAYER)
        .should().dependOnClassesThat().haveNameMatching("org.springframework..*")
        .orShould().dependOnClassesThat().haveNameMatching("javax.persistence.*")
        .because("Domain should be free from Frameworks");
In my case, I wanted an exception to that rule: instead of excluding Spring completely, I wanted to accept the classes in the event package (EventHandler).
To do that, you can replace "org.springframework" with the regular expression "org.springframework(?!.*event).*", as shown below.
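Applied to the rule above, that would look something like this (using the regular expression from the previous sentence):

ArchRule applicationCoreMustNotDependOnFrameworks = noClasses()
        .that().resideInAnyPackage(DOMAIN_LAYER)
        .should().dependOnClassesThat().haveNameMatching("org.springframework(?!.*event).*")
        .because("Domain should be free from frameworks, except for Spring's event classes");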
I'm using TestNG for my unit tests and I'd like to check exception messages. OK, @Test(expectedExceptionsMessageRegExp = ...) is exactly what I need, right? Well, at the same time I'd like to externalize my messages so they aren't mixed with my code. I'm loosely following a guide by Brian Goetz, so my exception code looks like
throw new IllegalArgumentException(MessageFormat.format(
EXCEPTIONS.getString(EX_NOT_A_VALID_LETTER), c));
Works perfectly for me, except these two things don't exactly mix. I can't write
@Test(dataProvider = "getInvalidLetters",
        expectedExceptions = {IllegalArgumentException.class},
        expectedExceptionsMessageRegExp = regexize(EXCEPTIONS.getString(EX_NOT_A_VALID_LETTER)))
Here, regexize is a function that is supposed to replace {0}-style placeholders with .*. However, this fails with an "element value must be a constant expression" error. Makes sense, since the value is needed at compile time. But what are the possible workarounds?
I can imagine a test code generator that would replace these constructs with real message regexps, but it would be a pain to integrate it with IDE, SCM, build tools and so on.
Another option is to use try-catch and check exception message manually. But this is ugly.
Lastly, I think it should be possible to hack TestNG with something like
@Test(expectedExceptionsMessageBundle = "bundle.name.goes.here",
        expectedExceptionsMessageLocaleProvider = "functionReturningListOfLocales",
        expectedExceptionsMessageKey = "MESSAGE_KEY_GOES_HERE")
This would be a great thing, really. Except that it won't be the same TestNG that Maven fetches for me from the repo. Another option is to implement this, contribute a patch to TestNG and wait for it to be released. I'm seriously considering this option now, but maybe there's an easier way? Haven't I missed something obvious? I can't possibly be the only one with this issue!
Or maybe I'm externalizing my messages in a wrong way. But a guy like Brian Goetz can't be wrong, now can he? Or did I get him wrong?
Update
Based on the answer given here, I've made a tutorial on the topic, covering some pitfalls, especially when using NetBeans 8.1.
Why not use an annotation transformer here?
You will be able to do something like:
@LocalizedException(expectedExceptionsMessageBundle = "bundle.name.goes.here",
        expectedExceptionsMessageLocaleProvider = "functionReturningListOfLocales",
        expectedExceptionsMessageKey = "MESSAGE_KEY_GOES_HERE")
@Test(dataProvider = "getInvalidLetters",
        expectedExceptions = {IllegalArgumentException.class})
public void test() {
    // ...
}
Where the annotation transformer will look like:
import java.lang.reflect.Constructor;
import java.lang.reflect.Method;
import org.testng.IAnnotationTransformer;
import org.testng.annotations.ITestAnnotation;

public class LocalizedExceptionTransformer implements IAnnotationTransformer {

    @Override
    public void transform(ITestAnnotation annotation, Class testClass,
            Constructor testConstructor, Method testMethod) {
        if (testMethod != null) {
            LocalizedException le = testMethod.getAnnotation(LocalizedException.class);
            if (le != null) {
                String regexp = regexize(le);
                annotation.setExpectedExceptionsMessageRegExp(regexp);
            }
        }
    }
}
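The transformer has to be registered with TestNG as a listener (for example with the -listener command-line switch or programmatically) so that it runs before the tests. For completeness, a possible shape for the regexize helper used above, here taking the already-resolved message pattern as a String (a sketch only; it quotes the literal fragments of the MessageFormat pattern and turns {0}-style placeholders into .*, without handling MessageFormat's quoting rules):

import java.util.Arrays;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

final class MessageRegex {

    // "{0} is not a valid letter" becomes "\Q\E.*\Q is not a valid letter\E"
    static String regexize(String messageFormatPattern) {
        return Arrays.stream(messageFormatPattern.split("\\{\\d+\\}", -1))
                .map(Pattern::quote)
                .collect(Collectors.joining(".*"));
    }
}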
I'm trying to do some unit testing, and I haven't really settled on a runner or test class yet.
In the end, this is the method whose behavior I'm verifying:
public static void getData(final Context context, final Callback<MyObject> callback) {
Locale locale = context.getResources().getConfiguration().locale;
MyObjectService myService = new MyObjectService(getRequestHeaders(context), locale);
}
So I need a mock Context that has resources, configuration, a locale and SharedPreferences.
I've tried PowerMockito, AndroidTestCase, AndroidTestRunner, and ApplicationTestCase. All I can get is a mock Context with nothing in it, or mock Resources, but I can't figure out how to attach them to the mock Context (basically remaking a Context).
This is currently my last attempt (although I've tried more complex ones, without any success):
import com.myApp.android.app.shop.util.ServiceUtils;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;
import org.powermock.modules.junit4.PowerMockRunner;
import android.content.Context;

@RunWith(PowerMockRunner.class)
public class TestLogin {

    @Test
    public void test() {
        Context context = Mockito.mock(Context.class);
        ServiceUtils.getData(context, dataCallback);
    }
}
The exception I get is an NPE from getData(), as the passed Context does not have resources.
Any suggestions?
Android is not an easily mockable system. I too have tried using Mockito to mock the framework, but it quickly becomes extremely complex. Things like resources and SharedPreferences are not easy to mock. These problems have been tackled in third-party libraries like Robolectric.
I suggest you use Robolectric for your Android unit tests. Robolectric runs in a standard JVM (i.e. no emulator needed) and simulates the majority of the Android system, thus minimizing (but not eliminating) the need to build mocks. I have found it to be a very useful tool for Android TDD.
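A minimal sketch of what such a test could look like (class names are illustrative and the exact setup depends on your Robolectric version):

import android.content.Context;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.robolectric.RobolectricTestRunner;
import org.robolectric.RuntimeEnvironment;

import static org.junit.Assert.assertNotNull;

@RunWith(RobolectricTestRunner.class)
public class GetDataRobolectricTest {

    @Test
    public void getDataWithSimulatedContext() {
        // Robolectric supplies an application Context backed by simulated
        // resources, configuration and SharedPreferences
        Context context = RuntimeEnvironment.application;
        assertNotNull(context.getResources().getConfiguration().locale);
    }
}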
As a second answer (and maybe a more useful one than the first), I suggest splitting the logic from the platform dependencies.
That way you can unit test the logic in a pure-Java way.
Take a look at the MVP pattern; it should help a lot!
In this case, for example, you would move this line...
MyObjectService myService = new MyObjectService(getRequestHeaders(context), locale);
... inside a "Service" that is a completely pure-Java class, created through a constructor that requires (in this case) the locale and a data structure holding the headers.
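A rough sketch of that split (all names are made up; the Android-specific extraction of locale and headers stays at the boundary, in the caller):

import java.util.Locale;
import java.util.Map;

// Pure Java: no android.* imports, so it can be unit tested without any mocking of Context
public class MyObjectService {

    private final Map<String, String> requestHeaders;
    private final Locale locale;

    public MyObjectService(Map<String, String> requestHeaders, Locale locale) {
        this.requestHeaders = requestHeaders;
        this.locale = locale;
    }

    // getData logic lives here, free of any Context dependency
}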
Easy, no? ;)
My 2 cents: every time you need Robolectric to test logic (instead of UI), you are probably making a mistake.
Use the RuntimeEnvironment provided by Robolectric:
Locale locale = RuntimeEnvironment.application.getResources().getConfiguration().locale;
I am working on a project that has been through multiple hands, with sometimes rushed development. Over time the message.properties file has become out of sync with the JSPs that use it. Now I don't know which properties are used and which aren't. Is there a tool (an Eclipse plugin, perhaps) that can root out dead messages?
The problem is that messages may be accessed from JSP or Java code, and resource names may be constructed rather than written as literal strings.
Simple grepping may be able to identify the "obvious" resource accesses. The other solution, a resource lookup mechanism that tracks what's used, is only semi-reliable as well, since code paths determine which resources are used, and unless every path is traveled you may miss some.
A combination of the two will catch almost everything (over time).
Alternatively, you can hide the functionality of ResourceBundle behind another façade ResourceBundle which pipes all calls through to the original one but adds logging and/or statistics collection on top.
An example could look like the following:
import java.util.Collection;
import java.util.Enumeration;
import java.util.HashSet;
import java.util.NoSuchElementException;
import java.util.ResourceBundle;
public class WrapResourceBundle {
static class LoggingResourceBundle extends ResourceBundle {
private Collection<String> usedKeys = new HashSet<String>();
public LoggingResourceBundle(ResourceBundle parentResourceBundle) {
setParent(parentResourceBundle);
}
@Override
protected Object handleGetObject(String key) {
// parent.getObject throws MissingResourceException if the key is missing,
// so any key that reaches this point is actually in use
Object value = parent.getObject(key);
usedKeys.add(key);
return value;
}
@Override
public Enumeration<String> getKeys() {
return EMPTY_ENUMERATOR;
}
public Collection<String> getUsedKeys() {
return usedKeys;
}
private static final EmptyEnumerator EMPTY_ENUMERATOR = new EmptyEnumerator();
private static class EmptyEnumerator implements Enumeration<String> {
EmptyEnumerator() {
}
public boolean hasMoreElements() {
return false;
}
public String nextElement() {
throw new NoSuchElementException("Empty Enumerator");
}
}
}
public static void main(String[] args) {
LoggingResourceBundle bundle = new LoggingResourceBundle(ResourceBundle.getBundle("test"));
bundle.getString("key1");
System.out.println("Used keys: " + bundle.getUsedKeys());
}
}
Considering that some of your keys are run-time generated, I don't think you'll ever be able to find a tool to validate which keys are in use and which ones are not.
Given the problem you posed, I would probably write an AOP aspect that wraps the MessageSource.getMessage() implementation and logs all the codes that are requested from the resource bundle. Given that MessageSource is an interface, you would need to know which implementation you are using, but I suspect you already know that.
Given that you would be writing the aspect yourself, you can choose a log format that is easily correlated with your resource bundle; once you are confident that the log contains all the keys that are required, it becomes a trivial task to compare the two files and eliminate any superfluous lines.
If you really want to be thorough about this, and you already have Spring configured for annotation scanning, you could even package the aspect as its own jar (or .class), drop it into a production WEB-INF/lib (WEB-INF/classes) folder, restart the webapp and let it run for a while. The great thing about annotations is that it can all be self-contained. Once you are sure that you have accumulated enough data, you just delete the jar (.class) and you're good to go.
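A minimal sketch of such an aspect, assuming Spring AOP with @EnableAspectJAutoProxy and that your code goes through the String-code getMessage overloads of a proxied MessageSource bean (the class name and log format are made up):

import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class MessageCodeUsageAspect {

    // Records every message code requested through MessageSource.getMessage(String, ...)
    @Before("execution(* org.springframework.context.MessageSource.getMessage(String, ..)) && args(code, ..)")
    public void recordCode(String code) {
        // Swap for your logging framework; one code per line keeps the output
        // easy to diff against the resource bundle
        System.out.println("message-code-used: " + code);
    }
}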
I know that at least two of the major Java IDEs offer this functionality.
IntelliJ IDEA has an inspection (disabled by default) that you can use to do this:
go to Settings -> Inspections -> Properties files -> ... and enable 'Unused property'.
The only problem I had was that it didn't pick up some usages of a property from a custom tag library I had written, which I was using in a few JSPs.
Eclipse also has something like this ( http://help.eclipse.org/helios/index.jsp?topic=%2Forg.eclipse.jdt.doc.user%2Ftasks%2Ftasks-202.htm ), but I haven't really explored how well it works.
I'm experimenting with Java annotation processors. I'm able to write integration tests using the JavaCompiler API (in fact I'm using Hickory at the moment). I can run the compile process and analyse the output. The problem: a single test runs for about half a second, even without any code in my annotation processor. This is way too long to use it in TDD style.
Mocking away the dependencies seems very hard to me (I would have to mock out the entire javax.lang.model.element package). Has anyone succeeded in writing unit tests for an annotation processor (Java 6)? If not, what would be your approach?
This is an old question, but it seems that the state of annotation processor testing hadn't gotten any better, so we released Compile Testing today. The best docs are in package-info.java, but the general idea is that there is a fluent API for testing compilation output when run with an annotation processor. For example,
ASSERT.about(javaSource())
.that(JavaFileObjects.forResource("HelloWorld.java"))
.processedWith(new MyAnnotationProcessor())
.compilesWithoutError()
.and().generatesSources(JavaFileObjects.forResource("GeneratedHelloWorld.java"));
tests that the processor generates a file that matches GeneratedHelloWorld.java (golden file on the class path). You can also test that the processor produces error output:
JavaFileObject fileObject = JavaFileObjects.forResource("HelloWorld.java");
ASSERT.about(javaSource())
.that(fileObject)
.processedWith(new NoHelloWorld())
.failsToCompile()
.withErrorContaining("No types named HelloWorld!").in(fileObject).onLine(23).atColumn(5);
This is obviously a lot simpler than mocking and unlike typical integration tests, all of the output is stored in memory.
You're right: mocking the annotation processing API (with a mock library like EasyMock) is painful. I tried this approach and it broke down pretty rapidly. You have to set up too many method-call expectations, and the tests become unmaintainable.
A state-based test approach worked reasonably well for me. I had to implement the parts of the javax.lang.model.* API I needed for my tests. (That was fewer than 350 lines of code.)
This is the part of a test that sets up the javax.lang.model objects. After the setup, the model should be in the same state as in the Java compiler implementation.
DeclaredType typeArgument = declaredType(classElement("returnTypeName"));
DeclaredType validReturnType = declaredType(interfaceElement(GENERATOR_TYPE_NAME), typeArgument);
TypeParameterElement typeParameter = typeParameterElement();
ExecutableElement methodExecutableElement = Model.methodExecutableElement(name, validReturnType, typeParameter);
The static factory methods are defined in the class Model, which implements the javax.lang.model.* interfaces; for example declaredType. (All unsupported operations throw exceptions.)
public static DeclaredType declaredType(final Element element, final TypeMirror... argumentTypes) {
    return new DeclaredType() {
        @Override public Element asElement() {
            return element;
        }
        @Override public List<? extends TypeMirror> getTypeArguments() {
            return Arrays.asList(argumentTypes);
        }
        @Override public String toString() {
            return format("DeclareTypeModel[element=%s, argumentTypes=%s]",
                    element, Arrays.toString(argumentTypes));
        }
        @Override public <R, P> R accept(TypeVisitor<R, P> v, P p) {
            return v.visitDeclared(this, p);
        }
        @Override public boolean equals(Object obj) { throw new UnsupportedOperationException(); }
        @Override public int hashCode() { throw new UnsupportedOperationException(); }
        @Override public TypeKind getKind() { throw new UnsupportedOperationException(); }
        @Override public TypeMirror getEnclosingType() { throw new UnsupportedOperationException(); }
    };
}
The rest of the test verifies the behavior of the class under test.
Method actual = new Method(environment(), methodExecutableElement);
Method expected = new Method(..);
assertEquals(expected, actual);
You can have a look at the source code of the Quickcheck @Samples and @Iterables source code generator tests. (The code is not optimal yet. The Method class has too many parameters and the Parameter class is not tested in its own test but as part of the Method test. It should illustrate the approach nevertheless.)
Good luck!
jOOR is a small Java reflection library that also provides simplified access to the in-memory Java compilation API in javax.tools.JavaCompiler. We added support for this to unit test jOOQ's annotation processors. You can easily write unit tests like this:
@Test
public void testCompileWithAnnotationProcessors() {
    AProcessor p = new AProcessor();

    try {
        Reflect.compile(
            "org.joor.test.FailAnnotationProcessing",
            "package org.joor.test; " +
            "@A " +
            "public class FailAnnotationProcessing { " +
            "}",
            new CompileOptions().processors(p)
        ).create().get();
        Assert.fail();
    }
    catch (ReflectException expected) {
        assertFalse(p.processed);
    }
}
The above example has been taken from this blog post
I was in a similar situation, so I created the Avatar library. It won't give you the performance of a pure unit test with no compilation, but if used correctly you shouldn't see much of a performance hit.
Avatar lets you write a source file, annotate it, and convert it to elements in a unit test. This allows you to unit test methods and classes which consume Element objects, without manually invoking javac.
I ran into the same problem a while ago and found this question. Although the other answers are decent, I felt that there was still room for improvement. Based on the other answers to this question, I created Elementary, a suite of JUnit 5 extensions that provide a real annotation processing environment for unit tests.
Most libraries test annotation processors by running them. However, most annotation processors are pretty complex and are broken down into finer-grained components. It is not feasible to test individual components by running the entire annotation processor; instead, we make the annotation processing environment available to the tests.
The following code snippet illustrates how to test a Lint component:
import com.karuslabs.elementary.junit.Cases;
import com.karuslabs.elementary.junit.Tools;
import com.karuslabs.elementary.junit.ToolsExtension;
import com.karuslabs.elementary.junit.annotations.Case;
import com.karuslabs.elementary.junit.annotations.Introspect;
import com.karuslabs.utilitary.type.TypeMirrors;

import javax.lang.model.element.Element;
import javax.lang.model.element.VariableElement;
import javax.lang.model.type.TypeMirror;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;

import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

@ExtendWith(ToolsExtension.class)
@Introspect
class ToolsExtensionExampleTest {

    Lint lint = new Lint(Tools.typeMirrors());

    @Test
    void lint_string_variable(Cases cases) {
        var first = cases.one("first");
        assertTrue(lint.lint(first));
    }

    @Test
    void lint_method_that_returns_string(Cases cases) {
        var second = cases.get(1);
        assertFalse(lint.lint(second));
    }

    @Case("first") String first;
    @Case String second() { return ""; }
}

class Lint {

    final TypeMirrors types;
    final TypeMirror expectedType;

    Lint(TypeMirrors types) {
        this.types = types;
        this.expectedType = types.type(String.class);
    }

    public boolean lint(Element element) {
        if (!(element instanceof VariableElement)) {
            return false;
        }
        var variable = (VariableElement) element;
        return types.isSameType(expectedType, variable.asType());
    }
}
By annotating the test class with @Introspect and the test cases with @Case, we can declare test cases in the same file as the tests. The corresponding Element representation of a test case can then be retrieved by a test using Cases.
If anyone is interested, I wrote an article, The Problem with Annotation Processors, that details the problems with unit testing annotation processors.
I have used http://hg.netbeans.org/core-main/raw-file/default/openide.util.lookup/test/unit/src/org/openide/util/test/AnnotationProcessorTestUtils.java though this is based on java.io.File for simplicity and so has the performance overhead you complain about.
Thomas's suggestion of mocking the whole JSR 269 environment would lead to a pure unit test. You might instead want to write more of an integration test which checks how your processor actually runs inside javac, giving more assurance it is correct, but merely want to avoid disk files. Doing this would require you to write a mock JavaFileManager, which is unfortunately not as easy as it seems and I have no examples handy, but you should not need to mock other things like Element interfaces.
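Building on that suggestion, a rough starting point for an in-memory file manager could look like the sketch below. It only captures class-file output in byte arrays (names are assumptions; handling generated sources, listing and loading the generated classes is the part that takes real work):

import java.io.ByteArrayOutputStream;
import java.io.OutputStream;
import java.net.URI;
import java.util.HashMap;
import java.util.Map;
import javax.tools.FileObject;
import javax.tools.ForwardingJavaFileManager;
import javax.tools.JavaFileObject;
import javax.tools.JavaFileObject.Kind;
import javax.tools.SimpleJavaFileObject;
import javax.tools.StandardJavaFileManager;

public class InMemoryFileManager extends ForwardingJavaFileManager<StandardJavaFileManager> {

    private final Map<String, ByteArrayOutputStream> classOutputs = new HashMap<String, ByteArrayOutputStream>();

    public InMemoryFileManager(StandardJavaFileManager delegate) {
        super(delegate);
    }

    @Override
    public JavaFileObject getJavaFileForOutput(Location location, String className,
            Kind kind, FileObject sibling) {
        final ByteArrayOutputStream out = new ByteArrayOutputStream();
        classOutputs.put(className, out);
        URI uri = URI.create("mem:///" + className.replace('.', '/') + kind.extension);
        return new SimpleJavaFileObject(uri, kind) {
            @Override
            public OutputStream openOutputStream() {
                return out;
            }
        };
    }

    // Compiled bytes per class name, available to assertions after compilation
    public Map<String, ByteArrayOutputStream> getClassOutputs() {
        return classOutputs;
    }
}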
One option is to bundle all of the tests into a single class. The half second for compiling etc. is then a constant for a given set of tests, and the real time per test is negligible, I assume.