I am writing Android JUnit test cases for Project A, which has Project B as a library (Project Properties / Android / Library). My test cases need to access resources (view IDs, strings, etc.) from both Project A and Project B. If I add Project A and/or Project B as Android / Libraries or as Java Build Path / Projects to my test project, any call to ActivityInstrumentationTestCase2.getActivity() throws a ClassCastException.
If I don't add them, the call returns the appropriate Activity, but I don't have access to the resource ids of the two projects under test. I also don't have access to the objects in Project B, which are needed to properly test Project A. I can't use mock objects here.
Has anybody encountered and resolved this before?
Try this to access resources from the library (I think it will work):
getInstrumentation().getTargetContext().getResources()...;
This will load resources from the test project (this definitely works):
getInstrumentation().getContext().getResources()...;
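To make this concrete, here is a minimal sketch of a test that reads resources from both contexts by name; MainActivity, the package name com.example.projecta, and the string name login_title are all hypothetical placeholders:

import android.content.res.Resources;
import android.test.ActivityInstrumentationTestCase2;

public class MainActivityTest extends ActivityInstrumentationTestCase2<MainActivity> {

    public MainActivityTest() {
        super(MainActivity.class);
    }

    public void testResourceLookup() {
        // Resources of the application under test (Project A, including
        // resources merged in from library Project B).
        Resources targetRes = getInstrumentation().getTargetContext().getResources();
        // Resolving by name avoids compiling the test project against A's R class.
        int id = targetRes.getIdentifier("login_title", "string", "com.example.projecta");
        assertTrue("resource not found", id != 0);
        assertNotNull(targetRes.getString(id));

        // Resources packaged with the test project itself.
        Resources testRes = getInstrumentation().getContext().getResources();
        assertNotNull(testRes);
    }
}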
I'm developing a plugin for IntelliJ IDEA. How can the plugin get the name and version of the libraries that are imported into the project being checked by the plugin? I have a PsiClass from the project, but I cannot convert it to java.lang.Class. Maybe there's a way to get a ClassLoader from a PsiElement?
super.visitImportStatement(psiImport);
Class importedClass = Class.forName(psiImport.getQualifiedName(), true, psiImport.getClass().getClassLoader());
psiImport.getClass().getClassLoader() returns the ClassLoader of the class PsiImportStatementImpl instead of the ClassLoader of the class I've imported.
IntelliJ does mostly static analysis on your code. In fact, the IDE and the projects you run/debug have completely different classpaths. When you open a project, your dependencies are not added to the IDE classpath. Instead, the IDE indexes the JARs, meaning it automatically discovers all the declarations (classes, methods, interfaces, etc.) and saves them in a cache for later.
When you write code in your editor, the static analysis tool will leverage the contents of this index to validate your code and show errors when you're trying to use unknown definitions for example.
On the other hand, when you run a Main class from your project, it will spawn a new java process that has its own classpath. This classpath will likely contain every dependency declared in your module.
Knowing this, you should now understand why you can't "transform" a PsiClass to a corresponding Class.
Back to your original question:
How can the plugin get the name and version of the libraries that are imported into the project being checked by the plugin?
You don't need to access Class objects for this. Instead, you can use IntelliJ SDK libraries. Here's an example:
import com.intellij.openapi.module.Module;
import com.intellij.openapi.module.ModuleUtil;
import com.intellij.openapi.roots.ModuleRootManager;

// Find the module that owns the file, then walk its dependency order entries.
Module mod = ModuleUtil.findModuleForFile(virtualFile, myProject);
ModuleRootManager.getInstance(mod).orderEntries().forEachLibrary(library -> {
    // do your thing here with `library`
    return true; // true = keep iterating over the remaining libraries
});
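To get a name and version out of each library: in IDEA, libraries imported from Maven are typically named along the lines of "Maven: com.example:some-artifact:1.2.3", so (an assumption-laden sketch; the convention differs for non-Maven libraries) you can parse library.getName() inside the lambda:

String name = library.getName(); // e.g. "Maven: com.example:some-artifact:1.2.3" (format assumed)
if (name != null && name.startsWith("Maven: ")) {
    String[] coords = name.substring("Maven: ".length()).split(":");
    String groupId = coords[0];
    String artifactId = coords[1];
    String version = coords[coords.length - 1]; // last segment is the version
}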
I'm seeing some weird behavior in my project using Spring. I have this structure:
Main-Spring-Project
|_Depends on Library A
|_Depends on Library B
Now... there is an @Autowired in a component in A which injects a component from B.
When I have all 3 projects open in my development environment and start the main project, everything works fine. But if I close project A and leave B open, it all crashes on startup, saying:
Field factory in [class in A] required a bean of type '[class in B]' that could not be found.
So... I suspect it is loading A before B.
What I don't understand is why it works perfectly when I have both projects open, and why it crashes when A is closed (the main project is then using its JAR).
By the way... if I close both A and B, everything works perfectly again.
No, Spring doesn't make a mistake during the context loading.
This behavior makes sense: the class is just not available at runtime.
Here you work with the snapshot/current development source code/compiled classes of library B, since library B is part of your current development code.
It means that the dependencies/classes of library B needed while the application runs are not retrieved through Maven but from what your IDE has access to: the library A and B projects.
To avoid that, you should install (mvn clean install) the B dependency into your local Maven/Gradle repository, but that may not be practical if you have to repeat the task 20 times a day because the B source code changes 20 times a day.
Note that if you didn't use your IDE, you would be constrained to install the dependency at each modification. So finally, it is not so bad at all, no?
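In practice that cycle looks something like this (the directory names are assumptions):

cd library-b
mvn clean install     # installs B's freshly built JAR into the local repository
cd ../main-project
mvn clean package     # the main build now resolves B from the local repository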
We have a very productive and robust SPL (software production line) for the .NET platform for creating web applications and HTTP services.
Now we want to bring that knowledge to Android.
This is the scenario:
Developer A gets our Android framework and Project A into the following paths:
D:\Android\Framework
D:\Android\ProjectA
And of course, Project A should reuse code and resources in Framework (layouts, Java utilities, etc.).
Developer B's setup is:
C:\Users\Jack\AndroidStudioProjects\Framework
C:\Users\Jack\AndroidStudioProjects\ProjectB
Again, project B reuses Framework libraries.
Both developer A and developer B have an AndroidProjectsRoot environment variable defined on their systems. For developer A it refers to D:\Android, and for developer B the path is C:\Users\Jack\AndroidStudioProjects.
It's a team convention that we all have one root folder and check each project out into a direct child directory of that root folder.
We use Android Studio and Gradle, and this is where we're stuck: we can't make project A or B build using the Framework libraries. Framework has these libraries:
--jzp.framework
--validation
--http
-- more libraries here
Inside project A/B's settings.gradle we have:
include ':app', ':http', ':validation' //, more includes here
project(':http').projectDir = new File('$System.getenv("")/Framework/http/libs')
project(':validation').projectDir = new File('$System.getenv("")/Framework/validation/libs')
// more directory configurations here
Then in the app's gradle we have:
dependencies {
// other dependencies here
compile project(':http')
compile project(':validation')
// more compile statements here
}
However, projects A/B won't build, and we can't see our libraries' packages inside projects A/B.
What might be wrong? How can we add dependencies on other modules elsewhere on the disk, via environment variables?
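For what it's worth, Groovy only interpolates variables inside double-quoted strings using ${...}, and getenv("") is passed an empty variable name, so the paths above never resolve. A minimal sketch of a settings.gradle that might work, assuming the AndroidProjectsRoot variable from the question and that each library's module root (rather than its libs subfolder) is the intended projectDir:

include ':app', ':http', ':validation'

def projectsRoot = System.getenv('AndroidProjectsRoot') // e.g. D:\Android (assumed to be set)
project(':http').projectDir = new File("${projectsRoot}/Framework/http")
project(':validation').projectDir = new File("${projectsRoot}/Framework/validation")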
I have following dependency hierarchy in a service:
Service A
-> Lib A
-> Lib B
-> Lib A
-> Lib C
-> Lib A
-> Lib D
-> Lib A
"->" means depends on.
There are a lot of problems that pop up because of the above structure. The one that requires the most effort is keeping Lib A in sync across all modules to avoid class conflicts and other issues.
I am using Maven for dependency management, but it doesn't solve the issue, as I have to update all dependencies to avoid conflicts (semantic and functional).
Is there any better way to manage these dependencies?
Edit 1:
Scenario 1: Adding new code to Lib A that is only going to be used by Lib B.
In this case, it wouldn't be sufficient to change Lib B alone. I will have to add the latest version of Lib A to Service A so that Maven picks up the correct version.
Scenario 2: Non-backward-compatible changes in Lib A.
This will cause problems if I just update Lib A in Service A, because the other libs (B, C and D) don't know about the new change and might break (e.g. adding a new argument to an existing method). I will have to update all of them.
Scenario 3: Changing an existing method in Lib A.
This will work fine. If I update Lib A in Service A, Maven will pick up the latest Lib A and all libs (B, C and D) will use the latest version.
I'm not sure if I understand the question right; are you asking about dependency conflicts between libraries?
If you have the same dependencies across libraries, one way of sorting it out could be to use exclusions; that way you can add one dependency and exclude the others (https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html).
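As an illustration, an exclusion in a POM looks something like this (the coordinates are hypothetical placeholders):

<dependency>
    <groupId>com.example</groupId>
    <artifactId>lib-b</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <!-- Exclude the transitive copy of Lib A so the version declared
             directly in Service A wins -->
        <exclusion>
            <groupId>com.example</groupId>
            <artifactId>lib-a</artifactId>
        </exclusion>
    </exclusions>
</dependency>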
You might look into OSGi or Java 9 modules (not finished yet, of course). I am not an expert, but I believe these frameworks allow for a concept of modules/bundles where each module has its own class loader. This means that if Lib B and Lib C were modules, they could have different versions of Lib A and everything would work smoothly. It does prevent Lib B and Lib C from communicating directly, but given your dependency tree that should be desirable.
--Update to match your update--
Scenario 1:
Update the Lib A version only in module B. Modules C/D can remain unchanged.
Scenario 2:
You can update Lib A in modules B/C/D independently, whenever each of those modules is ready to use the new version of Lib A.
Scenario 3:
No issue.
Think about Domain-Driven Design, and more specifically the Bounded Context pattern. It allows code duplication to a certain degree, and that flexibility will let you decouple "jar A" from the rest very easily.
What exactly is a Bounded Context?
Let's imagine that we have a User. In one context this User can be an AdminUser, in another a PowerUser, and in yet another just a User.
Now let's imagine that we have 3 services addressing three different functions of a User based on its type. Each service represents a different context for the term User. The point here is that instead of creating one almighty "User" (in your case it would go into the almighty "libA"), we create 3 users based on the context and give each of them its own functions. Eventually we end up with 3 domain objects: AdminUser for service 1, PowerUser for service 2, and just User for service 3. These three services will not have dependencies on each other unless it suits us, and even if they do, it will be a piece of cake to decouple them at any point.
Think of a service as an onion: on the first layer we have persistence, on the second we start unwrapping our domain model, on the next we build our services, then our network representation, and so on.
Thinking in terms of Bounded Contexts allows you to create a unique bounded context per service, and this frees you from these interdependencies between the different JARs, because your services will not necessarily share code with other services.
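A minimal Java sketch of the idea, with hypothetical package and method names; each service owns its own user type instead of sharing one from an almighty libA:

// In admin-service (package com.example.admin.domain, hypothetical):
public class AdminUser {
    public void suspendAccount(String accountId) { /* admin-only behaviour */ }
}

// In power-service (package com.example.power.domain, hypothetical):
public class PowerUser {
    public void runBatchExport() { /* power-user behaviour */ }
}

// In user-service (package com.example.user.domain, hypothetical):
public class User {
    public void updateProfile(String displayName) { /* plain user behaviour */ }
}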
Service A
-> Project Commons
-> Lib B_A (functional dependency; commons for B, no functional dependency on C and D)
-> Lib C_A (functional dependency; commons for C, no functional dependency on B and D)
-> Lib D_A (functional dependency; commons for D, no functional dependency on B and C)
-> Lib A (technical dependencies; commons for B, C and D)
-> pom_parent
-> Lib B
-> Lib A and Lib B_A
-> Lib C
-> Lib A and Lib C_A
-> Lib D
-> Lib A and Lib D_A
pom_parent (SNAPSHOT)
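The pom_parent is where this layout pays off: it can pin the shared Lib A version once via dependencyManagement, so every module inherits the same version. A hedged sketch with hypothetical coordinates:

<dependencyManagement>
    <dependencies>
        <!-- Every child module inherits this Lib A version -->
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>lib-a</artifactId>
            <version>2.3.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>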
I am new to Maven; we are converting an Ant-based project into a Maven project. Everything is working fine. Additionally, we need to compile the source code package-wise.
To be more clear, we have three packages in the src/main/java folder, namely dao, svc and controller. I want to compile dao first, then compile svc with a reference to dao, and similarly compile controller with a reference to svc only, but not dao.
Finally, the goal is to make sure that the controllers are not using any of the dao classes; they may use svc classes only. If this condition fails, the Maven build has to fail.
Please suggest.
It sounds like you need a multi-module Maven project. Create a parent project whose task is simply to aggregate your three modules and to provide one thing to build. Create one module for each of your packages, then define dependencies between those modules in the individual POM files.
The Maven build system is clever enough to know in which order to build the modules, based on the dependencies you declare between them. In cases where you don't define a dependency (e.g. between controller and dao), the controller module cannot access classes in the dao module.
The final source layout will be something like:
your-project
your-project/pom.xml <--- parent POM
your-project/dao
your-project/dao/pom.xml
your-project/dao/src/main/...
your-project/svc
your-project/svc/pom.xml
your-project/svc/src/main/...
your-project/controller
your-project/controller/pom.xml
your-project/controller/src/main/...
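A hedged sketch of the key POM fragments (groupId and version values are placeholders): the parent lists the modules, and controller declares a dependency on svc only, so dao classes never reach its compile classpath:

<!-- your-project/pom.xml -->
<modules>
    <module>dao</module>
    <module>svc</module>
    <module>controller</module>
</modules>

<!-- your-project/controller/pom.xml -->
<dependencies>
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>svc</artifactId>
        <version>${project.version}</version>
    </dependency>
    <!-- no dao dependency here, so dao imports fail the controller build -->
</dependencies>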
To be more clear, we have three packages in the src/main/java folder, namely dao, svc and controller. I want to compile dao first, then compile svc with a reference to dao, and similarly compile controller with a reference to svc only, but not dao.
Judging by your requirements, I think you more or less need to set up a Maven-based multi-module project. Take a look at these links:
Link 1
Link 2