Loading classes for Reflections library outside Netbeans - java

I'm trying to use the Reflections library to give me a list of all the classes in a specific package in an external jar file. This works in Netbeans, but not when running the jar file from the command line. It looks like Netbeans finds and loads the classes I need, whereas the command line run doesn't. How should I set this up so it works everywhere?
I've tried both the usage example on the Reflections readme, as well as the response to this issue. Both methods have the same result.
Here's the test code I've been working with to reproduce the issue:
package javaapplication1;

import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.Set;
import org.reflections.Reflections;
import org.reflections.scanners.ResourcesScanner;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;
import org.externaljar.package.*;

public class JavaApplication1 {

    private static Reflections reflections;

    public static void main(String[] args) {
        final String myPkg = "org.externaljar.package";

        URL[] urlPath = new URL[1];
        try {
            urlPath[0] = new URL("jar:file:/path/to/external.jar!/");
        } catch (MalformedURLException ex) {
            ex.printStackTrace();
        }
        URLClassLoader urlLoader = URLClassLoader.newInstance(urlPath);

        final ConfigurationBuilder config = new ConfigurationBuilder()
                .addClassLoader(urlLoader)
                .setScanners(new ResourcesScanner(), new SubTypesScanner(false))
                .setUrls(ClasspathHelper.forPackage(myPkg));
        reflections = new Reflections(config, new SubTypesScanner(false));

        Set<Class<? extends ObjectBase>> objects = reflections.getSubTypesOf(ObjectBase.class);
        System.out.println("\n\nFound " + objects.size() + " Objects\n\n");
    }
}
Running the project in Netbeans gives a non-zero value for objects.size(), but running java -jar JavaApplication1.jar prints "Found 0 Objects". Adding -verbose:class to each shows that Netbeans loads all the classes I'm looking for, but those aren't loaded when run from the command line.
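A minimal sketch of one configuration worth trying (assuming Reflections 0.9.x; org.externaljar.package and ObjectBase are the placeholders from the question): hand the jar URL to setUrls directly instead of going through ClasspathHelper.forPackage, which only searches the classloaders of the running application and therefore sees the package in NetBeans (where the external jar is on the project classpath) but not under java -jar.

import java.net.URL;
import java.net.URLClassLoader;
import java.util.Set;
import org.reflections.Reflections;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.util.ConfigurationBuilder;
import org.externaljar.package.*; // placeholder package and ObjectBase from the question

public class ReflectionsFromJar {

    public static void main(String[] args) throws Exception {
        // Point Reflections at the external jar explicitly, so the scan does not
        // depend on whatever classpath the launcher (NetBeans or java -jar) provides.
        URL jarUrl = new URL("jar:file:/path/to/external.jar!/");
        URLClassLoader urlLoader = URLClassLoader.newInstance(new URL[]{jarUrl});

        Reflections reflections = new Reflections(new ConfigurationBuilder()
                .setUrls(jarUrl)              // scan the jar itself, not the application classpath
                .addClassLoader(urlLoader)    // resolve the scanned type names with this loader
                .setScanners(new SubTypesScanner(false)));

        Set<Class<? extends ObjectBase>> objects = reflections.getSubTypesOf(ObjectBase.class);
        System.out.println("Found " + objects.size() + " Objects");
    }
}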

Related

Hadoop is complaining about a nonexistent anonymous class (NoClassDefFoundError)

Consider a simple Java file which creates a BufferedInputStream to copy a local file 1400-8.txt to Hadoop HDFS and prints some dots as a progress indicator. The example is Example 3-3 from the Hadoop book here.
// cc FileCopyWithProgress Copies a local file to a Hadoop filesystem, and shows progress
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

// vv FileCopyWithProgress
public class FileCopyWithProgress {

    public static void main(String[] args) throws Exception {
        String localSrc = args[0];
        String dst = args[1];

        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            public void progress() {
                System.out.print(".");
            }
        });

        IOUtils.copyBytes(in, out, 4096, true);
    }
}
// ^^ FileCopyWithProgress
I compile the code and create the JAR file with
hadoop com.sun.tools.javac.Main FileCopyWithProgress.java
jar cf FileCopyWithProgress.jar FileCopyWithProgress.class
The above commands generate the files FileCopyWithProgress.class, FileCopyWithProgress$1.class and FileCopyWithProgress.jar. Then, I try to run it
hadoop jar FileCopyWithProgress.jar FileCopyWithProgress 1400-8.txt hdfs://localhost:9000/user/kostas/1400-8.txt
But, I receive the error
Exception in thread "main" java.lang.NoClassDefFoundError:
FileCopyWithProgress$1
To my understanding, FileCopyWithProgress$1.class is generated for the anonymous Progressable callback the program declares. But since the file exists, what is the issue here? Am I running the correct sequence of commands?
I found the issue, so I am just posting it in case it helps someone. I had to include the class FileCopyWithProgress$1.class in the JAR. The correct command is
jar cf FileCopyWithProgress.jar FileCopyWithProgress*.class
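To double-check the fix, listing the archive contents with the standard jar tool should now show both generated class files, FileCopyWithProgress.class and FileCopyWithProgress$1.class:
jar tf FileCopyWithProgress.jar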

How do I include Xtext generator in my Maven project?

I am currently building a framework which would benefit from having a DSL for creating a configuration file, so I created one using Xtext.
Now I want to add a dependency on the classes that I have created so that I can generate configurations at runtime, but on Xtext's site it looks like the only two integration scenarios are:
When I want CI for the language itself;
When I want to include a plugin that would generate code at build time.
How can I use the generator that I wrote in Xtext at runtime in my Maven project?
For CI for the Xtext language itself, simply use the new project wizard and select Maven as the build system on the second page of the wizard. To build your model files, have a look at the xtext-maven-plugin, e.g. as used here https://github.com/xtext/maven-xtext-example/blob/master/example-project/pom.xml or here https://github.com/cdietrich/xtext-maven-example/blob/master/org.xtext.example.mydsl.model/pom.xml
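For reference, a rough sketch of what that plugin configuration looks like in the linked examples (the version, the StandaloneSetup class, and the DSL artifact below are placeholders for your own language):
<plugin>
    <groupId>org.eclipse.xtext</groupId>
    <artifactId>xtext-maven-plugin</artifactId>
    <version><!-- your Xtext version --></version>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <languages>
            <language>
                <!-- placeholder: the StandaloneSetup class of your DSL -->
                <setup>org.xtext.example.mydsl.MyDslStandaloneSetup</setup>
                <outputConfigurations>
                    <outputConfiguration>
                        <outputDirectory>src/main/xtext-gen</outputDirectory>
                    </outputConfiguration>
                </outputConfigurations>
            </language>
        </languages>
    </configuration>
    <dependencies>
        <!-- placeholder: your DSL artifact so the plugin can load the language -->
        <dependency>
            <groupId>org.xtext.example.mydsl</groupId>
            <artifactId>org.xtext.example.mydsl</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</plugin>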
If you simply want to read a model file and call the generator from your own code:
package org.eclipse.xtext.example.domainmodel;

import java.util.ArrayList;
import java.util.List;
import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.xtext.generator.GeneratorContext;
import org.eclipse.xtext.generator.GeneratorDelegate;
import org.eclipse.xtext.generator.IGeneratorContext;
import org.eclipse.xtext.generator.JavaIoFileSystemAccess;
import org.eclipse.xtext.util.CancelIndicator;
import org.eclipse.xtext.validation.CheckMode;
import org.eclipse.xtext.validation.IResourceValidator;
import org.eclipse.xtext.validation.Issue;
import com.google.common.collect.Lists;
import com.google.inject.Injector;

/**
 * @author dietrich - Initial contribution and API
 */
public class Main {

    public static void main(String[] args) {
        // TODO traverse directory
        List<String> files = Lists.newArrayList("model/a.dmodel", "model/b.dmodel");

        Injector injector = new DomainmodelStandaloneSetup().createInjectorAndDoEMFRegistration();
        ResourceSet rs = injector.getInstance(ResourceSet.class);

        ArrayList<Resource> resources = Lists.newArrayList();
        for (String file : files) {
            Resource r = rs.getResource(URI.createFileURI(file), true);
            resources.add(r);
        }

        IResourceValidator validator = injector.getInstance(IResourceValidator.class);
        for (Resource r : resources) {
            List<Issue> issues = validator.validate(r, CheckMode.ALL, CancelIndicator.NullImpl);
            for (Issue i : issues) {
                System.out.println(i);
            }
        }

        GeneratorDelegate generator = injector.getInstance(GeneratorDelegate.class);
        JavaIoFileSystemAccess fsa = injector.getInstance(JavaIoFileSystemAccess.class);
        fsa.setOutputPath("src-gen-code/");

        GeneratorContext context = new GeneratorContext();
        context.setCancelIndicator(CancelIndicator.NullImpl);
        for (Resource r : resources) {
            generator.generate(r, fsa, context);
        }
    }
}

SdkManager class is not available in the latest Android SDK

I downloaded the latest Android Studio (android-studio-bundle-162.3871768-windows).
We were using the com.android.sdklib.SdkManager class in our software, but in the latest Android Studio I can no longer find that class in any jar inside the tools\lib folder.
Can anyone suggest a better alternative?
If you just want to see a list of all installed targets, you can simply run the SDK Manager. But since you want to call the getTargets() method, you apparently need the class programmatically. Check the documentation on the Android Studio web page to find out whether the class you are looking for still exists and where its jar file is located.
The source code of the relevant Android classes can be found at the link below.
https://javalibs.com/artifact/com.android.tools/sdklib?className=com.android.sdklib.tool.SdkManagerCli&source
The SdkManagerCli class has an equivalent method, listPackages(), which lists the installed packages.
You need to add sdklib-25.3.2.jar, repository-25.3.2.jar and common-25.3.2.jar to the project.
Below is working code for listing packages:
import java.io.File;
import java.util.Collection;
import java.util.TreeSet;
import com.android.repository.Revision;
import com.android.repository.api.ConsoleProgressIndicator;
import com.android.repository.api.LocalPackage;
import com.android.repository.api.ProgressIndicator;
import com.android.repository.api.RepoManager;
import com.android.repository.impl.meta.RepositoryPackages;
import com.android.sdklib.repository.AndroidSdkHandler;

public class AndroidTesting {

    public static void main(String[] args) {
        listPackages();
    }

    private static void listPackages() {
        AndroidSdkHandler mHandler = AndroidSdkHandler.getInstance(new File("filePath")); // e.g. sdk/platforms for API
        ProgressIndicator progress = new ConsoleProgressIndicator();
        RepoManager mRepoManager = mHandler.getSdkManager(progress);

        // Load the packages; a cache expiration of 0 forces a fresh scan, and no
        // downloader/settings are needed for a local-only listing
        mRepoManager.loadSynchronously(0, progress, null, null);

        RepositoryPackages packages = mRepoManager.getPackages();
        Collection<LocalPackage> locals = new TreeSet<LocalPackage>();
        Collection<LocalPackage> localObsoletes = new TreeSet<LocalPackage>();
        for (LocalPackage local : packages.getLocalPackages().values()) {
            if (local.obsolete()) {
                localObsoletes.add(local);
            } else {
                locals.add(local);
            }
            Revision version = local.getVersion();
            System.out.println(local.getDisplayName() + " " + version);
        }
    }
}

Could not find or load main class FaceDetect in java with angus.ai

I get the following error when I try to run
java -cp 'angus-sdk-java-0.0.2-jar-with-dependencies.jar:.' FaceDetect
I am following a tutorial for face detection at http://angus-doc.readthedocs.io/en/latest/getting-started/java.html. Below is my Java code:
import java.io.IOException;
import org.json.simple.JSONObject;
import ai.angus.sdk.Configuration;
import ai.angus.sdk.Job;
import ai.angus.sdk.ProcessException;
import ai.angus.sdk.Root;
import ai.angus.sdk.Service;
import ai.angus.sdk.impl.ConfigurationImpl;
import ai.angus.sdk.impl.File;

public class FaceDetect {

    public static void main(String[] args) throws IOException, ProcessException {
        Configuration conf = new ConfigurationImpl();
        Root root = conf.connect();
        Service service = root.getServices().getService("age_and_gender_estimation", 1);

        JSONObject params = new JSONObject();
        params.put("image", new File("Downloads/IMG_1060.jpg"));

        Job job = service.process(params);
        System.out.println(job.getResult().toJSONString());
    }
}
I don't understand the problem with it. I have tried all the answers on Stack Overflow, but nothing is working for me.
Remove the single quotes around the classpath:
java -cp angus-sdk-java-0.0.2-jar-with-dependencies.jar:. FaceDetect
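If you are running this on Windows, note also that the classpath separator is ; rather than :, so the equivalent command there (double quotes work in both cmd and PowerShell) would be:
java -cp "angus-sdk-java-0.0.2-jar-with-dependencies.jar;." FaceDetect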

DeepLearning4j's example compilation error

I ran into a few problems while programming with DeepLearning4j.
When I open and compile the MnistMultiThreadedExample example in Eclipse, the following problems occur.
import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.datasets.test.TestDataSetIterator;
import org.deeplearning4j.iterativereduce.actor.multilayer.ActorNetworkRunner; // error
import org.deeplearning4j.models.classifiers.dbn.DBN; // error
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.scaleout.conf.Conf; // error
The errors say these packages cannot be resolved. I couldn't find these modules in the distribution, in Maven Central, or in the source code.
I'd like to know how to get these modules, and what I should do before building an AutoEncoder that can run on Spark.
The example code is shown below:
import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.datasets.test.TestDataSetIterator;
import org.deeplearning4j.iterativereduce.actor.multilayer.ActorNetworkRunner;
import org.deeplearning4j.models.classifiers.dbn.DBN;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.scaleout.conf.Conf;

public class MnistMultiThreadedExample {

    public static void main(String[] args) throws Exception {
        // 5 batches of 100: 20 each
        MnistDataSetIterator mnist = new MnistDataSetIterator(20, 60000);
        TestDataSetIterator iter = new TestDataSetIterator(mnist);

        ActorNetworkRunner runner = new ActorNetworkRunner(iter);

        NeuralNetConfiguration conf2 = new NeuralNetConfiguration.Builder()
                .nIn(784).nOut(10).build();

        Conf conf = new Conf();
        conf.setConf(conf2);
        conf.getConf().setFinetuneEpochs(1000);
        conf.setLayerSizes(new int[]{500, 250, 100});
        conf.setMultiLayerClazz(DBN.class);
        conf.getConf().setnOut(10);
        conf.getConf().setFinetuneLearningRate(0.0001f);
        conf.getConf().setnIn(784);
        conf.getConf().setL2(0.001f);
        conf.getConf().setMomentum(0.5f);
        conf.setSplit(10);
        conf.getConf().setUseRegularization(false);
        conf.setDeepLearningParams(new Object[]{1, 0.0001, 1000});

        runner.setup(conf);
        runner.train();
    }
}
You should add the following dependency to your POM:
<dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-scaleout-akka</artifactId>
    <version>0.0.3.3</version>
</dependency>
This will pull in deeplearning4j-scaleout-api and deeplearning4j-core as transitive dependencies. Together, those three artifacts provide the imports you are missing.
