I ran into a few problems while programming with DeepLearning4j.
When I open and compile the MnistMultiThreadedExample example in Eclipse, the following import errors occur:
import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.datasets.test.TestDataSetIterator;
import org.deeplearning4j.iterativereduce.actor.multilayer.ActorNetworkRunner;**(error)**
import org.deeplearning4j.models.classifiers.dbn.DBN;**(error)**
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.scaleout.conf.Conf;**(error)**
Eclipse says these packages cannot be resolved. I couldn't find the corresponding modules in the Maven Central Repository, nor could I find the classes in the source code.
Now I want to know how to get these modules, and what I should do before creating an AutoEncoder that can run on Spark.
The example code is shown below:
import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.datasets.test.TestDataSetIterator;
import org.deeplearning4j.iterativereduce.actor.multilayer.ActorNetworkRunner;
import org.deeplearning4j.models.classifiers.dbn.DBN;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.scaleout.conf.Conf;
public class MnistMultiThreadedExample {

    public static void main(String[] args) throws Exception {
        // 5 batches of 100: 20 each
        MnistDataSetIterator mnist = new MnistDataSetIterator(20, 60000);
        TestDataSetIterator iter = new TestDataSetIterator(mnist);

        ActorNetworkRunner runner = new ActorNetworkRunner(iter);

        NeuralNetConfiguration conf2 = new NeuralNetConfiguration.Builder()
                .nIn(784).nOut(10).build();

        Conf conf = new Conf();
        conf.setConf(conf2);
        conf.getConf().setFinetuneEpochs(1000);
        conf.setLayerSizes(new int[]{500, 250, 100});
        conf.setMultiLayerClazz(DBN.class);
        conf.getConf().setnOut(10);
        conf.getConf().setFinetuneLearningRate(0.0001f);
        conf.getConf().setnIn(784);
        conf.getConf().setL2(0.001f);
        conf.getConf().setMomentum(0.5f);
        conf.setSplit(10);
        conf.getConf().setUseRegularization(false);
        conf.setDeepLearningParams(new Object[]{1, 0.0001, 1000});

        runner.setup(conf);
        runner.train();
    }
}
You should add the following dependency to your POM:
<dependency>
    <groupId>org.deeplearning4j</groupId>
    <artifactId>deeplearning4j-scaleout-akka</artifactId>
    <version>0.0.3.3</version>
</dependency>
This will pull in deeplearning4j-scaleout-api and deeplearning4j-core as transitive dependencies. Together, those three artifacts provide the imports you are missing.
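If the imports still do not resolve after adding that dependency, it is worth checking that the expected artifacts actually made it onto the classpath. Assuming a standard Maven setup, the dependency tree can be filtered to the DeepLearning4j group:

mvn dependency:tree -Dincludes=org.deeplearning4j

deeplearning4j-scaleout-akka, deeplearning4j-scaleout-api and deeplearning4j-core should all show up in the output.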
Related
I am setting up a back-end test using Java.
When running my test I am presented with the following error:
java.lang.NoSuchMethodError: io.netty.util.internal.PlatformDependent.allocateUninitializedArray(I)[B
My POM file contains the following dependency with regard to Netty:
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-transport</artifactId>
    <version>4.1.36.Final</version>
</dependency>
My code itself looks as follows:
import cucumber.api.java.en.And;
import org.mockserver.client.MockServerClient;
import org.mockserver.matchers.Times;
import org.mockserver.model.HttpRequest;
import org.mockserver.model.HttpResponse;
import org.springframework.beans.factory.annotation.Autowired;
import java.io.IOException;
import java.net.URISyntaxException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
public class PdfGenerateStep {

    @Autowired
    private MockServerClient mockServerClient;

    @And("Pdf {string} is generated")
    public void generatePDF(String pdfFile) {
        HttpRequest httpRequest = new HttpRequest();
        httpRequest.withPath("/pdf-service/doc/request")
                .withHeader("template", "TEST")
                .withHeader("docFormat", "pdf")
                .withHeader("fromParty", "PDFGEN")
                .withHeader("APPLICATION", "App")
                .withMethod("POST");

        HttpResponse httpResponse = new HttpResponse();
        httpResponse.withStatusCode(200);
        httpResponse.withBody(readPdfFile(pdfFile));

        mockServerClient.when(httpRequest, Times.once()).respond(httpResponse);
    }

    private byte[] readPdfFile(String file) {
        try {
            Path path = Paths.get(getClass().getClassLoader().getResource(file).toURI());
            return Files.readAllBytes(path);
        } catch (URISyntaxException | IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}
The io.netty.util.internal.PlatformDependent class that is loaded at runtime has no allocateUninitializedArray method, so there must be another JAR on your classpath that contains this class in a different version, and therefore with different code.
The io.netty.util.internal.PlatformDependent class lives in netty-common, which is a transitive dependency of netty-transport.
So, check the dependency tree of your project.
Very probably another of your dependencies pulls in a different version of netty-common transitively.
Exclude the wrong one and you will be done.
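As a sketch of what that looks like (the outer artifact below is hypothetical; mvn dependency:tree -Dincludes=io.netty:netty-common will tell you which of your real dependencies is the culprit):

<!-- placeholder: replace with the dependency that actually drags in the old netty-common -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>library-with-old-netty</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty-common</artifactId>
        </exclusion>
    </exclusions>
</dependency>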
<dependency>
    <groupId>com.corundumstudio.socketio</groupId>
    <artifactId>netty-socketio</artifactId>
    <version>1.7.13</version>
</dependency>
Add this dependency to your pom.xml file.
I'm trying to use the Reflections library to give me a list of all the classes in a specific package in an external jar file. This works in Netbeans, but not when running the jar file from the command line. It looks like Netbeans finds and loads the classes I need, whereas the command line run doesn't. How should I set this up so it works everywhere?
I've tried both the usage example from the Reflections readme and the response to this issue. Both methods give the same result.
Here's the test code I've been working with to reproduce the issue:
package javaapplication1;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import org.reflections.Reflections;
import org.reflections.util.ConfigurationBuilder;
import org.reflections.util.ClasspathHelper;
import org.reflections.scanners.SubTypesScanner;
import org.reflections.scanners.ResourcesScanner;
import java.util.Set;
import org.externaljar.package.*;
public class JavaApplication1 {

    private static Reflections reflections;

    public static void main(String[] args) {
        final String myPkg = "org.externaljar.package";

        URL[] urlPath = new URL[1];
        try {
            urlPath[0] = new URL("jar:file:/path/to/external.jar!/");
        } catch (MalformedURLException ex) {
            ex.printStackTrace();
        }

        URLClassLoader urlLoader = URLClassLoader.newInstance(urlPath);

        final ConfigurationBuilder config = new ConfigurationBuilder()
                .addClassLoader(urlLoader)
                .setScanners(new ResourcesScanner(), new SubTypesScanner(false))
                .setUrls(ClasspathHelper.forPackage(myPkg));

        reflections = new Reflections(config, new SubTypesScanner(false));

        Set<Class<? extends ObjectBase>> objects = reflections.getSubTypesOf(ObjectBase.class);
        System.out.println("\n\nFound " + objects.size() + " Objects\n\n");
    }
}
Running the project in Netbeans gives a non-zero value for objects.size(), but running java -jar JavaApplication1.jar prints "Found 0 Objects". Adding -verbose:class to each shows that Netbeans loads all the classes I'm looking for, but those aren't loaded when run from the command line.
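One likely explanation for that difference is that ClasspathHelper.forPackage(myPkg) only collects URLs that are already on the application classpath, which the IDE sets up for you but a plain java -jar run does not. A minimal sketch, assuming the external jar is only reachable through the URLClassLoader, is to hand Reflections that loader's URLs explicitly; this replaces the ConfigurationBuilder part of main() above and reuses the question's imports:

// Sketch: scan the external jar's URLs directly instead of relying on
// ClasspathHelper.forPackage(), which only sees the current classpath.
URLClassLoader urlLoader = URLClassLoader.newInstance(urlPath);

final ConfigurationBuilder config = new ConfigurationBuilder()
        .addClassLoader(urlLoader)
        .setScanners(new ResourcesScanner(), new SubTypesScanner(false))
        .setUrls(ClasspathHelper.forClassLoader(urlLoader)); // URLs seen by the jar's loader

reflections = new Reflections(config);

Set<Class<? extends ObjectBase>> objects = reflections.getSubTypesOf(ObjectBase.class);
System.out.println("Found " + objects.size() + " Objects");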
I am currently building a framework which would benefit from having a DSL for creating a configuration file, so I created one using Xtext.
Now I want to add a dependency on the classes that I have created so that I can generate configurations at runtime, but on Xtext's site it looks like the only two supported integration cases are:
When I want CI for the language itself;
When I want to include a plugin that would generate code at build time.
How can I use the generator that I wrote in Xtext at runtime in my Maven project?
For CI of the Xtext language itself, simply use the new project wizard and select Maven as the build system on the second page of the wizard. To build your model files, have a look at the xtext-maven-plugin, e.g. as used here https://github.com/xtext/maven-xtext-example/blob/master/example-project/pom.xml or here https://github.com/cdietrich/xtext-maven-example/blob/master/org.xtext.example.mydsl.model/pom.xml
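In the consuming project the plugin configuration looks roughly like this (a sketch based on the linked examples; the version, the StandaloneSetup class and the DSL coordinates are placeholders to replace with your own):

<plugin>
    <groupId>org.eclipse.xtext</groupId>
    <artifactId>xtext-maven-plugin</artifactId>
    <version>2.9.2</version>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <languages>
            <language>
                <setup>org.xtext.example.mydsl.MyDslStandaloneSetup</setup>
                <outputConfigurations>
                    <outputConfiguration>
                        <outputDirectory>src/main/generated-sources/xtext</outputDirectory>
                    </outputConfiguration>
                </outputConfigurations>
            </language>
        </languages>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.xtext.example.mydsl</groupId>
            <artifactId>org.xtext.example.mydsl</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</plugin>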
If you simply want to read a model file and call the generator from your own code:
package org.eclipse.xtext.example.domainmodel;
import java.util.ArrayList;
import java.util.List;
import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.xtext.generator.GeneratorContext;
import org.eclipse.xtext.generator.GeneratorDelegate;
import org.eclipse.xtext.generator.IGeneratorContext;
import org.eclipse.xtext.generator.JavaIoFileSystemAccess;
import org.eclipse.xtext.util.CancelIndicator;
import org.eclipse.xtext.validation.CheckMode;
import org.eclipse.xtext.validation.IResourceValidator;
import org.eclipse.xtext.validation.Issue;
import com.google.common.collect.Lists;
import com.google.inject.Injector;
/**
 * @author dietrich - Initial contribution and API
 */
public class Main {

    public static void main(String[] args) {
        // TODO traverse directory
        List<String> files = Lists.newArrayList("model/a.dmodel", "model/b.dmodel");

        Injector injector = new DomainmodelStandaloneSetup().createInjectorAndDoEMFRegistration();
        ResourceSet rs = injector.getInstance(ResourceSet.class);

        ArrayList<Resource> resources = Lists.newArrayList();
        for (String file : files) {
            Resource r = rs.getResource(URI.createFileURI(file), true);
            resources.add(r);
        }

        IResourceValidator validator = injector.getInstance(IResourceValidator.class);
        for (Resource r : resources) {
            List<Issue> issues = validator.validate(r, CheckMode.ALL, CancelIndicator.NullImpl);
            for (Issue i : issues) {
                System.out.println(i);
            }
        }

        GeneratorDelegate generator = injector.getInstance(GeneratorDelegate.class);
        JavaIoFileSystemAccess fsa = injector.getInstance(JavaIoFileSystemAccess.class);
        fsa.setOutputPath("src-gen-code/");

        GeneratorContext context = new GeneratorContext();
        context.setCancelIndicator(CancelIndicator.NullImpl);

        for (Resource r : resources) {
            generator.generate(r, fsa, context);
        }
    }
}
I am using the following code to create a list of the rules fired in ODM, but Eclipse is showing a compilation error.
package com.cper.brms.model.questions;
import ilog.rules.res.session.IlrSessionRequest;
import ilog.rules.res.session.IlrSessionResponse;
import ilog.rules.res.session.ruleset.IlrBusinessExecutionTrace;
import ilog.rules.res.session.ruleset.IlrExecutionTrace;
import ilog.rules.teamserver.auth.AuthenticationCredentials;
import ilog.rules.teamserver.model.IlrConnectException;
import ilog.rules.teamserver.model.IlrSession;
import ilog.rules.teamserver.model.IlrSessionFactory;
import java.util.List;
import java.util.Map;
public class RulesTrace<IlrStatelessSession> {

    IlrSessionFactory sessionFactory = new IlrJ2SESessionFactory();
    IlrSessionRequest sessionRequest = sessionFactory.createRequest();
    String rulesetPath = "/miniloanruleapp/miniloanrules";

    // In the original post the statements below appeared outside of any method;
    // they are wrapped in one here so the snippet reads as a single class.
    // loan and borrower are assumed to be the miniloan sample objects (not shown).
    public void executeAndTrace() throws Exception {
        sessionRequest.setRulesetPath(IlrPath.parsePath(rulesetPath));
        sessionRequest.setTraceEnabled(true);
        sessionRequest.getTraceFilter().setInfoAllFilters(true);

        Map<String, Object> inputParameters = sessionRequest.getInputParameters();
        inputParameters.put("loan", loan);
        inputParameters.put("borrower", borrower);

        IlrStatelessSession session = sessionFactory.createStatelessSession();
        IlrSessionResponse response = session.execute(sessionRequest);

        IlrExecutionTrace sessionTrace = response.getRulesetExecutionTrace();
        int rulesNumber = sessionTrace.getTotalRulesFired();

        IlrBusinessExecutionTrace execResult = new IlrBusinessExecutionTrace(response.getRulesetExecutionTrace());
        List<String> rulesFired = execResult.getRuleFiredBusinessNames();

        loan = (Loan) response.getOutputParameters().get("loan");
    }
}
Do I need to write any custom code to create the sessionFactory, or am I missing a JAR?
You are missing jrules-res-execution.jar from your project. Adding it should solve the problem.
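If your project is built with Maven, one option is to install the jar from your ODM installation into the local repository and then declare it like any other dependency. The ODM jars are not published to Maven Central, so the coordinates below are placeholders and the file path depends on your installation:

mvn install:install-file -Dfile=/path/to/ODM/executionserver/lib/jrules-res-execution.jar -DgroupId=com.ibm.odm -DartifactId=jrules-res-execution -Dversion=8.0 -Dpackaging=jar

<dependency>
    <groupId>com.ibm.odm</groupId>
    <artifactId>jrules-res-execution</artifactId>
    <version>8.0</version>
</dependency>

In a plain Eclipse project, adding the jar to the build path has the same effect.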
It was a version mismatch causing the error. I have the Java EE setup in my Eclipse, but the above code is for Java SE. I changed the code to the right version.
I get an error when I try to run:
java -cp 'angus-sdk-java-0.0.2-jar-with-dependencies.jar:.' FaceDetect
I am following a tutorial for face detection at http://angus-doc.readthedocs.io/en/latest/getting-started/java.html. Below is my Java code:
import java.io.IOException;
import org.json.simple.JSONObject;
import ai.angus.sdk.Configuration;
import ai.angus.sdk.Job;
import ai.angus.sdk.ProcessException;
import ai.angus.sdk.Root;
import ai.angus.sdk.Service;
import ai.angus.sdk.impl.ConfigurationImpl;
import ai.angus.sdk.impl.File;
public class FaceDetect {

    public static void main(String[] args) throws IOException, ProcessException {
        Configuration conf = new ConfigurationImpl();
        Root root = conf.connect();
        Service service = root.getServices().getService("age_and_gender_estimation", 1);

        JSONObject params = new JSONObject();
        params.put("image", new File("Downloads/IMG_1060.jpg"));

        Job job = service.process(params);

        System.out.println(job.getResult().toJSONString());
    }
}
I don't understand what the problem is. I have tried all the related answers on Stack Overflow, but nothing works for me.
Remove the single quotes around the classpath:
java -cp angus-sdk-java-0.0.2-jar-with-dependencies.jar:. FaceDetect
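Note that the quoting and separator rules depend on your shell: the command above is for Linux/macOS, where classpath entries are separated by a colon. On Windows the separator is a semicolon, so the equivalent command would be:

java -cp angus-sdk-java-0.0.2-jar-with-dependencies.jar;. FaceDetect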