I downloaded the ActiveMQ zip file on my Windows system and extracted it. When I try to run activemq.bat, the broker does not start; it only prints the output below. Can anyone tell me what the issue is and what needs to be done to start ActiveMQ?
D:\apache-activemq-5.11.1\bin>activemq.bat
Java Runtime: Oracle Corporation 1.7.0_51 C:\Program Files\Java\jdk1.7.0_51\jre
Heap sizes: current=1005568k free=995061k max=1005568k
JVM args: -Dcom.sun.management.jmxremote -Xms1G -Xmx1G -Djava.util.logging.config.file=logging.properties -Djava.security.auth.login.config=D:\apache-activemq-5.11.1\bin\..\conf\login.config -Dactivemq.classpath=D:\apache-activemq-5.11.1\bin\..\conf;D:\apache-activemq-5.11.1\bin\../conf;D:\apache-activemq-5.11.1\bin\../conf; -Dactivemq.home=D:\apache-activemq-5.11.1\bin\.. -Dactivemq.base=D:\apache-activemq-5.11.1\bin\.. -Dactivemq.conf=D:\apache-activemq-5.11.1\bin\..\conf -Dactivemq.data=D:\apache-activemq-5.11.1\bin\..\data -Djava.io.tmpdir=D:\apache-activemq-5.11.1\bin\..\data\tmp
Extensions classpath:
[D:\apache-activemq-5.11.1\bin\..\lib,D:\apache-activemq-5.11.1\bin\..\lib\camel,D:\apache-activemq-5.11.1\bin\..\lib\optional,D:\apache-activemq-5.11.1\bin\..\lib\web,D:\apache-activemq-5.11.1\bin\..\lib\extra]
ACTIVEMQ_HOME: D:\apache-activemq-5.11.1\bin\..
ACTIVEMQ_BASE: D:\apache-activemq-5.11.1\bin\..
ACTIVEMQ_CONF: D:\apache-activemq-5.11.1\bin\..\conf
ACTIVEMQ_DATA: D:\apache-activemq-5.11.1\bin\..\data
Usage: Main [--extdir <dir>] [task] [task-options] [task data]
Tasks:
    browse   - Display selected messages in a specified destination.
    bstat    - Performs a predefined query that displays useful statistics regarding the specified broker
    create   - Creates a runnable broker instance in the specified path.
    decrypt  - Decrypts given text
    dstat    - Performs a predefined query that displays useful tabular statistics regarding the specified destination type
    encrypt  - Encrypts given text
    export   - Exports a stopped brokers data files to an archive file
    list     - Lists all available brokers in the specified JMX context
    purge    - Delete selected destination's messages that matches the message selector
    query    - Display selected broker component's attributes and statistics.
    start    - Creates and starts a broker using a configuration file, or a broker URI.
    stop     - Stops a running broker specified by the broker name.
Task Options (Options specific to each task):
    --extdir <dir>  - Add the jar files in the directory to the classpath.
    --version       - Display the version information.
    -h,-?,--help    - Display this help information. To display task specific help, use Main [task] -h,-?,--help
Task Data:
    - Information needed by each specific task.
JMX system property options:
    -Dactivemq.jmx.url=<jmx service uri> (default is: 'service:jmx:rmi:///jndi/rmi://localhost:1099/jmxrmi')
    -Dactivemq.jmx.user=<user name>
    -Dactivemq.jmx.password=<password>
You must start ActiveMQ with the command:
activemq-admin.bat start
activemq.bat is for management; that's why you have to pass arguments to it.
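Note that the usage text activemq.bat printed above also lists a start task, so on 5.11.x supplying that task to activemq.bat itself (run from the bin directory) should bring the broker up as well; the essential point either way is that a task argument has to be given:
D:\apache-activemq-5.11.1\bin>activemq.bat start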
Environment: solr-8.9.0, java version "11.0.12" 2021-07-20 LTS, apache-zookeeper-3.6.1-bin/
To set up SolrCloud I have done the following steps:
Setting up ZooKeeper on Node 1
a. Go inside <ZK_HOME>/conf directory.
b. Make a copy of zoo_sample.cfg & rename to zoo.cfg (or mv zoo_sample.cfg to zoo.cfg)
c. Edit zoo.cfg and set the dataDir parameter to a directory where you would like ZooKeeper to store its data (a minimal zoo.cfg sketch follows these steps).
dataDir=<ZK_HOME>/conf/data
d. Now start Zookeeper with command
./bin/zkServer.sh start
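A minimal zoo.cfg for this single-instance setup looks roughly like the sample file with only dataDir changed (tickTime and clientPort below are the stock values from zoo_sample.cfg):
# basic time unit in milliseconds used by ZooKeeper
tickTime=2000
# directory where ZooKeeper stores its data
dataDir=<ZK_HOME>/conf/data
# port on which clients (and Solr) connect
clientPort=2181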
Solr Setup on Node 1 / Machine 1
a. Create directory solr-8.9.0/server/solr/node1/solr/.
b. Copy the default zoo.cfg & solr.xml from solr-8.9.0/server/solr to solr-8.9.0/server/solr/node1/solr/
c. Now let's start Solr using the command below (basically you want to start in cloud mode with ZooKeeper):
./bin/solr start -cloud -s solr-8.9.0/server/solr/node1/solr -p 8983 -z <Node1 IP>:2181 -m 2g
Solr Setup on Node 2 / Machine 2
a. Create directory solr-8.9.0/server/solr/node1/solr/.
b. Copy the default zoo.cfg & solr.xml from solr-8.9.0/server/solr to solr-8.9.0/server/solr/node1/solr/
c. ./solr start -cloud -s solr-8.9.0/server/solr/node1/solr -p 8983 -z <Node1 IP>:2181 -m 2g
Upload configs to Zookeeper
a. ./server/scripts/cloud-scripts/zkcli.sh -zkhost <Node1 IP>:2181 -cmd upconfig -confname _defaults -confdir solr-8.9.0/server/solr/configsets/_defaults/conf
Creating a collection
http://<Node1 IP>:8983/solr/admin/collections?action=CREATE&name=<myCollection>&numShards=2&replicationFactor=2&maxShardsPerNode=2&collection.configName=_defaults
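(Issued from a shell, the same request looks like this; myCollection simply substitutes the placeholder:)
curl "http://<Node1 IP>:8983/solr/admin/collections?action=CREATE&name=myCollection&numShards=2&replicationFactor=2&maxShardsPerNode=2&collection.configName=_defaults"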
But I am getting the following error while creating the collection:
{
"responseHeader":{
"status":400,
"QTime":1213},
"failure":{
"$Node2:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error from server at http://$Node2:8983/solr: Path /home/solr/solr-8.9.0/server/solr/node1/solr/myCollection_shard1_replica_n2 must be relative to SOLR_HOME, SOLR_DATA_HOME coreRootDirectory. Set system property 'solr.allowPaths' to add other allowed paths.",
"$Node2:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error from server at http://$Node2:8983/solr: Path /home/solr/solr-8.9.0/server/solr/node1/solr/myCollection_shard2_replica_n6 must be relative to SOLR_HOME, SOLR_DATA_HOME coreRootDirectory. Set system property 'solr.allowPaths' to add other allowed paths.",
"127.0.1.1:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error from server at http://127.0.1.1:8983/solr: Path /data/Lucene/solr/solrcloud/solr-8.9.0/server/solr/node1/solr/myCollection_shard2_replica_n4 must be relative to SOLR_HOME, SOLR_DATA_HOME coreRootDirectory. Set system property 'solr.allowPaths' to add other allowed paths.",
"127.0.1.1:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:Error from server at http://127.0.1.1:8983/solr: Path /data/Lucene/solr/solrcloud/solr-8.9.0/server/solr/node1/solr/myCollection_shard1_replica_n1 must be relative to SOLR_HOME, SOLR_DATA_HOME coreRootDirectory. Set system property 'solr.allowPaths' to add other allowed paths."},
"Operation create caused exception:":"org.apache.solr.common.SolrException:org.apache.solr.common.SolrException: Underlying core creation failed while creating collection: myCollection",
"exception":{
"msg":"Underlying core creation failed while creating collection: myCollection",
"rspCode":400},
"error":{
"metadata":[
"error-class","org.apache.solr.common.SolrException",
"root-error-class","org.apache.solr.common.SolrException"],
"msg":"Underlying core creation failed while creating collection: myCollection",
"code":400}}
Why did the above error occur? What steps am I missing while setting up SolrCloud on 2 machines with 1 ZooKeeper instance? Could someone help me find the missing piece?
As the error suggests:
Use an absolute path when starting the Solr instance on both nodes.
Use an absolute path for the 'confdir' parameter when uploading the configuration to ZooKeeper.
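For example, with the paths from the question made absolute (the /home/solr prefix is taken from the Node 2 error message and is only illustrative; use the real install directory on each node):
./bin/solr start -cloud -s /home/solr/solr-8.9.0/server/solr/node1/solr -p 8983 -z <Node1 IP>:2181 -m 2g
./server/scripts/cloud-scripts/zkcli.sh -zkhost <Node1 IP>:2181 -cmd upconfig -confname _defaults -confdir /home/solr/solr-8.9.0/server/solr/configsets/_defaults/conf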
I have a file "myFile.gcode" which contains some G-code commands. As it stands Universal Gcode Sender allows the user to upload a file to be executed. However I would like "myFile.gcode" to be run as soon as the program starts up instead of me having to upload it.
Is this possible with the out-of-the-box .jar file?
.jar available at https://winder.github.io/ugs_website/
If not, I have the following idea:
Trace through the code and reverse engineer it to auto-run the file. To do this I decompiled the .jar file, but when tracing through the code I'm having trouble finding the starting point (the main class).
In summary, is this possible?
And, what could make tracing this code easier?
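In case it helps with the tracing part: a runnable jar records its entry point in META-INF/MANIFEST.MF, so (assuming the UGS jar declares a Main-Class, which it should since it is normally launched with java -jar) the starting class can be read without decompiling anything:
unzip -p UniversalGcodeSender.jar META-INF/MANIFEST.MF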
Yes, there is a new feature for running CLI commands. Download the latest nightly build of Universal G-code Sender Classic and run the following:
# java -cp UniversalGcodeSender.jar com.willwinder.ugs.cli.TerminalClient --help
This will print the available parameters and options:
-b,--baud <baudrate> Baud rate to connect with.
-c,--controller <controller> What type of controller firmware we are
connecting to, defaults to "GRBL". These
are the available firmwares: [GRBL, TinyG,
Testing (Delay), Smoothie Board, Testing]
-d,--daemon Starts in daemon mode providing a web
pendant UI
-dr,--driver <driver> Sets and saves the connection driver
setting. These are the available drivers:
[JSERIALCOMM, JSSC, TCP]
-f,--file <filename> Opens a file for streaming to controller
and will exit upon completion.
-h,--help Prints the help information.
-ho,--home If a homing process should be done before
any gcode files are sent to the
controller.
-l,--list Lists all available ports.
-p,--port <port> Which port for the controller to connect
to. I.e /dev/ttyUSB0 (on Unix-like systems
or COM4 (on windows).
-pp,--print-progressbar Prints the progress of the file stream
-ps,--print-stream Prints the streamed lines to console
-r,--reset-alarm Resets any alarm
-v,--version Prints the software version.
-w,--workspace <dir> Sets and saves the workspace directory
setting
Sending a file can be done using the following command:
# java -cp UniversalGcodeSender.jar com.willwinder.ugs.cli.TerminalClient --controller GRBL --port /dev/ttyUSB0 --baud 115200 --print-progressbar --file test.gcode
Connected to "Grbl 0.9z" on baud 115200
Running file "test.gcode"
test.gcode 52% │██████████████████████▉ │ 55/105 (0:00:06 / 0:00:05)
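Applied to the original question, pointing the same invocation at myFile.gcode streams it immediately at startup; the controller, port, and baud rate below are just the values from the example above and need adjusting for your machine:
# java -cp UniversalGcodeSender.jar com.willwinder.ugs.cli.TerminalClient --controller GRBL --port /dev/ttyUSB0 --baud 115200 --print-progressbar --file myFile.gcode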
I have trained a custom NER model with Stanford-NER. I created a properties file and used the -serverProperties argument with the java command to start my server (directions I followed from another question of mine, seen here) and load my custom NER model, but when the server attempts to load my custom model it fails with this error: java.io.EOFException: Unexpected end of ZLIB input stream
The stderr.log output with error is as follows:
[main] INFO CoreNLP - --- StanfordCoreNLPServer#main() called ---
[main] INFO CoreNLP - setting default constituency parser
[main] INFO CoreNLP - warning: cannot find edu/stanford/nlp/models/srparser/englishSR.ser.gz
[main] INFO CoreNLP - using: edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz instead
[main] INFO CoreNLP - to use shift reduce parser download English models jar from:
[main] INFO CoreNLP - http://stanfordnlp.github.io/CoreNLP/download.html
[main] INFO CoreNLP - Threads: 4
[main] INFO CoreNLP - Liveness server started at /0.0.0.0:9000
[main] INFO CoreNLP - Starting server...
[main] INFO CoreNLP - StanfordCoreNLPServer listening at /0.0.0.0:80
[pool-1-thread-3] INFO CoreNLP - [/127.0.0.1:35546] API call w/annotators tokenize,ssplit,pos,lemma,depparse,natlog,ner,openie
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator tokenize
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.TokenizerAnnotator - No tokenizer type provided. Defaulting to PTBTokenizer.
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ssplit
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator pos
[pool-1-thread-3] INFO edu.stanford.nlp.tagger.maxent.MaxentTagger - Loading POS tagger from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [0.7 sec].
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator lemma
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator depparse
[pool-1-thread-3] INFO edu.stanford.nlp.parser.nndep.DependencyParser - Loading depparse model file: edu/stanford/nlp/models/parser/nndep/english_UD.gz ... [pool-1-thread-3] INFO edu.stanford.nlp.parser.nndep.Classifier - PreComputed 99996, Elapsed Time: 12.297 (s)
[pool-1-thread-3] INFO edu.stanford.nlp.parser.nndep.DependencyParser - Initializing dependency parser ... done [13.6 sec].
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator natlog
[pool-1-thread-3] INFO edu.stanford.nlp.pipeline.StanfordCoreNLP - Adding annotator ner
java.io.EOFException: Unexpected end of ZLIB input stream
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:240)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:117)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2620)
at java.io.ObjectInputStream$PeekInputStream.readFully(ObjectInputStream.java:2636)
at java.io.ObjectInputStream$BlockDataInputStream.readDoubles(ObjectInputStream.java:3333)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1920)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1529)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1933)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1529)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
at edu.stanford.nlp.ie.crf.CRFClassifier.loadClassifier(CRFClassifier.java:2650)
at edu.stanford.nlp.ie.AbstractSequenceClassifier.loadClassifier(AbstractSequenceClassifier.java:1462)
at edu.stanford.nlp.ie.AbstractSequenceClassifier.loadClassifier(AbstractSequenceClassifier.java:1494)
at edu.stanford.nlp.ie.crf.CRFClassifier.getClassifier(CRFClassifier.java:2963)
at edu.stanford.nlp.ie.ClassifierCombiner.loadClassifierFromPath(ClassifierCombiner.java:282)
at edu.stanford.nlp.ie.ClassifierCombiner.loadClassifiers(ClassifierCombiner.java:266)
at edu.stanford.nlp.ie.ClassifierCombiner.<init>(ClassifierCombiner.java:141)
at edu.stanford.nlp.ie.NERClassifierCombiner.<init>(NERClassifierCombiner.java:128)
at edu.stanford.nlp.pipeline.AnnotatorImplementations.ner(AnnotatorImplementations.java:121)
at edu.stanford.nlp.pipeline.AnnotatorFactories$6.create(AnnotatorFactories.java:273)
at edu.stanford.nlp.pipeline.AnnotatorPool.get(AnnotatorPool.java:152)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.construct(StanfordCoreNLP.java:451)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:154)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.<init>(StanfordCoreNLP.java:145)
at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.mkStanfordCoreNLP(StanfordCoreNLPServer.java:273)
at edu.stanford.nlp.pipeline.StanfordCoreNLPServer.access$500(StanfordCoreNLPServer.java:50)
at edu.stanford.nlp.pipeline.StanfordCoreNLPServer$CoreNLPHandler.handle(StanfordCoreNLPServer.java:583)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
at sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:675)
at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
at sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:647)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
I have googled this error and most of what I read is in regard to an issue with Java from 2007-2010 where an EOFException is "arbitrarily" thrown. This information is from here.
"When using gzip (via new Deflater(Deflater.BEST_COMPRESSION, true)), for some files, and EOFException is thrown at the end of inflating. Although the file is correct, the bug is the EOFException is thrown inconsistently. For some files it is thrown, other it is not."
Answers to other people's questions about this error state that you have to close the output streams for the gzip...? I'm not entirely sure what that means, and I don't know how I would follow that advice since Stanford-NER is the software creating the gzip file for me.
Question: What actions can I take to eliminate this error? I am hoping this has happened to others in the past. I am also looking for feedback from #StanfordNLPHelp as to whether similar issues have been raised in the past and whether something is being done, or has been done, to the CoreNLP software to eliminate this issue. If there is a solution from CoreNLP, what files do I need to change, where are these files located within the CoreNLP framework, and what changes do I need to make?
ADDED INFO (PER #StanfordNLPHelp comments):
My model was trained using the directions found here. To train the model I used a TSV as outlined in the directions which contained text from around 90 documents. I know this is not a substantial amount of data to train with but we are just in the testing phases and will improve the model as we acquire more data.
With this TSV file and the Stanford-NER software I ran the command below.
java -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -prop austen.prop
I then had my model built and was even able to load it and successfully tag a larger corpus of text with the NER GUI that comes with the Stanford-NER software.
While troubleshooting why I was unable to get the model to work, I also attempted to update my server.properties file with the file path to the "3 class model" that comes standard with CoreNLP. Again it failed with the same error.
The fact that both my custom model and the 3 class model work in the Stanford-NER software but fail to load in the server makes me believe my custom model is not the issue and that there is some issue with how the CoreNLP software loads these models through the -serverProperties argument. Or it could be something I am completely unaware of.
The properties file I used to train my NER model was similar to the one in the directions, with the train file and the output file name changed. It looks like this:
# location of the training file
trainFile = custom-model-trainingfile.tsv
# location where you would like to save (serialize) your
# classifier; adding .gz at the end automatically gzips the file,
# making it smaller, and faster to load
serializeTo = custome-ner-model.ser.gz
# structure of your training file; this tells the classifier that
# the word is in column 0 and the correct answer is in column 1
map = word=0,answer=1
# This specifies the order of the CRF: order 1 means that features
# apply at most to a class pair of previous class and current class
# or current class and next class.
maxLeft=1
# these are the features we'd like to train with
# some are discussed below, the rest can be
# understood by looking at NERFeatureFactory
useClassFeature=true
useWord=true
# word character ngrams will be included up to length 6 as prefixes
# and suffixes only
useNGrams=true
noMidNGrams=true
maxNGramLeng=6
usePrev=true
useNext=true
useDisjunctive=true
useSequences=true
usePrevSequences=true
# the last 4 properties deal with word shape features
useTypeSeqs=true
useTypeSeqs2=true
useTypeySequences=true
wordShape=chris2useLC
My server.properties file contained only one line: ner.model = /path/to/custom_model.ser.gz
I also added /path/to/custom_model to the $CLASSPATH variable in the startup script, changing the line CLASSPATH="$CLASSPATH:$JAR" to CLASSPATH="$CLASSPATH:$JAR:/path/to/custom_model.ser.gz". I am not sure if this is a necessary step because I get the ZLIB error first. Just wanted to include this for completeness.
I attempted to "gunzip" my custom model with the command gunzip custom_model.ser.gz and got a similar error to the one I get when trying to load the model. It is gzip: custom_model.ser.gz: unexpected end of file
I'm assuming you downloaded Stanford CoreNLP 3.7.0 and have a folder somewhere called stanford-corenlp-full-2016-10-31. For the sake of this example let's assume it's in /Users/stanfordnlphelp/stanford-corenlp-full-2016-10-31 (change this to your specific situation)
Also just to clarify, when you run a Java program, it looks in the CLASSPATH for compiled code and resources. A common way to set the CLASSPATH is to just set the CLASSPATH environment variable with export command.
Typically Java compiled code and resources are stored in jar files.
If you look at stanford-corenlp-full-2016-10-31 you'll see a bunch of .jar files. One of them is called stanford-corenlp-3.7.0-models.jar. You can look at what's inside a jar file with this command: jar tf stanford-corenlp-3.7.0-models.jar.
You'll notice when you look inside that file that there are (among others) various ner models. For instance you should see this file:
edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz
in the models jar.
So a reasonable way for us to get things working is to run the server and tell it to only load 1 model (since by default it will load 3).
run these commands in one window (in the same directory as the file ner-server.properties)
export CLASSPATH=/Users/stanfordnlphelp/stanford-corenlp-full-2016-10-31/*:
java -Xmx12g edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 15000 -serverProperties ner-server.properties
with ner-server.properties being a 2-line file with these 2 lines:
annotators = tokenize,ssplit,pos,lemma,ner
ner.model = edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz
The export command above is putting EVERY jar in that directory on the CLASSPATH. That is what the * means. So stanford-corenlp-3.7.0-models.jar should be on the CLASSPATH. Thus when the Java code runs, it will be able to find edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz.
In a different Terminal window, issue this command:
wget --post-data 'Joe Smith lives in Hawaii.' 'localhost:9000/?properties={"outputFormat":"json"}' -O -
When this runs, you should see in the first window (where the server is running) that only this model is loading edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz.
You should note that if you deleted the ner.model from your file and re-did all of these, 3 models would load instead of 1.
Please let me know if that all works or not.
Let's assume I made an NER model called custom_model.ser.gz , and that file is what StanfordCoreNLP output after the training process. Let's say I put it in the folder /Users/stanfordnlphelp/.
If steps 1 and 2 worked, you should be able to alter ner-server.properties to this:
annotators = tokenize,ssplit,pos,lemma,ner
ner.model = /Users/stanfordnlphelp/custom_model.ser.gz
And when you do the same thing, it will show your custom model loading. There should not be any kind of gzip issue. If you are still having a gzip issue, please let me know what kind of system you are running this on? Mac OS X, Unix, Windows, etc...?
And to confirm, you said that you have run your custom NER model with the standalone Stanford NER software right? If so, that sounds like the model file is fine.
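One way to double-check that the serialized model itself is intact, independently of the server, is to load it directly with the CRFClassifier command line (stanford-ner.jar is the jar from the standalone NER download, and sample.txt is any small plain-text file; both names are only placeholders here):
java -cp stanford-ner.jar edu.stanford.nlp.ie.crf.CRFClassifier -loadClassifier /Users/stanfordnlphelp/custom_model.ser.gz -textFile sample.txt
If that command throws the same Unexpected end of ZLIB input stream error, the .ser.gz file on that machine is truncated and needs to be regenerated or re-copied.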
I am trying to implement the example given here: http://benjsicam.me/blog/running-a-java-application-as-a-windows-service-part-1-tutorial/
The code basically turns a Java application into a service. The application outputs Date and Time to the console at specific intervals. The whole project gets exported to a runnable JAR file, with required libraries in a separate folder. All I have to do is modify the wrapper.conf file to run Main.jar (which is the exported JAR) and put the exported libraries in the lib folder.
I have followed everything exactly, but I am getting the following problem: WrapperManager cannot be resolved. I am including the links for snapshots which show the Maven project structure, the contents of POM.xml, and code contents of four Java files. Also included are the required source files in Google Drive.
Java version jdk1.8.0_11
Main.java
Google Drive Link
Error Shown:
wrapper.conf file (most of the comments and the top part removed):
wrapper.java.mainclass=batch_Proc.main_prog.Main
# Java Classpath (include wrapper.jar) Add class path elements as
# needed starting from 1
wrapper.java.classpath.1=../lib/wrapper.jar
wrapper.java.classpath.2=../lib/aopalliance-1.0.jar
wrapper.java.classpath.3=../lib/commons-logging-1.1.1.jar
wrapper.java.classpath.4=../lib/spring-aop-3.2.1.RELEASE.jar
wrapper.java.classpath.5=../lib/spring-beans-3.2.1.RELEASE.jar
wrapper.java.classpath.6=../lib/spring-context-3.2.1.RELEASE.jar
wrapper.java.classpath.7=../lib/spring-context-support-3.2.1.RELEASE.jar
wrapper.java.classpath.8=../lib/spring-core-3.2.1.RELEASE.jar
wrapper.java.classpath.9=../lib/spring-expression-3.2.1.RELEASE.jar
wrapper.java.classpath.10=../lib/spring-web-3.2.1.RELEASE.jar
wrapper.java.classpath.11=../lib/wrappertest.jar
wrapper.java.classpath.12=Main.jar
# Java Library Path (location of Wrapper.DLL or libwrapper.so)
wrapper.java.library.path.1=../lib
# Java Bits. On applicable platforms, tells the JVM to run in 32 or 64-bit mode.
wrapper.java.additional.auto_bits=FALSE
# Java Additional Parameters
wrapper.java.additional.1=
# Initial Java Heap Size (in MB)
#wrapper.java.initmemory=3
# Maximum Java Heap Size (in MB)
#wrapper.java.maxmemory=64
# Application parameters. Add parameters as needed starting from 1
#wrapper.app.parameter.1=
#********************************************************************
# Wrapper Logging Properties
#********************************************************************
# Enables Debug output from the Wrapper.
wrapper.debug=FALSE
# Format of output for the console. (See docs for formats)
wrapper.console.format=PM
# Log Level for console output. (See docs for log levels)
wrapper.console.loglevel=INFO
# Log file to use for wrapper output logging.
wrapper.logfile=../logs/wrapper.log
# Format of output for the log file. (See docs for formats)
wrapper.logfile.format=LPTM
# Log Level for log file output. (See docs for log levels)
wrapper.logfile.loglevel=INFO
# Maximum size that the log file will be allowed to grow to before
# the log is rolled. Size is specified in bytes. The default value
# of 0, disables log rolling. May abbreviate with the 'k' (kb) or
# 'm' (mb) suffix. For example: 10m = 10 megabytes.
wrapper.logfile.maxsize=0
# Maximum number of rolled log files which will be allowed before old
# files are deleted. The default value of 0 implies no limit.
wrapper.logfile.maxfiles=0
# Log Level for sys/event log output. (See docs for log levels)
wrapper.syslog.loglevel=NONE
#********************************************************************
# Wrapper General Properties
#********************************************************************
# Allow for the use of non-contiguous numbered properties
wrapper.ignore_sequence_gaps=TRUE
# Do not start if the pid file already exists.
wrapper.pidfile.strict=TRUE
# Title to use when running as a console
wrapper.console.title=Test Wrapper Sample Application
#********************************************************************
# Wrapper JVM Checks
#********************************************************************
# Detect DeadLocked Threads in the JVM. (Requires Standard Edition)
wrapper.check.deadlock=TRUE
wrapper.check.deadlock.interval=10
wrapper.check.deadlock.action=RESTART
wrapper.check.deadlock.output=FULL
# Out Of Memory detection.
# (Ignore output from dumping the configuration to the console. This is only needed by the TestWrapper sample application.)
wrapper.filter.trigger.999=wrapper.filter.trigger.*java.lang.OutOfMemoryError
wrapper.filter.allow_wildcards.999=TRUE
wrapper.filter.action.999=NONE
# Ignore -verbose:class output to avoid false positives.
wrapper.filter.trigger.1000=[Loaded java.lang.OutOfMemoryError
wrapper.filter.action.1000=NONE
# (Simple match)
wrapper.filter.trigger.1001=java.lang.OutOfMemoryError
# (Only match text in stack traces if -XX:+PrintClassHistogram is being used.)
#wrapper.filter.trigger.1001=Exception in thread "*" java.lang.OutOfMemoryError
#wrapper.filter.allow_wildcards.1001=TRUE
wrapper.filter.action.1001=RESTART
wrapper.filter.message.1001=The JVM has run out of memory.
#********************************************************************
# Wrapper Email Notifications. (Requires Professional Edition)
#********************************************************************
# Common Event Email settings.
#wrapper.event.default.email.debug=TRUE
#wrapper.event.default.email.smtp.host=<SMTP_Host>
#wrapper.event.default.email.smtp.port=25
#wrapper.event.default.email.subject=[%WRAPPER_HOSTNAME%:%WRAPPER_NAME%:%WRAPPER_EVENT_NAME%] Event Notification
#wrapper.event.default.email.sender=<Sender email>
#wrapper.event.default.email.recipient=<Recipient email>
# Configure the log attached to event emails.
#wrapper.event.default.email.attach_log=TRUE
#wrapper.event.default.email.maillog.lines=50
#wrapper.event.default.email.maillog.format=LPTM
#wrapper.event.default.email.maillog.loglevel=INFO
# Enable specific event emails.
#wrapper.event.wrapper_start.email=TRUE
#wrapper.event.jvm_prelaunch.email=TRUE
#wrapper.event.jvm_start.email=TRUE
#wrapper.event.jvm_started.email=TRUE
#wrapper.event.jvm_deadlock.email=TRUE
#wrapper.event.jvm_stop.email=TRUE
#wrapper.event.jvm_stopped.email=TRUE
#wrapper.event.jvm_restart.email=TRUE
#wrapper.event.jvm_failed_invocation.email=TRUE
#wrapper.event.jvm_max_failed_invocations.email=TRUE
#wrapper.event.jvm_kill.email=TRUE
#wrapper.event.jvm_killed.email=TRUE
#wrapper.event.jvm_unexpected_exit.email=TRUE
#wrapper.event.wrapper_stop.email=TRUE
# Specify custom mail content
wrapper.event.jvm_restart.email.body=The JVM was restarted.\n\nPlease check on its status.\n
# Name of the service
wrapper.name=JavaWindowsServiceSample
# Display name of the service
wrapper.displayname=Java Windows Service Sample
# Description of the service
wrapper.description=A sample java windows service application
# Service dependencies. Add dependencies as needed starting from 1
wrapper.ntservice.dependency.1=
# Mode in which the service is installed. AUTO_START, DELAY_START or DEMAND_START
wrapper.ntservice.starttype=AUTO_START
# Allow the service to interact with the desktop.
wrapper.ntservice.interactive=false
I solved this problem! I had made the simple mistake of not making sure the correct dependencies were downloaded in Maven.
The required jars for WrapperManager and WrapperListener were not available, which was causing the errors.
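For anyone hitting the same compile error: WrapperManager and WrapperListener live in the wrapper.jar that ships with the Java Service Wrapper, so it has to be on the Maven compile classpath too. A rough sketch of the dependency (the tanukisoft coordinates are an assumption based on older community-published artifacts; verify them against your repository, or install the bundled lib/wrapper.jar into your local repository with mvn install:install-file and reference those coordinates instead):
<dependency>
    <groupId>tanukisoft</groupId>
    <artifactId>wrapper</artifactId>
    <version>3.2.3</version>
</dependency>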
I am trying to populate my Datastore entity with data that I have in a CSV file, but without success.
This is my CSV file places.csv:
name,placeId,location,key,address
A store at City1 Shopping Center,store101,"47,-122",1,"Some address of the store in City 1"
A big store at Some Mall,store102,"47,-122",2,"Some address of the store in City 2"
bulkloader.yaml:
python_preamble:
- import: base64
- import: re
- import: google.appengine.ext.bulkload.transform
- import: google.appengine.ext.bulkload.bulkloader_wizard
- import: google.appengine.ext.db
- import: google.appengine.api.datastore
- import: google.appengine.api.users
transformers:
- kind: Place
connector: csv
connector_options:
property_map:
- property: __key__
external_name: key
export_transform: transform.key_id_or_name_as_string
- property: address
external_name: address
# Type: String Stats: 6 properties of this type in this kind.
- property: location
external_name: location
# Type: GeoPt Stats: 6 properties of this type in this kind.
import_transform: google.appengine.api.datastore_types.GeoPt
- property: name
external_name: name
# Type: String Stats: 6 properties of this type in this kind.
- property: placeId
external_name: placeId
# Type: String Stats: 6 properties of this type in this kind
upload_data.sh:
#!/bin/sh
../Eclipse/plugins/com.google.appengine.eclipse.sdkbundle_1.9.1/appengine-java-sdk-1.9.1/bin/appcfg.sh upload_data --config_file bulkloader.yaml --url=http://localhost:8888/remote_api --filename places.csv --kind=Place -e nobody#nowhere.com
I created a folder gae and placed upload_data.sh, bulkloader.yaml and places.csv in it.
After I run sudo ./upload_data.sh, I receive the message:
sudo: ./upload_data.sh: command not found
After I run sudo sh upload_data.sh I receive the following error:
Bad argument: Expected an action: [update, request_logs, rollback, update_indexes, update_cron, update_dispatch, update_dos, update_queues, cron_info, vacuum_indexes, help, download_app, version, set_default_version, resource_limits_info, start_module_version, stop_module_version, backends list, backends rollback, backends update, backends start, backends stop, backends delete, backends configure, backends, list_versions, delete_version, debug]
usage: AppCfg [options] <action> [<app-dir>] [<argument>]
Action must be one of:
help: Print help for a specific action.
download_app: Download a previously uploaded app version.
request_logs: Write request logs in Apache common log format.
rollback: Rollback an in-progress update.
start_module_version: Start the specified module version.
stop_module_version: Stop the specified module version.
update: Create or update an app version.
update_indexes: Update application indexes.
update_cron: Update application cron jobs.
update_queues: Update application task queue definitions.
update_dispatch: Update the application dispatch configuration.
update_dos: Update application DoS protection configuration.
version: Prints version information.
set_default_version: Set the default serving version.
cron_info: Displays times for the next several runs of each cron job.
resource_limits_info: Display resource limits.
vacuum_indexes: Delete unused indexes from application.
backends list: List the currently configured backends.
backends update: Update the specified backend or all backends.
backends rollback: Roll back a previously in-progress update.
backends start: Start the specified backend.
backends stop: Stop the specified backend.
backends delete: Delete the specified backend.
backends configure: Configure the specified backend.
list_versions: List the currently uploaded versions.
delete_version: Delete the specified version.
Use 'help <action>' for a detailed description.
options:
-s SERVER, --server=SERVER
The server to connect to.
-e EMAIL, --email=EMAIL
The username to use. Will prompt if omitted.
-H HOST, --host=HOST Overrides the Host header sent with all RPCs.
-p PROXYHOST[:PORT], --proxy=PROXYHOST[:PORT]
Proxies requests through the given proxy server.
If --proxy_https is also set, only HTTP will be
proxied here, otherwise both HTTP and HTTPS will.
--proxy_https=PROXYHOST[:PORT]
Proxies HTTPS requests through the given proxy server.
--no_cookies Do not save/load access credentials to/from disk.
--sdk_root=root Overrides where the SDK is located.
--passin Always read the login password from stdin.
-A APP_ID, --application=APP_ID
Override application id from appengine-web.xml or app.yaml
-M MODULE, --module=MODULE
Override module from appengine-web.xml or app.yaml
-V VERSION, --version=VERSION
Override (major) version from appengine-web.xml or app.yaml
--oauth2 Use OAuth2 instead of password auth.
--enable_jar_splitting
Split large jar files (> 10M) into smaller fragments.
--jar_splitting_excludes=SUFFIXES
When --enable-jar-splitting is set, files that match
the list of comma separated SUFFIXES will be excluded
from all jars.
--disable_jar_jsps
Do not jar the classes generated from JSPs.
--enable_jar_classes
Jar the WEB-INF/classes content.
--delete_jsps
Delete the JSP source files after compilation.
--retain_upload_dir
Do not delete temporary (staging) directory used in
uploading.
--compile_encoding
The character encoding to use when compiling JSPs.
-n NUM_DAYS, --num_days=NUM_DAYS
Number of days worth of log data to get. The cut-off
point is midnight UTC. Use 0 to get all available
logs. Default is 1.
--severity=SEVERITY Severity of app-level log messages to get. The range
is 0 (DEBUG) through 4 (CRITICAL). If omitted, only
request logs are returned.
-a, --append Append to existing file.
-n NUM_RUNS, --num_runs=NUM_RUNS
Number of scheduled execution times to compute
-f, --force Force deletion of indexes without being prompted.
What can I do to upload that data to datastore? Thank you.
I think you are using appcfg.sh instead of appcfg.py. See:
https://developers.google.com/appengine/docs/python/tools/uploadingdata
Also, your output clearly shows why you got the Bad Argument error: the actions listed by appcfg.sh do not include "upload_data", but that is what your script passes as the action.
I was doing this exact thing and didn't immediately make the leap of intuition either:
Download the python SDK, which will give you the appcfg.py tool. Just call that one in your upload_data.sh script.
The appcfg.sh program doesn't have the upload_data action, which I found weird.
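For reference, a roughly equivalent invocation with the Python SDK's appcfg.py would be something like the following (the flags mirror the shell script above; adjust the URL and email for your setup):
appcfg.py upload_data --config_file=bulkloader.yaml --filename=places.csv --kind=Place --url=http://localhost:8888/remote_api --email=nobody@nowhere.com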