I am trying to build a Java wrapper around the native SDK and I am rewriting NanoPlayer. I think I managed to get the same flow of events as the native version, but when I play a song, I get a QUEUELIST_NEED_NATURAL_NEXT instead of MEDIASTREAM_DATA_READY. You can see the output below.
What could cause this? And what am I supposed to do on such an event?
Thanks a lot in advance.
Stefano
34511:327803 dz_crash_handler: [dz_crash_handler_init:286] Crash Handler available
Device ID: e91f2fce333d4a7ab9b75cfaee3115e4
### MENU
Please press key for comands:
- P : PLAY / PAUSE
- S : START/STOP
- + : NEXT
- - : PREVIOUS
- R : NEXT REPEAT MODE
- ? : TOGGLE SHUFFLE MODE
- Q : QUIT
- [1-4] : LOAD CONTENT [1-4]
#
OnConnectCallback (native#0x7f1d843271e0,native#0x7f1d200f2a60,native#0x7f1d842c95c0)
(App:native#0x7f1d842c95c0:1) ++++ CONNECT_EVENT ++++ USER_OFFLINE_AVAILABLE
OnConnectCallback (native#0x7f1d843271e0,native#0x7f1d200eee50,native#0x7f1d842c95c0)
(App:native#0x7f1d842c95c0:4) ++++ CONNECT_EVENT ++++ USER_LOGIN_OK
LOAD => dzmedia:///track/136332242
(App:native#0x7f1d842c95c0:2) ==== PLAYER_EVENT ==== QUEUELIST_LOADED for idx: 0
Entity: line 1: parser error : Document is empty
sas_noad = true;
^
S
PLAY track n° 0 of => dzmedia:///track/136332242
PLAY track n° 0 of => dzmedia:///track/136332242
(App:native#0x7f1d842c95c0:7) ==== PLAYER_EVENT ==== QUEUELIST_TRACK_SELECTED for idx: 0 - is_preview:false
canPauseUnpause: true, canSeek: true, numSkipAllowed: 1 now:{...}
(App:native#0x7f1d842c95c0:8) ==== PLAYER_EVENT ==== QUEUELIST_NEED_NATURAL_NEXT for idx: 0
(App:native#0x7f1d842c95c0:11) ==== PLAYER_EVENT ==== UNKNOWN or default
I found the issue: I had provided a wrong cache path value in the config object - it must be an existing directory, whereas I was setting a file (although an existing one).
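For illustration, here is a minimal sketch of the kind of check I now do before filling in the config (the helper class and method are my own code, not part of the SDK):

import java.io.File;

final class CachePathCheck {
    // Make sure the cache path handed to the connect config is an existing
    // directory; a plain file (even an existing one) triggers the behaviour above.
    static String ensureCacheDirectory(String path) {
        File dir = new File(path);
        if (!dir.isDirectory() && !dir.mkdirs()) {
            throw new IllegalArgumentException("Cache path must be an existing directory: " + path);
        }
        return dir.getAbsolutePath();
    }
}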
Advice for beginners: to see more logging, do not call dz_connect_debug_log_disable().
Hope this helps
Stefano
In the Weka Experimenter I have this configuration:
Results destination : Ex1_1.arff
Experiment type : cross-validation | number of folds : 10 | Classification
Dataset : soybean.arff.gz
Iteration Control : Number of repetitions : 10 | Data sets first
Algorithms : J48 -C 0.25 -M 2
This experiment is saved as Ex1_1.xml (saving with .exp gives the following error: Couldn't save experiment file: Ex1_1.exp Reason: java.awt.Component$AccessibleAWTComponent$AccessibleAWTComponentHandler).
And when I try to run this experiment I get the following error: Problem creating experiment copy to run: java.awt.Component$AccessibleAWTComponent$AccessibleAWTComponentHandler
So it seems I have a problem with something like AWT in Java... Does somebody have an idea?
Thank you !
I'm trying to set up Morena 7 in my Java application, but I can't configure my scanner from my code; it ignores the settings I set.
My scanner does work with the example projects they provide, with every supported setting.
I have searched the web for explanations but have found little to no documentation.
This is the code I use to scan; it is identical to the sample given in the tutorial document:
// Needs java.awt.image.BufferedImage and java.util.List, plus the Morena 7
// classes (Manager, Device, Scanner, SynchronousHelper) from the Morena jars.
public void scan() throws Exception {
    Manager manager = Manager.getInstance();
    List devices = manager.listDevices();
    if (devices.isEmpty()) {
        System.out.println("No scanners detected");
        return;
    }
    Device device = (Device) devices.get(0);
    if (device instanceof Scanner) {
        Scanner scanner = (Scanner) device;
        // Requested settings: 8-bit RGB, 75 dpi, and a scan frame
        scanner.setMode(Scanner.RGB_8);
        scanner.setResolution(75);
        scanner.setFrame(100, 100, 500, 500);
        BufferedImage bimage = SynchronousHelper.scanImage(scanner);
        // Do the necessary processing with bimage
        manager.close();
    } else {
        System.out.println("Please Connect A Scanner");
    }
}
When I run this code, I get back an image, but with the scanner's default values; every setting like color, resolution, and scan area (frame) is ignored.
First, I think one reason could be that Morena 7 always spools the scanner data into a file. You cannot access this scanner data before it is written to a file (unfortunately). So if you want to scan bilevel images, you will get a JPG image with grey levels. Morena saves scanner data as a JPG on Mac OS X and as a BMP on Windows.
You should check the temp file Morena 7 creates. Assuming you use the SynchronousHelper class from the Morena example, you can edit the scanImage method, which just loads the temp file using ImageIO.
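If you want to inspect that temp file yourself, here is a quick sketch using plain ImageIO (the file path argument is a placeholder for wherever your copy of SynchronousHelper writes the file):

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class CheckScan {
    public static void main(String[] args) throws Exception {
        // args[0]: path to the temp file Morena spooled to disk
        BufferedImage img = ImageIO.read(new File(args[0]));
        System.out.println("size: " + img.getWidth() + "x" + img.getHeight()
                + ", color model: " + img.getColorModel());
    }
}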
If I check this temp file (on Mac OS X), all the values I set, such as resolution and color mode, are honored. Perhaps your scanner does not support some of these settings? Or Morena does something wrong while saving the image.
Also check the system error output. It should look something like the following, where you can see that I set the resolution to 400 dpi and the color mode to bilevel (ICScannerPixelDataTypeBW and bitDepth 1).
Functional unit: ICScannerFunctionalUnitFlatbed <0x7fefe850f4e0>:
pixelDataType : ICScannerPixelDataTypeBW
supportedBitDepths : <NSMutableIndexSet: 0x7fefe850f4b0>[number of indexes: 2 (in 2 ranges), indexes: (1 8)]
bitDepth : 1
supportedDocumentTypes : <NSMutableIndexSet: 0x7fefede9a9f0>[number of indexes: 6 (in 2 ranges), indexes: (1-5 10)]
documentType : 1
physicalSize : [width = 8.50 inches, height = 14.00 inches]
measurementUnit : 0
supportedResolutions : <NSMutableIndexSet: 0x7fefedee4390>[number of indexes: 7 (in 7 ranges), indexes: (100 150 200 300 400 600 1200)]
preferredResolutions : <NSMutableIndexSet: 0x7fefedee4390>[number of indexes: 7 (in 7 ranges), indexes: (100 150 200 300 400 600 1200)]
resolution : 400
overviewResolution : 150
supportedScaleFactors : <NSMutableIndexSet: 0x7fefedec3dd0>[number of indexes: 1 (in 1 ranges), indexes: (100)]
preferredScaleFactors : <NSMutableIndexSet: 0x7fefedec3dd0>[number of indexes: 1 (in 1 ranges), indexes: (100)]
scaleFactor : 100
acceptsThresholdForBlackAndWhiteScanning : NO
usesThresholdForBlackAndWhiteScanning : NO
thresholdForBlackAndWhiteScanning : 0
templates : (null)
vendorFeatures : (null)
state : 0x00000001
Short: I'd like to know the name of this format!
I would like to know if this is a common, well-known format or just a simple self-generated config file:
scenes : {
  Scene : {
    class : Scene
    sources : {
      Game Capture : {
        render : 1
        class : GraphicsCapture
        data : {
          window : "[duke3d]: Duke Nukem 3D Atomic Edition 1.4.3 STABLE"
          windowClass : SDL_app
          executable : duke3d.exe
          stretchImage : 0
          alphaBlend : 0
          ignoreAspect : 0
          captureMouse : 1
          invertMouse : 0
          safeHook : 0
          useHotkey : 0
          hotkey : 123
          gamma : 100
        }
        cx : 1920
        cy : 1080
      }
    }
  }
}
My background is that I would like to read multiple files like the one above, and I don't want to implement a whole new parser for this. That's why I want to fall back on Java libraries that already implement this. But without knowing the name of the format, it's quite difficult to search for such libraries.
// additional info
This is a config file or a "scene file" for Open Broadcaster Software.
Filename extension is .xconfig
This appears to be a config file or a "scene file" for Open Broadcaster Software.
When used with OBS it has an extension of .xconfig.
Hope this helps.
-Yang
I got some feedback from the main developer of these files.
As I thought, this is not a known format - just a simple config file.
Solved!
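Since it turned out not to be a known format, the simplest route is probably a small hand-written parser. Below is a minimal, untested Java sketch (all class and method names are mine) that reads the brace/colon layout above into nested maps:

import java.util.ArrayDeque;
import java.util.Deque;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SceneFileParser {

    // Parses lines of the form "key : value", "key : {" and "}" into nested maps.
    public static Map<String, Object> parse(List<String> lines) {
        Map<String, Object> root = new LinkedHashMap<>();
        Deque<Map<String, Object>> stack = new ArrayDeque<>();
        stack.push(root);
        for (String raw : lines) {
            String line = raw.trim();
            if (line.isEmpty()) {
                continue;
            }
            if (line.equals("}")) {
                stack.pop();                       // close the current block
            } else if (line.endsWith("{")) {
                String key = line.substring(0, line.lastIndexOf(':')).trim();
                Map<String, Object> child = new LinkedHashMap<>();
                stack.peek().put(key, child);      // open a nested block
                stack.push(child);
            } else {
                int colon = line.indexOf(':');
                String key = line.substring(0, colon).trim();
                String value = line.substring(colon + 1).trim();
                stack.peek().put(key, value);      // plain key/value pair
            }
        }
        return root;
    }
}

Each "key : {" opens a nested map and "}" closes it; splitting value lines on the first ':' keeps quoted values that contain colons (like the window title above) intact.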
What are the correct settings for the files core-site.xml and mapred-site.xml in Hadoop?
I'm asking because I'm trying to run Hadoop but get the following error:
starting secondarynamenode, logging to /opt/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-hadoop-secondarynamenode-lbad012.out
lbad012: Exception in thread "main" java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
lbad012:     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:212)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:244)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:236)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:194)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:150)
lbad012:     at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:676)
You didn't specify which version of Hadoop you're using, or whether or not you're using CDH (Cloudera's Hadoop distro).
You also didn't specify whether you're looking to run a pseudo-distributed, single-node, or fully distributed cluster setup. These options are set up precisely in the files you're mentioning (core-site and mapred-site).
Hadoop is very finicky, so these details are important when asking questions related to it.
Since you didn't specify any of the above, I'm guessing you're a beginner -- in which case this guide should help you (and show you what core-site and mapred-site should look like in a pseudo-distributed configuration).
Anyway, Hadoop has a 'Quick Start' guide for almost every version they release, so find the one that matches the version and setup you're after and it should be fairly easy to walk through.
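For what it's worth, a minimal pseudo-distributed sketch for Hadoop 1.x looks like the following (localhost and the ports are the usual defaults; adjust them to your setup). The "file:///" in your stack trace typically means fs.default.name was never set:

conf/core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

conf/mapred-site.xml:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

After editing these, format the namenode (hadoop namenode -format) and restart the daemons.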
I'm trying to install WWW::HtmlUnit on Windows 7. These are the steps I ran through:
Install Inline::Java 0.53
Install WWW::HtmlUnit 0.15
At step 2, after nmake, I typed nmake test to test the module, but it failed. Here's the output:
C:\nmake test
Microsoft (R) Program Maintenance Utility Version 9.00.30729.01
Copyright (C) Microsoft Corporation. All rights reserved.
C:\Perl\bin\perl.exe "-MExtUtils::Command::MM" "-e" "test_harness(0, 'blib\lib', 'blib\arch')" t/*.t
t/00_basic...........
t/00_basic...........NOK 1/1# Failed test 'use WWW::HtmlUnit;'
# at t/00_basic.t line 9.
# Tried to use 'WWW::HtmlUnit'.
# Error: Class com.gargoylesoftware.htmlunit.WebClient not found at C:/Perl/site/lib/Inline/Java.pm line 619
# BEGIN failed--compilation aborted at (eval 4) line 2, <GEN7> line 4.
# Looks like you failed 1 test of 1.
t/00_basic...........dubious
Test returned status 1 (wstat 256, 0x100)
DIED. FAILED test 1
Failed 1/1 tests, 0.00% okay
t/01_hello...........Class com.gargoylesoftware.htmlunit.WebClient not found at C:/Perl/site/lib/Inline/Java.pm line 619
BEGIN failed--compilation aborted at t/01_hello.t line 4, <GEN7> line 4.
t/01_hello...........dubious
Test returned status 26 (wstat 6656, 0x1a00)
t/02_hello_sweet.....dubious
Test returned status 19 (wstat 4864, 0x1300)
t/03_clickhandler....Class com.gargoylesoftware.htmlunit.WebClient not found at C:/Perl/site/lib/Inline/Java.pm line 619
BEGIN failed--compilation aborted at t/03_clickhandler.t line 6, <GEN7> line 4.
t/03_clickhandler....dubious
Test returned status 29 (wstat 7424, 0x1d00)
DIED. FAILED tests 1-8
Failed 8/8 tests, 0.00% okay
Failed Test          Stat Wstat Total Fail  List of Failed
-------------------------------------------------------------------------------
t/00_basic.t            1   256     1    1  1
t/01_hello.t           26  6656    ??   ??  ??
t/02_hello_sweet.t     19  4864    ??   ??  ??
t/03_clickhandler.t    29  7424     8   16  1-8
Failed 4/4 test scripts. 9/9 subtests failed.
Files=4, Tests=9, 3 wallclock secs ( 0.00 cusr + 0.00 csys = 0.00 CPU)
Failed 4/4 test programs. 9/9 subtests failed.
NMAKE : fatal error U1077: 'C:\Perl\bin\perl.exe' : return code '0x1d'
Stop.
From the above log, I can see that:
Error: Class com.gargoylesoftware.htmlunit.WebClient could not be found.
I have no idea what I missed.
Any help would be appreciated.
Thanks.
Minh.
I found it.
There's a difference between paths on Unix and Windows systems: Unix uses ':' as the classpath delimiter, but Windows uses ';'. So what I did was open HtmlUnit.pm and change ':' to ';'.
With WWW::HtmlUnit version 0.15, I made the changes at the lines below:
Line 78:
return join ';', map { "$jar_path/$_" } qw(   # was: return join ':', map { "$jar_path/$_" } qw(
Line 127:
$custom_jars = join(';', @{$parameters{'jars'}});   # was: $custom_jars = join(':', @{$parameters{'jars'}});
Line 148:
CLASSPATH => collect_default_jars() . ";" . $custom_jars,   # was: CLASSPATH => collect_default_jars() . ":" . $custom_jars,
And it works like magic.
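A more portable variant (my own suggestion, not something the module ships with) is to let Perl pick the delimiter via the standard Config module, so the same code works on both Unix and Windows:

use Config;

# $Config{path_sep} is ':' on Unix and ';' on Windows
my $sep = $Config{path_sep};

my @jars = ('htmlunit.jar', 'commons-io.jar');   # placeholder for the jar list built in HtmlUnit.pm
my $classpath = join $sep, @jars;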
(It wouldn't let me comment on an existing answer.)
I see your answer about ':' vs ';'. I'll try to include a fix in the next WWW::HtmlUnit release (I am the author of the Perl bindings).