In PrimeFaces 8.0, DefaultStreamedContent can no longer be initialized with new DefaultStreamedContent(inputStream, contentType, name), because that constructor has been deprecated; instead you should use DefaultStreamedContent.builder().
However, .stream() now asks for a SerializableSupplier<InputStream> instead of an InputStream like in versions before 8.0:
DefaultStreamedContent.builder().contentType(contentType).name(name).stream(is).build();
How can I convert an InputStream to a SerializableSupplier?
Everything is in the migration guide here: https://github.com/primefaces/primefaces/wiki/Migration-Guide.
In general, the following will work:
DefaultStreamedContent.builder().contentType(contentType).name(name).stream(() -> is).build();
But the idea behind the change is different.
If you use a RequestScoped bean to build the StreamedContent, your bean, and therefore the StreamedContent, will be created twice:
when rendering the view
when streaming the resource (this is a new browser request!)
In this case, your is will probably be created twice. Most of the time this results in one useless IO access or DB call.
To create the is only once, you should lazily initialize it via the supplier lambda:
DefaultStreamedContent.builder().contentType(contentType).name(name).stream(() -> new FileInputStream(....)).build();
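For example, a request-scoped bean could look roughly like this (a minimal sketch; the bean name, path, and content type are placeholder assumptions, and note that the checked FileNotFoundException has to be handled inside the lambda):

import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.Serializable;
import java.io.UncheckedIOException;
import javax.enterprise.context.RequestScoped;
import javax.inject.Named;
import org.primefaces.model.DefaultStreamedContent;
import org.primefaces.model.StreamedContent;

@Named
@RequestScoped
public class FileDownloadBean implements Serializable {

    public StreamedContent getFile() {
        // The supplier is only invoked when PrimeFaces actually streams the
        // resource (the second browser request), not when the view is rendered.
        return DefaultStreamedContent.builder()
                .contentType("application/pdf")
                .name("report.pdf")
                .stream(() -> {
                    try {
                        return new FileInputStream("/path/to/report.pdf");
                    } catch (FileNotFoundException e) {
                        throw new UncheckedIOException(e);
                    }
                })
                .build();
    }
}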
This worked for me:
DataHandler dataHandler = someBean.getFileData();
byte[] contents = IOUtils.toByteArray(dataHandler.getInputStream());
StreamedContent streamedContent = DefaultStreamedContent.builder()
        .name(someBean.getFileName())
        .contentType("application/octet-stream")
        .stream(() -> new ByteArrayInputStream(contents))
        .build();
Because the contents are buffered in memory, the supplier can hand out a fresh ByteArrayInputStream on every request.
The lazy-initialization answer above by @tandraschko did not work for me in NetBeans using Java 8. I had to create the FileInputStream before passing it into the builder (likely because new FileInputStream(...) throws the checked FileNotFoundException, which a plain supplier lambda cannot rethrow).
So my code looks like:
public StreamedContent getFiledownload() throws FileNotFoundException {
    FileInputStream fis = new FileInputStream("...");
    filedownload = DefaultStreamedContent.builder()
            .contentType("...")
            .name("...")
            .stream(() -> fis)
            .build();
    return filedownload;
}
Thought I would comment just in case someone else was running into compiling issues.
For a MySQL-stored image I use this:
ResultSet resultset = statement.executeQuery("call sp_query()");
if (resultset.next()) {
    // Copy the blob into memory before the connection is closed;
    // a lazy supplier would otherwise try to read from a closed connection.
    byte[] picture = resultset.getBytes("picture");
    StreamedContent photo = DefaultStreamedContent.builder()
            .contentType("contentType")
            .name("name")
            .stream(() -> new ByteArrayInputStream(picture))
            .build();
}
// Close the connection
con.close();
Why does Spring WebFlux (or Java NIO) make multipartData DataBuffer tmp files?
In my case on macOS, files like /private/var/folders/v6/vtrxqpbd4lb3pq8v_sbm10hc0000gn/T/nio-file-upload/nio-body-1-82f11dbe-61b3-4e5d-8c43-92e02aa38481.tmp are created on each request and then deleted.
Is it possible to improve performance by preventing the disk write?
This is my code:
public class FileHandler {
    public Mono<ServerResponse> postFile(ServerRequest req) {
        val file = req.multipartData()
                .map(map -> map.getFirst("file"))
                .ofType(FilePart.class);
        val buffer = file.flatMap(part -> part.content().next());
        val hash = buffer.map(d -> {
            try {
                val md = MessageDigest.getInstance("SHA-1");
                md.update(d.asByteBuffer());
                return Base64Utils.encodeToString(md.digest());
            } catch (NoSuchAlgorithmException e) {
                // does not reach here!
                return "";
            }
        });
        val name = file.map(FilePart::filename);
        return ok().body(hash, String.class);
    }
}
The multipart file support in Spring WebFlux uses the Synchronoss NIO Multipart library. The downside of that implementation is that it is not fully reactive; as a result, it can create temporary files to avoid loading the whole content in memory.
What makes you think that this behavior is a performance problem? Do you have a sample or benchmark results that show that this is an issue?
The Spring Framework team has already worked on this, and a fully reactive implementation will be available as the default in Spring Framework 5.2 (see spring-framework#21659).
I have HBase code that I use for Gets (although I don't have Kerberos on yet, I plan to enable it later, so I wanted to make sure that user credentials were handled correctly when connecting and doing a Put or Get).
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<Object>() {
    @Override
    public Object run() throws Exception {
        Connection connection = null;
        Table table = null;
        List<hFile> HbaseDownload = new ArrayList<>();
        try {
            // Open an HBase Connection
            connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
            Get get = new Get(Bytes.toBytes("filenameCell"));
            Result result = table.get(get);
            byte[] data = result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()), Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
            bos.write(data, 0, data.length);
            bos.flush();
            ...
        }
    }
});
// now get the outputstream.
// I am assuming byteArrayStream is synchronized and thread-safe.
return bos.toByteArray();
However, I wasn't sure whether this was running asynchronously or synchronously.
The problem:
I use:
Get get = new Get(Bytes.toBytes("filenameCell"));
Result result = table.get(get);
Inside this run() function. But to get information OUT of the run() thread I use a new ByteArrayOutputStream created OUTSIDE the run(), call ByteArrayOutputStream.write and ByteArrayOutputStream.flush inside the run(), and then use toByteArray() to get the binary bytes of the HBase content out of the function. This returns null bytes, though, so maybe I'm not doing this right.
However, I am having difficulty finding good examples of the HBase Java API doing these things, and no one else seems to use runAs like I do. It's so strange.
I have an HBase 1.2.5 client running inside a web app (request-based function calls).
Here the thread runs inside MyHBaseService.getUserHBase().runAs(...). But if it runs asynchronously, the program will reach bos.toByteArray() (which is outside the runAs()) before the action has finished executing, so the output is returned before the function has even completed.
I think that's the reason for the null values.
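One way to rule out the ordering problem is to let the privileged action return the bytes itself, so there is no shared stream to race on. A minimal sketch, assuming MyHBaseService.getUserHBase() returns an HBase User whose runAs blocks until the action completes, and using a hypothetical table name:

byte[] data = MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<byte[]>() {
    @Override
    public byte[] run() throws Exception {
        // try-with-resources closes the table and connection automatically
        try (Connection connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
             Table table = connection.getTable(TableName.valueOf("storageTable"))) {
            Get get = new Get(Bytes.toBytes("filenameCell"));
            Result result = table.get(get);
            return result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()),
                    Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
        }
    }
});

If runAs is synchronous, data is fully populated as soon as the call returns, and a null value would then point to the Get itself rather than to threading.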
I am trying to:
Use a Word document with "MergeFields" to fill it with data
Convert it to a PDF document, using Java
I had this working before, and now all of a sudden I get the following error:
java.lang.NoSuchMethodError: org.apache.poi.xwpf.usermodel.XWPFHyperlinkRun
This occurs when I put the .war file on an Amazon EC2 server.
(All other libraries work fine.)
Here are the libraries that I use:
fr.opensagres.xdocreport.converter.odt.odfdom (v 1.0.4)
fr.opensagres.xdocreport.template.freemarker (v 1.0.4)
org.apache.poi.xwpf.converter.core (1.0.5)
org.apache.poi.xwpf.converter.pdf (1.0.5)
org.apache.poi.xwpf.converter.xhtml (1.0.5)
org.apache.poi (3.11)
Is there anything wrong with my libraries, or is this just a server deployment issue?
Very thankful for any help.
Below is my code:
public byte[] wordToPdf(RequestHelper reqHelper, Map<String, Object> values, String docPath) throws IOException, XDocReportException, ServiceUnavailableException, E24Exception {
    try {
        ServletContext ctx = reqHelper.getRequest().getServletContext();
        InputStream tpl = new BufferedInputStream(ctx.getResourceAsStream(docPath));
        IXDocReport report = XDocReportRegistry.getRegistry().loadReport(tpl, TemplateEngineKind.Velocity);
        Options options = Options.getTo(ConverterTypeTo.PDF).via(ConverterTypeVia.XWPF);
        ByteArrayOutputStream pdfOut = new ByteArrayOutputStream();
        report.convert(report.createContext(values), options, pdfOut);
        byte[] pdfImage = pdfOut.toByteArray();
        return pdfImage;
    } catch (FileNotFoundException ex) {
        // swallowed: the template could not be found; consider logging this
    }
    return null;
}
OK, I finally got to a solution that worked for me. Since this post has a lot of views and no answers, I'll answer it myself for those who are in need!
I changed the version of all libraries that have anything to do with "apache.poi" to version 1.0.4.
After that I used org.apache.poi version 3.9 instead of 3.11.
So finally, to wrap it up, this is what I used in the end:
org.apache.poi.xwpf.converter.core (1.0.4)
org.apache.poi.xwpf.converter.pdf (1.0.4)
org.apache.poi.xwpf.converter.xhtml (1.0.4)
org.apache.poi (3.9)
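If you pull these in with Maven, the dependencies would look roughly like this (a sketch; the groupIds are assumed from the usual xdocreport and POI coordinates, and the core/xhtml converter artifacts follow the same pattern):

<dependency>
    <groupId>fr.opensagres.xdocreport</groupId>
    <artifactId>org.apache.poi.xwpf.converter.pdf</artifactId>
    <version>1.0.4</version>
</dependency>
<dependency>
    <groupId>org.apache.poi</groupId>
    <artifactId>poi</artifactId>
    <version>3.9</version>
</dependency>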
/Marcus
I created my Weka model on my machine and imported it into my Android project. When I try to create the classifier, it throws java.io.StreamCorruptedException when I try to deserialise the model I created. The code works perfectly on the machine.
This is my code:
InputStream fis = new FileInputStream("/modle.model");
InputStream is = fis;
Classifier cls = null;
// here I'm getting the error when trying to read the Classifier
cls = (Classifier) SerializationHelper.read(is);
FileInputStream datais = new FileInputStream("/storage/emulated/0/window.arff");
InputStream dataIns = datais;
DataSource source = new DataSource(dataIns);
Instances data = null;
try {
    data = source.getDataSet();
} catch (Exception e) {
    e.printStackTrace();
}
data.setClassIndex(data.numAttributes() - 1);
Instance in = new Instance(13);
in.setDataset(data);
in.setValue(0, testWekaModle1[0]);
in.setValue(1, testWekaModle1[1]);
in.setValue(2, testWekaModle1[2]);
in.setValue(3, testWekaModle1[3]);
in.setValue(4, testWekaModle1[4]);
in.setValue(5, testWekaModle1[5]);
in.setValue(6, testWekaModle1[6]);
in.setValue(7, testWekaModle1[7]);
in.setValue(8, testWekaModle1[8]);
in.setValue(9, testWekaModle1[9]);
in.setValue(10, testWekaModle1[10]);
in.setValue(11, testWekaModle1[11]);
double value = cls.classifyInstance(in);
in.setClassValue(value);
This is the full stack trace:
at java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:2109)
at java.io.ObjectInputStream.<init>(ObjectInputStream.java:372)
at weka.core.SerializationHelper.read(SerializationHelper.java:288)
at info.androidhive.sleepApp.model.ControllerWeka.wekaModle(ControllerWeka.java:81)
at info.androidhive.sleepApp.activity.HomeFragment.extract(HomeFragment.java:278)
at info.androidhive.sleepApp.activity.HomeFragment.stop(HomeFragment.java:146)
at info.androidhive.sleepApp.activity.HomeFragment$2.onClick(HomeFragment.java:107)
at android.view.View.performClick(View.java:4475)
at android.view.View$PerformClick.run(View.java:18786)
at android.os.Handler.handleCallback(Handler.java:730)
at android.os.Handler.dispatchMessage(Handler.java:92)
at android.os.Looper.loop(Looper.java:137)
at android.app.ActivityThread.main(ActivityThread.java:5419)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:525)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1209)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1025)
at dalvik.system.NativeStart.main(Native Method)
Please help me to overcome this problem.
This is resolved. The model was created in a different environment (PC) and deserialised in the Android environment, which failed because the two JDKs were not the same.
Make sure that both weka.jar files have the same version.
And do NOT use the GUI version of Weka to save the model, since the Android runtime does not contain the GUI-related packages used by Weka.
It works fine to build and save the model programmatically on the desktop and then deserialise it on Android.
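Saving the model programmatically is essentially a one-liner. A minimal sketch, assuming a trained classifier and the same Weka version on both sides (the path is just an example):

import weka.classifiers.Classifier;
import weka.core.SerializationHelper;

public class ModelExporter {
    // Plain Java serialization with no GUI classes involved, so the file
    // can be read back with SerializationHelper.read(...) on Android.
    public static void export(Classifier cls, String path) throws Exception {
        SerializationHelper.write(path, cls);
    }
}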
I get the following error:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
which makes little sense.
Reports are created using the BIRT designer within Eclipse, and we are using code to convert the reports into PDF.
The code looks something like:
final EngineConfig config = new EngineConfig();
config.setBIRTHome("./birt");
Platform.startup(config);
final IReportEngineFactory factory = (IReportEngineFactory) Platform
        .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
final HTMLRenderOption ho = new HTMLRenderOption();
ho.setImageHandler(new HTMLCompleteImageHandler());
config.setEmitterConfiguration(RenderOption.OUTPUT_FORMAT_HTML, ho);
// Create the engine.
this.engine = factory.createReportEngine(config);
final IReportRunnable report = this.engine.openReportDesign(reportName);
final IRunAndRenderTask task = this.engine.createRunAndRenderTask(report);
final RenderOption options = new HTMLRenderOption();
options.setOutputFormat("pdf");
final String output = reportName.replaceFirst(".rptdesign", ".pdf");
options.setOutputFileName(output);
task.setRenderOption(options);
// Run the report.
task.run();
but it seems the error is thrown during the task.run() method.
This needs to run standalone, without needing Eclipse, and I hoped that setting BIRT home would make it happy, but there seems to be some other connection profile I am unaware of and probably don't need.
The full error :
07-Jan-2013 14:55:31 org.eclipse.datatools.connectivity.internal.ConnectivityPlugin log
SEVERE: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
07-Jan-2013 14:55:31 org.eclipse.birt.report.engine.api.impl.EngineTask handleFatalExceptions
SEVERE: An error happened while running the report. Cause:
java.lang.IllegalStateException: Unable to determine the default workspace location. Check your OSGi-less platform configuration of the plugin or datatools workspace path.
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getDefaultStateLocation(ConnectivityPlugin.java:155)
at org.eclipse.datatools.connectivity.internal.ConnectivityPlugin.getStorageLocation(ConnectivityPlugin.java:191)
at org.eclipse.datatools.connectivity.internal.ConnectionProfileMgmt.getStorageLocation(ConnectionProfileMgmt.java:1060)
at org.eclipse.datatools.connectivity.oda.profile.internal.OdaProfileFactory.defaultProfileStoreFile(OdaProfileFactory.java:170)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.defaultProfileStoreFile(OdaProfileExplorer.java:138)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.loadProfiles(OdaProfileExplorer.java:292)
at org.eclipse.datatools.connectivity.oda.profile.OdaProfileExplorer.getProfileByName(OdaProfileExplorer.java:537)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getConnectionProfileImpl(ProfilePropertyProviderImpl.java:184)
at org.eclipse.datatools.connectivity.oda.profile.provider.ProfilePropertyProviderImpl.getDataSourceProperties(ProfilePropertyProviderImpl.java:64)
at org.eclipse.datatools.connectivity.oda.consumer.helper.ConnectionPropertyHandler.getEffectiveProperties(ConnectionPropertyHandler.java:123)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.getEffectiveProperties(OdaConnection.java:826)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:240)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:407)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:317)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:455)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:145)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:624)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:267)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1939)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:180)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
Has anyone seen this error and can point me in the right direction?
When I had this issue I tried two things. The first thing solved the error, but then I just got to the next error.
The first thing I tried was setting the setenv.sh file to have the following line:
export CATALINA_OPTS="$CATALINA_OPTS -Djava.io.tmpdir=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir -Dorg.eclipse.datatools_workspacepath=/opt/local/share/tomcat/apache-tomcat-8.0.8/temp/tmpdir/workspace_dtp"
This solution worked after I created the tmpdir and workspace_dtp directories on my local Tomcat server. This was done in response to the guidance here.
However, I just got to the next error, which was a connection profile error. I can look into it again if you need; I know how to replicate the issue.
The second thing I tried ended up solving the issue completely, and had to do with our report designer selecting the wrong type of data source in the report design process. See my post on the Eclipse BIRT forums here for the full story: post.
Basically, the report type was set to "JDBC Database Connection for Query Builder" when it should have been set to "JDBC Data Source" (see the picture in the linked forum post for reference).
Here is a tip that saved me from that pain:
just launch Eclipse with the "-clean" option after installing the BIRT plugins.
To be clear, my project was built from BIRT Maven dependencies and so should not use Eclipse dependencies to run (except for designing reports), but I think there was a conflict somewhere, especially with org.eclipse.datatools.connectivity_1.2.4.v201202041105.jar.
For global understanding, you should follow the migration guide:
http://wiki.eclipse.org/Birt_3.7_Migration_Guide#Connection_Profiles
It helps when using a connection profile to externalize data source parameters.
So it is not required if you define JDBC parameters directly in the report design.
I used this programmatic way to initialize the workspace directory:
@Override
public void initializeEngine() throws BirtException {
    // define eclipse datatools workspace path (required)
    String workspacePath = setDataToolsWorkspacePath();
    // set configuration
    final EngineConfig config = new EngineConfig();
    config.setLogConfig(workspacePath, Level.WARNING);
    // config.setResourcePath(getSqlDriverClassJarPath());
    // startup OSGi framework
    Platform.startup(config); // really needed ?
    IReportEngineFactory factory = (IReportEngineFactory) Platform
            .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
    engine = factory.createReportEngine(config);
    engine.changeLogLevel(Level.WARNING);
}

// DATATOOLS_WORKSPACE_PATH is the system property name shown above:
// "org.eclipse.datatools_workspacepath"
private String setDataToolsWorkspacePath() {
    String workspacePath = System.getProperty(DATATOOLS_WORKSPACE_PATH);
    if (workspacePath == null) {
        workspacePath = FilenameUtils.concat(SystemUtils.getJavaIoTmpDir().getAbsolutePath(), "workspace_dtp");
        File workspaceDir = new File(workspacePath);
        if (!workspaceDir.exists()) {
            workspaceDir.mkdir();
        }
        if (!workspaceDir.canWrite()) {
            workspaceDir.setWritable(true);
        }
        System.setProperty(DATATOOLS_WORKSPACE_PATH, workspacePath);
    }
    return workspacePath;
}
I also needed to force the data source parameters at runtime, this way:
private void generateReportOutput(InputStream reportDesignInStream, File outputFile, OUTPUT_FORMAT outputFormat,
        Map<PARAM, Object> params) throws EngineException, SemanticException {
    // Open a report design
    IReportRunnable design = engine.openReportDesign(reportDesignInStream);
    // Use data-source properties from persistence.xml
    forceDataSource(design);
    // Create RunAndRender task
    IRunAndRenderTask runTask = engine.createRunAndRenderTask(design);
    // Use data-source from JPA persistence context
    // forceDataSourceConnection(runTask);
    // Define report parameters
    defineReportParameters(runTask, params);
    // Set render options
    runTask.setRenderOption(getRenderOptions(outputFile, outputFormat, params));
    // Execute task
    runTask.run();
}
private void forceDataSource(IReportRunnable runableReport) throws SemanticException {
    DesignElementHandle designHandle = runableReport.getDesignHandle();
    Map<String, String> persistenceProperties = PersistenceUtils.getPersistenceProperties();
    String dsURL = persistenceProperties.get(AvailableSettings.JDBC_URL);
    String dsDatabase = StringUtils.substringAfterLast(dsURL, "/");
    String dsUser = persistenceProperties.get(AvailableSettings.JDBC_USER);
    String dsPass = persistenceProperties.get(AvailableSettings.JDBC_PASSWORD);
    String dsDriver = persistenceProperties.get(AvailableSettings.JDBC_DRIVER);
    SlotHandle dataSources = ((ReportDesignHandle) designHandle).getDataSources();
    int count = dataSources.getCount();
    for (int i = 0; i < count; i++) {
        DesignElementHandle dsHandle = dataSources.get(i);
        if (dsHandle != null && dsHandle instanceof OdaDataSourceHandle) {
            // replace connection properties from persistence.xml
            dsHandle.setProperty("databaseName", dsDatabase);
            dsHandle.setProperty("username", dsUser);
            dsHandle.setProperty("password", dsPass);
            dsHandle.setProperty("URL", dsURL);
            dsHandle.setProperty("driverClass", dsDriver);
            dsHandle.setProperty("jarList", getSqlDriverClassJarPath());
            // @SuppressWarnings("unchecked")
            // List<ExtendedProperty> privateProperties = (List<ExtendedProperty>) dsHandle
            //         .getProperty("privateDriverProperties");
            // for (ExtendedProperty extProp : privateProperties) {
            //     if ("odaUser".equals(extProp.getName())) {
            //         extProp.setValue(dsUser);
            //     }
            // }
        }
    }
}
I was having the same issue.
Changing the data source type from "JDBC Database Connection for Query Builder" to "JDBC Data Source" solved the problem for me.