I have a simple BPMN process
in which I am using 2 service tasks. I am executing my process using
processEngine.getRuntimeService().startProcessInstanceByKey("Process_1", variables);
where my variables are as follows:
Map<String, Object> variables = new HashMap<>();
variables.put("a", 2);
variables.put("b", 5);
Service task 1 implements an Addition Java class and service task 2 implements a Multiplication class.
Now I want to have 3 variables (constants) c = 5, d = 10, e = 2. I want to use c in service task 1, so that the Addition class can read it; similarly, I want to use d in my Multiplication class; and e should be global, so that I can use it in both classes.
Can anyone guide me on this?
As a quick fix you could include a setup service task as the first task of the process which prefills your process variables.
Depending on how you start the process you could either:
set the variables via the Java Object API
https://docs.camunda.org/manual/7.5/user-guide/process-engine/variables/#java-object-api
or, if you use a REST call, provide these fixed values within the request body:
https://docs.camunda.org/manual/7.5/reference/rest/process-definition/post-start-process-instance/
Another simple solution would be a class with static values or an enum holding the needed values.
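For the setup-task approach mentioned above, here is a minimal sketch of a JavaDelegate that prefills the constants (the class name and wiring are illustrative; the variable names are taken from the question):
import org.camunda.bpm.engine.delegate.DelegateExecution;
import org.camunda.bpm.engine.delegate.JavaDelegate;

// Hypothetical setup delegate, wired as the first service task of the process
public class SetupConstantsDelegate implements JavaDelegate {
    @Override
    public void execute(DelegateExecution execution) throws Exception {
        execution.setVariable("c", 5);  // used by the Addition task
        execution.setVariable("d", 10); // used by the Multiplication task
        execution.setVariable("e", 2);  // global, used by both tasks
    }
}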
--edit--
If you want to use the inputOutput extension, add something like this to your BPMN file:
<bpmn:process id="Process_1" isExecutable="false">
  <bpmn:extensionElements>
    <camunda:inputOutput>
      <camunda:inputParameter name="c">5</camunda:inputParameter>
      <camunda:inputParameter name="d">10</camunda:inputParameter>
      <camunda:inputParameter name="e">2</camunda:inputParameter>
    </camunda:inputOutput>
  </bpmn:extensionElements>
  <!-- ... rest of the process ... -->
</bpmn:process>
This can't be done in the diagram view of the Camunda Modeler; just switch to the XML representation of the process and add the extensionElements there.
The documentation shows two different ways to store the value:
Java Object API
Typed Value API
I think using the Java object API requires the Java object to implement the Serializable interface? The following code would break if the Order object does not implement Serializable:
com.example.Order order = new com.example.Order();
runtimeService.setVariable(execution.getId(), "order", order);
com.example.Order retrievedOrder = (com.example.Order) runtimeService.getVariable(execution.getId(), "order");
I would use the following format for a Java object:
ObjectValue customerDataValue = Variables.objectValue(customerData)
.serializationDataFormat(Variables.SerializationDataFormats.JAVA)
.create();
execution.setVariable("someVariable", customerDataValue);
customerData refers to any Java object. However, if its member variables contain references to other objects, those references need to be serializable as well. To avoid this you will have to declare those references as transient.
Furthermore, use the setVariableLocal method if you want the variable to be confined to the local execution scope (note that local variables are still persisted to the DB).
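For completeness, reading such a variable back inside a delegate might look like this (a sketch using the Typed Value API; getValue() returns the deserialized object):
ObjectValue retrievedValue = execution.getVariableTyped("someVariable");
Object customerData = retrievedValue.getValue(); // the deserialized Java object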
To create a variable as global: org.camunda.bpm.engine.variable.Variables.putValue("keyName", value);
To get a global variable: VariableType value = (VariableType) delegateExecution.getVariable("keyName");
Note: Your DTO has to be serializable, otherwise Camunda will throw a serialization error.
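Tying this back to the original question, the fluent VariableMap API can also seed such variables at process start (a sketch; the keys and values are the constants from the question):
import org.camunda.bpm.engine.variable.VariableMap;
import org.camunda.bpm.engine.variable.Variables;

VariableMap variables = Variables.putValue("c", 5).putValue("d", 10).putValue("e", 2);
processEngine.getRuntimeService().startProcessInstanceByKey("Process_1", variables);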
Related
My Java project uses the Apache Thrift framework and has a Thrift object structure similar to the following:
struct MyStruct {
  1: required string something;
  2: optional OptionEnum option;
}

enum OptionEnum {
  VALUE_A = 0,
  VALUE_B = 1
}
So when my project compiles, it builds a Java class for this structure (i.e., class MyStruct).
What I am trying to do is to serialize this into a JSON string.
I've tried using TSerializer:
TSerializer serializer = new TSerializer(new TSimpleJSONProtocol.Factory());
return serializer.toString(instanceOfMyStruct);
This generates JSON but loses the string name of the enum (it is converted into its numeric value instead):
{
  "something": "value",
  "option": 1
}
Is there a way to keep the enum name (I mean option being VALUE_B instead of 1 in the above example)?
The issue here is that the conversion is baked into the code parts generated by the Thrift compiler. The protocol-level classes only know about a few very basic data types; by the time the data reaches that level, it is already too late.
So, unless you want to fork and implement your own (incompatible) version of the code generator, I'm afraid there is no way.
PS: I might add that the main purpose driving the design is efficiency, not human readability.
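That said, if post-processing the output is acceptable, the generated code does expose enough metadata to restore the name afterwards: the generated Java enum keeps its symbolic names, and the struct has isSet accessors. A sketch (MyStruct and OptionEnum are the generated classes from the question; the crude string patch is purely illustrative):
import org.apache.thrift.TException;
import org.apache.thrift.TSerializer;
import org.apache.thrift.protocol.TSimpleJSONProtocol;

public class EnumNameJson {
    static String toJsonWithEnumName(MyStruct s) throws TException {
        String json = new TSerializer(new TSimpleJSONProtocol.Factory()).toString(s);
        if (s.isSetOption()) {
            // Replace the numeric enum value with its symbolic name,
            // e.g. "option":1 becomes "option":"VALUE_B"
            json = json.replace("\"option\":" + s.getOption().getValue(),
                                "\"option\":\"" + s.getOption().name() + "\"");
        }
        return json;
    }
}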
I am currently facing problems integrating the existing Piketec TPT Java API (http://javadoc.jenkins.io/plugin/piketec-tpt/com/piketec/tpt/api/package-summary.html) in a Java project by using Reflection.
The TPT Api provides an interface called "TptApi", which contains several abstract methods, that are used to access TPT projects.
I have already integrated other APIs such as the Dox4j API, where a class instance was used as the invocation target. Obviously, this is not the correct way to access methods of an interface.
My goal is to access the method "OpenResult openProject(File f)" from the TptApi interface (http://javadoc.jenkins.io/plugin/piketec-tpt/com/piketec/tpt/api/TptApi.html#openProject-java.io.File-).
My code:
ClassLoader cl = new URLClassLoader(...);
Map<String, Class<?>> c = new HashMap<>();
File file = new File("test.prj");
c.put("TptApi", cl.loadClass("com.piketec.tpt.api.TptApi"));
c.put("OpenResult", cl.loadClass("com.piketec.tpt.api.OpenResult"));
// The way I did it with 'normal' classes, not applicable with the interface:
// Object target = c.get("TptApi").newInstance();
OpenResult or = (OpenResult) c.get("TptApi")
        .getMethod("openProject", File.class)
        .invoke(target, file);
So how do I access abstract interface methods by Reflection?
I just stumbled over this question, so let me answer it even if it is a bit outdated. I read in your comments that you assume you do not need the TPT tool itself to use the API. That is simply wrong. The API is just a way to communicate via RMI with a running TPT instance. To connect to TPT, the TPT instance must have RMI enabled, and you have to know the configurable port and binding name. You can set these in the Preferences under "TPT API" or by starting TPT with the command line arguments "--apiPort <port>" and "--apiBindingName <name>". Now you can obtain the TptApi instance using these two lines of code:
Registry registry = LocateRegistry.getRegistry(HOST, PORT); // get Server/RMI-Registry
TptApi remoteApi = (TptApi)registry.lookup(BINDING_NAME); // get TPT-API
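If the reflective style from the question is still required (for example because the API jar is only on a separate URLClassLoader), note that the remote object returned by the registry already implements the interface, so methods are invoked on that proxy instance rather than on a newInstance of the interface. A sketch under that assumption:
// Reflection on the RMI proxy instead of instantiating the interface
Object remoteApi = registry.lookup(BINDING_NAME); // proxy implementing TptApi
Object openResult = remoteApi.getClass()
        .getMethod("openProject", File.class)
        .invoke(remoteApi, new File("test.prj"));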
My initial post did not assume that TPT is unnecessary for using the API; it is indeed required. The way to enable the API in TPT is well documented, and the two lines of code you posted are also fine. The problem I described is accessing the API via Java reflection. For other APIs I could do this by using newInstance to get access to the tool. With TPT, the corresponding API object is an interface instead of a class, so there is no way to instantiate it directly. That is why I wanted to know how this specific API can be accessed via reflection.
I'm customizing a PLM Windchill workflow, which provides a mechanism to execute Java code snippets. Unfortunately, they are 'inserted' into a prepared service method, which means there is no way to import classes, so I have to use fully qualified package names everywhere. Don't try to understand the snippet below, just look at what it looks like:
wt.fc.QueryResult activities = wt.fc.PersistenceHelper.manager.find((wt.pds.StatementSpec) activitiesQuery);
while (activities.hasMoreElements()) {
    wt.workflow.work.WfAssignedActivity activity = (wt.workflow.work.WfAssignedActivity) activities.nextElement();
    if (activity.getDisplayIdentifier().toString().equals("Analyze Image Request")) {
        java.util.List<wt.workflow.work.WorkItem> workItems = wt.workflow.status.WfWorkflowStatusHelper.service.getWorkItems(activity);
        for (wt.workflow.work.WorkItem workItem : workItems) {
            String action = workItem.getActionPerformed();
            if (action != null && action.equals("Accepted")) {
                wt.org.WTPrincipalReference approver = workItem.getOwnership().getOwner();
                n_approver = approver.getFullName() + " (" + approver.getDisplayName() + ")";
                wt.fc.collections.WTHashSet approverSet = new wt.fc.collections.WTHashSet(java.util.Arrays.asList(approver));
                wt.project.Role role = wt.project.Role.toRole("APPROVER");
                com.ptc.windchill.pdmlink.change.server.impl.WorkflowProcessHelper.setChangeItemParticipants(report, role, approverSet);
                break;
            }
        }
        break;
    }
}
And my question is: how can I make this code more readable? There is no way to import classes inside the method, and no way to split this snippet into separate methods (as it is 'pasted' into one), so I'm looking for other ideas.
One option to make the code more readable would be to separate chained method/property calls across multiple lines.
For example, this line:
wt.project.Role role = wt.project.Role.toRole("APPROVER");
could be rewritten as:
wt.project.Role role = wt
.project
.Role
.toRole("APPROVER");
You can call this complete code from a customized Java class.
You just have to call your class and take only the final result back from the Java class, which makes the expression more readable (see the sketch below).
If you need multiple outputs, write multiple methods in the Java class and call them in the workflow expression.
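As a sketch of that approach (the class name, package, and the loose typing of report are illustrative assumptions; the body is the snippet from the question, now with ordinary imports):
package ext.mycompany.workflow; // hypothetical package under $WT_HOME/codebase

import java.util.Arrays;
import java.util.List;

import com.ptc.windchill.pdmlink.change.server.impl.WorkflowProcessHelper;

import wt.fc.PersistenceHelper;
import wt.fc.QueryResult;
import wt.fc.collections.WTHashSet;
import wt.org.WTPrincipalReference;
import wt.pds.StatementSpec;
import wt.project.Role;
import wt.workflow.status.WfWorkflowStatusHelper;
import wt.workflow.work.WfAssignedActivity;
import wt.workflow.work.WorkItem;

public class ApproverHelper {

    // Returns the approver display string, or null if no accepted work item was found.
    // 'report' is typed loosely here; use the concrete change-item type from your workflow.
    public static String assignApprover(StatementSpec activitiesQuery, Object report) throws Exception {
        QueryResult activities = PersistenceHelper.manager.find(activitiesQuery);
        while (activities.hasMoreElements()) {
            WfAssignedActivity activity = (WfAssignedActivity) activities.nextElement();
            if (!"Analyze Image Request".equals(activity.getDisplayIdentifier().toString())) {
                continue;
            }
            List<WorkItem> workItems = WfWorkflowStatusHelper.service.getWorkItems(activity);
            for (WorkItem workItem : workItems) {
                if ("Accepted".equals(workItem.getActionPerformed())) {
                    WTPrincipalReference approver = workItem.getOwnership().getOwner();
                    WTHashSet approverSet = new WTHashSet(Arrays.asList(approver));
                    WorkflowProcessHelper.setChangeItemParticipants(report, Role.toRole("APPROVER"), approverSet);
                    return approver.getFullName() + " (" + approver.getDisplayName() + ")";
                }
            }
            break;
        }
        return null;
    }
}
The workflow expression then shrinks to a single line such as:
n_approver = ext.mycompany.workflow.ApproverHelper.assignApprover(activitiesQuery, report);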
You can't.
Workflow expressions are method bodies.
A statement like
wt.fc.QueryResult activities = wt.fc.PersistenceHelper.manager.find((wt.pds.StatementSpec) activitiesQuery);
ends up in a class under $WT_HOME/codebase/wt/workflow/expr/
with a method:
public static Object executemethod_1(Object[] var0, Object[] var1) throws Exception {
    wt.fc.QueryResult activities = wt.fc.PersistenceHelper.manager.find((wt.pds.StatementSpec) activitiesQuery);
    // some generated code to handle variables...
}
So, you can't use import.
However:
If you have a PDMLink version greater than 10,
you can externalize workflow expressions:
http://support.ptc.com/cs/help/windchill_hc/wc100_hc/index.jspx?id=WFTemplateExtExpression&action=show
This creates a Java class under /codebase/ext/wt/workflow/externalize.
Then you can do what you want, but you'll have to compile these classes and do a stop/start in case of modifications.
Basically, it's nothing more than calling external code from the expression, so I don't use it a lot...
I want to implement a network protocol. To obtain a maintainable design I am looking for fitting patterns.
The protocol is based on XML and should be read with Java. To simplify the discussion here, I assume the example grammar:
<User>
    <GroupList>
        <Group>group1</Group>
        <Group>group2</Group>
    </GroupList>
</User>
Short question:
What is a good design pattern to parse such thing?
Long version:
I have found this and this question where different patterns (mostly state pattern) are proposed.
My actual (but lacking) solution is the following:
For each possible entry in the XML I create a class to contain the data and a parser. Thus I have User, User.Parser, ... as classes.
Further there is a ParserSelector that has a Map<String,AbstractParser> in which all possible subentries get registered.
For each parser a ParserSelector gets instantiated and set up.
For example the ParserSelector of the GroupList.Parser has one entry: The mapping from the string "Group" to an instance of Group.Parser.
If I did not use the ParserSelector class, I would have to write this block of code into every single parser.
The problem now is how to get the parsed data into the parent objects.
The Group.Parser would create a Group object with content group1.
This object must now be registered in the GroupList object.
I have read about using the Visitor or Observer patterns but do not understand how they might fit here.
I give some pseudo code below to see the problem.
You can see that I have to check the type via instanceof, as the type information is not available statically.
I thought it should be possible to solve this in a cleaner (more maintainable) way using polymorphism in Java.
But I always run into the problem that Java only does dynamic binding on overriding.
Thus I cannot add a parameter to the XMLParser.parse(...) method to allow 'remote updating' as in a visitor/observer-like approach.
Side remark: The real grammar is 'deep', i.e., there are quite a lot of XML entries (here only three: User, GroupList and Group), while most of them may contain only very few different subentries (here, User and GroupList each contain only one kind of subentry, while Group contains only text).
Here are some lines of pseudo Java code to illustrate the problem:
class User extends AbstractObject {
    static class Parser implements XMLParser {
        ParserSelector ps = ...; // Initialize with GroupList.Parser
        void parse(XMLStreamReader xsr) {
            XMLParser p = ps.getParser(...); // The corresponding parser.
            // We know only that it is XMLParser statically.
            p.parse(...);
            if (p instanceof GroupList.Parser) {
                // Set the group list in the User class
            }
        }
    }
}
class GroupList extends AbstractObject {...}
class Group extends AbstractObject {...}

class ParserSelector {
    Map<String, XMLParser> map = new HashMap<>();
    void registerParser(...) {...} // Registers a possible parser for subentries
    XMLParser getParser(String elementName) {
        return map.get(elementName); // Returns the parser registered with the given name
    }
}

interface XMLParser {
    void parse(XMLStreamReader xsr);
}

abstract class AbstractObject {}
To finish this question:
I ended up with JAXB. In fact, I was not aware that it can easily create an XML schema from Java source code (using annotations).
Thus I just have to write classical Java objects which are used for the transfer, and the API handles the conversion to and from XML quite well.
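For reference, a minimal JAXB sketch for the example grammar above (class and field names are illustrative):
import java.io.StringReader;
import java.util.List;

import javax.xml.bind.JAXB;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlElementWrapper;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "User")
@XmlAccessorType(XmlAccessType.FIELD)
class User {
    @XmlElementWrapper(name = "GroupList")
    @XmlElement(name = "Group")
    List<String> groups;
}

public class JaxbDemo {
    public static void main(String[] args) {
        String xml = "<User><GroupList><Group>group1</Group>"
                   + "<Group>group2</Group></GroupList></User>";
        User user = JAXB.unmarshal(new StringReader(xml), User.class);
        System.out.println(user.groups); // prints [group1, group2]
    }
}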
I'm trying to use Groovy to create an interactive scripting/macro mode for my application. The application is OSGi-based, and much of the information the scripts may need is not known up front. I figured I could use GroovyShell and call eval() multiple times, continually appending to the namespace as OSGi bundles are loaded. GroovyShell maintains variable state over multiple eval calls, but not class definitions or methods.
Goal: create a base class during startup; as OSGi bundles load, create derived classes as needed.
I am not sure what you mean about declared classes not existing between evals; the following two scripts work as expected when evaluated one after another:
class C {{println 'hi'}}
new C()
...
new C()
However, methods become bound to the class that declared them, and GroovyShell creates a new class for each evaluation. If you do not need the return value of any of the scripts, and they are truly scripts (not classes with main methods), you can attach the following to the end of every evaluated script.
Class klass = this.getClass()
this.getMetaClass().getMethods().each {
    if (it.declaringClass.cachedClass == klass) {
        binding[it.name] = this.&"$it.name"
    }
}
If you depend on the return value you can hand-manage the evaluation and run the script as part of your parsing (warning, untested code follows, for illustrative uses only)...
String scriptText = ...
Script script = shell.parse(scriptText)
def returnValue = script.run()
Class klass = script.getClass()
script.getMetaClass().getMethods().each {
    if (it.declaringClass.cachedClass == klass) {
        shell.context[it.name] = script.&"$it.name"
    }
}
// do whatever with returnValue...
There is one last caveat I am sure you are aware of: statically typed variables are not kept between evals, as they are not stored in the binding. So in the previous script the variable 'klass' will not be kept between script invocations and will disappear. To rectify that, simply remove the type declarations on the first use of all variables; that means they will be read from and written to the binding.
Ended up injecting code before each script compilation. The end goal is that the user-written script has a domain-specific language available for use.
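A minimal sketch of that idea (the preamble content and user script are illustrative): prepend the DSL definitions to every script before evaluation, so each compilation sees them.
import groovy.lang.GroovyShell;

public class DslPreambleDemo {
    public static void main(String[] args) {
        // Hypothetical DSL preamble injected before each user script
        String preamble = "def area(w, h) { w * h }\n";
        String userScript = "area(3, 4)";
        Object result = new GroovyShell().evaluate(preamble + userScript);
        System.out.println(result); // prints 12
    }
}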
This might be what you are looking for?
From Groovy in Action
def binding = new Binding(x: 6, y: 4)
def shell = new GroovyShell(binding)
def expression = '''f = x * y'''
shell.evaluate(expression)
assert binding.getVariable("f") == 24
An appropriate use of Binding will allow you to maintain state?