Hazelcast cache implementation using Apache Camel Blueprint - Java

I am trying to implement Hazelcast caching with Camel Blueprint, but I couldn't get it to work. I am able to create a Hazelcast instance through Java code (not through the Hazelcast XML config file). The instance is created, but the cache loader class is not called during instance creation (even though the initial load mode is EAGER). Some of the code snippets are attached.
Let me know if anyone has come across this.
Code:
Hazelcast config
public class ConfigHack extends Config {
public ConfigHack(String instanceName ){
super(instanceName);
System.out.println("Going to create Hazelcast instance
................"+instanceName);
TcpIpConfig tcpIpConfig = new TcpIpConfig();
List<String> membersList = new ArrayList<String>();
membersList.add("localhost");
tcpIpConfig.setMembers(membersList);
MulticastConfig multicastConfig = new MulticastConfig();
multicastConfig.setEnabled(true);
JoinConfig join = new JoinConfig();
join.setTcpIpConfig(tcpIpConfig);
join.setMulticastConfig(multicastConfig);
NetworkConfig networkConfig = new NetworkConfig();
networkConfig.setPort(5701);
networkConfig.setPortAutoIncrement(true);
networkConfig.setJoin(join);
GroupConfig groupConfig = new GroupConfig();
groupConfig.setName("devuser");
groupConfig.setPassword("devpassword");
MapStoreConfig mapStoreConfig = new MapStoreConfig();
//Absolute path in class name field below
mapStoreConfig.setClassName("VehicleCacheLoader");
mapStoreConfig.setImplementation(new VehicleCacheLoader());
mapStoreConfig.setEnabled(true);
mapStoreConfig.setInitialLoadMode(InitialLoadMode.EAGER);
mapStoreConfig.setWriteDelaySeconds(500);
MapConfig mapConfig = new MapConfig();
mapConfig.setName("vehicleMap");
mapConfig.setBackupCount(2);
mapConfig.setMaxIdleSeconds(1000000);
mapConfig.setEvictionPercentage(30);
mapConfig.setEvictionPolicy(EvictionPolicy.LFU);
mapConfig.setMapStoreConfig(mapStoreConfig);
Map<String,MapConfig> mapConfigs = new HashMap<String,MapConfig>();
mapConfigs.put("vehicleMap", mapConfig);
//config.setMapConfigs(mapConfigs);
addMapConfig(mapConfig);
setGroupConfig(groupConfig);
setNetworkConfig(networkConfig);
}
}
Cache loader class:
public class VehicleCacheLoader implements MapLoader<String, VehicleVO> {
@Override
public VehicleVO load(String paramK) {
System.out.println("Calling load method for Key " + paramK);
VehicleVO vehicleVO = new VehicleVO();
vehicleVO.setCustId("XXX");
vehicleVO.setVehicleHeader("XXX");
vehicleVO.setVehicleInitial("001");
vehicleVO.setVehicleNumber("1234");
vehicleVO.setVehicleObjId(paramK);
return vehicleVO;
}
@Override
public Map<String, VehicleVO> loadAll(Collection<String> paramCollection) {
System.out.println("Calling Load all values() " + "Got key = ");
VehicleVO vehicleVO = null;
Map<String, VehicleVO> vehicleDataMap = new HashMap<String, VehicleVO>();
for (String paramKey : paramCollection) {
System.out.println("Calling ...." + paramKey);
vehicleVO = new VehicleVO();
vehicleVO.setCustId("XXX");
vehicleVO.setVehicleHeader("XXX");
vehicleVO.setVehicleInitial("001");
vehicleVO.setVehicleNumber("1234");
vehicleVO.setVehicleObjId(paramKey);
vehicleDataMap.put(paramKey, vehicleVO);
}
return vehicleDataMap;
}
@Override
public Set<String> loadAllKeys() {
System.out.println("Calling Load all keys() ");
Set<String> vehicleKeys = new HashSet<String>();
vehicleKeys.add("XXX001");
vehicleKeys.add("XXX002");
vehicleKeys.add("XXX003");
vehicleKeys.add("XXX004");
return vehicleKeys;
}
}
Blueprint config:
-----------------
<bean id="hazelcastInstance" class="com.hazelcast.core.Hazelcast"
factory-method="newHazelcastInstance" destroy-method="shutdown">
<argument ref="hazelcastConfig"/>
</bean>
<bean id="hazelcastConfig" class="xx.yy.zz.ss.tt.cache.ConfigHack">
<argument value="TestInstance" />
</bean>

The line
mapConfigs.put("vehicleMap", mapConfig);
defines the configuration that will be used for maps with names matching "vehicleMap".
In order to create such a map you need to run an operation against it, such as
hazelcastInstance.getMap("vehicleMap");
The distinction would be clearer if the configuration were:
mapConfigs.put("vehicleMap*", mapConfig);
This would be used when you create a map named "vehicleMap1" or "vehicleMap123".
The map configuration defines what will be used if it is needed. It's not needed until you first access the map, which is when the map is created.
"EAGER" here refers to how the map loader is run, not to how the map is created.

Related

Using a Spring Boot application inside another one

I want to integrate my Spring Boot project into another one.
For this I export the .jar and put it in the libraries of the other project, which is also Spring Boot.
My .jar is:
https://drive.google.com/file/d/0B96L3Vd9zNeoQzhhcmFjT05vRWc/view?usp=sharing
And my main class in the other project is:
@SpringBootApplication
@EnableJpaRepositories
public class UpsysmarocApplicationTestlogApplication {
public static void main(String[] args) {
ConfigurableApplicationContext context = SpringApplication.run(UpsysmarocApplicationTestlogApplication.class, args);
TraceabilityLogService traceabilityLogService = context.getBean(TraceabilityLogService.class);
List<Map<String, String>> items = new ArrayList<>();
Map<String, String> item = new HashMap<>();
item.put("element", "Nom");
item.put("oldValue", "Mkharbach2");
item.put("newValue", "Mounji2");
items.add(item);
item = new HashMap<>();
item.put("element", "Prenom");
item.put("oldValue", "Ayoub2");
item.put("newValue", "Said2");
items.add(item);
List<Map<String, String>> connections = new ArrayList<>();
Map<String, String> connection = new HashMap<>();
connection.put("className", "User");
connection.put("originId", "3");
connections.add(connection);
TraceabilityLog traceabilityLog = traceabilityLogService.save("Eladlani2", "CREATION", items, connections);
System.out.println("RETURN => " + traceabilityLog.getId());
}
}
But I want another way that does not require instantiating the context and instead just uses the functionality provided by our module.
I am still looking for the best approach that works well.
Thanks in advance.
To solve the problem, I added the project as a Maven dependency.

Programmatic SchemaExport / SchemaUpdate with Hibernate 5 and Spring 4

With Spring 4 and Hibernate 4, I was able to use Reflection to get the Hibernate Configuration object from the current environment, using this code:
@Autowired LocalContainerEntityManagerFactoryBean lcemfb;
EntityManagerFactoryImpl emf = (EntityManagerFactoryImpl) lcemfb.getNativeEntityManagerFactory();
SessionFactoryImpl sf = emf.getSessionFactory();
SessionFactoryServiceRegistryImpl serviceRegistry = (SessionFactoryServiceRegistryImpl) sf.getServiceRegistry();
Configuration cfg = null;
try {
Field field = SessionFactoryServiceRegistryImpl.class.getDeclaredField("configuration");
field.setAccessible(true);
cfg = (Configuration) field.get(serviceRegistry);
} catch (NoSuchFieldException | SecurityException | IllegalArgumentException | IllegalAccessException e) {
e.printStackTrace();
}
SchemaUpdate update = new SchemaUpdate(serviceRegistry, cfg);
With Hibernate 5, I must use some MetadataImplementor, which doesn't seem to be available from any of those objects. I also tried to use MetadataSources with the serviceRegistry, but it said that it's the wrong kind of ServiceRegistry.
Is there any other way to get this working?
The basic idea for this problem is:
implement org.hibernate.integrator.spi.Integrator so that it stores the required data in some holder, register the implementation as a service, and use it where you need it.
A working example can be found here: https://github.com/valery-barysok/spring4-hibernate5-stackoverflow-34612019
Create an org.hibernate.integrator.api.integrator.Integrator class:
import hello.HibernateInfoHolder;
import org.hibernate.boot.Metadata;
import org.hibernate.engine.spi.SessionFactoryImplementor;
import org.hibernate.service.spi.SessionFactoryServiceRegistry;
public class Integrator implements org.hibernate.integrator.spi.Integrator {
@Override
public void integrate(Metadata metadata, SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
HibernateInfoHolder.setMetadata(metadata);
HibernateInfoHolder.setSessionFactory(sessionFactory);
HibernateInfoHolder.setServiceRegistry(serviceRegistry);
}
@Override
public void disintegrate(SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
}
}
Create a META-INF/services/org.hibernate.integrator.spi.Integrator file containing the fully qualified class name:
org.hibernate.integrator.api.integrator.Integrator
Then use the holder wherever you need the metadata, for example:
import org.hibernate.boot.spi.MetadataImplementor;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.hibernate.tool.hbm2ddl.SchemaUpdate;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class Application implements CommandLineRunner {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
@Override
public void run(String... args) throws Exception {
new SchemaExport((MetadataImplementor) HibernateInfoHolder.getMetadata()).create(true, true);
new SchemaUpdate(HibernateInfoHolder.getServiceRegistry(), (MetadataImplementor) HibernateInfoHolder.getMetadata()).execute(true, true);
}
}
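The hello.HibernateInfoHolder class referenced above is not shown in the answer; a minimal sketch (assuming plain static fields are all that is needed) could be:
package hello;

import org.hibernate.boot.Metadata;
import org.hibernate.engine.spi.SessionFactoryImplementor;
import org.hibernate.service.spi.SessionFactoryServiceRegistry;

// Hypothetical holder: just keeps the references handed over by the Integrator.
public class HibernateInfoHolder {

    private static Metadata metadata;
    private static SessionFactoryImplementor sessionFactory;
    private static SessionFactoryServiceRegistry serviceRegistry;

    public static void setMetadata(Metadata md) { metadata = md; }
    public static Metadata getMetadata() { return metadata; }

    public static void setSessionFactory(SessionFactoryImplementor sf) { sessionFactory = sf; }
    public static SessionFactoryImplementor getSessionFactory() { return sessionFactory; }

    public static void setServiceRegistry(SessionFactoryServiceRegistry registry) { serviceRegistry = registry; }
    public static SessionFactoryServiceRegistry getServiceRegistry() { return serviceRegistry; }
}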
I would like to add on to Aviad's answer to make it complete, as per the OP's request.
The internals:
In order to get an instance of MetadataImplementor, the workaround is to register an instance of SessionFactoryBuilderFactory through Java's ServiceLoader facility. This registered service's getSessionFactoryBuilder method is then invoked with an instance of MetadataImplementor when Hibernate is bootstrapped. The code references are below:
Service Loading
Invocation of getSessionFactoryBuilder
So, ultimately, to get an instance of MetadataImplementor, you have to implement SessionFactoryBuilderFactory and register it so that ServiceLoader can recognize this service:
An implementation of SessionFactoryBuilderFactory:
public class MetadataProvider implements SessionFactoryBuilderFactory {
private static MetadataImplementor metadata;
@Override
public SessionFactoryBuilder getSessionFactoryBuilder(MetadataImplementor metadata, SessionFactoryBuilderImplementor defaultBuilder) {
this.metadata = metadata;
return defaultBuilder; //Just return the one provided in the argument itself. All we care about is the metadata :)
}
public static MetadataImplementor getMetadata() {
return metadata;
}
}
In order to register the above, create a simple text file at the following path (assuming it's a Maven project; ultimately we need the 'META-INF' folder to be available on the classpath):
src/main/resources/META-INF/services/org.hibernate.boot.spi.SessionFactoryBuilderFactory
And the content of the text file should be a single line (it can even be multiple lines if you need to register multiple instances) stating the fully qualified class name of your implementation of SessionFactoryBuilderFactory. For example, for the above class, if your package name is 'com.yourcompany.prj', the following should be the content of the file:
com.yourcompany.prj.MetadataProvider
And that's it. If you run your application, whether a Spring app or standalone Hibernate, you will have an instance of MetadataImplementor available through a static method once Hibernate is bootstrapped.
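Once that is in place, the captured metadata can be fed to the schema tools the same way as in the earlier answer. A short sketch, assuming Hibernate 5.0/5.1 and the MetadataProvider class above:
// Sketch: generate the DDL script from the captured metadata without touching the database.
MetadataImplementor metadata = MetadataProvider.getMetadata();
new SchemaExport(metadata).create(true, false); // print the script, do not export to the DB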
Update 1:
There is no way it can be injected via Spring. I dug into Hibernate's source code and the metadata object is not stored anywhere in the SessionFactory (which is what we get from Spring), so it's not possible to inject it. But there are two options if you want it the Spring way:
Extend existing classes and customize all the way from
LocalSessionFactoryBean -> MetadataSources -> MetadataBuilder
LocalSessionFactoryBean is what you configure in Spring, and it has a MetadataSources object. MetadataSources creates a MetadataBuilder, which in turn creates a MetadataImplementor. None of the above operations store anything; they just create objects on the fly and return them. If you want to have an instance of Metadata, you should extend and modify the above classes so that they store a local copy of the respective objects before they return. That way you can have a reference to the MetadataImplementor. But I wouldn't really recommend this unless it's really needed, because the APIs might change over time.
On the other hand, if you don't mind building a MetadataImplementor from the SessionFactory, the following code will help you:
EntityManagerFactoryImpl emf=(EntityManagerFactoryImpl)lcemfb.getNativeEntityManagerFactory();
SessionFactoryImpl sf=emf.getSessionFactory();
StandardServiceRegistry serviceRegistry = sf.getSessionFactoryOptions().getServiceRegistry();
MetadataSources metadataSources = new MetadataSources(new BootstrapServiceRegistryBuilder().build());
Metadata metadata = metadataSources.buildMetadata(serviceRegistry);
SchemaUpdate update=new SchemaUpdate(serviceRegistry,metadata); //To create SchemaUpdate
// You can either create SchemaExport from the above details, or you can get the existing one as follows:
try {
Field field = SessionFactoryImpl.class.getDeclaredField("schemaExport");
field.setAccessible(true);
SchemaExport schemaExport = (SchemaExport) field.get(sf); // read the field from the SessionFactoryImpl instance
} catch (NoSuchFieldException | SecurityException | IllegalArgumentException | IllegalAccessException e) {
e.printStackTrace();
}
Take a look at this one:
public class EntityMetaData implements SessionFactoryBuilderFactory {
private static final ThreadLocal<MetadataImplementor> meta = new ThreadLocal<>();
@Override
public SessionFactoryBuilder getSessionFactoryBuilder(MetadataImplementor metadata, SessionFactoryBuilderImplementor defaultBuilder) {
meta.set(metadata);
return defaultBuilder;
}
public static MetadataImplementor getMeta() {
return meta.get();
}
}
Take a look at This Thread, which seems to answer your needs.
Well, my go-to for this:
public class SchemaTranslator {
public static void main(String[] args) throws Exception {
new SchemaTranslator().run();
}
private void run() throws Exception {
String packageName[] = { "model"};
generate(packageName);
}
private List<Class<?>> getClasses(String packageName) throws Exception {
File directory = null;
try {
ClassLoader cld = getClassLoader();
URL resource = getResource(packageName, cld);
directory = new File(resource.getFile());
} catch (NullPointerException ex) {
throw new ClassNotFoundException(packageName + " (" + directory + ") does not appear to be a valid package");
}
return collectClasses(packageName, directory);
}
private ClassLoader getClassLoader() throws ClassNotFoundException {
ClassLoader cld = Thread.currentThread().getContextClassLoader();
if (cld == null) {
throw new ClassNotFoundException("Can't get class loader.");
}
return cld;
}
private URL getResource(String packageName, ClassLoader cld) throws ClassNotFoundException {
String path = packageName.replace('.', '/');
URL resource = cld.getResource(path);
if (resource == null) {
throw new ClassNotFoundException("No resource for " + path);
}
return resource;
}
private List<Class<?>> collectClasses(String packageName, File directory) throws ClassNotFoundException {
List<Class<?>> classes = new ArrayList<>();
if (directory.exists()) {
String[] files = directory.list();
for (String file : files) {
if (file.endsWith(".class")) {
// removes the .class extension
classes.add(Class.forName(packageName + '.' + file.substring(0, file.length() - 6)));
}
}
} else {
throw new ClassNotFoundException(packageName + " is not a valid package");
}
return classes;
}
private void generate(String[] packagesName) throws Exception {
Map<String, String> settings = new HashMap<String, String>();
settings.put("hibernate.hbm2ddl.auto", "drop-create");
settings.put("hibernate.dialect", "org.hibernate.dialect.PostgreSQL94Dialect");
MetadataSources metadata = new MetadataSources(
new StandardServiceRegistryBuilder()
.applySettings(settings)
.build());
for (String packageName : packagesName) {
System.out.println("packageName: " + packageName);
for (Class<?> clazz : getClasses(packageName)) {
System.out.println("Class: " + clazz);
metadata.addAnnotatedClass(clazz);
}
}
SchemaExport export = new SchemaExport(
(MetadataImplementor) metadata.buildMetadata()
);
export.setDelimiter(";");
export.setOutputFile("db-schema.sql");
export.setFormat(true);
export.execute(true, false, false, false);
}
}

How can I read in a list of objects from yaml using Spring's PropertiesConfigurationFactory?

If I have a set of properties, I understand that Spring Boot's relaxed data binder will read in a list of properties (or YAML) and populate the matching object, like so:
Properties props = new Properties();
props.put("devices.imports[0]","imp1");
props.put("devices.imports[1]","imp2");
props.put("devices.definitions[0].id","first");
props.put("devices.definitions[1].id", "second");
DeviceConfig conf = new DeviceConfig();
PropertiesConfigurationFactory<DeviceConfig> pcf = new PropertiesConfigurationFactory<>(conf);
pcf.setProperties(props);
conf = pcf.getObject();
assertThat(conf.getDefinitions()).hasSize(2); //Definitions is coming in as 0 instead of the expected 2
DeviceConfig looks like this:
@ConfigurationProperties(prefix="devices")
public class DeviceConfig {
private List<String> imports = new ArrayList<>();
private List<DeviceDetailsProperties> definitions = new ArrayList<>();
public List<String> getImports() {
return this.imports;
}
public List<DeviceDetailsProperties> getDefinitions() {
return definitions;
}
public void setImports(List<String> imports) {
this.imports = imports;
}
public void setDefinitions(List<DeviceDetailsProperties> definitions) {
this.definitions = definitions;
}
}
DeviceDetailsProperties just has an id field with getters/setters.
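For reference, a minimal sketch of DeviceDetailsProperties as described (just an id field with accessors):
public class DeviceDetailsProperties {

    private String id;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }
}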
Strangely, neither the definitions (objects) nor the imports (Strings) are getting populated.
I am using Spring Boot 1.2.0.RELEASE.
When using the PropertiesConfigurationFactory in a manual way like this, it won't automatically use the prefix value in the annotation.
Add a targetName like so:
pcf.setTargetName("devices");
The corrected code would be:
Properties props = new Properties();
props.put("devices.imports[0]","imp1");
props.put("devices.imports[1]","imp2");
props.put("devices.definitions[0].id","first");
props.put("devices.definitions[1].id", "second");
DeviceConfig conf = new DeviceConfig();
PropertiesConfigurationFactory<DeviceConfig> pcf = new PropertiesConfigurationFactory<>(conf);
pcf.setProperties(props);
pcf.setTargetName("devices"); // <--- Add this line
conf = pcf.getObject();
assertThat(conf.getDefinitions()).hasSize(2);

How to bind input externally to XQuery using Saxon?

I have to invoke external Java methods in XQuery using Saxon HE. I was able to invoke the methods with the code below, but the problem is that I want to bind my input externally.
final Configuration config = new Configuration();
config.registerExtensionFunction(new ShiftLeft());
final StaticQueryContext sqc = new StaticQueryContext(config);
final XQueryExpression exp = sqc.compileQuery(new FileReader(
"input/names.xq"));
final DynamicQueryContext dynamicContext = new DynamicQueryContext(config);
String xml = "<student_list><student><name>George Washington</name><major>Politics</major><phone>312-123-4567</phone><email>gw#example.edu</email></student><student><name>Janet Jones</name><major>Undeclared</major><phone>311-122-2233</phone><email>janetj#example.edu</email></student><student><name>Joe Taylor</name><major>Engineering</major><phone>211-111-2333</phone><email>joe#example.edu</email></student></student_list>";
DocumentBuilderFactory newInstance = DocumentBuilderFactory.newInstance();
newInstance.setNamespaceAware(true);
Document parse = newInstance.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
DocumentWrapper sequence = new DocumentWrapper(parse, "", config);
StructuredQName qname = new StructuredQName("", "", "student_list");
dynamicContext.setParameter(qname, sequence);
Properties props = new Properties();
final SequenceIterator iter = exp.iterator(dynamicContext);
props.setProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
props.setProperty(OutputKeys.INDENT, "yes");
StringWriter writer = new StringWriter();
QueryResult.serializeSequence(iter, config, writer, props);
System.out.println("Result is " + writer);
names.xq
declare namespace eg="http://example.com/saxon-extension";
declare namespace xs = "http://www.w3.org/2001/XMLSchema";
declare variable $student_list as element(*) external;
<Students>
<value> {
let $n := eg:shift-left(2, 2)
return $n
}</value>
<student_names>
{ $student_list//student_list/student/name }
</student_names>
</Students>
But I get the following error:
Error at procedure student_list on line 3 of students.xml:
XPTY0004: Required item type of value of variable $student_list is element(); supplied value has item type document-node(element(Q{}student_list))
net.sf.saxon.trans.XPathException: Required item type of value of variable $student_list is element(); supplied value has item type document-node(element(Q{}student_list))
at net.sf.saxon.expr.ItemTypeCheckingFunction.testConformance(ItemTypeCheckingFunction.java:69)
at net.sf.saxon.expr.ItemTypeCheckingFunction.mapItem(ItemTypeCheckingFunction.java:50)
at net.sf.saxon.expr.ItemMappingIterator.next(ItemMappingIterator.java:95)
at net.sf.saxon.expr.CardinalityCheckingIterator.<init>(CardinalityCheckingIterator.java:52)
at net.sf.saxon.type.TypeHierarchy.applyFunctionConversionRules(TypeHierarchy.java:230)
at net.sf.saxon.expr.instruct.GlobalParameterSet.convertParameterValue(GlobalParameterSet.java:105)
at net.sf.saxon.expr.instruct.Bindery.useGlobalParameter(Bindery.java:136)
at net.sf.saxon.expr.instruct.GlobalParam.evaluateVariable(GlobalParam.java:62)
at net.sf.saxon.expr.GlobalVariableReference.evaluateVariable(GlobalVariableReference.java:105)
at net.sf.saxon.expr.VariableReference.evaluateItem(VariableReference.java:460)
at net.sf.saxon.expr.Atomizer.evaluateItem(Atomizer.java:313)
at net.sf.saxon.expr.Atomizer.evaluateItem(Atomizer.java:35)
at net.sf.saxon.expr.AtomicSequenceConverter.evaluateItem(AtomicSequenceConverter.java:275)
at net.sf.saxon.expr.AtomicSequenceConverter.evaluateItem(AtomicSequenceConverter.java:30)
at net.sf.saxon.functions.Doc.doc(Doc.java:235)
at net.sf.saxon.functions.Doc.evaluateItem(Doc.java:190)
at net.sf.saxon.functions.Doc.evaluateItem(Doc.java:28)
at net.sf.saxon.expr.SimpleStepExpression.iterate(SimpleStepExpression.java:85)
at net.sf.saxon.expr.SlashExpression.iterate(SlashExpression.java:842)
at net.sf.saxon.expr.sort.DocumentSorter.iterate(DocumentSorter.java:168)
at net.sf.saxon.expr.SlashExpression.iterate(SlashExpression.java:842)
at net.sf.saxon.expr.sort.DocumentSorter.iterate(DocumentSorter.java:168)
at net.sf.saxon.expr.Expression.process(Expression.java:552)
at net.sf.saxon.expr.instruct.ElementCreator.processLeavingTail(ElementCreator.java:450)
at net.sf.saxon.expr.instruct.ElementCreator.processLeavingTail(ElementCreator.java:389)
at net.sf.saxon.expr.instruct.Block.processLeavingTail(Block.java:669)
at net.sf.saxon.expr.instruct.Instruction.process(Instruction.java:144)
at net.sf.saxon.expr.instruct.ElementCreator.constructElement(ElementCreator.java:539)
at net.sf.saxon.expr.instruct.ElementCreator.evaluateItem(ElementCreator.java:476)
at net.sf.saxon.expr.instruct.Instruction.iterate(Instruction.java:363)
at net.sf.saxon.query.XQueryExpression.iterator(XQueryExpression.java:332)
at com.example.saxon.ExternalMethodCaller.main(ExternalMethodCaller.java:77)
Thanks in advance.
Unless you have a very good reason not to, my advice is to use Snappi (the Saxon 9 API, or s9api):
Processor saxon = new Processor(false);
saxon.registerExtensionFunction(new MyExtension());
XQueryCompiler compiler = saxon.newXQueryCompiler();
XQueryExecutable exec = compiler.compile(new File("input/names.xq"));
XQueryEvaluator query = exec.load();
DocumentBuilder builder = saxon.newDocumentBuilder();
String students = "<xml>...</xml>";
Source src = new StreamSource(new StringReader(students));
XdmNode doc = builder.build(src);
query.setExternalVariable(new QName("student_list"), doc);
XdmValue result = query.evaluate();
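If you also want to print the result (not shown above), s9api's Serializer can do it. A minimal sketch, reusing the saxon Processor and the result from the snippet above:
// Sketch: serialize the XdmValue produced by query.evaluate() to standard output.
Serializer out = saxon.newSerializer(System.out);
out.setOutputProperty(Serializer.Property.OMIT_XML_DECLARATION, "yes");
out.setOutputProperty(Serializer.Property.INDENT, "yes");
out.serializeXdmValue(result);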
With MyExtension looking something like the following:
public class MyExtension
implements ExtensionFunction
{
@Override
public QName getName()
{
return new QName("http://example.org/my-project", "my-fun");
}
@Override
public SequenceType getResultType()
{
return SequenceType.makeSequenceType(
ItemType.INTEGER, OccurrenceIndicator.ONE);
}
@Override
public SequenceType[] getArgumentTypes()
{
return new SequenceType[] {
SequenceType.makeSequenceType(
ItemType.INTEGER, OccurrenceIndicator.ONE),
SequenceType.makeSequenceType(
ItemType.INTEGER, OccurrenceIndicator.ONE)
};
}
@Override
public XdmValue call(XdmValue[] args) throws SaxonApiException
{
long first = ((XdmAtomicValue)args[0].itemAt(0)).getLongValue();
long second = ((XdmAtomicValue)args[1].itemAt(0)).getLongValue();
long result = ...;
return new XdmAtomicValue(result);
}
}
See the documentation at http://www.saxonica.com/documentation9.5/extensibility/integratedfunctions/ext-simple-J.html for details.
EXPath also has a project called tools-saxon, containing several tools for using Saxon in Java, including extension functions. It introduces the concept of a function library, which is convenient if you have several extension functions. It also introduces a function definition builder, allowing one to build a function definition with as little boilerplate code as possible (and providing convenient shortcuts for sequence types). In the above code, replace the function registration (the first two lines) with:
Processor saxon = new Processor(false);
Library lib = new MyLibrary();
lib.register(saxon.getUnderlyingConfiguration());
and replace the extension class with the following two classes (a library and a function, respectively):
public class MyLibrary
extends Library
{
public MyLibrary()
{
super("http://example.org/my-project", "my");
}
@Override
protected Function[] functions()
{
return new Function[] {
new MyFunction(this)
};
}
}
public class MyFunction
extends Function
{
public MyFunction(Library lib)
{
super(lib);
}
@Override
protected Definition makeDefinition()
{
return library()
.function(this, "my-fun")
.returns(Types.SINGLE_INTEGER)
.param(Types.SINGLE_INTEGER, "first")
.param(Types.SINGLE_INTEGER, "second")
.make();
}
@Override
public Sequence call(XPathContext ctxt, Sequence[] args)
throws XPathException
{
Parameters params = checkParams(args);
long first = params.asLong(0, true);
long second = params.asLong(1, true);
long result = 0;
return Return.value(result);
}
}
See all information on the project home on GitHub, at https://github.com/expath/tools-saxon.
Note: not tested.

How to use SchemaExportTool with JPA and Hibernate 4.3

In Hibernate 4.3 the Ejb3Configuration class was removed. This class was commonly used for creating a Hibernate Configuration object from a persistence unit (a persistence.xml file) to feed the SchemaExport tool.
As a simple alternative to export the schema to an .sql file, I'm using the following code:
public static void export(String persistenceUnit, String exportFileName) {
Map<String, String> hash = new HashMap<String, String>();
hash.put("hibernate.hbm2ddl.auto", "create-drop");
EntityManagerFactory factory = Persistence.createEntityManagerFactory(
persistenceUnit, hash);
org.hibernate.jpa.internal.EntityManagerFactoryImpl hibFactory = (org.hibernate.jpa.internal.EntityManagerFactoryImpl) factory;
SessionFactoryImpl hibSessionFactory = hibFactory.getSessionFactory();
SchemaExport schema = ReflectUtils.getPrivateFieldValue(
hibSessionFactory, "schemaExport");
schema.setOutputFile(exportFileName);
schema.setFormat(false);
schema.setDelimiter(";");
schema.drop(true, false);
schema.create(true, false);
}
In this piece of code, I'm basically using the SchemaExport object created by Hibernate's SessionFactoryImpl. The drawback is that every time it's executed, the database schema is recreated. Is there any other simple way to use the SchemaExport tool with Hibernate 4.3 and JPA? It seems that the real question is how to create a Hibernate Configuration object from a persistence unit.
I ran into the same problem. I ended up using Hibernate's internal PersistenceXmlParser to access the information in the persistence.xml file and creating the Configuration object manually:
public static void main(String[] args) {
PersistenceXmlParser parser = new PersistenceXmlParser(new ClassLoaderServiceImpl(), PersistenceUnitTransactionType.RESOURCE_LOCAL);
List<ParsedPersistenceXmlDescriptor> allDescriptors = parser.doResolve(new HashMap<>());
for (ParsedPersistenceXmlDescriptor descriptor : allDescriptors) {
Configuration cfg = new Configuration();
cfg.setProperty("hibernate.hbm2ddl.auto", "create");
cfg.setProperty("hibernate.dialect", "org.hibernate.dialect.Oracle10gDialect");
cfg.setProperty("hibernate.id.new_generator_mappings", "true");
List<String> managedClassNames = descriptor.getManagedClassNames();
for (String className : managedClassNames) {
try {
cfg.addAnnotatedClass(Class.forName(className));
} catch (ClassNotFoundException e) {
System.out.println("Class not found: " + className);
}
}
SchemaExport export = new SchemaExport(cfg);
export.setDelimiter(";");
export.setOutputFile("C:\\dev\\" + descriptor.getName() + "_create_schema.sql");
export.setFormat(true);
export.execute(true, false, false, false);
}
}
