Use one Spring Boot application inside another - java

I want to integrate my Spring Boot project into another one.
To do this, I export the .jar and put it in the libraries of the other project, which is also a Spring Boot project.
My .jar is:
https://drive.google.com/file/d/0B96L3Vd9zNeoQzhhcmFjT05vRWc/view?usp=sharing
And my main class in the other project is:
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@SpringBootApplication
@EnableJpaRepositories
public class UpsysmarocApplicationTestlogApplication {

    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(UpsysmarocApplicationTestlogApplication.class, args);

        // TraceabilityLogService and TraceabilityLog come from the imported module.
        TraceabilityLogService traceabilityLogService = context.getBean(TraceabilityLogService.class);

        List<Map<String, String>> items = new ArrayList<>();
        Map<String, String> item = new HashMap<>();
        item.put("element", "Nom");
        item.put("oldValue", "Mkharbach2");
        item.put("newValue", "Mounji2");
        items.add(item);

        item = new HashMap<>();
        item.put("element", "Prenom");
        item.put("oldValue", "Ayoub2");
        item.put("newValue", "Said2");
        items.add(item);

        List<Map<String, String>> connections = new ArrayList<>();
        Map<String, String> connection = new HashMap<>();
        connection.put("className", "User");
        connection.put("originId", "3");
        connections.add(connection);

        TraceabilityLog traceabilityLog = traceabilityLogService.save("Eladlani2", "CREATION", items, connections);
        System.out.println("RETURN => " + traceabilityLog.getId());
    }
}
But I want another way that does not require instantiating the context manually, and instead just uses the functionality provided by our module.
I'm still looking for the best method that works well, and thanks in advance.
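A common way to avoid calling context.getBean(...) manually is to let Spring inject the service (a minimal sketch, not from the original post; it assumes TraceabilityLogService is a Spring bean contributed by the imported module and that this runner lives in a package covered by component scanning):

import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class TraceabilityDemoRunner implements CommandLineRunner {

    private final TraceabilityLogService traceabilityLogService;

    // Constructor injection: Spring supplies the bean, no getBean() needed
    // (on Spring 4.3+, a single constructor needs no @Autowired).
    public TraceabilityDemoRunner(TraceabilityLogService traceabilityLogService) {
        this.traceabilityLogService = traceabilityLogService;
    }

    @Override
    public void run(String... args) {
        // The service can be used here exactly as in main(), without
        // touching the ApplicationContext directly.
        System.out.println("TraceabilityLogService ready: " + traceabilityLogService);
    }
}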

To solve the problem, I added the project as a Maven dependency.
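For example, a minimal sketch of the dependency entry in the consuming project's pom.xml (the groupId, artifactId, and version below are hypothetical placeholders for the exported module):

<dependency>
    <!-- hypothetical coordinates: use the ones from the exported project's pom.xml -->
    <groupId>com.example</groupId>
    <artifactId>traceability-log</artifactId>
    <version>1.0.0</version>
</dependency>

One caveat: a jar repackaged by the spring-boot-maven-plugin is laid out as an executable archive rather than a plain library jar, so it cannot be consumed as a normal dependency; a commonly documented workaround is to configure the plugin with a classifier so that the plain jar remains the main artifact in the repository.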

Related

Flink 1.4.2 SQL Maps?

I am currently using Flink 1.4.2.
If I have a POJO:
class CustomObj {
    public Map<String, String> custTable = new HashMap<>();

    public Map<String, String> getcustTable() { return custTable; }

    public void setcustTable(Map<String, String> custTable) {
        this.custTable = custTable;
    }
}
I have a DataStream<CustomObj> ds = ... // from some Kafka source
Now I do tableEnv.registerDataStream("tableName", ds);
And I want to run:
tableEnv.sqlQuery("SELECT * FROM tableName WHERE custTable['key'] = 'val'");
When I try running this I get the error:
org.apache.flink.table.api.TableException: Type is not supported: ANY
What can I do about this and how can I fix it?

Hazelcast cache implementation using Apache Camel blueprint

I am trying to implement Hazelcast caching with a Camel blueprint, but I haven't been able to get it working. I am able to create a Hazelcast instance through Java code (not through the Hazelcast XML config file). The instance is created, but the cache loader class is not called during instance creation (even though the initial load mode is EAGER). Some code snippets are attached.
Let me know if anyone has come across this.
Code:
Hazelcast config
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import com.hazelcast.config.Config;
import com.hazelcast.config.EvictionPolicy;
import com.hazelcast.config.GroupConfig;
import com.hazelcast.config.JoinConfig;
import com.hazelcast.config.MapConfig;
import com.hazelcast.config.MapStoreConfig;
import com.hazelcast.config.MapStoreConfig.InitialLoadMode;
import com.hazelcast.config.MulticastConfig;
import com.hazelcast.config.NetworkConfig;
import com.hazelcast.config.TcpIpConfig;

public class ConfigHack extends Config {

    public ConfigHack(String instanceName) {
        super(instanceName);
        System.out.println("Going to create Hazelcast instance ................" + instanceName);

        TcpIpConfig tcpIpConfig = new TcpIpConfig();
        List<String> membersList = new ArrayList<String>();
        membersList.add("localhost");
        tcpIpConfig.setMembers(membersList);

        MulticastConfig multicastConfig = new MulticastConfig();
        multicastConfig.setEnabled(true);

        JoinConfig join = new JoinConfig();
        join.setTcpIpConfig(tcpIpConfig);
        join.setMulticastConfig(multicastConfig);

        NetworkConfig networkConfig = new NetworkConfig();
        networkConfig.setPort(5701);
        networkConfig.setPortAutoIncrement(true);
        networkConfig.setJoin(join);

        GroupConfig groupConfig = new GroupConfig();
        groupConfig.setName("devuser");
        groupConfig.setPassword("devpassword");

        MapStoreConfig mapStoreConfig = new MapStoreConfig();
        // Absolute path in class name field below
        mapStoreConfig.setClassName("VehicleCacheLoader");
        mapStoreConfig.setImplementation(new VehicleCacheLoader());
        mapStoreConfig.setEnabled(true);
        mapStoreConfig.setInitialLoadMode(InitialLoadMode.EAGER);
        mapStoreConfig.setWriteDelaySeconds(500);

        MapConfig mapConfig = new MapConfig();
        mapConfig.setName("vehicleMap");
        mapConfig.setBackupCount(2);
        mapConfig.setMaxIdleSeconds(1000000);
        mapConfig.setEvictionPercentage(30);
        mapConfig.setEvictionPolicy(EvictionPolicy.LFU);
        mapConfig.setMapStoreConfig(mapStoreConfig);

        Map<String, MapConfig> mapConfigs = new HashMap<String, MapConfig>();
        mapConfigs.put("vehicleMap", mapConfig);
        // config.setMapConfigs(mapConfigs);

        addMapConfig(mapConfig);
        setGroupConfig(groupConfig);
        setNetworkConfig(networkConfig);
    }
}
Cache loader class:
import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import com.hazelcast.core.MapLoader;

public class VehicleCacheLoader implements MapLoader<String, VehicleVO> {

    @Override
    public VehicleVO load(String paramK) {
        System.out.println("Calling load method for key " + paramK);
        VehicleVO vehicleVO = new VehicleVO();
        vehicleVO.setCustId("XXX");
        vehicleVO.setVehicleHeader("XXX");
        vehicleVO.setVehicleInitial("001");
        vehicleVO.setVehicleNumber("1234");
        vehicleVO.setVehicleObjId(paramK);
        return vehicleVO;
    }

    @Override
    public Map<String, VehicleVO> loadAll(Collection<String> paramCollection) {
        System.out.println("Calling loadAll() for keys " + paramCollection);
        VehicleVO vehicleVO = null;
        Map<String, VehicleVO> vehicleDataMap = new HashMap<String, VehicleVO>();
        for (String paramKey : paramCollection) {
            System.out.println("Calling ...." + paramKey);
            vehicleVO = new VehicleVO();
            vehicleVO.setCustId("XXX");
            vehicleVO.setVehicleHeader("XXX");
            vehicleVO.setVehicleInitial("001");
            vehicleVO.setVehicleNumber("1234");
            vehicleVO.setVehicleObjId(paramKey);
            vehicleDataMap.put(paramKey, vehicleVO);
        }
        return vehicleDataMap;
    }

    @Override
    public Set<String> loadAllKeys() {
        System.out.println("Calling loadAllKeys()");
        Set<String> vehicleKeys = new HashSet<String>();
        vehicleKeys.add("XXX001");
        vehicleKeys.add("XXX002");
        vehicleKeys.add("XXX003");
        vehicleKeys.add("XXX004");
        return vehicleKeys;
    }
}
Blueprint config:
-----------------
<bean id="hazelcastInstance" class="com.hazelcast.core.Hazelcast"
factory-method="newHazelcastInstance" destroy-method="shutdown">
<argument ref="hazelcastConfig"/>
</bean>
<bean id="hazelcastConfig" class="xx.yy.zz.ss.tt.cache.ConfigHack">
<argument value="TestInstance" />
</bean>
The line
mapConfigs.put("vehicleMap", mapConfig);
defines the configuration that will be used for maps with names matching "vehicleMap".
In order to create such a map you need to run an operation against it, such as:
hazelcastInstance.getMap("vehicleMap");
The distinction would be clearer if the configuration were:
mapConfigs.put("vehicleMap*", mapConfig);
This would be used when you create a map named "vehicleMap1" or "vehicleMap123".
The MapConfig only defines the configuration that will be applied if needed, and it is not needed until you first access the map, which is when the map is actually created.
"EAGER" here refers to how the map loader is run, not to how the map is created.

How to pass a HashMap to the forEach tag in an xls generated by JETT?

I have a Map in a managed bean:
private Map<FaseProducao, Set<FichaTecnicaOperacao>> fichasTecnicasOperacaoResumo;
that references the entity FichaTecnica:
public class FichaTecnica {
    //...
    private Set<FichaTecnicaOperacao> operacoes;
}
which I need to pass as a parameter in beans.put() to generate an xls with JETT:
public void createRelatorioFichaTecnica(FichaTecnica fichaTecnica) throws IOException {
    // omitted...
    Map<String, Object> beans = new HashMap<String, Object>();
    beans.put("operacaoResumo", fichasTecnicasOperacaoResumo);
    try (ByteArrayOutputStream saida = new ByteArrayOutputStream();
         InputStream template = this.getClass().getResourceAsStream("/templates/jett/fichaTecnica.xls");
         Workbook workbook = transformer.transform(template, beans)) {
        // omitted...
    }
}
when the xls is generated the exception happens:
WARNING [javax.enterprise.resource.webcontainer.jsf.lifecycle] (default task-28) #{ProdutoManagedBean.createRelatorioFichaTecnica(row)}: net.sf.jett.exception.AttributeExpressionException: Expected a "java.util.Collection" for "items", got a "java.util.HashMap": "${operacaoResumo}".
I don't understand this error: isn't a Map a collection? So why doesn't JETT recognize it in items="${operacaoResumo}"? I created this forEach based on the example on the site:
http://jett.sourceforge.net/tags/forEach.html
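The root cause is that java.util.Map does not implement java.util.Collection; JETT's forEach tag requires a Collection for items, and while a Map itself is not one, its keySet(), values(), and entrySet() views are. A minimal illustration:

import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

public class MapIsNotACollection {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        System.out.println(map instanceof Collection);            // false: Map is its own hierarchy
        System.out.println(map.keySet() instanceof Collection);   // true: Set extends Collection
        System.out.println(map.values() instanceof Collection);   // true
        System.out.println(map.entrySet() instanceof Collection); // true
    }
}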
As @rgettman said, I did:
public void createRelatorioFichaTecnica(FichaTecnica fichaTecnica) throws IOException {
    // omitted...
    Map<String, Object> beans = new HashMap<String, Object>();
    beans.put("operacaoResumo", fichasTecnicasOperacaoResumo.keySet());
}
thank you all!

Creating a MongoDB database user in Java using Spring Data MongoDB

I need to create a MongoDB database user in a Spring Boot application using Spring Data MongoDB. I will be creating this user as part of application startup.
I could not find any reference for doing this with Spring Data MongoDB.
Is it possible using Spring Data MongoDB?
I had the same issue in the past, and I ended up creating the user before the context loads, like this:
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.web.support.SpringBootServletInitializer; // package varies by Boot version
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

import com.mongodb.BasicDBObject;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

@Configuration
@EnableAutoConfiguration
@ComponentScan
public class Application extends SpringBootServletInitializer {

    @SuppressWarnings("resource")
    public static void main(final String[] args) {
        createMongoDbUser();
        ConfigurableApplicationContext context = SpringApplication.run(Application.class, args);
    }

    // Must be static so it can be called from main(); HOST, PORT, DB,
    // USER_NAME and USER_PWD are constants defined elsewhere.
    private static void createMongoDbUser() {
        MongoClient mongo = new MongoClient(HOST, PORT);
        MongoDatabase db = mongo.getDatabase(DB);
        Map<String, Object> commandArguments = new BasicDBObject();
        commandArguments.put("createUser", USER_NAME);
        commandArguments.put("pwd", USER_PWD);
        String[] roles = { "readWrite" };
        commandArguments.put("roles", roles);
        BasicDBObject command = new BasicDBObject(commandArguments);
        db.runCommand(command);
    }
}
Spring Data MongoDB will create the database all by itself if it can't find it, when you declare your mongo DB factory.
For instance, I declare my DB factory in XML using the following:
<mongo:db-factory id="mongofactory" dbname="dbNameHere" mongo-ref="mongo" />
I did not have to create it myself; it was created by Spring Data MongoDB upon firing up my app for the first time.
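For reference, a Java-config equivalent of that XML factory declaration might look like this (a minimal sketch, assuming a Spring Data MongoDB 1.x / MongoDB Java driver 3.x API; "dbNameHere" and the host/port are placeholders):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.SimpleMongoDbFactory;

import com.mongodb.MongoClient;

@Configuration
public class MongoConfig {

    // Equivalent of <mongo:db-factory dbname="dbNameHere" mongo-ref="mongo"/>.
    // The database itself is created lazily on first use, not here.
    @Bean
    public MongoDbFactory mongoDbFactory() {
        return new SimpleMongoDbFactory(new MongoClient("localhost", 27017), "dbNameHere");
    }
}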

How can I read in a list of objects from YAML using Spring's PropertiesConfigurationFactory?

If I have a set of properties, I understand that Spring Boot's relaxed data binder will read in a list of properties (or YAML) and populate the matching object, like so:
Properties props = new Properties();
props.put("devices.imports[0]","imp1");
props.put("devices.imports[1]","imp2");
props.put("devices.definitions[0].id","first");
props.put("devices.definitions[1].id", "second");
DeviceConfig conf = new DeviceConfig();
PropertiesConfigurationFactory<DeviceConfig> pcf = new PropertiesConfigurationFactory<>(conf);
pcf.setProperties(props);
conf = pcf.getObject();
assertThat(conf.getDefinitions()).hasSize(2); //Definitions is coming in as 0 instead of the expected 2
DeviceConfig looks like this:
import java.util.ArrayList;
import java.util.List;

import org.springframework.boot.context.properties.ConfigurationProperties;

@ConfigurationProperties(prefix = "devices")
public class DeviceConfig {

    private List<String> imports = new ArrayList<>();
    private List<DeviceDetailsProperties> definitions = new ArrayList<>();

    public List<String> getImports() {
        return this.imports;
    }

    public List<DeviceDetailsProperties> getDefinitions() {
        return definitions;
    }

    public void setImports(List<String> imports) {
        this.imports = imports;
    }

    public void setDefinitions(List<DeviceDetailsProperties> definitions) {
        this.definitions = definitions;
    }
}
DeviceDetailsProperties just has an id field with getters/setters.
Strangely, neither the definitions (objects) nor the imports (Strings) are getting populated.
Using Spring Boot 1.2.0.RELEASE.
When using PropertiesConfigurationFactory manually like this, it won't automatically use the prefix value from the annotation.
Add a targetName like so:
pcf.setTargetName("devices");
The corrected code would be:
Properties props = new Properties();
props.put("devices.imports[0]","imp1");
props.put("devices.imports[1]","imp2");
props.put("devices.definitions[0].id","first");
props.put("devices.definitions[1].id", "second");
DeviceConfig conf = new DeviceConfig();
PropertiesConfigurationFactory<DeviceConfig> pcf = new PropertiesConfigurationFactory<>(conf);
pcf.setProperties(props);
pcf.setTargetName("devices"); // <--- Add this line
conf = pcf.getObject();
assertThat(conf.getDefinitions()).hasSize(2);
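For reference, the same structure expressed in YAML (which Spring Boot flattens into exactly the indexed devices.* properties used above) would be:

devices:
  imports:
    - imp1
    - imp2
  definitions:
    - id: first
    - id: second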
