I am using Hazelcast for clustered data distribution. I read in the documentation about data persistence, using the interfaces MapStore and MapLoader. I need to implement these interfaces and write the class name in the hazelcast.xml file.
Is there any example of implementation of these interfaces for file persistence with hazelcast? Does anyone know about any source code or jar file that I can download and work with?
Thanks
You can implement your own using just ObjectOutputStream and ObjectInputStream.
You can create a directory with the map's name.
The store(key, value) operation creates a file named key.dat containing the serialized value.
The load(key) method reads the "key.dat" file back into an object and returns it.
Here are usage examples of ObjectOutputStream and ObjectInputStream:
http://www.mkyong.com/java/how-to-write-an-object-to-file-in-java/
http://www.mkyong.com/java/how-to-read-an-object-from-file-in-java/
Then you should add this implementation class to your classpath and set it in your hazelcast.xml.
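A minimal sketch of such a file-backed store, assuming String keys and Serializable values (the class and method names follow the MapStore/MapLoader contract, but the `implements MapStore<String, Serializable>` clause is left off here so the sketch compiles without Hazelcast on the classpath; in a real project you would add it back):

```java
import java.io.*;

// Sketch of a file-per-entry store: one <key>.dat file per map entry,
// kept in a directory named after the map. In a real project this class
// would declare "implements MapStore<String, Serializable>".
class FileMapStore {
    private final File dir;

    public FileMapStore(String mapName) {
        this.dir = new File(mapName);
        dir.mkdirs(); // one directory per map
    }

    public void store(String key, Serializable value) {
        try (ObjectOutputStream out = new ObjectOutputStream(
                new FileOutputStream(new File(dir, key + ".dat")))) {
            out.writeObject(value); // serialize the value into <key>.dat
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public Object load(String key) {
        File f = new File(dir, key + ".dat");
        if (!f.exists()) return null; // MapLoader contract: null if absent
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(f))) {
            return in.readObject(); // deserialize <key>.dat back into an object
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public void delete(String key) {
        new File(dir, key + ".dat").delete();
    }
}
```

The remaining MapStore methods (storeAll, loadAll, loadAllKeys, deleteAll) can be built by looping over these three and listing the `.dat` files in the directory.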
I'm trying to create a custom annotation processor that generates code at compile time (as hibernate-jpamodelgen does). I've looked on the web, and I find custom annotation processors that work with Maven, but they do nothing when added to the Annotation Processing > Factory Path option. How can I create a processor compatible with this setup? I have not found a tutorial that works.
My idea is to, for example, annotate an entity to generate automatically a base DTO, a base mapper, etc that can be extended to use in the final code.
Thank you all
OK, I already found the problem. The tutorial I had found didn't specify that, for the compiler to be able to apply the annotation processor, there must be a META-INF/services/javax.annotation.processing.Processor file that contains the fully qualified class name of the processor (or processors).
I created the file pointing to my processor class, generated the jar and added it to Annotation Processing > Factory Path and all worked correctly.
Just be careful to keep the processors in the correct order (for example, the Hibernate model generator claims the classes, so no further generation happens after it), and to change the jar file name each time you replace the library (Eclipse seems to keep a cache). These two things gave me a good headache.
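For reference, here is a minimal processor skeleton; the annotation name com.example.GenerateDto is made up for the example. The class must be listed in META-INF/services/javax.annotation.processing.Processor inside the jar for the compiler to pick it up:

```java
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;

// Registered via a META-INF/services/javax.annotation.processing.Processor
// file whose single line is the fully qualified name of this class.
@SupportedAnnotationTypes("com.example.GenerateDto") // hypothetical annotation
@SupportedSourceVersion(SourceVersion.RELEASE_8)
class DtoProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations,
                           RoundEnvironment roundEnv) {
        // Generate sources here via processingEnv.getFiler().createSourceFile(...)
        // Returning false means we do NOT claim the annotations, so processors
        // that run after this one (ordering matters, as noted above) still see them.
        return false;
    }
}
```

Returning false from process() is what keeps later processors in the chain working; a processor that returns true claims the annotations, as the Hibernate generator does.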
Thanks all
Can BeanIO support stream formats other than csv, fixedLength, delimited, json and xml? I have created a new module from BeanIO to add a new format, but the beanio.properties file used by StreamCompiler to map the format resides in the BeanIO parent project. How can I add a new format to it?
I don't know if you can extend BeanIO this way, but it would be great if this works for you.
See Section 8 of the reference documentation on how to provide your custom beanio.properties file
8.0. Configuration
In some cases, BeanIO behavior can be controlled by setting optional property values. Properties can be set using System properties or a property file. BeanIO will load configuration settings in the following order of priority:
System properties.
A property file named beanio.properties. The file will be looked for first in the application's working directory, and then on the classpath.
The name and location of beanio.properties can be overridden using the System property org.beanio.configuration. In the following example, configuration settings will be loaded from the file named config/settings.properties, first relative to the application's working directory, and if not found, then from the root of the application's classpath.
java -Dorg.beanio.configuration=config/settings.properties example.Main
Please let us know if you can extend the formats supported this way.
How can an attribute be set programmatically in a reference.conf file?
For example, I am using something like this in Spring to set the attributes of the keystore:
System.setProperty("server.ssl.keyStore", "keystore.jks")
System.setProperty("server.ssl.keyStorePassword", "password123")
In the same way, you can override configuration from the reference.conf file using system properties; they have the highest precedence, as described here:
https://github.com/lightbend/config#standard-behavior
Please be aware that you need to do this before the config is loaded by the class that uses it (via ConfigFactory.load()), and if any other class has already used ConfigFactory, a call to ConfigFactory.invalidateCaches() will also be required; otherwise the cached value will be used.
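Putting the pieces together, a sketch of the ordering (assuming the com.typesafe:config artifact is on the classpath; the key names are taken from the question):

```java
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class SslConfigOverride {
    public static Config load() {
        // Overrides must be in place BEFORE the config is (re)loaded.
        System.setProperty("server.ssl.keyStore", "keystore.jks");
        System.setProperty("server.ssl.keyStorePassword", "password123");
        // If ConfigFactory.load() has already run anywhere in the JVM,
        // drop the cached config so the new system properties take effect.
        ConfigFactory.invalidateCaches();
        return ConfigFactory.load(); // system props now win over reference.conf
    }
}
```

Alternatively, the same keys can be overridden without any code by passing `-Dserver.ssl.keyStore=keystore.jks` on the java command line, which is in place before any class loads the config.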
My question has two parts:
1) How can I create and/or modify and then store EMF ecore files (ecore metamodels with .ecore suffixes) from my scala/java code?
2) How can I create and/or modify an instance of an ecore file (i.e. a model conforming to an ecore metamodel) from my scala/java code?
I am looking to see if there are possible ways of doing these, other than manipulating their corresponding XML files directly using XML APIs.
A code snippet or a reference to one would be very much appreciated.
P.S. As a background thought, I am wondering if I can use a single API for performing both of the above tasks, since we can view an ecore file as a model/instance of Ecore.ecore.
Basic Concepts (Resource, ResourceSet, Resource Factory and Registry):
Before answering this question I will explain some concepts in the Ecore API. The first two concepts are Resource and ResourceSet. A Resource is a program-level representation of a persistent resource (like an ecore file), and a ResourceSet is simply a set of these kinds of resources. Each ecore metamodel document, as well as each model document (which conforms to its metamodel), is a resource. Therefore the first step in working with these files is to provide a program-level representation of them as resources in a resourceSet.
Another two concepts are Resource Factory and Registry. Factory classes are used to generate resources, and registries keep track of the list of these factories in a resourceSet. Depending on how our resources are stored, we can use different factories to manipulate them. For example, EcoreResourceFactoryImpl, XMLResourceFactoryImpl, and XMIResourceFactoryImpl are factory implementations that can be used to handle, respectively, ecore, xml, and xmi files. If we want to use these factories to handle resources in a resourceSet, we need to put them in the registry of that resourceSet first. So each resourceSet mentioned above has its own registry.
With all the above being said, let's see how loading and modifying an ecore file (metamodel) and an instance file (model) happens in a java code.
First, we need to create a resourceSet to represent the persistent resources we would like to work with:
ResourceSet resourceSet = new ResourceSetImpl();
Then in the registry of this resourceSet, we need to register the Factories we want to work with:
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("ecore", new EcoreResourceFactoryImpl());
resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap().put("xmi", new XMIResourceFactoryImpl());
The above two lines of code simply register EcoreResourceFactoryImpl and XMIResourceFactoryImpl as, respectively, the ecore and xmi file factories (note that "ecore" and "xmi" are file extensions here). I assume that my metamodel file extension is .ecore and my model file extension is .xmi.
After registering these Factories, we can now ask our ResourceSet to load our metamodel (i.e., ecore) file as below:
Resource myMetaModel = resourceSet.getResource(URI.createFileURI("./univ.ecore"), true);
univ.ecore is the name of my ecore file.
For loading a model file we need to take one further step: registering the package of our ecore metamodel in the package registry of our resourceSet. To do this, we first get a program-level representation of our ecore package as below:
EPackage univEPackage = (EPackage) myMetaModel.getContents().get(0);
And then we register this package in the package registry of our resourceSet as below:
resourceSet.getPackageRegistry().put("http://gholizadeh.org", univEPackage);
We are now ready to load our model (xmi file). We use the following code for this:
Resource myModel = resourceSet.getResource(URI.createURI("./univModel.xmi"), true);
Now we have brought both of our metamodel and model files to the programming level, and we can simply manipulate them in code.
Change the Metamodel:
For example, to create a new class in an ecore file, we use the EcoreFactory API. We first obtain an instance of this factory as below:
EcoreFactory theCoreFactory = EcoreFactory.eINSTANCE;
and then create an EClass as the following:
EClass adultEClass = theCoreFactory.createEClass();
Then, to keep this class, we need to add it to the classifiers of our loaded ecore package as below:
univEPackage.getEClassifiers().add(adultEClass);
For additional changes you need to get more familiar with the Ecore API.
Change the Model:
For changing a model, we need to create objects of type EObject. As with EcoreFactory above, we need a factory to do this, but instead of EcoreFactory we need an object factory. For each ecore package there is a specific object factory of type EFactory, which we can get as follows:
EFactory univInstance = univEPackage.getEFactoryInstance();
Note that univEPackage in the above code represents our ecore package (see a few paragraphs above). After doing this, we are ready to create objects for our model. For example,
EObject adultObject = univInstance.create(adultEClass);
creates an object of type adultEClass in our model. Note that to persist this newly created object we need to add it to the contents of the resource that represents our model (i.e., myModel). Since our persistent file is in XMI format and has only one root, we put all of our objects in a list and add this list to our resource:
EList<EObject> modelObjects = new BasicEList<EObject>();
modelObjects.add(adultObject);
myModel.getContents().addAll(modelObjects);
Storing Model and Metamodel files:
Finally, after we have modified our metamodel and model elements, we need to store them again in their corresponding files. This is done simply by calling the save method of the corresponding Resources:
myModel.save(null);
myMetaModel.save(null);
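For convenience, here are the steps above combined into one sketch (assuming the EMF ecore, common, and xmi jars are on the classpath; the file names, nsURI, and class name "Adult" are the ones used in this answer):

```java
import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.EClass;
import org.eclipse.emf.ecore.EFactory;
import org.eclipse.emf.ecore.EObject;
import org.eclipse.emf.ecore.EPackage;
import org.eclipse.emf.ecore.EcoreFactory;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.resource.impl.ResourceSetImpl;
import org.eclipse.emf.ecore.xmi.impl.EcoreResourceFactoryImpl;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

public class EcoreRoundTrip {
    public static void main(String[] args) throws Exception {
        // 1. A resourceSet with factories registered for .ecore and .xmi files
        ResourceSet resourceSet = new ResourceSetImpl();
        resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
                   .put("ecore", new EcoreResourceFactoryImpl());
        resourceSet.getResourceFactoryRegistry().getExtensionToFactoryMap()
                   .put("xmi", new XMIResourceFactoryImpl());

        // 2. Load the metamodel and register its package under its nsURI
        Resource myMetaModel = resourceSet.getResource(URI.createFileURI("./univ.ecore"), true);
        EPackage univEPackage = (EPackage) myMetaModel.getContents().get(0);
        resourceSet.getPackageRegistry().put("http://gholizadeh.org", univEPackage);

        // 3. Load the model that conforms to the metamodel
        Resource myModel = resourceSet.getResource(URI.createURI("./univModel.xmi"), true);

        // 4. Change the metamodel: add a new EClass
        EClass adultEClass = EcoreFactory.eINSTANCE.createEClass();
        adultEClass.setName("Adult");
        univEPackage.getEClassifiers().add(adultEClass);

        // 5. Change the model: create an instance of the new class
        EFactory univInstance = univEPackage.getEFactoryInstance();
        EObject adultObject = univInstance.create(adultEClass);
        myModel.getContents().add(adultObject);

        // 6. Persist both resources back to their files
        myMetaModel.save(null);
        myModel.save(null);
    }
}
```

The same resourceSet and registries serve both the metamodel and the model, which answers the P.S. in the question: one API covers both tasks.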
Is it possible in Resteasy to extract the URI mapping to an external, dedicated file?
Annotating classes and methods is quick and easy but I would like to have a file that maps the URIs to functions. Something like:
/teams/{team}/player/{player-id} TeamResource.fetchPlayer
As far as I know this is not currently supported as part of the JAX-RS specification, but I could see you being able to do this with bytecode manipulation at runtime using something like Javassist.
Basically you would add the @Path annotations to your resource classes at runtime, with the values loaded from your URI mapping file. Once the annotations are added to the resources, you would then register them with Resteasy.
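A rough sketch of the Javassist side (the class name, the parsing of the mapping file, and the Resteasy registration step are all left as assumptions; this only shows attaching a class-level @Path at runtime, and a method-level mapping such as TeamResource.fetchPlayer would annotate the method's MethodInfo in the same way):

```java
import javassist.ClassPool;
import javassist.CtClass;
import javassist.bytecode.AnnotationsAttribute;
import javassist.bytecode.ClassFile;
import javassist.bytecode.ConstPool;
import javassist.bytecode.annotation.Annotation;
import javassist.bytecode.annotation.StringMemberValue;

public class PathWeaver {
    // Hypothetical helper: attach @Path(uri) to a resource class at runtime.
    public static Class<?> addPath(String className, String uri) throws Exception {
        CtClass cc = ClassPool.getDefault().get(className);
        ClassFile cf = cc.getClassFile();
        ConstPool cp = cf.getConstPool();

        // Build a runtime-visible @javax.ws.rs.Path("<uri>") annotation
        AnnotationsAttribute attr =
                new AnnotationsAttribute(cp, AnnotationsAttribute.visibleTag);
        Annotation path = new Annotation("javax.ws.rs.Path", cp);
        path.addMemberValue("value", new StringMemberValue(uri, cp));
        attr.addAnnotation(path);
        cf.addAttribute(attr);

        return cc.toClass(); // load the modified class, then register it with Resteasy
    }
}
```

Usage would be something like `PathWeaver.addPath("com.example.TeamResource", "/teams/{team}/player/{player-id}")` for each line of the mapping file, before the Resteasy deployment starts.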