Spring AOP @AspectJ: how do I get my aspects to apply to other classes?

I am quite new to Java and Spring. I would like to find out whether it is possible (and if so, how) to get my aspects to apply to more than one class, without having to call the advised method from the class where the aspects currently "work".
This is my main class. Aspects work on any methods I call directly from this class, but not on methods called by other classes (even when those calls are not internal ones):
public class AopMain {

    public static void main(String[] args) {
        String selection = "on";
        ApplicationContext ctx = new ClassPathXmlApplicationContext("spring.xml");
        do {
            try {
                System.out.println("Enter 'length' for a length conversion and 'temperature' for a temperature conversion and 'quit' to quit");
                BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
                selection = br.readLine();
                if (selection.contentEquals("length")) {
                    LengthService lengthService = ctx.getBean("lengthService", LengthService.class);
                    lengthService.runLengthService();
                    lengthService.display();
                }
                else if (selection.contentEquals("temperature")) {
                    TemperatureService temperatureService = new TemperatureService();
                    temperatureService.runTempertureService();
                    temperatureService.display();
                }
            }
            catch (Exception e) {
                System.out.println("Input error");
            }
        } while (!selection.contentEquals("quit"));
    }
}
This is one of the conversion service classes:
public class TemperatureService {

    String fromUnit = null;
    String toUnit = null;
    double val = 0;
    double converted = 0;

    public void runTempertureService() {
        Scanner in = new Scanner(System.in);
        System.out.println("Convert from (enter C, K, F): ");
        fromUnit = in.nextLine();
        System.out.println("Convert to (enter C, K, F): ");
        toUnit = in.nextLine();
        TemperatureConverter from = new TemperatureConverter(fromUnit);
        TemperatureConverter to = new TemperatureConverter(toUnit);
        System.out.println("Value:");
        val = in.nextDouble();
        double celcius = from.toCelcius(val);
        converted = to.fromCelcius(celcius);
        from.display(val, fromUnit, converted, toUnit);
        System.out.println(val + " " + fromUnit + " = " + converted + " " + toUnit);
    }

    public String[] display() {
        String[] displayString = {Double.toString(val), fromUnit, Double.toString(converted), toUnit};
        return displayString;
    }
}
And this is one of the conversion classes:
public class TemperatureConverter {

    final double C_TO_F = 33.8;
    final double C_TO_C = 1;
    final double C_TO_KELVIN = 274.15;
    private double factor;

    public TemperatureConverter() {}

    public TemperatureConverter(String unit) {
        if (unit.contentEquals("F"))
            factor = C_TO_F;
        else if (unit.contentEquals("C"))
            factor = C_TO_C;
        else if (unit.contentEquals("K"))
            factor = C_TO_KELVIN;
    }

    public double toCelcius(double measurement) {
        return measurement * factor;
    }

    public double fromCelcius(double measurement) {
        return measurement / factor;
    }

    public void display(double val, String fromUnit, double converted, String toUnit) {}
}
This is my configuration file:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:aop="http://www.springframework.org/schema/aop"
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.0.xsd
                           http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop-2.0.xsd
                           http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-2.5.xsd">

    <aop:aspectj-autoproxy/>

    <bean name="lengthConverter" class="converter.method.LengthConverter"/>
    <bean name="temperatureConverter" class="converter.method.TemperatureConverter"/>
    <bean name="lengthService" class="converter.service.LengthService" autowire="byName"/>
    <bean name="temperatureService" class="converter.service.TemperatureService"/>
    <bean name="ValidationAspect" class="converter.aspect.ValidationAspect"/>
    <bean name="DisplayAspect" class="converter.aspect.DisplayAspect"/>
</beans>
I want to be able to apply an aspect to methods of the converter class called by the service class, but as mentioned, it doesn't work unless the method is called from the main class directly. (The display method was originally part of the converter class, but I moved it so that the aspect would work.) Also, why will an aspect not pick up the nextLine() method call?
Edit:
This is one of my aspects:
@Aspect
public class DisplayAspect {

    @AfterReturning(pointcut = "execution(* display(..))", returning = "retVal")
    public void fileSetUp(Object retVal) {
        System.out.println("So we found the display things");
        Writer writer = null;
        String[] returnArray = (String[]) retVal;
        try {
            System.out.println("inside try");
            String text = "The operation performed was: " + returnArray[0] + " in " + returnArray[1] + " is " + returnArray[2] + " " + returnArray[3] + "\n";
            File file = new File("Log.txt");
            writer = new BufferedWriter(new FileWriter(file, true));
            writer.write(text);
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (writer != null) {
                    writer.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}

I want to be able to apply an aspect to functions of the converter class
Well, then change your pointcut so as to intercept the methods (not functions; in Java they are called methods) you want to handle in your advice. At the moment the pointcut is
execution(* display(..))
i.e. it will intercept all methods named display, with any number of parameters and any return type. If you want to intercept all converter methods instead, change it to
execution(* converter.method.TemperatureConverter.*(..))
as mentioned, it doesn't work unless the method is called from the main class directly.
I have to guess here because the description is unclear, but what you are probably trying to describe is that the advice is only applied when TemperatureService.display() is called from outside the class, not from a method within TemperatureService itself. This is a known and well-documented limitation of Spring AOP; see the Spring manual, chapter 9.6, "Proxying mechanisms". Due to the proxy-based "AOP lite" approach of Spring AOP, this cannot work: internal calls to methods of this are not routed through the dynamic proxy created by the Spring container. Thus, Spring AOP only works for inter-bean calls, not intra-bean ones. If you need to intercept internal calls, you need to switch to full-blown AspectJ, which can easily be integrated into Spring applications via LTW (load-time weaving), as described in chapter 9.8, "Using AspectJ with Spring applications".
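Note, incidentally, that in the posted main method TemperatureService is created with new rather than fetched via ctx.getBean(...): an object you instantiate yourself is not a Spring bean at all, so there is no proxy around it and no advice can ever apply to its methods. As a minimal sketch of both situations (class names reused from the question, bodies abbreviated):

public class TemperatureService {

    public void runTempertureService() {
        // Self-invocation: effectively this.display(). The call goes straight
        // to the target object, bypassing the Spring proxy, so the
        // DisplayAspect advice is NOT applied.
        display();
    }

    public String[] display() {
        return new String[] { "0.0", "C", "0.0", "F" };
    }
}

// A call made through the proxied bean obtained from the container, by
// contrast, does trigger the advice:
//
//   TemperatureService service = ctx.getBean("temperatureService", TemperatureService.class);
//   service.display();   // routed through the proxy -> advice applies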

Related

Tree-like view of triplets, removing URIs

I have written code in Java that reads an ontology and prints the triplets. The code is working fine. I want to hide the URIs in the output and also print the output in tree-hierarchy form; currently it gives me the output in flat lines. Any idea how I can do this?
Tree Form Like:
Thing
Class
SubClass
Individual
so on ...
This is the ReadOntology class; I use this class in the servlet.
public class ReadOntology {

    public static OntModel model;

    public static void run(String ontologyInFile) {
        model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM, null);
        InputStream ontologyIn = FileManager.get().open(ontologyInFile);
        loadModel(model, ontologyIn);
    }

    protected static void loadModel(OntModel m, InputStream ontologyIn) {
        try {
            m.read(ontologyIn, "RDF/XML");
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
    }
}
This is the servlet:
public class Ontology extends HttpServlet {

    OntClass ontClass = null;

    public void service(HttpServletRequest req, HttpServletResponse res) throws IOException, ServletException {
        PrintWriter out = res.getWriter();
        ServletContext context = this.getServletContext();
        String fullPath = context.getRealPath("/WEB-INF/Data/taxi.owl");
        ReadOntology.run(fullPath);
        SimpleSelector selector = new SimpleSelector(null, null, (RDFNode) null);
        StmtIterator iter = ReadOntology.model.listStatements(selector);
        while (iter.hasNext()) {
            Statement stmt = iter.nextStatement();
            out.print(stmt.getSubject().toString());
            out.print(stmt.getPredicate().toString());
            out.println(stmt.getObject().toString());
        }
    }
}
As one step towards your goal, this groups the statements by subject, and for the predicates only shows the local name:
ResIterator resIt = ReadOntology.model.listSubjects();
while (resIt.hasNext()) {
    Resource r = resIt.nextResource();
    out.println(r);
    StmtIterator iter = r.listProperties();
    while (iter.hasNext()) {
        Statement stmt = iter.nextStatement();
        out.print(" ");
        out.print(stmt.getPredicate().getLocalName());
        out.println(stmt.getObject());
    }
}
There are lots of useful methods in the API for Resource and Model.
To render a full class tree, use the methods on OntModel and OntClass. Perhaps:
private void printClass(PrintWriter out, OntClass clazz, int indentation) {
    String space = " ".repeat(indentation);   // String.repeat needs Java 11+
    out.println(space + clazz.getLocalName());
    // recurse into the direct subclasses, one level deeper
    ExtendedIterator<OntClass> subClasses = clazz.listSubClasses(true);
    while (subClasses.hasNext()) {
        printClass(out, subClasses.next(), indentation + 1);
    }
    // then iterate over clazz.listInstances() and print each instance's
    // properties as in the snippet above, prefixed with space
}
Then, in the service method, iterate over the OntModel's classes and, for each class where hasSuperClass() returns false, call printClass(out, clazz, 0).
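A minimal sketch of that driver loop (assuming a model without inference, so hasSuperClass() reflects only asserted rdfs:subClassOf statements; anonymous class expressions are skipped):

ExtendedIterator<OntClass> classes = ReadOntology.model.listClasses();
while (classes.hasNext()) {
    OntClass c = classes.next();
    // roots of the hierarchy: named classes with no (asserted) superclass
    if (!c.isAnon() && !c.hasSuperClass()) {
        printClass(out, c, 0);
    }
}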

Native Client - Serialization Exception when executing Continuous Query

I'm trying to set up a simple Java <-> C#/.NET proof of concept using Apache Geode, specifically testing the continuous-query functionality of the .NET native client. A regular Query works fine from .NET; only the continuous query has an issue. I run into my problem when I call the Execute() method on the continuous-query object. The specific error I get is
Got unhandled message type 26 while processing response, possible serialization mismatch
I'm only storing simple strings in the cache region, so I'm a bit surprised that I'm having serialization issues. I've tried enabling PDX serialization on both sides (and running without it); it doesn't seem to make a difference. Any ideas?
Here is my code for both sides:
Java
Starts a server, puts some data, and then keeps updating a given cache entry.
public class GeodePoc {

    public static void main(String[] args) throws Exception {
        ServerLauncher serverLauncher = new ServerLauncher.Builder().setMemberName("server1")
                .setServerBindAddress("localhost").setServerPort(10334).set("start-locator", "localhost[20341]")
                .set(ConfigurationProperties.LOG_LEVEL, "trace")
                .setPdxReadSerialized(true)
                .set(ConfigurationProperties.CACHE_XML_FILE, "cache.xml").build();
        serverLauncher.start();

        Cache c = CacheFactory.getAnyInstance();
        Region<String, String> r = c.getRegion("example_region");
        r.put("test1", "value1");
        r.put("test2", "value2");
        System.out.println("Cache server successfully started");

        int i = 0;
        while (true) {
            r.put("test1", "value" + i);
            System.out.println(r.get("test1"));
            Thread.sleep(3000);
            i++;
        }
    }
}
Server cache.xml
<?xml version="1.0" encoding="UTF-8"?>
<cache xmlns="http://geode.apache.org/schema/cache" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://geode.apache.org/schema/cache http://geode.apache.org/schema/cache/cache-1.0.xsd"
version="1.0">
<cache-server bind-address="localhost" port="40404"
max-connections="100" />
<pdx>
<pdx-serializer>
<class-name>org.apache.geode.pdx.ReflectionBasedAutoSerializer</class-name>
<parameter name="classes">
<string>java.lang.String</string>
</parameter>
</pdx-serializer>
</pdx>
<region name="example_region">
<region-attributes refid="REPLICATE" />
</region>
</cache>
.NET Client
public static void GeodeTest()
{
    Properties<string, string> props = Properties<string, string>.Create();
    props.Insert("cache-xml-file", "<path-to-cache.xml>");
    CacheFactory cacheFactory = new CacheFactory(props)
        .SetPdxReadSerialized(true).SetPdxIgnoreUnreadFields(true)
        .Set("log-level", "info");
    Cache cache = cacheFactory.Create();
    cache.TypeRegistry.PdxSerializer = new ReflectionBasedAutoSerializer();
    IRegion<string, string> region = cache.GetRegion<string, string>("example_region");
    Console.WriteLine(region.Get("test2", null));

    PoolManager pManager = cache.GetPoolManager();
    Pool pool = pManager.Find("serverPool");
    QueryService qs = pool.GetQueryService();

    // Regular query example (works)
    Query<string> q = qs.NewQuery<string>("select * from /example_region");
    ISelectResults<string> results = q.Execute();
    Console.WriteLine("Finished query");
    foreach (string result in results)
    {
        Console.WriteLine(result);
    }

    // Continuous query (does not work)
    CqAttributesFactory<string, object> cqAttribsFactory = new CqAttributesFactory<string, object>();
    ICqListener<string, object> listener = new CacheListener<string, object>();
    cqAttribsFactory.InitCqListeners(new ICqListener<string, object>[] { listener });
    cqAttribsFactory.AddCqListener(listener);
    CqAttributes<string, object> cqAttribs = cqAttribsFactory.Create();
    CqQuery<string, object> cquery = qs.NewCq<string, object>("select * from /example_region", cqAttribs, false);
    Console.WriteLine(cquery.GetState());
    Console.WriteLine(cquery.QueryString);
    Console.WriteLine(">>> Cache query example started.");
    cquery.Execute();
    Console.WriteLine();
    Console.WriteLine(">>> Example finished, press any key to exit ...");
    Console.ReadKey();
}
.NET Cache Listener
public class CacheListener<TKey, TResult> : ICqListener<TKey, TResult>
{
    public virtual void OnEvent(CqEvent<TKey, TResult> ev)
    {
        object val = ev.getNewValue() as object;
        TKey key = ev.getKey();
        CqOperation opType = ev.getQueryOperation();
        string opStr = "DESTROY";
        if (opType == CqOperation.OP_TYPE_CREATE)
            opStr = "CREATE";
        else if (opType == CqOperation.OP_TYPE_UPDATE)
            opStr = "UPDATE";
        Console.WriteLine("MyCqListener::OnEvent called with key {0}, op {1}.", key, opStr);
    }

    public virtual void OnError(CqEvent<TKey, TResult> ev)
    {
        Console.WriteLine("MyCqListener::OnError called");
    }

    public virtual void Close()
    {
        Console.WriteLine("MyCqListener::close called");
    }
}
.NET Client cache.xml
<client-cache
xmlns="http://geode.apache.org/schema/cache"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://geode.apache.org/schema/cache http://geode.apache.org/schema/cache/cache-1.0.xsd"
version="1.0">
<pool name="serverPool" subscription-enabled="true">
<locator host="localhost" port="20341"/>
</pool>
<region name="example_region">
<region-attributes refid="CACHING_PROXY" pool-name="serverPool" />
</region>
</client-cache>
This ended up being a simple oversight on my part. In order for continuous queries to function, you must include the geode-cq dependency on the Java (server) side. I didn't do this, and that caused the exception.
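For reference, a sketch of that dependency for a Maven build (the version placeholder is an assumption; match it to your Geode installation):

<dependency>
    <groupId>org.apache.geode</groupId>
    <artifactId>geode-cq</artifactId>
    <version>${geode.version}</version>
</dependency>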

Spring Batch: FlatFileItemWriter header never called

I have a weird issue with my FlatFileItemWriter callbacks.
I have a custom ItemWriter implementing both FlatFileFooterCallback and FlatFileHeaderCallback. Accordingly, I set the header and footer callbacks on my FlatFileItemWriter like this:
ItemWriter Bean
@Bean
@StepScope
public ItemWriter<CityItem> writer(FlatFileItemWriter<CityProcessed> flatWriter, @Value("#{jobExecutionContext[inputFile]}") String inputFile) {
    CityItemWriter itemWriter = new CityItemWriter();
    flatWriter.setHeaderCallback(itemWriter);
    flatWriter.setFooterCallback(itemWriter);
    itemWriter.setDelegate(flatWriter);
    itemWriter.setInputFileName(inputFile);
    return itemWriter;
}
FlatFileItemWriter Bean
@Bean
@StepScope
public FlatFileItemWriter<CityProcessed> flatFileWriterArchive(@Value("#{jobExecutionContext[outputFileArchive]}") String outputFile) {
    FlatFileItemWriter<CityProcessed> flatWriter = new FlatFileItemWriter<CityProcessed>();
    FileSystemResource isr;
    isr = new FileSystemResource(new File(outputFile));
    flatWriter.setResource(isr);
    DelimitedLineAggregator<CityProcessed> aggregator = new DelimitedLineAggregator<CityProcessed>();
    aggregator.setDelimiter(";");
    BeanWrapperFieldExtractor<CityProcessed> beanWrapper = new BeanWrapperFieldExtractor<CityProcessed>();
    beanWrapper.setNames(new String[]{
        "country", "name", "population", "popUnder25", "pop25To50", "pop50to75", "popMoreThan75"
    });
    aggregator.setFieldExtractor(beanWrapper);
    flatWriter.setLineAggregator(aggregator);
    flatWriter.setEncoding("ISO-8859-1");
    return flatWriter;
}
Step Bean
@Bean
public Step stepImport(StepBuilderFactory stepBuilderFactory, ItemReader<CityFile> reader, ItemWriter<CityItem> writer, ItemProcessor<CityFile, CityItem> processor,
        @Qualifier("flatFileWriterArchive") FlatFileItemWriter<CityProcessed> flatFileWriterArchive, ExecutionContextPromotionListener executionContextListener) {
    return stepBuilderFactory.get("stepImport").<CityFile, CityItem> chunk(10).reader(reader(null)).processor(processor).writer(writer).stream(flatFileWriterArchive)
            .listener(executionContextListener).build();
}
I have the classic content in my writeFooter, writeHeader and write methods.
ItemWriter code
public class CityItemWriter implements ItemWriter<CityItem>, FlatFileFooterCallback, FlatFileHeaderCallback, ItemStream {

    private FlatFileItemWriter<CityProcessed> writer;
    private static int totalUnknown = 0;
    private static int totalSup10000 = 0;
    private static int totalInf10000 = 0;
    private String inputFileName = "-";

    public void setDelegate(FlatFileItemWriter<CityProcessed> delegate) {
        writer = delegate;
    }

    public void setInputFileName(String name) {
        inputFileName = name;
    }

    private Predicate<String> isNullValue() {
        return p -> p == null;
    }

    @Override
    public void write(List<? extends CityItem> cities) throws Exception {
        List<CityProcessed> citiesCSV = new ArrayList<>();
        for (CityItem item : cities) {
            String populationAsString = "";
            String less25AsString = "";
            String more25AsString = "";
            /*
             * Some processing to get total Unknown/Sup 10000/Inf 10000
             * and other data
             */
            // Write in CSV file
            CityProcessed cre = new CityProcessed();
            cre.setCountry(item.getCountry());
            cre.setName(item.getName());
            cre.setPopulation(populationAsString);
            cre.setLess25(less25AsString);
            cre.setMore25(more25AsString);
            citiesCSV.add(cre);
        }
        writer.write(citiesCSV);
    }

    @Override
    public void writeFooter(Writer fileWriter) throws IOException {
        String newLine = "\r\n";
        String unknownLine = "Subtotal:;Unknown;" + String.valueOf(totalUnknown) + newLine;
        String sup10000Line = ";Sum Sup 10000;" + String.valueOf(totalSup10000) + newLine;
        String inf10000Line = ";Sum Inf 10000;" + String.valueOf(totalInf10000) + newLine;
        String total = "Total:;;" + String.valueOf(totalSup10000 + totalInf10000 + totalUnknown) + newLine;
        fileWriter.write(newLine);
        fileWriter.write(unknownLine);
        fileWriter.write(sup10000Line);
        fileWriter.write(inf10000Line);
        fileWriter.write(total);
    }

    @Override
    public void writeHeader(Writer fileWriter) throws IOException {
        String newLine = "\r\n";
        String firstLine = "FILE PROCESSED ON: ;" + new SimpleDateFormat("MM/dd/yyyy").format(new Date()) + newLine;
        String secondLine = "Filename: ;" + inputFileName + newLine;
        String colNames = "Country;Name;Population...;...having less than 25;...having more than 25";
        fileWriter.write(firstLine);
        fileWriter.write(secondLine);
        fileWriter.write(newLine);
        fileWriter.write(colNames);
    }

    @Override
    public void close() throws ItemStreamException {
        writer.close();
    }

    @Override
    public void open(ExecutionContext context) throws ItemStreamException {
        writer.open(context);
    }

    @Override
    public void update(ExecutionContext context) throws ItemStreamException {
        writer.update(context);
    }
}
When I run my batch, I only get the data for each city (the write method part) and the footer lines. Even if I comment out the whole content of the write method and the footer callback, I still don't get the header lines. I tried adding a System.out.println() in my header callback; it looks like it's never called.
Here is an example of the CSV file produced by my batch:
France;Paris;2240621;Unknown;Unknown
France;Toulouse;439553;Unknown;Unknown
Spain;Barcelona;1620943;Unknown;Unknown
Spain;Madrid;3207247;Unknown;Unknown
[...]
Subtotal:;Unknown;2
;Sum Sup 10000;81
;Sum Inf 10000;17
Total:;;100
What is weird is that my header used to work, back when I first added both footer and header callbacks. I didn't change them, and I don't see what I've done in my code to break my header callback... And of course, I have no saved copy of that first version. Because I only noticed now that my header has disappeared (I checked my last few files, and it looks like the header has been missing for some time without my seeing it), I can't simply undo my modifications to see when/why it happened.
Do you have any idea how to solve this problem?
Thanks
When using Java config as you are, it's best to return the most specific type possible (the opposite of what you're normally told to do in Java programming). In this case, your writer method returns ItemWriter, but is step scoped. Because of this, a proxy is created that can only see the type your Java config returns, which in this case is ItemWriter, and so it does not expose the methods of the ItemStream interface. If you return CityItemWriter, I'd expect things to work.
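A minimal sketch of that change, applied to the writer bean from the question (only the declared return type changes; the body stays the same):

@Bean
@StepScope
public CityItemWriter writer(FlatFileItemWriter<CityProcessed> flatWriter,
        @Value("#{jobExecutionContext[inputFile]}") String inputFile) {
    // Returning the concrete CityItemWriter lets the step-scoped proxy
    // expose the ItemStream interface, so open()/update()/close() -- and
    // with them the header callback -- are actually invoked.
    CityItemWriter itemWriter = new CityItemWriter();
    flatWriter.setHeaderCallback(itemWriter);
    flatWriter.setFooterCallback(itemWriter);
    itemWriter.setDelegate(flatWriter);
    itemWriter.setInputFileName(inputFile);
    return itemWriter;
}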

EclipseLink entity mappings cache

I am using EclipseLink for my project.
I extend XMLMetadataSource (to provide a custom class loader) because the entities I persist are created at runtime. That works OK.
I get "unknown entity type" when I do the following:
Create an entity.
Create a mapping.
Create an entity manager factory, providing the custom class loader.
Create an entity manager and persist. -- IT WORKS OK.
Now drop the entity, and drop it from the class loader.
Create the same entity again.
Create the mapping again (of course it looks the same).
Try to refresh the entity manager factory with new properties (new class loader, mapping file).
Try to persist - it complains "unknown type".
Any idea if EclipseLink caches XML mappings? I tried re-creating the factory, but I get the same error. I tried MySQL and Derby, with 'drop-and-create-tables' and 'create-or-extend-tables': same result.
I filed a bug with EclipseLink:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=426310
It's not a bug per se in EL. The problem is that EL does not rebuild or re-create its 'class --> class descriptor' map (an internal map that holds the Class object of each entity and the entity's descriptor). I found this out accidentally. For those interested, here is some sample code that might help.
public class Test1 {

    // field declarations added here for completeness; the original post omitted them
    private static EntityManagerFactory emf;
    private EntityManager em;
    private final String pu_name;
    private final Map<String, Object> properties;
    private final MyClassLoader loader;
    private WAMetadataSource mms;

    public Test1(String pu, Map<String, Object> props) {
        pu_name = pu;
        properties = new HashMap<String, Object>();
        properties.putAll(props);
        loader = new MyClassLoader();
    }

    public void initialization() {
        mms = new WAMetadataSource();
        properties.put(PersistenceUnitProperties.METADATA_SOURCE, mms);
        properties.put(PersistenceUnitProperties.CLASSLOADER, loader);
        if (emf == null || !emf.isOpen()) {
            synchronized (Test1.class) {
                if (emf == null || !emf.isOpen()) {
                    emf = Persistence.createEntityManagerFactory(pu_name, properties);
                }
            }
        } else {
            JpaHelper.getEntityManagerFactory(emf).refreshMetadata(properties);
        }
        System.out.println("======> refreshed. emf.hashCode : " + emf.hashCode() + ", loader.h : " + loader.hashCode());
    }

    public EntityManager getEntityManager(Map<String, Object> props) {
        if (em == null) {
            em = emf.createEntityManager(props);
        }
        return em;
    }

    public void persist(Object obj) {
        getEntityManager(properties);
        System.out.println("===> em.hashCode =" + em.hashCode() + ", " + JpaHelper.getEntityManager(em).getProperties().get(PersistenceUnitProperties.CLASSLOADER).hashCode());
        em.clear();
        em.getTransaction().begin();
        em.persist(obj);
        em.getTransaction().commit();
    }

    public Object getRuntimeEntityObject(int ii) throws Exception {
        Object obj = null;
        Class clazz = loader.loadClass("com.xxx.sample.entity.runtime.User");
        if (ii == 1) {
            obj = clazz.getConstructor(String.class).newInstance("Jai Ramjiki-1");
        } else {
            obj = clazz.getConstructor(String.class).newInstance("Jai Ramjiki-2");
        }
        obj = clazz.cast(obj);
        return obj;
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> props = new HashMap<String, Object>();
        props.put(PersistenceUnitProperties.JDBC_DRIVER, "com.mysql.jdbc.Driver");
        props.put(PersistenceUnitProperties.JDBC_URL, "jdbc:mysql://localhost:3306/test");
        props.put(PersistenceUnitProperties.JDBC_USER, "root");
        props.put(PersistenceUnitProperties.JDBC_PASSWORD, "root");
        props.put(PersistenceUnitProperties.DDL_GENERATION, "create-or-extend-tables");

        Test1 t1 = new Test1("mysql", props);
        Object obj1 = t1.getRuntimeEntityObject(1);
        System.out.println(" ****> obj1 = " + obj1 + ", classloader hashcode : " + obj1.getClass().getClassLoader().hashCode());
        t1.initialization();
        t1.persist(obj1);
        System.out.println("Class 1 : " + obj1.getClass().hashCode() + ", obj1 : " + obj1);
        t1.close();

        // now drop the previous class loader and rerun the same steps
        Test1 t2 = new Test1("mysql", props);
        Object obj2 = t2.getRuntimeEntityObject(2);
        System.out.println(" ****> obj2 = " + obj2 + ", classloader hashcode : " + obj2.getClass().getClassLoader().hashCode());
        t2.initialization();
        t2.persist(obj2);
        t2.close();

        Object obj3 = t1.getRuntimeEntityObject(1);
        System.out.println(" ****> obj3 = " + obj3 + ", classloader hashcode : " + obj3.getClass().getClassLoader().hashCode());
        t1.persist(obj3);
    }
}
And extend XMLMetadataSource:
@Override
public XMLEntityMappings getEntityMappings(Map<String, Object> properties, ClassLoader classLoader, SessionLog log) {
    properties.put(PersistenceUnitProperties.METADATA_SOURCE_XML_FILE, "eclipselink-orm-user.xml");
    properties.put(PersistenceUnitProperties.VALIDATOR_FACTORY, null);
    return super.getEntityMappings(properties, classLoader, log);
}
And create a runtime class using Javassist in your CustomClassLoader, which extends ClassLoader:
public void createRuntimeClass(String className) throws Exception {
    CtClass bclass = pool.makeClass(className);
    bclass.addConstructor(CtNewConstructor.defaultConstructor(bclass));
    Map<String, String> fields = new HashMap<String, String>();
    addFields(fields);
    int noOfFields = fields.size();
    CtClass[] fclasses = new CtClass[noOfFields];
    int ii = 0;
    for (Entry<String, String> field : fields.entrySet()) {
        String fieldName = field.getKey();
        String fieldType = field.getValue();
        // .. code to add the field
        bclass.addField(bfield);
        // add getter and setter methods
    }
    CtConstructor userConstructor = CtNewConstructor.make(constructorSource, bclass);
    bclass.addConstructor(userConstructor);
    byte[] bytes = bclass.toBytecode();
    Class cls = bclass.toClass(this, null);
    loadedClasses.put(className, cls);
    loadClassBytes.put(className, bytes);
}
And override the loadClass and getResourceAsStream methods:

public Class<?> loadClass(String name) throws ClassNotFoundException {
    return loadedClasses.get(name);
}

public InputStream getResourceAsStream(String name) {
    return new ByteArrayInputStream(loadClassBytes.get(name));
}
Hope this helps.
EL does provide ways to clear the cache of the current project and to set the descriptor maps, but none of them worked for me. I am not sure whether that is intended behavior or whether they exposed that API by mistake.
Gopi
Yes, persistence unit loading is only done once. If you are using an XMLMetadataSource to change mappings, you must tell the factory to refresh its mappings using refreshMetadata() on the EMF as described here:
http://wiki.eclipse.org/EclipseLink/DesignDocs/340192#Refresh
After that, the next EntityManagers obtained will be using the new mappings, while existing EMs will still use the old mappings.
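A minimal sketch of that refresh, reusing the JpaHelper call that already appears in the question's initialization() method (newMetadataSource and newClassLoader are hypothetical placeholders for the rebuilt metadata source and class loader):

// rebuild the persistence unit's metadata against the new artifacts
Map<String, Object> newProps = new HashMap<String, Object>();
newProps.put(PersistenceUnitProperties.METADATA_SOURCE, newMetadataSource);  // hypothetical
newProps.put(PersistenceUnitProperties.CLASSLOADER, newClassLoader);         // hypothetical
JpaHelper.getEntityManagerFactory(emf).refreshMetadata(newProps);

// EntityManagers created after the refresh use the new mappings;
// EntityManagers obtained before it keep the old ones
EntityManager em = emf.createEntityManager();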

XML serialization of a hash table (C# 3.0)

Hi, I am trying to serialize a Hashtable, but it is not working:
private void Form1_Load(object sender, EventArgs e)
{
    Hashtable ht = new Hashtable();
    DateTime dt = DateTime.Now;
    for (int i = 0; i < 10; i++)
        ht.Add(dt.AddDays(i), i);
    SerializeToXmlAsFile(typeof(Hashtable), ht);
}

private void SerializeToXmlAsFile(Type targetType, Object targetObject)
{
    try
    {
        string fileName = @"C:\output.xml";
        // Serialize to XML
        XmlSerializer s = new XmlSerializer(targetType);
        TextWriter w = new StreamWriter(fileName);
        s.Serialize(w, targetObject);
        w.Flush();
        w.Close();
    }
    catch (Exception ex) { throw ex; }
}
After a Google search, I found that objects that implement IDictionary cannot be serialized with XmlSerializer. However, I had success with binary serialization. But I want the XML kind. Is there any way of doing so?
I am using C# 3.0.
Thanks
First of all, starting with C# 2.0 you can use a type-safe version of the very old Hashtable, which dates from .NET 1.0: use Dictionary<DateTime, int> instead.
Starting with .NET 3.0 you can use DataContractSerializer. So you can rewrite your code like the following:
private void Form1_Load(object sender, EventArgs e)
{
    MyHashtable ht = new MyHashtable();
    DateTime dt = DateTime.Now;
    for (int i = 0; i < 10; i++)
        ht.Add(dt.AddDays(i), i);
    SerializeToXmlAsFile(typeof(MyHashtable), ht);
}
where the SerializeToXmlAsFile method and the MyHashtable type are defined as follows:
[CollectionDataContract(Name = "AllMyHashtable", ItemName = "MyEntry",
    KeyName = "MyDate", ValueName = "MyValue")]
public class MyHashtable : Dictionary<DateTime, int> { }

private void SerializeToXmlAsFile(Type targetType, Object targetObject)
{
    try
    {
        string fileName = @"C:\output.xml";
        DataContractSerializer s = new DataContractSerializer(targetType);
        XmlWriterSettings settings = new XmlWriterSettings();
        settings.Indent = true;
        settings.IndentChars = ("    ");
        using (XmlWriter w = XmlWriter.Create(fileName, settings))
        {
            s.WriteObject(w, targetObject);
            w.Flush();
        }
    }
    catch (Exception ex) { throw ex; }
}
This code produces a C:\output.xml file with the following content:
<?xml version="1.0" encoding="utf-8"?>
<AllMyHashtable xmlns:i="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://schemas.datacontract.org/2004/07/DataContractXmlSerializer">
<MyEntry>
<MyDate>2010-06-09T22:30:00.9474539+02:00</MyDate>
<MyValue>0</MyValue>
</MyEntry>
<MyEntry>
<MyDate>2010-06-10T22:30:00.9474539+02:00</MyDate>
<MyValue>1</MyValue>
</MyEntry>
<!-- ... -->
</AllMyHashtable>
As you can see, we are free to define all of the element names of the destination XML file.
Alternatively, you can create your own Hashtable derived from the standard Hashtable that implements IXmlSerializable. You would then implement ReadXml(XmlReader reader) and WriteXml(XmlWriter writer), where you put your own logic for reading and writing the values of your Hashtable with the given XmlReader and XmlWriter.
Still, I suggest you use DataContractSerializer; it's more powerful and easier to use.