The project is required to receive a lot of data (possibly historical weather data for one state) from different data sources, such as zip files and data files hosted on a website. The data format is not fixed; the files might be .txt, .pdf, or .xml.
Since the specification says that JMS and JPA should be used for the implementation, I am thinking of using JMS ObjectMessage to transfer the data to the application server. The advantage of ObjectMessage is that it carries the data as an object, so I can store it as a persistent object and use JPA to access it for later queries.
I am looking for a simple and extensible way to do this with JMS and JPA. The data source, data format, and data size might change in the future.
Create a client (a sample test class) which will read data from the FSL/INGRA file, create text JMS messages, and send them to your server.
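As a minimal sketch of the ObjectMessage idea, assuming an ActiveMQ broker on localhost; the queue name and the WeatherRecord payload are placeholders I made up, and the consuming side would deserialize the object and hand it to JPA via entityManager.persist(...):

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.ObjectMessage;
import javax.jms.Queue;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class WeatherProducer {
    // Hypothetical serializable payload parsed from one of the source files.
    public static class WeatherRecord implements java.io.Serializable {
        String station = "KJFK";
        double temperatureCelsius = 21.5;
    }

    public static void main(String[] args) throws Exception {
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616"); // assumed broker URL
        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Queue queue = session.createQueue("weather.ingest"); // assumed queue name
            // ObjectMessage carries the serialized record to the server side,
            // where a message-driven bean can persist it with JPA.
            ObjectMessage message = session.createObjectMessage(new WeatherRecord());
            session.createProducer(queue).send(message);
        } finally {
            connection.close();
        }
    }
}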
I want to send data generated by my application to an OPC UA client using an OPC UA server. I have gone through the Eclipse Milo project, which is a great resource for this, but I don't know how to integrate it into our application. The application produces output in JSON format. The data of multiple nodes is stored, processed, and sent as JSON like the following:
{"deviceId":"36860","timestamp":"2019-03-07 10:37:20+05:30","1":"228.6","2":"237.65","3":"237.21","4":"0.13","5":"0.0","6":"0.11","7":"-2.95","8":"0.0","9":"4.03","10":"22.2","11":"0.0","12":"16.43","13":"-21.83","14":"0.0","15":"-15.72","16":"-0.13","17":"1.0","18":"0.25","19":"262.35","20":"0.0","21":"284.18","22":"234.49","23":"703.47","24":"0.08","25":"0.24","26":"0.36","27":"1.08","28":"12.87","29":"38.62","30":"-12.52"}
where 36860 is the nodeId/deviceId, timestamp is the time when the data was captured from the node, and the rest are parameter IDs and their actual reading values for the node, as key-value pairs.
How can I use the ExampleServer to send this data, and how would a client receive it?
If possible, can anyone provide an example?
I think the first thing you should decide is how to model the data in the server.
You could, of course, stuff that string into a single VariableNode with a DataType of String, but then why bother using OPC UA?
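For example, each parameter could become its own VariableNode (say, Device36860/Param1). A minimal Milo client read against such a layout might look like the sketch below; the endpoint URL, namespace index, and node identifier are assumptions, not something the ExampleServer provides out of the box:

import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;

public class ReadDeviceParameter {
    public static void main(String[] args) throws Exception {
        // Endpoint of a locally running server; the URL is an assumption.
        OpcUaClient client = OpcUaClient.create("opc.tcp://localhost:12686/milo");
        client.connect().get();

        // Hypothetical layout: one VariableNode per device parameter.
        NodeId node = new NodeId(2, "Device36860/Param1");
        DataValue value = client.readValue(0.0, TimestampsToReturn.Both, node).get();
        System.out.println("Param1 = " + value.getValue().getValue());

        client.disconnect().get();
    }
}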
I am trying to utilise Django REST APIs to insert data into the database, instead of writing to it directly. I've been able to read JSON data using the tRESTClient component, but I am not too sure about the insertion/POST. Could someone point me to the components (and flow) that I should use?
The current job that I have is mostly:
Read data from raw file -> tMap -> DB
and I wish to do something like:
Read data from raw file -> tMap -> (pass on data to REST endpoint via POST)
I used the tRESTClient component after my tMap and I could see the records getting inserted into the DB, but all of them are without any data. Strangely, nowhere was I asked to specify the JSON tree. The number of records getting inserted equals the number of rows read from the raw file, so at least something is right. But I couldn't locate the menu/options to specify which data element read from the raw file should map to which JSON element.
How do I specify the data-to-JSON mapping?
PS: I realise that this might not be the most efficient way to ingest data, but that's what the business wants, since it brings in an additional layer of control.
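For reference, the request the job ultimately has to issue is just a JSON POST. In plain Java it amounts to something like the following sketch; the endpoint URL and field names are made up, and in the Talend job the JSON body would be assembled from the tMap output columns before the tRESTClient call:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestInsert {
    public static void main(String[] args) throws Exception {
        // Hypothetical Django REST endpoint; field names are placeholders.
        URL url = new URL("http://localhost:8000/api/records/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // This is the mapping step: each input column becomes a JSON field.
        String json = "{\"name\":\"row1\",\"value\":42}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}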
I set up a Spark Streaming pipeline that receives measurement data via Kafka. This data was serialized using Avro. The data can be of two types, EquidistantData and DiscreteData. I created these from an avdl file with the sbt-avrohugger plugin, using the variant that generates Scala case classes inheriting from SpecificRecord.
In my receiving application, I can get the two schemas by querying EquidistantData.SCHEMA$ and DiscreteData.SCHEMA$.
Now, my Kafka stream gives me RDDs whose value class is Array[Byte]. So far so good.
How can I find out from the byte array which schema was used when serializing it, i.e., whether to use EquidistantData.SCHEMA$ or DiscreteData.SCHEMA$?
I thought of sending appropriate info in the message key. Currently, I don't use the message key. Would this be a feasible way, or can I somehow get the schema from the serialized byte array I received?
Follow-up:
Another possibility would be to use separate topics for discrete and equidistant data. Would this be feasible?
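Both ideas seem feasible, because plain Avro binary encoding does not carry the writer schema itself. A minimal Java sketch of the key-based variant follows; the key strings are an assumed convention between producer and consumer:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

public class SchemaByKey {
    // Picks the schema based on the Kafka message key, then decodes the payload.
    public static GenericRecord decode(String key, byte[] value,
                                       Schema equidistant, Schema discrete)
            throws java.io.IOException {
        Schema schema = "equidistant".equals(key) ? equidistant : discrete;
        GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(value, null);
        return reader.read(null, decoder);
    }
}

Separate topics would avoid even this branch: each topic's stream then deserializes with exactly one schema.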
In DDS, my requirement is this: I have many subscribers but a single publisher. My subscriber reads data from DDS and checks whether the message is for that particular subscriber. Only if the check succeeds does it take the data and remove it from DDS. The message must remain in DDS until the authenticated subscriber takes its data. How can I achieve this using DDS (in a Java environment)?
First of all, you should be aware that with DDS, a Subscriber is never able to remove data from the global data space. Every Subscriber has its own cached copy of the distributed data and can only act on that copy. If one Subscriber takes data, then other Subscribers for the same Topic will not be influenced by that in any way. Only Publishers can remove data globally for every Subscriber. From your question, it is not clear whether you know this.
Independent of that, it seems like the use of a ContentFilteredTopic (CFT) is suitable here. According to the description, the Subscriber knows the file name that it is looking for. With a CFT, the Subscriber can indicate that it is only interested in samples that have a particular value for the file_name attribute. The infrastructure will take care of the filtering process and will ensure that the Subscriber will not receive any data with a different value for the attribute file_name. As a consequence, any take() action done on the DataReader will contain relevant information and there is no need to check the data first and then take it.
The API documentation should contain more detailed information about how to use a ContentFilteredTopic.
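As a sketch, with a classic DCPS Java binding (method and type names differ slightly between DDS vendors, and the FileMsg topic and its file_name field are hypothetical):

import DDS.ContentFilteredTopic;
import DDS.DomainParticipant;
import DDS.Topic;

public class FilteredReaderSetup {
    // Creates a ContentFilteredTopic so the middleware only delivers samples
    // whose file_name matches; take() then never sees foreign data.
    static ContentFilteredTopic createFileFilter(DomainParticipant participant,
                                                 Topic fileTopic,
                                                 String fileName) {
        return participant.create_contentfilteredtopic(
                "FilteredFiles",                        // name of the new filtered topic
                fileTopic,                              // the existing Topic
                "file_name = %0",                       // SQL-like filter expression
                new String[] { "'" + fileName + "'" }); // parameter %0
    }
}

The DataReader is then created from the ContentFilteredTopic instead of the plain Topic.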
Is it possible to set up HSQLDB in such a way that the files with the DB information are written into memory instead of to actual files? I want to use HSQLDB to export some data structures together with Hibernate mappings. It is, however, not possible to write temporary files, so I need to generate the files in memory and return a stream with their contents as a response.
Setting HSQLDB to use nio does not seem to be a solution, because there is no way to get hold of those files before they are written to the filesystem.
What I'm thinking of is a protocol handler for HSQLDB, but I haven't found a suitable solution yet.
To describe it in other words: a hack solution would be to pass HSQLDB a stream, or several streams. During its operation it would then write data into those streams. After all data is written, the user of the DB could use those streams to send the data back over the network.
Yes, of course; we use it all the time for integration testing.
Use a URL like: jdbc:hsqldb:mem:aname
See here for more details.
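A minimal sketch (the table and rows are placeholders); modern HSQLDB registers its JDBC driver automatically, so no Class.forName call is needed:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class InMemoryHsqldb {
    public static void main(String[] args) throws Exception {
        // "mem:" keeps the entire database in memory; nothing is written to disk.
        Connection conn =
                DriverManager.getConnection("jdbc:hsqldb:mem:aname", "sa", "");
        try (Statement st = conn.createStatement()) {
            st.execute("CREATE TABLE t (id INT PRIMARY KEY, name VARCHAR(32))");
            st.execute("INSERT INTO t VALUES (1, 'example')");
            ResultSet rs = st.executeQuery("SELECT name FROM t WHERE id = 1");
            rs.next();
            System.out.println(rs.getString(1)); // prints "example"
        } finally {
            conn.close();
        }
    }
}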
DbUnit offers a handy database dump method as part of its package:
import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;

// database connection: load the HSQLDB driver and connect
Class.forName("org.hsqldb.jdbcDriver");
Connection jdbcConnection = DriverManager.getConnection(
        "jdbc:hsqldb:sample", "sa", "");
IDatabaseConnection connection = new DatabaseConnection(jdbcConnection);
// full database export to a FlatXml dataset file
IDataSet fullDataSet = connection.createDataSet();
FlatXmlDataSet.write(fullDataSet, new FileOutputStream("full.xml"));
See the DbUnit FAQ for more details. Of course there are routines to restore the data, as that is actually the purpose of the package: preparing a test database for integration testing. Usually we do this with an annotation, but you'll have to use the API for that.