I am doing a POC to save data into CouchDB using IBM's Kitura framework. I am able to upload some data into CouchDB using scripts, and I can fetch and serve that data through a web API.
Similarly, I want another API that accepts data in JSON format and saves it into CouchDB.
Any guidance will be really helpful.
Have you taken a look at the Kitura-CouchDB package, available here: https://github.com/IBM-Swift/Kitura-CouchDB
It also includes a usage sample.
You can also take a look at our TodoList example using CouchDB (or Cloudant) databases.
https://github.com/IBM-Swift/TodoList-CouchDB/
import CouchDB
import SwiftyJSON

let couchDBClient = CouchDBClient(connectionProperties: connectionProperties)
let database = couchDBClient.database(databaseName)
let x = JSON(json)
print("JSON: \(x.rawString() ?? "")")
database.create(x) { id, revision, document, error in
    if let error = error {
        print("Failed to save document: \(error)")
    }
}
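To expose an API that accepts JSON and saves it, you can combine this with Kitura's BodyParser middleware. The following is only a minimal sketch against the Kitura 1.x SwiftyJSON-style API, reusing the database handle from the snippet above; the route path and port are placeholders:

import Kitura
import CouchDB
import SwiftyJSON

let router = Router()
// BodyParser makes the JSON payload available as request.body
router.all("/data", middleware: BodyParser())

router.post("/data") { request, response, next in
    guard let body = request.body, case .json(let json) = body else {
        try response.status(.badRequest).end()
        return
    }
    database.create(json) { id, revision, document, error in
        if error != nil {
            response.status(.internalServerError)
        } else {
            response.status(.created).send(json: JSON(["id": id ?? ""]))
        }
        next()
    }
}

Kitura.addHTTPServer(onPort: 8080, with: router)
Kitura.run()

Posting a JSON body to /data should then create a new document and return its generated id.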
I have a hierarchy of codable Swift structs that I need to store in MongoDB in the cloud (Atlas). I do not need to persist these structs locally, and this app won't need to access this data once it's stored. (There are other apps accessing it, but that's out of the scope of this question.)
MongoDB used to provide a solution called Stitch that did allow me to do just that: my iOS app had a codable Swift struct, the iOS app connected to a Stitch app on MongoDB's Atlas cloud, and I was able to insert the struct into a collection in the DB.
With Stitch, I was able to just do it like this:
let itemsCollection = mongoServiceClient.db("myDatabase")
    .collection("myCollection", withCollectionType: MyStruct.self)
itemsCollection.insertOne(myStruct)
Now Stitch is apparently deprecated and replaced by Realm, a formerly-third-party local persistency layer that MongoDB acquired and integrated with their Atlas cloud.
Realm isn't backwards-compatible with Stitch, and it no longer seems to be able to map Swift structs to a MongoDB backend: it can now sync Realm objects, i.e. ObjC-bridged subclasses of an abstract base class that Realm defines. Converting dozens of structs that my app uses in multiple places into such classes would be way too involved, especially since I would only do that for the sole purpose of uploading them to a backend as I do not need any of the (no doubt excellent) core functionalities of Realm.
(By the way, I find the migration from structs to objects, especially with an Objective-C foundation, quite baffling, as it goes very much against the flow: it's quite clear that Swift and value types are the future for Apple's platforms...)
My question: is there a way I may have missed to just connect to a Realm app on Atlas, and insert documents defined as Swift structs into a collection in an Atlas DB?
Or is there a reasonably-easy way to convert JSON into a Realm object? From what I read, I'd still need to define a schema for the object, and since my source struct contains several embedded structs, it would require creating Object subclasses for each, which is something I'd really need to avoid. Basically, I designed this functionality of my app around MongoDB and Stitch, and I'm not ready to suddenly accommodate the limitations of Realm for no added value.
Finally, failing all that, is there a way to encode my structs into a format that some official MongoDB API (e.g. Realm) can use for inserting?
I can already encode my struct into JSON, but that doesn't seem to be enough. I have seen the doc about MongoDB Realm Remote Access, but it doesn't say anything about my use case beyond connecting to a DB.
Error messages I get suggest that I would need to create a Document (aka Dictionary<String, Optional<AnyBSON>>), where AnyBSON is an enum that defines the type of the BSON value. I haven't found any documentation about converting anything into that format: is there an API for that, or do I need to break down my struct or JSON into a hierarchy of AnyBSON values?
I'll preface this answer by stating it's not a specific answer but more guidance on how to accomplish what's asked in the question.
First, let me re-state the question
What is the process to insert data into MongoDB Realm without storing it locally or using Realm objects, using classes, structs, or something else instead?
To illustrate, we have some tasks stored in MongoDB Realm (Atlas) with nothing stored locally. Here I will create a Document that contains task information and insert it.
We first need to define where the data will be stored and what collection to store it in:
let app = App(id: "your-realm-app-id") // your Realm app instance; use your own app ID
let client = app.currentUser!.mongoClient("mongodb-atlas")
let database = client.database(named: "task-database")
let collection = database.collection(withName: "TaskClass")
then we'll create some BSONs to store the textual data in
let _id = AnyBSON(ObjectId.generate())
let taskName = AnyBSON(stringLiteral: "Inserted Task")
let status = AnyBSON(stringLiteral: "Open")
let partition = AnyBSON(stringLiteral: Constants.REALM_PARTITION_VALUE)
then we'll pair each BSON with its key (tuples that will form the Document's key-value entries)
let idDict = ("_id", _id)
let taskDict = ("name", taskName )
let statusDict = ("status", status)
let partitionDict = ("_partitionKey", partition)
finally we'll create and write a Document to MongoDB Realm
let myTaskDoc = Document(dictionaryLiteral: idDict, taskDict, statusDict, partitionDict)
collection.insertOne(myTaskDoc, { result in
print(result)
})
The above code answers the question
Can the MongoDB Realm Swift API still just insert a struct into an Atlas collection, like in Stitch?
Yes! In fact, I did it without using a struct at all. Keep in mind that the above code is really for simplicity; obviously, leveraging structs or classes will provide a lot more flexibility and encapsulation of your data. This would be accomplished by crafting a new class or struct (or updating an existing one) so that, when data is to be written, a function sets up the properties in AnyBSON format and returns a Document.
class TaskClass {
    var name = ""

    func toBson() -> Document {
        let taskName = AnyBSON(stringLiteral: self.name)
        let taskDict = ("name", taskName)
        let myTaskDoc = Document(dictionaryLiteral: taskDict) // abbreviated
        return myTaskDoc
    }
}
then, to use it:
let doc = myTask.toBson()
collection.insertOne(doc, { result in
print(result)
})
This could easily be expanded on by using Codable and JSONEncoder() to store JSON data directly in your structs, and then send BSON data to the server.
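As a rough sketch of that idea (my own illustration, assuming the RealmSwift typealias Document = Dictionary<String, AnyBSON?>): encode the struct with JSONEncoder, deserialize it with JSONSerialization, and recursively wrap the values in AnyBSON cases. This glosses over dates, binary data, and integer-versus-double distinctions, since JSONSerialization bridges every number to NSNumber:

import Foundation
import RealmSwift

// Recursively wrap a JSONSerialization object graph in AnyBSON.
// NSNumber bridging makes the Bool/number distinction approximate.
func toAnyBSON(_ value: Any) -> AnyBSON? {
    switch value {
    case let string as String:
        return .string(string)
    case let number as NSNumber:
        return CFGetTypeID(number) == CFBooleanGetTypeID()
            ? .bool(number.boolValue) : .double(number.doubleValue)
    case let dict as [String: Any]:
        return .document(Document(uniqueKeysWithValues: dict.map { ($0.key, toAnyBSON($0.value)) }))
    case let array as [Any]:
        return .array(array.map(toAnyBSON))
    default:
        return .null
    }
}

// Encode any Encodable value, then convert the resulting JSON to a Document.
func toDocument<T: Encodable>(_ value: T) throws -> Document? {
    let data = try JSONEncoder().encode(value)
    let json = try JSONSerialization.jsonObject(with: data, options: [])
    guard case .document(let doc)? = toAnyBSON(json) else { return nil }
    return doc
}

A struct converted this way can then go straight into collection.insertOne(...) just like the Document built by hand above.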
We are trying to put data into the Apache Ignite cache using the REST API provided by Ignite: https://apacheignite.readme.io/docs/rest-api.
I want to know if I can pass JSON data to it from a Spring Boot application. I tried the basic GET and PUT and they work fine, but how do I pass a larger amount of data from JSON?
Example JSON:
{
  "Name": "CYZ",
  "Id": 12345,
  "Dept": "xyz"
}
P.S. The JSON is for understanding purposes only; I will tweak the answer as per my requirements.
Thanks.
You can use a ConnectorMessageInterceptor to convert the JSON representation into a Java object.
You can plug it in via the ConnectorConfiguration#messageInterceptor property of the Ignite configuration; the ConnectorConfiguration itself is set through the IgniteConfiguration#connectorConfiguration property.
I have set up Cygnus with the MySQL agent, and data is now being stored in the MySQL server.
As a next step, we need to show reports on the historical data (saved by Cygnus in MySQL) in a GUI. These may be based on filters such as data for today, for the last week or month, by area, etc.
My questions are
Do we need to create our own APIs for fetching data from MySQL and returning it to the GUI application?
Or is there any GUI component (WireCloud or any other) that can directly fetch the Cygnus data, or do we need to create a custom GUI ourselves?
I found the STH-Comet component (https://github.com/telefonicaid/fiware-sth-comet); will it fit our requirements for fetching the Cygnus MySQL data?
Do let me know if FIWARE has any other components matching our requirements.
Regards,
Krishan
Is it possible to create an H2OFrame using H2O's REST API, and if so, how?
My main objective is to utilize models stored inside H2O so as to make predictions on external H2OFrames.
I need to be able to generate those H2OFrames externally from JSON (I suppose by calling an endpoint).
I read the API documentation but couldn't find any clear explanation. I believe the closest endpoints are /3/CreateFrame, which creates random data, and /3/ParseSetup, but I couldn't find any reliable tutorial.
Currently there is no REST API endpoint to directly convert a JSON record into a Frame object. Thus, the only way forward for you would be to first write the data to a CSV file, then upload it to h2o using POST /3/PostFile, and then parse it using POST /3/Parse.
(Note that the POST /3/PostFile endpoint is not in the documentation. This is because it is handled separately from the other endpoints. Basically, it's an endpoint that takes an arbitrary file in the body of the POST request and saves it as a "raw data file".)
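For illustration, here is a rough sketch of that upload step from a Swift client, based only on the description above; the host, port, and the destination_frame query parameter are assumptions on my part, not taken from the docs:

import Foundation

// Send the raw CSV bytes as the POST body, as described above.
// Host, port, and destination_frame are illustrative assumptions.
guard let csvData = try? Data(contentsOf: URL(fileURLWithPath: "/tmp/scoring.csv")),
      let url = URL(string: "http://localhost:54321/3/PostFile?destination_frame=scoring.csv") else {
    fatalError("could not read the CSV or build the URL")
}
var request = URLRequest(url: url)
request.httpMethod = "POST"
request.httpBody = csvData

URLSession.shared.dataTask(with: request) { data, _, error in
    if let error = error {
        print("Upload failed: \(error)")
    } else if let data = data, let body = String(data: data, encoding: .utf8) {
        print(body) // the response names the raw frame; follow up with POST /3/Parse
    }
}.resume()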
The same job is much easier to do in Python or in R: for example in order to upload some dataset into h2o for scoring, you only need to say
df = h2o.H2OFrame(plaindata)
I am already doing something similar in my project. Since there is no REST API endpoint to directly convert a JSON record into a Frame object, I am doing the following:
1. For model building: first transfer and write the data into a CSV file on the machine where the h2o server or cluster is running. Then import the data into h2o using POST /3/ImportFiles, and then parse, build a model, etc. I am using the h2o-bindings APIs (RESTful APIs) for this. Since I have large data (hundreds of MBs to a few GBs), I use /3/ImportFiles instead of POST /3/PostFile, as the latter is slow for uploading large data.
2. For model scoring or prediction: I am using the model MOJO and POJO. In your case, use POST /3/PostFile as suggested by @Pasha if your data is not large. However, as per the h2o documentation, it is advisable to use the MOJO or POJO for model scoring or prediction in a production environment rather than calling the h2o server/cluster directly. MOJOs and POJOs are thread-safe, so you can scale them using multithreading for concurrent requests.
I'm working on a project that sends us CDA documents, so I have to parse and extract the data using Mirth Connect as the interface engine and save it into Mirth Results (a provider portal). Any idea of the best way to approach this, i.e. how to configure or code a Mirth channel that loads the content of a CCD document, extracts fields from it, and populates the channel variables map?
I happened to come across this question. I think you will have gotten the answer by now; anyway, let me share what I have, as it may help you in the future.
The CDA document that you fetch is basically parsed as an XML document. You can either use the MDHT libraries or simple JavaScript, which the Mirth tool supports.
It is not always mandatory to go for external libraries. I have worked with the CCDA document structure, which is parsable with the JavaScript supported by Mirth.
It depends on what process you follow. If it's only one CDA document you are parsing, then fetch it in the inbound template; the CDA document will contain a lot of sections, like patient demographics, vital signs, and other fields. To provide a generalized solution, we have to loop through the segments rather than referring to fixed indexes inside the array.
Example of looping through the care plan section:
function parseCarePlan(section) {
    var careplan = [];
    var entries = section['entry'];
    // Loop over each entry, collect the fields we need, and keep them
    // in the careplan array instead of referencing fixed indexes.
    for (var j = 0; j < entries.length(); j++) {
        var entry = entries[j];
        var care = {};
        care.date = entry['procedure']['effectiveTime']['center']['#value'].toString();
        care.text = entry['procedure']['code']['text'].toString();
        care.code = entry['procedure']['code']['#code'].toString();
        careplan.push(care);
    }
    return careplan;
}
We then create JSON data from the XML (CDA) and insert the resulting JSON objects into the database.
If you have a license for the Mirth Results software, you will have a support contract to help you answer questions like this. In fact, the Mirth Results software has very good native support for CCDA documents; Mirth did very well at the Connectathon in 2014 with their CCDA library.
You can use the MDHT library (https://www.projects.openhealthtools.org/sf/projects/mdht/) to parse CCDA: create a jar that parses your CCD document, then call that jar's public method, which accepts the document and returns JSON, from Mirth Connect JavaScript.
It's working for me.