What is the best way to convert a JSON object to be inserted in SQL Server - mule-studio

I'm new to Mule Studio and am running a query against a MySQL database, converting the object to JSON and then mapping it to the columns in a SQL Server database that I am pushing the data to.
However, I am not sure about the best way to handle the mapped JSON. Can I insert the JSON directly into SQL Server or does it need to be converted to an object first?

Your question seems to revolve around JSON and database storage.
First you need to decode the JSON data (in PHP there is json_decode); for Mule Studio, look at this documentation. It looks like you should use @JsonAutoDetect.
As for storing it in the database, the data type of each value should match that of the target field.
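
By way of illustration, here is a minimal sketch of that decode-then-insert step outside of Mule itself, assuming Jackson for the JSON decoding and a plain JDBC connection to SQL Server; the customers table, its columns, and the connection string are all hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonToSqlServer {
        public static void main(String[] args) throws Exception {
            // Decode the JSON payload (the Java-side equivalent of PHP's json_decode).
            String json = "{\"name\":\"Alice\",\"email\":\"alice@example.com\"}";
            JsonNode payload = new ObjectMapper().readTree(json);

            // Insert the decoded values through a parameterized statement,
            // so each JSON value lands in a column of a matching data type.
            String url = "jdbc:sqlserver://localhost;databaseName=demo;user=sa;password=secret";
            try (Connection conn = DriverManager.getConnection(url);
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO customers (name, email) VALUES (?, ?)")) {
                ps.setString(1, payload.get("name").asText());
                ps.setString(2, payload.get("email").asText());
                ps.executeUpdate();
            }
        }
    }

In a Mule flow, a JSON transformer and the database connector would typically play these two roles, but the principle is the same: decode first, then bind each value to a column of a matching type.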

Related

How would you achieve local data persistence in Flutter when remote versions of the same data are returned as nested JSON objects?

When the server stores data in a MongoDB database and is accessed through GraphQL, it would be cool if local/cached versions of the same data could be stored similarly - in some sort of local NoSQL data store.
However, from my research it looks like there aren't that many data persistence options available in Flutter, and the best one is SQFLite. If I use SQFLite, though, I have to wrangle different formats of the same data - the nested-object NoSQL/GraphQL format and the "separate objects joined through relations" format of SQL.
Has anyone dealt with this before? Even if you're not using MongoDB/GraphQL in your remote backend, your API likely still returns nested objects which can't be stored as-is in your local SQL DB and can't be used interchangeably with their locally persisted versions.
So how would you deal with this issue and achieve clean syncing of local and remote data without it turning into a mess?
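
The wrangling described above is essentially flattening a nested document into parent and child rows. As a language-neutral sketch of that idea (Java and Jackson here rather than Dart/sqflite; the user-with-posts payload, table names and columns are all hypothetical):

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class FlattenNestedJson {
        public static void main(String[] args) throws Exception {
            // A nested API response: one user with embedded posts.
            String json = "{\"id\":1,\"name\":\"Ada\",\"posts\":"
                    + "[{\"id\":10,\"title\":\"Hello\"},{\"id\":11,\"title\":\"World\"}]}";
            JsonNode user = new ObjectMapper().readTree(json);

            // Parent row for a users(id, name) table.
            System.out.printf("users row: id=%d, name=%s%n",
                    user.get("id").asInt(), user.get("name").asText());

            // Child rows for a posts(id, user_id, title) table,
            // joined back to the parent through user_id.
            for (JsonNode post : user.get("posts")) {
                System.out.printf("posts row: id=%d, user_id=%d, title=%s%n",
                        post.get("id").asInt(), user.get("id").asInt(),
                        post.get("title").asText());
            }
        }
    }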

PostgreSQL Store-and-Forward

I'm building a data-collection RESTful API. External devices will post JSON data to the database server. So my idea is to build it around a store-and-forward approach.
At the moment of the post, it will store the raw JSON data in a table with a timestamp and a processed true/false field.
Later, when (if) the DB server is not under load, it will run some function, trigger, stored procedure, etc. The idea is to process all the JSON data into suitable tables and fields for charting graphs/bars and mapping it over Google Maps later.
So how, and with what, do I run this second processing step when the DB server is not loaded and is free to process the posted JSON data?
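
One common shape for that second step is a job that drains the raw table whenever the server is idle, or simply on a schedule. Below is a rough sketch on the application side, assuming JDBC and Jackson; the raw_events and readings tables and the JSON fields are hypothetical, and the same pass could just as well live inside PostgreSQL as a stored procedure fired by a scheduler such as pg_cron.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class ProcessRawJson {
        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            String url = "jdbc:postgresql://localhost/collector?user=app&password=secret";

            try (Connection conn = DriverManager.getConnection(url)) {
                // Pick up rows that were stored at post time but not yet processed.
                try (PreparedStatement select = conn.prepareStatement(
                             "SELECT id, payload FROM raw_events WHERE processed = false");
                     ResultSet rs = select.executeQuery()) {
                    while (rs.next()) {
                        long id = rs.getLong("id");
                        JsonNode json = mapper.readTree(rs.getString("payload"));

                        // Fan the JSON out into a chart/map-friendly table.
                        try (PreparedStatement insert = conn.prepareStatement(
                                "INSERT INTO readings (device, lat, lon, value) VALUES (?, ?, ?, ?)")) {
                            insert.setString(1, json.get("device").asText());
                            insert.setDouble(2, json.get("lat").asDouble());
                            insert.setDouble(3, json.get("lon").asDouble());
                            insert.setDouble(4, json.get("value").asDouble());
                            insert.executeUpdate();
                        }

                        // Mark the raw row as handled so it is not processed twice.
                        try (PreparedStatement mark = conn.prepareStatement(
                                "UPDATE raw_events SET processed = true WHERE id = ?")) {
                            mark.setLong(1, id);
                            mark.executeUpdate();
                        }
                    }
                }
            }
        }
    }

Detecting "not loaded" in practice usually ends up being a schedule or a queue rather than a live load sensor.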

Inserting .NET object into MongoDB

We have a large application with hundreds of classes/enums, and we want to use MongoDB to store some of these.
The situation is that there is a current system whereby we binary-serialize the .NET object into a field in a SQL database, then deserialize it on demand. What we want is to put the object into Mongo in a way that will allow us to query the object's properties directly (i.e. without having to load the object into memory, deserialize, etc.). This is so we can start to get some analytics from the historic data without having to drastically change the code base.
My question is: is this easily possible? Are there built-in serializers in the C# driver to do this?
I'm also open to answers that propose a better way to do this if what I'm trying to do is inherently wrong.
Update: to be clear, what I'm trying to do is take an object that has been loaded using NHibernate, and insert it into Mongo as a Queryable object. Ultimately, I'll want to load it back into memory at some point too.
MongoDB is basically a store of JSON documents, so if you can serialize your objects in a JSON way, you should be OK to store them in MongoDB, and I assume there are lots of JSON serializers for .NET, so it should be easy to find one.
Once everything is stored as JSON in MongoDB you will be able to query it without any tools other than the ones used to query the database directly.
You can use Simple.Data.MongoDB, a lightweight, dynamic data-access .NET component for MongoDB.
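
To illustrate the store-as-a-document-and-query-its-properties idea the answers describe, here is a sketch with the MongoDB Java driver rather than the C# driver the question is about; the orders data is hypothetical, and the C# driver offers the equivalent through BsonDocument and its class-map serialization.

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import org.bson.Document;

    import static com.mongodb.client.model.Filters.eq;

    public class MongoQueryByProperty {
        public static void main(String[] args) {
            try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
                MongoCollection<Document> orders =
                        client.getDatabase("analytics").getCollection("orders");

                // Store the object as a document; nested properties stay addressable.
                Document order = new Document("orderId", 42)
                        .append("customer", new Document("name", "Acme").append("country", "DE"))
                        .append("total", 199.90);
                orders.insertOne(order);

                // Query a nested property directly, without deserializing whole objects in the app.
                for (Document d : orders.find(eq("customer.country", "DE"))) {
                    System.out.println(d.toJson());
                }
            }
        }
    }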

Data Type for storing document in SQL Server 2008 via Entity Framework

I'm trying to store a document in SQL Server 2008 using the Entity Framework.
I believe I have the code for doing this completed. The problem I'm now facing is which Data Type to use in SQL Server and in my entity model.
'Image' was my first choice, but this causes an "Invalid mapping" error when I update the model. I see that there's no equivalent of 'Image' (going by the Type drop-down in the entity's properties).
So then I tried 'varbinary(MAX)' and I see that this maps to 'binary' in the entity model. However, when I run the code it tells me that the data would be truncated so it stopped. Upon investigation I see that the SQL Server Data Type 'binary' is 8000 bytes long - which is why I chose 'varbinary(MAX)' - so the entity model seems to be reducing/mapping 'varbinary(MAX)' to 'binary'.
Is this right?
If so, what should my Data Types be (in both SQL Server 2008 and in my entity model) please? Any suggestions?
Change the data type in your model to byte[] and it will be all right, dude; if you need more explanation, please leave a comment.
EDIT:
Dude, I had tried this before in LINQ to SQL and this time I tried it in EF. In the conceptual model of your Foo.edmx file the type is Binary (you can open it through the Open With context menu in Visual Studio and then select XML Editor or any other text editor like Notepad), but in the generated file named Foo.designer.cs the data type is Byte[].
And there is no limit like the one you mentioned above.
I tried it with 10,000 bytes and it was inserted successfully without truncating my array. Regarding benchmarks of saving documents in the database versus the file system, I read an article which said that in SQL Server 7 the file system had better performance for retrieving stored data, but that later versions of SQL Server overtook the file system's speed, and it suggested saving documents in SQL Server.
IMHO, if the documents are not too large, I prefer to store them in the DB (NoSQL DBs have great performance here as far as I know):
First: integrity of my data;
Second: better performance (because if a folder holds a large number of files, reading and writing in that folder gradually slows down unless you organize the files into more than one folder, preferably in a tree of folders);
Third: security policies that you can apply to them through your application more easily (you can do this with the file-system approach too, but I think it's easier here);
Fourth: you can benefit from the facilities provided by your DBMS for querying, manipulating, etc., those files
and much more ... :-)
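
For what it's worth, the byte[] / varbinary(MAX) pairing can also be sanity-checked outside of EF with a plain parameterized insert. The sketch below uses Java/JDBC rather than ADO.NET or EF, and the documents table and connection string are hypothetical; the point is only that varbinary(MAX) takes payloads far beyond the 8000-byte cap of binary/varbinary(n).

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class StoreDocument {
        public static void main(String[] args) throws Exception {
            // Read the whole document into a byte array, the same shape the model's byte[] maps to.
            byte[] document = Files.readAllBytes(Paths.get("contract.pdf"));

            String url = "jdbc:sqlserver://localhost;databaseName=demo;user=sa;password=secret";
            try (Connection conn = DriverManager.getConnection(url);
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO documents (file_name, content) VALUES (?, ?)")) {
                ps.setString(1, "contract.pdf");
                // The content column is varbinary(MAX), so payloads over 8000 bytes are fine.
                ps.setBytes(2, document);
                ps.executeUpdate();
            }
        }
    }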
Ideally you should not store documents in the database; instead, store the path to the document in the database, which then points to the physical document on the web server itself (or some other storage, a CDN, etc.).
However, if you must store it in SQL Server (and seeing as you're on SQL 2008),
you should be using FILESTREAM.
And it is supported by EF4 (I believe). It maps to binary.
As I said though, I'm not sure how well this will perform; run some benchmarks, and if it's not performing well, try using the regular ADO.NET/FileStream API.
I still think you should put it on the file system, not in the database (my opinion of course)

How to insert data coming from web service into sql database in iphone?

I am developing an app where I need to insert data coming from a web service into an sqlite3 database. The web service returns XML data with 5 tags. Now, after the XML parsing, how do I insert the parsed data into the SQL database?
How can I code this?
Thanks in advance.
Inserting XML data into an sqlite database is no different than inserting any other data. So there are 3 parts to the problem you are trying to solve:
Calling the web service and getting the data
Parsing the data and populating some object
Inserting that data into the sqlite database, which has columns matching your object structure
NSURLConnection and NSXMLParser are the classes you need to look at for solving the first 2 problems. The third one would be solved using the sqlite library. Without more information about the object structure it is difficult to suggest anything else, but you should find enough documentation on using sqlite if you search around.
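
For that third part, once the parser has produced values, the insert itself is an ordinary parameterized statement. The sketch below is not iOS code (it uses JDBC against sqlite rather than the C sqlite3 API you would call from Objective-C); it only shows the shape of that step, with a hypothetical items table whose five columns mirror the five XML tags.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class InsertParsedRow {
        public static void main(String[] args) throws Exception {
            // Hypothetical values produced by the XML parsing step (one value per tag).
            String[] parsed = {"101", "Widget", "blue", "4.99", "2013-05-01"};

            try (Connection conn = DriverManager.getConnection("jdbc:sqlite:app.db")) {
                try (Statement ddl = conn.createStatement()) {
                    ddl.execute("CREATE TABLE IF NOT EXISTS items "
                            + "(id TEXT, name TEXT, color TEXT, price TEXT, added_on TEXT)");
                }
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO items (id, name, color, price, added_on) VALUES (?, ?, ?, ?, ?)")) {
                    // Bind each parsed value to its column; the driver handles quoting and escaping.
                    for (int i = 0; i < parsed.length; i++) {
                        ps.setString(i + 1, parsed[i]);
                    }
                    ps.executeUpdate();
                }
            }
        }
    }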