Parse Server: update multiple rows/objects via the REST API

In Parse Server, I have to update multiple rows of a table via the REST API. Is there any way I can achieve this instead of looping over the records and updating each of them?

You're better off using Batch Operations.
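In the REST API, a batch is a single POST to the batch endpoint whose body lists the individual operations (batches are commonly capped, often at around 50 operations, so chunk larger sets). A minimal sketch using a recent JDK's built-in HTTP client; the host, mount path, keys, class name, and object ids below are placeholders:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ParseBatchUpdate {
    public static void main(String[] args) throws Exception {
        // One entry per object to update; class name, object ids, and fields are placeholders.
        String payload = """
            {"requests": [
              {"method": "PUT", "path": "/parse/classes/GameScore/Ed1nuqPvcm", "body": {"score": 100}},
              {"method": "PUT", "path": "/parse/classes/GameScore/Cpl9lrueY5", "body": {"score": 200}}
            ]}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://my-parse-host/parse/batch")) // adjust to your server's mount path
                .header("X-Parse-Application-Id", "myAppId")          // placeholder credentials
                .header("X-Parse-REST-API-Key", "myRestKey")
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // one success/error entry per batched request
    }
}

Note that the "path" values inside the batch body must match the prefix your server is mounted on (e.g. /parse/classes/... or /1/classes/...).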

You can define a function in Cloud Code (Parse.Cloud.define) to perform your update server-side, so the client makes a single call.

Related

REST API call from Copy activity

Hi, I am processing a set of ~50K records from a pipe-delimited flat file in Azure Data Factory and need to invoke a REST API call for each input record. So I am using a ForEach loop to access each record, and inside the loop I am using a Copy activity to invoke the REST API call.
My question is: can I invoke the REST API call in bulk for all the records at once, as the ForEach loop is slowing down the pipeline execution? I want to remove the ForEach loop and also process the API JSON response and store it in an Azure SQL database.
Thanks
You will have to check the pagination properties so that you can decide how much payload you need to return from the source API:
https://learn.microsoft.com/en-us/azure/data-factory/connector-rest?tabs=data-factory#pagination-support
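For intuition, pagination just means the source API hands back one page at a time plus a pointer to the next page; the connector's pagination rules (for example an AbsoluteUrl rule reading $.nextLink) automate a loop like the following sketch. The endpoint and response fields are hypothetical, and the Java is only to illustrate the pattern ADF runs for you:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PaginatedFetch {
    public static void main(String[] args) throws Exception {
        HttpClient http = HttpClient.newHttpClient();
        ObjectMapper mapper = new ObjectMapper();
        String next = "https://api.example.com/records?page=1"; // hypothetical endpoint

        while (next != null) {
            HttpResponse<String> resp = http.send(
                    HttpRequest.newBuilder(URI.create(next)).GET().build(),
                    HttpResponse.BodyHandlers.ofString());
            JsonNode page = mapper.readTree(resp.body());
            // process the page's records here, e.g. hand them to a sink
            JsonNode link = page.get("nextLink"); // the field an AbsoluteUrl rule like $.nextLink would read
            next = (link == null || link.isNull()) ? null : link.asText();
        }
    }
}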
Also, if you need to store the API JSON response in Azure SQL, you can do so with built-in functions like OPENJSON and JSON_VALUE.
More details can be found in this link:
https://learn.microsoft.com/en-us/azure/azure-sql/database/json-features

Load multiple records into MarkLogic Server

How can I upload multiple records from a file into MarkLogic Server using the REST API?
I tried to insert a simple JSON-format file:
[{"Id":100000,"Name":"Dennis"},
{"Id":100001,"Name":"Andrea"},
{"Id":100002,"Name":"Robert"},
{"Id":100003,"Name":"Sara"}]
But it ends up as one single record.
How do I split this into 4 different records?
Thanks in advance,
Y.Prithvi
There isn't an out-of-the-box way to do that split at the moment. Your best bet is to do a client-side split and then do a bulk-write POST with multiple JSON items to /v1/documents.
For the client-side split, you might use something like underscore_cli to do the splitting.
As Dave points out, the easiest approach is to split out the documents on the client and send a multipart/mixed payload.
The alternative is to write a resource service extension to do the split. In MarkLogic 7, the service must be implemented in XQuery. In MarkLogic 8, you will also be able to implement a service in JavaScript.
The Java API bundles an example that illustrates the basic idea of a service that splits documents:
scripts/docsplit.xqy
com.marklogic.client.example.extension.DocumentSplitter
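To make the client-side approach concrete, here is a sketch using the MarkLogic Java Client API (a 4.x-style connection; host, port, and credentials are placeholders): Jackson splits the array, and a single write set bulk-writes one document per record:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.document.DocumentWriteSet;
import com.marklogic.client.document.JSONDocumentManager;
import com.marklogic.client.io.StringHandle;

public class RecordSplitter {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; adjust host, port, and credentials.
        DatabaseClient client = DatabaseClientFactory.newClient(
                "localhost", 8000,
                new DatabaseClientFactory.DigestAuthContext("admin", "admin"));

        String json = """
            [{"Id":100000,"Name":"Dennis"},
             {"Id":100001,"Name":"Andrea"},
             {"Id":100002,"Name":"Robert"},
             {"Id":100003,"Name":"Sara"}]""";

        // Client-side split: each array element becomes its own document.
        JSONDocumentManager docMgr = client.newJSONDocumentManager();
        DocumentWriteSet writeSet = docMgr.newWriteSet();
        for (JsonNode rec : new ObjectMapper().readTree(json)) {
            String uri = "/records/" + rec.get("Id").asText() + ".json";
            writeSet.add(uri, new StringHandle(rec.toString()));
        }
        docMgr.write(writeSet); // one bulk write, four separate documents
        client.release();
    }
}

Each record gets a URI derived from its Id, so the four array elements land as four individually addressable documents.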

Recording GET requests to a table from REST API

I would like to record the various GET requests to my API in a table and use that table as part of the calculation of what to return for future GET requests.
Perhaps the easiest test example would be a GET function that returns the number of GET requests in the last hour.
REST conventions say that GET requests should be safe, i.e. they should only return data, not modify server state.
Do I need to POST the request and then GET the results of the same request?
You can easily achieve that with Node.js.
Save the incoming requests in a JSON file or a database, for example, and have another service return this saved data.
Take a look at Express.
Good luck!
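Whatever the stack, the pattern is the same: recording the request is a server-side side effect, so the GET itself stays safe from the client's point of view, and a second GET endpoint reports on the recorded data; no POST round-trip is needed. A minimal in-memory sketch, using the JDK's built-in HttpServer rather than Express, with hypothetical routes:

import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.time.Instant;
import java.util.concurrent.ConcurrentLinkedQueue;

public class GetRequestCounter {
    // In-memory request log; swap for a database table if you need durability.
    private static final ConcurrentLinkedQueue<Instant> hits = new ConcurrentLinkedQueue<>();

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // The "real" resource: serving it records a timestamp as a side effect.
        server.createContext("/api/data", exchange -> {
            if ("GET".equals(exchange.getRequestMethod())) {
                hits.add(Instant.now());
            }
            respond(exchange, "{\"ok\": true}");
        });

        // The statistics resource: reports GETs seen in the last hour.
        server.createContext("/api/stats", exchange -> {
            Instant cutoff = Instant.now().minus(Duration.ofHours(1));
            hits.removeIf(t -> t.isBefore(cutoff)); // prune entries older than an hour
            respond(exchange, "{\"getsLastHour\": " + hits.size() + "}");
        });

        server.start();
    }

    private static void respond(HttpExchange exchange, String body) throws IOException {
        byte[] bytes = body.getBytes(StandardCharsets.UTF_8);
        exchange.getResponseHeaders().set("Content-Type", "application/json");
        exchange.sendResponseHeaders(200, bytes.length);
        try (OutputStream os = exchange.getResponseBody()) {
            os.write(bytes);
        }
    }
}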

Spring Batch ItemReader to iterate over a REST API call

I have a Spring Batch job which needs to fetch details from a REST API call and process that data on my side. My REST API call mainly has the below parameters:
StartingIdNumber (offset)
PageSize (limit)
P.S.: StartingIdNumber serves the same purpose as a row number or "offset" in this particular API. The API response results are sorted by IdNumber, so by specifying a StartingIdNumber, the API will in turn perform a "where IdNumber >= StartingIdNumber order by IdNumber limit pageSize" in its DB query.
It returns the given number of user details; I need to iterate through all the ids by changing the StartingIdNumber parameter on each request.
I have looked at the existing ItemReader implementations in the Spring Batch framework, which read from a database, XML, etc., but I didn't come across any reader that fits my case. Please suggest a way to iterate through the user details as described above.
Note: if I write my own custom item reader, I have to take care of preserving state (the last processed StartingIdNumber), which is proving challenging.
Does implementing ItemStream serve my purpose? Or is there a better way?
Implementing the ItemStream interface and writing my own custom reader served my purpose. It is now stateful, as required. Thanks.
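For anyone landing here, this is roughly what such a reader can look like; a minimal sketch, assuming recent Java and hypothetical UserDetail / UserDetailClient types standing in for the real REST call:

import java.util.Iterator;
import java.util.List;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;

// Hypothetical record and REST client; stand-ins for the real API.
record UserDetail(long idNumber) {}
interface UserDetailClient {
    List<UserDetail> fetchPage(long startingIdNumber, int pageSize);
}

public class UserDetailReader implements ItemStreamReader<UserDetail> {
    private static final String START_ID_KEY = "startingIdNumber";

    private final UserDetailClient client;
    private final int pageSize;
    private long startingIdNumber = 0L;
    private Iterator<UserDetail> currentPage;

    public UserDetailReader(UserDetailClient client, int pageSize) {
        this.client = client;
        this.pageSize = pageSize;
    }

    @Override
    public void open(ExecutionContext ctx) throws ItemStreamException {
        // On a restart, resume from the offset persisted by update().
        if (ctx.containsKey(START_ID_KEY)) {
            startingIdNumber = ctx.getLong(START_ID_KEY);
        }
    }

    @Override
    public UserDetail read() {
        if (currentPage == null || !currentPage.hasNext()) {
            // GET .../users?startingIdNumber={offset}&pageSize={limit}
            List<UserDetail> page = client.fetchPage(startingIdNumber, pageSize);
            if (page.isEmpty()) {
                return null; // null tells Spring Batch the input is exhausted
            }
            currentPage = page.iterator();
        }
        UserDetail item = currentPage.next();
        startingIdNumber = item.idNumber() + 1; // next fetch starts past this id
        return item;
    }

    @Override
    public void update(ExecutionContext ctx) throws ItemStreamException {
        // Called at each chunk commit; this is what makes the reader restartable.
        ctx.putLong(START_ID_KEY, startingIdNumber);
    }

    @Override
    public void close() throws ItemStreamException {
    }
}

Spring Batch calls update() at every chunk commit, so after a failure the job restarts from the persisted startingIdNumber instead of from page one.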

What is the best practice to handle Multitenant security in Breeze?

I'm developing an Azure application using this stack:
(Client) Angular/Breeze
(Server) Web API/Breeze Server/Entity Framework/SQL Server
With every request I want to ensure that the user actually has the authorization to execute that action using server-side code. My question is how to best implement this within the Breeze/Web API context.
Is the best strategy to:
Modify the Web API controller and try to analyze the contents of the Breeze request before passing it further down the chain?
Modify the EFContextProvider and add an authorization test to every method exposed?
Move the security all into the database layer and make sure that a User GUID and Tenant GUID are required parameters for every query and only return relevant data?
Some other solution, or some combination of the above?
If you are using SQL Azure, then one option is to use Azure Federations to do exactly that.
In very simplistic terms: if you have a TenantId column in a table that stores data from multiple tenants, then before you execute a query like SELECT Col1 FROM Table1, you execute a USE FEDERATION ... statement to restrict the query results to a particular TenantId only, and you don't need to add WHERE TenantId = @TenantId to your query.
USE FEDERATION example: http://msdn.microsoft.com/en-us/library/windowsazure/hh597471.aspx
Note that the use of SQL Azure Federations comes with lots of strings attached when it comes to building a DB schema; one of the best blogs I have found about it is http://blogs.msdn.com/b/cbiyikoglu/archive/2011/04/16/schema-constraints-to-consider-with-federations-in-sql-azure.aspx