How to execute a Cosmos DB stored procedure with parameters via PowerShell

I'm looking for a PowerShell script or REST API call to execute a Cosmos DB stored procedure with a partition key value.

You can use the REST API to execute stored procedures:
https://{databaseaccount}.documents.azure.com/dbs/{db-id}/colls/{coll-id}/sprocs/{sproc-name}
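For reference, here is a minimal sketch of executing a stored procedure through that endpoint from PowerShell, assuming master-key auth (the account, database, collection, sproc name, parameters, and partition key value below are all placeholders):

    # Placeholder values - replace with your own.
    $account   = "mycosmosaccount"
    $masterKey = "<primary-key-from-portal>"
    $dbId      = "mydb"
    $collId    = "mycoll"
    $sprocName = "mysproc"

    # Build the master-key authorization signature for the REST call.
    $resourceLink = "dbs/$dbId/colls/$collId/sprocs/$sprocName"
    $date = [DateTime]::UtcNow.ToString("r").ToLowerInvariant()
    $stringToSign = "post`nsprocs`n$resourceLink`n$date`n`n"
    $hmac = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key = [Convert]::FromBase64String($masterKey)
    $sig = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))
    $auth = [Uri]::EscapeDataString("type=master&ver=1.0&sig=$sig")

    $headers = @{
        "Authorization"                = $auth
        "x-ms-date"                    = $date
        "x-ms-version"                 = "2018-12-31"
        # Partition key value the sproc executes against, as a JSON array.
        "x-ms-documentdb-partitionkey" = '["myPartitionKeyValue"]'
    }

    # The request body is a JSON array of the sproc's input parameters.
    Invoke-RestMethod -Method Post `
        -Uri "https://$account.documents.azure.com/$resourceLink" `
        -Headers $headers `
        -Body '["param1", 2]' `
        -ContentType "application/json"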

There is no native means of interacting with Cosmos DB's data plane via PowerShell. There are three options you can explore. One is calling the REST API directly from PowerShell, as shown in the other answer. Your other options:
You can use the PowerShell REST API sample from the .NET SDK GitHub repo. However, this requires authenticating via the REST API mentioned above, which can be a bit cumbersome.
You can create your own custom PowerShell cmdlet in C#/.NET and then call it from your PS script. This may take longer than the sample above but is easier to write and maintain. It also lets you implement whatever the stored procedure was doing directly in C# using the .NET SDK, which can further improve maintainability.

Related

For Azure Data Factories, is there a way to 'Validate all' using PowerShell rather than the GUI?

A working Azure Data Factory (ADF) exists that contains pipelines with activities that depend on database tables.
The definition of a database table changes.
The next time the pipeline runs, it fails.
Of course we can set up something so it fails gracefully but ...
I need to proactively execute a scheduled PowerShell script that iterates through all ADFs (iterating is easy) to do the equivalent of the 'Validate all' (validating is impossible?) functionality that the GUI provides.
I do realise that the Utopian CI/CD DevOps environment I dream about will one day, in the next year or so, achieve this some other way.
I need the automated validation method today - not in a year!
I've looked at what I think are all of the PowerShell cmdlets available, and short of somehow deleting and redeploying each ADF (fraught with danger) I can't find a simple method to validate an Azure Data Factory via PowerShell.
Thanks in advance
In the ".Net" SDK, each of the models has a "Validate()" method. I have not yet found anything similar in the Powershell commands.
In my experience, the (GUI) validation is not foolproof. Some things are only tested at runtime.
I know it has been a while and you said you didn't want the validation to work in an year - but after a couple of years we finally have both the Validate all and Export ARM template features from the Data Factory user experience via a publicly available npm package #microsoft/azure-data-factory-utilities. The full guidance can be found on this documentation.
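A minimal sketch of driving that package from PowerShell, assuming Node.js is installed (the folder path, subscription ID, and factory name are placeholders, and the package version is illustrative - check npm for the current one):

    # One-time setup: a package.json that exposes the package's CLI as an npm script.
    '{
      "scripts":      { "build": "node node_modules/@microsoft/azure-data-factory-utilities/lib/index" },
      "dependencies": { "@microsoft/azure-data-factory-utilities": "^1.0.0" }
    }' | Set-Content package.json
    npm install

    # Validate all resources in the factory's Git folder; the final argument is the
    # factory's full Azure resource ID (all values here are placeholders).
    npm run build validate "C:\repos\adf" "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myRG/providers/Microsoft.DataFactory/factories/myFactory"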

Query an Azure database using an API

I host my database on Azure. I would like to search data in a table in that database. I am trying to use B4i, and the tech help there said I need to use REST APIs. I am pretty sure I need to use OData. I have the auth token, but I am not sure if this is even possible.
In order to query Azure SQL with an API, you need to add a layer between it and the destination. As mentioned in this question, OData is a specification that can be implemented fairly easily, as there are plenty of libraries that will take care of the bulk of the code for you.
As far as where to host the API, you have several options within Azure, the most common being App Services, Azure Functions, and Logic Apps.
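Once that layer exists, calling it from a client is a one-liner. A minimal sketch, assuming a hypothetical OData endpoint (https://myapi.azurewebsites.net/odata/Customers) and a bearer token already in hand:

    # The endpoint, entity set, and column names below are hypothetical placeholders.
    $token = '<your-auth-token>'

    # OData exposes querying through options such as $filter and $top
    # (the backtick escapes stop PowerShell expanding them as variables).
    $uri = "https://myapi.azurewebsites.net/odata/Customers?`$filter=City eq 'Seattle'&`$top=10"

    $result = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $token" }
    $result.value | Format-Table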

U-SQL execution and referencing another script

Could you help with the following:
How can we execute a U-SQL script stored in ADL Store using ADF? What is the standard practice for storing scripts?
Currently I don't see a way to reference one script from another. That would make script execution simpler, because I could build a deep chain where ScriptA refers to ScriptB and so on, and submitting a single script would be sufficient since it would automatically invoke the scripts it depends on.
Please point me to documentation with recommendations for better partitioning/indexing schemas, and to performance improvement tips/tricks.
This was just asked and answered here: Execute U-SQL script in ADL storage from Data Factory in Azure
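If you just need to run a script that already sits in ADL Store from outside ADF, here is a minimal PowerShell sketch using the AzureRM Data Lake cmdlets (account names and paths are placeholders; Submit-AdlJob reads a local file, so the script is downloaded first):

    # Assumes Login-AzureRmAccount has run and the AzureRM.DataLakeStore /
    # AzureRM.DataLakeAnalytics modules are installed. All names are placeholders.
    Export-AdlStoreItem -Account "myadlstore" -Path "/scripts/ScriptA.usql" -Destination "$env:TEMP\ScriptA.usql"

    # Submit the script to the Data Lake Analytics account and wait for it to finish.
    $job = Submit-AdlJob -Account "myadlanalytics" -Name "Run ScriptA" -ScriptPath "$env:TEMP\ScriptA.usql"
    Wait-AdlJob -Account "myadlanalytics" -JobId $job.JobId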
U-SQL offers you a metadata service with procedures and functions. So instead of chaining files, you can register your reusable script components as procedures and functions.
Take a look at the performance tuning slides at http://www.slideshare.net/MichaelRys. If you have access to SQLPASS or TechReady presentation recordings, there are videos of that presentation available as well.

Use Azure to GET from a RESTful API

I would like to use Azure to retrieve JSON data from a REST API and then store that data in a table. Data retrieval would occur daily, and a parameter would be passed to the API to restrict the results to the prior day's data.
Which Azure component/mechanism should I use for calling the api?
The data would be the foundation for a data warehouse. Should I use an Azure SQL table or Azure Table storage?
I have recently begun exploring Azure and am not sure how to do this.
I look forward to feedback.
Thank you.
Take a look at Azure Functions. You can create an Azure Function that is invoked periodically; it has input bindings for different sources (or you can add some C# code to read from a URL) and can then place the results into an Azure database.
Here is an example of an Azure Function that sends JSON to a stored procedure:
https://www.codeproject.com/Articles/1169531/Sending-events-from-Azure-Event-Hub-to-Azure-SQL-D
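As a rough sketch of that shape using a timer-triggered PowerShell function (the schedule lives in function.json; the API URL, query parameter, and table name are hypothetical placeholders):

    # run.ps1 of a timer-triggered PowerShell Azure Function.
    param($Timer)

    # Restrict the API call to the prior day's data via a (hypothetical) query parameter.
    $date = (Get-Date).AddDays(-1).ToString('yyyy-MM-dd')
    $data = Invoke-RestMethod -Uri "https://api.example.com/records?date=$date"

    # Load each record into Azure SQL (requires the SqlServer module; real code
    # would take connection details from app settings and use parameterised SQL).
    foreach ($row in $data) {
        $json = ($row | ConvertTo-Json -Compress) -replace "'", "''"
        Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' -Database 'mydb' `
            -Username 'loader' -Password $env:SQL_PASSWORD `
            -Query "INSERT INTO dbo.DailyData (Payload) VALUES ('$json')"
    }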

How to import or sync data into Neo4j?

I have a REST API in front of a PostgreSQL database; the API was built using the Django REST Framework (Python). I have access to the PostgreSQL database and the API, but I'm not allowed to modify the Django/Python code.
My first approach is to make, more or less, an HTTP POST request via a trigger every time a new record is created in PostgreSQL. I found this, but it seems like it's not the best way to do what I need.
On the Neo4j side, I'm thinking of a periodic HTTP GET request to the API from within a Cypher function, but no such thing exists.
You should use the APOC procedures for integrating with other DBs via JDBC (apoc.load.jdbc); PostgreSQL is supported.
You can also use APOC procedures like apoc.periodic.schedule or apoc.periodic.countdown to periodically execute a Cypher query in the background.