Azure Batch Job: Is there a code example for batch jobs created by using User Subscription - azure-batch

All the examples I have seen so far use "Batch service" mode. Is there any code example for "User Subscription" mode?
https://github.com/Azure/azure-batch-samples/tree/master/CSharp/ArticleProjects

Submitting jobs to User Subscription Azure Batch accounts is no different from submitting them to Batch Service accounts; only the way you obtain credentials for the service client differs. You can view this article for more information - there are C#/.NET code samples at the bottom of the article.
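For illustration, here is a minimal PowerShell sketch (Az.Batch module) of the same idea; the account, resource group, pool, and job names are placeholders. For a User Subscription account you get the BatchAccountContext from Get-AzBatchAccount, which uses your Azure AD sign-in, rather than Get-AzBatchAccountKey, which uses shared-key authentication; the job submission itself is identical.

# Sign in and get an Azure AD-authenticated Batch account context
# (placeholder names; User Subscription accounts require Azure AD authentication)
Connect-AzAccount
$context = Get-AzBatchAccount -AccountName "mybatchaccount" -ResourceGroupName "myresourcegroup"

# Submit a job against an existing pool, exactly as you would for a Batch Service account
$poolInfo = New-Object Microsoft.Azure.Commands.Batch.Models.PSPoolInformation
$poolInfo.PoolId = "mypool"
New-AzBatchJob -Id "myjob" -PoolInformation $poolInfo -BatchContext $context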

Related

Azure Data Factory - Batch Accounts - BlobAccessDenied

I'm trying to use a custom activity in Data Factory to run, in a Batch account pool, a Python batch script stored in Blob Storage.
I followed the Microsoft tutorial https://learn.microsoft.com/en-us/azure/batch/tutorial-run-python-batch-azure-data-factory
My problem is that when I execute the ADF pipeline, the activity fails:
When I check in the Batch Explorer tool, I see this BlobAccessDenied message:
Depending on the execution, it happens for all the ADF reference files and also for my batch file.
I have linked the Storage Account to the Batch Account.
I'm new to this and I'm not sure of what I must do to solve this.
Thank you in advance for your help.
I tried to reproduce the issue and it is working fine for me.
Please check the following points while creating the pipeline.
Check that you have pasted the storage account connection string at line 6 in the main.py file.
You need to create a Blob Storage linked service and an Azure Batch linked service in Azure Data Factory (ADF). These linked services are required in the "Azure Batch" and "Settings" tabs when you configure the ADF pipeline. Follow the steps below to create the linked services (a scripted alternative is sketched after these steps).
In the ADF portal, click the "Manage" icon on the left and then click +New to create the Blob Storage linked service.
Search for "Azure Blob Storage" and then click Continue.
Fill in the required details for your Storage account, test the connection, and then click Apply.
Similarly, search for the Azure Batch linked service (under the Compute tab).
Fill in the details of your Batch account, use the previously created Storage linked service under "Storage linked service name", and then test the connection. Click Save.
Later, when you create the custom ADF pipeline, provide the Batch linked service name under the "Azure Batch" tab.
Under the "Settings" tab, provide the Storage linked service name and the other required information. In "Folder Path", provide the blob folder path that contains the main.py and iris.csv files.
Once this is done, you can Validate, Debug, Publish and Trigger the pipeline. The pipeline should run successfully.
Once the pipeline has run successfully, you will see the iris_setosa.csv file in your output blob.
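If you would rather script this configuration than click through the portal, the following is a rough PowerShell sketch using Set-AzDataFactoryV2LinkedService; the resource names, keys, and URIs are placeholders, and the JSON is meant to mirror what the portal creates for you.

# Placeholder resource names and keys - adjust to your environment
$rg = "myresourcegroup"
$df = "mydatafactory"

# Blob Storage linked service definition (same values you would enter in the portal)
$blobLinkedService = @'
{
  "name": "AzureBlobStorageLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<storage account>;AccountKey=<storage key>"
    }
  }
}
'@
Set-Content -Path "BlobStorageLinkedService.json" -Value $blobLinkedService

# Batch linked service definition, referencing the storage linked service above
$batchLinkedService = @'
{
  "name": "AzureBatchLinkedService",
  "properties": {
    "type": "AzureBatch",
    "typeProperties": {
      "accountName": "<batch account>",
      "accessKey": { "type": "SecureString", "value": "<batch access key>" },
      "batchUri": "https://<batch account>.<region>.batch.azure.com",
      "poolName": "<pool name>",
      "linkedServiceName": { "referenceName": "AzureBlobStorageLinkedService", "type": "LinkedServiceReference" }
    }
  }
}
'@
Set-Content -Path "BatchLinkedService.json" -Value $batchLinkedService

# Deploy both linked services to the data factory
Set-AzDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df -Name "AzureBlobStorageLinkedService" -DefinitionFile "BlobStorageLinkedService.json"
Set-AzDataFactoryV2LinkedService -ResourceGroupName $rg -DataFactoryName $df -Name "AzureBatchLinkedService" -DefinitionFile "BatchLinkedService.json"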

Log Azure Data Factory Pipeline run events to Azure App Insights

Is there a way to publish ADF pipeline run events with status to App Insights?
To my knowledge, you could use a Web Activity in ADF to invoke the Application Insights REST API after your main activities have executed (or use an Execute Pipeline activity to run your root pipeline and get its status or output), and then send that information to the Application Insights REST API.
For more details, please refer to this document: https://www.ben-morris.com/using-azure-data-factory-with-the-application-insights-rest-api/
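As a rough illustration, the JSON body such a Web Activity would POST might look like the one built in this PowerShell sketch. It targets the public Application Insights ingestion endpoint; the instrumentation key, event name, and property values are placeholders that in ADF you would populate from system variables such as the pipeline name and run status.

# Placeholder values - substitute your instrumentation key and pipeline details
$instrumentationKey = "<your App Insights instrumentation key>"

$body = @{
    name = "Microsoft.ApplicationInsights.Event"
    time = (Get-Date).ToUniversalTime().ToString("o")
    iKey = $instrumentationKey
    data = @{
        baseType = "EventData"
        baseData = @{
            ver        = 2
            name       = "ADFPipelineRun"
            properties = @{
                pipelineName = "MyPipeline"
                runStatus    = "Succeeded"
            }
        }
    }
} | ConvertTo-Json -Depth 5

# The same call a Web Activity would make (method POST, header Content-Type: application/json)
Invoke-RestMethod -Method Post -Uri "https://dc.services.visualstudio.com/v2/track" -ContentType "application/json" -Body $body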

Cannot sign in to Azure services within an Azure function

I need to scale an Azure SQL database twice a day.
I created an Azure function
and set up the identity access.
This is the function body:
Now, when I run the function I get these two errors:
Error 1
Error 2
I understand why the second error ('this.Client.SubscriptionId' cannot be null) occurs, because just two lines below it I can see that the user has no subscription.
But I don't understand why the
Set-AzContext -SubscriptionId '< GUID-SUBSCRIPTION >'
command generates the first error:
Set-AzContext : Please provide a valid tenant or a valid subscription
I have already checked, and the tenantID is the one that contains the function.
The same goes for the subscriptionId: it is the one that contains the function.
Should I assign a role to the function?
Should I use a different authentication method?
Below I will show you how to achieve this by using Azure Automation. I am describing a scenario where the requirement is to have the database at S2 (50 DTUs) from 7 A.M. to 7 P.M. and in the Basic edition (5 DTUs) for the rest of the hours. This activity has to be performed daily.
Step 1: Create your Automation account
To do the automation, you first need to create an Automation account. Add the details for the Automation account and proceed with the creation.
Step 2: Create a runbook under the Automation account you created
Click the Add a runbook icon and create the runbook with PowerShell Workflow as the runbook type.
Step 3: Publish the runbook
Copy the script into the runbook and publish it.
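A minimal sketch of what such a runbook could look like, assuming a PowerShell Workflow runbook, the parameters described in Step 6, and the SQL admin credential from Step 4; it changes the edition and service objective with a T-SQL ALTER DATABASE statement run against the master database:

workflow Set-SqlDatabaseEdition
{
    param
    (
        [parameter(Mandatory=$true)] [string] $ServerName,
        [parameter(Mandatory=$true)] [string] $DatabaseName,
        [parameter(Mandatory=$true)] [string] $Edition,
        [parameter(Mandatory=$true)] [string] $PerfLevel,
        [parameter(Mandatory=$true)] [string] $CredentialName
    )

    # SQL server admin credential stored in the Automation account (Step 4)
    $Credential = Get-AutomationPSCredential -Name $CredentialName

    inlinescript
    {
        # Copy workflow variables into the inline script scope
        $Cred      = $Using:Credential
        $Server    = $Using:ServerName
        $Database  = $Using:DatabaseName
        $Edition   = $Using:Edition
        $PerfLevel = $Using:PerfLevel

        $UserId   = $Cred.UserName
        $Password = $Cred.GetNetworkCredential().Password

        # Connect to the master database of the logical server
        $Connection = New-Object System.Data.SqlClient.SqlConnection
        $Connection.ConnectionString = "Server=tcp:$Server.database.windows.net,1433;Database=master;User ID=$UserId;Password=$Password;Encrypt=True;"
        $Connection.Open()

        # Change the edition and performance level (the scale operation completes asynchronously)
        $Command = $Connection.CreateCommand()
        $Command.CommandText = "ALTER DATABASE [$Database] MODIFY (EDITION = '$Edition', SERVICE_OBJECTIVE = '$PerfLevel')"
        $Command.ExecuteNonQuery() | Out-Null

        $Connection.Close()
    }
}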
Step 4: Create a credential for the runbook
Create a credential with the SQL server admin user name and password.
Step 5: Schedule the runbook
You can decide on the times at which you want to scale your database up and down, but you need to create a separate runbook for each of the options.
Step 6: Configure the parameter settings
Configure the parameters for the runbook: the server name, database name, edition, performance level, and the credential.
Edition: Basic, Standard, Premium
Perf Level: Basic, S0, S1, S2, P1, P2, P3
In this case I am downscaling my database from S1 to S0.
It successfully downgraded to S0.
For more information, please read this article.

Add reply URL to Azure Active Directory register app via command line

I have an Azure Active Directory app and it has various reply URLs. I've been adding reply URLs manually in the Azure portal: AAD -> registered app -> Settings -> Reply URLs.
My goal is to be able to run an Azure Pipelines task that retrieves the reply URL I need from an Azure App Service (which I know how to do) and adds it to the reply URL list of the registered app in AAD with a command, using either an Azure CLI, Azure PowerShell, or PowerShell task from the Azure Pipelines task list.
If there's another way of doing it with another task, I'm open to suggestions.
This is what I tried:
This is the log/debug output:
I guess that a better question is:
How can I give privileges to an Azure CLI task from Azure DevOps to achieve the task from the previous problem?
Your question has changed a bit after your edit, so I've tried to revise and answer both parts, i.e. adding reply URLs through a script, and something to possibly help with the privileges issue:
Adding Reply URLs to your application through PowerShell script
Make use of the application object's ReplyUrls list and the Set-AzureADApplication command. Here's a quick sample script:
# Requires the AzureAD module and a signed-in session
Connect-AzureAD

# ObjectId for the application from App Registrations in your Azure AD
$appObjectId = "<Your Application Object Id>"
$app = Get-AzureADApplication -ObjectId $appObjectId

# Reply URL to add
$newURL = "https://mynewurl"

# Set-AzureADApplication replaces the whole ReplyUrls collection,
# so append the new URL to the existing list rather than passing it alone
$replyURLList = $app.ReplyUrls
$replyURLList.Add($newURL)
Set-AzureADApplication -ObjectId $app.ObjectId -ReplyUrls $replyURLList
Assigning the correct privileges for executing the script
To execute your script as part of a pipeline, this article provides very detailed step-by-step instructions: Set up continuous deployment in Azure Pipelines
I would point you to option 1 in the article, which talks about creating a separate application/service principal for executing the script. Once you do that, you can assign the required privileges to this service principal, which will be used to execute the script, and resolve your current issue of insufficient privileges.
Screenshots of the important parts from the article:
For step h, you can follow the first link to register the application from the Azure portal.
Once you have the separate application/service principal created for executing the script, go to its Settings > Required permissions.
"Windows Azure Active Directory" should already be available in the list of APIs (if not, you can click the Add button to add it).
Pick the appropriate privilege under Application permissions.
Make sure you go through admin consent by clicking the "Grant permissions" button at the end of this process.

REST to Azure SQL database integration service

Hi
I have a simple scenario where I have an on-premises system that hosts a REST API. Based on the data from that REST API, I want to fill data into an Azure SQL database using some type of synchronization job. I'm just unsure of the best method to do this. Can one use Azure Data Factory for this? What other services can do the job?
Under Azure App Service there is a background task service called Azure Web Jobs.
Here are links to help you get started:
Azure App Service Overview: https://azure.microsoft.com/en-us/documentation/articles/app-service-value-prop-what-is/
Overview from another website: http://www.informit.com/articles/article.aspx?p=2423911
Azure Web Jobs introduction: https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/
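For illustration, a scheduled WebJob can be as simple as a script that pulls from the REST API and writes the results into the Azure SQL database. Here is a minimal PowerShell sketch; the API endpoint, connection string, table, and column names are placeholders, and reaching an on-premises API from Azure may additionally require something like App Service Hybrid Connections or a VPN.

# Placeholder endpoint, connection string, and schema - adjust to your environment
$apiUrl = "https://my-onprem-host/api/items"
$connectionString = "Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=<password>;Encrypt=True;"

# Pull the current data from the REST API
$items = Invoke-RestMethod -Uri $apiUrl -Method Get

# Write each item into the Azure SQL database
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()

foreach ($item in $items)
{
    $command = $connection.CreateCommand()
    $command.CommandText = "INSERT INTO dbo.Items (Id, Name) VALUES (@Id, @Name)"
    $command.Parameters.AddWithValue("@Id", $item.id)     | Out-Null
    $command.Parameters.AddWithValue("@Name", $item.name) | Out-Null
    $command.ExecuteNonQuery() | Out-Null
}

$connection.Close()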