Azure Batch and API Job Add: upload file into the wd directory

I'm using API Job Add to create one Job with one Task in Azure Batch.
This is my test code:
{
  "id": "20211029-1540",
  "priority": 0,
  "poolInfo": {
    "poolId": "pool-test"
  },
  "jobManagerTask": {
    "id": "task2",
    "commandLine": "cmd /c dir",
    "resourceFiles": [
      {
        "storageContainerUrl": "https://linkToMyStorage/MyProject/StartTask.txt"
      }
    ]
  }
}
To execute the API I'm using Postman, and to monitor the result I'm using Batch Explorer.
The job and its task are created correctly, but the 'wd' folder that is generated automatically is empty.
If I understood correctly, I should see the linked file from storage there, right?
Maybe some other parameter is needed in the JSON body?
Thank you!

A task state of completed does not necessarily indicate success. From your JSON body, you most likely have an error:
"resourceFiles": [
{
"storageContainerUrl": "https://linkToMyStorage/MyProject/StartTask.txt"
}
You've specified storageContainerUrl but given it the URL of a single file; use httpUrl for a single file, or point storageContainerUrl at the container itself. Also ensure you have provided proper permissions (either via SAS or a user-assigned managed identity).
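For reference, a minimal sketch of a corrected resourceFiles entry, assuming you want that single blob downloaded into the task's working directory (the SAS query string and the filePath value are placeholders):
"resourceFiles": [
  {
    "httpUrl": "https://linkToMyStorage/MyProject/StartTask.txt?<sas-token>",
    "filePath": "StartTask.txt"
  }
]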


Pass a list of strings to values in Azure App Service Settings

I have a list of strings in my appsettings.json.
I am using Azure DevOps to deploy my code to Azure.
I am using the release pipeline task "Azure App Service Settings".
I want to pass my list of strings to the above task's parameter.
I have tried the below:
[
{ "name": "test", "value": "["1", "2"]", "slotSetting": false }
]
The release pipeline is giving me the below error:
Error: Application Settings object is not a valid JSON.
How do I pass a list of strings in the Azure App Service Settings task?
Thanks in advance.
Your JSON string is not valid. Possibly you meant this, with the double-quotes escaped?
[
{ "name": "test", "value": "[\"1\", \"2\"]", "slotSetting": false }
]

Create an Azure DevOps pipeline release including all metadata like buildNumber via REST

I'm aware of https://learn.microsoft.com/en-us/rest/api/azure/devops/release/releases/create?view=azure-devops-rest-6.0#uri-parameters and I can create releases via REST, but ...
Problem:
The issue with those releases is that a release triggered via the REST API lacks some predefined variables like Build.BuildNumber - or at least, they are not available in all scopes.
It seems like Build.BuildNumber is available in the pipeline stage, but is missing when the release name format is computed. This means a release name format of Release-$(Build.BuildNumber)($(Release.ReleaseId)) will end up with a blank Build.BuildNumber when created using the payload below.
Details:
My JSON payload:
{
  "definitionId": 9,
  "description": "Test Release",
  "artifacts": [
    {
      "alias": "My Build Artifact",
      "instanceReference": {
        "id": "6989",
        "name": null
      }
    }
  ],
  "isDraft": false,
  "reason": "none",
  "manualEnvironments": null
}
Note that artifacts.instanceReference.id references a valid build (which of course has a buildNumber), so during the release stage Build.BuildNumber is properly populated.
I send the payload via:
curl -X POST -u username:redactedPAT -H "Content-Type: application/json" -d @payload.json https://vsrm.dev.azure.com/redactedCompany/redactedProjectId/_apis/release/releases\?api-version\=6.0
Question:
What is the right way to create a release as if it had been created via the GUI, including all the metadata?
Am I misusing the API somehow, or do I need to manually set the environment variables for this to work (or even somehow via variables)?
Since the buildNumber is available in the stage, could this even be a bug?
The release name is evaluated at compile time, which means it is populated before the pipeline tasks are executed. You can check out the workarounds below to populate the release name.
1. The $(Build.BuildNumber) variable you defined in the release name format will retrieve the name property you pass to the artifact's instanceReference in the request body. So you can pass the buildNumber value to the name property under instanceReference in the request body:
{
  ...
  "artifacts": [
    {
      "alias": "My Build Artifact",
      "instanceReference": {
        "id": "6989",
        "name": "<set the buildNumber here>"
      }
    }
  ],
  ...
}
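For illustration, a hypothetical sketch of wiring this up from the command line: it looks up the buildNumber of build 6989 via the Builds REST API, substitutes it into the payload with jq, and then posts the payload as in the question:
# Look up the buildNumber of the referenced build (6989)
BUILD_NUMBER=$(curl -s -u username:redactedPAT \
  "https://dev.azure.com/redactedCompany/redactedProjectId/_apis/build/builds/6989?api-version=6.0" \
  | jq -r '.buildNumber')

# Inject it into the instanceReference name and create the release
jq --arg bn "$BUILD_NUMBER" '.artifacts[0].instanceReference.name = $bn' payload.json \
  | curl -X POST -u username:redactedPAT -H "Content-Type: application/json" -d @- \
    "https://vsrm.dev.azure.com/redactedCompany/redactedProjectId/_apis/release/releases?api-version=6.0"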
2. Another workaround is to use logging commands to update the release name in a script task.
You can add a script task in your release pipeline stage and run the logging command below to update the release name during the release pipeline execution.
echo "##vso[release.updatereleasename]Release-$(Build.BuildNumber)($(Release.ReleaseId))"
When the task is executed, it will update the release name to your desired format.

Using the Function App connector in ADF - how to override parameters in CI/CD?

I need a workaround pretty quickly - this was a late surprise in the dev process when we added an Azure Function to our development ADF pipeline.
When you use a Function App in ADF V2 and generate the ARM template, it does not parameterize the key references, unlike other linked services. Ugh!
So for CI/CD scenarios, when we deploy, we now have a fixed Function App reference. What we'd like to do is the same as for other linked services - override the key parameters to point to the correct Dev/UAT/Production environment versions of the functions.
I can think of dirty hacks using PowerShell to overwrite it (does PowerShell support ADF functions yet? I don't know - in January it didn't).
Any other ideas on how to override function app linked service settings?
The key parameters are under typeProperties (assuming the function key is in Key Vault):
{ "functionAppUrl": "https://xxx.azurewebsites.net" }
{ "functionKey": { "store": { "referenceName": "xxxKeyVaultLS" } } }
{ "functionKey": { "secretName": "xxxKeyName" } }
Right now these are hard-coded from the UI settings - no parameter and no default.
OK, I eventually got back to this.
The solution looks like a lot, but it is pretty simple.
In my DevOps release, I create a PowerShell task that runs after both the Data Factory ARM template has been deployed and the PowerShell task for deployment.ps1 with the "predeployment=$false" setting has run (see the ADF CI/CD documentation).
I have a JSON file for each environment (dev/uat/prod) in my Git repo (I actually use a separate "common" repo to store scripts apart from the ADF Git repo; its alias in DevOps is "_Common" - you'll see this below in the -File parameter of the script).
The JSON file that replaces the deployed function linked service is a copy of the function linked service JSON in ADF and looks like this for DEV:
(scripts/Powershell/dev.json)
{
  "name": "FuncLinkedServiceName",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "annotations": [],
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://myDEVfunction.azurewebsites.net",
      "functionKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyvault_LS",
          "type": "LinkedServiceReference"
        },
        "secretName": "MyFunctionKeyInKeyvault"
      }
    },
    "connectVia": {
      "referenceName": "MyintegrationRuntime",
      "type": "IntegrationRuntimeReference"
    }
  }
}
...and the PROD file would be like this:
(scripts/Powershell/prod.json)
{
  "name": "FuncLinkedServiceName",
  "type": "Microsoft.DataFactory/factories/linkedservices",
  "properties": {
    "annotations": [],
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://myPRODfunction.azurewebsites.net",
      "functionKey": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "MyKeyvault_LS",
          "type": "LinkedServiceReference"
        },
        "secretName": "MyFunctionKeyInKeyvault"
      }
    },
    "connectVia": {
      "referenceName": "MyintegrationRuntime",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Then in the DevOps pipeline, I use a PowerShell script block that looks like this:
Set-AzureRMDataFactoryV2LinkedService -ResourceGroup "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -File "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/$(varEnvironment).json" -Force
or for Az:
Set-AzDataFactoryV2LinkedService -ResourceGroupName "$(varRGName)" -DataFactoryName "$(varAdfName)" -Name "$(varFuncLinkedServiceName)" -DefinitionFile "$(System.DefaultWorkingDirectory)/_Common/Scripts/Powershell/Converter/$(varEnvironment).json" -Force
Note:
The $(varXxx) values are defined in my pipeline variables, e.g.
varFuncLinkedServiceName = FuncLinkedServiceName
varEnvironment = "DEV", "UAT", "PROD" depending on the target release
-Force is used because the linked service must already exist from the Data Factory ARM deployment, and we then need to force the overwrite of just the function linked service.
Hopefully MSFT will release a Function App linked service that uses parameters, but until then, this has got us moving with the release pipeline.
HTH. Mark.
Update: Added the Az cmdlet version of the AzureRM command and changed to Set- ("New-Az..." worked, but in the new Az module there is only Set- for V2 linked services).

Why does the Google Drive REST API files.list not return all the files?

I am using the following curl command to retrieve all my Google Drive files; however, it only lists a very limited part of all the files. Why?
curl -H "Authorization: Bearer ya29.hereshouldbethemaskedaccesstokenvalue" https://www.googleapis.com/drive/v3/files
Result:
{
  "kind": "drive#fileList",
  "incompleteSearch": false,
  "files": [
    {
      "kind": "drive#file",
      "id": "2fileidxxxxxxxx",
      "name": "testnum",
      "mimeType": "application/vnd.google-apps.folder"
    },
    {
      "kind": "drive#file",
      "id": "1fileidxxxxxxx",
      "name": "test2.txt",
      ...
    }
The token scopes include:
https://www.googleapis.com/auth/drive.file
https://www.googleapis.com/auth/drive.appdata
I'm also facing the same issue using the Android SDK.
Any help would be appreciated.
Results from files.list are paginated -- your response should include a nextPageToken field, and you'll have to make another call for the next page of results. See the documentation for the files.list call. You may want to use one of the client libraries to make this call (see the examples at the bottom of that page).
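For example, a sketch of paging through the results with curl (the access token and page token values are placeholders):
# First page; pageSize is optional (default 100, maximum 1000)
curl -H "Authorization: Bearer $ACCESS_TOKEN" \
  "https://www.googleapis.com/drive/v3/files?pageSize=1000"

# If the response contains a nextPageToken, pass it back to get the next page
curl -H "Authorization: Bearer $ACCESS_TOKEN" \
  "https://www.googleapis.com/drive/v3/files?pageSize=1000&pageToken=$NEXT_PAGE_TOKEN"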
I had the same problem when trying to get the list of files in a Google Drive folder. The folder has more than 5000 files, but the API returned only two of them. The problem is that when files in a folder are shared with "anyone with a link", they aren't actually shared with you until you open them. The owner of the folder must specify you as a viewer.

REST API testing from the command line

I am preparing an SDK, and as of now the SDK does not have a separate CI system.
I want to test some REST endpoints which should be available when the user uses the SDK to create the software and runs it with our framework.
I have written all the manual steps in a shell script and am planning to run the script every few hours via crontab.
Now, for REST endpoint testing, I was thinking of just using curl and checking whether we get data back, but this can turn into a lot of work as we expand the functionality. I looked into the Frisby framework, which kind of suits my needs.
Is there any recommendation for testing REST services when the framework software is started?
Probably swat is exactly what you need. Reasons:
It is a DSL for web and REST service test automation
It uses the curl command line to create HTTP requests
It is both a DSL and a command line tool to run test scenarios written in the DSL
It is configurable both from bash-style scripts and general configs
It is very easy to start with
In your case, curl-based test cases could probably be converted easily into the swat DSL format
(*) disclosure - I am the author of swat.
I have created a very small bash script to test JSON APIs which might be useful. It uses jq and curl as dependencies: curl for making requests and jq for JSON processing. It is only designed to test JSON APIs.
Link: api-test
Every API call you want to run is stored in a JSON file with the format below:
{
  "name": "My API test",
  "testCases": {
    "test_case_1": {
      "path": "/path_1",
      "method": "POST",
      "description": "Best POST api",
      "body": {
        "value": 1
      },
      "header": {
        "X-per": "1"
      }
    }
  },
  "url": "http://myapi.com"
}
To run a test case:
api-test -f test.json run test_case_1
api-test -f test.json run all # run all API calls at once
It will produce output in an organized way:
Running Case: test_case_1
Response:
200 OK
{
"name": "Ram",
"full_name": "Ram Shah"
}
META:
{
"ResponseTime": "0.078919s",
"Size": "235 Bytes"
}
It also supports automated testing of APIs with jq JSON comparison and normal equality/subset comparisons.
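If you prefer to stay with the plain curl approach mentioned in the question, a minimal bash sketch along these lines is also an option (the endpoint, expected status code and JSON field are hypothetical placeholders; jq must be installed):
#!/usr/bin/env bash
# Minimal smoke test: call an endpoint, check the HTTP status and one JSON field.
BASE_URL="http://localhost:8080"   # placeholder for your framework's base URL

code=$(curl -s -o /tmp/health.json -w '%{http_code}' "$BASE_URL/health")
if [ "$code" != "200" ]; then
  echo "FAIL: expected HTTP 200, got $code"
  exit 1
fi

status=$(jq -r '.status' /tmp/health.json)
if [ "$status" != "ok" ]; then
  echo "FAIL: unexpected status field: $status"
  exit 1
fi

echo "PASS: /health returned 200 with status=ok"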