PowerShell: appending to an existing object breaks the JSON format

I'm trying to append to an array inside a JSON object, but after appending, the output is no longer valid JSON.
I have a JSON string, convert it to a nested custom object, and then append new objects to the array, which is where the formatting breaks.
Any help is appreciated.
Below is the code:
$json = @'
{
"scope": {
"entities": [],
"matches": [
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
}
]
}
}
'@
# Convert from JSON to a nested custom object.
$obj = $json | ConvertFrom-Json
# Append new objects to the array.
$obj.scope.matches += [pscustomobject] @{ type = 'null'
managementZoneId = 'null'
mzId = 'null'
tags = '[{ "context": "CONTEXTLESS", "key": "DetectedName", "value": "naae22let1ci1.ktb.kontoorbrands.com" }]'
tagCombination = 'AND' }
# Convert back to JSON.
$obj | ConvertTo-Json -Depth 100
The output is:
{
"scope": {
"entities": [],
"matches": [
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
},
{
"type": "null",
"managementZoneId": "null",
"mzId": "null",
"tags": "[{ \"context\": \"CONTEXTLESS\", \"key\": \"DetectedName\", \"value\": \"naae22let1ci1.ktb.kontoorbrands.com\" }]",
"tagCombination": "AND"
}
]
}
}
Desired output:
$obj | ConvertTo-Json -Depth 100
{
"scope": {
"entities": [
],
"matches": [
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
},
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
}
]
}
}

As you are converting the JSON to an object, you need to add the new items as objects too, not as JSON strings.
Try
$json = @'
{
"scope": {
"entities": [],
"matches": [
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
}
]
}
}
'@
# Convert from JSON to a nested custom object.
$obj = $json | ConvertFrom-Json
# Append new objects to the array.
$obj.scope.matches += [PsCustomObject]@{
type = $null
managementZoneId = $null
mzId = $null
# the next element is an array of PsCustomObjects, hence the need for @()
tags = @([PsCustomObject]@{
context = "CONTEXTLESS"
key = "DetectedName"
value = "naae22let1ci1.ktb.kontoorbrands.com"
})
tagCombination = 'AND'
}
# Convert back to JSON.
$obj | ConvertTo-Json -Depth 100
Output:
{
"scope": {
"entities": [
],
"matches": [
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": ""
}
],
"tagCombination": "AND"
},
{
"type": null,
"managementZoneId": null,
"mzId": null,
"tags": [
{
"context": "CONTEXTLESS",
"key": "DetectedName",
"value": "naae22let1ci1.ktb.kontoorbrands.com"
}
],
"tagCombination": "AND"
}
]
}
}
P.S. PowerShell does not produce 'pretty' json. If you need to convert it to properly spaced json, see my function Format-Json
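The objects-not-strings principle above is not specific to PowerShell; here is a minimal Python sketch of the same idea for comparison (the values are illustrative, not taken from the question):

```python
import json

doc = json.loads('{"scope": {"matches": []}}')

# Append a real object (a dict), not a JSON-encoded string, so the
# serializer emits nested structure rather than one escaped string.
doc["scope"]["matches"].append({
    "type": None,
    "tags": [{"context": "CONTEXTLESS", "key": "DetectedName", "value": "example"}],
    "tagCombination": "AND",
})

print(json.dumps(doc, indent=2))
```

If the `tags` value were appended as the string `'[{...}]'` instead, it would round-trip as one escaped string, which is exactly the broken output shown in the question.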

Related

Convert JSON input to CSV output file using PowerShell

I have a JSON file as given below.
{
"count": 100,
"value": [
{
"id": 264871,
"release": {
"id": 36803,
"name": "Test_1020_SP_1",
"url": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/releases/36803",
"artifacts": [
{
"sourceId": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1/efa62ff5-7dc3-4b5a-b7e1-29a319550b28:12e54cbd-881f-43de-8442-4454ffba61fb",
"type": "PackageManagement",
"alias": "_Project.scripts.release",
"definitionReference": {
"definition": {
"id": "12e54cbd-881f-43de-8442-4454ffba61fb",
"name": "Project.scripts.release"
},
"feed": {
"id": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1/efa62ff5-7dc3-4b5a-b7e1-29a319550b28",
"name": "Project_release_scripts"
},
"files": {
"id": "**",
"name": "**"
},
"packageType": {
"id": "nuget",
"name": "NuGet"
},
"skipextract": {
"id": "",
"name": ""
},
"version": {
"id": "2022.10.11.1",
"name": "2022.10.11.1"
},
"view": {
"id": "",
"name": ""
}
},
"isPrimary": true,
"isRetained": false
}
],
"webAccessUri": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?releaseId=36803\u0026_a=release-summary",
"_links": {
"self": {
"href": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/releases/36803"
},
"web": {
"href": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?releaseId=36803\u0026_a=release-summary"
}
}
},
"releaseDefinition": {
"id": 6777,
"name": "Test_Deploy",
"path": "\\Test",
"projectReference": {
"id": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1",
"name": null
},
"url": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/definitions/6777",
"_links": {
"self": {
"href": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/definitions/6777"
},
"web": {
"href": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?definitionId=6777"
}
}
}
},
{
"id": 264870,
"release": {
"id": 36800,
"name": "Test_2698_SP_1",
"url": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/releases/36800",
"artifacts": [
{
"sourceId": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1/efa62ff5-7dc3-4b5a-b7e1-29a319550b28:12e54cbd-881f-43de-8442-4454ffba61fb",
"type": "PackageManagement",
"alias": "_Project.scripts.release",
"definitionReference": {
"definition": {
"id": "12e54cbd-881f-43de-8442-4454ffba61fb",
"name": "Project.scripts.release"
},
"feed": {
"id": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1/efa62ff5-7dc3-4b5a-b7e1-29a319550b28",
"name": "Project_release_scripts"
},
"files": {
"id": "**",
"name": "**"
},
"packageType": {
"id": "nuget",
"name": "NuGet"
},
"skipextract": {
"id": "",
"name": ""
},
"version": {
"id": "2022.10.11.1",
"name": "2022.10.11.1"
},
"view": {
"id": "",
"name": ""
}
},
"isPrimary": true,
"isRetained": false
}
],
"webAccessUri": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?releaseId=36800\u0026_a=release-summary",
"_links": {
"self": {
"href": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/releases/36800"
},
"web": {
"href": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?releaseId=36800\u0026_a=release-summary"
}
}
},
"releaseDefinition": {
"id": 6777,
"name": "Test_Deploy",
"path": "\\Test",
"projectReference": {
"id": "6d203219-63b3-4b1c-b4fe-fa9172d74fb1",
"name": null
},
"url": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/definitions/6777",
"_links": {
"self": {
"href": "https://vsrm.dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_apis/Release/definitions/6777"
},
"web": {
"href": "https://dev.azure.com/TestProject/6d203219-63b3-4b1c-b4fe-fa9172d74fb1/_release?definitionId=6777"
}
}
}
}
]
}
I want to convert the above JSON file to CSV format with headings as given below.
I tried using the below command:
(Get-Content -Path "C:\Test.json") | ConvertFrom-Json | Select-Object -expand value | ConvertFrom-Csv | Out-File C:\Test.csv
But the values are not coming out in the expected format, and the output has no column names.
This should work, though it will generate a column per property (including inner arrays).
I would not expect this to work if the JSON had more deeply nested arrays.
# function to flatten a PSCustomObject into a hashtable of primitives
# this can probably be cleaned up
function flatten($customObj) {
$flatHashtable = @{}
foreach ($entry in @($customObj.PSObject.Properties)) {
$value = $entry.Value
if ($value -is [Array]) {
$valueArray = $value
$size = $value.Count
$keyTemplate = '/{0}/'
} else {
$valueArray = @($value)
$size = 1
$keyTemplate = '/'
}
for ($i = 0; $i -lt $size; $i++) {
$item = $valueArray[$i]
$keySeparator = $keyTemplate -f $i
if ($item -is [PSObject]) {
$subentries = flatten $item # recursive
foreach ($subentry in $subentries.GetEnumerator()) {
$flattenedKey = $entry.Name + $keySeparator + $subentry.Key
if (!($flatHashtable.ContainsKey($flattenedKey))) {
$flatHashtable.Add($flattenedKey, $subentry.Value)
}
}
} else {
if (!($flatHashtable.ContainsKey($entry.Name))) {
$flatHashtable.Add($entry.Name, $item)
}
}
}
}
return $flatHashtable
}
$jsonText = '<your json here>'
$jsonObj = $jsonText | ConvertFrom-Json
# ignore "count" property
$releases = $jsonObj.value
$rows = @()
$releases | ForEach-Object { $rows += (flatten $_) }
$csvColumns = $rows.Keys | Group-Object | Select-Object -ExpandProperty Name | Sort-Object
$rows | Select-Object -Property $csvColumns | ConvertTo-Csv | Set-Content '<path to csv file>'
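The recursive flattening approach is language-agnostic; a compact Python sketch of the same idea follows (the path-separator convention and sample keys are illustrative, not part of the original answer):

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts/lists into one dict keyed by '/'-joined paths."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}/"))
    elif isinstance(obj, list):
        for index, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{index}/"))
    else:
        # Scalar leaf: record it under the accumulated path.
        flat[prefix.rstrip("/")] = obj
    return flat

row = flatten({"id": 1, "release": {"name": "r1", "artifacts": [{"type": "NuGet"}]}})
# row == {'id': 1, 'release/name': 'r1', 'release/artifacts/0/type': 'NuGet'}
```

Each flattened row can then become one CSV line, with the union of all keys as the column headers, which mirrors what the PowerShell version does with `$rows.Keys`.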

Druid: Using multiple dimensions for a dimension extraction function

Is it possible to use multiple dimensions for a dimension extraction function?
Something like:
{
"type": "extraction",
"dimension": ["dimension_1", "dimension_2"],
"outputName": "new_dimension",
"outputType": "STRING",
"extractionFn": {
"type": "javascript",
"function": "function(x, y){ // do sth with both x and y to return the result }"
}
}
I do not think this is possible. However, you can achieve something like it by first "merging" the two dimensions using a virtualColumn and then applying an extraction function. You can then split the values again.
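The combine-then-split trick is easy to see in isolation. Here is a hypothetical Python sketch mirroring the JavaScript extraction function used in the query below:

```python
# Merge two dimension values into one string with a separator the
# values themselves won't contain, mimicking Druid's concat() virtual column.
def combine(os_name, browser, sep=";"):
    return f"{os_name}{sep}{browser}"

# The extraction side: split the combined value back apart and
# build the derived dimension from both parts.
def cool_browser(combined, sep=";"):
    parts = combined.split(sep)
    return f"{parts[0]} with cool {parts[1]}"

print(cool_browser(combine("Android", "Chrome Mobile")))
# Android with cool Chrome Mobile
```

Note the separator must be a character that cannot occur in either dimension value, otherwise the split is ambiguous.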
Example query (using https://github.com/level23/druid-client)
$client = new DruidClient([
"router_url" => "https://your.druid"
]);
// Build a groupBy query.
$builder = $client->query("hits")
->interval("now - 1 hour/now")
->select("os_name")
->select("browser")
->virtualColumn("concat(os_name, ';', browser)", "combined")
->sum("hits")
->select("combined", "coolBrowser", function (ExtractionBuilder $extractionBuilder) {
$extractionBuilder->javascript("function(t) { parts = t.split(';'); return parts[0] + ' with cool ' + parts[1] ; }");
})
->where("os_name", "!=", "")
->where("browser", "!=", "")
->orderBy("hits", "desc")
;
// Execute the query.
$response = $builder->groupBy();
Example result:
+--------+--------------------------------------------------+--------------------------+---------------------------+
| hits | coolBrowser | browser | os_name |
+--------+--------------------------------------------------+--------------------------+---------------------------+
| 418145 | Android with cool Chrome Mobile | Chrome Mobile | Android |
| 62937 | Windows 10 with cool Edge | Edge | Windows 10 |
| 27956 | Android with cool Samsung Browser | Samsung Browser | Android |
| 9460 | iOS with cool Safari | Safari | iOS |
+--------+--------------------------------------------------+--------------------------+---------------------------+
Raw native druid json query:
{
"queryType": "groupBy",
"dataSource": "hits",
"intervals": [
"2021-10-15T11:25:23.000Z/2021-10-15T12:25:23.000Z"
],
"dimensions": [
{
"type": "default",
"dimension": "os_name",
"outputType": "string",
"outputName": "os_name"
},
{
"type": "default",
"dimension": "browser",
"outputType": "string",
"outputName": "browser"
},
{
"type": "extraction",
"dimension": "combined",
"outputType": "string",
"outputName": "coolBrowser",
"extractionFn": {
"type": "javascript",
"function": "function(t) { parts = t.split(\";\"); return parts[0] + \" with cool \" + parts[1] ; }",
"injective": false
}
}
],
"granularity": "all",
"filter": {
"type": "and",
"fields": [
{
"type": "not",
"field": {
"type": "selector",
"dimension": "os_name",
"value": ""
}
},
{
"type": "not",
"field": {
"type": "selector",
"dimension": "browser",
"value": ""
}
}
]
},
"aggregations": [
{
"type": "longSum",
"name": "hits",
"fieldName": "hits"
}
],
"virtualColumns": [
{
"type": "expression",
"name": "combined",
"expression": "concat(os_name, ';', browser)",
"outputType": "string"
}
],
"context": {
"groupByStrategy": "v2"
},
"limitSpec": {
"type": "default",
"columns": [
{
"dimension": "hits",
"direction": "descending",
"dimensionOrder": "lexicographic"
}
]
}
}

Replace values in JSON using PowerShell

I have a JSON file in which I would like to change values and then save it as JSON again.
Values that need to be updated:
domain
repo
[
{
"name": "[concat(parameters('factoryName'), '/LS_New')]",
"type": "Microsoft.DataFactory/factories/linkedServices",
"apiVersion": "2018-06-01",
"properties": {
"description": "Connection",
"annotations": [],
"type": "AzureDatabricks",
"typeProperties": {
"domain": "https://url.net",
"accessToken": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "LS_vault",
"type": "LinkedServiceReference"
},
"secretName": "TOKEN"
},
"newClusterNodeType": "Standard_DS4_v2",
"newClusterNumOfWorker": "2:10",
"newClusterSparkEnvVars": {
"PYSPARK_PYTHON": "/databricks/python3/bin/python3"
},
"newClusterVersion": "7.2.x-scala2.12"
}
},
"dependsOn": [
"[concat(variables('factoryId'), '/linkedServices/LS_evaKeyVault')]"
]
},
{
"name": "[concat(parameters('factoryName'), '/PIP_Log')]",
"type": "Microsoft.DataFactory/factories/pipelines",
"apiVersion": "2018-06-01",
"properties": {
"description": "Unzip",
"activities": [
{
"name": "Parse",
"description": "This notebook",
"type": "DatabricksNotebook",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"notebookPath": "/dataPipelines/main_notebook.py",
"baseParameters": {
"businessgroup": {
"value": "@pipeline().parameters.businessgroup",
"type": "Expression"
},
"project": {
"value": "@pipeline().parameters.project",
"type": "Expression"
}
},
"libraries": [
{
"pypi": {
"package": "cytoolz"
}
},
{
"pypi": {
"package": "log",
"repo": "https://b73gxyht"
}
}
]
},
"linkedServiceName": {
"referenceName": "LS_o",
"type": "LinkedServiceReference"
}
}
],
"parameters": {
"businessgroup": {
"type": "string",
"defaultValue": "test"
},
"project": {
"type": "string",
"defaultValue": "log-analytics"
}
},
"annotations": []
},
"dependsOn": [
"[concat(variables('factoryId'), '/linkedServices/LS_o')]"
]
}
]
I tried using regex, but I am only able to update one value:
<valuesToReplace>
<valueToReplace>
<regExSearch>(\/PIP_Log[\w\W]*?[pP]roperties[\w\W]*?[lL]ibraries[\w\W]*?[pP]ypi[\w\W]*?"repo":\s)"(.*?[^\\])"</regExSearch>
<replaceWith>__PATValue__</replaceWith>
</valueToReplace>
<valueToReplace>
<regExSearch>('\/LS_New[\w\W]*?[pP]roperties[\w\W]*?[tT]ypeProperties[\w\W]*?"domain":\s"(.*?[^\\])")</regExSearch>
<replaceWith>__LSDomainName__</replaceWith>
</valueToReplace>
</valuesToReplace>
Here is the PowerShell code. The loop goes through all the values that are to be replaced.
I tried using a dynamic variable in Select-String and looping, but it doesn't seem to work:
foreach($valueToReplace in $configFile.valuesToReplace.valueToReplace)
{
$regEx = $valueToReplace.regExSearch
$replaceValue = '"' + $valueToReplace.replaceWith + '"'
$matches = [regex]::Matches($json, $regEx)
$matchExactValueRegex = $matches.Value | Select-String -Pattern """repo\D:\s*(.*)" | % {$_.Matches.Groups[1].Value}
$updateReplaceValue = $matches.Value | Select-String -Pattern "repo\D:\s\D__(.*)__""" | % {$_.Matches.Groups[1].Value}
$updateReplaceValue = """$patValue"""
$json1 = [regex]::Replace($json, $matchExactValueRegex , $updateReplaceValue)
$matchExactValueRegex1 = $matches.Value | Select-String -Pattern """domain\D:\s*(.*)" | % {$_.Matches.Groups[1].Value}
$updateReplaceValue1 = $matches.Value | Select-String -Pattern "domain\D:\s\D__(.*)__""" | % {$_.Matches.Groups[1].Value}
$updateReplaceValue1 = """$domainURL"""
$json = [regex]::Replace($json1, $matchExactValueRegex1 , $updateReplaceValue1)
}
else
{
Write-Warning "Inactive config value"
}
$json | Out-File $armFileWithReplacedValues
What am I missing?
You should not peek and poke in serialized files (e.g. JSON files) directly. Instead, deserialize the file with the ConvertFrom-Json cmdlet, make your changes to the object, and serialize it again with the ConvertTo-Json cmdlet:
$Data = ConvertFrom-Json $Json
$Data[0].properties.typeproperties.domain = '__LSDomainName__'
$Data[1].properties.activities.typeproperties.libraries[1].pypi.repo = '__PATValue__'
$Data | ConvertTo-Json -Depth 9 | Out-File $armFileWithReplacedValues
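For comparison, the same deserialize-edit-serialize pattern looks like this in Python (the structure is abbreviated from the JSON above; only the `domain` path is shown):

```python
import json

raw = '[{"properties": {"typeProperties": {"domain": "https://url.net"}}}]'
data = json.loads(raw)

# Edit the object graph instead of regex-matching the raw JSON text.
data[0]["properties"]["typeProperties"]["domain"] = "__LSDomainName__"

updated = json.dumps(data, indent=2)
```

The advantage over regex is that the edit cannot accidentally match a similar-looking string elsewhere in the document, and the output is guaranteed to remain valid JSON.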

Powershell Invoke-RestMethod in LogicApps

For this line of code in Powershell I used an HTTP connector in Logic Apps using Joey Cai's advice.
$body_login = @{"method"="login";"username"="qq";"password"="qqq"} | ConvertTo-Json
Now, I have this line of code in Powershell. How do I do the equivalent in LogicApps?
$Conn = Invoke-RestMethod -Method Post $uri_login -Headers $header -Body $body_login
Do I use the same HTTP connector or do I need something else? It's the Invoke-RestMethod syntax that I'm unsure of in Logic Apps.
I will need the output in JSON format, so I can parse it.
Thanks for the first answer. I need to know what to put in the uri, header and body. Here is the rest of the code which I should have provided before.
$baseuri = "https://test"
$header = @{
"Accept" = "text/json"
"Content-Type" = "text/json"
}
$G_header = @{"Accept" = "text/json"}
Write-Output "Login ..."
$uri_login = $baseuri + "SPDEDJSONSERVICE.LOGIN"
$body_login = @{"method"="login";"username"="qqq";"password"="qqq"} | ConvertTo-Json
$Conn = Invoke-RestMethod -Method Post $uri_login -Headers $header -Body $body_login
$SessionID = $conn.sessionID
How do I do the equivalent in LogicApps?
As I mentioned before, use the HTTP connector.
I will need the output in JSON format, so I can parse it.
You could use Compose to work with data in JSON format.
1. Add the Headers/Body you want into a Compose action.
2. Add the Outputs into Parse JSON. Copy the HTTP response Headers/Body info, click "Use sample payload to generate schema", then parse the Headers in it.
3. Use Initialize variable to get the info you want, such as Date.
The result:
With Azure Logic Apps and the built-in HTTP action, you can create automated tasks and workflows that regularly send requests to any HTTP or HTTPS endpoint.
Sign in to the Azure portal. Open your logic app in Logic App Designer.
Under the step where you want to add the HTTP action, select New step.
To add an action between steps, move your pointer over the arrow between steps. Select the plus sign (+) that appears, and then select Add an action.
Under Choose an action, in the search box, enter "http" as your filter. From the Actions list, select the HTTP action.
Select HTTP action
For your scenario you can use Basic Authentication.
This seems to work, but I could not have done it without Joey
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"HTTP_2": {
"inputs": {
"body": {
"FORMAT": "payload",
"FROM": 0,
"GRIDID": "PROP",
"GRIDVIEW": "1",
"HITS": 100,
"ORDERBY": "PR_DATESOLD",
"PROFILE": [
{
"PR_NAME": "G*",
"PR_USER1": "GENERATED"
}
],
"sessionID": "@body('Parse_JSON3')['sessionID']"
},
"headers": {
"Accept": "text/json",
"Content-Type": "text/json"
},
"method": "POST",
"uri": "@variables('uri_DefGrid')"
},
"runAfter": {
"Parse_JSON3": [
"Succeeded"
]
},
"type": "Http"
},
"Initialize_Header": {
"inputs": {
"variables": [
{
"name": "Header",
"type": "string",
"value": "{\"Accept\":\"text/json\",\"Content-Type\":\"text/json\"}"
}
]
},
"runAfter": {
"Initialize_body_login": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_body_DefGrid": {
"inputs": {
"variables": [
{
"name": "body_DefGrid",
"type": "string",
"value": "json(@{body('HTTP_2')})"
}
]
},
"runAfter": {
"HTTP_2": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_body_login": {
"inputs": {
"variables": [
{
"name": "body_login",
"type": "string",
"value": "json(@{triggerBody()})"
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Initialize_uri_DefGrid": {
"inputs": {
"variables": [
{
"name": "uri_DefGrid",
"type": "string",
"value": "https://test/SPDEDMHAPI.GRIDGET"
}
]
},
"runAfter": {
"Initialize_uri_login": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_uri_login": {
"inputs": {
"variables": [
{
"name": "uri_login",
"type": "string",
"value": "https://test/SPDEDJSONSERVICE.LOGIN"
}
]
},
"runAfter": {
"Initialize_Header": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_uri_logout": {
"inputs": {
"variables": [
{
"name": "uri_logout",
"type": "string",
"value": "https://test/SPDEDJSONSERVICE.LOGOUT"
}
]
},
"runAfter": {
"Initialize_body_DefGrid": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Logout": {
"inputs": {
"body": {
"method": "logout",
"sessionID": "@body('Parse_JSON3')['sessionID']"
},
"headers": {
"Accept": "text/json",
"Content-Type": "text/json"
},
"method": "POST",
"uri": "@variables('uri_logout')"
},
"runAfter": {
"Initialize_uri_logout": [
"Succeeded"
]
},
"type": "Http"
},
"Parse_JSON3": {
"inputs": {
"content": "@triggerBody()",
"schema": {
"properties": {
"RLS_WHERE": {
"$id": "#/properties/RLS_WHERE",
"type": "string"
},
"contact": {
"type": "string"
},
"error": {
"type": "string"
},
"errorId": {
"type": "string"
},
"fullName": {
"type": "string"
},
"labellanguage": {
"type": "string"
},
"language": {
"type": "string"
},
"message": {
"type": "string"
},
"params": {
"properties": {
"WOPARTSOPT": {
"type": "string"
}
},
"required": [
"WOPARTSOPT"
],
"title": "The Params Schema",
"type": "object"
},
"role": {
"type": "string"
},
"sessionID": {
"type": "string"
},
"success": {
"type": "string"
},
"userEmail": {
"$id": "#/properties/userEmail",
"type": "string"
}
},
"required": [
"success",
"message",
"sessionID",
"language",
"labellanguage",
"error",
"errorId",
"fullName",
"role",
"contact",
"RLS_WHERE",
"userEmail",
"params"
],
"title": "The Root Schema",
"type": "object"
}
},
"runAfter": {
"Initialize_uri_DefGrid": [
"Succeeded"
]
},
"type": "ParseJson"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {},
"triggers": {
"HTTP": {
"inputs": {
"body": {
"method": "login",
"password": "qqq",
"username": "qqq"
},
"headers": {
"Accept": "text/json",
"Content-Type": "text/json"
},
"method": "POST",
"uri": "https://test/SPDEDJSONSERVICE.LOGIN"
},
"recurrence": {
"frequency": "Minute",
"interval": 4
},
"type": "Http"
}
}
},
"parameters": {}
}

Not able to fetch the individual details from JSON data

"Ns": {
"value": [
{
"Nname": "exa",
"SR": [
{
"name": "port1",
"properties": {
"description": "Allow port1",
"destinationPortRange": "1111",
"priority": 100
}
},
{
"name": "port1_0",
"properties": {
"description": "Allow port1",
"destinationPortRange": "1111",
"priority": 150
}
},
{
"name": "port2",
"properties": {
"description": "Allow 1115",
"destinationPortRange": "1115",
"priority": 100,
}
}
]
}
]
}
I want to assert the details of priority and name but was not able to do it.
Here is what I have implemented:
$Ndetails = $templateProperties.parameters.Ns.value.SR
foreach ($Ndata in $Ndetails) {
$Ndata .properties.destinationPortRange |
Should -BeExactly @('1111','1111','1115')
} 
How to resolve the same using Pester in PowerShell?
You don't need to use foreach for this. You can just use Select-Object. Assuming your JSON is as @Mark Wragg linked in the comments:
$Json = @'
[{
"Ns": {
"value": [{
"Nname": "exa",
"SR": [{
"name": "port1",
"properties": {
"description": "Allow port1",
"destinationPortRange": "1111",
"priority": 100
}
},
{
"name": "port1_0",
"properties": {
"description": "Allow port1",
"destinationPortRange": "1111",
"priority": 150
}
},
{
"name": "port2",
"properties": {
"description": "Allow 1115",
"destinationPortRange": "1115",
"priority": 100
}
}
]
}]
}
}]
'@
$t = $Json | ConvertFrom-Json
Your test file should look like this:
$result = $t.Ns.value.SR.properties.destinationPortRange
it 'destinationPortRange matches' {
$result | Should -BeExactly @('1111','1111','1115')
}
Explanation
Your use of foreach was incorrect, as you compared a single element (also notice that I deleted the unnecessary space)
$Ndata.properties.destinationPortRange
to the array
| Should -BeExactly @('1111','1111','1115')
What you have to do is to compare array to array as in my example.
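The element-vs-array mismatch is the same in any test framework; a tiny Python sketch of why the loop version fails (values taken from the example above):

```python
ports = ["1111", "1111", "1115"]      # the collected destinationPortRange values
expected = ["1111", "1111", "1115"]   # the full expected array

# Comparing one element at a time to the whole expected array never matches...
assert not any(port == expected for port in ports)

# ...while comparing the collected array as a whole does.
assert ports == expected
```

This is why the fix is to gather all values first and assert on the complete collection.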