Strange JSONPath not able to capture (Gatling/Scala)

I have a response body that looks like this:
[
  {
    "system": "http://snomed.info/sct",
    "code": "735938006",
    "display": "akutt hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "267096005",
    "display": "frontal hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "103011009",
    "display": "godartet anstrengelseshodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "25064002",
    "display": "hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "38823002",
    "display": "hodepine med aura"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "193031009",
    "display": "klasehodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "230465000",
    "display": "migrene med aura uten hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "330007",
    "display": "oksipital hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "54012000",
    "display": "posttraumatisk hodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "4969004",
    "display": "sinushodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "398057008",
    "display": "tensjonshodepine"
  },
  {
    "system": "http://snomed.info/sct",
    "code": "128187005",
    "display": "vaskulær hodepine"
  }
]
and I want to capture the value "hodepine" from the element at index 3 (the fourth object from the top). I use IntelliJ's "Copy JSON Pointer", which gives me this:
/3/display
This does not work when I use it in Gatling/Scala like this:
.check(jsonPath("$../3/display").saveAs("display"))
The error is:
> jsonPath($../3/display).find.exists, found nothing
Any tips on how to obtain the value?

I answered a similar question, and it may be helpful for you.
For your case: $.[3].display
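Wired into a complete Gatling scenario, that looks roughly like the sketch below; the simulation class, base URL, endpoint, and request name are placeholders, and $[3].display is the canonical spelling of the same path:
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class CodesSimulation extends Simulation {
  // Placeholder base URL for the service under test.
  val httpProtocol = http.baseUrl("http://localhost:8080")

  val scn = scenario("lookup display")
    .exec(
      http("get codes")
        .get("/codes")
        // Index 3 of the top-level array; saves "hodepine" as the "display" attribute.
        .check(jsonPath("$[3].display").saveAs("display"))
    )

  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}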


How to host Flutter Webapp on a subfolder?

I'm trying to host a web app inside a subfolder on my host. I uploaded all the contents of the build/web folder, but in the console I get an error. How can I set the URL of my subfolder? Here is my manifest.json:
{
  "name": "pagamento",
  "short_name": "pagamento",
  "start_url": "./pagamenti",
  "display": "standalone",
  "background_color": "#0175C2",
  "theme_color": "#0175C2",
  "description": "A new Flutter project.",
  "orientation": "portrait-primary",
  "prefer_related_applications": false,
  "icons": [
    {
      "src": "icons/Icon-192.png",
      "sizes": "192x192",
      "type": "image/png"
    },
    {
      "src": "icons/Icon-512.png",
      "sizes": "512x512",
      "type": "image/png"
    },
    {
      "src": "icons/Icon-maskable-192.png",
      "sizes": "192x192",
      "type": "image/png",
      "purpose": "maskable"
    },
    {
      "src": "icons/Icon-maskable-512.png",
      "sizes": "512x512",
      "type": "image/png",
      "purpose": "maskable"
    }
  ]
}
but it did not work.
Thank you, and happy 2023 to all.
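One thing worth checking, offered as an assumption since the console error itself isn't shown above: a Flutter web build defaults to a base href of /, so serving from a subfolder usually also requires rebuilding with a matching base href, for example:
flutter build web --base-href /pagamenti/
This sets the <base href="/pagamenti/"> tag in the generated index.html so that scripts and assets resolve relative to the subfolder.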

Importing ARM template for creating ADF resources not creating any

I am new to ADF and ARM. I have a blank Data Factory v2 (TestDataFactory-123Test) which I want to populate from an existing ADF (TestDataFactory-123). I followed, step by step, what is described in the official documentation, Create a Resource Manager template for each environment. The deployment shows as succeeded, but I can't see anything in the new factory. I used the 'Build your own template in the editor' option in the portal to import the existing ARM template. Am I missing anything?
Below is the ARM template I got by exporting the one for TestDataFactory-123:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": {
      "type": "string",
      "metadata": "Data Factory name",
      "defaultValue": "TestDataFactory-123"
    },
    "AzureBlobStorageLinkedService_connectionString": {
      "type": "secureString",
      "metadata": "Secure string for 'connectionString' of 'AzureBlobStorageLinkedService'",
      "defaultValue": "TestDataFactory-123"
    }
  },
  "variables": {
    "factoryId": "[concat('Microsoft.DataFactory/factories/', parameters('factoryName'))]"
  },
  "resources": [
    {
      "name": "[concat(parameters('factoryName'), '/AzureBlobStorageLinkedService')]",
      "type": "Microsoft.DataFactory/factories/linkedServices",
      "apiVersion": "2018-06-01",
      "properties": {
        "annotations": [],
        "type": "AzureBlobStorage",
        "typeProperties": {
          "connectionString": "[parameters('AzureBlobStorageLinkedService_connectionString')]"
        }
      },
      "dependsOn": []
    },
    {
      "name": "[concat(parameters('factoryName'), '/InputDataset')]",
      "type": "Microsoft.DataFactory/factories/datasets",
      "apiVersion": "2018-06-01",
      "properties": {
        "linkedServiceName": {
          "referenceName": "AzureBlobStorageLinkedService",
          "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Binary",
        "typeProperties": {
          "location": {
            "type": "AzureBlobStorageLocation",
            "fileName": "emp.txt",
            "folderPath": "input",
            "container": "adftutorial"
          }
        }
      },
      "dependsOn": [
        "[concat(variables('factoryId'), '/linkedServices/AzureBlobStorageLinkedService')]"
      ]
    },
    {
      "name": "[concat(parameters('factoryName'), '/OutputDataset')]",
      "type": "Microsoft.DataFactory/factories/datasets",
      "apiVersion": "2018-06-01",
      "properties": {
        "linkedServiceName": {
          "referenceName": "AzureBlobStorageLinkedService",
          "type": "LinkedServiceReference"
        },
        "annotations": [],
        "type": "Binary",
        "typeProperties": {
          "location": {
            "type": "AzureBlobStorageLocation",
            "folderPath": "output",
            "container": "adftutorial"
          }
        }
      },
      "dependsOn": [
        "[concat(variables('factoryId'), '/linkedServices/AzureBlobStorageLinkedService')]"
      ]
    },
    {
      "name": "[concat(parameters('factoryName'), '/CopyPipeline')]",
      "type": "Microsoft.DataFactory/factories/pipelines",
      "apiVersion": "2018-06-01",
      "properties": {
        "activities": [
          {
            "name": "CopyFromBlobToBlob",
            "type": "Copy",
            "dependsOn": [],
            "policy": {
              "timeout": "7.00:00:00",
              "retry": 0,
              "retryIntervalInSeconds": 30,
              "secureOutput": false,
              "secureInput": false
            },
            "userProperties": [],
            "typeProperties": {
              "source": {
                "type": "BinarySource",
                "storeSettings": {
                  "type": "AzureBlobStorageReadSettings",
                  "recursive": true
                }
              },
              "sink": {
                "type": "BinarySink",
                "storeSettings": {
                  "type": "AzureBlobStorageWriteSettings"
                }
              },
              "enableStaging": false
            },
            "inputs": [
              {
                "referenceName": "InputDataset",
                "type": "DatasetReference",
                "parameters": {}
              }
            ],
            "outputs": [
              {
                "referenceName": "OutputDataset",
                "type": "DatasetReference",
                "parameters": {}
              }
            ]
          }
        ],
        "annotations": []
      },
      "dependsOn": [
        "[concat(variables('factoryId'), '/datasets/InputDataset')]",
        "[concat(variables('factoryId'), '/datasets/OutputDataset')]"
      ]
    }
  ]
}
The fix was as simple as replacing the 'defaultValue' of the 'factoryName' parameter with the name of the empty data factory, 'TestDataFactory-123Test', rather than the existing one, 'TestDataFactory-123'. Also, I replaced the 'defaultValue' of the 'AzureBlobStorageLinkedService_connectionString' parameter with the actual connection string.
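In other words, the parameters block of the imported template ends up looking like this, where the connection string value shown here is just a placeholder:
"parameters": {
  "factoryName": {
    "type": "string",
    "metadata": "Data Factory name",
    "defaultValue": "TestDataFactory-123Test"
  },
  "AzureBlobStorageLinkedService_connectionString": {
    "type": "secureString",
    "metadata": "Secure string for 'connectionString' of 'AzureBlobStorageLinkedService'",
    "defaultValue": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
  }
}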

AzureDevOps - Different behavior from custom task for Services and Server

I created a custom task using the documentation. It works on Azure DevOps Services, but on Server it gives the error
An error occurred while loading the YAML build pipeline. Value cannot be null. Parameter name: key
My first thought was "which parameter is missing?", so I filled in every available and possible parameter, but the error persisted.
After that I went to the Event Viewer on the machine running Azure DevOps Server and got this error:
Detailed Message: The subscriber Pipelines Check Run: build completed event listener raised an exception while being notified of event Microsoft.TeamFoundation.Build2.Server.BuildCompletedEvent.
Exception Message: Value cannot be null.
Parameter name: definition and repository (type ArgumentNullException)
Exception Stack Trace: at Microsoft.TeamFoundation.Pipelines.Server.Providers.TfsGitProvider.TfsGitConnectionCreator.IsProviderDefinition(IVssRequestContext requestContext, BuildDefinition definition)
at Microsoft.TeamFoundation.Pipelines.Server.Extensions.BuildCompletedEventListener2.HandleCompletedEvent(IVssRequestContext requestContext, IReadOnlyBuildData build, BuildDefinition definition)
at Microsoft.TeamFoundation.Pipelines.Server.Extensions.BuildCompletedEventListener.ProcessEvent(IVssRequestContext requestContext, NotificationType notificationType, Object notificationEvent, Int32& statusCode, String& statusMessage, ExceptionPropertyCollection& properties)
at Microsoft.TeamFoundation.Framework.Server.TeamFoundationEventService.SubscriptionList.Notify(IVssRequestContext requestContext, NotificationType notificationType, Object notificationEventArgs, String& statusMessage, ExceptionPropertyCollection& properties, Exception& exception)
task.json:
{
  "id": "25156245-9317-48e2-bcf4-7dab4c130a3e",
  "name": "ping-pong-build-trigger",
  "friendlyName": "Ping Pong Build Trigger",
  "description": "Randomly trigger builds to find a sequenced build order",
  "helpMarkDown": "https://github.com/brunomartinspro/Ping-Pong-Build-Trigger-AzureDevOps",
  "category": "Build",
  "author": "Bruno Martins (brunomartins.pro)",
  "version": {
    "Major": 1,
    "Minor": 0,
    "Patch": 0
  },
  "instanceNameFormat": "Ping Pong Build Trigger",
  "properties": {
    "mode": {
      "type": "string",
      "description": "Mode to be used",
      "label": "Mode",
      "required": "true"
    },
    "apiKey": {
      "type": "string",
      "label": "PAT",
      "defaultValue": "",
      "description": "Personal Access Token.",
      "required": "true"
    },
    "source": {
      "type": "string",
      "label": "AzureDevOps Project URI",
      "defaultValue": "http://kamina.azuredevops.local/DefaultCollection/Kamina",
      "description": "AzureDevOps Project URI.",
      "required": "true"
    },
    "projectName": {
      "type": "string",
      "label": "AzureDevOps Project Name",
      "defaultValue": "Kamina",
      "description": "AzureDevOps Project Name.",
      "required": "true"
    },
    "sourceBranch": {
      "type": "string",
      "label": "Git Source Branch",
      "defaultValue": "develop",
      "description": "The branch the builds will trigger",
      "required": "true"
    },
    "lastKnownFile": {
      "type": "string",
      "label": "Sequence Location",
      "defaultValue": "",
      "description": "The location of the Build Order.",
      "required": "true"
    },
    "maxErrorCycles": {
      "type": "int",
      "label": "Maximum Error Cycles",
      "defaultValue": 10,
      "description": "The number of fails allowed.",
      "required": "true"
    },
    "infiniteCycles": {
      "type": "string",
      "label": "Infinite Cycles",
      "defaultValue": "false",
      "description": "Infinite Cycles - only ends until everything succeeds.",
      "required": "true"
    }
  },
  "inputs": [
    {
      "name": "mode",
      "type": "string",
      "label": "Mode",
      "defaultValue": "AzureDevOps",
      "helpMarkDown": "Mode to be used.",
      "required": "true"
    },
    {
      "name": "apiKey",
      "type": "string",
      "label": "PAT",
      "defaultValue": "",
      "helpMarkDown": "Personal Access Token.",
      "required": "true"
    },
    {
      "name": "source",
      "type": "string",
      "label": "AzureDevOps Project URI",
      "defaultValue": "http://kamina.azuredevops.local/DefaultCollection/Kamina",
      "helpMarkDown": "AzureDevOps Project URI.",
      "required": "true"
    },
    {
      "name": "projectName",
      "type": "string",
      "label": "AzureDevOps Project Name",
      "defaultValue": "Kamina",
      "helpMarkDown": "AzureDevOps Project Name.",
      "required": "true"
    },
    {
      "name": "sourceBranch",
      "type": "string",
      "label": "Git Source Branch",
      "defaultValue": "develop",
      "helpMarkDown": "The branch the builds will trigger",
      "required": "true"
    },
    {
      "name": "lastKnownFile",
      "type": "string",
      "label": "Sequence Location",
      "defaultValue": "",
      "helpMarkDown": "The location of the Build Order.",
      "required": "true"
    },
    {
      "name": "maxErrorCycles",
      "type": "int",
      "label": "Maximum Error Cycles",
      "defaultValue": 10,
      "helpMarkDown": "The number of fails allowed.",
      "required": "true"
    },
    {
      "name": "infiniteCycles",
      "type": "string",
      "label": "Infinite Cycles",
      "defaultValue": "false",
      "helpMarkDown": "Infinite Cycles - only ends until everything succeeds.",
      "required": "true"
    }
  ],
  "execution": {
    "PowerShell": {
      "target": "ping-pong-build-trigger.ps1",
      "argumentFormat": ""
    }
  }
}
vss-extension.json:
{
  "manifestVersion": 1,
  "id": "ping-pong-build-trigger-task",
  "name": "Ping Pong Build Trigger",
  "version": "1.0.0",
  "publisher": "BrunoMartinsPro",
  "targets": [
    {
      "id": "Microsoft.VisualStudio.Services"
    }
  ],
  "description": "Randomly trigger builds to find a sequenced build order",
  "categories": [
    "Azure Pipelines"
  ],
  "icons": {
    "default": "extensionIcon.png"
  },
  "files": [
    {
      "path": "task"
    }
  ],
  "contributions": [
    {
      "id": "ping-pong-build-trigger",
      "type": "ms.vss-distributed-task.task",
      "targets": [
        "ms.vss-distributed-task.tasks"
      ],
      "properties": {
        "name": "task"
      }
    }
  ]
}
How can I use a custom task in both Services and Server?
The .vsix can be downloaded from the releases page of the GitHub repository: https://github.com/brunomartinspro/Ping-Pong-Build-Trigger-AzureDevOps
Developer Community: https://developercommunity.visualstudio.com/content/problem/715570/server-and-services-have-different-behavior.html
So it appears there is some sort of caching mechanism in the extensions; I needed three Azure DevOps Server instances to debug.
The first one was used for development, the second one also for development but with the extension uninstalled and installed again, and the third one for testing public releases.
I couldn't find the physical directory where the cache gets stored, if there is a cache at all.
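One workaround worth trying, as an assumption rather than something verified in this thread: Azure DevOps caches task content by task id and version, so bumping the version in task.json (and the extension version in vss-extension.json) before repackaging should force the server to pick up the new content:
tfx extension create --manifest-globs vss-extension.json
Here tfx is the cross-platform CLI installed from the tfx-cli npm package.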

GitHub: How to list milestone updates for an issue

How can I list, for a specific ticket, the current milestone and the previously assigned ones, if applicable (plus the date of each milestone update)? I checked the GitHub API (https://developer.github.com/v3/issues/#get-a-single-issue) and I can extract the current milestone, but not the previously assigned ones (if they exist). Any ideas?
Thank you.
You can use the Issues Events API. To test this, I went to a test issue and:
Added milestone new
Added milestone two
After using that endpoint to query the API, I got three events: two with type milestoned and one demilestoned. The endpoint lets you filter for those events specifically, to remove noise on very active issues. Here is an example of the reply (with obfuscated data):
[
  {
    "id": 2193329921,
    "url": "https://api.github.com/repos/myOrg/myRepo/issues/events/2193329921",
    "actor": {
      "login": "myuser",
      "id": 1192590,
      "node_id": "MDQ6VXNlcjExOTI1OTA=",
      "gravatar_id": "",
      "url": "https://api.github.com/users/myuser",
      "html_url": "https://github.com/myuser",
      "followers_url": "https://api.github.com/users/myuser/followers",
      "following_url": "https://api.github.com/users/myuser/following{/other_user}",
      "gists_url": "https://api.github.com/users/myuser/gists{/gist_id}",
      "starred_url": "https://api.github.com/users/myuser/starred{/owner}{/repo}",
      "subscriptions_url": "https://api.github.com/users/myuser/subscriptions",
      "organizations_url": "https://api.github.com/users/myuser/orgs",
      "repos_url": "https://api.github.com/users/myuser/repos",
      "events_url": "https://api.github.com/users/myuser/events{/privacy}",
      "received_events_url": "https://api.github.com/users/myuser/received_events",
      "type": "User",
      "site_admin": true
    },
    "event": "milestoned",
    "commit_id": null,
    "commit_url": null,
    "created_at": "2019-03-11T09:42:00Z",
    "milestone": {
      "title": "new"
    }
  },
  {
    "id": 2193330104,
    "url": "https://api.github.com/repos/myOrg/myRepo/issues/events/2193330104",
    "actor": {
      "login": "myuser",
      "id": 1192590,
      "url": "https://api.github.com/users/myuser",
      "html_url": "https://github.com/myuser",
      "followers_url": "https://api.github.com/users/myuser/followers",
      "following_url": "https://api.github.com/users/myuser/following{/other_user}",
      "gists_url": "https://api.github.com/users/myuser/gists{/gist_id}",
      "starred_url": "https://api.github.com/users/myuser/starred{/owner}{/repo}",
      "subscriptions_url": "https://api.github.com/users/myuser/subscriptions",
      "organizations_url": "https://api.github.com/users/myuser/orgs",
      "repos_url": "https://api.github.com/users/myuser/repos",
      "events_url": "https://api.github.com/users/myuser/events{/privacy}",
      "received_events_url": "https://api.github.com/users/myuser/received_events",
      "type": "User",
      "site_admin": true
    },
    "event": "demilestoned",
    "commit_id": null,
    "commit_url": null,
    "created_at": "2019-03-11T09:42:04Z",
    "milestone": {
      "title": "new"
    }
  },
  {
    "id": 2193330105,
    "url": "https://api.github.com/repos/myOrg/myRepo/issues/events/2193330105",
    "actor": {
      "login": "myuser",
      "url": "https://api.github.com/users/myuser",
      "html_url": "https://github.com/myuser",
      "followers_url": "https://api.github.com/users/myuser/followers",
      "following_url": "https://api.github.com/users/myuser/following{/other_user}",
      "gists_url": "https://api.github.com/users/myuser/gists{/gist_id}",
      "starred_url": "https://api.github.com/users/myuser/starred{/owner}{/repo}",
      "subscriptions_url": "https://api.github.com/users/myuser/subscriptions",
      "organizations_url": "https://api.github.com/users/myuser/orgs",
      "repos_url": "https://api.github.com/users/myuser/repos",
      "events_url": "https://api.github.com/users/myuser/events{/privacy}",
      "received_events_url": "https://api.github.com/users/myuser/received_events",
      "type": "User",
      "site_admin": true
    },
    "event": "milestoned",
    "commit_id": null,
    "commit_url": null,
    "created_at": "2019-03-11T09:42:04Z",
    "milestone": {
      "title": "second"
    }
  }
]
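For reference, a request along these lines reproduces the reply above (the owner, repo, and issue number are placeholders), with the milestone events filtered out client-side using jq:
curl -s -H "Accept: application/vnd.github+json" \
  "https://api.github.com/repos/myOrg/myRepo/issues/1/events" |
  jq '[.[] | select(.event == "milestoned" or .event == "demilestoned")]'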

bq load and I get "Too many positional args - still have [..."

I am attempting to load data from Cloud Storage into a table, and I get the error message from the title when I run the command below:
bq load --skip_leading_rows=1 --field_delimiter='\t' --source_format=CSV projectID:dataset.table gs://bucket/source.txt sku:STRING,variant_id:STRING,title:STRING,category:STRING,description:STRING,buy_url:STRING,mobile_url:STRING,itemset_url:STRING,image_url:STRING,swatch_url:STRING,availability:STRING,issellableonline:STRING,iswebexclusive:STRING,price:STRING,saleprice:STRING,quantity:STRING,coresku_inet:STRING,condition:STRING,productreviewsavg:STRING,productreviewscount:STRING,mediaset:STRING,webindexpty:INTEGER,NormalSalesIndex1:FLOAT,NormalSalesIndex2:FLOAT,NormalSalesIndex3:FLOAT,SalesScore:FLOAT,NormalInventoryIndex1:FLOAT,NormalInventoryIndex2:FLOAT,NormalInventoryIndex3:FLOAT,InventoryScore:FLOAT,finalscore:FLOAT,EDVP:STRING,dropship:STRING,brand:STRING,model_number:STRING,gtin:STRING,color:STRING,size:STRING,gender:STRING,age:STRING,oversized:STRING,ishazardous:STRING,proddept:STRING,prodsubdept:STRING,prodclass:STRING,prodsubclass:STRING,sku_attr_names:STRING,sku_attr_values:STRING,store_id:STRING,store_quantity:STRING,promo_name:STRING,product_badge:STRING,cbl_type_names1:STRING,cbl_type_value1:STRING,cbl_type_names2:STRING,cbl_type_value2:STRING,cbl_type_names3:STRING,cbl_type_value3:STRING,cbl_type_names4:STRING,cbl_type_value4:STRING,cbl_type_names5:STRING,cbl_type_value5:STRING,choice1_name_value:STRING,choice2_name_value:STRING,choice3_name_value:STRING,cbl_is_free_shipping:STRING,isnewflag:STRING,shipping_weight:STRING,masterpath:STRING,accessoriesFlag:STRING,short_copy:STRING,bullet_copy:STRING,map:STRING,display_msrp:STRING,display_price:STRING,suppress_sales_display:STRING,margin:FLOAT
I have also tried putting the schema into a JSON file, and I get the same error message.
As this was too big for a comment, I will post it here.
I wonder what happens if you set the schema file to have this content:
[{"name": "sku", "type": "STRING"},
{"name": "variant_id", "type": "STRING"},
{"name": "title", "type": "STRING"},
{"name": "category", "type": "STRING"},
{"name": "description", "type": "STRING"},
{"name": "buy_url", "type": "STRING"},
{"name": "mobile_url", "type": "STRING"},
{"name": "itemset_url", "type": "STRING"},
{"name": "image_url", "type": "STRING"},
{"name": "swatch_url", "type": "STRING"},
{"name": "availability", "type": "STRING"},
{"name": "issellableonline", "type": "STRING"},
{"name": "iswebexclusive", "type": "STRING"},
{"name": "price", "type": "STRING"},
{"name": "saleprice", "type": "STRING"},
{"name": "quantity", "type": "STRING"},
{"name": "coresku_inet", "type": "STRING"},
{"name": "condition", "type": "STRING"},
{"name": "productreviewsavg", "type": "STRING"},
{"name": "productreviewscount", "type": "STRING"},
{"name": "mediaset", "type": "STRING"},
{"name": "webindexpty", "type": "INTEGER"},
{"name": "NormalSalesIndex1", "type": "FLOAT"},
{"name": "NormalSalesIndex2", "type": "FLOAT"},
{"name": "NormalSalesIndex3", "type": "FLOAT"},
{"name": "SalesScore", "type": "FLOAT"},
{"name": "NormalInventoryIndex1", "type": "FLOAT"},
{"name": "NormalInventoryIndex2", "type": "FLOAT"},
{"name": "NormalInventoryIndex3", "type": "FLOAT"},
{"name": "InventoryScore", "type": "FLOAT"},
{"name": "finalscore", "type": "FLOAT"},
{"name": "EDVP", "type": "STRING"},
{"name": "dropship", "type": "STRING"},
{"name": "brand", "type": "STRING"},
{"name": "model_number", "type": "STRING"},
{"name": "gtin", "type": "STRING"},
{"name": "color", "type": "STRING"},
{"name": "size", "type": "STRING"},
{"name": "gender", "type": "STRING"},
{"name": "age", "type": "STRING"},
{"name": "oversized", "type": "STRING"},
{"name": "ishazardous", "type": "STRING"},
{"name": "proddept", "type": "STRING"},
{"name": "prodsubdept", "type": "STRING"},
{"name": "prodclass", "type": "STRING"},
{"name": "prodsubclass", "type": "STRING"},
{"name": "sku_attr_names", "type": "STRING"},
{"name": "sku_attr_values", "type": "STRING"},
{"name": "store_id", "type": "STRING"},
{"name": "store_quantity", "type": "STRING"},
{"name": "promo_name", "type": "STRING"},
{"name": "product_badge", "type": "STRING"},
{"name": "cbl_type_names1", "type": "STRING"},
{"name": "cbl_type_value1", "type": "STRING"},
{"name": "cbl_type_names2", "type": "STRING"},
{"name": "cbl_type_value2", "type": "STRING"},
{"name": "cbl_type_names3", "type": "STRING"},
{"name": "cbl_type_value3", "type": "STRING"},
{"name": "cbl_type_names4", "type": "STRING"},
{"name": "cbl_type_value4", "type": "STRING"},
{"name": "cbl_type_names5", "type": "STRING"},
{"name": "cbl_type_value5", "type": "STRING"},
{"name": "choice1_name_value", "type": "STRING"},
{"name": "choice2_name_value", "type": "STRING"},
{"name": "choice3_name_value", "type": "STRING"},
{"name": "cbl_is_free_shipping", "type": "STRING"},
{"name": "isnewflag", "type": "STRING"},
{"name": "shipping_weight", "type": "STRING"},
{"name": "masterpath", "type": "STRING"},
{"name": "accessoriesFlag", "type": "STRING"},
{"name": "short_copy", "type": "STRING"},
{"name": "bullet_copy", "type": "STRING"},
{"name": "map", "type": "STRING"},
{"name": "display_msrp", "type": "STRING"},
{"name": "display_price", "type": "STRING"},
{"name": "suppress_sales_display", "type": "STRING"},
{"name": "margin", "type": "FLOAT"}]
If you save it, say, in a file named "schema.json" and run the command:
bq load --skip_leading_rows=1 --field_delimiter='\t' --source_format=CSV projectID:dataset.table gs://bucket/source.txt schema.json
Do you still get the same error?
Cherba nailed it. The typo was in my batch file, which included an extra parameter to the load command. Thanks for all your time and consideration.
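For anyone else hitting this message: after its flags, bq load takes at most three positional arguments (destination table, source, and an optional schema), so any stray extra token, such as a duplicated word in a wrapper script, produces "Too many positional args". A minimal working shape, reusing the placeholders from above:
bq load --skip_leading_rows=1 --field_delimiter='\t' --source_format=CSV \
  projectID:dataset.table gs://bucket/source.txt ./schema.json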