ERROR: function jsonb_array_elements_text(jsonb[]) does not exist - postgresql

I'm having a hard time traversing and querying elements of a jsonb[] column.
CREATE TABLE foo (
    id uuid PRIMARY KEY,
    work_experience jsonb[] NOT NULL
);
INSERT INTO foo (id, work_experience)
VALUES (
'b4e942a0-49b4-4fa7-8f7a-5fbf0541d1c9',
E'{"{\\"id\\": \\"7cd74bae-ff5b-4f58-ab20-0218f820ffff\\", \\"skills\\": [{\\"id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da\\", \\"level1\\": \\"Programming languages\\", \\"level2\\": \\"Scripting languages\\", \\"level3\\": \\"TypeScript\\", \\"level1_id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68\\", \\"level2_id\\": \\"313384d2-486d-4b7f-ba33-76b1cd696f0a\\", \\"level3_id\\": \\"fd3c41b8-8c15-47e2-a80d-cf3683b2d0da\\"}, {\\"id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|686af7e4-6d58-4148-b227-3bf65ff10273\\", \\"level1\\": \\"Software frameworks & libraries\\", \\"level2\\": \\"Frontend frameworks & libraries\\", \\"level3\\": \\"React\\", \\"level1_id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b\\", \\"level2_id\\": \\"e13b48c1-fd0f-4ecc-89de-58e9312b9435\\", \\"level3_id\\": \\"686af7e4-6d58-4148-b227-3bf65ff10273\\"}, {\\"id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b|c4e54726-7bd5-44bb-8597-a05eb2272e2b|cda4441f-dba6-495c-9e2e-7429bd5e0465\\", \\"level1\\": \\"Software frameworks & libraries\\", \\"level2\\": \\"Backend frameworks & libraries\\", \\"level3\\": \\"Node.js\\", \\"level1_id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b\\", \\"level2_id\\": \\"c4e54726-7bd5-44bb-8597-a05eb2272e2b\\", \\"level3_id\\": \\"cda4441f-dba6-495c-9e2e-7429bd5e0465\\"}, {\\"id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da\\", \\"level1\\": \\"Software frameworks & libraries\\", \\"level2\\": \\"Frontend frameworks & libraries\\", \\"level3\\": \\"TypeScript\\", \\"level1_id\\": \\"84dff39f-2ed6-408c-a188-14cf55a09c5b\\", \\"level2_id\\": \\"e13b48c1-fd0f-4ecc-89de-58e9312b9435\\", \\"level3_id\\": \\"fd3c41b8-8c15-47e2-a80d-cf3683b2d0da\\"}], \\"end_date\\": null, \\"position\\": \\"Senior Software Engineer + Team Lead\\", \\"start_date\\": \\"2019-10-01T00:00:00\\", \\"description\\": \\"Draper, Utah, United States\\\\n• Architected Expert Portal* from the ground up using a Node/Typescript\\\\nbackend, a\\\\nPostgreSQL database, a GraphQL API layer, a Webpack build process, with a\\\\nTypescript/React front-end and XState for state management\\\\n• Enforced coding best practices with linting rules and code formatters by\\\\nautomating it in\\\\ngit workflow\\\\n• Automated deployment Expert Portal* to EC2 instances and the #pluralsight\\\\nNPM\\\\nartifactory using Github, TeamCity, and Octopus\\\\n• Improved product team workflow by building a browser extention to add\\\\nLeanKit card\\\\ntemplate functionality\\\\n• Consumed and published data through Kafka streams and RabbitMQ\\\\nmessages\\\\n• Interviewed, onboarded, and trained junior to mid-level engineers\\", \\"company_name\\": \\"Pluralsight\\"}","{\\"id\\": \\"9e2c2b44-39a4-4369-b237-c51fd938e61d\\", \\"skills\\": [{\\"id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|012abcd1-3a6a-4803-a47e-42f46b402024\\", \\"level1\\": \\"Programming languages\\", \\"level2\\": \\"Scripting languages\\", \\"level3\\": \\"JavaScript\\", \\"level1_id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68\\", \\"level2_id\\": \\"313384d2-486d-4b7f-ba33-76b1cd696f0a\\", \\"level3_id\\": \\"012abcd1-3a6a-4803-a47e-42f46b402024\\"}], \\"end_date\\": \\"2019-10-01T00:00:00\\", \\"position\\": \\"Software Engineer II\\", \\"start_date\\": \\"2017-11-01T00:00:00\\", \\"description\\": \\"Greater Salt Lake City Area\\\\nWorked on the KSL Jobs Classifieds 
team as a full-stack developer. Following\\\\nthe scrum methodology, I added new features and maintained all things\\\\npowered by KSL Jobs.\\\\n• Built and deployed a \\\\\\"white label\\\\\\" version of KSL Jobs for the Silicon Slopes\\\\nbrand. (https://siliconslopes.ksl.com)\\\\n• Rewrote major sections of the current KSL Jobs site in React.js\\\\n• Automated querying data for reports and analytic purposes through Node and\\\\nPHP scripts\\\\n• Provided rich data tracking through Google Tag Manager, Google Analytics,\\\\nand BigQuery\\\\n• Migrated Solr search engine to ElasticSearch with a GraphQL API\\", \\"company_name\\": \\"Deseret Digital Media\\"}","{\\"id\\": \\"efbf68f4-7bdc-4ab6-bba9-fbf7ec38aeef\\", \\"skills\\": [{\\"id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68|f45abc59-1e7e-4853-a498-999fcf12d498|4b266297-6e25-4443-90ec-248bded4225a\\", \\"level1\\": \\"Programming languages\\", \\"level2\\": \\"High-level languages\\", \\"level3\\": \\"PHP\\", \\"level1_id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68\\", \\"level2_id\\": \\"f45abc59-1e7e-4853-a498-999fcf12d498\\", \\"level3_id\\": \\"4b266297-6e25-4443-90ec-248bded4225a\\"}], \\"end_date\\": \\"2019-08-01T00:00:00\\", \\"position\\": \\"Full Stack Developer\\", \\"start_date\\": \\"2017-01-01T00:00:00\\", \\"description\\": \\"Provo, Utah Area\\\\nWorked with Appritech LLC to modernize their legacy software and add new\\\\nfeatures to automate their business processes.\\\\n• Building new call handler from the ground up using JavaScript ES6,\\\\nBootstrap, SASS for\\\\nfront-end, and PHP7/Laravel for back-end\\\\n• Implemented real-time call management system and built reporting API.\\\\nImproved call agent\\\\nproductivity by 70%\\\\n• Upgraded deprecated PHP legacy code to PHP 7\\\\n• Synchronized follow up calls with Twilio API functionality for SMS message\\\\nforwarding\\\\n• Installed and setup Apache server for an after hours call center\\", \\"company_name\\": \\"Appritech Software\\"}","{\\"id\\": \\"2db60c6c-c214-4d9b-9034-baba676203a8\\", \\"skills\\": [{\\"id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68|f45abc59-1e7e-4853-a498-999fcf12d498|4b266297-6e25-4443-90ec-248bded4225a\\", \\"level1\\": \\"Programming languages\\", \\"level2\\": \\"High-level languages\\", \\"level3\\": \\"PHP\\", \\"level1_id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68\\", \\"level2_id\\": \\"f45abc59-1e7e-4853-a498-999fcf12d498\\", \\"level3_id\\": \\"4b266297-6e25-4443-90ec-248bded4225a\\"}], \\"end_date\\": \\"2017-09-01T00:00:00\\", \\"position\\": \\"Web Developer\\", \\"start_date\\": \\"2017-02-01T00:00:00\\", \\"description\\": \\"Provo, Utah Area\\\\nWorked with Redcore LLC to build entrepreneurial tools, marketing tools, and\\\\nadding new functionality to the current Wordpress-integrated website and\\\\nCMS.\\\\n• Built Brand Management website from the ground up using Bootstrap and\\\\nJavaScript on the\\\\nfront end, with PHP and MySQL on the back-end\\\\n• Automated managerial and accounting tasks, such as invoice generation,\\\\nand transactions using Stripe API\\\\n• Debugged and perfected current web applications to enrich UX\\\\n• Doubled clientele by expanding Redcore services offered to include website\\\\ncreation and\\\\nmanagement\\", \\"company_name\\": \\"Redcore LLC\\"}","{\\"id\\": \\"c3f1d5b2-5586-477d-ae4c-e2927463244e\\", \\"skills\\": [{\\"id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|012abcd1-3a6a-4803-a47e-42f46b402024\\", \\"level1\\": \\"Programming languages\\", 
\\"level2\\": \\"Scripting languages\\", \\"level3\\": \\"JavaScript\\", \\"level1_id\\": \\"c8a5328d-87ba-419d-802f-80b1d940bb68\\", \\"level2_id\\": \\"313384d2-486d-4b7f-ba33-76b1cd696f0a\\", \\"level3_id\\": \\"012abcd1-3a6a-4803-a47e-42f46b402024\\"}], \\"end_date\\": \\"2017-03-01T00:00:00\\", \\"position\\": \\"Software and Web Developer\\", \\"start_date\\": \\"2016-12-01T00:00:00\\", \\"description\\": \\"Provo, Utah Area\\\\nThis internship was focused on building core website and software for\\\\nCentaurific using the LAMP stack.\\\\n• Created dynamic landing pages for products to generate leads\\\\n• Designed a MySQL database to organize data and generate excel reports\\\\n• Revitalized marketing efforts through analytics and a social media sharing\\\\npage\\", \\"company_name\\": \\"Centaurific\\"}"}'
);
Here is what that work_experience value looks like as formatted JSON:
[
{
"id": "7cd74bae-ff5b-4f58-ab20-0218f820ffff",
"skills": [
{
"id": "c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da",
"level1": "Programming languages",
"level2": "Scripting languages",
"level3": "TypeScript",
"level1_id": "c8a5328d-87ba-419d-802f-80b1d940bb68",
"level2_id": "313384d2-486d-4b7f-ba33-76b1cd696f0a",
"level3_id": "fd3c41b8-8c15-47e2-a80d-cf3683b2d0da"
},
{
"id": "84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|686af7e4-6d58-4148-b227-3bf65ff10273",
"level1": "Software frameworks & libraries",
"level2": "Frontend frameworks & libraries",
"level3": "React",
"level1_id": "84dff39f-2ed6-408c-a188-14cf55a09c5b",
"level2_id": "e13b48c1-fd0f-4ecc-89de-58e9312b9435",
"level3_id": "686af7e4-6d58-4148-b227-3bf65ff10273"
},
{
"id": "84dff39f-2ed6-408c-a188-14cf55a09c5b|c4e54726-7bd5-44bb-8597-a05eb2272e2b|cda4441f-dba6-495c-9e2e-7429bd5e0465",
"level1": "Software frameworks & libraries",
"level2": "Backend frameworks & libraries",
"level3": "Node.js",
"level1_id": "84dff39f-2ed6-408c-a188-14cf55a09c5b",
"level2_id": "c4e54726-7bd5-44bb-8597-a05eb2272e2b",
"level3_id": "cda4441f-dba6-495c-9e2e-7429bd5e0465"
},
{
"id": "84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da",
"level1": "Software frameworks & libraries",
"level2": "Frontend frameworks & libraries",
"level3": "TypeScript",
"level1_id": "84dff39f-2ed6-408c-a188-14cf55a09c5b",
"level2_id": "e13b48c1-fd0f-4ecc-89de-58e9312b9435",
"level3_id": "fd3c41b8-8c15-47e2-a80d-cf3683b2d0da"
}
],
"end_date": null,
"position": "Senior Software Engineer + Team Lead",
"start_date": "2019-10-01T00:00:00",
"description": "Draper, Utah, United Statesn• Architected Expert Portal* from the ground up using a Node/Typescriptnbackend, anPostgreSQL database, a GraphQL API layer, a Webpack build process, with anTypescript/React front-end and XState for state managementn• Enforced coding best practices with linting rules and code formatters bynautomating it inngit workflown• Automated deployment Expert Portal* to EC2 instances and the #pluralsightnNPMnartifactory using Github, TeamCity, and Octopusn• Improved product team workflow by building a browser extention to addnLeanKit cardntemplate functionalityn• Consumed and published data through Kafka streams and RabbitMQnmessagesn• Interviewed, onboarded, and trained junior to mid-level engineers",
"company_name": "Pluralsight"
},
{
"id": "9e2c2b44-39a4-4369-b237-c51fd938e61d",
"skills": [
{
"id": "c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|012abcd1-3a6a-4803-a47e-42f46b402024",
"level1": "Programming languages",
"level2": "Scripting languages",
"level3": "JavaScript",
"level1_id": "c8a5328d-87ba-419d-802f-80b1d940bb68",
"level2_id": "313384d2-486d-4b7f-ba33-76b1cd696f0a",
"level3_id": "012abcd1-3a6a-4803-a47e-42f46b402024"
}
],
"end_date": "2019-10-01T00:00:00",
"position": "Software Engineer II",
"start_date": "2017-11-01T00:00:00",
"description": "Greater Salt Lake City AreanWorked on the KSL Jobs Classifieds team as a full-stack developer. Followingnthe scrum methodology, I added new features and maintained all thingsnpowered by KSL Jobs.n• Built and deployed a "white label" version of KSL Jobs for the Silicon Slopesnbrand. (https://siliconslopes.ksl.com)n• Rewrote major sections of the current KSL Jobs site in React.jsn• Automated querying data for reports and analytic purposes through Node andnPHP scriptsn• Provided rich data tracking through Google Tag Manager, Google Analytics,nand BigQueryn• Migrated Solr search engine to ElasticSearch with a GraphQL API",
"company_name": "Deseret Digital Media"
},
{
"id": "efbf68f4-7bdc-4ab6-bba9-fbf7ec38aeef",
"skills": [
{
"id": "c8a5328d-87ba-419d-802f-80b1d940bb68|f45abc59-1e7e-4853-a498-999fcf12d498|4b266297-6e25-4443-90ec-248bded4225a",
"level1": "Programming languages",
"level2": "High-level languages",
"level3": "PHP",
"level1_id": "c8a5328d-87ba-419d-802f-80b1d940bb68",
"level2_id": "f45abc59-1e7e-4853-a498-999fcf12d498",
"level3_id": "4b266297-6e25-4443-90ec-248bded4225a"
}
],
"end_date": "2019-08-01T00:00:00",
"position": "Full Stack Developer",
"start_date": "2017-01-01T00:00:00",
"description": "Provo, Utah AreanWorked with Appritech LLC to modernize their legacy software and add newnfeatures to automate their business processes.n• Building new call handler from the ground up using JavaScript ES6,nBootstrap, SASS fornfront-end, and PHP7/Laravel for back-endn• Implemented real-time call management system and built reporting API.nImproved call agentnproductivity by 70%n• Upgraded deprecated PHP legacy code to PHP 7n• Synchronized follow up calls with Twilio API functionality for SMS messagenforwardingn• Installed and setup Apache server for an after hours call center",
"company_name": "Appritech Software"
},
{
"id": "2db60c6c-c214-4d9b-9034-baba676203a8",
"skills": [
{
"id": "c8a5328d-87ba-419d-802f-80b1d940bb68|f45abc59-1e7e-4853-a498-999fcf12d498|4b266297-6e25-4443-90ec-248bded4225a",
"level1": "Programming languages",
"level2": "High-level languages",
"level3": "PHP",
"level1_id": "c8a5328d-87ba-419d-802f-80b1d940bb68",
"level2_id": "f45abc59-1e7e-4853-a498-999fcf12d498",
"level3_id": "4b266297-6e25-4443-90ec-248bded4225a"
}
],
"end_date": "2017-09-01T00:00:00",
"position": "Web Developer",
"start_date": "2017-02-01T00:00:00",
"description": "Provo, Utah AreanWorked with Redcore LLC to build entrepreneurial tools, marketing tools, andnadding new functionality to the current Wordpress-integrated website andnCMS.n• Built Brand Management website from the ground up using Bootstrap andnJavaScript on thenfront end, with PHP and MySQL on the back-endn• Automated managerial and accounting tasks, such as invoice generation,nand transactions using Stripe APIn• Debugged and perfected current web applications to enrich UXn• Doubled clientele by expanding Redcore services offered to include websitencreation andnmanagement",
"company_name": "Redcore LLC"
},
{
"id": "c3f1d5b2-5586-477d-ae4c-e2927463244e",
"skills": [
{
"id": "c8a5328d-87ba-419d-802f-80b1d940bb68|313384d2-486d-4b7f-ba33-76b1cd696f0a|012abcd1-3a6a-4803-a47e-42f46b402024",
"level1": "Programming languages",
"level2": "Scripting languages",
"level3": "JavaScript",
"level1_id": "c8a5328d-87ba-419d-802f-80b1d940bb68",
"level2_id": "313384d2-486d-4b7f-ba33-76b1cd696f0a",
"level3_id": "012abcd1-3a6a-4803-a47e-42f46b402024"
}
],
"end_date": "2017-03-01T00:00:00",
"position": "Software and Web Developer",
"start_date": "2016-12-01T00:00:00",
"description": "Provo, Utah AreanThis internship was focused on building core website and software fornCentaurific using the LAMP stack.n• Created dynamic landing pages for products to generate leadsn• Designed a MySQL database to organize data and generate excel reportsn• Revitalized marketing efforts through analytics and a social media sharingnpage",
"company_name": "Centaurific"
}
]
I'd like to query out the structure. I tried this and a few other variants of it to no avail:
SELECT workexp
FROM foo,
jsonb_array_elements(work_experience) workexp;
And I get this error message:
ERROR: function jsonb_array_elements_text(jsonb[]) does not exist
LINE 3: jsonb_array_elements(work_experience) workexp;
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
Everything I look up has documentation on traversing and querying jsonb values, but not jsonb[] ones. How can I query for the "skills"."id" values?
EDIT -- SOLUTION:
All of the solutions given were helpful, but I accepted Ramin Faracov's answer for suggesting unnest. Thank you!
This is the query that I came up with:
SELECT
    id AS expert_id,
    we_sk->>'id' AS work_exp_skill_id
FROM (
    SELECT
        id,
        unnest(work_experience)->'skills' AS we
    FROM foo
) sk,
jsonb_array_elements(sk.we) we_sk;

In PostgreSQL, all of the jsonb array functions accept only the jsonb type, not jsonb[].
First way: use the unnest function to convert the array into rows, and after that you can use jsonb_array_elements.
Second way: change the column type from jsonb[] to jsonb in your table. Don't worry, you can still insert the JSON string data written above into a jsonb field without any problems. Arrays can be written inside the json/jsonb value itself, and the jsonb array functions work on exactly those arrays, the ones inside the JSON document.

db fiddle demo.
You don't need jsonb[]; just use jsonb.
You need to properly reformat the JSON so that you can insert it into PostgreSQL.
I used DataGrip's syntax highlighting to reformat your data for the database, like this (foo_json below is the same data loaded into a plain jsonb column).
CTEs make it easier to structure the logic, so I use them.
-- Query: filter on the main id, then on the inner (skill) id.
-- "84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da"
-- Note: jsonb subscripting like workexp['id'] requires PostgreSQL 14 or later.
WITH a AS (
    SELECT workexp
    FROM foo_json,
         jsonb_array_elements(work_experience) workexp
),
b AS (
    SELECT workexp['id'] mainid,
           jsonb_array_elements(workexp['skills']) last_level,
           count(*) OVER ()
    FROM a
),
c AS (
    SELECT last_level,
           pg_typeof(mainid)
    FROM b
    WHERE mainid::text = '"7cd74bae-ff5b-4f58-ab20-0218f820ffff"'::text
)
SELECT *,
       last_level['level1'],
       last_level['level2'],
       last_level['level3'],
       last_level['level1_id'],
       last_level['level2_id'],
       last_level['level3_id']
FROM c
WHERE last_level['id']::text
      = '"84dff39f-2ed6-408c-a188-14cf55a09c5b|e13b48c1-fd0f-4ecc-89de-58e9312b9435|fd3c41b8-8c15-47e2-a80d-cf3683b2d0da"'::text;
-- Query: filter based on mainid only (not the inner nested id).
WITH a AS (
    SELECT workexp
    FROM foo_json,
         jsonb_array_elements(work_experience) workexp
),
b AS (
    SELECT workexp['id'] mainid,
           workexp['skills'],
           count(*) OVER ()
    FROM a
)
SELECT *, pg_typeof(mainid)
FROM b
WHERE mainid::text = '"7cd74bae-ff5b-4f58-ab20-0218f820ffff"'::text;
PostgreSQL 15 (dev)
In PostgreSQL 15 there is JSON_TABLE, which makes nested json/jsonb queries much easier.
SELECT jt.*
FROM foo_json,
     JSON_TABLE(work_experience, '$[*]' COLUMNS (
         main_id text PATH '$.id',
         start_date date PATH '$.start_date',
         position text PATH '$.position',
         company_name text PATH '$.company_name',
         NESTED PATH '$.skills[*]' COLUMNS (
             innerid text PATH '$.id',
             level2 text,
             level3 text
         )
     )) jt;

Related

Azure Pipelines: Bulk approval of deployments to environments

Is there any way to approve runs via the CLI or the API (or anything else)? I'm looking for a way to bulk approve multiple runs from different pipelines but it's not available in the UI.
Let's say I have 100 pipelines that have a deployment job to a production environment. I would like to approve all awaiting for approval runs.
Currently, I cannot find something like it in the docs of the Azure DevOps REST API or the CLI.
The feature docs:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/approvals
The following question is related, but I'm looking for any way of solving it, not just via the API:
Approve a yaml pipeline deployment in Azure DevOps using REST api
I was just searching for an answer to this, specifically how to get the approval id that you would need. In fact, there is an undocumented API to approve an approval check.
As Merlin explained, it is the following:
https://dev.azure.com/{org}/{project}/_apis/pipelines/approvals/{approvalId}
The body has to look like this
[{
    "approvalId": "{approvalId}",
    "status": {approvalStatus},
    "comment": ""
}]
where {approvalStatus} tells the API whether you approved or not. You probably have to experiment, but I used 4 as the status. I guess there are only two possibilities, one for "approved" and one for "denied".
The question now is how you get the approval id. I found it: you get it by using the timeline API of a build. The build API documentation says you get a build via the following:
https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}?api-version=5.1
The build timeline is returned in the response of the build run, but its URL follows a pattern, which is
https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/Timeline?api-version=5.1
Besides a flat array containing the parent/child relationships between stages, phases, jobs, and tasks, you can find within it something like the following:
{
"records": [
{
"previousAttempts": [
],
"id": "95f5837e-769d-5a92-9ecb-0e7edb3ac322",
"parentId": "9e7965a8-d99d-5b8f-b47b-3ee7c58a5b1c",
"type": "Checkpoint",
"name": "Checkpoint",
"startTime": "2020-08-14T13:44:03.05Z",
"finishTime": null,
"currentOperation": null,
"percentComplete": null,
"state": "inProgress",
"result": null,
"resultCode": null,
"changeId": 73,
"lastModified": "0001-01-01T00:00:00",
"workerName": null,
"details": null,
"errorCount": 0,
"warningCount": 0,
"url": null,
"log": null,
"task": null,
"attempt": 1,
"identifier": "Checkpoint"
},
{
"previousAttempts": [
],
"id": "9e7965a8-d99d-5b8f-b47b-3ee7c58a5b1c",
"parentId": null,
"type": "Stage",
"name": "Power Platform Test (orgf92be262)",
"startTime": null,
"finishTime": null,
"currentOperation": null,
"percentComplete": null,
"state": "pending",
"result": null,
"resultCode": null,
"changeId": 1,
"lastModified": "0001-01-01T00:00:00",
"workerName": null,
"order": 2,
"details": null,
"errorCount": 0,
"warningCount": 0,
"url": null,
"log": null,
"task": null,
"attempt": 1,
"identifier": "Import_Test"
},
{
"previousAttempts": [
],
"id": "e54149c5-b5a7-4b82-8468-56ad493224b5",
"parentId": "95f5837e-769d-5a92-9ecb-0e7edb3ac322",
"type": "Checkpoint.Approval",
"name": "Checkpoint.Approval",
"startTime": "2020-08-14T13:44:03.02Z",
"finishTime": null,
"currentOperation": null,
"percentComplete": null,
"state": "inProgress",
"result": null,
"resultCode": null,
"changeId": 72,
"lastModified": "0001-01-01T00:00:00",
"workerName": null,
"details": null,
"errorCount": 0,
"warningCount": 0,
"url": null,
"log": null,
"task": null,
"attempt": 1,
"identifier": "e54149c5-b5a7-4b82-8468-56ad493224b5"
}
],
"lastChangedBy": "00000002-0000-8888-8000-000000000000",
"lastChangedOn": "2020-08-14T13:44:03.057Z",
"id": "86fb4204-9c5e-4e72-bdb1-eefe230480ec",
"changeId": 73,
"url": "https://dev.azure.com/***"
}
Above you can see a record called "Checkpoint.Approval". The id of that record IS the approval id you need to approve everything. If you want to know which stage the approval belongs to, follow the parentId values until the parentId property is null.
That record is the stage.
With this you can successfully get the approval id and use it to approve via the API mentioned above.
jessehouwing's guess is correct. Multi-stage pipelines are still in preview, and the corresponding SDK/API/extensions haven't been expanded and made available to the public.
You may wonder about avoiding the API altogether. I have checked the corresponding code in our backend: every operation on a multi-stage approval has one required parameter, approvalId. As you know, this value is unique, and each approval maps to a different approvalId value. This means that no matter which method you try, approvalId is the big obstacle. And as far as I know, there is so far no API, SDK, third-party tool, or extension that can retrieve this value directly.
In addition, the release process logic of multi-stage YAML is not the same as that of releases defined in the UI. So none of the public APIs that work with UI-defined releases are suitable for multi-stage releases.
We have one undisclosed API that can get the approval message of a multi-stage run:
https://dev.azure.com/{org}/{project}/_apis/pipelines/approvals/{approvalId}
You can try listing approvals without specifying an approvalId: https://dev.azure.com/{org}/{project}/_apis/pipelines/approvals. Its response message is: "Query for approvals failed. A minimum of one query parameter is required.\r\nParameter name: queryParameters". This means you must tell the system which approval you mean (the big obstacle I mentioned previously).
As for why approvalId is a required part, that comes from our backend code structure. I'd suggest you raise a suggestion on developing an API/SDK for multi-stage pipelines here.
I can confirm that Sebastian's answer worked for me, even in Azure DevOps 2020 on-prem.
After retrieving the approvalId from either of the methods above (I was specifically using a service hook for my integration), I used the following API PATCH call:
https://dev.azure.com/{organization}/{project}/_apis/pipelines/approvals/?api-version=6.0-preview
and in the body:
[
    {
        "approvalId": "{approvalId}",
        "status": {status integer}, (4 - approved; 8 - rejected)
        "comment": ""
    }
]
The call is made with the application/json Content-Type, but in some situations the endpoint did not like that I was using the [] brackets, so you will need to work around that; only then will the call work.
I was even able to integrate this call into my custom connector in MS Power Automate.
I added support to the latest version of the AzurePipelinesPS Powershell module to support bulk pipeline approvals.
Code snippet without using the AzurePipelinesPS sessions
$instance = 'https://dev.azure.com'
$collection = 'your_collection'
$project = 'your_project'
$apiVersion = '5.1-preview'
$securePat = 'your_personal_access_token' | ConvertTo-SecureString -Force -AsPlainText
Get-APPipelinePendingApprovalList -Instance $instance -Collection $collection -Project $project -PersonalAccessToken $securePat -ApiVersion $apiVersion |
    Out-GridView -Passthru |
    % { Update-APPipelineApproval -Instance $instance -Collection $collection -Project $project -PersonalAccessToken $securePat -ApiVersion $apiVersion -ApprovalId $PSitem.approvalId -status 'approved' }
Code snippet with AzurePipelinesPS sessions
$session = 'your_session'
Get-APPipelinePendingApprovalList $session | Out-GridView -Passthru | % { Update-APPipelineApproval $session -ApprovalId $PSitem.approvalId -status 'approved'}
See the AzurePipelinesPS project page for details on secure session handling.
Function Definitions used in the code above
Get-APPipelinePendingApprovalList
Loops through pipeline build runs with the status of 'notStarted' or 'inProgress' in a project. This build lookup supports filters like pipeline definition ids or a source branch name.
For each build it then looks up the timeline record where the approval id, the stage name and the stage identifier are found.
Optionally, with the ExpandApproval switch, it can expand each approval with its details.
The object returned from this function contains the following properties (the values have been mocked):
pipelineDefinitionName : MyPipeline
pipelineDefinitionId : 100
pipelineRunId : 2001
pipelineUrl : https://dev.azure.com/your_project/_build/results?
sourceBranch : refs/heads/master
stageName : Prod Deployment
stageIdentifier : Prod_Deployment
approvalId : xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxx
Out-GridView
Displays data in a Grid View where the results can be filtered, ordered and selected.
%
The percent sign is shorthand for Foreach-Object
Update-APPipelineApproval
Updates the status of an approval to approved or rejected.
Credit
Thanks to Sebastian Schütze for cracking the timeline part!
The az pipelines extension doesn't support approvals yet, I suppose due to the fact that multi-stage pipelines are still in preview and the old release hub will eventually be replaced by them.
But there is a REST API you can use to list and update approvals. These can be called from PowerShell with relative ease.
Or use the vsteam powershell module and Get-VSTeamApproval and Set-VSTeamApproval.

Get the groups of a customeruser in OTRS

I am extending OTRS with an app and need to get the groups a customeruser is in. I want to do this by communicating with the SessionGet-Endpoint (https://doc.otrs.com/doc/api/otrs/6.0/Perl/Kernel/GenericInterface/Operation/Session/SessionGet.pm.html)
The endpoint for SessionGet returns a lot of information about the user, but not the groups they are in. I am not talking about agents who can log in to the backend of OTRS, but about customerusers.
I am using OTRS 6 because it was the only version available in Docker. I created the REST endpoints in the backend and everything works well. There is new functionality for which I need the group information.
I had a look at the OTRS system config but could not figure out whether it is possible to include this information in the response.
Although I am a programmer, I did not want to write perl because of ... reasons.
I had a look at the file which handles the incoming request at /opt/otrs/Kernel/GenericInterface/Operation/Session/SessionGet.pm and traced the calls to the file where the information is collected from the database, /opt/otrs/Kernel/System/AuthSession/DB.pm. In line 169 the SQL statement is written, so it came to my mind that I could just extend it to also fetch the group information, because, as I said, I did not want to write Perl...
A typical response from this endpoint looks like this:
{
"SessionData": [
{
"Value": "2",
"Key": "ChangeBy"
},
{
"Value": "2019-06-26 13:43:18",
"Key": "ChangeTime"
},
{
"Value": "2",
"Key": "CreateBy"
},
{
"Value": "2019-06-26 13:43:18",
"Key": "CreateTime"
},
{
"Value": "XXX",
"Key": "CustomerCompanyCity"
},
{
"Value": "",
"Key": "CustomerCompanyComment"
}
...
}
A good approach would be to just insert another value-key pair with the IDs of the groups. The SQL statement queries only one table, $Self->{SessionTable}, usually called otrs.sessions.
I created a SQL statement which extends the existing one with the needed information (the resources I used are listed at the end). You can find it here:
$DBObject->Prepare(
SQL => "
(
SELECT id, data_key, data_value, serialized FROM $Self->{SessionTable} WHERE session_id = ? ORDER BY id ASC
)
UNION ALL
(
SELECT
(
SELECT MAX(id) FROM $Self->{SessionTable} WHERE session_id = ?
) +1
AS id,
'UserGroupsID' AS data_key,
(
SELECT GROUP_CONCAT(DISTINCT group_id SEPARATOR ', ')
FROM otrs.group_customer_user
WHERE user_id =
(
SELECT data_value
FROM $Self->{SessionTable}
WHERE session_id = ?
AND data_key = 'UserID'
ORDER BY id ASC
)
)
AS data_value,
0 AS serialized
)",
Bind => [ \$Param{SessionID}, \$Param{SessionID}, \$Param{SessionID} ],
);
Whoever needs to get the groups of a customeruser can replace the existing code with the one provided. At least in my case it works very well. Now I get the expected key-value pair:
{
"Value": "10, 11, 6, 7, 8, 9",
"Key": "UserGroupsID"
},
I used the following resources:
Adding the results of multiple SQL selects?
Can I concatenate multiple MySQL rows into one field?
Add row to query result using select
Happy coding,
Nico

NiFi: Can I get variables in NiFi via the REST API?

Can I get 'variables' in NiFi via the REST API?
I looked for a way to get variables in NiFi's REST API documentation, but I did not find one.
Is it provided?
You can make a GET request to /process-groups/{id}/variable-registry where {id} is the process group ID you are interested in. You will receive a JSON response similar to:
{
"processGroupRevision": {…},
"variableRegistry": {
"variables": [{
"variable": {
"name": "value",
"value": "value",
"processGroupId": "value",
"affectedComponents": [{…}]
},
"canWrite": true
}],
"processGroupId": "value"
},
"disconnectedNodeAcknowledged": true
}
This is all documented on the Apache NiFi REST API page under Process Groups. You can also use your browser's developer tools panel to inspect the requests that the NiFi UI makes to the server as you interact with the UI to observe what calls are made.
You can also easily fetch these using the Community NiFi Python Client: NiPyApi
Python 3.6.5 (v3.6.5:f59c0932b4, Mar 28 2018, 03:03:55)
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin
>> import nipyapi
# Get your ProcessGroup object
>> pg = nipyapi.canvas.get_process_group('myProcessGroup')
# Get the VariableRegistry for that ProcessGroup
>> vars = nipyapi.canvas.get_variable_registry(pg)
>> vars.variable_registry.variables
[{'can_write': True,
'variable': {'affected_components': [],
'name': 'foo',
'process_group_id': 'fb88a5cb-0164-1000-d5ce-d89ad0e93df2',
'value': 'bar'}}]

How to add an ETW provider to an existing Service Fabric cluster using PowerShell?

I have already created a Service Fabric cluster with Azure diagnostics, and it is currently functional with my services deployed into it. I have an ETW EventSource in my service that I would like to start collecting events from, because my service code already uses this event source to write service-related events. Since the cluster is already enabled for Azure diagnostics and my services are already deployed into it, I think it is a simple matter of adding my event source to the ETW providers in this cluster. Here is the exported template (only the partial relevant to Azure diagnostics is shown):
{
"properties": {
"publisher": "Microsoft.Azure.Diagnostics",
"type": "IaaSDiagnostics",
"typeHandlerVersion": "1.5",
"autoUpgradeMinorVersion": true,
"settings": {
"WadCfg": {
"DiagnosticMonitorConfiguration": {
"overallQuotaInMB": "50000",
"EtwProviders": {
"EtwEventSourceProviderConfiguration": [
{
"provider": "Microsoft-ServiceFabric-Actors",
"scheduledTransferKeywordFilter": "1",
"scheduledTransferPeriod": "PT5M",
"DefaultEvents": {
"eventDestination": "ServiceFabricReliableActorEventTable"
}
},
{
"provider": "Microsoft-ServiceFabric-Services",
"scheduledTransferPeriod": "PT5M",
"DefaultEvents": {
"eventDestination": "ServiceFabricReliableServiceEventTable"
}
},
{
"provider": "Bb.ServiceFabric.Infrastructure.Container",
"scheduledTransferPeriod": "PT1M",
"DefaultEvents": {
"eventDestination": "ServiceFabricReliableServiceEventTable"
}
}
],
"EtwManifestProviderConfiguration": [
{
"provider": "cbd93bc2-71e5-4566-b3a7-595d8eeca6e8",
"scheduledTransferLogLevelFilter": "Information",
"scheduledTransferKeywordFilter": "4611686018427387904",
"scheduledTransferPeriod": "PT5M",
"DefaultEvents": {
"eventDestination": "ServiceFabricSystemEventTable"
}
}
]
}
}
},
"StorageAccount": "sfdgsmsraghuplaygrou6827"
}
},
"name": "VMDiagnosticsVmExt_vmNodeType0Name"
}
I would like to update the EtwProviders/EtwEventSourceProviderConfiguration above to contain the following section (MyCompany.MyServices.MyStatelessService is the name of my service's EventSource):
{
"provider": "MyCompany.MyServices.MyStatelessService",
"scheduledTransferPeriod": "PT5M",
"DefaultEvents": {
"eventDestination": "ServiceFabricReliableServiceEventTable"
}
}
Here are my questions:
Is this the correct way of inserting an ETW provider/EventSource (from my service) into an existing cluster (that is already enabled with azure diagnostics)?
Can I add this event source (as a ETW event source provider) using a powershell command(s)?
If so, what is the exact powershell command (using all the information from the above code fragment)?
Note: I am using .net framework 4.5.2.
All seems good with the added configuration above. Just be aware that for EtwProviders the eventDestination cannot contain hyphens (-); yours doesn't, so you are OK.
To update the Windows Azure Diagnostics (WAD) agent configuration, you can use either PowerShell or Cloud Explorer in Visual Studio.
For the former, simply update the ARM template and use the New-AzureRmResourceGroupDeployment cmdlet. See here for further information: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-diagnostics-how-to-setup-wad/#update-diagnostics-to-collect-and-upload-logs-from-new-eventsource-channels
To use Cloud Explorer in Visual Studio: browse to your Virtual Machine Scale Set (as this is the Azure resource that holds the WAD configuration), right-click, and choose Update Diagnostics. In the dialog shown, you have the option to upload a private and a public configuration file. Simply take a .json document containing the {"WadCfg": {}} element and upload that as the public configuration.
If you need to update the private configuration, it specifies the storage account name and access key:
{
    "storageAccountName": "",
    "storageAccountKey": "",
    "storageAccountEndPoint": "https://core.windows.net"
}
Hope this helps.
Mikkel

How can I target .NET35 on OSX using mono and dnx/dnvm?

On Windows, using Visual Studio 2015, I can compile a project with dnu build.
The project.json file looks as follows:
{
    "version": "1.0.0-*",
    "description": "My Class Library",
    "authors": [ "Class Library template" ],
    "tags": [""],
    "projectUrl": "",
    "licenseUrl": "",
    "tooling": {
        "defaultNamespace": "Common"
    },
    "frameworks": {
        "dnx451": { },
        "dnxcore50": {
            "dependencies": {
                "Microsoft.CSharp": "4.0.1-beta-*",
                "System.Collections": "4.0.11-beta-*",
                "System.Linq": "4.0.1-beta-*",
                "System.Runtime": "4.0.21-beta-*",
                "System.Threading": "4.0.11-beta-*"
            }
        }
    }
}
After installing mono, dnvm and dnx on a Mac, as per this tutorial, I can actually compile the same project on OSX! This in itself is already pretty awesome!
Now, I added the following framework to my project.json file:
"frameworks": {
"dnx35": { }, //"net35"
"dnx451": { },
"dnxcore50": {
"dependencies": {
"Microsoft.CSharp": "4.0.1-beta-*",
"System.Collections": "4.0.11-beta-*",
"System.Linq": "4.0.1-beta-*",
"System.Runtime": "4.0.21-beta-*",
"System.Threading": "4.0.11-beta-*"
}
}
This still compiles on Windows, and produces three sets of dlls, as expected.
However, on OSX it does not build the dnx35 target. Though, as far as I understand, the Mono compiler mcs can be set to target .NET 3.5 by passing in an sdk parameter.
So my question is: Can I target .NET35 with dnx on OSX using mono?
EDIT
The goal of this question is to compile a set of DLLs that can be imported into Unity3d. And because Unity3d uses Mono as its runtime, I would like to be able to do that using dnu build, so as to be able to develop these DLLs on any platform.
You can pass custom Roslyn compiler options via compilationOptions in the project.json file:
"compilationOptions": {
"optimize": true,
"define": ["RELEASE", "TRACE", "PLAYSCRIPT"]
},
Take a look at the various project.json files in the GitHub aspnet/dnx project, e.g. project.json.
But those options are passed to Roslyn, not mcs (Mono is being used as a CLR host for Roslyn's compiler-as-a-service), and I do not believe you can substitute Mono's mcs for that (comments on this, anyone?).