How to resolve "Validation error: The task.json file was not found in contribution task." when validating Azure DevOps extension? - azure-devops

I am attempting to build and validate an Azure DevOps extension that contains a task contribution, but it fails to validate and returns "Validation error: The task.json file was not found in contribution task."
This is an example extension I have created that contains two things:
The root vss-extension.json file
A folder called "task" containing the "task.json" file
The command that I am running to validate the extension is as follows. I am running this from the root folder containing the vss-extension.json file.
tfx extension isvalid --publisher my-publisher --extension-id my-id --service-url https://marketplace.visualstudio.com/
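For reference, the on-disk layout described above (the root manifest plus a "task" folder holding the task definition) is:
(extension root)
    vss-extension.json
    task/
        task.json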
vss-extension.json
{
"manifestVersion": 1,
"id": "demo-extension",
"name": "Demo Extension",
"version": "1.0.0",
"publisher": "demo-publisher",
"targets": [
{
"id": "Microsoft.VisualStudio.Services"
}
],
"description": "Demo extension",
"categories": ["Azure Pipelines"],
"files": [
{
"path": "task"
}
],
"contributions": [
{
"id": "build-task",
"type": "ms.vss-distributed-task.task",
"targets": ["ms.vss-distributed-task.tasks"],
"properties": {
"name": "task"
}
}
]
}
task/task.json
{
"$schema": "https://raw.githubusercontent.com/Microsoft/azure-pipelines-task-lib/master/tasks.schema.json",
"id": "a81df1d3-750f-4d60-a8dc-21970f1956e2",
"name": "DemoBuild",
"friendlyName": "Demo task",
"instanceNameFormat": "Demo task",
"description": "Demo task",
"helpMarkDown": "",
"category": "Build",
"author": "Demo Company",
"version": {
"Major": 0,
"Minor": 1,
"Patch": 0
},
"groups": [
{
"name": "someinput",
"displayName": "SomeInput",
"isExpanded": true
}
],
"inputs": [
{
"name": "someinput",
"type": "string",
"label": "Some Input",
"defaultValue": "",
"required": true,
"helpMarkDown": "Input something",
"groupName": "someinput"
}
],
"execution": {
"Node10": {
"target": "index.js"
}
}
}
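For what it's worth, the files entry and the task contribution in the manifest above can also be written with an explicit packagePath, the way the second manifest further down this page does. A minimal sketch of that pairing, under the assumption that the contribution's "name" property must match the packaged folder that contains task.json:
"files": [
    {
        "path": "task",
        "packagePath": "task"
    }
],
"contributions": [
    {
        "id": "build-task",
        "type": "ms.vss-distributed-task.task",
        "targets": ["ms.vss-distributed-task.tasks"],
        "properties": {
            "name": "task"
        }
    }
]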
In addition to the above, I have also downloaded the following official Microsoft extensions and attempted to validate them, which produced the same error message:
https://github.com/microsoft/azure-devops-extension-tasks
https://github.com/microsoft/PR-Metrics
How might I go about fixing this issue please?

Related

Can't see my custom extension on Azure DevOps Marketplace

My issue
I created an Azure DevOps extension task, deployed it to a publisher, and shared it. But I can't find it on the Marketplace.
What I did
This is my project (screenshot not included).
This is my task.json:
{
"id": "0f6ee401-2a8e-4110-b51d-c8d05086c1d0",
"name": "deployRG",
"category": "Utility",
"visibility": [
"Build",
"Release"
],
"demands": [],
"version": {
"Major": "0",
"Minor": "1",
"Patch": "0"
},
"instanceNameFormat": "DeployRG $(name)",
"groups": [],
"inputs": [
{
"name": "Name",
"type": "string",
"label": "RG name",
"defaultValue": "",
"required": true,
}
],
"execution": {
"PowerShell3": {
"target": "CreateRG.ps1"
}
}
}
My manifest vss-extension.json:
{
"manifestVersion": 1,
"id": "deployRG",
"version": "0.1.0",
"name": "Deploy RG",
"publisher": "Amethyste-MyTasks",
"public": false,
"categories": [
"Azure Pipelines"
],
"tags": [
"amethyste"
],
"contributions": [
{
"id": "DeployRG",
"type": "ms.vss-distributed-task.task",
"targets": [
"ms.vss-distributed-task.tasks"
],
"properties": {
"name": "DeployRG"
}
}
],
"targets": [
{
"id": "Microsoft.VisualStudio.Services"
}
],
"files": [
{
"path": "DeployRG",
"packagePath": "DeployRG"
},
{
"path": "VstsTaskSdk"
}
]
}
What I checked
I am owner of the organization and belong to Project Collection Administrators group.
I also checked the portal and the publisher portal (screenshots not included).
What I need
I checked some tutorials on the Internet and can't see what I am doing wrong.
Does anybody have an idea?
Thank you.
Aargh, I have just found it and it's easy.
After sharing, one should install the extension as indicated here:
https://learn.microsoft.com/en-us/azure/devops/extend/publish/overview?view=azure-devops
I don't know why so many tutorials skip this step.

Google Cloud Data Fusion produces inconsistent output data

I am creating a Data Fusion pipeline to ingest a CSV file from an S3 bucket, apply Wrangler directives, and store the result in a GCS bucket. The input CSV file has 18 columns; however, the output CSV file has only 8 columns. I suspect this could be due to the CSV encoding format, but I am not sure. What could be the reason here?
Pipeline JSON
{
"name": "aws_fusion_v1",
"description": "Data Pipeline Application",
"artifact": {
"name": "cdap-data-pipeline",
"version": "6.1.2",
"scope": "SYSTEM"
},
"config": {
"resources": {
"memoryMB": 2048,
"virtualCores": 1
},
"driverResources": {
"memoryMB": 2048,
"virtualCores": 1
},
"connections": [
{
"from": "Amazon S3",
"to": "Wrangler"
},
{
"from": "Wrangler",
"to": "GCS2"
},
{
"from": "Argument Setter",
"to": "Amazon S3"
}
],
"comments": [],
"postActions": [],
"properties": {},
"processTimingEnabled": true,
"stageLoggingEnabled": true,
"stages": [
{
"name": "Amazon S3",
"plugin": {
"name": "S3",
"type": "batchsource",
"label": "Amazon S3",
"artifact": {
"name": "amazon-s3-plugins",
"version": "1.11.0",
"scope": "SYSTEM"
},
"properties": {
"format": "text",
"authenticationMethod": "Access Credentials",
"filenameOnly": "false",
"recursive": "false",
"ignoreNonExistingFolders": "false",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"body\",\"type\":\"string\"}]}",
"referenceName": "aws_source",
"path": "${input.bucket}",
"accessID": "${input.access_id}",
"accessKey": "${input.access_key}"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"body\",\"type\":\"string\"}]}"
}
],
"type": "batchsource",
"label": "Amazon S3",
"icon": "icon-s3"
},
{
"name": "Wrangler",
"plugin": {
"name": "Wrangler",
"type": "transform",
"label": "Wrangler",
"artifact": {
"name": "wrangler-transform",
"version": "4.1.5",
"scope": "SYSTEM"
},
"properties": {
"field": "*",
"precondition": "false",
"threshold": "1",
"workspaceId": "804a2995-7c06-4ab2-b342-a9a01aa03a3d",
"schema": "${output.schema}",
"directives": "${directive}"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": "${output.schema}"
}
],
"inputSchema": [
{
"name": "Amazon S3",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"body\",\"type\":\"string\"}]}"
}
],
"type": "transform",
"label": "Wrangler",
"icon": "icon-DataPreparation"
},
{
"name": "GCS2",
"plugin": {
"name": "GCS",
"type": "batchsink",
"label": "GCS2",
"artifact": {
"name": "google-cloud",
"version": "0.14.2",
"scope": "SYSTEM"
},
"properties": {
"project": "auto-detect",
"suffix": "yyyy-MM-dd-HH-mm",
"format": "csv",
"serviceFilePath": "auto-detect",
"location": "us",
"referenceName": "gcs_sink",
"path": "${output.path}",
"schema": "${output.schema}"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": "${output.schema}"
}
],
"inputSchema": [
{
"name": "Wrangler",
"schema": ""
}
],
"type": "batchsink",
"label": "GCS2",
"icon": "fa-plug"
},
{
"name": "Argument Setter",
"plugin": {
"name": "ArgumentSetter",
"type": "action",
"label": "Argument Setter",
"artifact": {
"name": "argument-setter-plugins",
"version": "1.1.1",
"scope": "USER"
},
"properties": {
"method": "GET",
"connectTimeout": "60000",
"readTimeout": "60000",
"numRetries": "0",
"followRedirects": "true",
"url": "${argfile}"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": ""
}
],
"type": "action",
"label": "Argument Setter",
"icon": "fa-plug"
}
],
"schedule": "0 * * * *",
"engine": "spark",
"numOfRecordsPreview": 100,
"description": "Data Pipeline Application",
"maxConcurrentRuns": 1
}
}
Edit:
The missing columns in the output file were due to spaces in the column names. But I am now facing another issue: in Wrangler, when I pass the directive "parse-as-csv :body ',' false", the output file is empty, but when I pass "parse-as-csv :body ',' true", the output file has all the data (without the header), as expected.

How to create an ETL from BigQuery to Google Storage using CDAP?

I'm setting up CDAP in my Google Cloud environment, but I'm having problems executing the following pipeline: run a query on BigQuery and save the result as a CSV file on Google Cloud Storage.
My process was:
Install CDAP using the CDAP OSS image from the Google Cloud Marketplace.
Build the following pipeline:
{
"artifact": {
"name": "cdap-data-pipeline",
"version": "6.0.0",
"scope": "SYSTEM"
},
"description": "Data Pipeline Application",
"name": "cdap_dsc_test",
"config": {
"resources": {
"memoryMB": 2048,
"virtualCores": 1
},
"driverResources": {
"memoryMB": 2048,
"virtualCores": 1
},
"connections": [
{
"from": "BigQuery",
"to": "Google Cloud Storage"
}
],
"comments": [],
"postActions": [],
"properties": {},
"processTimingEnabled": true,
"stageLoggingEnabled": true,
"stages": [
{
"name": "BigQuery",
"plugin": {
"name": "BigQueryTable",
"type": "batchsource",
"label": "BigQuery",
"artifact": {
"name": "google-cloud",
"version": "0.12.2",
"scope": "SYSTEM"
},
"properties": {
"project": "bi-data-science",
"serviceFilePath": "/home/ubuntu/bi-data-science-cdap-4cbf526de374.json",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"destination_name\",\"type\":[\"string\",\"null\"]},{\"name\":\"destination_country\",\"type\":[\"string\",\"null\"]},{\"name\":\"timestamp\",\"type\":[\"double\",\"null\"]},{\"name\":\"desktop\",\"type\":[\"double\",\"null\"]},{\"name\":\"tablet\",\"type\":[\"double\",\"null\"]},{\"name\":\"mobile\",\"type\":[\"double\",\"null\"]}]}",
"referenceName": "test_tables",
"dataset": "google_trends",
"table": "devices"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"destination_name\",\"type\":[\"string\",\"null\"]},{\"name\":\"destination_country\",\"type\":[\"string\",\"null\"]},{\"name\":\"timestamp\",\"type\":[\"double\",\"null\"]},{\"name\":\"desktop\",\"type\":[\"double\",\"null\"]},{\"name\":\"tablet\",\"type\":[\"double\",\"null\"]},{\"name\":\"mobile\",\"type\":[\"double\",\"null\"]}]}"
}
]
},
{
"name": "Google Cloud Storage",
"plugin": {
"name": "GCS",
"type": "batchsink",
"label": "Google Cloud Storage",
"artifact": {
"name": "google-cloud",
"version": "0.12.2",
"scope": "SYSTEM"
},
"properties": {
"project": "bi-data-science",
"suffix": "yyyy-MM-dd",
"format": "json",
"serviceFilePath": "/home/ubuntu/bi-data-science-cdap-4cbf526de374.json",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"destination_name\",\"type\":[\"string\",\"null\"]},{\"name\":\"destination_country\",\"type\":[\"string\",\"null\"]},{\"name\":\"timestamp\",\"type\":[\"double\",\"null\"]},{\"name\":\"desktop\",\"type\":[\"double\",\"null\"]},{\"name\":\"tablet\",\"type\":[\"double\",\"null\"]},{\"name\":\"mobile\",\"type\":[\"double\",\"null\"]}]}",
"delimiter": ",",
"referenceName": "gcs_cdap",
"path": "gs://hurb_sandbox/cdap_experiments/"
}
},
"outputSchema": [
{
"name": "etlSchemaBody",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"destination_name\",\"type\":[\"string\",\"null\"]},{\"name\":\"destination_country\",\"type\":[\"string\",\"null\"]},{\"name\":\"timestamp\",\"type\":[\"double\",\"null\"]},{\"name\":\"desktop\",\"type\":[\"double\",\"null\"]},{\"name\":\"tablet\",\"type\":[\"double\",\"null\"]},{\"name\":\"mobile\",\"type\":[\"double\",\"null\"]}]}"
}
],
"inputSchema": [
{
"name": "BigQuery",
"schema": "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[{\"name\":\"destination_name\",\"type\":[\"string\",\"null\"]},{\"name\":\"destination_country\",\"type\":[\"string\",\"null\"]},{\"name\":\"timestamp\",\"type\":[\"double\",\"null\"]},{\"name\":\"desktop\",\"type\":[\"double\",\"null\"]},{\"name\":\"tablet\",\"type\":[\"double\",\"null\"]},{\"name\":\"mobile\",\"type\":[\"double\",\"null\"]}]}"
}
]
}
],
"schedule": "0 * * * *",
"engine": "mapreduce",
"numOfRecordsPreview": 100,
"description": "Data Pipeline Application",
"maxConcurrentRuns": 1
}
}
The credential key has owner privileges and I'm able to access the query result using the "preview" option.
Pipeline result:
Files:
_SUCCESS (empty)
part-r-00000 (query result)
No CSV file has been generated, and I also haven't found a place where I can set a name for my output file in CDAP. Did I miss any configuration step?
Update:
We eventually gave up on CDAP and are now using Google Dataflow.
When configuring the GCS sink in the pipeline, there is a 'format' field, which you have set to JSON. You can set this to CSV to achieve the format you would like.
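For illustration, here is a sketch of the GCS sink properties from the pipeline above with the format switched to CSV, mirroring the GCS2 sink used in the Data Fusion pipeline earlier on this page (schema and delimiter omitted for brevity):
"properties": {
    "project": "bi-data-science",
    "suffix": "yyyy-MM-dd",
    "format": "csv",
    "serviceFilePath": "/home/ubuntu/bi-data-science-cdap-4cbf526de374.json",
    "referenceName": "gcs_cdap",
    "path": "gs://hurb_sandbox/cdap_experiments/"
}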

Azure DevOps - Different behavior from custom task for Services and Server

I created a custom task following the documentation; it works on Azure DevOps Services, but on Azure DevOps Server it gives the error:
An error occurred while loading the YAML build pipeline. Value cannot be null. Parameter name: key
My first thought was "which parameter is missing?", so I filled in all the available and possible parameters, but the error persisted.
After that I went to the Event Viewer on the machine running Azure DevOps Server and found this error:
Detailed Message: The subscriber Pipelines Check Run: build completed event listener raised an exception while being notified of event Microsoft.TeamFoundation.Build2.Server.BuildCompletedEvent.
Exception Message: Value cannot be null.
Parameter name: definition and repository (type ArgumentNullException)
Exception Stack Trace: at Microsoft.TeamFoundation.Pipelines.Server.Providers.TfsGitProvider.TfsGitConnectionCreator.IsProviderDefinition(IVssRequestContext requestContext, BuildDefinition definition)
at Microsoft.TeamFoundation.Pipelines.Server.Extensions.BuildCompletedEventListener2.HandleCompletedEvent(IVssRequestContext requestContext, IReadOnlyBuildData build, BuildDefinition definition)
at Microsoft.TeamFoundation.Pipelines.Server.Extensions.BuildCompletedEventListener.ProcessEvent(IVssRequestContext requestContext, NotificationType notificationType, Object notificationEvent, Int32& statusCode, String& statusMessage, ExceptionPropertyCollection& properties)
at Microsoft.TeamFoundation.Framework.Server.TeamFoundationEventService.SubscriptionList.Notify(IVssRequestContext requestContext, NotificationType notificationType, Object notificationEventArgs, String& statusMessage, ExceptionPropertyCollection& properties, Exception& exception)
task.json:
{
"id": "25156245-9317-48e2-bcf4-7dab4c130a3e",
"name": "ping-pong-build-trigger",
"friendlyName": "Ping Pong Build Trigger",
"description": "Randomly trigger builds to find a sequenced build order",
"helpMarkDown": "https://github.com/brunomartinspro/Ping-Pong-Build-Trigger-AzureDevOps",
"category": "Build",
"author": "Bruno Martins (brunomartins.pro)",
"version": {
"Major": 1,
"Minor": 0,
"Patch": 0
},
"instanceNameFormat": "Ping Pong Build Trigger",
"properties": {
"mode": {
"type": "string",
"description": "Mode to be used",
"label": "Mode",
"required": "true"
},
"apiKey": {
"type": "string",
"label": "PAT",
"defaultValue": "",
"description": "Personal Access Token.",
"required": "true"
},
"source": {
"type": "string",
"label": "AzureDevOps Project URI",
"defaultValue": "http://kamina.azuredevops.local/DefaultCollection/Kamina",
"description": "AzureDevOps Project URI.",
"required": "true"
},
"projectName": {
"type": "string",
"label": "AzureDevOps Project Name",
"defaultValue": "Kamina",
"description": "AzureDevOps Project Name.",
"required": "true"
},
"sourceBranch": {
"type": "string",
"label": "Git Source Branch",
"defaultValue": "develop",
"description": "The branch the builds will trigger",
"required": "true"
},
"lastKnownFile": {
"type": "string",
"label": "Sequence Location",
"defaultValue": "",
"description": "The location of the Build Order.",
"required": "true"
},
"maxErrorCycles": {
"type": "int",
"label": "Maximum Error Cycles",
"defaultValue": 10,
"description": "The number of fails allowed.",
"required": "true"
},
"infiniteCycles": {
"type": "string",
"label": "Infinite Cycles",
"defaultValue": "false",
"description": "Infinite Cycles - only ends until everything succeeds.",
"required": "true"
}
},
"inputs": [{
"name": "mode",
"type": "string",
"label": "Mode",
"defaultValue": "AzureDevOps",
"helpMarkDown": "Mode to be used.",
"required": "true"
},
{
"name": "apiKey",
"type": "string",
"label": "PAT",
"defaultValue": "",
"helpMarkDown": "Personal Access Token.",
"required": "true"
},
{
"name": "source",
"type": "string",
"label": "AzureDevOps Project URI",
"defaultValue": "http://kamina.azuredevops.local/DefaultCollection/Kamina",
"helpMarkDown": "AzureDevOps Project URI.",
"required": "true"
},
{
"name": "projectName",
"type": "string",
"label": "AzureDevOps Project Name",
"defaultValue": "Kamina",
"helpMarkDown": "AzureDevOps Project Name.",
"required": "true"
},
{
"name": "sourceBranch",
"type": "string",
"label": "Git Source Branch",
"defaultValue": "develop",
"helpMarkDown": "The branch the builds will trigger",
"required": "true"
},
{
"name": "lastKnownFile",
"type": "string",
"label": "Sequence Location",
"defaultValue": "",
"helpMarkDown": "The location of the Build Order.",
"required": "true"
},
{
"name": "maxErrorCycles",
"type": "int",
"label": "Maximum Error Cycles",
"defaultValue": 10,
"helpMarkDown": "The number of fails allowed.",
"required": "true"
},
{
"name": "infiniteCycles",
"type": "string",
"label": "Infinite Cycles",
"defaultValue": "false",
"helpMarkDown": "Infinite Cycles - only ends until everything succeeds.",
"required": "true"
}
],
"execution": {
"PowerShell": {
"target": "ping-pong-build-trigger.ps1",
"argumentFormat": ""
}
}
}
vss-extension.json
{
"manifestVersion": 1,
"id": "ping-pong-build-trigger-task",
"name": "Ping Pong Build Trigger",
"version": "1.0.0",
"publisher": "BrunoMartinsPro",
"targets": [{
"id": "Microsoft.VisualStudio.Services"
}],
"description": "Randomly trigger builds to find a sequenced build order",
"categories": [
"Azure Pipelines"
],
"icons": {
"default": "extensionIcon.png"
},
"files": [{
"path": "task"
}],
"contributions": [{
"id": "ping-pong-build-trigger",
"type": "ms.vss-distributed-task.task",
"targets": [
"ms.vss-distributed-task.tasks"
],
"properties": {
"name": "task"
}
}]
}
How can I use a custom task in both Services and Server?
The .vsix can be downloaded from the releases page of the GitHub repository: https://github.com/brunomartinspro/Ping-Pong-Build-Trigger-AzureDevOps
Developer Community: https://developercommunity.visualstudio.com/content/problem/715570/server-and-services-have-different-behavior.html
So it appears that there is some sort of caching mechanism for extensions; I needed three Azure DevOps Server instances to debug this.
The first one was used for development, the second one also for development but with the extension uninstalled and installed again, and the third one for testing public releases.
I couldn't find the physical directory where the cache gets stored, if there is a cache at all.

SAPUI5 - Implementing smart variant management

I have implemented a SmartTable with TablePersonalisation and VariantManagement. I have deployed my application to the ABAP repository and tested it. When I save a new variant, it gets saved, but it is not retrieved.
I have not coded anything for variant management in the controller. Below is the JSON data that is passed and saved in the backend (screenshot not included):
domain:8010/sap/bc/lrep/flex/data/tracking.Component not found.
Please suggest.
I had the same problem, and to resolve the issue I had to modify the routes in neo-app.json as shown below. In my case, I have an ABAP backend system, and the way I understand it now, these routes identify where the persistence of the variants is done. Look at:
"path": "/sap/bc/lrep/changes", "path": "/sap/bc/lrep/variants", "path": "/sap/bc/lrep/flex/settings", "path": "/sap/bc/lrep/flex/data"
neo-app.json contents:
{
"welcomeFile": "/webapp/index.html",
"routes": [
{
"path": "/resources",
"target": {
"type": "service",
"name": "sapui5",
"entryPath": "/resources"
},
"description": "SAPUI5 Resources"
},
{
"path": "/test-resources",
"target": {
"type": "service",
"name": "sapui5",
"entryPath": "/test-resources"
},
"description": "SAPUI5 Resources"
},
{
"path": "/webapp/resources",
"target": {
"type": "service",
"name": "sapui5",
"entryPath": "/resources"
},
"description": "SAPUI5 Resources"
},
{
"path": "/webapp/test-resources",
"target": {
"type": "service",
"name": "sapui5",
"entryPath": "/test-resources"
},
"description": "SAPUI5 Test Resources"
},
{
"path": "/sap/opu/odata",
"target": {
"type": "destination",
"name": "S4H",
"entryPath": "/sap/opu/odata"
},
"description": "S4H"
},
{
"path": "/sap/bc/lrep/flex/data",
"target": {
"type": "destination",
"name": "S4H",
"entryPath": "/sap/bc/lrep/flex/data"
},
"description": "S4H_data"
},
{
"path": "/sap/bc/lrep/flex/settings",
"target": {
"type": "destination",
"name": "S4H",
"entryPath": "/sap/bc/lrep/flex/settings"
},
"description": "S4H_settings"
}
,
{
"path": "/sap/bc/lrep/changes",
"target": {
"type": "destination",
"name": "S4H",
"entryPath": "/sap/bc/lrep/changes"
},
"description": "S4H_changes"
}
,
{
"path": "/sap/bc/lrep/variants",
"target": {
"type": "destination",
"name": "S4H",
"entryPath": "/sap/bc/lrep/variants"
},
"description": "S4H_variants"
}
],
"sendWelcomeFileRedirect": true
}