TF51005: The query references a field that does not exist. The error is caused by «[Custom.ReflectedWorkItemId]» - azure-devops

I am trying to migrate Azure DevOps boards from one organization to another.
I am using this tool: https://github.com/nkdAgility/azure-devops-migration-tools
I followed the instructions in the documentation, but the following errors occur:
[16:43:59 INF] Processor: WorkItemMigration
[16:43:59 INF] Migration Context Start: WorkItemMigration
[16:43:59 INF] WorkItemMigrationContext::InternalExecute ...
[16:44:00 INF] MigrationClient: Access granted to https://dev.azure.com/yyy/ for xxx (xxx#xxx.com) ...
[16:44:03 INF] MigrationClient: Access granted to https://dev.azure.com/zzz/ for xxx (xxx#xxx.com)
[16:44:04 INF] Migrating all Nodes before the Processor run.
[16:44:05 WRN] The node \xxx\Iteration\Sprint 1 is being excluded due to your basePath setting.
[16:44:06 INF] Querying items to be migrated: SELECT [System.Id], [System.Tags] FROM WorkItems WHERE [System.TeamProject] = #TeamProject AND [System.WorkItemType] NOT IN ('Test Suite', 'Test Plan') ORDER BY [System.ChangedDate] desc ...
[16:44:13 INF] Replay all revisions of 20 work items?
[16:44:13 INF] Found target project as test-han
[16:44:13 INF] [FilterWorkItemsThatAlreadyExistInTarget] is enabled. Searching for work items that have already been migrated to the target...
[16:44:13 ERR] Error running query Microsoft.TeamFoundation.WorkItemTracking.Client.ValidationException: TF51005: The query references a field that does not exist. The error is caused by «[Custom.ReflectedWorkItemId]».
at Microsoft.TeamFoundation.WorkItemTracking.Client.Query.Initialize(WorkItemStore store, String wiql, IDictionary context, Int32[] ids, Int32[] revs, Boolean dayPrecision)
at Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore.Query(String wiql, IDictionary context)
at MigrationTools._EngineV1.Clients.TfsWorkItemQuery.GetWorkItemsFromQuery(TfsWorkItemMigrationClient wiClient)
in D:\a\1\s\src\MigrationTools.Clients.AzureDevops.ObjectModel\_EngineV1\Clients\TfsWorkItemQuery.cs:line 40
[16:44:13 FTL] Error while running WorkItemMigration Microsoft.TeamFoundation.WorkItemTracking.Client.ValidationException: TF51005: The query references a field that does not exist. The error is caused by «[Custom.ReflectedWorkItemId]».
at Microsoft.TeamFoundation.WorkItemTracking.Client.Query.Initialize(WorkItemStore store, String wiql, IDictionary context, Int32[] ids, Int32[] revs, Boolean dayPrecision)
at Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore.Query(String wiql, IDictionary context)
at MigrationTools._EngineV1.Clients.TfsWorkItemQuery.GetWorkItemsFromQuery(TfsWorkItemMigrationClient wiClient)
in D:\a\1\s\src\MigrationTools.Clients.AzureDevops.ObjectModel\_EngineV1\Clients\TfsWorkItemQuery.cs:line 70
at MigrationTools._EngineV1.Clients.TfsWorkItemQuery.GetWorkItems()
in D:\a\1\s\src\MigrationTools.Clients.AzureDevops.ObjectModel\_EngineV1\Clients\TfsWorkItemQuery.cs:line 30
at MigrationTools._EngineV1.Clients.TfsWorkItemMigrationClient.FilterExistingWorkItems(List`1 sourceWorkItems, TfsWiqlDefinition wiqlDefinition, TfsWorkItemMigrationClient sourceWorkItemMigrationClient)
in D:\a\1\s\src\MigrationTools.Clients.AzureDevops.ObjectModel\_EngineV1\Clients\TfsWorkItemMigrationClient.cs:line 54
at VstsSyncMigrator.Engine.WorkItemMigrationContext.InternalExecute()
in D:\a\1\s\src\VstsSyncMigrator.Core\Execution\MigrationContext\WorkItemMigrationContext.cs:line 120
at MigrationTools._EngineV1.Processors.MigrationProcessorBase.Execute()
in D:\a\1\s\src\MigrationTools\_EngineV1\Processors\MigrationProcessorBase.cs:line 47
[16:44:13 ERR] WorkItemMigration The Processor MigrationEngine entered the failed state...stopping run
[16:44:13 INF] Application is shutting down...
Below is my config file
{
"ChangeSetMappingFile": null,
"Source": {
"$type": "TfsTeamProjectConfig",
"Collection": "https://dev.azure.com/yyy/",
"Project": "y01",
"ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
"AllowCrossProjectLinking": false,
"AuthenticationMode": "AccessToken",
"PersonalAccessToken": "MY_TOKEN",
"LanguageMaps": {
"AreaPath": "Area",
"IterationPath": "Iteration"
}
},
"Target": {
"$type": "TfsTeamProjectConfig",
"Collection": "https://dev.azure.com/zzz/",
"Project": "z01",
"ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
"AllowCrossProjectLinking": false,
"AuthenticationMode": "AccessToken",
"PersonalAccessToken": "MY_TOKEN2",
"LanguageMaps": {
"AreaPath": "Area",
"IterationPath": "Iteration"
}
},
"GitRepoMapping": null,
"LogLevel": "Information",
"Processors": [
{
"$type": "WorkItemMigrationConfig",
"Enabled": true,
"ReplayRevisions": false,
"PrefixProjectToNodes": false,
"UpdateCreatedDate": true,
"UpdateCreatedBy": true,
"WIQLQueryBit": "AND [System.WorkItemType] NOT IN ('Test Suite', 'Test Plan')",
"WIQLOrderBit": "[System.ChangedDate] desc",
"LinkMigration": false,
"AttachmentMigration": false,
"AttachmentWorkingPath": "c:\temp\WorkItemAttachmentWorkingFolder\",
"FixHtmlAttachmentLinks": false,
"SkipToFinalRevisedWorkItemType": true,
"WorkItemCreateRetryLimit": 5,
"FilterWorkItemsThatAlreadyExistInTarget": true,
"PauseAfterEachWorkItem": false,
"AttachmentMaxSize": 480000000,
"AttachRevisionHistory": false,
"LinkMigrationSaveEachAsAdded": false,
"GenerateMigrationComment": false,
"NodeStructureEnricherEnabled": null,
"NodeBasePaths": [
"/"
],
"WorkItemIDs": null,
"MaxRevisions": 0
}
],
"Version": "11.11",
"workaroundForQuerySOAPBugEnabled": false,
"WorkItemTypeDefinition": {
"sourceWorkItemTypeName": "targetWorkItemTypeName"
},
"Endpoints": {
"InMemoryWorkItemEndpoints": [
{
"Name": "Source",
"EndpointEnrichers": null
},
{
"Name": "Target",
"EndpointEnrichers": null
}
]
}
}
I followed some solutions from other articles and set up a custom field called 'ReflectedItemId' for the work item type 'Product Backlog' (just this type).
I applied this custom process (inherited from Scrum) to both the source and destination projects, and it still did not fix the error.
Then I tried to run the query shown above (from the error log) with the Wiql Playground extension on the source project.
It says that "#TeamProject" is not a valid name, which means the query does not even work against the source project, let alone the subsequent steps. Am I understanding this right?
Please give me some suggestions.

This field is used to relate the migrated work items back to their source. You need to create it in your target project. To do this, follow these steps:
1. Create an inherited process.
2. For each work item type, create a new field (the first time) or reuse the existing field (for the subsequent types) with the name "ReflectedWorkItemId".
3. Start the migration again.
If you have followed these steps, the value to put in your config file is "Custom.ReflectedWorkItemId".
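Before re-running the migration, you can also sanity-check that the field's reference name really exists in the target project by listing the project's fields through the REST API. The sketch below is only illustrative and is not part of the migration tool; the organization URL, project name and PAT are placeholders taken from the question's config.
# Sketch: verify that Custom.ReflectedWorkItemId exists in the target project.
import requests

ORG = "https://dev.azure.com/zzz"   # placeholder organization
PROJECT = "z01"                     # placeholder project
AUTH = ("", "MY_TOKEN2")            # PAT goes in the basic-auth password slot

r = requests.get(f"{ORG}/{PROJECT}/_apis/wit/fields?api-version=6.0", auth=AUTH)
r.raise_for_status()
reference_names = {f["referenceName"] for f in r.json()["value"]}

# The tool expects the *reference name* (Custom.ReflectedWorkItemId),
# not the display name (ReflectedWorkItemId).
print("Custom.ReflectedWorkItemId" in reference_names)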

Interpreting sophos static file analysis score

Is there an explanation anywhere for what the score field from the sophos static/dynamic file analysis report means?
The schema simply states: Maliciousness score of the analyzed file (0 = malicious, 100 = benign).
I expected this to be interpreted the same way as the file hash lookup reputationScore:
The following ranges are defined:
[0-19]: Malware
[20-29]: PUA (potentially unwanted application)
[30-69]: Unknown/suspicious
[70-100]: Known good
However, I received a score of 10, which would indicate malware, even though I used a safe PDF file. That seems unexpected.
Does Sophos think the file is malicious, if it responds with a report score of 10 for static file analysis?
This is the response from Sophos:
{
"jobId": "3aee2c04a73bb64b3572271389cc2e95",
"jobStatus": "SUCCESS",
"report": {
"analysis_subject": {
"mime_type": "application/pdf",
"sha1": "5b03ccec77b416805d6d8e270d33942aaedcc6dd",
"sha256": "f6edcd8a1b4f7cb85486d0c6777f9174eadbc4d1d0d9e5aeba7132f30b34bc3e"
},
"analysis_summary": [
{
"description": "Document contains links to external domains",
"name": "edr_contains_domain_links",
"severity": 1
},
{
"description": "Document file size is small",
"name": "edr_info_file_size_small",
"severity": 1
},
{
"description": "Document has a small number of pages",
"name": "edr_info_page_count_small",
"severity": 1
}
],
"analysis_type": "static",
"detection": {
"permalink": "https://www.virustotal.com/gui/file/f6edcd8a1b4f7cb85486d0c6777f9174eadbc4d1d0d9e5aeba7132f30b34bc3e/detection/f-f6edcd8a1b4f7cb85486d0c6777f9174eadbc4d1d0d9e5aeba7132f30b34bc3e-1656684162",
"positives": 0,
"sophos": "",
"sophos_ml": "",
"total": 59
},
"document_analysis": {
"meta_data": {
"author": "Yukon Department of Education",
"bytes": 20597,
"content_type": "PDF",
"encryption": "Standard V2.3 (128-bit)",
"language": "EN-US",
"last_saved_time": "2008-06-04T15:47:36Z",
"num_pages": 1,
"title": "PDF Test Page",
"version": 1.6
}
},
"linked_with_dynamic_analysis": false,
"ml_aggregate_results": {
"overall_score": 30
},
"ml_file": {
"analyses": {
"black_box": {
"benign": {
"raw": 0.39815810322761536,
"score": 30
},
"model_name": "dsml_model_pdf",
"model_version": "20211118"
},
"feature_intersections": [
{
"benign": 7120629,
"benign_fraction": 0.7120629263895423,
"category": "severity=1",
"description": "Feature NOT Observed: Document file size is large",
"indicator": "Feature NOT Observed: Document file size is large --> severity=1",
"malware": 9997092,
"malware_fraction": 0.9997092138044599,
"probability": 0.5840200283264096,
"scale_factor": 10000000
},
{
"benign": 617857,
"benign_fraction": 0.06178572053380388,
"category": "severity=1",
"description": "Feature Observed: Document has a small number of pages",
"indicator": "Feature Observed: Document has a small number of pages --> severity=1",
"malware": 5394909,
"malware_fraction": 0.5394909111791909,
"probability": 0.8972424383801834,
"scale_factor": 10000000
},
{
"benign": 537367,
"benign_fraction": 0.0537367720738131,
"category": "severity=2",
"description": "Feature NOT Observed: Document contains behaviour that executes on open",
"indicator": "Feature NOT Observed: Document contains behaviour that executes on open --> severity=2",
"malware": 5372856,
"malware_fraction": 0.5372856378987031,
"probability": 0.9090782833830074,
"scale_factor": 10000000
},
{
"benign": 536021,
"benign_fraction": 0.0536021766615527,
"category": "severity=2",
"description": "Feature NOT Observed: Document contains javascript",
"indicator": "Feature NOT Observed: Document contains javascript --> severity=2",
"malware": 5371043,
"malware_fraction": 0.5371043501402938,
"probability": 0.9092575175159124,
"scale_factor": 10000000
},
{
"benign": 509534,
"benign_fraction": 0.050953467474390626,
"category": "severity=1",
"description": "Feature NOT Observed: Document is possibly a phishing PDF",
"indicator": "Feature NOT Observed: Document is possibly a phishing PDF --> severity=1",
"malware": 5364929,
"malware_fraction": 0.5364929575695063,
"probability": 0.9132627839711798,
"scale_factor": 10000000
},
{
"benign": 428577,
"benign_fraction": 0.04285774978628207,
"category": "severity=2",
"description": "Feature NOT Observed: Field contains potentially suspicious content",
"indicator": "Feature NOT Observed: Field contains potentially suspicious content --> severity=2",
"malware": 5364293,
"malware_fraction": 0.5364293727421457,
"probability": 0.9260163947729065,
"scale_factor": 10000000
}
],
"feature_maliciousness": {
"Document contains links to external domains --> severity=1": {
"benign": 1828421,
"benign_fraction": 0.18284217995357024,
"category": "severity=1",
"description": "Document contains links to external domains",
"indicator": "Document contains links to external domains --> severity=1",
"malware": 9869267,
"malware_fraction": 0.9869267695627242,
"probability": 0.8436937653122213,
"scale_factor": 10000000
},
"Document file size is small --> severity=1": {
"benign": 7120119,
"benign_fraction": 0.7120119817558322,
"category": "severity=1",
"description": "Document file size is small",
"indicator": "Document file size is small --> severity=1",
"malware": 9899084,
"malware_fraction": 0.9899084471664678,
"probability": 0.581642026468478,
"scale_factor": 10000000
},
"Document has a small number of pages --> severity=1": {
"benign": 4086919,
"benign_fraction": 0.40869198927301453,
"category": "severity=1",
"description": "Document has a small number of pages",
"indicator": "Document has a small number of pages --> severity=1",
"malware": 5416129,
"malware_fraction": 0.5416129670799382,
"probability": 0.569935959461394,
"scale_factor": 10000000
}
},
"genetic_analysis": {
"neighbor_info": {
"1f1006182c2e9b6e2b09b07f9be9e122fdc1e681577af68984ab63a076a15fed": {
"filepath": "1f1006182c2e9b6e2b09b07f9be9e122fdc1e681577af68984ab63a076a15fed",
"is_malware": false,
"match_percentage": 0.25,
"score": 66.06397
},
"672cfdffbc33f07c0ad65633cbf610c5ec4bb7787c72d84a5460266aaa9a2dfa": {
"filepath": "672cfdffbc33f07c0ad65633cbf610c5ec4bb7787c72d84a5460266aaa9a2dfa",
"is_malware": false,
"match_percentage": 0.21875,
"score": 62.829075
},
"6cdde8eee67aa38917dfa4249f91381ffa983f2ff95a84d0f6076a4ddecf3de8": {
"filepath": "6cdde8eee67aa38917dfa4249f91381ffa983f2ff95a84d0f6076a4ddecf3de8",
"is_malware": false,
"match_percentage": 0.21875,
"score": 63.53914
},
"9a0d27944893e40316037fd47fb4d9836c1518705b1baa4a0ebf0fe34b045c00": {
"filepath": "9a0d27944893e40316037fd47fb4d9836c1518705b1baa4a0ebf0fe34b045c00",
"is_malware": false,
"match_percentage": 0.1875,
"score": 58.78177
},
"a881bffc0893ae55112a9370f9cf693c3893d672b96c2160e341d9f20d47cd2f": {
"filepath": "a881bffc0893ae55112a9370f9cf693c3893d672b96c2160e341d9f20d47cd2f",
"is_malware": false,
"match_percentage": 0.8125,
"score": 234.79837
},
"add263021a636c93d1fd6f9d7ac880ac8afaacc917dca01dbb66d388c71d1e6c": {
"filepath": "add263021a636c93d1fd6f9d7ac880ac8afaacc917dca01dbb66d388c71d1e6c",
"is_malware": false,
"match_percentage": 0.1875,
"score": 59.46551
}
},
"neighbor_matrix": {
"1f1006182c2e9b6e2b09b07f9be9e122fdc1e681577af68984ab63a076a15fed": {
"0_6659": false,
"10_9152": false,
"11_4861": false,
"12_5543": false,
"13_3732": false,
"14_5431": false,
"15_5899": false,
"16_1078": false,
"17_2637": true,
"18_6885": false,
"19_8710": false,
"1_7974": false,
"20_6372": true,
"21_7672": false,
"22_8447": false,
"23_5023": false,
"24_7353": false,
"25_4809": false,
"26_7069": true,
"27_5993": false,
"28_2717": true,
"29_2739": true,
"2_7985": true,
"30_7482": true,
"31_5233": false,
"3_7524": false,
"4_6424": true,
"5_110": false,
"6_8324": false,
"7_6214": false,
"8_7332": false,
"9_8770": false
},
"672cfdffbc33f07c0ad65633cbf610c5ec4bb7787c72d84a5460266aaa9a2dfa": {
"0_6659": false,
"10_9152": false,
"11_4861": false,
"12_5543": false,
"13_3732": false,
"14_5431": false,
"15_5899": false,
"16_1078": false,
"17_2637": true,
"18_6885": false,
"19_8710": false,
"1_7974": true,
"20_6372": true,
"21_7672": false,
"22_8447": false,
"23_5023": false,
"24_7353": false,
"25_4809": true,
"26_7069": false,
"27_5993": false,
"28_2717": true,
"29_2739": false,
"2_7985": false,
"30_7482": false,
"31_5233": false,
"3_7524": false,
"4_6424": true,
"5_110": false,
"6_8324": false,
"7_6214": false,
"8_7332": false,
"9_8770": true
},
"6cdde8eee67aa38917dfa4249f91381ffa983f2ff95a84d0f6076a4ddecf3de8": {
"0_6659": false,
"10_9152": false,
"11_4861": false,
"12_5543": false,
"13_3732": false,
"14_5431": false,
"15_5899": false,
"16_1078": true,
"17_2637": false,
"18_6885": false,
"19_8710": false,
"1_7974": false,
"20_6372": true,
"21_7672": false,
"22_8447": false,
"23_5023": false,
"24_7353": false,
"25_4809": false,
"26_7069": true,
"27_5993": false,
"28_2717": true,
"29_2739": false,
"2_7985": true,
"30_7482": true,
"31_5233": false,
"3_7524": false,
"4_6424": true,
"5_110": false,
"6_8324": false,
"7_6214": false,
"8_7332": false,
"9_8770": false
},
"9a0d27944893e40316037fd47fb4d9836c1518705b1baa4a0ebf0fe34b045c00": {
"0_6659": false,
"10_9152": false,
"11_4861": false,
"12_5543": false,
"13_3732": true,
"14_5431": false,
"15_5899": false,
"16_1078": true,
"17_2637": false,
"18_6885": true,
"19_8710": false,
"1_7974": true,
"20_6372": false,
"21_7672": false,
"22_8447": false,
"23_5023": false,
"24_7353": true,
"25_4809": false,
"26_7069": false,
"27_5993": false,
"28_2717": true,
"29_2739": false,
"2_7985": false,
"30_7482": false,
"31_5233": false,
"3_7524": false,
"4_6424": false,
"5_110": false,
"6_8324": false,
"7_6214": false,
"8_7332": false,
"9_8770": false
},
"a881bffc0893ae55112a9370f9cf693c3893d672b96c2160e341d9f20d47cd2f": {
"0_6659": true,
"10_9152": true,
"11_4861": true,
"12_5543": true,
"13_3732": true,
"14_5431": true,
"15_5899": true,
"16_1078": true,
"17_2637": true,
"18_6885": false,
"19_8710": true,
"1_7974": false,
"20_6372": true,
"21_7672": true,
"22_8447": false,
"23_5023": true,
"24_7353": true,
"25_4809": true,
"26_7069": true,
"27_5993": true,
"28_2717": true,
"29_2739": true,
"2_7985": true,
"30_7482": true,
"31_5233": false,
"3_7524": true,
"4_6424": true,
"5_110": false,
"6_8324": true,
"7_6214": false,
"8_7332": true,
"9_8770": true
},
"add263021a636c93d1fd6f9d7ac880ac8afaacc917dca01dbb66d388c71d1e6c": {
"0_6659": false,
"10_9152": false,
"11_4861": false,
"12_5543": false,
"13_3732": false,
"14_5431": false,
"15_5899": false,
"16_1078": false,
"17_2637": false,
"18_6885": false,
"19_8710": false,
"1_7974": false,
"20_6372": true,
"21_7672": false,
"22_8447": false,
"23_5023": false,
"24_7353": false,
"25_4809": true,
"26_7069": false,
"27_5993": false,
"28_2717": true,
"29_2739": false,
"2_7985": true,
"30_7482": true,
"31_5233": false,
"3_7524": false,
"4_6424": true,
"5_110": false,
"6_8324": false,
"7_6214": false,
"8_7332": false,
"9_8770": false
}
}
}
},
"analyzed_counts": {
"black_box": {
"benign": 0,
"malware": 0
},
"feature_intersections": {
"benign": 2798922,
"malware": 6340055
},
"feature_maliciousness": {
"benign": 2798922,
"malware": 6340055
},
"genetic_analysis": {
"benign": 7701633,
"malware": 2298367
}
},
"overall_score": 30,
"overall_scores": {
"black_box": 30,
"feature_intersections": 15,
"feature_maliciousness": 15,
"genetic_analysis": 13
}
},
"ml_filepath": {
"analyses": {
"neighbor_maliciousness": {
"most_similar": [],
"most_similar_benign": [],
"most_similar_malware": []
}
},
"analyzed_counts": {
"neighbor_maliciousness": {
"benign": -1,
"malware": -1
}
},
"overall_score": -1,
"overall_scores": {
"neighbor_maliciousness": -1
}
},
"ml_inputs": {
"filepath": null
},
"object_type": "file",
"reputation": {
"first_seen": "2022-02-08T19:28:46",
"last_seen": "2022-07-04T07:46:43",
"prevalence": "Popular",
"score": 62,
"score_string": "Prevalent"
},
"schema_version": "1.1.0",
"score": 10,
"submission": "2022-07-04T08:43:34Z",
"target": {
"file_name": "pdf-test.pdf",
"mime_type": "application/pdf",
"object_id": "f6edcd8a1b4f7cb85486d0c6777f9174eadbc4d1d0d9e5aeba7132f30b34bc3e",
"sha1": "5b03ccec77b416805d6d8e270d33942aaedcc6dd",
"sha256": "f6edcd8a1b4f7cb85486d0c6777f9174eadbc4d1d0d9e5aeba7132f30b34bc3e"
}
},
"requestId": "68db2f66-c63e-4a04-93f9-7067231e42e1"
}
File: https://www.orimi.com/pdf-test.pdf
There are a couple of interesting points in your question. Let's start with the scoring.
You are correct that the API documentation is not entirely accurate. A score below 20 is malicious and above 70 is clean. You can see a sample implementation of processing the scores around line 139 here.
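As a rough illustration, mapping a score onto the ranges quoted in the question (together with the <20 / >70 thresholds above) would look something like the snippet below; the exact cut-offs should be confirmed against Sophos' own sample code.
# Sketch: interpret an analysis score using the bands quoted for reputationScore.
def interpret_score(score: int) -> str:
    # Bands taken from the documentation quoted in the question; treat them as
    # an approximation for the static/dynamic analysis report score as well.
    if score < 20:
        return "malware"
    if score < 30:
        return "PUA (potentially unwanted application)"
    if score < 70:
        return "unknown/suspicious"
    return "known good"

print(interpret_score(10))  # the report score in the question -> "malware"
print(interpret_score(62))  # the reputation score in the report -> "unknown/suspicious"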
In the case of the report that you provided, the ML analyzers are causing the file to be convicted. From the report, it looks like the following file features (which are commonly seen in malicious files) are leading the ML model to believe the file is malicious:
Document contains links to external domains
Document file size is small
Document has a small number of pages
Looking at the dynamic analysis results and the information from VirusTotal etc., this could be a false positive and should be escalated to Sophos. The escalation path for FPs/FNs is here:
https://support.sophos.com/support/s/filesubmission?language=en_US

Configuring keycloak as IDP in another keycloak

I am trying to configure one Keycloak instance as an identity provider (IdP) in another Keycloak instance. In my test setup there are two Keycloak containers: keycloak-1 and keycloak-2.
In keycloak-1, I have created an OpenID client called idp-client. The configuration exported as JSON is shown below:
{
"clientId": "idp-client",
"surrogateAuthRequired": false,
"enabled": true,
"alwaysDisplayInConsole": false,
"clientAuthenticatorType": "client-secret",
"redirectUris": [
"http://localhost:8081/auth/realms/master/broker/oidc/endpoint"
],
"webOrigins": [],
"notBefore": 0,
"bearerOnly": false,
"consentRequired": false,
"standardFlowEnabled": true,
"implicitFlowEnabled": false,
"directAccessGrantsEnabled": true,
"serviceAccountsEnabled": false,
"publicClient": false,
"frontchannelLogout": false,
"protocol": "openid-connect",
"attributes": {
"saml.assertion.signature": "false",
"saml.force.post.binding": "false",
"saml.multivalued.roles": "false",
"saml.encrypt": "false",
"saml.server.signature": "false",
"saml.server.signature.keyinfo.ext": "false",
"exclude.session.state.from.auth.response": "false",
"saml_force_name_id_format": "false",
"saml.client.signature": "false",
"tls.client.certificate.bound.access.tokens": "false",
"saml.authnstatement": "false",
"display.on.consent.screen": "false",
"saml.onetimeuse.condition": "false"
},
"authenticationFlowBindingOverrides": {},
"fullScopeAllowed": true,
"nodeReRegistrationTimeout": -1,
"defaultClientScopes": [
"web-origins",
"role_list",
"profile",
"roles",
"email"
],
"optionalClientScopes": [
"address",
"phone",
"offline_access",
"microprofile-jwt"
],
"access": {
"view": true,
"configure": true,
"manage": true
}
}
In keycloak-2, I have added keycloak-1 as an identity provider. I have attached images of the configuration for reference.
Now, when I try to log in to keycloak-2 using keycloak-1's user, after a successful login I am redirected back to keycloak-2, but to an error page showing "Unexpected error when authenticating with identity provider".
The stack trace in keycloak-2's terminal is as follows:
keycloak2_1 | 08:21:17,079 TRACE [org.keycloak.events] (default task-10) type=CODE_TO_TOKEN_ERROR, realmId=master, clientId=idp-client, userId=null, ipAddress=127.0.0.1, error=invalid_client_credentials, grant_type=authorization_code, requestUri=http://localhost:8080/auth/realms/master/protocol/openid-connect/token, cookies=[]
...
...
keycloak2_1 | 08:21:17,081 TRACE [org.keycloak.events] (default task-12) type=LOGIN_ERROR, realmId=master, clientId=null, userId=null, ipAddress=172.21.0.1, error=identity_provider_login_failure, requestUri=http://localhost:8081/auth/realms/master/broker/oidc/endpoint?state=MOz5_i2-SpoSLRtS4IWkXtUBsSzGciBysUrdYq8gGy0.k0e5WlQElRw.security-admin-console&session_state=1cc90330-cf3c-45c3-a44e-53fc71b17bb1&code=be12828b-3408-4822-b275-31afeb1c0405.1cc90330-cf3c-45c3-a44e-53fc71b17bb1.eb068c43-9f4e-45b7-b2e8-ac346f139141, cookies=[KEYCLOAK_SESSION_LEGACY=master/ce30941f-407f-4344-9019-58d2f26ea832/1cc90330-cf3c-45c3-a44e-53fc71b17bb1, PrivacyPolicy=accepted, KEYCLOAK_IDENTITY_LEGACY=eyJhbGciOiJIUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICIxNjgzNzMzNi1mMTA3LTRiMTktODk1Yi0wNWJmZDliZGIzYTUifQ.eyJleHAiOjE2MDk3ODQ0NzcsImlhdCI6MTYwOTc0ODQ3NywianRpIjoiNTQ3YmRlZGQtOGQyYy00MTJkLWJhMzYtZjkyNzAwNjlhYzQxIiwiaXNzIjoiaHR0cDovL2xvY2FsaG9zdDo4MDgwL2F1dGgvcmVhbG1zL21hc3RlciIsInN1YiI6ImNlMzA5NDFmLTQwN2YtNDM0NC05MDE5LTU4ZDJmMjZlYTgzMiIsInR5cCI6IlNlcmlhbGl6ZWQtSUQiLCJzZXNzaW9uX3N0YXRlIjoiMWNjOTAzMzAtY2YzYy00NWMzLWE0NGUtNTNmYzcxYjE3YmIxIiwic3RhdGVfY2hlY2tlciI6IkZFT3psWkRScXhqREVVOGV6MFd6MU5menZGYU1jNHRzV1J4TnlrNUc5NDQifQ.YeBB6vJYZ9Z4IUXY2-og17EMRodUdeqTBTQ31pY3P1s, AUTH_SESSION_ID_LEGACY=1cc90330-cf3c-45c3-a44e-53fc71b17bb1.4c93e8da0ca1]
In keycloak-1, the user is logged in, i.e. a session is shown for idp-client.
The mentioned issue was rectified once SSL was enabled.
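For reference, one thing worth checking in a plain-HTTP setup like this is the realm's "Require SSL" setting, which can be read through the admin REST API. The sketch below assumes the legacy /auth base path from the question, keycloak-2's port, and placeholder admin credentials; adjust for your environment.
# Sketch: read keycloak-2's master realm configuration and print its SSL requirement.
import requests

BASE = "http://localhost:8081/auth"  # keycloak-2, legacy /auth base path (assumption)

# Obtain an admin token with the built-in admin-cli client (placeholder credentials).
token = requests.post(
    f"{BASE}/realms/master/protocol/openid-connect/token",
    data={"grant_type": "password", "client_id": "admin-cli",
          "username": "admin", "password": "admin"},
).json()["access_token"]

# The realm representation includes "sslRequired" ("external" by default,
# which demands HTTPS for any non-local address).
realm = requests.get(
    f"{BASE}/admin/realms/master",
    headers={"Authorization": f"Bearer {token}"},
).json()
print(realm.get("sslRequired"))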

How do I Migrate In-Line Links to Other Work Items That Point to New Project Work Items

I've got the Azure DevOps Migration Tools set up to where it looks like everything is coming over correctly, with one exception: when in-line links to work items are brought over, the link still references the old project instead of the new one. I'm assuming that I'm missing some attribute and that is why the tool still references the source project, but I can't for the life of me find said attribute.
Example:
There are two projects: "Test Source Project" and "Test Target Project".
When "Test Source Project" gets migrated to "Test Target Project", the links in "Test Target Project" still reference the original task in "Test Source Project". Below is a screenshot of what I'm referencing.
I'm expecting the link to be: https://dev.azure.com/Company/Test%20Target%20Project/_workitems/edit/75
But I'm getting https://dev.azure.com/Company/Test%20Source%20Project/_workitems/edit/75
The version I am on is 8.9 and here's my config:
{
"Version": "8.9",
"TelemetryEnableTrace": false,
"workaroundForQuerySOAPBugEnabled": false,
"Source": {
"Collection": "https://dev.azure.com/Company/",
"Project": "Test Source Project",
"ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
"AllowCrossProjectLinking": false,
"PersonalAccessToken": ""
},
"Target": {
"Collection": "https://dev.azure.com/grda365/",
"Project": "Test Target Project",
"ReflectedWorkItemIDFieldName": "Custom.ReflectedWorkItemId",
"AllowCrossProjectLinking": false,
"PersonalAccessToken": ""
},
"FieldMaps": [],
"WorkItemTypeDefinition": {
"sourceWorkItemTypeName": "targetWorkItemTypeName"
},
"GitRepoMapping": null,
"Processors": [
{
"ObjectType": "VstsSyncMigrator.Engine.Configuration.Processing.NodeStructuresMigrationConfig",
"PrefixProjectToNodes": false,
"Enabled": true,
"BasePaths": []
},
{
"ObjectType": "VstsSyncMigrator.Engine.Configuration.Processing.WorkItemMigrationConfig",
"ReplayRevisions": true,
"PrefixProjectToNodes": false,
"UpdateCreatedDate": true,
"UpdateCreatedBy": true,
"UpdateSourceReflectedId": false,
"BuildFieldTable": false,
"AppendMigrationToolSignatureFooter": false,
"QueryBit": "AND [System.WorkItemType] NOT IN ('Test Suite', 'Test Plan')",
"OrderBit": "[System.ChangedDate] desc",
"Enabled": true,
"LinkMigration": true,
"AttachmentMigration": true,
"AttachmentWorkingPath": "c:\\temp\\WorkItemAttachmentWorkingFolder\\",
"FixHtmlAttachmentLinks": false,
"SkipToFinalRevisedWorkItemType": false,
"WorkItemCreateRetryLimit": 5,
"FilterWorkItemsThatAlreadyExistInTarget": false,
"PauseAfterEachWorkItem": false,
"AttachmentMazSize": 480000000,
"CollapseRevisions": false
},
{
"ObjectType": "VstsSyncMigrator.Engine.Configuration.Processing.WorkItemQueryMigrationConfig",
"Enabled": true
}
]
}
The tool does not update inline links.
This is an implementation issue, as we migrate by iterating through all of the existing work items. For work item links we can only add links to work items that already exist in the target, and the link is added from both ends once the other item has been added.
For example, if we are migrating #1, #2 and #3, and #1 and #2 reference #3, then:
#1 is migrated and no links are added, as #3 does not exist yet
#2 is migrated and no links are added, as #3 does not exist yet
#3 is migrated and links are added to #1 and #2
At the point of adding #3 and creating the links, there is no way to know which work items have inline links to any other work items.
Ideas for fixing
OK, so that's how the tool currently works; now I am imagining a fix. There could be an option "RefactorInlineLinks" that parses any description it encounters and fixes the link if it is in scope for the migration.
However, this would only work as a second pass after the migration has completed and all of the work items that will exist do exist.
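As an illustration of that second-pass idea, the sketch below queries the target project for items that carry the reflected-ID field, builds a source-ID to target-ID map, and rewrites in-line edit links in the descriptions. Everything in it is an assumption made for the example (org/project names, PAT, and that Custom.ReflectedWorkItemId ends with the source work item's ID); it is not a feature of the tool.
# Hypothetical "RefactorInlineLinks" second pass, run after the migration completes.
import re
import requests

ORG = "https://dev.azure.com/Company"                 # illustrative
TARGET_PROJECT = "Test Target Project"
SOURCE_EDIT_URL = "Test%20Source%20Project/_workitems/edit/"
TARGET_EDIT_URL = "Test%20Target%20Project/_workitems/edit/"
AUTH = ("", "MY_PAT")                                 # PAT as the basic-auth password

def build_source_to_target_map() -> dict:
    """Map source work item id -> target work item id via Custom.ReflectedWorkItemId."""
    wiql = {"query": "SELECT [System.Id] FROM WorkItems "
                     f"WHERE [System.TeamProject] = '{TARGET_PROJECT}' "
                     "AND [Custom.ReflectedWorkItemId] <> ''"}
    r = requests.post(f"{ORG}/{TARGET_PROJECT}/_apis/wit/wiql?api-version=6.0",
                      json=wiql, auth=AUTH)
    r.raise_for_status()
    mapping = {}
    for ref in r.json()["workItems"]:
        item = requests.get(f"{ORG}/_apis/wit/workitems/{ref['id']}?api-version=6.0",
                            auth=AUTH).json()
        reflected = item["fields"].get("Custom.ReflectedWorkItemId", "")
        source_id = reflected.rstrip("/").rsplit("/", 1)[-1]  # assumes the field holds the source URL
        if source_id.isdigit():
            mapping[source_id] = ref["id"]
    return mapping

def refactor_inline_links() -> None:
    mapping = build_source_to_target_map()
    pattern = re.compile(re.escape(SOURCE_EDIT_URL) + r"(\d+)")
    for target_id in mapping.values():
        item = requests.get(f"{ORG}/_apis/wit/workitems/{target_id}?api-version=6.0",
                            auth=AUTH).json()
        description = item["fields"].get("System.Description", "")
        new_description = pattern.sub(
            lambda m: TARGET_EDIT_URL + str(mapping.get(m.group(1), m.group(1))),
            description)
        if new_description != description:
            patch = [{"op": "add", "path": "/fields/System.Description",
                      "value": new_description}]
            requests.patch(
                f"{ORG}/_apis/wit/workitems/{target_id}?api-version=6.0",
                json=patch, auth=AUTH,
                headers={"Content-Type": "application/json-patch+json"},
            ).raise_for_status()

refactor_inline_links()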

import metadata from RDBMS into Apache Atlas

I am learning Atlas and trying to find a way to import metadata from an RDBMS such as SQL Server or PostgreSQL.
Could somebody provide references or steps for doing this?
I am using Atlas in Docker with the built-in HBase and Solr. The intention is to import metadata from AWS RDS.
Update 1
To rephrase my question: can we import metadata directly from RDS SQL Server or PostgreSQL without importing the actual data into Hive (Hadoop)?
Any comments or answers are appreciated. Thank you!
AFAIK, Atlas works on the Hive metastore.
Below is the AWS documentation on how to do it on AWS EMR while creating the cluster itself: ... Metadata classification, lineage, and discovery using Apache Atlas on Amazon EMR
Here is a Cloudera source from a Sqoop standpoint.
From the Cloudera source (the "Populate metadata repository from RDBMS in Apache Atlas" question on Cloudera):
1) You create the new types in Atlas. For example, in the case of Oracle: an Oracle table type, column type, etc.
2) Create a script or process that pulls the metadata from the source metadata store.
3) Once you have the metadata you want to store in Atlas, your process creates the associated Atlas entities, based on the new types, using the Java API or JSON representations through the REST API directly. If you wanted to, you could add lineage as you store the new entities.
The documentation below has step-by-step details on how to use Sqoop to move data from any RDBMS into Hive:
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.5.3/bk_data-access/content/using_sqoop_to_move_...
You can refer to this as well: http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal
To get the metadata of all this Sqoop-imported data into Atlas, make sure the configurations below are set properly:
http://atlas.incubator.apache.org/Bridge-Sqoop.html
Please note that the above configuration step is not needed if your cluster configuration is managed by Ambari.
Using the REST API is one good way to publish MySQL metadata to the Atlas catalog.
Another way is using Spark with Hive support (enableHiveSupport()): Spark reads MySQL using JDBC and writes into Hive; or you can use Sqoop.
To help create the RDBMS-related instances (DB, tables, columns), I have created a GitHub repository that contains a template that can help you understand how to add RDBMS or MySQL entities to Atlas:
https://github.com/vettrivikas/Apche-Atlas-for-RDBMS
We can use the REST API to create a type and then send data to it.
For example, let's say I have a dashboard and a visualization on it. I can create a type definition and then push data to it:
{
"entityDefs": [
{
"superTypes": [
"DataSet"
],
"name": "Dashboard",
"description": "The definition of a Dashboard",
"attributeDefs": [
{
"name": "name",
"typeName": "string",
"isOptional": true,
"cardinality": "SINGLE",
"valuesMinCount": -1,
"valuesMaxCount": 1,
"isUnique": false,
"isIndexable": false,
"includeInNotification": false,
"searchWeight": -1
},
{
"name": "childDataset",
"typeName": "array<Visualization>",
"isOptional": true,
"cardinality": "SET",
"valuesMinCount": 0,
"valuesMaxCount": 2147483647,
"isUnique": false,
"isIndexable": false,
"includeInNotification": false,
"searchWeight": -1
}
]
},
{
"superTypes": [
"DataSet"
],
"name": "Visualization",
"description": "The definition of a Dashboard",
"attributeDefs": [
{
"name": "name",
"typeName": "string",
"isOptional": true,
"cardinality": "SINGLE",
"valuesMinCount": -1,
"valuesMaxCount": 1,
"isUnique": false,
"isIndexable": false,
"includeInNotification": false,
"searchWeight": -1
},
{
"name": "parentDataset",
"typeName": "array<Dashboard>",
"isOptional": true,
"cardinality": "SET",
"valuesMinCount": 0,
"valuesMaxCount": 2147483647,
"isUnique": false,
"isIndexable": false,
"includeInNotification": false,
"searchWeight": -1
}
]
}
],
"relationshipDefs": [
{
"category": "RELATIONSHIP",
"name": "dashboards_visualization_assignment",
"description": "The relationship between a Dashboard and a Visualization",
"relationshipCategory": "ASSOCIATION",
"attributeDefs": [],
"propagateTags": "NONE",
"endDef1": {
"type": "Dashboard",
"name": "childDataset",
"isContainer": false,
"cardinality": "SET",
"isLegacyAttribute": false
},
"endDef2": {
"type": "Visualization",
"name": "parentDataset",
"isContainer": false,
"cardinality": "SET",
"isLegacyAttribute": false
}
}
]
}
Then, you can simply add data using a REST call to {servername}:{port}/api/atlas/v2/entity/bulk:
{
"entities": [
{
"typeName": "Dashboard",
"guid": -1000,
"createdBy": "admin",
"attributes": {
"name": "sample dashboard",
"childDataset": [
{
"guid": "-200",
"typeName": "Visualization"
}
]
}
}
],
"referredEntities": {
"-200": {
"guid": "-200",
"typeName": "Visualization",
"attributes": {
"qualifiedName": "bar-chart"
}
}
}
}
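To actually register the two documents above against a running Atlas instance, something like the following sketch works; the localhost:21000 address, the admin/admin credentials and the file names are assumptions for a local Docker setup.
# Sketch: push the type definitions and then the entities to Atlas.
import json
import requests

ATLAS = "http://localhost:21000"   # default Atlas port (assumption)
AUTH = ("admin", "admin")          # default credentials (assumption)

# 1. Register the Dashboard/Visualization types and their relationship
#    (the first JSON document above, saved to a file).
with open("dashboard_typedefs.json") as f:
    typedefs = json.load(f)
requests.post(f"{ATLAS}/api/atlas/v2/types/typedefs",
              json=typedefs, auth=AUTH).raise_for_status()

# 2. Create the entities in bulk (the second JSON document above).
with open("dashboard_entities.json") as f:
    entities = json.load(f)
requests.post(f"{ATLAS}/api/atlas/v2/entity/bulk",
              json=entities, auth=AUTH).raise_for_status()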
Now, look for the entities in Atlas.
Dashboard Entity on Atlas

Azure DevOps Pipeline - Release API - Update Release Definition add and remove a templated stage

We have an Azure DevOps Pipeline release definition, and I am looking at putting some automation in place so that a new 'Stage' is created from a template when a pull request is opened against branch x, and the stage is removed when the branch is deleted. I will be using GitHub Actions for this.
The API docs are not super easy to follow.
My questions are:
Is this possible? Does the API support adding and removing stages on a _releaseDefinition?
If so, are there any examples of how to do this?
Docs:
https://learn.microsoft.com/en-us/rest/api/azure/devops/release/releases/update%20release?view=azure-devops-rest-5.1#releasedefinitionshallowreference
The API you should use is the update-definition API:
PUT https://vsrm.dev.azure.com/{org name}/{project name}/_apis/release/definitions?api-version=5.1
For the request body, adding or removing a stage in fact only changes the environments parameter:
One stage definition corresponds to one grey code block in my screenshots.
Adding a stage: add a template JSON block for the stage definition (the grey block shown in my screenshots on the left). This code structure is fixed.
Removing a stage: remove the complete corresponding stage definition.
Here is one complete stage definition sample:
{
"id": -1,
"name": "Stage 3",
"rank": 3,
"variables": {},
"variableGroups": [],
"preDeployApprovals": {
"approvals": [
{
"rank": 1,
"isAutomated": true,
"isNotificationOn": false,
"id": 7
}
],
"approvalOptions": {
"requiredApproverCount": null,
"releaseCreatorCanBeApprover": false,
"autoTriggeredAndPreviousEnvironmentApprovedCanBeSkipped": false,
"enforceIdentityRevalidation": false,
"timeoutInMinutes": 0,
"executionOrder": "beforeGates"
}
},
"deployStep": {
"id": 8
},
"postDeployApprovals": {
"approvals": [
{
"rank": 1,
"isAutomated": true,
"isNotificationOn": false,
"id": 9
}
],
"approvalOptions": {
"requiredApproverCount": null,
"releaseCreatorCanBeApprover": false,
"autoTriggeredAndPreviousEnvironmentApprovedCanBeSkipped": false,
"enforceIdentityRevalidation": false,
"timeoutInMinutes": 0,
"executionOrder": "afterSuccessfulGates"
}
},
"deployPhases": [
{
"deploymentInput": {
"parallelExecution": {
"parallelExecutionType": "none"
},
"agentSpecification": {
"identifier": "vs2017-win2016"
},
"skipArtifactsDownload": false,
"artifactsDownloadInput": {
"downloadInputs": []
},
"queueId": 247,
"demands": [],
"enableAccessToken": false,
"timeoutInMinutes": 0,
"jobCancelTimeoutInMinutes": 1,
"condition": "succeeded()",
"overrideInputs": {}
},
"rank": 1,
"phaseType": "agentBasedDeployment",
"name": "Agent job",
"refName": null,
"workflowTasks": []
}
],
"environmentOptions": {
"emailNotificationType": "OnlyOnFailure",
"emailRecipients": "release.environment.owner;release.creator",
"skipArtifactsDownload": false,
"timeoutInMinutes": 0,
"enableAccessToken": false,
"publishDeploymentStatus": true,
"badgeEnabled": false,
"autoLinkWorkItems": false,
"pullRequestDeploymentEnabled": false
},
"demands": [],
"conditions": [],
"executionPolicy": {
"concurrencyCount": 1,
"queueDepthCount": 0
},
"schedules": [],
"owner": {
"displayName": "{user name}",
"id": "{user id}",
"isContainer": false,
"uniqueName": "{creator account}",
"url": "https://dev.azure.com/{org name}/"
},
"retentionPolicy": {
"daysToKeep": 30,
"releasesToKeep": 3,
"retainBuild": true
},
"processParameters": {},
"properties": {
"BoardsEnvironmentType": {
"$type": "System.String",
"$value": "unmapped"
},
"LinkBoardsWorkItems": {
"$type": "System.String",
"$value": "False"
}
},
"preDeploymentGates": {
"id": 0,
"gatesOptions": null,
"gates": []
},
"postDeploymentGates": {
"id": 0,
"gatesOptions": null,
"gates": []
},
"environmentTriggers": [],
"badgeUrl": "https://vsrm.dev.azure.com/{org}/_apis/{project name}/Release/badge/3c3xxx6414512/2/3"
},
Here are some key parameters you need to pay attention to: id, owner, rank and conditions.
id: This is the stage id you assign to the stage; its value must be less than 1. Any value less than 1 is okay here.
owner: This is required, or you will receive a message notifying you that the stage must have an owner.
rank: A natural number greater than 1. I suggest you increment it based on the other stages.
conditions: This is where you configure which stage the new one depends on. In a nutshell, it specifies where in the release flow the stage executes.
When you update the release definition, you must send the whole definition as the request body: get the original one, add the new customized stage definition into it, and then PUT it back to the API.
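As a rough sketch of that get-modify-put flow (the org, project, definition id and PAT below are placeholders, and stage_template is assumed to be a dict shaped like the sample stage definition above, e.g. loaded from a JSON file):
# Sketch: add or remove a stage by rewriting the whole release definition.
import requests

ORG = "orgname"                     # placeholders
PROJECT = "projectname"
DEFINITION_ID = 2
AUTH = ("", "MY_PAT")               # PAT as the basic-auth password
BASE = f"https://vsrm.dev.azure.com/{ORG}/{PROJECT}/_apis/release/definitions"

def add_stage(stage_template: dict, name: str) -> dict:
    # 1. Get the whole current definition.
    definition = requests.get(f"{BASE}/{DEFINITION_ID}?api-version=5.1", auth=AUTH).json()

    # 2. Build the new environment (stage) from the template.
    stage = dict(stage_template)
    stage["name"] = name
    stage["id"] = -1                                      # any value < 1 for a new stage
    stage["rank"] = len(definition["environments"]) + 1   # append after existing stages
    # "conditions" controls where the stage runs; copy its exact shape from a
    # UI-created stage (compare in the History tab) rather than guessing it here.
    definition["environments"].append(stage)

    # 3. PUT the whole definition back (partial bodies are not accepted).
    r = requests.put(f"{BASE}?api-version=5.1", json=definition, auth=AUTH)
    r.raise_for_status()
    return r.json()

def remove_stage(name: str) -> dict:
    definition = requests.get(f"{BASE}/{DEFINITION_ID}?api-version=5.1", auth=AUTH).json()
    definition["environments"] = [e for e in definition["environments"]
                                  if e["name"] != name]
    r = requests.put(f"{BASE}?api-version=5.1", json=definition, auth=AUTH)
    r.raise_for_status()
    return r.json()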
In fact, I suggest you first add a stage through the UI as a test, then go to the History tab of the release definition page and choose 'Compare difference' from the three-dots menu.
The difference between the two definitions is shown in a panel (the left side is the original, the right side is the updated one), and from it you can clearly see what you need to change to apply your idea.