Search and replace JSON multiline using regex in VSCode

I have a really long JSON schema. Using VSCode, I need to change the partnerName type to "string, null" (it appears more than 20 times; the snippet below is just one occurrence).
How can I search and replace the entire multiline partnerName entry?
Based on another question, I've tried regexes like [\n\s]+ and (.*\n)+, e.g.
"partnerName": {(.*\n)+"type": "null"(.*\n)+}
but it's still not matching.
Search for:
"partnerName": {
  "type": "null"
},
Replace with:
"partnerName": {
  "type": "string, null"
},
Snippet example:
{
  "type": "object",
  "properties": {
    "node": {
      "type": "object",
      "properties": {
        "id": {
          "type": "string"
        },
        "name": {
          "type": "string"
        },
        "description": {
          "type": "string"
        },
        "type": {
          "type": "string"
        },
        "frequency": {
          "type": "string"
        },
        "maxCount": {
          "type": "integer"
        },
        "points": {
          "type": "integer"
        },
        "startAt": {
          "type": "string"
        },
        "endAt": {
          "type": "string"
        },
        "partnerName": {
          "type": "null"
        },
        "action": {
          "type": "null"
        }
      },
      "required": [
        "id",
        "name",
        "description",
        "type",
        "frequency",
        "maxCount",
        "points",
        "startAt",
        "endAt",
        "partnerName",
        "action"
      ]
    }
  },
  "required": [
    "node"
  ]
},

Try this regex:
(partnerName".*\n\s*"type":\s*)"null"
and replace with:
$1"string, null"
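With the Regular expression toggle (the .* icon) enabled in VSCode's search-and-replace box, the parentheses capture everything from partnerName": { through the following "type": , and $1 re-inserts that captured text, so each occurrence should become:
"partnerName": {
  "type": "string, null"
},
The \s* after \n lets the pattern match whatever the indentation, and the action entry keeps its "type": "null" because the match requires the literal partnerName" before the type line.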

Related

How is the value NULL?

I am getting query results that determine whether a user story hasn't been changed (changedate) in the last day.
I'm following this article to build the logic app, as the intention is similar.
For some reason, despite the query returning a valid response (at least one user story result), the foreach expression throws this error:
ExpressionEvaluationFailed. The execution of template action 'For_each' failed: the result of the evaluation of 'foreach' expression '#body('Parse_JSON')?['body']?['value']' is of type 'Null'. The result must be a valid array.
How is it NULL when clearly there is a user story returned?
Get query results:
OUTPUTS:
[
{
"System.Id": 12345,
"System.WorkItemType": "User Story",
"System.State": "New",
"System.Title": "Experiment"
}
]
Parse JSON:
Inputs:
Content:
{
"value": [
{
"System.Id": 12345,
"System.WorkItemType": "User Story",
"System.State": "New",
"System.Title": "Experiment"
}
],
"#odata.nextLink": null
}
Schema
{
"type": "object",
"properties": {
"body": {
"type": "object",
"properties": {
"value": {
"type": "array",
"items": {
"type": "object",
"properties": {
"System.AssignedTo": {
"type": "string"
},
"System.Id": {
"type": "integer"
},
"System.State": {
"type": "string"
},
"System.Tags": {
"type": "string"
},
"System.Title": {
"type": "string"
},
"System.WorkItemType": {
"type": "string"
}
},
"required": [
"System.Id",
"System.WorkItemType",
"System.State",
"System.AssignedTo",
"System.Title"
]
}
},
"#odata.nextLink": {}
}
},
"headers": {
"type": "object",
"properties": {
"Cache-Control": {
"type": "string"
},
"Content-Length": {
"type": "string"
},
"Content-Type": {
"type": "string"
},
"Date": {
"type": "string"
},
"Expires": {
"type": "string"
},
"Pragma": {
"type": "string"
},
"Set-Cookie": {
"type": "string"
},
"Strict-Transport-Security": {
"type": "string"
},
"Timing-Allow-Origin": {
"type": "string"
},
"Transfer-Encoding": {
"type": "string"
},
"Vary": {
"type": "string"
},
"X-Content-Type-Options": {
"type": "string"
},
"X-Frame-Options": {
"type": "string"
},
"x-ms-apihub-cached-response": {
"type": "string"
},
"x-ms-apihub-obo": {
"type": "string"
},
"x-ms-request-id": {
"type": "string"
}
}
},
"statusCode": {
"type": "integer"
}
}
}
Outputs:
{
"value": [
{
"System.Id": 12345,
"System.WorkItemType": "User Story",
"System.State": "New",
"System.Title": "Experiment"
}
],
"#odata.nextLink": null
}
Using the value output from Get query results directly in the For each works.
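The root cause is that the Content passed to Parse JSON is already the response body (it has no outer body property), while the schema and the foreach expression both expect a body wrapper, so body('Parse_JSON')?['body'] resolves to Null. If you want to keep the Parse JSON step, a minimal fix (assuming the action names match those in the error message) is to reference the parsed output's own value property instead:
body('Parse_JSON')?['value']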

AWS-API gateway -- jsonschema child object should validate when parent object exists

I need to create a JSON Schema for the following JSON input. The properties under Vehicle (Manufacturer, Model, etc.) should be required only when the Vehicle object exists.
{
"Manufacturer": "",
"Characteristics": {
"Starts": "new",
"vehicle": {
"Manufacturer": "hello",
"Model": "hh",
"Opening": "",
"Quantity": "",
"Principle": "",
"Type": ""
}
}
}
I tried the following JSON Schema. It works when the Vehicle object is absent, but if we rename Vehicle to something else, e.g. Vehicle1, it doesn't give an error. Please guide me on how to fix this.
{
"$schema": "http://json-schema.org/draft-07/schema",
"type": "object",
"properties": {
"Manufacturer": {
"type": [
"string",
"null"
]
},
"Characteristics": {
"type": "object",
"properties": {
"Starts": {
"type": [
"string",
"null"
]
},
"Vehicle": {
"$ref": "#/definitions/Vehicle"
}
},
"required": [
"Starts", "Vehcle"
]
}
},
"required": [
"Manufacturer"
],
"definitions": {
"Vehicle": {
"type": "object",
"properties": {
"Manufacturer": {
"type": [
"string",
"null"
]
},
"Model": {
"type": [
"string",
"null"
]
},
"Opening": {
"type": [
"string",
"null"
]
},
"PanelQuantity": {
"type": [
"string",
"null"
]
},
"Principle": {
"type": [
"string",
"null"
]
},
"Type": {
"type": [
"string",
"null"
]
}
},
"required": ["Manufacturer", "Model", "Opening", "Quantity", "Principle", "Type"]
}
}
}
Thanks,
Bhaskar
Sounds like you want to add "additionalProperties": false -- which will generate an error if any other properties are present that aren't defined under properties.
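A minimal sketch of the Characteristics branch with that keyword added (note that the posted schema spells Vehicle as "Vehcle" in its required list and requires Quantity while the definition declares PanelQuantity; if the Vehicle properties should only be validated when Vehicle is present, Vehicle itself should not be listed in required):
"Characteristics": {
  "type": "object",
  "additionalProperties": false,
  "properties": {
    "Starts": { "type": ["string", "null"] },
    "Vehicle": { "$ref": "#/definitions/Vehicle" }
  },
  "required": ["Starts"]
}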

Getting error on null and empty string while copying a csv file from blob container to Azure SQL DB

I have tried every combination of data types for my data, but each time my Data Factory pipeline gives me this error:
{
"errorCode": "2200",
"message": "ErrorCode=UserErrorColumnNameNotAllowNull,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Empty or Null string found in Column Name 2. Please make sure column name not null and try again.,Source=Microsoft.DataTransfer.Common,'",
"failureType": "UserError",
"target": "xxx",
"details": []
}
My Copy data source code is something like this:
{
"name": "xxx",
"description": "uuu",
"type": "Copy",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"source": {
"type": "DelimitedTextSource",
"storeSettings": {
"type": "AzureBlobStorageReadSettings",
"recursive": true,
"wildcardFileName": "*"
},
"formatSettings": {
"type": "DelimitedTextReadSettings"
}
},
"sink": {
"type": "AzureSqlSink"
},
"enableStaging": false,
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"name": "populationId",
"type": "Guid"
},
"sink": {
"name": "PopulationID",
"type": "String"
}
},
{
"source": {
"name": "inputTime",
"type": "DateTime"
},
"sink": {
"name": "inputTime",
"type": "DateTime"
}
},
{
"source": {
"name": "inputCount",
"type": "Decimal"
},
"sink": {
"name": "inputCount",
"type": "Decimal"
}
},
{
"source": {
"name": "inputBiomass",
"type": "Decimal"
},
"sink": {
"name": "inputBiomass",
"type": "Decimal"
}
},
{
"source": {
"name": "inputNumber",
"type": "Decimal"
},
"sink": {
"name": "inputNumber",
"type": "Decimal"
}
},
{
"source": {
"name": "utcOffset",
"type": "String"
},
"sink": {
"name": "utcOffset",
"type": "Int32"
}
},
{
"source": {
"name": "fishGroupName",
"type": "String"
},
"sink": {
"name": "fishgroupname",
"type": "String"
}
},
{
"source": {
"name": "yearClass",
"type": "String"
},
"sink": {
"name": "yearclass",
"type": "String"
}
}
]
}
},
"inputs": [
{
"referenceName": "DelimitedTextFTDimensions",
"type": "DatasetReference"
}
],
"outputs": [
{
"referenceName": "AzureSqlTable1",
"type": "DatasetReference"
}
]
}
Can anyone please help me understand the issue? I see in some blogs they suggest using treatnullasempty, but I am not allowed to modify the JSON. Is there a way to do that?
I suggest using a Data Flow Derived Column transformation; it can help you build an expression that replaces the null values in the column.
For example, a Derived Column expression that returns 'dd' when Column_2 is null:
iifNull(Column_2,'dd')
Then map the column.
Reference: Data transformation expressions in mapping data flow
Hope this helps.
Fixed it. It was an easy fix: one of my columns in the destination was marked as NOT NULL; I changed it to allow NULL and it worked.
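For anyone hitting the same error, allowing NULLs on the offending destination column is a one-line change in SQL; the table, column, and type names below are placeholders rather than taken from the pipeline above, and ALTER COLUMN requires restating the column's data type:
ALTER TABLE dbo.YourTable ALTER COLUMN YourColumn nvarchar(100) NULL;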

Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Unknown union branch

I am trying to send a JSON message to a Kafka topic, using the Kafka-rest service to serialize the JSON as an Avro object, but the JSON message fails to be accepted by Kafka-rest with the following error:
Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Unknown union branch postId
I suspect that there is an issue with the Avro schema I am using, as it is a nested record type with nullable fields.
Avro schema:
{
"type": "record",
"name": "ExportRequest",
"namespace": "com.example.avro.model",
"fields": [
{
"name": "context",
"type": {
"type": "map",
"values": {
"type": "string",
"avro.java.string": "String"
},
"avro.java.string": "String"
}
},
{
"name": "exportInfo",
"type": {
"type": "record",
"name": "ExportInfo",
"fields": [
{
"name": "exportId",
"type": {
"type": "string",
"avro.java.string": "String"
}
},
{
"name": "exportType",
"type": {
"type": "string",
"avro.java.string": "String"
}
},
{
"name": "exportQuery",
"type": {
"type": "record",
"name": "ExportQuery",
"fields": [
{
"name": "postExport",
"type": [
"null",
{
"type": "record",
"name": "PostExport",
"fields": [
{
"name": "postId",
"type": {
"type": "string",
"avro.java.string": "String"
}
},
{
"name": "isCommentIncluded",
"type": "boolean"
}
]
}
],
"default": null
},
{
"name": "feedExport",
"type": [
"null",
{
"type": "record",
"name": "FeedExport",
"fields": [
{
"name": "accounts",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "recordTypes",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "actions",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "contentTypes",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "startDate",
"type": "long"
},
{
"name": "endDate",
"type": "long"
},
{
"name": "advancedSearch",
"type": [
"null",
{
"type": "record",
"name": "AdvancedSearchExport",
"fields": [
{
"name": "allOfTheWords",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "anyOfTheWords",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "noneOfTheWords",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "hashtags",
"type": {
"type": "array",
"items": {
"type": "string",
"avro.java.string": "String"
}
}
},
{
"name": "keyword",
"type": {
"type": "string",
"avro.java.string": "String"
}
},
{
"name": "exactPhrase",
"type": {
"type": "string",
"avro.java.string": "String"
}
}
]
}
],
"default": null
}
]
}
],
"default": null
}
]
}
}
]
}
}
]
}
Json message:
{
"context": {
"user_id": "1",
"group_id": "1",
"organization_id": "1"
},
"exportInfo": {
"exportId": "93874dd7-35d7-4f1f-8cf8-051c606d920b",
"exportType": "json",
"exportQuery": {
"postExport": {
"postId": "dd",
"isCommentIncluded": false
},
"feedExport": {
"accounts": [
"1677143852565319"
],
"recordTypes": [],
"actions": [],
"contentTypes": [],
"startDate": 0,
"endDate": 0,
"advancedSearch": {
"allOfTheWords": [
"string"
],
"anyOfTheWords": [
"string"
],
"noneOfTheWords": [
"string"
],
"hashtags": [
"string"
],
"keyword": "string",
"exactPhrase": "string"
}
}
}
}
}
I would appreciate it if someone could help me to understand what the issue is.
Both your JSON and your Avro schema look good.
You are facing the issue because the JSON doesn't conform to Avro's JSON encoding spec.
So, if you convert your JSON accordingly, it will look something like this:
{
"context": {
"user_id": "1",
"group_id": "1",
"organization_id": "1"
},
"exportInfo": {
"exportId": "93874dd7-35d7-4f1f-8cf8-051c606d920b",
"exportType": "json",
"exportQuery": {
"postExport": {
"com.example.avro.model.PostExport": {
"postId": "dd",
"isCommentIncluded": false
}
},
"feedExport": {
"com.example.avro.model.FeedExport": {
"accounts": [
"1677143852565319"
],
"recordTypes": [],
"actions": [],
"contentTypes": [],
"startDate": 0,
"endDate": 0,
"advancedSearch": {
"com.example.avro.model.AdvancedSearchExport": {
"allOfTheWords": [
"string"
],
"anyOfTheWords": [
"string"
],
"noneOfTheWords": [
"string"
],
"hashtags": [
"string"
],
"keyword": "string",
"exactPhrase": "string"
}
}
}
}
}
}
}
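More generally, Avro's JSON encoding requires every non-null value of a union to be wrapped in an object keyed by the selected branch's type name (the fully qualified name for named types such as records). For example, a hypothetical nullable string field declared as
{"name": "comment", "type": ["null", "string"]}
would be encoded in JSON as either
"comment": null
or
"comment": {"string": "some text"}
which is why the record branches above are wrapped in com.example.avro.model.PostExport, com.example.avro.model.FeedExport and com.example.avro.model.AdvancedSearchExport.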

LoopBack 3.0: where filter not returning results from REST API

I have a LoopBack API with a single simple model like this:
{
"name": "Establishment",
"base": "PersistedModel",
"idInjection": true,
"options": {
"validateUpsert": true
},
"properties": {
"Distance": {
"type": "number"
},
"EstablishmentId": {
"type": "number"
},
"EstablishmentType": {
"type": "string"
},
"Location": {
"type": "string"
},
"MinCost": {
"type": "number"
},
"Name": {
"type": "string"
},
"Stars": {
"type": "number"
},
"UserRating": {
"type": "number"
},
"UserRatingTitle": {
"type": "string"
},
"UserRatingCount": {
"type": "number"
},
"ImageUrl": {
"type": "string"
},
"ThumbnailUrl": {
"type": "string"
}
},
"validations": [],
"relations": {},
"acls": [],
"methods": {}
}
A simple call to http://localhost:3000/api/Establishments returns all of the results, as expected; but a call to http://localhost:3000/api/Establishments?filter[where][distance][gt]=30 yields no results at all: an empty array.
There are lots of Establishments with a Distance greater than 30; and indeed using the where filter on other properties also results in an empty array. What could I be missing?
As I mentioned in the comment, the property name is case-sensitive; I verified it on my app to be certain.
It should be:
http://localhost:3000/api/Establishments?filter[where][Distance][gt]=30
Or you can try this format:
http://localhost:3000/api/Establishments?filter={"where":{"Distance":{"gt":30}}}
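The where clause can also be combined with other filter clauses in the same stringified format, for example (the limit clause here is just an illustrative extra, not something from the question):
http://localhost:3000/api/Establishments?filter={"where":{"Distance":{"gt":30}},"limit":10}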