NestJS | GraphQL: "message": "Cannot read properties of undefined (reading 'method')"

How can I fix this error?
I'm just trying out NestJS with GraphQL.
When I try to execute a Mutation or Query, I get the error below.
Best regards
{
"errors": [
{
"message": "Cannot read properties of undefined (reading 'method')",
"locations": [
{
"line": 2,
"column": 3
}
],
"path": [
"login"
],
"extensions": {
"code": "INTERNAL_SERVER_ERROR",
"exception": {
"stacktrace": [
"TypeError: Cannot read properties of undefined (reading 'method')",
" at CacheInterceptor.isRequestCacheable (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\common\\cache\\interceptors\\cache.interceptor.js:63:49)",
" at CacheInterceptor.trackBy (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\common\\cache\\interceptors\\cache.interceptor.js:56:19)",
" at CacheInterceptor.intercept (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\common\\cache\\interceptors\\cache.interceptor.js:21:26)",
" at D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\core\\interceptors\\interceptors-consumer.js:23:36",
" at InterceptorsConsumer.intercept (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\core\\interceptors\\interceptors-consumer.js:25:24)",
" at target (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\core\\helpers\\external-context-creator.js:77:60)",
" at Object.login (D:\\project\\my-card-tracker\\backend-app\\node_modules\\#nestjs\\core\\helpers\\external-proxy.js:9:30)",
" at field.resolve (D:\\project\\my-card-tracker\\backend-app\\node_modules\\apollo-server-core\\src\\utils\\schemaInstrumentation.ts:106:18)",
" at executeField (D:\\project\\my-card-tracker\\backend-app\\node_modules\\graphql\\execution\\execute.js:481:20)",
" at D:\\project\\my-card-tracker\\backend-app\\node_modules\\graphql\\execution\\execute.js:377:22"
]
}
}
}
],
"data": null
}
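Judging by the stack trace, the failure comes from Nest's built-in CacheInterceptor: for a GraphQL execution there is no HTTP request on the execution context, so isRequestCacheable ends up reading .method off undefined. A minimal sketch of one workaround, assuming the interceptor is bound globally and that skipping the cache for GraphQL operations is acceptable (the class and file names are illustrative, not from the question):
// graphql-aware-cache.interceptor.ts (illustrative file name)
// Extends the stock CacheInterceptor so it bails out for GraphQL executions
// instead of dereferencing the missing HTTP request.
import { CacheInterceptor, ExecutionContext, Injectable } from '@nestjs/common';
import { GqlContextType } from '@nestjs/graphql';

@Injectable()
export class GraphqlAwareCacheInterceptor extends CacheInterceptor {
  protected trackBy(context: ExecutionContext): string | undefined {
    // GraphQL requests carry no req.method for isRequestCacheable to inspect,
    // so returning undefined simply disables caching for this execution.
    if (context.getType<GqlContextType>() === 'graphql') {
      return undefined;
    }
    return super.trackBy(context);
  }
}
Binding this class in place of CacheInterceptor (or removing the global CacheInterceptor binding) should let the login mutation resolve again; caching GraphQL responses would then need its own strategy.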

Related

PowerBI REST API - Failed to create a new datasource on the specified gateway

I'm following this page in order to create a new PowerBI datasource,
and used this kind of body for the POST request:
{
  "connectionDetails": "{\"server\":\"aaa\",\"database\":\"bbb\"}",
  "credentialDetails": {
    "credentialType": "Basic",
    "credentials": "{\"credentialData\":[{\"name\":\"username\", \"value\":\"ccc\"},{\"name\":\"password\", \"value\":\"XoZT6aM1r0puO\"}]}",
    "encryptedConnection": "Encrypted",
    "encryptionAlgorithm": "RSA-OAEP",
    "privacyLevel": "None",
    "useEndUserOAuth2Credentials": "False"
  },
  "datasourceName": "ddd",
  "dataSourceType": "Sql"
}
But I'm getting the error below for the POST request:
{
"error": {
"code": "DM_GWPipeline_UnknownError",
"pbi.error": {
"code": "DM_GWPipeline_UnknownError",
"parameters": {},
"details": [
{
"code": "DM_ErrorDetailNameCode_UnderlyingErrorMessage",
"detail": {
"type": 1,
"value": "The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters. "
}
},
{
"code": "DM_ErrorDetailNameCode_UnderlyingHResult",
"detail": {
"type": 1,
"value": "-2146233033"
}
}
]
}
}
}
Any advice would be appreciated!
Thanks in advance.
BTW, as an alternative, I would be happy to know if there is PowerShell code that is known to work.
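For what it's worth, that Base-64 complaint usually means the gateway tried to decrypt credentialDetails.credentials and found plain JSON instead of an encrypted, base64-encoded blob: with "encryptedConnection": "Encrypted" and "encryptionAlgorithm": "RSA-OAEP", the credentials string is expected to be encrypted with the gateway's public key rather than sent in clear text. A hedged sketch of the first step, fetching that public key (token handling and the return type are assumptions; the official Power BI SDKs ship credential-encryptor helpers that implement the full hybrid AES + RSA-OAEP scheme and are the safer route):
// Sketch only: retrieve the gateway's RSA public key; the credentials JSON
// must be encrypted with it before it goes into credentialDetails.credentials.
async function getGatewayPublicKey(gatewayId: string, accessToken: string) {
  const res = await fetch(`https://api.powerbi.com/v1.0/myorg/gateways/${gatewayId}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) {
    throw new Error(`Get Gateway failed with HTTP ${res.status}`);
  }
  const gateway = await res.json();
  // Expected shape (per the Get Gateway response): publicKey.exponent and
  // publicKey.modulus, both base64 strings consumed by the SDK encryptors.
  return gateway.publicKey as { exponent: string; modulus: string };
}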

Backward Compatibility issue and uncertainty in Schema Registry

I have a use case where I have a JSON document, and I want to generate a schema and a record out of it and publish the record.
I have configured the value serializer, and the schema compatibility setting is BACKWARD.
First JSON
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
Version 1 of the schema was registered.
I received the message in the Avro console consumer.
Second JSON:
String json = "{\n" +
" \"id\": 1,\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
The schema was registered successfully.
The message was sent.
I then tried sending JSON 1 again, and it was sent successfully.
Third JSON:
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
I got an error for this case:
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
How is it that the schema generated from the 2nd JSON was registered but the
third one was rejected, even though I didn't have a default value for the
deleted field? Does the Schema Registry always accept the 1st
evolution (the 2nd schema over the 1st)?
Schema in schema registry
Version 1 schema
{
  "fields": [
    {
      "doc": "Type inferred from '1'",
      "name": "id",
      "type": "int"
    },
    {
      "doc": "Type inferred from '\"Headphones\"'",
      "name": "name",
      "type": "string"
    },
    {
      "doc": "Type inferred from '1250.0'",
      "name": "price",
      "type": "double"
    },
    {
      "doc": "Type inferred from '[\"home\",\"green\"]'",
      "name": "tags",
      "type": {
        "items": "string",
        "type": "array"
      }
    }
  ],
  "name": "myschema",
  "type": "record"
}
Version 2:
{
  "fields": [
    {
      "doc": "Type inferred from '1'",
      "name": "id",
      "type": "int"
    },
    {
      "doc": "Type inferred from '1250.0'",
      "name": "price",
      "type": "double"
    },
    {
      "doc": "Type inferred from '[\"home\",\"green\"]'",
      "name": "tags",
      "type": {
        "items": "string",
        "type": "array"
      }
    }
  ],
  "name": "myschema",
  "type": "record"
}
Let's go over the backwards compatibility rules... https://docs.confluent.io/current/schema-registry/avro.html#compatibility-types
First, the default compatibility isn't transitive, so version 3 is only checked against version 2.
The backwards rule states you can delete fields or add optional fields (those with a default). I assume your schema generator tool doesn't know how to use optionals, so you're only allowed to delete, not add.
Between version 1 and 2, you've deleted the name field, which is valid.
Between version 2 and the incoming 3, it thinks you're trying to post a new schema which removes price (this is okay), but adds a required name field, which is not allowed.
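For illustration, re-adding name in a backward-compatible way would mean declaring it as an optional field, i.e. a nullable union with a default, alongside the other fields shown above:
{
  "doc": "Re-added as an optional field so BACKWARD compatibility holds",
  "name": "name",
  "type": ["null", "string"],
  "default": null
}
With a default in place, a consumer using the newer schema can still decode records that were written without the field.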

Google Fit REST API - dataStreamId with whitespace results in error

I want to use Google's REST API to get the Fitness data of my account. To do so, I issue two subsequent calls.
GET https://www.googleapis.com/fitness/v1/users/me/dataSources. This returns a list of all available dataSources as in [1].
POST https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate.
I use the dataType name and dataStreamId in the request body from [1] to build the request body [2].
The problem: The second call returns an error [3] for all dataSourceIds that contain whitespace although they were returned exactly that way in the first request. In the code sample there is whitespace because the dataSourceId contains the phone model "Nexus 5". If there is no whitespace, the request succeeds without errors.
I already tried replacing the space with something else ("%20" or "_" or "+"), but nothing helped. Is this a bug in the API, or am I doing something fundamentally wrong?
Thanks in advance!
Edit 1:
BTW, I am using Google's OAuth Playground with all the Fitness scopes selected.
https://developers.google.com/oauthplayground/
Edit 2:
In code sample [2] I used the wrong dataTypeName. It was "activity_confidence" but should be "com.google.activity.samples".
[1] GET response
{
"dataSource": [
{
"application": {
"packageName": "com.google.android.gms"
},
"dataQualityStandard": [
],
"dataStreamId": "derived:com.google.activity.samples:com.google.android.gms:LGE:Nexus 5:c80045fc:detailed",
"dataStreamName": "detailed",
"dataType": {
"field": [
{
"format": "map",
"name": "activity_confidence"
}
],
"name": "com.google.activity.samples"
},
"device": {...},
"type": "derived"
},
...
]
}
[2] POST body
{
  "aggregateBy": [
    {
      "dataSourceId": "derived:com.google.activity.samples:com.google.android.gms:LGE:Nexus 5:c80045fc:detailed",
      "dataTypeName": "com.google.activity.samples"
    }
  ],
  "endTimeMillis": 1511132400000,
  "startTimeMillis": 1510268400000
}
[3] POST Error message
{
"error": {
"code": 400,
"errors": [
{
"domain": "global",
"message": "datasource not found: derived:com.google.activity.samples:com.google.android.gms:LGE:Nexus 5:c80045fc:detailed",
"reason": "invalidArgument"
}
],
"message": "datasource not found: derived:com.google.activity.samples:com.google.android.gms:LGE:Nexus 5:c80045fc:detailed"
}
}
Did you try using an escape character like '\'?
Your data stream ID would look like
derived:com.google.activity.samples:com.google.android.gms:LGE:Nexus\ 5:c80045fc:detailed
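For reference, a minimal sketch of the two calls (token handling is an assumption), reusing the dataStreamId exactly as the first call returned it; JSON.stringify passes the value through verbatim, whitespace included:
// Sketch: list data sources, then aggregate over one of them, reusing the
// dataStreamId exactly as Google returned it (including the "Nexus 5" space).
async function aggregateFirstSource(accessToken: string) {
  const headers = { Authorization: `Bearer ${accessToken}`, 'Content-Type': 'application/json' };

  const list = await fetch('https://www.googleapis.com/fitness/v1/users/me/dataSources', { headers });
  const { dataSource } = await list.json();
  const source = dataSource[0]; // pick whichever source you actually need

  const body = {
    aggregateBy: [
      { dataSourceId: source.dataStreamId, dataTypeName: source.dataType.name },
    ],
    startTimeMillis: 1510268400000,
    endTimeMillis: 1511132400000,
  };

  const agg = await fetch('https://www.googleapis.com/fitness/v1/users/me/dataset:aggregate', {
    method: 'POST',
    headers,
    body: JSON.stringify(body),
  });
  return agg.json();
}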

ElasticSearch Reindex API and painless script to access date field

I'm trying to familiarize myself with the Reindex API of Elasticsearch and the use of Painless scripts.
I have the following model:
"mappings": {
"customer": {
"properties": {
"firstName": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"lastName": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"dateOfBirth": {
"type": "date"
}
}
}
}
I would like to reindex all documents from test-v1 to test-v2, apply a few transformations to them (for example, extract the year part of dateOfBirth, convert a date value to a timestamp, etc.) and save the result as a new field. But I ran into an issue when I tried to access the date field.
When I made the following call, I got an error:
POST /_reindex?pretty=true&human=true&wait_for_completion=true HTTP/1.1
Host: localhost:9200
Content-Type: application/json
{
  "source": {
    "index": "test-v1"
  },
  "dest": {
    "index": "test-v2"
  },
  "script": {
    "lang": "painless",
    "inline": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();"
  }
}
And the response:
{
"error": {
"root_cause": [
{
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
" ^---- HERE"
],
"script": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
"lang": "painless"
}
],
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
" ^---- HERE"
],
"script": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Unable to find dynamic method [getYear] with [0] arguments for class [java.lang.String]."
}
},
"status": 500
}
According to this tutorial, "Date fields are exposed as ReadableDateTime, so they support methods like getYear, and getDayOfWeek." And indeed, the reference mentions those as supported methods.
Still, the response mentions [java.lang.String] as the type of the dateOfBirth property. I could just parse it to e.g. an OffsetDateTime, but I wonder why it is a string.
Does anyone have a suggestion as to what I'm doing wrong?
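The catch here is that in reindex (and update) scripts ctx._source holds the document's original JSON, so a date field arrives as its string representation; the ReadableDateTime wrapping applies to doc values in search scripts, not to _source. A sketch of the same call that parses the string instead, assuming dateOfBirth is stored in ISO-8601 form (year first):
POST /_reindex?pretty=true&human=true&wait_for_completion=true HTTP/1.1
Host: localhost:9200
Content-Type: application/json
{
  "source": {
    "index": "test-v1"
  },
  "dest": {
    "index": "test-v2"
  },
  "script": {
    "lang": "painless",
    "inline": "ctx._source.yearOfBirth = Integer.parseInt(ctx._source.dateOfBirth.substring(0, 4));"
  }
}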

How to create google datastore composite indices via REST API?

I am trying to change the order of my results, but I keep getting an error saying "You need an index to execute this query."
In my console, it doesn't say that any indices exist, but I set most of the indexed options to true.
I know that in Java I can create indices that span multiple properties, either ascending or descending; how do I do this with the REST API?
Following the REST API docs for Google Datastore, my entities are created like this:
{
  "mode": "TRANSACTIONAL",
  "transaction": "Eb2wksWfYDjkGkkABRmGMQ_vKGijwNwm-tbxAbUPRt8N2RaUCynjSbGT7jFQw3pgaDCT7U0drs3RTPLSIN8TQikdqkdl7pLm2rkMqORmKlO_I_dp",
  "mutation": {
    "insertAutoId": [
      {
        "key": {
          "path": [
            {
              "kind": "Attendance"
            }
          ]
        },
        "properties": {
          "section": {
            "indexed": true,
            "stringValue": "Venturers"
          },
          "date": {
            "dateTimeValue": "2015-01-16T00:00:00+00:00",
            "indexed": true
          },
          "attendee": {
            "indexed": true,
            "keyValue": {
              "path": [
                {
                  "id": "5659313586569216",
                  "kind": "Attendee"
                }
              ]
            }
          },
          "presence": {
            "indexed": false,
            "integerValue": 0
          }
        }
      }
    ]
  }
}
And I am trying to query like this:
{
  "gqlQuery": {
    "allowLiteral": true,
    "queryString": "SELECT * FROM Attendance WHERE section = @section ORDER BY date ASC",
    "nameArgs": [
      {
        "name": "section",
        "value": {
          "stringValue": "Venturers"
        }
      }
    ]
  }
}
And I get this error:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "FAILED_PRECONDITION",
"message": "no matching index found.",
"locationType": "header",
"location": "If-Match"
}
],
"code": 412,
"message": "no matching index found."
}
}
For future reference:
You can't make a composite index directly through the REST API. You must go through PHP App Engine.
How to build datastore indexes (PHP GAE)
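For illustration, the composite index this query needs (an equality filter on section plus an ORDER BY on date) would be declared roughly like this in an index.yaml deployed via the App Engine SDK, as described in the linked guide (treat the snippet as a sketch):
indexes:
- kind: Attendance
  properties:
  - name: section
  - name: date
    direction: asc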