Nested REST resource throws constraint violation in JHipster - rest

I have a nested resource like this:
@GetMapping("/tour-requests/{tourRequestId}/tour-request-messages")
@Secured({AuthoritiesConstants.ADMIN})
public ResponseEntity<List<TourRequestMessageDTO>> getTourRequestMessagesForTourRequest(
        @PathVariable("tourRequestId") long tourRequestId,
        TourRequestMessageCriteria criteria) {
    ...
}
When I call this resource, for example with GET api/tour-requests/1301/tour-request-messages, I get an unexpected error:
{
    "type": "https://zalando.github.io/problem/constraint-violation",
    "title": "Constraint Violation",
    "status": 400,
    "path": "/api/tour-requests/1301/tour-request-messages",
    "violations": [
        {
            "field": "tourRequestId",
            "message": "Failed to convert property value of type 'java.lang.String' to required type 'io.github.jhipster.service.filter.LongFilter' for property 'tourRequestId'; nested exception is java.lang.IllegalStateException: Cannot convert value of type 'java.lang.String' to required type 'io.github.jhipster.service.filter.LongFilter' for property 'tourRequestId': no matching editors or conversion strategy found"
        }
    ],
    "message": "error.validation"
}
I tried to debug this; it seems that the exception happens before the method is even called.

The problem is that the search criteria has hijacked the path parameter tourRequestId, as it happens to also be a possible search parameter of the generated QueryService.
That is why it tried to convert the tourRequestId parameter to a LongFilter.
Renaming the path variable to `id` did not help either, but after renaming it to something entirely different, the problem disappeared.

I've also hit this problem, but my choice was to remove the field from the child's pt.up.hs.project.service.dto.ChildCriteria. When the resource is always nested, it simply does not make sense to allow querying by a field that is already specified in the path.
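A minimal sketch of that approach, using simplified stand-ins for the generated JHipster classes (the real LongFilter lives in io.github.jhipster.service.filter; the names here are illustrative, not the actual generated code): remove tourRequestId from the request-bound criteria and populate the filter from the path variable inside the resource method.

```java
// Simplified stand-ins for the generated JHipster classes, just to
// illustrate the wiring; not the real io.github.jhipster types.
public class CriteriaBinding {

    static class LongFilter {
        Long equalsValue;

        LongFilter setEquals(Long v) {
            this.equalsValue = v;
            return this;
        }
    }

    static class TourRequestMessageCriteria {
        // No longer bound from request parameters; populated from the path.
        LongFilter tourRequestId;
    }

    // Inside the resource method: merge the path variable into the
    // criteria before handing it to the query service.
    static TourRequestMessageCriteria withTourRequest(TourRequestMessageCriteria criteria,
                                                      long tourRequestId) {
        criteria.tourRequestId = new LongFilter().setEquals(tourRequestId);
        return criteria;
    }
}
```

Because the criteria bean no longer declares a request-bindable tourRequestId, Spring's data binder has nothing to hijack, and the path variable converts to a plain long as usual.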

Related

Micronaut POJO deserialisation error message when the format is invalid or type throws error

When a field in a request to my application has an incorrect format and the type throws an error, the error message returned by Micronaut is vague. E.g., two scenarios:
public class fakeClass {
    @NotBlank
    private String fakeName;
}
If my request is {"fakeName": ""},
then the response, correctly, would be something like:
{
    "violations": [
        {
            "field": "create.fakeClass.fakeName",
            "message": "must not be blank"
        }
    ],
    "type": "https://zalando.github.io/problem/constraint-violation",
    "title": "Constraint Violation",
    "status": 400
}
But let's say my class looks like this:
public class fakeClass {
    @Format("yyyy-MM-dd")
    private LocalDate exampleDate;
}
With an invalid date or incorrect format, such as {"exampleDate": 202222--01-01} or {"exampleDate": 2022/01/01},
the error message is:
{
    "type": "about:blank",
    "parameters": {
        "path": "/project"
    },
    "status": 400,
    "detail": "Required argument [fakeClass fakeClass] not specified"
}
Is there a simple way to provide more information in the error message to make it clear why the request failed for an invalid format or type, like there is for @NotNull or @NotBlank?
The problem here is not Micronaut but your payloads. The examples you mentioned are invalid JSON documents.
For example, this one is invalid, since the value is neither a number nor a string:
{
"exampleDate": 202222--01-01
}
This would be a valid variant:
{
"exampleDate": "202222--01-01"
}
Make sure you send the date as a String. In your case, this is expected to be valid:
{
"exampleDate": "2022-11-01"
}
In general it is recommended to send dates using the ISO-8601 format, which you did (yyyy-MM-dd). Furthermore, I recommend applying a global configuration rather than putting a @Format("yyyy-MM-dd") annotation on each POJO:
jackson:
  dateFormat: yyyyMMdd
  timeZone: UTC
  serializationInclusion: NON_NULL
  serialization:
    writeDatesAsTimestamps: false
@Format("yyyy-MM-dd") is a formatter, not a constraint.
You can use @Pattern(<regex>). There are also date-specific ones like @Past, @PastOrPresent, @Future and @FutureOrPresent.
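To illustrate the difference with plain JDK code (no Micronaut; the class and method names here are made up for the example): a shape-only regex check of the kind a @Pattern constraint would carry, next to the strict ISO parse that LocalDate performs by default.

```java
import java.time.LocalDate;
import java.time.format.DateTimeParseException;
import java.util.regex.Pattern;

public class DateCheck {

    // Shape-only check, the kind of regex a @Pattern constraint could use.
    static final Pattern ISO_DATE = Pattern.compile("\\d{4}-\\d{2}-\\d{2}");

    static boolean looksLikeIsoDate(String s) {
        return ISO_DATE.matcher(s).matches();
    }

    // Actual parse; LocalDate.parse uses ISO_LOCAL_DATE (yyyy-MM-dd) by default.
    static boolean parsesAsIsoDate(String s) {
        try {
            LocalDate.parse(s);
            return true;
        } catch (DateTimeParseException e) {
            return false;
        }
    }
}
```

Note the two checks are not equivalent: "2022-13-45" passes the regex but fails the parse, which is why a formatter and a constraint play different roles.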

Problem when the entity value attribute contains special characters

I have tried to insert into OCB an entity with an encoded password attribute:
{
    "id": "prueba-tipo-string2",
    "type": "StringParser",
    "dateObserved": {
        "type": "DateTime",
        "value": "2020-08-13T08:56:56.00Z"
    },
    "password": {
        "type": "text",
        "value": "U2FsdGVkX10bFP8Rj7xLAQDFwMBphXpK/+leH3mlpQ="
    }
}
OCB always responds with the following error:
"found a forbidden character in the value of an attribute"
In Postman
{
"error": "BadRequest",
"description": "Invalid characters in attribute value"
}
Orion restricts the usage of some characters for security reasons (script injection attacks in some circumstances); see this piece of documentation. In particular, the = you have in the password attribute value.
You can avoid this, for instance, by encoding the password in base64 or using URL encoding before storing it in Orion.
Another alternative is using TextUnrestricted as the attribute type. This special attribute type does not check whether the attribute value contains a forbidden character. However, it could have security implications, so use it at your own risk!
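For the URL-encoding route, a quick sketch with the JDK (class and method names are illustrative; the decode step would run when reading the attribute back out of Orion):

```java
import java.net.URLDecoder;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class AttrEncoding {

    // Percent-encode the value so characters Orion forbids (such as =)
    // never appear literally in the stored attribute.
    static String encode(String value) {
        return URLEncoder.encode(value, StandardCharsets.UTF_8);
    }

    // Reverse the encoding after reading the attribute back.
    static String decode(String value) {
        return URLDecoder.decode(value, StandardCharsets.UTF_8);
    }
}
```

Encoding the password attribute value from the question this way turns the trailing = into %3D, which passes Orion's character check and round-trips losslessly.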

Actions Builder webhookResponse Unexpected internal error at List Response

I tried to add a List Response from my webhook and always receive an error such as:
Unexpected internal error id=c57c97b2-0b6f-492b-88a3-3867cf2e7203
(The id changes each time.)
After comparing the expected JSON webhookResponse from the Docs with the generated Response from the Actions SDK I found a difference at the typeOverrides object:
JSON from Docs
"typeOverrides": [
    {
        "name": "prompt_option",
        "synonym": {
            "entries": []
        },
        "typeOverrideMode": "TYPE_REPLACE"
    }
]
Generated JSON Response from Actions SDK
"typeOverrides": [
    {
        "name": "prompt_option",
        "synonym": {
            "entries": []
        },
        "mode": "TYPE_REPLACE"
    }
]
There seems to be an error in the example documentation, but the reference docs say that it should be mode. I've tested it both ways, and that isn't what is causing the error.
The likely problem is that if you're replying with a List, you must do two things:
You need a Slot in the Scene that will accept the Type that you specify in typeOverride.name. (And remember: you're updating the Type, not the name of the Slot.)
In the prompt for this slot, you must call the webhook that generates the list. (It has to be that slot's prompt. You can't request it in On Enter, for example.)

Druid: Cached lookup fails because of "Null or Empty Dimension found" during ingestion time

I am setting up a new Kafka ingestion stream into Druid. This works fine, but now I need to do a lookup during ingestion time. I am running apache-druid-0.13.0-incubating.
I have created a registered lookup function which finds an optional referral_id by a promo_id. When I do a lookup introspection for this function, it works fine.
Now I want to store the result of this lookup (referral_id) in my datasource. The lookup cannot be done at query time.
In my dimensionsSpec I have defined the dimension as listed below.
{
    "type": "extraction",
    "dimension": "promo_id",
    "outputName": "referral_id",
    "outputType": "long",
    "replaceMissingValueWith": "0",
    "extractionFn": {
        "type": "registeredLookup",
        "lookup": "referral_promoter"
    }
}
It does not work. I see this error in the logs:
WARN [KafkaSupervisor-TEST6] org.apache.druid.data.input.impl.DimensionSchema - Null or Empty Dimension found
2019-01-22T10:08:39,784 ERROR [KafkaSupervisor-TEST6] org.apache.druid.indexing.kafka.supervisor.KafkaSupervisor - KafkaSupervisor[TEST6] failed to handle notice: {class=org.apache.druid.indexing.kafka.supervisor.KafkaSupervisor, exceptionType=class java.lang.IllegalArgumentException, exceptionMessage=Instantiation of [simple type, class org.apache.druid.data.input.impl.DimensionsSpec] value failed: null (through reference chain: org.apache.druid.data.input.impl.StringInputRowParser["parseSpec"]->org.apache.druid.data.input.impl.JSONParseSpec["dimensionsSpec"]), noticeClass=RunNotice}
java.lang.IllegalArgumentException: Instantiation of [simple type, class org.apache.druid.data.input.impl.DimensionsSpec] value failed: null (through reference chain: org.apache.druid.data.input.impl.StringInputRowParser["parseSpec"]->org.apache.druid.data.input.impl.JSONParseSpec["dimensionsSpec"])
...
Caused by: java.lang.NullPointerException
I have no clue how to solve this. What am I doing wrong? I have tried various dimension configurations, but none worked.

Github GraphQL API: How can I find out which fields are searchable?

When I run the query:
{
"query": "{user(login: \"furknyavuz\") {repositories(first: 50, isPrivate: false) {nodes {name url}}}}"
}
I'm getting the following error:
{
"data": null,
"errors": [
{
"message": "Field 'repositories' doesn't accept argument 'isPrivate'",
"locations": [
{
"line": 1,
"column": 51
}
]
}
]
}
I can see that isPrivate is a field of the Repository object, but I'm unable to search with it.
I'm not expecting to search with all fields of the object, but the critical question is: how can I see which fields are searchable or indexable?
isPrivate is a field of the Repository object, but repositories inside the User object is of type RepositoryConnection, and the repositories connection accepts the following arguments/types:
affiliations [RepositoryAffiliation]
after String
before String
first Int
isFork Boolean
isLocked Boolean
last Int
orderBy RepositoryOrder
privacy RepositoryPrivacy
RepositoryPrivacy is an enum with two values: PUBLIC and PRIVATE.
The following request will return private repos:
{
  user(login: "furknyavuz") {
    repositories(first: 50, privacy: PRIVATE) {
      nodes {
        name
        url
      }
    }
  }
}
Note that in the explorer, if you press Ctrl+Space you will get the schema listing with types.
Pressing Ctrl+Space again after ":" will give you the enum values.
Autocomplete:
Navigate to GitHub's GraphQL API Explorer. This is a GraphiQL interface that lets you write your queries and run them in real time. One of the neat features of GraphiQL is that it includes an auto-complete feature. When you're typing the arguments for a field, just press Alt+Space or Shift+Space and a list of possible arguments will pop up. This works for fields too.
The docs:
You can also view the documentation for the schema by hitting the Docs link in the upper right corner of the interface. This will bring up a list of all possible fields, including what arguments they take. There's also a schema reference page here.
GraphQL:
Lastly, you can actually just ask the GraphQL endpoint yourself. For example, running this query will list all types in the schema and the arguments used by each one:
{
  __schema {
    types {
      name
      inputFields {
        name
        description
        type {
          name
        }
        defaultValue
      }
    }
  }
}
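If you'd rather run that introspection query outside the explorer, here is a hedged sketch using the JDK 11+ HTTP client against the real endpoint https://api.github.com/graphql (the class and method names are made up for the example; the token is assumed to be a GitHub personal access token):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class SchemaIntrospection {

    // The introspection query from above, collapsed to one line and wrapped
    // in the {"query": "..."} envelope GraphQL servers expect. The query
    // itself contains no double quotes, so no escaping is needed here.
    static String payload() {
        String query = "{ __schema { types { name inputFields { name description "
                + "type { name } defaultValue } } } }";
        return "{\"query\": \"" + query + "\"}";
    }

    // Build the POST request; send it with HttpClient.newHttpClient().send(...).
    static HttpRequest request(String token) {
        return HttpRequest.newBuilder(URI.create("https://api.github.com/graphql"))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload()))
                .build();
    }
}
```

The response body is the same JSON the explorer shows, so the same query works for scripting the "which arguments does this connection take?" question.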