LoopBack: order by YEAR of a date

So, I have a model with a property of type date, let's say it's:
"start_date": {
"type": "date",
"required": true
}
When I get the information from this model, I would like to order by the year of that date. So instead of doing
{
  "order": [
    "start_date ASC"
  ]
}
I would like something like
{
  "order": [
    "dateYear(start_date) ASC"
  ]
}
Is this possible?

If you check the DataAccessObject._normalize() function (dao.js in loopback-datasource-juggler), LoopBack does nothing special with the order property of a filter and simply passes it through to the datasource connector.
So it looks like LoopBack doesn't support this. The only way to get this functionality is for it to be supported by your datasource connector.
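Until a connector offers that, one workaround is to sort in application code after fetching. A minimal sketch, assuming a hypothetical Event model carrying the start_date property from the question:

// Fetch the rows, then order them by the year of start_date in memory.
// "Event" is a hypothetical model name; adjust to your own.
Event.find({}, function (err, events) {
  if (err) throw err;
  events.sort(function (a, b) {
    return new Date(a.start_date).getFullYear() -
           new Date(b.start_date).getFullYear();
  });
  // events is now ordered by year, ascending
});

Note this sorts in memory, so it is only reasonable for small result sets.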


Mapping multiple csv files in Data Factory

Does anyone know if there is a way to pass a schema mapping to multiple CSVs without doing it manually? I have 30 CSVs passed through a data flow in a ForEach activity, so I can't detect or set the fields' types (I could only do that for the first one).
Thanks for your help! :)
A Copy Activity mapping can be parameterized and changed at runtime if explicit mapping is required. The parameter is just a JSON object that you'd pass in for each of the files you are processing. It looks something like this:
{
  "type": "TabularTranslator",
  "mappings": [
    {
      "source": { "name": "Id" },
      "sink": { "name": "CustomerID" }
    },
    {
      "source": { "name": "Name" },
      "sink": { "name": "LastName" }
    },
    {
      "source": { "name": "LastModifiedDate" },
      "sink": { "name": "ModifiedDate" }
    }
  ]
}
You can read more about it here: Schema and data type mapping in copy activity
So, you can either pre-generate these mappings and fetch them via a Lookup in a previous step in the pipeline, or, if they need to be dynamic, you can create them at runtime with code (e.g. have an Azure Function that looks up the current schema of the CSV and returns a properly formatted translator object).
Once you have the object as a parameter you can pass it to the copy activity: on the mapping properties of the copy activity, just Add Dynamic Content and select the appropriate parameter.
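As a rough sketch of that Azure Function idea (the plain-text CSV body and the identity sink names here are assumptions; in practice the sink names would come from your own metadata):

// Hypothetical Azure Function (Node.js): receives the CSV header line in
// the request body and returns a TabularTranslator that maps each source
// column to a sink column of the same name.
module.exports = async function (context, req) {
  const headerLine = String(req.body || '').split(/\r?\n/)[0];
  const columns = headerLine.split(',').map(function (c) { return c.trim(); });
  context.res = {
    body: {
      type: 'TabularTranslator',
      mappings: columns.map(function (name) {
        return { source: { name: name }, sink: { name: name } };
      })
    }
  };
};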

Druid groupBy query - JSON syntax - intervals

I'm attempting to create this query (which works as I hope):
SELECT userAgent, COUNT(*) FROM page_hour GROUP BY userAgent order by 2 desc limit 10
as JSON. I've tried this:
{
  "queryType": "groupBy",
  "dataSource": "page_hour",
  "granularity": "hour",
  "dimensions": ["userAgent"],
  "aggregations": [
    { "type": "count", "name": "total", "fieldName": "userAgent" }
  ],
  "intervals": [ "2020-02-25T00:00:00.000/2020-03-25T00:00:00.000" ],
  "limitSpec": { "type": "default", "limit": 50, "columns": ["userAgent"] },
  "orderBy": {
    "dimension": "total",
    "direction": "descending"
  }
}
But instead of aggregating over the full range, it appears to pick an arbitrary time span (e.g. 2020-03-19T14:00:00Z).
If you want results from the entire interval to be combined in a single result entry per user agent, set granularity to all in the query.
A few notes on Druid queries:
You can generate a native query by entering a SQL statement in the management console and selecting the explain/plan menu option from the three-dot menu by the run button.
It's worth confirming expectations that the count query-time aggregator will return the number of database rows (not the number of ingested events). This could be the reason the resulting number is smaller than anticipated.
A granularity of all will prevent bucketing results by hour.
A fieldName spec within the count aggregator? I don't know what behavior might be defined for this, so I would remove this property. See the count aggregator docs: https://druid.apache.org/docs/latest/querying/aggregations.html#count-aggregator
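Putting those notes together, a corrected version of the query might look like this (granularity set to all, fieldName dropped from the count aggregator, and the ordering folded into limitSpec):

{
  "queryType": "groupBy",
  "dataSource": "page_hour",
  "granularity": "all",
  "dimensions": ["userAgent"],
  "aggregations": [
    { "type": "count", "name": "total" }
  ],
  "intervals": [ "2020-02-25T00:00:00.000/2020-03-25T00:00:00.000" ],
  "limitSpec": {
    "type": "default",
    "limit": 10,
    "columns": [
      { "dimension": "total", "direction": "descending" }
    ]
  }
}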

Validate referential integrity of object arrays with Joi

I'm trying to validate that the data I am returned is sensible. Validating data types is done. Now I want to validate that I've received all of the data needed to perform a task.
Here's a representative example:
{
  "things": [
    {
      "id": "00fb60c7-520e-4228-96c7-13a1f7a82749",
      "name": "Thing 1",
      "url": "https://lolagons.com"
    },
    {
      "id": "709b85a3-98be-4c02-85a5-e3f007ce4bbf",
      "name": "Thing 2",
      "url": "https://lolfacts.com"
    }
  ],
  "layouts": {
    "sections": [
      {
        "id": "34f10988-bb3d-4c38-86ce-ed819cb6daee",
        "name": "Section 1",
        "content": [
          {
            "type": 2,
            "id": "00fb60c7-520e-4228-96c7-13a1f7a82749" // Ref to Thing 1
          }
        ]
      }
    ]
  }
}
So every Section references 0+ Things, and I want to validate that every id value returned in the Content of Sections also exists as an id in Things.
The docs for Object.assert(..) imply that I need a concrete reference. Even if I do the validation within Object.keys or Array.items, I can't resolve the reference at the other end.
Not that it matters, but my context is that I'm validating HTTP responses within IcedFrisby, a Frisby.js fork.
This wasn't really solvable in the way I asked (i.e. with Joi).
I solved this for my context by writing a plugin for IcedFrisby (published on npm here), which uses jsonpath to fetch each id in Content and each id in Things. The plugin then asserts that all of the first set exist within the second.
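For illustration, the core of that check can be sketched with the jsonpath package on its own (the path expressions assume the response structure shown above):

const jp = require('jsonpath');

function danglingReferences(body) {
  // Every id defined in things...
  const thingIds = new Set(jp.query(body, '$.things[*].id'));
  // ...and every id referenced from section content.
  const refs = jp.query(body, '$.layouts.sections[*].content[*].id');
  // Return the references that have no matching thing.
  return refs.filter(function (id) { return !thingIds.has(id); });
}

// Usage, where "response" is assumed to be the parsed response body:
const missing = danglingReferences(response);
if (missing.length > 0) {
  throw new Error('Dangling references: ' + missing.join(', '));
}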

Magento 2 REST API product filters

I am working with the Magento 2 API. I need products based on the filters below:
store id
search by product name
sorting by name
category id
limit
I have tried this API, but the options aren't available:
index.php/rest/V1/categories/{id}/products
Please can someone suggest how to achieve this?
Thanks
You are looking for the (GET) API /rest/V1/products.
The store should be detected automatically, because you can pass the store code in the URL prefix. If you have a store with code test, the API will start with GET /rest/test/V1/products/[...].
You can use the like condition type. Ex.: products with "sample" in their name: ?searchCriteria[filter_groups][0][filters][0][field]=name
&searchCriteria[filter_groups][0][filters][0][value]=%sample%
&searchCriteria[filter_groups][0][filters][0][condition_type]=like
You are looking for sortOrders. Ex.: searchCriteria[sortOrders][0][field]=name. You can even add the sort direction, for example DESC, with searchCriteria[sortOrders][0][direction]=DESC.
Use the category_id field and the eq condition type. Ex.: if you want products from category 10: searchCriteria[filter_groups][0][filters][0][field]=category_id&
searchCriteria[filter_groups][0][filters][0][value]=10&
searchCriteria[filter_groups][0][filters][0][condition_type]=eq
Use searchCriteria[pageSize]. Ex.: 20 products starting from the 40th, equivalent in SQL to LIMIT 20 OFFSET 40: &searchCriteria[pageSize]=20&searchCriteria[currentPage]=3
Of course you can perform AND and OR operations with filters.
{
  "filter_groups": [
    {
      "filters": [
        {
          "field": "type_id",
          "value": "simple",
          "condition_type": "eq"
        }
      ]
    },
    {
      "filters": [
        {
          "field": "category_id",
          "value": "611",
          "condition_type": "eq"
        }
      ]
    }
  ],
  "page_size": 100,
  "current_page": 1,
  "sort_orders": [
    {
      "field": "name",
      "direction": "ASC"
    }
  ]
}
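To tie the pieces together, here is a sketch that builds one GET URL combining the name filter, category filter, sorting and paging from above (the store code test is just the earlier example):

// Separate filter_groups are ANDed together; filters inside one group are ORed.
const params = new URLSearchParams({
  'searchCriteria[filter_groups][0][filters][0][field]': 'name',
  'searchCriteria[filter_groups][0][filters][0][value]': '%sample%',
  'searchCriteria[filter_groups][0][filters][0][condition_type]': 'like',
  'searchCriteria[filter_groups][1][filters][0][field]': 'category_id',
  'searchCriteria[filter_groups][1][filters][0][value]': '10',
  'searchCriteria[filter_groups][1][filters][0][condition_type]': 'eq',
  'searchCriteria[sortOrders][0][field]': 'name',
  'searchCriteria[sortOrders][0][direction]': 'ASC',
  'searchCriteria[pageSize]': '20',
  'searchCriteria[currentPage]': '1'
});
console.log('/rest/test/V1/products?' + params.toString());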

Subscribing to an entity without specifying attributes

I'm trying to subscribe to an entity to get notifications using ONCHANGE.
The thing is that I'd like to get notified when new attributes are added to or removed from the entity; in other words, I want a notification whenever anything changes on that entity.
Is that possible? I tried setting an empty condValues list in the query like this:
{
  "entities": [
    {
      "type": "case",
      "isPattern": "false",
      "id": "Case1"
    }
  ],
  "reference": "http://localhost:1028/accumulate",
  "duration": "P1M",
  "notifyConditions": [
    {
      "type": "ONCHANGE",
      "condValues": [
        "Test Node 1"
      ]
    }
  ],
  "throttling": "PT5S"
}
But it didn't work.
PS: Note that I omitted the attributes array in order to receive all the attributes in the notification; that part does work.
The current Orion version (0.19.0) doesn't implement such a feature. However, it is planned for the future (see this issue in the Orion GitHub repository).
EDIT: since Orion 0.27.0 you can subscribe to changes in any attribute. In order to do so, do the subscription omitting the condValues field (or use an empty array [] as value).
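So, on Orion 0.27.0 or later, the subscription from the question could simply be sent with an empty condValues:

{
  "entities": [
    {
      "type": "case",
      "isPattern": "false",
      "id": "Case1"
    }
  ],
  "reference": "http://localhost:1028/accumulate",
  "duration": "P1M",
  "notifyConditions": [
    {
      "type": "ONCHANGE",
      "condValues": []
    }
  ],
  "throttling": "PT5S"
}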