Filter data on call to getHyperCubeData - Qlik Sense

When I run the following, I get all records from my table object (assuming I have 100 records in all). Is there a way to send a selection/filter? For example, I want to retrieve only the records where department = 'procuring'.
table.getHyperCubeData('/qHyperCubeDef', [{
  qWidth: 8,
  qHeight: 100
}]).then(data => console.log(data));

I figured out the answer. Before getting the hypercube data, I need to get the field from the Doc class, then do the following:
.then(doc => doc.getField('department'))
.then(field => field.clear().then(() => field.select({qMatch: filter['procuring']})))
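Putting it together, a fuller sketch of that flow could look like the following (enigma.js-style API; the literal 'procuring' match string and the existing doc/table handles are assumptions based on the question):
// Sketch only: select on the field first, then fetch the hypercube data.
// 'procuring' and the doc/table handles are assumed from the question.
doc.getField('department')
  .then(field => field.clear().then(() => field.select({ qMatch: 'procuring' })))
  .then(() => table.getHyperCubeData('/qHyperCubeDef', [{
    qTop: 0,
    qLeft: 0,
    qWidth: 8,
    qHeight: 100
  }]))
  .then(data => console.log(data)); // only rows for department = 'procuring' remain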


How to filter an array of object in Mui DataGrid?

I recently changed my tables to the MUI DataGrid in Material UI 5, and I have a special use case with an array of objects. I want to enable filtering on the phone number column, but the number is provided as a list of objects.
phone: [
  { type: "home", number: "795-946-1806" },
  { type: "mobile", number: "850-781-8104" }
]
I was expecting a 'customFilterAndSearch' or an option to customise how to search in this specific field.
customFilterAndSearch: (term, rowData) =>
  !!rowData?.suppressedOptions.find(({ description }) =>
    description?.toLowerCase().includes(term.toLowerCase())
  ),
I have made some attempts with the filterOperators, but no success yet. I have made a full example here: https://codesandbox.io/s/mui-data-grid-vs05fr?file=/demo.js
As far as I can see from the DataGrid documentation, there is no way to change the filter function for a specific field.
Likely the best workaround for your use case is converting the data to a string before you pass it to the DataGrid, though you will lose the styling you currently apply by making the phone type bold.
On second thought, your best bet would probably be to split the phone column into two columns, which would be the cleanest way of solving your problem.
Add a helper function.
You could potentially add a helper function that maps all the phone entries to fields such as mobilePhone or homePhone:
const mapPhoneObject = (rows) => {
  rows.forEach((row) => {
    row.phone.forEach((phone) => {
      // Flatten each entry into a top-level field, e.g. homePhone / mobilePhone
      row[`${phone.type}Phone`] = phone.number;
    });
  });
  return rows;
};
I've added a fork of your snippet with my function; I think it is the most viable solution for your problem: https://codesandbox.io/s/mui-data-grid-forked-ppii8y
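For completeness, a rough sketch of how the mapped rows could be wired into the grid (the homePhone/mobilePhone column names are just the keys the helper above would produce; widths are arbitrary):
// Sketch: the flattened fields are plain strings, so the built-in string filters work on them.
const columns = [
  { field: 'homePhone', headerName: 'Home phone', width: 160 },
  { field: 'mobilePhone', headerName: 'Mobile phone', width: 160 }
];

<DataGrid rows={mapPhoneObject(rows)} columns={columns} />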

Get number of documents in a big collection in Cloud Firestore

I know this question has already been asked, but I'm being specific about my case: I've got a large database (approximately 1 million documents inside the users collection).
I want to get the exact number of documents inside users. I'm trying this:
export const count_users = functions.https.onRequest((request, response) => {
  corsHandler(request, response, () => {
    db.collection('users').select().get()
      .then((snapshot) => response.json(snapshot.docs.length))
      .catch(function (error) {
        console.error("[count_users] Error counting users: ", error);
        response.json("Failed");
      });
  });
});
Although it seems right, it takes forever to give me a result. I'm not allowed to add or remove documents from the database.
Is there any possible approach for getting this quantity?
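One possible approach, assuming a recent firebase-admin / Firestore Admin SDK with aggregation-query support, is a server-side count() so the documents are never downloaded; a rough sketch:
// Sketch only: the count() aggregation requires a recent Admin SDK version.
export const count_users = functions.https.onRequest((request, response) => {
  corsHandler(request, response, () => {
    db.collection('users').count().get()
      .then((snapshot) => response.json(snapshot.data().count))
      .catch((error) => {
        console.error("[count_users] Error counting users: ", error);
        response.json("Failed");
      });
  });
});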

Two outputs in logstash. One for certain aggregations only

I'm trying to specify a second Logstash output in order to save only certain aggregated data. I have no clue how to achieve it at the moment; the documentation doesn't cover such a case.
At the moment I use a single input and a single output.
Input definition (logstash-udp.conf):
input {
  udp {
    port => 25000
    codec => json
    buffer_size => 5000
    workers => 2
  }
}
filter {
  grok {
    match => [ "message", "API call happened" ]
  }
  aggregate {
    task_id => "%{example_task}"
    code => "
      map['api_calls'] ||= 0
      map['api_calls'] += 1
      map['message'] ||= event.get('message')
      event.cancel()
    "
    timeout => 60
    push_previous_map_as_event => true
    timeout_code => "event.set('aggregated_calls', event.get('api_calls') > 0)"
    timeout_tags => ['_aggregation']
  }
}
Output definition (logstash-output.conf):
output {
  elasticsearch {
    hosts => ["localhost"]
    manage_template => false
    index => "%{[@metadata][udp]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
What do I want to achieve now? I need to add a second, different aggregation (different data and conditions). All the non-aggregated data should still be saved to Elasticsearch as it is now, but the aggregated data for this second aggregation should be saved to Postgres. I'm pretty much stuck at the moment, and searching the web for docs/examples hasn't helped.
I'd suggest using multiple pipelines: https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
This way you can have one pipeline for the aggregation and a second one for the plain data, as in the sketch below.
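A rough sketch of what pipelines.yml could look like (pipeline ids and config paths are placeholders, not taken from the question):
# pipelines.yml - sketch only; ids and paths are placeholders
- pipeline.id: plain-data
  path.config: "/etc/logstash/conf.d/plain-data.conf"        # existing filter/output -> Elasticsearch
- pipeline.id: aggregated-data
  path.config: "/etc/logstash/conf.d/aggregated-data.conf"   # second aggregation -> Postgres (e.g. via a jdbc output plugin)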

Dataloader did not return an array of the same length?

I am building an Express.js application with GraphQL and MongoDB (Mongoose). I am using Facebook's DataLoader to batch and cache requests.
It's working perfectly fine except for this use case.
I have a database filled with users' posts. Each post contains the user's ID for reference. When I make a call to return all the posts in the database, the posts are returned fine, but if I try to get the user in each post, users with multiple posts will only return a single user because the key for the second user is cached. So 2 posts (keys) from user "x" will only return 1 user object "x".
However, DataLoader has to return the same number of values as the keys it receives.
It has an option to set cache to false so that each key makes a request, but this doesn't seem to work for my use case.
Sorry if I haven't explained this very well.
This is my GraphQL request:
query {
  getAllPosts {
    _id    # This is returned fine
    user {
      _id
    }
  }
}
Returned error:
DataLoader must be constructed with a function which accepts Array<key> and returns Promise<Array<value>>, but the function did not return a Promise of an Array of the same length as the Array of keys.
Are you trying to batch post keys [1, 2, 3] and expecting to get user results [{ user1 }, { user2 }, { user1 }]?
Or are you trying to batch user keys [1, 2] and expecting to get post results [{ post1 }, { post3 }] and [{ post2 }]?
It seems like only in the second case will you run into a situation where the length of the keys differs from the length of the results array.
To solve the second case, you could do something like this in SQL:
const loader = new Dataloader(userIDs => {
  const promises = userIDs.map(id => {
    return db('user_posts')
      .where('user_id', id);
  });
  return Promise.all(promises);
})
loader.load(1)
loader.load(2)
So you return [[{ post1 }, { post3 }], [{ post2 }]], which DataLoader can unwrap.
If you had done this instead:
const loader = new Dataloader(userIDs => {
  return db('user_posts')
    .whereIn('user_id', userIDs);
})
loader.load(1)
loader.load(2)
You will instead get [{ post1 }, { post3 }, { post2 }], and hence the error: the function did not return a Promise of an Array of the same length as the Array of keys.
Not sure if the above is relevant/helpful. I can revise if you can provide a snippet of your batch load function.
You need to map the data returned from the database to the Array of keys.
Dataloader: The Array of values must be the same length as the Array of keys
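A minimal sketch of that mapping for the Mongoose case (the User model name is an assumption):
// Sketch: batch-load users, then return them in the same order as the incoming keys,
// so duplicate keys (two posts by user "x") each get their own slot in the result array.
const userLoader = new DataLoader(async (userIds) => {
  const users = await User.find({ _id: { $in: userIds } });
  const usersById = new Map(users.map((user) => [String(user._id), user]));
  return userIds.map((id) => usersById.get(String(id)) || null);
});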
This issue is well explained in this YouTube Video - Dataloader - Highly recommended

Order Posts by Most Votes (Overall, Last Month, etc.) with Laravel MongoDB

I am trying to understand more advanced functions of MongoDB and Laravel but am having trouble with this. Currently I have my schema set up with users, posts, and posts_votes collections. The posts_votes collection has user_id, post_id, and timestamp fields.
In a relational DB, I would just left join the posts_votes table, count, and order by that count, excluding dates when need be and all that.
With MongoDB I am having difficulty because there is no left-join equivalent, so I'd like to learn how to accomplish my goal in a more document-oriented way.
On my Post model in Laravel, I reference the votes this way, so looking at an individual post I can get the vote count, see whether the current user voted for a specific post, etc.:
public function votes()
{
    return $this->hasMany(PostVote::class, 'post_id');
}
And my current working query looks like this:
$posts = Post::forCategoryType($type)
    ->with('votes', 'author', 'businessType')
    ->where('approved', true)
    ->paginate(25);
The forCategoryType method is just an extended scope I added. Here it is on the Post model/document class:
public function scopeForCategoryType($builder, $catType)
{
    if ($catType->exists) {
        return $builder->where('cat_id', $catType->id);
    }
    return $builder;
}
So when I look at posts like this one, it's close to what I want to accomplish, but I am not applying it properly. For instance, I changed my main query to look like this:
$posts = Post::forBusinessType($type)
    ->with('votes', 'author', 'businessType')
    ->where('approved', true)
    ->sortByVotes()
    ->paginate(25);
And created this new method on the Post model:
public function scopeSortByVotes($builder, $dir = 'desc')
{
    return $builder->raw(function ($collection) {
        return $collection->aggregate([
            ['$group' => [
                '_id' => ['post_id' => 'votes.$post_id', 'user_id' => 'votes.$user_id']
            ],
            'vote_count' => ['$sum' => 1]
            ],
            ['$sort' => ['vote_count' => -1]]
        ]);
    });
}
This returns the following error: exception: A pipeline stage specification object must contain exactly one field.
Not sure how to fix that (still looking), so then I tried:
return $collection->aggregate([
    ['$unwind' => '$votes'],
    ['$group' => [
        '_id' => ['post_id' => ['$votes.post_id', 'user_id' => '$votes.user_id']],
        'count' => ['$sum' => 1]
    ]]
]);
This returns an empty ArrayIterator, so then I tried:
public function scopeSortByVotes($builder, $dir = 'desc')
{
    return $builder->raw(function ($collection) {
        return $collection->aggregate([
            '$lookup' => [
                'from' => 'community_posts_votes',
                'localField' => 'post_id',
                'foreignField' => '_id',
                'as' => 'vote_count'
            ]
        ]);
    });
}
But with this setup, I just get the list of posts, unsorted. I am on version 3.2.8.
The default loads everything by most recent, but ultimately I want to be able to pull these posts based on how many votes they got lifetime, and also to query which posts got the most votes in the last week, month, etc.
The example I shared stores the grand total on the Post model along with an array of all the user IDs that voted on it. With the way I have things set up, using a separate collection holding the user_id, post_id, and timestamp of when the vote happened, can I still accomplish the same goal?
Note: I am using this Laravel MongoDB library.
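As a point of comparison, the left-join-style count described above might look like this in plain MongoDB aggregation (mongo shell syntax; collection and field names are taken from the question, and the _id/post_id join direction is an assumption):
// Sketch: join votes onto each post, count them, and sort by that count.
db.posts.aggregate([
  { $match: { approved: true } },
  { $lookup: {
      from: "posts_votes",
      localField: "_id",        // the post's own id...
      foreignField: "post_id",  // ...matched against each vote's post_id
      as: "votes"
  } },
  { $addFields: { vote_count: { $size: "$votes" } } },
  { $sort: { vote_count: -1 } }
])
// A time window (last week/month) could be applied with $filter on "votes" before taking $size.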