I am writing a test case using Jest to test the data returned from a method. The method returns an array of non-repeating elements. Now I am trying to use expect() in Jest to test whether the array returned from the method has only unique elements.
The array returned from the method:
arr = ['Pizza', 'Burger', 'HotDogs'] // all elements are unique
Are there any Jest matchers like the one below to check for non-repeating elements in an array?
expect(arr).toBeUnique()
Or should it be done with some logic using existing matchers?
There is no built-in matcher to check that an array has only unique values, but I would suggest doing something like this:
const goods = ['Pizza', 'Burger', 'HotDogs'];
const isArrayUnique = arr => Array.isArray(arr) && new Set(arr).size === arr.length; // a Set drops duplicates, so equal sizes mean all elements are unique
expect(isArrayUnique(goods)).toBeTruthy();
You can use expect.extend to add your own matchers to Jest.
For example:
expect.extend({
  toBeDistinct(received) {
    const pass = Array.isArray(received) && new Set(received).size === received.length;
    if (pass) {
      return {
        message: () => `expected [${received}] not to contain only distinct elements`,
        pass: true,
      };
    } else {
      return {
        message: () => `expected [${received}] to contain only distinct elements`,
        pass: false,
      };
    }
  },
});
and use it:
const goods = ['Pizza', 'Burger', 'HotDogs'];
const randomArr = ['Pizza', 'Burger', 'Pizza'];
expect(goods).toBeDistinct(); // Passed
expect(randomArr).toBeDistinct(); // Failed
This is very similar to Yevhen's answer, but I've changed the error message to describe the first duplicate item encountered, like
item 3 is repeated in [1,2,3,3]
at the expense of sorting the array:
expect.extend({
  toContainUniqueItems(received) {
    const items = [...(received || [])];
    items.sort(); // sorting places duplicates next to each other
    for (let i = 0; i < items.length - 1; i++) {
      if (items[i] === items[i + 1]) {
        return {
          pass: false,
          message: () => `item ${items[i]} is repeated in [${items}]`,
        };
      }
    }
    return {
      pass: true,
      message: () => `all items are unique in [${items}]`,
    };
  },
});
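The same sort-and-scan check can also be pulled out into a plain helper and unit-tested on its own. A small sketch (the function name firstDuplicate is mine, not part of the matcher above):

```javascript
// Returns the first duplicated item in the array, or undefined if all items
// are unique. Sorting a copy places duplicates next to each other, so a
// single linear scan is enough.
function firstDuplicate(received) {
  const items = [...(received || [])].sort();
  for (let i = 0; i < items.length - 1; i++) {
    if (items[i] === items[i + 1]) {
      return items[i];
    }
  }
  return undefined;
}

console.log(firstDuplicate([1, 2, 3, 3])); // 3
console.log(firstDuplicate(['Pizza', 'Burger', 'HotDogs'])); // undefined
```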
I have queries like:
useQuery(['myquery', { test: 1 }], fetchFn)
useQuery(['myquery', { test: 2 }], fetchFn)
useQuery(['myquery', { test: 3 }], fetchFn)
I would like to observe the data of all queries whose queryKey starts with 'myquery', without knowing the rest of the queryKey. As I understand the documentation, it is possible to observe multiple queries, but my matching condition does not seem to be covered:
const observer = new QueriesObserver(queryClient, [
{ queryKey: ['post', 1], queryFn: fetchPost },
{ queryKey: ['post', 2], queryFn: fetchPost },
])
const unsubscribe = observer.subscribe(result => {
console.log(result)
unsubscribe()
})
I could only find a similar usage for useIsFetching, but it only gives the number of matching queries:
// How many queries matching the posts prefix are fetching?
const isFetchingPosts = useIsFetching(['posts'])
But I want to access the result of the queries, specifically the last updated one.
This is the best thing I could come up with using queryClient:
const Component = () => {
  // match all queries with this prefix:
  const keyPrefix = "courseSection_list";
  // since it includes the loading state, it will trigger twice for each returned result
  const matchingQueriesUpdated = useIsFetching([keyPrefix]);
  const data = useMemo(() => {
    const lastUpdatedMatchingQuery = queryClient.queryCache.queries
      .filter((q) => q.queryKey[0] === keyPrefix)
      .sort((a, b) => b.state.dataUpdatedAt - a.state.dataUpdatedAt)[0]; // sorting puts the last updated one first
    return lastUpdatedMatchingQuery.state.data;
  }, [matchingQueriesUpdated]);
  return <div> bla bla </div>;
};
The extra render could be prevented by checking for a dataUpdatedAt value of 0 during the loading state, but I'd rather keep my code simple for now.
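The prefix-match-and-pick-latest step is plain JavaScript, so it can be sketched and tested in isolation. The objects below are hand-built stand-ins that only mimic the queryKey/state shape of react-query's cache entries:

```javascript
// Picks the data of the most recently updated query whose key starts with a
// given prefix. `queries` mimics the shape of react-query cache entries.
function latestDataByPrefix(queries, keyPrefix) {
  const lastUpdated = queries
    .filter((q) => q.queryKey[0] === keyPrefix)
    .sort((a, b) => b.state.dataUpdatedAt - a.state.dataUpdatedAt)[0];
  return lastUpdated && lastUpdated.state.data; // undefined when nothing matches
}

const cache = [
  { queryKey: ['myquery', { test: 1 }], state: { dataUpdatedAt: 100, data: 'first' } },
  { queryKey: ['myquery', { test: 2 }], state: { dataUpdatedAt: 300, data: 'second' } },
  { queryKey: ['other'], state: { dataUpdatedAt: 400, data: 'ignored' } },
];
console.log(latestDataByPrefix(cache, 'myquery')); // "second"
```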
I have a collection of Users in my database with corresponding unique ids. I am writing a function that takes an array of ids as an argument, e.g:
["user_id1", "user_id2", "user_id3", "user_id4"].
I want my query to return the first and only the first match. I.e., using the example above, if user_id2 and user_id4 were the only two matching users in the database, my result would only return user_id2. User ids that are not in the database are ignored.
My current approach is to use a while loop, but I wanted to see if there was a better solution provided by Mongoose.
Current pseudocode:
async function findOneUser(userIdArr) {
  let i = 0;
  let returnedUser;
  while (!returnedUser && i < userIdArr.length) {
    let id = userIdArr[i];
    returnedUser = await User.findById(id);
    i++;
  }
  return returnedUser;
}
Try this. Use promises and Mongoose's findOne method:
let argumentArr = ["user_id1", "user_id2", "user_id3", "user_id4"];

let getUser = new Promise(function(resolve, reject) {
  for (let i = 0; i < argumentArr.length; i++) {
    User.findOne({ _id: argumentArr[i] }).then(user => {
      if (user) { // findOne resolves with null when there is no match
        resolve(user);
      }
    }).catch(err => { reject(err); });
  }
});

getUser.then(
  (user) => { console.log(user); }, // expected result: a user object
  (err) => { console.log(err); }
);
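Note that the Promise above resolves with whichever lookup finishes first, not necessarily the first id in the array. If array order matters, a sequential sketch like the following works; the lookup parameter here is a hypothetical stand-in for User.findOne, so the loop can be exercised without a database:

```javascript
// Resolves with the result for the first id (in array order) whose lookup
// returns a non-null value; resolves with null if none match.
async function findFirstMatch(ids, lookup) {
  for (const id of ids) {
    const result = await lookup(id); // stand-in for User.findOne({ _id: id })
    if (result) {
      return result;
    }
  }
  return null;
}

// Usage sketch with a fake in-memory lookup:
const db = { user_id2: { _id: 'user_id2' }, user_id4: { _id: 'user_id4' } };
const fakeLookup = async (id) => db[id] || null;

findFirstMatch(['user_id1', 'user_id2', 'user_id3', 'user_id4'], fakeLookup)
  .then((user) => console.log(user)); // { _id: 'user_id2' }
```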
I have a Cloud Functions transaction that uses FieldValue.increment() to update a nested map, but it isn't running atomically, so the value updates aren't accurate (running the transaction in quick succession results in an incorrect count).
The function is fired via:
export const updateCategoryAndSendMessage = functions.firestore.document('events/{any}').onUpdate((event, context) => {
which includes the following transaction:
db.runTransaction(async tx => {
  const categoryCounterRef = db.collection("data").doc("categoryCount");
  const intToIncrement = event.after.data().published == true ? 1 : -1;
  const location = event.after.data().location;
  await tx.get(categoryCounterRef).then(doc => {
    for (const key in event.after.data().category) {
      event.after.data().category[key].forEach(async (subCategory) => {
        const map = { [key]: { [subCategory]: FieldValue.increment(intToIncrement) } };
        await tx.set(categoryCounterRef, { [location]: map }, { merge: true });
      });
    }
  }).then(result => {
    console.info('Transaction success!');
  }).catch(err => {
    console.error('Transaction failure:', err);
  });
}).catch((error) => console.log(error));
Example:
Value of field to increment: 0
Tap a button that triggers the function multiple times in quick succession (to switch between true and false for "published")
Expected value: 0 or 1 (depending on whether reference document value is true or false)
Actual value: -3, 5, -2 etc.
As far as I'm aware, transactions should be performed "first come, first served" to avoid inaccurate data. It seems like the function isn't "queuing up" correctly - for lack of a better word.
I'm a bit stumped, would greatly appreciate any guidance with this.
Oh goodness, I was missing return...
return db.runTransaction(tx => {
I have a sap.ui.table.Table whose selectionMode is Single and selectionBehavior is RowOnly.
I want to select a row programmatically based on its content.
There is code to select by index like
table.setSelectedIndex()
table.setSelectionInterval()
but I am not able to determine the index of the row whose content matches.
Is there any other way?
As commented in the question, there is currently no straightforward solution to select row(s) programmatically by content. But I read:
I want an answer that works. Best practices/ suggestions are not accepted.
If that's still the case, I assume you're ok with accessing internal properties. The only internal property I'm using is the aKeys from the ODataListBinding instance. The following snippets are from this example: https://embed.plnkr.co/7lcVJOaYsnIMJO1w [1]
Single Select
<Table xmlns="sap.ui.table"
  id="myGridTable"
  selectionMode="Single"
  selectionBehavior="RowOnly"
  rows="{
    path: '/Customers',
    events: {
      change: '.onRowsDataChange'
    }
  }"
>
  <!-- columns -->
</Table>
Controller.extend("demo.controller.TableSingleSelect", {
  onRowsDataChange: function(event) {
    this.selectCustomer(/*your key part(s) e.g.:*/ "ANTON"/*, ...*/);
  },
  selectCustomer: function(customerId/*, ...*/) {
    const rowsBinding = this.byId("myGridTable").getBinding("rows");
    this.selectIndexByKey(rowsBinding.getModel().createKey("Customers", {
      CustomerID: customerId,
      //...
    }), rowsBinding.aKeys);
  },
  selectIndexByKey: function(targetKey, keys) {
    const table = this.byId("myGridTable");
    const index = +Object.keys(keys).find(key => targetKey === keys[key]);
    const shouldSelect = index > -1 && !table.isIndexSelected(index);
    return shouldSelect ? table.setSelectedIndex(index) : table;
  },
});
Multi Select
<Table xmlns="sap.ui.table"
  id="myGridTable"
  selectionMode="MultiToggle"
  rows="{
    path: '/Orders',
    events: {
      change: '.onRowsDataChange'
    }
  }"
>
  <!-- columns -->
</Table>
Controller.extend("demo.controller.TableMultiSelect", {
  onRowsDataChange: function(event) {
    const value1 = new Date("1996"); // 1996-01-01
    const value2 = new Date("1997"); // 1997-01-01
    this.selectOrdersBy("OrderDate", "BT", value1, value2);
  },
  selectOrdersBy: function(propertyName, filterOperator, value1, value2) {
    const table = this.byId("myGridTable").clearSelection();
    const keys = table.getBinding("rows").aKeys;
    const loadedContexts = this.getLoadedContexts(keys, table, "rows");
    const filteredContexts = FilterProcessor.apply(loadedContexts, [
      new Filter(propertyName, filterOperator, value1, value2),
    ], (context, path) => context && context.getProperty(path));
    this.selectIndices(keys, filteredContexts, table);
  },
  getLoadedContexts: function(keys, control, aggregationName) {
    const model = control.getBinding(aggregationName).getModel();
    const parameters = control.getBindingInfo(aggregationName).parameters;
    return keys.map(key => model.createBindingContext(`/${key}`, parameters));
  },
  selectIndices: (keys, contexts, table) => Object.keys(keys).map(index => +index)
    .filter(i => contexts.find(context => `/${keys[i]}` == context.getPath()))
    .map(i => table.isIndexSelected(i) || table.addSelectionInterval(i, i)),
});
* FilterProcessor is a private module.
The internal property aKeys consists of the keys of loaded contexts, with indices reflecting the table's row indices. E.g., if the table has 3 rows loaded and I call table.getContextByIndex(90), aKeys will be:
0: "Customers('ALFKI')"
1: "Customers('ANATR')"
2: "Customers('ANTON')"
90: "Customers('WOLZA')"
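The row-index lookup against that sparse structure is plain JavaScript and can be tried standalone. A sketch (the helper name indexOfKey is mine; it mirrors the lookup inside selectIndexByKey):

```javascript
// Finds the numeric row index whose stored key equals the target key, or -1.
// `keys` mimics the sparse aKeys object of an ODataListBinding.
function indexOfKey(targetKey, keys) {
  const index = +Object.keys(keys).find((key) => targetKey === keys[key]);
  return Number.isNaN(index) ? -1 : index;
}

const aKeys = {
  0: "Customers('ALFKI')",
  1: "Customers('ANATR')",
  2: "Customers('ANTON')",
  90: "Customers('WOLZA')",
};
console.log(indexOfKey("Customers('WOLZA')", aKeys)); // 90
console.log(indexOfKey("Customers('NOPE')", aKeys)); // -1
```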
The change handler onRowsDataChange is fired on any ChangeReason. This keeps the selection from being removed whatever happens to the table rows, be it sorting, filtering, refreshing, etc.
[1]: The samples in this answer work with an ODataModel. In case of a client-side JSONModel, take a look at this answer: stackoverflow.com/a/52664812.
Depending on your design, you can for instance use a button to retrieve the selected index:
oEvent.getSource().getParent().getIndex()
example
This small piece of code did the work for me. The table is bound to a JSONModel.
const rowsBinding = oTable.getBinding("rows");
const index = rowsBinding.oList.findIndex(function(element) {
  return element.yourField === "your Content";
});
if (index > -1) {
  oTable.setSelectedIndex(index);
}
Now I am able to select my row on the basis of its content.
JSONModel
There was another request to do it with a JSONModel, so here it is.
The following snippets are from https://embed.plnkr.co/xuSU3uH1rkXmEAV7:
<Table xmlns="sap.ui.table"
  id="myGridTable"
  selectionMode="Single"
  selectionBehavior="RowOnly"
  rows="{
    path: '/Customers',
    events: {
      change: '.onRowsDataChange'
    }
  }"
>
  <!-- columns -->
</Table>
Controller.extend("demo.controller.TableSingleSelect", {
  onRowsDataChange: function(event) {
    this.selectWhere(context => context.getProperty("CustomerID") == "ANTON" /*&& ...*/);
  },
  selectWhere: function(keysAreMatching) {
    const table = this.byId("myGridTable");
    const contexts = table.getBinding("rows").getContexts();
    const index = this.getRowIndexWhere(keysAreMatching, contexts);
    return this.selectRowByIndex(index, table);
  },
  getRowIndexWhere: function(keysAreMatching, contexts) {
    let index = -1;
    contexts.find((context, i) => keysAreMatching(context) && (index = i));
    return index;
  },
  selectRowByIndex: function(i, table) {
    const shouldSelect = i > -1 && !table.isIndexSelected(i);
    return shouldSelect ? setTimeout(() => table.setSelectedIndex(i)) : table;
  },
});
With a client-side model like JSONModel, it is a bit easier to find certain row(s) than with server-side models, since all the necessary data is locally available via rowsBinding.getContexts(). The returned contexts are assigned to indices corresponding to the table's row indices.
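The index search in getRowIndexWhere is framework-independent, so it can be exercised with stub contexts. A simplified equivalent using Array.prototype.findIndex (the stub objects below only mimic Context.getProperty and are not real sap.ui.model.Context instances):

```javascript
// Returns the index of the first context matching the predicate, or -1 if none does.
const getRowIndexWhere = (keysAreMatching, contexts) => contexts.findIndex(keysAreMatching);

// Stub contexts standing in for sap.ui.model.Context instances:
const contexts = ['ALFKI', 'ANATR', 'ANTON'].map((id) => ({
  getProperty: (name) => (name === 'CustomerID' ? id : undefined),
}));

console.log(getRowIndexWhere((ctx) => ctx.getProperty('CustomerID') === 'ANTON', contexts)); // 2
console.log(getRowIndexWhere((ctx) => ctx.getProperty('CustomerID') === 'NOPE', contexts)); // -1
```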
I run an IRC bot, and I have a function which currently returns one random URL from my MongoDB collection using Math.random.
I would like to refactor it to return x unique items, and for each subsequent invocation of the url-fetching command .getlinks, it should keep everything unique, so that a user doesn't see the same link until all the possible links have already been returned.
Is there some algorithm or native MongoDB function I could use for this?
Here's a sample scenario:
I have a total of 9 records in the collection. Each has an _id and a url field.
user a: .getlinks()
bot returns: http://unique-link-1, http://unique-link-2, http://unique-link-3, http://unique-link-4
user a: .getlinks()
bot returns: http://unique-link-5, http://unique-link-6, http://unique-link-7, http://unique-link-8
user a: .getlinks()
bot returns: http://unique-link-9, http://unique-link-6, http://unique-link-1, http://unique-link-3
Background information:
There's a total of about 200 links. I estimate that will grow to around 5000 links by the end of next year.
Currently the only thing I can think of is keeping an array of all returned items, grabbing all items from the collection at once, getting a random one 4 times, and making sure each is unique and hasn't been shown already.
var shown = [], amountToReturn = 4;

function getLinks() {
  var items = links.find(), returned = [];
  for (var i = 0; i < amountToReturn; i++) {
    var rand = randItem(items);
    if (shown.indexOf(rand.url) === -1 && shown.length < items.length) {
      returned.push(rand.url);
      shown.push(rand.url); // remember it so it is not repeated later
    }
  }
  message.say(returned.join(','));
}
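A common way to get this behavior without maintaining a shown list is to shuffle the full link list once (Fisher-Yates) and deal out slices, reshuffling only when the deck is exhausted. A plain-JavaScript sketch (the in-memory array stands in for the MongoDB collection; as in the sample scenario, a batch that straddles a reshuffle may repeat a recent link):

```javascript
// Deals out unique links in batches; reshuffles once every link has been dealt.
function makeLinkDealer(links, batchSize) {
  let deck = [];
  const shuffle = (arr) => {
    // Fisher-Yates shuffle on a copy of the array
    const a = [...arr];
    for (let i = a.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [a[i], a[j]] = [a[j], a[i]];
    }
    return a;
  };
  return function getLinks() {
    const batch = [];
    while (batch.length < batchSize) {
      if (deck.length === 0) deck = shuffle(links); // reshuffle when exhausted
      batch.push(deck.pop());
    }
    return batch;
  };
}

const deal = makeLinkDealer(['link-1', 'link-2', 'link-3', 'link-4', 'link-5'], 2);
console.log(deal()); // e.g. ['link-3', 'link-1']
```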
You should find a number of possible options for getting random item(s) from a collection here:
http://jira.mongodb.org/browse/SERVER-533
Another interesting method is documented here:
http://cookbook.mongodb.org/patterns/random-attribute/
The method mentioned above basically adds a new key/value pair to each document using Math.random():
> db.docs.drop()
> db.docs.save( { key : 1, ..., random : Math.random() } )
> db.docs.save( { key : 1, ..., random : Math.random() } )
> db.docs.save( { key : 2, ..., random : Math.random() } )
... many more insertions with 'key : 2' ...
> db.docs.save( { key : 2, ..., random : Math.random() } )
...
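To actually fetch a random document with this pattern, you draw a fresh Math.random() value and look up the document with the nearest stored random value at or above it, wrapping around when the draw lands past the largest stored value. The lookup logic, simulated in plain JavaScript over an in-memory array (in the shell this would be an indexed find/sort/limit on the random field):

```javascript
// Simulates the random-attribute lookup: each doc carries a precomputed
// `random` value; a query picks the doc with the smallest random >= r,
// wrapping around to the smallest random overall when none qualifies.
function pickRandomDoc(docs, r) {
  const sorted = [...docs].sort((a, b) => a.random - b.random);
  return sorted.find((d) => d.random >= r) || sorted[0];
}

const docs = [
  { key: 1, random: 0.12 },
  { key: 2, random: 0.55 },
  { key: 3, random: 0.91 },
];
console.log(pickRandomDoc(docs, 0.5)); // { key: 2, random: 0.55 }
console.log(pickRandomDoc(docs, 0.95)); // wraps around: { key: 1, random: 0.12 }
```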
Get random records from MongoDB via map/reduce:
// map
function() {
  emit(0, { k: this, v: Math.random() });
}

// reduce
function(k, v) {
  var a = [];
  v.forEach(function(x) {
    a = a.concat(x.a ? x.a : x);
  });
  return {
    a: a.sort(function(a, b) {
      return a.v - b.v;
    }).slice(0, 3 /* how many records you want */)
  };
}

// finalize
function(k, v) {
  return v.a.map(function(x) {
    return x.k;
  });
}
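The same idea in plain JavaScript, outside MongoDB: tag each record with a random sort key, sort by it, and take the first n, which is essentially what the reduce step above does with the v values (a sketch; the function name sampleRecords is mine):

```javascript
// Returns n records picked at random (without replacement) by tagging each
// record with a random sort key, sorting, and slicing off the first n.
function sampleRecords(records, n) {
  return records
    .map((k) => ({ k, v: Math.random() }))
    .sort((a, b) => a.v - b.v)
    .slice(0, n)
    .map((x) => x.k);
}

console.log(sampleRecords([1, 2, 3, 4, 5], 3)); // e.g. [4, 1, 5]
```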