Ionic2 Native Storage storing/retrieving Arrays - ionic-framework

I have an Ionic app in which I store an Array with Native-Storage.
This Array is an array of objects.
I stored it like this:
// array1: CertainType[], with at least 50 shuffled elements
this.nativeStorage.setItem('array1', { property: array1 })
  .then(
    () => { console.log('array1 stored'); },
    error => { console.log('array1 not stored', error); }
  );
I retrieve the item like this:
this.nativeStorage.getItem('array1').then(
  array1 => {
    // On success of getting array1, use it to create array2
    array2 = array1.splice(0, 5); // <-- MY PROBLEM IS HERE
  },
  error => {
    console.error('Error Getting Array', error);
  }
);
I keep getting an error on the splice() call.
I thought it was because the process of storing and retrieving was messing with the type of the array, so I tried casting:
...array1 as CertainType[]
-- EDITED --
I tried stringifying and JSON-parsing:
this.nativeStorage.setItem('array1', { property: JSON.stringify(array1) }).then( . . .
array2 = JSON.parse(array1);
This throws the following error:
ERROR Error: Uncaught (in promise): SyntaxError: Unexpected token o in JSON at position 1
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
But I keep getting the same error on the splice().
If I don't use the storing logic, the code runs just fine... Any clues? What am I missing? :/

Use JSON.stringify before storing (shown here with localStorage; the same round trip applies to Native Storage, whose calls return a promise). Just do this, for example:
var test = { test: "thing", test2: "thing2", test3: [0, 2, 44] };
localStorage.setItem("test", JSON.stringify(test));
var test2 = localStorage.getItem("test");
test = JSON.parse(test2);

The Ionic Native Storage Documentation confused me.
this.nativeStorage.setItem('myitem', { property: 'value', anotherProperty: 'anotherValue' })
  .then(
    () => console.log('Stored item!'),
    error => console.error('Error storing item', error)
  );
I stuck to the docs and used almost identical code in my app, but that word "property" in there was what kept tripping me up.
The good contributor above insisted (thank God) on using JSON.stringify and JSON.parse to save and retrieve the data.
So I did, but I kept getting errors. Then I realized: when I retrieved the data, my array was stored in an object. OK! But under a p-r-o-p-e-r-t-y attribute.
If I fetched my array1 using array1.property, I would get what I was looking for.
In the end, just a little change made it work like a clock:
this.nativeStorage.setItem('array1', JSON.stringify(array1))
  .then( . . .
this.nativeStorage.getItem('array1').then( array1 => {
  console.log(JSON.parse(array1));
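Putting the round trip together, here is a minimal self-contained sketch. An in-memory Map stands in for NativeStorage (which is promise-based, but the JSON round trip is identical):

```javascript
// Hypothetical in-memory store standing in for NativeStorage.
const store = new Map();

const array1 = [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }, { id: 6 }];

// Store the array as a JSON string instead of wrapping it in { property: ... }.
store.set('array1', JSON.stringify(array1));

// Retrieve and parse: the result is a real Array again, so splice() works.
const restored = JSON.parse(store.get('array1'));
const array2 = restored.splice(0, 5);

console.log(Array.isArray(restored)); // true
console.log(array2.length); // 5
```

The original splice() error came from calling an Array method on the plain object ({ property: [...] }) that getItem handed back, rather than on an Array.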

Related

Adding a column and updating all records using knex and postgres

I need to add a column to my table of riders, allowing us to store the name of the image that will display on that rider's card. I then need to update all of the records with the auto-generated image names.
I've done a bunch of searching, and all roads seem to lead back to this thread or this one. I've tried the code from both of these threads, swapping in my own table and column names, but I still can't get it to work.
This is the latest version of the code:
export async function up(knex, Promise) {
  return knex.transaction(trx => {
    const riders = [
      {
        name: 'Fabio Quartararo',
        card: 'rider_card_FabioQuartararo'
      },
      ...24 other riders here...
      {
        name: 'Garrett Gerloff',
        card: 'rider_card_GarrettGerloff'
      },
    ];
    return knex.schema.table('riders', (table) => table.string('card')).transacting(trx)
      .then(() => {
        const queries = [];
        riders.forEach(rider => {
          const query = knex('riders')
            .update({
              card: rider.card
            })
            .where('name', rider.name)
            .transacting(trx); // This makes every update be in the same transaction
          queries.push(query);
        });
        Promise.all(queries) // Once every query is written
          .then(() => trx.commit) // We try to execute all of them
          .catch(() => trx.rollback); // And rollback in case any of them goes wrong
      });
  });
}
When I run the migration, however, it fails with the following error:
migration file "20211202225332_update_rider_card_imgs.js" failed
migration failed with error: Cannot read properties of undefined (reading 'all')
Error running migrations: TypeError: Cannot read properties of undefined (reading 'all')
at D:\Users\seona\Documents\_Blowfish\repos\MotoGP\dist\database\migrations\20211202225332_update_rider_card_imgs.js:134:25
at processTicksAndRejections (node:internal/process/task_queues:96:5)
So it's clearly having some sort of problem with Promise.all(), but I can't for the life of me figure out what. Searching has not turned up any useful results.
Does anyone have any ideas about how I can get this working? Thanks in advance.
I think you might be following some older documentation and/or examples (at least that's what I was doing).
The Promise argument is no longer passed into the migration up and down functions.
So, the signature should be something like this:
function up(knex) {
  // Use the built-in Promise class
  Promise.all(<ARRAY_OF_QUERY_PROMISES>);
  ...
}
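For illustration, here is a sketch of the corrected pattern that runs standalone. The update helper is a hypothetical stand-in for the real knex('riders').update(...) queries, since knex itself is not available here; the point is only that the global Promise is used and no second argument is needed:

```javascript
// Sketch of the fixed migration: modern knex passes only `knex`, and the
// built-in global Promise replaces the old injected argument.
async function up(/* knex */) {
  const riders = [
    { name: 'Fabio Quartararo', card: 'rider_card_FabioQuartararo' },
    { name: 'Garrett Gerloff', card: 'rider_card_GarrettGerloff' },
  ];
  // One promise per rider, mirroring the per-rider update queries.
  const update = rider => Promise.resolve(rider.card);
  return Promise.all(riders.map(update)); // global Promise.all, no injection
}

up().then(cards => console.log(cards.length)); // 2
```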

instantsearch.js with error in javascript console => RangeError: Invalid array length

I have had this error in the JavaScript console for a long time.
It is related to instantsearch.js and the paginator.
Can anyone help?
range.ts:18 Uncaught (in promise) RangeError: Invalid array length
at Ie (range.ts:18)
at t.value (Paginator.js:31)
at Object.render (connectPagination.js:147)
at index.ts:481
at Array.forEach (<anonymous>)
at Object.render (index.ts:472)
at InstantSearch.ts:510
at defer.ts:26
This error appears when the page loads and I initialize the search instance.
OK, here is what you need to do. See the Algolia documentation and its code sample on implementing a custom search client.
Finally, inside your custom search client do this. I'm using a sample computed property as my searchClient, which will be cached and thus not fire another request to Algolia:
search(requests) {
  if (requests.every(({ params }) => !params.query)) {
    // The query is empty (i.e. ""), so don't hit Algolia at all.
    // When an empty search is detected, a formatted response must be
    // returned with at least the properties below; otherwise a
    // RangeError is thrown because required fields are missing.
    return Promise.resolve({
      results: requests.map(() => ({
        hits: [],
        nbHits: 0,
        nbPages: 0,
        page: 0,
        facets: [],
        processingTimeMS: 0,
        hitsPerPage: 0,
        exhaustiveNbHits: false,
        query: "",
        params: "",
      })),
    });
  }
  // Else query using the provided input
  return algoliaClient.search(requests);
},
};
}

geofirestore querydocumentsnapshot

Using geofirestore, I can query the collection users/{userId}/pros in Cloud Functions and get the resulting documents (doc). Now I want to add a new collection users/{userId}/pros/{proId}/notifs right under each document users/{userId}/pros/{proId} that came from the query. So I wrote this:
exports.sendNotification = functions.firestore
  .document("users/{user}/consumers/{consumer}")
  .onCreate(async snapshot => {
    try {
      query.get().then(querySnapshot => {
        querySnapshot.forEach(doc => {
          await doc.ref.collection('notifs').add({ . . . });
        }).catch((error) =>
          console.log(error));
However, I keep getting the error TypeError: Cannot read property 'collection' of undefined. What could be wrong? Geofirestore's QueryDocumentSnapshot doesn't seem to have a collection() property. Thanks for any help.
The docs property is missing before the forEach loop. According to the QuerySnapshot documentation, the property is:
An array of all the documents in the QuerySnapshot
So if you want to loop over the docs, it should be:
...
query.get().then(querySnapshot => {
querySnapshot.docs.forEach(doc => {
...
According to my tests, this works at least directly in Node (which is what Cloud Functions use as well).
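As a standalone illustration, here is a hypothetical helper with a faked QuerySnapshot, just to exercise the docs shape (in the real handler the snapshot comes from the geofirestore query):

```javascript
// Hypothetical helper: fans out one write to the 'notifs' subcollection per
// query result. querySnapshot.docs is the array of QueryDocumentSnapshots.
function addNotifs(querySnapshot, payload) {
  return Promise.all(
    querySnapshot.docs.map(doc => doc.ref.collection('notifs').add(payload))
  );
}

// Fake snapshot standing in for the geofirestore result.
const fakeSnapshot = {
  docs: [
    { ref: { collection: () => ({ add: p => Promise.resolve(p) }) } },
  ],
};

addNotifs(fakeSnapshot, { msg: 'hi' }).then(r => console.log(r.length)); // 1
```

Using Promise.all over docs also avoids the original code's `await` inside a non-async forEach callback, which is a syntax error.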

Dataloader did not return an array of the same length?

I am building an Express application with GraphQL and MongoDB (Mongoose), and I am using Facebook's DataLoader to batch and cache requests.
It's working perfectly fine except for this use case.
I have a database filled with user posts. Each post contains the user's ID for reference. When I make a call to return all the posts in the database, the posts are returned fine, but if I try to get the user in each post, users with multiple posts will only return a single user, because the key for the second user is cached. So 2 posts (keys) from user "x" will only return 1 user object "x".
However, DataLoader has to return the same number of promises as the keys it receives.
It has an option to set cache to false so each key makes a request, but that doesn't seem to work for my use case.
Sorry if I haven't explained this very well.
this is my graphql request
query {
getAllPosts {
_id // This is returned fine
user {
_id
}
}
}
Returned error:
DataLoader must be constructed with a function which accepts Array<key> and returns Promise<Array<value>>, but the function did not return a Promise of an Array of the same length as the Array of keys.
Are you trying to batch post keys [1, 2, 3] and expecting to get user results [{ user1 }, { user2 }, { user1 }]?
Or are you trying to batch user keys [1, 2] and expecting to get post results [{ post1 }, { post3 }] and [{ post2 }]?
It seems like only in the second case will you run into a situation where the length of the keys differs from the length of the results array.
To solve the second, you could do something like this in SQL (using knex here):
const loader = new Dataloader(userIDs => {
  const promises = userIDs.map(id => {
    return db('user_posts')
      .where('user_id', id);
  });
  return Promise.all(promises);
})
loader.load(1)
loader.load(2)
So you return [[{ post1 }, { post3 }], [{ post2 }]], which DataLoader can unwrap.
If you had done this instead:
const loader = new Dataloader(userIDs => {
  return db('user_posts')
    .where('user_id', [userIDs]);
})
loader.load(1)
loader.load(2)
you would instead get [{ post1 }, { post3 }, { post2 }], and hence the error: the function did not return a Promise of an Array of the same length as the Array of keys.
Not sure if the above is relevant or helpful. I can revise if you can provide a snippet of your batch load function.
You need to map the data returned from the database to the array of keys.
DataLoader: the Array of values must be the same length as the Array of keys.
This issue is well explained in the DataLoader video on YouTube (highly recommended).
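Concretely, mapping rows back to keys can look like the sketch below (the { user_id, post } row shape is an assumption for illustration):

```javascript
// Re-group flat DB rows into one bucket per key, in key order, so the batch
// function returns exactly as many entries as it received keys.
function groupByKey(keys, rows, keyField) {
  const byKey = new Map(keys.map(k => [k, []]));
  for (const row of rows) {
    const bucket = byKey.get(row[keyField]);
    if (bucket) bucket.push(row);
  }
  // One entry per key, even if a key matched no rows (empty array).
  return keys.map(k => byKey.get(k));
}

const rows = [
  { user_id: 1, post: 'post1' },
  { user_id: 2, post: 'post2' },
  { user_id: 1, post: 'post3' },
];
console.log(groupByKey([1, 2], rows, 'user_id').map(g => g.length)); // [ 2, 1 ]
```

Passing a function like this over the result of one bulk query satisfies DataLoader's contract: same length, same order as the keys.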

Meteor-Mongo: Error handling for findone

I am trying to handle errors when using findOne in Meteor-Mongo.
From this Stack Overflow question, it appears that I should be able to handle errors with collection.findOne({query}, function(err, result){ <handleError> }), but doing so results in an error message:
"Match error: Failed Match.OneOf, Match.Maybe or Match.Optional validation"
The following code works:
export default createContainer((props) => {
let theID = props.params.theID;
Meteor.subscribe('thePubSub');
return {
x: theData.findOne({_id: theID}),
};
}, App);
The following code does not:
export default createContainer((props) => {
let theID = props.params.theID;
Meteor.subscribe('thePubSub');
return {
x: theData.findOne({_id: theID}, function(err,result){
if(!result){
return {}
};
}),
};
}, App);
What am I doing wrong and how should I be resolving this error? Is this a meteor specific error?
Any help is greatly appreciated!
What kind of error are you exactly trying to handle with your callback?
Meteor's findOne is different from node's mongodb driver's findOne that the post you link to uses.
The expected signature is:
collection.findOne([selector], [options])
There is no callback involved, since the method runs synchronously (but is reactive).
If you want to return a default value when the document is not found, you can simply use a JS logical OR:
// Provide an alternative value on the right that will be used
// if the left one is falsy.
theData.findOne({_id: theID}) || {};
A more rigorous approach would be to compare its type with
typeof queryResult === 'undefined'
Note that if the theData collection is fed by the above subscription Meteor.subscribe('thePubSub'), I doubt Meteor will have had time to populate the collection on the client by the time you query it…