Query error sorting documents with find in PouchDB - ionic-framework

I wish to order/sort results with PouchDB.
I have created an index on the field I wish to sort by, and I query for values greater than 882.
I've checked that the database exists.
The result comes back like this:
883
895
9
909
917
93
I am referring to this documentation: https://pouchdb.com/guides/mango-queries.html
and this documentation: http://docs.couchdb.org/en/stable/api/database/find.html
this.db = new PouchDB('parcelles', { adapter: 'idb' });

// Create an index on the field id_culture
this.db.createIndex({
  index: {
    fields: ['id_culture']
  }
}).then((result) => {
  console.log(result);
}).catch((err) => {
  console.log(err);
});

// Query with the sort filter
this.db.find({
  selector: {
    id_culture: { $gte: '882' }
  },
  sort: ['id_culture']
}).then((result) => {
  console.log(result);
}).catch((err) => {
  console.log(err);
});

Your attribute id_culture is a text string, not a number, so it sorts lexicographically. You will have to decide on a maximum possible size, for example 100,000,000, and left-pad all your ids.
I recommend prefixes as well, so you could try, for example: Culture_00000009, Culture_00000909, Culture_00000093, etc. With ids like that, sorting will work...
id_culture: {$gte: 'Culture_00000882'}
... giving ...
Culture_00000883
Culture_00000895
Culture_00000909
Culture_00000917
Culture_00001093
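
As a minimal sketch of that approach, a small helper can build the padded ids before documents are written (the Culture_ prefix and eight-digit width are just the assumptions from the example above):

// Sketch: zero-pad numeric ids so lexicographic order matches numeric order.
// The prefix and width are assumptions taken from the example above.
function makeCultureId(n, width = 8, prefix = 'Culture_') {
  return prefix + String(n).padStart(width, '0');
}

makeCultureId(9);   // "Culture_00000009"
makeCultureId(882); // "Culture_00000882"

The same helper can then build the selector bound, e.g. id_culture: { $gte: makeCultureId(882) }.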

MongoDB alerts for frequent queries

I have this query, which inserts a record when a listener is listening to a song:
const nowplayingData = {
  type: "S",
  station: req.params.stationname,
  song: data[1],
  artist: data[0],
  timeplay: npdate
};

LNowPlaying.findOneAndUpdate(
  nowplayingData,
  { $addToSet: { history: [uuid] } },
  { upsert: true },
  function(err) {
    if (err) {
      console.log('ERROR when submitting round');
      console.log(err);
    }
  }
);
I have been getting the following alert emails for the last week, and they are starting to get annoying.
[Screenshot: MongoDB alert emails]
These alerts don't show anything wrong with the query or the code.
I also have the following query, which checks for the latest userID matching the station name.
I believe this is the query setting off the alerts, because of the number of times we run the same query over and over (it runs every 10 seconds, and up to 1,000 people may be requesting the info at the same time).
var query = LNowPlaying.findOne({
  station: req.params.stationname,
  history: { $in: [y] }
}).sort({ _id: -1 });

query.exec(function (err, docs) {
  /*res.status(200).json({
    data: docs
  });*/
  console.error(docs);
  if (err) {
    console.error("error");
    res.status(200).json(err);
  }
});
I am wondering how I can make this better so that I don't get the alerts. I know I need an index that works, which I believe should cover the station name and the history array.
I have tried to create a new index using the fields station and history, but got this error:
Index build failed: 6ed6d3f5-bd61-4d70-b8ea-c62d7a10d3ba: Collection AdStitchr.NowPlaying ( 8190d374-8f26-4c31-bcf7-de4d11803385 ) :: caused by :: Field 'history' of text index contains an array in document: { _id: ObjectId('5f102ab25b43e19dabb201f5'), artist: "Cobra Dukes", song: "Leave The Light On (Hook N Sling Remix) [nG]", station: "DRN1", timeplay: new Date(1594898580000), __v: 0, history: [ "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1OTQ5ODE0MjQsImlhdCI6MTU5NDg5NTAyNCwib2lkIjoicmFkaW9tZWRpYSJ9.ECVxBzAYZcpyueBP_Xlyncn41OgrezrOF8Dn3CdAnOU" ] }
Can you not index an array?
This is how I am trying to create the index:
[Screenshot: my index creation]
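
Judging by the error text ("Field 'history' of text index contains an array"), the index being built is a text index, and a compound text index cannot include a multikey (array) field. A regular compound index does allow one array field per document, and it matches the findOne({ station, history: { $in: [...] } }) query above. A sketch in shell syntax, with the collection name taken from the error message:

// Presumed failing attempt: a text index whose non-text component
// ('history') holds an array, which MongoDB rejects.
// db.NowPlaying.createIndex({ station: "text", history: 1 })

// A plain compound index allows one multikey (array) field:
db.NowPlaying.createIndex({ station: 1, history: 1 })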

How to use Azure Cognitive Search? Can we use it for a global filter on a table with pagination and sorting?

I am trying to return all the records that match my search text. So far I have only seen examples where we need to specify a field name, but I want to return records if any of the fields contains the search text, without specifying a field name. I found $text, but unfortunately it's not supported in the Cosmos DB API for MongoDB.
Can someone please help me resolve this issue?
Here is what I tried, which failed:
let querySpec = {
  entity: "project",
  $text: { $search: "\"test\"" },
  $or: [{
    accessType: "Private",
    userName: userName
  }, {
    accessType: "Public"
  }]
};
dbHandler.findandOrder(querySpec, sortfilter, "project").then(function (response) {
  res.status(200).json({
    status: 'success',
    data: utils.unescapeList(response),
    totalRecords: response.length
  });
});
exports.findandOrder = function (filterObject, sortfilter, collectionname) {
  return new Promise((resolve, reject) => {
    return getConnection().then((db) => {
      if (db == null) {
        console.log("db in findandOrder() is undefined");
      } else {
        db.db(config.mongodb.dbname).collection(collectionname)
          .find(filterObject)
          .sort(sortfilter)
          .toArray((err, res) => {
            if (db) {
              //db.close();
            }
            if (err) {
              reject(err);
            } else {
              resolve(res);
            }
          });
      }
    });
  });
};
Error:
{"message":{"ok":0,"code":115,"codeName":"CommandNotSupported","name":"MongoError"}}
I am using $regex as a temporary solution, since $text is not supported.
Please suggest...
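
For reference, a minimal sketch of that $regex fallback, searching a few known fields explicitly (the title and description field names are assumptions for illustration):

// Hypothetical $regex fallback while $text is unavailable; the field
// names are assumed, and "i" makes the match case-insensitive.
let regexSpec = {
  entity: "project",
  $or: [
    { title: { $regex: "test", $options: "i" } },
    { description: { $regex: "test", $options: "i" } }
  ]
};

Note that case-insensitive, unanchored $regex queries cannot use an index efficiently, which is part of why this is only a temporary workaround.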
From the MongoDB manual, Create a Wildcard Index on All Fields:
db.collection.createIndex( { "$**" : 1 } )
but this is far from an ideal way to index your data. On the same page is this admonition:
Wildcard indexes are not designed to replace workload-based index planning.
In other words, know your data/schema and tailor indices accordingly. There are many caveats to wildcard indices so read the manual page linked above.
Alternately you can concatenate all the text you want to search into a single field (while also maintaining the individual fields) and create a text index on that field.
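
A minimal sketch of that concatenated-field approach, assuming the mongo shell, MongoDB 4.2+ (for pipeline-style updateMany), and hypothetical title/description fields; it still relies on $text, so it only helps on servers that support text indexes:

// Maintain one concatenated field alongside the individual fields.
// title/description are assumed field names for illustration.
db.project.updateMany({}, [
  { $set: { searchText: { $concat: ["$title", " ", "$description"] } } }
]);

// Create the text index on that single field and search it.
db.project.createIndex({ searchText: "text" });
db.project.find({ $text: { $search: "test" } });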

Update large collection

Does anyone have a suggestion about how to update a field in each document in a large collection?
I use something like this:
MyModel.find().exec(function(err, data) {
  if (err) {
    return console.log(err);
  }
  data.forEach(function(doc) {
    doc.Field = doc.Field + 1;
    doc.save(function (err) {
      if (err) {
        console.error('ERROR!');
      }
    });
  });
});
But I get FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory.
Is there a way to process the above update in chunks or something like that?
You can use the async.eachLimit method of the async library to limit the number of concurrent save operations (doc link is to each, scroll down to see the eachLimit variant).
For example, to limit the saves to no more than 5 outstanding at a time:
var async = require('async');

MyModel.find().exec(function(err, data) {
  if (err) {
    return console.log(err);
  }
  // Allow at most 5 save operations in flight at once.
  async.eachLimit(data, 5, function(doc, callback) {
    doc.Field = doc.Field + 1;
    doc.save(function(err) {
      if (err) {
        console.error('ERROR!');
      }
      callback(err);
    });
  });
});
However, in this case it would be much more efficient to use a single update with the $inc operator and the multi: true option to increment each doc's Field value by 1.
MyModel.update({}, { $inc: { Field: 1 } }, { multi: true }, function(err) { ... });
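
In more recent Mongoose versions, the equivalent single-statement update is updateMany, which updates all matching documents without a multi flag:

MyModel.updateMany({}, { $inc: { Field: 1 } }, function(err) { ... });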
You need more memory: --max_new_space_size and/or --max_old_space_size, like this:
node --max-old-space-size=4096 server.js
Currently, by default V8 has a memory limit of 512 MB on 32-bit systems and 1.4 GB on 64-bit systems. The limit can be raised by setting --max_old_space_size to a maximum of ~1024 (~1 GB) on 32-bit and ~4096 (~4 GB) on 64-bit, but it is recommended that you split your single process into several workers if you are hitting memory limits.

Can I decrement text in MongoDB?

I have permalinks with numbers on the end, like example_3, and I want to decrement each in a MongoDB update (i.e. to example_2), but it's text. Is there a way to do this?
posts.update(
  { 'title': doc.title, 'student': username, 'copy': false, 'class_number': { '$gt': doc.class_number } },
  { '$inc': { 'class_number': -1, 'permalink': -1 } },
  { multi: true },
  function(err, dox) {
    if (err) return callback(err, null);
    console.log('decclassnumber');
    console.log(dox + ' posts were decremented.');
    callback(err, dox);
  });
It doesn't make sense to increment/decrement an alphanumeric string; you need to separate the original string value into meaningful parts before asking MongoDB (or your application code) to adjust the numeric portion.
Normally with permalinks you would also be incrementing values rather than decrementing -- the whole intent of permalinks is to ensure that a given link is always pointing to the same resource.
It sounds like you actually want to implement a sequence pattern, where you find the next available sequence value to use.
For example, see: Create an Auto-Incrementing Sequence in the MongoDB manual.
Here's a slightly modified version of the getNextSequence() function in the documentation that uses upsert to either find an existing slug counter document or insert a new one. The return value is a new unique slug:
function getNextSequence(name) {
  var ret = db.counters.findAndModify(
    {
      query: { _id: name },
      update: { $inc: { seq: 1 } },
      upsert: true,
      new: true
    }
  );
  // Return the new slug (eg: "example_1")
  return name + '_' + ret.seq;
}
> getNextSequence("example")
example_1
> getNextSequence("example")
example_2
> getNextSequence("example")
example_3
If you do want to decrement from some starting value, you could insert a starting value for your sequence and reduce that instead with $inc: { seq: -1 }.
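
A quick sketch of that decrementing variant (the starting value of 1000 is an arbitrary assumption):

// Seed the counter with an assumed starting value, then count down.
db.counters.insert({ _id: "example", seq: 1000 });

function getPrevSequence(name) {
  var ret = db.counters.findAndModify({
    query: { _id: name },
    update: { $inc: { seq: -1 } },
    new: true
  });
  return name + '_' + ret.seq;
}

> getPrevSequence("example")
example_999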

MongoDB: get last few documents, then await a tailable cursor

I want to get the 5 last documents from a MongoDB collection, then keep tailing it for new documents. Can this be done at all with one query, or do I really need two queries? If two queries, what's the best way to achieve this without adding extra fields?
While an answer in any language is fine, here's an example Node.js code snippet of what I'm trying to achieve (error handling omitted, and snippet edited based on the first answer to the question):
MongoClient.connect("mongodb://localhost:1338/mydb", function(err, db) {
  db.collection('mycollection', function(err, col) {
    col.count({}, function(err, total) {
      col.find({}, { tailable: true, awaitdata: true, timeout: false, skip: total - 5, limit: 5 }, function(err, cursor) {
        cursor.each(function(err, doc) {
          console.dir(doc); // print the document object to console
        });
      });
    });
  });
});
Problem: The above code prints all the documents starting from the first one, and then waits for more. The skip and limit options have no effect.
Question: How can I easily get the 5 latest documents, then keep tailing for more? An example in any language is fine; it does not have to be node.js.
(Answer edited; it's useful to know that this does not work with these versions.)
If the collection were not tailable, you'd find out how many items there are with count, and then use the skip option to skip the first count-5 items.
This will NOT work, tailable and skip do not work together (MongoDB 2.4.6, node.js 0.10.18):
MongoClient.connect("mongodb://localhost:1338/mydb", function(err, db) {
  db.collection('mycollection', function(err, col) {
    col.count({}, function(err, total) {
      col.find({}, { tailable: true, awaitdata: true, timeout: false, skip: total - 5, limit: 5 }, function(err, cursor) {
        cursor.each(function(err, doc) {
          console.dir(doc);
        });
      });
    });
  });
});
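
A sketch of the two-query workaround implied above: one normal query for the 5 newest documents (reverse natural order works on a capped collection, which tailable cursors require anyway), then a separate tailable cursor for anything newer. Filtering the tail on the last seen _id assumes _id values increase with insertion order, which is true for default ObjectIds:

MongoClient.connect("mongodb://localhost:1338/mydb", function(err, db) {
  db.collection('mycollection', function(err, col) {
    // Query 1: the 5 newest documents, printed oldest-of-the-five first.
    col.find({}, { sort: { $natural: -1 }, limit: 5 }).toArray(function(err, docs) {
      docs.reverse().forEach(function(doc) { console.dir(doc); });
      // Remember where we left off so the tail does not replay old documents.
      var filter = docs.length ? { _id: { $gt: docs[docs.length - 1]._id } } : {};
      // Query 2: tail the capped collection for documents arriving after that.
      col.find(filter, { tailable: true, awaitdata: true, timeout: false })
         .each(function(err, doc) {
           console.dir(doc);
         });
    });
  });
});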