@ExistsQuery in Spring Data MongoDB

Hello, I would like to do an exists query in a Spring Data MongoDB repository. I read about @ExistsQuery, but I don't know how to write the query inside it. My method currently looks like this:
@ExistsQuery("{ 'userAccount.socialTokenId': ?1}")
boolean existBySocialAccountId(String socialAccountId);
But I am getting an IndexOutOfBoundsException. userAccount is a List of objects that contain the variable socialTokenId. I know that I could just fetch the whole User object and check it myself, but I would like to optimize my queries :).

I believe your problem is that the parameters are zero-indexed, so there is no parameter with index 1, which is causing the IndexOutOfBoundsException.
Try changing your code to the following:
@ExistsQuery("{ 'userAccount.socialTokenId': ?0}")
boolean existBySocialAccountId(String socialAccountId);

Related

MongoDB Map Reduce: Auto-created index name too long, possible to customize?

Debugging MongoDB mapreduce is painful, so I'm not 100% sure I understand what's going on here, but I think I get the general idea...
The error message I'm getting is this:
mr failed, removing collection
CannotCreateIndex: namespace name generated from index name "my_dbname.tmp.mr.collectionname_69.$_id.aggregation_method_1__id.date_key.start_1__id.date_key.timeres_1__id.region.center_2dsphere" is too long (127 byte max)
The key I'm using for mapreduce is a complex object with four or five properties, so I'm guessing what's happening is that when Mongo tries to create its temporary output collections using my specified key, it tries to auto-create an index on that complex key; but since the key itself has several properties, the default name for the key is too long. When I index complex objects like this under "normal" circumstances, I just give the index a custom name. But I don't see a way to do that for the collections mapreduce generates automatically.
Is there a simple way to fix this without changing my key structure?
Well, it turns out I was tricked by the error message! The <collectionname> in the error message above is the name of the INPUT collection whose records I'm processing with MapReduce... but the index it refers to is part of the OUTPUT collection! So I just had to give the index in the output collection a name, and voila, problem solved. What weird behavior.
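For what it's worth, here is a rough sketch of what giving the output collection's index an explicit short name can look like, using the legacy MongoDB Java driver from Scala; the database, collection, and index names below are illustrative placeholders, with the key fields taken from the error message above.

import com.mongodb.{BasicDBObject, MongoClient}

val client = new MongoClient()
val output = client.getDB("my_dbname").getCollection("my_output_collection")

// Build the compound key and give the index a short explicit name, so the
// auto-generated "<db>.<collection>.$<field1>_1_<field2>_1..." namespace
// stays under the 127-byte limit from the error message.
val keys = new BasicDBObject("_id.aggregation_method", 1)
  .append("_id.date_key.start", 1)
  .append("_id.date_key.timeres", 1)
  .append("_id.region.center", "2dsphere")
output.createIndex(keys, "mr_output_key_idx")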

play-reactivemongo dealing with Indexes

I've found that in order to create an index in a collection I should use the indexesManager:
collection.indexesManager.ensure(...)
I would like to know where the right place for this function call is.
I put this call in the function of the Controller that performs the insertion of documents in the collection, and it works.
But I guess that it is not necessary to call this function on each insertion.
Is there a way to make this call only once when the DB is initialized?
Thanks
With ReactiveMongo 0.11 and the Play plugin, you should be able to get the current database instance like this:
val db = current.injector.instanceOf[ReactiveMongoApi].database
Please note that this line will give you back a Future[DefaultDB]. Then you can do what you want by simply mapping on this Future:
db.map(_.collection("myCollection"))
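As for doing this only once when the application starts, one option (a sketch assuming Play 2.4+ dependency injection and ReactiveMongo 0.11; the class, module, collection, and field names below are illustrative) is to put the indexesManager.ensure call in an eagerly bound singleton:

import javax.inject.{Inject, Singleton}
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.modules.reactivemongo.ReactiveMongoApi
import reactivemongo.api.collections.bson.BSONCollection
import reactivemongo.api.indexes.{Index, IndexType}

// Instantiated once at startup when bound eagerly, so the index is ensured a single time.
@Singleton
class MongoIndexes @Inject()(mongo: ReactiveMongoApi) {
  mongo.database
    .map(_.collection[BSONCollection]("myCollection"))
    .flatMap(_.indexesManager.ensure(
      Index(Seq("someField" -> IndexType.Ascending), unique = true)))
}

// Bind it eagerly in a Play module, enabled via application.conf
// (play.modules.enabled += "modules.IndexModule", adjusted to your package).
class IndexModule extends play.api.inject.Module {
  def bindings(env: play.api.Environment, conf: play.api.Configuration) =
    Seq(bind[MongoIndexes].toSelf.eagerly())
}

Since ensure only creates the index when it is missing, this is safe to run on every application start.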

Discard values while inserting and updating data using slick

I am using slick with play2.
I have several fields in the database that are managed by the database itself. I don't want to set them on create or update, but I do want to get them when reading the values.
For example, suppose I have
case class MappedDummyTable(id: Int, .. 20 other fields, modified_time: Option[Timestamp])
which maps the table Dummy in the database. modified_time is managed by the database.
The problem is that during insert or update I create an instance of MappedDummyTable without the modified_time attribute and pass it to Slick for create/update like
TableQuery[MappedDummyTable].insert(instanceOfMappedDummyTable)
For this, Slick generates a query like
INSERT INTO MappedDummyTable(id,....,modified_time) VALUES(1,....,null)
and sets modified_time to NULL, which I don't want. I want Slick to ignore these fields while updating and creating.
For updating, I can do
TableQuery[MappedDummyTable].map(fieldsToBeUpdated).update(values)
but this leads to 20-odd fields in the map method, which looks ugly.
Is there any better way?
Update:
The best solution that I found was using multiple projections: I created one projection to read the values and another to update and insert the data.
Maybe you need to write some triggers on the table if you don't want to write code like row => (row.id, ...other 20 fields).
Or try using None instead of null?
I believe that the solution of mapping the non-default fields is the only way to do it with Slick. To make it less ugly, you can define a function ignoreDefaults on MappedDummyTable that returns only the non-default values, and a function in the companion object of the MappedDummyTable case class that returns the corresponding projection:
TableQuery[MappedDummyTable].map(MappedDummyTable.ignoreDefaults).insert(instanceOfMappedDummyTable.ignoreDefaults)
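For reference, here is a minimal sketch of that multiple-projection approach, written in Slick 3 syntax with a couple of made-up columns standing in for the real 20-odd fields:

import java.sql.Timestamp
import slick.driver.MySQLDriver.api._  // adjust to your actual Slick profile/version

case class Dummy(id: Int, name: String, modifiedTime: Option[Timestamp])

class DummyTable(tag: Tag) extends Table[Dummy](tag, "Dummy") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def modifiedTime = column[Option[Timestamp]]("modified_time")

  // Full projection: used when reading rows, so modified_time comes back populated.
  def * = (id, name, modifiedTime) <> (Dummy.tupled, Dummy.unapply)

  // Narrow projection: used for inserts/updates, so modified_time is never mentioned.
  def writable = (id, name)
}

val dummies = TableQuery[DummyTable]

// Insert and update only through the narrow projection; the database fills modified_time.
val insertAction = dummies.map(_.writable) += ((1, "foo"))
val updateAction = dummies.filter(_.id === 1).map(_.writable).update((1, "bar"))

Reads still go through the default * projection, so modified_time is populated when you query, but inserts and updates never touch it.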

Ensure fields with same type across documents in a collection in mongodb

I am in the process of migrating a database from MySQL to MongoDB. However, I am running into a problem where the type MongoDB stores for a field changes based on the length/value of the string/integer data used to initialize it. Is there a way to prevent this? I want the types to be the same across a collection.
I am new to this technology and apologize if I missed something. I looked around and could not find a solution to this. Any pointers are greatly appreciated.
thanks,
Asha
If you're writing your migration application in C++, check out the BSONObjBuilder class in "bson/bsonobjbuilder.h". If you create your individual BSON documents using the "append" methods of BSONObjBuilder, the builder will use the static types of the fields to set the appropriate BSON type in the output object.
For example:
int count = /*something from a mysql query*/;
std::string name = /*something else from a mysql query*/;
BSONObjBuilder builder;
builder.append("count", count);
builder.append("name", name);
BSONObj result = builder.obj();

Rogue query orderAsc with variable field according to its name

I am using Rogue/Lift MongoRecord to query MongoDB. I am trying to build different queries according to the sort field name, so I have the name of the field I want to sort the results by as a string.
I have tried to use Record.fieldByName in orderAsc:
...query.orderAsc (elem => elem.fieldByName(columnName).open_!)
but I get "no type parameter for orderAsc".
How can I make it work? Honestly, all the type-level programming in Rogue is quite difficult to follow.
Thanks
The problem is that you cannot easily generate a query dynamically with Rogue. As a solution I used lift-mongodb directly, which allows the use of strings (without compile-time checking) for this kind of operation that requires dynamic sorting.
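For illustration, here is a rough sketch of what that string-based fallback can look like with lift-mongodb-record; UserRecord is a placeholder MongoRecord, and the findAll(query, sort) overload is assumed to be available (its exact signature varies across Lift versions):

import net.liftweb.json.JObject
import net.liftweb.json.JsonDSL._

// columnName arrives at runtime as a plain string, e.g. from a request parameter.
def sortedUsers(columnName: String, ascending: Boolean): List[UserRecord] = {
  val direction = if (ascending) 1 else -1
  // Empty query plus a sort document built from the runtime field name.
  // There is no compile-time checking: a typo in columnName simply sorts on a missing field.
  UserRecord.findAll(JObject(Nil), (columnName -> direction): JObject)
}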