I have around 6 million rows in my MongoDB collection, and importing them into Meilisearch using php artisan scout:import 'model' takes forever to finish.
Importing with a chunk limit, php artisan scout:import 'model' -c 10000, gives me the following error.
MongoDB\Exception\InvalidArgumentException
Expected "limit" option to have type "integer" but found "string"
at vendor/mongodb/mongodb/src/Exception/InvalidArgumentException.php:59
55▕
56▕ $expectedType = $typeString;
57▕ }
58▕
➜ 59▕ return new static(sprintf('Expected %s to have type "%s" but found "%s"', $name, $expectedType, get_debug_type($value)));
60▕ }
61▕ }
62▕
+27 vendor frames
28 artisan:37
Illuminate\Foundation\Console\Kernel::handle()
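The type error suggests the -c value reaches the MongoDB driver as a string (console options always arrive as strings), while the driver validates its cursor "limit" option as an integer. A hedged workaround, a sketch rather than a confirmed fix: skip -c and set the chunk size as a real integer in config/scout.php, which scout:import falls back to when no option is passed.
// config/scout.php (sketch): chunk sizes used when no -c option is given
'chunk' => [
    'searchable' => 10000,   // an integer, so the driver's "limit" type check passes
    'unsearchable' => 10000,
],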
I also tried exporting the collection as JSON from MongoDB and importing it into Meilisearch manually using curl -X POST 'http://127.0.0.1:7700/indexes/posts/documents' \ --data @/data/posts.json, which gives the following error.
{"message":"Invalid JSON: invalid type: map, expected a sequence at line 1 column 0","errorCode":"bad_request","errorType":"invalid_request_error","errorLink":"https://docs.meilisearch.com/errors#bad_request"}curl: (3) URL using bad/illegal format or missing URL
posts.json is the MongoDB collection exported with the mongoexport command.
How can I import data fast into Meilisearch?
Versions
"laravel/scout":"^9.1"
"laravel/framework": "^8.12",
"meilisearch/meilisearch-php": "^0.18.2",
mongodb version : "3.6"
OS
Ubuntu 20.04
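A hedged diagnosis of the manual route: both errors point at format mismatches rather than raw speed. mongoexport emits one JSON object per line (a map), while the Meilisearch documents endpoint expects a JSON array (a sequence), and plain --data strips newlines on top of that. A sketch, assuming a database named mydb (a placeholder) and the index and paths from the question:
# Export as a single JSON array instead of newline-delimited objects
mongoexport --db=mydb --collection=posts --jsonArray --out=/data/posts.json
# --data-binary keeps the file byte-for-byte; the backslash must be the
# last character on the line for the continuation to work
curl -X POST 'http://127.0.0.1:7700/indexes/posts/documents' \
  -H 'Content-Type: application/json' \
  --data-binary @/data/posts.json
mongoexport's Extended JSON _id values (e.g. {"$oid": ...}) may still need flattening before Meilisearch accepts them as a primary key.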
Related
I'm trying to insert a document into MongoDB using mongosh and Ubuntu bash. I've retrieved a document with mongosh, and I have to edit 3 fields and insert the result. I thought I'd do the editing with jq, but I can't get it to work.
{ "_id": {"fileName": "xxxxxx","namespace": "yyyyyy" },
"metainfo": {"file-type": "csv","environment": "int",
"creation-date": 1672306975130000000,"file-name":"xxxxxxx" }
}
I have to edit creation-date (the date is in nanoseconds) and the environment, and change part of the fileName (take a substring). I retrieved the document with --eval "EJSON.stringify(....)".
The jq command I've tried is:
newDocument=$(echo "$fileData" | jq '.metainfo.environment |= "pro"')
and it gives me this error:
parse error: Invalid numeric literal at line 1, column 8
I've validated the JSON and it's well formed.
After making the changes I have to do the insert. I thought of doing it with:
--eval "......insertOne(EJSON.stringify($newDocument))"
Is this correct? What would be the best way to do all this?
Thanks for all.
The error was happening because I was running the command without the --quiet parameter: without it, mongosh prints its connection banner before the JSON, which is what jq trips over. The mongo shell handles the JSON itself without problems.
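For completeness, a minimal sketch of the whole round trip, assuming a collection named files and a local connection string (both placeholders, not from the question). Note that canonical Extended JSON may wrap large integers as {"$numberLong": ...}, and jq 1.6 and older can lose precision on them, so the jq paths and the literal below may need adjusting:
# Fetch the document as Extended JSON; --quiet keeps the banner out of the output
fileData=$(mongosh "mongodb://localhost:27017/mydb" --quiet \
  --eval 'EJSON.stringify(db.files.findOne({ "_id.fileName": "xxxxxx" }))')
# Edit the three fields; the substring range on fileName is only illustrative
newDocument=$(echo "$fileData" | jq '
  .metainfo.environment = "pro"
  | .metainfo."creation-date" = 1672306975130000000
  | ._id.fileName |= .[0:6]')
# jq -Rs . re-quotes the edited JSON as a single string literal, so it can
# be embedded safely inside the --eval expression for EJSON.parse
mongosh "mongodb://localhost:27017/mydb" --quiet \
  --eval "db.files.insertOne(EJSON.parse($(printf '%s' "$newDocument" | jq -Rs .)))"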
mongoimport --host=myRemoteHost.cloud --port=30107 --username=my_cloud_user
--collection=profiles --db=controlpanel --file=/Users/myLocal/Documents/language profiles/languageprofiles.json
gives me the following error:
2022-09-28T19:29:17.668-0300 no collection specified
2022-09-28T19:29:17.668-0300 using filename '' as collection
2022-09-28T19:29:17.668-0300 error validating settings: invalid collection name: collection name cannot be an empty string
zsh: command not found: --collection=profiles
The suggestions in this similar thread (Mongoimport results in "no collection specified" despite having collection defined) didn't work. Not sure what I am missing, since my example is slightly different.
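A guess at what's going on, based on the zsh error: the command was split across two lines without a trailing backslash, so zsh ran --collection=profiles as a separate command and mongoimport never saw it (hence "using filename '' as collection"). The intended invocation would presumably be something like this, with the path quoted because it contains a space:
mongoimport --host=myRemoteHost.cloud --port=30107 --username=my_cloud_user \
  --db=controlpanel --collection=profiles \
  --file="/Users/myLocal/Documents/language profiles/languageprofiles.json"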
I had a project which was unfortunately developed using only the synchronize: true option until now.
I decided to change that, so I created a dump, ran it in the query runner, and found myself stuck on an error.
I did something like this [database = postgres, ORM = typeorm]:
pg_dump db > db.sql
This created a SQL file; I copied its contents and pasted them into the query runner, which gives me an error like:
driverError: error: syntax error at or near "."
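The stray "." is most likely the \. terminator of a COPY block: pg_dump's default plain format loads data with COPY ... FROM stdin, which only psql understands, so pasting the dump into a driver-based query runner fails on it. Two hedged ways around this (database names are placeholders):
# Restore the dump with psql instead of the ORM's query runner
psql -d mydb -f db.sql
# Or regenerate the dump as plain INSERT statements a driver can execute
pg_dump --column-inserts db > db.sql
--column-inserts restores more slowly but yields standalone SQL statements.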
I am using RESTHeart to access a MongoDB database. RESTHeart has an API that is supposed to create a database, e.g.:
curl -X put http://localhost:8080/db1
Well, I was using a Chrome browser-based REST client that happened to do the equivalent of the following curl call, but I accidentally forgot to nuke the data portion. It contained the JSON {"e":"f"} as data.
curl -X put -H 'Content-Type: application/json' --data-raw '{"e":"f"}' http://localhost:8080/db2
When I then do a curl GET, it returns a value with the key/value pair "e":"f" stuffed in there, which is not what I want.
$ curl http://localhost:8080/db2
... { "_id" : "db2" , "e" : "f" , "_etag" : { "$oid" : "570f90601d956327e8df28c4"} , "_size" : 0 , "_total_pages" : 0 , "_returned" : 0}
Now, using the Mongo shell, I've tried to find this key/value pair with just about every Mongo shell command, but I can't find it, nor can I remove it. In fact, I can create a rather large Mongo database, then do that curl PUT, and I'm screwed: it adds the pair to my nice clean database.
Does anyone know how I can remove that strange key/value pair, either using the Mongo shell or the RESTHeart API, short of nuking the database and recreating it from scratch? Thanks.
To remove the db property, just update the db:
With PATCH:
PATCH /db {"$unset": {"e": null}}
Or with PUT:
PUT /db {}
For more info, look at the documentation reference sheet and the representation format.
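For the question's db2 example, the equivalent curl calls would presumably be:
curl -X PATCH -H 'Content-Type: application/json' \
  --data-raw '{"$unset": {"e": null}}' http://localhost:8080/db2
curl -X PUT -H 'Content-Type: application/json' \
  --data-raw '{}' http://localhost:8080/db2
Note the PUT with an empty body replaces the db properties wholesale, while the PATCH removes only the e field.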
I have a JSON file of around 4 million JSON lines. I tried to use:
mongoimport --db mydb --collection mycoll --file myfile.json
and what happened was weird: I got this error:
2018-12-29T17:00:50.424+0200 connected to: localhost
2018-12-29T17:00:50.483+0200 Failed: error processing document #1428: invalid character 'S' after object key:value pair
2018-12-29T17:00:50.483+0200 imported 1426 documents
So first I counted this collection in mongo and saw that there are 1000 documents, not 1426 as reported above.
Second, I located a JSON document with the 'S' in it, which is just a string that looks like "name" : "Double 'S' Transport", left only this document in the file, imported it, and it worked fine.
Does anyone understand why this is happening? My suspicion is that mongoimport doesn't work on files that big...
Any help would be great :)
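Two hedged observations: first, "invalid character 'S' after object key:value pair" typically points at an unescaped double quote inside a string value (e.g. "Double "S" Transport", where the parser ends the string early and then hits the S), not at any file-size limit. Second, mongoimport buffers inserts in batches, so when it aborts mid-file the reported count and the committed count can disagree, which would explain seeing 1000 documents despite "imported 1426". A quick way to locate the first bad line, assuming one document per line as mongoexport produces:
# jq stops at the first malformed document and reports its line number
jq -c . myfile.json > /dev/null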