I am using Elasticsearch and MongoDB. I set up the river a week ago. Today, while searching, I noticed that the Elasticsearch index is not being updated or modified, even though records are added to MongoDB daily.
I used the following command to create the river:
curl -XPUT "localhost:9200/_river/myindex/_meta" -d '
{
"type": "mongodb",
"mongodb": {
"host": "localhost",
"port": "27017",
"db": "qna",
"collection": "collection"
},
"index": {
"name": "myindex",
"type": "index_type"
}
}'
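To narrow down where it stops, it can help to first confirm that the river definition still exists and that the document count really stays flat while new MongoDB records arrive. A minimal check, assuming the names from the config above:

curl "localhost:9200/_river/myindex/_meta?pretty"   # does the river definition still exist?
curl "localhost:9200/myindex/_count?pretty"         # current number of indexed documents

If the count only moves when the river is recreated, the river has most likely stopped tailing the oplog, and the Elasticsearch log will usually say why.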
I'm using Elasticsearch 1.1.1, the River Plugin and MongoDB 2.4.
I have a field called cidr that is being analyzed. I need to set it to not_analyzed so it works correctly with Kibana. Below is the index I used, but now I'm going to reindex it (delete it and write a new one).
What's the proper way to write a new index so that the values in the "cidr" field are not analyzed? Thank you.
curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
  "type": "mongodb",
  "mongodb": {
    "db": "collective_name",
    "collection": "ips"
  },
  "index": {
    "name": "mongoindex"
  }
}'
I see. It's working now. The mapping has to be created BEFORE the river starts writing into the index:
curl -XPUT "localhost:9200/mongoindex" -d '
{
"mappings": {
"mongodb" : {
"properties": {
"cidr": {"type":"string", "index" : "not_analyzed"}
}
}
}
}'
This is it. :)
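For reference, a sketch of the full reindex sequence under the same assumptions (index mongoindex, river mongodb, ES 1.x delete-mapping API), since the river will happily recreate a dynamic mapping if it starts before the explicit one exists:

# remove the old index and the old river definition
curl -XDELETE "localhost:9200/mongoindex"
curl -XDELETE "localhost:9200/_river/mongodb"

# recreate the index with the not_analyzed mapping (the command above),
# then recreate the river with the same _meta document as before

# verify the mapping before letting the river fill the index
curl "localhost:9200/mongoindex/_mapping?pretty"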
I am using ElasticSearch 1.1.0 (I was running 1.2.0 but had issues with an ElasticSearch plugin) and MongoDB 2.6.1. I installed them following a tutorial. When I create the river using
curl -XPUT "localhost:9200/_river/tenders/_meta" -d '{
"type": "mongodb",
"mongodb": {
"servers": [
{ "host": "127.0.0.1", "port": 27017 }
],
"options": { "secondary_read_preference": true },
"db": "tenderdb",
"collection": "tenders"
},
"index": {
"name": "tendersidx",
"type": "page"
}
}'
indexing of the collection starts fine, but only part of it gets indexed. E.g. the collection currently has 5184 records, while only 1060 are indexed.
Avish's comment did the trick; he wrote: "ElasticSearch rivers only monitor changes in the other data store; your river should only track documents added to the collection after the river has been set up."
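In other words, documents that were already in the collection when the river was created never reappear in the oplog, so the river never sees them. One workaround, sketched here with the tenderdb/tenders names from the question (it rewrites every document, so use with care), is to re-save the existing documents so they show up in the oplog again:

# re-save every document so the river picks it up from the oplog
mongo tenderdb --eval 'db.tenders.find().forEach(function (doc) { db.tenders.save(doc); })'

# then compare the counts again
curl "localhost:9200/tendersidx/_count?pretty"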
I need to create an index from MongoDB. The collection is named Product and has the following structure:
{
  "_id": ObjectId("5239656f60663de206b1053e"),
  "brand": "<brandName>",
  "category": {
    "$ref": "Category",
    "$id": ObjectId("50cb515760663d3577000043"),
    "$db": "<dbName>"
  },
  "image": "<imageUrl>",
  "integraId": "<someId>",
  "isActive": <isActive>,
  "name": "<productName>",
  "slug": "<slug>"
}
The Product collection has more than 30,000 rows, but Elasticsearch indexes only ~10,000 of them.
My query to create the river:
{
  "type": "mongodb",
  "mongodb": {
    "servers": [
      { "host": "127.0.0.1", "port": 27017 }
    ],
    "options": {
      "secondary_read_preference": true
    },
    "db": "<dbName>",
    "collection": "Product"
  },
  "index": {
    "name": "test",
    "type": "test_type"
  }
}
And a second question: how can I index only some fields (name, category (fetching the row by id from the other collection) and brand)?
You may have more luck asking about it in the Google Group http://groups.google.com/group/elasticsearch/topics or on IRC: http://www.elasticsearch.org/community/
MongoDB has full text search built in experimentally as of version 2.4, if you would like to try that: http://docs.mongodb.org/manual/core/index-text/. You may be able to query more efficiently that way. I realize this isn't the same as the Elasticsearch solution you're looking for, but it might be another way to solve the problem. Good luck!
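A minimal sketch of that MongoDB 2.4 text-search route, using the Product fields from the question ("acme" is just a placeholder search term); note that in 2.4 the feature has to be switched on explicitly:

# enable the experimental text search feature (MongoDB 2.4 only)
mongod --setParameter textSearchEnabled=true

# build a text index on the fields you want to search
mongo <dbName> --eval 'db.Product.ensureIndex({ name: "text", brand: "text" })'

# query it with the 2.4 "text" command
mongo <dbName> --eval 'printjson(db.Product.runCommand("text", { search: "acme" }))'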
I have installed and configured MongoDB and ES with the MongoDB river. But I'm not sure I really understand rivers in ES. For example, I want to index the collection "users" from MongoDB.
I will send a curl PUT/POST request to the URL /_river/mongodb_users/_meta:
{
  "type": "mongodb",
  "mongodb": {
    "db": "somedb",
    "collection": "users"
  },
  "index": {
    "name": "users",
    "type": "user"
  }
}
But now I want to index a second collection, for example "users2". Do I really need to create a new river with a curl POST/PUT to a URL like /_river/mongodb_users2/_meta with this JSON:
{
  "type": "mongodb",
  "mongodb": {
    "db": "somedb",
    "collection": "users2"
  },
  "index": {
    "name": "users2",
    "type": "user"
  }
}
Can I not use the already created river "mongodb_users"? Do I need to create one river per collection?
Thank you for the explanation!
Yes. The way the MongoDB river works does not allow fetching content from more than one collection in a single river.
But you can create as many rivers as you need.
That said, if you want to index users into the users type in Elasticsearch and users2 into that same type, you can (as long as they don't use the same IDs).
Just modify index.type to "users".
Does it help?
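For illustration, this is roughly what the second river's _meta could look like if both collections should end up in one shared index and type (names follow the example above; the important part is that index.name and index.type match in both rivers):

curl -XPUT "localhost:9200/_river/mongodb_users2/_meta" -d '{
  "type": "mongodb",
  "mongodb": {
    "db": "somedb",
    "collection": "users2"
  },
  "index": {
    "name": "users",
    "type": "users"
  }
}'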
My search is not working now. I guess it's because my river was not configured for a replica set:
curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
  "type": "mongodb",
  "mongodb": {
    "db": "mongo",
    "host": "local",
    "port": "40000",
    "collection": "users"
  },
  "index": {
    "name": "api",
    "type": "users"
  }
}'
Is there any way to declare a replica set properly so that Elasticsearch can find the master, the way the PHP driver does:
$m = new Mongo(
    "mongodb://localhost:40000,localhost:41000",
    array("replicaSet" => true)
);
so that elasticsearch can automatically fail over to another member.
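The MongoDB river accepts a list of seed servers instead of a single host/port (the same "servers" array used in the other examples here), which is the usual way to let it find the current primary. A sketch adapted to this setup, untested:

curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
  "type": "mongodb",
  "mongodb": {
    "servers": [
      { "host": "localhost", "port": 40000 },
      { "host": "localhost", "port": 41000 }
    ],
    "db": "mongo",
    "collection": "users"
  },
  "index": {
    "name": "api",
    "type": "users"
  }
}'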
I solved this simply by updating to the latest version of the client driver.
The previous (minor) version had trouble connecting to the latest mongo server.