PGSync with AWS Elasticsearch: index not found

I am trying to sync my Postgres database with AWS Elasticsearch using PGSync.
I have defined a simple schema:
[
  {
    "database": "tenancyportal",
    "index": "properties",
    "nodes": [
      {
        "table": "properties",
        "schema": "public",
        "columns": ["id", "address"]
      }
    ]
  }
]
But when I try to bootstrap the database using
bootstrap --config schema.json
I get the following error:
elasticsearch.exceptions.NotFoundError: NotFoundError(404,
'index_not_found_exception', 'no such index [:9200]', :9200,
index_or_alias)
In the screenshot below, you can see that the GET URL for Elasticsearch is completely wrong; I am not able to understand what config is causing it to be formed like that.

It looks like your AWS Elasticsearch URL is not constructed properly. This was addressed in a recent update to PGSync. Can you pull the latest master branch and try again?
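In the meantime, it is also worth checking the connection settings the bootstrap command picks up from the environment, since an empty host would produce exactly this kind of [:9200] URL. A minimal sketch, assuming PGSync's documented ELASTICSEARCH_* and PG_* environment variables (the exact names may vary between releases, so check the version you installed; the endpoint below is hypothetical):

# point PGSync at the AWS Elasticsearch endpoint and the Postgres instance
export ELASTICSEARCH_SCHEME=https
export ELASTICSEARCH_HOST=my-domain.us-east-1.es.amazonaws.com
export ELASTICSEARCH_PORT=443
export PG_HOST=localhost
export PG_USER=postgres
export PG_PASSWORD=postgres

bootstrap --config schema.json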

Related

Setting up Strapi using MongoDB via the PM2 Runtime

I'm quite new to Strapi and I'm following the Strapi deployment documentation at https://strapi.io/documentation/3.0.0-beta.x/guides/deployment.html#configuration. I have set up Strapi using MongoDB and it seems to work both in production and dev on my server. I can create content types and add data...
Now I'm trying to start Strapi using the PM2 Runtime. I have set up the ecosystem.config.js file (see below) and I run pm2 start ecosystem.config.js. The Strapi app seems to start just fine, but what now happens in the browser is that I am prompted to create a new admin user. It seems like all users and data are lost... Is MongoDB not being accessed, or what's going on?
This is my ecosystem.config.js file:
module.exports = {
  apps: [{
    name: 'cms.strapi',
    cwd: '/var/www/domain/public_html',
    script: 'server.js',
    env: {
      NODE_ENV: 'production',
      DATABASE_HOST: '127.0.0.1',
      DATABASE_PORT: '28015',
      DATABASE_NAME: 'db-name',
      DATABASE_USERNAME: 'db-u-name',
      DATABASE_PASSWORD: 'pw',
    },
  }],
};
What am I missing?
Hi Jim, and thanks for your reply! I believe the problem was a mixup between the prod and the dev environment. Sorry, my bad. I thought I was in one environment when I was really in the other. It should be obvious when you start the server from the prompt whether you're starting dev or prod, but once the web server is up and running, I can't tell from the GUI in the browser whether it's one or the other. At least I can't find any indication other than that the admin usernames (and possibly the data) are different... Hmm..
Anyway, my production/database.json file looks like this:
{
  "defaultConnection": "default",
  "connections": {
    "default": {
      "connector": "mongoose",
      "settings": {
        "uri": "mongodb://localhost:27017/db-prod",
        "database": "db-prod",
        "host": "127.0.0.1",
        "srv": false,
        "port": 27017,
        "username": "u-name-prd",
        "password": "pw"
      },
      "options": {
        "ssl": false
      }
    }
  }
}
PM2 Runtime seems to be working correctly with Strapi and Mongo now :-)
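As a follow-up, one way to make that prod/dev mixup harder to repeat is to give PM2 an explicit block per environment and select it at start time. A sketch of the same ecosystem.config.js using PM2's env_production convention (values illustrative):

module.exports = {
  apps: [{
    name: 'cms.strapi',
    cwd: '/var/www/domain/public_html',
    script: 'server.js',
    // used by default: pm2 start ecosystem.config.js
    env: {
      NODE_ENV: 'development',
    },
    // used with: pm2 start ecosystem.config.js --env production
    env_production: {
      NODE_ENV: 'production',
    },
  }],
};

That way the environment is always stated explicitly on the command line rather than inferred.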

How do you connect over SSL to Postgres in Loopback v3

My datasource.json file looks like this...
{
  "db": {
    "name": "db",
    "connector": "memory"
  },
  "mydb": {
    "host": "mydbhost.db.ondigitalocean.com",
    "port": 25060,
    "url": "",
    "database": "mydb-staging",
    "password": "mypassword",
    "name": "mydb",
    "user": "myuser",
    "connector": "postgresql",
    "ssl": true
  }
}
But DigitalOcean managed Postgres provides you with a CA file to use.
Where do I put it?
How do I configure LB3 to know about it?
The LoopBack docs (https://loopback.io/doc/en/lb3/PostgreSQL-connector.html) say:
"The PostgreSQL connector uses node-postgres as the driver. For more information about configuration parameters, see the node-postgres documentation." (https://node-postgres.com/features/ssl)
I just don't understand how to set up LB.
When I start my server up, I get...
Unhandled rejection error: permission denied for database mydb-staging
If you are using DigitalOcean's managed database service, only the default "doadmin" user can read and write on any database; any other added user can only read data.
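As for the original CA question: LB3 datasource files can also be written as JavaScript (e.g. datasources.local.js), which lets you pass node-postgres an ssl object instead of a boolean. A sketch, untested against DigitalOcean, with the CA path being an assumption:

// datasources.local.js -- overrides the matching entries in datasources.json
const fs = require('fs');

module.exports = {
  mydb: {
    host: 'mydbhost.db.ondigitalocean.com',
    port: 25060,
    database: 'mydb-staging',
    user: 'myuser',
    password: 'mypassword',
    connector: 'postgresql',
    // node-postgres accepts an object here instead of `true`
    ssl: {
      rejectUnauthorized: true,
      // path to the CA file downloaded from DigitalOcean (hypothetical)
      ca: fs.readFileSync('/path/to/ca-certificate.crt').toString(),
    },
  },
};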

Import database from MySQL into OrientDB

I'm trying to import a database having only one table into OrientDB using their ETL import functionality. I wrote this JSON:
{
  "config": {
    "log": "debug"
  },
  "extractor": {
    "jdbc": {
      "driver": "com.mysql.jdbc.Driver",
      "url": "jdbc:mysql://localhost:8889/footballEvents",
      "userName": "root",
      "userPassword": "root",
      "query": "select * from 10eventslight_2"
    }
  },
  "transformers": [
    { "vertex": { "class": "events" } }
  ],
  "loader": {
    "orientdb": {
      "dbURL": "remote:localhost/footballEvents",
      "dbUser": "root",
      "dbPassword": "root",
      "serverUser": "root",
      "serverPassword": "root",
      "dbAutoCreate": true
    }
  }
}
Then I run the command sudo ./oetl.sh importScript.json and I don't get any error; the script runs normally. I attached the output of the command here.
Reading the [orientdb] INFO committing message at the end, I tried to connect to my database and run the commit command, but the system answered that no transaction was running. I'm quite sure the dbURL and the db/server credentials in my JSON are good, because I can use this address to connect to my database via the OrientDB console. Concerning the MySQL part, there is no doubt it's working, because it extracts data from the database and I know my credentials are OK.
So it looks like it's working and no error comes up, but nothing happens, and I don't understand why.
If it has any importance, I'm on Mac OS 10.13.1 with orientdb 2.2.29.
Thanks in advance.
OrientDB Teleporter is a tool that synchronizes an RDBMS to an OrientDB database. Teleporter is fully compatible with several RDBMSs that have a JDBC driver: it has been successfully tested with Oracle, SQL Server, MySQL, PostgreSQL, and HyperSQL. Teleporter manages all the necessary type conversions between the different DBMSs and imports all your data as a graph in OrientDB. This feature is available in both the OrientDB Enterprise Edition and the OrientDB Community Edition. But beware: with the Community Edition you can migrate your source relational database, but you cannot use the synchronize feature, which is only available in the Enterprise Edition.
For more information: https://orientdb.com/docs/last/Teleporter-Home.html
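For what it's worth, a typical Teleporter invocation for the database in the question would look roughly like this (a sketch based on the option names in the linked docs; double-check them against your OrientDB version):

./oteleporter.sh -jdriver mysql \
                 -jurl jdbc:mysql://localhost:8889/footballEvents \
                 -juser root \
                 -jpasswd root \
                 -ourl plocal:../databases/footballEvents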
Hope it helps
Regards

Remote MongoDB access through Cloud 9 gives login failed exception

I'm using the Cloud 9 IDE to develop an application using MongoDB. I created a database called "appdata" at MongoLab and the following user:
{
  "_id": "appdata.admin",
  "user": "admin",
  "db": "appdata",
  "credentials": {
    "SCRAM-SHA-1": {
      "iterationCount": 10000,
      "salt": "K/WUzUDbi3Ip4Vy59gNV7g==",
      "storedKey": "9ow35+PtcOOhfuhY7Dtk7KnfYsM=",
      "serverKey": "YfsOlFx1uvmP+VaBundvmVGW+3k="
    }
  },
  "roles": [
    {
      "role": "dbOwner",
      "db": "appdata"
    }
  ]
}
Whenever I try connecting to the database through the Cloud 9 shell using the following command (given by MongoLab for my newly created user):
mongo ds057244.mongolab.com:57244/appdata -u admin -p admin
I get the following error message:
MongoDB shell version: 2.6.11
connecting to: ds057244.mongolab.com:57244/appdata
2015-11-22T05:23:49.015+0000 Error: 18 { ok: 0.0, errmsg: "auth failed",
code: 18 } at src/mongo/shell/db.js:1292
exception: login failed
Also, in my JavaScript file running on Cloud 9 while following this tutorial (which uses mongoose to access the DB), I got stuck on the POST route for bears. Whenever I send a POST request through Postman with the specified fields set, the route doesn't return anything, neither a created bear nor an error message, which makes me think the problem is also a failure to log in to the database. The previous GET request works just fine, and my code is exactly the same as the tutorial's.
Does anyone know what the problem is in either case and what I need to do to solve it?
The shell problem was fixed by updating the shell to the database's version (which was 3.0.3).
For the JavaScript files, I restarted the tutorial and made sure I downloaded all necessary dependencies at their most recent stable versions (not the ones shown in the tutorial); after that the problem was solved.
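For anyone hitting the same thing: the 2.6 shell cannot authenticate against a 3.0 server that stores SCRAM-SHA-1 credentials (as the user document above does), so the versions need to match. You can compare the two like this:

# client (shell) version
mongo --version

# server version, from inside a connected shell
db.version()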

MongoDB river for Elasticsearch

Is there any official MongoDB river available for Elasticsearch? I am using MongoDB in Node.js through the module mongoose.
I have seen one at http://www.matt-reid.co.uk/blog_post.php?id=68
Is this the correct one? It says unofficial, though...
Edit:
It looks like https://github.com/aparo/elasticsearch has an inbuilt MongoDB plugin... Is there any doc available on how to configure this with MongoDB, and on how MongoDB pushes data to Elasticsearch for indexing?
There is a new MongoDB river on GitHub:
https://github.com/richardwilly98/elasticsearch-river-mongodb
according to the code you can specify several things, but there is no separate doc (except one mailing list discussion):
https://github.com/aparo/elasticsearch/blob/master/plugins/river/mongodb/src/main/java/org/elasticsearch/river/mongodb/MongoDBRiver.java
https://github.com/aparo/elasticsearch/blob/master/plugins/river/mongodb/src/test/java/org/elasticsearch/river/mongodb/MongoDBRiverTest.java
This isn't really the answer you're looking for. I looked at building this mongo river, but I found some discussion about it having memory leaks, and I didn't want to fiddle with Java code, so I wrote my own mongo->ES importer using the bulk API.
It's a work in progress, so feel free to contribute! :)
https://github.com/orenmazor/elastic-search-loves-mongo
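For anyone curious what that approach looks like, here is a minimal sketch of a mongo->ES bulk import in Node (all hosts, database, and index names are illustrative; older Elasticsearch versions also require a _type field in the action line):

// bulk-import.js -- stream one Mongo collection into Elasticsearch's _bulk API
const http = require('http');
const { MongoClient } = require('mongodb');

async function main() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const docs = await client.db('testmongo').collection('person').find().toArray();

  // NDJSON body: one action line, then one document line, per entry
  const body = docs.map(({ _id, ...doc }) =>
    JSON.stringify({ index: { _index: 'mongoindex', _id: _id.toString() } }) +
    '\n' + JSON.stringify(doc)
  ).join('\n') + '\n';

  const req = http.request(
    { host: 'localhost', port: 9200, path: '/_bulk', method: 'POST',
      headers: { 'Content-Type': 'application/x-ndjson' } },
    res => res.pipe(process.stdout)
  );
  req.end(body);
  await client.close();
}

main().catch(console.error);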
Yes, there is a new MongoDB river on GitHub:
https://github.com/richardwilly98/elasticsearch-river-mongodb
For further explanation, you can follow the steps below:
Step 1: Install the plugins
ES_HOME/bin/plugin -install elasticsearch/elasticsearch-mapper-attachments/1.4.0
ES_HOME/bin/plugin -install richardwilly98/elasticsearch-river-mongodb/1.4.0
Step 2: Restart Elasticsearch
ES_HOME/bin/service/elasticsearch restart
Step 3: Enable replica sets in MongoDB
Go to mongod.conf and add the line:
replSet=rs0
Save and exit, then restart mongod.
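Depending on your setup, a fresh replica set may also need to be initiated once from the mongo shell before the oplog (which the river tails) exists; a quick sketch:

// in the mongo shell, run once
rs.initiate()

// confirm the set is up
rs.status()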
Step 4: Tell Elasticsearch to index the "person" collection in the testmongo database by issuing the following command in your terminal:
curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
  "type": "mongodb",
  "mongodb": {
    "db": "testmongo",
    "collection": "person"
  },
  "index": {
    "name": "mongoindex",
    "type": "person"
  }
}'
Step 5: Add some data to MongoDB through the mongo terminal
use testmongo
var p = {firstName: "John", lastName: "Doe"}
db.person.save(p)
Step 6: Use this command to search the data
curl -XGET 'http://localhost:9200/mongoindex/_search?q=firstName:John'
NOTE: If you need to start over, delete the river and the index first:
DELETE /_river
DELETE /mongoindex
Then run this command again:
curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
  "type": "mongodb",
  "mongodb": {
    "db": "testmongo",
    "collection": "person"
  },
  "index": {
    "name": "mongoindex",
    "type": "person"
  }
}'
Step 7: Check the HQ plugin
In mongoindex, you will see your data.