logstash connection to database issue - postgresql

Using Logstash, I'm trying to dump all tag names from a table in a database into an index. The problem I'm facing is that Logstash works fine if I specify the IP address as 127.0.0.1 for the PostgreSQL connection.
But if I specify my actual IP or some other user's IP, the connection is never established and Logstash eventually fails.
Here's the configuration:
input {
  jdbc {
    # Postgres jdbc connection string to our database, empris
    jdbc_connection_string => "jdbc:postgresql://127.0.0.1:5432/empris"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    # The password for the user
    jdbc_password => ""
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/home/aravind/Desktop/postgresql-9.4-1206-jdbc4.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT distinct tagname from filemetadata"
  }
}
output {
  elasticsearch {
    hosts => "192.168.16.238:9200"
    index => "arempris"
    document_type => "tags"
  }
}
And if the jdbc_connection_string is jdbc:postgresql://192.168.16.233:5432, Logstash fails.
I would greatly appreciate your help in understanding the cause of this. Thanks in advance.

It seems like the IP you are mentioning is not reachable from the machine running Logstash. Can you try to ping the IP and see if it is able to connect?
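If the host responds to ping but the JDBC connection still fails, a common cause (not confirmed in this thread) is that PostgreSQL only listens on 127.0.0.1 out of the box. A sketch of the server-side settings to check, using the database and subnet from the question as examples:
# postgresql.conf -- accept connections on all interfaces (the default is 'localhost')
listen_addresses = '*'
# pg_hba.conf -- allow the Logstash host's subnet to authenticate
host    empris    postgres    192.168.16.0/24    md5
Restart PostgreSQL after changing either file.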

Related

Logstash can't send logs to mongoDB

I'm trying to save logs to my local MongoDB with Logstash. This is my Logstash configuration:
input {
  tcp {
    port => 5001
    codec => "json_lines"
  }
}
output {
  file {
    path => "/var/log/some.log"
  }
  mongodb {
    id => "logstash-mongo"
    collection => "logs"
    database => "somelogs"
    uri => "mongodb://my-user:my-password@127.0.0.1:27017/somelogs"
    codec => "json"
  }
  stdout { codec => rubydebug }
}
Output to the file actually works, but not output to MongoDB. I set up MongoDB with authentication and created both an admin user and a basic user with read/write access to the specific database. When I run Logstash it gives me this:
[2021-10-29T06:37:51,798][WARN ][logstash.outputs.mongodb ][main][logstash-mongo] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2f0b40c8>, :exception=>#<Mongo::Auth::Unauthorized: User my-user is not authorized to access somelogs.>}
[2021-10-29T06:37:54,887][WARN ][logstash.outputs.mongodb ][main][logstash-mongo] Failed to send event to MongoDB, retrying in 3 seconds {:event=>#<LogStash::Event:0x2f0b40c8>, :exception=>#<Mongo::Error::OperationFailure: command insert requires authentication (13)>}
I tried using both users to access MongoDB, but it still gives me this output. Any suggestion on what I should do about this issue? Many thanks.
NOTE: I have already installed the MongoDB output plugin.
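For reference, on a standard Logstash install the MongoDB output plugin is installed from the Logstash directory with:
bin/logstash-plugin install logstash-output-mongodb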
My mistake, the uri must be
uri => "mongodb://my-user:my-password@127.0.0.1:27017"
instead of
uri => "mongodb://my-user:my-password@127.0.0.1:27017/somelogs"
(presumably because, without a database name in the URI, the driver authenticates against the default admin database). Logstash is running well now.

I can not connect to Postgres DB with Strapi on Heroku

I'm trying to deploy Strapi on Heroku with Postgres as described here:
https://strapi.io/documentation/v3.x/deployment/heroku.html
But I get this error:
error: no pg_hba.conf entry for host "84.212.51.43", user "ssqqeaz***", database "d6gtu***", SSL off
I use Heroku Postgres add-on.
My database config:
module.exports = ({ env }) => ({
  defaultConnection: 'default',
  connections: {
    default: {
      connector: 'bookshelf',
      settings: {
        client: 'postgres',
        host: env('DATABASE_HOST', '127.0.0.1'),
        port: env.int('DATABASE_PORT', 27017),
        database: env('DATABASE_NAME', 'strapi'),
        username: env('DATABASE_USERNAME', ''),
        password: env('DATABASE_PASSWORD', ''),
      },
      options: {
        ssl: true
      },
    },
  },
});
Why? Please help!
Try changing ssl: true to ssl: false.
The current configuration you've posted will not work with a Heroku Postgres database. The primary concern here is that you're reading components of your Postgres database URL out of manually set config vars. Heroku strongly recommends against this because they may need to move the database to a new host in the case of a disaster. DATABASE_URL is set by Heroku when you create a database on an app, and it's the one config var you can rely on to stay up to date. Moving on...
You will need to parse the username, password, host, port and database name out of the DATABASE_URL config var and supply those to the attributes of the settings block. Based on the error you provided, I can tell you're not presently doing this, because Heroku database usernames all start with a 'u', so something is very wrong if you get the error user "ssqqeaz***". As a first step you might try hard-coding these values in the settings block to make sure it works (make sure to rotate the credentials afterwards, or otherwise clean up your git history to prevent leaked creds). The pattern for a Postgres connection URL is: postgres://$USERNAME:$PASSWORD@$HOSTNAME:$PORT/$DATABASE_NAME
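A minimal sketch of that parsing in config/database.js, assuming Strapi v3 with the bookshelf connector and Node's built-in URL class (this is illustrative, not the exact recipe from the Strapi docs):
const { URL } = require('url');

module.exports = ({ env }) => {
  // Heroku keeps DATABASE_URL current even if it moves the database host.
  const dbUrl = new URL(env('DATABASE_URL'));

  return {
    defaultConnection: 'default',
    connections: {
      default: {
        connector: 'bookshelf',
        settings: {
          client: 'postgres',
          host: dbUrl.hostname,
          port: Number(dbUrl.port),
          database: dbUrl.pathname.slice(1), // strip the leading '/'
          username: dbUrl.username,
          password: dbUrl.password,
          ssl: env.bool('DATABASE_SSL', true),
        },
        options: {},
      },
    },
  };
};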
Not sure if it will help, but try moving your config around:
remove ssl from the options key
insert ssl after password inside the settings key
e.g.
ssl: env.bool('DATABASE_SSL', false),
Also check your app's config vars inside Heroku and make sure you have the required Postgres config vars set up and that they match the Heroku-generated DATABASE_URL config var.
Lastly, check your ./config/server.js file and make sure your host is 0.0.0.0,
e.g.
module.exports = ({ env }) => ({
  host: env('HOST', '0.0.0.0'),
  port: env.int('PORT', 1337),
  admin: {
    auth: {
      secret: env('ADMIN_JWT_SECRET', '**********************************'),
    },
  },
});

Mongo::Error::NoServerAvailable: No primary server is available in cluster -- an error when using Logstash to send data to MongoDB in Docker

This is my logstash.conf:
input {
  stdin {
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://ip:3306/database"
    jdbc_user => "root"
    jdbc_password => ""
    jdbc_driver_library => "/app/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    record_last_run => "true"
    use_column_value => "false"
    tracking_column => "id"
    last_run_metadata_path => "/app/info/station_parameter.txt"
    clean_run => "false"
    jdbc_paging_enabled => "true"
    jdbc_page_size => 100
    statement => "select * from database where id>=1"
    schedule => "* * * * *"
  }
}
output {
  mongodb {
    uri => "mongodb://192.168.1.103:27017"
    database => "mydata"
    collection => "mycollection"
  }
}
The error is:
Failed to send event to MongoDB, retrying in 3 seconds {:event=>#, :exception=>#]> with timeout=30, LT=0.015>}
Not sure if you will find this useful, but we experienced the same issue in two different scenarios and resolved it, so I would like to share them with you.
The client was not able to resolve the primary server's address due to network-related issues in the Docker files; the MongoDB server was up and running, but the client could not resolve its address, so this error was thrown.
MongoDB was restarted (or the connection was lost for some reason), and when MongoDB came back online, the Mongoid client (Ruby) was not able to reconnect; the solution was to restart the client as well to establish the connection again. After investigation, I would say it was most likely a version incompatibility between the driver and the MongoDB server.
In general, we were not able to find a permanent solution, and it appears to be an issue with the MongoDB drivers, but one thing is for sure: check the compatibility of the MongoDB driver and the server.
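If you want to rule version problems in or out quickly, two generic checks (not from the original answer) are:
mongod --version
bin/logstash-plugin list --verbose | grep mongodb
The first prints the server version; the second shows the installed Logstash MongoDB output plugin version (the plugin depends on the mongo Ruby gem, i.e. the driver).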

Eloquent innodb - SQLSTATE[42S02]: Base table or view not found Table doesn't exist

I have this weird crash and have been trying for a few hours to solve it with no success; I checked everywhere on the internet and didn't find a solution.
I am running Eloquent (without Laravel). I have two servers (frontend and backend). The DB is on the backend server.
I am running the exact same code (using Eloquent) to connect to the database from both servers. From the backend server everything works. On the frontend server, I get mixed results. Out of 9 tables:
three work, and I am able to retrieve the data.
three return [items:protected] => Array (), while the table has data to return.
three cause the fatal error SQLSTATE[42S02]: Base table or view not found Table doesn't exist
From #1 I understand that this is not an issue with the database user/password.
For #2, I checked the queries using getQueryLog(), ran them in phpMyAdmin, and they work.
From what I have observed so far, the tables that cause #3 are all InnoDB. Why my other tables are MyISAM, I don't know.
And the exact same code works on the backend server, so it is not an issue with table names being singular or some other code-related issue.
The backend server is PHP 5.5.9, Eloquent 5.4.45.
The frontend server is PHP 5.6.32, Eloquent 5.4.36.
Any idea what to do about #2 (empty data returned) and #3 (PHP fatal error)?
OK, I found this crazy bug (or rather, bugs).
First, I found out that we had an old version of the database on the frontend server. It was not supposed to be there, so I didn't even consider it as a possibility.
In addition, there was a problem in the code:
$dbhost = __MYSQLDB_HOST__;
$dbuser = 'xxxxx';
$dbpass = 'xxxxx';
$dbname = 'xxxxx';

$capsule = new Capsule;
$capsule->addConnection(array(
    'driver'    => 'mysql',
    'host'      => 'localhost', // <= this is the problem!!!
    'database'  => $dbname,
    'username'  => $dbuser,
    'password'  => $dbpass,
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => ''
));
As you can see, when connecting to the database the code used localhost instead of the $dbhost variable, which contained the remote database host on the backend server.
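For completeness, the fix implied by that comment is simply (using the variable already defined above):
'host' => $dbhost, // use the configured remote host instead of 'localhost'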

Configure Dancer from environment variables?

I'm new to Dancer, but I'm trying to configure it to work within a Docker container. As a result, I need to pick up my database settings from the environment.
In my case, I have DB_PORT_3306_TCP_ADDR and DB_PORT_3306_TCP_PORT coming from Docker. Unfortunately, the Dancer::Plugin::Database module errors out before I can change the database settings to use those variables.
use Dancer ':syntax';
use Dancer::Plugin::Database;

if ($ENV{DB_PORT_3306_TCP}) {    # Connected via docker.
    database->({
        driver   => 'mysql',
        username => 'username',
        password => 'password',
        host     => $ENV{DB_PORT_3306_TCP_ADDR},
        port     => $ENV{DB_PORT_3306_TCP_PORT},
        database => $ENV{DB_ENV_MYSQL_DATABASE},
    });
}
So the question is, is there a good way to configure Dancer from environment variables, instead of through static YAML?
See Runtime Configuration in the Dancer::Plugin::Database docs:
You can pass a hashref to the database() keyword to provide configuration details to override any in the config file at runtime if desired, for instance:
my $dbh = database({ driver => 'SQLite', database => $filename });
You're adding an extra ->, which causes the error. The following should work:
use Dancer ':syntax';
use Dancer::Plugin::Database;

if ($ENV{DB_PORT_3306_TCP}) {    # Connected via docker.
    database({
        driver   => 'mysql',
        username => 'username',
        password => 'password',
        host     => $ENV{DB_PORT_3306_TCP_ADDR},
        port     => $ENV{DB_PORT_3306_TCP_PORT},
        database => $ENV{DB_ENV_MYSQL_DATABASE},
    });
}
At the beginning of your lib/myapp.pm, after module loading, add:
setting('plugins')->{'Database'}->{'host'}     = 'postgres';
setting('plugins')->{'Database'}->{'database'} = $ENV{POSTGRES_DB};
setting('plugins')->{'Database'}->{'username'} = $ENV{POSTGRES_USER};
setting('plugins')->{'Database'}->{'password'} = $ENV{POSTGRES_PASSWORD};
and keep the static config (driver, port) in config.yml.
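A sketch of the static part kept in config.yml, assuming the PostgreSQL DBD::Pg driver on its default port (values are illustrative):
plugins:
  Database:
    driver: 'Pg'
    port: 5432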