Apache Airflow init db - PostgreSQL

I am trying to initialize a database for my project, which is based on Apache Airflow. I am not too familiar with what happened, but I changed the value in my airflow.cfg file to sql_alchemy_conn = postgresql+psycopg2:////Users/gabeuy/airflow/airflow.db. When I saved the changes and ran the command airflow db init, an error occurred and the db would not initialize.
I tried looking up different ways to change it and ensured that I had Postgres and psycopg2 installed, but running the command still resulted in an error. I was expecting it to run so that I could access the Airflow db on localhost with the DAGs.

Your sql_alchemy_conn is pointing to a local file path (indicating a SQLite DB), but the protocol indicates a PostgreSQL DB. The error is telling you it's missing a password, which PostgreSQL requires.
For PostgreSQL, the expected URL format is:
postgresql+psycopg2://<user>:<password>@<host>/<db>
And for a SQLite DB, the expected URL format is:
sqlite:////<path/to/airflow.db>
A SQLite DB is convenient for testing purposes. A SQLite DB is stored as a single file on your computer which makes it easy to set up (airflow db init will generate the file if it doesn't exist). A PostgreSQL DB takes a bit more work to set up, but is generally advised for a production scenario.
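As an illustration, a minimal local Postgres setup might look like the sketch below; the role name, password, and database name are placeholders, and this assumes Postgres is already running on localhost:5432:
# Create a role and database for Airflow (placeholder credentials):
psql -U postgres -c "CREATE USER airflow WITH PASSWORD 'airflow';"
psql -U postgres -c "CREATE DATABASE airflow OWNER airflow;"
# Point airflow.cfg at it:
#   sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow
# Then initialize the metadata database:
airflow db init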
For more information about Airflow database configuration, see: https://airflow.apache.org/docs/apache-airflow/stable/howto/set-up-database.html.
And for more information about airflow db CLI commands, see: https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#db.

Related

Prisma with Bit.io connection issue

I have a Next.js app set up with Prisma (v3.13) as the ORM. I am testing out bit.io for db hosting, and I am getting this error when trying to connect with the client. Everything works as intended when I use a local Postgres db. I'm currently using a connection string that looks like the following:
DATABASE_URL="postgresql://[username]:[password]@db.bit.io/[username]/[dbname]"
I am trying to run prisma db push and am getting the following error:
Environment variables loaded from .env
Prisma schema loaded from prisma/schema.prisma
Datasource "db": PostgreSQL database "eli-front/rankstl", schema "public" at "db.bit.io:5432"
Error: P1000: Authentication failed against database server at `db.bit.io`, the provided database credentials for `(not available)` are not valid.
Please make sure to provide valid database credentials for the database server at `db.bit.io`.
I am assuming the core of the issue has to do with the part of the error that says credentials for '(not available)', as if something isn't loading correctly.
Using the failing connection string with psql works completely fine, but not with Prisma.
There are two things that need to be done in order for bit.io to work with Prisma.
Database names must be formatted as username.dbname rather than username/dbname. bit.io supports a number of different separator characters in the database name because different clients have different requirements around permissible characters in database names.
You have to create a second database on bit.io to use as a "shadow database." By default this is handled automatically: a shadow database is created, used, and deleted. However, most cloud database providers don't allow use of the CREATE DATABASE statement, so a shadow database must be created explicitly. See the Prisma docs for details, and the sketch below.
See the bit.io docs on connecting with Prisma for more details on setting up a minimum working connection.
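For illustration, a minimal .env might look like the following; the user, password, and database names are all placeholders:
# .env -- note the '.' separator in the database name, not '/':
DATABASE_URL="postgresql://user:password@db.bit.io/user.dbname"
# A second bit.io database, created explicitly, to serve as the shadow database:
SHADOW_DATABASE_URL="postgresql://user:password@db.bit.io/user.dbname_shadow"
The datasource block in schema.prisma can then reference the second variable via shadowDatabaseUrl = env("SHADOW_DATABASE_URL"); see the Prisma docs for which commands actually use the shadow database.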

Errors when configuring Apache Airflow to use a postgres database

I have been introducing myself to Apache Airflow, and so far everything is going well. However, I have been using the default SQLite database, and I now need to change to a PostgreSQL database. I have changed the executor to LocalExecutor and I have set the sql_alchemy_conn string to postgresql+psycopg2://airflow:airflow@postgres:5432/airflow, which is the address of the airflow database I created in Postgres.
Now when I run airflow initdb I receive the error
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the LocalExecutor
I am using PostgreSQL 9.4.24.
Does anyone know why this is occurring?
I resolved the issue: I was using the wrong Postgres user for the location of the database. It should have been postgresql+psycopg2://user:user@localhost:5432/airflow
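As a quick sanity check (the credentials here are placeholders), you can verify that the connection string authenticates outside Airflow before re-running the init:
# Confirm the DSN works with psql first:
psql "postgresql://user:user@localhost:5432/airflow" -c "SELECT 1;"
# Then re-initialize (airflow initdb on Airflow 1.x; airflow db init on 2.x):
airflow initdb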

AWS DMS Streaming Replication: Logical Decoding Output Plugin (test_decoding) not accessible

I'm trying to migrate a PostgreSQL DB hosted in the cloud (on a DO droplet) to RDS using AWS Database Migration Service (DMS).
I've successfully configured the replication instance and endpoints.
I've created a task with Migrate existing data and replicate ongoing changes. When I start the task it fails with the error ERROR: could not access file "test_decoding": No such file or directory.
I've tried to create a replication slot manually from my DB console and it throws the same error.
I've followed the procedure suggested in the DMS documentation for Postgres.
I'm using PostgreSQL 9.4.6 on my source endpoint.
I presume that the problem is that the output plugin test_decoding is not accessible to perform the replication.
Please assist me to resolve this. Thanks in advance!
You must install the postgresql-contrib package (additional supplied modules) on your source endpoint.
If it is already installed, make sure the directory where the test_decoding module is located is the same as the directory where PostgreSQL expects it.
On *nix, you can check the module directory with:
pg_config --pkglibdir
If it is not the same, copy the module, make a symlink, or use whatever other solution you prefer.
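For example, on a Debian-family source host the check might look like this; the package name assumes PostgreSQL 9.4, and the slot name is a placeholder:
# Install the additional supplied modules (includes test_decoding):
sudo apt-get install postgresql-contrib-9.4
# Confirm the module sits in the directory PostgreSQL expects:
ls "$(pg_config --pkglibdir)" | grep test_decoding
# Then retry the slot creation from psql:
#   SELECT * FROM pg_create_logical_replication_slot('dms_slot', 'test_decoding');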

How can I copy my local PostgreSQL database to Heroku for a Spring Boot app

I have deployed my Spring Boot app to Heroku. Now I would like to copy my local PostgreSQL database to Heroku.
I have found some information on devcenter.heroku.com.
However, I don't understand enough about using the db.changelog-master.yaml file.
Could anyone give me details about the simplest solution for copying the database?
Create a valid dump of your local Postgres database and host it somewhere publicly available. You will then be able to restore the entire dataset (schema and records) with heroku pg:backups:restore. The sole caveat is that the target database must be completely empty for this to work; you can empty a Heroku Postgres database with heroku pg:reset.
If you cannot take the approach listed above, you can run pg_restore directly from your local instance, provided your local version of Postgres is >= the target version of Postgres. This also applies to creating the dump file, and is a requirement because pg utilities are not guaranteed to be forward compatible. See the PostgreSQL documentation for pg_restore.
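A rough sketch of both approaches; the app name, database names, credentials, and URL are placeholders:
# Create a compressed dump of the local database:
pg_dump -Fc --no-acl --no-owner -h localhost -U myuser mydb > mydb.dump
# Approach 1: host mydb.dump at a publicly accessible URL, then restore into an empty Heroku DB:
heroku pg:reset DATABASE_URL --app my-app
heroku pg:backups:restore 'https://example.com/mydb.dump' DATABASE_URL --app my-app
# Approach 2: restore directly from your machine using Heroku's connection details:
heroku pg:credentials:url DATABASE_URL --app my-app   # prints host, user, db, and password
pg_restore --no-acl --no-owner -h <host> -U <user> -d <dbname> mydb.dump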

How to access the database imported through datapump

I just imported a data dump with the command below:
IMPDP user/pass FULL=Y DUMPFILE=BIRDV24012014.DMP LOGFILE=BIRDV24012014.log;
The dump has been restored; the issue is that I don't know how to connect to the database I just imported. What service or TNS does it reside in, and how can I query it?
You didn't import a database, you imported the contents of your file into your existing database. If you could successfully run impdp user/pass then your ORACLE_SID etc. is already set and you should be able to log in and query with sqlplus user/pass.
If you've come from another RDBMS background you may be confusing 'database' with 'schema'. Depending on what was in the dump, you've probably created a load of schema objects and data under the USER schema (or whatever your real 'user' value was).
The import makes no difference to this, but if you want to access the database from another client (e.g. from another machine, or over JDBC) then you'll need to check your listener configuration to get the hostname/IP address and port it's listening on, and get the service name for the database; all of which can be obtained from lsnrctl services if you have permission to run that. You can then use those values for a JDBC URL, or in a tnsnames.ora entry, or ODBC, etc.
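For instance, once the listener details are known, a remote connection might look like this; the host, port, and service name are examples:
lsnrctl services                        # shows the listening host, port, and service names
sqlplus user/pass@//dbhost:1521/ORCL    # EZConnect syntax: //host:port/service
# JDBC equivalent: jdbc:oracle:thin:@//dbhost:1521/ORCL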
Look at your ORACLE_SID environment variable; there you'll find the instance ID. If you ran the IMPDP tool as the oracle OS user, you should also be able to connect to the database using
sqlplus / as sysdba
If all else fails, look at your /etc/oratab file to see which instances are available on this host.
On another note, your command seems incomplete. Data Pump needs a DIRECTORY parameter to know where to look for the dump file you specified (if it is omitted, the default DATA_PUMP_DIR directory object is used, and your user needs access to it).
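A fuller invocation might look like this; the directory object name, path, and grantee are examples:
# Create a directory object for the dump location and grant access to the importing user:
sqlplus / as sysdba <<'SQL'
CREATE DIRECTORY dp_dir AS '/u01/dumps';
GRANT READ, WRITE ON DIRECTORY dp_dir TO bird_user;
SQL
# Run the import against it:
impdp user/pass FULL=Y DIRECTORY=dp_dir DUMPFILE=BIRDV24012014.DMP LOGFILE=BIRDV24012014.log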