Export conversations from Botpress

How can I easily export all the messages from my conversations with the chatbot? When I try to export the logs from Botpress, the text file is always empty.

You can check the SQLite database for all logs, or, if you are using PostgreSQL, you can check the tables there directly.
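If it helps, here is a minimal sketch of inspecting the database directly. It assumes the default SQLite file at data/storage/core.sqlite and an events table holding the stored messages; both the path and the table names can differ by Botpress version, so list the tables first:

sqlite3 data/storage/core.sqlite ".tables"
sqlite3 data/storage/core.sqlite "SELECT * FROM events LIMIT 10;"

With PostgreSQL, the same inspection via psql (host, user, and database names are placeholders):

psql -h localhost -U botpress -d botpress_db -c "\dt"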

Related

GCP connect other user psql instance

We are a student group that wants to make a simple PostgreSQL project on Google Cloud.
I created the database, tables, etc., but I can't figure out how my team-mates can connect to that database.
You can create users in the console to allow your team-mates to connect. Please follow the steps in the link for Creating a User.
As for the error message: you need to enable the Cloud SQL Admin API.
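A rough sketch of the same steps with the gcloud CLI (the instance name, user name, and password below are placeholders):

gcloud services enable sqladmin.googleapis.com
gcloud sql users create teammate --instance=my-instance --password=CHANGE_ME
gcloud sql connect my-instance --user=teammate

The last command temporarily allowlists the caller's IP and opens a psql session, so your team-mates can use it without editing the instance's authorized networks.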

Export database from Google Cloud SQL to an external database

I'm trying to export my database created in Google Cloud SQL and import it into a new external server.
I tried to create a SQL backup through the Google console, downloaded it, copied it to the new server via FileZilla, and then launched the following command:
psql -U postgres -d ciclods-db -1 -f Backup-db_Cloud_SQL_Export_2019-03-23\ \(17_01_19\)
but I get this output:
ERROR: role "cloudsqladmin" does not exist
REVOKE
ERROR: role "cloudsqlsuperuser" does not exist
GRANT
What is the right procedure to follow in these cases?
I resolved the same problem by locating and deleting the two lines in the exported SQL file that mention "cloudsqladmin". My app does not use that role anyway.
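A minimal sketch of stripping those lines before running the import (assuming the dump was renamed to backup.sql; the command also drops the cloudsqlsuperuser lines in the same pass):

sed -i.bak '/cloudsqladmin/d; /cloudsqlsuperuser/d' backup.sql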
To do this task you can follow the official GCP guide on how to export data from Cloud SQL[1]. In that document they give you the option to export the data into a dump file or into CSV files, which can be used with other tools.
https://cloud.google.com/sql/docs/mysql/import-export/exporting
In order to create the export file, you have to do it from a command line and use additional flags. In the documentation's "Exporting data to a SQL dump file" there is a section on exporting data from an externally-managed database server.
There you can also find the option to export the data into a CSV file.
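As a rough sketch, the corresponding gcloud commands look like this (the instance, bucket, and table names are placeholders; the database name is taken from the question above):

gcloud sql export sql my-instance gs://my-bucket/dump.sql.gz --database=ciclods-db
gcloud sql export csv my-instance gs://my-bucket/data.csv --database=ciclods-db --query="SELECT * FROM my_table"

The first command writes a SQL dump to a Cloud Storage bucket; the second exports the result of a query as CSV.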

Parse Migration - How to migrate Parse data to a localhost MongoDB?

I have been trying to migrate my Parse data to a localhost MongoDB, but to no avail. There are a total of 12 steps, as mentioned in https://parse.com/migration#database
I am currently still at step 1 and have encountered some difficulties. I managed to set up MongoDB on my computer (localhost). Then I went to my "app settings" in Parse to start the data migration. Parse wanted me to paste the MongoDB connection URL, which I entered as "mongodb://localhost/". However, there was an error: "no reachable servers". On my localhost, I am running MongoDB from my terminal.
Any advice on this? This is my first time doing a data migration and trying out MongoDB. Any help will be greatly appreciated!
Cheers
In your Parse dashboard, go to App settings -> General. On this page you can find the "Export app data" button. Click it and Parse will send you an email with the database data as CSV; use it to import your data into your local database (with RockMongo, for example, or mongoimport).
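Once you have the export file locally, a minimal import sketch with mongoimport (the database, collection, and file names are placeholders):

mongoimport --db mydb --collection users --type csv --headerline --file export.csv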

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a DashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
P.S. I'm aware of the REST client API for DashDB for loading data - I'm asking specifically how/if bulk loads can be done with the DB2 command line as an alternative option.
As per the dashDB documentation, you can use the Command Line Processor Plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
Please note that in CLPPlus the IMPORT, EXPORT and LOAD commands have a restriction that the processed files must be on the server: see here. So you would have to copy the input load file onto the remote server first with SCP. However, the SSH/SCP protocol is typically blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using IDA LOADGEOSPATIALDATA command in CLPPlus.
The file to be loaded into dashDB using the above command can be in the local file system, accessible to the CLPPlus user.
Alternative ways to do that are:
dashDB REST API (as you already mentioned). See Load delimited data using the REST API and cURL.
Load the CSV directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
Load the CSV using IBM Data Studio. See dashDB large file load using IBM Data Studio.
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When attempting to execute the IMPORT command it needs to bind some packages of that older version, since dashDB is obviously DB2 10.5.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may be already bound in the database.
To verify that, you could run: select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%' -- you should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.
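To see which side is out of date, a small sketch you could run from the DB2 client, after connecting as described in the question (the query is the verification mentioned above):

db2level
db2 "select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'"

db2level reports the client version and fix pack level, which you can compare against the fix pack the technote associates with NULLID.SQLUAJ19.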

Data Migration from Java Hibernate SQL Server to Python Mongo Stack

I have a live website with multiple active users (around 30K), and each of them has their own configuration to render their homepage. The current stack of the portal is Java Spring Hibernate with SQL Server. Now we have rewritten the code in a Python MongoDB stack and want to migrate our users to the new system. The issue is that the old and new code will be deployed on separate machines, and we want to run this migration for a few users as part of beta testing. Once the beta testing is done, we will migrate all the users.
What would be the best approach to achieve this? We are thinking about dumping the data into an intermediate file format like XML/JSON on a remote server and then reading it in the new code.
Please suggest what would be the best way to accomplish this task.
Import CSV, TSV, or JSON data into MongoDB.
It will be faster and simpler to dump the data into a format like JSON, TSV, or CSV, copy the file to the new server, and then import the data using mongoimport from the command-line shell.
Example
mongoimport -d databasename -c collectionname < users.json
Kindly refer to the link below for more information on mongoimport if you need it:
http://docs.mongodb.org/manual/reference/mongoimport/
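If the export is a CSV file or a JSON array rather than newline-delimited JSON, mongoimport covers that too; a short sketch with placeholder database, collection, and file names:

mongoimport -d databasename -c collectionname --jsonArray --file users.json
mongoimport -d databasename -c collectionname --type csv --headerline --file users.csv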