How to make Metabase write scheduled files to SFTP instead of e-mails - postgresql

I use Metabase to generate dashboards and reports. I need to generate files using the scheduler and, instead of sending them by e-mail, make them available over SFTP. Do you have any suggestions on how to automate this process?
I use PostgreSQL as a database source.
I can also try other open source tools if needed.
I didn't find much information on how to do it yet.
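As far as I know, Metabase has no built-in SFTP delivery, so one pattern is to skip the e-mail subscription entirely and run a small script on a cron schedule that exports a saved question through Metabase's HTTP API and uploads the result over SFTP. Below is a minimal sketch, assuming the standard /api/session and /api/card/:id/query/csv endpoints and the paramiko library; every host, credential, and the card ID are placeholders:

import io
from datetime import date

import paramiko
import requests

METABASE_URL = "http://metabase.example.com"  # placeholder
CARD_ID = 42  # the saved question to export (placeholder)

# 1. Authenticate and obtain a session token
session_id = requests.post(
    f"{METABASE_URL}/api/session",
    json={"username": "reports@example.com", "password": "secret"},
).json()["id"]

# 2. Export the question's results as CSV
csv_bytes = requests.post(
    f"{METABASE_URL}/api/card/{CARD_ID}/query/csv",
    headers={"X-Metabase-Session": session_id},
).content

# 3. Push the CSV to the SFTP server
transport = paramiko.Transport(("sftp.example.com", 22))  # placeholder host
transport.connect(username="sftpuser", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.putfo(io.BytesIO(csv_bytes), f"/reports/report_{date.today()}.csv")
sftp.close()
transport.close()

Scheduled with cron (or Windows Task Scheduler), this replaces the e-mail step entirely; the same approach should also work for XLSX by swapping the export endpoint's format suffix.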

Related

Updating online Mongo Database from offline copy

I have a large Mongo database (5M documents). I edit the database from an offline application, so I store the database on my local computer. However, I want to be able to maintain an online copy of the database, so that my website can access it.
How can I update the online copy regularly, without having to upload multiple GBs of data every time?
Is there some way to "track changes" and upload only the diff, like in Git?
Following up on my comment:
Can't you store the commands you used on your offline DB, and then apply them on the online DB through a script run over SSH, for instance? Or, even better, upload a file with all the commands you ran on your offline base to your server, and then execute them with a cron job or a bash script? (The only requirement would be for your bases to have the same starting point, and the same state, when you execute the script.)
I would recommend storing all the queries you execute on your offline base. You have many options for this; the one I can think of is setting the profiling level to log all your queries.
(Here is a more detailed thread on the matter: MongoDB logging all queries)
Then you would have to extract them somehow (grep?), or store them directly in another file on the fly, as they are executed.
For uploading the script, it depends on what you would like to use, but I suppose you would need to do it during low-usage hours, and you could automate the task with a cron job and an SSH tunnel.
I guess it all depends on your constraints (security, downtime, etc.).
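To make the profiling idea concrete, here is a minimal pymongo sketch; the database name, output file, and the set of operations kept are assumptions, and note that level-2 profiling logs every operation, which adds overhead on a busy instance:

import json

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["mydb"]  # placeholder database name

# Profiling level 2 logs all operations on this database
db.command("profile", 2)

# ... run your offline edits here ...

# Extract the logged write operations from system.profile so they can be
# shipped to the server (e.g. by a cron job over an SSH tunnel) and replayed
with open("ops_to_replay.json", "w") as out:
    for op in db.system.profile.find({"op": {"$in": ["insert", "update", "remove"]}}):
        out.write(json.dumps(op, default=str) + "\n")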

How to take backup of Tableau Server Repository (PostgreSQL)

We are using the 2018.3 version of Tableau Server. Server stats like user logins are getting logged into the PostgreSQL DB, and the same are being cleared regularly after one week.
Is there any API available in Tableau to connect to the DB and take a backup of the data somewhere like HDFS, or any place on a Linux server?
Kindly let me know if there are any ways other than an API as well.
Thanks.
You can enable access to the underlying PostgreSQL repository database with the tsm command. Here is a link to the documentation for your (older) version of Tableau:
https://help.tableau.com/v2018.3/server/en-us/cli_data-access.htm#repository-access-enable
It would be good security practice to limit access to only the (whitelisted) machines that need it, to create or use an existing read-only account to access the repository, and ideally to disable access when your admin programs are complete (i.e. enable access, do your query, disable access).
This way you can have any SQL client code you wish query the repository, create a mirror, create reports, run auditing procedures - whatever you like.
Personally, before writing significant custom code, I'd first see if the info you want is already available another way: in one of the built-in admin views, via the REST API, using the public-domain LogShark or TabMon systems, with the Server Management Add-on (for more recent versions of Tableau), or possibly the new Data Catalog.
I know at least one server admin who somehow clones the whole Postgres repository database periodically so he can analyze stats offline. I'm not sure what approach he uses to clone. So you have several options.
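As a concrete example of the "any SQL client" option, here is a minimal psycopg2 sketch that dumps one repository table to CSV. Host, credentials, and the table are placeholders; the repository database is typically named workgroup and listens on port 8060, but verify both for your installation:

import csv

import psycopg2

conn = psycopg2.connect(
    host="tableau-server.example.com",  # placeholder
    port=8060,  # default repository port; confirm for your install
    dbname="workgroup",
    user="readonly",
    password="secret",
)

with conn.cursor() as cur, open("http_requests_backup.csv", "w", newline="") as f:
    cur.execute("SELECT * FROM http_requests")
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # column names
    writer.writerows(cur)  # cursor iterates over result rows

conn.close()

From there the file can be copied into HDFS or wherever your retention policy requires.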

FileMaker Task Automation

I'd like to automate several FileMaker tasks using Windows Task Scheduler. It looks like script steps are the way to go, but I'm not sure. I'd like to run tasks, exporting for example, several times per day, but WITHOUT opening the FileMaker GUI. Is that possible? Any tips you have would be great. Thanks.
It's possible to initiate a FileMaker script using a scheduled server script with FileMaker Server. However, if the database is not hosted on FileMaker Server, or not open in FileMaker Pro (which sounds like your situation), then there is no active engine able to actually perform the calculations (script steps, etc.). The database has to be running somewhere to initiate and perform any scripts.
If the database is hosted on FileMaker Server, then it is pretty easy to set up a scheduled script that will run at a designated time. If you don't have a license for FileMaker Server, some FileMaker cloud hosting providers have monthly plans that are relatively cheap ($20/month with unlimited connections), and they'll work with you to set up a scheduled script (for free).
The best way to automate FileMaker tasks is to use FileMaker Server, which has scheduled scripts. Of course, it is more expensive than the standalone version of FileMaker Pro.
If you automate tasks on a local FileMaker file, you cannot avoid starting FileMaker and opening the file.
FileMaker has limited support for VBScript: you can run FileMaker, open a file, and run a FileMaker script from VBScript, and add that script to Windows Task Scheduler.
This is not the preferable way, but if you have no other option, it may be handy.
In Task Scheduler, create a task.
On the Action tab, choose "Start a program".
On the next screen, point to the FileMaker Pro exe file; typically it is C:\Program Files\FileMaker Pro\FileMaker.exe.
Add the argument:
"fmp://hostName/fileName.fmp12?script=scriptName&param=optionalScriptParameters"
Please read more about the fmp URL schema here: http://www.filemaker.com/help/12/fmp/en/html/sharing_data.16.7.html. This will vary depending on whether you are hosting your file on FileMaker Server or opening it locally.
Note: avoid spaces or special characters in the script name.
Save the task. Reopen the task properties and save your Windows account credentials, so that the task may run without you having to log in.
Either save the FileMaker login credentials upon login (if your FM version allows), pass credentials through the fmp URL (as described in the link above), or go to FileMaker file options and use "Log in using:" (which is not secure and not recommended).
I am using this method to automatically send emails with PDF attachments, since FileMaker Server does not let you use Export Records as PDF in server scripts (not until v16).
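If you prefer Task Scheduler to run a script rather than FileMaker.exe with an argument, a tiny launcher can hand the fmp URL to the registered handler instead. A minimal sketch, reusing the placeholder host, file, and script names from the URL above (Windows-only, since os.startfile relies on the shell's URL handling):

import os

# Hands the fmp:// URL to its registered handler (FileMaker Pro), which
# opens the file and runs the named script; placeholders as above
os.startfile("fmp://hostName/fileName.fmp12?script=scriptName&param=optionalScriptParameters")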

Publishing Reports on Tableau via tabcmd

I wanted to publish Tableau reports via tabcmd commands and was able to do it successfully. One concern I have is connecting the twbx file to a data source via tabcmd commands.
Following are the commands which I used:
Log in to the Tableau server:
tabcmd.exe login --server http://serverName --user "userName" --password "password" --site ""
Publish Tableau reports to the Tableau server:
tabcmd.exe publish -c "E:\Tableau\ActualReportName.twbx" -n "new Report name.twbx" --project ProjectName --db-user "DBuserName" --db-password "DBpassword"
Although I have given my DB credentials while publishing the reports, I have nowhere mentioned the DB server name, or the DB name for that matter, from which the twbx files would fetch the data.
I have multiple DBs using the same credentials. Is there any way in tabcmd to specify the DB server name and DB name from which the reports would fetch data?
Any help on this would be great!
Unless you have a pressing reason, I'd publish a .twb file instead of a .twbx file.
The first thing I'd look into is Tableau Server's support for publishing data sources that your published workbook can connect to via the Tableau server. That will allow you to embed your credentials in the shared data source and to update the workbooks and the connections in separate steps. That is especially useful if the data connection and the workbooks change at different tempos.
The unsupported hack is to have your script update the twb file before publishing. It's just an XML file, and the info you want to change should be with the data connection details. If you go this route, standard disclaimers apply: save backups, don't modify the original, generate a revised version, and expect to have to tweak your script when Tableau versions change. Still, it's not too hard to make sense of their XML. You could probably do it with just a few lines of XSLT, but even a simple string replacement might be good enough.
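For illustration, here is a minimal Python sketch of that XML edit, assuming the server and dbname attributes Tableau typically writes on a .twb's connection elements; verify the attribute names against your own workbook, and always write to a copy:

import xml.etree.ElementTree as ET

tree = ET.parse("ActualReportName.twb")
for conn in tree.iter("connection"):
    # Only touch elements that actually carry connection details
    if conn.get("server") is not None:
        conn.set("server", "new-db-host.example.com")  # placeholder
    if conn.get("dbname") is not None:
        conn.set("dbname", "new_database")  # placeholder

# Never overwrite the original; publish the revised copy instead
tree.write("ActualReportName_revised.twb", xml_declaration=True, encoding="utf-8")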
Still, I'd go with a shared data source over hacking the TWB internals in almost all cases.

Send mail from EC2 or EMR on AWS

Is there any way to send mails with reports attached from EMR?
I am using Amazon Web Services. I don't want to write a script inside EC2 to fetch data from EMR, add it to cron, and then send the mails daily. With any luck, is there already a job scheduler from Amazon to automate this?
Problem:
Implement a daily job to generate .csv/.xls files on top of Hive
Send the report in email
Thanks in advance!
If you use AWS Data Pipeline (and use EMR as a node inside it), it has OnSuccess and OnFailure alarm support. In the alarm you can configure it to send an email to you.
http://aws.amazon.com/datapipeline/faqs/ (look for "How do I add alarms to an activity?").
You can't do much to customize the email content, though. Maybe you can keep the CSVs in a pre-designated location with some time/date-based naming convention. This way, when you get the success mail, you know where to look for the records.
All of this you can do without writing any extra code (just configuration).
One alternative is to set up Oozie on your EMR cluster and create a workflow that sends email through Amazon SES.
You can read up more on Oozie on their open source page:
https://oozie.apache.org/
You might find this helpful too:
https://github.com/lila/emr-oozie-sample
and finally Amazon SES:
http://aws.amazon.com/ses/
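Putting the SES piece together, here is a minimal boto3 sketch that mails a generated CSV as an attachment, for example as the last action of an Oozie workflow or a cron job on the master node. The addresses, region, and file path are placeholders, and the sender address must be verified in SES first:

from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

import boto3

msg = MIMEMultipart()
msg["Subject"] = "Daily Hive report"
msg["From"] = "reports@example.com"  # must be verified in SES (placeholder)
msg["To"] = "team@example.com"  # placeholder

msg.attach(MIMEText("Attached is today's report."))
with open("/tmp/report.csv", "rb") as f:  # placeholder path
    part = MIMEApplication(f.read())
    part.add_header("Content-Disposition", "attachment", filename="report.csv")
    msg.attach(part)

ses = boto3.client("ses", region_name="us-east-1")  # placeholder region
ses.send_raw_email(RawMessage={"Data": msg.as_string()})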