Is it possible to programmatically create Custom SQL Tableau extracts that can be published and then refreshed on the server?

Given some Custom SQL, I want to create a Tableau Extract programmatically.
Is this possible?
Context of the process is:
1. Generate SQL scripts for each extract (100+)
2. Create the (100+) extracts from Step 1
3. Publish the extracts to Tableau Online
4. Refresh them there on a schedule
Step 2 can be done manually using Tableau Desktop's Custom SQL, as described in this help doc: https://help.tableau.com/current/pro/desktop/en-us/customsql.htm
I want to do Step 2 programmatically, because of the number of extracts needed and the time it would take by hand.

Yes, you can programmatically create extracts with the Hyper API. You then have the option of using either Tabcmd, the Tableau REST API, or the Tableau Server Client Python library to publish the extract. If you go with Python to create the extract, then in the same script you could use the Server Client to publish it. Instead of Tableau Server refreshing the extract, you would schedule your script with some sort of task scheduler, such as Windows Task Scheduler.
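For reference, here is a minimal sketch of that Python route, not a definitive implementation. It assumes a pyodbc DSN named my_warehouse, a hard-coded two-column schema, and placeholder Tableau Online credentials and project ID; replace all of those with your own:

```python
import pyodbc  # or any DB driver that can run your Custom SQL
import tableauserverclient as TSC
from tableauhyperapi import (Connection, CreateMode, HyperProcess, Inserter,
                             SqlType, TableDefinition, TableName, Telemetry)

def run_custom_sql(sql):
    # Run one of the generated SQL scripts against a placeholder DSN.
    with pyodbc.connect("DSN=my_warehouse") as db:
        return db.cursor().execute(sql).fetchall()

def build_hyper(rows, hyper_path):
    # Write the rows into a .hyper extract (the schema here is a placeholder).
    table = TableDefinition(
        table_name=TableName("Extract", "Extract"),
        columns=[
            TableDefinition.Column("id", SqlType.int()),
            TableDefinition.Column("value", SqlType.text()),
        ],
    )
    with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
        with Connection(endpoint=hyper.endpoint, database=hyper_path,
                        create_mode=CreateMode.CREATE_AND_REPLACE) as conn:
            conn.catalog.create_schema(schema=table.table_name.schema_name)
            conn.catalog.create_table(table)
            with Inserter(conn, table) as inserter:
                inserter.add_rows(rows)
                inserter.execute()

def publish(hyper_path, project_id):
    # Publish the extract to Tableau Online with the Server Client library.
    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret",
                                       site_id="your-site")
    server = TSC.Server("https://10ax.online.tableau.com",
                        use_server_version=True)
    with server.auth.sign_in(auth):
        item = TSC.DatasourceItem(project_id)
        server.datasources.publish(item, hyper_path,
                                   TSC.Server.PublishMode.Overwrite)
```

Loop over your 100+ generated SQL scripts with these three functions; PublishMode.Overwrite means each scheduled run replaces the previous extract on the server.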

Tableau can do Step 4; you just need to configure it.
Is the problem with Steps 1-3 that you already have the SQL and just need to automate that part?

Related

Schedule a script to attach a CSV file report to a data source in ServiceNow

We need a scheduled script that automatically attaches a CSV file report to a data source in ServiceNow.
How can we achieve this scenario?
Well, this can be achieved in multiple ways. Bit of a vague description you have there, so I'll just drop a few general ideas for you:
If you don't mind turning things around, you could have an external program push the file directly to ServiceNow and then run the associated TransformMap (see the sketch after this list):
https://docs.servicenow.com/bundle/orlando-platform-administration/page/administer/import-sets/task/t_PostCSVOrExcelFilesToImportSet.html
If you have an FTP, you can have a scheduled script that will fetch the file from the FTP and run the transform:
https://docs.servicenow.com/bundle/orlando-platform-administration/page/administer/import-sets/task/t_ScheduleADataImport.html
You could use the MID Server application to host your custom logic for retrieving the file data. This is probably the most complex option to set up, but it also gives you the biggest advantages, like having your file encrypted, etc. Basically, the MID Server checks every couple of seconds for a piece of code to be executed (called probes); for example, you could trigger some PowerShell script sitting on your server with it.
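For the first idea, here is a minimal sketch of posting a CSV to an import set and transforming it in one call, based on the sys_import.do endpoint described in the docs linked above. The instance name, staging table, and credentials are placeholders, and you should double-check the parameters against that page:

```python
import requests

# Placeholders throughout: your instance, your import set staging table,
# and an API user allowed to load import sets.
url = ("https://yourinstance.service-now.com/sys_import.do"
       "?sysparm_import_set_tablename=u_csv_import"
       "&sysparm_transform_after_load=true")  # run the TransformMap right away

with open("report.csv", "rb") as f:
    resp = requests.post(url, auth=("api.user", "secret"), files={"file": f})
resp.raise_for_status()
print("Import posted:", resp.status_code)
```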
I'm sure there's other options as well. Good luck!

How to run a Tableau workbook with .hyper extracts in Tableau Online?

I am familiar with Tableau development and built a report that uses two .hyper extracts and one Excel file; it runs fine with correct data. This is in the dev environment.
But the client needs to run it from Tableau Online, and I am not aware of (in fact have no knowledge of) how to run a workbook on Tableau Online with this setup.
I am trying to get answers to the below points:
1. How to refresh the Tableau .hyper extracts?
2. How to specify the Excel file location (so it picks up a new one every time the workbook is refreshed)?
3. How to refresh the Tableau workbook to show new data every time?
Currently, Tableau Online cannot reach files on a local network:
"Tableau Online in the cloud cannot reach data sources that you maintain on your local network. Depending on the connection, you might be required to publish an extract and set up a refresh schedule using Tableau Bridge."
Source: https://onlinehelp.tableau.com/current/pro/desktop/en-us/publish_datasources_about.htm
If by "Online" you mean Tableau Server, then there is a way to refresh the data from an Excel data source. Please, check this official link: https://kb.tableau.com/articles/howto/automatically-updating-data-in-server-workbook-that-uses-live-connection-to-excel
Could you give more details about your extracts? If your .hyper extracts come from a published data source, then you can refresh them easily. You just need to create a schedule for the workbook after publishing it. It is necessary to "allow refresh access" when publishing.
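As a side note, once the data source is published, a refresh can also be triggered programmatically rather than waiting for a schedule. A minimal sketch with the Tableau Server Client library; the token and the data source ID are placeholders:

```python
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", site_id="your-site")
server = TSC.Server("https://your-pod.online.tableau.com", use_server_version=True)
with server.auth.sign_in(auth):
    datasource = server.datasources.get_by_id("datasource-luid")  # placeholder ID
    job = server.datasources.refresh(datasource)  # queues an extract refresh job
    print("Refresh job", job.id, "queued")
```

Note that this only works for sources Tableau Online can actually reach; file-based sources on a local network still need Tableau Bridge, as explained above.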

Is there any way to run a Data Factory slice using PowerShell cmdlets?

Is there any way to re-run failed Azure Data Factory slices using the PowerShell cmdlets?
As of now I am re-running the slices manually from the diagram page, but this is not helping much as I have more than 500 slices, all scheduled to run every week.
Just to brief you: four days back my database server went down, all the slices failed to execute, and now I want to re-run all the slices again.
I also wanted to know if there is any way to get failure notifications: if slices fail to execute, I should be able to get an email or something so that I am notified.
Thanks in advance.
You may also try the script mentioned in the link and let us know how it goes.
For more information, you may refer to the article on monitoring and managing Azure Data Factory pipelines.
Last time a similar issue happened to me, I ended up using the "Monitor & Manage" tool from the Azure Portal.
You can use the grid view to select your failed slices, and there is a very useful "Rerun" button in the top left corner of the grid.
To get email alerts when a slice fails, you can add one using the "Metrics and operations" tool.
The setting is quite well hidden, but it exists :)

How to periodically update a table in PostgreSQL with data retrieved from a PHP API using a cron job?

I have a database in PostgreSQL in which a few tables are supposed to be updated regularly. The data is retrieved from an external API written in PHP.
Basically, the idea is to update a table of meteo data every day with the data collected from a meteo station. My primary idea is to do this job using cron, which will update the data automatically. In this case, I probably need to write the cron job in the form of a script and then run it on the server.
Being a newbie, I find it a little difficult to deal with. Please suggest the best approach.
This works pretty much as you described and does not get any simpler.
You have to:
1. Write a client script (possibly in PHP) that will pull data from the remote API. You can use the cURL extension or whatever you like (see the sketch below).
2. Make the client script update the tables. Consider saving history, not just overwriting current values.
3. Make the client script log its operation properly. You will need to know how it is doing once deployed to production.
4. Test that your script runs successfully on the server.
5. Add (or ask your server admin to add) a line to the crontab that will execute your script.
PROFIT! :)
Good luck!
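To make steps 1-3 concrete, here is a minimal sketch in Python (the same flow works in PHP). The API URL, table, columns, and credentials are all hypothetical:

```python
import requests
import psycopg2

API_URL = "https://example.com/meteo/api/latest"  # hypothetical endpoint

def main():
    readings = requests.get(API_URL, timeout=30).json()  # assume a list of dicts
    conn = psycopg2.connect("dbname=meteo user=meteo_user")  # placeholder DSN
    with conn, conn.cursor() as cur:
        for r in readings:
            # Insert every reading instead of overwriting, so history is kept.
            cur.execute(
                "INSERT INTO meteo_readings (station_id, measured_at, temperature)"
                " VALUES (%s, %s, %s) ON CONFLICT DO NOTHING",
                (r["station_id"], r["measured_at"], r["temperature"]),
            )
        print("Imported", len(readings), "readings")  # minimal logging (step 3)
    conn.close()

if __name__ == "__main__":
    main()
```

For step 5, a crontab line such as
0 6 * * * /usr/bin/python3 /opt/meteo/update_meteo.py >> /var/log/meteo_update.log 2>&1
would run it daily at 06:00 and append its output to a log file.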

Live/Extract: How to change a Tableau data source connection from live to extract on the server?

I created a Tableau report with a custom SQL query and used a live connection to connect to my database. I have published the report. Is there a way to change the connection to an extract now, via the browser?
As far as I know, there is no such option. You have to create a local extract first and then publish it to the server.
I didn't find a way to publish without an extract and then change the connection from live to extract online.
As a workaround, I created an empty extract and populated it online.
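For anyone trying the empty-extract workaround, a minimal sketch with the Hyper API; the file name and two-column schema are placeholders for whatever schema your workbook expects:

```python
from tableauhyperapi import (Connection, CreateMode, HyperProcess,
                             SqlType, TableDefinition, TableName, Telemetry)

# Create a .hyper file with the right schema but no rows.
table = TableDefinition(
    table_name=TableName("Extract", "Extract"),
    columns=[
        TableDefinition.Column("id", SqlType.int()),
        TableDefinition.Column("value", SqlType.text()),
    ],
)
with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint, database="empty_extract.hyper",
                    create_mode=CreateMode.CREATE_AND_REPLACE) as conn:
        conn.catalog.create_schema(schema=table.table_name.schema_name)
        conn.catalog.create_table(table)
```

Publishing that file and then populating it on the server work as described above.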