I have a Tableau workbook that uses Excel as a data source. The Excel file contains lots of formulae and takes input parameters from cells on a particular worksheet. The issue is that I want to enter these parameters through Tableau, have the Excel file refreshed on the back end, and show its output in Tableau. Any suggestions on how I can accomplish this? Thanks much in advance.
You can create parameters in Tableau. The Excel calculations that use these parameters would also need to move into Tableau as calculated fields. Basically, you would pull the raw Excel data into Tableau and perform the calculations there using your Tableau parameters.
I have a workbook with datasources blended and calculated fields referencing these datasources.
I get an error alert when publishing to Tableau Server.
How do I publish the data sources?
I use Tableau 2019.
Create an extract of the blended data source, and publish that extract to the server. This extract will include all the calculated fields that are in the current workbook.
Generally, for this type of use case, I suggest publishing your data extracts separately. They can contain your calculated fields. Publishing the calculated fields is actually more performant anyway, since many of them become materialised.
You can connect to both server data sources in desktop, and perform the data blending within the Tableau workbook using the published data sources. You can create additional calculated fields, including those from data blends, within the workbook. You're then able to publish the workbook and those calculated fields will remain at the workbook level, not the data source.
Other users connecting to the data sources will only see the calculations published into the extract, not the calculations published with the workbook.
Hope that makes sense.
I have been using spark-excel (https://github.com/crealytics/spark-excel) to write output to a single sheet of an Excel file. However, I am unable to write output to different sheets (tabs).
Can anyone suggest any alternative?
Thanks,
Sai
I would suggest splitting the problem into two phases:
1. save the data into multiple CSV files using multiple Spark flows
2. write an application that converts the CSV files into a single Excel workbook, using e.g. this Java library: http://poi.apache.org/
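The two-phase split above can be sketched in Python. Phase 1 is shown with the standard csv module standing in for the Spark flows (in the real pipeline each Spark job would write one file); phase 2, combining the CSVs into one workbook with separate tabs, would then be a small program using Apache POI (or openpyxl in Python). The function name and sample data here are hypothetical:

```python
import csv
import os
import tempfile

def write_sheet_csvs(sheets, out_dir):
    """Phase 1: write one CSV per intended Excel sheet.

    'sheets' maps a sheet name to a list of rows. Phase 2 would read
    these files back and write each one to its own tab of a single
    workbook with POI or openpyxl.
    """
    paths = []
    for name, rows in sheets.items():
        path = os.path.join(out_dir, f"{name}.csv")
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(rows)
        paths.append(path)
    return paths

# Usage: two datasets destined for two tabs of one workbook.
out_dir = tempfile.mkdtemp()
paths = write_sheet_csvs(
    {"sales": [["id", "amount"], [1, 100]],
     "costs": [["id", "amount"], [1, 40]]},
    out_dir,
)
```

Keeping the sheet name in the file name means phase 2 needs no extra metadata to decide which tab each file belongs to.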
In my company we have 1K+ Tableau workbooks, all using the same Vertica data source via a multiple-table connection or custom SQL. Often we end up in a situation where reports stop working because the underlying data source was changed: a table renamed, a field removed, etc.
My question is: how can we proactively react to these changes?
Can we edit the source code of the Tableau workbooks to batch-replace deprecated query parts?
Or can we monitor which data tables are used in a workbook, with or without parsing the workbook's source code, to create an alert system?
Thanks
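As a starting point for the parsing route: a .twb workbook is XML, so the tables it references can be collected with the standard library. This is only a sketch under assumptions; the exact schema varies by Tableau version, and it assumes the common layout where table relations carry a `table` attribute and custom SQL appears as a `relation` of type `text`. The function name and sample XML are hypothetical:

```python
import xml.etree.ElementTree as ET

def referenced_tables(twb_xml):
    """Collect table names and custom-SQL text from .twb XML."""
    root = ET.fromstring(twb_xml)
    tables, custom_sql = set(), []
    for rel in root.iter("relation"):
        if rel.get("table"):
            tables.add(rel.get("table"))
        elif rel.get("type") == "text" and rel.text:
            # Custom SQL: keep the query so it can be scanned
            # for deprecated table or column names.
            custom_sql.append(rel.text.strip())
    return tables, custom_sql

sample = """
<workbook>
  <datasources>
    <datasource name='vertica_ds'>
      <connection class='vertica'>
        <relation type='table' name='orders' table='[public].[orders]'/>
        <relation type='text' name='custom'>SELECT id FROM public.customers</relation>
      </connection>
    </datasource>
  </datasources>
</workbook>
"""
tables, sql = referenced_tables(sample)
```

Run nightly over all workbook files and diffed against the database catalog, something like this could flag reports that reference renamed or dropped objects before users hit them.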
Is there a way to automatically generate stored procedures out of the t-sql in the datasets found in an ssrs project?
There's not an automatic way to do this. You might be able to cobble something together to do it, though. SSRS reports are in an XML format, and the datasets are in a `<DataSets>` element.
Unfortunately, I don't know how helpful it would be since the parameters would need to be resolved.
Someone created a PowerShell script to retrieve the dataset definitions from a report, if you want to try to automate something. I think it would still need manual work to convert them, especially if they use parameters or have calculated fields.
https://ask.sqlservercentral.com/questions/94491/retrieve-dataset-definitions-from-ssrs-report.html
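The extraction half can also be sketched without PowerShell, since an .rdl file is plain XML. A minimal Python sketch, matching on local element names because the RDL namespace differs between SSRS versions (the function name, sample namespace, and sample query below are assumptions for illustration):

```python
import xml.etree.ElementTree as ET

def dataset_queries(rdl_xml):
    """Map each DataSet's Name to its CommandText (the T-SQL)."""
    def local(tag):
        # Strip the '{namespace}' prefix ElementTree puts on tags.
        return tag.rsplit("}", 1)[-1]

    root = ET.fromstring(rdl_xml)
    queries = {}
    for elem in root.iter():
        if local(elem.tag) == "DataSet":
            name = elem.get("Name")
            for sub in elem.iter():
                if local(sub.tag) == "CommandText":
                    queries[name] = (sub.text or "").strip()
    return queries

sample = """<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition">
  <DataSets>
    <DataSet Name="Orders">
      <Query>
        <DataSourceName>Main</DataSourceName>
        <CommandText>SELECT OrderID FROM dbo.Orders</CommandText>
      </Query>
    </DataSet>
  </DataSets>
</Report>"""
queries = dataset_queries(sample)
```

This only gets you the raw T-SQL per dataset; turning each query into a stored procedure, and converting report parameters into procedure parameters, would still be the manual part.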
I am trying to export 5.5M rows from PowerPivot (Excel 2010) into Access 2010 as a table. Do you know how I can do that?
Also, Access has no option for importing data from PowerPivot, and I do not have Power Query in the Excel 2010 that I am using.
Please Help!!
If you had Excel 2013, you could use VBA to write directly from PowerPivot to Access. And Power Query would give you an easy alternative: write the PowerPivot data to Excel sheets in chunks, then use Access to import it piece by piece. Power Query is free to download and works with Excel 2010.
Without those, I think you should use a flattened pivot table to materialize your PowerPivot data in Excel. Obviously you will have to do this in chunks because of the 1M-row sheet limit in Excel, and likely resource constraints too. So you could add a filter and iterate through your dataset. This could be handled by VBA if it needs to be a rerunnable process, but it wouldn't be pretty.
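The chunking arithmetic behind that advice can be sketched as a plain script, here in Python as a stand-in for the VBA loop, assuming the data can first be dumped to a row stream: each part file stays under the Excel sheet limit and Access imports the parts one by one. The function name and the demo chunk size are hypothetical:

```python
import csv
import itertools
import os
import tempfile

def export_in_chunks(rows, out_dir, chunk_size=1_000_000):
    """Write rows into numbered CSV part files, each under the
    1,048,576-row Excel sheet limit (chunk_size leaves headroom)."""
    rows = iter(rows)
    paths = []
    for part in itertools.count(1):
        chunk = list(itertools.islice(rows, chunk_size))
        if not chunk:
            break
        path = os.path.join(out_dir, f"part_{part:03d}.csv")
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(chunk)
        paths.append(path)
    return paths

out_dir = tempfile.mkdtemp()
# 25 demo rows with a chunk size of 10: expect part files of 10, 10, 5.
paths = export_in_chunks(([i, i * 2] for i in range(25)), out_dir, chunk_size=10)
```

At the default chunk size, 5.5M rows would come out as six part files, which matches the "filter and iterate" approach above: each chunk is one pass through the flattened pivot.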