Integration and data upload - MuleSoft CloudHub to Tableau Server - tableau-api

I am using MuleSoft CloudHub with Runtime v4.4 to upload CSV data to Tableau Server. The documentation page https://docs.mulesoft.com/tableau-specialist-connector/1.1/ confirms that it is not possible to use the Hyper configuration and its operations in CloudHub, because CloudHub does not allow executing external code for security reasons. That is why I am trying to use the REST configuration and its available operations.
So far I am able to connect to Tableau Server and successfully perform simple operations such as Initial file upload, Query project, and Append to file upload, but these operations do not help me publish my CSV content to Tableau. The Publish workbook operation requires file content in *.twbx format, and I am not sure how to convert CSV/JSON/XML to the TWBX content type.
I have looked at some websites and MuleSoft technical videos where an HTTPS connection is used to upload data to Tableau, but those examples use a *.hyper file from the classpath.
So, basically, I am stuck on two different paths:
1. How can I transform CSV content into Hyper file content in a Mule flow? If this transformation is possible, then I can use an HTTPS connection to Tableau and upload my data (a sketch of the conversion, run outside CloudHub, follows below).
2. Using the MuleSoft Tableau connector v1.1 with the REST configuration, is it possible to upload data to Tableau?
If there is any other solution, I am happy to change my implementation strategy. Can somebody please point me in the right direction?
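
For reference, outside CloudHub the CSV-to-Hyper conversion is typically done with the Tableau Hyper API. Below is a minimal Python sketch, assuming the tableauhyperapi package and a made-up two-column schema, of a conversion that could run in an external service the Mule flow calls over HTTPS:

```python
# Minimal sketch: convert a CSV file into a .hyper extract with the
# Tableau Hyper API (pip install tableauhyperapi). This cannot run inside
# CloudHub itself; it would have to live in an external service.
import csv

from tableauhyperapi import (
    Connection, CreateMode, HyperProcess, Inserter,
    SqlType, TableDefinition, TableName, Telemetry,
)

def csv_to_hyper(csv_path: str, hyper_path: str) -> None:
    # Hypothetical fixed schema; a real flow would derive the columns
    # from the CSV header instead of hard-coding them.
    table = TableDefinition(TableName("Extract", "Extract"), [
        TableDefinition.Column("id", SqlType.text()),
        TableDefinition.Column("value", SqlType.text()),
    ])
    with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
        with Connection(hyper.endpoint, hyper_path,
                        CreateMode.CREATE_AND_REPLACE) as conn:
            conn.catalog.create_schema("Extract")
            conn.catalog.create_table(table)
            with Inserter(conn, table) as inserter:
                with open(csv_path, newline="") as f:
                    reader = csv.reader(f)
                    next(reader)  # skip the header row
                    inserter.add_rows(reader)
                inserter.execute()
```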

Related

Automatically pulling REST API data to visualize it in Apache Superset

I work in a large enterprise and have a project to build custom automated dashboards for our IT department; the small amount of data needs to be fetched only from REST API endpoints. The process needs to be fully automated, and there is not enough time to build a custom API wrapper. For this I was going to use Apache Airflow + Apache Superset. I have been googling for a couple of days for an open-source solution simpler than Apache Airflow to move data from the REST API endpoints into Superset for visualization. Please share your experience: what would you choose instead of Apache Airflow?
I chose to go with the following solution:
Apache Airflow + PostgreSQL + Grafana (instead of Superset, because in Grafana you can actually create a drill-down option using a workaround).
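
For anyone weighing the same options, here is a minimal sketch of that pipeline as an Airflow DAG; the endpoint URL, table name, and connection id are placeholders I made up:

```python
# Minimal sketch: pull a REST endpoint on a schedule and land the rows in
# PostgreSQL, where Grafana can query them. Requires the
# apache-airflow-providers-postgres package.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

def fetch_and_load():
    # Placeholder endpoint returning a JSON list of {"name": ..., "value": ...}.
    rows = requests.get("https://example.com/api/metrics", timeout=30).json()
    hook = PostgresHook(postgres_conn_id="metrics_db")  # assumed Airflow connection
    hook.insert_rows(
        table="it_metrics",
        rows=[(r["name"], r["value"]) for r in rows],
        target_fields=["name", "value"],
    )

with DAG(
    dag_id="rest_to_postgres",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    PythonOperator(task_id="fetch_and_load", python_callable=fetch_and_load)
```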

Copy List & Label server content from DEV to QA to PROD

I need to reproduce List & Label content in different environments.
I tried enabling the List & Label API and using its methods, but the input parameters require an existing ParentFolderID, etc., which needs to be explicitly created on the server.
I am new to the List & Label Report Server.
My requirement is to create all reports (data sources, etc.) on the DEV server.
Afterwards, for QA and PROD, I need to be able to 'export' all existing reports (data sources, etc.) from DEV.
My background is more in Oracle BI Publisher.
On a BIP server I have an option to download (zip) all the contents of a server folder (where I keep all the reports), and I can then log in to another BIP server and upload the zipped file from the original server.
This way I have exactly the same reports on all servers.
Is there any feature like this in the List & Label Report Server?
Essentially, I need a kind of report installer for the List & Label Report Server.
Regards,
ccarvalho
Currently, there is no such feature. The suggested procedure would be to copy the entire database from Dev to Production if you're working with the Report Server on both ends. Would that work for you?
Other than that, if you're designing your reports from a desktop app and want to transfer them to the Server, you could use the REST API of the Report Server. Here is a blog post covering the basics of this API. The setup also contains a sample ("C# Client API Sample").
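
For illustration only, here is a rough Python sketch of what pushing a locally designed report through that REST API could look like; the base URL, credentials, and endpoint path are hypothetical placeholders, so check the blog post and the C# sample for the actual routes:

```python
# HYPOTHETICAL sketch: upload a List & Label project file to the Report
# Server over its REST API. The endpoint path below is a placeholder,
# not a documented route.
import requests

BASE = "https://reportserver.example.com"  # assumed server URL
session = requests.Session()
session.auth = ("user", "password")        # assumed authentication scheme

with open("MyReport.lst", "rb") as f:      # a report design from the desktop app
    resp = session.post(
        f"{BASE}/api/reportdefinitions",   # placeholder endpoint
        files={"file": ("MyReport.lst", f)},
        timeout=30,
    )
resp.raise_for_status()
```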

Get file names via the Pentaho REST API

I am using Pentaho BI Server Community Edition. I would like to integrate Pentaho reports into my Angular 2 web application. I put the reports in a folder on the Pentaho server, and I would like to read the file names from this folder.
https://help.pentaho.com/Documentation/7.1/0R0/070/010
As far as I understand your question, the answer is in the File Management/Directory Resources section.
Don't forget to authenticate.
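
A minimal Python sketch of that call; the folder path :public:reports and the credentials are placeholders, and note that path segments are separated by colons in the URL:

```python
# Minimal sketch: list the file names in a Pentaho repository folder via
# the Directory Resources REST endpoint.
import xml.etree.ElementTree as ET

import requests

resp = requests.get(
    "http://localhost:8080/pentaho/api/repo/files/:public:reports/children",
    auth=("admin", "password"),          # basic auth; use real credentials
    headers={"Accept": "application/xml"},
    timeout=30,
)
resp.raise_for_status()

# The XML response contains one entry per repository item, each with a
# <name> element; collect them all (folders included).
names = [el.text for el in ET.fromstring(resp.content).iter("name")]
print(names)
```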

Live connection with web data connector

Is it possible to use a live connection with the web data connector in Tableau, or do I always have to build an extract? Currently I am using the trial version, and in that version the live connection option is greyed out.
The web data connector always creates an extract. See this link for more details: https://onlinehelp.tableau.com/current/api/wdc/en-us/help.htm#WDC/wdc_phases.htm

Create a Hadoop cluster connection in Talend without opening the IDE

I am trying to create a "one click solution" with a Hadoop cluster, an Ambari server, and Talend, via Apache Brooklyn in the cloud.
I can create all of these components, but now I have to connect them.
I am able to create the project/connection between the Ambari server and Talend manually: I have the URL of the Ambari server, so I can open Talend and create the connection to the Hadoop cluster using Talend's wizard.
The question is: is there any way to do this without opening Talend, i.e., by manually creating the files that are needed and placing them in the corresponding folders?
If so, which files would I need to create, and what would their content be?
I'm not familiar with Talend, but a few Google searches, as well as this answer, suggest that Talend Open Studio does not come with a REST API.
As for a configuration file, I could not find any results, so my conclusion is that this cannot be automated.
When you think about it, this actually makes sense, as Talend Open Studio is meant to be a graphical visualization tool for building complex jobs.