We are deploying Tableau for a bank.
We created six test dashboards using dummy data on a staging database over a SQL connection; let's say it has the IP 10.10.10.10.
Now we need to use the same views we built on the dummy data against LIVE data, through a different connection that is again a SQL engine, with an IP of, say, 20.20.20.20. All the variable names and other properties are the same; the only difference is that the LIVE data will not have the calculated fields, which we can deploy in the LIVE environment.
The challenge is that the LIVE data, being a bank's, is highly confidential and cannot be accessed from outside the operations site; we have to deploy from an ODC [a restricted environment]. Hence we cannot simply do a replace-data-source.
So we are planning to move the TWBX files and data extracts for each of these views to the ODC via a shared folder. The process would then be:
1. Since the LIVE SQL database differs from the dummy one, Tableau will raise a connection error.
2. We will select Edit Data Connection.
3. We will select the Tableau data extract for each sheet and dashboard.
4. We will then choose the Replace Data Source option and select the LIVE SQL database.
5. We will extract the new data.
6. The visualizations should then work fine.
Earlier we had moved only the TWBX files, which is why it failed. Is there a different approach to this?
I did something similar. For that, you must:
- have the same schema in the LIVE database as in the dummy database;
- not change the name of any source table or column.
Create your viz, then send it as a .twb file, which is an editable XML format.
Now the hard part: open your .twb in Notepad and replace all the connection details with the new ones.
Save it and open it in Tableau.
Let me know if it doesn't work.
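If there are many connection entries, that find-and-replace can be scripted. A rough sketch in Python (the hostnames and file names are placeholders; a .twb is plain XML, where each connection is stored as a connection element with a server attribute, so any XML tooling will do):

import xml.etree.ElementTree as ET

# Point every connection that still references the dummy host at the LIVE host.
tree = ET.parse('dashboard.twb')
for conn in tree.iter('connection'):
    if conn.get('server') == '10.10.10.10':  # dummy/staging server
        conn.set('server', '20.20.20.20')    # LIVE server
tree.write('dashboard_live.twb', xml_declaration=True, encoding='utf-8')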
One method would be to modify the hosts file on your local computer, pointing the production database server name at the staging instance. For example, let's say your production database is prod.url.com and you have a staging reporting db server instance called reportstage.otherurl.com.
Open your hosts file and add an entry for prod.url.com pointing at reportstage.otherurl.com's IP address (hosts entries map names to IP addresses; see the example entry after these steps).
Develop the report in Desktop, with the db connection string to prod.url.com.
When you publish the twb file to Server, no connection string changes are needed.
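For example, if reportstage.otherurl.com resolved to the placeholder address 192.0.2.15, the hosts entry would be:

192.0.2.15    prod.url.com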
Another, easier way is to publish the twb to Server with your staging connection string and then edit the connection string in the data source on Server.
Develop the twb file on your local computer against your staging database.
Publish the twb file to Server.
Go to the workbook on Server and instead of looking at the views, click on Data Sources.
Edit the data source(s) connection information. This allows you to edit the server name, port, username, or password.
I've used this second method quite a bit. We have an environment where we can't hit the production db outside of the data center. Our staging environment doesn't have that restriction. We develop against the stage db, deploy, and edit the server name in the data source.
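That manual edit can also be automated. A minimal sketch using the tableauserverclient Python library (the server URL, credentials, and hostnames are placeholders):

import tableauserverclient as TSC

auth = TSC.TableauAuth('admin', 'password')
server = TSC.Server('https://tableau.example.com', use_server_version=True)

with server.auth.sign_in(auth):
    # Repoint every published data source that still targets the staging host.
    for ds in TSC.Pager(server.datasources):
        server.datasources.populate_connections(ds)
        for conn in ds.connections:
            if conn.server_address == 'reportstage.otherurl.com':
                conn.server_address = 'prod.url.com'
                server.datasources.update_connection(ds, conn)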
Hi, I would just like to ask if this is possible. I am currently working in ADF. What I want to do is get work items from analytics.dev.azure.com/[Organization]/[Project] and copy them to a SQL Database. I am already doing this for one project, but I want to do it for multiple projects without creating multiple Copy tasks within ADF, just running a Lookup into a ForEach to iterate through all the team analytics URLs. Is there any way to do this?
We can use a Lookup and a ForEach activity to copy data into SQL DB tables from all the URLs. Below are the steps (a JSON sketch of the ForEach activity follows them):
1. Create a lookup table that contains the entire list of URLs.
2. In the ForEach activity's settings, enter the following in Items to pick up the output of the Lookup activity (ADF expressions start with @):
@activity('Lookup1').output.value
3. Inside the ForEach activity, use a Copy activity.
4. In the source, create a dataset and an HTTP linked service. Enter the base URL and the relative URL. I have stored the relative URLs in the lookup table, so I put @{item().url} in the relative URL.
5. In the sink, create an Azure SQL Database table for each item in the ForEach activity, or use existing tables, and copy the data into those tables.
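For reference, a rough sketch of how the ForEach activity looks in the pipeline JSON (activity names here are placeholders):

{
    "name": "ForEachProjectUrl",
    "type": "ForEach",
    "dependsOn": [ { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] } ],
    "typeProperties": {
        "items": { "value": "@activity('Lookup1').output.value", "type": "Expression" },
        "activities": [ { "name": "CopyToSqlDb", "type": "Copy" } ]
    }
}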
How can we make a Tableau dashboard templated? We would like to create just the template/wireframe of our reports, and as clients request them we should be able to fetch that specific client's data, generate the report, and display it to the client via Tableau embedded in the web.
There isn't a good way to do this, but there are some hacky workarounds.
Option 1: Separate DB Servers for each Client, Same Schema
If each client has a separate database server with the same schema, you can use the Tableau Server REST API to duplicate the workbook and data source for each client, then use the Update Data Source Connection endpoint to point the data source at the new client's database server.
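A rough sketch of that flow with the tableauserverclient Python library (IDs, hostnames, and paths are placeholders):

import tableauserverclient as TSC

auth = TSC.TableauAuth('admin', 'password')
server = TSC.Server('https://tableau.example.com', use_server_version=True)

with server.auth.sign_in(auth):
    # 1. Duplicate the template workbook into the client's project.
    path = server.workbooks.download('template-workbook-id', filepath='tmp')
    item = TSC.WorkbookItem(project_id='client-project-id')
    item = server.workbooks.publish(item, path, mode=TSC.Server.PublishMode.CreateNew)

    # 2. Point its connections at the client's database server.
    server.workbooks.populate_connections(item)
    for conn in item.connections:
        conn.server_address = 'client-db.example.com'
        server.workbooks.update_connection(item, conn)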
Option 2: Same Database Server and Schema
Create a column in your database table named 'client' and set it to the client's ID or client's name in all of your rows
Create a parameter in your Tableau workbook named "Client"
When connecting to the database and table in Tableau, you can use a custom SQL statement such as:
SELECT * FROM table WHERE client=<Parameters.Client>
Once you have the workbook loaded, you can use the JS API method Workbook.changeParameterValueAsync() to set the Client parameter to the appropriate client ID.
This has some critical security issues: If the user is able to figure out the client ID of another client, they can get their data. They can also brute force this by calling changeParameterValueAsync themselves.
In our current setup, we have a separate project for each customer. The customers all have the same dashboards but different data sources. Whenever we change a dashboard of one customer, we have to replicate the changes to all the other customers. Currently, this is a manual process, where we re-upload the changed workbook for each customer and replace the connected data sources.
I now want to automate that process using Tableau's REST API in combination with their Document API. For the data sources, we are not using hyper files; instead we have live connections to a database.
This is my current implementation:
import tableauserverclient as TSC
from tableaudocumentapi import Workbook

tableau_auth = TSC.TableauAuth('admin', 'tableau-admin')
server = TSC.Server('http://localhost:8080')
with server.auth.sign_in(tableau_auth):
    # download original workbook
    server.workbooks.download(workbook_id=source_work_book_id, filepath="tmp", include_extract=False)
    source_wb = Workbook('tmp/source.twbx')
    # all published data sources on the server
    datasources = list(TSC.Pager(server.datasources))
    # get datasources which have connections
    wb_datasources = [s for s in source_wb.datasources if len(s.connections) != 0]
    # mappings to map datasource from source to target datasources
    # (all the datasources have the same name but are in different projects)
    dbname_to_datasource = {d.content_url: d.name for d in datasources if d.project_id == source_project_id}
    source_to_target = {d.name: d.content_url for d in datasources if d.project_id == target_project_id}
    # overwrite source datasources
    for datasource in wb_datasources:
        datasource.connections[0].dbname = source_to_target[dbname_to_datasource[datasource.connections[0].dbname]]
    source_wb.save_as("tmp/target.twbx")
    # write back to tableau server
    new_workbook = TSC.WorkbookItem(target_project_id)
    server.workbooks.publish(new_workbook, file_path="tmp/target.twbx", mode=TSC.Server.PublishMode.CreateNew)
I get the following error when uploading it:
400011: Bad Request
There was a problem publishing the file 'target.twbx'.
It doesn't even work if I re-upload the workbook unmodified. If I change the download to include the extract file I can upload the unmodified workbook successfully:
server.workbooks.download(workbook_id=source_work_book_id, filepath=tmp_dir, include_extract=True)
The only problem with that is that I'm now adding unnecessary data and it also doesn't solve my issue of how to replace the data sources.
Does anyone have an idea what I'm doing wrong or is there an alternative way to do this?
I need to re-create a database from one DashDB Bluemix service in another, and I need to automate this procedure in bash scripts.
The best I can think of is a DashDB REST API that lets me export the content of the entire database in JSON format (or any other format you can think of), and a corresponding API that lets me re-import the content into a different database on the same service or on a different service, possibly in a different Bluemix space. Thanks.
I assume you want to do a one-time move and this is not about continuous replication. In that case, simply sign up at http://datascience.ibm.com, navigate to DataWorks, select "Load Data" from the navigation panel (open it by clicking top left) and then select Cloud Database as the source type.
(Screenshot: DataWorks loading data from dashDB to dashDB.)
If, however, you would still prefer to write your own app or script to move the data, and you want a REST API that exports JSON, then I recommend writing a simple R script that reads the data from a table (using ibmdbR) and writes it to stdout, deploying that script into dashDB (POST /home), and running it from your app/script via the /rscript endpoint: https://developer.ibm.com/clouddataservices/wp-content/themes/projectnext-clouddata/dashDB/#/
For Db2 on Cloud and Db2 Warehouse on Cloud, there is a REST API that lets you export data from a table in CSV format (up to 100,000 rows) and then load the data back. It takes a few requests:
POST /auth/tokens
GET /sql_query_export
POST /home_content/{path}
POST /load_jobs
GET /load_jobs/{id}
I've implemented a client npm module for this API, db2-rest-client, and you can export a statement result to a JSON file as:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<SOURCE_HOSTNAME>/dbapi/v3'
db2-rest-client query --query="SELECT * FROM SRC_SCHEMA.SRC_TABLE" > test.json
Then you can transform that data into a .csv file and use the load job:
export DB_USERID='<USERID>';export DB_PASSWORD='<PASSWORD>';export DB_URI='https://<TARGET_HOSTNAME>/dbapi/v3'
db2-rest-client load --file=transformed.csv --table='DEST_TABLE' --schema='DEST_SCHEMA'
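For the transformation step in between, a minimal Python sketch (it assumes the export in test.json is a JSON array of flat row objects; adjust to the actual shape of your result):

import csv, json

# Read the exported rows and rewrite them as CSV for the load job.
with open('test.json') as f:
    rows = json.load(f)

with open('transformed.csv', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)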
Using .NET, I'd like to connect to an EA model from an external application.
If I have more than one EA model open -- e.g. two remote SQL Server hosted models -- how do I specify extracting data from only one of them?
// Is there any way to specify a specific data source?
var r = new EA.Repository();

// I don't think OpenFile is what I want, because:
// 1) I don't want to open a document -- just connect to it
// 2) I don't have a filename -- just a model name and/or connection string...
bool isOpen = r.OpenFile("C:/Sparx-EA/Test Project.EAP");

// etc.
EA.Element ele = r.GetElementByID(10);
Thank you!
This is documented in the manual:
OpenFile (string Filename) Boolean

Notes: This is the main point for opening an Enterprise Architect project file from an automation client, and working with the contained objects. If the required project is a DBMS repository, and you have created a shortcut .eap file containing the database connection string, you can call this shortcut file to access the DBMS repository. You can also connect to a SQL database by passing in the connection string itself instead of a filename. A valid connection string can be obtained from the Open Project dialog by selecting a recently opened SQL repository.

Parameters: Filename: String - the filename of the Enterprise Architect project to open
Repository.OpenFile() is the correct method to use. You can pass it a connection string; it doesn't have to be a file.
A Repository object can only be connected to one EA project at a time. So in order to connect to two EA projects in the same process, create two Repository objects.
Finally, the numeric identities used in calls like Repository.GetElementByID() are valid only within a repository. The number 10 likely refers to two different elements in two repositories -- or might have been deleted in one of them. If you know you've got the same element in two repositories, use Repository.GetElementByGuid() instead.
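To illustrate the pattern (sketched in Python over COM for brevity; the same calls exist on the .NET interop API, and the connection strings and GUID below are placeholders):

import win32com.client  # pip install pywin32

# One Repository object per model; each connects to a single project at a time.
repo_a = win32com.client.Dispatch('EA.Repository')
repo_b = win32com.client.Dispatch('EA.Repository')

# OpenFile accepts a DBMS connection string (copied from the Open Project
# dialog, as the manual describes) in place of an .eap path.
repo_a.OpenFile('connection-string-for-model-A')
repo_b.OpenFile('connection-string-for-model-B')

# GUIDs are stable across repositories, unlike numeric element IDs.
guid = '{A1B2C3D4-0000-0000-0000-000000000001}'
ele_a = repo_a.GetElementByGuid(guid)
ele_b = repo_b.GetElementByGuid(guid)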