Facing problems while moving workbooks between solution environments (Development, Production, etc.) in Tableau Server? - tableau-api

I am working on Tableau Server, where I have, say, three solution environments: Development, Staging, and Production.
I have created workbooks in the Development environment, and each workbook uses many data sources.
For instance, let's say I created the workbook Account Workbook using the two data sources Dev-DataSource1 and Dev-DataSource2. I have other data sources available, like Stage-DataSource1, Stage-DataSource2, Dev-DataSource3, Stage-DataSource3, etc.
The problem I am facing is that when moving workbooks from the Development to the Staging environment, the data sources remain the same.
So, continuing with the same example: if I move Account Workbook from Development to Staging, it still uses the same data sources, Dev-DataSource1 and Dev-DataSource2. I want it to use Stage-DataSource1 and Stage-DataSource2 instead.
This might be a very trivial question, or maybe a bad design, but I have created so many workbooks that recreating them would take a huge amount of time, so please refrain from suggesting that I recreate them. If there really is no other way, then do tell me.
Thanks in advance

Can you not just edit the data sources, for instance from Dev-DataSource1 to Stage-DataSource1?
I assume they have the exact same structures, field names, etc.?
Right-click on the data source and click "Edit". On the data source page, pull in the table(s) from the new data source and replace the one(s) that are there. If you have joins, you may need to recreate them. If you have custom SQL, you just need to change your SQL to point to the new table(s).
You will then need to refresh your data sources. Tableau should recognize the fields, assuming they are exactly the same as before.
If you need to edit the connection, you can do that as well (e.g. if you need to point to a different server).
Make sure you have a backup of your workbook in case something goes wrong.

It may also be worth your time to look into the Tableau Python Document API: https://github.com/tableau/document-api-python
This would allow you to do some (or all?) of what you are describing programmatically.
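For example, here is a minimal sketch using that Document API to retarget a workbook's connections from dev to staging, assuming the two databases share a schema; the file names, host names, and Dev-/Stage- naming convention are taken from the question or made up:

    # A rough sketch using the Document API (pip install tableaudocumentapi).
    # File names and host names below are hypothetical.
    from tableaudocumentapi import Workbook

    wb = Workbook('Account Workbook.twb')
    for datasource in wb.datasources:
        for connection in datasource.connections:
            if connection.server == 'dev-db.example.com':    # dev server (assumed)
                connection.server = 'stage-db.example.com'   # staging server (assumed)
                if connection.dbname:
                    # Assumes the Dev-/Stage- naming convention from the question
                    connection.dbname = connection.dbname.replace('Dev-', 'Stage-')
    wb.save_as('Account Workbook (Staging).twb')

The rewritten .twb can then be published to the Staging environment as usual.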

Related

How to optimize deployment strategy for Tableau dashboards?

We have two environments (TEST and PROD) to publish dashboards to. The data for those environments resides on the same RDBMS server, one schema per environment. We want to publish one dashboard to both environments, so that one copy looks at the TEST schema and the other looks at the PROD schema.
That's the approach we came up with, but it's not working as expected: we cannot find a way to deploy a Tableau dashboard to two environments (using two separate schemas) without actually maintaining two versions of the same dashboard (one for TEST and one for PROD).
So either we chose a bad deployment strategy for Tableau, or we're doing something wrong with it. Could anybody please share your experience on how to deploy Tableau dashboards to separate environments without having to manually edit the dashboard for every environment? Or what's the problem with our approach?
You didn't specify your RDBMS in the question, so my answer will make a broad statement that may not necessarily apply in your specific case.
I think you've chosen a poor approach for your testing and production database environments. Normally, test and prod are on completely separate database instances and, in many cases, separate servers. Having them together on the same instance or server means you are using up production hardware resources during testing, with the possibility of locking up the server because of runaway queries against the non-production schema. With a single instance, it would be very easy to bring your production instance down inadvertently or maliciously, without even being logged into the production schema.
A better approach is to have two separate database instances: one for test, one for prod. The two instances could be on the same server, but they are isolated DB instances. Both instances have the same users and the same schemas. Then you can easily point your Tableau data sources to the respective instance without needing to update your workbooks or keep two copies of each workbook. For example, if you are using Oracle, keep the TNS aliases the same on both Tableau servers but alter the connection details in tnsnames to point to either the test or the prod DB server.
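To illustrate the Oracle case, the tnsnames.ora on each Tableau server could define the same alias while pointing at a different host. This is a hypothetical entry (the alias, host, and service names are made up); the prod server's copy would be identical except for HOST:

    # Hypothetical tnsnames.ora entry on the TEST Tableau server.
    APPDB =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = test-db.example.com)(PORT = 1521))
        (CONNECT_DATA =
          (SERVICE_NAME = APPDB)
        )
      )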
Your situation sounds like mine: I need to publish the exact same report to different environments using different server/database configurations. This is what I do:
Maintain a single source of truth, which is the production version.
Create a utility that converts the report for the other environments. You can also do it manually, as follows:
Save your workbook (Tableau file) as .twb.
Open the .twb file with any text editor. It is XML-like; Tableau creates the connections and renders the visuals from this information.
Search the text for username, server, port, etc. You will see all your configuration information there.
Replace that information with the target environment's information.
Save the file, open it again using Tableau Desktop, and publish.
If you have an automation tool like Jenkins, you can develop a one-click deployment using Tableau's command-line tools (e.g. tabcmd).
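A minimal sketch of such a conversion utility in Python, treating the .twb as plain text exactly as in the manual steps above; the file names and connection values here are hypothetical:

    # Hypothetical conversion utility: a .twb is plain XML, so the
    # connection details can be swapped with simple text replacement.
    REPLACEMENTS = {
        'prod-db.example.com': 'test-db.example.com',  # server (assumed values)
        'prod_user': 'test_user',                      # username (assumed values)
    }

    with open('report-prod.twb', encoding='utf-8') as src:
        twb = src.read()
    for prod_value, test_value in REPLACEMENTS.items():
        twb = twb.replace(prod_value, test_value)
    with open('report-test.twb', 'w', encoding='utf-8') as dst:
        dst.write(twb)

The converted file can then be opened in Tableau Desktop (or published via tabcmd) as described above.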

Publishing and changes to a workbook for Tableau Online

I am working on an internal reporting dashboard project. There are three main roles/levels for the internal reporting dashboard, such as higher management, project management, etc.
The breakdown of information for each role/level is different compared to the other roles.
For the internal reporting dashboard, we have to create a database (let's say D, on SQL Server) whose data will come from three databases (let's say A, B, and C) after integrating them.
For now, as per my research, we can directly link to database D using a Tableau live connection in Tableau Desktop (Professional edition) and use it to create a dashboard.
To host that workbook for users, I can use Tableau Online for publishing, and to make data visible according to the roles, I can use filters to restrict the data.
Now my questions are:
1. Is this workflow right? Am I missing any step or process that I would need to cater for?
2. How will changes be reflected in the dashboard once it is published? Let's say I have to add a filter/parameter to the dashboard. Do I make the changes to the workbook using Tableau Desktop and the changes are reflected automatically,
or do I have to host it on Tableau Online again? Please educate me on this too.
Thanks for the assistance. I have attached a proposed workflow image too.
Regards,
Manail Pasha
[workflow image]
If your system is not backed by a transactional database, I would avoid a live database connection. I would recommend a data extract (a .tde file), built with data blending techniques to combine the sources.
I would publish a dashboard with user filters that enable row-level security, ensuring users can only see certain data.
Here is a diagram that I would follow if I were you.
To add a filter/parameter, you can either do it from Tableau Desktop and publish to Tableau Online, or log in to Tableau Online, add the filter/parameter in edit mode, and save it. The change will be reflected with either of these methods.
If your data changes frequently, I would recommend going with a live connection. Extract refreshes can be done incrementally, but the appropriate fields need to be chosen for that (you should also consider how to handle negated entries). It is all up to you whether to go with an extract or a live connection.
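If you end up republishing from Tableau Desktop often, that step can also be scripted. Here is a rough sketch using the Python tableauserverclient library (my suggestion, not part of the answers above); the credentials, site, pod URL, and project ID are all made up:

    # Hypothetical republish script (pip install tableauserverclient).
    import tableauserverclient as TSC

    auth = TSC.TableauAuth('user@example.com', 'password', site_id='mysite')
    server = TSC.Server('https://10ax.online.tableau.com', use_server_version=True)
    with server.auth.sign_in(auth):
        item = TSC.WorkbookItem(project_id='hypothetical-project-id')
        server.workbooks.publish(item, 'dashboard.twbx',
                                 mode=TSC.Server.PublishMode.Overwrite)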

How can I deploy form / subform (i.e. display only) changes on Notes databases?

I have been asked by a client to assist in making the web frontends of a number of Lotus / IBM Notes databases, used for critical LOB functions, compatible with modern browsers.
As it stands, the web frontends of these databases only work in IE7, and even then they're temperamental at best. The JS uses IE-specific extensions, everything is in tables, and they render poorly on pretty much every browser available today. With IE7 no longer in support, they want to modernise these interfaces.
I have very little experience with Notes, but as an exploratory exercise I've managed to open up the databases in Domino Designer, add a few stylesheet/script resources, include them in the $$HTMLHead variable, and rework one form to use a frontend framework, which looks good.
Obviously working on live applications is out of the question, so my thinking is to take a copy of the NSF files, and make the changes on the copies. My question is: how can I then deploy only the form / subform / resource changes to the 'live' NSF files?
Deployment:
In your new, modified database:
In the Database properties, mark the database file as a master template (give it a name).
In the production database:
First make a backup! For example, copy the design only to a new copy of the prod database.
In the Database properties, set the database to inherit its design from the master template (same name).
Then run a design refresh on the prod database.
More details: https://www.ibm.com/support/knowledgecenter/SSVRGU_9.0.1/com.ibm.designer.domino.main.doc/H_ABOUT_REFRESHING_A_DESIGN.html
Sorry to state the obvious, but since you have a Notes client and a Domino server, you have quite extensive documentation at your disposal in the form of databases located in the /help/ directory. Make sure they are full-text indexed.
And since we are on the subject of templates, Domino comes with a host of ready-made, ready-to-use apps that you can customize and cannibalize. Look at discussion9.ntf for starters.
You may want to start here, then go there, and finally that will give you the keys to build world-class web apps on Domino.
Last thing: if you are on V9, the Designer help is crap. Grab a copy of the 8.5 version. Seriously.
If you want to build a modern web based front-end to existing Domino data, take a look at the following presentations:
http://www.slideshare.net/TexasSwede/ad102-break-out-of-the-box
and
http://www.slideshare.net/TexasSwede/break-out-of-the-box-part-2
As others already said, you should create a template and then just refresh/replace the design of the production database using that template.
You may want to consider working with an experienced Notes/Domino developer on that project; there are quite a few caveats and workarounds you need to know about...

Persist and share data from a Docker MongoDB container

Need a push in the right direction with this one.
I want to "dev0ps" the workflow of our local development with docker.
As best practice our mongodb should run in a seperate container, having a volume attached and working. (this part checks out fine)
Our developers should then keep the data in sync, so its able to be "pushed & pulled".
Can i achieve this with git? (The data folder is around 500mb, but this is just a fresh project)
Should i write a script that performs a mongodump and upload it (to git?)?
Should i consider spinning up a mongodb server of my own, where they push and pull from?
thanks in advance!
I see what you're trying to achieve, but I don't think the best approach is to keep the data from one local working copy in sync via Git, as this is likely to produce inconsistencies in the database if developers have different copies of the data.
The best thing you could do is have a set of fixtures (application data inside the MongoDB database) which are inserted and deleted via your code; that way you control what goes into the database and what does not. That data can very well be included in the Git repository, in a folder called fixtures or database.
You could also have different sets of data: one for development, one for testing, etc. As you can see, you can make it as complex or complementary as possible.
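As a rough illustration of the fixtures idea, a small seed script can reset the database to a known state. This is a sketch only; the database, collection, and document values are hypothetical:

    # Hypothetical fixture loader (pip install pymongo); assumes the Mongo
    # container publishes the default port on localhost.
    from pymongo import MongoClient

    FIXTURES = {
        'users': [
            {'_id': 1, 'name': 'Alice', 'role': 'admin'},
            {'_id': 2, 'name': 'Bob', 'role': 'developer'},
        ],
    }

    def load_fixtures(uri='mongodb://localhost:27017', db_name='app_db'):
        db = MongoClient(uri)[db_name]
        for collection, documents in FIXTURES.items():
            db[collection].delete_many({})       # reset to a known state
            db[collection].insert_many(documents)

    if __name__ == '__main__':
        load_fixtures()

The fixture data itself (or the JSON files it is loaded from) is what lives in the Git repository, not the Mongo data directory.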
I've been using this approach for more than a year, and it has not presented any problems.
Let me know if you have any concerns about this approach.

Can I change the database server and database a report is pointing to dynamically?

I have a Crystal 2008 report that will be deployed to an InfoView server. There are four different databases the user might want to execute the report against. Each of the four databases has exactly the same schema; only the data in each is different. Each database corresponds to a plant we have around the world.
Instead of creating four different reports (each one connected to one of the four databases), am I able to dynamically change the server/database the report hits based on a value the user enters into a parameter? I'm really trying to avoid having to create four reports that are identical except for the database connection. If this isn't possible, how do developers typically deal with this sort of scenario? I would imagine it's fairly common.
Thanks very much.
InfoView doesn't support dynamically changing a report's datasource. You certainly could modify InfoView source to suit your needs with the BusinessObjects Enterprise SDK, but that will be a challenge and won't be supported by BO.
Another option is to build a custom portal with the BusinessObjects Enterprise SDK, but this will require quite a bit of coding as well.
Probably the best option is to publish the report multiple times, set each data source as appropriate (via the CMC), and change the name of each copy to indicate its data source (via the CMC). I believe there is a report property in the CMC that will save the datasource settings so you can quickly republish the reports if you make a change to the original.
I'm not familiar with InfoView, but it is quite common to do what you describe; I've done similar things with ASP.NET and WinForms. If you have access to the Crystal Reports object model, there are extensive options for setting logon info (I think the method is SetDatabaseLogon). If you have subreports, you have to set the login for each of those separately.
The schemas do have to be completely identical, or the user will get warnings.