TABLEAU FILTERS AND SERVERS - tableau-api

1. There is an automated report on Tableau Server set to run each business day. When we open the report, the filters are still present, but there are no results (no data, no viz). There are also no errors. What are some possible explanations for why there would be no data present in the report? How would you check for each possible explanation?
2. Tableau reports can connect to multiple servers. Assume that one of the servers goes down, it's unclear when it will be back online, and this server is the most widely used data source. What should be done immediately to minimize the impact on business customers?

You are lucky that the question was not closed; normally these types of questions are not entertained on SO.
There are no errors and no data
Since filters are present in the report, a possible reason would be that the active filters are not fetching any data, hence you are getting a blank report and no errors. For example, if you run the report for emp ID 100 and there is no emp ID with number 100, then no data is fetched from the database and the report will be blank.
So a possible way to debug would be to check the applied filters and check in the database whether there is any data for those filters; if database access is not provided, then supply different values for the same report and check the data.
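A minimal sketch of that check in T-SQL, with dbo.Employees and emp_id as placeholder names for whatever the report actually filters on:

-- Does the database contain any rows for the value the report filter is set to?
SELECT COUNT(*) AS matching_rows
FROM dbo.Employees
WHERE emp_id = 100;

-- If matching_rows is 0, the blank report is explained by the filter itself.
-- Listing a few existing values shows what the filter could be set to instead:
SELECT TOP 10 emp_id
FROM dbo.Employees
ORDER BY emp_id;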
one of the servers went down, it’s unclear when it will be back online
Normally in a live scenario there will be backup servers. When the primary server is down, you need to connect to the backup server to minimize the impact, and work on getting the primary server back online ASAP.

Related

Scheduled instance of report sending stale data

I have a scheduled instance emailing a report to a user. The instance works fine and the user gets the email, but the data in the report attached to the email is stale. It is missing item codes that do show up in the report if you view it directly in a web browser on the BO server.
If I create a new instance scheduled to send to me, the data looks up to date and good to go. If I add myself to the instance sending the stale report and re-run the instance, I also get the stale version.
I'm worried about how whatever this is could be impacting other reports/users in the company without our knowledge. And also want to fix this one instance.
Is there some caching or other options that could be causing this? Why is the instance sending stale data?
Thanks!!
I figured this out. It turns out someone added record select formulas to the base report but did not re-create the scheduled instance. I looked at the metadata from CI_INFOOBJECTS etc. to see the record select formula on the instance. It does not match the updated record selection on the base report.
This highlights a great best practice to keep in mind in this environment: KEEP YOUR FILTERS OUT OF CRYSTAL REPORTS! Keep your record selection and data transform logic inside SQL Server, in stored procs or views. That way you can update your report filter criteria without having to re-create every scheduled report instance after every little report change :)
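A minimal T-SQL sketch of that practice, assuming a hypothetical dbo.Orders table; the view name and the record-selection predicate are placeholders:

-- Keep the record selection in the database, not in the .rpt file. The Crystal
-- report (and every scheduled instance of it) just selects from this view, so
-- changing the filter below never requires re-creating scheduled instances.
CREATE VIEW dbo.vw_OrderReport
AS
SELECT o.OrderID,
       o.OrderDate,
       o.ItemCode,
       o.Amount
FROM dbo.Orders AS o
WHERE o.Status = 'Shipped'  -- record-selection logic lives here
  AND o.OrderDate >= DATEADD(MONTH, -12, GETDATE());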

Published Workbooks or Dashboards take quite a long time to open on Tableau Server

I am using Tableau Desktop 8.2 and Tableau Server 8.2 (licensed versions); the workbooks created in Tableau are successfully published to Tableau Server.
But when users want to see the views or workbooks, it takes a very long time to preview or open them.
The workbooks are built against an Amazon Redshift database with more than 5 million records.
Could somebody guide me on this? Why is it taking so long to preview or open, even after being published to Tableau Server?
First question: are the views performant when opened using only Tableau Desktop? Get them working well on Desktop before introducing Server into the mix.
Then look at the logs in My Tableau Repository, which include query strings and timing info, to see if you can narrow down the cause. You can also try the Performance Recorder feature.
A typical problem is an overly expensive query just to display a dashboard. In that case, simplify. Start with a simple high-level summary viz and then introduce complexity, testing the impact on performance. If one viz is too slow, there are usually alternative approaches available.
Completely agree with Alex; I had a similar issue with HP Vertica. I had a lot of actions set on the dashboard. Since the database structure was final, I created a Tableau extract and used the online Tableau extract in place of the live connection. Voila! That solved my problem, and the users are happy with the response time as well. Hope this helps you too.
Tableau provides two modes of data refresh:
Live: Tableau will execute the underlying queries every time the dashboard is opened or refreshed. Apart from badly formulated queries, this is one of the reasons why your dashboard on Tableau Online might take forever to load.
Extract: The query will be executed once, according to (your) specified schedule, and the same data will be reflected every time the dashboard is refreshed.
In Extract mode, time is spent only when the extract is being refreshed. However, if the extract is not refreshed periodically, the same stale data will be reflected on the dashboard. Thus, extracts are not recommended for representing live data.
You can toggle Live <--> Extract from the Data Source pane of Tableau Desktop.

Database delivery option for subscribed SSRS reports

I have a report which takes 20 minutes to run. I want to make it a subscribed SSRS report. I have seen that in SSRS the delivery options are email and file share, but I want the report to be saved to the DB directly. Is there a way of inserting the report into the DB?
You can schedule a snapshot, based on parameters, to be rendered. If your report follows a model that allows it to be subscribed to with parameters similar to the snapshot's, then you can create a batch-type process. This works if your users are not expecting real-time data.
For example:
Early in the morning create a snapshot.
Email cached version based on similar parameters.
Hope this helps.
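If the goal is rows in a table rather than a rendered report file, one batch-type variant of the morning-snapshot idea above is to materialize the report's dataset on a schedule (for example from a SQL Server Agent job) and let the subscribed report read the pre-computed rows. A rough T-SQL sketch, with all object names hypothetical:

-- Run early in the morning as an Agent job step: persist the report's dataset
-- so the report reads cheap cached rows instead of re-running the 20-minute query.
TRUNCATE TABLE dbo.ReportCache_DailySales;

INSERT INTO dbo.ReportCache_DailySales (SaleDate, Region, TotalAmount, RefreshedAt)
SELECT s.SaleDate,
       s.Region,
       SUM(s.Amount),
       GETDATE()  -- record when the cache was built
FROM dbo.Sales AS s
WHERE s.SaleDate >= DATEADD(DAY, -1, CAST(GETDATE() AS date))
GROUP BY s.SaleDate, s.Region;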

Trying to prevent multiple database calls with a very large call

So we run a downline report that gathers everyone in the downline of the person who is logged in. For some clients this runs with no problem, as it returns fewer than 100 records.
For other clients, however, it returns 4,000 - 6,000 rows, which comes out to about 8 MB worth of information. I actually had to raise the buffer limit on my development machine to handle the large request.
What are some of the best ways to store this large piece of data and help prevent it from being run multiple times consecutively?
Can it be stored in a cookie?
Session is out of the question, as this would eat up way too much memory on the server.
I'm open to pretty much anything at this point, trying to streamline the old process into a much quicker, more efficient one.
Right now, it loads the entire recordset and loops through it, building out the data into return_value cells.
Would this be better to turn into a jquery/ajax call?
The only main requirements are:
classic ASP
jQuery/JavaScript
T-SQL
Why not change the report to be paged? Phase 1: run the entire query, but the page only displays the right set of rows based on selected page. Now your response buffer problem is fixed. Phase 2: move the paging into the query using Row_Number(), now your database usage problem is fixed. Phase 3: offer the user an option of "display to screen" (using above) or "export to csv" where you can most likely export all the data, since csv is nice and compact.
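A sketch of the Phase 2 query in T-SQL, assuming a hypothetical dbo.Downline table and a @SponsorID parameter (all names are illustrative):

-- Paging pushed into the query with ROW_NUMBER(), so only one page of the
-- downline ever crosses the wire.
DECLARE @SponsorID int = 42,
        @PageSize  int = 50,
        @PageNum   int = 1;  -- 1-based page requested by the client

WITH numbered AS (
    SELECT d.MemberID,
           d.MemberName,
           d.DepthLevel,
           ROW_NUMBER() OVER (ORDER BY d.DepthLevel, d.MemberName) AS rn
    FROM dbo.Downline AS d
    WHERE d.SponsorID = @SponsorID
)
SELECT MemberID, MemberName, DepthLevel
FROM numbered
WHERE rn BETWEEN (@PageNum - 1) * @PageSize + 1
             AND @PageNum * @PageSize;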
Using a cookie seems unwise, given the responses to the question What is the maximum size of a web browser's cookie's key?.
I would suggest using ASP to create a file on the Web server and writing the data to that file. When the user requests the report, you can then determine if "enough time" has passed for it to be worth running the report again, or if the cached version is sufficient. User's login details could presumably be used for naming the file, or the Session.SessionID, or you could store something new in the user's session. Advantage of using their login would be that your cache of the report can exist longer than a user's session.
Taking Brian's answer further: query the page count, which would be records returned divided by items per page, rounded up. Then join the results of each page query on the client side. Pages start at an offset provided through the query. Now you have the full amount on the client without overflowing your buffer, and it can be tailored to an interface and user options (display x per page).
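The page count itself is one cheap query; CEILING over a float division handles the rounding up (same placeholder names as in the sketch above):

-- Total pages = ceiling(total rows / page size).
DECLARE @SponsorID int = 42,
        @PageSize  int = 50;

SELECT CEILING(COUNT(*) / CAST(@PageSize AS float)) AS TotalPages
FROM dbo.Downline
WHERE SponsorID = @SponsorID;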

Crystal Reports 9 Database Connection Issue

Crystal Reports 9 seems to save the database connection information inside the report file itself. I am having an issue changing that connection. I work with a team of developers who all have their own copy of a database on the same server. We are using Trusted Connections to the DB. When we need to make changes to a Crystal report and click the lightning bolt to execute the report, Crystal does not ask for login information to the database. It actually ends up connecting to the last database that was used when the report was last saved.
We came up with 2 workarounds:
Take the database that crystal thinks it should connect to offline, then crystal will ask for login info.
Remove permissions for the username that is making the crystal change.
Neither of these are acceptable for us. Does anyone know how to remove the crystal connection from the report file?
We have tried Log Off Datasource Location and all of the settings in the Database Expert.
UPDATE
I still have not found a solution that fits my case, but our newest workaround is to load up a Crystal report and, just before you click the lightning bolt (to run the report against the database), unplug your Ethernet cable. Then, when Crystal cannot find the database, plug the Ethernet cable back in and it will allow you to choose a different database server and name.
You could use a .dsn datasource file in a user-specific location (i.e. the same path for every user, but a different physical location) and point Crystal Reports at that. For example, on everyone's C drive: C:\DSNs\db.dsn, or on a network drive that is mapped to a different location for each user.
You can get more info on .dsn files on MSDN:
http://msdn.microsoft.com/en-us/library/ms710900(VS.85).aspx
We use the following approach (using SQL authentication, however):
open the report
Database - Log On Server
Database - Set Datasource Location
refresh/preview
You may also disable your [domain user] access to the dev database; that should help too :)
I am probably answering too late to have any chance at the bounty, but I'll offer an answer anyway.
If you are running the Crystal Report directly or with Crystal Enterprise, then the only way I can think of to do this is by using a DSN, as paulmorriss mentions. The drawback to this is that you'd be using ODBC, which I believe is generally slower and thought of as outdated.
If you are using this in an application, then you can simply change the database connection settings in code. Then everyone can develop the report against their own test database, and you can point it to the production database at runtime (assuming the developer's database is up to date and contains the same fields as the production database).
To do this you should be able to use a function like the following:
private void SetDBLogonForReport(CrystalDecisions.Shared.ConnectionInfo connectionInfo, CrystalDecisions.CrystalReports.Engine.ReportDocument reportDocument)
{
    // Each table in the report carries its own logon info, so the new
    // connection must be applied table by table.
    CrystalDecisions.CrystalReports.Engine.Tables tables = reportDocument.Database.Tables;
    foreach (CrystalDecisions.CrystalReports.Engine.Table table in tables)
    {
        CrystalDecisions.Shared.TableLogOnInfo tableLogonInfo = table.LogOnInfo;
        tableLogonInfo.ConnectionInfo = connectionInfo; // swap in the new server/database/credentials
        table.ApplyLogOnInfo(tableLogonInfo);
    }
}
For this to work you need to pass in a ConnectionInfo object (which will contain all of your login information) and the report document to apply it to. Hope this helps.
EDIT - Another option, that I can't believe I haven't thought of until now: if you are using SQL Server, you can make sure that all of the development database names are the same, then use "." or "(local)" for the server with integrated security, so that everyone effectively has the same connection info locally. I think this is probably the best way to go, assuming that you can get all of the developers to use the same setup.
EDIT Again :)
After reading some of the comments on the other answers, I think I may have misunderstood the question. There is no reason that I can think of why you wouldn't be able to do the steps in Arvo's answer outside of not having rights to edit the report, but I'm assuming that you've been able to make other changes so I doubt that is it. I assumed that to get the report to work for each developer you had been doing these steps all along.
Yeah, I agree, Crystal Reports is a pain. I have run into the same problem in the applications I have built where I was forced to use it.
1- Log off the server (inside Crystal, right-click the database and log off)
2- Click on the database and change the database location
If you are logged on when you change the database location, it doesn't seem to stick.
You can set the logon at runtime. See this question...
How do I change a Crystal Report's ODBC database connection at runtime?
If you used ODBC, each dev could point their DSN at the appropriate database, essentially pushing the connection string into the DSN and out of the Crystal report.