Can we use loop constructs (for, while, do while) in Tableau calculated fields? If we can, how do we use them in calculated fields, and how do we initialise the variables declared in those loops?
No, we can't. There are some hacks to do calculations like that, using PREVIOUS_VALUE and other table calculations, but there are no loop functions in Tableau.
Why? Because Tableau isn't meant to be a data processing tool, but rather a data visualization tool. Don't get me wrong, the Tableau engine is very good at processing data, but only for "query-like" operations.
So why don't you post exactly what you are trying to achieve, and we can think about whether it can be accomplished with Tableau, or whether you need some pre-processing of your data.
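For example, if the "loop" you have in mind is something iterative like a running total, that is usually expressed either with table calculations (PREVIOUS_VALUE, RUNNING_SUM) inside Tableau, or pre-computed in the data source before Tableau ever sees it. A minimal sketch of the pre-processing route in SQL, where the table and column names (sales, order_date, amount) are only placeholders:

    -- compute the iterative result once in the source, no loop needed
    SELECT
        order_date,
        amount,
        SUM(amount) OVER (ORDER BY order_date) AS running_amount
    FROM sales
    ORDER BY order_date;

A window function like this replaces the per-row accumulation you would otherwise write as a for/while loop.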
I'm looking at cumulative energy data, so the graphs look like this:
Which isn't helpful at all since I want to look at changes and patterns. So I'm querying the difference using the non_negative_difference() function in the Grafana query tool.
However, the crazy outliers just overshadow everything and make it impossible to see anything reasonable. Like this:
My idea was to filter out the outliers directly using a WHERE clause. However, I don't seem to be able to reference the "non_negative_difference" data in the WHERE clause in Grafana. Any pointers/ideas on how to do that correctly?
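Conceptually, what I'd like to express looks roughly like the query below (measurement and field names are simplified placeholders); the condition on the derived difference is exactly the part I can't get into the WHERE clause:

    SELECT non_negative_difference(mean("energy"))
    FROM "power_consumption"
    WHERE non_negative_difference(mean("energy")) < 1000
      AND $timeFilter
    GROUP BY time(1h)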
When we run an aggregated SUM query in Drill against Mongo storage, the output is in exponential (scientific) notation.
Is there anything we can configure in Drill so that we get the output without exponential notation?
We don't want exponential results.
Thanks in advance.
Drill provides two tools that display query results (and so format numbers): the Drill web UI and the SqlLine command-line tool. Are you using one of these? These tools are often used for experimental queries, and I'm not aware of any way to customize the number display in them.
That said, if you use the ODBC or JDBC driver, then numbers are stored in binary format and so any formatting would be done by the tool using the drivers to run queries. The xDBC drivers are more for production use as they handle large result sets better than the UI tools.
Now, a third possibility is that something in the Mongo plugin converts numbers to strings (VARCHAR). In that case, you might be able to cast the string back to a number.
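If it does turn out to be the string case, a cast along these lines might already help; the plugin, database and column names below (mongo.mydb.orders, amount, region) are only placeholders for whatever your storage plugin is called:

    SELECT t.region,
           CAST(SUM(t.amount) AS DECIMAL(18, 2)) AS total_amount
    FROM mongo.mydb.orders t
    GROUP BY t.region;

(Depending on the Drill version, the DECIMAL type may need to be enabled; casting to BIGINT works the same way if the values are whole numbers.)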
If you can provide a bit more detail on the tool you are using, perhaps we can give a more focused answer.
I have a dashboard where I have kept all the filters used in the dashboard as global filters, and the most used filters as context filters.
The problem is that computing the filters takes about 1-2 minutes. How can I reduce the time taken to compute these filters?
I have about 2 million rows of extracted data, on Oracle with Tableau 9.3.
Adding to Aron's point, you can also use custom SQL to select only the dimensions and measures you are going to use in the dashboard. I have worked on big data where it used to take around 5-7 minutes to load the dashboard; I finally ended up using custom SQL and removing unnecessary filters and parameters. :)
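Something along these lines as the custom SQL, where the table and column names are just placeholders for your Oracle schema:

    -- keep only the dimensions and measures the dashboard actually uses
    SELECT o.order_date,
           o.region,
           o.product_category,
           o.sales_amount
    FROM sales_orders o
    WHERE o.order_date >= DATE '2015-01-01'   -- optionally pre-filter rows as well

The narrower the result set coming into Tableau, the less work the extract and the filters have to do.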
There are several things you can look at to guide performance optimization, but the details matter.
Custom SQL can help or hurt performance (more often hurt because it prevents some query optimizations). Context filters can help or hurt depending on user behavior. Extracts usually help, especially when aggregated.
An extremely good place to start is the following white paper by Alan Eldridge:
http://www.tableau.com/learn/whitepapers/designing-efficient-workbooks
I have over 300k records (rows) in my dataset, and I have a tab that stratifies these records into different categories and does conditional calculations. The issue is that it takes approximately an hour to run this tab. Is there any way I can improve calculation efficiency in Tableau?
Thank you for your help,
The issue is probably access to your source data. I have found this problem when working directly with live data on an SQL database.
An easy solution is to use extracts. Quoting Tableau's Improving Database Query Performance article:
Extracts allow you to read the full set of data pointed to by your data connection and store it into an optimized file structure specifically designed for the type of analytic queries that Tableau creates. These extract files can include performance-oriented features such as pre-aggregated data for hierarchies and pre-calculated calculated fields (reducing the amount of work required to render and display the visualization).
Edit:
If you are using an extract and you still have performance issues, I suggest massaging your source data to be more Tableau-friendly, for example by generating pre-calculated fields in your ETL process.
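For example, instead of a conditional calculated field evaluated at view time, the same stratification can be computed once during ETL; a rough sketch with invented table and column names:

    -- pre-compute the category once so Tableau only has to read it
    CREATE TABLE records_prepared AS
    SELECT r.*,
           CASE
               WHEN r.amount >= 10000 THEN 'high'
               WHEN r.amount >=  1000 THEN 'medium'
               ELSE 'low'
           END AS amount_band
    FROM records r;

Tableau then filters and aggregates on amount_band directly instead of re-evaluating the conditions for every row.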
I have a couple of million entries in a table with start and end timestamps. I want to implement an analysis tool which determines unique entries for a specific interval. Let's say between yesterday and two months before yesterday.
Depending on the interval, the queries take between a couple of seconds and 30 minutes. How would I implement an analysis tool for a web front-end which would allow querying this data quite quickly, similar to Google Analytics?
I was thinking of moving the data into Redis and doing something clever with intervals and sorted sets etc., but I was wondering if there's something in PostgreSQL which would allow me to execute aggregated queries and re-use old queries, so that, for instance, after querying the first couple of days it does not start from scratch again when looking at a different interval.
If not, what should I do? Export the data to something like Apache Spark or DynamoDB and do the analysis there to fill Redis for quicker retrieval?
Either will do.
Aggregation is a basic task they all can do, and your data is small enough to fit into main memory. So you don't even need a database (but the aggregation functions of a database may still be better implemented than anything you would rewrite yourself, and SQL is quite convenient to use).
Just do it. Give it a try.
P.S. Make sure to enable data indexing and choose the right data types. Maybe check query plans, too.
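As a rough sketch of the plain-PostgreSQL route (your table and column names will differ): an index on the interval boundaries plus a straightforward aggregate, and EXPLAIN ANALYZE to confirm the plan actually uses the index.

    -- index the interval boundaries
    CREATE INDEX idx_entries_interval ON entries (start_ts, end_ts);

    -- unique entries overlapping a given interval
    SELECT COUNT(DISTINCT entry_id)
    FROM entries
    WHERE start_ts <  DATE '2016-05-01'   -- interval end
      AND end_ts   >= DATE '2016-03-01';  -- interval start

    -- check the plan to see whether the index is used
    EXPLAIN ANALYZE
    SELECT COUNT(DISTINCT entry_id)
    FROM entries
    WHERE start_ts <  DATE '2016-05-01'
      AND end_ts   >= DATE '2016-03-01';

If repeated queries over overlapping intervals become the bottleneck, pre-aggregating into a daily rollup table is the usual next step, but note that per-day distinct counts cannot simply be summed to get a distinct count over a longer interval.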