Create multiple tables from a single query in Jasper Reports

I have to create a report where the main thing is a list of items. On the first page there is also supposed to be a bunch of tables that show figures computed from the list (things like counts of the different item types, etc.).
Here is an example to demonstrate the idea (image omitted; it seems I can't post images yet). The list of items is actually quite long and runs for multiple pages. The other statistics will be listed only on the front page.
I have a database query that retrieves the data for the item list. The question is: is there a way to use this single query to produce all the needed statistics as well?
All the statistics are such that they can be produced using, for example, JasperReports groups and variables. I also know how to achieve the desired result by using a subreport for each table (and then I'd even be able to tailor the query for each table to return exactly the wanted values), but I would like to avoid running the same (or almost the same) query multiple times.

Try putting your table component into the SUMMARY band, because the DETAIL band repeats for every row in the dataset.
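A minimal JRXML sketch of that placement; the subDataset name, field name, and sizes are placeholders, and the table's dataset still needs its own query or data source, so this only shows where the component sits:

    <!-- Sketch only: dataset and field names are placeholders. -->
    <summary>
        <band height="200">
            <componentElement>
                <reportElement x="0" y="0" width="555" height="180"/>
                <jr:table xmlns:jr="http://jasperreports.sourceforge.net/jasperreports/components">
                    <datasetRun subDataset="itemStats">
                        <connectionExpression><![CDATA[$P{REPORT_CONNECTION}]]></connectionExpression>
                    </datasetRun>
                    <jr:column width="277">
                        <jr:detailCell height="20">
                            <textField>
                                <reportElement x="0" y="0" width="277" height="20"/>
                                <textFieldExpression><![CDATA[$F{itemType}]]></textFieldExpression>
                            </textField>
                        </jr:detailCell>
                    </jr:column>
                </jr:table>
            </componentElement>
        </band>
    </summary>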

Related

Growing Tables with aggregations

I'm looking at creating a table that could potentially be loaded with hundreds of rows, and I was hoping to use the growing option that the tables provide. I have a few questions.
If I have an aggregation that is a total for all the rows in a column, will it be the total of all rows or only of those that have been loaded? Or can this be controlled with a property or setting?
Similarly, with the select-all feature that ticks all the rows, will this select every row, even the ones not yet loaded, or only the loaded rows? Again, is this just a setting I can change?
This is my first time really using any of the UI5 table elements, and the SAP documentation says this, which I didn't really understand:
"Show Aggregations
Show aggregations (such as totals) on the table footer (sap.m.Column, aggregation: footer).
Do not show aggregations in “growing” mode. It is not clear, if an aggregation will only aggregate the items loaded into the front end, or all items."
For growing tables, by default all actions and aggregations are only processed on the data that has already been loaded. Your citation from SAP means that it is not clear to the end user whether the aggregated value refers to the loaded data or to all data.
If you want to implement something like "Select all" or "Delete All", it would be better to implement this in the backend. From the guidelines of sap.m.List:
In multiple selection mode, users can (de)select all items using the shortcut CTRL+A. This only affects items that have already been loaded to the front-end server. All other items are not (de)selected before they are loaded (for example, items added via lazy loading with growingScrollToLoad). This conflicts with the guideline that all items the user can reach by scrolling must be (de)selected.
To process all items, listen to the selectionChange event and to its flag selectAll. This indicates whether CTRL+A was triggered. As soon as an action is triggered, process the items accordingly. Depending on the number of items, consider processing them in the back end.
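A minimal JavaScript sketch of that approach, assuming an existing sap.m.Table or sap.m.List instance named oTable in multi-selection mode; the backend call it mentions is hypothetical, and the point is only how to read the selectAll flag from the selectionChange event:

    sap.ui.require(["sap/m/MessageToast"], function (MessageToast) {
        // oTable is an existing sap.m.Table or sap.m.List in multi-selection mode.
        oTable.attachSelectionChange(function (oEvent) {
            if (oEvent.getParameter("selectAll")) {
                // Ctrl+A only selects the items already loaded by "growing".
                // For a true select/delete-all, trigger the work in the backend,
                // e.g. via a (hypothetical) service call that operates on the full result set.
                MessageToast.show("Select all: process the full result set in the backend");
            } else {
                // Handle individual (de)selection of loaded items here.
            }
        });
    });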

Tableau performance

I have a problem with a dashboard in Tableau. The dashboard contains many worksheets, and all the columns in the report are calculated fields. The problem is that the dashboard takes a very long time to render: the report contains approximately 2 million rows and takes about 5 minutes to generate.
Tell me, what are the solutions in this case?
Maybe I can somehow adjust the page display so that not all the records are shown at once?
To reduce the calculation time, try to exclude data you don't need with a data source filter in Tableau. You can also hide or delete unused calculated fields. Another thing you can do is remove sheets that are not used.
Here's a link: https://www.tableau.com/about/blog/2016/1/5-tips-make-your-dashboards-more-performant-48574
Steps to follow to reduce calculation time:
Extract the data: use an extract instead of a live connection, and replace the data source with the extract.
Use a user filter to reduce calculation time, so that Tableau only displays the data for the particular user.
I hope this helps solve your problem.
I have one more idea to resolve this issue.
1) When the dashboard loads for the first time, use a dashboard action filter so the initial load excludes the data in your sheet: Dashboard menu -> Actions -> Add Action -> select the sheet and the exclude option.
2) Switch the data source from live to extract (select the Extract radio button).
3) Use a user filter.
I am following the other answers (use extract, dashboard action filter...) and I want to add one point:
Drag every field used by any worksheet on the dashboard onto the "Detail" shelf of every other worksheet on the dashboard. Tableau then loads all the needed data while loading the first worksheet and can reuse that data for the other sheets.
For example, if a dashboard contains three worksheets (A, B, C), drag every field used by A onto "Detail" of B and C, every field used by B onto "Detail" of A and C, and every field used by C onto "Detail" of A and B.
We also have a similar issue with 150 million rows, but I want to check whether you are doing the following steps. This may help; it goes back to the fundamentals of Tableau reporting.
1/ Try to make sure your data set is in a star schema format. This helps a lot with report performance.
2/ Try to have the tables and views in the DB expose only the columns that are actually used in Tableau. Any extra columns in the tables add to the performance issue.
3/ Make sure indexing is done properly for all the fields that are joined.
4/ In my experience the dashboard adds extra performance lag, so get as much performance tuning done on the individual sheets as possible before even going to the dashboard.
5/ If required, use materialized views.
Hope this helps.
Try capturing performance metrics with the performance recorder option in Tableau.
Check the underlying DB tables and the joins present in the data source layer.
Try using optimized sets and parameters as required, and get rid of less relevant filters.
Try using data extracts with a scheduled refresh and a data source filter that limits the data to the business years you need.

Is it possible to change the way a form loads with different data fields and values (Access 2010)

I have Access 2010. I was wondering if there is a way to get the form to load differently for every selection.
Example
Item A has 10 rows and 6 columns of filled-in data
Item B has 3 rows and 2 columns of filled-in data
Both are from the same table.
Is there a way, when a certain item is selected from a drop-down menu, to load only the filled-in data, without having multiple forms? The output would resemble an Excel format.
Thank you in advance for any help.
Quick Answer (TL;DR)
Creating a dynamically generated form structure in MSFT Access can be done with sub-forms.
Detailed Answer
Context
MSFT Access
Creating forms
Problem
Scenario: the developer wishes to create a context-specific form structure that depends on the query output.
Solution
Create one or more sub-forms with attached VBA that changes depending on the query output.
Connect the subform(s) to the primary form and load them as needed depending on the context, as in the sketch below.
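A minimal VBA sketch of the idea, assuming a combo box named cboItem, a subform control named subDetails, and a table named tblItems; all of these names (and the numeric ItemID) are hypothetical:

    ' Runs when the user picks an item from the drop-down (names are illustrative).
    Private Sub cboItem_AfterUpdate()
        ' Re-point the subform at only the rows for the selected item,
        ' so just the filled-in data is shown, without needing multiple forms.
        ' Assumes a numeric ItemID key.
        Me.subDetails.Form.RecordSource = _
            "SELECT * FROM tblItems WHERE ItemID = " & Me.cboItem.Value
        Me.subDetails.Form.Requery
    End Sub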
See also
https://stackoverflow.com/questions/tagged/ms-access+forms+vba
https://duckduckgo.com/?q=msft+access+dynamic+subform
How to dynamically load, access and unload subforms in microsoft access

Filemaker GetSummary from related table

I've been using FileMaker for the first time and need to use GetSummary on a financial information table. This generates various summaries of different income by customer, year, and type. The layout generated from this table is good. Using GetSummary allows me to do math with the various results, whereas sub-summary totals by income type (as far as I know) cannot be added to or divided by each other.
The problem I'm facing is that I now wish to create a layout based on customers and include some of the GetSummary detail from the financial table. Because my new layout is based on customers, I understand I cannot use GetSummary from the financial table either as a related field or in a portal.
The end goal is simply to scroll through customer records, one after another, and have the key financial information shown on each customer's "home" screen, if you will, by year and type.
Any help gratefully appreciated. Thanks
I understand I cannot use Get Summary from financial as either a related field or in a portal.
No, that's not quite correct. The GetSummary() function returns the sub-summary value by break field, provided the records are sorted by that break field. Thus if the portal (or the underlying relationship) sorts the related records by type, you will see sub-summary values in the portal. However, you won't be able to see only the sub-summary values, since a portal has no sub-summary parts.
There are other ways to show summarized related data. If you don't have (and don't expect to have) a large number of records, consider filtering a (one-row) portal to show only a specific type of related record, then place the summary field inside it. Of course, this assumes the types are known in advance and unchanging.

Preserve the "everything" count and get filtered results in T-SQL?

I have created a complex SQL Server 2008 / ColdFusion search page that searches through a variety of tables.
On the left is a list of the categories, plus an "everything" category; next to each category or type of result is the total number of results of that type found in the current search result.
Everything works fine, but I am hoping there is a more optimal approach, because every time I filter the search to a specific category, I still have to get all the results so that the "everything" category has the correct totals.
I have realized this is a problem I've had in lots of other ColdFusion/SQL programs: you want to reduce the number of results by some field in the SELECT, but you need to keep the original record-count total, and you really don't want to re-run the whole massive query every time when you just need the trimmed results.
The program is one CFC, one CFM page, and one stored procedure, with jQuery/AJAX inside the CFM page to call the CFC.
The CFM page calls the CFC when it originally gets a form-submitted search request, and any filtering does the same thing.
If there are more than 20 results, a button at the bottom fetches 20 more records via AJAX.
My main goal is to improve performance and to keep an accurate record of what the record count was before any filtering was done, without having to re-run the unfiltered query every time.
This is a kind of complex problem, so there might not be any answers...
Thank you all for trying..
I would run the "big" query once, then pop it into a SESSION variable. Then I'd use Query-of-Query to return subsets based on filters.
The main query always exists, so you can query against that or use metadata like bigQuery.recordCount. Your QofQ is a smaller set of data you can use for display. And you can re-apply filters without having to return to the database.
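A rough CFML sketch of that idea; the session key, service component, and column names are all hypothetical:

    <!--- Run the expensive query once per search and keep it in the session (hypothetical service). --->
    <cfif NOT structKeyExists(session, "bigQuery")>
        <cfset session.bigQuery = searchService.runSearch(form)>
    </cfif>

    <!--- The "everything" total never changes while filtering. --->
    <cfset totalCount = session.bigQuery.recordCount>

    <!--- Filter in memory with a query-of-queries instead of hitting SQL Server again. --->
    <cfset bigQuery = session.bigQuery>
    <cfquery name="filtered" dbtype="query">
        SELECT *
        FROM   bigQuery
        WHERE  category = <cfqueryparam value="#url.category#" cfsqltype="cf_sql_varchar">
    </cfquery>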
Well, you need to run the query (or a count(*)) at least once to get the total number. You could either cache this query and refer to the cached query's recordCount again and again, or store the record count in the session scope until the next time the search is run for this user.
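If you would rather do it in a single T-SQL statement than cache the count, a window aggregate can carry the unfiltered total alongside the filtered rows. This is not what the answers above describe, just an alternative sketch with hypothetical table, column, and parameter names:

    -- Sketch only: "Items" and the column/parameter names are hypothetical.
    -- The CTE applies the search terms but NOT the category filter, so
    -- COUNT(*) OVER () is the "everything" total; the outer WHERE trims the
    -- rows while every surviving row still carries that total.
    WITH AllResults AS (
        SELECT  i.*,
                COUNT(*) OVER () AS EverythingCount
        FROM    dbo.Items AS i
        WHERE   i.Title LIKE '%' + @searchTerm + '%'
    )
    SELECT  *
    FROM    AllResults
    WHERE   Category = @category;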