How does paging work in SSRS?
Does it fetch all the data in one go and then just display a certain number of records per page, or does it make a database call on every click of 'Next Page' and pull fresh data from the database each time?
In my SSRS report, it is taking considerable time to fetch around 3,000 to 3,500 records and show them on screen. So I want to know whether pagination can solve this problem.
To work around a timeout issue, I have set the timeout to 36000 seconds in the dataset properties. Also, in the site settings of Report Manager, I have selected the 'Do not timeout report' option.
According to this article, it depends on the SQL Server version you have:
http://www.c-sharpcorner.com/UploadFile/bc1c71/custom-paging-in-sql-server-2012/
In SQL Server 2012:
Microsoft SQL Server 2012 introduces two extensions to the ORDER BY clause: OFFSET and FETCH. Used together with ORDER BY, they make the SQL engine read only the number of records specified by FETCH, starting after the OFFSET value.
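For example, a paging query with these clauses could look like the following sketch (dbo.Orders and its columns are invented for illustration):

-- page 3 with a page size of 50: skip the first 100 rows, read the next 50
SELECT OrderID, CustomerName, OrderDate
FROM dbo.Orders
ORDER BY OrderID
OFFSET 100 ROWS            -- (page number - 1) * page size
FETCH NEXT 50 ROWS ONLY;   -- page size

Note, though, that SSRS does not issue such a query per page on its own; the dataset query runs once and the rendered result is paginated, so OFFSET/FETCH only helps if you pass the page window into the query yourself.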
There is a relational database (MySQL 8) with tens of thousands of items in a table, which need to be displayed in sap.m.Table. The straightforward approach is to retrieve all the items with a SQL query and deliver them to the client side as JSON in an async way. The key drawback of this approach is performance and memory consumption on the client side. The whole table needs to be available on the client side to give the user the ability to conduct fast searches. This is crucial for the app.
Currently, there are two options:
Fetch the top 100 records and push them into the table, so the user can search those 100 records immediately. At the same time, run an additional query in a web worker, which takes about 2-5 seconds and fetches all records except those 100. Then merge the two JSONs (see the SQL sketch after these options).
Keep the JSON on the application server as a cached variable and update it when the user adds or deletes a record. Then fetch the JSON, which is supposed to be much faster than querying the database.
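A minimal SQL sketch of option 1, assuming a hypothetical items table ordered by an id key (all names are illustrative):

-- quick query that fills the table immediately
SELECT id, name, created_at
FROM items
ORDER BY id DESC
LIMIT 100;

-- follow-up query in the web worker: everything except those 100 rows
SELECT id, name, created_at
FROM items
ORDER BY id DESC
LIMIT 100, 18446744073709551615;  -- MySQL idiom for "from offset 100 to the end"

Because both queries share the same ORDER BY, the two ranges are disjoint and the result sets can be merged on the client without duplicates.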
How can I show thousands of items in OpenUI5's sap.m.Table?
My opinion:
You need to create an OData backend for your tables. The user can filter or search records with OData capabilities. You don't need to push all the data to the client; sap.m.Table automatically requests the rest of the data via the OData protocol while the user scrolls the table.
Quick answer: you can't.
Use sap.ui.table, or provide a proper OData service with $top/$skip support as shown here under 4.3 and 4.4.
Depending on your backend stack (Java, ABAP, Node), there are libraries to help you.
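To illustrate, the paging requests the table's growing list triggers look roughly like this (the service path and entity set are invented for the example):

GET /odata/ItemService/Items?$top=100&$skip=0
GET /odata/ItemService/Items?$top=100&$skip=100
GET /odata/ItemService/Items?$top=100&$skip=200

Each request is issued automatically by the binding as the user scrolls, so only the visible slice of the data travels to the client.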
The SAP recommendation says a maximum of 100 entries for sap.m.Table. In practice, I would advise following that recommendation; even on a fast PC, rendering slows down noticeably beyond it.
If you want to test more than 100 entries, you need to set the size limit on your oModel, e.g. oModel.setSizeLimit(1000);
I am facing a performance issue with JasperReports Server. My query is for a crosstab. The query works fine in Toad as well as in Jaspersoft Studio, but its execution is very slow in JasperReports Server, and sometimes it even fails with a connection timeout.
I can't understand the reason for this behavior. Please help me.
Thank you
Query performance in JasperReports Server depends on various factors, but to get a quick idea of where the bottleneck might be in the case of a crosstab (Ad Hoc functionality), follow these steps:
Log in to JasperReports Server through the web UI (as superuser) and take a look at Manage => Server Settings => Ad Hoc Cache. There, analyze the Query and Fetch column values.
Query (msec)
This shows the time from when the query was sent to the database until the first row was received. If this is slow, one possible improvement is to index some of the fields used in the underlying query. If you are using derived tables, try switching to actual tables, because derived tables are sub-queries/sub-selects and are expensive performance-wise.
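As a hedged illustration (table and column names are invented), indexing the columns the crosstab query filters and joins on could look like:

-- columns used in WHERE clauses and JOIN conditions are usually the best candidates
CREATE INDEX ix_sales_order_date ON sales (order_date);
CREATE INDEX ix_sales_customer ON sales (customer_id, order_date);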
Fetch (msec)
The time from when the first row was received until the last row was received. If this is slow, there might be a network bottleneck. Try setting the fetch size in the jasperreports.properties file to modify the number of rows fetched at a time; optimizing this can reduce the number of round trips to the underlying database.
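If I remember correctly, the relevant property is net.sf.jasperreports.jdbc.fetch.size (please verify the exact name against the documentation of your version), for example:

net.sf.jasperreports.jdbc.fetch.size=1000

This sets the JDBC fetch size, i.e. how many rows the driver pulls per network round trip.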
I have a dashboard in Tableau which pulls data from about 10 tables in a SQL database.
These tables are refreshed at various times of day, and there are occasions when one of them is not available (or has been deleted and is awaiting a rebuild).
However, when I open my Tableau dashboard on the server, it won't let me see any of it. Not seeing the data from the missing table is fine, but the majority of the data, which does not come from that table, is unavailable too.
I get this error:
An unexpected error occurred. If you continue to receive this error please contact your Tableau Server Administrator.
TableauException: [Microsoft][SQL Server Native Client 11.0][SQL Server]Invalid object name 'dbo.survey_order_info_fy16_TV_L'. The table "[dbo].[survey_order_info_fy16_TV_L]" does not exist. Unable to connect to the server "dbedwro.vistaprint.net". Check that the server is running and that you have access privileges to the requested database.
"survey_order_info_fy16_TV_L" being the missing table but not one I'm bothered about right now.
Is there an option that might help me see all the other data?
I am not sure if it's possible to avoid this behavior.
If it isn't, there is a workaround: create extracts of these tables and store them on the Tableau Server. You can then use these extracts instead of the tables in the DB and refresh them either on a schedule, if you know when the tables become available again, or from the SQL Server side (e.g. with SSIS, by triggering the refresh once the data is available again).
Advantages of that would be:
you can refresh them independently and always have the latest data
it performs better than an SQL connection
you don't jam your SQL Server with connections (in case you have a lot of users accessing it)
you can filter and select if you didn't want your users to get access to the full dataset
Disadvantages:
you will have to create one extract per table, and replace all data sources in workbooks you already use
It's a matter of creating a workbook, connecting to the source (adding filters or hiding fields) and publishing it to the server. Details of that can be found here:
http://onlinehelp.tableau.com/current/pro/online/mac/en-us/publish_datasources.html
I have four parameters on my report. Three of them are required for the underlying stored procedure data source, but the fourth parameter is just used to show/hide items on the report.
If the user changes the value for that fourth parameter, is there a way to refresh the report using the existing data without running the stored procedure again? The result set won't change, only the rows that are to be displayed.
Reporting Services 2008 seems to treat each combination of report parameters as a unique set, even if some of them are internal to the report only and not related to the stored procedure. Therefore, aside from report caching, there is no way to prevent the report server from making a round trip to the database, even if only the internal parameter changes. You basically have two options:
Turn on report caching in the report server, and run all combinations of the four parameters, so that the user will hit the report server's cache when running any report. This avoids a round trip to the database, but only for the parameter values you've already tried.
Write your underlying stored procedure with caching behavior so that it writes its results to a database table. Whenever the stored procedure is run, have it first check whether the results for the current set of parameter values are already stored in the cache table, and if so, return those rows to the report server. This still requires a round trip, but it is faster than running the procedure again.
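A rough T-SQL sketch of that second option (procedure, table, and column names are all hypothetical):

CREATE PROCEDURE dbo.GetReportData
    @Param1 INT, @Param2 INT, @Param3 INT
AS
BEGIN
    -- run the expensive query only if this parameter combination is not cached yet
    IF NOT EXISTS (SELECT 1 FROM dbo.ReportCache
                   WHERE Param1 = @Param1 AND Param2 = @Param2 AND Param3 = @Param3)
    BEGIN
        INSERT INTO dbo.ReportCache (Param1, Param2, Param3, Col1, Col2)
        SELECT @Param1, @Param2, @Param3, Col1, Col2
        FROM dbo.ExpensiveSource;   -- the original slow query goes here
    END;

    -- return the cached rows to the report server
    SELECT Col1, Col2
    FROM dbo.ReportCache
    WHERE Param1 = @Param1 AND Param2 = @Param2 AND Param3 = @Param3;
END;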
I'm really at a loss as to how to proceed.
I have a very large database, and the table I'm accessing has approx. 600,000 records. This database is accessed using an accounting application, which provides the report with the SQL query by which this report accesses the database.
My report has a linked subreport with restrictions that are placed in the report header. When this report is run, the average time to refresh, using a very basic query, is 36 minutes. When adding two more items to the query, the report takes 2.5 hours.
Here is what I've tried:
cleaned up the report, leaving only the absolutely necessary items - no difference
removed most formulas (removing the remaining formulas makes no time difference)
tried editing the SQL query - wasn't allowed because of the accounting application
tried flipping subreport and main report - didn't work
added other groupings - no difference
removed groupings - no difference
checked all the servers for lack of temp disc space - no issue
tried "on demand" subreport - no change
checked Parameters (discrete vs. range) and it is as it should be
tried bursting indexes, grouping on server, etc. - no difference
the report requires 2 passes. I've tried getting it down to one pass unsuccessfully.
There must be something I'm missing.
There do not appear to be any other modifications to the report using regular Crystal functions. Is there any way to speed up access to the data without having to go through all 600,000 records? The SQL query that accesses this data is long and has many requests, and it is not something I can change.
Can I add something (formula?) that nullifies these requests? I'm reaching now...
A couple of things we have had success with are adding indexes to the database and, instead of importing tables into the report, writing a stored procedure to retrieve the desired results.
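For illustration only (all names invented), the pattern looks like this:

-- index the columns the report filters on
CREATE INDEX ix_trans_posting_date ON transactions (posting_date, account_id);

-- wrap the report logic in a procedure so the server does the heavy lifting
CREATE PROCEDURE report_transactions @from DATETIME, @to DATETIME
AS
    SELECT account_id, posting_date, amount
    FROM transactions
    WHERE posting_date BETWEEN @from AND @to;

Crystal can then use the procedure as its data source instead of joining raw tables inside the report.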
If indexes and stored procedures don't get you where you need to be, you have reached the "denormalise until it works" part of life with a database. You might want to look at creating an MI database with tables optimized for your reporting needs, plus some data transformation scripts that extract the data from production to your MI database. Depending on your platform, Oracle / MS have tools to help you do this.
We use Crystal Reports with a billing system, and we had queries in the database that took over 1.5 hours to complete. This doesn't even take into account the rendering/formatting of the reports.
We created materialized views and force the client to refresh them daily. A materialized view is basically a database view that persists the returned dataset; the dataset is not refreshed unless you explicitly tell it to refresh.
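In Oracle syntax, for example, a sketch could look like this (view and table names are invented):

CREATE MATERIALIZED VIEW billing_summary_mv
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT customer_id, invoice_month, SUM(amount) AS total_amount
FROM invoices
GROUP BY customer_id, invoice_month;

-- refreshed once a day, e.g. from a scheduled job
EXEC DBMS_MVIEW.REFRESH('BILLING_SUMMARY_MV');

The report then queries billing_summary_mv directly and never touches the slow base query during the day.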
Do you know what the SQL query is? If so, you can move the report outside the accounting application and paste the query directly into a Command in the Database Expert. I've had to do this in a couple of cases with another application I work with.