I am using JasperSoft Reports v6.2.1, and when I run a report in the Studio preview the output appears after about 2 seconds.
Running the same report (XLSX output) on the server takes more than half a minute, although the data volume is small (a crosstab, 500 rows, 17 columns in Excel, "ignore pagination" = true).
I use $P{LoggedInUsername} to filter data (based on the user's rights) in the WHERE part of a WITH clause. When I run the report with a fixed value (the user's id as a string) in place of the parameter, execution speed is good.
The same holds against the Oracle DB from SQL Developer: with a user's id string, the query's result set is back in 2 seconds.
Also, outputting $P{LoggedInUsername} in a text field produces a String.
As soon as I switch back to the $P{LoggedInUsername} parameter in the query, the report takes ages again or runs out of heap memory in Studio/on the server.
What could be the issue?
Finally, my problem was solved by using the expression user_id = '$P!{LoggedInUsername}' instead of user_id = $P{LoggedInUsername} in the WHERE part of my query.
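For anyone hitting the same thing: `$P{...}` is sent to the database as a JDBC bind variable, while `$P!{...}` is substituted into the query text before execution, so Oracle sees a literal value when planning the query. A rough sketch of the difference (the value 'jdoe' is a hypothetical user id):

```sql
-- $P{LoggedInUsername}: JasperReports issues a prepared statement with a
-- bind variable, so Oracle must plan the query without knowing the value:
WHERE user_id = $P{LoggedInUsername}     -- executed as: WHERE user_id = ?

-- $P!{LoggedInUsername}: the value is spliced into the SQL text before the
-- query is sent, so Oracle sees a concrete literal:
WHERE user_id = '$P!{LoggedInUsername}'  -- executed as: WHERE user_id = 'jdoe'
```

Note that because `$P!{...}` is plain text substitution, it is only safe when the value cannot be influenced by end users (here it comes from the server's login context); otherwise it opens the query to SQL injection.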
Related
When I aggregate values in Google Data Studio with a date dimension on a PostgreSQL Connector, I see buggy behaviour. The symptom is that performing COUNT(DISTINCT) returns the same value as COUNT():
My theory is that it has something to do with the aggregation on the data occurring after the count has already happened. If I attempt the exact same aggregation on the same data in an exported CSV instead of directly from a PostgreSQL Connector Data Source, the issue does not reproduce:
My PostgreSQL Connector is connecting to Amazon Redshift (jdbc:postgresql://*******.eu-west-1.redshift.amazonaws.com) with the following custom query:
SELECT
userid,
submissionid,
date
FROM mytable
Workaround
If I stop using the default date field for the Date Dimension and instead aggregate my own dates directly within the SQL query (date_byweek), the COUNT(DISTINCT) aggregation works as expected:
SELECT
userid,
submissionid,
to_char(date,'YYYY-IW') as date_byweek
FROM mytable
While this workaround solves my immediate problem, it sucks because I miss out on all the date functionality provided by Data Studio (Hierarchy Drill Down, Date Range filtering, etc.). Not to mention that it reduces my confidence in what else may be "buggy" within the product 😞
How to Reproduce
If you'd like to re-create the issue, using the following data as a PostgreSQL Data Source should suffice:
> SELECT * FROM mytable
userid submissionid
-------- -------------
1 1
2 2
1 3
1 4
3 5
> COUNT(DISTINCT userid) -- ERROR: Returns 5 when data source is PostgreSQL
> COUNT(DISTINCT userid) -- EXPECTED: Returns 3 when data source is CSV (exported from same PostgreSQL query above)
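As a sanity check that the expected answer really is 3, here is a minimal sketch that rebuilds the sample table in an in-memory SQLite database (SQLite stands in for PostgreSQL here; the SQL is identical for this query):

```python
import sqlite3

# Rebuild the sample data from the question in an in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (userid INTEGER, submissionid INTEGER)")
conn.executemany(
    "INSERT INTO mytable VALUES (?, ?)",
    [(1, 1), (2, 2), (1, 3), (1, 4), (3, 5)],
)

# COUNT(DISTINCT userid) should differ from COUNT(userid).
distinct_users = conn.execute(
    "SELECT COUNT(DISTINCT userid) FROM mytable").fetchone()[0]
total_rows = conn.execute(
    "SELECT COUNT(userid) FROM mytable").fetchone()[0]

print(distinct_users, total_rows)  # 3 5
```

Any engine that returns the same number for both counts here is aggregating incorrectly, which is exactly the symptom seen through the PostgreSQL Connector.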
I'm happy to report that as of Sep 17 2020, there's a workaround.
Data Studio added the DATETIME_TRUNC function (see https://support.google.com/datastudio/answer/9729685), which lets you add a custom field that truncates the original date to whatever granularity you want, without triggering the distinct bug.
Attempting to set the display granularity in the report still causes the bug (i.e., you'll still see Oct 1 2020 12:00:00 instead of Oct 2020).
This can be solved by creating a SECOND custom field that just returns the first; add THAT field to the report, change its display granularity, and everything will work OK.
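In other words, the two-field setup looks roughly like this (field names are hypothetical; only DATETIME_TRUNC itself is from the documentation linked above):

```
Custom field "date_week" (formula):
    DATETIME_TRUNC(date, WEEK)

Custom field "date_week_display" (formula):
    date_week
```

Add "date_week_display" (not "date_week") to the report, then set the display granularity on it there.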
I had the same issue with the MySQL Connector, but my problem was solved when I changed the date field's type in the DB from DATETIME (YYYY-MM-DD HH:MM:SS) to INT (a Unix timestamp). After connecting the table to Google Data Studio, I set the type for this field to Date (YYYYMMDD) and everything works as expected. Hope this helps you :)
In this Google forum thread there is a curious solution by Damien Choizit that involves combining your data source with itself. It works well for me.
https://support.google.com/datastudio/thread/13600719?hl=en&msgid=39060607
It says:
I figured out a solution in my case: I used a data blend that joins the same data source to itself on the corresponding join key(s); then I specified a date range dimension only on the left side and selected the columns I wanted to COUNT DISTINCT as "dimensions" (and not metrics!) on the right side.
I have executed a query in the Hive CLI that should generate around 11,000,000 rows; I know the result because I have executed the same query in MS SQL Server Management Studio too.
The problem is that in the Hive CLI the rows keep scrolling on and on (right now it has been more than 12 hours since I started the execution), and all I want to know is the processing time, which is shown only after the results.
So I have 2 questions:
How do I skip showing the row results in the Hive command line?
If I execute the query in Beeswax, how do I see statistics such as execution time, similar to SET STATISTICS TIME ON in T-SQL?
You can check it using the link given in the log, but it won't give you the total processing left.
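For question 1, one common way to avoid streaming millions of rows to the console is not to SELECT to stdout at all: write the result somewhere instead, so the CLI only prints job progress and timing. A sketch in standard HiveQL (the paths, table, and column names are hypothetical):

```sql
-- Write the result set to an HDFS directory instead of the console:
INSERT OVERWRITE DIRECTORY '/tmp/query_out'
SELECT userid, submissionid
FROM mytable;

-- Or materialize it as a table you can inspect later:
CREATE TABLE my_result AS
SELECT userid, submissionid
FROM mytable;
```

Either way the MapReduce job runs to completion and reports its elapsed time without printing 11 million rows to the terminal.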
I have a subreport in a Crystal Report that returns 0 records, but the subreport insists on pulling every order remark ever entered while still displaying 0 remarks.
It's very odd: it says "reading records 0 of 150000" and the number keeps increasing.
The subreport's only selection criteria are the company code, the customer code, and the order number; they are all passed with the "{?Pm-..." linking mechanism.
I opened a blank subreport's preview and the SQL didn't even mention the selection criteria; the query run by itself would indeed show every remark.
Any thoughts on how this can happen?
Subreport Formula:
({E_ORD_H.COMP_CODE} = {?Pm-E_ORD_H.COMP_CODE})
and
({E_ORD_H.CUST_CODE} = {?Pm-E_ORD_H.CUST_CODE})
and
({E_ORD_H.ORD_NUM} = {?Pm-E_ORD_H.ORD_NUM})
When a Crystal Report behaves like that, it means there is some logic that cannot be done on the DB server, so it must be done locally. Usually this is caused by using a CR function in the record selection formula that doesn't translate into the DB's language.
In this case, I believe the null-valued parameters are causing it. For example, when a parameter is null, CR will not anticipate the situation and substitute {E_ORD_H.COMP_CODE} is null in its place in the subreport's query. Instead, you need to explicitly check for those null-valued parameters:
not(isnull({?Pm-E_ORD_H.COMP_CODE}))
and {E_ORD_H.COMP_CODE} = {?Pm-E_ORD_H.COMP_CODE}
and not(isnull({?Pm-E_ORD_H.CUST_CODE}))
and {E_ORD_H.CUST_CODE} = {?Pm-E_ORD_H.CUST_CODE}
and not(isnull({?Pm-E_ORD_H.ORD_NUM}))
and {E_ORD_H.ORD_NUM} = {?Pm-E_ORD_H.ORD_NUM}
I am having a problem running reports in Sql Server Reporting Services (SSRS) 2008 R2.
After setting up the site, creating a few users, and uploading a few reports, I select a report to run, enter my parameters, and wait for the report to render...but nothing happens. I see the "Loading" animation for a brief amount of time (varies between one and ten seconds irrespective of the report), but it soon disappears, leaving nothing except for the parameters at the top of the page. No report content is rendered at all.
This occurs on every report that I deploy to the SSRS instance regardless of size, number of parameters, etc. Most of the reports have around eight parameters of similar type, and their layouts are all relatively tabular: there are several "sections", each containing a table that may have grouping and/or repeating rows. On average, each table contains about six fields. The queries that support the report datasets are not terribly heavyweight: I can run them against SQL Server in under half a minute in most cases.
I have examined and implemented the solutions in the following questions, but neither of them corrected the issue.
(Related to Safari/Chrome)
SSRS 2008 R2 - SSRS 2012 - ReportViewer: Reports are blank in Safari and Chrome
(Related to running browser as Administrator)
SSRS 2008 - Reporting Services Webpage is Blank except for headers
Other options I have considered include running the reports from the computer directly instead of through Remote Desktop, and running them from other computers on the network besides the server that hosts them. Neither option succeeds consistently.
What I have determined, however, is that this problem is directly related to the number of parameters passed to the report. For instance, most of my reports have upwards of six or seven multi-select parameters; each of them is populated at runtime from the database. If users perform a "Select All" or otherwise select a large number of the options in the parameter selection, the above-mentioned issue appears. However, if users are specific with their parameters and only select a few, then the report renders correctly.
Further complicating this situation is the fact that it suffers from the "works-on-my-machine" syndrome. It occurs in some environments but not in others.
As mentioned, I am able to run the reports by limiting the number of selected parameter values; this works in every environment. I am curious, however, as to why this happens (an internal SSRS limit, perhaps?), and what I can do to consistently avoid it in the future.
I had the same problem, and by following Alex Zakharov's suggestion I found a parameter that had almost 700 values, all selected by default. Deselecting a few resolved my issue immediately, but that was unsatisfactory for a production release.
Looking at the log files suggested by lrb (\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles\), I found the following error message:
ERROR: HTTP status code --> 500
-------Details--------
System.Web.HttpException: The URL-encoded form data is not valid. --->
System.InvalidOperationException: Operation is not valid due to the current state of the object.
at System.Web.HttpValueCollection.ThrowIfMaxHttpCollectionKeysExceeded()
Solution
This led me to this answer: find and update the web.config files, add the lines below, and then restart the service.
<appSettings>
<!-- default is 1000 -->
<add key="aspnet:MaxHttpCollectionKeys" value="2000" />
</appSettings>
I had to update it in two places, because SSRS has separate websites for Report Manager and Report Server:
\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\web.config
\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportManager\web.config
SUGGESTION
I just feel like sharing my solution to this question as well. After much struggle and many tries, my problem turned out to be on the front end: there was one piece of code I was missing, which produced blank pages even though the report was working well on the server.
Using Angular to fetch an HttpResponseMessage from a web API, this is how my call looked initially:
$http.get(url + 'Report/getUsersReport?CustomerId=' + searchCriteria.customerId +
          '&fromDate=' + searchCriteria.fromDate +
          '&toDate=' + searchCriteria.toDate)
    .then(getContractsForBackOfficeReportComplete, function (err, status) {
        defered.reject(err);
    });
and this is the call after adding { responseType: 'arraybuffer' }:
$http.get(url + 'Report/getUsersReport?CustomerId=' + searchCriteria.customerId +
          '&fromDate=' + searchCriteria.fromDate +
          '&toDate=' + searchCriteria.toDate,
          { responseType: 'arraybuffer' })
    .then(getUsersReportBackOffice, function (err, status) {
        defered.reject(err);
    });
There are no internal constraints on parameters!
First suggestion:
To debug more easily, display a label when the table returns "no value".
I solved the same problem by correcting the parameter syntax in the SQL query:
...
WHERE t.FieldName = ISNULL(@parameter, t.FieldName)
AND t.FieldName2 = ISNULL(@parameter2, t.FieldName2)
...
I ran into the same problem with a multi-choice drop-down parameter that had a long list of possible values (>100). Check your situation and investigate which parameter breaks the report.
I have a report scheduled to run daily in Jasper. It ran the first day, but since then it gives me an empty report. I am passing a parameter, and I tested whether it is getting passed to iReport; yes, it is.
I ran the query against the DB backend and I get some records, so the second day's output (an empty page) is not correct.
The same Jasper report worked fine after I simply cut and pasted the same text back into the code; basically I didn't change anything. I wonder whether the first-time compilation makes it work and I need to recompile later?
But I still cannot crack this, and any input from your side would be helpful.