I'm using the Table component of JasperReports, but when I run the report in iReport it shows one less record for every SQL query. If anyone has used it and faced the same problem, please show me a way to fix it.
I am facing a performance issue with JasperReports Server. My query is for a CrossTab. The query works fine in Toad as well as in Jaspersoft Studio, but it executes very slowly in JasperReports Server and sometimes even fails with a connection timeout.
I can't understand the reason for this behaviour. Please help me.
Thank you
Query performance in JasperReports Server depends on various factors, but to get a quick idea of where the bottleneck might be in the case of a CrossTab (Ad Hoc functionality), follow these steps:
Log in to JasperReports Server through the web UI (as superuser) and take a look at Manage => Server Settings => Ad Hoc Cache. There, analyze the Query and Fetch column values.
Query (msec)
This shows the time from when the query was sent to the database until the first row was received. If this is slow, one possible improvement is to index the fields used in the underlying query. If you are using derived tables, try switching to actual tables, because derived tables are sub-queries/sub-selects and are expensive performance-wise.
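For example, if the Ad Hoc query filters or joins on a date column, an index on that column can cut the time to the first row. The table and column names here are made up for illustration:
CREATE INDEX ix_sales_order_date ON sales (order_date);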
Fetch (msec)
The time from when the first row was received until the last row was received. If this is slow, there might be a network bottleneck. Try setting the fetch size in the jasperreports.properties file to change the number of rows fetched at a time; tuning this can reduce the number of round trips to the underlying database.
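As a sketch, this is the JasperReports property that controls the JDBC fetch size; the exact key may vary by version, so check the configuration reference for your installation before relying on it:
net.sf.jasperreports.jdbc.fetch.size=10000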
I am new to SSIS and am after some assistance in creating an SSIS package for a specific task. My data is stored remotely in a MySQL database and is downloaded to a SQL Server 2014 database. What I want to do is create a package where I can enter two dates to be compared against the create date/date modified on each record in a number of tables, giving me a snapshot that compares the MySQL data to the SQL Server data, so I can see whether any rows are missing from my local SQL Server database or need to be updated. Some tables have no dates, so for those I just want a record count of what, if anything, is missing between the two. If this is better achieved through T-SQL, I am happy to hear other suggestions, or about sites where something similar has been done.
In relation to your query, Tab:
"Hi Tab, what happens at the moment is our master data is stored in a MySQL database, and the data was downloaded to a SQL Server database as a one-off. I currently have an SSIS package that uses the MAX ID, which can be found on most of the tables, to work out which records are new, and just downloads or updates them. What I want to do is run separate checks on the tables to make sure that nothing has been missed during the download and everything is in sync. In an ideal world I would like to pass a date range, say a calendar week, into an SSIS package or T-SQL stored procedure, which would then check for any differences between the remote MySQL tables and the local SQL Server tables. It does not currently have to do anything but identify issues; correcting them may come later, or changes would need to be made to the existing sync package. Hope this makes more sense."
Thanks P
To do this, you need to implement a Type 1 Slowly Changing Dimension data flow in SSIS. There are a number of ways to do this, including a built-in transformation aptly called the Slowly Changing Dimension transformation. Whilst this is easy to set up, it is a pain to maintain and it runs horrendously slowly.
There are numerous ways to set this up using other transformations, or even SQL MERGE statements, which are detailed here: https://bennyaustin.wordpress.com/2010/05/29/alternatives-to-ssis-scd-wizard-component/
I would recommend that you use Lookup transformations: they perform better than the Slowly Changing Dimension transformation, while offering better diagnostics and error handling than the faster SQL MERGE statement.
Before you do this you will need to add a CHECKSUM or HASHBYTES column to your SQL Server data for ease of comparison with the incoming MySQL data.
In short, calculate some sort of repeatable checksum as the data is downloaded into SQL Server, then use it in an SSIS Lookup, matching on the row key, to check for changes. Where the checksum differs for the same row key, the row needs updating; where there is no matching row key in your SQL Server data, you need to insert the new row.
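As a rough T-SQL sketch of that compare-and-apply step, using the MERGE alternative mentioned above (the table and column names are made up, and staging.Customer is assumed to hold the rows pulled from MySQL):
-- Update rows whose hash differs, insert rows with no matching key.
MERGE dbo.Customer AS tgt
USING staging.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED AND HASHBYTES('SHA2_256', CONCAT(src.Name, '|', src.Email))
              <> HASHBYTES('SHA2_256', CONCAT(tgt.Name, '|', tgt.Email)) THEN
    UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerID, Name, Email)
    VALUES (src.CustomerID, src.Name, src.Email);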
When running a report (or verifying the database) in Crystal Reports 10, I am getting the message:
"The database table "SomeTable" cannot be found. Proceed to remove this table from the report?"
for multiple tables.
The report used to work fine. It gets data from multiple sources, and the missing tables are the ones coming from an ODBC connection to a SQL Server database. I think the issue may be that, when the report was created, the ODBC connection pointed at a different instance of the database (same structure, just a different location).
I've checked and the report user has all the required permissions on the new database.
In Crystal, if you ignore the messages, the report seems to run fine. However, when the report is deployed to run from within the Crystal Report Viewer on a website, it throws a File I/O error.
This very handy blog post provides the solution: https://wisdomofsolomon.wordpress.com/2011/06/18/crystal-reports-tables-not-found-during-verify-database/
By running Show SQL Query you can see that the generated query is running SQL like
select * from databasename.dbo.SomeTable
It's the databasename part that seems to be causing the problem (although, as far as I could tell, the database name wasn't any different between the old connection and the new one in my case). Amending the table queries to remove the databasename from the SQL solved the problem for me.
You can do this as follows:
Go to Database / Set Datasource Location in the menus.
Drill down in the report tree to the tables that are causing a problem.
Under Properties, click Overridden Qualified Table Name.
In the text box, type the name of the table without the database name (e.g. dbo.SomeTable).
Do this for all the tables causing a problem.
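For example, the override changes the generated query from the first form below to the second, which resolves within whatever database the connection currently points at:
select * from databasename.dbo.SomeTable
select * from dbo.SomeTable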
(As a comment on that blog post points out, you could also create a new connection and replace the tables with the equivalents from that new datasource, but that leaves you with the fully qualified table name from the new connection - so you might get the same problem again in future.)
In Crystal, under File > Options, on the Database tab, the Data Explorer must have Tables checked. This is not an easy feature to know about.
This error comes up when I run the program: "Failed to load rowset. Incorrect syntax near...". I want to show the report in a Crystal Report. I have many tables linked by key. Can anyone suggest anything?
I don't know what kind of database you're pulling from, but you may want to do your joins before you pull the data into CR.
If that doesn't do it, you may want to save your data to a temporary table, then link CR to the temporary table.
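A minimal T-SQL sketch of that idea (the table and column names are hypothetical):
-- Materialize the joined rows once, then point CR at dbo.ReportStaging.
SELECT o.OrderID, o.OrderDate, c.CustomerName
INTO dbo.ReportStaging
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID;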
In most cases this means that Crystal Reports has built an incorrect SQL statement. This happens sometimes, for example with SQLBase, and is usually difficult to solve.
Sometimes it can be fixed by using a different database driver; for example, try OLE DB instead of ODBC.
If that doesn't help, please provide more details:
Database (SQL Server, Oracle, or ...)
Database driver: ODBC, OLE DB, or ...
Query (via the menu: Database, Show SQL Query).
I'm really at a loss as to how to proceed.
I have a very large database, and the table I'm accessing has approx. 600,000 records. The database is accessed by an accounting application, which supplies the report with the SQL query it uses to access the database.
My report has a linked subreport with restrictions placed in the report header. With a very basic query, the average time to refresh the report is 36 minutes. Adding two more items to the query pushes it to 2.5 hours.
Here is what I've tried:
cleaned up the report, leaving only the items that are absolutely necessary - no difference
removed most formulas (removing the remaining formulas makes no time difference)
tried editing the SQL query - wasn't allowed because of the accounting application
tried flipping subreport and main report - didn't work
added other groupings - no difference
removed groupings - no difference
checked all the servers for lack of temp disc space - no issue
tried "on demand" subreport - no change
checked Parameters (discrete vs. range) and it is as it should be
tried bursting indexes, grouping on server, etc. - no difference
the report requires 2 passes. I've tried getting it down to one pass unsuccessfully.
There must be something I'm missing.
There do not appear to be any other modifications I can make to the report using regular Crystal functions. Is there any way to speed up access to the data without having to go through all 600,000 records? The SQL query that accesses this data is long and has many requests, and it is not something I can change.
Can I add something (formula?) that nullifies these requests? I'm reaching now...
A couple of things we have had success with are adding indexes to the databases and, instead of importing tables into the report, writing a stored procedure to retrieve the desired results.
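A hypothetical sketch of such a stored procedure (the names are made up; the point is to push the filtering and joining into the database so Crystal doesn't pull all 600,000 records):
CREATE PROCEDURE dbo.GetInvoiceReport
    @StartDate date,
    @EndDate   date
AS
BEGIN
    SET NOCOUNT ON;
    -- Return only the rows and columns the report needs.
    SELECT i.InvoiceID, i.InvoiceDate, i.Amount, c.CustomerName
    FROM dbo.Invoices AS i
    JOIN dbo.Customers AS c ON c.CustomerID = i.CustomerID
    WHERE i.InvoiceDate BETWEEN @StartDate AND @EndDate;
END;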
If indexes and stored procedures don't get you where you need to be, you have reached the "denormalise until it works" part of life with a database. You might want to look at creating an MI database with tables optimised for your reporting needs, plus some data transformation scripts that extract the data from production into the MI database. Depending on your platform, Oracle and Microsoft have tools to help you do this.
We use Crystal Reports with a billing system, and we had queries in the database that took over 1.5 hours to complete. That doesn't even take into account the rendering/formatting of the reports.
We created materialized views and have the client refresh them daily. A materialized view is basically a database view that stores the returned dataset; the dataset is not refreshed unless you explicitly tell it to refresh.
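In Oracle, for example, a sketch might look like this (the view, table, and column names are hypothetical):
CREATE MATERIALIZED VIEW billing_summary_mv
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT account_id, TRUNC(invoice_date) AS invoice_day, SUM(amount) AS total
FROM invoices
GROUP BY account_id, TRUNC(invoice_date);

-- Refresh once a day, e.g. from a scheduled job:
BEGIN
    DBMS_MVIEW.REFRESH('billing_summary_mv');
END;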
Do you know what the SQL query is? If so, you can move the report outside the accounting application and paste the query directly into a Command in the Database Expert. I've had to do this in a couple of cases with another application I work with.