I am using SpagoBI version 3.6.0, Jaybird-2.2.2JDK_1.7 and Firebird 2.5 (x64). I set up a datasource and the connection test is OK.
I set up a dataset and the preview shows the correct list of columns, but there is no data. Accessing the database via another SQL viewer shows the data.
The error message in the Catalina log is:
org.firebirdsql.jdbc.FBSQLException: The result set is closed
Does anybody have an idea what I did wrong?
After some testing, the solution to your problem is to specify the connection property defaultHoldable=true in the connection URL of the datasource, for example:
jdbc:firebirdsql://localhost/database?defaultHoldable=true
As commented earlier, you also need to upgrade to Jaybird 2.2.7; otherwise you will be confronted with bugs JDBC-304 and/or JDBC-305.
I haven't checked the code of SpagoBI, but it looks like SpagoBI assumes that result sets are always holdable over commit and executes its queries using auto commit. It should either not use auto commit, or check the DatabaseMetaData.getResultSetHoldability() and/or Connection.getHoldability() and explicitly request holdable result sets.
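Not part of the original answer, but to make that last point concrete, here is a minimal Java sketch (hypothetical URL and credentials, assuming Jaybird 2.2.7 on the classpath) that sets defaultHoldable=true in the URL, checks the holdability the driver reports, and explicitly requests a holdable result set:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HoldableResultSetExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL and credentials; defaultHoldable=true asks Jaybird
        // to make result sets holdable over commit by default.
        Connection con = DriverManager.getConnection(
                "jdbc:firebirdsql://localhost/database?defaultHoldable=true",
                "SYSDBA", "masterkey");

        // What the driver and the connection report about holdability
        System.out.println("Driver default holdability: "
                + con.getMetaData().getResultSetHoldability());
        System.out.println("Connection holdability: " + con.getHoldability());

        // Or request holdable result sets explicitly for a single statement
        con.setAutoCommit(false);
        Statement stmt = con.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,
                ResultSet.CONCUR_READ_ONLY,
                ResultSet.HOLD_CURSORS_OVER_COMMIT);
        ResultSet rs = stmt.executeQuery("SELECT 1 FROM RDB$DATABASE");
        con.commit(); // a holdable result set stays open across this commit
        while (rs.next()) {
            System.out.println(rs.getInt(1));
        }
        rs.close();
        stmt.close();
        con.close();
    }
}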
Related
I'm facing this annoying problem in Jasper. I have created a report based on a PostgreSQL function. When I view the preview, the results come back without any problem. However, when I publish the report and try to execute it, I get this error:
org.postgresql.util.PSQLException: ERROR: cannot execute CREATE TABLE in a read-only transaction
I've searched the internet for a possible solution; so far this is the only thing I have found describing a similar problem:
https://community.jaspersoft.com/questions/814793/report-execution-fails-due-read-only-transaction-mode
However, adding the property to the URL does not work, or perhaps I'm not writing it the right way:
jdbc:postgresql://server:5432/data_base?defaultReadOnly="false"
In Jasper, what else can I do? I can only query the function, and requesting any change to it is a HUGE bureaucratic issue.
Jasper Studio 6.3.0
According to the documentation the JDBC connection parameter would be readOnly=false.
Have you verified that you are not connecting to a streaming replication standby server?
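Purely as an illustration (not from the Jaspersoft thread): with the PostgreSQL JDBC driver the parameter is spelled readOnly and takes a bare boolean, so the URL form would be jdbc:postgresql://server:5432/data_base?readOnly=false (no quotes around the value). The same property can also be set programmatically; a small sketch with hypothetical credentials:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ReadOnlyUrlExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "report_user");   // hypothetical credentials
        props.setProperty("password", "secret");
        props.setProperty("readOnly", "false");     // same effect as ?readOnly=false in the URL
        Connection con = DriverManager.getConnection(
                "jdbc:postgresql://server:5432/data_base", props);
        System.out.println("read only: " + con.isReadOnly());
        con.close();
    }
}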
I'm updating an older report system that was developed with VS 2008 and Crystal Reports. After the updates, some reports started prompting for a database login, while others work perfectly (with the same updates). The reports were changed to include a new table and fields. All table and report document connections are established via a common routine, similar to: SetDBLogon(myConnectionInfo, Me.CrystalReportViewer1.ReportSource)
Public Sub SetDBLogon(ByVal myConnectionInfo As ConnectionInfo, ByVal myReportDocument As ReportDocument)
    ' Apply the supplied connection info to every table in the report
    Dim myTables As Tables = myReportDocument.Database.Tables
    For Each myTable As CrystalDecisions.CrystalReports.Engine.Table In myTables
        Dim myTableLogonInfo As TableLogOnInfo = myTable.LogOnInfo
        myTableLogonInfo.ConnectionInfo = myConnectionInfo
        Try
            myTable.ApplyLogOnInfo(myTableLogonInfo)
        Catch ex As Exception
            MsgBox(ex.Message)
        End Try
    Next
End Sub
It scans through each table and sets the connection. It also scans sub-reports. I'm not sure what causes Crystal Reports to request a login when the connection is already set explicitly. Even when correct credentials are provided, it still fails to connect.
I've tried removing the report object and inserting the latest version.
Here's the issue and the solution to this problem (in my case).
Crystal Reports data sources can include ADO.NET, OLE DB, ODBC, etc., with various drivers. The reports were created with a specific connection and driver that no longer applied, so I used a new database connection. Since the application scans each report and sets the correct connection parameters, this would normally work, and it has worked in the past. But the problem was that the target system didn't have the right drivers for the connection provider I used. What made this harder to troubleshoot is that the connectivity piece in Crystal Reports is not very intuitive, and duplicate connection names can be created with different providers -- same names, different providers.
The solution was to open the report and go to:
Database Expert > Set Datasource Location
and this is the key part:
Select the connection with the correct provider.
In my case, this was SQLOLEDB.
You can right-click the connection and choose "Properties" and check the provider.
Another way to resolve it would be to install the correct drivers and versions. In this case, since the SQLOLEDB provider was installed and already worked, I decided to have all the reports use that provider exclusively instead.
You may need to check the installed providers to verify this; a direct way is to check the registry. For example, the SQL Server Native Client (SQLNCLI10) can be found at:
HKLM\SOFTWARE\Microsoft\SQLNCLI10
I have a business scenario where, whenever a new record is loaded into a DB table:
a) A notification is sent to the client. The notification message conveys that the data is loaded and ready for querying.
b) Upon receiving the notification, the client makes an OData query to the JBoss virtual DB. OData is served by the Teiid VDB.
The problem is that new records (inserted via a manual/automated SQL script) are not returned in the OData query response. It always returns the cached result for the first 5 minutes, because OData has a default cache time setting of 5 minutes.
We want Teiid to always return all the records, including the newly inserted ones.
I tried the following options, but they are not working as expected (https://developer.jboss.org/wiki/AHowToGuideForMaterializationcachingViewsInTeiid):
1) Cache hints
/*+ cache(ttl:300000) */ select * from Source.UpdateProduct
2) OPTION NOCACHE
This works when I make a JDBC query to the DB.
Please suggest how to turn off this caching for the OData REST query.
I think the Teiid documentation at https://docs.jboss.org/author/display/TEIID/OData+Support could help.
You don't specify which version of Teiid you use, so I'm linking the most current version's documentation.
When you go through that docs page, at the bottom there is a Configuration section listing several configurable options.
Doesn't the skiptoken-cache-time option serve your need? Try setting it to a lower value or zero and see if this helps. Just locate the odata WAR, open it, and change the WEB-INF/web.xml file.
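For illustration only (this is a sketch, not taken from the docs verbatim -- the exact element and its placement vary between Teiid versions, so match whatever entry already exists in the file), the setting in WEB-INF/web.xml might look roughly like:

<context-param>
    <param-name>skiptoken-cache-time</param-name>
    <!-- lower value, or zero, per the suggestion above -->
    <param-value>0</param-value>
</context-param>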
Jan
I added a Database Writer destination to a working Mirth channel. The destination is not writing to the table like it is supposed to, but it is not generating errors on the dashboard. I'm not really sure how to get it to work.
Here are the steps I have taken so far:
changed the name of the table to a non-existent table // does not generate an error, suggesting that it does not even recognize the destination
validated the connector (successful)
verified the username/password/URL are correct (I even cloned a working Database Writer from the same channel to try to get it to run)
removed all filters (in case it was filtering for some reason)
cloned the same transformer used in another working destination from the same channel
allowed nulls in the SQL Server database in case it was trying to insert nulls
disabled/enabled the channels, started/restarted Mirth, opened/closed SQL Server
I am not really sure what else there is to do. Any suggestions?
You have to click Deploy All Channels in the Channels menu in order for Mirth to launch the modified version of a channel after you make changes to it. Then you may have to start all channels in the dashboard too. That got my channel working.
Whenever I try to apply a filter to an attribute that has ValueSelection = Dropdown, the dropdown is not populated and the error message "The requested list could not be retrieved because the query is not valid or a connection could not be made to the data source" is shown instead.
If I set ValueSelection = List, I get a different error message:
An attempt has been made to use a semantic query extension associated with the data extension 'SQL' that is not registered for this report server.
(Microsoft.ReportingServices.SemanticQueryEngine)
This happens within BIDS environment and was observed both in SQL 2005 and SQL 2008.
I've already studied articles that discuss similar problems, but none of them applied to my case. The user account in the data source has all the necessary rights, and data can be retrieved without any problem (for example, if I try "Explore data" in the data source view). SQL Profiler shows that no query is sent to SQL Server when there is an attempt to populate the dropdown. So nothing is wrong with the query; it is simply never executed.
Your connection is not working. Test your connection with a simple table and query output first.
This will let you verify the connection before trying anything advanced.
I got this problem, and in my case it was caused by a wrong connection string in the data source: instead of just having a SQL Server name like "SOMESQLSERVER_MACHINE", I had for some reason "SOMESQLSERVER_MACHINE.our.corp.domain". It was supposed to be the same server, but then I realized the domain was wrong; after removing it, everything worked like a charm again. That said, it's always a good idea to start with detailed checks of your basic settings.
Otherwise this could be a problem with permissions on the folders in Report Manager.