I have searched Google, BOL, and several forums and can't find the answer:
I have a very small database application for which I write some queries and SPs to extract data. A few days ago I opened an existing SP and found that something had added code similar to what is shown below, sometimes multiple lines referring to every table in the database. When I set up a new simple SP like "Select * from TinyTable" and re-open it, the same code has been inserted.
The last thing I remember doing was reviewing the settings for results to grid in SSMS 2008 R2. I'm afraid I may have accidentally changed a setting, but I've spent hours reviewing them and can't identify what it might be.
I have considered reinstalling SSMS to set back to defaults but I have a linked server set up to solve a collation conflict, and don't want to cause problems with that. If anyone can point me in the right direction I'd appreciate it. I may be searching using the wrong terminology but can find nothing. As I say, I don't know for sure a change to the SSMS tools options is the problem but I suspect it could be something I have done.
Here's a sample of what gets automatically inserted at the bottom of every one of my procs:
GO
ALTER TABLE [dbo].[tblLot] WITH CHECK ADD CONSTRAINT [FK_tblLot_tblLocation] FOREIGN KEY([LocID])
REFERENCES [dbo].[tblLocation] ([LocID])
You likely have Tools > Options > SQL Server Object Explorer > Scripting > Object Scripting Options > Generate script for dependent objects set to True. Try changing the value to false.
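As an aside on the linked-server concern: linked servers are defined on the SQL Server instance itself, not in SSMS, so resetting or reinstalling SSMS will not remove one. If you want to double-check the definition before changing anything, here is a quick sketch using the standard catalog view (nothing specific to your setup):

-- Linked server definitions live on the server, not in SSMS settings.
-- This lists them along with the collation-related options that matter
-- when the linked server was created to work around a collation conflict.
SELECT name, data_source, is_collation_compatible, uses_remote_collation, collation_name
FROM sys.servers
WHERE is_linked = 1;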
Is there any way to force SSMS (SQL Server Management Studio) 15 to create a column in the designer with the default data type that I want, in this case nvarchar(100) instead of nchar(10)?
Interestingly, there are a few ideas about this on the web involving changing the registry; however, when I do this and restart SSMS, the registry just reverts back to nchar(10).
This makes me wonder if there is now a setting or an option inside SSMS to achieve this. I have expanded every tree in the Options dialog and I cannot find anything there.
I would actually consider a plug-in at this point.
You can modify these defaults by altering specific keys in the registry.
Look for registry keys named:
SSVDefaultColumnType
SSVDefaultNCharLength
Depending on the exact version of your SSMS installation, you can find these keys under different paths.
For example, using SQL Server Management Studio 17.9.1, you can find these keys under the following path:
HKEY_CURRENT_USER\Software\Microsoft\SQL Server Management Studio\14.0\DataProject
I changed these keys' values so that the default becomes varchar(50), and the default type/size in the table editor changed accordingly.
I decided to post an answer because the behaviour of SSMS is a bit more complicated.
Firstly, for version 18.5.1 (latest, the version I am using) the registry path is:
HKEY_CURRENT_USER\Software\Microsoft\SQL Server Management Studio\18.0_IsoShell\DataProject
It's not difficult to find the path and the keys, as others have already stated.
However, the problem was not finding the keys; it was that SSMS kept reverting them when I restarted the editor after making the change. Since the editor saves those keys on closing, and I was editing the registry while SSMS was still open (after each failed attempt), my changes were always being overwritten.
The trick is to close SSMS before changing the registry, something I wasn't doing previously. Every article on this subject says to 'restart the editor after making the changes', whereas the correct advice would be to 'close the editor before making the changes'.
Hopefully someone else having the same problem will find this useful.
I have performed this process numerous times with other Reports but this one Report is not working as it should.
Essentially, I am trying to point a report at a new server where the stored procedure is EXACTLY the same as on the previous server. I am using the Verify Database functionality to do this. But when I point at the new server and enter parameters, CR prompts me to re-map the fields. This would be only slightly annoying if the Map Fields window actually displayed the returned columns from the new server.
But even with the 'Match type' option unchecked, the Map Fields window displays no columns from the stored procedure to be mapped. I have clicked on every field in the report, but none of them show any columns to map to.
I have also tried changing the Database Location first before trying to verify, but that doesn't make any difference.
Has anybody else seen this? Is there any sort of workaround?
I found my solution. Kinda dumb, really.
My stored procedure calls another stored procedure. I commented that call out and tried to Verify the Database and it worked.
Apparently Crystal Reports doesn't handle procedures that call other procedures very well when trying to map fields.
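To illustrate the pattern (with made-up procedure and table names rather than the real report's code), the report's procedure looked roughly like the sketch below; temporarily commenting out the nested EXEC was enough for Verify Database to map the fields again.

-- Hypothetical outer procedure bound to the Crystal report.
CREATE PROCEDURE dbo.rptGetOrders
AS
BEGIN
    -- Nested procedure call: commenting this out temporarily lets Verify Database work.
    -- EXEC dbo.LogReportRun @ReportName = 'Orders';

    SELECT OrderID, CustomerName, OrderDate
    FROM dbo.Orders;
END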
I'm using DBeaver 5.2.5.201811181655 with IBM DB2/400 v7r3.
I'm trying to see a schema called WRKCERTO, but Database Navigator will not show it. The schema is there and I have rights to it, and I'm able to run SQL scripts against its objects, such as SELECT * FROM WRKCERTO.DAILYT, and they work.
To make matters stranger, when WRKCERTO is the only schema in the filters, the contents of a schema which I cannot identify are shown under the connection as if the connection is their parent. It doesn't show any schema as a node in the tree between the connection & Tables, Views, etc. The tables are familiar, but I cannot determine their exact schema, and as such also cannot query any of them because DBeaver doesn't know what schema to use.
The behavior of the Projects window is the same.
If I connect with SquirrelSQL 3.8.1 everything looks ok. I can see WRKCERTO along with all my other schemas as if nothing is different.
The schema I use most is F_CERTOB, which is visible under the connection ASP7; that connection currently has two schema filters, F_CERTOB and WRKCERTO. But WRKCERTO isn't shown.
The connection TEST is an exact copy of ASP7, but its only filter is WRKCERTO. And as mentioned above, the items under the connection name cannot be identified.
I've gone through the DBeaver settings, but I cannot find any way to change this behavior. AND...this is the first time I've tried to use WRKCERTO. I tried to access it for the first time only a couple days ago, so it seems unlikely there are bad bits of information about it floating around in my system, or in DBeaver.
What information can I provide to help diagnose this issue...?
Please check the URL below; a similar issue is mentioned there with a possible solution.
You may also want to try it and let me know whether or not it works.
https://dbeaver.io/forum/viewtopic.php?f=2&t=911
I am using LightSwitch on Azure.
After I modified a column in a record and clicked the Save button, I got:
"Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries."
I use VS 2012 on my dev machine to debug this LightSwitch app. It works fine with no errors when I modify the same column on the same records and then save.
Does anybody in this forum have an idea what could cause this, and how I should work around it?
I suspect the Azure machine doesn't have the same version of EF as my dev machine, but in the LightSwitch project I could not find EF referenced in either the client or the server references. So I don't know how I can bring the EF DLL from my machine up to the Azure machine.
Could anybody give me some suggestions on this?
Thanks
Chris
Usually it's a side effect of optimistic concurrency. This article can give you an idea of how it works in LightSwitch:
LightSwitch 2012 Concurrency Enhancements
Since it works on your dev machine but not on Azure, I'd guess something is not right in your production database.
You can also take a look at Entity framework: affected an unexpected number of rows(0).
With INSTEAD OF insert/update triggers, SQL Server sometimes does not report back an identity value for each newly inserted/updated row, so EF cannot determine the number of affected rows.
Normally, an insert into a table with an identity column is immediately followed by a select of scope_identity() to populate the associated value in Entity Framework. The INSTEAD OF trigger causes this second step to be missed, which leads to the '0 rows affected' error.
You can change your trigger to an AFTER insert/update trigger, or tweak your trigger by adding the following line at the end of it:
select [Id] from [dbo].[TableXXX] where @@ROWCOUNT > 0 and [Id] = scope_identity()
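For context, here is a minimal sketch of what that looks like inside an INSTEAD OF INSERT trigger (the trigger, table, and column names are made up; [Id] is assumed to be the identity column):

CREATE TRIGGER dbo.trg_TableXXX_InsteadOfInsert
ON [dbo].[TableXXX]
INSTEAD OF INSERT
AS
BEGIN
    -- Perform the insert that the trigger intercepted.
    INSERT INTO [dbo].[TableXXX] (SomeColumn)
    SELECT SomeColumn FROM inserted;

    -- EF appends its own scope_identity() lookup after the INSERT, but with an
    -- INSTEAD OF trigger that outer-scope lookup returns NULL, so return the
    -- generated identity from inside the trigger instead:
    SELECT [Id]
    FROM [dbo].[TableXXX]
    WHERE @@ROWCOUNT > 0 AND [Id] = scope_identity();
END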
Find more details in this or this thread.
I'm fairly new to using the BIRT Report Designer and need to figure out how to generate a report from a SQLite database. I have succeeded in getting it to connect to the DB but am now unsure how to generate a report, and the tutorials that I have found aren't of much help so far.
I have a template, given to me by my employer, that has a few fields. I'm wondering if these field names (in the template) are supposed to match field names in the DB.
Also, when I go to Run -> View Report -> As PDF, I am unsure what I am supposed to enter for the field "User Key". Does this correspond to a table name in the DB or something along those lines?
As of now, I have tried entering a table name but just a blank report is generated.
If anyone can point me to a good resource or help with this I would greatly appreciate it. Thanks
There are two books I could really recommend:
BIRT - A Field Guide to Reporting
Integrating and Extending BIRT
and the Eclipse Help containing BIRT documentation.
I suppose the User Key could be a report parameter (listed in the Data Explorer window) which is passed to a Data Set to select the appropriate data. If I'm guessing right, check within the Data Set editor ("Parameters" tab and "Query" tab) where the User Key parameter goes in - probably into one of the table fields in a WHERE clause. Parameters in a query are represented by question marks: SELECT * FROM fooTable WHERE barColumn = ?. Tracking this down should help you find out what to enter for the parameter.
Additionally, ensure that your Data Set(s) are connected correctly to your SQLite Data Source ("Data Source" tab in the Data Set editor).
Being as new as you are to BIRT, I would suggest building a couple of reports with the sample DB (Classic Models). There are many, many samples out there for you to use as a guide. Additionally, most tutorials will use the Classic Models data so you can follow right along. After you create a couple of practice reports (this should not take more than 30-45 minutes) the template you have been given will likely make A LOT more sense and allow you to make progress almost immediately.
If you are looking for a nice collection of tutorials and samples, be sure to check out Birt Exchange for Dev Share (samples) & tutorials.
As for the "User Key" this is almost certainly a report-level parameter used to filter the data set (as the previous answer points out).
Good Luck!