How to update a table using schema compare while data is in the table? - tsql

Using Visual Studio Enterprise 2015
I would like to use schema compare to update a bunch of table changes from a test environment to my local one.
I'm getting the error:
Rows were detected. The schema update is terminating because data loss
might occur.
So this is saying I have data in the tables I want to update and I could lose data if I made the table changes. But I'm going to do a data compare afterwards to get the updated data as well. How can I override the above error and force the changes? Or do I have to just truncate the tables with data in them first?
Thank you in advance

I found the answer in the settings.
Click on the options cog wheel that's next to the Compare and Update buttons.
Next, click on the General tab and uncheck "Block on possible data loss".
Hope this helps someone in the future.
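If you later script the same deployment from the command line with SqlPackage instead of the Schema Compare UI, the equivalent knob is the BlockOnPossibleDataLoss publish property. A sketch only; the file, server, and database names below are placeholders:

SqlPackage.exe /Action:Publish ^
    /SourceFile:"C:\path\MyDatabase.dacpac" ^
    /TargetServerName:"localhost" ^
    /TargetDatabaseName:"MyDatabase" ^
    /p:BlockOnPossibleDataLoss=False

Either way, make sure you really do want the destructive changes before turning the safeguard off.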


How to detect if tableview was changed

I have the following idea: the user can edit the database, and when they exit they will be asked whether they want to save the edited database; if they edited something and then changed it back, they won't be asked.
I think I should compare the database as it was created with the database after editing when the user presses exit, but I don't know how.
This is my code for creating the database:
// Create a relational model over the "cv" table, filtered to the current CV.
model = new QSqlRelationalTableModel(this, *db);
model->setTable("cv");
model->setFilter("cv_id = " + currentCV);
model->removeColumns(0, 1);   // drop the id column
model->select();
ui->tableView->setModel(model);  // attach the model before showing the view
ui->tableView->show();
You have 2 options.
Create event triggers in the database. Event triggers fire when users change table structures, create tables, or alter columns; in other words, they fire whenever a user executes a DDL command (ALTER TABLE, add column, CREATE INDEX, DROP TABLE, etc.). You can log these commands into your own tables from the event triggers.
Your database structures (all tables, columns, indexes, etc.) are stored in INFORMATION_SCHEMA. You can select the data from these views, save it somewhere, and then compare it with the structure after it has changed. A sketch of both options follows below.
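A minimal SQL Server sketch of both options (the table and trigger names are made up for illustration; other engines offer similar DDL-trigger and catalog facilities):

-- Option 1: log every DDL statement with a database-level DDL trigger.
CREATE TABLE dbo.DdlLog (
    EventTime datetime2 NOT NULL DEFAULT SYSDATETIME(),
    EventData xml NOT NULL
);
GO
CREATE TRIGGER trg_LogDdl ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
    INSERT INTO dbo.DdlLog (EventData) VALUES (EVENTDATA());
GO

-- Option 2: snapshot the current structure and diff it against a later snapshot.
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
INTO dbo.SchemaSnapshot
FROM INFORMATION_SCHEMA.COLUMNS;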
Keep in mind that no change to the database is performed until the user's changes are submitted. By default, edits are submitted automatically (as soon as the user moves to another row), and this does not help you. But you can set
model.setEditStrategy(QSqlTableModel.EditStrategy.OnManualSubmit)
At this point, changes are persisted only when the method model.submitAll() is called.
All you have to do, at this point, is exploit the dataChanged signal of your model and use a flag variable to check whether changes have been made:
data_changed = False

def on_data_changed():
    global data_changed   # update the module-level flag
    data_changed = True

model.dataChanged.connect(on_data_changed)
Now you can adjust the logic of the code to serve your specific purpose (for example, only ask the user to save on exit when data_changed is True).

How to call a SAS dataset by its label, or where to check its name

I have a problem in dealing with SAS Enterprise Guide that runs on the server of my client.
I do not have access to the libraries, so in order to use the datasets the only thing we can do is store them on the local C: drive of the computer and drag them into SAS.
We can not create libraries because the server does not read local paths.
Once you drag a table, let's call it "mydata", into SAS, the table is automatically renamed "mydata9865", with random numbers at the end, and "mydata" becomes its label.
If you right-click the table and go to properties, you can't find the name of the table, just the label.
The only way I found to check the real name of the dataset is to open the Query Builder and check the name in the code preview.
The problem is that I am dealing with tables of millions of records and the machine I am using is very slow, so whenever I want to open the Query Builder just to check the table's name, it can take up to an hour.
I am not a SAS expert, so I am sure there is a smarter way to do so. Is it possible for instance to use the table by calling it with its label?
data mydata2;
set mydata;
run;
instead of
set mydata9865?
Or is there some place I can rapidly check the name of the table without going through the Query Builder?
I tried to google it but I can't find anything; I hope someone will be able to help me!
Thank you in advance
Hover the mouse pointer over a data node to see its attributes. The data set name is the "File name:" value.
For example, I had renamed the nodes created by two different queries to be the same (doable: yes; smart: maybe not). NOTE: a data node's "Label:" is not necessarily the same as its underlying data set's label metadata.
Regarding "use the table by calling it with its label?": two nodes can have the same label, which is a situation that defeats this approach.
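If hovering over every node is inconvenient, a quick programmatic alternative (a sketch, assuming the dragged data sets end up in the WORK library of the server session) is to list the physical data set names together with their labels from the dictionary tables:

proc sql;
    select memname, memlabel   /* physical name and label of each data set */
    from dictionary.tables
    where libname = 'WORK';
quit;

Adjust the libname filter if the data sets land in a different library.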
Use the COPY task to upload your data explicitly. It sounds like you're not adding your data to the project properly, so SAS automatically assigns a name; that wouldn't happen if you explicitly imported or loaded your data.
Problem solved! I should have simply uploaded the data to the server with Tasks -> Data -> Upload Data Sets to Server, but I didn't know this task existed, so I didn't know it was possible to do at all!
https://communities.sas.com/t5/SAS-Enterprise-Guide/Importing-sas-data-sets-from-C-drive-into-SAS-EG-not-possible/td-p/135184
Thank you everybody for your help!

SSMS 2008 adding ALTER TABLE WITH CHECK ADD CONSTRAINT to my Procs

I have searched Google, BOL, and several forums and can't find the answer:
I have a very small database application for which I write some queries and SPs to extract data. A few days ago I opened an existing SP to find that something had added code similar to that below, sometimes with multiple lines referring to every table in the database. When I set up a new simple SP like "Select * from TinyTable" and re-open it, the same code has been inserted.
The last thing I remember doing was reviewing the settings for Results to Grid in SSMS 2008 R2. I'm afraid I may have accidentally changed a setting, but I've spent hours reviewing them and can't identify what it might be.
I have considered reinstalling SSMS to get back to the defaults, but I have a linked server set up to solve a collation conflict and don't want to cause problems with that. If anyone can point me in the right direction I'd appreciate it. I may be searching using the wrong terminology, but I can find nothing. As I say, I don't know for sure that a change to the SSMS Tools Options is the problem, but I suspect it could be something I have done.
Here's a sample of what gets automatically inserted at the bottom of every one of my procs:
GO
ALTER TABLE [dbo].[tblLot] WITH CHECK ADD CONSTRAINT [FK_tblLot_tblLocation] FOREIGN KEY([LocID])
REFERENCES [dbo].[tblLocation] ([LocID])
You likely have Tools > Options > SQL Server Object Explorer > Scripting > Object Scripting Options > Generate script for dependent objects set to True. Try changing the value to false.

Visio 2013: How to trigger a change in databinding of all shapes

I have a nice process overview for our ordering process in Visio. I have an external data source (SQL Server), which works fine. Every record in my data source represents one ordering process. Currently all my shapes of the process are linked to the first record of the data source.
Now I want to add a dynamic behavior. What I want to achieve is this:
A user provides the order reference in a textbox (order reference is a column in the data source)
Afterwards the user clicks a button
After the button click, the process is updated and all shapes are now linked to the external data source record that matches the provided order reference.
So in short: the user should be able to select which process that needs to be visualized.
I assume that this is common functionality, but I don't see how I can deal with this requirement. I've already spent some days searching on this issue, but without any success.
Can you help me with this issue?
Thanks a lot!
Problem solved :-)
Some old-school VBA was required. Using the DataRecordset object did the trick. It contains a method GetDataRowIDs that you can use to query the external data set. Once you have the record to visualize, it's just a matter of dynamically updating the shapes with the correct record. Use macro recording to see how to do this.
MSDN: http://msdn.microsoft.com/en-us/library/office/ms195694(v=office.12).aspx

Get error when saving a modified record using LightSwitch on Azure

I am using LightSwitch on Azure.
After I modified a column in a record, when I clicked the Save button I got:
"Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries."
I use VS 2012 on my dev machine to debug this LightSwitch app. It works fine with no errors when I modify the same column on the same records and then save.
Does anybody in this forum have an idea what could cause this, and how I should work around it?
I suspect the Azure machine doesn't have the same version of EF as my dev machine, but in the LightSwitch project I could not find EF referenced in either the client or the server. So I don't know how I can bring the EF DLL from my machine up to the Azure machine.
Anybody could give me some suggestion on this?
Thanks
Chris
Usually it's a side effect of optimistic concurrency. This article can give you an idea of how it works in LightSwitch:
LightSwitch 2012 Concurrency Enhancements
When it's working on the dev machine and not working on Azure, I'd guess something is not right in your production database.
you can also take a look at Entity framework: affected an unexpected number of rows(0)
With INSTEAD OF insert/update triggers, SQL Server sometimes does not report back an identity scope for each newly inserted/updated row, so EF cannot determine the number of affected rows.
Normally, any insert into a table with an identity column is immediately followed by a select of SCOPE_IDENTITY() to populate the associated value in Entity Framework. The INSTEAD OF trigger causes this second step to be missed, which leads to the 0-rows-affected error.
You can change your trigger to an AFTER trigger, or tweak your INSTEAD OF trigger by adding the following line at the end of it:
select [Id] from [dbo].[TableXXX] where @@ROWCOUNT > 0 and [Id] = scope_identity()
Find more details in this or this thread.
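For context, here is a minimal sketch of the trigger workaround described above (the table, column, and trigger names are made up for illustration, not taken from the original post): the INSTEAD OF trigger performs the insert itself and then returns the new identity value so EF sees the expected key and row count.

CREATE TRIGGER dbo.trg_TableXXX_InsteadOfInsert
ON dbo.TableXXX
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Do the insert that the trigger intercepted.
    INSERT INTO dbo.TableXXX (Name)
    SELECT Name FROM inserted;

    -- Return the generated identity so Entity Framework can populate the key.
    SELECT [Id] FROM [dbo].[TableXXX]
    WHERE @@ROWCOUNT > 0 AND [Id] = SCOPE_IDENTITY();
END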