Using SQL Server Management Studio (the latest version, with SQL Express), I enable change tracking on a database and some of its tables.
If I ever need to edit the columns of one of these tables where change tracking is enabled, I will have to re-enable change tracking manually on that table afterward.
Is there any way to have change tracking automatically re-enabled after an edit to the table structure?
I don't think so, but if you can afford to lose some of the data you could set up a job to run periodically and re-enable it. You should also extract the change tracking (CT) data and archive it somewhere before you make the table change.
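If you go that route, a minimal sketch of what such a job could run, assuming a table named dbo.MyTable (the name is a placeholder):

-- Hypothetical sketch: re-enable change tracking if it is not currently on.
IF NOT EXISTS (
    SELECT 1
    FROM sys.change_tracking_tables
    WHERE object_id = OBJECT_ID('dbo.MyTable')
)
BEGIN
    ALTER TABLE dbo.MyTable
    ENABLE CHANGE_TRACKING
    WITH (TRACK_COLUMNS_UPDATED = ON);
END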
I am using Postgres for my database, and because our product is in development, we sometimes need to change our database schema. If we find a problem with a change, we need to revert it, and we have to do this manually. We do have the option of making a database backup before making any change, but I was looking for a solution more like git, so that we don't have to do this by hand.
I would suggest you script the creation of your database. Save this SQL script in a SqlScripts directory at the base of your project and check it into source control with the rest of your project. Then, if you need to alter your database schema, you alter or add to your SQL scripts. In that way the structure of your database is versioned, but the data is not in source control, which you would not typically want anyway.
One shortcoming here is that you may have to run the script manually when reverting commits, but this isn't a situation that should happen too often.
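As a minimal sketch of the idea, assuming Postgres (the table, column, and file names are placeholders), a versioned script might look like this:

-- Hypothetical versioned script, e.g. SqlScripts/001_users.sql
CREATE TABLE IF NOT EXISTS users (
    id         SERIAL PRIMARY KEY,
    email      TEXT NOT NULL UNIQUE,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);

-- A later schema change goes into a new script (e.g. 002_users_display_name.sql)
-- rather than editing the old one, so the git history mirrors the schema history.
ALTER TABLE users ADD COLUMN IF NOT EXISTS display_name TEXT;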
I am working on a ModX website (mainly templates but also system settings, user management, etc) while a development website is already online with the customer starting to input the content.
I haven't found a practical solution to push my work online (layouts are stored in the database) without overwriting the content input by the customer (in the database as well).
My present workflow consists of first replacing my local modx_site_content table with the one extracted from the online database, then pushing this hybrid database online. Not practical, plus I am not sure user changes are confined to modx_site_content only.
There must be a better workflow though! How do you handle this?
I don't think it gets any easier than selecting the tables you need and exporting only those into the live environment. Assuming you only work on templating, the template, snippet & chunk tables are all you need to export.
We usually take a copy, develop, and merge only once, when the new features are supposed to go live; this minimizes the trouble. The client can also continue working normally until D-day.
If you're doing a lot of snippet work you could always include an actual PHP file instead and work with your editor directly on those files, connect them to git and so on.
If your project is not really big, you can store your chunks, resources, etc. in separate files (there is an option called "Static Resource") and then manage your changes with git. Otherwise, you need to store user data in a separate table and deploy the whole database with Fabric, for example.
Is there any way in Oracle SQL Developer to view table data read-only or prevent editing? When I view the data in a table it lets me edit, and I want to avoid accidentally making a change. I would expect there to be a way to toggle this somehow, but I haven't found it. I'm using version 2.1.1.64.
Yes, there is. Ask your DBA to create a user that only has read access to the tables in question, and then log in with that user.
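A minimal sketch of what the DBA might run, assuming Oracle; the user, password, and table names are all placeholders:

-- Hypothetical read-only account; all names are placeholders.
CREATE USER readonly_user IDENTIFIED BY some_password;
GRANT CREATE SESSION TO readonly_user;                  -- allow logging in
GRANT SELECT ON app_owner.some_table TO readonly_user;  -- read access only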
The other thing to remember is that if you do edit some data in a data grid, then for the change to be written to the database you have to separately commit it, either by issuing a COMMIT command in an SQL Worksheet, by clicking the green tick button, or by pressing F11.
Pressing F12 will roll back changes that have not been committed.
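In an SQL Worksheet that amounts to:

-- Make grid edits permanent:
COMMIT;
-- Or discard uncommitted edits:
ROLLBACK;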
This is a very strange problem. I've asked about it before here:
How did my trigger get deleted?
I renamed my trigger and I thought it had stopped happening but the problem seems to have come back again.
I've added a trigger to a table in our database. The server is SQL Server 2008. The trigger doesn't do anything particularly tricky; it just changes a LastUpdated field in the table when certain fields are changed. It's an AFTER UPDATE trigger.
There is a large legacy C++ app that runs all kinds of huge queries against this database. Somehow (I've got absolutely no idea how) it is deleting this trigger. It doesn't delete any other triggers, and I'm certain that it's not explicitly dropping the trigger or the table. The developers of this app don't even know anything about my triggers.
How is this possible?
I've tried running a trace using SQL Server Profiler, gone through each command the app sends, and run them in SQL Server Management Studio, but my trigger is not affected. It only seems to happen when I run the app.
The other devs reckon that they are not deleting it explicitly. It doesn't exist in sys.objects or sys.triggers, so it's not a glitch with SSMS. Guess I'll just rename it and hope for the best? I can't think of anything else to try. A few comments below have asked whether the trigger is being deleted, or just disabled or not working. As I stated, it's being deleted completely. Also, the problem is not related to the actual contents of the trigger: if I remove the contents and replace them with some extremely simple code that doesn't do anything, it is still deleted.
Look for table drops -- you might be surprised at what actually causes your table to be dropped. (Different things you do through the SSMS UI, as opposed to using T-SQL directly.)
You can prevent this kind of inadvertent table dropping like this:
Tools -> Options -> Designers -> check "Prevent saving changes that require table re-creation"
Perhaps you could create a DDL trigger on the database and log all the object deletion statements. In particular, you could write the drop events to a table to trace what is happening, or just flat out block them.
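A minimal sketch of that idea, with hypothetical object names, logging every DROP TRIGGER in the database:

-- Hypothetical audit table and DDL trigger; names are placeholders.
CREATE TABLE dbo.DdlAuditLog (
    EventTime DATETIME2 NOT NULL DEFAULT SYSDATETIME(),
    EventData XML NOT NULL
);
GO
CREATE TRIGGER trg_LogDropTrigger
ON DATABASE
FOR DROP_TRIGGER
AS
BEGIN
    INSERT INTO dbo.DdlAuditLog (EventData)
    VALUES (EVENTDATA());   -- captures who/what/when as XML
    -- To flat out block the drop instead, add: ROLLBACK;
END;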
Is other application code dropping and re-creating the entire table? If so, perhaps it is not aware of the trigger.
Thanks for the suggestions everyone. In particular, Billinkc's suggestion of creating a DDL trigger was cool. I didn't know such a thing existed.
Anyway after three months of wondering what was going on here I finally got to the bottom of it.
I have a bunch of scripts that create various triggers. They have the typical format of "if trigger exists then delete", "go", followed by "create trigger" and another "go". I use one script to add all these individual triggers to the database in one go. I missed one of the "go" commands at the bottom of a create statement (in all these hundreds of lines). This means that trigger A got added to the database with "if trigger B exists then delete" at the bottom of its body, so every time trigger A fired, it dropped trigger B. That was so confusing...
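For anyone who hits the same thing, here is a minimal sketch of how the missing GO plays out (the trigger, table, and column names are hypothetical):

IF EXISTS (SELECT * FROM sys.triggers WHERE name = 'TriggerA')
    DROP TRIGGER TriggerA;
GO
CREATE TRIGGER TriggerA ON dbo.SomeTable
AFTER UPDATE
AS
BEGIN
    UPDATE t SET LastUpdated = GETDATE()
    FROM dbo.SomeTable t
    JOIN inserted i ON i.Id = t.Id;
END
-- The GO that belongs here was missing, so the next statement
-- silently became part of TriggerA's body:
IF EXISTS (SELECT * FROM sys.triggers WHERE name = 'TriggerB')
    DROP TRIGGER TriggerB;
GO
-- Result: every update to dbo.SomeTable fired TriggerA, which dropped TriggerB.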
I am about to try to automate a daily build, which will involve database changes, code generation, and of course a build, commit, and later on a deployment. At the moment, each developer on the team puts their structure and data changes for the DB in two files respectively, e.g. 6.029_Brady_Data.sql. Each structure and data file includes all changes for a version, but all changes are repeatable, i.e. with EXISTS checks etc., so they can be run every day if needed.
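For reference, a repeatable structure change in the style described above might look like this (the table and column names are hypothetical):

-- Hypothetical repeatable change: safe to run every day.
IF NOT EXISTS (
    SELECT 1 FROM sys.columns
    WHERE object_id = OBJECT_ID('dbo.Customer')
      AND name = 'MiddleName'
)
BEGIN
    ALTER TABLE dbo.Customer ADD MiddleName NVARCHAR(50) NULL;
END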
What can I do to bring more order to this process? Currently it is basically: concatenate all the structure change files, run them repeatedly until all dependencies are resolved, then repeat with the data change files.
Create a database project using Visual Studio Database Edition, put it into source control, and let the developers check in their code. I have done this and it works well with daily builds, and it offers a lot of support for structuring your database code. See this blog post for features:
http://www.vitalygorn.com/blog/post/2008/01/Handling-Database-easily-with-Visual-Studio-2008.aspx