How best to change SQL passwords and update all the connection strings on a farm? - powershell

I am very new to PowerShell and was wondering whether the following is possible. We are required to change our SQL passwords every 3 months and, at the same time, update the connection strings in the web.config files of all the websites that use them to the new SQL password.
Ideally the new SQL passwords, server names and databases would be stored in a file (SQL table, XML file, CSV, etc.) that the script reads in. The script would then change each SQL password based on the contents of that table, which holds the names of the servers and databases. Immediately after changing a password, the script would update the connection strings in the web.config files of the roughly 20 websites that use it; I guess I need a table to hold the paths to the web.configs too. It would then repeat the process: change the next SQL password and update another 20 websites.
Nice to have would be creating a backup copy of each web.config before modifying it, and wrapping the script in a transaction in case it encounters an error.
Thanks
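A minimal sketch of the workflow above in PowerShell, assuming a CSV with Server, Database, Login, NewPassword and ConfigPath columns, and the SqlServer module for Invoke-Sqlcmd. All file names, column names and the function name here are hypothetical, not an existing convention:

```powershell
# Sketch only: the CSV layout and function name are assumptions.

function Update-WebConfigPassword {
    param(
        [string]$ConfigPath,   # absolute path to a web.config
                               # (XmlDocument.Save resolves relative paths
                               #  against the process directory, not $PWD)
        [string]$NewPassword   # password to write into every connection string
    )
    # Back up the original before touching it
    Copy-Item $ConfigPath "$ConfigPath.bak" -Force

    [xml]$config = Get-Content $ConfigPath
    foreach ($cs in $config.configuration.connectionStrings.add) {
        # Replace the Password=... token inside each connection string
        $cs.connectionString = $cs.connectionString -replace 'Password=[^;]*', "Password=$NewPassword"
    }
    $config.Save($ConfigPath)
}

# Driver: read the control file and process each row.
# Invoke-Sqlcmd comes from the SqlServer module; ALTER LOGIN requires a
# login with permission to change passwords on that instance.
# Import-Csv .\rotation.csv | ForEach-Object {
#     Invoke-Sqlcmd -ServerInstance $_.Server -Database master `
#         -Query "ALTER LOGIN [$($_.Login)] WITH PASSWORD = '$($_.NewPassword)';"
#     Update-WebConfigPassword -ConfigPath $_.ConfigPath -NewPassword $_.NewPassword
# }
```

Since web.config edits are not transactional, the .bak copies are the practical fallback: if a later step fails, the already-modified configs can be restored by hand.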

Related

Working with CSVs in SSIS without SQL Database connection

I am tasked with performing some calculations in SSIS on a CSV file using T-SQL. Once complete, I need to save the project as an ISPAC file so that users outside my business can run it too; however, their SQL connection managers will obviously differ from mine. To that end, is it possible to use T-SQL to modify a CSV inside SSIS without actually using a connection manager to a SQL database? If so, what tasks would I use to achieve this? Many thanks.
EDIT: For more clarification on what I need to achieve: basically I am tasked with adding additional data to a CSV file. For example, concatenating forename and surname, creating new columns such as month numbers and names from a date string, and then outputting the result to a new CSV location. I am only allowed to use T-SQL and SSIS to achieve this. Once complete, I need to send the SSIS project file to another organisation to use, so I cannot hardcode any of my connections or assume they hold any of the databases I use.
My initial question was whether all this can be done inside SSIS using T-SQL, but I think I have the answer to that now. I plan to parameterise the connection string, source file location and output file, and use the master and tempdb databases to run the SQL that adds the additional data. They will have SQL Server on their end, but the package won't make any assumptions about what they are using.
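For reference, the tempdb step described above might look something like the following T-SQL inside the package (a sketch only: the staging table and column names are assumptions, and the CSV itself would be read and written by Flat File connection managers in the data flow):

```sql
-- Assumed staging table in tempdb, populated from the CSV by the data flow;
-- no user database is required on the target server
CREATE TABLE #staging (
    Forename   varchar(50),
    Surname    varchar(50),
    EventDate  date
);

-- Derive the extra columns using only T-SQL
SELECT
    Forename + ' ' + Surname    AS FullName,
    MONTH(EventDate)            AS MonthNumber,
    DATENAME(month, EventDate)  AS MonthName
FROM #staging;
```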

Storing data in array vs text file

My database migration automation script used to require the user to copy the database names into a text file, then the script would read in that text file and know which databases to migrate.
I now have a form where the user selects which databases to migrate, then my script automatically inserts those database names into the text file, then reads in that text file later in the script.
Would it be better practice to move away from the text file altogether and just store the data in an array or some other structure?
I'm also using PowerShell.
I'm no expert on this, but I would suggest keeping the text file even if you choose the array-only or form-only approach. You can keep the text file as a sort of log file: you don't have to read from it, but you can write to it so you can quickly determine which databases were being migrated if an error happens.
A production environment probably has more sophisticated logging tools, but I say keep the file in case of an emergency where you have to debug.
When you finish migrating and the script determines that everything is as it should be, you can either clear the text file or keep it, append the date and time, and store it as a quick reference should another task come up where you need quick access to the databases that were migrated on a certain date.
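That pattern (array in memory for the script's logic, file kept purely as an audit trail) could be sketched like this; the database names, log file name and format are just examples:

```powershell
# Databases now live in an in-memory array (e.g. from the form selection)
$databases = @('SalesDb', 'HrDb', 'ReportingDb')   # example values

# Write the same list to a timestamped log purely for auditing;
# the script itself never reads the file back
$logPath = Join-Path ([System.IO.Path]::GetTempPath()) `
    ("migration-{0:yyyyMMdd-HHmmss}.log" -f (Get-Date))
"Migration started $(Get-Date -Format s)" | Set-Content $logPath
$databases | Add-Content $logPath

foreach ($db in $databases) {
    # ... migrate $db here ...
    "Migrated $db at $(Get-Date -Format s)" | Add-Content $logPath
}
```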

How to back up database objects (table scripts, stored procedures, functions, etc.) in SQL Server using SQL scripts

I have a question about SQL Server: how can I script out database objects automatically using a T-SQL script and save the output to a specific folder (e.g. c:\backup\)?
I want to create monthly backups of the table structures (create scripts without data), views, procedures, functions and triggers, with unique names, all in one file, saved to a specific folder.
I have tried the following manually in SSMS:
right-click on the database
click Tasks -> Generate Scripts -> Choose Objects -> select the entire database
and all database objects (tables, views, functions, triggers, etc.) ->
set the scripting options, choose the required directory name and save the file
Please tell me how to achieve this task in SQL Server.
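T-SQL alone has no supported way to script out object definitions to files, so one common way to automate the Generate Scripts steps above is SQL Server Management Objects (SMO) from PowerShell. A sketch, assuming the SqlServer module is installed; the server, database and folder names are placeholders:

```powershell
# Requires the SqlServer PowerShell module (Install-Module SqlServer),
# which loads the SMO assemblies. All names below are placeholders.
function Export-DatabaseObjects {
    param(
        [string]$ServerInstance,
        [string]$Database,
        [string]$OutputFolder
    )
    Import-Module SqlServer

    $server = New-Object Microsoft.SqlServer.Management.Smo.Server $ServerInstance
    $db     = $server.Databases[$Database]

    $scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
    $scripter.Options.ScriptData   = $false   # structure only, no data
    $scripter.Options.ToFileOnly   = $true
    $scripter.Options.AppendToFile = $true    # everything in one file
    # Unique name per month, e.g. MyDb-2024-05.sql
    $scripter.Options.FileName = Join-Path $OutputFolder `
        ("{0}-{1:yyyy-MM}.sql" -f $Database, (Get-Date))

    # Script tables, views, stored procedures, functions and triggers,
    # skipping system objects
    $objects = @($db.Tables) + @($db.Views) + @($db.StoredProcedures) +
               @($db.UserDefinedFunctions) + @($db.Triggers)
    $scripter.Script(($objects | Where-Object { -not $_.IsSystemObject }))
}

# Example call (placeholder names):
# Export-DatabaseObjects -ServerInstance '.\SQLEXPRESS' -Database 'MyDb' -OutputFolder 'C:\backup'
```

The function can then be run monthly from SQL Server Agent or Task Scheduler to produce the dated files.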

Extract Active Directory into SQL database using VBScript

I have written a VBScript to extract data from Active Directory into a record set. I'm now wondering what the most efficient way is to transfer the data into a SQL database.
I'm torn between:
Writing it to an Excel file and then firing an SSIS package to import it, or...
Within the VBScript, iterating through the dataset in memory and submitting 3000+ INSERT commands to the SQL database
Would the latter option result in 3000+ round trips communicating with the database and therefore be the slower of the two options?
Sending inserts row by row is always the slowest option. This is what is known as Row By Agonizing Row, or RBAR. You should avoid it if possible and take advantage of set-based operations.
Your other option, writing to an intermediate file, is a good one. I agree with @Remou in the comments that you should probably pick CSV rather than Excel if you go this route.
I would propose a third option. You already have the design in VB, contained in your VBScript, and you should be able to convert it easily to a script component in SSIS: create an SSIS package, add a Data Flow task, add a Script Component (as a data source {example here}) to the flow, write your fields out to the output buffer, and then add a SQL destination, saving yourself the step of writing to an intermediate file. This is also more secure, as your AD data is never on disk in plaintext during the process.
You don't mention how often this will run or if you have to run it within a certain time window, so it isn't clear that performance is even an issue here. "Slow" doesn't mean anything by itself: a process that runs for 30 minutes can be perfectly acceptable if the time window is one hour.
Just write the simplest, most maintainable code you can to get the job done and go from there. If it runs in an acceptable amount of time then you're done. If it doesn't, then at least you have a clean, functioning solution that you can profile and optimize.
If you already have the data in a recordset, and if it's SQL Server 2008+, create a user-defined table type and send the whole set in as an atomic unit.
And if you go the SSIS route, I have a post covering Active Directory as an SSIS Data Source
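The table-valued-parameter approach mentioned above needs a type and a procedure on the SQL side first. A sketch, where the type, table and column names are assumptions:

```sql
-- One-time setup (SQL Server 2008+): a table type matching the AD fields
CREATE TYPE dbo.AdUserList AS TABLE (
    SamAccountName varchar(100),
    DisplayName    varchar(200),
    Mail           varchar(200)
);
GO

-- The whole recordset arrives as a single parameter, in one round trip
CREATE PROCEDURE dbo.ImportAdUsers
    @Users dbo.AdUserList READONLY
AS
BEGIN
    INSERT INTO dbo.AdUsers (SamAccountName, DisplayName, Mail)
    SELECT SamAccountName, DisplayName, Mail FROM @Users;
END;
```

Note that classic ADO, as used from VBScript, cannot pass table-valued parameters directly; this approach generally means calling the procedure from a .NET client via SqlClient.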

How to copy everything except data from one database to another?

In T-SQL (Microsoft SQL Server 2008), how can I create a new database that has the same schemas, tables, table columns, indexes, constraints and foreign keys as an existing one, but none of the data from the original database?
Note: making a full copy and then removing all the data is not a solution in my case, since the database is quite big and a full copy would take too much time.
See here for instructions: How To Script Out The Whole Database In SQL Server 2005 and SQL Server 2008
In SQL Server Management Studio, right-click on the database and select "Script database as"
http://msdn.microsoft.com/en-us/library/ms178078.aspx
You can then use the script to create an empty one.
Edit: the OP did say 2008
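Once you have the generated schema-only script saved, creating the empty copy could look like this from the command line (server, database and file names are placeholders):

```
REM Create the empty target database, then replay the schema script into it
sqlcmd -S myserver -E -Q "CREATE DATABASE MyDbEmpty"
sqlcmd -S myserver -E -d MyDbEmpty -i schema.sql
```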
I use Liquibase for this purpose. Just point Liquibase at a different server and it will use your changelog to bring the second database up to date, schema-wise. It has the added benefit that the changelog file is stored in source control, so I can have tagged versions of it, allowing me to restore a database to what a specific version of my app expects.