Generate a SQL create table script using PowerShell

I want to use PowerShell to quickly generate a SQL script with CREATE TABLE statements that can recreate all tables in an existing database. The only thing is that I want to tweak some options, such as turning identities off.
I just cannot figure out how to do this!
I have already come as far as to set a $server variable, and to set a variable called $tables to get all tables in that particular database.
Then I use a loop:
foreach ($table in $tables)
{
    $table.Script()
}
This works, but I just don't know how to add scripting options, such as NoIdentities = $True
Can anyone help me out?

I once had to do a lot of repetitive work, including generating SQL scripts for every table in a database. So I developed a general-purpose tool that is good for this type of work. I am including the tool, as is, and a sample run of the tool intended to produce a series of grant commands for each table and each category of database user.
My tool runs off of CSV files rather than off of the database directly. I found it fairly easy to generate CSV files and templates for a lot of different tasks. Your mileage may vary. Maybe you can start with this tool and adapt it to your needs.
Here is the tool, and a sample run.
<#
.SYNOPSIS
Generates multiple expansions of a template,
driven by data in a CSV file.
.DESCRIPTION
This function is a table driven template tool.
It generates output from a template and
a driver table. The template file contains plain
text and embedded variables. The driver table
(in a csv file) has one column for each variable,
and one row for each expansion to be generated.
#>
function Expand-csv {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [string] $driver,
        [Parameter(Mandatory=$true)]
        [string] $template
    )
    Process
    {
        $xp = (Get-Content $template) -join "`r`n"
        Import-Csv $driver | % {
            $_.psobject.properties | % { Set-Variable -Name $_.name -Value $_.value }
            $ExecutionContext.InvokeCommand.ExpandString($xp)
        }
    }
}
# Now do a sample run of Expand-csv
# then display inputs and output
Expand-csv grants.csv grants.tmplt > grants.sql
get-content grants.tmplt
import-csv grants.csv
get-content grants.sql
Here is the result of running the above:
PS> C:\Users\David\Software\Powershell\test\sample.ps1
grant $privs
on $table
to $user;
privs        table        user
-----        -----        ----
ALL          Employees    DBA
READ         Employees    Analyst
READ, WRITE  Employees    Application
ALL          Departments  DBA
READ         Departments  Analyst, Application
grant ALL
on Employees
to DBA;
grant READ
on Employees
to Analyst;
grant READ, WRITE
on Employees
to Application;
grant ALL
on Departments
to DBA;
grant READ
on Departments
to Analyst, Application;
PS>
In real life, I have the tool defined in my $profile file, so that it's available whenever I'm in PowerShell.
I'm not sure how well this applies to your case. It doesn't address the exact situation you describe, but you may be able to adapt the technique; a sketch of what that might look like follows.
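For instance, here is a hypothetical tables.csv and tables.tmplt pair sketched for the CREATE TABLE case (the file names and column names are made up). The template:
create table $name (
    $columns
);
The CSV:
name,columns
Employees,"id int, name varchar(100)"
Departments,"id int, title varchar(50)"
And the call:
Expand-csv tables.csv tables.tmplt > createtables.sql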

If you're still looking for a solution, may I suggest this little script I've been using with a single table. You should be able to update it to support multiple tables fairly easily (see the sketch after the usage example below). Notice the line "$scripter.Options.NoIdentities = $true;" below.
param
(
    [string] $server,
    [string] $database,
    [string] $schema,
    [string] $table
)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
$srv = New-Object "Microsoft.SqlServer.Management.SMO.Server" $server
# Note: the Scripter constructor takes the Server object, not the server name string
$scripter = New-Object "Microsoft.SqlServer.Management.SMO.Scripter" $srv
$db = $srv.Databases[$database]
$tbl = $db.Tables | Where-Object { $_.Schema -eq $schema -and $_.Name -eq $table }
$scripter.Options.ScriptSchema = $true;
$scripter.Options.ScriptData = $false;
$scripter.Options.NoCommandTerminator = $false;
$scripter.Options.NoCollation = $true;
$scripter.Options.NoIdentities = $true;
$scripter.EnumScript($tbl)
Save it to "table.ps1" and execute from within PowerShell by passing param values:
& table.ps1 -server SERVER\INSTANCE -database MyDB -schema dbo -table MyTable
Let me know if it works.
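For the multiple-table case in the original question, the options can also be passed straight to each table's Script() method, since SMO's Table.Script() accepts a ScriptingOptions object. A rough, untested sketch (reusing $db from the script above):
$options = New-Object Microsoft.SqlServer.Management.SMO.ScriptingOptions
$options.NoIdentities = $true
foreach ($tbl in $db.Tables) {
    $tbl.Script($options)    # emits the CREATE TABLE statement, identities stripped
}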

Related

Export PC Name and Updates Needed from WSUS to CSV Using PowerShell

I'm trying to export a CSV file from WSUS using PowerShell containing a list of all computers that need updates and the titles or KBs of the updates each particular computer needs. Something like this...
Computer1, update1, update2
Computer2, update1, update3, update5
Computer3, update2, update4
I found this script on TechNet that returns the computer name and how many updates are needed, but it doesn't return the titles of the updates, and it may return all computers in WSUS, not just the ones that need updates (I'm in a test environment of only 1 computer right now).
Function Get-WSUSClientNeeded {
    [cmdletbinding()]
    Param (
        [parameter(Mandatory=$true)]
        [string]$WsusServer
    )
    #Load assemblies
    [void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.UpdateServices.Administration')
    #Connect to WSUS
    $Global:wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer($WsusServer,$False,8530)
    #Create scope objects
    $computerscope = New-Object Microsoft.UpdateServices.Administration.ComputerTargetScope
    $updatescope = New-Object Microsoft.UpdateServices.Administration.UpdateScope
    #Get update summary
    $wsus.GetSummariesPerComputerTarget($updatescope,$computerscope) | ForEach {
        New-Object PSObject -Property @{
            ComputerName = ($wsus.GetComputerTarget([guid]$_.ComputerTargetId)).FullDomainName
            NeededCount = ($_.DownloadedCount + $_.NotInstalledCount)
            DownloadedCount = $_.DownloadedCount
            NotApplicableCount = $_.NotApplicableCount
            NotInstalledCount = $_.NotInstalledCount
            InstalledCount = $_.InstalledCount
            FailedCount = $_.FailedCount
        }
    }
}
Get-WSUSClientNeeded -WsusServer 'Server' | Select ComputerName, NeededCount
I'm very new to PowerShell, so any help would be greatly appreciated.
Thanks!
You're creating a custom object from the results of the update summary, using piping and an inline loop. These are complicated, and while you may be able to improve your scripts with them later, getting things to work in the first place is much easier if you use loops, variable assignments and arrays.
My suggestion of how to work through this is to:
Split the work part into an actual loop ($wsus.Get... piped through foreach and creating objects).
Add the results of your pull command (the object you create) to an array. Right now you're creating an object and then not doing anything with it.
Loop through the array and run commands against the elements. Apply filters or extract info as you wish.
Only pull the properties you want. Most Get- cmdlets include a -Properties switch.
Use Get-Member to peek inside the objects returned by a command. It will tell you the properties and methods of the object.
Run commands like this at the command line or in ISE to figure out which property names you want to extract:
$summaries = $wsus.GetSummariesPerComputerTarget($updatescope,$computerscope)
$summaries | Get-Member
Loop through the array, pull out data, and put it in another array:
$AnswerArray = @()
foreach ($summary in $summaries) {
    $props = $summary | select property1, property2
    $AnswerArray += $props
}
Export to a csv, or use the array of answers in the next stage
$AnswerArray | export-csv -NoTypeInformation -NoClobber -Path c:\temp\answers.csv
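Putting those pieces together, a rough sketch that reuses only the calls already in your script (untested; it assumes $wsus, $updatescope and $computerscope are set up as in the question, and keeps only the computers that actually need updates):
$AnswerArray = @()
foreach ($summary in $wsus.GetSummariesPerComputerTarget($updatescope,$computerscope)) {
    $needed = $summary.DownloadedCount + $summary.NotInstalledCount
    if ($needed -gt 0) {    # skip computers that need nothing
        $AnswerArray += New-Object PSObject -Property @{
            ComputerName = ($wsus.GetComputerTarget([guid]$summary.ComputerTargetId)).FullDomainName
            NeededCount  = $needed
        }
    }
}
$AnswerArray | Export-Csv -NoTypeInformation -Path C:\temp\answers.csv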
Most of my PowerShell experience is with Active Directory, but my approach would be to have one script extract data into a CSV, manipulate that in another script (or outside PowerShell with other tools like Excel), and use it as input to a script that makes the changes I needed to make.
This turned out to be much easier than expected.
Get-WindowsUpdate -WindowsUpdate
Get-WindowsUpdate will return available updates from an online source. The -WindowsUpdate parameter will return the updates from WSUS. Make sure to import the PSWindowsUpdate module first.
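For example (a minimal sketch; PSWindowsUpdate comes from the PowerShell Gallery, so Install-Module needs PowerShell 5+ or PowerShellGet):
Install-Module -Name PSWindowsUpdate    # one-time install
Import-Module PSWindowsUpdate
Get-WindowsUpdate -WindowsUpdate        # list the updates, per the above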

How to combine a template with a CSV file in Powershell

I want to combine a template that looks like this:
grant $privs
on $table
to $user;
With a CSV file that looks like this:
privs,table,user
ALL,Employees,DBA
READ,Employees,Analyst
"READ, WRITE", Employees, Application
ALL,Departments,DBA
READ,Departments,"Analyst, Application"
To produce an SQL script that looks like this:
grant ALL
on Employees
to DBA;
grant READ
on Employees
to Analyst;
grant READ, WRITE
on Employees
to Application;
grant ALL
on Departments
to DBA;
grant READ
on Departments
to Analyst, Application;
The template has three parameters that look like Powershell variables. The CSV file has enough data
to specify five copies of the template. In real life, it would be more like 200 copies.
I also want to be able to apply the same technique to a variety of CSV files, most of which
do not come from databases. And I want to use a variety of templates, most of which do not
generate SQL. For that reason, I want a technique that deals with plain text files, instead
of attaching to the database.
Note: I am asking this question so as to provide the community with an answer.
I have written a function, Expand-Csv, that does this. Here it is:
<# This function is a table driven template tool.
   It generates output from a template and
   a driver table. The template file contains plain
   text and embedded variables. The driver table
   (in a csv file) has one column for each variable,
   and one row for each expansion to be generated.
   12/12/2016
#>
function Expand-csv {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [string] $driver,
        [Parameter(Mandatory=$true)]
        [string] $template
    )
    Process
    {
        $pattern = (Get-Content $template) -join "`n"
        Import-Csv $driver | % {
            foreach ($p in $_.psobject.properties) {
                Set-Variable -Name $p.name -Value $p.value
            }
            $ExecutionContext.InvokeCommand.ExpandString($pattern)
        }
    }
}
In the case at hand, the call would look like this:
Expand-Csv grants.csv grants.tmplt > grants.sql
The output that would come to the console gets redirected to a file.
I've tried using this for generating scripts in Powershell itself,
but a word of caution is in order. ExpandString does some of the same processing
that Invoke-Command does, so you can get undesired consequences. I only use it
for the simplest kind of PS code, like a series of calls to some external app.
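For example, ExpandString evaluates embedded $( ) subexpressions, so a template line like this actually runs code during expansion (a contrived illustration):
$ExecutionContext.InvokeCommand.ExpandString('Today is $(Get-Date)')
# -> "Today is <the current date and time>" -- the $( ) block was executed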

Powershell: Advanced file age searching

This is my first post on Stack Overflow. I have for many years just read the many fantastic questions, answers and other posts here, and I have learned a lot from this fantastic community. I hope, now that I have taken the brave step to really sink my teeth into PowerShell and join this community, that I may be able to contribute in some way!
So I have started working on a project which, at its basic core level, is: list all files that are older than 7 years so they can then be reviewed and deleted where possible.
I have however broken the whole script up into several stages. I am currently stuck at a step in stage 2.
I have been stuck for about 2 days on what, to many of the PowerShell geniuses out there, may only take 10 minutes to figure out!
I must apologise for my lack of knowledge; my experience with PowerShell scripting is limited to literally 5 working days. I am currently diving in and learning with books, but I also have a job to do, so I don't get to learn the easy way!
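As an aside, the basic age check from stage 1 can be a one-liner; a minimal sketch, assuming LastWriteTime is the timestamp that matters:
Get-ChildItem -Path "C:\DATA" -Recurse -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddYears(-7) }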
My script essentially has 3 steps:
Runs Get-ACL on the top-level DATA folders to create a listing of all groups that have permissions on a particular folder. I want to then either export this data or simply hold it for the next step.
Filters this gathered information based off a CSV which contains a column labelled Role (Role will contain a group that the Folder Manager is exclusively in), and checks the members of this exclusive group (maybe this last bit needs to be another step as well?).
Stores or exports this list of exclusive members with their relevant folders, to then later use as a variable to send an email with a list of files that need to be deleted.
With the script below I am essentially stuck on step 2 and how to create a filter from the CSV (or stored variables?) and apply it to the Get-ACL foreach loop. I may be going about this the whole wrong way using regex; to be honest most of this is copy-and-paste from reading around the internet where people have done similar tasks. So again, I apologise if this is just a dumb way to go about it from the start!
I want to thank everyone in advance for all help, opinions and advice, I will listen to it all and I will try and take on-board as much as my brain can handle - I promise!
#$RoleList = import-csv "C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv"
#foreach ($Manager in $RoleList) {
#$FolderManager = $RoleList.Role
$FolderManagers = Import-Csv C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv | foreach {
    New-Object PSObject -prop @{
        Folder  = $_.Folder;
        Manager = $_.'Folder Manager';
        Role    = $_.Role
    }
}
$Role = $FolderManagers.Role
$Role
gci "c:\DATA" | Where {$_.PSIsContainer} | get-acl |
    ForEach-Object {
        [regex]$regex = "\w:\\\S+"
        $path = $regex.match($_.Path).Value
        $_ | select -expand access |
            where {$_.identityreference -like "$Role"} |
            Select @{Name="Path";Expression={$Path}},IdentityReference
    }
Thanks,
Daniel.
Bit of a guess at what you want here. e.g. if you have folders
C:\Data\Accounts
C:\Data\Marketing
C:\Data\Sales
You might have permissions
C:\Data\Accounts {'FolderManagers-Accounts', 'Accounts', 'Directors'}
C:\Data\Marketing {'FolderManagers-Marketing', 'Marketing', 'Sales'}
C:\Data\Sales {'FolderManagers-Sales', 'Sales', 'Directors'}
and your CSV is
Name, Role, Email
Alice, FolderManagers-Accounts, alice@example.com
Bob, FolderManagers-Marketing, bob@example.com
And there will be a clear mapping of one (1) row in the CSV to one of the groups in the ACLs.
And you want, from your script:
Identify who to email about "C:\Data\Accounts"
How close am I?
# Import the managers. This will turn the CSV into an array of objects,
# no need to do that explicitly
$FolderManagers = Import-Csv C:\DATA\scripts\GB_CMS\CSV\datafolders_rolelist.csv
# This will be a hashtable pairing up folder names with people
# e.g. 'C:\DATA\Accounts' -> Alice
$FolderMap = @{}
# Run through all the folders
Get-ChildItem -Path "C:\Data" -Directory | ForEach-Object {
    # Run through the group/user ACL entries on the folder
    foreach ($group in (Get-Acl $_.FullName).Access.IdentityReference)
    {
        # Look for a matching row in the CSV
        $CsvRow = $FolderManagers | Where-Object {$_.Role -match $group}
        if (-not $CsvRow)
        {
            Write-Error "No manager found for folder $_"
        }
        else
        {
            # Add to the map
            # $_ converts to folder path, C:\DATA\Accounts
            # $CsvRow is the person, @{Name=Alice, Role=..., Email=...}
            $FolderMap[$_.FullName] = $CsvRow
        }
    }
}
Then it (the FolderMap) will be
Name Value
---- -----
C:\Data\Accounts {Name='Alice';Role=...
C:\Data\Marketing {Name='Bob';Role=...
you can query it with
$person = $FolderMap["c:\data\Accounts"]
$person.email
and if you really want to export it, maybe
$FolderMap | ConvertTo-Json | Set-Content foldermanagers.json
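and read it back in a later session with (note that ConvertFrom-Json gives you back objects rather than a true hashtable):
$FolderMap = Get-Content foldermanagers.json -Raw | ConvertFrom-Json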
Nb. I wrote most of this off the top of my head, and it probably won't just run. And that's a problem with big, not very specific questions on StackOverflow.

Create variable from CSV

I want to make variables from a particular column in a CSV.
CSV will have the following headers:
FolderName,FolderManager,RoleGroup,ManagerEmail
Under FolderName will be a list of rows with respective folder names such as: Accounts,HR,Projects, etc... (each of these names is a separate row in the FolderName column)
So I would like to create a list of variables to call on in a later stage. They would be something like the following:
$Accounts,
$HR,
$Projects,
I have done a few different scripts based on searching here and google, but unable to produce the desired results. I am hoping someone can lead me in the right direction here to create this script.
Versions of this question ("dynamic variables" or "variable variables" or "create variables at runtime") come up a lot, and in almost all cases they are not the right answer.
This is often asked by people who don't know a better way to approach their problem, but there is a better way: collections. Arrays, lists, hashtables, etc.
Here's the problem: You want to read a username and print it out. You can't write Hello Alice because you don't know what their name is to put in your code. That's why variables exist:
$name = Read-Host "Enter your name"
Write-Host "Hello $name"
Great, you can write $name in your source code, something which never changes. And it references their name, which does change. But that's OK.
But you're stuck - how can you have two people's names, if all you have is $name? How can you make many variables like $name2, $name3? How can you make $Alice, $Bob?
And you can...
New-Variable -Name (Read-Host "Enter your name") -Value (Read-Host "Enter your name again")
Write-Host "Hello
...wait. What do you put there to write their name? You're straight back to the original problem that variables were meant to solve. You had a fixed thing to put in your source code, which allowed you to work with a changing value. And now you have a varying thing that you can't use in your source code, because you don't know what it is, again.
It's worthless.
And the fix is that one variable with a fixed name can reference multiple values in a collection.
Arrays (Get-Help about_Arrays):
$names = @()
do {
    $name = Read-Host "Enter your name"
    if ($name -ne '')
    {
        $names += $name
    }
} while ($name -ne '')
# $names is now a list, as many items long as it needs to be. And you still
# work with it by one name.
foreach ($name in $names)
{
    Write-Host "Hello $name"
}
# or
$names.Count
or
$names | foreach { $_ }
And more collections, like
Hashtables (Get-Help about_Hash_Tables): key -> value pairs. Let's pair each file in a folder with its size:
$FileSizes = @{} # empty hashtable (aka dictionary)
Get-ChildItem *.txt | ForEach {
    $FileSizes[$_.BaseName] = $_.Length
}
# It doesn't matter how many files there are, the code is just one block
# $FileSizes now looks like
# @{
#     'readme'      = 1024;
#     'test'        = 20;
#     'WarAndPeace' = 1048576;
# }
# You can list them with
$FileSizes.Keys
and
foreach ($file in $FileSizes.Keys)
{
    $size = $FileSizes[$file]
    Write-Host "$file has size $size"
}
No need for a dynamic variable for each file, or each filename. One fixed name, a variable which works for any number of values. All you need to do is "add however many there are" and "process however many there are" without explicitly caring how many there are.
And you never need to ask "now I've created variable names for all my things ... how do I find them?" because you find these values in the collection you put them in. By listing all of them, by searching from the start until you find one, by filtering them, by using -match and -in and -contains.
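For instance, with the $names array from earlier:
$names -match '^A'        # every entered name starting with A
'Alice' -in $names        # $true if Alice was entered (PS3+)
$names -contains 'Bob'    # the same test, older syntax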
And yes, New-Variable and Get-Variable have their uses, and if you know about collections and want to use them, maybe you do have a use for them.
But I submit that a lot of people on StackOverflow ask this question solely because they don't yet know about collections.
Dynamic variables in Powershell
Incrementing a Dynamic Variable in Powershell
Dynamic variable and value assignment in powershell
Dynamically use variable in PowerShell
How to create and populate an array in Powershell based on a dynamic variable?
And many more, in Python too:
https://stackoverflow.com/a/5036775/478656
How can you dynamically create variables via a while loop?
Basically you want to create folders based on the values you are getting from the CSV file (the file has headers FolderName, FolderManager, RoleGroup and ManagerEmail):
$File = Import-Csv "FileName"
$Path = "C:\Sample"
foreach ($item in $File) {
    $FolderName = $item.FolderName
    $NewPath = $Path + "\$FolderName"
    if (!(Test-Path $NewPath))
    {
        New-Item $NewPath -ItemType Directory
    }
}
Hope this helps.
In PowerShell, you can import a CSV file and get back custom objects. The code snippet below shows how to import a CSV to generate objects from it, and then dot-reference the properties on each object in a pipeline to create the new variables (your specific use case here).
PS>cat .\dummy.csv
"foldername","FolderManager","RoleGroup"
"Accounts","UserA","ManagerA"
"HR","UserB","ManagerB"
PS>$objectsFromCSV = Import-CSV -Path .\dummy.csv
PS>$objectsFromCSV | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName }
PS>Get-Variable -name Accounts
Name Value
---- -----
Accounts
PS>Get-Variable -name HR
Name Value
---- -----
HR
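If you also want each variable to hold a value (say, the whole CSV row), a small extension of the same pipeline does it:
PS>Import-CSV -Path .\dummy.csv | Foreach-Object -Process {New-Variable -Name $PSItem.FolderName -Value $PSItem -Force}
PS>(Get-Variable -Name Accounts).Value.FolderManager
UserA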

Common ways to pass state from cmdlet to cmdlet

I am creating my own set of cmdlets. They all need the same state data (like the location of the DB and the credentials for connecting to it). I assume this must be a common need, and I wonder what the common idiom for doing this is.
The obvious one is something like:
$db = my-make-dbcreds db=xxx creds=yyyy ...
my-verb1 $db | my-verb2 $db -foo 42...
my-verb8 $db bar wiz
.....
but I was wondering about other ways. Can I silently pipe the state from one to another? I know I can do this if state is the only thing I pipe, but these cmdlets return data.
Can I set up global variables that I use if the user doesn't specify state in the command?
Passing the state information through the pipe is a little lost on me. You could update your cmdlets to return objects that the next cmdlet will accept via ValueFromPipeline. But when you mentioned
like location of DB and credentials for connecting to DB
the best thing I could think of for what you want is....
SPLATTING!
Splatting is a method of passing a collection of parameter
values to a command as a unit. Windows PowerShell associates
each value in the collection with a command parameter.
In its simplest form:
$params = @{
    Filter = "*.txt"
    Path   = "C:\temp"
}
Get-ChildItem @params
Create a hashtable of parameters and values and splat them to the command. Then you can edit the table as each call to the cmdlet requires:
$params.Path = "C:\eventemperor"
Get-ChildItem @params
I changed the path but left the filter the same. You also don't have to put everything in $params; you can splat it and use other parameters in the same call, as shown below.
It is just a matter of populating the variables as you see fit and changing them as the case requires.
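For instance, splatted and ordinary parameters mix freely in a single call:
Get-ChildItem @params -Recurse    # @params supplies Filter and Path; -Recurse is passed normally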
Spewing on the pipeline
Pretty sure that is what it is actually called. If you use advanced function parameters you can chain properties from one cmdlet to the next if you really want to. FWIW I think splatting is better in your case, but have a look at the following.
function One {
    param(
        [parameter(Mandatory=$true,
            ValueFromPipeline=$True,
            ValueFromPipelineByPropertyName=$true)]
        [String[]]
        $Pastry
    )
    Write-Host "You Must Like $Pastry"
    Write-Output (New-Object -TypeName PSCustomObject -Property @{Pastry = $pastry})
    # If you have at least PowerShell 3.0
    # [pscustomobject]@{Pastry = $pastry}
}
Simple function that writes the variable $pastry to the console but also outputs an object for the next pipe. So running the command
"Eclairs" | One | One | Out-Null
We get the following output
You Must Like Eclairs
You Must Like Eclairs
We need to pipe to Out-Null at the end, else you would get this:
Pastry
------
{Eclairs}
Perhaps not the best example but you should get the idea. If you wanted to extract information between the pipe calls you could use Tee-Object.
"Eclair" | One | Tee-Object -Variable oneresults | One | Out-Null
$oneresults
Consider Parameter Default Values
Revisiting this concept after trying to find a better way to pass SQL connection information between many functions working against the same database. I am not sure if this is the best thing to do, but it certainly simplifies things for me.
The basic idea is to add a rule for your cmdlet, or a wildcard rule if your cmdlets share a naming convention. For instance, I have a series of functions that interact with our ticketing system. They all start with Get-Task... and are all configured with SQL connection information.
$invokeSQLParameters = @{
    ServerInstance = "serverName"
    Username       = $Credentials.UserName
    Password       = $Credentials.GetNetworkCredential().Password
}
$PSDefaultParameterValues.Add("New-Task*:Connection",$invokeSQLParameters)
$PSDefaultParameterValues.Add("Get-Task*:Connection",$invokeSQLParameters)
So now my functions have a parameter called Connection that will always be populated with $invokeSQLParameters, as long as the above is done before the call. I still use splatting as well:
Invoke-Sqlcmd -Query $getCommentsQuery @Connection
You can read up more about this at about_Parameters_Default_Values.
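For completeness, a sketch of what one of those functions might look like (Get-TaskComments and its query are made up for illustration):
function Get-TaskComments {
    param(
        # Filled in automatically via the "Get-Task*:Connection" default above
        [hashtable]$Connection
    )
    Invoke-Sqlcmd -Query "select * from TaskComments" @Connection
}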