Passing multiple variables from csv file to invoke-sqlcmd - powershell

I'm trying to read values from a CSV file, embed them into an INSERT T-SQL statement and run that statement using Invoke-Sqlcmd.
Here's my code:
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
Import-CSV $ImportFile | ForEach-Object {
$RowData = "Col1=$($_.{My Ref})","Col2=$($_.{General satisfaction})","Col3=$($_.Helpfulness)","Col4=$($_.Effort)"
Invoke-Sqlcmd `
-Database $DBName -ServerInstance $SQLServer `
-Query $InsertQry `
-Variable $RowData
}
The script works fine for rows in the CSV file that contain values for each column. Unfortunately for me, some of the rows in the CSV file contain empty values (so perhaps only the first two columns contain values). These rows fail to be inserted into the table, and generate the following error:
Invoke-Sqlcmd : The format used to define the new variable for
Invoke-Sqlcmd cmdlet is invalid. Please use the 'var=value' format for
defining a new variable.
The potentially empty columns are all columns that either are empty or contain a single-digit number from 1 to 5.
I've tried various ways to escape the value, cast it to a different data type, add zero or an empty string to it, null coalesce it, but I cannot get a solution that works.
I have control over the destination table, so I'd be happy to pass zero, empty string, null or any other value as a placeholder for the empty values.

As per the documentation, you are to pass variables as a string array where each element uses the "key=value" format. You are building that correctly. Invoke-Sqlcmd seems to take offence at null values being passed. The nulls, of course, come from blank entries in your CSV. Assuming you allow nulls in those columns, perhaps you could just adjust the query on each loop pass instead.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('{0}','{1}','{2}','{3}')"
$propertiesToSplat = @{
Database = $DBName
ServerInstance = $SQLServer
}
Import-CSV $ImportFile | ForEach-Object {
$propertiesToSplat.Query = $InsertQry -f $_."My Ref", $_."General satisfaction", $_.Helpfulness, $_.Effort
Invoke-Sqlcmd @propertiesToSplat
}
So at each loop pass we use the format operator to insert the column values into your insert statement. Curly braces in property names are useful when the names contain special characters; since you only have spaces to deal with, quotes work just as well.
I also wanted to show you splatting, which is a method of passing parameters to a cmdlet as a hashtable. This lets you edit the parameters on the fly and keeps your lines shorter, without having to worry about backticks everywhere.

Edit - completely new answer
I suck at ForEach-Object. This is a foreach loop that checks the value of "General satisfaction" for each line in the CSV, and replaces it with a placeholder string before building the $RowData variable. Unfortunately I cannot test it here; please let me know how you get on.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
$myCSVFile = Import-CSV $ImportFile
foreach($line in $myCSVFile){
if($line."General satisfaction" -eq $null -or $line."General satisfaction" -eq ""){
$line."General satisfaction" = "placeholder"
}
$RowData = "Col1=$($line.{My Ref})","Col2=$($line.{General satisfaction})","Col3=$($line.Helpfulness)","Col4=$($line.Effort)"
Invoke-Sqlcmd -Database $DBName -ServerInstance $SQLServer -Query $InsertQry -Variable $RowData
}
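To extend this to all of the potentially empty columns rather than just one, the same check can run over every property of each row. A sketch along the same lines (untested here as well; column names are the ones from the question, and the "0" placeholder is an assumption based on the questioner saying any placeholder value is acceptable):

```powershell
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
$myCSVFile = Import-CSV $ImportFile
foreach ($line in $myCSVFile) {
    # Replace every empty field with a placeholder, so Invoke-Sqlcmd
    # never receives a "Col=" pair with nothing after the equals sign
    foreach ($prop in $line.PSObject.Properties) {
        if ([string]::IsNullOrEmpty($prop.Value)) { $prop.Value = "0" }  # "0" is an assumed placeholder
    }
    $RowData = "Col1=$($line.'My Ref')","Col2=$($line.'General satisfaction')","Col3=$($line.Helpfulness)","Col4=$($line.Effort)"
    Invoke-Sqlcmd -Database $DBName -ServerInstance $SQLServer -Query $InsertQry -Variable $RowData
}
```

This avoids having to list each nullable column by name in a separate if-check.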

Related

Read a CSV in powershell with a variable number of columns

I have a CSV that contains a username, and then one or more values for the rest of the record. There are no headers in the file.
joe.user,Accounting-SG,CustomerService-SG,MidwestRegion-SG
frank.user,Accounting-SG,EastRegion-SG
I would like to read the file into a powershell object where the Username property is set to the first column, and the Membership property is set to either the remainder of the row (including the commas) or ideally, an array of strings with each element containing a single membership value.
Unfortunately, the following line only grabs the first membership and ignores the rest of the line.
$memberships = Import-Csv -Path C:\temp\values.csv -Header "username", "membership"
@{username=joe.user; membership=Accounting-SG}
@{username=frank.user; membership=Accounting-SG}
I'm looking for either of these outputs:
@{username=joe.user; membership=Accounting-SG,CustomerService-SG,MidwestRegion-SG}
@{username=frank.user; membership=Accounting-SG,EastRegion-SG}
or
@{username=joe.user; membership=string[]}
@{username=frank.user; membership=string[]}
I've been able to get the first result by enclosing the "rest" of the data in the csv file in quotes, but that doesn't really feel like the best answer:
joe.user,"Accounting-SG,CustomerService-SG,MidwestRegion-SG"
Well, the issue is that what you have isn't really a (proper) CSV. The CSV format doesn't support that notation.
You can "roll your own" and just process the file yourself, something like this:
$memberships = Get-Content -LiteralPath C:\temp\values.csv |
ForEach-Object -Process {
$user,$membership = $_.Split(',')
New-Object -TypeName PSObject -Property @{
username = $user
membership = $membership
}
}
You could do a half and half sort of thing. Using your modification, where the groups are all a single field in quotes, do this:
$memberships = Import-Csv -Path C:\temp\values.csv -Header "username", "membership" |
ForEach-Object -Process {
$_.membership = $_.membership.Split(',')
$_
}
The first example just reads the file line by line, splits on commas, then creates a new object with the properties you want.
The second example uses Import-Csv to create the object initially, then just resets the .membership property (it starts as a string, and we split the string so it's now an array).
The second way only makes sense if whatever is creating the "CSV" can create it that way in the first place. If you have to modify it yourself every time, just skip this and process it as it is.

How to get Database Name from Connectionstring in PowerShell

I'm trying to get the Database name from a connection string in PowerShell.
"Server=server\instance;uid=User;pwd=Hello;Database=SomeName;"
I can think of two ways to do that: either search for the string Database up to the first ; after it, then split that on = and select the database name - but I don't really know how to do that.
The second way could be with the DBConnectionStringBuilder like this:
$sb = New-Object System.Data.Common.DbConnectionStringBuilder
$sb.set_ConnectionString($cstring)
[string]$Database = ($sb | ? {$_.Keys -eq 'Database'}).value
but with this way, no matter how hard I try to filter for the database name, it won't return the database name.
Question: What's the best way to get my Databasename from the connection string?
Use the second method, but simplify it:
$cstring = "Server=server\instance;uid=User;pwd=Hello;Database=SomeName;"
$sb = New-Object System.Data.Common.DbConnectionStringBuilder
$sb.set_ConnectionString($cstring)
$Database = $sb.database
This works perfectly fine.
If you want to avoid an error in the case where the key doesn't exist, there are a lot of ways to do that, the more idiomatic method of looking for the key first:
if ($sb.ContainsKey('Database')) {
$Database = $sb.Database
}
Or the object's own TryGetValue method:
if ($sb.TryGetValue('Database', [ref] $Database)) {
# It was successful
# $Database already contains the value, you can use it.
} else {
# The key didn't exist.
}
String Parsing
I don't recommend these in this case because there is some flexibility in the database connection string format, and why make your code aware of all the possibilities and try to correctly handle them all when that code was already written (the object you're using above)?
But for completeness, I'd do it with splitting and regular expression matching and capturing:
$cstring -split '\s*;\s*' |
ForEach-Object -Process {
if ($_ -imatch '^Database=(?<dbname>.+)$') {
$Database = $Matches.dbname
}
}
So here I'm first splitting on a semi-colon ; surrounded by any amount of whitespace. Each element (which should be just key-value pairs) is then checked against another regex, looking specifically for Database= and then capturing what comes after that until the end of the string, in a named capture group called dbname. If the match is successful, then the result of the capture group is assigned to the variable.
I still prefer a proper parser when one exists.
try this
"Server=server\instance;uid=User;pwd=Hello;Database=SomeName;".split(";") |
%{[pscustomobject]@{Property=$_.Split("=")[0];Value=$_.Split("=")[1]}} |
where Property -eq "Database" | select Value
other solution
$template=@"
{Property*:Abc123}={Value:Test123}
{Property*:Def}={Value:XX}
"@
"Server=server\instance;uid=User;pwd=Hello;Database=SomeName;".replace(";", "`r`n") | ConvertFrom-String -TemplateContent $template |
where Property -eq "Database" | select Value

Powershell - Iterate through variables dynamically

I am importing a CSV file with two fields per record, "Name" and "Path".
$softwareList = Import-Csv C:\Scripts\NEW_INSTALLER\softwareList.csv
$count = 0..($softwareList.count -1)
foreach($i in $count){
Write-Host $softwareList[$i].Name,$softwareList[$i].Path
}
What I am trying to do is dynamically assign the Name and Path of each record to a WPFCheckbox variable based on the $i variable. The names for these checkboxes are named something such as WPFCheckbox0, WPFCheckbox1, WPFCheckbox2 and so on. These objects have two properties I planned on using, "Command" to store the $SoftwareList[$i].path and "Content" to store the $SoftwareList[$i].Name
I cannot think of a way to properly loop through these variables and assign the properties from the CSV to the properties on their respective WPFCheckboxes.
Any suggestions would be very appreciated.
Invoke-Expression is one way, though note Mathias' commented concerns on the overall approach.
Within your foreach loop, you can do something like:
invoke-expression "`$WPFCheckbox$i`.Command = '$($SoftwareList[$i].Path)'"
invoke-expression "`$WPFCheckbox$i`.Content = '$($SoftwareList[$i].Name)'"
The back-tick ` just before $WPFCheckbox prevents that (otherwise undefined) variable from being evaluated immediately (before the expression is invoked), but $i is expanded. This gives you a string containing $WPFCheckbox1, to which the property name and value are then appended. The $SoftwareList values are expanded immediately into the string.
The Invoke-Expression then evaluates and executes the entire string as if it were a regular statement.
Here's a stand-alone code snippet to play with:
1..3 |% {
invoke-expression "`$MyVariable$_` = New-Object PSObject"
invoke-expression "`$MyVariable$_` | add-member -NotePropertyName Command -NotePropertyValue ([String]::Empty)"
invoke-expression "`$MyVariable$_`.Command = 'Path #$_'"
}
$MyVariable1 | Out-String
$MyVariable2 | Out-String
$MyVariable3 | Out-String
As a side note (since I can't comment yet on your original question), creating an index array just to iterate through the lines of the file is really inefficient. There are definitely better ways to do that.
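For instance, the objects that Import-Csv emits can be iterated directly, with no index range at all. A minimal sketch (the CSV path is the one from the question):

```powershell
$softwareList = Import-Csv C:\Scripts\NEW_INSTALLER\softwareList.csv
# Iterate the rows themselves rather than a 0..(count-1) index array
foreach ($software in $softwareList) {
    Write-Host $software.Name, $software.Path
}
```

The same shape works with `$softwareList | ForEach-Object { ... }` if you prefer the pipeline.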

Concat invoke-SqlCmd query string does not work

How can I concatenate a list of parameters OR (better) a string of parameters onto my SQL query? The below does not work.
$parameters = @("-ServerInstance `"MyMachine\SQLEXPRESS`"", "-Database %TargetDbName%", "-Username %SQLUserName%", "-Password %SQLPassword%")
$row = Invoke-Sqlcmd -Query "SELECT field FROM Table;" $parameters
I want to execute multiple queries later, all with the same connection parameters, and it would be useful to reuse them in a string that I can just add to the query string.
You were on the right track. Sounds like you are looking for splatting.
Splatting is a method of passing a collection of parameter values to a command as a unit.
I don't use Invoke-SQLcmd but it should work just like this:
$parameters = @{
ServerInstance = "MyMachine\SQLEXPRESS"
Database = "TargetDbName"
Username = "SQLUserName"
Password = "SQLPassword"
Query = "SELECT field FROM Table;"
}
$row = Invoke-Sqlcmd @parameters
Collect all the parameters as a hashtable and splat the cmdlet. If you wanted to use this parameter set again later, but make small changes, that would be easy now by referencing the name/value pair of the hashtable.
$parameters.Query = "SELECT field FROM DifferentTable;"
$anotherRow = Invoke-Sqlcmd @parameters
Have a look at parameter splatting
This means that you can put the arguments into a hashtable and pass the hashtable as parameters. Given your code, you could change it to the following. Notice that even though I assign the parameters to a hashtable $parameters, you have to pass it to the cmdlet using the @parameters syntax.
$parameters = @{ServerInstance="MyMachine\SQLEXPRESS";Database="$env:TargetDbName";Username="$env:SQLUserName";Password="$env:SQLPassword"}
$row = Invoke-Sqlcmd -Query "SELECT field FROM Table;" @parameters
I assumed that the TargetDBName, Username and password were to be found in environment variables, so I changed the code a little to get those as well.
Give it a go.

Set date format to be used in PowerShell export-csv?

I'm trying to export a database table to text (CSV-ish) for a later BULK INSERT.
It would be a lot less hassle to have dates in ISO format yyyy-mm-dd. I have, I believe, finally persuaded SQL Server Express to expect British format in its import (despite the greyed out Server Properties being stuck in "English (US)" no matter what I do). I changed the user accounts to British, and that corresponds to my PowerShell CSV export format.
But I'd rather use ISO format to route around the problem for good.
At the moment, having filled a table variable from a SELECT * FROM Table and piped that into Export-CSV, the dates are coming out in the generated text file as dd/mm/yyyy format.
How can I force the PowerShell script to use ISO format dates in all statements (i.e. no specifying formats in each individual command), so the Export-CSV will write them as I need? I've been going around in circles for a couple of hours looking at 'cultures' and things, but I'm utterly confused!
Try adjusting your culture's date format:
PS C:\> $(get-date).ToShortDateString()
2/16/2013
PS C:\> $(Get-Culture).DateTimeFormat.ShortDatePattern = 'yyyy-MM-dd'
PS C:\> $(get-date).ToShortDateString()
2013-02-16
FYI I've done quite a bit with BULK Inserting into SQL using PowerShell, and I found that the simplest way to approach the problem was to:
Export the data to CSV with Export-Csv -Delimiter "`t" - this is a tab-delimited file.
When bulk inserting, insert into a temp table that has all the columns set to the NVARCHAR(MAX) datatype.
Create a 2nd temp table that has the proper data types set.
Select the records from temp table 1 into temp table 2, with a REPLACE command in SQL to replace all quotes with nothing.
This only really falls over when I come across a column that contains a tab within its own data; a pain, but I just replace the tabs with spaces in those columns if I come across them.
As I was dealing with CSV files with many thousands of lines, this was the simplest way I could do it, and the quickest speed-wise, as it's all set-based.
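Put together, the steps above look roughly like this. This is a sketch only: the table names (SourceTable, TargetTable, #Stage1), the column names, and the file path are all made up for illustration, and the connection variables are assumed to be set already:

```powershell
# 1. Export the data to a tab-delimited file
Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DBName -Query "SELECT * FROM dbo.SourceTable" |
    Export-Csv C:\temp\SourceTable.txt -NoTypeInformation -Delimiter "`t"

# 2-4. In one batch: bulk insert into an all-NVARCHAR(MAX) staging table,
# then strip the quotes with REPLACE while selecting into the typed table
$sql = @"
CREATE TABLE #Stage1 (Id NVARCHAR(MAX), Name NVARCHAR(MAX));
BULK INSERT #Stage1 FROM 'C:\temp\SourceTable.txt' WITH (FIELDTERMINATOR = '\t');
INSERT INTO dbo.TargetTable (Id, Name)
SELECT CAST(REPLACE(Id, '"', '') AS INT), REPLACE(Name, '"', '')
FROM #Stage1;
"@
Invoke-Sqlcmd -ServerInstance $SQLServer -Database $DBName -Query $sql
```

Running the whole T-SQL as a single batch keeps the #Stage1 temp table in scope from creation through the final SELECT.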
Many Thanks jbockle for the help, I can now take data home from the office (SQL server 2005) and import it into identical tables (from CREATE .sql scripts) on my home Win XP machine running SQL Server 2008 Express.
In this first example, the table is exported directly to CSV and then cleaned up afterwards. The Convert-Line function removes the double quotes because BULK INSERT doesn't like them, and also adds extra backtick delimiters to the start and end of each line, so that it can then replace any True with 1 and any False with 0 (anywhere on the line), because Booleans are tricky :)
(it seems to have a problem with adjacent Booleans, so this pass runs twice to mop them all up!)
The final line trims the unwanted ` from the start & end of each line.
## PowerShell newbies : for scripts to run, you must first execute: Set-ExecutionPolicy RemoteSigned
## and then all scripts will work (it's a global setting, remembered once run)
$SQLDT = New-Object "System.Data.DataTable"
$path = "C:"
(Get-Culture).DateTimeFormat.ShortDatePattern="yyyy-MM-dd" # temp setting, for dates in ISO format
function Convert-Line
{ param( [string]$line=$(throw 'a CSV line is required.'))
## add ` to start and end, combined with removing quotes
$line = "``" + $line + "``" -replace "`"", ""
## swap Boolean True/False to 1 or 0
## !! Need to do it twice, as it has trouble with adjacent ones!!
$line = $line -replace "``True``","``1``" -replace "``False``","``0``"
$line = $line -replace "``True``","``1``" -replace "``False``","``0``"
## return with trimmed off start/end delimiters
$line.TrimStart("``").TrimEnd("``")
}
function Table-Export
{ param( [string]$table=$(throw 'table is required.'))
## Get whole SQL table into $SQLDT datatable
$sqldt.reset()
$connString = "Server=.\SQLEXPRESS;Database=Test1;Integrated Security=SSPI;"
$da = New-Object "System.Data.SqlClient.SqlDataAdapter" ("select * from $table",$connString)
[void]$da.fill($SQLDT)
## Export to CSV with ` delimiter
$sqldt | Export-Csv $path\$table.CSV -NoTypeInformation -delimiter "``"
## read entire file, parse line by line, process line, write back out again
(gc $path\$table.CSV) | Foreach-Object { Convert-Line -line $_ } | Set-Content $path\$table.CSV
}
# main...
Table-Export -table "Table1"
Table-Export -table "Table2"
Table-Export -table "Table3etc"
This imports nicely using SQL
DELETE FROM table1;
BULK INSERT table1 FROM 'C:\table1.csv' WITH (KEEPIDENTITY, FIELDTERMINATOR = '`');
DELETE FROM table2;
BULK INSERT table2 FROM 'C:\table2.csv' WITH (KEEPIDENTITY, FIELDTERMINATOR = '`');
-- etc, all tables
Original Identity fields are preserved for table joins to still work.
Works fine with field types : numerical, text, boolean, date.
BULK INSERT will complain about the first line containing field names, but that's an ignorable warning (don't bother trying FIRSTROW = 2 as it doesn't work).
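If you would rather not see the warning at all, one option (an untested sketch; the file name matches the SQL above) is to drop the header line from the CSV before the bulk insert:

```powershell
# Remove the header row so BULK INSERT never sees the column names.
# The parentheses force the file to be read fully before it is rewritten.
(Get-Content C:\table1.csv | Select-Object -Skip 1) | Set-Content C:\table1.csv
```

After this, the first line of the file is the first data row.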
In this second example, another approach is taken - this time the DataTable is copied to a new one where each column is string type, so that each field can be adjusted without type problems. The copy datatable is then exported to CSV, and then all we need to do is process it to remove the unwanted doublequotes.
This time we get a chance to replace " in any string fields, so neither double quotes nor commas will break them, e.g. a name like John "JJ" Smith will end up as John 'JJ' Smith, which is hopefully acceptable enough.
$SQLDT = New-Object "System.Data.DataTable"
$path = "C:"
(Get-Culture).DateTimeFormat.ShortDatePattern="yyyy-MM-dd" # temp setting, for dates in ISO format
function Table-Export
{ param( [string]$table=$(throw 'table is required.'))
## Get whole SQL table into $SQLDT datatable
$sqldt.reset()
$connString = "Server=.\SQLEXPRESS;Database=Test1;Integrated Security=SSPI;"
$da = New-Object "System.Data.SqlClient.SqlDataAdapter" ("select * from $table",$connString)
[void]$da.fill($SQLDT)
## Copy $SqlDt DataTable to a new $DT2 copy, with all columns now String type
$DT2 = New-Object "System.Data.DataTable"
$sqldt.columns | Foreach-Object { $DT2.Columns.Add($_.Caption) > $null }
## copy all $SqlDt rows to the new $DT2
## and change any " double quote in any field to a ' single quote, to preserve meaning in text fields
## ( or you could use an odd char and replace in SQL database later, to return to " )
For($i=0;$i -lt $sqldt.Rows.Count;$i++)
{ $DT2.Rows.Add() > $null
For($i2=0;$i2 -lt $sqldt.Columns.Count;$i2++)
{ $DT2.Rows[$i][$i2] = $SQLDT.Rows[$i][$i2] -replace "`"","'" }
}
## If any $SqlDt column was Boolean...
## use column name.. and for all rows in the new $DT2 : convert True/False to 1/0
$sqldt.columns | Foreach-Object {
If ($_.DataType.Name -EQ "Boolean")
{ $ColName = $_.Caption
For($i=0;$i -lt $sqldt.Rows.Count;$i++)
{ If ($DT2.Rows[$i][$ColName] -EQ "True") { $DT2.Rows[$i][$ColName]="1" }
If ($DT2.Rows[$i][$ColName] -EQ "False") { $DT2.Rows[$i][$ColName]="0" }
}
}
}
## Export to CSV with ` delimiter
$DT2 | Export-Csv $path\$table.CSV -NoTypeInformation -delimiter "``"
## read entire file, parse line by line, remove all ", write back out again
(gc $path\$table.CSV) | Foreach-Object {$_ -replace "`"", "" } | Set-Content $path\$table.CSV
}
# main...
Table-Export -table "Table1"
Table-Export -table "Table2"
Table-Export -table "Table3etc"
Empty tables won't break this script; you'll just get a zero-byte CSV file.