Is it possible to pass a datatable as an argument to a PowerShell script? - powershell

I'm trying to create a PowerShell script that inserts a datatable into SQL Server via WriteToServer...
This script is called by a Power Automate Desktop automation.
So... I cannot pass my datatable as an argument :(
%dt% is the datatable variable which needs to be used inside the PowerShell script.
This is my dilemma: it is interpreted as a string or something like that.
#Invoke-sqlcmd Connection string parameters
$params = @{'server'='SQLEXPRESS';'Database'='Db'}
Write-Output %dt%
#Variable to hold output as data-table
$dataTable = %dt% | Out-DataTable
#Define Connection string
$connectionString = "Data Source=DSQLEXPRESS; Integrated Security=SSPI;Initial Catalog=Db"
#Bulk copy object instantiation
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $connectionString
#Define the destination table
$bulkCopy.DestinationTableName = "dbo.__SALES"
#load the data into the target
$bulkCopy.WriteToServer($dataTable)
#Query the target table to see for output
Invoke-Sqlcmd @params -Query "SELECT * FROM dbo.__SALES" | Format-Table -AutoSize
Thanks!
UPDATE
No longer need to pass an argument: I create the datatable inside the script.
Thanks again!

Work-around: create the datatable inside the script
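For reference, a minimal sketch of that work-around, assuming a simple two-column table (the column names and types below are placeholders, not the real dbo.__SALES schema):
# Build the datatable inside the script instead of passing it in
$dataTable = New-Object System.Data.DataTable
[void]$dataTable.Columns.Add("Product", [string])
[void]$dataTable.Columns.Add("Amount", [int])
# Rows could come from a CSV, a query, or plain-string script arguments
[void]$dataTable.Rows.Add("Widget", 42)
$connectionString = "Data Source=DSQLEXPRESS; Integrated Security=SSPI;Initial Catalog=Db"
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy $connectionString
$bulkCopy.DestinationTableName = "dbo.__SALES"
$bulkCopy.WriteToServer($dataTable)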

Related

Passing multiple variables from csv file to invoke-sqlcmd

I'm trying to read values from a CSV file, embed them into a INSERT T-SQL statement and run that statement using Invoke-Sqlcmd.
Here's my code:
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
Import-CSV $ImportFile | ForEach-Object { `
$RowData = "Col1=$($_.{My Ref})","Col2=$($_.{General satisfaction})","Col3=$($_.Helpfulness)","Col4=$($_.Effort)"
Invoke-Sqlcmd `
-Database $DBName -ServerInstance $SQLServer `
-Query $InsertQry `
-Variable $RowData
}
The script works fine for rows in the CSV file that contain values for each column. Unfortunately for me, some of the rows in the CSV file contain empty values (so perhaps only the first two columns contain values). These rows fail to be inserted into the table, and generate the following error:
Invoke-Sqlcmd : The format used to define the new variable for
Invoke-Sqlcmd cmdlet is invalid. Please use the 'var=value' format for
defining a new variable.
The potentially empty columns either contain no value at all or a single-digit number from 1 to 5.
I've tried various ways to escape the value, cast it to a different data type, add zero or an empty string to it, null coalesce it, but I cannot get a solution that works.
I have control over the destination table, so I'd be happy to pass zero, empty string, null or any other value as a placeholder for the empty values.
As per the documentation you are to pass variables in a string array where each element has a "key=value" format. You are building that correctly. Invoke-Sqlcmd seems to take offence at null values being passed. The nulls, of course, come from blank entries in your CSV. Assuming you allow nulls in those columns, perhaps you could just adjust the query on each loop pass instead.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('{0}','{1}','{2}','{3}')"
$propertiesToSplat = @{
Database = $DBName
ServerInstance = $SQLServer
}
Import-CSV $ImportFile | ForEach-Object {
$propertiesToSplat.Query = $InsertQry -f $_."My Ref", $_."General satisfaction", $_.Helpfulness, $_.Effort
Invoke-Sqlcmd @propertiesToSplat
}
So at each loop pass we use the format operator to insert the column values into your insert statement. Using curly braces in property names is useful when your properties contain special characters. Since you just have to deal with a space, quotes work just as well.
I also wanted to show you splatting which is a method to pass properties as a hashtable to a cmdlet. This lets you edit props on the fly and keep your lines shorter without having to worry about backticks everywhere.
Edit - completely new answer
I suck at ForEach-Object. This is a foreach loop that checks the value of "General satisfaction" for each line in the CSV, and replaces it with a placeholder string before building the $RowData variable. Unfortunately I cannot test it here; please let me know how you get on.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
$myCSVFile = Import-CSV $ImportFile
foreach($line in $myCSVFile){
if($line."General staisfaction" -eq $null -or $line."General staisfaction" -eq ""){
$line."General staisfaction" = "placeholder"
}
$RowData = "Col1=$($line.{My Ref})","Col2=$($line.{General satisfaction})","Col3=$($line.Helpfulness)","Col4=$($line.Effort)"
Invoke-Sqlcmd -Database $DBName -ServerInstance $SQLServer -Query $InsertQry -Variable $RowData
}
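If more than one of the mapped columns can be empty (as the question suggests), a hedged variant of the same loop that sweeps every mapped column before building $RowData; the $columnsToCheck list is an assumption based on the mapping above:
$columnsToCheck = "My Ref", "General satisfaction", "Helpfulness", "Effort"
foreach($line in $myCSVFile){
# Replace any empty value with a placeholder before mapping the columns
foreach($col in $columnsToCheck){
if([string]::IsNullOrEmpty($line.$col)){
$line.$col = "placeholder"
}
}
$RowData = "Col1=$($line.{My Ref})","Col2=$($line.{General satisfaction})","Col3=$($line.Helpfulness)","Col4=$($line.Effort)"
Invoke-Sqlcmd -Database $DBName -ServerInstance $SQLServer -Query $InsertQry -Variable $RowData
}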

Concat invoke-SqlCmd query string does not work

How can I concatenate a list of parameters OR a string of parameters (better) onto my SQL query? The code below does not work.
$parameters = @("-ServerInstance `"MyMachine\SQLEXPRESS`"", "-Database %TargetDbName%", "-Username %SQLUserName%", "-Password %SQLPassword%")
$row = Invoke-Sqlcmd -Query "SELECT field FROM Table;" $parameters
I want to execute multiple queries later, all with the same connection parameters, and it would be useful to reuse them in a string which I can just add to the query string.
You were on the right track. Sounds like you are looking for splatting.
Splatting is a method of passing a collection of parameter values to a command as a unit.
I don't use Invoke-SQLcmd but it should work just like this:
$parameters = @{
ServerInstance = "MyMachine\SQLEXPRESS"
Database = "TargetDbName"
Username = "SQLUserName"
Password = "SQLPassword"
Query = "SELECT field FROM Table;"
}
$row = Invoke-Sqlcmd @parameters
Collect all the parameters as a hashtable and splat the cmdlet. If you wanted to use this parameter set again later, but make small changes, that would be easy now by referencing the name/value pair of the hashtable.
$parameters.Query = "SELECT field FROM DifferentTable;"
$anotherRow = Invoke-Sqlcmd @parameters
Have a look at parameter splatting
This means that you can put the arguments into a hashtable and pass the hashtable as parameters. Given your code, you could change it as shown below. Notice that even though I assign the parameters to a hashtable $parameters, you have to send it to the cmdlet using the @parameters syntax.
$parameters = @{ServerInstance="MyMachine\SQLEXPRESS";Database="$env:TargetDbName";Username="$env:SQLUserName";Password="$env:SQLPassword"}
$row = Invoke-Sqlcmd -Query "SELECT field FROM Table;" @parameters
I assumed that the TargetDBName, Username and Password were to be found in environment variables, so I changed the code a little to get those as well.
Give it a go.

PowerShell Using **DacServices** With SQLCMD Variables To Deploy A DACPAC

In PowerShell I'm using Microsoft.SqlServer.Dac.DacServices and Microsoft.SqlServer.Dac.DacDeployOptions to deploy/update a database DACPAC. The problem I am having is finding where to set the SQLCMD Variables the package requires.
Abbreviated Sample
# Create a DacServices object, which needs a connection string
$dacsvcs = New-Object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"
# Load dacpac from file
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
# Deploy options
$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$deployOptions.IncludeCompositeObjects = $true
I know I can input these just fine with SqlPackage.exe, and maybe that's what I should do. But nowhere in the documentation or on the web can I find an example of DacServices usage with SQLCMD variables as an option; SQLCMD variables are required parameters for my project's DACPAC.
You should set options in the $deployOptions.SqlCommandVariableValues property. This is an updateable dictionary: you can't assign a new dictionary, but you can update the key/value pairs inside it. For example, to set a variable "MyDatabaseRef" to "Database123" use
$deployOptions.SqlCommandVariableValues.Add("MyDatabaseRef", "Database123");
The API reference is here.
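Putting it together with the abbreviated sample from the question, a minimal sketch; the target database name is a placeholder, and the third argument to Deploy ($true) upgrades an existing database in place:
# Set the required SQLCMD variables, then deploy the package
$deployOptions.SqlCommandVariableValues.Add("MyDatabaseRef", "Database123")
$dacsvcs.Deploy($dp, "MyTargetDb", $true, $deployOptions)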
I have another code snippet to share in relation to this: a method of processing multiple variables from a PowerShell script argument;
param(
[hashtable] $SqlCmdVar
)
$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
# Process the Sql Command Variables
#
if ($SqlCmdVar -ne $null)
{
foreach($key in $SqlCmdVar.keys)
{
Write-Verbose -Message "Adding Sql Command Variable ""$key""..."
$deployOptions.SqlCommandVariableValues.Add($key,$SqlCmdVar[$key])
}
}
You would call the script like this;
myscript.ps1 -SqlCmdVar @{ variable1 = "my first value"; variable2 = "my second value"; variableetc = "more values"}

File IO, is this a bug in PowerShell?

I have the following code in PowerShell
$filePath = "C:\my\programming\Powershell\output.test.txt"
try
{
$wStream = new-object IO.FileStream $filePath, [System.IO.FileMode]::Append, [IO.FileAccess]::Write, [IO.FileShare]::Read
$sWriter = New-Object System.IO.StreamWriter $wStream
$sWriter.writeLine("test")
}
I keep getting error:
Cannot convert argument "1", with value: "[IO.FileMode]::Append", for
"FileStream" to type "System.IO.FileMode": "Cannot convert value
"[IO.FileMode]::Append" to type "System.IO.FileMode" due to invalid
enumeration values. Specify one of the following enumeration values
and try again. The possible enumeration values are "CreateNew, Create,
Open, OpenOrCreate, Truncate, Append"."
I tried the equivalent in C#,
FileStream fStream = null;
StreamWriter stWriter = null;
try
{
fStream = new FileStream(@"C:\my\programming\Powershell\output.txt", FileMode.Append, FileAccess.Write, FileShare.Read);
stWriter = new StreamWriter(fStream);
stWriter.WriteLine("hahha");
}
it works fine!
What's wrong with my PowerShell script? BTW I am running on PowerShell:
Major Minor Build Revision
----- ----- ----- --------
3 2 0 2237
Another way would be to use just the name of the value and let PowerShell cast it to the target type:
New-Object IO.FileStream $filePath ,'Append','Write','Read'
When using the New-Object cmdlet where the target type's constructor takes parameters, you should either use the -ArgumentList parameter (of New-Object) or wrap the parameters in parentheses; I prefer to wrap my constructors in parens:
# setup some convenience variables to keep each line shorter
$path = [System.IO.Path]::Combine($Env:TEMP,"Temp.txt")
$mode = [System.IO.FileMode]::Append
$access = [System.IO.FileAccess]::Write
$sharing = [IO.FileShare]::Read
# create the FileStream and StreamWriter objects
$fs = New-Object IO.FileStream($path, $mode, $access, $sharing)
$sw = New-Object System.IO.StreamWriter($fs)
# write something and remember to call to Dispose to clean up the resources
$sw.WriteLine("Hello, PowerShell!")
$sw.Dispose()
$fs.Dispose()
New-Object cmdlet online help: http://go.microsoft.com/fwlink/?LinkID=113355
Yet another way could be to enclose the enums in parens:
$wStream = new-object IO.FileStream $filePath, ([System.IO.FileMode]::Append), `
([IO.FileAccess]::Write), ([IO.FileShare]::Read)
If your goal is to write to a logfile or text file, then you could try the supported cmdlets in PowerShell to achieve this:
Get-Help Out-File -Detailed
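For instance, a minimal sketch of appending a line with Out-File instead of managing FileStream/StreamWriter objects by hand:
# Append to the file; Out-File creates it if it does not exist
"test" | Out-File -FilePath $filePath -Append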

PowerShell: Implementing an IDataReader wrapper around StreamReader

I am trying to load extremely large CSV files into SQL Server using PowerShell. The code also has to apply on-the-fly regex replacements and allow for various delimiters, EOR, and EOF markers. For maintainability, I would really like all of this logic to live in PowerShell without importing assemblies.
To be efficient, I know I need to use SqlBulkCopy. But all of the PowerShell examples I see fill a DataTable and pass it, which is not possible for me because of the file size.
I am pretty sure I need to wrap StreamReader in an IDataReader and then pass that to SqlBulkCopy. I found a couple of great examples of this implemented in C#:
http://archive.msdn.microsoft.com/FlatFileDataReader
http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader
Is it possible to accomplish this functionality using native PowerShell without importing the C# assembly? I am specifically having a hard time converting the abstract class wrapper.
This is the code I have so far; it does not pass an IDataReader and it breaks on memory limits.
function Get-CSVDataReader()
{
param (
[string]$path
)
$parsedData = New-Object 'System.Collections.Generic.List[string[]]'
#List<string[]> parsedData = new List<string[]>() in C#
$sr = new-object IO.StreamReader($path)
while ($line = $sr.ReadLine())
{
#regex replace and other logic here
$parsedData.Add($line.Split(','))
}
,$parsedData #if this were an IDataReader, the comma keeps it from being unrolled
}
$MyReader = Get-CSVDataReader('This should not fill immediately. It needs a Read Method.')
Thanks a bunch for the help.
If all you want to do is use a DataReader with SqlBulkCopy, you could use the ACE drivers, which come with Office 2007/2010 and are also available as a separate download, to open an OLEDB connection to the CSV file, open a reader, and call WriteToServer:
$ServerInstance = "$env:computername\sql1"
$Database = "tempdb"
$tableName = "psdrive"
$ConnectionString = "Server={0};Database={1};Integrated Security=True;" -f $ServerInstance,$Database
$filepath = "C:\Users\Public\bin\"
get-psdrive | export-csv ./psdrive.csv -NoTypeInformation -Force
$connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=`"$filepath`";Extended Properties=`"text;HDR=yes;FMT=Delimited`";"
$qry = 'select * from [psdrive.csv]'
$conn = new-object System.Data.OleDb.OleDbConnection($connString)
$conn.open()
$cmd = new-object System.Data.OleDb.OleDbCommand($qry,$conn)
$dr = $cmd.ExecuteReader()
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $connectionString
$bulkCopy.DestinationTableName = $tableName
$bulkCopy.WriteToServer($dr)
$dr.Close()
$conn.Close()
#CREATE TABLE [dbo].[psdrive](
# [Used] [varchar](1000) NULL,
# [Free] [varchar](1000) NULL,
# [CurrentLocation] [varchar](1000) NULL,
# [Name] [varchar](1000) NULL,
# [Provider] [varchar](1000) NULL,
# [Root] [varchar](1000) NULL,
# [Description] [varchar](1000) NULL,
# [Credential] [varchar](1000) NULL,
# [DisplayRoot] [varchar](1000) NULL
#)
I'm importing large CSVs via a datatable and performing batch writes after every 1 million rows.
if ($dt.rows.count -eq 1000000) {
$bulkCopy.WriteToServer($dt)
$dt.Clear()
}
Here is the link where I detail my own script on my blog, but the above code outlines the basic concept. My PowerShell script took 4.x minutes to import 9 million rows from a 1.1 GB CSV. The script relied on SqlBulkCopy, [System.IO.File]::OpenText and a datatable.
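A minimal sketch of that batching pattern, under stated assumptions: a plain comma-delimited file with a header row, a two-column target table, and placeholder file, table, and connection names:
$connectionString = "Server=.\SQLEXPRESS;Database=tempdb;Integrated Security=True;"
$dt = New-Object System.Data.DataTable
[void]$dt.Columns.Add("Col1", [string])
[void]$dt.Columns.Add("Col2", [string])
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy $connectionString
$bulkCopy.DestinationTableName = "dbo.MyTable"
$reader = [System.IO.File]::OpenText("C:\temp\big.csv")
try {
    $null = $reader.ReadLine() # skip the header row
    while ($null -ne ($line = $reader.ReadLine())) {
        # on-the-fly regex replacements could be applied to $line here
        [void]$dt.Rows.Add($line.Split(','))
        if ($dt.Rows.Count -eq 1000000) {
            $bulkCopy.WriteToServer($dt)
            $dt.Clear()
        }
    }
    if ($dt.Rows.Count -gt 0) { $bulkCopy.WriteToServer($dt) } # flush the remainder
}
finally {
    $reader.Close()
}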