I'm trying to use the code below, but I'm getting a syntax error on line 4, character 5. For the life of me I can't figure out what's wrong with my code. Can someone please help? It must be something really, really stupid.
PS: this is not my code, someone helped me with it.
$csv = Import-Csv 'D:\Chayan\POmiss\test.csv'
$SQLSERVER = "WSQL009D"
Foreach ($row in $csv) {
Invoke-Sqlcmd -ServerInstance $SQLSERVER -Database BuyerWorksheetCentralRepository -Query "INSERT
into BuyerWorksheetCentralRepository.[po].[PurchaseOrderMessage]
values (
$($row.PurchaseOrderID),
4,
'Timeout waiting for mainframe to send response message',
getdate(),
'deb00535'
)"
}
@Steven's answer is good: using here-strings makes things simpler and easier to debug. In this case, it is not a PowerShell error that is happening. When you see "Msg 102, Level 15, State 1", that is a SQL error. You will have to check the schema of your database to ensure that you are passing in the right value. If your POID is a varchar type, then you may simply be missing a set of quotes around the VALUES that you are passing, e.g.:
...
values (
'$($row.PurchaseOrderID)',
4,
'Timeout waiting for mainframe to send response message',
getdate(),
'deb00535'
)
...
Remember, you are replacing $($row.PurchaseOrderID) with the value in the CSV. You still need the single quotes in the query for SQL to not complain ;-). So your code becomes:
$csv = Import-Csv 'D:\Chayan\POmiss\test.csv'
$SQLSERVER = "WSQL009D"
Foreach ($row in $csv) {
$query =
#"
INSERT
into BuyerWorksheetCentralRepository.[po].[PurchaseOrderMessage]
values (
'$($row.PurchaseOrderID)',
4,
'Timeout waiting for mainframe to send response message',
getdate(),
'deb00535'
)
"#
Invoke-Sqlcmd -ServerInstance $SQLSERVER -Database BuyerWorksheetCentralRepository -Query $query
}
If you need debugging, you can always do:
Write-Host $query
And see if that query will actually work in SQL.
The last thing to check is your CSV file in Notepad. You may have an extra row of data that is blank. The error message eerily suggests that one entry with a blank value is being inserted.
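If blank rows turn out to be the problem, you can filter them out at import time. A minimal sketch, assuming the column is named PurchaseOrderID as above:
# Skip any rows where PurchaseOrderID is empty or whitespace
$csv = Import-Csv 'D:\Chayan\POmiss\test.csv' |
    Where-Object { -not [string]::IsNullOrWhiteSpace($_.PurchaseOrderID) }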
EDIT
@ChayanChakraborty provided the sample file:
PurchaseOrderID PONumMaster DivisionNum HoursElapsedSinceLastStatusUpdate
--------------- ----------- ----------- ---------------------------------
16601566 536958 8 2070
16601613 536998 8 1471
16601626 537011 8 700
Unfortunately @ChayanChakraborty, that is not a CSV file. Adding .CSV to the file name does not make it a Comma-Separated Values file. If you are exporting to a file from Microsoft SQL Server Management Studio, you have to change the output options (see below).
In Microsoft SQL Server Management Studio, go to the Tools -> Options... menu. Then go to the Query Results -> SQL Server -> Results to Text section. Change the Output Format to Comma delimited, click OK, and re-run your query. Your CSV file should then look like:
PurchaseOrderID,PONumMaster,DivisionNum,HoursElapsedSinceLastStatusUpdate
16601566,536958,8,2070
16601613,536998,8,1471
16601626,537011,8,700
You should also verify that the end of the file doesn't contain the result summary, as that will mess up the import:
(3 rows affected)
Completion time: 2021-05-14T11:29:21.9260884-06:00
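If re-exporting isn't an option, you could also drop those trailing lines before parsing. A rough sketch, assuming every real data line contains a comma:
# Keep only lines that look like CSV rows, then parse them
$csv = Get-Content 'D:\Chayan\POmiss\test.csv' |
    Where-Object { $_ -match ',' } |
    ConvertFrom-Csv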
My guess is Invoke-Sqlcmd isn't able to interpret the multi-line argument you gave -Query. This is typically done using a here-string, like below:
$csv = Import-Csv 'D:\Chayan\POmiss\test.csv'
$SQLSERVER = "WSQL009D"
$QueryTemplate =
#"
INSERT INTO BuyerWorksheetCentralRepository.[po].[PurchaseOrderMessage]
VALUES (
%PurchaseOrderID%,
4,
'Timeout waiting for mainframe to send response message',
GETDATE(),
'deb00535'
)
"#
Foreach ($row in $csv) {
$Query = $QueryTemplate -replace '%PurchaseOrderID%', $row.PurchaseOrderID
Invoke-Sqlcmd -ServerInstance $SQLSERVER -Database BuyerWorksheetCentralRepository -Query $Query
}
I changed it a little bit. Rather than recreate the entire query for every loop iteration, I created a template with a known placeholder string to replace with the data you're getting from the CSV, namely $row.PurchaseOrderID.
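If you'd rather not do string replacement yourself, Invoke-Sqlcmd also has a -Variable parameter (used in an answer further down this page). A sketch of the same insert that way - POID is just a placeholder variable name:
$QueryTemplate = @"
INSERT INTO BuyerWorksheetCentralRepository.[po].[PurchaseOrderMessage]
VALUES ('`$(POID)', 4, 'Timeout waiting for mainframe to send response message', GETDATE(), 'deb00535')
"@
Foreach ($row in $csv) {
    Invoke-Sqlcmd -ServerInstance $SQLSERVER -Database BuyerWorksheetCentralRepository -Query $QueryTemplate -Variable "POID=$($row.PurchaseOrderID)"
}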
Related
I am running a script which triggers a query to fetch the worst-performing SQL queries over a 1-hour interval from the SQL server, and I am also formatting the output using "Format-Table -AutoSize". But the full query text is not being printed; only part of it is.
Code Snippet:
function PerformerQuery {
$i=1
$number=2
do{
$serverInstanceP = "SQLExpress2014"
$perfQuery="
USE [master]
SELECT TOP 20
total_worker_time/execution_count AS Avg_CPU_Time
,Execution_count
,total_elapsed_time/execution_count as AVG_Run_Time
,total_elapsed_time
,(SELECT SUBSTRING(text,statement_start_offset/2+1,statement_end_offset) FROM sys.dm_exec_sql_text(sql_handle)) AS Query_Text
FROM sys.dm_exec_query_stats
ORDER BY Avg_CPU_Time DESC
"
Invoke-Sqlcmd -ServerInstance $serverInstanceP -Database master -Query $perfQuery -QueryTimeout 9400| Format-Table -AutoSize
Start-Sleep -s 3
$i++
}
while ($i -le $number)
}
PerformerQuery
I tried "Out-String" and "Out-GridView", but the output is the same. I cannot drop "ft", as without it the data prints as discrete records.
Please refer to the screenshots of the output (both the command-line output and the database server output; screenshots not reproduced here).
Did you try adding -Wrap to the Format-Table, and/or piping it to Out-String -Width 500 so it does not use the terminal width as a limit?
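For example, a quick sketch against the loop above:
Invoke-Sqlcmd -ServerInstance $serverInstanceP -Database master -Query $perfQuery -QueryTimeout 9400 |
    Format-Table -AutoSize -Wrap | Out-String -Width 500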
I'm trying to read values from a CSV file, embed them into an INSERT T-SQL statement, and run that statement using Invoke-Sqlcmd.
Here's my code:
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
Import-CSV $ImportFile | ForEach-Object { `
$RowData = "Col1=$($_.{My Ref})","Col2=$($_.{General satisfaction})","Col3=$($_.Helpfulness)","Col4=$($_.Effort)"
Invoke-Sqlcmd `
-Database $DBName -ServerInstance $SQLServer `
-Query $InsertQry `
-Variable $RowData
}
The script works fine for rows in the CSV file that contain values for each column. Unfortunately for me, some of the rows in the CSV file contain empty values (so perhaps only the first two columns contain values). These rows fail to be inserted into the table, and generate the following error:
Invoke-Sqlcmd : The format used to define the new variable for
Invoke-Sqlcmd cmdlet is invalid. Please use the 'var=value' format for
defining a new variable.
The potentially empty columns all either are empty or contain a single-digit number from 1 to 5.
I've tried various ways to escape the value, cast it to a different data type, add zero or an empty string to it, null coalesce it, but I cannot get a solution that works.
I have control over the destination table, so I'd be happy to pass zero, empty string, null or any other value as a placeholder for the empty values.
As per the documentation, you are to pass variables in a string array where each element has a "key=value" format. You are building that correctly. Invoke-Sqlcmd seems to take offence at null values being passed. The nulls, of course, are coming from blank entries in your CSV. Assuming you allow nulls in those columns, then perhaps you could just adjust the query on each loop pass instead.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('{0}','{1}','{2}','{3}')"
$propertiesToSplat = @{
Database = $DBName
ServerInstance = $SQLServer
}
Import-CSV $ImportFile | ForEach-Object {
$propertiesToSplat.Query = $InsertQry -f $_."My Ref", $_."General satisfaction", $_.Helpfulness, $_.Effort
Invoke-Sqlcmd @propertiesToSplat
}
So at each loop pass we use the format operator to insert the column values into your insert statement. Using curly braces in property names is useful when your properties contain special characters. Since you just have to deal with a space, quotes work just as well.
I also wanted to show you splatting, which is a method to pass parameters as a hashtable to a cmdlet. This lets you edit them on the fly and keep your lines shorter without having to worry about backticks everywhere.
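As a toy illustration of splatting (nothing to do with SQL, just the mechanics):
# Build the parameters as a hashtable, then pass them with @
$params = @{
    Path   = 'C:\Users'
    Filter = '*.csv'
}
Get-ChildItem @params    # equivalent to: Get-ChildItem -Path 'C:\Users' -Filter '*.csv'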
Edit - completely new answer
I suck at ForEach-Object. This is a foreach loop that checks the value of "General satisfaction" for each line in the CSV, and replaces it with a placeholder string before building the $RowData variable. Unfortunately I cannot test it here; please let me know how you get on.
Push-Location; Import-Module SQLPS -DisableNameChecking; Pop-Location
$InsertQry = "insert into $ImportTable VALUES ('`$(Col1)','`$(Col2)','`$(Col3)','`$(Col4)') "
$myCSVFile = Import-CSV $ImportFile
foreach($line in $myCSVFile){
if($line."General satisfaction" -eq $null -or $line."General satisfaction" -eq ""){
$line."General satisfaction" = "placeholder"
}
$RowData = "Col1=$($line.{My Ref})","Col2=$($line.{General satisfaction})","Col3=$($line.Helpfulness)","Col4=$($line.Effort)"
Invoke-Sqlcmd -Database $DBName -ServerInstance $SQLServer -Query $InsertQry -Variable $RowData
}
I don't know where the error is in my PowerShell script.
#Add-PSSnapin SqlServerCmdletSnapin100
#Add-PSSnapin SqlServerProviderSnapin100
$DATA=IMPORT-CSV "C:\Users\pzd74f\Desktop\SLAPERFMAT2011.csv" -header("SLA1 Applications Availability")"
FOREACH ($LINE in $DATA)
{
$SLADefinition="`'"+$SLADefinition+"`'"
# insert into this
$SQLHEADER="INSERT INTO [SP2010_EDCSLA_AppDBHIM].[dbo].SLAPerfMatrix ([SLADefinition])"
#insert into this
$SQLVALUES=" VALUES ($SLADefinition)"
$SQLQUERY=$SQLHEADER+$SQLVALUES
Invoke-Sqlcmd -Query $SQLQuery -ServerInstance localhost
}
The line
$DATA=IMPORT-CSV "C:\Users\pzd74f\Desktop\SLAPERFMAT2011.csv" -header("SLA1 Applications Availability")"
is syntactically incorrect
Remove the extra " at the end.
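With the stray quote removed (and assuming that single header name is what you intended), the line would read:
$DATA = Import-Csv "C:\Users\pzd74f\Desktop\SLAPERFMAT2011.csv" -Header "SLA1 Applications Availability"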
I have a file "servers.txt":
[Server1]
Value_A
Value_B
Value_C
[Server2]
Value_A
[Server3]
Value_A
Value_B
Value_C
Value_D
===
I need to search this file and display the server line plus all of its values.
Something like:
$search = "server3"
gc servers.txt | Select-String -Pattern $search and display until the next "["
(I can't just display, for example, the line + 1, because the number of values differs; sometimes there are 3, sometimes 1, etc.)
Thanks a lot!
How about:
$search = "server3"
(gc servers.txt -Delimiter '[') -like "$search]*" -replace '^','[' -replace '\s*\[$'
Cleaner solution (I think):
(gc servers.txt -raw) -split '\r\n(?=\[)' -like "?$search]*"
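With the sample file above and $search = "server3", both versions should return the block (assuming the trailing === is not actually part of the file):
[Server3]
Value_A
Value_B
Value_C
Value_D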
Looks like your delimiter is a blank line. How about reading the file and processing it so that the first line is the server name and all the following lines until a blank are an array of data? On each blank line it outputs a custom object with the server name and data array as properties, creating an array of those objects.
Hm, that's confusing, and I wrote it. Let me post code, and then explain it.
$Server = ""
$Data = @()
$Collection = @()
Switch(GC C:\temp\test.txt){
{[String]::IsNullOrEmpty($Server) -and !([String]::IsNullOrWhiteSpace($_))}{$Server = $_;Continue}
{!([String]::IsNullOrEmpty($Server)) -and !([String]::IsNullOrEmpty($_))}{$Data+=$_;Continue}
{[String]::IsNullOrEmpty($_)}{$Collection+=[PSCustomObject]@{Server=$Server;Data=$Data};Remove-Variable Server; $Data=@()}
}
If(!([String]::IsNullOrEmpty($Server))){$Collection+=[PSCustomObject]@{Server=$Server;Data=$Data};Remove-Variable Server; $Data=@()}
Ok, it starts out by defining variables as either empty strings or arrays.
Then it processes each line, and performs one of three actions depending on the situation. The first line of the switch reads the text file, and processes it line by line. The first option in the Switch basically reads:
If there is nothing stored in the $Server variable, and the current line is not blank, then $Server = Current Line. Continue to the next line.
The second option is:
If $Server is not blank, and the current line is not blank, add this line to the array $Data. Continue to the next line.
The last option for the Switch is:
If the current line is blank, then this is the end of the current record. Create a custom object with two properties. The first property is named Server, and the value is whatever is in $Server. The second property is named Data, and the value is whatever is in $Data. Then remove the $server variable, and reset $Data to an empty array.
After the switch it checks to see if $Server still has data, and outputs one last object if it does. I do this in case there is no blank line at the end of the last record, just as cleanup.
What you are left with is $Collection being an array of objects that looks something like this:
Server Data
------ ----
[Server1] {Value_A , Value_B , Value_C}
[Server2] {Value_A}
[Server3] {Value_A , Value_B , Value_C , Value_D}
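From there you can pull a single server's values back out, e.g.:
$search = "server3"
($Collection | Where-Object { $_.Server -like "*$search*" }).Data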
I'm trying to export a database table to text (CSV-ish) for a later BULK INSERT.
It would be a lot less hassle to have dates in ISO format yyyy-mm-dd. I have, I believe, finally persuaded SQL Server Express to expect British format in its import (despite the greyed out Server Properties being stuck in "English (US)" no matter what I do). I changed the user accounts to British, and that corresponds to my PowerShell CSV export format.
But I'd rather use ISO format to route around the problem for good.
At the moment, having filled a table variable from a SELECT * FROM Table and piped that into Export-CSV, the dates are coming out in the generated text file as dd/mm/yyyy format.
How can I force the PowerShell script to use ISO format dates in all statements (i.e. no specifying formats in each individual command), so the Export-CSV will write them as I need? I've been going around in circles for a couple of hours looking at 'cultures' and things, but I'm utterly confused!
Try changing your culture's date format:
PS C:\> $(get-date).ToShortDateString()
2/16/2013
PS C:\> $(Get-Culture).DateTimeFormat.ShortDatePattern = 'yyyy-MM-dd'
PS C:\> $(get-date).ToShortDateString()
2013-02-16
FYI I've done quite a bit with bulk inserting into SQL using PowerShell, and I found that the simplest way to approach the problem was to:
1. Export the data to CSV with Export-Csv -Delimiter "`t" - this gives a tab-delimited file.
2. When bulk inserting, insert into a temp table that has all the columns set to the NVARCHAR(MAX) datatype.
3. Create a second temp table that has the proper data types set.
4. Select the records from temp table 1 into temp table 2, with a REPLACE command in SQL to replace all quotes with nothing.
This only really falls over for me when I come across a column that contains a tab within its own data - a pain, but I just replace the tabs with spaces in those columns when I come across them.
As I was dealing with CSV files with many thousands of lines, this was the simplest way I could do it, and the quickest speed-wise as it's all set-based.
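A rough sketch of that approach - the table names, file path, and column names are all made up for illustration:
# 1) Export to tab-delimited text (assumes $data already holds the rows)
$data | Export-Csv 'C:\export\MyTable.txt' -Delimiter "`t" -NoTypeInformation

# 2-4) Stage into an all-NVARCHAR temp table, then copy into a typed one,
#      stripping the quotes Export-Csv adds around every field
$staging = @"
CREATE TABLE #Stage1 (Col1 NVARCHAR(MAX), Col2 NVARCHAR(MAX));
BULK INSERT #Stage1 FROM 'C:\export\MyTable.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
CREATE TABLE #Stage2 (Col1 INT, Col2 DATETIME);
INSERT INTO #Stage2
SELECT REPLACE(Col1, '"', ''), REPLACE(Col2, '"', '') FROM #Stage1;
"@
Invoke-Sqlcmd -ServerInstance $SQLServer -Query $staging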
Many thanks jbockle for the help, I can now take data home from the office (SQL Server 2005) and import it into identical tables (from CREATE .sql scripts) on my home Win XP machine running SQL Server 2008 Express.
In this first example, the table is exported directly to CSV and then cleaned up afterwards. The Convert-Line function removes " quotes because BULK INSERT doesn't like them, and also adds extra backtick delimiters to the start and end of each line, so that it can then replace any True with 1 and any False with 0 (anywhere on the line) because Booleans are tricky :)
(it seems to have a problem with adjacent Booleans, so this pass runs twice to mop them all up!)
The final line trims the unwanted ` from the start & end of each line.
## PowerShell newbies : for scripts to run, you must first execute: Set-ExecutionPolicy RemoteSigned
## and then all scripts will work (it's a global setting, remembered once run)
$SQLDT = New-Object "System.Data.DataTable"
$path = "C:"
(Get-Culture).DateTimeFormat.ShortDatePattern="yyyy-MM-dd" # temp setting, for dates in ISO format
function Convert-Line
{ param( [string]$line=$(throw 'a CSV line is required.'))
## add ` to start and end, combined with removing quotes
$line = "``" + $line + "``" -replace "`"", ""
## swap Boolean True/False to 1 or 0
## !! Need to do it twice, as it has trouble with adjacent ones!!
$line = $line -replace "``True``","``1``" -replace "``False``","``0``"
$line = $line -replace "``True``","``1``" -replace "``False``","``0``"
## return with trimmed off start/end delimiters
$line.TrimStart("``").TrimEnd("``")
}
function Table-Export
{ param( [string]$table=$(throw 'table is required.'))
## Get whole SQL table into $SQLDT datatable
$sqldt.reset()
$connString = "Server=.\SQLEXPRESS;Database=Test1;Integrated Security=SSPI;"
$da = New-Object "System.Data.SqlClient.SqlDataAdapter" ("select * from $table",$connString)
[void]$da.fill($SQLDT)
## Export to CSV with ` delimiter
$sqldt | Export-Csv $path\$table.CSV -NoTypeInformation -delimiter "``"
## read entire file, parse line by line, process line, write back out again
(gc $path\$table.CSV) | Foreach-Object { Convert-Line -line $_ } | Set-Content $path\$table.CSV
}
# main...
Table-Export -table "Table1"
Table-Export -table "Table2"
Table-Export -table "Table3etc"
This imports nicely using SQL
DELETE FROM table1;
BULK INSERT table1 FROM 'C:\table1.csv' WITH (KEEPIDENTITY, FIELDTERMINATOR = '`');
DELETE FROM table2;
BULK INSERT table2 FROM 'C:\table2.csv' WITH (KEEPIDENTITY, FIELDTERMINATOR = '`');
-- etc, all tables
Original Identity fields are preserved for table joins to still work.
Works fine with field types : numerical, text, boolean, date.
BULK INSERT will complain about the first line containing field names, but that's an ignorable warning (don't bother trying FIRSTROW = 2 as it doesn't work).
In this second example, another approach is taken - this time the DataTable is copied to a new one where each column is string type, so that each field can be adjusted without type problems. The copy datatable is then exported to CSV, and then all we need to do is process it to remove the unwanted doublequotes.
This time we get a chance to replace " in any string fields, so neither double quotes nor commas will break them, e.g. a name like John "JJ" Smith will end up as John 'JJ' Smith, which is hopefully acceptable enough.
$SQLDT = New-Object "System.Data.DataTable"
$path = "C:"
(Get-Culture).DateTimeFormat.ShortDatePattern="yyyy-MM-dd" # temp setting, for dates in ISO format
function Table-Export
{ param( [string]$table=$(throw 'table is required.'))
## Get whole SQL table into $SQLDT datatable
$sqldt.reset()
$connString = "Server=.\SQLEXPRESS;Database=Test1;Integrated Security=SSPI;"
$da = New-Object "System.Data.SqlClient.SqlDataAdapter" ("select * from $table",$connString)
[void]$da.fill($SQLDT)
## Copy $SqlDt DataTable to a new $DT2 copy, with all columns now String type
$DT2 = New-Object "System.Data.DataTable"
$sqldt.columns | Foreach-Object { $DT2.Columns.Add($_.Caption) > $null }
## copy all $SqlDt rows to the new $DT2
## and change any " double quote in any field to a ' single quote, to preserve meaning in text fields
## ( or you could use an odd char and replace in SQL database later, to return to " )
For($i=0;$i -lt $sqldt.Rows.Count;$i++)
{ $DT2.Rows.Add() > $null
For($i2=0;$i2 -lt $sqldt.Columns.Count;$i2++)
{ $DT2.Rows[$i][$i2] = $SQLDT.Rows[$i][$i2] -replace "`"","'" }
}
## If any $SqlDt column was Boolean...
## use column name.. and for all rows in the new $DT2 : convert True/False to 1/0
$sqldt.columns | Foreach-Object {
If ($_.DataType.Name -EQ "Boolean")
{ $ColName = $_.Caption
For($i=0;$i -lt $sqldt.Rows.Count;$i++)
{ If ($DT2.Rows[$i][$ColName] -EQ "True") { $DT2.Rows[$i][$ColName]="1" }
If ($DT2.Rows[$i][$ColName] -EQ "False") { $DT2.Rows[$i][$ColName]="0" }
}
}
}
## Export to CSV with ` delimiter
$DT2 | Export-Csv $path\$table.CSV -NoTypeInformation -delimiter "``"
## read entire file, parse line by line, remove all ", write back out again
(gc $path\$table.CSV) | Foreach-Object {$_ -replace "`"", "" } | Set-Content $path\$table.CSV
}
# main...
Table-Export -table "Table1"
Table-Export -table "Table2"
Table-Export -table "Table3etc"
Empty tables won't break this script; you'll just get a zero-byte CSV file.