I am using PowerShell to run a SQL query, and I then want to update another table based on information pulled from that query. I have tested my SQL query and update statements directly in SQL Server Management Studio, so I know that they work; the results of those tests show that there are over 800 records that should be updated. However, when I run the same query and update from within PowerShell, it only updates one record. I mostly copied this script from another, much larger script written in a similar format, but it appears that I am missing a ForEach loop (or something similar) and cannot figure out where to place it or how. Here is my script:
# Set the database connection strings
$Conn02 = new-object System.Data.SqlClient.SqlConnection "SERVER_INFORMATION"
$mySQL02 = New-Object System.Data.SqlClient.SqlCommand
$mySQL02.Connection = $Conn02
$Conn03 = new-object System.Data.SqlClient.SqlConnection "SERVER_INFORMATION"
$mySQL03 = New-Object System.Data.SqlClient.SqlCommand
$mySQL03.Connection = $Conn03
#Connect to the database to perform the query
$Conn02.Open()
$mySQL02.CommandText = "SELECT IDNUM, FNAME, LNAME
FROM TABLE1
WHERE STATUS = 'C'"
$SQL02 = $mySQL02.ExecuteReader()
WHILE($SQL02.Read()){
$NEWID = $SQL02['ID_NUM']
$FNAME1 = $SQL02['FNAME']
$LNAME1 = $SQL02['LNAME']
}
#Run the update
$Conn03.Open()
$mySQL03.CommandText =
"INSERT INTO TABLE2 (user_id,firstname,lastname)
VALUES ('$NEWIDme','$FNAME1','$LNAME1')"
Thank you for your time
As @jeroen said, you need to move the INSERT statement into the while loop. Here is what the code should look like:
#Connect to the database to perform the query
$Conn02.Open()
$mySQL02.CommandText = "SELECT IDNUM, FNAME, LNAME
FROM TABLE1
WHERE STATUS = 'C'"
$SQL02 = $mySQL02.ExecuteReader()
#Save reader into datatable
$Datatable = New-Object System.Data.DataTable
$Datatable.Load($SQL02)
#Close the connection
$Conn02.Close()
#Run the update
$Conn03.Open()
foreach($row in $Datatable){
$NEWID = $row['IDNUM']
$FNAME1 = $row['FNAME']
$LNAME1 = $row['LNAME']
$mySQL03.CommandText =
"INSERT INTO TABLE2 (user_id,firstname,lastname)
VALUES ('$NEWID','$FNAME1','$LNAME1')"
$mySQL03.ExecuteNonQuery()
}
$Conn03.Close()
To prevent SQL injection, I suggest using parameters when assigning the values:
foreach($row in $Datatable){
$NEWID = $row['IDNUM']
$FNAME1 = $row['FNAME']
$LNAME1 = $row['LNAME']
$mySQL03.CommandText =
"INSERT INTO TABLE2 (user_id,firstname,lastname)
VALUES (@NEWID,@FNAME1,@LNAME1)"
$mySQL03.Parameters.Clear()
$mySQL03.Parameters.AddWithValue('@NEWID',$NEWID) | Out-Null
$mySQL03.Parameters.AddWithValue('@FNAME1',$FNAME1) | Out-Null
$mySQL03.Parameters.AddWithValue('@LNAME1',$LNAME1) | Out-Null
$mySQL03.ExecuteNonQuery()
}
Related
I have a PowerShell script where I select all the values (they are numeric) from the column of a tab in an Excel sheet. It works, but I want to exclude numbers that are returned from a select on an Oracle table. For the moment I just want to get the select with "where not in" functioning; I will worry about putting the values I want excluded into an array later.
My PowerShell select is below:
$Table = "Inventory Parts$"
$qry = "select [Part No] from [{0}]" -f $Table;
$cmd.CommandText = $qry;
$da.SelectCommand = $cmd;
$dt = new-object System.Data.dataTable("$($Table)");
$null = $da.fill($dt);
I select from the tab (Inventory Parts) and assign it to a data table. How do I put in "where not in" to the select? If I could just have an array hardcoded with values and use that as the values for "where not in" it would be start. After that I will instead populate the array from the Oracle table.
Thank you for any replies.
Edit:
I have got as far as populating an array with the values I want in the "where not in" clause (see below).
$queryString = "select PART_NO from INVENTORY_PART_CFV WHERE PART_NO like
'100009%' "
$array = New-Object System.Collections.ArrayList
$command = New-Object System.Data.OracleClient.OracleCommand($queryString, $connection)
$connection.Open()
$rdr = $command.ExecuteReader();
Foreach($row in $rdr)
{
$array.add( $row.Item("PART_NO") )
}
They go into the array "$array" but I need to append this on to my select statement
"select PART_NO from INVENTORY_PART_CFV WHERE PART_NO NOT IN " + $array
I don't know how to do this though.
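One possible approach (a sketch, not from the thread): quote each array element and join the pieces with commas, then splice the list into the query text. The table and column names come from the question; this assumes the part numbers are plain strings with no embedded quotes.

```powershell
# Build a quoted, comma-separated list from $array, then splice it in
$inList = ($array | ForEach-Object { "'$_'" }) -join ','
$queryString = "select PART_NO from INVENTORY_PART_CFV WHERE PART_NO NOT IN ($inList)"
```

When the excluded values come from another table in the same database, a subquery (`NOT IN (select PART_NO from ...)`) run entirely on the Oracle side would avoid building the list in PowerShell at all.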
I'm new to PowerShell and would like to:
-delete all rows in a sql server DB that have a date older than 10 years
-for every row that is deleted also delete a folder or a hard disk
So for example if I run the query
DELETE FROM [RMS].[dbo].[requests] where date_logged < DATEADD(year, -10, GetDate())
I then thought I could get the lowest request_id and just delete any folders under that number.
So for example if I delete 10 rows with my delete query and then do a select
It would say that the lowest request_id is 11.
I've started below, but I'm not sure how to capture what the oldest request_id is.
The SQL would be this ...
SELECT TOP 1 request_id FROM [RMS].[dbo].[requests] order by request_id asc
And also how I would delete any folder "less" than that value.
So if request_id = 11 then I'd need to delete
C:\temp\1
C:\temp\2
C:\temp\3
...
C:\temp\10
Thanks
P
$connectionString = "Data Source=server;Initial Catalog=RMS;Integrated Security=SSPI";
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString);
$commandR = New-Object System.Data.SqlClient.SqlCommand("DELETE FROM dbo.requests WHERE request_id= 1", $connection);
$commandCount = New-Object System.Data.SqlClient.SqlCommand("select count(*) from requests", $connection);
$connection.Open();
$rowsDeletedR = $commandR.ExecuteNonQuery();
Write-Host "$rowsDeletedR rows deleted";
$rowsCountR = $commandCount.ExecuteScalar();
Write-Host "$rowsCountR rows in requests table";
$connection.Close();
Your task is broad, so I intentionally split it into smaller pieces. Take a look at this demo and the comments.
Since Invoke-SqlCmd is considered harmful (SQL injection), I use my own function to invoke SQL:
function Invoke-Sql(
$ConnectionString,
$Query,
$Parameters
) {
$conn = New-Object System.Data.SqlClient.SqlConnection -ArgumentList $ConnectionString
$cmd = New-Object System.Data.SqlClient.SqlCommand -ArgumentList $Query,$conn
$conn.Open()
if ($Parameters) {
foreach ($arg in $Parameters.GetEnumerator()){
$cmd.Parameters.AddWithValue($arg.Key, $arg.Value) | Out-Null;
}
}
$reader = $cmd.ExecuteReader()
if ($reader.Read()) {
[string[]]$columns = 0..($reader.FieldCount-1) |
% { if ($reader.GetName($_)) { $reader.GetName($_) } else { "(no name $_)" } }
do {
$obj = @{}
0..($reader.FieldCount-1) | % { $obj[$columns[$_]] = $reader[$_] }
[PSCustomObject]$obj
} while ($reader.Read())
}
$reader.Dispose()
$cmd.Dispose()
$conn.Dispose()
}
You need a database table. Since there is no strict schema in question, I assume following, minimal:
$conn = 'Data Source=.;Initial Catalog=Test;Integrated Security=SSPI'
$createTestTable = @'
CREATE TABLE MyRequests
(
RequestId int,
DateLogged datetime
)
'@
Invoke-Sql $conn $createTestTable
There is no sample data, so I assume folders named 1, 2, 3, ..., 7 and matching records in the SQL database:
1..7 | % {
Invoke-Sql $conn 'INSERT MyRequests VALUES (@id,@value)' @{id=$_;value=[DateTime]::Now.AddDays(-$_)}
mkdir $_
}
Table should contain following records (dates may differ):
RequestId DateLogged
----------- -----------------------
1 2018-09-23 14:47:49.113
2 2018-09-22 14:47:49.130
3 2018-09-21 14:47:49.137
4 2018-09-20 14:47:49.140
5 2018-09-19 14:47:49.140
6 2018-09-18 14:47:49.143
7 2018-09-17 14:47:49.147
Then, the final solution:
#get deleted ids using the OUTPUT clause
$older = Invoke-Sql $conn 'DELETE FROM MyRequests OUTPUT deleted.RequestId WHERE DateLogged<@date' @{date=[DateTime]::Now.AddDays(-4)}
#for each id in the returned set, delete the corresponding folder
$older | select -ExpandProperty RequestId | % { rm $_ }
I have been wrestling with database connection to PostgreSQL from Powershell. I finally am able to connect to and insert into the database. Now I can't figure out how to extract data from a DB select into a variable.
I'm not including my insert for the sake of clarity but will tack it onto this thread later as I know it was super hard to find and may be helpful to someone.
so here's my code:
# use existing 64 bit ODBC System DSN that we set up manually
$DBconn = New-Object -comobject ADODB.Connection
$DBconn.Open("PostgreSQL35W")
$theQuery = "select * from test1"
$theObject = $DBconn.Execute($theQuery) # $theObject is a System.__ComObject
$numRecords = $theObject.RecordCount
write-host "found $numRecords records" # getting -1
$theObject.MoveFirst() # throws no error
# $theValue = $theObject.DataMember # throws no error, but gives no result
$theValue = $theObject.Index[1] # throws "Cannot index into a null array"
write-host($theValue)
Try this:
replace "#database#" with your database name in $cnString
replace "#server_ip#" with your server ip address in $cnString
replace "#user#" with a valid user in $cnString and $user
replace "#pass#" with a valid pass in $pass
replace "#table#" with a valid table name of your db
replace 5432 with your db port
$cnString = "DRIVER={PostgreSQL Unicode(x64)};DATABASE=#database#;SERVER=#server_ip#;PORT=5432;UID=#user#;"
$user="#user#"
$pass="#pass#"
$conn = New-Object -comobject ADODB.Connection
$conn.Open($cnString,$user,$pass)
$recordset = $conn.Execute("SELECT * FROM #table# limit 1;")
while ($recordset.EOF -ne $True)
{
foreach ($field in $recordset.Fields)
{
'{0,30} = {1,-30}' -f # this line sets up a nice pretty field format, but you don't really need it
$field.name, $field.value
}
'' # this line adds a line between records
$recordset.MoveNext()
}
$conn.Close();
Via psql, which comes with postgresql
$dburl="postgresql://exusername:expw@exhostname:5432/postgres"
$data="select * from extable" | psql --csv $dburl | ConvertFrom-Csv
You must have psql in your path or reference it directly; it lives in e.g. C:\Program Files\PostgreSQL\12\bin. You should be able to type "psql" and see output within PowerShell.
As a warning, expect strings. E.g. $data[0].age.GetType() would be string, despite being stored in the database as an integer. You can immediately cast it, cast it later, or hope PowerShell infers the type correctly.
If you want to add back in type information can do e.g.:
$data = $data | %{[pscustomobject]@{name=$_.name;age=[int]$_.age}}
I ended up figuring it out - here's what I did
$conn = New-Object -comobject ADODB.Connection
# use existing 64 bit ODBC System DSN that we set up manually
$conn.Open("PostgreSQL35W")
$recordset = $conn.Execute("SELECT * FROM JobHistory")
while ($recordset.EOF -ne $True)
{
foreach ($field in $recordset.Fields)
{
'{0,30} = {1,-30}' -f # this line sets up a nice pretty field format, but you don't really need it
$field.name, $field.value
}
'' # this line adds a line between records
$recordset.MoveNext()
}
$conn.Close();
Exit
Use dot notation; you don't need to split the data.
$list = New-Object Collections.Generic.List[OnlineCourse]
foreach($element in $results)
{
$tempObj= New-Object OnlineCourse($element.id,$element.courseName,$element.completedRatio,$element.completedRatio,$element.lastActivity, $element.provider)
$list.add($tempObj)
}
I have a slightly different approach to @dog: I couldn't get the --csv option to work, so I resorted to tuple-only rows returned, then parsed them into a List of classes (which happen to be called OnlineCourse):
class OnlineCourse
{
[int]$id
[string]$email
[string]$courseName
[int]$completedRatio
[datetime]$lastActivity
[String]$provider
OnlineCourse([int]$id,
[string]$email,
[string]$courseName,
[int]$completedPerc,
[datetime]$lastActivity,
[String]$provider) {
$this.id = $id
$this.email = $email.Trim()
$this.courseName = $courseName.Trim()
$this.completedRatio = $completedPerc
$this.lastActivity = $lastActivity
$this.provider = $provider.Trim()
}
}
$connstr="postgresql://exusername:expw@exhostname:5432/postgres"
$data = "select * from onlinecourses" | .\psql -t $connstr
$list = New-Object Collections.Generic.List[OnlineCourse]
foreach ($field in $data) {
$id, $email, $courseName, $completedratio, $lastactivity, $provider = $field.split('|')
$course = [OnlineCourse]::new($id, $email, $courseName, $completedratio, $lastactivity, $provider)
$list.Add($course)
}
This is slightly adapted from another answer and it worked for me.
$dburl="postgresql://postgres:secret_pwd@database-host:5432/dbname"
$psqlPath = 'C:\Program Files\PostgreSQL\11\bin\psql.exe'
function Query {
param($Sql)
Write-Host $Sql
$rows = $Sql `
| &$psqlPath "-A" $dburl | ConvertFrom-Csv -Delimiter '|'
$result = @($rows | Select-Object -SkipLast 1)
Write-Host "-> " (ConvertTo-Json $result)
$result
}
$rows = Query "select ... from ..."
I am trying to upload some string values into an Oracle table by means of PowerShell. However, when I upload strings directly, some characters show up as ? in the table.
Actually, I first parse a text and retrieve some results through a regex, as below:
if($wiki_link -match "http:\/\/en\.wikipedia\.org\/wiki\/(.*)") {$city = $matches[1]}
Then I wanna upload this $city variable into a table as below:
[System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient")
$connectionString = "Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(Host=xxxxxxxxx)(Port=1521)))(CONNECT_DATA=(SERVER = DEDICATED) (SERVICE_NAME =xxxxx)));user id=xxxxxx;password=xxxxx"
$connection = New-Object System.Data.OracleClient.OracleConnection($connectionString)
$connection.Open()
$cmd2=$connection.CreateCommand()
$cmd2.CommandText="insert into mehmet.goo_region (city) values ('$city')"
$rdr2=$cmd2.ExecuteNonQuery()
When I apply this method, the city named Elâzığ appears as Elaz?? in the table cell.
I guess I have to convert the string into UTF-8, but I could not find a solution on the web.
Thanks in advance...
Try this, it should work:
$u = New-Object System.Text.UTF8Encoding
$s = $u.GetBytes("YourStringGoesHere")
$u.GetString($s) ## this is your UTF-8 string
So your code becomes
$u = New-Object System.Text.UTF8Encoding
$s = $u.GetBytes($city)
$utf8city = $u.GetString($s)
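A different angle, offered as an assumption rather than something from the thread: binding the value as a parameter lets the Oracle driver handle the character encoding, instead of embedding the literal in the SQL text as the question's code does. The names below reuse the question's own variables.

```powershell
# Parameterized variant of the insert from the question (sketch):
$cmd2 = $connection.CreateCommand()
$cmd2.CommandText = "insert into mehmet.goo_region (city) values (:city)"
$cmd2.Parameters.Add("city", $city) | Out-Null
$rdr2 = $cmd2.ExecuteNonQuery()
```

Whether this fixes the ? characters also depends on the column type (NVARCHAR2 vs VARCHAR2) and the database character set.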
I am trying to load extremely large CSV files into SQL Server using PowerShell. The code also has to apply on-the-fly regex replacements and allow for various delimiters, EOR, and EOF markers. For maintenance, I would really like all of this logic to exist in PowerShell without importing assemblies.
To be efficient, I know I need to use the SqlBulkCopy method. But all of the PowerShell examples I see fill a DataTable and pass it, which is not possible for me because of the file size.
I am pretty sure I need to wrap StreamReader in an IDataReader and then pass that to SqlBulkCopy. I found a couple of great examples of this implemented in C#:
http://archive.msdn.microsoft.com/FlatFileDataReader
http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader
Is it possible to accomplish this functionality using native PowerShell without importing the C# assembly? I am specifically having a hard time converting the abstract class wrapper.
This is the code I have so far; it does not pass an IDataReader and breaks on memory limits.
function Get-CSVDataReader()
{
param (
[string]$path
)
$parsedData = New-Object 'System.Collections.Generic.List[string]'
#List<string[]> parsedData = new List<string[]>()
$sr = new-object IO.StreamReader($path)
while ($line = $sr.ReadLine())
{
#regex replace and other logic here
$parsedData.Add($line.Split(','))
}
,$parsedData #if this was an idatareader, the comma keeps it from exploding
}
$MyReader = Get-CSVDataReader('This should not fill immediately. It needs a Read Method.')
Thanks a bunch for the help.
If all you want to do is use a DataReader with SqlBulkCopy, you could use the ACE drivers, which come with Office 2007/2010 and are also available as a separate download, to open an OLEDB connection to the CSV file, open a reader, and call WriteToServer:
$ServerInstance = "$env:computername\sql1"
$Database = "tempdb"
$tableName = "psdrive"
$ConnectionString = "Server={0};Database={1};Integrated Security=True;" -f $ServerInstance,$Database
$filepath = "C:\Users\Public\bin\"
get-psdrive | export-csv ./psdrive.csv -NoTypeInformation -Force
$connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=`"$filepath`";Extended Properties=`"text;HDR=yes;FMT=Delimited`";"
$qry = 'select * from [psdrive.csv]'
$conn = new-object System.Data.OleDb.OleDbConnection($connString)
$conn.open()
$cmd = new-object System.Data.OleDb.OleDbCommand($qry,$conn)
$dr = $cmd.ExecuteReader()
$bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $connectionString
$bulkCopy.DestinationTableName = $tableName
$bulkCopy.WriteToServer($dr)
$dr.Close()
$conn.Close()
#CREATE TABLE [dbo].[psdrive](
# [Used] [varchar](1000) NULL,
# [Free] [varchar](1000) NULL,
# [CurrentLocation] [varchar](1000) NULL,
# [Name] [varchar](1000) NULL,
# [Provider] [varchar](1000) NULL,
# [Root] [varchar](1000) NULL,
# [Description] [varchar](1000) NULL,
# [Credential] [varchar](1000) NULL,
# [DisplayRoot] [varchar](1000) NULL
#)
I'm importing large CSVs via a DataTable and performing batch updates after 1 million rows.
if ($dt.rows.count -eq 1000000) {
$bulkCopy.WriteToServer($dt)
$dt.Clear()
}
Here is the link where I detail my own script on my blog, but the above code outlines the basic concept. My PowerShell script took 4.x minutes to import 9 million rows from a 1.1 GB CSV. The script relied on SqlBulkCopy, [System.IO.File]::OpenText, and a DataTable.
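The flush-and-clear idea above slots into a plain read loop. This is only a sketch of the overall shape: $csvPath, $dt, and $bulkCopy are placeholders assumed to be configured beforehand (connection, destination table, and DataTable columns matching the CSV), and the naive Split does no quoting or regex handling.

```powershell
# Stream the file line by line, batching rows into the DataTable
$sr = [System.IO.File]::OpenText($csvPath)
while ($null -ne ($line = $sr.ReadLine())) {
    [void]$dt.Rows.Add($line.Split(','))
    if ($dt.Rows.Count -eq 1000000) {
        $bulkCopy.WriteToServer($dt)   # flush a full batch
        $dt.Clear()                    # release the rows before the next batch
    }
}
if ($dt.Rows.Count -gt 0) { $bulkCopy.WriteToServer($dt) }  # final partial batch
$sr.Close()
```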