Insert data with a collection object into a SQL Server table - PowerShell

The code below does not error; it inserts into a SQL Server table with no issues. However, the [ServicePrincipalNames] data is not inserted the way I planned.
The value that gets inserted into the table is
Microsoft.ActiveDirectory.Management.ADPropertyValueCollection
What I am trying to insert are the values in that object collection, which look like this:
WSMAN/Server1Name
WSMAN/Server1Name.mx.ds.abc.com
TERMSRV/Server1Name
TERMSRV/Server1Name.mx.ds.abc.com
RestrictedKrbHost/Server1Name
HOST/Server1Name
RestrictedKrbHost/Server1Name.mx.ds.abc.com
HOST/Server1Name.mx.ds.abc.com
The code to do the insert is shown below. How could I change it so the insert puts all the services in the column, separated by |?
$sqlServer='SomeServer'
$catalog = 'SomeDatabase'
$insert = @"
Insert into dbo.ADServers([Name],[OperatingSystem],[OperatingSystemVersion],[ipv4Address],[Created],[Deleted],[whenChanged],[Modified],[Description],[ServicePrincipalNames],[DisplayName],[Location],[DistinguishedName],[DNSHostName])
values('{0}','{1}','{2}','{3}','{4}','{5}','{6}','{7}','{8}','{9}','{10}','{11}','{12}', '{13}')
"#
$start = (Get-Date).ToString('MM/dd/yyyy hh:mm:ss tt')
$connectionString = "Data Source=$sqlServer;Initial Catalog=$catalog;Integrated Security=SSPI"
# connection object initialization
$conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
#Open the Connection
$conn.Open()
# Prepare the SQL
$cmd = $conn.CreateCommand()
# AD computer output transformation to SQL table
Get-ADComputer -Filter {operatingSystem -Like 'Windows*server*2019*'} -Property * |`
Select Name,OperatingSystem,OperatingSystemVersion,ipv4Address,Created,Deleted,whenChanged,Modified,Description,ServicePrincipalNames,DisplayName,Location,DistinguishedName,DNSHostName |`
ForEach-Object {
$cmd.CommandText = $insert -f $_.Name, $_.OperatingSystem, $_.OperatingSystemVersion, $_.ipv4Address, $_.Created, $_.Deleted, $_.whenChanged, $_.Modified,$_.Description, $_.ServicePrincipalNames , $_.DisplayName,$_.Location,$_.DistinguishedName,$_.DNSHostName
$cmd.ExecuteNonQuery()
}
$end = (Get-Date).ToString('MM/dd/yyyy hh:mm:ss tt')
Write-Host $start
Write-Host $end

OK, after more time googling and learning about Out-String: in order to display the objects, I had to create an expression on that column and rewrite it as below, and it worked.
In the Select, replace ServicePrincipalNames with
@{Label="ServicePrincipalNames";Expression={$_.ServicePrincipalNames -join ";" }}
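For reference, the full Select line with the calculated property in place would look something like this (a sketch; only the ServicePrincipalNames column differs from the original pipeline):
Select Name,OperatingSystem,OperatingSystemVersion,ipv4Address,Created,Deleted,whenChanged,Modified,Description,@{Label="ServicePrincipalNames";Expression={$_.ServicePrincipalNames -join ";"}},DisplayName,Location,DistinguishedName,DNSHostName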

Related

PowerShell For Loop for multiple servers - to get SSAS connection string details

I am very new to PowerShell scripting. I am trying to get SSAS Tabular model connection string details for multiple servers. I have code which returns details only for a single server. How do I modify the code to pass multiple servers?
$servername = "servername1"
# Connect SSAS Server
$server = New-Object Microsoft.AnalysisServices.Server
$server.connect($servername)
$DSTable = @();
foreach ( $db in $server.databases)
{
$dbname = $db.Name
$Srver = $db.ParentServer
foreach ( $ds in $db.Model.DataSources)
{
$hash = @{
"Server" = $Srver;
"Model_Name" = $dbname ;
"Datasource_Name" = $ds.Name ;
"ConnectionString" = $ds.ConnectionString ;
"ImpersonationMode" = $ds.ImpersonationMode;
"Impersonation_Account" = $ds.Account;
}
$row = New-Object psobject -Property $hash
$DSTable += $row
}
}
As commented, you can surround the code you have in another foreach loop.
Using array concatenation with += is a bad idea, because on each addition, the entire array needs to be recreated in memory, so that is both time and memory consuming.
Best thing is to let PowerShell do the heavy lifting of collecting the data:
$allServers = 'server01','server02','server03' # etc. an array of servernames
# loop through the servers array and collect the output in variable $result
$result = foreach($servername in $allServers) {
# Connect SSAS Server
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect($servername)
foreach ( $db in $server.databases) {
foreach ( $ds in $db.Model.DataSources) {
# output an object with the desired properties
[PsCustomObject]@{
Server = $db.ParentServer
Model_Name = $db.Name
Datasource_Name = $ds.Name
ConnectionString = $ds.ConnectionString
ImpersonationMode = $ds.ImpersonationMode
Impersonation_Account = $ds.Account
}
}
}
}
# output on screen
$result | Out-GridView -Title 'SSAS connection string details'
# output to a CSV file (change the path and filename here of course..)
$result | Export-Csv -Path 'D:\Test\MySSAS_Connections.csv' -UseCulture -NoTypeInformation
The above uses the -UseCulture parameter so the delimiter used for the CSV file is the one your machine expects when double-clicking the file and opening it in Excel. Without it, the default comma is used.

Is there a simple way to output to xlsx?

I am trying to output a query from a DB to an xlsx file, but it takes a lot of time because there are about 20,000 records to process. Is there a simpler way to do this?
I know there is a way to do it for CSV, but I'm trying to avoid that, because if a record contains a comma it is going to be taken as another column, and that would mess with the info.
This is my code:
$xlsObj = New-Object -ComObject Excel.Application
$xlsObj.DisplayAlerts = $false
$xlsWb = $xlsobj.Workbooks.Add(1)
$xlsObj.Visible = 0 # (1 = visible / 0 = not visible)
$xlsSh = $xlsWb.Worksheets.Add([System.Reflection.Missing]::Value, $xlsWb.Worksheets.Item($xlsWb.Worksheets.Count))
$xlsSh.Name = "QueryResults"
$DataSetTable= $ds.Tables[0]
Write-Output "DATA SET TABLE" $DataSetTable
[Array] $getColumnNames = $DataSetTable.Columns | SELECT *
Write-Output "COLUMN NAMES" $DataSetTable.Rows[0]
[Int] $RowHeader = 1
foreach ($ColH in $getColumnNames)
{
$xlsSh.Cells.item(1, $RowHeader).font.bold = $true
$xlsSh.Cells.item(1, $RowHeader) = $ColH.ColumnName
Write-Output "Nombre de Columna"$ColH.ColumnName
$RowHeader++
}
[Int] $rowData = 2
[Int] $colData = 1
foreach ($rec in $DataSetTable.Rows)
{
foreach ($Coln in $getColumnNames)
{
$xlsSh.Cells.NumberFormat = "#"
$xlsSh.Cells.Item($rowData, $colData) = $rec.$($Coln.ColumnName).ToString()
$ColData++
}
$rowData++; $ColData = 1
}
$xlsRng = $xlsSH.usedRange
[void] $xlsRng.EntireColumn.AutoFit()
# Delete the default Sheet1/Hoja1 tab.
$xlsWb.Sheets(1).Delete() # Version 02
$xlsFile = "directory of the file"
[void] $xlsObj.ActiveWorkbook.SaveAs($xlsFile)
$xlsObj.Quit()
Start-Sleep -Milliseconds 700
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsRng)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsSh)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsWb)) {''}
While ([System.Runtime.Interopservices.Marshal]::ReleaseComObject($xlsObj)) {''}
[gc]::collect() | Out-Null
[gc]::WaitForPendingFinalizers() | Out-Null
$oraConn.Close()
I'm trying to avoid [CSV files], because if the records had any comma is going to take it as a another column and that would mess with the info
That's only the case if you try to construct the output format manually. Built-in commands like Export-Csv and ConvertTo-Csv will automatically quote the values as necessary:
PS C:\> $customObject = [pscustomobject]@{ID = 1; Name = "Solis, Heber"}
PS C:\> $customObject
ID Name
-- ----
1 Solis, Heber
PS C:\> $customObject |ConvertTo-Csv -NoTypeInformation
"ID","Name"
"1","Solis, Heber"
Notice, in the example above, how:
- the string value assigned to $customObject.Name does not contain any quotation marks, but
- in the output from ConvertTo-Csv we see values and headers clearly enclosed in quotation marks
PowerShell automatically enumerates the row data when you pipe a [DataTable] instance, so creating a CSV might (depending on the contents) be as simple as:
$ds.Tables[0] |Export-Csv table_out.csv -NoTypeInformation
What if you want TAB-separated values (or any other non-comma separator)?
The *-Csv commands come with a -Delimiter parameter to which you can pass a user-defined separator:
# This produces semicolon-separated values
$data |Export-Csv -Path output.csv -Delimiter ';'
I usually try to refrain from recommending specific modules/libraries, but if you insist on writing to XLSX I'd suggest checking out ImportExcel (don't let the name fool you, it does more than import from Excel, including exporting and formatting data from PowerShell -> XLSX).
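As a rough sketch of what that could look like here (this assumes the ImportExcel module is installed, e.g. with Install-Module ImportExcel, and reuses the $ds DataSet from the question; the output path is just an example):
# export the first table of the DataSet straight to an .xlsx file
$ds.Tables[0] | Export-Excel -Path 'C:\temp\QueryResults.xlsx' -WorksheetName 'QueryResults' -AutoSize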

Take value from SQL server and delete relevant folders

I'm new to PowerShell and would like to:
- delete all rows in a SQL Server DB that have a date older than 10 years
- for every row that is deleted, also delete a folder on a hard disk
So for example if I run the query
DELETE FROM [RMS].[dbo].[requests] where date_logged < DATEADD(year, -10, GetDate())
I then thought I could get the lowest request_id and just delete any folders under that number.
So, for example, if I delete 10 rows with my delete query and then do a select, it would say that the lowest request_id is 11.
I've started below, but I'm not sure how to capture what the oldest request_id is.
The SQL would be this ...
SELECT TOP 1 request_id FROM [RMS].[dbo].[requests] order by request_id asc
And also, how would I delete any folder "less" than that value?
So if request_id = 11 then I'd need to delete
C:\temp\1
C:\temp\2
C:\temp\3
...
C:\temp\10
Thanks
P
$connectionString = "Data Source=server;Initial Catalog=RMS;Integrated Security=SSPI";
$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString);
$commandR = New-Object System.Data.SqlClient.SqlCommand("DELETE FROM dbo.requests WHERE request_id= 1", $connection);
$commandCount = New-Object System.Data.SqlClient.SqlCommand("select count(*) from requests", $connection);
$connection.Open();
$rowsDeletedR = $commandR.ExecuteNonQuery();
Write-Host "$rowsDeletedR rows deleted";
$rowsCountR = $commandCount.ExecuteScalar();
Write-Host "$rowsCountR rows in requests table";
$connection.Close();
Your task is broad. I intentionally split it into smaller pieces. Take a look at this demo and the comments.
Since Invoke-SqlCmd is considered harmful (SQL injection), I use my own function to invoke SQL:
function Invoke-Sql(
$ConnectionString,
$Query,
$Parameters
) {
$conn = New-Object System.Data.SqlClient.SqlConnection -ArgumentList $ConnectionString
$cmd = New-Object System.Data.SqlClient.SqlCommand -ArgumentList $Query,$conn
$conn.Open()
if ($Parameters) {
foreach ($arg in $Parameters.GetEnumerator()){
$cmd.Parameters.AddWithValue($arg.Key, $arg.Value) | Out-Null;
}
}
$reader = $cmd.ExecuteReader()
if ($reader.Read()) {
[string[]]$columns = 0..($reader.FieldCount-1) |
% { if ($reader.GetName($_)) { $reader.GetName($_) } else { "(no name $_)" } }
do {
$obj = @{}
0..($reader.FieldCount-1) | % { $obj[$columns[$_]] = $reader[$_] }
[PSCustomObject]$obj
} while ($reader.Read())
}
$reader.Dispose()
$cmd.Dispose()
$conn.Dispose()
}
You need a database table. Since there is no strict schema in the question, I assume the following minimal one:
$conn = 'Data Source=.;Initial Catalog=Test;Integrated Security=SSPI'
$createTestTable = @'
CREATE TABLE MyRequests
(
RequestId int,
DateLogged datetime
)
'@
Invoke-Sql $conn $createTestTable
There is no sample data, so I assume folders named 1, 2, 3, ... 7 and matching records in the SQL database:
1..7 | % {
Invoke-Sql $conn 'INSERT MyRequests VALUES (@id,@value)' @{id=$_;value=[DateTime]::Now.AddDays(-$_)}
mkdir $_
}
The table should contain the following records (dates may differ):
RequestId DateLogged
----------- -----------------------
1 2018-09-23 14:47:49.113
2 2018-09-22 14:47:49.130
3 2018-09-21 14:47:49.137
4 2018-09-20 14:47:49.140
5 2018-09-19 14:47:49.140
6 2018-09-18 14:47:49.143
7 2018-09-17 14:47:49.147
Then, the final solution:
#get deleted id's using OUTPUT clause
$older = Invoke-Sql $conn 'DELETE FROM MyRequests OUTPUT deleted.RequestId WHERE DateLogged<@date' @{date=[DateTime]::Now.AddDays(-4)}
#foreach id in returned set, delete corresponding folder
$older | select -ExpandProperty RequestId | % { rm $_ }
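If the folders live under C:\temp as in the question (C:\temp\1, C:\temp\2, ...), a sketch that builds the full path from each returned id could look like this (the exact folder naming is an assumption):
# delete the folder that matches each deleted request id
$older | ForEach-Object { Remove-Item -Path (Join-Path 'C:\temp' $_.RequestId) -Recurse }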

I have some code to download all the tables in my database to csv, is there a way to specify row separators?

Currently the code uses a comma for the column separator and a new line for the row separator.
This is an issue because some of the data in the tables are paragraphs which already include commas and new lines.
I want to be able to use a delimiter with multiple characters, but that returns an error:
Cannot bind parameter Delimiter. Cannot convert value '/~/' to type System.Char
$server = "(server)\instance"
$database = "DBName"
$tablequery = "SELECT name from sys.tables"
#Declare Connection Variables
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $tablequery
$command.Connection = $connection
#Load up the Tables in a dataset
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
# Loop through all tables and export a CSV of the Table Data
foreach ($Row in $DataSet.Tables[0].Rows)
{
$queryData = "SELECT * FROM [$($Row[0])]"
#Specify the output location of your dump file
$extractFile = "C:\temp\backups\$($Row[0]).csv"
$command.CommandText = $queryData
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DataSet.Tables[0] | Export-Csv $extractFile -Delimiter '/~/'
}
Export-Csv and ConvertTo-Csv correctly handle newline and comma characters
It is not a problem that the data could potentially contain commas and/or new lines.
If you look at the CSV "specification" (written in quotation marks here because many usages of the term CSV do not refer to anything following this specification), you'll see that a field in a CSV file can be enclosed in quotation characters. If the data of that field contains a quotation character, the delimiter character or a newline character, it must be enclosed in quotation characters. If the data contains a quotation character, that quotation character should be doubled.
This will all be handled correctly by the ConvertTo-Csv and the Export-Csv cmdlets.
$obj = New-Object PSObject -Property ([ordered]@{
FirstColumn = "First Value";
SecondColumn = "Second value, including a comma";
ThirdColumn ="Third Value including two`nnewline`ncharacters";
FourthColumn = 'Fourth value including a " character'}
)
$obj | ConvertTo-Csv -NoTypeInformation
This will give us the following output:
"FirstColumn","SecondColumn","ThirdColumn","FourthColumn"
"First Value","Second value, including a comma","Third Value including two
newline
characters","Fourth value including a "" character"
Which is correctly handled according to the CSV specification.
So you do not need to worry about the data containing comma characters or newline characters, since it is handled by the CSV format.
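Applied to the export line in the question, a minimal sketch (keeping the rest of the loop unchanged) would be:
# default comma delimiter; quoting takes care of embedded commas and newlines
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation
# or, if you prefer tab-separated output:
$DataSet.Tables[0] | Export-Csv $extractFile -Delimiter "`t" -NoTypeInformation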
Open a CSV file in Excel
I don't know what your current problem with the data is, but I'm guessing you're trying to open the resulting file in Excel and seeing incorrect data. This is because Excel unfortunately doesn't open .csv files as... well... CSV files.
One way (there might be more ways) to open it in Excel is to go to the Data tab and, in the "Get & Transform Data" section, press the "From Text/CSV" button. This way, Excel should open the file correctly according to the CSV standard.

Retrieve data from PostgreSQL using Powershell

I have been wrestling with a database connection to PostgreSQL from PowerShell. I am finally able to connect to and insert into the database. Now I can't figure out how to extract data from a DB select into a variable.
I'm not including my insert for the sake of clarity but will tack it onto this thread later as I know it was super hard to find and may be helpful to someone.
So here's my code:
# use existing 64 bit ODBC System DSN that we set up manually
$DBconn = New-Object -comobject ADODB.Connection
$DBconn.Open("PostgreSQL35W")
$theQuery = "select * from test1"
$theObject = $DBconn.Execute($theQuery) # $theObject is a System.__ComObject
$numRecords = $theObject.RecordCount
write-host "found $numRecords records" # getting -1
$theObject.MoveFirst() # throws no error
# $theValue = $theObject.DataMember # throws no error, but gives no result
$theValue = $theObject.Index[1] # throws "Cannot index into a null array"
write-host($theValue)
Try this:
replace "#database#" with your database name in $cnString
replace "#server_ip#" with your server ip address in $cnString
replace "#user#" with a valid user in $cnString and $user
replace "#pass#" with a valid pass in $pass
replace "#table#" with a valid table name of your db
replace 5432 with your db port
$cnString = "DRIVER={PostgreSQL Unicode(x64)};DATABASE=#database#;SERVER=#server_ip#;PORT=5432;UID=#user#;"
$user="#user#"
$pass="#pass#"
$conn = New-Object -comobject ADODB.Connection
$conn.Open($cnString,$user,$pass)
$recordset = $conn.Execute("SELECT * FROM #table# limit 1;")
while ($recordset.EOF -ne $True)
{
foreach ($field in $recordset.Fields)
{
'{0,30} = {1,-30}' -f # this line sets up a nice pretty field format, but you don't really need it
$field.name, $field.value
}
'' # this line adds a line between records
$recordset.MoveNext()
}
$conn.Close();
Via psql, which comes with PostgreSQL:
$dburl="postgresql://exusername:expw#exhostname:5432/postgres"
$data="select * from extable" | psql --csv $dburl | ConvertFrom-Csv
You must have psql in your path or reference it directly; it's within e.g. C:\Program Files\PostgreSQL\12\bin. You should be able to type "psql" and see output within PowerShell.
As a warning, expect strings. E.g. $data[0].age.GetType() would be string, despite the value being stored in the database as an integer. You can immediately cast it, cast it later, or hope PowerShell infers the type correctly.
If you want to add the type information back in, you can do e.g.:
$data = $data | %{[pscustomobject]@{name=$_.name;age=[int]$_.age}}
I ended up figuring it out - here's what I did:
$conn = New-Object -comobject ADODB.Connection
# use existing 64 bit ODBC System DSN that we set up manually
$conn.Open("PostgreSQL35W")
$recordset = $conn.Execute("SELECT * FROM JobHistory")
while ($recordset.EOF -ne $True)
{
foreach ($field in $recordset.Fields)
{
'{0,30} = {1,-30}' -f # this line sets up a nice pretty field format, but you don't really need it
$field.name, $field.value
}
'' # this line adds a line between records
$recordset.MoveNext()
}
$conn.Close();
Exit
Use dot notation. You don't need to split the data:
$list = New-Object Collections.Generic.List[OnlineCourse]
foreach($element in $results)
{
$tempObj = New-Object OnlineCourse($element.id, $element.email, $element.courseName, $element.completedRatio, $element.lastActivity, $element.provider)
$list.add($tempObj)
}
I have a slightly different approach to @dog's: I couldn't get the --csv option to work, so I resorted to tuple-only rows being returned, then parsed them into a List of classes (which happen to be called OnlineCourse):
class OnlineCourse
{
[int]$id
[string]$email
[string]$courseName
[int]$completedRatio
[datetime]$lastActivity
[String]$provider
OnlineCourse([int]$id,
[string]$email,
[string]$courseName,
[int]$completedPerc,
[datetime]$lastActivity,
[String]$provider) {
$this.id = $id
$this.email = $email.Trim()
$this.courseName = $courseName.Trim()
$this.completedRatio = $completedPerc
$this.lastActivity = $lastActivity
$this.provider = $provider.Trim()
}
}
$connstr="postgresql://exusername:expw#exhostname:5432/postgres"
$data = "select * from onlinecourses" | .\psql -t $connstr
$list = New-Object Collections.Generic.List[OnlineCourse]
foreach ($field in $data) {
$id, $email, $courseName, $completedratio, $lastactivity, $provider = $field.split('|')
$course = [OnlineCourse]::new($id, $email, $courseName, $completedratio, $lastactivity, $provider)
$list.Add($course)
}
This is slightly adapted from another answer and it worked for me.
$dburl="postgresql://postgres:secret_pwd#database-host:5432/dbname"
$psqlPath = 'C:\Program Files\PostgreSQL\11\bin\psql.exe'
function Query {
param($Sql)
Write-Host $Sql
$rows = $Sql `
| &$psqlPath "-A" $dburl | ConvertFrom-Csv -Delimiter '|'
# drop psql's trailing "(n rows)" footer line
$result = @($rows | Select-Object -SkipLast 1)
Write-Host "-> " (ConvertTo-Json $result)
$result
}
$rows = Query "select ... from ..."