PostgreSQL Query export to CSV with PowerShell - powershell

$sf = "\\\\domain\\dept\\dcgsi\\Extracts\\Tableau_Unlicensed_Users.csv"
if (Test-Path $sf){
Remove-Item $sf
}
$query = #"
\\copy (SELECT Name
FROM _users
WHERE licensing_role_name = 'Unlicensed')
TO $sf
WITH CSV DELIMITER ','
"#
$conn = New-Object -ComObject ADODB.Connection
# use existing 64 bit ODBC System DSN that we set up manually
$conn.Open('PostgreSQL30')
$conn.Execute($query)
$conn.Close()
I keep getting an error about "\" on the line with the $conn.Execute() when I try to do this. I assume it has to do with character escaping and maybe I am doing it wrong.
Is there a better way to do this with PowerShell if I just need to get the name field of any record from _users and output it to CSV?
Eventually I will be adding more to this to loop through each record in the CSV and execute a tabcmd to remove all the users that are unlicensed.
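Roughly what I have in mind for that follow-up step is the sketch below; the actual tabcmd command and its arguments are placeholders I still have to look up, and it assumes the CSV ends up with a Name header column (otherwise Get-Content line by line would do).
# sketch only: loop over the exported names and call tabcmd for each one
Import-Csv $sf | ForEach-Object {
$user = $_.Name
Write-Host "Removing unlicensed user: $user"
# & tabcmd <remove/delete command and its arguments go here> $user
}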

$sf = "\\domain\dept\dcgsi\Extracts\Tableau_Unlicensed_Users.csv"
if (Test-Path $sf){
Remove-Item $sf
}
$query = #"
SELECT Name
FROM _users
WHERE licensing_role_name = 'Unlicensed'
"#
function Get-ODBC-Data{
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "DSN=PostgreSQL30;"
$conn.Open()
$cmd = New-object System.Data.Odbc.OdbcCommand($query, $conn)
$ds = New-Object system.Data.DataSet
(New-Object System.Data.Odbc.OdbcDataAdapter($cmd)).Fill($ds) | Out-Null
$conn.Close()
$ds.Tables[0] | Export-Csv $sf
}
Get-ODBC-Data
This did like 99% of what I need; I just have to process the csv now and drop the first two lines. The first line is a type info message and the second is the column header.
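Switching the export line to one of the following (untested on my side) should avoid that post-processing; note the column will likely come back lower-cased as name from PostgreSQL, but PowerShell property names are case-insensitive.
# -NoTypeInformation drops the "#TYPE ..." line but keeps the header row
$ds.Tables[0] | Export-Csv $sf -NoTypeInformation
# or, for a bare list of names with no header at all
$ds.Tables[0] | Select-Object -ExpandProperty Name | Set-Content $sf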

Related

Write PowerShell console text to a file

I have a PowerShell script that connects to a DB and loops over some data.
After the script finishes or throws an error, I need to append whatever text was displayed in the console to a log file.
I couldn't achieve that using Write-Output because I don't want to save specific values; I just need the whole console text to be appended to a file.
Thank you.
EDIT :
In fact, the final result I'm looking for is a log file with timestamps. Here is my code:
$server = "USER\SQLEXPRESS"
$database = "database_test"
$tablequery = "SELECT name from sys.tables"
#Declare Connection Variables
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $tablequery
$command.Connection = $connection
#Load up the Tables in a dataset
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$DriveName = (get-location).Drive.Name
$extractDir = md -Force "$($DriveName):\csv\files"
# Loop through all tables and export a CSV of the Table Data
foreach ($Row in $DataSet.Tables[0].Rows)
{
$connection.open();
#Specify the output location of your dump file
$command.CommandText = "SELECT * FROM [$($Row[0])]"
$command.Connection = $connection
(Get-Culture).NumberFormat.NumberDecimalSeparator = '.'
(Get-Culture).DateTimeFormat.ShortDatePattern = 'yyyy-MM-dd'
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
$extractFile = "$($extractDir)\$($Row[0]).csv"
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Encoding UTF8
}
I need to print each filename exported to csv ($extractFile) with a timestamp in a log file, and then if an error occurs, I need to print that too with a timestamp, and so on till the script finishes.
You can do this with Start-Transcript, try/catch/finally, or by writing your own code (storing the console output in a variable and appending it to a text file when required). Note the -Append parameter of Start-Transcript.
Without code, it's difficult to know which of these to recommend.
Expanded
Now that you've added some code, see some additional info on each method. I'm not familiar with SQL via PowerShell, so I'm not sure what kind of output/errors you will be getting (regarding errors, specifically whether they are terminating or non-terminating).
Transcript
Start-Transcript should go at the beginning, and Stop-Transcript at the end. This will log whatever is normally displayed on the console. Running Start-Transcript while a transcript is already being recorded will lead to a nasty error.
Start-Transcript -Path "c\temp\mylogfile.txt"
$server = "USER\SQLEXPRESS"
$database = "database_test"
$tablequery = "SELECT name from sys.tables"
...
...
$extractFile = "$($extractDir)\$($Row[0]).csv"
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Encoding UTF8
}
Stop-Transcript
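As noted above, Start-Transcript errors out if a transcript is already being recorded, so a small guard (not part of the snippet above) is to stop any leftover transcript first:
# stop a leftover transcript if one is running; ignore the error if there isn't one
try { Stop-Transcript | Out-Null } catch {}
Start-Transcript -Path "C:\temp\mylogfile.txt" -Append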
Terminating Errors
Add try/catch/finally as appropriate. You can be lazy and add this over the whole code, or do it properly and wrap the parts that could lead to terminating errors.
...
foreach ($Row in $DataSet.Tables[0].Rows)
{
try{
$connection.open();
#Specify the output location of your dump file
...
...
...
$extractFile = "$($extractDir)\$($Row[0]).csv"
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Encoding UTF8
}catch{
# what to do if there is a terminating error
}finally{
# what to do whether there is an error or not
if(Test-Path "$($extractDir)\$($Row[0]).csv"){
# simple check: if a file was created, no error... right?
"$(Get-Date -f 'yyyy-MM-dd hh:mm:ss') $extractFile" | Out-File "c:\temp\mylogfile.txt" -Append
}else{
"$(Get-Date -f 'yyyy-MM-dd hh:mm:ss') $($Error[0])" | Out-File "c:\temp\mylogfile.txt" -Append
}
}
}
...
No Terminating Errors
Just add a line to export the errors. Ensure you clear the automatic variable $Error on each loop iteration.
...
foreach ($Row in $DataSet.Tables[0].Rows)
{
$Error.Clear()
$connection.open();
#Specify the output location of your dump file
...
...
...
$extractFile = "$($extractDir)\$($Row[0]).csv"
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Encoding UTF8
# if there are no errors, write filename. Otherwise write errors
if([string]::IsNullOrEmpty($Error)){
"$(Get-Date -f 'yyyy-MM-dd hh:mm:ss') $extractFile" | Out-File "c:\temp\mylogfile.txt" -Append
}else{
"$(Get-Date -f 'yyyy-MM-dd hh:mm:ss') $Error" | Out-File "c:\temp\mylogfile.txt" -Append
}
}
...
You could use Start-Transcript for debugging purposes:
Start-Transcript -path "C:\temp\myTranscript.txt"
Add this at the start of your script to get all the console output written to C:\temp\myTranscript.txt.
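To make sure the transcript is always closed even when the script throws, you could wrap your existing code like this (sketch only):
Start-Transcript -Path "C:\temp\myTranscript.txt" -Append
try {
# ... existing SQL export loop from the question goes here ...
}
finally {
# runs whether the script succeeds or throws, so the transcript is always stopped
Stop-Transcript
}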

Format date inside table from database

I'm trying to create a script that queries a database, extracts data, and puts it into a .csv file. This is the relevant part of the code:
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
###########################
## PRESENTATION OF RESULTS ##
$table = New-Object System.Data.DataTable
$table.Load($result)
$table | Export-Csv -Path $fileName -Delimiter "|" -NoTypeInformation
I have 2 problems:
On Windows 2003, PowerShell's Export-Csv does not have the -Delimiter parameter.
If I delete that parameter, the output file has the wrong date format.
Running the same query in Management Studio gives username, ipaddress, yyyy-mm-dd hh:mm:ss, while printing $table gives username, ipaddress, "dd/mm/yyyy hh.mm.ss" (quotation marks included).
How can I solve this?
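One way to control the date format regardless of the machine's culture (a sketch; logindate is a guessed column name, substitute the real one from your query) is to format that column yourself before exporting:
# format the DateTime column explicitly so Export-Csv just writes a plain string
$table |
Select-Object username, ipaddress, @{ Name = 'logindate'; Expression = { $_.logindate.ToString('yyyy-MM-dd HH:mm:ss') } } |
Export-Csv -Path $fileName -NoTypeInformation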

Data type mismatch when querying a CSV with ACE OLEDB provider

I am attempting to query a CSV file using the Microsoft ACE OLEDB provider. When I add "PrctBusy > 60" to the WHERE clause I receive the error "Data type mismatch in criteria expression." I have searched Stack Overflow and Google for solutions, and I see this is not an uncommon issue. From my reading it looks to be a data type issue. The data in the PrctBusy column is all numeric; I think I need to force it to be treated as a number, but I have not found a solution.
Below is the code I am currently working with:
$ArrayNameUtil = "000198701258"
$CatNameUtil = "FE_DIR"
$sdLocalPath = "D:\Logs\SANData\Perf"
$InputCSV = "VMaxSANReportUtilFile.csv"
$csv = Join-Path $sdLocalPath $InputCSV
$firstRowColumnNames = "Yes"
$delimiter = ","
$provider = (New-Object System.Data.OleDb.OleDbEnumerator).GetElements() | Where-Object { $_.SOURCES_NAME -like "Microsoft.ACE.OLEDB.*" }
if ($provider -is [system.array]) { $provider = $provider[0].SOURCES_NAME } else { $provider = $provider.SOURCES_NAME }
$connstring = "Provider=$provider;Data Source=$(Split-Path $csv);Extended Properties='text;HDR=$firstRowColumnNames;';"
$tablename = (Split-Path $csv -leaf).Replace(".","#")
$conn = New-Object System.Data.OleDb.OleDbconnection
$conn.ConnectionString = $connstring
$conn.Open()
#
$sql = "SELECT TimeStamp, count(PrctBusy) AS Above60 FROM [$tablename] WHERE array = '$ArrayNameUtil' and Category like '$CatNameUtil' and PrctBusy > 60 Group by TimeStamp "
$cmd = New-Object System.Data.OleDB.OleDBCommand
$cmd.Connection = $conn
$cmd.CommandText = $sql
$dtp = New-Object System.Data.DataTable
$dtp.Load($cmd.ExecuteReader())
Because of the pointer from TessellatingHeckler to CodeProject and some follow-on queries, I was led to http://aspdotnetcodes.com/Importing_CSV_Database_Schema.ini.aspx. I found that a schema.ini file in the same directory as the CSV file could specify the data type.
The schema.ini file ended up in the following format:
[VMaxSANReportUtilFile.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=Array Text Width 20
Col2=TimeStamp Text Width 20
Col3=Category Text Width 20
Col4=Instance Text Width 20
Col5=PrctBusy Short
Col6=QueUtil Short
I went through several revisions to get the data types correct for the ACE OLE DB provider. If the columns are named, the names need to be in the schema.ini file.
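If you want the script to lay that file down itself, an untested sketch (reusing the $csv path from the code above) is to write the schema.ini next to the CSV before opening the connection:
# write a schema.ini beside the CSV so ACE treats PrctBusy/QueUtil as numeric
$schema = @"
[VMaxSANReportUtilFile.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=Array Text Width 20
Col2=TimeStamp Text Width 20
Col3=Category Text Width 20
Col4=Instance Text Width 20
Col5=PrctBusy Short
Col6=QueUtil Short
"@
Set-Content -Path (Join-Path (Split-Path $csv) 'schema.ini') -Value $schema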

How to display & export postgresql output in powershell

How can I display the output of a PostgreSQL query in PowerShell? Here is an example:
$query = "SELECT * FROM test where first_name='test'"
function Get-ODBC-Data{
param([string]$query=$(throw 'query is required.'))
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString = "Driver={PostgreSQL Unicode(x64)};Server=localhost;Port=5432;Database=Test;Uid=test;Pwd=test;"
$conn.open()
$cmd = New-object System.Data.Odbc.OdbcCommand($query,$conn)
$ds = New-Object system.Data.DataSet
(New-Object system.Data.odbc.odbcDataAdapter($cmd)).fill($ds) | out-null
$conn.close()
$ds.Tables[0]
}
function Set-ODBC-Data{
param([string]$query=$(throw 'query is required.'))
$conn = New-Object System.Data.Odbc.OdbcConnection
$conn.ConnectionString= "Driver={PostgreSQL Unicode(x64)};Server=localhost;Port=5432;Database=test;Uid=test;Pwd=test;"
$cmd = new-object System.Data.Odbc.OdbcCommand($query,$conn)
$conn.open()
$cmd.ExecuteNonQuery()
$conn.close()
}
$result = Get-ODBC-Data -query $query
$db = set-odbc-data -query $query
How can I display or fetch the values present in the output in a format like the one shown in the screenshot?
How can I export the output to a CSV in a proper format?
I didn't test it, but try using the commands below on the output of the PowerShell script:
C:\Get-Query | Out-GridView
C:\Get-Query | Export-CSV Info.csv
C:\Get-Query | Format-Table -autosize
assuming the script is in a file named Get-Query.
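Using the Get-ODBC-Data function from the question directly, an untested sketch of the display and export steps (the C:\temp path is just a placeholder) would be:
$result = Get-ODBC-Data -query $query
$result | Format-Table -AutoSize
# DataRow objects carry extra properties (RowError, RowState, ...); exclude them for a clean CSV
$result |
Select-Object * -ExcludeProperty RowError, RowState, Table, ItemArray, HasErrors |
Export-Csv -Path 'C:\temp\Info.csv' -NoTypeInformation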

Extra rows in a XLSX export via SQL Pull

Sorry if I am asking something very basic, but I just started PowerShell last night and I'm still getting a feel for the language.
I'm using an export to XLSX script provided by the link below:
https://gallery.technet.microsoft.com/office/Export-XLSX-PowerShell-f2f0c035
I'm also using a simple SQL server pull (SQL_Connection_Script.ps1):
$dataSource = "####"
$user = "####"
$pwd = "####"
$database = "####"
$connectionString = "Server=$dataSource;uid=$user; pwd=$pwd;Database=$database;Integrated Security=False;"
$query = "Select * from name where id = '1000'"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()
$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()
$table = New-Object "System.Data.DataTable"
$table.Load($result)
$table
$connection.Close()
My issue is that when I export this object I get the extra columns RowError, RowState, Table, ItemArray, and HasErrors.
Is there any way to remove these columns, either with a command I am just unaware of, or should I insert a dynamic Select statement, as in the example below?
I am hoping to not have to use a dynamic Select statement if possible.
.\Desktop\SQL_Connection_Script.ps1 | Select $DynamicHeadersHere | Export-XLSX -path .\Desktop\testing123.xlsx -Append
So it looks like my SQL Server Pull function is the thing that is pulling in the extra fields. Any ideas?
After looking around for a while, I found that I can exclude properties:
.\Desktop\SQL_Connection_Script.ps1 | Select * -ExcludeProperty RowError, RowState, HasErrors, Table, ItemArray | Export-XLSX -path .\Desktop\testing123.xlsx -Append
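An alternative I considered (a sketch, assuming the script outputs the DataTable rows as above and returns at least one row) is to build the property list from the DataTable's own columns so nothing has to be hard-coded:
$rows = .\Desktop\SQL_Connection_Script.ps1
# each DataRow knows its parent table, so the real column names can be read from there
$columns = $rows[0].Table.Columns | ForEach-Object { $_.ColumnName }
$rows | Select-Object $columns | Export-XLSX -path .\Desktop\testing123.xlsx -Append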