Function to check DB on a remote server - timeout question - PowerShell

I am writing a function in PowerShell that will do the following when given some parameters:
Get credentials to connect to the remote SQL Server.
Open a SQL connection with the given parameters. If opening the connection takes longer than the value in $sqlConnectionTimeout, it will write a log file with a timeout error.
Run a query against a specific DB on the remote server. If executing the command takes longer than the value in $sqlCommandTimeout, it will write a log file with the error.
The function will create an HTML file with the connectivity result (True or False). Before reporting True, the script checks for log reports written within the last 5 minutes; connectivity is True only if no logs are found. If False, a log is created and saved in a specific folder.
About the timeout variables: no matter what I put in them, the connection is opened and the command is executed. Even if I set 0, expecting to force a timeout error, the connection is opened and the query executed.
My question is: what am I missing here so that the timeouts take effect when executing the function?
Thanks for your help
This is the Code:
# Editing Variables
$user = "userName"
$DatabaseName = "SampleDB"
$query = "INSERT INTO [SampleDB].[dbo].[HealthChecks]([DateTime],[Source],[User]) VALUES (CURRENT_TIMESTAMP, ##SERVERNAME, CURRENT_USER)"
#Set timeout when establishing connection (Default = 15)
$sqlConnectionTimeout = 3
#Set timeout when executing sql command (Default = 30)
$sqlCommandTimeout = 3
# 1. Retrieve the data
function Test-SqlConnection
{
param(
[Parameter(Mandatory)]
[string]$serverURL,
[Parameter(Mandatory)]
[string]$DatabaseName,
[Parameter(Mandatory)]
[string]$user,
[Parameter(Mandatory)]
[string]$query,
[Parameter(Mandatory=$False)]
[string]$sqlConnectionTimeout,
[Parameter(Mandatory=$False)]
[string]$sqlCommandTimeout
)
try
{
$logTime = (Get-Date -Format "hh-mm_dd-MM-yyyy")
$reportPath = "C:\Logs\DBLogs\"
$DBCheckPath = "C:\inetpub\wwwroot\UptimeRobot\"
# Obtain Credentials without revealing password in script
$userName = $user
$passwordFile = "C:\Utilities\keys\password.txt"
$keyFile = "C:\Utilities\keys\aes.key"
$key = Get-Content $keyFile
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $userName, (Get-Content $passwordFile | ConvertTo-SecureString -Key $key)
# Open Connection to DB
$userName = $Credential.UserName
$password = $Credential.GetNetworkCredential().Password
$connectionString = "Data Source={0};database={1};User ID={2};Password={3};Connection Timeout={4}" -f $serverURL,$DatabaseName,$userName,$password,$sqlConnectionTimeout
$sqlConnection = New-Object System.Data.SqlClient.SqlConnection $ConnectionString
$sqlConnection.Open()
# 3. Prepare the UpdateRequest
$sqlCommand = New-Object System.Data.SQLClient.SQLCommand
$sqlCommand.CommandTimeout = $sqlCommandTimeout
$sqlCommand.Connection = $sqlConnection
$sqlCommand.CommandText = $query
$res = $sqlCommand.ExecuteNonQuery()
# 4. Send reports to log file in JSON format
$sqlLogs = Get-Childitem $reportPath -filter *.json | Where-Object {$_.LastWriteTime -gt [datetime]::Now.AddMinutes(-5)}
if($res -eq 1)
{
if($sqlLogs.count -gt 0)
{
$connectivityResult = @{
DBServer = $ServerFriendlyName
DataCenter = $dataCenter
Connectivity = $False
}
}else
{
$connectivityResult = @{
DBServer = $ServerFriendlyName
DataCenter = $dataCenter
Connectivity = $True
}
}
$connectivityResult | ConvertTo-Json -Compress | Out-File -Encoding utf8 -LiteralPath ($DBCheckPath + "DBCheck.html")
}
##TODO: Only report Connectivity Serviceable= $false when there are two logs with $false value within 5 minutes.
}catch
{
$errorMessage = $_.Exception
## Only return $false if the exception was thrown because it can't connect for some reason. Otherwise
## throw the general exception
if ($errorMessage -match 'The server was not found' -or $errorMessage -match 'timeout')
{
$connectivityResult = @{
DBServer = $ServerFriendlyName
DataCenter = $dataCenter
Connectivity = $False
ErrorMessage = $errorMessage.GetBaseException().message
}
$connectivityResult | ConvertTo-Json | Out-File -Encoding utf8 -LiteralPath ($reportPath + $logTime + ".json")
$connectivityResult | ConvertTo-Json | Out-File -Encoding utf8 -LiteralPath ($DBCheckPath + "DBCheck.html")
}
}
finally
{
$sqlConnection.Close()
}
}
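For what it's worth, in System.Data.SqlClient a value of 0 for either timeout means "wait indefinitely", so setting 0 can never force a timeout error; the error is only raised when the connect or execute genuinely exceeds the limit, which a fast connection and a simple INSERT normally never do. Below is a minimal sketch (not the function above; the server name and database are placeholders) that provokes a command timeout on purpose with WAITFOR DELAY:
# Minimal sketch: Connection Timeout and CommandTimeout only surface when the
# connect/execute actually take longer than the limit; 0 disables the timeout entirely.
$connectionString = "Data Source=MyServer;Database=SampleDB;Integrated Security=True;Connection Timeout=3"   # placeholder server/database
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()                                    # only times out if the server takes longer than 3 seconds to respond
$command = $connection.CreateCommand()
$command.CommandTimeout = 3                           # seconds; 0 would mean no command timeout at all
$command.CommandText = "WAITFOR DELAY '00:00:10'"     # deliberately exceeds the 3-second limit
try {
    $command.ExecuteNonQuery()
}
catch {
    Write-Warning ("Command timed out as expected: " + $_.Exception.GetBaseException().Message)
}
finally {
    $connection.Close()
}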

Related

Comparing Server IDs in two tools via PowerShell script

I am making a PowerShell script that is supposed to retrieve and compare the server IDs in two tools that we are using, Octopus and MOSS (the idea is to check that all servers in Octopus are also registered in MOSS). Octopus is accessed from PowerShell via a MySQL query, and MOSS is accessed via an API call. Currently I am able to retrieve the SQL query results successfully and format them as JSON so they are readable by MOSS. However, I do not know how to check whether the server IDs are also present in MOSS. All the script does is retrieve the IDs from the Octopus SQL database, parse them to JSON, and make an empty call to MOSS. I would highly appreciate it if anyone knows how to make MOSS calls from PowerShell.
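For the comparison step on its own (independent of how the MOSS call is made), here is a minimal sketch of checking which Octopus server IDs are missing from a list of MOSS IDs; the two arrays below are illustrative placeholders:
# Illustrative placeholder data for the two ID sets
$octopusServerIds = @('id-001', 'id-002', 'id-003')
$mossServerIds    = @('id-001', 'id-003')
# IDs present in Octopus but not registered in MOSS
$missingInMoss = $octopusServerIds | Where-Object { $mossServerIds -notcontains $_ }
# Compare-Object shows differences in both directions, if that is preferred
Compare-Object -ReferenceObject $octopusServerIds -DifferenceObject $mossServerIds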
The current script is:
# Extract RDS and servers from Octopus
# Define log file path
$date = $(get-date).tostring()
$currentdate = get-date -format yyyy-MM-dd
$log_file_path = "C:\Program Files\test\logs\"+$currentdate+"_extract_rds_and_servers_from_octopus.log"
$errorlog_file_path = "C:\Program Files\test\logs\errors\errors.log"
# 0. Exclude Admin Users before getting the RDS licenses which need to be reported
#& ((Split-Path $MyInvocation.InvocationName) + "\exclude_users.ps1") -log_file_path $log_file_path -errorlog_file_path $errorlog_file_path
# 1. Extract ObjectID from Octopus API for current month for each RDP user
try {
$month = (Get-Date -UFormat "%Y%m")
$UrlHost = "https://octopus.mos-windows.eu01.stackit.cloud/api/workspace/55cd5c70-d188-4ac3-b946-f1afec8764ad/report/licensing/spla-usage-reseller?&payload[month_id]=$month&payload[with_itemized]=1&_format=json&_token=j8FE4wZDmBITewHUc7lyYeX9XVVjt3dqz0ID4S6A9KQjkMeKfO7_EcgV7Qshuuw1&_tenant=TlXcM&_language=en&payload[flat_structure]=1"
$HostResponse = Invoke-RestMethod -Uri $UrlHost -Method Get
$users = $HostResponse.itemized
$objectids = @()
foreach ($user in $users.PSObject.Properties) {
if ($user.Value.readable_label -eq "Windows Server Remote Desktop Services")
{
$objectid = $user.Value.object_id
$objectids += $objectid
}
}
}
catch {
$date+" - Error in Octopus API Call: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 2. Get access device ids from Octopus Database
Import-Module -Name "C:\Program Files\test\request_database.ps1" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
$get_users_devices_query =
#"
select user_id, access_device_ids from oc_reporter.ws_installed_software where user_id is not null;
"#
$ui_ad = execute_db_query $get_users_devices_query
$access_device_ids = $users_devices.access_device_ids
$user_ids = $users_devices.user_id
# 3. Get all openstack server id from Octopus Database 
$get_access_device_server_ids_query =
#"
select id, lower(SUBSTRING_INDEX(SUBSTRING_INDEX(ref_id, "-", -6), "-", 5)) as server_id, ref_id, label from ws_device
where type_id = "vm" and operating_system like "%Windows%" and created > 1644820951;
"#
$ad_si = execute_db_query $get_access_device_server_ids_query
# 4. Map the users/objectids with access device ids and server ids
# Create array with UserID filtered by ObjectID and map each AccessDeviceID(s) to corresponding ServerID
try {
$filteredsi = @()
foreach ($userid in $ui_ad)
{
if ($objectids -contains $userid.user_id)
{
$filteredad = $userid
foreach ($id in $ad_si)
{
if ($filteredad.access_device_ids.split(',') -contains $id.id)
{
$filteredsi += [PSCustomObject]@{"userid" = $filteredad.user_id; "serverid" = $id.server_id}
}
}
}
}
}
catch {
$date+" - Error in Mapping userIDs/objectIDs/accessdeviceIDs/serverIDs: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# Preparation for MOSS
# Create JSON contentblock with looped $filteredsi array
try {
$myArray = $filteredsi
$uniqueUsers = [System.Collections.ArrayList]::new()
for($i = 0; $i -lt $myArray.Count; $i++){
if(!$uniqueUsers.Contains($myArray[$i].userid)){
$uniqueUsers.Add($myArray[$i].userid)
}
}
$allMappings = [System.Collections.ArrayList]::new()
for($i = 0; $i -lt $uniqueUsers.Count; $i++){
$singleMapping = [PSCustomObject]@{id = $uniqueUsers[$i]; servers = $null}
$serverids = [System.Collections.ArrayList]::new()
for($j = 0; $j -lt $myArray.Count; $j++){
if($myArray[$j].userid -eq $uniqueUsers[$i]){
$serverids.Add($myArray[$j].serverid)
}
}
$singleMapping.servers = $serverids
$allMappings.Add($singleMapping)
}
$mosscontent = $allMappings | ConvertTo-Json
$mosscontent
}
catch {
$date+" - Error in creating content block for MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# Create complete array including contentblock for MOSS API call
try {
$moss = #"
{
"type": "server.usage",
"data": {
"users": $mosscontent
}
}
"#
}
catch {
$date+" - Error in creating array for POST to MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 5. Call MOSS prod, dev and qa with the whole list of servers and users
# Authenticating to MOSS
$query_file_path_dev_pw = "~\Documents\MOSSDevEncryptedPassword_"
$query_file_path_qa_pw = "~\Documents\MOSSQaEncryptedPassword_"
$query_file_path_prod_pw = "~\Documents\MOSSProdEncryptedPassword_"
# Function to store credentials
function get_encrypted_content {
param (
[String] $file_path,
[String] $password
)
# Check if credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($password) {
dev {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-dev-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
qa {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-qa-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
prod {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-prod-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$clientid_dev = "mos-windows-us-dev"
$clientid_qa = "mos-windows-us-qa"
$clientid_prod = "mos-windows-us-prod"
$dev_pass = get_encrypted_content $query_file_path_dev_pw "dev"
$qa_pass = get_encrypted_content $query_file_path_qa_pw "qa"
$prod_pass = get_encrypted_content $query_file_path_prod_pw "prod"
[System.Security.SecureString]$clientsecret_dev = ConvertTo-SecureString -String $dev_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_qa = ConvertTo-SecureString -String $qa_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_prod = ConvertTo-SecureString -String $prod_pass -AsPlainText -Force
#Prepare static variables
$MOSSToken_dev = 'https://auth.00.idp.eu01.stackit.cloud/oauth/token'
$MOSSToken_qa = 'https://auth.01.idp.eu01.stackit.cloud/oauth/token'
$MOSSToken_prod = 'https://auth.01.idp.eu01.stackit.cloud/oauth/token'
$MOSSUrl_dev = "https://stackit-service-mos-dev.apps.01.cf.eu01.stackit.cloud/v1/events"
$MOSSUrl_qa = "https://stackit-service-mos-qa.apps.01.cf.eu01.stackit.cloud/v1/events"
$MOSSUrl_prod = "https://stackit-service-mos.apps.01.cf.eu01.stackit.cloud/v1/events"
$body = @{grant_type='client_credentials'}
#Set function to get all customerinfo from all portals
function call_moss {
param (
[String] $clientid,
[SecureString] $clientsecret,
[String] $MOSSToken,
[String] $MOSSUrl
)
$cred = New-Object -typename System.Management.Automation.PSCredential -ArgumentList $clientid, $clientsecret
#Get Token from MOSS
$Response = Invoke-RestMethod -Uri $MOSSToken -Method Post -Credential $cred -Body $body -ContentType "application/x-www-form-urlencoded"
$Token = $Response.access_token
$Tokenfinal = "Bearer " + $Token
#Post Content to MOSS
Invoke-RestMethod -Uri $MOSSUrl -Method Post -Headers @{'Authorization' = $Tokenfinal } -Body $moss -ContentType "application/json"
}
#Call function to Call MOSS
try
{
Write-Host "Call to MOSS Dev.."
call_moss $clientid_dev $clientsecret_dev $MOSSToken_dev $MOSSUrl_dev
Write-Host "Call to MOSS QA.."
call_moss $clientid_qa $clientsecret_qa $MOSSToken_qa $MOSSUrl_qa
Write-Host "Call to MOSS Prod"
call_moss $clientid_prod $clientsecret_prod $MOSSToken_prod $MOSSUrl_prod
}
catch
{
$date+" - Error in calling MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 6. Create daily logs with users reported to MOSS
$date = $(get-date).tostring()
$log = foreach ($item in $filteredsi)
{
$item
}
echo $date $log | Out-file -Append $log_file_path
# delete logs older than 60 days
$limit = (Get-Date).AddDays(-60)
$path = "C:\Program Files\test\logs"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
# Create activity file to check if script is working
$temp_file_path = "C:\Program Files\test\tempfile*"
if (Test-Path $temp_file_path)
{
Remove-Item $temp_file_path
}
[string]$filePath = "C:\Program Files\test\tempfile";
[string]$directory = [System.IO.Path]::GetDirectoryName($filePath);
[string]$strippedFileName = [System.IO.Path]::GetFileNameWithoutExtension($filePath);
[string]$extension = [System.IO.Path]::GetExtension($filePath);
[string]$newFileName = $strippedFileName + "_" + (Get-Date).ToString('MM-dd-yyyy') + $extension;
[string]$newFilePath = [System.IO.Path]::Combine($directory, $newFileName);
New-Item $newFilePath
Additional scripts that are being called:
-> request_database.ps1
# 1. Get path for log files
param(
[string]$log_file_path = "C:\Program Files\test\logs\request_database.log",
[string]$errorlog_file_path = "C:\Program Files\test\logs\request_database_errors.log"
)
# 2. Get credentials to access Octopus DB
try {
# Define username and password security string files
$mysql_user_file_path = "~\Documents\ue_"
$mysql_pass_file_path = "~\Documents\pe_"
# Functions
function get_encrypted_content {
param (
[String] $file_path,
[String] $user_or_pass
)
# Check if credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($user_or_pass) {
msqu {
# Get credentials
Read-Host -Prompt "Please, enter MySQL username" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
msqp {
# Get credentials
Read-Host -Prompt "Please, enter MySQL password" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$my_sql_user = get_encrypted_content $mysql_user_file_path "msqu"
$my_sql_pass = get_encrypted_content $mysql_pass_file_path "msqp"
[System.Security.SecureString]$SecPwd = ConvertTo-SecureString -String $my_sql_pass -AsPlainText -Force
$Credential = new-object -typename System.Management.Automation.PSCredential -argumentlist @($my_sql_user,$SecPwd)
}
catch {
$(get-date).tostring() +" - Error in fetching SQL user/pwd: "+$_ | Out-File -Append $errorlog_file_path
exit
}
$error_string = ""
# 3. Execute DB query
# If you want the script to continue on error you have to provide $true as second parameter
# Example: execute_db_query $exclude_stackitadmin_users_query $true
function execute_db_query {
param (
[String] $query,
[bool] $continueOnError = $false
)
try {
# Query Octopus DB
Connect-MySqlServer -Credential $Credential -Server localhost
if ($continueOnError) {
$query_results = Invoke-MySqlQuery -Query $query
} else {
$query_results = Invoke-MySqlQuery -Query $query -ErrorAction Stop
}
Disconnect-MySqlServer
return $query_results
}
catch {
$error_message = $(get-date).tostring() + " - Error in DB Query 1: " + $_
$error_message | Out-File -Append $errorlog_file_path
Write-Error $error_message
exit
}
}
-> exclude_users.ps1 (probably not needed for the task but the overall script doesn't work without it)
# 1. Get path for log files or use default
param(
[string]$log_file_path = "C:\Program Files\test\logs\exclude_users.log",
[string]$errorlog_file_path = "C:\Program Files\test\logs\errors\exclude_users_errors.log"
)
Import-Module -Name "C:\Program Files\test\request_database.ps1" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
# Exclude StackitAdmin users
$exclude_stackitadmin_users_query =
#"
update oc_reporter.ws_user as updated_user,
(
select id from oc_reporter.ws_user
where username = "StackITAdmin" and exclude_spla = "no"
) as us
set
exclude_spla = "yes",
exclude_spla_reason = "nh"
where updated_user.id = us.id;
"#
execute_db_query($exclude_stackitadmin_users_query)
# Exclude users on dev & qa environment
$exclude_users_on_dev_qa_query =
#"
update oc_reporter.ws_installed_software as updated_software,
(
select ws.access_device_ids, min(ws.access_device_labels), min(ws.user_label), excluded_users, min(ws.id) as id from oc_reporter.ws_user
join oc_reporter.ws_installed_software as ws
on ws.user_id = ws_user.id
join oc_reporter.ws_customer as wc
on wc.id = ws_user.customer_id
left join
(select access_device_ids, count(1) as excluded_users from oc_reporter.ws_user as wu
join oc_reporter.ws_customer as wc
on wc.id = wu.customer_id
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
where
(internal_id like "d-%" or internal_id like "q-%") and
locate(',', access_device_ids) = 0 and
ws.exclude_spla = "yes" and
ws.label = "Microsoft Remote Desktop Services" and
wu.username != "StackitAdmin"
group by access_device_ids, wu.exclude_spla) as servers
on servers.access_device_ids = ws.access_device_ids
where
ws.exclude_spla = "no" and
ws.label = "Microsoft Remote Desktop Services" and
(internal_id like "d-%" or internal_id like "q-%") and
locate(',', ws.access_device_ids) = 0
group by ws.access_device_ids
having (excluded_users = 1 or excluded_users is null)
) as us
set
exclude_spla = "yes",
exclude_spla_reason = "admin"
where updated_software.id = us.id;
"#
# run twice to exclude 2 users per vm
execute_db_query($exclude_users_on_dev_qa_query)
execute_db_query($exclude_users_on_dev_qa_query)
# Exclude users from our mos-windows-2 project
$exclude_users_from_our_projects =
#"
update oc_reporter.ws_installed_software as ins,
(
select ws.access_device_ids, min(ws.access_device_labels), min(ws.user_label), excluded_users, min(ws.id) as id, min(wu.id) from oc_reporter.ws_user as wu
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
join oc_reporter.ws_device as wd
on wd.id = ws.access_device_ids
left join (
select ws.access_device_ids, min(ws.id), min(wu.id), count(1) as excluded_users from oc_reporter.ws_user as wu
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
join oc_reporter.ws_device as wd
on wd.id = ws.access_device_ids
where
ws.exclude_spla = "yes" and
ws.label = "Microsoft Remote Desktop Services" and
LOCATE(',',access_device_ids) = 0 and
(
hkey like "%d57abb0200304506879bd8037f7a49cb%" or
hkey like "%fce60e2c938c49e4a37687492a45b652%" or
hkey like "%8eb91d45f25b45978b71abb0e06a0443%" or
hkey like "%66ad75e4ff624f7e940dc363549c8404%" or
hkey like "%351aa84fb9b54be896112b36ae15dd48%" or
hkey like "%64edd6c19e17417d86094e6a02610eed%"
) and
wu.username != "StackitAdmin"
group by ws.access_device_ids
) as excluded_ws
on
excluded_ws.access_device_ids = ws.access_device_ids
where
ws.exclude_spla = "no" and
ws.label = "Microsoft Remote Desktop Services" and
LOCATE(',',ws.access_device_ids) = 0 and
(
hkey like "%d57abb0200304506879bd8037f7a49cb%" or
hkey like "%fce60e2c938c49e4a37687492a45b652%" or
hkey like "%8eb91d45f25b45978b71abb0e06a0443%" or
hkey like "%66ad75e4ff624f7e940dc363549c8404%" or
hkey like "%351aa84fb9b54be896112b36ae15dd48%" or
hkey like "%64edd6c19e17417d86094e6a02610eed%"
) and
wu.domain not like "%HOP01%" and
wu.domain not like "%WSUS01%" and
wu.domain not like "%OCKMS%" and
wu.domain not like "%AZDVOP%"
group by ws.access_device_ids
having (excluded_users = 1 or excluded_users is null)
) as rds
set
exclude_spla = "yes",
exclude_spla_reason = "admin"
where ins.id = rds.id;
"#
# run twice to exclude 2 users per vm
execute_db_query($exclude_users_from_our_projects)
execute_db_query($exclude_users_from_our_projects)
The order was wrong, and there were lots of unnecessary elements causing errors and crashes.
Current working code is:
# Compare the data between the MOSS and the Octopus
# Define log file path
$date = $(get-date).tostring()
$currentdate = get-date -format yyyy-MM-dd
$log_file_path = "[FILE PATH]"
$errorlog_file_path = "[FILE PATH]"
# 1. Call MOSS (dev, qa, prod) to get the data for all servers created in the last 48 hours
# Authenticating to MOSS
$query_file_path_dev_pw = "[FILE PATH]"
$query_file_path_qa_pw = "[FILE PATH]"
$query_file_path_prod_pw = "[FILE PATH]"
# Function to store credentials
function get_encrypted_content {
param (
[String] $file_path,
[String] $password
)
# Check if credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($password) {
dev {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-dev-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
qa {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-qa-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
prod {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-prod-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$clientid_dev = "[USERNAME]"
$clientid_qa = "[USERNAME]"
$clientid_prod = "[USERNAME]"
$dev_pass = get_encrypted_content $query_file_path_dev_pw "dev"
$qa_pass = get_encrypted_content $query_file_path_qa_pw "qa"
$prod_pass = get_encrypted_content $query_file_path_prod_pw "prod"
[System.Security.SecureString]$clientsecret_dev = ConvertTo-SecureString -String $dev_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_qa = ConvertTo-SecureString -String $qa_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_prod = ConvertTo-SecureString -String $prod_pass -AsPlainText -Force
# Time variable
$date48h = ("{0:yyyy-MM-ddThh:mm:ss}" -f ((get-date).Addhours(-48))).split("T").split(":")
$date = $date48h[0]
$hour = $date48h[1]
$min = $date48h[2]
$sec = $date48h[3]
#Prepare static variables
$MOSSToken_dev = '[URL]'
$MOSSToken_qa = '[URL]'
$MOSSToken_prod = '[URL]'
$MOSSUrl_dev = "[URL]"
$MOSSUrl_qa = "[URL]"
$MOSSUrl_prod = "[URL]"
$body = @{grant_type='client_credentials'}
#Set function to get all customerinfo from all portals
function call_moss {
param (
[String] $clientid,
[SecureString] $clientsecret,
[String] $MOSSToken,
[String] $MOSSUrl
)
$cred = New-Object -typename System.Management.Automation.PSCredential -ArgumentList $clientid, $clientsecret
#Get Token from MOSS
$Response = Invoke-RestMethod -Uri $MOSSToken -Method Post -Credential $cred -Body $body -ContentType "application/x-www-form-urlencoded"
$Token = $Response.access_token
$Tokenfinal = "Bearer " + $Token
#Post Content to MOSS
Invoke-RestMethod -Uri $MOSSUrl -Method Get -Headers @{'Authorization' = $Tokenfinal } -ContentType "application/json"
}
#Call function to Call MOSS
try
{
Write-Host "Call to MOSS Dev.."
$get_moss_dev = call_moss $clientid_dev $clientsecret_dev $MOSSToken_dev $MOSSUrl_dev
Write-Host "Call to MOSS QA.."
$get_moss_qa = call_moss $clientid_qa $clientsecret_qa $MOSSToken_qa $MOSSUrl_qa
Write-Host "Call to MOSS Prod"
$get_moss_prod = call_moss $clientid_prod $clientsecret_prod $MOSSToken_prod $MOSSUrl_prod
}
catch
{
$date+" - Error in calling MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
$moss_dev_serverids = $get_moss_dev.items.id
$moss_qa_serverids = $get_moss_qa.items.id
$moss_prod_serverids = $get_moss_prod.items.id
$moss_serverid_arr = @($moss_dev_serverids, $moss_qa_serverids, $moss_prod_serverids)
# 2. Call Octopus to get the data for new servers created in the last 36 hours
Import-Module -Name "[FILE PATH]" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
# Calculate timestamp
$DateTime = Get-Date #or any other command to get DateTime object
$CurrentUnixTime = ([DateTimeOffset]$DateTime).ToUnixTimeSeconds()
$queryTime = $CurrentUnixTime - (36 * 3600)
$get_new_servers_query_oc =
#"
select id, lower(SUBSTRING_INDEX(SUBSTRING_INDEX(ref_id, "-", -6), "-", 5)) as server_id, ref_id, label from oc_reporter.ws_device where type_id = "vm" and operating_system like "%Windows%" and created > $queryTime;
"#
$query = execute_db_query $get_new_servers_query_oc
$serverid_oc = $query.server_id
$serverid_oc_arr = @($serverid_oc)
# 3. Compare the properties in MOSS and Octopus
$unmatching_serverids = $serverid_oc_arr | Where {$moss_serverid_arr -NotContains $_}
$error_report = @($unmatching_serverids)
# Create daily logs with servers in Octopus that are unregistered in MOSS
$date = $(get-date).tostring()
$log = $error_report
echo $date $log | Out-file -Append $log_file_path
# delete logs older than 60 days
$limit = (Get-Date).AddDays(-60)
$path = "[FILE PATH]"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
# 4. Generate summary with all errors and send a notification if there is an error. Schedule a task to check once per day.
If ($error_report.Count -eq 0) {
exit
}
else {
$JSONBody = [PSCustomObject][Ordered] @{
"type" = "MessageCard"
"title" = "Octopus Alerts"
"text" = "Servers located in Octopus, that are not registered in MOSS. <br>
Please check logs."
}
$TeamsMessageBody = ConvertTo-Json $JSONBody
$parameters = @{
"URI" = '[URL]'
"Method" = 'POST'
"Body" = $TeamsMessageBody
"ContentType" = 'application/json'
}
Invoke-RestMethod @parameters
}
# Create activity file to check if script is working
$temp_file_path = "[FILE PATH]"
if (Test-Path $temp_file_path)
{
Remove-Item $temp_file_path
}
[string]$filePath = "[FILE PATH]";
[string]$directory = [System.IO.Path]::GetDirectoryName($filePath);
[string]$strippedFileName = [System.IO.Path]::GetFileNameWithoutExtension($filePath);
[string]$extension = [System.IO.Path]::GetExtension($filePath);
[string]$newFileName = $strippedFileName + "_" + (Get-Date).ToString('MM-dd-yyyy') + $extension;
[string]$newFilePath = [System.IO.Path]::Combine($directory, $newFileName);
New-Item $newFilePath
Also, as already mentioned, the exclude_users script was not needed at all. The only additionally included script is the request_database script.

Parallel run for this particular PowerShell script

I am in the process of rewriting the script below so that it can run in parallel. As can be seen in the code, an array of servers is passed to the script, which loads it into a hash table and then loops through one server at a time to do the deployment; for each server there are files to execute in a particular order (see the array of files). Looking at the structure, I feel a workflow is the way to go here, but I could be wrong.
In my opinion, the performance gains would come from having the code run against multiple servers at the same time (foreach -parallel) rather than waiting for each server to complete before moving on to the next one.
I ran a test calling a function declared outside a workflow, and it worked. Is it good practice to call a function declared outside a workflow? I ask because I would like to reuse some functions outside the workflow. Or is it generally better to put all the code inside the workflow, even the parts that are not intended for parallel workloads, i.e. one-off calls?
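For reference, a minimal sketch of the foreach -parallel construct in isolation (assuming Windows PowerShell 5.1, where workflows are available; the server names and the per-server work are placeholders):
Workflow Invoke-ParallelDemo
{
    param([string[]]$Servers)
    foreach -parallel ($server in $Servers)
    {
        # Each iteration runs as its own workflow activity
        "Starting work on $server"
        InlineScript {
            # Ordinary PowerShell runs inside InlineScript; $using: pulls in workflow variables
            Start-Sleep -Seconds 2
            "Finished $using:server"
        }
    }
}
Invoke-ParallelDemo -Servers 'Server1','Server2','Server3'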
The below is the code I am testing with.
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = 'Ping SQL instance for $sql_server'
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"Server1",3,1),(4,"Server2",6,2),(3,"Server3",4,3)
$k = 'serverid','servername','locationid','appid' # key names correspond to data positions in each array in $x
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "F:\Files\"
$database_name = "Test"
foreach ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
$message = 'ServerID {0} has a servername of {1} and a location id of {2}' -f $server_id, $h["servername"][$all_server_ids.indexof($server_id)],$h["locationid"][$all_server_ids.indexof($server_id)]
Write-Output $message
Write-Output "This $severid and this $servername and this $locationid"
foreach ($file in $file_list)
{
$is_instance_ok = Check-Instance-Connection $servername $database_name
if ($is_instance_ok.check_outcome -eq $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue -Errorvariable generated_error | Out-Null
}
}
}
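One thing to note about the snippet above: $all_server_ids is never assigned before the outer foreach (the workflow version further down takes it from the hashtable). Assuming the sample data above, the For loop leaves $h as a hashtable of parallel column arrays, and the missing assignment would presumably be:
# After the For loop, $h holds one array per column, e.g.:
#   $h['serverid']   -> '1','4','3'
#   $h['servername'] -> 'Server1','Server2','Server3'
#   $h['locationid'] -> '3','6','4'
#   $h['appid']      -> '1','2','3'
# Presumably intended before the foreach, as in the workflow version below:
$all_server_ids = $h['serverid']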
Thanks, I did a lot more research and looked at a lot of examples on how workflows work. This is what I have come up with.
Workflow RunExecution
{
Function Check-Instance-Connection{
param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$sql_server,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$db_name
)
try
{
#Return extra useful info by using custom objects
$check_outcome = "" | Select-Object -Property log_date, stage, status, error_message
$check_outcome.log_date = (Get-Date)
$check_outcome.stage = 'Ping SQL instance for $sql_server'
#test connection for a sql instance
$connectionstring = "Data Source=$sql_server;Integrated Security =true;Initial Catalog=$db_name;Connect Timeout=5;"
$sqllconnection = New-Object System.Data.SqlClient.SqlConnection $connectionstring
$sqllconnection.Open();
$check_outcome.status = $true
$check_outcome.error_message = ''
return $check_outcome
}
Catch
{
$check_outcome.status = $false
$check_outcome.error_message = $_.Exception.Message
return $check_outcome
}
finally{
$sqllconnection.Close();
}
}
$file_list = @("deployment_1.sql","deployment_2.sql","deployment_3.sql","deployment_4.sql","deployment_5.sql")
$x = (1,"server1\DEV3",3,1),(4,"serer1\DEV2",6,2),(3,"serer2\DEV1",4,3)
$k = 'serverid','servername','locationid','appid'
$h = @{}
For($i=0;$i -lt $x[0].length; $i++){
$x |
ForEach-Object{
[array]$h.($k[$i]) += [string]$_[$i]
}
}
$folder = "C:\Temp\"
$database_name = "Test"
$all_server_ids = $h['serverid']
foreach -parallel ($server_id in $all_server_ids)
{
$severid = $h["serverid"][$all_server_ids.indexof($server_id)]
$servername = $h["servername"][$all_server_ids.indexof($server_id)]
$locationid = $h["locationid"][$all_server_ids.indexof($server_id)]
foreach ($file in $file_list)
{
# $check_fine = $is_instance_ok.check_outcome
# if ($check_fine = $true){
invoke-sqlcmd -ServerInstance "$servername" -inputfile $folder$file -Database "$database_name" -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue
write-output "invoke-sqlcmd -ServerInstance $servername -inputfile $folder$file -Database $database_name -Querytimeout 60 -OutputSqlErrors $true -ConnectionTimeout 10 -ErrorAction Continue "
# }
}
}
}
RunExecution
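As a side note, PowerShell workflows are tied to Windows PowerShell 5.1; in PowerShell 7+ the same per-server parallelism can be expressed with ForEach-Object -Parallel. A minimal sketch under that assumption (the server list, file list and paths are placeholders, and Invoke-Sqlcmd from the SqlServer module must be available in the parallel runspaces):
# Requires PowerShell 7+ for ForEach-Object -Parallel
$servers       = 'server1\DEV3', 'server1\DEV2', 'server2\DEV1'   # placeholder list
$file_list     = 'deployment_1.sql', 'deployment_2.sql'           # placeholder files
$folder        = 'C:\Temp\'
$database_name = 'Test'
$servers | ForEach-Object -Parallel {
    # Variables from the caller's scope must be pulled in with $using:
    foreach ($file in $using:file_list) {
        Invoke-Sqlcmd -ServerInstance $_ -InputFile ($using:folder + $file) `
            -Database $using:database_name -QueryTimeout 60 -ConnectionTimeout 10 `
            -OutputSqlErrors $true -ErrorAction Continue
    }
} -ThrottleLimit 5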

Unable to convert .xlsx to .csv through PowerShell in Windows Server 2012

I have created the function below in PowerShell to convert an .xlsx file into .csv; the script runs on a Windows Server 2012 machine. The script completes without any error, but it does not do the conversion. When I tried the same script on a Windows 10 machine, it worked perfectly. Can you please confirm how to fix this issue? Is it due to the server OS, since Excel is not installed on it?
function xlsx_to_csv($xlinput,$csvout) {
$excel = New-Object -ComObject excel.application
$excel.DisplayAlerts = $false
$Workbook = $excel.Workbooks.Open("$xlinput")
$Workbook.SaveAs("$csvout",6)
$excel.Quit()
}
xlsx_to_csv $yest_xl_in $yest_csv_in
$yest_xl_in and $yest_csv_in are the xlsx and csv file locations.
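Since the SaveAs approach relies on Excel's COM automation being registered on the machine, a quick diagnostic (a minimal sketch, separate from the conversion itself) can confirm whether that is what is failing on the server:
# Diagnostic sketch: succeeds only if Excel is installed and its COM object is registered
try {
    $excel = New-Object -ComObject Excel.Application -ErrorAction Stop
    Write-Host "Excel COM automation is available."
    $excel.Quit()
    [void][Runtime.InteropServices.Marshal]::ReleaseComObject($excel)
} catch {
    Write-Warning ("Excel COM automation is not available: " + $_.Exception.Message)
}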
Here is a more elegant function to convert an XLSX file into a csv. I have been using it for quite some time now and it has never let me down, yet! :)
function Get-ExcelData {
[CmdletBinding(DefaultParameterSetName='Worksheet')]
Param(
[Parameter(Mandatory=$true, Position=0)]
[String] $Path,
[Parameter(Position=1, ParameterSetName='Worksheet')]
[String] $WorksheetName = 'Sheet1',
[Parameter(Position=1, ParameterSetName='Query')]
[String] $Query = 'SELECT * FROM [Sheet1$]'
)
switch ($pscmdlet.ParameterSetName) {
'Worksheet' {
$Query = 'SELECT * FROM [{0}$]' -f $WorksheetName
break
}
'Query' {
# Make sure the query is in the correct syntax (e.g. 'SELECT * FROM [SheetName$]')
$Pattern = '.*from\b\s*(?<Table>\w+).*'
if($Query -match $Pattern) {
$Query = $Query -replace $Matches.Table, ('[{0}$]' -f $Matches.Table)
}
}
}
# Create the scriptblock to run in a job
$JobCode = {
Param($Path, $Query)
# Check if the file is XLS or XLSX
if ((Get-Item -Path $Path).Extension -eq '.xls') {
$Provider = 'Microsoft.Jet.OLEDB.4.0'
$ExtendedProperties = 'Excel 8.0;HDR=YES;IMEX=1'
} else {
$Provider = 'Microsoft.ACE.OLEDB.12.0'
$ExtendedProperties = 'Excel 12.0;HDR=YES'
}
# Build the connection string and connection object
$ConnectionString = 'Provider={0};Data Source={1};Extended Properties="{2}"' -f $Provider, $Path, $ExtendedProperties
$Connection = New-Object System.Data.OleDb.OleDbConnection $ConnectionString
try {
# Open the connection to the file, and fill the datatable
$Connection.Open()
$Adapter = New-Object -TypeName System.Data.OleDb.OleDbDataAdapter $Query, $Connection
$DataTable = New-Object System.Data.DataTable
$Adapter.Fill($DataTable) | Out-Null
}
catch {
# something went wrong ??
Write-Error $_.Exception.Message
}
finally {
# Close the connection
if ($Connection.State -eq 'Open') {
$Connection.Close()
}
}
# Return the results as an array
return ,$DataTable
}
# Run the code in a 32bit job, since the provider is 32bit only
$job = Start-Job $JobCode -RunAs32 -ArgumentList $Path, $Query
$job | Wait-Job | Receive-Job
Remove-Job $job
}
Now to get your csv file, you can simply do -
$csvfile = Get-ExcelData -Path 'PathToYourExcelFile\YourExcelFile.xlsx'
$csvfile | Export-Csv -path $env:USERPROFILE\Desktop\CsvFileName.csv -NoTypeInformation #Save the csv file on your Desktop

Emailing a Hard Drive Disk Space Alert Using Powershell

I've been browsing the web trying to find a way, if possible, to email a low disk space alert from a Gmail account to a shared mailbox using PowerShell, but I'm struggling with the query I've managed to piece together.
$EmailFrom = "FromEmail#Gmail.com"
$EmailTo = "ToEmail#Gmail.com"
$SMTPServer = "smtp.gmail.com"
$SMTPClient = New-Object Net.Mail.SmtpClient($SmtpServer, 587)
$SMTPClient.EnableSsl = $true
$SMTPClient.Credentials = New-Object System.Net.NetworkCredential("Username", "Password");
$Computers = "Local Computer"
$Subject = "Disk Space Storage Report"
$Body = "This report was generated because the drive(s) listed below have less than $thresholdspace % free space. Drives above this threshold will not be listed."
[decimal]$thresholdspace = 50
$tableFragment = Get-WMIObject -ComputerName $computers Win32_LogicalDisk `
| select __SERVER, DriveType, VolumeName, Name, @{n='Size (Gb)' ;e={"{0:n2}" -f ($_.size/1gb)}},@{n='FreeSpace (Gb)';e={"{0:n2}" -f ($_.freespace/1gb)}}, @{n='PercentFree';e={"{0:n2}" -f ($_.freespace/$_.size*100)}} `
| Where-Object {$_.DriveType -eq 3} `
| ConvertTo-HTML -fragment
$regexsubject = $Body
$regex = [regex] '(?im)<td>'
if ($regex.IsMatch($regexsubject)) {$smtpclinet.send($fromemail, $EmailTo, $Subject, $Body)}
The script runs but nothing happens; any help would be fantastic!
My version would be longer because I'd have made a substitute for Send-MailMessage such that swapping between mine and Send-MailMessage is trivial.
This is one possible way of doing it. There are good uses for the Fragment parameter on ConvertTo-Html, but not much of a justification to do so here.
This is a script and expected to be a .ps1 file. Mandatory things I don't really want to hard-code beyond a default are set in the param block.
param(
[String[]]$ComputerName = $env:COMPUTERNAME,
[Decimal]$Threshold = 0.5,
[PSCredential]$Credential = (Get-Credential)
)
#
# Supporting functions
#
# This function acts in much the same way as Send-MailMessage.
function Send-SmtpMessage {
param(
[Parameter(Mandatory = $true, Position = 1)]
[String[]]$To,
[Parameter(Mandatory = $true, Position = 2)]
[String]$Subject,
[String]$Body,
[Switch]$BodyAsHtml,
[String]$SmtpServer = $PSEmailServer,
[Int32]$Port,
[Switch]$UseSSL,
[PSCredential]$Credential,
[Parameter(Mandatory = $true)]
[String]$From
)
if ([String]::IsNullOrEmpty($SmtpServer)) {
# I'd use $pscmdlet.ThrowTerminatingError for this normally
throw 'A value must be provided for SmtpServer'
}
# Create a mail message
$mailMessage = New-Object System.Net.Mail.MailMessage
# Email address formatting is validated by this, allowing failure to kill the command
try {
foreach ($recipient in $To) {
$mailMessage.To.Add($recipient)
}
$mailMessage.From = $From
} catch {
$pscmdlet.ThrowTerminatingError($_)
}
$mailMessage.Subject = $Subject
$mailMessage.Body = $Body
if ($BodyAsHtml) {
$mailMessage.IsBodyHtml = $true
}
try {
$smtpClient = New-Object System.Net.Mail.SmtpClient($SmtpServer, $Port)
if ($UseSSL) {
$smtpClient.EnableSsl = $true
}
if ($psboundparameters.ContainsKey('Credential')) {
$smtpClient.Credentials = $Credential.GetNetworkCredential()
}
$smtpClient.Send($mailMessage)
} catch {
# Return errors as non-terminating
Write-Error -ErrorRecord $_
}
}
#
# Main
#
# This is inserted before the table generated by the script
$PreContent = 'This report was generated because the drive(s) listed below have less than {0} free space. Drives above this threshold will not be listed.' -f ('{0:P2}' -f $Threshold)
# This is a result counter, it'll be incremented for each result which passes the threshold
$i = 0
# Generate the message body. There's not as much error control around WMI as I'd normally like.
$Body = Get-WmiObject Win32_LogicalDisk -Filter 'DriveType=3' -ComputerName $ComputerName | ForEach-Object {
# PSCustomObject requires PS 3 or greater.
# Using Math.Round below means we can still perform numeric comparisons
# Percent free remains as a decimal until the end. Programs like Excel expect percentages as a decimal (0 to 1).
[PSCustomObject]@{
ComputerName = $_.__SERVER
DriveType = $_.DriveType
VolumeName = $_.VolumeName
Name = $_.Name
'Size (GB)' = [Math]::Round(($_.Size / 1GB), 2)
'FreeSpace (GB)' = [Math]::Round(($_.FreeSpace / 1GB), 2)
PercentFree = [Math]::Round(($_.FreeSpace / $_.Size), 2)
}
} | Where-Object {
if ($_.PercentFree -lt $Threshold) {
$true
$i++
}
} | ForEach-Object {
# Make Percentage friendly. P2 adds % for us.
$_.PercentFree = '{0:P2}' -f $_.PercentFree
$_
} | ConvertTo-Html -PreContent $PreContent | Out-String
# If there's one or more warning to send.
if ($i -gt 0) {
$params = @{
To = "ToEmail@Gmail.com"
From = "FromEmail@Gmail.com"
Subject = "Disk Space Storage Report"
Body = $Body
SmtpServer = "smtp.gmail.com"
Port = 587
UseSsl = $true
Credential = $Credential
}
Send-SmtpMessage @params
}
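Saved as a .ps1 file, the script above could then be invoked along these lines (the file name is just an example; the threshold is a decimal fraction, so 0.5 means 50% free):
# Hypothetical file name for the script above
.\Send-DiskSpaceAlert.ps1 -ComputerName 'SERVER01','SERVER02' -Threshold 0.5 -Credential (Get-Credential)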

Opening a SSRS project using Powershell

I have a report that is copied to a number of different servers. It is imported manually and the data source properties are altered to match the current server's specs. I would like to automate the process by enabling users to open the SSRS report and dynamically alter its shared data source properties through PowerShell. I hope you can help; you may see the reference below.
The script should accept input parameters for server name, username and password. Also, "save my password" must be ticked.
I couldn't believe I managed to create a script for this. You may use the script below as a future reference. Comments are available for each part, and anything that needs to be altered has a "here" keyword, e.g. Your_database_name_here.
Import-Module SqlPs
#Input parameter to get Server\Instancename of your Datasource
$Servername = Read-Host "Please enter your Servername"
$Instancename = Read-Host "Please enter your Instancename. For default instance please press enter"
Write-host ""
if ($Instancename -eq ""){
$ServerInstance = $Servername
}
Else {
$ServerInstance = $Servername +"\"+ $InstanceName
}
#Setting up SSRS Target URL. This is the location where your reports would be deployed.
if ($Instancename -eq ""){
$ReportServerUri = "http://$Servername/ReportServer//ReportService2010.asmx?wsdl"
$TargetURL = "http://$Servername/Reports"
}
Else {
$ReportServerUri = "http://$Servername/ReportServer_$Instancename//ReportService2010.asmx?wsdl"
$TargetURL = "http://$Servername/Reports_$Instancename"
}
$global:proxy = New-WebServiceProxy -Uri $ReportServerUri -UseDefaultCredential
#We would make use of SQL Server Authentication for the reports shared datasource so you need to supply a username and password.
Write-Host " SQL Server Authentication:"
$Username = Read-Host " Username"
$Password = Read-Host -AsSecureString "Password"
$type = $Proxy.GetType().Namespace
$datatype = ($type + '.Property')
$property =New-Object ($datatype);
$property.Name = "NewFolder"
$property.Value = "NewFolder"
$numproperties = 1
$properties = New-Object ($datatype + '[]') $numproperties
$properties[0] = $property;
$newFolder = $proxy.CreateFolder("Reports”, “/”, $properties);
$newFolder = $proxy.CreateFolder("Data Sources”, “/”, $properties);
$Children =$proxy.ListChildren("/",$false)
$DBname = 'Your_Database_Name_Here'
# Creating Datasource through powershell
Write-Host " Creating Datasource ..."
$Name = "Name_Your_Datasource_here"
$Parent = "/Data Sources"
$ConnectString = "data source=$Servername\$Instancename;initial catalog=$DBname"
$type = $Proxy.GetType().Namespace
$DSDdatatype = ($type + '.DataSourceDefinition')
$DSD = new-object ($DSDdatatype)
if($DSD -eq $null){
Write-Error "Failed to create data source definition object"
}
$CredentialDataType = ($type + '.CredentialRetrievalEnum')
$Cred = new-object ($CredentialDataType)
$CredEnum = ($CredentialDataType).Integrated
$Cred.value__=1
$DSD.CredentialRetrieval =$Cred
$DSD.ConnectString = $ConnectString
$DSD.Enabled = $true
$DSD.EnabledSpecified = $false
$DSD.Extension = "SQL"
$DSD.ImpersonateUserSpecified = $false
$DSD.Prompt = $null
$DSD.WindowsCredentials = $false
$DSD.UserName = $Username
$DSD.Password = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($Password))
$newDSD = $proxy.CreateDataSource($Name,$Parent,$true,$DSD,$null)
#Deploying RDL files to Target URL
Write-Host " Deploying RDL files ..."
$stream = Get-Content 'D:\Your_RDL_path_here.rdl' -Encoding byte
$warnings = @();
$proxy.CreateCatalogItem("Report","Report_Name_here","/Reports",$true,$stream,$null,[ref]$warnings)
#Let's make use of the datasource we just created for your RDL files.
$Items = $global:proxy.listchildren("/Data Sources", $true)
foreach ($item in $items)
{
$DatasourceName = $item.Name
$DatasourcePath = $item.Path
}
$RDLS = $global:proxy.listchildren("/Reports", $true)
foreach ($rdl in $rdls)
{
$report = $rdl.path
$rep = $global:proxy.GetItemDataSources($report)
$rep | ForEach-Object {
$proxyNamespace = $_.GetType().Namespace
$constDatasource = New-Object ("$proxyNamespace.DataSource")
$constDatasource.Name = $DataSourceName
$constDatasource.Item = New-Object ("$proxyNamespace.DataSourceReference")
$constDatasource.Item.Reference = $DataSourcePath
$_.item = $constDatasource.Item
$global:proxy.SetItemDataSources($report, $_)
Write-Host "Changing datasource `"$($_.Name)`" to $($_.Item.Reference)"
}
}
#Open a IE browser to view the report.
$IE=new-object -com internetexplorer.application
$IE.navigate2($TargetURL)
$IE.visible=$true
Write-Host ""
Write-Host "You may now view the Reports through the open IE browser."
Write-Host -ForegroundColor Green "**STEP COMPLETED!"