String comparison from XML failing - powershell

I am currently working on a project in PowerShell. The project downloads the NVD database XML, loops through a separate CSV for scan results from Nexpose, and pulls CVSS scores for each vulnerability identified with a CVE number.
It seems that matching the CVE number from the sheet (string) with the CVE number in the XML (also a string) is failing completely. The code I am using is below:
clear
[xml]$nvdxml = (New-Object system.Net.WebClient).DownloadString("http://static.nvd.nist.gov/feeds/xml/cve/nvdcve-2.0-recent.xml")
$nsmgr = New-Object System.XML.XmlNamespaceManager($nvdxml.NameTable)
$nsmgr.AddNamespace('xsi','http://www.w3.org/2001/XMLSchema-instance')
$nsmgr.AddNamespace('vuln','http://scap.nist.gov/schema/vulnerability/0.4')
$nsmgr.AddNamespace('cvss','http://static.nvd.nist.gov/feeds/xml/cve/nvdcve-2.0-recent.xml')
$nsmgr.AddNamespace('df','http://scap.nist.gov/schema/feed/vulnerability/2.0')
$nvdxml.SelectNodes('//vuln:product',$nsmgr) | out-null
$nvdxml.SelectNodes('//vuln:vulnerable-configuration',$nsmgr) | out-null
$nvdxml.SelectNodes('//vuln:vulnerable-software-list',$nsmgr) | out-null
$nvdxml.SelectNodes('//default:nvd',$nsmgr) | out-null
$nvdxml.SelectNodes('//default:entry',$nsmgr) | out-null
$x = import-csv "test-report.csv"
$items = @()
$x | where {$_."Vulnerability Test Result Code" -like "v*"} | %{
$item = @{}
$vid = $_."Vulnerability CVE IDs"
$entry = ""
$item["Vname"] = $_."Vulnerability Title"
$item["VNode"] = $_."Asset IP Address"
$item['VID'] = $vid
$entry = $nvdxml.nvd.entry | where { $_."cve-id" -eq $vid }
$item['Score'] = $entry.cvss.base_metrics.score
$items += $item
}
$items
The $items array contains a vulnerability which has a CVE ID, but the string comparison is utterly failing. When I query for the object's type I get
You cannot call a method on a null-valued expression.
At line:25 char:19
+ $entry.GetType <<<< ()
+ CategoryInfo : InvalidOperation: (GetType:String) [], RuntimeException
+ FullyQualifiedErrorId : InvokeMethodOnNull
When I assign a CVE ID to a string, and attempt to get the relevant vulnerability from the XML for that string, the comparison returns no results; yet, when I replace the variable with the quoted string of the same ID, the query returns the correct result. So, this would fail
$cveID = "CVE-2003-1567"
$nvdxml.nvd.entry | where { $_."cve-id" -eq $cveID }
However, this works fine
$nvdxml.nvd.entry | where { $_."cve-id" -eq "CVE-2003-1567" }
Any ideas? I have tried explicitly casting both $_."cve-id" and $cveID as String with the same results.

I would put all the entries in a hashtable, then look each one up via its CVE ID:
$entriesByID = @{ }
$nvdxml.nvd.entry |
ForEach-Object { $entriesByID[$_.id] = $_ }
Note that instead of using the cve-id element, I'm using the id attribute on the entry element.
Then, you can look up each entry in the hashtable:
$entry = $entriesByID[$vid]
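For reference, here is roughly how that lookup would slot into the loop from the question (a sketch reusing the CSV column names shown above, untested against the feed):
$items = @()
Import-Csv "test-report.csv" |
    Where-Object { $_."Vulnerability Test Result Code" -like "v*" } |
    ForEach-Object {
        $vid = $_."Vulnerability CVE IDs"
        # hashtable lookup replaces the per-row Where-Object scan over the feed
        $entry = $entriesByID[$vid]
        $items += @{
            Vname = $_."Vulnerability Title"
            VNode = $_."Asset IP Address"
            VID   = $vid
            Score = $entry.cvss.base_metrics.score
        }
    }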
If you're married to your original approach, you may be running into namespace issues. I would try using SelectSingleNode with the namespace manager instead of PowerShell's virtual XML object properties (note the double quotes so $vid actually expands):
$entry = $nvdxml.SelectSingleNode("//df:entry[vuln:cve-id = '$vid']", $nsmgr)
# or
$entry = $nvdxml.SelectSingleNode("//df:entry[@id = '$vid']", $nsmgr)

Related

Creating New Contacts In Bulk AND adding to distros

I have a script that creates bulk external contacts using PS and a CSV. I would like to tweak this to not only create the contact but also add the contact to any specified distros.
foreach($contact in (import-csv c:\users\ME\desktop\contactstest.csv)){
$properties = @{
type = 'Contact'
name = $contact.firstname + ", " + $contact.lastname
OtherAttributes = @{'mail' = "$($contact.email)"}
Path = "OU=External Contacts,DC=Company,DC=org"
}
New-ADObject @properties
}
If I have a CSV with the following columns, is this doable?
Firstname/lastname/email/Group
The CSV is located on the desktop of the DC.
CSV EXAMPLE
first,last,email,group
billy,bob,bb@aol.com,emailist1
I see now what went wrong the first time.
What I meant was for you to replace the line New-ADObject @properties with my code block. Since you left that in as well, the code tried to create the contact twice, leading to the error message and skipping the part where it adds the contact to the group(s).
If your CSV looks like this:
first,last,email,group
billy,bob,bb@aol.com,emailist1
jane,doe,jd@somewhereintheworld.com,emaillist1;emaillist5
Note the second contact is to be added to two groups, separated by a semi-colon ;
Here is the code in full:
foreach ($contact in (import-csv c:\users\ME\desktop\contactstest.csv)) {
# create a hash table for splatting the parameters to New-ADObject
$properties = @{
type = 'Contact'
name = $contact.firstname + ", " + $contact.lastname
OtherAttributes = @{'mail' = "$($contact.email)"}
Path = "OU=External Contacts,DC=Company,DC=org"
}
# create the contact and capture the resulting object
$newContact = New-ADObject @properties -PassThru -ErrorAction SilentlyContinue
if ($newContact) {
# if success, add this contact to the group(s)
$contact.Group -split ';' | ForEach-Object {
Add-ADGroupMember -Identity $_ -Members $newContact
}
}
else {
Write-Warning "Could not create contact $($contact.lastname)"
}
}
P.S. You can also include the PassThru and ErrorAction parameters in the splatting hash:
$properties = @{
type = 'Contact'
name = $contact.firstname + ", " + $contact.lastname
OtherAttributes = @{'mail' = "$($contact.email)"}
Path = "OU=External Contacts,DC=Company,DC=org"
PassThru = $true
ErrorAction = 'SilentlyContinue'
}

Powershell Global Variable usage as parameter to argument

$global:af_fp = "C:\Path\to\folder\"
Function function-name {
do things …
$global:af_fp = $global:af_fp + $variableFromDo_things + "_AF.csv"
}
function-name | ConvertTo-CSV -NoTypeInformation | Add-Content -Path $($af_fp)
Above are the generalized (and abbreviated) contents of the PowerShell script.
Every time I run the script in this way, I get the following error:
Add-Content : Could not find a part of the path 'C:\Users\timeuser\Documents\'.
At C:\Users\timeuser\Documents\get_software.ps1:231 char:51
+ ... ware | ConvertTo-CSV -NoTypeInformation | Add-Content -Path $($af_fp)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (C:\Users\timeuser\Documents\:String) [Add-Content], DirectoryNotFoundException
+ FullyQualifiedErrorId : GetContentWriterDirectoryNotFoundError,Microsoft.PowerShell.Commands.AddContentCommand
When I run
Get-Variable -Scope global
after running the script and seeing the error, the variable af_fp contains exactly the information I am seeking for the file name; however, the error shows the variable contents ending in ':String'.
To confuse me even more, if I comment out the lines containing '$global:...' and re-run the same script, IT ACTUALLY RUNS AND SAVES THE FILE USING THE LINE
function-name | ConvertTo-CSV -NoTypeInformation | Add-Content -Path $($af_fp)
AS INTENDED. Of course, I had to run the script and watch it error first, then re-run the script with the global variable declaration and update commented out for it to actually work. I want to run the script ONCE and still get the same results.
FYI, I am a complete noob to PowerShell, but very familiar with the concept of variable scope... but why is this global not working when initially created and updated, yet working the second time around, when, as far as I can tell, the CONTENT AND SCOPE of the global remain the same? Any assistance in finding a solution to this small issue would be greatly appreciated; I have tried so many different methods from inquiries here and on Google.
EDIT: not sure why this will matter, because the script ran before as intended when I explicitly typed the parameter for -Path as 'C:\path\to\file'. The ONLY CHANGES MADE to the original, working script (below) were my inclusion of the global variable declaration, the update to the contents of the global variable (near the end of the function), and the attempt to use the global variable as the parameter to -Path; that is why I omitted the script:
'''
$global:af_fp = "C:\Users\timeuser\Documents\"
Function Get-Software {
[OutputType('System.Software.Inventory')]
[Cmdletbinding()]
Param(
[Parameter(ValueFromPipeline = $True, ValueFromPipelineByPropertyName = $True)]
[String[]]$Computername = $env:COMPUTERNAME
)
Begin {
}
Process {
ForEach ($Computer in $Computername) {
If (Test-Connection -ComputerName $Computer -Count 1 -Quiet) {
$Paths = @("SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Uninstall", "SOFTWARE\\Wow6432node\\Microsoft\\Windows\\CurrentVersion\\Uninstall")
ForEach ($Path in $Paths) {
Write-Verbose "Checking Path: $Path"
# Create an instance of the Registry Object and open the HKLM base key
Try {
$reg = [microsoft.win32.registrykey]::OpenRemoteBaseKey('LocalMachine', $Computer, 'Registry64')
}
Catch {
Write-Error $_
Continue
}
# Drill down into the Uninstall key using the OpenSubKey Method
Try {
$regkey = $reg.OpenSubKey($Path)
# Retrieve an array of string that contain all the subkey names
$subkeys = $regkey.GetSubKeyNames()
# Open each Subkey and use GetValue Method to return the required values for each
ForEach ($key in $subkeys) {
Write-Verbose "Key: $Key"
$thisKey = $Path + "\\" + $key
Try {
$thisSubKey = $reg.OpenSubKey($thisKey)
# Prevent Objects with empty DisplayName
$DisplayName = $thisSubKey.getValue("DisplayName")
If ($DisplayName -AND $DisplayName -notmatch '^Update for|rollup|^Security Update|^Service Pack|^HotFix') {
$Date = $thisSubKey.GetValue('InstallDate')
If ($Date) {
Try {
$Date = [datetime]::ParseExact($Date, 'yyyyMMdd', $Null)
}
Catch {
Write-Warning "$($Computer): $_ <$($Date)>"
$Date = $Null
}
}
# Create New Object with empty Properties
$Publisher = Try {
$thisSubKey.GetValue('Publisher').Trim()
}
Catch {
$thisSubKey.GetValue('Publisher')
}
$Version = Try {
#Some weirdness with trailing [char]0 on some strings
$thisSubKey.GetValue('DisplayVersion').TrimEnd(([char[]](32, 0)))
}
Catch {
$thisSubKey.GetValue('DisplayVersion')
}
$UninstallString = Try {
$thisSubKey.GetValue('UninstallString').Trim()
}
Catch {
$thisSubKey.GetValue('UninstallString')
}
$InstallLocation = Try {
$thisSubKey.GetValue('InstallLocation').Trim()
}
Catch {
$thisSubKey.GetValue('InstallLocation')
}
$InstallSource = Try {
$thisSubKey.GetValue('InstallSource').Trim()
}
Catch {
$thisSubKey.GetValue('InstallSource')
}
$HelpLink = Try {
$thisSubKey.GetValue('HelpLink').Trim()
}
Catch {
$thisSubKey.GetValue('HelpLink')
}
$Object = [pscustomobject]@{
#Potential Candidate for AssetID in the TIME system
AssetID = $Computer
#String that contains word or word combinations for the product field of CPE WFN; may also contain the valid values necessary for update, edition, language, sw_edition, target_hw/sw fields as well.
cpeprodinfo = $DisplayName
cpeversion = $Version
InstallDate = $Date
cpevendor = $Publisher
UninstallString = $UninstallString
InstallLocation = $InstallLocation
InstallSource = $InstallSource
HelpLink = $thisSubKey.GetValue('HelpLink')
EstimatedSizeMB = [decimal]([math]::Round(($thisSubKey.GetValue('EstimatedSize') * 1024) / 1MB, 2))
}
$Object.pstypenames.insert(0, 'System.Software.Inventory')
Write-Output $Object
}
}
Catch {
Write-Warning "$Key : $_"
}
}
}
Catch { }
$reg.Close()
}
}
Else {
Write-Error "$($Computer): unable to reach remote system!"
}
$global:af_fp = $global:af_fp + $Computer + "_AF.csv"
}
}
}
Get-Software | ConvertTo-CSV -NoTypeInformation | Add-Content -Path $($af_fp)
'''
IGNORE FORMATTING PLEASE- HAD TROUBLE MAKING INDENTS CORRECTLY FROM COPY-PASTE AND RESTRICTIONS ON SITE FOR CODE BLOCKS.....
NOTE: the ONLY changes I made, that I am asking about, are the global declaration, the global variable update in the function, and the attempt to use the global variable for the -Path parameter... the script otherwise runs, and will even run WITH THE LAST LINE AS IS if I ran it and errored the first time... not sure how the additional script will help in any way, shape, or form!
With a little effort, Nasir's solution worked! HOWEVER, I ran across a sample file with a way of adding to a parameter that inspired me to make a change to my ORIGINAL that also worked: remove the global variable from the script entirely and add this code at the very end:
$file_suffix = '_AF.csv'
Get-Software | ConvertTo-CSV -NoTypeInformation | Add-Content -Path $env:COMPUTERNAME$file_suffix
In this way, I was able to accomplish exactly what I was setting out to do! Thanks Nasir for your response as well! I was able to also make that work as intended!
Global variables are generally frowned upon, since they often lead to poor scripts, with hard to debug issues.
It seems like your function returns some stuff, which you need to write to a file, the name of which is also generated by the same function. You can try something like this:
function function-name {
param($PathPrefix)
#do things
[pscustomobject]@{"DoThings_data" = $somevariablefromDoThings; "Filename" = "$($PathPrefix)$($variableFromDo_Things)_AF.csv"}
}
function-name -PathPrefix "C:\Path\to\folder\" | Foreach-Object { $_.DoThings_data | Export-Csv -Path $_.Filename -NoTypeInformation }
Or just have your function write the CSV data out and then return the data if you need to further process it outside the function.
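A minimal sketch of that alternative, reusing the placeholder names from above (the function both writes the file and emits the data for any further processing outside the function):
function function-name {
    param($PathPrefix)
    # do things ...
    $data = $somevariablefromDoThings
    $path = "$($PathPrefix)$($variableFromDo_Things)_AF.csv"
    # write the CSV from inside the function
    $data | Export-Csv -Path $path -NoTypeInformation
    # and still return the data so the caller can keep working with it
    $data
}
$results = function-name -PathPrefix "C:\Path\to\folder\"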
Edit: this is just me extrapolating from partial code you have provided. To Lee_Dailey's point, yes, please provide more details.

comparison of 2 csv files using powershell

The format of the two files is the same, as follows:
ServiceName Status computer State
AdobeARMservice OK NEE Running
Amazon Assistan OK NEE Running
The requirement is: I have to check the service name and computer name. If both are the same, then I have to check whether the state of that particular service is the same in both files or not, and if it is not the same, display it.
$preser = import-csv C:\info.csv
$postser = import-csv C:\serviceinfo.csv
foreach($ser1 in $preser)
{
foreach($ser2 in $postser)
{
if(($ser1.computer -eq $ser2.computer) -and ($ser1.ServiceName -eq $ser2.ServiceName))
{
if($ser1.State -eq $ser2.State)
{
}
else
{
write-host $ser1,$ser2
}
}
}
}
This code works fine, but the files are very large, so execution takes a long time.
Is there an alternative method to reduce the execution time?
Thank you
Although Import-Csv on very large files will take its time, maybe this will be faster:
$preser = Import-Csv -Path 'C:\info.csv'
$postser = Import-Csv -Path 'C:\serviceinfo.csv'
# build a lookup Hashtable for $preser
$hash = @{}
foreach ($item in $preser) {
# combine the ServiceName and Computer to form the hash key
$key = '{0}#{1}' -f $item.ServiceName, $item.computer
$hash[$key] = $item
}
# now loop through the items in $postser
foreach ($item in $postser) {
$key = '{0}#{1}' -f $item.ServiceName, $item.computer
if ($hash.ContainsKey($key)) {
if ($hash[$key].State -ne $item.State) {
# create a new object for output
$out = $hash[$key] | Select-Object * -ExcludeProperty State
$out | Add-Member -MemberType NoteProperty -Name 'State in Preser' -Value $hash[$key].State
$out | Add-Member -MemberType NoteProperty -Name 'State in Postser' -Value $item.State
$out
}
}
}
The output on screen will look something like this:
ServiceName : AdobeARMservice
Status : OK
computer : NEE
State in Preser : Running
State in Postser : Stopped
Of course, you can capture this output and save it as new csv if you do
$result = foreach ($item in $postser) {
# rest of the above foreach loop
}
# output on screen
$result
# output to new csv
$result | Export-Csv -Path 'C:\ServiceInfoDifference.csv' -NoTypeInformation
There are a few ways to do this:
1. Sorting the columns
If the columns are unsorted in the files, sort them first, and then walk both sorted lists in a single linear pass to find matches.
2. Binary search
What you are currently doing is an implementation of a linear search. You can implement binary search (which requires a sorted list) to find a result faster.
Taken from dfinkey's github repo
function binarySearch {
param($sortedArray, $seekElement, $comparatorCallback)
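# Note: 'Comparator' (providing the equal/lessThan methods used below) is a helper class defined in the same repo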
$comparator = New-Object Comparator $comparatorCallback
$startIndex = 0
$endIndex = $sortedArray.length - 1
while ($startIndex -le $endIndex) {
$middleIndex = $startIndex + [Math]::floor(($endIndex - $startIndex) / 2)
# If we've found the element just return its position.
if ($comparator.equal($sortedArray[$middleIndex], $seekElement)) {
return $middleIndex
}
# Decide which half to choose for seeking next: left or right one.
if ($comparator.lessThan($sortedArray[$middleIndex], $seekElement)) {
# Go to the right half of the array.
$startIndex = $middleIndex + 1
}
else {
# Go to the left half of the array.
$endIndex = $middleIndex - 1
}
}
return -1
}
3. Hashes
I am not completely sure of this method, but you can load the columns into hashes and then compare them; hash comparisons are generally faster than array comparisons (see the sketch below).
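A minimal sketch of that hash idea, along the lines of the lookup table in the previous answer (keyed on ServiceName plus computer, so each file is only read once):
$preser  = Import-Csv 'C:\info.csv'
$postser = Import-Csv 'C:\serviceinfo.csv'
# index the first file by a composite key
$states = @{}
foreach ($row in $preser) {
    $states["$($row.ServiceName)|$($row.computer)"] = $row.State
}
# single linear pass over the second file
foreach ($row in $postser) {
    $key = "$($row.ServiceName)|$($row.computer)"
    if ($states.ContainsKey($key) -and $states[$key] -ne $row.State) {
        Write-Host $row.ServiceName $row.computer ':' $states[$key] '->' $row.State
    }
}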

Concatenating a variable and a string literal without a space to an array using PowerShell

I'm trying to add a variable and a string to an array dynamically, but I'm not getting the expected output.
(1) I'm getting the env name
(2) Concatenating the string and variable in an array
Code is as follows.
$env = $env:COMPUTERNAME.Substring(0,2)
$servers = { $env+"server1.test.com",$env+"server2.test.com" }
$serverCount = $servers -split(",") | measure | % { $_.Count }
For ($i=0; $i -lt $serverCount; $i++)
{
$ServerName = $servers -split(',') -replace '\[\d+\]'
$server = $ServerName[$i]
Write-Host $server
}
The output I'm getting is:
$env+"server1.test.com"
$env+"server2.test.com"
The values are not getting concatenated properly and the variable value is not displayed. Any help?
$servers = { $env+"server1.test.com",$env+"server2.test.com" }
This is a scriptblock, not an array. {} is like a function, you have to run it for it to do anything (such as evaluating $env).
When you force it into a string using -split(",") what you get is the text representation of the source code in the scriptblock, including the variable names.
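A quick illustration of the difference, using the corrected (parenthesized) expression shown just below: invoking the block with the call operator & evaluates $env, while turning it into a string only yields its source text.
$env   = $env:COMPUTERNAME.Substring(0,2)
$block = { ($env + "server1.test.com"), ($env + "server2.test.com") }
$block.ToString()   # the literal source code, variable name and all
& $block            # invoking the block evaluates $env and returns the two names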
As @Olaf comments, the right way to create an array of names is
$servers = ($env + "server1.test.com"), ($env + "server2.test.com")
This might be how I'd write it:
$env = $env:COMPUTERNAME.Substring(0,2)
"server1.test.com", "server2.test.com" | foreach-object {
"$env$_" -replace '\d+'
}

Return all SSRS reports in a folder with the data source name and ConnectString

This is what I have so far. However, I want to list every report with its connection string. I don't see a unique identifier in the GetDataSourceContents() method to join the report and data source lists.
$ReportServerUri = "YOUR_SERVER";
$rs = New-WebServiceProxy -Uri $ReportServerUri -UseDefaultCredential -Namespace "SSRS"
$rs.Url = "YOUR_SERVER"
$rs.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials;
$BICItems = $rs.ListChildren("/", $true);
$BICFolders = $BICItems | Where { $_.TypeName -eq "Folder"}
$BICDataSources = $BICItems | Where {$_.typename -eq "DataSource"}
$BICDataSourceFolders = $BICFolders | Where {$_.path -like "*Data Source*"}
$BICReports = $BICItems | Where {$_.typename -eq "Report"}
foreach ($DataSource in $BICDataSources)
{
$BICDataSourceContents = $rs.GetDataSourceContents($DataSource.Path)
$MyConnectStrings = $BICDataSourceContents | Where {$_.ConnectString -like "*MY_CONNECT_STRING*"}
$MyConnectStrings
}
I don't see a unique identifier in the GetDataSourceContents method to join the report and data source lists.
Nope. Neither do I. However, when we are querying for those details we already know something unique enough: the path to the datasource itself. This is also what a report would be using, so that should be a good connector.
There is a series of functions that I made to serve this purpose. Find-SSRSEntities, Get-SSRSReportDataSources and Get-SSRSDatasourceDetails are what I will try to showcase here. The last one I just made since I had no reason for those details, but it was easy enough to integrate into my module.
Find-SSRSEntities
Return items from a SSRS connection. Supports loads of filtering options.
function Find-SSRSEntities{
[CmdletBinding()]
param(
[Parameter(Position=0,Mandatory=$true)]
[Alias("Proxy")]
[Web.Services.Protocols.SoapHttpClientProtocol]$ReportService,
[Parameter(Position=1)]
[Alias("Path")]
[string]$SearchPath="/",
[Parameter(Position=2)]
[ValidateSet("All", "Folder", "Report", "Resource", "LinkedReport", "DataSource", "Model")]
[Alias("Type")]
[String]$EntityType = "All",
[Parameter(Position=3)]
[String]$Match,
[Parameter(Position=4)]
[Switch]$Partial=$false
)
# Get all of the catalog items that match the criteria passed
# https://msdn.microsoft.com/en-us/library/reportservice2005.reportingservice2005.listchildren.aspx
$recursive = $true
$catalogItems = $ReportService.ListChildren($SearchPath,$recursive)
Write-Verbose "$($catalogItems.Count) item(s) located in the root path $SearchPath"
# Limit the results to the catalog types requested
if($EntityType -ne "All"){$catalogItems = $catalogItems | Where-Object{$_.Type -eq $EntityType}}
Write-Verbose "$($catalogItems.Count) item(s) found matching the type $EntityType"
# Set the match string based on parameters
if(-not $Partial.isPresent -and $Match){$Match = "^$Match$"}
Write-Verbose "Returning all items matching: '$Match'"
# If the regex is an empty string all object will be returned.
return $catalogItems | Where-Object{$_.Name -match $Match}
}
Get-SSRSReportDataSources
When given a valid report path it will return all associated datasources of that report.
function Get-SSRSReportDataSources{
[CmdletBinding()]
param(
[Parameter(Position=0,Mandatory=$true)]
[Alias("Proxy","SSRSService")]
[Web.Services.Protocols.SoapHttpClientProtocol]$ReportService,
[Parameter(Position=1,Mandatory=$true)]
[Alias("Path")]
[string]$ReportPath
)
# Test the report path to be sure it is for a valid report
if(Test-SSRSPath -ReportService $ReportService -EntityPath $ReportPath -EntityType Report){
$ReportService.GetItemDataSources($reportPath) | ForEach-Object{
[pscustomobject][ordered]@{
ReportPath = $reportPath
DataSourceName = $_.name
Reference = $_.item.reference
}
}
} else {
Write-Error "$ReportPath is not a valid report path"
}
}
Get-SSRSDatasourceDetails
When given a valid datasource path it will return all detail of that datasource. Also attaches an extra path property.
function Get-SSRSDatasourceDetails{
[CmdletBinding()]
param(
[Parameter(Position=0,Mandatory=$true)]
[Alias("Proxy")]
[Web.Services.Protocols.SoapHttpClientProtocol]$ReportService,
[Parameter(Position=1,Mandatory=$true,ValueFromPipelineByPropertyName)]
[Alias("Path")]
[string]$EntityPath
)
process{
# Split the path into its folder and entity parts
$SearchPath = Split-SSRSPath $EntityPath -Parent
$EntityName = Split-Path $EntityPath -Leaf
# Verify the path provided is to a valid datasource
if((Find-SSRSEntities -ReportService $ReportService -SearchPath $SearchPath -EntityType DataSource -Match $EntityName -Partial:$false) -as [boolean]){
Add-Member -InputObject ($ReportService.GetDataSourceContents($EntityPath)) -MemberType NoteProperty -Name "Path" -Value $EntityPath -PassThru
} else {
Write-Warning "Could not find a datasource at path: $EntityPath"
}
}
}
So armed with those, let's match up all reports in a folder to their datasource connection strings. I would note that all of these functions rely on an active connection to work. Something like this:
$ssrsservice = Connect-SSRSService "http://ssrsreports/ReportServer/ReportService2005.asmx" -Credential $credentials
$PSDefaultParameterValues.Add("*SSRS*:ReportService",$ssrsservice)
That will automatically apply the populated -ReportService $ssrsservice to all the SSRS functions I made below.
Else you could just add something like Find-SSRSEntities -ReportService $rs to the code below and it would work.
# Lets get all of the Marketing Datasources
$datasources = Find-SSRSEntities -SearchPath "/data sources/marketing" -EntityType DataSource | Get-SSRSDatasourceDetails
# Now gather all of their reports
Find-SSRSEntities -SearchPath "/Marketing" -EntityType Report |
# Get the report datasources
Get-SSRSReportDataSources | ForEach-Object{
# Attach the connection strings to each object
$reportDataSourceDetail = $_
# Filter the datasource for the individual datasource mapping of this report
$matchingDatasource = $datasources | Where-Object{$_.path -eq $reportDataSourceDetail.Reference}
Add-Member -InputObject $_ -MemberType NoteProperty -Name ConnectionString -Value $matchingDatasource.ConnectString -PassThru
}
This would net me results that look like this:
ReportPath : /Marketing/OandD Class Summary By Month
DataSourceName : Marketing
Reference : /Data Sources/Marketing/Marketing
ConnectionString : Data Source=SQL08R2VM; Initial Catalog=Marketing;
ReportPath : /Marketing/OandD Class YTD Summary
DataSourceName : Marketing
Reference : /Data Sources/Marketing/Marketing
ConnectionString : Data Source=SQL08R2VM; Initial Catalog=Marketing;
These, and other functions, suit me just fine. I have not really had anyone else using them, so you might have issues that I have never encountered. Works fine connecting to my SSRS 2008R2 server using PowerShell v5.
Here's a T-SQL statement that will return the data source name, path & connection string with the report name and path.
;WITH
XMLNAMESPACES -- XML namespace def must be the first in with clause.
(
DEFAULT 'http://schemas.microsoft.com/sqlserver/reporting/2006/03/reportdatasource'
,'http://schemas.microsoft.com/SQLServer/reporting/reportdesigner'
AS rd
)
,
shared_datasource
AS
(
SELECT
DsnSharedName = sds.[Name]
, DsnPath = sds.[Path]
, DEF = CONVERT(xml, CONVERT(varbinary(max), content))
FROM
dbo.[Catalog] AS sds
WHERE sds.[Type] = 5) --> 5 = Shared Datasource
,
data_source_name (DsnPath, DsnSharedName, DsnConnString)
AS
(
SELECT
cn.DsnPath
, cn.DsnSharedName
, cn.DsnConnString
FROM
(SELECT
sd.DsnPath
, sd.DsnSharedName
, DsnConnString = dsn.value('ConnectString[1]', 'varchar(150)')
FROM
shared_datasource AS sd
CROSS APPLY sd.DEF.nodes('/DataSourceDefinition') AS R(dsn)
) AS cn
)
SELECT
DataSourceName = lk.[Name]
, dsn.DsnPath
, dsn.DsnConnString
, ReportName = c.[Name]
, ReportFolder = c.[Path]
FROM
dbo.[Catalog] c
INNER JOIN dbo.DataSource ds ON c.ItemID = ds.ItemID
INNER JOIN dbo.[Catalog] lk ON ds.Link = lk.ItemID
INNER JOIN data_source_name dsn ON dsn.DsnSharedName = lk.[Name]
WHERE
c.[Type] = 2 --> 2 = Reports
--AND dsn.DsnConnString LIKE '%Initial Catalog%=%DatabaseNameHere%'
Then you can run the T-SQL script file in powershell with this. original post
<# Function to Check whether Server is Ping Status of the Server #>
Function Check-Ping()
{
param
(
[string]$HostName
)
$PingStatus=Get-WmiObject -Query "Select * from Win32_PingStatus where Address='$HostName'"
Return $PingStatus
}
<# Function to Check Instance name Present in the Server #>
Function Get-SQLInstances()
{
param
(
[string]$SQLServerName
)
$Status=Check-Ping($SQLServerName)
if($Status.StatusCode -ne 0)
{
Return "The Server Is Not Reachable"
}
elseif($Status.StatusCode -eq 0)
{
$Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $SQLServerName)
$RegKey = $Reg.OpenSubKey("SOFTWARE\\Microsoft\\Microsoft SQL Server")
$Instances=$regKey.GetValue("installedinstances")
Return $Instances
}
}
<# Function To Run TSQL and Return Results within HTML Table Tag #>
Function Run-TSQL()
{
Param
(
[string]$MachineName,
[string]$TSQLfilePath
)
$Assembly=[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$Instances=Get-SQLInstances($MachineName)
$TSQL=Get-Content $TSQLfilePath
foreach($Instance in $Instances)
{
$SQLServiceStatus=Get-Service -ComputerName $MachineName | Where-Object {$_.displayname -like "SQL Server ("+$Instance+")"}
if($SQLServiceStatus.Status -eq "Running")
{
if($Instance -eq "MSSQLSERVER")
{
$SQLServer=$MachineName
}
Else
{
$SQLServer=$MachineName+"\"+$Instance
}
$SQLServerObject = new-Object Microsoft.SqlServer.Management.Smo.Server($SQLServer)
$DatabaseObject = New-Object Microsoft.SqlServer.Management.Smo.Database
$DatabaseObject = $SQLServerObject.Databases.Item("Master")##The TSQL Script Runs in Master Database
$OutPutDataSet = $DatabaseObject.ExecuteWithResults($TSQL)
for($t=0;$t -lt $OutPutDataSet.Tables.Count;$t++)
{
$OutString+="<Table Border=2>"
$OutString+="<Tr>"
foreach($Column in $OutPutDataSet.Tables[$t].Columns)
{
$OutString+="<Th>"
$OutString+=$Column.ColumnName
$OutString+="</Th>"
}
$OutString+="</Tr>"
for($i=0;$i -lt $OutPutDataSet.Tables[$t].Rows.Count;$i++)
{
$OutString+="<Tr>"
for($j=0;$j -lt $OutPutDataSet.Tables[$t].Columns.Count;$j++)
{
$OutString+="<Td>"
$OutString+=$($OutPutDataSet.Tables[$t].Rows[$i][$j])
$OutString+="</Td>"
}
$OutString+="</Tr>"
}
$OutString+="</Table>"
$OutString+="</Br>"
$OutString+="</Br>"
}
}
}
Return $OutString
}
<# Function To Add Table Tag to with In HTML tags
Modify Title and Subject as Per yoru Requirement
#>
Function Get-HTMLOut()
{
Param
(
[String]$InputFile,
[String]$OutputFile,
[String]$TSQL
)
$Out+="<Html>"
$Out+="<Title>Run TSQL and Return HTML FIle</Title>" ## Modify 'TiTle' Tag as per your Required
$Out+="<Head><style>body {background-color:lightgray} H3{color:blue}H1{color:green}table, td, th {border: 1px solid green;}th {background-color: green;color: white;}</style></Head>" ## Modify 'Head' Tag as per your Required
$Out+="<Body><H1 Align='Center'>Run TSQL and Return HTML File</H1></Br></Br>" ## Modify 'Body' Tag as per your Required
ForEach($ServerName in Get-Content $InputFile)
{
$Out+="<H3 align='center'>--------------$ServerName--------------</H3>" ## Modify 'header Text' Tag as per your Required
$Out+="</Br>"
$Out+=Run-TSQL -MachineName $ServerName -TSQLfilePath $TSQL
}
$Out+="</Body></Html>"
Set-Content -Value $Out -Path $OutputFile
}
<# Call Get-HTMLOut Function
It Accepts 3 parameter
a. -InputFile (.txt file each server in a List withOut Instance Name)
b. -OutputFile (.Html File to which Output need to be sent)
c. -TSQL (.sql file which Contains the Script to Run)
#>
Get-HTMLOut -InputFile ".\Servers.txt" -OutputFile .\Status.Html -TSQL '.\TSQL Script.sql'