I'm writing a script that I'd like to be able to move easily between IIS servers to analyze logs, but these servers store the logs in different places: some on C:\, some on D:\, some in W3SVC1, some in W3SVC3. I'd like PowerShell to look this information up itself rather than having to edit the script manually on each server. (Yeah, I'm a lazy sysadmin. #automateallthethings.)
Is this information available to PowerShell if I maybe pass the domain to it or something?
I found this to work for me, since I want to know the log directory of all the sites.
Import-Module WebAdministration
foreach ($WebSite in $(Get-Website)) {
    $logFile = "$($WebSite.logFile.directory)\w3svc$($WebSite.id)".Replace("%SystemDrive%", $env:SystemDrive)
    Write-Host "$($WebSite.name) [$logFile]"
}
Import-Module WebAdministration
$sitename = "mysite.com"
$site = Get-Item IIS:\Sites\$sitename
$id = $site.id
$logdir = $site.logfile.directory + "\w3svc" + $id
Thanks to Chris Harris for putting the website ID idea in my head. After that I was able to search more effectively, which led me to the WebAdministration module and examples of its use.
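One caveat: $site.logfile.directory can come back holding a literal %SystemDrive% token (the first snippet above works around this with .Replace()). .NET can expand any such token for you:
# Expand tokens such as %SystemDrive% before using the path
$logdir = [System.Environment]::ExpandEnvironmentVariables($logdir)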
Nice... I updated your script a little bit to ask IIS for the log file location.
param($website = 'yourSite')
Import-Module WebAdministration
$site = Get-Item IIS:\Sites\$website
$id = $site.id
$logdir = $site.logfile.directory + "\w3svc" + $id
$time = (Get-Date).AddMinutes(-30)
# Location of IIS LogFile
$File = "$logdir\u_ex$((get-date).ToString("yyMMdd")).log"
# Get-Content reads the file; the Where-Object filter drops the header lines (#Software, #Version, #Date) but keeps the #Fields line
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Keep only rows containing "500 0 0" (HTTP 500s); this also strips the extra header rows that appear after an iisreset
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},csuristem,scstatus | ? { $_.DateTime -ge $time }
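If you save this as a script, say Get-Iis500Errors.ps1 (the name is arbitrary), you can point it at any site:
.\Get-Iis500Errors.ps1 -website 'Default Web Site'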
Related
I want a script that can help me check for the name of a keyset (column A) in Sample.csv and then replace the current command (column B) with the new command (column C) in the source text file.
CSV file: Sample.csv
A        | B  | C
Manock   | 2B | 2ab
Sterling | 3F | 3sf
Source file text: Source.txt
keyset "Manock"
(
key("SELECT")
command ("display/app=%disapp% "2B")
);
So desired output:
keyset "Manock"
(
key("SELECT")
command ("display/app=%disapp% "2ab")
);
Powershell Script:
New-Item -Path "C:\Users\e076200\Desktop\ks_update\source.txt" -ItemType File -Force
$data = Get-Content C:\Users\e076200\Desktop\ks_update\source.ddl
Add-Content -Value $data -Path "C:\Users\e076200\Desktop\ks_update\source.txt"
$foundline = $false
$a = 0
$Etxt = foreach($line in Get-Content C:\Users\e076200\Desktop\ks_update\source.txt)
{
if ($line -match 'keyset "Manock"' )
{
$a = 0
$foundline = $true
}
$a= $a + 1
if($line -match "display/app" -and $a -eq 5 -and $foundline -eq $true)
{
$line = $line.Replace('2B' , '2ab')
$line
}
else
{
$line
}
}
$Etxt | Set-Content C:\Users\e076200\Desktop\ks_update\source.txt -Force
$users = Import-CSV -Path C:\Users\e076200\Desktop\ks_update\sample.csv
I've figured out how to find and replace one line in the file directly, and I've figured out how to import the CSV. I need help making the logic parameterized, using column A of the CSV as the match piece and column C as the replacement piece (a sketch of one approach follows the explanation below).
Script Explanation.
New-Item -Path "C:\Users\e076200\Desktop\ks_update\source.txt" -ItemType File -Force
New-Item creates a new text file at the location defined by -Path, using the name given at the end (source.txt).
-ItemType defines the type of item to create; -Force overwrites the file if it already exists.
$data = Get-Content C:\Users\e076200\Desktop\ks_update\source.ddl
Retrieves the ddl file and stores its contents in a variable.
Add-Content -Value $data -Path "C:\Users\e076200\Desktop\ks_update\source.txt"
Transfers the content from the variable into the newly created text file.
$foundline = $false
Conditional variable, set to true once the keyset identifier is found.
$a = 0
Counter used by the if statement below.
$Etxt = foreach($line in Get-Content C:\Users\e076200\Desktop\ks_update\source.txt)
$Etxt collects the output of the foreach loop.
$line holds each line of the text file in turn.
{
if ($line -match 'keyset "Manock"' )
{
$a = 0
$foundline = $true
}
If the keyset identifier is found, reset the counter to 0 and set the conditional variable to true.
$a= $a + 1
if($line -match "display/app" -and $a -eq 5 -and $foundline -eq $true)
{
$line = $line.Replace('2B' , '2ab')
$line
Once the keyset identifier matches, the counter starts counting from that line (identifier line == 0 + 1) up to line 5, where the item to be replaced lives.
For redundancy, the line is also checked for an identifier ("display/app") expected on that line.
If the redundancy check is met and the counter is 5, the word is swapped via the .Replace() method.
The overwritten data is returned in $line.
}
else
{
$line
}
Else retain line
}
$Etxt | Set-Content C:\Users\e076200\Desktop\ks_update\source.txt -Force
Writes the updated lines back to the source text file.
$users = Import-CSV -Path C:\Users\e076200\Desktop\ks_update\sample.csv
Imports the reference CSV file.
Please make explanation as dumbed down as possible. Thank you.
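A minimal sketch of that parameterized logic, assuming sample.csv has headers A, B, and C as in the table above, and that the command line inside each keyset block contains the column-B value to swap out:
$map = Import-Csv 'C:\Users\e076200\Desktop\ks_update\sample.csv' # columns A, B, C assumed
$lines = Get-Content 'C:\Users\e076200\Desktop\ks_update\source.txt'
$current = $null # CSV row for the keyset block we are currently inside
$Etxt = foreach ($line in $lines) {
    if ($line -match 'keyset "(?<name>[^"]+)"') {
        # Entering a keyset block: look up its row by column A
        $current = $map | Where-Object { $_.A -eq $Matches.name }
    }
    if ($current -and $line -match 'display/app') {
        # Swap the old command (column B) for the new one (column C); -replace is case-insensitive
        $line = $line -replace [regex]::Escape($current.B), $current.C
        $current = $null
    }
    $line
}
$Etxt | Set-Content 'C:\Users\e076200\Desktop\ks_update\source.txt' -Force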
I am attempting to turn the file below into one that contains no comments ('#'), no blank lines, no unneeded spaces, and only one entry per line. I'm unsure how to run the following code without outputting the file and then re-importing it. There should be a way that doesn't require that step, but I can't find it. The way I wrote my script also doesn't look right to me, even though it works; as if there were a more elegant way of doing what I'm attempting, but I just don't see it.
Before File Change: TNSNames.ora
#Created 9_27_16
#Updated 8_30_19
AAAA.world=(DESCRIPTION =(ADDRESS_LIST =
(ADDRESS =
(COMMUNITY = tcp.world)
(PROTOCOL = TCP)
(Host = www.url1111.com)
(Port = 1111)
)
)
(CONNECT_DATA = (SID = SID1111)
)
)
#Created 9_27_16
BBBB.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url2222.COM)(Port=2222))(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url22222.COM)(Port=22222)))(CONNECT_DATA=(SID=SID2222)))
CCCC.world=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=url3333.COM)(Port=3333))(CONNECT_DATA=(SID=SID3333)))
DDDD.url =(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=URL4444 )(Port=4444))(ADDRESS=(COMMUNITY=TCP.world)(PROTOCOL=TCP)(Host=URL44444 )(Port=44444)))(CONNECT_DATA=(SID=SID4444 )(GLOBAL_NAME=ASDF.URL)))
#Created 9_27_16
#Updated 8_30_19
After File Change:
AAAA.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=www.url1111.com)(Port=1111)))(CONNECT_DATA=(SID=SID1111)))
BBBB.world=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url2222.COM)(Port=2222))(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=url22222.COM)(Port=22222)))(CONNECT_DATA=(SID=SID2222)))
CCCC.world=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(Host=url3333.COM)(Port=3333))(CONNECT_DATA=(SID=SID3333)))
DDDD.url=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(COMMUNITY=tcp.world)(PROTOCOL=TCP)(Host=URL4444)(Port=4444))(ADDRESS=(COMMUNITY=TCP.world)(PROTOCOL=TCP)(Host=URL44444)(Port=44444)))(CONNECT_DATA=(SID=SID4444)(GLOBAL_NAME=ASDF.URL)))
Code:
# Get the file
[System.IO.FileInfo] $File = 'C:\temp\TNSNames.ora'
[string] $data = (Get-Content $File.FullName | Where-Object { !$_.StartsWith('#') }).ToUpper()
# Convert the data. This part is where any (CONNECT_DATA entry ends up on its own line.
$Results = $data.Replace(" ", "").Replace("`t", "").Replace(")))", ")))`n")
# Convert $Results from BaseType of System.Object to System.Array
$Path = '.\.vscode\StringResults.txt'
$Results | Out-File -FilePath $Path
$Results = Get-Content $Path
# Find all lines that start with '(CONNECT_DATA'
for ($i = 0; $i -lt $Results.Length - 1; $i++) {
if ($Results[$i + 1].StartsWith("(CONNECT_DATA")) {
# Add the '(CONNECT_DATA' line to the previous line
$Results[$i] = $Results[$i] + $Results[$i + 1]
# Blank out the '(CONNECT_DATA' line
$Results[$i + 1] = ''
}
}
# Remove all blank lines
$FinalForm = $null
foreach ($Line in $Results) {
if ($Line -ne "") {
$FinalForm += "$Line`n"
}
}
$FinalForm
So the crux of your problem is that you have declared $data as a [string], which is fine, because some of your replace operations probably work better on a single string. It's just that $Results also then ends up being a string, so when you try to index into $Results near the bottom, those operations fail. You can, however, easily turn your $Results variable into a string array using the -split operator; this eliminates the need to save the string to disk and import it back in just to accomplish the same thing. See comments below.
# Get the file
[System.IO.FileInfo] $File = 'C:\temp\TNSNames.ora'
[string] $data = (Get-Content $File.FullName | Where-Object { !$_.StartsWith('#') }).ToUpper()
# Convert the data. This part is where any (CONNECT_DATA entry ends up on its own line.
$Results = $data.Replace(' ', '').Replace("`t", '').Replace(')))', ")))`n")
# You do not need to do this next section. Essentially this is just saving your multiline string
# to a file and then using Get-Content to read it back in as a string array
# Convert $Results from BaseType of System.Object to System.Array
# $Path = 'c:\temp\StringResults.txt'
# $Results | Out-File -FilePath $Path
# $Results = Get-Content $Path
# Instead split your $Results string into multiple lines using -split
# this will do the same thing as above without writing to file
$Results = $Results -split "\r?\n"
# Find all lines that start with '(CONNECT_DATA'
for ($i = 0; $i -lt $Results.Length - 1; $i++) {
if ($Results[$i + 1].StartsWith('(CONNECT_DATA')) {
# Add the '(CONNECT_DATA' line to the previous line
$Results[$i] = $Results[$i] + $Results[$i + 1]
# Blank out the '(CONNECT_DATA' line
$Results[$i + 1] = ''
}
}
# Remove all blank lines
$FinalForm = $null
foreach ($Line in $Results) {
if ($Line -ne '') {
$FinalForm += "$Line`n"
}
}
$FinalForm
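If you want the cleaned result written to a file rather than just displayed, finish with Set-Content (the output path here is an assumption):
$FinalForm | Set-Content 'C:\temp\TNSNames_clean.ora'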
Also, for fun, try this out
((Get-Content 'C:\temp\tnsnames.ora' |
Where-Object {!$_.StartsWith('#') -and ![string]::IsNullOrWhiteSpace($_)}) -join '' -replace '\s' -replace '\)\s?\)\s?\)', ")))`n" -replace '\r?\n\(Connect_data','(connect_data').ToUpper()
I want a PowerShell script to fetch all 500-status entries from the IIS logs on multiple servers. I have written a script that fetches 500s from a single server for the previous hour. Could someone check it and help me extend it to fetch from multiple servers? The script that I have:
#Set Time Variable -60
$time = (Get-Date).AddMinutes(-60)
# Location of IIS LogFile
#$servers = get-content C:\Users\servers.txt
$File = "\\server\D$\Logs\W3SVC89\"+"u_ex"+(get-date).ToString("yyMMddHH")+".log"
# Get-Content reads the file; the Where-Object filter drops the header lines (#Software, #Version, #Date) but keeps the #Fields line
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Keep only rows containing "500 0 0" (HTTP 500s); this also strips the extra header rows that appear after an iisreset
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},sip,csuristem,scstatus | ? { $_.DateTime -ge $time } | Out-File C:\Users\Servers\results.csv
Assuming your logfile is always on the same path, and that servers.txt contains your server list,
you can read the server list and then execute your code against each one using a foreach loop,
something like this (a result file is created for each server):
#Set Time Variable -60
$time = (Get-Date).AddMinutes(-60)
# Location of IIS LogFile
$servers = get-content C:\Users\servers.txt
$servers| foreach{
#inside the foreach loop $_ will represent the current server
$File = "\\$_\D$\Logs\W3SVC89\"+"u_ex"+(get-date).ToString("yyMMddHH")+".log"
# Get-Content reads the file; the Where-Object filter drops the header lines (#Software, #Version, #Date) but keeps the #Fields line
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Keep only rows containing "500 0 0" (HTTP 500s); this also strips the extra header rows that appear after an iisreset
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},sip,csuristem,scstatus | ? { $_.DateTime -ge $time } | Out-File "C:\Users\Servers\$($_)results.csv"
}
Note that this will run your code sequentially on each of your servers, which can be time-consuming. If duration becomes an issue, you can use Invoke-Command with the -AsJob parameter to launch your code asynchronously, as sketched below.
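For reference, a minimal sketch of that asynchronous variant, assuming PowerShell Remoting is enabled on the target servers and that D:\Logs\W3SVC89 is the log path as seen locally on each one:
$servers = Get-Content C:\Users\servers.txt
# Launch the scan on every server in parallel; each remote session reads its own local log
$job = Invoke-Command -ComputerName $servers -AsJob -ScriptBlock {
    $file = "D:\Logs\W3SVC89\u_ex$((Get-Date).ToString('yyMMddHH')).log"
    Get-Content $file | Where-Object { $_ -like '*500 0 0*' } |
        ForEach-Object { "$env:COMPUTERNAME $_" } # tag each line with the server it came from
}
Wait-Job $job | Out-Null
Receive-Job $job | Out-File C:\Users\Servers\results.txt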
I have an Excel File which has an unknown number of records in it, and these 3 columns:
Variable Name, Store Number, Email Address
I use this in QlikView to import data for certain stores and then create a separate report for each store in the list. I then need to email each report to each individual store (store number will be in the report file name).
So in PowerShell I would like to read the Excel File and set variables for each store:
$Store1 = The Store Number in Row 2 of the Excel File
$Store1Email = The Store Email in Row 2 of the Excel File
$Store2 = The Store Number in Row 3 of the Excel File
$Store2Email = The Store Email in Row 3 of the Excel File
etc. for each Store in the file (there can be any number of stores).
Please note the "Variable Name" column in the Excel file must be ignored (that is for QlikView), and the PowerShell variables must be named as per my examples above, each time incrementing the number.
Check out my PowerShell Excel Module on Github. You can also grab it from the PowerShell Gallery.
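Assuming this is the ImportExcel module on the Gallery (inferred from the Import-Excel cmdlet used below), installing it is one line:
Install-Module -Name ImportExcel -Scope CurrentUser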
$stores = Import-Excel C:\Temp\store.xlsx
$stores[2].Name
$stores[2].StoreNumber
$stores[2].EmailAddress
''
'All stores'
'----------'
$stores
Ok, first off, if you are going to be working with actual .XLS, .XLSX, or .XLSM files, I would highly suggest using the Import-XLS function from the TechNet gallery (found here).
After that, just reference the object it imports to send the emails instead of making objects for each store. Such as:
$StoreList = Import-XLS <path to Excel file>
GC <report folder> | %{
$Current = $_
$Store = $StoreList|?{$_.StoreNumber -match $Current.BaseName}|Select -ExpandProperty StoreNumber
$Email = $StoreList|?{$_.StoreNumber -match $Current.BaseName}|Select -ExpandProperty StoreEmail
<code to send $Current to $Email>
}
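The <code to send $Current to $Email> placeholder might look something like this sketch; the SMTP server and sender address are placeholders, not from the original post:
# Hypothetical SMTP details; substitute your own
Send-MailMessage -To $Email -From 'reports@yourdomain.com' -SmtpServer 'smtp.yourdomain.com' `
    -Subject "Store report $Store" -Body "Report for store $Store attached." -Attachments $Current.FullName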
My preference is to Save-As the Excel file to a '.csv' type. Comma-separated values can easily be imported into PowerShell.
$csvFile = Import-Csv -Path c:\scripts\temp\excelFile.csv
#now the entire Excel '.csv' file is saved into csvFile variable
$csvFile |Get-Member
#look at the properties
Remember to study the greats so your PowerShell script looks great: Jeffrey Snover, Jason Hicks, Don Jones, Ashley McGlone, and anyone on their friends list ha ha
The above answers usually work, but I just had a project with Excel datasheets that caused some problems.
edit: Here's a much more advanced version that will pull the data into an object, can handle blank and duplicate column names, and can skip human-oriented information at the beginning of the worksheet by looking for something in the header row. I've also included some example usages.
Your example:
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$Store1 = $file.data[0]."Store Number" #first row, column named "Store Number"
$Store1Email = $file.data[0]."Store Email" #first row, column named "Store Email"
foreach ($row in $file.data)
{
write-host "Store: $($row."Store Number")"
write-host "Store Email: $($row."Store Email")"
}
Example 1:
# Simplest example
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.from_excel("c:\folder\file.xls")
$file.data[0]
Example 2:
#advanced usage
$file = New-Module -AsCustomObject -ScriptBlock $file_template
$file.header_contains="First Name" # if included it will drop everything before the first line that contains this, useful if there are instructions for humans in the worksheet
$file.indexer_column = 5 # Default: 1 (first column); This column's contents will set the minimum number of rows, use if there are blank rows in your file but more data after them
$file.worksheet_index = "January" # Default: 1; can be a sheet index or sheet name
$file.filename = "c:\folder\file.xls" #can set this independently, useful for validation and troubleshooting
$file.from_excel() #This is where we actually pull from excel
$collected = $file.data | Out-GridView -PassThru #this is a neat way to select some rows you want
$file.headers.count # It stores an array of the headers here, useful for troubleshooting and advanced logic
Excel Reader pseudoclass
$file_template = {
# -- universal --
$filename = ""
$delimiter = ","
$headers = @()
$data = @()
# -- used by some functions --
# we put these here to allow assigning them before calling functions, which improves readability and auditability
$header_contains=""
$indexer_column=1
$worksheet_index=1
function from_excel(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
$data_by_row = $this.from_excel_as_csv() # $data_by_row = $file.from_excel_as_csv($test_file)
$data_by_row = $data_by_row -split "`n"
#if ($this.headers.count -lt 1) {$this.headers = $data_by_row[0] -split $this.delimiter} #this would let us set headers elsewhere which is more flexible but less adaptive, Because columns change unpredicably we need something more adaptive
$temp_headers = $data_by_row[0] -split $this.delimiter
$temp_headers = $this.fix_blank_headers($temp_headers)
$this.headers = $this.dedupe_headers($temp_headers)
$this.data = $data_by_row|select -Skip 1|ConvertFrom-Csv -Header $this.headers -Delimiter $this.delimiter
}
function from_csv($filename=$this.filename)`
{
$this.filename = $filename
$this.headers = (Get-Content $this.filename -ReadCount 1|select -first 1) -split $this.delimiter
$this.data = Get-Content $this.filename|ConvertFrom-Csv -Delimiter $this.delimiter
}
function from_excel_as_csv(
$filename=$this.filename,
$worksheet_index=$this.worksheet_index
)`
{
$this.filename = $filename
$this.worksheet_index = $worksheet_index
#set up excel
Write-Host "Importing from excel, this may take a little while..."
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
$excel.Visible = $false
$workbook = $excel.workbooks.open($this.filename)
$worksheet = $workbook.Worksheets.Item($this.worksheet_index)
#import from excel
try{
$data_by_row = ""
$indexed_column = $worksheet.columns.item($this.indexer_column).value2 #we use this to work around some files having headers with blank space
$minimum_rows = (($indexed_column -join "◘").TrimEnd("◘") -split "◘").count # This Strips the million or so extra blank rows excel appends to get a realistic column length.
[bool]$header_found = 0
$i=1
do `
{
$row = $worksheet.rows.item($i).value2
$row_as_text = $row -join "◘" # ◘ (alt+8) is just a placeholder that's unlikely to show up in the text
$row_as_text = $row_as_text -replace $this.delimiter,"."
$row_as_text = $row_as_text.TrimEnd("◘")
$row_as_text = $row_as_text -replace "◘",$this.delimiter
if ($row_as_text -like "*$($this.header_contains)*"){[bool]$header_found=1}
if ($header_found) {$data_by_row+="$row_as_text`n"}
$i++
}
while ( ($row_as_text.Length -gt 1) -or ($i -lt $minimum_rows) )
}
catch {Write-Warning "ERROR Importing from excel"}
#close excel
$workbook.Close()
$excel.Quit()
write-host "Done importing from excel"
return $data_by_row
}
function dedupe_headers($headers){
$dupes = ($headers|group)|?{$_.count -gt 1}
if ($dupes.count -ge 1)
{
foreach ($dupe in $dupes)
{ #$dupe = $dupes[0]
$i=1
$new_headers = @()
foreach ($header in $headers)
{ #$header = $headers[0]
if ($header -eq $dupe.name)
{
$header = "$($header)_$($i)" # "header_#"
$i++
}
$new_headers += $header
}
}
}
else {$new_headers = $headers} # no duplicates found
return $new_headers
}
function fix_blank_headers($headers)
{
$replace_blanks_with = "_"
$new_headers = @()
foreach ($header in $headers)
{
if ($header -eq "") {$new_headers += $replace_blanks_with}
else {$new_headers += $header}
}
if ($new_headers.count -ne $headers.count)
{
$error_json = @($headers),@($new_headers) | ConvertTo-Json -Compress
Write-Error "Error when fixing blank headers, original and new counts are different $($error_json)"
}
return $new_headers
}
<# function some_function($some_parameter){return $some_parameter} #>
Export-ModuleMember -Function * -Variable *
}
Forgive the ugliness here. I am not a programmer, so there are undoubtedly more optimized ways to do this, as well as better formatting. It will work, however, if I understand your requirements correctly.
$excelfile = import-csv "c:\myfile.csv"
$i = 1
$excelfile | ForEach-Object {
New-Variable "Store$i" $_."Store Number"
$iemail = $i.ToString() + "Email"
New-Variable "Store$iemail" $_."Email Address"
$i++
}
edit: as per the reply to your original post, this works with a csv file. Just save it to csv first if necessary.
$excelfile = import-csv "C:\Temp\store.csv"
$i = 1
$excelfile | ForEach-Object {
$NA= $_."Name"
$SN= $_."StoreNumber"
Write-Output "row $i"
$NA
$SN
$i++ }
When looking in the IIS 7.5 manager > Application Pools, the last column lists "Applications". This column shows the number of websites/applications each app pool is associated with.
I am trying to figure out how to query this column/information using PowerShell. The end goal here is to have a script I could run that would tell me if any application pool is being used by more than 1 website or app.
I am unable to find how to query this information, when running:
get-itemproperty IIS:\AppPools\(AppPoolName) | format-list *
I don't see this property. I'm not sure that this column is a property; if not, what is the best way to check whether AppPools are being used by more than 1 website/application?
The Applications property is defined in the format file; its code resides in the iisprovider.format.ps1xml file (in the WebAdministration module folder).
<TableColumnItem>
<ScriptBlock>
$pn = $_.Name
$sites = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool=`'$pn`' and @path='/']/parent::*" machine/webroot/apphost -name name
$apps = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool=`'$pn`' and @path!='/']" machine/webroot/apphost -name path
$arr = @()
if ($sites -ne $null) {$arr += $sites}
if ($apps -ne $null) {$arr += $apps}
if ($arr.Length -gt 0) {
$out = ""
foreach ($s in $arr) {$out += $s.Value + "`n"}
$out.Substring(0, $out.Length - 1)
}
</ScriptBlock>
</TableColumnItem>
You can take the code out and use it outside the format file; just assign $pn the app pool name you want to query. Here's a simplified version of the code:
$pn = 'pool1'
$sites = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool='$pn' and @path='/']/parent::*" machine/webroot/apphost -name name
$apps = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool='$pn' and @path!='/']" machine/webroot/apphost -name path
$sites,$apps | foreach {$_.value}
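Building on that, a sketch that flags any pool serving more than one site or application (the original goal):
Import-Module WebAdministration
foreach ($pool in (Get-ChildItem IIS:\AppPools)) {
    $pn = $pool.Name
    $sites = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool='$pn' and @path='/']/parent::*" machine/webroot/apphost -name name
    $apps = get-webconfigurationproperty "/system.applicationHost/sites/site/application[@applicationPool='$pn' and @path!='/']" machine/webroot/apphost -name path
    # Count both full sites and child applications assigned to this pool
    $count = @($sites).Count + @($apps).Count
    if ($count -gt 1) { "$pn is used by $count sites/applications" }
}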
I went with this:
Import-Module WebAdministration
function Get-WebAppPoolApplications($webAppPoolName) {
$result = @()
$webAppPool = Get-Item ( Join-Path 'IIS:\AppPools' $webAppPoolName )
if ( $webAppPool -ne $null ) {
$webSites = Get-ChildItem 'IIS:\Sites'
$webSites | % {
$webApplications = Get-ChildItem ( Join-Path 'IIS:\Sites' $_.Name ) |
where { $_.NodeType -eq 'application' }
$result += $webApplications |
where { $_.applicationPool -eq $webAppPoolName }
}
}
$result
}
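Example usage, listing everything assigned to a given pool (the pool name is whatever you want to inspect):
Get-WebAppPoolApplications 'DefaultAppPool'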
Wish I had seen your post earlier; this is what I eventually came up with:
$SiteApps = Get-Item IIS:\Sites\*
$arraySize = ($SiteApps.count - 1)
$i = 0
$t = 0
for ($i=0; $i -le $arraySize; $i++) # start at the beginning of the array
{
for ($t=($i+1); $t -le $arraySize; $t++)
{
if ($siteApps[$i].applicationpool -eq $siteApps[$t].applicationpool)
{
$web1 = $siteApps[$i].name
$webappPool = $siteApps[$i].applicationpool
$web2 = $siteApps[$t].name
$answer = $answer + "The website $web1 is sharing the AppPool $webAppPool with website $web2. "
}
}
}