I have the script below and am looking for help converting its output to Excel format:
$servers = get-content "c:\list.txt"
foreach ($server in $servers)
{
$server
$command = "quser /server:" + $server
invoke-expression $command
}
When executed, I get the output in the format below:
server1
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw231 ica-tcp#8 7 Active . 11/5/2012 10:40 AM
Vdw232 ica-tcp#60 16 Active 16:18 11/5/2012 2:22 PM
Vdw233 ica-tcp#71 3 Active . 11/6/2012 6:10 AM
Vdw234 ica-tcp#72 1 Active 3 11/6/2012 6:59 AM
Vdw235 ica-tcp#73 5 Active . 11/6/2012 6:59 AM
Vdw236 rdp-tcp#74 2 Active . 11/6/2012 7:07 AM
server2
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw210 ica-tcp#44 14 Active 13:50 11/5/2012 9:03 AM
Vdw211 ica-tcp#67 6 Active . 11/6/2012 1:56 AM
Vdw212 ica-tcp#70 1 Active 45 11/6/2012 6:34 AM
Vdw213 ica-tcp#72 9 Active 25 11/6/2012 6:53 AM
Vdw214
server3
USERNAME SESSIONNAME ID STATE IDLE TIME LOGON TIME
Vdw215 rdp-tcp#131 1 Active 19 11/5/2012 1:42 AM
Vdw216 rdp-tcp#132 4 Active 17 11/5/2012 2:06 AM
Vdw217 rdp-tcp#143 6 Active . 11/6/2012 3:31 AM
My requirement is that I want to convert this output to Excel format to submit to management. Below is the Excel format that I am thinking of producing from the above script...
I've rewritten this, but I didn't test the full script and it's not optimized. If you encounter any
problems, feel free to contact me.
$statuses = @()
$servers = get-content "c:\list.txt"
$splitter = [regex]"\s+"
foreach ($server in $servers)
{
$command = "quser /server:$server"
$lines = @((invoke-expression $command | Out-String) -split "`n")
#remove header
$lines = $lines[1..$lines.count]
foreach ($line in $lines)
{
$attrs = @($splitter.Split($line.Trim(),6))
if ( $attrs.Count -eq 6 )
{
$status = New-Object PSCustomObject -Property @{
"SERVER"=$server;
"USERNAME"=$attrs[0];
"SESSIONNAME"=$attrs[1];
"ID"=$attrs[2];
"STATE"=$attrs[3];
"IDLE_TIME"=$attrs[4];
"LOGON_TIME"=[datetime]$attrs[5]}
$statuses += $status
}
}
}
#your filter here
#$statuses = $statuses | where{ XXXXX }
$statuses | Export-Csv G:/test.csv -NoTypeInformation
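One caveat: because the properties above are supplied through a hashtable, the column order in the CSV is not guaranteed to match the order they were written in. If the column order matters for the report, a Select-Object in front of the export pins it down (a minimal addition, using the property names defined above):
$statuses |
    Select-Object SERVER, USERNAME, SESSIONNAME, ID, STATE, IDLE_TIME, LOGON_TIME |
    Export-Csv G:\test.csv -NoTypeInformation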
You need to convert the PSObject output to an Excel-compatible array, and then you can write that array to an Excel sheet.
Include this code in your *.ps1 script and use it like this: get-process | Export-Excel
#=============================================================================
# Convert powershell Object to Array for Excel
#=============================================================================
function ConvertTo-MultiArray {
<#
.Notes
NAME: ConvertTo-MultiArray
AUTHOR: Tome Tanasovski
Website: http://powertoe.wordpress.com
Twitter: http://twitter.com/toenuff
Version: 1.2
.Synopsis
Converts a collection of PowerShell objects into a multi-dimensional array
.Description
Converts a collection of PowerShell objects into a multi-dimensional array. The first row of the array contains the property names. Each additional row contains the values for each object.
This cmdlet was created to act as an intermediary for importing PowerShell objects into a range of cells in Excel. By using a multi-dimensional array you can greatly speed up the process of adding data to Excel through the Excel COM objects.
.Parameter InputObject
Specifies the objects to export into the multi dimensional array. Enter a variable that contains the objects or type a command or expression that gets the objects. You can also pipe objects to ConvertTo-MultiArray.
.Inputs
System.Management.Automation.PSObject
You can pipe any .NET Framework object to ConvertTo-MultiArray
.Outputs
[ref]
The cmdlet will return a reference to the multi-dimensional array. To access the array itself you will need to use the Value property of the reference
.Example
$arrayref = get-process |Convertto-MultiArray
.Example
$dir = Get-ChildItem c:\
$arrayref = Convertto-MultiArray -InputObject $dir
.Example
$range.value2 = (ConvertTo-MultiArray (get-process)).value
.LINK
http://powertoe.wordpress.com
#>
param(
[Parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
[PSObject[]]$InputObject
)
BEGIN {
$objects = @()
[ref]$array = [ref]$null
}
Process {
$objects += $InputObject
}
END {
$properties = $objects[0].psobject.properties |%{$_.name}
$array.Value = New-Object 'object[,]' ($objects.Count+1),$properties.count
# i = row and j = column
$j = 0
$properties |%{
$array.Value[0,$j] = $_.tostring()
$j++
}
$i = 1
$objects |% {
$item = $_
$j = 0
$properties | % {
if ($item.($_) -eq $null) {
$array.value[$i,$j] = ""
}
else {
$array.value[$i,$j] = $item.($_).tostring()
}
$j++
}
$i++
}
$array
}
}
#=============================================================================
# Export pipe in Excel file
#=============================================================================
function Export-Excel {
[cmdletBinding()]
Param(
[Parameter(Mandatory=$true, Position=1, ValueFromPipeline=$true)]
[PSObject[]]$InputObject
)
begin{
$header=$null
$row=1
$xl=New-Object -ComObject Excel.Application
$wb=$xl.WorkBooks.add(1)
$ws=$wb.WorkSheets.item(1)
$xl.Visible=$false
$xl.DisplayAlerts = $false
$xl.ScreenUpdating = $False
$objects = @()
}
process{
$objects += $InputObject
}
end{
$array4XL = ($objects | ConvertTo-MultiArray).value
$starta = [int][char]'a' - 1
if ($array4XL.GetLength(1) -gt 26) {
$col = [char]([int][math]::Floor($array4XL.GetLength(1)/26) + $starta) + [char](($array4XL.GetLength(1)%26) + $Starta)
} else {
$col = [char]($array4XL.GetLength(1) + $starta)
}
$ws.Range("a1","$col$($array4XL.GetLength(0))").value2=$array4XL
$wb.SaveAs("$([Environment]::GetFolderPath('desktop'))\Export-Excel ($(Get-Date -Format 'yyyy-MM-dd HH.mm.ss')).xlsx")
$xl.Quit()
Remove-Variable xl
}
}
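With both functions loaded (for example, dot-sourced from a .ps1 file), the session data collected by the first script can be sent straight to Excel. A minimal sketch, assuming the two functions are saved in a file named ExportExcel.ps1 (the file name is just an example):
. .\ExportExcel.ps1
$statuses | Export-Excel    # builds the workbook and saves it as coded in the SaveAs call above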
You get an .xlsx file saved to your desktop.
I am creating usernames as such: first 3 letters of the first name then 4 randomly generated numbers. Ryan Smith = RYA4859. I am getting the random number from this PowerShell command:
Get-Random -Minimum 1000 -Maximum 10000
I need to know how to create a script that will add the username to a .txt file after it has been generated. I also want the script to first check the .txt file to see if the randomly generated number already exists and, if it does, generate a new 4-digit number that does not exist and then add that to the .txt file.
The flow should be:
generate random 4 digit number
check txt file if number exists
if yes - generate new number
if no - append file and add generated number to file
You want to run a do...until loop that runs until the randomly generated number doesn't exist in your text file
$file = "C:\users.txt"
$userId = "RYA"
# get the contents of your text file
$existingUserList = Get-Content $file
do
{
$userNumber = Get-Random -Minimum 1000 -Maximum 10000
# remove all alpha characters in the file, so only an array of numbers remains
$userListReplaced = $existingUserList -replace "[^0-9]" , ''
# the loop runs until the randomly generated number is not in the array of numbers
} until (-not ($userNumber -in $userListReplaced))
# concatenates your user name with the random number
$user = $userId + $userNumber
# appends the concatenated username into the text file
$user | Out-File -FilePath $file -Append
Without the 3 character prefix
$file = "C:\users.txt"
# get the contents of your text file
$existingUserList = Get-Content $file
do
{
$userNumber = Get-Random -Minimum 1000 -Maximum 10000
# remove all alpha characters in the file, so only an array of numbers remains
$userListReplaced = $existingUserList -replace "[^0-9]" , ''
# the loop runs until the randomly generated number is not in the array of numbers
} until (-not ($userNumber -in $userListReplaced))
# appends the concatenated username into the text file
$userNumber| Out-File -FilePath $file -Append
Note: Hashtables in general will find keys in less time than finding a matching element in an unsorted array. This difference in performance increases as the number of elements increases. While a binary search on a sorted array may come closer in performance, the sorting process itself can be a major performance hit and adds complexity to the code.
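As a rough illustration of that difference (not part of the original answer; timings vary by machine), compare a single lookup against a large set of generated names:
$list  = 1..100000 | ForEach-Object { "RYA{0}" -f $_ }
$table = @{}
foreach ($name in $list) { $table[$name] = $true }
(Measure-Command { $list -contains 'RYA99999' }).TotalMilliseconds     # linear scan of the array
(Measure-Command { $table.ContainsKey('RYA99999') }).TotalMilliseconds # hashtable key lookup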
The main difference between the version of the code described in the comment on the question and the following code is that I'm appending the new user name to the file instead of overwriting the file, and I added a loop near the end to repeatedly ask whether the code should continue.
function RandomDigits {
[CmdletBinding()]
param (
[Parameter()]
[int]$DigitCount = 2
)
$RandString = [string](Get-Random -Minimum 100000 -Maximum 10000000)
$RandString.Substring($RandString.Length-$DigitCount)
}
function GenUserName {
[CmdletBinding()]
param(
[Parameter(Mandatory = $true, Position = 0)]
[string]$Prefix
)
"$Prefix$(RandomDigits 4)"
}
function ReadAndMatchRegex {
[CmdletBinding()]
param(
[Parameter(Mandatory = $true, Position = 0)]
[string]$Regex,
[Parameter(Mandatory = $true, Position = 1)]
[string]$Prompt,
[Parameter(Mandatory = $false, Position = 2)]
[string]$ErrMsg = "Incorrect, please enter needed info (Type 'exit' to exit)."
)
$FirstPass = $true
do {
if (-not $FirstPass) {
Write-Host $ErrMsg -ForegroundColor Red
Write-Host
}
$ReadText = Read-Host -Prompt $Prompt
$ReadText = $ReadText.ToUpper()
if($ReadText -eq 'exit') {exit}
$FirstPass = $false
} until ($ReadText -match $Regex)
$ReadText
}
$Usernames = @{}
$UsernameFile = "$PSScriptRoot\Usernames.txt"
if(Test-Path -Path $UsernameFile -PathType Leaf) {
foreach($line in Get-Content $UsernameFile) { $Usernames[$Line]=$true }
}
do {
Write-Host
$UserPrefix = ReadAndMatchRegex '^[A-Z]{3}$' "Please enter 3 letters for user's ID"
do {
$NewUserName = GenUserName $UserPrefix
} while ($Usernames.ContainsKey($NewUserName))
$NewUserName | Out-File $UsernameFile -Append
$UserNames[$NewUserName]=$true
$UserNames.Keys
$Continue = ReadAndMatchRegex '^(Y|y|YES|yes|Yes|N|n|NO|no|No)$' 'Continue?[Y/N]'
} while ($Continue -match '^(Y|y|YES|yes|Yes)$')
I have file 1.csv
number, name # column name
1,john
2,mike
3,test
4,test2
...
I created function for returning all values from this csv (number,name)
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
#Create an hashtable variable
[hashtable]$return = @{}
Import-Csv $path |
ForEach-Object {
$number = $_.number
$name = $_.name
$return.name = $name
$return.number = $number
return $return
}
# return $return   (un-commenting this line doesn't change the output)
}
# calling function
$a = Get-CSV "C:\Users\1.csv"
$a.number
$a.name
I get only one row (the last one from the CSV): $a.name = test2 and $a.number = 4.
How do I get all rows from the CSV when calling this function?
You need to construct an array of hashtables for this to work. Even better, I would create an array of objects because it gives you control over the property names. You can change the function definition to:
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
#Create an empty array variable
$return = @()
Import-Csv $path |
ForEach-Object {
$return += ,(New-Object PSObject -property @{'name'= $_.name; 'number'= $_.number})
}
return $return
}
This gives you:
$a = Get-CSV "C:\Users\1.csv"
$a
name number
---- ------
john 1
mike 2
test 3
test2 4
Note
I'm not sure of your exact use case, but Import-Csv already gives you the information back as objects, so there may not be a need to create separate ones.
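For example, a minimal sketch using the sample file above and skipping the custom function entirely:
$a = Import-Csv "C:\Users\1.csv"    # one object per row; the header row becomes the property names
$a | ForEach-Object { "{0}: {1}" -f $_.number, $_.name }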
After a lot of googling and trial and error, I found a solution:
Function Get-CSV {
[CmdletBinding()]
param (
# CSV File path
[Parameter(Mandatory=$true)][string]$path
)
Import-Csv $path |
ForEach-Object {
$number = $_.number
$name = $_.name
[pscustomobject]@{
name = $name
number = $number
}
}
}
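Calling it looks the same as before; because each [pscustomobject] is written to the pipeline, $a ends up as an array with one element per data row (values below assume the sample file from the question):
$a = Get-CSV "C:\Users\1.csv"
$a.Count      # one element per data row
$a[0].name    # john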
We're having a very strange problem. When iterating through multiple elements in an array of public folder names, PowerShell sometimes throws an error. But not always.
When running the code below with only one element as input it works fine, but when multiple elements are defined, the second iteration throws an error.
According to this Microsoft article one should Release the COM-Object, but this doesn't work either.
Code
Param (
[String]$Mail = 'User@donain.com',
[String]$ImportFile = 'C:\Scripts\Import.txt'
)
$Import = Get-Content $ImportFile
$Start = "\\Public Folders - $Mail"
Add-Type -AssemblyName 'Microsoft.Office.Interop.Outlook'
foreach ($L in $Import) {
$PSTFile = "$ExportFolder\$($L -replace '[^A-Za-z0-9-_ \.\[\]]', ' ').pst"
$Outlook = New-Object -ComObject Outlook.Application -Verbose:$false
$Namespace = $Outlook.GetNameSpace('MAPI')
$AllPublicFolders = $Namespace.Folders | where FolderPath -EQ $Start | ForEach-Object {
$Start = $Start + '\All Public Folders'
$_.Folders | where FolderPath -EQ $Start
}
$Split = $L.Split('\')
$Folder = Switch ($Split.Count) {
1 {$AllPublicFolders.Folders.Item($Split[0])}
2 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1])}
3 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2])}
4 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3])}
5 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3]).Folders.Item($Split[4])}
6 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3]).Folders.Item($Split[4]).Folders.Item($Split[5])}
7 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3]).Folders.Item($Split[4]).Folders.Item($Split[5]).Folders.Item($Split[6])}
8 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3]).Folders.Item($Split[4]).Folders.Item($Split[5]).Folders.Item($Split[6]).Folders.Item($Split[7])}
9 {$AllPublicFolders.Folders.Item($Split[0]).Folders.Item($Split[1]).Folders.Item($Split[2]).Folders.Item($Split[3]).Folders.Item($Split[4]).Folders.Item($Split[5]).Folders.Item($Split[6]).Folders.Item($Split[7]).Folders.Item($Split[8])}
}
Write-Verbose "Folder '$($Folder.FolderPath.TrimStart($Start))'"
Write-Verbose "Add PST"
$NameSpace.AddStore($PSTFile)
$PSTStore = $NameSpace.Stores | where {$_.FilePath -eq $PSTFile}
Write-Verbose "Copy content to PST"
$Folder.CopyTo($PSTStore) | Out-Null
Write-Verbose "Remove PST"
$PST = $NameSpace.Stores | where {$_.FilePath -eq $PSTFile}
$PSTRoot= $PST.GetRootFolder()
$PSTFolder= $NameSpace.Folders.Item($PSTRoot.Name)
$NameSpace.GetType().InvokeMember('RemoveStore',[System.Reflection.BindingFlags]::InvokeMethod,$null,$Namespace,($PSTFolder))
$Outlook.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Outlook)
Remove-Variable Outlook
Start-Sleep -Seconds 5
}
ImportFile
WEUR - COMPANY\DAF\Tableau de bord IB\Année 2002\07 juillet
WEUR - COMPANY\DAF\Tableau de bord IB\Année 2002\08 Août
Error
Failed for path 'WEUR - DOMAIM\DAF\Tableau de bord IB\Année 2002\08 Août': You cannot call a method on a null-valued expression.
Found the problem: inside the ForEach-Object block, $Start gets '\All Public Folders' appended on every pass through the outer foreach loop, so from the second import line onward the FolderPath filter no longer matches and $AllPublicFolders ends up $null (hence the null-valued expression error). Using the literal paths instead fixes it:
$AllPublicFolders = $Namespace.Folders | where FolderPath -EQ "\\Public Folders - $Mail" | ForEach-Object {
$_.Folders | where FolderPath -EQ "\\Public Folders - $Mail\All Public Folders"
}
I want a PowerShell script to fetch all HTTP 500 entries from the IIS logs of multiple servers. I have written a script that fetches the 500 errors from a single server for the previous hour. Could someone check it and help me extend it to multiple servers? The script that I have:
#Set Time Variable -60
$time = (Get-Date -Format "HH:mm:ss"(Get-Date).addminutes(-60))
# Location of IIS LogFile
#$servers = get-content C:\Users\servers.txt
$File = "\\server\D$\Logs\W3SVC89\"+"u_ex"+(get-date).ToString("yyMMddHH")+".log"
# Get-Content gets the file, pipe to Where-Object and skip the first 3 lines.
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Strip out the other rows that contain the header (happens on iisreset)
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},sip,csuristem,scstatus | ? { $_.DateTime -ge $time } | Out-File C:\Users\Servers\results.csv
Assuming your log file is always on the same path, and that servers.txt contains your server list, you can read the server list and then execute your code against each one using a foreach loop.
Something like this (a result file is created for each server):
#Set Time Variable -60
$time = (Get-Date -Format "HH:mm:ss"(Get-Date).addminutes(-60))
# Location of IIS LogFile
$servers = get-content C:\Users\servers.txt
$servers| foreach{
#inside the foreach loop $_ will represent the current server
$File = "\\$_\D$\Logs\W3SVC89\"+"u_ex"+(get-date).ToString("yyMMddHH")+".log"
# Get-Content gets the file, pipe to Where-Object and skip the first 3 lines.
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Strip out the other rows that contain the header (happens on iisreset)
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},sip,csuristem,scstatus | ? { $_.DateTime -ge $time } | Out-File "C:\Users\Servers\$($_)results.csv"
}
Note that this will run your code sequentially on each of your servers, which can be time-consuming. If you are facing duration issues, you can try to use Invoke-Command with the -AsJob parameter in order to launch your code asynchronously.
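A rough sketch of that approach (untested; it assumes PowerShell remoting is enabled on the target servers and that the log path is local to each server, using the same W3SVC89 folder as in the original script):
$servers = Get-Content C:\Users\servers.txt
$job = Invoke-Command -ComputerName $servers -AsJob -ScriptBlock {
    $file = "D:\Logs\W3SVC89\u_ex$((Get-Date).ToString('yyMMddHH')).log"
    # keep only the 500-status lines; each result carries a PSComputerName property identifying its server
    Get-Content $file | Where-Object { $_ -like '*500 0 0*' }
}
Wait-Job $job | Out-Null
Receive-Job $job | Out-File C:\Users\Servers\results_all.txt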
I'm writing a script that I'd like to be able to easily move between IIS servers to analyze logs, but these servers store the logs in different places: some on C:\, some on D:\, some in W3SVC1, some in W3SVC3. I'd like PowerShell to be able to look this information up itself rather than having to manually edit it on each server. (Yeah, I'm a lazy sysadmin. #automateallthethings.)
Is this information available to PowerShell if I maybe pass the domain to it or something?
I found this works for me, since I want to know all of the sites' log directories.
Import-Module WebAdministration
foreach($WebSite in $(get-website))
{
$logFile="$($Website.logFile.directory)\w3svc$($website.id)".replace("%SystemDrive%",$env:SystemDrive)
Write-host "$($WebSite.name) [$logfile]"
}
Import-Module WebAdministration
$sitename = "mysite.com"
$site = Get-Item IIS:\Sites\$sitename
$id = $site.id
$logdir = $site.logfile.directory + "\w3svc" + $id
Thanks to Chris Harris for putting the website ID idea in my head. I was able to search around better after that, and it led me to the WebAdministration module and examples of its use.
Nice... I updated your script a little bit to ask IIS for the log file location.
param($website = 'yourSite')
Import-Module WebAdministration
$site = Get-Item IIS:\Sites\$website
$id = $site.id
$logdir = $site.logfile.directory + "\w3svc" + $id
$time = (Get-Date -Format "HH:mm:ss"(Get-Date).addminutes(-30))
# Location of IIS LogFile
$File = "$logdir\u_ex$((get-date).ToString("yyMMdd")).log"
# Get-Content gets the file, pipe to Where-Object and skip the first 3 lines.
$Log = Get-Content $File | where {$_ -notLike "#[D,S-V]*" }
# Replace unwanted text in the line containing the columns.
$Columns = (($Log[0].TrimEnd()) -replace "#Fields: ", "" -replace "-","" -replace "\(","" -replace "\)","").Split(" ")
# Count available Columns, used later
$Count = $Columns.Length
# Strip out the other rows that contain the header (happens on iisreset)
$Rows = $Log | where {$_ -like "*500 0 0*"}
# Create an instance of a System.Data.DataTable
#Set-Variable -Name IISLog -Scope Global
$IISLog = New-Object System.Data.DataTable "IISLog"
# Loop through each Column, create a new column through Data.DataColumn and add it to the DataTable
foreach ($Column in $Columns) {
$NewColumn = New-Object System.Data.DataColumn $Column, ([string])
$IISLog.Columns.Add($NewColumn)
}
# Loop Through each Row and add the Rows.
foreach ($Row in $Rows) {
$Row = $Row.Split(" ")
$AddRow = $IISLog.newrow()
for($i=0;$i -lt $Count; $i++) {
$ColumnName = $Columns[$i]
$AddRow.$ColumnName = $Row[$i]
}
$IISLog.Rows.Add($AddRow)
}
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},csuristem,scstatus | ? { $_.DateTime -ge $time }
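If you want the filtered rows in a file instead of on screen, the last line can pipe to Export-Csv (the output path below is just an example); unlike Out-File, Export-Csv produces a real comma-separated file that opens cleanly in Excel:
$IISLog | select @{n="DateTime"; e={Get-Date ("$($_.date) $($_.time)")}},csuristem,scstatus |
    ? { $_.DateTime -ge $time } |
    Export-Csv "C:\Users\Servers\$website-results.csv" -NoTypeInformation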