There is a folder on the remote server which contains various subfolders, nested several levels deep. I would like to:
Prepare an HTML report which contains the folder name.
For every folder it should also record the file count.
The code needs to append to an HTML file which has already been created.
Columns required: Folder Name, Folder Path, File Count
Below is the code snippet, which is part of my main script. I am fairly new to PowerShell.
Can someone please help?
$server_dir = "D:\Data\Inbox"
$does_dir_e = (Test-Path $server_dir)
if($does_dir_e)
{
    $fso = New-Object -com "Scripting.FileSystemObject"
    $f = $fso.GetFolder($server_dir)
    foreach($folder in $f.subfolders)
    {
        $fcount = $((Get-ChildItem $folder.Path).count)
        $fname = $folder.name | Convertto-HTML -Fragment >> C:\Temp\Server.html
    }
}
You don't actually say what isn't working for you, but the following script should get you started.
The outer loop recurses through the folders (PSIsContainer means the item is a folder).
The inner loop counts the number of files in each folder using Measure-Object; we filter folders out of this count so we get just the file count.
$path = "D:\Data\Inbox"
# Enumerate the given path recursively
Get-ChildItem -Path $path -Recurse | Where-Object {$_.PSIsContainer} | %{
# Add a user-defined custom member with a value of the filecount this
# time not recursively (using measure object)
$_ | add-member -membertype noteproperty -name FileCount -value (Get-ChildItem -Path $_.Fullname |
Where-Object {!$_.PSIsContainer} |
Measure-Object).Count
# Output the required values
$_ | select Name, FullName, FileCount | ConvertTo-Html -Fragment
}
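The question also asks for the output to be appended to an HTML file that already exists. One way is to collect the fragments produced above and append them with Add-Content; a minimal sketch, assuming the report path C:\Temp\Server.html from the original snippet:
$fragments = Get-ChildItem -Path $path -Recurse |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object {
        # count only the files directly inside this folder
        $fileCount = (Get-ChildItem -Path $_.FullName |
            Where-Object { !$_.PSIsContainer } |
            Measure-Object).Count
        $_ | Add-Member -MemberType NoteProperty -Name FileCount -Value $fileCount -PassThru |
            Select-Object Name, FullName, FileCount |
            ConvertTo-Html -Fragment
    }
# Append the fragments to the report file that was already created
Add-Content -Path 'C:\Temp\Server.html' -Value $fragments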
Is this what you want? I haven't used the HTML cmdlet before, so be aware it's ugly : )
$server_dir = 'D:\Data\Inbox'
if(Test-Path $server_dir)
{
    $folders = Get-ChildItem $server_dir -Recurse | where {$_.PSIsContainer}
    $output = @()
    foreach($folder in $folders)
    {
        $fname = $folder.Name
        $fpath = $folder.FullName
        $fcount = Get-ChildItem $fpath | where {!$_.PSIsContainer} | Measure-Object | Select-Object -Expand Count
        $obj = New-Object psobject -Property @{FolderName = $fname; FolderPath = $fpath; FileCount = $fcount}
        $output += $obj
    }
    #Output to HTML
    $output | ConvertTo-Html -Fragment >> 'C:\Temp\Server.html'
}
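One small caveat: a plain (non-ordered) hashtable does not guarantee property order, so the columns may not come out as FolderName, FolderPath, FileCount. Piping through Select-Object first pins the column order, for example:
# Pin the column order before converting to HTML
$output | Select-Object FolderName, FolderPath, FileCount | ConvertTo-Html -Fragment >> 'C:\Temp\Server.html'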
I have a PowerShell script that stores the full paths of the files in a specific directory along with some other information, and I have exported it to a CSV file. The full paths are actually built by combining several parts. For example:
$root = 'C:\Users\home\'
$web = 'facebook website\domain'
$app = 'facebook android\latest'
These parts are then joined together, with either Join-Path or $fbweb = $root + $web, to make up the full path: C:\Users\home\facebook website\domain
The path above will contain other files, subfolders, etc., but that's the gist of how the paths are structured. I have exported them to the CSV file, but I'm having trouble with the following: I need the CSV file to have paths in such a way that the part leading up to $web is trimmed out.
For instance if the CSV file is like this:
Path
C:\Users\home\facebook website\domain\version\version-feb-2020.txt
C:\Users\home\facebook website\domain\interface\somefile.html
C:\Users\home\facebook android\latest\messenger\messenger app files\code.js
C:\Users\home\facebook android\latest\application\somecode.js
I want it to turn out like this:
Path
facebook website\domain
\version\version-feb-2020.txt
\interface\somefile.html
facebook android\latest
\messenger\messenger app files\code.js
application\somecode.js
I have tried using the following to trim it out:
$number = [regex]::matches($fbweb,"\\").count
Select-Object Hash,
    @{
        Name = "FilePath";
        Expression = { [string]::Join("\", ($_.Path -split "\\" | Select-Object -skip ($number))) }
    }
Update:
I have tried this:
$replace = Join-Path -Path $root -ChildPath $web
echo $replace
$RefHash = Import-Csv "C:\Users\Admin\Desktop\fb.csv"
$RefHash | ForEach-Object {
    echo $_.Path
    ($_.Path).Replace($replace, "\")
} | Export-Csv "C:\Users\Admin\Desktop\replaced.csv"
But this just results in the exported csv showing the following:
#TYPE System.String
"Length"
"numbers"
"numbers"
"numbers"
As discussed, you have two methods to manage this:
Treat the CSV file as a text file and do a replace on the output of Get-Content:
(Get-Content -Path "C:\temp\TestMD5.csv").Replace($replace, "\") | Set-Content "C:\temp\TestMD5updated.csv"
Or import the CSV, separate each of the parameters, modify what you require, and then build a PSCustomObject which you then export as CSV:
#Preparing variables
$scriptdir = [System.IO.Path]::GetDirectoryName($MyInvocation.MyCommand.Path)
$sourcecsv = Import-Csv -Path "C:\temp\TestMD5.csv"
$obj = @()
$root = "C:\Temp"
$web = "Test01\Test02\"
$replace = Join-Path -Path $root -ChildPath $web
$target = "\"
#Executing replace
foreach ($line in $sourcecsv) {
    $object = New-Object -TypeName psobject
    $algo = $line | Select-Object -ExpandProperty 'Algorithm'
    $Hash = $line | Select-Object -ExpandProperty 'Hash'
    $Path = ($line | Select-Object -ExpandProperty 'Path').Replace($replace, $target)
    $object | Add-Member -MemberType NoteProperty -Name Algorithm -Value $algo
    $object | Add-Member -MemberType NoteProperty -Name Hash -Value $Hash
    $object | Add-Member -MemberType NoteProperty -Name Path -Value $Path
    $obj += $object
    $object
}
$obj | Export-Csv -NoTypeInformation -Path "$scriptdir\UpdatedVars.csv"
The first one is faster; the second gives you the flexibility to build generalized functions and to modify additional columns as required.
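If you go with the second approach, it can be wrapped in a small reusable function. A minimal sketch (the function name and parameters are illustrative, not part of the original answer):
function Update-CsvColumn {
    param(
        [Parameter(Mandatory)] [string] $Path,
        [Parameter(Mandatory)] [string] $Destination,
        [Parameter(Mandatory)] [string] $Column,
        [Parameter(Mandatory)] [string] $Find,
        [string] $ReplaceWith = '\'
    )
    Import-Csv -Path $Path | ForEach-Object {
        # replace only in the requested column; other columns pass through unchanged
        $_.$Column = $_.$Column.Replace($Find, $ReplaceWith)
        $_
    } | Export-Csv -Path $Destination -NoTypeInformation
}
# Example call, reusing $replace from the snippet above
Update-CsvColumn -Path "C:\temp\TestMD5.csv" -Destination "C:\temp\TestMD5updated.csv" -Column 'Path' -Find $replace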
OK, assuming you don't actually need to import and deal with the file as a CSV and simply need to replace strings in a text file, you can use Get-Content instead of Import-Csv.
You want to use -replace.
$SourceFile = Get-Content -Path "D:\URL.txt"
$root= "C:\Users\home\"
$web = "facebook website\domain"
$app = "facebook android\latest"
$replace1 = $root+$web
$replace2 = $root+$app
$SourceFile -replace [Regex]::Escape($replace1), "\" -replace [Regex]::Escape($replace2), "\" | Set-Content -Path "D:\urlreplaced.txt"
This will do the replace and output the new file to D:\urlreplaced.txt
To convert the csv data into a new format as you would like, you could do the following:
$root= 'C:\Users\home'
$web = 'facebook website\domain'
$app = 'facebook android\latest'
$webPath = [regex]::Escape((Join-Path -Path $root -ChildPath 'facebook website\domain'))
$appPath = [regex]::Escape((Join-Path -Path $root -ChildPath 'facebook android\latest'))
$data = Import-Csv -Path "C:\Users\Admin\Desktop\fb.csv"
$appData = ($data | Where-Object { $_.Path -match "^$appPath" } | Select-Object @{Name = 'Path'; Expression = { $_.Path -replace "^$appPath" }}).Path
$webData = ($data | Where-Object { $_.Path -match "^$webPath" } | Select-Object @{Name = 'Path'; Expression = { $_.Path -replace "^$webPath" }}).Path
# manually create the one-column csv (easiest is to do this in a Here-String)
$newData = @"
Path
$web
$($webData -join [Environment]::NewLine)
$app
$($appData -join [Environment]::NewLine)
"@
# output on screen
$newData
# output to new CSV file
$newData | Set-Content -Path "C:\Users\Admin\Desktop\replaced.csv" -Force
Output on screen
Path
facebook website\domain
\version\version-feb-2020.txt
\interface\somefile.html
facebook android\latest
\messenger\messenger app files\code.js
\application\somecode.js
Okay, I am not a programmer and my PowerShell experience is basic, but here goes. I have been asked to collect some info on a directory we are migrating off our network.
The script collects sub-directory names, size, number of files and folders, and a datestamp, and exports it to CSV.
I cannot for the life of me make the folder creation date work, so I gave up on that and have been trying to get the LastWriteTime for the folders, as I am trying to figure out what has been used recently. It only works for a few folders; the rest show System.Object[] in the cell in Excel. Super frustrating.
Here is the code. It uses a GUI directory picker.
#Refresh network drives for session
net use i: /delete
net use m: /delete
net use i: "\\wfs.queensu.ca\ADV\Workgroups"
net use m: "\\wfs.queensu.ca\ADVMedia"
Function Get-Folder($initialDirectory)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $foldername = New-Object System.Windows.Forms.FolderBrowserDialog
    $foldername.Description = "Select a folder"
    $foldername.rootfolder = "MyComputer"
    if($foldername.ShowDialog() -eq "OK")
    {
        $folder += $foldername.SelectedPath
    }
    return $folder
}
$forDir = Get-Folder
#Change this to the parent directory that you want counts for
#$forDir = "\\wfs.queensu.ca\adv\workgroups\ADV Services\$seldir"
$Dirs = Get-ChildItem $forDir -Directory -Name
$Tab = [char]9
$results = @()
Write-Host $forDir
foreach($Dir in $Dirs)
{
    $dirSize = "{0:N2} MB" -f ((Get-ChildItem $forDir/$Dir -Recurse | Measure-Object -Property Length -Sum -ErrorAction Stop).Sum / 1MB)
    $dirFiles = Get-ChildItem $forDir/$Dir -Recurse -File | Measure-Object | %{$_.Count}
    $dirFolders = Get-ChildItem $forDir/$Dir -Recurse -Directory | Measure-Object | %{$_.Count}
    #$dirDate = (Get-ChildItem $forDir/$Dir).LastWriteTime.ToString
    $dirDate = @(Get-ChildItem $forDir/$Dir | % {$_.LastWriteTime})
    $details = [ordered] @{
        dir = $Dir
        No_Files = $dirFiles
        No_Folders = $dirFolders
        size = $dirSize
        date = $dirDate
    }
    $results += New-Object PSobject -Property $details
}
#This line finds the last index of the slash and adds one char
$Dirlength = $forDir.LastIndexOf('\') + 1
#This line takes the entire length of the string minus the position above, leaving the directory name
$sublength = $forDir.length - $Dirlength
#Assigns the remaining characters of the substring to the variable to be used as the filename
$DirName = $forDir.SubString($Dirlength, $sublength)
$results | Export-Csv "C:\$DirName.csv" -NoTypeInformation
Write-Host ("Complete WOW!")
Get-ChildItem .\dir gives you all items contained in the directory .\dir, not the directory itself.
That is why the following line in your script creates an array of LastWriteTimes for all items contained in the directory that $forDir/$Dir resolves to in your foreach loop:
$dirDate = @(Get-ChildItem $forDir/$Dir | % {$_.LastWriteTime})
The array in $dirDate returns System.Object[] when its ToString() method is called. That is why you see this string in Excel where you expect the folder's timestamp.
I bet that the folders that seem to work have exactly one child item...
To get the LastWriteTime of the directory itself, use Get-Item instead of Get-ChildItem.
$dirDate = Get-Item $forDir/$Dir | Select-Object -Expand LastWriteTime
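Plugged into your loop, and formatted as a string so Excel shows a readable timestamp, that could look like this (a sketch; the date format is just an example):
# inside the foreach ($Dir in $Dirs) loop
$dirDate = (Get-Item $forDir/$Dir).LastWriteTime.ToString('yyyy-MM-dd HH:mm:ss')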
try this...
Get-ChildItem -Path 'D:\temp' -Recurse |
Where-Object { $_.PSIsContainer } |
Select-Object -Property Name, LastWriteTime
<#
# Results
Name LastWriteTime
---- -------------
est 17-Feb-20 15:50:53
LogFiles 11-Mar-20 11:37:28
NewFolder 06-Feb-20 14:56:48
ParentFolder 12-Feb-20 14:24:25
Reference 03-Feb-20 11:55:47
Source 06-Feb-20 14:56:48
Target 24-Feb-20 22:03:56
New folder 03-Feb-20 11:55:24
temp 20-Jan-20 11:17:42
ChildFolder 12-Feb-20 14:08:11
GrandchildFolder 12-Feb-20 14:08:32
#>
# Or in v3 and beyond
Get-ChildItem -Path 'D:\temp' -Directory -Recurse |
Select-Object -Property Name, LastWriteTime
<#
# Results
Name LastWriteTime
---- -------------
est 17-Feb-20 15:50:53
LogFiles 11-Mar-20 11:37:28
NewFolder 06-Feb-20 14:56:48
ParentFolder 12-Feb-20 14:24:25
Reference 03-Feb-20 11:55:47
Source 06-Feb-20 14:56:48
Target 24-Feb-20 22:03:56
New folder 03-Feb-20 11:55:24
temp 20-Jan-20 11:17:42
ChildFolder 12-Feb-20 14:08:11
GrandchildFolder 12-Feb-20 14:08:32
#>
I know this question has already been answered, but for completeness, here's another way of doing this by utilizing the GetFileSystemInfos method that every DirectoryInfo object has.
$rootFolder = 'X:\YourRootPath'
Get-ChildItem -Path $rootFolder -Directory -Recurse | ForEach-Object {
    # GetFileSystemInfos() (needs .NET 4+) is faster than Get-ChildItem and returns hidden objects by default
    # See: https://devblogs.microsoft.com/powershell/why-is-get-childitem-so-slow/
    $fsObjects = $_.GetFileSystemInfos('*', 'TopDirectoryOnly') # TopDirectoryOnly --> do not recurse
    # you can also use Get-ChildItem here of course.
    # To also get hidden files, with Get-ChildItem you need to add the -Force switch
    # $fsObjects = Get-ChildItem -Path $_.FullName -Filter '*' -Force
    # from the $fsObjects array, filter out the files and directories in order to get the count
    $files   = $fsObjects | Where-Object { $_ -is [System.IO.FileInfo] }      # or: !($_.Attributes -band 'Directory')
    $folders = $fsObjects | Where-Object { $_ -is [System.IO.DirectoryInfo] } # or: $_.Attributes -band 'Directory'
    # emit a PSObject with all properties you wish to collect
    [PsCustomObject]@{
        Path      = $_.FullName
        FileCount = $files.Count
        DirCount  = $folders.Count
        DirSize   = "{0:N2} MB" -f (($files | Measure-Object -Sum -Property Length).Sum / 1MB)
        DirDate   = $_.LastWriteTime
    }
} | Export-Csv -Path "X:\YourFolder_Info.csv" -NoTypeInformation -UseCulture
I have a PowerShell script that performs an inventory of a file share. I want to know how to add an ID to each line in the CSV file, and also a parent ID to each line.
I am new to PowerShell but worked out how to get the inventory script working.
Here is the code.
#Set-ExecutionPolicy Unrestricted
$SourcePath = "G:\My Drive"
$DestinationCSVPath = "e:\G Drive Inventory 20180611.csv" #Destination for Temp CSV File
$CSVColumnOrder = 'Path', 'IsDIR', 'Directory', 'FileCount', 'Parent', 'Name', 'CreationTime', 'LastAccessTime', 'LastWriteTime', 'Extension', 'BaseName', 'B'
#, 'Root', 'IsReadOnly', 'Attributes', 'Owner', 'AccessToString', 'Group' #, #'MD5', #'SHA1' #Order in which columns in CSV Output are ordered
#FOLDERS ONLY
#$SourcePathFileOutput = Get-ChildItem $SourcePath -Recurse | where {$_.PSIsContainer}
#FILES AND FOLDERS
$SourcePathFileOutput = Get-ChildItem $SourcePath -Recurse #| where {$_.PSIsContainer} #Uncomment for folders only
$HashOutput = ForEach ($file in $SourcePathFileOutput) {
    Write-Output (New-Object -TypeName PSCustomObject -Property @{
        Path = $file.FullName
        IsDIR = $file.PSIsContainer
        Directory = $File.DirectoryName
        FileCount = (GCI $File.FullName -Recurse).Count
        Parent = $file.Parent
        Name = $File.Name
        CreationTime = $File.CreationTime
        LastAccessTime = $File.LastAccessTime
        LastWriteTime = $File.LastWriteTime
        Extension = $File.Extension
        BaseName = $File.BaseName
        B = $File.Length
        #Root = $file.Root
        #IsReadOnly = $file.IsReadOnly
        #Attributes = $file.Attributes
        #Owner = $acl.owner
        #AccessToString = $acl.accesstostring
        #Group = $acl.group
        #MD5 = Get-FileHash $file.FullName -Algorithm MD5 | Select-Object -ExpandProperty Hash
        #SHA1 = Get-FileHash $file.FullName -Algorithm SHA1 | Select-Object -ExpandProperty Hash
    }) | Select-Object $CSVColumnOrder
}
$HashOutput | Export-Csv -NoTypeInformation -Path $DestinationCSVPath
Try something like this:
$ID = 1
$SourcePath = "c:\temp"
$SourcePathFileOutput = @()
# include the parent dir itself
$SourcePathFileOutput += Get-Item $SourcePath | %{
    $ParentPath = if ($_.PSIsContainer) { $_.Parent.FullName } else { $_.DirectoryName }
    Add-Member -InputObject $_ -MemberType NoteProperty -Name "ID" -Value ($ID++)
    Add-Member -InputObject $_ -MemberType NoteProperty -Name "PARENTPATH" -Value $ParentPath
    $_
}
# add every directory and file
$SourcePathFileOutput += Get-ChildItem $SourcePath -Recurse | %{
    $ParentPath = if ($_.PSIsContainer) { $_.Parent.FullName } else { $_.DirectoryName }
    Add-Member -InputObject $_ -MemberType NoteProperty -Name "ID" -Value ($ID++)
    Add-Member -InputObject $_ -MemberType NoteProperty -Name "PARENTPATH" -Value $ParentPath
    $_
}
# list of directories only, to speed up the parent lookup
$DirOutput = $SourcePathFileOutput | where { $_.PSIsContainer }
# build the output (add all properties you want)
$list = foreach ($Current in $SourcePathFileOutput)
{
    $Result = [pscustomobject]@{
        Path       = $Current.FullName
        PARENTPATH = $Current.PARENTPATH
        ISDIR      = $Current.PSIsContainer
        ID         = $Current.ID
        PARENTID   = ($DirOutput | where { $_.FullName -eq $Current.PARENTPATH }).ID
    }
    # the root item has no parent in the list, so give it parent ID 0
    if ($Result.PARENTID -eq $null) { $Result.PARENTID = 0 }
    # send the result to the output
    $Result
}
$list | Out-GridView
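One design note: the PARENTID lookup above scans $DirOutput once per item, which gets slow on large trees. A hashtable keyed by full path makes the lookup constant time. A minimal sketch of that variation (not part of the original answer):
# Build a path -> ID index once, then look parents up directly
$DirIndex = @{}
foreach ($d in $DirOutput) { $DirIndex[$d.FullName] = $d.ID }
$list = foreach ($Current in $SourcePathFileOutput) {
    $parentId = if ($Current.PARENTPATH -and $DirIndex.ContainsKey($Current.PARENTPATH)) { $DirIndex[$Current.PARENTPATH] } else { 0 }
    [pscustomobject]@{
        Path       = $Current.FullName
        PARENTPATH = $Current.PARENTPATH
        ISDIR      = $Current.PSIsContainer
        ID         = $Current.ID
        PARENTID   = $parentId
    }
}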
This should do what you want:
#Set-ExecutionPolicy Unrestricted
$SourcePath = "G:\My Drive"
$DestinationCSVPath = "e:\G Drive Inventory 20180611.csv" #Destination for Temp CSV File
$CSVColumnOrder = 'Path', 'IsDIR', 'Directory', 'FileCount', 'Parent', 'Name', 'CreationTime', 'LastAccessTime', 'LastWriteTime', 'Extension', 'BaseName', 'B','ID','ParentID'
#, 'Root', 'IsReadOnly', 'Attributes', 'Owner', 'AccessToString', 'Group' #, #'MD5', #'SHA1' #Order in which columns in CSV Output are ordered
#FOLDERS ONLY
#$SourcePathFileOutput = Get-ChildItem $SourcePath -Recurse | where {$_.PSIsContainer}
#FILES AND FOLDERS
$SourcePathFileOutput = Get-ChildItem $SourcePath -Recurse | Sort-Object Fullname #| where {$_.PSIsContainer} #Uncomment for folders only
$CurrentID = 1
$IDs = [ordered]@{}
$IDs.Add(($SourcePathFileOutput[0].FullName | Split-Path), 0)
$HashOutput = ForEach ($file in $SourcePathFileOutput) {
    $IDs.Add($file.FullName, $CurrentID)
    Write-Output (New-Object -TypeName PSCustomObject -Property @{
        ID = $CurrentID
        ParentID = $IDs.$($file.FullName | Split-Path)
        Path = $file.FullName
        IsDIR = $file.PSIsContainer
        Directory = $File.DirectoryName
        FileCount = (GCI $File.FullName -Recurse).Count
        Parent = $file.Parent
        Name = $File.Name
        CreationTime = $File.CreationTime
        LastAccessTime = $File.LastAccessTime
        LastWriteTime = $File.LastWriteTime
        Extension = $File.Extension
        BaseName = $File.BaseName
        B = $File.Length
        #Root = $file.Root
        #IsReadOnly = $file.IsReadOnly
        #Attributes = $file.Attributes
        #Owner = $acl.owner
        #AccessToString = $acl.accesstostring
        #Group = $acl.group
        #MD5 = Get-FileHash $file.FullName -Algorithm MD5 | Select-Object -ExpandProperty Hash
        #SHA1 = Get-FileHash $file.FullName -Algorithm SHA1 | Select-Object -ExpandProperty Hash
    }) | Select-Object $CSVColumnOrder
    $CurrentID++
}
$HashOutput | Export-Csv -NoTypeInformation -Path $DestinationCSVPath
Explanation:
I piped $SourcePathFileOutput to Sort-Object because it is easiest to assign an ordered ID if things are sorted by their location in the directory tree.
$CurrentID starts at 1 because the first item in the loop is not the root parent. The loop increments this variable by 1 ($CurrentID++) at the end of each iteration so that the next file/directory gets a new number. Since this code creates $CurrentID as an Int32, you will have issues if you have more than roughly 2 billion files/directories; you would have to initialize it as a 64-bit type ($CurrentID = [long]1). However, a hash table can only hold an Int32 number of keys, so a different strategy would be needed if you have billions of files.
$IDs is an ordered hash table that tracks IDs for all files and directories. Each key in the hash table is the path of the current item in the loop, and the value is the ID assigned to it. This means you can access an ID with the syntax $IDs.path. After initializing it, I added the first entry with ID 0, which represents the root parent.
Inside the loop, I created the ID property, which just stores the current value of $CurrentID, and ParentID, which looks up the parent directory in the $IDs hash table and returns that key's ID value.
I updated $CSVColumnOrder to include ID and ParentID as columns.
You could make the ID scheme slightly different by doing the initial sort differently, and you don't have to increment by 1. If you have a requirement that directories get smaller IDs than files, that will require more code (I did not see that requirement, though).
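As an illustration with a hypothetical layout (these paths are made up, not taken from the question), the resulting ID columns look like this: the root parent gets ID 0, and every row's ParentID is the ID recorded for its containing directory.
Path                         ID  ParentID
G:\My Drive\Docs             1   0
G:\My Drive\Docs\a.txt       2   1
G:\My Drive\Docs\Sub         3   1
G:\My Drive\Docs\Sub\b.txt   4   3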
So I'm trying to process CSV files and then give the output a new name. I can do it with one file by explicitly specifying the file name, but is there a way / wildcard I can use to make the script process multiple files at the same time? Let's just say I want to process anything with .csv as an extension. Here's the script I use to process a specific file:
$objs = @();
$output = Import-Csv -Path D:\TEP\FilesProcessing\Test\file1.csv | ForEach {
    $Object = New-Object PSObject -Property @{
        Time = $_.READ_DTTM
        Value = $_.{VALUE(KWH)}
        Tag = [String]::Concat($_.SUBSTATION,'_',$_.CIRCUITNAME,'_',$_.PHASE,'_',$_.METERID,'_KWH')
    }
    $objs += $Object;
}
$objs
$objs | Export-Csv -NoTypeInformation D:\TEP\FilesProcessing\Test\file1_out.csv
You can combine Get-ChildItem and Import-Csv.
Here's an example that specifies different input and output directories to avoid name collisions:
$inputPath = "D:\TEP\FilesProcessing\Test"
$outputPath = "D:\TEP\FilesProcessing\Output"
Get-ChildItem (Join-Path $inputPath "*.csv") | ForEach-Object {
    $outputFilename = Join-Path $outputPath $_.Name
    Import-Csv $_.FullName | ForEach-Object {
        New-Object PSObject -Property @{
            "Time" = $_.READ_DTTM
            "Value" = $_.{VALUE(KWH)}
            "Tag" = "{0}_{1}_{2}_{3}_KWH" -f $_.SUBSTATION,$_.CIRCUITNAME,$_.PHASE,$_.METERID
        }
    } | Export-Csv $outputFilename -NoTypeInformation
}
Note that there's no need to create an array and repeatedly append to it. Just output the custom objects you want and export them afterwards.
Use Get-ChildItem and cut out all the unnecessary intermediate variables so that the code reads in a more PowerShell-like way. Something like this:
Get-ChildItem 'D:\TEP\FilesProcessing\Test\*.csv' | % {
    Import-Csv $_.FullName | % {
        New-Object PSObject -Property @{
            Time = $_.READ_DTTM
            Value = $_.{VALUE(KWH)}
            Tag = '{0}_{1}_{2}_{3}_KWH' -f $_.SUBSTATION, $_.CIRCUITNAME, $_.PHASE, $_.METERID
        }
    } | Export-Csv ($_.FullName -replace '\.csv', '_out.csv') -NoTypeInformation
}
Get-ChildItem is very useful for situations like this.
You can add wildcards directly into the path:
Get-ChildItem -Path D:\TEP\FilesProcessing\Test\*.csv
You can recurse a path and use the provider to filter files:
Get-ChildItem -Path D:\TEP\FilesProcessing\Test\ -recurse -include *.csv
This should get you what you need.
$Props = @(
    @{ Name = 'Time';  Expression = { [datetime]::Parse($_.READ_DTTM) } },
    @{ Name = 'Value'; Expression = { $_.{VALUE(KWH)} } },
    @{ Name = 'Tag';   Expression = { $_.SUBSTATION, $_.CIRCUITNAME, $_.PHASE, $_.METERID, 'KWH' -join "_" } }
)
$data = Get-ChildItem -Path D:\TEP\FilesProcessing\Test\*.csv | ForEach-Object { Import-Csv -Path $_.FullName }
$data | Select-Object -Property $Props | Export-Csv -NoTypeInformation D:\TEP\FilesProcessing\Test\file1_out.csv
Also, when using PowerShell, avoid this pattern of building arrays by appending:
$objs = @();
$objs += $Object;
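Appending with += copies the whole array on every iteration. The idiomatic alternative is to let PowerShell collect the output for you; a sketch using the columns from the question's CSV:
# collect loop output directly instead of appending to an array
$objs = foreach ($row in Import-Csv -Path D:\TEP\FilesProcessing\Test\file1.csv) {
    [pscustomobject]@{
        Time  = $row.READ_DTTM
        Value = $row.{VALUE(KWH)}
        Tag   = '{0}_{1}_{2}_{3}_KWH' -f $row.SUBSTATION, $row.CIRCUITNAME, $row.PHASE, $row.METERID
    }
}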
I'm trying to get the folder info and security info for all the folders on our server, but I'm not too familiar with PowerShell. Mind helping a newbie?
How do I get the security ACL piped into the text file?
Along with just the member objects for folder name, size, and subfolder count?
# Step 1: Get folder path
function Select-Folder($message = 'Select a folder', $path = 0) {
    $object = New-Object -ComObject Shell.Application
    $folder = $object.BrowseForFolder(0, $message, 0, $path)
    if ($folder -ne $null) {
        $folder.Self.Path
    }
}
# Step 2: Search for directories
$dirToAudit = Get-ChildItem -Path (Select-Folder 'Select some folder!') -Recurse | Where {$_.PSIsContainer -eq $true}
foreach ($dir in $dirToAudit)
{
    # Step 3: Output [Folder Path, Name, Security Owner, Size, Folder Count]
    # Pipe to CSV text file
    Get-Acl -Path $dir.FullName | Select-Object PSPath, Path, Owner | Export-Csv C:\temp\SecurityData.csv
    # I also want the folder path, size and subfolder count
}
# Step 4: Open in Excel
Invoke-Item -Path C:\temp\SecurityData.csv
Here are some sites that I found useful on the subject: http://blogs.msdn.com/b/powershell/archive/2007/03/07/why-can-t-i-pipe-format-table-to-export-csv-and-get-something-useful.aspx
http://www.maxtblog.com/2010/09/to-use-psobject-add-member-or-not/
This task isn't particularly easy. First you will want to create a custom object that contains the properties you want. These properties will be added via different commands e.g.:
$objs = Get-ChildItem . -r |
    Where {$_.PSIsContainer} |
    Foreach {New-Object psobject -Prop @{Path = $_.FullName; Name = $_.Name; FolderCount = $_.GetDirectories().Length}}
$objs = $objs | Foreach { Add-Member NoteProperty Owner ((Get-Acl $_.Path).Owner) -InputObject $_ -PassThru }
$objs | Export-Csv C:\temp\data.csv
Getting the folder size will take some extra work to compute.
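For the size, one approach is to sum the Length of every file underneath each folder with Measure-Object and attach it the same way as the Owner property; a minimal sketch (recursing can be slow on large trees):
$objs = $objs | Foreach {
    # sum the sizes of all files underneath this folder, in MB
    $bytes = (Get-ChildItem $_.Path -Recurse -ErrorAction SilentlyContinue |
        Where { !$_.PSIsContainer } |
        Measure-Object -Property Length -Sum).Sum
    Add-Member NoteProperty SizeMB ("{0:N2}" -f ($bytes / 1MB)) -InputObject $_ -PassThru
}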