I am not a coder, but I know enough to do some simple tasks. Using PowerShell,
I need to get:
folder/subfolder(s)/Filename.txt
Mode
LastWriteTime
Get-ChildItem and FullName work, but the file path is too long.
I tried:
Get-ChildItem -Recurse -Force | ForEach-Object -Process {$_.FullName,LastWriteTime,Mode} > FullFileName.csv
and a number of other scripts I found online, including this one:
Get-ChildItem "MyFolderDirectory" | Select Name, #{ n = 'Folder'; e = { Convert-Path $_.PSParentPath } }, `
#{ n = 'Foldername'; e = { ($_.PSPath -split '[\\]')[-2] } }
I just can't get what I want. This is what I need; there has to be an easy way to do this, but I am just not skilled enough to figure it out.
-a---- 9/9/2019 9:39AM folder1/folder2/Filename.txt
Does this help you?
Get-ChildItem -Recurse | ForEach-Object { Write-Output "$($_.Mode) $($_.LastWriteTime) $($_.FullName)" }
This will grab the properties for each file or directory returned by Get-ChildItem.
I would do something like this:
Get-ChildItem -Recurse -Force | Select-Object Mode, LastWriteTime, FullName
to get the list as an array of objects. That way, it is also easy to export the results to a CSV file you can open in Excel, for instance. To do that, simply append
| Export-Csv -Path 'X:\filelist.csv' -NoTypeInformation
to the above line (change the X: to an existing drive on your machine, of course).
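If you also want the path shown relative to a root folder, like folder1\folder2\Filename.txt above, here is a minimal sketch, assuming you first change location to that root ('C:\MyRootFolder' is a placeholder); Resolve-Path -Relative resolves against the current location:

# Placeholder root folder; run from the folder you want the paths relative to
Set-Location 'C:\MyRootFolder'
Get-ChildItem -Recurse -Force -File |
    Select-Object Mode, LastWriteTime,
        @{ n = 'RelativePath'; e = { Resolve-Path -LiteralPath $_.FullName -Relative } } |
    Export-Csv -Path 'X:\filelist.csv' -NoTypeInformation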
Related
I have several txt files distributed in several sub-folders.
This is what a file looks like:
Data file. version 01.10.
1
8
*
DAT\Trep\Typ10
DAT\Trep\Typ12
DAT\Trep\Typ13
What I would like to do is extract only the part after the last "\" in order to get something like this:
Typ10 FileName.txt Path
Typ12 FileName.txt Path
Typ13 FileName.txt Path
...
I tried the following:
Get-ChildItem -Path 'D:\MyData\*.txt' -Recurse | ForEach-Object {Write-Output $_; $data=Get-Content $_}
$data = $data -replace '.*\\'
$data
It works well for a single file but not with several (-Recurse).
Being a PowerShell beginner, I can't figure out how to improve my script.
I also tried to add this to get the result shown above in my post, but that doesn't work either.
Select-Object -Property @{Name = 'Result list'; Expression = { $data }}, Filename, Path
Thanks in advance for your kind help
Use Select-String:
Get-ChildItem -Path D:\MyData\*.txt -Recurse -File |
  Select-String '^.+\\(.+)' |
  ForEach-Object {
    [pscustomobject] @{
      Result   = $_.Matches.Groups[1].Value
      FileName = $_.FileName
      Path     = $_.Path
    }
  }
As for your desire to exclude certain folders during recursive traversal:
Unfortunately, Get-ChildItem -Exclude only excludes the matching folders themselves, not also their content. There are two relevant feature requests to potentially overcome this limitation in the future:
GitHub issue #4126 asks for path patterns to be supported as well.
GitHub issue #15159 proposes a new subtree-exclusion parameter, such as -ExcludeRecursive.
For now, a different approach with post-filtering based on Where-Object is required, using folder names Folder1 and Folder2 as examples:
Get-ChildItem -Path D:\MyData\*.txt -Recurse -File |
  Where-Object FullName -NotLike '*\Folder1\*' |
  Where-Object FullName -NotLike '*\Folder2\*' |
  Select-String '^.+\\(.+)' |
  ForEach-Object {
    [pscustomobject] @{
      Result   = $_.Matches.Groups[1].Value
      FileName = $_.FileName
      Path     = $_.Path
    }
  }
For a more flexible, cross-platform approach based on regex matching (which is invariably more complex), see the bottom section of this answer.
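A rough sketch of what such a regex-based exclusion can look like, using the same example folder names (an illustration, not the linked answer's exact code):

$excludeRegex = '[\\/](Folder1|Folder2)[\\/]'  # matches either folder name anywhere in the path
Get-ChildItem -Path D:\MyData\*.txt -Recurse -File |
  Where-Object { $_.FullName -notmatch $excludeRegex }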
I have an applications folder that has more than 10 applications in it. I want to list all files (including those in sub-folders) with directory and size info, and save the list under each application folder.
Here is the script (it runs in C:\Users\xxxxxx\Desktop\apps):
$FolderList = Get-ChildItem -Directory
foreach ($folder in $FolderList)
{
$thisPath = Get-Location
Get-ChildItem -File -Filter * -Path $folder -Recurse |
Sort-Object -Property Length -Descending |
Select-Object -Property FullName, Length, @{l='Length (MB)';e={$_.Length/1MB}} |
Format-Table -AutoSize |
Out-File $thisPath\fileList_$folder.txt
}
Output:
FullName - Length - Length (MB)
C:\Users\xxxxxx\Desktop\apps\3\VSCodeUserSetup-x64-1.62.2.exe 79944464 76.2409820556641
C:\Users\xxxxxx\Desktop\apps\3\son.zip 18745870 17.8774547576904
It does what I want, but in some of the outputs where the path is too long, it didn't write the length or even the full path.
FullName
C:\Users\xxxxxx\Desktop\apps\3\xxxxxxxxxxx/xxxxxx/xxxxxxxxxxxxxx/xxxxxxxxxxx/VSCodeUserSetu...
C:\Users\xxxxxx\Desktop\apps\3\son.zip
As I searched, I saw that there is a character limit. How can I overcome this issue?
The -Path parameter is defined as a string array, but you are feeding it a DirectoryInfo object.
The second part of your question is about truncation, which happens because you use Format-Table -AutoSize on the data.
Format-* cmdlets are designed to display output in the console window only, which has a limited width; all items longer than that are truncated.
Rule of Thumb: never use Format-* cmdlets when you need to process the data later.
Instead of saving a formatted table to a text file, I would suggest saving the (object) info as a structured CSV file, which makes working with the data later MUCH easier and avoids truncation (you can, for instance, open it in Excel).
# just get an array of folder names
$FolderList = (Get-ChildItem -Directory).Name  # instead of FullName
foreach ($folder in $FolderList) {
# create the path for the output
$fileOut = Join-Path -Path $folder -ChildPath ('filelist_{0}.csv' -f $folder)
Get-ChildItem -File -Path $folder -Recurse |
Sort-Object -Property Length -Descending |
Select-Object -Property FullName, Length, @{l='Length (MB)';e={$_.Length/1MB}} |
Export-Csv -Path $fileOut -NoTypeInformation -UseCulture
}
I added the switch -UseCulture to the Export-Csv cmdlet so you can afterwards simply double-click the CSV file to open it in Excel.
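Note that if you later want to read such a file back into PowerShell, Import-Csv needs the matching switch, since the delimiter now follows your regional settings:

Import-Csv -Path $fileOut -UseCulture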
I am seeking help creating a PowerShell script which will search a specified path for multiple .xml files within the same folder.
The script should provide the full path of the file(s) if found.
The script should also provide a date.
Here's my code:
$Dir = Get-ChildItem C:\windows\system32 -Recurse
#$Dir | Get-Member
$List = $Dir | where {$_.Extension -eq ".xml"}
$List | Format-Table Name
$folder = "C:\Windows\System32"
$results = Get-ChildItem -Path $folder -File -Filter "*.xml" | Select Name, FullName, LastWriteTime
This will return only XML files and display the file name, the full path to the file, and the last time it was written to. (Note that -Include without -Recurse does not match anything, which is why -Filter is used here.) The -File switch is only available in PowerShell 4 and up, so if you are doing this on Windows 7 or Windows Server 2008 R2, you will have to make sure you have updated your WMF to 4 or higher. Without -File, the line looks like this:
#Powershell 2.0
$results = Get-ChildItem -Path $folder -Filter "*.xml" | Where {$_.PSIsContainer -eq $false} | Select Name, FullName, LastWriteTime
I like the Select method mentioned above for its simpler syntax, but if for some reason you just want the file names with their absolute path and without the column header that comes with piping to Select (perhaps because the output will be used as input to another script, or piped to another function), you could do the following:
$folder = 'C:\path\to\folder'
Get-ChildItem -Path $folder -Filter *.xml -File -Name | ForEach-Object {
    # -Name outputs names relative to $folder, so resolve against that folder
    [System.IO.Path]::GetFullPath((Join-Path $folder $_))
}
I'm not sure if Select lets you leave out the header.
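If it doesn't, a header-free alternative is to expand the property, which outputs plain strings rather than objects; a minimal sketch:

Get-ChildItem -Path $folder -Filter *.xml -File |
    Select-Object -ExpandProperty FullName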
You could also take a look at this answer to give you some more ideas or things to try if you need the results sorted, or the file extension removed:
https://stackoverflow.com/a/31049571/10193624
I was able to make a few changes, exporting the results to a .txt file, but though it provides results, I only want to isolate the folders containing multiple .xml files.
$ParentFolder = "C:\software"
$FolderHash = @{}
$Subfolders = Get-ChildItem -Path $ParentFolder
foreach ($EventFolder in $Subfolders) {
$XMLFiles = Get-ChildItem -Path $EventFolder.fullname -Filter *.xml*
if ($XMLFiles.Count -gt 1) {
$FolderHash += @{$EventFolder.FullName = $EventFolder.LastWriteTime}
}
}
$FolderHash
Judging from your self-answer, you want a list of directories that contain more than one XML file, without recursively searching those directories. In that case your code could be simplified to something like this:
Get-ChildItem "${ParentFolder}\*\*.xml" |
Group-Object Directory |
Where-Object { $_.Count -ge 2 } |
Select-Object Name, #{n='LastWriteTime';e={(Get-Item $_.Name).LastWriteTime}}
So I have a script that gets the filenames of songs contained in a CSV list and checks a directory to see if each file exists, then exports the missing information if there is any. The CSV file has ID, AlbumTitle, TrackNo, and Filename columns.
Now, my script seems to work when I test on a smaller directory, but when I run it against my actual directory contained on an external drive (about 10TB of files), I get a "system.outofmemoryexception" error before the script can complete.
$myPath = 'Z:\Music\media'
$myCSV = 'C:\Users\Me\Documents\Test.csv'
$CSVexport = 'C:\Users\Me\Documents\Results.csv'
$FileList = Get-ChildItem $myPath -Recurse *.wav | Select-Object -ExpandProperty Name -Unique
Import-CSV -Path $myCSV |
Where-Object {$FileList -notcontains $_.Filename} |
Select ID, AlbumTitle, TrackNo, Filename | Export-CSV $CSVexport -NoTypeInformation
$missing = Import-CSV $CSVexport | Select-Object -ExpandProperty Filename
If(!([string]::IsNullOrEmpty($missing))){
Write-Output "Missing files:`n" $missing}
Is there a way to make this script consume less memory, or a more efficient way to do this against a large directory of files? I am new to PowerShell scripting and am having trouble finding a way around this.
When @TheIncorrigible says iteratively, he means something like this. Please note I am using different file paths since I don't have a Z: drive. The best way would be to load your CSV items into a variable, then iterate through that variable with a foreach loop, testing for each item whether the file exists, and adding it to a new variable if it does not. Once complete, export the new variable containing the missing items to CSV.
$myPath = "C:\temp\"
$myCsv = "C:\temp\testcsv.csv"
$CSVexport = "C:\temp\results.csv"
$CsvItems = Import-Csv -Path $myCsv
$MissingItems
foreach($item in $CsvItems)
{
$DoesFileExist = Test-Path ($myPath + $item.Filename)
If($DoesFileExist -eq $false)
{
$MissingItems = $MissingItems + $item
}
}
$MissingItems | Export-Csv $CSVexport -NoTypeInformation
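If the CSV filenames can sit anywhere under the folder, so a single Test-Path per row is not enough, here is a sketch of a middle ground (reusing the paths from the question): enumerate the drive once, but keep only the file names in a HashSet, whose Contains() lookups stay fast even with millions of entries:

$myPath    = 'Z:\Music\media'
$myCSV     = 'C:\Users\Me\Documents\Test.csv'
$CSVexport = 'C:\Users\Me\Documents\Results.csv'
# Stream the directory listing and store just the names, not the file objects
$names = [System.Collections.Generic.HashSet[string]]::new([System.StringComparer]::OrdinalIgnoreCase)
Get-ChildItem -Path $myPath -Recurse -Filter *.wav |
    ForEach-Object { $null = $names.Add($_.Name) }
Import-Csv -Path $myCSV |
    Where-Object { -not $names.Contains($_.Filename) } |
    Select-Object ID, AlbumTitle, TrackNo, Filename |
    Export-Csv $CSVexport -NoTypeInformation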
I am working on creating a PowerShell script that will read a .csv document containing a single column of filenames (one per cell), search a larger folder for each file matching the filenames provided, and identify the 'owner' using:
(get-acl $file).owner
Currently I have several bits of code that can do the individual parts, but I am having a hard time tying it all together. Ideally, a user can simply input file names into the .csv file, then run the script to output a second .csv or .txt identifying each file name and its owner.
The CSV formatting will appear as below (ASINs is the header):
ASINs
B01M8N1D83.MAIN.PC_410
B01M14G0JV.MAIN.PC_410
Pull file names without header:
$images = Get-Content \\path\ASINs.csv | Select -skip 1
Find images in larger folder to pull full filename/path (not working):
ForEach($image in $images) {
$images.FullName | ForEach-Object
{
$ASIN | Get-ChildItem -Path $serverPath -Filter *.jpg -Recurse -ErrorAction SilentlyContinue -Force | Set-Content \\path\FullNames.csv
}
}
At that point I would like to use the full file paths provided by FullNames.csv to pull the owners from the files in their native location, using the above-mentioned:
(get-acl $file).owner
Does anyone have any ideas how to tie these together into one fluid script?
EDIT
I was able to get the following to work without the loop, reading one of the filenames, but I need it to loop as there are multiple filenames.
New CSV Format:
BaseName
B01LVVLSCM.MAIN.PC_410
B01LVY65AN.MAIN.PC_410
B01MAXORH6.MAIN.PC_410
B01MTGEMEE.MAIN.PC_410
New Script:
$desktopPath = [System.Environment]::GetFolderPath([System.Environment+SpecialFolder]::Desktop)
$images = $desktopPath + '\Get_Owner'
Get-ChildItem -Path $images | Select BaseName | Export-Csv $desktopPath`\Filenames.csv -NoTypeInformation
$serverPath = 'C:\Users\tuggleg\Desktop\Archive'
$files = Import-Csv -Path $desktopPath`\Filenames.csv
While($true) {
ForEach ($fileName in $files.BaseName)
{
Get-ChildItem -Path $serverPath -Filter "*$fileName*" -Recurse -ErrorAction 'SilentlyContinue' |
Select-Object -Property @{
Name='Owner'
Expression={(Get-Acl -Path $_.FullName).Owner}
},'*' |
Export-Csv -Path $desktopPath`\Owners.csv -NoTypeInformation
}
}
Any ideas on the loop issue? Thanks everyone!
This example assumes your CSV contains partial filenames. It will search the file path and filter for those partials.
Example.csv
"ASINs"
"B01M8N1D83.MAIN.PC_410"
"B01M14G0JV.MAIN.PC_410"
Code.ps1
$Files = Import-Csv -Path '.\Example.csv'
$Results = ForEach ($FileName in $Files.ASINs)
{
    Get-ChildItem -Path $serverPath -Filter "*$FileName*" -Recurse -ErrorAction 'SilentlyContinue' |
        Select-Object -Property @{
            Name='Owner'
            Expression={(Get-Acl -Path $_.FullName).Owner}
        },'*'
}
# Export once, after the loop, so each iteration does not overwrite the file
$Results | Export-Csv -Path '\\path\FullNames.csv' -NoTypeInformation
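Collecting everything in $Results and exporting once is what keeps the rows for every filename; alternatively, Export-Csv -Append (PowerShell 3+) works inside the loop. It also addresses the loop issue above: the While($true) wrapper never terminates and isn't needed, because the ForEach already visits every row of the CSV.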