PowerShell: Save path after file is found via foreach loop

I'm still quite new to PowerShell, so this might be easy to answer for the veterans.
I want to grab the path of a file after a search through the drives.
Current code:
$Drives = Get-PSDrive -PSProvider 'FileSystem'
foreach ($Drive in $Drives)
{
    Get-ChildItem -Path $Drive.Root -Include FindMe.txt -Recurse -ErrorAction SilentlyContinue | Invoke-Item
}
The goal is to save information such as BIOS details into the found file.
But how can I grab the path of the file (or the found file itself) to use it as a save target?
Thanks in advance

As commenter Lee_Dailey suggested, just assign the command output to a variable and use the FullName property of the FileInfo class to get the absolute file path.
$Drives = Get-PSDrive -PSProvider 'FileSystem'
$foundPath = foreach ($Drive in $Drives)
{
    # Find the first file named FindMe.txt
    $found = Get-ChildItem -Path $Drive.Root -Filter FindMe.txt -Recurse -ErrorAction SilentlyContinue |
        Select-Object -First 1

    if ($found) {
        # Output the full path of the found file.
        # This is implicit output, which PowerShell captures in $foundPath.
        $found.FullName
        break
    }
}
# Output found path to the console
$foundPath
Notes:
I've replaced -Include with -Filter, which is more efficient because the filter is handled by the underlying .NET filesystem API. With -Include, the API returns all files and the filtering is done by PowerShell itself, which is slower.
I've added Select-Object -First 1 to stop searching when the first file named 'FindMe.txt' has been found.
Remove | Select-Object -First 1 to get all files named 'FindMe.txt'. In this case, $found will be an array when more than one file is found, and $found.FullName could resolve to an array due to member-access enumeration.
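The speed difference between -Filter and -Include can be measured directly. A rough sketch using Measure-Command (C:\Windows is only an example path, and timings will vary by machine):

```powershell
# Filtering done by the underlying filesystem API (faster)
Measure-Command {
    Get-ChildItem -Path C:\Windows -Filter *.txt -Recurse -ErrorAction SilentlyContinue
} | Select-Object TotalSeconds

# Filtering done by PowerShell after all files are returned (slower)
Measure-Command {
    Get-ChildItem -Path C:\Windows -Include *.txt -Recurse -ErrorAction SilentlyContinue
} | Select-Object TotalSeconds
```

Measure-Command returns a TimeSpan, so TotalSeconds gives a simple number to compare between the two runs.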

Related

How can I use a PowerShell output in the next command?

I am working on a PowerShell command to search across drives for a specific file. I am new to PowerShell so most of what I have already is just stuff I found online. At the moment I have this:
$ExclDrives = ('C')
Get-PSDrive -PSProvider FileSystem | Where-Object { $_.Name -notin $ExclDrives } |
    ForEach-Object { Write-Host -f Green "Searching " $_.Root; Get-ChildItem $_.Root -Include *MyFile.txt -Recurse |
        Sort-Object Length -Descending }
Which outputs this:
Searching D:\
Searching E:\
Searching F:\
Directory: F:\MyDirectory
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 8/13/2022 12:03 AM 0 MyFile.txt
PS C:\Windows\system32>
I would like to know how I can take the directory that is listed in the output and use it in a following command such as:
cd F:\MyDirectory
If this is possible through piping or something I would really appreciate an answer :)
Thanks for reading
I wasn't really sure what the best way to handle this would be if multiple files were found. We wouldn't be able to change directory into each parent folder while the script was running, nor could we do so for all of the returned files unless we opened a new PowerShell window for each. Since it appears you are searching for specific files, which I assume will not return too many results, and not knowing your ultimate goal, I went with opening a new File Explorer window for each file, with the file highlighted/selected.
$excludeDrives = ('C')
Get-PSDrive -PSProvider FileSystem | Where-Object { $_.Name -notin $excludeDrives } |
    ForEach-Object {
        Write-Host -f Green 'Searching ' $_.Root
        Get-ChildItem -Path $_.Root -Recurse -Include *MyFile.txt -ErrorAction SilentlyContinue |
            ForEach-Object {
                # This line opens a File Explorer window with the file highlighted
                explorer.exe /select, $_
                # This line sends the file object on through the pipeline
                $_
            } | Sort-Object Length -Descending
    }
To answer your question about how to access the file's directory in the next command, you can use Foreach-Object and $_.Directory:
Get-ChildItem -Path $_.Root -Recurse -Include *MyFile.txt -ErrorAction SilentlyContinue |
    Sort-Object Length -Descending |
    ForEach-Object {
        # The pipeline passes each object along, and we can access it
        # through the automatic variable $_. FileInfo objects expose
        # a Directory property.
        'The directory is ' + $_.Directory
    }
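If the end goal is simply to cd into the folder of a single match, a minimal sketch (assuming one result is enough, and with D:\ standing in for whichever drive you search):

```powershell
$file = Get-ChildItem -Path D:\ -Recurse -Include *MyFile.txt -ErrorAction SilentlyContinue |
    Select-Object -First 1

if ($file) {
    # Change the current location to the folder containing the match
    Set-Location $file.Directory
}
```

Note that Set-Location inside a script only affects the script's own session; run the commands at the prompt if you want your interactive session to end up in that folder.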
UPDATE
Hopefully this will answer the question in your comment
$ExclDrives = ('C')
Get-PSDrive -PSProvider FileSystem |
    Where-Object { $_.Name -in $ExclDrives } |
    ForEach-Object {
        Write-Host -f Green 'Searching ' $_.Root
        Get-ChildItem $_.Root -Include *MyFile.txt -Recurse -ErrorAction SilentlyContinue |
            ForEach-Object {
                # Do whatever you want with the file; reference it using $_
                Write-Host "Found Filename: $($_.Name)`tDirectory: $($_.Directory)" -ForegroundColor Cyan
                explorer.exe /select, $_
                # Output the FileInfo object, in this case to the next
                # command in the pipeline, which is Sort-Object
                $_
            } |
            Sort-Object Length -Descending
    }

How to get a list of files and extended attributes with PowerShell

I am trying to do something in PowerShell but I am struggling with it.
I would like to get a list of all the files in my computer that has an Extended Attribute (EA) with name: '$KERNEL.SMARTLOCKER.ORIGINCLAIM'.
I got some help and have some basic code, but it doesn't work; I don't think it's doing the right thing.
ls C:\ -Recurse -ErrorAction SilentlyContinue | Where-Object {
    $File = Get-NtFile -Path $_.FullName -Win32Path -Access ReadEa -ErrorAction SilentlyContinue
    if ($File) {
        $ExtendedAttributes = $File.GetEa()
        $ExtendedAttributes.Entries | Where-Object { $_.Name -eq '$Kernel.Smartlocker.OriginClaim' }
    }
}
I am using a non-standard PowerShell module that I found here
This module adds a provider and cmdlets to access the NT object manager namespace. It allows me to use Get-NtFile.
It seems that $_.Name is displaying the file name and not the attribute name, at least that's the feeling I have.
Also, I don't know how to send this to a file where I could see the filename, the file path and the ExtendedAttribute Name.
Although I am using this, I don't have any requirement to do so, I just want something that allows me to get the attribute I am looking for.
Anyone can help?
Thanks in advance!
Aganju
I tested the code below with a known entry, '$KERNEL.PURGE.ESBCACHE', and it definitely works, which suggests to me that Mathias' suggestion is correct. This version will also output to a file for you:
Get-ChildItem -Path C:\ -Recurse -ErrorAction SilentlyContinue | ForEach-Object {
    # Capture the current file, because inside the calculated properties
    # below $_ refers to an EA entry, not to the file
    $file = $_
    try {
        (Get-NtFile -Path $file.FullName -Win32Path -Access ReadEa -ErrorAction SilentlyContinue).GetEa().Entries |
            Where-Object Name -eq '$KERNEL.PURGE.ESBCACHE' |
            Select-Object Name,
                @{ Name = 'FilePath'; Expression = { $file.DirectoryName } },
                @{ Name = 'FileName'; Expression = { $file.Name } }
    } catch {}
} | Export-Csv -Path c:\temp\test2.csv -NoTypeInformation
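Before scanning a whole drive, it can help to sanity-check a single file. A minimal sketch, assuming the NtObjectManager module is installed and using an arbitrary example path:

```powershell
Import-Module NtObjectManager

# Open one file with permission to read its extended attributes
$f = Get-NtFile -Path 'C:\Windows\notepad.exe' -Win32Path -Access ReadEa
# List the names of any EA entries the file carries
$f.GetEa().Entries | Select-Object Name
# Release the underlying handle
$f.Dispose()
```

If this prints the EA names you expect (or nothing, for a file without extended attributes), the recursive version should behave the same way per file.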

Get attributes for list of files

I am not a coder, but know enough to do some simple tasks. Using Powershell
I need to get:
folder/subfolder (s)/Filename.txt
Mode
LastWriteTime
Get-ChildItem and FullName work, but the file path is too long.
I tried:
Get-ChildItem -Recurse -Force |foreach-object -process {$_.Fullname,LastWriteTime,Mode} >FullFileName.csv
and a number of other scripts I found online, Including this one
Get-ChildItem "MyFolderDirectory" | Select Name, @{ n = 'Folder'; e = { Convert-Path $_.PSParentPath } },
    @{ n = 'Foldername'; e = { ($_.PSPath -split '[\\]')[-2] } }
I just can't get what I want. This is what I need; there has to be an easy way to do this, but I'm just not skilled enough to figure it out.
-a---- 9/9/2019 9:39AM folder1/folder2/Filename.txt
Does this help you?
Get-ChildItem -Recurse | ForEach-Object { Write-Output "$($_.Mode) $($_.LastWriteTime) $($_.FullName)" }
This will grab the properties for each file or directory returned by Get-ChildItem
I would do something like this:
Get-ChildItem -Recurse -Force | Select-Object Mode, LastWriteTime, FullName
to get the list as an array of objects. That way it is also easy to export the results to a CSV file you can open in Excel, for instance. To do that, simply append
| Export-Csv -Path 'X:\filelist.csv' -NoTypeInformation
to the above line (change X: to an existing drive on your machine, of course).
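Put together, with the same placeholder output path, the whole thing becomes:

```powershell
# List every file and folder under the current directory and
# export mode, last write time, and full path to a CSV file
Get-ChildItem -Recurse -Force |
    Select-Object Mode, LastWriteTime, FullName |
    Export-Csv -Path 'X:\filelist.csv' -NoTypeInformation
```

Opening the resulting CSV in Excel gives one row per file with those three columns.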

Search a specified path for multiple .xml files within the same folder

I am seeking help creating a PowerShell script which will search a specified path for multiple .xml files within the same folder.
The script should provide the full path of the file(s) if found.
The script should also provide a date.
Here's my code:
$Dir = Get-ChildItem C:\windows\system32 -Recurse
#$Dir | Get-Member
$List = $Dir | where {$_.Extension -eq ".xml"}
$List | Format-Table Name
$folder = "C:\Windows\System32"
$results = Get-ChildItem -Path $folder -File -Filter "*.xml" | Select Name, FullName, LastWriteTime
This will return only XML files and display the file name, the full path to the file, and the last time it was written to. The -File switch is only available in PowerShell 4 and up, so if you are doing this on Windows 7 or Windows Server 2008 R2, you will have to make sure you have updated the Windows Management Framework (WMF) to 4.0 or higher. Without -File, the line would look like this:
# PowerShell 2.0
$results = Get-ChildItem -Path $folder -Filter "*.xml" | Where-Object { $_.PSIsContainer -eq $false } | Select Name, FullName, LastWriteTime
I like the Select method mentioned above for the simpler syntax, but if for some reason you just want the file names with their absolute path and without the column header that comes with piping to Select (perhaps because it will be used as input to another script, or piped to another function) you could do the following:
$folder = 'C:\path\to\folder'
Get-ChildItem -Path $folder -Filter *.xml -File -Name | ForEach-Object {
    # -Name emits paths relative to $folder, so join them back onto it
    Join-Path $folder $_
}
I'm not sure if Select lets you leave out the header.
You could also take a look at this answer to give you some more ideas or things to try if you need the results sorted, or the file extension removed:
https://stackoverflow.com/a/31049571/10193624
I was able to make a few changes to export the results to a .txt file, but although it provides results, I only want to isolate the folders that contain multiple .xml files.
$ParentFolder = "C:\software"
$FolderHash = @{}
$Subfolders = Get-ChildItem -Path $ParentFolder
foreach ($EventFolder in $Subfolders) {
    $XMLFiles = Get-ChildItem -Path $EventFolder.FullName -Filter *.xml*
    if ($XMLFiles.Count -gt 1) {
        $FolderHash += @{ $EventFolder.FullName = $EventFolder.LastWriteTime }
    }
}
$FolderHash
Judging from your self-answer you want a list of directories that contain more than one XML file without recursively searching those directories. In that case your code could be simplified to something like this:
Get-ChildItem "${ParentFolder}\*\*.xml" |
    Group-Object Directory |
    Where-Object { $_.Count -ge 2 } |
    Select-Object Name, @{ n = 'LastWriteTime'; e = { (Get-Item $_.Name).LastWriteTime } }

Cannot use Get-ChildItem to list filenames in PowerShell script

In the PowerShell command line I can execute the following:
$list = Get-ChildItem $path -Name
foreach ($k in $list) { Write-Host $k }
And it will list all the filenames in $path
But if I copy and paste the same thing into a script and execute it, the output is blank.
Why would this even happen?
Write-Host writes the string directly to the host (i.e. the console window) and nowhere else.
Just replace Write-Host with Write-Output if you want $k written to STDOUT:
$list = Get-ChildItem $path -Name
foreach ($k in $list) { Write-Output $k }
But in reality, that is unnecessary, Write-Output is called implicitly. You could achieve the same thing with just:
Get-ChildItem $path -Name
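The practical difference shows up when you try to capture the output. A quick sketch:

```powershell
# Write-Host bypasses the output stream, so nothing is captured;
# 'hello' still appears on screen, but $a stays empty
$a = Write-Host 'hello'

# Write-Output (or plain implicit output) goes to the pipeline
# and is captured, so $b ends up containing the string 'hello'
$b = Write-Output 'hello'
```

This is why the foreach loop works interactively (the host is right there) but appears to produce nothing when the script's output is consumed elsewhere.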
I wanted just a list of files sorted by size without directory headers. I used this to accomplish it.
Get-ChildItem foldername -File -Recurse | Select-Object FullName, Length | Sort-Object Length -Descending | Export-Csv C:\export.csv