Out-File: output file path is not accepted - powershell

Still learning and am having a hard time trying to output information to a file: the output file path is not accepted.
My location is PS Cert:\localmachine and here is the entire command:
$cert = Get-ChildItem -Path cert: -Recurse | where { $_.notafter -le (get-date).AddDays(75) -AND $_.notafter -gt (get-date)} | select notafter, issuer, thumbprint, subject | sort-object notafter
$cert | Out-File -FilePath \\ad.dcpds.cpms.osd.mil\WinAdm\Logs\Expiring_Certificates\$hostname.log
The error message I'm getting is:
Out-File : Cannot open file because the current provider (Microsoft.PowerShell.Security\Certificate) cannot open a file.

Based on the comments above, the issue comes from the fact that the current location is somewhere in the certificate provider (cert:).
One possible workaround/solution is to change the current location back to the file provider before writing the file.
$cert = Get-ChildItem -Path cert: -Recurse | where { $_.notafter -le (get-date).AddDays(75) -AND $_.notafter -gt (get-date)} | select notafter, issuer, thumbprint, subject | sort-object notafter
Set-location c:
$cert | out-file -FilePath \\ad.dcpds.cpms.osd.mil\WinAdm\Logs\Expiring_Certificates\$hostname.log
Second solution: use a path that explicitly includes the filesystem provider:
$cert | out-file -FilePath FileSystem::\\ad.dcpds.cpms.osd.mil\WinAdm\Logs\Expiring_Certificates\$hostname.log
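A variation on the first workaround is to use Push-Location / Pop-Location, so the original Cert: location is restored afterwards (a minimal sketch reusing the same UNC path and $hostname variable as above):
Push-Location C:\    # temporarily switch the current location to a filesystem drive
$cert | Out-File -FilePath \\ad.dcpds.cpms.osd.mil\WinAdm\Logs\Expiring_Certificates\$hostname.log
Pop-Location         # return to the previous (Cert:) location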

To complement PoorKenny's effective solutions with background information:
If you use Out-File and the current location is on a drive of a provider OTHER than the filesystem provider:
only drive letter-based paths are recognized as filesystem paths; e.g.:
... | Out-File C:\temp\out.txt # OK, due to using filesystem drive C:
any other path requires the FileSystem:: prefix, notably including rooted paths such as \path\to\... and even UNC paths such as \\server\share\path\to\...; without the prefix, they're interpreted as relative to the current location, whatever its provider, which fails for any provider other than the filesystem provider.
... | Out-File \temp\out.txt # NOT recognized
... | Out-File \\server\share\temp\out.txt # NOT recognized
... | Out-File FileSystem::\temp\out.txt # OK, thanks to 'FileSystem::' prefix
Arguably, given that Out-File only ever creates files, it would make sense to ALWAYS interpret the -FilePath / -LiteralPath arguments as a filesystem path, irrespective of the provider of the current location.
However, the following, from an example that comes with the Out-File help, suggests that the behavior is by design (the (omitted) example invokes Out-File from a current location on the registry provider's drive).
"Because Out-File is not supported by the Windows PowerShell Registry provider, you must specify either the file system drive name, such as c:, or the name of the provider followed by two colons, FileSystem::, in the value of the FilePath parameter."
If anyone knows whether there truly is a good reason not to always default to the filesystem provider's current location, do let us know.
(Can there be additional, alternative filesystem providers?).

Related

Comparing Desktop and Documents with their backups and producing an output.txt file that highlights the different folders and files between them

Can someone please help me? I'm still new to PowerShell. I am comparing my Documents and Desktop folders against their mirrors to check my backup solution, using several different scripts. The one below is meant to compare the Documents/Desktop folders with their mirror folders, tell me exactly which files are 'different' between source and destination, and output the results into the same output.txt file (not sure if it overwrites it). When I do this for my Documents alone it works, but when I try the code for my Desktop it doesn't output anything at all. Any advice?
function Get-Directories ($path)
{
    $PathLength = $path.length
    Get-ChildItem $path -exclude *.pst,*.ost,*.iso,*.lnk | % {
        Add-Member -InputObject $_ -MemberType NoteProperty -Name RelativePath -Value $_.FullName.substring($PathLength+1)
        $_
    }
}
Compare-Object (Get-Directories $Folder3) (Get-Directories $Folder4) -Property RelativePath | Sort RelativePath, Name -desc | Out-File C:\Users\desktop\output.txt
Judging by earlier revisions of your question, your problem wasn't that your code didn't output anything at all, but that the output had empty Name property values.
Thus, the only thing missing from your code was Compare-Object's -PassThru switch:
Compare-Object -PassThru (Get-Directories $Folder3) (Get-Directories $Folder4) -Property RelativePath |
Sort RelativePath, Name -desc |
Out-File C:\Users\desktop\output.txt
Without -PassThru, Compare-Object outputs [pscustomobject] instances that have a .SideIndicator property (to indicate which side a difference object is exclusive to) and only the comparison properties passed to -Property.
That is, in your original attempt the Compare-Object output objects had only .SideIndicator and .RelativePath properties, and none of the other properties of the original [System.IO.FileInfo] instances originating from Get-ChildItem, such as .Name, .LastWriteTime, ...
With -PassThru, the original objects are passed through, decorated with an ETS (Extended Type System) .SideIndicator property (decorated in the same way you added the .RelativePath property), so accessing the .Name property later works as intended.
Note:
Since Out-File then receives the full (and decorated) [System.IO.FileInfo] instances, you may want to limit what properties get written via a Select-Object call beforehand.
Additionally you may choose a structured output format, via Export-Csv for instance, given that the formatting that Out-File applies is meant only for the human observer, not for programmatic processing.
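For instance, a rough sketch combining both suggestions (the property list is illustrative, and the .csv path mirrors the .txt path from the question):
Compare-Object -PassThru (Get-Directories $Folder3) (Get-Directories $Folder4) -Property RelativePath |
    Sort-Object RelativePath, Name -Descending |
    Select-Object SideIndicator, RelativePath, Name, Length, LastWriteTime |
    Export-Csv -Path C:\Users\desktop\output.csv -NoTypeInformation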
$_.FullName.substring($PathLength+1) in your Get-Directories should be $_.FullName.substring($PathLength), otherwise you'll cut off the 1st char.
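A quick illustration of that off-by-one, assuming (as this example does) that $path is passed with a trailing backslash:
$path = 'C:\Users\me\Documents\'                              # hypothetical path ending in a backslash
'C:\Users\me\Documents\file.txt'.Substring($path.Length)      # -> 'file.txt'
'C:\Users\me\Documents\file.txt'.Substring($path.Length + 1)  # -> 'ile.txt' (first character lost)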
since you're not using -Recurse when listing files, you could just use the file name instead of adding on a relative path property:
$folder1 = gci 'C:\Users\username\desktop' -exclude *.pst,*.ost,*.iso,*.lnk
$folder2 = gci 'D:\desktop' -exclude *.pst,*.ost,*.iso,*.lnk
Compare-Object $folder1 $folder2 -Property Name,Length
Name Length SideIndicator
---- ------ -------------
test.txt 174 =>
test.csv 174 <=
# Alternatively, use -PassThru to keep the whole object:
Compare-Object $folder1 $folder2 -Property Name,Length -PassThru | select SideIndicator,FullName,Length,LastWriteTime
SideIndicator FullName Length LastWriteTime
------------- -------- ------ -------------
=> D:\desktop 174 7/14/2021 2:47:09 PM
<= C:\Users\username\desktop 174 7/14/2021 2:47:09 PM
Use Out-File -Append to append output to a file.
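For example, a sketch that appends the comparison above to the question's output.txt instead of overwriting it ($folder1 and $folder2 as defined earlier):
Compare-Object $folder1 $folder2 -Property Name,Length -PassThru |
    Select-Object SideIndicator, FullName, Length, LastWriteTime |
    Out-File C:\Users\desktop\output.txt -Append    # -Append adds to the file rather than replacing it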
As for troubleshooting your current script, try manually checking whether the RelativePath property looks like it's getting set correctly for you:
(Get-Directories $Folder3).RelativePath
(Get-Directories $Folder4).RelativePath
Finally, I recommend using robocopy over powershell for backup stuff like this, since it can use backup privileges (for locked files) and can copy multiple files at a time, but it's personal preference:
robocopy source destination /b /mir /mt /r:0 /w:0
/b - Runs robocopy in backup mode. Will copy everything as long as you are an Administrator
/mir - Mirrors everything from the source to the destination
/mt - Copies with multiple threads (8 by default)
/r:0 - Sets it to not retry a failed file (the default is 1,000,000 retries)
/w:0 - Sets the wait time between retries to 0 seconds (the default is 30 seconds)
source: https://community.spiceworks.com/topic/286190-transfer-user-profiles-using-robocopy

Removing a certificate from the local machine certificate store in powershell?

I am trying to check for, and then remove a certificate if it exists in a user's local machine store. I've tried this:
$certCN = 'test.domain.com'
Set-Location Cert:\LocalMachine\My
$oldCert = Get-ChildItem -Recurse |
    Where-Object { $_.subject -like "CN=$certCN*" }
Remove-Item Cert:\LocalMachine\My\$oldCert -Force
But it is not removing the cert from the store or giving any errors (yes I am running this elevated).
I checked my $oldCert variable to see if it is populated and it is:
PS Cert:\LocalMachine\My> $oldcert
PSParentPath: Microsoft.PowerShell.Security\Certificate::LocalMachine\My
Thumbprint Subject
---------- -------
276B7B87740D5E9595A258060F5CD9CC4190E9E1 CN=test.domain.com, <truncated>
Does anyone know how to accomplish this? I really appreciate it.
The problem you're encountering is the automatic string conversion of the X509Certificate2 object from the Cert:\ drive. When you're appending it to your path as -Path some\path\$myobj, it's implicitly calling ToString on the object. You can observe this by doing "some\path\$myobj" at the console without any other code or by simply calling $myobj.ToString().
Because Remove-Item takes pipeline input by property name, it will automatically pull the path off your object when you pass it over the pipeline, so you can remediate your problem simply as such:
$oldCert | Remove-Item
or
Remove-Item -LiteralPath $oldCert.PSPath
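Putting it together, a sketch of the full lookup-and-removal (reusing $certCN from the question; the subject filter is illustrative):
$certCN = 'test.domain.com'
$oldCert = Get-ChildItem -Path Cert:\LocalMachine\My -Recurse |
    Where-Object { $_.subject -like "CN=$certCN*" }
if ($oldCert) {
    $oldCert | Remove-Item    # pipeline binding resolves each certificate's PSPath for removal
}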

Delete files containing string

How can I delete all files in a directory that contain a string using powershell?
I've tried something like
$list = get-childitem *.milk | select-string -pattern "fRating=2" | Format-Table Path
$list | foreach { rm $_.Path }
And that worked for some files but did not remove everything. I've tried other various things but nothing is working.
I can easily get the list of file names and can create an array containing just the paths using
$lista = @(); foreach ($f in $list) { $lista += $f.Path; }
but can't seem to get any command (del, rm, or Remove-Item) to do anything. Just returns immediately without deleting the files or giving errors.
Thanks
First we can simplify your code as:
Get-ChildItem "*.milk" | Select-String -Pattern "fRating=2" | Select-Object -ExcludeProperty path | Remove-Item -Force -Confirm
The lack of action and errors might be addressable by one of two things. The Force parameter which:
Allows the cmdlet to remove items that cannot otherwise be changed,
such as hidden or read-only files or read-only aliases or variables.
I would also suggest that you run this script as administrator. Depending on where these files are located, you might not have permissions. If this is not the case or it does not work, please include the error you are getting.
I'm going to guess the error is:
remove-item : Cannot remove item C:\temp\somefile.txt: The process cannot access the file 'C:\temp\somefile.txt'
because it is being used by another process.
Update
In testing, I was also getting a similar error. Upon research, it looks like the Select-String cmdlet was holding onto the file, preventing its deletion (an assumption, based on never having seen Get-ChildItem do this before). The solution in that case would be to enclose the first part of the command in parentheses, so it processes all the files before sending them down the pipe.
(Get-ChildItem | Select-String -Pattern "tes" | Select-Object -ExpandProperty path) | Remove-Item -Force -Confirm
Remove -Confirm if you don't need it. It exists as a precaution so that you don't open a new PowerShell session in c:\windows\system32 and paste a remove-item command in there.
Another Update
[ and ] are wildcard characters in PowerShell; to have some cmdlets treat them literally, use -LiteralPath. Also, Select-String can return multiple hits per file, so we should use -Unique.
(Get-ChildItem *.milk | Select-String -Pattern "fRating=2" | Select-Object -ExpandProperty path -Unique) | ForEach-Object{Remove-Item -Force -LiteralPath $_}
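A small illustration of why that matters (hypothetical file name containing wildcard characters):
Set-Content -LiteralPath 'report[1].milk' -Value ''   # create a file whose name contains brackets
Remove-Item -Path 'report[1].milk'                    # '[1]' is treated as a wildcard character class (it would match 'report1.milk'), so this does not remove the file
Remove-Item -LiteralPath 'report[1].milk'             # the name is taken literally, so the file is removed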
Why do you use select-string -pattern "fRating=2"? Do you want to select all files with this name?
I think the Format-Table Path won't work here: the output of Get-ChildItem doesn't have a property called "Path".
Does this snippet work for you?
$list = get-childitem *.milk | Where-Object -FilterScript {$_.Name -match "fRating=2"}
$list | foreach { rm $_.FullName }
The following code gets all files of type *.milk and puts them in $listA, then uses that list to get all the files that contain the string fRating=[01] and stores them in $listB. The files in $listB are deleted, and then the number of files deleted versus the number of files that contained the match is displayed (they should be equal).
sv -name listA -value (Get-ChildItem *.milk)
sv -name listB -value ($listA | Select-String -Pattern "fRating=[01]")
($listB | Select-Object -ExpandProperty path) | ForEach-Object { Remove-Item -Force -LiteralPath $_ }
sv -name FCount -value ((Get-ChildItem *.milk).Count)
Write-Host -NoNewline "Files Deleted $($listA.Count - $FCount)/$($listB.Count)`n"
No need to complicate things:
$sourcePath = "\\path\to\the\file\"
Remove-Item "$sourcePath*whatever*"
I tried the answer above; unfortunately, errors always seemed to come up. However, I managed to put together a solution that gets this done:
Without using Get-ChildItem, you can use Select-String directly to search for files matching a certain string. Yes, this returns filename:count:content ... etc., but internally these are properties you can choose or omit; the one you need is Filename. To get it, pipe the result into Select-Object, choosing Filename from the output.
So, to select all *.MSG files that contain the pattern "Subject: WebServices Restarted", you can do the following:
Select-String -Path .\*.MSG -Pattern 'Subject: WebServices Restarted' -List | Select-Object Filename
Also, to remove these files on the fly, you could pipe into a ForEach statement with the rm command as follows:
Select-String -Path .\*.MSG -Pattern 'Subject: WebServices Restarted' -List | Select-Object Filename | ForEach-Object { rm $_.Filename }
I tried this myself; it works 100%.
I hope this helps

Powershell network drive Get-ChildItem issues

Essentially I'm trying to use PowerShell to find files with certain file extensions on a network drive created on or after June 1st of this year. I thought I could do that with the following statement:
Get-ChildItem NETWORKPATH*. -recurse -include .xlsx | Where-Object { $_.CreationTime -ge "06/01/2014" }
I run into 2 problems:
The command only returns files from the root and one folder on the network drive, there's over 100 folders on this network drive
The command returns 3 out of the 5 files created after 6/1/14 and one created well before my creation time date.
I have access to all of the folders on the network drive. When I run Windows 7 search it finds all of the files. It doesn't matter if I run Powershell as administrator or not. It doesn't matter if I run it from my machine (Windows 7) or from one of our 2008 servers. The network drive I'm trying to search through is on a 2003 file server. What am I doing wrong?
Make sure you add a wildcard to your Include parameter. Also you should never use strings for date comparison. See the example of why not here. Try the following:
$testDate = new-object DateTime (2014,06,01)
Get-ChildItem NETWORKPATH*. -recurse -include *.xlsx | Where-Object { $_.CreationTime -ge $testDate }
Also note that files and folders marked as hidden will not show up unless you add a -force to the get-childitem. Not sure if that is part of the issue or not.
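For instance, a variant of the command above that also includes hidden items (NETWORKPATH*. is the placeholder path from the question):
Get-ChildItem NETWORKPATH*. -Recurse -Force -Include *.xlsx | Where-Object { $_.CreationTime -ge $testDate }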
gci -path PATH -recurse | where {$_.extension -match "xlsx"} was the silver bullet to all of this.
This is what I use.
$Extensions = '*.xlsx','*.csv','*.xls'
$path = 'Network path'
Get-ChildItem "$path" -Include $Extensions -Recurse -Force | where {$_.CreationTime -gt
[datetime]"10/05/2018"} | Select * | Export-Csv -Path C:\TestExcelfiles.csv -
NoTypeInformation | fl * #format-wide

how to make Powershell output file path using mapped drive path instead of true path?

How can I change the path in the result to use the mapped drive instead of the true path? Right now I get something like \\server\data\work\.... I would like to see it as, let's say, K:\work\....
I can't use the mapped drive path for the variable because it does not work with Task Scheduler....
Get-ChildItem -Recurse $source -Filter *.prt | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-6)} | sort LastWriteTime -descending | select name,LastWriteTime,Directory | convertto-html -head $a -body "<H2>FILES LIST FOR PAST 7 DAYS</H2>" | out-file $output\result.htm
The mapped drive letter should work as long as you're running the Powershell script as a user who has the drive mapped. Otherwise, you need to use New-PSDrive to map the drive for the session.
New-PSDrive -Name K -PSProvider FileSystem -Root "\\server\data"
Try creating a PSDrive at the start of your script. Make sure that the account running the script (through Task Scheduler) has the required rights on the share.
New-PSDrive -Name "K" -PSProvider FileSystem -Root "\\server\data"
#.... something something, creating $output variable etc.
Get-ChildItem -Recurse $source -Filter *.prt | Where{$_.LastWriteTime -gt (Get-Date).AddDays(-6)} | sort LastWriteTime -descending | select name,LastWriteTime,Directory | convertto-html -head $a -body "<H2>FILES LIST FOR PAST 7 DAYS</H2>" | out-file $output\result.htm