$PSScriptRoot NULL value - recursive deletion - powershell

I had an issue with one of my scripts that has a few calls to Remove-Item; here are all of them:
Remove-Item -r -force "$path_Docs\*"
Remove-Item -recurse -force ( join-path $PSScriptRoot "\logs\*" ) | Out-Null
Remove-Item -r -force "$MappingFilePath\*
These variables are set to:
$MappingFilePath = ( Join-Path $PSScriptRoot "\Table Mappings\")
$path_Docs = (join-path $settings.files.data_folder "\documents")
Basically what happened is: I started my script, which is a schema migration script and takes around 3 hours for the database I was running it on. I went to lunch, came back, and saw that my script had decided to try to recursively delete the entire C:\ drive. I can't think of any reason why this would have happened other than $PSScriptRoot being NULL, but then again I always Set-Location to the script root directory before running the script, so I am not entirely sure.
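If that guess is right, the culprit would be the string interpolation rather than Join-Path: a null or empty variable inside a double-quoted path collapses the argument to "\*", which is a drive-relative rooted path. A minimal sketch of that suspected failure mode (not a confirmed repro):
# If Join-Path threw earlier, $MappingFilePath stays unset
$MappingFilePath = $null
"$MappingFilePath\*"   # interpolates to just "\*"
# From any location on C:, the rooted path "\*" resolves to C:\*, so
# Remove-Item -r -force "$MappingFilePath\*" would target everything under C:\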
Either way, I have added the following to the top of my script's main and ran it again a few times without issue:
## NOT AGAIN!!!
if ([string]::IsNullOrEmpty($PSScriptRoot)) {
    Set-Output "[!!!] Dodged a bullet on this one" -colour red -logfilepath $log_prerequisites
    break;
}
And changed all Remove-Item calls to have a filter:
Remove-Item -r -force "$MappingFilePath\*.xlsx"
Remove-Item -recurse -force "$PSScriptRoot\logs\*.log" | Out-Null
Is using $PSScriptRoot the best way of getting the script directory, and if not, what other methods are there? And how could the usage of the above commands have resulted in a recursive deletion of C:\?
Duplication Edit: This is less about the usage of $PSScriptRoot and more about how the recursive deletion could have happened.

Related

Remove-Item -Force on NULL FileWatcher file removed 1000 files on my server, is this a PowerShell bug? Or just my bad code?

So, I am developing a script using FileSystemWatcher similar to this one: https://powershell.one/tricks/filesystem/filesystemwatcher
I only use the Created event.
I then run the following code on the files that are "Created."
I ran into a really unexpected error when I ran this code on a file that was already removed by another piece of code. So basically, the "Remove-WrongFileType" function received a file that was NULL, just nothing. And then it just started deleting tons of different files on my server.
I run my script from C:\ and I obviously gave it too-high rights. However, I find it really strange that when $Path is NULL, the script just finds files to remove. I've managed to fix this in my code by checking first whether the path to the file leads to something; however, I want to learn what caused the script to crash this hard, and why Get-ChildItem finds files when $Path is a NULL file. I wonder if this could be some kind of bug in PowerShell? (Most likely not.. but I wonder..)
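For context, the watcher side of that setup might look roughly like this (a sketch based on the linked article; the watched path is an assumption, and Remove-WrongFileType is the function below):
$watcher = New-Object System.IO.FileSystemWatcher "C:\WatchedFolder"   # hypothetical path
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {
    # Runs for every newly created file or folder
    Remove-WrongFileType -Path $Event.SourceEventArgs.FullPath
}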
Function Remove-WrongFileType {
    Param (
        [string]$Path
    )
    $Files = Get-ChildItem -Path $Path -Force -Recurse
    foreach ($file in $Files) {
        if (-not (Assert-LegalFileType -File $file.FullName)) {
            Remove-Item -Path $file.FullName -Force
            Add-ToLog -logString "File $file was removed because of illegal filetype"
        }
    }
}
Function Assert-LegalFileType {
    Param (
        [string]$File
    )
    if (Test-Path -Path $File -PathType Container) {
        return $true
    }
    $fileToCheck = Get-Item -Path $File
    $ExtensionOfFile = $fileToCheck.Extension
    foreach ($type in $AllowedFiles) {
        if ($ExtensionOfFile -match $type) {
            return $true
        }
    }
}
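One way to make the function fail fast instead of silently scanning the wrong place is to make $Path mandatory and verify it exists before calling Get-ChildItem. A sketch of that change (not the original code):
Function Remove-WrongFileType {
    Param (
        # Refuse to bind $null or an empty string at all.
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [string]$Path
    )
    # Bail out if the file was already removed by another event handler.
    if (-not (Test-Path -LiteralPath $Path)) {
        throw "Path '$Path' no longer exists - nothing to scan."
    }
    $Files = Get-ChildItem -LiteralPath $Path -Force -Recurse
    foreach ($file in $Files) {
        if (-not (Assert-LegalFileType -File $file.FullName)) {
            Remove-Item -LiteralPath $file.FullName -Force
            Add-ToLog -logString "File $file was removed because of illegal filetype"
        }
    }
}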
So I looked up what happens when you pass NULL to Get-ChildItem, and apparently it is a known issue: Get-ChildItem -Path $null does not throw an error #8708
A comment describes the same issue:
One of the side effects of this bug/feature could be to accidentally delete your system when you are using the output of this command piped to say a $_.Delete(). That is exactly what happened when I refactored my code to delete previous test runs; so
From:
Get-ChildItem -Path C:\SourceCodeTLM\testRunResults -Include * -File -Recurse | foreach { $_.Delete() }
To:
$testRunResults = "C:\SourceCodeTLM\testRunResults"
Get-ChildItem -Path $testRunResults -Include * -File -Recurse | foreach { $_.Delete() }
and forgot to initialize the variable while doing a debug.
In the worst case, I expected an error but instead, the cmd ran and started deleting my current dir content (Which by default was PS C:\windows\system32>).
Before I could understand what happened and press Ctrl+C, enough files were deleted to corrupt my system. I had to restore, and all of my stuff on my machine was lost. I learned this lesson the hard way, but maybe others don't have to :). Maybe giving an error (when null) or making this parameter mandatory would be better from a risk standpoint :).
So yeah, don't pass null to Get-ChildItem and then force-delete the output with high privileges.
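A cheap guard for that pattern is to check the variable before piping anything into a delete; a sketch, reusing the paths from the quoted comment:
$testRunResults = "C:\SourceCodeTLM\testRunResults"
# An uninitialized ($null) value makes Get-ChildItem fall back to the current
# directory (the behaviour tracked in issue #8708), so refuse to continue.
if ([string]::IsNullOrWhiteSpace($testRunResults) -or -not (Test-Path -LiteralPath $testRunResults)) {
    throw "testRunResults is not set or does not exist."
}
Get-ChildItem -Path $testRunResults -Include * -File -Recurse | foreach { $_.Delete() }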

How to delete all temp files using PowerShell

Does anyone know how to delete all temp files using PowerShell?
Get-ChildItem $env:TEMP\TEMP | Remove-Item -confirm:$false -force -Recurse
I tried this code but it didn't work.
Can you suggest a better way to perform the same task?
If you don't want to see any errors, you could use the -ErrorAction parameter like this:
Remove-Item -Path $env:TEMP\* -Recurse -Force -ErrorAction SilentlyContinue
To empty the TEMP folder while leaving the folder itself in place, you should use this command:
Remove-Item $env:TEMP\* -Recurse
If you don't want to type so much, you can also use a shorter version:
rm $env:TEMP\* -r
Just use this:
Remove-Item -Path $env:TEMP -Recurse -Force
Of course you will get access errors if any of the files you are deleting are actually being used by the system.
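If you want to see which files were skipped rather than hiding the errors, a per-item loop is one option; a sketch:
Get-ChildItem -Path $env:TEMP -Force | ForEach-Object {
    $item = $_
    try {
        Remove-Item -Path $item.FullName -Recurse -Force -ErrorAction Stop
    }
    catch {
        # Files still locked by a running process typically end up here.
        Write-Warning "Skipped '$($item.FullName)': $($_.Exception.Message)"
    }
}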
I'm running PS as Local Admin and ran the following commands:
PS C:\Windows\system32> $tempfolders = @("C:\Windows\Temp\*", "C:\Windows\Prefetch\*", "C:\Documents and Settings\*\Local Settings\temp\*", "C:\Users\*\Appdata\Local\Temp\*")
PS C:\Windows\system32> Remove-Item $tempfolders -force -recurse
works for me :)

Compress-Archive Error: Cannot access the file because it is being used by another process

I would like to zip a path (with a Windows service running inside it).
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath("[DEST_PATH]") -Force
Do you have any idea how to perform the task without this exception?
Copy-Item allows you to access files that are being used by another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included; there is no need to use -Recurse in the first line to do this.
A good method to access files being used by another process is by creating snapshots using Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
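For completeness, the compression call and the shadow-copy cleanup might look like this (a sketch building on the variables above; the destination path is just a placeholder):
# Compress from the snapshot instead of the live folder, then drop the snapshot.
Compress-Archive -Path $snapshotPath -CompressionLevel Optimal -DestinationPath "C:\Temp\backup.zip" -Force
# Remove the shadow copy again once the archive has been written.
$shadowCopy.Delete()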
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
There was a similar requirement where only a few extensions needed to be added to the zip.
With this approach, we can copy all the files, including locked ones, to a temp location, zip them, and then delete the logs.
This is a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddhhmmss
$zipfile = 'C:\Logs\logfiles'+ $filedate +'.zip'
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\CRLogs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force

Copy-Item is not working

I have a script that finds a few files and then copies them. This is part of a psake build script.
The command is:
Get-ChildItem -Path "$sourceFolder" -Include "*.ispac" -Recurse `
| Select-Object -ExpandProperty FullName `
| Copy-Item -Destination "$caseFolder\" -Force -Verbose
When I execute this, I get this for the message:
VERBOSE: Performing the operation "Copy File" on target
"Item: C:\Source\TestSssisOne.ispac
Destination: C:\Destination\TestSssisOne.ispac".
That sure looks like the files were copied. But they aren't. No errors. If I copy this command out to the ISE and set up the variables, it copies with no problem. I also tried to manually copy a single file with explicit paths. Again, in the script it does not copy, but in the PS console or ISE it does.
I have no idea what could be the problem. I've used Copy-Item in psake scripts. In fact, I copied the above code to a later task and it works! In the task where it isn't working I'm calling msbuild to build a solution.
Any insight appreciated!
Modify your code like this:
Get-ChildItem -Path "$sourceFolder" -Include "*.ispac" -Recurse -File | foreach{Copy-Item $_.FullName -Destination (("$caseFolder\") + $_.Name) -Force -Verbose }

Why is PowerShell copying to a random location?

I have the following simple script:
$workingDir = "C:\foo\bar"
$projectsDir = "C:\foo"
Copy-Item -Path "$projectsDir\some subpath\MyFile1.dll" -Destination $workingDir
Copy-Item -Path "$projectsDir\somewhere else\MyFile2.dll" -Destination $workingDir
Copy-Item -Path "$projectsDir\another place\MyFile3.dll" -Destination $workingDir
For some unknown reason, every time I run this script it copies the files to the correct location ($workingDir) and also copies them to $projectsDir\some subpath\something\else. I have to go delete the extra files from the other location every time this script is run.
So far I've tried:
changing variable names
specifying -Destination "$workingDir\MyFile1.dll"
using $null = Copy-Item -Path "...."
I even tried replacing Copy-Item with xcopy.exe
and nothing changes. I put a breakpoint on the first Copy-Item command and looked at the variables - they all looked right. What's going on here?
The only other thing I could think of is to run Copy-Item like this:
Copy-Item -Path $($projectsDir + "\some subpath\MyFile1.dll") -Destination $workingDir
This is how I declare almost all of my variable + something-else scenarios. Since I haven't seen this behavior, I'll go back and test it some more to see what I can find. If I come up with something else, I'll redo my answer.
I rebooted my computer. Problem solved.