Test-Path Remove-Item issue - PowerShell

I have a PowerShell script to install a very large application (15 GB of source media) from a location it has been delivered to on the C: drive.
At the end of the script, to confirm that the software is installed, I run Test-Path against the application's GUID key under the HKLM Microsoft Windows CurrentVersion Uninstall path, and if that succeeds, I clear the source media from the C: drive.
If (Test-Path "HKLM:pathname") { Remove-Item $path -Force -Recurse }
The problem is that the command above works in the PowerShell ISE when run on its own: it sees that the key exists and performs the Remove-Item. When run as a script, or via a deployment mechanism, it does not remove the folder.
I have even gone further and used:
GCI $Path -Recurse | Remove-Item -force -recurse
... to no avail.
Prior to introducing the Test-Path check, I only had Remove-Item $Path -Force -Recurse, and that worked!
So although Test-Path evaluates the condition correctly, it appears to prevent Remove-Item from doing anything. (I wrote to a log file to confirm which branch of the If was taken.)
Any thoughts? Sorry for any typos; I did not copy/paste any part of the script.

If you can't delete the key immediately afterwards, that probably means a write lock has been applied to it by the Test-Path cmdlet.
Try finding out whether a lock exists using the Sysinternals Handle utility, and then release it using the handle.exe -c argument, referencing the hexadecimal handle number.
Make sure the path format used in $path matches the format handle.exe reports.
# Ask handle.exe for any open handles to $path held by the current process
$lock = & handle.exe -nobanner -a -p $PID $path
if (-not ($lock -like '*No matching handles found*')) {
    # Close the first matching handle, identified by its hexadecimal ID
    & handle.exe -nobanner -p $PID -c ($lock[0].Split(':')[0].Trim(' ')) -y
}
This will only work if you have permission to close handles.
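If the delete still silently fails only when deployed, a minimal diagnostic sketch along these lines (the registry path is the same placeholder as above, and the log path is just an example) surfaces the real Remove-Item error instead of swallowing it:
If (Test-Path "HKLM:pathname") {
    try {
        # -ErrorAction Stop turns the failure into a terminating error so the catch block sees it
        Remove-Item $path -Recurse -Force -ErrorAction Stop
    }
    catch {
        # Append the actual failure reason to the same log used to check the If routing
        Add-Content "C:\Windows\Temp\cleanup.log" $_.Exception.Message
    }
}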

Related

Powershell: Setting SASS_BINARY_PATH

I need to set the SASS_BINARY_PATH environment variable to the local file I've downloaded so that I can install node-sass behind a corporate firewall. On the Windows cmd prompt, I just do:
SET SASS_BINARY_PATH=C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node
And the installation works fine since it successfully sets the variable. But when I try doing it via PowerShell, it doesn't work:
$env:SASS_BINARY_PATH="C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node"
I've also tried another way in PowerShell:
[Environment]::SetEnvironmentVariable("SASS_BINARY_PATH", "C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node", "Machine")
Checking in Control Panel shows it successfully added a SASS_BINARY_PATH system variable, but when I try to reinstall node-sass, it fails again.
One of my observations: when I set it the cmd way and then check with the set command, the variable shows up along with the others; but with both PowerShell methods it does not show up. Any ideas?
The error encountered when trying to npm-install node-sass over a corporate firewall is:
Downloading binary from https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node
Cannot download "https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node":
HTTP error 401 Unauthorized
Download win32-x64-48_binding.node manually.
Put it in the C:\Users\<user>\AppData\Roaming\npm-cache\node-sass\4.7.2 folder.
Then run npm install node-sass again.
Here is the PowerShell script #jengfad used based on the above solution, as noted in the comments on that discussion:
$cacheSassPath = $env:APPDATA + '\npm-cache\node-sass'
if ( -Not (Test-Path -Path $cacheSassPath) )
{
    Write-Host "cacheSassPath does not exist"
    New-Item -ItemType Directory -Path $cacheSassPath
    Write-Host "cacheSassPath CREATED"
}

<# Ensure it has no content #>
Get-ChildItem -Path $cacheSassPath -Recurse | ForEach-Object { Remove-Item -Recurse -Path $_.FullName }

<# Copy the local sass binary (~Srt.Web\sass-binary\4.7.2) folder into the cache folder #>
$sassBinaryPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$sassBinaryPath = $sassBinaryPath + "\sass-binary\4.7.2"
Copy-Item -Path $sassBinaryPath -Recurse -Destination $cacheSassPath -Container
Write-Host "node-sass binary file successfully copied!"

PowerShell command to forcefully perform Remove-Item or Move-Item command

I tried the commands below to remove a folder and to move one folder to another:
$string2 = "20*.$e.*"
$path = "C:\Users\Desktop\Powershell_script\$string2"
Remove-Item $path -Recurse -Exclude "api"
Move-Item C:\Users\Desktop\Powershell_script\$string2 C:\Users\Desktop\Powershell_script2\
Both commands above work fine, but when any file is open the command reports an access error, saying the file is being used by another process or similar.
Can the same commands be altered to forcefully perform the operation regardless of whether files are in use or open?
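One common pattern, sketched here with the paths from the question: -Force overrides read-only and hidden attributes, but neither Remove-Item nor Move-Item can touch a file that another process holds open, so the usual workaround is to catch those errors and skip the offending files:
$string2 = "20*.$e.*"
$path = "C:\Users\Desktop\Powershell_script\$string2"

Get-ChildItem $path -Recurse -Exclude "api" | ForEach-Object {
    try {
        # -Force clears read-only/hidden attributes; -ErrorAction Stop turns a
        # locked file into a catchable error instead of a non-terminating one
        Remove-Item $_.FullName -Recurse -Force -ErrorAction Stop
    }
    catch {
        Write-Warning "Skipped (in use): $($_.Exception.Message)"
    }
}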

How to do a data search on remote PCs

I have a script:
get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse > collection.txt
This works great when collecting on a local computer. However, I need to run the same thing on several computers at once. So I tried this in a BAT file:
PSexec @list.txt -u UserID -p Password PowerShell get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse > collection.txt 2>&1
pause
This worked on some remote PCs, but I ran into a couple of problems:
1) The collection.txt file contains all the information with no indication of which piece goes with which computer.
2) When running against a single computer, sometimes it looks like it is running but never finishes, and/or never reports that it has completed or writes to the file.
Is there another way to collect the same data for all users that have logged into each computer? Or am I just not doing it right?
The better approach would be to use PSRemoting rather than PSExec.
$list = "RemoteComputer1","RemoteComputer2"
Invoke-Command -ComputerName $list -ScriptBlock {get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse} | Out-File .\collection.txt
If you need to use PSExec and a BAT file:
PSexec @list.txt -u UserID -p Password PowerShell -command $env:computername; get-childitem c:\users -include *.mov,*.avi,*.asf,*.flv,*.swf,*.mpg,*.mp3,*.mp4,*.wmv,*.wav,*.jpg,*.tif,*.png,*.gif,*.bmp -recurse 2>&1 > collection.txt
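If you stay with the Invoke-Command route, note that every object it returns carries a PSComputerName property, which solves problem 1 (knowing which results came from which machine). A sketch along those lines, keeping the computer list and extensions from the question ($using: needs PowerShell 3.0 or later):
$list = "RemoteComputer1","RemoteComputer2"
$types = "*.mov","*.avi","*.asf","*.flv","*.swf","*.mpg","*.mp3","*.mp4","*.wmv","*.wav","*.jpg","*.tif","*.png","*.gif","*.bmp"

Invoke-Command -ComputerName $list -ScriptBlock {
    Get-ChildItem C:\Users -Include $using:types -Recurse
} |
    # PSComputerName keeps each result tied to the machine it came from
    Select-Object PSComputerName, FullName, Length |
    Export-Csv .\collection.csv -NoTypeInformation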

PowerShell: Using $env:userprofile in an 'IF' statement

I am using PowerShell ISE (I think 4).
I am writing logon scripts to replace the old '*.BAT' files.
I am trying to test for a user-profile condition before 'creating/deleting' certain directories from the desktop.
Example
If(($env:userprofile = "rmullins"))
{
Remove-Item $env:userprofile\Desktop\ITFILES -Recurse -Force
}
So I run the following to see what's going on:
md -Path $env:userprofile\Desktop\ITFILES
The path is created in the following location:
C:\Windows\System32.........
The MD command above works fine until I run that 'IF' statement. I think I might not understand how the $env:userprofile part works.
Any ideas?
On Windows 7:
[PS]> echo $ENV:UserProfile
C:\Users\arco444
This returns the path to the profile directory, so I'd expect a condition that looks only for the username to fail. Also note that a single = inside an If is an assignment in PowerShell, not a comparison (that's -eq), so the original condition overwrites $env:userprofile with "rmullins"; the later md then resolves a relative path against the current directory, which is why the folder ends up under C:\Windows\System32. I'd do a simple match instead:
if ($env:userprofile -imatch "rmullins")
{
Remove-Item $env:userprofile\Desktop\ITFILES -Recurse -Force
}
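Alternatively, since $env:USERNAME holds just the account name, an exact comparison also works (a sketch only, using the username from the question):
# -eq performs the comparison; a bare = would overwrite the variable instead
if ($env:USERNAME -eq "rmullins")
{
    Remove-Item "$env:USERPROFILE\Desktop\ITFILES" -Recurse -Force
}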

Delete directory regardless of 260 char limit

I'm writing a simple script to delete USMT migration folders after a certain amount of days:
## Server List ##
$servers = "Delorean","Adelaide","Brisbane","Melbourne","Newcastle","Perth"

## Number of days (-3 is over three days ago) ##
$days = -3
$timelimit = (Get-Date).AddDays($days)

foreach ($server in $servers)
{
    $deletedusers = @()
    $folders = Get-ChildItem \\$server\USMT$ | where {$_.psiscontainer}
    write-host "Checking server : " $server
    foreach ($folder in $folders)
    {
        If ($folder.LastWriteTime -lt $timelimit -And $folder -ne $null)
        {
            $deletedusers += $folder
            Remove-Item -recurse -force $folder.fullname
        }
    }
    write-host "Users deleted : " $deletedusers
    write-host
}
However I keep hitting the dreaded Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
I've been looking at workarounds and alternatives but they all revolve around me caring what is in the folder.
I was hoping for a more simple solution as I don't really care about the folder contents if it is marked for deletion.
Is there any native Powershell cmdlet other than Remove-Item -recurse that can accomplish what I'm after?
I often have this issue with node projects. They nest their dependencies and once git cloned, it's difficult to delete them. A nice node utility I came across is rimraf.
npm install rimraf -g
rimraf <dir>
Just as CADII said in another answer: Robocopy is able to create paths longer than the limit of 260 characters. Robocopy is also able to delete such paths. You can just mirror some empty folder over your path containing too long names in case you want to delete it.
For example:
robocopy C:\temp\some_empty_dir E:\temp\dir_containing_very_deep_structures /MIR
Here's the Robocopy reference for the parameters and various options.
I've created a PowerShell function that is able to delete a long path (>260) using the mentioned robocopy technique:
function Remove-PathToLongDirectory
{
    Param(
        [string]$directory
    )

    # create a temporary (empty) directory
    $parent = [System.IO.Path]::GetTempPath()
    [string] $name = [System.Guid]::NewGuid()
    $tempDirectory = New-Item -ItemType Directory -Path (Join-Path $parent $name)

    # mirror the empty directory over the target (which empties it),
    # then remove both now-empty directories
    robocopy /MIR $tempDirectory.FullName $directory | out-null
    Remove-Item $directory -Force | out-null
    Remove-Item $tempDirectory -Force | out-null
}
Usage example:
Remove-PathToLongDirectory c:\yourlongPath
This answer on SuperUser solved it for me: https://superuser.com/a/274224/85532
Cmd /C "rmdir /S /Q $myDir"
I learnt a trick a while ago that often works around long file path issues. Apparently some Windows APIs route certain calls through legacy code that can't handle long file names, but if you format your paths in a particular way, the legacy code is avoided: reference the path using the "\\?\" prefix. It should be noted that not all APIs support this, but in this particular case it worked for me; see my example below:
The following example fails:
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
Directory: D:\System Volume Information\dfsr
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a-hs 10/09/2014 11:10 PM 834424 FileIDTable_2
-a-hs 10/09/2014 8:43 PM 3211264 SimilarityTable_2
PS D:\> Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260
characters, and the directory name must be less than 248 characters.
At line:1 char:1
+ Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : WriteError: (D:\System Volume Information\dfsr:String) [Remove-Item], PathTooLongException
    + FullyQualifiedErrorId : RemoveItemIOError,Microsoft.PowerShell.Commands.RemoveItemCommand
PS D:\>
However, prefixing the path with "\\?\" makes the command work successfully:
PS D:\> Remove-Item -Path "\\?\D:\System Volume Information\dfsr" -recurse -force
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
PS D:\>
If you have ruby installed, you can use Fileman:
gem install fileman
Once installed, you can simply run the following in your command prompt:
fm rm your_folder_path
This problem is a real pain in the neck when you're developing in node.js on Windows, so fileman becomes really handy to delete all the garbage once in a while
This is a known limitation of PowerShell. The workaround is to use the dir command from cmd (sorry, but this is true).
http://asysadmin.tumblr.com/post/17654309496/powershell-path-length-limitation
Or, as mentioned in AaronH's answer, use the \\?\ syntax, as in this example that deletes build folders:
dir -Include build -Depth 1 | ForEach-Object { Remove-Item -Recurse -Path "\\?\$($_.FullName)" }
If all you're doing is deleting the files, I use a function to shorten the names, then I delete.
function ConvertTo-ShortNames{
    param ([string]$folder)
    $name = 1
    $items = Get-ChildItem -path $folder
    foreach ($item in $items){
        # rename each item to a short numeric name to shrink the overall path length
        Rename-Item -Path $item.FullName -NewName "$name"
        if ($item.PSIsContainer){
            # rebuild the item's path with its new name and recurse into it
            $parts = $item.FullName.Split("\")
            $folderPath = $parts[0]
            for ($i = 1; $i -lt $parts.Count - 1; $i++){
                $folderPath = $folderPath + "\" + $parts[$i]
            }
            $folderPath = $folderPath + "\$name"
            ConvertTo-ShortNames $folderPath
        }
        $name++
    }
}
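A hypothetical usage, with a placeholder path, would be to shorten the names first and then delete:
ConvertTo-ShortNames -folder "D:\some\very\deep\tree"
Remove-Item "D:\some\very\deep\tree" -Recurse -Force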
I know this is an old question, but I thought I would put this here in case somebody needed it.
There is one workaround that uses Experimental.IO from the Base Class Libraries project. You can find it on PoshCode, or download it from the author's blog. The 260-character limitation comes from .NET, so it's either this, or using tools that do not depend on .NET (like cmd /c dir, as @Bill suggested).
A combination of tools can work best: try doing a dir /x to get the 8.3 short names instead. You could then parse that output into a text file and build a PowerShell script to delete the paths you wrote out; it takes all of a minute. Alternatively, you could rename the 8.3 file name to something shorter and then delete it.
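A sketch of the 8.3 idea without leaving PowerShell, using the Scripting.FileSystemObject COM object to look up the short path (the directory name is a placeholder, and this only helps if 8.3 name generation is enabled on the volume and the shortened path falls under the limit):
# Look up the 8.3 short form of a long directory path, then delete via the short path
$fso = New-Object -ComObject Scripting.FileSystemObject
$shortPath = $fso.GetFolder("D:\some\very\long\directory").ShortPath
Remove-Item $shortPath -Recurse -Force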
For me, Robocopy worked, in three steps:
First create an empty directory, let's say c:\emptydir
ROBOCOPY c:\emptydir c:\directorytodelete /purge
rmdir c:\directorytodelete
This is getting old, but I recently had to work around it again. I ended up using 'subst' as it didn't require any other modules or functions to be available on the PC this was running from, so it's a little more portable.
Basically, find a spare drive letter, 'subst' the long path to that letter, then use that as the base for Get-ChildItem.
The only limitation is that $_.FullName and other properties will report the drive letter as the root path.
It seems to work OK:
$location = "\\path\to\long\"
$driveLetter = ls function:[d-z]: -n | ?{ !(test-path $_) } | random
subst $driveLetter $location
sleep 1
Push-Location $driveLetter -ErrorAction SilentlyContinue
Get-ChildItem -Recurse
subst $driveLetter /D
That command obviously lists files rather than deleting them, but the deletion can be substituted in.
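A sketch of the delete variant under the same assumptions (the UNC path is still a placeholder, and the substituted paths themselves must stay under the limit):
$location = "\\path\to\long\"
$driveLetter = ls function:[d-z]: -n | ?{ !(test-path $_) } | random

subst $driveLetter $location
sleep 1
# remove everything beneath the substituted drive, then drop the mapping
Get-ChildItem "$driveLetter\" -Recurse | Remove-Item -Recurse -Force -ErrorAction SilentlyContinue
subst $driveLetter /D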
PowerShell can easily be used with AlphaFS.dll to do actual file I/O stuff without the PATH TOO LONG hassle.
For example:
Import-Module <path-to-AlphaFS.dll>
[Alphaleonis.Win32.Filesystem.Directory]::Delete($path, $True)
Please see this .NET project on CodePlex: https://alphafs.codeplex.com/
I had the same issue while trying to delete folders on a remote machine.
Nothing helped, but then I found one trick:
# 1: create an empty folder
md ".\Empty" -ErrorAction SilentlyContinue
# 2: mirror the empty folder onto the folder to delete: this empties it completely
robocopy ".\Empty" $foldertodelete /MIR /LOG+:$logname
# 3: delete the now-empty target folder
remove-item $foldertodelete -force
# 4: finally delete the helper Empty folder
remove-item ".\Empty" -force
Works like a charm on local or remote folders (using a UNC path).
Adding to Daniel Lee's solution: when $myDir has spaces in it, cmd sees the path as several arguments split at the spaces and reports FILE NOT FOUND errors. To overcome this, wrap the variable in quotation marks, using the PowerShell escape character (the backtick) to embed them:
PS> cmd.exe /C "rmdir /s /q `"$myDir`""
Just for completeness, I have come across this a few more times and have used a combination of both 'subst' and 'New-PSDrive' to work around it in various situations.
Not exactly a solution, but if anyone is looking for alternatives this might help.
Subst seems very sensitive to which type of program you are using to access the files; sometimes it works and sometimes it doesn't, and it seems to be the same with New-PSDrive.
Anything developed using .NET out of the box will fail with paths that are too long. You will have to move them to 8.3 names, use P/Invoke (Win32) calls, or use Robocopy.