Delete directory regardless of 260 char limit - powershell

I'm writing a simple script to delete USMT migration folders after a certain amount of days:
## Server List ##
$servers = "Delorean","Adelaide","Brisbane","Melbourne","Newcastle","Perth"
## Number of days (-3 is over three days ago) ##
$days = -3
$timelimit = (Get-Date).AddDays($days)
foreach ($server in $servers)
{
    $deletedusers = @()
    $folders = Get-ChildItem \\$server\USMT$ | where {$_.PSIsContainer}
    Write-Host "Checking server : " $server
    foreach ($folder in $folders)
    {
        if ($folder.LastWriteTime -lt $timelimit -and $folder -ne $null)
        {
            $deletedusers += $folder
            Remove-Item -Recurse -Force $folder.FullName
        }
    }
    Write-Host "Users deleted : " $deletedusers
    Write-Host
}
However I keep hitting the dreaded Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
I've been looking at workarounds and alternatives but they all revolve around me caring what is in the folder.
I was hoping for a more simple solution as I don't really care about the folder contents if it is marked for deletion.
Is there any native Powershell cmdlet other than Remove-Item -recurse that can accomplish what I'm after?

I often have this issue with node projects. They nest their dependencies and, once cloned via git, they're difficult to delete. A nice node utility I came across is rimraf.
npm install rimraf -g
rimraf <dir>

Just as CADII said in another answer: Robocopy is able to create paths longer than the 260-character limit, and it is also able to delete such paths. You can just mirror an empty folder over the path containing the too-long names when you want to delete it.
For example:
robocopy C:\temp\some_empty_dir E:\temp\dir_containing_very_deep_structures /MIR
See the Robocopy reference for the parameters and various options.
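Applied to the cleanup loop in the question, the Remove-Item call could be swapped for this mirror step. A minimal sketch, assuming a scratch folder under $env:TEMP (the folder name is made up for illustration):
# create a scratch empty directory once per run
$emptyDir = New-Item -ItemType Directory -Path (Join-Path $env:TEMP ([System.Guid]::NewGuid()))
# mirroring the empty directory over the target empties it regardless of path depth
robocopy $emptyDir.FullName $folder.FullName /MIR | Out-Null
# the target is now empty, so a plain Remove-Item can take it out
Remove-Item $folder.FullName -Force
# remove the scratch directory when done
Remove-Item $emptyDir -Force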

I've created a PowerShell function that is able to delete a long path (>260) using the mentioned robocopy technique:
function Remove-PathToLongDirectory
{
    Param(
        [string]$directory
    )

    # create a temporary (empty) directory
    $parent = [System.IO.Path]::GetTempPath()
    [string] $name = [System.Guid]::NewGuid()
    $tempDirectory = New-Item -ItemType Directory -Path (Join-Path $parent $name)

    # mirror the empty directory over the target to empty it, then remove both shells
    robocopy $tempDirectory.FullName $directory /MIR | Out-Null
    Remove-Item $directory -Force | Out-Null
    Remove-Item $tempDirectory -Force | Out-Null
}
Usage example:
Remove-PathToLongDirectory c:\yourlongPath

This answer on SuperUser solved it for me: https://superuser.com/a/274224/85532
Cmd /C "rmdir /S /Q $myDir"
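In the context of the question's loop, that would look something like this (a sketch; the backtick-escaped quotes keep paths with spaces intact):
cmd /C "rmdir /S /Q `"$($folder.FullName)`""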

I learnt a trick a while ago that often works around long-file-path issues. Apparently, some Windows API functions flow through legacy code that can't handle long file names, but if you format your paths in a particular way, the legacy code is avoided. The trick that solves this problem is to reference paths using the "\\?\" prefix. It should be noted that not all APIs support this, but in this particular case it worked for me; see my example below:
The following example fails:
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
Directory: D:\System Volume Information\dfsr
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a-hs 10/09/2014 11:10 PM 834424 FileIDTable_2
-a-hs 10/09/2014 8:43 PM 3211264 SimilarityTable_2
PS D:\> Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
Remove-Item : The specified path, file name, or both are too long. The fully qualified file name must be less than 260
characters, and the directory name must be less than 248 characters.
At line:1 char:1
+ Remove-Item -Path "D:\System Volume Information\dfsr" -recurse -force
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (D:\System Volume Information\dfsr:String) [Remove-Item], PathTooLongException
+ FullyQualifiedErrorId : RemoveItemIOError,Microsoft.PowerShell.Commands.RemoveItemCommand
PS D:\>
However, prefixing the path with "\\?\" makes the command work successfully:
PS D:\> Remove-Item -Path "\\?\D:\System Volume Information\dfsr" -recurse -force
PS D:\> get-childitem -path "D:\System Volume Information\dfsr" -hidden
PS D:\>
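Since the question deals with UNC paths, note that the prefix form for those is \\?\UNC\server\share rather than plain \\?\. A hedged sketch against the question's variables (per a later answer, this needs PowerShell 5.1 or newer):
# \\?\UNC\ replaces the leading \\ of a normal UNC path
Remove-Item -LiteralPath "\\?\UNC\$server\USMT$\$($folder.Name)" -Recurse -Force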

If you have ruby installed, you can use Fileman:
gem install fileman
Once installed, you can simply run the following in your command prompt:
fm rm your_folder_path
This problem is a real pain in the neck when you're developing in node.js on Windows, so fileman becomes really handy for deleting all the garbage once in a while.

This is a known limitation of PowerShell. The workaround is to use the dir command from cmd (sorry, but this is true).
http://asysadmin.tumblr.com/post/17654309496/powershell-path-length-limitation
Or, as mentioned in AaronH's answer, use the \\?\ syntax, as in this example to delete a build folder:
dir -Include build -Depth 1 | ForEach-Object { Remove-Item -Recurse -LiteralPath "\\?\$($_.FullName)" }

If all you're doing is deleting the files, I use a function to shorten the names, then I delete.
function ConvertTo-ShortNames{
    param ([string]$folder)
    $name = 1
    $items = Get-ChildItem -Path $folder
    foreach ($item in $items){
        Rename-Item -Path $item.FullName -NewName "$name"
        if ($item.PSIsContainer){
            # recurse into the renamed folder: parent path plus the new short name
            $folderPath = Join-Path (Split-Path $item.FullName -Parent) "$name"
            ConvertTo-ShortNames $folderPath
        }
        $name++
    }
}
I know this is an old question, but I thought I would put this here in case somebody needed it.
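Usage might look like this (the path is a placeholder; everything underneath gets renamed, so only point it at folders already marked for deletion):
ConvertTo-ShortNames -folder "\\Delorean\USMT$\olduser"
Remove-Item -Recurse -Force "\\Delorean\USMT$\olduser"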

There is one workaround that uses Experimental.IO from the Base Class Libraries project. You can find it over on PoshCode, or download it from the author's blog. The 260-character limitation is derived from .NET, so it's either this, or using tools that do not depend on .NET (like cmd /c dir, as @Bill suggested).

A combination of tools can work best: try doing a dir /x to get the 8.3 short names instead. You could then parse that output to a text file and build a PowerShell script to delete the paths you out-filed. It takes all of a minute. Alternatively, you could rename the 8.3 name to something shorter and then delete.
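If you would rather not parse dir /x output by hand, the Scripting.FileSystemObject COM object exposes the same 8.3 short path directly. A sketch, assuming 8.3 name generation is enabled on the volume (the folder path is a placeholder):
$fso = New-Object -ComObject Scripting.FileSystemObject
# ShortPath returns the 8.3 form of every segment in the path
$short = $fso.GetFolder('C:\a very long folder name\another one').ShortPath
cmd /c "rmdir /S /Q $short"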

For me, Robocopy worked in steps 1, 2 and 3:
First create an empty directory, let's say c:\emptydir
ROBOCOPY c:\emptydir c:\directorytodelete /purge
rmdir c:\directorytodelete

This is getting old but I recently had to work around it again. I ended up using 'subst' as it didn't require any other modules or functions be available on the PC this was running from. A little more portable.
Basically find a spare drive letter, 'subst' the long path to that letter, then use that as the base for GCI.
Only limitation is that the $_.fullname and other properties will report the drive letter as the root path.
Seems to work ok:
$location = '\\path\to\long\'
$driveLetter = ls function:[d-z]: -n | ?{ !(test-path $_) } | random
subst $driveLetter $location
sleep 1
Push-Location $driveLetter -ErrorAction SilentlyContinue
Get-ChildItem -Recurse
subst $driveLetter /D
The command above obviously lists rather than deletes files, but a delete can be substituted, as sketched below.
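A deletion variant of the same idea might look like this (a sketch; it will still fail if paths below the substituted root exceed the limit on their own):
$location = '\\path\to\long\'
$driveLetter = ls function:[d-z]: -n | ?{ !(test-path $_) } | random
subst $driveLetter $location
sleep 1
# remove each top-level child recursively, then unmap the drive
Get-ChildItem $driveLetter | Remove-Item -Recurse -Force
subst $driveLetter /D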

PowerShell can easily be used with AlphaFS.dll to do actual file I/O stuff
without the PATH TOO LONG hassle.
For example:
Import-Module <path-to-AlphaFS.dll>
[Alphaleonis.Win32.Filesystem.Directory]::Delete($path, $True)
See the project on CodePlex: https://alphafs.codeplex.com/.

I had the same issue while trying to delete folders on a remote machine.
Nothing helped, but I found one trick:
# 1: create an empty folder
md ".\Empty" -ErrorAction SilentlyContinue
# 2: mirror the empty folder onto the folder to delete: this empties it completely
robocopy ".\Empty" $foldertodelete /MIR /LOG+:$logname
# 3: delete the now-empty target folder
Remove-Item $foldertodelete -Force
# 4: delete the helper empty folder
Remove-Item ".\Empty" -Force
Works like a charm on local or remote folders (using a UNC path).

Adding to Daniel Lee's solution: when $myDir has spaces in the middle, rmdir gives FILE NOT FOUND errors because it treats the path as a set of arguments split on the spaces. To overcome this, wrap the variable in quotation marks and use the PowerShell backtick to escape them:
PS> cmd.exe /C "rmdir /s /q `"$myDir`""

Just for completeness, I have come across this a few more times and have used a combination of both 'subst' and 'New-PSDrive' to work around it in various situations.
Not exactly a solution, but if anyone is looking for alternatives this might help.
Subst seems very sensitive to which type of program you are using to access the files; sometimes it works and sometimes it doesn't. The same seems to be true of New-PSDrive.
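For reference, the New-PSDrive flavour of the trick might look like this (a sketch; the drive name and paths are placeholders, and as the next question shows, it does not always dodge the limit):
# map a deep folder as the root of a new PowerShell drive
New-PSDrive -Name "long" -PSProvider FileSystem -Root '\\server\share\some\very\deep\folder' | Out-Null
Remove-Item -Path 'long:\target' -Recurse -Force
Remove-PSDrive -Name "long"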

Anything developed using .NET out of the box will fail with paths that are too long. You will have to move them to 8.3 names, use P/Invoke (Win32) calls, or use Robocopy.

Related

Compress-Archive Error: Cannot access the file because it is being used by another process

I would like to zip a path (with a Windows service running inside).
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-Zip, I don't get any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath "[DEST_PATH]" -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass through all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, therefore I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse in the first line.
A good way to access files being used by another process is to create a snapshot using the Volume Shadow Copy Service.
To do so, you can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
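The tail end could then look like this (a sketch; the destination zip path is made up):
# compress from the read-only snapshot instead of the live folder
Compress-Archive -Path $snapshotPath -DestinationPath "C:\Temp\folder-backup.zip"
# delete the shadow copy once the archive is written
$shadowCopy.Delete()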
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
There was a similar requirement where only a few extensions needed to be added to the zip.
With this approach, we can copy all files, including locked ones, to a temp location, zip them up, and then delete the logs.
It's a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddhhmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'
New-Item -Path "c:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force

Handling Path Too Long Exception with New-PSDrive

I am recursing a deep folder structure in order to retrieve all folder paths like so:
$subFolders = Get-ChildItem $rootFolder -Recurse -Directory -ErrorVariable folderErrors | Select-Object -ExpandProperty FullName
NOTE: $rootFolder in my case is a network share. i.e. "\\server\DeptDir$\somefolder"
The $folderErrors variable is correctly capturing all the PathTooLong exceptions, so I want to create new PSDrives using the long paths in order to recurse them.
So I create a new PSDrive using this cmdlet:
new-psdrive -Name "long1" -PSProvider FileSystem -Root $folderErrors[0].CategoryInfo.TargetName
However, after creating a new PSDrive I am still getting PathTooLong Exceptions.
PS C:\>> cd long1:
PS long1:\>> dir
dir : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:1 char:1
+ dir
+ ~~~
+ CategoryInfo : ReadError: (\\svr01\Dep...\Fibrebond ECO\:String) [Get-ChildItem], PathTooLongException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
I see no other way around this problem. Am I doing something incorrectly? Why is the new PSDrive throwing PathTooLong when I am creating a drive at the location where the path is too long?
Thanks
There is a local policy for this, available since the Windows 10 Anniversary Update.
The requirements are:
Windows Management Framework 5.1
.Net Framework 4.6.2 or more recent
Windows 10 / Windows server 2016 (Build 1607 or newer)
This policy can be enabled using the following snippet.
#GPEdit location: Configuration>Administrative Templates>System>FileSystem
Set-ItemProperty 'HKLM:\System\CurrentControlSet\Control\FileSystem' -Name 'LongPathsEnabled' -value 1
Otherwise, you can actually get to paths longer than 260 characters by making your call to the Unicode version of the Windows API.
There's a catch, though: this works only in PowerShell 5.1 or later.
From there, instead of making your call the standard way:
get-childitem -Path 'C:\Very long path' -Recurse
You will need to use the following prefix:
\\?\
Example
get-childitem -LiteralPath '\\?\C:\Very long path' -Recurse
For UNC paths, this is slightly different: the prefix is \\?\UNC\ instead of \\.
get-childitem -LiteralPath '\\?\UNC\127.0.0.1\c$\Very long path\' -Recurse
Important
When calling the Unicode version via Get-ChildItem, you should use the -LiteralPath parameter instead of -Path.
From Microsoft documentation
-LiteralPath
Specifies a path to one or more locations. Unlike the -Path parameter, the value of the -LiteralPath parameter is used exactly as it is typed. No characters are interpreted as wildcards. If the path includes escape characters, enclose them in single quotation marks. Single quotation marks tell Windows PowerShell not to interpret any characters as escape sequences.
source
Example
(get-childitem -LiteralPath '\\?\UNC\127.0.0.1\This is a folder$' -Recurse) |
ft @{'n'='Path length';'e'={$_.FullName.length}}, FullName
(output screenshot omitted)
Here is the actual functional test I made to create the very long repository, query it to produce the output above, and confirm I could create paths longer than 260 characters and view them.
Function CreateVeryLongPath([String]$Root,[Switch]$IsUNC,$FolderName = 'Dummy Folder',$iterations = 200) {
    $Base = '\\?\'
    if ($IsUNC) {$Base = '\\?\UNC\'}
    $CurrentPath = $Base + $Root + $FolderName + '\'
    For ($i=0;$i -le $iterations;$i++) {
        New-Item -Path $CurrentPath -Force -ItemType Directory | Out-Null
        $CurrentPath = $CurrentPath + $FolderName + '\'
    }
}
Function QueryVeryLongPath([String]$Root,[Switch]$IsUNC) {
    $Base = '\\?\'
    if ($IsUNC) {$Base = '\\?\UNC\';$Root = $Root.Substring(2,$Root.Length -2)}
    $BasePath = $Base + $Root
    Get-ChildItem -LiteralPath $BasePath -Recurse | ft @{'n'='Length';'e'={$_.FullName.Length}},FullName
}
CreateVeryLongPath -Root 'C:\__tmp\' -FolderName 'This is a folder'
QueryVeryLongPath -Root 'C:\__tmp\Dummy Folder11\'
#UNC - tested on a UNC share path
CreateVeryLongPath -Root '\\ServerName\ShareName\' -FolderName 'This is a folder' -IsUNC
QueryVeryLongPath -Root '\\ServerName\ShareName\' -IsUNC
Worth mentioning
During my research I saw people mention using RoboCopy and then parsing its output. I am not particularly fond of this approach, so I won't elaborate on it.
(edit: Years later, I discovered that Robocopy is part of Windows and not some third-party utility. I guess that would be an OK approach too, although I prefer a pure PowerShell solution.)
I also saw AlphaFS mentioned a couple of times; it's a library that also allows you to overcome the 260-character limitation. It is open-sourced on GitHub and there's even a Get-AlphaFSChildItem built on it available on TechNet here (I did not test it, though).
Other references
Long Paths in .Net
Naming Files, Paths, and Namespaces

Copy files in PowerShell too slow

Good day, all. New member here and relatively new to PowerShell so I'm having trouble figuring this one out. I have searched for 2 days now but haven't found anything that quite suits my needs.
I need to copy folders created on the current date to another location using mapped drives. These folders live under 5 other folders, based on language.
Folder1\Folder2\Folder3\Folder4\chs, enu, jpn, kor, tha
The folders to be copied all start with the same letters followed by numbers - abc123456789_111. With the following script, I don't need to worry about folder names because only the folder I need will have the current date.
The folders that the abc* folders live in have about 35k files and over 1500 folders each.
I have gotten all of this to work using Get-ChildItem but it is so slow that the developer could manually copy the files by the time the script completes. Here is my script:
GCI -Path $SrcPath -Recurse |
Where {$_.LastWriteTime -ge (Get-Date).Date} |
Copy -Destination {
if ($_.PSIsContainer) {
Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
} else {
Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
}
} -Force -Recurse
(This only copies to one destination folder at the moment.)
I have also been looking into using cmd /c dir and cmd /c forfiles but haven't been able to work it out. Dir will list the folders but not by date. Forfiles has turned out to be pretty slow, too.
I'm not a developer but I'm trying to learn as much as possible. Any help/suggestions are greatly appreciated.
@BaconBits is right: you have -Recurse on your Copy-Item as well as your Get-ChildItem. This causes a lot of pointless extra copies, which are just overwrites due to your -Force parameter. Change your script to use a foreach loop and drop the -Recurse parameter from Copy-Item:
GCI -Path $SrcPath -Recurse |
Where {$_.LastWriteTime -ge (Get-Date).Date} | % {
    $_ | Copy -Destination {
        if ($_.PSIsContainer) {
            Join-Path $DestPath $_.Parent.FullName.Substring($SrcPath.length)
        } else {
            Join-Path $DestPath $_.FullName.Substring($SrcPath.length)
        }
    } -Force
}

How to limit the files searched by Get-ChildItem (or limit the depth of recursion)?

Background
There is a directory that is automatically populated with MSI files throughout the day. I plan on leveraging Task Scheduler to run the script shown below every 15 minutes. The script will search the directory and copy any new MSIs that have been created in the last 15 minutes to a network share.
Within this folder C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\ there are two other folders: Repackaged and MSI Package. The Repackaged folder does not need to be searched as it does not contain any MSIs. Also, I have found that it needs to be excluded in some way to prevent this error:
Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:32
+$listofFiles=(Get-ChildItem <<<< -Recurse -Path $outputPath -Include "*.msi" -Exclude "*.Context.msi" | where {$_.LastAccessTime -gt $time.AddMinutes($minutes)})
+ CategoryInfo : ReadError: C:\ProgramData\...xcellence\Leg 1:String) [Get-ChildItem], PathTooLongException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Limitations
I am stuck using Powershell v1.0
I have no control over the directory structure of the source location
Updated:
I don't know the app name or the what the time stamp will be. That is something else that is out of my control.
Current plans
I have read about using -Filter and I am aware of filters that are similar to functions but I wasn't able to come up with any ideas of how to use them. My only thought at the moment would be to do something like:
$searchList=Get-ChildItem "all instances of the MSI Package folder"
foreach($folder in $searchList){
$listofFiles=Get-ChildItem "search for *.msi"
foreach($file in $listofFiles){"Logic to copy MSI from source to destination"}
}
However...I thought that there might be a more efficient way of doing this.
Questions
How can I limit depth that Get-ChildItem searches?
How can I limit the Get-ChildItem search to C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\MSI Package
How can I only search folders that have been accessed in the last 15 minutes? I don't want to waste time drilling down into folders when I know MSI has already been copied.
Any additional advice on how to make this script more efficient overall would also be greatly appreciated.
Script
My current script can be found here. I kept getting: "Your post appears to contain code that is not properly formatted as code" and gave up after the fourth time trying to reformat it.
You can try this
dir C:\ProgramData\flx\Output\*\*\*\*\* -filter *.msi
this searches for all .msi files at this level:
C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\Repackaged or 'MSI Package' or whatever other folder is present,
without recursion; this avoids the too-deep folders that give you the error.
Pipe the result to:
Where {$_.LastAccessTime -gt (Get-Date).AddMinutes(-15)} #be sure no action on file is taken before the dir command
or
Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)} #some file can be re-copied maybe
With help from C.B. this is my new search which eliminates the issues I was having.
Changed -Path to C:\ProgramData\flx\Output\*\*\*\* to help limit the depth that was searched.
Used -Filter instead of -Include and put the -Exclude logic into the where clause.
Get-ChildItem -Path C:\ProgramData\flx\Output\*\*\*\* -Filter "*.msi" | where {$_.Name -notlike "*.Context.msi" -and $_.LastAccessTime -gt (Get-Date).AddMinutes(-15)}
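From there, the results can be piped straight to Copy-Item to land the new MSIs on the network share (the share path below is a placeholder):
Get-ChildItem -Path C:\ProgramData\flx\Output\*\*\*\* -Filter "*.msi" |
    where {$_.Name -notlike "*.Context.msi" -and $_.LastAccessTime -gt (Get-Date).AddMinutes(-15)} |
    Copy-Item -Destination "\\server\share\msidrop"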
You can't limit the recursion depth of Get-ChildItem except by not using -Recurse, i.e. Get-ChildItem is either depth 0 or unlimited.
Set up variables for app name and timestamp e.g.:
$appName = "foo"
$timestamp = Get-date -Format HHmmss
Get-ChildItem "C:\ProgramData\flx\Output\${appName}_$timestamp\$appName\MSI Package" -force -r
You can filter the results like so:
Get-ChildItem <path> -R | Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)}

Find out whether a file is a symbolic link in PowerShell

I have a PowerShell script which walks a directory tree, and sometimes I have auxiliary files hardlinked there which should not be processed. Is there an easy way of finding out whether a file (that is, a System.IO.FileInfo) is a hard link or not?
If not, would it be easier with symbolic links (symlinks)?
Try this:
function Test-ReparsePoint([string]$path) {
    $file = Get-Item $path -Force -ea SilentlyContinue
    return [bool]($file.Attributes -band [IO.FileAttributes]::ReparsePoint)
}
It is a pretty minimal implementation, but it should do the trick. Note that this doesn't distinguish between a hard link and a symbolic link. Underneath, they both just take advantage of NTFS reparse points, IIRC.
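Wired into a directory walk, it might be used like this (a sketch; the root path is a placeholder):
# emit only items that are not reparse points (symlinks, junctions)
Get-ChildItem 'C:\data' -Recurse |
    Where-Object { -not (Test-ReparsePoint $_.FullName) } |
    ForEach-Object { $_.FullName }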
If you have Powershell 5+ the following one-liner recursively lists all file hardlinks, directory junctions and symbolic links and their targets starting from d:\Temp\:
dir 'd:\Temp' -recurse -force | ?{$_.LinkType} | select FullName,LinkType,Target
Output:
FullName LinkType Target
-------- -------- ------
D:\Temp\MyJunctionDir Junction {D:\exp\junction_target_dir}
D:\Temp\MySymLinkDir SymbolicLink {D:\exp\symlink_target_dir}
D:\Temp\MyHardLinkFile.txt HardLink {D:\temp\MyHardLinkFile2.txt, D:\exp\hlink_target.xml}
D:\Temp\MyHardLinkFile2.txt HardLink {D:\temp\MyHardLinkFile.txt, D:\exp\hlink_target.xml}
D:\Temp\MySymLinkFile.txt SymbolicLink {D:\exp\symlink_target.xml}
D:\Temp\MySymLinkDir\MySymLinkFile2.txt SymbolicLink {D:\temp\normal file.txt}
If you care about multiple targets for hardlinks use this variation which lists targets tab-separated:
dir 'd:\Temp' -recurse -force | ?{$_.LinkType} | select FullName,LinkType,@{ Name = "Targets"; Expression={$_.Target -join "`t"} }
You may need administrator privileges to run this script on say C:\.
Utilize Where-Object to search for the ReparsePoint file attribute.
Get-ChildItem | Where-Object { $_.Attributes -match "ReparsePoint" }
For those that want to check if a resource is a hardlink or symlink:
(Get-Item ".\some_resource").LinkType -eq "HardLink"
(Get-Item ".\some_resource").LinkType -eq "SymbolicLink"
My results on Vista, using Keith Hill's powershell script to test symlinks and hardlinks:
c:\markus\other>mklink symlink.doc \temp\2006rsltns.doc
symbolic link created for symlink.doc <<===>> \temp\2006rsltns.doc
c:\markus\other>fsutil hardlink create HARDLINK.doc \temp\2006rsltns.doc
Hardlink created for c:\markus\other\HARDLINK.doc <<===>> c:\temp\2006rsltns.doc
c:\markus\other>dir
Volume in drive C has no label.
Volume Serial Number is C8BC-2EBD
Directory of c:\markus\other
02/12/2010 05:21 PM <DIR> .
02/12/2010 05:21 PM <DIR> ..
01/10/2006 06:12 PM 25,088 HARDLINK.doc
02/12/2010 05:21 PM <SYMLINK> symlink.doc [\temp\2006rsltns.doc]
2 File(s) 25,088 bytes
2 Dir(s) 6,805,803,008 bytes free
c:\markus\other>powershell \script\IsSymLink.ps1 HARDLINK.doc
False
c:\markus\other>powershell \script\IsSymLink.ps1 symlink.doc
True
It shows that symlinks are reparse points, and have the ReparsePoint FileAttribute bit set, while hardlinks do not.
Here is a one-liner that checks one file, $FilePath, and reports whether it is a symlink or not; it works for files and directories:
if((Get-ItemProperty $FilePath).LinkType){"symboliclink"}else{"normal path"}
Just to add my own two cents: this is a one-liner function which works perfectly fine for me:
Function Test-Symlink($Path){
    ((Get-Item $Path).Attributes.ToString() -match "ReparsePoint")
}
The following PowerShell script will list all the files in a directory, or in directories with the -recurse switch. It lists the name of each file, whether it is a regular file or a hardlinked file, and its size, separated by colons.
It must be run from the PowerShell command line. It doesn't matter which directory you run it from, as that is set in the script.
It uses the fsutil utility shipped with Windows, runs it against each file with the hardlink and list switches, and counts the lines of output: if the count is two or greater, the file is hardlinked.
You can of course change the directory the search starts from by changing the c:\windows\system in the command. Also, the script simply writes the results to a file, c:\hardlinks.txt. You can change the name, or simply delete everything from the > character on and it will output to the screen.
Get-ChildItem -path C:\Windows\system -file -recurse -force |
foreach-object {
if ((fsutil hardlink list $_.fullname).count -ge 2) {
$_.PSChildname + ":Hardlinked:" + $_.Length
} else {
$_.PSChildname + ":RegularFile:" + $_.Length
}
} > c:\hardlinks.txt