I'm looking at doing a recursive Get-ChildItem -r to get LastWriteTime and Length, and to group by extension for a count.
I get a bunch of errors, e.g., 'Get-ChildItem : Could not find item C:\Pics & Videos\Thumbs.db'.
I was thinking some folder or file names had special characters in them. I was able to encapsulate the paths in quotes to correct some of the erroring files, but not all.
[System.IO.File]::Exists("C:\Pics & Videos\Thumbs.db") gave me a True, but
Get-ChildItem "C:\Pics & Videos\Thumbs.db" gave me the error.
I'm going to look at [System.IO.FileInfo], but I wonder if anyone can answer why I get these errors using Get-ChildItem (aka ls)?
Thanks
I may have found what I was looking for, with $path set to the full path of the starting folder I want to recursively get file info from:
[IO.Directory]::EnumerateFileSystemEntries($path,"*.*","AllDirectories") |
ForEach { [System.IO.FileInfo]"$_" }
Other suggestions that might be faster are welcome. I'm looking at millions of files across 4,500 folders. Get-ChildItem was only able to get 60% of the files, with the other 40% coming back as errors without values. This is for one department, and there are several.
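For the extension summary itself (count, total size, and newest write time per extension), here is a minimal sketch building on the same enumerator; note that EnumerateFiles will still throw if it hits an unreadable folder mid-walk, so treat it as a starting point rather than a hardened version:
$path = "C:\Pics & Videos"   # starting folder; adjust as needed
[IO.Directory]::EnumerateFiles($path, "*", "AllDirectories") |
    ForEach-Object { [IO.FileInfo]$_ } |
    Group-Object Extension |
    Select-Object Name, Count,
        @{Name="Bytes"; Expression={($_.Group | Measure-Object Length -Sum).Sum}},
        @{Name="NewestWrite"; Expression={($_.Group | Measure-Object LastWriteTime -Maximum).Maximum}} |
    Sort-Object Count -Descending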
Tested: Get-ChildItem vs EnumerateFiles vs Explorer vs TreeSize
$path = "P:\Proxy Server Files\Dept1\sxs\"
The first choice was slow, and I get errors, so I added the error count as a rough guess:
$error.clear()
(get-ChildItem $path -r -ErrorAction SilentlyContinue).count
1333
$error.count
256
The second choice was much faster but returned fewer items.
$error.clear()
([IO.Directory]::EnumerateFileSystemEntries($path,"*.*","AllDirectories")).count
1229
$error.count
0
Trying to look at only files recursively, I again get errors, so I added the error count as a guess:
$error.clear()
(get-ChildItem $path -r -file).count
558
$error.count
256
Looking at just files, I get a much lower number than expected.
([IO.Directory]::EnumerateFileSystemEntries($path,"*.*","AllDirectories") | ForEach { [System.IO.FileInfo]"$_" }| Where Mode -NotMatch "d").count
108
I tried another method, but got the same result.
([IO.Directory]::EnumerateFiles($path,"*.*","AllDirectories")| ForEach { [System.IO.FileInfo]"$_" }| Where Mode -NotMatch "d").count
108
From Windows Explorer I see 37 files and 80 folders.
TreeSize.exe shows 1175 files in 775 folders.
I'm not sure which count to believe. Admin rights were used to get all counts.
Any ideas why so many different results?
Thumbs.db is (typically) a hidden file. By default Get-ChildItem doesn't look for hidden files. Pass -Force (-Hidden shows only hidden items):
PS> get-childitem .\Thumbs.db
Get-ChildItem: Could not find item C:\[...]\Thumbs.db.
PS> get-childitem .\Thumbs.db -Force
Directory: C:\[...]
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a-h- 24/12/2016 11:17 13824 Thumbs.db
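Applied to the original recursive listing, that would be something like:
Get-ChildItem "C:\Pics & Videos" -Recurse -Force -ErrorAction SilentlyContinue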
This was answered in question 9508375.
Get-ChildItem -LiteralPath
This handles the problem of special characters in the name: -LiteralPath uses the path exactly as typed, with no wildcard interpretation.
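For example, square brackets are wildcard metacharacters that -Path tries to interpret, while -LiteralPath takes the name verbatim (the path here is just an illustration):
PS> Get-ChildItem -Path "C:\Pics\file[1].jpg"        # [1] is treated as a wildcard character set
PS> Get-ChildItem -LiteralPath "C:\Pics\file[1].jpg" # the name is matched exactly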
I have been trying to create/modify a PowerShell script that automates finding all files larger than 1GB across multiple servers, excluding .ldf and .mdf files.
I have found a script, but it only looks at the current C: drive, and although I've been trying to modify it, I have been unsuccessful.
I'm unsure how to modify this to fit finding multiple servers.
gci -r | sort -descending -property length | select -first 10 name, @{Name="Gigabytes";Expression={[Math]::Round($_.length / 1GB, 2)}}
Please help.
Complete Script:
$size=1GB
$path="C:\"
$omit="*.mdf,*.ldf"
Get-ChildItem -Path $path -Exclude $omit -Recurse -OutBuffer 1000|
where {($_.Length -gt $size)}|Select Name, Directory, Length
Sample Output:
Name Directory Length
---- --------- ------
CAP2015-07-29 21-07-08-71.avi C:\ 1216624984
CAP2015-07-29 21-08-17-48.avi C:\Movies 1205696024
Explanation of Script:
Variable for controlling search size. Can be KB, MB, GB
$size=1GB
Variable to set base path to search from
$path="C:\"
Variable to set list of excluded extensions
$omit="*.mdf","*.ldf"
Searches through all items under $path recursively and returns only files over the size set by $size, omitting files matched by $omit. Note that -Exclude expects an array of patterns, which is why $omit is a list of two strings rather than one comma-joined string.
Get-ChildItem -Path $path -Exclude $omit -Recurse -OutBuffer 1000|
where {($_.Length -gt $size)}|Select Name, Directory, Length
NOTE: The -OutBuffer parameter controls how many items are gathered before continuing. Managing this parameter correctly can greatly increase the speed with which a command completes. It belongs to a group of parameters called "CommonParameters"; knowing what these are and how they work is invaluable. See the Microsoft Docs page about_CommonParameters.
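For the multiple-server part of the question, one possible extension is to loop over the administrative C$ shares; this is a sketch only, with hypothetical server names, and it assumes you have admin rights on each target:
$size = 1GB
$omit = "*.mdf","*.ldf"
$servers = "SERVER01","SERVER02"   # hypothetical names; substitute your own
foreach ($server in $servers) {
    # requires admin rights and a reachable C$ admin share on each target
    Get-ChildItem -Path "\\$server\C$\" -Exclude $omit -Recurse -ErrorAction SilentlyContinue |
        where { $_.Length -gt $size } |
        Select @{Name="Server";Expression={$server}}, Name, Directory, Length
}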
I'm a blind user and I keep having to go find what's breaking a few people's computers and it's getting annoying. They always click on "install this Active-X" or "download the free video player now" and I have to then dig through everything.
I whipped up a PowerShell script to search C:\ for files with a write time within the last 5 minutes (for testing purposes); the Get-ChildItem part works. Now I just want to get a list of file paths to make my life easier, but I am missing something.
Here's what I have so far:
cd c:\
$fileizer = Get-ChildItem -Path . -exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse| ? {$_.LastWriteTime -gt (Get-Date).AddMinutes(-5)}
echo $fileizer
Here are the results if I just do the Get-ChildItem part of it:
PS C:\Users\tryso> c:\bin\hours.ps1
Directory: C:\bin
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 8/1/2017 2:44 PM 169 hours.ps1
PS C:\>
Obviously I am going to narrow down the path to something more specific than just C:\, like C:\Windows\Temp and C:\Users\ and the like; I'm just wondering how to parse everything to give me just a list of files and their paths.
I'd also like to point out that 5 minutes old is dumb, yes I know. I just did that to make it scream through my C:\ drive, because you'd be amazed at how many files in C:\ have a write time within the last half hour LoL.
Ultimately I'd like to figure out how to find new files as opposed to recent write times if that's possible.
Sorry if my query is lame or a repeat, the only close examples I have found don't work for me for some reason and I'm pretty new at PS scripting - but it's getting pretty addicting and awesome LoL.
Thanks a million for any help!
Ryan
The Select-Object cmdlet can pull out the information you're looking for. Often you will want to know more than one piece of info about your results, so pulling a single property with dot notation isn't going to be the most useful approach.
Try something like this to see the full path, size and last modified timestamp:
Get-ChildItem -Path $path -Exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse | Where-Object {$_.LastWriteTime -gt (Get-Date).AddMinutes(-5)} | Select-Object FullName, Length, LastWriteTime
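For the follow-up about finding genuinely new files rather than recently modified ones, the same pattern works with the CreationTime property in place of LastWriteTime:
Get-ChildItem -Path $path -Exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse | Where-Object {$_.CreationTime -gt (Get-Date).AddMinutes(-5)} | Select-Object FullName, Length, CreationTime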
Background
There is a directory that is automatically populated with MSI files throughout the day. I plan on leveraging Task Scheduler to run the script shown below every 15 minutes. The script will search the directory and copy any new MSIs that have been created in the last 15 minutes to a network share.
Within this folder C:\ProgramData\flx\Output\<APP-NAME>\_<TIME_STAMP>\<APP-NAME>\ there are two other folders: Repackaged and MSI Package. The Repackaged folder does not need to be searched as it does not contain any MSIs. Also I have found that it needs to be excluded in some way to prevent this error:
Get-ChildItem : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:32
+$listofFiles=(Get-ChildItem <<<< -Recurse -Path $outputPath -Include "*.msi" -Exclude "*.Context.msi" | where {$_.LastAccessTime -gt $time.AddMinutes($minutes)})
+ CategoryInfo : ReadError: C:\ProgramData\...xcellence\Leg 1:String) [Get-ChildItem], PathTooLongException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Limitations
I am stuck using Powershell v1.0
I have no control over the directory structure of the source location
Updated:
I don't know the app name or what the time stamp will be. That is something else that is out of my control.
Current plans
I have read about using -Filter, and I am aware of filter functions, but I wasn't able to come up with any ideas for how to use them. My only thought at the moment would be to do something like the pseudocode below:
$searchList = Get-ChildItem "all instances of the MSI Package folder"
foreach ($folder in $searchList) {
    $listofFiles = Get-ChildItem "search for *.msi"
    foreach ($file in $listofFiles) { "Logic to copy MSI from source to destination" }
}
However...I thought that there might be a more efficient way of doing this.
Questions
How can I limit depth that Get-ChildItem searches?
How can I limit the Get-ChildItem search to C:\ProgramData\flx\Output\<APP-NAME>_<TIME_STAMP>\<APP-NAME>\MSI Package?
How can I only search folders that have been accessed in the last 15 minutes? I don't want to waste time drilling down into folders when I know MSI has already been copied.
Any additional advice on how to make this script more efficient overall would also be greatly appreciated.
Script
My current script can be found here. I kept getting: "Your post appears to contain code that is not properly formatted as code" and gave up after the fourth time trying to reformat it.
You can try this:
dir C:\ProgramData\flx\Output\*\*\*\*\* -filter *.msi
This searches for all .msi files at this level:
C:\ProgramData\flx\Output\<APP-NAME>\_<TIME_STAMP>\<APP-NAME>\Repackaged or 'MSI Package' or whatever other folder is present,
without recursion, which avoids the too-deep folders that give you the error.
Pipe the result to:
Where {$_.LastAccessTime -gt (Get-Date).AddMinutes(-15)} #be sure no action on file is taken before the dir command
or
Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)} #some files may get re-copied
With help from C.B. this is my new search which eliminates the issues I was having.
Changed -Path to C:\ProgramData\flx\Output\*\*\*\* to help limit the depth that was searched.
Used -Filter instead of -Include and put the -Exclude logic into the where clause.
Get-ChildItem -Path C:\ProgramData\flx\Output\*\*\*\* -Filter "*.msi" | where {$_.Name -notlike "*.Context.msi" -and $_.LastAccessTime -gt (Get-Date).AddMinutes(-15)}
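The copy step itself could then be piped onto the end; the destination share here is hypothetical:
Get-ChildItem -Path C:\ProgramData\flx\Output\*\*\*\* -Filter "*.msi" | where {$_.Name -notlike "*.Context.msi" -and $_.LastAccessTime -gt (Get-Date).AddMinutes(-15)} | Copy-Item -Destination "\\server\share\msi-drop" # hypothetical destination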
You can't limit the recursion depth of Get-ChildItem except by not using -Recurse, i.e. Get-ChildItem is either depth 0 or unlimited. (Later versions of PowerShell added a -Depth parameter to Get-ChildItem, but nothing like it exists in v1.0.)
Set up variables for app name and timestamp e.g.:
$appName = "foo"
$timestamp = Get-date -Format HHmmss
Get-ChildItem "C:\ProgramData\flx\Output\${appName}_$timestamp\$appName\MSI Package" -force -r
You can filter the results like so:
Get-ChildItem <path> -R | Where {$_.LastWriteTime -gt (Get-Date).AddMinutes(-15)}
I'm trying to build a function that will show me all paths where a certain filename is located. The function would take one parameter: the file name.
The result would be either a list of all paths, or a message saying there's no such file on the system.
I'm new to Powershell, and I'm not getting the syntax just yet.
I've tried this:
Get-ChildItem -Path -Include notepad.exe
But that threw an error message. I'm currently trying:
$i="notepad.exe"
Foreach ($i in Get-ChildItem c:\ -Recurse){echo -Path}
Started that now, it's still running, don't know what'll happen, really.
EDIT: echo'd an enormous amount of lines that just say "-Path"...
Can anybody help with this problem? I'm running Powershell 1.0 by the way.
So, to explain what I wish to see when executing this command, here is an example of what I expect after looking for *.txt:
C:/foo.txt
C:/A/foobar.txt
C:/A1/foo.txt
And so on, listing the path to all .txt files on my harddrive. Only the paths, one per line, no extra info needed.
EDIT2:
I've done it. I'm gonna leave this question up for those who may look for this in the future.
The function I used was this (this specific example will hand you a list of all .zip files on your hard drive; edit where needed):
Get-ChildItem -Path c:\ -Include "*.zip" -Recurse -Force -Name > c:\listOfPaths.txt
This created a file called listOfPaths.txt in my C:\ folder, containing a list of all occurrences of any file ending with .zip in all subfolders of my hard drive.
The "c:\" bit isn't included (the -Name switch outputs paths relative to the search root), but I don't mind.
EDIT3:
Thanks capar for a more complete version.
Here is capar's code (or how I got it to work, since Get-Children doesn't work in 1.0):
Get-ChildItem -Path c:\ -Recurse *.txt | Select-Object -Property FullName
Since it's Friday night, I decided to play with PowerShell to see if I can help :)
This comes pretty close to what you are asking for I think:
Get-ChildItem -Path c:\ -Recurse *.txt | Select-Object -Property FullName
If it helps, this command will list the properties of any object that will be returned by Get-ChildItem:
Get-ChildItem | Get-Member
ls c:\ -r | ? {$_.name -eq "notepad.exe"}
Get-Children is not recognized in Powershell V3 either. It would be great if someone removed that bad example.
As a warning to anyone searching for files: recursing through all of C:\ on today's hard drives will take a long time. You are well advised to narrow your search as much as you can. Since your folder structure might include spaces or special characters, wrap paths in typewriter-quote (") or apostrophe (') delimiters.
$mylistoffiles = Get-ChildItem -Path 'C:\Windows\Setup\Scripts' -Recurse *.cmd | Select-Object -Property FullName
$mylistoffiles
I have a PowerShell script that walks a directory tree, and sometimes there are auxiliary files hardlinked there that should not be processed. Is there an easy way of finding out whether a file (that is, a System.IO.FileInfo) is a hard link or not?
If not, would it be easier with symbolic links (symlinks)?
Try this:
function Test-ReparsePoint([string]$path) {
$file = Get-Item $path -Force -ea SilentlyContinue
return [bool]($file.Attributes -band [IO.FileAttributes]::ReparsePoint)
}
It is a pretty minimal implementation, but it should do the trick. Note that this detects NTFS reparse points, which covers symbolic links and directory junctions; hard links do not use reparse points, so this check won't flag them.
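Usage, assuming the function above has been loaded (C:\Documents and Settings is a junction on Vista and later, so it makes a handy positive test):
PS> Test-ReparsePoint "C:\Documents and Settings"
True
PS> Test-ReparsePoint "C:\Windows\notepad.exe"
False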
If you have Powershell 5+ the following one-liner recursively lists all file hardlinks, directory junctions and symbolic links and their targets starting from d:\Temp\:
dir 'd:\Temp' -recurse -force | ?{$_.LinkType} | select FullName,LinkType,Target
Output:
FullName LinkType Target
-------- -------- ------
D:\Temp\MyJunctionDir Junction {D:\exp\junction_target_dir}
D:\Temp\MySymLinkDir SymbolicLink {D:\exp\symlink_target_dir}
D:\Temp\MyHardLinkFile.txt HardLink {D:\temp\MyHardLinkFile2.txt, D:\exp\hlink_target.xml}
D:\Temp\MyHardLinkFile2.txt HardLink {D:\temp\MyHardLinkFile.txt, D:\exp\hlink_target.xml}
D:\Temp\MySymLinkFile.txt SymbolicLink {D:\exp\symlink_target.xml}
D:\Temp\MySymLinkDir\MySymLinkFile2.txt SymbolicLink {D:\temp\normal file.txt}
If you care about multiple targets for hard links, use this variation, which lists the targets tab-separated:
dir 'd:\Temp' -recurse -force | ?{$_.LinkType} | select FullName,LinkType,@{ Name = "Targets"; Expression = {$_.Target -join "`t"} }
You may need administrator privileges to run this script on say C:\.
Utilize Where-Object to search for the ReparsePoint file attribute.
Get-ChildItem | Where-Object { $_.Attributes -match "ReparsePoint" }
For those that want to check if a resource is a hardlink or symlink:
(Get-Item ".\some_resource").LinkType -eq "HardLink"
(Get-Item ".\some_resource").LinkType -eq "SymbolicLink"
My results on Vista, using Keith Hill's powershell script to test symlinks and hardlinks:
c:\markus\other>mklink symlink.doc \temp\2006rsltns.doc
symbolic link created for symlink.doc <<===>> \temp\2006rsltns.doc
c:\markus\other>fsutil hardlink create HARDLINK.doc \temp\2006rsltns.doc
Hardlink created for c:\markus\other\HARDLINK.doc <<===>> c:\temp\2006rsltns.doc
c:\markus\other>dir
Volume in drive C has no label.
Volume Serial Number is C8BC-2EBD
Directory of c:\markus\other
02/12/2010 05:21 PM <DIR> .
02/12/2010 05:21 PM <DIR> ..
01/10/2006 06:12 PM 25,088 HARDLINK.doc
02/12/2010 05:21 PM <SYMLINK> symlink.doc [\temp\2006rsltns.doc]
2 File(s) 25,088 bytes
2 Dir(s) 6,805,803,008 bytes free
c:\markus\other>powershell \script\IsSymLink.ps1 HARDLINK.doc
False
c:\markus\other>powershell \script\IsSymLink.ps1 symlink.doc
True
It shows that symlinks are reparse points, and have the ReparsePoint FileAttribute bit set, while hardlinks do not.
Here is a one-liner that checks one file, $FilePath, and reports whether it is a symlink or not; it works for files and directories. (Strictly speaking, a non-empty LinkType indicates any kind of link, including hard links and junctions, not only symbolic links.)
if((Get-ItemProperty $FilePath).LinkType){"symboliclink"}else{"normal path"}
Just to add my own two cents, this is a one-liner function which works perfectly fine for me:
Function Test-Symlink($Path){
((Get-Item $Path).Attributes.ToString() -match "ReparsePoint")
}
The following PowerShell script lists all the files in a directory tree, recursing into subdirectories via the -Recurse switch. For each file it lists the name, whether it is a regular file or a hardlinked file, and the size, separated by colons.
It must be run from the PowerShell command line. It doesn't matter which directory you run it from as that is set in the script.
It uses the fsutil utility shipped with Windows, running it against each file with the hardlink list subcommand and counting the lines of output. If there are two or more, the file is hardlinked.
You can of course change the directory the search starts from by changing the c:\windows\system in the command. Also, the script simply writes the results to a file, c:\hardlinks.txt. You can change the name or simply delete everything from the > character on and it will output to the screen.
Get-ChildItem -Path C:\Windows\system -File -Recurse -Force |
ForEach-Object {
    # fsutil hardlink list prints every path that refers to the file's data;
    # two or more lines means the file has additional hard links
    if ((fsutil hardlink list $_.FullName).Count -ge 2) {
        $_.PSChildName + ":Hardlinked:" + $_.Length
    } else {
        $_.PSChildName + ":RegularFile:" + $_.Length
    }
} > c:\hardlinks.txt