I'm able to get the file name, number of rows in the file and file size, but unable to get the file's full path.
$measure = Get-Content C:\Users\Documents\Daily_files_YYYY-MM-DD.txt | Measure-Object
$lines = $measure.Count
echo "line count is: ${lines}"
Get-ChildItem C:\Users\Documents\ -Recurse |
? {! $_.PSIsContainer} |
Select-Object Name, @{Name='Size'; Expression={([string]([int]($_.Length / 1KB))) + " KB"}}
How can I get the output in the below format?
File Name : Daily_files_2019-01-10.txt
Path : C:\Users\Documents\
Line count is: 27723
File Size : 23 KB or MB or GB
Use Get-Content and expand the ReadCount property to get the number of lines. The full path to the directory of a file is stored in its DirectoryName property.
Get-ChildItem 'C:\Users\Documents' -Recurse |
Where-Object {-not $_.PSIsContainer} |
Select-Object @{n='File Name';e={$_.Name}}, @{n='Path';e={$_.DirectoryName}},
              @{n='Line count';e={Get-Content $_.FullName | Select-Object -Expand ReadCount -Last 1}},
              @{n='File Size';e={$_.Length}}
I would not recommend doing calculations with the file size or converting it to a string unless the value is meant solely for (human-readable) output.
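If the value is meant only for display, the conversion can be done at the point of output, for example (a minimal sketch using the file from the question):
$file = Get-Item 'C:\Users\Documents\Daily_files_2019-01-10.txt'
"File Size : {0:N0} KB" -f ($file.Length / 1KB)   # format only when printing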
Note that PowerShell v3 introduced the -File and -Directory switches for Get-ChildItem, so with a recent enough version you don't need the extra Where-Object pipeline step (use -File here).
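For example, on v3 or later the pipeline above can be written as:
Get-ChildItem 'C:\Users\Documents' -Recurse -File |
Select-Object @{n='File Name';e={$_.Name}}, @{n='Path';e={$_.DirectoryName}},
              @{n='Line count';e={Get-Content $_.FullName | Select-Object -Expand ReadCount -Last 1}},
              @{n='File Size';e={$_.Length}}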
I spent quite some time searching for a solution to my problem, but found nothing. I have one single folder with mostly .html files, and I frequently need to search for the files that contain certain strings. I need the search result to display just the file name (as the file will only be in that one folder) and the file's last write time, and the list needs to be sorted by last write time. This code works perfectly for finding the correct files:
Get-ChildItem -Filter *.html -Recurse | Select-String -pattern "keyWord string" | group path | select name
The problem with it is that it displays the entire path of the file (which is not needed), it does not show the last write time, and it is not sorted by the last write time.
I also have this code
Get-ChildItem -Attributes !Directory *.html | Sort-Object -Descending -Property LastWriteTime | Select-Object Name, LastWriteTime
That code prints everything exactly as I want to see it, but it prints all the file names from the folder instead of printing only the files that I need to find with a specific string in them.
Since you are only using Select-String to determine whether the text exists in each file, move it inside a Where-Object filter and use the -Quiet parameter so that it returns true or false. Then sort and select the properties you want.
Get-ChildItem -Filter *.html |
Where-Object { $_ | Select-String -Pattern 'keyWord string' -Quiet } |
Sort-Object LastWriteTime |
Select-Object Name, LastWriteTime
For multiple patterns, one way to do it is like this:
Get-ChildItem -Filter *.html |
Where-Object {
($_ | Select-String -Pattern 'keyWord string' -Quiet) -and
($_ | Select-String -Pattern 'keyWord string #2' -Quiet)
} |
Sort-Object LastWriteTime |
Select-Object Name, LastWriteTime
And another way, using Select-String with multiple patterns, which may be a bit faster:
$patterns = 'keyword 1', 'keyword 2', 'keyword 3'
Get-ChildItem -Filter *.html |
Where-Object {
($_ | Select-String -Pattern $patterns | Select-Object -Unique Pattern ).Count -eq $patterns.Count
} |
Sort-Object LastWriteTime |
Select-Object Name, LastWriteTime
If you don't care about it being a bit redundant, you can run Get-ChildItem on the results after searching:
Get-ChildItem -Filter *.html -Attributes !Directory -Recurse | Select-String -Pattern "keyWord string" | group path | foreach {Get-ChildItem $_.Name } | Sort-Object -Descending LastWriteTime | Select Name,LastWriteTime
After Select-String you are working with the attributes of its result objects instead of the original files, so we take those results and pass them back into the Get-ChildItem command to retrieve the file attributes instead.
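An equivalent variation (my own sketch, not part of the answer above): the MatchInfo objects that Select-String emits carry the file path in their Path property, so you can skip the grouping and feed each path straight to Get-Item:
Get-ChildItem -Filter *.html -Recurse |
Select-String -Pattern 'keyWord string' -List |
ForEach-Object { Get-Item $_.Path } |
Sort-Object -Descending LastWriteTime |
Select-Object Name, LastWriteTime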
I am trying to pick the right file using file name(timestamp appended in the file name).
I have 3 files: text.041922.061512, text.041922.063016, text.041922.064212. I need to pick text.041922.064212 because it was created last; the date and time are in the file name itself. How do I achieve this using PowerShell?
Thanks in advance. I would really appreciate it.
My script is this:
Get-ChildItem -Path "c:/demo | Sort-Object { [DateTime]::ParseExact($_.BaseName.Substring(7,13).Replace('.',' '), "MMddyy hhmmss",$null) } | Select-Object -First 1 | Copy-Item -Destination "E:/test/"
Your file names are missing their extension, but assuming the extension doesn't contain any digits, you could use -replace '\D+' to remove all non-digit characters from the file names, and then the format for ParseExact could be MMddyyHHmmss.
If the files actually don't have an extension, use $_.Name instead of $_.BaseName.
Get-ChildItem -Path "c:/demo" | Sort-Object {
[DateTime]::ParseExact(($_.BaseName -replace '\D+'), 'MMddyyHHmmss', $null)
} -Descending | Select-Object -First 1 | Copy-Item -Destination "E:/test/"
Here is an example that you can use for testing:
[System.IO.FileInfo[]]('text.041922.061512', 'text.041922.063016', 'text.041922.064212') | Sort-Object {
[DateTime]::ParseExact(($_.Name -replace '\D+'), 'MMddyyHHmmss', $null)
} -Descending | Select-Object -Expand Name -First 1
# Returns: text.041922.064212
#Get-ChildItem gets the items from the path where the script is run;
#only files with the extension .jpg are fetched
Get-ChildItem -Path *.jpg |
#using Where-Object with the Length property of the file
#to fetch only those files which are greater than 10000 bytes in size
Where-Object {$_.length -gt 10000} |
#sorting the files by length using Sort-Object
Sort-Object -Property length |
#formatting the output to only the name and length of the files
Format-Table -Property name, length |
#writing the output to a text file Output.txt using Out-File
Out-File -FilePath .\Output.txt
I already did, but I need another script, as mentioned in the heading.
If I understood correctly, you want to collect files from several folders and then generate your report. There are multiple ways to do it.
Example 1 (Get-ChildItem can take a list of folders)
$MyFolders = 'C:\Temp\','D:\Docs\','\\server01\fileShare2\'
#Get-ChildItem gets the items from the several folders;
#only files with the extension .jpg are fetched
Get-ChildItem -Path $MyFolders -Filter *.jpg |
#using Where-Object with the Length property of the file
#to fetch only those files which are greater than 10000 bytes in size
Where-Object {$_.length -gt 10000} |
#sorting the files by length using Sort-Object
Sort-Object -Property length |
#formatting the output to only the name and length of the files
Format-Table -Property name, length |
#writing the output to a text file Output.txt using Out-File
Out-File -FilePath .\Output.txt
Example 2 (run several Get-ChildItem commands, combine the results into a single array, and then generate the report)
$MyFolders = 'C:\Temp\','D:\Docs\','\\server01\fileShare2\'
#run one Get-ChildItem per folder and combine the results;
#only files with the extension .jpg are fetched
(Get-ChildItem -Path 'C:\Temp\' -Filter *.jpg) + (Get-ChildItem -Path 'D:\Docs\' -Filter *.jpg) |
#using Where-Object with the Length property of the file
#to fetch only those files which are greater than 10000 bytes in size
Where-Object {$_.length -gt 10000} |
#sorting the files by length using Sort-Object
Sort-Object -Property length |
#formatting the output to only the name and length of the files
Format-Table -Property name, length |
#writing the output to a text file Output.txt using Out-File
Out-File -FilePath .\Output.txt
There are more ways to achieve the same results, such as using expressions $(...), arrays @(...), and scriptblocks &{...}.
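For instance, a variant of Example 2 using the @(...) array subexpression (the folder paths are just placeholders):
$allFiles = @(
    Get-ChildItem -Path 'C:\Temp\' -Filter *.jpg
    Get-ChildItem -Path 'D:\Docs\' -Filter *.jpg
)
$allFiles |
Where-Object {$_.length -gt 10000} |
Sort-Object -Property length |
Format-Table -Property name, length |
Out-File -FilePath .\Output.txt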
Please consider the following directory tree:
root
    dir1
        dir11
            x.L01 12kb
            x.L02 10kb
        dir12
            dir122
                a.jpg 5kb
                b.xls 3kb
                c.bmp 3kb
    dir2
        a.L01 100kb
        a.L02 200kb
        a.L03 50kb
    dir3
        dir31
    dir4
There are 3 possible cases:
a (sub)dir is empty; root/dir3/dir31 and root/dir4
a (sub)dir contains (only) L0x files, where x is a number; root/dir1/dir11 and root/dir2
a (sub)dir has files, but not of the L0x-kind
The desired output is a custom directory listing with 3 columns:
filepath
filesize
lefcount (see below)
The logic is as follows:
if a (sub)dir is empty, do not list the dir
if a (sub)dir contains (only) L0x files, only list the first one (root/dir1/dir11/x.L01) but count the number and total file size of all the L0x files
if a (sub)dir has other files, list the dir, but count the number and total file size of all files
So the example output would be:
path                     size    count
----                     ----    -----
root/dir1/dir11/x.L01    22kb    2
root/dir1/dir12/dir122   11kb    3
root/dir2/a.L01          350kb   3
I'm just beginning PowerShell, and have come up with the following, which is not much, but (a) am I going in the right direction? and (b) how do I proceed from here?
Get-ChildItem "C:\root" -Recurse |
Foreach-Object {
If ($_.PSIsContainer) {
Get-ChildItem $_.fullname |
Foreach-Object {
Write-Host $_.fullname
}
}
}
Any help would be greatly appreciated!
This can evolve as your needs change. This will create the desired output as a custom object that you can manipulate and export as required.
$rootPath = "c:\temp"
Get-ChildItem $rootPath -Recurse |
    Where-Object {$_.PSIsContainer} |
    Where-Object {(Get-ChildItem $_.FullName | Where-Object {!$_.PSIsContainer} | Measure-Object | Select-Object -ExpandProperty Count) -gt 0} |
    ForEach-Object {
        $files = Get-ChildItem $_.FullName
        $props = @{
            Path = $_.FullName
            Size = "{0:N0} KB" -f (($files | Where-Object {!$_.PSIsContainer} | Measure-Object -Sum Length | Select-Object -ExpandProperty Sum) / 1024)
            Count = $files | Measure-Object | Select-Object -ExpandProperty Count
        }
        If($files.Extension -match "L\d\d"){
            # These are special files and we are assuming they are alone in the directory
            # Change the path
            $props.Path = $files | Where-Object {!$_.PSIsContainer} | Select-Object -First 1 | Select-Object -ExpandProperty FullName
        }
        New-Object -TypeName PSCustomObject -Property $props
    } | Select Path,Size,Count
Get all the folders and files recursively under $rootPath. Filter out the files themselves and any empty folders, judging each folder by its immediate contents. Then build a custom object with all the requested details. If it turns out L0x files are present, update the path with the first one found.
Currently, when L0x files are present, I assume they are the only files in the directory. If need be we can add a check to confirm.
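If that confirmation is wanted, a check along these lines could be added (my sketch, not part of the code above; the folder path is illustrative):
$files = Get-ChildItem 'C:\root\dir1\dir11' | Where-Object {!$_.PSIsContainer}
#$true only when every file in the folder has an L0x-style extension
$onlyL0x = @($files | Where-Object {$_.Extension -notmatch '^\.L\d\d$'}).Count -eq 0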
Sample Output
Path Size Count
---- ---- -----
C:\temp\win64 1,092 KB 2
C:\temp\Empy\Stuff\New Text Document - Copy.L01 0 KB 2
I have a file directory which contains approx. 600 employee image files which have been copied from an alternative source.
The filename format is:
xxxxxx_123456_123_20141212.jpg
When the employee image file is updated it just creates another file in the same location and only the datetime changes at the end.
I need to be able to identify the most recent file; however, I first need to establish which files are 'duplicated'.
My initial thoughts were to try and match the first 14 characters and, if they matched, work out the recent modified date and then delete the older file.
This requires PowerShell version 3.
$Path = 'C:\Users\madtomvane\Documents\PowerShellTest'
#Get the files #Group them by name #Select the most recent file
$FilesToKeep = Get-ChildItem $Path -Recurse -File | Group-Object -Property {$_.Name[0..14]} | ForEach-Object {$_.Group | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1}
#Get the files #Group them by name #Where there is more than one file in the group #Select the old ones
$FilesToRemove = Get-ChildItem $Path -Recurse -File | Group-Object -Property {$_.Name[0..14]} | Where-Object {$_.Group.Count -gt 1} | ForEach-Object {$_.Group | Sort-Object -Property LastWriteTime -Descending | Select-Object -Skip 1}
$FilesToRemove | Remove-Item
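If you'd rather preview the deletions before committing to them, Remove-Item supports -WhatIf (a safety suggestion, not part of the original answer):
$FilesToRemove | Remove-Item -WhatIf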