I'm trying to combine System.IO.FileInfo objects (from distinct Get-ChildItem calls). I've found working solutions (e.g. using a PowerShell array) in this question:
Combine the results of two distinct Get-ChildItem calls into single variable to do the same processing on them
$target = "C:\example"
$Files = @( Get-ChildItem -LiteralPath $target -Force -Attributes !D )
$Files += @( Get-ChildItem -LiteralPath $target -Force -Attributes !D ) # for demo & simplicity, I'm using the same path here
$Files | Write-Host # here the same entries are duplicated
However, the same entries from the System.IO.FileInfo objects are duplicated in the resulting array. I'm wondering if there is an elegant way to combine the objects while removing the duplicates?
PS: Files are "duplicated" if they have the same ".FullName".
$Files = @($Files | Sort-Object -Property FullName -Unique)
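For completeness, here is the question's demo with the deduplication applied; Sort-Object -Unique compares the property you name, so exactly one FileInfo per distinct FullName is kept:
$target = "C:\example"
$Files = @( Get-ChildItem -LiteralPath $target -Force -Attributes !D )
$Files += @( Get-ChildItem -LiteralPath $target -Force -Attributes !D ) # every entry now appears twice
$Files = @($Files | Sort-Object -Property FullName -Unique) # deduplicate by FullName
$Files | Write-Host # each file is listed exactly once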
I am trying to build a PowerShell script that can search for files with similar names inside of a folder.
The files will always have a similar name template:
filename(C).TIF
filename(M).TIF
filename(Y).TIF
filename(K).TIF
All I need to do is to extract the "filename", check if there are 4 similar ones (C,M,Y,K) and use it as a variable to move those files.
$files = Get-ChildItem -Path "E:\test" -Filter "*.TIF" |
    ForEach-Object {$_.BaseName} | Sort-Object -Unique
$files = $files -replace ".{3}$"
$names = (Get-Unique -InputObject $files)
$names
The result looks like this:
jobname
jobname
jobname
jobname
test
test
test
test
But I need to sort by unique and count them, maybe, before action.
You definitely want the Group-Object cmdlet!
As the name suggests, it... groups objects, based on some common property:
$filesByCommonBaseName = Get-ChildItem -Path "E:\test" -Filter "*.TIF" | Group-Object { $_.BaseName -replace '.{3}$' }
Now that you have them grouped correctly, you can start operating on them as such:
foreach($entry in $filesByCommonBaseName){
    Write-Host "Operating on files starting with $($entry.Name)"
    if($entry.Count -eq 4){
        # we found four matches, move them!
        $entry.Group | Move-Item -Destination $destinationDir -Force
    }
}
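If you want to inspect or count the groups before acting on them, each object emitted by Group-Object exposes Name (the computed key), Count, and Group (the matching files). For the sample files above, the groups would look roughly like this:
$filesByCommonBaseName | Select-Object Name, Count
# Name    Count
# ----    -----
# jobname     4
# test        4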
I have the following PowerShell script that renames files from one location to another with a sequential filename. Ultimately, these file changes need to be mapped, as in original -> new. Currently I just have a Write-Host cmdlet: I copy the console output into a txt file and run it through a Python script I wrote to spit out the original and renamed files into an Excel file. I was wondering if there is an easier way to do this in the initial PS script. Even something tab-delimited would be easy to copy-paste into an Excel file.
Set-Location -Path "C:\Users\mmcintyre\Desktop\grail_new"
$destLoc = "C:\Users\mmcintyre\Desktop\renamed"
$countRef = [ref] 0
Get-ChildItem -Filter *.pdf -Recurse |
    Copy-Item -WhatIf -Destination { '{0}\{1}.pdf' -f $destLoc, ++$countRef.Value }
Any help would be greatly appreciated.
Edit: I am currently using PS 2.0.
The following outputs old-name/new-name pairs to a TSV file in addition to performing the copy operation (PSv3+ syntax):
$countRef = [ref] 0
Get-ChildItem -Filter *.pdf -Recurse | ForEach-Object {
    $newFullName = '{0}\{1}.pdf' -f $destLoc, ++$countRef.Value
    Copy-Item -WhatIf -LiteralPath $_.FullName -Destination $newFullName
    [pscustomobject] @{
        Old = $_.FullName
        New = $newFullName
    }
} | Export-Csv -Delimiter "`t" NameMappings.tsv
This creates a TSV (tab-separated values) file with columns named Old and New that contain the old and new full filenames, respectively.
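For illustration, the first few lines of NameMappings.tsv would look something like this (paths are hypothetical; note that in Windows PowerShell, Export-Csv also prepends a #TYPE header line unless you pass -NoTypeInformation):
"Old"	"New"
"C:\Users\mmcintyre\Desktop\grail_new\a.pdf"	"C:\Users\mmcintyre\Desktop\renamed\1.pdf"
"C:\Users\mmcintyre\Desktop\grail_new\scans\b.pdf"	"C:\Users\mmcintyre\Desktop\renamed\2.pdf"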
PSv2: The [pscustomobject] @{ ... } syntactic sugar for creating custom objects from hashtables is not available in v2, so New-Object must be used:
$countRef = [ref] 0
Get-ChildItem -Filter *.pdf -Recurse | ForEach-Object {
    $newFullName = '{0}\{1}.pdf' -f $destLoc, ++$countRef.Value
    Copy-Item -WhatIf -LiteralPath $_.FullName -Destination $newFullName
    New-Object PSCustomObject -Property @{
        Old = $_.FullName
        New = $newFullName
    }
} | Export-Csv -Delimiter "`t" NameMappings.tsv
Caveat: -Property accepts a hashtable[1], which means that its key ordering is not guaranteed, so the ordering of properties of the resulting object will typically not reflect the input order; in this case it just happens to do so.
If the resulting property order is undesired, you have two options:
Slow, but convenient: insert a Select-Object call with the properties in the desired order (e.g., Select-Object Old, New).
More cumbersome: construct the object empty at first (New-Object PSCustomObject), then attach properties one by one in the desired order with individual Add-Member calls.
[1] The PSv3+ [pscustomobject] @{ ... } syntax is seemingly also hashtable-based, but it is parsed in a way that preserves the key order; i.e., as if you had implicitly used [ordered] @{ ... }.
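To illustrate the footnote (PSv3+; the property order of the plain-hashtable variant may vary):
# Plain hashtable: property order is not guaranteed
New-Object PSCustomObject -Property @{ Old = 'a.pdf'; New = '1.pdf' }
# [ordered] preserves insertion order, and [pscustomobject] @{ ... } behaves the same way
New-Object PSCustomObject -Property ([ordered] @{ Old = 'a.pdf'; New = '1.pdf' })
[pscustomobject] @{ Old = 'a.pdf'; New = '1.pdf' }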
As part of a PowerShell script, I want to generate a list of subfolders of two different folders. I approached this by calling Get-ChildItem twice, using Select-Object to transform the paths, and trying to combine the results. However, the combining step is where I got stuck. I have tried this:
$cur = Get-Location
$mainDirs = Get-ChildItem -Directory -Name | Select-Object {"$cur\$_"}
$appDirs = Get-ChildItem -Directory -Name Applications\Programs |
    Select-Object {"$cur\Applications\Programs\$_"}
$dirs = $mainDirs,$appDirs #Doesn't work!
But $dirs ends up consisting of the entries from $mainDirs followed by as many null items as there are items in $appDirs.
How can I combine these in PowerShell?
Edit: The output of mainDirs[0]:
"$cur\$_"
---------
D:\somefolder\somesubfolder
The output of appDirs[0]:
"$cur\Applications\Programs\$_"
-------------------------------
D:\somefolder\Applications\Programs\othersubfolder
Get-ChildItem accepts a string array as input. Simply pass both folders whose subfolders you want listed as an array. Expand the FullName property to get the paths of subfolders:
$folders = '.', '.\Applications\Programs'
$dirs = Get-ChildItem $folders -Directory | Select-Object -Expand FullName
If you want the relative rather than the absolute path, remove the current directory from the beginning of the path strings:
$pattern = '^{0}\\' -f [regex]::Escape($PWD.Path)
$folders = '.', '.\Applications\Programs'
$dirs = Get-ChildItem $folders -Directory |
    ForEach-Object { $_.FullName -replace $pattern }
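For example, with the sample paths from the question ($PWD being D:\somefolder), the pattern and the rewrite work out as follows:
$pattern = '^{0}\\' -f [regex]::Escape('D:\somefolder')  # -> '^D:\\somefolder\\'
'D:\somefolder\Applications\Programs\othersubfolder' -replace $pattern
# -> Applications\Programs\othersubfolder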
I am attempting to delete all directories, sub-directories, and the files contained in them based on a filter that specifies the required directory/sub-directory name.
For example, if I have c:\Test\A\B.doc, c:\Test\B\A\C.doc, and c:\Test\B\A.doc and my filter specifies all directories named 'A', I would expect the remaining folders and files to be c:\Test, c:\Test\B and c:\Test\B\A.doc respectively.
I am trying to do this in PowerShell and am not familiar with it.
The following 2 examples will delete all of the directories that match my specified filter, but they delete files that match the filter as well.
$source = "C:\Powershell_Test" #location of directory to search
$strings = #("A")
cd ($source);
Get-ChildItem -Include ($strings) -Recurse -Force | Remove-Item -Force –Recurse
and
Remove-Item -Path C:\Powershell_Test -Filter A
I would use something like this:
$source = 'C:\root\folder'
$names = #('A')
Get-ChildItem $source -Recurse -Force |
    Where-Object { $_.PSIsContainer -and $names -contains $_.Name } |
    Sort-Object FullName -Descending |
    Remove-Item -Recurse -Force
The Where-Object clause restricts the output from Get-ChildItem to just folders whose names are present in the array $names. Sorting the remaining items by their full name in descending order ensures that child folders get deleted before their parent. That way you avoid errors from attempting to delete a folder that had already been deleted by a prior recursive delete operation.
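A quick string example shows the effect of the descending sort; nested matches sort ahead of their ancestors, so they are removed first (paths are illustrative):
'C:\Test\A', 'C:\Test\A\B\A', 'C:\Test\B\A' | Sort-Object -Descending
# C:\Test\B\A
# C:\Test\A\B\A   <- the nested A is deleted before ...
# C:\Test\A       <- ... its ancestor A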
If you have PowerShell v3 or newer you can do all filtering directly with Get-ChildItem:
Get-ChildItem $source -Directory -Include $names -Recurse -Force |
    Sort-Object FullName -Descending |
    Remove-Item -Recurse -Force
I don't think you can do it quite that simply. This gets the list of directories, breaks each path into its constituent parts, and checks whether the filter matches one of those parts. If so, it removes the whole path.
It adds a little caution: the Test-Path check handles the case where a directory was already deleted because of nesting, and -Confirm helps ensure that if there's a bug here you have a chance to verify the behavior.
$source = "C:\Powershell_Test" #location of directory to search
$filter = "A"
Get-ChildItem -Directory -Recurse $source |
    Where-Object { $_.FullName.Split([IO.Path]::DirectorySeparatorChar).Contains($filter) } |
    ForEach-Object { $_.FullName; if (Test-Path $_) { Remove-Item $_ -Recurse -Force -Confirm } }
I need to filter 30 types of files out of 2 TB of data. I want to assign the output of Get-ChildItem to a variable and then pass it to -Filter for the different file types, but it doesn't work. Any idea why? The concern is that running Get-ChildItem 30 times would slow down the system, so I only want to run it once, save the output in a variable, and use that for filtering the different types of files.
$a = Get-ChildItem -Recurse c:\work
$a -filter .prt | .............
Any suggestion please!
You can use Where-Object and filter on the Name property. You can't use -Filter on a variable.
Also, you need a wildcard to filter all files ending with ".prt" (if that's what you're trying to do).
$a = Get-ChildItem -Recurse c:\work
$a | Where-Object {$_.Name -like '*.prt'} | ...
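Since the goal is 30 file types from a single scan, you can keep reusing the cached listing and match on the Extension property instead of Name; a sketch (the extension list is illustrative):
$a = Get-ChildItem -Recurse c:\work

$extensions = '.prt', '.doc', '.pdf'  # extend to all 30 types
foreach ($ext in $extensions) {
    # string -eq is case-insensitive in PowerShell, so .PRT matches as well
    $filesOfType = $a | Where-Object { $_.Extension -eq $ext }
    # ... process $filesOfType here
}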
It is usually best to include only the data you're after rather than filtering it from a larger collection; consider using the -Include parameter to achieve this. For example:
#Non-recursive
$fileTypes = @("*.prt","*.doc")
$a = Get-ChildItem c:\work\* -include $fileTypes
#Recursive
$fileTypes = @("*.prt","*.doc")
$a = Get-ChildItem c:\work\* -include $fileTypes -recurse
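Note the trailing * on the path: in Windows PowerShell, -Include only takes effect when the path contains a wildcard or -Recurse is used, which is why the examples use c:\work\* rather than c:\work.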