Delete multiple files using powershell and a .txt file - powershell

I have a .txt with the names of over 1000 files I want to delete. My .txt file does not have file paths. The files I want to delete are spread throughout multiple folders but they are all in the same drive. Is there any way to use powershell or command prompt to search for all files within my drive with the same name as what is listed in my .txt file and delete them?

Assuming your PowerShell prompt is currently set to the root location from which you want to start your search and the list file is in the same directory:
gc .\MyListOfFilesIWantToDelete.txt | %{gci $_ -Recurse | Remove-Item -WhatIf}
Note: you'll have to remove the -WhatIf switch to actually delete the files.
Or, let's say your file is somewhere other than where you have PowerShell open (e.g. ~/Documents), and you want to scan your D: drive. This should work:
gc .\MyListOfFilesIWantToDelete.txt | %{gci D:\ -Filter $_ -Recurse -ErrorAction SilentlyContinue | Remove-Item -WhatIf}
Note that I added -ErrorAction SilentlyContinue; without it you'll see a lot of red for folders in your search path that you don't have access to.
Alternatively, you can load up a variable with your list of files..
$thesefiles = gc .\mylistoffilesiwanttodelete.txt
.. and use the Remove-Item cmdlet directly..
Remove-Item -Path D:\Folder -Include $thesefiles -Recurse -WhatIf
or in one swoop without loading a variable:
Remove-Item -Path D:\Folder -Include $(gc .\mylistoffilesiwanttodelete.txt) -Recurse -WhatIf
Again, I'm using -WhatIf for testing. Also, I've noticed different behaviors from Get-ChildItem on different versions of PowerShell in the past; I tested these with 5.1.
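With 1000+ names, running a recursive search per name can be slow; scanning the drive once and filtering against the list in memory is usually faster. A minimal sketch of that idea, assuming the .txt contains bare file names (one per line) and that D:\ is the drive to scan:
# assumes the list holds bare file names (e.g. report.pdf), one per line
$names = Get-Content .\MyListOfFilesIWantToDelete.txt
Get-ChildItem -Path D:\ -File -Recurse -ErrorAction SilentlyContinue |
    Where-Object { $names -contains $_.Name } |
    Remove-Item -WhatIf
As above, drop -WhatIf once the preview looks right.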

Change directory from following powershell command
The following command will delete all .txt files under a given path (here the whole C:\ drive, recursively):
Get-ChildItem C:\*.txt -File -Recurse | Remove-Item
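To restrict this to a specific folder tree instead of the whole drive, a sketch along the same lines (C:\SomeFolder is a placeholder, and -WhatIf previews the deletions):
# C:\SomeFolder is a placeholder; point it at the directory you actually want to clean
Get-ChildItem C:\SomeFolder\*.txt -File -Recurse | Remove-Item -WhatIf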

Related

Powershell: Find Folders and Run Command in Those Folders

So, trying to find a way to combine a couple of things the Stack Overflow crowd has helped me do in the past. I know how to find folders with a specific name and move them where I want them to go:
$source_regex = [regex]::escape($sourceDir)
(gci $sourceDir -recurse | where {-not ($_.psiscontainer)} | select -expand fullname) -match "\\$search\\" |
foreach {
    $file_dest = ($_ | split-path -parent) -replace $source_regex,$targetDir
    if (-not (test-path $file_dest)) { mkdir $file_dest }
    move-item $_ -Destination $file_dest -force -verbose
}
And I also know how to find and delete files of a specific file extension within a preset directory:
Get-ChildItem $source -Include $searchfile -Recurse -Force | foreach{ "Removing file $($_.FullName)"; Remove-Item -force -recurse $_}
What I'm trying to do now is combine the two. Basically, I'm looking for a way to tell Powershell:
"Look for all folders named 'Draft Materials.' When you find a folder with that name, get its full path ($source), then run a command to delete files of a given file extension ($searchfile) from that folder."
What I'm trying to do is create a script I can use to clean up an archive drive when and if space starts to get tight. The idea is that as I develop things, a lot of times I go through a ton of incremental non-final drafts (hence the folder name "Draft Materials"), and I want to get rid of the exported products (the PDFs, the BMPs, the AVIs, the MOVs, etc.) and just leave the master files that created them (the INDDs, the PRPROJs, the AEPs, etc.) so I can reconstruct them down the line if I ever need to. I can tell the script what drive and folder to search (and I'd assign that to a variable since the backup location may change and I'd like to just change it once), but I need help with the rest.
I'm stuck because I'm not quite sure how to combine the two pieces of code that I have to get Powershell to do this.
If what you want is to
"Look for all folders named 'Draft Materials.' When you find a folder with that name, get its full path ($source), then run a command to delete files of a given file extension ($searchfile) from that folder."
then you could do something like:
$rootPath = 'X:\Path\To\Start\Searching\From' # the starting point for the search
$searchFolder = 'Draft Materials' # the folder name to search for
$deleteThese = '*.PDF', '*.BMP', '*.AVI', '*.MOV' # an array of file patterns to delete
# get a list of all folders called 'Draft Materials'
Get-ChildItem -Path $rootPath -Directory -Filter $searchFolder -Recurse | ForEach-Object {
    # inside each of these folders, get the files you want to delete and remove them
    Get-ChildItem -Path $_.FullName -File -Recurse -Include $deleteThese |
        Remove-Item -WhatIf
}
Or use Get-ChildItem only once, having it search for files, and then test whether their full names contain the folder called 'Draft Materials':
$rootPath = 'X:\Path\To\Start\Searching\From'
$searchFolder = 'Draft Materials'
$deleteThese = '*.PDF', '*.BMP', '*.AVI', '*.MOV'
# get a list of all files with extensions from the $deleteThese array
Get-ChildItem -Path $rootPath -File -Recurse -Include $deleteThese |
# if in their full path names the folder 'Draft Materials' is present, delete them
Where-Object { $_.FullName -match "\\$searchFolder\\" } |
Remove-Item -WhatIf
In both cases I have added the safety switch -WhatIf, so when you run this nothing actually gets deleted; the console only shows what would happen.
If that output shows the correct files being removed, take off (or comment out) -WhatIf and run the code again.
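If you want to reuse this for different archive locations, the second approach can also be wrapped in a small function; this is only a sketch, and the function name, parameters, and the 'X:\Archive' path are made up for illustration:
# hypothetical helper; name, parameters and paths are illustrative
function Remove-DraftExports {
    [CmdletBinding(SupportsShouldProcess)]
    param(
        [Parameter(Mandatory)] [string] $RootPath,
        [string]   $FolderName  = 'Draft Materials',
        [string[]] $DeleteThese = @('*.PDF', '*.BMP', '*.AVI', '*.MOV')
    )
    # delete matching files whose full path contains the target folder name
    Get-ChildItem -Path $RootPath -File -Recurse -Include $DeleteThese |
        Where-Object { $_.FullName -match "\\$([regex]::Escape($FolderName))\\" } |
        Remove-Item
}
# preview first; drop -WhatIf to actually delete
Remove-DraftExports -RootPath 'X:\Archive' -WhatIf
Because the function declares SupportsShouldProcess, the -WhatIf you pass to it flows through to the Remove-Item call inside.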

How to prevent PowerShell -Recurse from renaming first file twice?

When using powershell to rename files with their directory name and file name, my code works, except in the first file in a directory, it gives it two copies of the directory name. So the file book1.xlsx in folder folder1 should become folder1book1.xlsx but it becomes folder1folder1book1.xlsx. The remaining files in folder1 are correctly named folder1book2.xlsx, folder1book3.xlsx, etc.
I have a directory, with many sub-directories. In each sub-dir are files that need their sub-dir name added in.
I've been following this code. For me it looks like:
dir -Filter *.xlsx -Recurse | Rename-Item -NewName {$_.Directory.Name + "_" + $_.Name}
I've also tried
--setting the Recurse -Depth 1 so that it doesn't keep looking for folders in the sub-folders.
--using ForEach-Object {$_ | ... after the pipe, similar to this.
--running it in Visual Studio Code rather than directly in PowerShell, which turns it into:
Get-ChildItem "C:\my\dir\here" -Filter *.xls -Recurse | Rename-Item -NewName {$_.DirectoryName + '_' + $_.Name}
--putting an empty folder inside the sub-directory, setting -Depth 2 to see if that will "catch" the recurse loop
I would expect the files to be named folder1_book1.xlsx, folder1_book2.xlsx, folder1_book3.xlsx.
But all of the attempted changes above give the same result. The first file is named folder1_folder1_book1.xlsx [INCORRECT], then folder1_book2.xlsx [CORRECT], folder1_book3.xlsx [CORRECT].
A workaround might be writing an if statement for "not files that contain the sub-directory name" as suggested here. But the link searches for a text string, not an object (probably not the correct term) like $_.Directory.Name. This post shows how to concatenate objects, but not something like $_.Directory.Name. Having to put in an if statement seems like an unnecessary step if -Recurse worked the way it should, so I'm not sure this workaround gets at the heart of the issue.
I'm running windows 10 with bootcamp on a 2018 iMac (I'm in Windows a lot because I use ArcMap). Powershell 5.1.17134.858. Visual Studio Code 1.38.0. This is a task I would like to learn how to use more in the future, so explanations will help. I'm new to using PowerShell. Thanks in advance!
This was a script I created for one of my customers that may help.
<##################################################################################################################################
This script can be used to search through folders to rename files from their
original name to "filename_foldername.extension". To use this script
please configure the items listed below.
Items to Configure
-$Original
-$Source
-$Destination
-$Files
Also please change the Out-File date on line 29 to today's date ****Example: 2019-10-02****
We've also added a change log file that is named "FileChange.txt" and can be found in the location identified on line 30
#>
$Original="C:\temp\test" #Location of ".cab" files copied
$Source="C:\temp\Test" #Location where ".cab" files are stored
$Destination="C:\temp\Test\2019-10-02" #Location where you want to copy ".cab" files after the file name change. Be sure to change the date to the date you run this script. The script creates a folder with today's date
$Files=@("*.cab") #Choose the file type you want to search for
$ErrorActionPreference = "SilentlyContinue" #Suppress Errors
Get-ChildItem $Original -Include "*.cab" -File -Recurse | Rename-Item -NewName {$_.BaseName+"_"+$_.Directory.Name +'.cab'}
New-Item -ItemType Directory -Path ".\$((Get-Date).ToString('yyyy-MM-dd'))"; Get-ChildItem -recurse ($Source) -include ($Files) | Copy-Item -Destination ($Destination) -EA SilentlyContinue
Get-ChildItem $Original | Where {$_.LastWriteTime -ge [datetime]::Now.AddMinutes(-10)} | Out-File C:\temp\test\2019-10-02\FileChange.txt
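As for the double rename itself: a common explanation is that Rename-Item starts renaming while Get-ChildItem is still enumerating, so the freshly renamed first file gets picked up and renamed a second time. A minimal sketch of the usual workaround, which wraps Get-ChildItem in parentheses so enumeration completes before any renaming starts:
# parentheses force Get-ChildItem to finish enumerating before renaming begins
(Get-ChildItem 'C:\my\dir\here' -Filter *.xlsx -Recurse) |
    Rename-Item -NewName { $_.Directory.Name + '_' + $_.Name }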

PowerShell Script finding File using extension and excluding a folder

I am using the below code:
Get-ChildItem -Path N:\USERS -Filter DANTOM.DTM -Recurse -ErrorAction SilentlyContinue -Force
I need it to find either the file "DANTOM.DTM" or anything with the extension ".DTM". I need to exclude the folder N:\USERS\EDI because it is a 1.7TB folder that would never have this file in it, and skipping it would really speed up the process.
I would like the end result either written to a .txt file listing which folders inside N:\USERS have the file, or just displayed as a list in PowerShell.
Thank you,
Assuming that the files of interest do not reside directly in N:\USERS (only in subdirs.), try the following (PSv3+ syntax); send to a file by appending > dirs.txt, for instance.
Get-ChildItem N:\USERS -Directory | ? Name -ne 'EDI' |
    Get-ChildItem -Recurse -Filter *.DTM |
    ForEach-Object { $_.DirectoryName }
Note: While it is tempting to try a simpler approach with -Exclude EDI, it unfortunately doesn't seem to be effective in excluding the entire subtree of the EDI subfolder.
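To also capture the result in a .txt file, as asked, you could de-duplicate the folder names and write them out; a sketch (the DTM_folders.txt name is just an example):
# DTM_folders.txt is just an example output name
Get-ChildItem N:\USERS -Directory | ? Name -ne 'EDI' |
    Get-ChildItem -Recurse -Filter *.DTM -ErrorAction SilentlyContinue |
    ForEach-Object { $_.DirectoryName } |
    Sort-Object -Unique |
    Set-Content .\DTM_folders.txt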

Search computer for .docx, .xls files using PowerShell

I am new to PowerShell and having difficulties trying to locate certain types of files (.doc, .docx, .xls, .xlsx), output the filenames and sizes (in groups by file extension) to a text file, and also include the total number files and total files size for each file extension.
The code that I have so far is:
$Report_File_Destination = "C:\Users\StayPositibve\Desktop\testing20.txt"
$path = ".\*"
Get-ChildItem $path -Include *.doc, *.docx, *.xls, *.xlsx -Recurse | Group-Object Extension -NoElement | Out-File $Report_File_Destination -Append
Every time I run this code, I receive a Get-ChildItem "Access is denied" error (even though I am running PowerShell as Administrator). What am I doing wrong? Thanks for your help!
This is because some paths exist that you are not allowed to browse. You can try the -ErrorAction Ignore and -Force options of Get-ChildItem to ignore these errors and to force access to items that cannot otherwise be accessed by the user, such as hidden or system files. In older versions of PowerShell you can use -ErrorAction SilentlyContinue instead.
Get-ChildItem $path -Include *.doc, *.docx, *.xls, *.xlsx -Recurse -ErrorAction Ignore -Force
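For the original goal of grouping by extension with counts and total sizes, something along these lines should work once the errors are handled (a sketch that reuses the $path and $Report_File_Destination variables already defined above):
Get-ChildItem $path -Include *.doc, *.docx, *.xls, *.xlsx -Recurse -ErrorAction Ignore -Force |
    Group-Object Extension |
    ForEach-Object {
        # one summary line per extension, followed by the individual files
        $size = ($_.Group | Measure-Object Length -Sum).Sum
        "{0}: {1} files, {2:N0} bytes total" -f $_.Name, $_.Count, $size
        $_.Group | ForEach-Object { "  {0}  ({1:N0} bytes)" -f $_.FullName, $_.Length }
    } |
    Out-File $Report_File_Destination -Append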

Powershell script for copying dlls and pdbs by name not working but not erroring out

The below script doesn't error out, and it doesn't work. I have wrapped it inside a try/catch block and that isn't working either. I am attempting to move only .pdb and .dll files to a certain folder. However, when I run this script, the DLLs and PDBs aren't moved. I probably have an order of operations mixup or something, but I thought this script should have worked...
gci -path $FromPath -Include ("*.dll", "*.pdp") | ? {$_.Name -match "PackageServiceLib|Package.capture.CSE.inc|PackageDBCore|Package.capture|PackageCommon|PackageServiceFramework"} | Copy-item -path $FromPath -destination $ToPath -force
My guess is that your $FromPath variable doesn't specify \* at the end. If you don't specify that, then your -Include parameter will be useless.
Assuming that a folder named c:\test contains 10 files with a .txt file extension, consider the difference between this:
Get-ChildItem -Path c:\test -Include *.txt;
And this:
Get-ChildItem -Path c:\test\* -Include *.txt;
The first command will yield no output, because you are getting the directory, not the children of the directory. In the second command, we are specifying that we want everything that is a child of the directory, except we only want the items that match the -Include parameter.
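Applying that to the original command might look like the sketch below: it appends \* to the path, uses *.pdb rather than what looks like a typo (*.pdp), and lets Copy-Item take its input from the pipeline instead of being handed -Path $FromPath again:
# sketch only; assumes the intended extension is .pdb, not .pdp
Get-ChildItem -Path "$FromPath\*" -Include '*.dll', '*.pdb' |
    Where-Object { $_.Name -match "PackageServiceLib|Package.capture.CSE.inc|PackageDBCore|Package.capture|PackageCommon|PackageServiceFramework" } |
    Copy-Item -Destination $ToPath -Force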