Need a cmd/powershell script to delete files more than 30 days old in nested folders

We have 30 processes running which generate error screenshots, and we only keep 30 days' worth. I've been trying to write a PowerShell script to do this. The problem I'm facing is with wildcards in the folder crawl. Say I have the following files:
C:\Runs\Process-1\AppFiles\Dummy.txt
C:\Runs\Process-1\AppFiles\Dummy.png
C:\Runs\Process-2\AppFiles\DummyPic.png
C:\Runs\Process-3\AppFiles\Dummy.log
C:\Runs\Process-3\AppFiles\Dummy1.png
And I want to get rid of all the png files in those subfolders more than 30 days old.
I tried:
ForFiles /p "C:\Runs\Process*" /s /d -30 /m "*.png"
but it doesn't like my folder wildcard. Help anyone?

In PowerShell you may try this:
Get-ChildItem "C:\Runs\Process*\AppFiles\*.png" | Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-30) } | Remove-Item

I would suggest using nested forfiles loops:
An outer forfiles loop for the directories, Process-*, and
an inner forfiles loop for the *.png files that you wish to delete.
This way you have the additional flexibility of two loops to play with.
Another, less elegant, method would be to use a ForEach-Object loop, again containing a nested forfiles, with a list of directories supplied to ForEach-Object. However, then you have to use a pre-determined list of directories. Obviously, you could use ForEach-Object for the inner loop as well, but then you would need a pre-determined list of .png files, which pretty much defeats the object of the exercise.
The nested forfiles approach is much better, IMHO.
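For reference, here's a rough, untested batch-file sketch of the two-loop idea. One caveat: forfiles substitutes its @ variables throughout the command string it runs, which makes a literal forfiles-inside-forfiles awkward, so this sketch drives the outer loop with cmd's for /d instead and keeps forfiles for the inner, age-filtered delete (%%D is batch-file syntax; use %D at an interactive prompt, and note the unquoted path assumes no spaces in it):
@echo off
rem Outer loop: every Process-* directory under C:\Runs.
for /d %%D in (C:\Runs\Process-*) do (
    rem Inner loop: delete *.png files older than 30 days anywhere below it.
    forfiles /p "%%D" /s /m *.png /d -30 /c "cmd /c del @path"
)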

Related

Loop command to move multiple files

I have multiple files in a folder (C:\webfix) the folder has 23 items including random files and folders.
I have 70+ folders I need to push these files out to. Each folder looks like this:
C:\saas\CU01313\wwwroot\
C:\saas\CU01316\wwwroot\
C:\saas\CU08453\wwwroot\
etc. etc.
The destination is all the same minus the CU0* part.
I would like to be able to mass move the 23 files/folders to each of these destinations, but I have not been able to figure out how.
After some research, I found that I might be better off using a 'foreach' loop command?
I have been trying to accomplish this in Powershell.
I have tried a couple of things, which I show in the code below.
The first "script" uses the Invoke-Expression command, which I can get to work if I do it one by one.
I have not been able to figure out how to "wild card" that \CU0*\ part.
First thing:
Invoke-Expression -Command "robocopy C:\webfix\ 'C:\saas\TT08931\wwwroot\' /e /b /COPYALL /MT:8 /r:2 /log:C:\log\log.txt "
If anyone could give me a hand with this I would be very grateful.
Thank you very much!
Figure out a way to put all the CU0xxxx folder names in a text file, then do something like this:
$folderlist = Get-Content C:\temp\Folderlist.txt
foreach ($folder in $folderlist)
{
    Copy-Item -Path "C:\Webfix\*" -Destination "C:\saas\$folder\wwwroot\" -Recurse
}
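If you'd rather not maintain a text file at all, here is a sketch that wildcards the CU0* directories directly (it assumes PowerShell 3.0+ for -Directory and that every CU0* folder contains a wwwroot subfolder; -WhatIf lets you rehearse first):
# Enumerate the CU0* folders under C:\saas and copy the payload into each wwwroot.
Get-ChildItem -Path 'C:\saas' -Directory -Filter 'CU0*' | ForEach-Object {
    Copy-Item -Path 'C:\webfix\*' -Destination (Join-Path $_.FullName 'wwwroot') -Recurse -WhatIf
}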

How to change order of files listed by dir cmd

Currently, upon using the dir command dir /b *.*dot, my files are listed in the following random order:
C.dot
D.dot
B.dot
A.dot
What should be done so that the same command dir /b *.*dot returns an ordered list, i.e.
A.dot
B.dot
C.dot
D.dot
I was initially thinking about a touch-like command, and I tried copy /b A.dot+ to update the timestamp, but it did not work.
Please suggest which command can be used in Windows / PowerShell to achieve this.
If you're trying to sort by alphabetical order of the file name, use Sort-Object. So something like Get-ChildItem -Path . -Filter "*.*dot" | Sort-Object -Property Name. Or, if you insist on plain dir, from cmd you can do dir /b *.*dot | sort (PowerShell's dir is an alias for Get-ChildItem and does not accept /b).
You mentioned touch; if you need the timestamp updated, you can set the LastWriteTime property:
(Get-ChildItem a.dot).LastWriteTime = Get-Date
There might be a better way to do what you are trying to do; bear in mind that dir only displays the timestamp to the minute, so files stamped within the same minute look identical in a listing.
I am just unsure how you would GCI into a random order...? dir (the same thing as Get-ChildItem) is going to order by name unless you sort by something else.
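If the goal really is to make timestamp order match name order, here is a hypothetical sketch building on the LastWriteTime idea; it spaces the stamps one second apart so even tools that display timestamps coarsely see distinct, correctly ordered times:
# Stamp the files so LastWriteTime increases in alphabetical order of Name.
$i = 0
Get-ChildItem -Path . -Filter '*.*dot' | Sort-Object Name | ForEach-Object {
    $_.LastWriteTime = (Get-Date).AddSeconds($i++)
}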

Strange Powershell GCI Recurse Results with Wildcard

I'm trying to use GCI with the Recurse argument to get a list of all files WITHIN the subfolders of a specified path.
I'm using the following line:
gci 'C:\temp\TestRecurse\*\*' -Recurse
Underneath the TestRecurse folder, I have the following:
TestRecurse
|---b.txt
|---dir1
|------a.txt
I expect a.txt to be returned. However, I'm getting a.txt and b.txt. Stranger still to me, if I put a.txt into another subfolder:
TestRecurse
|---b.txt
|---dir1
|------dir2
|---------a.txt
The same statement above only returns a.txt. I'm baffled as to how messing with the location of a.txt changes when b.txt is returned. Can someone explain to me why this happens, and why b.txt is ever returned at all?
Update
I should mention, while they're appreciated, I'm not necessarily looking for a workaround. This is part of a larger script in our environment that is in charge of moving files around in various ways while trying stay flexible. It's not behaving as I expected it would, so I'm trying to understand why it's working the way it is. As pointed out by PetSerAl, understanding Get-ChildItem may be more trouble than it's worth.
Thanks!
You're including a wildcard for the parent directory (TestRecurse\*), so you are getting files contained in it as well. Try getting the folder list of TestRecurse, then iterating through them.
Structure:
TestRecurse\b.txt
TestRecurse\dir1
TestRecurse\dir1\a.txt
Code:
# Get the list of items directly under TestRecurse,
# keep only the folders, then list everything inside each one.
Get-ChildItem 'C:\tmp\TestRecurse\' |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object { Get-ChildItem $_.FullName -Recurse }
This only returns folders and files within dir1, but not dir1 itself.
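On PowerShell 3.0 or later you can lean on the -Directory switch for the same filtering; a minor variation on the answer above, not what it originally used:
# List the folders under TestRecurse, then recurse into each of them.
Get-ChildItem 'C:\tmp\TestRecurse\' -Directory |
    ForEach-Object { Get-ChildItem $_.FullName -Recurse }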

PowerShell: Quickly count containers

I think we all know the PSIsContainer property to check whether the current item is a folder or not. But in my project I need a way to quickly know the number of folders in a folder. All I need is to quickly get their number. I want to write to a .txt file lines which would look like C:\folder;12, meaning that the folder, scanned with the -Recurse argument, contains 12 folders.
To explain why: I need to save the progress of my work when I cut off the program which is used to analyse some folders. When a folder is analysed, the result is written to a second .txt. For example, if a folder is called C:\folder\folder1, folder will be analysed and then folder1 will be too, which makes folder appear 2 times in the file, because the full name is always written. What I want to do is count the number of lines where C:\folder is written. If it equals the number next to its path in the first .txt, it means the folder has already been analysed and the function doesn't need to do it again.
Does someone have a solution? Or maybe another idea for saving the progress? Because I really have the feeling this is taking too long.
Thank you for your help.
Another approach, which I find much faster, is using cmd's built-in dir command.
Of course this is in case you don't need the subfolders (if you do, you can run the function in a foreach loop, or change the function accordingly).
Function Get-FolderCount($path)
{
    # Run dir with /a:d (directories only) and parse the count out of the
    # trailing "n Dir(s)  ... bytes free" summary line. Note the count
    # includes the '.' and '..' entries, so subtract 2 if you only want
    # real subfolders.
    $Dir = cmd /c dir $path /a:d
    Return ($Dir[-1] -csplit 'Dir' -replace '\s')[0]
}
I use this as well for measuring folder size with the /s switch, taking the total size from the summary line, which is much faster than PowerShell, and also much faster than running it in an interactive shell...
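For comparison, a pure-PowerShell sketch that produces exactly the path;count lines described in the question (assumes PowerShell 3.0+ for -Directory; the file and folder names are placeholders):
# Count folders recursively and append a "path;count" progress line.
$path  = 'C:\folder'
$count = (Get-ChildItem -Path $path -Directory -Recurse | Measure-Object).Count
"$path;$count" | Add-Content 'C:\temp\progress.txt'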

PowerShell - bulk move of some subfolders to new server

I have been asked to move around 200 folders to a new server, but I'm not sure how to script it, or if it is possible.
I have a duplicate folder structure at the destination, but the problem is I have to move only 1 subfolder and its contents in each of the parent folders across. It looks like this:
FolderA
    Folder1
    Folder2
FolderB
    Folder1
    Folder2
Is it possible to move only 'Folder1' from folders A-Z and place 'Folder1' in its corresponding new parent folder?
I'd use Robocopy, particularly if you want to preserve the ownership and permissions.
I would be very tempted to use RoboCopy to do the copying, because you can set it to bring over the original file created and modified dates and times, and security permissions.
It can't do the Folder1 thing natively, though, so I would be looking at using PowerShell to generate a batch file of RoboCopy commands, then running that. E.g. something that looks like this, although I haven't tested it:
$sourceFolder = "\\server1\share"
$destFolder = "\\server2\share"
foreach ($folder in (Get-ChildItem -Directory $sourceFolder)) {
    "robocopy `"$($sourceFolder)\$($folder)\Folder1`" `"$($destFolder)\$($folder)\Folder1`" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append
}
Then check over, and run roboscript.bat to do the copying.
More a comment on TessellatingHeckler's code than an answer here; please give him any credit you would attribute to this, since he had the answer first.
Since you are outputting strings to a text file, you probably want to work with strings throughout. In regard to your foreach, I would like to suggest:
foreach ($folder in (Get-ChildItem -Directory $sourceFolder | Select-Object -ExpandProperty FullName)) {
    $TargetFolder = $folder -replace [regex]::Escape($sourceFolder), $destFolder
    "robocopy `"$folder\Folder1`" `"$TargetFolder\Folder1`" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append
}
That selects the full path of each folder as part of the ForEach. It then builds the target by replacing the source path with the destination path (the source is escaped, since -replace does a regex match). This likely makes it more flexible.
Also, you don't need subexpressions this way, since you're referencing plain strings rather than [FileInfo] objects.
Lastly, I thought I'd add that you can use two consecutive double quotes instead of backtick-escaping them if you prefer, though I would still suggest escaping them as TH did.
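For illustration, the doubled-quote form of the same robocopy line would look like this (equivalent output, just a different way of embedding the quotes):
"robocopy ""$folder\Folder1"" ""$TargetFolder\Folder1"" /E /COPYALL /DCOPY:T" | Out-File roboscript.bat -Append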