How do I create a loop based on file size in PowerShell

I am working with Intune and PowerShell. I basically want to run an exe file which downloads 15.2GB / 7932 files for the installation from the Autodesk website, and then creates a text file so that Intune knows it's done, as I want to delete all the install files with another script later.
The problem is that the PowerShell script will run and close before it has finished downloading, so Intune thinks it is done; the next script then tries to install what was downloaded, but since it is not fully downloaded, it fails.
I have tried putting in a wait command, but then Intune will just hang and you have to restart Windows, which is something I don't want the users to do.
I am thinking of adding a loop so it checks the file size of the following folder:
C:\Autodesk\{E658F785-6D4D-4B7F-9BEB-C33C8E0027FA}
and once it reaches 15.2GB / 7932 files it goes to the next step and creates the text file.
Below is my current PowerShell script:
Start-Process -NoNewWindow -FilePath "\\arch-syd-fs\EM Setup\Autodesk Recap Custom Install 2023\Source 1 Download\Revit_2023.exe" -ArgumentList "--quiet" -Wait
New-Item "C:\Temp\Revit 2023" -Type Directory
New-Item -Path "C:\Temp\Revit 2023\Download Done.txt"

Let's break this down into three questions:
How do you check the size of a directory?
How do you check the count of files in a directory?
How do you make a script wait until these checks reach a certain value?
It turns out you can do the first two together:
$dirStats = Get-ChildItem -Recurse -Force 'C:\path\to\whatever' | Measure-Object -Sum Length
$size = $dirStats.Sum
$fileCount = $dirStats.Count
Then you can wrap it in a do-until loop (with a delay to keep it from eating all the CPU) to make the script wait until those values reach a certain threshold:
do {
    Start-Sleep 5
    $dirStats  = Get-ChildItem -Recurse -Force 'C:\path\to\whatever' | Measure-Object -Sum Length
    $size      = $dirStats.Sum
    $fileCount = $dirStats.Count
} until( ($size -ge 15.2*1024*1024*1024) -and ($fileCount -ge 7932) )
Note that $size is in bytes, and you might want to make that an -or condition rather than an -and, depending on whether you want the script to continue after either condition is met or to wait for both.
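If a stalled download is a concern (the Intune hang described in the question), a variant sketch adds an overall deadline so the loop cannot wait forever; the 2-hour timeout is an assumption to tune, and 15.2GB uses PowerShell's byte-unit literal, equivalent to 15.2*1024*1024*1024:
$deadline = (Get-Date).AddHours(2)  # assumed timeout; tune to realistic download times
do {
    Start-Sleep 5
    $dirStats  = Get-ChildItem -Recurse -Force 'C:\path\to\whatever' | Measure-Object -Sum Length
    $size      = $dirStats.Sum
    $fileCount = $dirStats.Count
} until( (($size -ge 15.2GB) -and ($fileCount -ge 7932)) -or ((Get-Date) -gt $deadline) )
# Distinguish a real finish from a timeout before creating the done-marker file
if (($size -lt 15.2GB) -or ($fileCount -lt 7932)) { Write-Warning 'Timed out before the download completed.' }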

Related

Copy files after time x based on modification date

I need a script that only copies files after 5 minutes, based on the modification date. Does anyone have a solution for this?
I couldn't find any script online.
The answer from jdweng is a good solution to identify the files in scope.
You could structure your script like this to easily reuse it with other paths or file ages.
# Customizable variables
$Source = 'C:\Temp\Input'
$Destination = 'C:\Temp\Output'
[int32]$FileAgeInMinutes = 5
# Script Execution
Get-ChildItem -Path $Source |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMinutes(-$FileAgeInMinutes) } |
    Copy-Item -Destination $Destination
You could then run this script as a scheduled task, scheduled to run periodically depending on your needs.
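A minimal sketch of that scheduled task, assuming the snippet above is saved to the hypothetical path C:\Scripts\Copy-AgedFiles.ps1 and should run every 5 minutes:
# Hypothetical script path; adjust to wherever the copy script actually lives
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-NoProfile -File "C:\Scripts\Copy-AgedFiles.ps1"'
# Start now and repeat every 5 minutes (older OSes may also need -RepetitionDuration)
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName 'Copy aged files' -Action $action -Trigger $trigger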

Powershell Form watching wsusutil.exe until finished

This is my first form; I'm making it to accomplish a task, but also to learn how to do it. I'm making a simple front end for exporting and then importing WSUS data from a connected server to a disconnected server. I'm working on the export part now, and I need to figure out how to start the export process and then, once it is done, make the ISO file.
Here is the button code I have so far, but I'm not sure how to watch WsusUtil.exe until it's done and then proceed to the next task. I thought I could watch the process and move to the next step when it's over, but I have not been able to make that work. I tried a do-until, but it kept running Start-Process over and over. Also, when it starts, it makes a large black box with some message in it. I tried to use -NoNewWindow, but the window launched anyway. (Screenshot: WsusUtil.exe running.)
$btnStartExport.Add_Click({
    $nicedate = Get-Date -UFormat %m-%d-%y # put in MM-DD-YY format
    # progress bar for overall progress of steps 1 and 2 and individual progress bar for each step
    $WSUSUtilPath = "c:\Program Files\Update Services\Tools\"
    $WSUSMetaDataPath = "c:\tools\wsusexport\"
    $isotitle = "WSUS Offline Server update $nicedate"
    $ProcessName = "WsusUtil" # process to watch until it's done
    $isofilename = "WSUSSvrOffline-$nicedate.iso" # creates WSUS Offline Server update-11-14-2021.iso
    # Step 1 - check if directory exists
    Check-Path $WSUSMetaDataPath
    # Step 1 - export the WSUS metadata
    Start-Process -FilePath "${WSUSUtilPath}WsusUtil.exe" -ArgumentList @("export", "${WSUSMetaDataPath}$nicedate-export.xml.gz", "${WSUSMetaDataPath}$nicedate-export.log") -NoNewWindow -Wait
    $wsusProcess = Get-Process WsusUtil -ErrorAction SilentlyContinue
    # Step 2 - create ISO
    Get-ChildItem "$WSUSMetaDataPath", $txtWSUSContentPath.Text | New-IsoFile -Path ($txtISOLocation.Text + $isofilename) -Force -Title $isotitle
    # clean up metadata directory
    Get-ChildItem -Path $WSUSMetaDataPath -Include *.* -File -Recurse | ForEach-Object { $_.Delete() }
})
$frmMain.controls.add($btnStartExport)
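One sketch of the watching part (reusing the variables above): Start-Process -PassThru returns a process object the script can block on with Wait-Process, which avoids both the do-until re-launch problem and polling Get-Process:
# Launch the export and keep a handle to the process
$proc = Start-Process -FilePath "${WSUSUtilPath}WsusUtil.exe" `
    -ArgumentList @("export", "${WSUSMetaDataPath}$nicedate-export.xml.gz", "${WSUSMetaDataPath}$nicedate-export.log") `
    -NoNewWindow -PassThru
Wait-Process -Id $proc.Id # returns only once WsusUtil.exe has exited
# ...safe to build the ISO here...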

Increase speed of PowerShell Get-ChildItem large directory

I have a script that references a .csv document of filenames and then runs Get-ChildItem over a large directory to find each file and pull the 'owner'. Finally, the info is output to another .csv document. We use this to find who created files. Additionally, I have it create .txt files with the filename and a timestamp to see how fast the script is finding the data. The code is as follows:
Get-ChildItem -Path $serverPath -Filter $properFilename -Recurse -ErrorAction 'SilentlyContinue' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(30) -and
                   $_.Extension -eq ".jpg" } |
    Select-Object -Property @{
        Name = 'Owner'
        Expression = { (Get-Acl -Path $_.FullName).Owner }
    }, '*' |
    Export-Csv -Path "$desktopPath\Owner_Reports\Owners.csv" -NoTypeInformation -Append
$time = (Get-Date -f 'hhmm')
Out-File "$desktopPath\Owner_Reports\${fileName}_$time.txt"
}
This script serves its purpose but is extremely slow given the large size of the parent directory. Currently it takes 12 minutes per filename. We query approximately 150 files at a time, and this long wait time is hindering production.
Does anyone have better logic that could increase the speed? I assume that each time the script runs Get-ChildItem it recreates the index of the parent directory, but I am not sure. Is there a way we can create the index one time instead of once per filename?
I am open to any and all suggestions! If more data is required (such as the variable naming etc) I will provide upon request.
Thanks!
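One way to act on the indexing idea from the question is to walk the tree a single time into a hashtable keyed by file name, so each subsequent lookup is effectively instant. This is only a sketch: $serverPath is reused from the question, while $csvNames is a hypothetical variable standing in for the filenames imported from the .csv:
# One recursive walk instead of one walk per filename
$index = @{}
Get-ChildItem -Path $serverPath -Recurse -File -ErrorAction SilentlyContinue |
    ForEach-Object { $index[$_.Name] = $_ } # note: last one wins on duplicate names

# $csvNames is assumed to hold the filenames read from the .csv
foreach ($name in $csvNames) {
    $file = $index[$name]
    if ($file) {
        (Get-Acl -Path $file.FullName).Owner
    }
}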

Report generation using For loop powershell

I have a program which generates reports by reading an .XML file, and I have to generate reports for multiple files.
The problem I am facing is that the program reads only one file per run, so I need to run it once for each file.
Is there any way I can generate reports for multiple files in one click?
So far I have tried the code below:
$a = Get-ChildItem "D:\Directory1\Files\*.xml"
foreach ($i in $a) {
    Move-Item $i "D:\Directory1\"
    if ($a) {
        D:\Directory1\Program1.exe /run /exit /SilentMode
    }
}
With the above code I am trying to read the files in "D:\Directory1\Files\", move one file (not all files) to the directory "D:\Directory1\", start Program1.exe to generate the reports, and repeat while .xml files remain in "D:\Directory1\Files\".
Is your goal to copy all files from D:\Directory1\Files\ to D:\Directory1\ in one step and then run D:\Directory1\Program1.exe /run /exit /SilentMode?
EDIT:
Does this work for you?
0. Set the location your program works in
1. Get all files
2. For each file:
3. Move the file to the new location
4. Start your program
5. Remove the moved file
Set-Location -Path "D:\Directory1\"
$arrFiles = Get-ChildItem -Path "D:\Directory1\Files\*.xml"
foreach ($objFile in $arrFiles) {
    Move-Item -Path $objFile.FullName -Destination "D:\Directory1\$($objFile.Name)"
    Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
    Remove-Item -Path "D:\Directory1\$($objFile.Name)"
}
Your logic here was sound; however, one issue is that your script continues processing even while Program1.exe is running, making it possible for it to seemingly skip files. Also, your if statement just checks whether $a contains data, which it always will in your example, so the condition check is moot.
What you can do is something like this:
$moveLocation = "D:\Directory1\"
Get-ChildItem "D:\Directory1\Files\*.xml" | ForEach-Object {
    # Move the file to its new location
    Move-Item -Path $_.FullName -Destination $moveLocation
    # -Wait blocks until Program1.exe finishes before the next file is processed
    Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait
}
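If Program1.exe reports failures through its exit code (an assumption; check what the program actually documents), a small extension is to capture the process with -PassThru and inspect ExitCode after the wait:
$p = Start-Process -FilePath "D:\Directory1\Program1.exe" -ArgumentList "/run /exit /SilentMode" -Wait -PassThru
if ($p.ExitCode -ne 0) {
    # Exit-code semantics are assumed; adjust to Program1.exe's behaviour
    Write-Warning "Program1.exe returned exit code $($p.ExitCode)"
}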

Why is this PowerShell script so slow? How can I speed it up?

I developed this script to apply Sitecore workflows to whole swaths of items without having to manually click through the GUI. I'm pretty pleased with how well it works, but it's just slow. Here is the script:
Import-Module 'C:\Subversion\CMS Build\branches\1.2\sitecorepowershell\Sitecore.psd1'
# Hardcoded IDs of workflows and states
$ContentApprovalWfId = "{7005647C-2DAC-4C32-8A09-318000556325}";
$ContentNoApprovalWfId = "{BCBE4080-496F-4DCB-8A3F-6682F303F3B4}";
$SettingsWfId = "{7D2BA7BE-6A0A-445D-AED7-686385145340}";
#new-psdrive *REDACTED*
set-location m-rocks:
function ApplyWorkflows([string]$path, [string]$WfId) {
    Write-Host "ApplyWorkflows called: " $path " - " $wfId;
    $items = Get-ChildItem -Path $path;
    $items | foreach-object {
        if ($_ -and $_.Name) {
            $newPath = $path + '\' + $_.Name;
            $newPath;
        } else {
            Write-Host "Name is empty.";
            return;
        }
        if ($_.TemplateName -eq "Folder" -or $_.TemplateName -eq "Template Folder") {
            # don't apply workflows to pure folders, just recurse
            Write-Host $_.Name " is a folder, recursing.";
            ApplyWorkflows $newPath $wfId;
        }
        elseif ($_.TemplateName -eq "Siteroot" -or $_.TemplateName -eq "InboundSiteroot") {
            # Apply content-approval workflow
            Set-ItemProperty $newPath -Name "__Workflow" $ContentApprovalWfId;
            Set-ItemProperty $newPath -Name "__Default workflow" $ContentApprovalWfId;
            # Apply content-no-approval workflow to children
            Write-Host $_.Name " is a siteroot, applying approval workflow and recursing.";
            ApplyWorkflows $newPath $ContentNoApprovalWfId;
        }
        elseif ($_.TemplateName -eq "QuotesHomePage") {
            # Apply settings workflow to item and children
            Write-Host $_.Name " is a quotes item, applying settings workflow and recursing.";
            Set-ItemProperty $newPath -Name "__Workflow" $SettingsWfId;
            Set-ItemProperty $newPath -Name "__Default workflow" $SettingsWfId;
            ApplyWorkflows $newPath $SettingsWfId;
        }
        elseif ($_.TemplateName -eq "Wildcard") {
            Write-Host $_.Name " is a wildcard, applying workflow (and halting).";
            Set-ItemProperty $newPath -Name "__Workflow" $ContentApprovalWfId;
            Set-ItemProperty $newPath -Name "__Default workflow" $ContentApprovalWfId;
        }
        elseif ($_ -and $_.Name) {
            # Apply passed-in workflow and recurse with it
            Write-Host $_.Name " is something else, applying workflow and recursing.";
            Set-ItemProperty $newPath -Name "__Workflow" $WfId;
            Set-ItemProperty $newPath -Name "__Default workflow" $WfId;
            ApplyWorkflows $newPath $wfId;
        }
    }
}
ApplyWorkflows "sitecore\Content\" $ContentNoApprovalWfId;
It processes one item in a little less than a second. There are some pauses in its progress - evidence suggests that this is when Get-ChildItem returns a lot of items. There are a number of things I would like to try, but it's still running against one of our sites. It's been about 50 minutes and looks to be maybe 50% done, maybe less. It looks like it's working breadth-first, so it's hard to get a handle on exactly what's done and what's not.
So what's slowing me down?
Is it the path construction and retrieval? I tried to just get the children on the current item via $_ or $_.Name, but it always looks in the current working directory, which is the root, and can't find the item. Would changing the directory on every recursion be any faster?
Is it the output that's bogging it down? Without the output, I have no idea where it is or whether it's still working. Is there some other way I could get an indication of where it is, how many items it has done, etc.?
Is there a better approach where I just use Get-ChildItem -r with filter sets and loop through those? If so, a first attempt at incorporating some of my conditionals in the first script into a filter set would be very much appreciated. I am new to PowerShell, so I'm sure there's more than one or two improvements to be made in my code.
Is it that I always call the recursive bit even if there aren't any children? The content tree here is very broad, with very many leaves that have no children. What would be a good check for whether child items exist?
Finally, the PowerShell provider (PSP) we have is not complete. It does not seem to have a working implementation of Get-Item, which is why everything is almost completely written with Get-ChildItem instead. Our Sitecore.Powershell.dll says it is version 0.1.0.0. Would upgrading that help? Is there a newer one?
Edit: it finally finished. I did a count on the output and came up with 1857 items and it took ~85 minutes to run, for an average of 21 items per minute. Slower than I thought, even...
Edit: My first run was on PowerShell 2.0, using the Windows PowerShell ISE. I've not tried the Sitecore PowerShell plugin module or the community console. I didn't even know it existed until yesterday :-)
I tried another run after upgrading to PowerShell 3.0. Starting locally - running the script from my laptop, connecting to the remote server - there was no noticeable difference. I installed PowerShell 3.0 on the hosting box and ran the script from there and saw maybe a 20-30% increase in speed. So that's not the silver bullet I was hoping it would be - I need an order of magnitude or two's worth of improvement to make this something I won't have to babysit and run in batches. I am now playing around with some of the actual script improvements suggested by the fine answers below. I will post back with what works for me.
Personally, I think the biggest boost would come from switching to the community PowerShell implementation from the Rocks one.
Let me explain why.
You're traversing the whole tree, which means every node in your branch has to be visited, read, and sent over the Rocks web service at least once.
Then every property save is another webservice call.
I have run your script in the community console and it took me around 25 seconds for 3724 items.
(I've removed the modifications as the values didn't relate to my system).
A simple
Get-ChildItem -Recurse
on my 3724-item tree took 11 seconds in the community console vs 48 seconds in the Rocks implementation.
An additional tweak you could use in the community implementation would be a Sitecore query like:
get-item . -Query '/sitecore/content/wireframe//*[@@TemplateName="Template Folder"]'
and only send those items into your function
None of this means the Rocks console is not written right; it just means the design choices and targets of the Rocks console are different.
You can find the community console here:
http://bit.ly/PsConScMplc
See this blog post, where the differences between the foreach-object cmdlet and the foreach statement are laid out.
You could speed things up by piping get-childitem directly into foreach-object:
get-childitem . | foreach-object { ApplyWorkflow($_) }
This causes each object returned by get-childitem to be passed immediately to the following step in the pipeline, so you process each item only once. It should also prevent the long pauses caused by reading all the children up front.
Additionally, you can get all the items recursively, filter them by template, and then apply the appropriate workflow, e.g.:
get-childitem -recurse . | where-object { $_.TemplateName -eq "MyTemplate"} | foreach-object { ApplyWorkflowForMyTemplate($_) }
get-childitem -recurse . | where-object { $_.TemplateName -eq "MySecondTemplate"} | foreach-object { ApplyWorkflowForMySecondTemplate($_) }
Still, I would not expect this script to run in seconds; in the end you are going over the whole content tree.
And finally, what library are you using? Is this the Sitecore PowerShell Console (the name of the dll sounds familiar)? There is a newer version which has lots of new features added.
The most obvious problem I see is that you are iterating over the items twice per invocation of the function:
function ApplyWorkflows([string]$path, [string]$WfId) {
    Write-Host "ApplyWorkflows called: " $path " - " $wfId;
    # this iterates through all children and assigns them to $items
    $items = Get-ChildItem -Path $path;
    # this iterates through $items a second time
    $items | foreach-object { # foreach-object is slower than the foreach statement
Also, piping to foreach-object is slower than using the traditional foreach keyword.
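A sketch of the single-pass shape that advice points at, with the workflow logic elided (it stays exactly as in the original function):
function ApplyWorkflows([string]$path, [string]$WfId) {
    # foreach statement: children are enumerated once, with no second pass or pipeline overhead
    foreach ($item in Get-ChildItem -Path $path) {
        # ...template checks and Set-ItemProperty calls as before...
    }
}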