Get-Random increased randomness HELP - PowerShell

I need a little help with increasing the randomness of Get-Random so it does not repeat so often.
I have been using this PowerShell script:
$formats = @("*.mpg")
$dir = Split-Path $MyInvocation.MyCommand.Path
gci "$dir\*" -Include $formats | Get-Random -Count 1 | Invoke-Item
My application is an in-home virtual television channel that randomly selects one file from each of several folders at 6am and plays them all day until midnight, then powers off, starts up, and repeats the process daily. The problem I am finding is that whatever Get-Random uses under the hood has a tendency to choose the exact same file often. So after months of running this script, I am seeing the exact same movies chosen day after day and some that are never chosen. I'm guessing this is because Get-Random is using the clock as its seed for choosing a number?
Is there a way to increase the odds of getting a broader selection of files (.mpg files in this instance) and fewer repeats of the same ones being chosen?
My other option was to find a script that would keep track of the .mpg files already chosen and "mark" them (or sort by date accessed), and not play the same file twice until all files in a particular folder have been played once, then "unmark" all the files and start over. To me that sounds like advanced scripting that I just don't have the knowledge to put together on my own.
Any insight into my dilemma would be vastly appreciated. I'm happy to answer any questions about my specifics that would help you get to an answer.
Edit: maybe what I want to know is how to get a more even random distribution? I'm looking for a way to have more variety in the files chosen day after day and fewer repeats.

That (the access time) should be fairly easy to check:
# Pick only files that have not been touched in the last 30 days,
# then stamp the chosen file's access time so it won't be picked again soon
$item = gci "$dir\*" -Include $formats |
    where { $_.LastAccessTime -le (Get-Date).AddDays(-30) } |
    Get-Random -Count 1
$item.LastAccessTime = Get-Date
$item | Invoke-Item
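For the "mark the played files" idea from the question, here is a minimal sketch that keeps a playlist file of what has already been played and only resets once everything has been played once. The played.txt file name is just a placeholder:
# Keep a list of already-played files next to the script and pick only
# from the files not yet on that list; reset the list once it's exhausted.
$formats = @("*.mpg")
$dir = Split-Path $MyInvocation.MyCommand.Path
$played = Join-Path $dir "played.txt"
$all = Get-ChildItem "$dir\*" -Include $formats
$done = if (Test-Path $played) { Get-Content $played } else { @() }
# Everything not yet played; if the list is exhausted, start over
$remaining = $all | Where-Object { $done -notcontains $_.FullName }
if (-not $remaining) {
    Remove-Item $played -ErrorAction SilentlyContinue
    $remaining = $all
}
$pick = $remaining | Get-Random -Count 1
Add-Content $played $pick.FullName
Invoke-Item $pick.FullName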

Related

How to find a file among many files via PowerShell?

I have an Excel script that searches for files with a single command.
I found this example on the forum; it says that to search for a file by name you write the name followed by (*), but when I run it, it does not find anything:
Get-ChildItem -Path "C:\Folder\test*"
What can I do to simplify the code and make it much faster? Waiting 10 minutes to find one file out of 10,000 is very long.
I have a folder with 10,000 files, and Excel, searching via a VBA script, takes only about 2-3 seconds.
When I script it in PowerShell like this:
$find = Get-ChildItem -Path "C:\Folder"
for ($f = 0; $f -lt $find.Count; $f++) {
    $path_name = $find[$f].Name
    if ($path_name -eq 'test') {
        Write-Host 'success'
    }
}
But it turns out to take sooooooo long; the script hangs for 10 minutes and does not respond, and maybe, if I'm lucky, it will eventually answer.
How can I find a file by filter using Get-ChildItem?
To make your search faster you can use the -Filter parameter of Get-ChildItem.
$fileName = "test.txt"
$filter = "*.txt"
$status = Get-ChildItem -Path "C:\PS\" -Recurse -Filter $filter | Where-Object {$_.Name -match $fileName}
if ($status) {
Write-Host "$($status.Name) is found"
} else {
Write-Host "No such file is available"
}
You could also compare the speed of the two approaches by using Measure-Command.
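For example (a quick sketch; the path and file name are just placeholders):
# Filtering done by the provider via -Filter vs. filtering afterwards in Where-Object
Measure-Command { Get-ChildItem -Path "C:\PS" -Recurse -Filter "test.txt" }
Measure-Command { Get-ChildItem -Path "C:\PS" -Recurse | Where-Object { $_.Name -eq "test.txt" } }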
If the disk the data is on is slow then it'll be slow no matter what you do.
If the folder is full of files then it'll also be slow depending on the amount of RAM in the system.
Fewer files per folder means more performance, so try to split them up into several folders if possible.
Doing that may also mean you can run several Get-ChildItem calls at once (disk permitting) using PSJobs; there's a rough sketch of that after the timing example below.
Using several loops to take care of related work usually makes the whole thing run "number of loops" times as long. That's what Where-Object is for (in addition to the -Filter, -Include and -Exclude parameters of Get-ChildItem).
Console I/O takes A LOT of time. Do NOT output ANYTHING unless you have to, especially not inside loops (or cmdlets that act like loops).
For example, including basic statistics:
$startTime = Get-Date
$FileList = Get-ChildItem -Path "C:\Folder" -File -Filter 'test'
$EndTime = Get-Date
$FileList
$AfterOutputTime = Get-Date
'Seconds taken for listing:'
($EndTime - $startTime).TotalSeconds
'Seconds taken including output:'
($AfterOutputTime - $StartTime).TotalSeconds
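Regarding running several Get-ChildItem calls at once with PSJobs, a rough sketch (the folder names are placeholders, and whether this actually helps depends on the disk):
# Search several folders in parallel, one background job per folder
$folders = "C:\Data\A", "C:\Data\B", "C:\Data\C"
$jobs = foreach ($f in $folders) {
    Start-Job -ScriptBlock {
        param($path)
        Get-ChildItem -Path $path -Recurse -Filter "test.txt"
    } -ArgumentList $f
}
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
$results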

First run of script is slower

I am doing some performance profiling and noticed something very odd. I wanted to compare two ways of building a list of XML files. One uses Get-ChildItem twice, with the first one filtering for a single file and the second filtering based on file name, and the second approach uses Get-Item for the single file and Get-ChildItem again for the multiple files. I wrapped both in Measure-Command.
When I run it, the first run shows a much longer time for the very first Measure-Command, but it doesn't matter which approach is first. And it's only the first one over a fair amount of time. So, if we call the two approaches GIGC (Get-Item & Get-ChildItem) and GCGC (Get-ChildItem & Get-ChildItem): if I have the order GIGC then GCGC and I run it, I will see 2.5 seconds for GIGC and 1.5 seconds for GCGC. If I immediately rerun it they will both be around 1.5 seconds, and they will stay around 1.5 seconds as I rerun over and over. Let the console sit for a few minutes and GIGC will again be around 2.5 seconds. BUT, if I reverse it, GCGC first and GIGC second, the pattern remains the same: that first Measure-Command, this time of GCGC, will be 2.5 seconds, and all the rest will be 1.5. Let the console sit for long enough and the first one will go back up.
$firmAssets = '\\Mac\Support\Px Tools\Dev 4.0'
Measure-Command {
[Collections.ArrayList]$sourceDefinitions = @(Get-Item "$firmAssets\Definitions.xml") + @(Get-ChildItem $firmAssets -Filter:Definitions_*.xml -Recurse)
}
Measure-Command {
[Collections.ArrayList]$sourceDefinitions = @(Get-ChildItem $firmAssets -Filter:Definitions.xml) + @(Get-ChildItem $firmAssets -Filter:Definitions_*.xml -Recurse)
}
At first I thought the issue might be with Measure-Command, so I changed the code to use Get-Date as the timer. Same results.
$startTime = Get-Date
[Collections.ArrayList]$sourceDefinitions = @(Get-Item "$firmAssets\Definitions.xml") + @(Get-ChildItem $firmAssets -Filter:Definitions_*.xml -Recurse)
Write-Host "$(New-TimeSpan -Start:$startTime -End:(Get-Date))"
$startTime = Get-Date
[Collections.ArrayList]$sourceDefinitions = @(Get-ChildItem $firmAssets -Filter:Definitions.xml) + @(Get-ChildItem $firmAssets -Filter:Definitions_*.xml -Recurse)
Write-Host "$(New-TimeSpan -Start:$startTime -End:(Get-Date))"
So, last test: I thought it might be something related to the console, so I converted it to a script. Weirdly, same result! The first run is always substantially slower. My (uneducated) guess is that there is some memory initialization happening on the first run that then stays initialized across multiple uses of PowerShell, be it scripts or console. Basically some sort of PowerShell overhead, which feels like something we can't work around. But hopefully someone has a better answer than my "That's just how PowerShell works, get over it".
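One way to keep the comparison fair despite that warm-up cost (a sketch reusing the paths from above) is to repeat each measurement several times and ignore the first run:
# Run the same measurement five times; the first number includes the
# one-time warm-up, the rest show the steady-state cost.
1..5 | ForEach-Object {
    (Measure-Command {
        [Collections.ArrayList]$sourceDefinitions =
            @(Get-Item "$firmAssets\Definitions.xml") +
            @(Get-ChildItem $firmAssets -Filter:Definitions_*.xml -Recurse)
    }).TotalSeconds
}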

Move Files Older Than 3 BUSINESS days from One Directory To Another

I know I've seen people answer the simpler "move files older than N days from one folder to another" question, but I want to introduce two necessary additions. I need to be able to move files that are older than three BUSINESS days (Mon-Fri), AND I need to set both directories (the from and the to directories) as variables, as these are likely to change frequently.
I assume PowerShell is the way to go here, but I am still learning. Let me know if additional information is needed or if someone is able to assist.
Thanks in advance.
Something like this should work.
First you'll need to define the number of days to exclude and the current numerical day of the week.
$days = 3
$DOW = (Get-Date).DayOfWeek.Value__
You also wanted the file paths configured.
$from = "C:\old"
$to = "c:\new"
This will set the threshold to this morning minus however many days you've defined. If that window overlaps a weekend it will back up the threshold two more days.
$threshold = (Get-Date).Date.AddDays((0-$days))
if ($DOW -lt $days) {
$threshold = (Get-Date).Date.AddDays((0-$days-2))
}
Once the hard part is done you can just get the files that are older than the threshold and move them.
Get-ChildItem $from | Where-Object { $_.LastWriteTime -lt $threshold } | Move-Item -Destination $to
There are several areas where this could use some tweaking depending on your needs. First, I understand that (Get-Date).DayOfWeek.Value__ may depend on your locale, so you may need to tweak some of the math that uses that value. Also, if you want to maintain the file and folder structure throughout the move, the last line may need to become more complex.
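If the Value__ math gets fiddly, a locale-independent alternative (a sketch, assuming PowerShell 3.0+ for -notin and -File, and reusing $from and $to from above) is to walk back day by day and count only Monday-Friday:
# Walk back one calendar day at a time, counting only weekdays,
# until the requested number of business days has been skipped.
$businessDays = 3
$threshold = (Get-Date).Date
$counted = 0
while ($counted -lt $businessDays) {
    $threshold = $threshold.AddDays(-1)
    if ($threshold.DayOfWeek -notin 'Saturday', 'Sunday') { $counted++ }
}
Get-ChildItem $from -File |
    Where-Object { $_.LastWriteTime -lt $threshold } |
    Move-Item -Destination $to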

Check if data coming continuously

Every hour, data comes into every folder of my dir tree. I need to check whether it really does come in every hour, or whether there was any interruption (for example, no data coming in for 2–3 hours).
I am trying to write a PowerShell script that will check LastWriteTime for every folder, but that would not solve the problem of gaps. If I checked the logs in the morning, I would see that all is OK because some data came into the folder one hour ago, even if nothing arrived a few hours earlier.
So IMHO LastWriteTime is not suitable for this.
Also there is a problem with subfolders. I need to check only the deepest folder in every dir tree, and I do not know how to drop the parent folders, like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I tried to write a script that checks the LastAccessTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir -recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before, that does not solve the problem.
--
The main question is exactly about checking that data arrives continuously. I could build the dir tree by hand, but I need a way to check whether data came into each folder every hour, or whether there were any hours without data...
You can try to set up the PowerShell script to run from the Windows Task Scheduler every hour. That way, the script only has to check whether any data arrived within the past hour.
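A minimal sketch of that hourly check, assuming it runs from Task Scheduler, that only leaf folders (folders with no subfolders) need checking, and a hypothetical log file path:
# Flag every leaf folder that received no new file within the past hour
$rootdir = "Z:\SatData\SatImages\nonprojected"
$cutoff = (Get-Date).AddHours(-1)
# Leaf folders only: directories that contain no further directories
$leafDirs = Get-ChildItem $rootdir -Recurse -Directory |
    Where-Object { -not (Get-ChildItem $_.FullName -Directory) }
foreach ($dir in $leafDirs) {
    $recent = Get-ChildItem $dir.FullName -File |
        Where-Object { $_.LastWriteTime -gt $cutoff }
    if (-not $recent) {
        # No data arrived in the last hour - record the gap with a timestamp
        "$(Get-Date -Format s) no data in $($dir.FullName)" |
            Add-Content "C:\logs\data-gaps.log"
    }
}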

PowerShell Get-ChildItem by specific date

I'm trying to find files from the previous day to copy, but my simple Get-ChildItem isn't working. It works with every other comparison operator except -eq. Any suggestions on how to list files only from the previous day?
get-childitem c:\users | where-object {$_.LastWriteTime -eq (get-date).adddays(-2)}
You are looking for files that were written at an exact time (year, month, day, hours, minutes, and seconds, two days ago). Unless the files were written at that exact second, you will not find them. In other words, you were comparing full DateTime objects and not just dates, so there is very little chance they were exactly equal; that is what made it seem that -eq doesn't work while other comparisons do.
You probably wanted to just compare the dates, without the times:
$yesterday = (get-date).AddDays(-1).Date
gci c:\users | ?{ $_.LastWriteTime.Date -eq $yesterday}
(I also moved the Get-Date call outside, as you probably don't want to evaluate it again and again for every file.)
They are not equal because they differ in time. If you want an exact date match, use the Date property:
get-childitem c:\users | where-object {$_.LastWriteTime.Date -eq (get-date).adddays(-2).Date}