I'm trying to get this script to do a subdirectory compare based on filename, and then process only files that are 30 days old or younger. The syntax seems to be acceptable, but the HandBrakeCLI encoding doesn't launch.
Clear screen
$SourceDir = "\\netshare\testing\Source\*.MP4"
$DestinationDir = "\\netshare\testing\Completed_mp4\*.MP4"
$s1 = get-childitem -path $SourceDir -Recurse -Force | Where-Object {$_.LastWriteTime -gt (Get-Date).addDays(-30)}
$d1 = get-childitem -path $DestinationDir -Recurse
$results = @(compare-object $s1 $d1) | Where-Object {$_.Name -ne $_.Name}
$quantity = $results | measure
$Filecount = $quantity
$Process = 0;
foreach ($result in $results){
Write-Host -----------------------------------------------------------------
Write-Host Handbrake Batch Encoding
$Process++;
$results = $file.DirectoryName + "\" + $file.BaseName + ".MP4";
$progress = ($Process / $filecount) * 100
$progress = [Math]::Round($progress,2)
#Clear-Host
Write-Host "Processing - $results"
Write-Host "File $Process of $Filecount - $progress%"
Write-Host -------------------------------------------------------------------------------
$s1 = get-childitem -path $SourceDir -Recurse -Force | Where-Object {$_.LastWriteTime -gt (Get-Date).addDays(-30)}
$d1 = get-childitem -path $DestinationDir -Recurse
Start-Process "C:\Users\Downloads\HandBrakeCLI-1.0.1-win-x86_64.exe -ArgumentList -q 25 -i '$results' -o '$d1'"
}
$results = @(compare-object $s1 $d1) | Where-Object {$_.Name -ne $_.Name}
would be a good place to start looking. This will not return any results unless $_.Name is NaN (which is unlikely).
Once you've fixed that, there should be an error message saying that
C:\Users\Downloads\HandBrakeCLI-1.0.1-win-x86_64.exe -ArgumentList -q 25 -i '$results' -o '$d1'
cannot be run.
Note that you use quotation marks around the whole line, effectively telling Start-Process that the whole thing is the program to run. Which it isn't.
There's no need for Start-Process here anyway, though, you should be able to just use
C:\Users\Downloads\HandBrakeCLI-1.0.1-win-x86_64.exe -q 25 -i $results -o $d1
(Note also that due to your use of single quotes, you were passing $results and $d1 verbatim to the program instead of the variable contents. Furthermore I'm fairly sure you'd need commas between arguments with -ArgumentList instead of spaces, as that would be normal PowerShell parameter binding behaviour.)
There are a bunch of other mistakes here:
Clear screen can just be clear, cls, or Clear-Host. The trailing screen argument does nothing.
$quantity = $results | measure should probably be $quantity = ($results | measure).Count or simply @($results).Count. Otherwise you won't get the output you want a few lines later.
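Putting those pieces together, a minimal sketch of the corrected loop might look like the following (the paths and HandBrakeCLI location are taken from the question; comparing by Name and building the output file name per source file are assumptions about the intent):
$SourceDir      = "\\netshare\testing\Source"
$DestinationDir = "\\netshare\testing\Completed_mp4"
$handbrake      = "C:\Users\Downloads\HandBrakeCLI-1.0.1-win-x86_64.exe"
$s1 = Get-ChildItem -Path "$SourceDir\*.MP4" -Recurse -Force |
      Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-30) }
$d1 = Get-ChildItem -Path "$DestinationDir\*.MP4" -Recurse
# Source files that have no name match in the destination yet
$results   = @(Compare-Object $s1 $d1 -Property Name -PassThru |
               Where-Object SideIndicator -eq '<=')
$fileCount = $results.Count
$process   = 0
foreach ($file in $results) {
    $process++
    $outFile = Join-Path $DestinationDir ($file.BaseName + ".MP4")
    Write-Host "File $process of $fileCount - Processing $($file.FullName)"
    # Call the CLI directly with the call operator; no Start-Process needed
    & $handbrake -q 25 -i $file.FullName -o $outFile
}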
Related
$homefolder = (gci \\SERVER\homefolder | select fullname)
$outfile = "$env:USERPROFILE\Desktop\Homefolder_Desktop_Redirect.csv"
ForEach ($dir in $homefolder)
{If(Test-Path ($dir.FullName +"\Desktop")){write-host $dir.Fullname" contains desktop" -ForegroundColor Yellow
"{0:N2} GB" -f ((Get-ChildItem $dir.fullname -Recurse | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1GB)
}}
ForEach ($dir in $homefolder)
{If(Test-Path ($dir.FullName +"\Desktop")){}else{write-host $dir.Fullname" does not contain desktop" -ForegroundColor Red
"{0:N2} GB" -f ((Get-ChildItem $dir.fullname -Recurse | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1GB)
}}
I'm trying to get this to output to a file. If I put the pipe between the last two }} or after the last } (in each ForEach), I'm told the pipe is empty. If I put the If inside another set of parentheses, like {(If, I get that If isn't valid.
If I try to write/append after 1GB), my outfile is just my script.
If I try making the Foreach($dir in $homefolder) a variable, the in is flagged as an unexpected token.
I'm sure this is something simple, but I haven't used PowerShell for much in the last 5 years... assistance is appreciated.
---UPDATE---
Thanks for the help, all!
This is what I have thanks to the assistance I've received.
$outfile = "$env:USERPROFILE\Desktop\Homefolder_Desktop_Redirect.txt"
Write-Output "Contains desktop:" | Set-Content $outfile -Force
(Get-ChildItem \\SERVER\homefolder).FullName | ForEach-Object {
if(Test-Path (Join-Path $_ -ChildPath Desktop)) {
Write-Host "$_ contains desktop" -ForegroundColor Yellow
"$_ [{0:N2} GB]" -f (
(Get-ChildItem $_ -Recurse |
Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
).Sum / 1GB)
}
} | Add-Content $outfile -Force
Write-Output "Contains NO desktop:" | Add-Content $outfile -Force
(Get-ChildItem \\SERVER\homefolder).FullName | ForEach-Object {
if(Test-Path (Join-Path $_ -ChildPath Desktop)) {}
else{
Write-Host "$_ contains no desktop" -ForegroundColor Red
"$_ [{0:N2} GB]" -f (
(Get-ChildItem $_ -Recurse |
Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
).Sum / 1GB)
}
} | Add-Content $outfile -Force
Invoke-Item $outfile
The main reason why PowerShell complains is that you're trying to pipe after a language keyword, which is simply not possible. You can, however, use ForEach-Object, a cmdlet designed to enumerate input objects from the pipeline, and because it is a cmdlet and not a statement (foreach), you can pipe other cmdlets to it:
(Get-ChildItem \\SERVER\homefolder).FullName | ForEach-Object {
if(Test-Path (Join-Path $_ -ChildPath Desktop)) {
Write-Host "$_ contains desktop" -ForegroundColor Yellow
}
else {
Write-Host "$_ does not contain desktop" -ForegroundColor Red
}
$output = "$_ [{0:N2} GB]" -f (
(Get-ChildItem $_ -Recurse |
Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
).Sum / 1GB)
# send output to the host
Write-Host $output
# send output to the success stream
$output
} | Set-Content path\to\export.txt
Generally, if you want to send the output from multiple statements to a single file, enclose them in & { ... } (or . { ... } to run directly in the caller's scope); a simplified example:
& {
foreach ($i in 1..5) { $i }
foreach ($i in 6..10) { $i }
} | Out-File test.txt
However, you can reformulate your code to a single pipeline, using the ForEach-Object cmdlet rather than the foreach loop statement:
$homefolder |
ForEach-Object {
$hasDesktop = Test-Path (Join-Path $_.FullName Desktop)
Write-Host ('{0} {1} desktop' -f $_.FullName, ('does not contain', 'contains')[$hasDesktop]) -ForegroundColor ('Red', 'Yellow')[$hasDesktop]
'Contains {0}desktop' -f ('NO ', '')[$hasDesktop]
'{0:N2} GB' -f ((Get-ChildItem $_.FullName -Recurse | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1GB)
} |
Out-File $outfile
Note:
Note the technique of letting a Boolean variable $hasDesktop select one of two values from an array, e.g., ('Red', 'Yellow')[$hasDesktop], which allows you to make do with a single Write-Host call to cover both the has-desktop and doesn't-have-desktop case.
This acts similarly to the ternary conditional operator, which is available in PowerShell (Core) 7+ only; that is, the above is equivalent to ($hasDesktop ? 'Yellow' : 'Red').
The output string is pieced together with the help of -f, the format operator.
As for your desire to print the calculated size to the display as well:
In Windows PowerShell, capture it in a variable first, write its value to the display with Write-Host, then output it to the success stream, as shown in Santiago Squarzon's helpful answer.
In PowerShell (Core) 7+, you can simply pipe to Tee-Object CON, as discussed in this answer.
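As a small, self-contained illustration of these notes (the $hasDesktop and $sizeInGB values here are made up):
$hasDesktop = $true
$sizeInGB   = 1.2345
# A Boolean index picks element 0 ($false) or 1 ($true) from a two-element array;
# in PowerShell 7+ the same thing could be written ($hasDesktop ? 'contains' : 'does not contain')
$verb = ('does not contain', 'contains')[$hasDesktop]
# -f fills the {0}, {1}, ... placeholders with the right-hand operands
$line = '{0} desktop [{1:N2} GB]' -f $verb, $sizeInGB
# Windows PowerShell: write to the host and also send the string on to the file
Write-Host $line
$line | Add-Content $outfile
# PowerShell 7+ on Windows: Tee-Object CON shows the string on the console and passes it through
# $line | Tee-Object CON | Add-Content $outfile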
New here and getting to learn PowerShell, so forgive me for mistakes.
A senior staff member had left abruptly, and I was tasked with finding all folders in DFS that the employee had access to (for security reasons).
I couldn't find a script that does that for me (to scan 14 TB of DFS shares and find which folders the user or his group memberships may have access to), so I just wrote my own.
It's working fine but too slow for my liking; I'm wondering if it can be tuned to run faster.
I'm running it in two parts: save the folder list first, then use each folder path to get the ACL permissions and filter against the username into a CSV (with ~ as the delimiter to avoid messing with commas).
Using PowerShell 5.1.
$ErrorActionPreference = "Continue"
#$rootDirectory = 'C:\temp'
$rootDirectory = '\\?\UNC\myServer\myShare'
$scriptName = 'myACL'
$version = 1.0
$dateStamp = (Get-Date).ToString('yyyyMMddHHmm')
$scriptDirectory = $PSScriptRoot
$log = $scriptDirectory + "\" + $scriptName + "_dirList_v" + $version + "_"+$dateStamp+".log"
"Path" | Out-File $log
function getSubfolders ([String]$arg_directory, [string]$arg_log)
{
$subFolders = Get-ChildItem -LiteralPath $arg_directory -Directory -Force -ErrorAction SilentlyContinue | Select-Object -expandProperty FullName
$subFolders | Out-File $arg_log -append
#"just before loop" | Out-File $arg_log -append
foreach ($folder in $subFolders)
{
#"working on $folder" | Out-File $arg_log -append
getSubfolders $folder $arg_log
}
#"returning from function" | Out-File $arg_log -append
}
#part1
getSubfolders $rootDirectory $log
#part2
$dirListSourceFile = $log
$log2 = $scriptDirectory + "\" + $scriptName + "_permissionList_v" + $version + "_"+$dateStamp+".csv"
$i=0
"Sr~Path~User/Group~Rights~isInherited?" | Out-File $log2
Start-Sleep -s 2
Import-CSV $dirListSourceFile | ForEach-Object{
$i++
$path = $_.path.Trim()
$Acl = get-acl $path | Select *
ForEach ($Access in $Acl.Access)
{
if($Access.IdentityReference.value -eq "mydomain\user1" -or $Access.IdentityReference.value -eq "mydomain\sg1" -or $Access.IdentityReference.value -eq "mydomain\sg2" -or $Access.IdentityReference.value -eq "mydomain\sg3" -or $Access.IdentityReference.value -eq "mydomain\sg4")
{
"$i~$path~$($Access.IdentityReference.value)~$($Access.FileSystemRights)~$($Access.IsInherited)" | Out-File $log2 -append
}
}
}
As you can read in the comments, if you have the possibility to run the code locally, do it. You can use the same technique as you did for the UNC path with local paths as well, e.g. \\?\C:\directory
see:
https://learn.microsoft.com/en-us/windows/win32/fileio/maximum-file-path-limitation?tabs=registry
Furthermore, you wrote a recursive function, but that's not necessary in this case, as Get-ChildItem has this feature built in. Currently you call Get-ChildItem again for each subfolder, and you also write a log entry to disk each time. It's faster to collect the data and write it to disk once, e.g.:
#Get paths locally and add \\?\ in combination with -PSPath to overcome the ~260-character MAX_PATH limit; add -Recurse for recursive enumeration
$folders = Get-ChildItem -PSPath "\\?\[localpath]" -Directory -Recurse -ErrorAction:SilentlyContinue
#Write to logfile
$folders.FullName | Set-Content -Path $arg_log
Also, if you want to optimize performance, avoid unnecessary operations like this:
$Acl = get-acl $path | Select *
Get-Acl already gives you a complete object, and you take it, send it over the pipeline, and select all (*) properties from it. Why? $Acl = Get-Acl $path is enough.
Finally, you may use the .NET IO classes directly instead of Get-ChildItem; see:
How to speed up Powershell Get-Childitem over UNC
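As a rough sketch of that last point, the directory enumeration could be done with the .NET IO classes instead of Get-ChildItem (the \\?\ prefix and the $log variable are reused from the original script; the single try/catch is an assumption, since this overload stops at the first access-denied error):
$root = '\\?\UNC\myServer\myShare'
try {
    # EnumerateDirectories streams results instead of building the full list in memory first
    $folders = [System.IO.Directory]::EnumerateDirectories(
        $root, '*', [System.IO.SearchOption]::AllDirectories)
    $folders | Set-Content -Path $log
}
catch {
    # A single inaccessible folder aborts the whole enumeration with this overload;
    # per-folder error handling would need your own (iterative) recursion
    Write-Warning $_
}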
My code below:
$Source = "C:\Users\xxxx"
$targetFolder = "C:\Users\xxx\new"
Get-ChildItem -Path $Source -Include * | forEach-Object{
$fileObject = $_
$filename = $_.BaseName.Substring(26)
Get-ChildItem -Path $Source -Include * | forEach-Object{
$tempObject = $_
$temp = $_.BaseName.Substring(26)
if($temp -eq $filename -And $tempObject -ne $fileObject){
Start-Process $fileObject + $tempObject -Verb Print -ArgumentList $targetFolder
}
}
}
I'm able to successfully match files based on part of their names. Now I'd like to print the two matching images vertically, one on top of the other, as a PDF; the printer name would be "Microsoft Print to PDF".
What I truly want is for the images that match according to my if statement to be combined into one image, so if there's another solution that can make that happen, I'm open to it as well!
Any ideas?
We are trying to run a script against a pile of remote computers to check the date stamps of files in a fixed folder that are older than, say, 12 hours and return the results to a CSV. The date range needs to be flexible, as it's a set time of 6 pm yesterday, which will move as time moves on.
$computers = Get-Content -Path computers.txt
$filePath = "c:\temp\profile"
$numdays = 0
$numhours = 12
$nummins = 5
function ShowOldFiles($filepath, $days, $hours, $mins)
{
$files = $computers #(get-childitem $filepath -include *.* -recurse | where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.psIsContainer -eq $false)})
if ($files -ne $NULL)
{
for ($idx = 0; $idx -lt $files.Length; $idx++)
{
$file = $files[$idx]
write-host ("Old: " + $file.Name) -Fore Red
}
}
}
Write-output $computers, $numdays, $numhours, $nummins >> computerlist.txt
You could run the following script against all of your remote machines:
$computers = Get-Content -Path computers.txt
$logFile = "\\ServerName\C$\Logfile.txt"
$date = "12/03/2002 12:00"
$limit = Get-Date $date
$computers | %{
$filePath = "\\$_\C$\temp\profile"
$files = $null
$files = Get-ChildItem -Path $filePath -Recurse -Force | `
Where-Object {$_.CreationTime -lt $limit }
If($files -ne $null){
"-------------------------[$($_)]------------------------">> $logFile
$files | Foreach {$_.FullName >> $logFile}
}
}
This will check the given folder ($filePath) for files that are older than the given limit. Files older than the limit will have their full file path logged to the given network location, $logFile.
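Since the goal was a CSV, the same loop could also emit objects and hand them to Export-Csv at the end; a sketch (the column names and output path are just examples):
$results = $computers | ForEach-Object {
    $computer = $_
    Get-ChildItem -Path "\\$computer\C$\temp\profile" -Recurse -Force |
        Where-Object { $_.LastWriteTime -lt $limit } |
        ForEach-Object {
            # one row per old file
            [PSCustomObject]@{
                Computer      = $computer
                FullName      = $_.FullName
                LastWriteTime = $_.LastWriteTime
            }
        }
}
$results | Export-Csv -Path "\\ServerName\C$\OldFiles.csv" -NoTypeInformation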
With a small alteration to @chard's earlier code, I managed to get a workable solution.
The output log file only returns the files that are older than the date in the code.
This can be manipulated in Excel with other outputs for our needs.
I will try the updated code above in a bit.
$computers = Get-Content -Path "C:\temp\computers.txt"
$logFile = "\\SERVER\logs\output.txt"
$numdays = 3
$numhours = 10
$nummins = 5
$limit = (Get-Date).AddDays(-$numdays).AddHours(-$numhours).AddMinutes(-$nummins)
$computers | %{
$filePath = "\\$_\C$\temp\profile\runtime.log"
Get-ChildItem -Path $filePath -Recurse -Force | `
Where-Object {$_.LastWriteTime -lt $limit } |
foreach {"$($_)">> $logFile}
}
I really need help creating a script that backs up files and reports the error along with the file that did not copy.
Here is what I tried:
Creating lists of file paths to pass on to Copy-Item, in hopes of later catching errors per file and logging them:
By using $list2X I would be able to cycle through each file, but Copy-Item loses the directory structure and shoots it all out to a single folder.
So for now I am using $list2, and later I do Copy-Item -Recurse to copy the folders:
#create list to copy
$list = Get-ChildItem -path $source | Select-Object Fullname
$list2 = $list -replace ("}"),("")
$list2 = $list2 -replace ("@{Fullname=") , ("")
out-file -FilePath g:\backuplog\DirList.txt -InputObject $list2
#create list crosscheck later
$listX = Get-ChildItem -path $source -recurse | Select-Object Fullname
$list2X = $listX -replace ("}"),("")
$list2X = $list2X -replace ("#{Fullname=") , ("")
out-file -FilePath g:\backuplog\FileDirList.txt -InputObject $list2X
And here I would pass the list:
$error.clear()
Foreach($item in $list2){
Copy-Item -Path $item -Destination $destination -recurse -force -erroraction Continue
}
out-file -FilePath g:\backuplog\errorsBackup.txt -InputObject $error
Any help with this is greatly appreciated!!!
The answer to complex file-copying or backup scripts is almost always: "Use robocopy."
Bill
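For example, a minimal robocopy call that copies a tree and logs what failed might look like this (the paths are placeholders and the retry/wait values are just common defaults, not taken from the question):
# /E copies subdirectories including empty ones, /R:2 /W:5 limit retries and wait time,
# /LOG: writes everything (including errors) to a file, /TEE also echoes it to the console
robocopy C:\Source D:\Backup /E /R:2 /W:5 /TEE /LOG:C:\Logs\backup.log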
"Want to copy all the items in C:\Scripts (including subfolders) to C:\Test? Then simply use a wildcard character..."
Next make it easier on yourself and do something like this:
$files = (Get-ChildItem $path).FullName #Requires PS 3.0
#or
$files = Get-ChildItem $path | % {$_.Fullname}
$files | Out-File $outpath
Well, it took me a long time, considering my response time. Here is my copy function, which logs most errors (network drops, failed copies, etc.) together with the TargetObject of each error.
Function backUP{ Param ([string]$destination1 ,$list1)
$destination2 = $destination1
#extract the last folder name from the destination to build the backup log path
$index = $destination2.LastIndexOf("\")
$count = $destination2.length - $index
$source1 = $destination2.Substring($index, $count)
$finalstr2 = $logdrive + $source1
Foreach($item in $list1){
Copy-Item -Container: $true -Recurse -Force -Path $item -Destination $destination1 -erroraction Continue
if(-not $?)
{
write-output "ERROR de copiado : " $error| format-list | out-file -Append "$finalstr2\GCI-ERRORS-backup.txt"
Foreach($erritem in $error){
write-output "Error Data:" $erritem.TargetObject | out-file -Append "$finalstr2\GCI- ERRORS-backup.txt"
}
$error.Clear()
}
}
}
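For completeness, a hedged sketch of how this function might be invoked; the original post doesn't show the call site, so the paths, $logdrive, and the list built with Get-ChildItem are assumptions:
# $logdrive must be set in the calling scope, because the function reads it to build the log path,
# and the resulting log folder has to exist before Out-File -Append can write to it
$logdrive    = "G:\backuplog"
$destination = "D:\Backup\Documents"
$list        = (Get-ChildItem -Path "C:\Users\someuser\Documents").FullName
backUP -destination1 $destination -list1 $list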