PowerShell adding file sizes, unexpected result

I am currently working on a PowerShell script to copy a random selection of songs from my NAS onto an SD card. As an added complication, I can have no more than 512 songs per folder, and I obviously need to stop the process before I run out of free space on the card.
I have written a nearly complete script (with reduced numbers of songs for testing purposes), but am struggling with keeping track of the total size of the files I have copied. As an example, a test run with a total of 112MB of files gives a recorded value (in $copied_size) of 1245. I don't know what that value means; it doesn't seem to be a realistic value in GB, Gb, MB or Mb. I am obviously missing something here. Any ideas?
Here is the script, I haven't put in the size restriction yet:
$j = 1
$i = 0
$files_per_folder = 5
$sd_card_size = 15920000000
$copied_size = 0
$my_path = '\\WDMYCLOUD\Public\Shared Music'
$random = Get-Random -Count 100 -InputObject (1..200)
For ($j = 1; $j -le 5; $j++)
{
    md ("F:\" + $j)
    $list = Get-ChildItem -Path $my_path | ?{ $_.PSIsContainer -eq $false -and $_.Extension -eq '.mp3' }
    For ($i = 0; $i -le $files_per_folder - 1; $i++)
    {
        Copy-Item -Path ($my_path + "\" + $list[$random[(($j - 1) * $files_per_folder) + $i]]) -Destination ('F:\' + $j)
        $copied_size = $copied_size + ($my_path + "\" + $list[$random[(($j - 1) * $files_per_folder) + $i]]).length
    }
}
Write-Host "Copied Size = " $copied_size

The mystery value has a simple cause: $my_path + "\" + $list[...] builds a string, and .Length on a string is its character count, so $copied_size was summing path lengths rather than file sizes in bytes. Take .Length from the FileInfo objects themselves instead. Here's a way to solve your problem using some more PowerShell-like patterns; it compares each file to be copied against the space remaining on the card and exits the top-level loop once a file no longer fits.
#requires -Version 3
$path = '\\share\Public\Shared Music'
$filesPerFolder = 5
$copySize = 0
$random = 1..200 | Get-Random -Count 100
$files = Get-ChildItem -Path $path -File -Filter *.mp3

:main
for ($i = 1; $i -le 5; $i++) {
    $dest = New-Item -Path F:\$i -ItemType Directory -Force
    for ($j = 0; $j -lt $filesPerFolder; $j++) {
        $file = $files[$random[(($i - 1) * $filesPerFolder) + $j]]
        # bail out of both loops once the next file no longer fits on the card
        if ((Get-PSDrive -Name F).Free -lt $file.Length) {
            break main
        }
        $file | Copy-Item -Destination $dest
        $copySize += $file.Length
    }
}
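A note on the design: the :main label in front of the outer for is what lets break main bail out of both loops at once when the card is full; a plain break would only leave the inner loop, and the outer loop would keep creating folders.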

Related

How to improve this PowerShell script?

I wrote a PowerShell script that merges GoPro video files when a recording is split across multiple files. It works when the videos are at the root of the drive, i.e. C:\, but not otherwise: the mergevideos.txt file is not created if I run the script from a different directory, or the txt is created but it's empty. I'm not sure what's going on when it runs from a different directory.
So is there a way to fix these issues and refactor this code to make it better? Ideally I want the script to automatically look at the directory it's in, or to let me specify the directory it should work in, so I can call it from one location while the videos can be anywhere.
$path = "C:/NewVideos/"
$oldvids = Get-ChildItem -Path $path *.mp4
foreach ($oldvid in $oldvids) {
$curpath = $oldvid.DirectoryName
$name = [System.IO.Path]::GetFileNameWithoutExtension($oldvid)
$ext = [System.IO.Path]::GetExtension($oldvid)
if ($name.StartsWith("GX01") -and !($name.EndsWith("_merged")))
{
$newvid = $curpath + $name + "_merged" + $ext
if ([System.IO.File]::Exists($newvid))
{
Write-Output "$name | ALREADY MERGED"
continue
}
$count = 1
for ($num = 2; $num -lt 10; $num++)
{
$nextpart = $name.Replace("GX01", "GX0" + $num)
if ([System.IO.File]::Exists($curpath + "/" + $nextpart + $ext))
{
$count = $num
}
}
if ($count -eq 1)
{
Write-Output "$name | SINGLE VIDEO"
continue
}
$mergefile = $curpath + "mergevideos.txt"
if (!(Test-Path $mergefile))
{
New-Item -path $curpath -name mergevideos.txt -type "file"
}
Clear-Content $mergefile
for ($num = 1; $num -le $count; $num++)
{
$nextpart = $name.Replace("GX01", "GX0" + $num)
$videofilename = "file '" + $nextpart + $ext + "'"
Add-Content $mergefile $videofilename
}
Write-Output "$name | MERGING $count VIDEOS"
ffmpeg -f concat -safe 0 -i $mergefile -c copy $newvid
}
}
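The symptoms are consistent with the bare string concatenations: $curpath + $name and $curpath + "mergevideos.txt" only form valid paths when $curpath happens to end in a separator, which is true at a drive root like C:\ but not elsewhere. As a minimal sketch of the direction (assuming the rest of the merge logic stays as-is), building every path with Join-Path and defaulting the working directory to the automatic $PSScriptRoot variable makes the script location-independent:

param (
    # default to the folder this script file lives in; pass -Path to override
    [string]$Path = $PSScriptRoot
)

$oldvids = Get-ChildItem -Path $Path -Filter *.mp4
foreach ($oldvid in $oldvids) {
    # Join-Path inserts the separator that plain string concatenation
    # drops once you are no longer at a drive root
    $newvid    = Join-Path $oldvid.DirectoryName ($oldvid.BaseName + '_merged' + $oldvid.Extension)
    $mergefile = Join-Path $oldvid.DirectoryName 'mergevideos.txt'
    # ... rest of the merge logic unchanged ...
}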

Reducing amount of lines in a variable within a loop in PowerShell

I have a txt file containing 10000 lines. Each line is an ID.
Within every loop iteration I want to select 100 lines, put them in a special format and do something. I want to do this until the document is finished.
The txt looks like this:
406232C1331283
4062321N022075
4062321H316457
Current approach:
$liste = Get-Content "C:\x\input.txt"
foreach ($item in $liste) {
    azcopy copy $source $target --include-pattern "*$item*" --recursive=true
}
The system will go through the TXT file and make a copy request for every name it finds. Now, the system can handle around 300 search patterns in one request, like:
azcopy copy $source $target --include-pattern "*id1*;*id2*;*id3*"
How can I extract 300 items from the document at once, separate them with semicolons and embed them in wildcards? I tried to pipe everything into a variable and work with -Skip, but it doesn't seem easy to handle :(
Use the -ReadCount parameter to Get-Content to send multiple lines down the pipeline:
Get-Content "C:\x\input.txt" -ReadCount 300 | ForEach-Object {
$wildCards = ($_ | ForEach-Object { "*$_*" } -join ';'
azcopy copy $source $target --include-pattern $wildCards --recursive=true
}
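With -ReadCount 300, Get-Content sends the lines down the pipeline in arrays of up to 300 instead of one at a time, so $_ inside the ForEach-Object block is a whole batch of lines; the inner ForEach-Object wraps each line in wildcards and -join glues them together with semicolons.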
Do you want 100 or 300 at a time? ;-)
I'm not sure if I really got what the end goal is, but to slice a given number of elements into chunks of a certain size, you can use a for loop like this:
$liste = Get-Content -Path 'C:\x\input.txt'
for ($i = 0; $i -lt $Liste.Count; $i += 100) {
    $Liste[$i..$($i + 99)]
}
Now, if I got it right, you want to join these 100 elements and surround them with certain characters... this might work:
'"*' + ($Liste[$i..$($i + 99)] -join '*;*') + '*"'
Together it would be this:
$liste = Get-Content -Path 'C:\x\input.txt'
for ($i = 0; $i -lt $Liste.Count; $i += 100) {
    '"*' + ($Liste[$i..$($i + 99)] -join '*;*') + '*"'
}
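(Note that PowerShell is forgiving when a slice runs past the end of the array: on the last iteration, $Liste[$i..$($i + 99)] just returns whatever elements exist, so the shorter final chunk needs no special handling.)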
There are many ways; here's one of them...
First I would split the array into chunks of 100 elements each, using this helper function:
Function Split-Array ($list, $count) {
    $aggregateList = @()
    $blocks = [Math]::Floor($list.Count / $count)
    $leftOver = $list.Count % $count
    $start = 0
    for ($i = 0; $i -lt $blocks; $i++) {
        $end = $count * ($i + 1) - 1
        $aggregateList += @(,$list[$start..$end])
        $start = $end + 1
    }
    if ($leftOver -gt 0) {
        $aggregateList += @(,$list[$start..($end + $leftOver)])
    }
    $aggregateList
}
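The unary comma in @(,$list[$start..$end]) matters: it wraps each slice in a one-element array, so += appends the chunk as a single nested element instead of flattening its items into $aggregateList.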
For example, to split your list into chunks of 100, do this:
$Splitted = Split-Array $liste -count 100
Then use foreach to iterate each chunk and join its elements for the pattern you need:
foreach ($chunk in $Splitted)
{
    $Pattern = '"' + (($chunk | % {"*$_*"}) -join ";") + '"'
    azcopy copy $source $target --include-pattern $Pattern --recursive=true
}

PowerShell: How to use Remove-Item to delete files in batches?

I have a small script which deletes files from a directory where date1 is less than date2. My script is working; however, as this runs on a directory with many files, I'd like Remove-Item to only remove 100 files at a time so that I can monitor the progress with each run. Is this possible?
if ($date1 -lt $date2)
{
    $_ | Remove-Item;
}
Use a for loop and make the counter step 100 on each iteration:
if ($date1 -ge $date2) {
    # nothing to be done
    return
}
$files = Get-ChildItem $pathToFolder
for ($i = 0; $i -lt $files.Count; $i += 100) {
    $null = Read-Host "Press Enter to delete the next 100 files... "
    $filesToDelete = $files[$i..($i + 99)]
    $filesToDelete | Remove-Item
    Write-Host "Deleted $($filesToDelete.Count) files..."
}
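If you'd like a dry run before committing to a batch, Remove-Item supports the common -WhatIf switch: $filesToDelete | Remove-Item -WhatIf lists what would be deleted without touching anything.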

Split a large CSV file into multiple CSV files according to size in PowerShell

I have a large CSV file and I want to split it by size, with the header repeated in every file.
For example, I have this 1.6MB file and the child files shouldn't be more than 512KB each, so practically the parent file should produce 4 child files.
I tried the simple program below, but the file is split into blank child files.
function csvSplitter {
    $csvFile = "D:\Test\PTest\Dummy.csv";
    $split = 10;
    $content = Import-Csv $csvFile;
    $start = 1;
    $end = 0;
    $records_per_file = [int][Math]::Ceiling($content.Count / $split);
    for ($i = 1; $i -le $split; $i++) {
        $end += $records_per_file;
        $content | Where-Object {[int]$_.Id -ge $start -and [int]$_.Id -le $end} | Export-Csv -Path "D:\Test\PTest\Destination\file$i.csv" -NoTypeInformation;
        $start = $end + 1;
    }
}
csvSplitter
The logic for the size of the file is yet to be written.
I tried to attach both files, but I guess there is no option to attach files here.
this takes a slightly different path to a solution. [grin]
it ...
loads the CSV as a plain text file
saves the 1st line as a header line
calcs the batch size from the total line count & the batch count
uses array index ranges to grab the lines for each batch
combines the header line with the current batch of lines
writes that out to a text file
the reason for such a roundabout method is to save RAM. one drawback to loading the file as a CSV is the sheer amount of RAM needed. just loading the lines of text requires noticeably less RAM.
$SourceDir = $env:TEMP
$InFileName = 'LargeFile.csv'
$InFullFileName = Join-Path -Path $SourceDir -ChildPath $InFileName

$BatchCount = 4

$DestDir = $env:TEMP
$OutFileName = 'LF_Batch_.csv'
$OutFullFileName = Join-Path -Path $DestDir -ChildPath $OutFileName

#region >>> build file to work with
# remove this region when you are ready to do this with your test data OR to do this with real data
if (-not (Test-Path -LiteralPath $InFullFileName))
{
    Get-ChildItem -LiteralPath $env:APPDATA -Recurse -File |
        Sort-Object -Property Name |
        Select-Object Name, Length, LastWriteTime, Directory |
        Export-Csv -LiteralPath $InFullFileName -NoTypeInformation
}
#endregion >>> build file to work with

$CsvAsText = Get-Content -LiteralPath $InFullFileName
[array]$HeaderLine = $CsvAsText[0]
$BatchSize = [int]($CsvAsText.Count / $BatchCount) + 1

$StartLine = 1
foreach ($B_Index in 1..$BatchCount)
{
    if ($B_Index -ne 1)
    {
        $StartLine = $StartLine + $BatchSize + 1
    }
    $CurrentOutFullFileName = $OutFullFileName.Replace('_.', ('_{0}.' -f $B_Index))
    $HeaderLine + $CsvAsText[$StartLine..($StartLine + $BatchSize)] |
        Set-Content -LiteralPath $CurrentOutFullFileName
}
there is no output on screen, but i got 4 files named LF_Batch_1.csv thru LF_Batch_4.csv that contained the four parts of the source file as expected. the last file has a slightly smaller number of rows, but that is what happens when the row count is not evenly divisible by the batch count. [grin]
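(If even the raw lines are too much for RAM, the same batching idea can be done incrementally: read with Get-Content -ReadCount, or a System.IO.StreamReader, and write each batch out as it arrives instead of slicing one big in-memory array.)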
Try this:
Add-Type -AssemblyName System.Collections
function Split-Csv {
    param (
        [string]$filePath,
        [int]$partsNum
    )
    # Use generic lists for import/export
    [System.Collections.Generic.List[object]]$contentImport = @()
    [System.Collections.Generic.List[object]]$contentExport = @()
    # import csv-file
    $contentImport = Import-Csv $filePath
    # how many lines per export file
    $linesPerFile = [Math]::Max( [int]($contentImport.Count / $partsNum), 1 )
    # start pointer for source list
    $startPointer = 0
    # counter for file name
    $counter = 1
    # main loop
    while( $startPointer -lt $contentImport.Count ) {
        # clear export list
        [void]$contentExport.Clear()
        # determine from-to from source list to export
        $endPointer = [Math]::Min( $startPointer + $linesPerFile, $contentImport.Count )
        # move lines to export to export list
        [void]$contentExport.AddRange( $contentImport.GetRange( $startPointer, $endPointer - $startPointer ) )
        # export
        $contentExport | Export-Csv -Path ($filePath.Replace('.', $counter.ToString() + '.' ) ) -NoTypeInformation -Force
        # move pointer
        $startPointer = $endPointer
        # increase counter for filename
        $counter++
    }
}
Split-Csv -filePath 'test.csv' -partsNum 7
try running this script:
$sw = new-object System.Diagnostics.Stopwatch
$sw.Start()
$FilePath = $HOME +'\Documents\Projects\ADOPT\Data8277.csv'
$SplitDir = $HOME +'\Documents\Projects\ADOPT\Split\'
CSV-FileSplitter -Path $FilePath -PartSizeBytes 35MB -SplitDir $SplitDir #-Verbose
$sw.Stop()
Write-Host "Split complete in " $sw.Elapsed.TotalSeconds "seconds"
I created this for files larger than 50GB.

Renaming folders in ascending order

I want to automate my backups and always keep some old versions. The idea was to use Windows Backup on a share and use a PowerShell script to start it.
I'm almost done, but I'm stuck at the renaming.
Share: \\Server\SysBackup\WindowsImageBackup
In that share there are folders for all the PCs I have. So, for example, if I want to keep the last three backups, it should do the following:
Current backup: PC1
Old backups: PC1_1, PC1_2
Now I want to rename each of them to one higher number:
PC1_2 → PC1_3
PC1_1 → PC1_2
PC1 → PC1_1
So the backup can now use the PC1 folder for the newest backup.
That's what I tried so far:
$BackupFolder = Get-ChildItem -Directory ($Target + "\WindowsImageBackup") |
    Where-Object -Property Name -Like $env:computername* |
    Select-Object -Property Name
$CurrentBackups = $BackupFolder.Count
if ($CurrentBackups -ge 1) {
    Push-Location ($Target + "\WindowsImageBackup")
    $i = 0
    $xCurrentArray = $CurrentBackups - 1
    $NewSubVersion = $CurrentBackups
    while ($i -le $CurrentBackups) {
        $NewName = $BackupFolder[$xCurrentArray].Name.TrimEnd("_")
        Rename-Item $BackupFolder[$xCurrentArray] -NewName
    }
    Pop-Location
    Clear-Variable $i
}
The files are not renamed, and I'm getting the following errors:
You cannot call a method on a null-valued expression.
Rename-Item : Missing an argument for parameter 'NewName'. Specify a parameter of type 'System.String' and try again.
Where is my mistake?
I found the error
if ($CurrentBackups -ge 1)
{
    Push-Location ($Target + "\WindowsImageBackup\")
    $i = 0
    $xCurrentArray = $CurrentBackups - 1
    $NewSubVersion = $CurrentBackups
    while ($i -lt $CurrentBackups)
    {
        if ($BackupFolder[$xCurrentArray].Contains("_"))
        {
            $Index = $BackupFolder[$xCurrentArray].IndexOf("_")
            $NewName = $BackupFolder[$xCurrentArray].Substring(0, $Index)
            Rename-Item $BackupFolder[$xCurrentArray] -NewName ($NewName + "_" + $NewSubVersion)
            Clear-Variable Index
            $NewSubVersion--
        }
        else
        {
            $NewName = $BackupFolder[$xCurrentArray]
            Rename-Item $BackupFolder[$xCurrentArray] -NewName ($NewName + "_1")
        }
        $i++
        $xCurrentArray--
    }
    Pop-Location
    Clear-Variable i
}
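For comparison, a more compact rotation is possible (a sketch, not the poster's code; $Target and the folder naming are assumed from the question). Renaming in descending order of the numeric suffix means every target name is already free, and a folder with no suffix simply counts as 0:

$base = Join-Path $Target 'WindowsImageBackup'

# rename highest suffix first so nothing collides:
# PC1_2 -> PC1_3, then PC1_1 -> PC1_2, then PC1 -> PC1_1
Get-ChildItem -Path $base -Directory |
    Where-Object Name -Like "$env:COMPUTERNAME*" |
    Sort-Object { [int]('0' + ($_.Name -split '_')[1]) } -Descending |
    ForEach-Object {
        $next = [int]('0' + ($_.Name -split '_')[1]) + 1
        Rename-Item -Path $_.FullName -NewName ('{0}_{1}' -f $env:COMPUTERNAME, $next)
    }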