Powershell script to apply Caption-Abstract on image from .txt file

Question answered!
Feel free to use the code below; just adjust the paths to match your system.
[Array] $arrayFromFile = Get-Content -Path '<path>\renameFileTitle.txt'
$iterator = 0
$files = Get-ChildItem "<path>\*.jpg"
# quick sanity check: echo the first caption read from the file
$arrayFromFile[0]
foreach ($file in $files) {
    $i = $arrayFromFile[$iterator]
    # invoke exiftool via the call operator so the quoted path runs as a command
    & '<path>\exiftool.exe' -Caption-Abstract="$i" $file.FullName
    $iterator++
}

You are setting $i = 0 again in every iteration. Move it outside the loop.
# initialization (outside the loop, so the index is not reset each pass)
$i = 0
foreach ($file in $files) {
    & 'C:\Users\Chris\Desktop\Pics\exiftool.exe' -Caption-Abstract="$($arrayFromFile[$i])" $file.FullName
    # increase at every iteration
    $i++
}
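If you would rather not maintain a counter by hand, a minimal variation (assuming, as above, that the captions file has one line per image and sorts in the same order as the pictures) is to loop by index:
for ($i = 0; $i -lt $files.Count; $i++) {
    & '<path>\exiftool.exe' -Caption-Abstract="$($arrayFromFile[$i])" $files[$i].FullName
}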

Related

Powershell/ Print by filename

My English may not be perfect but I do my best.
I'm trying to write a Powershell script where the filename has a number at the end, and the file should print exactly that many times.
Is this somehow possible?
With my script each file prints only one time.
For whatever reason...
param (
    [string]$file = "C:\Scans\temp\*.pdf",
    [int]$number_of_copies = 1
)
foreach ($onefile in (Get-ChildItem $file -File)) {
    $onefile -match '\d$' | Out-Null
    for ($i = 1; $i -le [int]$number_of_copies; $i++) {
        cmd /C "lpr -S 10.39.33.204 -P optimidoc ""$($onefile.FullName)"""
    }
}
There is no need for parameter $number_of_copies when the number of times it should be printed is taken from the file's BaseName anyway.
I would change your code to:
param (
    [string]$path = 'C:\Scans\temp'
)
Get-ChildItem -Path $path -Filter '*.pdf' -File |
    # filter only files that end with a number and capture that number in $matches[1]
    Where-Object { $_.BaseName -match '(\d+)$' } |
    # loop through the files and print
    ForEach-Object {
        for ($i = 1; $i -le [int]$matches[1]; $i++) {
            cmd /C "lpr -S 10.39.33.204 -P optimidoc ""$($_.FullName)"""
        }
    }
Inside the ForEach-Object, on each iteration, the $_ automatic variable represents the current FileInfo object.
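For illustration, here is how that capture behaves on a hypothetical base name ending in a number:
'scan_05' -match '(\d+)$'   # True; also fills the $matches hashtable
$matches[1]                 # 05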
P.S. Your script prints each file only once because parameter $number_of_copies defaults to 1, and the code never changes it to the number found in the file name.
BTW. Nothing wrong with your English

Reducing amount of lines in variable within loop in Powershell

I have a txt file containing 10000 lines. Each line is an ID.
Within every loop iteration I want to select 100 lines, put them in a special format and do something. I want to do this until the document is finished.
The txt looks like this:
406232C1331283
4062321N022075
4062321H316457
Current approach:
$liste = Get-Content "C:\x\input.txt"
foreach ($item in $liste) {
    azcopy copy $source $target --include-pattern "*$item*" --recursive=true
}
The system goes through the TXT file and makes a copy request for every name it finds. Now, the system can handle around 300 search patterns in one request, like:
azcopy copy $source $target --include-pattern "*id1*;*id2*;*id3*"
How can I extract 300 items from the document at once, separate them with semicolons and embed them in wildcards? I tried to pipe everything into a variable and work with -Skip.
But it seems not easy to handle :(
Use the -ReadCount parameter to Get-Content to send multiple lines down the pipeline:
Get-Content "C:\x\input.txt" -ReadCount 300 | ForEach-Object {
$wildCards = ($_ | ForEach-Object { "*$_*" } -join ';'
azcopy copy $source $target --include-pattern $wildCards --recursive=true
}
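Note that with -ReadCount greater than 1, each object sent down the pipeline is an array of up to that many lines rather than a single string. A quick way to see the batch sizes (same input file as above):
Get-Content "C:\x\input.txt" -ReadCount 300 | ForEach-Object { $_.Count }
# prints 300 for each full batch, then the remainder for the last one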
Do you want 100 or 300 at a time? ;-)
I'm not sure if I really got what the end goal is, but to slice a given number of elements into chunks of a certain size you can use a for loop like this:
$liste = Get-Content -Path 'C:\x\input.txt'
for ($i = 0; $i -lt $Liste.Count; $i += 100) {
    $Liste[$i..$($i + 99)]
}
Now, if I got it right, you want to join these 100 elements and surround them with certain characters ... this might work:
'"*' + ($Liste[$i..$($i + 99)] -join '*;*') + '*"'
Together it would be this:
$liste = Get-Content -Path 'C:\x\input.txt'
for ($i = 0; $i -lt $Liste.Count; $i += 100) {
    '"*' + ($Liste[$i..$($i + 99)] -join '*;*') + '*"'
}
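Wired into the azcopy call from the question, that could look like the following sketch ($source and $target as defined elsewhere in your script):
$liste = Get-Content -Path 'C:\x\input.txt'
for ($i = 0; $i -lt $liste.Count; $i += 100) {
    # build one semicolon-separated wildcard pattern per chunk of 100 IDs
    $pattern = ($liste[$i..($i + 99)] | ForEach-Object { "*$_*" }) -join ';'
    azcopy copy $source $target --include-pattern $pattern --recursive=true
}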
There are many ways; here's one of them...
First I would split array to chunks of 100 elements each, using this helper function:
Function Split-Array ($list, $count) {
    $aggregateList = @()
    $blocks = [Math]::Floor($list.Count / $count)
    $leftOver = $list.Count % $count
    $start = 0
    for ($i = 0; $i -lt $blocks; $i++) {
        $end = $count * ($i + 1) - 1
        # the unary comma keeps each slice as one nested element
        $aggregateList += @(,$list[$start..$end])
        $start = $end + 1
    }
    if ($leftOver -gt 0) {
        $aggregateList += @(,$list[$start..($end + $leftOver)])
    }
    $aggregateList
}
For example, to split your list into chunks of 100, do this:
$Splitted = Split-Array $liste -count 100
Then use foreach to iterate each chunk and join its elements for the pattern you need:
foreach ($chunk in $Splitted)
{
    $Pattern = '"' + (($chunk | ForEach-Object { "*$_*" }) -join ";") + '"'
    azcopy copy $source $target --include-pattern $Pattern --recursive=true
}
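One detail worth noting in Split-Array: the unary comma in @(,$list[$start..$end]) wraps each slice as a single element, so += appends a nested array instead of flattening everything back into one long list. A quick sanity check of the chunk sizes, with made-up data:
$Splitted = Split-Array (1..250) -count 100
$Splitted.Count      # 3 chunks
$Splitted[2].Count   # 50 leftover elements in the last chunk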

Powershell Open File, Edit File, Update File - document lock issue

I have a text file with a list of multiple files that exceeded x characters. What I am trying to do is open each file, scan it line by line, and if a line is more than x characters long, wrap the overflow onto the next line so no line exceeds x characters. That piece works great. The problem I am having is updating the text file I am trying to change/edit. I suspect the lock comes from the PowerShell script itself, since the script is still reading the file. Does anyone have any ideas on what I can do to update the original text file or remove the lock? Thanks for any help! My code is below:
[int] $limit = 131
$path = Get-Content C:\document\fix.txt
foreach ($f in $path)
{
    Get-Content -Path $f |
        ForEach-Object {
            $line = $_
            for ($i = 0; $i -lt $line.Length; $i += $limit)
            {
                $length = [Math]::Min($limit, $line.Length - $i)
                $line.SubString($i, $length)
            }
        } |
        Set-Content $f
}
I figured it out! I had to put parentheses around Get-Content. The grouping basically means: finish reading the whole file (and release it) before going to the next step.
[int] $limit = 131
$path = Get-Content C:\Soarian\fix.txt
foreach ($f in $path)
{
    (Get-Content -Path $f) |
        ForEach-Object {
            $line = $_
            for ($i = 0; $i -lt $line.Length; $i += $limit)
            {
                $length = [Math]::Min($limit, $line.Length - $i)
                $line.SubString($i, $length)
            }
        } |
        Set-Content $f -Force
}
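The difference is easy to demonstrate in isolation. Without the parentheses, Get-Content streams lazily and still holds the file open when Set-Content tries to write to it (a sketch with a hypothetical file path):
# fails: the file is still open for reading further down the pipeline
Get-Content 'C:\temp\sample.txt' | Set-Content 'C:\temp\sample.txt'
# works: the grouping reads the whole file into memory and releases it first
(Get-Content 'C:\temp\sample.txt') | Set-Content 'C:\temp\sample.txt'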

Using Powershell to recursively search directory for files that only contain zeros

I have a directory that contains millions of files in binary format. Some of these files were written to the disk wrong (no idea how). The files are not empty, but they only contain zeros. Here's an example: http://pastebin.com/5b7jHjgr
I need to search this directory, find the files that are all zeros and write their paths out to a file.
I've been experimenting with Format-Hex and Get-Content, but my limited PowerShell experience is tripping me up. Format-Hex reads the entire file, when I only need the first few bytes, and Get-Content expects text files.
Use IO.BinaryReader:
Get-ChildItem r:\1\ -Recurse -File | Where {
    $bin = [IO.BinaryReader][IO.File]::OpenRead($_.FullName)
    # check only the first 16 bytes; any nonzero byte disqualifies the file
    foreach ($byte in $bin.ReadBytes(16)) {
        if ($byte) { $bin.Close(); return $false }
    }
    $bin.Close()
    $true
}
In the old PowerShell 2.0, which lacks the -File parameter, you'll need to apply that filter manually:
Get-ChildItem r:\1\ -Recurse | Where { $_ -is [IO.FileInfo] } | Where { ..... }
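To write the matching paths out to a file, as the question asks, append a projection and Set-Content to the pipeline above (the output path here is just an example):
| Select-Object -ExpandProperty FullName | Set-Content 'C:\temp\zero-files.txt'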
You can use a System.IO.FileStream object to read the first n bytes of each file.
The following code reads the first ten bytes of each file:
Get-ChildItem -Path C:\Temp -File -Recurse | ForEach-Object -Process {
    # Open file for reading
    $file = [System.IO.FileStream]([System.IO.File]::OpenRead($_.FullName))
    # Go through the first ten bytes of the file
    $containsTenZeros = $true
    for( $i = 0; $i -lt $file.Length -and $i -lt 10; $i++ )
    {
        if( $file.ReadByte() -ne 0 )
        {
            $containsTenZeros = $false
            break
        }
    }
    # Close the stream so the file handle is released
    $file.Close()
    # If the file contains ten zeros then add its full path to List.txt
    if( $containsTenZeros )
    {
        Add-Content -Path List.txt -Value $_.FullName
    }
}

Powershell to Split huge folder in multiple folders

I have a folder that contains many huge files. I want to split these files in 3 folders. The requirement is to get the count of files in main folder and then equally split those files in 3 child folders.
Example - Main folder has 100 files. When I run the PowerShell script, 3 child folders should be created with 33, 33 and 34 files respectively.
How can we do this using Powershell?
I've tried the following:
$FileCount = (Get-ChildItem C:\MainFolder).Count
Get-ChildItem C:\MainFolder -r | Foreach -Begin {$i = $j = 0} -Process {
    if ($i++ % $FileCount -eq 0) {
        $dest = "C:\Child$j"
        md $dest
        $j++
    }
    Move-Item $_ $dest
}
Here is another solution. This one accounts for the sub folders not existing.
# Number of groups to support
$groupCount = 3
$path = "D:\temp\testing"
$files = Get-ChildItem $path -File
For ($fileIndex = 0; $fileIndex -lt $files.Count; $fileIndex++) {
    $targetIndex = $fileIndex % $groupCount
    $targetPath = Join-Path $path $targetIndex
    If (!(Test-Path $targetPath -PathType Container)) { [void](New-Item -Path $path -Name $targetIndex -Type Directory) }
    $files[$fileIndex] | Move-Item -Destination $targetPath -Force
}
If you need to split the files into a different number of groups, just use a $groupCount higher than 3. You could also work in a switch that changes $groupCount, for example when the file count is greater than 500.
Loop through the files one by one. Using $fileIndex as a tracker, we determine the folder (0, 1 or 2 in my case) that the file will be put into. Then, using that value, we check that the target folder exists. Yes, this logic could easily be placed outside the loop, but if files and folders change while the script is running, you could argue this is more resilient.
Ensure the folder exists; if not, make it. Then move that one item. Using the modulus operator, like the other answers do, we don't have to worry about how many files there are. Let PowerShell do the math.
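For instance, the modulus pattern that spreads ten files across three groups can be previewed on its own:
0..9 | ForEach-Object { $_ % 3 }
# 0 1 2 0 1 2 0 1 2 0 -> files land in folders 0, 1 and 2 in round-robin order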
This is super quick and dirty, but it does the job.
# Get the collection of files
$files = Get-ChildItem "c:\MainFolder"
# Initialize a counter to 0 or 1 depending on whether there is a
# remainder after dividing the number of files by 3.
if ($files.Count % 3 -eq 0) {
    $counter = 0
} else {
    $counter = 1
}
# Iterate through the files
foreach ($file in $files) {
    # Determine which subdirectory to put the file in
    if ($counter -lt $files.Count / 3) {
        $d = "Dir1"
    } elseif ($counter -ge $files.Count / 3 * 2) {
        $d = "Dir3"
    } else {
        $d = "Dir2"
    }
    # Create the subdirectory if it doesn't exist
    # You could just create the three subdirectories
    # before the loop starts and skip this
    if (-not (Test-Path c:\Child\$d)) {
        md c:\Child\$d
    }
    # Move the file and increment the counter
    Move-Item $file.FullName -Destination c:\Child\$d
    $counter++
}
I think it's possible to do without doing the counting and allocating yourself. This solution:
Lists all the files
Adds to each file a counter property that cycles 0,1,2,0,1,2,0,1,2
Groups them into buckets based on the counter
Moves each bucket in one command
There's scope for rewriting it in a lot of ways to make it nicer, but this saves doing the math, handles uneven allocations, avoids iterating over the files and moving them one at a time, and would easily adjust to a different number of groups.
# collect the full paths of all files
$files = (gci -recurse).FullName
# tag each path with a counter that cycles 0,1,2 and group on it
$i = 0
$buckets = $files | ForEach-Object { $_ | Add-Member NoteProperty "B" ($i++ % 3) -PassThru } | Group-Object B
$buckets.Name | ForEach-Object {
    md "c:\temp$_"
    Move-Item $buckets[$_].Group "c:\temp$_"
}