I am trying to use the PSWritePDF module to merge PDFs. I have about 64 folders, each containing 20+ files that need to be merged. In the end I would have 64 PDFs, each containing the merged files from one of the 64 folders. I have already written some code, but I am struggling to create an array of file names that I can pass to the Merge-PDF function. I know the first part of this code is redundant; I just haven't fixed it yet.
#https://github.com/EvotecIT/PSWritePDF/blob/master/Example/Example03.Merging/Example03.ps1
#This gives me the 64 folder names
$folder_NM = Get-ChildItem -Path \\main_directory\CURRENT |
    Where-Object { $_.PSIsContainer } |
    ForEach-Object { $_.Name }
#This iterates through the 64 folders
foreach ($X in $folder_NM)
{
    #this grabs each of the 64 directories
    $main_path = Join-Path -Path \\main_directory\CURRENT -ChildPath $X
    #This grabs the names of the pdfs in each folder
    $file_names = Get-ChildItem $main_path |
        ForEach-Object { $_.Name }
    #This is grabbing each file in the folder and giving me the formatted string I need to pass to Merge-PDF, i.e. C:\User\Current\pdf.1
    foreach ($Y in $file_names)
    {
        $idv_files = Join-Path -Path $main_path -ChildPath $Y
        #This is where I am stuck. I am trying to create an array with each filename comma separated. This currently just overwrites itself each time it goes through the loop.
        $arr = $idv_files -join ','
        #This is needed for Merge-PDF
        $OutputFile = "$maindirectory\TESTING\$X.pdf"
        #This only puts the most recent file in the output file. Thus the need for an array of file names.
        Merge-PDF -InputFile $arr -OutputFile $OutputFile
        #Debugging
        #Write-Host $arr
    }
}
Specifically, this is where I am struggling. I am getting the correct files in $idv_files, and if I use those in Merge-PDF then I just get a PDF containing only the file that was processed last. I think I just need them comma separated and all put into the same array so that Merge-PDF will merge them all together.
foreach ($Y in $file_names)
{
    $idv_files = Join-Path -Path $main_path -ChildPath $Y
    #This is where I am stuck. I am trying to create an array with each filename comma separated. This currently just overwrites itself each time it goes through the loop.
    $arr = $idv_files -join ','
Any help is appreciated. I'm very new to PowerShell!
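The overwrite happens because `$arr = $idv_files -join ','` assigns a single string on every pass. In PowerShell you don't need to comma-join at all: assign the foreach loop itself to a variable and every value the loop emits is collected into an array. A minimal sketch with stand-in names (the folder path and file names below are placeholders, not your real data):

```powershell
$file_names = 'a.pdf', 'b.pdf', 'c.pdf'   # stand-ins for the names in one folder
$main_path  = 'C:\demo'                   # stand-in for the folder path

# Assigning the loop collects everything it emits into an array
$arr = foreach ($Y in $file_names) {
    Join-Path -Path $main_path -ChildPath $Y   # each emitted value becomes one element
}

# $arr is now a [string[]] with one full path per file -- the shape
# Merge-PDF's -InputFile parameter expects; no manual comma-joining needed.
$arr.Count   # 3
```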
Untested, but if the function takes [string[]] as input as mentioned in my comment, this should get you a MERGED PDF.pdf in each folder.
I would recommend testing this with a few folders containing PDF files on your local host before trying it against your file share.
# Get the directories
$folder_NM = Get-ChildItem -Path \\main_directory\CURRENT -Directory
# This iterates through the 64 folders
foreach ($dir in $folder_NM)
{
    # This gets you the array of PDF files
    $file_names = Get-ChildItem $dir.FullName -Filter *.pdf -File |
        Sort-Object Name
    # Define the output file for the merged PDF
    $OutputFile = Join-Path $dir.FullName -ChildPath 'MERGED PDF.pdf'
    # If Merge-PDF takes [string[]] as input, this should work
    Merge-PDF -InputFile $file_names.FullName -OutputFile $OutputFile
}
It appeared that you wanted the merged .pdf file to be named after the subdirectory, i.e. subdirectory name + '.pdf'. Perhaps I misunderstood. This is also UNTESTED, but it might do what you want. On Windows PowerShell 5.1 or any PowerShell Core version, testing for .PSIsContainer is not necessary: Get-ChildItem supports -File and -Directory switches.
[CmdletBinding()]
param ()

$RootDir = '\\main_directory\CURRENT'
# Get the subdirectory list.
Get-ChildItem -Directory -Path $RootDir |
    # Process each subdirectory.
    ForEach-Object {
        # Create an array of the .pdf files to be merged.
        $file_names = (Get-ChildItem -File -Path $_.FullName -Filter '*.pdf').FullName
        # This is needed for Merge-PDF
        $OutputFile = Join-Path -Path $RootDir -ChildPath ($_.Name + '.pdf')
        Write-Verbose "OutputFile is $OutputFile"
        Merge-PDF -InputFile $file_names -OutputFile $OutputFile
    }
I am trying to make a simple PowerShell script that archives files coming in daily. Every file has a date at the beginning of its name, for example: 20211220_Something.csv, 20211220_SomethingElse.txt, 20211219_Something.csv, 20211219_SomethingElse.txt, etc.
I would like the script to collect all files with the extensions *.txt, *.csv, *.xlsx from these specific directories:
\\Main\Files and \\Main\Files\SecondaryFiles
and archive all files with the above extensions to, for example, \\Main\Files\archive\2021\12\20.12.zip
where 2021, 12 and 20.12 are elements of the date provided in the file name prefix. Inside 20.12.zip we have all files from \\Main\Files, plus a directory named "SecondaryFiles" containing all the files from \\Main\Files\SecondaryFiles. After archiving I would like to delete all the files that I just zipped.
Right now I have this piece of code, which loops through all files in the \Main\ dir and extracts the date prefix. I have tried using the [Datetime]::ParseExact() method, but it doesn't work since my loop returns the whole path. Does anybody have an idea how to approach this?
$Date  = Get-Date
$Day   = $Date.Day
$Month = $Date.Month
$Year  = $Date.Year
$directoryPath = "\\Main\Files\archive\" + $Year + "\" + $Month

$files = Get-ChildItem -Path "\\Main\Files" -Include *.txt, *.csv, *.xlsx -Recurse
for ($i = 0; $i -lt $files.Count; $i++) {
    $temp = $files[$i].FullName.Split("_")[1]
}
if (!(Test-Path -Path $directoryPath)) {
    New-Item -ItemType Directory -Path $directoryPath
}
Compress-Archive -Path "\\Main\Files", "\\Main\Files\*.txt", "\\Main\Files\*.csv", "\\Main\Files\*.xlsx", "\\Main\Files\SecondaryFiles\*.txt", "\\Main\Files\SecondaryFiles\*.csv", "\\Main\Files\SecondaryFiles\*.xlsx" -Update -DestinationPath "\\Main\Files\archive\$Year\$Month\$Day.$Month.zip"
Then I remove the items from the original directory.
One thing worth mentioning: I can't be sure the folder contains only files from today's date, so the script should also work when there are files from, say, a whole week, 20211214 through 20211220.
So again, I would like to Compress-Archive the files like I did above, but instead of today's date the path would contain the date extracted from the file name prefix.
Use Group-Object to group all files having the same date prefix together, and use each group to create the output subdirectory and the final .zip file, and also to remove the original files after zipping.
$sourcePath = '\\Main\Files'
$destination = '\\Main\Files\archive'
Get-ChildItem -Path $sourcePath -Include '*.txt', '*.csv', '*.xlsx' -Recurse |
    # select only files that start with 8 digits followed by an underscore
    Where-Object { $_.BaseName -match '^\d{8}_' } |
    # group the files on the date part and loop through these groups
    Group-Object { $_.BaseName.Substring(0, 8) } | ForEach-Object {
        # split the date part into variables. Automatic variable $_ represents one group,
        # so we can take that group's Name to split into date parts
        $year, $month, $day = $_.Name -split '(\d{4})(\d{2})(\d{2})' -ne ''
        # construct the target folder path for the zip file
        $targetPath = Join-Path -Path $destination -ChildPath ('{0}\{1}' -f $year, $month)
        # create the new subdirectory if it does not yet exist
        $null = New-Item -Path $targetPath -ItemType Directory -Force
        # create the full path and filename for the zip file
        $zip = Join-Path -Path $targetPath -ChildPath ('{0}.{1}.zip' -f $day, $month)
        # compress the files in the group
        Compress-Archive -Path $_.Group.FullName -DestinationPath $zip -Update
        # here is where you can delete the original files after zipping
        $_.Group | Remove-Item -WhatIf
    }
Note I have added the -WhatIf switch to the Remove-Item cmdlet. This is a safety switch, so you are not actually deleting anything yet. The cmdlet now only displays what would be deleted. Once you are happy with that output, remove the -WhatIf switch so the files are actually deleted.
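The `-split` idiom used on the group name above is worth unpacking: when the split pattern consists entirely of capture groups, the captured substrings are kept in the output, and the trailing `-ne ''` filters out the empty strings that bracket the match. A minimal demonstration:

```powershell
# The pattern is made only of capture groups, so -split returns the captures
# themselves; the empty strings at either end of the match are removed by -ne ''.
$year, $month, $day = '20211220' -split '(\d{4})(\d{2})(\d{2})' -ne ''
"$year/$month/$day"   # 2021/12/20
```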
I have many files in a folder, all with the same extension. I want to rename the files one by one, and after each rename do the next step, Proc_After_Rename, in which I read some information from the file. I want to read the information file by file, based on the rename done in the previous step. After I finish the process, I pick the next file to rename and process it.
For now I can rename the files, but it renames them all at once before I do the other process. And when I get to Proc_After_Rename, I end up reading the information for all the files, because all of them have already had their extension renamed. Can anyone help, please?
UPDATED
Function Proc_After_Rename
{
    $Path = "C:\Users\SS\PowerShell\"
    Write-Host "Do some process with .pro file"
    $Job_Info = Get-ChildItem -Path "$store\*.ini" -File -Force
    & $Path\UIni.exe $Job_Info AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD # this is how I read the .ini file
    Start-Sleep -s 1
    $Read_AGM_CUR_CRM = Get-Content .\AGM_CUR_CRM.CMD
    $a_AGM_CUR_CRM, $b_AGM_CUR_CRM = $Read_AGM_CUR_CRM -split "="
    $b_AGM_CUR_CRM
    Pick_file
}

Function Pick_file
{
    $WKFD = "C:\Users\SS\PowerShell\"
    $store = "$WKFD\GM"
    $files = @(Get-ChildItem -Path "$store\*.txt")
    Foreach ($file in $files)
    {
        # Check file existence
        if (Test-Path -Path $file -PathType Leaf)
        {
            # Get file name from the path object $file
            $file_name = @(Get-ChildItem -Path "$file" -Name)
            # Replace the .txt with .ini
            $new_name = $file_name -replace ".txt", ".ini"
            # Rename the file
            Rename-Item -Path $file -NewName "$new_name"
        }
        Proc_After_Rename
    }
}

$A = Pick_file
With the Get-ChildItem cmdlet, you can iterate the results easily by piping them directly into ForEach-Object. Inside that loop, every file found is a FileInfo object, represented by the automatic variable $_.
Using the -Filter parameter, the code below gets only files with a *.txt extension, and by adding the -File switch you only receive FileInfo objects, not DirectoryInfo objects.
If I understand the question correctly, you want to first rename each *.txt file to *.ini and then do some more stuff with the renamed file. This should do it:
$store = "C:\Users\HH"
Get-ChildItem -Path $store -Filter '*.txt' -File | ForEach-Object {
    # the automatic variable '$_' here represents a single FileInfo object in the list.
    # you don't need to test if the file exists; if it doesn't, Get-ChildItem would not return it.
    # create the new name for the file. Simply change the extension to '.ini'
    $newName = '{0}.ini' -f $_.BaseName
    # rename the file and get a reference to it using the -PassThru parameter
    $renamedFile = $_ | Rename-Item -NewName $newName -PassThru
    # for testing/proof:
    # remember that the '$_' variable still has the old file name info.
    Write-Host ("File '{0}' is now renamed to '{1}'" -f $_.FullName, $renamedFile.FullName)
    # now do the rest of your processing, using the $renamedFile FileInfo object.
    # you can see what properties and methods a FileInfo object has here:
    # https://learn.microsoft.com/en-us/dotnet/api/system.io.fileinfo?view=netframework-4.8#properties
    # to get the full path and filename for instance, use $renamedFile.FullName
    # ........ #
}
Hope that helps
Building on the loop from the question, you can also do the follow-up processing right after the rename, inside the same foreach:
        # Rename the file
        Rename-Item -Path $file -NewName "$new_name"
        # Path of the renamed file
        $new_path_file = "$store\$new_name"
        # This is the process after renaming the file
        # ........ #
        # Put your process here and make sure you reference the new file; as long as it's in
        # the foreach you are good.
    }
}
One problem with your code is the Get-ChildItem inside Proc_After_Rename. This presents UIni with a list of files instead of a file. I have tried to fix this problem by reworking your code, and sliding part of Proc_After_Rename into Pick_File. I haven't tested any of this, but I hope it gives you a better idea of how to organize your code.
If I were writing this from scratch, I would use pipelines.
Function Pick_file
{
    $WKFD = "C:\Users\SS\PowerShell\"
    $store = "$WKFD\GM"
    $files = @(Get-ChildItem -Path "$store\*.txt")
    Foreach ($file in $files)
    {
        # Check file existence
        if (Test-Path -Path $file -PathType Leaf)
        {
            # Get file name from the path object $file
            $file_name = @(Get-ChildItem -Path "$file" -Name)
            # Replace the .txt with .ini
            $new_name = $file_name -replace ".txt", ".ini"
            # Rename the file
            Rename-Item -Path $file -NewName "$new_name"
            # Full path of the renamed file (note: $file.FullName would still hold the old name)
            $new_file_name = "$store\$new_name"
            & $WKFD\UIni.exe $new_file_name AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD
            # this is how I read the .ini file
            Start-Sleep -s 1
            $Read_AGM_CUR_CRM = Get-Content .\AGM_CUR_CRM.CMD
            $a_AGM_CUR_CRM, $b_AGM_CUR_CRM = $Read_AGM_CUR_CRM -split "="
            $b_AGM_CUR_CRM
        }
    }
}

$A = Pick_file
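To sketch the pipeline version mentioned above: rename each *.txt file, keep a reference to the renamed file via -PassThru, and process that single file before moving on. The demo folder and sample files below are stand-ins, and since only you have UIni.exe, that call is left as a comment:

```powershell
# Demo folder standing in for "C:\Users\SS\PowerShell\GM"; replace with your real path.
$store = Join-Path ([IO.Path]::GetTempPath()) 'GM_demo'
if (Test-Path $store) { Remove-Item $store -Recurse -Force }
$null = New-Item -ItemType Directory -Path $store
'one', 'two' | ForEach-Object { Set-Content -Path (Join-Path $store "$_.txt") -Value 'x=1' }

$renamed = Get-ChildItem -Path $store -Filter '*.txt' -File | ForEach-Object {
    # -PassThru returns the renamed FileInfo, so the per-file processing
    # (the UIni.exe call in the original) can use $ini.FullName right here
    $ini = $_ | Rename-Item -NewName ('{0}.ini' -f $_.BaseName) -PassThru
    # & $toolPath\UIni.exe $ini.FullName AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD
    $ini
}
$renamed.Name   # one.ini, two.ini
```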
I was wondering how I would go about combining specific files together using PowerShell. Example: I want to take EDIDISC.UPD, EDIACCA.UPD, EDIBRUM.UPD, etc., combine the contents of these files together, and make a new file named A075MMDDYY.UPD. I would want it to be runnable by whoever has the .UPD files on their network drive; for example, mine would be in N:\USERS\Kevin, someone else's may be in N:\USERS\JohnDoe.
So far I only have:
Param (
    $path = (Get-Location).Path,
    $filetype = "*.UPD",
    $files = (Get-ChildItem -Filter $filetype),
    $Newfile = $path + "\Newfile.UPD"
)
$files | ForEach-Object { Get-Content $_ | Out-File -Append $Newfile -Encoding ascii }
Focusing just on the aspect of concatenating (catting) the contents of multiple files to form a new file, assuming the current directory (.):
$dir = '.'
$newFile = "$dir/Newfile.UPD"
Get-Content "$dir/*.UPD" -Exclude (Split-Path -Leaf $newFile) |
    Out-File -Append -Encoding ascii $newFile
You can pass a wildcard expression directly to Get-Content's (implied) -Path parameter in order to retrieve the contents of all matching files.
Since the output file is placed in the same dir. and matches the wildcard expression too, however, it must be excluded from matching, by filename, hence the -Exclude (Split-Path -Leaf $newFile) argument (this additionally assumes that there's no input file whose name is the same as the output file's).
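A self-contained run of that wildcard-plus-exclusion pattern, using a throwaway temp directory and sample files in place of the real .UPD data:

```powershell
# Throwaway directory with two sample input files
$dir = Join-Path ([IO.Path]::GetTempPath()) 'upd_demo'
if (Test-Path $dir) { Remove-Item $dir -Recurse -Force }
$null = New-Item -ItemType Directory -Path $dir
Set-Content -Path "$dir/a.UPD" -Value 'line from a'
Set-Content -Path "$dir/b.UPD" -Value 'line from b'

$newFile = "$dir/Newfile.UPD"
# The wildcard matches every .UPD file; -Exclude keeps the output file itself
# out of the input set, so re-running won't fold the result back into itself.
Get-Content "$dir/*.UPD" -Exclude (Split-Path -Leaf $newFile) |
    Out-File -Append -Encoding ascii $newFile

Get-Content $newFile
```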
I need to extract the 20th to 30th characters from many files in a single folder and print the results to a text file.
I understand I need to use Substring(20,10), but I'm failing at managing multiple files.
I created a working script for single-file use:
$var = Get-Content -Path C:\testfiles\1.txt
$var.Substring(20,10)
But now I need to handle multiple files.
Any help?
This might work
# Path to the directory
$DirPath = "C:\Files"
# Get all text files from the above path
$Files = Get-ChildItem -Path $DirPath -Filter "*.txt"
# Path to the text file where the extracted strings are stored
$DesFile = "C:\String.txt"
# Loop through all files and save to $DesFile
foreach ($txt in $Files)
{
    $var = Get-Content -Path $txt.FullName
    $var.Substring(20, 10) | Add-Content -Path $DesFile
}
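One caveat worth knowing: Get-Content returns an array of lines, and member enumeration (PowerShell 3+) applies Substring to each one, but any line shorter than 30 characters throws an exception. A length guard keeps the pipeline running on mixed input (the sample strings below are made up for illustration):

```powershell
# Only the second line is long enough; chars at indices 20..29 are ABCDEFGHIJ
$lines = 'short', ('x' * 20 + 'ABCDEFGHIJ' + 'tail')

# Skip lines that are too short instead of letting Substring throw
$lines | Where-Object { $_.Length -ge 30 } |
    ForEach-Object { $_.Substring(20, 10) }   # ABCDEFGHIJ
```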
What I am attempting to do is end up with a CSV file like this:
path , get-childItem
path , get-childItem
path , get-childItem
etc.
I am very new to PowerShell, so as of right now the issue I am having isn't with errors, but rather with knowing what to do.
Here is my code thus far:
$data = gc "c:\Documents and Settings\user\Desktop\paths.txt"
ForEach ($line in $data) {
    $subName = Get-ChildItem "$line"
    Write-Host $line","$subName
    #Export-Csv -Path "C:\Documents and Settings\reidli\Desktop\mytest.csv" -Force "true" -InputObject $output
}
So I am reading a list of paths from paths.txt, where each line is a different path.
$subName is the list of children I want, and it is correct.
When I Write-Host, the output is in fact: path,children.
Commented out is where I attempted to export a CSV, but this won't work as it would overwrite the existing CSV for each path.
Can someone please help me take each line in the foreach loop and create a .csv file with each path and its children?
Just don't use Export-Csv at all. CSV is a plain comma-separated text file, which makes it easy to export:
$data = gc "c:\Documents and Settings\user\Desktop\paths.txt"
Remove-Item "C:\Documents and Settings\reidli\Desktop\mytest.csv" -ea 0
ForEach ($line in $data) {
    $subName = Get-ChildItem "$line"
    Write-Host $line","$subName
    "$line,$subName" | Add-Content "C:\Documents and Settings\reidli\Desktop\mytest.csv"
}
By the way, do you really want the sub-files space separated? That is what you get when you convert the Get-ChildItem output to a string.
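If a different separator is wanted, join the child names explicitly before writing the line. A small self-contained demonstration (the temp folder and file names are stand-ins for one path from paths.txt):

```powershell
# Throwaway folder with two children, standing in for one line of paths.txt
$line = Join-Path ([IO.Path]::GetTempPath()) 'csv_demo'
if (Test-Path $line) { Remove-Item $line -Recurse -Force }
$null = New-Item -ItemType Directory -Path $line
'a.txt', 'b.txt' | ForEach-Object { $null = New-Item -ItemType File -Path (Join-Path $line $_) }

# Join the child names with ';' instead of the default space
$subName = (Get-ChildItem $line).Name -join ';'
"$line,$subName"
```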
Try this:
$data | Select-Object @{L = 'path'; E = { $_ } }, @{L = 'subname'; E = { (gci $_) } } |
    Export-Csv "C:\Documents and Settings\reidli\Desktop\mytest.csv" -Force