I need to extract the 20th to 30th letters from many files in a single folder and print the results to a text file.
I understand I need to use Substring(20,10), but I'm failing at managing multiple files.
I created a working script for single-file use:
$var = Get-Content -Path C:\testfiles\1.txt
$var.Substring(20,10)
But now I need to handle multiple files.
Any help?
This might work:
#Path to directory
$DirPath = "C:\Files"
#get all text files from above path
$Files = Get-ChildItem -Path $DirPath -Filter "*.txt"
#path to text file where strings are stored
$DesFile = "C:\String.txt"
#loop through all files and save to desfile
foreach ($txt in $Files)
{
    $var = Get-Content -Path $txt.FullName
    $var.Substring(20,10) | Add-Content -Path $DesFile
}
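One caveat: Get-Content returns an array of lines for a multi-line file, so $var.Substring(20,10) would run against every line (and throw if any line is shorter than 30 characters) rather than against the start of the file. If your files can have more than one line, a variant that reads each file as a single string with -Raw and guards the length may be safer; a rough sketch:
$DirPath = "C:\Files"
$Files = Get-ChildItem -Path $DirPath -Filter "*.txt"
$DesFile = "C:\String.txt"
foreach ($txt in $Files)
{
    # -Raw returns the whole file as one string instead of an array of lines
    $var = Get-Content -Path $txt.FullName -Raw
    if ($var.Length -ge 30)
    {
        $var.Substring(20,10) | Add-Content -Path $DesFile
    }
}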
I am trying to use the PSWritePDF module to merge PDFs. I have about 64 folders, and each of them has about 20+ files that need to be merged. In the end, I would have 64 PDFs, each containing the merged files from one of the 64 folders. I have already written some code, but I am struggling to create an array of file names that I can pass to the Merge-PDF function. I know the first part of this code is redundant; I just haven't fixed it yet.
#https://github.com/EvotecIT/PSWritePDF/blob/master/Example/Example03.Merging/Example03.ps1
#This gives me the 64 folder names
$folder_NM = Get-ChildItem -Path \\main_directory\CURRENT |
Where-Object {$_.PSIsContainer} |
Foreach-Object {$_.Name}
#This iterates through the 64 folders
foreach ($X IN $folder_NM)
{
#this grabs each of the 64 directories
$main_path = join-path -path \\main_directory\CURRENT -ChildPath $X
#This grabs the names of the pdfs in each folder
$file_names = Get-ChildItem $main_path |
ForEach-Object {$_.Name}
#This is grabbing each file in the folder and giving me the formatted string I need to pass to Merge-PDF. i.e. C:\\User\Current\pdf.1
foreach($Y in $file_names){
$idv_files = join-path -path $main_path -ChildPath $Y
#This is where I am stuck. I am trying to create an array with each filename comma separated. This currently just overwrites itself each time it goes through the loop.
$arr = $idv_files-join','
#This is needed for mergePDF
$OutputFile = "$maindirectory\TESTING\$X.pdf"
#This only puts the most recent file in the output file. Thus the need for an array of file names.
Merge-PDF -InputFile $arr -OutputFile $OutputFile
#Debugging
#Write-Host $arr
}
}
Specifically, this is where I am struggling. I am getting the correct files in $idv_files, but if I use those in Merge-PDF then I just get a PDF with only the file that was processed last. I think I just need them comma separated and all put into the same array so that Merge-PDF will merge them all together.
foreach($Y in $file_names){
$idv_files = join-path -path $main_path -ChildPath $Y
#This is where I am stuck. I am trying to create an array with each filename comma separated. This currently just overwrites itself each time it goes through the loop.
$arr = $idv_files-join','
Anything helps. Very new to PowerShell!
Untested, but if the function takes [string[]] as input (as in my comment), this should get you a MERGED PDF.pdf in each folder.
I would recommend testing this with a few folders containing PDF files on your local machine before trying it on your FS.
# Get the Directories
$folder_NM = Get-ChildItem -Path \\main_directory\CURRENT -Directory
#This iterates through the 64 folders
foreach ($dir in $folder_NM)
{
    # This gets you the array of PDF files
    $file_names = Get-ChildItem $dir.FullName -Filter *.pdf -File |
        Sort-Object Name
    # Define the output file for the merged PDF
    $OutputFile = Join-Path $dir.FullName -ChildPath 'MERGED PDF.pdf'
    # If Merge-PDF takes [string[]] as input, this should work
    Merge-PDF -InputFile $file_names.FullName -OutputFile $OutputFile
}
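If you'd rather build the array of input paths yourself (the part that kept overwriting itself in your inner loop), collect the output of a loop into a variable instead of reassigning inside it. A rough, untested sketch along the same lines, still assuming Merge-PDF accepts [string[]]:
foreach ($dir in $folder_NM)
{
    # Everything Join-Path emits inside this inner loop is collected into one array
    $arr = foreach ($name in (Get-ChildItem $dir.FullName -Filter *.pdf -File).Name)
    {
        Join-Path -Path $dir.FullName -ChildPath $name
    }
    $OutputFile = Join-Path $dir.FullName -ChildPath 'MERGED PDF.pdf'
    Merge-PDF -InputFile $arr -OutputFile $OutputFile
}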
It appeared that you wanted the merged .pdf file to be named after the subdirectory, i.e. the subdirectory name + '.pdf'. Perhaps I misunderstood. This is also UNTESTED, but it might do what you want. In Windows PowerShell 5.1 or any PowerShell Core version, testing for .PSIsContainer is not necessary; Get-ChildItem supports the -File and -Directory switches.
[CmdletBinding()]
param ()
$RootDir = '\\main_directory\CURRENT'
# Get the subdirectory list.
Get-ChildItem -Directory -Path $RootDir |
    # Process each subdirectory.
    ForEach-Object {
        # Create an array of the .pdf files to be merged.
        $file_names = (Get-ChildItem -File -Path $_.FullName -Filter '*.pdf').FullName
        # This is needed for Merge-PDF.
        $OutputFile = Join-Path -Path $RootDir -ChildPath $($_.Name + '.pdf')
        Write-Verbose "OutputFile is $OutputFile"
        Merge-PDF -InputFile $file_names -OutputFile $OutputFile
    }
I have an issue with a PowerShell script.
I want to write a script in PowerShell to copy files from one folder to another existing folder, based on a CSV file.
For example, my CSV has two columns: one for the name of the file without the extension, and one for the path I want to copy the file to.
Example of the CSV:
File,Pathdestinationfile
test1_000,C:/Documents/test1
So I need to get my file from a path such as C:/Source and copy it to the specific destination path mentioned in my CSV.
Here is the code that I have for the moment, but it doesn't work:
Import-CSV C:\Users\T0242166\Documents\SCRIPT_CSV\testscript.csv |
Where-Object Name -Like $_.File |
foreach { Copy-item -Path "C:/Source" -Destination $_.Filepathdestination -Recurse }
Can you please help me with this issue?
Thank you
I have provided comments in the script to help you understand what I am doing.
(Note: this can be condensed, but I separated it out so you can see it more easily.)
# Define your constant variable
$sourceFolder = "C:\Source"
# Import Your CSV
$content = Import-Csv "C:\Users\T0242166\Documents\SCRIPT_CSV\testscript.csv"
# Iterate Through Your Objects From CSV
foreach ($item in $content)
{
    # Find the file you are looking for (note the backslash between the folder and the file name)
    $foundFile = Get-Item -Path ($sourceFolder + "\$($item.File).*")
    # Copy the file using the FullName property, which is actually the full path, to the destination defined in your CSV file.
    Copy-Item $foundFile.FullName -Destination $item.Pathdestinationfile
}
Condensed Version:
# Import and Iterate Through Your Objects From CSV
foreach ($item in (Import-Csv "C:\Users\T0242166\Documents\SCRIPT_CSV\testscript.csv"))
{
    # Find the file you are looking for and copy it
    Copy-Item "C:\Source\$($item.File).*" -Destination $item.Pathdestinationfile
}
Another Condensed Version:
Import-Csv "C:\Users\T0242166\Documents\SCRIPT_CSV\testscript.csv" | foreach { Copy-Item "C:\Source\$($_.File).*" -Destination $_.Pathdestinationfile }
Although you keep using forward slashes in the file paths (PowerShell accepts those too), this would work:
# import the data from the CSV file and loop through each row
Import-CSV -Path 'C:\Users\T0242166\Documents\SCRIPT_CSV\testscript.csv' | ForEach-Object {
    # get a list of FileInfo objects based on the partial file name from the CSV
    # if you don't want it to find files in subfolders of "C:\Source", take off the -Recurse switch
    $files = Get-ChildItem -Path "C:\Source" -Filter "$($_.File).*" -File -Recurse
    foreach ($file in $files) {
        $file | Copy-Item -Destination $_.Pathdestinationfile
    }
}
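One assumption in the snippets above is that the destination folder from the CSV already exists. If it might not, Copy-Item would treat the destination as a file name rather than a folder, so you could create it first; a small, untested addition to the loop body:
# Inside the ForEach-Object block, before copying:
if (-not (Test-Path -Path $_.Pathdestinationfile)) {
    New-Item -Path $_.Pathdestinationfile -ItemType Directory | Out-Null
}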
I'm trying to display the .xml files and then delete them all. All other files, for example .txt and .pdf files, can stay in the folder. Here is what I have so far. I'm pretty new to scripting and trying to find an efficient way, since I have multiple folders.
$Folders = Get-childitem "my path" -Name
$HR = $Folders
$HR
foreach ($item in $HR)
{
$path = "my path" +$item + "\config\"
$path
$file = Get-ChildItem $path
$file
foreach ($item2 in $file) {
if ($file.name -eq 'XML*') {
$file
}
Get-ChildItem -include *
}
}
I do get the files displayed from each directory, but I somehow could not exclude the .txt and .pdf files.
You can do this as a one-liner:
Get-ChildItem | Where-Object {$_.Extension -eq ".xml"} | ForEach-Object { del $_.FullName}
Just call this from whichever directory you want to delete all the .xml files from.
Or, if it's a hierarchy of directories like
E:\AppFiles
.\TempFiles
.\Images
you can use the -Recurse switch too.
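For example, to remove every .xml file under E:\AppFiles and its subfolders in one go (append -WhatIf to Remove-Item first if you want to preview what would be deleted):
# Deletes all .xml files under E:\AppFiles, including subfolders
Get-ChildItem -Path 'E:\AppFiles' -Filter '*.xml' -Recurse -File | Remove-Item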
I have many files with the same extension in a folder. I want to rename the files one by one and, after each rename, run the other process, Proc_After_Rename, in which I read some information from the renamed file. After that process finishes, I pick the next file, rename it, and run the process again.
For now, I can rename the files, but the script renames all of them at once before I do the other process. And when I get to Proc_After_Rename, I read the information for all the files, because all of them have already had their extension renamed. Can anyone help, please?
UPDATED
Function Proc_After_Rename
{
$Path = "C:\Users\SS\PowerShell\"
Write-Host "Do some process with .pro file"
$Job_Info = Get-ChildItem -Path "$store\*.ini" -File -Force
& $Path\UIni.exe $Job_Info AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD #this how I read the .ini file
start-sleep -s 1
$Read_AGM_CUR_CRM = Get-Content .\AGM_CUR_CRM.CMD
$a_AGM_CUR_CRM,$b_AGM_CUR_CRM = $Read_AGM_CUR_CRM -split "="
$b_AGM_CUR_CRM
Pick_file
}
Function Pick_file
{
$WKFD= "C:\Users\SS\PowerShell\"
$store = "$WKFD\GM"
$files = @(Get-ChildItem -Path "$store\*.txt")
Foreach ($file in $files)
{
# Check file existence
if (Test-Path -Path $file -PathType Leaf)
{
# Get file name from object path file $file
$file_name = @(Get-ChildItem -Path "$file" -Name)
# Replace the .cue with .pro
$new_name = $file_name -replace ".txt", ".ini"
# Rename the file
Rename-Item -Path $file -NewName "$new_name"
}
Proc_After_Rename
}
}
$A = Pick_file
With the Get-ChildItem cmdlet, you can iterate the results easily by directly piping them through to a Foreach-Object. Inside that loop, every file found is a FileInfo object, represented by the automatic variable $_.
Using the -Filter parameter, the code below gets only files with a *.txt extension, and by adding the -File switch you only receive FileInfo objects, not DirectoryInfo objects.
If I understand the question correctly, you want to first rename each *.txt file to *.ini and then do some more stuff with the renamed file. This should do it:
$store = "C:\Users\HH"
Get-ChildItem -Path $store -Filter '*.txt' -File | ForEach-Object {
    # the automatic variable '$_' here represents a single FileInfo object in the list.
    # you don't need to test if the file exists; if it doesn't, Get-ChildItem would not return it.
    # create the new name for the file. Simply change the extension to '.ini'
    $newName = '{0}.ini' -f $_.BaseName
    # rename the file and get a reference to it using the -PassThru parameter
    $renamedFile = $_ | Rename-Item -NewName $newName -PassThru
    # for testing/proof:
    # remember that the '$_' variable now has old file name info.
    Write-Host ("File '{0}' is now renamed to '{1}'" -f $_.FullName, $renamedFile.FullName)
    # now do the rest of your processing, using the $renamedFile FileInfo object.
    # you can see what properties and methods a FileInfo object has here:
    # https://learn.microsoft.com/en-us/dotnet/api/system.io.fileinfo?view=netframework-4.8#properties
    # to get the full path and filename for instance, use $renamedFile.FullName
    # ........ #
}
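For instance, the UIni.exe steps from your Proc_After_Rename function could slot into that placeholder, now pointed at the renamed file (assuming UIni.exe and its arguments work exactly as in your original script):
# This goes where the '# ........ #' placeholder is, inside the ForEach-Object block:
$Path = "C:\Users\SS\PowerShell"
& "$Path\UIni.exe" $renamedFile.FullName AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD
Start-Sleep -Seconds 1
$a_AGM_CUR_CRM, $b_AGM_CUR_CRM = (Get-Content .\AGM_CUR_CRM.CMD) -split '='
$b_AGM_CUR_CRM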
Hope that helps
Another option is to stay close to your original loop: right after the Rename-Item call, build the path of the renamed file and do your processing inside the same foreach, for example:
# Rename the file
Rename-Item -Path $file -NewName "$new_name"
# Path of the renamed file
$new_path_file = "$store\$new_name"
# This is the process after renaming the file
# ........ #
# Put your process here and make sure you reference the new file;
# as long as it's inside the foreach, you are good.
}
}
One problem with your code is the Get-ChildItem inside Proc_After_Rename. This presents UIni with a list of files instead of a file. I have tried to fix this problem by reworking your code, and sliding part of Proc_After_Rename into Pick_File. I haven't tested any of this, but I hope it gives you a better idea of how to organize your code.
If I were writing this from scratch, I would use pipelines.
Function Pick_file
{
$WKFD= "C:\Users\SS\PowerShell\"
$store = "$WKFD\GM"
$files = #(Get-ChildItem -Path "$store\*.txt")
Foreach ($file in $files)
{
# Check file existence
if (Test-Path -Path $file -PathType Leaf)
{
# Get file name from object path file $file
$file_name = @(Get-ChildItem -Path "$file" -Name)
# Replace the .cue with .pro
$new_name = $file_name -replace ".txt", ".ini"
# Rename the file
Rename-Item -Path $file -NewName "$new_name"
$new_file_name = "$store\$new_name"
& $Path\UIni.exe $new_file_name AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD
#this how I read the .ini file
start-sleep -s 1
$Read_AGM_CUR_CRM = Get-Content .\AGM_CUR_CRM.CMD
$a_AGM_CUR_CRM,$b_AGM_CUR_CRM = $Read_AGM_CUR_CRM -split "="
$b_AGM_CUR_CRM
}
}
}
$A = Pick_file
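For reference, here is a rough, untested sketch of the pipeline approach I mentioned above, assuming UIni.exe and its arguments behave exactly as in your original script:
$Path  = "C:\Users\SS\PowerShell"
$store = Join-Path -Path $Path -ChildPath 'GM'
Get-ChildItem -Path $store -Filter '*.txt' -File |
    Rename-Item -NewName { $_.BaseName + '.ini' } -PassThru |
    ForEach-Object {
        # Process each file immediately after it has been renamed
        & "$Path\UIni.exe" $_.FullName AGM CRM AGM_CUR_CRM AGM_CUR_CRM.CMD
        Start-Sleep -Seconds 1
        $a_AGM_CUR_CRM, $b_AGM_CUR_CRM = (Get-Content .\AGM_CUR_CRM.CMD) -split '='
        $b_AGM_CUR_CRM
    }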
I was wondering how I would go about combining specific files using PowerShell. Example: I want to take EDIDISC.UPD, EDIACCA.UPD, EDIBRUM.UPD, etc., combine the contents of these files, and make a new file named A075MMDDYY.UPD. I would also want it to be runnable by whoever has the .UPD files on their network drive. For example, mine would be in N:\USERS\Kevin, while someone else's may be in N:\USERS\JohnDoe.
So far I only have:
Param (
$path = (Get-Location).Path,
$filetype = "*.UPD",
$files = (Get-ChildItem -Filter $filetype),
$Newfile = $path + "\Newfile.UPD"
)
$files | foreach { Get-Content $_ | Out-File -Append $Newfile -Encoding ascii }
Focusing just on the aspect of concatenating (catting) the contents of multiple files to form a new file, assuming the current directory (.):
$dir = '.'
$newFile = "$dir/Newfile.UPD"
Get-Content "$dir/*.UPD" -Exclude (Split-Path -Leaf $newFile) |
Out-File -Append -Encoding Ascii $newFile
You can pass a wildcard expression directly to Get-Content's (implied) -Path parameter in order to retrieve the contents of all matching files.
Since the output file is placed in the same directory and also matches the wildcard expression, however, it must be excluded from matching by filename, hence the -Exclude (Split-Path -Leaf $newFile) argument (this additionally assumes that no input file has the same name as the output file).
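If you also want the dated output name from your question (A075MMDDYY.UPD) rather than Newfile.UPD, you can build it with Get-Date, for example:
$dir = '.'
# e.g. produces A075103124.UPD when run on Oct 31, 2024
$newFile = Join-Path -Path $dir -ChildPath ('A075{0}.UPD' -f (Get-Date -Format 'MMddyy'))
Get-Content "$dir/*.UPD" -Exclude (Split-Path -Leaf $newFile) |
    Out-File -Append -Encoding Ascii $newFile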