I have more than 100 txt files in C:\myfolder\*.txt.
When I run this script from "C:\myfolder" I can add the eighth and ninth lines to somename.txt:
@echo off
powershell "$f=(Get-Content somename.txt);$f[8]='heretext1';$f | set-content somename.txt"
powershell "$f=(Get-Content somename.txt);$f[9]='heretext2';$f | set-content somename.txt"
But how can I add the eighth and ninth lines to all *.txt files located in C:\myfolder\*.txt?
Can someone explain to me how to do it, please?
Sorry for my English and sorry if I didn't explain my problem well. I will try now:
I use "*.uci" files instead of *.txt files; I wrote txt because the uci extension is unknown to most people. These *.uci files hold the settings for chess engines that speak the UCI protocol.
So when you use the ChessBase program you have a lot of chess engines, and each engine creates its own "enginename.uci" file.
If you want to change the number of cores used on your PC from 1 to 16, you have to do it manually by adding the following information to each *.uci file, like this:
[OPTIONS]
Threads=1
That's why it is better to make a small batch or ps1 script that changes the settings for all engines by adding these two lines in one click.
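For illustration, a minimal sketch of what I mean (assuming simply appending the two lines to each settings file is acceptable, and 16 is just an example thread count):

Get-ChildItem -Path 'C:\myfolder' -Filter '*.uci' | ForEach-Object {
    # append the [OPTIONS] section and the desired thread count to every engine file
    Add-Content -Path $_.FullName -Value '[OPTIONS]', 'Threads=16'
}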
Perhaps something like this PowerShell script would suit your task:
Get-ChildItem -Path 'C:\myfolder' -Filter '*.txt' | ForEach-Object {
    $LineIndex = 0
    $FileContent = Switch -File $_.FullName {
        Default {
            $LineIndex++
            If ($LineIndex -Eq 8) {@'
heretext1
heretext2
'@}
            $_
        }
    }
    Set-Content -Path $_.FullName -Value $FileContent
}
Note:
Your code isn't adding lines, it is modifying existing lines. The solution below does the same.
Indices [8] and [9] access the 9th and 10th lines, not the 8th and 9th, given that array indexing is 0-based.
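For example (illustrative only, using the somename.txt file from the question):

$f = Get-Content somename.txt
$f[0]   # 1st line
$f[7]   # 8th line
$f[8]   # 9th line (the line your code actually changes)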
You need to call Get-ChildItem with your file-name pattern, C:\myfolder\*.txt, and process each matching file via ForEach-Object:
@echo off
powershell "Get-ChildItem C:\myfolder\*.txt | ForEach-Object { $f=$_ | Get-Content -ReadCount 0; $f[8]='heretext1'; $f[9]='heretext2'; Set-Content $_.FullName $f }"
Due to calling from a batch file (cmd.exe), the PowerShell command is specified on a single line; here's the readable version:
Get-ChildItem C:\myfolder\*.txt | # get all matching files
  ForEach-Object {                # process each
    $f = $_ | Get-Content -ReadCount 0       # read all lines
    $f[8] = 'heretext1'; $f[9] = 'heretext2' # update the 9th and 10th line
    Set-Content $_.FullName $f               # save result back to input file
  }
Note:
Consider adding -noprofile after powershell, so as to suppress potentially unnecessary loading of profile files - see the documentation of the Windows PowerShell CLI, powershell.exe.
Using -ReadCount 0 with Get-Content greatly speeds up processing, because all lines are then read into a single array at once, instead of being streamed one by one and collected into an array afterwards, which is much slower.
Note: If a given file has fewer than 10 lines, the above solution won't work, because you can only assign to existing elements of an array (an array is a fixed-size data structure). If you need to deal with this case, insert the following after the $f = $_ | Get-Content -ReadCount 0 line, which inserts empty lines as needed to ensure that at least 10 lines are present:
if ($f.Count -lt 10) { $f += @('') * (10 - $f.Count) }
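Put together, the readable version of the loop with that guard in place might look like this (a sketch assembled from the snippets above; 'heretext1' and 'heretext2' remain the placeholder values from the question):

Get-ChildItem C:\myfolder\*.txt |
  ForEach-Object {
    $f = $_ | Get-Content -ReadCount 0
    if ($f.Count -lt 10) { $f += @('') * (10 - $f.Count) }  # pad with empty lines as needed
    $f[8] = 'heretext1'; $f[9] = 'heretext2'
    Set-Content $_.FullName $f
  }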
The easiest solution I can think of is using the -Index parameter provided by Select-Object for that.
Get-ChildItem -Path .\Desktop\*.txt | % { Get-Content $_.FullName | Select-Object -Index 7,8 } |
Out-File -FilePath .\Desktop\index.txt
Edit: based on your post.
Related
I am searching for all lines with '.png' and '.jpg' strings in them across multiple folders of TXT files.
Tried:
(Get-ChildItem K:\FILES -Recurse -Include '*.txt') | ForEach-Object {
(Get-Content $_) -match '\.png','\.jpg' | out-file K:\Output.txt
}
but it does not output anything. No error either. I did something similar recently and it was working. I am scratching my head wondering what I am doing wrong here...
By placing your Out-File call inside the ForEach-Object script block, you're rewriting your output file in full for every input file, so that the last input file's results - which may be none - end up as the sole content of the file.
The immediate fix is to move the Out-File call to its own pipeline segment, so that it receives all output, across all files:
Get-ChildItem K:\FILES -Recurse -Include '*.txt' |
  ForEach-Object {
    @(Get-Content $_) -match '\.png', '\.jpg'
  } |
  Out-File K:\Output.txt
Note: Technically, adding -Append to your Out-File call inside the ForEach-Object could have worked too, but this approach should be avoided:
Every Out-File call must open and close the output file, which makes the operation much slower.
You need to ensure that there is no preexisting output file beforehand - otherwise you'll end up appending to that file's existing content.
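For reference, that discouraged variant would look roughly like this (shown only to illustrate the two points above; the Remove-Item call takes care of any preexisting output file):

Remove-Item K:\Output.txt -ErrorAction Ignore   # make sure no previous output is appended to
Get-ChildItem K:\FILES -Recurse -Include '*.txt' |
  ForEach-Object {
    @(Get-Content $_) -match '\.png', '\.jpg' | Out-File K:\Output.txt -Append
  }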
However, consider speeding up your command with the help of Select-String:
Get-ChildItem K:\FILES -Recurse -Include '*.txt' |
  Select-String -Pattern '\.png', '\.jpg' |
  ForEach-Object Line |
  Out-File K:\Output.txt
Note:
In PowerShell (Core) 7+, you can use the -Raw switch with Select-String, which directly outputs only the text of all matching lines, in which case ForEach-Object Line isn't needed.
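In other words, in PowerShell 7+ the middle step can be dropped (a sketch of the variant described in the note):

Get-ChildItem K:\FILES -Recurse -Include '*.txt' |
  Select-String -Pattern '\.png', '\.jpg' -Raw |
  Out-File K:\Output.txt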
If you want to prefix each matching line with the source file path:
Get-ChildItem K:\FILES -Recurse -Include '*.txt' |
  Select-String -Pattern '\.png', '\.jpg' |
  ForEach-Object { '{0}: {1}' -f $_.Path, $_.Line } |
  Out-File K:\Output.txt
Note: If you pipe Select-String output directly (without -Raw or ForEach-Object Line) to Out-File (or if you use >), you'll get similar output (even including a character position), but with limitations:
You'll get a blank line at the top and the bottom of the file.
Long line texts may be truncated.
The reason is that Out-File and, in effect, its alias > send the for-display representations of the input objects to the output file, which aren't meant for programmatic processing and can incur truncation of the data based on the line length (number of columns) of the current console window.
I have a list of EDI text files with specific text in them. Currently, in order for our custom scripting to convert them into an SQL table, we need to be able to see the X12 file type in the filename.
Because we are using a SQL script to get the files into tables, this needs to be a one-line solution. We have a definition table of client files which specifies which field terminator and file types to look for, so we will later substitute those values into the one-line solution to be executed individually.
I am currently looking at PowerShell (v3) to do this for maximum present and future compatibility. Also, I am totally new to PowerShell and have based my script generation on posts in this forum.
Files example
t.text.oxf.20170815123456.out
t.text.oxf.20170815234567.out
t.text.oxf.20170815345678.out
t.text.oxf.20170815456789.out
Search strings to find within files: (To find EDI X12 file type uniquely, which may be duplicated within the same file n times)
ST*867
ST*846
ST~867
ST~846
ST|867
ST|846
Here is what I have so far, which does not appear to do anything even with the -WhatIf parameter:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path | Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace 'out$','867.out' -f $i++) -whatif}
The first part:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path
simply gets a list of the paths that we need as input for the renaming.
The second part after the | pipe:
Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace '\.out','.867.out' -f $i++) -whatif}
will supposedly loop through that list and rename the files, adding the EDI type to the end of the file name. I have tried 'out$','867.out' with no change.
Current Errors:
The first part shows duplicated path elements, probably because there are multiple Transaction Set Headers in the files. Is there any way to force it to be unique?
The command does not show any errors (red text), but with the -WhatIf parameter it shows that it does not rename any files (I tried running it without -WhatIf as well).
1) Remove duplicates using the -List switch of Select-String.
2) You need to actually pipe the objects into the loop.
Try this:
Select-String -Path .\*.out -pattern 'ST~867' -SimpleMatch -List | Select-Object Path | ForEach-Object { Rename-Item $_.path ($_.path -replace 'out$','867.out') }
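The same command spread across multiple lines for readability, with -WhatIf added so the renames can be previewed first (my reformatting of the one-liner above):

Select-String -Path .\*.out -Pattern 'ST~867' -SimpleMatch -List |
  Select-Object Path |
  ForEach-Object {
    Rename-Item -LiteralPath $_.Path -NewName ($_.Path -replace 'out$', '867.out') -WhatIf
  }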
I have a working PowerShell script to find and replace a few different strings with a new string in thousands of files, without changing the modified date on the files. In any given file there could be hundreds of instances of said strings to replace. The files themselves aren't very large and probably range from 1-50MB (a quick glance at the directory I am testing with shows the largest as ~33MB).
I'm running the script inside a Server 2012 R2 VM with 4 vCPUs and 4GB of RAM. I have set the MaxMemoryPerShellMB value for PowerShell to 3GB. As mentioned previously, the script works, but after 2-4 hours PowerShell will start throwing OutOfMemoryExceptions and crash. The script is 'V2 friendly' and I haven't adapted it to V3+, but I doubt that matters too much.
My question is whether or not the script can be improved to prevent/eliminate the memory exceptions I am running into at the moment. I don't mind if it runs slower, as long as it can get the job done without having to check back every couple of hours and restart it.
$i=0
$all = Get-ChildItem -Recurse -Include *.txt
$scriptfiles = Select-String -Pattern string1,string2,string3 $all
$output = "C:\Temp\scriptoutput.txt"

foreach ($file in $scriptFiles)
{
    $filecreate=(Get-ChildItem $file.Path).creationtime
    $fileaccess=(Get-ChildItem $file.Path).lastaccesstime
    $filewrite=(Get-ChildItem $file.Path).lastwritetime

    "$file.Path,Created: $filecreate,Accessed: $fileaccess,Modified: $filewrite" | out-file -FilePath $output -Append

    (Get-Content $file.Path) | ForEach-Object {$_ -replace "string1", "newstring" `
                                                  -replace "string2", "newstring" `
                                                  -replace "string3", "newstring"
    } | Set-Content $file.Path

    (Get-ChildItem $file.Path).creationtime=$filecreate
    (Get-ChildItem $file.Path).lastaccesstime=$fileaccess
    (Get-ChildItem $file.Path).lastwritetime=$filewrite

    $filecreate=(Get-ChildItem $file.Path).creationtime
    $fileaccess=(Get-ChildItem $file.Path).lastaccesstime
    $filewrite=(Get-ChildItem $file.Path).lastwritetime

    "$file.Path,UPDATED Created: $filecreate,UPDATED Accessed: $fileaccess,UPDATED Modified: $filewrite" | out-file -FilePath $output -Append

    $i++
}
Any comments, criticisms, and suggestions welcomed.
Thanks
The biggest issue I can see is that you are repeatedly getting the file object for every property you are querying. Replace that with one call per loop pass and save it to be used during the pass. Also, Out-File is one of the slower methods of outputting data to a file.
$output = "C:\Temp\scriptoutput.txt"
$scriptfiles = Get-ChildItem -Recurse -Include *.txt |
    Select-String -Pattern string1,string2,string3 |
    Select-Object -ExpandProperty Path

$scriptfiles | ForEach-Object{
    $file = Get-Item $_

    # Save current file times
    $filecreate=$file.creationtime
    $fileaccess=$file.lastaccesstime
    $filewrite=$file.lastwritetime

    "$file,Created: $filecreate,Accessed: $fileaccess,Modified: $filewrite"

    # Update content.
    (Get-Content $file) -replace "string1", "newstring" `
                        -replace "string2", "newstring" `
                        -replace "string3", "newstring" | Set-Content $file

    # Write all the original times back.
    $file.creationtime=$filecreate
    $file.lastaccesstime=$fileaccess
    $file.lastwritetime=$filewrite

    # Verify the changes... Should not be required but it is what you were doing.
    $filecreate=$file.creationtime
    $fileaccess=$file.lastaccesstime
    $filewrite=$file.lastwritetime

    "$file,UPDATED Created: $filecreate,UPDATED Accessed: $fileaccess,UPDATED Modified: $filewrite"
} | Set-Content $output
Not tested but should be fine.
Depending on what your replacements are actually like, you could probably save some time there as well. Test first before running in production, obviously.
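For example, if all three strings really do map to the same replacement (an assumption on my part), a single alternation pattern needs only one pass per line:

(Get-Content $file) -replace 'string1|string2|string3', 'newstring' | Set-Content $file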
I removed the counter you had, since it was not used anywhere else in the code.
Your logging could easily be CSV-based since you have all the objects ready to go, but I just want to be sure we are on the right track before we go too far.
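A rough sketch of what that CSV-based logging could look like (hypothetical; the column names are my own), emitting objects instead of formatted strings and exporting them at the end:

$scriptfiles | ForEach-Object{
    $file = Get-Item $_
    [pscustomobject]@{
        Path     = $file.FullName
        Created  = $file.CreationTime
        Accessed = $file.LastAccessTime
        Modified = $file.LastWriteTime
    }
    # ... content replacement and timestamp restore as in the script above ...
} | Export-Csv $output -NoTypeInformation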
I am using PowerShell 3.
What is best practice for concatenating files?
file1.txt + file2.txt = file3.txt
Does PowerShell provide a facility for performing this operation directly? Or do I need to load each file's contents into local variables?
If all the files exist in the same directory and can be matched by a simple pattern, the following code will combine all files into one.
Get-Content .\File?.txt | Out-File .\Combined.txt
I would go this route:
Get-Content file1.txt, file2.txt | Set-Content file3.txt
Use the -Encoding parameter on Set-Content if you need something other than its default encoding (the system's ANSI code page in Windows PowerShell; BOM-less UTF-8 in PowerShell 7+).
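For example (UTF8 is just an illustration):

Get-Content file1.txt, file2.txt | Set-Content file3.txt -Encoding UTF8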
If you need more flexibility, you could use something like
Get-ChildItem -Recurse *.cs | ForEach-Object { Get-Content $_ } | Out-File -FilePath .\all.txt
Warning: Concatenation using a plain Get-Content (with or without the -Raw flag) is only safe for text files; PowerShell is too helpful for anything else:
Without -Raw, it "fixes" (i.e. breaks, pun intended) line breaks, or what Powershell thinks is a line break.
With -Raw, each file's content is emitted as one string, but a terminating line end (normally CR+LF) is still added to each part at the end of the pipeline. Newer PowerShell versions offer the -NoNewline switch on Set-Content to suppress that.
To concatenate a binary file (that is, an arbitrary file that was split for some reason and needs to be put together again), use either this:
Get-Content -Raw file1, file2 | Set-Content -NoNewline destination
or something like this:
Get-Content file1 -Encoding Byte -Raw | Set-Content destination -Encoding Byte
Get-Content file2 -Encoding Byte -Raw | Add-Content destination -Encoding Byte
An alternative is to use the CMD shell:
copy file1 /b + file2 /b + file3 /b + ... destinationfile
You must not use any of the parts as the destination; the destination file must be different from all of the parts. Otherwise you're in for a surprise and will need to find a backup copy of that part.
A generalization based on @Keith's answer:
gc <some wildcard expression> | sc output
Here is an interesting example of how to make a zip-in-image file, based on PowerShell 7:
Get-Content -AsByteStream file1.png | Set-Content -AsByteStream file3.png
Get-Content -AsByteStream file2.7z | Add-Content -AsByteStream file3.png
gc file1.txt, file2.txt > output.txt
I think this is as short as it gets.
In case you would like to ensure the concatenation is done in a specific order, use the Sort-Object -Property <Some Name> argument. For example, to concatenate based on the file name in ascending order:
Get-ChildItem -Path ./* -Include *.txt -Exclude output.txt | Sort-Object -Property Name | ForEach-Object { Get-Content $_ } | Out-File output.txt
IMPORTANT: -Exclude and Out-File MUST contain the same value; otherwise output.txt will itself match the input pattern and the command will keep on adding to it until your disk is full.
Note that you must append a * at the end of the -Path argument because you are using -Include, as mentioned in Get-ChildItem documentation.
I have five .sql files and know the name of each file. For this example, call them one.sql, two.sql, three.sql, four.sql and five.sql. I want to append the text of all files and create one file called master.sql. How do I do this in PowerShell? Feel free to post multiple answers to this problem because I am sure there are several ways to do this.
My attempt does not work and creates a file with several hundred thousand lines.
PS C:\sql> get-content '.\one.sql' | get-content '.\two.sql' | get-content '.\three.sql' | get-content '.\four.sql' | get-content '.\five.sql' | out-file -encoding UNICODE master.sql
Get-Content one.sql,two.sql,three.sql,four.sql,five.sql > master.sql
Note that > is equivalent to Out-File -Encoding Unicode. I only tend to use Out-File when I need to specify a different encoding.
There are some good answers here, but if you have a whole lot of files and maybe you don't know all of the names, this is what I came up with:
$vara = get-childitem -name "path"
$varb = foreach ($a in $vara) {gc "path\$a"}
example
$vara = get-childitem -name "c:\users\test"
$varb = foreach ($a in $vara) {gc "c:\users\test\$a"}
You can obviously pipe this directly into Add-Content or whatever, but I like to capture it in variables so I can manipulate it later on.
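For instance, writing the captured content back out afterwards (the output path is just an example; Set-Content is used here instead of Add-Content so that re-runs overwrite rather than append):

$varb | Set-Content "c:\users\combined.txt"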
See if this works better
get-childitem "one.sql","two.sql","three.sql","four.sql","five.sql" | get-content | out-file -encoding UNICODE master.sql
I needed something similar, Chris Berry's post helped, but I think this is more efficient:
gci -name "*PathToFiles*" | gc > master.sql
The first part, gci -name "*PathToFiles*", gets you your file list. This can be done with wildcards to just get your .sql files, e.g. gci -name "\\share\folder\*.sql".
It then pipes to Get-Content and redirects the output to your master.sql file. As noted by Keith Hill, you can use Out-File in place of > to better control your output if needed.
I think the logical way of solving this is to use Add-Content:
$files = Get-ChildItem '.\one.sql', '.\two.sql', '.\three.sql', '.\four.sql', '.\five.sql'
$files | foreach { Get-Content $_ | Add-Content '.\master.sql' -encoding UNICODE }
However, Get-Content is usually very slow when reading multiple very large files. If that's your case, this article could help: http://keithhill.spaces.live.com/blog/cns!5A8D2641E0963A97!756.entry
What about:
Get-Content .\one.sql,.\two.sql,.\three.sql,.\four.sql,.\five.sql | Set-Content .\master.sql
Here is how I concatenate the sql files from the Sql folder:
# Set the current location of the script to use relative path
Set-Location $PSScriptRoot
# Concatenate all the sql files
$concatSql = Get-Content -Path .\Sql\*.sql
# Append the combined sql to a single file
Add-Content -Path concatFile.sql -Value $concatSql