Powershell replace special characters string in all files in directory path

I'm trying to create a 'find and replace' script for the website our company just acquired. Right now, I just want to use it to replace their address and phone number with ours, but I'll likely need to customize it in the future to replace or update other things.
So far, what I got is:
(Get-Content C:\Scripts\Test.txt) |
Foreach-Object {$_ -replace "\*", "#"} |
Set-Content C:\Scripts\Test.txt
which I got from The Scripting Guy :P
However, I need help customizing it. What I need it to do is:
Do it for all files in a directory and all sub-directories, not just one file. The website as far as I can tell is a collection of *.php files
Handle special characters that appear in some addresses, like copyright symbols (©), pipes (|), commas (,), and periods (.)
Here's the exact string I'm trying to replace (as it appears in the .php's):
<p>©Copyright 2012 GSS | 48009 Fremont Blvd., Fremont, CA 94538 USA</p>
Since this could be the first tool in my powershell toolbox, any explanation of what you're adding or changing would greatly help me understand what's going on.
Bonus points:
Any way to log which files were 'find-and-replace'ed?

My suggestion would be to use a ForEach loop. I don't see the need for a function in this case; just have the code in your ForEach loop. I would define a string to search for and a string to replace with. When you perform the replace, make sure the search string is escaped. Something along these lines:
$TxtToFind = "<p>©Copyright 2012 GSS | 48009 Fremont Blvd., Fremont, CA 94538 USA</p>"
$UpdatedTxt = "<p>©Copyright 2014 | 1234 Somenew St., Houston, TX 77045 USA</p>"
$Logfile = "C:\Temp\FileUpdate.log"
ForEach($File in (Get-ChildItem C:\WebRoot\ -Recurse)){
    # Only touch files that actually contain the target string
    If($File | Select-String $TxtToFind -SimpleMatch -Quiet){
        "Updating lines in $($File.FullName)" | Out-File $Logfile -Append
        # Log which line numbers will be changed in this file
        $File | Select-String $TxtToFind -SimpleMatch -AllMatches |
            Select-Object -ExpandProperty LineNumber -Unique |
            Out-File $Logfile -Append
        # Escape the search text so regex metacharacters (|, ., *) are treated literally
        (Get-Content $File.FullName) |
            ForEach-Object {$_ -replace [RegEx]::Escape($TxtToFind), $UpdatedTxt} |
            Set-Content $File.FullName
    }
}
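Since the site is apparently a collection of *.php files, you could also narrow the Get-ChildItem call so Select-String never touches directories or binary assets (a small optional tweak; -File requires PowerShell 3.0 or later):
# restrict the recursion to .php files and skip directories (-File needs PS 3.0+)
ForEach($File in (Get-ChildItem C:\WebRoot\ -Filter *.php -Recurse -File)){
    # ...same If/Select-String/Set-Content body as above...
}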

You can leverage regular expressions to find and replace the string you desire; the following script will iterate over all the .php files within the provided folder recursively.
function ParseFile($file){
    # Add logic to parse the file
    Write-Host $file.FullName
}

$files = Get-ChildItem -Recurse C:\Path -Filter *.php
foreach ($file in $files) {
    ParseFile $file
}
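For the question's specific task, the body of ParseFile could be filled in along these lines (a sketch only: the old and new address strings come from the question and the earlier answer, and [regex]::Escape keeps the ©, | and . characters literal):
function ParseFile($file){
    $TxtToFind  = "<p>©Copyright 2012 GSS | 48009 Fremont Blvd., Fremont, CA 94538 USA</p>"
    $UpdatedTxt = "<p>©Copyright 2014 | 1234 Somenew St., Houston, TX 77045 USA</p>"
    # read the file, replace the literal (escaped) string, and write it back
    (Get-Content $file.FullName) |
        ForEach-Object { $_ -replace [regex]::Escape($TxtToFind), $UpdatedTxt } |
        Set-Content $file.FullName
    Write-Host "Processed $($file.FullName)"
}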

Related

Powershell Help, Need to search for pattern/string and copy to new destination

I need help searching a network share for 30+ keywords and then copying the matching files to a new location...
I need to search all types of docs... txt, doc, docx, pdfs, xls, xlsx, etc...
I have a CSV file of keywords... the header is called Words
FYI - in the CSV file each word is not in quotes, and some lines have two words (do they need quotes?). A few have a wildcard, hous* (for houses, house, housing, etc.) - does that need quotes?
example
street
1234 Elm St
Hous*
Do they need to be in Quotes?
It will also need to search sub-directories.
This is what I have...
$CSV = Import-Csv -Path "C:\Users\Username\Documents\book1.csv"
foreach ($Words in $CSV)
{
Get-ChildItem \\Server\Groups$\HR-Dept -Recurse | Select-String -Pattern '$Words' -CaseSensitive -SimpleMatch | Copy-Item -Destination "C:\Users\Username\Desktop\Testing"
}
If you're intending to search by file name rather than content, -filter does the job quite efficiently.
$CSV = Import-Csv -Path "C:\Users\Username\Documents\book1.csv"
foreach ($Row in $CSV) {
    # take the value of the Words column and remove any extra "*" to avoid duplication later
    # (the asterisk is escaped as "\*" because -replace uses regex)
    $words = $Row.Words -replace "\*"
    # use -File to exclude directories from the match. NB: -File is PS version-dependent (3.0+)
    Get-ChildItem \\Server\Groups$\HR-Dept -Filter *$words* -Recurse -File |
        Copy-Item -Destination "C:\Users\Username\Desktop\Testing"
}
Spaces in your $words should be fine if your file names will have the same words and spaces.
I recommend trying the gci part first to see if you get the appropriate list of files before pipelining it to the Copy-Item. I haven't tested with a file path that includes a $, so that might be a consideration.
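If you do need to match on file contents rather than names (as in the original attempt), here is a rough sketch along the same lines, assuming the CSV column really is called Words. The key difference from the original is that the pattern is passed as $Row.Words rather than the single-quoted literal '$Words', so the variable actually expands. Keep in mind that Select-String only sees plain text, so binary formats such as .pdf and .xlsx won't match reliably, and -SimpleMatch treats wildcard entries like hous* as literal text.
$CSV = Import-Csv -Path "C:\Users\Username\Documents\book1.csv"
foreach ($Row in $CSV) {
    # find files whose content contains the keyword, then copy each matching file once
    Get-ChildItem \\Server\Groups$\HR-Dept -Recurse -File |
        Select-String -Pattern $Row.Words -SimpleMatch |
        Select-Object -ExpandProperty Path -Unique |
        Copy-Item -Destination "C:\Users\Username\Desktop\Testing"
}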

One Line PowerShell Batch Rename based on File Contents

I have a list of EDI text files with specific text in them. Currently, in order for our custom scripting to convert them into an SQL table, we need to be able to see the X12 file type in the filename. Because we are using a SQL script to get the files into tables, this solution needs to be a one-line solution. We have a definition table of client files which specifies which field terminator and file types to look for, so we will later substitute those values into the one-line solution to be executed individually. I am currently looking at PowerShell (v3) to do this for maximum present and future compatibility. Also, I am totally new to PowerShell, and have based my script on posts in this forum.
Files example
t.text.oxf.20170815123456.out
t.text.oxf.20170815234567.out
t.text.oxf.20170815345678.out
t.text.oxf.20170815456789.out
Search strings to find within files: (To find EDI X12 file type uniquely, which may be duplicated within the same file n times)
ST*867
ST*846
ST~867
ST~846
ST|867
ST|846
Here is what I have so far which does not show itself doing anything with the whatif parameter:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path | Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace 'out$','867.out' -f $i++) -whatif}
The first part:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path
Simply gets a list of paths that we need to input to be renamed
The second part after the | pipe:
Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace '\.out','.867.out' -f $i++) -whatif}
will supposedly loop through that list and rename the files adding the EDI type to the end of the file. I have tried 'out$','867.out' with no change.
Current Errors:
The first part shows duplicated path elements, probably because there are multiple Transaction Set Headers in the files; is there any way to force it to be unique?
The command does not show any Errors (red text) but with the whatif parameter shows that it does not rename any files (tried running it without as well).
1) Remove duplicates using the -List switch in Select-String.
2) You need to actually pipe the objects into the ForEach-Object loop.
Try this?
Select-String -Path .\*.out -pattern 'ST~867' -SimpleMatch -List | Select-Object Path | ForEach-Object { Rename-Item $_.path ($_.path -replace 'out$','867.out') }
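Since the delimiter and transaction type vary by client (ST*867, ST~846, and so on), the same idea can be parameterized. This is only a sketch, not the required one-liner, and it leaves -WhatIf in place so nothing is renamed until the output looks right:
foreach ($type in '867', '846') {
    foreach ($delim in '*', '~', '|') {
        # -List stops at the first match per file, which removes the duplicate paths
        Select-String -Path .\*.out -Pattern "ST$delim$type" -SimpleMatch -List |
            ForEach-Object { Rename-Item -LiteralPath $_.Path -NewName ($_.Path -replace 'out$', "$type.out") -WhatIf }
    }
}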

PowerShell Script for updating a visual studio .vsxproj file

I have a small problem. I want to search a directory, let's say D:\. I want to find the files with the extension .vsxproj. Then I can do one of two things:
1. Access a node called project properties and change a single word within a string.
2. Or just replace the word throughout the entire file without searching for a specific node. This is easier and would serve my purpose.
Now for the actual code. I have tried the following.
(Get-Content D:\data123.xml) |
Foreach-Object {$_ -replace "Wow32", "Wow3232" } |
Set-Content D:\data123.xml
This works for a single file if I know the name. But I have more than a hundred files, so I try something like this. I go to my D drive and do this:
$configFiles = Get-ChildItem . *.vsxproj -rec
foreach ($file in $configFiles)
{
    (Get-Content $file.PSPath) |
        Foreach-Object {$_ -replace "Wow32", "Wow3232" } |
        Set-Content $file.PSPath
}
Please let me know how I can recurse over several files and change this one word. I have searched and looked at quite a few answers; none of them actually solves my problem. Please help.
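A minimal sketch of the recursive version, assuming the plain-text replace from option 2 is enough and the project files live somewhere under D:\:
# Find every *.vsxproj under D:\ and swap Wow32 for Wow3232 in place
Get-ChildItem D:\ -Filter *.vsxproj -Recurse | ForEach-Object {
    $path = $_.FullName
    # -replace works element-wise on the array of lines Get-Content returns
    (Get-Content $path) -replace 'Wow32', 'Wow3232' | Set-Content $path
}
Note that re-running this would match the already-replaced Wow3232 again (it still contains Wow32), so treat it as a one-shot fix rather than something to schedule.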

Parse and Switch Elements of Folder Names with Powershell 2.0

I have been trying to write a powershell script (my first) to:
1. parse out only the folders within a directory
2. select only those folders matching a specific naming convention ('SEASON YEAR')
3. switch the order of the elements of the name ('YEAR SEASON')
I eventually used the program BulkRenameUtility to do this using the regexp ^(\w+) (\d+) and switching the token order to $2 $1 -- however, I still am learning Powershell and would like to be able to do this without using an external program.
So, to re-iterate: at C:\Users\Test there are folders and files. Some of the folders are named Spring 2015 or Fall 2014, for example. However, other folders have names such as geogex. Files have names such as ENG1A SPRING 2015.odt and untitled_0.odt.
How do I only change the names of the folders named like "Spring 2015" to read "2015 Spring", "2014 Fall" etc. ?
I was able to use
gci | ? {$_.PSIsContainer} | select-string -pattern '\d+'
to accomplish 1 and 2, but am stuck on using this to do part 3: actually renaming by reversing the elements of the name. I tried capturing the output in a variable, like so:
gci | ? {$_.PSIsContainer} | select-string -pattern '\d+' | %{$data = $_.line; Write-Output "$data"};
however, while the above outputs exactly the folders I want, the variable $data seems to only hold the last line of output, such that:
gci | ? {$_.PSIsContainer} | select-string -pattern '\d+' | %{$data = $_.line; Write-Output "$data"};
$data
will output:
test 123
test 321
test 321
I am unsure if this is even a valid direction to begin with.
Any help would be greatly appreciated.
This should get the job done.
$Path = "C:\Users\Test"
$regex = "^(Spring|Summer|Fall|Winter)\s+\d{4}$"
Get-ChildItem $Path |
Where-Object {$_.PSIsContainer -and $_.Name -match $regex} |
Rename-Item -NewName {$split = $_.Name -split "\s+"; "{0} {1}" -f $split[1],$split[0]}
We use that regex to select the folders that fit your convention. It is fairly targeted because it uses the specific season names; the year check is more relaxed, just looking for four digits.
There are other ways to do it, but for the rename I just split the name on the space and reversed the two parts using the -f format operator.
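For example, for a folder named Spring 2015 the rename script block works out like this:
$split = "Spring 2015" -split "\s+"   # @("Spring", "2015")
"{0} {1}" -f $split[1], $split[0]     # "2015 Spring"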

Using PowerShell, read multiple known file names, append text of all files, create and write to one output file

I have five .sql files and know the name of each file. For this example, call them one.sql, two.sql, three.sql, four.sql and five.sql. I want to append the text of all files and create one file called master.sql. How do I do this in PowerShell? Feel free to post multiple answers to this problem because I am sure there are several ways to do this.
My attempt does not work and creates a file with several hundred thousand lines.
PS C:\sql> get-content '.\one.sql' | get-content '.\two.sql' | get-content '.\three.sql' | get-content '.\four.sql' | get-content '.\five.sql' | out-file -encoding UNICODE master.sql
Get-Content one.sql,two.sql,three.sql,four.sql,five.sql > master.sql
Note that > is equivalent to Out-File -Encoding Unicode. I only tend to use Out-File when I need to specify a different encoding.
There are some good answers here, but if you have a whole lot of files, and maybe you don't know all of the names, this is what I came up with:
$vara = get-childitem -name "path"
$varb = foreach ($a in $vara) {gc "path\$a"}
example
$vara = get-childitem -name "c:\users\test"
$varb = foreach ($a in $vara) {gc "c:\users\test\$a"}
You can obviously pipe this directly into | add-content or whatever but I like to capture in variables so I can manipulate later on.
See if this works better
get-childitem "one.sql","two.sql","three.sql","four.sql","five.sql" | get-content | out-file -encoding UNICODE master.sql
I needed something similar, Chris Berry's post helped, but I think this is more efficient:
gci -name "*PathToFiles*" | gc > master.sql
The first part gci -name "*PathToFiles*" gets you your file list. This can be done with wildcards to just get your .sql files i.e. gci -name "\\share\folder\*.sql"
Then it pipes to Get-Content and redirects the output to your master.sql file. As noted by Keith Hill, you can use Out-File in place of > to better control your output if needed.
I think the logical way of solving this is to use Add-Content:
$files = Get-ChildItem '.\one.sql', '.\two.sql', '.\three.sql', '.\four.sql', '.\five.sql'
$files | foreach { Get-Content $_ | Add-Content '.\master.sql' -encoding UNICODE }
However, Get-Content is usually very slow when reading multiple very large files. If that's your case, this article could help: http://keithhill.spaces.live.com/blog/cns!5A8D2641E0963A97!756.entry
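In case that link goes stale, one common workaround (an illustration only, reusing the file names and Unicode encoding from the examples above) is to drop down to the .NET file APIs, which avoid Get-Content's per-line object overhead:
# remove any previous output so re-runs don't append twice
$out = 'C:\sql\master.sql'
if (Test-Path $out) { Remove-Item $out }
foreach ($f in 'one.sql', 'two.sql', 'three.sql', 'four.sql', 'five.sql') {
    # read each file as one string and append it with Unicode (UTF-16) encoding
    $text = [System.IO.File]::ReadAllText("C:\sql\$f")
    [System.IO.File]::AppendAllText($out, $text, [System.Text.Encoding]::Unicode)
}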
What about:
Get-Content .\one.sql,.\two.sql,.\three.sql,.\four.sql,.\five.sql | Set-Content .\master.sql
Here is how I concatenate the sql files from the Sql folder:
# Set the current location of the script to use relative paths
Set-Location $PSScriptRoot
# Concatenate all the sql files
$concatSql = Get-Content -Path .\Sql\*.sql
# Write/overwrite the combined sql to a single file
Set-Content -Path concatFile.sql -Value $concatSql