Goal: Update text entry on one line within many files distributed on a server
Summary: As part of an application migration between datacenters, the .rdp files on end-user desktops need to be updated to point to the new IP address of their Remote Desktop Server. All the .rdp files reside on Windows servers in a redirected-folders SMB share where I have administrative access.
PowerShell experience: minimal. Still trying to wrap my head around the way variables, output, and piping work.
I was originally trying to make a single line of PowerShell code to complete this task, but got stuck and had to make a script file with the two lines of code below.
- Line 1: Search for all .rdp files in the folder structure and store each full path with file name in a variable. Every file has to be checked, since users tend to accidentally change file names, so the names can't be relied on.
- Line 2: Make one pass through all the files, replace only instances of two particular IP addresses with the new addresses, then write the changes back to the original files.
$Path = ls 'C:\Scripts\Replace-RDP\TESTFILES\' -Include *.rdp -Recurse -Force -ErrorAction SilentlyContinue | foreach fullname
$Path | (Get-Content -Path $Path) -Replace 'IPserver1','newIPserver1' -Replace 'IPserver2','newIPserver2' | Set-Content $Path -Force
I have found most of the solution with PowerShell but have a problem with the results. When output to the screen, the second line of code changes the contents correctly in memory. The content written to file, however, resulted in the new server IP address being written into ALL .rdp files, even when the source file's target IP address doesn't match the -replace criteria.
Text inside a .rdp on the relevant line is:
full address:s:192.168.1.123
changes to:
full address:s:172.16.1.23
Thank you for all assistance in reaching the endpoint. I have spent hours learning from various sites and code snippets.
You need to keep track of each file that you are reading so that you can save changes back to that file. ForEach-Object makes this process easy. Inside the ForEach-Object script block, the current object $_ is the FullName value of each of your files.
$CurrentIP1 = '192\.168\.1\.123'
$CurrentIP2 = '192\.168\.1\.124'
$NewIP1 = '172.16.1.23'
$NewIP2 = '172.16.1.24'
$files = (Get-ChildItem 'C:\Scripts\Replace-RDP\TESTFILES\' -Filter *.rdp -Recurse -Force -File -ErrorAction SilentlyContinue).FullName
$files | ForEach-Object {
    if (($contents = Get-Content $_) -match "$CurrentIP1|$CurrentIP2") {
        $contents -replace $CurrentIP1,$NewIP1 -replace $CurrentIP2,$NewIP2 |
            Set-Content $_
    }
}
Note that using the -File switch on Get-ChildItem (alias ls) outputs only files. Since -replace uses regex to do matching, you must backslash escape literal . characters.
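If you would rather not escape the dots by hand, [regex]::Escape can build the patterns from the plain IP strings; a minimal sketch:
# Build regex-safe patterns from the plain IP strings instead of escaping by hand.
$CurrentIP1 = [regex]::Escape('192.168.1.123')   # yields 192\.168\.1\.123
$CurrentIP2 = [regex]::Escape('192.168.1.124')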
Good morning,
Hopefully this will be a quick and easy one to answer.
I am trying to run a PS script and have it export to CSV based on a list of IP addresses from a text file. At the moment it will run, but it only produces one CSV.
Code Revision 1
$computers = get-content "pathway.txt"
$source = "\\$computer\c$"
foreach ($computer in $computers) {
Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
Export-CSV -Path "C:\path\$computer.csv" -NoTypeInformation
}
Edit
The script is now creating the individual server files as needed, and I changed the source .txt file to list the servers by hostname rather than IP. The issue now is that no data is populating in the .csv files: the script creates them, but nothing populates. I have tried different source file paths to see if maybe it's due to folder permissions or empty folders, but nothing seems to populate in the files.
The file read into $computers lists a number of server IP addresses, so the script should run against each IP and then write out to a separate CSV with the results, naming each CSV file after the individual IP address.
Does anyone see any errors in the script I provided that would prevent it from writing out to a separate CSV on each run? I feel like it has something to do with the foreach loop, but I cannot seem to isolate where I am going wrong.
Also, I cannot use any third-party software, as this is a closed network with very strict FW rules, so I am left with PowerShell (which is okay). And yes, this will be a very long run for each of the servers, but I am okay with that.
Edit
I did forget to mention that when I run the script, I get an error indicating that the Export-CSV path is too long, which doesn't make any sense unless it is trying to write all of the IP addresses into a single name.
"Export-CSV : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:1
TIA
Running the script against the C: drive of each computer is strongly inadvisable, especially with the -Recurse option. But for your understanding, this is how you should pass the values to the variables. I haven't tested this code.
$computer = Get-Content "pathway.txt"
foreach ($Source in $computer) {
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$Source.csv" -NoTypeInformation
}
$computer will hold the whole list, foreach will loop over it, and $Source will get one IP at a time. I also suggest using hostnames instead of IPs, so that your output files are named servername.csv for each server.
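If the source file has to stay as IP addresses, a reverse DNS lookup can supply a hostname for the output file name. A minimal sketch, assuming the IPs resolve on your network (GetHostEntry throws for unresolvable addresses, so this falls back to the IP):
foreach ($Source in $computer) {
    # Resolve the IP to a hostname for a friendlier CSV name; fall back to the IP.
    try   { $name = [System.Net.Dns]::GetHostEntry($Source).HostName }
    catch { $name = $Source }
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$name.csv" -NoTypeInformation
}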
In hopes that this helps someone else: I have finally got the script to run and create the individual .csv files for each server hostname.
$servers = Get-Content "path"
foreach ($server in $servers)
{
    Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-CSV -Path "path\$server.csv" -NoTypeInformation
}
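If some servers in the list may be offline, a quick reachability check avoids long SMB timeouts. A minimal sketch, assuming the servers answer ping:
foreach ($server in $servers)
{
    # Skip servers that don't answer a single ping to avoid long SMB timeouts.
    if (-not (Test-Connection -ComputerName $server -Count 1 -Quiet)) { continue }
    Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-CSV -Path "path\$server.csv" -NoTypeInformation
}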
I manage database servers, and I often have to apply scripts to different servers or databases.
Sometimes these scripts are all saved in a directory and need to be opened and run in the target server\database.
As I have been looking at automating this task, I came across Run All PowerShell Scripts In A Directory and also How can I execute a set of .SQL files from within SSMS?, and that is exactly what I needed. However, I stumbled over a few issues:
I don't know the file names
:setvar path "c:\Path_to_scripts\"
:r $(path)\file1.sql
:r $(path)\file2.sql
I tried to combine all the .sql files into one big file, but when I copied from PowerShell into SQL, the lines got messed up in many of the procedures that had long lines:
cls
$Radhe = Get-Content 'D:\apply all scripts to SQLPRODUCTION\*.sql' -Raw
$Radhe.Count
$Radhe.LongLength
$Radhe
If I could read all the files in that specific folder and save them all into a single the_scripts_to_run.sql file, without changing the line endings, that would be perfect.
I don't need to use Get-Content or any command in particular; I just would like to get all my scripts into a single big script with everything in it, without changes.
How can I achieve that?
I even found Merge multiple SQL files into a single SQL file but I want to get it done via powershell.
This should work fine. I'm not sure what you mean by not needing to use Get-Content; you could use [System.IO.File]::ReadAllLines() or [System.IO.File]::ReadAllText(), but this should work fine too. Try it and let me know if it works.
$path = "c:\Path_to_scripts"
$scripts = (Get-ChildItem "$path\*.sql" -Recurse -File).FullName
$merged = [System.Collections.Generic.List[string]]::new()
foreach ($script in $scripts)
{
    # AddRange flattens each file's lines into one list of strings.
    $merged.AddRange([string[]](Get-Content $script))
}
$merged | Out-File "$path\mergedscripts.sql"
This is actually much simpler than the proposed solutions. Get-Content takes a list of paths and supports wildcards, so no loop is required.
$path = 'c:\temp\sql'
Set-Content -Path "$path\the_scripts_to_run.sql" -Value (Get-Content -Path "$path\*.sql" -Raw)
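Two caveats: the order in which the wildcard expands is not guaranteed, and on a second run the output file itself matches *.sql and would be re-read. If the scripts must be concatenated in a fixed order, sorting explicitly and excluding the output file is safer; a minimal sketch:
# Sort by name for a deterministic order; exclude the output file on re-runs.
$path = 'c:\temp\sql'
Get-ChildItem -Path "$path\*.sql" -Exclude 'the_scripts_to_run.sql' |
    Sort-Object Name |
    Get-Content -Raw |
    Set-Content -Path "$path\the_scripts_to_run.sql"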
Looks like @Santiago and I had the same idea:
Get-ChildItem -Path "$path" -Filter "*.sql" | ForEach-Object -Process {
    Get-Content $_.FullName | Out-File "$path\stuff.txt" -Append -Encoding utf8
}
I have a whole bunch of users who are simply dragging emails from Outlook onto a network share. That network SMB share gets synced with OneDrive, and OneDrive hates to have leading spaces in a file name. Also, some of the file names are so long that Windows refuses to let me do anything with them (over 255 characters). I've found that simply renaming the files and taking out all the spaces does the trick.
I have thousands of files in many subdirectories, and the users keep adding more every day. I'd like a simple script that I can run every day on the file server that scans all the *.msg files in a directory and all the subdirectories and simply removes all the spaces. Can someone help me with a PowerShell script that will accomplish this?
The following variation of your own answer should perform significantly better:
Get-ChildItem -Recurse -Filter *.msg |
Rename-Item -NewName { $_.Name.Replace(' ', '') }
Note: You could add -File to Get-ChildItem in order to limit matches to files for extra robustness, but it actually slows down the command a bit (not much).
-Filter filters at the source and is therefore much faster than using -Path (implied in your command) or -Include, which require PowerShell itself to examine all items.
Using a delay-bind script block as the -NewName argument performs better than piping to the ForEach-Object cmdlet and calling Rename-Item once per iteration.
Note:
There's a risk of name collisions (see the sketch below).
Attempts to rename a file to itself - if the name contains no spaces - are quietly ignored.
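If collisions are a real concern in your folders, one way to sidestep them is to probe for an existing target first. A minimal sketch, assuming a numeric suffix is an acceptable tiebreaker (note this uses ForEach-Object, so it gives up the delay-bind speed advantage):
Get-ChildItem -Recurse -Filter *.msg | ForEach-Object {
    $newName = $_.Name.Replace(' ', '')
    if ($newName -eq $_.Name) { return }            # no spaces; nothing to do
    $target = Join-Path $_.DirectoryName $newName
    $i = 1
    while (Test-Path -LiteralPath $target) {        # avoid clobbering an existing file
        $newName = '{0}_{1}{2}' -f $_.BaseName.Replace(' ', ''), $i++, $_.Extension
        $target = Join-Path $_.DirectoryName $newName
    }
    Rename-Item -LiteralPath $_.FullName -NewName $newName
}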
This script works great:
get-childitem -recurse *.msg | foreach { rename-item $_ $_.Name.Replace(" ", "") }
I have a list of EDI text files with specific text in them. Currently, in order for our custom scripting to convert them into an SQL table, we need to be able to see the X12 file type in the file name. Because we are using an SQL script to get the files into tables, this needs to be a one-line solution. We have a definition table of client files which specifies which field terminator and file types to look for, so we will later substitute those values into the one-line solution to be executed individually. I am currently looking at PowerShell (v3) to do this for maximum present and future compatibility. Also, I am totally new to PowerShell, and have based my script generation on posts in this forum.
Example file names:
t.text.oxf.20170815123456.out
t.text.oxf.20170815234567.out
t.text.oxf.20170815345678.out
t.text.oxf.20170815456789.out
Search strings to find within the files (to identify the EDI X12 file type uniquely; a given string may appear in the same file n times):
ST*867
ST*846
ST~867
ST~846
ST|867
ST|846
Here is what I have so far, which does not show itself doing anything with the -WhatIf parameter:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path | Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace 'out$','867.out' -f $i++) -whatif}
The first part:
(Get-ChildItem .\ -recurse | Select-String -pattern 'ST~867' -SimpleMatch).Path
simply gets a list of paths for the files that need to be renamed.
The second part after the | pipe:
Foreach -Begin {$i=1} -Process {Rename-Item -LiteralPath $_ -NewName ($_ -replace '\.out','.867.out' -f $i++) -whatif}
will supposedly loop through that list and rename the files, adding the EDI type to the end of the file name. I have tried 'out$','867.out' with no change.
Current Errors:
The first part shows duplicated path elements, probably because there are multiple Transaction Set Headers in the files. Is there any way to force it to be unique?
The command does not show any errors (red text), but with the -WhatIf parameter it shows that it does not rename any files (I tried running it without -WhatIf as well).
1) Remove duplicates using the -List switch on Select-String.
2) You need to actually pipe the objects into the loop.
Try this?
Select-String -Path .\*.out -pattern 'ST~867' -SimpleMatch -List | Select-Object Path | ForEach-Object { Rename-Item $_.path ($_.path -replace 'out$','867.out') }
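To cover the other delimiters and both transaction sets in one pass, the delimiter can be matched with a regex character class instead of -SimpleMatch. A minimal sketch, with -WhatIf left in for a dry run (remove it to commit the renames):
# ST[*~|] matches ST followed by *, ~ or |; -List stops at the first match per file.
foreach ($type in '867','846') {
    Select-String -Path .\*.out -Pattern "ST[*~|]$type" -List |
        ForEach-Object { Rename-Item -LiteralPath $_.Path -NewName ($_.Path -replace 'out$', "$type.out") -WhatIf }
}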
I want to start by saying coding is a bit outside of my skill set, but because a certain problem keeps appearing at work, I'm trying to automate a solution.
I use the below script to read an input file for a list of names, search C:\ for those files, then write the path to an output file if any are found.
foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        gci -Path "C:\" -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Out-File -Append c:\temp\ResultsFindFile.txt
    }
}
I would like to make two modifications to this. First, search all drives connected to the computer, not just C:\. Second, delete any found files. I'm using the Remove-Item -Confirm command, but so far I can't make it delete the file it just found.
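For what it's worth, a minimal sketch of both changes, assuming every filesystem drive should be searched and keeping the $regex filter from the original script; piping the found file objects straight into Remove-Item makes it act on exactly what was found:
# Enumerate all filesystem drive roots (C:\, D:\, ...), not just C:\.
$roots = (Get-PSDrive -PSProvider FileSystem).Root

foreach ($line in Get-Content C:\temp\InPutfile.txt) {
    if ($line -match $regex) {
        Get-ChildItem -Path $roots -Recurse -Filter $line -ErrorAction SilentlyContinue |
            Tee-Object -FilePath c:\temp\ResultsFindFile.txt -Append |   # log what was found
            Remove-Item -Confirm                                         # prompt before each delete
    }
}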