Set-Content will make a new file but won't replace the old one - PowerShell

I have a Powershell script that keeps track of an inventory text file. I want to update the text file with the script. The following line will remove an item from the inventory as requested by the user:
Get-Content inventory | Where-Object {$_ -notmatch "$removed"} | Set-Content inventory.new
It works just fine, but I want to update the original inventory file, i.e. I don't want to make an "inventory.new," I just want to update "inventory." If I replace inventory.new with inventory I am presented with the following error:
Set-Content : The process cannot access the file 'path\inventory' because it is being used by another process.
The funny thing is that I am able to remove inventory, then rename inventory.new to inventory. Again, this works but isn't ideal. I felt silly just testing it. There must be a more elegant solution!

My understanding is that when you read the file with Get-Content at the start of a pipeline, the file stays in use until the whole pipeline finishes, because Get-Content streams its output; Set-Content therefore can't write to the same file until Get-Content lets go of it.
An easy workaround might be:
$inventory = Get-Content inventory | Where-Object {$_ -notmatch "$removed"}
$inventory | Set-Content inventory
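Another common pattern (a sketch of the same idea in a single line) is to force the read to finish before the write begins by wrapping Get-Content in parentheses; the parentheses make PowerShell read the whole file into memory and close it before Set-Content opens it:
(Get-Content inventory) | Where-Object {$_ -notmatch "$removed"} | Set-Content inventory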

Related

Powershell - Select File Name instead of full path and file name

I'm sure this is so simple you may roll your eyes at it. I cannot figure out how to suppress the full path of the file when I get a match.
I only need the file name, Line number and matched word.
I have a text file of 15K copybooks. I need to know what program references any of them.
The script takes forever to run but I know of only one way to search all files in a directory looking for ANY or ALL of the contents in my file.
Can someone point out the error? The full path is not needed and adds a step or two to my process to clean things up.
Regards,
-Ron
$searchWords=Get-Content "C:\Workspace\Missing.txt"
Foreach ($sw in $searchWords)
{
    Get-ChildItem -Path "C:\Workspace\src" -Recurse -Include "*.cbl" |
    Select-String -Pattern "$sw" |
    Select Path,LineNumber,@{n='SearchWord';e={$sw}}
}
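For what it's worth, the MatchInfo objects that Select-String emits already expose a Filename property (just the file name, without the directory) alongside Path and LineNumber, so one way to get what you describe (a sketch, not from the original thread) would be:
$searchWords = Get-Content "C:\Workspace\Missing.txt"
Foreach ($sw in $searchWords)
{
    Get-ChildItem -Path "C:\Workspace\src" -Recurse -Include "*.cbl" |
        Select-String -Pattern $sw |
        Select-Object Filename, LineNumber, @{n='SearchWord'; e={$sw}}
}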

Set-Content keeps my files busy

I'm trying to replace multiple strings with new ones, always in the same file.
Here is an example; this line gives me no problems.
(get-content modTags.bas) | %{$_ -replace "rng_origin.Offset(ColumnOffset:=1)", "rng_origin.Offset(ColumnOffset:=0)"} | set-content modTags.bas
But if I repeat this line in the script (in fact, I must do it about 20 times) I get the error that the file is currently in use.
I have tried wrapping Set-Content in parentheses the way I did with Get-Content, but it seems that only works for the first command in the pipeline.
I already know how to "bypass" this error.
By typing all my replacements inline it works (or continue the code in a new line) like this.
(get-content modTags.bas) | %{$_ -replace "X","Y" `
-replace "A","B"} | set-content modTags.bas
So this is a question about why Set-Content keeps the file occupied for the next command in the same script, and how that can be avoided. With Get-Content it's easy with the () solution, and I was expecting something similar for Set-Content.
And second: could you suggest a better alternative for replacing multiple strings with different ones and saving the result to the same file (without creating a file.new.txt, file.old.txt, or anything like that)?
Thanks!
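One possible alternative (a sketch, not from the original thread; it assumes PowerShell 3.0+ for -Raw, and the replacement pairs shown are just placeholders) is to read the file once, apply every replacement in memory, and write it back once, so the file is only opened twice in total:
$path = "modTags.bas"
$replacements = @{
    "rng_origin.Offset(ColumnOffset:=1)" = "rng_origin.Offset(ColumnOffset:=0)"
    "X" = "Y"
    "A" = "B"
}
# Read the whole file as a single string
$content = Get-Content $path -Raw
foreach ($pair in $replacements.GetEnumerator()) {
    # String.Replace does a literal replacement, so regex metacharacters
    # such as ( ) in the search text need no escaping
    $content = $content.Replace($pair.Key, $pair.Value)
}
Set-Content -Path $path -Value $content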

Add values of array to specific place in csv file

I'm far from being an expert in PowerShell, so I'll do my best to explain here.
I was able to add a column, but now I want to add values to a column that's already there, using a separate script.
For example, the following CSV file:
WhenToBuyThings,ThingsToBuy
BuyNow ,Bed
BuyNow ,Desk
BuyNow ,Computer
BuyNow ,Food
BuyLater ,
BuyLater ,
BuyLater ,
I have the array:
$BuyStuffLater = "Books","Toys","ShinnyStuff"
So the end result of the file should look like this
BuyNow ,Bed
BuyNow ,Desk
BuyNow ,Computer
BuyNow ,Food
BuyLater ,Books
BuyLater ,Toys
BuyLater ,ShinnyStuff
Any help with how to do this in code would be much appreciated. Also, we can't use "," as the delimiter, because in the real script some values will contain commas.
I got it after a few hours of fiddling...
$myArray = "Books","Toys","ShinnyStuff"
$i = 0
Import-Csv "C:\Temp\test.csv" |
ForEach-Object {if($_.WhenToBuyThings -eq "BuyLater"){$_.ThingsToBuy = $myArray[$i];$i++}return $_} |
Export-Csv C:\Temp\testtemp.csv -NoTypeInformation
All is well now...
I am new to PowerShell, too. Here's what I found. This searches for and returns all the lines that match; I'm not sure it can be piped.
$BuyStuffLater = "Books","Toys","ShinnyStuff"
$x = 0
Get-Content -Path .\mydata.txt | select-string -pattern "BuyLater" #searches and displays
# I'm not sure about this piping: (| foreach {$_ + $BuyStuffLater[$x]; $x++} | Set-Content .\outputfile.csv)
This filter will work, though I still have to work on the piping. The other answer might be better.
I don't see the point of iterating through each object to check whether its WhenToBuyThings is "BuyLater". If anything, what you are doing could be harmful if you run multiple passes adding to the list: it could remove previous things you wanted to buy. If "Kidney Dialysis Machine" was listed as a "BuyLater" under WhenToBuyThings, you would overwrite it, with dire consequences.
What we can do is build two lists and merge them into a new CSV file. The first list is your original file minus any entry where a "BuyLater" row has a blank ThingsToBuy. The second list is an object array built from your $BuyStuffLater. Add these lists together and export.
Also there is zero need to worry about using a comma delimiter when using Export-CSV. The data is quoted so commas in data do not affect the data structure. If this was still a concern you could use -Delimiter ";". I noticed in your answer that you did not attempt to account for commas either (not that it matters based on what I just said).
$path = "C:\Temp\test.csv"
$ListOfItemsToBuy = "Books","Toys","ShinnyStuff: Big, ShinyStuff"
$BuyStuffLater = $ListOfItemsToBuy | ForEach-Object{
    [pscustomobject]@{
        WhenToBuyThings = "BuyLater"
        ThingsToBuy = $_
    }
}
$CurrentList = Import-Csv $path | Where-Object{!($_.WhenToBuyThings -eq "BuyLater" -and [string]::IsNullOrWhiteSpace($_.ThingsToBuy))}
$CurrentList + $BuyStuffLater | Export-Csv $path -NoTypeInformation
Since you have PowerShell 3.0 we can use [pscustomobject]@{} to build the new objects very easily. Combine both arrays simply by adding them together and export back to the original file.
You should notice I used slightly different input data. One includes a comma. I did that so you can see what the output file looks like.
"BuyLater","Books"
"BuyLater","Toys"
"BuyLater","ShinnyStuff: Big, ShinyStuff"

Replace lines with specific string and save with the same name

I'm working with an application that creates a log file. Due to an error in the software itself, it keeps producing three errors I'm not interested in. Each line has a unique identifier so I can't just replace the line since each one is different.
I have two main issues with this: I need to save it with the same name, and while the script runs the file should remain available (in case the logger needs to write something).
I can't change the original app to prevent it from writing that part of the log.
I have tried so far:
Get-Content log.log | Where-Object {$_ -notmatch 'ERROR1' -and $_ -notmatch 'ERROR2' -and $_ -notmatch 'ERROR3'} | Set-Content log_stripped.log
^ It only works if the output file has a different name.
Get-Content error.log | foreach-object { Where-Object {$_-notmatch 'ERROR1' -And $_-notmatch 'ERROR2' -And $_-notmatch 'ERROR3' } } | Set-Content error.log
^ This one froze my PS session.
I also tried reading the file to a variable:
$logcontent = ${h:error.log}
but I got System.OutOfMemoryException.
Ideally, what I need is something that reads the log file, takes away all the lines I don't want, and then saves it with its original name.
Ideas? (Keep in mind that the log file is roughly 900 MB with the unnecessary data and 45 MB once I strip it with the first method, but I need it to save the file with its original name.)
You can't save the file back to the same name while you're still reading from it, which means you'd have to read the whole 900MB into memory before you start writing. Not a good idea.
Try this:
Remove-Item log_stripped.log
# -ReadCount 1000 sends the lines down the pipeline in arrays of 1000,
# so -notmatch acts as a filter on each batch rather than on single lines
Get-Content log.log -ReadCount 1000 |
    foreach { $_ -notmatch 'ERROR1|ERROR2|ERROR3' | Add-Content log_stripped.log }
Remove-Item log.log
Rename-Item log_stripped.log log.log
I know you said you want to save to the same filename, but if the reason you want that is that you want the log to be continuously updated, then you could do the following:
Get-Content -Wait log.log |
? {$_ -notmatch 'ERROR1|ERROR2|ERROR3' } |
Out-File log_stripped.log
Note the -Wait on the Get-Content.
log_stripped.log will be continuously updated as log.log is updated.

PowerShell script: write back to sources from drag and drop

I need to create a PowerShell script that removes quotes from CSV files in a user-friendly drag-and-drop way. I have the basics of the script down courtesy of this page:
http://blogs.technet.com/b/heyscriptingguy/archive/2011/11/02/remove-unwanted-quotation-marks-from-csv-files-by-using-powershell.aspx
And I've already successfully made .ps1 files drag-and-droppable courtesy of this Stack Overflow question:
Drag and Drop to a Powershell script
The author of the answer implies that it's just as easy to drop a single file, many files, or folders with lots of files in them. However, I have yet to figure this out in a way that can also write back to the source file. Here's my current code:
Param([string[]]$file)
(gc $file) | % {$_ -replace '"', ""} | out-file C:\Users\pfoster\Desktop\Output\test.txt -Fo -En ascii
Currently, this will only accept a single file, and it outputs the result as a txt to a specified file regardless of the source file type (I can change that to CSV easily, but I'd like the script to mirror the source). Ideally, I'd like it to accept files and folders, and to rewrite the source file. I have a feeling this would involve Get-ChildItem, but I'm not sure how to implement that in the current scenario. I've also tried out-file $file and that didn't work either.
Thanks for the help!
For writing the modified content back to the original files, try something like this:
foreach ($file in $ARGS) {
    (Get-Content $file) -replace '"', '' | Out-File $file -Encoding ASCII -Force
}
Use a foreach loop, because you need the file name in more than one place in the pipeline. Reading the content in a subexpression (the parentheses) and then piping the modified content into the Out-File cmdlet makes sure that the output file is only written after the content has already been read.
Don't use a redirection operator ((Get-Content $file) >$file), because that would first open the file for writing (effectively truncating it) and afterwards read the content from the now empty file.
Beware that this approach may cause problems with large files, because each file is read completely into RAM before it is processed and written back to disk. If a file doesn't fit into the available RAM, the computer will start swapping, causing significant performance degradation.
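To also handle folders dropped onto the script (a sketch, not part of the original answer; it assumes the dropped paths arrive in $args and that folders should be searched recursively for *.csv files), you could expand each argument with Get-ChildItem first:
foreach ($item in $args) {
    if (Test-Path $item -PathType Container) {
        # A folder was dropped: process every CSV inside it, recursively (assumption)
        $files = Get-ChildItem -Path $item -Recurse -Include *.csv | Select-Object -ExpandProperty FullName
    } else {
        # A single file was dropped
        $files = $item
    }
    foreach ($file in $files) {
        (Get-Content $file) -replace '"', '' | Out-File $file -Encoding ASCII -Force
    }
}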