powershell append to output

I'm teaching myself PowerShell and have come a cropper already; Google and this site haven't helped me find a solution. I'm compiling a text file of file lists from different directories, but I'm having trouble appending new data to the file.
get-childitem $dir -recurse | % {write-output $_.fullname} >$file
creates my file, but then I want to APPEND new records from the command below:
get-childitem $dir2 -recurse | % {write-output $_.fullname} >$file
I've tried both Add-Content and -Append, but I can't figure out what I'm doing wrong.

Try:
get-childitem $dir -recurse | % {write-output $_.fullname} >> $file
(Tested and works)
The double >> always appends, while a single > overwrites the file each time.
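A quick way to see the difference (using a hypothetical throwaway file):
"first run" > C:\temp\demo.txt
"second run" >> C:\temp\demo.txt
Get-Content C:\temp\demo.txt
The file now holds both lines; run the first command again and only "first run" remains.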
Or change your syntax to use Out-File
get-childitem $dir -recurse | % {write-output $_.fullname} | out-file -filepath $file -Append
(untested)
In this case the variable $file must hold the full path. Like: C:\directory\filename.txt

You can use Out-File to write to a file, adding the append parameter will append to the file.
Get-ChildItem $dir -Recurse | Select-Object -ExpandProperty FullName | Out-File -FilePath $file
Get-ChildItem $dir2 -Recurse | Select-Object -ExpandProperty FullName | Out-File -FilePath $file -Append

Short Answer
The pipeline used here can be eliminated, and usage of Out-File would make life easy:
Out-File -InputObject (Get-ChildItem $dir -Recurse).FullName -FilePath $File
To append would be to simply use the -Append flag:
Out-File -InputObject (Get-ChildItem $dir2 -Recurse).FullName -FilePath $File -Append
Note: This only works in PowerShell v3 and up, as PowerShell v2 relied on the pipeline to expand properties of objects within an array. In that case, the best route is something more like what @david-martin proposed in this same thread.
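For v2, a pipeline-based sketch of the same append (reusing $dir2 and $File from above) would be:
Get-ChildItem $dir2 -Recurse | Select-Object -ExpandProperty FullName | Out-File -FilePath $File -Append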
Long Answer, and Best Practices
In a different thread, Script to Append The File, they were having similar difficulties appending to a file, though they were also using the pipeline in a way that was unnecessary (even more so than in your example).
Their pipeline usage looked like this:
$PathArray | % {$_} | Out-File "C:\SearchString\Output.txt"
Now, again, Out-File has an -Append parameter. Simply modifying their code to have it tagged on at the end took care of things.
Their ForEach-Object statement (the % symbol) is useless in that pipeline and isn't needed (very similar to how yours is used), because the loop only outputs each object without any modification. That is exactly what the pipeline already does by default: pass each object along to the next command.
For more information on the pipeline: About Pipelines
If Update-Help has been run locally, you can also run Get-Help about_pipelines to read that information offline.
Instead of this:
$PathArray | % {$_} | Out-File "C:\SearchString\Output.txt" -Append
We could do this:
$PathArray | Out-File "C:\SearchString\Output.txt" -Append
[Recommended] That example can also eliminate the need for the pipeline altogether, as a pipeline is less efficient when the same work can be done without it. Doing everything you can without the pipeline, or to the left of each pipe, is to "filter left" (see the following article for more on why you should filter left, format right: Filtering Command Output in PowerShell):
Out-File -InputObject $PathArray -FilePath "C:\SearchString\Output.txt" -Append
Note: In the case above, -Append is only needed if the file already exists and is being extended.
Remember: Get-Help, and Read The Friendly Manual (RTFM)
The easiest way to troubleshoot is to check out the help documentation. Use Get-Help to look up whatever you need: parameter sets, available parameters, examples, etc. Make sure to run Update-Help so that detailed documentation is available locally. To see everything:
Update-Help
Get-Help Out-File -Full
For more detailed information that is good to know about data stream/output redirection:
PowerShell redirection operators, such as > and >> (but also redirection of data streams with n> and n>&1), and the available streams per PowerShell version: About Redirection in PowerShell (or: Get-Help about_redirection in PowerShell)
The Tee-Object cmdlet, which acts as a more robust version of Out-File (or: Get-Help Tee-Object in PowerShell)
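As a small illustration of the two (paths here are hypothetical):
Get-ChildItem C:\nonexistent, C:\Windows 2>&1 > C:\temp\all-output.txt
merges the error stream (2) into the success stream so both land in the same file, while
Get-ChildItem C:\Windows | Tee-Object -FilePath C:\temp\listing.txt
writes the listing to a file and still passes the objects down the pipeline.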

Related

Send result of powershell script in file

Am I doing something wrong with the following code? If yes, how can I fix it?
$currentDir = Get-Location
$output = Write-Host "$currentDir\computedMD5.txt"
Get-FileHash $currentDir\* -Algorithm MD5 | Format-List | Out-File -FilePath $output
When trying the same code (of course adjusted to my folders), I get an error that the target file can't be read.
I then tried encapsulating the main operations in brackets and all was fine and dandy:
( Get-FileHash * | Format-List ) | Out-File t.txt -Force
I'm no expert on PS pipelines, but I suspect the problem is something like a race condition: the target file is opened before the file hash is calculated. With the brackets, the code inside runs first and then sends its result to the pipeline.
Although I can only guess that this is the solution, since there are no error details in the question to show what is actually happening for you.
Just a guess, as you don't explain exactly what you're trying to do, but regardless, you're overcomplicating things:
$currentDir = Get-Location
Get-FileHash "$currentDir\*" -Algorithm MD5 | Out-File -FilePath "$currentDir\computedMD5.txt"
It's probably not a good idea to use a wildcard for input, though, unless you specifically code for multiple matches. A better approach if you have multiple files is as follows:
$currentDir = Get-Location
Get-ChildItem -Path $currentDir | ForEach-Object {
    Get-FileHash $_.FullName -Algorithm MD5 | Out-File -FilePath ($_.FullName + "_computedMD5.txt")
}
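If the goal is instead one report for the whole folder, a variant like this (untested sketch, reusing the computedMD5.txt name from the question) should do it; the parentheses force all hashing to finish before the output file is opened, as in the earlier answer:
$currentDir = Get-Location
( Get-ChildItem -Path $currentDir -File |
    ForEach-Object { Get-FileHash $_.FullName -Algorithm MD5 } |
    Format-List ) |
    Out-File -FilePath (Join-Path $currentDir 'computedMD5.txt')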

Powershell - include Get-Date in .log file inside the Add-Content Cmdlet

I've made a small PowerShell script which deletes all files and folders except specific ones.
The script itself works pretty well, but I'm having a lot of trouble getting the logging to work. I'm currently on a good way with the Add-Content cmdlet. The only thing I now want to include is a small Get-Date call inside the Add-Content, so that the log also records the time when the specific file/folder was deleted. But I just can't get it to work properly. Can someone help me?
Here is what I got so far:
Get-ChildItem -Path 'C:\sample\*\notesdata' -Recurse -exclude names.nsf |
Select -ExpandProperty FullName |
Where {$_ -notlike 'C:\sample\*\notesdata\Roaming*'} |
Where {$_ -notlike 'C:\sample\*\notesdata\Archive*'} |
sort length -Descending |
Remove-Item -force -Recurse -Verbose 4>&1 | Add-Content -Path .\ergebnis.log, .\ergebnis2.log -Value (Get-Date)
The file "names.nsf" and the folders "Roaming" and "Archive" must not get deleted.
Thanks for your help :)
Agree with @Olaf. I'm guessing you're getting an error like:
Add-Content : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.
If you want to date the output for logging purposes, you could add a ForEach-Object before the Add-Content command. Something like:
Remove-Item C:\temp\something.txt -Verbose 4>&1 | ForEach-Object{ "$(Get-Date -format g) : $($_)" } | Add-Content C:\temp\something2.txt
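Adapted to your original pipeline, that might look like this (untested):
Get-ChildItem -Path 'C:\sample\*\notesdata' -Recurse -Exclude names.nsf |
    Select-Object -ExpandProperty FullName |
    Where-Object {$_ -notlike 'C:\sample\*\notesdata\Roaming*'} |
    Where-Object {$_ -notlike 'C:\sample\*\notesdata\Archive*'} |
    Sort-Object Length -Descending |
    Remove-Item -Force -Recurse -Verbose 4>&1 |
    ForEach-Object { "$(Get-Date -Format g) : $_" } |
    Add-Content -Path .\ergebnis.log, .\ergebnis2.log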
Let us know if that helps. Thanks.

Powershell Find and Replace Loop, OutOfMemoryException

I have a working PowerShell script to find and replace a few different strings with a new string in thousands of files, without changing the modified date on the files. In any given file there could be hundreds of instances of said strings to replace. The files themselves aren't very large, probably ranging from 1-50MB (a quick glance at the directory I am testing with shows the largest as ~33MB).
I'm running the script inside a Server 2012 R2 VM with 4 vCPUs and 4GB of RAM. I have set the MaxMemoryPerShellMB value for PowerShell to 3GB. As mentioned previously, the script works, but after 2-4 hours PowerShell will start throwing OutOfMemoryExceptions and crash. The script is 'V2 friendly' and I haven't adapted it to V3+, but I doubt that matters too much.
My question is whether or not the script can be improved to prevent/eliminate the memory exceptions I am running into at the moment. I don't mind if it runs slower, as long as it can get the job done without having to check back every couple of hours and restart it.
$i=0
$all = Get-ChildItem -Recurse -Include *.txt
$scriptfiles = Select-String -Pattern string1,string2,string3 $all
$output = "C:\Temp\scriptoutput.txt"
foreach ($file in $scriptFiles)
{
$filecreate=(Get-ChildItem $file.Path).creationtime
$fileaccess=(Get-ChildItem $file.Path).lastaccesstime
$filewrite=(Get-ChildItem $file.Path).lastwritetime
"$file.Path,Created: $filecreate,Accessed: $fileaccess,Modified: $filewrite" | out-file -FilePath $output -Append
(Get-Content $file.Path) | ForEach-Object {$_ -replace "string1", "newstring" `
-replace "string2", "newstring" `
-replace "string3", "newstring"
} | Set-Content $file.Path
(Get-ChildItem $file.Path).creationtime=$filecreate
(Get-ChildItem $file.Path).lastaccesstime=$fileaccess
(Get-ChildItem $file.Path).lastwritetime=$filewrite
$filecreate=(Get-ChildItem $file.Path).creationtime
$fileaccess=(Get-ChildItem $file.Path).lastaccesstime
$filewrite=(Get-ChildItem $file.Path).lastwritetime
"$file.Path,UPDATED Created: $filecreate,UPDATED Accessed: $fileaccess,UPDATED Modified: $filewrite" | out-file -FilePath $output -Append
$i++}
Any comments, criticisms, and suggestions welcomed.
Thanks
The biggest issue I can see is that you are repeatedly fetching the file for every property you query. Replace that with one call per loop pass and reuse the result during the pass. Also, Out-File is one of the slower methods of writing data to a file.
$output = "C:\Temp\scriptoutput.txt"
$scriptfiles = Get-ChildItem -Recurse -Include *.txt |
Select-String -Pattern string1,string2,string3 |
Select-Object -ExpandProperty Path
$scriptfiles | ForEach-Object{
$file = Get-Item $_
# Save current file times
$filecreate=$file.creationtime
$fileaccess=$file.lastaccesstime
$filewrite=$file.lastwritetime
"$file,Created: $filecreate,Accessed: $fileaccess,Modified: $filewrite"
# Update content.
(Get-Content $file) -replace "string1", "newstring" `
-replace "string2", "newstring" `
-replace "string3", "newstring" | Set-Content $file
# Write all the original times back.
$file.creationtime=$filecreate
$file.lastaccesstime=$fileaccess
$file.lastwritetime=$filewrite
# Verify the changes... Should not be required but it is what you were doing.
$filecreate=$file.creationtime
$fileaccess=$file.lastaccesstime
$filewrite=$file.lastwritetime
"$file,UPDATED Created: $filecreate,UPDATED Accessed: $fileaccess,UPDATED Modified: $filewrite"
} | Set-Content $output
Not tested but should be fine.
Depending on what your replacements actually look like, you could probably save some time there as well. Test first before running in production, obviously.
I removed the counter you had, since it appeared nowhere else in the code.
Your logging could easily be CSV-based, since you have all the objects ready to go, but I just want to be sure we are on the right track before we go too far.
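For example, the logging side could emit objects instead of strings and go straight to Export-Csv (untested sketch; [PSCustomObject] needs v3+):
$scriptfiles | ForEach-Object{
    $file = Get-Item $_
    [PSCustomObject]@{
        Path     = $file.FullName
        Created  = $file.CreationTime
        Accessed = $file.LastAccessTime
        Modified = $file.LastWriteTime
    }
} | Export-Csv -Path C:\Temp\scriptoutput.csv -NoTypeInformation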

Printing recursive file and folder count in powershell?

I am trying to compare two sets of folders to determine discrepancies in file and folder counts. I have found a command that will output the data I am looking for, but cannot find a way to print it to a file. Here is the command I am using currently:
dir -recurse | ?{ $_.PSIsContainer } | %{ Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
This is getting me the desired data but I need to find a way to print this to a text file. Any help would be greatly appreciated.
The problem is the use of the Write-Host cmdlet, which bypasses almost all pipeline handling. In this case, it is also unnecessary, as any output that isn't used by a cmdlet is automatically passed into the pipeline (or to the console if there's nothing further).
Here is your code rewritten to output a string to the pipeline instead of using Write-Host. This uses PowerShell's string subexpression operator $(). At the console, it will look the same, but it can be piped to a file or other cmdlet.
gci -Recurse -Directory | %{ "$($_.FullName) $((gci $_.FullName).Count)" }
You may also find it useful to put the data into a PSCustomObject. Once you have the object, you can do further processing such as sorting or filtering based on the count.
$folders = gci -Recurse -Directory | %{ [PSCustomObject]@{Name=$_.FullName; Count=(dir $_.FullName).Count} }
$folders | sort Count
$folders | where Count -ne 0
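Either form can then be sent to a file (file names here are placeholders):
$folders | Sort-Object Count | Out-File C:\temp\folder-counts.txt
$folders | Export-Csv C:\temp\folder-counts.csv -NoTypeInformation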
Some notes on idioms: dir is an alias for Get-ChildItem, as is gci. Using gci's -Directory parameter is the best way to list only directories, rather than the PSIsContainer check. Finally, Measure-Object is unnecessary: you can take the Count of the file listing directly.
See also Write-Host Considered Harmful from the inventor of PowerShell

Using PowerShell, read multiple known file names, append text of all files, create and write to one output file

I have five .sql files and know the name of each file. For this example, call them one.sql, two.sql, three.sql, four.sql and five.sql. I want to append the text of all files and create one file called master.sql. How do I do this in PowerShell? Feel free to post multiple answers to this problem because I am sure there are several ways to do this.
My attempt does not work and creates a file with several hundred thousand lines.
PS C:\sql> get-content '.\one.sql' | get-content '.\two.sql' | get-content '.\three.sql' | get-content '.\four.sql' | get-content '.\five.sql' | out-file -encoding UNICODE master.sql
Get-Content one.sql,two.sql,three.sql,four.sql,five.sql > master.sql
Note that > is equivalent to Out-File -Encoding Unicode. I only tend to use Out-File when I need to specify a different encoding.
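For example, to get UTF-8 output instead of the default Unicode:
Get-Content one.sql,two.sql,three.sql,four.sql,five.sql | Out-File master.sql -Encoding UTF8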
There are some good answers here, but if you have a whole lot of files and maybe don't know all of the names, this is what I came up with:
$vara = get-childitem -name "path"
$varb = foreach ($a in $vara) {gc "path\$a"}
For example:
$vara = get-childitem -name "c:\users\test"
$varb = foreach ($a in $vara) {gc "c:\users\test\$a"}
You can obviously pipe this directly into | Add-Content or whatever, but I like to capture things in variables so I can manipulate them later on.
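Piping straight through, without the variables, would look something like this (same hypothetical path):
Get-ChildItem -Name "c:\users\test" | ForEach-Object { Get-Content "c:\users\test\$_" } | Add-Content master.sql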
See if this works better
get-childitem "one.sql","two.sql","three.sql","four.sql","five.sql" | get-content | out-file -encoding UNICODE master.sql
I needed something similar. Chris Berry's post helped, but I think this is more efficient:
gci -name "*PathToFiles*" | gc > master.sql
The first part, gci -name "*PathToFiles*", gets you your file list. This can be done with wildcards to get just your .sql files, i.e. gci -name "\\share\folder\*.sql"
It then pipes to Get-Content and redirects the output to your master.sql file. As noted by Keith Hill, you can use Out-File in place of > to better control your output if needed.
I think the logical way of solving this is to use Add-Content:
$files = Get-ChildItem '.\one.sql', '.\two.sql', '.\three.sql', '.\four.sql', '.\five.sql'
$files | foreach { Get-Content $_ | Add-Content '.\master.sql' -encoding UNICODE }
However, Get-Content is usually very slow when reading multiple very large files. If that's your case, this article could help: http://keithhill.spaces.live.com/blog/cns!5A8D2641E0963A97!756.entry
What about:
Get-Content .\one.sql,.\two.sql,.\three.sql,.\four.sql,.\five.sql | Set-Content .\master.sql
Here is how I concatenate the sql files from the Sql folder:
# Set the current location of the script to use relative path
Set-Location $PSScriptRoot
# Concatenate all the sql files
$concatSql = Get-Content -Path .\Sql\*.sql
# Write the combined sql to a single file (note: Add-Content appends if the file already exists)
Add-Content -Path concatFile.sql -Value $concatSql