All,
There is an application which generates export dumps. I need to write a script that compares the previous day's dump against the latest one and, if they differ, does some basic file handling (moving and deleting, that sort of thing).
I have tried to find a suitable way of doing this, and the method I tried was:
$var_com = diff (Get-Content D:\local\prodexport2 -Encoding Byte) (Get-Content D:\local\prodexport1 -Encoding Byte)
I tried the Compare-Object cmdlet as well. Either way I see very high memory usage and eventually get a System.OutOfMemoryException after a few minutes. Has anyone done something similar? Some thoughts, please.
There was a thread which mentioned a hash comparison, which I have no idea how to go about.
Thanks in advance, folks
Osp
With PowerShell 4 you can use native cmdlets to do this:
function CompareFiles {
    param(
        [string]$Filepath1,
        [string]$Filepath2
    )

    if ((Get-FileHash $Filepath1).Hash -eq (Get-FileHash $Filepath2).Hash) {
        Write-Host 'Files Match' -ForegroundColor Green
    } else {
        Write-Host 'Files do not match' -ForegroundColor Red
    }
}
PS C:\> CompareFiles .\20131104.csv .\20131104-copy.csv
Files Match
PS C:\> CompareFiles .\20131104.csv .\20131107.csv
Files do not match
You could easily modify the above function to return a $true or $false value if you want to use it programmatically on a larger scale.
EDIT
After seeing this answer, I just wanted to supply a larger-scale version that simply returns true or false:
function CompareFiles
{
    param
    (
        [parameter(
            Mandatory = $true,
            HelpMessage = "Specifies the 1st file to compare. Make sure it's an absolute path with the file name and its extension."
        )]
        [string]
        $file1,

        [parameter(
            Mandatory = $true,
            HelpMessage = "Specifies the 2nd file to compare. Make sure it's an absolute path with the file name and its extension."
        )]
        [string]
        $file2
    )

    ( Get-FileHash $file1 ).Hash -eq ( Get-FileHash $file2 ).Hash
}
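Applied to the original question's daily-dump scenario, a minimal sketch using the function above might look like the following (the archive folder and the move/copy actions are placeholders, not something from the original post):

# Hypothetical wrapper around CompareFiles for the daily dump check
$previous = 'D:\local\prodexport1'   # yesterday's dump
$latest   = 'D:\local\prodexport2'   # today's dump

if (-not (CompareFiles $previous $latest)) {
    # the dumps differ: archive the old one and keep the latest as the new baseline
    Move-Item -Path $previous -Destination 'D:\local\archive' -Force
    Copy-Item -Path $latest -Destination $previous -Force
}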
You could use fc.exe. It comes with Windows. Here's how you would use it:
fc.exe /b d:\local\prodexport2 d:\local\prodexport1 > $null
if (!$?) {
    "The files are different"
}
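For what it's worth, fc.exe also sets an exit code (0 when the files match, non-zero otherwise, as far as I recall), so you can test $LASTEXITCODE instead of $? if you prefer:

fc.exe /b d:\local\prodexport2 d:\local\prodexport1 > $null
if ($LASTEXITCODE -ne 0) {
    "The files are different"
}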
Another method is to compare the MD5 hashes of the files:
$Filepath1 = 'c:\testfiles\testfile.txt'
$Filepath2 = 'c:\testfiles\testfile1.txt'

$hashes =
    foreach ($Filepath in $Filepath1, $Filepath2)
    {
        $MD5 = [Security.Cryptography.HashAlgorithm]::Create( "MD5" )
        $stream = ([IO.StreamReader]"$Filepath").BaseStream
        -join ($MD5.ComputeHash($stream) |
            ForEach { "{0:x2}" -f $_ })
        $stream.Close()
    }

if ($hashes[0] -eq $hashes[1])
{'Files Match'}
A while back I wrote an article on a buffered comparison routine to compare two files with PowerShell:
function FilesAreEqual {
    param(
        [System.IO.FileInfo] $first,
        [System.IO.FileInfo] $second,
        [uint32] $bufferSize = 524288)

    if ($first.Length -ne $second.Length) { return $false }
    if ( $bufferSize -eq 0 ) { $bufferSize = 524288 }

    $fs1 = $first.OpenRead()
    $fs2 = $second.OpenRead()

    $one = New-Object byte[] $bufferSize
    $two = New-Object byte[] $bufferSize
    $equal = $true

    do {
        $bytesRead = $fs1.Read($one, 0, $bufferSize)
        $fs2.Read($two, 0, $bufferSize) | Out-Null

        if ( -Not [System.Linq.Enumerable]::SequenceEqual($one, $two)) {
            $equal = $false
        }
    } while ($equal -and $bytesRead -eq $bufferSize)

    $fs1.Close()
    $fs2.Close()

    return $equal
}
You can use it by:
FilesAreEqual c:\temp\test.html c:\temp\test.html
A hash (like MD5) needs to traverse the entire file to do the hash calculation. This script returns as soon as it sees a difference in the buffer. It compares the buffers using LINQ, which is faster than native PowerShell.
if ( (Get-FileHash c:\testfiles\testfile1.txt).Hash -eq (Get-FileHash c:\testfiles\testfile2.txt).Hash ) {
    Write-Output "Files match"
} else {
    Write-Output "Files do not match"
}
I am creating usernames as follows: the first 3 letters of the first name, then 4 randomly generated digits. Ryan Smith = RYA4859. I am getting the random number from this PowerShell command:
Get-Random -Minimum 1000 -Maximum 10000
I need to know how to create a script that will add the username to a .txt file after it has been generated. I also want the script to first check the .txt file to see if the randomly generated number already exists and, if it does, generate a new 4-digit number that does not exist, then add that to the .txt file.
The flow should be:
generate random 4 digit number
check txt file if number exists
if yes - generate new number
if no - append file and add generated number to file
You want to run a do...until loop that runs until the randomly generated number doesn't exist in your text file:
$file = "C:\users.txt"
$userId = "RYA"
# get the contents of your text file
$existingUserList = Get-Content $file
do
{
$userNumber = Get-Random -Minimum 1000 -Maximum 10000
# remove all alpha characters in the file, so only an array of numbers remains
$userListReplaced = $existingUserList -replace "[^0-9]" , ''
# the loop runs until the randomly generated number is not in the array of numbers
} until (-not ($userNumber -in $userListReplaced))
# concatenates your user name with the random number
$user = $userId + $userNumber
# appends the concatenated username into the text file
$user | Out-File -FilePath $file -Append
Without the 3-character prefix:
$file = "C:\users.txt"

# get the contents of your text file
$existingUserList = Get-Content $file

do
{
    $userNumber = Get-Random -Minimum 1000 -Maximum 10000

    # remove all alpha characters in the file, so only an array of numbers remains
    $userListReplaced = $existingUserList -replace "[^0-9]" , ''

    # the loop runs until the randomly generated number is not in the array of numbers
} until (-not ($userNumber -in $userListReplaced))

# appends the generated number to the text file
$userNumber | Out-File -FilePath $file -Append
Note: Hashtables in general will find keys in less time than finding a matching element in an unsorted array, and this difference in performance increases as the number of elements grows. While a binary search on a sorted array may come closer in performance, the sorting process itself can be a major performance hit and adds complexity to the code.
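As a rough illustration of that difference, here is a quick, unscientific timing sketch (the sample data is made up for demonstration; actual numbers will vary by machine):

# build 100,000 sample user numbers as an array and as hashtable keys
$numbers   = 1000..101000 | ForEach-Object { "$_" }
$hashTable = @{}
foreach ($n in $numbers) { $hashTable[$n] = $true }

# array search scans elements one by one until it finds a match
(Measure-Command { '100999' -in $numbers }).TotalMilliseconds

# hashtable lookup hashes the key and jumps straight to the entry
(Measure-Command { $hashTable.ContainsKey('100999') }).TotalMilliseconds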
The main difference between the version of the code described in the comment on the question and the following code is that I'm appending the new user name to the file instead of overwriting the file, and I added a loop near the end that repeatedly asks whether the code should continue.
function RandomDigits {
    [CmdletBinding()]
    param (
        [Parameter()]
        [int]$DigitCount = 2
    )
    $RandString = [string](Get-Random -Minimum 100000 -Maximum 10000000)
    $RandString.Substring($RandString.Length - $DigitCount)
}

function GenUserName {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$Prefix
    )
    "$Prefix$(RandomDigits 4)"
}

function ReadAndMatchRegex {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$Regex,
        [Parameter(Mandatory = $true, Position = 1)]
        [string]$Prompt,
        [Parameter(Mandatory = $false, Position = 2)]
        [string]$ErrMsg = "Incorrect, please enter needed info (Type 'exit' to exit)."
    )
    $FirstPass = $true
    do {
        if (-not $FirstPass) {
            Write-Host $ErrMsg -ForegroundColor Red
            Write-Host
        }
        $ReadText = Read-Host -Prompt $Prompt
        $ReadText = $ReadText.ToUpper()
        if ($ReadText -eq 'exit') { exit }
        $FirstPass = $false
    } until ($ReadText -match $Regex)
    $ReadText
}

$Usernames = @{}
$UsernameFile = "$PSScriptRoot\Usernames.txt"
if (Test-Path -Path $UsernameFile -PathType Leaf) {
    foreach ($line in Get-Content $UsernameFile) { $Usernames[$line] = $true }
}

do {
    Write-Host
    $UserPrefix = ReadAndMatchRegex '^[A-Z]{3}$' "Please enter 3 letters for user's ID"
    do {
        $NewUserName = GenUserName $UserPrefix
    } while ($Usernames.ContainsKey($NewUserName))
    $NewUserName | Out-File $UsernameFile -Append
    $Usernames[$NewUserName] = $true
    $Usernames.Keys
    $Continue = ReadAndMatchRegex '^(Y|y|YES|yes|Yes|N|n|NO|no|No)$' 'Continue?[Y/N]'
} while ($Continue -match '^(Y|y|YES|yes|Yes)$')
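For reference, a quick interactive check of the two helpers on their own (a hypothetical session; your random digits will differ):

PS C:\> RandomDigits 4
4859
PS C:\> GenUserName 'RYA'
RYA4859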
Looking for some pointers / tips to increase the speed and/or efficacy of the below. I'd be open to other methods, but have only dabbled in PowerShell, cmd and Python.
Also, credit where credit is due: this is a hack-job on the following: https://stackoverflow.com/a/44183234/12834479
Rather than working locally, I'm hitting a network share over VPN with abysmal connection speeds.
Roughly, it's working at 8 secs / PDF.
The goal is to ensure each PDF is readable by Adobe. Images saved as PDFs (but not real PDFs) will open in some PDF software, but Adobe hates them. I have the method to convert them; my rate limiter is identifying them.
Adobe PDFs - start with %PDF
Some bank PDFs - start with "blank space" then %PDF
3rd party software - junk headers, but %PDF is within the document
$items = Get-ChildItem | Where-Object {$_.Extension -eq ".pdf"}
$arrary = @()
$logFile = "RESULTS_$(Get-Date -Format yyyyMMdd).log"
$badCounter = 0
$goodCounter = 0
$msg = "`n`nProcessing " + $items.count + " files... "
Write-Host -nonewline -foregroundcolor Yellow $msg
foreach ($item in $items)
{
trap { Write-Output "Error trapped: $_"; continue; }
try {
$pdfText = Get-Content $item -raw
$ptr3 = '%PDF'
if ('%PDF' -ne $pdfText.SubString(([System.Math]::Max(0,$pdfText.IndexOf($ptr3))),4)) { $arrary+= "$item |-failed" >>$logfile;$badCounter += 1; $badCounter} else { $goodCounter += 1; $goodCounter}
continue;}
catch [System.Exception]{write-output "$item $_";}}
$totalCounter = $badCounter + $goodCounter
Write-Output $arrary >> $logFile
1..3 | %{ Write-Output "" >> $logFile }
Write-Output "Total: $totalCounter / BAD: $badCounter / GOOD: $goodCounter" >> $logFile
Write-Output "DONE!`n`n"
If it makes any difference, I'm currently running PS version 7.1.3, but I also have 5.1.18 on local.
Actually, PDF files aren't plaintext files at all, but binary files, so you should not read them in as a string.
What you are looking for is the FourCC magic number in the file. This four-character code serves as a magic number to identify the file type.
For PDF files, these 4 bytes are 0x25, 0x50, 0x44, 0x46 ("%PDF") and the file should start with those bytes.
For those true PDF files, you could test with:
[byte[]]$fourCC = Get-Content -Encoding Byte -ReadCount 4 -TotalCount 4 -Path 'X:\TheFile.pdf'
if ([System.Text.Encoding]::ASCII.GetString($fourCC) -ceq '%PDF') {
    Write-Host "This is a true PDF file"
}
However, as you say "Bank pdf's usually start with a blank space", to also consider those files "good", you can do:
[byte[]]$sixCC = Get-Content -Encoding Byte -ReadCount 6 -TotalCount 6 -Path 'X:\TheFile.pdf'
if ([System.Text.Encoding]::ASCII.GetString($sixCC) -cmatch '%PDF') {
    Write-Host "This is a PDF file"
}
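A side note, since the question mentions PowerShell 7.1.3: -Encoding Byte only exists in Windows PowerShell (5.1 and earlier); in PowerShell 6 and later the equivalent switch is -AsByteStream, so the same check would look roughly like this:

[byte[]]$fourCC = Get-Content -AsByteStream -TotalCount 4 -Path 'X:\TheFile.pdf'
if ([System.Text.Encoding]::ASCII.GetString($fourCC) -ceq '%PDF') {
    Write-Host "This is a true PDF file"
}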
If you also want to treat files where "%PDF" is found anywhere in the file as "good", you will need to read the whole file as a string, but with a one-to-one mapping of bytes to characters.
For that, you can use the helper function below:
function ConvertTo-BinaryString {
    # converts the bytes of a file to a string that has a
    # 1-to-1 mapping back to the file's original bytes.
    # Useful for performing binary regular expressions.
    Param (
        [Parameter(Mandatory = $True, ValueFromPipeline = $True, Position = 0)]
        [ValidateScript( { Test-Path $_ -PathType Leaf } )]
        [String]$Path
    )
    # Note: Codepage 28591 returns a 1-to-1 char to byte mapping
    $Encoding = [Text.Encoding]::GetEncoding(28591)
    $Stream = [System.IO.FileStream]::new($Path, 'Open', 'Read')
    $StreamReader = [System.IO.StreamReader]::new($Stream, $Encoding)
    $BinaryText = $StreamReader.ReadToEnd()
    $StreamReader.Close()
    $Stream.Close()
    return $BinaryText
}
Next, you can use that function as:
$binString = ConvertTo-BinaryString -Path 'X:\TheFile.pdf'
if ($binString.IndexOf("%PDF") -ge 0) {
    Write-Host "This is a PDF file"
}
Putting it all together and assuming you want all files marked as .PDF files where the magic number '%PDF' (case-sensitive) can be found anywhere in the file:
function ConvertTo-BinaryString {
    # converts the bytes of a file to a string that has a
    # 1-to-1 mapping back to the file's original bytes.
    # Useful for performing binary regular expressions.
    Param (
        [Parameter(Mandatory = $True, ValueFromPipeline = $True, Position = 0)]
        [ValidateScript( { Test-Path $_ -PathType Leaf } )]
        [String]$Path
    )
    # Note: Codepage 28591 returns a 1-to-1 char to byte mapping
    $Encoding = [Text.Encoding]::GetEncoding(28591)
    $Stream = [System.IO.FileStream]::new($Path, 'Open', 'Read')
    $StreamReader = [System.IO.StreamReader]::new($Stream, $Encoding)
    $BinaryText = $StreamReader.ReadToEnd()
    $StreamReader.Close()
    $Stream.Close()
    return $BinaryText
}

$badCounter  = 0
$goodCounter = 0
$logFile     = "RESULTS_{0:yyyyMMdd}.log" -f (Get-Date)

# get an array of pdf file FullNames
$files = @(Get-ChildItem -File -Filter '*.pdf').FullName

Write-Host "Processing $($files.Count) files... " -ForegroundColor Yellow

# loop through the array, test if '%PDF' is found and output strings for the log file
$result = foreach ($item in $files) {
    $pdfText = ConvertTo-BinaryString -Path $item
    if ($pdfText.IndexOf("%PDF") -ge 0) {
        $goodCounter++
        "Success - $item"
    }
    else {
        $badCounter++
        "Fail - $item"
    }
}

# write the output to the log file
$result | Set-Content -Path $logFile
"=" * 25 | Add-Content -Path $logFile
"BAD: $badCounter" | Add-Content -Path $logFile
"GOOD: $goodCounter" | Add-Content -Path $logFile
"Total: $($files.Count)" | Add-Content -Path $logFile

Write-Host "DONE!" -ForegroundColor Green
I wrote a script that pulls data from a .properties file (basically a config file). Some of the data in the properties file contains environment variables (e.g. %UserProfile%), so I run it through a function (Resolve-EnvVariable) that replaces the environment variable with its actual value. The replace works perfectly, but somehow the data seems to be altered.
When I try to use the values that have been run through the function, they no longer work (see results down below).
This is the file contents of c:\work\test.properties
types="*.txt"
in="%UserProfile%\Downloads"
This is my PowerShell Script
Clear-Host

# Read the properties file and replace the parameters when specified
if (Test-Path C:\work\test.properties) {
    $propertiesFile = Get-Content C:\work\test.properties
    Write-Host "Parameters will be substituted from properties file" -ForegroundColor Yellow
    foreach ($line in $propertiesFile) {
        Write-Host ("from Properties file $line")
        $propSwitch = $line.Split("=")[0]
        $propValue = Resolve-EnvVariable($line.Split("=")[1])
        switch ($propSwitch) {
            "types" { $types = $propValue }
            "in" { $in = $propValue }
        }
    }
}
Write-Host ("After running through function `n in=" + $in + "<- types=" + $types + "<-")
# This function resolves environment variables
Function Resolve-EnvVariable {
    [cmdletbinding()]
    Param(
        [Parameter(Position = 0, ValueFromPipeline = $True, Mandatory = $True,
            HelpMessage = "Enter string with env variable i.e. %APPDATA%")]
        [ValidateNotNullOrEmpty()]
        [string]$String
    )
    Begin {
        Write-Verbose "Starting $($myinvocation.mycommand)"
    } #Begin
    Process {
        #if string contains a % then process it
        if ($String -match "%\S+%") {
            Write-Verbose "Resolving environmental variables in $String"
            #split string into an array of values
            $values = $String.split("%") | Where-Object { $_ }
            foreach ($text in $values) {
                #find the corresponding value in ENV:
                Write-Verbose "Looking for $text"
                [string]$replace = (Get-Item env:$text -ErrorAction "SilentlyContinue").Value
                if ($replace) {
                    #if found append it to the new string
                    Write-Verbose "Found $replace"
                    $newstring += $replace
                }
                else {
                    #otherwise append the original text
                    $newstring += $text
                }
            } #foreach value
            Write-Verbose "Writing revised string to the pipeline"
            #write the string back to the pipeline
            Write-Output $newstring
        } #if
        else {
            #skip the string and write it back to the pipeline
            Write-Output $String
        }
    } #Process
    End {
        Write-Verbose "Ending $($myinvocation.mycommand)"
    } #End
} #end Resolve-EnvVariable
# Hardcoded values work
$test1 = Get-ChildItem -Path "C:\Users\Paul\Downloads" -Recurse -Include "*.txt"
# Values pulled and updated through function do not work
$test2 = Get-ChildItem -Path $in -Recurse -Include $types
# If I manually assign the values, it works
$in = "C:\Users\Paul\Downloads"
$types = "*.txt"
$test3 = Get-ChildItem -Path $in -Recurse -Include $types
foreach ($test in $test1) { write-host "test1 $test" }
foreach ($test in $test2) { write-host "test2 $test" }
foreach ($test in $test3) { write-host "test3 $test" }
Results
Parameters will be substituted from properties file
from Properties file types="*.txt"
from Properties file in="%UserProfile%\Downloads"
After running through function
in="C:\Users\Paul\Downloads"<- types="*.txt"<-
test1 C:\Users\Paul\Downloads\Test\testPaul.txt
test1 C:\Users\Paul\Downloads\Test2\File1.txt
test3 C:\Users\Paul\Downloads\Test\testPaul.txt
test3 C:\Users\Paul\Downloads\Test2\File1.txt
Two alternatives:
1. Use Environment.ExpandEnvironmentVariables()
If you switched to unquoted string values and escaped your backslashes, it would be as simple as piping the file to ConvertFrom-StringData, at which point you can expand the variable values with Environment.ExpandEnvironmentVariables():
Properties file:
types=*.txt
in=%UserProfile%\\Downloads
Script:
# Convert file to hashtable
$properties = Get-Content file.properties -Raw | ConvertFrom-StringData

# Copy values to a new hashtable, but expand env vars first
$expanded = @{}
foreach ($entry in $properties.GetEnumerator()) {
    $expanded[$entry.Key] = [Environment]::ExpandEnvironmentVariables($entry.Value)
}
Should give you the desired values:
PS C:\> $expanded
Name Value
---- -----
in C:\Users\username\Downloads
types *.txt
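To tie that back to the question's Get-ChildItem call, you could then use the expanded values directly (the keys are the ones from the sample properties file above):

$test = Get-ChildItem -Path $expanded['in'] -Recurse -Include $expanded['types']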
2. Use and dot-source a PowerShell script for your properties
This is lifted straight out of a page of the original Exchange Server modules - place all configuration variables in separate scripts, which are in turn dot-sourced when initializing a new session:
Properties file:
$types = "*.txt"
$in = Join-Path $env:USERPROFILE Downloads
Script:
# dot source the variables
. (Join-Path $PSScriptRoot properties.ps1)
# do the actual work
Get-ChildItem $in -Include $types
I am trying to specify my file path in the script that I got from here: https://gallery.technet.microsoft.com/scriptcenter/Outputs-directory-size-964d07ff
The current file path points to the directory, but I am unable to locate the variable that I need to change in order to specify a different path.
# Get-DirStats.ps1
# Written by Bill Stewart (bstewart@iname.com)
# Outputs file system directory statistics.
#requires -version 2
<#
.SYNOPSIS
Outputs file system directory statistics.
.DESCRIPTION
Outputs file system directory statistics (number of files and the sum of all file sizes) for one or more directories.
.PARAMETER Path
Specifies a path to one or more file system directories. Wildcards are not permitted. The default path is the current directory (.).
.PARAMETER LiteralPath
Specifies a path to one or more file system directories. Unlike Path, the value of LiteralPath is used exactly as it is typed.
.PARAMETER Only
Outputs statistics for a directory but not any of its subdirectories.
.PARAMETER Every
Outputs statistics for every directory in the specified path instead of only the first level of directories.
.PARAMETER FormatNumbers
Formats numbers in the output object to include thousands separators.
.PARAMETER Total
Outputs a summary object after all other output that sums all statistics.
#>
[CmdletBinding(DefaultParameterSetName="Path")]
param(
[parameter(Position=0,Mandatory=$false,ParameterSetName="Path",ValueFromPipeline =$true)]
$Path=(get-location).Path,
[parameter(Position=0,Mandatory=$true,ParameterSetName="LiteralPath")]
[String[]] $LiteralPath,
[Switch] $Only,
[Switch] $Every,
[Switch] $FormatNumbers,
[Switch] $Total
)
begin {
  $ParamSetName = $PSCmdlet.ParameterSetName
  if ( $ParamSetName -eq "Path" ) {
    $PipelineInput = ( -not $PSBoundParameters.ContainsKey("Path") ) -and ( -not $Path )
  }
  elseif ( $ParamSetName -eq "LiteralPath" ) {
    $PipelineInput = $false
  }

  # Script-level variables used with -Total.
  [UInt64] $script:totalcount = 0
  [UInt64] $script:totalbytes = 0

  # Returns a [System.IO.DirectoryInfo] object if it exists.
  function Get-Directory {
    param( $item )
    if ( $ParamSetName -eq "Path" ) {
      if ( Test-Path -Path $item -PathType Container ) {
        $item = Get-Item -Path $item -Force
      }
    }
    elseif ( $ParamSetName -eq "LiteralPath" ) {
      if ( Test-Path -LiteralPath $item -PathType Container ) {
        $item = Get-Item -LiteralPath $item -Force
      }
    }
    if ( $item -and ($item -is [System.IO.DirectoryInfo]) ) {
      return $item
    }
  }

  # Filter that outputs the custom object with formatted numbers.
  function Format-Output {
    process {
      $_ | Select-Object Path,
        @{Name="Files"; Expression={"{0:N0}" -f $_.Files}},
        @{Name="Size"; Expression={"{0:N0}" -f $_.Size}}
    }
  }

  # Outputs directory statistics for the specified directory. With -recurse,
  # the function includes files in all subdirectories of the specified
  # directory. With -format, numbers in the output objects are formatted with
  # the Format-Output filter.
  function Get-DirectoryStats {
    param( $directory, $recurse, $format )
    Write-Progress -Activity "Get-DirStats.ps1" -Status "Reading '$($directory.FullName)'"
    $files = $directory | Get-ChildItem -Force -Recurse:$recurse | Where-Object { -not $_.PSIsContainer }
    if ( $files ) {
      Write-Progress -Activity "Get-DirStats.ps1" -Status "Calculating '$($directory.FullName)'"
      $output = $files | Measure-Object -Sum -Property Length | Select-Object `
        @{Name="Path"; Expression={$directory.FullName}},
        @{Name="Files"; Expression={$_.Count; $script:totalcount += $_.Count}},
        @{Name="Size"; Expression={$_.Sum; $script:totalbytes += $_.Sum}}
    }
    else {
      $output = "" | Select-Object `
        @{Name="Path"; Expression={$directory.FullName}},
        @{Name="Files"; Expression={0}},
        @{Name="Size"; Expression={0}}
    }
    if ( -not $format ) { $output } else { $output | Format-Output }
  }
}
... the rest of the code did not seem relevant
You either specify the $Path parameter when calling the script, or change the line that sets its default value. I've highlighted where this is below.
[CmdletBinding(DefaultParameterSetName="Path")]
param(
[parameter(Position=0,Mandatory=$false,ParameterSetName="Path",ValueFromPipeline =$true)]
$Path=(get-location).Path, ################ PATH IS SET HERE ##############
[parameter(Position=0,Mandatory=$true,ParameterSetName="LiteralPath")]
[String[]] $LiteralPath,
[Switch] $Only,
When calling the script:
PS C:\> .\myscript.ps1 -Path "c:\temp"
What you call the value depends on where you are in the script.
The "main" part of this script accepts one of a couple of parameters, Path and LiteralPath, with Path used in preference to LiteralPath. If neither is specified, the current working directory is the starting point. Passing different arguments to the script is the easiest technique, and the author's intended usage.
BUT...
Up in that first function, after the parameters are bound in the "begin" section, the actual path is "$item".
Inside Get-DirectoryStats it's referred to as $directory.
There are places where it's referred to as $_.
There are a lot of articles on the topic of "scope". Here's one: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.core/about/about_scopes
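As a minimal, stand-alone illustration of scope (not taken from the script above): a variable assigned inside a function lives in that function's local scope unless you qualify it, which is why the same path can show up under different names at different levels.

$path = 'C:\temp'              # caller's variable

function Set-LocalPath {
    $path = 'D:\data'          # creates a new variable in the function's local scope
}

function Set-ScriptPath {
    $script:path = 'D:\data'   # explicitly targets the script-scoped $path (when run from a .ps1 file)
}

Set-LocalPath
$path                          # still C:\temp

Set-ScriptPath
$path                          # now D:\data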
I have several files that I need to add a "!" to the beginning of, just on the first line. I still need to keep the first line's content, just add a "!" as the first character.
Any help would be really appreciated.
Thanks!
Edit:
The only thing I could figure out so far was to do the following:
$a = Get-Content 'hh_Regulars3.csv'
$b = '!'
Set-Content 'hh_Regulars3-new.csv' -value $b,$a
This just added the "!" to the top of the file, instead of to the beginning of the first line.
You sent an array to Set-Content with $b,$a. Each array item is given its own line, as you have seen. It would be displayed the same way at the prompt if executed.
As long as the file is not too big, read it in as one string and add the character in:
$path = 'hh_Regulars3.csv'
"!" + (Get-Content $path -Raw) | Set-Content $path
If you only have PowerShell 2.0, then Out-String works in place of -Raw:
"!" + (Get-Content $path | Out-String) | Set-Content $path
The parentheses are important: they make sure the file is read in fully before it goes through the pipeline, which lets us both read and write the same file in one pipeline.
If the file is larger, look into using StreamReaders and StreamWriters; a rough sketch follows below. They would also have to be used if the trailing newline created by Add-Content and Set-Content is not wanted.
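A rough sketch of that StreamReader/StreamWriter approach, reading line by line and writing to a new file without adding a trailing newline (the file names are the ones from the question; the rest is illustrative):

$inPath  = 'hh_Regulars3.csv'
$outPath = 'hh_Regulars3-new.csv'

$reader = [System.IO.StreamReader]::new((Resolve-Path $inPath).Path)
$writer = [System.IO.StreamWriter]::new((Join-Path (Get-Location).Path $outPath))
try {
    $firstLine = $true
    while ($null -ne ($line = $reader.ReadLine())) {
        if ($firstLine) {
            $writer.Write('!' + $line)                     # prepend only on the first line
            $firstLine = $false
        }
        else {
            $writer.Write([Environment]::NewLine + $line)  # newline before each later line, so no trailing newline is added
        }
    }
}
finally {
    $reader.Close()
    $writer.Close()
}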
Late to the party, but thought this might be useful. I needed to perform the operation on a thousand-plus large files and needed something a little more robust and less prone to OOM exceptions, so I ended up writing it leveraging the .NET libraries:
function PrependTo-File {
    [cmdletbinding()]
    param(
        [Parameter(
            Position=1,
            ValueFromPipeline=$true,
            Mandatory=$true,
            ValueFromPipelineByPropertyName=$true
        )]
        [System.IO.FileInfo]
        $file,
        [string]
        [Parameter(
            Position=0,
            ValueFromPipeline=$false,
            Mandatory=$true
        )]
        $content
    )
    process {
        if (!$file.exists) {
            write-error "$file does not exist";
            return;
        }
        $filepath = $file.fullname;
        $tmptoken = (get-location).path + "\_tmpfile" + $file.name;
        write-verbose "$tmptoken created as buffer";

        # create the temp buffer file and open the original for read/write
        $tfs = [System.io.file]::create($tmptoken);
        $fs = [System.IO.File]::Open($file.fullname, [System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite);
        try {
            # write the new content first, then append the original file's bytes after it
            $msg = $content.tochararray();
            $tfs.write($msg, 0, $msg.length);
            $fs.position = 0;
            $fs.copyTo($tfs);
        }
        catch {
            write-verbose $_.Exception.Message;
        }
        finally {
            $tfs.close();
            # close calls dispose and gc.supressfinalize internally
            $fs.close();
            if ($error.count -eq 0) {
                write-verbose ("updating $filepath");
                [System.io.File]::Delete($filepath);
                [System.io.file]::Move($tmptoken, $filepath);
            }
            else {
                $error.clear();
                write-verbose ("an error occurred, rolling back. $filepath not affected");
                [System.io.file]::Delete($tmptoken);
            }
        }
    }
}
Usage:
PS> get-item fileName.ext | PrependTo-File "contentToAdd`r`n"
This one-liner might work:
Get-ChildItem *.txt | % {
    [System.Collections.ArrayList]$lines = Get-Content $_
    $lines[0] = $lines[0].Insert(0, "!")
    Set-Content "new_$($_.name)" -Value $lines
}
Try this:
$a = Get-Content "c:\yourfile.csv"
$a[0] = "!" + $a[0]
$a | Set-Content "c:\newfile.csv"