PowerShell command to prepend text before each file

I am trying to build a PowerShell command that extracts all files from multiple folders and combines them into a single file.
Script:
Get-ChildItem -Path $(Pipeline.Workspace)/Common_All/Drop_$(DropFolder)_Migrations -Filter "Release*" -Directory |
    Get-ChildItem -File -Filter *.sql |
    ForEach-Object { $_ | Get-Content; "GO" } |
    Out-File $(System.DefaultWorkingDirectory)/combined-script.sql
This returns a combined script with "GO" appended after each file's content.
How do I prepend text before each file's content?

ForEach-Object { "prepended text`n" + ($_ | Get-Content -raw) + "`nGO" }
It can also be done the same way you've appended GO:
ForEach-Object {"prepended text"; $_ |Get-Content; "GO" }

Related

Powershell script to locate only files starting with specified letters and ending with .csv

cd 'A:\P\E\D'
$files = Get-ChildItem . *.CSV -rec
ForEach ($file in $files) {
    (Get-Content $file -Raw) | ForEach-Object {
        *some simple code*
    } | Set-Content $file
}
How do I modify this PowerShell script to locate only files starting with the letters A/a to O/o and ending with .csv in the specified directory (cd)?
I thought the solution below would work, but the test file M_K_O_X.CSV stored in that directory was not found or modified, while the solution above does find and modify the file. It's possible that I have the pattern wrong, or the problem is somewhere else. I also tried this regex -- "[A-O]..CSV"
cd 'A:\P\E\D'
$files = Get-ChildItem . -rec | Where-Object { $_.Name -like "[a-oA-O]*.*.CSV" }
ForEach ($file in $files) {
    (Get-Content $file -Raw) | ForEach-Object {
        *some simple code*
    } | Set-Content $file
}
Looking at your wildcard pattern, it seems like you have an extra *. that shouldn't be there:
'M_K_O_X.CSV' -like '[a-oA-O]*.*.CSV' # False
'M_K_O_X.CSV' -like '[a-oA-O]*.CSV' # True
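So, keeping your original Where-Object approach, a sketch with just the pattern corrected (reusing the same *some simple code* placeholder from your question):
cd 'A:\P\E\D'
$files = Get-ChildItem . -rec | Where-Object { $_.Name -like "[a-oA-O]*.CSV" }
ForEach ($file in $files) {
    (Get-Content $file -Raw) | ForEach-Object {
        *some simple code*
    } | Set-Content $file
}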
In this case you could simply use the -Include parameter, which supports character ranges. Also, since PowerShell is case-insensitive by default, [a-oA-O]*.CSV can be reduced to [a-o]*.CSV:
Get-ChildItem 'A:\P\E\D' -Recurse -Include '[a-o]*.csv' | ForEach-Object {
    ($_ | Get-Content -Raw) | ForEach-Object {
        # *some simple code*
    } | Set-Content -LiteralPath $_.FullName
}
As commented, I would use the standard wildcard -Filter to filter for all files with a .csv extension, then pipe to a Where-Object clause in which you can use regex -match:
$files = Get-ChildItem -Path 'A:\P\E\D' -Filter '*.csv' -File -Recurse |
         Where-Object { $_.Name -match '^[a-o]' }

foreach ($file in $files) {
    # switch `-Raw` makes Get-Content return a single multiline string, so no need for a loop
    $content = Get-Content -Path $file.FullName -Raw
    # *some simple code manipulating $content*
    $content | Set-Content -Path $file.FullName
}
However, if these are valid csv files, I would not recommend pure textual manipulation on them; instead, use Import-Csv -Path $file.FullName and work on the properties of each of the objects returned.
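A minimal sketch of that object-based approach, assuming a hypothetical 'Name' column since the real headers aren't shown in the question:
foreach ($file in Get-ChildItem -Path 'A:\P\E\D' -Filter '*.csv' -File -Recurse |
         Where-Object { $_.Name -match '^[a-o]' }) {
    # Import-Csv turns each data row into an object with one property per column
    $rows = Import-Csv -Path $file.FullName
    foreach ($row in $rows) {
        # example manipulation on the hypothetical 'Name' column
        $row.Name = $row.Name.Trim()
    }
    # write the objects back out as csv
    $rows | Export-Csv -Path $file.FullName -NoTypeInformation
}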

Trying to truncate csv/txt files leaving header row only

I have several csv and txt files in a directory with data in them. I need to truncate the data from all of these files but leave the header in each.
You can use the following script; it should work if all files have more than one line:
$files = dir .\* -Include ('*.csv', '*.txt')
foreach ($file in $files) {
    $firstline = (Get-Content $file)[0]
    Set-Content $file -Value $firstline
}
You do not need to read the whole file just to capture the first line.
Get-ChildItem -Path 'D:\Test' -File | Where-Object { $_.Extension -match '\.(csv|txt)' } | ForEach-Object {
    # only read the first line using -TotalCount
    ($_ | Get-Content -TotalCount 1) | Set-Content -Path $_.FullName
}
The above could produce empty or whitespace-only files if the top line is empty or only contains whitespace.
Perhaps then the best option to quickly truncate these files to the top NON-EMPTY line would be:
Get-ChildItem -Path 'D:\Test' -File | Where-Object { $_.Extension -match '\.(csv|txt)' } | ForEach-Object {
    $newcontent = switch -Regex -File $_.FullName {
        '\S' { $_; break }  # output the first line that is not empty or whitespace-only and exit the switch
    }
    # write back to the file
    $newcontent | Set-Content -Path $_.FullName
}
P.S. Using -Filter as a parameter on Get-ChildItem would work faster, but unfortunately a filter can only be used with ONE file pattern, like '*.csv'.
If you need recursion (searching subfolders as well), you could use the -Include parameter, which accepts an array of file patterns. However, for that to work, you also need to add the -Recurse switch OR have the path end in \*.
-Include is not as fast as -Filter; it is about the same speed as using a Where-Object clause as in the examples above.
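A minimal sketch of the -Include variant under those constraints (assuming the same 'D:\Test' folder as above; either -Recurse or a trailing \* makes -Include take effect):
# recursive: -Include with an array of patterns plus -Recurse
Get-ChildItem -Path 'D:\Test' -File -Recurse -Include '*.csv', '*.txt' | ForEach-Object {
    ($_ | Get-Content -TotalCount 1) | Set-Content -Path $_.FullName
}

# non-recursive alternative: end the path in \* instead of using -Recurse
Get-ChildItem -Path 'D:\Test\*' -File -Include '*.csv', '*.txt' | ForEach-Object {
    ($_ | Get-Content -TotalCount 1) | Set-Content -Path $_.FullName
}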

How do I find & replace file contents in PowerShell?

I need to find text in a file, like:
PolicyFile=$(SrcRoot)BeHttp
and replace it with this:
PolicyFile=$(SrcRoot)PPCore/BeHttp
So I wrote the following script, but it's not working.
Get-ChildItem 'D:\SomeFolder\*.MKE' -Recurse | ForEach {
    (Get-Content $_ | ForEach {
        $_ -replace "PolicyFile=$(SrcRoot)BeHttp", "PolicyFile=$(SrcRoot)PPCore//BeHttp"
    }) | Set-Content $_
}
You want to escape the characters PowerShell's regex engine considers reserved. Also, when using Get-Content, you need to provide the full path. That path is available via the FullName property of each item returned by Get-ChildItem.
Get-ChildItem 'D:\SomeFolder\*.MKE' -Recurse | ForEach {
    (Get-Content $_.FullName) -replace 'PolicyFile=\$\(SrcRoot\)BeHttp', 'PolicyFile=$(SrcRoot)PPCore//BeHttp' |
        Set-Content $_.FullName
}
To escape $ ( ), use \. Also, you don't need to use ForEach-Object on the strings returned by Get-Content; the -replace operator works on the whole array at once.
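If you'd rather not escape by hand, a small sketch using the .NET [regex]::Escape helper to build the search pattern (same literal strings as above):
# escape all regex metacharacters ($, (, )) in the literal search text
$pattern = [regex]::Escape('PolicyFile=$(SrcRoot)BeHttp')   # becomes PolicyFile=\$\(SrcRoot\)BeHttp

Get-ChildItem 'D:\SomeFolder\*.MKE' -Recurse | ForEach-Object {
    (Get-Content $_.FullName) -replace $pattern, 'PolicyFile=$(SrcRoot)PPCore//BeHttp' |
        Set-Content $_.FullName
}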
Update:
When running Get-ChildItem, I do see all the files from all subfolders:
PS C:\Users\user> Get-ChildItem 'C:\Temp\*.MKE' -Recurse | % { $_.FullName}
C:\Temp\1\new.mke
C:\Temp\2\3\new.mke
C:\Temp\2\new.mke
C:\Temp\new.mke

Exclude all sub-directories from PowerShell Get-ChildItem -match

Goal
Exclude all sub-directories when running a PowerShell script that matches a filename regex pattern
Directory structure
/
- 2018-11-19.md
18-2/
- 2018-10-16.md
- 2019-01-14.md
- 2019-10-10.md
18-3/
- 2019-01-13.md
- 2019-04-25.md
PowerShell script
$file = '2018-11-19.md'

Get-ChildItem -Recurse | Where-Object { $_.FullName -match '[0-9]{4}-[0-9]{2}-[0-9]{2}.md' } |
    ForEach-Object {
        $fullname = $_.FullName
        (Get-Content $_.FullName | ForEach-Object {
            $_ -replace "apple", "orange"
        }) | Set-Content $fullname
    }

(Get-Content $file | ForEach-Object {
    $_ -replace '<p(.*?)>(.*?)</p>', '$2'
}) | Set-Content -Encoding Utf8 $file

Get-ChildItem -Recurse | Where-Object { $_.FullName -match '[0-9]{4}-[0-9]{2}-[0-9]{2}.md' } |
    ForEach-Object {
        $fullname2 = $_.FullName
        (Get-Content $_.FullName |
            pandoc -f markdown -t markdown -o $fullname2 $fullname2)
    }
Details
The goal is to run the PowerShell script only on the file(s) in the root directory. These file(s) at root will change, but they will always be named according to the convention shown. The regex in the PowerShell script successfully matches this filename.
Currently the script changes all files in the directory example above.
Any examples I can find show how to exclude specific directories by identifying their names in the script (e.g., -Exclude folder-name). I want to exclude all sub-directories without naming them specifically because...
...In the future sub-directories may be added for 18-4, 19-5, etc., so it seems like an exclusion based on a regex would make sense.
Attempts
To limit the script's scope to the root directory, I tried variations on -notmatch with \, \*, \*.*, \\*, etc., without success.
To exclude sub-directories, I tried variations on -Exclude with the same paths but did not succeed.
My PowerShell knowledge is not advanced enough to get further than this. I would be grateful for any help or to be pointed in the right direction. Thank you for any help.
As pointed out by Owain and gvee in the comments, when using the -Recurse switch, you tell the Get-ChildItem cmdlet that you wish to traverse the subdirectory structure from the selected location. As explained on the docs site of the cmdlet:
Gets the items in the specified locations and in all child items of the locations.
So simply removing the switch should make the code do as you want.
If you ever want only X levels of subdirectories, you can use the -Depth switch.
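For example, a minimal sketch that limits the search to the current folder plus one level of subdirectories (assuming PowerShell 5.0 or later, where -Depth is available):
# current directory plus its immediate child folders only
Get-ChildItem -Depth 1 | Where-Object { $_.FullName -match '[0-9]{4}-[0-9]{2}-[0-9]{2}.md' }
For this question, though, simply dropping -Recurse gives: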
Get-ChildItem | Where-Object { $_.FullName -match '[0-9]{4}-[0-9]{2}-[0-9]{2}.md' } |
    ForEach-Object {
        $fullname = $_.FullName
        (Get-Content $_.FullName | ForEach-Object {
            $_ -replace "apple", "orange"
        }) | Set-Content $fullname
    }

(Get-Content $file | ForEach-Object {
    $_ -replace '<p(.*?)>(.*?)</p>', '$2'
}) | Set-Content -Encoding Utf8 $file

Get-ChildItem | Where-Object { $_.FullName -match '[0-9]{4}-[0-9]{2}-[0-9]{2}.md' } |
    ForEach-Object {
        $fullname2 = $_.FullName
        (Get-Content $_.FullName |
            pandoc -f markdown -t markdown -o $fullname2 $fullname2)
    }

Powershell - List all alternate data stream information from one directory

My end goal here is to cd to a directory in PowerShell, list all the alternate data stream files, and then output all their content to a CSV.
I currently have the first two parts scripted:
cd c:\users\profilename\downloads\
gci -recurse | % { gi $_.FullName -stream * } | where stream -ne ':$Data'
To open an example data stream file, open cmd, cd to a directory, then run:
dir /r
After this, grab the Zone.Identifier stream name of one of the files and run this command without the :$Data.
Example before removing :$Data
notepad test.docx:Zone.Identifier:$Data
After removing it (run this command):
notepad test.docx:Zone.Identifier
How would I go about taking the output of the second command and using the PSPath field to open each of these files, then output all the contents into one CSV file?
Any help is greatly appreciated.
Presuming you are after the stream content:
## Q:\Test\2018\11\19\SO_53380498.ps1
Pushd $ENV:USERPROFILE\Downloads
Get-ChildItem -Recurse | ForEach-Object {
    Get-Item $_.FullName -Stream *
} | Where-Object Stream -ne ':$Data' |
    Select-Object FileName, Stream,
        @{n='CreationTime'; e={(Get-Item $_.FileName).CreationTime}},
        @{n='LastWriteTime'; e={(Get-Item $_.FileName).LastWriteTime}},
        @{n='Content'; e={Get-Content "$($_.FileName):$($_.Stream)"}} |
    Export-Csv Streams.csv -NoTypeInformation
Shortened output of the generated Streams.csv file (the date format depends on locale/user settings):
> gc .\Streams.csv
"FileName","Stream","CreationTime","LastWriteTime","Content"
"C:\Users\LotPings\Downloads\2018-06-27-raspbian-stretch.zip","Zone.Identifier","2018-07-29 22:13:03","2018-07-29 22:16:41","[ZoneTransfer] ZoneId=3"
If your final destination for the csv supports multiline fields, you could do -join "`n" on the content.
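A minimal sketch of that variant, replacing only the Content calculated property in the script above so multiline stream content is kept as one newline-joined field:
@{n='Content'; e={(Get-Content "$($_.FileName):$($_.Stream)") -join "`n"}}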
I think this might be close to what you want:
$files = gci -recurse | % { gi $_.FullName -stream * } | where stream -ne ':$Data' | select filename,stream,@{'name'='identifier';'e'={"$($_.filename):$($_.stream)"}}
Broken into multiple lines for legibility:
$files = Get-ChildItem -Recurse |
    ForEach-Object { Get-Item $_.FullName -Stream * } |
    Where-Object { $_.Stream -ne ':$Data' } |
    Select-Object -Property FileName, Stream, @{'name'='identifier'; 'e'={"$($_.FileName):$($_.Stream)"}}