I want to get the heading as "StudentID|studentfirstname|studentlastname|class" to my existing data as below:
2|vicky|kash|A
5|abc|sdf|B
9|sdf|sdf|D
My code looks like:
add-content -path "outfile.txt" -Value (-join($StudentID, "|",`
$studentfirstname, "|",` $studentlastname, "|",`$class)
Expected output file:
StudentID|studentfirstname|studentlastname|class
2|vicky|kash|A
5|abc|sdf|B
9|sdf|sdf|D
Thanks in Advance!
I'm not quite sure what you intend to do, but to me the question reads as "I have pipe-delimited data and all it is missing is a header line".
If that is the case, you could do something as simple as:
$fileIn = 'D:\Test\YourFile.csv'
$fileOut = 'D:\Test\YourFile2.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value "StudentID|studentfirstname|studentlastname|class"
# read the original file and append it to the one you have just created
Get-Content -Path $fileIn -Raw | Add-Content -Path $fileOut
If your input file is really large, here is a faster alternative:
$fileIn = 'D:\Test\YourFile.csv'
$fileOut = 'D:\Test\YourFile2.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value "StudentID|studentfirstname|studentlastname|class"
# read the original file and append it to the one you have just created
[System.IO.File]::AppendAllText($fileOut, ([System.IO.File]::ReadAllText($fileIn)))
That syntax is incorrect...
Just do this...
$StudentID = '123'
$studentfirstname = 'John'
$studentlastname = 'Doe'
$class = 'Math'
Clear-Host
"$StudentID|$studentfirstname|$studentlastname|$class"
# Results
<#
123|John|Doe|Math
#>
Or
Clear-Host
$StudentID,$studentfirstname,$studentlastname,$class -join '|'
# Results
<#
123|John|Doe|Math
#>
Or
Clear-Host
"{0}|{1}|{2}|{3}" -f $StudentID,$studentfirstname,$studentlastname,$class
# Results
<#
123|John|Doe|Math
#>
I'm trying to find a way to reliably replace all occurrences of a string found in a file with data from a column in a CSV using one column as the search pattern with data from the same row on the next column for the replace pattern. The new data is then written to a new file as to keep the original intact. The purpose of this is to simplify exchanging IDs between environments that are hardcoded into the Master pages of a SharePoint site collection. Here's what I have so far.
$file = "C:\Users\jeffery\Documents\ids.csv"
$csv = Import-Csv -Path $file -Delimiter `,
$prd2016 = $csv.'2016 PRD ID'
$stg2016 = $csv.'2016 STG ID'
$prd2010 = $csv.'2010 PRD ID'
$srcFile = "C:\Users\jeffery\Downloads\v5.master"
$dstFile = "C:\Users\jeffery\Downloads\v6.master"
Set-Variable 2010,2016
$content = Get-Content -Path $srcFile
For($i=0; $i -lt $prd2016.Count; $i++){
Clear-Variable 2010
Clear-Variable 2016
$2010 = $prd2010[$i]
$2016 = $prd2016[$i]
$content.replace("$2016", "$2010") | Set-Content -Path $dstFile -Force
}
I've also tried nested loops and using foreach loops to no avail as of yet. Any help will be greatly appreciated. Also, here's some sample data to assist with any answers.
CSV Data:
Navigation,a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3,a66d1d48-ab5e-4aed-9eb9-e8763b88ff2a,2d3cd026-7e2a-4241-8500-abd9a83a0803
Source file data:
<WebPartPages:DataFormWebPart runat="server" IsIncluded="True" AsyncRefresh="false" NoDefaultStyle="TRUE" ViewFlag="8" Title="Navigation" PageType="PAGE_NORMALVIEW" __markuptype="vsattributemarkup" __WebPartId="{9CDA54AA-5C9F-4E62-A0D6-BE149C8B27F0}" partorder="2" id="g_9cda54aa_5c9f_4e62_a0d6_be149c8b27f0" listname="{a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3}" pagesize="1" chrometype="None" __AllowXSLTEditing="true" WebPart="true" Height="" Width="">
<DataSources><SharePoint:SPDataSource runat="server" DataSourceMode="List" UseInternalName="true" UseServerDataFormat="true" selectcommand="<View><Query><Where><Eq><FieldRef Name="Title"/><Value Type="Text">Top Nav</Value></Eq></Where></Query></View>" id="dataformwebpart8"><SelectParameters><WebPartPages:DataFormParameter Name="ListID" ParameterKey="ListID" PropertyName="ParameterValues" DefaultValue="{a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3}"/><asp:Parameter Name="MaximumRows" DefaultValue="1"/></SelectParameters><DeleteParameters><WebPartPages:DataFormParameter Name="ListID" ParameterKey="ListID" PropertyName="ParameterValues" DefaultValue="{a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3}"/></DeleteParameters><UpdateParameters><WebPartPages:DataFormParameter Name="ListID" ParameterKey="ListID" PropertyName="ParameterValues" DefaultValue="{a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3}"/></UpdateParameters><InsertParameters><WebPartPages:DataFormParameter Name="ListID" ParameterKey="ListID" PropertyName="ParameterValues" DefaultValue="{a5a0c64c-17b1-4cba-a8ff-a6a61d8466f3}"/></InsertParameters></SharePoint:SPDataSource></DataSources>
<datafields>#Title,Title;#Navigation,Navigation;#ID,ID;#ContentType,Content Type;#Modified,Modified;#Created,Created;#Author,Created By;#Editor,Modified By;#_UIVersionString,Version;#Attachments,Attachments;#File_x0020_Type,File Type;#FileLeafRef,Name (for use in forms);#FileDirRef,Path;#FSObjType,Item Type;#_HasCopyDestinations,Has Copy Destinations;#_CopySource,Copy Source;#ContentTypeId,Content Type ID;#_ModerationStatus,Approval Status;#_UIVersion,UI Version;#Created_x0020_Date,Created;#FileRef,URL Path;#ItemChildCount,Item Child Count;#FolderChildCount,Folder Child Count;#AppAuthor,App Created By;#AppEditor,App Modified By;</datafields>
<XSL><xsl:stylesheet xmlns:x="http://www.w3.org/2001/XMLSchema" xmlns:d="http://schemas.microsoft.com/sharepoint/dsp" version="1.0" exclude-result-prefixes="xsl msxsl ddwrt" xmlns:ddwrt="http://schemas.microsoft.com/WebParts/v2/DataView/runtime" xmlns:asp="http://schemas.microsoft.com/ASPNET/20" xmlns:__designer="http://schemas.microsoft.com/WebParts/v2/DataView/designer" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:SharePoint="Microsoft.SharePoint.WebControls" xmlns:ddwrt2="urn:frontpage:internal">
Thank you in advance for any and all help with this query.
You're jumping through some serious hoops to avoid using the objects Import-Csv gives you and foreach.
You're also overwriting the destination file every time you loop, which is unnecessary.
$srcFile = "C:\Users\jeffery.grantham\Downloads\v5.master"
$dstFile = "C:\Users\jeffery.grantham\Downloads\v6.master"
$file = "C:\Users\jeffery\Documents\ids.csv"
$csv = Import-Csv -Path $file
$content = Get-Content -Path $srcFile -Raw # Read the file into a single string.
foreach ($row in $csv) {
$content = $content.Replace($row.'2016 PRD ID', $row.'2010 PRD ID')
}
$content | Set-Content -Path $dstFile -Force
I found my own answer finally. The problem was that the string was getting replaced in real time, but the variable was not being updated before the result was written to $dstFile.
So my final script looks like the following:
$file = "C:\Users\jeffery.grantham\Documents\peer-ids.csv"
$csv = Import-Csv -Path $file -Delimiter `,
$prd2016 = $csv.'2016 PRD ID'
$stg2016 = $csv.'2016 STG ID'
$prd2010 = $csv.'2010 PRD ID'
$srcFile = "C:\Users\jeffery.grantham\Downloads\v5.master"
$dstFile = "C:\Users\jeffery.grantham\Downloads\v6.master"
Set-Variable 2010,2016
$content = Get-Content -Path $srcFile
For($i=0; $i -lt $prd2016.Count; $i++){
Clear-Variable 2010
Clear-Variable 2016
$2010 = [string]$prd2010[$i]
$2016 = [string]$prd2016[$i]
$content = $content.replace("$2016", "$2010")
$content | Set-Content -Path $dstFile -Force
}
I hate when I find my own answer less than an hour after posting a question, but hopefully finding this will help someone else who attempts to do the same thing or something similar in the future.
I am getting CSV files (with no header) from another system. The last line ends the file; there is no newline after the last line of data. When I try Import-Csv, it will not read the last line of the file.
I do not have the ability to have the input file changed to include the newline.
I have noticed that Get-Content doesn't have a problem reading the entire file, but then it isn't a CSV and I'm unable to reference the fields in the file.
Currently I'm doing:
$w = Import-CSV -path c:\temp\input.txt -header 'head1', 'head2', 'head3'
This will not read the last line of the file
This reads the entire file:
$w = Get-Content -path c:\temp\input.txt
But the data doesn't have the ability to reference the fields like: $w.head1
Is there a way to get Import-CSV to read the file including the last line?
OR Is there a way to read in the data using Get-Content, adding a header to it and then converting it back to a CSV?
I've tried using ConvertTo-Csv but have not had success:
$w = Get-Content -path c:\temp\input.txt
$csvdata = $w | ConvertTo-CSV # No header option for this function
I'd rather not create an intermediate file unless absolutely necessary.
You're very close! What you're after is not ConvertTo-Csv, you already have the file contents in CSV format after all. So change that to ConvertFrom-Csv instead, which incidentally does support the -Header parameter. So something like this:
$w = Get-Content -path c:\temp\input.txt
$csvdata = $w | ConvertFrom-Csv -Header 'head1', 'head2', 'head3'
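Once that's done, you can reference the fields exactly the way you wanted, for example (using the same headers as above):
$csvdata[0].head1                        # first field of the first data row
$csvdata | ForEach-Object { $_.head2 }   # every value in the second column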
If I understand correctly, you know the number of columns in the file and all it is missing is a header line. Since in your code you do not specify a -Delimiter parameter, I'm assuming the delimiter character used in the file is a comma.
Best thing to do IMHO is to create a new output file and always keep the original.
$fileIn = 'c:\temp\input.txt'
$fileOut = 'c:\temp\input.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value 'head1,head2,head3'
# read the original file and append it to the one you have just created
Get-Content -Path $fileIn -Raw | Add-Content -Path $fileOut
If your file is really large, here is a faster alternative:
$fileIn = 'c:\temp\input.txt'
$fileOut = 'c:\temp\input.csv'
# write the header line to a new file
Set-Content -Path $fileOut -Value 'head1,head2,head3'
# read the original file and append it to the one you have just created
[System.IO.File]::AppendAllText($fileOut, ([System.IO.File]::ReadAllText($fileIn)))
If you really do want to take the risk and overwrite the original file, you can do this:
$file = 'c:\temp\input.txt'
$content = Get-Content -Path $file -Raw
# write the header line to the file, destroying what was in there
Set-Content -Path $file -Value 'head1,head2,head3'
# append the original content to it
$content | Add-Content -Path $file
I have put together a script inspired by a number of sources. The purpose of the PowerShell script is to scan a directory for files (.SQL), copy all of them to a new directory (retaining the originals), scan each file against a list file (CSV format, containing 2 columns: OldValue,NewValue), and replace any strings that match. What works: moving, modifying, log creation.
What doesn't work:
Recording the changes made by the script in the .log.
Sample usage: .\ConvertSQL.ps1 -List .\EVar.csv -Files \SQLFiles\Rel_1
Param (
[String]$List = "*.csv",
[String]$Files = "*.sql"
)
function Get-TimeStamp {
return "[{0:dd/MM/yyyy} {0:HH:mm:ss}]" -f (Get-Date)
}
$CustomFiles = "$Files\CUSTOMISED"
IF (-Not (Test-Path $CustomFiles))
{
MD -Path $CustomFiles
}
Copy-Item "$Files\*.sql" -Recurse -Destination "$CustomFiles"
$ReplacementList = Import-Csv $List;
Get-ChildItem $CustomFiles |
ForEach-Object {
$LogFile = "$CustomFiles\$_.$(Get-Date -Format dd_MM_yyyy).log"
Write-Output "$_ has been modified on $(Get-TimeStamp)." | Out-File "$LogFile"
$Content = Get-Content -Path $_.FullName;
foreach ($ReplacementItem in $ReplacementList)
{
$Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue)
}
Set-Content -Path $_.FullName -Value $Content
}
Thank you very much.
Edit: I've cleaned up a bit and removed my test logging files.
Here's the snippet of code that I've been testing with little success. I put the following right under $Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue)
if ( $_.FullName -like '*TEST*' ) {
"This is a test." | Add-Content $LogFile
}
I've also tried to pipe out the Set-Content using Out-File. The outputs I end up with are either a full copy of the contents of my CSV file or the SQL file itself. I'll continue reading up on different methods. I simply want to be able to identify, out of hundreds to a thousand or so lines, which variables in the SQL have been changed.
Instead of piping output to Add-Content, pipe the log output to Out-File -Append.
Edit: compare the content using the Compare-Object cmdlet and evaluate its output to identify where the content in each string object differs.
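A rough sketch of that idea, dropped into the ForEach-Object block from the question (the variable names and the Get-TimeStamp function come from the script above; treat this as untested):
# keep a snapshot of the content before the replacements are applied
$Original = Get-Content -Path $_.FullName
$Content = $Original
foreach ($ReplacementItem in $ReplacementList)
{
    $Content = $Content.Replace($ReplacementItem.OldValue, $ReplacementItem.NewValue)
}
# Compare-Object emits only the lines that differ between the two snapshots
Compare-Object -ReferenceObject $Original -DifferenceObject $Content |
    ForEach-Object { "$(Get-TimeStamp) $($_.SideIndicator) $($_.InputObject)" } |
    Out-File -FilePath $LogFile -Append
Set-Content -Path $_.FullName -Value $Content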
I need to pull logs from the original path on C:\ to a log directory in D:\Logs, but every time the original path creates a new log, the script needs to append the new lines, not replace or rewrite all the existing lines.
I already tried this, but I guess it replaces the whole file, and I'm not sure about the Param block.
$SourceFolder = "C:\ProgramData\Sophos\Sophos Anti-Virus\logs"
$DestinationFolder = "D:\Logs\SophosAntivirus"
Function ChangeTabToSpace
{
Param(
[string] $OldFile = "",
[string] $NewFile = ""
)
$OldText = (Get-Content $OldFile -Raw)
#Change all tabs `t to spaces
$NewText = ($OldText -replace "`t"," ")
#Delete the last empty line
if ($NewText.Length -ge 2) {
$NewText = $NewText.Substring(0,$NewText.Length-2)
}
if (!(Test-path "$NewFile")) {
New-Item -type file "$NewFile" -force | Out-Null
}
#Write-Output $NewText | Out-File -Encoding utf8 "$NewFile"
[System.IO.File]::WriteAllLines($NewFile, $NewText)
}
If it's a simple text file you can use the following:
"the string you want or have" | Out-File -FilePath $path -Append
This will add the string to a new line at the end of the file.
You don't need to pipe the input in like I did... it's just how I learned to use it and kept using it.
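For instance, in the ChangeTabToSpace function from the question, the final write could be swapped for an append (a sketch only; note it still appends everything it is given on each run, so you would have to make sure you only pass it the newly added log lines):
# append the converted text to the end of $NewFile instead of replacing it
$NewText | Out-File -FilePath $NewFile -Append -Encoding utf8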
I need to update directory in many txt files
Input files:
1.txt
using c:\data\1.dta
its own data
2.txt
using c:\data\2.dta
its own data
3.txt
using c:\data\3.dta
its own data
Expected Output files:
1.txt
using C:\Data\Subfile\1.dta
its own data
2.txt
using C:\Data\Subfile\2.dta
its own data
3.txt
using C:\Data\Subfile\3.dta
its own data
I've tried -replace but the results are strange: either all files have the same result, or every file contains all of the new directories (see below).
I want to update the old path to the new path in all files. The code is as follows:
$pathway='C:\Data\Subfile\*.txt'
$oldpath='c:\\data\\'
$newpath='C:\Data\Subfile\'
$content=Get-Content -path $pathway
Method 1:
$newline=((Get-Content -path $pathway -TotalCount 1) -replace $oldpath,$newpath)
$content[0]= $newline
This method will include all updated directories in every file:
Wrong output:
1.txt
using C:\Data\Subfile\1.txt
using C:\Data\Subfile\2.txt
using C:\Data\Subfile\3.txt
its own data
2.txt
using C:\Data\Subfile\1.txt
using C:\Data\Subfile\2.txt
using C:\Data\Subfile\3.txt
its own data
Method 2:
$content[0]=$content[0]-replace $oldpath,$newpath
This method causes every file to have the same new directory:
Wrong output:
1.txt
using C:\Data\Subfile\1.txt
its own data
2.txt
using C:\Data\Subfile\1.txt
its own data
3.txt
using C:\Data\Subfile\1.txt
its own data
$content | Set-Content -Path $pathway
Can someone help me with that? I want each file to have its corresponding new directory. For 1.txt I want C:\Data\Subfile\1.txt, for 2.txt I want C:\Data\Subfile\2.txt, etc.
Thanks a lot!
I'm a bit unclear on what you want the final content to be. Is it using C:\Data\Subfile\1.txt or using C:\Data\Subfile\1.dta? I think you are asking for the following, but if not, let me know. You may run into speed / performance issues depending on how large your files are.
If these are your input files with their content:
C:\data\Subfile\1.txt
using c:\data\1.dta
its own data...
C:\data\Subfile\2.txt
using c:\data\2.dta
its own data...
C:\data\Subfile\3.txt
using c:\data\3.dta
its own data...
then this:
Get-ChildItem c:\data\Subfile\*.txt | Foreach-Object{
#Read in all content lines and replace c:\data\ with c:\data\subfile
$content = Get-Content $_.FullName | %{$_ -replace 'c:\\Data\\', 'c:\Data\Subfile\' }
#write the new data to file
$content | Set-Content $_.FullName
}
This results in the following:
C:\data\Subfile\1.txt
using c:\Data\Subfile\1.dta
its own data...
C:\data\Subfile\2.txt
using c:\Data\Subfile\2.dta
its own data...
C:\data\Subfile\3.txt
using c:\Data\Subfile\3.dta
its own data...
With lookarounds you can precisely define where to insert text without repeating the search pattern.
foreach ($File in Get-ChildItem 'C:\Data\Subfile\*.txt'){
(Get-Content $File -raw) -replace "(?<=C:\\data\\)(?=\d\.dta)","Subfile\" |
Set-Content $File
}
"(?<=C:\\data\\) is a positive lookbehind zero length assertion,
(?=\d\.dta) is a positive lookahead zero length assertion,
the replacement text is inserted in between these two.
this is more secure than other approaches as it is repeatable without inserting Subfile\ again.
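For example, running the replacement on a sample line, then running it again on its own output (a quick console test), shows the second pass leaves the text untouched because the lookahead no longer matches once Subfile\ has been inserted:
'using c:\data\1.dta' -replace "(?<=C:\\data\\)(?=\d\.dta)","Subfile\"
# Results
<#
using c:\data\Subfile\1.dta
#>
'using c:\data\Subfile\1.dta' -replace "(?<=C:\\data\\)(?=\d\.dta)","Subfile\"
# Results
<#
using c:\data\Subfile\1.dta
#>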
here is a way to do the job. [grin] what it does ...
between the #region/#endregion markers is just to make the files to work with
reads the list of files
iterates thru that list
loads the content of each one
replaces the old dir with the new one
finally writes out the new content
here's the code ...
#region - Make files to work with
$Null = New-Item -Path "$env:TEMP\TestFiles" -ItemType Directory -ErrorAction SilentlyContinue
$1stFileName = "$env:TEMP\TestFiles\1.txt"
$1stFileContent = @'
using c:\data\1.dta
its own data
'@ -split [System.Environment]::NewLine |
Set-Content -LiteralPath $1stFileName
$2ndFileName = "$env:TEMP\TestFiles\2.txt"
$2ndFileContent = @'
using c:\data\2.dta
its own data
'@ -split [System.Environment]::NewLine |
Set-Content -LiteralPath $2ndFileName
$3rdFileName = "$env:TEMP\TestFiles\3.txt"
$3rdFileContent = @'
using c:\data\3.dta
its own data
'@ -split [System.Environment]::NewLine |
Set-Content -LiteralPath $3rdFileName
#endregion - Make files to work with
$OldDir = 'c:\data'
$NewDir = 'c:\data\SubDir'
$SourceDir = "$env:TEMP\TestFiles"
$FileList = Get-ChildItem -LiteralPath $SourceDir -Filter '*.txt' -File
foreach ($FL_Item in $FileList)
{
$NewContent = Get-Content -LiteralPath $FL_Item.FullName |
ForEach-Object {
$_.Replace($OldDir, $NewDir)
}
$NewContent |
Set-Content -LiteralPath $FL_Item.FullName
}
content of file 1.txt before & after the script runs ...
# before ...
using c:\data\1.dta
its own data
# after ...
using c:\data\SubDir\1.dta
its own data