I have an application with the following name: "x64.Staging.1.0.0.99.ClientBootstrapper". I need to install this app every week. From PowerShell I use this command:
"./x64.Staging.1.0.0.99.ClientBootstrapper"
but every week the version number changes and I have to change my script. How can I update my script so that it automatically detects the latest version of the application and runs it?
I tried using the following:
$version="x64.Staging.{0-9}.ClientBootstrapper.exe"
./x64.Staging.$version.ClientBootstrapper.exe /qn
but this doesn't seem to work.
You could potentially do the following:
$exe = Get-ChildItem -Path "x64.Staging.[0-9]*.ClientBootstrapper.exe" -File | Sort-Object {
    [version]($_.Name -replace 'x64\.Staging\.([0-9\.]+)\.ClientBootstrapper\.exe', '$1' -replace '^\d+$', '$0.0')
} -Descending |
    Select-Object -First 1 | Resolve-Path -Relative
& $exe '/qn'
Explanation:
The strategy is to sort by the version string in the middle of the filename. -replace removes all characters in the file name except the version. ([0-9\.]+) matches one or more digits and dots. $1 is the capture group that represents the version string. Since a version object requires at least a major and minor number (3.2 for example), a .0 is appended to a lone, single digit that may show up in the version string.
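To illustrate why the [version] cast matters (hypothetical version numbers): a plain string sort would order 1.0.0.100 before 1.0.0.99, whereas [version] compares field by field:

'1.0.0.100' -gt '1.0.0.99'                     # False - lexical string comparison
[version]'1.0.0.100' -gt [version]'1.0.0.99'   # True  - numeric, field-by-field comparison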
Resolve-Path is just there to return the relative path. It is not strictly necessary: you could remove that command and instead use Select-Object -First 1 -ExpandProperty FullName, as shown below.
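For example, that alternative (a sketch reusing the same pattern as above) would look like:

$exe = Get-ChildItem -Path "x64.Staging.[0-9]*.ClientBootstrapper.exe" -File |
    Sort-Object {
        [version]($_.Name -replace 'x64\.Staging\.([0-9\.]+)\.ClientBootstrapper\.exe', '$1' -replace '^\d+$', '$0.0')
    } -Descending |
    Select-Object -First 1 -ExpandProperty FullName
& $exe '/qn'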
Git is messing up a few files: it saves the Unix-based files in LF format on the system.
There are also a few Windows files that get saved in CRLF format.
I need to differentiate between the Unix-based files and the Windows-based files.
I was able to successfully write the code below for one text file. It returned True because it is a Windows file.
PS C:\Desktop\SecretSauce> (GET-CONTENT 'HIDEME.TXT' -raw) -match "\r\n$"
Question:
There are thousands of files in different formats (txt, cpp, hpp, sql), in both LF and CRLF form, in the same location.
I need output with the path of the file, the filename with extension, and True (if it is CRLF) or False (if it is LF).
When I execute this command to check multiple files, it does not return any result.
(Get-Content -Path 'C:\Desktop\SecretSauce\*.*' -raw) -match "\r\n$"
What is the best approach for this using PowerShell?
It's an expensive approach, because each file is read in full, but the following should do what you want:
Get-ChildItem -File -Path H:\Desktop\Parent_Folder\Sub-Folder2\*.* |
    ForEach-Object {
        [pscustomobject] @{
            HasCRLF  = (Get-Content -Raw -LiteralPath $_.FullName) -match '\r\n'
            Name     = $_.Name
            FullName = $_.FullName
        }
    }
You'll see output such as the following:
HasCRLF Name    FullName
------- ----    --------
  False foo.txt H:\Desktop\Parent_Folder\Sub-Folder2\foo.txt
   True bar.txt H:\Desktop\Parent_Folder\Sub-Folder2\bar.txt
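If the files are spread across subfolders, or you want to limit the scan to the extensions you mentioned, you could (as a sketch, with a hypothetical output path) add -Recurse and -Include, and keep the results with Export-Csv:

Get-ChildItem -File -Recurse -Path C:\Desktop\SecretSauce -Include *.txt, *.cpp, *.hpp, *.sql |
    ForEach-Object {
        [pscustomobject] @{
            HasCRLF  = (Get-Content -Raw -LiteralPath $_.FullName) -match '\r\n'
            Name     = $_.Name
            FullName = $_.FullName
        }
    } |
    Export-Csv C:\Desktop\line-endings.csv -NoTypeInformation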
I am trying to create a separate script that will edit the file and add the year to the contents. How would I be able to do that?
$content = "List of running Services"
$content | Out-File C:\Windows\Temp\test
$textfile = Get-Content C:\Windows\Temp\test
Write-Host $textfile
Continuing from my comments.
What you need is covered in the PowerShell help files.
Use the Add-Content cmdlet; its help topic includes examples that do exactly this:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/add-content?view=powershell-7.1
# Example 1: Add a string to all text files with an exception
Add-Content -Path .\*.txt -Exclude help* -Value 'End of file'
# Example 2: Add a date to the end of the specified files
This example appends the date to files in the current directory and displays the date in the PowerShell console.
Add-Content -Path .\DateTimeFile1.log, .\DateTimeFile2.log -Value (Get-Date) -PassThru
The above examples show how to add strings to a file and how to use the date. You can combine them: use Example 2 to get your date info and Example 1 to append that date to the file contents.
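For the year specifically, a minimal sketch (using the file from your own snippet, C:\Windows\Temp\test) could look like this:

# Append just the current year to the end of the file
Add-Content -Path C:\Windows\Temp\test -Value (Get-Date).Year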
I have a simple PowerShell script that replaces "false" or "true" with "0" or "1":
$InputFolder = $args[0];
if ($InputFolder.Length -lt 3)
{
    Write-Host "Enter a path name as your first argument" -ForegroundColor Red
    return
}
if (-not (Test-Path $InputFolder)) {
    Write-Host "File path does not appear to be valid" -ForegroundColor Red
    return
}
Get-ChildItem $InputFolder
$content = [System.IO.File]::ReadAllText($InputFolder).Replace("`"false`"", "`"0`"").Replace("`"true`"", "`"1`"").Replace("`"FALSE`"", "`"0`"").Replace("`"TRUE`"", "`"1`"")
[System.IO.File]::WriteAllText($InputFolder, $content)
[GC]::Collect()
This works fine for almost all files I have to amend, with the exception of one 808MB CSV.
I have no idea how many lines are in this CSV, as nothing I have will open it properly.
Interestingly, the PowerShell script will complete successfully when invoked manually via either PowerShell directly or via command prompt.
When this is launched as part of the SSIS package it's required for, that's when the error happens.
Sample data for the file:
"RowIdentifier","DateProfileCreated","IdProfileCreatedBy","IDStaffMemberProfileRole","StaffRole","DateEmploymentStart","DateEmploymentEnd","PPAID","GPLocalCode","IDStaffMember","IDOrganisation","GmpID","RemovedData"
"134","09/07/1999 00:00","-1","98","GP Partner","09/07/1999 00:00","14/08/2009 15:29","341159","BRA 871","141","B83067","G3411591","0"
Error message thrown:
I'm not tied to PowerShell - I'm open to other options. I had a cribbed-together C# script previously, but that died on smaller files than this - I'm no C# developer, so I was unable to debug it at all.
Any suggestions or help gratefully received.
Generally, avoid reading large files in full all at once, as you can run out of memory, as you've experienced.
Instead, process text-based files line by line - both reading and writing.
While PowerShell generally excels at line-by-line (object-by-object) processing, it is slow with files that have many lines.
Using the .NET Framework directly - while more complex - offers much better performance.
If you process the input file line by line, you cannot directly write back to it and must instead write to a temporary output file, which you can replace the input file with on success.
Here's a solution that uses .NET types directly for performance reasons:
# Be sure to use a *full* path, because .NET typically doesn't have the same working dir. as PS.
$inFile = Convert-Path $Args[0]
$tmpOutFile = [io.path]::GetTempFileName()
$tmpOutFileWriter = [IO.File]::CreateText($tmpOutFile)
foreach ($line in [IO.File]::ReadLines($inFile)) {
    $tmpOutFileWriter.WriteLine(
        $line.Replace('"false"', '"0"').Replace('"true"', '"1"').Replace('"FALSE"', '"0"').Replace('"TRUE"', '"1"')
    )
}
$tmpOutFileWriter.Dispose()
# Replace the input file with the temporary file.
# !! BE SURE TO MAKE A BACKUP COPY FIRST.
# -WhatIf *previews* the move operation; remove it to perform the actual move.
Move-Item -Force -LiteralPath $tmpOutFile $inFile -WhatIf
Note:
UTF-8 encoding is assumed, and the rewritten file will not have a BOM. You can change this by specifying the desired encoding to the .NET methods.
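For instance, a sketch of the same loop with an explicit encoding (here UTF-8 with a BOM, as an assumption about what you may want) swaps in a StreamWriter and passes the encoding to ReadLines():

$enc = [Text.UTF8Encoding]::new($true)   # $true = emit a BOM; swap in another System.Text encoding if needed
$tmpOutFileWriter = [IO.StreamWriter]::new($tmpOutFile, $false, $enc)   # $false = overwrite, don't append
foreach ($line in [IO.File]::ReadLines($inFile, $enc)) {
    $tmpOutFileWriter.WriteLine(
        $line.Replace('"false"', '"0"').Replace('"true"', '"1"').Replace('"FALSE"', '"0"').Replace('"TRUE"', '"1"')
    )
}
$tmpOutFileWriter.Dispose()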
As an aside: Your chain of .Replace() calls on each input line can be simplified as follows, using PowerShell's -replace operator, which is case-insensitive, so only 2 replacements are needed:
$line -replace '"false"', '"0"' -replace '"true"', '"1"'
However, while that is shorter to write, it is actually slower than the .Replace() call chain, presumably because -replace is regex-based, which incurs extra processing.
You could read the file in chunks with Get-Content -ReadCount, Out-File the result to a temp file, then delete the old file and Rename-Item the temp file to the old file's name.
A few small things would need fixing: this adds a new empty line at the end of the file, and it changes the encoding. You could try to detect the current file's encoding and set it via Out-File -Encoding.
function Replace-LargeFilesInFolder {
    Param(
        [string]$DirectoryPath,
        [string]$OldString,
        [string]$NewString,
        [string]$TempExtension = "temp",
        [int]$LinesPerRead = 500
    )
    Get-ChildItem $DirectoryPath -File | ForEach-Object {
        $File = $_
        Get-Content $_.FullName -ReadCount $LinesPerRead | ForEach-Object {
            $_ -replace $OldString, $NewString |
                Out-File "$($File.FullName).$($TempExtension)" -Append
        }
        Remove-Item $File.FullName
        Rename-Item "$($File.FullName).$($TempExtension)" -NewName $($File.FullName)
    }
}
Replace-LargeFilesInFolder -DirectoryPath C:\TEST -LinesPerRead 1 -OldString "a" -NewString "5"
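Applied to the false/true replacement from the question (a sketch; note that -replace is case-insensitive, so "FALSE"/"TRUE" are covered by the same calls):

Replace-LargeFilesInFolder -DirectoryPath C:\TEST -OldString '"false"' -NewString '"0"'
Replace-LargeFilesInFolder -DirectoryPath C:\TEST -OldString '"true"' -NewString '"1"'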
I know PowerShell is up to v5, but as I am new to PowerShell, I've been looking through Stack Overflow to put together the script I have. I've found that I need a generic, non-version-specific way of accomplishing this process...
Here is the issue - Step 1 - I'm pulling application installation location information from the registry and using a temporary file to house the results.
dir "HKLM:\SOFTWARE\Wow6432Node\companyname" | Get-ItemProperty | Select installdir | Out-File "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-Installs.txt"
This gives me a list of installation directories for the company's software that is installed on a particular machine. I then want to take these results, append *.config to each line as well as *.xml to each line, and output everything to a new text file.
The input for the process would be the contents of the initial results file, and the output file should contain each line from the first results twice: once with *.xml appended and once with *.config appended.
The net effect I am looking for is the creation of a #file for a 7z command. I am attempting this by using the following -
(Get-Content "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-Installs.txt") -replace '\S+$','$&*.config' | Out-File "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-config.txt" -Encoding utf8
(Get-Content "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-Installs.txt") -replace '\S+$','$&*.xml' | Out-File "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-config.txt" -Append -Encoding utf8
However, I'm only getting one line that has *.xml and one line that has *.config appended -
After getting this far, I'm thinking that some foreach loop is needed, but I'm not getting anywhere with what I have tried adapting from here. I'm now looking for a way to combine the three lines into one function, if that is possible, and to eliminate the temporary-file step in the first command by reading and outputting in the same step. This would also need to remove the "installdir" and "----------" lines from the output. Does anyone have some ideas and maybe examples?
Taking your above command dir "HKLM:\SOFTWARE\Wow6432Node\companyname" | Get-ItemProperty | Select installdir | Out-File "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-Installs.txt", you could put the result of your query into a variable $result:
$result = dir "HKLM:\SOFTWARE\Wow6432Node\microsoft" | Get-ItemProperty | Select installdir;
From there you can easily loop through the array, skipping empty entries and processing the rest:
foreach ($path in $result.installdir)
{
    # skip empty paths
    if ([string]::IsNullOrWhiteSpace($path)) { continue; }

    # now do your processing ...
    $path;
}
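For the processing itself, here is one sketch (reusing the output path from your own commands, and assuming installdir holds plain directory paths) that writes both wildcard variants per directory:

$outFile = "$env:USERPROFILE\Desktop\KDI-Admin\Export\$env:COMPUTERNAME-SC-config.txt"
$result.installdir |
    Where-Object { -not [string]::IsNullOrWhiteSpace($_) } |
    ForEach-Object {
        Join-Path $_ '*.config'   # e.g. <installdir>\*.config
        Join-Path $_ '*.xml'      # e.g. <installdir>\*.xml
    } |
    Out-File $outFile -Encoding utf8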
Is this what you were asking for?