I have the code below and currently it displays all the information on screen. I want it to log to a log file in D:\Apps\Logs.
The log file needs to be named after the computer it runs against - so COMPUTERNAME.log.
Any idea how I can do this?
Thanks
$computer = gc env:computername
$onetcp = ((get-childitem c:\windows\system32\drivers\tcpip.sys).Versioninfo.ProductMajorPart).tostring()
$twotcp = ((get-childitem c:\windows\system32\drivers\tcpip.sys).Versioninfo.ProductMinorPart).tostring()
$threetcp = ((get-childitem c:\windows\system32\drivers\tcpip.sys).Versioninfo.ProductBuildPart).tostring()
$fourtcp = ((get-childitem c:\windows\system32\drivers\tcpip.sys).Versioninfo.ProductPrivatePart).tostring()
$onedfsr = ((get-childitem c:\windows\system32\dfsrs.exe).Versioninfo.ProductMajorPart).tostring()
$twodfsr = ((get-childitem c:\windows\system32\dfsrs.exe).Versioninfo.ProductMinorPart).tostring()
$threedfsr = ((get-childitem c:\windows\system32\dfsrs.exe).Versioninfo.ProductBuildPart).tostring()
$fourdfsr = ((get-childitem c:\windows\system32\dfsrs.exe).Versioninfo.ProductPrivatePart).tostring()
write-host "TCPIP.sys Version on $computer is: $onetcp.$twotcp.$threetcp.$fourtcp"
Write-Host
write-host "DFSRS.exe Version on $computer is: $onedfsr.$twodfsr.$threedfsr.$fourdfsr"
Write-Host
If (get-wmiobject win32_share | where-object {$_.Name -eq "REMINST"}) { Write-Host "The REMINST share exists on $computer" } Else { Write-Host "The REMINST share DOES NOT exist on $computer - Please create as per standards" }
Write-Host
$hotfix1 = Get-HotFix -Id KB2450944 -ErrorAction SilentlyContinue
$hotfix2 = Get-HotFix -Id KB2582284 -ErrorAction SilentlyContinue
$hotfix3 = Get-HotFix -Id KB979808 -ErrorAction SilentlyContinue
If ($hotfix1) { Write-Host "Hotfix KB2450944 is installed" -BackgroundColor Green -ForegroundColor Black } else { Write-Host "Hotfix KB2450944 is NOT installed - Please ensure you install this hotfix" -ForegroundColor "red" }
If ($hotfix2) { Write-Host "Hotfix KB2582284 is installed" -BackgroundColor Green -ForegroundColor Black } else { Write-Host "Hotfix KB2582284 is NOT installed - Please ensure you install this hotfix" -ForegroundColor "red" }
If ($hotfix3) { Write-Host "Hotfix KB979808 is installed" -BackgroundColor Green -ForegroundColor Black } else { Write-Host "Hotfix KB979808 is NOT installed - Please ensure you install this hotfix" -ForegroundColor "red" }
Put this at the top of your file:
$Logfile = "D:\Apps\Logs\$(gc env:computername).log"
Function LogWrite
{
Param ([string]$logstring)
Add-content $Logfile -value $logstring
}
Then replace your Write-Host calls with LogWrite.
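For example, one of the question's Write-Host lines would then become something like this (a sketch reusing the variables already defined in the script above):
LogWrite "TCPIP.sys Version on $computer is: $onetcp.$twotcp.$threetcp.$fourtcp"
LogWrite "The REMINST share exists on $computer"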
A function that takes these principles a little further:
Adds timestamps - you can't have a log without timestamps.
Adds a level (INFO by default), so you can highlight big issues.
Allows for optional console output: if you don't set a log destination, it simply writes the line to the output stream.
Function Write-Log {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$False)]
        [ValidateSet("INFO","WARN","ERROR","FATAL","DEBUG")]
        [String]
        $Level = "INFO",

        [Parameter(Mandatory=$True)]
        [string]
        $Message,

        [Parameter(Mandatory=$False)]
        [string]
        $logfile
    )

    $Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")
    $Line = "$Stamp $Level $Message"
    If ($logfile) {
        Add-Content $logfile -Value $Line
    }
    Else {
        Write-Output $Line
    }
}
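A quick usage sketch, reusing the per-computer log path from the question (the path is only an example):
$log = "D:\Apps\Logs\$env:COMPUTERNAME.log"
Write-Log -Message "The REMINST share exists on $env:COMPUTERNAME" -logfile $log
Write-Log -Level WARN -Message "Hotfix KB2450944 is NOT installed" -logfile $log
Write-Log -Message "No -logfile given, so this line goes to the output stream instead"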
I believe this is the simplest way of putting everything that appears on screen into a file. It is a native PowerShell cmdlet, so you don't have to change or install anything in your script:
Start-Transcript -Path Computer.log
Write-Host "everything will end up in Computer.log"
Stop-Transcript
You can also add -Append to append to the existing log instead of overwriting it. (Thanks @scipilot for the tip!)
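To match the original requirement of one log per computer under D:\Apps\Logs, the path can be built from the environment variable (a sketch):
Start-Transcript -Path "D:\Apps\Logs\$env:COMPUTERNAME.log" -Append
Write-Host "everything written to the console ends up in the per-computer log"
Stop-Transcript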
function WriteLog
{
Param ([string]$LogString)
$LogFile = "C:\$(gc env:computername).log"
$DateTime = "[{0:MM/dd/yy} {0:HH:mm:ss}]" -f (Get-Date)
$LogMessage = "$Datetime $LogString"
Add-content $LogFile -value $LogMessage
}
WriteLog "This is my log message"
Using this Log-Entry framework:
Script:
Function Main {
    Log -File "D:\Apps\Logs\$Env:computername.log"
    $tcp = (get-childitem c:\windows\system32\drivers\tcpip.sys).Versioninfo.ProductVersionRaw
    $dfs = (get-childitem C:\Windows\Microsoft.NET\Framework\v2.0.50727\dfsvc.exe).Versioninfo.ProductVersionRaw
    Log "TCPIP.sys Version on $computer is:" $tcp
    Log "DFSVC.exe Version on $computer is:" $dfs
    If (get-wmiobject win32_share | where-object {$_.Name -eq "REMINST"}) {Log "The REMINST share exists on $computer"}
    Else {Log "The REMINST share DOES NOT exist on $computer - Please create as per standards"}
    "KB2450944", "KB3150513", "KB3176935" | ForEach {
        $hotfix = Get-HotFix -Id $_ -ErrorAction SilentlyContinue
        If ($hotfix) {Log -Color Green Hotfix $_ is installed}
        Else {Log -Color Red Hotfix $_ " is NOT installed - Please ensure you install this hotfix"}
    }
}
Screen output:
Log File (at D:\Apps\Logs\<computername>.log):
2017-05-31 Write-Log (version: 01.00.02, PowerShell version: 5.1.14393.1198)
19:19:29.00 C:\Users\User\PowerShell\Write-Log\Check.ps1
19:19:29.47 TCPIP.sys Version on is: {Major: 10, Minor: 0, Build: 14393, Revision: 1066, MajorRevision: 0, MinorRevision: 1066}
19:19:29.50 DFSVC.exe Version on is: {Major: 2, Minor: 0, Build: 50727, Revision: 8745, MajorRevision: 0, MinorRevision: 8745}
19:19:29.60 The REMINST share DOES NOT exist on - Please create as per standards
Error at 25,13: Cannot find the requested hotfix on the 'localhost' computer. Verify the input and run the command again.
19:19:33.41 Hotfix KB2450944 is NOT installed - Please ensure you install this hotfix
19:19:37.03 Hotfix KB3150513 is installed
19:19:40.77 Hotfix KB3176935 is installed
19:19:40.77 End
Gist with log rotation: https://gist.github.com/barsv/85c93b599a763206f47aec150fb41ca0
Usage:
. .\logger.ps1
Write-Log "debug message"
Write-Log "info message" "INFO"
You might just want to use the new TUN.Logging PowerShell module, which can also send a log mail. Just use the Start-Log and/or Start-MailLog cmdlets to start logging, then use Write-HostLog, Write-WarningLog, Write-VerboseLog, Write-ErrorLog etc. to write to the console and log file/mail. Call Send-Log and/or Stop-Log at the end and voilà, you've got your logging.
Just install it from the PowerShell Gallery via
Install-Module -Name TUN.Logging
Or just follow the link: https://www.powershellgallery.com/packages/TUN.Logging
Documentation of the module can be found here: https://github.com/echalone/TUN/blob/master/PowerShell/Modules/TUN.Logging/TUN.Logging.md
I've been playing with this code for a while now and I have something that works well for me. Log files are numbered with a leading '0' but retain their file extension. I know everyone likes to make functions for everything, but I started to remove functions that performed one simple task. Why use many word when few do trick? I will likely remove other functions and perhaps create functions out of other blocks. I keep the logger script in a central share and make a local copy if it has changed, or load it from the central location if needed.
First I import the logger:
#Change directory to the script root
cd $PSScriptRoot
#Make a local copy if changed then Import logger
if(test-path "D:\Scripts\logger.ps1"){
if (Test-Path "\\<server>\share\DCS\Scripts\logger.ps1") {
if((Get-FileHash "\\<server>\share\DCS\Scripts\logger.ps1").Hash -ne (Get-FileHash "D:\Scripts\logger.ps1").Hash){
rename-Item -path "..\logger.ps1" -newname "logger$(Get-Date -format 'yyyyMMdd-HH.mm.ss').ps1" -force
Copy-Item "\\<server>\share\DCS\Scripts\logger.ps1" -destination "..\" -Force
}
}
}else{
Copy-Item "\\<server>\share\DCS\Scripts\logger.ps1" -destination "..\" -Force
}
. "..\logger.ps1"
Define the log file:
$logfile = (get-location).path + "\Log\" + $QProfile.replace(" ","_") + "-$metricEnv-$ScriptName.log"
What I log depends on debug levels that I created:
if ($Debug -ge 1){
$message = "<$pid>Debug:$Debug`-Adding tag `"MetricClass:temp`" to $host_name`:$metric_name"
Write-Log $message $logfile "DEBUG"
}
I would probably consider myself a bit of a "hack" when it comes to coding so this might not be the prettiest but here is my version of logger.ps1:
# all logging settings are here on top
param(
[Parameter(Mandatory=$false)]
[string]$logFile = "$(gc env:computername).log",
[Parameter(Mandatory=$false)]
[string]$logLevel = "DEBUG", # ("DEBUG","INFO","WARN","ERROR","FATAL")
[Parameter(Mandatory=$false)]
[int64]$logSize = 10mb,
[Parameter(Mandatory=$false)]
[int]$logCount = 25
)
# end of settings
function Write-Log-Line ($line, $logFile) {
$logFile | %{
If (Test-Path -Path $_) { Get-Item $_ }
Else { New-Item -Path $_ -Force }
} | Add-Content -Value $Line -ErrorAction SilentlyContinue
}
function Roll-logFile
{
#Checks whether the file in question is larger than the specified maximum size; if it is, it will roll the log and delete the oldest log if there are more than the configured number of logs.
param(
[string]$fileName = (Get-Date).toString("yyyy/MM/dd HH:mm:ss")+".log",
[int64]$maxSize = $logSize,
[int]$maxCount = $logCount
)
$logRollStatus = $true
if(test-path $filename) {
$file = Get-ChildItem $filename
# Start the log-roll if the file is big enough
#Write-Log-Line "$Stamp INFO Log file size is $($file.length), max size $maxSize" $logFile
#Write-Host "$Stamp INFO Log file size is $('{0:N0}' -f $file.length), max size $('{0:N0}' -f $maxSize)"
if($file.length -ge $maxSize) {
Write-Log-Line "$Stamp INFO Log file size $('{0:N0}' -f $file.length) is larger than max size $('{0:N0}' -f $maxSize). Rolling log file!" $logFile
#Write-Host "$Stamp INFO Log file size $('{0:N0}' -f $file.length) is larger than max size $('{0:N0}' -f $maxSize). Rolling log file!"
$fileDir = $file.Directory
$fbase = $file.BaseName
$fext = $file.Extension
$fn = $file.name #this gets the name of the file we started with
function refresh-log-files {
Get-ChildItem $filedir | ?{ $_.Extension -match "$fext" -and $_.name -like "$fbase*"} | Sort-Object lastwritetime
}
function fileByIndex($index) {
$fileByIndex = $files | ?{($_.Name).split("-")[-1].trim("$fext") -eq $($index | % tostring 00)}
#Write-Log-Line "LOGGER: fileByIndex = $fileByIndex" $logFile
$fileByIndex
}
function getNumberOfFile($theFile) {
$NumberOfFile = $theFile.Name.split("-")[-1].trim("$fext")
if ($NumberOfFile -match '[a-z]'){
$NumberOfFile = "01"
}
#Write-Log-Line "LOGGER: GetNumberOfFile = $NumberOfFile" $logFile
$NumberOfFile
}
refresh-log-files | %{
[int32]$num = getNumberOfFile $_
Write-Log-Line "LOGGER: checking log file number $num" $logFile
if ([int32]$($num | % tostring 00) -ge $maxCount) {
write-host "Deleting files above log max count $maxCount : $_"
Write-Log-Line "LOGGER: Deleting files above log max count $maxCount : $_" $logFile
Remove-Item $_.fullName
}
}
$files = @(refresh-log-files)
# Now there should be at most $maxCount files, and the highest number is one less than count, unless there are badly named files, eg non-numbers
for ($i = $files.count; $i -gt 0; $i--) {
$newfilename = "$fbase-$($i | % tostring 00)$fext"
#$newfilename = getFileNameByNumber ($i | % tostring 00)
if($i -gt 1) {
$fileToMove = fileByIndex($i-1)
} else {
$fileToMove = $file
}
if (Test-Path $fileToMove.PSPath) { # If there are holes in sequence, file by index might not exist. The 'hole' will shift to next number, as files below hole are moved to fill it
write-host "moving '$fileToMove' => '$newfilename'"
#Write-Log-Line "LOGGER: moving $fileToMove => $newfilename" $logFile
# $fileToMove is a System.IO.FileInfo, but $newfilename is a string. Move-Item takes a string, so we need full path
Move-Item ($fileToMove.FullName) -Destination $fileDir\$newfilename -Force
}
}
} else {
$logRollStatus = $false
}
} else {
$logrollStatus = $false
}
$LogRollStatus
}
Function Write-Log {
[CmdletBinding()]
Param(
[Parameter(Mandatory=$True)]
[string]
$Message,
[Parameter(Mandatory=$False)]
[String]
$logFile = "log-$(gc env:computername).log",
[Parameter(Mandatory=$False)]
[String]
$Level = "INFO"
)
#Write-Host $logFile
$levels = ("DEBUG","INFO","WARN","ERROR","FATAL")
$logLevelPos = [array]::IndexOf($levels, $logLevel)
$levelPos = [array]::IndexOf($levels, $Level)
$Stamp = (Get-Date).toString("yyyy/MM/dd HH:mm:ss:fff")
# First roll the log if needed; assign to $Null to suppress output
$Null = @(
Roll-logFile -fileName $logFile -maxSize $logSize -maxCount $logCount
)
if ($logLevelPos -lt 0){
Write-Log-Line "$Stamp ERROR Wrong logLevel configuration [$logLevel]" $logFile
}
if ($levelPos -lt 0){
Write-Log-Line "$Stamp ERROR Wrong log level parameter [$Level]" $logFile
}
# if level parameter is wrong or configuration is wrong I still want to see the
# message in log
if ($levelPos -lt $logLevelPos -and $levelPos -ge 0 -and $logLevelPos -ge 0){
return
}
$Line = "$Stamp $Level $Message"
Write-Log-Line $Line $logFile
}
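A usage sketch, assuming logger.ps1 sits next to the calling script; the log path is only an example and is passed explicitly because the Write-Log function has its own default:
. .\logger.ps1
Write-Log "Cleanup started" -logFile "D:\Apps\Logs\$env:COMPUTERNAME.log"
Write-Log "Could not reach the file share" -Level "ERROR" -logFile "D:\Apps\Logs\$env:COMPUTERNAME.log"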
I'm using this code to delete files older than 30 days
Function Remove_FilesCreatedBeforeDate {
$Path = "\\servername\path"
$Date = (Get-Date).AddDays(-30)
$ValidPath = Test-Path $Path -IsValid
If ($ValidPath -eq $True) {
"Path is OK and Cleanup is now running"
Get-ChildItem -Path $path -Recurse | Where-Object { $_.LastWriteTime -lt $Date } | Remove-Item -Recurse -force -Verbose
}
Else {
"Path is not a ValidPath"
}
}
Remove_FilesCreatedBeforeDate
Now I want to log which files were deleted, and also whether there was an error or the path isn't valid. Can anyone help me here?
EDIT:
I'm now using this code (thanks to Efie for helping):
function Write-MyLog {
    [Cmdletbinding()]
    param(
        [Parameter()]$LogPath = 'C:\Admin\scripts\Clean_Folder\Log\log.txt',
        [Parameter(ValueFromPipeline)]$Message
    )
    process {
        $timeStampedMessage = "[$(Get-Date -Format 's')] $Message"
        $timeStampedMessage | Out-File -FilePath $LogPath -Append
    }
}
Function Remove-FilesCreatedBeforeDate {
[Cmdletbinding()]
param(
[Parameter()]$Path = '\\servername\path\',
[Parameter()]$Date = $(Get-Date).AddDays(-30)
)
process {
if(-not (Test-Path $Path -IsValid)) {
"Path $Path was invalid" | Write-MyLog
return
}
"Path $Path is OK and Cleanup is now running" | Write-MyLog
try {
Get-ChildItem -Path $Path -Recurse |
Where-Object {
$_.LastWriteTime -lt $Date
} | Remove-Item -recurse -force -verbose | Write-MyLog
}
catch {
"Remove-Item failed with message $($_.Exception.Message)" | Write-MyLog
}
}
}
Write-MyLog
Remove-FilesCreatedBeforeDate
Two files are getting deleted, but I just see this in my log:
[2021-07-22T16:27:53] Path \\servername\path\ is OK and Cleanup is now running
Sadly, I don't see which files are getting deleted.
A simple implementation for your example would be something like this:
Function Remove-FilesCreatedBeforeDate {
[Cmdletbinding()]
param(
[Parameter(Mandatory)]$Path = '\some\default\path',
[Parameter()]$Date = $(Get-Date).AddDays(-30)
)
process {
if(-not (Test-Path $Path -IsValid)) {
"Path $Path was invalid" | Write-MyLog
return
}
"Path $Path is OK and Cleanup is now running" | Write-MyLog
try {
Get-ChildItem -Path $Path -Recurse |
Where-Object {
$_.LastWriteTime -lt $Date
} | Remove-Item -Recurse -Force -Verbose
}
catch {
"Remove-Item failed with message $($_.Exception.Message)" | Write-MyLog
}
}
}
function Write-MyLog {
[Cmdletbinding()]
param(
[Parameter()]$LogPath = 'default\log\path\log.txt',
[Parameter(ValueFromPipeline)]$Message
)
process {
$timeStampedMessage = "[$(Get-Date -Format 's')] $Message"
$timeStampedMessage | Out-File -FilePath $LogPath -Append
}
}
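If you also want the names of the deleted files in the log (the part missing from your output), one hedged variation is to log each file before removing it instead of piping Remove-Item to the logger, because Remove-Item writes nothing to the success stream:
    try {
        Get-ChildItem -Path $Path -Recurse |
            Where-Object {
                $_.LastWriteTime -lt $Date
            } | ForEach-Object {
                # log the full name first, then delete the item
                "Removing $($_.FullName)" | Write-MyLog
                Remove-Item -LiteralPath $_.FullName -Recurse -Force
            }
    }
    catch {
        "Remove-Item failed with message $($_.Exception.Message)" | Write-MyLog
    }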
Some notes:
Advanced Functions
process { }, [Cmdletbinding()], and [Parameter()] are what turn your function into an 'advanced' function. You get to use loads of built-in features normally reserved for compiled cmdlets this way.
For example, you can now suppress errors with the common -ErrorAction SilentlyContinue parameter, like you're used to doing with native PowerShell cmdlets.
You can pipe your messages to your logging function by adding ValueFromPipeline to your parameter.
Those really just scratch the surface of the extra capabilities you get.
Here is some information. I would recommend getting in the habit of writing them like this if you plan to use them in the future.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_functions_advanced?view=powershell-7.1
Error Handling
I'd recommend looking into this documentation by Microsoft on error handling:
https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-exceptions?view=powershell-7.1
Naming Conventions
I would also recommend taking a look at this about PowerShell function naming conventions:
https://learn.microsoft.com/en-us/powershell/scripting/developer/cmdlet/approved-verbs-for-windows-powershell-commands?view=powershell-7
By PowerShell standards it would make more sense to name your function Remove-FilesCreatedBeforeDate, with a dash separating the verb and noun instead of an underscore.
Logging
If you want a little more control and a few more features for logging your functions, here is some information on a tried and true solution for PowerShell using PSFramework:
https://adamtheautomator.com/powershell-logging/
Good luck! Hope some of that helps.
In Unix it's simple:
find /var/log/hive -type f -mtime +30 -delete
Could Start-Transcript with try/catch be your solution here?
Start-Transcript logs everything that you do, including the errors.
I tried this and it does what you want:
Start-Transcript -Path "$PSScriptRoot\RemoveAccountLog.txt" -Force -Append
Get-Date -Format "yyyy-MM-dd HH:mm"
Try
{ # Start Try
$Path = "\\servername\path"
$Date = (Get-Date).AddDays(-30)
$TestPath = Test-Path -Path $Path -PathType Container
If ( $TestPath -eq $False )
{ # Start If
Write-Host "The path $Path is not valid"
} # End If
Else
{ # Start Else
Write-host "Path is OK and Cleanup is now running... 0%"
$GetFiles = Get-ChildItem -Path $Path -Recurse -Force |
Where-Object { $_.LastWriteTime -lt $Date } |
Remove-Item -Recurse -force -Verbose |
Write-host "Path is OK and Cleanup is now running... 100%" -ForegroundColor Green
} # End Else
} # End Try
Catch
{ # Start Catch
Write-Warning -Message "## ERROR## "
Write-Warning -Message "## Script could not start ## "
Write-Warning $Error[0]
} # End Catch
I configured a custom PowerShell task to analyze the code coverage of an Azure DevOps repository. The steps are:
Search for specific test assemblies (*Test.dll)
Run coverlet and pass the test assemblies
Check whether $LASTEXITCODE is not equal to 2 (coverage lower than the threshold)
If $LASTEXITCODE is equal to 2:
Run ReportGenerator with the coverlet cobertura summary
Send an e-mail with the reports to the last committer (obtained via git --git-dir=$git log -1 --format="%ae")
The problem I have:
The committer is not interested in the code coverage of the whole repository; he wants to know the coverage of his commit.
What I'm trying to achieve:
How can I check whether a commit contains a test assembly or not? I want to analyze only the test assemblies of the last commit.
If there is no test assembly: do nothing
If there are test assemblies: analyze only this specific one and inform the developer about his code coverage
PowerShell Script:
param([string]$Root, [int]$Threshold = 80, [string]$FromMail, [string]$Output = "Report", [string[]]$Include = @("*Tests.dll"), [string[]]$Exclude)
#VARIABLES
$format = "cobertura" #FORMAT OF THE GENERATED COVERAGE REPORT (json [default]/lcov/opencover/cobertura/teamcity)
$thresholdType = "line" #COVERAGE TYPE TO APPLY THE THRESHOLD TO (line/branch/method)
$coverletOutput = "cobertura.xml" #OUTPUT OF THE GENERATED COVERAGE REPORT
$reportTypes = "HtmlInline_AzurePipelines;Cobertura" #THE OUTPUT FORMATS AND SCOPE (SEPARATED BY SEMICOLON) (Badges/Cobertura/CsvSummary/Html/HtmlChart/HtmlInline/HtmlInline_AzurePipelines/HtmlInline_AzurePipelines_Dark/HtmlSummary/Latex/LatexSummary/MHtml/PngChart/SonarQube/TeamCitySummary/TextSummary/Xml/XmlSummary)
#CODE COVERAGE SCRIPT
#-----------------------------------------------------------------------------------------------------#
##The script should analyze the code coverage of a test assembly and create a `.xml` report.
##Requeried tools: [coverlet](https://github.com/tonerdo/coverlet/blob/master/Documentation/GlobalTool.md), [ReportGenerator](https://automationrhapsody.com/code-coverage-manual-automated-tests-opencover-net-applications/), [git](https://git-scm.com/downloads)
##Root = is the directory where the script seeks recursively for files with `$Include` patterns
##Threshold = is the threshold of the code coverage that will accept the test
##FromMail = is the mail address from which the script should send the coverage warning
##Output = is the output path of the `.xml` report file
##Include = is a pattern list for the recursive search of test assemblies which should be included for the code coverage analysis (for instance `@("*Tests.dll", "*Unit.dll")`)
##Exclude = is a pattern list of subdirectories which should be excluded for the code coverage analysis (for instance `@("*\obj\*", "*\Release\*")`)
#-----------------------------------------------------------------------------------------------------#
#JOIN INCLUDE & EXCLUDE FOR PRINTS
$includeJoin = $($Include -join "', '")
$excludeJoin = $($Exclude -join "', '")
Write-Host "Root:`t`t$Root`nThreshold:`t$Threshold`nFromMail:`t$FromMail`nOutput:`t$Output`nInclude:`t'$includeJoin'`nExclude:`t'$excludeJoin'"
#CHECK ARGUMENTS
if ($Root -eq "" -or $Threshold -lt 0 -or $FromMail -eq "" -or $Output -eq "" -or $null -eq $Include) {
Write-Host "##vso[task.logissue type=error;][ps1] error: missing root directory, coverage threshold, output directory or include pattern list of unit test .dll," -ForegroundColor Red
exit(-1)
}
if ($null -eq $Exclude) { $Exclude = @() }
#CHECK VALID E-MAIL
try { $_ = new-object net.mail.mailaddress($FromMail) }
catch { Write-Host "##vso[task.logissue type=error;][ps1] error: invalid mail address '$FromMail'" -ForegroundColor Red; exit(-1) }
#CHECK COMMANDS
[string[]] $cmds = "coverlet", "reportgenerator", "git"
foreach ($cmd in $cmds) {
if (Get-Command $cmd -errorAction SilentlyContinue) { Write-Host "[$cmd] path: '$($(Get-Command $cmd).Path)'" -ForegroundColor Green }
else { Write-Host "##vso[task.logissue type=error;][$cmd] error: '$cmd' command not exist" -ForegroundColor Red; exit(-1) }
}
#SET $PWD
Set-Location -Path $Root
#FIND GIT REPOSITORY (FOR COMMIT & E-MAIL)
$git = Get-ChildItem $pwd -Include ".git" -Recurse -Directory -Force -ErrorAction SilentlyContinue | Select-Object -First 1
if ($null -eq $git) { Write-Host "##vso[task.logissue type=error;][git] error: missing repository in directory '$($pwd.Path)' and his subdirectories" -ForegroundColor Red; exit(-1) }
#SEARCH FOR $INCLUDE FILES IN $ROOT
Write-Host "[ps1] search directory: '$Root'" -ForegroundColor Yellow
Write-Host "[ps1] search include: '$includeJoin'" -ForegroundColor Yellow
$files = Get-ChildItem -Path $Root -Include $Include -Recurse -File -Name -ErrorAction SilentlyContinue
#SEARCH FOR $EXCLUDE IN $FILES
$Exclude | Where-Object { $ex = $_; $files = $files | Where-Object { $_ -notlike $ex } }
Write-Host "[ps1] search exclude: '$excludeJoin'" -ForegroundColor Yellow
Write-Host "[ps1] search results:" -ForegroundColor Yellow
$files | Where-Object { Write-Host "`t-$_" -ForegroundColor Gray }
#CHECK FILES FOUND
if ($files.Count -eq 0) { Write-Host "##vso[task.logissue type=error;][ps1] error: no files with include pattern '$includeJoin' found in '$Root'" -ForegroundColor Red; exit(-1) }
#START COVERLET
foreach ($file in $files) {
Write-Host "[coverlet] analyse: '$file'" -ForegroundColor Yellow
$path = '"{0}"' -f $file
coverlet $path --target "dotnet" --targetargs "vstest $path --logger:trx" --format $format --threshold $Threshold --threshold-type $thresholdType --output $coverletOutput
$exitCoverlet = $LASTEXITCODE
Write-Host "[coverlet] exit code for '$file': $exitCoverlet" -ForegroundColor Yellow
if ($exitCoverlet -ne 0) { break }
}
#COVERAGE IS TOO LOW (2)
if ($exitCoverlet -eq 2) {
#START REPORT GENERATOR
reportgenerator -reports:$coverletOutput -reporttypes:$reportTypes -targetdir:$('"{0}"' -f $Output)
$exitReportGenerator = $LASTEXITCODE
Write-Host "[reportgenerator] exit code: $exitReportGenerator" -ForegroundColor Yellow
#SEND MAIL
$from = $FromMail
$to = git --git-dir=$git log -1 --format="%ae"
$attachments = Get-ChildItem -Path "$Output" -Filter *.htm -Recurse | ForEach-Object { $_.FullName }
$index = Get-ChildItem -Path "$Output" -Filter index.htm -Recurse | ForEach-Object { $_.FullName }
$commit = git --git-dir=$git log -p $git -1 --pretty=%B
$subject = "Code Coverage in Commit '$commit'"
$body = "The code coverage of your commit '$commit' is under the threshold of $Threshold %.<br>Show attachments for more details.<br><br>" + $(Get-Content $index)
$smtpServer = "smtp.server.de"
$smtpPort = "25"
Write-Output "##vso[task.logissue type=warning;][ps1] code coverage is to low, send mail to: $to"
Send-MailMessage -From $from -to $to -Subject $subject -Body $body -BodyAsHtml -SmtpServer $smtpServer -port $smtpPort -Attachments $attachments
}
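A call sketch for this script (the file name CodeCoverage.ps1 and all parameter values below are only examples):
.\CodeCoverage.ps1 -Root "C:\agent\_work\1\s" -Threshold 80 -FromMail "build@example.com" -Output "Report" -Include @("*Tests.dll", "*Unit.dll") -Exclude @("*\obj\*", "*\Release\*")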
Azure DevOps Server Version: 17.143.28912.1 (AzureDevOps2019.0.1).
Agent: Self-Hosted Agent vsts-agent-win-x64-2.144.2.
EDIT: Code coverage of committed test assemblies
I modified my first script with the following steps:
read out all changed files from the last git commit
search all project files (*csproj, *vbproj) with filter *Test*
check whether the project file includes a changed file
remove the .proj extension and replace it with .dll
create the assembly path with a given $OutputAssembly (bin\Release) from the user
Snippet:
#GET THE LAST COMMITED FILES
$commitedFiles = git --git-dir=$GitPath diff-tree --no-commit-id --name-only -r $lastCommit
#SEARCH FOR PROJECT FILES IN $PWD WITH FILTER
$Filter = "*Test*"
$projs = Get-ChildItem -Path $pwd -Recurse -Filter $Filter -Include @("*csproj", "*vbproj")
#SEARCH FOR $EXCLUDE IN $FILES
$Exclude = @("*\obj\*")
$Exclude | Where-Object { $ex = $_; $projs = $projs | Where-Object { $_ -notlike $ex } }
Write-Host "[ps1] search exclude: '$excludeJoin'" -ForegroundColor Yellow
Write-Host "[ps1] search results:" -ForegroundColor Yellow
$projs | Where-Object { Write-Host "`t-$_" -ForegroundColor Gray }
#CHECK PROJECT FILES FOUND
if ($projs.Count -eq 0) { Write-Host "##vso[task.logissue type=error;][ps1] error: no projects with filter '$Filter' and include pattern '$includeJoin' found in '$Root'" -ForegroundColor Red; exit(-1) }
#ASSEMBLIES LIST
$assemblies = @()
#LOOP ALL .PROJ FILES
foreach ( $proj in $projs ) {
#LOOP ALL LINES IN .PROJ FILE
foreach ( $line in (Get-Content $proj) ) {
if ( $line -match 'Compile\s+Include="([^"]+)"' ) {
#COMPILED FILE IN .PROJ
$file = Split-Path $matches[1] -Leaf
#LOOP ALL COMMITED FILES
foreach($commitedFile in $commitedFiles){
#GET FILE NAME
$name = Split-Path $commitedFile -Leaf
#ADD ASSEMBLY BASED ON .PROJ BASENAME
if($name -eq $file) { $assemblies += $proj.BaseName + ".dll" }
}
}
}
}
#FEEDBACK CHANGED ASSEMBLIES
Write-Host "[ps1] changed assemblies:" -ForegroundColor Yellow
$assemblies | Where-Object { Write-Host "`t-$_" -ForegroundColor Gray }
#LOOP ALL ASSEMBLIES
$OutputAssembly = "bin\Release"
foreach ($assembly in $assemblies){
$path = [IO.Path]::Combine($Root , $OutputAssembly, $assembly)
#CHECK ASSEMBLY PATH
if (-not (Test-Path -Path $path)) {
Write-Host "##vso[task.logissue type=warning;][ps1] warning: missing assembly '$assembly' at: '$path'" -ForegroundColor Yellow;
}
else {
#START COVERLET
}
}
If I understand the question well, you can do something like this:
# Get the last commit SHA1
$lastCommit = "$(Build.SourceVersion)"
# Get the last commit files
$files = git diff-tree --no-commit-id --name-only -r $lastCommit
if($files -match "Test.cs")
{
# Do something...
}
else
{
# Do something else...
}
Because usually, if you have Test.dll, the source code should be Test.cs.
I currently have a PowerShell script which prints out some information regarding the files that are passed in as an argument.
The command for executing the script is done as such:
.\myscript.ps1 -accessitem C:\folder
I want to apply the script to all files and folders on the C: drive. Is it possible to loop over all files and pass each path as an argument to the script?
The script:
[CmdletBinding()]
Param (
[Parameter(Mandatory=$True,Position=0)]
[String]$AccessItem
)
$ErrorActionPreference = "SilentlyContinue"
If ($Error) {
$Error.Clear()
}
$RepPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$RepPath = $RepPath.Trim()
$str = $AccessItem -replace ':',''
$str = $AccessItem -replace '/','.'
$FinalReport = "$RepPath\"+$str+".csv"
$ReportFile1 = "$RepPath\NTFSPermission_Report.txt"
If (!(Test-Path $AccessItem)) {
Write-Host
Write-Host "`t Item $AccessItem Not Found." -ForegroundColor "Yellow"
Write-Host
}
Else {
If (Test-Path $FinalReport) {
Remove-Item $FinalReport
}
If (Test-Path $ReportFile1) {
Remove-Item $ReportFile1
}
Write-Host
Write-Host "`t Working. Please wait ... " -ForegroundColor "Yellow"
Write-Host
## -- Create The Report File
$ObjFSO = New-Object -ComObject Scripting.FileSystemObject
$ObjFile = $ObjFSO.CreateTextFile($ReportFile1, $True)
$ObjFile.Write("NTFS Permission Set On -- $AccessItem `r`n")
$ObjFile.Close()
$ObjFile = $ObjFSO.CreateTextFile($FinalReport, $True)
$ObjFile.Close()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($ObjFSO) | Out-Null
Remove-Variable ObjFile
Remove-Variable ObjFSO
If((Get-Item $AccessItem).PSIsContainer -EQ $True) {
$Result = "ItemType -- Folder"
}
Else {
$Result = "ItemType -- File"
}
$DT = Get-Date -Format F
Add-Content $ReportFile1 -Value ("Report Created As On $DT")
Add-Content $ReportFile1 "=================================================================="
$Owner = (Get-Item -LiteralPath $AccessItem).GetAccessControl() | Select Owner
$Owner = $($Owner.Owner)
$Result = "$Result `t Owner -- $Owner"
Add-Content $ReportFile1 "$Result `n"
(Get-Item -LiteralPath $AccessItem).GetAccessControl() | Select * -Expand Access | Select IdentityReference, FileSystemRights, AccessControlType, IsInherited, InheritanceFlags, PropagationFlags | Export-CSV -Path "$RepPath\NTFSPermission_Report2.csv" -NoTypeInformation
Add-Content $FinalReport -Value (Get-Content $ReportFile1)
Add-Content $FinalReport -Value (Get-Content "$RepPath\NTFSPermission_Report2.csv")
Remove-Item $ReportFile1
Remove-Item "$RepPath\NTFSPermission_Report2.csv"
Invoke-Item $FinalReport
}
If ($Error) {
$Error.Clear()
}
I would prefer an outside command doing this, as the workings of the script should not be altered; it is used for single-file testing.
There are two ways to do this:
Add -Recurse Flag to the script
Run the script on each directory
I'm going with option two since the script looks complicated enough that I don't want to touch it.
$path_to_script = "C:\path\to\myscript.ps1"
$start_directory = "C:\folder"
# Call Script on Parent Directory
& "$path_to_script" -AccessItem "$start_directory"
# Call Script on any Child Directories within the "$start_directory"
foreach($child in (ls "$start_directory" -Recurse -Directory))
{
$path = $child.FullName
& "$path_to_script" -AccessItem "$path"
}
Basically, I'm calling the script on the parent directory and any sub-directories within the parent directory.
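If you also want to run it against individual files rather than only directories, the same pattern works with a file listing (a sketch reusing $path_to_script from above; -File requires PowerShell 3+, and scanning the whole drive can take a long time):
foreach($file in (ls "C:\" -Recurse -File))
{
    $path = $file.FullName
    & "$path_to_script" -AccessItem "$path"
}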
I'm trying to create a PowerShell script that deletes user profiles older than 30 days, excluding some users like admin accounts.
I've thought maybe the script has to be signed by the domain controller or something, but I'm not sure if that is the solution.
When I try to run it on another directory it works, but when I use it on C:\Users I get an error.
Does anyone know what I have to change?
Error:
Set-ExecutionPolicy : Windows PowerShell updated your execution policy successfully, but the setting is overridden by a
policy defined at a more specific scope. Due to the override, your shell will retain its current effective execution
policy of RemoteSigned. Type "Get-ExecutionPolicy -List" to view your execution policy settings. For more information p
lease see "Get-Help Set-ExecutionPolicy".
At line:1 char:46
+ ... -ne 'AllSigned') { Set-ExecutionPolicy -Scope Process Bypass }; & 'H ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (:) [Set-ExecutionPolicy], SecurityException
+ FullyQualifiedErrorId : ExecutionPolicyOverride,Microsoft.PowerShell.Commands.SetExecutionPolicyCommand
The Code:
$Now = Get-Date
$Days = "15"
$TargetFolder = "C:\Users"
$LastWrite = $Now.AddDays(-$Days)
$Folders = get-childitem -path $TargetFolder |
Where {$_.psIsContainer -eq $true} |
Where {$_.LastWriteTime -le "$LastWrite"}
foreach ($Folder in $Folders)
{
if($Folder -notlike "user1")
{
if($Folder -notlike "Administrator")
{
if($Folder -notlike "user2")
{
if($Folder -notlike "Public")
{
if($Folder -notlike "NetworkService")
{
if($Folder -notlike "LocalService")
{
if($Folder -notlike "user3")
{
if($Folder -notlike "user4")
{
write-host "Deleting $Folder" -ForegroundColor Green
Remove-Item -recurse -Force C:\Users\$Folder
#Write-Host -NoNewLine "Press any key to continue... `n";
#$null = $Host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown");
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
else
{
write-host "Cannot delete $Folder" -ForegroundColor Red
}
}
Here is my script, which I used some time ago.
Most of the code is commented, so it should be clear. If you have any questions, please ask.
# ********************************************************************************************************(**
# * **
# * Short description: Check profiles if they can be deleted - unused profiles will be deleted. **
# * Full description: * User running this script can specify the time which will check if profile contains **
# * any newer files than limit. If yes such profile is skipped. **
# * * User can specify which directories will be excepted from this search **
# * * User can specify which file suffixes will be ignored when the date will be checked **
# * * User can specify custom path for the profiles **
# * **
# * Creator : Patrik Svestka **
# * Created : 21/08/2017 **
# * Version : 1.0.1 **
# * **
# * Changes description: 1.0.0 - First Public version - Init release **
# * 1.0.1 - Added license type, minor changes to the header **
# * **
# * PowerShell compatibility: 2.0 , 4.0 and probably newer (untested) **
# * PowerShell tested versions: v2.0.50727, v4.0.30319 **
# * **
# * License: MIT **
# * **
# * TODO: ability to run the script remotely **
# to test remote connection - Get-WmiObject -ComputerName <server_name> Win32_Service -Credential $credentials
# Or manually from PowerShell: Enter-PSSession <server_name> -Credential domain\<user_id>
# ***********************************************************************************************************
# **********************************************************
# Test run?
# **********************************************************
# when you want to test what will be deleted
$test_run = $true;
If ($test_run) {
Write-Warning -message 'Test run ENABLED - for actual DELETION set $test_run to $false' -verbose;
"`n";"`n";
}
# **********************************************************
# User configuration
# **********************************************************
# $credentials = 'domain\<user_id>';
# $server_name = '<server>';
# Profiles that contain file newer than 90 days will be exempted from deletion
$time_definition=@{'1m'="-0"};
# TODO: test for more periods - not tested yet!
# e.g more time frames - $time_definition=@{'1m'="-30"; '3m'="-90"; '6m'="-180"; '12m'="-360"; '18m'="-540"}
# running script path
$current_path = (Resolve-Path .\).Path;
$log_file = "$($current_path)\delete_files.log";
$folder_to_cleanse = 'E:\t\temp_profiles\'; #'C:\prg'
$excluded_directories = [System.Collections.ArrayList]@();
# All excluded profiles:
$excluded_directories.Add('All Users') | Out-null;
$excluded_directories.Add('Administrator') | Out-null;
$excluded_directories.Add('Default User') | Out-null;
$excluded_directories.Add('LocalService') | Out-null;
$excluded_directories.Add('NetworkService') | Out-null;
# Extensions excluded from date validation - these files will not influence the date check
# (will be deleted too if all others are found older)
$excluded_file_types = [System.Collections.ArrayList]@();
#$excluded_file_types.Add("*.bat", "*.cmd", "*.ps1") | Out-null;
$profile_directories = [System.Collections.ArrayList]@();
# **********************************************************
# The script's start
# **********************************************************
$newer_file_exist = $Null;
$files_to_delete = $Null;
# If previous log file exists delete it (only during test run)
If ((Test-Path -Path "$log_file") -and ($test_run)) {
Write-Verbose "Deleting previous log file $log_file." -verbose;
Remove-Item $log_file
}
# get all directories except excluded ones
$profile_directories = Get-ChildItem -Path $folder_to_cleanse -exclude $excluded_directories | Where-Object {$_.PSIsContainer -eq $True} | % { $_.Name }
# if $profile_directories found to be deleted => exit
If ([String]::IsNullOrEmpty($profile_directories)) {
Write-Warning -message "No profile directories to delete. Exiting." -verbose;
Exit;
}
# search in profile directories that are left after exclusion
# for all periods defined in time_definition
ForEach ($profile in $profile_directories) {
ForEach ($time in $time_definition.GetEnumerator()) {
Write-Verbose -message "Now processing the following profile: $folder_to_cleanse$profile." -verbose;
$test_current_pathPath = Test-Path -Path "$folder_to_cleanse$profile";
If ($test_current_pathPath) {
# check if any newer than $time_definition are present within the profile structure
# LastAccesstime can be empty! It is better, less issues, to use LastWriteTime. If you must use LastAccessTime use a check for ::IsNullOrEmpty
# LastWriteTime must be greater than current day - $time.Name (e.g. -90 days)
$newer_file_exist += Get-ChildItem -Path "$folder_to_cleanse$profile" -recurse -Force -exclude $excluded_file_types | Where-Object {$_.PSIsContainer -eq $FALSE} | where {($_.LastWriteTime).ToString('yyyy-MM-dd') -gt (get-date).adddays($time_definition.$($time.Name)).ToString('yyyy-MM-dd')};
}
# if any new file than the limit found the whole profile directory will be skipped (testing if $newer_file_exist $null)
If ($newer_file_exist) {
# add the top directory into excluded directory
$excluded_directories.Add($profile) | Out-null;
$newer_file_exist=$Null;
Write-Verbose -message "The profile $profile will be excluded from deletion process." -verbose;
continue;
}
}
}
# excluding the directories with newer files than limit defined by user
$profiles_with_path = Get-ChildItem -Path $folder_to_cleanse -exclude $excluded_directories | Where-Object {$_.PSIsContainer -eq $True}
# perhaps all $directories are now excluded?
If ([String]::IsNullOrEmpty($profiles_with_path)) {
Write-Warning -message "No directories to delete all probably filtered. Exiting." -verbose;
Exit;
}
# get all files to be deleted
ForEach ($dir in $profiles_with_path) {
# to check
$test_current_pathPath = Test-Path -Path $dir
If ($test_current_pathPath) {
#write-host 'Currently writing for these months:'$($time.Name);
$files_to_delete += Get-ChildItem -Path $dir -recurse -Force | Where-Object {$_.PSIsContainer -eq $FALSE} | % { $_.FullName }
}
}
# **********************************************************
# Messages for the user
# **********************************************************
Write-Verbose -message "List of profiles to be deleted:" -verbose;
ForEach ($profile_to_delete in $profiles_with_path) {
Write-Verbose -message "$profile_to_delete`n" -verbose;
}
Write-Verbose -message "The total count of non-excluded profile directories: $($profiles_with_path.Count)" -verbose;
Write-Verbose -message "==========================`n`n" -verbose;
Write-Verbose -message "List of excluded directories:`n" -verbose;
ForEach ($excluded_profile in $excluded_directories) {
Write-Verbose -message "$folder_to_cleanse$excluded_profile`n" -verbose;
}
Write-Verbose -message "Total count of excluded directories: $($excluded_directories.Count)" -verbose;
Write-Verbose -message "==========================`n`n" -verbose;
Write-Verbose -message "Total directory count (both to be deleted and excluded): $($($profiles_with_path.Count)+ $($excluded_directories.Count))`n" -verbose;
# **********************************************************
# Test run or actual deletion process
# **********************************************************
If ($test_run) {
ForEach ($file in $files_to_delete) {
$file | Out-file -Encoding 'Unicode' -FilePath $log_file -Append # >> $log_file
}
Write-Verbose 'This number of files would be deleted:' -verbose;
Write-Verbose "Found $($files_to_delete.Count) files marked for deletion." -verbose;
} Else {
$files_deleted = 0;
# delete files
If ($files_to_delete) {
ForEach ($file in $files_to_delete) {
#Remove-Item $file -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item $file -Force -ErrorAction SilentlyContinue
If ($? -eq $true) {
$files_deleted ++;
#Write-Verbose -Verbose "$File deleted successfully!"
}
}
}
# delete directories
$directories_deleted = 0;
ForEach ($dir in $profiles_with_path) { #
Remove-Item $dir -Recurse -Force -ErrorAction SilentlyContinue
If ($? -eq $true) {
$directories_deleted ++;
#Write-Verbose -Verbose "$File deleted successfully!"
}
}
Return "Total files to be deleted: $($files_to_delete.count)","Total files Deleted: $files_deleted", "Total Directories deleted: $directories_deleted"
}
I'm a bit in the dark on what would be best practice. I'm creating a function to delete files older than a given number of days, and I would like to add an optional switch called -Remote. When you choose to use it, the switch requires mandatory information like $Server, so that the whole function below is executed on the remote server by using Invoke-Command.
Something like this:
Delete-OldFiles -Target "\\Share\Dir1" -OlderThanDays "10" -LogName "Auto_Clean.log" -Remote "SERVER1"
The script/function
Function Delete-OldFiles
{
[CmdletBinding()]
Param(
[Parameter(Mandatory=$True,Position=1)]
[ValidateScript({Test-Path $_})]
[String]$Target,
[Parameter(Mandatory=$True,Position=2)]
[Int]$OlderThanDays,
[Parameter(Mandatory=$True,Position=3)]
[String]$LogName
)
if ($PSVersionTable.PSVersion.Major -ge "3") {
# PowerShell 3+ Remove files older than (FASTER)
Get-ChildItem -Path $Target -Exclude $LogName -Recurse -File |
Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach {
$Item = $_.FullName
Remove-Item $Item -Recurse -Force -ErrorAction SilentlyContinue
$Timestamp = (Get-Date).ToShortDateString()+" | "+(Get-Date).ToLongTimeString()
# If files can't be removed
if (Test-Path $Item)
{ "$Timestamp | FAILLED: $Item (IN USE)" }
else
{ "$Timestamp | REMOVED: $Item" }
} | Tee-Object $Target\$LogName -Append } # Output file names to console & logfile at the same time
Else {
# PowerShell 2 Remove files older than
Get-ChildItem -Path $Target -Exclude $LogName -Recurse |
Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach {
$Item = $_.FullName
Remove-Item $Item -Recurse -Force -ErrorAction SilentlyContinue
$Timestamp = (Get-Date).ToShortDateString()+" | "+(Get-Date).ToLongTimeString()
# If files can't be removed
if (Test-Path $Item)
{
Write-Host "$Timestamp | FAILLED: $Item (IN USE)"
"$Timestamp | FAILLED: $Item (IN USE)"
}
else
{
Write-Host "$Timestamp | REMOVED: $Item"
"$Timestamp | REMOVED: $Item"
}
} | Out-File $Target\$LogName -Append }
}
Delete-OldFiles -Target "\\Share\Dir1" -OlderThanDays "10" -LogName "Auto_Clean.log"
#Delete-OldFiles "E:\Share\Dir1" "5" "Auto_Clean.log"
When I master this I can make the $LogName (logfile) optional too. Thank you for your help. I'm still new to PowerShell and trying to figure this stuff out.
You can use parameters like this
Param (
[switch] $Remote = $false,
[string] $server = $(
if ($Remote)
{
Read-Host -Prompt "Enter remote server:"
}
)
)
In this case, if you call the script without -Remote, $server will remain $null.
If you call script.ps1 -Remote, it will ask you to enter the server name.
If you use it like script.ps1 -Remote -server "Servername", $server will become Servername.
It can be complicated to wrap the function into Invoke-Command based on the switch, but you can always use Invoke-Command (it should be about as fast as a direct command); just use parameters like this:
Param (
[switch] $Remote = $false,
[string] $server = $(
if ($Remote)
{
Read-Host -Prompt "Enter remote server:"
}
else
{
"localhost"
}
)
)
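Building on that, a hedged sketch of how a wrapper could branch on the switch and run the Delete-OldFiles function above remotely (this assumes PowerShell remoting is enabled on the target server, and that $Target, $OlderThanDays and $LogName are the wrapper's own parameters):
if ($Remote)
{
    # Send the local function's script block to the remote machine; arguments bind positionally
    Invoke-Command -ComputerName $server -ScriptBlock ${function:Delete-OldFiles} -ArgumentList $Target, $OlderThanDays, $LogName
}
else
{
    Delete-OldFiles -Target $Target -OlderThanDays $OlderThanDays -LogName $LogName
}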