I see the error logs below whenever I try to copy items from a source path to a destination. This PowerShell script task is scheduled under a Jenkins job, and it is causing problems because the build fails every time.
Error logs:
Copy-Item : The process cannot access the file
'\\10.0.1.190\d$\Build\RPC\Fortius.RPC.AmadeusAir\Common.Logging.Core.dll' because it is being used by another process.
At C:\Users\Administrator\AppData\Local\Temp\hudson5254771699639808940.ps1:33 char:1
+ Copy-Item "$ReleaseDir\*" $AmadeusDir -Force -Recurse
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.CopyItemCommand
PS script:
# force strict - so any variable used before being assigned causes an error
Set-PsDebug -Strict
# force PowerShell to exit with a non-zero code on the first error
$ErrorActionPreference = 'Stop'
# set directories here once, so we can reuse
$AmadeusDir = "\\$env:SERVER\d$\Build\RPC\Fortius.RPC.AmadeusAir"
$ReleaseDir = "C:\Amadeus\BTP\src\Fortius.Amadeus.Air.RPC.Host\bin\Release"
# get directory contents (are you expecting these to return to Jenkins?)
Get-ChildItem "$AmadeusDir\*"
Get-ChildItem "$ReleaseDir\*"
# create the search directory if it doesn't exist
if (-not (Test-Path -Path $AmadeusDir -PathType Container)) { New-Item -Path $AmadeusDir -type directory -Force }
# get the service, but fail gracefully if it doesn't exist
$service = Get-Service -Name AmadeusAirWindowsService -Computername $env:SERVER -ErrorAction SilentlyContinue
# if we have a service, stop and delete it
if ($service.Status)
{
    sc.exe \\$env:SERVER stop AmadeusAirWindowsService
    if ($LASTEXITCODE -ne 0) { throw "error stopping the service: $LASTEXITCODE" }
    Write-Host "AmadeusAirWindowsService STOPPED"
    sc.exe \\$env:SERVER delete AmadeusAirWindowsService
    if ($LASTEXITCODE -ne 0) { throw "error deleting the service: $LASTEXITCODE" }
    Write-Host "AmadeusAirWindowsService DELETED"
}
# copy release to search
Copy-Item "$ReleaseDir\*" $AmadeusDir -Force -Recurse
# (re)create the service
sc.exe \\$env:SERVER create AmadeusAirWindowsService start=auto DisplayName="Fortius Amadeus Air RPC Service" binPath= D:\Build\RPC\Fortius.RPC.AmadeusAir\WorldVentures.Fortius.Amadeus.Air.RPC.Host.exe
if ($LASTEXITCODE -ne 0) { throw "error creating the service: $LASTEXITCODE" }
sc.exe \\$env:SERVER description AmadeusAirWindowsService "This service hosts Fortius Amadeus Air RPC service"
if ($LASTEXITCODE -ne 0) { throw "error adding description to service: $LASTEXITCODE" }
sc.exe \\$env:SERVER start AmadeusAirWindowsService
if ($LASTEXITCODE -ne 0) { throw "error starting the service: $LASTEXITCODE" }
Write-Host "AmadeusAirWindowsService STARTED"
As an alternative, I am using
xcopy "From" "destination" /k /e /d /Y
to do the copy.
You're trying to copy over files while the destination still has files in use. Have you checked what is keeping the file(s) locked? I see you already stop the service, but did you actually check that the stop succeeded? Sysinternals also has "handle" and "Process Explorer", which can both show what is keeping your file locked.
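For instance, one way to rule the service out is to block until the remote SCM actually reports it as stopped before the copy starts. A rough sketch, reusing the variable and service names from the script above (the 30-second timeout is an arbitrary choice):
# Sketch: stop the service and wait for it to report Stopped before copying
$service = Get-Service -Name AmadeusAirWindowsService -ComputerName $env:SERVER -ErrorAction SilentlyContinue
if ($service) {
    sc.exe \\$env:SERVER stop AmadeusAirWindowsService
    if ($LASTEXITCODE -ne 0) { throw "error stopping the service: $LASTEXITCODE" }
    # Block until the remote SCM reports Stopped, or throw after 30 seconds
    $service.WaitForStatus('Stopped', '00:00:30')
}
Copy-Item "$ReleaseDir\*" $AmadeusDir -Force -Recurse
If the copy still fails after that, running Sysinternals handle.exe against the locked DLL path on the target machine will tell you which process is holding it open.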
Related
First post! I apologize in advance for the formatting. I'm just getting familiar with PowerShell, and I want to stop a service, restart another service, and then start the initial service again. Before moving on to the next service, I want to make sure the previous service has stopped.
I'm using a function that was mentioned here and have tried to tailor it to my code.
Workflow Goal:
Stop Service A
Restart Service B
Start Service A
Code:
#Stops Service A and validates it's in "Stopped" status
Get-Service 'ServiceNameA' -ComputerName 'ExampleServerA' | Stop-Service -force -PassThru
function WaitUntilServices1($searchString, $status)
{
    # Get all services where DisplayName matches $searchString and loop through each of them.
    foreach ($service in (Get-Service -DisplayName $searchString))
    {
        # Wait for the service to reach the $status or a maximum of 30 seconds
        $service.WaitForStatus($status, '00:00:30')
    }
}
WaitUntilServices1 "ServiceDisplayNameA" "Stopped"
#Restarts Service B and validates it's in "Running" status
Get-Service 'ServiceNameB' -ComputerName 'ExampleServerB' | Restart-Service -force -PassThru
function WaitUntilServices2($searchString, $status)
{
    # Get all services where DisplayName matches $searchString and loop through each of them.
    foreach ($service in (Get-Service -DisplayName $searchString))
    {
        # Wait for the service to reach the $status or a maximum of 30 seconds
        $service.WaitForStatus($status, '00:00:30')
    }
}
WaitUntilServices2 "ServiceDisplayNameB" "Running"
#Starts Service A and validates it's in "Running" status
Get-Service 'ServiceA' -ComputerName 'ExampleServerA' | Start-Service -force -PassThru
Read-Host -Prompt "Press Enter to exit"
The code above gives me the following errors for both of the functions.
Exception calling "WaitForStatus" with "2" argument(s): "Time out has expired and the operation has not been completed." At C:\PowerShell\ScriptExample\ScriptExampleFix.ps1:10 char:9
$service.WaitForStatus($status, '00:00:30')
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : TimeoutException
Then, for the very last portion that starts the service, I'm getting one more error:
Start-Service : A parameter cannot be found that matches parameter name 'force'.
At C:\PowerShell\ScriptExample\ScriptExampleFix.ps1:32 char:85
+ ... erName 'ServerNameExample' | Start-Service -force -PassTh ...
+ ~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Start-Service], ParameterBindingException
+ FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.PowerShell.Commands.StartServiceCommand
Any help would be greatly appreciated :)
In the first statement:
#Stops Service A and validates it's in "Stopped" status
Get-Service 'ServiceNameA' -ComputerName 'ExampleServerA' | Stop-Service -force -PassThru
You ask PowerShell to stop ServiceNameA on a remote computer.
You then call WaitUntilServices1, which attempts to wait for a service of the same name on your local computer - and that one is obviously not going to stop any time soon, because you requested the stop on a different computer.
Change the function definition to accept a -ComputerName parameter too and pass that to Get-Service:
function Wait-ServiceStatus {
    param(
        [string]$Name,
        [string]$ComputerName = '.',
        [System.ServiceProcess.ServiceControllerStatus]$Status
    )
    foreach ($service in Get-Service -Name $Name -ComputerName $ComputerName) {
        # If any call to WaitForStatus times out and throws, return $false
        try { $service.WaitForStatus($Status, '00:00:30') } catch { return $false }
    }
    # No errors thrown while waiting, all is good, return $true
    return $true
}
Now we can do:
# request the remote SCM stop the service
Get-Service 'ServiceNameA' -ComputerName 'ExampleServerA' | Stop-Service -Force
$success = Wait-ServiceStatus -Name 'ServiceNameA' -ComputerName 'ExampleServerA' -Status Stopped
if (-not $success) {
    # output an error
    Write-Error "failed to complete restart cycle, 'ServiceNameA' on 'ExampleServerA' failed to stop in a timely manner"
    # return from this script/function for good measure
    return
}
# ... if we've reached this point the wait must have been successful, continue with the restart cycle.
Get-Service 'ServiceNameB' -ComputerName 'ExampleServerB' | Restart-Service -Force -PassThru
$success = Wait-ServiceStatus -Name 'ServiceNameB' -ComputerName 'ExampleServerB' -Status Running
if (-not $success) {
    # ... etc.
}
Publish-AzWebApp throws an error when uploading the source code for the application. The code is the following:
$job = Publish-AzWebApp `
    -WebApp $webApp `
    -ArchivePath (Join-Path -Path $rootPath -ChildPath $archiveRelativePath) `
    -Force `
    -AsJob
# # #
# ...code...
# # #
# Later on
$job | Receive-Job -AutoRemoveJob -Wait -WriteJobInResults | ForEach-Object {
    if ("AzureLongRunningJob" -eq $_.GetType().BaseType.Name) {
        if ("Completed" -eq $_.JobStateInfo.State) {
            Write-Log -Message "Published the Source Code for $($webApp.Name) successfully." -Level INFO
        }
        else {
            Write-Log -Message $_.JobStateInfo -Level ERROR
            throw $_.JobStateInfo
        }
    }
}
The error is the following:
Deployment failed with status code ServiceUnavailable
+ CategoryInfo : InvalidResult: (:) [], Exception
+ FullyQualifiedErrorId :
The thing is that, between the start of the job and the end of the job, I am also uploading WebJobs, and I'm also setting the AppSettings from the Configuration blade.
I also noticed that this happens when I create the app and then run this procedure in one go. If the app already exists, the error is less likely to occur, but it still doesn't seem stable. What could I do?
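One thing worth trying for a transient ServiceUnavailable from the deployment endpoint is simply retrying the publish. A minimal sketch (not from the original post): it assumes the same $webApp, $rootPath and $archiveRelativePath variables, runs synchronously by dropping -AsJob, and uses an arbitrary attempt count and back-off:
# Hypothetical retry wrapper around the publish call from the question
$maxAttempts = 3
for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
    try {
        Publish-AzWebApp -WebApp $webApp `
            -ArchivePath (Join-Path -Path $rootPath -ChildPath $archiveRelativePath) `
            -Force -ErrorAction Stop
        break   # publish succeeded, stop retrying
    }
    catch {
        if ($attempt -eq $maxAttempts) { throw }   # give up after the last attempt
        Write-Warning "Publish attempt $attempt failed: $_"
        Start-Sleep -Seconds (30 * $attempt)       # simple linear back-off
    }
}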
I am trying to write a script to check my web application's health status. For example, if I can't get a response within 10 seconds, I have to recycle my app pool using PowerShell. Likewise, for any status code other than 200 OK, the app pool should be recycled.
Please look at the code and error below:
# Load IIS module:
Import-Module WebAdministration
while ($true) {
    Write-Host 'Running check for app.xxx.com ...'
    # First we create the request.
    $HTTP_Request = [System.Net.WebRequest]::Create('https://app.xxx.com/')
    Try
    {
        # We then get a response from the site.
        $HTTP_Response = $HTTP_Request.GetResponse()
        # We then get the HTTP code as an integer.
        $HTTP_Status = [int]$HTTP_Response.StatusCode
        If ($HTTP_Status -eq 200) {
            Write-Host "Site is OK!"
        }
        Else {
            Write-Host "The Site may be down, please check!"
            Restart-WebAppPool -Name "app.xxx.com"
        }
    }
    Catch
    {
        Stop-WebAppPool -Name "app.xxx.com"
        Restart-WebAppPool -Name "app.xxx.com"
    }
    # Finally, we clean up the http request by closing it.
    $HTTP_Response.Close()
    Start-Sleep -Seconds 120
}
Error:
Restart-WebAppPool : You have to start stopped object before restarting it.
At C:\Scripts\CheckHealthHaydigo.ps1:25 char:6
+ Restart-WebAppPool -Name "app.xxx.com"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Restart-WebAppPool], InvalidOperationException
+ FullyQualifiedErrorId : InvalidOperation,Microsoft.IIs.PowerShell.Provider.RestartAppPoolCommand
It seems like you should check the status of the app pool before trying to restart it.
if ((Get-WebAppPoolState -Name "app.xxx.com").Value -eq "Stopped") {
    Start-WebAppPool -Name "app.xxx.com"
}
else {
    Restart-WebAppPool -Name "app.xxx.com"
}
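For reference, the same check can be folded into the loop from the question, together with a request timeout to cover the 10-second requirement. A rough sketch of a single pass (pool name, URL and timeout taken from the question; wrap it in the existing while loop as needed):
# Sketch: one health-check pass with a request timeout and a state-aware recycle
Import-Module WebAdministration

$HTTP_Response = $null
$HTTP_Request = [System.Net.WebRequest]::Create('https://app.xxx.com/')
$HTTP_Request.Timeout = 10000   # give up on the request after 10 seconds

try {
    $HTTP_Response = $HTTP_Request.GetResponse()
    if ([int]$HTTP_Response.StatusCode -ne 200) { throw "Unexpected status code" }
    Write-Host "Site is OK!"
}
catch {
    # No 200 within 10 seconds: recycle, but respect the current pool state
    if ((Get-WebAppPoolState -Name "app.xxx.com").Value -eq "Stopped") {
        Start-WebAppPool -Name "app.xxx.com"
    }
    else {
        Restart-WebAppPool -Name "app.xxx.com"
    }
}
finally {
    if ($HTTP_Response) { $HTTP_Response.Close() }
}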
Within PowerShell, the CopyHere method of the Shell.Application NameSpace object is asynchronous. My main goal is to convert a KML file to a KMZ file. The process is to create a ZIP file with the same name, copy the KML into the ZIP (which compresses the file), and then rename the ZIP to KMZ. Unfortunately, being asynchronous means the rename is called before the CopyHere method has completed. I have found many examples of solving this; the cleanest one I found is below:
$kmlPath = $global:directoryPath + "Test.kml"
$zip = $global:directoryPath + "Test.zip"
New-Item $zip -ItemType file
$shellApplication = new-object -com shell.application
$zipPackage = $shellApplication.NameSpace($zip)
$zipPackage.CopyHere($kmlPath, 16)
while($zipPackage.Items().Item($zip.Name) -Eq $null)
{
    start-sleep -seconds 1
    write-host "." -nonewline
}
write-host "."
Rename-Item -Path $zip -NewName $([System.IO.Path]::ChangeExtension($zip, ".kmz"))
This responds with the following error:
Exception calling "Item" with "1" argument(s): "Not implemented
(Exception from HRESULT: 0x80004001 (E_NOTIMPL))"
+ while($zipPackage.Items().Item($zip.Name) -Eq $null)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : ComMethodTargetInvocation
Am I misusing the Item method for this particular package? I am confused why something that "appears" to be neatly done is not working. I have also tried the snippet of code provided here; it also complains about the .Item method in this particular situation.
The issue I ran into was finding a way to check on the zip's status.
So instead I used a flag for a while loop that fires once the zip file can be opened and the file name is found inside it.
function kml_to_kmz([string]$kmlPath) {
    # Load the assembly that provides System.IO.Compression.ZipFile
    [Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')
    $kmlInfo = Get-ChildItem -Path $kmlPath
    $zipLocation = ($kmlInfo.Directory.FullName + '\' + $kmlInfo.Name.Remove($kmlInfo.Name.LastIndexOf('.')) + '.zip')
    New-Item $zipLocation
    # CopyHere is asynchronous, so the copy may still be running when this returns
    ((New-Object -com shell.application).NameSpace($zipLocation)).CopyHere($kmlPath, 16)
    $trigger = $false
    while ($trigger -eq $false) {
        try {
            # The copy is done once the zip can be opened and already contains the KML entry
            $zip = [IO.Compression.ZipFile]::OpenRead($zipLocation)
            if (($zip.Entries | ForEach-Object { $_.Name }) -contains $kmlInfo.Name) {
                $zip.Dispose()
                $trigger = $true
                break
            }
        } catch {}
        Start-Sleep -Seconds 1
        Write-Host "." -NoNewline
    }
    # Make sure nothing is holding the zip open, then rename it to .kmz
    [IO.Compression.ZipFile]::OpenRead($zipLocation).Dispose()
    Rename-Item -Path $zipLocation -NewName $([System.IO.Path]::ChangeExtension($zipLocation, '.kmz'))
}
kml_to_kmz -kmlPath "C:\Users\Default\Desktop\Test.kml"
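As an aside, on PowerShell 5.0 and later the same conversion can be done without Shell.Application at all: Compress-Archive is synchronous, so there is nothing to poll. A sketch (not part of the original answer; the path is the example one from the thread):
# Hypothetical synchronous KML -> KMZ conversion using Compress-Archive (PowerShell 5+)
$kmlPath = 'C:\Users\Default\Desktop\Test.kml'
$zipPath = [System.IO.Path]::ChangeExtension($kmlPath, '.zip')
$kmzPath = [System.IO.Path]::ChangeExtension($kmlPath, '.kmz')

Compress-Archive -Path $kmlPath -DestinationPath $zipPath -Force   # blocks until the archive is written
Rename-Item -Path $zipPath -NewName $kmzPath                       # same rename step as in the answer above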
OK, here is the problem: randomly, the LoadFile method doesn't like a certain input file and produces an error. This error is always a terminating error and I cannot figure out any way to get the script to continue. Any ideas?
Function ProcessImage {
    Param(
        [Parameter(ValueFromPipeline=$true)]
        [System.IO.FileInfo]$File
    )
    If ($Excluded_Owners -notcontains $(get-acl -path $File.FullName).owner) { #Check owner of the file and compare it to list of blacklisted file owners.
        Try {
            $Image = New-Object -ComObject Wia.ImageFile
            $Image.LoadFile($File.fullname)
        } Catch {
            LogWriter -OutPut "File Failed Process File in WIA - `"$($File.fullname)`""
            Write-Error $_.Exception.Message
            continue
        }
        If ($Image.width -gt $PictureMinWidth -or $Image.height -gt $PictureMinHeight) { #Check image dimensions.
            IF ($Script:Copy) {
                $CopyTryCount = 0
                While ((Test-Path -Path "$CopyDir\$($Script:MF_ImagesCopied + 1)$($File.extension)" -PathType Leaf) -EQ $False -AND $CopyTryCount -le 3) { #After the script says the picture was copied without error, verify it indeed was.
                    $CopyTryCount++
                    Try {
                        Copy-Item -Path $File.FullName -Destination "$CopyDir\$($Script:MF_ImagesCopied + 1)$($File.extension)" -Force #If the picture meets all requirements, attempt to copy the image.
                    } Catch {
                        LogWriter -Status "Failure" -Output "File Failed to Copy (Attempt $CopyTryCount) - `"$($File.fullname)`""
                    }
                }
                IF (Test-Path -Path "$CopyDir\$($Script:MF_ImagesCopied + 1)$($File.extension)" -PathType Leaf) { #Check the CopyDir directory for the image.
                    LogWriter -Status "Success" -Output "File Successfully Copied - `"$($File.fullname)`"" #If the image was copied successfully, log that.
                    [Int]$Script:MF_ImagesCopied += 1
                    $Temp_ProcessImage_Success = $True
                } Else {
                    LogWriter -Status "Failure" -Output "File Failed to Copy after 3 tries - `"$($File.fullname)`"" #If the image was not copied successfully, log that.
                    [Int]$Script:MF_ImagesFailed += 1
                }
            }
        } Else {
            LogWriter -Status "Skipped" -Output "Incorrect Dimensions - `"$($File.fullname)`""
            [Int]$Script:MF_ImagesSkipped += 1
        }
    } Else {
        LogWriter -Status "Skipped" -Output "Excluded Owner - `"$($File.fullname)`""
        [Int]$Script:MF_ImagesSkipped += 1
    }
}#End ProcessImage
This is the troublesome error.
ProcessImage : Exception calling "LoadFile" with "1" argument(s): "The segment is already discarded and cannot be locked."
At L:\MediaFinder.ps1:400 char:83
+ If ($Images -AND $ImageFileTypes -contains "*"+$_.Extension) {ProcessImage <<<< $_}
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,ProcessImage
You have caught the terminating error in a catch block and turned it into a non-terminating error - that's the first important step. BTW, the continue in your catch block may also be causing premature termination: continue is meant to be used with loops and with the trap statement. Remove it and replace it with a return statement.
The reason your function doesn't process any other files is that it isn't written quite right. So the second step is to put your script in a process block so it can process each $File object passed down the pipeline, e.g.:
function ProcessImage {
    param(
        [Parameter(ValueFromPipeline=$true)]
        $File
    )
    process {
        try {
            if ($file -eq 'foo') {
                throw 'kaboom'
            }
            else {
                "Processing $file"
            }
        }
        catch {
            Write-Error $_
            return # can't continue - don't have valid file obj
        }
        "Still processing $file"
    }
}
And if I run the above with the input below, you can see that it keeps processing objects after the one that throws a terminating error:
C:\PS> 'bar','foo','baz' | ProcessImage
Processing bar
Still processing bar
ProcessImage : kaboom
At line:1 char:21
+ 'bar','foo','baz' | ProcessImage
+ ~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,ProcessImage
Processing baz
Still processing baz
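Applied back to the original function, the fix boils down to wrapping the body in a process block and swapping continue for return. A trimmed sketch of the skeleton (the dimension-check and copy logic from the question goes where the placeholder comment is):
Function ProcessImage {
    Param(
        [Parameter(ValueFromPipeline=$true)]
        [System.IO.FileInfo]$File
    )
    Process {
        If ($Excluded_Owners -notcontains $(get-acl -path $File.FullName).owner) {
            Try {
                $Image = New-Object -ComObject Wia.ImageFile
                $Image.LoadFile($File.fullname)
            } Catch {
                LogWriter -OutPut "File Failed Process File in WIA - `"$($File.fullname)`""
                Write-Error $_.Exception.Message
                return   # was 'continue'; return just skips this pipeline item
            }
            # ... dimension checks and copy logic from the original function go here ...
        }
    }
}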