I am trying to use Windows Azure PowerShell to copy a zip file into a VM.
I have managed to connect to the VM following the documentation.
But I cannot find any tutorial on how to upload / copy / transfer a zip file to the VM's disk, say into the C drive.
Can anyone point me to a tutorial, or give me an idea of how I can copy this file?
Here is another approach that I documented here. It involves:
Creating and mounting an empty local VHD.
Copying your files to the new VHD and dismounting it.
Copying the VHD to Azure blob storage.
Attaching that VHD to your VM.
Here is an example:
#Create and mount a new local VHD
$volume = new-vhd -Path test.vhd -SizeBytes 50MB | `
Mount-VHD -PassThru | `
Initialize-Disk -PartitionStyle mbr -Confirm:$false -PassThru | `
New-Partition -UseMaximumSize -AssignDriveLetter -MbrType IFS | `
Format-Volume -NewFileSystemLabel "VHD" -Confirm:$false
#Copy my files
Copy-Item C:\dev\boxstarter "$($volume.DriveLetter):\" -Recurse
Dismount-VHD test.vhd
#upload the Vhd to azure
Add-AzureVhd -Destination http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd `
-LocalFilePath test.vhd
#mount the VHD to my VM
Get-AzureVM MyCloudService MyVMName | `
Add-AzureDataDisk -ImportFrom `
-MediaLocation "http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd" `
-DiskLabel "boxstarter" -LUN 0 | `
Update-AzureVM
Here is some code that I adapted from some PowerShell examples. It works over a session created with New-PSSession. A handy wrapper for creating that session is also included below. Lastly, I needed to send a whole folder over, so that function is here too.
Here is some example usage tying them together:
# open remote session
$session = Get-Session -uri $uri -credentials $credential
# copy installer to VM
Write-Verbose "Checking if file $installerDest needs to be uploaded"
Send-File -Source $installerSrc -Destination $installerDest -Session $session -onlyCopyNew $true
<#
.SYNOPSIS
Returns a session given the URL
.DESCRIPTION
http://michaelcollier.wordpress.com/2013/06/23/using-remote-powershell-with-windows-azure-vms/
#>
function Get-Session($uri, $credentials)
{
for($retry = 0; $retry -le 5; $retry++)
{
try
{
$session = New-PSSession -ComputerName $uri[0].DnsSafeHost -Credential $credentials -Port $uri[0].Port -UseSSL
if ($session -ne $null)
{
return $session
}
Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
Start-Sleep -Seconds 30
}
catch
{
Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
Start-Sleep -Seconds 30
}
}
}
<#
.SYNOPSIS
Sends a file to a remote session.
NOTE: will delete the destination before uploading
.EXAMPLE
$remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
Send-File -Source "c:\temp\myappdata.xml" -Destination "c:\temp\myappdata.xml" $remoteSession
Copy the required files to the remote server
$remoteSession = New-PSSession -ConnectionUri $frontEndwinRmUri.AbsoluteUri -Credential $credential
$sourcePath = "$PSScriptRoot\$remoteScriptFileName"
$remoteScriptFilePath = "$remoteScriptsDirectory\$remoteScriptFileName"
Send-File $sourcePath $remoteScriptFilePath $remoteSession
$answerFileName = Split-Path -Leaf $WebPIApplicationAnswerFile
$answerFilePath = "$remoteScriptsDirectory\$answerFileName"
Send-File $WebPIApplicationAnswerFile $answerFilePath $remoteSession
Remove-PSSession -InstanceId $remoteSession.InstanceId
#>
function Send-File
{
param (
## The path on the local computer
[Parameter(Mandatory = $true)]
[string]
$Source,
## The target path on the remote computer
[Parameter(Mandatory = $true)]
[string]
$Destination,
## The session that represents the remote computer
[Parameter(Mandatory = $true)]
[System.Management.Automation.Runspaces.PSSession]
$Session,
## should we quit if file already exists?
[bool]
$onlyCopyNew = $false
)
$remoteScript =
{
param ($destination, $bytes)
# Convert the destination path to a full filesystem path (to support relative paths)
$Destination = $ExecutionContext.SessionState.`
Path.GetUnresolvedProviderPathFromPSPath($Destination)
# Write the content to the new file
$file = [IO.File]::Open($Destination, "OpenOrCreate")
$null = $file.Seek(0, "End")
$null = $file.Write($bytes, 0, $bytes.Length)
$file.Close()
}
# Get the source file, and then start reading its content
$sourceFile = Get-Item $Source
# Delete the previously-existing file if it exists
$abort = Invoke-Command -Session $Session {
param ([String] $dest, [bool]$onlyCopyNew)
if (Test-Path $dest)
{
if ($onlyCopyNew -eq $true)
{
return $true
}
Remove-Item $dest
}
$destinationDirectory = Split-Path -Path $dest -Parent
if (!(Test-Path $destinationDirectory))
{
New-Item -ItemType Directory -Force -Path $destinationDirectory
}
return $false
} -ArgumentList $Destination, $onlyCopyNew
if ($abort -eq $true)
{
Write-Host 'Ignored file transfer - already exists'
return
}
# Now break it into chunks to stream
Write-Progress -Activity "Sending $Source" -Status "Preparing file"
$streamSize = 1MB
$position = 0
$rawBytes = New-Object byte[] $streamSize
$file = [IO.File]::OpenRead($sourceFile.FullName)
while (($read = $file.Read($rawBytes, 0, $streamSize)) -gt 0)
{
Write-Progress -Activity "Writing $Destination" -Status "Sending file" `
-PercentComplete ($position / $sourceFile.Length * 100)
# Ensure that our array is the same size as what we read from disk
if ($read -ne $rawBytes.Length)
{
[Array]::Resize( [ref] $rawBytes, $read)
}
# And send that array to the remote system
Invoke-Command -Session $session $remoteScript -ArgumentList $destination, $rawBytes
# Resize the array back to the full chunk size for the next read
if ($rawBytes.Length -ne $streamSize)
{
[Array]::Resize( [ref] $rawBytes, $streamSize)
}
[GC]::Collect()
$position += $read
}
$file.Close()
# Show the result
Invoke-Command -Session $session { Get-Item $args[0] } -ArgumentList $Destination
}
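The chunked-transfer loop above can be exercised locally, without a remote session, to verify the read/append logic; the temp file names and payload size below are made up for the demo:

```powershell
# Local sketch of Send-File's chunking: read the source in fixed-size
# chunks and append each chunk to the destination, exactly as the
# remote script block does once per Invoke-Command call.
$src = Join-Path ([IO.Path]::GetTempPath()) 'chunk-demo-src.bin'
$dst = Join-Path ([IO.Path]::GetTempPath()) 'chunk-demo-dst.bin'
Remove-Item $dst -ErrorAction SilentlyContinue

$payload = New-Object byte[] ([int]3.5MB)   # deliberately not a multiple of 1MB
(New-Object Random).NextBytes($payload)
[IO.File]::WriteAllBytes($src, $payload)

$streamSize = 1MB
$rawBytes = New-Object byte[] $streamSize
$file = [IO.File]::OpenRead($src)
while (($read = $file.Read($rawBytes, 0, $streamSize)) -gt 0)
{
    # Shrink the buffer for the final, partial chunk
    if ($read -ne $rawBytes.Length) { [Array]::Resize([ref] $rawBytes, $read) }

    # This block is what Send-File runs remotely; here we run it locally
    $out = [IO.File]::Open($dst, 'OpenOrCreate')
    $null = $out.Seek(0, 'End')
    $out.Write($rawBytes, 0, $rawBytes.Length)
    $out.Close()

    # Restore the buffer to full chunk size for the next read
    if ($rawBytes.Length -ne $streamSize) { [Array]::Resize([ref] $rawBytes, $streamSize) }
}
$file.Close()

# The reassembled copy should be byte-identical to the source
(Get-FileHash $src).Hash -eq (Get-FileHash $dst).Hash   # True
```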
<#
.SYNOPSIS
Sends all files in a folder to a remote session.
NOTE: will delete any destination files before uploading
.EXAMPLE
$remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
Send-Folder -Source 'c:\temp\' -Destination 'c:\temp\' $remoteSession
#>
function Send-Folder
{
param (
## The path on the local computer
[Parameter(Mandatory = $true)]
[string]
$Source,
## The target path on the remote computer
[Parameter(Mandatory = $true)]
[string]
$Destination,
## The session that represents the remote computer
# [Parameter(Mandatory = $true)]
[System.Management.Automation.Runspaces.PSSession]
$Session,
## should we quit if files already exist?
[bool]
$onlyCopyNew = $false
)
foreach ($item in Get-ChildItem $Source)
{
if (Test-Path $item.FullName -PathType Container) {
Send-Folder $item.FullName "$Destination\$item" $Session $onlyCopyNew
} else {
Send-File -Source $item.FullName -Destination "$destination\$item" -Session $Session -onlyCopyNew $onlyCopyNew
}
}
}
You cannot use PowerShell to copy a file directly to a Virtual Machine's OS disk (or even to one of its attached disks). There's no API for communicating directly with a Virtual Machine's innards (you'd need to create your own custom service for that).
You can use PowerShell to upload a file to a Blob, with Set-AzureStorageBlobContent.
At that point, you could notify your running app (possibly with a Queue message?) on your Virtual Machine that there's a file waiting for it to process. And the processing could be as simple as copying the file down to the VM's local disk.
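A sketch of that flow, using the classic Azure storage cmdlets this thread relies on (the storage account, container, and file names here are made up):

```powershell
# On your workstation: upload the file to a blob.
# $key holds the storage account key.
$ctx = New-AzureStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $key
Set-AzureStorageBlobContent -File 'C:\temp\payload.zip' -Container 'uploads' `
    -Blob 'payload.zip' -Context $ctx

# Inside the VM (e.g. triggered by your queue message):
# pull the blob down to the local disk.
Get-AzureStorageBlobContent -Blob 'payload.zip' -Container 'uploads' `
    -Destination 'C:\payload.zip' -Context $ctx
```

This fragment requires an Azure subscription and storage account, so it can only run against a live account.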
Install AzCopy from http://aka.ms/downloadazcopy
Read docs from: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
Get Blob Storage (Secondary) Key
Powershell: Blob Upload single file
.\AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /DestKey:key /Pattern:abc.txt
Logon to Remote VM
Powershell: Blob Download single file
.\AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /Dest:C:\myfolder /SourceKey:key /Pattern:abc.txt
Another solution is to use a Custom Script Extension.
Using a custom script extension allows you to copy files to the VM even if the VM does not have a public IP (private network), so you don't need to configure WinRM or anything else.
I've used custom script extensions in the past for post-deployment tasks like installing an app on a VM or a scale set. Basically, you upload files to blob storage and the custom script extension downloads those files onto the VM.
I've created a test-container on my blob storage account and uploaded two files:
deploy.ps1: the script executed on the VM.
test.txt: a text file with "Hello world from VM"
Here is the code of the deploy.ps1 file:
Param(
[string] [Parameter(Mandatory=$true)] $filename,
[string] [Parameter(Mandatory=$true)] $destinationPath
)
# Getting the full path of the downloaded file
$filePath = $PSScriptRoot + "\" + $filename
Write-Host "Checking the destination folder..." -Verbose
if(!(Test-Path $destinationPath -Verbose)){
Write-Host "Creating the destination folder..." -Verbose
New-Item -ItemType directory -Path $destinationPath -Force -Verbose
}
Copy-Item $filePath -Destination $destinationPath -Force -Verbose
Here is the code to add a custom script extension to a virtual machine.
Login-AzureRMAccount
$resourceGroupName = "resourcegroupname"
$storageAccountName = "storageaccountname"
$containerName = "test-container"
$location = "Australia East"
$vmName = "TestVM"
$extensionName = "copy-file-to-vm"
$filename = "test.txt"
$deploymentScript = "deploy.ps1"
$destinationPath = "C:\MyTempFolder\"
$storageAccountKeys = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value
$storageAccountKey = $storageAccountKeys[0]
Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Location $location -TypeHandlerVersion "1.9" -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey -ContainerName $containerName -FileName $deploymentScript, $filename -Run $deploymentScript -Argument "$filename $destinationPath" -ForceRerun "1"
You can remove the extension after the file has been copied:
Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Force
In my scenario, I have a logic app that is triggered every time a new file is added to a container. The logic app calls a runbook (this requires an Azure Automation account) that adds the custom script extension and then deletes it.
I am able to copy the binary to the destination server but unable to install it. I am using the syntax below at the bottom of deploy.ps1:
powershell.exe Start-Process -Wait -PassThru msiexec -ArgumentList '/qn /i "c:\MyTempFolder\ddagent.msi" APIKEY="8532473174"'
I am new to PowerShell. I can't connect to a server that requires a username and password.
I wrote a script that moves files from 5 different servers to 5 different destinations.
Of these 5 source servers, one requires a username and password to connect to it.
This script is supposed to run every hour. I want the authentication to succeed so that when it comes to transferring files, the script runs without errors.
The code below gives the following error:
Connecting to remote server xx.xx.xx.x failed with the following error message : The WinRM client cannot
process the request. Default authentication may be used with an IP address under the following conditions: the transport
is HTTPS or the destination is in the TrustedHosts list, and explicit credentials are provided. Use winrm.cmd to configure
TrustedHosts. Note that computers in the TrustedHosts list might not be authenticated.
The complete code block is:
$logPath = "C:\Users\Log.txt"
$trancriptPath = "C:\Users\LogTranscript.txt"
$getDate = Get-Date -Format "dddd MM/dd/yyyy HH:mm "
$counter = 0
Start-Transcript -Path $trancriptPath -Append
Add-Content -Path $logPath -Value ("LOG CREATED $getDate") -PassThru
#Credentials For Server5
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$userName = "username"
[pscredential]$cred = New-Object System.Management.Automation.PSCredential -ArgumentList @($userName, $password)
Enter-PSSession -ComputerName "xx.xx.xx.x" -Credential $cred
#Sources
$srcMt01 = "\\Server2\programs\RECEIVE\*"
$srcMt01NameChg ="\\Server2\programs\RECEIVE"
$srcMC2o = "\\Server3\Predator\Revised Programs\MC2o\*"
$srcMC2oNameChg ="\\Server3\Predator\Revised Programs\MC2o"
$srcHm03 = "\\Server4\Predator\Revised Programs\H3\*"
$srcHm03NameChg ="\\Server4\Predator\Revised Programs\H3"
$srcMca = "\\Server5\Public\NcLib\FromNC\*"
$srcMcaNameChg ="\\Server5\Public\NcLib\FromNC"
$srcMt02 = "\\Server6\programs\RECEIVE\*"
$srcMt02NameChg ="\\Server6\programs\RECEIVE"
#Destination
$destMt01 = "\\Server1\MfgLib\RevisedPrograms\MT01"
$destMC2o = "\\Server1\MfgLib\RevisedPrograms\MC2old"
$destHm03 = "\\Server1\MfgLib\RevisedPrograms\H3"
$destMca = "\\Server1\MfgLib\RevisedPrograms\MC-A"
$destMt02 = "\\Server1\MfgLib\RevisedPrograms\MT02"
Function MoveFiles{
Param(
[string]$src,
[string]$dest,
[string]$srcNameChange
)
Get-ChildItem -Force -Recurse $src -ErrorAction Stop -ErrorVariable SearchError | ForEach-Object{
$counter++
$fileName = $_.Name
# Check for duplicate files
$file = Test-Path -Path $dest\$fileName
Write-Output $file
if($file)
{
"$srcNameChange\$fileName" | Rename-Item -NewName ("Copy_"+$fileName);
Add-Content -Path $logPath -Value ("$fileName exists in destination folder. Name change was successful") -PassThru
}
}
Move-Item -Path $src -Destination $dest -Force
Add-Content -Path $logPath -Value ("$counter file(s) moved to $dest") -PassThru
}
MoveFiles -src $srcMt01 -dest $destMt01 -srcNameChange $srcMt01NameChg
MoveFiles -src $srcMC2o -dest $destMC2o -srcNameChange $srcMC2oNameChg
MoveFiles -src $srcHm03 -dest $destHm03 -srcNameChange $srcHm03NameChg
MoveFiles -src $srcMca -dest $destMca -srcNameChange $srcMcaNameChg
MoveFiles -src $srcMt02 -dest $destMt02 -srcNameChange $srcMt02NameChg
Stop-Transcript
Any help is appreciated.
You might find it easier to remote into the server using Invoke-Command and then run your file copy in a script block on the remote server. You will probably need to use CredSSP authentication so the copy process can connect to the destination server.
The server running the script will need to be configured as a WinRM client, and the remote servers will need WinRM configured to accept connections. This is most likely where your current WinRM error is coming from. That's a pretty involved setup, so do some research and post specific questions as you uncover them.
Ex.
$Destination = "\\Sever1\MfgLib\RevisedPrograms\MT01"
$SourceServer = "Server2"
$password = ConvertTo-SecureString "password" -AsPlainText -Force
$userName = "username"
[pscredential]$cred = New-Object System.Management.Automation.PSCredential -ArgumentList @($userName, $password)
$ScriptBlock = {
param ( [string]$dest )
# Code to move the files from source to $dest
}
Invoke-Command -ComputerName $SourceServer -ScriptBlock $ScriptBlock -Authentication CredSSP -Credential $Cred -ArgumentList $Destination
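The WinRM setup the answer alludes to can be sketched as follows; run these elevated, and note the address is a placeholder for your server's IP (these commands change machine-wide remoting configuration, so treat this as an outline rather than a drop-in script):

```powershell
# On the client: trust the remote server's IP so explicit credentials
# may be used without HTTPS, and allow CredSSP delegation to it.
Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'xx.xx.xx.x' -Concatenate -Force
Enable-WSManCredSSP -Role Client -DelegateComputer 'xx.xx.xx.x' -Force

# On each remote server: enable remoting and accept CredSSP connections.
Enable-PSRemoting -Force
Enable-WSManCredSSP -Role Server -Force
```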
Is there a way to "invoke-command" to a remote computer such that I can reboot my computer and the job will still be running, and I can check the output log whenever I want?
PS> invoke-command -Computer Remote1 -File "ScriptThatRunsFor7days.ps1"
PS> restart-computer
PS> # Hey where's my job on remote computer? Can i see it running and connect to
# its output after rebooting my computer?
Isn't it easier to just register a scheduled task that runs the script on the remote computer?
For logging, just use the Start-Transcript cmdlet at the top of the script.
I made a script not too long ago to easily register scheduled tasks on a remote computer. Maybe you can try it out and see if it works for you?
[CmdletBinding()]
param(
[parameter(Mandatory=$true)]
[string]
$PSFilePath,
[parameter(Mandatory=$true)]
[string]
$TaskName,
[parameter(Mandatory=$true)]
[string]
$ComputerName
)
$VerbosePreference="Continue"
New-Variable -Name ScriptDestinationFolder -Value "Windows\PowershellScripts" -Option Constant -Scope Script
New-Variable -Name ScriptSourcePath -Value $PSFilePath -Option Constant -Scope Script
Write-Verbose "Script sourcepath: $ScriptSourcePath"
New-Variable -Name PSTaskName -Value $TaskName -Option Constant -Scope Script
Write-Verbose "TaskName: $TaskName"
$File = Split-Path $ScriptSourcePath -leaf
Write-Verbose "Filename: $File"
New-Variable -Name PSFileName -Value $File -Option Constant -Scope Script
Write-Verbose "PSFileName: $PSFileName"
$ExecutionTime = New-TimeSpan -Hours 8
Write-Verbose "Execution time: $ExecutionTime hours"
Invoke-Command -ComputerName $ComputerName -ScriptBlock {
$VerbosePreference="Continue"
#Removing old Scheduled Task
Write-Verbose "Unregistering old scheduled task.."
Stop-ScheduledTask -TaskName $Using:PSTaskName -ErrorAction SilentlyContinue
Unregister-ScheduledTask -TaskName $Using:PSTaskName -Confirm:$false -ErrorAction SilentlyContinue
#Creating destination directory for Powershell script
$PSFolderPath = "C:" , $Using:ScriptDestinationFolder -join "\"
Write-Verbose "Creating folder for script file on client: $PSFolderPath"
New-Item -Path $PSFolderPath -ItemType Directory -Force
#Scheduled Task definitions
$Trigger = New-ScheduledTaskTrigger -Daily -At "8am"
$PSFilePath = "C:", $Using:ScriptDestinationFolder , $Using:PSFileName -join "\"
Write-Verbose "Setting path for script file to destination folder on client: $PSFilePath"
$Action = New-ScheduledTaskAction -Execute PowerShell -Argument "-File $PSFilePath"
$Principal = New-ScheduledTaskPrincipal -UserID "NT AUTHORITY\SYSTEM" -LogonType S4U
$Settings = New-ScheduledTaskSettingsSet -AllowStartIfOnBatteries -DontStopIfGoingOnBatteries -DontStopOnIdleEnd -ExecutionTimeLimit $Using:ExecutionTime -StartWhenAvailable
$Task = Register-ScheduledTask -TaskName $Using:PSTaskName -Principal $Principal -Action $Action -Settings $Settings -Trigger $Trigger
$Task = Get-ScheduledTask -TaskName $Using:PSTaskName
$Task.Triggers[0].EndBoundary = [DateTime]::Now.AddDays(90).ToString("yyyyMMdd'T'HH:mm:ssZ")
Write-Verbose "Trigger expiration date set to: $($Task.Triggers[0].EndBoundary)"
$Task.Settings.DeleteExpiredTaskAfter = 'P1D'
Write-Verbose "Scheduled task will be deleted $($Task.Settings.DeleteExpiredTaskAfter) after expiry."
$Task | Set-ScheduledTask -ErrorAction SilentlyContinue
} #End Invoke-Command
#Copy script file from source to the computer
$ScriptDestination = "\" , $ComputerName , "C$", $ScriptDestinationFolder -join "\"
Write-Verbose "Script destination is set to: $ScriptDestination"
Write-Verbose "Copying script file: `"$ScriptSourcePath`" to `"$ScriptDestination`""
Copy-Item -Path $ScriptSourcePath -Destination $ScriptDestination -Force
Usage:
Create-ScheduledTask-Test.ps1 -ComputerName MyRemoteComputer -PSFilePath "ScriptToRun.ps1" -TaskName DoSomeWork
Another option is scheduled jobs. I'm copying the script to the remote computer using a PSSession:
$s = new-pssession remote1
copy-item script.ps1 c:\users\admin\documents -tosession $s
invoke-command $s { Register-ScheduledJob test script.ps1 -Runnow }
Then later, once it starts running, it automatically appears as a regular job on the remote computer:
invoke-command remote1 { get-job | receive-job -keep }
vonPryz provided the crucial pointer:
On Windows, PowerShell offers disconnected remote sessions that allow you to reconnect and collect output later, from any client session, even after a logoff or reboot - assuming that the disconnected session on the remote computer hasn't timed out.
See the conceptual about_Remote_Disconnected_Sessions help topic.
The following sample script demonstrates the approach:
Save it to a *.ps1 file and adapt the $computerName and $sessionName variable values.
The script assumes that the current user identity can be used as-is to remote into the target computer; if that is not the case, add a -Credential argument to the Invoke-Command and Get-PSSession calls.
Invoke the script and, when prompted, choose when to connect to the disconnected remote session that was created - including after a logoff / reboot, in which case the script is automatically reinvoked in order to connect to the disconnected session and retrieve its output.
See the source-code comments for details, particularly with respect to the idle timeout.
One aspect not covered below is output buffering: a disconnected session that runs for a long time without having its output retrieved can potentially accumulate a lot of output. By default, if the output buffer fills up, execution is suspended. The OutputBufferingMode session option controls the behavior - see the New-PSSessionOption cmdlet.
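The buffering behavior can be relaxed when the session options are built; a minimal sketch (Drop discards the oldest unretrieved output instead of suspending the command):

```powershell
# Session options for a long-running disconnected command: a generous
# idle timeout, and keep executing when the output buffer fills by
# dropping the oldest unread output instead of suspending.
$opts = New-PSSessionOption -IdleTimeout ([int]::MaxValue) -OutputBufferingMode Drop
$opts.OutputBufferingMode   # Drop
# Pass via: Invoke-Command ... -InDisconnectedSession -SessionOption $opts
```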
The gist of the solution is:
An Invoke-Command call with the -InDisconnectedSession switch that starts an operation on a remote computer in an instantly disconnected session. That is, the call returns as soon as the operation was started without returning any results from the operation yet (it returns information about the disconnected session instead).
A later Receive-PSSession call - which may happen after a reboot - implicitly connects to the disconnected session and retrieves the results of the operation.
$ErrorActionPreference = 'Stop'
# ADAPT THESE VALUES AS NEEDED
$computer = '???' # The remote target computer's name.
$sessionName = 'ReconnectMe' # A session name of your choice.
# See if the target session already exists.
$havePreviousSession = Get-PSSession -ComputerName $computer -Name $sessionName
if (-not $havePreviousSession) {
# Create a disconnected session with a distinct custom name
# with a command that runs an output loop indefinitely.
# The command returns instantly and outputs a session-information object
# for the disconnected session.
# Note that [int]::MaxValue is used to get the maximum idle timeout,
# but the effective value is capped by the value of the remote machine's
# MaxIdleTimeoutMs WSMan configuration item, which defaults to 12 hours.
Write-Verbose -vb "Creating a disconnected session named $sessionName on computer $computer..."
$disconnectedSession =
Invoke-Command -ComputerName $computer -SessionName $sessionName -InDisconnectedSession -SessionOption @{ IdleTimeout=[int]::MaxValue } { while ($true) { Write-Host -NoNewLine .; Start-Sleep 1 } }
# Prompt the user for when to connect and retrieve the output
# from the disconnected session.
do {
$response = Read-Host @"
---
Disconnected session $sessionName created on computer $computer.
You can connect to it and retrieve its output from any session on this machine,
even after a reboot.
* If you choose to log off or reboot now, this script re-runs automatically
when you log back in, in order to connect to the remote session and collect its output.
* To see open sessions on the target computer on demand, run the following
(append | Remove-PSSession to remove them):
Get-PSSession -ComputerName $computer
---
Do you want to (L)og off, (R)eboot, (C)onnect right now, or (Q)uit (submit with ENTER)? [l/r/c/q]
"@
} while (($response = $response.Trim()) -notin 'l', 'r', 'c', 'q')
$autoRelaunchCmd = {
Set-ItemProperty registry::HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\RunOnce 'ReconnectDemo' "$((Get-Command powershell.exe).Path) -noexit -command `"Set-Location '$PWD'; . '$PSCommandPath'`""
}
switch ($response) {
'q' { Write-Verbose -vb 'Aborted.'; exit 2 }
'c' { break } # resume below
'l' {
Write-Verbose -vb "Logging off..."
& $autoRelaunchCmd
logoff.exe
exit
}
'r' {
Write-Verbose -vb "Rebooting..."
& $autoRelaunchCmd
Restart-Computer
exit
}
}
}
# Getting here means that a remote disconnection session was previously created.
# Reconnect and retrieve its output.
# Implicitly reconnect to the session by name,
# and receive a job object representing the remotely running command.
# Note: Despite what the docs say, -OutTarget Job seems to be the default.
# Use -OutTarget Host to directly output the results of the running command.
Write-Verbose -vb "Connecting to previously created session $sessionName on computer $computer and receiving its output..."
$job = Receive-PSSession -ComputerName $computer -Name $sessionName -OutTarget Job
# Get the output from the job, timing out after a few seconds.
$job | Wait-Job -Timeout 3
$job | Remove-Job -Force # Forcefully terminate the job with the indefinitely running command.
# Remove the session.
Write-Host
Write-Verbose -Verbose "Removing remote session..."
Get-PSSession -ComputerName $computer -Name $sessionName | Remove-PSSession
function remote_nohup {
param(
[string]$Machine,
[string]$Cmd
)
$job_tstamp = $(get-date -f MMdd_HHmm_ss)
$job_name = "${job_tstamp}"
$job_dir_start = (Get-Location).Path
$job_dir_sched = "$env:userprofile/Documents/jobs"
$job_file = "${job_dir_sched}/${job_name}.run.ps1"
$job_log = "${job_dir_sched}/${job_name}.log"
$job_computer = $Machine
$job_cmd = $Cmd
# Create Job File
$job_ps1 = @(
"`$ErrorActionPreference = `"Stop`""
""
"Start-Transcript -path $job_log -append"
""
"try {"
" write-host 'job_begin:($job_name)'"
" "
" set-location $job_dir_start -EA 0"
" "
" write-host 'job_cmd:($job_cmd)'"
" $job_cmd | out-host"
""
" write-host 'job_end:($job_name)'"
"}"
"catch {"
" `$msg = `$_"
" write-host `$msg"
" write-error `$msg"
"}"
"finally {"
" Stop-Transcript"
"}"
""
"Exit-PSSession"
)
try {
New-Item -ItemType Directory -Force -EA:0 -Path $job_dir_sched | out-null
write-host "Creating File: $job_file"
$f1 = open_w $job_file -fatal
foreach ($line in $job_ps1) {
$f1.WriteLine($line)
}
}
finally {
$f1.close()
}
# Create Jobs Dir
write-host "Creating remote job Directory"
Invoke-Command -Computer $job_computer -ScriptBlock {
New-Item -ItemType Directory -Force -EA:0 -Path $using:job_dir_sched | out-null
}
# Copy Job File
write-host "copy-Item -recurse -ToSession `$s2 $job_file $job_file"
$s2 = New-PSSession -Computer $job_computer
copy-Item -ToSession $s2 $job_file $job_file
Receive-PSSession -Session $s2
Remove-PSSession -Session $s2
# Create Persistent Background Job
write-host "Submitting job to remote scheduler"
Invoke-Command -Computer $job_computer -ScriptBlock {
Register-ScheduledJob -RunNow -Name $using:job_name -File $using:job_file
Exit-PSSession
}
# NOTE: Log file from run is placed on
# remote computer under jobs dir
}
function open_w {
param([string]$path, [switch]$fatal)
try {
write-host "path: $path"
$pathfix = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath($path)
$handle = [System.IO.StreamWriter]::new($pathfix, $false) #false:over-write, true:append
}
catch {
if ($fatal) {
write-host -ForegroundColor Red ("EXCEPTION: " + $PSItem.ToString())
exit 1
}
return $null
}
return $handle
}
I am attempting to create a generic script that will process registry files on a list of remote servers. The script takes as input the path/filename of the .reg file to process and a list of servers to process it on. It copies the .reg file to each remote server and then attempts to process the .reg file there.
In order to make it generic for use with any .reg file, I want to pass the path\filename in a variable in the script block. I have seen examples of passing named variables and positional parameters, but those don't seem to meet my requirements. I tried creating the script block contents as a ScriptBlock type and calling it, but that did not work; most of the errors I have been getting relate to invalid parsing, ambiguous parameters, etc. I have seen a couple of working examples in which the .reg file path/filename is hard-coded, but that defeats the purpose of what I am attempting to accomplish.
I have tried Invoke-Command both with a session and with the -ComputerName parameter in order to try the $using: modifier, with no success. Is there any way to pass a variable that is basically the command-line values (switch params and filepath/name) for an executable within the script block, or am I going about this in the wrong manner? In the code below I am crossing domains, so I have to open a session for those servers in order to create the directory if it doesn't exist.
while ($true)
{
$regFile = Read-Host -Prompt "Enter the path & name of registry file to be processed"
If(Test-Path -Path $regFile) { break }
Write-Host "You must enter valid path & filename of a registry file to process."
}
$servers = Get-Content D:\MLB\Scripts\servers.txt
$fileNm = [System.IO.Path]::GetFileName($regFile)
$pass = ConvertTo-SecureString "blahblah" -AsPlainText -Force
$Creds = new-object -typename System.Management.Automation.PSCredential( "domain\username", $pass)
foreach ($server in $servers)
{
$dirPath = ''
$newfile = '\\' + $server + '\d$\MLB\RegFiles\' + $fileNm
if($server.ToLower().Contains("web"))
{
$Session = New-PSSession -ComputerName $server -Credential $Creds
Invoke-Command -Session $Session -ScriptBlock { New-Item -ErrorAction SilentlyContinue -ItemType directory -Path D:\MLB\RegFiles }
$newfile = "d:\MLB\RegFiles\" + $fileNm
Copy-Item $regFile -Destination $newfile -ToSession $Session -Force
Remove-PSSession -Session $Session
}
else
{
$dirPath = "\\$server\d`$\MLB\RegFiles"
New-Item -ErrorAction SilentlyContinue -ItemType directory -Path $dirPath
$newfile = "\\$server\d`$\MLB\RegFiles\$fileNm"
Copy-Item $regFile -Destination $newfile -Force
}
Invoke-Command -ComputerName $server -Credential $Creds -ScriptBlock {
$args = "s/ $newfile"
Start-Process -filepath "C:\Windows\regedit.exe" -Argumentlist $args
}
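For reference, the two standard ways to get a local value into a remote script block are a param() block fed by -ArgumentList, or the $using: scope modifier (PowerShell 3.0+). A sketch, invoking the block locally with & so it can be tried without a remote session; the file path is hypothetical:

```powershell
# Option 1: declare parameters in the script block and pass values
# positionally with -ArgumentList.
$block = {
    param([string]$RegFile)
    # On the real server this line would be:
    #   Start-Process regedit.exe -ArgumentList "/s `"$RegFile`"" -Wait
    "would import: $RegFile"
}
& $block 'D:\MLB\RegFiles\fix.reg'
# Remote form: Invoke-Command -ComputerName $server -ScriptBlock $block -ArgumentList $newfile

# Option 2: $using: is only valid inside a block handed to Invoke-Command:
#   Invoke-Command -ComputerName $server -ScriptBlock {
#       Start-Process regedit.exe -ArgumentList "/s `"$using:newfile`"" -Wait
#   }
```

Note that regedit's silent-import switch is /s followed by the file path, which is also why the "s/ $newfile" form in the question fails even when the variable does reach the remote side.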
I am following this MSDN guide to publish / upload ASP.NET Web Application files to an Azure Web App (Resource Manager), but I get an UploadFile error as soon as it reaches a subfolder. The root folder uploads fine.
Uploading to ftp://XXXXXX.ftp.azurewebsites.windows.net/site/wwwroot/bin/Antlr3.Runtime.dll
From C:\Users\SampleWebApp\bin\Antlr3.Runtime.dll
Exception calling "UploadFile" with "2" argument(s):
The remote server returned an error: (550) File unavailable (e.g., file not found, no access)
Param(
[string] [Parameter(Mandatory=$true)] $AppDirectory,
[string] [Parameter(Mandatory=$true)] $WebAppName,
[string] [Parameter(Mandatory=$true)] $ResourceGroupName
)
$xml = [Xml](Get-AzureRmWebAppPublishingProfile -Name $webappname `
-ResourceGroupName $ResourceGroupName `
-OutputFile null)
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse | Where-Object{!($_.PSIsContainer)}
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
"Uploading to " + $uri.AbsoluteUri
"From " + $file.FullName
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
As the issue starts only with the first file from a subdirectory (bin), it could be that some other process is using the Antlr DLL. Can you close all active debug sessions and run this script again? Also make sure you don't have any whitespace after forming the relative URI path.
[UPDATE]
It was failing to create the sub-directory, hence the "file not found" error while uploading a file from a sub-directory.
I made a few changes in the for-loop to create the sub-directory on the FTP server before uploading files from it, and it is now working fine.
$appdirectory="<Replace with your app directory>"
$webappname="mywebapp$(Get-Random)"
$location="West Europe"
# Create a resource group.
New-AzureRmResourceGroup -Name myResourceGroup -Location $location
# Create an App Service plan in `Free` tier.
New-AzureRmAppServicePlan -Name $webappname -Location $location `
-ResourceGroupName myResourceGroup -Tier Free
# Create a web app.
New-AzureRmWebApp -Name $webappname -Location $location -AppServicePlan $webappname `
-ResourceGroupName myResourceGroup
# Get publishing profile for the web app
$xml = (Get-AzureRmWebAppPublishingProfile -Name $webappname `
-ResourceGroupName myResourceGroup `
-OutputFile null)
# Not in Original Script
$xml = [xml]$xml
# Extract connection information from publishing profile
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
# Upload files recursively
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse #Removed IsContainer condition
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
if($file.PSIsContainer)
{
$uri.AbsolutePath + " is a directory"
$ftprequest = [System.Net.FtpWebRequest]::Create($uri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
$response.StatusDescription
continue
}
"Uploading to " + $uri.AbsoluteUri + " from "+ $file.FullName
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
I also blogged in detail about how I troubleshot this issue and arrived at the fix here.
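The underlying invariant in the fixed script is ordering: every directory must exist on the FTP server before any file beneath it is uploaded (`Get-ChildItem -Recurse` happens to return a directory before its contents, which is why the loop above works). That ordering can be sketched language-neutrally; here is a Python illustration with a hypothetical helper name:

```python
def dirs_in_upload_order(relative_file_paths):
    """Collect every parent directory, shallowest first, so that each
    MakeDirectory call precedes any upload into that directory."""
    dirs = set()
    for path in relative_file_paths:
        parts = path.split("/")[:-1]  # drop the file name itself
        for depth in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:depth]))
    # sort by nesting depth, then name, for a stable creation order
    return sorted(dirs, key=lambda d: (d.count("/"), d))
```

For example, `dirs_in_upload_order(["bin/x.dll", "bin/roslyn/csc.exe", "web.config"])` yields `["bin", "bin/roslyn"]`, so `bin` is created before `bin/roslyn`.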
I am attempting to use the Remove-Item cmdlet as part of an automation for a system. The files are stored on a server that requires elevated rights to perform the file deletion. I have access to a domain admin account that I use for such automation scripts.
The code below will build the PSCredential object:
$password = New-Object System.Security.SecureString
"passwordhere".ToCharArray() | ForEach-Object { $password.AppendChar($_) }
$cred = New-Object System.Management.Automation.PSCredential("domain\username",$password)
$cred
I am passing this object to the following action:
Remove-Item -LiteralPath $path -Force -Credential $cred
Any ideas?
It's not clear to me whether the files are local (you're running the script on the server) or remote (on another machine). If they're local, try running the command in a background job and pass the credentials to Start-Job:
# $using: pulls the local $path into the job's scope (PowerShell 3.0+)
$job = Start-Job { Remove-Item -LiteralPath $using:path -Force } -Credential $cred
Wait-Job $job
Receive-Job $job
If they're remote, try using remoting:
Invoke-Command -ComputerName servername `
-ScriptBlock { Remove-Item -LiteralPath $using:path -Force } `
-Credential $cred
Note: This requires that you execute Enable-PSRemoting on the remote machine.
In general, putting raw passwords in your script isn't a great idea. You can store the password in an encrypted manner using DPAPI and later, only that user account can decrypt the password e.g.:
# Stick password into DPAPI storage once - accessible only by current user
Add-Type -assembly System.Security
$passwordBytes = [System.Text.Encoding]::Unicode.GetBytes("Open Sesame")
$entropy = [byte[]](1,2,3,4,5)
$encryptedData = [System.Security.Cryptography.ProtectedData]::Protect( `
$passwordBytes, $entropy, 'CurrentUser')
$encryptedData | Set-Content -enc byte .\password.bin
# Retrieve and decrypt the password
$encryptedData = Get-Content -enc byte .\password.bin
$unencryptedData = [System.Security.Cryptography.ProtectedData]::Unprotect( `
$encryptedData, $entropy, 'CurrentUser')
$password = [System.Text.Encoding]::Unicode.GetString($unencryptedData)
$password
Remove-Item can fail due to authorisation. As alternatives, either get a reference to each file and call its .Delete() method, or move the matching files to the recycle bin.
foreach ($svr in $computers)
{
Invoke-Command -ComputerName $svr {
# $using: carries the local $cachefolder variable into the remote session
$cachefolderitems = Get-ChildItem $using:cachefolder -Recurse
# Method 1: call .Delete() on each matching file
foreach ($cachefolderitem in $cachefolderitems)
{
if ($cachefolderitem -like "*.ini")
{
$cachefolderitem.Delete()
}
}
# Method 2: move all matching files to the recycle bin
Move-Item "$using:cachefolder\*.ini" 'C:\$Recycle.Bin' -Force
}
}