Error while publishing files to Azure Web App using FTP - PowerShell

I am following this MSDN guide to publish/upload ASP.NET web application files to an Azure Web App (Resource Manager), but I get an UploadFile error as soon as the first subfolder is reached. Files in the root folder upload fine.
Uploading to ftp://XXXXXX.ftp.azurewebsites.windows.net/site/wwwroot/bin/Antlr3.Runtime.dll
From C:\Users\SampleWebApp\bin\Antlr3.Runtime.dll
Exception calling "UploadFile" with "2" argument(s):
The remote server returned an error: (550) File unavailable (e.g., file not found, no access)
Param(
[string] [Parameter(Mandatory=$true)] $AppDirectory,
[string] [Parameter(Mandatory=$true)] $WebAppName,
[string] [Parameter(Mandatory=$true)] $ResourceGroupName
)
$xml = [Xml](Get-AzureRmWebAppPublishingProfile -Name $webappname `
-ResourceGroupName $ResourceGroupName `
-OutputFile null)
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse | Where-Object{!($_.PSIsContainer)}
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
"Uploading to " + $uri.AbsoluteUri
"From " + $file.FullName
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()

As the issue starts only with the first file inside a subdirectory (bin), it could be that some other process is using the Antlr DLL. Can you close all active debug sessions and run this script again? Also make sure there is no whitespace left over after forming the relative URI path.
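If stray whitespace were the culprit, a defensive Trim() when building the path would rule it out; a minimal sketch (not part of the original script):
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/').Trim()
$uri = New-Object System.Uri("$url/$relativepath")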
[UPDATE]
It was failing to create the sub-directory, hence the "file not found" error when uploading a file from a subdirectory.
I made a few changes in the for-loop to create each sub-directory on FTP before uploading the files it contains, and it is working fine.
$appdirectory="<Replace with your app directory>"
$webappname="mywebapp$(Get-Random)"
$location="West Europe"
# Create a resource group.
New-AzureRmResourceGroup -Name myResourceGroup -Location $location
# Create an App Service plan in `Free` tier.
New-AzureRmAppServicePlan -Name $webappname -Location $location `
-ResourceGroupName myResourceGroup -Tier Free
# Create a web app.
New-AzureRmWebApp -Name $webappname -Location $location -AppServicePlan $webappname `
-ResourceGroupName myResourceGroup
# Get publishing profile for the web app
$xml = (Get-AzureRmWebAppPublishingProfile -Name $webappname `
-ResourceGroupName myResourceGroup `
-OutputFile null)
# Not in Original Script
$xml = [xml]$xml
# Extract connection information from publishing profile
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
# Upload files recursively
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse #Removed IsContainer condition
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
if($file.PSIsContainer)
{
$uri.AbsolutePath + " is a directory"
$ftprequest = [System.Net.FtpWebRequest]::Create($uri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
$response.StatusDescription
$response.Close()
continue
}
"Uploading to " + $uri.AbsoluteUri + " from "+ $file.FullName
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
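One caveat with the loop above: FtpWebRequest's MakeDirectory method usually throws a WebException with a 550 status when the directory already exists, so re-running the script against an already-deployed site can fail. A minimal sketch of an idempotent directory branch (my variant, not part of the original script):
try {
    $response = $ftprequest.GetResponse()
    $response.StatusDescription
    $response.Close()
}
catch [System.Net.WebException] {
    # a 550 here typically just means the directory already exists; ignore and move on
    "Directory already exists (or is inaccessible): " + $uri.AbsolutePath
}
continue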
I also blogged about this in detail, covering how I troubleshot the issue to get to the fix, here.

Related

Error getting files. The collection has not been initialized

I am getting this error:
"Error getting files. The collection has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested."
Code:
#Add references to SharePoint client assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("WindowsBase")
#Variables
$SiteURL = "****************"
$LibraryName = "Documents"
$FolderName = "Customer Files"
Function Get-FilesFromFolder()
{
Try
{
#Load credentials of the admin account that has access to the library
$Cred = Get-Credential
$Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Cred.UserName, $Cred.Password)
#Building Context
$Ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Ctx.Credentials = $Credentials
#Get the library
$Library = $Ctx.Web.Lists.GetByTitle($LibraryName)
$Folders = $Library.RootFolder.Folders
$Ctx.Load($Folders)
$Ctx.ExecuteQuery()
#Get the folder by name
$Folder = $Folders | Where {$_.Name -eq $FolderName}
$Ctx.Load($Folder)
$Ctx.ExecuteQuery()
#Iterate through each file
Foreach($File in $Folder.Files)
{
#Write out each file name
Write-Host "Now printing names for files in the folder."
Write-Host "There are "$Folder.ItemCount" files in the folder."
Write-Host -f Green $File.Name
}
}
Catch
{
Write-Host -f Red "Error getting files." $_.Exception.Message
}
}
All the other guides I can find show examples of code that is missing "$Ctx.Load($Variable)", which I already have, so I am stuck.
Hoping someone can see what I am missing - Thanks.
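The usual cause of this CSOM error is enumerating $Folder.Files without ever requesting it: the Files collection is a client object in its own right and needs its own Load call before ExecuteQuery. A minimal sketch of the likely fix:
#Get the folder by name and load its Files collection as well
$Folder = $Folders | Where {$_.Name -eq $FolderName}
$Ctx.Load($Folder)
$Ctx.Load($Folder.Files)
$Ctx.ExecuteQuery()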

PowerShell Change Download Folder Pathway in Network Share

I have a script that downloads files from a report server and puts those files in a local network share. The script does what it needs to, but the download folder looks like this: hitsqlp -> Extracts -> output -> web16p... This is the path where the folder needs to live, but the script is replicating that path as subfolders, so now I have to click through every subfolder to get to the files.
I want the folder 'SSRSFolder' to be a subfolder of \epicsqlt\Extracts\Output\HIT\web16p
Code below, I'm not sure where I went wrong:
set-location -path \\epicsqlt\Extracts\Output\HIT\web16p
$downloadFolder = "\\epicsqlt\Extracts\Output\HIT\web16p"
$ssrsServer = "blahblahblah"
$secpasswd = ConvertTo-SecureString "password" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("username", $secpasswd)
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)" -Credential $mycreds
$ssrsProxy = New-WebServiceProxy -Uri "$($ssrsServer)" -UseDefaultCredential
$ssrsItems = $ssrsProxy.ListChildren("/", $true) | Where-Object {$_.TypeName -eq "DataSource" -or $_.TypeName -eq "Report"}
Foreach($ssrsItem in $ssrsItems)
{
# Determine extension for Reports and DataSources
if ($ssrsItem.TypeName -eq "Report")
{
$extension = ".rdl"
}
else
{
$extension = ".rds"
}
Write-Host "Downloading $($ssrsItem.Path)$($extension)";
$downloadFolderSub = $downloadFolder.Trim('\') + $ssrsItem.Path.Replace($ssrsItem.Name,"").Replace("/","\").Trim()
New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null
$ssrsFile = New-Object System.Xml.XmlDocument
[byte[]] $ssrsDefinition = $null
$ssrsDefinition = $ssrsProxy.GetItemDefinition($ssrsItem.Path)
[System.IO.MemoryStream] $memoryStream = New-Object System.IO.MemoryStream(@(,$ssrsDefinition))
$ssrsFile.Load($memoryStream)
$fullDataSourceFileName = $downloadFolderSub + "\" + $ssrsItem.Name + $extension;
$ssrsFile.Save($fullDataSourceFileName);
}
If I'm reading this right:
You are starting the script with
Set-Location -Path \\epicsqlt\Extracts\Output\HIT\web16p
Then you are setting the $downloadFolder variable to that same path and including $downloadFolder in your $downloadFolderSub creation. Since Trim('\') strips the leading backslashes off the UNC path, the result is the relative path
epicsqlt\Extracts\Output\HIT\web16p\somepath\somefolder\
and New-Item creates that whole path while you are already working in the \web16p\ folder, which is why the share's own path gets replicated as subfolders.
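A possible fix, as a sketch (it assumes you only want the report server's folder structure mirrored under the share): keep the UNC root intact and join the server-side path onto it, instead of trimming backslashes off the share path:
# build the server-relative part, e.g. "/SSRSFolder/MyReport" -> "SSRSFolder"
$relativePath = $ssrsItem.Path.Replace($ssrsItem.Name,"").Trim('/').Replace('/','\')
# items at the server root produce an empty $relativePath, so guard against that
$downloadFolderSub = if ($relativePath) { Join-Path -Path $downloadFolder -ChildPath $relativePath } else { $downloadFolder }
New-Item -ItemType Directory -Path $downloadFolderSub -Force > $null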

PowerShell Script Error in command but works in ISE

I am running a script in the ISE that essentially downloads a file from a public site:
#This PowerShell code scrapes the site and downloads the latest published file.
Param(
$Url = 'https://randomwebsite.com',
$DownloadPath = "C:\Downloads",
$LocalPath = 'C:\Temp',
$RootSite = 'https://publicsite.com',
$FileExtension = '.gz'
)
#Define the session cookie used by the site and automate acceptance.
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$cookie = New-Object System.Net.Cookie
$cookie.Name = "name"
$cookie.Value = "True"
$cookie.Domain = "www.public.com"
$session.Cookies.Add($cookie);
$FileNameDate = Get-Date -Format yyyyMMdd
$DownloadFileName = $DownloadPath + $FileNameDate + $FileExtension
$DownloadFileName
TRY{
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$WebSite = Invoke-WebRequest $Url -WebSession $session -UseBasicParsing #this gets the links we need from the main site.
$Table = $WebSite.Links | Where-Object {$_.href -like "*FetchDocument*"} | fl href #filter the results that we need.
#Write-Output $Table
$FilterTable=($Table | Select-Object -Unique | sort href -Descending) | Out-String
$TrimString = $FilterTable.Trim()
$FinalString = $RootSite + $TrimString.Trim("href :")
#Write-Verbose $FinalString | Out-String
#Start-Process powershell.exe -verb RunAs -ArgumentList "-File C:\some\path\base_server_settings.ps1" -Wait
Invoke-WebRequest $FinalString -OutFile $DownloadFileName -TimeoutSec 600
$ExpectedFileName = Get-ChildItem | Sort-Object LastAccessTime -Descending | Select-Object -First 1 $DownloadPath.Name | SELECT Name
$ExpectedFileName
Write-Host 'The latest DLA file has been downloaded and saved here:' $DownloadFileName -ForegroundColor Green
}
CATCH{
[System.Net.WebException],[System.IO.IOException]
Write "An error occured while downloading the latest file."
Write $_.Exception.Message
}
Expectation is that it downloads a file into the downloads folder and does in fact download the file when using the ISE.
When I try to run this as a command however (PowerShell.exe -File "/path/script.ps1") I get an error stating:
An error occurred while downloading the latest file. Operation is not valid due to the current state of the object.
out-lineoutput : The object of type
"Microsoft.PowerShell.Commands.Internal.Format.GroupEndData" is not
valid or not in the correct sequence. This is likely caused by a
user-specified "format-*" command which is conflicting with the
default formatting. At
\path\to\file\AutomatedFileDownload.ps1:29
char:9
$FilterTable=($Table | Select-Object -Unique | sort href -Des ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : InvalidData: (:) [out-lineoutput], InvalidOperationException
FullyQualifiedErrorId : ConsoleLineOutputOutOfSequencePacket,Microsoft.PowerShell.Commands.OutLineOutputCommand
I found several articles describing using the MTA or STA switch and I have tried to add in -MTA or -STA to the command, but it still gives me the same error in the command.
As commented, you are trying to get one link from the website, but you pipe your command to things like Format-List and Out-String, rendering the result as either nothing at all or a single multiline string. In both cases, this won't get you what you are after.
Not knowing the actual values of the links, of course, I suggest you try this:
Param(
$Url = 'https://randomwebsite.com',
$DownloadPath = "C:\Downloads",
$LocalPath = 'C:\Temp',
$RootSite = 'https://publicsite.com',
$FileExtension = '.gz'
)
# test if the download path exists and if not, create it
if (!(Test-Path -Path $DownloadPath -PathType Container)){
$null = New-Item -Path $DownloadPath -ItemType Directory
}
#Define the session cookie used by the site and automate acceptance.
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$cookie = New-Object System.Net.Cookie
$cookie.Name = "name"
$cookie.Value = "True"
$cookie.Domain = "www.public.com"
$session.Cookies.Add($cookie);
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
try {
$WebSite = Invoke-WebRequest -Uri $Url -WebSession $session -UseBasicParsing -ErrorAction Stop #this gets the links we need from the main site.
# get the file link
$lastLink = ($WebSite.Links | Where-Object {$_.href -like "*FetchDocument*"} | Sort-Object href -Descending | Select-Object -First 1).href
# create the file URL
$fileUrl = "$RootSite/$lastLink"
# create the full path and filename for the downloaded file
$DownloadFileName = Join-Path -Path $DownloadPath -ChildPath ('{0:yyyyMMdd}{1}' -f (Get-Date), $FileExtension)
Write-Verbose "Downloading $fileUrl as '$DownloadFileName'"
Invoke-WebRequest -Uri $fileUrl -OutFile $DownloadFileName -TimeoutSec 600 -ErrorAction Stop
# test if the file is downloaded
if (Test-Path -Path $DownloadFileName -PathType Leaf) {
Write-Host "The latest DLA file has been downloaded and saved here: $DownloadFileName" -ForegroundColor Green
}
else {
Write-Warning "File '$DownloadFileName' has NOT been downloaded"
}
}
catch [System.Net.WebException],[System.IO.IOException]{
Write-Host "An error occured while downloading the latest file.`r`n$($_.Exception.Message)" -ForegroundColor Red
}
catch {
Write-Host "An unknown error occured while downloading the latest file.`r`n$($_.Exception.Message)" -ForegroundColor Red
}
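With the formatting cmdlets gone, running the script non-interactively should behave the same as in the ISE; as a usage sketch (the path is illustrative):
PowerShell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\path\AutomatedFileDownload.ps1"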

Uploading to SharepointOnline subfolder using Powershell

It seems like I can upload a file to SPO's site collection URL but not to a subfolder. For example, I can upload to "https://xyz.sharepoint.com/sites/Reporting" but not to "https://xyz.sharepoint.com/sites/Reporting/Sales". Here's the relevant bit of the working code:
$Username = "me#domain.com"
$Password = "Password"
$SiteCollectionUrl = "https://xyz.sharepoint.com/sites/Reporting"
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$DocLibName = "Sales"
$Folder="C:\MySalesFolder"
Function Get-SPOCredentials([string]$UserName,[string]$Password)
{
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
return New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
}
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteCollectionUrl)
$Context.Credentials = Get-SPOCredentials -UserName $UserName -Password $Password
#Retrieve list
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List)
$Context.ExecuteQuery()
#Upload file
Foreach ($File in (dir $Folder -File))
{
$FileStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.ContentStream = $FileStream
$FileCreationInfo.URL = $File
$Upload = $List.RootFolder.Files.Add($FileCreationInfo)
$Context.Load($Upload)
$Context.ExecuteQuery()
}
I tried to hardcode the "/Sales" subfolder but no luck. Can anyone point me in the right direction?
According to Trying to upload files to subfolder in Sharepoint Online via Powershell, you will need to amend the FileCreationInfo URL:
$FileCreationInfo.URL = $List.RootFolder.ServerRelativeUrl + "/" + $FolderName + "/" + $File.Name
I believe you just need to prepend $File.Name with the folder path relative to your root directory.
Web.GetFolderByServerRelativeUrl Method:
As an alternative to uploading via List.RootFolder, you can use the "GetFolderByServerRelativeUrl" method to get your target folder:
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List.RootFolder)
$Context.ExecuteQuery()
$FolderName = "Sales"
$TargetFolder = $Context.Web.GetFolderByServerRelativeUrl($List.RootFolder.ServerRelativeUrl + "/" + $FolderName);
Then for upload you would use
$FileCreationInfo.URL = $File.Name
$UploadFile = $TargetFolder.Files.Add($FileCreationInfo)
$Context.Load($UploadFile)
$Context.ExecuteQuery()
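Putting the pieces together, a sketch of the full upload loop (variable names follow the question's code):
$TargetFolder = $Context.Web.GetFolderByServerRelativeUrl($List.RootFolder.ServerRelativeUrl + "/" + $FolderName)
Foreach ($File in (dir $Folder -File))
{
    $FileStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.ContentStream = $FileStream
    $FileCreationInfo.URL = $File.Name
    $UploadFile = $TargetFolder.Files.Add($FileCreationInfo)
    $Context.Load($UploadFile)
    $Context.ExecuteQuery()
    $FileStream.Close()
}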

Windows Azure Powershell Copying file to VM

I am trying to use Windows Azure PowerShell to copy a zip file into VM.
I have managed to connect to VM following the documentation.
But, I cannot find any tutorial to upload / copy / transfer a zip file to VM Disk, say into the C drive.
Can any one please help me giving any link for the tutorial or any idea how can I copy this?
Here is another approach that I documented here. It involves:
Creating and mounting an empty local VHD.
Copying your files to the new VHD and dismounting it.
Copying the VHD to Azure blob storage.
Attaching that VHD to your VM.
Here is an example:
#Create and mount a new local VHD
$volume = new-vhd -Path test.vhd -SizeBytes 50MB | `
Mount-VHD -PassThru | `
Initialize-Disk -PartitionStyle mbr -Confirm:$false -PassThru | `
New-Partition -UseMaximumSize -AssignDriveLetter -MbrType IFS | `
Format-Volume -NewFileSystemLabel "VHD" -Confirm:$false
#Copy my files
Copy-Item C:\dev\boxstarter "$($volume.DriveLetter):\" -Recurse
Dismount-VHD test.vhd
#upload the Vhd to azure
Add-AzureVhd -Destination http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd `
-LocalFilePath test.vhd
#mount the VHD to my VM
Get-AzureVM MyCloudService MyVMName | `
Add-AzureDataDisk -ImportFrom `
-MediaLocation "http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd" `
-DiskLabel "boxstarter" -LUN 0 | `
Update-AzureVM
Here is some code that I got from some PowerShell examples and modified. It works over a session created with New-PSSession; there's a cool wrapper for that also included below. Lastly, I needed to send a whole folder over, so that's here too.
Some example usage for tying them together
# open remote session
$session = Get-Session -uri $uri -credentials $credential
# copy installer to VM
Write-Verbose "Checking if file $installerDest needs to be uploaded"
Send-File -Source $installerSrc -Destination $installerDest -Session $session -onlyCopyNew $true
<#
.SYNOPSIS
Returns a session given the URL
.DESCRIPTION
http://michaelcollier.wordpress.com/2013/06/23/using-remote-powershell-with-windows-azure-vms/
#>
function Get-Session($uri, $credentials)
{
for($retry = 0; $retry -le 5; $retry++)
{
try
{
$session = New-PSSession -ComputerName $uri[0].DnsSafeHost -Credential $credentials -Port $uri[0].Port -UseSSL
if ($session -ne $null)
{
return $session
}
Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
Start-Sleep -Seconds 30
}
catch
{
Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
Start-Sleep -Seconds 30
}
}
}
<#
.SYNOPSIS
Sends a file to a remote session.
NOTE: will delete the destination before uploading
.EXAMPLE
$remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
Send-File -Source "c:\temp\myappdata.xml" -Destination "c:\temp\myappdata.xml" $remoteSession
Copy the required files to the remote server
$remoteSession = New-PSSession -ConnectionUri $frontEndwinRmUri.AbsoluteUri -Credential $credential
$sourcePath = "$PSScriptRoot\$remoteScriptFileName"
$remoteScriptFilePath = "$remoteScriptsDirectory\$remoteScriptFileName"
Send-File $sourcePath $remoteScriptFilePath $remoteSession
$answerFileName = Split-Path -Leaf $WebPIApplicationAnswerFile
$answerFilePath = "$remoteScriptsDirectory\$answerFileName"
Send-File $WebPIApplicationAnswerFile $answerFilePath $remoteSession
Remove-PSSession -InstanceId $remoteSession.InstanceId
#>
function Send-File
{
param (
## The path on the local computer
[Parameter(Mandatory = $true)]
[string]
$Source,
## The target path on the remote computer
[Parameter(Mandatory = $true)]
[string]
$Destination,
## The session that represents the remote computer
[Parameter(Mandatory = $true)]
[System.Management.Automation.Runspaces.PSSession]
$Session,
## should we quit if file already exists?
[bool]
$onlyCopyNew = $false
)
$remoteScript =
{
param ($destination, $bytes)
# Convert the destination path to a full filesystem path (to support relative paths)
$Destination = $ExecutionContext.SessionState.`
Path.GetUnresolvedProviderPathFromPSPath($Destination)
# Write the content to the new file
$file = [IO.File]::Open($Destination, "OpenOrCreate")
$null = $file.Seek(0, "End")
$null = $file.Write($bytes, 0, $bytes.Length)
$file.Close()
}
# Get the source file, and then start reading its content
$sourceFile = Get-Item $Source
# Delete the previously-existing file if it exists
$abort = Invoke-Command -Session $Session {
param ([String] $dest, [bool]$onlyCopyNew)
if (Test-Path $dest)
{
if ($onlyCopyNew -eq $true)
{
return $true
}
Remove-Item $dest
}
$destinationDirectory = Split-Path -Path $dest -Parent
if (!(Test-Path $destinationDirectory))
{
New-Item -ItemType Directory -Force -Path $destinationDirectory
}
return $false
} -ArgumentList $Destination, $onlyCopyNew
if ($abort -eq $true)
{
Write-Host 'Ignored file transfer - already exists'
return
}
# Now break it into chunks to stream
Write-Progress -Activity "Sending $Source" -Status "Preparing file"
$streamSize = 1MB
$position = 0
$rawBytes = New-Object byte[] $streamSize
$file = [IO.File]::OpenRead($sourceFile.FullName)
while (($read = $file.Read($rawBytes, 0, $streamSize)) -gt 0)
{
Write-Progress -Activity "Writing $Destination" -Status "Sending file" `
-PercentComplete ($position / $sourceFile.Length * 100)
# Ensure that our array is the same size as what we read from disk
if ($read -ne $rawBytes.Length)
{
[Array]::Resize( [ref] $rawBytes, $read)
}
# And send that array to the remote system
Invoke-Command -Session $session $remoteScript -ArgumentList $destination, $rawBytes
# Ensure that our array is the same size as what we read from disk
if ($rawBytes.Length -ne $streamSize)
{
[Array]::Resize( [ref] $rawBytes, $streamSize)
}
[GC]::Collect()
$position += $read
}
$file.Close()
# Show the result
Invoke-Command -Session $session { Get-Item $args[0] } -ArgumentList $Destination
}
<#
.SYNOPSIS
Sends all files in a folder to a remote session.
NOTE: will delete any destination files before uploading
.EXAMPLE
$remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
Send-Folder -Source 'c:\temp\' -Destination 'c:\temp\' $remoteSession
#>
function Send-Folder
{
param (
## The path on the local computer
[Parameter(Mandatory = $true)]
[string]
$Source,
## The target path on the remote computer
[Parameter(Mandatory = $true)]
[string]
$Destination,
## The session that represents the remote computer
# [Parameter(Mandatory = $true)]
[System.Management.Automation.Runspaces.PSSession]
$Session,
## should we quit if files already exist?
[bool]
$onlyCopyNew = $false
)
foreach ($item in Get-ChildItem $Source)
{
if (Test-Path $item.FullName -PathType Container) {
Send-Folder $item.FullName "$Destination\$item" $Session $onlyCopyNew
} else {
Send-File -Source $item.FullName -Destination "$destination\$item" -Session $Session -onlyCopyNew $onlyCopyNew
}
}
}
You cannot use PowerShell to copy a file directly to a Virtual Machine's OS disk (or even to one of its attached disks). There's no API for communicating directly with a Virtual Machine's innards (you'd need to create your own custom service for that).
You can use PowerShell to upload a file to a Blob, with Set-AzureStorageBlobContent.
At that point, you could notify your running app (possibly with a Queue message?) on your Virtual Machine that there's a file waiting for it to process. And the processing could be as simple as copying the file down to the VM's local disk.
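A sketch of the blob-upload half, using the classic Azure.Storage cmdlets (the account, key, container, and file names are placeholders):
# upload a local file to blob storage
$ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
Set-AzureStorageBlobContent -File "C:\temp\payload.zip" -Container "incoming" -Blob "payload.zip" -Context $ctx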
Install AzCopy from http://aka.ms/downloadazcopy
Read docs from: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
Get Blob Storage (Secondary) Key
Powershell: Blob Upload single file
.\AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /DestKey:key /Pattern:abc.txt
Logon to Remote VM
Powershell: Blob Download single file
.\AzCopy /Source:https://myaccount.file.core.windows.net/myfileshare/myfolder/ /Dest:C:\myfolder /SourceKey:key /Pattern:abc.txt
Another solution is to use a Custom Extension Script.
Using a custom script extension allows you to copy files to the VM even if the VM does not have a public IP (private network), so you don't need to configure WinRM or anything else.
I've used custom script extensions in the past for post-deployment tasks like installing an app on a VM or a scale set. Basically, you upload files to blob storage and the custom script extension downloads these files onto the VM.
I've created a test-container on my blob storage account and uploaded two files:
deploy.ps1: the script executed on the VM.
test.txt: a text file with "Hello world from VM"
Here is the code of the deploy.ps1 file:
Param(
[string] [Parameter(Mandatory=$true)] $filename,
[string] [Parameter(Mandatory=$true)] $destinationPath
)
# Getting the full path of the downloaded file
$filePath = $PSScriptRoot + "\" + $filename
Write-Host "Checking the destination folder..." -Verbose
if(!(Test-Path $destinationPath -Verbose)){
Write-Host "Creating the destination folder..." -Verbose
New-Item -ItemType directory -Path $destinationPath -Force -Verbose
}
Copy-Item $filePath -Destination $destinationPath -Force -Verbose
Here is the code to add a custom script extension to a virtual machine.
Login-AzureRMAccount
$resourceGroupName = "resourcegroupname"
$storageAccountName = "storageaccountname"
$containerName = "test-container"
$location = "Australia East"
$vmName = "TestVM"
$extensionName = "copy-file-to-vm"
$filename = "test.txt"
$deploymentScript = "deploy.ps1"
$destinationPath = "C:\MyTempFolder\"
$storageAccountKeys = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value
$storageAccountKey = $storageAccountKeys[0]
Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Location $location -TypeHandlerVersion "1.9" -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey -ContainerName $containerName -FileName $deploymentScript, $filename -Run $deploymentScript -Argument "$filename $destinationPath" -ForceRerun "1"
You can remove the extension after the file has been copied:
Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Force
In my scenario, I have a logic app that is triggered every time a new file is added to a container. The logic app calls a runbook (this requires an Azure Automation account) that adds the custom script extension and then deletes it.
I am able to copy the binary to the destination server but unable to install it. I am using the syntax below at the bottom of deploy.ps1:
powershell.exe Start-Process -Wait -PassThru msiexec -ArgumentList '/qn /i "c:\MyTempFolder\ddagent.msi" APIKEY="8532473174"'