write to text file on azure file share - powershell

I would like to move a PS script to Azure to run as a runbook. It basically goes through all new SharePoint sites and checks whether they meet certain criteria, e.g. it enables version control if it is not set.
The script works like this:
$sitesdonelist = "c:\log.txt"
$sitesdone = Get-Content -Path $sitesdonelist
foreach ($sitecoll in $sitecollections) {
    $currentsite = $sitecoll.Url
    if ($sitesdone -inotcontains $currentsite) {
        checksite
        Add-Content -Path $sitesdonelist -Value $sitecoll.Url
    }
}
I would like this code to work on Azure, and part of this is to move the sites-done list to an Azure file share and append the processed sites to it. So far I see two options and neither of them seems right:
Download the file to $env:TEMP, append and upload on finish: I will miss all the sites already done if something fails halfway.
Download the file to $env:TEMP, append and upload after every site: this would make the process slow and cause a lot of unnecessary data transfer.
Is there a better option? Can I write directly to a file on Azure file share from an Azure PowerShell runbook?

If you want to write something, such as a log entry, to a file in your Azure file share from an Azure Automation runbook, please try the PS commands below:
$appid = "<your application ID>"
$passwd = "<your Azure AD application Client secret>"
$tenantId= "<your tenant ID>"
$secpasswd = ConvertTo-SecureString -String $passwd -AsPlainText -Force
$cred = New-Object Management.Automation.PSCredential ($appid, $secpasswd)
Connect-AzAccount -ServicePrincipal -Credential $cred -Tenant $tenantId
$fileName = "<file name,including path>"
$storageacc = Get-AzStorageAccount -ResourceGroupName <resource group name> -Name <storage account name>
$file = (Get-AzStorageFile -ShareName qsfileshare -Context $storageacc.Context -Path $fileName)[0]
$content = "content you want to write"
# Append the new content to the existing file content and upload it back to the share
$file.UploadTextAsync($file.DownloadTextAsync().GetAwaiter().GetResult() + $content).GetAwaiter().GetResult()
# Get the content of the file we just wrote to
$file.DownloadTextAsync().GetAwaiter().GetResult()
Result in Automation:
Previously, the content of the file was "hello!!!" only; as you can see, the new content has been appended to the file. Hope it helps.
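For the original scenario, the append can be folded straight into the foreach loop so each processed site is written back to the share right away. A rough, untested sketch, reusing $storageacc, $fileName and the checksite function from above and the same qsfileshare share name:
$file = (Get-AzStorageFile -ShareName qsfileshare -Context $storageacc.Context -Path $fileName)[0]
$sitesdone = $file.DownloadTextAsync().GetAwaiter().GetResult() -split "`r?`n"
foreach ($sitecoll in $sitecollections) {
    if ($sitesdone -inotcontains $sitecoll.Url) {
        checksite
        # Append the processed site URL and push the file back to the share immediately
        $newText = $file.DownloadTextAsync().GetAwaiter().GetResult() + "`r`n" + $sitecoll.Url
        $file.UploadTextAsync($newText).GetAwaiter().GetResult()
    }
}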

Related

Upload file to Sharepoint Online (Microsoft 365) using Powershell (Option 1-Using PnP.Powershell)

I'm trying to upload a file into a SharePoint Online (M365) library subfolder, but it keeps giving me errors. I have tried many scripts. This post is about using PnP.PowerShell (I have posted questions about other scripts, hoping someone can help me with any of them).
This is the code:
$url = "https://mydomain.sharepoint.com/sites/mysharepointsite/"
$userID = "mail@foo.com"
$securePassword = ConvertTo-SecureString -String "myPassword" -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential -ArgumentList $userID, $securePassword
Install-Module -Name PnP.PowerShell
Connect-PnPOnline $url -Credentials $credentials
$files = Get-Item -Path "C:\myFolder\myFile.csv" -Force
foreach ($file in $files)
{
    Add-PnPFile -Folder "myLibraryname/subfolder" -Path $file.FullName
    Write-Host $file.Name "Uploaded into the Site" -BackgroundColor Green
}
It gives me this error:
Connect-PnPOnline : AADSTS65001: The user or administrator has not consented to use the application with ID 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx' named 'PnP Management Shell'. Send an interactive authorization request for this user and resource.
How do I grant this permission in M365? Is it safe to grant it?
Thanks
PnP PowerShell needs to be granted some privileges on your tenant.
Maybe you can try
Register-PnPManagementShellAccess
Ref: Connecting with PnP PowerShell
I guess it's safe if you ensure you are on the official build and not a customized (hacked) one.
Basically, it will grant the module the right to perform some tasks using delegated authorization. It won't have more privileges than your account.
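If a tenant admin needs to grant the consent rather than each user, something like the following may work. Sketch only; verify the cmdlet and its -ShowConsentUrl switch against your installed PnP.PowerShell version:
# Registers the "PnP Management Shell" Azure AD application for your tenant;
# -ShowConsentUrl prints a URL an admin can open to grant consent interactively.
Register-PnPManagementShellAccess -ShowConsentUrl
# Once consent has been granted, the original connection should succeed:
Connect-PnPOnline $url -Credentials $credentials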

How to copy a file from local work space to remote server (not a network shared path) with powershell

I am trying to copy a file from my local workspace to a remote server (not a network shared path) by using a PowerShell command through the "Inline PowerShell" task in a TFS vNext build definition.
FYI, destination path is not a network shared path
I tried the commands below:
$Session = New-PSSession -ComputerName "remote server name" -Credential "domain\username"
Copy-Item "$(Build.SourcesDirectory)\Test.htm" -Destination "C:\inetpub\wwwroot\aspnet_client\" -ToSession $Session
But it's prompting for the password every time; I tried entering the password manually and the result looks good.
How can we achieve this step without being prompted for the password or credentials?
Are you sure it's not on a network share? :)
PowerShell only takes the password as a secure string. You can use $credential = Get-Credential to render a really cool box to store those credentials for you, or if you want to store your login programmatically (not recommended for obvious security reasons) use this:
$passwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<username>",$passwd)
There might be a way to inherit your current domain credentials, but that's way beyond me, and a quick google search turns up nothing.
EDIT: Sorry I forgot to post the whole thing:
$passwd = ConvertTo-SecureString "<password>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<username>",$passwd)
$Session = New-PSSession -ComputerName "remote server name" -Credential $credential
Copy-Item "$(Build.SourcesDirectory)\Test.htm" -Destination "C:\inetpub\wwwroot\aspnet_client\" -ToSession $Session
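To avoid hard-coding the password in the build definition, one option is to read it from a secret build variable instead. A sketch, assuming your Inline PowerShell task exposes the secret as an environment variable, hypothetically named DEPLOY_PASSWORD here:
# DEPLOY_PASSWORD is a hypothetical secret pipeline variable mapped to an environment variable
$passwd = ConvertTo-SecureString $env:DEPLOY_PASSWORD -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("domain\username", $passwd)
$Session = New-PSSession -ComputerName "remote server name" -Credential $credential
Copy-Item "$(Build.SourcesDirectory)\Test.htm" -Destination "C:\inetpub\wwwroot\aspnet_client\" -ToSession $Session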

Azure Runbook - Get a file from Azure File System Storage

I am creating an Azure workflow runbook wherein I have to get a file from Azure File System Storage and publish it to an Azure web app.
I tried New-PSDrive but that command is not supported in a runbook (even InlineScript doesn't work). Could anyone help me with the script? In the code below I need to populate the file path from the Azure file storage.
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzureRmAccount -ServicePrincipal -Tenant $Conn.TenantID `
-ApplicationId $Conn.ApplicationID `
-CertificateThumbprint $Conn.CertificateThumbprint
$zipFilePath = ???
Publish-AzureWebsiteProject -Name $siteName -Package $zipFilePath
I searched a lot but couldn't find much information on this.
Are you referring to a file in an Azure Storage account? If so, that is pretty easy to accomplish. Add the following to your Runbook, filling in the required information:
$StorageAccountKey = Get-AutomationVariable -Name 'storageKey'
$Context = New-AzureStorageContext -StorageAccountName 'your-storage' `
-StorageAccountKey $StorageAccountKey
Get-AzureStorageFileContent -ShareName 'your-share' -Context $Context `
-path 'your-file' -Destination 'C:\Temp'
$filePath = Join-Path -Path 'C:\Temp' -ChildPath 'your-file'
You also need to create a variable in your Automation Account, called "storageKey", containing your Storage Account's key.
Mounting Azure File share as a drive is not currently supported in Automation cloud jobs, though it will probably be supported in a few months. In the meantime, use the Get-AzureStorageFile command from the Azure.Storage module to retrieve the file to a temp folder.
Alternatively, run this job on a Hybrid worker. In this case, make sure all the prerequisites are met in order to mount the share as a network drive.
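On a Hybrid Worker the mount itself is just an SMB mapping; a minimal sketch with placeholder storage account, share and key values:
# Map the Azure file share as drive Z: using the storage account name and key
$storageKey = ConvertTo-SecureString -String "<storage account key>" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential("Azure\<storage account name>", $storageKey)
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\<storage account name>.file.core.windows.net\<share name>" -Credential $cred -Persist
$zipFilePath = "Z:\<your-package>.zip"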

How do I deploy to Azure App Service with PowerShell?

I have looked around and with the thousands of commands in the Azure and AzureRM commandlets in PowerShell, I'm still not sure how to do this.
What I have working so far:
Installed Azure and AzureRM modules and imported them to the script
Generated the "*.publishsettings" file from the get-AzurePublishSettingsFile command
Imported the "*.publishsettings" file
Can access the website with the "Stop-AzureWebsite" and "Start-AzureWebsite" commandlets
What I need to do:
create a new deployment and push files to the app-service site.
Notes: I do not have a Visual Studio project and .csproj file configs. I simply want to take the contents of a folder and push that to the website.
Any help would be useful as the documentation is really bad on details and there are thousands of commands in PowerShell to go through.
You could check this blog: Deploy an App Service using Azure PowerShell to a Deployment Slot.
Get-AzurePublishSettingsFile
Import-AzurePublishSettingsFile .\Your-Publish-Settings-credentials.publishsettings
Get-AzureSubscription
Select-AzureSubscription -SubscriptionName "The Subscription Name containing the slot"
Set-AzureSubscription -SubscriptionId "ID of subscription"
$WebAppName = "standard(staging)"
Get-AzureWebsite -Name $WebAppName
Publish-AzureWebsiteProject -Name $WebAppName -Package "C:\PowerShell\standard.zip" -Slot "staging"
The above link (https://blogs.msdn.microsoft.com/benjaminperkins/2016/10/01/deploy-an-app-service-using-azure-powershell-to-a-deployment-slot/) talks about a Git-based deployment. The OP wanted something from a folder.
Check this one out -
Create an Azure Website with PowerShell and FTP
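The gist of an FTP-based push is small; a sketch with placeholder values taken from the site's publish profile (FTP host, publishing user and password):
# Upload a single file from the local folder into the site's wwwroot over FTP
$ftpUrl = "ftp://<ftp host from publish profile>/site/wwwroot/index.html"
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("<publishing user>", "<publishing password>")
$client.UploadFile($ftpUrl, "C:\mySiteFolder\index.html")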
Unfortunately the accepted answer gave me the following error:
Get-AzureWebSite : Requested value 'PremiumV2' was not found
This StackOverflow answer suggests using Get-AzureRmWebApp instead, but this introduces some challenges with authentication. After some searching I found the following article, which explained exactly what I needed: an approach to publish to Azure without any human interaction.
Please see a very simplified version of the script below.
#In the Azure portal go to (search for) "Azure Active Directory" ->
#"Properties" -> Directory ID
$TenantId = "<Azure Active Directory Id>"
#In the Azure portal go to (search for) "Subscriptions" -> Subscription ID
$SubscriptionId = "<Azure Subscription Id>"
#In the Azure portal go to (search for) "Azure Active Directory" -> "App registrations" ->
#Create a new registration, this will give you the ID and Secret below.
#Make sure to give your new app registration sufficient rights to your app service
$ServicePrincipleApplicationId = "<Service Principle Id>"
$ServicePrincipleApplicationSecret = "<Service Principle Secret>"
$WebAppPath = "<Local folder where your package is located>"
$ResourceGroupName = "<The name of the Azure resource group that contains your app service>"
$WebAppName = "<The name of your Azure app service>"
$WebAppSlot = "<The name of the deployment slot you want to publish to>"
$MSDeployPath = "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe"
$source = "-source:contentPath=$WebAppPath"
$publishProfileOutputPath = Join-Path -Path $ENV:Temp -ChildPath 'publishprofile.xml'
$dest = "-dest:contentPath=d:\home\site\wwwroot\,publishSettings=$publishProfileOutputPath"
$SecurePassword = $ServicePrincipleApplicationSecret | ConvertTo-SecureString -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $ServicePrincipleApplicationId, $securePassword
$connectParameters = @{
Credential = $Credential
TenantId = $TenantId
SubscriptionId = $SubscriptionId
}
Add-AzureRmAccount @connectParameters -ServicePrincipal
Get-AzureRmWebAppSlotPublishingProfile -OutputFile $publishProfileOutputPath -Format WebDeploy -ResourceGroupName $ResourceGroupName -Name $WebAppName -Slot $WebAppSlot
Stop-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $WebAppName -Slot $WebAppSlot
& $MSDeployPath @('-verb:sync', $source, $dest)
Start-AzureRmWebAppSlot -ResourceGroupName $ResourceGroupName -Name $WebAppName -Slot $WebAppSlot
To deploy your zip package to an Azure Web App Service, use the Publish-AzWebApp PowerShell cmdlet.
Refer to the MS Docs.
Connect to your Azure subscription via PowerShell, then execute Publish-AzWebApp to deploy the web app.
$webAppName = "<NameOfWebAppService>"
$resourceGroup = "<WebAppResourceGroupName>"
$zipArchiveFullPath = "<zip-package-filePath\FileName.zip>"
Publish-AzWebApp -ResourceGroupName "$resourceGroup" -Name "$webAppName" -ArchivePath "$($zipArchiveFullPath)" -Force
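If you only have a folder and no zip yet, you can create the archive first; a small sketch with a placeholder source folder:
# Zip the folder contents, then push the archive to the app service
Compress-Archive -Path "C:\mySiteFolder\*" -DestinationPath "$env:TEMP\site.zip" -Force
Publish-AzWebApp -ResourceGroupName $resourceGroup -Name $webAppName -ArchivePath "$env:TEMP\site.zip" -Force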

Import saved AzureRMContext still requires password

I was using Save-AzureRmProfile for all my scripts to execute Azure requests in parallel. I borrowed this idea from "auto login to azure with powershell".
I had to update my systems to the latest version (AzureRM > 4), and despite the fact that the AzureRmProfile cmdlets are now AzureRmContext cmdlets, I still cannot use them as before.
Scenario
Open a PS console and execute
Save-AzureRmContext -Profile (Add-AzureRmAccount) -Path myprofile.json
# List my VMs
Get-AzureRmVm
Open a second PS console
Import-AzureRmContext -Path myprofile.json
# List my VMs
Get-AzureRmVm
Get-AzureRmVM : Your Azure credentials have not been set up or have expired, please run Login-AzureRMAccount to set up your Azure credentials.
How can I reuse my profile to be loaded in parallel executions?
There's a bug in the cmdlets. Not much you can do (only downgrade).
Track it here: https://github.com/Azure/azure-powershell/issues/3954
Here are a couple of workarounds.
Simple, in memory workaround, would need to be added whenever you import a context:
$ctx = Import-AzureRmContext -Path <path-to-context>
$ctx.Context.TokenCache.Deserialize($ctx.Context.TokenCache.CacheData)
More complex workaround. This creates a permanent file, TokenCache.dat, which, if present, may allow you to avoid this problem on a machine altogether.
In a new POSH window:
$ctx = Import-AzureRmContext -Path <path-to-saved-context>
$session = [Microsoft.Azure.Commands.Common.Authentication.AzureSession]::Instance
$cacheFile = [System.IO.Path]::Combine($session.ProfileDirectory, $session.TokenCacheFile)
if (Test-Path $cacheFile) {
$session.DataStore.CopyFile($cacheFile, ($cacheFile + ".bak"))
}
$session.DataStore.WriteFile( $cacheFile, [System.Security.Cryptography.ProtectedData]::Protect($ctx.Context.TokenCache.CacheData, $null, [System.Security.Cryptography.DataProtectionScope]::CurrentUser))
$session.TokenCache = New-Object -TypeName Microsoft.Azure.Commands.Common.Authentication.ProtectedFileTokenCache -ArgumentList $cacheFile
[Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile.DefaultContext.TokenCache = $session.TokenCache
Note that this problem should be fixed in the next release.
As a workaround, until the issue is fixed or I downgrade my PS installation, I used:
$azureAccountName = "my.email@example.com"
$Password = "12345678"
$azurePassword = ConvertTo-SecureString $Password -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($azureAccountName, $azurePassword)
Then in my parallel ScriptBlock I do a call like this to replace the broken import of credentials:
Login-AzureRmAccount -Credential $psCred
Not the kind of solutions I'm proud of but... it did the trick.
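For context, the parallel call looked roughly like this (a simplified sketch, not the exact production script):
$job = Start-Job -ScriptBlock {
    param($cred)
    # Re-authenticate inside the job instead of importing the saved context
    Login-AzureRmAccount -Credential $cred | Out-Null
    Get-AzureRmVM
} -ArgumentList $psCred
Wait-Job $job | Receive-Job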