I use PowerShell to download a blob from blob storage in an Azure startup task. I updated the Microsoft.WindowsAzure.Storage library today from 3.0.3.0 to 4.0.1.0 via NuGet.
After the library update, files are still downloaded correctly, but I get this sort of warning in the command window:
'Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.'
function download_from_storage ($container, $blob, $connection, $destination) {
    Add-Type -Path ((Get-Location).Path + '\Microsoft.WindowsAzure.Storage.dll')
    $storageAccount = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse($connection)
    $blobClient = New-Object Microsoft.WindowsAzure.Storage.Blob.CloudBlobClient($storageAccount.BlobEndpoint, $storageAccount.Credentials)
    $container = $blobClient.GetContainerReference($container)
    $remoteBlob = $container.GetBlockBlobReference($blob)
    $remoteBlob.DownloadToFile($destination + "\" + $blob, [System.IO.FileMode]::OpenOrCreate)
}
$connection_string = 'DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>'
# JRE
$jre = 'jre-7u60-windows-x64.exe'
$node = 'node-v0.10.29-x64.msi'
download_from_storage 'java-runtime' $jre $connection_string (Get-Location).Path
download_from_storage 'nodejs' $node $connection_string (Get-Location).Path
Since it is still working, I am just clueless as to why the message occurs in the first place.
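The warning text says to retrieve the LoaderExceptions property; a rough diagnostic sketch (not something I have fully verified, and the error may surface slightly differently inside a startup task) is to wrap the Add-Type call so the individual loader failures get printed:
try {
    Add-Type -Path ((Get-Location).Path + '\Microsoft.WindowsAzure.Storage.dll') -ErrorAction Stop
}
catch {
    # If the underlying exception is a ReflectionTypeLoadException, list the loader failures
    if ($_.Exception -is [System.Reflection.ReflectionTypeLoadException]) {
        $_.Exception.LoaderExceptions | ForEach-Object { Write-Warning $_.Message }
    }
    else {
        throw
    }
}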
This is not exactly an answer to your question but here is a much simpler way of downloading files from blob storage:
$dlPath = "C:\temp\"
$container = "BlobContainer"
Set-AzureSubscription "NameOfYourSubscription" -CurrentStorageAccount "storageAccountName"
Get-AzureStorageContainer $container | Get-AzureStorageBlob |
Get-AzureStorageBlobContent -Destination $container
You can do this by installing Azure PowerShell itself in the startup task and then executing the blob download cmdlet. Here are roughly the steps:
Installing Azure PowerShell automatically
Create new service project (New-AzureServiceProject)
Execute Add-AzureWebRole
Change the cscfg to use osFamily=3 (to get a newer PowerShell version that is compatible with Azure PowerShell)
Copy the Azure PowerShell MSI into the WebRole1\bin directory
Edit WebRole1\startup.cmd to include this line: msiexec /i AzurePowerShell.msi /quiet
Authenticating Azure PowerShell so it can execute cmdlets (if you only want to use the storage cmdlets, you can skip this step and pass your storage account name/key when executing the Get-AzureStorageBlobContent cmdlet, as in the sketch after these steps)
Copy your latest publish settings file (myPublishSettings.publishsettings) into the WebRole1\bin folder
Edit WebRole1\startup.cmd to include this line after the one added before: PowerShell.exe -Command "Import-AzurePublishSettingsFile .\myPublishSettings.publishsettings"
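For the storage-key-only route mentioned in the authentication step, the download could look roughly like this (a sketch only; the account name and key are placeholders, and the container/blob names mirror the question):
# Sketch: download a blob in the startup task using only the storage cmdlets.
# <AccountName>/<AccountKey> are placeholders; container and blob names are taken from the question.
$ctx = New-AzureStorageContext -StorageAccountName "<AccountName>" -StorageAccountKey "<AccountKey>"
Get-AzureStorageBlobContent -Container "java-runtime" -Blob "jre-7u60-windows-x64.exe" -Destination (Get-Location).Path -Context $ctx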
Related
I am working on an SSAS Tabular project in Visual Studio 2017 where I want to automate build and deploy (and testing) locally and in Azure Analysis Services. The project is connected to an Azure DevOps project and that works.
But I am struggling a lot when it comes to building and deploying the project in Azure DevOps. I have followed a blog on this topic (https://notesfromthelifeboat.com/post/analysis-services-1-deployment/), and the author has created some PowerShell scripts that work on my computer when I run them locally in Windows PowerShell ISE. But when I tried to create a build pipeline in DevOps using the same PowerShell file in a PowerShell task, it fails. I created some variables and set a reference to the PowerShell file. So far, so fine. When I tried to run the build I got an error saying:
Import-Module : The specified module 'SqlServer' was not loaded
because no valid module file was found in any module
It seems that PowerShell in DevOps is not able to load the module via Import-Module -Name SqlServer. I have searched the net for a solution, but nothing has worked so far, and neither have other variations of the Import-Module command. I have discovered that there is a small difference in the $env:PSModulePath environment variable between PowerShell ISE and the PowerShell CI build task, but I am not sure if that is the problem.
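A small diagnostic snippet like the one below could be dropped into the build task to compare the two environments; it only prints information and changes nothing:
# Print the module search path the agent uses and every SqlServer module it can actually see
Write-Host "PSModulePath: $env:PSModulePath"
Get-Module -ListAvailable -Name SqlServer | Select-Object Name, Version, ModuleBase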
If any of you have experience with this issue, or have a better way to deploy an SSAS Tabular model locally and especially to Azure from a build/release pipeline (maybe some of you have experience with this kind of automation), I would appreciate the help.
build setup on devops
Error from running the build
Powershell script
Command: .\deploy_model.ps1 -workspace c:\develop\tabular-automation -environment validation -analysisServicesUsername test_ssas -analysisServicesPassword test_ssas
param(
[Parameter(Mandatory)]
[string]$workspace,
[Parameter(Mandatory)]
[string]$environment,
[Parameter(Mandatory)]
[string]$analysisServicesUsername,
[Parameter(Mandatory)]
[string]$analysisServicesPassword,
[string]$databaseServer = "localhost",
[string]$analysisServicesServer = "localhost"
)
Import-Module -Name SqlServer
$ErrorActionPreference = "Stop"
# Build the model
$msbuild = 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\MSBuild\15.0\Bin\MSBuild.exe'
& "$msbuild" TabularExample.smproj "/p:Configuration=$environment" /t:Clean,Build /p:VisualStudioVersion=14.0
# Copy build outputs and deployment options to deployment directory
$deploymentDir = ".\deployment"
mkdir -Force $deploymentDir
cp "bin\$environment\*.*" $deploymentDir
cp .\deploymentoptions\*.* $deploymentDir
# Update deployment targets with parameters
$template = Get-Content .\deploymentoptions\Model.deploymenttargets
$expandedTemplate = $ExecutionContext.InvokeCommand.ExpandString($template)
$expandedTemplate | Set-Content "$deploymentDir\Model.deploymenttargets"
# Create the deployment script
Microsoft.AnalysisServices.Deployment.exe "$deploymentDir\Model.asdatabase" /s:"$deploymentDir\deploy.log" /o:"$deploymentDir\deploy.xmla" | Out-Default
# Deploy the model
$SECURE_PASSWORD = ConvertTo-SecureString $analysisServicesPassword -AsPlainText -Force
$CREDENTIAL = New-Object System.Management.Automation.PSCredential ($analysisServicesUsername, $SECURE_PASSWORD)
Invoke-ASCmd -InputFile "$workspace\$deploymentDir\deploy.xmla" -Server $analysisServicesServer -Credential $CREDENTIAL
For third-party modules, our solution has been to modify $env:PSModulePath to point to a network location that holds the version of the module we want our build agents to run. I use code like the example below. (We also add the relative path of our custom modules, stored in the same repo, to PSModulePath, but I removed that part of the code since you do not state that you have any custom modules.)
I like this better than constantly running Install-Module because I have better control over which versions of the modules are running, and I do not have to worry about our build boxes constantly communicating with the PowerShell Gallery.
try {
    Import-Module SQLServer -Force -ErrorAction Stop
}
catch {
    # Fall back to the published network location when the module is not on the agent
    $networkPath = "\\Network path to Modules\"
    if (!(Test-Path $networkPath)) {
        Write-Error "Can not set env:PSModulePath to the published location on the network" -ErrorAction Stop
    }
    else {
        if (!($env:PSModulePath -like "*;$networkPath*")) {
            $env:PSModulePath = $env:PSModulePath + ";$networkPath"
        }
    }
    Import-Module SQLServer -Force -DisableNameChecking -ErrorAction Stop
}
I need a PowerShell script which will deploy an ASP.NET Core app's artifacts to an Azure Website (App Service). Searching the Internet, I managed to find this script:
param($websiteName, $packOutput)
$website = Get-AzureWebsite -Name $websiteName
Stop-AzureWebsite -Name $websiteName
# get the scm url to use with MSDeploy. By default this will be the second in the array
$msdeployurl = $website.EnabledHostNames[1]
$publishProperties = @{'WebPublishMethod'='MSDeploy';
'MSDeployServiceUrl'=$msdeployurl;
'DeployIisAppPath'=$website.Name;
'Username'=$website.PublishingUsername;
'Password'=$website.PublishingPassword
}
$publishScript = "${env:ProgramFiles(x86)}\Microsoft Visual Studio 14.0\Common7\IDE\Extensions\Microsoft\Web Tools\Publish\Scripts\1.2.0\default-publish.ps1"
. $publishScript -publishProperties $publishProperties -packOutput $packOutput
Start-AzureWebsite -Name $websiteName
I am using it the way it is shown on the screenshot:
But... nothing happens as a result of the MSDeploy command execution: no errors, no data deployed.
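Since the screenshot is not reproduced here, the call looks roughly like this (the script name, site name, and pack output path are placeholders):
# Hypothetical invocation of the script above; all three values are placeholders.
.\deploy.ps1 -websiteName "my-azure-website" -packOutput "C:\build\packOutput"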
So, what is the correct way of deploying ASP.NET Core artifacts with PowerShell?
Visual Studio can generate a Windows PowerShell publish script for deploying to a website. The publish script may look like this:
publish script
[cmdletbinding(SupportsShouldProcess=$true)]
param($publishProperties=@{}, $packOutput, $pubProfilePath)
# to learn more about this file visit https://go.microsoft.com/fwlink/?LinkId=524327
try{
if ($publishProperties['ProjectGuid'] -eq $null){
$publishProperties['ProjectGuid'] = 'xxxxxxxx-0260-4800-b864-e9afa92d7fc2'
}
$publishModulePath = Join-Path (Split-Path $MyInvocation.MyCommand.Path) 'publish-module.psm1'
Import-Module $publishModulePath -DisableNameChecking -Force
# call Publish-AspNet to perform the publish operation
Publish-AspNet -publishProperties $publishProperties -packOutput $packOutput -pubProfilePath $pubProfilePath
}
catch{
"An error occurred during publish.`n{0}" -f $_.Exception.Message | Write-Error
}
And a publish module (publish-module.psm1) that contains the functions used in the script. For more information about publish scripts for deploying to a website, please refer to this documentation.
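A hedged usage sketch, assuming the generated script is saved as publish.ps1 next to publish-module.psm1; the service URL, site name, and credentials below are placeholders:
# All values below are placeholders for illustration only.
$publishProperties = @{
    'WebPublishMethod'   = 'MSDeploy'
    'MSDeployServiceUrl' = 'mysite.scm.azurewebsites.net:443'
    'DeployIisAppPath'   = 'mysite'
    'Username'           = '$mysite'
    'Password'           = '<publishing-password>'
}
.\publish.ps1 -publishProperties $publishProperties -packOutput "C:\build\packOutput"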
I have an Azure Cloud Service Worker Role which needs a separate Windows Service installed to redirect application tracing to a centralized server. I've placed the installation binaries for this Windows Service in a Storage Account's file storage as shown below. I then have my startup task call a batch file, which in turn executes a PowerShell script to retrieve the file and install the service.
When Azure deploys a new instance of the role, the script execution fails with the following error:
Cannot find path
'\\{name}.file.core.windows.net\utilities\slab1-1.zip' because it does
not exist
However, when I run the script after connecting through RDP, all is fine. Does anybody know why this might be happening? Here is the script below...
cmdkey /add:$storageAccountName.file.core.windows.net /user:$shareUser /pass:$shareAccessKey
net use * \\$storageAccountName.file.core.windows.net\utilities
mkdir slab
copy \\$storageAccountName.file.core.windows.net\utilities\$package .\slab\$package
I have always had problems here and there using a script to access a mounted Azure file drive. I believe this is more or less related to the drive being mounted only for the current user, so it may not always work the same when called from a script.
I ended up pulling files from Azure Files the hard way, without a network drive.
$source = $storageAccountName
$sourceKey = $shareAccessKey
$sharename = "utilities"
$package = "slab1-1.zip"
$dest = ".\slab\" + $package
#Define Azure file share root
$ctx=New-AzureStorageContext $source $sourceKey
$share = get-AzureStorageShare $sharename -Context $ctx
Get-AzureStorageFileContent -share $share -Destination $dest -Path $package -confirm:$false
Code example here will get you a good start:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
It would be harder to manage if you have a more complex folder structure, but the objects there are CloudFileDirectory and CloudFile, and their properties and methods work seamlessly for me in PowerShell 4.0.
*The Azure PowerShell module is required for the 'Get-AzureStorageFileContent' cmdlet.
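A rough sketch of that SDK approach from PowerShell, assuming Microsoft.WindowsAzure.Storage.dll is available next to the script; the connection string is a placeholder, and the share/file names mirror the question:
# Load the storage client library and walk the file share without mounting a drive
Add-Type -Path ".\Microsoft.WindowsAzure.Storage.dll"
$account = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse("DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>")
$fileClient = $account.CreateCloudFileClient()
$share = $fileClient.GetShareReference("utilities")
$rootDir = $share.GetRootDirectoryReference()
$file = $rootDir.GetFileReference("slab1-1.zip")
$file.DownloadToFile(".\slab\slab1-1.zip", [System.IO.FileMode]::Create)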
I have a requirement to download files from an AWS S3 bucket to a local folder, count the number of files in the local folder, check against S3, and send an email with the number of files.
I tried to download files from S3, but I am getting an error like 'Get-S3Object : CommandNotFoundException'. How do I resolve this issue?
Here is my code:
# Your account access key - must have read access to your S3 Bucket
$accessKey = "YOUR-ACCESS-KEY"
# Your account secret access key
$secretKey = "YOUR-SECRET-KEY"
# The region associated with your bucket e.g. eu-west-1, us-east-1 etc. (see http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-regions)
$region = "eu-west-1"
# The name of your S3 Bucket
$bucket = "my-test-bucket"
# The folder in your bucket to copy, including trailing slash. Leave blank to copy the entire bucket
$keyPrefix = "my-folder/"
# The local file path where files should be copied
$localPath = "C:\s3-downloads\"
$objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region
foreach($object in $objects) {
$localFileName = $object.Key -replace $keyPrefix, ''
if ($localFileName -ne '') {
$localFilePath = Join-Path $localPath $localFileName
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region
}
}
Since this question is one of the top Google results for "powershell download s3 files" I'm going to answer the question in the title (even though the actual question text is different):
Read-S3Object -BucketName "my-s3-bucket" -KeyPrefix "path/to/directory" -Folder .
You might need to call Set-AWSCredentials if it's not a public bucket.
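For example (the keys and region are placeholders; Set-AWSCredentials stores them for the current session):
# Placeholder credentials and region; set once, then download the prefix
Set-AWSCredentials -AccessKey "YOUR-ACCESS-KEY" -SecretKey "YOUR-SECRET-KEY"
Set-DefaultAWSRegion -Region "eu-west-1"
Read-S3Object -BucketName "my-s3-bucket" -KeyPrefix "path/to/directory" -Folder .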
Similar to Will's example, if you want to download the whole content of a "folder" keeping the directory structure try:
Get-S3Object -BucketName "my-bucket" -KeyPrefix "path/to/directory" | Read-S3Object -Folder .
The AWS documentation at https://docs.aws.amazon.com/powershell/latest/reference/items/Read-S3Object.html provides examples with fancier filtering.
If you have installed the AWS PowerShell module but still see this error, you haven't correctly loaded it into your current session; the error you quoted means that the given cmdlet can't be found.
First verify that the module is installed, then load it by any of the options below:
Load the module into an existing session (PowerShell v3 and v4):
From the documentation:
In PowerShell 4.0 and later releases, Import-Module also searches the Program Files folder for installed modules, so it is not necessary to provide the full path to the module. You can run the following command to import the AWSPowerShell module. In PowerShell 3.0 and later, running a cmdlet in the module also automatically imports a module into your session.
To verify correct installation, add the following command to the beginning of your script:
PS C:\> Import-Module AWSPowerShell
Load the module into an existing session (PowerShell v2):
To verify correct installation, add the following command to the beginning of your script:
PS C:\> Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
Open a new session with the Windows PowerShell for AWS desktop shortcut:
A shortcut is added to your desktop that starts PowerShell with the correct module loaded into the session. If your installation was successful, this shortcut should be present and should also correctly load the AWS PowerShell module without additional effort from you.
From the documentation:
The installer creates a Start Menu group called Amazon Web Services, which contains a shortcut called Windows PowerShell for AWS. For PowerShell 2.0, this shortcut automatically imports the AWSPowerShell module and then runs the Initialize-AWSDefaults cmdlet. For PowerShell 3.0, the AWSPowerShell module is loaded automatically whenever you run an AWS cmdlet. So, for PowerShell 3.0, the shortcut created by the installer only runs the Initialize-AWSDefaults cmdlet. For more information about Initialize-AWSDefaults, see Using AWS Credentials.
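In other words, what that shortcut effectively runs is roughly the following (Initialize-AWSDefaults will prompt for credentials and a region if none are stored yet):
# Equivalent of the desktop shortcut on PowerShell 2.0, per the quoted documentation
Import-Module AWSPowerShell
Initialize-AWSDefaults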
Further Reading:
AWS PowerShell Documentation - Download and Install the AWS Tools for Windows PowerShell
AWS PowerShell Documentation - Setting up the AWS Tools for Windows PowerShell
I have a PowerShell script that I wrote to back up a local SQL Server to an Azure blob. It's based on one I took from MSDN, but I added an extra feature to delete any old backups that are over 30 days old. When I run this as a user, it works fine. When I added this to Task Scheduler, set to run as me, and manually ask for it to run, it works fine. (All output is captured in a log file, so I can see that it's all working.) When run from Task Scheduler at night when I'm not logged in (the task is set to run the script as me), it fails. Specifically, it claims my Azure subscription name is not known when I call Set-AzureSubscription. Then it fails when trying to delete the blob with:
Get-AzureStorageBlob : Can not find your azure storage credential. Please set current storage account using "Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
The script in question:
import-module sqlps
import-module azure
$storageAccount = "storageaccount"
$subscriptionName = "SubName"
$blobContainer = "backup"
$backupUrlContainer = "https://$storageAccount.blob.core.windows.net/$blobContainer/"
$credentialName = "creds"
Set-AzureSubscription -CurrentStorageAccountName $storageAccount -SubscriptionName $subscriptionName
$path = "sqlserver:\sql\servername\SQLEXPRESS\databases"
$alldatabases = get-childitem -Force -path $path | Where-object {$_.name -eq "DB0" -or $_.name -eq "DB1"}
foreach ($db in $alldatabases)
{
Backup-SqlDatabase -BackupContainer $backupUrlContainer -SqlCredential $credentialName $db
}
$oldblobs = Get-AzureStorageBlob -container backup | Where-object { $_.name.Contains("DB") -and (-((($_.LastModified) - $([DateTime]::Now)).TotalDays)) -gt $(New-TimeSpan -Days 30).TotalDays }
foreach($blob in $oldblobs)
{
Write-Output $blob.Name
Remove-AzureStorageBlob -Container "backup" -Blob $blob.Name
}
The backup part of the script works, just not the blob deletion parts. It would appear that something is being done to the environment when I log in that allows the Azure PowerShell scripts to work, but that isn't being done when I run the command at night when I'm not logged in.
Anyone have any idea what that might be?
Task Scheduler is set to run the command with:
Powershell -Command "C:\Scripts\BackupDatabases.ps1" 2>&1 >> "C:\Logs\backup.log"
The Azure PowerShell environment just needs to understand what Azure subscription to work with by default. You probably did this for your own environment, but the task scheduler is running in a different environment.
You just need to add an additional command to the beginning of your script to set the Azure subscription. Something like this:
Set-AzureSubscription -SubscriptionName
The documentation for this command is here. You can also set by SubscriptionID etc. instead of SubscriptionName.
In addition, this article walks through how to connect your Azure subscription to the PowerShell environment.
UPDATE: I messed around and got it working. Try adding a "Select-AzureSubscription" before your Set-AzureSubscription command.
Select-AzureSubscription $subscriptionName
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccountName $storageAccount
The documentation for Select-AzureSubscription is here. If you aren't relying on that storage account being set, you may be able to remove the Set-AzureSubscription command.
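Alternatively, as the error message itself suggests, a possible fallback (untested here; the account name and key are placeholders) is to hand the storage credentials to the cmdlets directly via the environment variable they look for, at the top of the script:
# Placeholder connection string; lets the storage cmdlets work without a subscription context
$env:AZURE_STORAGE_CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>"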
I was never able to make the PowerShell script work. I assume I could have made it work if I had set the credentials in the environment variable, as the error said, but I instead wrote a little program to do the work for me.
Visit https://github.com/sillyotter/BackupDBToAzure if you need a tool to back up things to Azure blobs and delete old leftover backups.
Thanks for the help!