I am writing a PowerShell script to communicate with my Microsoft Azure account, and I am facing an issue with uploading and downloading files using the Set-AzureStorageBlobContent and Get-AzureStorageBlobContent cmdlets.
$Upload = @{
    Context = $storageContext;
    Container = $container;
    File = "C:\... \file.csv";
}
Set-AzureStorageBlobContent @Upload -Force;
The first time I run this, it appears to start uploading, but it stays at 0% and never rises. If I cancel it and try executing my script again, I get an error that says:
"Set-AzureStorageBlobContent : A transfer operation with the same source and destination already exists."
A nearly identical thing happens when I try to download an existing blob from Azure.
$params = @{
    Context = $storageContext;
    Container = $container;
    Blob = $blob;
    Destination = $dest
}
New-Item -Path $dest -ItemType Directory -Force
Get-AzureStorageBlobContent @params
I have tried reinstalling Azure.Storage, which is the module that contains the Get-AzureStorageBlobContent and Set-AzureStorageBlobContent cmdlets, but I had no luck. Am I doing something incorrectly, or is this code just wrong somehow?
The problem seems to be with your $Upload variable. Try this instead:
Get-ChildItem -Path C:\Images\* | Set-AzureStorageBlobContent -Container "yourcontainername"
More info: https://learn.microsoft.com/en-us/azure/storage/storage-powershell-guide-full
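A slightly fuller sketch of the same pipeline approach, assuming the $storageContext from the question (the container name and image path are placeholders):
Get-ChildItem -Path C:\Images\* -File |
    Set-AzureStorageBlobContent -Container "yourcontainername" -Context $storageContext -Force
Piping the FileInfo objects lets the cmdlet use each file's name as the blob name, and -Force skips the overwrite prompt if a blob with the same name already exists.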
I want to copy all the files on my server (an Azure VM) to a container (Azure Blob Storage). Is this possible through PowerShell?
I'm new to PowerShell, so please help me out. If you have any script for this, please share it.
First, make sure you have installed the Az PowerShell module on your VM and on the machine where you want to run the command. In my sample, I run the command from my PC.
Store the script below on the PC; I use C:\Users\joyw\Desktop\script.ps1. The script will upload all the files in the folder C:\Users\xxxx\Desktop\test on your VM; you can change it to whatever path you want.
script.ps1:
# Build a storage context from the account name and key
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
# Collect the full path of every file under the folder
$files = (Get-ChildItem "C:\Users\xxxx\Desktop\test" -Recurse).FullName
foreach($file in $files){
    # Use the full path as the blob name to preserve the folder structure
    Set-AzStorageBlobContent -Context $context -File "$file" -Container "<container name>" -Blob "$file"
}
Then run the command:
Connect-AzAccount
Invoke-AzVMRunCommand -ResourceGroupName 'rgname' -Name 'vmname' -CommandId 'RunPowerShellScript' -ScriptPath 'C:\Users\joyw\Desktop\script.ps1'
If you want to log in in a non-interactive way, see this link.
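For example, a minimal non-interactive sketch using a service principal (the application ID, client secret, and tenant ID are placeholders for values from your own Azure AD app registration):
$securePassword = ConvertTo-SecureString "<client secret>" -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential("<application id>", $securePassword)
Connect-AzAccount -ServicePrincipal -Credential $credential -Tenant "<tenant id>"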
Change the parameters to what you want. After the files are uploaded to the container, they will appear with the original file structure.
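If you would rather have blob names relative to the folder instead of the full drive path, a variant of the loop above (same $context and placeholder paths, untested sketch) can trim the prefix:
$root = "C:\Users\xxxx\Desktop\test"
foreach($file in (Get-ChildItem $root -Recurse -File)){
    # Blob name relative to $root, e.g. "subfolder\file.txt"
    $blobName = $file.FullName.Substring($root.Length + 1)
    Set-AzStorageBlobContent -Context $context -File $file.FullName -Container "<container name>" -Blob $blobName
}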
I need to set the SASS_BINARY_PATH environment variable to the local file I've downloaded, to be able to install node-sass behind a corporate firewall. So on Windows cmd, I just do:
SET SASS_BINARY_PATH=C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node
And the installation works fine, since it successfully sets the variable. But when I try doing it via PowerShell, it doesn't work:
$env:SASS_BINARY_PATH="C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node"
I've also tried another way in PowerShell:
[Environment]::SetEnvironmentVariable("SASS_BINARY_PATH", "C:\Source\Repos\SRT\Srt.Web\sass-binary\v4.7.2\win32-x64-48_binding.node", "Machine")
Upon checking in Control Panel, it successfully added a "SASS_BINARY_PATH" system variable. But upon trying to reinstall node-sass, it fails again.
One observation: when I set it the Windows cmd way and then check with the set command, the variable shows up along with the others. But with both PowerShell methods, it does not show up. Any ideas on this?
The error encountered when trying to npm install node-sass behind a corporate firewall is:
Downloading binary from
https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node
Cannot download "https://github.com/sass/node-sass/releases/download/v4.7.2/win32-x64-48_binding.node":
HTTP error 401 Unauthorized
Download win32-x64-48_binding.node manually.
Put it in the C:\Users\<user>\AppData\Roaming\npm-cache\node-sass\4.7.2 folder.
Then try to run npm install node-sass again.
Here is the PowerShell script @jengfad used based on the above solution, as noted in the discussion comments:
$cacheSassPath = $env:APPDATA + '\npm-cache\node-sass'
if( -Not (Test-Path -Path $cacheSassPath) )
{
    Write-Host "cacheSassPath does not exist"
    New-Item -ItemType directory -Path $cacheSassPath
    Write-Host "cacheSassPath CREATED"
}
<# Ensure the cache folder has no content #>
Get-ChildItem -Path $cacheSassPath -Recurse | ForEach-Object { Remove-Item -Recurse -Path $_.FullName }
<# Copy local sass binary (~Srt.Web\sass-binary\4.7.2) folder to the cache folder #>
$sassBinaryPath = Split-Path -Parent $MyInvocation.MyCommand.Definition
$sassBinaryPath = $sassBinaryPath + "\sass-binary\4.7.2"
Copy-Item -Path $sassBinaryPath -Recurse -Destination $cacheSassPath -Container
Write-Host "node-sass binary file successfully copied!"
Trying to delete local backup files after they have been uploaded to Azure Storage gets the following error:
Get-ChildItem : Cannot find path
'C:\Windows\system32\Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob'
because it does not exist.
When running the following code:
$BackupDir= 'C:\BackupDir'
$AzureDir= Get-AzureStorageBlob -Context $context -Container $containername -blob $file.name
Get-ChildItem $AzureDir | ForEach-Object
{
$FileInBackup= $AzureDir + $_.Name
If (Test-Path $FileInBackup) {Remove-Item $FileInBackup}
}
Why is it looking in C:\Windows*blahblah*?
If I print variable $AzureDir to screen, I see all my blobs.
Basically, it's probably obvious, but what I want to do is check each file in my backup dir and, if it exists in Azure, delete it; if not, continue on to the upload step. I can share the rest of my code if need be.
RESOLUTION UPDATE:
Thanks to @OmegaMan, who pointed me down the right path, I was able to fix my issue. Here is what I'm now using. It's cycling through 4 blobs correctly and using the results correctly:
$BackupDir = 'C:\BackupDir'
$AzureFiles = Get-AzureStorageBlob -Context $context -Container $containername -Blob *.name
foreach ($AzureFile in $AzureFiles)
{
    $FileInBackup = $AzureFile.Name
    If (Test-Path "$BackupDir\$FileInBackup")
    {
        Remove-Item "$BackupDir\$FileInBackup"
    }
}
You seem to use $AzureDir in one instance to hold all the blob info, which is fine, but then the line $FileInBackup = $AzureDir + $_.Name treats $AzureDir as if it were a literal directory name.
It appears you need to use the actual base directory in those instances instead of $AzureDir.
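To make that concrete, here is a sketch of the comparison along the lines the answer suggests, assuming the $context, $containername, and $BackupDir from the question, and flat blob names:
$blobNames = (Get-AzureStorageBlob -Context $context -Container $containername).Name
Get-ChildItem $BackupDir -File | ForEach-Object {
    # The blob exists, so the local copy has been uploaded and is safe to delete
    if ($blobNames -contains $_.Name) {
        Remove-Item $_.FullName
    }
}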
I am trying to create an Azure Automation job to create a new Azure Automation Runbook. I am using the following to try to get it to work.
$Context = New-AzureStorageContext $storageAccountName $storageAccountKey
$Path = Get-AzureStorageFile -ShareName "qdrive" -Path "TestWorkFlow.ps1" -Context $Context |Select-object Name |Out-String
Import-AzureRMAutomationRunbook -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName -Path $Path -Type PowerShellWorkflow -Force -Name $Name -Published
I get an error message of:
Import-AzureRMAutomationRunbook : Cannot find path 'C:\Windows\System32\
Name
------
TestWorkFlow.ps1
I need help figuring out how to get the path of the file into the $Path variable as a UNC path and not a URI.
Thanks!
The cmdlet needs a fully qualified path to the runbook .ps1 file, and the local machine must have access to that path via normal local file system referencing. In this case $Path contains "Name ------ TestWorkFlow.ps1" (the formatted table output of Select-Object piped through Out-String), so you are not storing the path in $Path correctly, hence the failure.
The $Path variable for the -Path switch needs to contain the full path, including the filename itself, e.g. "C:\Users\Johndoe\TestWorkFlow.ps1". Hope this helps.
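One way to get such a path is to download the file from the share to a local location first and import from there; this sketch assumes the same $Context, $ResourceGroupName, $AutomationAccountName, and $Name as in the question:
$LocalPath = Join-Path $env:TEMP "TestWorkFlow.ps1"
Get-AzureStorageFileContent -ShareName "qdrive" -Path "TestWorkFlow.ps1" -Destination $LocalPath -Context $Context -Force
Import-AzureRMAutomationRunbook -ResourceGroupName $ResourceGroupName -AutomationAccountName $AutomationAccountName -Path $LocalPath -Type PowerShellWorkflow -Force -Name $Name -Published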
I have a Powershell script that is stored here:
https://gitlab.example.example.co.uk/example/example/raw/master/shrink-diskpart.ps1
I would like to run this on many servers through a scheduled task, pulling from this GitLab, so I can make a single change to the script and the most up-to-date version will run on all the servers.
Can anyone advise if this is possible and, if it is, how it can be done?
Thanks
You can use Invoke-WebRequest with -OutFile to download a file, and then just execute it. You might store the file on the web server as a .txt so that you don't have to add a MIME type.
$scriptUrl = "http://localhost/test.txt"
$destination = "c:\temp\test.txt"
$scriptName = "c:\temp\test.ps1"
Invoke-WebRequest $scriptUrl -OutFile $destination
# if the file was downloaded, delete the old script file and rename the new
# file
if(test-path $destination){
remove-item $scriptName
Rename-Item $destination $scriptName
}
&$scriptName
Props to http://www.powershellatoms.com/basic/download-file-website-powershell/
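To run this on a schedule, one possible registration with the built-in ScheduledTasks cmdlets (the task name, the schedule, and the assumption that you saved the downloader above as c:\temp\get-latest.ps1 are all illustrative):
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-ExecutionPolicy Bypass -File c:\temp\get-latest.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'RunLatestScript' -Action $action -Trigger $trigger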