I am using the Microsoft Azure Fluent API with the RunPowerShellScript method to execute a PowerShell script.
The file share mounts, but the mapped drive shows as disconnected. When I run the same PowerShell command from inside the virtual machine, it works properly. Please see the image below:
Related
I'm using an Azure DevOps release pipeline and planning to map an Azure file share on all VMs in my deployment groups. I have the script from the portal to map the file share locally; I replace the access key value in that script with a pipeline variable.
I checked and the value is being passed. The output reports the share as mapped with the drive letter, but when I open File Explorer the drive shows as disconnected, and trying to eject the share path gives "This network connection does not exist". So it is not actually mapped. If there were a problem reading the key from the variable, I would expect the script to throw an error rather than report that the drive was created.
Looking for help: is there a mistake somewhere in the pipeline, or in the script?
Note: locally I can run the script successfully and the drive is mapped successfully as well.
Script:
cmd.exe /C "cmdkey /add:`"storageaccount.file.core.windows.net`" /user:`"localhost\storageaccount`" /pass:`"accesskeyforstorageaccount`""
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\storageaccount.file.core.windows.net\fileshare-name" -Persist
Make sure the drive letter Z is not already taken on the deployment group VM.
When you say you can run the script successfully locally, do you mean you ran it on the deployment group VM? If not, try running the script there to check whether it succeeds, and check the network and port.
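For example, a quick connectivity check from the deployment group VM might look like this (assuming the standard SMB port 445 used by Azure Files):
# Verify the VM can reach the Azure Files endpoint over SMB (port 445)
Test-NetConnection -ComputerName storageaccount.file.core.windows.net -Port 445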
You can check it against a working code sample, and double-check that the password is set correctly if you are using a secret variable in the pipeline.
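As a rough sketch (the variable name StorageKey and the way it is passed to the task are assumptions, not taken from your pipeline), the mapping script can take the key as a mandatory parameter so a missing secret fails loudly instead of silently creating a broken mapping:
# map-share.ps1 -- hypothetical wrapper; invoke it with the secret as an argument, e.g. -StorageKey $(StorageKey)
param(
    [Parameter(Mandatory = $true)]
    [string]$StorageKey   # secret pipeline variables are not exposed as environment variables; pass them explicitly
)

cmd.exe /C "cmdkey /add:`"storageaccount.file.core.windows.net`" /user:`"localhost\storageaccount`" /pass:`"$StorageKey`""
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\storageaccount.file.core.windows.net\fileshare-name" -Persist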
If the issue persists, please share more information, such as a screenshot or log.
I'm using PowerShell to write a folder synchronization tool to copy files from a local folder up to AWS S3 with the AWS CLI.
The script works, as I can see files show up in S3, but the output of the aws sync command does not appear on screen (normally, when aws s3 sync is run from the command line, it shows each file as it uploads, the current status/count of all files, etc.).
How do I get that to happen inside a Powershell script?
Here are some of the things I've tried, none of which worked:
aws s3 sync $local_folder $aws_bucket
$awsio = aws s3 sync $local_folder $aws_bucket
#Out-Host -InputObject $awsio
Write-Output $awsio
It turns out the answer was the first thing I tried, which was just the normal command on its own line:
aws s3 sync $local_folder $aws_bucket
I think what happened is that when I first tried it, the CLI was doing something in the background before actually starting the upload, so if I had waited longer I would have seen output appear on screen as I expected...
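If you also want to keep a copy of the output in a variable while it is displayed, a minimal sketch (the variable name is a placeholder) is:
# Show the sync output on screen and capture a copy in $syncOutput (stderr merged so progress lines are included)
aws s3 sync $local_folder $aws_bucket 2>&1 | Tee-Object -Variable syncOutput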
When I start PowerShell from within the Azure Portal, I can readily navigate to my storage accounts and files.
When running PowerShell from my laptop and logging into Azure using "Login-AzureRmAccount", I cannot do the same thing. My prompt is always "PS C:\>", so any dir command is executed on my laptop rather than actually "inside" Azure.
What am I doing wrong and how can I navigate the Azure file system?
You'll need to create a PS drive from a provider. SHiPS was created to do exactly this; take a look:
https://blogs.msdn.microsoft.com/powershell/2017/10/19/navigate-azure-resources-just-like-a-file-system/
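A rough sketch of what that looks like, assuming the SHiPS-based AzurePSDrive module described in the linked post (check the post for the exact module and root names):
# Mount Azure resources as a navigable drive via the SHiPS provider (names per the linked post)
Install-Module AzurePSDrive -Scope CurrentUser
Import-Module AzurePSDrive
New-PSDrive -Name Azure -PSProvider SHiPS -Root 'AzurePSDrive#Azure'
cd Azure:
dir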
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server. How can this be done? Can this be done via the blob file URL? Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
A related question about this same download: is there a way to copy just the file, instead of having the subdirectories copied as well? For example, I have a blob like dataload/files/my_blob_file. When I use Get-AzureStorageBlobContent -Destination $destination_path, it saves the file in the same subfolder structure. Can I instead use a custom path, or drop the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file? I would want to accomplish this on the FTP server above.
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows based, you can simply run the script on the FTP server itself and download the blob to a local path on that server.
Can this be done via the blob file URL?
The command "Get-AzureStorageBlobContent" doesn't accept URL as parameter. That means you need to write the code or script to achieve that. Here is a simple demo written by me:
# Split the URL into its parts. Note: in Windows PowerShell the separator string
# is treated as the set of characters ':' and '/', which is what makes this work.
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
$separator = "://"
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$temp = $url.Split($separator, 4, $option)
$Protocol  = $temp[0]   # https
$HostName  = $temp[1]   # accountname.blob.core.windows.net
$Container = $temp[2]   # testcontainer
$BlobName  = $temp[3]   # Test01.txt
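The parsed values can then be passed to Get-AzureStorageBlobContent; a minimal sketch, where the account name and key are placeholders you would substitute:
# Build a storage context and download the blob identified by the parsed URL
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "<account-key>"
Get-AzureStorageBlobContent -Container $Container -Blob $BlobName -Destination "C:\myfolder\" -Context $ctx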
Or can I somehow have the file created in Powershell memory and then use $webclient.UploadFile to save it to FTP folder?
Storing the file in RAM is not a good idea, even if we could achieve it. As I mentioned above, if your FTP server is Windows based, please run the script on the FTP server directly.
If the script can't be run on the server for any reason, try sharing the folder used by the FTP service and mapping it as a network drive on the computer that will run the script, so that you can write the file to that network drive.
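For example (the share path, drive letter, and file name are placeholders for illustration):
# Map the shared FTP root folder as a drive and copy the downloaded blob into it
New-PSDrive -Name F -PSProvider FileSystem -Root "\\ftpserver\ftproot" -Credential (Get-Credential) -Persist
Copy-Item "C:\myfolder\my_blob_file" -Destination "F:\"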
but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file?
Of course, just specify the path and file name as the Destination parameter:
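A minimal sketch, assuming a placeholder container name, the blob name including its virtual folders, and the $ctx storage context from the demo above:
# Download straight to a custom local path; the folder structure from the blob name is not recreated
Get-AzureStorageBlobContent -Container "containername" -Blob "dataload/files/my_blob_file" -Destination "C:\myfolder\my_blob_file" -Context $ctx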
Note: there is actually no concept of a "folder" in Azure Storage; the path is part of the blob name. When you download the blob, you can rename it by specifying a file name in the destination path, so the extra folders are not created locally.
=========================================================================
Update:
This script is to be run from Azure as part of Azure Automation. But when I try to call the FTP server (which is currently my local machine), I get an "Unable to connect to remote server" error.
You may need Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default port 21, and I also tried using my public IP address
Exposing your FTP server to the Internet is not recommended. I would suggest using Hybrid Runbook Worker.
Also, how can I get the content of my blob file into a Powershell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent does not support returning the blob content as an in-memory object. You need to download the content first and then use Get-Content to read the file. If you use a Hybrid Runbook Worker, you'll be able to store the file locally.
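A minimal sketch of that download-then-read pattern (container, blob, and paths are placeholders, and $ctx is a storage context as shown earlier):
# Download the blob to a local path, then read its content into a variable
Get-AzureStorageBlobContent -Container "containername" -Blob "my_blob_file" -Destination "C:\temp\my_blob_file" -Context $ctx -Force
$content = Get-Content -Path "C:\temp\my_blob_file" -Raw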
==============================================================================
Update 2:
I am trying to understand how to call an external FTP server (currently on my machine for dev/test purposes, but it may reside on any other external server in production) from a PowerShell script in Azure Automation. So your reply "You may need Hybrid Runbook Worker to achieve your goal" will not work for me, right?
The Hybrid Runbook Worker does work for you, and it makes things easier: with a Hybrid Runbook Worker, the runbook runs on your local machine.
I'm able to download the blobs into my local machine and upload them to the public FTP server without any issue.
Are you saying that currently there is no way to upload and download files from external FTP server from Azure Powershell Automation?
I didn't manage to upload the blob to the public FTP server. An exception occurs when I try to upload the blob, and only empty files with the blob's name end up on the FTP server. It might be a permission issue, since the PowerShell script runs in a sandbox. That's why I said a Hybrid Runbook Worker makes things easier.
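If you want to retry the $webclient.UploadFile route mentioned in the question, for instance from a Hybrid Runbook Worker, a minimal sketch looks like this (server address, credentials, and paths are placeholders):
# Upload a locally downloaded blob to an FTP server with WebClient
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppassword")
$webclient.UploadFile("ftp://ftp.example.com/my_blob_file", "C:\temp\my_blob_file")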
In the end, please note: FTP authenticates users and transfers data in plaintext, which may cause security issues. FTPS and SFTP are more secure than FTP.
I have added a PowerShell script as a Group Policy computer startup script. The script runs fine and does all of the tasks fine. However, at the end of the script, it is supposed to copy a log file to a file share, which it is not doing. The file share shows that "SYSTEM" has full control, so I'm not sure what the issue is. I'm able to run the script as admin while on the same machine and it will copy the log to the server without a problem. It does not do this via computer startup script (under SYSTEM account) though. Any ideas?
You will need to give the computer account write permissions on the network share. When the SYSTEM account accesses a network resource, it does so as the domain account of the computer (DOMAIN\COMPUTER$).
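On the file server, that grant might look something like this (share name, folder path, and domain/computer names are placeholders):
# Grant the computer account change/modify rights on both the SMB share and the underlying NTFS folder
Grant-SmbShareAccess -Name "Logs" -AccountName 'DOMAIN\COMPUTER$' -AccessRight Change -Force
icacls "D:\Logs" /grant 'DOMAIN\COMPUTER$:(OI)(CI)M'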