How to download a blob file from Azure Storage and save it to an FTP server using PowerShell? - powershell

I am trying to access a blob file in PowerShell and want to save it directly to an FTP server. How can this be done? Can this be done via the blob file URL? Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
A related question about this same download: is there a way to copy just the file instead of copying its subdirectories as well? For example, I have a blob file like dataload/files/my_blob_file. When I use the command Get-AzureStorageBlobContent -Destination $destination_path it saves the file in the same subfolder structure, but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file? I want to accomplish this on the FTP server mentioned above.

I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows-based, then you can just run the script on the FTP server itself and download the blob to a local path on that server.
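For example, a minimal sketch run directly on the FTP server (the account name, key, container, blob name, and local FTP root below are placeholders):
# Hypothetical storage account and FTP root folder; replace with your own values
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "<storage-account-key>"
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" -Destination "C:\ftproot\" -Context $ctx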
Can this be done via the blob file URL?
The command "Get-AzureStorageBlobContent" doesn't accept URL as parameter. That means you need to write the code or script to achieve that. Here is a simple demo written by me:
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
$separator = "://"
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$temp = $url.Split($separator,4,$option)
$Protocol = $temp[0]
$HostName = $temp[1]
$Container = $temp[2]
$BlobName = $temp[3]
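With the sample URL above, $Container ends up as testcontainer and $BlobName as Test01.txt, which are exactly the values you can pass to Get-AzureStorageBlobContent as -Container and -Blob.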
Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
Storing the file in RAM is not a good idea even if it can be done. As I mentioned above, if your FTP server is Windows-based, run the script on the FTP server directly.
If the script can't be run on that server for any reason, share the folder used by the FTP service and map it as a network drive on the computer that will run the script. You can then store the file on that network drive.
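A rough sketch of that approach (the share path and drive name are hypothetical, and $ctx is the storage context created as in the earlier sketch):
# Map the folder shared by the FTP service as a network drive, then download into it
New-PSDrive -Name "F" -PSProvider FileSystem -Root "\\ftpserver\ftproot" -Credential (Get-Credential)
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" -Destination "F:\" -Context $ctx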
but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file?
Of course, just specify the path and file name as the Destination parameter:
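For example, a sketch reusing the storage context and names assumed earlier:
# Download dataload/files/my_blob_file directly to C:\myfolder\my_blob_file (no subfolders created)
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" -Destination "C:\myfolder\my_blob_file" -Context $ctx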
Note: there is actually no concept of a "folder" in Azure Storage; the path is part of the blob name. When you download the blob, you can rename it by specifying a file name in the destination path, so the additional folders will not be created locally.
=========================================================================
Update:
This script is to be run from Azure as part of Azure Automation. But when I try to call the FTP server (which is currently my local machine) I get an "Unable to connect to remote server" error.
You may need Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default port 21 and I also tried using my public IP address.
Exposing your FTP server to the Internet is not recommended. I would suggest using Hybrid Runbook Worker.
Also, how can I get the content of my blob file into a Powershell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent does not support returning the content as an in-memory object. You need to download the blob first, then use Get-Content to read the file's content. If you use the Hybrid Runbook Worker, you'll be able to store the file locally.
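A short sketch of that two-step approach (paths, names, and $ctx as assumed in the earlier examples):
# Download the blob to disk first, then read it into a variable
Get-AzureStorageBlobContent -Container "testcontainer" -Blob "dataload/files/my_blob_file" -Destination "C:\myfolder\my_blob_file" -Context $ctx
$content = Get-Content -Path "C:\myfolder\my_blob_file" -Raw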
==============================================================================
Update 2:
I am trying to understand how to call an external FTP server (which is currently on my machine for dev/test purposes, but may reside on any other external server in production), and I have to run it from a PowerShell script in Azure Automation. So your reply "You may need Hybrid Runbook Worker to achieve your goal" will not work for me, right?
The Hybrid Runbook Worker does work for you, and it makes things easier: with a Hybrid Runbook Worker, the runbook runs on your local machine.
I'm able to download the blobs to my local machine and upload them to the public FTP server without any issue.
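For reference, the upload side can use WebClient with FTP credentials; a rough sketch (the FTP host, credentials, and paths are placeholders):
# Hypothetical FTP endpoint and account
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential("ftpuser", "ftppassword")
$localFile = "C:\myfolder\my_blob_file"
$ftpUri = "ftp://ftp.example.com/upload/my_blob_file"
$webclient.UploadFile($ftpUri, $localFile)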
Are you saying that currently there is no way to upload and download files from external FTP server from Azure Powershell Automation?
From the Azure Automation sandbox, however, I didn't successfully upload the blob to the public FTP server. An exception occurs when I try to upload the blob, and only empty files with the name of the blob end up on the FTP server. It might be a permission issue, since the PowerShell script is running in a sandbox. That's the reason why I said the Hybrid Runbook Worker makes things easier.
In the end, please note: FTP authenticates users and transfers data in plaintext, which may cause security issues. FTPS and SFTP are more secure than FTP.

Related

Azure Remote directory navigation

When I start PowerShell from within the Azure Portal I can readily navigate to my Storage Accounts and files.
When running PowerShell from my laptop, and logging into Azure using "Login-AzureRmAccount", I cannot do the same thing. My prompt is always "PS C:\>", so any dir command is executed on my laptop rather than actually "inside" Azure.
What am I doing wrong and how can I navigate the Azure file system?
You'll need to create a PS drive from a provider. SHiPS was created to do exactly this; take a look:
https://blogs.msdn.microsoft.com/powershell/2017/10/19/navigate-azure-resources-just-like-a-file-system/
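A rough sketch of what that looks like with the AzurePSDrive module (built on SHiPS); the module, provider, and root names below are taken from that post, so treat them as assumptions and check the post for the current syntax:
# Assumes you have already signed in with Login-AzureRmAccount
Install-Module AzurePSDrive -Scope CurrentUser
Import-Module AzurePSDrive
New-PSDrive -Name Azure -PSProvider SHiPS -Root 'AzurePSDrive#Azure'
cd Azure:
dir   # now lists your subscriptions and resources like a file system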

How can I access files without mapping a drive through net use?

I want to access files from another domain, so I'm using this command:
net use n: \\10.0.0.1\share /user:domain\user password
but then I have to map this drive to N: and use it.
Is there any way I can use the files directly, without mapping a drive?
Suppose I want to use Get-Content to open a .txt file.
I'm new to this, so please show me the way.
In the past, I've had success accessing resources on a non-domain machine by using this variant of the NET command:
net use \\nodename /user:.\username password
The above is for a machine-local account; replace the . with the domain name for a domain account.
It establishes credentials to access the machine without mapping a drive. After this, accessing files through PowerShell cmdlets using UNC naming should work.
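For example, using the server and share from the question (the file name is a placeholder):
# Establish credentials once, then read the file over the UNC path
net use \\10.0.0.1 /user:domain\user password
Get-Content -Path "\\10.0.0.1\share\notes.txt"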

How to give File access to Tableau server?

I have a .twb file created using an Excel data source that is located on a network file path.
On publishing to the server I get this error.
How to resolve this?
I need to provide some access for the "Run As user" on my server machine; how can I do that?
Ensure that when the workbook is created, a connection to the Excel file is created using the full UNC path
Ensure that the Tableau Server Run As user is able to access that file using the UNC path specified in the workbook. An easy way to test this would be to open Windows Explorer as the Run As user, then paste in the UNC path. If the Excel file opens, then you're good to go.
If you still have issues, test access to the file while logged into the Tableau server to make sure there is no firewall or port blocking access from that host. Often Excel isn't installed on servers for security reasons, so the test is not whether Excel opens the file, but whether you can view the contents from the server as the Run As User, even just using the type command at the console.
If you can't get your network access fixed, an alternative is to build and publish extracts to the server. There are multiple ways to accomplish that.
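As a quick way to run the access test described above, you could open a session as the Run As account and try to read the file; the account name and share path here are placeholders:
# Start a PowerShell session as the Run As account (you will be prompted for its password)
runas /user:DOMAIN\tableau_runas powershell
# Then, in the new window, check that the file is readable over the UNC path
Get-Content "\\fileserver\share\workbook_data.xlsx"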

Copy file - Access denied

I have a batch file where I copy a file from a different server. I have no issues running it on the server itself. But when I try to run it from a web application, the file does not copy; I keep getting an access denied error. I have tried xcopy, copy, and robocopy.
I have provided full access to the source and destination folders for all users.
No luck. I keep getting the same error:
copy /y \\N01APW280\d$\Oracle\Middleware\user_projects\epmsystem1\diagnostics\logs\essbase\essbase_0\app\PLPLAN\PLPLAN.LOG D:\Hyperion\ERPI_Actuals_Load\Logs\
It is rather awkward to use a command-line utility such as copy from a web app; you should use the programmatic file APIs available within your web application instead.
Aside from that, your main issue is that web apps are typically executed with very limited privileges, using local machine accounts that have no way of accessing administrative shares on remote machines such as \\N01APW280\d$. Another possible issue is that the local account used by the web app cannot write to the D:\Hyperion\ERPI_Actuals_Load\Logs\ folder. And finally, your app may not even have enough privileges to start an external process such as copy.exe.
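If the copy does have to run as a script, a sketch along these lines makes the credential problem explicit (the paths are the ones from the question; the account is a hypothetical service account with rights on N01APW280):
# Map the administrative share with explicit credentials, then copy the log file
$cred = Get-Credential
New-PSDrive -Name Src -PSProvider FileSystem -Root "\\N01APW280\d$\Oracle\Middleware\user_projects\epmsystem1\diagnostics\logs\essbase\essbase_0\app\PLPLAN" -Credential $cred
Copy-Item -Path "Src:\PLPLAN.LOG" -Destination "D:\Hyperion\ERPI_Actuals_Load\Logs\" -Force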

powershell - how to access shared folders via SFTP

I have a VPN tunnel between two computers. When I connect to the second computer with WinSCP (an SFTP client), I can see the shared folders. The problem is that I need to copy files from the first computer to the second with a PowerShell script. Is it possible to access the shared folders via SFTP with PowerShell? Thank you.
Since you want to use PowerShell to manage the process, use pscp.exe from the developer of PuTTY. That way you can do command-line file transfers and don't have to worry about GUI tools.
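A minimal sketch of calling it from PowerShell (the install path, host, account, and file paths are placeholders):
# -sftp forces the SFTP protocol; -pw supplies the password (omit it to be prompted interactively)
& "C:\Program Files\PuTTY\pscp.exe" -sftp -pw "password" "C:\data\file.txt" "user@10.0.0.2:/shared/folder/"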