How to access shared folders via SFTP with PowerShell

I have a VPN tunnel between two computers. When I connect to the second computer with WinSCP (an SFTP client), I can see its shared folders. The problem is that I need to copy files from the first computer to the second with a PowerShell script. Is it possible to access the shared folders via SFTP with PowerShell? Thank you.

Since you want to use PowerShell to manage the process, use pscp.exe from the developer of PuTTY. That way you can do command-line file transfers and don't have to worry about the GUI tools.
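For example, a minimal sketch of calling pscp.exe from a PowerShell script; the host name, user name, password and paths below are placeholders, not values from the question:
# Assumes pscp.exe is on the PATH; -sftp forces the SFTP protocol.
# remotehost, user, secret and the paths are placeholders.
& pscp.exe -sftp -pw "secret" "C:\data\report.txt" "user@remotehost:/shared/incoming/"
# Copying in the other direction works the same way:
& pscp.exe -sftp -pw "secret" "user@remotehost:/shared/outgoing/report.txt" "C:\data\"
The exit code of the external command is available in $LASTEXITCODE, so the script can check whether each transfer succeeded.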

Related

Unix permissions needed when running Powershell script

As a final step in our AD account creation process, which is being moved to a PowerShell script, a few folders need to be created on the filer for users, and I am coming unstuck with permissions.
I am just using the basic New-Item command to create the folder, but the locations need Unix permissions (775) set before anything can be created. I can't go there, right-click in Windows Explorer and click New, and the PowerShell script is being bounced for the same permission reasons.
The reasoning from one of the tech guys here is that I am trying to create a subfolder via an SMB mount from Windows using NTFS permissions. There is no correlation to Unix permissions, and none of our Linux users will be able to access or use the location created for them.
Sorry if that is a clumsy way of explaining it; I am not a systems engineer, just the guy trying to translate a whole heap of Perl scripts into a new PowerShell process.
Thank you
S.
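For reference, the kind of New-Item call being described looks roughly like the sketch below; the filer path and account name are placeholders, and this call by itself only applies NTFS/share permissions, not the Unix 775 mode the question asks about.
# Placeholder UNC path and user name.
$userName = "jsmith"
New-Item -ItemType Directory -Path "\\filer01\home\$userName"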

How to download a blob file from Azure Storage and save it to an FTP server using Powershell?

I am trying to access a blob file in PowerShell and want to save it directly to an FTP server. How can this be done? Can this be done via the blob file's URL? Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to an FTP folder?
Another question related to this same download: is there a way to copy just the file instead of getting the subdirectories copied as well? For example, I have a blob named dataload/files/my_blob_file. When I use Get-AzureStorageBlobContent -Destination $destination_path, it saves the file in the same subfolder structure. Can I instead have a custom path, or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file? I would want to accomplish this on the FTP server mentioned above.
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows-based, then you can just run the script on the FTP server itself and download the blob into a local path on that FTP server.
Can this be done via the blob file URL?
The command "Get-AzureStorageBlobContent" doesn't accept URL as parameter. That means you need to write the code or script to achieve that. Here is a simple demo written by me:
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
$separator = "://"
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$temp = $url.Split($separator,4,$option)
$Protocol = $temp[0]
$HostName = $temp[1]
$Container = $temp[2]
$BlobName = $temp[3]
Or can I somehow have the file created in Powershell memory and then use $webclient.UploadFile to save it to FTP folder?
Storing the file in RAM is not a good idea even if it can be done. As I mentioned above, if your FTP server is Windows-based, please run the script on the FTP server directly.
If the script can't be run on the server for any reason, then try sharing the folder used by the FTP service and mapping it as a network drive on the computer that will run the script, so that you can store the file on that network drive.
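For example, a minimal sketch of mapping such a share as a drive; the server name, share name and credentials are placeholders:
# Placeholder share and credentials; map the FTP root folder as drive Z:.
$cred = Get-Credential
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\ftpserver\ftproot" -Credential $cred
Copy-Item -Path "C:\myfolder\my_blob_file" -Destination "Z:\my_blob_file"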
but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file?
Of course, just specify the path and file name as the Destination parameter:
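For example, something along these lines; the container name and the $ctx storage context below are placeholders, not values from the question:
# Download the blob dataload/files/my_blob_file but save it directly as
# C:\myfolder\my_blob_file, without recreating the dataload\files subfolders.
Get-AzureStorageBlobContent -Container "mycontainer" -Blob "dataload/files/my_blob_file" `
    -Destination "C:\myfolder\my_blob_file" -Context $ctx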
Note: there is actually no concept of a "folder" in Azure Storage; the path is part of the blob name. When you download the blob, you can rename it by specifying the file name in the destination path, so the additional folders will not be created locally.
=========================================================================
Update:
This script is to be run from Azure as part of Azure Automation. But when I try to call the FTP server (which is currently my local machine) I get an "Unable to connect to remote server" error.
You may need Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default port 21, and I also tried using my public IP address.
Exposing your FTP server to the Internet is not recommended. I would suggest using Hybrid Runbook Worker.
Also, how can I get the content of my blob file into a Powershell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent does not support returning the blob as an object in memory. You need to download the content first and then use Get-Content to read the file. If you use the Hybrid Runbook Worker, you'll be able to store the file locally.
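For instance, a sketch of that two-step approach; the container, paths and $ctx storage context are placeholders:
# Download the blob to a local path, then read its content into a variable.
Get-AzureStorageBlobContent -Container "mycontainer" -Blob "dataload/files/my_blob_file" `
    -Destination "C:\temp\my_blob_file" -Context $ctx
$content = Get-Content -Path "C:\temp\my_blob_file" -Raw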
==============================================================================
Update 2:
I am trying to understand how to call any external FTP server (which is currently on my machine for dev/test purposes, but may reside on any other external server in production), and I have to run this from a PowerShell script in Azure Automation. So your reply "You may need Hybrid Runbook Worker to achieve your goal" will not work for me, right?
The Hybrid Runbook Worker does work for you, and it makes things easier, because with a Hybrid Runbook Worker the runbook runs on your local machine.
I'm able to download the blobs to my local machine and upload them to the public FTP server without any issue.
Are you saying that currently there is no way to upload files to and download files from an external FTP server from Azure PowerShell Automation?
I didn't successfully upload the blob to the public FTP server, though. An exception occurs when I try to upload the blob, and only empty files with the name of the blob end up on the FTP server. It might be a permission issue, since the PowerShell script is running in a sandbox. That's the reason I said that a Hybrid Runbook Worker makes things easier.
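For reference, the kind of WebClient-based FTP upload the question refers to looks roughly like this; the server address, credentials and paths are placeholders:
# Placeholder FTP URL, credentials and local path; uploads one file over FTP.
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential -ArgumentList "ftpuser", "ftppassword"
$webclient.UploadFile("ftp://ftp.example.com/inbox/my_blob_file", "C:\myfolder\my_blob_file")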
In the end, please note: FTP authenticates users and transfers data in plaintext, which may cause security issues. FTPS and SFTP are more secure than FTP.

Is it possible to transfer files over the network using CIM or WMI without SMB shares? [duplicate]

I need to read and write (update) a file on a remote machine. I am able to find the remote file using WMI (System.Management), but I am not able to read or update it.
Any help would be appreciated.
Thanks
Himanshu
WMI doesn't have any class (or method) to read or write the content of files. You can only retrieve the metadata (FileName, Date, Size) of the files using CIM_DataFile, or do tasks like Copy, Rename, Delete or Compress on them.
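For example, a sketch of reading that metadata from PowerShell; the computer name and file path are placeholders:
# Query file metadata on a remote machine via WMI; the path in the WQL
# filter needs escaped backslashes.
Get-WmiObject -Class CIM_DataFile -ComputerName SERVER01 `
    -Filter "Name = 'C:\\temp\\report.txt'" |
    Select-Object FileName, Extension, FileSize, LastModified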
RRUZ is correct: WMI cannot copy or create files over a network. This is because it would require credential "hopping":
http://msdn.microsoft.com/en-us/library/windows/desktop/aa389288%28v=vs.85%29.aspx
However, a workaround was recently created by Stackoverflow.com user Frank White in C#, and the WMI logic ports directly to VBS. Here's his solution:
WMI remote process to copy file
I ported it to a fully working VBScript:
https://stackoverflow.com/a/11948096/1569434
First check your file access permissions and set the user "Everyone" to Full Control, then try it again.

Can RemoteSigned run scripts created on same domain?

I'm creating and testing some PowerShell scripts to do some basic file copying. I've set my execution policy to RemoteSigned. According to the help, this should allow me to run scripts that were not downloaded from the internet. However, my observations seem to indicate that this will run only scripts created on the local machine.
For instance, if I create a script on my development machine and try to copy it to my server (on the same domain), the script will not run. However, if I open the PowerShell ISE on the server, open my script, copy the code, paste it into a new file window and save it to the server, the script then runs. Further, if I sign the script with a self-signed certificate, it will not run on other computers (per the help).
So this all seems a bit cumbersome: I have to develop my scripts on the machine they are to be run on, or go through the copy/paste routine mentioned above, to get them to run on my server. I just want to know whether I've understood all of this correctly, and whether there is any other way to create a script within the same domain and run it under the RemoteSigned execution policy without paying the fee for a certificate.
This post provides a method for executing the script from a shared folder. Hope this helps you :-)
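For reference, the execution-policy setup the question describes can be inspected and applied like this; RemoteSigned at machine scope is simply the setting mentioned above, not a recommendation:
# Show the effective policy at each scope, then set RemoteSigned machine-wide.
Get-ExecutionPolicy -List
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine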

How do you use nAnt to copy files to a non-domain machine

I have a nAnt script that works perfectly to build and copy a website to another domain machine. However, when I try to copy the website to a machine not on the domain I get security errors.
I know it's because the user that I have set to run nAnt doesn't have permissions on the remote computer.
Is it possible to specify a remote user to authenticate against when trying to copy files to a non-domain computer? There doesn't seem to be any options for this in the official nAnt documentation.
What other options are available?
We've got round this by having an account with the same username and password on all servers that are involved in the copy. However, we do it the other way round: we copy from a machine in a workgroup to a machine in a domain, and it works fine.
e.g. useraccount on workgroup computer:
.\CruiseControl password1
useraccount on domain:
domain\CruiseControl password1
I've stumbled upon the same problem. Ended up using PsExec to call XCOPY. Works fine.
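A hedged sketch of that approach, as it could be invoked from a command prompt or PowerShell on the build machine; the machine names, account, password and paths are all placeholders, and the remote account still needs read access to the source share:
psexec \\webserver01 -u webserver01\deploy -p password1 xcopy \\buildserver\drop\website C:\inetpub\wwwroot\website /E /I /Y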
I answered a question that was similar to this.
One might be able to use the exec task to launch the runas command to copy (or xcopy) the files over to a computer with a different username/password. If this is a non-domain account, you might have to use the local administrator account to authenticate. I'm not 100% on that one.
This should allow you to stay within NAnt. Let me know if this is not sufficient and we can try and figure something else out.