How can I access files without mapping a drive through net use? - powershell

I want to access files in another domain, so I'm using this command:
net use n: \\10.0.0.1\share /user:domain\user password
but this requires me to map the share to N: and then work through that drive letter.
Is there any way to use the files directly, without mapping a drive? Suppose, for example, I want to use Get-Content to open a .txt file.
I'm new to this, so please point me in the right direction.

In the past, I've had success accessing resources on a non-domain machine by using this variant of the NET command:
net use \\nodename /user:.\username password
The above is for a machine-local account; replace the . with the domain name for a domain account.
It establishes credentials to access the machine without mapping a drive. After this, accessing files through PowerShell cmdlets using UNC naming should work.
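For example, with the details from the question (the server, share, and file name below are placeholders):

# establish credentials for the server; no drive letter is consumed
net use \\10.0.0.1\share /user:domain\user password
# UNC paths under that server now work directly with PowerShell cmdlets
Get-Content -Path \\10.0.0.1\share\somefile.txt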

Related

Tableau replacing backslashes with forward slashes in UNC path

I'm trying to create a Tableau workbook connected to a CSV file. The file is on another server, in a shared folder. I can access it in Tableau Desktop, but when I publish the workbook to Tableau Server, it doesn't work: the workbook can't find my file.
I unchecked Include External File and my folder is shared with everyone, so I don't understand why it doesn't work.
My only hint is that I specify my path like
"\\servername\folder\...\"
and Tableau displays:
"The directory is missing or has been moved: //servername/folder/...".
Has anyone seen this issue before? Any solutions?
Note: my file is on a Windows server and Tableau is on a Linux one.
You'll need to mount the network shares to a mount point on your Linux Tableau server, then tell Tableau where to find the mount points and their associated UNC path. See this article by Tableau on how to do this.
In short, Windows lets you connect to different disk drives or servers using the C:\, D:\, A:\, and \\servername notation. On Linux, every drive and network share is given a "mount point", such as /mnt/flashdrive or /mnt/servername, and you have to tell it explicitly what type of connection it is, whereas Windows tries to figure that out for you.
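As a rough sketch (the share, mount point, and credentials are placeholders, and the exact options vary by distribution), mounting a Windows share on Linux looks like this:

sudo mount -t cifs //servername/folder /mnt/servername -o username=user,password=pass

After that, /mnt/servername is the local path that stands in for \\servername\folder.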

How to download a blob file from Azure Storage and save it to an FTP server using Powershell?

I am trying to access a blob file in PowerShell and want to save it directly to an FTP server. How can this be done? Can this be done via the blob file's URL? Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
A related question: is there a way to copy just the file without recreating its subdirectories? For example, I have a blob like dataload/files/my_blob_file. When I use Get-AzureStorageBlobContent -Destination $destination_path, it saves the file in the same subfolder structure. Can I instead give a custom path, dropping the subfolders, so it is saved as c:\myfolder\my_blob_file rather than c:\myfolder\dataload\files\my_blob_file? I would want to do the same on the FTP server above.
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows-based, you can just run the script on the FTP server itself and download the blob to a local path on that server.
Can this be done via the blob file's URL?
The Get-AzureStorageBlobContent cmdlet doesn't accept a URL as a parameter, so you need to parse the URL into its parts yourself. Here is a simple demo:
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
# Split on both "://" and "/"; limiting the split to 4 parts keeps any
# remaining "/" inside the blob name (e.g. dataload/files/my_blob_file) intact
$separator = [string[]]@("://", "/")
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$temp = $url.Split($separator, 4, $option)
$Protocol  = $temp[0]   # https
$HostName  = $temp[1]   # accountname.blob.core.windows.net
$Container = $temp[2]   # testcontainer
$BlobName  = $temp[3]   # Test01.txt
Or can I somehow have the file created in PowerShell memory and then use $webclient.UploadFile to save it to the FTP folder?
Storing the file in RAM is not a good idea even if it can be done. As mentioned above, if your FTP server is Windows-based, please run the script on the FTP server directly.
If the script can't be run on the server for some reason, try sharing the folder used by the FTP service and mapping it as a network drive on the computer that will run the script, so that you can store the file onto that network drive.
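A minimal sketch of that mapping in PowerShell (the server and share names are placeholders):

# map the FTP server's shared folder as a persistent F: drive
New-PSDrive -Name "F" -PSProvider FileSystem -Root "\\ftpserver\ftproot" -Credential (Get-Credential) -Persist

The Destination parameter of the download can then point at F:\.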
Can I instead give a custom path, dropping the subfolders, so it is saved as c:\myfolder\my_blob_file rather than c:\myfolder\dataload\files\my_blob_file?
Of course, just specify the path and file name as the Destination parameter:
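For example (a sketch; the account, key, container, and blob names are placeholders for your own values):

# download the blob straight to a flat local path, renaming it on the way
$ctx = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey $key
Get-AzureStorageBlobContent -Container "mycontainer" -Blob "dataload/files/my_blob_file" -Destination "C:\myfolder\my_blob_file" -Context $ctx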
Note: there is actually no concept of a "folder" in Azure Storage; the path is part of the blob name. When you download the blob, you can rename it by specifying a file name in the destination path, so the additional folders are not created locally.
=========================================================================
Update:
This script is to be run from Azure as part of Azure Automation. But when I try to reach the FTP server (which is currently my local machine), I get an "Unable to connect to remote server" error.
You may need Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default port 21, and I also tried using my public IP address.
Exposing your FTP server to the Internet is not recommended. I would suggest using Hybrid Runbook Worker.
Also, how can I get the content of my blob file into a PowerShell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent does not support returning the content as an in-memory object. You need to download the blob first, then use Get-Content to read the file. If you use the Hybrid Runbook Worker, you'll be able to store the file locally.
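For example (a sketch, assuming the download shown earlier has already run; the path is a placeholder):

# read the downloaded blob into a PowerShell variable
$content = Get-Content -Path "C:\myfolder\my_blob_file" -Raw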
==============================================================================
Update 2:
I am trying to understand how to call an external FTP server (currently on my machine for dev/test purposes, but possibly on some other external server in production) from a PowerShell script in Azure Automation. So your reply, "You may need Hybrid Runbook Worker to achieve your goal", will not work for me, right?
The Hybrid Runbook Worker does work for you, and it makes things easier, because with a Hybrid Runbook Worker the runbook runs on your local machine.
I'm able to download the blobs to my local machine and upload them to a public FTP server without any issue.
Are you saying that currently there is no way to upload and download files from an external FTP server from Azure PowerShell Automation?
I didn't manage to upload the blob to the public FTP server from Azure Automation. An exception occurs when I try, and only empty files with the blob's name arrive on the FTP server. It might be a permission issue, since the PowerShell script runs in a sandbox. That's why I said the Hybrid Runbook Worker makes things easier.
Finally, please note: FTP authenticates users and transfers data in plaintext, which can cause security issues. FTPS and SFTP are more secure than FTP.

How to fetch files from a local Windows share in Talend

How can I fetch a list of files from a local Windows share that requires credentials, using Talend? Is there anywhere I can specify a username/password to authenticate to that share location?
I don't know if it's an option for you, because you didn't provide much context.
But you could use the tSystem component to execute a command like this:
net use x: \\share\some /user:username password
Then use the tFileList component in order to retrieve the files from that directory.
Alternatively, you can map the shared drive on the machine where the Talend job will run, then use that mapped drive location to get the files.

powershell - how to access shared folders via SFTP

I have a VPN tunnel between two computers. When I connect to the second computer with WinSCP (an SFTP client), I can see the shared folders. The problem is that I need to copy files from the first computer to the second with a PowerShell script. Is it possible to access the shared folders via SFTP with PowerShell? Thank you.
Since you want to use PowerShell to manage the process, use pscp.exe from the developer of PuTTY. That way you can do command-line file transfers and don't have to deal with the GUI tools.
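A sketch of such a transfer (the host, credentials, and paths are placeholders; pscp.exe must be on your PATH):

# copy local files to the second computer, forcing the SFTP protocol
& pscp.exe -sftp -pw "password" "C:\data\*.txt" "user@secondcomputer:/shared/folder/"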

Accessing UNC file share with credentials in Perl

I am trying to get a Perl script to access a file over a UNC path using specified credentials. Is this possible? If not with Perl, what else could I use?
Thanks
-Jesse
Do you have to access many varied paths with differing credentials or just one?
As a non-programming solution, could you map the network share to a drive letter permanently in Windows before running your Perl program?
Check out the module Win32::NetResource; it has methods that let you connect to Windows network resources, such as drive shares, and supply credentials.
AddConnection(\%NETRESOURCE,$Password,$UserName,$Connection)
Makes a connection to a network resource specified by %NETRESOURCE
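A minimal Perl sketch (the share, credentials, and file name are placeholders; LocalName is left empty here on the assumption that, as with the underlying WNetAddConnection2 call, this establishes credentials without mapping a drive letter):

use Win32::NetResource;

my %resource = (
    Type       => RESOURCETYPE_DISK,
    LocalName  => '',                      # no drive letter, credentials only
    RemoteName => '\\\\10.0.0.1\\share',
);
my $persist = 0;                           # do not make the connection persistent
unless (Win32::NetResource::AddConnection(\%resource, 'password', 'domain\\user', $persist)) {
    my $err;
    Win32::NetResource::GetError($err);
    die "AddConnection failed with error code $err";
}
# credentials are now cached; UNC paths on that server can be opened directly
open my $fh, '<', '\\\\10.0.0.1\\share\\file.txt' or die "open: $!";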