I just burned a couple of hours searching for a solution to send files over an active PSSession, and the result is nada, niente. I'm trying to invoke a command on a remote computer over an active session that should copy something from network storage. So, basically, this is it:
icm -Session $s {
Copy-Item $networkLocation $PCLocation }
Because of the "second hop" problem I can't do that directly, and because I'm running Windows Server 2003 I can't enable CredSSP. I could first copy the files to my computer and then push them to the remote machine, but how? I tried PModem, but as far as I saw it can only pull data, not push.
Any help is appreciated.
This is now possible in PowerShell / WMF 5.0
Copy-Item has -FromSession and -ToSession parameters. You can use one of these and pass in a session variable, e.g.:
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential) -Name SQL
Copy-Item Northwind.* -Destination "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL2008R2\MSSQL\DATA\" -ToSession $cs
See more examples here, or check out the official documentation.
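For the opposite direction, -FromSession pulls a file from the remote machine to the local one. A minimal sketch (the remote and local paths are illustrative, not from the original answer):
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential)
Copy-Item "C:\remote\logs\app.log" -Destination "C:\local\logs\" -FromSession $cs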
If it was a small file, you could send the contents of the file and the filename as parameters.
$f="the filename"
$c=Get-Content $f
invoke-command -session $s -script {param($filename,$contents) `
set-content -path $filename -value $contents} -argumentlist $f,$c
If the file is too long to fit within the session's limits, you could read the file in chunks and use a similar technique to append them together in the target location.
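A rough sketch of that chunked approach, assuming an existing session in $s (the file paths and the 64 KB chunk size are illustrative):
# Send a large file in base64 chunks over an existing session $s.
$localPath = 'C:\local\big.bin'
$remotePath = 'C:\remote\big.bin'
$stream = [IO.File]::OpenRead($localPath)
try {
    $buffer = New-Object byte[] (64KB)
    while (($read = $stream.Read($buffer, 0, $buffer.Length)) -gt 0) {
        # Encode just the bytes actually read, then append them remotely.
        $chunk = [Convert]::ToBase64String($buffer, 0, $read)
        Invoke-Command -Session $s -ScriptBlock {
            param($path, $b64)
            $bytes = [Convert]::FromBase64String($b64)
            $fs = [IO.File]::Open($path, 'Append')
            try { $fs.Write($bytes, 0, $bytes.Length) } finally { $fs.Close() }
        } -ArgumentList $remotePath, $chunk
    }
} finally {
    $stream.Close()
}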
PowerShell 5+ has built-in support for doing this, described in David's answer.
I faced the same problem a while ago and put together a proof-of-concept for sending files over a PS Remoting session. You'll find the script here:
https://gist.github.com/791112
#requires -version 2.0
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[string]
$ComputerName,
[Parameter(Mandatory=$true)]
[string]
$Path,
[Parameter(Mandatory=$true)]
[string]
$Destination,
[int]
$TransferChunkSize = 0x10000
)
function Initialize-TempScript ($Path) {
"<# DATA" | Set-Content -Path $Path
}
function Complete-Chunk () {
#"
DATA #>
`$TransferPath = `$Env:TEMP | Join-Path -ChildPath '$TransferId'
`$InData = `$false
`$WriteStream = [IO.File]::OpenWrite(`$TransferPath)
try {
`$WriteStream.Seek(0, 'End') | Out-Null
`$MyInvocation.MyCommand.Definition -split "``n" | ForEach-Object {
if (`$InData) {
`$InData = -not `$_.StartsWith('DATA #>')
if (`$InData) {
`$WriteBuffer = [Convert]::FromBase64String(`$_)
`$WriteStream.Write(`$WriteBuffer, 0, `$WriteBuffer.Length)
}
} else {
`$InData = `$_.StartsWith('<# DATA')
}
}
} finally {
`$WriteStream.Close()
}
"#
}
function Complete-FinalChunk ($Destination) {
#"
`$TransferPath | Move-Item -Destination '$Destination' -Force
"#
}
$ErrorActionPreference = 'Stop'
Set-StrictMode -Version Latest
$EncodingChunkSize = 57 * 100
if ($EncodingChunkSize % 57 -ne 0) {
throw "EncodingChunkSize must be a multiple of 57"
}
$TransferId = [Guid]::NewGuid().ToString()
$Path = ($Path | Resolve-Path).ProviderPath
$ReadBuffer = New-Object -TypeName byte[] -ArgumentList $EncodingChunkSize
$TempPath = ([IO.Path]::GetTempFileName() | % { $_ | Move-Item -Destination "$_.ps1" -PassThru}).FullName
$Session = New-PSSession -ComputerName $ComputerName
$ReadStream = [IO.File]::OpenRead($Path)
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
try {
do {
$ReadCount = $ReadStream.Read($ReadBuffer, 0, $EncodingChunkSize)
if ($ReadCount -gt 0) {
[Convert]::ToBase64String($ReadBuffer, 0, $ReadCount, 'InsertLineBreaks') |
Add-Content -Path $TempPath
}
$ChunkCount += $ReadCount
if ($ChunkCount -ge $TransferChunkSize -or $ReadCount -eq 0) {
# send
Write-Verbose "Sending chunk $TransferId"
Complete-Chunk | Add-Content -Path $TempPath
if ($ReadCount -eq 0) {
Complete-FinalChunk -Destination $Destination | Add-Content -Path $TempPath
Write-Verbose "Sending final chunk"
}
Invoke-Command -Session $Session -FilePath $TempPath
# reset
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
}
} while ($ReadCount -gt 0)
} finally {
if ($ReadStream) { $ReadStream.Close() }
$Session | Remove-PSSession
$TempPath | Remove-Item
}
Some minor changes would allow it to accept a session as a parameter instead of starting a new one. I found that the memory consumption of the Remoting service on the destination computer could grow quite large when transferring large files. I suspect PS Remoting wasn't really designed to be used this way.
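That session-parameter change would look roughly like this (a sketch, not the gist's actual code; parameter validation omitted):
# Sketch: make -Session optional and create one only when it wasn't supplied.
param (
    [string]$ComputerName,
    [System.Management.Automation.Runspaces.PSSession]$Session
)
$OwnsSession = $false
if (-not $Session) {
    $Session = New-PSSession -ComputerName $ComputerName
    $OwnsSession = $true
}
try {
    # ... build and send chunks over $Session exactly as in the script above ...
} finally {
    # Only tear down sessions this script created; leave caller-owned sessions alone.
    if ($OwnsSession) { $Session | Remove-PSSession }
}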
NET USE allows you to map a local drive letter to the remote system, which then allows you to use the drive letter in your PSSession, or even without a PSSession. This is helpful if you don't have PowerShell v5.0, and handy even if you do.
You may use the remote machine name or its IP address as part of the remote UNC path, and you can specify the username and password credentials on the same line:
NET USE Z: \\192.168.1.50\ShareName /USER:192.168.1.50\UserName UserPassword
Another example:
NET USE Z: \\RemoteSystem\ShareName /USER:RemoteSystem\UserName UserPassword
OR
NET USE Z: \\RemoteSystem\ShareName /USER:Domain\UserName UserPassword
If you don't supply the user credentials on the same line, you will be prompted for them:
>NET USE Z: \\192.168.1.50\ShareName
Enter the user name for '192.168.1.50': 192.168.1.50\UserName
Enter the password for 192.168.1.50: *****
The command completed successfully.
You may remove the drive letter when you're finished with the following:
NET USE Z: /delete
You can get the full syntax with NET USE /?
>net use /?
The syntax of this command is:
NET USE
[devicename | *] [\\computername\sharename[\volume] [password | *]]
[/USER:[domainname\]username]
[/USER:[dotted domain name\]username]
[/USER:[username@dotted domain name]
[/SMARTCARD]
[/SAVECRED]
[[/DELETE] | [/PERSISTENT:{YES | NO}]]
NET USE {devicename | *} [password | *] /HOME
NET USE [/PERSISTENT:{YES | NO}]
NET is a standard external .exe command in the system folder and works in PowerShell just fine.
$data = Get-Content 'C:\file.exe' -Encoding Byte -Raw
Invoke-Command -ComputerName 'server' -ScriptBlock { $using:data | Set-Content -Path 'D:\filecopy.exe' -Encoding Byte }
Reading and writing with -Encoding Byte keeps the binary intact (PowerShell 7+ uses -AsByteStream instead). Don't actually know what the maximum file size limitation is, but the whole file is held in memory on both ends, so the session's MaximumReceivedObjectSizeMB setting and available RAM are the practical bounds.
Related
I wrote a script that downloads files from an FTP server, but due to a change of servers it now needs to be translated to SFTP.
I checked many posts, but most of them use .NET code; I need to do it in PowerShell.
Below is the working script that downloads from FTP. Please help me translate it for an SFTP server.
The SFTP server has a hostname, username, and password.
Script:
function Dld-FtpFiles {
param
(
$FtpHost,
$FtpUser,
$FtpPassword,
$FtpFolder,
$DownloadDirectory
)
$Client = Connect-FTP -Server $FtpHost -Verbose -Username $FtpUser -Password $FtpPassword
$List = Get-FTPList -Client $Client -Path $FtpFolder
Get-Childitem -Path $DownloadDirectory | Remove-Item
$downloadedFiles = @()
foreach($originalFtpFile in $list) {
$downloadedFile = Receive-FTPFile -Client $Client -RemoteFile $originalFtpFile -LocalPath $DownloadDirectory -LocalExists Overwrite -VerifyOptions Retry, None
Write-Host "Downloaded file $($originalFtpFile.Name) in folder $($downloadedFile.LocalPath)"
$downloadedFiles += ($downloadedFile)
}
Disconnect-FTP -Client $Client
$downloadedFiles = $downloadedFiles | Where-Object {$_}
return $downloadedFiles
}
Set-ExecutionPolicy RemoteSigned -Scope Process -Force
Import-Module Transferetto
$ftpDetails = @(
(New-Object PSObject -Property @{
FtpHost = 'eu.data.com'
FtpUser = 'FT_34436'
FtpPassword = 'V64445h3'
FtpFolder = '/File'
DownloadDirectory = "\\Srcfile\data"
})
)
foreach($ftpDetail in $ftpDetails) {
$downloadedFiles = Dld-FtpFiles -FtpHost $ftpDetail.FtpHost -FtpUser $ftpDetail.FtpUser -FtpPassword $ftpDetail.FtpPassword -FtpFolder $ftpDetail.FtpFolder -DownloadDirectory $ftpDetail.DownloadDirectory
}
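One way to do the SFTP side, sketched here with the Posh-SSH module rather than Transferetto (Install-Module Posh-SSH; cmdlet names per Posh-SSH 3.x, untested), reusing the host, user, and folder from the script above:
Import-Module Posh-SSH
# Build a PSCredential from the plain-text values used above.
$secure = ConvertTo-SecureString 'V64445h3' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential('FT_34436', $secure)
$session = New-SFTPSession -ComputerName 'eu.data.com' -Credential $cred -AcceptKey
foreach ($item in (Get-SFTPChildItem -SessionId $session.SessionId -Path '/File' |
        Where-Object { -not $_.IsDirectory })) {
    Get-SFTPItem -SessionId $session.SessionId -Path $item.FullName -Destination '\\Srcfile\data' -Force
    Write-Host "Downloaded file $($item.Name) to \\Srcfile\data"
}
Remove-SFTPSession -SessionId $session.SessionId | Out-Null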
Good day, I'd like to ask for your help in finding a solution for copying each MSI package to a remote machine using a link to NAS storage.
# Get list of servers
param(
[ValidateSet('STUDENT_LAB', 'LIBRARY_LAB', 'TEACHER_LAB')]
[Parameter(Mandatory = $true,
HelpMessage = 'Select one of the valid servers by typing one of these names: STUDENT_LAB, LIBRARY_LAB, TEACHER_LAB')]
[string]$ServerGroup
)
$servers = @{
STUDENT_LAB = ('192.168.1.1','192.168.1.2','192.168.1.3')
LIBRARY_LAB = ('192.168.10.1','192.168.10.2','192.168.10.3')
TEACHER_LAB = ('192.168.15.1','192.168.15.2','192.168.15.3')
}[$ServerGroup]
Write-Output "The user chose $ServerGroup"
#this is what I don't know how to implement - download files from the NAS storage onto the remote machine
$sourcefiles = @(
'\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
'\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
'\\NASSTORAGE\MSI\MICROSOFT\Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
)
foreach($server in $servers) {
# Destination UNC path changes based on server name
$destinationPath = "\\$server\D$\tmp\"
# Check that full folder structure exists and create if it doesn't
if(!(Test-Path $destinationPath)) {
New-Item -ItemType Directory -Force -Path $destinationPath
}
# Copy the file across
Copy-Item $sourcefiles $destinationPath
#list of packages to install
$msiList = @(
'Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
'Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
'Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
)
#now I'm trying to install on remote machine
foreach ($msi in $msiList) {
$install = Join-Path -Path $destinationPath -ChildPath $msi
Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait
}
}
And is there any way to check whether the MSI was installed properly?
Thank you for your time.
You can add this at the installation section:
$LaunchMsi = Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait -PassThru
$ReturnCode = $LaunchMsi.ExitCode
if (($ReturnCode -eq 0) -OR ($ReturnCode -eq 3010)) {Write-Host "Installation OK, return code: $ReturnCode"} # 3010 = success, reboot required
Else {Write-Host "Installation KO, return code: $ReturnCode"}
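Note that Start-Process in the question's install loop runs msiexec on the machine executing the script, not on the remote server, and Copy-Item places the files directly in \tmp\, so the subfolder prefixes in $msiList won't resolve. A hedged sketch of pushing the install to the remote side with Invoke-Command, assuming the MSIs already sit in D:\tmp on each server (file names only):
foreach ($msi in 'msodbcsql.msi','msodbcsql_17.2.0.1_x64.msi','msoledbsql_18.1.0.0_x64.msi') {
    # Path is local to the remote server, so no second hop is involved.
    $install = Join-Path -Path 'D:\tmp' -ChildPath $msi
    $code = Invoke-Command -ComputerName $server -ScriptBlock {
        param($msiPath)
        (Start-Process 'msiexec.exe' -ArgumentList "/I `"$msiPath`"", '/qn' -Wait -PassThru).ExitCode
    } -ArgumentList $install
    Write-Output "$msi on $server returned $code"
}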
After a massive amount of headaches, I was able to get this almost functioning.
Problem: in the error output, Robocopy appears to be treating $args[4] (ref: $sourcePath) as every single IP in the range instead of just one object.
I'm assuming the rest of the syntax is correct, because if I switch $ip = 101..119 | foreach { "192.168.1.$_" } to $ip = "192.168.1.101", everything works correctly.
Robocopy dumps all of the IP addresses in the range from $ip into the console as -source. What am I doing wrong here?
#####################################################
#Purpose: to ping an IP range of network locations in a loop until successful. When successful, copy new files from source onto storage.
#Each ping and transfer needs to be ran individually and simultaneously due to time constraints.
#####################################################
#define the IP range, source path & credentials, and storage path
$ip = 101..119 | foreach { "192.168.1.$_" }
#$ip = "192.168.1.101" #everything works if I comment above and uncomment here
$source = "\\$ip"
$sourcePath = "$source\admin\"
$dest = "C:\Storage\"
$destFull = "$dest$ip\"
$username = "user"
$password = "password"
#This is how to test the connection. Once it returns TRUE, copy new files only from source to destination.
#copy all subdirectories & files in restartable mode
foreach ($src in $ip){
Start-Job -ScriptBlock {
DO {$ping = Test-Connection $args[0] -BufferSize 16 -Count 4 -quiet}
until ($ping)
net use \\$args[1] $args[2] /USER:$args[3]
robocopy $args[4] $args[5] /E /Z
} -ArgumentList $src, $source, $password, $username, $sourcePath, $destFull -Name "$src" #pass arguments to Start-Job's scriptblock
}
#get all jobs in the session, suppress the command prompt until all jobs are complete, then get the results.
Get-Job | Wait-Job
Get-Job | Receive-Job
At the point you create $source, $ip is an array, so $source ends up as a very long string of all the items concatenated:
\\192.168.1.101 192.168.1.102 192.168.1.103 192.168.1.104 ...
You can see this for yourself, by running just these two lines, then examining the contents of $source:
$ip = 101..119 | foreach { "192.168.1.$_" }
$source = "\\$ip"
This has a knock-on effect to $sourcePath which is used as $args[4] in your call to RoboCopy. You should build your paths inside your foreach loop, where you have access to each IP address ($src) from the $ip collection.
Some sources etc. are different, but that's just due to the test environment. I decided to use [io.path] for the paths since I was running into problems with $args when defining variables.
Thank you boxdog for the help above. I was completely overlooking that fact.
$ScriptBlock = {
$source = [io.path]::Combine('\\',$args[0])
$sourcePath = [io.path]::Combine('\\',$args[0],'c$','admin\')
$dest = "C:\Storage\"
$destFull = [io.path]::Combine($dest,$args[0])
DO {$ping = Test-Connection $args[0] -BufferSize 16 -Count 1 -quiet}
until ($ping)
net use $source password /USER:user
robocopy $sourcePath $destFull /E /Z
}
$ip = 101..119 | foreach { "192.168.1.$_" }
foreach ($dvr in $ip){
Start-Job $ScriptBlock -ArgumentList $dvr
}
Get-Job | Wait-Job
Get-Job | Receive-Job
I am trying to write a script that will loop through all of my local IIS websites and update their physical path credentials whenever I'm forced to update my domain password.
The following works... the first time you run it:
function Set-Site-Credentials(
$SiteElement,
$Credentials
){
$SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
$SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
$SiteElement | Set-Item -Force
}
After running this, I noticed that the following properties also get set:
$SiteElement.userName #Same as was set earlier on .virtualDirectoryDefaults
$SiteElement.password #Same as was set earlier on .virtualDirectoryDefaults
Subsequently, any time I try to update the credentials using the code above, these two properties remain unchanged and the changes don't take effect in IIS.
So the result is:
$SiteElement.userName #Unchanged
$SiteElement.password #Unchanged
$SiteElement.virtualDirectoryDefaults.userName #New value
$SiteElement.virtualDirectoryDefaults.password #New value
And the IIS site still shows the old username in the UI and the credentials fail.
So naturally I tried setting those two extra properties in my update function:
function Set-Site-Credentials(
$SiteElement,
$Credentials
){
$SiteElement.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
$SiteElement.password = $Credentials.Password
$SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
$SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
$SiteElement | Set-Item -Force
}
The code throws no errors or warnings, but the end result is the same: those two extra properties remain unchanged.
I am using the following code to get "$SiteElement"
$sites = Get-ChildItem IIS:\Sites
$sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
Also, at the end of the script I restart IIS using this command:
Restart-Service W3SVC
Ugh, finally found a command that works. All in all I've tried four different variations of the same thing from different examples around the interwebz, all of which only work the first time. But this command updates properly on subsequent changes:
function Set-Site-Credentials(
$SiteElement,
$Credentials
){
Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
The full script
param (
[switch]$All,
[switch]$AllPools,
[switch]$AllSites,
[string]$AppPool,
[string]$Site
)
Import-Module WebAdministration
function Set-AppPool-Credentials(
$AppPoolElement,
$Credentials
){
Set-ItemProperty $AppPoolElement.PSPath -name processModel -value @{userName="$($Credentials.Domain)\$($Credentials.UserName)";password="$($Credentials.Password)";identitytype=3}
}
function Set-Site-Credentials(
$SiteElement,
$Credentials
){
Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
$newCredentials = (Get-Credential).GetNetworkCredential()
$appPools = Get-ChildItem IIS:\AppPools
$sites = Get-ChildItem IIS:\Sites
if($All -or $AllPools){
$appPools | Foreach-Object { Set-AppPool-Credentials -AppPoolElement $_ -Credentials $newCredentials }
}
elseif($AppPool){
$poolElement = ($appPools | Where-Object { $_.name -eq $AppPool })
Set-AppPool-Credentials -AppPoolElement $poolElement -Credentials $newCredentials
}
if($All -or $AllSites){
$sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
}
elseif($Site){
$siteElement = ($sites | Where-Object { $_.name -eq $Site })
Set-Site-Credentials -SiteElement $siteElement -Credentials $newCredentials
}
Restart-Service W3SVC
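Assuming the script above is saved as Update-IisCredentials.ps1 (the filename is my own; pick any), usage would look like:
.\Update-IisCredentials.ps1 -All
.\Update-IisCredentials.ps1 -Site "Default Web Site"
You'll be prompted once by Get-Credential, and the credentials are applied everywhere the switches select.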
I have an array of Credential objects and I would like to test that these credentials have permissions to write a file to a file share.
I was going to do something like
$myPath = "\\path\to\my\share\test.txt"
foreach ($cred in $credentialList)
{
"Testing" | Out-File -FilePath $myPath -Credential $cred
}
but then I discovered that Out-File doesn't take Credential as a parameter. What's the best way to solve this?
You can use New-PSDrive:
$myPath = "\\path\to\my\share"
foreach ($cred in $credentialList)
{
New-PSDrive Test -PSProvider FileSystem -Root $myPath -Credential $Cred
"Testing" | Out-File -FilePath Test:\test.txt
Remove-PSDrive Test
}
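One caveat with the loop above: if Out-File throws, Remove-PSDrive never runs and the drive stays mapped for the next iteration. A variant with guaranteed cleanup (same hypothetical share path):
$myPath = "\\path\to\my\share"
foreach ($cred in $credentialList)
{
    New-PSDrive -Name Test -PSProvider FileSystem -Root $myPath -Credential $cred | Out-Null
    try { "Testing" | Out-File -FilePath Test:\test.txt }
    finally { Remove-PSDrive -Name Test }
}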
Here is a situation where an old exe (net.exe) seems to do better than PowerShell...
I guess you could try to map a network drive with the credential provided, then test writing a file to that drive:
$cred=get-credential
$pass=$cred.GetNetworkCredential().Password
net use q: \\servername\share $pass /user:$($cred.UserName)
Use this script taken from Microsoft's TechNet Script Center: http://gallery.technet.microsoft.com/scriptcenter/Lists-all-the-shared-5ebb395a
It is a lot easier to alter it to fit your needs than to start completely from scratch.
Open up ListSharedFolderPermissions.ps1 and find the three $Properties vars. Add a line at the top of each one so you can tell which user you're looking at; each should now look like this:
$Properties = @{'Username' = $Credential.UserName
'ComputerName' = $ComputerName
. . . . . }
Next, add your new Username property to the Select-Object line (3 times):
$Objs|Select-Object Username,ComputerName,ConnectionStatus,SharedFolderName,SecurityPrincipal, `
FileSystemRights,AccessControlType
Once you've added those small pieces in the six appropriate places, your script is ready to use:
cd c:\Path\where\you\put\ps1\file
$permissions = @()
$myPath = "computername"
foreach ($cred in $credentialList)
{
$permissions += .\ListAllSharedFolderPermission.ps1 -ComputerName $myPath -Credential $cred
$permissions += " "
}
$permissions | Export-Csv -Path "C:\Permission.csv" -NoTypeInformation
Try using the Invoke-Command cmdlet. It will take a credential object and lets you run an arbitrary script block as that identity. You can use that to test writing the file:
Invoke-Command -ComputerName localhost -Credential $cred -ScriptBlock { "Testing" | Out-File $using:myPath }
I think the Invoke-Command approach should work, but if nothing does, you can try the PowerShell impersonation module. It successfully impersonates a user for most PowerShell commands without the -Credential switch.
A few ideas:
Create your own PowerShell Provider
Impersonate a user and then write to the share (not sure if possible in PowerShell)
Use net use d:... as @Kayasax has suggested
Use WScript.Network
I'm very interested in the PowerShell provider myself, but I decided to make something real quick, so I went with the WScript.Network library. I used a hash table to track whether a user would be "authenticated" or not.
$credentials = @() # List of System.Net.NetworkCredential objects
$authLog = @{}
$mappedDrive = 'z:'
$tmpFile = $mappedDrive, '\', [guid]::NewGuid(), '.tmp' -join ''
$path = [io.path]::GetPathRoot('\\server\share\path')
$net = new-object -comObject WScript.Network
foreach ($c in $credentials) {
if ($authLog.ContainsKey($c.UserName)) {
# Skipping because we've already tested this user.
continue
}
try {
if (Test-Path $mappedDrive) {
$net.RemoveNetworkDrive($mappedDrive, 1) # 1 to force
}
# Attempt to map drive and write to it
$net.MapNetworkDrive($mappedDrive, $path, $false, $c.UserName, $c.Password)
out-file $tmpFile -inputObject 'test' -force
# Cleanup
Remove-Item $tmpFile -force
$net.RemoveNetworkDrive($mappedDrive, 1)
# Authenticated.
# We shouldn't have reached this if we failed to mount or write
$authLog.Add($c.UserName, 'Authorized')
}
catch [Exception] {
# Unauthenticated
$authLog.Add($c.UserName, 'Unauthorized')
}
}
$authLog
# Output
Name Value
---- -----
desktop01\user01 Authorized
desktop01\user02 Unauthorized