PowerShell script to install multiple MSIs on remote machines

Good day, I would like to ask for help finding a way to copy each MSI package to a remote machine using a link to NAS storage.
# Get list of servers
param(
    [ValidateSet('STUDENT_LAB', 'LIBRARY_LAB', 'TEACHER_LAB')]
    [Parameter(Mandatory = $true,
        HelpMessage = 'Select one of the valid servers by typing one of these names: STUDENT_LAB, LIBRARY_LAB, TEACHER_LAB')]
    [string]$ServerGroup
)
$servers = @{
    STUDENT_LAB = ('192.168.1.1','192.168.1.2','192.168.1.3')
    LIBRARY_LAB = ('192.168.10.1','192.168.10.2','192.168.10.3')
    TEACHER_LAB = ('192.168.15.1','192.168.15.2','192.168.15.3')
}[$ServerGroup]
Write-Output "The user chose $ServerGroup"
# This is what I don't know how to implement - copy each file from NAS storage to the remote machine
$sourcefiles = '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi',
               '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi',
               '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
foreach ($server in $servers) {
    # Destination UNC path changes based on server name
    $destinationPath = "\\$server\D$\tmp\"
    # Check that full folder structure exists and create if it doesn't
    if (!(Test-Path $destinationPath)) {
        New-Item -ItemType Directory -Force -Path $destinationPath
    }
    # Copy the file across
    Copy-Item $sourcefiles $destinationPath
    # list of packages to install
    $msiList = @(
        'Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
        'Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
        'Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
    )
    # now I'm trying to install on remote machine
    foreach ($msi in $msiList) {
        $install = Join-Path -Path $destinationPath -ChildPath $msi
        Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait
    }
}
And is there any way to check whether the MSI was installed properly?
Thank you for your time.

You can add this in the installation section:
$LaunchMsi = Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait -PassThru
$ReturnCode = $LaunchMsi.ExitCode
if (($ReturnCode -eq 0) -or ($ReturnCode -eq 3010)) { Write-Host "Installation OK, return code: $ReturnCode" }
else { Write-Host "Installation KO, return code: $ReturnCode" }
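To go a step further and verify on the remote machine that the product actually ended up installed (a sketch, not part of the original answer; it assumes PowerShell Remoting is enabled, uses an example DisplayName pattern, and $server comes from the loop above), you could query the uninstall registry keys:
# Sketch: check a remote machine's uninstall registry keys for the product.
# 'Microsoft ODBC Driver 17*' is only an example pattern; adjust it to your package.
$productPattern = 'Microsoft ODBC Driver 17*'
Invoke-Command -ComputerName $server -ScriptBlock {
    param($pattern)
    $keys = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
            'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
    Get-ItemProperty -Path $keys -ErrorAction SilentlyContinue |
        Where-Object { $_.DisplayName -like $pattern } |
        Select-Object DisplayName, DisplayVersion
} -ArgumentList $productPattern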

Related

Cannot install fonts with Powershell on Windows 10

On my work computer, I don't have admin privileges.
Installing new fonts cannot be done "the easy way".
At the time I was using Windows 7, I managed to run a PowerShell script that was launched at session startup and that installed the fonts from a given folder.
Here is the code I used:
add-type -name Session -namespace "" -member @"
[DllImport("gdi32.dll")]
public static extern int AddFontResource(string filePath);
"@
$FontFolder = "C:\Users\myusername\Documents\Fonts"
$null = foreach ($font in Get-ChildItem -Path $FontFolder -Recurse -Include *.ttf, *.otg, *.otf) {
    Write-Host "Installing : $($font.FullName)"
    $result = [Session]::AddFontResource($font.FullName)
    Write-Host "Installed $($result) fonts"
}
Now that I have switched to Windows 10, I thought I could go back to installing fonts "the easy way", as it is supposed to be possible to install fonts for your user without admin privileges.
This however still does not work: there is a popup window saying that "The requested file is not a valid font file". One solution is apparently to start the Windows firewall, which of course is not allowed by my administrator... but it is already running (see Edit below)
Back to the PowerShell then. The script unfortunately does not work anymore and does not provide any interesting pointers to where the problem comes from:
Installing : C:\Users\myusername\Documents\Fonts\zilla-slab\ZillaSlab-SemiBold.otf
Installed 0 fonts
Installing : C:\Users\myusername\Documents\Fonts\zilla-slab\ZillaSlab-SemiBoldItalic.otf
Installed 0 fonts
Installing : C:\Users\myusername\Documents\Fonts\zilla-slab\ZillaSlabHighlight-Bold.otf
Installed 0 fonts
I tried using a try/catch, but still could not identify the error:
add-type -name Session -namespace "" -member @"
[DllImport("gdi32.dll")]
public static extern int AddFontResource(string filePath);
"@
$FontFolder = "C:\Users\myusername\Documents\Fonts"
$null = foreach ($font in Get-ChildItem -Path $FontFolder -Recurse -Include *.ttf, *.otg, *.otf) {
    try {
        Write-Host "Installing : $($font.FullName)"
        $result = [Session]::AddFontResource($font.FullName)
        Write-Host $result
    }
    catch {
        Write-Host "An error occurred installing $($font)"
        Write-Host "$($error)"
        Write-Host "$($error[0].ToString())"
        Write-Host ""
        1
    }
}
And the resulting output
Installing : C:\Users\myusername\Documents\Fonts\zilla-slab\ZillaSlabHighlight-Bold.otf
0
Installing : C:\Users\myusername\Documents\Fonts\zilla-slab\ZillaSlabHighlight-Regular.otf
0
Installing : C:\Users\myusername\Documents\Fonts\ZillaSlab-Light.otf
0
Any idea how to solve this issue?
Edit:
Regarding the status of the security applications, here is the McAfee status:
McAfee Data Exchange Layer OK
McAfee DLP Endpoint OK
Programme de mise à jour McAfee OK
McAfee Endpoint Security OK
"Programme de mise à jour" means "update program" in French.
I also checked the list of running services:
mpssvc service (Windows defender firewall) is running
mfefire (McAfee Firewall core service) is not running
Edit2:
My last attempt is the following:
I copied the font file manually to the $($env:LOCALAPPDATA)\Microsoft\Windows\Fonts\ folder
Using regedit, I added the entry as shown below
I restarted. Still no Bebas font in WordPad or Publisher
Here's how I do it with a COM object. This works for me as a non-admin, based on Install fonts without administrative privileges. I can see the fonts installed to "$env:LOCALAPPDATA\Microsoft\Windows\Fonts" in the Fonts area under Settings. I have Windows 10 20H2 (it should work in 1803 or higher). I also see the fonts installed in WordPad.
$Destination = (New-Object -ComObject Shell.Application).Namespace(20)
$TempFolder = "$($env:windir)\Temp\Fonts\"
New-Item -Path $TempFolder -Type Directory -Force | Out-Null
Get-ChildItem -Path $PSScriptRoot\fonts\* -Include '*.ttf','*.ttc','*.otf' |
    ForEach {
        If (-not(Test-Path "$($env:LOCALAPPDATA)\Microsoft\Windows\Fonts\$($_.Name)")) {
            $Font = "$($env:windir)\Temp\Fonts\$($_.Name)"
            Copy-Item $($_.FullName) -Destination $TempFolder
            $Destination.CopyHere($Font)
            Remove-Item $Font -Force
        } else { "font $($env:LOCALAPPDATA)\Microsoft\Windows\Fonts\$($_.Name) already installed" }
    }
Example REG_SZ registry entry:
dir 'HKCU:\Software\Microsoft\Windows NT\CurrentVersion\Fonts*' | ft -a
Hive: HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion
Name Property
---- --------
Fonts Nunito Black (TrueType) : C:\Users\myuser\AppData\Local\Microsoft\Windows\Fonts\Nunito-Black.ttf
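If you prefer to create that per-user registry value directly from PowerShell rather than letting the shell do it (a sketch complementing the answer above; the font name and file are taken from the example listing, so substitute your own):
# Sketch: register an already-copied font for the current user only (Windows 10 1803+).
$fontFile = "$env:LOCALAPPDATA\Microsoft\Windows\Fonts\Nunito-Black.ttf"
$regPath  = 'HKCU:\Software\Microsoft\Windows NT\CurrentVersion\Fonts'
if (-not (Test-Path $regPath)) { New-Item -Path $regPath -Force | Out-Null }
New-ItemProperty -Path $regPath -Name 'Nunito Black (TrueType)' -Value $fontFile -PropertyType String -Force | Out-Null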
You can install fonts on Windows using the following PowerShell script.
param(
    [Parameter(Mandatory=$true,Position=0)]
    [ValidateNotNull()]
    [array]$pcNames,
    [Parameter(Mandatory=$true,Position=1)]
    [ValidateNotNull()]
    [string]$fontFolder
)
$padVal = 20
$pcLabel = "Connecting To".PadRight($padVal," ")
$installLabel = "Installing Font".PadRight($padVal," ")
$errorLabel = "Computer Unavailable".PadRight($padVal," ")
$openType = "(Open Type)"
$regPath = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts"
$objShell = New-Object -ComObject Shell.Application
if (!(Test-Path $fontFolder))
{
    Write-Warning "$fontFolder - Not Found"
}
else
{
    $objFolder = $objShell.namespace($fontFolder)
    foreach ($pcName in $pcNames)
    {
        Try {
            Write-Output "$pcLabel : $pcName"
            $null = Test-Connection $pcName -Count 1 -ErrorAction Stop
            $destination = "\\",$pcname,"\c$\Windows\Fonts" -join ""
            foreach ($file in $objFolder.items())
            {
                $fileType = $($objFolder.getDetailsOf($file, 2))
                if (($fileType -eq "OpenType font file") -or ($fileType -eq "TrueType font file"))
                {
                    $fontName = $($objFolder.getDetailsOf($File, 21))
                    $regKeyName = $fontName,$openType -join " "
                    $regKeyValue = $file.Name
                    Write-Output "$installLabel : $regKeyValue"
                    Copy-Item $file.Path $destination
                    Invoke-Command -ComputerName $pcName -ScriptBlock { $null = New-ItemProperty -Path $args[0] -Name $args[1] -Value $args[2] -PropertyType String -Force } -ArgumentList $regPath,$regKeyName,$regKeyValue
                }
            }
        }
        catch {
            Write-Warning "$errorLabel : $pcName"
        }
    }
}
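Assuming the script above is saved as Install-RemoteFonts.ps1 (a hypothetical name), it would be invoked like this:
# Hypothetical usage: push every font in C:\Fonts to two remote machines.
.\Install-RemoteFonts.ps1 -pcNames 'PC01','PC02' -fontFolder 'C:\Fonts'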

Installing Citrix Receiver via PowerShell

I'm trying to create a script to install Receiver, but I'm getting a popup box asking if I'm sure.
$InstallFiles = "C:\Users\raw.admin\Documents\CitrixReceiver.exe"
$ArgumentList = '/silent /includeSSON enable_SSON=yes enableprelaunch=true allowaddstore=a STORE0="Kiewit;https://apps.kiewit.com/citrix/kiewit/discovery;on"'
Write-Host "Installing Citrix receiver"
Start-Process -FilePath $InstallFiles -ArgumentList $ArgumentList -Wait
## Make sure Single Sign-On is in the Provider Order key
$Path = "HKLM:\system\CurrentControlSet\Control\NetworkProvider\Order"
$ProviderOrder = (Get-ItemProperty -Path $Path).ProviderOrder
If ($ProviderOrder -NotLike "*PnSson*")
{
    Set-ItemProperty -Path $Path -Name ProviderOrder -Value ($ProviderOrder + ",PnSson")
    $NewProviderOrder = (Get-ItemProperty -Path $Path).ProviderOrder
    Write-Host "Provider Order key now has the following values: $NewProviderOrder"
}
Else
{
    Write-Host "PnSson already present in Provider Order"
}
Write-Host "Installation Complete"
I don't want the user to get prompted to answer yes or no. Can it be done, or is it something the user will just have to live with?
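If the dialog turns out to be Windows' "Open File - Security Warning" for a downloaded installer (an assumption; the question doesn't say which popup appears), removing the file's zone information before launching it may suppress the prompt:
# Assumption: the popup is the "Open File - Security Warning" shown for downloaded files.
# Unblock-File strips the Zone.Identifier stream so the installer starts without the prompt.
Unblock-File -Path $InstallFiles
Start-Process -FilePath $InstallFiles -ArgumentList $ArgumentList -Wait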

Powershell: Update-TfsWorkspace cmdlet how to update two workspaces

I want to update two workspaces from two different TFS servers in one script using PowerShell.
The first workspace updates without any problems. After the update is finished, PowerShell connects to the second workspace, but doesn't update the local data like the first time.
I guess the old connection might still block the pipeline or something like that, but I haven't found any cmdlet to clear it. My code looks like this:
param(
    [string]$TestTFS = "http://TestTFS",
    [string]$ProdTFS = "http://ProdTFS",
    [string]$Teamproject="$\TeamprojectPath",
    [string]$LocalTestWorkspace="C:\LocalTestWorkspacePath",
    [string]$LocalProdWorkspace="C:\LocalProdWorkspacePath"
)
# Import Microsoft.TeamFoundation.PowerShell Snapin
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
# Connect to production-TFS
$ProdEnvServer = Get-TfsServer -Name $ProdTFS
Write-Host "tfsConnect ="$ProdEnvServer
# Get prod teamprojekt
Get-TfsChildItem $Teamprojekt -Server $ProdEnvServer
# Update files in local prod workspace
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace
# Connect to test-TFS
$TestEnvServer = Get-TfsServer -Name $TestTFS
Write-Host "tfsConnect ="$TestEnvServer
# Get test teamprojekt
Get-TfsChildItem $Teamprojekt -Server $TestEnvServer
# Update files in local test workspace
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace
After 3 months with no one coming up with an answer, I just assume that the cmdlets don't work as they should. The only option here seems to be a workaround.
# Copy Team Project from Prod to Test TFS
param([string]$TestTFS = "http://TestTFS",
[string]$ProdTFS = "http://ProdTFS",
[String]$Teamproject="$/Teamproject",
[String]$LocalTestWorkspace="C:\LocalTestWorkspacePath",
[String]$LocalProdWorkspace="C:\LocalProdWorkspacePath")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client")
try
{
clear
$LocalTestProjectPath = $LocalTestWorkspace + $Teamproject.Substring(1)
$LocalProdProjectPath = $LocalProdWorkspace + $Teamproject.Substring(1)
# Connect to production-TFS
Write-Host "Getting latest of $ProdTFS"
$tfsColProd = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($ProdTFS)
[Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer] $vcsProd = $tfsColProd.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")
# TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
# Delete existing workspace manually before if that happens
$workspaceProd = $vcsProd.TryGetWorkspace($LocalProdWorkspace)
$isProdTempWorkspace = $false
# create Workspace if it doesn't exists
if (-not $workspaceProd) {
Write-Host "No workspace found, creating temporary for prod"
$workspaceProd = $vcsProd.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
$workspaceProd.Map($Teamproject, $LocalProdProjectPath)
$isProdTempWorkspace = $true
}
$itemSpecFullTeamProj = New-Object Microsoft.TeamFoundation.VersionControl.Client.ItemSpec($Teamproject, "Full")
$fileRequest = New-Object Microsoft.TeamFoundation.VersionControl.Client.GetRequest($itemSpecFullTeamProj,
[Microsoft.TeamFoundation.VersionControl.Client.VersionSpec]::Latest)
$workspaceProd.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)
if ($isProdTempWorkspace) {
Write-Host "Deleting temporary workspace for prod"
$workspaceProd.Delete()
}
Write-Host "Getting latest of $TestTFS"
$tfsColTest = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TestTFS)
$vcsTest = $tfsColTest.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")
# TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
# Delete existing workspace manually before if that happens
[Microsoft.TeamFoundation.VersionControl.Client.Workspace] $workspaceTest = $vcsTest.TryGetWorkspace($LocalTestWorkspace)
$isTestTempWorkspace = $false
# create Workspace if it doesn't exists
if (-not $workspaceTest) {
Write-Host "No workspace found, creating temporary for test"
$workspaceTest = $vcsTest.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
$workspaceTest.Map($Teamproject, $LocalTestProjectPath)
$isTestTempWorkspace = $true
}
$workspaceTest.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)
# Remove local test folder and copy prod folder into test workspace
Write-Host "Copying over Prod to Test"
# Delete updated test project folder
Remove-Item -Path $LocalTestProjectPath -Force -Recurse
# Copy prod folder to test workspace
Copy-Item -Path $LocalProdProjectPath -Destination $LocalTestProjectPath -Force -Recurse
# Calling tfpt is the only thing that works
Write-Host "Comparing for changes"
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $env:TFSPowerToolDir + "tfpt.exe"
$ps.StartInfo.Arguments = "online /adds /deletes /diff /noprompt /recursive $LocalTestProjectPath"
$ps.StartInfo.RedirectStandardOutput = $false # careful, only output works, has hanging problems (2k Buffer limit)
$ps.StartInfo.RedirectStandardError = $false
$ps.StartInfo.UseShellExecute = $false
$ps.Start()
$ps.WaitForExit()
# Check in new test project folder into test environment
$wsCheckinParams = New-Object Microsoft.TeamFoundation.VersionControl.Client.WorkspaceCheckInParameters(
    ($itemSpecFullTeamProj), "Update project to production environment version")
# CheckIn better manually to check for errors
$workspaceTest.CheckIn($wsCheckinParams)
if ($isTestTempWorkspace) {
Write-Host "Deleting temporary workspace for test"
$workspaceTest.Delete()
Remove-Item -Path D:\Development -Force -Recurse
}
}
catch [System.Exception]
{
Write-Host "Exception: " ($Error[0]).Exception
EXIT $LASTEXITCODE
}
My approach is very similar to Zittelrittel. Just send the path and it will automatically figure out the workspace.
This will not work in PowerShell ISE (x86); I had to use the 64-bit version!
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
Write-Host "Updating Workspace1, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace1\code -Recurse | Format-Table
Write-Host "Updating Workspace2, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace1\code -Recurse | Format-Table
In your calls to update TFS workspace, pipe the result to out-null. This should effectively remove any data that would otherwise be stored in the pipeline.
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace | Out-Null
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace | Out-Null

Automating zip archive

So I am trying to process files into separate backup files; however, this creates a single archive named after the last file in the array. I am not sure what I am missing here.
$process = "C:\Program Files (x86)\7-Zip\7z.exe"
$destinationdir = "M:\WIP\OUT\"
$sourcedir = "M:\WIP\ZIP\"
$password = "password"
$ziplist = Get-ChildItem $sourcedir
foreach ($zip in $ziplist)
{
    $destinationfile= $zip+".zip"
    Start-Process $process -ArgumentList "a $destinationfile $zip -o$destinationdir -p$password" -NoNewWindow -Wait
}
Change
$destinationfile= $zip+".zip"
to
$destinationfile= $zip.FullName+".zip"
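Building on that fix, a small sketch (not part of the original answer) that constructs the full destination path explicitly, so each archive lands in $destinationdir regardless of the working directory:
# Sketch: one archive per source file, written explicitly into $destinationdir.
foreach ($zip in Get-ChildItem $sourcedir) {
    $destinationfile = Join-Path $destinationdir ($zip.Name + ".zip")
    # Quote the paths in case they contain spaces; 'a' and -p are the same 7-Zip switches as in the question.
    Start-Process $process -ArgumentList "a `"$destinationfile`" `"$($zip.FullName)`" -p$password" -NoNewWindow -Wait
}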

Send files over PSSession

I just burned a couple of hours searching for a solution to send files over an active PSSession, and the result is nada, niente. I'm trying to invoke a command on a remote computer over an active session which should copy something from network storage. So, basically this is it:
icm -Session $s {
    Copy-Item $networkLocation $PCLocation
}
Because of the "second hop" problem, I can't do that directly, and because I'm running Windows Server 2003 I can't enable CredSSP. I could first copy the files to my computer and then send/push them to the remote machine, but how? I tried PModem, but as far as I saw it can only pull data, not push.
Any help is appreciated.
This is now possible in PowerShell / WMF 5.0
Copy-Item has -FromSession and -ToSession parameters. You can use one of these and pass in a session variable.
e.g.
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential) -Name SQL
Copy-Item Northwind.* -Destination "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL2008R2\MSSQL\DATA\" -ToSession $cs
See more examples here, or check out the official documentation.
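The reverse direction works the same way; a short sketch (the file paths are just examples) pulling a file from the remote machine over the same session:
# Sketch: copy a remote file back to the local machine over the existing session $cs.
Copy-Item -Path 'C:\Logs\app.log' -Destination 'C:\Temp\app.log' -FromSession $cs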
If it was a small file, you could send the contents of the file and the filename as parameters.
$f = "the filename"
$c = Get-Content $f
invoke-command -session $s -script {
    param($filename, $contents)
    set-content -path $filename -value $contents
} -argumentlist $f, $c
If the file is too long to fit within whatever the limits for the session are, you could read the file in as chunks and use a similar technique to append them together in the target location.
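A minimal sketch of that chunked approach (not from the original answer; the paths and the 1MB chunk size are arbitrary examples), base64-encoding each chunk locally and appending the decoded bytes on the remote side:
# Sketch: send a large file in base64 chunks over an existing session $s.
$source = 'C:\local\big.bin'    # example local path
$target = 'C:\remote\big.bin'   # example path on the remote machine
$chunkSize = 1MB
$stream = [IO.File]::OpenRead($source)
$buffer = New-Object byte[] $chunkSize
try {
    while (($read = $stream.Read($buffer, 0, $chunkSize)) -gt 0) {
        $b64 = [Convert]::ToBase64String($buffer, 0, $read)
        Invoke-Command -Session $s -ScriptBlock {
            param($path, $data)
            # Decode the chunk and append it to the target file
            $bytes = [Convert]::FromBase64String($data)
            $fs = [IO.File]::Open($path, [IO.FileMode]::Append, [IO.FileAccess]::Write)
            $fs.Write($bytes, 0, $bytes.Length)
            $fs.Close()
        } -ArgumentList $target, $b64
    }
} finally {
    $stream.Close()
}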
PowerShell 5+ has built-in support for doing this, described in David's answer.
I faced the same problem a while ago and put together a proof-of-concept for sending files over a PS Remoting session. You'll find the script here:
https://gist.github.com/791112
#requires -version 2.0
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[string]
$ComputerName,
[Parameter(Mandatory=$true)]
[string]
$Path,
[Parameter(Mandatory=$true)]
[string]
$Destination,
[int]
$TransferChunkSize = 0x10000
)
function Initialize-TempScript ($Path) {
"<# DATA" | Set-Content -Path $Path
}
function Complete-Chunk () {
#"
DATA #>
`$TransferPath = `$Env:TEMP | Join-Path -ChildPath '$TransferId'
`$InData = `$false
`$WriteStream = [IO.File]::OpenWrite(`$TransferPath)
try {
`$WriteStream.Seek(0, 'End') | Out-Null
`$MyInvocation.MyCommand.Definition -split "``n" | ForEach-Object {
if (`$InData) {
`$InData = -not `$_.StartsWith('DATA #>')
if (`$InData) {
`$WriteBuffer = [Convert]::FromBase64String(`$_)
`$WriteStream.Write(`$WriteBuffer, 0, `$WriteBuffer.Length)
}
} else {
`$InData = `$_.StartsWith('<# DATA')
}
}
} finally {
`$WriteStream.Close()
}
"#
}
function Complete-FinalChunk ($Destination) {
#"
`$TransferPath | Move-Item -Destination '$Destination' -Force
"#
}
$ErrorActionPreference = 'Stop'
Set-StrictMode -Version Latest
$EncodingChunkSize = 57 * 100
if ($EncodingChunkSize % 57 -ne 0) {
throw "EncodingChunkSize must be a multiple of 57"
}
$TransferId = [Guid]::NewGuid().ToString()
$Path = ($Path | Resolve-Path).ProviderPath
$ReadBuffer = New-Object -TypeName byte[] -ArgumentList $EncodingChunkSize
$TempPath = ([IO.Path]::GetTempFileName() | % { $_ | Move-Item -Destination "$_.ps1" -PassThru}).FullName
$Session = New-PSSession -ComputerName $ComputerName
$ReadStream = [IO.File]::OpenRead($Path)
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
try {
do {
$ReadCount = $ReadStream.Read($ReadBuffer, 0, $EncodingChunkSize)
if ($ReadCount -gt 0) {
[Convert]::ToBase64String($ReadBuffer, 0, $ReadCount, 'InsertLineBreaks') |
Add-Content -Path $TempPath
}
$ChunkCount += $ReadCount
if ($ChunkCount -ge $TransferChunkSize -or $ReadCount -eq 0) {
# send
Write-Verbose "Sending chunk $TransferIndex"
Complete-Chunk | Add-Content -Path $TempPath
if ($ReadCount -eq 0) {
Complete-FinalChunk -Destination $Destination | Add-Content -Path $TempPath
Write-Verbose "Sending final chunk"
}
Invoke-Command -Session $Session -FilePath $TempPath
# reset
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
}
} while ($ReadCount -gt 0)
} finally {
if ($ReadStream) { $ReadStream.Close() }
$Session | Remove-PSSession
$TempPath | Remove-Item
}
Some minor changes would allow it to accept a session as a parameter instead of it starting a new one. I found the memory consumption on the Remoting service on the destination computer could grow quite large when transferring large files. I suspect PS Remoting wasn't really designed to be used this way.
NET USE allows you to add a local drive letter for the remote system, which then allows you to use the drive letter in your PSSession, or even without a PSSession. This is helpful if you don't have PowerShell v5.0, and it can be useful even if you do.
You may use the remote machine name or its IP address as part of the remote UNC path and you can specify the username and password credentials on the same line:
NET USE Z: \\192.168.1.50\ShareName /USER:192.168.1.50\UserName UserPassword
Another example:
NET USE Z: \\RemoteSystem\ShareName /USER:RemoteSystem\UserName UserPassword
OR
NET USE Z: \\RemoteSystem\ShareName /USER:Domain\UserName UserPassword
If you don't supply the user credentials on the same line, you will be prompted for them:
>NET USE Z: \\192.168.1.50\ShareName
Enter the user name for '192.168.1.50': 192.168.1.50\UserName
Enter the password for 192.168.1.50: *****
The command completed successfully.
You may remove the drive letter when you're finished with the following:
NET USE Z: /delete
You can get the full syntax with NET USE /?
>net use /?
The syntax of this command is:
NET USE
[devicename | *] [\\computername\sharename[\volume] [password | *]]
[/USER:[domainname\]username]
[/USER:[dotted domain name\]username]
[/USER:[username@dotted domain name]
[/SMARTCARD]
[/SAVECRED]
[[/DELETE] | [/PERSISTENT:{YES | NO}]]
NET USE {devicename | *} [password | *] /HOME
NET USE [/PERSISTENT:{YES | NO}]
NET is a standard external .exe command in the system folder and works in Powershell just fine.
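Putting it together for the original question (a sketch; the share name, credentials, and paths are placeholders): map the drive, copy, then remove the mapping:
# Sketch: map the remote share, copy a file onto it, then drop the mapping.
net use Z: \\RemoteSystem\ShareName /USER:RemoteSystem\UserName UserPassword
Copy-Item -Path 'C:\local\file.txt' -Destination 'Z:\file.txt'
net use Z: /delete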
$data = Get-Content 'C:\file.exe' -Raw
Invoke-Command -ComputerName 'server' -ScriptBlock { $using:data | Set-Content -Path 'D:\filecopy.exe' }
Don't actually know what the maximum file size limitation is.
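Reading an .exe as text with Get-Content can mangle binary content, so a byte-safe variation of the same $using: approach (a sketch; the paths are as in the example above) would be:
# Sketch: byte-safe copy of a binary file over remoting with $using:.
# Assumes the file fits comfortably in memory on both ends.
$bytes = [IO.File]::ReadAllBytes('C:\file.exe')
Invoke-Command -ComputerName 'server' -ScriptBlock {
    [IO.File]::WriteAllBytes('D:\filecopy.exe', $using:bytes)
}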