I'm trying to use PowerShell to upload a (long) list of queued files, using System.Net.WebClient and the UploadFile method. This works fine, but after a file is uploaded, the FTP connection never closes, neither when the WebClient object instance goes out of scope nor even after the script has finished. The function looks as follows:
function Upload-File() {
    Param (
        [string] $user,
        [string] $password,
        [string] $srceFileName,
        [string] $destFileName
    )
    # Set up FTP client
    $client = New-Object System.Net.WebClient
    $client.Credentials = New-Object System.Net.NetworkCredential($user, $password)
    $client.UploadFile($destFileName, ".\$srceFileName")
    $client.Dispose()
}
All the information I can find states that the connection should close automatically when $client goes out of scope, but this is clearly not happening.
Any idea on how to force the connection to close?
(This is part of a legacy system and for now I am stuck with ftp, so switching to another protocol is not an option.)
For anyone else running into this problem, the solution is to use FtpWebRequest instead of WebClient and to set KeepAlive = $false. The function below will upload and then terminate the connection immediately afterwards.
function Upload-File() {
    Param (
        [string] $user,
        [string] $password,
        [string] $srceFileName,
        [string] $destFileName
    )
    $request = [Net.WebRequest]::Create($destFileName)
    $request.KeepAlive = $false
    $request.Credentials = New-Object System.Net.NetworkCredential($user, $password)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $fileStream = [System.IO.File]::OpenRead(".\$srceFileName")
    $ftpStream = $request.GetRequestStream()
    $fileStream.CopyTo($ftpStream)
    $ftpStream.Dispose()
    $fileStream.Dispose()
}
This post pointed me in the right direction.
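For completeness, here's a minimal sketch of how the function above might be driven for a queue of files; the file names, FTP URL, and credentials are placeholders, not part of the original script:
$user     = "ftpuser"
$password = "ftppassword"
$queue    = @("file1.zip", "file2.zip", "file3.zip")   # hypothetical queued files in the current directory

foreach ($file in $queue) {
    # Each call creates its own FtpWebRequest with KeepAlive = $false,
    # so the control connection is closed after every upload.
    Upload-File -user $user -password $password -srceFileName $file -destFileName "ftp://ftp.example.com/incoming/$file"
}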
Other scripts with dynamic parameters have never given me any issues. The one thing that is different here is that this is the first time I am using a SecureString type on a static parameter.
If I call this parameter, the dynamic one will not appear. I'm not doing anything special to hide the dynamic parameter; it should always appear. The issue appears on both 5.1 and PowerShell Core.
I would be interested to know if there is a solution to this problem.
The dynamic parameter looks at available system services and allows you to enter only those services as part of the parameter.
This is only an example to replicate what I'm experiencing. Calling the Name parameter and supplying the information has no effect on the Service dynamic parameter. As soon as the Password parameter is called, you cannot call the Service parameter if you hadn't already.
function test-command {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        $Name,
        [Parameter(Mandatory)]
        [SecureString]
        $Password
    )
    DynamicParam {
        $ParameterName = 'Service'
        $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
        $ParameterAttribute.Mandatory = $true
        $ParameterAttribute.HelpMessage = "Parameter Help."
        $AttributeCollection.Add($ParameterAttribute)
        $arrSet = (Get-Service).Name
        $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
        $AttributeCollection.Add($ValidateSetAttribute)
        $AttributeAlias = New-Object System.Management.Automation.AliasAttribute('s', 'Serv')
        $AttributeCollection.Add($AttributeAlias)
        $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName, [array], $AttributeCollection)
        $RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)
        return $RuntimeParameterDictionary
    }
    begin {
        $Service = $PsBoundParameters[$ParameterName]
    }
    process {
        $Service
    }
    end {
    }
}
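To make the repro concrete, this is roughly how it shows up at an interactive prompt (the service name below is just an example, and the comments describe the behaviour I'm seeing, not documented behaviour):
# Works as expected: -Service is offered by tab completion and binds fine
test-command -Name 'demo' -Service 'Spooler' -Password (Read-Host -AsSecureString)

# Problem case: once -Password has been named before -Service was used,
# the dynamic -Service parameter is no longer offered on that command line
test-command -Name 'demo' -Password (Read-Host -AsSecureString)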
Yes, I could specify the Service parameter first and then the remaining ones, but if there is an answer as to why it's behaving this way, I'd certainly like to know. Plus, if someone else uses my script, they wouldn't automatically know to do this. Is this a bug, or am I doing something wrong?
I am looking for a solution to parse an error response of a given web service.
The sample below works great in general, but if the response is larger than 64 KB then the content is not available in the exception at all.
I have seen some solutions recommending using HttpClient and increasing the MaxResponseContentBufferSize there, but how can I do this for a given WebClient object?
Is there any option to change that buffer size globally for all .NET web calls, like the TLS 1.2 setting below?
Here is my sample code:
# using Net.WebClient in order to pick up the individual user-side proxy settings:
$web = new-object Net.WebClient
[Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$url = "address to web-service"
try {
    $response = $web.DownloadString($url)
} catch [System.Net.WebException] {
    # this part needs to work even if the error response is larger than 64 KB
    # unfortunately the response object is empty in such a case
    $message = $_.Exception.Response
    $stream = $message.GetResponseStream()
    $reader = new-object System.IO.StreamReader($stream)
    $body = $reader.ReadToEnd()
    write-host "#error:$body"
}
I solved it in the end by switching to System.Net.Http.HttpClient.
That way I still respect any custom proxy settings and also avoid the above-mentioned 64 KB limit on error responses. Here is a sample of how to use it:
$url = "address to web-service"
$cred = Get-Credential
# define settings for the http-client:
Add-Type -AssemblyName System.Net.Http
$ignoreCerts = [System.Net.Http.HttpClientHandler]::DangerousAcceptAnyServerCertificateValidator
$handler = [System.Net.Http.HttpClientHandler]::new()
$handler.ServerCertificateCustomValidationCallback = $ignoreCerts
$handler.Credentials = $cred
$handler.PreAuthenticate = $true
$client = [System.Net.Http.HttpClient]::new($handler)
$client.Timeout = [System.TimeSpan]::FromSeconds(10)
$result = $client.GetAsync($url).result
$response = $result.Content.ReadAsStringAsync().Result
write-host $response
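One note on error handling with this approach: HttpClient does not throw for non-success status codes (unless EnsureSuccessStatusCode() is called), so the error body can simply be read from the same result object. A minimal sketch building on the variables above:
if (-not $result.IsSuccessStatusCode) {
    # $result and $response come from the sample above;
    # the full error body is available here even if it is larger than 64 KB
    write-host "#error:$([int]$result.StatusCode) $response"
}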
I'm trying to modify contacts in a Gmail account, using the .NET API, and loading it in PowerShell.
I'm following the steps described here (Updating contacts, Doh !)
To update a contact, first retrieve the contact entry, modify the data and send an authorized PUT request to the contact's edit URL with the modified contact entry in the body.
OK, got it, so I successfully retrieve contact information using this code:
$Settings = New-Object Google.GData.Client.RequestSettings( "MyApp", $username , $password )
$Credentials = New-Object System.Net.NetworkCredential( $username, $password )
$Request = New-Object Google.Contacts.ContactsRequest( $Settings )
$Contacts = $Request.GetContacts()
$GoogleContact = $Contacts.Entries |? { $_.PrimaryEmail.Address -eq "john.doe@gmail.com" }
$GoogleContact.Title
Of course, I got a mail message from Google indicating that an external app was blocked, and I changed a security setting to allow this code to work...
And it works: the code displays my Google contact's Title.
And now, my problem:
I'm changing a property on my Object:
$GoogleContact.Title = "Mac Gyver"
I'm using a function called Execute-HTTPPostCommand, slightly modified to add the ETag value, which Google requires to make sure I'm not modifying an entry that has been modified somewhere else in the meantime:
function Execute-HTTPPostCommand() {
    param(
        [string] $TargetUrl = $null,
        [string] $PostData = $null,
        $Credentials,
        $Etag
    )
    $ErrorActionPreference = "Stop"

    $global:webRequest = [System.Net.WebRequest]::Create($TargetUrl)
    $webRequest.Headers.Add("etag", $Etag)
    $webRequest.ContentType = "text/html"
    $PostStr = [System.Text.Encoding]::UTF8.GetBytes($PostData)
    $webrequest.ContentLength = $PostStr.Length
    $webRequest.ServicePoint.Expect100Continue = $false
    $webRequest.Credentials = $Credentials
    $webRequest.PreAuthenticate = $true
    $webRequest.Method = "PUT"

    $Global:requestStream = $webRequest.GetRequestStream()
    $requestStream.Write($PostStr, 0, $PostStr.Length)
    $requestStream.Close()

    [System.Net.WebResponse] $global:resp = $webRequest.GetResponse()
    $rs = $resp.GetResponseStream()
    [System.IO.StreamReader] $sr = New-Object System.IO.StreamReader -ArgumentList $rs
    [string] $results = $sr.ReadToEnd()
    return $results
}
And calling it this way:
Execute-HTTPPostCommand -TargetUrl $GoogleContact.Id -PostData $GoogleContact -Credentials $Credentials -Etag $GoogleContact.ETag
The Contact.Id value is the URL required by Google to update the contact; it looks like this: https://www.google.com/m8/feeds/contacts/userEmail/full/{contactId}
I'm getting an error 401: Unauthorized.
Being a Windows sysadmin, I'm not familiar with web-service PUT requests...
I'm using the same credentials to read the data and to try to update it.
What am I missing?
Ok, that was pretty easy, I should have RTFM ...
#Loading Google API
$Settings = New-Object Google.GData.Client.RequestSettings( "MyApp", $username , $password )
$Credentials = New-Object System.Net.NetworkCredential( $username, $password )
#Loading Contacts, and getting the one I want
$Request = New-Object Google.Contacts.ContactsRequest( $Settings )
$Contacts = $Request.GetContacts()
$GoogleContact = $Contacts.Entries |? { $_.PrimaryEmail.Address -eq "john.doe@gmail.com" }
#Updating the fields
$GoogleContact.Name.FullName = "Mac Gyver"
$GoogleContact.Name.GivenName = "Mac"
$GoogleContact.Name.FamilyName = "Gyver"
$GoogleContact.Title = "Handyman Masterchief"
#Update
$MAJ = $Request.Update($GoogleContact)
And it works as-is, just like the .NET example.
No need to build a heavy PUT request; the API does its job.
Sorry for the wasted time, guys!
I"m able to put a file up to a remote FTP with a modified version of...
$File = "D:\Dev\somefilename.zip"
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip"
"ftp url: $ftp"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
"Uploading $File..."
$webclient.UploadFile($uri, $File)
I'm running into the problem that when I try to upload a file to a directory that doesn't exist, the put fails. So I need to create the target directory first. Get-Member doesn't seem to show any methods I can invoke to create a directory, only file operations.
I use the function Create-FtpDirectory:
function Create-FtpDirectory {
    param(
        [Parameter(Mandatory=$true)]
        [string]
        $sourceuri,
        [Parameter(Mandatory=$true)]
        [string]
        $username,
        [Parameter(Mandatory=$true)]
        [string]
        $password
    )
    if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
    $ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri)
    $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
    $ftprequest.UseBinary = $true
    $ftprequest.Credentials = New-Object System.Net.NetworkCredential($username, $password)
    $response = $ftprequest.GetResponse()
    Write-Host "Create directory complete, status $($response.StatusDescription)"
    $response.Close()
}
Taken from Ftp.psm1, where you can also find other functions for FTP.
To others: sorry for not following the well-known Verb-Noun pattern. ;)
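As a rough example of how this might be combined with the upload from the question (the URL, credentials, and file name are placeholders, not from the original post):
$username = "username"
$password = "password"
$file     = "D:\Dev\somefilename.zip"

# Create the missing target directory first...
Create-FtpDirectory -sourceuri "ftp://example.com/pub/incoming/newdir" -username $username -password $password

# ...then upload into it
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$webclient.UploadFile("ftp://example.com/pub/incoming/newdir/somefilename.zip", $file)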
In C#, I might do something like this:
System.Net.WebClient w = new System.Net.WebClient();
w.Credentials = new System.Net.NetworkCredential(username, auth, domain);
string webpage = w.DownloadString(url);
Is there a PowerShell version of this, or should I just call through to the CLR?
The PowerShell version is almost exactly the same.
$webclient = new-object System.Net.WebClient
$webclient.Credentials = new-object System.Net.NetworkCredential($username, $password, $domain)
$webpage = $webclient.DownloadString($url)
For those who need PowerShell to return additional information like the HTTP status code, here's an example. Included are the two most likely ways to pass in credentials.
It's a slightly modified version of this SO answer:
How to obtain numeric HTTP status codes in PowerShell
$req = [System.Net.WebRequest]::Create($url)

# method 1:
# $req.UseDefaultCredentials = $true
# method 2:
# $req.Credentials = New-Object System.Net.NetworkCredential($username, $pwd, $domain)

try {
    $res = $req.GetResponse()
}
catch [System.Net.WebException] {
    $res = $_.Exception.Response
}

$int = [int]$res.StatusCode
$status = $res.StatusCode
return "$int $status"
In some cases, NTLM authentication still won't work even when given the correct credentials.
There's a mechanism that can block NTLM auth within WebClient; see here for more information: System.Net.WebClient doesn't work with Windows Authentication
If you're trying the above answer and it's still not working, follow that link to add the registry entry that whitelists the domain.
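If I recall the linked answer correctly, the registry change in question is the NTLM loopback-check whitelist; here's a hedged sketch of what that edit might look like in PowerShell (the host name is a placeholder, run it elevated, and verify against the linked answer before applying):
$key  = 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0'
$name = 'BackConnectionHostNames'
$site = 'intranet.example.com'   # hypothetical host name that fails NTLM auth

# Append the host name to the existing multi-string value (creating it if missing)
$existing = (Get-ItemProperty -Path $key -Name $name -ErrorAction SilentlyContinue).$name
$newValue = @($existing) + $site | Where-Object { $_ } | Select-Object -Unique
New-ItemProperty -Path $key -Name $name -PropertyType MultiString -Value $newValue -Force | Out-Null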
Posting this here to save others' time ;)