I need to pass Credentials to Powershell for CQD.
(CQD is the Office 365 "Call Quality Dashboard" for Teams).
I tried using the script from https://www.powershellgallery.com/packages/CQDPowerShell/2.0.0/Content/CQDPowerShell.psm1 or https://answers.microsoft.com/en-us/msteams/forum/all/need-connection-uri-for-creating-a-cqd-powershell/f3c9e340-bdac-47c9-af52-d47d6df49fca
But how do I bypass the login/password prompt and pass the credentials seamlessly to Powershell?
Here is my script:
function Get-CQDToken ([string]$client_id)
{
Add-Type -AssemblyName System.Web
$resourceUrl = $WebResource
$redirectUrl = "https://cqd.teams.microsoft.com/spd/"
$nonce = [guid]::NewGuid().GUID
$url = "https://login.microsoftonline.com/common/oauth2/authorize?response_type=token&redirect_uri=" +
[System.Web.HttpUtility]::UrlEncode($redirectUrl) +
"&client_id=$client_id" +
"&prompt=login" + "&nonce=$nonce" + "&resource=" + [System.Web.HttpUtility]::UrlEncode($WebResource)
Add-Type -AssemblyName System.Windows.Forms
$form = New-Object -TypeName System.Windows.Forms.Form -Property @{ Width = 440; Height = 640 }
$web = New-Object -TypeName System.Windows.Forms.WebBrowser -Property @{ Width = 420; Height = 600; Url = ($url) }
$DocComp = {
$Global:uri = $web.Url.AbsoluteUri
if ($Global:Uri -match "error=[^&]*|access_token=[^&]*") {$form.Close()}
}
$web.ScriptErrorsSuppressed = $true
$web.Add_DocumentCompleted($DocComp)
$form.Controls.Add($web)
$form.Add_Shown({$form.Activate()})
$form.ShowDialog() | Out-Null
$Script:TokenLifeTime = [Web.HttpUtility]::ParseQueryString(($web.Url -replace '^.*?(expires_in.+)$','$1'))['expires_in']
$Script:Token = [Web.HttpUtility]::ParseQueryString(($web.Url -replace '^.*?(access_token.+)$','$1'))['access_token']
return ('Bearer {0}' -f $Script:Token)
}
$userName = "aaron"
$password = "Password_1234"
$configRest = Invoke-RestMethod -Uri "https://cqd.teams.microsoft.com/repository/clientconfiguration" -Method Get -SessionVariable WebSession -UserAgent "CQDPowerShell V2.0"
$WebResource = $configRest.AuthLoginResource
$AADBearerToken = Get-CQDToken $configRest.AuthWebAppClientId
$WebSession.headers.Add('Authorization',$AADBearerToken)
Please go through this Microsoft docs thread:
https://techcommunity.microsoft.com/t5/teams-developer/how-to-make-source-code-from-cqdpowershell-cmdlet-acquiring/m-p/1325146
We need the code to run indefinitely without user interaction, but the access token is only valid for one hour, after which we have to interact again, so that approach won't work.
If they have their own data mart and want to build their own reporting UI and customized reports, they can create a subscription to the call records API (Working with the call records API in Microsoft Graph - Microsoft Graph v1.0 | Microsoft Docs) and use that to stream the call records into their own system.
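As a rough illustration of that subscription route, here is a minimal sketch that acquires an app-only token with the client credentials grant (no user prompt) and creates a call records subscription in Microsoft Graph. The tenant ID, client ID, secret and notification URL are placeholders for your own app registration, and it assumes the app has been granted the CallRecords.Read.All application permission:
# Placeholders - replace with your own app registration details
$tenantId     = "<tenant-id>"
$clientId     = "<app-client-id>"
$clientSecret = "<app-client-secret>"
# Acquire an app-only token (client credentials grant, can be repeated by a scheduled job)
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://graph.microsoft.com/.default"
}
# Subscribe to call records so new records are pushed to your own endpoint
$subscription = @{
    changeType         = "created"
    notificationUrl    = "https://example.com/api/callrecord-webhook"
    resource           = "communications/callRecords"
    expirationDateTime = (Get-Date).AddHours(1).ToUniversalTime().ToString("o")
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/subscriptions" -Headers @{ Authorization = "Bearer $($token.access_token)" } -ContentType "application/json" -Body $subscription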
I am looking for a solution to parse an error-response of a given web-service.
The sample below works great in general, but if the response is larger than 64 KB then the content is not available in the exception at all.
I have seen some solutions that recommend using HttpClient and increasing MaxResponseContentBufferSize, but how can I do this for a given WebClient object?
Is there any option to change that buffer size globally for all .NET web calls, like the TLS 1.2 setting below?
Here is my sample-code:
# using net-webclient to use individual user-side proxy-settings:
$web = new-object Net.WebClient
[Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$url = "address to web-service"
try {
$response = $web.DownloadString($url)
} catch [System.Net.WebException] {
# this part needs to work even if the error response is larger than 64 KB
# unfortunately the response object is empty in that case
$message = $_.Exception.Response
$stream = $message.GetResponseStream()
$reader = new-object System.IO.StreamReader ($stream)
$body = $reader.ReadToEnd()
write-host "#error:$body"
}
I solved it in the end by switching to System.Net.Http.HttpClient.
That way I still respect any custom proxy settings and also avoid the above-mentioned 64 KB limit in any error response. Here is a sample of how to use it:
$url = "address to web-service"
$cred = Get-Credential
# define settings for the http-client:
Add-Type -AssemblyName System.Net.Http
$ignoreCerts = [System.Net.Http.HttpClientHandler]::DangerousAcceptAnyServerCertificateValidator
$handler = [System.Net.Http.HttpClientHandler]::new()
$handler.ServerCertificateCustomValidationCallback = $ignoreCerts
$handler.Credentials = $cred
$handler.PreAuthenticate = $true
$client = [System.Net.Http.HttpClient]::new($handler)
$client.Timeout = [System.TimeSpan]::FromSeconds(10)
$result = $client.GetAsync($url).result
$response = $result.Content.ReadAsStringAsync().Result
write-host $response
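As a follow-up for the original problem: GetAsync does not throw on an HTTP error status, so with HttpClient the error body stays readable regardless of its size. A minimal sketch, reusing the $client and $url from above:
$result = $client.GetAsync($url).Result
$body = $result.Content.ReadAsStringAsync().Result
if (-not $result.IsSuccessStatusCode) {
    # the full error payload is available here, no 64 KB limit
    write-host "#error($([int]$result.StatusCode)):$body"
} else {
    write-host $body
}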
I'm trying to get a session cookie using PowerShell and InternetExplorer.Application but nothing seems to work.
There is no $ie.Document.cookie variable.
The session cookie is not available to JavaScript (because it is HTTP-only).
# Create an ie com object
$ie = New-Object -ComObject "InternetExplorer.Application"
$ie.visible = $true;
$ie.navigate2("https://www.example.com/login");
# Wait for the page to load
while ($ie.Busy -eq $true) { Start-Sleep -Milliseconds 1000; }
#Add login details
$ie.Document.getElementById("username").value = "user";
$ie.Document.getElementById("password").value = "1234";
$ie.Document.getElementsByName("login_button")[0].Click();
while($ie.Busy -eq $true) { Start-Sleep -Milliseconds 1000; }
$div = $ie.Document.getElementsByTagName('div')[0];
$ie.navigate("javascript: window.location=document.cookie.match(new RegExp('(^| )csrf_token=([^;]+)'))[2]");
while($ie.Busy -eq $true) { Start-Sleep -Milliseconds 1000; }
$csrf = $ie.LocationUrl.Substring(32);
echo $csrf;
#Stop-Process -Name iexplore
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$cookie = New-Object System.Net.Cookie
$cookie.Name = "user_name"
$cookie.Value = "user"
$cookie.Domain = "www.example.com"
$session.Cookies.Add($cookie);
$cookie = New-Object System.Net.Cookie
$cookie.Name = "user_session_id"
$cookie.Value = "What I need"
$cookie.Domain = "www.example.com"
$session.Cookies.Add($cookie);
Invoke-WebRequest -URI "https://www.example.com/demo/my_file&csrf_token=$csrf" -WebSession $session -OutFile 'finally.zip';
echo 'Done!';
Note that the only way I found to get the CSRF token is to use JavaScript to put the value into the URL, but I can't do the same for the user_session_id because it is marked as HTTP-only.
Take a look at these options to incorporate into what you already have.
First, get the cookies (the example below reads them from a tab-separated cookie.txt export):
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
Get-Content .\cookie.txt |
foreach {
$line = $_ -split '/' | select -First 1
$tokens=$line.Split("`t").TrimEnd()
$c = @{
name=$tokens[0]
value=$tokens[1]
domain=$tokens[2]
}
$cookie = New-Object System.Net.Cookie
$cookie.Name=$c.name
$cookie.Value=$c.Value
$cookie.Domain=$c.domain
$session.Cookies.Add($cookie)
}
Getting Cookies using PowerShell
Here are two straightforward ways to get website cookies within PowerShell.
$url = "https://www.linkedin.com"
$webrequest = Invoke-WebRequest -Uri $url -SessionVariable websession
$cookies = $websession.Cookies.GetCookies($url)
# Here, you can output all of $cookies, or you can go through them one by one.
foreach ($cookie in $cookies) {
# You can get cookie specifics, or just use $cookie
# This gets each cookie's name and value
Write-Host "$($cookie.name) = $($cookie.value)"
}
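If the site accepts a plain form POST, another option is to perform the login itself with Invoke-WebRequest, so that the whole session, HTTP-only cookies included, ends up in a WebRequestSession object you can reuse for the download. A minimal sketch, assuming the login form simply posts "username"/"password" back to /login (the URL and field names are taken from the question and may need adjusting):
$body = @{ username = "user"; password = "1234" }
$null = Invoke-WebRequest -Uri "https://www.example.com/login" -Method Post -Body $body -SessionVariable session
# Every cookie the server set during login, HTTP-only ones included, is now in the session
$cookies = $session.Cookies.GetCookies("https://www.example.com")
$cookies | ForEach-Object { Write-Host "$($_.Name) = $($_.Value) (HttpOnly: $($_.HttpOnly))" }
# The csrf_token cookie can be read directly instead of via the JavaScript trick
$csrf = ($cookies | Where-Object { $_.Name -eq 'csrf_token' }).Value
# Reuse the session for the download, no manual cookie copying needed
Invoke-WebRequest -Uri "https://www.example.com/demo/my_file&csrf_token=$csrf" -WebSession $session -OutFile 'finally.zip'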
I have read over Stack Overflow for the answer, but still can't find what is causing this.
I'm trying to connect and send a POST request using PowerShell, and I'm getting an "unable to connect to remote server" error. I have tried 3 different dummy servers, such as http://posttestserver.com/post.php
Script:
Get-ExecutionPolicy
[string]$url = "http://posttestserver.com/post.php"
function Execute-HTTPPostCommand()
{
param([string]$target = $null)
#$username = "administrator"
#$password = "mypass"
$webRequest = [System.Net.WebRequest]::Create($target)
$webRequest.ContentType = "text/html"
$post = "abcdefg"
$PostStr = [System.Text.Encoding]::UTF8.GetBytes($Post)
$webrequest.ContentLength = $PostStr.Length
$webRequest.ServicePoint.Expect100Continue = $false
#$webRequest.Credentials = New-Object System.Net.NetworkCredential -ArgumentList $username, $password
#$webRequest.PreAuthenticate = $true
$webRequest.Method = "POST"
try
{
$requestStream = $webRequest.GetRequestStream()
}
catch
{
write-host $_.Exception
}
$requestStream.Write($PostStr, 0,$PostStr.length)
$requestStream.Close()
[System.Net.WebResponse]$resp = $webRequest.GetResponse();
$rs = $resp.GetResponseStream();
[System.IO.StreamReader]$sr = New-Object System.IO.StreamReader -argumentList $rs;
[string]$results = $sr.ReadToEnd();
return $results;
}
Execute-HTTPPostCommand $url
[System.GC]::Collect()
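"Unable to connect to remote server" usually means the request never reached the host at all (the test endpoint is down, DNS fails, or a firewall/corporate proxy is in the way) rather than a problem in the POST code itself. As a quick diagnostic sketch, you could try the same POST with Invoke-WebRequest routed through the system-configured proxy with your current credentials (the URL is just the test server from the question):
$url = "http://posttestserver.com/post.php"
# route the call through the system proxy with the current user's credentials
$proxy = [System.Net.WebRequest]::GetSystemWebProxy()
$proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
[System.Net.WebRequest]::DefaultWebProxy = $proxy
try {
    $response = Invoke-WebRequest -Uri $url -Method Post -Body "abcdefg" -ContentType "text/html"
    Write-Host "Status: $($response.StatusCode)"
}
catch {
    # if this also fails, the problem is connectivity (proxy/firewall/DNS), not the WebRequest code above
    Write-Host $_.Exception.Message
}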
I want to use PowerShell to transfer files with FTP to an anonymous FTP server. I don't want to use any extra packages. How?
I am not sure you can 100% bulletproof the script against hanging or crashing, as there are things outside your control (what if the server loses power mid-upload?), but this should provide a solid foundation for getting you started:
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]::Create("ftp://localhost/me.png")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("anonymous","anonymous@localhost")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("C:\me.png")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
There are some other ways too. I have used the following script:
$File = "D:\Dev\somefilename.zip";
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip";
Write-Host -Object "ftp url: $ftp";
$webclient = New-Object -TypeName System.Net.WebClient;
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp;
Write-Host -Object "Uploading $File...";
$webclient.UploadFile($uri, $File);
And you could run a script against the windows FTP command line utility using the following command
ftp -s:script.txt
(Check out this article)
The following question on SO also answers this: How to script FTP upload and download?
I'm not gonna claim that this is more elegant than the highest-voted solution...but this is cool (well, at least in my mind LOL) in its own way:
$server = "ftp.lolcats.com"
$filelist = "file1.txt file2.txt"
"open $server
user $user $password
binary
cd $dir
" +
($filelist.split(' ') | %{ "put ""$_""`n" }) | ftp -i -in
As you can see, it uses that dinky built-in windows FTP client. Much shorter and straightforward, too. Yes, I've actually used this and it works!
Easiest way
The most trivial way to upload a binary file to an FTP server using PowerShell is using WebClient.UploadFile:
$client = New-Object System.Net.WebClient
$client.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$client.UploadFile(
"ftp://ftp.example.com/remote/path/file.zip", "C:\local\path\file.zip")
Advanced options
If you need greater control that WebClient does not offer (like TLS/SSL encryption, etc.), use FtpWebRequest. An easy way is to just copy a FileStream to the FTP stream using Stream.CopyTo:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$fileStream.CopyTo($ftpStream)
$ftpStream.Dispose()
$fileStream.Dispose()
Progress monitoring
If you need to monitor an upload progress, you have to copy the contents by chunks yourself:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$buffer = New-Object Byte[] 10240
while (($read = $fileStream.Read($buffer, 0, $buffer.Length)) -gt 0)
{
$ftpStream.Write($buffer, 0, $read)
$pct = ($fileStream.Position / $fileStream.Length)
Write-Progress `
-Activity "Uploading" -Status ("{0:P0} complete:" -f $pct) `
-PercentComplete ($pct * 100)
}
$ftpStream.Dispose()
$fileStream.Dispose()
Uploading folder
If you want to upload all files from a folder, see
PowerShell Script to upload an entire folder to FTP
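If you only need a quick, non-recursive version, here is a minimal sketch along the same WebClient lines (paths and credentials are placeholders):
# Upload every file from a local folder (non-recursive)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("username", "password")
$localFolder = "C:\local\path"
$remoteBase  = "ftp://ftp.example.com/remote/path"
Get-ChildItem $localFolder -File | ForEach-Object {
    $client.UploadFile("$remoteBase/$($_.Name)", $_.FullName)
}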
I recently wrote several PowerShell functions for communicating with FTP, see https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1. With the second function below, you can send a whole local folder to FTP. The module even has functions for removing/adding/reading folders and files recursively.
#Add-FtpFile -ftpFilePath "ftp://myHost.com/folder/somewhere/uploaded.txt" -localFile "C:\temp\file.txt" -userName "User" -password "pw"
function Add-FtpFile($ftpFilePath, $localFile, $username, $password) {
$ftprequest = New-FtpRequest -sourceUri $ftpFilePath -method ([System.Net.WebRequestMethods+Ftp]::UploadFile) -username $username -password $password
Write-Host "$($ftpRequest.Method) for '$($ftpRequest.RequestUri)' complete'"
$content = [System.IO.File]::ReadAllBytes($localFile)
$ftprequest.ContentLength = $content.Length
$requestStream = $ftprequest.GetRequestStream()
$requestStream.Write($content, 0, $content.Length)
$requestStream.Close()
$requestStream.Dispose()
}
#Add-FtpFolderWithFiles -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/somewhere/" -userName "User" -password "pw"
function Add-FtpFolderWithFiles($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpDirectory $destinationFolder $userName $password
$files = Get-ChildItem $sourceFolder -File
foreach($file in $files) {
$uploadUrl ="$destinationFolder/$($file.Name)"
Add-FtpFile -ftpFilePath $uploadUrl -localFile $file.FullName -username $userName -password $password
}
}
#Add-FtpFolderWithFilesRecursive -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/" -userName "User" -password "pw"
function Add-FtpFolderWithFilesRecursive($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpFolderWithFiles -sourceFolder $sourceFolder -destinationFolder $destinationFolder -userName $userName -password $password
$subDirectories = Get-ChildItem $sourceFolder -Directory
$fromUri = new-object System.Uri($sourceFolder)
foreach($subDirectory in $subDirectories) {
$toUri = new-object System.Uri($subDirectory.FullName)
$relativeUrl = $fromUri.MakeRelativeUri($toUri)
$relativePath = [System.Uri]::UnescapeDataString($relativeUrl.ToString())
$lastFolder = $relativePath.Substring($relativePath.LastIndexOf("/")+1)
Add-FtpFolderWithFilesRecursive -sourceFolder $subDirectory.FullName -destinationFolder "$destinationFolder/$lastFolder" -userName $userName -password $password
}
}
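The functions above rely on New-FtpRequest and Add-FtpDirectory from the linked module, which are not shown here. The real implementations live in the GitHub repository; purely as a sketch so the snippet can run stand-alone, they could look roughly like this:
function New-FtpRequest($sourceUri, $method, $username, $password) {
    # Build a configured FtpWebRequest for the given URI and FTP method
    $request = [System.Net.FtpWebRequest]::Create($sourceUri)
    $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
    $request.Method = $method
    $request.UseBinary = $true
    $request.UsePassive = $true
    return $request
}
function Add-FtpDirectory($ftpFolderPath, $userName, $password) {
    # Create the remote folder; ignore the error if it already exists
    try {
        $request = New-FtpRequest -sourceUri $ftpFolderPath -method ([System.Net.WebRequestMethods+Ftp]::MakeDirectory) -username $userName -password $password
        $request.GetResponse().Close()
    } catch [System.Net.WebException] {
        Write-Host "Could not create '$ftpFolderPath' (it may already exist)"
    }
}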
Here's my super cool version BECAUSE IT HAS A PROGRESS BAR :-)
Which is a completely useless feature, I know, but it still looks cool \m/ \m/
$webclient = New-Object System.Net.WebClient
Register-ObjectEvent -InputObject $webclient -EventName "UploadProgressChanged" -Action { Write-Progress -Activity "Upload progress..." -Status "Uploading" -PercentComplete $EventArgs.ProgressPercentage } > $null
$File = "filename.zip"
$ftp = "ftp://user:password#server/filename.zip"
$uri = New-Object System.Uri($ftp)
try{
$webclient.UploadFileAsync($uri, $File)
}
catch [Net.WebException]
{
Write-Host $_.Exception.ToString() -foregroundcolor red
}
while ($webclient.IsBusy) { continue }
PS. It helps a lot when I'm wondering "did it stop working, or is it just my slow ADSL connection?"
You can simply handle file uploads through PowerShell, like this.
The complete project is available on GitHub: https://github.com/edouardkombo/PowerShellFtp
#Directory where to find pictures to upload
$Dir= 'c:\fff\medias\'
#Directory where to save uploaded pictures
$saveDir = 'c:\fff\save\'
#ftp server params
$ftp = 'ftp://10.0.1.11:21/'
$user = 'user'
$pass = 'pass'
#Connect to ftp webclient
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Initialize var for infinite loop
$i=0
#Infinite loop
while($i -eq 0){
#Pause 1 second before continuing
Start-Sleep -sec 1
#Search for pictures in directory
foreach($item in (dir $Dir "*.jpg"))
{
#Set default network status to 1
$onNetwork = "1"
#Get picture creation dateTime...
$pictureDateTime = (Get-ChildItem $item.fullName).CreationTime
#Convert dateTime to timeStamp
$pictureTimeStamp = (Get-Date $pictureDateTime).ToFileTime()
#Get actual timeStamp
$timeStamp = (Get-Date).ToFileTime()
#Get picture lifeTime
$pictureLifeTime = $timeStamp - $pictureTimeStamp
#We only treat pictures that are fully written to the disk
#So, we add a 2 second delay to ensure even big pictures have been fully written to the disk
if($pictureLifeTime -gt "2") {
#If upload fails, we set network status at 0
try{
$uri = New-Object System.Uri($ftp+$item.Name)
$webclient.UploadFile($uri, $item.FullName)
} catch [Exception] {
$onNetwork = "0"
write-host $_.Exception.Message;
}
#If upload succeeded, we do further actions
if($onNetwork -eq "1"){
"Copying $item..."
Copy-Item -path $item.fullName -destination $saveDir$item
"Deleting $item..."
Remove-Item $item.fullName
}
}
}
}
You can use this function:
function SendByFTP {
param (
$userFTP = "anonymous",
$passFTP = "anonymous",
[Parameter(Mandatory=$True)]$serverFTP,
[Parameter(Mandatory=$True)]$localFile,
[Parameter(Mandatory=$True)]$remotePath
)
if(Test-Path $localFile){
$remoteFile = $localFile.Split("\")[-1]
# build the remote path with a forward slash (Join-Path would insert a backslash)
$remotePath = $remotePath.TrimEnd('/') + "/" + $remoteFile
$ftpAddr = "ftp://${userFTP}:${passFTP}@${serverFTP}/$remotePath"
$browser = New-Object System.Net.WebClient
$url = New-Object System.Uri($ftpAddr)
$browser.UploadFile($url, $localFile)
}
else{
Return "Unable to find $localFile"
}
}
This function sends the specified file by FTP.
You must call the function with these parameters:
userFTP = "anonymous" by default or your username
passFTP = "anonymous" by default or your password
serverFTP = IP address of the FTP server
localFile = File to send
remotePath = the path on the FTP server
For example :
SendByFTP -userFTP "USERNAME" -passFTP "PASSWORD" -serverFTP "MYSERVER" -localFile "toto.zip" -remotePath "path/on/the/FTP/"
Goyuix's solution works great, but as presented it gives me this error: "The requested FTP command is not supported when using HTTP proxy."
Adding this line after $ftp.UsePassive = $true fixed the problem for me:
$ftp.Proxy = $null;
A simple solution, if you can install curl:
curl.exe -p --insecure "ftp://<ftp_server>" --user "user:password" -T "local_file_full_path"