Test WebClient API call - powershell

I have the following PowerShell script that makes a REST API GET call:
$FullURL = "http://test.net/config/server/$($env:COMPUTERNAME)?format=test"
$API = New-Object System.Net.WebClient
$APIData = $API.DownloadString($FullURL)
Set-Content -Value $APIData -Path $APIDataFile -Force
The call depends on the local hostname in the URI. It gets the data and exports it to a text file as a backup. The problem: the API host may be down, or there may be no information available for the host, which will cause all sorts of errors in the script (this is a small part of the script, as it adds the data to an XML file).
How do I add logic to the script to test the API call first and, if it succeeds, continue with making the call?
It must work in PowerShell 2.0 because of Windows 2003 hosts. The API returns a 404 error code if there is no data.

You need to use a try/catch block. WebClient should raise an exception when the download isn't successful. Try something like:
$FullURL = "http://test.net/config/server/$($env:COMPUTERNAME)?format=test"
try {
$API = New-Object System.Net.WebClient
$APIData = $API.DownloadString($FullURL)
Set-Content -Value $APIData -Path $APIDataFile -Force
}
catch [Net.WebException] {
# Do whatever you want if an exception is raised
}
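If you need to tell a dead host apart from the 404 the question mentions, you can inspect the caught exception's response. A PowerShell 2.0-compatible sketch (assuming, as in the question, that $APIDataFile is defined elsewhere in the script):
$FullURL = "http://test.net/config/server/$($env:COMPUTERNAME)?format=test"
try {
    $API = New-Object System.Net.WebClient
    $APIData = $API.DownloadString($FullURL)
    Set-Content -Value $APIData -Path $APIDataFile -Force
}
catch [Net.WebException] {
    # The WebException may arrive wrapped in a MethodInvocationException, so unwrap it
    $ex = $_.Exception
    if ($ex.InnerException -is [Net.WebException]) { $ex = $ex.InnerException }
    # A 404 response means no data for this host; no response at all suggests the API host is down
    if ($ex.Response -ne $null -and [int]$ex.Response.StatusCode -eq 404) {
        Write-Host "No data available for $env:COMPUTERNAME (404); skipping."
    }
    else {
        Write-Host "API host unreachable: $($ex.Message)"
    }
}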

Related

Calling Graph API with powershell or batch

In trying to design a simplified script for use with the Office 365 Graph API, I can't seem to find any straightforward way to call it.
For my intended use I really don't want to take the time to build and compile an actual program when everything else can be done from PowerShell or a batch script.
Specifically, I only want to be able to call the Graph API for a list of groups and store the result (in an array or text file). Is it possible to call the Graph API from PowerShell or the command line, and if so, how?
Specifically, I only want to be able to call the Graph API for a list of groups and store the result (in an array or text file).
If you just need to export a list of groups, I suggest using the Azure Active Directory PowerShell module:
$msolcred = get-credential
connect-msolservice -credential $msolcred
Get-MsolGroup | Out-File C:\Workbench\temp\tests\export.txt
Is it possible to call the graph API from powershell or command line and if so, how?
Yes, it is possible, by calling the REST API directly:
First, you need to obtain an access token.
Then, use Invoke-RestMethod to call the Graph API:
Invoke-RestMethod -Uri $uri -Headers @{Authorization = "Bearer {your_access_token}"}
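For the token step, here is a hedged sketch of the client-credentials flow against the Azure AD v1.0 endpoint (the tenant name, app id, and secret are placeholders you must supply from your own app registration):
$tenant = "yourtenant.onmicrosoft.com"
$body = @{
    grant_type    = "client_credentials"
    client_id     = "your_app_id"
    client_secret = "your_app_secret"
    resource      = "https://graph.microsoft.com"
}
# Request a token, then use it to list groups
$token = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenant/oauth2/token" -Body $body).access_token
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/groups" -Headers @{Authorization = "Bearer $token"}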
You can use the PSMSGraph module for this. It can be downloaded from the PowerShell Gallery.
You must register an application in Azure to authenticate and delegate the necessary rights to your app. You can do this in the app registration portal.
Once this is done you just need to authenticate and run your request.
When running the code you will have to provide a credential to authorize.
$username = 'entertheappidhere'
$password = 'entertheapppaswordhere' | ConvertTo-SecureString -AsPlainText -Force
$ClientCredential = New-Object -TypeName System.Management.Automation.PSCredential($username, $password)
$GraphAppParams = @{}
$GraphAppParams.Add('Name','Office365TenantMigration')
$GraphAppParams.Add('ClientCredential',$ClientCredential)
$GraphAppParams.Add('RedirectUri','https://localhost/')
$GraphAppParams.Add('Tenant','yourtenant.onmicrosoft.com')
$GraphApp = New-GraphApplication @GraphAppParams
# This will prompt you to log in with your O365/Azure credentials.
$AuthCode = $GraphApp | Get-GraphOauthAuthorizationCode
$GraphAccessToken = $AuthCode | Get-GraphOauthAccessToken -Resource 'https://graph.microsoft.com/'
$GraphAccessToken | Export-GraphOAuthAccessToken -Path 'f:\O365Report\AccessToken.XML'
$GraphAccessToken = Import-GraphOAuthAccessToken -Path 'f:\O365Report\AccessToken.XML'
$GraphAccessToken | Update-GraphOAuthAccessToken -Force
### Run the query
Invoke-GraphRequest -Uri "https://graph.microsoft.com/v1.0/groups" -Method GET -AccessToken $GraphAccessToken

PowerShell: how to move a file from a remote computer to a network share

I am using PowerShell to move a file from a network share to a remote computer (which I do not have admin rights on). The source path on the network share is \\share_computer\some_folder\file1.txt. The destination path to the file on the remote computer is \\remote_computer\d$\another_folder.
A simple Move-Item $from $to doesn't work. I get a PermissionDenied message when I try to access the network share. However, I have confirmed that I can access the shared file via something like
$data = Get-Content "\\share_computer\some_folder\file1.txt"
$var = $data[0]
I then tried the following:
$src = "\\share_computer\some_folder\file1.txt"
$dest = "\\remote_computer\d$\another_folder"
$username = "my_username"
$password = "my_password"
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$WebClient.DownloadFile($src, $dest)
PowerShell is throwing the following error:
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
I don't know why PowerShell is throwing this error. Assuming the above is the correct technique to move the file, what do I need to do to correct it? Or, if the above is the incorrect technique, what should I do?
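One thing worth noting: WebClient.DownloadFile() expects the destination to be a full file path including the file name, while $dest above is a directory, and WebClient is primarily meant for URL downloads rather than share-to-share moves. A hedged alternative sketch (assumes PowerShell 3.0 or later, where New-PSDrive accepts -Credential for the FileSystem provider; paths are taken from the question):
$src  = "\\share_computer\some_folder\file1.txt"
$cred = Get-Credential   # an account with rights on remote_computer
# Map the destination share with explicit credentials, then move the file
New-PSDrive -Name Dest -PSProvider FileSystem -Root "\\remote_computer\d$\another_folder" -Credential $cred | Out-Null
Move-Item -Path $src -Destination "Dest:\file1.txt"
Remove-PSDrive -Name Dest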

Cannot open sharepoint UNC path unless already opened through Windows Explorer

I'm hoping somebody can shed light on this, because it has been driving me to distraction.
I have a script which saves the reports it creates to a SharePoint document library via a UNC path, if that path exists; otherwise it saves to the UNC path of a network drive location as a fallback.
I've noticed that checking with Test-Path, saving (through an MS Excel COM object), or trying to open the folder in Windows Explorer using Invoke-Item only work if I have already accessed the SharePoint site (via web browser or Windows Explorer) since the PC last logged on (I'm running Windows 7 Enterprise Service Pack 1, 64-bit edition).
If I haven't yet been on to SharePoint manually since last logon, Test-Path returns false, and the other methods cause an ItemNotFoundException, e.g.
ii : Cannot find path '\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Documents\Reports' because it does not exist.
At line:1 char:1
+ ii '\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Document ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (\\uk.sharepoint...\Reports:String) [Invoke-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.InvokeItemCommand
Example areas of code:
$LANPath = "\\myserver\myshare\teamdirs\scriptdir"
$SharepointPath = "\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Documents\Reports"
$ScriptPath = $LANPath + "\bin"
If (Test-Path $SharepointPath) {$BasePath = $SharepointPath;write-host "Using sharepoint to save reports"} else {$BasePath = "$LANPath\Reports";write-host "Using LAN to save reports - sharepoint not accessible"}
and
$_|select -expandproperty HTMLBody | Out-File $($BasePath + "\Eml_body.html")
Write-Host "Reformating HTML"
$html = New-Object -ComObject "HTMLFile";
$source = Get-Content -Path ($BasePath + "\Eml_body.html") -Raw;
and when saving the excel spreadsheet from within my COM object:
$workbook._SaveAs($fileout,[Microsoft.Office.Interop.Excel.XlFileFormat]::xlOpenXMLWorkbook,$Missing,$Missing,$false,$false,[Microsoft.Office.Interop.Excel.XlSaveAsAccessMode]::xlNoChange,[Microsoft.Office.Interop.Excel.XlSaveConflictResolution]::xlLocalSessionChanges,$true,$Missing,$Missing)
You should be able to use a System.Net.WebClient object to access SharePoint file locations.
$client = New-Object System.Net.WebClient
The documentation for the WebClient.Credentials property suggests that the default credentials in this case may be for the ASP.NET server-side process rather than the current user's credentials:
If the WebClient class is being used in a middle tier application, such as an ASP.NET application, the DefaultCredentials belong to the account running the ASP page (the server-side credentials). Typically, you would set this property to the credentials of the client on whose behalf the request is made.
You therefore may want to set the credentials manually. You can plug them in as plain text...
$client.Credentials = New-Object System.Net.NetworkCredential("username","pswd","domain")
...or you could prompt the current user for their credentials.
$client.Credentials = Get-Credential
Here's an example that grabs a file and writes its content to the screen:
$client = New-Object System.Net.WebClient
$client.Credentials = Get-Credential
$data = $client.OpenRead("http://yoursharepointurl.com/library/document.txt")
$reader = New-Object System.IO.StreamReader($data)
$results = $reader.ReadToEnd()
Write-Host $results
$data.Close()
$reader.Close()
I know this is an old thread but for those searching, check out this link: https://www.myotherpcisacloud.com/post/Sometimes-I-Can-Access-the-WebDAV-Share-Sometimes-I-Cant!
Because SharePoint exposes its shares over WebDav, you need to ensure the WebClient service is running on the machine from which you are accessing the path. Browsing the path in explorer automatically fires up the service, while command-line methods do not.
If you change the startup type of WebClient to Automatic, it should resolve the issue.
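A minimal sketch of that change (assumes you have local admin rights; the UNC path is the one from the question):
Set-Service -Name WebClient -StartupType Automatic
Start-Service -Name WebClient
# The SharePoint UNC path should now resolve without touching Explorer first
Test-Path "\\uk.sharepoint.mydomain.local\sites\mycompany\myteam\Shared Documents\Reports"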

webget in Powershell

Is there any command equivalent to webget in Windows PowerShell?
I am trying to create a script to download all publicly available files from a website. I am writing a custom script because I need to store the files in a specific directory structure (depending on name, type, and size); a sketch of that filing logic follows.
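A minimal sketch of one way to sort downloads into such a structure (the URL, root folder, and size threshold are all assumptions for illustration):
$url = "http://example.com/files/report.pdf"
$tmp = Join-Path $env:TEMP (Split-Path $url -Leaf)
(New-Object System.Net.WebClient).DownloadFile($url, $tmp)
$item = Get-Item $tmp
# One folder per extension, with large files kept apart from small ones
$sizeClass = if ($item.Length -gt 10MB) { "large" } else { "small" }
$dest = Join-Path "C:\downloads" (Join-Path $item.Extension.TrimStart('.') $sizeClass)
New-Item -ItemType Directory -Path $dest -Force | Out-Null
Move-Item -Path $tmp -Destination $dest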
In PowerShell v2, use a WebClient:
(New-Object System.Net.WebClient).DownloadFile($url, $localFileName)
In v3, use the Invoke-WebRequest cmdlet:
Invoke-WebRequest -Uri $url -OutFile $localFileName
Another option is with the Start-BitsTransfer cmdlet:
Start-BitsTransfer -Source $source -Destination $destination
In PowerShell V3, you can use the new cmdlet Invoke-WebRequest to send an http or https request to a web site/service e.g.:
$r = Invoke-WebRequest -URI http://www.bing.com?q=how+many+feet+in+a+mile
However to specifically download a file it is probably easiest to use the .NET API WebClient.DownloadFile() e.g.:
$remoteUri = "http://upload.wikimedia.org/wikipedia/commons/6/63/Wikipedia-logo.png"
$fileName = "$pwd\logo.png"
$webClient = new-object System.Net.WebClient
$webClient.DownloadFile($remoteUri, $fileName)
You can use the .NET class WebClient to download files:
PS > $source = "http://www.unsite.fr/untruc.zip"
PS > $destination = "c:\temp\untruc.zip"
PS >
PS >$wc = New-Object System.Net.WebClient
PS >$wc.DownloadFile($source, $destination)
If you prefer a "native" PowerShell cmdlet that works in PowerShell V2 or V3, I recommend Get-HttpResource from the PowerShell Community Extensions (PSCX). While PSCX surprisingly does not have its API documentation available online (you have to install the extensions, then use the normal PowerShell help to explore each command), I managed to find the API for Get-HttpResource here. Using the cmdlet can be as simple as this:
$myPage = Get-HttpResource http://blogs.msdn.com/powershell
However, there are a variety of parameters to the cmdlet that let you specify media type, credentials, encoding, proxy, user agent, and more.

Download URL content using PowerShell

I am working on a script where I am able to browse the web content at a URL, but I am not able to copy the web content and download it as a file.
This is what I have made so far:
$url = "http://sp-fin/sites/arindam-sites/_layouts/xlviewer.aspx?listguid={05DA1D91-F934-4419-8AEF-B297DB81A31D}&itemid=4&DefaultItemOpen=1"
$ie=new-object -com internetexplorer.application
$ie.visible=$true
$ie.navigate($url)
while($ie.busy) {start-sleep 1}
How can I copy the content of $url and save it to local drive as a file?
Update:
I got these errors:
Exception calling "DownloadFile" with "2" argument(s): "The remote server returned an error: (401) Unauthorized." At :line:6 char:47 + (New-Object system.net.webclient).DownloadFile( <<<< "$url/download-url-content", 'save.html' )
Missing ')' in method call. At :line:6 char:68 + (New-Object system.net.webclient).DownloadFile( "$url", 'save.html' <<<<
Exception calling "DownloadFile" with "2" argument(s): "The remote server returned an error: (401) Unauthorized." At :line:6 char:47 + (New-Object system.net.webclient).DownloadFile( <<<< "$url", 'save.html' )
OK, let me explain more about what I am trying to do: I have an Excel file on our SharePoint site, and this is the file I am trying to download locally (in any format). This is one part of the script; in the later part I compare this file with other data to produce an output.
Now, if I can somehow map "my documents" from the site and download the file that way, that will also work for me.
Update Jan 2014: With PowerShell v3, released with Windows 8, you can do this:
(Invoke-webrequest -URI "http://www.kernel.org").Content
Original post, valid for PowerShell version 2
This solution is very similar to the other answers from stej, Jay Bazusi and Marco Shaw.
It is a bit more general: it installs a new module, psurl, into your module directory. The psurl module adds new commands in case you have to do a lot of HTML fetching (and POSTing) with PowerShell.
(new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex
See the homepage of the code-sharing website http://psget.net/.
This nice line of PowerShell script will download GetPsGet.ps1 and send it to Invoke-Expression to install the PsGet module.
Then install PsUrl, a PowerShell module inspired by curl. To install something (in our case PsUrl) from the central directory, just type:
install-module PsUrl
get-module -name psurl
Output:
ModuleType Name ExportedCommands
---------- ---- ----------------
Script psurl {Get-Url, Send-WebContent, Write-Url, Get-WebContent}
Command:
get-command -module psurl
Output:
CommandType Name Definition
----------- ---- ----------
Function Get-Url ...
Function Get-WebContent ...
Alias gwc Get-WebContent
Function Send-WebContent ...
Alias swc Send-WebContent
Function Write-Url ...
You need to do this only once.
Note that this error might occur:
Q: Error "File xxx cannot be loaded because the execution of scripts is disabled on this system. Please see "get-help about_signing" for more details."
A: By default, PowerShell restricts execution of all scripts. This is all about security. To "fix" this, run PowerShell as Administrator and call
Set-ExecutionPolicy RemoteSigned
From now on, in your new PowerShell sessions/scripts, do this:
import-module psurl
get-url "http://www.google.com"
To download and save to a file, do this:
get-url "http://www.google.com" | out-file -filepath "myfile.html"
As I understand it, you are trying to use IE because it automatically sends your credentials (or maybe you didn't know of any other option).
The reason the answers above don't work is that you are trying to download a file from SharePoint while sending an unauthenticated request. The response is a 401.
This works:
PS>$wc=new-object system.net.webclient
PS>$wc.UseDefaultCredentials = $true
PS>$wc.downloadfile("your_url","your_file")
if the current user of PowerShell has rights to download the file (i.e. is the same user as the one logged in to IE).
If not, try this:
PS>$wc=new-object system.net.webclient
PS>$wc.Credentials = Get-Credential
PS>$wc.downloadfile("your_url","your_file")
If you just want to download web content, use
(New-Object System.Net.WebClient).DownloadFile( 'download url content', 'save.html' )
I'm not aware of any way to save using that interface.
Does this render the page properly:
PS>$wc=new-object system.net.webclient
PS>$wc.downloadfile("your_url","your_file")
As already answered in https://stackoverflow.com/a/35202299/4636579, but that answer uses a mandatory proxy plus credentials. Without a proxy, it would be:
$url="http://aaa.bbb.ccc.ddd/rss.xml"
$WebClient = New-Object net.webclient
$path="C:\Users\hugo\xml\test.xml"
$WebClient.DownloadFile($url, $path)
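And, for completeness, a hedged sketch of the proxy variant that answer alludes to (the proxy address is a placeholder; adjust it to your environment):
$WebClient = New-Object Net.WebClient
# Route the request through an explicit proxy, authenticating with the current user's credentials
$WebClient.Proxy = New-Object System.Net.WebProxy("http://your.proxy:8080", $true)
$WebClient.Proxy.Credentials = [System.Net.CredentialCache]::DefaultNetworkCredentials
$WebClient.DownloadFile($url, $path)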
$web = New-Object Net.WebClient
$web | Get-Member
$content=$web.DownloadString("http://www.bing.com")
If you're truly only concerned with the raw string content, the best route, as mentioned by a few others, is using the constructs within .NET to do this. However, I think the previous answers miss a few opportunities:
- It's often best to use WebRequest over WebClient, as it provides better control over the entire request cycle
- Response buffering via System.IO.StreamReader, made possible by using WebRequest
- Creating a testable, reusable tool, which is the very nature and purpose of PowerShell
function Get-UrlContent {
    <#
    .SYNOPSIS
    High performance url fetch

    .DESCRIPTION
    Given a url, returns the raw content as a string.
    Uses:
    System.Net.HttpWebRequest
    System.IO.Stream
    System.IO.StreamReader

    .PARAMETER Url
    Defines the url to download

    .OUTPUTS
    System.String

    .EXAMPLE
    PS C:\> Get-UrlContent "https://www.google.com"
    "<!doctype html>..."
    #>
    [CmdletBinding()]
    [OutputType([String])]
    param(
        [Parameter(Mandatory, ValueFromPipeline)]
        [ValidateNotNullOrEmpty()]
        [string] $Url)

    Write-Debug "`n----- [Get-UrlContent]`n$Url`n------`n`n"
    $req = [System.Net.WebRequest]::CreateHttp($Url)
    try {
        $resp = $req.GetResponse()
    }
    catch {
        Write-Debug "`n------ [Get-UrlContent]`nDownload failed: $Url`n------`n"
    }
    finally {
        if ($resp) {
            # Buffer the response through a StreamReader and emit the body as the function's output
            $st = $resp.GetResponseStream()
            $rd = [System.IO.StreamReader]$st
            $rd.ReadToEnd()
        }
        # Release the reader, stream, and response in reverse order of creation
        if ($rd) { $rd.Close() }
        if ($st) { $st.Close() }
        if ($resp) { $resp.Close() }
    }
}
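Usage then looks like this (the function takes the url positionally or from the pipeline):
Get-UrlContent -Url "https://www.google.com"
"https://www.google.com" | Get-UrlContent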