I have previously automated the login process for a website that uses HTTP; however, the same code is not working for another website that uses HTTPS (secure protocol). I am not sure whether I have to do some extra work to log on to that website using PowerShell.
EDIT: Adding code from comment
$ie = New-Object -ComObject "InternetExplorer.Application"
$ie.Visible = $true
$ie.Navigate("secure.websitename.com/xyz/login.aspx")
while ($ie.Busy -eq $true) { Start-Sleep -Milliseconds 1000 }
Write-Host -ForegroundColor Magenta "Attempting to login"
$doc = $ie.Document
# getElementsByName returns a collection, so take the first match
$LoginName = $doc.getElementsByName("txtUserName").item(0)
$LoginName.value = "username"
$txtPassword = $doc.getElementsByName("txtUserPass").item(0)
$txtPassword.value = "password"
$btnLogin = $doc.getElementsByName("cmdLogin").item(0)
$btnLogin.click()
You will need to include "https://" in the URL you pass to $ie.Navigate. I also believe that if you pipe $ie to Get-Member there is a Credentials property you can use to pass the username and password. I don't have PowerShell in front of me right now, but I have played with this before against an https site.
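For example, the Navigate call from the question would need the scheme spelled out; a minimal sketch reusing the question's placeholder URL:

$ie = New-Object -ComObject "InternetExplorer.Application"
$ie.Visible = $true
# Note the explicit https:// scheme on the URL
$ie.Navigate("https://secure.websitename.com/xyz/login.aspx")
while ($ie.Busy -eq $true) { Start-Sleep -Milliseconds 1000 }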
I am trying to create a script to delete old MOSS 2007 personal sites. I have most of the script working, but where I am having trouble is getting it to click 'OK' once a box pops up. The script locates the URL and clicks 'Delete', but then a box pops up to confirm deletion with the options 'OK' and 'Cancel', and I want the script to click 'OK'. I've done some research on the SendKeys methods, but since I'm not well versed in PS, I'm unable to get it to work. Also, please don't suggest using the SP cmdlets: we have MOSS 2007 running on 2003 servers, so any SP cmdlets, or trying to run the script from the server, are a moot point.
Please take a look at my script below.
[void][System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.VisualBasic")
$user = "username"
$mysiteURL = "http://SharePointSite/personal/$user/_layouts/deleteweb.aspx"
Invoke-WebRequest -UseDefaultCredentials -uri $mysiteURL | Select-Object statusdescription
#Creates an Internet Explorer object
$ie = New-Object -ComObject 'internetExplorer.Application'
$ie.Visible= $true
$ie.Navigate($mysiteURL)
while ($ie.Busy -eq $true) { Start-Sleep 4 }
$ie.Document.getElementById('ctl00_PlaceHolderMain_ctl08_RptControls_BtnDelete').click()
#give the focus to ie
[Microsoft.VisualBasic.Interaction]::AppActivate("Message from webpage")
#send keys
Start-Sleep 1
[System.Windows.Forms.SendKeys]::SendWait("{ENTER}")
Any help would be appreciated. Thank you!
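One way to sidestep SendKeys entirely, if it applies here: confirmation boxes raised by window.confirm() can be pre-answered by overriding the function from script before clicking Delete. This is an untested sketch, and it assumes the MOSS confirmation really is a plain JavaScript confirm() rather than a browser-level dialog:

# Untested sketch: force window.confirm to return true, then click Delete
# Only works if the page uses a script confirm(), which you would need to verify
$ie.Document.parentWindow.execScript("window.confirm = function () { return true; };", "javascript")
$ie.Document.getElementById('ctl00_PlaceHolderMain_ctl08_RptControls_BtnDelete').click()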
I am trying to use PowerShell to login to a website. In the example below I am trying to login to live.com.
I am able to update the username field but the webpage runs some sort of input validation that does not accept my value. If I manually go in and edit the username field, like hitting space and then backspace, the input is then valid.
I found some documentation about changing the focus or using fireevent, but neither seems to work.
While sendkeys would resolve my issue, I have had numerous problems with sendkeys before and would really like to avoid going down that path.
$Site = 'https://login.live.com'
$UserName = 'FakeUserName@outlook.com'
$ie = New-Object -ComObject 'internetExplorer.Application'
$ie.Visible= $true
$ie.Navigate($Site)
while ($ie.Busy)
{
    Start-Sleep -Milliseconds 100
}
$Inputs = $ie.Document.getElementsByTagName("input")
# Note: $input is an automatic variable in PowerShell, so use a different name for the loop variable
foreach ($element in $Inputs)
{
    if ($element.type -eq "email")
    {
        $UserIDField = $element
    }
    if ($element.type -eq "submit")
    {
        $LoginButton = $element
    }
}
$UserIDField.focus()
$UserIDField.value = $UserName
$UserIDField.FireEvent('onchange')
$LoginButton.focus()
$LoginButton.click()
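For what it's worth, some validators listen for key events rather than onchange, so firing those after setting the value sometimes satisfies them. A sketch of that idea using the same FireEvent mechanism; whether it helps depends entirely on which events the page's validation listens for:

# Untested sketch: fire key events in addition to onchange after setting the value
$UserIDField.value = $UserName
foreach ($eventName in 'onkeydown', 'onkeypress', 'onkeyup', 'onchange', 'onblur')
{
    $null = $UserIDField.FireEvent($eventName)
}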
@Ranadip Dutta is certainly right that you should not do it this way, but if you want to automate a web browser, Selenium is a good tool; it took me five minutes to automate Chrome against your site. You can also choose an IE, Firefox, or Opera driver. Have a look at Selenium for that.
# Selenium directory is where I extracted the Selenium Client & WebDriver language bindings for C#
$seleniumDir = 'D:\Developpements\Pgdvlp_PowerShell\selenium-dotnet-3.0.0'
# Selenium WebDriver assemblies
Add-Type -Path "$seleniumDir\net40\WebDriver.dll"
Add-Type -Path "$seleniumDir\net40\WebDriver.Support.dll"
Add-Type -Path "$seleniumDir\net40\ThoughtWorks.Selenium.Core.dll"
Add-Type -Path "$seleniumDir\net40\Selenium.WebDriverBackedSelenium.dll"
# With Chrome
# I downloaded the Chrome driver from https://chromedriver.storage.googleapis.com/index.html?path=2.25/
# It sits in the "$seleniumDir" directory
$chrome = New-Object OpenQA.Selenium.Chrome.ChromeDriver "$seleniumDir"
#$chrome.Navigate().GoToUrl("https://fr.hightail.com/loginSpaces?redirect_url=https%3A%2F%2Fspaces.hightail.com%2Foauth%2Fhightail");
$chrome.Navigate().GoToUrl("https://login.live.com");
$Browser = $chrome
$email = $Browser.FindElements([OpenQA.Selenium.By]::Name('loginfmt'))
$email[0].SendKeys("address@hotmail.com")
# FindElements returns a collection, so index the first match before clicking
$button = $Browser.FindElements([OpenQA.Selenium.By]::Id('idSIButton9'))
$button[0].Click()
Start-Sleep 2
$passwd = $Browser.FindElements([OpenQA.Selenium.By]::Name('passwd'))
$passwd[0].SendKeys("toto")
$button = $Browser.FindElements([OpenQA.Selenium.By]::Id('idSIButton9'))
$button[0].Click()
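One design note: the hardcoded Start-Sleep 2 can be flaky on a slow page. WebDriver.Support.dll (loaded above) includes a WebDriverWait class that polls until a condition holds; a rough sketch of waiting for the password field instead, with the cast needed so PowerShell can pass the script block as a delegate:

# Sketch: poll up to 10 seconds for the password field instead of sleeping blindly
$wait = New-Object OpenQA.Selenium.Support.UI.WebDriverWait($Browser, [TimeSpan]::FromSeconds(10))
$passwd = $wait.Until([Func[OpenQA.Selenium.IWebDriver, object]] {
    param($driver)
    $found = $driver.FindElements([OpenQA.Selenium.By]::Name('passwd'))
    if ($found.Count -gt 0) { $found[0] } else { $null }
})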
If your website checks for automated logins, you can't expect to automate it this way. SendKeys actually sends input the way a user would, which is why it gets around that check.
I would suggest checking whether the web service exposes an API you can use to log in.
Other than that, I do not see anything that can help you. This concern is not specific to PowerShell or any scripting language; it is generic to your website.
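If such an API does exist, the login usually reduces to a single authenticated request. A generic sketch; the endpoint and field names below are invented placeholders, not a real service:

# Hypothetical example: the https://example.com/api/... endpoints and body fields are placeholders
$body = @{ username = 'someuser'; password = 'somepassword' } | ConvertTo-Json
Invoke-RestMethod -Uri 'https://example.com/api/login' -Method Post -Body $body -ContentType 'application/json' -SessionVariable session
# $session now carries any auth cookies for later calls
Invoke-RestMethod -Uri 'https://example.com/api/profile' -WebSession $session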
You may also want to consider passing stored credentials more securely instead of leaving your creds in plain text in your script(s).
TechNet - PowerShell Tip - Storing and Using Password Credentials
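Following that tip, a common pattern is to save the password encrypted once and import it in the script; ConvertFrom-SecureString uses DPAPI, so the file only decrypts for the same user on the same machine. A minimal sketch (the file path is just an example):

# One-time step: prompt for credentials and store the password encrypted with DPAPI
$cred = Get-Credential
$cred.Password | ConvertFrom-SecureString | Set-Content 'C:\Secure\cred.txt'
# In the script: rebuild the credential object from the stored password
$securePassword = Get-Content 'C:\Secure\cred.txt' | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential('domain\username', $securePassword)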
I am new to both PowerShell and SharePoint, and I need to write a script to automate pulling attachments out of Outlook and uploading them to SharePoint. I easily completed the first part, extracting the attachments, but the upload to SharePoint has become difficult due to my company's rules. As I understand it, to use the SharePoint cmdlets you need to add the SharePoint snap-in, but I can't do that because I don't have access to the SharePoint server. Is there any way to add the snap-in without being on the server, and if not, can I upload the files another way?
You can't add the SP snap-in unless the machine is a SP server. Instead, use a web service/WebClient approach to upload the file. Something like this should work, depending on your SP version:
http://blog.sharepoint-voodoo.net/?p=205
Accepted answer link is broken.
This script uses PowerShell to upload a file to a document library in SharePoint using purely web service calls, so it can be run remotely; that also means it should work with O365, though I have not tried it.
These variables are used throughout the script for the source file, destination file, and authentication. If your workstation is on the same domain as SharePoint and your logged-on user has permissions to the SharePoint site, you can omit $username, $password, and $domain:
$LocalPath = "C:\filename.docx"
$spDocLibPath = "http://site.contoso.com/sites/spteam/Shared Documents/"
$username = "someone"
$password = "somepassword"
$domain = "contoso"
$UploadFullPath = $spDocLibPath + $(split-path -leaf $LocalPath)
$WebClient = new-object System.Net.WebClient
if($username -eq "" -or $password -eq "" -or $domain -eq "")
{
# Use Local Logged on User Credentials
$WebClient.Credentials = [System.Net.CredentialCache]::DefaultCredentials
}
else
{
# Alternate Login for specifying credentials
$WebClient.Credentials = new-object System.Net.NetworkCredential($username, $password, $domain)
}
$WebClient.UploadFile($UploadFullPath, "PUT", $LocalPath)
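If you need to push more than one file, the same WebClient can be reused in a loop; a quick sketch along those lines (the folder path is just an example):

# Sketch: upload every file in a local folder to the same document library
foreach ($file in Get-ChildItem -Path 'C:\Upload' -File)
{
    $WebClient.UploadFile($spDocLibPath + $file.Name, "PUT", $file.FullName)
}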
https://web.archive.org/web/20160404174527/http://blog.sharepoint-voodoo.net/?p=205
I'm trying to do some simple automation with Powershell, pulling link URLs from one of our company's local intranet pages, and then doing some work with those URLs. Eventually I'll use the script to open each link and click a button on the page. I'm using Internet Explorer 9 in Windows 7 x64.
Here's an example of a simple working powershell script that displays all the links on a page:
$ie = new-object -com "InternetExplorer.Application"
$ie.Visible = $true
$ie.Navigate( "http://www.reddit.com" )
While ($ie.Busy) {
Sleep 1
}
$links = $ie.Document.getElementsByTagName("a")
$links | foreach {
write-host $_.href
}
This script works fine until I replace the URL with a local intranet site. The URL follows the normal scheme (http://internaldomain.com/etc), but it is recognized as an intranet site. As soon as I try to scrape a page in the intranet zone, the $ie.Document value suddenly becomes NULL and the script fails.
I'm guessing it's related to some obscure setting for that zone... I'm not sure. I found suggestions online such as adding the site to your trusted sites, but that has not worked. This is my first time using PowerShell for web automation, so any help or insight would be appreciated.
Maybe the solution is here: http://blogs.msdn.com/b/ieinternals/archive/2011/08/03/internet-explorer-automation-protected-mode-lcie-default-integrity-level-medium.aspx
It explains the different integrity levels of IE tabs. You have to use a "medium integrity" tab to navigate in the local intranet zone.
Basically, the best way to keep your IE settings and still use your script is to create a registry key, as explained in the link above.
Windows Registry Editor Version 5.00
[HKEY_CLASSES_ROOT\InternetExplorer.ApplicationMedium]
[HKEY_CLASSES_ROOT\InternetExplorer.ApplicationMedium\CLSID]
#="{D5E8041D-920F-45e9-B8FB-B1DEB82C6E5E}"
And in your script, use this new com object:
$ie = new-object -Com InternetExplorer.ApplicationMedium
...
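If you would rather create the key from PowerShell than import a .reg file (this still requires permission to write to HKEY_CLASSES_ROOT), something like this sketch should do it:

# Sketch: create the InternetExplorer.ApplicationMedium key from PowerShell (needs rights to HKCR)
New-Item -Path 'Registry::HKEY_CLASSES_ROOT\InternetExplorer.ApplicationMedium\CLSID' -Force | Out-Null
Set-Item -Path 'Registry::HKEY_CLASSES_ROOT\InternetExplorer.ApplicationMedium\CLSID' -Value '{D5E8041D-920F-45e9-B8FB-B1DEB82C6E5E}'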
Due to policy restrictions on my computer, I was not able to access the registry to create the key mentioned in the other answer. However, I did find a way to do it indirectly using PowerShell, in case it is helpful to anyone else:
$type = [Type]::GetTypeFromCLSID('D5E8041D-920F-45e9-B8FB-B1DEB82C6E5E')
$ie = [System.Activator]::CreateInstance($Type)
$ie.Visible = $true
$URL = "http://my.intranet.com"
$ie.Navigate($URL)
Write-Host "`$ie.Busy:" $ie.Busy
Write-Host "`$ie.ReadyState:" $ie.ReadyState
while($ie.Busy -or ($ie.ReadyState -ne 4) ) {
Start-Sleep -s 1
}
Write-Host "IE is ready"
Once IE is ready, you can query the DOM, for example:
$ie.Document.documentElement.getElementsByClassName("underline")
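Tying this back to the original goal of pulling links and clicking a button, a sketch using the medium-integrity object created above; the button ID is a placeholder you would replace with the real one:

# Sketch: enumerate links and click a button on the intranet page ("SomeButtonId" is made up)
$links = $ie.Document.getElementsByTagName("a")
foreach ($link in $links) { Write-Host $link.href }
$ie.Document.getElementById("SomeButtonId").click()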
I searched this site and found an FtpWebRequest example in PowerShell. I put it to use and it works fine. However, when I enable SSL via EnableSsl = $True, all I get is timeouts or a delayed "227 Entering Passive Mode", which breaks the process. As soon as I disable EnableSsl, I can fly right through. Can someone point me in the right direction? SSL is supported on the FTP host.
Eventually I'd like to change the method to DownloadFile and loop the code to download the files that match (see the sketch after the listing). I'd like to do it securely, though.
# Create an FTPWebRequest object to handle the connection to the FTP server
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri)
# Set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
# Set FTPWebRequest method to ListDirectory
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$ftprequest.EnableSsl = $True
$ftprequest.UseBinary = $False
$ftprequest.UsePassive = $True
$ftprequest.KeepAlive = $False
$ftpresponse = $ftprequest.GetResponse()
Write-Output $ftpresponse.StatusCode
Write-Output $ftpresponse.StatusDescription
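For the download step mentioned above, a rough sketch of the same FtpWebRequest pattern switched to DownloadFile, looping over names already parsed from the listing; $fileList, $ftpBase, and the target folder are placeholders:

# Sketch: download each matched file over FTPS ($fileList, $ftpBase, C:\Downloads are placeholders)
foreach ($fileName in $fileList) {
    $request = [System.Net.FtpWebRequest]::Create("$ftpBase/$fileName")
    $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $request.EnableSsl = $True
    $request.UseBinary = $True
    $response = $request.GetResponse()
    $stream = $response.GetResponseStream()
    $target = New-Object System.IO.FileStream("C:\Downloads\$fileName", [System.IO.FileMode]::Create)
    $stream.CopyTo($target)
    $target.Close(); $stream.Close(); $response.Close()
}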
As it turned out, the intermittent issue was server-side.