HTTP 401 Basic Authentication error accessing Magento 2 Rest API

I am attempting to use the Rest API in Magento 2. I have a piece of PHP that uses cURL to first get an admin token for my Magento user, then use the token to return a piece of Magento data (in this example a list of product types). The first part returns a token with no problems, but the second part comes back with an HTTP 401 Basic Authentication error.
My code is:
<?php
// Get handle for token retrieval
$userData = array("username" => "user", "password" => "password!");
$ch = curl_init("https://my.magento/rest/V1/integration/admin/token/");
// Set options
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "POST");
curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($userData));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/json", "Content-Length: " . strlen(json_encode($userData))));
curl_setopt($ch, CURLOPT_VERBOSE, true);
$verbose = fopen('/tmp/curl.log', 'w+');
curl_setopt($ch, CURLOPT_STDERR, $verbose);
// Get token
$token = curl_exec($ch);
echo "Token returned: " . $token . "<BR><BR>";
// Display log
rewind($verbose);
$verboseLog = stream_get_contents($verbose);
echo "Verbose information 1:\n<pre>", htmlspecialchars($verboseLog), "</pre>\n";
echo "About to get product<BR>";
// Get handle for product types
$ch = curl_init("https://my.magento/rest/V1/products/types/");
// Set options
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "GET");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/json", "Authorization: Bearer " . json_decode($token)));
curl_setopt($ch, CURLOPT_VERBOSE, true);
$verbose = fopen('/tmp/curl.log', 'w+');
curl_setopt($ch, CURLOPT_STDERR, $verbose);
// Get types
$result = curl_exec($ch);
echo "Result: " . $result . "<BR>";
// Display log
rewind($verbose);
$verboseLog = stream_get_contents($verbose);
echo "<BR>Verbose information 2:\n<pre>", htmlspecialchars($verboseLog), "</pre>\n";
?>
And the browser output is:
Token returned: "t8iskt68xlo5frf9hhtc1lk8wmqzbzx8"
Verbose information 1:
* About to connect() to my.magento port 443 (#2)
* Trying 104.25.128.20...
* Connected to my.magento (nn.nn.nn.nn) port 443 (#2)
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
* Server certificate:
* subject: CN=ssl379212.cloudflaressl.com,OU=PositiveSSL Multi-Domain,OU=Domain Control Validated
* start date: Oct 26 00:00:00 2018 GMT
* expire date: May 04 23:59:59 2019 GMT
* common name: ssl379212.cloudflaressl.com
* issuer: CN=COMODO ECC Domain Validation Secure Server CA 2,O=COMODO CA Limited,L=Salford,ST=Greater Manchester,C=GB
> POST /rest/V1/integration/admin/token/ HTTP/1.1
Host: my.magento
Accept: */*
Content-Type: application/json
Content-Length: 48
* upload completely sent off: 48 out of 48 bytes
< HTTP/1.1 200 OK
< Date: Wed, 31 Oct 2018 12:50:01 GMT
< Content-Type: application/json; charset=utf-8
< Content-Length: 34
< Connection: keep-alive
< Set-Cookie: __cfduid=d69af7d1f0a1205231a8867c1f45875621540990201; expires=Thu, 31-Oct-19 12:50:01 GMT; path=/; domain=.my.magento; HttpOnly
< X-Frame-Options: SAMEORIGIN
< X-UA-Compatible: IE=edge
< Pragma: no-cache
< Expires: -1
< Cache-Control: no-store, no-cache, must-revalidate, max-age=0
< Accept-Ranges: bytes
< Set-Cookie: PHPSESSID=9p378rsfito8gfocnrufucssh6; expires=Wed, 31-Oct-2018 13:50:01 GMT; Max-Age=3600; path=/; domain=my.magento; secure; HttpOnly
< Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< Server: cloudflare
< CF-RAY: 47263eb629ea0ce9-LHR
<
* Connection #2 to host my.magento left intact
About to get product
Result:
Verbose information 2:
* About to connect() to my.magento port 443 (#3)
* Trying nn.nn.nn.nn...
* Connected to my.magento (nn.nn.nn.nn) port 443 (#3)
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
* Server certificate:
* subject: CN=ssl379212.cloudflaressl.com,OU=PositiveSSL Multi-Domain,OU=Domain Control Validated
* start date: Oct 26 00:00:00 2018 GMT
* expire date: May 04 23:59:59 2019 GMT
* common name: ssl379212.cloudflaressl.com
* issuer: CN=COMODO ECC Domain Validation Secure Server CA 2,O=COMODO CA Limited,L=Salford,ST=Greater Manchester,C=GB
> GET /rest/V1/products/types/ HTTP/1.1
Host: my.magento
Accept: */*
Content-Type: application/json
Authorization: Bearer t8iskt68xlo5frf9hhtc1lk8wmqzbzx8
< HTTP/1.1 401 Unauthorized
< Date: Wed, 31 Oct 2018 12:50:01 GMT
< Content-Length: 0
< Connection: keep-alive
< Set-Cookie: __cfduid=d38c9e4bc3019d9ac55c7f68f5c5ca1161540990201; expires=Thu, 31-Oct-19 12:50:01 GMT; path=/; domain=.my.magento; HttpOnly
< X-Varnish: 7995397
< WWW-Authenticate: Basic
< Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< Server: cloudflare
< CF-RAY: 47263eb70f5b3512-LHR
<
* Connection #3 to host my.magento left intact
When I try just browsing directly to https://my.magento/rest/V1/products/types/ I get a Magento error back saying I am not authorised for the Products resource, which I would expect as I am sending no token or login credentials, but at least it is getting through to Magento.
Any ideas??
I should add that the server is set for Basic authentication, and if I replace the Bearer auth with the necessary Basic auth in the header for the GET, it returns the Magento message about not having access to the resource, which is fair enough. So I guess there are two questions:
How can I get past the basic authentication and still include the bearer authentication in my GET request, given that you can't put two authentications into the header?
Why does the initial POST to get the token work without any basic auth??

"How can I get past the basic authentication and still include the bearer authentication in my GET request, given that you can't put two authentications into the header?"
Disable Basic auth for the /index.php/rest (and /rest) location in the web server configuration, as sketched below.
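A minimal sketch, assuming the Basic auth prompt comes from an auth_basic directive in an nginx vhost (the location pattern is a guess at how your REST base is exposed; Apache needs its own equivalent directives instead):
location ~ ^/(index\.php/)?rest {
    auth_basic off;
    # keep whatever PHP/fastcgi handling the rest of the vhost already uses here
}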
"Why does the initial POST to get the token work without any basic auth??"
If the token POST location were protected as well, you would get a 401 response there too.
Did you put the username and password in the URL of the POST request? e.g. https://user:pass@my.magento/rest/V1/
By the way, putting user:pass@my.magento into the URL is translated into an Authorization header carrying those credentials (HTTP Basic authentication).
But you also set Authorization: Bearer t8iskt68xlo5frf9hhtc1lk8wmqzbzx8, which will overwrite that Basic Authorization header.
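To illustrate the collision, here is a minimal sketch (hostname, credentials and token are placeholders, not values from the question): even if cURL is asked to do Basic auth, an explicit Authorization header passed via CURLOPT_HTTPHEADER replaces the header cURL would otherwise generate, so only the Bearer value ever reaches the web server.
<?php
// Placeholders only: my.magento, user:pass and the token are not real values.
$ch = curl_init("https://my.magento/rest/V1/products/types/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// This would normally produce "Authorization: Basic <base64 of user:pass>" ...
curl_setopt($ch, CURLOPT_USERPWD, "user:pass");
// ... but the explicit header below overrides it, so the web server never
// sees the Basic credentials it is waiting for.
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    "Content-Type: application/json",
    "Authorization: Bearer t8iskt68xlo5frf9hhtc1lk8wmqzbzx8",
));
$result = curl_exec($ch);
curl_close($ch);
?>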

Related

Curl doesn't redirect properly for some facebook profile IDs

I'm trying to hack together a bit of code (C++) that returns the current Facebook display name for a given user ID. The code works perfectly fine for most IDs, but for some IDs curl returns a 404 even though the same URL opens fine in a browser. Here's the bit of code related to curl.
int main(int argc, char const *argv[]) {
    CURL* curl;
    char userAgent[] = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13";
    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    string fbID = argv[1];
    string searchTerm = "https://www.facebook.com/profile.php?id=" + fbID;
    cout << "searching for: " << searchTerm << endl;
    if(!curl)
        return 0;
    else{
        curl_easy_setopt(curl, CURLOPT_USERAGENT, userAgent);
        curl_easy_setopt(curl, CURLOPT_URL, searchTerm.c_str());
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &writeCallback);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, true);
        curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L); // tell curl to output its progress
    }
    //scan the retrieved data for the name...
}
size_t writeCallback(char* buf, size_t size, size_t nmemb, void* up)
{ //i copied this function
    for (int c = 0; c<size*nmemb; c++)
    {
        data.push_back(buf[c]);
    }
    return size*nmemb;
}
I don't exactly know where to look, because it works for most IDs, but for example "1094145063" returns a 404 even though https://www.facebook.com/profile.php?id=1094145063
opens fine in a browser. (1331601579, for example, works fine with my code.)
I can't spot the difference between the two profiles.
It does, however, make a redirect to https://www.facebook.com/"username", which works fine once CURLOPT_FOLLOWLOCATION is set, but only for some profiles. Here's the message curl outputs.
* Hostname was NOT found in DNS cache
* Trying 31.13.84.36...
* Connected to www.facebook.com (31.13.84.36) port 443 (#0)
* found 174 certificates in /etc/ssl/certs/ca-certificates.crt
* server certificate verification OK
* common name: *.facebook.com (matched)
* server certificate expiration date OK
* server certificate activation date OK
* certificate public key: EC
* certificate version: #3
* subject: C=US,ST=California,L=Menlo Park,O=Facebook\, Inc.,CN=*.facebook.com
* start date: Fri, 09 Dec 2016 00:00:00 GMT
* expire date: Thu, 25 Jan 2018 12:00:00 GMT
* issuer: C=US,O=DigiCert Inc,OU=www.digicert.com,CN=DigiCert SHA2 High Assurance Server CA
* compression: NULL
* cipher: AES-128-GCM
* MAC: AEAD
> GET /profile.php?id=1094145063 HTTP/1.1
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13
Host: www.facebook.com
Accept: */*
HTTP/1.1 404 Not Found
X-XSS-Protection: 0
public-key-pins-report-only: max-age=500; pin-sha256="WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18="; pin-sha256="r/mIkG3eEpVdm+u/ko/cwxzOMo1bk4TyHIlByibiA5E="; pin-sha256="q4PO2G2cbkZhZ82+JgmRUyGMoAeozA+BSXVXQWB8XWQ="; report-uri="http://reports.fb.com/hpkp/"
Pragma: no-cache
content-security-policy: default-src * data: blob:;script-src *.facebook.com *.fbcdn.net *.facebook.net *.google-analytics.com *.virtualearth.net *.google.com 127.0.0.1:* *.spotilocal.com:* 'unsafe-inline' 'unsafe-eval' fbstatic-a.akamaihd.net fbcdn-static-b-a.akamaihd.net *.atlassolutions.com blob: data: 'self';style-src data: blob: 'unsafe-inline' *;connect-src *.facebook.com *.fbcdn.net *.facebook.net *.spotilocal.com:* *.akamaihd.net wss://*.facebook.com:* https://fb.scanandcleanlocal.com:* *.atlassolutions.com attachment.fbsbx.com ws://localhost:* blob: *.cdninstagram.com 'self';
Cache-Control: private, no-cache, no-store, must-revalidate
Strict-Transport-Security: max-age=15552000; preload
X-Content-Type-Options: nosniff
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Vary: Accept-Encoding
Content-Type: text/html; charset=UTF-8
X-FB-Debug: pA9SSTdWWx2a66QeP7Je/4ik/2a+/ZL/m/nckHKf+KoEZLloClzu+qMDzyE/B8M1PRDl4SdS19C9vIIl7f43mA==
Date: Tue, 06 Jun 2017 18:25:15 GMT
Transfer-Encoding: chunked
Connection: keep-alive
* Connection #0 to host www.facebook.com left intact
Any help would be appreciated.
(My first SO post, so please don't be too harsh if I did something dumb.)

New-WebServiceProxy failing to authenticate with NTLM

I'm dealing with a rather peculiar issue. We have a need to hit the Lists service on our SharePoint farm. Web authentication is federated through an Oracle SSO, but we do have accounts configured for automation that can perform web requests. Using AAM, we have an "internal" URL configured for server-side automation that bypasses directly to AD, and everything else gets pushed to the SSO.
Here's the code (sanitized) that I'm using to try to get the list collection.
$username = "DOMAIN\username"
$password = "somepassword"
$site = "https://sp.biz.com/sites/SiteCollection"
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, (ConvertTo-SecureString $password -AsPlainText -Force)
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credentials $credentials
$proxy.GetListCollection()
I'm hit with a 403 when I use that code.
Exception calling "GetListCollection" with "0" argument(s): "Server was unable to process request. ---> Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))"
If I change $site to use the internal URL (set via AAM) and run that on one of the front ends, I receive the list collection successfully. Now, at first I thought there was an issue with the account and permissions, but after running a Fiddler capture I see it not authenticating at all.
When I run the following cURL command, it authenticates and returns the list collection. soap.xml is just the basic GetListCollection packet copied straight from the WSDL.
curl -v -u 'username':'pass' --ntlm -X POST -H "Content-Type: text/xml" --data-binary @soap.xml https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx
Here's the sanitized verbose output from cURL.
* STATE: INIT => CONNECT handle 0x600056190; line 1029 (connection #-5000)
* Hostname was NOT found in DNS cache
* Trying <IPv6>...
* STATE: CONNECT => WAITCONNECT handle 0x600056190; line 1082 (connection #0)
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Connected to sp.biz.com (<IPv6>) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: /usr/ssl/certs/ca-bundle.crt
CApath: none
* SSLv3, TLS handshake, Client hello (1):
} [data not shown]
* STATE: WAITCONNECT => PROTOCONNECT handle 0x600056190; line 1222 (connection #0)
* SSLv3, TLS handshake, Server hello (2):
{ [data not shown]
* SSLv3, TLS handshake, CERT (11):
{ [data not shown]
* SSLv3, TLS handshake, Server finished (14):
{ [data not shown]
* SSLv3, TLS handshake, Client key exchange (16):
} [data not shown]
* SSLv3, TLS change cipher, Client hello (1):
} [data not shown]
* SSLv3, TLS handshake, Finished (20):
} [data not shown]
* SSLv3, TLS change cipher, Client hello (1):
{ [data not shown]
* SSLv3, TLS handshake, Finished (20):
{ [data not shown]
* SSL connection using TLSv1.2 / DES-CBC3-SHA
* SSL certificate verify ok.
* STATE: PROTOCONNECT => DO handle 0x600056190; line 1241 (connection #0)
* Server auth using NTLM with user 'DOMAIN\username'
> POST /sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
> Authorization: NTLM <snip>
> User-Agent: curl/7.39.0
> Host: sp.biz.com
> Accept: */*
> Content-Type: text/xml
> Content-Length: 0
>
* STATE: DO => DO_DONE handle 0x600056190; line 1314 (connection #0)
* STATE: DO_DONE => WAITPERFORM handle 0x600056190; line 1441 (connection #0)
* STATE: WAITPERFORM => PERFORM handle 0x600056190; line 1454 (connection #0)
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 401 Unauthorized
* Server Microsoft-IIS/7.5 is not blacklisted
< Server: Microsoft-IIS/7.5
< SPRequestGuid: <snip>
< WWW-Authenticate: NTLM <snip>
< X-Powered-By: ASP.NET
< MicrosoftSharePointTeamServices: 14.0.0.7006
< X-MS-InvokeApp: 1; RequireReadOnly
< Date: Fri, 16 Jan 2015 01:02:56 GMT
< Content-Length: 0
< Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 01:02:56 GMT; path=/
<
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
* Connection #0 to host sp.biz.com left intact
* Issue another request to this URL: 'https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx'
* STATE: PERFORM => CONNECT handle 0x600056190; line 1601 (connection #-5000)
* Found bundle for host sp.biz.com: 0x60006aef0
* Re-using existing connection! (#0) with host sp.biz.com
* Connected to sp.biz.com (<IPv6>) port 443 (#0)
* STATE: CONNECT => DO handle 0x600056190; line 1075 (connection #0)
* Server auth using NTLM with user 'DOMAIN\username'
> POST /sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
> Authorization: NTLM <snip>
> User-Agent: curl/7.39.0
> Host: sp.biz.com
> Accept: */*
> Content-Type: text/xml
> Content-Length: 353
>
} [data not shown]
* upload completely sent off: 353 out of 353 bytes
* STATE: DO => DO_DONE handle 0x600056190; line 1314 (connection #0)
* STATE: DO_DONE => WAITPERFORM handle 0x600056190; line 1441 (connection #0)
* STATE: WAITPERFORM => PERFORM handle 0x600056190; line 1454 (connection #0)
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 200 OK
< Cache-Control: private, max-age=0
< Content-Type: text/xml; charset=utf-8
* Server Microsoft-IIS/7.5 is not blacklisted
< Server: Microsoft-IIS/7.5
< SPRequestGuid: <snip>
< Set-Cookie: FedAuth=<snip>; expires=Fri, 16-Jan-2015 08:36:07 GMT; path=/; secure; HttpOnly
< X-SharePointHealthScore: 0
< X-AspNet-Version: 2.0.50727
< Persistent-Auth: true
< X-Powered-By: ASP.NET
< MicrosoftSharePointTeamServices: 14.0.0.7006
< X-MS-InvokeApp: 1; RequireReadOnly
< Date: Fri, 16 Jan 2015 01:02:56 GMT
< Content-Length: 104088
< Vary: Accept-Encoding
<
{ [data not shown]
* STATE: PERFORM => DONE handle 0x600056190; line 1626 (connection #0)
100 101k 100 101k 100 353 219k 762 --:--:-- --:--:-- --:--:-- 219k
* Connection #0 to host sp.biz.com left intact
Any assistance is greatly appreciated. I'm not opposed to a C# solution over PowerShell if the cmdlets are lacking.
01-16-2015 12:13PM EST Update - I updated the question to reflect HighlyUnavailable's suggestion and included headers from the Fiddler capture.
Here are the sanitized headers from the PowerShell script:
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:46.372
Connection: close
------------------------------------------------------------------
GET https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 OK
Cache-Control: private, max-age=0
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: keep-alive
Content-Length: 9066
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:14:46 GMT; path=/
Vary: Accept-Encoding
------------------------------------------------------------------
GET https://sp.biz.com/_vti_bin/Lists.asmx?disco HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: close
Content-Length: 747
------------------------------------------------------------------
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:47.505
Connection: close
------------------------------------------------------------------
GET https://sp.biz.com/_vti_bin/Lists.asmx?wsdl HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: close
Content-Length: 72672
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:14:47 GMT; path=/
Vary: Accept-Encoding
------------------------------------------------------------------
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:48.727
Connection: close
------------------------------------------------------------------
POST https://sp.biz.com/_vti_bin/Lists.asmx HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Content-Type: text/xml; charset=utf-8
SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/GetListCollection"
Host: sp.biz.com
Content-Length: 321
Expect: 100-continue
HTTP/1.1 500 Internal Server Error
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:48 GMT
Content-Length: 459
Set-Cookie: BIGipServerserver_pool=686493706.47873.0000; expires=Sat, 17-Jan-2015 17:14:48 GMT; path=/
------------------------------------------------------------------
Here are the headers for the cURL command.
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com:443
User-Agent: curl/7.39.0
Connection: Keep-Alive
Content-Type: text/xml
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:21:07.928
Connection: close
------------------------------------------------------------------
POST https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
Authorization: NTLM <snip>=
User-Agent: curl/7.39.0
Host: sp.biz.com
Accept: */*
Content-Type: text/xml
Content-Length: 0
HTTP/1.1 401 Unauthorized
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
WWW-Authenticate: NTLM <snip>
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:21:07 GMT
Content-Length: 0
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:21:07 GMT; path=/
Proxy-Support: Session-Based-Authentication
------------------------------------------------------------------
POST https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
Authorization: NTLM <snip>
User-Agent: curl/7.39.0
Host: sp.biz.com
Accept: */*
Content-Type: text/xml
Content-Length: 417
HTTP/1.1 200 OK
Cache-Control: private, max-age=0
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
Set-Cookie: FedAuth=<snip>; expires=Sat, 17-Jan-2015 03:20:50 GMT; path=/; secure; HttpOnly
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
Persistent-Auth: true
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:21:07 GMT
Content-Length: 66628
Vary: Accept-Encoding
------------------------------------------------------------------
You're mixing two fundamentally different techniques here.
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -UseDefaultCredential
$proxy.PreAuthenticate = $TRUE
$proxy.Credentials = $credentials
UseDefaultCredential will attempt to pass your currently logged in Windows domain user to the site. However, you're setting Credentials as well. Normally, you would use -Credential $credentials (see http://technet.microsoft.com/en-us/library/hh849841.aspx )
The curl command you're running is more akin to using -Credential: the -u flag is the equivalent.
Try using something like $proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credential $credentials instead.
If that doesn't work, please edit your question to include the headers being returned from the Oracle SSO connection - it could be that it simply isn't even asking for credentials.
I never ended up coming up with a solution for this, but I can explain why. In our environment we use Forms Based Authentication against our Oracle Identity Foundation SSO with SAML v1.1.
When you attempt to authenticate, it redirects you to the SSO, but the client is attempting to use NTLM against the actual Web Front Ends instead of the SSO. To make this work, you need to include the X-FORMS_BASED_AUTH_ACCEPTED: f header in your request for it to actually authenticate using NTLM against the WFE (and not the SSO).
Here's the issue: You can't add headers to New-WebServiceProxy in PowerShell (up to 4.0 -- I haven't rolled out 5 yet). The only recommendation I can make for others having issues is to follow HighlyUnavailable's suggestions, or use Invoke-WebRequest and build your SOAP calls by hand.
The only issue is that Invoke-WebRequest can chew up your encoding, so here's how I've worked around it. If anybody has a suggestion for working around the encoding issue, I'm all ears.
# Set your credentials here.
$UserName = 'BartSimpson'
$Password = '3atmMySh0rtz!'
$Domain = 'SF'
$SecurePassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential (($Domain + "\" + $UserName), $SecurePassword)
# SOAP request headers and body
$BaseHeaders = @{"X-FORMS_BASED_AUTH_ACCEPTED" = 'f';
"SOAPAction" = "`"http://schemas.microsoft.com/sharepoint/soap/GetListCollection`"";
"Content-Type" = "text/xml; charset=utf-8"}
$SOAP = @"
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<GetListCollection xmlns="http://schemas.microsoft.com/sharepoint/soap/" />
</soap:Body>
</soap:Envelope>
"@
# Gives us a random temp file to pipe output to
$TmpFile = [System.IO.Path]::GetTempFileName()
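# $URL should point at the Lists.asmx endpoint, e.g. "$site/_vti_bin/Lists.asmx" from the question above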
Invoke-WebRequest -Uri $URL -Headers $BaseHeaders -Credential $Credentials -Method POST -Body $SOAP -OutFile $TmpFile
# Get the outfile with UTF8 encoding
[xml]$Result = Get-Content -Raw -Path $TmpFile -Encoding UTF8
# Remove the temporary file
Remove-Item $TmpFile
Seems like a long way to go, which it is, but it works if you insist on using PowerShell.
I switched to python-suds and was able to do what I needed to.
In the above code, there is an extra 'S' in -Credentials.
So replace this code:
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credentials $credentials
with the code below:
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credential $credentials

401 Authorization Required: Failed to validate oauth signature and token

I've tried using Net::Twitter::Role::OAuth to add Sign in with Twitter to my application.
I've used this successfully in the past, but not with SSL enabled, which apparently is now required by the Twitter API. I have a controller action very similar to the examples in the synopsis:
sub twitter_authorize : Local {
my($self, $c) = @_;
my $nt = Net::Twitter->new(traits => [qw/API::RESTv1_1 OAuth/], %param);
my $url = $nt->get_authorization_url(callback => $callbackurl);
$c->response->cookies->{oauth} = {
value => {
token => $nt->request_token,
token_secret => $nt->request_token_secret,
},
};
$c->response->redirect($url);
}
However, this fails at the $nt->get_authorization_url() call with a 401 Unauthorized error.
Looking at the oauth/request_token docs, I tried running the request through cURL, as follows:
curl --request 'POST' 'https://api.twitter.com/oauth/request_token' --header 'Authorization: OAuth oauth_consumer_key="xxxx", oauth_nonce="xxxx", oauth_signature="xxxx", oauth_signature_method="HMAC-SHA1", oauth_timestamp="xxxx", oauth_callback="oob", oauth_version="1.0"' --verbose
And the response is as follows:
* About to connect() to api.twitter.com port 443 (#0)
* Trying 199.16.156.104...
connected
* Connected to api.twitter.com (199.16.156.104) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: none
CApath: /etc/ssl/certs
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS handshake, Server finished (14):
* SSLv3, TLS handshake, Client key exchange (16):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSL connection using AES128-SHA
* Server certificate:
* subject: C=US; ST=California; L=San Francisco; O=Twitter, Inc.; OU=Twitter Security; CN=api.twitter.com
* start date: 2014-08-03 00:00:00 GMT
* expire date: 2016-12-31 23:59:59 GMT
* subjectAltName: api.twitter.com matched
* issuer: C=US; O=VeriSign, Inc.; OU=VeriSign Trust Network; OU=Terms of use at https://www.verisign.com/rpa (c)10; CN=VeriSign Class 3 Secure Server CA - G3
* SSL certificate verify ok.
> POST /oauth/request_token HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-pc-linux-gnu) libcurl/7.19.7 OpenSSL/0.9.8k zlib/1.2.3.3 libidn/1.15
> Host: api.twitter.com
> Accept: */*
> Authorization: OAuth oauth_consumer_key="xxxx", oauth_nonce="xxxx", oauth_signature="xxxx", oauth_signature_method="HMAC-SHA1", oauth_timestamp="xxxx", oauth_callback="oob", oauth_version="1.0"
>
< HTTP/1.1 401 Authorization Required
< cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0
< content-length: 44
< content-security-policy-report-only: default-src https:; connect-src https:; font-src https: data:; frame-src https: http://*.twimg.com http://itunes.apple.com about: javascript:; img-src https: data:; media-src https:; object-src https:; script-src 'unsafe-inline' 'unsafe-eval' about: https:; style-src 'unsafe-inline' https:; report-uri https://twitter.com/i/csp_report?a=NVXW433SMFUWY%3D%3D%3D&ro=true;
< content-type: text/html; charset=utf-8
< date: Tue, 21 Oct 2014 10:29:57 UTC
< expires: Tue, 31 Mar 1981 05:00:00 GMT
< last-modified: Tue, 21 Oct 2014 10:29:57 GMT
< pragma: no-cache
< server: tsa_b
< set-cookie: _twitter_sess=BAh7CDoPY3JlYXRlZF9hdGwrCD2PQTJJAToHaWQiJTE3M2Q4OWIyZWE1Nzc1%250AZmYxMjRkYmUyZDVjOTBlYjQxIgpmbGFzaElDOidBY3Rpb25Db250cm9sbGVy%250AOjpGbGFzaDo6Rmxhc2hIYXNoewAGOgpAdXNlZHsA--b807e4ebb8d45756e9686971b951a549d0d83b61; domain=.twitter.com; path=/; secure; HttpOnly
< set-cookie: guest_id=v1%3A141388739758201626; Domain=.twitter.com; Path=/; Expires=Thu, 20-Oct-2016 10:29:57 UTC
< status: 401 Unauthorized
< strict-transport-security: max-age=631138519
< vary: Accept-Encoding
< x-connection-hash: 54a185631d5f0b3a3a9dc46fe1f40a57
< x-content-type-options: nosniff
< x-frame-options: SAMEORIGIN
< x-mid: 0258025664ce095129d0cc294100d71a2e6e66ac
< x-runtime: 0.01294
< x-transaction: 6fad295009a89877
< x-ua-compatible: IE=edge,chrome=1
< x-xss-protection: 1; mode=block
<
* Connection #0 to host api.twitter.com left intact
* Closing connection #0
* SSLv3, TLS alert, Client hello (1):
Failed to validate oauth signature and token
Weirdly, if I remove the oauth_callback key from the Authorization header, it works fine and I get the tokens. However, the API docs suggest that this parameter is required. Is there something wrong with how I'm passing in the oauth_callback item?
I've tried setting it to oob (which is supposed to work for "out of band" access). And I've copied the encoded URL from the API docs. It doesn't work with either.
Since it works without oauth_callback, it's not a clock issue on my machine, which is a commonly reported cause. I haven't tried sending the Net::Twitter request without the callback (I haven't checked if that's possible), but I assume that would fix it there too. However, I do need the request to have a valid callback URL, or the user won't be redirected back to the application for the rest of the sign-in flow.
Adam,
I was having a similar issue, and after a thorough investigation of what other Twitter API wrapper libraries were doing, I discovered that the oauth_callback needed to be encoded twice.
On doing a retrospective search to get a little more explanation/clarity, I found the following SO answer, Twitter Oauth URL encoding inconsistencies?, which extremely succinctly explained the issue I was having.
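For what it's worth, here is a minimal sketch of what "encoded twice" means in practice (PHP for brevity; the callback URL is made up, and whether your wrapper wants the pre-encoded value depends on how it builds the signature):
<?php
// Illustrative only: percent-encode the callback twice so the value survives
// the library's own encoding pass. The URL below is a made-up example.
$callback     = 'https://example.com/auth/twitter/callback';
$onceEncoded  = rawurlencode($callback);     // https%3A%2F%2Fexample.com%2F...
$twiceEncoded = rawurlencode($onceEncoded);  // https%253A%252F%252Fexample.com%252F...
echo $twiceEncoded, "\n";
?>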
Is this perhaps the issue you have encountered?

Google oAuth2 unauthorized_client by refresh_token

I'm trying to use a refresh token from the OAuth2 web redirect auth in my console Perl script. The client ID is the same (and correct) client ID I used in my JavaScript, and I have checked five times that it is the same as the one in my Google APIs Console.
The client secret is checked too, and it is correct.
The refresh token was created with approval_prompt=force&access_type=offline
Here is the Perl sample code I use:
# -----------------------------------------------------------------------------------
my $CLIENT_ID = 'XXXXX.apps.googleusercontent.com';
my $CLIENT_SECRET = 'YYYYYYYYYYY';
# -----------------------------------------------------------------------------------
# TESTING
my $refresh_token = '1/is_5_minutes_old';
# -----------------------------------------------------------------------------------
my $string = '';
$string .= 'grant_type=refresh_token';
$string .= '&client_id=' . $CLIENT_ID;
$string .= '&client_secret=' . $CLIENT_SECRET;
$string .= '&refresh_token=' . $refresh_token;
$ua = LWP::UserAgent->new;
my $req =
HTTP::Request->new( POST => 'https://accounts.google.com/o/oauth2/token' );
$req->content_type('application/x-www-form-urlencoded');
$req->content($string);
print $string . "\n";
my $res = $ua->request($req);
print $res->as_string;
The response of it:
HTTP/1.1 400 Bad Request
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Connection: close
Date: Mon, 02 Sep 2013 10:50:26 GMT
Pragma: no-cache
Server: GSE
Content-Type: application/json
Expires: Fri, 01 Jan 1990 00:00:00 GMT
Alternate-Protocol: 443:quic
Client-Date: Mon, 02 Sep 2013 10:50:26 GMT
Client-Peer: 74.125.136.84:443
Client-Response-Num: 1
Client-SSL-Cert-Issuer: /C=US/O=Google Inc/CN=Google Internet Authority G2
Client-SSL-Cert-Subject: /C=US/ST=California/L=Mountain View/O=Google Inc/CN=accounts.google.com
Client-SSL-Cipher: RC4-SHA
Client-SSL-Warning: Peer certificate not verified
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
{
"error" : "unauthorized_client"
}
I hope you have an idea to help.
Greetings
Invalid client usually means that the client ID and client secret don't match, or there is a typo in one of them (though you mention you've double checked this!). Nothing in your code looks wrong.
When you retrieve the refresh token, could you try putting the access token that comes along with it into the tokeninfo endpoint and making sure the values for client ID there match the ones you've configured with: https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=
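As a quick sketch of that check (PHP for brevity; the token below is a placeholder), the tokeninfo response reports which client the token was issued to:
<?php
// Placeholder: use the access_token that was returned alongside the refresh token.
$accessToken = 'ya29.EXAMPLE';
$info = json_decode(file_get_contents(
    'https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=' . urlencode($accessToken)
), true);
// The issued_to / audience fields should match the client ID used in the script.
print_r($info);
?>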
It might be worth dumping out the request to make sure there isn't a mistake in that (e.g. too short a content-length header or similar).
Dumping the request is key. To help, here is one I baked earlier:
==POST==
https://accounts.google.com/o/oauth2/token
refresh_token=1/_PEzU2m71wertwertwerJUtrtrytrytryf3trytryoCo
&client_id=612222222225
&client_secret=Q7334534543534yKLu
&grant_type=refresh_token
Are you using the short form of the client ID, i.e. just the number?

Paypal IPN callback stopped working in sandbox environment

I'm having a problem with the PayPal IPN callback: it has stopped working in the sandbox environment.
I've been testing my client's website for the past weeks, and it has always worked correctly - the payment was made, and an IPN callback was sent back to the website, confirming the payment and updating the website's database.
I haven't changed anything in my code, and it suddenly stopped working. The payment is still made and saved in the PayPal account, but the IPN is always retrying... it never completes.
Here's the code in use:
<?php
// STEP 1: read POST data
// Reading POSTed data directly from $_POST causes serialization issues with array data in the POST.
// Instead, read raw POST data from the input stream.
$raw_post_data = file_get_contents('php://input');
$raw_post_array = explode('&', $raw_post_data);
$myPost = array();
foreach ($raw_post_array as $keyval) {
$keyval = explode ('=', $keyval);
if (count($keyval) == 2)
$myPost[$keyval[0]] = urldecode($keyval[1]);
}
// read the IPN message sent from PayPal and prepend 'cmd=_notify-validate'
$req = 'cmd=_notify-validate';
if(function_exists('get_magic_quotes_gpc')) {
$get_magic_quotes_exists = true;
}
foreach ($myPost as $key => $value) {
if($get_magic_quotes_exists == true && get_magic_quotes_gpc() == 1) {
$value = urlencode(stripslashes($value));
} else {
$value = urlencode($value);
}
$req .= "&$key=$value";
}
// STEP 2: POST IPN data back to PayPal to validate
$ch = curl_init('https://www.sandbox.paypal.com/cgi-bin/webscr');
curl_setopt($ch, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER,1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $req);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_FORBID_REUSE, 1);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Connection: Close'));
// In wamp-like environments that do not come bundled with root authority certificates,
// please download 'cacert.pem' from [link removed] and set
// the directory path of the certificate as shown below:
// curl_setopt($ch, CURLOPT_CAINFO, dirname(__FILE__) . '/cacert.pem');
if( !($res = curl_exec($ch)) ) {
// error_log("Got " . curl_error($ch) . " when processing IPN data");
curl_close($ch);
exit;
}
curl_close($ch);
……
?>
Firing some outputs to text files, I discovered that it passes the first step and stops at the second, before
if( !($res = curl_exec($ch)) ) {
I've already submitted three help requests to PayPal, but still didn't get an answer from them.
The cURL request initiates an HTTP POST back to PayPal in order to validate the IPN message. If it stops at this step, it means you do receive the IPN POST from PayPal, but you've got a problem connecting back to PayPal.
What does the (currently commented-out) error_log("Got " . curl_error($ch) . " when processing IPN data"); line report once you enable it?
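As a sketch of how that might look once enabled (the log path is just an example, and this is a drop-in for the existing if (!($res = curl_exec($ch))) block):
// Sketch: record why the verification POST back to PayPal failed.
// '/tmp/ipn_errors.log' is an example path; use any writable location.
if (!($res = curl_exec($ch))) {
    error_log(date('c') . ' IPN verify failed: [' . curl_errno($ch) . '] ' . curl_error($ch) . "\n",
        3, '/tmp/ipn_errors.log');
    curl_close($ch);
    exit;
}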
In order to resolve this, test the HTTPS connection from your server to the following addresses:
https://www.paypal.com/cgi-bin/webscr?cmd=_notify-validate
https://www.sandbox.paypal.com/cgi-bin/webscr?cmd=_notify-validate
For example, you can run cURL directly from your server (assuming you've got SSH access):
curl -v https://www.sandbox.paypal.com/cgi-bin/webscr?cmd=_notify-validate
This should return something similar to the following:
$ curl -v https://www.sandbox.paypal.com/cgi-bin/webscr?cmd=_notify-validate
* About to connect() to www.sandbox.paypal.com port 443 (#0)
* Trying 173.0.82.77...
* connected
* Connected to www.sandbox.paypal.com (173.0.82.77) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: /usr/ssl/certs/ca-bundle.crt
CApath: none
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS handshake, Server finished (14):
* SSLv3, TLS handshake, Client key exchange (16):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSL connection using AES256-SHA
* Server certificate:
* subject: 1.3.6.1.4.1.311.60.2.1.3=US; 1.3.6.1.4.1.311.60.2.1.2=Delaware; businessCategory=Private Organization; serialNumber=3014267; C=US; postalCode=95131-2021; ST=California; L=San Jose; street=2211 N 1st St; O=PayPal, Inc.; OU=PayPal Production; CN=www.sandbox.paypal.com
* start date: 2011-09-01 00:00:00 GMT
* expire date: 2013-09-30 23:59:59 GMT
* common name: www.sandbox.paypal.com (matched)
* issuer: C=US; O=VeriSign, Inc.; OU=VeriSign Trust Network; OU=Terms of use at https://www.verisign.com/rpa (c)06; CN=VeriSign Class 3 Extended Validation SSL CA
* SSL certificate verify ok.
> GET /cgi-bin/webscr?cmd=_notify-validate HTTP/1.1
> User-Agent: curl/7.28.1
> Host: www.sandbox.paypal.com
> Accept: */*
>
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 200 OK
< Date: Wed, 14 Aug 2013 20:15:53 GMT
< Server: Apache
< X-Frame-Options: SAMEORIGIN
< Set-Cookie: xxxxx
< X-Cnection: close
< Set-Cookie: xxxx domain=.paypal.com; path=/; Secure; HttpOnly
< Set-Cookie: Apache=10.72.128.11.1376511353229960; path=/; expires=Fri, 07-Aug-43 20:15:53 GMT
< Vary: Accept-Encoding
< Strict-Transport-Security: max-age=14400
< Transfer-Encoding: chunked
< Content-Type: text/html; charset=UTF-8
<
* Connection #0 to host www.sandbox.paypal.com left intact
INVALID
* Closing connection #0
* SSLv3, TLS alert, Client hello (1):
If you receive a timeout or an SSL handshake error, you will need to investigate this separately.