New-WebServiceProxy failing to authenticate with NTLM - PowerShell

I'm dealing with a rather peculiar issue. We have a need to hit the Lists service on our SharePoint farm. Web authentication is federated through an Oracle SSO, but we do have accounts configured for automation that can perform web requests. Using AAM, we have an "internal" URL configured for server-side automation that goes straight to AD; everything else gets pushed to the SSO.
Here's the code (sanitized) that I'm using to try to get the list collection.
$username = "DOMAIN\username"
$password = "somepassword"
$site = "https://sp.biz.com/sites/SiteCollection"
$credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username, (ConvertTo-SecureString $password -AsPlainText -Force)
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credentials $credentials
$proxy.GetListCollection()
I'm hit with a 403 when I use that code.
Exception calling "GetListCollection" with "0" argument(s): "Server was unable to process request. ---> Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))"
If I change $site to use the internal URL (set via AAM) and run that on one of the front ends, I receive the list collection successfully. At first I thought there was an issue with the account and permissions, but after running a Fiddler capture I can see that it isn't authenticating at all.
When I run the following cURL command, it authenticates and returns the list collection. soap.xml is just the basic GetListCollection packet copied straight from the WSDL.
curl -v -u 'username':'pass' --ntlm -X POST -H "Content-Type: text/xml" --data-binary @soap.xml https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx
Here's the sanitized verbose output from cURL.
* STATE: INIT => CONNECT handle 0x600056190; line 1029 (connection #-5000)
* Hostname was NOT found in DNS cache
* Trying <IPv6>...
* STATE: CONNECT => WAITCONNECT handle 0x600056190; line 1082 (connection #0)
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Connected to sp.biz.com (<IPv6>) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: /usr/ssl/certs/ca-bundle.crt
CApath: none
* SSLv3, TLS handshake, Client hello (1):
} [data not shown]
* STATE: WAITCONNECT => PROTOCONNECT handle 0x600056190; line 1222 (connection #0)
* SSLv3, TLS handshake, Server hello (2):
{ [data not shown]
* SSLv3, TLS handshake, CERT (11):
{ [data not shown]
* SSLv3, TLS handshake, Server finished (14):
{ [data not shown]
* SSLv3, TLS handshake, Client key exchange (16):
} [data not shown]
* SSLv3, TLS change cipher, Client hello (1):
} [data not shown]
* SSLv3, TLS handshake, Finished (20):
} [data not shown]
* SSLv3, TLS change cipher, Client hello (1):
{ [data not shown]
* SSLv3, TLS handshake, Finished (20):
{ [data not shown]
* SSL connection using TLSv1.2 / DES-CBC3-SHA
* SSL certificate verify ok.
* STATE: PROTOCONNECT => DO handle 0x600056190; line 1241 (connection #0)
* Server auth using NTLM with user 'DOMAIN\username'
> POST /sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
> Authorization: NTLM <snip>
> User-Agent: curl/7.39.0
> Host: sp.biz.com
> Accept: */*
> Content-Type: text/xml
> Content-Length: 0
>
* STATE: DO => DO_DONE handle 0x600056190; line 1314 (connection #0)
* STATE: DO_DONE => WAITPERFORM handle 0x600056190; line 1441 (connection #0)
* STATE: WAITPERFORM => PERFORM handle 0x600056190; line 1454 (connection #0)
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 401 Unauthorized
* Server Microsoft-IIS/7.5 is not blacklisted
< Server: Microsoft-IIS/7.5
< SPRequestGuid: <snip>
< WWW-Authenticate: NTLM <snip>
< X-Powered-By: ASP.NET
< MicrosoftSharePointTeamServices: 14.0.0.7006
< X-MS-InvokeApp: 1; RequireReadOnly
< Date: Fri, 16 Jan 2015 01:02:56 GMT
< Content-Length: 0
< Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 01:02:56 GMT; path=/
<
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
* Connection #0 to host sp.biz.com left intact
* Issue another request to this URL: 'https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx'
* STATE: PERFORM => CONNECT handle 0x600056190; line 1601 (connection #-5000)
* Found bundle for host sp.biz.com: 0x60006aef0
* Re-using existing connection! (#0) with host sp.biz.com
* Connected to sp.biz.com (<IPv6>) port 443 (#0)
* STATE: CONNECT => DO handle 0x600056190; line 1075 (connection #0)
* Server auth using NTLM with user 'DOMAIN\username'
> POST /sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
> Authorization: NTLM <snip>
> User-Agent: curl/7.39.0
> Host: sp.biz.com
> Accept: */*
> Content-Type: text/xml
> Content-Length: 353
>
} [data not shown]
* upload completely sent off: 353 out of 353 bytes
* STATE: DO => DO_DONE handle 0x600056190; line 1314 (connection #0)
* STATE: DO_DONE => WAITPERFORM handle 0x600056190; line 1441 (connection #0)
* STATE: WAITPERFORM => PERFORM handle 0x600056190; line 1454 (connection #0)
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 200 OK
< Cache-Control: private, max-age=0
< Content-Type: text/xml; charset=utf-8
* Server Microsoft-IIS/7.5 is not blacklisted
< Server: Microsoft-IIS/7.5
< SPRequestGuid: <snip>
< Set-Cookie: FedAuth=<snip>; expires=Fri, 16-Jan-2015 08:36:07 GMT; path=/; secure; HttpOnly
< X-SharePointHealthScore: 0
< X-AspNet-Version: 2.0.50727
< Persistent-Auth: true
< X-Powered-By: ASP.NET
< MicrosoftSharePointTeamServices: 14.0.0.7006
< X-MS-InvokeApp: 1; RequireReadOnly
< Date: Fri, 16 Jan 2015 01:02:56 GMT
< Content-Length: 104088
< Vary: Accept-Encoding
<
{ [data not shown]
* STATE: PERFORM => DONE handle 0x600056190; line 1626 (connection #0)
100 101k 100 101k 100 353 219k 762 --:--:-- --:--:-- --:--:-- 219k
* Connection #0 to host sp.biz.com left intact
Any assistance is greatly appreciated. I'm not opposed to a C# solution over PowerShell if the cmdlets are lacking.
01-16-2015 12:13PM EST Update - I updated the question to reflect HighlyUnavailable's suggestion and included headers from the Fiddler capture.
Here are the sanitized headers from the PowerShell script:
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:46.372
Connection: close
------------------------------------------------------------------
GET https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 OK
Cache-Control: private, max-age=0
Content-Type: text/html; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: keep-alive
Content-Length: 9066
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:14:46 GMT; path=/
Vary: Accept-Encoding
------------------------------------------------------------------
GET https://sp.biz.com/_vti_bin/Lists.asmx?disco HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: close
Content-Length: 747
------------------------------------------------------------------
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:47.505
Connection: close
------------------------------------------------------------------
GET https://sp.biz.com/_vti_bin/Lists.asmx?wsdl HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Host: sp.biz.com
HTTP/1.1 200 OK
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:46 GMT
Connection: close
Content-Length: 72672
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:14:47 GMT; path=/
Vary: Accept-Encoding
------------------------------------------------------------------
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com
Connection: Keep-Alive
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:14:48.727
Connection: close
------------------------------------------------------------------
POST https://sp.biz.com/_vti_bin/Lists.asmx HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 2.0.50727.5485)
Content-Type: text/xml; charset=utf-8
SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/GetListCollection"
Host: sp.biz.com
Content-Length: 321
Expect: 100-continue
HTTP/1.1 500 Internal Server Error
Cache-Control: private
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:14:48 GMT
Content-Length: 459
Set-Cookie: BIGipServerserver_pool=686493706.47873.0000; expires=Sat, 17-Jan-2015 17:14:48 GMT; path=/
------------------------------------------------------------------
Here are the headers for the cURL command.
CONNECT sp.biz.com:443 HTTP/1.1
Host: sp.biz.com:443
User-Agent: curl/7.39.0
Connection: Keep-Alive
Content-Type: text/xml
HTTP/1.1 200 Connection Established
FiddlerGateway: Direct
StartTime: 12:21:07.928
Connection: close
------------------------------------------------------------------
POST https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
Authorization: NTLM <snip>=
User-Agent: curl/7.39.0
Host: sp.biz.com
Accept: */*
Content-Type: text/xml
Content-Length: 0
HTTP/1.1 401 Unauthorized
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
WWW-Authenticate: NTLM <snip>
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:21:07 GMT
Content-Length: 0
Set-Cookie: BIGipServerserver_pool=<snip>; expires=Sat, 17-Jan-2015 17:21:07 GMT; path=/
Proxy-Support: Session-Based-Authentication
------------------------------------------------------------------
POST https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx HTTP/1.1
Authorization: NTLM <snip>
User-Agent: curl/7.39.0
Host: sp.biz.com
Accept: */*
Content-Type: text/xml
Content-Length: 417
HTTP/1.1 200 OK
Cache-Control: private, max-age=0
Content-Type: text/xml; charset=utf-8
Server: Microsoft-IIS/7.5
SPRequestGuid: <snip>
Set-Cookie: FedAuth=<snip>; expires=Sat, 17-Jan-2015 03:20:50 GMT; path=/; secure; HttpOnly
X-SharePointHealthScore: 0
X-AspNet-Version: 2.0.50727
Persistent-Auth: true
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7006
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 16 Jan 2015 17:21:07 GMT
Content-Length: 66628
Vary: Accept-Encoding
------------------------------------------------------------------

You're mixing two fundamentally different techniques here.
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -UseDefaultCredential
$proxy.PreAuthenticate = $TRUE
$proxy.Credentials = $credentials
UseDefaultCredential will attempt to pass your currently logged-in Windows domain user to the site. However, you're setting Credentials as well. Normally you would use -Credential $credentials (see http://technet.microsoft.com/en-us/library/hh849841.aspx).
The curl command you're running is more akin to using -Credential: its -u option is the equivalent.
Try using something like $proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credential $credentials instead.
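Putting that together, a minimal sketch (reusing the $site, $username, and $password values from your question; the PreAuthenticate line is optional and just forces the NTLM handshake up front) would be:
$credentials = New-Object System.Management.Automation.PSCredential ($username, (ConvertTo-SecureString $password -AsPlainText -Force))
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credential $credentials
$proxy.PreAuthenticate = $true
$proxy.GetListCollection()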
If that doesn't work, please edit your question to include the headers being returned from the Oracle SSO connection - it could be that it simply isn't even asking for credentials.

I never did find a solution for this, but I can explain why it fails. In our environment we use Forms-Based Authentication against our Oracle Identity Foundation SSO with SAML v1.1.
When you attempt to authenticate, the farm redirects you to the SSO, but the client is trying to use NTLM against the actual web front ends rather than the SSO. To make this work, you need to include the X-FORMS_BASED_AUTH_ACCEPTED: f header in the request so that it actually authenticates with NTLM against the WFE (and not the SSO).
Here's the catch: you can't add headers to New-WebServiceProxy in PowerShell (up to 4.0 -- I haven't rolled out 5 yet). The only recommendation I can make for others hitting this is to follow HighlyUnavailable's suggestions, or to use Invoke-WebRequest and build your SOAP calls by hand.
The remaining snag is that Invoke-WebRequest can chew up your encoding, so here's how I've worked around it. If anybody has a suggestion for a cleaner way around the encoding issue, I'm all ears.
# Set your credentials here.
$UserName = 'BartSimpson'
$Password = '3atmMySh0rtz!'
$Domain = 'SF'
$SecurePassword = ConvertTo-SecureString -String $Password -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential (($Domain + "\" + $UserName), $SecurePassword)
# Lists.asmx endpoint (the sanitized URL from the question)
$URL = "https://sp.biz.com/sites/SiteCollection/_vti_bin/Lists.asmx"
# SOAP request headers and body
$BaseHeaders = @{"X-FORMS_BASED_AUTH_ACCEPTED" = 'f';
"SOAPAction" = "`"http://schemas.microsoft.com/sharepoint/soap/GetListCollection`"";
"Content-Type" = "text/xml; charset=utf-8"}
$SOAP = @"
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Body>
<GetListCollection xmlns="http://schemas.microsoft.com/sharepoint/soap/" />
</soap:Body>
</soap:Envelope>
"#
# Gives us a random temp file to pipe output to
$TmpFile = [System.IO.Path]::GetTempFileName()
Invoke-WebRequest -Uri $URL -Headers $BaseHeaders -Credential $Credentials -Method POST -Body $SOAP -OutFile $TmpFile
# Get the outfile with UTF8 encoding
[xml]$Result = Get-Content -Raw -Path $TmpFile -Encoding UTF8
# Remove the temporary file
Remove-Item $TmpFile
Seems like a long way to go, which it is, but it works if you insist on using PowerShell.
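For what it's worth, once $Result is populated, pulling data back out is plain XML navigation. A rough sketch of listing the list titles, assuming the standard GetListCollection response shape (the element names come from the Lists.asmx SOAP response, nothing specific to this farm):
$Result.Envelope.Body.GetListCollectionResponse.GetListCollectionResult.Lists.List |
    Select-Object -ExpandProperty Title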
I switched to python-suds and was able to do what I needed to.

In the above code there is an extra 'S' in -Credentials; the parameter name is -Credential.
So replace this line:
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credentials $credentials
with:
$proxy = New-WebServiceProxy -Uri "$site/_vti_bin/Lists.asmx" -Credential $credentials

Related

Active-Active minio cluster sync data

https://docs.min.io/minio/baremetal/replication/enable-server-side-two-way-bucket-replication.html#required-permissions
I'm following this guide. When I run "mc admin policy add", I get an error:
mc: <ERROR> Unable to add new policy: conditions are not supported for action s3:GetBucketVersioning.
Then I added '--debug':
mc: <DEBUG> PUT /minio/admin/v3/add-canned-policy?name=ReplicationRemoteUserPolicy.json HTTP/1.1
Host: xxxxx:xxx
User-Agent: MinIO (linux; amd64) madmin-go/0.0.1 mc/RELEASE.2022-07-06T14-54-36Z
Content-Length: 1328
Accept-Encoding: gzip
Authorization: AWS4-HMAC-SHA256 Credential=admin/20220707//s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=**REDACTED**
X-Amz-Content-Sha256: 26a5f72146edcd356b967fb84a6b1407418205af1904f9408fd2e85b196c98d1
X-Amz-Date: 20220707T094030Z
mc: <DEBUG> HTTP/1.1 400 Bad Request
Content-Length: 237
Accept-Ranges: bytes
Content-Security-Policy: block-all-mixed-content
Content-Type: application/json
Date: Thu, 07 Jul 2022 09:40:19 GMT
Server: MinIO
Vary: Origin
X-Amz-Request-Id: 16FF82A1BB2425B5
X-Xss-Protection: 1; mode=block
{"Code":"XMinioMalformedIAMPolicy","Message":"conditions are not supported for action s3:GetBucketVersioning","Resource":"/minio/admin/v3/add-canned-policy","RequestId":"16FF82A1BB2425B5","HostId":"3473e3d7-6fef-4358-83e7-f7e333eb8675"}
mc: <DEBUG> Response Time: 3.844982ms
---------START-HTTP---------
PUT /minio/admin/v3/add-canned-policy?name=ReplicationRemoteUserPolicy.json HTTP/1.1
Host: xxxxx:xxxx
User-Agent: MinIO (linux; amd64) madmin-go/0.0.1 mc/RELEASE.2022-07-06T14-54-36Z
Content-Length: 1328
Accept-Encoding: gzip
Authorization: AWS4-HMAC-SHA256 Credential=admin/20220707//s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=**REDACTED**
X-Amz-Content-Sha256: 26a5f72146edcd356b967fb84a6b1407418205af1904f9408fd2e85b196c98d1
X-Amz-Date: 20220707T094030Z
HTTP/1.1 400 Bad Request
Content-Length: 237
Accept-Ranges: bytes
Content-Security-Policy: block-all-mixed-content
Content-Type: application/json
Date: Thu, 07 Jul 2022 09:40:19 GMT
Server: MinIO
Vary: Origin
X-Amz-Request-Id: 16FF82A1BB2425B5
X-Xss-Protection: 1; mode=block
{"Code":"XMinioMalformedIAMPolicy","Message":"conditions are not supported for action s3:GetBucketVersioning","Resource":"/minio/admin/v3/add-canned-policy","RequestId":"16FF82A1BB2425B5","HostId":"3473e3d7-6fef-4358-83e7-f7e333eb8675"}
---------END-HTTP---------
mc: <ERROR> Unable to add new policy: conditions are not supported for action s3:GetBucketVersioning
(1) admin-policy-add.go:140 cmd.mainAdminPolicyAdd(..) Tags: [cluster202, ReplicationRemoteUserPolicy.json, /dev/stdin]
(0) admin-policy-add.go:140 cmd.mainAdminPolicyAdd(..)
Commit:81c4a5ad6ee4 | Release-Tag:RELEASE.2022-07-06T14-54-36Z | Host:clone-instance-testv3 | OS:linux | Arch:amd64 | Lang:go1.18.3 | Mem:3.3 MB/17 MB | Heap:3.3 MB/7.7 MB.
How can I fix this?
I installed the MinIO cluster with a Helm chart; the Docker image tag is RELEASE.2021-02-14T04-01-33Z.

401 error after successful login using browser

Using my browser I point to a URL and I am prompted with a username/password dialog. I enter my username/password and I get my webpage.
I get a 401 error, however, when using curl:
curl --anyauth --user "$USERNAME:$PASSWORD" $URL
wget:
wget --http-user=$USERNAME --http-password=$PASSWORD $URL
Python:
response = requests.get(url, auth=requests.auth.HTTPBasicAuth(username, password))
response = requests.get(url, auth=requests.auth.HTTPDigestAuth(username, password))
The sanitized verbose output for curl is below:
* About to connect() to application.intranet.net port 443 (#0)
* Trying 10.10.10.139...
* Connected to application.intranet.net (10.10.10.139) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA
* Server certificate:
* subject: CN=application.intranet.net,OU=COMPANY - Web Hosting,O=Com Pany Inc.,STREET=address,L=city,ST=state,postalCode=12345,C=US
* start date: Apr 06 00:00:00 2020 GMT
* expire date: Apr 06 23:59:59 2022 GMT
* common name: application.intranet.net
* issuer: CN=COMODO RSA Organization Validation Secure Server CA,O=COMODO CA Limited,L=Salford,ST=Greater Manchester,C=GB
> GET /appname/Reporting/ReportListStart.aspx HTTP/1.1
> User-Agent: curl/7.29.0
> Host: application.intranet.net
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Cache-Control: private
< Content-Type: text/html
< Server: application Server
< WWW-Authenticate: Negotiate
< WWW-Authenticate: NTLM
< X-Frame-Options: SAMEORIGIN
< X-Content-Type-Options: nosniff
< Date: Wed, 23 Dec 2020 16:17:22 GMT
< Content-Length: 1293
<
* Ignoring the response-body
* Connection #0 to host application.intranet.net left intact
* Issue another request to this URL: 'https://application.intranet.net/appname/Reporting/ReportListStart.aspx'
* Found bundle for host application.intranet.net: 0x1f4b050
* Re-using existing connection! (#0) with host application.intranet.net
* Connected to application.intranet.net (10.10.10.139) port 443 (#0)
> GET /appname/Reporting/ReportListStart.aspx HTTP/1.1
> User-Agent: curl/7.29.0
> Host: application.intranet.net
> Accept: */*
>
< HTTP/1.1 401 Unauthorized
< Cache-Control: private
< Content-Type: text/html
< Server: application Server
* gss_init_sec_context() failed: : No Kerberos credentials available (default cache: KEYRING:persistent:9013)
< WWW-Authenticate: Negotiate
< WWW-Authenticate: NTLM
< X-Frame-Options: SAMEORIGIN
< X-Content-Type-Options: nosniff
< Date: Wed, 23 Dec 2020 16:17:22 GMT
< Content-Length: 1293
<
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1"/>
<title>401 - Unauthorized: Access is denied due to invalid credentials.</title>
<style type="text/css">
<!--
body{margin:0;font-size:.7em;font-family:Verdana, Arial, Helvetica, sans-serif;background:#EEEEEE;}
fieldset{padding:0 15px 10px 15px;}
h1{font-size:2.4em;margin:0;color:#FFF;}
h2{font-size:1.7em;margin:0;color:#CC0000;}
h3{font-size:1.2em;margin:10px 0 0 0;color:#000000;}
#header{width:96%;margin:0 0 0 0;padding:6px 2% 6px 2%;font-family:"trebuchet MS", Verdana, sans-serif;color:#FFF;
background-color:#555555;}
#content{margin:0 0 0 2%;position:relative;}
.content-container{background:#FFF;width:96%;margin-top:8px;padding:10px;position:relative;}
-->
</style>
</head>
<body>
<div id="header"><h1>Server Error</h1></div>
<div id="content">
<div class="content-container"><fieldset>
<h2>401 - Unauthorized: Access is denied due to invalid credentials.</h2>
<h3>You do not have permission to view this directory or page using the credentials that you supplied.</h3>
</fieldset></div>
</div>
</body>
</html>
* Connection #0 to host application.intranet.net left intact
wget:
--2020-12-23 11:18:14-- https://application.intranet.net/appname/Reporting/ReportListStart.aspx
Resolving application.intranet.net (application.intranet.net)... 10.10.10.139, 10.10.10.10
Connecting to application.intranet.net (application.intranet.net)|10.10.10.139|:443... connected.
HTTP request sent, awaiting response... 401 Unauthorized
Reusing existing connection to application.intranet.net:443.
HTTP request sent, awaiting response... 401 Unauthorized
Reusing existing connection to application.intranet.net:443.
HTTP request sent, awaiting response... 401 Unauthorized
Authorization failed.
Python:
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): application.intranet.net:443
send: b'GET /appname/Reporting/ReportListStart.aspx HTTP/1.1\r\nHost: application.intranet.net\r\nUser-Agent: python-requests/2.25.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\nAuthorization: Basic U19KaXJhX0ludGVybmFsQXVkaXQ6R2l4X0lLdzFqTEYtMld0cw==\r\n\r\n'
reply: 'HTTP/1.1 401 Unauthorized\r\n'
header: Cache-Control: private
header: Content-Type: text/html
header: Server: application Server
header: WWW-Authenticate: Negotiate
header: WWW-Authenticate: NTLM
header: X-Frame-Options: SAMEORIGIN
header: X-Content-Type-Options: nosniff
header: Date: Wed, 23 Dec 2020 17:01:10 GMT
header: Content-Length: 1293
DEBUG:urllib3.connectionpool:https://application.intranet.net:443 "GET /appname/Reporting/ReportListStart.aspx HTTP/1.1" 401 1293
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): application.intranet.net:443
send: b'GET /appname/Reporting/ReportListStart.aspx HTTP/1.1\r\nHost: application.intranet.net\r\nUser-Agent: python-requests/2.25.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
reply: 'HTTP/1.1 401 Unauthorized\r\n'
header: Cache-Control: private
header: Content-Type: text/html
header: Server: application Server
header: WWW-Authenticate: Negotiate
header: WWW-Authenticate: NTLM
header: X-Frame-Options: SAMEORIGIN
header: X-Content-Type-Options: nosniff
header: Date: Wed, 23 Dec 2020 17:01:10 GMT
header: Content-Length: 1293
DEBUG:urllib3.connectionpool:https://application.intranet.net:443 "GET /appname/Reporting/ReportListStart.aspx HTTP/1.1" 401 1293
From my browser there is the initial request that returns a 302:
Request URL: https://application.wuintranet.net/appname/Reporting/ReportListStart.aspx
Request Method: GET
Status Code: 302 Found
Remote Address: 10.10.10.123:443
Referrer Policy: strict-origin-when-cross-origin
Cache-Control: private
Content-Length: 160
Content-Type: text/html; charset=utf-8
Date: Wed, 23 Dec 2020 17:14:54 GMT
Location: /appname/Reporting/ReportListStart.aspx
Persistent-Auth: true
Server: application Server
Set-Cookie: ASP.NET_SessionId=dy2rr35onasw5ctumhuqb4af; path=/; secure; HttpOnly; SameSite=Lax
Set-Cookie: appname_Cookie=ConnectionTitle=DELwLGx+KbrtS0gKvmretg==&IsConnectionTitleSet=True&IsLogOff=False&CurrentOrganization=ELx658BVmiesDFQg7w5RtA==&IsOrganizationRequired=YBfC/taoB3Ll19UPqF9IEA==; path=/; secure; HttpOnly
Set-Cookie: .application_SSO_Cookie=ConnectionTitle=DELwLGx+KbrtS0gKvmretg==&IsConnectionTitleSet=True&IsLogOff=True&CurrentOrganization=ELx658BVmiesDFQg7w5RtA==&IsOrganizationRequired=YBfC/taoB3Ll19UPqF9IEA==; path=/; secure; HttpOnly
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cache-Control: max-age=0
Connection: keep-alive
Host: application.wuintranet.net
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: none
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36
and then the browser-generated followup that returns 200:
Request URL: https://application.wuintranet.net/appname/Reporting/ReportListStart.aspx
Request Method: GET
Status Code: 200 OK
Remote Address: 10.10.10.123:443
Referrer Policy: strict-origin-when-cross-origin
Cache-Control: private
Content-Encoding: gzip
Content-Length: 32914
Content-Type: text/html; charset=utf-8
Date: Wed, 23 Dec 2020 17:14:54 GMT
Persistent-Auth: true
Server: application Server
Set-Cookie: appname_Cookie=ConnectionTitle=DELwLGx+KbrtS0gKvmretg==&IsConnectionTitleSet=True&IsLogOff=False&CurrentOrganization=ELx658BVmiesDFQg7w5RtA==&IsOrganizationRequired=YBfC/taoB3Ll19UPqF9IEA==; path=/; secure; HttpOnly
Vary: Accept-Encoding
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Cache-Control: max-age=0
Connection: keep-alive
Cookie: ASP.NET_SessionId=dy2rr35onasw5ctumhuqb4af; appname_Cookie=ConnectionTitle=DELwLGx+KbrtS0gKvmretg==&IsConnectionTitleSet=True&IsLogOff=False&CurrentOrganization=ELx658BVmiesDFQg7w5RtA==&IsOrganizationRequired=YBfC/taoB3Ll19UPqF9IEA==; .application_SSO_Cookie=ConnectionTitle=DELwLGx+KbrtS0gKvmretg==&IsConnectionTitleSet=True&IsLogOff=True&CurrentOrganization=ELx658BVmiesDFQg7w5RtA==&IsOrganizationRequired=YBfC/taoB3Ll19UPqF9IEA==
Host: application.wuintranet.net
Sec-Fetch-Dest: document
Sec-Fetch-Mode: navigate
Sec-Fetch-Site: none
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36
Replace
curl --anyauth --user "$USERNAME:$PASSWORD" $URL
with
curl --ntlm --user "$USERNAME:$PASSWORD" $URL
With --anyauth, curl settles on Negotiate (the server advertises both Negotiate and NTLM), and that fails because no Kerberos ticket is available (note the gss_init_sec_context() error in your trace). Forcing NTLM avoids that.

Curl doesn't redirect properly for some facebook profile IDs

I'm trying to hack together a bit of code (C++) that returns the current Facebook display name for a given user ID. The code works perfectly fine for most IDs, but for some IDs curl returns a 404 even though the same URL opens fine in a browser. Here's the bit of code related to curl.
#include <curl/curl.h>
#include <iostream>
#include <string>
using namespace std;

string data; // response body filled in by the write callback
size_t writeCallback(char* buf, size_t size, size_t nmemb, void* up);

int main(int argc, char const *argv[]) {
    CURL* curl;
    char userAgent[] = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13";
    curl_global_init(CURL_GLOBAL_ALL);
    curl = curl_easy_init();
    string fbID = argv[1];
    string searchTerm = "https://www.facebook.com/profile.php?id=" + fbID;
    cout << "searching for: " << searchTerm << endl;
    if(!curl)
        return 0;
    else{
        curl_easy_setopt(curl, CURLOPT_USERAGENT, userAgent);
        curl_easy_setopt(curl, CURLOPT_URL, searchTerm.c_str());
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, &writeCallback);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
        curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L); // tell curl to output its progress
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
    }
    //scan the retrieved data for the name...
}

size_t writeCallback(char* buf, size_t size, size_t nmemb, void* up)
{ //i copied this function
    for (size_t c = 0; c < size*nmemb; c++)
    {
        data.push_back(buf[c]);
    }
    return size*nmemb;
}
I don't know exactly where to look, because it works for most IDs, but "1094145063", for example, returns a 404 even though https://www.facebook.com/profile.php?id=1094145063 opens fine in a browser (1331601579, for example, works fine with my code).
I can't spot the difference between the profiles.
It does, however, redirect to https://www.facebook.com/"username", which works fine once CURLOPT_FOLLOWLOCATION is set, but only for some profiles. Here's the message curl outputs.
* Hostname was NOT found in DNS cache
* Trying 31.13.84.36...
* Connected to www.facebook.com (31.13.84.36) port 443 (#0)
* found 174 certificates in /etc/ssl/certs/ca-certificates.crt
* server certificate verification OK
* common name: *.facebook.com (matched)
* server certificate expiration date OK
* server certificate activation date OK
* certificate public key: EC
* certificate version: #3
* subject: C=US,ST=California,L=Menlo Park,O=Facebook\, Inc.,CN=*.facebook.com
* start date: Fri, 09 Dec 2016 00:00:00 GMT
* expire date: Thu, 25 Jan 2018 12:00:00 GMT
* issuer: C=US,O=DigiCert Inc,OU=www.digicert.com,CN=DigiCert SHA2 High Assurance Server CA
* compression: NULL
* cipher: AES-128-GCM
* MAC: AEAD
> GET /profile.php?id=1094145063 HTTP/1.1
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13
Host: www.facebook.com
Accept: */*
HTTP/1.1 404 Not Found
X-XSS-Protection: 0
public-key-pins-report-only: max-age=500; pin-sha256="WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18="; pin-sha256="r/mIkG3eEpVdm+u/ko/cwxzOMo1bk4TyHIlByibiA5E="; pin-sha256="q4PO2G2cbkZhZ82+JgmRUyGMoAeozA+BSXVXQWB8XWQ="; report-uri="http://reports.fb.com/hpkp/"
Pragma: no-cache
content-security-policy: default-src * data: blob:;script-src *.facebook.com *.fbcdn.net *.facebook.net *.google-analytics.com *.virtualearth.net *.google.com 127.0.0.1:* *.spotilocal.com:* 'unsafe-inline' 'unsafe-eval' fbstatic-a.akamaihd.net fbcdn-static-b-a.akamaihd.net *.atlassolutions.com blob: data: 'self';style-src data: blob: 'unsafe-inline' *;connect-src *.facebook.com *.fbcdn.net *.facebook.net *.spotilocal.com:* *.akamaihd.net wss://*.facebook.com:* https://fb.scanandcleanlocal.com:* *.atlassolutions.com attachment.fbsbx.com ws://localhost:* blob: *.cdninstagram.com 'self';
Cache-Control: private, no-cache, no-store, must-revalidate
Strict-Transport-Security: max-age=15552000; preload
X-Content-Type-Options: nosniff
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Vary: Accept-Encoding
Content-Type: text/html; charset=UTF-8
X-FB-Debug: pA9SSTdWWx2a66QeP7Je/4ik/2a+/ZL/m/nckHKf+KoEZLloClzu+qMDzyE/B8M1PRDl4SdS19C9vIIl7f43mA==
Date: Tue, 06 Jun 2017 18:25:15 GMT
Transfer-Encoding: chunked
Connection: keep-alive
* Connection #0 to host www.facebook.com left intact
Any help would be appreciated.
(my first SO post, so please don't be too harsh if I did something dumb)

Spring Cloud Zuul: how can I return the service's HTTP status (or handle Zuul's status return)?

If I use Zuul routes for a Eureka service ID and the service returns an HTTP status such as 503:
curl 10.1.0.21:5701/usr/users/xxx/a -H "Api-Version: v10" -i
Returns:
HTTP/1.1 503 Service Unavailable
Set-Cookie: Session-Token=9e66fb56-21e2-457a-a00f-f788c5ce820b; path=/; domain=.0.21:5701; HttpOnly; Max-Age=2592000; Expires=Sat, 29-Oct-2016 09:23:41 GMT
Date: Thu, 29 Sep 2016 09:23:41 GMT
Session-Token-Expires: 2592000
Connection: keep-alive
ETag:
Session-Token: 9e66fb56-21e2-457a-a00f-f788c5ce820b
Transfer-Encoding: chunked
Content-Type: application/json;charset=UTF-8
X-Application-Context: user-api-provider:5701
Content-Language: en-
Access-Control-Max-Age: 1728000
[{"key":"serviceUnavailable","message":"无效的接口"}]
But Zuul returns 200 OK. Why? How can I change how Zuul handles these statuses?
curl 10.1.0.19:8090/usr/users/xxx/a -H "Api-Version: v10" -i
Returns:
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 0
X-Application-Context: zuul-server:8090
Date: Thu, 29 Sep 2016 09:24:10 GMT
This works for me with the latest spring-cloud-netflix (1.2.0):
@SpringBootApplication
@EnableZuulProxy
public class SimpleApplication {
public static void main(String[] args) {
SpringApplication.run(SimpleApplication.class, args);
}
}
with application.properties:
zuul.routes.test.url=http://httpstat.us/503
zuul.routes.test.path=/test/**
and:
$ curl localhost:8080/test/
> GET /test/ HTTP/1.1
> User-Agent: curl/7.35.0
> Host: localhost:8080
> Accept: */*
>
< HTTP/1.1 503
< X-Application-Context: application
< Cache-Control: private
< X-AspNetMvc-Version: 5.1
< X-AspNet-Version: 4.0.30319
< X-Powered-By: ASP.NET
< Access-Control-Allow-Origin: *
< Date: Thu, 29 Sep 2016 10:24:16 GMT
< Content-Type: text/plain;charset=utf-8
< Transfer-Encoding: chunked
< Connection: close
<
* Closing connection 0
503 Service Unavailable$

401 Authorization Required: Failed to validate oauth signature and token

I've tried using Net::Twitter::Role::OAuth to add Sign in with Twitter to my application.
I've used this successfully in the past, but not with SSL enabled, which apparently is now required by the Twitter API. I have a controller action very similar to the examples in the synopsis:
sub twitter_authorize : Local {
my($self, $c) = @_;
my $nt = Net::Twitter->new(traits => [qw/API::RESTv1_1 OAuth/], %param);
my $url = $nt->get_authorization_url(callback => $callbackurl);
$c->response->cookies->{oauth} = {
value => {
token => $nt->request_token,
token_secret => $nt->request_token_secret,
},
};
$c->response->redirect($url);
}
However, this fails at the $nt->get_authorization_url() call with a 401 Unauthorized error.
Looking at the oauth/request_token docs, I tried running the request through cURL, as follows:
curl --request 'POST' 'https://api.twitter.com/oauth/request_token' --header 'Authorization: OAuth oauth_consumer_key="xxxx", oauth_nonce="xxxx", oauth_signature="xxxx", oauth_signature_method="HMAC-SHA1", oauth_timestamp="xxxx", oauth_callback="oob", oauth_version="1.0"' --verbose
And the response is as follows:
* About to connect() to api.twitter.com port 443 (#0)
* Trying 199.16.156.104...
connected
* Connected to api.twitter.com (199.16.156.104) port 443 (#0)
* successfully set certificate verify locations:
* CAfile: none
CApath: /etc/ssl/certs
* SSLv3, TLS handshake, Client hello (1):
* SSLv3, TLS handshake, Server hello (2):
* SSLv3, TLS handshake, CERT (11):
* SSLv3, TLS handshake, Server finished (14):
* SSLv3, TLS handshake, Client key exchange (16):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSLv3, TLS change cipher, Client hello (1):
* SSLv3, TLS handshake, Finished (20):
* SSL connection using AES128-SHA
* Server certificate:
* subject: C=US; ST=California; L=San Francisco; O=Twitter, Inc.; OU=Twitter Security; CN=api.twitter.com
* start date: 2014-08-03 00:00:00 GMT
* expire date: 2016-12-31 23:59:59 GMT
* subjectAltName: api.twitter.com matched
* issuer: C=US; O=VeriSign, Inc.; OU=VeriSign Trust Network; OU=Terms of use at https://www.verisign.com/rpa (c)10; CN=VeriSign Class 3 Secure Server CA - G3
* SSL certificate verify ok.
> POST /oauth/request_token HTTP/1.1
> User-Agent: curl/7.19.7 (x86_64-pc-linux-gnu) libcurl/7.19.7 OpenSSL/0.9.8k zlib/1.2.3.3 libidn/1.15
> Host: api.twitter.com
> Accept: */*
> Authorization: OAuth oauth_consumer_key="xxxx", oauth_nonce="xxxx", oauth_signature="xxxx", oauth_signature_method="HMAC-SHA1", oauth_timestamp="xxxx", oauth_callback="oob", oauth_version="1.0"
>
< HTTP/1.1 401 Authorization Required
< cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0
< content-length: 44
< content-security-policy-report-only: default-src https:; connect-src https:; font-src https: data:; frame-src https: http://*.twimg.com http://itunes.apple.com about: javascript:; img-src https: data:; media-src https:; object-src https:; script-src 'unsafe-inline' 'unsafe-eval' about: https:; style-src 'unsafe-inline' https:; report-uri https://twitter.com/i/csp_report?a=NVXW433SMFUWY%3D%3D%3D&ro=true;
< content-type: text/html; charset=utf-8
< date: Tue, 21 Oct 2014 10:29:57 UTC
< expires: Tue, 31 Mar 1981 05:00:00 GMT
< last-modified: Tue, 21 Oct 2014 10:29:57 GMT
< pragma: no-cache
< server: tsa_b
< set-cookie: _twitter_sess=BAh7CDoPY3JlYXRlZF9hdGwrCD2PQTJJAToHaWQiJTE3M2Q4OWIyZWE1Nzc1%250AZmYxMjRkYmUyZDVjOTBlYjQxIgpmbGFzaElDOidBY3Rpb25Db250cm9sbGVy%250AOjpGbGFzaDo6Rmxhc2hIYXNoewAGOgpAdXNlZHsA--b807e4ebb8d45756e9686971b951a549d0d83b61; domain=.twitter.com; path=/; secure; HttpOnly
< set-cookie: guest_id=v1%3A141388739758201626; Domain=.twitter.com; Path=/; Expires=Thu, 20-Oct-2016 10:29:57 UTC
< status: 401 Unauthorized
< strict-transport-security: max-age=631138519
< vary: Accept-Encoding
< x-connection-hash: 54a185631d5f0b3a3a9dc46fe1f40a57
< x-content-type-options: nosniff
< x-frame-options: SAMEORIGIN
< x-mid: 0258025664ce095129d0cc294100d71a2e6e66ac
< x-runtime: 0.01294
< x-transaction: 6fad295009a89877
< x-ua-compatible: IE=edge,chrome=1
< x-xss-protection: 1; mode=block
<
* Connection #0 to host api.twitter.com left intact
* Closing connection #0
* SSLv3, TLS alert, Client hello (1):
Failed to validate oauth signature and token
Weirdly, if I remove the oauth_callback key from the Authorization header, it works fine and I get the tokens. However, the API docs suggest that this parameter is required. Is there something wrong with how I'm passing in the oauth_callback item?
I've tried setting it to oob (which is supposed to work for "out of band" access), and I've copied the encoded URL from the API docs; it doesn't work with either.
As it works without oauth_callback, it's not a time issue on my machine, as is a commonly reported problem. I haven't tried sending the Net::Twitter request without the callback (I haven't checked if that's possible) but I assume that would fix it there too. However, I do need the request to have a valid callback URL or the user won't be redirected back to the application for the rest of the sign in flow.
Adam,
I was having a similar issue, and after a thorough investigation of what other Twitter API wrapper libraries were doing, I discovered that the oauth_callback needed to be encoded twice.
Searching afterwards for a little more explanation/clarity, I found the following SO answer, "Twitter Oauth URL encoding inconsistencies?", which explains the issue I was having extremely succinctly.
Is this perhaps the issue you have encountered?