CSRF Referer checking failed Django 1.8 - csrf

I have searched for this but could not find any workable solution.
I have a website at www.example.com plus the subdomains a.example.com and b.example.com. When I try to POST a request from a.example.com to b.example.com I get a "Referer checking failed" error.
I have the following setting on both a.example.com and b.example.com:
CSRF_COOKIE_DOMAIN = ".example.com"
But I am not able to make use of CSRF_COOKIE_DOMAIN correctly.

Django 1.8 performs strict Referer checking for HTTPS requests: with CSRF protection enabled you cannot POST from a.example.com to b.example.com, and CSRF_COOKIE_DOMAIN does not change that.
Django 1.9 added the CSRF_TRUSTED_ORIGINS setting, which lets you whitelist other (sub)domains as acceptable referers.
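A minimal sketch of the relevant settings, assuming an upgrade to Django 1.9 or newer (check the exact syntax for your version):
# settings.py
CSRF_COOKIE_DOMAIN = ".example.com"
# Django 1.9 - 3.x: host names; a leading dot also matches subdomains
CSRF_TRUSTED_ORIGINS = [".example.com"]
# Django 4.0+: entries must include the scheme, e.g.
# CSRF_TRUSTED_ORIGINS = ["https://*.example.com"]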

Related

TYPO3 v11 site configuration: https results in 404

We have a running site on TYPO3 v9.5 and switched to TYPO3 v11.
There is a big problem with the Site Configuration. If we set the Entry Point to the https protocol, we get 404 errors on all pages. If we set the protocol to http, it works as expected, even when the pages are requested via https://.
In front of Apache we have an HAProxy that terminates the HTTPS connection and forwards the requests to a Varnish proxy, which in turn forwards them to several Apache servers.
One more problem with the http Entry Point is that a call to https://mypage/typo3/ results in a redirect to the insecure http://mypage/typo3/, which will not be accepted.
What do I have to change so that the Entry Point uses the https protocol and accessing my pages does not result in 404 errors?
Thanks again to #julian-hofmann for the tip.
In our environment I need the following settings in AdditionalConfiguration.php:
$GLOBALS['TYPO3_CONF_VARS']['SYS']['reverseProxySSL'] = '*';
# there is more than one reverseProxyIP
$GLOBALS['TYPO3_CONF_VARS']['SYS']['reverseProxyIP'] = '*';
# X-Forwarded-For contains multiple values; tell TYPO3 which one to use ('last' = the last entry, added by the proxy)
$GLOBALS['TYPO3_CONF_VARS']['SYS']['reverseProxyHeaderMultiValue'] = 'last';
We also have to set the X-Forwarded-Proto HTTP header to https.
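A minimal sketch of how we set it, assuming the header is added where HTTPS is terminated (the HAProxy); the frontend and backend names are placeholders:
# haproxy.cfg
frontend https_in
    bind :443 ssl crt /etc/haproxy/certs/site.pem
    # tell the chain behind the proxy that the original request came in over HTTPS
    http-request set-header X-Forwarded-Proto https if { ssl_fc }
    default_backend varnish_backend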
Now it works as expected.

Deploying my Symfony REST API app: CORS errors

I developed a Nuxt.js application with a Symfony backend and a REST API that lets the front end talk to the back end. Everything works locally.
I want to deploy this on my host.
So I created two subdomains: one for my front end and one for my backend.
When I try to access my application and log in, I get these two CORS errors:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://mysubdomain.domain.fr/api/login_check. (Reason: CORS header 'Access-Control-Allow-Origin' missing).
And
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at https://mysubdomain.domain.fr/api/login_check. (Reason: CORS request did not succeed).
How can I fix this?
Thanks a lot
You need to enable CORS on the Symfony side by allowing the Nuxt.js domain as an origin, not the API's own domain. If you want to be sure, you can also allow all domains by using *, but this reduces security a bit.
If you really want to be secure, you could add a proxy to your Nuxt.js server that forwards requests to your Symfony application: for example, proxy /api from the Nuxt.js application to the Symfony hostname, as sketched below.
This way you don't need to enable CORS at all.
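A minimal sketch of that proxy setup, assuming the @nuxtjs/axios and @nuxtjs/proxy modules; https://api.domain.fr is a placeholder for the Symfony subdomain:
// nuxt.config.js
export default {
  modules: ['@nuxtjs/axios', '@nuxtjs/proxy'],
  axios: { proxy: true },                        // send axios calls through the Nuxt server
  proxy: {
    '/api': { target: 'https://api.domain.fr' }  // forward /api/* to the Symfony backend
  }
}
Note that the proxy runs in the Nuxt server, so this applies to server-rendered deployments rather than purely static hosting.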
If you use API Platform you may need to configure the CORS_ALLOW_ORIGIN variable in your .env file 😁
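For example, a sketch assuming the nelmio/cors-bundle configuration that API Platform ships with; front.domain.fr is a placeholder for your front-end subdomain:
# .env
CORS_ALLOW_ORIGIN='^https://front\.domain\.fr$'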

Facebook - Curl Error SSL_CACERT SSL certificate

I am getting "Curl Error : SSL_CACERT SSL certificate problem: unable to get local issuer certificate" when asking Facebook to scrape my page over https. How can I fix this so that Facebook can scrape my page without errors?
The page is hosted via Apache 2.4 proxying to IIS 10. Apache handles all certificates and IIS is on the local network. My page runs ASP code (so no PHP), and solutions along the lines of editing the php.ini file or adding curl.pem to the PHP folder will not fix my problem ... or so I think?!?
IIS has no certificate installed.
I do have extension=php_curl.dll enabled -- and extension_dir = 'C:\64bit\php-7.0.6-Win32-VC14-x64\ext' defined in my php.ini file. I followed these steps to install Curl on Windows. And phpinfo.php confirms that cURL is enabled (cURL Information 7.47.1).
My proxy setup in my Apache config file is:
<IfModule mod_proxy.c>
ProxyRequests Off
ProxyPass / http://192.168.1.101:88/com_ssl/
ProxyPassReverse / http://192.168.1.101:88/com_ssl/
RewriteRule ^(.+)$ https://www.domainname.com/$1 [P,L]
</IfModule>
I have no RequestHeader directives defined in my Apache proxy config file, such as those suggested here in Step 10:
RequestHeader set "X-RP-UNIQUE-ID" "%{UNIQUE_ID}e"
RequestHeader set "X-RP-REMOTE-USER" "%{REMOTE_USER}e"
RequestHeader set "X-RP-SSL-PROTOCOL" "%{SSL_PROTOCOL}s"
RequestHeader set "X-RP-SSL-CIPHER" "%{SSL_CIPHER}s"
Is this what is missing to fix the error?
"unable to get local issuer certificate" is almost always the error message you get when the server doesn't provide an intermediate certificate as it should in the TLS handshake, and as WizKid suggests, running the ssllabs test against the server will indeed tell you if that is the case.
If you are running a Node.js server and getting this 'Curl Error SSL_CACERT SSL certificate' error, you need to serve your CA / intermediate certificate along with your SSL certificate.
var fs = require('fs');
var https = require('https');

var options = {
    key: fs.readFileSync('server-key.pem'),
    cert: fs.readFileSync('server-crt.pem'),
    ca: fs.readFileSync('ca-crt.pem'), // <= add this: the CA / intermediate certificate
};

https.createServer(options, function (req, res) {
    // log timestamp, client address, method and URL for each request
    console.log(new Date() + ' ' +
        req.connection.remoteAddress + ' ' +
        req.method + ' ' + req.url);
    res.writeHead(200);
    res.end("hello world\n");
}).listen(4433);
This may not have been the case at the time, but I will add this info in case others encounter the same issue.
If you are using a CDN like Cloudflare, it is important to set up your SSL before adding the site to Cloudflare, as doing it the other way around can cause issues.
It is also important to ensure that all domains are correctly entered in Cloudflare's DNS control panel; otherwise you may end up serving your main domain via Cloudflare and your subdomain(s) directly from your server. While this won't matter much to the user (the site still shows as secure, is still reachable and still passes SSL tests), it can cause problems when sharing pages to social media. Basically, I reproduced the error by splitting the DNS setup as described above and got the same error the OP reported. Then I added the DNS records for the subdomain to Cloudflare, tested a few hours later (after refreshing the page in the Sharing Debugger: https://developers.facebook.com/tools/debug/sharing/?q=https%3A%2F%2Fus.icalculator.info%2Fterminology%2Fus-tax-tables%2F2019%2Fvirginia.html), and the error was gone. So if you encounter this issue and you use Cloudflare, that is something to check you have set up correctly.

Is there a way to make an insecure HTTP API request on GitHub?

My repo on GitHub consists of a simple API data pull.
But GitHub throws an error as the data request is made on an insecure HTTP link rather than an HTTPS link.
Is there a workaround, like asking GitHub to override the security check and accept the data from the HTTP link anyway?
Accept data from http anyway?
No: any call to the GitHub API over plain HTTP is redirected to https
$ curl -L -i http://api.github.com/users/octocat/orgs
HTTP/1.1 301 Moved Permanently
Content-length: 0
Location: https://api.github.com/users/octocat/orgs
Connection: close
So using https is, for the GitHub API, the only option allowed right now.

Facebook link sharing/Debugger refuses connection to TLS 1.2 website

(According to https://developers.facebook.com/tools-and-support/ there are Facebook engineers reading this.)
Some of our web hosting customers recently complained about missing images/text when sharing content from their https website to Facebook.
I tracked the problem down to a security change in our environment that disabled TLS v1.0 for customer HTTPS sites. The curl output in the Facebook Debugger merely showed an SSL connection error, and I can reproduce the problem locally if I force curl not to try TLS v1.1 or v1.2.
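For example, a reproduction sketch (the hostname is a placeholder; --tls-max needs curl 7.54+, while older curl builds treat --tlsv1.0 as "exactly TLS 1.0"):
curl -vI --tlsv1.0 --tls-max 1.0 https://customer-site.example/
This fails against the vhost configuration below because TLSv1 is disabled there.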
This value in the Apache 2.4 vhost configuration makes Facebook unable to connect to my customer's site:
SSLProtocol All -SSLv2 -SSLv3 -TLSv1
Changing SSLProtocol to this makes Facebook work OK:
SSLProtocol All -SSLv2 -SSLv3
'All' includes TLS v1.1 and v1.2. Why don't Facebook link sharing and the Facebook Debugger work against modern sites that use TLS v1.1 and TLS v1.2 (and have SSLv3 and TLSv1 disabled)?
Thanks.