I'm having problems with the HTTP::Cookies extract_cookies() method. I have an HTTP::Headers object with multiple cookies in it under a single field name, and the method is only extracting a single cookie. The solution is possibly to have each cookie under a separate 'Set-Cookie' field, but from what I can see, HTTP::Headers does not allow you to have more than one field with the same name. Ideas?
How does this come about? A browser should never send multiple cookies with the same name (at least for the same domain/host and path).
Update: sorry, I misunderstood. It does appear that multiple Netscape cookies are only expected in multiple Set-Cookie headers (but new-style cookies are all expected in the same Set-Cookie2 header). HTTP::Headers should work fine with multiple headers with the same name; what are you seeing?
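For what it's worth, here is a minimal sketch of multiple Set-Cookie headers going in via push_header() and being picked up by extract_cookies(); the cookie names and the example.com URL are just placeholders. One thing worth checking on your side is whether the cookies were added as separate values (push_header) or joined into one string, since the jar only sees the distinct header values it is given.

use strict;
use warnings;
use HTTP::Headers;
use HTTP::Request;
use HTTP::Response;
use HTTP::Cookies;

# Build a header set with two Set-Cookie fields: same field name, separate values.
my $h = HTTP::Headers->new;
$h->push_header('Set-Cookie' => 'foo=1; path=/');   # push_header adds another field,
$h->push_header('Set-Cookie' => 'bar=2; path=/');   # it does not replace the first one

# In list context, header() returns every value stored for the field.
my @set_cookies = $h->header('Set-Cookie');
print scalar(@set_cookies), " Set-Cookie header(s)\n";   # prints 2

# extract_cookies() needs a response whose request() is set, so the jar
# knows which host and path the cookies belong to.
my $res = HTTP::Response->new(200, 'OK', $h);
$res->request(HTTP::Request->new(GET => 'http://example.com/'));

my $jar = HTTP::Cookies->new;
$jar->extract_cookies($res);
print $jar->as_string;   # should list both foo and bar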
I'm currently working on an API where I use a refresh token saved in a cookie. My problem is that I need this refresh cookie on two paths, /refresh-token and /logout, but as far as I know, I can only set one path attribute for one cookie. So should I use two cookies (which sounds redundant to me), or should I put both paths under something like /xyz >> /xyz/logout and /xyz/refresh-token, so that I can set the path of the cookie to /xyz?
If you don't specify a Path (or simply set it to "/"), the cookie will be available in both places, assuming they're both on the same domain. (Note that with no Path attribute the default is derived from the path of the URL that set the cookie, so setting it explicitly is safer.)
That being said, it will then be accessible to other paths as well.
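If you do go the common-prefix route, the cookie only needs Path=/xyz to be sent to both /xyz/refresh-token and /xyz/logout. A minimal sketch of the header you'd emit, shown with HTTP::Headers for consistency with the rest of this page; the cookie name and value are made up, and the HttpOnly/Secure/SameSite attributes are just sensible extras for a refresh token rather than part of the original question.

use HTTP::Headers;

my $h = HTTP::Headers->new;

# One cookie scoped to the shared prefix: the browser will send it with
# requests to /xyz/refresh-token and /xyz/logout, but not to unrelated paths.
$h->push_header('Set-Cookie' =>
    'refresh_token=abc123; Path=/xyz; HttpOnly; Secure; SameSite=Strict');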
Here's the response header of my redirection endpoint, which returns status code 302:
"Location": "http://<target-domain>",
"Set-Cookie": "username=user1;"
I can see that it redirects correctly with the 302, but the cookie does not get set on the <target-domain>.
It looks like the header "Set-Cookie": "username=user1;" does not get passed to the <target-domain> on redirection.
I see two network requests in my developer tools:
The redirection endpoint responds with status code 302; I see Location and Set-Cookie in the response headers.
The target domain responds with status code 200; I don't see Location or Set-Cookie anymore.
Is there a way to set the cookies on the <target-domain>?
You can't set cookies on a domain other than the one you're on, so basically no. The only exception is that you can set cookies on example.com if your current domain is something like subdomain.example.com: you can attach the cookie to a shorter form of your domain, but it must be the same base domain.
If you need the other site to set a cookie with a value it does not know, you'll have to pass that value through somehow. Using a redirect with a query string leaves it open to tampering by the user unless you cryptographically sign it (annoying) or ship over a token that can be used to retrieve the raw value. You may need a short-term store for this, like Redis, Memcached, or even a database row you can purge later.
If it were possible to set cookies on any domain at all there'd be utter chaos. These things are heavily restricted for a reason.
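If you do go the token route, the flow is roughly: the source site stashes the value under a random one-time token, redirects with only the token in the query string, and the target site exchanges the token for the value and sets its own first-party cookie. A rough sketch, with an in-memory hash standing in for a shared short-term store like Redis or Memcached (all names here are made up, and in reality both backends would need access to the same store):

use strict;
use warnings;
use URI::Escape qw(uri_escape);

my %store;   # stand-in for Redis/Memcached; give real entries a short TTL

sub random_token {
    open my $fh, '<:raw', '/dev/urandom' or die "urandom: $!";
    read $fh, my $bytes, 32;
    return unpack 'H*', $bytes;
}

# On the source domain: stash the value, redirect with only the token.
sub build_redirect_header {
    my ($username) = @_;
    my $token = random_token();
    $store{$token} = $username;
    return 'Location: http://target.example/landing?token=' . uri_escape($token);
}

# On the target domain: swap the token for the value, invalidate it,
# and set the cookie as the target domain's own cookie.
sub build_landing_header {
    my ($token) = @_;
    my $username = delete $store{$token}
        or die "unknown or already-used token";
    return 'Set-Cookie: username=' . uri_escape($username) . '; Path=/; HttpOnly';
}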
I've already seen some questions here (on Stack Overflow) and THIS post, but I still have some questions...
Using a hidden value in the POST form and checking it when the POST reaches the server.
The hidden value can easily be copied and sent exactly like the real one; making it "hard to guess" (like an md5 hash) will not help. (Right?)
Setting a cookie when you reach the form and sending the cookie value as a hidden value.
You can easily change a cookie value or send a custom cookie exactly like the real one, using the same real hidden value. (Right?)
Using a 'timeout', so the POST values cannot arrive too late.
So if you're slow, you will fail when you try to set everything up with the hidden value, but if you're fast it's going to work. (Right?)
I want to be protected against CSRF... but how exactly do I do it?
The easiest way I found to prevent CSRF issues is:
On the server side, assign an HttpOnly cookie to the client with a random (unguessable) token
Place a hidden field on the form with that cookie value
Upon form submit, ensure the hidden field value equals the cookie value (on the server side of things)
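A rough sketch of that last check as a PSGI fragment using Plack::Request; the cookie and field names are arbitrary, and a real implementation would also want a constant-time comparison and a proper random token generator.

use strict;
use warnings;
use Plack::Request;

# Reject the POST unless the hidden form field matches the token we
# previously issued in an HttpOnly cookie.
my $app = sub {
    my $env = shift;
    my $req = Plack::Request->new($env);

    my $cookie_token = $req->cookies->{csrf_token}         // '';
    my $form_token   = $req->body_parameters->{csrf_token} // '';

    if ($cookie_token eq '' || $cookie_token ne $form_token) {
        return [403, ['Content-Type' => 'text/plain'], ['Bad CSRF token']];
    }

    # ... handle the form normally ...
    return [200, ['Content-Type' => 'text/plain'], ['OK']];
};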
If you make the following changes, then I think you're safe:
no data updates should be allowed through GET (or, better, through plain form POST as well), since both can be issued through HTML forms
disable CORS on your server (or at least on endpoints that are critical and/or make changes to data)
make your APIs JSON-only (i.e. only accept input as JSON, at least on critical endpoints)
Just to add to the above: do not use method overrides and do not support old browsers.
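To illustrate the JSON-only point: an HTML form can only submit application/x-www-form-urlencoded, multipart/form-data, or text/plain, so rejecting anything that isn't application/json on state-changing endpoints shuts out form-based CSRF, provided your CORS policy doesn't let other origins send JSON with credentials. A minimal sketch, again as a PSGI fragment with Plack::Request (the endpoint itself is a placeholder):

use strict;
use warnings;
use Plack::Request;

my $app = sub {
    my $env = shift;
    my $req = Plack::Request->new($env);

    # Only let state-changing requests through when they declare a JSON body;
    # HTML forms cannot produce this content type on their own.
    if ($req->method =~ /^(?:POST|PUT|PATCH|DELETE)$/
        && ($req->content_type // '') !~ m{^application/json}i) {
        return [415, ['Content-Type' => 'text/plain'], ['JSON only']];
    }

    return [200, ['Content-Type' => 'application/json'], ['{"ok":true}']];
};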
This is sort of a follow-up to Why are my cookies containing JSON occasionally malformed, which we have resolved.
I have a three-value cookie, and we're URL-encoding the main value. The other two values are a timestamp and a hash. It looks like this in our response header:
foo=d=634027688530013385&v=%7b%22HasDog%22%3afalse%2c%22Greeting%22%3anull%2c%22RecentRecipes%22%3a%5b%5d%2c%22Remember%22%3afalse%7d&h=ARv5QGf4Cnftc4tFaPoy/VH8Pbo=; path=/; HttpOnly
In our logs, we see cases where we can't parse the three values correctly because the entire cookie is now encoded:
Cookie looks mangled: d%3D634027653097874122%26v%3D%7B%22HasAcceptedTerms%22%3Afalse%2C%22RecipeBoxCount%22%3A0%2C%22Remember%22%3Afalse%7D%26h%3DR85mJ%2FTdA6yrVe5pVCVpfG2jumM%3D
Unfortunately, we're not capturing the user agent to see if this is related to a specific browser.
I have several options to fix this. I just think the behavior is odd enough to warrant a question.
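One of those options, sketched for concreteness: detect the extra layer of encoding and undo it before parsing. Going by the log sample above, a healthy cookie starts with a literal "d=" while a mangled one starts with "d%3D". The function name below is made up, and it assumes the mangled form always looks like the sample shown.

use strict;
use warnings;
use URI::Escape qw(uri_unescape);

sub parse_foo_cookie {
    my ($raw) = @_;

    # If the whole value was URL-encoded, the separators show up as %3D / %26
    # instead of literal '=' / '&'; undo that one extra layer first.
    $raw = uri_unescape($raw) if $raw =~ /^d%3D/i;

    # Now split on the real separators: d=<timestamp>&v=<encoded JSON>&h=<hash>
    my %parts = map { split /=/, $_, 2 } split /&/, $raw;
    $parts{v} = uri_unescape($parts{v}) if defined $parts{v};
    return \%parts;   # keys: d, v (JSON text), h
}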
This may not be an answer but....
This is interesting and warrants a deeper look.
I would like to see a public-facing test page that shows red or green, with the cookie in bold text, and then run it through http://browsercam.com.
I did this when I thought I had found a bug in Mozilla's native JSON support. Turns out I was right.
Get your test page working for sure before you fill out the form for a free trial (200 shots): set the resolution to 640x480, select all browsers/platforms (182 distinct combinations), set a delay to allow the redirection to set the cookie, and track down the culprit.
Or take the time and get on http://testswarm.com/.
Please do follow up on this.
I have used the referrer before in foo.php to decide whether the page iframing foo.php has a particular URL (using $_SERVER['HTTP_REFERER']).
It turned out that it worked most of the time (about 98% of the time), but it also seemed like some users arrived at the page with $_SERVER['HTTP_REFERER'] not set in foo.php, which therefore broke the code. [Update: these users claimed that they followed the usual page flow and didn't open the URL of foo.php all by itself in the browser (they let it be an iframe), and they never altered their browser settings.]
I wonder what the reasons are that this could happen?
The HTTP/1.1 RFC does not make it mandatory to send an HTTP Referer header. You can't make any assumptions about its presence when writing robust code; perfectly conformant browsers may not include it.
Moreover, the RFC advises that "The Referer field MUST NOT be sent if the Request-URI was obtained from a source that does not have its own URI, such as input from the user keyboard", and "We suggest, though do not require, that a convenient toggle interface be provided for the user to enable or disable the sending of From and Referer information".
The latter is not very common (though some browsers have a "Private" mode that fulfils the requirements). More likely for your 2% is that people bookmarked the URL, which fulfils the first criterion (a URI obtained from a source without its own URI), and so the browser sends no referer.
Not by default AFAIK, but it's easy to turn it off (for privacy), e.g. in Firefox via about:config, and surely some users could be using browsers distributed to them (e.g. by their IT department) with that kind of setting. So you should try to avoid relying on REFERER for any important functionality (also because it's mis-spelled, of course ;-).