What does "Cookies used by frames from X" mean in Chrome Dev Tools?

In Chrome Dev Tools > Application under Cookies I see a list of URLs. When I hover over each one, Chrome shows the message "Cookies used by frames from X".
At first, I assumed that frames meant "iframe", i.e. if an iframe from source X was used on this webpage, then any cookie set by visiting that (iframe view of the) site would show up in the dev tools for your inspection. However, I checked whether there was always a corresponding iframe for each source, and that is not the case.
So my questions in summary are:
what does "frame" mean in this context?
Why are these cookies almost always empty? For example, I'm looking right now at "Cookies used by frames from https://www.facebook.com", and it's empty.

As mentioned in Chrome DevTools documentation:
When you expand the Cookies category, it displays a list of domains of
the main document and those of all loaded frames. Selecting one of
these "frame groups" displays all cookies, for all resources, for all
frames in that group. There are two consequences of this grouping to
be aware of:
Cookies from different domains may appear in the same frame group.
The same cookie may appear in several frame groups.
Here is a more explicit explanation from Mozilla Storage Inspector documentation (although it isn't Chrome, the concept is similar):
Cookies — All the cookies created by the page or any iframes inside of
the page. Cookies created as a part of response of network calls are
also listed, but only for calls that happened while the tool is open.
So, basically, you will see the main document's domain and all its iframes.
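For example, a page on https://example.com (hypothetical) that embeds a Facebook widget:
<!-- index.html served from https://example.com -->
<iframe src="https://www.facebook.com/plugins/like.php?href=https%3A%2F%2Fexample.com"></iframe>
would produce two frame groups in the Cookies category: "Cookies used by frames from https://example.com" and "Cookies used by frames from https://www.facebook.com".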
About the "empty" frame groups with no corresponding iframe
This is interesting. It is related to the "Preload pages for faster browsing and searching" setting, i.e. the page prefetch or prerendering feature that Chrome has.
This feature is under Chrome settings > Privacy and Security > Cookies and site data.
What this feature does is prefetch links on the page that you are probably going to click, which allows those pages to load a bit faster when you access them.
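Historically, a page could also request this behavior explicitly with the prerender resource hint (since deprecated in favor of Speculation Rules); a hypothetical example:
<link rel="prerender" href="https://example.com/likely-next-page">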
If you dig into the prerendering design doc, you will find this:
Prerendered requests will use a ChromeURLRequestContext which has a
new CookieStore interface, but is otherwise the same as the current
profile’s ChromeURLRequestContext. If the PrerenderContents are
discarded without being used, the changes made to the CookieStore
interface go away. Otherwise, the deltas will be committed to the main
CookieStore for the profile. If there is a merge problem, the
prerendered page is discarded and a fresh request is issued.
This means that prerendered requests use a different CookieStore, which is why the frame group appears empty in your main profile's DevTools. The cookies are stored in a separate store and are only merged into the main store after you actually click the link.

Related

Dynamically creating static routes from database using Next JS

I'm trying to understand how Next JS does dynamic routing and I'm a little confused about how to properly implement it on my own website. Basically, I have a database (MySQL) of content that keeps growing, let's say they're blog posts, with images stored in GCS. From what I understand you can create a pages/[id].js file in your pages folder that can handle dynamically creating routes for new pages, but in order for you to get a good SEO score, the Google crawlers need to see your content before any JavaScript or data requests are made. So the pages have to be physically available for the content to instantly appear upon loading. So if I have pages/[id].js and I have content added to the database daily, how are physical content files supposed to spontaneously populate the pages folder? And if pages files keep getting created, how do I prevent my disk from running out of space? I'm sure there is something I'm not understanding.
I read on nextjs.org that you can have a function getStaticPaths that needs to return a list of possible values for 'id'. I'm wondering, if my site is live and new content (pages) is constantly being added to the database with their own unique ids, how is it "aware" of those ids? Do I need to write a program or message queue system that constantly appends new ids to a file that is read by getStaticPaths? What if I have to duplicate my site on multiple servers around the world, or if a server dies, do I have to keep track of the file's contents in order to boot up a new server with the same content?
From what I understand, in order for Google to see any sort of content on your website, the pages text (content) needs to be static and quickly available via physical files. Images and everything else can be loaded later since Google's crawlers mainly care about text. So if every post needs to be a physical file in your app's pages folder, how do new pages files get created if the content is added to the database?
TL;DR: My main concern is having my content readily available for Google crawlers in order to get a good score for my website. How do I achieve that if content is continually added to my database?
As you stated before, you can set up getStaticPaths to provide a list of values for id at build time. If I understand correctly, you are most concerned about what happens to new content added after the initial build.
For this you have to return the fallback key from getStaticPaths.
If the fallback key is false, then all IDs not specified initially will go to 404 and you’d need to rebuild the app every time you add new content. This is what you don't want.
If you set it to true, then the initial values will be prerendered just like before, but new values will NOT go to 404. Instead, the first user visiting a path with a new id will trigger the rendering of that new page. This allows you to dynamically serve new content when a request hits an id that wasn't available at build time.
Interestingly, that first visitor will temporarily see a 'fallback' version of the page while Next.js processes the request. On that fallback, you would usually just show a loading spinner. The server then passes the data to the client so it can properly render the full page. So in practice, the user first sees a loading indicator, then the page updates itself with the actual content. Subsequent visitors will get the now-prerendered result immediately.
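To make this concrete, here is a minimal sketch of pages/[id].js under these assumptions: getRecentPosts and getPostById are hypothetical helpers wrapping your MySQL queries, and posts have id, title, and body fields:

import { useRouter } from 'next/router';
import { getRecentPosts, getPostById } from '../lib/db'; // hypothetical helpers

export async function getStaticPaths() {
  // Prerender only the posts known at build time; anything newer falls back
  const posts = await getRecentPosts();
  return {
    paths: posts.map((post) => ({ params: { id: String(post.id) } })),
    fallback: true,
  };
}

export async function getStaticProps({ params }) {
  const post = await getPostById(params.id); // runs on-demand for new ids
  if (!post) return { notFound: true };
  return { props: { post } };
}

export default function Post({ post }) {
  const router = useRouter();
  // Only the first visitor to a not-yet-generated page sees this state
  if (router.isFallback) return <div>Loading…</div>;
  return (
    <article>
      <h1>{post.title}</h1>
      <div>{post.body}</div>
    </article>
  );
}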
You may now be worried about crawlers hitting that fallback page and not getting SEO content. This concern has been addressed here: https://github.com/vercel/next.js/discussions/12482
Apart from being able to serve new pages after build, the fallback strategy has another use in that it allows you to prerender only a small subset of your website (like your most visited pages), while the other pages will be generated only when necessary.
From the docs: When is fallback: true useful?
You may statically generate a small subset of pages and use fallback:
true for the rest. When someone requests a page that’s not generated
yet, the user will see the page with a loading indicator. Shortly
after, getStaticProps finishes and the page will be rendered with the
requested data. From now on, everyone who requests the same page will
get the statically pre-rendered page.
This ensures that users always have a fast experience while preserving
fast builds and the benefits of Static Generation.

Is it possible to add adverts to a custom Facebook Page Tab app?

I need to create a custom Facebook Page Tab app which will show an external site in an iframe. This needs to have adverts on it, but I'm not sure if this is possible as the site is hosted externally.
I'm not sure if I need to sign up to the Facebook Audience Network to get approved etc. either?
Any help or advice would be great.
Many sites do not allow their pages to be shown in an iframe, and browsers enforce this restriction (via headers such as X-Frame-Options). Imagine the case where you work hard to create a site and others show all your content in iframes. That is naturally frustrating.
However, there is a candidate solution: create a page which sends a request to the other site and appends all the content into the body and head of your page. This is very much possible, so the solution is to:
Create a page in your site, let's call it outsider
In the server-side code of your outsider page send a request to the desired page to be shown
You will get the HTML of the page. Process it and include its content in the head and body of outsider. This includes:
3.1. Checking that all the CSS can be reached, as the target page might refer to CSS by relative paths, which are unreachable from your end. Rewrite the URLs of the CSS files
3.2. Checking that all the JavaScript can be reached, as the target page might refer to JS by relative paths, which are unreachable from your end. Rewrite the URLs of the JS files
3.3. Applying the idea described in 3.1 and 3.2 to other resources, like images, until you are satisfied with the content of outsider
Create an iframe whose source points to outsider. Since outsider is within your own domain, it will be shown (see the sketch after this list).
NOTE: If the site owning the target page does not like its content being shown inside iframes, it might protect itself by, say, including JavaScript that checks whether the page is inside an iframe. Remove that code while processing the response to your request. If nothing else prevents you from showing the page in an iframe, you should achieve success.
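The answer doesn't prescribe a stack; here is a minimal sketch of the outsider idea in Node/Express (Node 18+ for the built-in fetch; the target origin and the crude URL rewriting are illustrative assumptions):

const express = require('express');
const app = express();

const TARGET = 'https://target-site.example'; // hypothetical site to mirror

app.get('/outsider', async (req, res) => {
  const response = await fetch(TARGET + '/page-to-show');
  let html = await response.text();
  // Steps 3.1-3.3: point root-relative CSS/JS/image URLs back at the target
  html = html.replace(/(href|src)="\/(?!\/)/g, '$1="' + TARGET + '/');
  // NOTE step: crude removal of a frame-busting check, if one exists
  html = html.replace(/if\s*\(\s*top\s*!==?\s*self\s*\)[^;]*;/g, '');
  res.send(html);
});

app.listen(3000); // point your Page Tab iframe at /outsider on this host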

Where, exactly channel url is used?

On what browsers or user agents is that channel URL actually used, and what for?
I have no intention of having my site to work on Internet Explorer <= 8 (it is an HTML5 <canvas> game, and I am serving everything else as "application/xhtml+xml").
So, if channel is only useful on that old crap, I can gladly get rid of it...
Related (possibly): Channel URL Facebook
Because the social plugin makes a cross-domain call, it needs a way to communicate. The workaround is to include a hidden iframe in the page for that. But with this workaround, that iframe is loaded on every page load, which doubles the reported traffic. This is why the channel URL was introduced. What it does is load the Facebook JS in that page, and from that moment on, the JS is available on your domain.
It will improve your loading times (caching) and will fix the reporting issue (you will see the channel page reported separately in reports). But it is not necessary for any HTML5-capable browser.
So, if you are targeting only HTML5-capable browsers, you are safe to ignore it. I am not sure about IE9; I will test my app with the channel URL removed and let you know.
Edit: After removing the channel URL from my app, I started getting double traffic reports from IE9. I think it is a good idea to keep the file there; it is just a simple HTML file with a single line. Better to be safe than sorry.

facebook iframe app - how to organize and write code for faster page loading - PHP SDK

I am writing an app within a facebook iframe and am unsure how best to write this. I originally wrote all the code within the main canvas.php file but found everything was running too slow before results were being loaded into the iframe.
I then tried using the PHP header location method to load different pages into the iframe, thus reducing page load time. However, the header location is ignored.
I have also tried using JavaScript to load the new page within the iframe instead; this does load the new page, but the page experiences lots of problems. It will not pass parameters to itself using $_GET.
Basically, I need to perform some checks when the canvas page is first loaded in the iframe and then redirect to another file to avoid the checks being performed on every page load, as this seriously slows everything down. I then need page reloads with different parameters in the URL to populate the iframe with different results; again, this is very slow as it has to perform all the checks again.
Therefore, how can I achieve a smooth workflow as a normal site within a facebook iframe?
[EDIT] Just thought: is Ajax a valid option?
Many thanks in advance.
Most people experience slow response times due to not having a channelURL specified. See http://developers.facebook.com/docs/reference/javascript/
Channel File
The channel file addresses some issues with cross domain communication
in certain browsers. The contents of the channel.html file can be just
a single line:
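<script src="//connect.facebook.net/en_US/all.js"></script>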
It is important for the channel file to be cached for as long as
possible. When serving this file, you must send valid Expires headers
with a long expiration period. This will ensure the channel file is
cached by the browser which is important for a smooth user experience.
Without proper caching, cross domain communication will become very
slow and users will suffer a severely degraded experience. A simple
way to do this in PHP is:
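<?php
  $cache_expire = 60*60*24*365;
  header("Pragma: public");
  header("Cache-Control: maxage=".$cache_expire);
  header('Expires: ' . gmdate('D, d M Y H:i:s', time()+$cache_expire) . ' GMT');
?>
<script src="//connect.facebook.net/en_US/all.js"></script>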
The channelUrl parameter is optional, but recommended. Providing a
channel file can help address three specific known issues. First,
pages that include code to communicate across frames may cause Social
Plugins to show up as blank without a channelUrl. Second, if no
channelUrl is provided and a page includes auto-playing audio or
video, the user may hear two streams of audio because the page has
been loaded a second time in the background for cross domain
communication. Third, a channel file will prevent inclusion of extra
hits in your server-side logs. If you do not specify a channelUrl, you
can remove page views containing fb_xd_bust or fb_xd_fragment
parameters from your logs to ensure proper counts.
The channelUrl must be a fully qualified URL matching the page on
which you include the SDK. In other words, the channel file domain
must include www if your site is served using www, and if you modify
document.domain on your page you must make the same document.domain
change in the channel.html file as well. The protocols must also
match. If your page is served over https, your channelUrl must also be
https. Remember to use the matching protocol for the script src as
well. The sample code above uses protocol-relative URLs which should
handle most https cases properly.

Facebook Iframe App with multiple pages in Safari Session Variables not persisting

I have a facebook Iframe application with multiple PHP pages in it.
I have some links that point relatively to the files inside my "iframe folder".
Having some issues with session variables inside the iframe. I set some session variables but they do not persist from one page to another.
This does work on other browsers.
I've been reading that Safari does not support cross-domain cookies and this might be the problem, but I'm not sure how to fix it.
Any help?
I believe this solution has become obsolete with the latest (6.0 and later) versions of Safari.
Safari by default does not allow cookies to be set from third parties. This affects Facebook iframe applications because the user is accessing a page served from apps.facebook.com but the iframe is being served from yourdomain.com, the "third party" in this case.
There are several solutions mentioned around the web. The best I've found, and one recommended by Facebook in its list of miscellaneous issues, is to fake a POST request to yourdomain.com using jQuery. This solution, detailed by Anant Garg, works in general for different host/iframe domains and needs to be adapted for Facebook apps. The key parts are:
$("body").append('
<iframe id="sessionframe" name="sessionframe" onload="submitSessionForm()" src="http://www.yourdomain.com/blank.php" style="display:none;"></iframe>
<form id="sessionform" enctype="application/x-www-form-urlencoded"
action="http://www.yourdomain.com/startsession.php"
target="sessionframe" method="post"></form>');
var firstTimeSession = 0;
function submitSessionForm() {
if (firstTimeSession == 0) {
firstTimeSession = 1;
$("#sessionform").submit();
}
}
Another solution by Will Henderson is to instrument each link on your page with session information using a Javascript function. Then modify your server code to capture this session information by reading it from GET parameters.
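A minimal sketch of that approach, assuming the server injects the PHP session id into a SESSION_ID variable (both names are placeholders):

var SESSION_ID = "abc123"; // assumption: injected server-side, e.g. from session_id()
// Append the session id to every link so the next page can recover the session
$("a").each(function () {
  var sep = this.href.indexOf("?") === -1 ? "?" : "&";
  this.href += sep + "session=" + encodeURIComponent(SESSION_ID);
});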
I wrote the blog post Dominic refers to in his answer.
The problem is that the default behavior of Safari is to only accept cookies from sites that you visit. This excludes "third party" cookies. Safari treats the page inside an IFRAME as a third-party site, and until you interact with that content (by clicking a link, for example), it will refuse those cookies.
Your PHP code needs to set a cookie on the first page that uses the session in order for that session to persist from one page to another, but if the session variables are in the very first page in the IFRAME, you have a chicken-and-egg problem.
My solution is to retain all of the special Facebook parameters through to the second page loaded into the IFRAME. Because you've interacted with it, cookies set on the second page will persist, and this allows your PHP code to keep whatever state it needs to communicate back to Facebook.
This won't likely help your PHP session, though, so I suggest adding another parameter to links on the first page that allows the second page to look the session up, or otherwise recreate it.
I think the best solution is to manually keep track of the session ID, i.e. by using session_id($_GET['session']);. Just make sure you do this before calling session_start(), and everything works.
Safari accepts cookies only from pages the user navigates to. The easiest and most effective way to fix this is to redirect the request from the landing page of your canvas app to a different page on your domain using top.location.href, and then redirect the user back to the canvas app from that page.
For example, if abc.php is your landing page and the canvas URL is facebook.com/abc: first redirect the request from abc.php to a different page like xyz.php, then redirect again from xyz.php to facebook.com/abc. Don't forget to start the session in xyz.php.
This is the simple fix...
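A minimal sketch of that bounce (yourdomain.com and the file names are placeholders):

<!-- abc.php, the canvas landing page: break out of the iframe -->
<script>top.location.href = "https://www.yourdomain.com/xyz.php";</script>

<?php
// xyz.php: now loaded as a first-party page, so Safari accepts the cookie
session_start();                                   // sets the session cookie
header('Location: https://apps.facebook.com/abc'); // back to the canvas app
?>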
Thanks for all the input. I ended up solving the problem by appending the "signed_request" parameter on every page. I just put it in as a hidden field and set it in the code-behind. That way I managed to get it to work in Safari. Hope it works for you too.
With the release of Safari 7, it is not only third-party cookies that are blocked: Local Storage, WebDB, and any other kind of website data are blocked as well. When you go to Safari Preferences (CMD+comma), under the Privacy tab, Safari 7 now says "Block cookies and other website data", where it originally said "Block cookies". That confirms the change.
Other browsers might follow suit in the future. Most probably Firefox. Chrome, cough *cough*, probably not.
You will probably have to employ some workaround using a redirection technique or a popup, similar to what Disqus did.
If you are using .NET then there is a much simpler solution to this problem.
Just set cookieless to true in your web.config, so the session ID travels in the URL instead of a cookie. Ex:
<sessionState mode="InProc" cookieless="true" timeout="60" />
It's a lot easier than posting an iframe, or opening a popup window with the URL of the iframe.
I used this header in PHP; it fixed my problems:
// Send a P3P compact policy so IE accepts the third-party (iframe) cookies
if (strpos($_SERVER['HTTP_USER_AGENT'], 'MSIE') !== false) header('P3P: CP="IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT"');