I am using DFP to serve ads, and Fiddler to monitor the requests.
After 3 GET requests to the DFP server in a short period of time (say 30 seconds), every subsequent request returns a list of empty ads.
Does DFP have some sort of spam protection? If so, is there a way around it? Debugging an ad implementation is quite slow when your ad payloads are empty!
There is definitely some rate limiting going on within DFP; I have run into this many times! From what I can tell it may be per ad unit, and it doesn't last very long.
As for debugging, have you tried the DFP Google console? That makes debugging a lot easier, and I am pretty sure it will give you the diagnostics you need without the rate limit being an issue.
Have you looked into adding a "correlator" value in the parameters?
Just &c=rand(10000,99999).
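If you're building the ad request URL by hand, here is a minimal sketch of that cache-busting idea in JavaScript (the parameter name c and the example URL are placeholders, not a documented DFP parameter):

// Sketch only: append a random "correlator"-style cache-busting value to a
// manually built ad request URL; the parameter name and URL are illustrative.
function withCacheBuster(adUrl) {
  var c = Math.floor(10000 + Math.random() * 90000); // random 5-digit value, like rand(10000,99999)
  return adUrl + (adUrl.indexOf('?') === -1 ? '?' : '&') + 'c=' + c;
}

var requestUrl = withCacheBuster('https://example.com/ad?slot=leaderboard');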
I'm part of a team working on improving the Lighthouse score of our website:
https://www.bikewale.com/m/royalenfield-bikes/classic-350/
We are concentrating on optimising JavaScript delivery on the page in order to decrease the time-to-interactive. However, we noticed that scripts like gtm.js and gpt.js, and the loading of ads on page load, are limiting our maximum improvement to around 70 (Lighthouse performance score).
After optimising JavaScript delivery on our end, we were able to score at most 70. We tried removing the JS files for Google Tag Manager and GPT, and saw the score rise to approximately 95. Also, lazy loading all ads (and hence the requests to DFP) gives us a boost to around 75, but we can't do this because the first ad is in the first fold.
Please note that we have followed the guides and best practices mentioned in the following links:
gtm - https://developers.google.com/tag-manager/quickstart
gpt - https://support.google.com/admanager/answer/7485975
googletag.pubads().refresh(immediateAds); // immediateAds is array of first fold ads
The refresh method is degrading performance.
Is there a way to optimise the delivery of the ads and GTM scripts in order to improve performance? Possibly a newer version of the scripts, or an alternative? Is there a way to load the first-fold ad immediately and lazy load the other ads on the page, without using the refresh() method?
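For what it's worth, one possible shape of the first-fold-immediately / lazy-load-the-rest approach, as a sketch only (it assumes the standard async gpt.js setup without single-request mode; the slot paths, sizes, and div ids are placeholders):

// Display the first-fold slot right away; below-the-fold slots are only
// defined and displayed once their containers approach the viewport, so no
// refresh() call is needed for the initial load.
googletag.cmd.push(function () {
  googletag
    .defineSlot('/1234/first-fold-ad', [320, 50], 'div-first-fold')
    .addService(googletag.pubads());
  googletag.enableServices();
  googletag.display('div-first-fold'); // fetch + render the first-fold ad immediately
});

// Lazy-load the remaining slots via IntersectionObserver.
var lazyAdDivs = document.querySelectorAll('[data-lazy-ad-path]');
var observer = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (!entry.isIntersecting) return;
    observer.unobserve(entry.target);
    var div = entry.target;
    googletag.cmd.push(function () {
      googletag
        .defineSlot(div.getAttribute('data-lazy-ad-path'), [300, 250], div.id)
        .addService(googletag.pubads());
      googletag.display(div.id);
    });
  });
}, { rootMargin: '200px' });
lazyAdDivs.forEach(function (div) { observer.observe(div); });

Newer versions of gpt.js also expose googletag.pubads().enableLazyLoad(), which may give a similar result without a hand-rolled observer.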
Congrats on achieving the 70 score! It's a very respectable score for an e-commerce site.
I'm not super familiar with GTM or GPT, but I can recommend one optimization to help those libraries do their jobs more effectively: preconnect to origins from which ads are served.
For each of those origins, you should add two hints near the top of your page:
<link rel="dns-prefetch" href="https://dt.adsafeprotected.com">
<link rel="preconnect" href="https://dt.adsafeprotected.com">
The first hint asks the browser to do a DNS lookup for the origin. The second asks the browser to set up a TCP connection. Preconnect accomplishes everything dns-prefetch does, but not all browsers support preconnect, so using both hints lets you get the best performance out of as many browsers as possible.
Both of these hints give the browser a head start for resources that it won't otherwise know about until later in the page load process.
Keep in mind that, depending on how the resources are fetched, you may need two preconnect hints for the same origin (for example, one with the crossorigin attribute for CORS requests and one without), since those requests use separate connections. You can check the waterfall chart to make sure all connections are set up at the beginning of the page load.
I have to write a script that creates ad sets and ads for a Facebook campaign, and I have to do it for a lot of items. For now, I can create every needed entity, but there is a big problem: the rate limit. I reach it pretty quickly (I can create about 15 items before getting a rate-limit exception), and this is very limiting; creating everything by hand is actually much faster. I want to apply for the next level of rate limiting, but I can't. One of my coworkers contacted someone from Facebook, and we were told we did not make any API calls using my app ID. Since I am able to create campaigns, ad sets, and ads, and we can see those in Power Editor, I don't understand what is going on.
(Screenshot: what my dashboard looks like.)
We will need to be able to create everything using the API really soon, so after some research I am asking the question here. Did I miss something when creating my app?
You probably want to go through the official request to promote your app from a Basic level to a Standard level. The level for your app determines how heavily it is rate limited. Details here: https://developers.facebook.com/docs/marketing-api/access
It sounds as if you have not made your official request in the app dashboard. It's possible we evaluated your number of API calls before you reached the threshold, or that the data we can see on your API calls is from an earlier time period when you did not consistently reach the boundary.
You could also be hitting rate limits due to your error rates.
You can apply here, and if needed, reapply: https://www.facebook.com/business/standardadsapi?attachment_canonical_url=https%3A%2F%2Fwww.facebook.com%2Fbusiness%2Fstandardadsapi
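If bursts or error rates are part of the problem, here is a minimal sketch of retrying with exponential backoff when the Graph API reports a rate limit (shown with Node's built-in fetch; the error codes checked and the commented-out endpoint and fields are assumptions to verify against the current Marketing API docs):

// Sketch: retry a Marketing API call with exponential backoff on rate-limit errors.
// Error codes 4, 17 and 613 are commonly reported throttling codes; verify them
// against the current Graph API error documentation.
async function createWithBackoff(url, params, maxRetries = 5) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url, { method: 'POST', body: new URLSearchParams(params) });
    const body = await res.json();
    if (!body.error) return body;
    const code = body.error.code;
    if (code !== 4 && code !== 17 && code !== 613) throw new Error(body.error.message);
    await new Promise((r) => setTimeout(r, 2 ** attempt * 1000)); // wait 1s, 2s, 4s, ...
  }
  throw new Error('Still rate limited after retries');
}

// Hypothetical, abridged usage (a real ad set needs more required fields):
// createWithBackoff('https://graph.facebook.com/<API_VERSION>/act_<AD_ACCOUNT_ID>/adsets',
//   { name: 'My ad set', campaign_id: '<CAMPAIGN_ID>', access_token: '<TOKEN>' });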
I have an app I'm developing against Facebook that timed out a few hours ago during my first production use. Of course, I tried to get it to do too much and the HTTP call timed out. So, I rewrote what I was doing to use threaded connections, which sped up the interaction significantly! However, I was so engrossed in getting my interaction to speed up (it equated to about 25-50 calls; I'm not exactly sure, I was expecting 25 but some of my results show it was 50) that I didn't even stop to think about how fast I was hitting Facebook.
So, I started getting "Uncaught OAuthException: It looks like you were misusing this feature by going too fast. You've been blocked from using it.", which is what I now get even if I try to run my program with only 1 hit. I've added a sleep into my system to limit the hits to 1/second, but I'm concerned that my app (which was not making public posts, so no one could have been bothered by them) is now forever banned from Facebook, as it says I'm blocked from the feature, with a reference to learn about blocks in the Help Center; except I can't find any reference in the Help Center to my specific situation.
Does anyone know how long my app is out of commission?
And what are the specific limits (reference please, because I've searched the hell out of FB and can't find one) regarding the speed at which you can access Facebook?
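For reference, the 1-hit-per-second throttle mentioned above could look something like this (illustrative JavaScript; the question doesn't say what language the app uses, and postCommentToGroup is a hypothetical function):

// Wrap an API-calling function so successive calls are at least minIntervalMs apart.
function makeThrottled(fn, minIntervalMs) {
  let last = 0;
  return async function (...args) {
    const wait = last + minIntervalMs - Date.now();
    if (wait > 0) await new Promise((r) => setTimeout(r, wait));
    last = Date.now();
    return fn(...args);
  };
}

// const throttledPost = makeThrottled(postCommentToGroup, 1000); // at most 1 call per second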
It depends on what has blocked you. In this case it was a spam bot that stopped me from posting comments into a group. Apparently there is a non-specific number of times you can post comments in a group in a short amount of time. The amount varies, but hovers around 150ish give or take 50 (at the time of my tests).
The ban appeared to be consistently set to about 19 hours at that time (May 2014). I've confirmed this by continued testing in test groups and subsequent bans. However, Facebook developers are unable to give a solid set of numbers, as they say it's controlled by a spam algorithm that changes based on server usage. So, about 150 comments within about 3 minutes = a ban for about 19 hours.
I am the manager of an iOS application and it uses the Google Places API. Right now I am limited to 100,000 requests, and during our testing one or two users could use up to 2,000 requests per day (without autocomplete). This means that only about 50 to 200 people will be able to use the app per day before I run out of quota. I know I will need to fill out the uplift request form when the app launches to get more quota, but I still feel that I will need a very large quota based on these test results. Can anyone help me with this issue?
Note: I do not want to launch the app until I know I will be able to get a larger quota.
First up, put your review request in sooner rather than later so I have time to review it and make sure it complies with our Terms of Service.
Secondly, how are your users burning 2k requests per day? Would caching results help you lower your request count?
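If caching is an option, here is a minimal sketch of the idea (shown in JavaScript for illustration; fetchPlaces and the TTL are hypothetical, and the Places API terms limit what you may cache and for how long):

// Short-lived in-memory cache keyed by query, so repeated identical lookups
// don't each count against the Places API quota.
const placesCache = new Map();
const TTL_MS = 5 * 60 * 1000; // 5 minutes; adjust to whatever the terms allow

async function cachedPlacesSearch(query, fetchPlaces) {
  const hit = placesCache.get(query);
  if (hit && Date.now() - hit.time < TTL_MS) return hit.data;
  const data = await fetchPlaces(query); // your existing Places API call
  placesCache.set(query, { time: Date.now(), data: data });
  return data;
}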
I'm facing the same problem!
Is it possible to use the Places library of the Google Maps JavaScript API, which applies the quota to each end user instead of to an API key, so that the quota grows as the number of users grows? See here
Theoretically I think it's possible, since it just needs a WebView or JavaScript runtime to use the library, but I haven't seen anyone use this approach.
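A minimal sketch of what that might look like inside a WebView, assuming the Maps JavaScript API is loaded with &libraries=places (the query string is just an example):

// Requires the API script, e.g. https://maps.googleapis.com/maps/api/js?key=YOUR_KEY&libraries=places
// PlacesService needs a node (or map) for attributions; a detached div works for a sketch.
var service = new google.maps.places.PlacesService(document.createElement('div'));

service.textSearch({ query: 'coffee near Union Square' }, function (results, status) {
  if (status === google.maps.places.PlacesServiceStatus.OK) {
    results.forEach(function (place) {
      console.log(place.name, place.formatted_address);
    });
  } else {
    console.log('Places request failed:', status);
  }
});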
So I've hit a bit of a dilemma with my application load testing. My application relies on valid Facebook logins as I create shadow records that correspond to the users who log in.
How can I load test my application while still using Facebook calls (rather than disabling them)?
I need to ensure at least 100,000 users can connect without getting bogged down.
My code runs fairly fast so far; I'm averaging 1000 ms per load pre-caching. But I'd like to do some more load testing before I turn on my cache.
How can I do this?
From what I've come across, everyone seems to say to just turn off Facebook calls and load test as if the application were a regular site. I also came across something called friendrunner, which seemed like it could be the solution to my problem, except no one from there has gotten back to me yet.
You can't. Or rather, you really shouldn't and probably can't anyway. Facebook is one of the more aggressive sites when it comes to introducing measures designed to prevent synthetic (scripted) interaction, and if you try to get around these measures you risk Facebook taking measures against you (probably nothing legal, but they can certainly suspend your account, and if you have a corporate agreement with them it could get embarrassing).
But this shouldn't be an issue for performance testing. You simply need to spoof the Facebook calls and focus on writing scripts that only call the servers you want to load test. This is best practice for any project. In the past, I have simply used random strings to simulate the Facebook account id, and where your application requires certain user information from an account, you will need to be slightly more creative and stub this out. As far as I can tell, friendrunner is just that, a Facebook stub.
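As an illustration, a stub along those lines might look like this (all names are hypothetical; the point is to generate a random id and a minimal fake profile and route the app's Facebook lookup to it in the load-test environment):

// Generate a fake Facebook-style user for load tests instead of a real login.
function fakeFacebookUser() {
  var id = String(Math.floor(1e14 + Math.random() * 9e14)); // random 15-digit numeric id
  return {
    id: id,
    name: 'LoadTest User ' + id.slice(-6),
    email: 'loadtest+' + id + '@example.com'
  };
}

// e.g. in the load-test configuration, replace the real client:
// app.set('facebookClient', { getProfile: async function () { return fakeFacebookUser(); } });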