When chatting on Facebook Messenger, you can upload files and send them to the person you're talking to. I'm currently developing a Messenger bot (see documentation), so I'm trying to find the limits of this upload feature.
I found that it's not possible to upload files larger than 25 MB (I explained here how I found this information).
However, I couldn't find out how long these attachments stay available once uploaded. Facebook Messenger uploads the file to a server and sends a link to that resource in the conversation, but I couldn't determine whether that link stays available forever or whether access to it is limited in time (in which case I assume the attachment is deleted after that period).
To get an answer to my question, I uploaded a file and checked its availability regularly. Since I couldn't check every minute, my measurement has a precision of about +/- 1 day, but it seems that files stay available for between 2 and 3 days.
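For anyone who wants to repeat the experiment, this is roughly the kind of check I ran, sketched in Python with only the standard library (the scheduling itself was just a cron job; `is_available` is my own helper name, not anything from the Messenger API):

```python
import urllib.request
import urllib.error

def is_available(url, timeout=10):
    """Return True if the resource at `url` can still be fetched."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # HTTPError (e.g. a 404 once the file is deleted) is raised as an
            # exception, so reaching this point normally means success.
            status = getattr(resp, "status", None)
            return status is None or 200 <= status < 300
    except (urllib.error.URLError, ValueError):
        return False
```

Run that against the attachment URL from cron and log the result with a timestamp; the first False tells you, to within your polling interval, when the link expired.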
We (a local hackerspace) have a Tumblr blog and wanted to create a Facebook page for ourselves. Before going live we wanted to import all our Tumblr content to Facebook so our fans on Facebook can browse it there as well. For this I made an app that reads all the posts from our Tumblr blog and publishes them to our new Facebook page (backdating those posts as well). Here's my problem: after the app does about 130 re-posts (~260 operations: publish + backdate) I start getting an error:
Received Facebook error response of type OAuthException: It looks like you were misusing this feature by going too fast. You’ve been blocked from using it.
Learn more about blocks in the Help Center. (code 368, subcode 1390008)
The block is gone the next day, but after a similar number of operations it comes back. A couple of hours later, when the block was gone again, I introduced 6-second delays between operations, but that didn't help and after 19 re-posts I was blocked again. Some facts:
I am publishing posts to the feed of a (not yet published) page of which I am the (only) owner.
The app is a standalone Java application and uses restfb to talk to Facebook.
The line that is causing the error: facebookClient.publish("me/feed", FacebookType.class, params.toArray(new Parameter[0]));
All publish operations contain a link, mostly to the respective posts on our Tumblr. Some contain a message, caption, or name (depending on the post type).
I need to re-post ~900 posts from Tumblr, and I have done ~250 so far. When that's done, I will likely put the app on a server, scheduled, to keep syncing new posts one at a time.
This app is not meant to be used publicly, it is rather a personal utility (but the code will be posted to GitHub, should anybody need it).
This is my first experience with the Facebook API, and I wasn't able to find a place where I could officially ask them this question. I could proceed at 100 posts/day, but I'm afraid I will eventually get banned for good, even though I don't feel I'm doing anything wrong.
I haven't put any more code here, as the code itself does not seem to be a problem, but rather the rate at which it is executed.
So, should I proceed with 100 posts/day and hope I won't be banned, or is there another "correct" way of dealing with this?
Thanks in advance!
I'm answering a bit late, but I just had this problem too, so I did some research: it seems that besides the rate limits described in the Facebook docs, there is also a much stricter and more opaque limit on POST requests, intended to curb spam.
It isn't clearly documented, but it could depend on your relationship to the page you're posting to (admin or not), whether you post to multiple pages, and whether you post too quickly.
To answer the question: it seems it would have been fine if you had posted at about 1 post per minute or slower.
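If pacing is indeed the trigger, the cheapest fix is to throttle the publish loop on the client side. A minimal sketch of that pacing logic in Python (the original app is Java/restfb, but the idea translates directly; `publish` stands in for the actual API call, and the injected `sleep`/`clock` are just there to make the function testable):

```python
import time

def throttled_publish(posts, publish, min_interval=60.0,
                      sleep=time.sleep, clock=time.monotonic):
    """Run publish(post) for each post, never faster than one call
    every min_interval seconds."""
    last = None
    for post in posts:
        now = clock()
        if last is not None and now - last < min_interval:
            # Wait out the remainder of the interval before posting again.
            sleep(min_interval - (now - last))
        publish(post)
        last = clock()
```

At 1 post per minute, 900 posts take about 15 hours, which is slow but finishes without tripping the spam heuristic described above.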
I think you exceeded the rate limit for your user ID.
- Your app can make 200 calls per hour per user in aggregate. As an example, if your app has 100 users, this means that your app can make 20,000 calls. One user could make 19,000 of those calls and another could make 1,000, so this isn't a per-user limit. It's a per-app limit.
- That hour is a sliding window, updated every few minutes.
- If your app is rate limited, all calls for that app will be limited, not just for a specific user.
- The number of users your app has is the average daily active users of your app, plus today's new logins.
Check this: https://developers.facebook.com/docs/graph-api/advanced/rate-limiting
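If you want to stay under a budget like the 200 calls per hour quoted above, you can keep a client-side sliding-window counter and back off before Facebook does it for you. A rough sketch (the numbers come from the quote above; this is a local guard, not a reproduction of Facebook's actual accounting):

```python
import time
from collections import deque

class SlidingWindowGuard:
    """Allow at most `limit` calls in any `window`-second span."""

    def __init__(self, limit=200, window=3600.0, clock=time.monotonic):
        self.limit = limit
        self.window = window
        self.clock = clock
        self.calls = deque()  # timestamps of recent calls

    def allow(self):
        now = self.clock()
        # Forget calls that have slid out of the window.
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```

Before each API call, check `guard.allow()` and sleep if it returns False; that keeps you under the documented ceiling even when the window slides.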
Okay, so a WordPress gallery plugin led to a massive headache: with about 17 galleries each having their own pagination, the links within created what might as well be an infinite number of variant URLs combining the various query variables from each gallery.
As a result, Google has not been so smart and has been HAMMERING the server to the tune of 4 GB an hour prior to my actions, and about 800 requests a minute on the same page, sending the server load up to 30 at one point.
It's been about 12 hours, and regardless of the changes I've made, Google is not listening (yet) and is still hammering away.
My question is: Is there a way to contact Google support and tell them to shut their misbehaving bot down on a particular website?
I want a more immediate solution as I do not enjoy the server being bombarded.
Before you say it: even though this isn't what I'm asking about, I've already done the following:
Redirected all traffic using the misused query variable back to the Googlebot IP, in the hope that the bot being forwarded back to itself will be a wake-up call that something is not right with the URL. (I don't care if this is a bad idea.)
Blocking the most active IP address from accessing that site.
Disabled the URLs from being created by the troubled plugin.
In Google Webmaster Tools/Search Console, I've set the URL parameters to "No: Doesn't affect page content" for the query variables.
Regardless of all of this, Google is still hammering away at 800 requests per minute (13 requests a second).
Yes, I could just wait it out, but I'm looking for a "HEY GOOGLE! STOP WHAT YOU ARE DOING!" solution besides being patient and allowing resources to be wasted.
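One stopgap that doesn't require reaching Google support: serve Googlebot an HTTP 503 for the runaway URLs, since crawlers generally treat 503 as "temporarily unavailable, retry later" and back off. A hedged sketch for Apache with mod_rewrite, where gallery_page is a placeholder for whatever query variable your plugin actually uses:

```apache
# Return 503 to Googlebot for the runaway gallery URLs so it backs off.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteCond %{QUERY_STRING} gallery_page= [NC]
# With a status code of 400 or above, the substitution is ignored
# and the error response is returned directly.
RewriteRule ^ - [R=503,L]
```

Unlike blocking by IP, this also tells the crawler in its own vocabulary to slow down, and normal visitors without that query variable are unaffected. You can also lower the crawl rate in Search Console's site settings, though that change takes a while to apply.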
First of all, I'm not sure if this is the correct place to post this question; if it's not, please tell me where I should post it.
My question is whether I can use Dropbox to host images and then send emails that link to those images so they display inline. I don't want to send the image as an attachment, but as an image embedded in the email body. Is that possible, or do I have to upload it to a hosting service?
This question was already answered here, and I strongly agree with the accepted response: don't do it in production.
Dropbox imposes bandwidth limits, which you can confirm here and which are quoted below, so I would say it's OK for internal testing only.
Dropbox Basic (free) accounts:
The total amount of traffic that all of your links and file requests
together can generate without getting banned is 20 GB per day. The
total number of downloads that all of your links together can generate
is 100,000 downloads per day.
If you have no other option or still insist on doing it, just be sure to stay under the limits so that you don't go against their terms of use and risk being banned.
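If you do go ahead for internal testing, note that a plain Dropbox share link (ending in ?dl=0) points at Dropbox's preview page, not at the image bytes, so it won't work as an <img> source. At the time of writing, swapping that parameter for raw=1 makes Dropbox serve the file content directly; this URL format is Dropbox's own and may change, so treat the helper below as a sketch:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def to_direct_link(share_url):
    """Rewrite a Dropbox share link (?dl=0) into a direct-content
    link (?raw=1) suitable for an <img src> attribute."""
    scheme, netloc, path, query, frag = urlsplit(share_url)
    params = dict(parse_qsl(query))
    params.pop("dl", None)   # drop the "show preview page" flag
    params["raw"] = "1"      # ask Dropbox to serve the raw file
    return urlunsplit((scheme, netloc, path, urlencode(params), frag))
```

For example, a link copied from the Dropbox UI such as https://www.dropbox.com/s/abc123/photo.png?dl=0 becomes ...photo.png?raw=1, which renders inline in most email clients that load remote images.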
I'm building an application for my personal use that saves all my facebook messages in a database on my computer.
But I have a problem: it seems only a few messages can be accessed through the Graph API.
I created a token with all the possible permissions.
When issuing a call:
/me/inbox
I get all the threads in my inbox, but for some of them the comments field, which contains the actual messages, is missing. This happens mostly for conversations with people who are not friends with me on Facebook.
For those threads, when I try to get more information via /<id_of_the_thread>,
I get an error (code 100) Unsupported get request from the Graph API.
Is this normal behaviour of the API?
What am I missing here?
Don't hesitate to answer if you know a better way of saving all my messages.
Another, somewhat inferior but much more accessible, way of obtaining your Facebook messages is to download a copy of your Facebook data through https://www.facebook.com/settings. This way you can download an archive with all your FB data, including your messages. They are, however, capped at 10,000 messages per conversation, and are all stored in one .htm file, which is not very practical if you want to process them further.
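To get those messages out of the .htm and into a database, the standard-library HTML parser is enough for a first pass. The markup of Facebook's export has changed between export versions, so the assumption below (one message per <p> tag) is a guess that you will need to adapt to your own archive:

```python
from html.parser import HTMLParser

class MessageText(HTMLParser):
    """Collect the text content of every <p> element.

    Assumes one message per <p>, which matches some versions of
    Facebook's messages.htm export but not necessarily yours."""

    def __init__(self):
        super().__init__()
        self.in_p = False
        self.messages = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.messages.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.messages[-1] += data

def extract_messages(html_text):
    parser = MessageText()
    parser.feed(html_text)
    return [m.strip() for m in parser.messages if m.strip()]
```

Feed it the contents of the exported file and you get a plain list of strings, ready to insert into whatever database schema you are using.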
No, I don't think you can specify a thread by its ID; what I commonly do instead is sort the threads by sender. CMIIW
I have a web site with various graphs embedded in it that are generated externally. Occasionally those graphs will fail to generate and I would like to catch that when it happens. These graphs are embedded in multiple pages and I would rather not check each page manually. Is there any kind of tool or perhaps a browser addon that could periodically take screenshots of different URLs and email them in a single email? It would be sufficient to have scaled-down screenshots of full pages emailed maybe once a day to me, allowing me to take a quick glance and see that all the graphs are there and look okay.
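If the graphs are ordinary <img> tags, you may not need screenshots at all: a small script can fetch each page and verify that every image URL still responds, and you only get an email when something is actually broken. A sketch in Python with the standard library only (the page list and the mailing step are left to you):

```python
import urllib.error
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class ImgCollector(HTMLParser):
    """Collect the src attribute of every <img> tag on a page."""

    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def image_urls(html_text):
    parser = ImgCollector()
    parser.feed(html_text)
    return parser.srcs

def broken_images(page_url):
    """Return the image URLs on `page_url` that fail to load."""
    with urllib.request.urlopen(page_url, timeout=15) as resp:
        html_text = resp.read().decode("utf-8", errors="replace")
    broken = []
    for src in image_urls(html_text):
        # Resolve relative srcs against the page URL before checking.
        url = urllib.parse.urljoin(page_url, src)
        try:
            urllib.request.urlopen(url, timeout=15).close()
        except (urllib.error.URLError, ValueError):
            broken.append(url)
    return broken
```

Run it daily from cron over your list of pages and send yourself the combined output; an empty report means every graph loaded. This catches missing images, though not images that generated but look wrong, which is where a screenshot service would still help.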
I'm a big fan of automation. Rather than have emails generated that you then have to look at, take a look at 'replacing custom missing images in jquery'. This will run a piece of JavaScript for each image that fails to load. Extending that to make a request to a URL that you control, which could also include the broken URL (or just the broken filename), would not be too hard. That URL would then generate an email and store the broken URL so that it doesn't send 5,000 emails if there's a flurry of hits to your page.
Another idea, building on the above, is to effectively change the external 404 from the source site to a local one (e.g. /backend/missing-images/); the full path need not exist, since you are just generating a local 404 record in your Apache logs. Logwatch will then email you a list of 404 pages from the Apache log daily (or more often, if you want).