I am posting a link to the feed of a page with the Graph API. When I last checked, a couple of months ago, my code was working. But today the same code stops working and returns an error.
Basically what I do is:
$ curl -i -F 'access_token=my_application_token' -F 'link=http://www.foodnetwork.com/recipes/tyler-florence/parsnip-puree-recipe2/index.html' -F 'name=Parsnip Puree' -F 'picture=http://img.foodnetwork.com/FOOD/2009/02/25/TU0603-1_Parsnip-Puree_s4x3_tz.jpg' -F 'id=my_page_url' https://graph.facebook.com/feed
It now returns the following result:
HTTP/1.1 500 Internal Server Error
Access-Control-Allow-Origin: *
Cache-Control: no-store
Content-Type: text/javascript; charset=UTF-8
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Pragma: no-cache
WWW-Authenticate: OAuth "Facebook Platform" "unknown_error" "An unknown error has occurred."
X-FB-Rev: 600290
X-FB-Debug: mTeWwusHg5daIP2IMHlebi8fnLT9PO0CNJQeshMC+Hg=
Date: Mon, 30 Jul 2012 19:07:01 GMT
Connection: keep-alive
Content-Length: 87
{"error":{"message":"An unknown error has occurred.","type":"OAuthException","code":1}}
If I try the same post without the "link" parameter, it works:
$ curl -i -F 'access_token=my_application_token' -F 'name=Parsnip Puree' -F 'picture=http://img.foodnetwork.com/FOOD/2009/02/25/TU0603-1_Parsnip-Puree_s4x3_tz.jpg' -F 'id=my_page_url' https://graph.facebook.com/feed
This returns the following, and I can see the post on my Facebook wall (without the desired link, of course):
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Cache-Control: private, no-cache, no-store, must-revalidate
Content-Type: text/javascript; charset=UTF-8
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Pragma: no-cache
X-FB-Rev: 600290
X-FB-Debug: iVVyk65AbEbnXNm0RyurLp/ZQA/oNXJ47w1UkLXXTfw=
Date: Mon, 30 Jul 2012 19:07:19 GMT
Connection: keep-alive
Content-Length: 40
{"id":"155190691260287_268086653304xxx"}
What puzzles me is that the same code with the "link" parameter had been working, and the Facebook documentation does not mention any change to the "link" parameter for posting to a feed.
Any idea what went wrong?
Thanks.
I've been having a similar issue lately: posting links to page feeds returns an error in the reply, yet the post is still created. I have the proper permissions and the app is public. There's currently an open bug report at https://developers.facebook.com/bugs/726825557390253 that might shed some light on the problem, should a FB dev respond!
Related
I have this URL, xxxxx.com/xxxx/xx-xx-2/xxxx, and want to remove the "-2" from the second path segment so it becomes xxxxx.com/xxxx/xx-xx/xxxx. The "-2" part is random. I have no idea how to do it; any help would be appreciated.
(Apologies for my bad English.)
You should add a location block like the one below:
location ~ /[^/]+/[^/]+-\d+/ {
    rewrite ^/([^/]+/[^/]+)-\d+/(.*) $1/$2 redirect;
}
In a URL like /part1/part2-number/, the first /[^/]+ matches /part1, the second [^/]+ matches part2, and -\d+/ matches -number; the rewrite then drops the number. Note that it must be -\d+ rather than the character class -[\d+], which would match only a single digit (or a literal +) and so fail on multi-digit suffixes like -10.
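The same capture-and-drop pattern can be sanity-checked outside nginx with sed (a sketch; PCRE's \d becomes [0-9] in POSIX extended regex):

```shell
# Apply the same pattern the rewrite uses: capture the first two path
# segments, drop the trailing -<number>, and keep the rest of the path.
strip_suffix() {
  printf '%s\n' "$1" | sed -E 's|^/([^/]+/[^/]+)-[0-9]+/(.*)|/\1/\2|'
}

strip_suffix /tarun/lalwani-2/abc    # -> /tarun/lalwani/abc
strip_suffix /tarun/lalwani-10/abc   # -> /tarun/lalwani/abc
```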
Test Results:
$ curl -I localhost/tarun/lalwani-2/abc
HTTP/1.1 302 Moved Temporarily
Server: openresty/1.11.2.2
Date: Fri, 06 Oct 2017 11:44:23 GMT
Content-Type: text/html
Content-Length: 167
Location: http://localhost/tarun/lalwani/abc
Connection: keep-alive
$ curl -I localhost/tarun/lalwani-10/abc
HTTP/1.1 302 Moved Temporarily
Server: openresty/1.11.2.2
Date: Fri, 06 Oct 2017 11:44:27 GMT
Content-Type: text/html
Content-Length: 167
Location: http://localhost/tarun/lalwani/abc
Connection: keep-alive
Every article on the following website fails when the page is shared via the share link in our social media widget.
For example, the following page, http://news.gc.ca/web/article-en.do?mthd=index&crtr.page=1&nid=957389, when shared via the following link:
returns an "Error: Connection error" in the share preview window. The response headers for this request are:
Cache-Control: private, no-cache, no-store, must-revalidate
Content-Encoding: gzip
Content-Type: text/html
Date: Wed, 01 Apr 2015 14:22:20 GMT
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Pragma: no-cache
Strict-Transport-Security: max-age=15552000; preload
Vary: Accept-Encoding
X-Content-Type-Options: nosniff
X-FB-Debug: iYvCJSRJJqVdVXyIubD1kVq6teZBILIBRrCACZW9hI/Ms+B2qsquq52KyU5820UTfLmuXTis3LbRoL2bMlCVBw==
X-Frame-Options: DENY
X-XSS-Protection: 0
X-Firefox-Spdy: 3.1
200 OK
Running the above URL through the Open Graph object debugger returns:
Error parsing input URL, no data was cached, or no data was scraped.
and the scraper sees the following for the URL:
Document returned no data
We need to determine the cause of this, since none of our content can be shared from our site at the moment.
Any ideas or tips greatly appreciated!
Background info: some of my website's inner pages had malware on them last week. The whole site was reset, updated, and cleared of malware a couple of days ago. The website is SE-optimized, but not spammed at all.
Q: Some inner pages of the website have suddenly dropped from the Google results.
Searching site:http://www.domain.com/innerpage doesn't even give a result.
Searching cache:http://www.domain.com/innerpage has had no results since today.
Webmaster Tools page error: "The page seems to redirect to itself. This may result in an infinite redirect loop. Please check the Help Center article about redirects."
HTTP/1.1 301 Moved Permanently
Date: Mon, 28 Oct 2013 20:15:18 GMT
Server: Apache
X-Powered-By: PHP/5.3.21
Set-Cookie: PHPSESSID=170fc3f0740f2eb26ed661493992843a; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
X-Pingback: http://www.domain.com/xmlrpc.php
Location: http://www.domain.com/innerpage/
Vary: Accept-Encoding,User-Agent
Content-Encoding: gzip
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=UTF-8
The .htaccess file looks fine too. Do you guys have any idea what's going on here?
The page is W3C-valid, and even online Googlebot simulators show status 200 OK.
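One way to see, from headers like the 301 above, whether the redirect points back at effectively the same page is to compare the requested path with the Location target while ignoring the trailing slash; a minimal sketch with hypothetical values taken from the header dump:

```shell
# Hypothetical values from the 301 above: the path that was requested
# and the path in the Location header of the response.
requested='/innerpage'
location='/innerpage/'

# Strip any trailing slash from both before comparing.
if [ "${requested%/}" = "${location%/}" ]; then
  echo "301 targets the same path: a loop if /innerpage/ also redirects"
else
  echo "301 targets a different path"
fi
```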
FB doesn't seem to respect Accept headers sent to /feeds/page.php. See the example below:
GET /feeds/page.php?id=10036618151&format=rss20 HTTP/1.1
Connection: Keep-Alive
**Accept: text/xml,application/xml**
Accept-Language: en-us
User-Agent: Mozilla/4.0 (compatible; Win32; WinHttp.WinHttpRequest.5)
Host: www.facebook.com
HTTP/1.1 200 OK
Cache-Control: private, no-cache, no-store, must-revalidate
**Content-type: application/rss+xml**
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Last-Modified: Wed, 08 Feb 2012 10:05:49 -0800
P3P: CP="Facebook does not have a P3P policy. Learn why here: http://fb.me/p3p"
Pragma: no-cache
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Set-Cookie: datr=-vIzT4cxw52hjjqTfrpQkNYX; expires=Sat, 08-Feb-2014 16:23:22 GMT;path=/; domain=.facebook.com; httponly
X-FB-Debug: qx/SiyRZDiVPm4wfiKVj37HImPoKM+DVAsO4oKSbSr0=
X-Cnection: close
Date: Thu, 09 Feb 2012 16:23:22 GMT
Content-Length: 41236
I cannot seem to find a way to post a new bug report on http://developers.facebook.com/bugs, as I don't have the "Create" (nor the "Subscribe") buttons described here: http://developers.facebook.com/blog/post/559/
I have read that there are a fair few FB developers involved with this site, and was hoping that someone could shed some light on what I might be doing wrong, or on how to request that FB change the code to respect my Accept header (or return a 406).
You are going to want to verify yourself as a developer in order to create a bug.
Check out this link for verification via:
mobile number
credit card
Once you have verified your account, head back to the bug system.
I'm watching 392 repositories on Github. However, the Github API only returns 100. Does anyone have any idea why?
https://github.com/api/v2/json/repos/watched/trivektor
You need to paginate manually using the page parameter. The HTTP response headers will tell you the next and the last page, when available. Check these headers:
X-Next
X-Last
Examples:
curl -D- https://github.com/api/v2/json/repos/watched/trivektor
HTTP/1.1 200 OK
Server: nginx/1.0.4
Date: Sat, 22 Oct 2011 08:24:45 GMT
Content-Type: application/json; charset=utf-8
Connection: keep-alive
Status: 200 OK
X-RateLimit-Limit: 60
ETag: "c597e396e9f17b91c5c5a7e462ba954f"
X-Next: https://github.com/api/v2/json/repos/watched/trivektor?page=2
X-Last: https://github.com/api/v2/json/repos/watched/trivektor?page=5
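Given a header dump like the one above, the next page's URL can be pulled out with sed; a small sketch using those header lines verbatim:

```shell
# Given a raw header dump, print the URL of the next page (empty if none).
headers='X-Next: https://github.com/api/v2/json/repos/watched/trivektor?page=2
X-Last: https://github.com/api/v2/json/repos/watched/trivektor?page=5'

next_url=$(printf '%s\n' "$headers" | sed -n 's/^X-Next: //p')
echo "$next_url"   # -> https://github.com/api/v2/json/repos/watched/trivektor?page=2
```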
Now the 2nd page:
curl -D- https://github.com/api/v2/json/repos/watched/trivektor?page=2
HTTP/1.1 200 OK
Server: nginx/1.0.4
Date: Sat, 22 Oct 2011 08:28:08 GMT
Content-Type: application/json; charset=utf-8
Connection: keep-alive
Status: 200 OK
X-RateLimit-Limit: 60
ETag: "c57d0e97e2062672cb3771467cf2abc7"
X-Next: https://github.com/api/v2/json/repos/watched/trivektor?page=3
X-Last: https://github.com/api/v2/json/repos/watched/trivektor?page=5
X-Frame-Options: deny
X-RateLimit-Remaining: 58
X-Runtime: 353ms
Content-Length: 44966
Cache-Control: private, max-age=0, must-revalidate
And the last one:
curl -D- https://github.com/api/v2/json/repos/watched/trivektor?page=5
HTTP/1.1 200 OK
Server: nginx/1.0.4
Date: Sat, 22 Oct 2011 08:28:30 GMT
Content-Type: application/json; charset=utf-8
Connection: keep-alive
Status: 200 OK
X-RateLimit-Limit: 60
ETag: "11ce44ebc229eab0dc31731b39e10dcf"
X-Frame-Options: deny
X-RateLimit-Remaining: 57
X-Runtime: 93ms
Content-Length: 7056
Cache-Control: private, max-age=0, must-revalidate
It's very common for APIs to limit the size of a response object to protect against outliers, and the fact that it returns a round number suggests this is by design. I don't see paging discussed in their docs, so it might just be a hard cap. Either way, you should just ping GitHub.
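For completeness, the follow-X-Next logic can be sketched as a loop. Here the HTTP call is stubbed with a shell function that mimics the five pages shown above; in real use you would swap the stub for `curl -D-` plus a sed extraction of the X-Next header:

```shell
# Stub standing in for `curl -D- <url>`: prints an X-Next header for
# pages 1-4 and nothing for the last page, mimicking the responses above.
fetch_headers() {
  if [ "$1" -lt 5 ]; then
    echo "X-Next: https://github.com/api/v2/json/repos/watched/trivektor?page=$(($1 + 1))"
  fi
}

page=1
pages=""
while : ; do
  pages="$pages $page"                        # "process" this page
  next=$(fetch_headers "$page" | sed -n 's/.*page=//p')
  [ -z "$next" ] && break                     # no X-Next: this was the last page
  page=$next
done
echo "fetched pages:$pages"                   # -> fetched pages: 1 2 3 4 5
```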