I am trying to upload an image to Twitter and make it appear automatically, meaning without needing to click "Show Photo". I am able to post a status and image using POST statuses/update_with_media. That is fairly easy to do, but it still does not appear "inline." The documentation shows an example of uploading an image with twurl, but I am working on Windows. Does anyone have an example with jQuery or a REST client showing how to use the https://upload.twitter.com/1.1/media/upload.json service? I tried putting the parameters in the URL and in the body, but each time I get Bad Request.
Even though I spent about 15 hours working on this, I found a "solution" 10 minutes after posting on SO. It seems that the size of the image plays a role. I just used statuses/update_with_media with a larger image and it appeared inline. A slightly smaller image does not appear inline. I would still like to know how to call the service, though.
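For what it's worth, OAuth 1.0a request signing needs the consumer secret, so calling media/upload.json straight from browser jQuery isn't really practical; a server-side REST client is the usual route. Here is a rough, untested sketch of the two-step flow (upload the file to media/upload.json, then pass the returned media_id_string to statuses/update), assuming the abraham/twitteroauth library and placeholder credentials and file path:

<?php
// Sketch only: keys, tokens and the image path are placeholders.
require 'vendor/autoload.php';

use Abraham\TwitterOAuth\TwitterOAuth;

$connection = new TwitterOAuth(
    'CONSUMER_KEY',
    'CONSUMER_SECRET',
    'ACCESS_TOKEN',
    'ACCESS_TOKEN_SECRET'
);

// Step 1: multipart upload to https://upload.twitter.com/1.1/media/upload.json
$media = $connection->upload('media/upload', ['media' => 'C:/images/screenshot.png']);

// Step 2: attach the returned media id to a normal status update
$tweet = $connection->post('statuses/update', [
    'status'    => 'Testing inline images',
    'media_ids' => $media->media_id_string,
]);

var_dump($connection->getLastHttpCode(), $tweet);

This media_ids approach is what the newer endpoint is meant for, so the image should render inline without relying on the image-size trick.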
I am evaluating a completely headless setup with Kirby.
I would like to be able to upload an image in one big size and have Kirby generate different sizes of responsive images. (I know this feature from WordPress, where this is possible.)
I see there is this: https://getkirby.com/docs/guide/templates/resize-images-on-the-fly
But this is all PHP Kirby code, and I just want to consume the REST API, so no Kirby syntax is available to me.
I tried to preconfigure the sizes within site/config/config.php, but when uploading a new image, it would not save it in different sizes…
I then thought that maybe it would just generate the specific size the first time the image is requested. I've seen this behaviour with multiple CDNs. But I did not find any documentation about that either.
So what would be the best way to upload an image and then be able to load a resized version of that image via the REST API?
Is there even a possibility?
Thank you for any input on this.
Cheers
Well, some time has passed and I have learnt a bit more about Kirby and how it works.
The image is generated on the fly when executing Kirby syntax like $file->resize(720, null, 60)->url() (read more: https://getkirby.com/docs/guide/templates/resize-images-on-the-fly).
So to get images in different sizes via the REST API, one can write a custom route where you are responsible for executing these resize commands whenever somebody consumes this endpoint. A simpler solution would be to use the plugin better-rest (https://github.com/robinscholz/better-rest), which already does part of this for you (for example, in the better-rest endpoint, images are returned as a srcset, among other things).
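To make that a bit more concrete, here is a rough, untested sketch of such a custom route in Kirby 3; the route pattern, parameter names and JSON shape are made up for illustration, so adjust them to your own content structure:

<?php
// site/config/config.php (sketch only; the route pattern is invented)
return [
    'routes' => [
        [
            // e.g. GET /thumbs/photography/trees/forest.jpg/720
            'pattern' => 'thumbs/(:all)/(:num)',
            'action'  => function ($path, $width) {
                $page = page(dirname($path));
                $file = $page ? $page->file(basename($path)) : null;

                if (!$file) {
                    return Kirby\Cms\Response::json(['error' => 'file not found'], 404);
                }

                // resize() creates the thumbnail on the first request and
                // caches it under /media, so later requests are cheap.
                return Kirby\Cms\Response::json([
                    'src' => $file->resize((int)$width)->url(),
                ]);
            }
        ]
    ]
];

With a route like this, the client just requests the endpoint with the width it needs, and Kirby's lazy thumbnail generation does the rest.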
I hope this helps somebody one day. Cheers
I've been working on an Android game, and I wanted to set up a webpage to redirect to the download page. This part was easy.
So I shared it on my Facebook timeline. OK, my screenshot got cropped horribly. I spent 30 minutes reading all sorts of guides on how to point Facebook to the proper image for a thumbnail ...
No matter what I do, Facebook always, always, always shows a horribly cropped image. I would like to simply specify the correct image to Facebook using og:image, but it seems to always ignore this. I've tried many different combinations of properties and it's always wrong. I've tried making the og:image square; doesn't matter. I've tried making it smaller; doesn't matter.
It's also hard to figure out whether Facebook is caching an old image or not, no matter how many times I go to the debug link and tell it to scrape things again.
Can anyone tell me any change I can make to show the entire image instead of the horribly cropped version?
I went to http://validator.w3.org and made sure the page didn't have errors beyond the og ones you apparently can't avoid.
The page is: www.playlunapuma.com
There must be a way to convince Facebook to show a different image that looks good.
The thing that finally worked was resizing the og:image to 600x315 exactly.
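For anyone hitting the same wall, here is a sketch of the kind of markup involved (the URLs and text are placeholders, not the real site's values); the important part is that the file behind og:image is exactly 600x315, which matches Facebook's 1.91:1 link-thumbnail ratio:

<!-- Placeholder values; the og:image file itself must be 600x315 -->
<meta property="og:title"       content="Luna Puma" />
<meta property="og:description" content="An Android game." />
<meta property="og:url"         content="http://www.playlunapuma.com/" />
<meta property="og:image"       content="http://www.playlunapuma.com/share-600x315.png" />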
When I use the Facebook debugger to view my website's top page, under the "Warnings That Should Be Fixed" section, I get a few different errors. However, if I press the "Fetch new scrape information" button once, then on the next pass only one error remains.
The error I am currently trying to solve is this one:
og:image could not be downloaded or is too small
og:image was not defined, could not be downloaded or was not big enough. Please define a chosen image using the og:image metatag, and use an image that's at least 200x200px and is accessible from Facebook. Image 'http://davegutteridge.com/+image/mizumushi_profile.jpg' will be used instead.
Facebook seems to have cached an old image and is displaying that one, even though I have deleted it from my server.
The image I want to show is this one. To try to make it work, I've removed the "exif" data, I've tried making it square (my original was 1200 by 600 pixels), and I've ensured it's well above 200 by 200 pixels. I have also tried refreshing multiple times and waiting over 24 hours to see if Facebook's cache changes. 48 hours after first discovering this problem, the debugger still shows the wrong image.
How do I get Facebook to reference the correct image?
Make sure you follow the specifications: https://developers.facebook.com/docs/sharing/best-practices#images
Square images are not a good choice, but I just tested it in the debugger and it does work fine. By the way, those are just warnings, not errors.
I had a similar problem, where the image was set "correctly" but the debugger would not take it. So I played around and figured out that I could help the debugger by adding og:image:width and og:image:height.
After that, the image was taken instantly :)
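In other words, something along these lines (the URL is a placeholder; the point is that the declared width and height match the actual file dimensions, so the very first scrape can render the image):

<!-- Placeholder URL; width/height must match the real image file -->
<meta property="og:image"        content="http://example.com/share-image.jpg" />
<meta property="og:image:width"  content="1200" />
<meta property="og:image:height" content="630" />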
This process is called pre-caching, and here is the reference for it ;)
https://developers.facebook.com/docs/sharing/best-practices#precaching
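If you want to trigger that pre-caching programmatically rather than through the debugger UI, the docs describe re-scraping a URL by POSTing it to the Graph API with scrape=true. A rough PHP sketch (the page URL and token are placeholders):

<?php
// Ask Facebook to (re)scrape a URL so the og:image is cached
// before anyone shares the link. Sketch only; values are placeholders.
$ch = curl_init('https://graph.facebook.com/');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query([
        'id'           => 'http://example.com/page-to-share/',
        'scrape'       => 'true',
        'access_token' => 'APP_ID|APP_SECRET', // app access token
    ]),
]);
echo curl_exec($ch);
curl_close($ch);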
Hope this helps!
I'm trying to get the version of a photo that FB uses as the background image for album layouts, where one side is always exactly 206px and the other is at least the same, for example https://fbcdn-sphotos-b-a.akamaihd.net/hphotos-ak-xpa1/v/t1.0-9/p206x206/10401476_409218399235620_1454525834273554679_n.jpg?oh=eda59fce63113796b35c46cc4bec162a&oe=55760EFD&gda=1433681435_6b4729404e05493108c271a50c753d7f I've scoured both the Graph API and SO, but am having absolutely no luck here. I'm aware that you can get 3 versions of a photo using type=album/thumbnail/normal, e.g. https://graph.facebook.com/409218399235620/picture?type=album (you'd think this one would work, but they probably never updated it to redirect to the new size?). I've tried all kinds of variants of this, knowing that the Graph documentation is pretty crap, to no avail.
I've also used the Graph API Explorer to, for example, pull up ?fields=images, but it's never one of the listed images there. I've tried using widths, replacing parts of the URL, etc. I am running the checks with a valid access_token.
I've pretty much resigned myself to just resizing the images myself, but given the time I've spent on this, I thought I'd at least put it out there in case someone else has had the same problem and come up with a solution.
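In case it helps anyone in the same spot: as far as I can tell the exact p206x206 rendition isn't exposed through the Graph API, so the pragmatic fallback is to pull ?fields=images, pick the smallest rendition that still covers 206px, and do the final crop/resize yourself. A rough sketch (photo id and token are placeholders):

<?php
// Sketch only: PHOTO_ID and ACCESS_TOKEN are placeholders.
$url  = 'https://graph.facebook.com/PHOTO_ID?fields=images&access_token=ACCESS_TOKEN';
$data = json_decode(file_get_contents($url), true);

$best = null;
foreach ($data['images'] as $img) {
    $short = min($img['width'], $img['height']);
    // keep the smallest rendition whose shorter side still covers 206px
    if ($short >= 206 && ($best === null || $short < min($best['width'], $best['height']))) {
        $best = $img;
    }
}

echo $best ? $best['source'] : 'no suitable size found';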
I have already checked out this question, and it sounds like he's describing the exact same problem as me, except for a few things:
I'm not running on HTTPS.
80% of the time I try to debug, I get this message: "Error parsing input URL, no data was scraped."
The scraper works perfectly on a different domain, but same server, same theme, with almost identical content. Every time I try that domain, it scrapes it perfectly, including the image.
During the 20% of the time that it actually scrapes my page, I am having the same issue as in the link above. It is reading my thumbnail, yet showing a blank image. The link brings me to a working image, but it doesn't want to show anything.
The weird part is that it worked completely fine about 10 months ago, when I updated this blog on a daily basis. The only difference is that I've switched servers recently. While that would explain a possibility, the other domain switched as well and doesn't have this problem.
I am at a loss as to why my links either show no image at all on Facebook or give me the:
Domain Link
Domain
(no image, no description)
Very frustrating situation. Does anyone have any suggestions?
Update:
I have 6 domains...
When I moved servers recently, I found the new server wasn't prepared to compress the pages, so my blog posts looked crazy. This forced me to turn compression 'off' in WP Super Cache on my main blog. I also did it on my second-highest-traffic blog, figuring I'd get to the other 4 later.
Well, now those first two blogs appear to work fine in the Facebook debugger, but the remaining 4 have troubles. The tricky part is, I completely removed WP Super Cache from one site and still had trouble fetching the data.
So while it seems like it logically should have been a WP Super Cache issue, continuing to have errors despite removing it leads me to believe it isn't. I'm still so baffled.
Update:
OK, I loaded Chrome and IE, and both were able to pull the data with ease. The Google snippet tool also worked great. I am going to try posting a link to my Facebook fan page via Chrome and see if it works correctly.
I did clear my Firefox cache and it didn't change anything, but I am still confused why one domain works fine while the other does not. Either way, if adding the link in Chrome works, I'll stick with that for now.
Any other suggestions?
Caching should not cause any problem. If a browser can see your page, so can the Facebook debugger.
See if there is a 500 error somewhere. Try from a different browser, clear the browser cache, etc. Try the Google rich snippet tool and see whether a custom search engine is scraping it fine.
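One more thing worth trying: fetch the page yourself with Facebook's crawler user agent and look at the status code and headers; that usually shows whether the scraper is hitting a 500, a bad redirect or a compression problem. A quick sketch (the URL is a placeholder):

<?php
// Fetch a page roughly the way Facebook's scraper would and inspect the result.
$ch = curl_init('http://example.com/problem-post/'); // placeholder URL
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_HEADER         => true,
    CURLOPT_USERAGENT      => 'facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)',
]);
$response = curl_exec($ch);
echo 'HTTP status: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
// the first chunk is usually enough to see whether the og: tags arrive intact
echo substr($response, 0, 1000);
curl_close($ch);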
PS: It would be nicer if you posted the URL.