My article page (https://psychology-to-go.com/what-is-stage-5-alzheimers) does not perform well in PageSpeed Insights. It only scores 54 for mobile and 67 for desktop, even though I have already done the following:
compressed all JPGs,
preloaded the hero image,
used WebP,
compressed the MP4 and created a WebM fallback,
minified the CSS and deleted unused CSS classes.
The total blocking time is almost 1.2 seconds...
Could it be my server? What can I try in my code to make it run faster?
The whole idea behind building the site with AMP was to have a super-fast website... Quite the bummer...
P.S.: I have a cheap $5 server and manage the backend with Laravel Forge. Should I consider upgrading? Could somebody share their server specs, too?
Have you tried AMP Optimizer? It prerenders your AMP code and saves time.
Facebook recently announced the introduction of Messenger Codes, which can be used to add new contacts and, more importantly, to communicate directly with businesses and business pages (which is why I'm interested in them).
It took me ages to find, but at the bottom left of the Messages tab on my Facebook page I have the option to download my code in three different sizes: clicking the disc opens a modal window where you can click the Download button and choose a 300, 600 or 1000px PNG file.
NOTE: While they are PNG files, the background is not transparent, which seems like a bit of an oversight to me, but hey ho, that's what Photoshop is for I guess.
The problem is that while I can download my code, I can't find any way to test it on printed materials (or even electronically at the moment!). The scanning feature doesn't seem to have been rolled out for me yet (I tried re-installing the Messenger app to see if I got a newer version, but that didn't work), nor for anyone I know (I'm in the UK). The codes are bespoke to Messenger, so they can't be scanned or tested using any other app.
I'm probably too far ahead of the game, but is there any way I can test whether my code scans correctly, or anywhere I can go to find out? I would like to use it on some promotional material that will be in use long term, and I don't want to have to update it in the near future (several years, by which time these codes will likely be more commonplace).
I also need to know what the redundancy is like. For example, the high-redundancy QR codes I generate can have up to 30% of the code covered while still being usable, which is great for design purposes. I can't find any official documentation for these codes at all yet, let alone what is required, what the spec is, etc.
I know the most likely option is 'sit and wait' but I really would rather not if possible. I've never been very patient...
Thanks
UPDATE: My Messenger app has now been updated so I can test, but I'm leaving this here in case anyone knows of another way to test - if someone doesn't have Messenger on their phone, for example.
My website www.jeancharlesbarthelet.com has good meta tags. (I checked.)
When I try to share it on Facebook, there is no image and no description.
I have tried the debug tool a hundred times over the past month!
Nothing works.
I am very worried because lots of people are sharing my news at the moment.
Thanks for the help.
You should try to show some code examples...
That said, it may be a caching issue with your images and their server.
I did see that the path to the image (http://www.jeancharlesbarthelet.com/autre/logo.jpg) does serve up the image correctly. Perhaps revisit this tomorrow?
When did you upload it to your server? If your hosting company uses Akamai for caching, it could take up to 6 hours for the image to be seen globally...
I have already checked out this question, and it sounds like he's describing the exact same problem as me, except for a few things:
I'm not running on HTTPS.
80% of the time I try to debug, I get this message: "Error parsing input URL, no data was scraped."
The scraper works perfectly on a different domain on the same server, with the same theme and almost identical content. Every time I try that domain, it scrapes perfectly, including the image.
For the 20% of the time that it actually scrapes my page, I have the same issue as in the question linked above: it reads my thumbnail yet shows a blank image. The link leads to a working image, but it doesn't want to show anything.
The weird part is that it worked completely fine about 10 months ago, when I updated this blog on a daily basis. The only difference is that I've switched servers recently. While that could explain it, the other domain was switched as well and doesn't have this problem.
I am at a loss as to why my links either show no image at all on Facebook or just give me:
Domain Link
Domain
(no image, no description)
Very frustrating situation. Does anyone have any suggestions?
Update:
I have 6 domains...
When I moved servers recently, I found the new server wasn't set up to compress the pages, so my blog posts looked crazy. This forced me to turn compression 'off' in WP Super Cache on my main blog. I also did it on my 2nd-highest-traffic blog, figuring I'd get to the other 4 later.
Well, now those first two blogs appear to work fine in the Facebook debugger, but the remaining 4 still have trouble. The tricky part is, I completely removed WP Super Cache from one site and still had trouble fetching the data.
So while it seems like it should logically have been a WP Super Cache issue, continuing to get errors after removing it makes me doubt that now. I'm still so baffled.
Update:
OK, I loaded Chrome and IE, and both were able to pull the data with ease. The Google snippet tool also worked great. I am going to try posting a link to my Facebook fan page via Chrome and see if it works correctly.
I did clear my Firefox cache and it didn't change anything, but I am still confused as to why one domain works OK while the other does not. Either way, if adding the link in Chrome works, I'll stick with that for now.
Any other suggestions?
Caching should not cause any problem. If a browser can see your page, so can the Facebook debugger.
Check whether a 500 error is being returned. Try from a different browser, clear the browser cache, etc. Try the Google Rich Snippets tool and see whether a custom search engine scrapes it fine.
PS: It would be nicer if you posted the URL.
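One quick way to see roughly what the scraper is getting back is to request the page yourself with Facebook's crawler User-Agent and inspect the status code and og: tags. A minimal sketch, assuming Python with the requests library and a placeholder URL (substitute the page you are debugging):

import re
import requests

# Placeholder URL; substitute the page you are debugging.
URL = "http://www.example.com/some-post/"

# Facebook's scraper identifies itself with this User-Agent string.
headers = {
    "User-Agent": "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"
}

resp = requests.get(URL, headers=headers, timeout=10)
print("Status:", resp.status_code)

# Crude check for the Open Graph tags the scraper would read.
for tag in re.findall(r'<meta[^>]+property="og:[^"]+"[^>]*/?>', resp.text):
    print(tag)

If the status code or the tags differ from what a normal browser sees, that points at the server or a cache treating the crawler differently.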
I have a jQuery Mobile application running inside a UIWebView in an iPhone application. The web view shows the jQuery Mobile page correctly, BUT only when the page is not loaded over a 3G connection. I know this sounds very weird, and it is a very weird problem indeed, because if the page is loaded over a WiFi connection, it is displayed perfectly... here is a screenshot...
If instead my client uses a 3G connection to load the jQuery Mobile page, it seems that for some reason the JavaScript and CSS files needed to display the page are not loaded. Here is a second screenshot showing what the page looks like when it is loaded over 3G...
(Note: I know that this screenshot is not exactly the same page as the first one, but when it loads correctly it has the same styling as the first screenshot)
As you can see from the title of the page, the 3G connection that is giving this problem is in the Netherlands and my client has tried two different 3G providers in the Netherlands and encounters the same problem with both providers. If I test the application where I live, namely in South Africa, the page loads correctly with my 3G connection.
So, my question is: does anyone have any idea what could be causing the jQuery Mobile JavaScript and CSS to fail to load on 3G connections in the Netherlands?
I have determined what was going wrong with the 3G connection in Holland. The problem arises because several mobile operators modify content before delivering it to the phone, and this modification breaks jQuery. From my experience and from what I have read on the internet, it seems that the following providers do content modification: O2 in the UK, and Vodafone and T-Mobile in the Netherlands.
To see reports from other people who have encountered issues with these 3G connections breaking JavaScript, look at the following links:
http://stuartroebuck.blogspot.com/2010/07/mobile-proxy-cache-content-modification.html
http://blog.gotfocussolutions.com/index.php/2011/10/jquery-mobile-doesnt-work-on-o2-3gedge-due-to-mobile-proxy-cache-content-modification/
http://bugs.jquery.com/ticket/8917
http://www.ladysign-apps.com/blog/code/javascript/jquery-does-not-load-3g-iphone-safari/
The last link listed above also gives the workaround for this problem: the JavaScript file that is being modified and broken by the 3G connection must be moved to an external server. So, for example, if jQuery is being broken by the 3G connection (as was the case for me), then don't serve the jQuery file yourself; instead, use a CDN like Google's:
http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js
Hope this information saves someone else the hours of frustration that this issue has cost me!
For anyone experiencing this problem on O2 (certainly here in the UK, anyway): the reason is that O2 has an "optimisation platform" which takes external CSS and JavaScript files and inlines them in the document, which can cause conflicts. (source)
This was certainly one of the better links I stumbled upon regarding this particular problem:
http://stuartroebuck.blogspot.co.uk/2010/08/official-way-to-bypassing-data.html
One of the more reliable workarounds is to modify your website's headers to return the Cache-Control: no-transform header, as O2 have indicated that if this header is present they will not modify the response.
You can add the following to your .htaccess file:
<files ~ "\.(html|php|js)$">
Header add Cache-Control "no-transform"
</files>
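If your pages aren't served through Apache, the same header can be sent from application code instead. A minimal sketch, assuming a Python/Flask backend (the original poster's stack isn't specified, so treat this purely as an illustration of the idea):

from flask import Flask

app = Flask(__name__)

@app.after_request
def add_no_transform(response):
    # Ask intermediaries (such as O2's optimisation proxy) not to rewrite the response.
    response.headers["Cache-Control"] = "no-transform"
    return response

@app.route("/")
def index():
    # Every page served by the app now carries the no-transform header.
    return "<script src='https://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js'></script>"

Either way, the important part is that the Cache-Control: no-transform header ends up on the HTML and script responses.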
I solved this by changing the URL in my Ajax call from ../folder/subfolder/test.php to http://mydomain/folder/subfolder/test.php
I am building an iPhone Wikipedia game app that requires modifying the default wiki HTML a little (mostly simplifying the page).
So far I am downloading the HTML output of en.wikipedia.org/wiki/Article_Foo directly to a Python Google App Engine app, then modifying its CSS and HTML structure, caching it, and finally outputting it to the iPhone. It works, but I find this method quite tedious; there must be a better way?
Please note that I use App Engine not just for parsing the wiki; the game also requires it to keep the stores, etc., hence it's not overkill. Also, I would prefer doing all the work in Python on App Engine, to keep the iPhone client as thin and mobile as possible (XML on the iPhone is no fun at all).
Thanks a lot.
=======
Nick asks why I don't just use the mobile Wikipedia site, which is already optimized for the iPhone. However, the issue is that it goes down quite frequently (every couple of weeks or so), and its HTML structure also changes quite frequently.
You can use the MediaWiki API to download the wiki markup, and there are Python API libraries that could make processing and modifying it easier.
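A minimal sketch of fetching the raw wiki markup through the API, using only the Python standard library (Python 3 here; the article title and User-Agent string are placeholders):

import json
import urllib.parse
import urllib.request

API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_wikitext(title):
    # Ask the MediaWiki API for the raw wiki markup of a single article.
    params = urllib.parse.urlencode({
        "action": "parse",
        "page": title,
        "prop": "wikitext",
        "format": "json",
        "formatversion": "2",
    })
    req = urllib.request.Request(
        API_URL + "?" + params,
        headers={"User-Agent": "wiki-game/0.1 (contact@example.com)"},  # placeholder UA
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["parse"]["wikitext"]

# Substitute your own article title (e.g. the question's Article_Foo) here.
print(fetch_wikitext("Alan_Turing")[:300])

From there you can run your own simplification over the markup before caching it; requesting prop=text instead returns the parsed HTML if you would rather keep working with HTML.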
Caching and outputting to the iPhone is fine; I believe there is not much to simplify there.
Why not just fetch the mobile version of the page from http://en.m.wikipedia.org/? This is already formatted for mobile devices.
You can set up your own copy of the server used by m.wikimedia.org:
http://github.com/hcatlin/wikimedia-mobile
It's written in Ruby, but this shouldn't be an issue if your app just uses the HTML output.