I'm streaming videos via rtmp from Amazon Cloudfront. Videos are taking a loooong time to start playing, and I don't have any way of figuring out why. Normally I'd use the "Net" panel in Firebug or Web Inspector to get a good first impression of when an asset starts to load and how long it takes to be sent (which can indicate whether the problem is on the server end or network versus the browser rendering). But since the video is played within a Flash player (Flowplayer in this case), it's not possible to glean any info about the status of the stream. Also since it's served from Amazon Cloudfront, I can't put any kind of debugging or measuring tools on the server (if such a tool even exists).
So... my question is: what are some ways I can go about investigating this problem? I'm hoping there would be some settings I can tweak on either the front-end (flowplayer) or back-end (Cloudfront), but without being able to measure anything or even understand where the problem is, I'm at a loss as to what those could be.
Any ideas for how to troubleshoot streaming video performance?
You can use Wireshark (it can dissect RTMP) or Fiddler to check what is going on... another point (besides the client and the server) to keep in mind is your ISP.
To dig deeper you can use rtmpdump (http://rtmpdump.mplayerhq.hu/), FluorineFx (http://www.fluorinefx.com/), or the RTMP client at http://www.broccoliproducts.com/softnotebook/rtmpclient/rtmpclient.php.
You need to keep in mind that RTMP isn't ideal, since it usually bypasses proxies and tries to make a direct connection... if this doesn't work it can fall back, but that means some time has already passed (it waits for a connection timeout, etc.)... if you have an option to set CloudFront/Flowplayer to RTMPT then I would recommend doing so, since that tunnels the connection over port 80.
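If you do switch, the change is just in the player's connection URL. A minimal sketch, assuming the Flowplayer 3.x RTMP plugin and a CloudFront streaming distribution; the swf file names, distribution host and stream path below are placeholders, not real values:

    // Illustrative Flowplayer setup; file names, host and stream path are placeholders.
    flowplayer("player", "flowplayer-3.2.18.swf", {
        clip: {
            url: "mp4:path/to/video",   // stream name inside the distribution
            provider: "rtmp"            // route this clip through the RTMP plugin
        },
        plugins: {
            rtmp: {
                url: "flowplayer.rtmp-3.2.13.swf",
                // rtmpt:// tunnels RTMP over HTTP on port 80, so it gets through
                // proxies/firewalls that block the default RTMP port 1935.
                netConnectionUrl: "rtmpt://s1234567890.cloudfront.net/cfx/st"
            }
        }
    });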
Presumably, if you go and attempt to view a video, then come back 20 minutes later and hit it again, it loads quickly?
CloudFront's delivery chain looks roughly like this:
SAN -> Edge Servers -> Client
This is all well and good in a specific use case (i.e. a small origin content size and a large, long-running cache), but it becomes an issue when it's scaled out with lots of media hosts running content through the system, i.e. CloudFront.
The media cache kept on the edge servers gets dumped fairly often: once the cache fills up, the oldest files get dumped first. So if you have large video files that are not viewed often, they won't be sitting in the edge server cache, and they take a long time to transfer to the edges, giving an utterly horrific end-user experience.
The same is true of YouTube, for example: go and watch some randomly obscure, long video and try it through a couple of proxies so you hit different edge servers; you'll see exactly the same thing occur.
I noticed a very noticeable lag when streaming RTMP from CloudFront. I found that switching to straight HTTP progressive download from the Amazon S3 bucket made the lag go away.
Related
My company has an application which can be installed with the Qt Online Installer. The data is stored on our own server, but over time we found that the internet connection is a bit slow for users on the other side of the world. So the question is: "What services can we use to store this data, which are designed for this purpose?" While investigating, I found information about something called a "Content Delivery Network", but I'm not sure whether it fits.
Unfortunately, I don't have enough experience in this area, so maybe somebody knows more and could give me some advice. Thank you!
CloudFront on AWS. It depends on what your content is, but you can probably store it on S3 and then use CloudFront to cache it at edge locations across the globe.
Your research led you to the right topic, because it sounds like you could benefit from a CDN. CDNs store cached versions of your website, download files, video, etc. on their servers, which often form a distributed network across the globe, known as 'Points of Presence' (PoPs). When a user requests a file from your website, assuming it is leveraging a CDN, the request actually goes to the closest PoP and retrieves the file. This improves performance because the user may be very far from your origin server, or your origin server may not have enough resources to answer every request by itself.
The amount of time a CDN caches objects from your site depends on configurable settings. You can inform the CDN on how to cache objects using HTTP cache headers. Here is an intro video from Akamai, the largest CDN, with some helpful explanation of HTTP caching headers.
https://www.youtube.com/watch?v=zAxSE1M4yKE
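For example, a response for a cacheable asset might carry headers like these (the values are purely illustrative): max-age applies to browsers, while s-maxage targets shared caches such as a CDN PoP.

    HTTP/1.1 200 OK
    Content-Type: video/mp4
    Cache-Control: public, max-age=86400, s-maxage=604800
    ETag: "5d8c72a5edda8d6a"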
Cheers.
Does anyone know of best practices or common strategies in backend design for serving dynamic images and videos to client applications?
Background: I'm currently building an application that allows users to upload their own images and videos. I'm not really sure about how to serve these media files back to the client in the most efficient way. Do I store the files on the same VPS that my application server is running on? Do I need to save the files in different qualities / densities to better adjust for the clients' screen resolution? (I'll have mostly mobile clients)
I tried googling these questions but apparently I'm asking the wrong questions :-)
I would really appreciate maybe a reference or professional vocabulary on these topics.
Thanks in advance.
1) You need to split the web server and the application server.
First of all, do not try to stream media files from your backend unless you can offload the low-level work to the OS; most likely you will do it wrong.
Use a proxy server as the web server to serve such files.
nginx will do.
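As a rough sketch, an nginx config along those lines might look like this; the domain, paths and application port are assumptions, not values from the question:

    # Illustrative only: domain, paths and upstream port are assumptions.
    server {
        listen 80;
        server_name example.com;

        # Serve uploaded media straight from disk; nginx/the OS handle the I/O,
        # so the application server never sees these requests.
        location /media/ {
            root /var/www;      # files live under /var/www/media/...
            sendfile on;
            expires 7d;         # let browsers/CDN cache them
        }

        # Everything else goes to the application server.
        location / {
            proxy_pass http://127.0.0.1:8000;
            proxy_set_header Host $host;
        }
    }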
Also, you need to back up your media files the same way you back up your database.
Storing huge static media files alongside the application server is the wrong move; it will not scale at all.
You can add a cron task to move files to some CDN server; when the move is complete, you replace the URL in the database to match the new location.
So by using nginx you will save precious CPU and RAM while the file is being moved to the external server.
And the CDN will help you dedicate bandwidth and CPU/RAM resources to the application server.
2) Regarding image resolution and downsampling:
Screens of modern handsets have the same or even better resolution than a typical office workstation.
Link speed has a much bigger impact on UX.
If the client has a smartphone with a huge screen but a slow link, you still have to deliver the image or video as fast as possible, even if the quality of the media does not match the resolution of the handset.
It makes sense to downsample images on demand and store the result on disk for nginx/the CDN to serve again.
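A minimal sketch of that on-demand approach, assuming a small Node.js service with the express and sharp packages (my choice for illustration; the answer above does not prescribe any particular tool), with placeholder paths and limits:

    // Sketch only: resize on first request, then let nginx/the CDN serve the file.
    const express = require("express");
    const sharp = require("sharp");
    const fs = require("fs");
    const path = require("path");

    const ORIGINALS = "/var/www/media/originals";
    const CACHE = "/var/www/media/resized";     // directory served by nginx/CDN

    const app = express();

    app.get("/img/:width/:name", async (req, res) => {
        const width = Math.min(parseInt(req.params.width, 10) || 320, 2048);
        const name = path.basename(req.params.name);   // avoid path traversal
        const src = path.join(ORIGINALS, name);
        const out = path.join(CACHE, `${width}-${name}`);

        // Downsample only once; later requests hit the file already on disk.
        if (!fs.existsSync(out)) {
            await sharp(src).resize({ width }).toFile(out);
        }
        res.sendFile(out);
    });

    app.listen(8000);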
In the case of videos, it makes sense to make a "bad" version with heavy compression (quality loss) for the slow-link case; the device will scale it itself during playback.
And you can keep client statistics (screen sizes/downlink speeds) and generate optimized versions of a video file later, once you see that it is "popular".
FYI: several years ago some social media giant dropped the idea of preparing all possible versions of the same media file in favour of an FPGA-based on-the-fly resampler. I do not remember the name of the company or the URL of the article; it was probably Instagram.
Some cloud providers have offerings with FPGAs or CUDA on board to do the heavy lifting.
So in some cases you can trade storage for heavy horsepower and do the conversion on the fly.
I built a website for someone and used https://gtmetrix.com to get some analytics, mainly because the wait time is huge (~20 sec) even though there are no heavy images. Please find a screenshot here:
http://img42.com/05yvZ
One of my problems is that it takes quite a long time to perform the 301 redirect. I'm not sure why, but if someone has a clue to the solution I would really appreciate it. At least some hints on what to search for would be nice.
The second problem is that after the redirection, the waiting time is still huge. As expected, I have a few plugins; their JavaScript files are called approx. 6 seconds after the redirection. Could someone please point me in the right direction on where to look?
P.S. I have disabled all plugins and started from a naked, plain Twenty Eleven theme, but I still have waiting times during redirection and a smaller delay after redirection.
Thanks in advance
A few suggestions:
1 and 2.) If the redirect is adding noticeable delays, test different redirect methods. There are several approaches to this, including HTML meta and server-side (i.e. PHP) methods; I typically stick to server-side. If a server-side method still shows noticeable delays, that may be a good indicator that you're experiencing server issues, and it may well have been your server causing your speed issues all along; contact your hosting provider.
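For reference, a server-side 301 in PHP is a single header call sent before any output; the target URL here is a placeholder:

    <?php
    // Server-side 301 redirect; must run before any other output is sent.
    header('Location: https://www.example.com/new-page/', true, 301);
    exit;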
3.) Take a look at the size of your media: images and video, also Flash if you're using any. Often it's giant images that were sliced/saved poorly and not optimized for the web in image-editing software like Photoshop. Optimize your images for the web and re-save them at a lower weight to save significantly on load time. Also, in many cases nowadays you can avoid using clunky images altogether by building the area out with pure CSS3 (e.g. odd repeatable .gifs used to create gradients or borders, etc.).
I've got a browser game and I have recently started adding audio to the game.
Chrome does not load the whole page and gets stuck at "91 requests | 8.1 MB transferred"; it does not load any more content, and it even breaks the website in all other tabs, saying "Waiting for available socket".
After 5 minutes (exactly) the data is loaded.
This does not happen on any other browser.
Removing one MP3 file (the latest added one) fixed the problem, so is it perhaps a data limit problem?
Explanation:
This problem occurs because Chrome allows up to 6 open connections by default. So if you're streaming multiple media files simultaneously from 6 <video> or <audio> tags, the 7th connection (for example, an image) will just hang, until one of the sockets opens up. Usually, an open connection will close after 5 minutes of inactivity, and that's why you're seeing your .pngs finally loading at that point.
Solution 1:
You can avoid this by minimizing the number of media tags that keep an open connection. And if you need to have more than 6, make sure that you load them last, or that they don't have attributes like preload="auto".
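For instance, media elements declared like this will not grab a socket until the user actually presses play (the file paths are placeholders):

    <!-- preload="none": no connection is opened until playback starts -->
    <audio src="/sounds/shot.mp3" preload="none" controls></audio>
    <video src="/videos/intro.mp4" preload="none" controls></video>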
Solution 2:
If you're trying to use multiple sound effects for a web game, you could use the Web Audio API. Or, to simplify things, just use a library like SoundJS, which is a great tool for playing a large number of sound effects / music tracks simultaneously.
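A minimal Web Audio sketch: each file is fetched once as an ordinary HTTP request and decoded into a buffer, so playing many effects at once doesn't keep media sockets open (the sound URL is a placeholder):

    // Minimal Web Audio sketch; the sound URL is a placeholder.
    const ctx = new (window.AudioContext || window.webkitAudioContext)();

    async function loadSound(url) {
        const response = await fetch(url);        // one normal HTTP request
        const data = await response.arrayBuffer();
        return ctx.decodeAudioData(data);         // decoded; the socket is freed
    }

    function play(buffer) {
        const source = ctx.createBufferSource();  // cheap, can overlap freely
        source.buffer = buffer;
        source.connect(ctx.destination);
        source.start();
    }

    // Load once, then play as often (and as concurrently) as needed.
    loadSound("/sounds/shot.mp3").then(play);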
Solution 3: Force-open Sockets (Not recommended)
If you must, you can force-open the sockets in your browser (In Chrome only):
Go to the address bar and type chrome://net-internals.
Select Sockets from the menu.
Click on the Flush socket pools button.
This solution is not recommended because you shouldn't expect your visitors to follow these instructions to be able to view your site.
Looks like you are hitting the limit on connections per server. I see you are loading a lot of static files, and my advice is to separate them onto subdomains and serve them directly, with Nginx for example.
Create a subdomain called img.yoursite.com and load all your images from there.
Create a subdomain called scripts.yourdomain.com and load all your JS and CSS files from there.
Create a subdomain called sounds.yoursite.com and load all your MP3s from there... etc.
Nginx has great options for directly serving static files and managing static file caching.
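For illustration, a static-only vhost for one of those subdomains could look roughly like this (domain, root directory and cache times are assumptions):

    # Illustrative static-only vhost; domain, root and cache times are assumptions.
    server {
        listen 80;
        server_name sounds.yoursite.com;

        root /var/www/static/sounds;

        location / {
            sendfile on;
            tcp_nopush on;
            expires 30d;                        # long browser/CDN cache for MP3s
            add_header Cache-Control "public";
        }
    }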
The message:
Waiting for available socket...
is shown because you've reached a limit on the ssl_socket_pool, either per host, proxy or group.
Here are the maximum numbers of HTTP connections you can make with a Chrome browser:
Maximum per proxy: 32 connections. This can be changed in the Policy List.
Maximum per host: 6 connections. This is likely hardcoded in the source code of the web browser, so you can't change it.
Total: 256 HTTP connections pooled per browser.
Source: Enterprise networking for Chrome devices
The above limits can be checked or flushed at chrome://net-internals/#sockets (or in real-time at chrome://net-internals/#events&q=type:SOCKET%20is:active).
Your issue with audio may be related to Chrome bug 162627, where HTML5 audio fails to load once it hits the maximum number of simultaneous connections per server:proxy. This is still an active issue at the time of writing (2016).
A much older issue where HTML5 video requests stay pending is probably Issue #234779, which was fixed in 2014. There is also a SPDY-related one, Issue 324653: SPDY issue: waiting for available sockets, but that was also fixed in 2014, so it is probably not related.
Another related issue, now marked as a duplicate, is Issue 401845: Failure to preload audio metadata. Loaded only 6 of 10+, which was caused by media player code leaving a bunch of paused requests hanging around.
This may also be related to Chrome adware or antivirus extensions (like Sophos or Kaspersky) using your sockets in the background, so check for network activity in DevTools.
A simple and correct solution is to turn off preloading of your audio and video files and recheck your page; the "waiting for available socket" problem will be resolved.
If you use jPlayer, replace preload: "metadata" with preload: "none" in your jPlayer setup.
preload: "metadata" is the default value, which makes the browser request your audio/video files on page load; that's why Google Chrome shows the "waiting for available socket" error.
Our first thought is that the site is down or something similar, but in fact that is not the problem. Nor is it a connection problem, because when tested under Firefox, Opera or Internet Explorer the services open as normal.
The error in Chrome shows a message that says "This site is not available", clarified with the legend "Error 15 (net::ERR_SOCKET_NOT_CONNECTED): Unknown error". The error is quite common in Google Chrome, particularly after updates, and one workaround is to restart the computer.
Since partial solutions are not much help, here is a tutorial to solve the fault in less than a minute.
To avoid this problem and ensure that services open normally in Google Chrome, enter the following in the address bar: chrome://net-internals (then press "Enter"). Then go to "Sockets" in the left menu and choose "Flush socket pools" (see the screenshots in this guide: http://www.fixotip.com/how-to-fix-error-waiting-for-available-sockets-in-google-chrome/).
This solves the problem, and you will no longer experience issues accessing Gmail, Google or any of the other services of the Mountain View giant. I hope you find it useful.
Chrome is a Chromium-based browser, and Chromium-based browsers only allow a maximum of 6 open socket connections to the same host at a time; when the 7th connection starts up, it will just sit idle and wait for one of the 6 running connections to finish before it starts.
Hence the message 'waiting for available sockets': the 7th request waits for one of those 6 sockets to become available and then starts running.
You can try one of the following:
Clear browser cache & cookies (https://geekdroids.com/waiting-for-available-socket/#1_Clear_browser_cache_cookies)
Flush socket pools (https://geekdroids.com/waiting-for-available-socket/#2_Flush_socket_pools)
Flush DNS (https://geekdroids.com/waiting-for-available-socket/#3_Flush_DNS)
For security purposes I want to move some parts of the game logic onto the server.
For example (Java Applet):
I have a player that shoots items to get points. For each successful shot, a function is called on the server that checks whether it was legal and how many points the player gets, and adds them to the total. The total points are kept on the server, and a copy of the current total is sent to the applet for display.
Now my questions:
I want to use a PHP script for the server-side operations. Are there faster solutions?
How can I handle this for each user that plays the game? For each request there must be a new thread (which will already be handled by the Apache web server), and I save the total points to a SESSION variable until the game is over?
Thanks!
regards matthais
Using a PHP script from within Apache might not be the best-performing or most scalable solution, but it may be a quick way to set up and test your game.
Be aware that with HTTP you open a new connection every time you send a request to the server. This is a serious overhead! (EDIT: sorry, got mixed up here, since HTTP 1.1 persistent connections are the default. But there is still overhead for each HTTP request!)
Maybe you want to create your own server listening on some TCP port (in any language of your preference). Of course, you'd have to implement the security layer on your own (or include a third party library).
This actually boils down to a question of urgency and available resources.
EDIT
For your question #2: using PHP session variables certainly is a good approach (though maybe not a scalable one). Otherwise, saving the data in cookies may be a second possibility, but that is again prone to manipulation.
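A minimal sketch of that session approach, assuming a PHP endpoint the applet posts to after each shot; the parameter names and the legality check are placeholders, not a real game rule:

    <?php
    // shot.php - sketch of server-side score keeping via PHP sessions.
    session_start();

    if (!isset($_SESSION['points'])) {
        $_SESSION['points'] = 0;
    }

    $x = isset($_POST['x']) ? (int)$_POST['x'] : -1;
    $y = isset($_POST['y']) ? (int)$_POST['y'] : -1;

    // The legality check runs on the server, so the client cannot simply claim points.
    if ($x >= 0 && $x < 800 && $y >= 0 && $y < 600) {
        $_SESSION['points'] += 10;   // placeholder scoring rule
    }

    // Return the authoritative total for the applet to display.
    header('Content-Type: application/json');
    echo json_encode(['points' => $_SESSION['points']]);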