I'm trying to use HTTP Live Streaming in my app, and after weeks of re-encoding it now seems to work without any errors from the mediastreamvalidator.
On my latest iPod touch (iOS 4.0) over WiFi, the video stream loads in about one second and switches up to the highest-bandwidth stream.
On another test device, an iPhone 3G (iOS 3.0) on WiFi, it takes up to 30 seconds to load the stream, even though my log files show it requesting the high-quality stream after one second. For the first 30 seconds I just get a black screen with audio only. Is this problem due to the better CPU in the latest iPod touch, or is it due to the iOS upgrade?
I'm also afraid of another rejection by Apple, because the last time they checked my stream they only looked at each video stream for about 3 seconds and then rejected the app because they didn't see any video.
Take a closer look at the segmented files. For example, can you play the first low-quality MPEG-TS segment in VLC? Is there video in it?
I've found iOS devices to be very picky about what they will and won't play. Make sure you are using lowest-common-denominator codec settings. I'm a big fan of The Complete Guide to iPod, Apple TV and iPhone Video Formats.
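Also worth checking: the HLS client starts with the first variant listed in the master playlist, so if that entry is audio-only or too heavy for the device, you get exactly the "black screen with audio" symptom until the player switches. A hedged sketch of a master playlist (bitrates and URIs are hypothetical) that puts a low-bitrate variant containing video first:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000,CODECS="avc1.42e00a,mp4a.40.2"
low/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=600000,CODECS="avc1.42e01e,mp4a.40.2"
mid/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1200000,CODECS="avc1.42e01f,mp4a.40.2"
high/prog_index.m3u8
A low first variant that still shows a picture also gives an App Store reviewer something to see within the first few seconds.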
I have a simple synth that plays a 100hz tone using an OscillatorNode. My synth is about a whole step flat on Safari on my iPad 4 (iOS 7.1.1), compared to all the other browsers I've tried (Chrome on the iPad 4, Safari on an iPad 2 with iOS 7.1.1, Safari on an iPhone 5, and Chrome and Safari on my Mac). I've verified that the sample rate of the out-of-tune browser, iPad 4 Safari, is 44100hz. The in-tune browsers report the same sample rate, 44100hz.
My code is pretty simple and I don't see how this could be a programming error on my part. Especially considering the iPad 2 and iPad 4 are running the same OS (and presumably the same version of safari). It seems like there's something weird, low-level and hardware-dependent going on.
Is this a known issue? If so, is there any way to test for it or work around it?
===== edit ========
Here's an example (safari only) -- dead simple oscillator test. Plays at one pitch on my iPhone 5s, a different pitch on my iPad 4. http://www.morganpackard.com/webaudio_test/OscillatorTest.html
var context = new webkitAudioContext(); // prefixed constructor, as Safari required at the time
var osc = context.createOscillator();
osc.connect(context.destination);       // straight to the output, no gain node
osc.frequency.value = 440;              // A4
osc.start(0);
This is probably due to one device playing at 44.1kHz and the other playing at 48kHz. There is likely a browser bug that prevents the sample rate from changing and then misreports what the hardware is actually doing.
Chrome on Android has a similar issue, where the record and playback sample rates must be identical. Since that isn't typically the case when recording from the on-board microphone, for a while it seemed that recorded audio was always silent.
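There is no reliable API for asking what rate the hardware is really running at, but one workaround people used at the time was to "wake" the output path with a one-sample silent buffer and then throw the context away and rebuild it if the reported rate looks wrong. A rough sketch of that pattern (not a guaranteed fix, and on iOS it needs to run inside a touch event handler):
var AudioCtx = window.AudioContext || window.webkitAudioContext;
var context = new AudioCtx();
console.log('reported sample rate:', context.sampleRate);
// Play one silent sample to force the output hardware to configure itself.
var buffer = context.createBuffer(1, 1, context.sampleRate);
var source = context.createBufferSource();
source.buffer = buffer;
source.connect(context.destination);
source.start(0);
// If the rate still doesn't look like the expected 44100, rebuild the context.
if (context.sampleRate !== 44100) {
    context = new AudioCtx();
    console.log('recreated context, sample rate now:', context.sampleRate);
}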
I am using OS X Mountain Lion, and I have been able to record video using QuickTime's screen capture, but it records sound from the microphone rather than the audio generated by the iOS Simulator.
I want to record audio and video from the iOS Simulator.
You could:
- try a professional screen recording software (Camtasia, Screenflow,...)
- use a virtual sound output device that captures the sound and writes it to disk
- connect your sound output to your input (using a Cinch/RCA cable)
See http://bit.ly/UXBJ9N for more info on the latter two.
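If you go the virtual-output-device route (e.g. Soundflower), a tool like ffmpeg can grab the screen and that device in one pass. A hedged example, assuming an ffmpeg build with AVFoundation capture support; the "1:0" device indices are machine-specific placeholders, so list your devices first:
ffmpeg -f avfoundation -list_devices true -i ""
ffmpeg -f avfoundation -framerate 30 -i "1:0" -c:v libx264 -c:a aac simulator_capture.mov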
I ended up using Soundflower (which I can't praise enough - it was much simpler than I expected, and a tiny app too) to capture audio, and a random screen capture utility to capture video, then I married them up in a video editing application. Not perfect, but it works.
I will post here if I find a simpler solution, because despite all the blog posts about this matter, I could not find something ideal.
I have a hardware codec that encodes video in H.264 (Baseline profile, level 3), which I package into an MPEG-2 transport stream so that I can stream it to iDevices (HTTP Live Streaming).
The problem I have is that the video plays only on the more recent iDevices (iPhone 4S/iPhone 5, iPad 2/3) but not on the older iPhones or iPad 1 (there is activity on the screen but nothing even remotely close to actual video).
Further, when it works, the video plays at exactly 1/2 the framerate (30 fps plays as 15 fps).
Safari on a Mac mini or MacBook Pro exhibits no problem whatsoever. VLC and mplayer don't have any problem with the TS files either.
When I package the same video into a MP4 container, all devices play the video properly.
Any suggestions on how to debug this problem?
Is there any way of getting debug information from iPhone or iPad that would help me figure out what's going on?
Reduce your level. Do the older devices support Level 3? If not, go down to Level 1.2 and check.
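One way to test whether the profile/level (or some other quirk of the hardware encoder) is the culprit is to re-encode one of the problematic TS files with a known-good, conservative software encode and see whether the old devices play that. A hedged example using ffmpeg/x264 (filenames are placeholders):
ffmpeg -i input.ts -c:v libx264 -profile:v baseline -level 3.0 -c:a copy -f mpegts test_baseline30.ts
If that version plays on the iPad 1 and older iPhones, comparing the two files with ffprobe -show_streams should help narrow down which parameter the hardware encoder is setting that the older decoders can't handle.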
We developed an internet radio streamer using jPlayer, which uses the HTML5 audio tag with jQuery and has a Flash fallback for unsupported browsers. Upon testing the player on the iPhone (iOS 5.0.1), we ran into a very peculiar issue.
When the iPhone is connected to WiFi, it streams perfectly using the HE-AAC V2 stream at 64kbps/44.1kHz (the preferred codec for Apple products). However, when the iPhone is connected to the 3G mobile network, it "stutters" or stops streaming for 1-2 seconds every 1-2 minutes (it does not stop streaming completely). The troubling thing is that when the iPhone is forced to use a separate MP3 stream at the same bit rate, it does not have this issue and works very well on 3G.
UPDATE 5
We recently acquired a 3G/4G Sprint mobile hotspot device and tested this issue with it. When the iPhone is connected to the mobile hotspot, it shows as being connected to WiFi and the issue does not appear, even though the actual connection is via 3G/4G. This might point back to the issue being with the iPhone not handling HE-AAC via HTTP Live Streaming when directly connected to the mobile network.
UPDATE 4
Updated the iPhone to iOS 5.1 yet the issue persists.
UPDATE 3
I have read here on SO about various issues with scripts not rendering correctly when connected to mobile networks. The finger seems to point at the mobile network carriers, which may insert a proxy to serve web pages, e.g. for downsizing images, and which might also inject some JavaScript. The test page can be found HERE. Note: this page uses HE-AAC, so it will only work on an iPhone...
UPDATE
According to Apple's HTTP Live Streaming doc for iOS devices, "Audio-only content can be either MPEG-2 transport or MPEG elementary audio streams, either in AAC format with ADTS headers or in MP3 format." Our music server is using the OddcastV3 encoder to send out three streams (MP3, HE-AAC V2, and Ogg Vorbis) to the IcecastV2 server. I am not sure if the encoder is inserting ADTS headers for the HE-AAC V2 stream. Is there a way to check for this?
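One quick, hedged way to check: ADTS frames begin with a 12-bit sync word of all ones, so the first bytes of an ADTS AAC stream are typically 0xFF 0xF1 (or 0xFF 0xF9 for MPEG-2-style headers). Grabbing a few bytes from the mount point (the URL below is a placeholder) and dumping them should tell you:
curl -s http://yourserver:8000/your_aac_mount | head -c 4 | xxd
If the dump starts with fff1 or fff9, the encoder is emitting ADTS headers; if instead you see the ASCII text "ADIF", it is an ADIF stream, which is not what Apple's doc describes.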
Coming from a radio planning point of view, here are my two cents:
What you are describing sounds like bandwidth shaping, which is both a common and often necessary design choice in radio networks (like 3G networks). At most 3G operators I have worked at, you would typically optimize the network to favor high-speed bursts (think downloading an image, sending one email or fetching one HTML page) over "long-running" high-bandwidth services.
This is due to the simple fact that this is what most users want/need.
On a typical 3GPP (GSM 3G) network, this shaping can mean that you first get a RAB (radio access bearer) supporting 384kbit, which is then downgraded for as long as your device accepts it.
This means that typically you will be switched from 384 -> 256 -> 128, then 64kbit, at which point your device may start receiving data too slowly; the network then upgrades the bearer and, after a while, downgrades it again.
So why doesn't the MP3 stream stutter? My guess is that the total kbit rate might differ, so you are fine in the 64kbit RAB. This is a common phenomenon.
We have managed to get the exact same thing working: 64kbit AAC-V2 on mobile devices. We are streaming files rather than a continuous stream. I think Magnus is right when he explains how the network prioritizes traffic into bursts; in our case that means we get large parts of the file right away and the player can keep playing until the next burst comes in. In your case it means the stream pauses until the next burst arrives.
Could you either switch to larger chunks in your streaming (a larger buffer) or stream whole files instead?
We also hit a very strange phenomenon with iOS: we had to rename all files from .m4a to .aac to get them streaming on iOS. If we didn't rename them, iOS wouldn't play them.
Good luck.
We are using HTTP Live Streaming for on-demand video within our iPhone app, and on the 3GS models the videos play as they are meant to. However, on pre-3GS models it gives an error saying the movie format is not supported.
I have seen other threads on this, but no solutions or insights.
Does anyone know if this really is a hardware limitation of the pre 3GS phones or does it have something to do with our code?
"iPhone 3G supports H.264 Baseline Profile Level 3.1. If your app runs on older iPhones, however, you should use H.264 Baseline Profile 3.0 for compatibility."
Info taken from this technote.
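If you serve multiple variants, the CODECS attribute in the master playlist is where the profile/level shows up, and it is meant to let a client skip variants it can't decode. A hedged sketch (playlist URIs are hypothetical), where 42001e corresponds to Baseline 3.0 and 42001f to Baseline 3.1:
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000,CODECS="avc1.42001e,mp4a.40.2"
baseline30/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=900000,CODECS="avc1.42001f,mp4a.40.2"
baseline31/prog_index.m3u8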
HTTP Live Streaming is supported on all iPhone, iPod Touch and iPad hardware if you have sufficient network bandwidth for your lowest bit-rate stream and the right level of OS. On an original iPhone 2G running iPhone OS 3.1.3 we are routinely playing HTTP Live Streams over WiFi. It also works in our tests over Edge, but the bandwidth on Edge is usually too low for the rates at which we are encoding. We have seen some issues with bandwidth adaptation on an iPod Touch running 3.1 which we suspect are related to that particular device/OS combination, but are not certain of that.