I have a simple synth that plays a 100 Hz tone using an OscillatorNode. My synth is about a whole step flat in Safari on an iPad 4 (iOS 7.1.1) compared to all the other browsers I've tried (Chrome on the iPad 4, Safari on an iPad 2 with iOS 7.1.1, Safari on an iPhone 5, and Chrome and Safari on my Mac). I've verified that the sample rate of the out-of-tune browser, iPad 4 Safari, is 44100 Hz. The in-tune browsers report the same sample rate, 44100 Hz.
My code is pretty simple and I don't see how this could be a programming error on my part, especially considering the iPad 2 and iPad 4 are running the same OS (and presumably the same version of Safari). It seems like something weird, low-level, and hardware-dependent is going on.
Is this a known issue? If so, is there any way to test for it or work around it?
===== edit ========
Here's an example (Safari only), a dead-simple oscillator test. It plays at one pitch on my iPhone 5s and a different pitch on my iPad 4. http://www.morganpackard.com/webaudio_test/OscillatorTest.html
var context = new webkitAudioContext();
var osc = context.createOscillator();
osc.connect(context.destination);
osc.frequency.value = 440;
osc.start(0);
This is probably due to one device playing at 44.1 kHz and the other playing at 48 kHz. There is probably a browser bug that prevents the sample rate from being changed and then causes it to be misreported.
Chrome on Android has a similar issue where the record and playback sample rates must be identical. Since that isn't typically the case when recording from the on-board microphone, for a while it would seem that recorded audio was always silent.
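If that diagnosis is right, the numbers roughly fit: a tone rendered as if the clock were 48 kHz but played through 44.1 kHz hardware comes out at 44100/48000 of its frequency, about a semitone and a half flat, close to the whole step reported above. Below is a rough workaround sketch. The rates passed in are guesses, not values the browser exposes; there is no reliable way to detect the true hardware rate from script, so this only helps if you already know (or have measured) which devices are mis-clocked.

```javascript
// Hypothetical pitch-compensation sketch, assuming the engine renders as if
// the clock were `assumedRenderRate` (e.g. 48000) while the hardware really
// plays at `hardwareRate` (e.g. 44100). Both rates are assumptions.
function correctedFrequency(desiredHz, assumedRenderRate, hardwareRate) {
  // A tone rendered at assumedRenderRate but played at hardwareRate is heard
  // at desiredHz * hardwareRate / assumedRenderRate, so pre-scale by the
  // inverse ratio to land on the desired pitch.
  return desiredHz * (assumedRenderRate / hardwareRate);
}

// Browser-only part, guarded so the helper above is usable on its own.
if (typeof webkitAudioContext !== 'undefined') {
  var context = new webkitAudioContext();
  var osc = context.createOscillator();
  osc.connect(context.destination);
  // Ask for roughly 479 Hz so the mis-clocked output is heard as 440 Hz.
  osc.frequency.value = correctedFrequency(440, 48000, 44100);
  osc.start(0);
}
```

On a correctly behaving device you would pass identical rates, which leaves the frequency untouched, so the same code path can run everywhere once you have a per-device guess.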
I have a hardware codec that encodes video in H.264 (Baseline profile, level 3), which I package into an MPEG-2 Transport Stream so that I can stream it to iDevices (HTTP Live Streaming).
The problem I have is that the video plays only on the more recent iDevices (iPhone 4S/iPhone 5, iPad 2/3) but not on the older iPhones or iPad 1 (there is activity on the screen but nothing even remotely close to actual video).
Further, when it does work, the video plays at exactly half the frame rate (30 fps plays back as 15 fps).
Safari on a Mac Mini or MacBook Pro exhibits no problem whatsoever. VLC and mplayer have no trouble with the TS files either.
When I package the same video into a MP4 container, all devices play the video properly.
Any suggestions on how to debug this problem?
Is there any way of getting debug information from iPhone or iPad that would help me figure out what's going on?
Reduce your level. Do the older devices actually support level 3? If not, drop to level 1.2 and check.
I am developing an iPhone app with Adobe AIR 2.6 using Flash CS 5.5. I am trying to capture microphone input and then play back an mp3 file. The problem is that once I capture the microphone data with the SampleDataEvent.SAMPLE_DATA event, the playback volume seems to drop significantly.
To reproduce:
play back a (remote) mp3 file; the volume is OK
get the microphone and add the event listener (see code below); the listener function does not even need to contain any code for the problem to occur
repeat step 1 (play back the remote mp3); the volume is now very low
// add the event listener
_microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
private function onSampleData(event:SampleDataEvent):void
{
    // while (event.data.bytesAvailable > 0) {
    //     _buffer.writeFloat(event.data.readFloat());
    // }
}
// call this before playing back the mp3
_microphone.removeEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
When testing with the Flash IDE, the problem does NOT occur and everything works as expected. Tested on iPhone 3GS with iOS 4.3.3.
If anybody experienced this problem I would greatly appreciate your insights.
UPDATE:
I think it is not an AIR problem per se. After using the iOS microphone, the whole app almost mutes itself; the same happens to the click sounds of the virtual keyboard (which do not come from AIR). That doesn't really help me, but maybe somebody knows how to shut down the microphone in an app so that speaker levels return to normal?
UPDATE 2:
Here you can see a running example from Adobe: http://tv.adobe.com/watch/adc-presents/developing-for-ios-with-air-for-mobile-26/. Note that the speaker volume of the iPhone in the live demo is really low too. So this must be a pretty serious bug, making the microphone on the iPhone unusable.
Download and use the AIR 3 SDK, then set
SoundMixer.useSpeakerphoneForVoice = true;
SoundMixer.audioPlaybackMode = AudioPlaybackMode.MEDIA;
and your problem should be solved.
I'm building an app that measures sound volume. I understand that audio hardware in the iPhone is not as accurate as professional hardware, which is OK, but I need to know if there are any differences between the different iPhone models. For example, is it possible that the volume measured on an iPhone 3G will be different on an iPhone 4? Unfortunately I do not possess any models earlier than the 4 so I'm unable to test this myself.
The audio frameworks seem to be identical for identical iOS versions (except for the 2G). However the physical microphones (and acoustical environments) are different. People have published various test results, such as:
http://blog.faberacoustical.com/2009/iphone/iphone-microphone-frequency-response-comparison/
and
http://blog.faberacoustical.com/2010/iphone/iphone-4-audio-and-frequency-response-limitations/
But it's possible that the mic response may vary with manufacturing batches as well. YMMV.
I'd just like to add that, no matter what I try, there seems to be no way of exporting audio through AVAssetExportSession on an iPhone 3G. It works on the 3GS, iPod touches, the 4, the iPad, and so on.
I'm (trying) to use HTTP Live Streaming in my app, and after weeks of re-encoding it now seems to pass the mediastreamvalidator without errors.
On my latest iPod Touch (iOS 4.0) with WiFi the videostream loads in 1sec and switches to the highest bandwidth stream.
On another test device, an iPhone 3G (iOS 3.0) with WiFi, it takes up to 30 seconds to load the stream, although I see in my log files that it requests the high-quality stream after 1 second. I get a black screen with audio only for the first 30 seconds. Is this problem due to the better CPU in the latest iPod touch, or is it due to the iOS upgrade?
Also, I'm fearing another rejection by Apple, because the last time they checked my stream they only looked at each video stream for about 3 seconds and then rejected the app because they didn't see any video.
Take a closer look at the segmented files. For example: can you play the first low-quality MPEG-TS segment in VLC? Is there video in it?
I've found iOS devices to be very picky about what they will and won't play. Make sure you are using lowest-common-denominator codec settings. I'm a big fan of The Complete Guide to iPod, Apple TV and iPhone Video Formats.
We are using HTTP Live Streaming for on-demand video within our iPhone app, and on 3GS models the videos play as they are meant to. However, on models before the 3GS it gives an error saying the movie format is not supported.
I have seen other threads on this however no solutions or insights.
Does anyone know if this really is a hardware limitation of the pre 3GS phones or does it have something to do with our code?
"iPhone 3G supports H.264 Baseline Profile Level 3.1. If your app runs on older iPhones, however, you should use H.264 Baseline Profile 3.0 for compatibility."
Info taken from this technote.
HTTP Live Streaming is supported on all iPhone, iPod Touch and iPad hardware if you have sufficient network bandwidth for your lowest bit-rate stream and the right level of OS. On an original iPhone 2G running iPhone OS 3.1.3 we are routinely playing HTTP Live Streams over WiFi. It also works in our tests over Edge, but the bandwidth on Edge is usually too low for the rates at which we are encoding. We have seen some issues with bandwidth adaptation on an iPod Touch running 3.1 which we suspect are related to that particular device/OS combination, but are not certain of that.