AVAssetWriter startWriting Problem - iPhone

AVAssetWriter's startWriting is returning NO when I write a movie on a 2G device, but on all other devices it returns YES and works fine. Has anyone faced this problem, or do you have any clue why it's happening? Please help.

I am receiving false from startWriting on an iPad, while it works on both my iPhone 3 and 4 (all on iOS 4.2). The writer's status is AVAssetWriterStatusFailed, with the NSError: "The operation couldn't be completed. (AVFoundationErrorDomain error -11800)."
Creating the writer with file type AVFileTypeQuickTimeMovie yielded no error, and the file did NOT already exist. I've also tried different pixel buffer pixel formats, to no avail. Lastly, I've tried changing the file type to MPEG-4 and M4V; again, to no avail.
I'm posting this here instead of creating a new question, as both show the same result and neither has been addressed. I need to have this resolved within a few days, so if I learn anything, I'll post what I find.

The most likely cause is that the specific iOS device you are running on does not include a hardware H.264 encoder. I think that iPhone models earlier than the 3GS have no encoding hardware; I am not sure whether the iPad 1 has an H.264 encoder, but I know the iPad 2 does.
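To narrow this down, it helps to inspect the writer's `error` property rather than just the BOOL result. Below is a minimal sketch, in modern Swift for illustration (the original posts would have used Objective-C); the helper name, output settings, and dimensions are assumptions, not code from the question:

```swift
import AVFoundation

// Hypothetical helper: try to start an H.264 writing session and surface
// the underlying NSError instead of a bare NO/false.
func startWritingOrLog(outputURL: URL) -> AVAssetWriter? {
    do {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,  // requires an H.264 encoder on the device
            AVVideoWidthKey: 640,
            AVVideoHeightKey: 480
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        if writer.canAdd(input) {
            writer.add(input)
        }
        if writer.startWriting() {
            return writer
        } else {
            // On unsupported hardware, writer.error is where an
            // AVFoundationErrorDomain -11800 failure would show up.
            print("startWriting failed: \(writer.error?.localizedDescription ?? "unknown")")
            return nil
        }
    } catch {
        print("Could not create writer: \(error)")
        return nil
    }
}
```

If the device lacks the encoder, logging `writer.error` here should at least confirm the failure matches the -11800 error quoted above rather than a file-system or settings problem.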

Related

AVAssetExportSession works on iPad, no audio on iPhone

I have the exact same code running on both the iPad and iPhone versions of my app and the code works fine on the iPad (the video is being exported properly with audio), but the exported video on the iPhone doesn't have any sound. I even ran the iPhone version on the iPad and it worked fine, which means that nothing should be wrong with the code itself.
Any insight on why the iPhone isn't exporting the video with audio would be much appreciated.
I have done some research and somebody mentioned that memory issues could be causing some export problems. The memory and CPU usage are fairly high during the video processing/exporting, but never high enough to receive a memory warning.
Thanks in advance.
You didn't mention if you stepped through the code (line by line) on the iPhone, setting breakpoints, watching each variable to make sure the value is correct, etc. This would be the first step.

Record iPhone app video (without simulator)

How can I record a video of an iPhone app? I can't use the Simulator because the application is very OpenGL-heavy and uses an accelerometer/gyroscope.
You can have the iPhone output video and capture it on another device: How can I use MPTVOutWindow iPhone undocumented class?
One of the links in that answer says it doesn't work on iOS 4+, but on a project I worked on less than a month ago we used that feature from an iPhone 4 to present, so I would challenge that claim (unless the developer who handled that portion used another approach).
I'm not sure there is a "native" solution here, short of building video capture into your actual app.
The cleanest way of handling this, assuming your game/app has a cleanly designed input pipeline, is probably to mock the input:
Put in (debug-only code) that lets you "record" all the raw input events.
Using the device, play out the demo session to create a "recording".
Run the app in the simulator, and feed it the input "recording" you made on the device.
The simulator will run GL stuff just fine, and probably at a higher framerate than your device will.
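The record/replay idea above can be sketched as follows. This is a Swift sketch with hypothetical event types and names (`InputEvent`, `InputRecorder` are not from the answer); your app would hook `record` into its actual touch and accelerometer handlers:

```swift
import Foundation

// Debug-only sketch: serialize raw input (touches, accelerometer samples)
// with timestamps on the device, then feed them back in the simulator.
struct InputEvent: Codable {
    let timestamp: TimeInterval   // seconds since the session started
    let kind: String              // e.g. "touch" or "accel"
    let values: [Double]          // coordinates or sensor axes
}

final class InputRecorder {
    private(set) var events: [InputEvent] = []
    private let start = Date()

    // Call from your input pipeline as events arrive
    func record(kind: String, values: [Double]) {
        events.append(InputEvent(timestamp: Date().timeIntervalSince(start),
                                 kind: kind, values: values))
    }

    // On the device: persist the demo session
    func save(to url: URL) throws {
        try JSONEncoder().encode(events).write(to: url)
    }

    // In the simulator: load the recording and replay it
    static func load(from url: URL) throws -> [InputEvent] {
        try JSONDecoder().decode([InputEvent].self, from: Data(contentsOf: url))
    }
}
```

On the device you would call `save(to:)` at the end of the demo session; in the simulator you load the file and replay each event through the same input pipeline at its recorded timestamp.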

Adobe Air + iphone microphone problem: low volume after SampleDataEvent.SAMPLE_DATA event

I am developing an iPhone app with Adobe AIR 2.6 using Flash CS 5.5. I am trying to capture microphone input and then play back an mp3 file. The problem is that once I capture the microphone data with the SampleDataEvent.SAMPLE_DATA event, the volume of the playback mechanism seems to drop significantly.
To reproduce:
Play back a (remote) mp3 file; the volume is ok.
Get the microphone and add the event listener (see code below); the listener function does not even need any code for the problem to occur.
Repeat step 1 (play back the remote mp3); the volume is now very low.
// add the event listener
_microphone.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);

private function onSampleData(event:SampleDataEvent):void
{
    // the body can stay empty; the volume drop occurs regardless
    //while (event.data.bytesAvailable > 0) {
    //    _buffer.writeFloat(event.data.readFloat());
    //}
}

// call this before playing back the mp3
_microphone.removeEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
When testing in the Flash IDE, the problem does NOT occur and everything works as expected. Tested on an iPhone 3GS with iOS 4.3.3.
If anybody experienced this problem I would greatly appreciate your insights.
UPDATE:
I think it is not an AIR problem per se. After using the iOS microphone, the whole app almost mutes itself; that is also the case for the clicks of the virtual keyboard (which do not come from AIR). It doesn't really help me, but maybe somebody knows how to shut down the microphone in an app so that the speaker levels return to normal?
UPDATE 2:
Here you can see a running example from Adobe: http://tv.adobe.com/watch/adc-presents/developing-for-ios-with-air-for-mobile-26/. Note that the speaker volume of the iPhone in the live demo is really low too. So it must be a pretty big bug, making the microphone on the iPhone nearly unusable.
Download and use the AIR 3 SDK, then set
SoundMixer.useSpeakerphoneForVoice = true;
SoundMixer.audioPlaybackMode = AudioPlaybackMode.MEDIA;
and your problem should be solved

AVAssetExportSession missing audio track when exporting on device

I run the export on the simulator and everything works great. I run it on the device and the video gets exported, but there's no audio. This leads me to believe that I must be using an audio format that the device doesn't support but OS X does, since the simulator uses whatever OS X uses. I've tried m4a, aiff, and aifc and have had no luck! Any ideas?
I have a very similar problem. It does not seem to do with codecs, as I made a separate test case that runs fine with the same video. There’s a related question that says the problem might be in playing the same assets using MPMoviePlayerController. That got me on the right track (sort of).
In my case the trouble stemmed from using the assets in an AVPlayer during the export. I was not able to find the exact combination that causes the export to drop the audio track; in a separate test project the export ran fine even though the asset was playing in an AVPlayer at the same time. After several hours of trying to find the exact cause, I gave up and simply popped the asset out of the player using replaceCurrentItemWithPlayerItem:nil during export. It's a hack, but it works.
AVFoundation is a very powerful framework, but God I wish it wasn’t so finicky or at least logged more errors instead of silently producing garbage.
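For reference, the workaround described above can be sketched like this (Swift shown for illustration; `player`, `playerItem`, the preset, and the function shape are assumptions about the surrounding code, not taken from the answer):

```swift
import AVFoundation

// Detach the asset from the AVPlayer for the duration of the export,
// then put it back once the export finishes.
func export(asset: AVAsset, player: AVPlayer, playerItem: AVPlayerItem,
            to outputURL: URL, completion: @escaping (Bool) -> Void) {
    // Pop the asset out of the player so the export keeps its audio track
    // (the Objective-C equivalent is replaceCurrentItemWithPlayerItem:nil)
    player.replaceCurrentItem(with: nil)

    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPresetHighestQuality) else {
        completion(false)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously {
        DispatchQueue.main.async {
            // Restore playback once the export has finished
            player.replaceCurrentItem(with: playerItem)
            completion(session.status == .completed)
        }
    }
}
```

The design trade-off is visible pause in playback while the export runs, which is usually acceptable compared to silently losing the audio track.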

HTTP-Live-Streaming - Loading Issue?

I'm trying to use HTTP Live Streaming in my app, and after weeks of re-encoding it now seems to pass the media stream validator without errors.
On my latest iPod Touch (iOS 4.0) with WiFi the videostream loads in 1sec and switches to the highest bandwidth stream.
On another test device, an iPhone 3G (iOS 3.0) with WiFi, it takes up to 30 seconds to load the stream, although I can see in my log files that it requests the high-quality stream after 1 second. But I get a black screen with audio only for the first 30 seconds. Is this problem due to the faster CPU in the latest iPod touch, or due to the iOS upgrade?
Also, I'm fearing another rejection by Apple, because the last time they checked my stream they only looked at each video stream for about 3 seconds and then rejected the app because they didn't see any video.
Take a closer look at the segmented files. For example: can you play the first low-quality MPEG-TS segment in VLC? Is there video in it?
I've found iOS devices to be very picky about what they will and won't play. Make sure you are using lowest-common-denominator codec settings. I'm a big fan of The Complete Guide to iPod, Apple TV and iPhone Video Formats.