I want to record the activities of my app as a video with sound.
I am able to do this with AVAssetWriter and AVAssetWriterInput. Specifically, I render a view's layer into a graphics context and use the rendered images to build the video file.
But I am not able to add audio to this video file. I want to add the sounds my app produces to it.
How can I implement this in Objective-C?
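A minimal sketch of the missing piece (untested, and assuming an existing AVAssetWriter named assetWriter that already holds the video input): add a second AVAssetWriterInput for audio and append audio CMSampleBuffers to it. How you obtain those buffers depends on how the app produces its sound (for example, from an Audio Unit render callback), which is not shown here.

```objc
#import <AVFoundation/AVFoundation.h>

AudioChannelLayout layout = {0};
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

NSDictionary *audioSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @2,
    AVSampleRateKey       : @44100.0,
    AVEncoderBitRateKey   : @128000,
    AVChannelLayoutKey    : [NSData dataWithBytes:&layout length:sizeof(layout)]
};

AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES;

// Add the audio input to the same writer that holds the video input,
// before calling startWriting.
[assetWriter addInput:audioInput];

// Later, whenever an audio CMSampleBuffer becomes available:
if (audioInput.readyForMoreMediaData) {
    [audioInput appendSampleBuffer:audioSampleBuffer];
}
```

The key detail is that the audio buffers' presentation timestamps must be on the same timeline as the video frames' timestamps, or the two tracks will drift apart.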
I want to create a video stream of a view in iOS. For example, I place a view on the iPhone, draw something on it, and want to create a video stream (H.264, MP4, or any other common standard) so that I can save a video file containing a recording of my UIView and all the drawing and other operations I perform on it.
Any idea where to start? Is there an API available in iOS to record the iPhone screen, or a specific view?
Thanks in advance.
This blog post contains a link to a sample project that shows how to capture screen content on iOS and add it to an AVFoundation asset.
Download the sample project called VTMScreenRecorderTest.zip
Also take a look at the slides (the screen capture part starts at slide 44).
The capture code is based on Apple's Technical Q&A 1703.
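For reference, the core of that approach is to snapshot the layer tree with renderInContext: and feed the resulting images to an AVAssetWriter. A simplified sketch of the capture step (Technical Q&A 1703 iterates over all windows and handles rotation; this assumes a single view):

```objc
#import <QuartzCore/QuartzCore.h>
#import <UIKit/UIKit.h>

- (UIImage *)snapshotOfView:(UIView *)view {
    // Scale 0 uses the screen's scale factor (Retina-aware).
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, YES, 0);
    // Render the view's layer tree into the current bitmap context.
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
```

Calling this from a timer (say, 10–30 times per second) produces the frames; each one then has to be converted to a CVPixelBuffer before it can be appended to the asset writer.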
I am writing an iPhone application that captures video from the iPhone's back camera, edits that video, and records the edited video. These are the steps to perform:
1. Get video from the back camera.
2. Record the video to some location on the iPhone.
3. Get a one-minute clip from the recorded video.
4. Get the frame-by-frame CMSampleBuffers from the video.
5. Modify the image in each CMSampleBuffer and write the modified video to the iPhone.
Steps 1–4 work fine for me, but modifying the image in the CMSampleBuffer (step 5) does not. The modifications I want to perform are to draw an additional image and some text onto the image in the CMSampleBuffer. Can somebody tell me how I can do this?
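A minimal sketch of one way to do this (untested, and assuming the sample buffers carry BGRA pixel buffers, i.e. the capture output is configured for kCVPixelFormatType_32BGRA): wrap the pixel buffer's memory in a CGBitmapContext and draw into it in place. Coordinate flipping between Core Graphics and the buffer's row order is not handled here.

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
#include <string.h>

static void DrawOverlayOnSampleBuffer(CMSampleBufferRef sampleBuffer,
                                      UIImage *overlay,
                                      NSString *text)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // A bitmap context backed by the pixel buffer's own memory, so all
    // drawing lands directly in the video frame.
    CGContextRef context = CGBitmapContextCreate(
        CVPixelBufferGetBaseAddress(pixelBuffer),
        width, height, 8, bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Draw the additional image (position and size are arbitrary here).
    CGContextDrawImage(context, CGRectMake(0, 0, 128, 128), overlay.CGImage);

    // Draw the text with the simple Core Graphics text API.
    CGContextSelectFont(context, "Helvetica", 24, kCGEncodingMacRoman);
    CGContextSetRGBFillColor(context, 1, 1, 1, 1);
    const char *cText = [text cStringUsingEncoding:NSMacOSRomanStringEncoding];
    CGContextShowTextAtPoint(context, 20, 20, cText, strlen(cText));

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```

After the buffer has been modified, it can be appended to an AVAssetWriterInput (via a pixel buffer adaptor) exactly as before.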
Is there a library or built-in graphical player for showing a playing audio file in iOS?
I don't want a full-screen player, but a small inline player that can be embedded in a UIView.
Does this exist in iOS?
Apple has a good example of this: avTouch. I have successfully adapted parts of its code to display audio levels in the past.
avTouch
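For the level-display part specifically, the essence of what avTouch does can be reduced to AVAudioPlayer's built-in metering API. A minimal sketch, assuming hypothetical properties named player (AVAudioPlayer) and levelMeter (UIProgressView), and a bundled file "song.mp3":

```objc
#import <AVFoundation/AVFoundation.h>

- (void)startPlayback {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"song"
                                         withExtension:@"mp3"];
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
    self.player.meteringEnabled = YES;   // metering is off by default
    [self.player play];

    // Poll the meters about 20 times per second.
    [NSTimer scheduledTimerWithTimeInterval:0.05
                                     target:self
                                   selector:@selector(updateMeter:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)updateMeter:(NSTimer *)timer {
    [self.player updateMeters];
    // averagePowerForChannel: returns dBFS (roughly -160 dB to 0 dB);
    // convert to a linear 0..1 value for the progress bar.
    float db = [self.player averagePowerForChannel:0];
    self.levelMeter.progress = powf(10.0f, 0.05f * db);
}
```

avTouch itself draws fancier bar meters, but the data source is the same metering calls.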
I'm using MPMoviePlayerController to play video on iOS 4, and I want the audio to keep playing when my app enters the background, but it doesn't work. I think this is because the video player uses the GPU to render video on screen, which Apple does not allow once an app is in the background.
So, is there any way to do this? Some apps have this feature and seem to use just MPMoviePlayerController; do they make it work by detaching the player's video layer?
Thanks for your answer!
You'll need separate audio and video files for this. Play both when you need to show the video, and just the audio when the app goes into the background.
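A minimal sketch of that handoff (untested, and assuming hypothetical properties moviePlayer, an MPMoviePlayerController, and audioPlayer, an AVAudioPlayer loaded with a matching audio-only file):

```objc
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

- (void)configureAudioSession {
    // Background audio also requires the "audio" entry in the
    // UIBackgroundModes array in Info.plist (iOS 4+).
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                           error:NULL];
    [[AVAudioSession sharedInstance] setActive:YES error:NULL];
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
    // Hand off from the video player to the audio-only player at the
    // same playback position.
    NSTimeInterval position = self.moviePlayer.currentPlaybackTime;
    [self.moviePlayer pause];
    self.audioPlayer.currentTime = position;
    [self.audioPlayer play];
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
    // Switch back to the video player.
    NSTimeInterval position = self.audioPlayer.currentTime;
    [self.audioPlayer stop];
    self.moviePlayer.currentPlaybackTime = position;
    [self.moviePlayer play];
}
```

The two files must share the same timeline for the positions to match up.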
I would like to programmatically produce a video using the iPad's microphone (for sound) and the screen display (for visuals). Is this possible? How should I proceed?
Possible, but not easy. A rough outline:
Start capturing the audio: How do I record audio on iPhone with AVAudioRecorder?
Capture individual images of the screen as fast as you can: How to capture current view screenshot and reuse in code? (iPhone SDK)
Later, combine the two into a video. I have no clue how to do this part.
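One plausible way to do the combining step, which the answer leaves open (a sketch, untested): once the screenshots have been written to a silent video file (e.g. with AVAssetWriter) and the microphone audio to an audio file, mux them with AVMutableComposition and export. The URLs here are hypothetical placeholders.

```objc
#import <AVFoundation/AVFoundation.h>

AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

AVMutableComposition *composition = [AVMutableComposition composition];

// Copy the video track into the composition.
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:nil];

// Lay the recorded audio alongside it, trimmed to the video's duration.
AVMutableCompositionTrack *audioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:nil];

// Write the combined movie out; passthrough avoids re-encoding.
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetPassthrough];
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Export finished with status %d", (int)export.status);
}];
```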