iPhone: Video API: modifying a live video stream during recording

I have a question about video stream processing. Is it possible to access and modify a real-time video stream during recording (e.g., to add some text to the video)? I can do this in a preview by grabbing separate frames, but I'm looking for a tool that will let me store the video with my text burned into the frames.
There are probably libraries/tools for this already, but I haven't found any yet.

Try the GPUImage library. It can help you.
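A rough sketch of the usual GPUImage pattern for this (blend the camera feed with a rendered UILabel, then record the blended output), assuming GPUImage 1.x's Objective-C API bridged into Swift; exact bridged names can differ between GPUImage and Swift versions:

```swift
import GPUImage
import AVFoundation
import UIKit

// Sketch only: GPUImage 1.x bridged into Swift; names may vary by version.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                 cameraPosition: .back)
camera.outputImageOrientation = .portrait

// Render a UILabel into the filter chain as the text overlay.
let label = UILabel(frame: CGRect(x: 0, y: 0, width: 720, height: 100))
label.text = "Hello, video!"
label.textColor = .white
label.backgroundColor = .clear
let overlay = GPUImageUIElement(view: label)

// Blend the camera feed with the overlay.
let blend = GPUImageAlphaBlendFilter()
blend.mix = 1.0
camera.addTarget(blend)
overlay.addTarget(blend)

// Write the blended frames (with the text baked in) to a movie file.
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("overlay.m4v")
let movieWriter = GPUImageMovieWriter(movieURL: outputURL, size: CGSize(width: 720, height: 1280))
blend.addTarget(movieWriter)
camera.audioEncodingTarget = movieWriter

// The UI element has to be re-rendered for every camera frame.
blend.frameProcessingCompletionBlock = { _, time in
    overlay.update(withTimestamp: time)
}

camera.startCameraCapture()
movieWriter.startRecording()
```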

You should check out Apple's AVCam sample code. That might be a good starting point.

Related

How can I record and save an RTCVideoTrack locally in appRTC on iOS?

I am using ReplayKit to record the screen. What I want to achieve is recording or capturing the screen, with audio, while I am on a call using the WebRTC SDK (I have used appRTC from GitHub).
I think I can achieve this with AVCaptureSession, as I want to exclude ReplayKit.
There is no relevant code to provide.
This is challenging, but it can be done. I can't provide detailed answers on this because it's pretty core to our app and what we're building, and it's A LOT of code, but hopefully it helps to know it can be done.
A couple of pointers for you:
Take a look at http://cocoadocs.org/docsets/GoogleWebRTC/1.1.20266/Classes/RTCCameraVideoCapturer.html. This will let you access the AVCaptureSession that WebRTC is using, and you can successfully hook your AVAssetWriter up to it (a sketch follows after these pointers).
Look into the RTCVideoRenderer protocol reference: http://cocoadocs.org/docsets/Quickblox-WebRTC/2.2/Protocols/RTCVideoRenderer.html. It will let you take the frames as WebRTC renders them, process them, and pass them back to WebRTC. You'll need to convert the RTCI420Frame you receive to a CVPixelBufferRef (which is a YUV420-to-RGB conversion).
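A bare-bones sketch of the first pointer: tap the AVCaptureSession that RTCCameraVideoCapturer exposes with an extra AVCaptureVideoDataOutput and feed its frames to an AVAssetWriter. The class name, queue label, and output settings below are illustrative, not our production code:

```swift
import WebRTC
import AVFoundation

// Sketch: record the local camera feed during a WebRTC call by tapping the
// AVCaptureSession that RTCCameraVideoCapturer already runs.
final class CallRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let writer: AVAssetWriter
    private let videoInput: AVAssetWriterInput
    private var sessionStarted = false

    init(capturer: RTCCameraVideoCapturer, outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,  // illustrative settings
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        videoInput.expectsMediaDataInRealTime = true
        writer.add(videoInput)
        super.init()

        // Add a second output to the session WebRTC is already running.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "call.recorder"))
        if capturer.captureSession.canAddOutput(output) {
            capturer.captureSession.addOutput(output)
        }
        writer.startWriting()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Start the writer's timeline at the first frame we see.
        if !sessionStarted {
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        if videoInput.isReadyForMoreMediaData {
            videoInput.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        videoInput.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```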

Live streaming video on iPhone

I am new to iPhone development. I need to capture video, and while I'm capturing it the video should also be displayed on a server, something like live streaming.
Does anyone have an idea of where I should start for this functionality?
Thanks in advance.
Your question seems similar to this one:
Xcode iOS: Streaming of video file while recording
First half of the solution
Using AVFoundation you can get the video buffers/frames while recording (see the sketch at the end of this answer).
Second half
For uploading, I didn't find a ready-made solution. There is an input-stream option in the iOS APIs, but it needs a file path, and since the video is not recorded yet, there is no path to give it.
Edit 1
Here is the best example for AVFoundation provided by Apple; you can start with it.
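A minimal sketch of that first half: grabbing raw frames with AVCaptureVideoDataOutput. The delegate callback is where you would compress each frame and push it to your server yourself (the class name and queue label are illustrative):

```swift
import AVFoundation

// Sketch: get live camera frames, one per delegate callback.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each call delivers one camera frame; encode it and send it to the
        // server here (this uploading part is what you have to build yourself).
    }
}
```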
I recommend you use Wowza (https://www.wowza.com); it has all the features, from live streaming to video on demand, etc.

iOS video player metadata

My question is whether there is any built-in interpretation of metadata by the video player in iOS. I know one can add metadata to a video and interpret it within a custom application, as shown here.
On iOS, on an iPod or iPhone, an HTML video is opened in the native player. I would like to display a message above or below the video for a short duration at the beginning. Since I cannot control the native player, I thought there might be some built-in metadata interpretation that could be used to accomplish this, but I have not been able to find any information on it.
Any help is appreciated.
The blog you've posted includes details on using the native player MPMoviePlayerController to display metadata, which is pretty cool actually. You learn something new every day! If you're making a PhoneGap app, I suppose you could write a plugin to do this? (A sketch of the custom-app route follows below.)
Alternatively, have a look at this other SO question, which appears to suggest that it is possible, though not with metadata embedded in the actual video. Apparently this works on iOS:
Reading metadata from the <track> of an HTML5 <video> using Captionator
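If you do end up interpreting the metadata in a custom app rather than the native player, a minimal sketch using AVFoundation's AVPlayerItemMetadataOutput looks roughly like this (the class name and logging are illustrative):

```swift
import AVFoundation

// Sketch: pull timed metadata out of a video in a custom app. The built-in
// player that Safari hands HTML video to offers no hook like this.
final class MetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
    let player: AVPlayer

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        super.init()

        // nil identifiers = deliver all timed metadata found in the asset.
        let output = AVPlayerItemMetadataOutput(identifiers: nil)
        output.setDelegate(self, queue: .main)
        item.add(output)
    }

    func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                        didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                        from track: AVPlayerItemTrack?) {
        for group in groups {
            for item in group.items {
                print("metadata:", item.identifier?.rawValue ?? "?", item.stringValue ?? "")
            }
        }
    }
}
```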

Capturing video while processing it through a shader on iPhone

I am trying to develop an iPhone app that processes/filters and records video.
I have two sample apps that each cover part of what I need, and I am trying to combine them:
1. AVCamDemo from the WWDC10 sample code package (Apple Developer ID required), which deals with capturing/recording video.
2. Brad Larson's ColorTracking sample app, referenced here, which deals with live processing of video using OpenGL ES.
I get stuck when trying to combine the two.
What I have been trying to do is use AVCaptureVideoDataOutput and the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video frames through OpenGL ES (as in 2 above), while at the same time somehow using AVCaptureMovieFileOutput to record the processed video (as in 1 above).
Is this approach possible? If so, how would I need to set up the connections within the AVCaptureSession?
Or do I need to use the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video AND then recombine the individual frames back into a movie, without using AVCaptureMovieFileOutput to save the movie file?
Any suggestions for the best approach to accomplish this are much appreciated!
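For reference: as far as I know, AVCaptureMovieFileOutput and AVCaptureVideoDataOutput have not worked simultaneously on the same session on iOS, which is why the usual answer is the second approach, writing the processed frames back out yourself with AVAssetWriter. A minimal sketch of that recombination, assuming your shader pass hands you CVPixelBuffers (the class name and settings are illustrative):

```swift
import AVFoundation
import CoreVideo

// Sketch: write processed CVPixelBuffers into a movie with AVAssetWriter.
final class ProcessedFrameWriter {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
            ])
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    // Call once per processed frame, e.g. after your OpenGL ES filter pass.
    func append(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
        guard input.isReadyForMoreMediaData else { return }
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```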

Creating YouTube videos in an iPhone app

I'm doing a project where we want to create a video inside an iPhone app and upload it to YouTube. I've seen that you upload the video using Google's Data API (http://code.google.com/p/gdata-objectivec-client/).
However, it seems that you need to upload the movie as an actual movie file. Does anyone have experience producing a movie in a format that YouTube will accept via the Data API, and care to give me a few pointers on what would work?
(Just a quick note, I cannot use hidden APIs for this project)
Many thanks
YouTube accepts a broad range of formats. Just try it yourself: use any free video-editing software to create a short movie and upload it to YouTube; you're almost guaranteed that YouTube will be able to process it.
The second part of your question is whether iOS is able to produce a movie from still frames. The answer is yes, and you want to look at AVFoundation, particularly AVAssetWriter.
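To make the still-frames route concrete: AVAssetWriter's pixel buffer adaptor consumes CVPixelBuffers, so each frame image has to be rendered into one first. A minimal sketch of that conversion (the helper name and the 32ARGB format choice are illustrative):

```swift
import CoreGraphics
import CoreVideo

// Sketch: render a CGImage into a CVPixelBuffer that can be appended to a
// movie via AVAssetWriterInputPixelBufferAdaptor.
func pixelBuffer(from image: CGImage, width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attrs: [String: Any] = [
        kCVPixelBufferCGImageCompatibilityKey as String: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
    ]
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32ARGB, attrs as CFDictionary,
                              &buffer) == kCVReturnSuccess,
          let pb = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pb, [])
    defer { CVPixelBufferUnlockBaseAddress(pb, []) }

    // Draw the frame image into the pixel buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pb),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pb),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pb
}
```

From there, append each buffer with an increasing presentation time (e.g. one frame every 1/30 s) and finish the writer, then hand the resulting file to the Data API uploader.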