I need to save the video before uploading - iPhone

I have an application that can record a video and upload it.
After I get the video from UIImagePickerController, I upload it with a PUT request using ASIHTTPRequest.
I get memory warnings and sometimes the upload times out.
I was thinking that I have to save the video to disk before uploading.
What is the best solution for my problem?
Regards,
George
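A minimal sketch of the save-to-disk step being asked about, assuming the video arrives via UIImagePickerController's delegate as a temporary file URL under the .mediaURL key; the file name and function name here are arbitrary, and the upload itself is left out:

```swift
import UIKit

// Copy the temporary movie that UIImagePickerController hands back into
// Documents, so the uploader can stream the request body from disk instead
// of holding the whole video in memory.
func saveRecordedVideo(from info: [UIImagePickerController.InfoKey: Any]) -> URL? {
    guard let tempURL = info[.mediaURL] as? URL else { return nil }
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let savedURL = docs.appendingPathComponent("recorded.mov")  // arbitrary name
    try? FileManager.default.removeItem(at: savedURL)
    do {
        try FileManager.default.copyItem(at: tempURL, to: savedURL)
        return savedURL  // upload from this path, streaming the body from disk
    } catch {
        return nil
    }
}
```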

Well, I don't know how big your video is, but I think either AVFoundation or MediaPlayer has methods for getting specific frames out of the video.
If you run out of memory, you can either split the video into frames and upload each one in a loop, or split it into n-second parts, where n is chosen so that each sub-video has a reasonable upload time and memory footprint.
I have already used AVAssetImageGenerator in a project to get UIImages from a video and upload them, and it worked fine.
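A rough sketch of that AVAssetImageGenerator approach in Swift (the function name and step size are mine, not from the framework):

```swift
import AVFoundation
import UIKit

// Pull a UIImage out of the recorded movie every `step` seconds.
func frames(from videoURL: URL, every step: Double) -> [UIImage] {
    let asset = AVURLAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true  // respect recording orientation

    var images: [UIImage] = []
    var t = 0.0
    while t < asset.duration.seconds {
        let time = CMTime(seconds: t, preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            images.append(UIImage(cgImage: cgImage))
        }
        t += step
    }
    return images  // upload each image (or batch them) from here
}
```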

Related

Force GPU usage with copying video sample buffers via AVAssetReader/AVSampleBufferDisplayLayer

I am looping 1 second mp4/h264 videos with no audio on an M1 Mac Mini. AVPlayer was causing hitches with scrolling.
Now I read videos using AVAssetReader and feed those CMSampleBuffers into an AVSampleBufferDisplayLayer.
To get it to seamlessly loop without having to create a new AVAssetReader, I just cache all of those samples in an array and create copies with new timing via CMSampleBufferCreateCopyWithNewTiming. These are short videos so it's not a lot of data.
It all works pretty well now with no hitches. However, VTDecoderXPCService is going nuts on the CPU. I was expecting the GPU to be doing most of the work for decoding.
Is copying those samples from memory via CMSampleBufferCreateCopyWithNewTiming causing it? Is there a better way?
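For reference, the cache-and-retime step described above can be sketched roughly like this (the helper name and loop bookkeeping are mine, not part of any framework):

```swift
import AVFoundation
import CoreMedia

// Re-stamp a cached sample so it can be enqueued again on the next loop pass.
// `loopStart` is the presentation time at which the new pass of the loop begins.
func retimedCopy(of sample: CMSampleBuffer, loopStart: CMTime) -> CMSampleBuffer? {
    var timing = CMSampleTimingInfo(
        duration: CMSampleBufferGetDuration(sample),
        presentationTimeStamp: CMTimeAdd(loopStart, CMSampleBufferGetPresentationTimeStamp(sample)),
        decodeTimeStamp: .invalid)

    var copy: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(allocator: kCFAllocatorDefault,
                                          sampleBuffer: sample,
                                          sampleTimingEntryCount: 1,
                                          sampleTimingArray: &timing,
                                          sampleBufferOut: &copy)
    return copy
}

// Feeding the display layer, e.g. from a CADisplayLink callback:
// if displayLayer.isReadyForMoreMediaData,
//    let buf = retimedCopy(of: nextCachedSample, loopStart: loopStart) {
//     displayLayer.enqueue(buf)
// }
```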

Live Streaming Video in iPhone

I am new to iPhone development. I need to capture video, and while I'm capturing, the video should be displayed on a server too, something like live streaming.
Does anyone have an idea of where I should start for this functionality?
Thanks in Advance.
Your question seems similar to this one:
Xcode ios: Streaming of video file while recording
First half of the solution
Using AVFoundation you can get video buffers/frames while recording.
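A minimal sketch of that first half, assuming AVCaptureVideoDataOutput delivers the frames; the class name and queue label are illustrative:

```swift
import AVFoundation

// A minimal capture pipeline that delivers raw frames while recording.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "video.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    // Called once per captured frame; this is where each buffer could be
    // encoded and pushed to a server for a live-streaming setup.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // handle / encode / upload the frame here
    }
}
```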
Second half
But for uploading I didn't find any solution.
There is an input stream option in the iOS APIs, but it needs a file path; since the video is not yet recorded, we don't have any path.
Edit 1
Here is the best AVFoundation example provided by Apple; you can start with that.
I recommend you use Wowza (https://www.wowza.com); it has all the features, from live streaming to video on demand, etc.

Can MPMoviePlayer get its data from an NSInputStream?

I have an iPad app which has a network connection to another iPad. On the client iPad, I want to be able to take data from an NSInputStream (which comes from the server iPad) and play it in MPMoviePlayer as it downloads from the server iPad.
I know that I can download the entire video, save it to a file, and open it in media player, but I want to be able to start playing before the full file has been downloaded.
I have NOT tried saving a chunk of it to a file and playing that, then appending to the file as more data becomes available in the stream, because a) the file is likely to get locked, and b) the movie player is likely to open the file and read it into an internal cache, so adding to the file later won't (I don't think) play the new content. I'm willing to try it down the road if nobody has any brilliant ideas, but I give it a very low likelihood of working - I'd guess a 10% chance of success.
If MPMoviePlayer had an initWithData method, I would simply give it an NSMutableData and add to the data as it became available from the stream, but I don't see a method like that. Does anyone have any ideas for how I can do this?

iPhone: Video API: modifying a live video stream

I have a question about video stream processing. Is it possible to access and modify the real-time video stream during recording (e.g. I want to add some text to the video)? I can do this in a preview by getting separate frames, but I'm looking for a tool that will let me store the video with my text rendered into the frames.
Probably there are already some libraries/tools available (but I haven't found any yet).
Try the GPUImage library. It can help you.
You should check the AVCam sample code by Apple. That might be a starting point.
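If you go the raw AVFoundation route (AVCam-style capture plus your own AVAssetWriter), a rough sketch of burning text into each captured BGRA frame before appending it to the writer could look like this; the function name is made up, and error handling and pixel-format checks are omitted:

```swift
import AVFoundation
import CoreGraphics
import UIKit

// Draws a text overlay directly into a BGRA pixel buffer delivered by
// AVCaptureVideoDataOutput, before the buffer is handed to an
// AVAssetWriterInputPixelBufferAdaptor for recording.
func burnText(_ text: String, into pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: CVPixelBufferGetWidth(pixelBuffer),
        height: CVPixelBufferGetHeight(pixelBuffer),
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue
            | CGImageAlphaInfo.premultipliedFirst.rawValue)
    else { return }

    // Flip to UIKit's top-left origin so the text isn't mirrored.
    context.translateBy(x: 0, y: CGFloat(CVPixelBufferGetHeight(pixelBuffer)))
    context.scaleBy(x: 1, y: -1)

    UIGraphicsPushContext(context)
    (text as NSString).draw(
        at: CGPoint(x: 20, y: 20),
        withAttributes: [.font: UIFont.boldSystemFont(ofSize: 36),
                         .foregroundColor: UIColor.white])
    UIGraphicsPopContext()
}
```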

Simultaneously stream and save a video?

I'm writing an app, part of which allows the user to stream/play videos. I want to restrict this functionality so that they can only stream videos over a WiFi connection. I will then save the video so that when they have only a 3G (or lesser) connection they can't stream videos and can only replay videos that are already saved on the phone.
Ideally, I'd like MPMoviePlayerController to stream/play the movie and then let me access the movie data and save it. However, the MPMoviePlayerController API doesn't seem to support access to the movie data.
I'd like to avoid a download-then-play scenario. Any ideas?
Two solutions come to mind.
Both of these solutions require that the file is in a format that can be played progressively, i.e. that you don't need the whole file to be able to play it (but that would be a prerequisite anyway).
1. Use a thread to download the data and append it to a file, and play the file from another thread. This requires that you can handle EOF events in the MPMoviePlayerController, pause playback until the cache file has been appended to, and then resume from the same point.
From what I've seen of people trying this, it doesn't work because MPMoviePlayerController can't handle the EOF event (I haven't tested it myself yet): Caching videos to disk after successful preload by MPMoviePlayerController
2. Skip playing from a file: set up a local HTTP server and stream from it (on localhost). This is also untested.
The idea is that MPMoviePlayerController would handle missing data better from an HTTP stream than from reading the file directly.
The downside might be that this is less efficient, but I think that is only a minor increase in CPU usage. I don't know how the network interface would handle it, but I'm assuming it's not an issue.
I'm leaving this answer as a wiki, because I don't have a working solution, but I too want one.
There is a way to make this work, but you have to write your own HTTP Live Streaming downloader.
Basically, you parse the .m3u8 file (it's a pretty simple standard, but can get tricky with alternate streams and the possibility that the stream will simply drop out and need a new playlist to continue) and then download the chunks in .ts format to your local storage, say the Documents folder or Caches etc.
Then you'll have to set up a local HTTP server to allow the MPMoviePlayerController or AVPlayer to access the files over HTTP (since they won't touch a local file path), including a re-coded playlist file pointing to the local files, which you'll have to create yourself from the original playlist(s).
CocoaHTTPServer works great for this.
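A rough sketch of the playlist-mirroring step described above (heavily simplified: synchronous I/O for brevity, no error recovery, no variant/alternate playlists; the function and file names are illustrative):

```swift
import Foundation

// Download a media playlist, fetch each .ts segment into a local directory,
// and write out a rewritten playlist whose segment lines point at the files
// that the embedded HTTP server (e.g. CocoaHTTPServer) will serve.
func mirrorPlaylist(at remoteURL: URL, into localDir: URL) throws -> URL {
    let playlist = try String(contentsOf: remoteURL, encoding: .utf8)
    var rewritten: [String] = []

    for line in playlist.components(separatedBy: .newlines) {
        if line.hasSuffix(".ts"), let segmentURL = URL(string: line, relativeTo: remoteURL) {
            let localFile = localDir.appendingPathComponent(segmentURL.lastPathComponent)
            let data = try Data(contentsOf: segmentURL)       // synchronous for brevity
            try data.write(to: localFile)
            rewritten.append(segmentURL.lastPathComponent)    // relative path served locally
        } else {
            rewritten.append(line)                            // keep #EXTM3U, #EXTINF, etc.
        }
    }

    let localPlaylist = localDir.appendingPathComponent("local.m3u8")
    try rewritten.joined(separator: "\n")
        .write(to: localPlaylist, atomically: true, encoding: .utf8)
    return localPlaylist  // point the local HTTP server (and the player) at this
}
```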
Once you've done all that, it works great. It's unavoidable that you get a little delay while you download the first chunk or two before presenting your local HTTP URL to the movie player, but after that you get seamless download, recording and preview playback.
Good luck!
The iPhone uses progressive download, so the video will not be saved on the device. To keep it, you need to explicitly download it and then play the video from your local folder.
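A hedged sketch of that explicit download-then-play approach, using URLSession and AVPlayer here since MPMoviePlayerController has long been deprecated; the function and file names are arbitrary:

```swift
import UIKit
import AVKit
import AVFoundation

// Download the remote video into Documents, then play the local copy.
func downloadAndPlay(_ remoteURL: URL, from presenter: UIViewController) {
    let task = URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let localURL = docs.appendingPathComponent("cached-video.mp4")  // arbitrary name
        try? FileManager.default.removeItem(at: localURL)
        try? FileManager.default.moveItem(at: tempURL, to: localURL)

        DispatchQueue.main.async {
            let playerVC = AVPlayerViewController()
            playerVC.player = AVPlayer(url: localURL)
            presenter.present(playerVC, animated: true) { playerVC.player?.play() }
        }
    }
    task.resume()
}
```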