Is it possible to play a video on iPhone and have subtitles synchronized with it show up? - iphone

I want to add subtitles to a video played in an iPhone app. I don't want the subtitles encoded into the video itself; ideally I'd love to have a view showing the video (with pause, play, volume, and the other standard controls) together with a view displaying text that changes as the movie time changes.
If I were to sketch it, it would be something like this: a movie view with the standard controls, and a label underneath showing the current subtitle.
So, basically, I would need a way to get a method called while the movie is playing, and then synchronize the text displayed on the label with the movie timing.
Has anyone used a solution that was able to do this?

I've recently done something that syncs graphics to times in an audio track. The way I did it was by using the currentPlaybackTime property of the MPMediaPlayback protocol (which the movie player controller should also conform to). This returns the seconds elapsed in the media as a double (typedef'ed as NSTimeInterval). The actual synchronisation in my app was not done with notifications, as I couldn't find any resembling a "tick"; instead I created a timer that called a function which queried currentPlaybackTime and updated the graphics based on it.
In terms of your implementation, I would assume you have some kind of system for associating label text (subtitles) with a particular time. You could then compare the text's time range with the time returned from currentPlaybackTime to find the correct text to display.
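A rough, untested sketch of that in Swift (MPMoviePlayerController is the API discussed here, though it has since been deprecated in favour of AVPlayer; movieURL, the cue contents, and the label placement are all assumptions of mine):

import MediaPlayer
import UIKit

// A cue is just a time range plus the text to show during it.
struct SubtitleCue {
    let start: TimeInterval
    let end: TimeInterval
    let text: String
}

let cues: [SubtitleCue] = [
    SubtitleCue(start: 0.0, end: 2.5, text: "Hello."),
    SubtitleCue(start: 2.5, end: 6.0, text: "Welcome to the film."),
]

let player = MPMoviePlayerController(contentURL: movieURL) // movieURL: assumed URL of your video
let subtitleLabel = UILabel() // assumed to be laid out under the movie view
player.play()

// There is no "tick" notification, so poll currentPlaybackTime a few times
// per second and show whichever cue covers the current position.
let subtitleTimer = Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { _ in
    let now = player.currentPlaybackTime // seconds elapsed, an NSTimeInterval
    subtitleLabel.text = cues.first { $0.start <= now && now < $0.end }?.text ?? ""
}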

Related

Is it possible to record the audio that comes out of the iPhone?

I am working on an app that allows the user to create a sort of dub. An audio file plays, and the user can tap at certain moments to insert a sound (kind of like a censor button). I'm wondering how to go about capturing the final product.
Capturing the audio directly from the iPhone seems the easiest route, as the user already hears the finished product as it is made. However, I can't find anything on how to do this. If that isn't possible, are there any other suggestions?
The best way would probably be to use the AVFoundation framework for mixing, and then to buffer the audio as well as play it. That would allow a high abstraction level while guaranteeing that the played and the saved audio are equal.
Apart from that: from a "how can I achieve this with minimum code" perspective, the question is far too broad and/or opinion-based without more information about your setup.
You will have to work with buffers. I don't know offhand how it is done in Swift, but you can implement it in Objective-C and then bridge it.
You can refer to these answers here on Stack Overflow (they are a bit old):
https://stackoverflow.com/a/11218339/2683201
https://stackoverflow.com/a/10101877/2683201
A related project also exists (in Objective-C):
https://github.com/alexbw/novocaine
The main idea for your case would be to have two separate buffers plus your sound effect.
You would then play from buffer A (your music) and copy the played data into buffer B (the final output), except while the effect is playing, in which case you would copy the effect data into buffer B instead.
The other option is to do it offline:
Play your music (or audio) and keep a timer running synced with the elapsed time of your "to be censored audio".
Save the timestamp of when you start and end tapping the censor button (for example).
Overlay your effect onto buffer A at the recorded (start-end) timestamps.
Save the buffer to a file (or do whatever else you need to do with it).
UPDATE:
You should take a look at Apple's sample implementation of something like this:
https://developer.apple.com/library/ios/samplecode/AVAEMixerSample/Introduction/Intro.html
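To make that concrete, here is an untested sketch built on AVAudioEngine (the framework the Apple sample above uses): two player nodes are mixed by the main mixer, and a tap on the mixer writes exactly what is being played to a file, so the saved audio matches what the user hears. The file URLs, the buffer size, and all names are assumptions of mine:

import AVFoundation

final class DubSession {
    let engine = AVAudioEngine()
    let musicPlayer = AVAudioPlayerNode()
    let effectPlayer = AVAudioPlayerNode()
    let effectFile: AVAudioFile

    // musicURL/effectURL: existing audio files; outputURL: e.g. a .caf to write.
    init(musicURL: URL, effectURL: URL, outputURL: URL) throws {
        let musicFile = try AVAudioFile(forReading: musicURL)
        effectFile = try AVAudioFile(forReading: effectURL)

        engine.attach(musicPlayer)
        engine.attach(effectPlayer)
        engine.connect(musicPlayer, to: engine.mainMixerNode, format: nil)
        engine.connect(effectPlayer, to: engine.mainMixerNode, format: nil)

        // Tap the mixer output: whatever is heard also goes to disk, so the
        // played and saved mixes are identical.
        let format = engine.mainMixerNode.outputFormat(forBus: 0)
        let outputFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)
        engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
            try? outputFile.write(from: buffer)
        }

        try engine.start()
        musicPlayer.scheduleFile(musicFile, at: nil, completionHandler: nil)
        musicPlayer.play()
    }

    // Wire this to the "censor" button; the effect is mixed over the music.
    func playCensorEffect() {
        effectPlayer.scheduleFile(effectFile, at: nil, completionHandler: nil)
        effectPlayer.play()
    }
}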

iPhone HTML5 video - how to start from a different time

What is the correct way to begin playback of a video from a specific time?
Currently, the approach we use is to check at an interval whether it's possible to seek via currentTime and then seek. The problem with this is, when the video fullscreen view pops up, it begins playback from the beginning for up to a second before seeking.
I've tried events such as loadedmetadata and canplay, but those seem to fire too early.
Added information:
It seems the very best I can do is set a timer that tries to set currentTime repeatedly as soon as play() is called; however, this is not immediate enough. The video loads from the beginning and, after about a second depending on the device, jumps. This is a problem for me, as it provides an unsatisfactory experience to the user.
It seems like there can be no solution which does better, but I'm trying to see if there is either:
a) something clever/undocumented that I've missed which allows you to seek before loading, or otherwise indicate that the video should start not from 00:00 but from an arbitrary point
b) something clever that allows you to hide the video while it's playing and not display it until it has finished seeking (so you would see a longer delay on the phone before the fullscreen video window pops up, but it would start immediately where I need it to instead of seeking)
Do something like this:
var video = document.getElementById("video");
video.currentTime = starttimeoffset; // seconds at which playback should begin
More information can be found on this page dedicated to video time offset how-tos.
For desktop Chrome/Safari, you can append #t=starttimeoffsetinseconds to your video src URL to make it start from a certain position.
For iOS devices, the best we can do is listen for the timeupdate event and do the seek in there. I guess this is the same as your original approach of using a timer.

Capture video without displaying the actual video feed

So I have an application that can currently capture video with the front-facing iPhone camera and then do some processing on the video feed in real time. What I'm trying to do, however, is make this process run in the background and put other controls onscreen. So, for example, say I'd like to run the camera and process the image feed, but I want the user to see a black screen with some buttons on it. Any ideas on how to do this?
Just so we get terminology right, by "in the background", you mean running the camera capture while your application is in the foreground, but not displaying the actual video feed. This is possible, but I wanted to make clear that if you move your whole application into the background you will not have access to the camera then.
There are a few ways to do this, but the one that I've spent the most time with is grabbing frames of video (or photos) via AV Foundation. Using an AVCaptureDevice and AVCaptureSession, you can grab the frames of video and route them to an encoder for saving to disk or for processing using your own custom code. None of this requires the camera feed to be displayed onscreen, so you can put up whatever interface you like and do this video recording or photo capture without any onscreen indication.
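Here is a bare, untested Swift sketch of that configuration (the class and queue names are mine): an AVCaptureSession with a video data output and no preview layer attached, so the feed is never drawn onscreen.

import AVFoundation

final class HiddenCameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frame.queue")

    func start() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: frameQueue)
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)

        // No AVCaptureVideoPreviewLayer is created, so nothing appears onscreen;
        // your own UI (e.g. a black view with buttons) can sit on top.
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Process each frame here, or hand it to an encoder such as AVAssetWriter.
    }
}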
I would caution that you should make it explicit to your users what you are doing, so that you do not run the risk of violating someone's privacy. Apple does not react kindly to those who do this (for good reason).
I encapsulate a lot of this within my open source GPUImage video and photo processing framework, so you could look at the code for the GPUImageVideoCamera class there to see how I configure the capture inputs. I hand the video frames off to OpenGL ES for the application of filters and other processing operations, but you could ignore that portion of it if you just wanted to do your own encoding or processing.
Here's example code from Apple's docs:
http://developer.apple.com/library/ios/#samplecode/PhotoPicker/Introduction/Intro.html
It also shows a way to customize the camera interface.

Playing a real time video stream from iPhone camera on a 20 second delay

I am trying to see if it is possible to record a video from the iPhone's camera and write it to a file. I then want the video to start playing on the screen a set time later. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from a file at the same time to accomplish this? Any other better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start is to look at what's provided in AVCaptureVideoDataOutput and its delegate methods (which is where you can get the frame data).
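An untested sketch of the circular-buffer idea (the type and names are mine, and 20 seconds of full-resolution frames is a lot of memory, so you would likely downscale before storing): the capture delegate pushes frames with timestamps, and a CADisplayLink pops whichever frame is now 20 seconds old for display.

import Foundation
import QuartzCore

final class DelayedFrameBuffer {
    private var frames: [(time: CFTimeInterval, image: CGImage)] = []
    private let delay: CFTimeInterval = 20
    private let lock = NSLock()

    // Call from the AVCaptureVideoDataOutput delegate with the converted frame.
    func push(_ image: CGImage) {
        lock.lock(); defer { lock.unlock() }
        frames.append((CACurrentMediaTime(), image))
    }

    // Call from a CADisplayLink; returns the frame captured ~20 s ago and
    // discards everything older, so the buffer never grows past the delay.
    func popDelayedFrame() -> CGImage? {
        lock.lock(); defer { lock.unlock() }
        let cutoff = CACurrentMediaTime() - delay
        guard let index = frames.lastIndex(where: { $0.time <= cutoff }) else { return nil }
        let image = frames[index].image
        frames.removeFirst(index + 1)
        return image
    }
}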

MPMoviePlayer: how to get the amount of time played?

I was going through the iPhone sample code for MediaPlayer.
I want to be able to capture the amount of time the media player has played the video, i.e. the position at which playback stopped. Is there a method or property that will tell me the duration of play of the media?
Unfortunately the current API for MPMoviePlayerController allows basically no control. You can tell it to play and stop; beyond that there's a delegate method so you can be notified when the movie finishes playing, and that's it - there are no additional controls (a real bummer).
However, while we can't discuss the new 3.2 SDK yet, I'll give you a tip: go look at the documentation for MPMoviePlayerController in 3.2 and I think you'll be happy.
http://developer.apple.com/iphone/prerelease/library/documentation/MediaPlayer/Reference/MPMoviePlayerController_Class/MPMoviePlayerController/MPMoviePlayerController.html
moviePlayer.currentPlaybackTime
It's not possible to use KVO on it, but you could do what I did and create a scheduled timer that fires every second, checks the current playback time, and updates your graphics accordingly :)
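For example (an untested sketch; moviePlayer is assumed to be the MPMoviePlayerController from above, and the one-second interval is arbitrary):

import MediaPlayer

var lastKnownPlaybackTime: TimeInterval = 0

// Poll once a second; when playback stops, lastKnownPlaybackTime holds how
// far the user got, and moviePlayer.duration gives the total length.
let pollTimer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
    lastKnownPlaybackTime = moviePlayer.currentPlaybackTime
}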
Yes, you can use the duration property defined by MPMoviePlayerController. Please try it out and check the output; you can refer to the duration property documentation.