What is the correct way to begin playback of a video from a specific time?
Currently, the approach we use is to check at an interval whether it's possible to seek via currentTime, and then seek. The problem with this is that when the fullscreen video view pops up, it begins playback from the beginning for up to a second before seeking.
I've tried events such as loadedmetadata and canplay, but those seem to fire too early.
Added information:
It seems the very best I can do is to set a timer that tries to set currentTime repeatedly as soon as play() is called; however, this is not immediate enough. The video loads from the beginning and, after about a second depending on the device, jumps. This is a problem for me as it provides an unsatisfactory experience for the user.
It seems like no solution can do better, but I'm trying to see if there is either:
a) something clever/undocumented which I have missed that allows you to seek before loading, or otherwise indicate that the video needs to start not from 00:00 but from an arbitrary point, or
b) something clever which allows you to hide the video while it's playing and not display it until it has seeked (you would see a longer delay on the phone before the fullscreen video window pops up, but it would start immediately where I need it to instead of seeking).
Do something like this:

var video = document.getElementById("video");
video.currentTime = startTimeOffset;
For desktop Chrome/Safari, you can append #t=<start time in seconds> to your video src URL to make it start from a certain position, e.g. video.mp4#t=30 to start 30 seconds in.
For iOS devices, the best we can do is listen for the timeupdate event and do the seek there. I guess this is the same as your original approach of using a timer.
I have a WKInterfaceInlineMovie within a WKInterfaceController. The URL of the video is set at some point after the video file is downloaded. Playing works fine, apart from another problem, which is a different story, I think.
Here's the problem. If I keep the screen open, lower my wrist and then raise it again, I can see the same screen and the video starts playing automatically.
It looks very weird and unexpected, especially because I have some custom UI (a video progress indicator, animated Play/Pause buttons) that is triggered when I manually start the video but obviously doesn't react to this unwanted automatic start. If I close the extension with the Crown button, the next time I open the app it again shows the screen with the video and starts playing automatically. I can even leave the extension unused for a while and receive a user notification later – while the custom notification UI is displayed, I can hear the video start playing somewhere underneath for a short period of time.
When it happens, I always receive two messages in the console:
<<<< PlayerRemoteXPC >>>> remoteXPCPlaybackItem_NotificationFilter: [0x128b86e0] I/NQB.01 Received kFigPlaybackItemNotification_FirstVideoFrameEnqueued
<<<< PlayerRemoteXPC >>>> remoteXPCItem_handleFirstFrameNotificationLatch: [0x128b86e0] I/NQB.01 Posting kFigPlaybackItemNotification_FirstVideoFrameEnqueued
I have the Autoplay checkbox unchecked in the storyboard. I also tried calling setAutoplays(false) programmatically, both when the outlet is initialized and later when a URL is set on the movie. None of this makes any difference.
The behavior is the same whether or not the video was playing when the screen was deactivated. I tried calling pause() in willDisappear() and didDeactivate() – it also makes no difference.
I even tried calling pause() in didAppear() and willActivate() – that didn't help either. Curiously enough, these methods are not called when I lower my wrist and raise it again (however, willDisappear() and didDeactivate() are both called). But perhaps that's a different story, too.
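For reference, a minimal sketch of the mitigations described above (the controller and outlet names are hypothetical; as noted, none of this helped):

import WatchKit

class MovieInterfaceController: WKInterfaceController {
    @IBOutlet var movie: WKInterfaceInlineMovie!  // hypothetical outlet name

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        movie.setAutoplays(false)  // mirrors the unchecked Autoplay checkbox
    }

    override func willActivate() {
        super.willActivate()
        movie.pause()  // attempted workaround; reportedly makes no difference
    }

    override func didDeactivate() {
        movie.pause()  // attempted workaround; reportedly makes no difference
        super.didDeactivate()
    }
}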
In an iOS project I am using the AVAudioPlayerNode in conjunction with the AVAudioEngine and an AVAudioUnitTimePitch. Everything works peachy. However, I was wondering if there is a way to figure out the player's current state (e.g. isPlaying, isPaused) or at least the playback position.
While AVAudioPlayer at least allows you to get the currentTime parameter, I could not yet figure out how to get that information from AVAudioPlayerNode. I tried playing around with the nodeTime(forPlayerTime:) and playerTime(forNodeTime:) methods described in the Swift documentation, but I couldn't make any progress.
Any help would be highly appreciated.
Since AVAudioPlayerNode is designed around an audio stream, it doesn't necessarily keep track of the time within a particular file. However, it does keep a running count of how long it has been playing all audio. This timer doesn't reset with each file; to change it, you must explicitly tell the player where you want to start counting from.
So to find how long the player has been playing, you divide the number of samples rendered so far by the sample rate:
player.lastRenderTime.sampleTime / file.fileFormat.sampleRate
Now, in order to get the timer to reset after each file, you must explicitly reset the player's current time. To do this, use the player.play(at:) function.
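In Swift, lastRenderTime is optional and is expressed in node time, so a safer version of that calculation might look like this (a minimal sketch; player is assumed to be an AVAudioPlayerNode attached to a running engine):

import AVFoundation

// Returns how long the player node has been playing, in seconds,
// or nil if the engine hasn't rendered anything yet.
func currentTime(of player: AVAudioPlayerNode) -> TimeInterval? {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return nil
    }
    // sampleTime is the number of frames rendered so far;
    // dividing by the sample rate converts frames to seconds.
    return Double(playerTime.sampleTime) / playerTime.sampleRate
}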
If you would like an example, check one out here: https://github.com/danielmj/AEAudioPlayer
I am trying to see if it is possible to record a video from the iPhone's camera and write this to a file. I then want the video to start playing on the screen a set time after. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from the same file at the same time to accomplish this? Is there a better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start would be to look at what's provided by AVCaptureVideoDataOutput and its delegate methods (which is where you can get the frame data).
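Here's a rough sketch of the capture side under that approach, assuming an already-configured AVCaptureSession; the displayFrame hook is hypothetical. Note that the capture pipeline recycles its sample buffers and 20 seconds of raw frames is a lot of memory, so in practice you'd copy the pixel data out (and likely compress it) rather than retain the CMSampleBuffers themselves:

import AVFoundation

// Holds the last `delaySeconds` of frames and releases older ones for display.
final class DelayedFrameBuffer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var frames: [CMSampleBuffer] = []  // simple FIFO acting as the circular buffer
    private let delaySeconds: Double

    init(delaySeconds: Double) { self.delaySeconds = delaySeconds }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        frames.append(sampleBuffer)
        let newest = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Frames that have aged past the delay window are ready to be shown.
        while let oldest = frames.first,
              CMTimeGetSeconds(CMTimeSubtract(newest,
                  CMSampleBufferGetPresentationTimeStamp(oldest))) > delaySeconds {
            frames.removeFirst()
            // displayFrame(oldest)  // hypothetical hook, e.g. an AVSampleBufferDisplayLayer
        }
    }
}

// Wiring it into a session (device setup and permissions omitted):
let session = AVCaptureSession()
let output = AVCaptureVideoDataOutput()
let delayBuffer = DelayedFrameBuffer(delaySeconds: 20)
output.setSampleBufferDelegate(delayBuffer, queue: DispatchQueue(label: "video.delay"))
if session.canAddOutput(output) { session.addOutput(output) }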
I want to add "subtitles" to a video played in an iPhone app. I don't want those subtitles encoded into the video itself - ideally I'd love to have a view showing the video (with pause, play, volume and similar standard controls) together with a view displaying text that changes as the movie time changes.
So, basically, I would need a way to get a method called while the movie is playing, and then synchronize the text displayed on the label with the movie timing.
Has anyone used a solution that was able to do this?
I've recently done something that syncs graphics to times in an audio track. The way I did it was by using the currentPlaybackTime property of the MPMediaPlayback interface (which the movie player controller should also conform to). This returns the seconds elapsed in the media as a double (typedef'ed as NSTimeInterval). The actual synchronisation in my app was not done with notifications, as I couldn't find any resembling a "tick"; instead I created a timer that called a function which queried currentPlaybackTime and updated the graphics based on it.
In terms of your implementation, I would assume you have some kind of system for associating label text (subtitles) with a particular time. You could then compare the text's time range with the time returned from currentPlaybackTime to find the correct text to display.
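A minimal sketch of that timer approach, assuming an MPMoviePlayerController and a hypothetical Subtitle type that pairs display text with a time range:

import MediaPlayer
import UIKit

// Hypothetical subtitle model: what to show and when to show it.
struct Subtitle {
    let range: ClosedRange<TimeInterval>
    let text: String
}

final class SubtitleSynchronizer {
    private let player: MPMoviePlayerController
    private let subtitles: [Subtitle]
    private let label: UILabel
    private var timer: Timer?

    init(player: MPMoviePlayerController, subtitles: [Subtitle], label: UILabel) {
        self.player = player
        self.subtitles = subtitles
        self.label = label
    }

    func start() {
        // Poll a few times per second; there's no "tick" notification to observe.
        timer = Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            let now = self.player.currentPlaybackTime  // seconds elapsed in the media
            self.label.text = self.subtitles.first { $0.range.contains(now) }?.text
        }
    }

    func stop() { timer?.invalidate() }
}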
I want to detect when my device is being shaken. I can detect a shake with the didAccelerate method of UIAccelerometerDelegate, but I don't know how to detect whether the device is still shaking. I want to play an audio file when the user shakes the device for the first time; while that first file is playing I have to check whether the device is still being shaken, and if it is, I then play another file.
See the sample project GLPaint from Apple which was found by visiting http://developer.apple.com/iphone and entering "shake" in the search box. Developer account not required.
You might consider writing a method that runs in a separate thread, polls every now and then whether your device is being shaken, and fires events that you handle somewhere else in your code (or, instead of that, handles whatever you want to handle within the thread's context itself, even though I would not recommend doing that).
You just have to make sure that your "shake-detector" thread exits at some point; you probably want to do that when the second audio file stops playing, so your loop could test on that condition.
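If you go the polling route, a sketch using CMMotionManager (the modern replacement for UIAccelerometer, which spares you the manual thread) might look like this; the threshold and update interval are assumptions you would need to tune:

import CoreMotion

final class ShakeMonitor {
    private let motion = CMMotionManager()
    private let threshold = 2.0  // assumed magnitude, in g, that counts as "shaking"
    private(set) var isShaking = false

    func start() {
        motion.accelerometerUpdateInterval = 0.1  // poll ten times per second
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Treat any axis exceeding the threshold as an ongoing shake.
            self.isShaking = abs(a.x) > self.threshold
                || abs(a.y) > self.threshold
                || abs(a.z) > self.threshold
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}

Your audio code could then check isShaking when the first file finishes to decide whether to queue the second file, and call stop() once that second file is done.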
Hope I could help a bit.