I am trying to load the HTML5 SoundCloud widget and immediately start playing track 30 from a set.
My code looks like:
<iframe id="sc-widget" src="https://w.soundcloud.com/player/?url=http://soundcloud.com/drasla&start_track=30&auto_play=true"></iframe>
However, playback starts at track #25 instead. The widget seems to paginate the data and only pulls the first 25 tracks until the user scrolls down.
I have tried passing a limit=100 parameter into the URL, but this doesn't seem to increase the initial load.
Is there any way to immediately skip to a track after #25 using the widget?
Create a playlist for the artist, add the tracks, then set up your widget code:
https://w.soundcloud.com/player/?url=https://api.soundcloud.com/playlists/<playlist_number>?start_track=<track_number>&auto_play=true
example: https://w.soundcloud.com/player/?url=https://api.soundcloud.com/playlists/462496?start_track=38&auto_play=true
You'll notice the player jumps directly to track #38 in the example.
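The URL construction above can be sketched in a few lines of JavaScript. Note that `buildWidgetSrc` is a hypothetical helper of my own, and that it passes `start_track` and `auto_play` as top-level widget parameters (with `&`) rather than nesting them after the playlist URL as the example does; the playlist ID 462496 is the one from the example:

```javascript
// Build the widget src for a playlist, starting at a given track.
// buildWidgetSrc is an illustrative helper, not a SoundCloud API.
function buildWidgetSrc(playlistId, startTrack) {
  var apiUrl = 'https://api.soundcloud.com/playlists/' + playlistId;
  return 'https://w.soundcloud.com/player/?url=' +
    encodeURIComponent(apiUrl) +
    '&start_track=' + startTrack +
    '&auto_play=true';
}

// Assign it to the iframe when running in a browser.
if (typeof document !== 'undefined') {
  document.getElementById('sc-widget').src = buildWidgetSrc(462496, 38);
}
```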
Related
I am working on an app that plays audiobooks. When you click a book in your list, a full-window player starts playing the book; however, when you close/minimize it, I want a mini player to show right above the navbar at the bottom. This player has to continue playing from where the first player "stopped" until the user decides to pause.
I'm unsure if this can be achieved without having both play the audio at the same time? The reason is that when the full-screen player is closed, player.dispose() is called.
I currently have a function which saves the position of the player and stores it in shared preferences; however, it cuts the seconds off, so even if I called that in the mini player, it wouldn't continue from the same position.
I'm new to Flutter/Dart, by the way, with only a few months of experience. Thanks in advance
You should use a single player and then have two player "widgets" displaying the state of that same player. That way, you don't need to try to stop and start the player - the player can continue playing the whole time while the widgets are the only things changing.
If you're a beginner, the simplest way to define a single player that can be referenced from everywhere is to define it in a global variable, outside of any class declarations (for something more advanced, read about the service locator pattern):
final player = AudioPlayer();
Then let's say you have two widgets to view the state of the same player: FullPlayerWidget and MiniPlayerWidget. They can both reference the same player instance defined above.
Finally, your logic error is that you are calling dispose when minimising the full player widget. You shouldn't do that unless you want the audiobook to stop playing permanently. You don't want it to stop permanently, and you don't even want it to stop at all!
Actually, the only thing you want to happen when minimising is a visual thing. You want to hide the full widget and display the mini widget. But on the audio side, you want the player itself to be unaffected, continuing to play audio as if nothing had happened. Therefore, in this event you should in fact not call any methods on the player.
The best thing you can do as a beginner is to study lots of examples. E.g. podcastproject (although its dependencies are a bit out of date now).
I can play a live stream with the hls.js player using playlists master.m3u8, video.m3u8 and metadata.m3u8. The video is created using ffmpeg hls commands and is using a rolling live window with the args:
-hls_list_size 20 -hls_time 2 -hls_flags delete_segments
This creates video fragments starting with video0.ts to video19.ts, then starts removing the first fragments as it adds new ones. The video.m3u8 eventually looks like...
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:25
#EXTINF:2.002133,
video25.ts
#EXTINF:2.002133,
video26.ts
...
My metadata.m3u8 playlist looks similar, though I am creating it from a separate source. The video and metadata playlists are kept in sync and play fine from the start of my live stream.
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:25
#EXTINF:2.000
sub_25.vtt
#EXTINF:2.000
sub_26.vtt
...
The problem starts when I reload the player page. On a reload, the player loads the playlists and plays correctly at the current live point.
I see it load fragments around video45.ts and sub_45.vtt. This seems correct: the media sequence in video.m3u8 is at 25; add the 20-fragment playlist size and the live position is around the 45th fragment, roughly 90 seconds into the live stream.
However, the media time in the player shows 40 seconds. It seems to use only the number of fragments in the playlist to come up with 40 seconds, even though the real live time is 90 seconds.
The resulting problem is that the 40-second media time is used to look up the text track cues, so captions for the 40-second mark are shown, not the 90-second mark where the video actually is.
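The two clocks described above can be written out explicitly. This is a sketch with illustrative variable names (not hls.js API), assuming the 2-second segments and 20-fragment window from the ffmpeg args:

```javascript
// Illustrative arithmetic for the two timelines after a reload.
var segmentDuration = 2;   // seconds, from -hls_time 2
var mediaSequence = 25;    // from EXT-X-MEDIA-SEQUENCE after reload
var playlistSize = 20;     // from -hls_list_size 20

// Where the live edge really is, counted from the start of the stream:
var realLiveEdge = (mediaSequence + playlistSize) * segmentDuration; // 90 s

// What the player reports when it only counts fragments in the playlist:
var playerTime = playlistSize * segmentDuration; // 40 s

// How far off the caption cues end up:
var cueOffset = realLiveEdge - playerTime; // 50 s
```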
Is there a way to get the player to reflect the 'real' time, despite the rolling live window, so that the captions (which are correctly loaded) display at the correct time?
Or is rolling-window live playback not going to work with subtitles in VTT?
If I disable the rolling window, I can reload the live stream many times: the 'full' live time loads and the captions line up OK.
what I'd like to do:
I would like to record a video using Flutter's CameraController that has the same duration as an animated webp. At the top of my screen, the animated webp is playing, and below there is a CameraPreview() widget that records whatever my camera catches. This recorded video's duration should be exactly as long as the animated webp's duration.
what I've tried so far:
Since Giphy offers not only a webp-version, but also an mp4-version, I downloaded the mp4 version and used ffmpeg to get the duration of that file.
I then used a timer to call CameraController.stopVideoRecording() automatically after this duration had elapsed following CameraController.startVideoRecording().
what I'd expect to happen:
I'd expect this recorded video to be as long as the animated webp. Unfortunately, it's not.
So, my question is:
Do you guys have any idea how I could manage to record a video with the same duration as an animated webp?
Thanks :)
OK, I sort of found what the issue is: webp (and also GIF) animations in Flutter are played slower than in browsers. I don't know if that is the case for all webps and GIFs, but the ones I tested all animate faster in a desktop browser than in Flutter. So the animation time of those webps is not the same as the playing time of the respective .mp4 file.
I use the mp4 versions now instead, and that does the job.
What is the correct way to begin playback of a video from a specific time?
Currently, the approach we use is to check at an interval whether it's possible to seek via currentTime, and then seek. The problem with this is that when the video's fullscreen view pops up, it plays from the beginning for up to a second before seeking.
I've tried events such as loadedmetadata and canplay, but those seem to fire too early.
Added information:
It seems the best I can do is to set a timer that tries to set currentTime repeatedly as soon as play() is called; however, this is not immediate enough. The video loads from the beginning and, after about a second depending on the device, jumps. This is a problem for me as it provides an unsatisfactory experience to the user.
It seems like there can be no solution which does better, but I'm trying to see if there is either:
a) something clever/undocumented which I have missed which allows you to either seek before loading or otherwise indicate that the video needs to start not from 00:00 but from an arbitrary point
b) something clever which allows you to hide the video while it's playing and not display it until it has seeked (So you would see a longer delay on the phone before the fullscreen video window pops up, but it would start immediately where I need it to instead of seeking)
Do something like this:
var video = document.getElementById("video");
video.currentTime = starttimeoffset;
More information can be found on pages dedicated to video time offset how-tos.
For desktop Chrome/Safari, you can append #t=starttimeoffsetinseconds to your video src URL to make it start from a certain position.
For iOS devices, the best we can do is listen for the timeupdate event and do the seek in there. I guess this is the same as your original approach of using a timer.
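Both suggestions can be combined in one sketch. The `withStartOffset` helper and the 30-second offset are my own illustrations; the `#t=` media fragment covers desktop Chrome/Safari, and the timeupdate listener is the iOS fallback:

```javascript
// Hypothetical helper: append a media fragment so browsers that honor
// it (desktop Chrome/Safari) start playback at the given offset.
function withStartOffset(src, seconds) {
  return src + '#t=' + seconds;
}

if (typeof document !== 'undefined') {
  var video = document.getElementById('video');
  var start = 30; // illustrative offset in seconds

  video.src = withStartOffset(video.src, start);

  // iOS fallback: seek on the first timeupdate after playback begins.
  var seeked = false;
  video.addEventListener('timeupdate', function () {
    if (!seeked && video.currentTime < start) {
      seeked = true;
      video.currentTime = start;
    }
  });
}
```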
I have lots of buttons on a web page. Depending on which one is clicked, I want to play a different video.
A large number of <video> elements doesn't seem to work particularly quickly or reliably.
So far, I have tried to:
Create and play() the video element dynamically, after an image is clicked:
var video = document.createElement('video');
video.src = 'video.mp4';
document.body.appendChild(video);
video.play();
This works on iOS 4, but not on iOS 3.
Create the video element before, and just change the src.
Doesn't work either.
It seems like the video object must have already done "its thing" before it can be played.
Use window.open() to open the video URL.
This will cause an annoying new tab to open, which will remain open after playback has completed.
Set window.location
This will cause the current page to be reloaded after playback has completed, which I'm trying to avoid.
Any more ideas?
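One variant of the "change the src" attempt above that is commonly suggested is to call load() explicitly after swapping src, so the element resets and fetches the new resource before play(). A sketch reusing a single element for all buttons; the `playClip` helper and the `data-src` button wiring are my own illustration, and I have not verified this on iOS 3:

```javascript
// Reuse one <video> element for every button: swap src, reload, play.
// playClip takes the element as a parameter so it can also be
// exercised with a stub object outside the browser.
function playClip(video, src) {
  video.src = src;
  video.load();  // explicitly reset and fetch the new resource
  video.play();
}

if (typeof document !== 'undefined') {
  var video = document.createElement('video');
  document.body.appendChild(video);
  // Wire each button carrying a data-src attribute to its own clip.
  document.querySelectorAll('button[data-src]').forEach(function (btn) {
    btn.addEventListener('click', function () {
      playClip(video, btn.getAttribute('data-src'));
    });
  });
}
```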