I have a .m3u8 video URL that works perfectly in a browser player, continuously showing the live video.
However, with video_player in Flutter, after initializing the video controller only about 20 seconds of video are available in the player; when playback reaches the end, it stops as if it were a limited 20-second clip rather than continuing the live stream.
If I want to see the next few seconds, or get closer to the live feed, I have to dispose and reinitialize the controller.
I don't see this issue posted anywhere so where am I going wrong?
videoPlayerController = VideoPlayerController.network('https://example.com/abcdef.m3u8');

videoPlayerController.value.isInitialized
    ? Expanded(
        child: VideoPlayer(
          videoPlayerController,
        ),
      )
    : Text('Nope'),
SizedBox(
  height: screenHeight * 0.1,
),
SizedBox(
  width: screenHeight * 0.2,
  child: FloatingActionButton(
    heroTag: 'Start',
    onPressed: () async {
      await videoPlayerController.initialize();
      videoPlayerController.play();
      home.notifyListeners();
    },
    child: const Text('Start'),
  ),
),
I found the reason why I have been experiencing this problem.
In my testing I had also set up RTMP streaming through the camera plugin, and it was running while I tried to fetch the HLS video. I assumed the two were unrelated and shouldn't affect each other, but they may share an underlying library, which would explain the issue. With no streaming in progress, the HLS video plays normally as expected.
The app loads a few short (4-5 second) audio files during startup and plays them at certain events. It's basically working, except for one little detail on Android:
The first time an audio file is played, it sounds as if playback starts a few seconds into the file instead of at the beginning. Any following attempts to play the file work as expected; it's only the very first playback that starts a little too late into the file.
The question now is: how can this be fixed?
This is how the files are loaded during initialization of the app:
late AudioPlayer _someAudio;

init() async {
  AudioLoadConfiguration loadCfg = AudioLoadConfiguration(
    androidLoadControl: AndroidLoadControl(
      prioritizeTimeOverSizeThresholds: true,
      minBufferDuration: const Duration(seconds: 10),
      maxBufferDuration: const Duration(seconds: 20),
      bufferForPlaybackDuration: const Duration(seconds: 10),
    ),
  );
  _someAudio = AudioPlayer(audioLoadConfiguration: loadCfg);
  await _someAudio.setAsset('assets/audio/someeffect.mp3');
}
Note 1: AudioLoadConfiguration doesn't seem to have any effect here. It has just been added in an attempt to fix the issue.
And this is how the files are played:
void playSomeEffect() async {
  await _someAudio.seek(Duration.zero);
  await _someAudio.play();
}
Note 2: It seems to be an Android issue (iOS works as expected). It happens on a Samsung and a Motorola; other devices have not been tested yet.
Do you have any ideas how to ensure that the first playback starts at the beginning? Any advice is appreciated. Thank you.
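One workaround sketch (not from the question, and assuming the just_audio API used above): "warm up" the player once during init by playing it muted, so the decoder is already spun up when the first audible playback happens. The 200 ms delay is an arbitrary guess.

```dart
import 'package:just_audio/just_audio.dart';

// Hypothetical warm-up: call once after setAsset() during init.
Future<void> warmUp(AudioPlayer player) async {
  await player.setVolume(0);       // mute the throwaway playback
  player.play();                   // spin up the decoder/renderer
  await Future.delayed(const Duration(milliseconds: 200));
  await player.stop();
  await player.seek(Duration.zero); // rewind for the real playback
  await player.setVolume(1);        // restore volume
}
```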
So I am using the video_player package in my Flutter project and have made use of VideoProgressIndicator and it works exactly as intended:
Widget progBar(BuildContext context) {
  return VideoProgressIndicator(
    _controller,
    allowScrubbing: true, // user can touch/drag the bar to change the video position
    colors: VideoProgressColors(playedColor: Colors.red, bufferedColor: Colors.white),
    padding: const EdgeInsets.symmetric(vertical: 6, horizontal: 9),
  );
}
The issue:
I display timestamps as part of the player, but when the user uses the VideoProgressIndicator to change the position of the video, I don't get the new position, so I can't update the timestamps.
My question:
How can I get the new position of the video when the VideoProgressIndicator changes it?
Thank You!
OK, so I figured out that my solution was simply to create a custom progress bar that manipulates a timer.
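For anyone who wants the actual position rather than a separate timer, a sketch (assuming the `_controller` above and a hypothetical `_timestamp` state field): video_player exposes the current position on the controller's value, and the controller notifies listeners whenever it changes, including when the VideoProgressIndicator scrubs.

```dart
// Sketch: keep a timestamp label in sync with the controller's position.
_controller.addListener(() {
  final Duration position = _controller.value.position;
  setState(() {
    _timestamp = position; // hypothetical state field backing the label
  });
});
```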
I am working with a video player called flick_video_player. I can play videos fairly well with the default functionality. The problem occurs when I scroll down the screen and the video continues to play in the background. I would like to pause it when it isn't visible, or when the user navigates to a different page of the app.
The video player I am using (flick_video_player) has video_player as its dependency.
Answers are much appreciated.
Regards
I think you can use a visibility detector for this purpose:
VisibilityDetector(
  key: ObjectKey(flickManager),
  onVisibilityChanged: (visibility) {
    if (visibility.visibleFraction == 0 && this.mounted) {
      flickManager?.flickControlManager?.pause(); // pausing functionality
    }
  },
  child: Container(
    child: AspectRatio(
      aspectRatio: 1280 / 720,
      child: FlickVideoPlayer(
        flickManager: flickManager,
      ),
      /*VideoPlayer(
        video_controller
      ),*/
    ),
  ),
),
I was working on something similar. For more info, such as how to play the video again, you can refer to this repo: https://github.com/GeekyAnts/flick-video-player/tree/master/example/lib/feed_player
Hope it helped!
Maybe this visibility detector package can help https://pub.dev/packages/visibility_detector
Wrap your list of videos with a NotificationListener and listen to whether the user has started or stopped scrolling. Use this value to either play or pause your video.
Edit: misread your question. This will work for pausing once the user scrolls. If you want to detect whether the video is within the current view, check out ScrollablePositionedList.
return NotificationListener(
  onNotification: (notificationInfo) {
    if (notificationInfo is ScrollStartNotification) {
      // Set a state value to indicate the user is scrolling
    }
    if (notificationInfo is ScrollEndNotification) {
      // Set a state value to indicate the user stopped scrolling
    }
    return true;
  },
  child: YourVideos(),
);
This is exactly what you need, inview_notifier_list:
InViewNotifierList(
  isInViewPortCondition:
      (double deltaTop, double deltaBottom, double vpHeight) {
    return deltaTop < (0.5 * vpHeight) && deltaBottom > (0.5 * vpHeight);
  },
  itemCount: 10,
  builder: (BuildContext context, int index) {
    return InViewNotifierWidget(
      id: '$index',
      builder: (BuildContext context, bool isInView, Widget child) {
        return Container(
          height: 250.0,
          color: isInView ? Colors.green : Colors.red,
          child: Text(
            isInView ? 'Is in view' : 'Not in view',
          ),
        );
      },
    );
  },
);
I am having an issue with Flutter's video_player where sometimes the video hangs and the sound is out of sync with the video: there is a delay of ~1 second between the two. This only happens with certain videos; most play fine. I've checked that one of the affected videos is in the same format (mp4) as the others, and I've downloaded that video from my S3 bucket and confirmed it plays correctly in that case, so I believe it must be an issue with the video_player plugin. Here is my code to load the video controller. Is there any reason videos would behave differently with this plugin, with the audio and video out of sync?
void loadVideo() async {
  videoController = VideoPlayerController.network(videoLink);
  videoController.initialize().then((_) {
    if (!mounted) {
      // displays an error message if the video controller doesn't load
      videoError = true;
      setState(() {});
      return;
    }
    setState(() {});
  });
  videoController.addListener(_listenForError);
}

void playVideo() async {
  videoController.play();
}

Widget cameraWidget = Transform.scale(
  scale: videoController.value.aspectRatio / deviceRatio,
  child: AspectRatio(
    aspectRatio: videoController.value.aspectRatio,
    child: VideoPlayer(videoController),
  ),
);
Hi, this code works wonderfully with Angela in my Flutter course, haha, but for some reason player.play('note1.wav') isn't working for me. Tips? I'm getting this error:
error: The method 'play' isn't defined for the type 'Type'. (undefined_method at [xylophone] lib/main.dart:17)
import 'package:flutter/material.dart';
import 'package:audioplayers/audio_cache.dart';

void main() => runApp(XylophoneApp());

class XylophoneApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        body: SafeArea(
          child: Center(
            child: FlatButton(
              onPressed: () {
                final player = AudioCache;
                player.play('note1.wav');
              },
              child: Text('Click Me'),
            ),
          ),
        ),
      ),
    );
  }
}
I'm working on the same course! You're missing the parentheses on the AudioCache declaration. It should be:
final player = AudioCache();
I'd also do this for playing the audio, as it will stop making a sound after you press a few buttons:
player.play('note$note.wav',
    mode: PlayerMode.LOW_LATENCY,
    stayAwake: false);
It's better than the original code, but not perfect. I think there are 32 audio channels that get used up quite quickly - the above change seems to release them a bit quicker. Good luck with the rest of the course!
Migration Guide
Update the dependency in pubspec.yaml from
dependencies: audioplayers: ^0.x.x
to
dependencies: audioplayers: ^1.0.1
https://github.com/bluefireteam/audioplayers/blob/main/migration_guide.md
AudioCache is dead, long live Sources. One of the main changes was the maintainer's desire to "kill" the AudioCache API due to the vast confusion it caused among users (despite their best efforts at documenting everything).
We still have the AudioCache class, but its APIs are now exclusively dedicated to transforming asset files into local files, caching them, and providing the path. It normally doesn't need to be used by end users, however, because AudioPlayer itself is now capable of playing audio from any Source.
What is a Source? It's a sealed class that can be one of:

UrlSource: get the audio from a remote URL on the Internet
DeviceFileSource: access a file on the user's device, probably selected by a file picker
AssetSource: play an asset bundled with your app, normally inside the assets directory
BytesSource (only some platforms): pass in the bytes of your audio directly (read from anywhere)

If you use AssetSource, the AudioPlayer will use its instance of AudioCache (which defaults to the global cache if unchanged) automatically. This unifies all playing APIs under AudioPlayer and entirely removes the AudioCache detail for most users.
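As a quick sketch of the unified API (the URL and file path here are placeholders):

```dart
import 'package:audioplayers/audioplayers.dart';

Future<void> demoSources() async {
  final player = AudioPlayer();

  // Each Source subtype targets a different origin; the play call is identical.
  await player.play(UrlSource('https://example.com/clip.mp3'));   // remote URL
  await player.play(DeviceFileSource('/storage/music/clip.mp3')); // file on the device
  await player.play(AssetSource('note1.wav'));                    // bundled asset
}
```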
AudioCache is obsolete now, try this
import 'package:audioplayers/audioplayers.dart';
final player = AudioPlayer();
player.play(AssetSource('note1.wav'));
AudioCache is deprecated. It will not work now.
Instead use this code:
child: TextButton(
  onPressed: () async {
    final player = AudioPlayer();
    await player.play(
      AssetSource('note1.wav'),
    );
  },
  child: Text("Play me"),
),