SoundCloud player widget ignores show_artwork setting - soundcloud

The SoundCloud Player Widget always shows the artwork, even if I set show_artwork=false. Are there plans to support this parameter? If this parameter is no longer supported, it should be removed from the widget docs at
https://developers.soundcloud.com/docs/widget

Are you looking at the old Flash player or the new (where new = since 2011) HTML5 player?
Here's a player with artwork turned off:
https://w.soundcloud.com/player/?url=http://api.soundcloud.com/tracks/33427584&show_artwork=false
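For completeness, the same parameter works in a standard HTML5 widget embed (a minimal sketch, using the track from the URL above):

<iframe width="100%" height="166" scrolling="no" frameborder="no"
  src="https://w.soundcloud.com/player/?url=http%3A//api.soundcloud.com/tracks/33427584&show_artwork=false">
</iframe>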

Related

Displaying Multiple Videos In Flutter

I'm trying to display a list of videos in my UI, just like the YouTube app does. But my issue is that I just can't find out how to display the list of videos using the VideoPlayer plugin. The video player plugin takes just one VideoPlayerController, and I don't know how to assign the whole list of videos to the controller so that when the user taps on any video it plays. Or would I have to create lots of VideoPlayerControllers in a loop and assign one to each video in the list?
Add the dependency:
chewie: ^0.9.10
Overall, for showing a YouTube-like list, you need separate thumbnail images, so that when one is clicked the specific video can be played.
Run the code below, which has everything you need:
https://github.com/codemissions/flutter-video-streaming-app
Just create a new variable called path:
String? _path;
then:
_controller = VideoPlayerController.asset(_path!);
Now you can pass in the path of whichever resource you have.
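To tie those pieces together, here is a minimal sketch (the widget name and asset paths are made up for illustration): each list tile stands in for a thumbnail, and a single controller is created only for the video that was tapped, rather than one controller per video.

import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class VideoListScreen extends StatefulWidget {
  const VideoListScreen({super.key});

  @override
  State<VideoListScreen> createState() => _VideoListScreenState();
}

class _VideoListScreenState extends State<VideoListScreen> {
  // Hypothetical video assets; pair each with its own thumbnail image.
  final List<String> _videos = ['assets/first.mp4', 'assets/second.mp4'];
  VideoPlayerController? _controller;

  Future<void> _play(String path) async {
    // Dispose the previous controller instead of keeping one per video.
    await _controller?.dispose();
    _controller = VideoPlayerController.asset(path);
    await _controller!.initialize();
    setState(() {});
    await _controller!.play();
  }

  @override
  Widget build(BuildContext context) {
    return Column(children: [
      if (_controller != null && _controller!.value.isInitialized)
        AspectRatio(
          aspectRatio: _controller!.value.aspectRatio,
          child: VideoPlayer(_controller!),
        ),
      Expanded(
        child: ListView(
          children: [
            for (final path in _videos)
              ListTile(
                // A real app would show the video's thumbnail image here.
                leading: const Icon(Icons.play_circle_outline),
                title: Text(path),
                onTap: () => _play(path),
              ),
          ],
        ),
      ),
    ]);
  }

  @override
  void dispose() {
    _controller?.dispose();
    super.dispose();
  }
}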

Overlay text and images to YouTube live stream from Flutter app

I am looking into creating a Flutter mobile app that live streams to YouTube using the YouTube Live Streaming API. I have checked the API and found that it does not offer a way to overlay text and images onto the livestream. How would I achieve this using Flutter?
I imagine this involves using the Stack widget to overlay content on top of the user's video feed. However this would somehow need to be encoded into the video stream to be sent to YouTube.
This type of work is usually done with FFmpeg.
See this discussion for more info: https://video.stackexchange.com/questions/12105/add-an-image-overlay-in-front-of-video-using-ffmpeg
FFmpeg for mobile devices is made available by this project:
https://github.com/tanersener/mobile-ffmpeg
And then, as always, there is a Flutter package, flutter_ffmpeg, that exposes these features in Flutter:
https://pub.dev/packages/flutter_ffmpeg
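For instance, burning an image overlay into a video with flutter_ffmpeg might look like this (a sketch; the file names and filter string are illustrative, and a live stream would need an RTMP output instead of a file):

import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

final FlutterFFmpeg _ffmpeg = FlutterFFmpeg();

Future<void> burnInOverlay() async {
  // Draw logo.png on top of every frame of input.mp4 at position (10, 10).
  final int rc = await _ffmpeg.execute(
      '-i input.mp4 -i logo.png -filter_complex "overlay=10:10" '
      '-codec:a copy output.mp4');
  print(rc == 0 ? 'overlay succeeded' : 'overlay failed, rc=$rc');
}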
TL;DR: You can use CameraController (camera package) and Canvas in Flutter for drawing the text. Unfortunately, CameraController.startImageStream is not documented in the API docs, and documenting it has been an open GitHub issue for over a year.
Every time the camera plugin gives you a video frame via controller.startImageStream((CameraImage img) { /* your code */ }), you can draw the image onto a canvas, draw the text, capture the result and call the YouTube API. You can see an example of using the video buffer in the Tensorflow Lite package here, or read more info at this issue.
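A minimal sketch of that frame loop (assumes the camera plugin is set up; the per-frame work is left as a stub):

import 'package:camera/camera.dart';

Future<CameraController> startFrameStream() async {
  final cameras = await availableCameras();
  final controller = CameraController(cameras.first, ResolutionPreset.medium);
  await controller.initialize();
  // The callback fires for every preview frame; img.planes contains the
  // raw YUV420 (Android) or BGRA (iOS) buffers.
  await controller.startImageStream((CameraImage img) {
    // Convert the frame, draw it and the overlay text on a Canvas,
    // then hand the composited frame to the encoder/upload step.
  });
  return controller;
}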
On this same canvas, you can draw whatever you want, like drawArc, drawParagraph, drawPoints. It gives you ultimate flexibility.
A simple example of capturing the canvas contents is below, where I have previously saved the strokes in state. (You should use the details of your text instead, and just pull the latest frame from the camera.):
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
import 'package:image/image.dart' as img;

Future<img.Image> getDrawnImage() async {
  ui.PictureRecorder recorder = ui.PictureRecorder();
  Canvas canvas = Canvas(recorder);
  canvas.drawColor(Colors.white, BlendMode.src);
  // StrokesPainter and InheritedStrokesHistory are from my app's state;
  // substitute your own painter that draws the overlay text.
  StrokesPainter painter = StrokesPainter(
      strokes: InheritedStrokesHistory.of(context).strokes);
  painter.paint(canvas, deviceData.size);
  ui.Image screenImage = await (recorder.endRecording().toImage(
      deviceData.size.width.floor(), deviceData.size.height.floor()));
  ByteData imgBytes =
      await screenImage.toByteData(format: ui.ImageByteFormat.rawRgba);
  return img.Image.fromBytes(deviceData.size.width.floor(),
      deviceData.size.height.floor(), imgBytes.buffer.asUint8List());
}
I was going to add a link to an app I made which allows you to draw and screenshot the drawing into your phone gallery (it also uses Tensorflow Lite), but the code is a little complicated. It's probably best to clone it and see what it does if you are struggling with capturing the canvas.
I initially could not find the documentation on startImageStream and had forgotten that I had used it for Tensorflow Lite, so I suggested using MethodChannel.invokeMethod and writing iOS/Android-specific code. Keep that in mind if you find any limitations in Flutter, although I don't think Flutter will limit you in this problem.

Is it possible to stream endless audio in flutter?

I'm developing an online radio app in flutter and I'm looking for an audio player which supports endless audio streaming from a certain URL (e.g. http://us4.internet-radio.com:8258/stream?type=http). It is highly desirable for it to be supported both on iOS and Android.
Is there such an option in Flutter?
From what I've found, there are no solutions that satisfy my needs. The closest one is fluttery_audio, but, apparently, it doesn't support endless audio.
I apologize for my jargon with "endless audio streaming"; I'm not really sure what the technical term for an online radio player is.
Try flutter_webview_plugin and hide it:
https://pub.dev/packages/flutter_webview_plugin
final flutterWebviewPlugin = new FlutterWebviewPlugin();
flutterWebviewPlugin.launch(url, hidden: true);
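Wrapped up as a tiny helper (a sketch; the stream URL is the one from the question, and close() tears down the hidden page and with it the audio):

import 'package:flutter_webview_plugin/flutter_webview_plugin.dart';

class HiddenRadio {
  final FlutterWebviewPlugin _webview = FlutterWebviewPlugin();

  // Load the stream in an invisible webview; the platform's media stack
  // keeps the audio playing for as long as the page stays open.
  void play() => _webview.launch(
      'http://us4.internet-radio.com:8258/stream?type=http',
      hidden: true);

  void stop() => _webview.close();
}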
You can also try the flutter radio package; check out the working app here...
Source code

Marmalade SDK: is there a way to skip/seek to a part of a video in s3e video play?

I am currently using s3eVideo in the Marmalade SDK to play a video in my project after a button event. I attempted to find a way to implement a slider bar (or something similar) to seek back and forth in the video, but I am unsure whether this feature is even supported. Otherwise, is there a way to open a native video player outside of the app and play the video that way, with the seek feature I need?
Any help would be greatly appreciated.
There doesn't seem to be a way of finding out the length of the video, but s3eVideoSetInt(S3E_VIDEO_POSITION, timeInMilliseconds) should do the trick.
I guess it will depend on whether the frame index in the file is usable and whether the specific platform supports it. I have only really done play/stop video in Marmalade, so you may have to try using this function while playing, while paused, etc., and see what works and what errors.
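Something like this sketch (untested; SeekTo is a hypothetical helper, and S3E_VIDEO_POSITION support varies by platform):

#include "s3eVideo.h"

// Hypothetical helper: jump to a position (in milliseconds) in the
// currently playing video, if the platform supports seeking.
void SeekTo(int32 timeInMilliseconds)
{
    if (s3eVideoIsPlaying())
    {
        if (s3eVideoSetInt(S3E_VIDEO_POSITION, timeInMilliseconds) != S3E_RESULT_SUCCESS)
        {
            // Seeking failed or is unsupported on this platform; handle the error.
        }
    }
}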

How to set current playback duration and elapsed time on iOS 7 lockscreen?

Starting from iOS 5, every music player can set the currently playing music information, such as title, artist, album title, and artwork, on [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo to show on the lock screen.
On iOS 7, a playback position slider, the duration, and the elapsed time were added to both the lock screen and Control Center. However, I cannot find any documentation on how to set this kind of information and enable the slider to change the playback position.
Is there any way to solve this problem?
You need to set the playback rate to 1.0f even though the documentation says it's 1.0 by default:
NSDictionary *mediaInfo = @{
    MPMediaItemPropertyTitle: audio.title,
    MPMediaItemPropertyArtist: audio.artist,
    MPMediaItemPropertyPlaybackDuration: audio.duration,
    MPNowPlayingInfoPropertyPlaybackRate: @(1.0)
};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];
They're all documented in the reference for MPNowPlayingInfoCenter. The currently playing properties are optional values that may or may not be set. The link to that is in the sentence at the end of the list of normal playing properties:
Additional properties you can set are described in this document in "Additional Metadata Properties." (emphasis mine)
The properties that you are interested in are: MPNowPlayingInfoPropertyElapsedPlaybackTime and MPMediaItemPropertyPlaybackDuration.
This information is all publicly available, and as the iOS 7 SDK does not seem to be published yet (as of 2013-09-14), I presume it was available prior to that version of iOS as well.
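A hedged sketch of setting those two properties (the duration and elapsed values are placeholders you would read from your own player):

NSMutableDictionary *info = [NSMutableDictionary dictionary];
info[MPMediaItemPropertyTitle] = @"Track title";
// Both values are in seconds, wrapped as NSNumbers.
info[MPMediaItemPropertyPlaybackDuration] = @(240.0);        // total length
info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = @(42.0); // current position
// A rate of 1.0 lets iOS advance the elapsed time on its own while playing.
info[MPNowPlayingInfoPropertyPlaybackRate] = @(1.0);
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;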
Just be warned: Apple's documentation never made this clear, but if you use MPMusicPlayerController, your music is played under the hood by the Music app and you do NOT have any control over nowPlayingInfoCenter. You will also NOT receive remote-control events generated by user actions (such as play/pause) on the lock screen, because those events are propagated via the nowPlayingInfoCenter to the Music app, not to yours. When using other media players, such as AVPlayer or AVAudioPlayer, you can control the nowPlayingInfoCenter and receive the remote-control events. But if you set up the AV player with AVAudioSessionCategoryOptions.MixWithOthers, you can't control nowPlayingInfoCenter either. I wish Apple documented those details better.