I have seen that most articles and videos show how to decode JSON with Flutter isolates.
I am curious to know how other developers are using isolates, and for which kinds of tasks.
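For reference, the pattern those articles show boils down to something like this, a minimal sketch using compute() from flutter/foundation (the parse function and payload here are made up):

    import 'dart:convert';

    import 'package:flutter/foundation.dart';

    // Top-level function so compute() can run it in a background isolate.
    List<Map<String, dynamic>> parseItems(String body) {
      final decoded = jsonDecode(body) as List<dynamic>;
      return decoded.cast<Map<String, dynamic>>();
    }

    Future<List<Map<String, dynamic>>> loadItems(String jsonBody) {
      // compute() spawns an isolate, runs parseItems there, and returns the result.
      return compute(parseItems, jsonBody);
    }

The same pattern applies to any CPU-heavy, self-contained function, which is why I'm wondering what else people offload this way.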
I'm working on an app that works with audio in real time, measuring things such as intensity. I need it to run on Android, iOS, and the web. I have already covered Android and iOS using the flutter_sound package, which gives me access to the byte stream so I can do my processing from there. However, this is not possible on the web, since the codec flutter_sound uses to expose the stream is not supported by web browsers.
Is there any package for the web that allows byte-level handling, or any workaround for this?
I've been searching with no success. So far, all the packages I have found that support the web only allow access to the data through files.
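The closest direction I can think of is dropping down to the browser's Web Audio API via dart:web_audio for the web build only; a rough sketch of what I mean, assuming a ScriptProcessorNode (deprecated, but still widely supported):

    // Web-only sketch: read raw microphone samples via dart:html / dart:web_audio.
    import 'dart:html';
    import 'dart:typed_data';
    import 'dart:web_audio';

    Future<void> startMicCapture(void Function(Float32List samples) onSamples) async {
      final MediaStream mic =
          await window.navigator.mediaDevices!.getUserMedia({'audio': true});

      // Some browsers keep the context suspended until a user gesture.
      final context = AudioContext();
      final source = context.createMediaStreamSource(mic);

      // 4096-sample buffers, mono in, mono out.
      final processor = context.createScriptProcessor(4096, 1, 1);
      processor.onAudioProcess.listen((AudioProcessingEvent e) {
        // Float32 PCM samples in [-1.0, 1.0]; intensity/RMS can be computed here.
        onSamples(e.inputBuffer!.getChannelData(0));
      });

      source.connectNode(processor);
      processor.connectNode(context.destination!);
    }

But I would much rather use a package that wraps this (and keeps flutter_sound for Android/iOS behind a conditional import), if one exists.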
I want to share content (an image, audio, video, or document) to other platforms using the system's sharing integration, and I found the share_plus package for this.
share_plus shows every platform/app that can handle the content (image, audio, video, or document). In my case, however, I want to show users only a fixed set of platforms they can share the content to.
So is there any way to restrict which platforms users can share the content to?
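For reference, this is roughly what my share call looks like now with a recent share_plus version; as far as I can tell, it just hands the content to the OS share sheet, which decides which targets appear (the file path is a placeholder):

    import 'package:share_plus/share_plus.dart';

    Future<void> shareReport() async {
      // Opens the native share sheet; the OS decides which target apps are listed.
      await Share.shareXFiles(
        [XFile('/storage/emulated/0/Download/report.pdf')], // placeholder path
        text: 'Here is the report',
      );
    }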
I am trying to create a video streaming app in Flutter, using Mux. I also want to track user data to help recommend more specific videos per user. However, I do not know exactly how to integrate mux/data with the video_player plugin in Flutter. Have I misunderstood how these plugins/services work? I am using Flutter for my frontend and Python for the backend.
Links:
video_player: https://pub.dev/packages/video_player
mux/data: https://docs.mux.com/guides/data/track-your-video-performance
You could use this player instead, which is functional: https://pub.dev/packages/mux_player
However, I'm struggling to make it autoplay and to keep it from going fullscreen when you play it, which feels kind of weird. It does track all the data as you would expect.
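If you only need playback (without the Mux Data part), video_player can also play a Mux asset directly through its HLS URL; a minimal sketch, where YOUR_PLAYBACK_ID is a placeholder:

    import 'package:flutter/material.dart';
    import 'package:video_player/video_player.dart';

    class MuxVideo extends StatefulWidget {
      const MuxVideo({super.key});

      @override
      State<MuxVideo> createState() => _MuxVideoState();
    }

    class _MuxVideoState extends State<MuxVideo> {
      late final VideoPlayerController _controller;

      @override
      void initState() {
        super.initState();
        // Mux serves assets as HLS at stream.mux.com/<playback id>.m3u8.
        _controller = VideoPlayerController.networkUrl(
          Uri.parse('https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8'),
        )..initialize().then((_) => setState(() {}));
      }

      @override
      void dispose() {
        _controller.dispose();
        super.dispose();
      }

      @override
      Widget build(BuildContext context) {
        return _controller.value.isInitialized
            ? AspectRatio(
                aspectRatio: _controller.value.aspectRatio,
                child: VideoPlayer(_controller),
              )
            : const SizedBox.shrink();
      }
    }

Note that this covers playback only; it does not report anything to Mux Data by itself.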
So I want to stream video to a Kinesis stream from Flutter. I have searched through the Kinesis documentation but couldn't find any SDK available for Flutter.
Is there any library available to do it?
Or, if anyone has done it before, I would really appreciate the help.
As far as I know, there is currently no Flutter implementation of the Kinesis Video "Producer" SDK.
However, there is an Android implementation, so what I would suggest is adding that Android native code to your project and calling it from the Flutter side over a platform channel.
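A minimal sketch of the Flutter side of such a platform channel (the channel and method names here are made up; the Android side would wrap the Kinesis Video Producer library and register the same channel):

    import 'package:flutter/services.dart';

    // Hypothetical channel name; the Android implementation must use the same one.
    const _kvsChannel = MethodChannel('com.example.app/kvs_producer');

    Future<void> startKvsStreaming(String streamName) async {
      try {
        await _kvsChannel.invokeMethod<void>('startStreaming', {
          'streamName': streamName,
          'region': 'us-east-1',
        });
      } on PlatformException catch (e) {
        // The native side reports failures (permissions, credentials, ...) here.
        print('KVS streaming failed: ${e.message}');
      }
    }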
The Flutter camera library can be modified to work with the Kinesis Producer SDK.
Or, like @Andrija said, a REST API can be used as a proxy for Kinesis. The downside is that audio won't be streamed on its own; you might need to put the audio and video into a container (MKV/MP4) and send that.
Having said all that, if you can somehow encode the video and audio (MKV/MP4) from Flutter, then you can use putRecord in the aws_kinesis_api Flutter package to send it to Kinesis. Nowhere does it say the payload has to be video/audio; it is a generic API for putting data into the stream.
EDIT: This confirms that, as of the current SDK, there is no Kinesis Producer code for Flutter: https://github.com/agilord/aws_client/issues/242#issuecomment-860731956
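A rough sketch of that putRecord path with aws_kinesis_api (stream name, region, and credentials are placeholders, and the import path and generated API shape may differ between package versions):

    import 'dart:typed_data';

    import 'package:aws_kinesis_api/kinesis-2013-12-02.dart';

    Future<void> sendChunk(Uint8List mkvChunk) async {
      // Don't ship long-lived AWS credentials in a client app; this is only a sketch.
      final kinesis = Kinesis(
        region: 'us-east-1',
        credentials: AwsClientCredentials(
          accessKey: 'YOUR_ACCESS_KEY', // placeholder
          secretKey: 'YOUR_SECRET_KEY', // placeholder
        ),
      );

      // putRecord accepts arbitrary bytes; here the payload is an encoded MKV/MP4 chunk.
      await kinesis.putRecord(
        streamName: 'my-video-stream', // placeholder
        partitionKey: 'camera-1',
        data: mkvChunk,
      );
    }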
I'm trying to build a simple audio-conferencing app similar to Clubhouse in Flutter, but with only the bare bones. I was trying to get started with the flutter_webrtc plugin because I thought WebRTC might be a good fit, but I couldn't find a way to do audio rooms or anything. I have googled for ages. Is there a way to get this done?
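For context, this is as far as I got with flutter_webrtc: a minimal audio-only sketch, with the signaling (the part that would actually make it a "room") left as a stub, because I don't know how to build that part:

    import 'package:flutter_webrtc/flutter_webrtc.dart';

    Future<RTCPeerConnection> joinAudioRoom({
      required Future<void> Function(RTCSessionDescription offer) sendOfferToSignaling,
      required Future<void> Function(RTCIceCandidate candidate) sendCandidateToSignaling,
    }) async {
      // Audio only, no video: the Clubhouse-style case.
      final MediaStream mic = await navigator.mediaDevices
          .getUserMedia({'audio': true, 'video': false});

      final pc = await createPeerConnection({
        'iceServers': [
          {'urls': 'stun:stun.l.google.com:19302'},
        ],
      });

      // Publish our microphone track.
      for (final track in mic.getAudioTracks()) {
        await pc.addTrack(track, mic);
      }

      // Remote speakers arrive as tracks; audio tracks should play without a renderer widget.
      pc.onTrack = (RTCTrackEvent event) {};

      pc.onIceCandidate = (RTCIceCandidate candidate) {
        sendCandidateToSignaling(candidate);
      };

      // The offer/answer exchange has to go through some signaling channel
      // (WebSocket, Firestore, ...) that I would still need to build.
      final offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      await sendOfferToSignaling(offer);

      return pc;
    }

Is building my own signaling server the expected approach here, or is there a higher-level package or hosted service (Agora, LiveKit, etc.) people normally use for audio rooms?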