I'm new to Flutter, and I'm building an app with the video_player library. It plays video from a local resource as well as from the internet using a direct URL such as https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4.
However, when I try to read a stream in chunks and feed it to video_player, it does not work.
I have created an API with the help of Asynchronously streaming video with ASP.NET Web API, which returns the Stream outputStream as its return value.
Note: the video is stored on a file server that is separate from the API server, though both are on the same network. The API server reads the data in chunks from the actual URL, which is not publicly accessible, and the stream plays fine in an HTML video tag.
Does video_player support this kind of streaming, or do I need to use another player? A minimal sketch of my Flutter side is below.
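For reference, this is roughly how I point video_player at my endpoint. The https://my-api-server/api/video/stream URL is a placeholder for my real one, and newer versions of the plugin use VideoPlayerController.networkUrl instead of .network:

```dart
import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class StreamPlayerPage extends StatefulWidget {
  const StreamPlayerPage({Key? key}) : super(key: key);

  @override
  State<StreamPlayerPage> createState() => _StreamPlayerPageState();
}

class _StreamPlayerPageState extends State<StreamPlayerPage> {
  late VideoPlayerController _controller;

  @override
  void initState() {
    super.initState();
    // Placeholder URL: my ASP.NET Web API endpoint that returns the chunked stream.
    _controller = VideoPlayerController.network(
        'https://my-api-server/api/video/stream')
      ..initialize().then((_) => setState(() {}));
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return _controller.value.isInitialized
        ? AspectRatio(
            aspectRatio: _controller.value.aspectRatio,
            child: VideoPlayer(_controller),
          )
        : const Center(child: CircularProgressIndicator());
  }
}
```

As I understand it, video_player just hands the URL to the platform player (ExoPlayer on Android, AVPlayer on iOS), so I assume the endpoint needs to behave like a normal progressive-download URL, including support for HTTP range requests if seeking is expected to work.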
Please let me know whether I am heading in the right direction.
I am sure the details provided are not enough, but as a Flutter beginner this is what I can provide here.
The backend API sends a URL to the video that can be used as the video player's source. The issue is that the source can vary between m4v, mp4, and other formats. I want to support the major formats, if not all, in the app.
The backend is in Django, and I am open to modifying the API code as well.
If the video formats need conversion, please suggest the fastest way, with an ETA for a ~100 MB file.
I have tried the video_player plugin from pub.dev. At worst, I am thinking of integrating flutter_ffmpeg (see the sketch below), but I don't want users to wait, as that would ruin their UX.
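If on-device conversion turns out to be unavoidable, this is roughly what I have in mind with flutter_ffmpeg. The file paths are placeholders, and I am assuming the -c copy container rewrap applies to most of my files; available encoders also depend on which flutter_ffmpeg build variant is installed:

```dart
import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

final FlutterFFmpeg _ffmpeg = FlutterFFmpeg();

/// Convert a downloaded file into MP4 so the player can handle it.
/// Both paths are placeholders.
Future<String?> toMp4(String inputPath, String outputPath) async {
  // First try a pure container rewrap: -c copy does no re-encoding,
  // so even a ~100 MB file should take seconds rather than minutes.
  var rc = await _ffmpeg.execute('-i $inputPath -c copy $outputPath');
  if (rc == 0) return outputPath;

  // Fall back to a real transcode, which is CPU-bound on mobile and
  // can easily take minutes for a ~100 MB file.
  rc = await _ffmpeg.execute(
      '-i $inputPath -c:v mpeg4 -c:a aac $outputPath');
  return rc == 0 ? outputPath : null;
}
```

If most of the sources are already H264 in an m4v/mp4 container, the rewrap path should cover them, since m4v is essentially an mp4 variant.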
I want to stream video to a Kinesis stream from Flutter. I have searched through the Kinesis documentation but couldn't find any SDK available for Flutter.
Is there any library available to do this?
If anyone has done it before, I would really appreciate the help.
As far as I know, there is currently no Flutter implementation of the Kinesis Video "Producer" SDK.
However, there is an Android implementation, so what I would suggest is to add that Android native code to your project and call it from the Flutter side.
The Flutter camera library can be modified to work with the Kinesis Producer SDK.
Or, as @Andrija said, a REST API can be used as a proxy for Kinesis. The downside is that audio won't be streamed; you might need to put the audio and video into a container (MKV/MP4) and send that.
Having said all that, if you can somehow encode the video and audio (MKV/MP4) from Flutter, then you can use putRecord in the aws_kinesis_api Flutter package to send it to Kinesis, along the lines of the sketch below. Nowhere does it say the payload has to be video/audio; this is a generic API for putting data into a stream.
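A minimal sketch of that putRecord idea, assuming the aws_kinesis_api package from pub.dev; the stream name, partition key, region, and credentials are all placeholders, so check the package docs for the exact constructor:

```dart
import 'dart:typed_data';

import 'package:aws_kinesis_api/kinesis-2013-12-02.dart';

/// Push one encoded chunk (e.g. an MKV/MP4 fragment) into a Kinesis
/// data stream. All identifiers below are placeholders.
Future<void> sendChunk(Uint8List chunk) async {
  final kinesis = Kinesis(
    region: 'us-east-1',
    credentials: AwsClientCredentials(
      accessKey: 'YOUR_ACCESS_KEY',
      secretKey: 'YOUR_SECRET_KEY',
    ),
  );

  // Kinesis records are capped at 1 MB, so larger fragments would
  // have to be split across multiple putRecord calls.
  await kinesis.putRecord(
    streamName: 'my-video-stream',
    partitionKey: 'camera-1',
    data: chunk,
  );

  kinesis.close();
}
```

Keep in mind this lands the data in a plain Kinesis Data Stream, not in Kinesis Video Streams, so a consumer on the other side has to reassemble the fragments itself.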
EDIT: This confirms that, as of the current SDK, there is no Kinesis Producer code for Flutter: https://github.com/agilord/aws_client/issues/242#issuecomment-860731956
Hi everybody. I am trying to develop a web application that can control a Smart TV, following this guide: http://samsungdforum.com/Guide/tut00024/index.html. That works fine, but now I would like to upload a video from my computer and have it display on the Smart TV, like the image shown in the tutorial. Does anyone have an idea, example, or suggestion about how to modify the code to do that? I would like to modify the convergence tutorial code so that the client application can send a message or a video to the Smart TV application.
Sending files is covered by the tutorial. You can find the API reference for this here.
Sending a video file is not exactly a wise thing, because there is a 3 MB limit on files sent using the Convergence API. This API is designed for sending messages between the TV and an external client rather than files. If you want to launch video playback, send the video URL from the web app to the TV and let the TV download the video by itself.
Generally, I want to make an iPhone app with video chat functionality. But after many searches, I am still not able to find any successful results. Is there any public, or for that matter private, API available for doing this on the iPhone? If the answer is yes, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. I have already checked iDoubs, but it failed: it always shows some unknown problem and, for that reason, doesn't allow me to connect to anyone.
ALSO: the suggested method I have found is HTTP Live Streaming. But I have multiple doubts about that too.
1.) How do I upload my video from the iPhone to the HTTP server from which I would be broadcasting?
2.) Can you please post something about setting up the server? How do I feed the video to the FFMPEG server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live. How do I handle that?
It would be best if you could help me get iDoubs working properly.
Thank you so much for any kind of support!
Have a look at this: how to implement video chat on the iPhone. But before starting, you must have an IMS server up and running.
Here is the live video chat framework you are looking for. It is easy and simple to implement for face-to-face video chat. I have already tried it, and it works very well. The great thing about this framework is its multi-platform support.
Tokbox : https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining OpenTok using Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client-to-server, go for traditional streaming instead. There is an open library for streaming; see this question.
Whilst it is possible for FaceTime to do two-way chat, it is not certain that you will be able to do so using public iOS APIs. That said, I have implemented one-way live streaming for the iPhone, and the difficult part was not the core streaming itself but the encoding of the payload. You will be able to do H264 in hardware and AAC/iLBC in software.
How you feed this to FFMPEG depends on your transport, possibly changing from 'file' H264 frames to 'streaming' H264. Check out the H264 frame types if you implement frame dropping; reconfiguring the H264 encoder on the fly is not possible to my knowledge, but restarting it with fresh parameters typically takes no more than a second or so.
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H264 encoding, please post it here ;-)
I have just started working on live streaming on the iPhone, so any help on how to do live streaming on the iPhone would be appreciated. I think that if I add an HTML5 video tag and then load that HTML in a UIWebView, it will work.
Am I right? If not, what is your suggestion for doing live streaming? I want to embed some news channels' live streaming links in the application; where can I find those links?
You have to go through the HTTP Live Streaming documentation provided by Apple. There are some sample live streaming URLs; the file extension will be .m3u8. If you want to set up your own web server, you have to configure an FFMPEG server on it. These links will help:
1) Apple document
2) stackoverflow
3) stackoverflow
4) stackoverflow
If you're making a web app in HTML5, then the video tag is a good choice.
But if you're developing a native app, then MPMoviePlayerController would be a much better choice. There are many examples of how to use it online.
iOS doesn't support RTMP or RTSP, so your stream would need to be an HTTP Live Stream. From memory, the codec choice is very limited too; e.g., if you supply H264+MP3 you won't get any sound, despite iOS supporting MP3.
Also remember that streams from other broadcasters (such as the BBC) will normally be protected by international copyright law, so unless you have prior permission to use their stream in your app, you may be breaking the law.
Apple has some nice resources on HTTP Live Streaming.