I'm trying to upload a video to S3, but nothing works.
We could do it with native iOS and Android code and bridge it to Flutter over a platform channel, but that seems too expensive in terms of time and complexity.
Another option is to send the media file to our backend server and have the server upload it to the S3 bucket.
But I'm trying to figure out a way to do this from the front-end side itself.
Thanks for any suggestions.
I'm new to Flutter, and I'm creating an app with the video_player library, where the video plays from a local resource as well as from the internet via a direct URL like https://flutter.github.io/assets-for-api-docs/assets/videos/bee.mp4.
However, I'm trying to read a stream in chunks and feed them to video_player, and it does not seem to work.
I created an API with the help of "Asynchronously streaming video with ASP.NET Web API", which returns the Stream outputStream as its return value.
Note: the video is stored on a file server separate from the API server, and both are on the same network. The API server can read the data in chunks from the actual URL, which is not publicly accessible, and that works fine in an HTML video tag.
Does video_player support such streaming, or do I need to use another player?
Please let me know whether I'm heading in the right direction.
I'm sure the details provided are not enough, but as a Flutter beginner this is what I can provide here.
Best Regards,
Hi ten
I am trying to create a web server (REST APIs) that can store, organise, and stream videos in response to client requests.
My confusion:
How should a user upload videos? From my research, I decided to store all the video metadata in a database (Google Datastore) and all the video files in separate storage (Google Cloud Storage). Now, what is the proper way to upload videos?
Once a video is uploaded and stored, how will the streaming happen? Suppose a user makes a request to watch a video; the server will get an HTTP request for it. But how do I stream the video? Is there a service for this? I suspect that implementing HTTP streaming directly in my own code would hurt performance.
As I understand it, I want a service that can stream videos from my storage to a client at the server's request. I assume the server should make a request to this "video streaming service" only after verifying the user's credentials.
For question 1 (how to enable customers to upload objects), signed URLs are a good bet.
Question 2 is a lot bigger. Depending on your needs you could simply point clients to GCS video files, but modern media serving is a bit more advanced than that. You may want to look into using GCE with a streaming video service, for example something like Wowza. Google offers a click-to-deploy experience for it: http://cloud.google.com/tryitnow/wowza
(Keep in mind that Wowza is a separate product requiring a paid license. I have no experience with it and neither advocate for nor disapprove of it.)
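To make the signed-URL idea concrete: in practice you would call your SDK's helper (for example boto3's generate_presigned_url for S3, or the Google Cloud Storage client's signed-URL support for GCS), but the mechanism is easy to see in a stdlib-only sketch of AWS-style V4 query presigning. The bucket name, key, and credentials below are placeholders, not real values:

```python
import datetime
import hashlib
import hmac
from urllib.parse import quote

def presign_put(bucket, key, access_key, secret_key,
                region="us-east-1", expires=3600):
    """Build an AWS Signature V4 presigned PUT URL for S3.

    The client (e.g. the Flutter app) can then upload the video with a
    plain HTTP PUT to this URL -- no cloud credentials on the device.
    """
    host = f"{bucket}.s3.{region}.amazonaws.com"
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    scope = f"{datestamp}/{region}/s3/aws4_request"

    # Query parameters, sorted by name as SigV4 requires.
    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    canonical_query = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}"
        for k, v in sorted(params.items())
    )
    canonical_request = "\n".join([
        "PUT",
        "/" + quote(key),
        canonical_query,
        f"host:{host}\n",      # canonical headers block, each header ends in \n
        "host",                # signed header names
        "UNSIGNED-PAYLOAD",    # payload hash is not pinned for presigned PUTs
    ])
    string_to_sign = "\n".join([
        "AWS4-HMAC-SHA256",
        amz_date,
        scope,
        hashlib.sha256(canonical_request.encode()).hexdigest(),
    ])
    # Derive the signing key: an HMAC chain over date, region, service.
    k = hmac.new(("AWS4" + secret_key).encode(), datestamp.encode(),
                 hashlib.sha256).digest()
    for part in (region, "s3", "aws4_request"):
        k = hmac.new(k, part.encode(), hashlib.sha256).digest()
    signature = hmac.new(k, string_to_sign.encode(), hashlib.sha256).hexdigest()
    return f"https://{host}/{quote(key)}?{canonical_query}&X-Amz-Signature={signature}"

url = presign_put("my-bucket", "videos/clip.mp4", "AKIAEXAMPLE", "secretEXAMPLE")
```

The server generates this URL after authenticating the user and hands it to the client, which uploads directly to the bucket; the credentials never leave the backend.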
I am trying to stream music/video on an iPhone using HTTP Live Streaming. I read the Apple docs on HTTP Live Streaming (http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html), and I understand how it works.
What they don't say is how one would use the iPhone as a server. Do I have to bundle the tools (mediastreamsegmenter, variantplaylistcreator) into my iOS app and run them as an NSTask, or is there some kind of native support for streaming media files?
If you really want to stream from an iPhone app, you can't do it with the iPhone acting as a server. You need a separate server to which the iPhone app sends its data. You can use the camera or microphone in the app to capture live content and send the data to the server asynchronously. The server, using mediastreamsegmenter and variantplaylistcreator, converts the data into .ts segments and appends them to the end of the m3u8 file; meanwhile, another iPhone app can act as a client and watch the live content you are streaming from the first app.
In my experience this is the only way to achieve that. Hope that helps.
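For reference, the live playlist that mediastreamsegmenter maintains looks roughly like the fragment below (segment names and durations are illustrative). Note there is no #EXT-X-ENDLIST tag while the stream is live, and #EXT-X-MEDIA-SEQUENCE advances as new segments are appended and old ones expire:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:27
#EXTINF:10.0,
fileSequence27.ts
#EXTINF:10.0,
fileSequence28.ts
#EXTINF:10.0,
fileSequence29.ts
```

The client re-fetches this playlist periodically and downloads whichever segments it hasn't seen yet.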
I am making an iPhone application that will send video to a server for live streaming, and I wanted to know: do we require a media server for this?
Yes, you need to set up a media server. You can send your streams to the server from the mobile app using one of the many SDKs available.
For media server:
There are many ways to set up a server. For now, let's look at an RTMP server used with nginx. You can use HLS (HTTP Live Streaming), as stated above, with this package. Here, the RTMP server receives the stream and converts it into the HLS format, and the HTTP server distributes the stream.
This link will provide you more information.
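As a sketch, a minimal configuration for that setup with the nginx-rtmp-module looks something like the following (ports and paths are illustrative, and the module must be compiled in or installed as a package):

```nginx
rtmp {
    server {
        listen 1935;                    # default RTMP port; the app publishes here
        application live {
            live on;
            hls on;                     # repackage incoming RTMP as HLS
            hls_path /tmp/hls;          # where the .m3u8 and .ts files are written
            hls_fragment 5s;
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;                  # serves /tmp/hls/<stream-name>.m3u8
        }
    }
}
```

The mobile app publishes to rtmp://host:1935/live/<stream-name>, and players fetch http://host:8080/hls/<stream-name>.m3u8.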
To distribute your media content you can use an ordinary HTTP server. If you want to provide live content, you need a server component that encapsulates your content into a format that can be distributed over HTTP.
Apple provides a suite of command line utilities that allow you to prepare your content. Just search for "HTTP Live Streaming Tools" at https://developer.apple.com/downloads
The HTTP Live Streaming Overview is also a good starting point.
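Distributing prepared segments really does need nothing beyond an ordinary HTTP server with the right MIME types. A minimal sketch using Python's built-in server (the media/ directory and port are assumptions for illustration):

```python
import functools
import http.server
import mimetypes

# Register the HLS media types; many default MIME maps lack them.
mimetypes.add_type("application/vnd.apple.mpegurl", ".m3u8")
mimetypes.add_type("video/mp2t", ".ts")

# Serve the directory holding the .m3u8 playlist and the .ts segments.
handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                            directory="media")

server = http.server.ThreadingHTTPServer(("", 0), handler)  # port 0 = ephemeral
# server.serve_forever()  # uncomment to start serving
```

Any static file server (nginx, Apache, a CDN) does the same job; the point is only that .m3u8 and .ts must be served with the correct Content-Type.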
I am creating an app that captures video and uploads it to a server. I have written the capture code, and for the upload I am studying this code, but I don't understand where to put my server address so the data is stored there.
Also, is it possible to store the video using FTP? In my company the PHP and .NET developers use FTP, and I want to know whether I can upload video over FTP from my iPhone app.
In the WebService.m file there's a placeholder that says "your URL string"; replace that with your actual server address.
If you want to use FTP, you can do that as well with a networking library that supports it; one in particular is the very popular libcURL library.
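On iOS you'd do this through libcURL or another FTP-capable library, but for illustration, the upload flow is the same as in this Python stdlib ftplib sketch (host, credentials, and file path are hypothetical):

```python
from ftplib import FTP
from pathlib import Path

def upload_video(path, host, user, password):
    """Upload a local video file over FTP with a binary STOR command."""
    with FTP(host) as ftp:          # connects on the default FTP port, 21
        ftp.login(user, password)   # authenticate before transferring
        with open(path, "rb") as f:
            # Store under the file's own name in the current remote directory.
            ftp.storbinary(f"STOR {Path(path).name}", f)
```

Whichever library you use, the sequence is the same: connect, authenticate, then issue a binary STOR of the file's bytes.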