Storing and streaming videos and handling requests using REST APIs

I am trying to create a web server (REST APIs), which should be able to store, organise, and stream videos in response to client requests.
My confusion:
1. How should a user upload videos? From research, I decided that I would store all the metadata for the videos in a database (Google Datastore) and all the video files in separate storage (Google Cloud Storage). Now, what is the proper way to handle the upload itself?
2. Once a video is uploaded and stored, how will the streaming happen? Suppose a user makes a request to watch a video; the server will get an HTTP request for that. But how do I actually stream the video? Is there a service for this? I guess handling HTTP streaming directly in my own code would hurt performance.
From my understanding, I want a service that can stream videos from my storage to a client upon the server's request. I guessed the server should make the request to this "video streaming service" only after verifying the user's credentials.

For question 1 (how to enable customers to upload objects), signed URLs are a good bet.
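For example, with the @google-cloud/storage Node client, a minimal sketch of issuing a V4 signed upload URL (bucket and object names are placeholders):

```typescript
// Sketch: issue a short-lived V4 signed URL so the client can PUT one video
// straight to Google Cloud Storage. Bucket/object names are placeholders.
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

async function createUploadUrl(videoId: string): Promise<string> {
  const [url] = await storage
    .bucket("my-video-bucket")
    .file(`videos/${videoId}.mp4`)
    .getSignedUrl({
      version: "v4",
      action: "write",                      // allows a single PUT upload
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
      contentType: "video/mp4",             // client must upload with this Content-Type
    });
  return url; // store the metadata row in Datastore, hand this URL to the client
}
```

The client uploads directly to GCS with that URL, so the video bytes never pass through your REST server.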
Question 2 is a lot bigger. Depending on your needs you could simply point clients to GCS video files, but modern media serving is a bit more advanced than that. You may want to look into using GCE with a streaming video service, for example something like Wowza. Google offers a click-to-deploy experience for it: http://cloud.google.com/tryitnow/wowza
(Keep in mind that Wowza is a separate product requiring a paid license. I don't have any experience with it and neither advocate for nor recommend against it.)
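As for the credential check mentioned in the question: whichever serving approach you pick, the usual pattern is that the REST server verifies the user and only then hands out a short-lived signed read URL for the stored file. A sketch, assuming Express and a hypothetical checkUser helper:

```typescript
// Sketch: verify the user, then return a short-lived signed *read* URL so the
// client fetches the video directly from GCS. checkUser() is hypothetical.
import express from "express";
import { Storage } from "@google-cloud/storage";

declare function checkUser(req: express.Request): boolean; // your auth logic

const app = express();
const storage = new Storage();

app.get("/videos/:id/watch", async (req, res) => {
  if (!checkUser(req)) return res.status(401).send("not authorized");

  const [url] = await storage
    .bucket("my-video-bucket")              // placeholder bucket
    .file(`videos/${req.params.id}.mp4`)
    .getSignedUrl({
      version: "v4",
      action: "read",
      expires: Date.now() + 60 * 60 * 1000, // 1 hour
    });

  res.json({ url }); // the client streams/downloads directly from GCS
});

app.listen(8080);
```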

Related

Why do you need a separate encoder for streaming live video?

I have been searching for an API service that allows for browser-based video capturing and encoding for the purpose of live streaming. The company I work for needs an "all inclusive" API, but it does not seem to exist. The closest we have come are streaming services that allow a player to be embedded and the stream output to be linked to that player. These services always seem to require that you use separate software to encode your live video.
Are there copyrights held by YouTube and Vimeo that prevent others from creating these technologies? Are there limitations with cost and scale?
Live streaming is typically broken down into two categories:
The first is video conferencing, where there is a limited number of participants. Here video quality typically doesn't matter much. This is what browser-based broadcasting solutions are designed for.
The second is a large audience, where there is a single broadcaster with many viewers. Here separate encoding software is preferred, because it is much more feature-rich, allows for more options and controls, and allows for using good-quality cameras.
COVID-19 made popular new categories of broadcast conference calls and simple "one-to-many" broadcasts from laptops.
Not many companies have built end-to-end services for this use case, as significant demand has only existed for a few months, and it takes years to build something like this. When COVID is over, this market may dry up again.
Q: An API service that allows for browser-based video capturing and encoding for the purpose of live streaming:
WebRTC
Q: A streaming player to be embedded, with the stream output linked to it:
HLS/DASH Player on Any Standard Browser
You can have a media gateway to convert from WebRTC to HLS/DASH (for the one-to-many or broadcasting scenario):
Janus
[Diagram: browser capture (WebRTC) → media gateway (Janus) → HLS/DASH → players in standard browsers]
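On the playback side, embedding an HLS player in a standard browser is straightforward with hls.js; a minimal sketch (stream URL and element ID are placeholders):

```typescript
// Sketch: play the HLS output of the media gateway in any standard browser.
import Hls from "hls.js";

const video = document.getElementById("player") as HTMLVideoElement;
const src = "https://example.com/live/stream.m3u8"; // placeholder stream URL

if (Hls.isSupported()) {
  // MSE-capable browsers (Chrome, Firefox, Edge) play HLS through hls.js.
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari plays HLS natively, no library needed.
  video.src = src;
}
```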

Live Streaming - WebRTC to RTMP

I am working on a live streaming project and came across many services like Wowza, Dacast, Ant, etc. The one that suits my requirements uses the RTMP protocol, so I would have to use encoding software like OBS to publish the stream. I actually want to publish the stream from the browser/iOS/Android.
I came across this FB presentation, and it seems like they are using the RTMP protocol. FB is successfully doing the broadcast from the browser somehow.
Can I get an insight into how things work with FB and similar RTMP-based live streaming apps? Thanks in advance.
Facebook supports RTMP ingest of video (used by people who utilize the Live API), as well as WebRTC ingest for browser clients.
RTMP is not used as a distribution protocol. For that, there is DASH.
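On the viewer's side, DASH playback in the browser is simple with the dash.js reference player; a minimal sketch (manifest URL and element ID are placeholders):

```typescript
// Sketch: play a DASH manifest in the browser using dash.js.
import * as dashjs from "dashjs";

const video = document.getElementById("player") as HTMLVideoElement;
const player = dashjs.MediaPlayer().create();
player.initialize(video, "https://example.com/live/stream.mpd", true); // autoplay
```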

Stream a live video call between 2 people to thousands of people

How can I stream a live video call between 2 people to thousands of people? I would prefer to use WebRTC, but I can't find the answer to my question. The viewers should be able to watch the stream in a web app.
Streaming to thousands of people is not trivial! It's not as hard as it used to be 10 years ago but is still pretty hard.
WebRTC supports direct browser to browser (peer to peer) connections. This means that WebRTC is primarily targeted at 1:1 conversation. If you want the same conversation (video or audio) to be shared among more than 2 people you have the following options:
Connect every user to every other user. This creates a fully connected graph between the viewers. This is easy to do, because all you need is WebRTC and no special hardware or software. However, it is also very inefficient in terms of traffic and distribution, and doesn't scale beyond 5-6 people.
Use a WebRTC video relay like Jitsi Videobridge. According to the official performance claims, Videobridge can scale to 500-1000 people given a fast and wide enough internet connection.
Direct the WebRTC stream between the two participants to a WebRTC-enabled streaming server. If needed, transcode the input stream to a suitable codec (H.264/VP8/VP9) and convert it to a suitable protocol (RTMP/HLS/DASH). Distribute the content using the built-in functionality of the media server or by use of a CDN. Play the video on the client side with a player (Flowplayer/JW Player/Viblast Player/Video.js/your own custom player) or a combination of the above. This is the hardest solution, but it is also the best one in terms of scalability and platform reach. Such a solution can scale easily to thousands of people and reach all major browsers and mobile platforms.
I guess the third alternative is the one for you. You can read more about the whole capturing/publishing/transcoding/converting business in BlogGeek.me's great blog post.
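For the capture-and-publish half of that third option, the browser side looks roughly like the sketch below; the signaling exchange with the media server is server-specific, so it is only stubbed out here:

```typescript
// Sketch: capture camera + mic and offer the tracks to a WebRTC-enabled
// streaming server. sendToServer() stands in for your server's signaling
// channel (HTTP, WebSocket, WHIP, ...), so it is hypothetical.
declare function sendToServer(
  sdp: RTCSessionDescription
): Promise<RTCSessionDescriptionInit>;

async function publish(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Send the offer to the media server and apply its answer.
  const answer = await sendToServer(pc.localDescription!);
  await pc.setRemoteDescription(answer);
}
```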
A WebRTC-based peer-to-peer connection is not the right choice for one-to-n streaming. As there is no broadcast in WebRTC so far, you should consider another technique.

Streaming live video and audio from iphone iOS 7

First, I wish to emphasize the keyword from. There are a lot of questions and answers on this topic, but I found that no answer provides a step-by-step roadmap to achieve this.
What I wish to achieve:
I wish to stream the video and audio (live) being recorded from the camera of an iPhone/iPad to my server. And that's it.
What I have figured out till now:
I guess that we can't use HTTP Live Streaming, because it's meant for server-to-client, not client-to-server. The AVFoundation framework allows output only in the form of a .mov file.
What I am not able to figure out:
I don't know how to get individual frames (live) and send them to my server one by one.
PS: I really don't know anything about this... you are welcome to oversimplify things. I am writing the server in node.js.
You may take a look at the Wowza GoCoder iOS app.
It requires Wowza as the media server, though, so you'll be able to provide full-featured streaming to anyone you want.
Server-side setup is done easily via Wowza configs or by third-party cloud control.
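If you would rather prototype the receiving end on your own node.js server instead of Wowza, a minimal sketch is below. It assumes the client ships recorded media chunks over a WebSocket, that ffmpeg is installed on the server, and that the "ws" package is available; paths and ports are placeholders:

```typescript
// Sketch: accept media chunks over a WebSocket and repackage them into HLS
// with ffmpeg. Client format, paths, and ports are assumptions.
import { spawn } from "child_process";
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8081 });

wss.on("connection", (socket) => {
  const ffmpeg = spawn("ffmpeg", [
    "-i", "pipe:0",        // read the incoming media from stdin
    "-c:v", "copy",        // pass video through (assumes an HLS-compatible codec)
    "-c:a", "aac",         // transcode audio to AAC for HLS
    "-f", "hls",
    "-hls_time", "4",      // 4-second segments
    "/var/www/live/stream.m3u8", // hypothetical path served by your HTTP server
  ]);

  socket.on("message", (chunk) => ffmpeg.stdin.write(chunk as Buffer));
  socket.on("close", () => ffmpeg.stdin.end());
});
```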

What kind of server is required for live streaming of video?

I am making an iPhone application which will send video to a server for live streaming, and I wanted to know: do we require a media server for this?
Yes, you need a media server. You can send your streams to the server from the mobile device using one of the many SDKs available.
For media server:
There are many ways you can set up a server. For now, let's look at an RTMP server used with nginx (the nginx-rtmp module). You can use HLS (HTTP Live Streaming), as stated above, with this package. Here, the RTMP server will receive the stream and convert it into the HLS format, and the HTTP server will distribute the stream.
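A minimal sketch of that nginx-rtmp setup (the application name, paths, and ports are illustrative, and the module must be compiled in or installed):

```nginx
rtmp {
    server {
        listen 1935;                 # default RTMP ingest port
        application live {
            live on;
            hls on;                  # repackage incoming RTMP into HLS
            hls_path /tmp/hls;       # where the playlist and segments are written
            hls_fragment 4s;
        }
    }
}

http {
    server {
        listen 8080;
        location /hls {
            types {
                application/vnd.apple.mpegurl m3u8;
                video/mp2t ts;
            }
            root /tmp;               # serves /tmp/hls/<stream>.m3u8 over HTTP
        }
    }
}
```

A mobile SDK or OBS then publishes to rtmp://your-host/live/<stream>, and players fetch http://your-host:8080/hls/<stream>.m3u8.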
To distribute your media content you can use an ordinary HTTP server. If you want to provide live content, you need a server component that encapsulates your content into a format that can be distributed over HTTP.
Apple provides a suite of command line utilities that allow you to prepare your content. Just search for "HTTP Live Streaming Tools" at https://developer.apple.com/downloads
The HTTP Live Streaming Overview also is a good starting point.