How to initialize an HLS stream? - streaming

My IP camera pushes its stream only when somebody requests it, so there is no TS segment or m3u8 file at the moment a client requests the stream. Is there a way to tell the client to retry the same request a second later, so that the server has time to send the start command, wait for the incoming video, and generate an m3u8 file?
I don't want to hold the connection open while waiting for the first m3u8 file, because that could create more simultaneous connections than the server can handle.
Edit:
The server, running Nginx (actually OpenResty), receives audio/video data from an IP camera, transcodes it with the ffmpeg library, and finally publishes it via the HLS protocol.
The problem:
The IP camera does not push its stream all the time; it pushes only when a client requests it. After a client (an HLS player such as ffplay, or video.js with the videojs-contrib-hls plugin) makes a request, it takes several seconds for the server to receive the media stream and generate the first m3u8 file. So the client gets a 404 and fails to open the HLS stream.
What I tried:
As noted above, holding the connection open while waiting for the first m3u8 file is not an option, because it could create more connections than the server can handle.
The client could check whether the stream is ready by adding custom code, but I want to know if there is a better way first.
Question:
How can the server initialize the HLS stream and avoid this problem?
Is there a way to generate a placeholder m3u8 file that tells standard HLS players to retry automatically?
Or is there a way to configure the HLS player so that it retries automatically when it gets a 404 from the server? Any player that runs in a browser is acceptable, such as Flowplayer with flashls, or video.js with videojs-contrib-hls.
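A player-side workaround is to add custom error handling around video.js: listen for the player's error event and re-set the source after a backoff delay, so a 404 on the first playlist request becomes a retry instead of a hard failure. This is only a sketch assuming the video.js 5 API used in the embed below; retryDelay and attachRetry are hypothetical helper names, and how the contrib-hls plugin surfaces a 404 may vary between versions.

```javascript
// Capped exponential backoff delay (ms) for the nth retry attempt.
function retryDelay(attempt, baseMs = 1000, maxMs = 10000) {
  return Math.min(baseMs * Math.pow(2, attempt), maxMs);
}

// Re-set the player's source after each error, up to maxAttempts times.
// player.on('error'), player.src() and player.load() are standard video.js calls.
function attachRetry(player, source, maxAttempts = 5) {
  let attempt = 0;
  player.on('error', function () {
    if (attempt >= maxAttempts) return;
    setTimeout(function () {
      player.src(source); // forces a fresh request for the m3u8 playlist
      player.load();
      attempt += 1;
    }, retryDelay(attempt));
  });
}

// Usage (with the embed's player id and source URL):
// attachRetry(videojs('my_video_1'),
//             { src: 'http://xxx.xxx.xxx.xxx/camera.m3u8',
//               type: 'application/x-mpegURL' });
```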
Edit 2:
How does the camera start:
My system has a control module that communicates with the camera and tells it to start or stop pushing media. I'm sure the camera starts only after a client request, so it doesn't matter here how the camera is started.
About Nginx in my system:
The camera POSTs raw media (A-law audio and H.264 video) to the OpenResty server (which is based on Nginx), with a timestamp in a custom HTTP header. Each request posts two seconds of media. The server takes the raw media and transcodes it into MPEG-TS segments plus an m3u8 playlist for the HLS protocol. The transcoding is done with the ffmpeg library.
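For reference, once the first two-second segment has been transcoded, a minimal live playlist generated by such a server might look like the fragment below (the segment file name is hypothetical). The key point is the absence of #EXT-X-ENDLIST: without it, players treat the playlist as live and keep re-fetching it for new segments.

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:2.0,
camera-00000.ts
```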
Player:
<!DOCTYPE html>
<html>
<head>
<meta charset=utf-8 />
<title>videojs-contrib-hls embed</title>
<link href="https://cdnjs.cloudflare.com/ajax/libs/video.js/5.10.2/alt/video-js-cdn.css" rel="stylesheet">
<script src="https://cdnjs.cloudflare.com/ajax/libs/video.js/5.10.2/video.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/videojs-contrib-hls/3.0.2/videojs-contrib-hls.js"></script>
</head>
<body>
<video id="my_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="268"
data-setup='{}'>
<source src="http://xxx.xxx.xxx.xxx/camera.m3u8" type="application/x-mpegURL">
</video>
</body>
</html>

Related

I can't stream video from an http path in the Flutter video player

I'm trying to create a new video player app that loads a VAST XML document and extracts the data from it.
One item in the data is a video file URL, which can be http or https; I can't know in advance which.
I can't just replace http with https, because then the domain name may be wrong.
Is there a way to stream the video file over http?
Thanks.

I'm getting an error message related to Access-Control-Allow-Origin

I'm working on a landing page that uses Plyr from a CDN:
<script src="https://cdn.plyr.io/3.3.10/plyr.js"></script>
<script>const player = new Plyr('#player');</script>
I moved a video from local files to a server and changed the src to the new address on the server, but the video stopped working and I'm getting this error:
page.html:1 Failed to load https://www.video.mp4: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://111.0.0.0:12121' is therefore not allowed access.
I tried different things, and even added other videos from other servers, and they worked; everything works except my video. The only thing that works is adding crossOrigin="anonymous" to the video tag and installing a Chrome extension, but this won't work for other users. I need something permanent.
I also looked into many answers:
How does Access-Control-Allow-Origin header work?
Videos not playing due to no Access Control Allow Origin
HTML5 video doesn't play with crossOrigin=“anonymous”
Please, any ideas how to make this work?
This problem occurs when you send a request to a server different from the one that served the page. As indicated in the comments, only the server your video is uploaded to can control that header. But if it's your own server, you can easily configure it to allow requests from different origins.
For a reference on how to enable it on your server, see W3C CORS Enabled.

Calling a Web API on the client side (iPhone/Safari/Chrome/IE) - right way to consume it?

I have created a Web API project that gets media files from a CDN. In the project we verify the identity of the files, change some content header values, and return the response as an HttpResponseMessage.
On the client side, I have a test application where I try to access and play video files. The client-side code looks like this:
<video controls autoplay name="media" height="360" width="220">
<source src="http://localhost/api/media/test/filename.mp4" type="video/mp4"></source>
</video>
My question is: is this the right way to consume the Web API directly in the src attribute, or do I need to initiate an HttpClient on the client side and call the Web API method that way? I searched the internet and saw both kinds of examples. It works fine in desktop browsers but not on iPhone/iPad devices. Is that because of the way I am calling it?
I'll appreciate your answer.
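One common cause of video that plays in desktop browsers but not on iPhone/iPad is missing HTTP Range support: iOS requires servers to honor byte-range requests for progressive video, and a proxy that always returns the whole file with 200 OK will fail there. The idea is sketched below in JavaScript rather than ASP.NET for brevity; parseRange is a hypothetical helper, and in a Web API project the equivalent is returning a range-aware (206 Partial Content) response.

```javascript
// Parse an HTTP Range header such as "bytes=0-1023" against a known file size.
// Returns {start, end, length} or null when the header is absent or invalid.
// A range-aware server answers with 206 Partial Content and a
// "Content-Range: bytes start-end/fileSize" header for the returned slice.
function parseRange(rangeHeader, fileSize) {
  const m = /^bytes=(\d*)-(\d*)$/.exec(rangeHeader || '');
  if (!m || (m[1] === '' && m[2] === '')) return null;
  // "bytes=-500" means the final 500 bytes of the file.
  const start = m[1] === '' ? fileSize - Number(m[2]) : Number(m[1]);
  const end = (m[1] === '' || m[2] === '') ? fileSize - 1
                                           : Math.min(Number(m[2]), fileSize - 1);
  if (start < 0 || start > end) return null;
  return { start, end, length: end - start + 1 };
}
```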

Google Cloud Storage: Setting incorrect MIME-type

I have a Node.js server running on a Google Compute Engine virtual instance. The server streams incoming files to Google Cloud Storage (GCS). My code is here: Node.js stream upload directly to Google Cloud Storage
I'm passing Content-Type in the XML headers, and it works just fine for image/jpeg MIME types, but for video/mp4 GCS writes the files as application/octet-stream.
There's not much to this, so I'm totally at a loss for what could be wrong ... any ideas are welcome!
Update/Solution
The problem was that the multiparty module was creating content-type: octet-stream headers on the 'part' object I was piping to GCS. This caused GCS to receive two content types, of which the octet-stream one came last, so GCS used it for the inbound file.
OK, looking at your HTTP request and response, it seems the content type is specified in the URL returned as part of the initial HTTP request. The initial HTTP request should return the endpoint which can be used to upload the file. I'm not sure why it is specified there, but looking at the documentation (https://developers.google.com/storage/docs/json_api/v1/how-tos/upload - start a resumable session), it says that X-Upload-Content-Type needs to be specified, along with some other headers. These don't seem to be specified in the HTTP requests mentioned above. There might be an issue with the library used, but the returned endpoint does not look like what is specified in the documentation.
Have a look at https://developers.google.com/storage/docs/json_api/v1/how-tos/upload, "Example: Resumable session initiation request", and see if you still have the same issue when you specify the headers suggested there.
Google Cloud Storage is content-type agnostic, i.e., it treats any kind of content in the same way (videos, music, zip files, documents, you name it).
But just to give some idea:
First, I believe the video you are uploading has a somewhat different size after being uploaded, so it falls under application/<subtype> (similar to section 3.3 of RFC 4337).
To correct this, I believe you need to deal with storing the mp4 metadata before and after the file is uploaded.
Please let us know your solution.
A solution that worked for me in a similar situation is below. TL;DR: save video from a web app to GCS with content type video/mp4 instead of application/octet-stream.
Here is the situation: you want to record video in the browser and save it to Google Cloud Storage with the content type set to video/mp4 instead of application/octet-stream. The user records a video and clicks a button to send the file to your server for saving, and the server then sends the video file on to Google Cloud Storage.
You successfully save the video to Google Cloud Storage, but by default GCS assigns it a content type of application/octet-stream.
To assign video/mp4 instead, here is some server-side Python code that works:
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(destination_blob_name)
# Upload the file object, rewinding it to the start first
blob.upload_from_file(file_obj, rewind=True)
# Override the default application/octet-stream, then push the metadata change to GCS
blob.content_type = 'video/mp4'
blob.patch()
Here are some links that might help.
https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python
https://stackoverflow.com/a/33320634/19829260
https://stackoverflow.com/a/64274097/19829260
NOTE: at the time of this writing, the Google docs about editing metadata don't work for me; they say to set metadata, but the metadata fields seem to be read-only (see the SO post https://stackoverflow.com/a/33320634/19829260):
https://cloud.google.com/storage/docs/viewing-editing-metadata#edit

File downloading using HTTP

I am having trouble finding the correct XEP to use for this specific use case:
The initiator (e.g. an iOS or Android device) uploads a file to a server and needs to notify the responder (in this case a browser-based client) to download the file over HTTP from the location it was just uploaded to.
All the XEPs I have come across talk about streams or IBB/SOCKS5. I did find the following, which could be useful, but it has had no updates since 2007:
http://xmpp.org/extensions/inbox/jingle-httpft.html
Am I overlooking an XEP which is in draft or final?
Either use XEP-0066: Out of Band Data, or just encode the link in a XEP-0071: XHTML-IM <a> element.
The first:
<message from='stpeter@jabber.org/work'
to='MaineBoy@jabber.org/home'>
<body>Yeah, but do you have a license to Jabber?</body>
<x xmlns='jabber:x:oob'>
<url>http://www.jabber.org/images/psa-license.jpg</url>
</x>
</message>
The second:
<message>
<body>here is a file [http://www.jabber.org/images/psa-license.jpg]</body>
<html xmlns='http://jabber.org/protocol/xhtml-im'>
<body xmlns='http://www.w3.org/1999/xhtml'>
<p>Here is a <a href='http://www.jabber.org/images/psa-license.jpg'>file</a></p>
</body>
</html>
</message>