I can't stream video from an HTTP path in the Flutter video player - Flutter

I'm trying to create a new video player app that loads a VAST XML and extracts data from it.
One item in that data is the video file URL, which can be HTTP or HTTPS; I can't know which in advance,
and I can't just replace http with https, because then it may point to the wrong domain.
Is there a way to stream the video file over HTTP?
Thanks.

Related

How to fetch audio from my API with protected URLs using audioplayers (Flutter)

I'm trying to play audio using audioplayers in my Flutter app. My audio URLs come from an API that I've built using Java Spring Security, and the URLs to those audio files are protected (I'm using token authorization with a bearer token).
I have looked at Fetching data from the internet from Flutter's documentation, and other networking tutorials.
According to AudioPlayer's documentation:
The remote URL must be accessible and not be a redirect. If it's not an audio file, if it does a redirect, or if it requires some headers, cookies or authentication, it will not work.
So AudioPlayer is not an option here. just_audio, on the other hand, supports a headers parameter, so you can set the authorization header:
// player is a just_audio AudioPlayer instance; tokenValue holds the bearer token
await player.setUrl(url, headers: {'Authorization': tokenValue});

Audio source with axios stream?

I have an HTML5 audio control whose src property points to my middleware, an Express/Node server that delivers a streaming MP3 file.
On the middleware side I'm using res.pipe() to output the MP3 file.
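Roughly like this on the server (a simplified sketch; the file path is a placeholder, not my actual code):

const express = require('express');
const fs = require('fs');
const app = express();

// Stream the mp3 into the response
app.get('/api/stream', (req, res) => {
  res.set('Content-Type', 'audio/mpeg');
  fs.createReadStream('/path/to/track.mp3').pipe(res); // placeholder path
});

app.listen(3000);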
It's working great with one caveat: I can't send my authorization header.
So I want to use axios to access my middleware, which works fine, but I can't figure out how to "feed" the audio element.
If I do:
const response = axios.get('/api/stream',{requestHeaders:'stream'});
myAudio.src = response;
It throws an error and I'm blocked from there...
Thanks for any help :)
I finally managed to pass the token as a GET parameter instead of trying to use axios.
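Something along these lines (just a sketch; the query parameter name and how the middleware validates it are assumptions, since I haven't posted the final code):

// myAudio is the same HTML5 audio element as above; the token travels in the
// query string instead of an Authorization header, so the element can request
// the stream directly.
myAudio.src = '/api/stream?token=' + encodeURIComponent(token);
myAudio.play();

The middleware then reads the token from the query string and validates it before piping the MP3 out.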

How to initialize HLS stream?

My IP camera pushes its stream only when somebody requests it, so there are no TS segments or m3u8 file at the moment a client requests the stream. Is there a way to tell the client to retry the same request a second later, so that the server has time to send a command to the camera, wait for the incoming video and generate an m3u8 file?
I don't want to hold the client's connection open while waiting for the first m3u8 file, because that may create more connections than the server can handle.
Edit:
The server, running Nginx (actually OpenResty), receives audio/video data from an IP camera, transcodes it with the ffmpeg library, and publishes it via the HLS protocol.
The problem:
The IP camera does not push its stream all the time; it pushes only when a client requests it. After a client (an HLS player such as ffplay, or video.js with the videojs-contrib-hls plugin) makes a request, it takes several seconds for the server to receive the media stream and generate the first m3u8 file, so the client gets a 404 and fails to open the HLS stream.
What I tried:
Holding the client's connection open while waiting for the first m3u8 file is not an option, because it may create more connections than the server can handle.
The client can check whether the stream is ready by adding custom code, but I want to know if there is a better way first.
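For example, something like this on the page before starting playback (just a sketch of what I mean by custom code):

// Poll the playlist URL until the server has generated it, then start the player.
function waitForPlaylist(url, onReady) {
  fetch(url, { method: 'HEAD' })
    .then((res) => {
      if (res.ok) {
        onReady(); // the m3u8 exists now, safe to start playback
      } else {
        setTimeout(() => waitForPlaylist(url, onReady), 1000); // not ready yet, retry in 1s
      }
    })
    .catch(() => setTimeout(() => waitForPlaylist(url, onReady), 1000));
}

waitForPlaylist('http://xxx.xxx.xxx.xxx/camera.m3u8', () => {
  videojs('my_video_1').play();
});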
Question:
How can the server initialize the HLS stream and avoid this problem?
Is there a way to generate a fake m3u8 file that tells standard HLS players to retry automatically?
Or is there a way to configure the HLS player so that it retries automatically when it gets a 404 from the server? Any player that can run in a browser is acceptable, such as Flowplayer with flashls, or video.js with videojs-contrib-hls.
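If it helps, this is the kind of retry I have in mind for video.js (an untested sketch; I'm not sure clearing the error like this is actually supported):

// On any player error (e.g. the playlist request returning 404),
// wait a second, clear the error and point the player back at the playlist.
const player = videojs('my_video_1');
const source = { src: 'http://xxx.xxx.xxx.xxx/camera.m3u8', type: 'application/x-mpegURL' };

player.on('error', () => {
  setTimeout(() => {
    player.error(null); // assumption: passing null clears the current error state
    player.src(source);
    player.play();
  }, 1000);
});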
Edit 2:
How does the camera start:
My system has a control module that communicates with the camera and tells it to start or stop pushing media. The camera starts pushing only after a client request, so how the camera starts is not the issue.
About Nginx in my system:
The camera POSTs raw media (A-law audio and H.264 video) to the OpenResty server (which is based on Nginx), with a timestamp in a custom HTTP header. Each request posts two seconds of media. The server takes the raw media and transcodes it to MPEG-TS segments, along with an m3u8 file as the HLS playlist. The transcoding is done with the ffmpeg library.
Player:
<!DOCTYPE html>
<html>
<head>
<meta charset=utf-8 />
<title>videojs-contrib-hls embed</title>
<link href="https://cdnjs.cloudflare.com/ajax/libs/video.js/5.10.2/alt/video-js-cdn.css" rel="stylesheet">
<script src="https://cdnjs.cloudflare.com/ajax/libs/video.js/5.10.2/video.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/videojs-contrib-hls/3.0.2/videojs-contrib-hls.js"></script>
</head>
<body>
<video id="my_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="268"
data-setup='{}'>
<source src="http://xxx.xxx.xxx.xxx/camera.m3u8" type="application/x-mpegURL">
</video>
</body>
</html>

How to intercept multipart/form-data in Fiddler and access a binary file which is part of the request

I am trying to intercept requests sent to a server from my mobile device. There is a POST request which uploads a payload to the server, and the request includes a file of type .pb which I can't read in Fiddler. Is there a way to get hold of the file?
It's not clear what you mean by not being able to read it in Fiddler.
Use Fiddler's HexView request inspector to inspect the POST body. You can select the bytes of the file upload and choose Save bytes to save the file out to your desktop.

Google Cloud Storage: Setting incorrect MIME-type

I have a Node.js server running on a Google Compute Engine virtual instance. The server streams incoming files to Google Cloud Storage (GCS). My code is here: Node.js stream upload directly to Google Cloud Storage
I'm passing Content-Type in the XML headers and it's working just fine for image/jpeg MIME-types, but for video/mp4 GCS is writing files as application/octet-stream.
There's not much to this, so I'm totally at a loss for what could be wrong ... any ideas are welcome!
Update/Solution
The problem was that the multiparty module was creating a content-type: octet-stream header on the 'part' object that I was piping to GCS. This caused GCS to receive two content-types, of which the octet-stream one was last, so GCS used it for the inbound file.
OK, looking at your HTTP request and response, it seems like the content type is specified in the URL returned as part of the initial HTTP request. The initial HTTP request should return the endpoint which can be used to upload the file. I'm not sure why it is specified there, but looking at the documentation (https://developers.google.com/storage/docs/json_api/v1/how-tos/upload - start a resumable session), it says that X-Upload-Content-Type needs to be specified, along with some other headers. This doesn't seem to be specified in the HTTP requests mentioned above. There might be an issue with the library used, but the returned endpoint does not look like what is specified in the documentation.
Have a look at https://developers.google.com/storage/docs/json_api/v1/how-tos/upload, "Example: Resumable session initiation request", and see if you still have the same issue when you specify the headers suggested there.
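For reference, here is a minimal sketch of that session-initiation request from Node (the bucket name, object name and OAuth token are placeholders; double-check the endpoint and headers against the linked documentation):

// Start a resumable upload session, declaring the object's final content type
// up front via X-Upload-Content-Type. The session URI comes back in the
// Location header; the file bytes are then PUT to that URI.
const https = require('https');

const req = https.request({
  method: 'POST',
  hostname: 'storage.googleapis.com',
  path: '/upload/storage/v1/b/my-bucket/o?uploadType=resumable&name=video.mp4',
  headers: {
    'Authorization': 'Bearer ' + accessToken, // placeholder OAuth 2.0 token
    'Content-Length': 0,
    'X-Upload-Content-Type': 'video/mp4'
  }
}, (res) => {
  console.log('resumable session URI:', res.headers.location);
});
req.end();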
Google Cloud Storage is content-type agnostic, i.e., it treats any kind of content in the same way (videos, music, zip files, documents, you name it).
But just to give some idea: first, I believe the video you are uploading ends up more or less the same size after it is uploaded, so it falls under application/<sub type> (similar to section 3.3 of RFC 4337).
To make this correct, I believe you need to deal with storing the mp4 metadata before and after the file is uploaded.
Please let us know your solution.
A solution that worked for me in a similar situation is below. TL;DR: save video from a web app to GCS with content type video/mp4 instead of application/octet-stream.
Here is the situation: you want to record video in the browser and save it to Google Cloud Storage with the content type set to video/mp4 instead of application/octet-stream. The user records a video and clicks a button to send the video file to your server for saving. After the client sends the video file to your server, the server sends the video file to Google Cloud Storage for saving.
You successfully save the video to Google Cloud Storage, and by default GCS assigns a content type of application/octet-stream to the video.
To assign a content type of video/mp4 instead of application/octet-stream, here is some server-side Python code that works:
from google.cloud import storage

# Upload the file first, then patch the blob's metadata with the correct content type.
storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(destination_blob_name)
blob.upload_from_file(file_obj, rewind=True)  # file_obj is the uploaded video file object
blob.content_type = 'video/mp4'
blob.patch()  # push the updated metadata to GCS
Here are some links that might help.
https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python
https://stackoverflow.com/a/33320634/19829260
https://stackoverflow.com/a/64274097/19829260
NOTE: at the time of this writing, the Google docs about editing metadata don't work for me: they say to set metadata, but metadata seems to be read-only (see the SO post https://stackoverflow.com/a/33320634/19829260).
https://cloud.google.com/storage/docs/viewing-editing-metadata#edit