My application publishes a video stream to Wowza, and Wowza then pushes that stream to an FMS server.
I can set up restreaming manually via Stream -> Stream Targets -> New Target.
I have also found that it is possible with the REST API (https://www.wowza.com/docs/stream-targets-query-examples-push-publishing).
But I actually want to do it completely automatically.
Is that possible? Does Wowza have some trigger that fires when a stream starts broadcasting?
Since you are looking into a REST API solution, one way to make this completely automated is when you publish your stream, trigger a script that queries the stream name via REST API, saves the stream name to a variable, and adds it as a Stream Target via REST API.
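A minimal sketch of that script in Python, assuming the default REST API port (8087), a source application named `live`, and the endpoint/field names shown in the linked docs — verify all of these against your Wowza version; the FMS host and application below are placeholders:

```python
# Hedged sketch: query incoming streams and add a Stream Target (push publish
# map entry) via the Wowza Streaming Engine REST API. Host, port, application
# names, and JSON field names are assumptions based on the linked docs.
import json
import urllib.request

WOWZA_API = "http://localhost:8087/v2/servers/_defaultServer_/vhosts/_defaultVHost_"
APP = "live"  # assumed source application name

def incoming_streams_url(app=APP):
    """URL that lists the streams currently published to the application."""
    return f"{WOWZA_API}/applications/{app}/instances/_definst_/incomingstreams"

def push_target_payload(stream_name, fms_host="fms.example.com", fms_app="live"):
    """Stream Target map entry that pushes stream_name to an RTMP server."""
    return {
        "entryName": stream_name,
        "profile": "rtmp",
        "sourceStreamName": stream_name,
        "host": fms_host,            # placeholder FMS host
        "application": fms_app,      # placeholder FMS application
        "streamName": stream_name,   # outbound name on the FMS side
    }

def add_push_target(stream_name):
    """POST the map entry; call this from the script your publish event triggers."""
    url = f"{WOWZA_API}/applications/{APP}/pushpublish/mapentries/{stream_name}"
    req = urllib.request.Request(
        url,
        data=json.dumps(push_target_payload(stream_name)).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)
```

Your trigger script would GET `incoming_streams_url()`, read each stream name out of the JSON response, and call `add_push_target()` for any stream that doesn't have a target yet.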
I have found four data sets that can be ingested directly into Memgraph at the Awesome Streams site. I've also found a tutorial, How to build a Spotify Recommendation Engine using Kafka and Memgraph.
Is there a public stream of this dataset? I know that I can download it, and I know that there is already an app, but I'd like a public stream so that I can showcase this at my school without needing to bring my laptop.
Memgraph currently offers four data streams at Awesome Streams. These are the same streams that can be found at https://github.com/memgraph/data-streams. The Spotify data stream is mentioned at https://github.com/memgraph/spotify-song-recommender.
You can open an issue at https://github.com/memgraph/data-streams and ask for a Spotify stream to be included.
I need to stream video data to a server from an ESP32-cam, so I need to set the ESP32-cam up as a client. However, I could not find any example code or other resource showing how to stream video data to a server. There are examples that show how to set the ESP32-cam up as a video streaming server, but not as a client. Is this possible at all?
Or, as an alternative solution, would it be possible to connect the ESP32-cam server to another server?
I would appreciate any resources you could point me to.
Thanks in advance!
I think you mean you want the camera to act like an IP camera and send its stream to a server, so you can then stream the video from that server.
Many servers for IP cameras are set up to receive RTSP streams, and there are example libraries for sending an RTSP stream from your ESP32-cam that you can use for this. One popular example: https://github.com/geeksville/Micro-RTSP
As a note, you could also have your server act as a client while your ESP32-cam acts as a streaming server. The server could then re-stream the video to wherever you want to send it.
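A rough Python sketch of that second approach, assuming the stock ESP32-cam web server serving MJPEG over HTTP (the URL and port below are placeholders; check what your camera firmware actually exposes):

```python
# Hedged sketch: your server acting as a client to an esp32-cam's MJPEG
# stream. MJPEG is served as multipart/x-mixed-replace; rather than parse
# the multipart boundary, we split on the JPEG start/end markers.
import urllib.request

def extract_jpeg_frames(buffer: bytes):
    """Split raw MJPEG bytes into complete JPEG frames using the
    JPEG start (FFD8) and end (FFD9) markers."""
    frames = []
    start = buffer.find(b"\xff\xd8")
    while start != -1:
        end = buffer.find(b"\xff\xd9", start + 2)
        if end == -1:
            break  # trailing partial frame; wait for more data
        frames.append(buffer[start:end + 2])
        start = buffer.find(b"\xff\xd8", end + 2)
    return frames

def pull_stream(url="http://esp32-cam.local:81/stream", chunk_size=4096):
    """Read the camera's MJPEG stream and yield JPEG frames as they complete.
    The URL is a placeholder for your camera's stream endpoint."""
    with urllib.request.urlopen(url) as resp:
        buffer = b""
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            buffer += chunk
            for frame in extract_jpeg_frames(buffer):
                yield frame
            # keep only bytes after the last complete frame
            last_end = buffer.rfind(b"\xff\xd9")
            if last_end != -1:
                buffer = buffer[last_end + 2:]
```

Each yielded frame is a standalone JPEG that your server can decode, process, or re-stream elsewhere.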
I have one question regarding sending streams to a TV station using Wowza.
I need to send multiple streams, running at the same time, to a TV station using one link.
Basically, the question is: I have multiple streams with different names, and when I send them to the TV station each needs to be converted to one unique name at run time.
Is this possible? If yes, please explain a bit more.
Thanks in advance!
If by sending to "TV" you mean leveraging Push Publish to send to an external CDN or Wowza server, then you can specify the outbound stream name within the Push Publish mapping by setting the "streamName" parameter. You could also remap the inbound published stream name via the approach found here.
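As a hedged illustration (the host, application, and stream names here are placeholders — check the map entry format against your Wowza version), a Push Publish map entry in `conf/[application]/PushPublishMap.txt` that renames the outbound stream might look like:

```
# inbound stream "camera1" is pushed out under the unique name "tv-feed"
camera1={"entryName":"tvTarget", "profile":"rtmp", "host":"cdn.example.com", "application":"live", "streamName":"tv-feed"}
```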
Otherwise, if you are referring to requests made for a particular stream on your Wowza instance (versus pushing outbound), you could leverage the Stream Name Alias module, with which you can map any stream name to another.
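For example — hedged, since the exact file names depend on how the module is configured — a play-side alias map might contain one line per mapping:

```
# conf/[application]/aliasmap.play.txt
# requests for "tv-feed" resolve to the actual published stream "camera1"
tv-feed=camera1
```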
Thanks,
Matt
These applications stream video from the client app to their own server. I am interested in knowing what type of protocol they use. I am planning on building a similar application, but I don't know how to go about the video streaming. Once I get the stream to my server, I will use OpenCV to do some processing and return the result to the client.
I recommend sending only a minimum of data and doing as much of the processing as possible on the client, since sending the whole video stream is a huge waste of traffic (and probably cannot be done in real time).
I would use a TCP connection to send an intermediate result to the server, which the server can then process further. The design of that communication depends on what you are sending and what you want to do with it.
You could wrap it in XML, for instance, or serialize an object, and so on.
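One common way to frame such messages on a TCP byte stream is a length prefix, so the server knows where each serialized object ends. A minimal Python sketch (the payload fields are made up for illustration):

```python
# Hedged sketch: length-prefixed JSON framing for intermediate results
# sent over TCP. TCP is a byte stream with no message boundaries, so each
# message is prefixed with its length as a 4-byte big-endian integer.
import json
import struct

def encode_message(payload: dict) -> bytes:
    """Serialize a dict to JSON and prefix it with its byte length."""
    body = json.dumps(payload).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_messages(data: bytes):
    """Split a received byte stream back into the dicts it contains."""
    messages, offset = [], 0
    while offset + 4 <= len(data):
        (length,) = struct.unpack_from(">I", data, offset)
        offset += 4
        messages.append(json.loads(data[offset:offset + length]))
        offset += length
    return messages
```

The client would call `sock.sendall(encode_message({...}))` for each intermediate result; the server accumulates received bytes and decodes complete messages for OpenCV-side processing.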
I need to test the application I developed, which uses HTTP Live Streaming (audio only), and I would like to see how it compares with other similar apps. How can I conclude that an app uses HTTP Live Streaming by using a packet sniffer?
Look for a request for a playlist file with an .m3u8 extension and in the response look for #EXTM3U on the first line. The IETF draft should give you all the information you need to interpret the rest of the traffic.
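Those two signals can be checked mechanically once you've exported a captured request URL and response body from your sniffer. A small Python sketch:

```python
# Hedged sketch: flag a captured request/response pair as a likely HLS
# playlist fetch, using the two signals described above: an .m3u8 URL path
# and an #EXTM3U tag on the first line of the response body.
from urllib.parse import urlparse

def looks_like_hls(url: str, body: str) -> bool:
    """True if the URL path ends in .m3u8 and the body starts with #EXTM3U."""
    path = urlparse(url).path
    first_line = body.lstrip().splitlines()[0] if body.strip() else ""
    return path.endswith(".m3u8") and first_line == "#EXTM3U"
```

Note that `urlparse` is used so query strings (tokens, session IDs) after the `.m3u8` extension don't hide the match.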
Here are some values you might find useful if you want to look at the stream more closely.
I have never tried it myself, but Apple released a stream validator; all you need is the playlist URL.
Bear in mind that it's possible to use a number of different streaming technologies, and it's common practice for a lot of services to use whichever one makes the most sense at the time. If you don't simulate the correct environment, you might never get an HLS stream at all.