I want to set up a small website from which I can stream video/audio to my Android phone. Will a WAMP server stream over the LAN when both the phone and the PC are on the same network, or will the traffic still go out over the web? My upload speed is low (though high enough to stream video), so I'd prefer the server to stream over the LAN when we're on the same network and over the web when we're not. Will the server do this by itself, or should I look at another option?
Yes, it will stream through the LAN: as long as the phone uses the PC's LAN address (e.g. http://192.168.1.x/), the traffic stays local and never touches your upload bandwidth.
There is nothing special you need to set up. The only caveat is that the media won't actually stream: WAMP serves the file as a plain HTTP download, so it will only play after it has downloaded completely.
What direction should I look in for streaming Flutter camera audio/video to an RTMP media server?
We are developing a live broadcasting solution.
Our attempts so far:
I. PWA approach: the Antmedia WebRTC SDK was used to establish a WebRTC ICE-candidate connection between the mobile web client and the Antmedia server.
Problem faced: the connection seems too heavy for some phones; only one of my phones was able to stream successfully.
II. Hybrid approach: I have been searching the internet for a Cordova/Ionic SDK for live streaming to an RTMP media server, with no luck so far. I found bambuser (https://www.npmjs.com/package/cordova-plugin-bambuser), but that SDK seems tied to their own live streaming service.
Please, what do I need to know? I would also like an expert review of the idea sketched below: using the Flutter camera plugin's image-bytes access plus ffmpeg for encoding and transporting the stream to an RTMP URL such as rtmp://SERVER_NAME/LiveApp/STREAM_ID.
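To make that concrete, here is a rough sketch of the pipeline we have in mind. It's written in Python purely to show the shape of the idea (the real implementation would live in Dart/platform code), and the frame size, pixel format, and frame rate are assumptions to be replaced with whatever the camera plugin actually delivers:

    # Sketch: pipe raw camera frames into ffmpeg, which encodes to H.264
    # and pushes the result to the RTMP endpoint. Frame geometry, pixel
    # format, and fps below are assumptions, not Flutter camera defaults.
    import subprocess

    WIDTH, HEIGHT, FPS = 640, 480, 30
    RTMP_URL = "rtmp://SERVER_NAME/LiveApp/STREAM_ID"

    ffmpeg = subprocess.Popen(
        [
            "ffmpeg",
            "-f", "rawvideo",           # raw frames arrive on stdin
            "-pix_fmt", "bgra",         # matches 4-byte-per-pixel image bytes
            "-s", f"{WIDTH}x{HEIGHT}",
            "-r", str(FPS),
            "-i", "-",                  # read frames from stdin
            "-c:v", "libx264",
            "-preset", "ultrafast",     # favor latency over compression
            "-tune", "zerolatency",
            "-pix_fmt", "yuv420p",      # widely compatible output format
            "-f", "flv",                # RTMP transports FLV
            RTMP_URL,
        ],
        stdin=subprocess.PIPE,
    )

    def push_frame(image_bytes: bytes) -> None:
        # image_bytes: one WIDTH * HEIGHT * 4 BGRA frame from the camera callback
        ffmpeg.stdin.write(image_bytes)

The main open question for us is whether the per-frame copy out of the camera callback and into the encoder can keep up on mid-range phones.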
I have a Raspberry Pi that does WebRTC via UV4L. It is awesome! I want to record the video from the camera on a server. It's your basic surveillance-camera setup: a central Linux server with lots of storage space, remote IP cameras, etc. I've read dozens of pages and still can't figure it out. I tried all this Kurento mumbo jumbo, but it's all wretch and no vomit: it never gets there. What's the command to grab the RPi video and dump it to disk? Please help!
UV4L already supports audio+video recording on the server (as well as on the client) if you use it with Janus WebRTC. Have a look at the bottom of the Video Conference DEMO OS page for more details. At the moment you will have to use the REST API to log into a Janus room and turn the recording on or off. The REST API is ideal if you want to control UV4L from a custom application, but there is also a panel that allows you to dynamically send REST requests to the UV4L server.
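If it helps, driving that REST API from a script would look something like the sketch below. The endpoint path and JSON body here are placeholders, not UV4L's real routes; check the UV4L documentation for the actual API:

    # Placeholder sketch only: the route and payload are illustrative,
    # NOT UV4L's real REST API -- consult the UV4L docs for actual routes.
    import requests

    UV4L_SERVER = "http://raspberrypi.local:8090"  # assumed host/port

    def set_recording(enabled: bool) -> None:
        """Login/control flow reduced to a single hypothetical call."""
        resp = requests.post(
            UV4L_SERVER + "/api/janus/recording",  # placeholder endpoint
            json={"record": enabled},              # placeholder payload
            timeout=5,
        )
        resp.raise_for_status()

    set_recording(True)   # start writing audio+video to disk on the server
    set_recording(False)  # stop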
I am trying to establish multimedia streaming among multiple Android devices, where all of them play a synchronized video. A good example of my idea is Samsung's video Group Play. Thanks in advance.
Google how to implement an RTSP server on the host device; all of the client devices can then connect to that RTSP stream. As for synchronization, either use multicast or serve the same payload with adjusted timestamps to all the clients.
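To make the timestamp idea concrete, here is a minimal sketch; the 90 kHz clock is the standard RTP video clock, but the class and names are illustrative, not from any particular library:

    import time

    RTP_CLOCK_RATE = 90000  # standard 90 kHz RTP clock for video

    class SyncedTimeline:
        """Map one shared wall-clock epoch onto RTP timestamps so every
        client stamps (and therefore renders) frames on the same timeline."""

        def __init__(self, epoch: float):
            self.epoch = epoch  # agreed stream start, handed out by the server

        def rtp_timestamp(self, frame_time: float) -> int:
            # Same frame time -> same timestamp on every client,
            # no matter when that client joined the session.
            return int((frame_time - self.epoch) * RTP_CLOCK_RATE) & 0xFFFFFFFF

    # Every client receives the same epoch when it joins:
    timeline = SyncedTimeline(epoch=time.time())
    print(timeline.rtp_timestamp(time.time() + 2.0))  # a frame two seconds in

With multicast you sidestep this entirely, since every device receives the exact same packets at (nearly) the same moment.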
My partner and I have a webcam site that basically runs the old-school method: every 0.5 seconds, JavaScript reloads the image in the browser from the webcam. However, we want to upgrade to a streaming media server to get higher-quality video, and possibly audio. We aren't tied to any one specific file format or server type; right now we are leaning towards Slicehost (as scalability is important) and installing Darwin Streaming Server or Wowza.
This is meant to be a live stream. Does anyone have suggestions for hosts or server software?
Wowza is great and they offer an Amazon EC2 setup with micro pricing to make it affordable.
You can always go with Flash Media Server, but that is expensive.
Red5 is free and open source.
UPDATE
Based on your comment, you can also use UStream. It is free and will hook into Flash Encoder, which is also free.
Do you absolutely have to stand up your own streaming server? Services like LiveStream can do what you're talking about for far less than setting up your own hardware.
I have seen plenty of articles and SO questions about streaming TO an iPhone app, but my question is the reverse: streaming FROM an iPhone app.
I have audio content in an iPhone app that I want to stream to a browser. The idea is that the browser connects to a server running on the iPhone, the server hands the audio to the browser, and the browser plays the endless stream.
I already have seamlessly looping content on the phone with AudioQueue, and I already know how to set up a server running on the phone with CocoaHTTPServer. Is there a third piece that can make the AudioQueue (or a FileStream) stream to a browser connected to the internal iPhone server?
Anybody have any thoughts on how to implement this?
Well, there are a few good open-source projects to dissect, port, or imitate for this. I would suggest looking at how Icecast and streamTranscoderv3 operate together: the latter takes an audio source and sends it to an Icecast server as a source client. Port parts of both and run them locally on the iPhone and you'd have a solution. I imagine Bonjour could be used so that other systems on the LAN could find and listen to the iPhone.
Or send the streamTranscoder output to an Icecast server elsewhere and make it available to the world.
Unfortunately, neither project is over-engineered: the code isn't super modular, but it is comprehensible and modestly cross-platform.
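To give a flavor of the piece streamTranscoder implements, here is a minimal sketch of an Icecast source client. The host, mount point, and password are made up, and real code would pace its writes to the stream bitrate and handle errors:

    # Sketch of an Icecast source client: authenticate against a mount
    # point, then everything written on the socket is the live stream.
    import base64
    import socket

    HOST, PORT = "icecast.example.com", 8000   # assumed server
    MOUNT, PASSWORD = "/iphone.mp3", "hackme"  # assumed mount + password

    creds = base64.b64encode(("source:" + PASSWORD).encode()).decode()
    sock = socket.create_connection((HOST, PORT))
    request = (
        f"SOURCE {MOUNT} HTTP/1.0\r\n"
        f"Authorization: Basic {creds}\r\n"
        "Content-Type: audio/mpeg\r\n"
        "\r\n"
    )
    sock.sendall(request.encode())
    assert b"200" in sock.recv(4096)  # server accepted the source connection

    # Listeners hitting http://icecast.example.com:8000/iphone.mp3 now hear
    # whatever is written here; a file stands in for the AudioQueue output.
    with open("loop.mp3", "rb") as f:
        while chunk := f.read(4096):
            sock.sendall(chunk)

On the phone you would replace the file read with the buffers coming out of your AudioQueue (encoded to MP3 or AAC first).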