How is webcam streaming usually done? - streaming

I am currently working on a small project, and one of its components is capturing the webcam stream on one side and sending it to the other (client --> server). Right now I have the stream from the server as bytes, and as far as I know I should transfer these bytes using UDP. My question is how to do that:
Should the bytes be wrapped in a file and then transferred?
Should I transfer the raw bytes?
Should I create a buffer on the client side and display the stream once the buffer is full?
In short, I would like to know how to implement the transfer of the stream from the server to the client (I only need one direction).

You can stream a webcam to a client in several ways.
Use Windows Media Server or Flash Media Server. Push your webcam feed to the server with Windows Media Encoder or Flash Media Encoder, and use the server's live link for playback on the client (Windows/web).
Use Windows Media Encoder to stream your webcam to anyone without a server involved. When your encoder starts, you will get a URL to view your stream, which you can use for playback on the client (Windows/web).
Use a third-party streaming service, which gives you a publishing point to publish your webcam stream and a link for playback on the client (Windows/web). (Check with Brightcove or Mogulus by Livestream.)
Hope this helps.
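Since the question asks whether raw bytes over UDP are enough, here is a minimal sketch of that approach using Swift's Network framework (any socket API works the same way). The address, port, and chunk size are assumptions; a real implementation would add frame/sequence numbers so the client can reassemble each frame and display it as soon as it is complete, dropping incomplete ones.

```swift
import Foundation
import Network

// Example receiver address and port (assumptions, not from the question).
let connection = NWConnection(host: "192.168.0.10", port: 5005, using: .udp)
connection.start(queue: .global())

func send(frame: Data) {
    // Each frame is assumed to be already compressed (e.g. JPEG).
    // UDP datagrams should stay under the typical ~1500-byte MTU,
    // so split the frame into chunks; a real protocol would prepend
    // a frame number and chunk index for reassembly on the client.
    let chunkSize = 1200
    var offset = 0
    while offset < frame.count {
        let end = min(offset + chunkSize, frame.count)
        connection.send(
            content: frame.subdata(in: offset..<end),
            completion: .contentProcessed { _ in }
        )
        offset = end
    }
}
```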

Related

Get the first frame of MJPEG files using JMF in a client-server application

I am using the JMF library with the NetBeans 8.0.2 IDE for transporting media across the network. I want to make a little thumbnail for each video found on the current path before showing the whole video. I found a way to get the number of files of a distinct file type; my file type is MJPEG, so I now have the file paths and names. How do I get only the first frame from a video without showing the whole video? Please help! I am using the RTP and RTSP protocols to send packets between the server and the client, and the Socket class for building the socket.

iOS - Develop iPhone app to stream camera video to a computer?

I'm looking for a way to create an app that will allow captured camera video to be streamed to a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed to their computer. Something kind of like a one-way FaceTime, except the receiver is on a computer. Also, I can't just use an existing app, as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AV Foundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create such an app frequently turns up existing apps that do the task, but not how to create one.
Can anyone give me a pointer to information on how to stream the video capture from the iPhone? Thank you very much.
You can use "Wowza media Server" for Streaming purpose
For wowza media server doenload :
Wowza Download
After installing wowza Now you need to set up live setting in wowza for that purpose you need:
Setting Up Live Application
For iOS side there is library is useful for video streaming using RTMP connection
You can get Library at
RTMP library for Streaming
Library example
RTMP library for Streaming example
In this good example of Streaming from iOS side
I had success with the ANGL lib and Wowza Media Server. It gives a smooth RTMP stream.
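For orientation, the publish flow with such a library typically looks like the sketch below. The RTMPPublishing protocol and every name in it are hypothetical stand-ins, not the linked library's actual API; real libraries differ in naming but follow the same connect/publish/append pattern.

```swift
import Foundation

// Hypothetical interface, for illustration only -- the linked RTMP
// library has its own types; this just sketches the typical flow.
protocol RTMPPublishing {
    func connect(to url: String)        // e.g. "rtmp://host:1935/live"
    func publish(streamName: String)    // the Wowza application's stream name
    func append(_ encodedFrame: Data)   // encoded A/V data from the camera
}

func startBroadcast(with publisher: RTMPPublishing) {
    publisher.connect(to: "rtmp://your-wowza-host:1935/live")  // example URL
    publisher.publish(streamName: "cameraFeed")                // example name
    // camera frames are then fed to publisher.append(_:) as they arrive
}
```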

How to stream high-quality video and audio from an iPhone to a remote server over the Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
Sample code for AVFoundation capturing to file chunks (~10 seconds) that are ready for compression?
Sample code for compressing the video and audio for transmission across the Internet?
ffmpeg?
Sample code for HTTP Live Streaming sending files from the iPhone to an Internet server?
My goal is to use the iPhone as a high quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer at the iPhone.
Thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server->client streaming protocol. There is no notion of push as far as I know; you would have to create a custom server to accept pushing of segmented video streams (which does not really make a lot of sense given the way HLS is designed; see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
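A minimal sketch of that AVAssetWriter configuration follows; the output path, dimensions, and sample rate are assumptions, and the sample buffers would come from an AVCaptureSession delegate.

```swift
import AVFoundation

// MP4 container with H.264 video and AAC audio, as the answer describes.
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + "segment.mp4")
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
    AVVideoCodecKey: AVVideoCodecType.h264,  // H.264 video
    AVVideoWidthKey: 1280,                   // example dimensions
    AVVideoHeightKey: 720,
])
videoInput.expectsMediaDataInRealTime = true
writer.add(videoInput)

let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
    AVFormatIDKey: kAudioFormatMPEG4AAC,     // AAC audio
    AVNumberOfChannelsKey: 1,
    AVSampleRateKey: 44_100,
])
audioInput.expectsMediaDataInRealTime = true
writer.add(audioInput)

writer.startWriting()
// Call writer.startSession(atSourceTime:) with the first buffer's timestamp,
// append CMSampleBuffers to each input, then finishWriting(completionHandler:)
// after ~10 seconds and start a new writer for the next chunk.
```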
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can set the video and audio codecs to kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC, and specify frame sizes and frame rates, in the capture format description. That will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP server such as Live555 Media.
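A minimal capture-session sketch for the first step, assuming camera permission is already granted (the preset and queue name are assumptions). Note that AVCaptureVideoDataOutput hands back uncompressed frames, so the H.264/AAC compression happens in whatever encoder the buffers are fed to (AVAssetWriter as above, or VideoToolbox for lower-level control) before RTP packetization.

```swift
import AVFoundation

let session = AVCaptureSession()
session.sessionPreset = .hd1280x720  // example frame size

// Add the default camera as input.
if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

// Deliver raw frames to a delegate for encoding.
let videoOutput = AVCaptureVideoDataOutput()
// `frameHandler` is a hypothetical object conforming to
// AVCaptureVideoDataOutputSampleBufferDelegate:
// videoOutput.setSampleBufferDelegate(frameHandler,
//                                     queue: DispatchQueue(label: "capture"))
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

session.startRunning()
```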

Flash Media Server live streaming with multiple video files

I am using Flash Media Server 4.5, and I read the tutorial. If I want to stream a live feed, I may need to use Flash Media Live Encoder, but what I found is that in the encoder I have to set everything up manually, and it only supports camera devices.
In my case, however, I have multiple video files continually received from another program; my goal is to use Flash Media Server to perform a live broadcast with these video files one by one.
That means that clients watching the stream will not notice that the server is playing mov1, then mov2, then mov4, then mov5... and so on.
Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files?
Can FMS achieve these purposes? Any tutorial would be very helpful!
Edit for Open Bounty
I basically want to deliver a live stream of video where a list of videos is the source. I am currently using Flash Media Server with the CloudFront CDN to deliver content. So if I have video1, video2, and video3, I want to play them back-to-back as a live stream (so no skipping ahead in the video); is it possible to do this? The bounty goes to a clever workaround. Think of this as a television channel.
I have been working on live streaming technologies for the past year and a half. There is no option in Flash Media Live Encoder for any file encoding.
1. To encode your files, you can use a DVD player or some other device that supports USB playback, and use the DVD player's output to broadcast with Flash Media Live Encoder.
2. The other option is to set up Windows Media Encoder, which supports file encoding (no DVD player needed), but it only works with Windows Media Services.
At present I webcast video files live in this way for my company: http://www.malar.tv/live.php

Mac/iPhone: Streaming a video file to the iPhone

I have an HTTP streaming link which gives me an .flv streaming feed. I want to convert that and access it in my iPhone program. How can I do that? I want a desktop application like VLC that takes this streaming feed URL, converts it to an iPhone-supported format, and streams it again to the iPhone. I tried VLC with H.264 and MPEG-1 audio, but it doesn't seem to produce the supported format, so the iPhone program doesn't play the video.
Could someone please guide me on how to set up a desktop application that can stream an iPhone-supported file?
Thanks in advance.
I think even the great VLC can't convert FLV on the fly... (or even do anything with FLV). As far as streaming goes, you'll probably be limited to the local network (Wi-Fi). I'd start with the simple way: create an ad-hoc file server on the desktop, then use AVPlayer's initWithURL: method to load that video.
On the desktop, you could query the computer's IP address and ask the user to enter that URL (along with an optional port assignment and file component, like http://192.168.0.2:2234/streamingVideo.mp4) on the iDevice, then convert it to an NSURL.
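The iOS side of that approach is only a few lines; a Swift sketch using the example URL from the answer (AVPlayer(url:) is the modern equivalent of initWithURL:):

```swift
import AVFoundation

// Point AVPlayer at the file the desktop is serving; the address, port,
// and file name are the example values from the answer above.
let url = URL(string: "http://192.168.0.2:2234/streamingVideo.mp4")!
let player = AVPlayer(url: url)
player.play()
// To display the video, attach the player to an AVPlayerLayer
// or present it with an AVPlayerViewController.
```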
What exactly is the HTTP streaming link? This matters a lot, as in order to stream to the iPhone you need to use HTTP Live Streaming, which requires some different bits than a typical Flash media server (more properly, an RTMP server). Typically you need two different streaming architectures or some expensive boxes.
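One concrete way to bridge those two architectures on the desktop is to let ffmpeg repackage the FLV feed into HLS segments the iPhone can play natively. A hedged sketch, assuming ffmpeg is installed at the given path and using example input/output locations:

```swift
import Foundation

let ffmpeg = Process()
ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
ffmpeg.arguments = [
    "-i", "http://example.com/feed.flv",   // the FLV feed from the question
    "-c:v", "libx264",                     // re-encode video to H.264
    "-c:a", "aac",                         // re-encode audio to AAC
    "-f", "hls",                           // emit an HLS playlist + segments
    "-hls_time", "10",                     // ~10-second segments
    "/Library/WebServer/Documents/stream/playlist.m3u8"  // example output
]
try ffmpeg.run()
ffmpeg.waitUntilExit()
// The iPhone then plays http://<desktop-ip>/stream/playlist.m3u8 with AVPlayer.
```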