video streaming over LTE simulation - lte

Hello, I want to simulate video streaming over LTE. I have tried SimuLTE but had problems installing ns-3, and the main problem is knowing whether it can support video streaming. Then I tried LTE-Sim, but I could not download the Cygwin packages because of the poor internet speed here in Algeria.
Can anyone advise me on the best (one that supports video streaming) and easiest (I am not a programmer) simulator to use?

SimuLTE supports video streaming.
After a successful installation of OMNeT++ and INET, you will be able to simulate video streaming with SimuLTE.
When you create your project, you should indicate that the UDP app you are using is the video streaming one.
Your .ini file should contain the following:
**.cli[*].udpApp[*].typename = "UDPVideoStreamCli"      # UDP application on the client side
**.server.udpApp[*].typename = "UDPVideoStreamSvr"      # UDP application on the server side
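The typename lines above only select the applications; depending on your INET version, you will also want to set the ports and the traffic parameters. The following is a sketch based on the UDPVideoStream parameters used in the INET/SimuLTE example configurations (names and values may differ slightly in your version, so check the NED files of your installation):
**.cli[*].udpApp[*].serverAddress = "server"    # module name of the streaming server
**.cli[*].udpApp[*].serverPort = 3088
**.cli[*].udpApp[*].localPort = 9999
**.server.udpApp[*].localPort = 3088
**.server.udpApp[*].videoSize = 10MiB           # total amount of video data to send
**.server.udpApp[*].packetLen = 1000B           # size of each UDP packet
**.server.udpApp[*].sendInterval = 20ms         # time between packets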

Related

iOS - Develop iPhone app to stream camera video to a computer?

I'm looking for a way to create an app that will allow captured camera video to be streamed to a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed to their computer. Something kind of like a one-way FaceTime, except the receiver is on a computer. Also, I can't just use an existing app, as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AV Foundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create the apps on the iPhone frequently results in existing apps that do the task, but not how to create the app.
Can anyone give me a pointer to the information on how to stream the video capture from the iPhone? Thank you much.
You can use Wowza Media Server for streaming purposes.
To download Wowza Media Server:
Wowza Download
After installing Wowza, you need to set up a live application; for that, see:
Setting Up Live Application
On the iOS side, there is a library that is useful for video streaming over an RTMP connection.
You can get the library at:
RTMP library for Streaming
Library example:
RTMP library for Streaming example
This is a good example of streaming from the iOS side.
I had success with the ANGL lib and Wowza Media Server. It gives a smooth RTMP stream.
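For the capture side mentioned in the question, AVFoundation's AVCaptureSession is the usual starting point. Here is a minimal Swift sketch, under the assumption that each frame is then handed to whichever RTMP library you choose; the class name and the publishing step are illustrative and not part of any specific library:
import AVFoundation
import Foundation

// Grabs frames from the back camera and delivers them to a delegate callback,
// where they could be encoded and pushed to an RTMP library such as the one
// linked above (that publishing step is library-specific and omitted here).
final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    enum CaptureError: Error { case cameraUnavailable }

    private let session = AVCaptureSession()
    private let outputQueue = DispatchQueue(label: "camera.capture.queue")

    func start() throws {
        session.beginConfiguration()
        session.sessionPreset = .medium

        // Back camera as the video input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            throw CaptureError.cameraUnavailable
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // Uncompressed frames arrive in captureOutput(_:didOutput:from:).
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: outputQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Encode and publish each frame here, e.g. hand it to the RTMP library.
    }
}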

Flash Media Server live streaming with multiple video files

I am using Flash Media Server 4.5, and I read in the tutorial that if I want to stream a live feed, I may need to use Flash Media Live Encoder. But what I found is that in the encoder I have to set everything up manually, and it only supports camera devices.
In my case, however, I keep receiving video files from another program, and my goal is to use Flash Media Server to broadcast these video files live, one by one.
That means that when clients watch the stream, they will not notice that the server is playing mov1, then mov2, then mov4, then mov5, and so on.
Also, can FMS dynamically create a new streaming session (invoked by code), so that when client A uploads some video files to the server, FMS opens a new streaming session that streams only client A's video files?
Can FMS achieve this? Any tutorial would be very helpful!
Edit for Open Bounty
I basically want to deliver a live video stream where a list of videos is the source. I am currently using Flash Media Server with the CloudFront CDN to deliver content. So if I have video1, video2, and video3, I want to play them back to back as a live stream (so there is no skipping ahead in the video). Is it possible to do this? The bounty goes to a clever workaround. Think of this as a television channel.
I have been working with live streaming technologies for the past year and a half. There is no option in Flash Media Live Encoder for file encoding.
1. To encode your files, you can use a DVD player or some other device that supports USB playback, and use the DVD player's output as the broadcast source in Flash Media Live Encoder.
2. Another setup is Windows Media Encoder, which supports file encoding (no DVD player needed), but it works only with Windows Media Services.
At present I webcast video files live this way for my company: http://www.malar.tv/live.php

Streaming video, cloud servers, and videojs

I'm interested in setting up a streaming video server (perhaps on a CloudFront server) with Video.js. I understand that Flash video can be streamed; however, is it possible to stream video using Video.js and a different codec (like H.264)? I tried looking through the Video.js documentation and forums but did not find any additional info.
Video.js has limited support for RTMP streaming in Flash, but hopefully more in the next few months.
HTTP Live Streaming (HLS) is the most supported streaming format for HTML5 (iOS, Safari, latest Android). Video.js can support that on the devices that support HLS natively.
I think you would have to transcode the H.264 file on the fly to get the effect you want. Subsonic is a program which will read your file structure, display your videos and music in a web UI, transcode the audio/video, and stream it, but it uses JW Player, not Video.js.
However, it is opensource, so if you want to try to modify that, I'm sure it would be possible.

Mac/iPhone: Streaming video file to iPhone

I have an HTTP streaming link which gives me an .flv streaming feed. I want to convert that and access it in my iPhone program. How can I do that? I want to have desktop software like VLC, feed it this streaming URL, convert it to an iPhone-supported format, and stream it again to the iPhone. I tried VLC with H.264 and MPEG-1 audio, but it seems it doesn't produce the supported format, so the iPhone program doesn't play the video.
Could someone please guide me on how to set up desktop software which can stream an iPhone-supported file?
Thanks in advance.
I think even the great VLC can't convert FLV on the fly (or even do anything with FLV). As far as streaming goes, you'll probably be limited to the local network (Wi-Fi). I'd start with the simple way: create an ad-hoc file server on the desktop, then use AVPlayer's initWithURL method to find that video.
On the desktop, you could query the IP address of the computer and ask the user to enter that URL (along with an optional port assignment and file component, like http://192.168.0.2:2234/streamingVideo.mp4) on the iDevice, then convert it to an NSURL.
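In modern Swift, initWithURL corresponds to AVPlayer(url:). Here is a minimal player-side sketch, assuming the desktop serves an iPhone-friendly file at the example address from the answer above:
import AVFoundation
import AVKit
import UIKit

// Plays the file exposed by the ad-hoc desktop file server. The URL below is
// just the example address from the answer; replace it with your own.
final class StreamViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        guard let url = URL(string: "http://192.168.0.2:2234/streamingVideo.mp4") else { return }

        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: url)

        // Embed the standard playback UI and start playback.
        addChild(controller)
        controller.view.frame = view.bounds
        view.addSubview(controller.view)
        controller.didMove(toParent: self)
        controller.player?.play()
    }
}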
What exactly is the HTTP streaming link? This matters a lot, as in order to stream to the iPhone you need to use HTTP Live Streaming, which requires some different bits than a typical Flash Media, or more properly RTMP, server. Typically you need two different streaming architectures or some expensive boxes.

Is there a way to test HTTP Live Streaming via an iSight camera?

I'm working on an iPhone app that will use HTTP Live Streaming. Using Apple's provided tools (particularly mediafilesegmenter), I'm able to successfully segment and serve an archived video. Now I want to test the live streaming side. I don't own any sort of camcorder; I just have the iSight built into my Mac. Is there a way to leverage this camera to test out live streaming? Run the iSight from the command line, maybe? If so, I need a port number for mediastreamsegmenter.
I haven't tried this out on an iPhone but I would imagine it would work:
http://autonome.wordpress.com/2009/05/31/streaming-your-isight-camera-to-the-web-with-the-video-tag/
Note that the post says you need a nightly build of VLC; this is now out of date. As long as you're using 1.0, you should be fine.