Block direct access to RTSP link - streaming

I'm planning to build a streaming server using Darwin Streaming Server and a streaming player client based on the VLC library. My goal is that the video on the Darwin Streaming Server can only be accessed through my client; I don't want anyone to access my RTSP link without using it, because if I get hold of any RTSP link such as rtsp://localhost/myvideo.mp4 I can play it directly in VLC player. Is it possible to block direct access to the RTSP link (i.e. access that does not go through the client I build) by configuring Darwin Streaming Server?

You can use an authentication scheme to only allow authenticated users to access content at a particular URL.
Essentially this means that the web application will only send responses to clients it has verified are who they say they are - otherwise it will send an error message or simply not reply.
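As a rough illustration of the idea (a generic sketch in Python, not something Darwin Streaming Server provides out of the box), your backend could hand an authenticated client a short-lived, signed token and refuse to reveal or relay the stream for requests without a valid one. The SECRET_KEY, token layout and lifetime below are assumptions made for the example.

```python
# Sketch only: issue and verify short-lived, HMAC-signed stream tokens.
# SECRET_KEY, the token layout and the 300 s lifetime are illustrative assumptions.
import hashlib
import hmac
import time

SECRET_KEY = b"change-me"          # known only to your backend
TOKEN_LIFETIME_SECONDS = 300       # how long a handed-out token stays valid

def issue_token(stream_path: str) -> str:
    """Create 'expiry:signature' for a stream path, given to an authenticated client."""
    expiry = str(int(time.time()) + TOKEN_LIFETIME_SECONDS)
    msg = f"{stream_path}|{expiry}".encode()
    sig = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return f"{expiry}:{sig}"

def verify_token(stream_path: str, token: str) -> bool:
    """Reject expired or forged tokens before serving (or redirecting to) the stream."""
    try:
        expiry, sig = token.split(":", 1)
    except ValueError:
        return False
    if int(expiry) < time.time():
        return False
    msg = f"{stream_path}|{expiry}".encode()
    expected = hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

if __name__ == "__main__":
    token = issue_token("/myvideo.mp4")
    print(verify_token("/myvideo.mp4", token))     # True
    print(verify_token("/othervideo.mp4", token))  # False
```

The component that checks the token has to sit in front of the streaming server, since anyone who already knows the raw RTSP URL can bypass a check that sits beside it.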
There is an OWASP authentication cheat sheet here, which is kept up to date and is a good starting place to get an overview:
https://www.owasp.org/index.php/Authentication_Cheat_Sheet
OWASP is an open industry organisation focusing on application security; they describe themselves as:
OWASP is an open community dedicated to enabling organizations to conceive, develop, acquire, operate, and maintain applications that can be trusted
Note that this will not stop authorised users from copying and potentially re-distributing the content. For that, the typical defence is to encrypt the content and either build your own secure player and secure key exchange solution, or use one of the standard DRM solutions, which now cover nearly all devices.

HTML5 / JavaScript client code to record from the microphone and send it to an IceCast / ShoutCast server? Any solution without desktop software?

I'm trying to help the people behind an open radio station: radioqk.org. However, I'm quite new to the topic of streaming and radio servers. I'm surprised that everything I have found is about desktop software clients (e.g. Sam broadcaster, Butt, Radittcast, DarkSnow...), and they are confusing to configure. So we are trying to embed the broadcasting client in their website, to make it easier to stream from any part of the world to any streaming server (e.g. giss.tv, caster.fm, listen2myradio.com...).
I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
However, if I have understood correctly, it is possible with liquidsoap.fm, because its server supports the webcast.js protocol, using the following code: https://github.com/webcast/webcaster
On the other hand, I have searched for PHP code able to record from the microphone and store the audio on the server. Or maybe this is about HTML5 and its new getUserMedia() function? It seems it was difficult a few months ago, but now it is possible, so:
Is there any live-streaming service with the client integrated, so it can record from the user's computer microphone / sound card? I mean, is there a service similar to giss.tv that is able to record from the user's computer microphone / sound card?
If I'm right, IceCast is the most common open-source implementation of radio streaming. Is there any implementation of IceCast able to record from the user's computer microphone / sound card?
By the way, the idea is to integrate this into a WordPress site. That's why I have based the search on PHP (I have not found a WordPress plugin able to solve this problem). However, it could be done in another language / on another server and embedded into WordPress afterwards.
Finally, a workaround could be the approach from an article about adding to the website a hyperlink to a Java-coded VNC viewer, to take a desktop application to the web in 15 minutes. On the VNC server side would run any of the desktop applications we have talked about above.
Any light on this topic? I'm quite confused about which path I should take...
I have read that it's not possible at the moment, because there is no way to make a streaming HTTP PUT request.
That's correct. In the very near future we'll have Streams support in the Fetch API, which gets around this issue. In the meantime, it isn't possible directly.
As I mentioned in the post you linked to, you can use a binary WebSocket connection. That's what Liquidsoap's webcast.js uses: a binary WebSocket, and a server that supports it. Liquidsoap supports its own protocol on that socket, so you can use it as the receiving end and then stream on to a server like Icecast.
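To make that architecture concrete, here is a minimal sketch in Python (using the third-party websockets package and an ffmpeg binary on the relay host; the Icecast URL, mount point and source password are placeholders). It is not the webcast.js protocol itself, just the "binary WebSocket in, stream out to Icecast" shape described above.

```python
# Sketch: accept binary audio chunks over a WebSocket and pipe them to ffmpeg,
# which forwards the stream to an Icecast mount. Hostnames, the mount point and
# the source password are placeholders; webcast.js adds its own framing on top.
import asyncio
import subprocess

import websockets  # third-party: pip install websockets

ICECAST_URL = "icecast://source:hackme@icecast.example.com:8000/live.ogg"

async def handle_client(websocket):
    # Note: older versions of the websockets package also pass a `path` argument here.
    # One ffmpeg per connected broadcaster: read audio from stdin, re-encode as
    # Ogg/Vorbis and push it to the Icecast server.
    ffmpeg = subprocess.Popen(
        ["ffmpeg", "-i", "pipe:0", "-c:a", "libvorbis",
         "-f", "ogg", "-content_type", "application/ogg", ICECAST_URL],
        stdin=subprocess.PIPE,
    )
    try:
        async for message in websocket:
            if isinstance(message, bytes):   # ignore text frames in this sketch
                ffmpeg.stdin.write(message)
    finally:
        ffmpeg.stdin.close()
        ffmpeg.wait()

async def main():
    async with websockets.serve(handle_client, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```

The relay and the Icecast server can live on different machines, which is exactly the encoder/server separation discussed below.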
Is there any live-streaming service with the client integrated so it can record from the user's computer microphone / sound card?
I run the AudioPump Web Encoder, which acts as a go-between for web based clients and your servers. The web-based client can be configured in the URL, so the users don't need to do anything. This might meet your needs.
If I'm right, IceCast is the most common opensource implementation of radio streaming. Is there any implementation of IceCast able to record from the user's computer microphone / sound card?
Yes, Icecast is a popular open-source server. But the server itself cannot, and should not, be what records the audio. You wouldn't want to run the server in the same place you're doing the encoding; otherwise you'd need all your bandwidth there. The encoder is separate from the server so that you can encode in one place, upload one stream, and have the server distribute it to thousands of listeners.

Socket connection between Rails and iPhone native app

I have an iPhone app with Rails serving as the backend server.
Now I need to implement chat functionality using socket connections.
A lot of examples show you how to implement chat using sockets in the browser.
What I need here is how to implement an application where the socket server lives in the Rails app, and the client in the iPhone app listens to the channel I give it.
I tried using Faye (the examples given only show how to implement a client in the browser) and the fayeObjC library on the iPhone to create the client, but I am not able to listen to the channel from this library. I know I must be implementing it wrong.
I'll share my code here as well, but first I need to know: is there a better solution than this?
I would also appreciate links to examples where the socket server is in Rails and the clients are iPhone apps.
I appreciate any help, and mostly need a pointer in the right direction to implement this.
Update
I tried the Faye combination again and it worked, although I am still looking for more solutions.
You can read about TCP sockets:
http://www.raywenderlich.com/3932/how-to-create-a-socket-based-iphone-app-and-server
Chat Application Using Ruby
http://quickblox.com/modules/chat/
http://caydenliew.com/2011/11/ios-mac-os-communication-with-asyncsocket/
http://www.macresearch.org/cocoa-scientists-part-xxix-message
The next link is a comprehensive networking guide: Using Internet Sockets.
You must keep in mind two major problems with peer-to-peer communication (chat): reachability, and how to receive new messages while your application is in the background (getting notifications).
For the latter you can use the APNs approach: an invisible notification is pushed to the iPhone indicating that a new message is ready to be read, and your app then makes a request for the unread messages (this is what apps like WhatsApp do).
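In other words, the push itself carries no chat content; it only wakes the app, which then pulls from your API. A rough Python sketch of the two pieces involved (the /messages/unread endpoint is made up for illustration):

```python
# Sketch: the shape of a "wake up, something is waiting" push payload and the
# follow-up fetch the app would make. The /messages/unread endpoint is invented.
import json
import urllib.request

# Payload the server sends through APNs: no visible alert, just a hint that new
# data is available (content-available triggers a background wake-up).
silent_payload = json.dumps({"aps": {"content-available": 1}})

def fetch_unread(base_url: str, user_token: str) -> list:
    """What the app does after being woken: pull unread messages from the backend."""
    req = urllib.request.Request(
        f"{base_url}/messages/unread",
        headers={"Authorization": f"Bearer {user_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```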
Besides TCP sockets, you could use WebSockets (they run over HTTP, so there are no firewall problems).
Best in class: Socket.IO.
Here you will find the wiki: https://github.com/learnboost/socket.io/wiki (there is also a Ruby extension listed there).
Here is an example of an iOS chat client for a socket.io & node.js backend.
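If it helps to see the "channel" idea stripped down, here is a minimal pub/sub relay over WebSockets (a Python sketch using the third-party websockets package, purely illustrative; Faye, Socket.IO and Pusher all wrap this same pattern with reconnection, history, authentication and so on). A client joins by sending a channel name, and every message it sends afterwards is relayed to the other subscribers of that channel.

```python
# Sketch: a tiny channel-based relay. First text frame = channel name to join,
# every following frame is broadcast to the other members of that channel.
import asyncio
from collections import defaultdict

import websockets  # third-party: pip install websockets

channels = defaultdict(set)  # channel name -> set of connected websockets

async def handle(websocket):
    # Note: older versions of the websockets package also pass a `path` argument.
    channel = (await websocket.recv()).strip()   # e.g. "chat-room-42"
    channels[channel].add(websocket)
    try:
        async for message in websocket:
            peers = channels[channel] - {websocket}
            await asyncio.gather(*(peer.send(message) for peer in peers))
    finally:
        channels[channel].discard(websocket)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 9000):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```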
Jabber
Another option: XMPP, which stands for eXtensible Messaging and Presence Protocol. The protocol is open-standard and oriented to message exchange (built and maintained by the Jabber community). Message exchange happens in near real time, so it is an ideal infrastructure on which to build chat-like applications. The protocol also implements a mechanism to notify presence information (whether a user is online or not) and to maintain a contact list. XMPP is a thorough protocol, which has been adopted by big companies like Google to build their instant messaging service.
Here you will find everything about developing a Jabber client for iOS (enabling users to sign in, add buddies, and send messages; how to install and configure a Jabber server, create accounts, and interact with the server from an iOS application): http://mobile.tutsplus.com/tutorials/iphone/building-a-jabber-client-for-ios-server-setup/
I know that SocketRocket by Square is a strong native Objective-C WebSocket library, but it doesn't offer the channel abstraction you seem to be looking for.
If you would consider outsourcing the WebSocket connections, then you could use a hosted service like Pusher, who I work for. You can publish messages (trigger events) on channels using the pusher-gem, and you can subscribe to channels and receive messages using one of Pusher's Objective-C libraries.
Other solutions also have Objective-C libraries, and you can find a list of them via this realtime web tech guide.

iOS Device communication

I am keen to build some apps that can communicate with other devices, the web, etc. I have played around with FTP and can get so far, but what is the best way to do this? We don't have any servers with databases, but we do have a site that we are currently uploading files to and downloading files from.
Can anyone suggest a good / better way to get the device to send and receive files?
Thanks,
Sam
If it's HTTP communication you're wanting to do, the simplest and most powerful tool is ASIHTTPRequest.
HTTP is the protocol your web browser uses to talk to web servers. If you have a site you're storing files on and downloading files from, it's almost certainly HTTP you're using to talk to it.
For iOS device-to-device communication, one can use the Bump API.
EDIT: I don't know of a generic framework for device <-> server communication, but having built applications that use the web services of providers like Yelp, Yahoo, and Google Maps, I would say the way to go is to use REST-based web services which exchange data in JSON format.
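As a small illustration of that pattern (a Python sketch using only the standard library; the example.com endpoints and JSON fields are invented), exchanging files and metadata over plain HTTP and JSON looks roughly like this:

```python
# Sketch: send a file to a web service and read back JSON metadata over HTTP.
# The URLs and JSON fields are invented for illustration.
import json
import urllib.request

BASE_URL = "https://example.com/api"

def upload_file(path: str) -> dict:
    """PUT the raw file bytes; the service answers with JSON describing the upload."""
    with open(path, "rb") as f:
        req = urllib.request.Request(
            f"{BASE_URL}/files/{path.rsplit('/', 1)[-1]}",
            data=f.read(),
            method="PUT",
            headers={"Content-Type": "application/octet-stream"},
        )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def list_files() -> list:
    """GET a JSON list of the files the service knows about."""
    with urllib.request.urlopen(f"{BASE_URL}/files") as resp:
        return json.loads(resp.read())
```

On the iOS side, a library like ASIHTTPRequest (mentioned above) or NSURLConnection plays the role of the client in this exchange.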

How can an iPhone communicate with a computer?

How do apps like Apple's "Remote" app control Mac applications and send data?
Is this done via PHP with exec() commands, or some other method? And how would I do it in C?
Also, how fast is the rate of transfer? Can I use it to send real-time data like streaming video or audio?
Thanks to anyone who cares to enlighten me on this issue :-)
Apps generally communicate using a TCP/IP-based protocol over the wireless LAN connection (the iPhone also has Bluetooth). In the case of the Remote app, the communication uses the Digital Audio Control Protocol (DACP). iTunes implements DACP, so the Remote app can control it. Other common protocols are HTTP and FTP; there are classes in the iPhone SDK to connect to both HTTP and FTP servers. There is also the Cocoa HTTP Server project, which allows the iPhone to act as an HTTP server.
iPhone apps can also use Bonjour/mDNS/Zeroconf (different names for the same technology) so that the user doesn't have to be concerned with configuring IP addresses.
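For the Bonjour part, here is a rough sketch of what service advertisement looks like from the computer side (Python with the third-party zeroconf package; the service name, type, address and port are made up). An iOS app browsing for that service type would then discover it without anyone typing an IP address.

```python
# Sketch: advertise a service over Bonjour/mDNS so clients on the LAN can find it
# without knowing the IP address. Name, type, address and port are illustrative.
import socket
import time

from zeroconf import ServiceInfo, Zeroconf  # third-party: pip install zeroconf

service = ServiceInfo(
    type_="_myremote._tcp.local.",
    name="Living Room Mac._myremote._tcp.local.",
    addresses=[socket.inet_aton("192.168.1.20")],
    port=5000,
    properties={"version": "1"},
)

zc = Zeroconf()
zc.register_service(service)   # becomes visible to Bonjour browsers on the LAN
try:
    time.sleep(60)             # keep advertising for a minute in this demo
finally:
    zc.unregister_service(service)
    zc.close()
```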
The data transfer rate of the wireless connection is fast enough to stream video.
Many of these remote applications work by installing client software on the computer and establishing a network connection. In the case of Apple's Remote app, the client software is built into the networking capabilities of iTunes; you must authenticate using your iTunes sign-in and be on the local Wi-Fi network.
The third-party app "Intelliremote" works very similarly, only it has its own client software to install and can work across a WAN with proper port forwarding enabled on your network.
I haven't encountered any real-time streaming options, as most of these utilities are designed to pass control messages and meta-information about media files.

How to make my server support APNs?

I read in a Stack Overflow post that the host needs to have port 2195 open and support push notifications under APNs. How do I make my server support APNs?
What does this line mean in the Push Notification guide?
To establish a trusted provider identity, we should
present this certificate to APNs at connection time using peer-to-peer authentication.
Do I need to make a connection to APNs through my native app? Could someone please explain this?
Pretty much any server that doesn't restrict which sockets you can open is ready to be an APNs provider. In your project code, you can just open a socket to Apple's servers (or use a library).
My understanding is that Google App Engine, and basically any traditional shared web hosting, blocks any port other than 80, so you cannot use them. However, you can look into Urban Airship, which provides a RESTful API you can use from basically any service. It might get pricey though.
In short, you need to establish an SSL connection and then send the payload in the pre-defined binary format.
(Figure: the APNs provider binary interface format: http://developer.apple.com/iphone/library/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Art/aps_provider_binary.jpg)
See this guide for more details.
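For reference, here is a bare-bones Python sketch of that legacy binary interface: open a TLS connection to the gateway with your provider certificate and write one "simple" (command 0) frame. The certificate path and device token are placeholders, and note that Apple has since replaced this binary interface with an HTTP/2 provider API.

```python
# Sketch: open a TLS connection to the APNs gateway with the provider certificate
# and send one notification in the legacy "simple" binary frame (command 0).
# Certificate path, device token and payload are placeholders.
import json
import socket
import ssl
import struct

GATEWAY = ("gateway.sandbox.push.apple.com", 2195)  # sandbox gateway of the binary API
CERT_FILE = "provider_cert_and_key.pem"             # your APNs provider certificate

def send_push(device_token_hex: str, alert_text: str) -> None:
    payload = json.dumps({"aps": {"alert": alert_text, "sound": "default"}}).encode()
    token = bytes.fromhex(device_token_hex)
    # command (1 byte) | token length (2) | token (32) | payload length (2) | payload
    frame = struct.pack(f"!BH32sH{len(payload)}s", 0, 32, token, len(payload), payload)

    context = ssl.create_default_context()
    context.load_cert_chain(CERT_FILE)   # presents the trusted provider identity
    with socket.create_connection(GATEWAY) as sock:
        with context.wrap_socket(sock, server_hostname=GATEWAY[0]) as tls:
            tls.sendall(frame)

if __name__ == "__main__":
    send_push("00" * 32, "Hello from the provider")  # placeholder 64-hex-char token
```

Libraries such as the java-apns one mentioned below wrap this framing, plus error handling and connection reuse, for you.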
I use App Engine, so I solved this by renting a very cheap server at hetzner.de. It basically serves as a proxy to my App Engine app and uses this library: https://github.com/notnoop/java-apns
That library has a one-line way to send the message, using the certificate.
Google just opened up the ability to use sockets on App Engine, so I created a sample project that you can use to send push notifications using GCM and APNs in Python. Feel free to use any of it that you like.
I've also included a sample iOS project and Android project that hook up nicely with the app engine solution.
https://github.com/GarettRogers/appengine-apns-gcm
Full disclosure: this is my project, and I'm in no way trying to promote it just because it's mine... it's simply the only available project that solves your problem at the moment. If you find anything that does a better job, please leave a comment.
You may check out java-apns-gae.
It's an open-source Java APNS library that was specifically designed to work (and be used) on Google App Engine.
https://github.com/ZsoltSafrany/java-apns-gae