Is there a way to adjust the H264 profile level in the Ozeki VoIP SIP SDK?

I'm currently developing a softphone application using Ozeki's VoIP SIP SDK. I'm facing an issue where calls initiated by the SDK using the H264 codec do not specify any profile-level-id in the SIP INVITE message. This results in some callees defaulting to extremely low profile levels such as 1.1 (42000B), which caps the video at 176x144 resolution. The same callees, when called by a softphone that does provide a profile level, return much better quality video.
Here is the part of the packet capture that shows the lack of H264 profile-level-id.
I am wondering if there are any means to provide a profile level in the SDK that is 3.1 (42001F) or higher. I have tried creating new H264 codec definitions but couldn't attach them to the softphone model. Any help or ideas are much appreciated.

I ended up manually editing the SDP message to insert a profile-level-id parameter with the desired hex value. The SDK has the ISDPMessageManipulator interface, which allows you to intercept incoming and outgoing messages.
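For anyone hitting the same thing: the edit itself is just a matter of finding the H264 rtpmap/fmtp lines in the SDP and adding the parameter. Below is a rough sketch of that string manipulation (in Swift, purely for illustration; the function and parameter names are mine, and the actual hook in the Ozeki SDK is the .NET ISDPMessageManipulator interface, whose exact signature I'm not reproducing here):

```swift
import Foundation

// Illustration only: a hypothetical helper showing where profile-level-id
// goes in the SDP text (default 42001F = baseline profile, level 3.1).
func addH264ProfileLevel(to sdp: String, profileLevelId: String = "42001F") -> String {
    var lines = sdp.components(separatedBy: "\r\n")

    // Find the payload type(s) mapped to H264, e.g. "a=rtpmap:99 H264/90000".
    let h264Payloads = lines.compactMap { line -> String? in
        guard line.hasPrefix("a=rtpmap:"), line.contains("H264/") else { return nil }
        return line.dropFirst("a=rtpmap:".count).split(separator: " ").first.map { String($0) }
    }

    for payload in h264Payloads {
        if let i = lines.firstIndex(where: { $0.hasPrefix("a=fmtp:\(payload) ") }) {
            // An fmtp line already exists: append the parameter if it's missing.
            if !lines[i].contains("profile-level-id") {
                lines[i] += ";profile-level-id=\(profileLevelId)"
            }
        } else if let r = lines.firstIndex(where: { $0.hasPrefix("a=rtpmap:\(payload) ") }) {
            // No fmtp line yet: add one right after the rtpmap line.
            lines.insert("a=fmtp:\(payload) profile-level-id=\(profileLevelId)", at: r + 1)
        }
    }
    return lines.joined(separator: "\r\n")
}
```

Called from the SDK's outgoing-message hook, this turns e.g. `a=fmtp:99 packetization-mode=1` into `a=fmtp:99 packetization-mode=1;profile-level-id=42001F`.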

Related

iOS adhoc wifi sensor data

My iPhone connects over ad hoc Wi-Fi to a Wi-Fi sensor module.
The challenge is to code an app that uses this sensor module, but I'm not sure what specific APIs to use to best architect this implementation.
I've started looking into the CocoaAsyncSocket class as it seems to be an appropriate tool for such use.
Does the user always have to connect to the ad hoc Wi-Fi device manually (through the Settings app), or can my own app handle searching for, making, and breaking the Wi-Fi connection?
I doubt iOS lets me programmatically toggle whether Wi-Fi is on or off.
Once the sensor data is being received, what container would best handle the stream?
For example, on other platforms, I coded a rotating queue buffer.
Thanks for your input.
Edit: The protocol in question is straight CSV-formatted ASCII. Not HTTP, FTP, etc. Just raw data. The app simply has to open a port on the connected IP and read/write.
Your application cannot turn Wi-Fi on/off or select a Wi-Fi network.
Without more information on what protocols this Wi-Fi sensor module speaks, it will be impossible for anyone to give more than vague recommendations. If the module can serve data over an HTTP connection, that would probably be ideal. If it requires your software to open a connection on a specific port and communicate over something other than HTTP or FTP, your job will be a bit more complicated; you'll want to look at CFNetwork and projects derived from it, such as CocoaAsyncSocket, which you mentioned. You can see another implementation of an HTTP connection over CFNetwork in ASIHTTPRequest, which may help as a reference for handling download streams, queuing operations, etc.
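Given the edit above (raw CSV lines over a TCP port), the client side boils down to: connect, accumulate bytes, split on newlines, parse fields. Here is a minimal sketch of that loop using Apple's Network framework, which postdates this discussion; the host and port are placeholders, and in the CFNetwork/CocoaAsyncSocket world the structure would be the same:

```swift
import Foundation
import Network

// Hypothetical address of the sensor module on the ad hoc network.
let connection = NWConnection(host: "192.168.1.1", port: 5000, using: .tcp)
var pending = Data()

func receiveNext() {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 4096) { data, _, isComplete, error in
        if let data = data {
            pending.append(data)
            // Each newline-terminated chunk is one CSV record from the sensor.
            while let newline = pending.range(of: Data("\n".utf8)) {
                let lineData = pending.subdata(in: pending.startIndex..<newline.lowerBound)
                pending.removeSubrange(pending.startIndex..<newline.upperBound)
                if let line = String(data: lineData, encoding: .ascii) {
                    let fields = line.trimmingCharacters(in: .whitespacesAndNewlines)
                                     .split(separator: ",")
                    print("sensor sample:", fields)
                }
            }
        }
        if error == nil && !isComplete { receiveNext() }   // keep reading
    }
}

connection.stateUpdateHandler = { state in
    if case .ready = state { receiveNext() }
}
connection.start(queue: .main)
RunLoop.main.run()   // keep the demo process alive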
As for storing the data, again it's hard to give any concrete recommendations without more information. If you want to store the data in the filesystem of the iOS device, NSData will probably meet your needs. If you need or prefer to use a queue for buffering data, there is a simple category on NSArray which provides queue semantics. The link to CHCircularBuffer in that SO article is dead, but this GitHub project appears to have it.
Edit: Here is the official version of CHDataStructures. I don't know if it's an improvement over the previous link, but it appears to be updated for the more recent iOS SDK.
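And if you'd rather not pull in a dependency for the "rotating queue buffer" mentioned in the question, a fixed-capacity ring buffer is only a few lines. A minimal sketch (element type and capacity are arbitrary):

```swift
/// A minimal fixed-capacity ring buffer: once full, the oldest sample is overwritten.
struct RingBuffer<Element> {
    private var storage: [Element?]
    private var head = 0      // index of the oldest element
    private(set) var count = 0

    init(capacity: Int) {
        storage = Array(repeating: nil, count: capacity)
    }

    mutating func append(_ element: Element) {
        let index = (head + count) % storage.count
        storage[index] = element
        if count < storage.count {
            count += 1
        } else {
            head = (head + 1) % storage.count   // overwrote the oldest element
        }
    }

    mutating func removeFirst() -> Element? {
        guard count > 0 else { return nil }
        let element = storage[head]
        storage[head] = nil
        head = (head + 1) % storage.count
        count -= 1
        return element
    }
}

// Usage: keep the last 1024 CSV lines from the sensor.
var samples = RingBuffer<String>(capacity: 1024)
samples.append("12.5,0.98,37.2")
```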
Add the SSID of your ad hoc Wi-Fi network in Settings. When you see the SSID of the network, tap the arrow and choose to connect automatically. After this, the phone will connect automatically whenever it comes in range, and the two devices can communicate using regular socket APIs.

iPhone peer-to-peer voice chat

I see that Game Kit allows you to develop games with voice chat.
I want to build a more general, peer-to-peer voice chat application that does not have to live in Game Center. So, a couple of questions:
1. What peer-to-peer systems/technologies could be used for this?
2. If I wanted to allow voice chat with a Flash client (i.e. iPhone app <--> server <--> Flash client on PC), would the options from 1 work for this?
I have some experience with RTMFP for Flash-to-Flash client chat, and no iPhone dev experience, so I just want to test out some ideas.
Maybe one idea: build using the Ribbit Platform - they have both Objective-C and Flash SDKs, but this looks more like traditional/SIP calling.
Anyway, would appreciate anything that points me in the right direction.
Thanks.
Now that Flash has access to raw microphone data, you could roll your own client and server; but since AIR for mobile currently doesn't have UDP sockets, you would be forced into weighing audio quality against lag under even tighter restrictions than usual.
You could now roll your own native extension to make this work, but I am assuming you want something that only requires coding in AS3.
Therefore, considering your restrictions, the only real bet would be to use Flash's built-in communications capabilities (e.g. RTMP).
With the above being said, there are open-source alternatives to Adobe's own array of Flash communication servers:
the Red5 server and rtmpd.
IMHO Ribbit's services are kind of pointless.

Getting started with a VOIP app for iOS or Android?

I'd like to create an app for iOS that does VoIP, presumably by interacting with a website. I could start with Android too.
Does anyone know of any tutorials, suggestions or libraries that would be of any use?
(The app would need to be rewritten for BlackBerry and Android eventually, too.)
EDIT:
Bonus: What is SIP?
These answers suggest using siphon.
SIP is the Session Initiation Protocol, a transport- and media-agnostic protocol for setting up, modifying and tearing down long-term associations between multiple parties. It's formally defined in RFC 3261.
Usually SIP is paired with the Session Description Protocol which describes the media streams the various parties wish to use. SIP uses an offer/answer model for the parties to exchange these media descriptions.
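For a concrete picture (all addresses, ports and identifiers below are placeholders), a minimal audio offer carried in the body of an INVITE looks roughly like this; the answer, typically carried in the 200 OK, mirrors the same structure with the callee's own address, port, and the codecs it accepts:

```swift
// Illustration only: a minimal SDP audio offer, shown as a Swift string
// literal for consistency with the other sketches in this thread.
let sdpOffer = """
v=0
o=alice 2890844526 2890844526 IN IP4 192.0.2.10
s=-
c=IN IP4 192.0.2.10
t=0 0
m=audio 49170 RTP/AVP 0 8
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
"""
```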
If you can possibly avoid it, don't write a SIP stack (unless it's for fun, of course). It's a LOT of work.
Consider the Twilio Client iOS VoIP SDK. It makes it dead-simple to integrate VoIP capabilities into iOS apps. No need to know anything about SIP.
The Session Initiation Protocol (SIP) is a signaling communications protocol, widely used for controlling multimedia communication sessions such as voice and video calls over Internet Protocol (IP) networks.
The main SIP request methods are:
REGISTER: Used by a UA to indicate its current IP address and the URLs for which it would like to receive calls.
INVITE: Used to establish a media session between user agents.
ACK: Confirms that the client has received a final response to an INVITE request.
CANCEL: Terminates a pending request.
BYE: Terminates a session between two users in a conference.
OPTIONS: Requests information about the capabilities of a caller, without setting up a call.
SIP responses fall into six classes, keyed by the first digit of the status code (a small classification sketch follows the list):
Provisional (1xx): Request received and being processed.
Success (2xx): The action was successfully received, understood, and accepted.
Redirection (3xx): Further action needs to be taken (typically by sender) to complete the request.
Client Error (4xx): The request contains bad syntax or cannot be fulfilled at the server.
Server Error (5xx): The server failed to fulfill an apparently valid request.
Global Failure (6xx): The request cannot be fulfilled at any server.
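As a tiny illustration of how those classes are keyed off the first digit of the status code (a sketch only, not any particular SIP stack's API):

```swift
// Illustration only: map a SIP status code to the response classes listed above.
func sipResponseClass(_ statusCode: Int) -> String {
    switch statusCode {
    case 100..<200: return "Provisional"
    case 200..<300: return "Success"
    case 300..<400: return "Redirection"
    case 400..<500: return "Client Error"
    case 500..<600: return "Server Error"
    case 600..<700: return "Global Failure"
    default:        return "Unknown"
    }
}

// e.g. sipResponseClass(180) == "Provisional" (180 Ringing),
//      sipResponseClass(486) == "Client Error" (486 Busy Here)
```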
Also, you need to check the Apple documentation on how to build a VoIP app.
For an SDK you can use paid or free SDKs; among the free ones are Siphon, Twilio, Ozeki, etc. Using these SDKs you can easily implement SIP, SDP, RTP, SRTP, RTCP, SRTCP, and so on.
Well, I'd suggest looking at a SIP library that can work on Android. There are several SIP libraries out there for Java, but it's unknown whether they work on Android.
There is a project that adds SIP/VoIP to Android:
http://sipdroid.org/
I'd check that out and see what they did to get the audio from the handset, their approach to implementing SIP, etc. You can't use that code for closed-source development because it's GPL, and they are very clear about who can use it.
SIP protocol:
http://www.cs.columbia.edu/sip/
http://www.sipforum.com/
Hope that helps you get started.

Sending data between OSX and iPhones/iPads

I am wondering how I can send data between a machine and a mobile device. I know about Game Kit and have read a bit about Bonjour (but don't know too much about it), but I would like some expert thoughts on what the best way is.
What I basically want to build is a one-way traffic application that sends data from OS X to the mobile device (iPhone, iPod touch or iPad). The data sent is either pictures, text (of a certain size, position, etc.) or video. The mobile device just has to receive this data and display it... nothing more.
My guess is that a WiFi solution would be best.
How could I best do this? Are there any tutorials that might help me putting this together?
Thanks in advance!
Best regards,
Paul Peelen
As there's no reply yet . . .
Bonjour is more focused on LAN networks, so it would restrict you to Wi-Fi.
It's also more of a service discovery standard - your Mac app would advertise the service on the LAN, and clients could see it - but your actual app communication will run over a separate TCP socket, using whatever protocol is appropriate.
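As a rough sketch of that split, using the (now legacy) NetService/NetServiceBrowser APIs - the service type, name and port below are placeholders, and the actual data transfer would still go over your own socket once the client has resolved the service:

```swift
import Foundation

// Mac side: advertise the service on the LAN (NWListener/NWBrowser are the
// modern replacements for these legacy NetService APIs).
let service = NetService(domain: "local.", type: "_mydata._tcp.", name: "MyMacApp", port: 5000)
service.publish()

// iOS side: discover services of that type; a NetServiceBrowserDelegate
// (not shown) gets netServiceBrowser(_:didFind:moreComing:) callbacks,
// resolves the address, and the app then opens its own TCP connection
// for the actual data transfer.
let browser = NetServiceBrowser()
browser.searchForServices(ofType: "_mydata._tcp.", inDomain: "local.")

RunLoop.main.run()   // keep the run loop going so callbacks can arrive
```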
This linked answer may be helpful (although you will want CFNetwork in reverse - pushing from Mac to phone)
[iPhone]: How send output stream via wireless network?
For video you are probably better off looking for higher level frameworks (i.e. the AV ones).
Without knowing the full details of what you want to do, I wonder if, rather than pushing data to the iPhone, the best thing would be to send a lightweight notification to the iPhone (AMQP, XMPP, or a similar protocol) passing a URL back to the resource on the Mac. That way you could use a standard HTTP GET for images, video, etc. on the iPhone side and throw the URL at a WebKit view to display, and on the Mac side you could then use an off-the-shelf web server (Apache, or an embedded HTTP server within your code).
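Sketching the pull side of that idea (hostname, port and path are placeholders; on the Mac you'd serve the resource with Apache or an embedded HTTP server):

```swift
import Foundation

// iPhone side of the notification-plus-pull approach: the notification carries
// a URL pointing back at the Mac's web server, and the phone just does an HTTP GET.
let url = URL(string: "http://my-mac.local:8080/images/latest.jpg")!
let task = URLSession.shared.dataTask(with: url) { data, response, error in
    guard let data = data, error == nil else {
        print("download failed:", error?.localizedDescription ?? "unknown error")
        return
    }
    print("received \(data.count) bytes")   // hand off to an image view, web view, etc.
}
task.resume()
RunLoop.main.run()   // keep the demo process alive until the request completes
```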

Streaming audio to mobile phones, what technology to use?

I'm planning on building an application where audio media is going to be streamed to the mobile phone for the user to listen.
The targets are smartphones: iPhone/BlackBerry/Android/(J2ME?).
I see that streaming on iPhone has to be done with HTTP Live Streaming, but I don't see it supported by other platforms.
Should I broadcast the streams via RTSP? HTTP? Is there any way to use a unified solution for all the different mobile platforms? If anyone has already had to go through this, help would be greatly appreciated.
One answer to the question "what technology to use?", for iPhone specifically, is Wi-Fi. I know that's not the type of question you are asking, but it's a point worth making! Many apps that support streaming over 3G have been rejected by Apple due to bandwidth usage. You may need to be prepared to sense the network connection type and limit streaming to when you have a Wi-Fi connection only.
BlackBerry works with HTTP and RTSP on OS 4.3 or later. I'm not familiar with other platforms, but I would think HTTP would be the most compatible.
Here is a PDF that lists the supported types by the major models.
http://docs.blackberry.com/en/smartphone_users/deliverables/15801/711-01774-123_Supported_Media_Types_on_BlackBerry_Smartphones.pdf
You will probably want to do RTSP, but it doesn't really matter; HTTP Live Streaming is just a protocol on the client side, I'm pretty sure. All these acronyms just describe ways of transmitting data, and if a browser can access the data for a given protocol, chances are a phone can too. It sounds like you are asking more of a server-side question, but that question is the least of your worries: you are going to have to think more along the lines of "how am I going to scale this" rather than "what protocol should I use to transmit data". Also, the unified solution for all clients would be to have a server that they all hit for data; you still need to develop separate clients for each OS.
Both Android and BlackBerry support RTSP.
Note that some BlackBerry devices only support 15 fps video, so you may need separate streams to give the best experience to your users.
iPhone, starting from iPhone OS 3.0, needs HTTP Live Streaming.
The only software solution I know of that supports all of the above is Wowza, but you still need an encoder. (I think Wowza supports RTP as input, but that needs double-checking.)
iPhone can also play non-streamed audio (progressive download). Considering all the platforms, you normally just need streams that are suitably transcoded. See e.g. https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html - the title says it's about 'HTTP Live Streaming', but a lot of it applies to simply downloading and playing streams.
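On the iOS side, playing either an HLS playlist or a progressively downloaded file ends up being the same few lines. Here's a sketch using today's AVPlayer/AVPlayerViewController (the stream URL is a placeholder; at the time of the original answers MPMoviePlayerController would have been the equivalent API):

```swift
import AVFoundation
import AVKit
import UIKit

// Sketch: play an HLS playlist (or a progressively downloaded media URL) from
// an existing view controller.
func playStream(from viewController: UIViewController) {
    let url = URL(string: "https://example.com/stream/playlist.m3u8")!
    let player = AVPlayer(url: url)
    let playerController = AVPlayerViewController()
    playerController.player = player
    viewController.present(playerController, animated: true) {
        player.play()
    }
}
```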