I know the Anymote protocol lets iOS and Android mobile devices talk to Google TV. I was wondering: can I use Fling events to pass in a remote video URL and ask Google TV to invoke the media player to play back that URL?
I am following the Google TV Pairing Protocol guidelines documented here:
http://code.google.com/tv/remote/docs/pairing.html
As suggested by a member, I am also using the proto-buf-objc project
http://code.google.com/p/metasyntactic/wiki/ProtocolBuffers
to generate my Objective-C message classes from the .proto file.
I was also able to find the "_anymote._tcp" service advertised by the Google TV Pairing Protocol server, and I got its IP address and port number after resolving the service.
I created an SSL connection as described in the Pairing Protocol
documentation for sending and receiving messages.
Then I constructed a PairingRequest, wrapped it in an OuterMessage, and used the writeToCodedOutputStream method to send it over my open NSOutputStream to the Google TV.
I also got a response back on my NSInputStream, and I am trying to parse it into an OuterMessage using [OuterMessage parseFromData:_data_received], but that throws an exception: 'InvalidProtocolBuffer', reason: ''. I am not sure what's going wrong here.
Does anyone at Google have any recommendations on what might be going wrong? Any code example or iOS library for the Pairing protocol would also be very helpful. I just want to enable AirPlay-like functionality for Google TV in my app.
Thanks,
For Remotes for Google TV, you would have to implement Anymote on iOS. It shouldn't be too difficult - protocol buffers already exist for it. Android is easier, with sample code available.
And it's fairly easy to Fling a URL.
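As for the InvalidProtocolBuffer exception: one likely cause is framing. As far as I recall, the pairing protocol's protobuf variant precedes each serialized OuterMessage on the wire with a 4-byte big-endian length prefix, so feeding the raw stream bytes (prefix included) to the parser fails. A minimal un-framing sketch, assuming that prefix and written in Swift for brevity:

    import Foundation

    // Assumption: each serialized OuterMessage on the wire is preceded
    // by a 4-byte big-endian length prefix. Strip it before parsing.
    func extractMessagePayload(from received: Data) -> Data? {
        guard received.count >= 4 else { return nil }        // wait for the full prefix
        let length = received.prefix(4).reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
        let body = received.dropFirst(4)
        guard body.count >= Int(length) else { return nil }  // wait for the full message
        return body.prefix(Int(length))
    }

    // Remember to write the same 4-byte prefix before each outgoing
    // message, and hand only the payload to the generated parser, e.g.
    //   [OuterMessage parseFromData:payload];

The same framing applies, as far as I know, to the Anymote messages you exchange after pairing.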
Related
I have a client-server iOS application. In addition, I need to write a watchOS app for it. So I have two questions:
When I send a command from the Apple Watch, do I need to connect to the server from the watch, or transfer the information to the iOS application and connect to the server from the phone?
If I have to connect to the server from the iOS application, how can I connect in background mode?
As an example, take any messenger, such as Telegram: if I reply to a message from the Apple Watch, how is the message sent to the server (via the iPhone or directly from the Apple Watch)?
For the connection to the server, I use the Starscream framework.
Maybe I don't understand something, but it seems Apple made it impossible to establish a connection in background mode.
I would be grateful if you could tell me or provide examples and articles so that I can understand what to do.
Apple Watch apps can connect directly to servers using URLSession ... no need to go via the phone.
Here is an example of using URLSession in Combine, which is the "latest and greatest" way of doing things.
Otherwise this example shows using it in a more conventional way, with the bonus of SwiftUI.
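For reference, a minimal sketch of such a direct call from the watch app using the Combine publisher (the endpoint URL and the Message model are made up for illustration):

    import Combine
    import Foundation

    // Hypothetical response model for the example endpoint below.
    struct Message: Decodable {
        let id: Int
        let text: String
    }

    let url = URL(string: "https://example.com/api/messages")!

    // dataTaskPublisher performs the request and emits (data, response) once.
    let cancellable = URLSession.shared.dataTaskPublisher(for: url)
        .map(\.data)
        .decode(type: [Message].self, decoder: JSONDecoder())
        .receive(on: DispatchQueue.main)
        .sink(receiveCompletion: { completion in
            if case .failure(let error) = completion {
                print("Request failed: \(error)")
            }
        }, receiveValue: { messages in
            print("Received \(messages.count) messages")
        })

Hold on to the cancellable for as long as the request should stay alive.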
I'm looking to initiate push communication between Android and iPhone, both ways. The data to be transferred is critical, so I cannot use ordinary push notifications because they are not prioritized.
It is important that the user doesn't "see" the communication.
I have looked at using SMS and "catching" the message before it reaches the phone's ordinary SMS application. As I understand it, this is possible on Android but not on iPhone.
Any other suggestions?
You can use Bonjour (a.k.a. Zeroconf) for this. There are tons of links for using Bonjour on the iPhone; here's a link for doing it on Android:
http://android.noisepages.com/2010/02/yes-android-can-do-zeroconfbonjour-jmdns/
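To give a flavour of the iOS side, here is a minimal publishing sketch with NetService (the service type, name, and port are made up; the Android peer would browse for the same type with jmDNS, as in the linked article):

    import Foundation

    // Advertise a TCP service on the local network. "_mychat._tcp." is a
    // made-up application-specific type; pick your own.
    let service = NetService(domain: "local.",
                             type: "_mychat._tcp.",
                             name: "MyDevice",
                             port: 6000)
    service.publish()
    // Keep a strong reference to `service` and keep the run loop alive;
    // the peer resolves the advertised host/port and then connects.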
I have written a VST/AU/RTAS synthesiser plugin for OSX and Windows that also has an iPhone equivalent. I would like to allow the two to communicate with each other over a local area network so that the iPhone app can be used to send MIDI controller data to the plugin. I plan to create a MIDI source on the iPhone and publish it as a Bonjour service so that the plugin running on OSX or Windows can find it and receive midi from it.
I have a couple of questions to ask about this:
1) Do I actually have to publish the MIDI source as a Bonjour service, or does a CoreMIDI host (running on the iPhone) automatically publish itself?
2) Are there any code examples available that show how to do this sort of thing?
I have seen the following post, but the answer only covers the client side (finding a Bonjour service), not the publishing side; it transmits MIDI via OSC; and it only covers OSX, not Windows (I know, I'm not asking much! ;) )
How to send MIDI or OSC signals to a Mac application from my iOS application?
Cheers,
John.
AFAIK you'll have to publish the service yourself. NSNetService and NSNetServiceBrowser are the classes you need. Check out the companion guide. I found this article on Cocoa for Scientists particularly helpful in getting started. Both have some decent code samples. The Bonjour Browser is useful for testing.
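For the discovery side, a small browsing sketch in Swift (on Windows the plugin would use the Bonjour SDK's dns_sd API instead). The "_apple-midi._udp." type is the registered network-MIDI session type; substitute your own if you publish a custom one:

    import Foundation

    // Browse the local domain for network-MIDI services and resolve each
    // one to obtain its host and port.
    class MIDIServiceFinder: NSObject, NetServiceBrowserDelegate {
        let browser = NetServiceBrowser()
        var found: [NetService] = []

        func start() {
            browser.delegate = self
            browser.searchForServices(ofType: "_apple-midi._udp.", inDomain: "local.")
        }

        // Called once per discovered service; moreComing hints whether to
        // wait before updating any UI.
        func netServiceBrowser(_ browser: NetServiceBrowser,
                               didFind service: NetService,
                               moreComing: Bool) {
            found.append(service)
            service.resolve(withTimeout: 5.0)
        }
    }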
The list of Bonjour service types already has apple-midi and imidi, but I think it's best to make up your own application-specific type name unless your app is plug-compatible with one of these services.
I'm trying to write a simple chat application for the iPhone (as an experiment). Is there a simple way for two devices to discover each others' IP addresses, and given the addresses is there a simple API or protocol that would let me send text messages back and forth?
I've investigated SIP (specifically Sofia and eXosip), but these tools exist as C libraries and are beyond my current ability to port them to the iPhone.
Update: I'm trying to connect two devices over the Internet (i.e. not over Bluetooth or a local wireless network, which is what GameKit does).
You're going to need a server that provides the match making service. Game Center makes this pretty easy, but your users will have to have Game Center accounts.
Alternatively, you can set up an XMPP (formerly Jabber, it's what powers Google Chat) server (I've never done this, but there are several available) and use the XMPP Framework for Cocoa. There are instructions for using it in iPhone apps here.
I'm sure there are other chat servers and client source code available as well. IRC and Mobile Colloquy come to mind.
Finally, you could write your own server using your favorite server language / framework. This isn't too hard (I've done it myself), but it's far from what I'd call simple, and I wouldn't use it for a production system.
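To illustrate why "roll your own" is doable but far from simple, here is a toy TCP echo listener using Apple's Network framework (the port number is arbitrary). Everything a chat server actually needs (identity, message routing between connections, persistence, NAT traversal) sits on top of a loop like this:

    import Foundation
    import Network

    // Echo incoming bytes back to the sender, reading until the
    // connection closes. A real chat relay would route messages between
    // connections instead of echoing.
    func serve(_ connection: NWConnection) {
        connection.receive(minimumIncompleteLength: 1, maximumLength: 65_536) { data, _, isComplete, error in
            if let data = data, !data.isEmpty {
                connection.send(content: data, completion: .contentProcessed({ _ in }))
            }
            if isComplete || error != nil {
                connection.cancel()
            } else {
                serve(connection)  // keep reading
            }
        }
    }

    let listener = try NWListener(using: .tcp, on: 9999)
    listener.newConnectionHandler = { connection in
        connection.start(queue: .main)
        serve(connection)
    }
    listener.start(queue: .main)
    dispatchMain()  // keep the process alive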
There is support for exactly this kind of ad-hoc peer-to-peer networking in GameKit. Have a look at the second half of the GameKit documentation for details:
http://developer.apple.com/library/ios/#documentation/...
NSNetService is a good option.
Take a look at WebRTC data channels. WebRTC is a newer option with native iOS support; the standard is still being finalized, but it is more flexible should the iOS app need to communicate with browser or even Android peers.
The new Apple Remote app on iPhone and iPad is pretty cool. I'm wondering: is there a public API to use?
Since the Remote app is not a server, it's not like other "control iTunes" apps or programs. I'm wondering whether there is any API or service exposed on the PC/Mac side that other apps can use to control iTunes.
Ultimately, I'd like to build my own app (on iPhone) or PC program to control iTunes.
There is no API, but the protocol used by iTunes, called DAAP, is well understood. If you don't mind dropping down to handling it directly at the network-protocol level, there are some decent resources that explain it pretty well (a small probe sketch follows the links). Take a look at:
Digital Audio Access Protocol (Wikipedia)
DAAP protocol - Tapjam
TunesRemote: Android DACP/iTunes Remote Control
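If you want to poke at the protocol quickly, here is a probe sketch: DAAP is HTTP-based and iTunes listens on TCP port 3689, with /server-info returning the server's capabilities in DAAP's binary tag format (the host name is an example, and the User-Agent value reflects reports that iTunes rejects clients that don't identify as iTunes):

    import Foundation

    // Probe a DAAP server's capabilities. The host name is hypothetical.
    let url = URL(string: "http://my-itunes-host.local:3689/server-info")!
    var request = URLRequest(url: url)
    // Reportedly required: iTunes checks for an iTunes-like User-Agent.
    request.setValue("iTunes/10.0 (Macintosh)", forHTTPHeaderField: "User-Agent")

    let task = URLSession.shared.dataTask(with: request) { data, response, _ in
        if let http = response as? HTTPURLResponse, let data = data {
            print("Status \(http.statusCode), \(data.count) bytes of DAAP tags")
        }
    }
    task.resume()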