I can do it from Xcode, but I want to be able to launch an iPhone app (on the device) from the command line. Is it possible?
Why? Because I want to capture some of the output for semi-automated testing. I'm guessing I need to use a debug build for NSLog output, but I'd also be interested to know about other methods for getting NSLog / stdio data back to the host Mac.
There is a project on GitHub called titanium_mobile (part of Titanium Developer).
I use a utility from that project called iphonesim. It launches an iPhone app in the simulator from the command line (I'm not sure exactly how; I think there is a way to do that with SpringBoard.app). If you step up one level in the Titanium Mobile code and look at builder.py, you can see how they launch an app in the simulator and capture the output.
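For what it's worth, the output-capturing half of that is just ordinary process plumbing. Here is a rough, illustrative sketch in Java (the language used by the code examples later in this document); the tool name and arguments below are placeholders, not the real iphonesim invocation:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class CaptureOutput {
    public static void main(String[] args) throws Exception {
        // Placeholder command line; substitute the actual tool and app path.
        ProcessBuilder pb = new ProcessBuilder("iphonesim", "launch", "/path/to/MyApp.app");
        pb.redirectErrorStream(true); // merge stderr into stdout so nothing is lost

        Process p = pb.start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line); // or append to a test report
        }
        p.waitFor();
    }
}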
Ultimately I solved my specific need a different way. I needed to get data from the iPhone's accelerometer into a prototype app in Adobe AIR (Flash).
I used this app on the iPhone, which sends UDP packets containing the X, Y, Z forces:
http://code.google.com/p/accelerometer-simulator/wiki/Home
I found that via this blog post, which might be of interest to people trying to do similar things:
http://ifiddling.blogspot.com/2009/01/dummy2.html
I used a Python script to present a server to Flash: it grabs the UDP accelerometer packets, munges them into AMF, and sends them on. Flash connects to this server over a socket and receives the accelerometer data.
It has a few moving parts, but it works nicely.
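The receiving end of that bridge is just a UDP listener. The script I used was Python, but purely to illustrate the idea, here is a minimal sketch in Java; the port number and the assumption that the payload is printable text are mine, not something specified by the accelerometer-simulator project:

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class AccelReceiver {
    public static void main(String[] args) throws Exception {
        DatagramSocket socket = new DatagramSocket(10552); // example port only
        byte[] buf = new byte[512];
        while (true) {
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            socket.receive(packet);
            // Dump the raw payload; parsing and the AMF conversion would go here.
            System.out.println(new String(packet.getData(), 0, packet.getLength()));
        }
    }
}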
You can do this on the device if it is jailbroken: put a debug build and symbols on your device and run gdb on it. It is totally unsupported, but I hear it works. I'm not sure whether there is a good tutorial; try Google.
One method would be to use the AsyncSocket class and pass whatever data you want to log from the iPhone to a basic host app on the Mac, which NSLogs whatever it receives. If you follow the EchoServer sample application, you should be able to integrate it in just a few minutes.
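The EchoServer sample is Objective-C and built on AsyncSocket; just to illustrate the host side of the idea ("print whatever arrives"), here is a plain-socket sketch in Java with an arbitrary port number:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class LogReceiver {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(9999); // arbitrary port
        while (true) {
            Socket client = server.accept(); // the device app connects here
            BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println("[device] " + line); // stand-in for NSLog on the host
            }
            client.close();
        }
    }
}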
I'm working on an automated testing solution for some iOS applications. I'm using fruitstrap to transfer and install a compiled app onto the connected iPhone, but I'm struggling to find a way to automatically launch the application after the installation is complete.
Fruitstrap has an option to run the app in the GDB debugger, which works. Unfortunately, there are some test cases that require the app to be run without the debugger attached (special crash handling). I've spent a good amount of time muddling through the resources available on the MobileDevice library, which is what fruitstrap uses, but I haven't been able to turn anything up on launching an app.
Any ideas?
You can do what you want by using fruitstrap or Xcode to start a "bootstrap" program that causes your target application to run via a custom URL, as described by Michael.
While the bootstrap program would be running under the debugger, the URL-invoked program would be running normally.
Creating a bootstrap program and using URL schemes may be an option for some people, and it is certainly worth considering, but it doesn't fit my requirements.
What I ended up doing was launching the app with the debugger through fruitstrap. I re-compiled fruitstrap to include the following prep commands (in the GDB_PREP_CMDS define):
handle all noprint pass nostop
continue
The handle command passes signals on to the program, so the custom signal handler (a crash handler in this case) will handle them. The continue was something I needed so that the app would actually run once the debugger started.
There is one unfortunate flaw in this, and I do not know a workaround for it. The ARM7 version of GDB does not have the 'set dont_handle_bad_access' command that the Darwin version does. For some reason, passing EXC_BAD_ACCESS signals to the program does not work and the app hangs. This is significant, since that is the signal for most crashes. But as it stands now, it's the best I can do, and at least it's handling uncaught exceptions.
I believe you may be looking for a custom URL scheme.
Have a look at the following document and scroll down to: Implementing Custom URL Schemes
http://developer.apple.com/library/ios/#DOCUMENTATION/iPhone/Conceptual/iPhoneOSProgrammingGuide/AdvancedAppTricks/AdvancedAppTricks.html
You can also Google URL schemes in iOS to see whether you come across something similar to what you are trying to do.
Let me know if this helped you out. Would be interesting to hear if you had any success.
Cheers.
Does Apple support mirroring of the iPad on a TV? Can anyone give me some idea?
Read this article:
There’s been a lot of confusion about how the iPad VGA Adapter works. I received mine today, and I thought I’d try to clear things up a little (and give you some code to play with, if you’re an iPad developer with a VGA adapter of your own).
The first thing to understand is that the adapter does not mirror your iPad screen. You can’t plug your iPad into your TV or monitor and see all your apps on the big screen, like Steve Jobs does when he’s giving a demo.
Apps that want to support external display via the adapter must explicitly do so; the developer has to write code to support it. There are some standard iPhone OS-supplied APIs which will automatically do the right thing (such as video playback via standard controllers), but generally you won’t see anything on your external display unless the app you’re using has taken steps to put something there. That’s the “bad” (though surely not surprising) news.
The good news is that it’s trivially easy to support external display from your app if you’re a developer; the connected display just shows up as another UIScreen object. I made a sample project (which you can download as a zip archive here) that shows how to do it.
It’s basically just a nib with two windows (one for showing on the iPad, and one for showing on the connected external display), and a tiny bit of code to make it work.
I am working on an app for the iPad and would like to be able to include the option to use a separate iOS device to control it. I have seen examples of this with games (notably Chopper 2), but have no idea how it is done.
Can anyone point me in the direction of the iOS frameworks that back this feature? I have looked through the SDK but cannot find the relevant sections.
Thanks
I'm sure they use GameKit, or you could use the lower-level Bonjour discovery.
Read through the GameKit docs.
You can start there. I guess the controller is actually a separate feature of the app that just sends messages over the network, using sockets to send and receive the data.
Send the messages over the network from the controller, receive them on the iPad in a running thread (or however the service you use handles it), and process the received messages.
Agreed with #alJaree. I'm working on something similar, though I've found it much easier to implement through Unity. Prime31 has a number of sweet plugins that let you implement things like Bluetooth through GameKit in a single line of code. I'm on my iPad right now so I can't be sure of the exact URL, but I think it's just prime31.com, in their 'unity' section.
I have gone through the following link:
http://zachwaugh.com/2009/03/programmatically-retrieving-ip-address-of-iphone/
and I have also tried this one (but it isn't recognized by Apple):
http://appsamuck.com/day4.html
I just want that when the user taps a "wifi" button, the reports stored in the documents directory can be accessed from another PC over Wi-Fi, using the IP address that I display in my iPhone application. How is this possible?
In my apps, I use CocoaHTTPServer to get local info onto and off of the phone. You run the server and, out of the box, it indexes all the files in the documents directory.
To do what you want, you will need to edit the code to return some other kind of data format (XML is probably the easiest), then call this from inside your app to get that data. CocoaHTTPServer easily takes POST right out of the box too, so you can post an XML response as well.
After thinking about it, CocoaHTTPServer is best run on the computer side, behind the scenes. The iPhone can then send info to the computer, where handling the code should be easier and you have more options.
I can't point to any specific examples, but the way to do this would be the ZeroConf protocol; both the iPhone and the PC would have to be on the same network for this to work.
Has anybody had any success ever attaching a debugger to a tethered device? I am able to debug my j2me application in the emulator, but have a lot of trouble sorting out phone-specific problems when they come up. The phone I'm using is a Nokia N95, but ideally the debug process would work on any phone.
Is this possible? If so does anyone have steps they've used to set it up?
Sony Ericsson supports debugging on every phone at least since the K700; this is done using KDWP. UIQ 3 communicators can also be debugged the same way.
By the way, on the latest SE phones it is even possible to monitor memory consumption and do CPU profiling. So if you want to debug your apps on real phones, I would suggest using SE phones; they are really good at it. I use NetBeans, and it works without any problems with any SE phone.
Motorola phones support a debugging interface called KDWP (Motodev registration required). Their MIDway tool can also be useful for getting debug trace information from a MIDlet running on a device.
As others stated, on-device debugging is something that strictly depends on the manufacturer's will, and often it's nearly impossible. However, I can point you to the Gear Java Mobile Framework, which gives you an on-device debug console to print your messages and thus track down phone-specific issues. If you need some explanation of how to use it, take a look at this tutorial.
Unfortunately this is not generally possible. Some makers (like Sony Ericsson) support this on some of their phones, but not all. I am not sure whether there is an on-device debugging tool for the N95, but you can use Nokia's emulator, which should be pretty close to the device. The new Java ME SDK comes with the promise of real on-device debugging in the near future, but it still very much depends on OEM cooperation.
I find a good debugging method is to control a string value which gets painted on top of everything else whenever it is not null. This works anywhere; it obviously isn't ideal, but it can be used to catch exceptions, print values, etc. Of course you're limited to the small screen, but in theory you could even code some scrolling functionality.
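A minimal sketch of that technique for a Canvas-based MIDlet; the class and field names here are made up for illustration:

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Font;
import javax.microedition.lcdui.Graphics;

public class DebugCanvas extends Canvas {
    // Set this from anywhere (e.g. a catch block) and call repaint().
    public static String debugText = null;

    protected void paint(Graphics g) {
        g.setColor(0xFFFFFF);
        g.fillRect(0, 0, getWidth(), getHeight());

        // ... normal drawing goes here ...

        // Paint the debug string on top of everything else when it is set.
        if (debugText != null) {
            g.setColor(0xFF0000);
            g.setFont(Font.getFont(Font.FACE_MONOSPACE, Font.STYLE_PLAIN, Font.SIZE_SMALL));
            g.drawString(debugText, 2, 2, Graphics.TOP | Graphics.LEFT);
        }
    }
}

For example, in a catch block you would set DebugCanvas.debugText = e.toString(); and then call repaint().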
Some people use RMS logging but personally I could never be bothered.
As others have said here, Motorola has MIDway, which I think is great.
Others are correct here in that on-device debugging is very much device specific. I haven't done anything with Series 60, but at least on Series 40 phones, I had to open up a CommConnection and write out to it in order to see much of anything going on. The device emulators are again a mixed bag, but you can usually get 90% of the way to your application working on them and can usually get your debugger connected to them. If you aren't making use of any of the hardware on the phone, that should get you most of the way there.
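For reference, the CommConnection part looks roughly like this. The port name is completely device-specific (System.getProperty("microedition.commports") lists what the handset exposes), so "COM0" and the baud rate below are just placeholders:

import java.io.OutputStream;
import javax.microedition.io.CommConnection;
import javax.microedition.io.Connector;

public class SerialLog {
    private static OutputStream out;

    public static void open() {
        try {
            // Placeholder port and baud rate; adjust for the actual handset.
            CommConnection conn = (CommConnection) Connector.open("comm:COM0;baudrate=115200");
            out = conn.openOutputStream();
        } catch (Exception e) {
            out = null; // no port available; stay silent
        }
    }

    public static void p(String msg) {
        if (out == null) return;
        try {
            out.write((msg + "\r\n").getBytes());
            out.flush();
        } catch (Exception e) {
            // ignore; logging must never crash the app
        }
    }
}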
I've used the BlackBerry tools on occasion to debug J2ME applications (without using RIM APIs), but it is very slow and is still only emulation, not the actual device (though it sometimes does help to shake the odd thing out). I agree it is frustrating when you have something running on an emulator only to find that it doesn't run on the hardware.
You cannot debug step by step as you can with Android or other SDKs.
In J2ME you can trace errors by adding log statements in the code, then adding another MIDlet that displays the log screen.
For example, add log statements wherever you need them:

Log.p("Log statement.....");

Then in LogMidlet.java:

// Add the following line in the startup method of this MIDlet.
Log.getInstance().showLog();

This way you can track errors in J2ME to some extent.
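The Log class itself isn't shown in the answer (it could be homegrown or come from a library), but a minimal version supporting the two calls above might look like this:

import javax.microedition.lcdui.Display;
import javax.microedition.lcdui.Form;

public class Log {
    private static Log instance;
    private final StringBuffer buffer = new StringBuffer();
    private Display display;

    public static Log getInstance() {
        if (instance == null) {
            instance = new Log();
        }
        return instance;
    }

    // Call once from your MIDlet's startApp(): Log.getInstance().init(Display.getDisplay(this));
    public void init(Display d) {
        display = d;
    }

    // Static shortcut used throughout the app: Log.p("...");
    public static void p(String msg) {
        getInstance().buffer.append(msg).append('\n');
    }

    // Show everything collected so far on a plain Form.
    public void showLog() {
        if (display == null) {
            return;
        }
        Form form = new Form("Log");
        form.append(buffer.toString());
        display.setCurrent(form);
    }
}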
I think it is possible to add additional debugging information in a preprocessing step, like this:
public void myMethod() {
    Debug.traceMethod("myMethod");
    int var = 1;
    Debug.newLine();
    var++;
    Debug.newLine();
    ...
}
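The Debug helper those inserted calls rely on isn't defined in the answer; a hypothetical minimal version that just prints to the console (on a real device you would route this to something like the Log screen above instead) could be:

public class Debug {
    private static String currentMethod = "";
    private static int currentLine = 0;

    // Called at the top of each instrumented method.
    public static void traceMethod(String name) {
        currentMethod = name;
        currentLine = 0;
        System.out.println("> " + name);
    }

    // Called after each instrumented statement to record progress.
    public static void newLine() {
        currentLine++;
        System.out.println(currentMethod + ": line " + currentLine);
    }
}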