Google TV Picture in Picture - google-tv

I am aware that Live TV is the only Google TV application that can run in PiP (Picture in Picture) mode.
My question: is there any way I can program my application so that, when invoked, it always runs with Live TV as a frame inside it? I need to guarantee this is always the case.
Thanks in advance.

There is no official way, and the unofficial ways will not be supported in the future.

Related

Record iPhone screen and user's actions

In order to do usability testing, I'd like to record an iPhone's display along with every user action. I can't modify the application itself; however, jailbreaking the phone wouldn't be a problem.
Ideally I'd like to get a full-resolution video of the screen with an overlay showing touch events on top of it.
For now the best solution I've found is using a video-out cable and recording its output, but with this solution I'd need an external camera to capture what the user was doing, and it wouldn't be very precise.
Other ideas?
The Display Recorder application, found in the BigBoss repo (Cydia), works very well for this.
I have tried MirrorOp (requires jailbreak) and AirSquirrels' Reflector (no jailbreak required) for usability testing. Both work very well, but neither captures touch feedback. You can use a second camera or a "hug the notebook" approach.

iphone - making a Wi-Fi list app

I want to programmatically show a list of available Wi-Fi networks on my iPhone.
I tried to run this program, but it doesn't work:
http://code.google.com/p/iphone-wireless/wiki/Stumbler
Does anyone have a tutorial or sample code for creating a list of Wi-Fi networks?
Thanks.
I don't think that's possible. I haven't tried it myself, but I just read the first few lines of the description of Stumbler, which include the following:
Stumbler can not be distributed through the app store, as it uses private APIs!
So I don't think it's possible to create an application that lists all Wi-Fi networks; on the iPhone this is part of the operating system.
Sandro Meier

Is it possible for an app to run in the background and collect data?

I want to make an app that runs in the background so that when a user is reading a web page or PDF file on an iPhone or iPad, he can mark some words, see their meanings, and have those words stored in the app's database. Afterwards he can review the words he has learnt and build his vocabulary.
Does the iOS 4 API allow that? What are the limitations? Advantages? Disadvantages?
Thanks in advance
No, it does not; you can only run VoIP, audio, or navigation apps in the background.
Only one app can run at a time, and installing one app cannot affect any built-in app. So no, there is no way to achieve what you are trying to do here.
Your best bet is to instruct users to copy the word and then open your app, which can grab whatever is on the clipboard at that point; see the sketch below.
(Note that there are multitasking APIs, but you still can't access anything outside of your app, even if you convince the OS to let you run in the background for a little while.)
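Something like the following, a minimal Swift sketch of that clipboard approach (modern API names for brevity; VocabularyStore and its method are invented placeholders, not part of any existing app):

```swift
import UIKit

// Hypothetical word store illustrating the "grab the clipboard on launch" idea.
// Call addWordFromPasteboard() when the app becomes active, e.g. from the
// app delegate's applicationDidBecomeActive.
struct VocabularyStore {
    private(set) var words: [String] = []

    mutating func addWordFromPasteboard() {
        // UIPasteboard.general.string returns whatever the user copied in
        // Safari, the PDF viewer, etc. before switching to this app.
        guard let copied = UIPasteboard.general.string?
                .trimmingCharacters(in: .whitespacesAndNewlines),
              !copied.isEmpty else { return }
        words.append(copied)
    }
}
```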

How to use the camera device in an iPhone app

I am working on an image editing app and have therefore been googling about it.
I have found some links which say that we can work with the camera ourselves, like here.
They say we can:
capture images from within our app (like the ColorSplash app)
use the accelerometer with the camera, and some other features
So far my code just opens the camera and lets the user do the rest.
But I want the features listed above... at least the first one.
Can it be done?
I used code from these sites to do what your first task describes:
http://www.zimbio.com/iPhone/articles/1109/Picking+Images+iPhone+SDK+UIImagePickerController
http://trailsinthesand.com/picking-images-with-the-iphone-sdk-uiimagepickercontroller/
Both of these links were really helpful.
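If it helps, here is a minimal Swift sketch of in-app capture with UIImagePickerController, the API those tutorials cover (modern API names; the class name and overlay view are placeholders, not code taken from the links):

```swift
import UIKit

// Placeholder view controller showing in-app image capture.
class CaptureViewController: UIViewController,
                             UIImagePickerControllerDelegate,
                             UINavigationControllerDelegate {

    func presentCamera() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = self
        // A custom overlay view is how apps like ColorSplash draw their own
        // UI on top of the live camera feed:
        // picker.cameraOverlayView = myOverlayView
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // Hand the captured UIImage to the editing pipeline here.
        let image = info[.originalImage] as? UIImage
        picker.dismiss(animated: true)
        _ = image
    }
}
```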
@Sawan, yes, you can do the things you want: capturing an image and then selecting it. Please take help from here, and you can also use the accelerometer in the same way we use it in our apps; a rough sketch follows.
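For the accelerometer part, a rough Swift sketch using Core Motion (the update interval and handler are illustrative assumptions, not taken from the question or any linked tutorial):

```swift
import CoreMotion

// Keep a single CMMotionManager around for the lifetime of the screen.
let motionManager = CMMotionManager()

// Delivers x/y tilt at ~30 Hz, e.g. to rotate a capture button or level a shot.
func startTiltUpdates(handler: @escaping (Double, Double) -> Void) {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 1.0 / 30.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let acceleration = data?.acceleration else { return }
        handler(acceleration.x, acceleration.y)
    }
}
```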

Using Shark to profile an iPhone game, pressing "Start" doesn't do anything

I've done a decent amount of reading about how to profile iPod applications using Shark, and all works well until I click "Start" (and nothing happens). I've tried profiling an individual process (app) and all processes, and it doesn't seem to start. This tutorial is one place I used to set it up (so I think I'm following all the steps):
http://rudifa.wordpress.com/2009/09/16/profiling-an-iphone-application-with-shark/
(I've also rebooted my Mac and iPod.)
Anyone have any ideas on what to try next?
Shark support has been dropped as of iOS 4. The official Apple position can be seen here:
https://devforums.apple.com/message/243237
They expect you to use Instruments and Time Profiler instead.
Have you tried the Time Profiler instrument? Apple seems to be moving in that direction.