How to know which device in Flash Builder for mobile - iPhone

I'm building an application for mobile devices, specifically for iPhone and iPad, and I want to use a different interface for each. How can I do that? Is there a variable that holds the name of the device?
Such as:
if (device == "iPhone") {
    // use this state
} else if (device == "iPad") {
    // use that state
}
??

You can use Capabilities.os to obtain the device's operating system and check whether it is iOS, then use Capabilities.screenResolutionX and Capabilities.screenResolutionY to determine whether the resolution corresponds to an iPhone or an iPad.
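A minimal ActionScript sketch of that idea - the state names and the 768-pixel threshold are my illustrative assumptions, not an official API:
import flash.system.Capabilities;

var isIOS:Boolean = Capabilities.os.indexOf("iPhone") != -1; // iOS reports itself as "iPhone OS ..."
var shortSide:Number = Math.min(Capabilities.screenResolutionX,
                                Capabilities.screenResolutionY);
if (isIOS) {
    // Assumption: iPads of this era report 768 px on the short side,
    // iPhones report 320 or 640 px.
    currentState = (shortSide >= 768) ? "iPadState" : "iPhoneState";
}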

Look at this document, which shows general principles for scaling things based on the screen resolution. It is not an exact answer to your question, but I am guessing that you just want to scale your interfaces based on the screen resolution.
You can also check the screen DPI using Capabilities.screenDPI, which tells you how spread out the pixels are, alongside Capabilities.screenResolutionX and Capabilities.screenResolutionY for the resolution itself.
If you really want to know what the operating system is, you can check Capabilities.os, but as for identifying the exact device, I'm not sure there is a method for that.

Related

iPhone camera application

While in graduate school for biomedical engineering I developed a device we called a "vein finder". It was a simple device that was good enough for our engineering school to patent.
I think it would be very easy to use the iPhone camera to develop an app whereby MDs/nurses/EMTs could easily identify peripheral veins that are not visible to the naked eye. This would be invaluable in starting IVs and giving bedside medications. The vein finder was especially helpful for patients in shock who had poor venous filling and therefore didn't have veins that "popped into view" with a tourniquet.
It would require using the iPhone light at specific wavelengths... anybody have any idea if that is possible?
I don't think you have any control over the flash LED that illuminates images; you can only turn it on and off. I also doubt you would be able to get the specific wavelengths you need.
My suggestion would be to look into building a peripheral light device which plugs into the headphone jack for power and has the necessary functions for emitting light at different wavelengths. That way, you would be able to get the exact result you require.
You may also need to look into the camera itself, as it may not be able to capture light at the wavelengths you require. Hope that helps!
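For what it's worth, a minimal sketch of the one control that is exposed - toggling the LED torch via AVFoundation; there is no public parameter for wavelength or intensity (the wrapper method is mine):
#import <AVFoundation/AVFoundation.h>

- (void)setTorchOn:(BOOL)on {
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (![device hasTorch]) return;              // no LED on this hardware
    if ([device lockForConfiguration:NULL]) {
        device.torchMode = on ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
        [device unlockForConfiguration];
    }
}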
Do you mean overlaying veins on the picture from the camera? It would be easy if the veins were always in the same place... What would your imagined procedure be for using this app?
EDIT: You cannot manipulate the iPhone light for different frequencies; you can only turn it on and off. You would need a separate external light for that.
To work around the fact that you can't set the wavelength of the built-in light, you should use an external light along with the iOS device.
Apologies for massive necro!
The LED light of the iPhone is not the problem - most white light sources contain IR or near-IR (in fact, they are made up of the entire visible spectrum). Daylight works very well too.
The problem is the receiver. Most camera lenses, including smartphones', have an IR filter - thus cutting out the useful part of the spectrum. All we need is a digital camera with the IR filter removed that has a viewfinder (where you'll see the veins).

Playing iPhone movies through TV out

Is there any way to emulate the Videos app such that we still maintain controls on the device (iPad/iPhone), but send the video out through the cables to the TV? I looked into screen mirroring, but it's way too slow for video, and regardless, the UIGetScreenImage() used by screen mirroring is no longer allowed by Apple.
The Videos app seems to do exactly what I need, but I don't see anything simple to make that happen.
Update (10/15/10): So apparently movies played through UIWebView have TV-Out support, while MPMoviePlayerController movies don't.
http://rebelalfons.posterous.com/iphone-os-support-for-tv-out
However, there is a caveat: this does not work on older devices updated to the most recent iOS. That is, iPod touches and the iPhone 3G & 3GS don't work, while the iPhone 4 and iPads do. Hoping there's something more we can use to fill in the gaps in compatibility, since I know it's possible. Apps like AirVideo and StreamToMe currently support this functionality.
I am not sure if there is an official way to do this. But as for your examples, AirVideo uses method swizzling (check out: http://www.cocoadev.com/index.pl?MethodSwizzling) to override checks in the movie player and act like the Videos app.
I guess that with some reverse engineering of the SDK, you could find where the checks are made and swizzle that method with your custom one.
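For reference, the general shape of method swizzling with the Objective-C runtime - the reverse-engineering part, finding which private method to target, is up to you; nothing here names a real private API:
#import <objc/runtime.h>

// Exchange the implementations of two instance methods on a class.
void SwizzleMethod(Class cls, SEL original, SEL replacement) {
    Method m1 = class_getInstanceMethod(cls, original);
    Method m2 = class_getInstanceMethod(cls, replacement);
    method_exchangeImplementations(m1, m2);
}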
In 3.2 and later (postdating the site you link to), UIScreen has a class method, 'screens', that'll return an array of one object - the main screen - if no external display is available, or two screens - the main screen and the external screen - if a TV lead is connected. The task should be as simple as positioning the views you want to appear on the external screen within its frame, and the controls you want on the device within the other.
Have you tried that?
Edited for one additional comment: also as of 3.2, it is explicitly permissible to create an MPMoviePlayerController and then grab the view from it to treat as a normal UIView, rather than doing a full 'present'. So that's how you'd get a movie view that you can position as you wish.
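A minimal sketch of that approach, assuming you already have an MPMoviePlayerController in a moviePlayer variable (the window handling is illustrative, and retain/release is omitted):
if ([[UIScreen screens] count] > 1) {
    UIScreen *external = [[UIScreen screens] objectAtIndex:1];
    UIWindow *tvWindow = [[UIWindow alloc] initWithFrame:external.bounds];
    tvWindow.screen = external;                  // route this window to the TV
    [tvWindow addSubview:moviePlayer.view];      // the video goes out to the TV...
    tvWindow.hidden = NO;
    // ...while your transport controls stay in the app's main window.
}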

Can the exposure time be manually adjusted for the iOS camera?

I want to adjust the exposure of the iPhone/iPod touch camera in fine detail. I would prefer to take a series of photos with decreasing exposure times to obtain a sequence of images (for HDR reconstruction). Is this possible?
If not, what's the next best thing? It seems you can set a point of interest in the image for the autoexposure. Perhaps I could search for a dark/light region of the image and then use this exposurePointOfInterest to adjust the exposure, but this seems like a very indirect solution that is also error-prone. If anybody has tried an alternative, such an answer is also desirable.
iOS gives control over frame duration via the videoMinFrameDuration and videoMaxFrameDuration properties, and exposure times vary with frame rate and frame duration. By setting the min and max frame durations to a particular value, you lock the frame rate, and that will affect your exposure times. This is a very indirect way of controlling exposure, but maybe it helps your case. An example would look like this:
// conn is your AVCaptureConnection; CAPTURE_FRAMES_PER_SECOND is the target rate
if (conn.isVideoMinFrameDurationSupported)
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (conn.isVideoMaxFrameDurationSupported)
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
Since you would have to decrease the shutter speed of the camera, this unfortunately does not appear to be possible, and, more importantly, it is against the HIG:
Changing the behavior of iPhone external hardware is a violation of the iPhone Developer Program License Agreement. Applications must adhere to the iPhone Human Interface Guidelines as outlined in the iPhone Developer Program License Agreement section 3.3.7
Related article: Apple Removes Camera+ iPhone App From The App Store After Developer Reveals Hack To Enable Hidden Feature.
If it can be done programmatically, instead of with the hardware, you might have a chance, but then it's just an effect on an image, not a true long-exposure picture.
There are some simulated slow shutter apps that do get approved like Slow Shutter or Magic Shutter.
Related article: New iPhone Camera App “Magic Shutter” Hits The App Store.
This is supported since iOS 8:
http://developer.xamarin.com/guides/ios/platform_features/intro_to_manual_camera_controls/
Have a look at AVCaptureExposureModeCustom and CaptureDevice.LockExposure...
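In native AVFoundation terms, a hedged sketch of that iOS 8 API (the shutter duration and ISO values here are illustrative):
NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device lockForConfiguration:&error]) {
    if ([device isExposureModeSupported:AVCaptureExposureModeCustom]) {
        [device setExposureModeCustomWithDuration:CMTimeMake(1, 15)   // 1/15 s shutter, illustrative
                                              ISO:device.activeFormat.minISO
                                completionHandler:nil];
    }
    [device unlockForConfiguration];
}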
I tried to do this for my motion activated camera app (Pocket Sentry) and I found that it is not possible to do this AND get approved in the app store.
I have been trying to do this myself. I think it's possible only by using the exposure point-of-interest property. I am detecting the dark and bright spots and then adjusting the point accordingly.
Please refer to: Detecting bright/dark points on iPhone screen
Does anyone know a better way to do this?
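A sketch of that point-of-interest approach - the coordinates stand in for whatever bright/dark region your detection step finds, normalized from (0,0) at top-left to (1,1) at bottom-right:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device isExposurePointOfInterestSupported] &&
    [device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
    if ([device lockForConfiguration:NULL]) {
        device.exposurePointOfInterest = CGPointMake(0.8, 0.2);  // hypothetical bright region
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        [device unlockForConfiguration];
    }
}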
I am not too sure, but you could try building the camera app with the AVFoundation framework, following Apple's sample code:
AVCam Sample Code
And then try to leverage the exposureMode property of the capture device class:
exposureMode Class Reference

How to accommodate for the iPhone 4 screen resolution?

According to Apple, the iPhone 4 has a new and better screen resolution:
3.5-inch (diagonal) widescreen Multi-Touch display
960-by-640-pixel resolution at 326 ppi
This little detail affects our apps in a heavy way. Most of the demo apps on the net have one thing in common: they position views in the belief that the screen has a fixed size of 320 x 480 pixels. So what most (if not all) developers do is design everything so that a touchable area is (for example) 50 x 50 pixels big - just enough to tap. Things have been positioned relative to the upper left to reach a specific position on screen - let's say the center, or somewhere at the bottom.
When we develop high-resolution apps, they probably won't work on older devices. And if they do, they would suffer from images four times the size, which have to be scaled down in memory.
According to Supporting High-Resolution Screens In Views, from the Apple docs:
On devices with high-resolution screens, the imageNamed:, imageWithContentsOfFile:, and initWithContentsOfFile: methods automatically look for a version of the requested image with the @2x modifier in its name. If it finds one, it loads that image instead. If you do not provide a high-resolution version of a given image, the image object still loads a standard-resolution image (if one exists) and scales it during drawing.
When it loads an image, a UIImage object automatically sets the size and scale properties to appropriate values based on the suffix of the image file. For standard-resolution images, it sets the scale property to 1.0 and sets the size of the image to the image's pixel dimensions. For images with the @2x suffix in the filename, it sets the scale property to 2.0 and halves the width and height values to compensate for the scale factor. These halved values correlate correctly to the point-based dimensions you need to use in the logical coordinate space to render the image.
This is purely speculation, but if the resolution really is 960 x 640 - exactly twice the current resolution - it would be trivially simple for the iPhone to check the app's build target, detect a legacy version of the app, and simply scale it by 2. You'd never notice the difference.
Engadget's reporting of the keynote included the following transcript from Steve Jobs:
...It makes it so your apps run automatically on this, but it renders your text and controls in the higher resolution. Your apps look even better, but if you do a little bit of work, then they will look stunning. So we suggest that you do that.
So I infer from that, if you use existing APIs your app will get scaled up. If you take advantage of new iOS4 APIs, you can get all groovy with the new pixels.
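As a concrete illustration of the points-versus-pixels split, the iOS 4 addition in question is UIScreen's scale property:
CGFloat scale = [UIScreen mainScreen].scale;    // 1.0 on older devices, 2.0 on iPhone 4
CGRect bounds = [UIScreen mainScreen].bounds;   // still 320 x 480 points on an iPhone
NSLog(@"%.0f x %.0f points at %.0fx scale",
      bounds.size.width, bounds.size.height, scale);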
It sounds like the display will be OK, but I'm concerned about the logic in my game. Will touchesBegan return positions as points in the new resolution? The screen bounds will be different; these types of things could potentially be problems for me.
Scaling to double resolution for display purposes is straightforward, but will this scaling apply to all APIs that input/output a screen coordinate? If not, things are going to break, aren't they?
Fair enough if it's been handled extensively throughout the framework... I would imagine there are a lot of potential APIs this affects.
For people who are coming to this thread looking for a solution for a mobile web interface, check out this post on the WebKit blog: http://webkit.org/blog/55/high-dpi-web-sites/
It seems that WebKit solved this problem four years ago.
Yes, it is true.
According to WWDC, it appears that Apple has built in some form of automatic conversion so that the resolution of existing applications will not be completely off. Think up-conversion of DVDs for HDTVs.
My guess would be that Apple knows what standards most developers have been using and will use those for an immediate conversion. Of course, if you program an application to take advantage of the new resolution, it will look much nicer than whatever the result of Apple's auto-conversion is.
All of your labels and system buttons will be at 326 ppi, but your images will still be pixel-doubled until you add the high-res resources. I am currently updating my apps. If you build and run on the iPhone 4 simulator, it is presented at 50%; go to Window > Scale > 100% to see the real difference! Labels are smooth; my images look shocking!

iPhone 4.0 Screen Resolution and writing robust code

Does anyone know what will happen with existing apps when they run on the iPhone 4.0 in terms of the new screen resolution? I am assuming, just like developing for the iPad, that there should be no hard-coded screen resolutions in your code.
I'd also like advice on the best way of writing robust code that works well on any device. For instance, detecting the screen resolution is not enough - on the iPad the screen is physically bigger, so you can display more items on it. On the new iPhone the screen is the same physical size but higher resolution, so you likely won't want to display more items, just higher-resolution versions of them.
Any help would be useful,
Regards
Dave
EDIT: I have read the other similar posts; I guess what I really would like to know is the recommended way to write code for all App Store devices in a robust way, so that they a) all work and b) make best use of the device.
UIKit has been redesigned so that old apps just work unmodified on the iPhone 4. There are several things you can do, some programmatic and some by just adding higher-resolution images to your app bundle.
Firstly, and most simply, you can include new double-resolution images that are used by your app, with a suffix of @2x in the name, i.e., Event.png as well as Event@2x.png. [UIImage imageNamed:] will automatically look for a file with this suffix if it is running at the higher resolution.
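For instance, a minimal check, assuming both files from the example above are in the bundle:
UIImage *icon = [UIImage imageNamed:@"Event.png"];   // loads Event@2x.png on iPhone 4
NSLog(@"%.0f x %.0f points at scale %.1f",
      icon.size.width, icon.size.height, icon.scale); // same point size either way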
All the other UIKit stuff now uses points instead of pixels, so for both old and new apps full screen is still 320 x 480 points. This pretty much means everything will work, including touches and so on, although they may now return fractions of a point.
The only real gotcha so far seems to be if you use CGBitmapContextCreate, as this works in pixels and requires some jiggery-pokery.
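A hedged sketch of that jiggery-pokery, assuming you want a full-screen backing bitmap that your drawing code can still address in points:
CGFloat scale = [UIScreen mainScreen].scale;          // 2.0 on iPhone 4
size_t widthPx  = (size_t)(320 * scale);              // points -> pixels
size_t heightPx = (size_t)(480 * scale);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, widthPx, heightPx,
                                         8, widthPx * 4, cs,
                                         kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(cs);
CGContextScaleCTM(ctx, scale, scale);                 // now draw in points as usual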