iPhone Image Loading in Photos Application

I am not sure if this is the right forum for this question, but when I googled "questions related to iPhone", the first page of results was Stack Overflow from top to bottom. So here goes.
When I open the Photos application on an iPhone (3GS, 4 and 5, all running iOS 5.0) and open an image, a blurred image first appears for a fraction of a second, which then sharpens into the actual picture. My question is: does the Photos application keep a low-resolution copy of each high-resolution image to display while the full image is being loaded, OR does it generate a low-resolution image on the fly before going on to load the high-resolution one?
I am writing an application to browse through photos and need to know which approach is better. That is the purpose behind this question.

The best way is to use the ALAsset thumbnail. If you are concerned about image clarity, go with the ALAssetRepresentation fullResolutionImage instead.
Here are some details you should read before you start developing a photos application.
An instance of ALAssetsLibrary provides access to the videos and photos that are under the control of the Photos application.
An ALAsset object represents a photo or a video managed by the Photo application.
ALAsset offers several methods for accessing its representations:
1. thumbnail
Returns a thumbnail representation of the asset.
- (CGImageRef)thumbnail
2. aspectRatioThumbnail
Returns an aspect ratio thumbnail of the asset.
- (CGImageRef)aspectRatioThumbnail
3. defaultRepresentation
Returns an asset representation object for the default representation.
- (ALAssetRepresentation *)defaultRepresentation
4. representationForUTI:
Returns an asset representation object for a given representation UTI.
- (ALAssetRepresentation *)representationForUTI:(NSString *)representationUTI
An ALAssetRepresentation object encapsulates one of the representations of a given ALAsset object.
1. CGImageWithOptions:
Returns a full resolution CGImage of the representation.
- (CGImageRef)CGImageWithOptions:(NSDictionary *)options
2. fullResolutionImage
Returns a CGImage representation of the asset.
- (CGImageRef)fullResolutionImage
3. fullScreenImage
Returns a CGImage of the representation that is appropriate for displaying full screen.
- (CGImageRef)fullScreenImage
Sample Code
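A minimal sketch of that approach, assuming ARC and that self.library and self.imageView are properties you have declared yourself (they are not framework API):

#import <AssetsLibrary/AssetsLibrary.h>

- (void)loadFirstPhoto
{
    self.library = [[ALAssetsLibrary alloc] init];

    [self.library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                                usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
        [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *innerStop) {
            if (asset == nil) return;

            // Show the low-resolution thumbnail immediately (cheap, already decoded)...
            self.imageView.image = [UIImage imageWithCGImage:[asset thumbnail]];

            // ...then decode the screen-sized representation off the main thread
            // and swap it in once it is ready.
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                CGImageRef full = [[asset defaultRepresentation] fullScreenImage];
                UIImage *fullImage = [UIImage imageWithCGImage:full];
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.imageView.image = fullImage;
                });
            });

            *innerStop = YES;
            *stop = YES;
        }];
    } failureBlock:^(NSError *error) {
        NSLog(@"Asset library access denied: %@", error);
    }];
}

This mirrors what the Photos app appears to do: a cheap thumbnail goes on screen first, and the higher-resolution representation replaces it as soon as it has been decoded.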

Related

iPhone iOS how to add geotagging info to a PNG file?

I'm building a camera effects app and would like to be able to add geo tagging to the screenshots that I capture.
I'm grabbing camera images from an AVCaptureSession frame buffer and am displaying them to the user after some processing. The end result is a PNG screenshot. How can I take a CLLocation object and add it to a PNG image?
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // how to add geotagging to this method?
    [self performImageCaptureFrom:sampleBuffer];
}
Thank you!
You can follow this post to add the EXIF data to the image: How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG)
Note that the user must have already allowed location services for your app. If you are processing the image, or letting the user do so, it is easier to add the metadata just before the final save: you avoid doubling memory use by reopening the image a second time, and the geotagging cannot be lost during whatever processing you apply.
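For reference, here is a rough sketch of that approach using Image I/O, assuming ARC; GPSDictionaryForLocation and ImageDataByAddingLocation are illustrative names, not framework API. Note that the GPS tags survive reliably in JPEG output; PNG has no standard EXIF container, so this sketch re-encodes as JPEG.

#import <ImageIO/ImageIO.h>
#import <CoreLocation/CoreLocation.h>
#import <MobileCoreServices/MobileCoreServices.h> // kUTTypeJPEG

// Hypothetical helper: build an Image I/O GPS dictionary from a CLLocation.
static NSDictionary *GPSDictionaryForLocation(CLLocation *location)
{
    CLLocationDegrees lat = location.coordinate.latitude;
    CLLocationDegrees lon = location.coordinate.longitude;
    return [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:fabs(lat)], (NSString *)kCGImagePropertyGPSLatitude,
            (lat >= 0 ? @"N" : @"S"),              (NSString *)kCGImagePropertyGPSLatitudeRef,
            [NSNumber numberWithDouble:fabs(lon)], (NSString *)kCGImagePropertyGPSLongitude,
            (lon >= 0 ? @"E" : @"W"),              (NSString *)kCGImagePropertyGPSLongitudeRef,
            nil];
}

// Hypothetical helper: re-encode the image data with the GPS dictionary attached.
static NSData *ImageDataByAddingLocation(NSData *imageData, CLLocation *location)
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
    if (source == NULL) return nil;

    NSDictionary *metadata = [NSDictionary dictionaryWithObject:GPSDictionaryForLocation(location)
                                                          forKey:(NSString *)kCGImagePropertyGPSDictionary];

    NSMutableData *output = [NSMutableData data];
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithData((__bridge CFMutableDataRef)output, kUTTypeJPEG, 1, NULL);
    if (destination != NULL) {
        CGImageDestinationAddImageFromSource(destination, source, 0,
                                             (__bridge CFDictionaryRef)metadata);
        CGImageDestinationFinalize(destination);
        CFRelease(destination);
    }
    CFRelease(source);
    return output;
}

You would call ImageDataByAddingLocation() with the processed image's data and the CLLocation you already have, just before writing the file.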

How do I create an AVAsset with a UIImage captured from a camera?

I am a newbie trying to capture camera video images using AVFoundation, and I want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control to be able to slow down or speed up the rate at which the camera images are displayed.
Using other people's code as examples, I can capture images and, using an NSTimer whose interval my slider controls, decide on the fly how often to display them, but I can't convert the image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)
I have looked at the AVFoundation Programming Guide, which talks about AVAssets, AVPlayer, and so on, but I can't see how a camera image can be turned into an AVAsset. The guide, and the other demos that show how to define an AVAsset, only give me the choice of creating an asset from HTTP stream data or from a URL to an existing file. I can't figure out how to make my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems, and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and calling [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
I have tried code similar to WWDC session 409, "Using the Camera on iPhone." Like that myCone demo, I can set up the device, the input, the capture session, the output, and a callback that receives a CMSampleBuffer, and I can collect UIImages and size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it assumes a knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to put the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
I have looked at UIImagePickerController as an alternate way of controlling how often I display captured camera images, and I don't see that I can change the display timing on the fly with that controller either.
So, as you can see, I am learning this stuff from the Apple developer forum, the documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of getting camera frames onto the screen without using AVCaptureVideoPreviewLayer, UIImagePickerController, or an AVAsset that isn't already a file or HTTP stream.
Can anybody make a suggestion? Thanks in advance.
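One commonly used approach, sketched below under a couple of assumptions (the AVCaptureVideoDataOutput is configured for kCVPixelFormatType_32BGRA, and self.imageView is your own IBOutlet), is to convert each CMSampleBuffer to a UIImage in the callback and hand it to the main thread:

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: convert a BGRA sample buffer into a UIImage.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void  *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width       = CVPixelBufferGetWidth(pixelBuffer);
    size_t height      = CVPixelBufferGetHeight(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    return image;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // UIKit must only be touched on the main thread; this alone is often why
    // "nothing gets displayed" when the image is set from the capture queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
}

No AVAsset, AVPlayer, or AVPlayerLayer is needed here; those classes are for playing media that already exists as a file or stream. To throttle the display rate, you can simply drop frames, or stash the latest UIImage and let your NSTimer decide when to push it into the view.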

Is there any standard file-open dialog in iPhone programming?

I want to create an application that will process images (mostly photographs from the phone's camera).
So first I need to provide a way to open an image file from the phone's memory.
How can I do that?
Sorry for such a stupid question, but I am a newbie in iPhone programming (yet :)).
P.S. I use Xcode and Cocoa.
You can use UIImagePickerController to get a standard interface for selecting a photo from the user's library or camera roll. Depending on the configuration, it can also be used to capture a new image with the camera. Note, however, that you get a UIImage back, not a file; if you want to upload it somewhere, you will first have to create a file or NSData object from the image using either UIImagePNGRepresentation() or UIImageJPEGRepresentation().
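A minimal sketch of that flow (your view controller is assumed to adopt both UINavigationControllerDelegate and UIImagePickerControllerDelegate):

// Present the standard photo-library picker.
- (void)pickImage
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

// Delegate callback: you get a UIImage, not a file path.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // If you need a file or an upload payload, create the bytes yourself.
    NSData *pngData = UIImagePNGRepresentation(image);
    NSLog(@"picked image, %u bytes as PNG", (unsigned)[pngData length]);

    [picker dismissViewControllerAnimated:YES completion:nil];
}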

Is there a way to get the paths of all the images in the Camera Roll?

Is there any way to create an array or something similar containing the paths of all the images stored in the Camera Roll?
Please enlighten me on this.
Thanks in advance.
SpyPhone uses a direct path to the user's photo library:
http://github.com/nst/spyphone/
It seems that any application can read it. However, this seems to fall under the heading of "undocumented API" and could well get a real application rejected.
I don't think there is. As far as the UIImagePickerController API is concerned, you can only get a single path, and that is for a movie recorded by the user (if running 3.0 on a 3GS); even that path is a file URL to the temporary folder where the movie is stored before being written to the library.
You can't get paths for any images in the photo library or the Camera Roll.
When you pick an image using the UIImagePickerController, the controller returns the original image and an "edited image" if the image was edited before being chosen.
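For completeness, this is roughly what that delegate callback looks like (a sketch). There is still no filesystem path, but from iOS 4.1 on the info dictionary does include UIImagePickerControllerReferenceURL, an assets-library URL you can pass to ALAssetsLibrary:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *original = [info objectForKey:UIImagePickerControllerOriginalImage];
    UIImage *edited   = [info objectForKey:UIImagePickerControllerEditedImage]; // nil unless allowsEditing was set

    // Not a path, but the closest thing the public API offers (iOS 4.1+): an
    // assets-library URL usable with -[ALAssetsLibrary assetForURL:resultBlock:failureBlock:].
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    NSLog(@"picked %@ (edited: %@), asset URL: %@", original, edited, assetURL);

    [picker dismissViewControllerAnimated:YES completion:nil];
}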

UIImagePickerController does not deliver geo tag data

When I use UIImagePickerController to select a photo, either from the Camera Roll or the Photo Library, the image that gets returned to me in the method
'didFinishPickingImage'
does not contain the EXIF data for latitude and longitude. I know the headers are there, because they show up when the photos are imported into iPhoto, and images uploaded straight from the Camera Roll also contain the EXIF location headers. Is there a way to get UIImagePickerController to deliver that information as well?
It's up to you to geotag the image received from the UIImagePickerController; see my answer here:
UIImagePickerController and extracting EXIF data from existing photos
It seems that there is not. The UIImagePickerController incorrectly strips off the location tags, but they are there in the file (which you are not allowed to access, as it is outside your sandbox).
See the discussion on Apple's forums here: link text
And the comments on Flickr here: link text
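If you can require iOS 4.1 or later, one workaround (a sketch, not the linked answers' code, assuming ARC) is to look the picked photo up in the assets library via UIImagePickerControllerReferenceURL and read its location property, which is not stripped:

#import <AssetsLibrary/AssetsLibrary.h>
#import <CoreLocation/CoreLocation.h>

// Sketch: recover the location that the picker strips out.
// 'info' is the dictionary from imagePickerController:didFinishPickingMediaWithInfo:.
- (void)fetchLocationForPickedPhoto:(NSDictionary *)info
{
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL]; // iOS 4.1+
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    // In real code, keep the library in a property so it outlives this asynchronous call.

    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
                 NSLog(@"photo location: %@", location);
             }
            failureBlock:^(NSError *error) {
                 NSLog(@"could not load asset: %@", error);
             }];
}

Note that this requires the user to have granted location access, since the assets library exposes photo geodata on these iOS versions.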