Does the iPhone 3G/3GS camera put metadata in its images? (And how do you get it?)

I've been trying to figure out whether the iPhone (either 3G or 3GS) camera puts metadata into its images. Anecdotally, it appears that it does (e.g., I've seen images posted on the web that included a bunch of metadata), but I can't find any reference to it in the SDK documentation. So... does anyone have a definitive answer? Also, if there is metadata, how do I get at it?

There is metadata. Check out the iphone-exif project, which provides a means to get and set the EXIF tags. As they note, UIImage will strip out the metadata; iphone-exif works around this. It requires you to use the UIImageJPEGRepresentation() function to feed the NSData into a specialized scanner class, which they provide.
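As a rough sketch of that workflow, something along these lines should work; the EXFJpeg/exifMetaData names are taken from the iphone-exif project as I remember them, so check the project's headers for the exact class and method names:

    #import <UIKit/UIKit.h>
    #import "EXFJpeg.h"   // from the iphone-exif project

    // Hedged sketch: re-encode the UIImage as JPEG data and hand it to the
    // project's scanner class to pull out whatever EXIF block it finds.
    void LogExifForImage(UIImage *image)
    {
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.8);

        EXFJpeg *scanner = [[EXFJpeg alloc] init];
        [scanner scanImageData:jpegData];        // parses the EXIF segment, if present

        // exifMetaData exposes the parsed tags for reading (and setting).
        NSLog(@"EXIF metadata: %@", scanner.exifMetaData);
        [scanner release];
    }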

IIRC, the API hides that data out of privacy concerns. The data is in the images but you can't get to it using the Apple API.

Related

iOS video player metadata

My question is whether there is any built-in interpretation of metadata by the video player in iOS. I know one can add metadata to a video and interpret it within a custom application, as shown here.
On an iPod touch or iPhone, an HTML video is opened in the native player. I would like to display a message above or below the video for a short duration at the beginning. Since I cannot control the native player, I thought there might be some built-in metadata interpretation that could be used to do this. I have not been able to find any information on this.
Any help is appreciated.
The blog you've posted includes details on using the native player, MPMoviePlayerController, to display metadata, which is pretty cool actually. You learn something new every day! If you're making a PhoneGap app, I suppose you could write a plugin to do this?
Or alternatively, have a look at this other SO question, which appears to suggest that it is possible, though seemingly not with metadata embedded in the actual video. Apparently this works on iOS:
Reading metadata from the <track> of an HTML5 <video> using Captionator
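If you do end up on the native/PhoneGap-plugin route, a minimal sketch of overlaying a short-lived message on the player could look like the following. This assumes an MPMoviePlayerController you create and control yourself (it obviously doesn't apply to the player the browser opens for a plain HTML video):

    #import <MediaPlayer/MediaPlayer.h>

    // Hedged sketch: play a video with our own MPMoviePlayerController and lay
    // a temporary UILabel over its view for a few seconds.
    - (void)playVideoAtURL:(NSURL *)url withMessage:(NSString *)message
    {
        // Keep a strong reference to the player (e.g. an ivar/property) in real
        // code so it isn't deallocated while playing; omitted here for brevity.
        MPMoviePlayerController *player =
            [[MPMoviePlayerController alloc] initWithContentURL:url];
        player.view.frame = self.view.bounds;
        [self.view addSubview:player.view];

        UILabel *banner = [[UILabel alloc]
            initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, 40)];
        banner.text = message;
        banner.textAlignment = UITextAlignmentCenter;
        banner.textColor = [UIColor whiteColor];
        banner.backgroundColor = [UIColor colorWithWhite:0.0 alpha:0.6];
        [player.view addSubview:banner];

        [player play];

        // Pull the banner off again after five seconds.
        [banner performSelector:@selector(removeFromSuperview)
                     withObject:nil
                     afterDelay:5.0];
        [banner release];
    }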

iOS Advanced Audio API for decompressing format

On iOS, is it possible to get the user's audio stream in a decompressed format? For example, so that an MP3 is returned as WAV data that can be used for audio analysis? I'm relatively new to the iOS platform, and I remember seeing that this wasn't possible in older iOS versions. I read that iOS 4 brought in some advanced APIs, but I'm not sure where I can find documentation/samples for these.
If you don't mind using APIs available in iOS 4.1 and above, you could try the AVAssetReader class and friends. This similar question has a full example of how to extract video frames. I would expect the same to work for audio, and the nice thing is that the reader deals with all the details of decompression. You can even do composition with AVComposition to merge several streams.
These classes are part of the AV Foundation framework, which allows not only reading but also creating your own content.
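As a rough sketch of the AVAssetReader approach (iOS 4.1+), the output settings below ask the reader to hand back plain 16-bit PCM regardless of the source codec; error handling is abbreviated and the asset URL is whatever you have on hand:

    #import <AVFoundation/AVFoundation.h>
    #import <CoreAudio/CoreAudioTypes.h>

    // Hedged sketch: decode an audio asset to linear PCM and walk its buffers.
    void DumpDecodedAudio(NSURL *assetURL)
    {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
        NSError *error = nil;
        AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];

        // Request uncompressed, interleaved, 16-bit little-endian integer PCM.
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
            [NSNumber numberWithInt:16],                    AVLinearPCMBitDepthKey,
            [NSNumber numberWithBool:NO],                   AVLinearPCMIsFloatKey,
            [NSNumber numberWithBool:NO],                   AVLinearPCMIsBigEndianKey,
            [NSNumber numberWithBool:NO],                   AVLinearPCMIsNonInterleaved,
            nil];

        AVAssetTrack *track =
            [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVAssetReaderTrackOutput *output =
            [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                       outputSettings:settings];
        [reader addOutput:output];
        [reader startReading];

        CMSampleBufferRef buffer = NULL;
        while ((buffer = [output copyNextSampleBuffer])) {
            // Each CMSampleBuffer holds a chunk of decompressed PCM samples;
            // analyze it (or append it to a WAV file) here.
            CFRelease(buffer);
        }
    }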
Apple has an OpenAL example at http://developer.apple.com/library/mac/#samplecode/OpenALExample/Introduction/Intro.html where Scene.m should interest you.
The Apple documentation has a diagram where the Core Audio framework clearly shows MP3 among the formats it handles for you. It also states that you can access audio units directly if you need lower-level control.
The same Core Audio document also gives some information about using MIDI, if that is of any help to you.
Edit:
You're in luck today.
In this example, an audio file is loaded and fed into an AudioUnit graph. You could fairly easily write an AudioUnit of your own to put into this graph that analyzes the PCM stream as you see fit. You can even do the analysis in the render callback function, although that's probably not a good idea, because callbacks are encouraged to be as simple as possible.
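For the callback route, a minimal sketch might register a render-notify callback on the output unit and peek at the rendered buffers; the 16-bit interleaved format is an assumption that has to match the graph's actual stream format:

    #import <AudioUnit/AudioUnit.h>
    #import <math.h>

    // Hedged sketch: track the peak sample level of whatever the graph renders.
    static OSStatus PeakMeterCallback(void *inRefCon,
                                      AudioUnitRenderActionFlags *ioActionFlags,
                                      const AudioTimeStamp *inTimeStamp,
                                      UInt32 inBusNumber,
                                      UInt32 inNumberFrames,
                                      AudioBufferList *ioData)
    {
        // Only inspect the buffers after the unit has filled them.
        if (!(*ioActionFlags & kAudioUnitRenderAction_PostRender)) {
            return noErr;
        }
        Float32 *peak = (Float32 *)inRefCon;   // shared state handed in below
        for (UInt32 b = 0; b < ioData->mNumberBuffers; b++) {
            SInt16 *samples = (SInt16 *)ioData->mBuffers[b].mData;   // assumes 16-bit PCM
            UInt32 count = ioData->mBuffers[b].mDataByteSize / sizeof(SInt16);
            for (UInt32 i = 0; i < count; i++) {
                Float32 value = fabsf(samples[i] / 32768.0f);
                if (value > *peak) { *peak = value; }
            }
        }
        return noErr;
    }

    // Registered once the graph is set up, e.g.:
    //   AudioUnitAddRenderNotify(outputUnit, PeakMeterCallback, &peakLevel);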

Access iPhone User Songs and Videos?

I was wondering if I can access the user's songs and videos on the iPhone and, as part of that access, whether I can save or modify them? Hopefully without requiring a jailbroken iPhone.
Any help is greatly appreciated.
Yes, you can. I can't speak for video because I've only done it for audio, but you can definitely get at audio data. These links should get you started. Note: I am as yet unsure whether this works with tracks that use any kind of iTunes-related DRM.
First of all, this blog post talks you through the method of accessing the data. Note the reliance on iOS 4.1 or above.
This SO question/answer explains how to get at the raw PCM data, should you want to do more than just save it out.
You can allow the user to pick songs using the MPMediaPickerController class. I think you can save the selected item to your app's sandbox directory.
You can read up on this a bit more with this SO question.
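For the MPMediaPickerController route mentioned above, a minimal sketch might look like this (it assumes the containing view controller adopts MPMediaPickerControllerDelegate; the asset URL requires iOS 4.0+ and comes back nil for DRM-protected tracks):

    #import <MediaPlayer/MediaPlayer.h>

    // Hedged sketch: let the user pick a song, then grab the URL that
    // AVAssetReader / AVAssetExportSession can read the audio from.
    - (void)pickSong
    {
        MPMediaPickerController *picker =
            [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
        picker.delegate = self;
        [self presentModalViewController:picker animated:YES];
        [picker release];
    }

    - (void)mediaPicker:(MPMediaPickerController *)picker
      didPickMediaItems:(MPMediaItemCollection *)collection
    {
        MPMediaItem *item = [[collection items] objectAtIndex:0];
        NSURL *assetURL = [item valueForProperty:MPMediaItemPropertyAssetURL];
        NSLog(@"Asset URL: %@", assetURL);   // nil for protected (DRM) content
        [self dismissModalViewControllerAnimated:YES];
    }

    - (void)mediaPickerDidCancel:(MPMediaPickerController *)picker
    {
        [self dismissModalViewControllerAnimated:YES];
    }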

iPhone streaming debugging information

I'm looking for a way (doesn't need to be App Store-safe!) to get hold of video-streaming-relevant debugging information.
What I'm trying to do is write an application that opens a video stream and displays information like:
framerate
audio/video bitrate
codec information
etc.
Basically, I want to display as much information as possible for any given stream.
Thanks for any information in advance,
best regards
sam
Even though you tagged your question with MPMoviePlayerController, that class probably isn't going to help you out very much. First of all, there's a limited amount of information you can access from it at a high level, certainly nothing about codecs and audio bitrate. And even if the class does store this type of information somehow, your app would be disqualified from being in the iTunes App Store if you access non-public methods or properties.
Secondly, MPMoviePlayerController only supports a limited number of codecs itself, namely the ones that can be decoded in hardware on the iPhone/iPad (H.264 baseline and MPEG-4 videos).
Anyway, a good option could be FFmpeg for the iPhone. Getting the information you need seems to be much more straightforward this way; check out this blog post for a nice tutorial on using the libraries.
I'm not sure about the potential legal issues concerning distributing such a program in the App Store, but statically linking it with your binary would at least satisfy Apple... you'll have to check the FFmpeg legal site for their end.
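As a rough sketch of what that looks like with the FFmpeg libraries, the libavformat calls below print the container duration, overall bitrate, and per-stream codec details to stderr; note that the exact function names vary between FFmpeg versions (older iPhone ports used av_open_input_file()/dump_format() instead):

    #include "libavformat/avformat.h"

    // Hedged sketch: open a stream or file and dump its format and codec info.
    static void DumpStreamInfo(const char *url)
    {
        av_register_all();                       // required on older FFmpeg versions
        AVFormatContext *ctx = NULL;

        if (avformat_open_input(&ctx, url, NULL, NULL) != 0) {
            return;                              // could not open the input
        }
        if (avformat_find_stream_info(ctx, NULL) >= 0) {
            // Logs duration, bitrate, codecs, framerate, etc. to stderr.
            av_dump_format(ctx, 0, url, 0);
        }
        avformat_close_input(&ctx);
    }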

Newbie wants to create a PDF reader for iPod touch - what's the best approach?

I want to make a small app that displays a PDF, presenting zoomable single pages with a previous/next page function.
The Core Graphics API is pretty much the same in Cocoa and Cocoa Touch. Read up on CGPDFDocument; it should provide you with everything you will need to render PDF pages. You won't need to read the PDF spec or use a library to parse PDF files directly. You will probably need to learn more about Core Graphics / Quartz 2D to understand how to use those functions inside a Cocoa app.
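A minimal sketch of rendering a single page, e.g. from a UIView's drawRect:, might look like this ("book.pdf" is just a placeholder resource name):

    #import <UIKit/UIKit.h>

    // Hedged sketch: draw one page of a bundled PDF into a Core Graphics context.
    static void DrawPDFPage(CGContextRef ctx, CGRect bounds, size_t pageNumber)
    {
        NSURL *pdfURL = [[NSBundle mainBundle] URLForResource:@"book"
                                                withExtension:@"pdf"];
        CGPDFDocumentRef document = CGPDFDocumentCreateWithURL((CFURLRef)pdfURL);
        CGPDFPageRef page = CGPDFDocumentGetPage(document, pageNumber); // 1-based

        // PDF coordinates are flipped relative to UIKit, so flip the context.
        CGContextSaveGState(ctx);
        CGContextTranslateCTM(ctx, 0.0, bounds.size.height);
        CGContextScaleCTM(ctx, 1.0, -1.0);

        // Scale the page to fit the view bounds and draw it.
        CGAffineTransform transform =
            CGPDFPageGetDrawingTransform(page, kCGPDFMediaBox, bounds, 0, true);
        CGContextConcatCTM(ctx, transform);
        CGContextDrawPDFPage(ctx, page);

        CGContextRestoreGState(ctx);
        CGPDFDocumentRelease(document);
    }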
Based on the gradually evolving Apple policy of rejecting application submissions that duplicate functionality already on the iPhone, I would worry about spending too much time, even as a newbie, on something that is part of the core iPhone feature set.
This is pretty trivial. The CGPDFDocument functions will allow you to do anything you'd want to do with a PDF file.
The iPhone and iPod touch can view PDFs already; one of the TV adverts in the UK shows an email with a .pdf attachment (of swimming lessons) being viewed. They can also view .doc, .xls, and so on, so if you are creating a viewer-type application, then supporting those as well could be a nice feature addition later on.
This means there is a PDF framework on these devices that you will need to access. Presumably Apple can provide support here if you are a paid-up developer. Syncing the PDFs to the device is the real difficulty, as this isn't supported by iTunes. I assume that you would need to write a network-based synchronisation tool, or have an online cloud for holding people's PDFs.
The device doesn't support Flash, so using PDF to Flash conversion tools will not work.
I found this HTML5 framework that should work on an iPad (http://bakerframework.com/), but I haven't tested it yet.