Is there a way to render iPhone mic input directly into memory without working with files?

I see that iPhone Core Audio does not include AudioDevice objects for rendering audio input directly into RAM. I hear people talking about using files to do this (as in the SpeakHere sample), but I am thinking there must be another way. Your thoughts would be appreciated.

Check out the aurioTouch sample on the iPhone Developer site.
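For reference, aurioTouch does this with a RemoteIO Audio Unit render callback. If you can target newer iOS versions, AVAudioEngine (iOS 8+) gets mic samples into memory with far less code. A minimal sketch, assuming the audio session and mic permission are already set up:

```swift
import AVFoundation

// Minimal sketch: mic samples land in RAM via a tap, no files involved.
// Assumes the audio session is configured and mic permission is granted.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    // Analyze the buffer in memory, e.g. compute its peak level.
    var peak: Float = 0
    for i in 0..<Int(buffer.frameLength) {
        peak = max(peak, abs(samples[i]))
    }
    print("peak level:", peak)
}

do {
    try engine.start()
} catch {
    print("could not start engine:", error)
}
```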

Related

iOS Advanced Audio API for decompressing format

On iOS, is it possible to get the user's audio stream in a decompressed format? For example, an MP3 returned as WAV data that can be used for audio analysis? I'm relatively new to the iOS platform, and I remember seeing that this wasn't possible in older iOS versions. I read that iOS 4 brought in some advanced APIs, but I'm not sure where I can find documentation/samples for these.
If you don't mind using APIs available in iOS 4.1 and above, you could try the AVAssetReader class and friends. In this similar question you have a full example of how to extract video frames. I would expect the same to work for audio, and the nice thing is that the reader deals with all the details of decompression. You can even do composition with AVComposition to merge several streams.
These classes are part of AVFoundation, which allows not only reading but also creating your own content.
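To make the AVAssetReader route concrete, here is a minimal Swift sketch that decodes a compressed track into raw 16-bit PCM held in memory. The file URL and output settings are illustrative assumptions, and real code needs fuller error handling:

```swift
import AVFoundation
import CoreMedia

// Sketch: decode a compressed audio file (e.g. an MP3) to raw 16-bit
// linear PCM using AVAssetReader (iOS 4.1+).
func decodePCM(from url: URL) throws -> Data {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .audio).first else { return Data() }

    let reader = try AVAssetReader(asset: asset)
    // Ask the reader to hand back decompressed linear PCM.
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false,
    ])
    reader.add(output)
    reader.startReading()

    var pcm = Data()
    // Each sample buffer carries a chunk of decoded samples.
    while let sample = output.copyNextSampleBuffer(),
          let block = CMSampleBufferGetDataBuffer(sample) {
        var length = 0
        var pointer: UnsafeMutablePointer<CChar>?
        CMBlockBufferGetDataPointer(block, atOffset: 0, lengthAtOffsetOut: nil,
                                    totalLengthOut: &length, dataPointerOut: &pointer)
        if let p = pointer {
            p.withMemoryRebound(to: UInt8.self, capacity: length) {
                pcm.append($0, count: length)
            }
        }
    }
    return pcm
}
```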
Apple has an OpenAL example at http://developer.apple.com/library/mac/#samplecode/OpenALExample/Introduction/Intro.html where Scene.m should interest you.
The Apple documentation has a picture in which the Core Audio framework clearly shows that it gives you MP3 out. It also states that you can access audio units directly at a lower level if you need to.
The same Core Audio document also gives some information about using MIDI, if that may help you.
Edit:
You're in luck today.
In this example, an audio file is loaded and fed into an AudioUnit graph. You could fairly easily write an AudioUnit of your own to put into this graph, one that analyzes the PCM stream as you see fit. You can even do the analysis in the render callback, although that's probably not a good idea, because callbacks are encouraged to be as simple as possible.

Is it possible to read iPhone or Android display data through the audio jack?

I'm not very well versed in the iPhone and Android API, so please bear with me if this is a stupid question.
As I understand it, Square's card reader works by converting the magnetic information on the card stripe into an audio tone that its software can then process. [1]
In a similar way, is there a way to somehow read exactly what is being displayed on the device's screen, simply through a small device inserted into its audio jack?
[1] http://www.quora.com/How-does-Squares-hardware-work
It's not quite clear what you wish to achieve. You can indeed make an app that would output a representation (perhaps audio frequency-shift keying?) of the screen's contents to the iPhone's audio jack.
The iPhone (and other iOS-based devices) uses a TRRS connector for bi-directional audio (and hence arbitrary modulated data) communication, and there are well-supported, publicly documented APIs for using this interface.
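To illustrate the modulation idea, here is a toy Swift sketch that turns bytes into FSK audio samples. The frequencies, bit duration, and framing are arbitrary choices for illustration, not Square's actual protocol:

```swift
import Foundation

// Toy FSK modulator: each bit becomes a short sine burst at one of two
// frequencies. All parameters here are arbitrary illustrations.
func fskSamples(for bytes: [UInt8],
                sampleRate: Double = 44_100,
                bitDuration: Double = 0.01,
                zeroFreq: Double = 1_200,
                oneFreq: Double = 2_200) -> [Float] {
    let samplesPerBit = Int(sampleRate * bitDuration)
    var out: [Float] = []
    out.reserveCapacity(bytes.count * 8 * samplesPerBit)
    for byte in bytes {
        for bit in 0..<8 {
            // Pick the carrier frequency for this bit, LSB first.
            let freq = ((byte >> bit) & 1) == 1 ? oneFreq : zeroFreq
            for n in 0..<samplesPerBit {
                out.append(Float(sin(2 * Double.pi * freq * Double(n) / sampleRate)))
            }
        }
    }
    return out
}
```

The resulting sample buffer could then be played out of the headphone jack with any of the standard audio output APIs.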
That said, if you're writing your own app: why would you want to output the contents of the screen? If you are developing the app in question, why not transmit the salient data in a more effective manner? Which leads me to my next assumption:
You want to read what's being displayed on the device's screen at any time, not just when an app of your creation is open. In this case, the answer is that it is not possible, with the possible exception of a jailbroken solution. That said, I can't imagine a jailbroken solution staying useful much longer, given that iOS 5 introduced "display mirroring" by means of AirPlay.
On Android, I have no idea. :-)
No. The screen is not connected to the audio jack.
I think you could make an app that takes a screenshot and then encodes that image as audio to play it.
It won't sound good, though :)
For this kind of task, there is the built-in camera.

iPhone SDK: Is it possible to process audio file from local library

Well, I will try my best not to make this an 'I just want the code' question...
I'm recently working on a project which requires some audio signal processing from local music files (e.g. iTunes Library). The whole work includes:
Get the PCM data of an audio file (normally from the iTunes library); <-- AudioQueue (?)
Write the PCM data to a new file (it seems that Apple does not allow direct modification of music tracks); <-- Core Audio (?)
Do some processing and modification, like filters, manipulators, etc. (see the sketch after this list); <-- will be developed in C++
Play the processed track. <-- RemoteIO
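To make step 3 concrete, here is a minimal sketch of that kind of per-sample processing: a one-pole low-pass filter over 16-bit PCM samples. It is shown in Swift to match the other snippets here; the arithmetic ports directly to C++:

```swift
// One-pole low-pass filter over 16-bit PCM:
//   y[n] = y[n-1] + alpha * (x[n] - y[n-1])
// where alpha is an arbitrary smoothing factor in (0, 1].
func lowPass(_ samples: [Int16], alpha: Float = 0.1) -> [Int16] {
    var out: [Int16] = []
    out.reserveCapacity(samples.count)
    var state: Float = 0
    for s in samples {
        state += alpha * (Float(s) - state)
        // Clamp back into the Int16 range before narrowing.
        out.append(Int16(max(-32_768, min(32_767, state))))
    }
    return out
}
```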
The problem is, after going through some blogs and discussions:
http://lists.apple.com/archives/coreaudio-api/2009/Aug/msg00100.html, http://atastypixel.com/blog/using-remoteio-audio-unit/
http://osdir.com/ml/coreaudio-api/2009-08/msg00093.html
as well as the official sample code, I got the feeling that the Core Audio SDK allows us to apply audio processing only to voice demos recorded from the mic.
My question is that:
Can I get raw data from iTunes library tracks instead of mic input?
If the answer to the first question is 'no', is there a way to 'fool' the SDK into thinking it is getting data from mic input rather than from iTunes? (I have done some similar 'hacking' in C# before XD)
If the whole approach just doesn't work, can anyone offer some alternative ideas?
Any help will be appreciated. Thank you very much :-)
Just found something really cool yesterday.
From iPhone Media Library to PCM Samples in Dozens of Confounding, Potentially Lossy Steps
(http://www.subfurther.com/blog/?p=1103)
And also an MIT-licensed class library:
TSLibraryImport: Objective-C class + sample code for importing files from user's iPod Library in iOS4.
(http://bitbucket.org/artgillespie/tslibraryimport/changeset/a81838f8c78a)
Hope they help!
Cheers,
Manca
1) No. Apple does not allow direct access to the PCM data of songs. Otherwise you could create music-sharing apps, which is not in Apple's interest.
2) No. Hacking around this and getting approved is impossible, due to Apple's app review process.
3) The only alternative I can think of is to do the processing part on a PC/Mac and then transfer the result to the iPhone. Alternatively, you could store the files in your own application's folder; you should be able to load and process those via Core Audio.
I know this thread is old but... did this work for you, Manca? And did this app get approved?
EDIT: I just discovered the AVAssetReader class, introduced in iOS 4.1, which should help.

Create video in iPhone

I need to convert image sequences (i.e., PNG files) to a video file on the iPhone. How can I convert the images to video?
Regards,
Just ignore bad advice like "use ffmpeg". That would work on the desktop, but the license issue makes including FFmpeg source code in your iPhone app legally questionable. Apple provides a class named AVAssetWriter that you can use in your app to encode a series of images as H.264 stored in a .m4v QuickTime container file. While the Apple-provided logic does work, it is not so easy to actually use, and you will need to read quite a lot of documentation to get the code working. If you want to skip implementing it yourself (and likely save yourself 3 or 4 days of work), consider using my AVAnimator library for iOS, as the H.264 encoding logic is already implemented in the class AVAssetWriterConvertFromMaxvid. Once encoded as H.264, the video can be played with the standard player, and it is small enough to upload to a remote server.
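For comparison, here is a condensed Swift sketch of the plain AVAssetWriter route described above. It is a sketch, not production code: a real implementation should feed frames via requestMediaDataWhenReady(on:using:) rather than the crude polling loop, and needs proper error handling:

```swift
import AVFoundation
import CoreGraphics

// Condensed sketch: encode an array of CGImages as H.264 video.
func writeVideo(from images: [CGImage], size: CGSize, fps: Int32, to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height),
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (i, image) in images.enumerated() {
        while !input.isReadyForMoreMediaData { usleep(1_000) } // crude backpressure
        var pb: CVPixelBuffer?
        CVPixelBufferCreate(nil, Int(size.width), Int(size.height),
                            kCVPixelFormatType_32ARGB, nil, &pb)
        guard let pixelBuffer = pb else { continue }
        // Draw the frame into the pixel buffer with Core Graphics.
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        if let ctx = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                               width: Int(size.width), height: Int(size.height),
                               bitsPerComponent: 8,
                               bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                               space: CGColorSpaceCreateDeviceRGB(),
                               bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) {
            ctx.draw(image, in: CGRect(origin: .zero, size: size))
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        // Frame i is presented at i/fps seconds.
        if !adaptor.append(pixelBuffer,
                           withPresentationTime: CMTime(value: CMTimeValue(i), timescale: fps)) {
            break
        }
    }
    input.markAsFinished()
    writer.finishWriting { /* completion is asynchronous */ }
}
```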
You are likely going to need something like FFMPEG

Newbie wants to create a PDF reader for iPod touch - what's the best approach?

I want to make a small app that displays a PDF, presenting zoomable single pages with previous/next page functions.
The Core Graphics API is pretty much the same in Cocoa and Cocoa Touch. Read up on CGPDFDocument; it should provide you with everything you need to render PDF pages. You won't need to read the PDF spec or use a library to parse PDF files directly. You will probably need to learn more about Core Graphics / Quartz 2D / etc. to understand how to use those functions inside of a Cocoa app.
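A minimal sketch of that approach in modern Swift (UIGraphicsImageRenderer is assumed for brevity; the CGPDFDocument calls are the same Core Graphics functions mentioned above):

```swift
import UIKit

// Sketch: render one page of a PDF into a UIImage via CGPDFDocument.
// Page numbers are 1-based; `scale` acts as a zoom factor.
func renderPage(of url: URL, pageNumber: Int, scale: CGFloat = 1.0) -> UIImage? {
    guard let document = CGPDFDocument(url as CFURL),
          let page = document.page(at: pageNumber) else { return nil }
    let box = page.getBoxRect(.mediaBox)
    let size = CGSize(width: box.width * scale, height: box.height * scale)
    return UIGraphicsImageRenderer(size: size).image { ctx in
        // PDF space is flipped relative to UIKit, so flip the context.
        ctx.cgContext.translateBy(x: 0, y: size.height)
        ctx.cgContext.scaleBy(x: scale, y: -scale)
        ctx.cgContext.drawPDFPage(page)
    }
}
```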
Based on Apple's gradually evolving policy of rejecting app submissions that duplicate functionality already on the iPhone, I would worry about spending too much time, even as a newbie, on something that is part of the core iPhone feature set.
This is pretty trivial. The CGPDFDocument functions will allow you to do anything you'd want to do with a PDF file.
The iPhone and iPod touch can already view PDFs; one of the TV adverts in the UK shows an email with a .pdf attachment (of swimming lessons) being viewed. They can also view .doc, .xls, and so on, so if you are creating a viewer-type application, supporting those as well could be a nice feature addition later on.
This means there is a PDF framework on these devices that you will need to access; presumably Apple can provide support here if you are a paid-up developer. Syncing the PDFs to the device is the real difficulty, as this isn't supported by iTunes. I assume you would need to write a network-based synchronisation tool, or have an online cloud for holding people's PDFs.
The device doesn't support Flash, so using PDF to Flash conversion tools will not work.
I found this HTML5 framework that should work on an iPad (http://bakerframework.com/), but I haven't tested it yet.