I need to support VoiceOver (for blind and partially blind users) in my application. Please guide me on how to implement this feature.
This is the Accessibility Programming Guide for iOS.
I think this question is very broad, with many aspects, so I'll try to touch on a few:
Technically, if you can abstract each string from the interface, a text-to-speech engine like this can help: http://www.acapela-for-iphone.com/. Off the top of my head, I can think of speaking the positions of the interface elements out loud. This approach requires simple interfaces, with few elements on one screen and perhaps multiple screens for subsequent actions.
The best answer I've seen so far is this article, which points to a Stanford lecture by Apple engineer Chris Fleizach:
http://www.podfeet.com/blog/tutorials-5/build-accessible-ios-apps/
Chris's video: http://www.youtube.com/watch?v=5b0V6MltEnw
The video has what we all are looking for.
In the video, Chris starts talking about the relevant information around the 15:30 mark. The API discussion starts at 19:29.
I suggest taking a look at:
The design criteria to be well known before you code.
The developers guide to get some info about the way to code essential features.
The VoiceOver gestures if you need to know more than just the basic ones.
Some WWDC videos that are summarized and thoroughly detailed, with timestamps so you can jump directly to the appropriate part of the playback.
Many illustrations help in understanding the concepts, and code snippets are provided in both ObjC and Swift.
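To make the essentials concrete, here is a minimal Objective-C sketch of the UIAccessibility properties that VoiceOver actually reads; the button and its label/hint strings are hypothetical examples, not code from the guides above.

// Expose a custom control to VoiceOver. The strings here are
// hypothetical; in a real app they should be localized.
UIButton *playButton = [UIButton buttonWithType:UIButtonTypeCustom];
playButton.isAccessibilityElement = YES;                      // make VoiceOver see it
playButton.accessibilityLabel = @"Play";                      // what VoiceOver speaks
playButton.accessibilityHint = @"Plays the current track.";   // optional usage hint
playButton.accessibilityTraits = UIAccessibilityTraitButton;  // spoken as "button"

// After a major layout change, tell VoiceOver to re-read the screen.
UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);

Standard UIKit controls come with sensible defaults for most of these, so the work is mainly in labeling custom views and images.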
I'm new to iPhone, Xcode, and OpenGL ES.
I'm looking for example source code that demonstrates how to create a 3D object, rotate it with gestures, and zoom in and out.
Thanks,
Alex
Here is a simple example of how to display and rotate 3D models created with Blender: http://iphonedevelopment.blogspot.com/2009/06/using-3d-models-from-blender-in-opengl.html
The complete source can be found here: http://innerloop.biz/code/ExportTest.zip
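Since the tutorial above only covers loading and drawing, here is a minimal Objective-C sketch of the gesture side, assuming a view controller whose render code consumes the hypothetical ivars _rotationX, _rotationY, and _scale:

// In viewDidLoad: attach pan (rotate) and pinch (zoom) recognizers.
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePan:)];
UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc]
    initWithTarget:self action:@selector(handlePinch:)];
[self.view addGestureRecognizer:pan];
[self.view addGestureRecognizer:pinch];

// Dragging rotates the model around the x/y axes.
- (void)handlePan:(UIPanGestureRecognizer *)sender
{
    CGPoint delta = [sender translationInView:self.view];
    _rotationY += delta.x * 0.01f;  // horizontal drag -> yaw
    _rotationX += delta.y * 0.01f;  // vertical drag -> pitch
    [sender setTranslation:CGPointZero inView:self.view];  // keep deltas incremental
}

// Pinching zooms by scaling the modelview matrix.
- (void)handlePinch:(UIPinchGestureRecognizer *)sender
{
    _scale *= sender.scale;
    sender.scale = 1.0f;  // reset so each callback is incremental
    _scale = MAX(0.1f, MIN(_scale, 10.0f));  // clamp the zoom range
}

In your draw method you would then build the modelview matrix (or call glRotatef/glScalef) from those three values.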
I can recommend the OpenGL SuperBible 5th Ed. It starts out with the very foundations with lots of example code (I believe the specific scenario you describe is chapter 5), and moves on to topics like platform specific development, OpenGL ES for the iPhone... - and it's a decent read, a rare quality among textbooks, in my opinion. All examples and code in the book should compile in Xcode, and they show various exceptions for individual platforms as necessary.
I am currently looking at this challenge as well.
I will put up my findings so far, and whack a bounty on this question to try and get some focus for it.
http://nineveh.gl/ promises to do the job, but it is in beta, and even the most basic examples don't run out of the box (they give compiler errors), so I can't recommend it.
It is possible to integrate Unity with native iOS code, e.g.
http://clevermartian.com/blog/?p=59
http://technology.blurst.com/a-cocoa-based-frontend-for-unity-iphone-applications/
but that stuff looks scary.
http://www.sunsetlakesoftware.com/molecules is open source; it may be possible to lift something from there.
I see you have already answered this, but the Lamarche tutorials are good; there are also OpenGL ES 2.0 tutorials:
http://iphonedevelopment.blogspot.co.uk/2009/05/opengl-es-from-ground-up-table-of.html
Also, for loading models, look into setting up the PowerVR SDK, as it has everything you need to load a 3D model with bone animation, textures, lighting, etc.
I am not sure if this is what you have been searching for, but you can take a look at: http://nehe.gamedev.net/tutorial/texture_filters,lighting&_keyboard_control/15002/
In the lower portion of the page, you can see that there is example code for Mac OS/Cocoa.
I'm still not sure, but I hope this helps.
I had a play around with OpenGL ES a year or so ago, and I found this on-line O'Reilly book very helpful: http://ofps.oreilly.com/titles/9780596804824/
The chapters are typical of most books on this subject, ranging from a math primer to 'advanced' topics (typically your usual scene using shaders that implement cube maps, bump maps, etc.).
You are also able to download the source code for the examples.
Edit: I also own this book: http://www.amazon.co.uk/OpenGL-ES-2-0-Programming-Guide/dp/0321502795/ref=sr_1_1?ie=UTF8&qid=1336064164&sr=8-1
I found it a good read with respect to OpenGL ES as well as 3D graphics in general.
I need to find resources for learning openGL ES for the iPhone.
I've already watched Brad Larson's awesome videos and I'm downloading the advanced videos from apple now.
I know a lot about iOS programming but am clueless about OpenGL, so I'm looking for resources that don't assume I already know OpenGL.
I want to learn the majority of OpenGL's capabilities, but my main goal is to be able to manipulate an image based on touch locations. More specifically, I want to create a water ripple effect that follows the user's finger.
I know there are many equations on Stack Overflow that implement this, but I'm lost when it comes to figuring out how to use them.
I appreciate the kind words on the videos. That definitely makes the class feel like it was worth doing.
Do you have the course notes for both semesters of the class? The spring session notes can be found here in HTML format (VoodooPad format here) and the fall ones here (VoodooPad format here). The links in iTunes U aren't very obvious for those, and they contain many links to OpenGL ES resources that I thought were valuable, as well as all the sample code I show off in the classes.
I like the job that various instructors at Stanford have done with their class sessions on OpenGL ES as part of their iPhone Application Development course (also on iTunes U). They provide a different perspective on the API than I do, and both of us come at it by not assuming that you know OpenGL.
As Bart suggests, Jeff LaMarche's "OpenGL ES from the Ground Up" series is extremely popular for good reason, and he's been posting unpublished chapters from his book on OpenGL ES 2.0 lately as well.
For books, I highly recommend Philip Rideout's iPhone 3D Programming, which introduces fundamentals like the math involved, and takes you all the way through to some fairly advanced techniques. It's also one of the few books to spend a significant amount of time with OpenGL ES 2.0.
However, the best thing that I suggest for learning OpenGL ES is not to spend your time reading books and articles but actually formulate a simple project and try to implement it. Find sample applications out there that do many of the things you want to, and pick them apart. Go back to these resources when you run into brick walls and you'll better understand how the concepts all fit together. I knew very little about OpenGL when I started out with my first application using it, but I built small pieces and standalone prototypes until I knew enough to piece together something that worked.
In your case, I'd look very carefully at the resources linked in the answers to the question "GLSL for simple water surface effects", which do exactly what you want. One implementation uses OpenGL ES 1.1, the other 2.0-style shaders. Pick a way that you want to go (my personal recommendation would be to learn shaders now) and try to make a crude, functional application while working through the above videos and reading material.
You might want to have a look at this: http://iphonedevelopment.blogspot.com/2009/05/opengl-es-from-ground-up-table-of.html
These tutorials seem to be relatively beginner-friendly.
More specifically I want to create a water ripple effect that follows the user's finger.
Here is code that does exactly that: http://developer.apple.com/library/ios/#samplecode/GLCameraRipple/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011222
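If you want to understand what that sample is doing rather than just reuse it, the heart of such effects is the classic two-buffer ripple propagation step. Here is a generic C sketch of that well-known algorithm (not Apple's GLCameraRipple code; the buffer names and damping constant are illustrative):

// prev and curr are width*height "water height" grids. Each frame, a
// cell's new height is driven by the average of its four neighbors
// minus its own height two frames ago, then damped so ripples die out.
void rippleStep(float *prev, float *curr, int width, int height)
{
    for (int y = 1; y < height - 1; y++) {
        for (int x = 1; x < width - 1; x++) {
            int i = y * width + x;
            float neighbors = prev[i - 1] + prev[i + 1] +
                              prev[i - width] + prev[i + width];
            curr[i] = neighbors / 2.0f - curr[i];  // wave propagation
            curr[i] *= 0.985f;                     // damping
        }
    }
}

The caller swaps prev and curr every frame, a touch injects energy by writing a spike into the grid at the finger's location, and the resulting height field perturbs the texture coordinates when the image is drawn.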
I have googled this but am unable to find an answer.
What I have to do is record a sound using the iPhone and then replay it using different accents (British, UK, etc.).
Is there any API available, or any framework provided by Apple, to do this?
Thanks in advance for any help.
This is a research topic in speech processing for which you might have to do a lot of reading of academic papers.
There is no API for this, other than for generic audio IO and DSP functions.
Is there any API available, or any framework provided by Apple, to do this?
I'm struggling to think of why Apple would provide an API for this, or why anyone else would for that matter! There aren't exactly many use cases.
Voice recognition systems have a hard enough time trying to understand heavily accented dialects, so what you're asking is computationally difficult, if not impossible.
I am developing an iPhone application (an audio-processing app) and have to apply some effects to the audio.
If it were a desktop app, there would be many options: we could find good examples and full projects like Audacity. But I want to develop for the iPhone.
I found an app with a reverb option (take a look at the following link). I have only watched the video; I did not test the application on my iPhone device.
http://www.appstorehq.com/reverb-iphone-89870/app
My question is: how can I develop an app with reverb functionality? Is there any documentation for that? If there is, please share it with us.
NOTE: We can apparently use an Audio Unit to build the reverb functionality (I am not clear on this).
EDIT: I don't want to use any third-party library.
If anybody has knowledge about this, please share it with us.
Thanks.
If you're targeting iOS 5, you can just use the audio unit subtype kAudioUnitSubType_Reverb2 of the effect audio unit.
Reverb unit:
// Describe the built-in reverb effect unit (iOS 5+). Zero-initialize
// so the flags fields are cleared.
AudioComponentDescription auEffectUnitDescription = {0};
auEffectUnitDescription.componentType = kAudioUnitType_Effect;
auEffectUnitDescription.componentSubType = kAudioUnitSubType_Reverb2;
auEffectUnitDescription.componentManufacturer = kAudioUnitManufacturer_Apple;

// Add a node for it to an existing AUGraph.
AUNode auEffectNode;
AUGraphAddNode(processingGraph, &auEffectUnitDescription, &auEffectNode);
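You then have to wire the node up before the graph will pass audio through it. A sketch of the usual next steps, where sourceNode and ioNode are assumed to be nodes already in your graph (they are not defined above):

// Get the AudioUnit behind the node so its parameters can be set.
AudioUnit auEffectUnit;
AUGraphNodeInfo(processingGraph, auEffectNode, NULL, &auEffectUnit);

// Route audio: source output 0 -> reverb input 0 -> output unit input 0.
AUGraphConnectNodeInput(processingGraph, sourceNode, 0, auEffectNode, 0);
AUGraphConnectNodeInput(processingGraph, auEffectNode, 0, ioNode, 0);

// Adjust the effect, e.g. the wet/dry mix (0-100).
AudioUnitSetParameter(auEffectUnit, kReverb2Param_DryWetMix,
                      kAudioUnitScope_Global, 0, 50.0f, 0);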
Failing that, you could write your own reverb code in the RemoteIO render callback. A simple delay might be easier to do and would sound similar.
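As a starting point, here is a minimal generic C sketch of such a delay (the state names are hypothetical; in a real render callback they would live in the struct you pass as inRefCon when registering the callback):

// Mix a delayed copy of the signal back in. With feedback this gives
// an echo that can pass for a crude room effect.
static void applySimpleDelay(float *samples, UInt32 frameCount,
                             float *delayBuffer, UInt32 delayLength,
                             UInt32 *writeIndex, float feedback, float mix)
{
    for (UInt32 i = 0; i < frameCount; i++) {
        float delayed = delayBuffer[*writeIndex];  // sample from delayLength frames ago
        delayBuffer[*writeIndex] = samples[i] + delayed * feedback;  // record + feedback
        samples[i] = samples[i] * (1.0f - mix) + delayed * mix;      // wet/dry mix
        *writeIndex = (*writeIndex + 1) % delayLength;
    }
}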
iOS 5.0 added reverb support to its native OpenAL implementation, so it is now much easier: you don't have to code the algorithm yourself. It supports a variety of reverb spaces:
Small Room
Medium Room
Large Room (2 configurations)
Medium Hall (3 configurations)
Large Hall (2 configurations)
Plate
Medium Chamber
Large Chamber
Cathedral
I suggest that you try the ObjectAL wrapper, which already has great support for the reverb effect:
https://github.com/kstenerud/ObjectAL-for-iPhone
Grab the source from this repository, load "ObjectAL.xcodeproj", and run the ObjectALDemo target on any iOS 5.0 device (it should also work on the simulator). This will give you a good starting point and a feel for what the reverb effect is capable of.
If you still don't want to use any third-party library, you can just grab the relevant pieces from ObjectAL. Look for the reverb-related code in the following source files (and their corresponding headers):
https://github.com/kstenerud/ObjectAL-for-iPhone/blob/master/ObjectAL/ObjectAL/OpenAL/ALListener.m
https://github.com/kstenerud/ObjectAL-for-iPhone/blob/master/ObjectAL/ObjectAL/OpenAL/ALSource.m
https://github.com/kstenerud/ObjectAL-for-iPhone/blob/master/ObjectAL/ObjectAL/OpenAL/ALWrapper.m
Good luck with your project!
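For orientation, usage looks roughly like the sketch below. The property names (reverbOn, globalReverbLevel, reverbRoomType, reverbSendLevel) are what the ALListener/ALSource sources linked above expose, but treat this as an unverified sketch and check it against the version of ObjectAL you actually grab:

// Rough sketch of enabling ObjectAL's reverb; verify the property
// names against the ObjectAL headers before relying on this.
ALListener *listener = [OpenALManager sharedInstance].currentContext.listener;
listener.reverbOn = YES;               // turn on the global reverb
listener.globalReverbLevel = 1.0f;     // overall wet level
listener.reverbRoomType = 5;           // index of a reverb space preset

ALSource *source = [ALSource source];  // a source that will play your sound
source.reverbSendLevel = 0.5f;         // how much this source feeds the reverb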
AUs are a good place to start.
Write your own reverb AU which contains a reverb implementation. There are tons of ways to implement a reverb. A medium/long convolution reverb is too much to ask of a phone, but something such as an FDN (feedback delay network) will not require a lot of memory or CPU.
Both are easy to implement if you're familiar with audio programming and optimization. The tough part is actually making one that sounds very good and performs well.
If you're unable to write optimal low-level code or you do not (presently) understand basic audio signal processing, then you'll have a few obstacles to overcome; it may be a long road in that case.
Searching the iOS documentation for "reverb" produces a link to the Core Audio Overview, which references reverb as an "effect unit." Perhaps that's worth further study?
No good; I attempted the audio unit approach, and even though it is in the documentation, it is not actually implemented yet by the Apple engineers. Each time you call the function to set the reverb property, you will only get a failure status code. You would have to implement your own reverb effect. Try reading some DSP books and you might find a clue.
You need to learn some DSP-level coding. The DSP cookbook is okay, and there are others out there. But basically you need to be comfortable with handling audio signals in the frequency domain and with things such as FFTs. Once you have that, implementing a reverb filter should be straightforward.
This is an answer I've given before, but I believe it is relevant here. I am going to agree with the others and say that you are going to have to become a bit more familiar with Core Audio if you want to do this properly.
I highly recommend this Core Audio book. It will teach you what you need to do this right and will save you a lot of frustration.
The chapter on audio effects has not been published yet, but if it is anything like the rest of the book it's worth the wait.
EDIT
You will most likely need to do this with an audio effect (which is a form of an audio unit).
Really enjoying the Stanford iPhone course videos; just wondering if there are others out there of equal or better quality?
I realize I am not answering the question directly, but what I found most helpful while I was getting into Cocoa development were:
ADC videos
WWDC videos - These were even better than the Stanford courses because of the depth and emphasis on going beyond the assignment requirements.
Jeff LaMarche's Beginning iPhone Development book (This came out later)
Aaron Hillegass's Cocoa book (most helpful and motivating programming book I have ever read).
Check this: http://itunes.apple.com/itunes-u/advanced-iphone-development/id407243032
It's more advanced than the Stanford course.
You may also check out the PeepCode screencasts.
I can highly recommend the WWDC videos. This year they are available for free.