Is there any GPGPU library for iPhone?
The original iPhone and iPhone 3G only support the OpenGL ES 1.1 fixed-function pipeline and do not provide a programmable pipeline (no shaders), so they cannot be used as general-purpose computation devices, at least at the OpenGL abstraction layer.
The iPhone 3GS and third-generation iPod touch support the OpenGL ES 2.0 programmable pipeline. However, there appear to be limitations, such as the lack of antialiased shaders.
Regardless of the support for programmable shaders on the 3GS, I don't think the GPU is powerful enough to be used as a general-purpose computational engine.
The iPhone GPU isn't built for general purpose computation, so hacking it for such features would be very limited and slow at best.
Pre-3GS devices only have fixed-function pipeline chips, which makes general-purpose computation on them impossible for almost all practical purposes.
I want to do some DSP effect processing and create effects like flanger, echo, etc.
Could this be done via OpenAL? Or should I use an entirely different framework/library?
Since iOS 5.0, some DSP effects have been natively supported by OpenAL.
For example, reverb is supported with emulation for more than 10 different spaces (Small/Medium/Large Room, Medium/Large Hall, Plate, Medium/Large Chamber, Cathedral and several variations).
You can find a good reference implementation in the ObjectAL wrapper. The repository is available at https://github.com/kstenerud/ObjectAL-for-iPhone
Grab the source from this repository, load "ObjectAL.xcodeproj" and run the ObjectALDemo target on any iOS 5.0 device (it should also work in the Simulator). This will give you a good starting point and a feel for what the reverb effect is capable of. I personally recommend taking advantage of the ObjectAL library instead of working with OpenAL directly.
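If you do want to call OpenAL directly, the reverb is exposed through Apple's ASA extension, which is what ObjectAL wraps under the hood. Below is a minimal sketch of that path, assuming an AL context is already current; the ALC_ASA_* constants come from Apple's OpenAL extension header, and the function-pointer typedefs are declared locally just for this sketch.

    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>
    #include <OpenAL/MacOSX_OALExtensions.h>   /* ALC_ASA_* property constants */

    /* Local typedefs for the ASA entry points looked up below. */
    typedef ALenum (*ASASetListenerProc)(ALuint property, ALvoid *data, ALuint dataSize);
    typedef ALenum (*ASASetSourceProc)(ALuint property, ALuint source, ALvoid *data, ALuint dataSize);

    static void enableReverb(ALuint sourceID)
    {
        ASASetListenerProc asaSetListener =
            (ASASetListenerProc)alcGetProcAddress(NULL, "alcASASetListener");
        ASASetSourceProc asaSetSource =
            (ASASetSourceProc)alcGetProcAddress(NULL, "alcASASetSource");
        if (asaSetListener == NULL || asaSetSource == NULL)
            return;   /* extension not available on this OS version */

        /* Turn the global reverb on and pick one of the emulated spaces. */
        ALuint on = 1;
        asaSetListener(ALC_ASA_REVERB_ON, &on, sizeof(on));
        ALint roomType = ALC_ASA_REVERB_ROOM_TYPE_LargeHall;
        asaSetListener(ALC_ASA_REVERB_ROOM_TYPE, &roomType, sizeof(roomType));

        /* Route some of this source's signal into the reverb bus (0.0 - 1.0). */
        ALfloat sendLevel = 0.4f;
        asaSetSource(ALC_ASA_REVERB_SEND_LEVEL, sourceID, &sendLevel, sizeof(sendLevel));
    }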
Good luck with your project!
Just write your own audio library. iOS devices don't have hardware acceleration for OpenAL. It isn't particularly difficult to do, and then you can also use Apple's Audio Units (some of which are hardware accelerated).
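If you go the Audio Units route, here is a hedged sketch of instantiating Apple's built-in iOS reverb effect unit (kAudioUnitSubType_Reverb2) and setting one of its parameters; graph wiring, render callbacks and error handling are omitted, and for an echo or flanger you would substitute a different effect unit or your own DSP code.

    #include <AudioToolbox/AudioToolbox.h>
    #include <AudioUnit/AudioUnit.h>

    AudioUnit createReverbUnit(void)
    {
        /* Describe the effect we want: Apple's iOS reverb. */
        AudioComponentDescription desc = {
            .componentType         = kAudioUnitType_Effect,
            .componentSubType      = kAudioUnitSubType_Reverb2,
            .componentManufacturer = kAudioUnitManufacturer_Apple,
            .componentFlags        = 0,
            .componentFlagsMask    = 0
        };

        AudioComponent component = AudioComponentFindNext(NULL, &desc);
        AudioUnit reverbUnit = NULL;
        AudioComponentInstanceNew(component, &reverbUnit);

        /* Example parameter: decay time (in seconds) at 0 Hz. */
        AudioUnitSetParameter(reverbUnit, kReverb2Param_DecayTimeAt0Hz,
                              kAudioUnitScope_Global, 0, 2.5f, 0);

        AudioUnitInitialize(reverbUnit);
        return reverbUnit;
    }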
I'm looking for a comparison of the iPhone 3G / 3GS graphics systems (OpenGL) to those on a PC / Mac.
Google didn't help. Perhaps somebody here can?
While this might be more of a hardware question, there's enough here that could influence the design of an OpenGL-based application that I'll bite.
Using my Molecules application as a template, I benchmarked the rendering throughput of that running on iPhone 3G, iPad, and a 2nd generation MacBook Air (Nvidia GeForce 9400M). For the MacBook, the numbers were generated from running the application in the Simulator with nothing else executing on the system:
iPhone 3G: 423,000 triangles / s
iPad: 1,830,000 triangles / s
MacBook Air: 2,150,000 triangles / s
You can grab the code for the application and try this yourself by enabling the RUN_OPENGL_BENCHMARKS define in SLSMoleculeGLViewController. This causes structures to be rendered for 100 frames; the total time is then measured and the rendering rate calculated from the complexity of the model being shown.
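For reference, the throughput arithmetic amounts to something like the sketch below; the function and variable names here are hypothetical, not the actual ones used in Molecules.

    #include <CoreFoundation/CoreFoundation.h>
    #include <OpenGLES/ES1/gl.h>

    #define BENCHMARK_FRAME_COUNT 100

    /* Draw the same structure for a fixed number of frames, time it with a
       wall clock, and derive triangles per second. */
    double benchmarkTriangleRate(unsigned trianglesInModel, void (*renderFrame)(void))
    {
        CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
        for (int frame = 0; frame < BENCHMARK_FRAME_COUNT; frame++)
            renderFrame();              /* renders the whole structure once */
        glFinish();                     /* make sure the GPU has actually finished */
        CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - start;
        return (double)trianglesInModel * BENCHMARK_FRAME_COUNT / elapsed;
    }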
Note that this is an OpenGL ES 1.1 application which is geometry-limited in its current state. A fill-rate-limited application might have completely different performance characteristics, as might one that uses OpenGL / OpenGL ES 2.0 shaders.
Aside from performance differences, the OpenGL command set differs between iOS and the Mac. OpenGL ES cuts out a lot of the cruft that's built up in OpenGL over the years (immediate mode, etc.). In general, OpenGL ES is a subset of OpenGL, so you can pretty much port something written for OpenGL ES to OpenGL without a lot of trouble. OpenGL on the desktop also uses a newer version of GLSL for its shaders (1.4, I believe), so some of the commands supported there will not work on iOS devices.
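As a rough illustration of that subset relationship: immediate mode (glBegin/glEnd) only exists on the desktop, while a vertex-array path like the sketch below should compile against both desktop OpenGL and OpenGL ES 1.1, with only the header you include differing.

    #include <OpenGLES/ES1/gl.h>   /* on the desktop, <OpenGL/gl.h> instead */

    /* Draws one triangle using client-side vertex arrays, which both
       OpenGL ES 1.1 and desktop OpenGL understand. */
    void drawTrianglePortably(void)
    {
        static const GLfloat triangle[] = {
             0.0f,  1.0f, 0.0f,
            -1.0f, -1.0f, 0.0f,
             1.0f, -1.0f, 0.0f
        };

        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, triangle);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
    }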
Apple has more about platform-specific differences in their Platform Notes section in the OpenGL ES Programming Guide for iOS.
I'm not sure if this question has been asked already, my stackoverflow-fu has failed me.
So I'm building an OpenGL-ES-based iPhone game and pretty much all of the examples I've found out in the wild are on OpenGL ES 1.x. Which is fine because at least I'm (re)learning a lot about OpenGL in general.
Now that newer devices support OpenGL ES 2.0, I'm wondering if anyone has ported their OpenGL ES 1.x app to 2.0 and, if so, whether there were any performance or efficiency gains. For instance, I can set up my lighting (in 1.x) with glLightf(blahblah) and I'm done with lighting... but apparently that function doesn't exist in 2.0, so I'm forced to write it myself? So, how can somebody with no experience "programming the pipeline" accomplish this? Is there a default lighting implementation in 2.0?
I'm probably speaking out of ignorance as I haven't really found any solid iPhone-specific OpenGL-ES 2.0 information.
Any help in this space will be greatly appreciated.
From what I've read, and from my limited time working with it, going to OpenGL ES 2.0 from 1.1 isn't so much a matter of performance as it is about capabilities. If you watch the Mastering OpenGL ES for iPhone videos (part of the iPhone Getting Started Videos available through the iPhone Developer Program site), Apple even states that if you can do what you need to under OpenGL ES 1.1, you don't need to step up to 2.0.
OpenGL ES 2.0's fully programmable pipeline can make simple actions much harder than doing the same thing in 1.1, because you need to write code for parts of the pipeline that were handled for you before. However, 2.0 makes practical many stunning effects that you just couldn't do in 1.1. For example, I recommend watching the WWDC 2010 session video 417 - OpenGL ES Shading and Advanced Rendering and the Graphics and Media State of the Union to see what's possible using OpenGL ES 2.0.
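As an example of what writing that pipeline code yourself looks like, here is a minimal GLSL ES vertex shader sketch (held in a C string) that approximates what the fixed-function glLight* calls gave you for free in 1.1: a single directional light with per-vertex diffuse shading. The attribute and uniform names are made up, and in ES 2.0 you have to bind and supply all of them yourself.

    /* One directional light, per-vertex diffuse shading. */
    static const char *kDiffuseVertexShader =
        "attribute vec4 a_position;                                  \n"
        "attribute vec3 a_normal;                                    \n"
        "uniform mat4 u_modelViewProjection;                         \n"
        "uniform mat3 u_normalMatrix;                                \n"
        "uniform vec3 u_lightDirection;  // normalized, in eye space \n"
        "uniform vec4 u_diffuseColor;                                \n"
        "varying vec4 v_color;                                       \n"
        "void main()                                                 \n"
        "{                                                           \n"
        "    vec3 n = normalize(u_normalMatrix * a_normal);          \n"
        "    float nDotL = max(dot(n, u_lightDirection), 0.0);       \n"
        "    v_color = u_diffuseColor * nDotL;                       \n"
        "    gl_Position = u_modelViewProjection * a_position;       \n"
        "}                                                           \n";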
To date, few applications have used OpenGL ES 2.0, given the limited subset of iPhone devices with compatible GPUs and the lack of documentation and examples. I think we'll see this start to change as the pre-3GS devices are phased out. In particular, the iPad has had OpenGL ES 2.0 support from launch, so if you are designing an application for it you can rely on these capabilities being there. More code examples and documentation are sure to appear in the near future.
I'm using a series of shaders to perform realtime image processing on the iPhone (3GS/4/iPad). The fps isn't what I'd like it to be.
Are there any tools that I can use to help me work out what the bottlenecks are?
I assume you already know that performance tests on the Simulator are worthless and that you're testing on real metal, so Instruments is always a good place to start - specifically in your case you'd be interested in the OpenGL ES and OpenGL ES Analyzer instruments.
Generally speaking for GLSL, there's a list of common GLSL mistakes at the OpenGL.org site. The O'Reilly labs "iPhone 3D Programming" book has some further hints, such as avoiding expensive operations in conditionals, and watching for texture lookups.
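As a small example of the advice about conditionals: branchy per-fragment code can often be rewritten with built-ins like step() and mix(). The GLSL snippets below are purely illustrative, with made-up variable names.

    /* Fragment-shader excerpt with a per-fragment branch ... */
    static const char *kBranchyVersion =
        "    if (intensity > 0.5) { color = brightColor; } else { color = darkColor; }\n";

    /* ... and an equivalent branchless rewrite using step() and mix(). */
    static const char *kBranchlessVersion =
        "    color = mix(darkColor, brightColor, step(0.5, intensity));\n";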
Also, it's going to depend on what kind of image processing you're doing; if you're trying to apply heavy Photoshop-esque filters that would give a quad-core desktop pause, it's going to be costly on a lowly phone.
The only currently available tool for per-line shader profiling is the PVRUniSCo editor, which will give you a cycle count for each line of code in your shader (though only on Windows, it seems).
In discussion with some colleagues, we were wondering whether OpenGL work developed for Android or iPhone is effectively interchangeable, given that both support the spec.
Or is the reality of sharing OpenGL between the two platforms more a case of quirks and tweaks, and not as easy as one might have hoped?
An OpenGL implementation normally consists of two parts:
1. The platform-specific part. This has functions usually related to creating and displaying surfaces.
2. The OpenGL API. This part is the same on all platforms for the specific version of OpenGL, which in the case of Android is OpenGL ES 1.0.
What this means is that the bulk of your OpenGL code should be easy to port.
In C, you might have glLoadIdentity();
In Java on Android, something like gl.glLoadIdentity();
So for the bulk of your code you can cut and paste, and then search and replace prefixes like 'gl.'
Now for the fun part: you really need to be careful what version you are coding against. OpenGL for the desktop has APIs which don't exist in OpenGL ES. There are also some OpenGL data types specific to each platform. In addition, you have 1.0 (e.g. Android), 1.1 (e.g. iPhone), and 2.0 (e.g. iPhone 3GS) to deal with. The differences in API often have to do with additional hardware capability, so it's not like you can write some easy wrapper code to emulate 2.0 features in 1.0/1.1.
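One way to cope with that split at runtime is to query the version and extension strings before relying on newer entry points; the sketch below checks for an arbitrary example extension and is written against the iOS headers (on the Android NDK you would include <GLES/gl.h> instead).

    #include <string.h>
    #include <stdbool.h>
    #include <OpenGLES/ES1/gl.h>

    /* Returns true if the current context advertises the named extension. */
    bool hasExtension(const char *name)
    {
        const char *version    = (const char *)glGetString(GL_VERSION);    /* e.g. "OpenGL ES-CM 1.1" */
        const char *extensions = (const char *)glGetString(GL_EXTENSIONS);
        (void)version;   /* log or branch on this if you target multiple API levels */
        return extensions != NULL && strstr(extensions, name) != NULL;
    }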
OpenGL ES on Android follows the Khronos Java GLES spec (JSR 239) and wraps GL calls in something like glinst.glBindBuffer(FloatBuffer.wrap(data) ...).
OpenGL ES on iPhone is done using the stock gl.h headers, and the same call will just look like glBindBuffer(data ...).
The code will not be directly interchangeable, and you will run into many quirks even before you get into the whole mess of differences between the 1.0, 1.1, and 2.0 APIs.
Both platforms use OpenGL ES, but Wikipedia claims that Android uses 1.0 while the iPhone uses 1.1 (original and 3G) and 2.0 (3GS). It's likely that at least some programs will use API functions not included in 1.0, so there won't be full compatibility between the two (well, three).