How can I fade a View with a CAEAGLLayer - iPhone

I'm currently looking for a way to change the alpha value of a view with a CAEAGLLayer. Setting self.alpha doesn't work, so I think there must be some concept here I don't understand. The app I'm building has an OpenGL layer over live footage from the iPhone/iPod camera, and I'd like to fade this in and out. I've also tried setting the CAEAGLLayer's opaque and opacity values, which also doesn't work. Thanks for any thoughts you may have.

I am trying to do something similar, i.e. have just the background of the CAEAGLLayer be transparent, so I can draw over a photo. I have managed to fade the whole thing in and out by setting the alpha value on the layer itself in the nib's layout editor, which might work for you, but it is not quite the effect I want. What I really want is just a transparent background, and there is discussion of how to achieve that here:
http://www.iphonedevsdk.com/forum/iphone-sdk-development/20081-merging-content-uiimageview-eaglview.html
Although I haven't managed to get what they describe working yet ... :-(

Related

Dragging/Resizing a UIImage on the Device

I'd like to allow a user to add a shape (which would just be a UIImage) onto some sort of canvas, then move and resize it on the screen but I'm not sure how to go about this. Ideally I'd like the basics of a drawing app which can use images from a user's device. Each shape would have an associated position, size and z-index.
The only thing I'm unsure of is how I'd create a bounding box (the one with four blue dots to allow resizing/moving). I have experience with UIKit, and would prefer to keep the majority of the app in this for the time being, but I get the feeling this type of thing might be better suited to Cocos2D or a similar framework.
If anyone has any pointers/open source code I can dig through it would be hugely appreciated.
I think you should look into CALayer, or even CAShapeLayer. I'm just starting to play with them, but I'm pretty sure you can easily get the functionality you want with either. Draw the border in the layer's drawLayer:inContext:. Check out the Quartz2d Guide path drawing section for the functions you need.
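For illustration only, a delegate drawing method for that selection box might look roughly like the sketch below; the inset, colour and handle size are arbitrary placeholders, not anything from the original question.

- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    // Stroke a bounding box just inside the layer's bounds.
    CGRect box = CGRectInset(layer.bounds, 4.0f, 4.0f);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextSetLineWidth(ctx, 1.0f);
    CGContextStrokeRect(ctx, box);

    // Draw the four corner "handles" as small filled circles.
    CGContextSetFillColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGFloat r = 4.0f;
    CGPoint corners[4] = {
        CGPointMake(CGRectGetMinX(box), CGRectGetMinY(box)),
        CGPointMake(CGRectGetMaxX(box), CGRectGetMinY(box)),
        CGPointMake(CGRectGetMinX(box), CGRectGetMaxY(box)),
        CGPointMake(CGRectGetMaxX(box), CGRectGetMaxY(box))
    };
    for (int i = 0; i < 4; i++) {
        CGContextFillEllipseInRect(ctx, CGRectMake(corners[i].x - r, corners[i].y - r, 2.0f * r, 2.0f * r));
    }
}

You would set that object as the layer's delegate and call setNeedsDisplay on the layer; hit-testing the handles to do the actual dragging/resizing is separate work.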

3D model over iPhone camera images. Transparent background?

I'm trying to make an augmented reality app for iPhone where you can see, over the camera images, 3D models built using OpenGL ES. I've already made the part where I get the images from the camera and a GLView that inherits from UIView where I draw the model. My problem is that when I put the GLView over the camera images, you don't see just the model; there is a black rectangle covering the whole view. I've tried colouring the background so it becomes transparent, but I haven't managed it. Do you know what the problem could be?
Thanks a lot!!
I'll answer myself. Apart from using glClearColor when you draw the OpenGL view, you need to keep two things in mind:
When you create your CAEAGLLayer object, you have to set its opaque property to NO.
When you assign the drawableProperties, you have to choose a color format that supports transparency, for example kEAGLColorFormatRGBA8.
Hope this will be useful to someone else! Thanks Benjamin Andris for your answer anyway! :)
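To make that concrete, the setup would look something like the sketch below; the method names are placeholders, and the essential parts are the opaque flag, the RGBA8 colour format and clearing with zero alpha.

+ (Class)layerClass
{
    // The view is backed by a CAEAGLLayer.
    return [CAEAGLLayer class];
}

- (void)setupLayer   // hypothetical setup method called from init
{
    CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

    // 1. The layer must not be opaque, otherwise it composites against black.
    eaglLayer.opaque = NO;

    // 2. Ask for a colour format with an alpha channel.
    eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
        kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
        nil];
}

- (void)drawView   // hypothetical render method
{
    // 3. Clear to fully transparent so whatever is behind the view shows through.
    glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    // ... draw the model and present the renderbuffer as usual ...
}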
Have you set the 'opaque' property of your GLView to NO?

iPhone: Fade In on Application Load

I have my app up and running and would like to add some polish to it. One of the first things I'd like to do is improve the transitions.
Unfortunately I have spent most of my time in OpenGL and still haven't got a solid grasp on working with the UIView system. What is a good way to transition into your app?
I load pretty quickly, so I was thinking of a quick fade in, but my GL view loads and draws at least a frame before I really get control, so I'm not sure of the best way to go about this.
A quick and dirty way would be to create a black (or white) solid-color, full-screen UIView overlaying the OpenGL view, and have it fade its alpha down to zero over some number of seconds.
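A rough sketch of that idea, assuming a view controller whose view already contains the OpenGL view; the method name, tag and timing are placeholders:

- (void)fadeInFromBlack   // e.g. call from viewDidAppear:
{
    // Cover everything with a black view...
    UIView *cover = [[UIView alloc] initWithFrame:self.view.bounds];
    cover.backgroundColor = [UIColor blackColor];
    cover.userInteractionEnabled = NO;
    cover.tag = 999;   // so it can be found and removed later
    [self.view addSubview:cover];
    [cover release];   // pre-ARC; the superview retains it

    // ...then fade it out over one second.
    [UIView beginAnimations:@"fadeIn" context:NULL];
    [UIView setAnimationDuration:1.0];
    [UIView setAnimationDelegate:self];
    [UIView setAnimationDidStopSelector:@selector(coverFadeDidStop:finished:context:)];
    cover.alpha = 0.0f;
    [UIView commitAnimations];
}

- (void)coverFadeDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context
{
    // Remove the cover once the fade has finished.
    [[self.view viewWithTag:999] removeFromSuperview];
}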
UIView has an alpha property (setAlpha:). In your draw call, gradually decrease the alpha value until it reaches 0. When it hits 0, draw the next "view" of your app (which pretty much loads your next 3D objects), and increase the alpha back to 1. That should do the trick.

Tinting iPhone application screen red

I'm trying to place a red tint on all the screens of my iPhone application. I've experimented on a bitmap and found I get the effect I want by compositing a dark red color onto the screen image using Multiply (kCGBlendModeMultiply).
So the question is how to efficiently do this in real time on the iPhone?
One dumb way might be to grab a bitmap of the current screen, composite into the bitmap and then write the composited bitmap back to the screen. This seems like it would almost certainly be too slow. In addition, I need some way of knowing when part of the screen has been redrawn so I can update the tinting.
I can almost get the effect I want by putting a red, translucent, fullscreen UIView above everything. That tints everything red without further intervention on my part, but the effect is much "muddier" than the result from the composite.
So do any wizards out there know of some mechanism I can use to automatically composite the red over the app in similar fashion to what the translucent red UIView does?
I managed to somewhat make this work but with some side-effects:
I set up a UIView on top of all my app's views (attached to the window), with userInteractionEnabled set to NO and opaque set to YES.
This UIView has a custom drawRect: method which first fills the complete area with red and then, after taking a "screenshot" of my window's view hierarchy, renders that image on top with
CGContextSetBlendMode( c, kCGBlendModeMultiply);
(a rough sketch of this follows below).
To keep this UIView in sync with the current state of the app's UIViews, I constantly produce "screenshots" and render them as fast as possible.
I set up an NSTimer which does this snapshotting/rendering at a defined frequency and which is added to the NSRunLoop for "tracking" as well.
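In code, the overlay's drawing and snapshotting boil down to something like this; the snapshot property and the way the overlay hides itself during capture are my assumptions about the setup described above, not the author's exact code.

// Overlay view pinned above the window. 'snapshot' is an assumed UIImage
// property holding the latest capture of the view hierarchy underneath.
- (void)drawRect:(CGRect)rect
{
    CGContextRef c = UIGraphicsGetCurrentContext();

    // Fill everything with the tint colour...
    CGContextSetFillColorWithColor(c, [UIColor redColor].CGColor);
    CGContextFillRect(c, self.bounds);

    // ...then multiply the latest screenshot on top of it.
    CGContextSetBlendMode(c, kCGBlendModeMultiply);
    [self.snapshot drawInRect:self.bounds];
}

- (void)captureSnapshot   // fired by the NSTimer
{
    UIWindow *window = [[UIApplication sharedApplication] keyWindow];

    // Hide the overlay so it is not part of its own screenshot.
    self.hidden = YES;
    UIGraphicsBeginImageContext(window.bounds.size);
    [window.layer renderInContext:UIGraphicsGetCurrentContext()];
    self.snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.hidden = NO;

    [self setNeedsDisplay];
}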
RESULT: a really laggy response from the UI with several fancy effects, but still usable if you do not set the frequency of snapshotting/rendering too high.
See screenshot here...
The result looks okay, but the usability really suffers a lot. I had a look at the OpenGL examples before trying this approach, but OpenGL is a whole lot of very different (mostly C) code which sits very close to the hardware and can give you a real headache.
So the described approach is what I will shoot for with my next app. I hope Apple accepts it even though it degrades the user experience during night-vision mode. They should simply make CALayer support filters; then my problem would be solved a whole lot better and perform nicely.
You could try this: subclass UIView. Add code to the -drawRect: method to draw the overlay. Make your UIView subclass pose as UIView everywhere in your app with
class_poseAs([CustomUIView class], [UIView class]);

How do I use CALayer with the iPhone?

Currently, I have a UIView subclass that "stamps" a single 2px by 2px CGLayerRef across the screen, up to 160 x 240 times.
I currently animate this by moving the UIView "up" the screen 2 pixels (actually, a UIImageView) and then drawing the next "row".
Would using multiple CALayer layers speed up performance of rendering this animation?
Are there tutorials, sample applications or code snippets for use of CALayer with the iPhone SDK?
The reason I ask is that most of the code snippets I find that demonstrate simple examples of CALayer employ method calls that do not work with the iPhone SDK. I appreciate any advice or pointers.
Okay, well, if you want something with good examples of CA code that draws things like that and works on the phone, I recommend the GeekGameBoard code that Jens Alfke published (it is an improved version of some Apple demo code).
Based on what you are describing I think you are doing something way more complicated than it needs to be. My impression is you want basically a static view that you are animating by shifting its position so that it is partially off screen. If you just need to set some static content in your drawRect:, going through layers is not going to be faster than just calling CGContextFillRect() with your color. After that you could just use implicit animations and the animator proxy on UIView to move the view. I suspect you could even get rid of the custom drawRect: implementation with a patterned UIColor, but I honestly have not benchmarked the difference between the two.
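On the iPhone that kind of move is usually written with the UIView animation calls; a rough sketch, where contentView is a placeholder for whatever view is being shifted:

// Shift the content view up by one 2 px row, animated.
[UIView beginAnimations:@"scrollRow" context:NULL];
[UIView setAnimationDuration:0.05];
contentView.frame = CGRectOffset(contentView.frame, 0.0f, -2.0f);
[UIView commitAnimations];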
What CALayer methods are you seeing that don't work on iPhone? Aside from animation features tied to CoreImage I have not noticed much that is missing. The big things you are likely to notice are that all views are layer-backed (so you do not need to do anything special to use layers; you can just grab a UIView's layer through the layer accessor method), and that the coordinate system has a top-left origin.
In any event, generally having more things is slower than having fewer things. If you are just repeating the same pattern over and over again you are likely to find the best performance is implementing a custom UIView/CALayer/UIColor that knows how to draw what you want, rather than placing visually identical layers or views next to each other.
Having said that, generally layers are lighter weight than views, so if you have a lot of separate elements that you need to keep logically separated you will find that moving to layers can be a win over using views.
You might want to look at -[UIColor initWithPatternImage:] depending on exactly what you are trying to do. If you are using this two pixel pattern as a background color you could just make a UIColor that draws it and set the background.
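For example (the image name and view are placeholders):

// Tile a small image as the background instead of stamping it by hand.
UIImage *tile = [UIImage imageNamed:@"pattern.png"];
myView.backgroundColor = [[[UIColor alloc] initWithPatternImage:tile] autorelease];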
What CALayer methods are you seeing that don't work on iPhone?
As one example, I tried implementing the grid demo here, without much luck. It looks like CAConstraintLayoutManager and CAConstraint are not available in QuartzCore.h.
In another attempt, I tried a very simple, small 20x20 CALayer object as a sublayer of my UIView's layer property, but that didn't show up.
Right now, I have a custom UIView of which I override the drawRect method. In drawRect I grab a context and render two types of CGLayerRefs:
At "off" cells I draw the background color across the entire 320x480 canvas.
At "on" cells, I either draw a single CGLayerRef across a grid of 320x480 pixels (initialization) or across a 320x2 row (animation).
During animation, I make a UIImageView clip view from 320x478 pixels, and draw a single row. This "pushes" my bitmap up the screen two pixels at a time.
Basically, I'd like to test whether or not using CALayer will accomplish two things:
Make my rendering faster, if CALayer has less overhead than what I'm doing now
Make my animation smoother, by letting me transition a layer up the screen smoothly
Unfortunately, I can't seem to get a basic CALayer working at the moment, and haven't found a good chunk of sample code to look at and play with.
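For what it's worth, the minimal sublayer case that should show up looks roughly like this (placed in viewDidLoad, for example, with QuartzCore linked); if nothing appears, the usual culprits are a missing frame or an unset backgroundColor:

#import <QuartzCore/QuartzCore.h>

// A 20x20 red square layer at (50, 50) on top of the view.
CALayer *square = [CALayer layer];
square.frame = CGRectMake(50.0f, 50.0f, 20.0f, 20.0f);
square.backgroundColor = [UIColor redColor].CGColor;
[self.view.layer addSublayer:square];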