Looking for clues about orienting an OpenGL ES app in landscape, most of the information I found dates back to 2008 and refers to early versions of the SDK. Apparently, back then, the recommendation for GL was not to rotate the view, but instead to apply the rotation as a GL transformation. Is that still the case with the current SDKs? It would be so much simpler to just rotate the window: all the touch events would stay in sync with the rotation.
In other words: how to set up an OpenGL view in landscape mode?
(I'll answer my own question with the solution I found. I'll be happy to consider other answers though.)
In the CAEAGLLayer docs, Apple states (a bit clumsily) that you should make the rotation within GL itself: "When drawing landscape content on a portrait display, you should rotate the content yourself rather than using the CAEAGLLayer transform to rotate it." They don't explain why, but I've read in multiple places about a noticeable drop in performance.
Luckily I solved it with the addition of just a few lines. This is for landscape-right orientation, where the home button is on the right.
glPushMatrix();
glRotatef(90, 0.0, 0.0, 1.0);
glTranslatef(0.0f, -320.0f, 0.0f );
// *** ALL RENDERING GOES HERE ***
glPopMatrix();
If you're targeting the iPad, replace -320 by -768.
I also convert the coordinates of the incoming UITouches (touchLocation is the touch's location in the view, in portrait coordinates):
int touchx = touchLocation.y;
int touchy = viewWidth - touchLocation.x;
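For completeness, here is a minimal sketch of where that conversion could live, assuming viewWidth is an ivar holding the portrait width (320 on the iPhone) and that the landscape-right transform above is in effect; the names are illustrative:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchLocation = [touch locationInView:self]; // portrait (view) coordinates

    // Map the portrait touch into the rotated, landscape-right GL space.
    int touchx = touchLocation.y;
    int touchy = viewWidth - touchLocation.x;

    // ... hit-test or paint using touchx / touchy ...
}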
This may have changed since. If you look at the latest OpenGL ES Programming Guide you can find the following sentence:
"In iOS 4.2 and later, the performance of Core Animation rotations of renderbuffers have been significantly improved, and are now the preferred way to rotate content between landscape and portrait mode. For best performance, ensure the renderbuffer’s height and width are each a multiple of 32 pixels."
Like this:
[eaglLayer setAffineTransform:CGAffineTransformMakeRotation( -90 * M_PI / 180)];
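For illustration, a rough sketch of how the layer might be set up for that approach, assuming a portrait-sized window and a 480x320 landscape renderbuffer (the exact sizes depend on your view setup and are only an example; note that 480 and 320 are both multiples of 32, as the guide recommends):

// Size the layer for landscape content; the renderbuffer created from it
// via -renderbufferStorage:fromDrawable: will then be landscape-sized too.
eaglLayer.bounds = CGRectMake(0.0f, 0.0f, 480.0f, 320.0f);
eaglLayer.position = CGPointMake(160.0f, 240.0f); // center of the portrait screen
// Rotate the layer so the landscape content fills the portrait display.
[eaglLayer setAffineTransform:CGAffineTransformMakeRotation(-90 * M_PI / 180)];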
However, the CAEAGLLayer Class Reference still says "When drawing landscape content on a portrait display, you should rotate the content yourself rather than using the CAEAGLLayer transform to rotate it." Maybe the documentation just hasn't been updated yet; its last update is dated 2008-05-19.
Just an FYI for anyone who finds this Q&A like I did.
You need to set up your camera's matrix with the up vector on the X axis, as opposed to the Y axis, which is what you would normally do. Rotating the world by 90 degrees works, but makes working with everything else difficult.
Use something like:
up.x = 1; // instead of up.y
up.y = 0;
up.z = 0;
gl.glMatrixMode(GL10.GL_MODELVIEW);
gl.glLoadIdentity();
GLU.gluLookAt(gl, position.x, position.y, position.z, lookAt.x, lookAt.y, lookAt.z, up.x, up.y, up.z);
Related
I've recently had some issues implementing a zooming feature into a painting application. Please let me start off by giving you some background information.
First, I started off by modifying Apple's glPaint demo app. I think it's a great source, since it shows you how to set up the EAGLView, etc...
Now, what I wanted to do next, was to implement zooming functionality. After doing some research, I tried two different approaches.
1) use glOrthof
2) change the frame size of my EAGLView.
While both ways allow me to zoom in and out perfectly, I run into different problems when it actually comes to painting while zoomed in.
When I use (1), I have to render the view like this:
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrthof(left, right, bottom, top, -1.0f, 1.0f); //those values have been previously calculated
glDisable(GL_BLEND);
//I'm using Apple's Texture2D class here to render an image
[_textures[kTexture_MyImage] drawInRect:[self bounds]];
glEnable(GL_BLEND);
[self swapBuffers];
Now, let's assume I zoom in a little, THEN I paint, and after that I want to zoom out again. For this to work, I need to make sure that "kTexture_MyImage" always contains the latest changes. To do that, I need to capture the screen contents after changes have been made and merge them with the original image. The problem here is that when I zoom in, my screen only shows part of the image (enlarged), and I haven't found a proper way to deal with this yet.
I tried to calculate which part of the screen was enlarged, then do the capturing. After that I'd resize this part to its original size and use yet another method to paste it into the original image at the correct position.
Now, I could go into more detail on how I achieved this, but it's really complicated and I figured there has to be an easier way. There are already several apps out there that do exactly what I'm trying to achieve, so it must be possible.
As far as approach (2) goes, I can avoid most of the above, since I only change the size of my EAGLView window. However, when painting, the strokes are way off their expected position. I probably need to take the zoom level into account when painting and recalculate the CGPoints in a different way.
If you have done similar things in the past or can give me a hint on how I could implement zooming in my painting app, I'd really appreciate it.
Thanks in advance.
Yes, it is definitely possible.
When it comes to paint programs, you should be keeping a linked list or tree of objects to draw, for easy insertion and removal. When the user stops painting (i.e. touchesEnded), you add objects to the data structure containing your scene.
When your user zooms you need to modulate the coordinates of the objects you are drawing with respect to the current viewport, projection, and modelview transforms. In your case, you're not changing the viewport or the modelview transforms so you need only account for the projection transform. You could also implement your zoom using a translation and scale on the modelview matrix but I'll ignore that case for simplicity because it involves inverting the transforms.
The good news is that you are using an orthographic projection so world coordinates correspond to window coordinates when no zooming is in effect. The "world" in your case is a simple canvas that probably corresponds to the size of the device in window coordinates.
Before you add an object to your scene data structure, convert all of the coordinates, using the current projection transform (i.e. the parameters to the glOrthof() call) to world coordinates (i.e. full canvas coordinates). You'll only remain sane if you keep all things in your model in the same coordinate space.
To convert the coordinates (assuming you can never zoom out past the full device dimensions in your glOrthof() call), scale them by the ratio of your zoomed ortho dimensions to your unzoomed ortho dimensions, then bias them by the difference between your zoomed ortho bottom/left values and those of the original unzoomed ortho values.
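To make that concrete, here is a hedged sketch of the conversion, assuming the unzoomed ortho matches the view size (viewWidth x viewHeight, bottom-left origin) and ignoring the UIKit/GL y-flip, which you may need to handle separately; all names are illustrative:

// Convert a point in view coordinates to canvas (world) coordinates,
// given the glOrthof() parameters currently in effect.
CGPoint viewToCanvas(CGPoint p, float left, float right, float bottom, float top,
                     float viewWidth, float viewHeight)
{
    CGPoint canvas;
    // Scale by the ratio of the zoomed ortho size to the unzoomed size,
    // then bias by the zoomed left/bottom offsets.
    canvas.x = left + p.x * (right - left) / viewWidth;
    canvas.y = bottom + p.y * (top - bottom) / viewHeight;
    return canvas;
}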
I’m trying to support both landscape and portrait orientations in my iPhone Cocos2D game, but I’m having trouble getting the coordinates to translate properly.
Here’s what I’m doing so far.
I have a GameWorld layer that I always keep in portrait, regardless of the device orientation. The following code is in my DeviceRotated event for UIDeviceOrientationLandscapeLeft. (‘self’ is my GameWorld layer)
[self runAction:[CCMoveTo actionWithDuration: 0.25f position:ccp(80, 0)]];
[self runAction:[CCRotateTo actionWithDuration:0.25f angle:90]];
So that I don’t have to write different code for each orientation I was hoping to use the following in my Sprite class to translate Sprite coordinates.
CGPoint spriteLoc = ccp(0,0);
CGPoint translatedSpriteLoc = [self.parent convertToNodeSpace:spriteLoc];
self.position = translatedSpriteLoc;
However, this doesn’t work.
If the device is in portrait mode with the sprite in the lower left corner and I rotate the device to the left, the sprite appears in the lower right. I want the sprite to be in the lower left in landscape just like it is in portrait.
Am I missing something or is there a better way to translate coordinates?
Well, if you don't mind a "jump cut" when you switch orientations, you can just use the built-in orientation support within Cocos2d. See this post at the Cocos2d forums.
If, however, you need pretty orientation, you may have to do something along the lines of what you were showing above, orienting things manually via rotation using actions.
Without more detail, it's hard to say why your approach doesn't work, but my guess is that you are seeing the sprite positioning you describe as a result of the fact that if you don't change orientation, the lower left in portrait IS the lower right in landscape when rotated left, i.e., it's the same point in GL space: (0,0). You're going to have to move the "origin point" of your GameWorld Layer as well as rotating it.
Try adding a full-screen image to your layer to see what's actually happening when rotating it. That should help you narrow down what you need to do.
An iPhone SDK question: I'm drawing a UIImageView on the screen. I've rotated it in 3D and provided a bit of perspective, so the image looks like it's pointing into the screen at an angle. That all works fine. Now the problem is the edges of the resulting picture don't seem to be antialiased at all. Anybody know how to make it so?
Essentially, I'm implementing my own version of CoverFlow (yeah yeah, design patent blah blah) using quartz 3d transformations to do everything. It works fine, except that each cover isn't antialiased, and Apples version is.
I've tried messing around with the edgeAntialiasingMask of the CALayer, but that didn't help - the default is that every edge should be antialiased...
thanks!
If you rotate only one image, then one trick will resolve the problem. Try setting
layer.shadowOpacity = 0.01;
After that, the picture will look smoother after the 3D rotation.
The Gloomcore answer gives a really neat result.
However, this sometimes makes things really laggy!
Adding rasterization helps a little bit:
.layer.shadowOpacity = 0.01;
.layer.shouldRasterize = YES;
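A small addition of my own (not from the original answer): rasterization happens at the layer's rasterizationScale, which defaults to 1.0, so on Retina screens you may also want:
.layer.rasterizationScale = [UIScreen mainScreen].scale;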
I know the question/answer is old, but hey, I just found it.
You could try adding some transparent pixels around the edge of the image, either by putting the UIImageView in a slightly larger empty view that you apply rotation to, or by changing the source images.
I had a similar issue that was solved by only setting shouldRasterize = YES; however, because I was re-using my views (and layers), shouldRasterize = YES killed the performance.
Fortunately I found a solution by turning shouldRasterize = NO at the right time to restore performance in my app's case.
I posted a solution here: Antialiasing edges of UIView after transformation using CALayer's transform
You can try this
Method: Using layer.shouldRasterize
Create a superlayer/superview which is 1 pixel bigger in all 4 directions
Do the transform on the superlayer/superview
Enable layer.shouldRasterize on the original layer/view
Method: Drawing to a UIImage
Draw your content to a UIImage
Make sure that you have a transparent border of 1 pixel around the content
Display the image (a sketch of this approach follows below)
Reference: http://darknoon.com/2012/05/18/the-transparent-border-trick/
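For what it's worth, here is a minimal sketch of the second method, assuming an existing contentImage (the name is hypothetical) whose edges need the 1-pixel transparent border:

// Render the content into a UIImage that is 2 pixels larger,
// leaving a 1-pixel transparent border around it.
CGSize paddedSize = CGSizeMake(contentImage.size.width + 2, contentImage.size.height + 2);
UIGraphicsBeginImageContextWithOptions(paddedSize, NO, 0.0); // NO = transparent background
[contentImage drawInRect:CGRectMake(1, 1, contentImage.size.width, contentImage.size.height)];
UIImage *paddedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Display paddedImage in the view you apply the transform to.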
On an iPhone 3Gs, if you click the little "show my location" symbol on the lower left of the window twice, it switches to a mode that causes the map to rotate so that north on the map faces towards north according to the compass. I don't have a 3Gs, so I just found out about this from a buddy who does have one.
I tried applying a rotation transformation to a MKMapView's layer, like this:
CATransform3D rotationTransform = CATransform3DIdentity;
rotationTransform = CATransform3DRotate(rotationTransform, degreesToRadians(-20), 0.0, 0.0, 1.0);
theMapView.layer.transform = rotationTransform;
That sort of works, but not really. The contents of the map do rotate, but the frame rotates and stretches. The map view ends up in a strip that stretches diagonally across the screen, and it ends up under the buttons in my view.
I tried enclosing the map in another view to isolate it, but that doesn't work either. Next I'll try rotating the enclosing view, but I'm hoping somebody else has figured this out. Getting it to work by trial and error is likely to be difficult at best.
Regards,
Duncan C
I have the same problem. I was able to solve the stretching by placing the MKMapView in a UIView container.
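A rough sketch of that approach (sizes and names are illustrative; degreesToRadians is the same helper used in the question). The container is made larger than the screen so its corners stay off-screen for small rotations, and the transform is applied to the container rather than to the map view itself:

UIView *mapContainer = [[UIView alloc] initWithFrame:CGRectMake(-80, -80, 480, 640)];
MKMapView *theMapView = [[MKMapView alloc] initWithFrame:mapContainer.bounds];
theMapView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[mapContainer addSubview:theMapView];
[self.view addSubview:mapContainer];

// Rotate the container; the map view inside keeps its own frame.
mapContainer.layer.transform = CATransform3DMakeRotation(degreesToRadians(-20), 0.0, 0.0, 1.0);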
I would like to achieve FSAA on my OpenGL ES app on the iPhone.
Currently I do this by rendering the scene to a texture that is twice the width and height of the screen. I then use the nice function:
void glDrawTexiOES(GLint x, GLint y, GLint z, GLint width, GLint height);
to draw the image resized to the screen resolution.
Is there a better way to do this?
Update (bounty added): I was wondering, given that it's now Jan 2010, whether there is a better way to do this on v3.1 / 3GS phones, etc.
For Gran Turismo on the PSP, the developers achieved an effect similar to anti-aliasing by moving the image back and forth one pixel per frame (demonstration can be found here: http://www.gtplanet.net/why-gran-turismo-psp-looks-so-good/) so if the iPhone doesn't support what you're looking for that's an option.
As of iOS 4.0, full-screen anti-aliasing is directly supported via an Apple extension to OpenGL. The basic concept is similar to what you are already doing: render the scene onto a larger framebuffer, then copy that down to a screen-sized framebuffer, then copy that buffer to the screen. The difference is, instead of creating a texture and rendering it onto a quad, the copy/sample operation is performed by a single function call (specifically, glResolveMultisampleFramebufferAPPLE()).
For details on how to set up the buffers and modify your drawing code, you can read a tutorial on the Gando Games blog which is written for OpenGL ES 1.1; there is also a note on Apple's Developer Forums explaining the same thing.
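For reference, a condensed sketch of the ES 1.1 setup along the lines of those tutorials, assuming the usual viewFramebuffer / viewRenderbuffer / context from Apple's EAGLView template; width and height are the renderbuffer dimensions, the extra buffer names are illustrative, and error checking and the depth attachment are omitted:

// Create a multisampled framebuffer alongside the normal one (4 samples is typical).
GLuint msaaFramebuffer, msaaColorbuffer;
glGenFramebuffersOES(1, &msaaFramebuffer);
glBindFramebufferOES(GL_FRAMEBUFFER_OES, msaaFramebuffer);
glGenRenderbuffersOES(1, &msaaColorbuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, msaaColorbuffer);
glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, width, height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, msaaColorbuffer);

// Each frame: draw the scene into msaaFramebuffer, then resolve it into the screen framebuffer.
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
glResolveMultisampleFramebufferAPPLE();

// Present the resolved color renderbuffer as usual.
glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
[context presentRenderbuffer:GL_RENDERBUFFER_OES];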
Thanks to Bersaelor for pointing this out in another SO question.
Technically iPhone's GPU (PowerVR MBX Lite) should support anti-aliasing. However, it seems that current Apple's OpenGL ES drivers (as of Jan 2009) don't expose this capability. So doing "manual AA" just like you do is pretty much the only way.
http://iphonedevelopment.blogspot.com/2009/06/opengl-es-2-shaders.html?showComment=1245770504079#c6001476340022842908
iPhone OS 3.0 + OpenGL ES 2.0. Is anyone seeing better anti-aliasing?
Sounds like you can't do it in hardware since you don't render to the framebuffer, only a texture that the iPhone composites for you.