ImageView Send to Back Programmatically - iPhone

After trying all the methods in the previous post, I have not succeeded in sending my ImageView behind my oscilloscope image. I am using Apple's aurioTouch sample in my app, and I have only resized the image in it. Now I want to have another image in the same view. At the beginning of applicationDidFinishLaunching I am writing the code below:
UIImageView *imgView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"myImage.png"]];
[imgView setFrame:CGRectMake(0,25,320,150)];
[view addSubview:imgView];
After this code I tried all the methods given in the previous post, but I am not able to get this new image behind the original image.
Here view is an instance of a class inherited from UIView. How can I send the new image behind my original image?

One look at the documentation for UIView would give you a list of possible methods to influence the view hierarchy. Some of them are:
– bringSubviewToFront:
– sendSubviewToBack:
– removeFromSuperview
– insertSubview:atIndex:
– insertSubview:aboveSubview:
– insertSubview:belowSubview:
– exchangeSubviewAtIndex:withSubviewAtIndex:

You need to call
- (void)sendSubviewToBack:(UIView *)view
on the view object in your code, like this:
[view sendSubviewToBack:imgView];
EDIT:
I have seen the code, and from the looks of it the image "Oscilloscope.png" is added to the EAGLView object with the variable name view. If I am correct, you are using the same variable to add your imgView too, so I think it should work.
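For completeness, a minimal sketch of the ordering, assuming both image views end up in the same parent view (the oscilloscopeImageView name is purely illustrative):
UIImageView *imgView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"myImage.png"]];
[imgView setFrame:CGRectMake(0, 25, 320, 150)];
[view addSubview:imgView];
// Push the new image view to the bottom of view's subview stack...
[view sendSubviewToBack:imgView];
// ...or place it directly below the oscilloscope image view instead:
// [view insertSubview:imgView belowSubview:oscilloscopeImageView];
[imgView release];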

Related

Issues setting up a background image with UIImageView

On my iPhone app, I simply want to set a particular background image, which depends on whether it's an iPhone 5 or not.
So, I tried two approaches:
A) Using
self.view.backgroundColor = [UIColor colorWithPatternImage:backGroundimage];
B) Creating a UIImageView and setting up the image there. Code:
UIImageView *backgroundImageView = [[UIImageView alloc]initWithFrame:screenBounds];
[backgroundImageView setImage:[UIImage imageNamed:backGroundImage]];
[self.view addSubview:backgroundImageView];
But I am having issues with both of them:
Issues with Step A:
When I set the image that way, I have to deal with image scaling issues for different screen sizes. I use the following code to do the scaling:
UIGraphicsBeginImageContext(screenBounds.size);
[[UIImage imageNamed:backGroundImage] drawInRect:screenBounds];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.view.backgroundColor = [UIColor colorWithPatternImage:image];
Another issue from Step A is that the image appears quite blurry. It doesn't have the same sharpness to it.
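For what it's worth, that blurriness usually comes from UIGraphicsBeginImageContext rendering at a scale of 1.0 on Retina displays; the scale-aware variant avoids it. A minimal sketch of the same scaling code using the device's screen scale:
UIGraphicsBeginImageContextWithOptions(screenBounds.size, NO, 0.0); // 0.0 means "use the screen's scale"
[[UIImage imageNamed:backGroundImage] drawInRect:screenBounds];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.view.backgroundColor = [UIColor colorWithPatternImage:image];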
Issues with Step B:
With this, the image looks really crisp and sharp - just the way it should look.
But when I switch to another view using the following code, strangely enough the UIImageView backgroundImageView still appears on the second one. The code I use to switch views is:
[self presentViewController:secondViewController animated:YES completion:nil];
I even tried [backgroundImageView removeFromSuperview], but that doesn't solve anything either.
So what am I doing wrong? And how can I set up a picture as my background that depends on the size of the iPhone?
Plan B is a good plan. Presenting another view controller should and will definitely hide your image view. If it isn't happening, then it's a problem with the creation of secondViewController, unrelated to the background image on the presenting VC.
Before presenting secondViewController, log it:
NSLog(@"presenting %@", secondViewController);
I'll bet a dollar that it's nil. If I'm right, let's have a look at how you initialize secondViewController. That's the problem, unrelated to the background image view.
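If it does turn out to be nil, the fix is in how secondViewController gets created before the present call. A minimal sketch, assuming the controller lives in a nib named SecondViewController.xib (the class and nib names are illustrative):
SecondViewController *secondViewController = [[SecondViewController alloc] initWithNibName:@"SecondViewController" bundle:nil];
[self presentViewController:secondViewController animated:YES completion:nil];
[secondViewController release]; // omit this line under ARC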
Okay, I finally fixed this issue, although the cause is still puzzling to me.
To fix this, I had to create an IBOutlet property for the UIImageView and hook it up in the XIB file.
The reason I was programmatically creating the UIImageView is that its size depends on which iPhone the user has. But with the IBOutlet (let's call it UIImageViewOutlet), I simply used [self.UIImageViewOutlet setFrame:] to get the size and location that I wanted.
I also discovered that one of the buttons I was programmatically creating was still visible in the secondViewController. I ended up creating an outlet in the XIB file for that one as well and used setFrame on it to position it properly.
If anyone knows the reason for this problem, I will be very grateful.
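A minimal sketch of that outlet-based approach, sizing the outlet in viewDidLoad (backGroundImage is the per-device image variable from the question; the exact placement is illustrative):
- (void)viewDidLoad {
    [super viewDidLoad];
    CGRect screenBounds = [[UIScreen mainScreen] bounds];
    [self.UIImageViewOutlet setFrame:screenBounds];                         // outlet hooked up in the XIB
    [self.UIImageViewOutlet setImage:[UIImage imageNamed:backGroundImage]]; // image chosen per device
}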

How to show XML-parsed images by using tap gestures on iPhone?

I am making an app in which I am parsing an XML file and storing the URLs of images in an array.
Now I have to show all of those images on the next view controller using tap gestures, and when I tap an image I have some action to perform. Can anyone please help me with this?
I can provide the code I have written if anyone wants it, or point me to a tutorial, as I have not been able to work it out from the developer sites.
Load your images into UIImage objects like so:
UIImage *imageFromUrl = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:url]]];
Then, place them into UIImageView objects wherever you need them. The next thing you should do is add a UITapGestureRecognizer:
UIImageView *imgView = [[UIImageView alloc] initWithImage:imageFromUrl];
imgView.userInteractionEnabled = YES; // UIImageView has user interaction disabled by default
UITapGestureRecognizer *tgr = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(action)];
[imgView addGestureRecognizer:tgr];
[tgr release];
//Do the rest of your operations here, don't forget to release the UIImageView
And that's it. Do whatever you need in the "action" method, which will get called on your view controller.
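Since the question is about showing all of the parsed images, here is a minimal sketch of how the pieces might fit together. The array name imageURLs and the handler name imageTapped: are illustrative assumptions, not from the original code:
- (void)showParsedImages:(NSArray *)imageURLs {
    for (NSUInteger i = 0; i < [imageURLs count]; i++) {
        NSURL *url = [NSURL URLWithString:[imageURLs objectAtIndex:i]];
        UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]]; // synchronous load, fine for a sketch
        UIImageView *imgView = [[UIImageView alloc] initWithImage:image];
        imgView.frame = CGRectMake(10, 10 + i * 160, 300, 150); // simple vertical layout
        imgView.tag = i;                                        // remember which image this is
        imgView.userInteractionEnabled = YES;                   // off by default on UIImageView
        UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imageTapped:)];
        [imgView addGestureRecognizer:tap];
        [tap release];
        [self.view addSubview:imgView];
        [imgView release];
    }
}

- (void)imageTapped:(UITapGestureRecognizer *)recognizer {
    NSLog(@"Tapped image with tag %d", (int)recognizer.view.tag); // perform your action for this image here
}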
For that you can use TapDetectingImageView.
In that view you get scroll, single-tap, and double-tap events, so you can scroll the image and tap on it.
For that, visit: https://bitbucket.org/billgarrison/panzoomimagedemo/src/34671df61417/Classes/TapDetectingImageView.m

Using Image to Perform an Action instead of Button in iPhone Application

How can I use an image to perform an action instead of using a button or adding an image to a button? I just want to tap the image and perform a particular set of instructions.
(Thanks in Advance)
Use a UIGestureRecognizer.
// If your image view is called myImage
UIImageView *myImage = ...;
// Image views ignore touches by default, so enable user interaction
myImage.userInteractionEnabled = YES;
// Add a tap gesture recognizer
UITapGestureRecognizer *g = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(imagePressed:)];
[myImage addGestureRecognizer:g];
[g release];
When your image is tapped, this method will get called:
- (void)imagePressed:(UIGestureRecognizer *)recogniser {
NSLog(@"%@", recogniser);
}
Why don't you want to use a UIButton? It inherits from UIControl and has a lot of code built in that you probably don't even know exists, and it can just contain an image, so it would look exactly the same.
Well, the advantage of using UIButtons is that they have all the touch events built right in. You can use UIImageViews, but you'll need to subclass them (or attach gesture recognizers), while in most situations a UIButton with a background image would fit just fine.
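For comparison, a minimal sketch of the button-based approach; the image name, frame, and selector are illustrative:
UIButton *imageButton = [UIButton buttonWithType:UIButtonTypeCustom];
imageButton.frame = CGRectMake(20, 20, 100, 100);
[imageButton setImage:[UIImage imageNamed:@"myImage.png"] forState:UIControlStateNormal];
[imageButton addTarget:self action:@selector(buttonPressed:) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:imageButton];

- (void)buttonPressed:(UIButton *)sender {
    // run your particular set of instructions here
}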

Image not appearing in UIImageView

I have a UIImageView in Interface Builder connected to an outlet called imageView. The statement
[imageView setImage: [UIImage imageNamed: @"xxxx.png"]];
displays an image successfully if placed in viewDidLoad, but NOT if placed in an action method in the same view controller. There's an NSLog in the action method so I know it's firing.
I've looked at the other "UIImageView not appearing" questions but none of them seem directly relevant. Any help would be much appreciated!
Thanks,
Mark
Try forcing a redraw:
[imageView setNeedsDisplay];

iPhone - user interaction with programmatically added UIImageView

So I'm trying to add UIImageViews programmatically (in this case I don't have the option of doing it in IB), and I want to be able to refer to them and manipulate them in the -touchesBegan and -touchesMoved methods.
I've added the images like this:
UIImageView *newPiece = [[UIImageView alloc] initWithImage:[UIImage imageNamed:[NSString stringWithFormat:@"%d.png", [piece tag]]]];
newPiece.frame = CGRectMake(pieceX, pieceY, pieceW, pieceH);
[newPiece setCenter:CGPointMake(pieceX, pieceY)];
[newPiece setTag:[piece tag]];
[[self view] addSubview:newPiece];
[newPiece release];
And note, many of these newPieces are added programmatically, because the method this code is in is called more than once, so the image views have different centers and images; would I need an array to hold all of them?
Thanks
NSMutableArray would probably suit your needs.
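A minimal sketch of that idea: keep the views in an NSMutableArray ivar (called pieces here, an illustrative name), add each newPiece to it when you create it, and hit-test against it in -touchesBegan:withEvent::
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInView:self.view];
    for (UIImageView *piece in pieces) {
        if (CGRectContainsPoint(piece.frame, location)) {
            NSLog(@"Touched piece with tag %d", (int)piece.tag); // move or manipulate this piece
            break;
        }
    }
}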
Check this very detailed post. It's precisely about handling several programmatically added UIImageViews, and it worked nicely for me:
Create multiple views and make the touched view follow the user's touch
Best of luck.