Drag and drop images from toolbar to canvas view - iPhone

I want to make an app where I need to have a toolbar. The toolbar will hold a varying number of objects (images). These objects should be draggable and placeable on the view right above the toolbar.
This is a kind of drawing app where the user can pick any shape and place it on a canvas. Can anyone tell me the best way to achieve this?
Thanks in advance.

Tough one.
What I would do is make each item on the toolbar a UIView subclass and override touchesBegan, touchesMoved, and touchesEnded. Make your canvas another UIView. When the user touches an item to drag, register which item it is somewhere (say, a square); then when they drop it on the canvas, draw that square at that location. To make it look nice, I would also create a UIImageView with an alpha of about 0.4 that moves with the user's touch, so they can see what they are dragging and where it will land.
Here is the code.
ViewController.h
@class ToolBox;

@interface ViewController : UIViewController {
    IBOutlet UIView *canvas;
    IBOutlet ToolBox *toolBox;
}

@property (nonatomic, retain) UIView *canvas;
@property (nonatomic, retain) ToolBox *toolBox;

@end
ViewController.m
#import "ViewController.h"
#import "ToolBox.h"
#import "Tool.h"

@implementation ViewController

@synthesize canvas, toolBox;

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    [[self toolBox] setCanvas:[self canvas]];

    // Build one example tool and hand it to the toolbox.
    Tool *tool = [[[Tool alloc] initWithFrame:CGRectMake(0, 0, 64, 64)] autorelease];
    [tool setImageView:[[[UIImageView alloc] initWithImage:[UIImage imageNamed:@"1.png"]] autorelease]];
    [tool setImage:[UIImage imageNamed:@"1.png"]];
    [tool addSubview:[tool imageView]];
    [tool setToolBox:[self toolBox]];
    [[self toolBox] addObject:tool];
}

@end
ToolBox.h
@class Tool;

@interface ToolBox : UIView {
    NSMutableArray *tools;
    UIView *canvas;
    UIImage *currentTool;
}

@property (nonatomic, retain) UIImage *currentTool;
@property (nonatomic, retain) NSMutableArray *tools;
@property (nonatomic, retain) UIView *canvas;

- (void)addObject:(Tool *)newTool;
- (void)updatePositions;

@end
ToolBox.m
#import "ToolBox.h"
#import "Tool.h"

@implementation ToolBox

@synthesize tools, canvas, currentTool;

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
    }
    return self;
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [tools release];
    [canvas release];
    [currentTool release];
    [super dealloc];
}

// Lazily create the tools array the first time it is asked for.
- (NSMutableArray *)tools {
    if (tools == nil) {
        [self setTools:[NSMutableArray array]];
    }
    return tools;
}

- (void)addObject:(Tool *)newTool {
    // Note: this registers the observers once per added tool; if you add several
    // tools, consider moving the registration to init so it only happens once.
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(draggingTool:) name:@"Dragging New Tool" object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(draggedTool:) name:@"Dragged Tool To" object:nil];
    [[self tools] addObject:newTool];
    [self updatePositions];
}

// Lay the tools out left to right along the toolbar.
- (void)updatePositions {
    int x = 0;
    int y = 0;
    int width = 64;
    int height = 64;
    for (Tool *button in [self tools]) {
        [button setFrame:CGRectMake(x, y, width, height)];
        x += width;
        [self addSubview:button];
    }
}

// A drag started: remember which image is being dragged.
- (void)draggingTool:(NSNotification *)notif {
    NSDictionary *dict = [notif userInfo];
    UIImage *image = [dict valueForKey:@"Image"];
    [self setCurrentTool:image];
}

// A drag ended: drop a full-alpha copy of the image onto the canvas.
- (void)draggedTool:(NSNotification *)notif {
    UITouch *touch = [[notif userInfo] valueForKey:@"Touch"];
    CGPoint point = [touch locationInView:canvas];
    UIImageView *imageView = [[[UIImageView alloc] initWithImage:currentTool] autorelease];
    [imageView setCenter:point];
    [canvas addSubview:imageView];
}

@end
Tool.h
@class ToolBox;

@interface Tool : UIView {
    UIImageView *imageView;
    UIImage *image;
    UIImageView *ghostImageView;
    ToolBox *toolBox;
}

@property (nonatomic, retain) UIImageView *imageView;
@property (nonatomic, retain) UIImage *image;
@property (nonatomic, retain) UIImageView *ghostImageView;
@property (nonatomic, retain) ToolBox *toolBox;

@end
Tool.m
#import "Tool.h"
#import "ToolBox.h"

@implementation Tool

@synthesize imageView, image, ghostImageView, toolBox;

- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
    }
    return self;
}

// Drag started: announce which image is being dragged and create the
// semi-transparent "ghost" preview on the canvas.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"tool touches began");
    NSDictionary *dict = [NSDictionary dictionaryWithObject:[self image] forKey:@"Image"];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"Dragging New Tool" object:nil userInfo:dict];
    UITouch *touch = [touches anyObject];
    CGPoint center = [touch locationInView:[[self toolBox] canvas]];
    [self setGhostImageView:[[[UIImageView alloc] initWithImage:[self image]] autorelease]];
    [[self ghostImageView] setCenter:center];
    [[[self toolBox] canvas] addSubview:[self ghostImageView]];
    [[self ghostImageView] setAlpha:0.4];
}

// Drag ended: tell the toolbox where the drop happened and remove the ghost.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"tool touches ended");
    NSDictionary *dict = [NSDictionary dictionaryWithObject:[touches anyObject] forKey:@"Touch"];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"Dragged Tool To" object:nil userInfo:dict];
    [[self ghostImageView] removeFromSuperview]; // otherwise the 0.4-alpha ghost stays on the canvas
    [self setGhostImageView:nil];
}

// Drag moved: keep the ghost under the finger.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"tool touch moved");
    if ([self ghostImageView] != nil) {
        UITouch *touch = [touches anyObject];
        CGPoint center = [touch locationInView:[[self toolBox] canvas]];
        NSLog(@"Ghost : %2f, %2f", center.x, center.y);
        [[self ghostImageView] setCenter:center];
    }
}

@end
My program came out like this (screenshots):
Initial screen: the tool is ready to be dragged and dropped onto the canvas.
The tool dropped onto the canvas.
And here is the ghost during the transition.

I'm not exactly sure how this would be implemented in Objective-C, but I have done almost exactly what you are describing in Java. Basically, when you begin to drag the object in the toolbar, the toolbar recognizes the action and stores a new instance of the item in what I call a transfer container, which is basically a global static variable. That way, when the drag action completes (i.e. the item is dropped), you can just grab that object from the container and add it to wherever it was dropped.
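In Objective-C terms, that transfer container could be a tiny class wrapping a static variable. The following is only a sketch of the idea under manual reference counting; TransferContainer and its two methods are illustrative names of my own, not part of any framework:
// TransferContainer.h -- a global holding pen for the item being dragged (hypothetical helper)
#import <Foundation/Foundation.h>

@interface TransferContainer : NSObject
+ (void)setDraggedObject:(id)object;   // call when the drag starts
+ (id)takeDraggedObject;               // call when the drop lands; clears the container
@end

// TransferContainer.m
#import "TransferContainer.h"

static id draggedObject = nil;

@implementation TransferContainer

+ (void)setDraggedObject:(id)object {
    [object retain];
    [draggedObject release];
    draggedObject = object;
}

+ (id)takeDraggedObject {
    id object = [draggedObject autorelease];
    draggedObject = nil;
    return object;
}

@end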

Related

Move UIImageView using UITouches

I have subclassed UIView and I am adding two UIImageViews to it: one image view for the background and another for an image that I would like to move on touch. I add this UIView subclass from a view controller. The following is the code:
@interface CustomSlider : UIView
{
    UIImageView *bgView;
    UIImageView *customSliderImageView;
    float minStartPoint, maxEndPoint;
}

- (id)initWithFrame:(CGRect)frame inView:(UIView *)mainView;

@end

@implementation CustomSlider

- (id)initWithFrame:(CGRect)frame inView:(UIView *)mainView {
    if ((self = [super init])) {
        UIView *contentView = [[UIView alloc] initWithFrame:CGRectMake(15, 60, frame.size.width, frame.size.height)];
        contentView.backgroundColor = [UIColor clearColor];
        [mainView addSubview:contentView];

        bgView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"price_96.png"]];
        bgView.frame = CGRectMake(5, 0, frame.size.width - 10, 8);
        [contentView addSubview:bgView];

        customSliderImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"price_100.png"]];
        customSliderImageView.frame = CGRectMake(20, 0, 35, frame.size.height);
        [contentView addSubview:customSliderImageView];

        minStartPoint = 0;

        UIPanGestureRecognizer *pgr = [[UIPanGestureRecognizer alloc]
                                       initWithTarget:self
                                               action:@selector(handlePan:)];
        [customSliderImageView addGestureRecognizer:pgr];
        [pgr release];
    }
    return self;
}

@end
AppDelegate Classes
// .h File
#import <UIKit/UIKit.h>

@class UITouchTutorialViewController;

@interface UITouchTutorialAppDelegate : NSObject <UIApplicationDelegate> {
    UIWindow *window;
    UITouchTutorialViewController *viewController;
}

@property (nonatomic, retain) IBOutlet UIWindow *window;
@property (nonatomic, retain) IBOutlet UITouchTutorialViewController *viewController;

@end

// .m File
#import "UITouchTutorialAppDelegate.h"
#import "UITouchTutorialViewController.h"

@implementation UITouchTutorialAppDelegate

@synthesize window;
@synthesize viewController;

- (void)applicationDidFinishLaunching:(UIApplication *)application {
    // Override point for customization after app launch
    [window addSubview:viewController.view];
    [window makeKeyAndVisible];
}

- (void)dealloc {
    [viewController release];
    [window release];
    [super dealloc];
}

@end
UIViewController Classes
// .h File
#import <UIKit/UIKit.h>

@interface UITouchTutorialViewController : UIViewController {
    IBOutlet UIImageView *cloud;
}

@end

// .m File
#import "UITouchTutorialViewController.h"

@implementation UITouchTutorialViewController

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:touch.view];
    cloud.center = location;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}

/*
// Override initWithNibName:bundle: to load the view using a nib file, then perform additional customization that is not appropriate for viewDidLoad.
- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    if (self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]) {
        // Custom initialization
    }
    return self;
}
*/

/*
// Implement loadView to create a view hierarchy programmatically.
- (void)loadView {
}
*/

/*
// Implement viewDidLoad to do additional setup after loading the view.
- (void)viewDidLoad {
    [super viewDidLoad];
}
*/

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // Return YES for supported orientations
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning]; // Releases the view if it doesn't have a superview
    // Release anything that's not essential, such as cached data
}

- (void)dealloc {
    [super dealloc];
}

@end

Getting 2 Images to Drag in View Xcode

I have 2 images in a view and I applied a drag-on-touch function to them; when either image collides with the image at the top of the view, it shows an alert and changes the top image to a different image.
My problem is getting my if/else statement to work. If I take out the == image in
if ([touch view] == image) {
only one of the images drags; I cannot seem to get both images to drag. The if/else statement compiles with no errors, but does not seem to work.
frfViewController.h
#import <UIKit/UIKit.h>

@interface frfViewController : UIViewController {
    UIImageView *image;
    UIImageView *image2;
    UIImageView *images;
    UIImageView *collisionImage;
}

@property (nonatomic, retain) IBOutlet UIImageView *image;
@property (nonatomic, retain) IBOutlet UIImageView *image2;
@property (nonatomic, retain) IBOutlet UIImageView *collisionImage;
@property (nonatomic, retain) IBOutlet UIImageView *images;

@end
frfViewController.m
#import "frfViewController.h"

@interface frfViewController ()
@end

@implementation frfViewController

@synthesize image;
@synthesize image2;
@synthesize images;
@synthesize collisionImage;

- (void)viewDidLoad
{
}

- (void)viewWillAppear:(BOOL)animated
{
    // [self.navigationController setNavigationBarHidden:YES animated:animated];
    // [super viewWillAppear:animated];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [self.navigationController setNavigationBarHidden:NO animated:animated];
    [super viewWillDisappear:animated];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint location = [touch locationInView:self.view];
    if ([touch view] == image) {
        image.center = location;
        image.userInteractionEnabled = YES;
    }
    else if ([touch view] == image2) {
        image2.center = location;
        image2.userInteractionEnabled = YES;
    }
    [self ifCollided];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self touchesBegan:touches withEvent:event];
}

- (void)ifCollided {
    if (CGRectIntersectsRect(image.frame, collisionImage.frame) || CGRectIntersectsRect(image2.frame, collisionImage.frame)) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"image Collided"
                                                        message:@"FuckYea"
                                                       delegate:nil
                                              cancelButtonTitle:@"Reset!"
                                              otherButtonTitles:nil];
        [alert show];
        CGRect myImageRect = CGRectMake(0, 0, 320, 100);
        images = [[UIImageView alloc] initWithFrame:myImageRect];
        [images setImage:[UIImage imageNamed:@"004.jpg"]];
        [self.view addSubview:images];
    }
}

@end
Thanks, any assistance is much appreciated.
From iOS 4 onwards you can use UIPanGestureRecognizer to handle dragging easily:
UIPanGestureRecognizer *gesture = [[UIPanGestureRecognizer alloc]
                                   initWithTarget:self
                                           action:@selector(labelDragged:)];
[yourview addGestureRecognizer:gesture];
Add a recognizer to each UIView and point its selector at the method where you set the new coordinates; that is also where you can limit the movement, for example when you detect a collision:
- (void)labelDragged:(UIPanGestureRecognizer *)gesture
{
    UILabel *label = (UILabel *)gesture.view;
    CGPoint translation = [gesture translationInView:label];

    // Move the label, limiting it to stay inside self.view.frame.
    float newx = label.center.x + translation.x;
    float newy = label.center.y + translation.y;
    if (newx < 0) { newx = 0; }
    if (newy < 0) { newy = 0; }
    if (newx > self.view.frame.size.width) {
        newx = self.view.frame.size.width;
    }
    if (newy > self.view.frame.size.height) {
        newy = self.view.frame.size.height;
    }

    // Possibly compute collision here and limit the new coordinates further.
    label.center = CGPointMake(newx, newy);

    // Reset the translation.
    [gesture setTranslation:CGPointZero inView:label];
}
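As a usage sketch of my own (not from the answer above), wiring that recognizer to the two image views from the question could look like the following. Note that UIImageView has user interaction disabled by default, and the (UILabel *) cast in labelDragged: is harmless here because only the view's center is used:
// Somewhere in viewDidLoad, assuming the image and image2 outlets declared above.
image.userInteractionEnabled = YES;   // UIImageView ignores touches unless this is YES
image2.userInteractionEnabled = YES;

UIPanGestureRecognizer *pan1 = [[UIPanGestureRecognizer alloc]
                                initWithTarget:self action:@selector(labelDragged:)];
UIPanGestureRecognizer *pan2 = [[UIPanGestureRecognizer alloc]
                                initWithTarget:self action:@selector(labelDragged:)];
[image addGestureRecognizer:pan1];
[image2 addGestureRecognizer:pan2];
[pan1 release];
[pan2 release];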
Also have a look at the draggable and resizable view class at https://github.com/spoletto/SPUserResizableView , which may save you a bit of work as well.

Gesture recognizer delegate: how to implement touch delegate functions

I'm trying to add touch delegate methods (of the form touchesBegan:withEvent: or touchesEnded:withEvent:) inside a custom scroll view class that also uses gesture recognizers.
When I try to set the delegate of the recognizer object to self, the SDK shows a warning about an "incompatible type id".
I understand that the gesture recognizer's delegate protocol does not include such methods, but I don't know which delegate I should adopt in order to use those methods inside my custom view.
Thank you very much for your responses.
Victor-Marie
Here is my code:
@interface TapScrollView : UIScrollView {
    // id<TapScrollViewDelegate> delegate;
    NSMutableArray *classementBoutons;
    int n;
    int o;
    //UIImageView *bouton;
}

//@property (nonatomic, assign) id<TapScrollViewDelegate> delegate;
//@property (nonatomic, retain) UIImageView *bouton;
//@property (strong, nonatomic) UIPanGestureRecognizer *bouton01pan;

- (id)init;
- (void)initierScrollView;
- (void)createGestureRecognizers;
- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender;

@end
#import "TapScrollView.h"
#implementation TapScrollView
//#synthesize bouton;
- (void)setUpBoutonView {
// Create the placard view -- its init method calculates its frame based on its image
//boutonHome *aBoutonMain = [[boutonHome alloc] init];
//self.boutonMain = aBoutonMain;
//[boutonMain setCenter:CGPointMake(200, 200)];
//[self addSubview:boutonMain];
}
- (id) init
{
if (self = [super init])
{
NSLog(#"Classe TapScrollView initiée");
}
return self;
}
-(void)initierScrollView
{
int i;
for (i=0; i<6; i++) {
UIImage *image = [UIImage imageNamed:#"back.png"];
UIImageView *bouton = [[UIImageView alloc] initWithImage:image];
[bouton setTag:i];
[bouton setFrame:CGRectMake(72+20*i,10,62,55)];
[classementBoutons insertObject:bouton atIndex:i];
bouton.userInteractionEnabled = YES;
UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:self action:#selector(handlePanGesture:)];
recognizer.delegate = self;
[bouton addGestureRecognizer:recognizer];
[self addSubview:bouton];
}
}
//- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
//{
// UITouch *touch = [touches anyObject];
//[super touchesBegan:touches withEvent:event];
// for (o=1; o<6; o++) {
// if ([touch view] == [self viewWithTag:o])
// {
// UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:[classementBoutons objectAtIndex:o] action:#selector(handlePanGesture:)];
// [[classementBoutons objectAtIndex:o] addGestureRecognizer:recognizer];
// bouton01 = [self viewWithTag:o];
// }
// }
//CGPoint touchPoint = [touch locationInView:self];
//[self animateFirstTouchAtPoint:touchPoint];
// return;
//}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
for (n=0; n<6; n++) {
NSLog(#"touche cancelled");
[[classementBoutons objectAtIndex:n] setFrame:CGRectMake((72+20)*n,10,62,55)];
}
}
//- (id<TapScrollViewDelegate>) delegate {
// return (id<TapScrollViewDelegate>)super.delegate;
//}
//- (void) setDelegate:(id<TapScrollViewDelegate>) aDelegate
//{
// super.delegate = aDelegate;
//}
#define GROW_ANIMATION_DURATION_SECONDS 0.15
#define SHRINK_ANIMATION_DURATION_SECONDS 0.15
-(IBAction)handlePanGesture:(UIPanGestureRecognizer*)recognizer
{
NSLog(#"Mouvement ok");
CGPoint translation = [recognizer translationInView:self];
recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
recognizer.view.center.y + translation.y);
[recognizer setTranslation:CGPointMake(0, 0) inView:self];
}
#end
Try specifying:
@interface TapScrollView : UIScrollView <UIGestureRecognizerDelegate> {
That way the warning should disappear, although I have to admit it is not entirely clear to me what you are trying to accomplish, so there might be other problems.
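To make the distinction concrete, here is a small sketch of my own (not from the answer above). The touches... methods belong to UIResponder, so any UIView subclass can simply override them without any delegate; the UIGestureRecognizerDelegate methods, on the other hand, are only called because you set recognizer.delegate = self. Both could live side by side in TapScrollView.m:
// UIResponder override -- no delegate needed, just implement it on the view.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    NSLog(@"touches began on the scroll view");
}

// UIGestureRecognizerDelegate method -- invoked because recognizer.delegate = self.
// For example, only let the pan recognizers receive touches that start on an image view.
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
       shouldReceiveTouch:(UITouch *)touch {
    return [touch.view isKindOfClass:[UIImageView class]];
}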

Activating UIImageView with GestureRecognizer in a ScrollView

Hi,
I'm currently trying to implement a gesture recognizer inside a scroll view.
I first created a custom scroll view into which I add UIImageView objects.
When the user touches an image view, the pan gesture recognizer should activate and the image view should follow the movement on the screen.
I have read and followed the instructions on gesture recognizers and the Ray Wenderlich blog (which is very well done).
If someone has a clue about what is missing in my code, I would be happy to read it.
Thanks in advance. Here is my code:
#import <Foundation/Foundation.h>
#import "mainInterface03.h"
#import <QuartzCore/QuartzCore.h>
#import "boutonHome.h"
#import "DragGestureRecognizer.h"

@class boutonHome;
@class DragGestureRecognizer;

@interface TapScrollView : UIScrollView {
    // id<TapScrollViewDelegate> delegate;
    NSMutableArray *classementBoutons;
    int n;
    int o;
    UIView *bouton01;
}

@property (nonatomic, retain) UIView *bouton01;
@property (retain, nonatomic) IBOutletCollection(UIButton) NSMutableSet *buttons;

- (id)init;
- (void)initierScrollView;
- (void)createGestureRecognizers;
- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)sender;

@end
.m file
#import "TapScrollView.h"

@implementation TapScrollView

@synthesize bouton01;

- (id)init
{
    if (self = [super init])
    {
        NSLog(@"Classe TapScrollView initiée");
    }
    return self;
}

- (void)initierScrollView
{
    int i;
    for (i = 0; i < 6; i++) {
        UIImage *image = [UIImage imageNamed:@"back.png"];
        UIImageView *bouton = [[UIImageView alloc] initWithImage:image];
        [bouton setTag:i];
        [bouton setFrame:CGRectMake(72 * i + 20, 10, 62, 55)];
        [classementBoutons insertObject:bouton atIndex:i];
        [self addSubview:bouton];
    }
    UIPanGestureRecognizer *recognizer = [[UIPanGestureRecognizer alloc] initWithTarget:bouton01 action:@selector(handlePanGesture:)];
    [bouton01 addGestureRecognizer:recognizer];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    [super touchesBegan:touches withEvent:event];
    for (o = 1; o < 6; o++) {
        if ([touch view] == [self viewWithTag:o])
        {
            bouton01 = [self viewWithTag:o];
        }
    }
    return;
}

- (IBAction)handlePanGesture:(UIPanGestureRecognizer *)recognizer
{
    NSLog(@"Mouvement ok");
    CGPoint translation = [recognizer translationInView:self];
    recognizer.view.center = CGPointMake(recognizer.view.center.x + translation.x,
                                         recognizer.view.center.y + translation.y);
    [recognizer setTranslation:CGPointMake(0, 0) inView:self];
}

@end
I am not sure this setup can work. Essentially, you are assigning whatever view has been touched to bouton01, which carries the gesture recognizer. That seems a bit convoluted to me, and your code is not very efficient either.
When you call [super touchesBegan:touches withEvent:event]; the touch is passed up the view hierarchy, and only after that do you make the assignment to bouton01. So it would seem logical that bouton01 never receives a touch event.
Indeed, this error arose because of the unusual approach of iterating through the views and assigning the one that carries the recognizer. I would suggest attaching a recognizer to each concerned view during setup instead, as sketched below.
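A minimal sketch of that setup, adapted from the loop already in initierScrollView (a recognizer can only be attached to a single view, so each image view gets its own):
- (void)initierScrollView
{
    for (int i = 0; i < 6; i++) {
        UIImageView *bouton = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"back.png"]];
        [bouton setTag:i];
        [bouton setFrame:CGRectMake(72 * i + 20, 10, 62, 55)];
        bouton.userInteractionEnabled = YES; // UIImageView ignores touches by default

        // One recognizer per view, created right here during setup.
        UIPanGestureRecognizer *recognizer =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handlePanGesture:)];
        [bouton addGestureRecognizer:recognizer];
        [recognizer release];

        [classementBoutons insertObject:bouton atIndex:i];
        [self addSubview:bouton];
        [bouton release];
    }
}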

Extend TTPhotoViewController with custom TTPhotoView

I have successfully integrated the three20 framework into my project, and I have extended TTPhotoViewController to add some further functionality.
Now I need to add some subviews to the TTPhotoView loaded by the TTPhotoViewController. In particular, I would like to add these subviews after every TTPhotoView has been loaded. The subviews represent sensible (tappable) areas over the image, so they should scale proportionally with the image.
The user can tap a subview to get extra info about the image.
I don't know how to implement this behavior. Should I extend TTPhotoView and make sure that the TTPhotoViewController uses this extended version instead of its TTPhotoView?
Could someone point me in the right direction?
Thank you
Solved by subclassing TTPhotoView (TapDetectingPhotoView) and then adding all my subviews to that subclass.
The main problem was photo switching, because the TTPhotoViewController (in particular its inner TTScrollView) reuses the TTPhotoViews during the switching operation.
So, for example, if you add your subview to your TTPhotoView subclass and then switch to the next photo, your subview will probably still be there, because the TTPhotoView is reused.
To solve this problem I decided to add and remove all my subviews every time a photo switch occurs (see TTPhotoViewController::didMoveToPhoto).
This way I'm sure that every photo view has its own subviews.
Here is my implementation (only the notable methods).
Hope this helps!
PhotoViewController.h:
#import "TapDetectingPhotoView.h"

@interface PhotoGalleryController : TTPhotoViewController <TTScrollViewDelegate, TapDetectingPhotoViewDelegate> {
    NSArray *images;
}

@property (nonatomic, retain) NSArray *images;

@end
PhotoViewController.m:
#import "PhotoGalleryController.h"

@implementation PhotoGalleryController

@synthesize images;

- (void)viewDidLoad {
    // fill self.images = ...
}

- (TTPhotoView *)createPhotoView {
    TapDetectingPhotoView *photoView = [[TapDetectingPhotoView alloc] init];
    photoView.tappableAreaDelegate = self;
    return [photoView autorelease];
}

#pragma mark -
#pragma mark TTPhotoViewController

- (void)didMoveToPhoto:(id<TTPhoto>)photo fromPhoto:(id<TTPhoto>)fromPhoto {
    [super didMoveToPhoto:photo fromPhoto:fromPhoto];
    TapDetectingPhotoView *previousPhotoView = (TapDetectingPhotoView *)[_scrollView pageAtIndex:fromPhoto.index];
    TapDetectingPhotoView *currentPhotoView = (TapDetectingPhotoView *)[_scrollView pageAtIndex:photo.index];
    // destroy the sensible areas of the previous photo view, because the view could be reused by the TTPhotoViewController!
    if (previousPhotoView)
        previousPhotoView.sensibleAreas = nil;
    // if the sensible areas have not already been created, create them now
    if (currentPhotoView && currentPhotoView.sensibleAreas == nil) {
        currentPhotoView.sensibleAreas = [[self.images objectAtIndex:photo.index] valueForKey:@"aMap"];
        [self showSensibleAreas:YES animated:YES];
    }
}

#pragma mark -
#pragma mark TappablePhotoViewDelegate

// show a detail view when a sensible area is tapped
- (void)tapDidOccurOnSensibleAreaWithId:(NSUInteger)ids {
    NSLog(@"SENSIBLE AREA TAPPED ids:%d", ids);
    // ..push new view controller...
}
TapDetectingPhotoView.h:
#import "SensibleAreaView.h"

@protocol TapDetectingPhotoViewDelegate;

@interface TapDetectingPhotoView : TTPhotoView <SensibleAreaViewDelegate> {
    NSArray *sensibleAreas;
    id <TapDetectingPhotoViewDelegate> tappableAreaDelegate;
}

@property (nonatomic, retain) NSArray *sensibleAreas;
@property (nonatomic, assign) id <TapDetectingPhotoViewDelegate> tappableAreaDelegate;

@end

@protocol TapDetectingPhotoViewDelegate <NSObject>
@required
- (void)tapDidOccurOnSensibleAreaWithId:(NSUInteger)ids;
@end
TapDetectingPhotoView.m:
#import "TapDetectingPhotoView.h"

@interface TapDetectingPhotoView (Private)
- (void)createSensibleAreas;
@end

@implementation TapDetectingPhotoView

@synthesize sensibleAreas, tappableAreaDelegate;

- (id)init {
    return [self initWithSensibleAreas:nil];
}

- (id)initWithFrame:(CGRect)frame {
    return [self initWithSensibleAreas:nil];
}

// designated initializer
- (id)initWithSensibleAreas:(NSArray *)areasList {
    if (self = [super initWithFrame:CGRectZero]) {
        self.sensibleAreas = areasList;
        [self createSensibleAreas];
    }
    return self;
}

- (void)setSensibleAreas:(NSArray *)newSensibleAreas {
    if (newSensibleAreas != self.sensibleAreas) {
        // destroy the previous sensible areas, making sure only the sensible areas' subviews are removed
        for (UIView *subview in self.subviews)
            if ([subview isMemberOfClass:[SensibleAreaView class]])
                [subview removeFromSuperview];
        [newSensibleAreas retain];
        [sensibleAreas release];
        sensibleAreas = newSensibleAreas;
        [self createSensibleAreas];
    }
}

- (void)createSensibleAreas {
    SensibleAreaView *area;
    NSNumber *areaID;
    for (NSDictionary *sensibleArea in self.sensibleAreas) {
        CGFloat x1 = [[sensibleArea objectForKey:@"nX1"] floatValue];
        CGFloat y1 = [[sensibleArea objectForKey:@"nY1"] floatValue];
        area = [[SensibleAreaView alloc] initWithFrame:
            CGRectMake(
                x1, y1,
                [[sensibleArea objectForKey:@"nX2"] floatValue] - x1,
                [[sensibleArea objectForKey:@"nY2"] floatValue] - y1
            )
        ];
        areaID = [sensibleArea objectForKey:@"sId"];
        area.ids = [areaID unsignedIntegerValue]; // sensible area internal ID
        area.tag = [areaID integerValue];
        area.delegate = self;
        [self addSubview:area];
        [area release];
    }
}

// make sure that, when the zoom factor of the TTScrollView is greater than 1.0, the subviews continue to respond to tap events
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    UIView *result = nil;
    for (UIView *child in self.subviews) {
        CGPoint convertedPoint = [self convertPoint:point toView:child];
        if ([child pointInside:convertedPoint withEvent:event]) {
            result = child;
        }
    }
    return result;
}

#pragma mark -
#pragma mark TapDetectingPhotoViewDelegate methods

- (void)tapDidOccur:(SensibleAreaView *)aView {
    NSLog(@"tapDidOccur ids:%d tag:%d", aView.ids, aView.tag);
    [tappableAreaDelegate tapDidOccurOnSensibleAreaWithId:aView.ids];
}
SensibleAreaView.h:
@protocol SensibleAreaViewDelegate;

@interface SensibleAreaView : UIView {
    id <SensibleAreaViewDelegate> delegate;
    NSUInteger ids;
    UIButton *disclosureButton;
}

@property (nonatomic, assign) id <SensibleAreaViewDelegate> delegate;
@property (nonatomic, assign) NSUInteger ids;
@property (nonatomic, retain) UIButton *disclosureButton;

@end

@protocol SensibleAreaViewDelegate <NSObject>
@required
- (void)tapDidOccur:(SensibleAreaView *)aView;
@end
SensibleAreaView.m:
#import "SensibleAreaView.h"

@implementation SensibleAreaView

@synthesize delegate, ids, disclosureButton;

- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        self.userInteractionEnabled = YES;
        UIColor *color = [[UIColor alloc] initWithWhite:0.4 alpha:1.0];
        self.backgroundColor = color;
        [color release];

        UIButton *button = [UIButton buttonWithType:UIButtonTypeDetailDisclosure];
        [button addTarget:self action:@selector(buttonTouched) forControlEvents:UIControlEventTouchUpInside];
        CGRect buttonFrame = button.frame;
        // set the button position over the right edge of the sensible area
        buttonFrame.origin.x = frame.size.width - buttonFrame.size.width + 5.0f;
        buttonFrame.origin.y = frame.size.height / 2 - 10.0f;
        button.frame = buttonFrame;
        button.autoresizingMask = UIViewAutoresizingFlexibleTopMargin | UIViewAutoresizingFlexibleBottomMargin | UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin | UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        self.disclosureButton = button;
        [self addSubview:button];

        // notification used to make sure the button is properly scaled together with the photo view; I don't want
        // the button to look bigger when the photo view is zoomed, I want to preserve its default dimensions
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(zoomFactorChanged:) name:@"zoomFactorChanged" object:nil];
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    if ([[touches anyObject] tapCount] == 1)
        [delegate tapDidOccur:self];
}

- (void)buttonTouched {
    [delegate tapDidOccur:self];
}

- (void)zoomFactorChanged:(NSNotification *)message {
    NSDictionary *userInfo = [message userInfo];
    CGFloat factor = [[userInfo valueForKey:@"zoomFactor"] floatValue];
    BOOL withAnimation = [[userInfo valueForKey:@"useAnimation"] boolValue];
    if (withAnimation) {
        [UIView beginAnimations:nil context:nil];
        [UIView setAnimationDuration:0.18];
    }
    // counter-scale the disclosure button so it keeps its on-screen size while the photo is zoomed
    disclosureButton.transform = CGAffineTransformMake(1/factor, 0.0, 0.0, 1/factor, 0.0, 0.0);
    if (withAnimation)
        [UIView commitAnimations];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:@"zoomFactorChanged" object:nil];
    [disclosureButton release];
    [super dealloc];
}
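The zoomFactorChanged: handler above listens for a custom notification that some other object has to post whenever the zoom factor changes; the original post does not show that side, so the following is only an illustrative sketch of the posting code (where you call it from depends on how your controller is told about zoom changes):
// Hypothetical helper in the photo controller -- posts the notification SensibleAreaView observes.
- (void)notifyZoomFactorChanged:(CGFloat)factor animated:(BOOL)animated {
    NSDictionary *userInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat:factor], @"zoomFactor",
                              [NSNumber numberWithBool:animated], @"useAnimation",
                              nil];
    [[NSNotificationCenter defaultCenter] postNotificationName:@"zoomFactorChanged"
                                                        object:self
                                                      userInfo:userInfo];
}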
Some ideas:
Subclass TTPhotoView, then override createPhotoView in your TTPhotoViewController:
- (TTPhotoView *)createPhotoView {
    return [[[MyPhotoView alloc] init] autorelease];
}
Try overriding a private method (yes, bad practice, but hey) setImage: in your TTPhotoView subclass:
- (void)setImage:(UIImage *)image {
    [super setImage:image];
    // Add a subview with the frame of self.view, maybe?..
    //
    // Check for self.isLoaded (a property of TTImageView,
    // which is subclassed by TTPhotoView) to check whether
    // the image is loaded
}
In general, look at the headers and implementations (for the private methods) of TTPhotoViewController and TTPhotoView. Set some breakpoints to figure out what does what :)
Interesting question. Facebook has similar functionality with their photo tags. Their tags do not scale proportionally with the image; in fact, they do not even show the tags while you are zoomed in. I don't know if this will help you, but I would look at how (and whether) three20 handles tags and then maybe try to extend that.